
  • Discrete Mumford-Shah Model: from Image to Graph Analysis

    Nelly Pustelnik, CNRS, Laboratoire de Physique de l'ENS de Lyon

    Graph signals: learning and optimization perspectives, May 2nd, 2019

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Motivation

    Défi Imag'IN CNRS SIROCCO (2017-2018): multiphasic flow experiment modeling gas and liquid in a porous medium.

    [Figure: experimental image with two phases z = η1 and z = η2 separated by the interface K]

    • Goal: classify gas/liquid + accurate estimation of the perimeter.
    • Data size: image composed of 2×10⁷ pixels. Analysis to be performed on a sequence of images.

    2/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Motivation

    Défi Infinity CNRS OptMatGeo (2018): vote transfer matrix estimation between two elections.

    • Goal: cluster the areas with a similar transfer matrix (i.e. similar electoral behaviour) + sharp transitions.
    • Data size: ∼ 2×10⁷

    3/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Motivation

    [Figure: image with two regions z = η1 and z = η2 separated by the interface K]

    • Structured data (e.g. images, graphs).
    • Clustering + accurate interface detection.
    • Difficulty to label data.
    • For physics or societal applications, the ground truth is not as sharp as a clustering: it is closer to a regression (smooth behaviour) with possibly sharp transitions.
    • Huge amount of data.

    Goal: design algorithmic solutions with convergence guarantees to extract piecewise smooth behaviour.

    4/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Mumford-Shah (1989)

    minimize_{u,K}  (1/2) ∫_Ω (u − z)² dx dy  [fidelity]  +  ∫_{Ω\K} |∇u|² dx dy  [smoothness]  +  λ H¹(K ∩ Ω)  [length]

    • Ω: image domain,
    • z ∈ L^∞(Ω): data,
    • u ∈ W^{1,2}(Ω): piecewise smooth approximation of z, where
      W^{1,2}(Ω) = { u ∈ L²(Ω) : ∂u ∈ L²(Ω) } and ∂ denotes the weak derivative operator,
    • K: set of discontinuities,
    • H¹: Hausdorff measure.

    5/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Discrete Mumford-Shah like models

    minimize_{u,K}  (1/2) ∫_Ω (u − z)² dx dy + β ∫_{Ω\K} |∇u|² dx dy + λ H¹(K ∩ Ω)

    • Potts model (1952): [Rudin et al., 1992] [Cai, Steidl, 2013] [Storath, Weinmann, 2014]

      minimize_u  (1/2) ‖u − z‖²₂ + γ ‖Du‖₀

    • Blake-Zisserman problem (1987): [Strekalovskiy, Cremers, 2014] [Hohm et al., 2015]

      minimize_u  (1/2) ‖u − z‖²₂ + γ Σ_i min( |(Du)_i|^p , α^p )

    • Ambrosio-Tortorelli (1990): [Foare et al., 2016]

      minimize_{u,e}  (1/2) ‖u − z‖²₂ + β ‖(1 − e) ⊙ Du‖² + λ ( ε ‖D̃e‖²₂ + (1/(4ε)) ‖e‖²₂ )

    6/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Mumford-Shah like models: summary

                              Potts   Blake-Zisserman   Ambrosio-Tortorelli
    Smooth estimate             ✗            ✓                  ✓
    Data-term flexibility       ✗            ✗                  ✗
    Convergence                 ✓            ✗                  ✗
    Large scale dataset         ✓            ✓                  ✗
    Open contours               ✗            ✓                  ✓

    ⇒ Revisit the Ambrosio-Tortorelli model in order to provide a large-scale, flexible, convergent discrete MS-like model.

    7/40


  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Proposed Discrete Mumford-Shah like (D-MS) model

    minimize_{u,e}  Ψ(u, e) := L(u; z) + β ‖(1 − e) ⊙ Du‖² + λ R(e)

    • Ω = {1, . . . , N1} × {1, . . . , N2};
    • z ∈ R^{|Ω|}: input data = image;
    • u ∈ R^{|Ω|}: piecewise smooth approximation of z;
    • L: data fidelity term;
    • D ∈ R^{|E|×|Ω|}: models a finite difference operator;
    • e ∈ R^{|E|}: edges between nodes, whose value is 1 when a contour change is detected and 0 otherwise;
    • R: favors sparse solutions (i.e. "short |K|").

    ⇒ Identify the assumptions on L and R to design an algorithmic scheme with convergence guarantees (confidence in the solution). A minimal numerical sketch of the model follows below.

    8/40
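    A minimal sketch of the D-MS objective, assuming for illustration L(u; z) = ½‖u − z‖² and R(e) = ‖e‖₁ (only two of the admissible choices); D is built with one edge per pair of horizontally or vertically neighbouring pixels.

```python
import numpy as np

def incidence_D(N1, N2):
    """Finite-difference operator D of size |E| x |Omega| for an N1 x N2 image,
    with one edge per horizontal and vertical pair of neighbouring pixels."""
    idx = np.arange(N1 * N2).reshape(N1, N2)
    pairs = list(zip(idx[:, :-1].ravel(), idx[:, 1:].ravel()))   # horizontal edges
    pairs += list(zip(idx[:-1, :].ravel(), idx[1:, :].ravel()))  # vertical edges
    D = np.zeros((len(pairs), N1 * N2))
    for i, (a, b) in enumerate(pairs):
        D[i, a], D[i, b] = -1.0, 1.0
    return D

def dms_objective(u, e, z, D, beta, lam):
    """Psi(u, e) = L(u; z) + beta*||(1 - e) ⊙ Du||^2 + lam*R(e)."""
    fidelity = 0.5 * np.sum((u - z) ** 2)            # assumed quadratic data term
    coupling = beta * np.sum(((1.0 - e) * (D @ u)) ** 2)
    penalty = lam * np.sum(np.abs(e))                # assumed R = l1 norm
    return fidelity + coupling + penalty

# toy usage on an 8 x 8 image
N1, N2 = 8, 8
z = np.random.rand(N1 * N2)
D = incidence_D(N1, N2)
u, e = z.copy(), np.zeros(D.shape[0])
print(dms_objective(u, e, z, D, beta=5.0, lam=0.1))
```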

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    D-MS: Gauss-Seidel iterations

    minimize_{u,e}  Ψ(u, e) := L(u; z) + S(Du, e) + λ R(e),   where S(Du, e) := β ‖(1 − e) ⊙ Du‖²

    • Gauss-Seidel scheme = coordinate descent
      Set e^[0] ∈ R^{|E|}.
      For k ∈ N
      ⌊ u^[k+1] ∈ arg min_u Ψ(u, e^[k])
        e^[k+1] ∈ arg min_e Ψ(u^[k+1], e)

    • Under technical assumptions, the sequence (u^[k], e^[k])_{k∈N} converges to a critical point (u*, e*) of Ψ.

    Technical assumptions = the minimum is attained at each iteration, e.g. by assuming strict convexity w.r.t. one argument (a toy sketch with exact block minimizations is given below).

    9/40
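    A toy illustration of exact Gauss-Seidel on Ψ, assuming L(u; z) = ½‖u − z‖² and, only to obtain closed-form block minimizers, R(e) = ‖e‖² (not the sparsity-promoting choice used later); with these choices each block problem is strictly convex and solved exactly. D is the operator from the earlier sketch.

```python
import numpy as np

def gauss_seidel_dms(z, D, beta, lam, n_iter=50):
    """Exact alternating minimization of Psi with the assumed quadratic L and R."""
    n = D.shape[1]
    e = np.zeros(D.shape[0])
    u = z.copy()
    for _ in range(n_iter):
        # u-step: solve (I + 2*beta*D^T diag((1-e)^2) D) u = z
        W = (1.0 - e) ** 2
        u = np.linalg.solve(np.eye(n) + 2.0 * beta * D.T @ (W[:, None] * D), z)
        # e-step: componentwise minimizer of beta*(1-e_i)^2*(Du)_i^2 + lam*e_i^2
        q2 = (D @ u) ** 2
        e = beta * q2 / (beta * q2 + lam)
    return u, e
```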

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    D-MS: Gauss-Seidel iterations

    minimize_{u,e}  Ψ(u, e) := L(u; z) + S(Du, e) + λ R(e),   where S(Du, e) := β ‖(1 − e) ⊙ Du‖²

    • PAM [Attouch et al., 2010]
      Set e^[0] ∈ R^{|E|} and u^[0] ∈ R^{|Ω|}.
      For k ∈ N
      ⌊ u^[k+1] ∈ arg min_u Ψ(u, e^[k]) + (ck/2) ‖u − u^[k]‖²
        e^[k+1] ∈ arg min_e Ψ(u^[k+1], e) + (dk/2) ‖e − e^[k]‖²

    • Under technical assumptions, the sequence (u^[k], e^[k])_{k∈N} converges to a critical point (u*, e*) of Ψ.

    Technical assumptions = closed form of the proximity operator.

    10/40


  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Proximity operator

    Definition [Moreau, 1965] Let ϕ ∈ Γ₀(H), where H denotes a real Hilbert space. The proximity operator of ϕ at x ∈ H is the unique point prox_ϕ x such that

      (∀x ∈ H)   prox_ϕ x = arg min_{y∈H}  ϕ(y) + (1/2) ‖x − y‖²

    Examples with closed form expressions:
    • prox_{λ‖·‖₁}: soft-thresholding with a fixed threshold λ > 0 (see the sketch below). [Figure: soft-thresholding compared with the identity]
    • prox_{‖·‖_{1,2}} [Peyré, Fadili, 2011].
    • prox_{‖·‖_p^p} with p ∈ {4/3, 3/2, 2, 3, 4} [Chaux et al., 2005].
    • prox_{DKL} [Combettes, Pesquet, 2007].
    • prox_{Σ_{g∈G} ‖·‖_q} with overlapping groups [Jenatton et al., 2011].
    • Composition with a linear operator: prox_{ϕ∘L} has a closed form if LL* = νId [Pustelnik et al., 2016].

    11/40
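    As an illustration of the first example above, a minimal soft-thresholding prox with a brute-force check on one component (the numerical values are arbitrary).

```python
import numpy as np

def prox_l1(x, lam):
    """prox_{lam*||.||_1}(x): componentwise soft-thresholding with threshold lam > 0."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

# quick check: on one component, the prox minimizes lam*|y| + 0.5*(y - x)^2
x, lam = 1.3, 0.5
y_grid = np.linspace(-3, 3, 100001)
y_star = y_grid[np.argmin(lam * np.abs(y_grid) + 0.5 * (y_grid - x) ** 2)]
print(prox_l1(np.array([x]), lam), y_star)   # both close to 0.8
```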


  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Proximity operator → nonconvex setting

    Proximal map for nonconvex functions [Rockafellar, Wets, 1998]

    Let f : R^N → ]−∞, +∞] be a proper, l.s.c. function with inf_{R^N} f > −∞. Given x ∈ R^N and γ > 0, the proximal map associated with f is defined as

      prox_{γf}(x) = arg min_u  (1/2) ‖x − u‖²₂ + γ f(u)

    • The set prox_{γf}(x) is nonempty and compact.
    • prox_{γf} is a set-valued map.
    • If f is also convex: uniqueness of prox_{γf}(x). A classical example, hard-thresholding as the proximal map of the ℓ₀ penalty, is sketched below.

    12/40

    https://www.springer.com/us/book/9783540627722
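    An illustration not taken from the slides: for f = γ‖·‖₀ the proximal map is hard-thresholding, which is set-valued exactly at |x| = √(2γ), matching the properties listed above.

```python
import numpy as np

def prox_l0(x, gamma):
    """One selection of prox_{gamma*||.||_0}(x): keep x when |x| >= sqrt(2*gamma),
    set it to 0 otherwise (both choices are optimal exactly at the threshold)."""
    return np.where(np.abs(x) >= np.sqrt(2.0 * gamma), x, 0.0)

x = np.array([-2.0, -0.5, 0.9, 1.5])
print(prox_l0(x, gamma=1.0))   # threshold sqrt(2) ~ 1.414 -> [-2., 0., 0., 1.5]
```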


  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    D-MS: PAM iterations

    minimize_{u,e}  Ψ(u, e) := L(u; z) + S(Du, e) + λ R(e),   where S(Du, e) := β ‖(1 − e) ⊙ Du‖²

    • PAM [Attouch et al., 2010], rewritten with proximity operators
      Set e^[0] ∈ R^{|E|} and u^[0] ∈ R^{|Ω|}.
      For k ∈ N
      ⌊ u^[k+1] ∈ prox_{(1/ck)[L(·; z) + S(D·, e^[k])]}(u^[k])
        e^[k+1] ∈ prox_{(1/dk)[S(Du^[k+1], ·) + λR]}(e^[k])

    • Under technical assumptions, the sequence (u^[k], e^[k])_{k∈N} converges to a critical point (u*, e*) of Ψ.

    Difficulty = proximity operator of a sum of two functions.

    13/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Proximity operator of a sum of two functions

    prox_{ϕ₁+ϕ₂} = prox_{ϕ₂} ∘ prox_{ϕ₁} ?

    • [Combettes, Pesquet, 2007] N = 1, ϕ₂ = ι_C for a non-empty closed convex subset C, and ϕ₁ differentiable at 0 with ϕ₁′(0) = 0.
    • [Chaux, Pesquet, Pustelnik, 2009] C and ϕ₂ are separable in the same basis.
    • [Yu, 2013] [Shi et al., 2017] ∂ϕ₂(x) ⊂ ∂ϕ₂(prox_{ϕ₁}(x)).
    • Other recent results: [Pustelnik, Condat, 2017] [Yukawa, Kagami, 2017] [del Aguila Pla, Jaldén, 2017].

    14/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    D-MS: PALM iterations

    minimize_{u,e}  Ψ(u, e) := L(u; z) + S(Du, e) + λ R(e),   where S(Du, e) := β ‖(1 − e) ⊙ Du‖²

    • PALM [Bolte, Sabach, Teboulle, 2013]
      Set u^[0] ∈ R^{|Ω|} and e^[0] ∈ R^{|E|}.
      For k ∈ N
      ⌊ Set γ > 1 and ck = γ χ(e^[k]).
        u^[k+1] = prox_{(1/ck) L(·; z)} ( u^[k] − (1/ck) ∇_u S(Du^[k], e^[k]) )
        Set δ > 1 and dk = δ ν(u^[k+1]).
        e^[k+1] = prox_{(1/dk) λR} ( e^[k] − (1/dk) ∇_e S(Du^[k+1], e^[k]) )

    • Under technical assumptions, the sequence (u^[k], e^[k])_{k∈N} converges to a critical point (u*, e*) of Ψ (a minimal sketch is given below).

    15/40
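    A minimal PALM sketch under assumptions not fixed by the slide: L(u; z) = ½‖u − z‖², R = ‖·‖₁, and crude upper bounds for the Lipschitz moduli χ(e^[k]) and ν(u^[k+1]); D and z are as in the earlier D-MS sketch.

```python
import numpy as np

def palm_dms(z, D, beta, lam, n_iter=200, gamma=1.1, delta=1.1):
    """PALM for the D-MS model with the assumed quadratic data term and l1 penalty."""
    u, e = z.copy(), np.zeros(D.shape[0])
    normD2 = np.linalg.norm(D, 2) ** 2               # ||D||^2 (spectral norm squared)
    for _ in range(n_iter):
        Du = D @ u
        # u-update: gradient step on S(., e[k]) followed by prox of (1/ck) L
        ck = gamma * 2.0 * beta * normD2 * max(np.max((1.0 - e) ** 2), 1e-12)
        grad_u = 2.0 * beta * (D.T @ ((1.0 - e) ** 2 * Du))
        v = u - grad_u / ck
        u = (ck * v + z) / (ck + 1.0)                # prox of (1/ck)*0.5*||.-z||^2
        # e-update: gradient step on S(Du[k+1], .) followed by prox of (1/dk) lam*||.||_1
        Du = D @ u
        dk = delta * 2.0 * beta * max(np.max(Du ** 2), 1e-12)
        grad_e = -2.0 * beta * (1.0 - e) * Du ** 2
        w = e - grad_e / dk
        e = np.sign(w) * np.maximum(np.abs(w) - lam / dk, 0.0)
    return u, e
```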

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Proposed D-MS and algorithmic solution

    minimize_{u,e}  Ψ(u, e) := L(u; z) + S(Du, e) + λ R(e),   where S(Du, e) := β ‖(1 − e) ⊙ Du‖²

    • Proposed Semi-Linearized PAM (SL-PAM) [Foare, Pustelnik, Condat, 2017]
      Set u^[0] ∈ R^{|Ω|} and e^[0] ∈ R^{|E|}.
      For k ∈ N
      ⌊ Set γ > 1 and ck = γ χ(e^[k]).
        u^[k+1] = prox_{(1/ck) L(·; z)} ( u^[k] − (1/ck) ∇_u S(Du^[k], e^[k]) )
        Set dk > 0.
        e^[k+1] = prox_{(1/dk)[λR + S(Du^[k+1], ·)]} ( e^[k] )

    16/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Proposed D-MS and algorithmic solution

    Proposition [Foare, Pustelnik, Condat, 2017]
    The sequence (u^[k], e^[k])_{k∈N} generated by SL-PAM converges to a critical point of Ψ if

    1. the updates of u^[k+1] and e^[k+1] have closed form expressions;
    2. the sequence (u^[k], e^[k])_{k∈N} generated by SL-PAM is bounded;
    3. L(A·; z), R and Ψ(·, ·) are bounded below;
    4. Ψ is a Kurdyka-Łojasiewicz function;
    5. ∇_u S and ∇_e S are globally Lipschitz continuous with moduli ν(e) and ε(u) respectively, and, for all k ∈ N, ν(e^[k]) and ε(u^[k]) are bounded by positive constants.

    17/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Proposed D-MS and algorithmic solution

    minimize_{u,e}  Ψ(u, e) := L(u; z) + S(Du, e) + λ R(e),   where S(Du, e) := β ‖(1 − e) ⊙ Du‖²

    • Proposed Semi-Linearized PAM (SL-PAM) [Foare, Pustelnik, Condat, 2017]
      Set u^[0] ∈ R^{|Ω|} and e^[0] ∈ R^{|E|}.
      For k ∈ N
      ⌊ Set γ > 1 and ck = γ χ(e^[k]).
        u^[k+1] = prox_{(1/ck) L(·; z)} ( u^[k] − (1/ck) ∇_u S(Du^[k], e^[k]) )
        Set dk > 0.
        e^[k+1] = prox_{(1/dk)[λR + S(Du^[k+1], ·)]} ( e^[k] )

    • The sequence (u^[k], e^[k])_{k∈N} converges to a critical point (u*, e*) of Ψ.
    ⇒ Difficulty: computation of prox_{(1/dk)[λR + S(Du^[k+1], ·)]}

    18/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Proposed D-MS and algorithmic solution

    Proposition [Foare, Pustelnik, Condat, 2017]
    We assume that R is separable, i.e.,

      (∀ e = (e_i)_{1≤i≤|E|})   R(e) = Σ_{i=1}^{|E|} σ_i(e_i),

    where each σ_i : R → ]−∞, +∞] has a closed form proximity operator. Let dk > 0. Then

      prox_{(1/dk)[λR + S(Du^[k+1], ·)]}(e^[k]) = ( prox_{λσ_i / (2β(Du^[k+1])_i² + dk)} ( (β(Du^[k+1])_i² + (dk/2) e^[k]_i) / (β(Du^[k+1])_i² + dk/2) ) )_{i∈E}

    (a per-component sketch is given below).

    19/40
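    A per-component sketch of this closed-form e-update; σ_i = |·| (the ℓ₁ case) is used only for illustration, and any other σ_i with a known prox can be passed in.

```python
import numpy as np

def prox_abs(x, tau):
    """Soft-thresholding, i.e. the prox of tau*|.| applied componentwise."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def e_update(e_prev, Du_next, beta, lam, dk, prox_sigma=prox_abs):
    """Closed-form SL-PAM e-update for a separable R = sum_i sigma_i."""
    q2 = Du_next ** 2                            # (Du^[k+1])_i^2
    tau = lam / (2.0 * beta * q2 + dk)           # per-component prox step
    center = (beta * q2 + 0.5 * dk * e_prev) / (beta * q2 + 0.5 * dk)
    return prox_sigma(center, tau)               # pulls e toward 1 where |Du| is large
```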

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Proposed D-MS and algorithmic solution

    minimize_{u,e}  L(u; z) + β ‖(1 − e) ⊙ Du‖² + λ R(e)

    • R: favors sparse solutions (i.e. "short |K|") and convex.

    1. Ambrosio-Tortorelli approximation:
       R(e) = ε ‖De‖²₂ + (1/(4ε)) ‖e‖²₂  with ε > 0

    2. ℓ₁-norm: R(e) = ‖e‖₁

    3. Quadratic-ℓ₁ [Foare, Pustelnik, Condat, 2017]:
       R(e) = Σ_{i=1}^{|E|} max{ |e_i| , e_i²/(4ε) }.

    20/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Proposed D-MS and algorithmic solution

    Proposition [Foare, Pustelnik, Condat, 2017]
    For every η ∈ R and τ, ε > 0,

      prox_{τ max{|·|, (·)²/(4ε)}}(η) = sign(η) max{ 0, min[ |η| − τ, max( 4ε, |η| / (τ/(2ε) + 1) ) ] }

    (a numerical check is sketched below).

    21/40
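    A sketch of this prox together with a brute-force numerical check; the grid minimization is only for verification and the parameter values are arbitrary.

```python
import numpy as np

def prox_quad_l1(eta, tau, eps):
    """prox of tau*max{|.|, (.)^2/(4*eps)} applied componentwise."""
    a = np.abs(eta)
    inner = np.maximum(4.0 * eps, a / (tau / (2.0 * eps) + 1.0))
    return np.sign(eta) * np.maximum(0.0, np.minimum(a - tau, inner))

# brute-force check of the closed form on one value
eta, tau, eps = 2.7, 0.4, 0.3
y = np.linspace(-5, 5, 200001)
brute = y[np.argmin(tau * np.maximum(np.abs(y), y ** 2 / (4 * eps)) + 0.5 * (y - eta) ** 2)]
print(prox_quad_l1(np.array([eta]), tau, eps), brute)   # both close to 1.62
```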

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Experiments

    1. Visual results (proposed versus state-of-the-art) when L(u; z) = (1/2) ‖u − z‖²₂.

    2. Convergence comparisons (proposed versus PALM).

    3. Sensitivity to the initialization.

    4. Visual results when L(u; z) = (1/2) ‖Au − z‖²₂.

    22/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Experiments

    23/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Experiments 1

    [Figure: visual comparison — TV (Strekalovskiy, Cremers, 2014), Discrete AT (Foare et al., 2016), Quadratic-ℓ₁]

    24/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Experiments 2

    Convergence of PALM versus SL-PAM: Ψ(u^[ℓ], e^[ℓ]) as a function of the iteration number ℓ.

    25/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Experiments 3

    26/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Experiments 4

    27/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    D-MS for images: conclusions

    • Efficient algorithm (convergence guarantees and computation time) for solving a generic D-MS model.

    • Flexibility in the data term: possibility to handle many image degradations to improve interface detection (Poisson noise, blur, ...).

    • Flexibility in the choice of the discrete difference operator.

    28/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    D-MS for mixing matrix estimation

    • V = { v^(n) | n ∈ {1, . . . , N} }: set of vertices = objects

    • E = { e^(n,m) | (n, m) ∈ E }: set of edges = object relationships

    • Directed nonreflexive graph: |E| ≤ N(N − 1)

    • Goal: for each node n, estimate a mixing matrix Mn ∈ R^{P×Q} such that sn = Mn zn, with (zn, sn) known.

    Example: vote transfer matrix estimation, with
      N: number of voting locations
      zn/sn: number of votes for each candidate in the first/second round
      Mn: vote transfers between the two rounds

    29/40


  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    D-MS for mixing matrix estimation

    minimize_{M,e}  Ψ(M, e) := Σ_{n=1}^{N} ‖zn − Mn sn‖² + Σ_{n=1}^{N} ι_C(Mn) + β Σ_{(n,m)∈E} (1 − e_{n,m})² ‖Mn − Mm‖²_F + λ R(e)

    with L(M; z, s) := Σ_{n=1}^{N} ‖zn − Mn sn‖² and S(M, e) := Σ_{(n,m)∈E} (1 − e_{n,m})² ‖Mn − Mm‖²_F.

    Simplified notation:

      minimize_{M,e}  Ψ(M, e) := L(M; z, s) + ι_C(M) + β S(M, e) + λ R(e)

    (a small numerical sketch of this objective follows below).

    30/40
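    A small sketch evaluating the graph objective above, with the fidelity kept exactly as written on the slide, R(e) = ‖e‖₁ assumed for illustration, and the indicator ι_C(M) omitted (M is assumed feasible).

```python
import numpy as np

def psi_graph(M, e, z, s, edges, beta, lam):
    """Psi(M, e) for the graph model; M: (N, P, Q), z: (N, P), s: (N, Q),
    edges: list of (n, m) index pairs, e: array aligned with edges."""
    N = M.shape[0]
    fidelity = sum(np.sum((z[n] - M[n] @ s[n]) ** 2) for n in range(N))
    coupling = beta * sum((1.0 - w) ** 2 * np.sum((M[n] - M[m]) ** 2)
                          for (n, m), w in zip(edges, e))
    return fidelity + coupling + lam * np.sum(np.abs(e))   # R = l1, assumed
```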


  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    HL-PAM for mixing matrix estimation

    minimize_{M,e}  Ψ(M, e) := L(M; z, s) + ι_C(M) + β S(M, e) + λ R(e)

    • Proposed HL-PAM for mixing matrix estimation
      Set M^[0] ∈ R^{P×Q×N} and e^[0] ∈ R^{|E|}.
      For k ∈ N
      ⌊ Set γ > 1 and ck = γ χ(e^[k]).
        M^[k+1] ∈ prox_{(1/ck)[L(·; z, s) + ι_C]} ( M^[k] − (1/ck) ∇_1 S(M^[k], e^[k]) )
        Set dk > 0.
        e^[k+1] ∈ prox_{(1/dk)[λR + S(M^[k+1], ·)]} ( e^[k] )

    ⇒ Difficulty: no closed form expression for prox_{(1/ck)[L(·; z, s) + ι_C]}

    31/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Proposed algorithm for mixing matrix estimation

    minimize_{M,e}  Ψ(M, e) := L(M; z, s) + ι_C(M) + β S(M, e) + λ R(e)

    • Proposed HL-PAM for mixing matrix estimation
      Set M^[0] ∈ R^{P×Q×N} and e^[0] ∈ R^{|E|}.
      For k ∈ N
      ⌊ Set γ > 1 and ck = γ χ(e^[k]).
        M^[k+1] ∈ P_C ( M^[k] − (1/ck) ∇_1 S(M^[k], e^[k]) − (1/ck) ∇L(M^[k]; z, s) )
        Set dk > 0.
        e^[k+1] ∈ prox_{(1/dk)[λR + S(M^[k+1], ·)]} ( e^[k] )

    • The sequence (M^[k], e^[k])_{k∈N} converges to a critical point (M*, e*) of Ψ. [Kaloga, Foare, Pustelnik, Jensen, 2019] (A minimal sketch of the M-update is given below.)

    32/40
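    A sketch of the projected-gradient M-update; the constraint set C is not specified in the transcript, so entrywise nonnegativity (P_C = clipping) is assumed purely for illustration, and the step size uses a crude upper bound on the Lipschitz modulus.

```python
import numpy as np

def m_update(M, e, z, s, edges, beta, gamma=1.1):
    """One projected-gradient M-update; M: (N, P, Q), z: (N, P), s: (N, Q),
    edges: list of (n, m) pairs, e: array aligned with edges."""
    N = M.shape[0]
    # gradient of the data term sum_n ||z_n - M_n s_n||^2 (as written on the slide)
    grad = np.stack([2.0 * np.outer(M[n] @ s[n] - z[n], s[n]) for n in range(N)])
    # gradient of the coupling term beta * sum_(n,m) (1 - e_nm)^2 * ||M_n - M_m||_F^2
    w_sum = np.zeros(N)
    for (n, m), w in zip(edges, (1.0 - e) ** 2):
        grad[n] += 2.0 * beta * w * (M[n] - M[m])
        grad[m] += 2.0 * beta * w * (M[m] - M[n])
        w_sum[n] += w
        w_sum[m] += w
    # crude upper bound on the Lipschitz constant of the full gradient
    s_norm2 = np.max(np.sum(s ** 2, axis=1))
    ck = gamma * (2.0 * s_norm2 + 4.0 * beta * max(np.max(w_sum), 1e-12))
    # projected gradient step; here P_C = projection onto nonnegative entries (assumed)
    return np.maximum(M - grad / ck, 0.0)
```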

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Proposed algorithm for mixing matrix estimation

    min_{M,e}  Ψ(M, e) := Σ_n ( L_n(Mn; zn, sn) + ι_{Cn}(Mn) + β S_n(Mn, e_n) ) + λ R(e)

    • Proposed Block HL2-PAM for mixing matrix estimation
      Set Mn^[0] ∈ Cn and e^[0] ∈ R^{|E|}.
      For k ∈ N
      ⌊ Set χn(e^[k]) the Lipschitz constant of ∇_1 S(·, e^[k]) + ∇L(·; z, s);
        Set γ > 1, dk > 0 and ck,n = γ χn(e^[k]).
        For n = 1, . . . , N
        ⌊ Mn^[k+1] ∈ P_{Cn} ( Mn^[k] − (1/ck,n) ∇_1 S(Mn^[k], e^[k]) − (1/ck,n) ∇L(Mn^[k]; zn, sn) )
        e^[k+1] ∈ prox_{(1/dk)[λR + S(M^[k+1], ·)]} ( e^[k] )

    • The sequence (M^[k], e^[k])_{k∈N} converges to a critical point (M*, e*) of Ψ. [Kaloga, Foare, Pustelnik, Jensen, 2019]

    33/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Experiments

    1. Visual results of the estimation.

    2. Convergence comparisons (block versus global).

    3. Sensitivity to the initialization.

    34/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Experiments 1

    Original Estimated

    35/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Experiments 2

    36/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Experiments 3

    37/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Take home messages

    • Proximal alternating schemes with linearization fit the D-MS model.

    • Design of a new algorithm with convergence guarantees for minimization problems of the form

      minimize_{u,e}  L(u; z) + β ‖(1 − e) ⊙ Du‖² + λ R(e)

      when R is a separable function and the proximity operator of L(·; z) has a closed form expression.

    • Design of a new algorithm with convergence guarantees for minimization problems of the form

      minimize_{M,e}  Ψ(M, e) := L(M; z, s) + ι_C(M) + β S(M, e) + λ R(e)

      when R is a separable function, L is differentiable with a Lipschitz gradient, and P_C has a closed form expression.

    • Flexibility in the objective function design.

    38/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    Open questions

    minimize_{u,e}  L(u; z) + β ‖(1 − e) ⊙ Du‖² + λ R(e)

    • Acceleration of these algorithmic schemes.
    • From MS to semi-supervised learning.
    • Robustness to the initialization → stronger results in the biconvex case?
    • Discretization scheme to improve interface detection (choice of D). Convergence to the true MS model (cf. [Belz, Bredies, 2019])?
    • Selection of the hyperparameters (λ, β) using a Bayesian formulation.

    39/40

  • Context Mumford-Shah D-MS for images D-MS for graphs Conclusions

    References

    M. Foare, N. Pustelnik, and L. Condat, Semi-linearized proximal alternating minimization for a discrete Mumford-Shah model, preprint, 2018.

    Y. Kaloga, M. Foare, N. Pustelnik, and P. Jensen, Discrete Mumford-Shah on graph for mixing matrix estimation, accepted to IEEE Signal Processing Letters, 2019.

    N. Pustelnik and L. Condat, Proximity operator of a sum of functions; application to depth map estimation, IEEE Signal Processing Letters, vol. 24, no. 12, pp. 1827-1831, Dec. 2017.

    M. Foare, N. Pustelnik, and L. Condat, A new proximal method for joint image restoration and edge detection with the Mumford-Shah model, IEEE ICASSP, Calgary, Alberta, Canada, Apr. 15-20, 2018.

    B. Pascal, N. Pustelnik, P. Abry, M. Serres, and V. Vidal, Joint estimation of local variance and local regularity for texture segmentation. Application to multiphase flow characterization, IEEE ICIP, Athens, Greece, Oct. 7-10, 2018.

    40/40