Learning Spectral Graph Segmentation
Transcript of Learning Spectral Graph Segmentation
Slide 1
Learning Spectral Graph Segmentation
Timothée Cour, Jianbo Shi
Computer and Information Science Department, University of Pennsylvania

Nicolas Gogin
Computer Science, École Polytechnique

AISTATS 2005
Slide 2
Graph-based Image Segmentation
• Weighted graph G = (V, W)
• V = vertices (one per pixel i)
• Similarity between pixels i and j: W_ij = W_ji ≥ 0

Segmentation = graph partition of the pixels
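As a concrete illustration of the weighted graph above, here is a minimal sketch (my own, not the talk's implementation) that builds a symmetric, nonnegative affinity matrix from pixel intensities, assuming a simple Gaussian similarity and a small neighborhood radius:

```python
import numpy as np

def intensity_affinities(img, sigma=0.1, radius=2):
    """W[i, j] = exp(-(I_i - I_j)^2 / sigma^2) for pixels within `radius`
    of each other (0 otherwise), so W is symmetric with W_ij = W_ji >= 0."""
    h, w = img.shape
    n = h * w
    W = np.zeros((n, n))
    for i in range(n):
        yi, xi = divmod(i, w)
        for j in range(i, n):
            yj, xj = divmod(j, w)
            if abs(yi - yj) <= radius and abs(xi - xj) <= radius:
                W[i, j] = W[j, i] = np.exp(-(img[yi, xi] - img[yj, xj]) ** 2
                                           / sigma ** 2)
    return W
```

Real systems combine several such cues (color, edges, texture) into one W, as the next slides show.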
Slide 3
Spectral Graph Segmentation
Image I → Graph Affinities W = W(I, Θ)
Cues: intensity, color, edges, texture, …
Slide 4
Spectral Graph Segmentation
Image I → Graph Affinities W = W(I, Θ)
Cues: intensity, color, edges, texture, …

NCut(A, B) = cut(A, B) × (1/Vol(A) + 1/Vol(B))
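To make the criterion concrete, a small sketch (my own, not from the talk) that evaluates NCut(A, B) = cut(A, B) × (1/Vol(A) + 1/Vol(B)) for a two-way partition of a weighted graph:

```python
import numpy as np

def ncut(W, A):
    """NCut(A, B) for a boolean node mask A over the weighted graph W,
    where B is the complement of A."""
    A = np.asarray(A, dtype=bool)
    B = ~A
    cut = W[np.ix_(A, B)].sum()            # total weight crossing the partition
    d = W.sum(axis=1)                      # node degrees
    vol_a, vol_b = d[A].sum(), d[B].sum()  # Vol(A), Vol(B)
    return cut * (1.0 / vol_a + 1.0 / vol_b)
```

A partition that cuts only weak edges gets a small NCut value; cutting through a tight cluster gets a large one.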
Slide 5
Spectral Graph Segmentation
Image I → Graph Affinities W = W(I, Θ) → Eigenvector X(W)

NCut(A, B) = cut(A, B) × (1/Vol(A) + 1/Vol(B))

Solve W X = λ D X, with target indicator
X_A(i) = 1 if i ∈ A, 0 if i ∉ A
Slide 6
Spectral Graph Segmentation
Image I → Graph Affinities W = W(I, Θ) → Eigenvector X(W) → Discretisation

NCut(A, B) = cut(A, B) × (1/Vol(A) + 1/Vol(B))

W X = λ D X
X_A(i) = 1 if i ∈ A, 0 if i ∉ A
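A minimal sketch of this step of the pipeline (assuming dense matrices and SciPy's generalized symmetric eigensolver): solve W X = λ D X, take the second generalized eigenvector, and discretize it by thresholding:

```python
import numpy as np
from scipy.linalg import eigh

def ncut_eigenvector(W):
    """Second generalized eigenvector of W x = lambda D x.
    The largest eigenvalue (lambda = 1) has a trivial constant eigenvector;
    the runner-up carries the NCut partition information."""
    D = np.diag(W.sum(axis=1))
    vals, vecs = eigh(W, D)      # generalized eigenpairs, ascending eigenvalues
    return vecs[:, -2]

def discretize(x):
    """Threshold the continuous eigenvector into a binary partition."""
    return x > np.median(x)
```

On a graph with two weakly connected clusters, the thresholded eigenvector recovers the cluster labels (up to a global sign flip).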
Slide 7
Spectral Graph Segmentation
Image I → Graph Affinities W = W(I, Θ) → Eigenvector X(W)
Cues: intensity + color + edges + texture + …

Reverse the pipeline: from a target segmentation, learn the graph parameters Θ.
Slide 8
Intensity cues, edge cues, color cues [Shi & Malik, 97]
Texture cues, multiscale cues [Yu, 04; Fowlkes, 04]
Slide 9
Where is Waldo?
Do you use edge cues? Color cues? Texture cues?
– That's not enough: you need shape cues and high-level object priors.
Slide 10
Spectral Graph Segmentation
Image I → Graph Affinities W = W(I, Θ) → Eigenvector X(W)
Cues: intensity + color + edges + texture + …

Learn the graph parameters Θ from a target segmentation.
Slide 11
Image I → Affinities W = W(I, Θ) → Eigenvector X(W)
Learn Θ from targets: target X*, target P* ~ W*
[1] Meila and Shi (2001); [2] Fowlkes, Martin, Malik (2004):
error criterion on the affinity matrix W.
Slide 12
Image I → Affinities W = W(I, Θ) → Eigenvector X(W)
Target X*, target P* ~ W*
[1] Meila and Shi (2001); [2] Fowlkes, Martin, Malik (2004):
error criterion on the affinity matrix W.

Constraining every W_ij is overkill: many different W yield the same segmentation.
W has O(n²) parameters; the segments have only O(n) parameters.
Slide 13
Image I → Affinities W = W(I, Θ) → Eigenvector X(W)
Target X*: error criterion on the partition vector X only!
Slide 14
Energy function for segmenting one image I:

E_I(W) = ||X(W(I, Θ)) − X*(I)||²

comparing the eigenvector X(W) to the target X*.
Slide 15
min_Θ E(Θ) = Σ_images ||X(W(I, Θ)) − X*(I)||²

Energy function for segmenting one image I:

E_I(W) = ||X(W(I, Θ)) − X*(I)||²

comparing the eigenvector X(W) to the target X*.
Slide 16
Energy function for image I:

E_I(W) = ||X(W(I, Θ)) − X*(I)||²

min_Θ E(Θ) = Σ_images ||X(W(I, Θ)) − X*(I)||²

Can use gradient descent… but X(W) is only implicitly defined, by
(W(I, Θ) − λD) X = 0, X ≠ f(W(I, Θ)):
we cannot backtrack easily.
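As a quick numeric illustration of the implicitness (my own toy matrix, not from the talk): X(W) has no explicit formula, but once an eigensolver returns it, the defining equation (W − λD) X = 0 can be checked directly:

```python
import numpy as np
from scipy.linalg import eigh

# X is recovered only through an eigensolver; we then verify that the
# implicit definition (W - lambda D) X = 0 holds to numerical precision.
W = np.array([[0., 1., .1, .1],
              [1., 0., .1, .1],
              [.1, .1, 0., 1.],
              [.1, .1, 1., 0.]])
D = np.diag(W.sum(axis=1))
vals, vecs = eigh(W, D)                 # generalized eigenpairs of (W, D)
lam, X = vals[-2], vecs[:, -2]          # second NCut eigenpair
residual = np.linalg.norm((W - lam * D) @ X)
```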
Slide 17
min_Θ E(Θ) = Σ_images ||X(W(I, Θ)) − X*(I)||²

Gradient descent:
ΔΘ = −η ∂E/∂Θ = −η (∂E/∂X)(∂X/∂W)(∂W/∂Θ)
Slide 18
min_Θ E(Θ) = Σ_images ||X(W(I, Θ)) − X*(I)||²

Gradient descent:
ΔΘ = −η ∂E/∂Θ = −η (∂E/∂X)(∂X/∂W)(∂W/∂Θ)

E(X) is quadratic; X(W) is implicit; W depends on Θ.
Slide 19
Theorem: derivative of NCut eigenvectors

The map W → (X, λ) is C^∞ over Ω, and we can express the derivatives over any C^1 path W(t) as:

dλ/dt (W(t)) = X^T (W′ − λD′) X / (X^T D X)

dX/dt (W(t)) = (W − λD)^† (λ′D + λD′ − W′) X

These supply the ∂X/∂W factor in ΔΘ = −η (∂E/∂X)(∂X/∂W)(∂W/∂Θ).
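A sketch of the theorem in code (my own illustration, with D = diag(W·1) so that D′ = diag(W′·1), and the pseudo-inverse taken with SciPy's `pinv`). The eigenvalue derivative can be cross-checked against finite differences:

```python
import numpy as np
from scipy.linalg import eigh, pinv

def ncut_eig_derivative(W, dW):
    """Directional derivatives of the second NCut eigenpair of W X = lam D X
    along the perturbation dW, using
        lam' = X^T (W' - lam D') X / (X^T D X)
        X'   = (W - lam D)^+ (lam' D + lam D' - W') X
    with D = diag(W 1), hence D' = diag(dW 1)."""
    D = np.diag(W.sum(axis=1))
    dD = np.diag(dW.sum(axis=1))
    vals, vecs = eigh(W, D)                      # ascending eigenvalues
    lam, X = vals[-2], vecs[:, -2]               # second NCut eigenpair
    dlam = X @ (dW - lam * dD) @ X / (X @ D @ X)
    dX = pinv(W - lam * D) @ ((dlam * D + lam * dD - dW) @ X)
    return lam, X, dlam, dX
```

The pseudo-inverse is well defined because λ is simple on the feasible set, so (W − λD) has a one-dimensional null space spanned by X.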
Slide 20
Feasible set Ω in the space of graph weight matrices: W ∈ Ω iff
1) W is a symmetric n × n matrix,
2) D > 0,
3) λ is a simple eigenvalue with W X = λ D X.

Theorem: derivative of NCut eigenvectors

The map W → (X, λ) is C^∞ over Ω, and over any C^1 path W(t):

dλ/dt (W(t)) = X^T (W′ − λD′) X / (X^T D X)
dX/dt (W(t)) = (W − λD)^† (λ′D + λD′ − W′) X

ΔΘ = −η (∂E/∂X)(∂X/∂W)(∂W/∂Θ)
Slide 21
Theorem: derivative of NCut eigenvectors

The map W → (X, λ) is C^∞ over Ω, and over any C^1 path W(t):

dλ/dt (W(t)) = X^T (W′ − λD′) X / (X^T D X)
dX/dt (W(t)) = (W − λD)^† (λ′D + λD′ − W′) X

ΔΘ = −η (∂E/∂X)(∂X/∂W)(∂W/∂Θ)

[Meila, Shortreed, Xu, '05]
Slide 22
Task 1: Learning 2D point segmentation

W_ij = e^(−σ_x |x_i − x_j|² − σ_y |y_i − y_j|²)

100 points at (x_i, y_i); parameters Θ = (σ_x, σ_y).
Slide 23
Learn W_ij = exp(−σ_x (x_i − x_j)² − σ_y (y_i − y_j)²)

E(σ_x, σ_y) = ||X(W(σ_x, σ_y)) − X*||²

Ground truth segmentation; X* and X(W) learned.
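A toy reproduction of Task 1 (my own data, step size, and iteration count; a numeric gradient stands in for the analytical eigenvector derivative): two Gaussian point clusters, a ±1 target vector X*, and descent on E(σ_x, σ_y) = ||X(W(σ_x, σ_y)) − X*||²:

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(1)
# two 2D point clusters and a +/-1 target partition vector X*
pts = np.vstack([rng.normal((0, 0), 0.3, (10, 2)),
                 rng.normal((3, 0), 0.3, (10, 2))])
target = np.r_[np.ones(10), -np.ones(10)]

def ncut_vec(sx, sy):
    """Second NCut eigenvector of W_ij = exp(-sx (xi-xj)^2 - sy (yi-yj)^2)."""
    dx = pts[:, 0, None] - pts[None, :, 0]
    dy = pts[:, 1, None] - pts[None, :, 1]
    W = np.exp(-sx * dx ** 2 - sy * dy ** 2)
    x = eigh(W, np.diag(W.sum(axis=1)))[1][:, -2]
    x = x / np.abs(x).max()
    return x if x @ target >= 0 else -x   # resolve the eigenvector sign

def energy(theta):
    return float(np.sum((ncut_vec(*theta) - target) ** 2))

theta = np.array([0.2, 0.2])
history = [energy(theta)]
for _ in range(30):
    g = np.array([(energy(theta + h) - energy(theta - h)) / 2e-4
                  for h in 1e-4 * np.eye(2)])
    theta = np.maximum(theta - 0.05 * g, 1e-3)  # keep bandwidths positive
    history.append(energy(theta))
```

After descent, the thresholded eigenvector matches the ground-truth cluster labels and the energy has not increased.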
Slide 24
Learning 2D point segmentation

Learn W_ij = exp(−σ_x (x_i − x_j)² − σ_y (y_i − y_j)²)
E(σ_x, σ_y) = ||X(W(σ_x, σ_y)) − X*||²

Ground truth segmentation; X* and X(W) learned.
Slide 25
Proposition: exponential convergence

The PDE dW/dt = −∂E/∂W either
• converges to a global minimum: E(W(t)) → 0 exponentially,
• or escapes any compact K ⊂ Ω; this happens when
  1) λ(t) → 1 or λ₂(t) − λ₃(t) → 0, or
  2) D(i, i)(t) → 0 for some i.
Slide 26
Is there a ground truth segmentation?
Graph nodes: edge e_i = (x_i, y_i) with angle α_i.
Slide 27
Task 2: Learning rectangular shape
Convex / Concave / Impossible

W(e_i, e_j) = f(x_i, y_i, α_i; x_j, y_j, α_j)
Edge location (x_i, y_i), edge orientation α_i.
Slide 28
Task 2: Learning rectangular shape
Convex / Concave / Impossible

W(e_i, e_j) = f(x_i, y_i, α_i; x_j, y_j, α_j)
Edge location (x_i, y_i), edge orientation α_i.

10 × 10 edge locations, 4 edge angles:
|W| = n² with n = n_x · n_y · n_α, i.e. 160,000 parameters.
Slide 29
Task 2: Learning rectangular shape
Convex / Concave / Impossible

W(e_i, e_j) = f(x_i, y_i, α_i; x_j, y_j, α_j)
Edge location (x_i, y_i), edge orientation α_i.

Invariance through parameterization:
W(e_i, e_j) = f(x_j − x_i, y_j − y_i, α_i, α_j)
20 × 20 × 4 × 4 = 6400 parameters.
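The parameter-count bookkeeping above can be sketched directly (my own arithmetic check, with nx, ny, na denoting the numbers of x locations, y locations, and angles):

```python
# nodes e_i = (x_i, y_i, alpha_i) on a 10 x 10 grid with 4 edge angles
nx, ny, na = 10, 10, 4
n = nx * ny * na                      # number of graph nodes

# unconstrained f(x_i, y_i, alpha_i; x_j, y_j, alpha_j): one weight per pair
full = n * n

# shift-invariant f(x_j - x_i, y_j - y_i, alpha_i, alpha_j):
# offsets x_j - x_i span about 2 * nx values (the slide counts 20 x 20)
invariant = (2 * nx) * (2 * ny) * na * na

print(full, invariant)
```

The invariant parameterization cuts 160,000 free weights down to 6,400, which is what makes learning f feasible.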
Slide 30
Synthetic case

Training: input edges → Ncut eigenvector after training vs. target.
Testing: input edges → Ncut eigenvector.
Slide 31
Learning with precise edges; testing on real images.
Slide 32
Comparison of the spectral graph learning schemes

Our method: argmin_W ||X(W) − X*||
Bach–Jordan: argmin_W J(W, X*)
Meila and Shi; Fowlkes, Martin and Malik: argmin_W ||D⁻¹W − D*⁻¹W*||

Bach and Jordan (2003), Learning spectral clustering: uses a differentiable approximation of the eigenvectors.
Our work: exact analytical form, computationally efficient solution.
Slide 33
Conclusion

• Supervise the unsupervised spectral clustering.
• Exact and efficient learning of graph weights using derivatives of eigenvectors.