
On An Interpretation Of Spectral Clustering Via Heat Equation And Finite Elements Theory

Sandrine Mouysset, Joseph Noailles and Daniel Ruiz

Abstract — Spectral clustering methods use eigenvectors of a matrix, called the Gaussian affinity matrix, in order to define a low-dimensional space in which data points can be clustered. This matrix is widely used and depends on a free parameter $\sigma$. It is usually interpreted as some discretization of the Heat Equation Green kernel. Combining tools from Partial Differential Equations and Finite Elements theory, we propose an interpretation of this spectral method which offers an alternative point of view on how spectral clustering works. This approach develops some particular geometrical properties inherent to eigenfunctions of a specific partial differential equation problem. We analyze experimentally how this geometrical property is recovered in the eigenvectors of the affinity matrix, and we also study the influence of the parameter $\sigma$.

Keywords: Clustering, Machine Learning, Spatial Data Mining

1 Introduction

Clustering aims to partition a data set by grouping similar elements into subsets. Two main issues concern, on the one hand, the choice of a similarity criterion and, on the other hand, how to separate clusters from one another. Spectral methods, and in particular the spectral clustering algorithm introduced by Ng, Jordan and Weiss (NJW) [8], are useful when considering non-convex shaped subsets of points. Spectral clustering (SC) consists in defining a low-dimensional data space, built from particular eigenvectors of a matrix called the normalized affinity matrix, in which data points are clustered. In this clustering method, two main problems arise: first, what sense can we give to the notion of cluster when considering finite discrete data sets, and then, how can we link these clusters to some spectral elements (the eigenvalues/eigenvectors extracted in SC). Spectral clustering can be analyzed using graph theory and some normalized cut criterion, leading to the solution of some trace-maximization problems (see, for instance, [8], [9]). As the aim is to group data points in some sense of closeness, spectral clustering can also be interpreted by exploiting the neighbourhood property in the discretization of the Laplace-Beltrami operator and heat kernel on

IRIT-ENSEEIHT, 2 rue Charles Camichel, University of Toulouse, France, 31017 BP 122. {sandrine.mouysset,joseph.noailles,daniel.ruiz}@enseeiht.fr

manifolds [2]: in the algorithm, the neighbourhood property is represented by the adjacency graph step. Moreover, the consistency of the method was investigated by considering the asymptotic behaviour of clustering when samples become very large [11, 10]. Some properties under standard assumptions for the normalized Laplacian matrices were proved, including convergence of the first eigenvectors to eigenfunctions of some limit operator [11, 7]. To the best of our knowledge, the main results establish this relation for huge numbers of points. However, from a numerical point of view, SC still works for smaller data sets. So, in this paper, we try to give explanations that may address a given data sample.

A second problem appears when considering the most widely used affinity matrix in spectral clustering techniques, as in the NJW algorithm. It is based on the Gaussian measure between points, which depends on a free scale parameter $\sigma$ that has to be properly defined. Many investigations on this parameter have been conducted, and several definitions were suggested either heuristically [12, 3], or with physical considerations [6], or from geometrical points of view [5]. The difficulty of fixing this choice seems to be tightly connected to the lack of some clustering property explaining how the grouping in this low-dimensional space correctly defines the partitioning in the original data. We will show how this free parameter affects clustering results.

In this paper, as the spectral elements used in SC do not explicitly give this topological criterion for a discrete data set, we draw back to some continuous formulation wherein clusters will appear as disjoint subsets. So the eigenvectors of the affinity matrix $A$ will be interpreted as eigenfunctions of some operator. This drawing back is performed (section 3) using Finite Elements (FE). Indeed, with FE whose nodes correspond to data points, the representation of any $L^2$ function is given by its nodal values. So we expect to interpret both $A$ and its eigenvectors as representations of, respectively, an $L^2$ operator and $L^2$ functions on some bounded regular domain $\Omega$. Now, the commonly used operator whose Finite Elements representation matches $A$ is the kernel of the Heat equation on unbounded space (noted $K_H$). As its spectrum is essential, we cannot interpret the eigenvectors of $A$ as a representation of eigenfunctions of $K_H$. However, on a bounded domain $O$, $K_H$ is close to $K_D$, the kernel of the Heat equation on a bounded domain $\Omega$, for $\Omega$ including

Proceedings of the World Congress on Engineering 2010 Vol I WCE 2010, June 30 - July 2, 2010, London, U.K.

ISBN: 978-988-17012-9-9 ISSN: 2078-0958 (Print); ISSN: 2078-0966 (Online)

WCE 2010

strictly $O$ (section 2.2). Now, the operator $S_D$ (convolution by $K_D$) has eigenfunctions $(v_{n_i})$ in $H^1_0(\Omega)$. From this, we show (proposition 1) that the operator $S_H$ (convolution by $K_H$) admits these $v_{n_i}$ as near eigenfunctions, up to some residual term. At last, using a Finite Elements approximation, we show (proposition 2) that the eigenvectors of $A$ are a representation of these eigenfunctions, again up to some residual term.

To summarize, the main result of this paper is that, for a fixed data set of points, the eigenvectors of $A$ are the representation of functions whose support is included in only one connected component at a time. The accuracy of this representation is shown to depend, for a fixed density of points, on the affinity parameter $t$. This result is illustrated on small data sets (section 4).

1.1 Spectral clustering Algorithm

Let us first give some notation and recall the NJW algorithm for an $n$-point data set in a $p$-dimensional Euclidean space. Assume that the number of targeted clusters $k$ is known. The algorithm consists of the following steps:

Algorithm 1 Spectral Clustering Algorithm
Input: data set, number of clusters $k$

1. Form the affinity matrix $A \in \mathbb{R}^{n \times n}$ defined by:

$$A_{ij} = \begin{cases} \exp\left(-\dfrac{\|x_i - x_j\|^2}{2\sigma^2}\right) & \text{if } i \neq j,\\ 0 & \text{otherwise,} \end{cases} \qquad (1)$$

2. Construct the normalized matrix $L = D^{-1/2} A D^{-1/2}$, with $D_{i,i} = \sum_{j=1}^{n} A_{ij}$,

3. Assemble the matrix $X = [X_1 X_2 \ldots X_k] \in \mathbb{R}^{n \times k}$ by stacking the eigenvectors associated with the $k$ largest eigenvalues of $L$,

4. Form the matrix $Y$ by normalizing each row of the $n \times k$ matrix $X$,

5. Treat each row of $Y$ as a point in $\mathbb{R}^k$, and group them into $k$ clusters via the K-means method,

6. Assign the original point xi to cluster j when row iof matrix Y belongs to cluster j.
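The six steps above can be condensed into a short program. The following is a minimal Python sketch of Algorithm 1; the function name, the `sigma` default, and the choice of SciPy's `kmeans2` for step 5 are implementation choices for illustration, not prescriptions from the paper:

```python
import numpy as np
from scipy.cluster.vq import kmeans2  # any K-means implementation would do


def spectral_clustering(X, k, sigma=1.0, seed=0):
    """Minimal sketch of Algorithm 1 (NJW); `sigma` is the free
    Gaussian scale parameter discussed in the paper."""
    # Step 1: Gaussian affinity matrix with zero diagonal, eq. (1).
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    A = np.exp(-sq / (2.0 * sigma ** 2))
    np.fill_diagonal(A, 0.0)
    # Step 2: normalized matrix L = D^{-1/2} A D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    L = A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    # Step 3: eigenvectors of the k largest eigenvalues of L
    # (np.linalg.eigh returns eigenvalues in ascending order).
    _, V = np.linalg.eigh(L)
    Xk = V[:, -k:]
    # Step 4: normalize each row of the n-by-k matrix to unit length.
    Y = Xk / np.linalg.norm(Xk, axis=1, keepdims=True)
    # Steps 5-6: K-means on the rows of Y; row i's label is point x_i's cluster.
    _, labels = kmeans2(Y, k, minit='++', seed=seed)
    return labels
```

On two well-separated groups, the rows of $Y$ concentrate around $k$ distinct unit vectors, which is what makes the final K-means step easy.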

In this paper, we won't take into account the normalization of the affinity matrix described in step 2. Instead, we focus on the two main steps of this algorithm: assembling the Gaussian affinity matrix and extracting its eigenvectors to create the low-dimensional space.

2 Clustering property of Heat kernel

As recalled in the introduction, Spectral Clustering consists in selecting particular eigenvectors of the normalized affinity matrix defined by (1). The Gaussian affinity coefficients $A_{ij}$ are usually interpreted as a representation of the Heat kernel evaluated at the data points $x_i$ and $x_j$.

Let $K_H(t, x, y) = (4\pi t)^{-p/2} \exp\left(-\frac{\|x-y\|^2}{4t}\right)$ be the Heat kernel on $\mathbb{R}^+ \times \mathbb{R}^p \times \mathbb{R}^p$. Note that the Gaussian affinity between two distinct data points $x_i$ and $x_j$ is given by $A_{ij} = (2\pi\sigma^2)^{p/2} K_H(\sigma^2/2, x_i, x_j)$. Now, consider the initial value problem in $L^2(\mathbb{R}^p)$, for $f \in L^2(\mathbb{R}^p)$:

$$(P_{\mathbb{R}^p}) \quad \begin{cases} \partial_t u - \Delta u = 0 & \text{for } (x,t) \in \mathbb{R}^p \times \mathbb{R}^+\!\setminus\!\{0\},\\ u(x,0) = f & \text{for } x \in \mathbb{R}^p,\end{cases}$$

and introduce the solution operator in $L^2(\mathbb{R}^p)$ of $(P_{\mathbb{R}^p})$:

$$(S_H(t)f)(x) = \int_{\mathbb{R}^p} K_H(t, x, y)\, f(y)\, dy, \quad x \in \mathbb{R}^p. \qquad (2)$$

Now if $f(y) = \delta_{x_j}(y)$, where $\delta_{x_j}$ denotes the Dirac function at $x_j$, one can observe that:

$$\left(S_H\left(\frac{\sigma^2}{2}\right) f\right)(x_i) = K_H\left(\frac{\sigma^2}{2}, x_i, x_j\right) = \left(2\pi\sigma^2\right)^{-p/2} (A + I_n)_{ij}.$$

Thus, the spectral properties of the matrix $A$ used in the Spectral Clustering algorithm seem to be related to those of the operator $S_H(t)$. We propose to analyze this in detail in the following.
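The identity $A_{ij} = (2\pi\sigma^2)^{p/2} K_H(\sigma^2/2, x_i, x_j)$ for $i \neq j$ can be checked numerically. The sketch below does so for arbitrary illustrative points and an arbitrary $\sigma$; the helper name `heat_kernel` is ours, not from the paper:

```python
import numpy as np


def heat_kernel(t, x, y):
    """K_H(t, x, y) = (4*pi*t)^(-p/2) * exp(-||x - y||^2 / (4t)) on R^p."""
    p = x.size
    return (4.0 * np.pi * t) ** (-p / 2.0) * np.exp(-np.sum((x - y) ** 2) / (4.0 * t))


sigma = 0.7
x_i, x_j = np.array([0.3, 1.2]), np.array([1.0, -0.5])
p = x_i.size

# Gaussian affinity (1) for i != j ...
A_ij = np.exp(-np.sum((x_i - x_j) ** 2) / (2.0 * sigma ** 2))
# ... equals the rescaled heat kernel evaluated at t = sigma^2 / 2.
rescaled = (2.0 * np.pi * sigma ** 2) ** (p / 2.0) * heat_kernel(sigma ** 2 / 2.0, x_i, x_j)
```

The rescaling factor $(2\pi\sigma^2)^{p/2}$ exactly cancels the normalization of the heat kernel at $t = \sigma^2/2$, leaving the pure Gaussian of equation (1).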

2.1 Heat equation with Dirichlet boundary conditions

To simplify, we shall fix in the following the number of clusters $k$ to 2, but the discussion can be extended to any arbitrary number $k$. Consider then a bounded open set $\Omega$ in $\mathbb{R}^p$ made up of two disjoint connected open subsets $\Omega_1$ and $\Omega_2$. Assume that the boundary $\partial\Omega = \partial\Omega_1 \cup \partial\Omega_2$, with $\partial\Omega_1 \cap \partial\Omega_2 = \emptyset$, is regular enough so that the trace operator is well-posed on $\partial\Omega$ and the spectral decomposition of the Laplacian operator is well-defined. For instance, $\partial\Omega$ is $C^1$ on both $\partial\Omega_1$ and $\partial\Omega_2$. According to [4], determining the eigenvalues $(\lambda_n(\Omega))_{n>0}$ and associated eigenfunctions $(v_n)_{n>0}$ of the Dirichlet Laplacian on $\Omega$:

$$(P_L) \quad \begin{cases} -\Delta v_n = \lambda_n v_n & \text{in } \Omega,\\ v_n = 0 & \text{on } \partial\Omega,\end{cases}$$

is equivalent to determining the eigenvalues $(\lambda_{n_1}(\Omega_1))_{n_1>0}$ and $(\lambda_{n_2}(\Omega_2))_{n_2>0}$ and the eigenfunctions of the Dirichlet Laplacian on $\Omega_1$ and $\Omega_2$ respectively. In other words, if $v_n \in H^1_0(\Omega)$, then $v = v_n$ on $\Omega$ if and only if, for $i \in \{1,2\}$, $v_i = v|_{\Omega_i} \in H^1_0(\Omega_i)$ satisfies $-\Delta v_{n_i} = \lambda_{n_i} v_{n_i}$ on $\Omega_i$. Therefore, $\{\lambda_n(\Omega)\}_{n>0} =$


$\{\lambda_{n_1}(\Omega_1)\}_{n_1>0} \cup \{\lambda_{n_2}(\Omega_2)\}_{n_2>0}$. Additionally, if we consider $v_{n_i}$, the solution on $\Omega_i$ of $(P_{L_i})$, an eigenfunction of the Laplacian operator associated to $\lambda_{n_i}$, and if we extend the support of $v_{n_i}$ to the whole set $\Omega$ by setting $v_{n_i} = 0$ in $\Omega \setminus \Omega_i$, we also get an eigenfunction of the Laplacian in $H^1_0(\Omega)$ associated to the same eigenvalue.
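A discrete analogue of this support property can already be observed on the affinity matrix itself: when the data fall into two well-separated groups (playing the role of $\Omega_1$ and $\Omega_2$), the cross-group Gaussian affinities vanish numerically, $A$ becomes block diagonal, and each leading eigenvector is supported on a single group. The point sets and $\sigma$ below are illustrative choices, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)
# Group 1 lives in [0, 1]^2, group 2 in [10, 11]^2: two "connected components".
X = np.vstack([rng.uniform(0.0, 1.0, (15, 2)), rng.uniform(10.0, 11.0, (15, 2))])

sigma = 0.5
sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
A = np.exp(-sq / (2.0 * sigma ** 2))
np.fill_diagonal(A, 0.0)  # cross-group entries underflow to exactly 0 here

_, V = np.linalg.eigh(A)
v1, v2 = V[:, -1], V[:, -2]  # eigenvectors of the two largest eigenvalues

# Fraction of each eigenvector's energy carried by the first group:
# close to 1 or close to 0, i.e. support in one component only.
energy_v1 = np.sum(v1[:15] ** 2)
energy_v2 = np.sum(v2[:15] ** 2)
```

This mirrors the continuous statement above: an eigenfunction of the Dirichlet Laplacian on $\Omega_1 \cup \Omega_2$, extended by zero, lives on one component at a time.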
