
0162-8828 (c) 2013 IEEE. Personal use is permitted, but republication/redistribution requires IEEE permission. See http://www.ieee.org/publications_standards/publications/rights/index.html for more information.

This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI 10.1109/TPAMI.2013.2297314, IEEE Transactions on Pattern Analysis and Machine Intelligence


Segmentation of 3D Meshes Using p-Spectral Clustering

Mohamed Chahhou, Lahcen Moumoun, Mohamed El Far and Taoufiq Gadi

Abstract — In this work, we propose a new approach to obtain an optimal segmentation of a 3D mesh, as a human would perceive it, using the minima rule and spectral clustering. The method is fully unsupervised and provides a hierarchical segmentation via recursive cuts. We introduce a new concept of the adjacency matrix based on cognitive studies, and we also introduce the use of 1-spectral clustering, which leads to the optimal Cheeger cut value.

Index Terms — 3D mesh, segmentation, spectral clustering, Cheeger cuts, minima rule.

1 INTRODUCTION

ONE of the important problems in mesh processing is mesh segmentation, where the goal is to partition a mesh into segments to suit particular needs. Several algorithms use mesh partitioning as an initial stage, e.g. compression, parametrization, morphing, matching, shape retrieval, and collision detection. We can distinguish two approaches to mesh segmentation: the first one is part-type segmentation, which partitions a mesh into meaningful components (such as the parts of a human body), and the second one is patch-type segmentation, which decomposes the surface of an object into disk-like patches to which basic primitives can be fitted. We will consider, in this work, only part-type segmentation.

The segmentation of a mesh depends on criteria which dictate which elements of the mesh belong to the same partition. These criteria are built upon the segmentation objective, which in turn depends on the application/context. However, there seem to be common constituents in the general objectives of many segmentation algorithms [1], [2].

One of the most important of those criteria is based on the minima rule from cognitive studies, introduced by Hoffman and Richards [3]. It relies on the principle wherein an object is segmented by human perception at areas of concavity, which means that cut boundaries should consist of surface points at negative minima of principal curvatures.

In [1], [4] several segmentation techniques are discussed, but most of the approaches have drawbacks: sensitivity to the pose of the model, difficulty in extracting very small features, and difficulty in providing a fully automated segmentation without user interaction. In fact, most of the algorithms depend on parameters that must be set interactively and that can dramatically affect the output segmentations if they are not chosen correctly.

• M. Chahhou and M. El Far are with the Faculty of Science Dhar Mahraz, University Sidi Mohamed Ben Abdellah, Fes, Morocco. L. Moumoun and T. Gadi are with the LAVETE Laboratory, Faculty of Science and Technology, University Hassan 1er, Settat, Morocco. E-mail: [email protected], [email protected], [email protected], [email protected]

In this work, we propose a segmentation approach based on the minima rule criterion that overcomes these issues and provides a fully automated segmentation without user interaction. Since our segmentation relies on a recursive bipartitioning of the mesh, the user can choose the level of detail without any further computation.

2 RELATED WORK

Many segmentation techniques using the minima rule criterion are based on clustering. In those approaches, each segment of the mesh corresponds to a cluster, and each segment contains elements that are similar to each other. The measure of similarity depends on the application domain.

2.1 Spectral clustering

One of the best-known and most effective methods that relates the clustering problem to min-cut graph partitioning is spectral clustering. It is based on computing a few leading eigenvectors of an affinity or weighted graph Laplacian matrix; these eigenvectors provide a new low-dimensional embedding in which the clustering problem is more easily solved.

The first real attempt to segment a mesh using spectral clustering was made by Liu and Zhang [5]: an affinity matrix is constructed whose affinity measure combines both geodesic distances and curvature information. Then, a clustering method is applied to the eigenvectors given by the eigendecomposition of this matrix.


2.2 Region growing

Region-growing algorithms start from random or well-defined seeds and keep clustering neighboring elements until some criterion is met.

In [6], Zhang et al. use the minima rule (using the sign of the Gaussian curvature to mark boundaries) and apply a region-growing algorithm. Their algorithm is able to segment meshes into meaningful parts, provided that the boundaries of the parts contain deep concavities. Moumoun et al. [7] used a watershed algorithm based on an unbiased hierarchical queue with a height function defined as the Gaussian curvature of the mesh surface. The queues are first initialized with some marked faces. When the flooded regions meet (at the saddle faces), watershed lines are created, signifying the boundaries of distinct parts.

The drawback of those algorithms is that the boundaries must contain concavities, which is often not the case for many objects (e.g., the boundary between the arm and the main body of a human).

To overcome this problem, Lee et al. [8] used a graph-cut approach: they use the minimum curvature of the mesh in order to construct the boundaries of its parts and manage to construct closed boundaries of the parts even if they are not surrounded by deep concavities. This is achieved by a shortest-path algorithm; see Fig. 1.

We propose a similar idea but a different approach to solve this problem. After the concave boundaries are found, we use spectral clustering to find the best cuts through the bending (convex) region instead of using the shortest-path algorithm.

Fig. 1. Lee's approach: on the left, selection of the concave areas; on the right, contour completion.

3 P-SPECTRAL CLUSTERING AND CHEEGER CUTS

Mesh segmentation can be formulated as follows: let S be the set of either the vertices, edges or faces of the mesh. A segmentation of a mesh M is a partitioning of S into k disjoint connected sets:

$$\bigcup_{i=1}^{k} S_i = S, \quad S_i \subset S, \quad S_i \cap S_j = \emptyset, \quad i, j = 1..k,\ i \neq j.$$

3.1 Standard spectral clustering

Let $W$ be the connectivity (adjacency) matrix:

$$W_{ij} = \begin{cases} 1 & \text{if } i \text{ and } j \text{ are neighbors,} \\ 0 & \text{otherwise.} \end{cases}$$

Then the graph Laplacian is $L = D - W$, where $D$ is the degree matrix whose diagonal entries are the row sums of $W$, i.e. $D(i,i) = \sum_{s_j \in S} W_{ij}$ is the degree of node $i$:

$$L_{ij} = \begin{cases} -W_{ij} & \text{if } i \text{ and } j \text{ are neighbors,} \\ D(i,i) & \text{if } i = j, \\ 0 & \text{otherwise.} \end{cases}$$

Using the adjacency matrix, clustering the data can be seen as a graph partitioning problem. The goal is to partition the graph into two well-separated sets $S_1$ and $S_2$ such that

$$\mathrm{Cut}(S_1, S_2) = \sum_{s_i \in S_1,\, s_j \in S_2} W_{ij}$$

is minimized.

Let $v$ be the indicator vector of the partitioning:

$$v(i) = \begin{cases} -1 & \text{if } s_i \in S_1, \\ \phantom{-}1 & \text{if } s_i \in S_2, \end{cases}$$

then we have:

$$\mathrm{Cut}(S_1, S_2) = \frac{1}{4} \sum_{i,j} W_{ij} \left(v(i) - v(j)\right)^2 = \frac{1}{2} v^T (D - W) v = \frac{1}{2} v^T L v.$$

By relaxing the entries in $v$ from discrete values to continuous values, subject to the constraint $\|v\|_2 = 1$, it can be shown that the minimum value of the cut is achieved when $v$ is the eigenvector of the graph Laplacian $L = D - W$ that corresponds to the second smallest eigenvalue (see [9] for more details).

Since the cut favors small sets of isolated nodes, the normalized cut was proposed in [10] to deal with this problem:

$$\mathrm{NCut}(S_1, S_2) = \mathrm{Cut}(S_1, S_2)\left(\frac{1}{\mathrm{Vol}(S_1)} + \frac{1}{\mathrm{Vol}(S_2)}\right),$$

where $\mathrm{Vol}(S_i) = \sum_{s_j \in S_i} D_{jj}$.

Normalizing the cut by the volume of each cluster encourages the clusters to have similar sizes.

The normalized graph Laplacian is then:

$$L_{norm} = D^{-1}L = D^{-1}(D - W).$$

Computing the optimal solution that minimizes NCut is NP-complete; an approximate solution is usually found by computing the second smallest eigenvector of the generalized eigenvalue problem $(D - W)v = \lambda D v$ and thresholding this eigenvector to derive the final cut.
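To make this step concrete, the following is a minimal sketch of the standard 2-spectral bipartition just described, assuming a SciPy sparse symmetric adjacency matrix; the function name and the median threshold are illustrative choices rather than the authors' implementation, and the generalized problem is solved through the equivalent symmetric normalized Laplacian.

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigsh

def ncut_bipartition(W):
    """Sketch: approximate NCut bipartition from a sparse symmetric adjacency W.

    Solves (D - W) v = lambda * D v via the equivalent symmetric problem
    D^(-1/2) (D - W) D^(-1/2) u = lambda * u, with v = D^(-1/2) u.
    """
    d = np.asarray(W.sum(axis=1)).ravel()          # node degrees
    d_isqrt = diags(1.0 / np.sqrt(d))
    L = diags(d) - W                               # unnormalized graph Laplacian
    L_sym = d_isqrt @ L @ d_isqrt                  # symmetric normalized Laplacian
    vals, vecs = eigsh(L_sym, k=2, which='SM')     # two smallest eigenpairs
    u = vecs[:, np.argmax(vals)]                   # eigenvector of the 2nd smallest eigenvalue
    v = d_isqrt @ u                                # back to the generalized eigenvector
    # Threshold the eigenvector to derive the cut; the median is a simple
    # illustrative choice, in practice one can sweep thresholds and keep the best NCut.
    return v > np.median(v)
```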

3.2 p-spectral clustering

p-spectral clustering is a generalization of spectral clustering. It is based on the graph p-Laplacian, a nonlinear operator on graphs which reduces to the standard graph Laplacian for p = 2.


The standard graph Laplacian operator $\Delta_2$ can be defined as the operator which induces the following quadratic form for a function $f : S \to \mathbb{R}$:

$$\langle f, \Delta_2 f \rangle = \frac{1}{2} \sum_{i,j=1}^{n} w_{ij} (f_i - f_j)^2.$$

We can similarly define an operator $\Delta_p$ which induces the general form (for $p > 1$):

$$\langle f, \Delta_p f \rangle = \frac{1}{2} \sum_{i,j=1}^{n} w_{ij} |f_i - f_j|^p.$$

$\Delta_p$ is the graph p-Laplacian, and can be defined as follows (see [11] for more details):

$$(\Delta_p f)_i = \frac{1}{2} \sum_{j \in S} w_{ij}\, \Phi_p(f_i - f_j),$$

where $\Phi_p : \mathbb{R} \to \mathbb{R}$ is defined for $x \in \mathbb{R}$ as

$$\Phi_p(x) = |x|^{p-1}\, \mathrm{sign}(x).$$
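As a small illustration of this definition (not part of the original paper), the operator can be applied directly to a vertex function with a dense weight matrix:

```python
import numpy as np

def p_laplacian_apply(W, f, p):
    """Sketch: evaluate (Delta_p f)_i = 1/2 * sum_j w_ij * Phi_p(f_i - f_j)
    for a dense symmetric weight matrix W and a vertex function f."""
    diff = f[:, None] - f[None, :]                   # all pairwise differences f_i - f_j
    phi = np.abs(diff) ** (p - 1) * np.sign(diff)    # Phi_p applied elementwise
    return 0.5 * (W * phi).sum(axis=1)               # row-wise weighted sum
```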

The interest in using p-spectral clustering derives from the generalized isoperimetric inequality of Amghibech [11], which relates the second eigenvalue of the graph p-Laplacian to the optimal Cheeger cut.

Let us first introduce the normalized Cheeger cut:

$$\mathrm{NCC}(S_1, S_2) = \frac{\mathrm{Cut}(S_1, S_2)}{\min\{\mathrm{Vol}(S_1), \mathrm{Vol}(S_2)\}};$$

then by taking $h_{NCC} = \inf_{S} \mathrm{NCC}(S_1, S_2)$, we get the optimal normalized Cheeger cut $h_{NCC}$.

The isoperimetric inequality for the normalized p-Laplacian has been proven by Amghibech [11]:

$$2^{p-1} \left(\frac{h_{NCC}}{p}\right)^{p} \leq \lambda_p^{(2)} \leq 2^{p-1}\, h_{NCC},$$

where $\lambda_p^{(2)}$ is the second eigenvalue of the p-Laplacian.

For $p = 2$, we recover the standard isoperimetric inequality:

$$\frac{h_{NCC}^{2}}{2} \leq \lambda_2^{(2)} \leq 2\, h_{NCC},$$

where $\lambda_2^{(2)}$ is the second eigenvalue of the standard Laplacian.

By thresholding the second eigenvector $v_p^{(2)}$ of the normalized graph p-Laplacian, we get a partitioning of the graph. The optimal threshold is given by

$$\arg\min_{t}\ \mathrm{NCC}(S_t, S \setminus S_t), \quad S_t = \{\, i \in S \mid v_p^{(2)}(i) > t \,\},$$

where $i$ is the index of an element of $S$.
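A minimal sketch of this threshold search, assuming a dense NumPy adjacency matrix and an already computed eigenvector (the helper name and interface are hypothetical):

```python
import numpy as np

def best_cheeger_threshold(v, W):
    """Sketch: sweep thresholds t over the eigenvector v and return the set
    S_t = {i : v(i) > t} minimizing the normalized Cheeger cut."""
    d = W.sum(axis=1)                         # node degrees (volumes)
    best_ncc, best_mask = np.inf, None
    for t in np.unique(v)[:-1]:               # candidate thresholds (keep S_t non-empty)
        mask = v > t                          # S_t
        cut = W[mask][:, ~mask].sum()         # total weight crossing the cut
        vol = min(d[mask].sum(), d[~mask].sum())
        if vol > 0 and cut / vol < best_ncc:
            best_ncc, best_mask = cut / vol, mask
    return best_mask, best_ncc
```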

Recently, Buhler and Hein [12] proved that

$$h_{NCC} \leq h^{*}_{NCC} \leq p\, (h_{NCC})^{\frac{1}{p}},$$

where $h^{*}_{NCC}$ is the normalized Cheeger cut value obtained by thresholding the second eigenvector $v_p^{(2)}$.

For $p \to 1$, the cut found by thresholding $v_p^{(2)}$ converges to the optimal Cheeger cut $h_{NCC}$, since $p\,(h_{NCC})^{\frac{1}{p}} \to h_{NCC}$.

4 OUR APPROACH

In this section, we first show how spectral clustering relates to the theory of salience of visual parts when using the standard adjacency matrix. We will see that even though this adjacency matrix achieves good results in highlighting the structural segments of a model, it is not sufficient to reach the segmentation quality that humans produce. Therefore, we define a new adjacency matrix using elements of cognitive studies.

4.1 The adjacency matrix and spectral clustering.

Spectral clustering can be related to one of the main ideas behind the theory of salience of visual parts. In fact, humans tend to partition a 3D model into two segments such that the resulting parts both have large areas and a cut boundary with a small perimeter. This is exactly what the optimal NCC tries to achieve by finding the best partitioning S1 and S2 that minimizes Cut(S1, S2) and maximizes the ratio of the volumes of the segments (the highest ratio being obtained when Vol(S1) = Vol(S2)).

This relationship holds only for regular meshes, i.e. meshes whose faces all have equal areas and whose vertices all have equal degrees (Fig. 2). In this case, the area and the volume of a segment can be equated.

Fig. 2. One bipartitioning using 1-spectral clustering and the adjacency matrix: (a) regular mesh, (b) irregular mesh.

In fact, for regular meshes, we can show that the ratio of the volumes of two segments is equal to the ratio of their areas.

The total area of a segment $i$ is $\mathrm{Area}_{face} \cdot |S_i|$, where $|S_i|$ denotes the number of faces in this segment. We then have:

$$\text{area ratio} = \frac{\mathrm{Area}_{face} \cdot |S_1|}{\mathrm{Area}_{face} \cdot |S_2|} = \frac{|S_1|}{|S_2|}.$$

The volume of a segment is $\mathrm{Vol}(S_i) = \sum_{s_j \in S_i} D_{jj}$. Since the mesh is regular, all $D_{jj} = k$ are equal and we get $\mathrm{Vol}(S_i) = k \cdot |S_i|$. We then have:

$$\text{volume ratio} = \frac{k \cdot |S_1|}{k \cdot |S_2|} = \frac{|S_1|}{|S_2|}.$$

This relationship is what motivates our choice of the spectral clustering approach for mesh segmentation.

But cognitive studies also state that cuts should occur in concave regions, and neither spectral clustering nor the standard adjacency matrix takes this criterion into account. Fig. 3 shows the limitation of the standard adjacency matrix: the cuts do not occur in the regions of minimal curvature and the fingers are not correctly extracted, even though the structural segmentation is clearly highlighted.

It clearly appears that the curvature information should be encoded in the adjacency matrix.

Fig. 3. Hand model segmentation using 1-spectral clustering and the standard adjacency matrix.

4.2 Defining a new adjacency matrix

We define a new adjacency matrix in which we encode a new connectivity of the mesh using the minima rule.

We define the new adjacency matrix as:

$$W_{ij} = \begin{cases} 1 & \text{if } i \text{ and } j \text{ are adjacent and do not belong to a concave region,} \\ 0 & \text{otherwise.} \end{cases}$$

However, we found in practice that setting $W_{ij} = 0$, or to very small values, favors the apparition of small clusters even if we use the normalized Cheeger cut. Indeed, since

$$\mathrm{NCC}(S_1, S_2) = \frac{\mathrm{Cut}(S_1, S_2)}{\min\{\mathrm{Vol}(S_1), \mathrm{Vol}(S_2)\}},$$

very small values lead to $\mathrm{Cut}(S_1, S_2) \simeq 0$, which gives $\mathrm{NCC}(S_1, S_2) \simeq 0$ regardless of the values of $\mathrm{Vol}(S_1)$ and $\mathrm{Vol}(S_2)$.

We found experimentally that values from 0.1 to 0.2 provide the best results. By using this interval instead of the zero value, we can also encode another piece of information derived from the theory of salient parts: concave regions with high curvature are better candidates for cuts than those with low curvature.

Because the ranges of curvature values differ greatly among meshes, we first normalize the values as described in [8]. This allows us to choose a single threshold to determine deep concave regions. If $k(v)$ is the minimum curvature value at a vertex $v$, the normalized value is

$$k^{(norm)}(v) = \frac{k(v) - \mu}{\sigma},$$

where $\mu$ is the mean and $\sigma$ the standard deviation of $k(v)$ over all vertices of the mesh. Deep concave regions are those containing vertices with $k^{(norm)}(v) < -0.8$.
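For illustration, the normalization and thresholding step can be written as follows; the function name and array interface are assumptions, only the z-score formula and the -0.8 threshold come from the text:

```python
import numpy as np

def deep_concave_vertices(k_min, threshold=-0.8):
    """Sketch: flag deep concave vertices from per-vertex minimum curvatures.

    k_min: array with the minimum principal curvature value k(v) of each vertex.
    Returns a boolean mask where the normalized curvature is below the threshold.
    """
    k_norm = (k_min - k_min.mean()) / k_min.std()   # k_norm(v) = (k(v) - mu) / sigma
    return k_norm < threshold
```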

Since areas of minimal curvature can be quite large, we implement an algorithm to thin those regions and select only the areas with strong features.

The selection of the candidate concave vertices where the cut should occur is done as follows:

First, we define a $2 \times (N+1)$ curvature matrix $K$ such that $k_{1i}$ and $k_{2i}$ are the principal curvatures of a vertex and of its 1-ring neighborhood of size $N$.

Step 1: we select the vertex with the deepest curvature.

Step 2: we label the selected vertex and compute the correlation between the curvature matrix of this vertex's region and the curvature matrices of its 1-ring neighborhood, using the Mahalanobis distance [13].

Step 3: the vertex with the minimal distance is selected, and we repeat steps 2 and 3. When a vertex is labeled twice, it means that we went from vertex A to vertex B and then from vertex B (or another vertex C) back to vertex A. Vertex A is then removed from the graph to avoid getting stuck in an infinite loop, a local merging step is performed, and its 1-ring neighbors are marked as non-concave vertices.

The algorithm stops when no concave vertices are left. We repeat this thinning process, starting again at step 1, until all concave regions are processed.

The advantage of using the Mahalanobis distance is that it allows us to select only neighboring vertices belonging to regions that are similar to the current region. This helps in handling noisy meshes and in obtaining regular thinned paths.
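One plausible reading of the distance computation in step 2, comparing the mean curvature descriptors of two regions under the (pseudo-)inverse covariance of the candidate region, is sketched below; this is an assumption about the implementation, not a description taken from the paper:

```python
import numpy as np
from scipy.spatial.distance import mahalanobis

def region_distance(K_a, K_b):
    """Sketch (assumed interpretation): Mahalanobis distance between two
    2 x (N+1) curvature matrices K_a and K_b (rows: principal curvatures
    k1, k2; columns: a vertex and its 1-ring neighbors)."""
    cov = np.cov(K_a)                       # 2 x 2 covariance of the (k1, k2) rows
    VI = np.linalg.pinv(cov)                # (pseudo-)inverse covariance
    return mahalanobis(K_a.mean(axis=1), K_b.mean(axis=1), VI)
```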

As a final step, we apply a cleanup algorithm on the thinned regions to delete isolated vertices and insignificant regions:

Step 1: we compute the length of the paths of the thinned regions.

Step 2: we keep the long paths; among the short paths, we only keep those containing vertices of deep curvature $k^{(norm)}(v) < -0.8$ (e.g., regions under the arms or behind the ears).

We then build our new adjacency matrix as follows: we first set all vertices of the mesh to be convex except those of the thinned concave regions, and for each of those concave vertices we set the value 0.1 or 0.2 as the new weight between it and its neighbors:

$$W_{ij} = \begin{cases} 1 & \text{if } i \text{ and } j \text{ are adjacent and do not belong to a thinned concave region,} \\ 0.1 & \text{if } \left|k^{(norm)}(v_i) - k^{(norm)}(v_j)\right| \geq 2, \\ 0.2 & \text{if } \left|k^{(norm)}(v_i) - k^{(norm)}(v_j)\right| < 2, \\ 0 & \text{otherwise.} \end{cases}$$
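The construction of this matrix can be sketched as follows, assuming the mesh is given as an edge list together with the thinned-concave-vertex mask and the normalized curvatures; the names and the exact handling of edges touching a concave vertex are our interpretation:

```python
import numpy as np
from scipy.sparse import lil_matrix

def build_weighted_adjacency(edges, n_vertices, concave_mask, k_norm):
    """Sketch: curvature-aware adjacency matrix. Weight 1 for ordinary edges,
    0.1 or 0.2 for edges touching a thinned concave region, depending on the
    normalized-curvature difference (threshold fixed to 2)."""
    W = lil_matrix((n_vertices, n_vertices))
    for i, j in edges:
        if concave_mask[i] or concave_mask[j]:
            w = 0.1 if abs(k_norm[i] - k_norm[j]) >= 2 else 0.2
        else:
            w = 1.0
        W[i, j] = W[j, i] = w
    return W.tocsr()
```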

We fixed the threshold value to 2 by experiment.

As explained in Section 2, the major drawback of algorithms based on the minima rule is the difficulty of getting past the bending convex zone.

We propose to solve this problem with a min-cut graph approach. Our matrix defines where the cuts should occur, and the optimal NCC provides the best cut through the convex area. The minimum cut is obtained only if the cut follows the shortest path (all weights in the convex area are equal to 1), which is equivalent to the solution proposed by Lee [8].

5 EXPERIMENTAL RESULTS

Most segmentation algorithms are tested on their own set of 3D models; the results are presented and compared with other algorithms by showing some images side by side, which can be misleading in terms of comparison. In fact, results can be quite good for some models and really bad for others, depending on the input 3D model. To make a fair comparison of our results with those of other algorithms, we use the benchmark containing 380 different models from the Watertight Track of the 2007 SHREC Shape-based Retrieval Contest. Several human segmentations of each model are also provided in the benchmark.

In order to evaluate the quality of the algorithms' segmentations, Chen et al. [14] propose four quantitative metrics and provide programs which automate the process of computing the four metrics on the whole benchmark (http://segeval.cs.princeton.edu). They also provide the segmentation results of different algorithms to allow efficient comparison. The algorithms are: K-means [15], Random Walks [16], Fitting Primitives [17], Normalized Cuts and Randomized Cuts [18], Core Extraction [19], and Shape Diameter Function [20].

5.1 Quantitative Evaluation

We first present a short description of the four metrics; see [14] for more details.

1) Cut Discrepancy: a boundary-based metric that measures the distances between the cuts of a segmentation algorithm and those of the ground-truth (human) segmentation.

2) Hamming Distance: measures the overall region-based difference between two segmentation results. It relies upon finding correspondences between segments.

3) Rand Index: measures the likelihood that a pair of faces are either in the same segment in two segmentations, or in different segments in both segmentations. It models area overlaps of segments.

4) Consistency Error: tries to account for nested, hierarchical similarities and differences in segmentations.

We will first evaluate our approach using 1-spectral clustering and, in the next section, we will compare the performance of 1-spectral and 2-spectral clustering.

In order to compute the optimal NCC for p = 1, we used two different algorithms, proposed in [21] and [22]. We compare the results of both implementations in the experimental results section.

In [22], the authors proved that the Cheeger cut problem is equivalent to a total variation problem, and they used a split Bregman algorithm for cut minimization.

In [21], since the p-Laplacian is a nonlinear operator, the authors relate the 1-spectral clustering problem to a nonlinear eigenproblem. The generalization of the inverse power method that they derive allows the computation of the nonlinear eigenvector to which we apply the thresholding.

In both cases we apply a recursive bipartitioning as described in [10], meaning that once a segment is extracted from a mesh, 1-spectral clustering is applied to the remaining mesh until the desired number of segments is reached. A sketch of this driver is given below.
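The driver is a simple loop around a hypothetical `bipartition` callback (for example the 1-spectral Cheeger-cut routine) that splits a set of mesh elements into an extracted segment and the remainder; the interface is an assumption for illustration:

```python
def hierarchical_segmentation(elements, n_segments, bipartition):
    """Sketch: recursive bipartitioning that extracts one segment at a time
    from the remaining elements until the desired number is reached."""
    segments, remainder = [], elements
    while len(segments) < n_segments - 1 and len(remainder) > 1:
        segment, remainder = bipartition(remainder)   # split off one segment
        segments.append(segment)
    segments.append(remainder)                        # last piece is the final segment
    return segments
```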

Object       Cheeger  Rand  Shape  Norm  Core   Rand  Fit    K
Category     Cuts     Cuts  Diam   Cuts  Extra  Walk  Prim   Means
Human        1        2     6      3     8      7     4      5
Cup          1        2     6      3     4      5     7      8
Glasses      1        2     5      3     7      8     6      4
Airplane     2        3     1      5     8      7     4      6
Ant          1        3     2      4     5      6     7      8
Chair        1        5     3      2     6      4     7      8
Octopus      1        5     2      4     3      6     8      7
Table        1        8     5      2     6      3     4      7
Teddy        2        1     3      5     4      6     7      8
Hand         2        1     8      4     5      6     7      3
Plier        3        2     8      5     1      6     4      7
Fish         4        3     1      6     2      5     8      7
Bird         1        2     3      5     4      8     7      6
Armadillo    1        4     2      6     8      5     3      7
Bust         2        1     4      7     6      3     5      8
Mech         2        5     4      1     7      3     6      8
Bearing      5        2     1      3     8      6     4      7
Vase         2        1     5      4     3      6     7      8
Fourleg      2        3     1      7     5      8     4      6
Overall      1        2     4      3     6      7     5      8

TABLE 1: Comparison of segmentation algorithms for each object category using the Rand Index evaluation metric (1 is the best, and 8 is the worst).

We choose a separate value for the number of segments for each model, setting it to be the mode of the number of segments appearing in the segmentations created by people for that model in the benchmark data set [14].

Fig. 4 shows the evaluation of all the algorithms on the whole benchmark and also includes an evaluation of the human-generated segmentations with respect to one another. The results show that our approach performs extremely well: our results are the closest to the human segmentations with respect to all four metrics.

The IPM algorithm performs better than the TV algorithm. We think that the lack of any convergence guarantee for the TV algorithm may explain this, since nothing ensures that we obtain the best cuts in the graph.

In contrast, the authors of the IPM algorithm proved that their algorithm converges to a nonconstant eigenvector of the 1-Laplacian, even if they do not guarantee convergence to the second eigenvector. To overcome this problem, they recommend using multiple random initializations and keeping the result which achieves the best ratio Cheeger cut.

Since both algorithms are time-consuming for large meshes, we limit ourselves to 1000 iterations for the TV algorithm and to 6 iterations for the IPM method (one of the iterations uses the second eigenvector of the standard graph Laplacian, the others are random initializations). Better results can be achieved by increasing the number of iterations.

In the following, we only discuss the results obtained with the inverse power method, as it gives the best results.

In Table 1, we provide a comparison between the algorithms for each category. Our approach gives the best results in almost all categories. We investigated further to understand the reasons for the poor ranking in the Bearing, Fish and Plier categories.


Fig. 4. Comparison between the different segmentation algorithms and the ground truth using the four metrics: (a) Cut Discrepancy, (b) Hamming Distance, (c) Rand Index, (d) Consistency Error. Cheeg IPM means Cheeger cuts using the inverse power method algorithm, and Cheeg TV means Cheeger cuts using the total variation algorithm.

Bearing category: as can be seen in Fig. 5, we obtain a perfect segmentation of the bearing model when only four clusters are selected, but our approach cannot provide the fifth correct cluster (based on human perception, the evaluation was made with five clusters for this model and similar models).

Fig. 5. Segmentation of the same 3D bearing model: (a) 4 segments, (b) 5 segments.

In Fig. 6, we highlight in white the thinned concave regions and in red the edges connected to the concave regions of the bearing model. Because we allowed weights on both sides of the thinned concave region to take small values, the volume (sum of the edge weights) between the two white lines is very small (each red edge has weight 0.1 or 0.2). This means that even if the value of the cut is small, dividing it by a small volume makes the value of the normalized Cheeger cut high, which prevents this region from being correctly segmented. We have also noticed that most of the objects in the Bearing category are irregular meshes, for which spectral clustering performs worse.

Fig. 6. In white, the concave areas; in red, the edges between the concave areas.

Plier category: Fig. 7(a) shows the performance of the algorithms for the Plier category. Our approach is ranked third, but we can clearly see that it performs extremely well and is as good as Randomized Cuts.

For some categories, like Cup, our method clearly outperforms the others even though most of the models have non-zero genus; see Fig. 7(b).

The Armadillo category contains 20 models of the same armadillo in different poses. We obtain the best ranking in this category, which clearly shows that our approach is pose invariant.

5.2 Evaluation of p-spectral clustering and the new adjacency matrix

In this second part, we propose a more detailed evaluation by decoupling the p-spectral clustering from the graph construction, to highlight the contribution of each.

Fig. 7. Ranking for (a) the Plier category and (b) the Cup category.

Fig. 8. Evaluation of p-spectral clustering and the new adjacency matrix.

In Fig. 8, we compare the quality of 2-spectral clustering using the standard adjacency matrix (2pAdj) and using our new adjacency matrix (2pCurv). As we can see, the new matrix significantly improves the segmentation quality of the 3D models, providing slightly better results than Randomized Cuts with a much lower compute time (Table 2). We notice the same improvement in quality when comparing 1-spectral clustering using the standard adjacency matrix (Cheeg Adj) and 1-spectral clustering using our matrix (Cheeg Curv). This clearly highlights the importance of encoding information from cognitive studies into the adjacency matrix.

1-spectral clustering provides better quality than 2-spectral clustering with both the standard adjacency matrix and the newly defined adjacency matrix. Even though the improvements in quality are not highly significant (Fig. 8, Cheeg Curv vs. 2pCurv), we note that models without articulations and models with significant topological changes are not segmented correctly by the 2-spectral approach (Fig. 9). Since the SHREC 2007 database contains only a few models of this kind, this explains the small improvement in segmentation quality that we obtain.

5.3 Qualitative Evaluation

Fig. 9. Top row: standard spectral clustering (2-spectral clustering). Bottom row: 1-spectral clustering.

We show in Fig. 10 some results obtained using optimal normalized Cheeger cuts with the IPM method on models from the benchmark.

Fig. 10. More of our results on models from the SHREC benchmark.


5.4 Computing Time

The compute time is measured in seconds over all the models of the benchmark. We present in Table 2 the average compute time, which includes the restarts for both the IPM and TV methods.

The compute times for the other algorithms are taken from [14], where the computation was done on a 2 GHz processor. Our approach was evaluated on a 3 GHz processor.

The average number of vertices per model is 10224, and the average number of faces per model is 20444.

Segmentation Algorithm     Avg Compute Time (s)
Cheeger IPM                69
Cheeger TV                 52
2-Spectral Clustering      8.7
Randomized Cuts            83.8
Shape Diameter             8.9
Normalized Cuts            49.4
Core Extraction            19.5
Random Walks               1.4
Fitting Primitives         4.6
K-means                    2.5

TABLE 2: Average compute time for each algorithm on the whole benchmark.

6 CONCLUSION

We have introduced an automatic segmentation algorithm for 3D objects which iteratively bisects a sub-mesh; the best cuts are chosen based on the optimal normalized Cheeger cut. All evaluations were done using the same parameters, even though we noticed that some parameter changes can improve the results for some complex meshes.

Our main idea was to encode in a single matrix both structural and geometrical information in order to force the cuts to occur in the concave regions. Then, using the spectral clustering approach, we were able to find the best cuts through the convex regions.

The recursive approach allows the extraction of the meaningful parts of the 3D mesh first, before dealing with the small parts, in accordance with human perception.

The excellent results we have obtained clearly show the importance of encoding criteria based on human perception into segmentation algorithms, and also demonstrate the superiority of 1-spectral clustering over standard spectral clustering for some complex meshes.

However, there are still two major points that we did not address in this work: our approach cannot automatically provide the optimal number of segments per model, and it cannot deal with highly irregular meshes. These two issues will be addressed in future work.

REFERENCES

[1] A. Shamir, "Segmentation and shape extraction of 3D boundary meshes," in State of the Art Report, Proceedings Eurographics 2006, 2006, pp. 137–149.

[2] A. Shamir, "A formulation of boundary mesh segmentation," in Proceedings of the 3D Data Processing, Visualization, and Transmission, 2nd International Symposium, ser. 3DPVT '04, 2004, pp. 82–89.

[3] D. D. Hoffman and W. A. Richards, "Parts of recognition," in Readings in Computer Vision: Issues, Problems, Principles, and Paradigms, 1987, pp. 227–242.

[4] A. Agathos, I. Pratikakis, S. Perantonis, N. Sapidis, and P. Azariadis, "3D mesh segmentation methodologies for CAD applications," Computer-Aided Design and Applications, vol. 4, no. 6, pp. 827–842, 2007.

[5] R. Liu and H. Zhang, "Segmentation of 3D meshes through spectral clustering," in Proceedings of the Computer Graphics and Applications, 12th Pacific Conference, ser. PG '04, 2004, pp. 298–305.

[6] Y. Zhang, J. Paik, A. Koschan, and M. A. Abidi, "A simple and efficient algorithm for part decomposition of 3-D triangulated models based on curvature analysis," in Proceedings of the International Conference on Image Processing, III, 2002, pp. 273–276.

[7] L. Moumoun, M. Chahhou, and T. Gadi, "3D hierarchical segmentation using the markers for the watershed transformation," International Journal of Engineering Science and Technology, vol. 2, July 2010, pp. 3165–3171.

[8] Y. Lee, S. Lee, A. Shamir, D. Cohen-Or, and H.-P. Seidel, "Mesh scissoring with minima rule and part salience," Comput. Aided Geom. Des., vol. 22, pp. 444–465, July 2005.

[9] U. Luxburg, "A tutorial on spectral clustering," Statistics and Computing, vol. 17, pp. 395–416, December 2007.

[10] J. Shi and J. Malik, "Normalized cuts and image segmentation," IEEE Trans. Pattern Anal. Mach. Intell., vol. 22, pp. 888–905, August 2000.

[11] S. Amghibech, "Eigenvalues of the discrete p-Laplacian for graphs," Ars Comb., vol. 67, 2003.

[12] T. Buhler and M. Hein, "Spectral clustering based on the graph p-Laplacian," in Proceedings of the 26th Annual International Conference on Machine Learning, ser. ICML '09, 2009, pp. 81–88.

[13] P. C. Mahalanobis, "On the generalised distance in statistics," in Proceedings of the National Institute of Science, India, vol. 2, no. 1, Apr. 1936, pp. 49–55.

[14] X. Chen, A. Golovinskiy, and T. Funkhouser, "A benchmark for 3D mesh segmentation," ACM Trans. Graph., vol. 28, pp. 73:1–73:12, July 2009.

[15] S. Shlafman, A. Tal, and S. Katz, "Metamorphosis of polyhedral surfaces using decomposition," in Computer Graphics Forum, 2002, pp. 219–228.

[16] Y.-K. Lai, S.-M. Hu, R. R. Martin, and P. L. Rosin, "Fast mesh segmentation using random walks," in Proceedings of the 2008 ACM Symposium on Solid and Physical Modeling, ser. SPM '08, 2008, pp. 183–191.

[17] M. Attene, S. Katz, M. Mortara, G. Patane, M. Spagnuolo, and A. Tal, "Mesh segmentation - a comparative study," in Proceedings of the IEEE International Conference on Shape Modeling and Applications 2006, 2006, pp. 7–.

[18] A. Golovinskiy and T. Funkhouser, "Randomized cuts for 3D mesh analysis," ACM Trans. Graph., vol. 27, pp. 145:1–145:12, December 2008.

[19] S. Katz, G. Leifman, and A. Tal, "Mesh segmentation using feature point and core extraction," The Visual Computer, vol. 21, no. 8-10, pp. 649–658, 2005.

[20] L. Shapira, A. Shamir, and D. Cohen-Or, "Consistent mesh partitioning and skeletonisation using the shape diameter function," Vis. Comput., vol. 24, pp. 249–259, March 2008.

[21] M. Hein and T. Buhler, "An inverse power method for nonlinear eigenproblems with applications in 1-spectral clustering and sparse PCA," in Advances in Neural Information Processing Systems 23, 2010, pp. 847–855.

[22] A. Szlam and X. Bresson, "Total variation and Cheeger cuts," in Proceedings of the 27th International Conference on Machine Learning (ICML-10), June 2010, pp. 1039–1046.