
Planted Cliques, Iterative Thresholding and Message-Passing Algorithms

Yash Deshpande and Andrea Montanari

Stanford University

November 5, 2013

Deshpande, Montanari Planted Cliques November 5, 2013 1 / 49

Problem Definition

Given distributions Q_0, Q_1:

A set S ⊆ [n]

Data: A_ij ∼ Q_1 if i, j ∈ S, and A_ij ∼ Q_0 otherwise, with A_ij = A_ji.

[Figure: the symmetric matrix A, with the S × S block drawn from Q_1 and all remaining entries drawn from Q_0.]

Problem: Given A, identify S.


An Example

Q_1 = N(λ, 1),  Q_0 = N(0, 1).

[Figure: the matrix A, with the S × S block of mean λ and all other entries of mean 0.]

An Example

A = λ u_S u_S^T + Z,  Z_ij ∼ N(0, 1).

[Figure: A as a rank-one block of height λ on the S × S entries plus Gaussian noise.]

Data = Sparse, Low Rank + Noise

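The Gaussian example above is easy to sample directly; a minimal numpy sketch (the function name, seed, and normalization are illustrative, not from the talk):

```python
import numpy as np

def gaussian_instance(n, k, lam, seed=0):
    """Sample A = lam * u_S u_S^T + Z with symmetric N(0, 1) noise,
    where u_S is the 0/1 indicator of a hidden set S of size k."""
    rng = np.random.default_rng(seed)
    S = rng.choice(n, size=k, replace=False)
    u = np.zeros(n)
    u[S] = 1.0
    G = rng.standard_normal((n, n))
    Z = (G + G.T) / np.sqrt(2.0)   # symmetric noise, off-diagonal entries N(0, 1)
    return lam * np.outer(u, u) + Z, np.sort(S)
```

The S × S block then has elevated mean λ, which is the "sparse, low rank + noise" structure referred to on the slide.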

Much work in statistics

Denoising: y = x + noise
[Donoho, Jin 2004], [Arias-Castro, Candès, Durand 2011], [Arias-Castro, Bubeck, Lugosi 2012]

Sparse signal recovery: y = Ax + noise
[Candès, Romberg, Tao 2004], [Chen, Donoho 1998]

The simple example combines different structures.


Our Running Example: Planted Cliques

A is the ±1 “adjacency” matrix:

Q_1 = δ_{+1},  Q_0 = (1/2) δ_{+1} + (1/2) δ_{−1}.

S forms a clique in the graph. [Jerrum, 1992]


Our Running Example: Planted Cliques

- Average-case version of MAX-CLIQUE
- Communities in networks


How large should S be?

Let |S| = k.

The largest clique in G(n, 1/2) has size ≈ 2 log_2 n (second moment calculation) ⇒ the planted clique is identifiable only if k > 2 log_2 n.


Progress(0)

[Figure: complexity vs. clique size k. Exhaustive search: complexity n^{2 log n} at k = 2 log_2 n.]

A Naive Algorithm

Pick the k largest-degree vertices of G as Ŝ.


Analysis of NAIVE

If i ∉ S:

deg(i) ∼ Binomial(n − 1, 1/2)  ⇒  max_{i∉S} deg(i) ≤ n/2 + O(√(n log n)).

If i ∈ S:

deg(i) = k − 1 + Binomial(n − k + 1, 1/2)  ⇒  min_{i∈S} deg(i) ≥ (k − 1)/2 + n/2 − O(√(n log n)).

NAIVE works if k ≥ C √(n log n).

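The NAIVE degree test is a few lines of numpy. A sketch (the sampler and the parameters n = 2000, k = 400 ≫ √(n log n) are illustrative choices, not from the talk):

```python
import numpy as np

def planted_clique(n, k, seed=0):
    """±1 'adjacency' matrix of the planted clique model: +1 inside the
    hidden k x k block, symmetric fair ±1 coin flips elsewhere, zero diagonal."""
    rng = np.random.default_rng(seed)
    S = rng.choice(n, size=k, replace=False)
    U = np.triu(rng.choice([-1.0, 1.0], size=(n, n)), 1)
    A = U + U.T
    A[np.ix_(S, S)] = 1.0
    np.fill_diagonal(A, 0.0)
    return A, set(S.tolist())

def naive_clique(A, k):
    """NAIVE: return the k vertices of largest degree (row sum of A)."""
    return set(np.argsort(A.sum(axis=1))[-k:].tolist())
```

Well above the √(n log n) threshold, the degree separation computed on the slide makes recovery essentially exact.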

Progress(1)

[Figure: complexity vs. k. Exhaustive search: n^{2 log n} at k = 2 log_2 n; NAIVE: n² at k = √(n log n). [Kučera, 1995]]

Spectral Method

A = u_S u_S^T + Z, where u_S is the 0/1 indicator of S and Z has ±1 entries outside the S × S block.

[Figure: block decomposition of A into the all-ones S × S block plus ±1 noise.]

Hopefully v_1(A) ≈ u_S.


Analysis of SPECTRAL

(1/√n) A = (k/√n) e_S e_S^T + Z/√n,  with e_S = u_S/√k and λ := k/√n.

By standard linear algebra:

‖Z/√n‖_2 ≤ 2  ⇒  ⟨v_1(A), e_S⟩ ≥ 1 − δ(λ).

SPECTRAL works if k ≥ C√n.

Progress(2)

[Figure: complexity vs. k, adding SPECTRAL: complexity n² log n at k = C√n. [Alon, Krivelevich and Sudakov, 1998]]

Progress(2)

[Same figure; the k = C√n point is also achieved by other methods: [Ames, Vavasis 2009], [Feige, Krauthgamer 2000].]

Progress(3)

[Figure: complexity vs. k, adding:
k = 1.261√n at complexity n² — [Dekel, Gurel-Gurevich and Peres, 2010];
k = C√(n/2^r) at complexity n^{r+2} — [Alon, Krivelevich and Sudakov, 1998].]


Phase transition in spectrum of A

If k ≥ (1 + ε)√n:

[Figure: limiting spectral density of A/√n, a semicircle on [−2, 2], with an outlier eigenvalue.]

⟨v_1(A), e_S⟩ ≥ δ(ε).

If k ≤ (1 − ε)√n:

[Figure: the same semicircle on [−2, 2], with no outlier.]

⟨v_1(A), e_S⟩ ≤ n^{−1/2 + δ′(ε)}.

[Knowles, Yin, 2011]

Seems much harder than it looks!

- “Statistical algorithms” fail if k = n^{1/2−δ}: [Feldman et al., 2012]
- r rounds of Lovász–Schrijver fail for k ≲ √(n/2^r): [Feige, Krauthgamer, 2002]
- r rounds of Lasserre fail for k ≲ √n/(log n)^{r²}: [Wigderson and Meka, 2013]


Our result

Theorem (Deshpande, Montanari, 2013)

If |S| = k ≥ (1 + ε)√(n/e), there exists an O(n² log n) time algorithm that identifies S with high probability.

I will present:

1. A (wrong) heuristic analysis
2. How to fix the heuristic
3. Lower bounds


Iterative Thresholding

The power iteration:

v^{t+1} = A v^t.

Improvement:

v^{t+1} = A F_t(v^t),  where F_t(v) = (f_t(v_1), f_t(v_2), …, f_t(v_n))^T.

Choose f_t(·) to exploit the sparsity of e_S.

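The nonlinear power iteration can be sketched in a few lines. The talk's optimal nonlinearity is the exponential f_t(x) = exp(μ_t x − μ_t²); for a numerically tame illustration this sketch uses the positive part f(x) = max(x, 0) instead, which also suppresses coordinates pointing away from the nonnegative direction e_S (the sampler and parameters are illustrative, not from the talk):

```python
import numpy as np

def planted_clique(n, k, seed=0):
    """±1 planted clique instance: +1 inside the hidden k x k block,
    symmetric fair ±1 coin flips elsewhere, zero diagonal."""
    rng = np.random.default_rng(seed)
    S = rng.choice(n, size=k, replace=False)
    U = np.triu(rng.choice([-1.0, 1.0], size=(n, n)), 1)
    A = U + U.T
    A[np.ix_(S, S)] = 1.0
    np.fill_diagonal(A, 0.0)
    return A, set(S.tolist())

def iterative_thresholding(A, k, t_max=8):
    """Nonlinear power iteration v^{t+1} = (1/sqrt(n)) A F(v^t),
    with f(x) = max(x, 0) standing in for the talk's exponential f_t."""
    n = A.shape[0]
    v = np.ones(n)
    for _ in range(t_max):
        v = (A @ np.maximum(v, 0.0)) / np.sqrt(n)
    return set(np.argsort(v)[-k:].tolist())
```

Well above threshold (here λ = k/√n = 3), the clique coordinates grow geometrically faster than the noise coordinates, so the top-k entries of v^t recover S.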

(Wrong) Analysis

v_i^{t+1} = (1/√n) Σ_j A_ij f_t(v_j^t).

The A_ij are random ±1 r.v.'s.

Use the Central Limit Theorem for v_i^{t+1}.


(Wrong) Analysis

If i ∉ S:

v_i^{t+1} = (1/√n) Σ_j A_ij f_t(v_j^t) ≈ N(0, (1/n) Σ_j f_t(v_j^t)²).

Letting v_i^t ≈ N(0, σ_t²) …

σ_{t+1}² = (1/n) Σ_j f_t(v_j^t)² = E{f_t(σ_t Z)²},  Z ∼ N(0, 1).


(Wrong) Analysis

If i ∈ S:

v_i^{t+1} = (1/√n) Σ_{j∈S} f_t(v_j^t) + (1/√n) Σ_{j∉S} A_ij f_t(v_j^t),

where the first term is μ_{t+1} and the second is ≈ N(0, σ_{t+1}²), with

μ_{t+1} = (1/√n) Σ_{j∈S} f_t(v_j^t) = (k/√n) · (1/k) Σ_{j∈S} f_t(v_j^t) = λ E{f_t(μ_t + σ_t Z)},  Z ∼ N(0, 1).


Summarizing …

[Figure: empirical densities of v_i^t — a centered Gaussian for i ∉ S, a shifted Gaussian for i ∈ S.]
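The Gaussian picture is easy to reproduce after a single iteration from v⁰ = (1, …, 1), i.e. f_0 ≡ 1 (the parameters n = 2500, k = 100, giving λ = 2, are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n, k = 2500, 100                    # lambda = k / sqrt(n) = 2
U = np.triu(rng.choice([-1.0, 1.0], size=(n, n)), 1)
A = U + U.T
S = np.arange(k)                    # plant the clique on the first k vertices
A[np.ix_(S, S)] = 1.0
np.fill_diagonal(A, 0.0)

# One iteration from v^0 = all-ones (so f_0 = 1):
v1 = (A @ np.ones(n)) / np.sqrt(n)

mean_S = v1[:k].mean()              # prediction: mu_1 = lambda = 2
mean_out = v1[k:].mean()            # prediction: 0
std_out = v1[k:].std()              # prediction: sigma_1 = 1
```

The empirical statistics match the predicted N(μ_1, 1) and N(0, 1) laws.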

State Evolution

μ_{t+1} = λ E{f_t(μ_t + σ_t Z)},
σ_{t+1}² = E{f_t(σ_t Z)²}.

Using the optimal function f_t(x) = e^{μ_t x − μ_t²}:

μ_{t+1} = λ e^{μ_t²/2},
σ_{t+1}² = 1.

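The recursion μ_{t+1} = λ e^{μ_t²/2} can be iterated numerically to exhibit the 1/√e ≈ 0.6065 threshold; a sketch (the cap and iteration budget are arbitrary choices):

```python
import math

def state_evolution(lam, t_max=200, cap=30.0):
    """Iterate mu_{t+1} = lam * exp(mu_t**2 / 2) from mu_0 = 0.
    Returns math.inf if the mean escapes past 'cap' (success regime),
    else the value reached after t_max steps (trapped at a fixed point)."""
    mu = 0.0
    for _ in range(t_max):
        mu = lam * math.exp(mu * mu / 2.0)
        if mu > cap:
            return math.inf
    return mu
```

For λ > 1/√e the map stays above the diagonal and μ_t escapes to infinity; for λ < 1/√e the iterates are trapped at the stable fixed point μ* solving μ = λ e^{μ²/2} (≈ 0.60 at λ = 0.5).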


Fixed points develop below threshold!

If λ > 1/√e:

[Figure: cobweb plot of μ_{t+1} = λ e^{μ_t²/2} against μ_t; the curve stays above the diagonal, so the iterates μ_t escape to infinity.]

If λ < 1/√e:

[Figure: the curve crosses the diagonal; the iterates μ_t get trapped at a stable fixed point.]


Analysis is wrong but …

Theorem (Deshpande, Montanari, 2013)

If |S| = k ≥ (1 + ε)√(n/e), there exists an O(n² log n) time algorithm that identifies S with high probability.

… so we modify the algorithm.

What algorithm?

A slight modification of the iterative scheme: keep one message per directed edge,

(v_i^t)_{i∈[n]}  →  (v_{i→j}^t)_{i,j∈[n]},

v_{i→j}^{t+1} = (1/√n) Σ_{ℓ≠i,j} A_{iℓ} f_t(v_{ℓ→i}^t).

The analysis is exact as n → ∞.

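The message-passing update can be vectorized with one n × n array of messages. As before, the talk analyzes the exponential f_t; this sketch substitutes the positive part f(x) = max(x, 0) for numerical robustness (sampler and parameters are illustrative):

```python
import numpy as np

def planted_clique(n, k, seed=0):
    """±1 planted clique instance: +1 inside the hidden k x k block,
    symmetric fair ±1 coin flips elsewhere, zero diagonal."""
    rng = np.random.default_rng(seed)
    S = rng.choice(n, size=k, replace=False)
    U = np.triu(rng.choice([-1.0, 1.0], size=(n, n)), 1)
    A = U + U.T
    A[np.ix_(S, S)] = 1.0
    np.fill_diagonal(A, 0.0)
    return A, set(S.tolist())

def message_passing(A, k, t_max=8):
    """One message per directed edge:
        v_{i->j}^{t+1} = (1/sqrt(n)) * sum_{l != i, j} A_{il} f(v_{l->i}^t),
    with f(x) = max(x, 0) standing in for the talk's exponential f_t."""
    n = A.shape[0]
    sqn = np.sqrt(n)
    V = np.ones((n, n))               # V[i, j] = v_{i->j}
    s = A.sum(axis=1)                 # vertex-level scores at t = 0
    for _ in range(t_max):
        F = np.maximum(V, 0.0)        # F[l, i] = f(v_{l->i})
        C = A * F.T                   # C[i, l] = A_{il} f(v_{l->i})
        s = C.sum(axis=1)             # the l = i term vanishes: diag(A) = 0
        V = (s[:, None] - C) / sqn    # drop the l = j term from each message
    return set(np.argsort(s)[-k:].tolist())
```

Each directed message excludes the contribution of its target, which is exactly what removes the problematic correlations in the heuristic analysis.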

Fixing the heuristic

Lemma

Let (f_t(z))_{t≥0} be a sequence of polynomials. Then, for every fixed t and every bounded, continuous function ψ: ℝ → ℝ, the following limits hold in probability:

lim_{n→∞} (1/√n) Σ_{i∈S} ψ(v_{i→j}^t) = λ E{ψ(μ_t + σ_t Z)},

lim_{n→∞} (1/n) Σ_{i∈[n]∖S} ψ(v_{i→j}^t) = E{ψ(σ_t Z)},

where Z ∼ N(0, 1).

Proof Technique

Key ideas:

- Expand v_{i→j}^t for polynomial f_t(·).
- The wrong analysis becomes correct if A is replaced by a fresh independent copy A^t at each iteration t.


Proof Technique - Expanding v^t

Let f_t(x) = x², v_{i→j}^0 = 1. Then

v_{i→j}^1 = Σ_{k≠j} A_{ik}.

[Figure: the star rooted at i, with edges i–k for k ≠ j.]

Proof Technique - Expanding v^t

v_{i→j}^2 = Σ_{k≠j} A_{ik} (v_{k→i}^1)²
          = Σ_{k≠j} A_{ik} (Σ_{ℓ≠i} A_{kℓ}) (Σ_{m≠i} A_{km})
          = Σ_{k≠j} Σ_{ℓ≠i} Σ_{m≠i} A_{ik} A_{kℓ} A_{km}.

[Figure: depth-2 tree rooted at i, with child k and grandchildren ℓ, m.]

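The depth-2 expansion can be checked mechanically on a small random ±1 matrix (unscaled messages, as on the slide; the matrix size and the choice of i, j are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
U = np.triu(rng.choice([-1.0, 1.0], size=(n, n)), 1)
A = U + U.T                     # symmetric ±1 matrix with zero diagonal

i, j = 0, 1

# Recursive definition with f_t(x) = x^2 and v0 = 1:
#   v1_{k->i} = sum_{l != i} A_{kl}
#   v2_{i->j} = sum_{k != j} A_{ik} * (v1_{k->i})^2
v1 = {k: sum(A[k, l] for l in range(n) if l != i) for k in range(n)}
v2 = sum(A[i, k] * v1[k] ** 2 for k in range(n) if k != j)

# The slide's fully expanded triple sum over paths i-k, k-l, k-m:
v2_expanded = sum(
    A[i, k] * A[k, l] * A[k, m]
    for k in range(n) if k != j
    for l in range(n) if l != i
    for m in range(n) if m != i
)

assert v2 == v2_expanded
```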

Proof Technique

v_{i→j}^{t+1} = Σ_{k≠j} A_{ik} f_t(v_{k→i}^t),      v̂_{i→j}^{t+1} = Σ_{k≠j} A_{ik}^t f_t(v̂_{k→i}^t),

where A^0, A^1, A^2, … are fresh independent copies of A.

[Figure: two identical non-backtracking trees rooted at i, one weighted by A, the other by the independent copies A^t.]

Proof Technique - a Combinatorial Lemma

Lemma

v_{i→j}^t = Σ_{T ∈ 𝒯_{i→j}^t} A(T) Γ(T) v^0(T),

where 𝒯_{i→j}^t consists of rooted, labeled trees that:

1. have maximum depth t,
2. do not backtrack.

(Similarly for the v̂_{i→j}^t.)

Proof Technique - Moment Method

v_{i→j}^{t+1} = Σ_{k≠j} A_{ik} f_t(v_{k→i}^t),      v̂_{i→j}^{t+1} = Σ_{k≠j} A_{ik}^t f_t(v̂_{k→i}^t).

lim_{n→∞} Moments of v^{t+1} = Moments of v̂^{t+1}.


Progress(4)

[Figure: complexity vs. k. The spectral threshold is √n; message passing adds the point k = √(n/e) at complexity n² log n.]

Is this threshold fundamental?

Rest of the talk: perhaps


The “Hidden Set” Problem

Given a graph G_n = ([n], E_n):

A set S ⊆ [n]

Data: A_ij ∼ Q_1 if i, j ∈ S; A_ij ∼ Q_0 otherwise.

[Figure: the graph G_n with edges inside S labeled by draws from Q_1 and all other edges labeled by draws from Q_0.]

Problem: Given the edge labels (A_ij)_{(i,j)∈E_n}, identify S.


“Local” Algorithms

A t-local algorithm computes its estimate at vertex i as a function of the labeled ball of radius t around i:

û(i) = F(A_{Ball(i,t)}).

The Sparse Graph Analogue

G_n = ([n], E_n), n ≥ 1, satisfies:

- locally tree-like
- regular of degree Δ

Further:

Q_1 = δ_{+1},  Q_0 = (1/2) δ_{+1} + (1/2) δ_{−1}.

What can local algorithms achieve?


What can we hope for?

If |S| = C n/√Δ:

Ŝ_naive = a uniformly random set of size |S|

⇒ (1/n) E{|Ŝ_naive △ S|} = Θ(1/√Δ).
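The Θ(1/√Δ) baseline follows from a one-line computation; in LaTeX:

```latex
% A uniformly random \hat{S} of size |S| hits S in |S|^2/n vertices on average,
% and |\hat{S} \setminus S| = |S \setminus \hat{S}| here, so
\mathbb{E}\,\bigl|\hat{S}_{\mathrm{naive}} \,\triangle\, S\bigr|
  = 2\Bigl(|S| - \mathbb{E}\,\bigl|\hat{S}_{\mathrm{naive}} \cap S\bigr|\Bigr)
  = 2|S|\Bigl(1 - \tfrac{|S|}{n}\Bigr),
\qquad
\frac{1}{n}\,\mathbb{E}\,\bigl|\hat{S}_{\mathrm{naive}} \,\triangle\, S\bigr|
  = \frac{2C}{\sqrt{\Delta}}\Bigl(1 - \frac{C}{\sqrt{\Delta}}\Bigr)
  = \Theta\!\Bigl(\frac{1}{\sqrt{\Delta}}\Bigr)
\quad\text{for } |S| = \frac{Cn}{\sqrt{\Delta}}.
```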

What can we hope for?

If |S| = C n/√Δ, a Poisson bound shows that any local algorithm suffers

(1/n) E{|Ŝ △ S|} ≥ e^{−C′√Δ}.

A result for local algorithms …

Theorem (Deshpande, Montanari, 2013)

Let G_n converge locally to the Δ-regular tree.

If |S| ≥ (1 + ε) n/√(eΔ), there exists a local algorithm achieving

(1/n) E{|S △ Ŝ|} ≤ e^{−Θ(√Δ)}.

Conversely, if |S| ≤ (1 − ε) n/√(eΔ), every local algorithm suffers

(1/n) E{|S △ Ŝ|} ≥ Θ(1/√Δ).

With Δ = n − 1 we recover the complete-graph result!


To conclude …

- The message-passing algorithm performs “weighted” counts of non-reversing trees.
- Such structures have been used elsewhere:
  - Clustering sparse networks: [Krzakala et al. 2013]
  - Compressed sensing: [Bayati, Lelarge, Montanari 2013]


To conclude …

- What is the “dense” analogue for local algorithms?
- What about other structural properties?

Thank you!
