
Expanders

Eliyahu Kiperwasser

What is it?

Expanders are graphs with no small cuts. The latter gives several unique traits to such graphs, such as:
– High connectivity.
– No “bottlenecks”.

What is it? (First definition)

The edge expansion of G = (V, E) is

  h(G) = min { |E(S, V\S)| / |S| : S ⊆ V, 0 < |S| ≤ |V|/2 },

where E(S, V\S) is the set of edges with one endpoint in S and the other outside S.

A graph is an expander if and only if, for every such subset S, this ratio is at least a constant larger than 1.

G = (V, E) is an expander if the number of edges leaving every subset of vertices (of at most half the vertices) is larger than the number of vertices in that subset by at least a constant factor.
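As a concrete companion to this definition, here is a minimal sketch (my own illustration, not from the slides) that computes the edge expansion h(G) of a small graph by brute force over all subsets S with |S| ≤ |V|/2; the 5-cycle used here is an arbitrary example.

from itertools import combinations

# Example graph (my choice): the 5-cycle, given as a list of edges.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0)]
n = 5

def boundary_size(S):
    # Number of edges with exactly one endpoint in S, i.e. the cut |E(S, V\S)|.
    S = set(S)
    return sum((u in S) != (v in S) for u, v in edges)

# h(G) = min over nonempty S with |S| <= n/2 of |E(S, V\S)| / |S|.
h = min(boundary_size(S) / len(S)
        for k in range(1, n // 2 + 1)
        for S in combinations(range(n), k))
print("edge expansion h(G) =", h)   # 1.0 for the 5-cycle (two adjacent vertices)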

That’s Easy…

We are all familiar with graphs which are in fact expanders with more than a constant factor, e.g. cliques.

The challenge is to find sparse graphs which are nonetheless expanders.

Construction of Explicit Expanders

In this section, we describe two ways to build such marvelous objects as expanders. The following two methods show that constant-degree regular graphs with good expansion exist.

We will show:
– The Margulis/Gabber-Galil Expander.
– The Lubotzky-Phillips-Sarnak Expander.

Lubotzky-Phillips-Sarnak Expander

A graph on p+1 nodes, where p is a prime. The vertex set is V = Zp ∪ {∞}: Zp is a field, and we extend it by one extra point ∞ so that inversion is defined everywhere, with 1/0 = ∞ and 1/∞ = 0. Given a vertex x, connect it to:

– x+1
– x-1
– 1/x

Proof is out of this lecture’s scope.

Lubotzky-Phillips-Sarnak Example

Given a vertex x, connect it to:
– x+1
– x-1
– 1/x

[Figure: the resulting 3-regular graph for a small prime p, including the vertex ∞.]
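A small sketch of this construction (my own code, with one convention assumption: ∞ + 1 = ∞ - 1 = ∞, which turns those two edges at ∞ into self-loops):

# Build the 3-regular graph on Zp ∪ {∞} described above: x is joined to
# x+1, x-1 and 1/x, where 1/0 = ∞ and 1/∞ = 0 (all arithmetic mod p).
p = 7                        # any prime will do
INF = "inf"                  # the extra projective point ∞

def inv(x):
    if x == INF:
        return 0
    if x == 0:
        return INF
    return pow(x, p - 2, p)  # modular inverse via Fermat's little theorem

def neighbors(x):
    if x == INF:             # convention assumed here: ∞ ± 1 = ∞ (two self-loops)
        return [INF, INF, 0]
    return [(x + 1) % p, (x - 1) % p, inv(x)]

for v in list(range(p)) + [INF]:
    print(v, "->", neighbors(v))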

Margulis/Gabber-Galil Expander

A graph on m² nodes. Every node is a pair (x, y) where x, y ∈ Zm.

(x, y) is connected to:
– (x+y, y), (x-y, y)
– (x+y+1, y), (x-y-1, y)
– (x, y+x), (x, y-x)
– (x, y+x+1), (x, y-x-1)

(All operations are modulo m.) The proof is out of this lecture's scope.
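A minimal sketch of this construction (my own code; the choice m = 5 is arbitrary):

# The 8-regular Margulis/Gabber-Galil graph on Zm x Zm: the eight neighbours
# of (x, y) listed above, with all arithmetic modulo m.
m = 5

def gg_neighbors(x, y):
    return [((x + y) % m, y), ((x - y) % m, y),
            ((x + y + 1) % m, y), ((x - y - 1) % m, y),
            (x, (y + x) % m), (x, (y - x) % m),
            (x, (y + x + 1) % m), (x, (y - x - 1) % m)]

# Every one of the m^2 vertices gets exactly 8 neighbours (possibly with
# repetitions, i.e. parallel edges or self-loops).
print(gg_neighbors(2, 3))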

Spectral gap

From now on we will discuss only d-regular undirected graphs, where A(G) is the adjacency matrix.

Since A is a real symmetric matrix, it has n real eigenvalues λ1 ≥ λ2 ≥ … ≥ λn. Let λ = max{ |λi(G)| : i > 1 }.

We define the spectral gap as d - λ.
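As an illustration (my own, not from the slides), these spectral quantities can be computed numerically; the sketch below contrasts a clique, which has a large spectral gap, with an odd cycle, which barely has one.

import numpy as np

def spectral_gap(A, d):
    eig = np.sort(np.linalg.eigvalsh(A))[::-1]   # real eigenvalues λ1 >= ... >= λn
    lam = np.max(np.abs(eig[1:]))                # λ = max{ |λi| : i > 1 }
    return eig[0], lam, d - lam                  # (λ1, λ, spectral gap)

n = 9
K = np.ones((n, n)) - np.eye(n)                  # clique K9: 8-regular, λ = 1, gap = 7
C = np.zeros((n, n))
for i in range(n):                               # odd cycle C9: 2-regular, gap ≈ 0.12
    C[i, (i + 1) % n] = C[i, (i - 1) % n] = 1

print("K9:", spectral_gap(K, n - 1))
print("C9:", spectral_gap(C, 2))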

A second definition of expanders

We can define an expander graph by looking at the adjacency matrix A.

A graph is an expander if and only if A's spectral gap d - λ is larger than zero.

Expansion vs. Spectral gap

Theorem:
– If G is an (n, d, λ)-expander (an n-vertex d-regular graph with λ = max{ |λi| : i > 1 }), then

  (d - λ)/2 ≤ h(G) ≤ √(2d(d - λ2)),

  where h(G) is the edge expansion defined earlier.
– We prove only one direction (the lower bound).

Rayleigh Quotient

The following is a lemma which will prove useful later.

Lemma: For every x ∈ Rⁿ with (x, 1) = 0 and x ≠ 0,

  R(x) = (Ax, x) / (x, x) ≤ λ.

Proof:
– For A(G), λ1 = d is easily seen to be the largest eigenvalue, with the all-ones vector as an eigenvector.
– There exists an orthonormal basis v1, v2, …, vn where each vi is an eigenvector of A(G).

Rayleigh Quotient

Proof cont.
– Write x = Σi ai·vi. Since (x, v1) = 0 (v1 is a multiple of the all-ones vector), a1 = 0.
– Then

  (Ax, x) = (Σ_{i≥2} ai·λi·vi, Σ_{i≥2} ai·vi) = Σ_{i≥2} λi·ai² ≤ λ·Σ_{i≥2} ai² = λ·(x, x),

  therefore

  R(x) = (Ax, x) / (x, x) ≤ λ.
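A quick numerical sanity check of the lemma (mine, not from the slides), using the 9-cycle as an arbitrary test graph: random vectors orthogonal to the all-ones vector never push the Rayleigh quotient above λ.

import numpy as np

rng = np.random.default_rng(0)
n = 9
A = np.zeros((n, n))
for i in range(n):                        # the 2-regular cycle C9 as a test graph
    A[i, (i + 1) % n] = A[i, (i - 1) % n] = 1

eig = np.sort(np.linalg.eigvalsh(A))[::-1]
lam = np.max(np.abs(eig[1:]))             # λ = max{ |λi| : i > 1 }

def rayleigh(x):
    return (A @ x) @ x / (x @ x)          # R(x) = (Ax, x) / (x, x)

worst = -np.inf
for _ in range(10_000):
    x = rng.standard_normal(n)
    x -= x.mean()                         # enforce (x, 1) = 0
    worst = max(worst, rayleigh(x))

print("largest Rayleigh quotient seen:", worst, "<= λ =", lam)   # stays below λ ≈ 1.88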

Lower Bound Theorem

In this section we prove the correctness of the lower bound suggested by the previous theorem.

We prove that h(G) ≥ (d - λ)/2.

Proof:
– Fix a subset S with |S| ≤ n/2, write S̄ = V \ S, and let x be the vector

  x_v = |S̄| for v ∈ S,  x_v = -|S| for v ∈ S̄.

– Then (x, 1) = |S|·|S̄| - |S̄|·|S| = 0, and

  ||x||² = |S|·|S̄|² + |S̄|·|S|² = |S|·|S̄|·n.

Lower Bound Theorem

Proof cont.
– By combining the Rayleigh lemma with the fact that (Ax, x) ≤ ||Ax||·||x||, we get

  (Ax, x) ≤ λ·||x||² = λ·|S|·|S̄|·n.

– We develop this inner product further (here |E(S)| counts edges with both endpoints in S):

  (Ax, x) = 2·Σ_{{u,v}∈E} x_u·x_v = 2·|E(S)|·|S̄|² + 2·|E(S̄)|·|S|² - 2·|E(S, S̄)|·|S|·|S̄|.

– Since G is d-regular, 2·|E(S)| = d·|S| - |E(S, S̄)| and 2·|E(S̄)| = d·|S̄| - |E(S, S̄)|, hence

  (Ax, x) = d·|S|·|S̄|·n - |E(S, S̄)|·n².

Lower Bound Theorem

Proof cont.
– After the previous calculations we now have all that is needed to use the Rayleigh lemma:

  d·|S|·|S̄|·n - |E(S, S̄)|·n² = (Ax, x) ≤ λ·|S|·|S̄|·n,

  and therefore

  |E(S, S̄)| ≥ (d - λ)·|S|·|S̄| / n.

– Since S contains at most half of the graph's vertices, |S̄| ≥ n/2, and we conclude

  |E(S, S̄)| / |S| ≥ (d - λ)/2. ∎
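The bound can also be checked exhaustively on a tiny example (my own check, not part of the lecture): for the clique K8 we verify |E(S, S̄)|/|S| ≥ (d - λ)/2 over every subset S with |S| ≤ n/2.

import numpy as np
from itertools import combinations

n, d = 8, 7
A = np.ones((n, n)) - np.eye(n)                  # the clique K8: d = 7, λ = 1
eig = np.sort(np.linalg.eigvalsh(A))[::-1]
lam = np.max(np.abs(eig[1:]))

for k in range(1, n // 2 + 1):
    for S in combinations(range(n), k):
        mask = np.zeros(n, dtype=bool)
        mask[list(S)] = True
        cut = A[mask][:, ~mask].sum()            # |E(S, S̄)|
        assert cut / k >= (d - lam) / 2 - 1e-9
print("(d - λ)/2 =", (d - lam) / 2, "verified for every S with |S| <= n/2")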

Markov Chains

Definition: A finite state machine with probabilities for each transition, that is, a probability that the next state is j given that the current state is i.

Named after Andrei Andreyevich Markov (1856 - 1922), who studied poetry and other texts as stochastic sequences of characters.

We will use Markov chains for the proof of our final lemma, in order to analyze a random walk on an expander graph.

In directed graphs

The probability of reaching a distant vertex can decrease exponentially.

In undirected graphs

In an undirected graph this probability decreases only by a polynomial factor.

Expanders guarantee an almost uniform chance to “hit” each vertex. For example, a clique provides a perfectly uniform distribution.

Random walks

A(G) =
  0  1  0  0
  1  0  1  0
  0  1  0  1
  0  0  1  0

Â(G) =
  0    1    0    0
  0.5  0    0.5  0
  0    0.5  0    0.5
  0    0    1    0

The first matrix is the adjacency matrix A(G) associated with the graph (a path on four vertices). The second, Â(G), gives the probability of a transition from vertex i to vertex j: each row of A(G) divided by the degree of vertex i. Â(G) is the transition matrix of a Markov chain.

Random walks - Explanation


Hence, the probability of hitting an arbitrary vertex v on the i-th step equals the sum, over all neighbors of v, of the probability of hitting those vertices on the previous step, multiplied by the probability of the transition to v.

Random walks – Algebraic notation


We can re-write this expression in a compact manner: the distribution after one step is Âx, where x is the initial distribution.

Random walks - Example

Suppose x is the uniform distribution on the vertices; then after one step on the graph we obtain the following distribution:

  0    1    0    0      0.25     0.25
  0.5  0    0.5  0   ·  0.25  =  0.25
  0    0.5  0    0.5    0.25     0.25
  0    0    1    0      0.25     0.25
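The same one-step computation, written out in code (a tiny sketch of mine for the path graph above):

import numpy as np

A_hat = np.array([[0.0, 1.0, 0.0, 0.0],
                  [0.5, 0.0, 0.5, 0.0],
                  [0.0, 0.5, 0.0, 0.5],
                  [0.0, 0.0, 1.0, 0.0]])   # transition matrix Â(G) of the path graph
x = np.full(4, 0.25)                       # uniform initial distribution
print(A_hat @ x)                           # [0.25 0.25 0.25 0.25] -- uniform again

Note that for a d-regular graph Â = A/d is symmetric and doubly stochastic, so Âx remains a probability distribution for any starting distribution x; this fact is used implicitly in the analysis below.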

An Expander Lemma

Let G be an (n, d, λ)-expander and F a subset of E. Then the probability that a random walk, starting in the zeroth step from a random edge in F, traverses an edge of F on its t-th step is bounded by

  |F|/|E| + (λ/d)^(t-1).

This lemma is later used in the proof of the PCP theorem.
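Before the proof, here is a small numeric illustration of the bound (my own construction, not from the lecture): on the clique K20 we take F to be all edges touching vertex 0, compute the exact probability of using an F-edge on step t, and compare it with the bound.

import numpy as np

n, d, t = 20, 19, 3
A_hat = (np.ones((n, n)) - np.eye(n)) / d        # Â = A/d for the clique K20
lam = 1.0                                        # for K_n: λ = 1

f = np.zeros(n)                                  # f[w] = number of F-edges incident on w
f[0] = d
f[1:] = 1.0
F_size, E_size = d, n * (n - 1) // 2

x = f / (2 * F_size)        # start: pick a random edge of F, then a random endpoint
y = f / d                   # y[w] = prob. that the next step out of w uses an F-edge
P = y @ np.linalg.matrix_power(A_hat, t - 1) @ x
print(P, "<=", F_size / E_size + (lam / d) ** (t - 1))   # ≈ 0.1012 <= ≈ 0.1028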

A random walk as a Markov chain

Proof:
– Let x be a vector containing the current distribution on the vertices.
– Let x′ be the next distribution vector, meaning the probability of visiting vertex u is

  x′_u = Σ_{v : (v,u)∈E} x_v / d.

– In algebraic notation, x′ = Ax/d = Âx.

Expressing the success probability

Proof cont.
– Let x be the initial distribution.
– Observation: the distribution we reach after the i-th step is Â^i x.
– Let P be the probability we are interested in: that of traversing an edge of F on the t-th step.
– Let y_w be the number of edges of F incident on w, divided by d.
– Then

  P = Σ_{w∈V} (Â^(t-1) x)_w · y_w.

Plugging the initial x

Proof cont.
– To obtain x, we pick a random edge in F, then pick one of the endpoints of that edge to start on. This gives

  x_w = y_w · d / (2|F|),  i.e.  y_w = 2|F| · x_w / d.

– Using the previous results we get:

  P = Σ_{w∈V} (Â^(t-1) x)_w · y_w = (2|F|/d) · Σ_{w∈V} (Â^(t-1) x)_w · x_w = (2|F|/d) · (Â^(t-1) x, x).

Decomposing x

Proof cont.
– Observation: the sum of each row in Â = A/d equals one.
– Hence, if u is the uniform distribution on the vertices of the graph, then Âu = u.
– We decompose any vector x into a uniform-distribution component plus the remaining orthogonal component: x = x∥ + x⊥, where x∥ = u (every coordinate equals 1/n, since x is a probability distribution) and (x⊥, 1) = 0.
– More intuitively, we separate x into its v1 component and the rest of the orthogonal basis.
– For the orthogonal part, ||Âx⊥|| ≤ (λ/d)·||x⊥||.

Random Walk

Final Expander Lemma

Proof cont.
– By linearity,

  Â^(t-1) x = Â^(t-1) x∥ + Â^(t-1) x⊥ = x∥ + Â^(t-1) x⊥.

Final Expander Lemma

Proof cont.
– Hence,

  (Â^(t-1) x, x) = (Â^(t-1) x∥, x) + (Â^(t-1) x⊥, x)
                 = (x∥, x) + (Â^(t-1) x⊥, x)
                 ≤ 1/n + ||Â^(t-1) x⊥|| · ||x||
                 ≤ 1/n + (λ/d)^(t-1) · ||x⊥|| · ||x||
                 ≤ 1/n + (λ/d)^(t-1) · ||x||².

Final Expander Lemma

Proof cont.
– Since the entries of x are positive,

  ||x||² = Σ_v x_v² ≤ (max_v x_v) · Σ_v x_v = max_v x_v.

– The maximum is achieved when all d edges incident to v are in F, therefore

  max_v x_v ≤ d / (2|F|).

Final Expander Lemma

Proof cont.
– We continue the previous calculation:

  (Â^(t-1) x, x) ≤ 1/n + (λ/d)^(t-1) · ||x||² ≤ 1/n + (λ/d)^(t-1) · max_v x_v ≤ 1/n + (λ/d)^(t-1) · d/(2|F|).

Final Expander Lemma

Proof cont.
– Let's see what we have so far:

  P = (2|F|/d) · (Â^(t-1) x, x),
  (Â^(t-1) x, x) ≤ 1/n + (λ/d)^(t-1) · d/(2|F|).

– The graph is d-regular, so |E| = n·d/2, i.e. 1/n = d/(2|E|).
– By combining these results, we finish the proof:

  P ≤ (2|F|/d) · (1/n + (λ/d)^(t-1) · d/(2|F|)) = 2|F|/(n·d) + (λ/d)^(t-1) = |F|/|E| + (λ/d)^(t-1). ∎

Additional Lemma

The following lemma will be useful to us when proving the PCP theorem.

If G is a d-regular graph on the vertex set V and H is a d′-regular graph on V, then G′ = G ∪ H = (V, E(G) ∪ E(H)) is a (d+d′)-regular graph such that λ(G′) ≤ λ(G) + λ(H).
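A small numeric check of the lemma (my own, not from the slides), with two edge-disjoint 2-regular circulant graphs on 11 vertices standing in for G and H:

import numpy as np

def lam(A):
    eig = np.sort(np.linalg.eigvalsh(A))[::-1]
    return np.max(np.abs(eig[1:]))               # λ = max{ |λi| : i > 1 }

n = 11
G = np.zeros((n, n))
H = np.zeros((n, n))
for i in range(n):
    G[i, (i + 1) % n] = G[i, (i - 1) % n] = 1    # G: the cycle i ~ i±1 (2-regular)
    H[i, (i + 3) % n] = H[i, (i - 3) % n] = 1    # H: the "jump-3" cycle (2-regular)

# G and H are edge-disjoint, so G' = G ∪ H is 4-regular.
print(lam(G + H), "<=", lam(G) + lam(H))         # ≈ 3.23 <= ≈ 3.84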

Lemma Proof

Choose x which satisfies the following:

  ||x|| = 1,  (x, 1) = 0,  λ(G′) = |(A(G′)x, x)|.

– In words, x is a normalized eigenvector of G′ achieving λ(G′), orthogonal to the all-ones vector.

Then

  λ(G′) = |(A(G′)x, x)| = |(A(G)x, x) + (A(H)x, x)| ≤ |(A(G)x, x)| + |(A(H)x, x)| ≤ λ(G) + λ(H),

resulting in λ(G′) ≤ λ(G) + λ(H).

The End

Questions?
