8/12/2019 Trees Networks
1/585
A love and respect of trees has been characteristic of mankind since the beginning of human evolution. Instinctively, we understood the importance of trees to our lives before we were able to ascribe reasons for our dependence on them.

America's Garden Book, James and Louise Bush-Brown, rev. ed. by The New York Botanical Garden, Charles Scribner's Sons, New York, 1980, p. 142.
The cover shows a sample from the wired uniform spanning forest on the edges of the (2, 3, 7)-triangle tessellation of the hyperbolic plane.
© 1997-2010 by Russell Lyons and Yuval Peres. Commercial reproduction prohibited. DRAFT Version of 17 July 2010.
Table of Contents
Preface

Chapter 1: Some Highlights
1. Branching Number
2. Electric Current
3. Random Walks
4. Percolation
5. Branching Processes
6. Random Spanning Trees
7. Hausdorff Dimension
8. Capacity
9. Embedding Trees into Euclidean Space
10. Notes
11. Collected In-Text Exercises

Chapter 2: Random Walks and Electric Networks
1. Circuit Basics and Harmonic Functions
2. More Probabilistic Interpretations
3. Network Reduction
4. Energy
5. Transience and Recurrence
6. Hitting and Cover Times
7. The Gaussian Network
8. Notes
9. Collected In-Text Exercises
10. Additional Exercises

Chapter 3: Special Networks
1. Flows, Cutsets, and Random Paths
2. Trees
3. Growth of Trees
4. Cayley Graphs
5. Notes
6. Collected In-Text Exercises
7. Additional Exercises
Chapter 4: Uniform Spanning Trees
1. Generating Uniform Spanning Trees
2. Electrical Interpretations
3. The Square Lattice Z^2
4. Notes
5. Collected In-Text Exercises
6. Additional Exercises

Chapter 5: Percolation on Trees
1. Galton-Watson Branching Processes
2. The First-Moment Method
3. The Weighted Second-Moment Method
4. Reversing the Second-Moment Inequality
5. Surviving Galton-Watson Trees
6. Galton-Watson Networks
7. Notes
8. Collected In-Text Exercises
9. Additional Exercises

Chapter 6: Isoperimetric Inequalities
1. Flows and Submodularity
2. Spectral Radius
3. Mixing Time
4. Planar Graphs
5. Profiles and Transience
6. Anchored Expansion and Percolation
7. Euclidean Lattices
8. Notes
9. Collected In-Text Exercises
10. Additional Exercises

Chapter 7: Percolation on Transitive Graphs
1. Groups and Amenability
2. Tolerance, Ergodicity, and Harris's Inequality
3. The Number of Infinite Clusters
4. Inequalities for pc
5. Merging Infinite Clusters and Invasion Percolation
6. Upper Bounds for pu
7. Lower Bounds for pu
8. Bootstrap Percolation on Infinite Trees
9. Notes
10. Collected In-Text Exercises
11. Additional Exercises
Chapter 8: The Mass-Transport Technique and Percolation
1. The Mass-Transport Principle for Cayley Graphs
2. Beyond Cayley Graphs: Unimodularity
3. Infinite Clusters in Invariant Percolation
4. Critical Percolation on Nonamenable Transitive Unimodular Graphs
5. Percolation on Planar Quasi-Transitive Graphs
6. Properties of Infinite Clusters
7. Percolation on Amenable Graphs
8. Notes
9. Collected In-Text Exercises
10. Additional Exercises

Chapter 9: Infinite Electrical Networks and Dirichlet Functions
1. Free and Wired Electrical Currents
2. Planar Duality
3. Harmonic Dirichlet Functions
4. Planar Graphs and Hyperbolic Graphs
5. Random Walk Traces
6. Notes
7. Collected In-Text Exercises
8. Additional Exercises

Chapter 10: Uniform Spanning Forests
1. Limits Over Exhaustions
2. Coupling and Equality
3. Planar Networks and Euclidean Lattices
4. Tail Triviality
5. The Number of Trees
6. The Size of the Trees
7. Loop-Erased Random Walk and Harmonic Measure From Infinity
8. Open Questions
9. Notes
10. Collected In-Text Exercises
11. Additional Exercises

Chapter 11: Minimal Spanning Forests
1. Minimal Spanning Trees
2. Deterministic Results
3. Basic Probabilistic Results
4. Tree Sizes
5. Planar Graphs
6. Non-Treeable Groups
7. Open Questions
8. Notes
9. Collected In-Text Exercises
10. Additional Exercises

Chapter 12: Limit Theorems for Galton-Watson Processes
1. Size-biased Trees and Immigration
2. Supercritical Processes: Proof of the Kesten-Stigum Theorem
3. Subcritical Processes
4. Critical Processes
5. Notes
6. Collected In-Text Exercises
7. Additional Exercises

Chapter 13: Speed of Random Walks
1. Basic Examples
2. The Varopoulos-Carne Bound
3. Branching Number of a Graph
4. Stationary Measures on Trees
5. Galton-Watson Trees
6. Markov Type of Metric Spaces
7. Embeddings of Finite Metric Spaces
8. Notes
9. Collected In-Text Exercises
10. Additional Exercises

Chapter 14: Hausdorff Dimension
1. Basics
2. Coding by Trees
3. Galton-Watson Fractals
4. Hölder Exponent
5. Derived Trees
6. Collected In-Text Exercises
7. Additional Exercises

Chapter 15: Capacity
1. Definitions
2. Percolation on Trees
3. Euclidean Space
4. Fractal Percolation and Intersections
5. Left-to-Right Crossing in Fractal Percolation
6. Generalized Diameters and Average Meeting Height on Trees
7. Notes
8. Collected In-Text Exercises
9. Additional Exercises
Chapter 16: Harmonic Measure on Galton-Watson Trees
1. Introduction
2. Markov Chains on the Space of Trees
3. The Hölder Exponent of Limit Uniform Measure
4. Dimension Drop for Other Flow Rules
5. Harmonic-Stationary Measure
6. Confinement of Simple Random Walk
7. Calculations
8. Notes
9. Collected In-Text Exercises
10. Additional Exercises

Comments on Exercises

Bibliography

Index

Glossary of Notation
Preface
This book began as lecture notes for an advanced graduate course called Probability on Trees that Lyons gave in Spring 1993. We are grateful to Rabi Bhattacharya for having suggested that he teach such a course. We have attempted to preserve the informal flavor of lectures. Many exercises are also included, so that real courses can be given based on the book. Indeed, previous versions have already been used for courses or seminars in several countries. The most current version of the book can be found on the web. A few of the authors' results and proofs appear here for the first time. At this point, almost all of the actual writing was done by Lyons. We hope to have a more balanced co-authorship eventually.
This book is concerned with certain aspects of discrete probability on infinite graphs that are currently in vigorous development. We feel that there are three main classes of graphs on which discrete probability is most interesting, namely, trees, Cayley graphs of groups (or, more generally, transitive, or even quasi-transitive, graphs), and planar graphs. Thus, this book develops the general theory of certain probabilistic processes and then specializes to these particular classes. In addition, there are several reasons for a special study of trees. Since in most cases analysis is easier on trees, analysis can be carried further. Then one can often either apply the results from trees to other situations or can transfer to other situations the techniques developed by working on trees. Trees also occur naturally in many situations, either combinatorially or as descriptions of compact sets in Euclidean space R^d. (More classical probability, of course, has tended to focus on the special and important case of the Euclidean lattices Z^d.)
It is well known that there are many connections among probability, trees, and groups. We survey only some of them, concentrating on recent developments of the past twenty years. We discuss some of those results that are most interesting, most beautiful, or easiest to prove without much background. Of course, we are biased by our own research interests and knowledge. We include the best proofs available of recent as well as classic results. Much more is known about almost every subject we present. The only prerequisite is knowledge of conditional expectation with respect to a σ-algebra, and even that is rarely used. For part of Chapter 13 and all of Chapter 16, basic knowledge of ergodic theory is also required.
Most exercises that appear in the text, as opposed to those at the end of the chapters, are ones that will be particularly helpful to do when they are reached. They either facilitate one's understanding or will be used later in the text. These in-text exercises are also collected at the end of each chapter for easy reference, just before additional exercises are presented.
Some notation we use is ⟨x_n⟩ for a sequence (or, sometimes, a more general function),
f↾A for the restriction of a function or measure to a set, E[X; A] for the expectation of X on the event A, and |·| for the cardinality of a set. Some definitions are repeated in different chapters to enable more selective reading.
A question labelled as Question m.n is one to which the answer is unknown, where m and n are numbers. Unattributed results are usually not due to us.
Major chapter dependencies are indicated in this figure:
[Figure: major chapter dependencies. Nodes: 1: Intro; 2: RW & EN; 3: Special; 4: UST; 5: Perc Trees; 6: Isop; 7: Perc Trans; 8: MTP; 9: Dirich; 10: USF; 11: MSF; 12: GW; 13: Speed; 14: HD; 15: Cap; 16: RW GW.]
Lyons is grateful to the Institute for Advanced Studies and the Institute of Mathematics, both at the Hebrew University of Jerusalem, for support during some of the writing. We are grateful to Brian Barker, Jochen Geiger, Janko Gravner, Svante Janson, Tri Minh Lai, Steve Morrow, Peter Mörters, Jason Schweinsberg, Jeff Steif, Ádám Timár, and especially Jacob Magnusson for noting several corrections to the manuscript.
Russell Lyons Yuval Peres
Indiana University Microsoft Corporation
[email protected] [email protected]
http://mypage.iu.edu/~rdlyons/ http://research.microsoft.com/en-us/um/people/peres/
Chapter 1
Some Highlights
This chapter gives some of the highlights to be encountered in this book. Some of
the topics in the book do not appear at all here since they are not as suitable to a quick
overview. Also, we concentrate in this overview on trees since it is easiest to use them to
illustrate most of the themes.
Notation and terminology for graphs is the following. A graph is a pair G = (V, E),
where V is a set of vertices and E is a symmetric subset of V × V, called the edge set. The word "symmetric" means that (x, y) ∈ E iff (y, x) ∈ E; here, x and y are called the endpoints of (x, y). The symmetry assumption is usually phrased by saying that the graph is undirected or that its edges are unoriented. Without this assumption, the graph is called directed. If we need to distinguish the two, we write an unoriented edge as [x, y], while an oriented edge is written as ⟨x, y⟩. An unoriented edge can be thought of as the pair of oriented edges with the same endpoints. If (x, y) ∈ E, then we call x and y adjacent or neighbors, and we write x ∼ y. The degree of a vertex is the number of its neighbors. If this is finite for each vertex, we call the graph locally finite. If the degree of every vertex is the same number d, then the graph is called regular or d-regular. If x is an endpoint of an edge e, we also say that x and e are incident, while if two edges share an endpoint, then we call those edges adjacent. If we have more than one graph under consideration, we distinguish the vertex and edge sets by writing V(G) and E(G). A path* in a graph is a sequence of vertices where each successive pair of vertices is an edge in the graph. A finite path with at least one edge and whose first and last vertices are the same is called a cycle. A graph is connected if there is a path from any of its vertices to any other. A graph with no cycles is called a forest; a connected forest is a tree.

If there are numbers (weights) c(e) assigned to the edges e of a graph, the resulting object is called a network. Sometimes we work with more general objects than graphs, called multigraphs. A multigraph is a pair of sets, V and E, together with a pair of maps
* In graph theory, a path is necessarily self-avoiding. What we call a path is called in graph theory a walk. However, to avoid confusion with random walks, we do not adopt that terminology. When a path does not pass through any vertex (resp., edge) more than once, we will call it vertex simple (resp., edge simple). We'll just say simple also to mean vertex simple, which implies edge simple.
Chap. 1: Some Highlights 2
from E to V denoted e ↦ e^- and e ↦ e^+. The images of e are called the endpoints of e, the former being its tail and the latter its head. If e^- = e^+ = x, then e is a loop at x. Edges with the same pair of endpoints are called parallel or multiple. If the multigraph is undirected, then for every edge e ∈ E, there is an edge -e ∈ E such that (-e)^- = e^+ and (-e)^+ = e^-. Given a network G = (V, E) with weights c(·) and a subset of its vertices K, the induced subnetwork G↾K is the subnetwork with vertex set K, edge set (K × K) ∩ E, and weights c↾(K × K) ∩ E.
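The definitions above translate directly into a small data structure. The following is a minimal sketch, with names that are ours rather than the book's, of an undirected network: a symmetric adjacency relation together with edge weights c(e) stored for both orientations.

```python
# Minimal sketch of an undirected network: a symmetric edge relation
# plus weights c(e). All class and method names here are illustrative.
from collections import defaultdict

class Network:
    def __init__(self):
        self.adj = defaultdict(set)   # vertex -> set of neighbors
        self.c = {}                   # oriented edge (x, y) -> weight

    def add_edge(self, x, y, weight=1.0):
        # Store both orientations, so the edge set is symmetric:
        # (x, y) is an edge iff (y, x) is.
        self.adj[x].add(y)
        self.adj[y].add(x)
        self.c[(x, y)] = weight
        self.c[(y, x)] = weight

    def degree(self, x):
        # The degree of a vertex is the number of its neighbors.
        return len(self.adj[x])

g = Network()
g.add_edge("o", "a")
g.add_edge("o", "b", weight=2.0)
print(g.degree("o"))   # -> 2
```

A tree or a weighted Cayley graph would be built the same way; only the edge list differs.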
Items such as theorems are numbered in this book as C.n, where C is the chapter
number and n is the item number in that chapter; C is omitted when the item appears
in the same chapter as the reference to it. In particular, in this chapter, items to be
encountered in later chapters are numbered with their chapter numbers.
1.1. Branching Number.
A tree is called locally finite if the degree of each vertex is finite (but not necessarily
uniformly bounded). Our trees will usually be rooted, meaning that some vertex is des-
ignated as the root, denoted o. We imagine the tree as growing (upwards) away from its
root. Each vertex then has branches leading to its children, which are its neighbors that
are further from the root. For the purposes of this chapter, we do not allow the possibility
of leaves, i.e., vertices without children.
How do we assign an average branching number to an arbitrary infinite locally finite tree? If the tree is a binary tree, as in Figure 1.1, then clearly the answer will be 2. But in the general case, since the tree is infinite, no straight average is available. We must take some kind of limit or use some other procedure.
Figure 1.1. The binary tree.
One simple idea is as follows. Let Tn be the set of vertices at distance n from the root, o. Define the lower (exponential) growth rate of the tree to be

 gr T := liminf_{n→∞} |Tn|^{1/n} .
This certainly will give the number 2 to the binary tree. One can also define the upper (exponential) growth rate

 gr T := limsup_{n→∞} |Tn|^{1/n}

and the (exponential) growth rate

 gr T := lim_{n→∞} |Tn|^{1/n}

when the limit exists. However, notice that these notions of growth barely account for the structure of the tree: only |Tn| matters, not how the vertices at different levels are connected to each other. Of course, if T is spherically symmetric, meaning that for each n, every vertex at distance n from the root has the same number of children (which may depend on n), then there is really no more information in the tree than that contained
in the sequence |Tn|. For more general trees, however, we will use a different approach.

Consider the tree as a network of pipes and imagine water entering the network at the root. However much water enters a pipe leaves at the other end and splits up among the outgoing pipes (edges). Consider the following sort of restriction: Given λ ≥ 1, suppose that the amount of water that can flow through an edge at distance n from o is only λ^{-n}. If λ is too big, then perhaps no water can flow. In fact, consider the binary tree. A moment's thought shows that water can still flow throughout the tree provided that λ ≤ 2, but that as soon as λ > 2, then no water at all can flow. Obviously, this critical value of 2 for λ is the same as the branching number of the binary tree. So let us make a general definition: the branching number of a tree T is the supremum of those λ that admit a positive amount of water to flow through T; denote this critical value of λ by br T. As we will see, this definition is the exponential of what Furstenberg (1970) called the dimension of a tree, which is the Hausdorff dimension of its boundary.

It is not hard to check that br T is related to gr T by

 br T ≤ gr T . (1.1)

Often, as in the case of the binary tree, equality holds here. However, there are many examples of strict inequality.
Exercise 1.1.
Prove (1.1).
Exercise 1.2.
Show that br T = gr T when T is spherically symmetric.
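For a spherically symmetric tree, the growth rate is determined entirely by the sequence of children per level, so |Tn|^{1/n} can be computed directly. The following sketch (function names are ours) illustrates this for the binary tree and for a tree with alternating numbers of children.

```python
# Sketch: growth rate of a spherically symmetric tree, described by the
# sequence d_0, d_1, ... of children per vertex at each level, so that
# |T_n| = d_0 * d_1 * ... * d_{n-1}. Names are illustrative.

def level_sizes(children_per_level):
    """Return |T_1|, ..., |T_N| for a spherically symmetric tree."""
    sizes = []
    size = 1
    for d in children_per_level:
        size *= d
        sizes.append(size)
    return sizes

def growth_estimate(children_per_level):
    """Estimate gr T via |T_n|^(1/n) at the deepest level computed."""
    sizes = level_sizes(children_per_level)
    n = len(sizes)
    return sizes[-1] ** (1.0 / n)

# Binary tree: every vertex has 2 children, so gr T = 2.
print(growth_estimate([2] * 30))        # -> 2.0
# Alternating 3 and 2 children: gr T = sqrt(6) ~ 2.449.
print(growth_estimate([3, 2] * 15))     # -> ~2.449
```

By Exercise 1.2, these values are also the branching numbers of the corresponding trees.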
Example 1.1. If T is a tree such that vertices at even distances from o have 3 children while the rest have 2 children, then br T = gr T = √6. It is easy to see that gr T = √6, whence by (1.1), it remains to show that br T ≥ √6, i.e., given λ < √6, to show that a positive amount of water can flow to infinity with the constraints given. Indeed, we can use the water flow with amount 6^{-n/2} flowing on those edges at distance n from the root when n is even and with amount 6^{-(n-1)/2}/3 flowing on those edges at distance n from the root when n is odd.
Example 1.2. (The 1-3 Tree) Let T be a tree embedded in the upper half plane with o at the origin. List Tn in clockwise order as x^n_1, . . . , x^n_{2^n}. Let x^n_k have 1 child if k ≤ 2^{n-1} and 3 children otherwise; see Figure 1.2. Now, a ray is an infinite path from the root that doesn't backtrack. If x is a vertex of T that does not have the form x^n_{2^n}, then there are only finitely many rays that pass through x. This means that water cannot flow to infinity through such a vertex x when λ > 1. That leaves only the possibility of water flowing along the single ray consisting of the vertices x^n_{2^n}, which is impossible too. Hence br T = 1, yet gr T = 2.
Figure 1.2. A tree with branching number 1 and growth rate 2.
Example 1.3. If T(1) and T(2) are trees, form a new tree T(1) ∨ T(2) from disjoint copies of T(1) and T(2) by joining their roots to a new point taken as the root of T(1) ∨ T(2) (Figure 1.3). Then

 br(T(1) ∨ T(2)) = br T(1) ∨ br T(2)

(where ∨ on the right denotes the maximum), since water can flow in the join T(1) ∨ T(2) iff water can flow in one of the trees. We will denote by |x| the distance of a vertex x to the root.
Example 1.4. We will put two trees together such that br(T(1) ∨ T(2)) = 1 but gr(T(1) ∨ T(2)) > 1. Let nk ↑ ∞. Let T(1) (resp., T(2)) be a tree such that x has 1 child (resp.,
Figure 1.3. Joining two trees.
2 children) for n_{2k} ≤ |x| < n_{2k+1} and 2 (resp., 1) otherwise. If nk increases sufficiently rapidly, then br T(1) = br T(2) = 1, so br(T(1) ∨ T(2)) = 1. But if nk increases sufficiently rapidly, then gr(T(1) ∨ T(2)) = 2.
Figure 1.4. A schematic representation of a tree with branching number 1 and growth rate 2.

Exercise 1.3.
Verify that if nk increases sufficiently rapidly, then gr(T(1) ∨ T(2)) = √2. Furthermore, show that the set of possible values of gr(T(1) ∨ T(2)) over all sequences nk is [√2, 2].
While gr T is easy to compute, br T may not be. Nevertheless, it is the branching
number which is much more important. Fortunately, Furstenberg (1967) gave a useful
condition sufficient for equality in (1.1): Given a vertex x in T, let T^x denote the subtree of T formed by the descendants of x. This tree is rooted at x.
Theorem 3.8. If for all vertices x ∈ T, there is an isomorphism of T^x as a rooted tree to a subtree of T rooted at o, then br T = gr T.
We call trees satisfying the hypothesis of this theorem subperiodic; actually, we will later broaden this definition slightly. As we will see, subperiodic trees arise naturally, which accounts for the importance of Furstenberg's theorem.
1.2. Electric Current.
We can ask another flow question on trees, namely: If λ^{-n} is the conductance of edges at distance n from the root of T and a battery is connected between the root and infinity, will current flow? Of course, what we mean is that we establish a unit potential between the root and level N of T, let N → ∞, and see whether the limiting current is positive. If so, the tree is said to have positive effective conductance and finite effective resistance. (All electrical terms will be carefully explained in Chapter 2.)

Example 1.5. Consider the binary tree. By symmetry, all the vertices at a given distance from o have the same potential, so they may be identified ("soldered" together) without changing any voltages or currents. This gives a new graph whose vertices may be identified with N, while there are 2^n edges joining n - 1 to n. These edges are in parallel, so they may be replaced by a single edge whose conductance is their sum, (2/λ)^n. Now we have edges in series, so the effective resistance is the sum of the edge resistances, Σ_n (λ/2)^n. This is finite, and hence current flows, iff λ < 2 = br T.
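The series-parallel reduction just described is easy to check numerically. In this sketch (names ours), we sum the partial series Σ_n (λ/2)^n of the reduced resistances for several values of λ.

```python
# Sketch: effective resistance of the binary tree with conductance
# lambda^{-n} on edges at distance n, after the series/parallel reduction
# described above. The reduced network is a path where the edge from
# level n-1 to n has conductance (2/lambda)^n, hence resistance (lambda/2)^n.

def effective_resistance(lam, levels):
    """Partial sum of the series sum_n (lam/2)^n up to the given depth."""
    return sum((lam / 2.0) ** n for n in range(1, levels + 1))

for lam in [1.0, 1.5, 2.5]:
    print(lam, effective_resistance(lam, 50), effective_resistance(lam, 100))
# For lam < 2 the partial sums converge (finite effective resistance,
# positive effective conductance); for lam > 2 they blow up and no
# current flows in the limit.
```

For λ = 1.5, for instance, the partial sums approach the geometric-series limit (3/4)/(1/4) = 3.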
1.3. Random Walks.
Proposition 1.7. (Voltage as Probability) For any vertex x, the voltage at x equals the probability that the corresponding random walk visits a1 before it visits a0 when it starts at x.

In fact, the proof of this proposition is simple: there is a discrete Laplacian (a difference operator) for which both the voltage and the probability mentioned are harmonic functions of x. The two functions clearly have the same values at a_i (the boundary points) and the uniqueness principle holds for this Laplacian, whence the functions agree at all vertices x. This is developed in detail in Section 2.1. A superb elementary exposition of this correspondence is given by Doyle and Snell (1984).
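Proposition 1.7 can be illustrated on the simplest network, a path with unit conductances: solving for the harmonic function with boundary values 0 and 1 gives the voltages, and a Monte Carlo estimate of the hitting probability matches it. This sketch uses our own illustrative names.

```python
# Sketch: voltage vs. hitting probability on the path 0 - 1 - ... - n with
# unit conductances, a0 = 0 held at voltage 0 and a1 = n held at voltage 1.
import random

def voltages_on_path(n):
    """Solve the discrete Dirichlet problem on {0,...,n} by harmonicity:
    v(x) is the average of its neighbors, v(0) = 0, v(n) = 1."""
    v = [0.0] * (n + 1)
    v[n] = 1.0
    for _ in range(10000):           # Gauss-Seidel sweeps
        for x in range(1, n):
            v[x] = 0.5 * (v[x - 1] + v[x + 1])
    return v

def hit_prob(x, n, trials=100000):
    """Monte Carlo probability that SRW from x hits n before 0."""
    wins = 0
    for _ in range(trials):
        pos = x
        while 0 < pos < n:
            pos += random.choice([-1, 1])
        wins += (pos == n)
    return wins / trials

n = 4
print(voltages_on_path(n))   # -> approximately [0.0, 0.25, 0.5, 0.75, 1.0]
print(hit_prob(2, n))        # -> ~0.5, matching the voltage at 2
```

On a path the harmonic function is linear, so the agreement is exact in the limit of many trials.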
What does this say about our trees? Given N, identify all the vertices of level N, i.e., TN, to one vertex, a1 (see Figure 1.5). Use the root as a0. Then according to Proposition 1.7, the voltage at x is the probability that the random walk visits level N before it visits the root when it starts from x. When N → ∞, the limiting voltages are all 0 iff the limiting probabilities are all 0, which is the same thing as saying that on the infinite tree, the probability of visiting the root from any vertex is 1, i.e., the random walk is recurrent. Since no current flows across edges whose endpoints have the same voltage, we see that no electrical current flows iff the random walk is recurrent.
Figure 1.5. Identifying a level to a vertex, a1.
Figure 1.6. The relative weights at a vertex. The tree is growing upwards.
Now when the conductances decrease by a factor of λ as the distance increases, the relative weights at a vertex other than the root are as shown in Figure 1.6. That is, the edge leading back toward the root is λ times as likely to be taken as each other edge. Denoting the dependence of the random walk on the parameter λ by RW_λ, we may translate Theorem 1.6 into the following theorem (Lyons, 1990):

Theorem 3.5. If λ < br T, then RW_λ is transient, while if λ > br T, then RW_λ is recurrent.
Is this intuitive? Consider a vertex other than the root with, say, d children. If we consider only the distance from o, which increases or decreases at each step of the random walk, a balance between increasing and decreasing occurs when λ = d. If d were constant, it is easy to see that indeed d would be the critical value separating transience from recurrence. What Theorem 3.5 says is that this same heuristic can be used in the general case, provided we substitute the "average" br T for d.
We will also see how to use electrical networks to prove Pólya's wonderful theorem that simple random walk on the hypercubic lattice Z^d is recurrent for d ≤ 2 and transient for d ≥ 3.
1.4. Percolation.
Suppose that we remove edges at random from T. To be specific, keep each edge with some fixed probability p and make these decisions independently for different edges. This random process is called percolation. By Kolmogorov's 0-1 law, the probability that an infinite connected component remains in the tree is either 0 or 1. On the other hand, this probability is monotonic in p, whence there is a critical value pc(T) where it changes from 0 to 1. It is also clear that the bigger the tree, the more likely it is that there will be an infinite component for a given p. That is, the bigger the tree, the smaller the critical value pc. Thus, pc is vaguely inversely related to a notion of average branching number.
Actually, this vague heuristic is precise (Lyons, 1990):

Theorem 5.15. For any tree, pc(T) = 1/br T.

Let us look more closely at the intuition behind this. If a vertex x has d children, then the expected number of children after percolation is dp. If dp is usually less than 1, one would not expect that an infinite component would remain, while if dp is usually greater than 1, then one might guess that an infinite component would be present. Theorem 5.15 says that this intuition becomes correct when one replaces the usual d by br T.
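For the binary tree, Theorem 5.15 can be checked by a classical fixed-point computation: the cluster of the root under percolation with parameter p branches like a process with Binomial(2, p) offspring, so its extinction probability q is the smallest root of q = (1 - p + pq)^2. A sketch (names ours):

```python
# Sketch: survival probability of Bernoulli(p) percolation on the binary
# tree. The extinction probability q of the root's cluster satisfies
# q = (1 - p + p*q)^2; iterating from q = 0 converges to the smallest root.

def survival_binary_tree(p, iters=10000):
    q = 0.0
    for _ in range(iters):
        q = (1.0 - p + p * q) ** 2
    return 1.0 - q

for p in [0.4, 0.5, 0.6, 0.8]:
    print(p, survival_binary_tree(p))
# Survival is essentially 0 for p <= 1/2 = 1/br T and positive for
# p > 1/2, matching Theorem 5.15 for the binary tree.
```

For p = 0.6, for instance, the smallest root is q = 4/9, giving survival probability 5/9.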
Note that there is an infinite component with probability 1 iff the component of the
root is infinite with positive probability.
1.5. Branching Processes.
Percolation on a fixed tree produces random trees by random pruning, but there is
a way to grow trees randomly due to Bienaymé in 1845. Given probabilities pk adding
to 1 (k = 0, 1, 2, . . .), we begin with one individual and let it reproduce according to
these probabilities, i.e., it has k children with probability pk. Each of these children (if
there are any) then reproduces independently with the same law, and so on forever or until
some generation goes extinct. The family trees produced by such a process are called
(Bienaymé-)Galton-Watson trees. A fundamental theorem in the subject is that extinction is a.s. iff m ≤ 1 and p1 ≠ 1, where m := ∑k k pk is the mean number of children per individual. We will see that pc = 1/m a.s. on
nonextinction, i.e., br T = m.
Let Zn be the size of the nth generation in a Galton-Watson process. How quickly
does Zn grow? It will be easy to calculate that E[Zn] = m^n. Moreover, a martingale
argument will show that the limit W := lim_n Zn/m^n always exists (and is finite).
When 1 < m < ∞, is W > 0 on the event of nonextinction? The answer is yes, under a very mild hypothesis:

The Kesten-Stigum Theorem (1966). The following are equivalent when 1 < m < ∞:
(i) W > 0 a.s. on the event of nonextinction;
(ii) ∑_{k=1}^∞ pk k log k < ∞.
This will be shown in Section 12.2. Although condition (ii) appears technical, we will
enjoy a conceptual proof of the theorem that uses only the crudest estimates.
1.6. Random Spanning Trees.

This fertile and fascinating field is one of the oldest areas to be studied in this book,
but one of the newest to be explored in depth. A spanning tree of a (connected) graph
is a subgraph that is connected, contains every vertex of the whole graph, and contains no
cycle: see Figure 1.7 for an example. These trees are usually not rooted. The subject of
random spanning trees of a graph goes back to Kirchhoff (1847), who showed its relation
to electrical networks. (Actually, Kirchhoff did not think probabilistically, but, rather, he
considered quotients of the number of spanning trees with a certain property divided by
the total number of spanning trees.) One of these relations gives the probability that a
uniformly chosen spanning tree will contain a given edge in terms of electrical current in
the graph.
Figure 1.7. A spanning tree in a graph, where the edges of the graph
not in the tree are dashed.
Let's begin with a very simple finite graph. Namely, consider the ladder graph of
Figure 1.8. Among all spanning trees of this graph, what proportion contain the bottom
rung (edge)? In other words, if we were to choose at random a spanning tree, what is the
chance that it would contain the bottom rung? We have illustrated the entire probability
spaces for the smallest ladder graphs in Figure 1.9.
Figure 1.8. A ladder graph.
Figure 1.9. The ladder graphs of heights 0, 1, and 2, together with their spanning trees.
As shown, the probabilities in these cases are 1/1, 3/4, and 11/15. The next one is
41/56. Do you see any pattern? One thing that is fairly evident is that these numbers are
decreasing, but hardly changing. It turns out that they come from every other term of the
continued fraction expansion of √3 − 1 = 0.73+ and, in particular, converge to √3 − 1. In the limit, then, the probability of using the bottom rung is
√3 − 1 and, even before
taking the limit, this gives an excellent approximation to the probability. How can we
easily calculate such numbers? In this case, there is a rather easy recursion to set up and
solve, but we will use this example to illustrate the more general theorem of Kirchhoff that
we mentioned above. In fact, Kirchhoff's theorem will show us why these probabilities are
decreasing even before we calculate them.
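The small cases can also be checked by brute force. The sketch below (illustrative code, not from the text) enumerates all edge subsets of the right size in the ladder of Figure 1.8 and tests which ones are spanning trees:

```python
from itertools import combinations

def ladder_edges(n):
    # Ladder with n rungs: vertices (i, 0), (i, 1) for i = 0..n-1;
    # the bottom rung is the edge ((0, 0), (0, 1)).
    rungs = [((i, 0), (i, 1)) for i in range(n)]
    rails = [((i, s), (i + 1, s)) for i in range(n - 1) for s in (0, 1)]
    return rungs + rails

def spanning_tree_counts(n):
    """Return (number of spanning trees, number containing the bottom rung)."""
    edges = ladder_edges(n)
    verts = {v for e in edges for v in e}
    k = len(verts) - 1                   # a spanning tree has |V| - 1 edges
    total = with_bottom = 0
    for subset in combinations(edges, k):
        parent = {v: v for v in verts}   # union-find to detect cycles
        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]
                v = parent[v]
            return v
        for u, w in subset:
            ru, rw = find(u), find(w)
            if ru == rw:
                break                    # a cycle: not a tree
            parent[ru] = rw
        else:
            total += 1
            with_bottom += ((0, 0), (0, 1)) in subset
    return total, with_bottom

for n in (1, 2, 3, 4):
    print(spanning_tree_counts(n))       # (1, 1), (4, 3), (15, 11), (56, 41)
```

The printed pairs reproduce the proportions 1/1, 3/4, 11/15, 41/56 quoted above.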
Suppose that each edge of our graph is an electric conductor of unit conductance.
Hook up a battery between the endpoints of any edge e, say the bottom rung (Figure 1.10).
Kirchhoff (1847) showed that the proportion of current that flows directly along e is then
equal to the probability that e belongs to a randomly chosen spanning tree!
e
Figure 1.10. A battery is hooked up between the endpoints of e.
Coming back to the ladder graph and its bottom rung, e, we see that current flows
in two ways: some flows directly across e and some flows through the rest of the network.
It is intuitively clear (and justified by Rayleigh's monotonicity principle) that the higher
the ladder, the greater the effective conductance of the ladder minus the bottom rung,
hence, by Kirchhoff's theorem, the less the chance that a random spanning tree contains
the bottom rung. This confirms our observations.

It turns out that generating spanning trees at random according to the uniform mea-
sure is of interest to computer scientists, who have developed various algorithms over the
years for random generation of spanning trees. In particular, this is closely connected to
generating a random state from any Markov chain. See Propp and Wilson (1998) for more
on this issue.
Early algorithms for generating a random spanning tree used the Matrix-Tree Theo-
rem, which counts the number of spanning trees in a graph via a determinant. A better
algorithm than these early ones, especially for probabilists, was introduced by Aldous
(1990) and Broder (1989). It says that if you start a simple random walk at any vertex
of a finite (connected) graph G and draw every edge it traverses except when it would
complete a cycle (i.e., except when it arrives at a previously-visited vertex), then when no
more edges can be added without creating a cycle, what will be drawn is a uniformly cho-
sen spanning tree of G. (To be precise: if ⟨Xn⟩ (n ≥ 0) is the path of the random walk, then the associated spanning tree is the set of edges
{[Xn, Xn+1] ; Xn+1 ∉ {X0, X1, . . . , Xn}}.)
This beautiful algorithm is quite efficient and useful for theoretical analysis, yet Wilson
(1996) found an even better one that we'll describe in Section 4.1.
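The Aldous-Broder algorithm just described takes only a few lines of code. Here is an illustrative sketch (graph and parameters are ours, not the text's), checked on the 4-cycle, whose four spanning trees should each appear about a quarter of the time:

```python
import random
from collections import Counter

def aldous_broder(adj, root, rng=random):
    """Sample a uniform spanning tree: walk from `root`, keeping each
    edge along which the walk enters a previously unvisited vertex."""
    tree, visited, x = set(), {root}, root
    while len(visited) < len(adj):
        y = rng.choice(adj[x])
        if y not in visited:              # first entry to y: keep edge {x, y}
            visited.add(y)
            tree.add(frozenset((x, y)))
        x = y
    return frozenset(tree)

adj = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}   # the 4-cycle
random.seed(0)
counts = Counter(aldous_broder(adj, 0) for _ in range(4000))
print(sorted(counts.values()))            # four trees, each near 1000
```

Each sampled tree omits exactly one of the cycle's four edges, and the empirical frequencies are close to uniform.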
Return for a moment to the ladder graphs. We saw that as the height of the ladder
tends to infinity, there is a limiting probability that the bottom rung of the ladder graph
belongs to a uniform spanning tree. This suggests looking at uniform spanning trees on
general infinite graphs. So suppose that G is an infinite graph. Let Gn be finite subgraphs
with G1 ⊂ G2 ⊂ G3 ⊂ ··· and ∪n Gn = G. Pemantle (1991) showed that the weak limit
of the uniform spanning tree measures on Gn exists, as conjectured by Lyons. (In other
words, if μn denotes the uniform spanning tree measure on Gn and B, B′ are finite sets
of edges, then limn μn(B ⊂ T, B′ ∩ T = ∅) exists, where T denotes a random spanning
tree.) This limit is now called the free uniform spanning forest* on G, denoted FUSF.
Considerations of electrical networks play the dominant role in Pemantle's proof. Pemantle
(1991) discovered the amazing fact that on ℤ^d, the uniform spanning forest is a single tree
a.s. if d ≤ 4; but when d ≥ 5, there are infinitely many trees a.s.!
1.7. Hausdorff Dimension.

Consider again any finite connected graph with two distinguished vertices a and z.
This time, the edges e have assigned positive numbers c(e) that represent the maximum
amount of water that can flow through the edge (in either direction). How much water
can flow into a and out of z? Consider any set Π of edges that separates a from z, i.e.,
the removal of all edges in Π would leave a and z in different components. Such a set Π
is called a cutset. Since all the water must flow through Π, an upper bound for the
maximum flow from a to z is ∑_{e∈Π} c(e). The beautiful Max-Flow Min-Cut Theorem of Ford and Fulkerson (1962) says that these are the only constraints: the maximum flow
equals min{∑_{e∈Π} c(e) ; Π a cutset}.

Applying this theorem to our tree situation where the amount of water that can flow
through an edge at distance n from the root is limited to λ^{−n}, we see that the maximum flow from the root to infinity is

inf{∑_{x∈Π} λ^{−|x|} ; Π cuts the root from infinity}.

Here, we identify a set of vertices with their preceding edges when considering cutsets.
By analogy with the leaves of a finite tree, we call the set of rays of T the boundary
(at infinity) of T, denoted ∂T. (It does not include any leaves of T.) Now there is a natural
* In graph theory, "spanning forest" usually means a maximal subgraph without cycles, i.e., a spanning tree in each connected component. We mean, instead, a subgraph without cycles that contains every vertex.
metric on ∂T: if ξ, η ∈ ∂T have exactly n edges in common, define their distance to be d(ξ, η) := e^{−n}. Thus, if x ∈ T has more than one child with infinitely many descendants, the set of rays going through x,

Bx := {ξ ∈ ∂T ; ξ_{|x|} = x},

has diameter diam Bx = e^{−|x|}. We call a collection C of subsets of ∂T a cover if

∪_{B∈C} B = ∂T.

Note that Π is a cutset iff {Bx ; x ∈ Π} is a cover. The Hausdorff dimension of ∂T is
defined to be

dim ∂T := sup{α ; inf_{C a countable cover} ∑_{B∈C} (diam B)^α > 0}.
This is, in fact, already familiar to us, since

br T = sup{λ ; water can flow through pipe capacities λ^{−|x|}}
     = sup{λ ; inf_{Π a cutset} ∑_{x∈Π} λ^{−|x|} > 0}
     = exp sup{α ; inf_{Π a cutset} ∑_{x∈Π} e^{−α|x|} > 0}
     = exp sup{α ; inf_{C a cover} ∑_{B∈C} (diam B)^α > 0}
     = exp dim ∂T.
1.8. Capacity.

An important tool in analyzing electrical networks is that of energy. Thomson's
principle says that, given a finite graph and two distinguished vertices a, z, the unit
current flow from a to z is the unit flow from a to z that minimizes energy, where, if the
conductances are c(e), the energy of a flow θ is defined to be ∑_{e an edge} θ(e)²/c(e). (It turns out that the energy of the unit current flow is equal to the effective resistance from
a to z.) Thus,

electrical current flows from the root of an infinite tree
⟺ there is a flow with finite energy.   (1.2)
Now on a tree, a unit flow θ can be identified with a function on the vertices of T that is
1 at the root and has the property that for all vertices x,

θ(x) = ∑_i θ(yi),

where yi are the children of x. The energy of a flow is then

∑_{x∈T} θ(x)² λ^{|x|},
whence

br T = sup{λ ; there exists a unit flow θ with ∑_{x∈T} θ(x)² λ^{|x|} < ∞}.

For α > 0,

random walk with parameter λ = e^α is transient ⟺ cap_α(∂T) > 0.   (1.4)

It follows from Theorem 3.5 that

the critical value of α for positivity of cap_α(∂T) is dim ∂T.   (1.5)
A refinement of Theorem 5.15 is (Lyons, 1992):
Theorem 15.3. (Tree Percolation and Capacity) For α > 0, percolation with
parameter p = e^{−α} yields an infinite component a.s. iff cap_α(∂T) > 0. Moreover,

cap_α(∂T) ≤ P[the component of the root is infinite] ≤ 2 cap_α(∂T).

When T is spherically symmetric and p = e^{−α}, we have (Exercise 15.1)

cap_α(∂T) = [1 + (1 − p) ∑_{n=1}^∞ 1/(p^n |Tn|)]^{−1}.
The case of the first part of this theorem where all the degrees are uniformly bounded
was shown first by Fan (1989, 1990).
One way to use this theorem is to combine it with (1.4); this allows us to translate
problems freely between the domains of random walks and percolation (Lyons, 1992). The theorem actually holds in a much wider context (on trees) than that mentioned here: Fan
allowed the probabilities p to vary depending on the generation and Lyons allowed the
probabilities as well as the tree to be completely arbitrary.
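For the binary tree, both sides of Theorem 15.3 can be computed explicitly, which makes a convenient sanity check. The closed forms below are our own elementary computations (with |Tn| = 2^n and p > 1/2), assuming the spherically symmetric formula and the standard percolation recursion:

```python
def survival_binary(p, iters=10_000):
    """P[the root's cluster is infinite] for percolation on the binary
    tree: iterate q = 1 - (1 - p*q)^2 to its stable fixed point, which
    for p > 1/2 is (2p - 1)/p^2."""
    q = 1.0
    for _ in range(iters):
        q = 1 - (1 - p * q) ** 2
    return q

def capacity_binary(p):
    """Spherically symmetric formula with |T_n| = 2^n:
    sum_n 1/(p^n 2^n) = 1/(2p - 1), so cap = (2p - 1)/p for p > 1/2."""
    return (2 * p - 1) / p

for p in (0.6, 0.75, 0.9):
    cap, surv = capacity_binary(p), survival_binary(p)
    assert cap <= surv <= 2 * cap       # the bounds of Theorem 15.3
    print(p, cap, surv)
```

Indeed (2p − 1)/p ≤ (2p − 1)/p² ≤ 2(2p − 1)/p exactly when 1 ≤ 1/p ≤ 2, i.e., for all p ∈ [1/2, 1].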
1.9. Embedding Trees into Euclidean Space.

The results described above, especially those concerning percolation, can be translated
to give results on closed sets in Euclidean space. We will describe only the simplest such
correspondence here, which was the one that was part of Furstenberg's motivation in 1970.
Namely, for a closed nonempty set E ⊆ [0, 1] and for any integer b ≥ 2, consider the system of b-adic subintervals of [0, 1]. Those whose intersection with E is non-empty will form the
vertices of the associated tree. Two such intervals are connected by an edge iff one contains
the other and the ratio of their lengths is b. The root of this tree is [0, 1]. We denote the
tree by T[b](E). Were it not for the fact that certain numbers have two representations
in base b, we could identify ∂T[b](E) with E. Note that such an identification would be
Hölder continuous in one direction (only).
Hausdorff dimension is defined for subsets of [0, 1] just as it was for ∂T:

dim E := sup{α ; inf_{C a cover of E} ∑_{B∈C} (diam B)^α > 0},

where diam B denotes the (Euclidean) diameter of B. Covers of ∂T[b](E) by sets of the
form Bx correspond to covers of E by b-adic intervals, but of diameter b^{−|x|}, rather than
e^{−|x|}. It is easy to show that restricting to such covers does not change the computation
Figure 1.11. In this case b = 4.
of Hausdorff dimension, whence we may conclude (compare the calculation at the end of
Section 1.7) that

dim E = (dim ∂T[b](E))/log b = log_b(br T[b](E)).
For example, the Hausdorff dimension of the Cantor middle-thirds set is log 2/ log 3, which
we see by using the base b = 3. It is interesting that if we use a different base b, we will still
have log_b br T[b](E) = log_3 2 when E is the Cantor set.
Capacity is also defined as it was on the boundary of a tree:

(cap_α E)^{−1} := inf{∫∫ dμ(x) dμ(y)/|x − y|^α ; μ a probability measure on E}.

It was shown by Benjamini and Peres (1992) (see Section 15.3) that
cap_α(E)/3 ≤ cap_{α log b}(∂T[b](E)) ≤ b^α cap_α(E).   (1.6)
This means that the percolation criterion Theorem 15.3 can be used in Euclidean space. In
fact, inequalities such as (1.6) also hold for more general kernels than distance to a power
and for higher-dimensional Euclidean spaces. Nice applications to the trace of Brownian
motion were found by Peres (1996) by replacing the path by an intersection-equivalent
random fractal that is much easier to analyze, being an embedding of a Galton-Watson
tree. The point is that the probability of an intersection of Brownian motion with another set
can be estimated by a capacity in a fashion very similar to (1.6). This will allow us to
prove some surprising things about Brownian motion in a very easy fashion.
Also, the fact (1.5) translates into the analogous statement about subsets E ⊆ [0, 1];
this is a classical theorem of Frostman (1935).
1.10. Notes.
Other recent books that cover material related to the topics of this book include Probability on Graphs by Geoffrey Grimmett (forthcoming), Reversible Markov Chains and Random Walks on Graphs by David Aldous and Jim Fill (forthcoming), Markov Chains and Mixing Times by David A. Levin, Yuval Peres, and Elizabeth L. Wilmer, Random Trees: An Interplay Between Combinatorics and Probability by Michael Drmota, A Course on the Web Graph by Anthony Bonato, Random Graph Dynamics by Rick Durrett, Complex Graphs and Networks by Fan Chung and Linyuan Lu, The Random-Cluster Model by Geoffrey Grimmett, Superfractals by Michael Fielding Barnsley, Introduction to Mathematical Methods in Bioinformatics by Alexander Isaev, Gaussian Markov Random Fields by Håvard Rue and Leonhard Held, Conformally Invariant Processes in the Plane by Gregory F. Lawler, Random Networks in Communication by Massimo Franceschetti and Ronald Meester, Percolation by Béla Bollobás and Oliver Riordan, Probability and Real Trees by Steven Evans, Random Trees, Lévy Processes and Spatial Branching Processes by Thomas Duquesne and Jean-François Le Gall, Combinatorial Stochastic Processes by Jim Pitman, Random Geometric Graphs by Mathew Penrose, Random Graphs by Béla Bollobás, Random Graphs by Svante Janson, Tomasz Łuczak, and Andrzej Ruciński, Phylogenetics by Charles Semple and Mike Steel, Stochastic Networks and Queues by Philippe Robert, Random Walks on Infinite Graphs and Groups by Wolfgang Woess, Percolation by Geoffrey Grimmett, Stochastic Interacting Systems: Contact, Voter and Exclusion Processes by Thomas M. Liggett, and Discrete Groups, Expanding Graphs and Invariant Measures by Alexander Lubotzky.
1.11. Collected In-Text Exercises.
1.1. Prove (1.1).
1.2. Show that br T = gr T when T is spherically symmetric.
1.3. Verify that if ⟨nk⟩ increases sufficiently rapidly, then gr(T(1) ∨ T(2)) = √2. Furthermore,
show that the set of possible values of gr(T(1) ∨ T(2)) over all sequences ⟨nk⟩ is [√2, 2].
Chapter 2
Random Walks and Electric Networks
The two topics of the title of this chapter do not sound related to each other, but,
in fact, they are intimately connected in several useful ways. This is a discrete version of
more general profound and detailed connections between potential theory and probability;
see, e.g., Chapter II of Bass (1995) or Doob (1984). A superb elementary introduction to
the ideas of the first five sections of this chapter is given by Doyle and Snell (1984).
2.1. Circuit Basics and Harmonic Functions.

Our principal interest in this chapter centers around transience and recurrence of
irreducible Markov chains. If the chain starts at a state x, then we want to know whether
the chance that it ever visits a state a is 1 or not.
In fact, we are interested only in reversible Markov chains, where we call a Markov
chain reversible if there is a positive function x ↦ π(x) on the state space such that the transition probabilities satisfy π(x)p_{xy} = π(y)p_{yx} for all pairs of states x, y. (Such a
function π(•) will then provide a stationary measure: see Exercise 2.1. Note that π(•) is
not generally a probability measure.) In this case, make a graph G by taking the states
of the Markov chain for vertices and joining two vertices x, y by an edge when pxy > 0.
Assign weight
c(x, y) := π(x)p_{xy}   (2.1)
to that edge; note that the condition of reversibility ensures that this weight is the same
no matter in what order we take the endpoints of the edge. With this network in hand,
the Markov chain may be described as a random walk on G: when the walk is at a vertex
x, it chooses randomly among the vertices adjacent to x with transition probabilities
proportional to the weights of the edges. Conversely, every connected graph with weights
on the edges such that the sum of the weights incident to every vertex is finite gives rise to
a random walk with transition probabilities proportional to the weights. Such a random
walk is an irreducible reversible Markov chain: define π(x) to be the sum of the weights
incident to x.*
The most well-known example is gambler's ruin. A gambler needs $n but has only $k
(1 ≤ k ≤ n − 1). He plays games that give him chance p of winning $1 and q := 1 − p of losing $1 each time. When his fortune is either $n or $0, he stops. What is his chance of ruin (i.e.,
reaching 0 before n)? We will answer this in Example 2.4 by using the following weighted
graph. The vertices are {0, 1, 2, . . . , n}, the edges are between consecutive integers, and the weights are c(i, i + 1) = c(i + 1, i) = (p/q)^i.
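The answer, derived later via this network in Example 2.4, is the classical gambler's-ruin formula; for p ≠ 1/2 it can be corroborated by direct simulation (the particular values of n, k, p below are merely illustrative):

```python
import random

def ruin_prob(n, k, p):
    """Classical gambler's-ruin answer: with r = q/p, the chance of
    hitting 0 before n starting from k is (r^k - r^n)/(1 - r^n), p != 1/2."""
    r = (1 - p) / p
    return (r ** k - r ** n) / (1 - r ** n)

def simulate_ruin(n, k, p, trials=100_000, seed=0):
    rng = random.Random(seed)
    ruined = 0
    for _ in range(trials):
        x = k
        while 0 < x < n:
            x += 1 if rng.random() < p else -1
        ruined += (x == 0)
    return ruined / trials

n, k, p = 10, 3, 0.55
print(ruin_prob(n, k, p), simulate_ruin(n, k, p))   # should agree to ~2 decimals
```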
Exercise 2.1.
(Reversible Markov Chains) This exercise contains some background information and
facts that we will use about reversible Markov chains.
(a) Show that if a Markov chain is reversible, then for all states x1, x2, . . . , xn,

π(x1) ∏_{i=1}^{n−1} p_{x_i x_{i+1}} = π(xn) ∏_{i=1}^{n−1} p_{x_{n+1−i} x_{n−i}},

whence ∏_{i=1}^{n−1} p_{x_i x_{i+1}} = ∏_{i=1}^{n−1} p_{x_{n+1−i} x_{n−i}} if x1 = xn. This last equation also charac-
terizes reversibility.
(b) Let ⟨Xn⟩ be a random walk on G and let x and y be two vertices in G. Let P be a path from x to y and P̄ its reversal, a path from y to x. Show that

Px[⟨Xn ; n ≤ τy⟩ = P, τy < τ⁺x] = Py[⟨Xn ; n ≤ τx⟩ = P̄, τx < τ⁺y],

where τw denotes the first time the random walk visits w, τ⁺w denotes the first time
after 0 that the random walk visits w, and Pu denotes the law of random walk started
at u. In words, paths between two states that don't return to the starting point and
stop at the first visit to the endpoint have the same distribution in both directions of
time.
(c) Consider a random walk on G that is either transient or is stopped on the first visit
to a set of vertices Z. Let G(x, y) be the expected number of visits to y for a random
walk started at x; if the walk is stopped at Z, we count only those visits that occur strictly before visiting Z. Show that for every pair of vertices x and y,

π(x)G(x, y) = π(y)G(y, x).
* Suppose that we consider an edge e of G to have length c(e)^{−1}. Run a Brownian motion on G and observe it only when it reaches a vertex different from the previous one. Then we see the random walk on G just described. There are various equivalent ways to define rigorously Brownian motion on G. The essential point is that for Brownian motion on ℝ started at 0 and for a, b > 0, if X is the first of {−a, b} visited, then P[X = −a] = b/(a + b).
(d) Show that random walk on a connected weighted graph G is positive recurrent (i.e.,
has a stationary probability distribution) iff ∑_{x,y} c(x, y) < ∞.
Maximum Principle. Let G be a finite or infinite network. If H ⊆ G, H is connected and finite, f : V(G) → ℝ, f is harmonic on V(H), and max f↾V(H) = sup f, then f is constant on V(H̄), where V(H̄) denotes V(H) together with all vertices adjacent to V(H).

Proof. Let K := {y ∈ V(G) ; f(y) = sup f}. Note that if x ∈ V(H) ∩ K, then all the neighbors of x lie in K by harmonicity of f at x. Since H is connected and V(H) ∩ K ≠ ∅, it follows that K ⊇ V(H) and therefore that K ⊇ V(H̄).
This leads to the
Uniqueness Principle. Let G = (V, E) be a finite or infinite connected network. Let W
be a finite proper subset of V. If f, g : V → ℝ, f, g are harmonic on W, and f↾(V \ W) = g↾(V \ W), then f = g.

Proof. Let h := f − g. We claim that h ≤ 0. This suffices to establish the corollary since then h ≥ 0 by symmetry, whence h ≡ 0.
Now h = 0 off W, so if h ≤ 0 fails on W, then h is positive somewhere on W, whence max h↾W = max h. Let H be a connected component of (W, E ∩ (W × W)) where h achieves
its maximum. According to the maximum principle, h is a positive constant on V(H̄) (the vertices of H and their neighbors). In
particular, h > 0 on the non-empty set V(H̄) \ V(H). However, V(H̄) \ V(H) ⊆ V \ W, whence h = 0 on V(H̄) \ V(H). This is a contradiction.
Thus, the harmonicity of the function x ↦ Px[τA < τZ] (together with its values
where it is not harmonic) characterizes it.
If f, f1, and f2 are harmonic on W and a1, a2 ∈ ℝ with f = a1f1 + a2f2 on V \ W, then f = a1f1 + a2f2 everywhere by the uniqueness principle. This is one form of the
superposition principle.
Given a function defined on a subset of vertices of a network, the Dirichlet problem
asks whether the given function can be extended to all vertices of the network so as to be
harmonic wherever it was not originally defined. The answer is often yes:
Existence Principle. Let G = (V, E) be a finite or infinite network. If W ⊆ V and f0 : V \ W → ℝ is bounded, then there exists f : V → ℝ such that f↾(V \ W) = f0 and f is harmonic on W.
Proof. For any starting point x of the network random walk, let X be the first vertex in
V \ W visited by the random walk if V \ W is indeed visited. Let Y := f0(X) when V \ W is visited and Y := 0 otherwise. It is easily checked that f(x) := Ex[Y] works, by using
the same method as we used to see that the function F of (2.2) is harmonic.
The function F of (2.2) is the particular case of the Existence Principle where W =
V \ (A ∪ Z), f0↾A ≡ 1, and f0↾Z ≡ 0.
In fact, for finite networks, we could have immediately deduced existence from unique-
ness: The Dirichlet problem on a finite network consists of a finite number of linear equa-
tions, one for each vertex in W. Since the number of unknowns is equal to the number of
equations, the uniqueness principle implies the existence principle. An example is shown
in Figure 2.1, where the function was specified to be 1 at two vertices, 0.5 at another, and
0 at a fourth; the function is harmonic elsewhere.
Figure 2.1. A harmonic function on a 40×40 square grid with 4 specified values where it is not harmonic.
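Pictures like Figure 2.1 are easy to produce numerically: hold the specified values fixed and repeatedly replace every other value by the average of its neighbors. Here is a sketch with an illustrative 10×10 grid and pinned values of our own choosing:

```python
def solve_dirichlet(n, pinned, sweeps=5000):
    """Gauss-Seidel relaxation for the Dirichlet problem on an n x n grid:
    values in `pinned` stay fixed; every other vertex is repeatedly
    replaced by the mean of its neighbors until (numerically) harmonic."""
    f = [[0.0] * n for _ in range(n)]
    for (i, j), val in pinned.items():
        f[i][j] = val
    def nbrs(i, j):
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            if 0 <= i + di < n and 0 <= j + dj < n:
                yield i + di, j + dj
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                if (i, j) not in pinned:
                    ns = list(nbrs(i, j))
                    f[i][j] = sum(f[a][b] for a, b in ns) / len(ns)
    return f

# 1 at two vertices, 0.5 at a third, 0 at a fourth, as in Figure 2.1.
pinned = {(0, 0): 1.0, (0, 9): 1.0, (9, 0): 0.5, (9, 9): 0.0}
f = solve_dirichlet(10, pinned)
```

In keeping with the maximum principle, every computed value lies between the extreme pinned values 0 and 1.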
In order to study the solution to the Dirichlet problem, especially for a sequence
of subgraphs of an infinite graph, we will discover that electrical networks are useful.
Electrical networks, of course, have a physical meaning whose intuition is useful to us, but
also they can be used as a rigorous mathematical tool.
Mathematically, an electrical network is just a weighted graph. But now we call the
weights of the edges conductances; their reciprocals are called resistances. (Note that
later, we will encounter effective conductances and resistances; these are not the same.) We
hook up a battery or batteries (this is just intuition) between A and Z so that the voltage
at every vertex in A is 1 and in Z is 0 (more generally, so that the voltages on V \ W are given by f0). (Sometimes, voltages are called potentials or potential differences.)
Voltages v are then established at every vertex and current i runs through the edges.
These functions are implicitly defined and uniquely determined on finite networks, as we
will see, by two laws:
Ohm's Law: If x ∼ y, the current flowing from x to y satisfies

v(x) − v(y) = i(x, y)r(x, y).
Kirchhoff's Node Law: The sum of all currents flowing out of a given vertex
is 0, provided the vertex is not connected to a battery.
Physically, Ohm's law, which is usually stated as v = ir in engineering, is an empirical
statement about linear response to voltage differences: certain components obey this law
over a wide range of voltage differences. Notice also that current flows in the direction
of decreasing voltage: i(x, y) > 0 iff v(x) > v(y). Kirchhoff's node law expresses the fact
that charge does not build up at a node (current being the passage rate of charge per unit
time). If we add wires corresponding to the batteries, then the proviso in Kirchhoff's node
law is unnecessary.
Mathematically, we'll take Ohm's law to be the definition of current in terms of voltage.
In particular, i(x, y) = −i(y, x). Then Kirchhoff's node law presents a constraint on what kind of function v can be. Indeed, it determines v uniquely: Current flows into G at A
and out at Z. Thus, we may combine the two laws on V \ (A ∪ Z) to obtain

∀x ∉ A ∪ Z   0 = ∑_{y∼x} i(x, y) = ∑_{y∼x} [v(x) − v(y)]c(x, y),

or

v(x) = (1/π(x)) ∑_{y∼x} c(x, y)v(y).
That is, v(•) is harmonic on V \ (A ∪ Z). Since v↾A ≡ 1 and v↾Z ≡ 0, it follows that if G is finite, then v = F (defined in (2.2)); in particular, we have uniqueness and existence of
voltages. The voltage function is just the solution to the Dirichlet problem.
Now if we sum the differences of a function, such as the voltage v, on the edges of a
cycle, we get 0. Thus, by Ohm's law, we deduce:
Kirchhoff's Cycle Law: If x1 ∼ x2 ∼ ··· ∼ xn ∼ xn+1 = x1 is a cycle, then

∑_{i=1}^{n} i(xi, xi+1)r(xi, xi+1) = 0.
One can also deduce Ohm's law from Kirchhoff's two laws. A somewhat more general
statement is in the following exercise.
Exercise 2.2.
Suppose that an antisymmetric function j (meaning that j(x, y) = −j(y, x)) on the edges
of a finite connected network satisfies Kirchhoff's cycle law and Kirchhoff's node law in the
form ∑_{y∼x} j(x, y) = 0 for all x ∈ W. Show that j is the current determined by imposing voltages off W and that the voltage function is unique up to an additive constant.
2.2. More Probabilistic Interpretations.

Suppose that A = {a} is a singleton. What is the chance that a random walk starting
at a will hit Z before it returns to a? Write this as

P[a → Z] := Pa[τZ < τ⁺a].
Impose a voltage of v(a) at a and 0 on Z. Since v(•) is linear in v(a) by the superposition
principle, we have that Px[τa < τZ] = v(x)/v(a), whence

P[a → Z] = ∑x pax (1 − Px[τa < τZ]) = ∑x (c(a, x)/π(a)) (1 − v(x)/v(a))
         = (1/(v(a)π(a))) ∑x c(a, x)[v(a) − v(x)] = (1/(v(a)π(a))) ∑x i(a, x).

In other words,

v(a) = ∑x i(a, x) / (π(a)P[a → Z]).   (2.3)
Since ∑x i(a, x) is the total amount of current flowing into the circuit at a, we may
regard the entire circuit between a and Z as a single conductor of effective conductance

C_eff := π(a)P[a → Z] =: C(a ↔ Z),   (2.4)

where the last notation indicates the dependence on a and Z. (If we need to indicate
the dependence on G, we will write C(a ↔ Z; G).) The similarity to (2.1) can provide a
good mnemonic, but the analogy should not be pushed too far. We define the effective
resistance R(a ↔ Z) to be its reciprocal; in case a ∈ Z, then we also define R(a ↔ Z) := 0. One answer to our question above is thus P[a → Z] = C(a ↔ Z)/π(a). In Sections 2.3 and 2.4, we will see some ways to compute effective conductance.
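One immediate check (a toy example of ours): on a path of n unit resistors in series, R(0 ↔ n) = n, so the formula P[a → Z] = C(a ↔ Z)/π(a) predicts escape probability 1/n from the endpoint. Simulation agrees:

```python
import random

def escape_prob(n, trials=200_000, seed=1):
    """Monte Carlo estimate of P[0 -> {n}] on the path 0 - 1 - ... - n with
    unit conductances: the chance of hitting n before returning to 0."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        x = 1                        # the first step from 0 must go to 1
        while 0 < x < n:
            x += rng.choice((-1, 1))
        hits += (x == n)
    return hits / trials

# Series resistances add (Section 2.3), so C(0 <-> 4) = 1/4 and pi(0) = 1.
print(escape_prob(4))                # should be close to 1/4
```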
Now the number of visits to a before hitting Z is a geometric random variable with
mean P[a → Z]^{−1} = π(a)R(a ↔ Z). According to (2.3), this can also be expressed as
π(a)v(a) when there is unit current flowing from a to Z and the voltage is 0 on Z. This
generalizes as follows. Let G(a, x) be the expected number of visits to x strictly before hitting Z by a random walk started at a. Thus, G(a, x) = 0 for x ∈ Z and

G(a, a) = P[a → Z]^{−1} = π(a)R(a ↔ Z).   (2.5)

The function G(•, •) is the Green function for the random walk absorbed (or killed) on Z.
Proposition 2.1. (Green Function as Voltage) Let G be a finite connected network.
When a voltage is imposed on {a} ∪ Z so that a unit current flows from a to Z and the voltage is 0 on Z, then v(x) = G(a, x)/π(x) for all x.

Proof. We have just shown that this is true for x ∈ {a} ∪ Z, so it suffices to establish that
G(a, x)/π(x) is harmonic elsewhere. But by Exercise 2.1, we have that G(a, x)/π(x) = G(x, a)/π(a) and the harmonicity of G(x, a) is established just as for the function of (2.2).
Given that we have two probabilistic interpretations of voltage, we naturally wonder
whether current has a probabilistic interpretation. We might guess one by the following
unrealistic but simple model of electricity: positive particles enter the circuit at a, they
do Brownian motion on G (being less likely to pass through small conductors) and, when
they hit Z, they are removed. The net flow rate of particles across an edge would then be
the current on that edge. It turns out that in our mathematical model, this is correct:
Proposition 2.2. (Current as Edge Crossings) Let G be a finite connected network.
Start a random walk at a and absorb it when it first visits Z. For x ∼ y, let Sxy be the number of transitions from x to y. Then E[Sxy] = G(a, x)pxy and E[Sxy − Syx] = i(x, y), where i is the current in G when a potential is applied between a and Z in such an amount
that unit current flows in at a.

Note that we count a transition from y to x when y ∉ Z but x ∈ Z, although we do
not count this as a visit to x in computing G(a, x).

Proof. We have
E[Sxy] = E[∑_{k=0}^∞ 1{Xk=x} 1{Xk+1=y}] = ∑_{k=0}^∞ P[Xk = x, Xk+1 = y]
       = ∑_{k=0}^∞ P[Xk = x] pxy = E[∑_{k=0}^∞ 1{Xk=x}] pxy = G(a, x)pxy.
c19972010 by Russell Lyons and Yuval Peres. Commercial reproduction prohibited.DRAFT Version of 17 July 2010. DRAFT
Hence by Proposition 2.1, we have for all x, y:

E[S_xy − S_yx] = G(a, x) p_xy − G(a, y) p_yx = v(x) π(x) p_xy − v(y) π(y) p_yx
              = [v(x) − v(y)] c(x, y) = i(x, y). ◻
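The edge-crossing interpretation is easy to test empirically. The sketch below (an illustration, not from the text; the triangle network is chosen for convenience) simulates the absorbed walk on a triangle a, b, z with unit conductances, where series/parallel reduction gives the unit current from a to z as i(a, z) = 2/3 and i(a, b) = i(b, z) = 1/3.

```python
import random

random.seed(0)

# Triangle a, b, z with unit conductances; start at a, absorb at z.
# Unit current from a to z: i(a,z) = 2/3, i(a,b) = i(b,z) = 1/3
# (direct edge in parallel with the two-edge path of resistance 2).
nbrs = {'a': ['b', 'z'], 'b': ['a', 'z']}
trials = 100_000
net = {('a', 'z'): 0, ('a', 'b'): 0, ('b', 'z'): 0}

for _ in range(trials):
    x = 'a'
    while x != 'z':
        y = random.choice(nbrs[x])   # unit conductances: uniform step
        if (x, y) in net:
            net[(x, y)] += 1         # crossing in the edge's orientation
        else:
            net[(y, x)] -= 1         # reverse crossing counts negatively
        x = y

est = {e: n / trials for e, n in net.items()}
print(est)  # roughly 2/3 on (a,z) and 1/3 on the other two edges
```

The averaged net crossings E[S_xy − S_yx] match the currents, as Proposition 2.2 asserts.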
Effective conductance is a key quantity because of the following relationship to the question of transience and recurrence when G is infinite. Recall that for an infinite network G, we assume that π(x) = ∑_{y : y∼x} c(x, y) < ∞ for every vertex x. Let ⟨G_n⟩ be finite connected subnetworks containing a that exhaust G, and let Z_n be the set of vertices of G outside G_n. Then P[a → Z_n] = C(a ↔ Z_n)/π(a) decreases to the probability that the walk started at a never returns to a, which is positive iff the random walk on G is transient. We call lim_n C(a ↔ Z_n) the effective conductance from a to ∞ in G and denote it by C(a ↔ ∞) or, if a is understood, by C_eff. Its reciprocal, effective resistance, is denoted R_eff. We have shown:
Theorem 2.3. (Transience and Effective Conductance) Random walk on an infinite
connected network is transient iff the effective conductance from any of its vertices to
infinity is positive.
Exercise 2.4.
For a fixed vertex a in G, show that lim_n C(a ↔ Z_n) is the same for every sequence ⟨G_n⟩ that exhausts G.
Exercise 2.5.
When G is finite but A is not a singleton, define C(A ↔ Z) to be C(a ↔ Z) if all the vertices in A were identified to a single vertex, a. Show that if voltages are applied at the vertices of A ∪ Z so that v↾A and v↾Z are constants, then v_A − v_Z = I_{A→Z} R(A ↔ Z), where I_{A→Z} := ∑_{x∈A} ∑_y i(x, y) is the total amount of current flowing from A to Z.
2.3. Network Reduction.

How do we calculate the effective conductance of a network? Since we want to replace a network by an equivalent single conductor, it is natural to attempt this by replacing more and more of G through simple transformations. There are, in fact, three such simple transformations, series, parallel, and star-triangle, and it turns out that they suffice to reduce all finite planar networks by a theorem of Epifanov; see Truemper (1989).
I. Series. Two resistors r_1 and r_2 in series are equivalent to a single resistor r_1 + r_2. In other words, if w ∈ V(G) ∖ (A ∪ Z) is a node of degree 2 with neighbors u_1, u_2 and we replace the edges (u_i, w) by a single edge (u_1, u_2) having resistance r(u_1, w) + r(w, u_2), then all potentials and currents in G ∖ {w} are unchanged and the current that flows from u_1 to u_2 equals i(u_1, w).
[Figure: the series transformation, replacing the path u_1, w, u_2 by a single edge from u_1 to u_2.]
Proof. It suffices to check that Ohm's and Kirchhoff's laws are satisfied on the new network for the voltages and currents given. This is easy. ◻
Exercise 2.6.
Give two harder but instructive proofs of the series equivalence: Since voltages determine currents, it suffices to check that the voltages are as claimed on the new network.
(1) Show that v(x) (for x ∈ V(G) ∖ {w}) is harmonic on V(G) ∖ (A ∪ Z ∪ {w}). (2) Use the craps principle (Pitman (1993), p. 210) to show that P_x[τ_A < τ_Z] is unchanged for x ∈ V(G) ∖ {w}.
Example 2.4. Consider simple random walk on Z. Let 0 ≤ k ≤ n. What is P_k[τ_0 < τ_n]? It is the voltage at k when there is a unit voltage imposed at 0 and zero voltage at n. If we replace the resistors in series from 0 to k by a single resistor with resistance k and the resistors from k to n by a single resistor of resistance n − k, then the voltage at k does not change. But now this voltage is simply the probability of taking a step to 0, which is thus (n − k)/n.

For gambler's ruin, rather than simple random walk, we have the conductances c(i, i+1) = (p/q)^i. The replacement of edges in series by single edges now gives one edge from 0 to k of resistance ∑_{i=0}^{k−1} (q/p)^i and one edge from k to n of resistance ∑_{i=k}^{n−1} (q/p)^i. The probability of ruin is therefore

∑_{i=k}^{n−1} (q/p)^i / ∑_{i=0}^{n−1} (q/p)^i = [(p/q)^{n−k} − 1]/[(p/q)^n − 1].
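The series computation above can be carried out exactly with rational arithmetic. The following sketch (an illustration, not part of the text) computes the ruin probability as a voltage in the reduced two-edge network and compares it with the closed form:

```python
from fractions import Fraction

def ruin_prob(p, q, k, n):
    """P_k[tau_0 < tau_n] for gambler's ruin, computed by the series law:
    edge (i, i+1) has conductance (p/q)^i, hence resistance (q/p)^i."""
    r = Fraction(q) / Fraction(p)
    left = sum(r**i for i in range(k))       # resistance of the 0-to-k edge
    right = sum(r**i for i in range(k, n))   # resistance of the k-to-n edge
    # voltage at k with unit voltage at 0 and zero voltage at n:
    return right / (left + right)

p, q, k, n = Fraction(2, 3), Fraction(1, 3), 3, 10
closed_form = ((p/q)**(n-k) - 1) / ((p/q)**n - 1)
print(ruin_prob(p, q, k, n) == closed_form)  # True (both equal 127/1023)
```

Note that the function also handles the symmetric case p = q, where the closed form above degenerates but the resistance formula gives (n − k)/n directly.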
II. Parallel. Two conductors c_1 and c_2 in parallel are equivalent to one conductor c_1 + c_2. In other words, if two edges e_1 and e_2 that both join vertices w_1, w_2 ∈ V(G) are replaced by a single edge e joining w_1, w_2 of conductance c(e) := c(e_1) + c(e_2), then all voltages and currents in G ∖ {e_1, e_2} are unchanged and the current i(e) equals i(e_1) + i(e_2) (if e_1 and e_2 have the same orientations, i.e., same tail and head). The same is true for an infinite number of edges in parallel.
[Figure: the parallel transformation, merging edges e_1 and e_2 joining w_1 and w_2 into a single edge.]
Proof. Check Ohm's and Kirchhoff's laws with i(e) := i(e_1) + i(e_2). ◻
Exercise 2.7.
Give two more proofs of the parallel equivalence as in Exercise 2.6.
Example 2.5. Suppose that each edge in the following network has equal conductance. What is P[a → z]? Following the transformations indicated in the figure, we obtain C(a ↔ z) = 7/12, so that

P[a → z] = C(a ↔ z)/π(a) = (7/12)/3 = 7/36.
[Figure: step-by-step reduction of the network between a and z by series and parallel transformations; intermediate conductances include 1/4, 1/3, and 1/2, ending with a single edge of conductance 7/12.]
Note that in any network G with voltage applied from a to z, if it happens that
v(x) = v(y), then we may identify x and y to a single vertex, obtaining a new network
G/{x, y} in which the voltages at all vertices are the same as in G.
Example 2.6. What is P[a → z] in the following network?
[Figure: a network of unit-conductance edges between a and z, containing one vertical edge whose endpoints are exchanged by the network's symmetry.]
There are two ways to deal with the vertical edge:
(1) Remove it: by symmetry, the voltages at its endpoints are equal, whence no current
flows on it.
(2) Contract it, i.e., remove it but combine its endpoints into one vertex (we could
also combine the other two unlabelled vertices with each other): the voltages are the same,
so they may be combined.
In either case, we get C(a ↔ z) = 2/3, whence P[a → z] = 1/3.
Exercise 2.8.
Let (G, c) be a network. A network automorphism of (G, c) is a map φ : G → G that is a bijection of the vertex set with itself and a bijection of the edge set with itself such that if x and e are incident, then so are φ(x) and φ(e), and such that c(e) = c(φ(e)) for all edges e. Suppose that (G, c) is spherically symmetric about o, meaning that if x and y are any two vertices at the same distance from o, then there is an automorphism of (G, c) that leaves o fixed and that takes x to y. Let C_n be the sum of c(e) over all edges e with d(e^−, o) = n and d(e^+, o) = n + 1. Show that

R(o ↔ ∞) = ∑_n 1/C_n,

whence the network random walk on G is transient iff ∑_n 1/C_n < ∞.
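To see the criterion in action, here is a small computational sketch (an illustration, not from the text) evaluating partial sums of ∑ 1/C_n for two spherically symmetric networks: the binary tree (transient) and the line Z (recurrent), both with unit conductances.

```python
from fractions import Fraction

def resistance_partial(C, N):
    """Partial sum of R(o <-> infinity) = sum_{n >= 0} 1/C_n, where C(n) is
    the total conductance of the edges joining sphere n to sphere n + 1."""
    return sum(Fraction(1, C(n)) for n in range(N))

# Binary tree, unit conductances: C_n = 2^(n+1), so R(o <-> inf) = 1: transient.
print(float(resistance_partial(lambda n: 2**(n + 1), 40)))  # approaches 1.0

# The line Z, unit conductances: C_n = 2 for all n, so the sum diverges: recurrent.
print(float(resistance_partial(lambda n: 2, 40)))  # 20.0, growing without bound
```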
Exercise 2.9.
Give at least one proof of star-triangle equivalence.
Actually, there is a fourth trivial transformation: we may prune (or add) vertices of
degree 1 (and attendant edges) as well as loops.
Exercise 2.10.
Find a (finite) graph that can't be reduced to a single edge by the four transformations: pruning, series, parallel, and star-triangle.
Either of the transformations star-triangle or triangle-star can also be used to reduce the network in Example 2.6.
Example 2.7. What is P_x[τ_a < τ_z] in the following network? Following the transformations indicated in the figure, we obtain

P_x[τ_a < τ_z] = (20/33)/(20/33 + 15/22) = 8/17.
[Figure: reduction of the network by series, parallel, and star-triangle transformations; among the intermediate conductances are 1/3, 1/11, 3/11, 2/11, and 1/2, ending with x joined to a by conductance 20/33 and to z by conductance 15/22.]
2.4. Energy.

We now come to another extremely useful concept, energy. (Although the term energy is used for mathematical reasons, the physical concept is actually power dissipation.) We will begin with some convenient notation and some facts about this notation. Unfortunately, there is actually a fair bit of notation. But once we have it all in place, we will be able to quickly reap some powerful consequences.
We will often contract some vertices in a graph, which may produce a multigraph. When we say that a graph is finite, we mean that V and E are finite. Define ℓ²(V) to be the usual real Hilbert space of functions on V with inner product

(f, g) := ∑_{x∈V} f(x) g(x).
Since we are interested in flows on E, it is natural to consider that what flows one way is the negative of what flows the other. From now on, each edge occurs with both orientations. Thus, define ℓ²₋(E) to be the space of antisymmetric functions θ on E (i.e., θ(−e) = −θ(e) for each edge e) with the inner product

(θ, θ′) := (1/2) ∑_{e∈E} θ(e) θ′(e) = ∑_{e∈E_{1/2}} θ(e) θ′(e),

where E_{1/2} ⊂ E is a set of edges containing exactly one of each pair e, −e. Since voltage differences across edges lead to currents, define the coboundary operator d : ℓ²(V) → ℓ²₋(E) by

(df)(e) := f(e^−) − f(e^+).

(Note that this is the negative of the more natural definition; since current flows from greater to lesser voltage, however, it is the more useful definition for us.) This operator is clearly linear. Conversely, given an antisymmetric function θ on the edges, we are interested in the net flow out of a vertex, whence we define the boundary operator d* : ℓ²₋(E) → ℓ²(V) by

(d*θ)(x) := ∑_{e^− = x} θ(e).

This operator is also clearly linear. We use the superscript * because these two operators are adjoints of each other:

∀f ∈ ℓ²(V) ∀θ ∈ ℓ²₋(E)  (θ, df) = (d*θ, f).
Exercise 2.11.
Prove that d and d* are adjoints of each other.
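The adjoint identity is also easy to check numerically. The sketch below (an illustration, not from the text; the 4-vertex graph is arbitrary) evaluates both inner products for random f and θ:

```python
import random

random.seed(1)

# A small graph; E_half lists one orientation of each edge as (tail, head).
V = [0, 1, 2, 3]
E_half = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]

f = {x: random.random() for x in V}            # a function on vertices
theta = {e: random.random() for e in E_half}   # antisymmetric: theta(-e) = -theta(e)

def df(e):
    """Coboundary: (df)(e) = f(e^-) - f(e^+)."""
    x, y = e
    return f[x] - f[y]

def d_star(x):
    """Boundary: net flow of theta out of x (sum over edges with tail x)."""
    return sum(t for (u, v), t in theta.items() if u == x) \
         - sum(t for (u, v), t in theta.items() if v == x)

lhs = sum(theta[e] * df(e) for e in E_half)    # (theta, df) over E_1/2
rhs = sum(d_star(x) * f[x] for x in V)         # (d* theta, f)
print(abs(lhs - rhs) < 1e-12)  # True
```

Storing θ only on E_{1/2} and letting reversed crossings enter with a minus sign mirrors the antisymmetry convention in the text.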
One use of this notation is that the calculation left here for Exercise 2.11 need not be repeated every time it arises (and it arises a lot). Another use is the following compact forms of the network laws. Let i be a current.

Ohm's Law: dv = ir, i.e., ∀e ∈ E, dv(e) = i(e) r(e).

Kirchhoff's Node Law: d*i(x) = 0 if no battery is connected at x.
It will be useful to study flows other than current in order to discover a special property of the current flow. We can imagine water flowing through a network of pipes. The amount of water flowing into the network at a vertex a is d*θ(a). Thus, we call θ ∈ ℓ²₋(E) a flow from A to Z if d*θ is 0 off of A and Z, is nonnegative on A, and nonpositive on Z. The total amount flowing into the network is then ∑_{a∈A} d*θ(a); as the next lemma shows, this is also the total amount flowing out of the network. We call

Strength(θ) := ∑_{a∈A} d*θ(a)

the strength of the flow θ. A flow of strength 1 is called a unit flow.
Lemma 2.8. (Flow Conservation) Let G be a finite graph and A and Z be two disjoint subsets of its vertices. If θ is a flow from A to Z, then

∑_{a∈A} d*θ(a) = −∑_{z∈Z} d*θ(z).

Proof. We have

∑_{x∈A} d*θ(x) + ∑_{x∈Z} d*θ(x) = ∑_{x∈A∪Z} d*θ(x) = (d*θ, 1) = (θ, d1) = (θ, 0) = 0

since d*θ(x) = 0 for x ∉ A ∪ Z. ◻

The following consequence will be useful in a moment.
Lemma 2.9. Let G be a finite graph and A and Z be two disjoint subsets of its vertices. If θ is a flow from A to Z and f↾A ≡ α, f↾Z ≡ ζ are constants, then

(θ, df) = Strength(θ)(α − ζ).

Proof. We have (θ, df) = (d*θ, f) = α ∑_{a∈A} d*θ(a) + ζ ∑_{z∈Z} d*θ(z). Now apply Lemma 2.8. ◻
Kirchhoff's Cycle Law: If e_1, e_2, ..., e_n is an oriented cycle in G, then

∑_{k=1}^n (χ^{e_k}, i)_r = 0.
Now for our last bit of notation before everything comes into focus, let ★ denote the subspace in ℓ²₋(E) spanned by the stars ∑_{e^−=x} c(e) χ^e and let ♦ denote the subspace spanned by the cycles ∑_{k=1}^n χ^{e_k}, where e_1, e_2, ..., e_n forms an oriented cycle. We call these subspaces the star space and the cycle space of G. These subspaces are clearly orthogonal; here and subsequently, orthogonality refers to the inner product (·, ·)_r. Moreover, the sum of ★ and ♦ is all of ℓ²₋(E), which is the same as saying that only the zero vector is orthogonal to both ★ and ♦. To see that this is the case, suppose that θ ∈ ℓ²₋(E) is orthogonal to both ★ and ♦. Since θ is orthogonal to ♦, there is a function F such that θ = c dF by Exercise 2.2. Since θ is orthogonal to ★, the function F is harmonic. Since G is finite, the uniqueness principle implies that F is constant on each component of G, whence θ = 0, as desired.
Thus, Kirchhoff's Cycle Law says that i, being orthogonal to ♦, is in ★. Furthermore, any i ∈ ★ is a current by Exercise 2.2 (let W := {x ; (d*i)(x) = 0}). Now if θ is any flow with the same sources and sinks as i, more precisely, if θ is any antisymmetric function such that d*θ = d*i, then θ − i is a sourceless flow, i.e., by (2.8), is orthogonal to ★ and thus is an element of ♦. Therefore, the expression

θ = i + (θ − i)

is the orthogonal decomposition of θ relative to ℓ²₋(E) = ★ ⊕ ♦. This shows that the orthogonal projection P★ : ℓ²₋(E) → ★ plays a crucial role in network theory. In particular,

i = P★ θ   (2.9)

and

‖θ‖²_r = ‖i‖²_r + ‖θ − i‖²_r.   (2.10)
Thomson's Principle. Let G be a finite network and A and Z be two disjoint subsets of its vertices. Let θ be a flow from A to Z and i be the current flow from A to Z with d*i = d*θ. Then E(θ) > E(i) unless θ = i.

Proof. The result is an immediate consequence of (2.10). ◻
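A concrete instance (an illustration, not part of the text): on a triangle a, b, z with unit resistances, the unit current flow from a to z (computed by series/parallel reduction) has energy 2/3, the effective resistance, while any other unit flow has strictly larger energy.

```python
from fractions import Fraction as F

# Triangle a, b, z with unit resistances; compare two unit flows from a to z.
r = {('a', 'z'): F(1), ('a', 'b'): F(1), ('b', 'z'): F(1)}

def energy(theta):
    # E(theta) = sum over E_1/2 of theta(e)^2 r(e)
    return sum(theta[e]**2 * r[e] for e in r)

# The unit current flow: 2/3 on the direct edge, 1/3 along the two-edge path.
i = {('a', 'z'): F(2, 3), ('a', 'b'): F(1, 3), ('b', 'z'): F(1, 3)}
# Another unit flow: send everything along the direct edge.
theta = {('a', 'z'): F(1), ('a', 'b'): F(0), ('b', 'z'): F(0)}

print(energy(i))      # 2/3, the effective resistance R(a <-> z)
print(energy(theta))  # 1, strictly larger, as Thomson's Principle requires
```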
Note that given θ, the corresponding current i such that d*i = d*θ is unique (and given by (2.9)).
Recall that E_{1/2} ⊂ E is a set of edges containing exactly one of each pair e, −e. What is the matrix of P★ in the orthogonal basis {χ^e ; e ∈ E_{1/2}}? We have

(P★ χ^e, χ^{e′})_r = (i^e, χ^{e′})_r = i^e(e′) r(e′),   (2.11)

where i^e is the unit current from e^− to e^+. Therefore, the matrix coefficient at (e, e′) equals (P★ χ^e, χ^{e′})_r / (χ^{e′}, χ^{e′})_r = i^e(e′) =: Y(e, e′), the current that flows across e′ when a unit current is imposed between the endpoints of e. This matrix is called the transfer current matrix. It will be extremely useful for our study of random spanning trees and forests in Chapters 4 and 10. Since P★, being an orthogonal projection, is self-adjoint, we have (P★ χ^e, χ^{e′})_r = (χ^e, P★ χ^{e′})_r, whence

Y(e, e′) r(e′) = Y(e′, e) r(e).   (2.12)

This is called the reciprocity law.
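The reciprocity law can be checked numerically. The sketch below (an illustration, not from the text; the triangle network and its resistances are chosen arbitrarily) computes the transfer current matrix Y by solving the weighted Laplacian system exactly in rational arithmetic, then verifies Y(e, e′) r(e′) = Y(e′, e) r(e) for every pair of edges.

```python
from fractions import Fraction as F

# A triangle on vertices 0, 1, 2; edges given as (tail, head, resistance).
V = [0, 1, 2]
edges = [(0, 1, F(1)), (1, 2, F(2)), (0, 2, F(3))]
r = {(x, y): rr for x, y, rr in edges}

def solve(A, b):
    """Gaussian elimination over exact Fractions."""
    n = len(A)
    M = [row[:] + [b[k]] for k, row in enumerate(A)]
    for col in range(n):
        piv = next(k for k in range(col, n) if M[k][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        M[col] = [x / M[col][col] for x in M[col]]
        for k in range(n):
            if k != col and M[k][col] != 0:
                M[k] = [a - M[k][col] * c for a, c in zip(M[k], M[col])]
    return [M[k][n] for k in range(n)]

def unit_current(e):
    """i^e: current on each edge for unit current from e's tail to its head."""
    n = len(V)
    L = [[F(0)] * n for _ in range(n)]       # weighted graph Laplacian
    for x, y, rr in edges:
        c = 1 / rr
        L[x][x] += c; L[y][y] += c
        L[x][y] -= c; L[y][x] -= c
    b = [F(0)] * n
    b[e[0]] += 1
    b[e[1]] -= 1
    L[0] = [F(1)] + [F(0)] * (n - 1)         # ground vertex 0: v(0) = 0
    b[0] = F(0)
    v = solve(L, b)
    return {(x, y): (v[x] - v[y]) / rr for x, y, rr in edges}

Y = {(e, e2): unit_current(e)[e2] for e in r for e2 in r}
ok = all(Y[e, e2] * r[e2] == Y[e2, e] * r[e] for e in r for e2 in r)
print(ok)  # True: Y(e, e') r(e') = Y(e', e) r(e)
```

Grounding one vertex makes the singular Laplacian system uniquely solvable; any vertex works, since currents depend only on voltage differences.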
Consider P[a → Z]. How does this change when an edge is removed from G? When an edge is added? When the conductance of an edge is changed? These questions are not easy to answer probabilistically, but they yield to the ideas we have developed. Since P[a → Z] = C(a ↔ Z)/π(a), if no edge incident to a is affected, then we need analyze only the change in effective conductance.
Exercise 2.12.
Show that P[a → Z] can increase in some situations and decrease in others when an edge incident to a is removed.
Effective conductance changes as follows. We use subscripts to indicate the edge
conductances used.
Rayleigh's Monotonicity Principle. Let G be a finite graph and A and Z two disjoint subsets of its vertices. If c and c′ are two assignments of conductances on G with c ≤ c′, then C_c(A ↔ Z) ≤ C_{c′}(A ↔ Z).

Proof. By (2.7), we have C(A ↔ Z) = 1/E(i) for a unit current flow i from A to Z. Now

E_{c′}(i_{c′}) ≤ E_{c′}(i_c) ≤ E_c(i_c)