$$\newcommand{\P}{\mathbb{P}}$$ $$\newcommand{\E}{\mathbb{E}}$$ $$\newcommand{\R}{\mathbb{R}}$$ $$\newcommand{\N}{\mathbb{N}}$$ $$\newcommand{\Z}{\mathbb{Z}}$$ $$\newcommand{\bs}{\boldsymbol}$$

## 12. Random Walks on Graphs

#### Introduction

Suppose that $$G = (S, L)$$ is a connected graph with vertex set $$S$$ and edge set $$L \subseteq S^2$$. We assume that the graph is undirected (perhaps a better term would be bi-directed) in the sense that $$(x, y) \in L$$ if and only if $$(y, x) \in L$$. We also assume that the graph has no loops, so that $$(x, x) \notin L$$ for every $$x \in S$$. The vertex set $$S$$ may be infinite. Let $$N(x) = \{y \in S: (x, y) \in L\}$$ denote the set of neighbors of a vertex $$x \in S$$, and let $$d(x) = \#[N(x)]$$ denote the degree of $$x$$.

Suppose now that there is a conductance $$c(x, y) \gt 0$$ associated with each edge $$(x, y) \in L$$. The conductance is symmetric in the sense that $$c(x, y) = c(y, x)$$ for $$(x, y) \in L$$. We extend $$c$$ to a function on all of $$S \times S$$ by defining $$c(x, y) = 0$$ for $$(x, y) \notin L$$. As the terminology suggests, we imagine a fluid of some sort flowing through the edges of the graph, so that the conductance of an edge measures the capacity of the edge in some sense. The best interpretation is that the graph is an electrical network and the edges are resistors. In this interpretation, the conductance of a resistor is the reciprocal of the resistance.

Let $$C(x) = \sum_{y \in S} c(x, y)$$ and suppose that $$C(x) \lt \infty$$ for each $$x \in S$$. The Markov chain $$\bs{X} = (X_0, X_1, X_2, \ldots)$$ with state space $$S$$ and transition probability matrix $$P$$ given by

$P(x, y) = \frac{c(x, y)}{C(x)}, \quad (x, y) \in S^2$

is called a random walk on the graph $$G$$. This chain governs a particle moving along the vertices of $$G$$. If the particle is at vertex $$x \in S$$ at a given time, then the particle will be at a neighbor of $$x$$ at the next time; the neighbor is chosen randomly, in proportion to the conductance. In the setting of an electrical network, it is natural to interpret the particle as an electron.

Note that multiplying the conductance function $$c$$ by a positive constant has no effect on the associated random walk.
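As a concrete sketch of this construction, the following Python builds $$P$$ from a conductance function using exact rational arithmetic. The particular graph and conductance values are assumptions chosen for illustration (they are consistent with the Wheatstone bridge exercise below).

```python
from fractions import Fraction

# A minimal sketch of the construction P(x, y) = c(x, y) / C(x).
# The graph and conductance values are assumptions for illustration;
# c is stored on directed pairs and then symmetrized.
c = {('a', 'b'): 1, ('a', 'd'): 1, ('b', 'c'): 1, ('b', 'd'): 2, ('c', 'd'): 2}
c.update({(y, x): v for (x, y), v in list(c.items())})

S = sorted({x for edge in c for x in edge})
C = {x: sum(c.get((x, y), 0) for y in S) for x in S}

# transition matrix as a dict: P[x, y] = c(x, y) / C(x)
P = {(x, y): Fraction(c.get((x, y), 0), C[x]) for x in S for y in S}

# each row sums to 1, so P is a valid transition probability matrix
assert all(sum(P[x, y] for y in S) == 1 for x in S)
```

Note that scaling every conductance by the same constant rescales each $$C(x)$$ identically, so $$P$$ is unchanged, as remarked above.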

Suppose that each edge has the same conductance, so that $$c$$ is constant on the edges. Then

1. $$C(x) = c \, d(x)$$ for every $$x \in S$$.
2. The assumption that $$C$$ is finite means that each vertex has finite degree. That is, the graph $$G$$ is locally finite.
3. The transition probability matrix is given by $P(x, y) = \begin{cases} \frac{1}{d(x)}, & y \in N(x) \\ 0, & y \notin N(x) \end{cases}$

The chain in the previous exercise is the symmetric random walk on $$G$$. Thus, if the chain is in state $$x \in S$$ at a given time, then the chain is equally likely to move to any of the neighbors of $$x$$ at the next time.

Consider the random walk on the graph below with the given conductance values. The underlying graph is sometimes called the Wheatstone bridge in honor of Charles Wheatstone.

1. Explicitly give the transition probability matrix $$P$$.
2. Suppose that the chain starts at $$a$$. Find the probability density function of $$X_2$$.

For the matrix and vector below, we use the ordered state space $$S = (a, b, c, d)$$.

1. $$P = \left[ \begin{matrix} 0 & \frac{1}{2} & 0 & \frac{1}{2} \\ \frac{1}{4} & 0 & \frac{1}{4} & \frac{1}{2} \\ 0 & \frac{1}{3} & 0 & \frac{2}{3} \\ \frac{1}{5} & \frac{2}{5} & \frac{2}{5} & 0 \end{matrix} \right]$$
2. $$f_2 = \left( \frac{9}{40}, \frac{1}{5}, \frac{13}{40}, \frac{1}{4} \right)$$
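The answers can be checked with exact rational arithmetic; here is a short sketch for part 2, using the matrix from part 1 with states ordered $$(a, b, c, d)$$.

```python
from fractions import Fraction as F

# Checking part 2: the distribution of X_2 is f_0 P^2, where f_0 is
# the point mass at state a and P is the matrix from part 1.
P = [
    [F(0),    F(1, 2), F(0),    F(1, 2)],
    [F(1, 4), F(0),    F(1, 4), F(1, 2)],
    [F(0),    F(1, 3), F(0),    F(2, 3)],
    [F(1, 5), F(2, 5), F(2, 5), F(0)],
]

def step(f):
    # one step of the chain: (f P)(y) = sum_x f(x) P(x, y)
    return [sum(f[x] * P[x][y] for x in range(4)) for y in range(4)]

f2 = step(step([F(1), F(0), F(0), F(0)]))
assert f2 == [F(9, 40), F(1, 5), F(13, 40), F(1, 4)]
```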

Consider the random walk on the cube graph shown below, with the given conductance values. The vertices are bit strings of length 3, and two vertices are connected by an edge if and only if the bit strings differ by a single bit.

1. Explicitly give the transition probability matrix $$P$$.
2. Suppose that the initial distribution is the uniform distribution on $$\{000, 001, 101, 100\}$$. Find the probability density function of $$X_2$$.

For the matrix and vector below, we use the ordered state space $$S = (000, 001, 101, 100, 010, 011, 111, 110)$$.

1. $$P = \left[ \begin{matrix} 0 & \frac{1}{4} & 0 & \frac{1}{4} & \frac{1}{2} & 0 & 0 & 0 \\ \frac{1}{4} & 0 & \frac{1}{4} & 0 & 0 & \frac{1}{2} & 0 & 0 \\ 0 & \frac{1}{4} & 0 & \frac{1}{4} & 0 & 0 & \frac{1}{2} & 0 \\ \frac{1}{4} & 0 & \frac{1}{4} & 0 & 0 & 0 & 0 & \frac{1}{2} \\ \frac{1}{4} & 0 & 0 & 0 & 0 & \frac{3}{8} & 0 & \frac{3}{8} \\ 0 & \frac{1}{4} & 0 & 0 & \frac{3}{8} & 0 & \frac{3}{8} & 0 \\ 0 & 0 & \frac{1}{4} & 0 & 0 & \frac{3}{8} & 0 & \frac{3}{8} \\ 0 & 0 & 0 & \frac{1}{4} & \frac{3}{8} & 0 & \frac{3}{8} & 0 \end{matrix} \right]$$
2. $$f_2 = \left(\frac{3}{32}, \frac{3}{32}, \frac{3}{32}, \frac{3}{32}, \frac{5}{32}, \frac{5}{32}, \frac{5}{32}, \frac{5}{32}\right)$$
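The same exact-arithmetic check works for the cube walk. The conductance values used below (1 on one face, 3 on the opposite face, 2 on the four connecting edges) are inferred from the transition matrix above, since the figure is not reproduced here; treat them as an assumption.

```python
from fractions import Fraction as F

def c(x, y):
    # conductance on the cube: vertices are 3-bit integers, adjacent
    # when they differ in one bit; the values are an assumption
    if x ^ y == 2:
        return 2                    # edge joining the two faces
    if x ^ y in (1, 4):
        return 3 if x & 2 else 1    # face edges
    return 0                        # not adjacent

S = range(8)
C = {x: sum(c(x, y) for y in S) for x in S}
P = {(x, y): F(c(x, y), C[x]) for x in S for y in S}

# start uniformly on {000, 001, 101, 100} = {0, 1, 5, 4}, take 2 steps
f = {x: F(1, 4) if x in (0, 1, 5, 4) else F(0) for x in S}
for _ in range(2):
    f = {y: sum(f[x] * P[x, y] for x in S) for y in S}

# X_2 puts mass 3/32 on each vertex of the starting face, 5/32 on the rest
assert all(f[x] == (F(5, 32) if x & 2 else F(3, 32)) for x in S)
```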

Let $$\bs{X}$$ be a random walk on a graph $$G$$.

1. If $$G$$ is connected then $$\bs{X}$$ is irreducible.
2. If $$G$$ is connected and finite then $$\bs{X}$$ is irreducible and recurrent.
3. If $$G$$ is not connected then the equivalence classes of $$\bs{X}$$ are the components of $$G$$ (the maximal connected subsets of $$S$$). The finite components are recurrent.

Suppose that $$\bs{X}$$ is a random walk on a finite, connected graph $$G$$. Then $$\bs{X}$$ is either aperiodic or has period 2. Moreover, $$\bs{X}$$ has period 2 if and only if $$G$$ is bipartite. That is, the vertex set $$S$$ can be partitioned into sets $$A$$ and $$B$$ such that every edge in $$L$$ has one endpoint in $$A$$ and one endpoint in $$B$$; these sets are then the cyclic classes of the chain.
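The bipartiteness criterion is easy to test algorithmically. Here is a sketch using breadth-first 2-coloring; the adjacency-list encoding of the graph is an assumption for illustration.

```python
from collections import deque

# BFS attempts a proper 2-coloring of a connected graph; an edge inside
# a color class exposes an odd cycle, so the graph is not bipartite.
def bipartition(neighbors, start):
    color = {start: 0}
    queue = deque([start])
    while queue:
        x = queue.popleft()
        for y in neighbors[x]:
            if y not in color:
                color[y] = 1 - color[x]
                queue.append(y)
            elif color[y] == color[x]:
                return None  # odd cycle: not bipartite, walk is aperiodic
    A = {x for x, k in color.items() if k == 0}
    return A, set(color) - A

# the cube graph (vertices as 3-bit integers) is bipartite: the parts
# are the bit strings of even and odd parity, i.e. the cyclic classes
cube = {x: [x ^ (1 << i) for i in range(3)] for x in range(8)}
A, B = bipartition(cube, 0)
assert A == {0, 3, 5, 6} and B == {1, 2, 4, 7}
```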

#### Random Walks on $$\Z^k$$

Random walks on the integer lattices $$\Z^k$$, where $$k \in \N_+$$, are particularly interesting. Consider first the simple random walk $$\bs{X} = (X_0, X_1, X_2, \ldots)$$ on $$\Z$$, with transition probabilities

$P(x, x + 1) = p, \; P(x, x - 1) = 1 - p, \quad x \in \Z$

where $$p \in (0, 1)$$ is a parameter. Note that the chain is irreducible.

The chain is periodic with period 2. Moreover

$P^{2\,n}(0, 0) = \binom{2\,n}{n} p^n (1 - p)^n, \quad n \in \N$
Proof:

Note that starting in state 0, the chain can return to state 0 only at even times. The chain returns to 0 at time $$2\,n$$ if and only if there are $$n$$ steps to the right and $$n$$ steps to the left.
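The formula can be confirmed by computing the exact distribution of $$X_{2n} - X_0$$ through repeated convolution with the step distribution; the parameter values below are arbitrary illustrative choices.

```python
from fractions import Fraction as F
from math import comb

p, n = F(2, 5), 6  # arbitrary illustrative parameter values

# exact distribution of X_{2n} - X_0 by repeated convolution with the
# step distribution P(step = +1) = p, P(step = -1) = 1 - p
dist = {0: F(1)}
for _ in range(2 * n):
    new = {}
    for x, mass in dist.items():
        new[x + 1] = new.get(x + 1, F(0)) + mass * p
        new[x - 1] = new.get(x - 1, F(0)) + mass * (1 - p)
    dist = new

# agrees with binom(2n, n) p^n (1 - p)^n: n steps right, n steps left
assert dist[0] == comb(2 * n, n) * p**n * (1 - p)**n
```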

If $$p \ne \frac{1}{2}$$ then $$\bs{X}$$ is transient, and if $$p = \frac{1}{2}$$ then $$\bs{X}$$ is recurrent. Note that when $$p = \frac{1}{2}$$, $$\bs{X}$$ is the symmetric random walk on $$\Z$$.

Proof:

From the formula for $$P^{2\,n}(0, 0)$$ above and Stirling's approximation,

$P^{2\,n}(0, 0) \approx \frac{[4 \, p (1 - p)]^n}{\sqrt{\pi \, n}} \text{ as } n \to \infty$

If $$p \ne \frac{1}{2}$$ then $$G(0, 0) \lt \infty$$ and hence $$\bs{X}$$ is transient. If $$p = \frac{1}{2}$$ then $$G(0, 0) = \infty$$ and hence $$\bs{X}$$ is recurrent.
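The Stirling approximation in this proof can be sanity-checked numerically: the ratio of the exact return probability to the approximation tends to 1, with error of order $$1/n$$. The value of $$p$$ below is an arbitrary choice.

```python
from fractions import Fraction as F
from math import comb, pi, sqrt

# ratio of the exact return probability P^{2n}(0, 0) to the Stirling
# approximation [4 p (1 - p)]^n / sqrt(pi n); it approaches 1
p = F(3, 10)  # arbitrary illustrative value
for n in (10, 100, 1000):
    exact = comb(2 * n, n) * p**n * (1 - p)**n      # exact Fraction
    approx = float(4 * p * (1 - p))**n / sqrt(pi * n)
    ratio = float(exact) / approx
    assert abs(ratio - 1) < 1 / n   # Stirling error is O(1/n)
```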

More generally, consider $$\Z^k$$, where $$k \in \N_+$$. For $$i \in \{1, 2, \ldots, k\}$$, let $$\bs{u}_i \in \Z^k$$ denote the unit vector with 1 in position $$i$$ and 0 elsewhere. The simple random walk on $$\Z^k$$ has transition probabilities

$P(\bs{x},\bs{x} + \bs{u}_i) = p_i, \; P(\bs{x}, \bs{x} - \bs{u}_i) = q_i; \quad \bs{x} \in \Z^k, \; i \in \{1, 2, \ldots, k\}$

where $$p_i \gt 0$$, $$q_i \gt 0$$ for each $$i$$ and $$\sum_{i=1}^k (p_i + q_i) = 1$$. Clearly, the larger the dimension $$k$$, the less likely the chain is to be recurrent. It turns out that the behavior when $$k = 2$$ is similar to the behavior when $$k = 1$$: the chain is recurrent only in the symmetric case $$p_i = q_i = \frac{1}{4}$$ for each $$i$$. When the dimension $$k$$ is 3 or more, the chain is transient for all values of the parameters, even in the symmetric case. Thus, for the simple, symmetric random walk on the integer lattice $$\Z^k$$, we have the following interesting dimensional phase shift: the chain is recurrent in dimensions 1 and 2 and transient in dimensions 3 or more. The proofs are similar to the proof sketched above for dimension 1, but the details are considerably more complex. A return to $$0$$ can occur only at even times and in the symmetric case,

$P^{2\,n}(\bs{0}, \bs{0}) \approx \frac{C_k}{n^{k/2}} \text{ as } n \to \infty$

where $$C_k$$ is a positive constant that depends on the dimension $$k$$. Thus $$G(\bs{0}, \bs{0}) = \infty$$ and the chain is recurrent if $$k \in \{1, 2\}$$ while $$G(\bs{0}, \bs{0}) \lt \infty$$ and the chain is transient if $$k \in \{3, 4, \ldots\}$$.

#### Positive Recurrence and Limiting Distributions

We return now to the general case of a random walk $$\bs{X}$$ on a graph $$G$$. Assume that $$G$$ is connected so that $$\bs{X}$$ is irreducible.

The function $$C$$ is left-invariant for $$P$$. The random walk is positive recurrent if and only if

$K = \sum_{x \in S} C(x) = \sum_{(x, y) \in S^2} c(x, y) \lt \infty$

in which case the invariant probability density function $$f$$ is given by $$f(x) = C(x) / K$$ for $$x \in S$$.
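The invariance is easy to see directly: $$(C P)(y) = \sum_x C(x) \, c(x, y)/C(x) = \sum_x c(x, y) = C(y)$$ by the symmetry of $$c$$. A quick check on the Wheatstone bridge chain; the conductance totals $$C = (2, 4, 3, 5)$$ used here are inferred from its transition matrix and are an assumption.

```python
from fractions import Fraction as F

# Verifying left-invariance of C for the Wheatstone bridge chain:
# C P = C, where C lists the (assumed) conductance totals per vertex.
P = [
    [F(0),    F(1, 2), F(0),    F(1, 2)],
    [F(1, 4), F(0),    F(1, 4), F(1, 2)],
    [F(0),    F(1, 3), F(0),    F(2, 3)],
    [F(1, 5), F(2, 5), F(2, 5), F(0)],
]
C = [F(2), F(4), F(3), F(5)]

CP = [sum(C[x] * P[x][y] for x in range(4)) for y in range(4)]
assert CP == C

# normalizing C gives the invariant probability density function
K = sum(C)
f = [Cx / K for Cx in C]
assert f == [F(1, 7), F(2, 7), F(3, 14), F(5, 14)]
```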

Consider the special case of the simple random walk on $$G$$, so that the conductance function $$c$$ is constant on the set of edges $$L$$. The random walk is positive recurrent if and only if the set of vertices $$S$$ is finite, in which case the invariant probability density function $$f$$ is given by $$f(x) = d(x) / 2\,m$$ for $$x \in S$$ where $$d$$ is the degree function and where $$m$$ is the number of undirected edges.

Recall, for example, the simple random walk on $$\Z^k$$. We showed that the walk is recurrent if the dimension $$k$$ is 1 or 2, and so we now know that the walk is null recurrent in these cases. The walk is transient if $$k \ge 3$$.

Consider the random walk on the Wheatstone bridge given in Exercise 2.

1. Show that the chain is aperiodic.
2. Find the invariant probability density function.
3. Find the mean return time to each state.
4. Find $$\lim_{n \to \infty} P^n$$.

For the matrix and vectors below, we use the ordered state space $$(a, b, c, d)$$.

1. The chain is aperiodic since the graph is not bipartite.
2. $$f = \left(\frac{1}{7}, \frac{2}{7}, \frac{3}{14}, \frac{5}{14} \right)$$
3. $$\mu = \left(7, \frac{7}{2}, \frac{14}{3}, \frac{14}{5} \right)$$
4. $$P^n \to \left[ \begin{matrix} \frac{1}{7} & \frac{2}{7} & \frac{3}{14} & \frac{5}{14} \\ \frac{1}{7} & \frac{2}{7} & \frac{3}{14} & \frac{5}{14} \\ \frac{1}{7} & \frac{2}{7} & \frac{3}{14} & \frac{5}{14} \\ \frac{1}{7} & \frac{2}{7} & \frac{3}{14} & \frac{5}{14} \\ \end{matrix} \right]$$ as $$n \to \infty$$

Consider the random walk on the cube graph given in Exercise 3.

1. Show that the chain has period 2 and find the cyclic classes.
2. Find the invariant probability density function.
3. Find the mean return time to each state.
4. Find $$\lim_{n \to \infty} P^{2\,n}$$.
5. Find $$\lim_{n \to \infty} P^{2\,n + 1}$$.

For the matrix and vector below, we use the ordered state space $$S = (000, 001, 101, 100, 010, 011, 111, 110)$$.

1. The chain has period 2 since the graph is bipartite. The cyclic classes are $$\{000, 011, 110, 101\}$$ (bit strings with an even number of 1's) and $$\{010, 001, 100, 111\}$$ (bit strings with an odd number of 1's).
2. $$f = \left(\frac{1}{12}, \frac{1}{12}, \frac{1}{12}, \frac{1}{12}, \frac{1}{6}, \frac{1}{6}, \frac{1}{6}, \frac{1}{6}\right)$$
3. $$\mu = (12, 12, 12, 12, 6, 6, 6, 6)$$
4. $$P^{2\,n} \to \left[ \begin{matrix} \frac{1}{6} & 0 & \frac{1}{6} & 0 & 0 & \frac{1}{3} & 0 & \frac{1}{3} \\ 0 & \frac{1}{6} & 0 & \frac{1}{6} & \frac{1}{3} & 0 & \frac{1}{3} & 0 \\ \frac{1}{6} & 0 & \frac{1}{6} & 0 & 0 & \frac{1}{3} & 0 & \frac{1}{3} \\ 0 & \frac{1}{6} & 0 & \frac{1}{6} & \frac{1}{3} & 0 & \frac{1}{3} & 0 \\ 0 & \frac{1}{6} & 0 & \frac{1}{6} & \frac{1}{3} & 0 & \frac{1}{3} & 0 \\ \frac{1}{6} & 0 & \frac{1}{6} & 0 & 0 & \frac{1}{3} & 0 & \frac{1}{3} \\ 0 & \frac{1}{6} & 0 & \frac{1}{6} & \frac{1}{3} & 0 & \frac{1}{3} & 0 \\ \frac{1}{6} & 0 & \frac{1}{6} & 0 & 0 & \frac{1}{3} & 0 & \frac{1}{3} \end{matrix} \right]$$ as $$n \to \infty$$
5. $$P^{2\,n + 1} \to \left[ \begin{matrix} 0 & \frac{1}{6} & 0 & \frac{1}{6} & \frac{1}{3} & 0 & \frac{1}{3} & 0 \\ \frac{1}{6} & 0 & \frac{1}{6} & 0 & 0 & \frac{1}{3} & 0 & \frac{1}{3} \\ 0 & \frac{1}{6} & 0 & \frac{1}{6} & \frac{1}{3} & 0 & \frac{1}{3} & 0 \\ \frac{1}{6} & 0 & \frac{1}{6} & 0 & 0 & \frac{1}{3} & 0 & \frac{1}{3} \\ \frac{1}{6} & 0 & \frac{1}{6} & 0 & 0 & \frac{1}{3} & 0 & \frac{1}{3} \\ 0 & \frac{1}{6} & 0 & \frac{1}{6} & \frac{1}{3} & 0 & \frac{1}{3} & 0 \\ \frac{1}{6} & 0 & \frac{1}{6} & 0 & 0 & \frac{1}{3} & 0 & \frac{1}{3} \\ 0 & \frac{1}{6} & 0 & \frac{1}{6} & \frac{1}{3} & 0 & \frac{1}{3} & 0 \end{matrix} \right]$$ as $$n \to \infty$$

#### Reversibility

Essentially, all reversible Markov chains can be interpreted as random walks on graphs. This fact is one of the reasons for studying such walks.

A positive recurrent random walk $$\bs{X}$$ on a graph $$G$$ is reversible.

Conversely, suppose that $$\bs{X}$$ is a reversible Markov chain on $$S$$ with transition probability matrix $$P$$ and invariant probability density function $$f$$. Suppose also that $$P(x, x) = 0$$ for every $$x \in S$$. Then $$\bs{X}$$ can be interpreted as the random walk on the state graph $$G$$ of $$\bs{X}$$ with conductance function $$c$$ given by

$c(x, y) = f(x) P(x, y), \quad (x, y) \in S^2$

Recall that the Ehrenfest chain is reversible. Interpreting the chain as a random walk on a graph, sketch the graph and find a conductance function.

A conductance function $$c$$ is given by $$c(x, x - 1) = \binom{m - 1}{x - 1}$$ for $$x \in \{1, \ldots, m\}$$ and $$c(x, x + 1) = \binom{m - 1}{x}$$ for $$x \in \{0, 1, \ldots, m - 1\}$$.
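This can be verified directly: by Pascal's rule $$C(x) = \binom{m-1}{x} + \binom{m-1}{x-1} = \binom{m}{x}$$, so the induced walk has $$P(x, x+1) = (m-x)/m$$ and $$P(x, x-1) = x/m$$, which are the standard Ehrenfest transition probabilities. A sketch for $$m = 5$$ (an arbitrary choice):

```python
from fractions import Fraction as F
from math import comb

m = 5  # arbitrary illustrative number of balls

def c(x, y):
    # claimed conductances: c(x, x + 1) = binom(m - 1, x), symmetric
    if y == x + 1 and 0 <= x < m:
        return comb(m - 1, x)
    if y == x - 1 and 0 < x <= m:
        return comb(m - 1, x - 1)
    return 0

for x in range(m + 1):
    Cx = sum(c(x, y) for y in range(m + 1))
    assert Cx == comb(m, x)             # Pascal's rule
    # the induced random walk matches the Ehrenfest chain
    assert F(c(x, x + 1), Cx) == F(m - x, m)
    assert F(c(x, x - 1), Cx) == F(x, m)
```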