Toolbox

C1: Random Walks on Expander Graphs (1)

by Xianbin

Basics before Starting

Basic skills are of great importance in the theory of computer science (as many people will tell you).

Given a matrix, how do we select a row using matrix language?

Example 1

Given
$$M = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix}$$

Suppose we want to select the second row. Let $\mathbb{1}_s = (0,1,0)$ be the indicator vector.

Then,
$$(0,1,0) \cdot \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix} = (4, 5, 6)$$

We successfully did it!

Similarly, suppose we want to select the second column of

$$M = \begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix}$$

Let $\mathbb{1}_t = (0,1,0)^T$.

Then,
$$\begin{pmatrix} 1 & 2 & 3 \\ 4 & 5 & 6 \\ 7 & 8 & 9 \end{pmatrix} \cdot \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} = \begin{pmatrix} 2 \\ 5 \\ 8 \end{pmatrix}$$

In summary, to select a row we use

$$\mathbb{1}_s \cdot M,$$

and to select a column we use

$$M \cdot \mathbb{1}_t.$$
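As a quick sanity check, here is a minimal NumPy sketch of both selections (the matrix is the one from Example 1; the variable names are my own):

```python
import numpy as np

M = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

# Indicator (row) vector selecting the second row.
one_s = np.array([0, 1, 0])
print(one_s @ M)   # [4 5 6]

# Indicator (column) vector selecting the second column.
one_t = np.array([0, 1, 0])
print(M @ one_t)   # [2 5 8]
```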

Example 2

Now, consider $M$ to be an adjacency matrix (a 0-1 matrix).

By Example 1, we can easily see how to select a set $S$ from $M$.

Let $\mathbb{1}_s$ be the indicator vector of $S$, e.g., $\mathbb{1}_s = (0,1,\cdots,1)$.

But what does it mean?

Consider $\mathbb{1}_s \times M[:,1]$, the product of $\mathbb{1}_s$ with the first column of $M$:

if $v_1$ has an edge incident to any node in $S$, then this product is nonzero; precisely, it equals the number of edges between $S$ and $v_1$. So the full product $\mathbb{1}_s \cdot M$ is the vector whose entries are the numbers of edges from $S$ to $v_1, v_2, \cdots, v_n$.
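To make this concrete, here is a small sketch under an assumed toy graph: a path $v_1 - v_2 - v_3 - v_4$ with $S = \{v_2, v_4\}$ (the graph and names are my own choice, not from the post):

```python
import numpy as np

# Adjacency matrix of the path v1 - v2 - v3 - v4.
M = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])

one_s = np.array([0, 1, 0, 1])  # indicator of S = {v2, v4}

# Entry i counts the edges between v_i and S.
print(one_s @ M)  # [1 0 2 0]: v1-v2; then v3-v2 and v3-v4
```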

Example 3

Let $E(S, T)$ denote the number of edges from $S$ to $T$. How do we express this in matrix language?

To solve this problem, we can divide it into two steps.

  1. Find the edges from $V$ to $S$.
  2. Choose $T$ from $V$.

We use $\mathbb{1}_s \cdot M$ for step 1. Then we multiply by $\mathbb{1}_t^T$ to select $T$ from $V$. In total, we use $E(S, T) = \mathbb{1}_s \cdot M \cdot \mathbb{1}_t^T$, as the sketch below shows.
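Continuing the same assumed toy graph, $E(S, T)$ becomes a single bilinear form:

```python
import numpy as np

# Adjacency matrix of the path v1 - v2 - v3 - v4.
M = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])

one_s = np.array([0, 1, 0, 1])  # S = {v2, v4}
one_t = np.array([1, 0, 1, 0])  # T = {v1, v3}

# Number of edges from S to T: indicator * M * indicator^T.
print(one_s @ M @ one_t)  # 3 (edges v2-v1, v2-v3, v4-v3)
```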

Expander Mixing Lemma

$\textbf{Lemma}$. Let $G$ be a $d$-regular graph with $n$ vertices and let $\lambda = \lambda(G) = \max(\lvert \lambda_2 \rvert, \lvert \lambda_n \rvert)$. Then, for any $A, B \subseteq V$, we have

$$\left\lvert E(A, B) - \frac{d \lvert A \rvert \lvert B \rvert}{n} \right\rvert \leq \lambda \sqrt{\lvert A \rvert \lvert B \rvert}$$

What does the expander mixing lemma mean?

$\frac{d \lvert A \rvert \lvert B \rvert}{n}$ is the expected number of edges between $A$ and $B$ in a random graph.

Therefore, the expander mixing lemma says that the number of edges between $A$ and $B$ in the $d$-regular graph is close to that in a random graph!
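As a toy numeric check of the lemma (my own example, using the complete graph $K_8$, which is $7$-regular with $\lambda_2 = \cdots = \lambda_n = -1$, hence $\lambda = 1$):

```python
import numpy as np

n, d = 8, 7
M = np.ones((n, n)) - np.eye(n)          # adjacency matrix of K_8

eigs = np.sort(np.linalg.eigvalsh(M))[::-1]
lam = max(abs(eigs[1]), abs(eigs[-1]))   # lambda(G) = max(|lambda_2|, |lambda_n|)

A = np.array([1, 1, 1, 0, 0, 0, 0, 0])   # |A| = 3
B = np.array([0, 0, 1, 1, 1, 1, 0, 0])   # |B| = 4

e_ab = A @ M @ B                          # E(A, B)
gap = abs(e_ab - d * A.sum() * B.sum() / n)
print(e_ab, gap, lam * np.sqrt(A.sum() * B.sum()))  # 11.0  0.5  3.46...
```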

$\textbf{Proof}$. The first thing we need to do is express $E(A, B)$ in matrix language:

$$E(A, B) = \mathbb{1}_a M \mathbb{1}_b^T$$

Since $M$ is an adjacency matrix, it is symmetric, which means that

$M = Q^T \Lambda Q$, where $Q$ is an orthogonal matrix and $\Lambda$ is the diagonal matrix of eigenvalues $\lambda_1 \geq \lambda_2 \geq \cdots \geq \lambda_n$.

Then,

$$\Lambda = Q M Q^T$$

We expand the indicator vectors (also called characteristic vectors) in the eigenvector basis as follows:

$$\mathbb{1}_a = \sum_i \alpha_i q_i, \qquad \mathbb{1}_b = \sum_j \beta_j q_j,$$

where the $q_i$ are the orthonormal eigenvectors of $M$ (the rows of $Q$).

The reason we expand $\mathbb{1}_a$ in this basis rather than in the standard basis $(0, \cdots, 0, 1, 0, \cdots, 0)$ is that the eigenvector basis is orthonormal and diagonalizes $M$, which is exactly what the next step uses.

Therefore,
$$E(A, B) = \left( \sum_i \alpha_i q_i \right) M \left( \sum_j \beta_j q_j^T \right).$$

Notice that

$$q_i M q_j^T = \lambda_j \, q_i q_j^T = \begin{cases} \lambda_i, & i = j, \\ 0, & i \neq j, \end{cases}$$

since $M q_j^T = \lambda_j q_j^T$ and the $q_i$ are orthonormal.

Then,

$$E(A, B) = \sum_i \lambda_i \alpha_i \beta_i$$
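This identity can be sanity-checked numerically (a minimal sketch; the graph and the sets $A$, $B$ are my own toy choices):

```python
import numpy as np

# Cycle C_6; A = {v1, v2}, B = {v2, v3, v4}.
n = 6
M = np.zeros((n, n))
for i in range(n):
    M[i, (i + 1) % n] = M[i, (i - 1) % n] = 1

one_a = np.array([1, 1, 0, 0, 0, 0.])
one_b = np.array([0, 1, 1, 1, 0, 0.])

lam, Q = np.linalg.eigh(M)        # columns of Q are orthonormal eigenvectors
alpha = Q.T @ one_a               # alpha_i = <1_a, q_i>
beta = Q.T @ one_b                # beta_i  = <1_b, q_i>
print(one_a @ M @ one_b)          # E(A, B) directly
print(np.sum(lam * alpha * beta)) # same value via the spectral expansion
```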

What are $\lambda_1$, $\alpha_1$, and $\beta_1$?

Computing the eigenvalues and eigenvectors of a $d$-regular graph, we obtain

$$\lambda_1 = d, \qquad q_1 = (1/\sqrt{n}, \cdots, 1/\sqrt{n})$$

This is not entirely obvious, so let us prove it.

Formally, we need to prove the following lemma.

$\textbf{Lemma}$. Given a $d$-regular graph with adjacency matrix $A$, the eigenvalues satisfy two properties:

  1. $\lambda_1 = d$ is an eigenvalue, with eigenvector $u_1 = (1, 1, \cdots, 1)$;
  2. every eigenvalue satisfies $\lvert \lambda_i \rvert \leq d$.

$\textbf{Proof}$.

The first property can be checked from the definition: with $u_1 = (1, 1, \cdots, 1)$, each row of $A$ sums to $d$, so $A u_1 = d u_1$.

For the second property, let $u$ be any eigenvector with eigenvalue $\lambda$, and let $u_x$ be the entry of $u$ with the largest absolute value.

$$A u = \lambda u \implies \sum_j a_{x,j} u_j = \lambda u_x$$

Then, we have

$$\lvert \lambda \rvert \lvert u_x \rvert = \left\lvert \sum_j a_{x,j} u_j \right\rvert \leq \sum_j a_{x,j} \lvert u_j \rvert \leq \sum_j a_{x,j} \lvert u_x \rvert = d \lvert u_x \rvert$$

So $\lvert \lambda \rvert \leq d$ for every eigenvalue, as claimed.
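Both properties are easy to check numerically; a minimal sketch on an assumed example, the cycle $C_6$ (which is $2$-regular):

```python
import numpy as np

# Adjacency matrix of the cycle C_6 (2-regular).
n, d = 6, 2
M = np.zeros((n, n))
for i in range(n):
    M[i, (i + 1) % n] = M[i, (i - 1) % n] = 1

eigs = np.linalg.eigvalsh(M)
print(np.isclose(eigs.max(), d))         # True: lambda_1 = d
print(np.all(np.abs(eigs) <= d + 1e-9))  # True: |lambda_i| <= d
```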

We get $\alpha_1 = \langle \mathbb{1}_a, q_1 \rangle = \frac{\lvert A \rvert}{\sqrt{n}}$.

Similarly, we get $\beta_1 = \langle \mathbb{1}_b, q_1 \rangle = \frac{\lvert B \rvert}{\sqrt{n}}$.

Since $\lambda_1 = d$, $\alpha_1 = \frac{\lvert A \rvert}{\sqrt{n}}$, and $\beta_1 = \frac{\lvert B \rvert}{\sqrt{n}}$, we have

$$\lambda_1 \alpha_1 \beta_1 = \frac{d \lvert A \rvert \lvert B \rvert}{n}.$$

In the following, we only need to bound

$$\left\lvert \sum_{i=2}^n \lambda_i \alpha_i \beta_i \right\rvert.$$

By the Cauchy–Schwarz inequality, which states that

$$\lvert \langle u, v \rangle \rvert \leq \lVert u \rVert \lVert v \rVert,$$

we have

$$\left\lvert \sum_{i=2}^n \lambda_i \alpha_i \beta_i \right\rvert \leq \lambda \sum_{i=2}^n \lvert \alpha_i \beta_i \rvert \leq \lambda \sqrt{\sum_i \alpha_i^2} \sqrt{\sum_i \beta_i^2} = \lambda \lVert \mathbb{1}_a \rVert \lVert \mathbb{1}_b \rVert = \lambda \sqrt{\lvert A \rvert \lvert B \rvert},$$

where the last equality uses $\lVert \mathbb{1}_a \rVert^2 = \lvert A \rvert$ and $\lVert \mathbb{1}_b \rVert^2 = \lvert B \rvert$.

This completes the proof. $\blacksquare$

Reference

[1] Hoory, S., Linial, N., and Wigderson, A., 2006. Expander graphs and their applications. Bulletin of the American Mathematical Society, 43(4), pp. 439-561.