Grocery on Distributed Algorithms

T1: Discrepancy (2)

Reminder: This post contains 1243 words · 4 min read · by Xianbin

In discrepancy (1), we introduced the basics of discrepancy. In this post, we continue with techniques for bounding it.

The BNS Method

\(\textbf{Lemma}[BNS]\). Let \(f: X\times Y \to \mathbb{Z}_2\) and let \(M_f(x,y) = (-1)^{f(x,y)}\) be its \(\pm 1\) sign matrix. Then for any rectangle \(A\times B\),

\[\text{disc}(f, A\times B)^2 \leq \mathbb{E}_{y_1,y_2}\big[\big\lvert\mathbb{E}_x[M_f(x,y_1)M_f(x,y_2)]\big\rvert\big]\]

\(\textbf{Proof}\). By the definition,

\[\text{disc}(f, A\times B) = \Big\lvert\sum_{x\in A, y\in B}M_f(x,y)\Big\rvert/2^{2n}\\ =\big \lvert \mathbb{E}_{x,y}\textbf{1}_A(x) \textbf{1}_B(y) M_f(x,y) \big\rvert\]

Now we apply the Cauchy-Schwarz inequality in the form \(\mathbb{E}[Z]^2 \leq \mathbb{E}[Z^2]\).

Then,

\[\text{disc}(f, A\times B)^2 = \left(\sum_{x\in A, y\in B}M_f(x,y)/2^{2n}\right)^2\\ =\big \lvert \mathbb{E}_{x,y}\textbf{1}_A(x) \textbf{1}_B(y) M_f(x,y) \big\rvert^2 \\ =\big \lvert \mathbb{E}_{x}\textbf{1}_A(x) \mathbb{E}_y \textbf{1}_B(y) M_f(x,y) \big\rvert^2 \\ \leq \mathbb{E}_x[\left (\textbf{1}_A(x)\mathbb{E}_y \textbf{1}_B(y) M_f(x,y)\right)^2]\\ \leq \mathbb{E}_x[\left (\mathbb{E}_y \textbf{1}_B(y) M_f(x,y)\right)^2]\\ =\mathbb{E}_x \left( \mathbb{E}_{y_1,y_2}\textbf{1}_B(y_1)\textbf{1}_B(y_2)M_f(x,y_1)M_f(x,y_2)\right)\\ =\mathbb{E}_{y_1,y_2}\textbf{1}_B(y_1)\textbf{1}_B(y_2)\left(\mathbb{E}_x M_f(x,y_1)M_f(x,y_2) \right)\\ \leq \mathbb{E}_{y_1,y_2}\left\lvert \mathbb{E}_x M_f(x,y_1)M_f(x,y_2) \right\rvert\]

Here the first inequality is Cauchy-Schwarz, the second drops \(\textbf{1}_A(x)^2 \leq 1\), and the last uses \(\textbf{1}_B(y_1)\textbf{1}_B(y_2) \leq 1\).
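To see the lemma in action, here is a small brute-force sketch (the choice of \(f\) as inner product mod 2 on \(n=2\) bits, and all variable names, are illustrative assumptions, not from the post): it compares the maximum discrepancy over all rectangles \(A \times B\) against the BNS bound.

```python
from itertools import combinations, product

n = 2
xs = list(product([0, 1], repeat=n))  # X = Y = {0,1}^n
N = 2 ** n                            # |X| = |Y| = 2^n

def f(x, y):
    # example function: inner product mod 2
    return sum(a * b for a, b in zip(x, y)) % 2

def M(x, y):
    # the +-1 sign matrix M_f(x,y) = (-1)^{f(x,y)}
    return (-1) ** f(x, y)

# BNS bound: E_{y1,y2} | E_x M(x,y1) M(x,y2) |
bns = sum(
    abs(sum(M(x, y1) * M(x, y2) for x in xs)) / N
    for y1 in xs for y2 in xs
) / (N * N)

def subsets(s):
    # all subsets of s (brute force; fine for N = 4)
    return [c for r in range(len(s) + 1) for c in combinations(s, r)]

# max over rectangles A x B of |sum_{x in A, y in B} M(x,y)| / 2^{2n}
max_disc = max(
    abs(sum(M(x, y) for x in A for y in B)) / (N * N)
    for A in subsets(xs) for B in subsets(xs)
)

print(f"max disc^2 = {max_disc ** 2:.4f}, BNS bound = {bns:.4f}")
```

For the inner product function the rows of \(M_f\) are orthogonal, so \(\mathbb{E}_x M_f(x,y_1)M_f(x,y_2)\) vanishes unless \(y_1 = y_2\), and the BNS bound evaluates to \(2^{-n} = 1/4\), comfortably above \(\text{disc}^2\).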