# C1: A little advice can be very helpful!

This post is based on [1].

# Two-party communication model with advice

In the traditional two-party communication model, Alice (Bob) does not know what Bob (Alice) holds.

An interesting question is **what if Alice or Bob knows something**. Now, we introduce a new model initially invented in [2].

## Definition

We define the asymmetric communication problem \(\textup{SEL}^{k\times 1}_f: \{1,\ldots, k\} \times X^k \times Y \to \{0,1\}\), where \(\textup{SEL}^{k\times 1}_f (i, x_1,\ldots, x_k, y) = f(x_i,y)\).
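To make the definition concrete, here is a tiny Python sketch of the selector (the function names are illustrative, not from [1]):

```python
# A tiny illustration of SEL^{k x 1}_f: the selector i picks which x_i
# is fed to the base function f together with y (i is 1-indexed).

def sel(f, i, xs, y):
    """SEL^{k x 1}_f(i, x_1, ..., x_k, y) = f(x_i, y)."""
    return f(xs[i - 1], y)

def eq(x, y):
    """The EQ function used as an example base function."""
    return int(x == y)

xs = ["0101", "1100", "0101"]
print(sel(eq, 2, xs, "0101"))  # f(x_2, y) = EQ("1100", "0101") = 0
print(sel(eq, 3, xs, "0101"))  # f(x_3, y) = EQ("0101", "0101") = 1
```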

In the \(A \to B\ (B \leftrightarrow C)\) (ABC) model, there are two communicating players, Bob and Charlie, plus Alice, who gives some advice to Bob. Charlie knows \(\{x_1,\ldots,x_k\}\) and \(i\), Bob knows \(y\) and \(i\), and Alice knows all of \(\{x_1,\ldots,x_k\}\) and \(y\) (but not \(i\)).

There is a conjecture from [2]. Let \(f: \{0,1\}^n \times \{0,1\}^n \to \{0,1\}\) be any function, and consider \(\textup{SEL}^{k\times 1}_f\) in the above-mentioned model. If Alice sends \(o(k)\) bits, then the cost of communication between Bob and Charlie is \(\Omega(c)\), where \(c\) is the two-party communication cost of \(f\).

The above conjecture says that if Alice gives only \(o(k)\) bits of advice, it does not help solve the two-party communication problem!

Unfortunately (or fortunately), the above conjecture is not true. Just consider \(f\) to be the EQ function: \(\textup{EQ}(x,y) = 1\) if and only if \(x = y\). Alice sends the minimum \(i \in \{1, \ldots, k\}\) with \(y = x_i\); if there is no such \(x_i\), Alice sends 0. Then, by binary search, Bob and Charlie can find the answer. It means that about \(O(\log n)\) bits are enough.
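A minimal simulation of this counterexample (a sketch: instead of the binary search mentioned above, Bob simply forwards Alice's index to Charlie, which also costs only \(O(\log k)\) bits of Bob-Charlie communication):

```python
# Simulating the EQ counterexample. Alice's advice is the least index j
# with x_j = y (or 0 if none), costing about log(k+1) = o(k) bits.

def alice_advice(xs, y):
    """Least 1-based index j with x_j = y, or 0 if no such x_j exists."""
    for j, x in enumerate(xs, start=1):
        if x == y:
            return j
    return 0

def bob_charlie(i, xs, advice):
    """Bob forwards the advice j; Charlie answers EQ(x_i, y)."""
    if advice == 0:
        return 0  # no x_j equals y, so in particular x_i != y
    # x_j = y, hence x_i = y if and only if x_i = x_j (Charlie checks this)
    return int(xs[i - 1] == xs[advice - 1])

xs = ["1010", "0111", "1010"]
y = "1010"
adv = alice_advice(xs, y)       # adv = 1
print(bob_charlie(3, xs, adv))  # EQ(x_3, y) = 1
print(bob_charlie(2, xs, adv))  # EQ(x_2, y) = 0
```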

## Upper Bound: Set-Disjointness under ABC model

We define \(\textup{DISJ}(x,y) = \neg \bigvee_{i\in[n]} (x_i \land y_i)\). In the ABC model, \(x_1,\ldots,x_k, y \subseteq [n]\).

\(\textbf{Theorem 1}\) [1]. There is a deterministic protocol for \(\textup{SEL}^{k\times 1}_{\textup{DISJ}}\) in the ABC model in which Alice sends at most \(\sqrt{n}\log k\) bits, Bob sends at most \(\sqrt{n}\log k + 1\) bits, and Charlie sends at most \(\sqrt{n}\log n\) bits.

That is to say, \(\tilde O(\sqrt{n})\) is enough for solving set-disjointness in the ABC model.

\(\textbf{Proof}\) (sketch). The idea is simple. Alice repeatedly picks a set \(S\) from \(\{x_i\}\) such that

- \(\color{blue}S \text{ is disjoint from } y\)
- \(S\) contains at least \(\sqrt{n}\) elements that are not in the union of the sets Alice picked before.

Then Alice sends the set of **indices** of these sets to Bob, and Bob forwards it to Charlie. Charlie computes the union of these sets and removes its elements from \(x_i\). If the remaining set has size at least \(\sqrt{n}\), then \(x_i\) and \(y\) are not disjoint (if they were disjoint, Alice would have picked \(x_i\) as well, so fewer than \(\sqrt{n}\) elements would remain after the removal). Otherwise, the remaining set is small, and we can trivially solve the problem by sending it.
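The proof sketch above can be simulated directly with Python sets (the communication costs are only tracked in the comments, and the function names are mine, not from [1]):

```python
import math

def abc_disjointness(i, xs, y, n):
    """Decide DISJ(x_i, y) for subsets of [n] in the ABC model (i is 1-indexed)."""
    t = math.isqrt(n)  # the threshold sqrt(n)

    # Alice: greedily pick indices of sets that are disjoint from y and
    # contribute at least sqrt(n) new elements to the running union U.
    # At most sqrt(n) indices qualify, so this costs ~ sqrt(n) log k bits.
    picked, U = set(), set()
    changed = True
    while changed:
        changed = False
        for j, x in enumerate(xs):
            if j not in picked and x.isdisjoint(y) and len(x - U) >= t:
                picked.add(j)
                U |= x
                changed = True

    # Bob forwards the indices to Charlie; Charlie forms U and removes it from x_i.
    rest = xs[i - 1] - U
    if len(rest) >= t:
        # If x_i were disjoint from y, Alice would have picked it too,
        # leaving fewer than sqrt(n) elements; so x_i must intersect y.
        return 0
    # Otherwise x_i ∩ y ⊆ rest (U is disjoint from y), and Charlie can send
    # the < sqrt(n) elements of rest (~ sqrt(n) log n bits) for Bob to check.
    return int(y.isdisjoint(rest))

y = {1, 2, 3}
xs = [{4, 5, 6, 7}, {1, 9}, {8, 10, 11, 12}]
print(abc_disjointness(1, xs, y, 16))  # x_1 and y are disjoint -> 1
print(abc_disjointness(2, xs, y, 16))  # x_2 and y share element 1 -> 0
```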

## Lower Bound via Strong Direct Product Theorems

The above section shows a nice upper bound for DISJ in the ABC model. Now, let’s see the lower bound (near tight in some way).

\(\textbf{Theorem 2}\). There exist constants \(\alpha > 0\) and \(\beta \in (0,1)\) such that in any deterministic protocol for \(\textup{SEL}^{\sqrt{n}\times 1}_{\textup{DISJ}}\) in the ABC model, if Alice sends at most \(\alpha \sqrt{n}\) bits of information, then Bob and Charlie communicate at least \(\beta\sqrt{n}\) bits.

To prove Theorem 2, we need the **strong direct product theorems**. I will make another post showing the details of these kinds of theorems. The motivation for this kind of theorem is illustrated by the following example.

When you have \(k\) **independent** tasks, one way is to do them sequentially. But doing this all the time feels mundane, and you start to wonder: what if...? Maybe you want to prove that even the most intelligent person cannot do this job better than you. But do you know how to prove it? Is there any way other than the trivial one?

> The shortest way to do many things is to do only one thing at a time. (Samuel Smiles)

### Terminology

For any Boolean function \(f: \{0,1\}^n \times \{0,1\}^n \to \{0,1\}\), let \(f^{(k)}: \{0,1\}^{nk} \times \{0,1\}^{nk} \to \{0,1\}^k\) denote the function such that for all \(x_1,\ldots,x_k,y_1,\ldots,y_k\in\{0,1\}^n\), we have \(f^{(k)}(x_1,\ldots,x_k,y_1,\ldots,y_k) = (f(x_1,y_1),\ldots,f(x_k,y_k))\).

\(\textbf{Theorem 3}\) [3]. There exist constants \(\alpha > 0\) and \(\beta \in (0,1)\) such that for all \(k\geq 1\), every randomized protocol that computes \(\textup{DISJ}^{(k)}: \{0,1\}^{nk} \times \{0,1\}^{nk} \to \{0,1\}^k\) using at most \(\beta kn\) bits of communication has error probability larger than \(1-\frac{1}{2^{\alpha k}}\).

Let us reformulate Theorem 3 with instance size \(\sqrt{n}\). There exist constants \(\alpha > 0\) and \(\beta \in (0,1)\) such that for all \(k\geq 1\), every randomized protocol that computes \(\textup{DISJ}^{(k)}: \{0,1\}^{\sqrt{n}k} \times \{0,1\}^{\sqrt{n}k} \to \{0,1\}^k\) using at most \(\beta k\sqrt{n}\) bits of communication has error probability larger than \(1-\frac{1}{2^{\alpha k}}\) (i.e., the success probability is less than \(\frac{1}{2^{\alpha k}}\)).

**Proof by contradiction is an art.** It can help us think deeply and logically. Before showing the proof of Theorem 2, let us take a few minutes to think about how to derive a contradiction from the assumption. Suppose, for contradiction, that Alice sends \(\alpha k\) bits of advice while Bob and Charlie use only \(c < \beta \sqrt{n}\) bits. To show this is impossible, we will construct a **hard distribution** on which no protocol can solve DISJ; naturally, we will use the strong direct product theorem.

1) direct product:

\[\begin{pmatrix} x'_1 & y'_1\\ \vdots & \vdots\\ x'_k & y'_k \end{pmatrix}\]

2) \(\textup{SEL}^{k\times 1}_{\textup{DISJ}}\):

\[\begin{pmatrix} x_1 & y\\ \vdots & \vdots\\ x_k & y \end{pmatrix}\]

We will use a direct product instance 1) to produce an instance of 2) \(\textup{SEL}^{k\times 1}_{\textup{DISJ}}\).

The relationship between \(x'_i\) and \(x_i\) is simple: we can plug each \(x'_i\) into the corresponding \(x_i\). For \(y\), since there is only one of it, we plug all the \(\{y'_i\}\) into \(y\), i.e., \(y = y'_1 y'_2 \cdots y'_k\). At this moment, there is a problem: the length of \(y\) should be \(n\), but it has become \(kn\). To solve it, we make each \(y'_i\) (as well as each \(x'_i\)) have size \(\sqrt{n}\) and set \(k = \sqrt{n}\). Since we changed the length of \(x'_i\), we fill the empty positions in \(x_i\) with \(0\):

\[\begin{pmatrix} x'_1 & 0 & 0 & \ldots\\ 0 & x'_2 & 0 & \ldots\\ \vdots & \vdots & \vdots & \vdots \\ 0 & 0 & 0 & x'_k \end{pmatrix} \begin{pmatrix} y'_1 & y'_2 & y'_3 & \ldots\\ y'_1 & y'_2 & y'_3 & \ldots\\ \vdots & \vdots & \vdots & \vdots\\ y'_1 & y'_2 & y'_3 & \ldots\\ \end{pmatrix}\]

Then we get two distributions: \(\mu\) over \(\{0, 1\}^{n\times k} \times \{0,1\}^n\) and \(\mu_{pro}\) over \(\{0,1\}^{\sqrt{n}\times \sqrt{n}} \times \{0,1\}^{\sqrt{n}\times \sqrt{n}}\).
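The embedding can be sketched in Python on bit strings (an illustration of the construction; the helper names are mine):

```python
# Sketch of the embedding: k = sqrt(n) direct-product instances
# (x'_j, y'_j), each on sqrt(n) bits, become one SEL^{k x 1}_DISJ
# instance on n bits. x_j carries x'_j in block j and zeros elsewhere
# (the block-diagonal picture above); y concatenates y'_1, ..., y'_k.

def embed(xps, yps):
    """xps, yps: lists of k bit strings of length k each (k = sqrt(n))."""
    k = len(xps)
    zeros = "0" * k
    xs = [zeros * j + xps[j] + zeros * (k - 1 - j) for j in range(k)]
    y = "".join(yps)
    return xs, y

def disj(x, y):
    """DISJ on bit strings: 1 iff no position holds a 1 in both."""
    return int(all(a == "0" or b == "0" for a, b in zip(x, y)))

xps, yps = ["10", "01"], ["01", "11"]
xs, y = embed(xps, yps)
print(xs, y)  # ['1000', '0001'] 0111
# Since x_j is zero outside block j, where y holds exactly y'_j:
print(disj(xs[0], y) == disj(xps[0], yps[0]))  # True
print(disj(xs[1], y) == disj(xps[1], yps[1]))  # True
```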

Next, using Alice's \(\alpha k\)-bit advice, we can partition all inputs of \(\{0,1\}^{nk}\times \{0, 1\}^n\) into \(2^{\alpha k}\) classes with respect to \(\mu\) (to readers: why \(\mu\)? and why do we partition?). Let \(C\) denote the largest class; it takes at least a \(1/2^{\alpha k}\) fraction of the whole input under \(\mu\), so the success probability is at least \(1/2^{\alpha k}\), since the protocol returns correct answers on \(C\). Recall that we assumed there is a protocol using \(c\) bits to solve \(\textup{SEL}^{k\times 1}_{\textup{DISJ}}\). So, for any input \(((x_1,\ldots,x_k),y)\in C\) and any \(i \in \{1, \ldots, k\}\), the protocol gives the right answer.

Bob and Charlie communicate at most \(c\) bits on each input, so we can output the entire vector of answers using \(ck < \beta k \sqrt{n}\) bits. Thus, the protocol outputs a correct answer for \(\textup{DISJ}^{(k)}: \{0, 1\}^{\sqrt{n}k} \times \{0, 1\}^{\sqrt{n}k} \to \{0, 1\}^k\) on all inputs from \(C\). Also, because \(C\) is the largest class with respect to \(\mu\), the success probability is at least \(\frac{1}{2^{\alpha k}}\) over \(\mu_{pro}\). Thus, we obtain a contradiction to Theorem 3.
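Spelled out, the contradiction is a single chain of inequalities: restricted to inputs from the largest class, the assumed protocol computes \(\textup{DISJ}^{(k)}\) with

\[\text{communication} \le ck < \beta k\sqrt{n} \quad\text{and}\quad \Pr_{\mu_{pro}}[\text{all } k \text{ answers correct}] \ge \frac{1}{2^{\alpha k}},\]

whereas the reformulated Theorem 3 says every protocol with at most \(\beta k\sqrt{n}\) bits succeeds with probability less than \(\frac{1}{2^{\alpha k}}\).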

\(\textbf{Proof of Theorem 2}\) (sketch). Put the above analysis together.

## Open Questions

Notice that there is still a \(\log n\) gap between the lower and upper bounds. A natural question: can we find an \(O(\sqrt{n})\)-bit protocol in the ABC model?

## Related Previous Posts

## References

[1] Chattopadhyay, Arkadev, et al. “Upper and lower bounds on the power of advice.” *SIAM Journal on Computing* 45.4 (2016): 1412-1432.

[2] Patrascu, Mihai. “Towards polynomial lower bounds for dynamic problems.” *Proceedings of the forty-second ACM symposium on Theory of computing*. 2010.

[3] Klauck, Hartmut. “A strong direct product theorem for disjointness.” *Proceedings of the forty-second ACM symposium on Theory of computing*. 2010.