Toolbox

C1: Direct Sums in Communication Complexity

by Xianbin

Communication complexity theory is also very helpful in the area of economics. Having learned this, I felt it would be good to write more about communication complexity theory.

The Direct Sum Theorem

\(\textbf{Theorem 1}\) (worst case). Let \(g^{n}\) denote the problem of computing \(n\) independent copies of \(g\). If the randomized communication complexity of \(g\) is \(cc_g\), then the randomized communication complexity of \(g^n\) is \(\Omega(\frac{\sqrt{n}\, cc_g}{\log cc_g})\).
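To see what the bound buys (a comparison added here for intuition, not part of the original statement): computing the copies one at a time costs \(O(n \cdot cc_g)\), so Theorem 1 says that no protocol for \(g^n\) can beat this naive approach by more than a factor of roughly \(\sqrt{n} \log cc_g\):

\[\underbrace{O(n \cdot cc_g)}_{\text{run a protocol for } g \text{ on each copy}} \quad \text{vs.} \quad \underbrace{\Omega\!\left(\frac{\sqrt{n}\, cc_g}{\log cc_g}\right)}_{\text{Theorem 1 lower bound}}\]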

Tools

1. Protocol and transcript

A (private-coin) protocol \(\pi\) is a decision tree in which each internal node has an owner, Alice or Bob; at that node, the owner chooses the next bit to send as a function of her own input and randomness.

The transcript \(\Pi\) of a protocol is the concatenation of the public randomness \(R\) and all bits exchanged during an execution of the protocol. It is a function of the inputs and of the randomness, i.e., \(\Pi = \Pi(x, y, R_A, R_B, R)\), where \(R_A\) and \(R_B\) are Alice's and Bob's private randomness.

A protocol \(\pi\) is an algorithm, while a transcript is a string.

Transcript (\(\Pi\)) Example (by NotebookLM)

Now, let’s imagine a specific execution of this protocol:

Steps in the Protocol Execution

  1. Alice Computes Her Message:
    • \[M_1 = h_A(\{\text{apple}, \text{banana}, \text{cherry}\}, 011, 10110)\]
    • Result: \(M_1 = 11001\)
    • This is message \(\Pi_1\).
  2. Bob Computes His Message:
    • Bob receives Alice’s message and computes:
    • \[M_2 = h_B(\{\text{date}, \text{fig}, \text{apple}\}, 110, 10110)\]
    • Result: \(M_2 = 01010\)
    • This is message \(\Pi_2\).

Transcript (\(\Pi\))

The transcript of this particular run of the protocol is:

\[\Pi = R\, \Pi_1\, \Pi_2 = 10110\;11001\;01010\]

Breakdown of the Transcript:

  • \(R = 10110\) (public randomness)
  • \(\Pi_1 = 11001\) (Alice's message)
  • \(\Pi_2 = 01010\) (Bob's message)

Key Points

The transcript includes:

  1. The public randomness \(R\).
  2. All messages exchanged during the protocol’s execution.

It represents all the information visible to both parties (and possibly to an external observer). Different random strings and inputs will generate different transcripts, but all executions of the protocol will follow the same structure specified by \(\pi\).
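The same run can be mimicked in a few lines of Python (a minimal sketch: `h_A` and `h_B` below are stand-ins for the example's unspecified message functions, so the concrete bit strings will not match the 11001 and 01010 above):

```python
def h_A(alice_set, r_a, r):
    """Stand-in for the example's message function h_A: hash Alice's input
    together with her private randomness r_a and the public randomness r
    down to a 5-bit string. (Python salts str hashing, so the output
    changes from run to run.)"""
    return format(hash((frozenset(alice_set), r_a, r)) % 32, "05b")

def h_B(bob_set, r_b, r):
    """Stand-in for h_B. In general Bob's message may also depend on
    Alice's message, although the example's formula does not show it."""
    return format(hash((frozenset(bob_set), r_b, r)) % 32, "05b")

R = "10110"                                          # public randomness
pi_1 = h_A({"apple", "banana", "cherry"}, "011", R)  # Alice's message Pi_1
pi_2 = h_B({"date", "fig", "apple"}, "110", R)       # Bob's message Pi_2

Pi = R + pi_1 + pi_2                                 # transcript Pi = R Pi_1 Pi_2
print(Pi)
```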

2. Information complexity and communication complexity

\(\textbf{Theorem 2}\). Let \(IC_\mu(\pi) = I(\Pi : X \mid Y) + I(\Pi : Y \mid X)\) denote the (internal) information cost of \(\pi\) when \((X, Y) \sim \mu\), and let \(\pi_r\) denote \(\pi\) with its public randomness fixed to \(R = r\). Then \(IC_\mu(\pi) = \mathbb{E}_r [IC_\mu(\pi_r)]\).

Proof.

\[IC_\mu(\pi) = I(\Pi : X \mid Y) + I(\Pi : Y \mid X) \\ = I(R : X \mid Y) + I(\Pi \setminus R : X \mid Y, R) + I(R : Y \mid X) + I(\Pi \setminus R : Y \mid X, R) \\ = I(R : X \mid Y) + I(\Pi : X \mid Y, R) + I(R : Y \mid X) + I(\Pi : Y \mid X, R) \\ = I(\Pi : X \mid Y, R) + I(\Pi : Y \mid X, R) \quad (R \text{ is independent of } (X, Y)) \\ = \mathbb{E}_r [I(\Pi : X \mid Y, R = r)] + \mathbb{E}_r [I(\Pi : Y \mid X, R = r)] \\ = \mathbb{E}_r [IC_\mu(\pi_r)]\]
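For completeness (a remark added here, not in the original proof): the second equality above is just the chain rule for mutual information, applied with \(\Pi = (R, \Pi \setminus R)\),

\[I(AB : X \mid Y) = I(A : X \mid Y) + I(B : X \mid Y, A)\]

and the third equality holds because, once \(R\) is known, \(\Pi\) and \(\Pi \setminus R\) determine each other.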

Useful Facts:

\[I(\Pi: X \mid Y ) = I(\Pi: X \mid Y, R) = I(\Pi: X \mid Y, R, R_B)\]
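As a numerical sanity check (a sketch added here, not from the references), the following Python snippet builds a toy one-coin protocol and verifies the first of the two terms in Theorem 2, \(I(\Pi : X \mid Y) = \mathbb{E}_r[I(\Pi : X \mid Y, R = r)]\), by direct computation over the joint distribution. All names in it are made up for the illustration.

```python
from collections import defaultdict
from itertools import product
from math import log2

def cond_mi(dist):
    """I(A : B | C) for a joint distribution given as {(a, b, c): prob}."""
    p_ac, p_bc, p_c = defaultdict(float), defaultdict(float), defaultdict(float)
    for (a, b, c), p in dist.items():
        p_ac[(a, c)] += p
        p_bc[(b, c)] += p
        p_c[c] += p
    return sum(p * log2(p * p_c[c] / (p_ac[(a, c)] * p_bc[(b, c)]))
               for (a, b, c), p in dist.items() if p > 0)

# Toy protocol: one public coin r; Alice sends x AND r, Bob sends y OR r.
# As in the post, the transcript starts with the public randomness.
def transcript(x, y, r):
    return (r, x & r, y | r)

# Uniform, independent x, y in {0,1} and public coin r in {0,1}.
full = defaultdict(float)
for x, y, r in product([0, 1], repeat=3):
    full[(transcript(x, y, r), x, y)] += 1 / 8

lhs = cond_mi(full)                # I(Pi : X | Y)

# E_r[ I(Pi : X | Y, R = r) ]: fix each value of r, then average.
rhs = 0.0
for r in [0, 1]:
    dist_r = defaultdict(float)
    for x, y in product([0, 1], repeat=2):
        dist_r[(transcript(x, y, r), x, y)] += 1 / 4
    rhs += 0.5 * cond_mi(dist_r)

print(lhs, rhs)                    # both print 0.5: the two sides agree
```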

Why information complexity is easier

Internal information cost satisfies additivity.

\(\textbf{Theorem 3}\). For every communication task \(f\), we have \(IC_{\mu^n}(f^n) = n \cdot IC_\mu(f)\).

Proof. As usual, to prove \(A = B\) we prove \(A \leq B\) and \(A \geq B\) separately.

The direction \(IC_{\mu^n}(f^n) \leq n \cdot IC_\mu(f)\) is straightforward: given a protocol \(\pi\) for \(f\), we can solve the \(n\) copies sequentially, running \(\pi\) independently on each coordinate (see the calculation after this paragraph). Now let us prove the other direction, i.e., \(IC_{\mu^n}(f^n) \geq n \cdot IC_\mu(f)\). To prove this, it is enough to design a protocol \(\pi'\) that solves \(f\) with information cost at most \(IC_{\mu^n}(f^n)/n\).
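For the easy direction, here is the one-line calculation (spelled out here for completeness): since the \(n\) sequential executions use independent inputs and fresh randomness, the triples \((\Pi_i, X_i, Y_i)\) are mutually independent across coordinates, so the information contributions simply add up:

\[I(\Pi_1 \cdots \Pi_n : X_1 \cdots X_n \mid Y_1 \cdots Y_n) = \sum_{i=1}^{n} I(\Pi_i : X_i \mid Y_i) = n \cdot I(\Pi : X \mid Y)\]

and likewise for the symmetric term.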

The next step is to design \(\pi'\) from an information-optimal protocol \(\pi\) for \(f^n\).

To be continued…

Reference

[1] Barak, B., Braverman, M., Chen, X., and Rao, A. How to compress interactive communication. In Proceedings of the 42nd ACM Symposium on Theory of Computing (STOC 2010), pp. 67–76.

[2] Hatami, H. COMP760, summary of Lecture 14: Introduction to information complexity.

[3] NotebookLM.