
C1: Direct Sums in Communication Complexity

by Xianbin

Communication complexity theory is also very helpful in economics. Having learned this, I feel it would be good to write more about communication complexity theory.

The Direct Sum Theorem

$\textbf{Theorem 1}$ (worst case). Let $g^n$ denote the problem $g$ with $n$ copies, and let $cc_g$ denote the randomized communication complexity of $g$. Then the randomized communication complexity of $g^n$ is $\Omega\left(\frac{cc_g \sqrt{n}}{\log cc_g}\right)$.

Tools

1. Protocol and transcript

A (private-coin) protocol $\pi$ is a decision tree in which each internal node has an owner, Alice or Bob.

The transcript $\Pi$ of a protocol is the concatenation of all bits exchanged during an execution of the protocol with public randomness $R$ and private randomness $R_A$ and $R_B$, i.e., $\Pi = \Pi(x, y, R_A, R_B, R)$.

A protocol $\pi$ is an algorithm, while a transcript is a string.

Transcript ($\Pi$) Example (by NotebookLM)

Now, let’s imagine a specific execution of this protocol:

Steps in the Protocol Execution

  1. Alice Computes Her Message:
    • $M_1 = h_A(\{\text{apple}, \text{banana}, \text{cherry}\}, 011, 10110)$
    • Result: $M_1 = 11001$
    • This is message $\Pi_1$.
  2. Bob Computes His Message:
    • Bob receives Alice’s message and computes:
    • $M_2 = h_B(\{\text{date}, \text{fig}, \text{apple}\}, 110, 10110)$
    • Result: $M_2 = 01010$
    • This is message $\Pi_2$.

Transcript ($\Pi$)

The transcript of this particular run of the protocol is:

$\Pi = R \, \Pi_1 \, \Pi_2 = 10110\ 11001\ 01010$

Breakdown of the Transcript:

  • $R = 10110$: the public randomness.
  • $\Pi_1 = 11001$: Alice’s message.
  • $\Pi_2 = 01010$: Bob’s message.

Key Points

The transcript includes:

  1. The public randomness $R$.
  2. All messages exchanged during the protocol’s execution.

It represents all the information visible to both parties (and possibly to an external observer). Different random strings and inputs will generate different transcripts, but all executions of the protocol will follow the same structure specified by $\pi$.
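To make the protocol/transcript distinction concrete, here is a minimal Python sketch of the example above. Everything in it is hypothetical: `toy_message` is an illustrative stand-in for the unspecified hash functions $h_A$ and $h_B$, and the bit lengths are chosen to match the example.

```python
import hashlib
import random

# Illustrative stand-in for h_A / h_B: compresses an input set, some coins,
# and the conversation so far into a fixed number of bits.
def toy_message(items, private_coins, conversation, width=5):
    data = ",".join(sorted(items)) + "|" + private_coins + "|" + conversation
    digest = hashlib.sha256(data.encode()).digest()
    return format(digest[0] % (1 << width), f"0{width}b")

def run_protocol(alice_input, bob_input, seed=None):
    rng = random.Random(seed)
    R   = format(rng.getrandbits(5), "05b")  # public randomness, seen by both
    R_A = format(rng.getrandbits(3), "03b")  # Alice's private randomness
    R_B = format(rng.getrandbits(3), "03b")  # Bob's private randomness

    pi_1 = toy_message(alice_input, R_A, R)        # Alice speaks first
    pi_2 = toy_message(bob_input, R_B, R + pi_1)   # Bob has seen pi_1

    # The transcript: public randomness plus all exchanged messages.
    return R + " " + pi_1 + " " + pi_2

# The protocol (the code above) is fixed; the transcript (a string) varies per run.
print(run_protocol({"apple", "banana", "cherry"}, {"date", "fig", "apple"}, seed=1))
print(run_protocol({"apple", "banana", "cherry"}, {"date", "fig", "apple"}, seed=2))
```

Running it with two different seeds prints two different transcripts produced by the same protocol, which is exactly the algorithm-versus-string distinction.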

2. Information complexity and communication complexity

$\textbf{Theorem 2}$. $IC_\mu(\pi) = \mathbb{E}_r[IC_\mu(\pi_r)]$, where $\pi_r$ is the protocol $\pi$ with its public randomness fixed to $R = r$, and $IC_\mu(\pi) = I(\Pi : X \mid Y) + I(\Pi : Y \mid X)$ is the (internal) information cost of $\pi$ under $(X, Y) \sim \mu$.

Proof.

$$
\begin{aligned}
IC_\mu(\pi) &= I(\Pi : X \mid Y) + I(\Pi : Y \mid X) \\
&= I(R : X \mid Y) + I(\Pi \setminus R : X \mid Y, R) + I(R : Y \mid X) + I(\Pi \setminus R : Y \mid X, R) && \text{(chain rule; } R \text{ is part of } \Pi\text{)} \\
&= I(R : X \mid Y) + I(\Pi : X \mid Y, R) + I(R : Y \mid X) + I(\Pi : Y \mid X, R) \\
&= I(\Pi : X \mid Y, R) + I(\Pi : Y \mid X, R) && (R \text{ is independent of } X, Y) \\
&= \mathbb{E}_r [I(\Pi : X \mid Y, R = r)] + \mathbb{E}_r [I(\Pi : Y \mid X, R = r)] \\
&= \mathbb{E}_r [IC_\mu(\pi_r)].
\end{aligned}
$$

Useful Facts:

$I(\Pi : X \mid Y) = I(\Pi : X \mid Y, R) = I(\Pi : X \mid Y, R, R_B)$
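The first equality can be sanity-checked numerically on a toy protocol. The sketch below is mine, not from [1]: it takes a one-message protocol in which $X$, $Y$, $R$ are independent uniform bits and Alice's only message is $X \oplus R$, enumerates the joint distribution, and computes conditional mutual information straight from the definition; both quantities come out to exactly 1 bit.

```python
from collections import defaultdict
from itertools import product
from math import log2

def cond_mi(joint, A, B, C):
    """I(A : B | C) in bits, computed from the definition.
    joint maps outcome tuples to probabilities; A, B, C extract coordinates."""
    p_abc, p_ac, p_bc, p_c = (defaultdict(float) for _ in range(4))
    for w, p in joint.items():
        a, b, c = A(w), B(w), C(w)
        p_abc[(a, b, c)] += p
        p_ac[(a, c)] += p
        p_bc[(b, c)] += p
        p_c[c] += p
    return sum(p * log2(p * p_c[c] / (p_ac[(a, c)] * p_bc[(b, c)]))
               for (a, b, c), p in p_abc.items() if p > 0)

# Toy protocol: X, Y, R independent uniform bits; Alice's only message
# is X XOR R, so the transcript is Pi = (R, X XOR R).
joint = {}
for x, y, r in product([0, 1], repeat=3):
    joint[(x, y, r, (r, x ^ r))] = 1 / 8

X  = lambda w: w[0]
Y  = lambda w: w[1]
R  = lambda w: w[2]
Pi = lambda w: w[3]

print(cond_mi(joint, Pi, X, lambda w: Y(w)))          # I(Pi : X | Y)    = 1.0
print(cond_mi(joint, Pi, X, lambda w: (Y(w), R(w))))  # I(Pi : X | Y, R) = 1.0
```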

Information complexity is easier to work with

Inner (internal) information cost satisfies additivity.

$\textbf{Theorem 3}$. For every communication task $f$, we have $IC_{\mu^n}(f^n) = n \cdot IC_\mu(f)$.

Proof. The general trick for proving $A = B$ is to prove $A \geq B$ and $A \leq B$ separately.

It is straightforward that $IC_{\mu^n}(f^n) \leq n \cdot IC_\mu(f)$: given a protocol $\pi$ for $f$, we can solve the $n$ copies sequentially by running a fresh instance of $\pi$ on each coordinate (see the sketch below). Now, let us prove the other direction, i.e., $IC_{\mu^n}(f^n) \geq n \cdot IC_\mu(f)$. To prove this, it is enough to design a protocol $\pi'$ that solves a single copy of $f$ with information cost at most $IC_{\mu^n}(f^n)/n$.
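For the easy direction, here is a hedged sketch of the additivity calculation (following the standard chain-rule argument, as in [1]; $\Pi^i$ denotes the transcript of the $i$-th sequential run, $\Pi^{<i}$ the earlier ones, and some conditioning details are glossed over):

$$
\begin{aligned}
I(\Pi : X \mid Y) &= \sum_{i=1}^{n} I(\Pi^i : X \mid Y, \Pi^{<i}) && \text{(chain rule)} \\
&= \sum_{i=1}^{n} I(\Pi^i : X_i \mid Y_i) && \text{(coordinates are independent under } \mu^n\text{)} \\
&= n \cdot I(\Pi^1 : X_1 \mid Y_1),
\end{aligned}
$$

and symmetrically for $I(\Pi : Y \mid X)$; each run uses fresh randomness on a fresh coordinate, so each term equals the single-copy cost, giving $IC_{\mu^n}(f^n) \leq n \cdot IC_\mu(f)$.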

The next step is to design $\pi'$ from $\pi$.

To be continued…

Reference

[1] Barak, B., Braverman, M., Chen, X., and Rao, A. How to compress interactive communication. In Proceedings of the 42nd ACM Symposium on Theory of Computing (STOC 2010), pp. 67–76.

[2] Hamed Hatami. COMP760, summary of lecture 14: introduction to information complexity.

[3] NotebookLM.