C1: Compress Interactive Communication (1)
It’s much more important today to be able to become an expert in a brand-new field in nine to twelve months than to have studied the right thing a long time ago. — Eric Jorgenson.
This post is based on [1, 2].
We start with examples.
Examples
- E1. Let $x, r \in \{0,1\}^n$ be two u.i.r. (uniformly and independently random) $n$-bit strings, where $x$ is Alice's input and $r$ is her private randomness. Alice uses private randomness to sample a uniformly random subset $S \subseteq [n]$ of size $|S| = k$, with $k \ll n$. Alice sends the following $n$-bit string $M$ to Bob, i.e., $M_i = x_i$ if $i \in S$ and $M_i = r_i$ otherwise.
Q. Can we use fewer bits of communication to achieve the same goal, i.e., generate a pair $(x, M)$ with the same joint distribution?
A. Yes. There exists a protocol simulating E1 with only $O(k)$ bits of communication.
- Alice and Bob use public randomness to sample a set $T \subseteq [n]$ of size $|T| = 4k$ as well as a uniformly random $n$-bit string $R$.
- Alice computes the coordinates $i \in T$ such that $R_i = x_i$; call this set $A$ (in expectation $|A| = |T|/2 = 2k$).
- Alice samples a uniformly random subset $S \subseteq A$ of size $|S| = k$ (such a subset exists with high probability, since $|A| \geq k$ except with probability $2^{-\Omega(k)}$).
- Alice constructs the following string $M$: if $i \in S$, $M_i = x_i$; if $i \in T \setminus S$, $M_i$ is a fresh uniformly random bit; $M_i = R_i$, otherwise. Alice sends the part of $M$ that Bob does not know to Bob: one bit per $i \in T$, indicating whether $M_i = R_i$.
Since Bob knows $R$, the $n - |T|$ bits of the string $M$ outside $T$ are not sent. So, the number of bits sent by Alice is $|T| = 4k \ll n$. Next, let us check the correctness. We only need to focus on the selected elements, i.e., the coordinates in $T$. When $M_i \neq R_i$ for such an element, Alice needs to send a "flip" instruction, because the bit should be $M_i$ and the two strings differ there; this can only happen on $T \setminus S$, where $M_i$ is a fresh uniform bit. For the rest of the elements they are the same, so a "keep" instruction suffices (in particular, on $S$ we have $M_i = x_i = R_i$ by the choice of $S \subseteq A$). Then the string Bob reconstructs is identical to $M$. Finally, by symmetry, $S$ is a uniformly random size-$k$ subset of $[n]$, and off $S$ the string $M$ is uniform and independent of $x$, so the pair $(x, M)$ is distributed exactly as in E1.
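To make the reconstruction above concrete, here is a minimal Python sketch (our own illustration, not from [1, 2]; the parameter choices $n = 8$, $k = 1$, $|T| = 4k$ and all function names are hypothetical) that samples both protocols and compares the distribution of the number of coordinates on which $M$ agrees with $x$; the two empirical distributions should coincide up to sampling noise.

```python
import random
from collections import Counter

n, k, t = 8, 1, 4  # toy sizes; t = |T| = 4k

def protocol_e1(x):
    """Original protocol E1: M equals x on a random size-k set S, uniform elsewhere."""
    S = set(random.sample(range(n), k))
    return [x[i] if i in S else random.randint(0, 1) for i in range(n)]

def protocol_compressed(x):
    """Compressed simulation: only |T| bits would cross the channel."""
    # Public randomness: a uniformly random size-t set T and an n-bit string R.
    T = random.sample(range(n), t)
    R = [random.randint(0, 1) for _ in range(n)]
    A = [i for i in T if R[i] == x[i]]  # coordinates of T where R agrees with x
    if len(A) < k:                      # low-probability failure event...
        return protocol_e1(x)           # ...fall back to E1 to keep the simulation exact
    S = set(random.sample(A, k))
    # Alice's message: one bit per i in T, saying whether M_i differs from R_i.
    flip = {i: (False if i in S else random.randint(0, 1) == 1) for i in T}
    # Bob reconstructs M from the public R and the |T| flip bits.
    return [(1 - R[i] if flip[i] else R[i]) if i in T else R[i] for i in range(n)]

def agreement_stats(proto, trials=100000):
    """Empirical distribution of |{i : M_i = x_i}| under a uniform input x."""
    counts = Counter()
    for _ in range(trials):
        x = [random.randint(0, 1) for _ in range(n)]
        m = proto(x)
        counts[sum(xi == mi for xi, mi in zip(x, m))] += 1
    return {a: round(counts[a] / trials, 3) for a in sorted(counts)}

print(agreement_stats(protocol_e1))
print(agreement_stats(protocol_compressed))  # should match the line above
```

The fallback branch keeps the simulation exactly correct even on the rare event $|A| < k$, at the price of occasionally paying the full $n$ bits.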
From the above example, we can see that we should be very careful about the randomness (public versus private) used by a protocol when giving a lower bound: the same conversation may be reproducible with far less communication once the randomness is shared.
Next, we introduce two important concepts.
Definitions
D1 (external information cost). Given a distribution $\mu$ on $\mathcal{X} \times \mathcal{Y}$ and a protocol $\pi$, denote by $R$ the public randomness and by $M$ the messages exchanged during the protocol, and let $(X, Y) \sim \mu$ be the inputs. The external information cost of $\pi$ is $\mathsf{IC}^{\mathrm{ext}}_\mu(\pi) = I(XY; MR)$.
For a particular function $f$, the external information cost of $f$ w.r.t. the distribution $\mu$ is the infimum of $\mathsf{IC}^{\mathrm{ext}}_\mu(\pi)$ over all protocols $\pi$ that compute $f$ on inputs from $\mu$ with probability at least $1 - \rho$, for a fixed error parameter $\rho$.
Intuitively speaking, the external information cost is the minimum number of bits of information that an external observer learns about the inputs of the two parties, Alice and Bob, from any protocol computing $f$.
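As a quick sanity check on this definition (a standard observation, phrased in the notation above), the external information cost never exceeds the communication cost $\mathsf{CC}(\pi)$: since the public randomness $R$ is independent of the inputs,

$$\mathsf{IC}^{\mathrm{ext}}_\mu(\pi) = I(XY; MR) = I(XY; R) + I(XY; M \mid R) = I(XY; M \mid R) \leq H(M) \leq \mathsf{CC}(\pi).$$

E1 shows the gap can be large: there the communication is $n$ bits, while $I(X; M) \leq I(X; MS) = I(X; S) + I(X; M \mid S) = k$, so at most $k$ bits of information about $x$ are revealed.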
D2 (internal information cost). Given a distribution $\mu$ on $\mathcal{X} \times \mathcal{Y}$ and a protocol $\pi$, with $R$, $M$, and $(X, Y) \sim \mu$ as above, the internal information cost of $\pi$ is $\mathsf{IC}_\mu(\pi) = I(X; MR \mid Y) + I(Y; MR \mid X)$.
It is the information that each party learns about the other party's input through the messages and the public randomness.
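For intuition on how internal and external costs can differ, consider a standard toy case (our illustration, not from the original post): let $\mu$ be uniform on $\{(b, b) : b \in \{0,1\}\}$, so that $x = y$ always, and let $\pi$ be the one-message protocol in which Alice sends $M = x$, with no public randomness. Each party can already predict the message from its own input, so

$$\mathsf{IC}_\mu(\pi) = I(X; M \mid Y) + I(Y; M \mid X) = 0, \qquad \text{while} \qquad \mathsf{IC}^{\mathrm{ext}}_\mu(\pi) = I(XY; M) = 1.$$

An external observer learns one full bit about the inputs even though the parties learn nothing new from each other; the two costs coincide when $\mu$ is a product distribution.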
Basic Lemmas
Let $M$ be the messages exchanged by the protocol and $R$ be the public randomness of the protocol.
L1. The external information cost is not smaller than the internal information cost, i.e., $\mathsf{IC}^{\mathrm{ext}}_\mu(\pi) \geq \mathsf{IC}_\mu(\pi)$.
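The lemma can be checked numerically. The following Python snippet (our own illustration; the distribution and protocol are the toy case from the previous section) computes both costs exactly by brute force over the joint distribution of $(X, Y, M)$ and confirms $\mathsf{IC}^{\mathrm{ext}}_\mu(\pi) = 1 \geq 0 = \mathsf{IC}_\mu(\pi)$.

```python
from math import log2

# Input distribution mu on {0,1} x {0,1}: uniform on {(b, b)}, i.e., x = y.
mu = {(0, 0): 0.5, (1, 1): 0.5}

# One-round deterministic protocol: Alice announces her input, so M = x.
p = {}  # joint distribution of (x, y, m)
for (x, y), w in mu.items():
    p[(x, y, x)] = p.get((x, y, x), 0.0) + w

def marginal(keep):
    """Marginal of p on the coordinate positions listed in `keep`."""
    out = {}
    for xym, w in p.items():
        key = tuple(xym[i] for i in keep)
        out[key] = out.get(key, 0.0) + w
    return out

def mi(a_idx, b_idx):
    """I(A; B), where A and B are groups of coordinates of (x, y, m)."""
    pa, pb, pab = marginal(a_idx), marginal(b_idx), marginal(a_idx + b_idx)
    return sum(w * log2(w / (pa[k[:len(a_idx)]] * pb[k[len(a_idx):]]))
               for k, w in pab.items() if w > 0)

def cmi(a_idx, b_idx, c_idx):
    """I(A; B | C) via the chain rule: I(A; BC) - I(A; C)."""
    return mi(a_idx, b_idx + c_idx) - mi(a_idx, c_idx)

external = mi((0, 1), (2,))                               # I(XY; M)
internal = cmi((0,), (2,), (1,)) + cmi((1,), (2,), (0,))  # I(X; M|Y) + I(Y; M|X)
print(external, internal)  # prints 1.0 0.0
```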
References
[1] Boaz Barak, Mark Braverman, Xi Chen, and Anup Rao. "How to compress interactive communication." Proceedings of the 42nd ACM Symposium on Theory of Computing (STOC), 2010.
[2] Anup Rao and Amir Yehudayoff. Communication Complexity and Applications. Cambridge University Press, 2020.