Revised: January 2, 2018

Published: December 27, 2018

**Keywords:** communication complexity, information complexity, fooling distribution, separation of complexity measures

**Categories:** complexity theory, communication complexity, information complexity, lower bounds, separation of complexity measures

**ACM Classification:** F.1.1, F.1.2, F.2.3

**AMS Classification:** 68Q05, 68W10, 68Q17, 68Q85, 68Q87

**Abstract:**

We give an example of a Boolean function whose information complexity is exponentially smaller than its communication complexity. Such a separation was first demonstrated by Ganor, Kol, and Raz (J. ACM 2016). We give a simpler proof of the same result. In the course of this simplification, we make several new contributions: we introduce a new communication lower-bound technique, the notion of a *fooling distribution*, which allows us to separate information and communication complexity; and we give a more direct proof of the information complexity upper bound.

We also prove a generalization of Shearer's Lemma that may be of independent interest. A version of Shearer's original lemma bounds the expected mutual information between a random subset of random variables and another random variable, when the subset is chosen independently of all the random variables involved. Our generalization allows some dependence between the random subset and the random variables involved, and still gives similar bounds with an appropriate error term.
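For intuition, one standard mutual-information formulation of Shearer's Lemma can be sketched as follows; the notation here is illustrative rather than the paper's, and it assumes the coordinates are mutually independent and the subset is chosen independently of everything else:

```latex
% A common mutual-information version of Shearer's Lemma (illustrative notation).
% Assume X_1,\dots,X_n are mutually independent random variables, Y is an
% arbitrary random variable, and S \subseteq [n] is a random subset that is
% independent of (X_1,\dots,X_n,Y) and satisfies \Pr[i \in S] \ge \delta
% for every coordinate i. Writing X_S for the tuple (X_i)_{i \in S},
\[
  \mathbb{E}_{S}\bigl[\, I(X_S ; Y) \,\bigr]
  \;\ge\;
  \delta \cdot I(X_1,\dots,X_n ; Y).
\]
```

The generalization proved in the paper relaxes the requirement that $S$ be independent of the variables involved, obtaining a comparable bound with an appropriate error term.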

A preliminary version of this paper appeared in ECCC as Technical Report TR15-057.