Communication complexity bounds from information causality
Communication complexity, which quantifies the minimum communication required for distributed computation, offers a natural setting for investigating the capabilities and limitations of quantum mechanics in information processing. We introduce an information-theoretic approach to study one-way communication complexity based solely on the axioms of mutual information. Within this framework, we derive an extended statement of the information causality principle, which recovers known lower bounds on the communication complexity of a range of functions in a simplified manner and leads to new results. We further prove that the extended information causality principle is at least as strong as the principle of non-trivial communication complexity in bounding the strength of quantum correlations attainable in Bell experiments. Our study establishes a new route for exploring the fundamental limits of quantum technologies from an information-theoretic viewpoint.
💡 Research Summary
The paper investigates one‑way communication complexity in the entanglement‑assisted classical model using only the axioms of mutual information. By adopting the four standard axioms (non‑negativity, chain rule, data‑processing inequality, and reduction to Shannon mutual information for classical variables), the authors derive an extended version of the Information Causality (IC) principle, presented as Theorem 1. This theorem states that for any Boolean function f:X×Y→{0,1} and any ordering of the elements of Y, the required communication C*ε(f) is lower‑bounded by the sum over i of the conditional mutual informations I(g;f(x,y_i) | {f(x,y_j)}_{j<i}), where g is Bob’s guess and ε is the allowed error probability.
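In display form, the bound of Theorem 1 as described above reads (with g, ε, and C*ε(f) as defined in the summary):

```latex
C^{*}_{\varepsilon}(f) \;\ge\; \sum_{i=1}^{|Y|} I\bigl(g \,;\, f(x,y_i) \,\bigm|\, f(x,y_1), \dots, f(x,y_{i-1})\bigr)
```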
Applying this bound to several well-studied functions yields concise proofs of known linear lower bounds: for the INDEX, inner-product (IP), and disjointness (DISJ) functions, the bound reduces to a sum of terms identical to those appearing in the original IC scenario, giving C*ε = Ω(n). For the equality function EQ, setting ε = 0 makes each term equal to the conditional entropy of f(x,y_i) given the preceding values, so by the chain rule the sum evaluates exactly to n, reproducing the deterministic bound C*0(EQ) = n.
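The chain-rule identity behind the EQ case can be checked numerically. The sketch below (ours, not the authors' released software) assumes a uniform input x and a fixed ordering of Y; since the indicator vector (1[x = y])_y is one-hot in x, the conditional entropies must sum to H(x) = n bits.

```python
from math import log2
from collections import Counter

def eq_chain_entropy(n):
    """Sum of conditional entropies H(f(x, y_i) | f(x, y_1..y_{i-1}))
    for EQ on n-bit strings, with x uniform over {0,1}^n and Y
    enumerated in a fixed order."""
    N = 2 ** n
    total = 0.0
    for i in range(N):
        joint = Counter()          # distribution over (prefix, f_i)
        prefix_marg = Counter()    # marginal distribution over prefix
        for x in range(N):         # x uniform over {0,1}^n
            prefix = tuple(int(x == j) for j in range(i))
            joint[(prefix, int(x == i))] += 1
            prefix_marg[prefix] += 1
        for (prefix, fi), c in joint.items():
            p, p_pref = c / N, prefix_marg[prefix] / N
            total -= p * log2(p / p_pref)
    return total

print(eq_chain_entropy(3))  # ≈ 3.0
```

Up to floating-point error, the result equals n, matching the deterministic bound for EQ quoted above.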
The authors also analyze a previously unexamined family, the k-intersect function k-INT_n, defined as the predicate that the Hamming inner product of x and y is at least k. By ordering Y so that strings of Hamming weight k appear first, the bound decomposes into weighted sums of the same type as for INDEX, leading to C*ε(k-INT_n) = Ω(n − 2k). Thus the communication cost remains linear in n for any fixed k, with a bound that weakens as k grows, a result the authors present as new.
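The k-INT_n predicate and the weight-k-first ordering of Y described above can be sketched as follows (the function and variable names are ours, not the paper's):

```python
from itertools import product

def k_int(x, y, k):
    """k-INT_n predicate: 1 iff the Hamming inner product of the
    bit-tuples x and y is at least k."""
    return int(sum(xi & yi for xi, yi in zip(x, y)) >= k)

def ordered_Y(n, k):
    """An ordering of Y = {0,1}^n that lists the Hamming-weight-k
    strings first, as in the bound above; the remaining strings
    follow in an arbitrary fixed order."""
    all_strings = list(product((0, 1), repeat=n))
    first = [s for s in all_strings if sum(s) == k]
    rest = [s for s in all_strings if sum(s) != k]
    return first + rest
```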
Beyond specific functions, the paper shows that the extended IC principle is at least as strong as the "non-trivial communication complexity" principle. Any correlations violating Theorem 1 would, like PR-box correlations, allow an arbitrary Boolean function to be computed with a single bit of communication, contradicting the premise that communication complexity is non-trivial. Hence the extended IC principle offers an information-theoretic tool at least as tight as non-trivial communication complexity for bounding the set of quantum correlations achievable in Bell experiments.
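The PR-box correlation invoked here, and van Dam's classic construction showing how such correlations make the inner product computable with one transmitted bit, can be simulated directly (an illustrative sketch assuming ideal, noiseless boxes; any Boolean function then reduces to an inner product over a truth-table expansion of the inputs):

```python
import random

def pr_box(x, y):
    """One use of an ideal PR box: outputs a, b that are marginally
    uniform but satisfy a XOR b == x AND y on every run."""
    a = random.randint(0, 1)   # Alice's output, locally uniform
    b = a ^ (x & y)            # Bob's output, enforcing the correlation
    return a, b

def ip_with_one_bit(x_bits, y_bits):
    """Van Dam-style protocol: compute the inner product mod 2 of
    x_bits and y_bits with a single transmitted bit, consuming one
    PR box per coordinate."""
    a_total = b_total = 0
    for xi, yi in zip(x_bits, y_bits):
        a, b = pr_box(xi, yi)
        a_total ^= a
        b_total ^= b
    # Alice sends the single bit a_total; Bob XORs it into b_total.
    return a_total ^ b_total   # equals sum(xi * yi) mod 2
```

The local randomness in each box cancels in the XOR, which is why one bit always suffices regardless of the input length.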
Finally, the authors provide a publicly available software implementation that automates the calculation of the bound in Eq. (7) for arbitrary functions and input sizes, facilitating further research at the intersection of communication complexity, quantum information theory, and foundational studies of non‑locality.