Assisted Common Information: Further Results
We presented assisted common information as a generalization of Gács–Körner (GK) common information at ISIT 2010. Our formulation was motivated by the goal of improving upper bounds on the efficiency of protocols for secure two-party sampling (a form of secure multi-party computation). Our upper bound was based on a monotonicity property of a rate region (called the assisted residual information region) associated with the assisted common information formulation. In this note we present further results. We explore the connection of assisted common information with the Gray–Wyner system. We show that the assisted residual information region and the Gray–Wyner region are connected by a simple relationship: the assisted residual information region is the increasing hull of the Gray–Wyner region under an affine map. Several known relationships between GK common information and the Gray–Wyner system follow as consequences of this, and quantities which arise in other source coding contexts acquire new interpretations. In previous work we showed that assisted common information can be used to derive upper bounds on the rate at which a pair of parties can *securely sample* correlated random variables, given correlated random variables from another distribution. Here we present an example where the bound derived using assisted common information is much better than previously known bounds, and is in fact tight. This example considers correlated random variables defined in terms of standard variants of oblivious transfer, and is interesting in its own right as it answers a natural question about these cryptographic primitives.
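The central relation stated in the abstract can be written compactly. The sketch below is a restatement for the reader's convenience, not an equation from the paper: the specific matrix M and offset b are defined in the paper itself, and the up-arrow notation for the increasing (upward-closed) hull is our choice here.

```latex
% R_ARI: assisted residual information region; R_GW: Gray--Wyner region.
% The increasing hull of a set S in R^3 is
%   \uparrow S = \{ t : t \ge s \text{ componentwise for some } s \in S \}.
\mathcal{R}_{\mathrm{ARI}}
  \;=\; \uparrow\!\left\{\, M s - b \;:\; s \in \mathcal{R}_{\mathrm{GW}} \,\right\}
```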
💡 Research Summary
The paper introduces “Assisted Common Information” (ACI) as a natural generalization of the classic Gács–Körner (GK) common information. While GK common information captures the maximal entropy of a deterministic common part that can be extracted from a single pair of random variables (X, Y), ACI allows an omniscient genie to assist two parties through two separate, rate‑limited noiseless links. The three quantities of interest – the two genie‑to‑user rates (R₁, R₂) and the residual conditional mutual information I(Xⁿ;Yⁿ | W)/n after the parties generate a common random variable W – define a three‑dimensional trade‑off region called the Assisted Residual Information Region (ARIR). When the genie’s links have zero rate, ARIR collapses to the GK common information value.
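A quick way to build intuition for the zero-rate corner of this region (plain GK common information): for finite alphabets, the classical Gács–Körner characterization identifies the maximal common part of (X, Y) with the connected-component label of (x, y) in the bipartite support graph, so K(X; Y) is the entropy of that label. The short script below computes it that way; it is an illustrative sketch (the function name and the toy pmf are ours, not from the paper).

```python
from collections import defaultdict
from math import log2

def gk_common_information(pxy):
    """Gács–Körner common information of a finite joint pmf, in bits.

    pxy: dict mapping (x, y) -> probability.
    Uses the connected-component characterization: the maximal common
    part is the component label of (x, y) in the bipartite graph whose
    edges are the support of pxy, so K(X;Y) is that label's entropy.
    """
    # Union-find over symbols; tag x- and y-symbols to keep them distinct.
    parent = {}

    def find(a):
        parent.setdefault(a, a)
        while parent[a] != a:
            parent[a] = parent[parent[a]]  # path halving
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    for (x, y), p in pxy.items():
        if p > 0:
            union(('x', x), ('y', y))

    # Total probability mass of each connected component.
    comp_prob = defaultdict(float)
    for (x, y), p in pxy.items():
        if p > 0:
            comp_prob[find(('x', x))] += p

    return -sum(p * log2(p) for p in comp_prob.values() if p > 0)

# Toy example: two equally likely "clusters"; within each cluster the
# pair is uniform, so only the cluster identity is a common part.
pxy = {(0, 0): 0.25, (0, 1): 0.25, (2, 2): 0.25, (2, 3): 0.25}
print(gk_common_information(pxy))  # → 1.0 (one fair common bit)
```

For a pair of independent uniform bits the support graph is fully connected, so the function returns 0, matching the well-known fact that independent (and, more generally, indecomposable) pairs have zero GK common information.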
A central contribution is the discovery of a simple affine transformation that maps the well‑studied Gray‑Wyner (GW) source‑coding region R_GW into the ARIR. Specifically, for any rate triple s = (R_A,R_B,R_C) in R_GW, the map f(s) = M·s − b, with M =