Graph Entropy, Network Coding and Guessing games
We introduce the (private) entropy of a directed graph (in a new network coding sense) as well as a number of related concepts. We show that the entropy of a directed graph is identical to its guessing number and can be bounded from below by the number of vertices minus the size of the graph’s shortest index code. We show that the Network Coding solvability of each specific multiple unicast network is completely determined by the entropy (as well as by the shortest index code) of the directed graph that occurs when identifying each source node with its corresponding target node. Shannon’s information inequalities can be used to calculate upper bounds on a graph’s entropy as well as bounds on the size of the minimal index code. Recently, a number of new families of so-called non-Shannon-type information inequalities have been discovered. It has been shown that there exist communication networks with a capacity strictly less than required for solvability, but where this fact cannot be derived using Shannon’s classical information inequalities. Based on this result we show that there exist graphs with an entropy that cannot be calculated using only Shannon’s classical information inequalities, and show that better estimates can be obtained by use of certain non-Shannon-type information inequalities.
💡 Research Summary
The paper introduces a novel information‑theoretic measure for directed graphs called (private) graph entropy and establishes a deep connection between this entropy, the graph’s guessing number, and index coding. The authors first define the entropy of a directed graph G = (V,E) as the largest joint Shannon entropy achievable by random variables assigned to the vertices, subject to each variable being determined by the variables of its in‑neighbours; the direction of the edges thus encodes the allowed information flow. They then consider a guessing game in which each vertex observes the variables of its in‑neighbours and must simultaneously guess its own variable. The guessing number g(G) quantifies how much an optimal cooperative strategy improves on uncoordinated random guessing, and it is shown to be exactly equal to the graph entropy H(G). This equivalence (Theorem 1) unifies two previously separate lines of research: graph‑based guessing games and entropy‑based network coding.
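For intuition, the guessing number of a very small graph can be brute‑forced directly. Below is a minimal sketch (our own toy example, not code from the paper) for the directed 3‑cycle C_3 over the binary alphabet: each player sees only its in‑neighbour’s value, must guess its own, and the players win only if every guess is correct.

```python
from itertools import product
from math import log2

# Toy example (not from the paper): brute-force the guessing number of the
# directed 3-cycle C_3 over the binary alphabet {0, 1}. Player i sees only
# the value of its in-neighbour (i-1 mod 3) and must guess its own value;
# the round is won only if ALL guesses are correct.

n, s = 3, 2
pred = lambda i: (i - 1) % n

# A strategy maps the seen bit to a guess; encode it as a pair
# (guess_if_see_0, guess_if_see_1), giving 4 strategies per player.
strategies = list(product(range(s), repeat=s))

best_wins = 0
for prof in product(strategies, repeat=n):          # 4^3 joint strategies
    wins = sum(
        all(prof[i][vals[pred(i)]] == vals[i] for i in range(n))
        for vals in product(range(s), repeat=n)     # 2^3 assignments
    )
    best_wins = max(best_wins, wins)

# Guessing number: g(G) = n + log_s(optimal winning probability).
g = n + log2(best_wins / s ** n)
print(g)   # prints 1.0: "guess what you see" wins on the 2 constant assignments
```

No strategy profile beats winning probability 2/8 here, consistent with the fact that directed cycles have guessing number 1.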
Next, the paper links graph entropy to the index coding problem. For a given graph G, let L(G) denote the length of the shortest linear index code that satisfies all receivers. The authors prove the inequality H(G) ≥ |V| − L(G) (Theorem 2), meaning that the more efficiently a graph can be index‑coded, the closer its entropy is to the trivial upper bound |V|. This provides a quantitative bridge between combinatorial index coding and information‑theoretic entropy.
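To make the bound concrete, note that for *linear* index codes over GF(2) the shortest length equals the minrank of the graph (a characterization due to Bar‑Yossef et al.): the minimum GF(2)‑rank over matrices with 1s on the diagonal and free entries only where a receiver has side information. The following sketch (our own toy computation, not from the paper) brute‑forces this for the directed 3‑cycle C_3:

```python
from itertools import product

# Toy computation (not from the paper): brute-force the minrank of the
# directed 3-cycle C_3 over GF(2), where vertex i has side information
# x_{i-1}. The minrank equals the shortest linear index code length L(G).

n = 3
free = [(i, (i - 1) % n) for i in range(n)]   # vertex i knows x_{i-1}

def gf2_rank(rows):
    """Rank of a 0/1 matrix over GF(2) via Gaussian elimination."""
    rows, rank = [row[:] for row in rows], 0
    for col in range(n):
        piv = next((k for k in range(rank, n) if rows[k][col]), None)
        if piv is None:
            continue
        rows[rank], rows[piv] = rows[piv], rows[rank]
        for k in range(n):
            if k != rank and rows[k][col]:
                rows[k] = [a ^ b for a, b in zip(rows[k], rows[rank])]
        rank += 1
    return rank

best = n
for bits in product(range(2), repeat=len(free)):
    mat = [[int(i == j) for j in range(n)] for i in range(n)]
    for b, (i, j) in zip(bits, free):
        mat[i][j] = b                          # free entries only on edges
    best = min(best, gf2_rank(mat))

print(best, n - best)   # L(C_3) = 2, giving the bound H(G) >= 3 - 2 = 1
```

Here |V| − L(G) = 1 matches the guessing number of the 3‑cycle, so the bound is tight for this graph.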
A central contribution is the reduction of any multiple‑unicast network to a single directed graph. By identifying each source node with its corresponding destination node, the network is transformed into a graph G_N. The paper shows that the solvability of the original network (i.e., whether all demands can be simultaneously satisfied by a coding scheme) is completely determined by the entropy H(G_N) (or equivalently the guessing number) of the resulting graph. Consequently, checking network coding feasibility becomes a matter of computing a graph‑theoretic quantity rather than solving a large system of linear equations.
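The identification step itself is mechanical; a minimal sketch (the tiny network and node names below are our own made‑up illustration, not from the paper):

```python
# Sketch of the source/target identification step (toy example; the
# network and node names are made up for illustration). Each target node
# of a unicast demand is relabelled to its source node, turning the
# multiple-unicast network N into the directed graph G_N.

def merge_demands(edges, demands):
    rename = {t: s for s, t in demands}          # target -> source
    relabel = lambda v: rename.get(v, v)
    return sorted({(relabel(u), relabel(v)) for u, v in edges})

# Two unicast sessions s1 -> t1 and s2 -> t2 sharing a middle node m.
edges = [("s1", "m"), ("s2", "m"), ("m", "t1"), ("m", "t2")]
g_n = merge_demands(edges, [("s1", "t1"), ("s2", "t2")])
print(g_n)   # [('m', 's1'), ('m', 's2'), ('s1', 'm'), ('s2', 'm')]
```

The entropy (equivalently, the guessing number) of the resulting graph then decides solvability of the original network, per the reduction described above.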
The authors then examine the role of information inequalities in bounding graph entropy. Using only Shannon’s basic inequalities (submodularity, the chain rule, etc.) yields upper bounds that are often loose, especially for graphs with non‑linear dependencies. The paper incorporates several non‑Shannon‑type inequalities discovered in recent years, most notably the Zhang‑Yeung inequality and the Dougherty‑Freiling‑Zeger family. By adding these constraints to the linear program that computes the best entropy bound, the authors obtain tighter upper bounds for certain graphs. They explicitly construct graphs for which the Shannon‑only bound is insufficient to determine H(G) and demonstrate that including non‑Shannon inequalities resolves the ambiguity, giving a strictly smaller feasible region and a more accurate entropy estimate.
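The bounding procedure described above can be phrased as a linear program over candidate entropy vectors. The following is our own sketch of that formulation (not notation taken from the paper), with the Zhang‑Yeung inequality quoted in one of its standard forms:

```latex
\begin{align*}
\text{maximize}\quad & H(X_V) \\
\text{subject to}\quad
  & H\bigl(X_i \mid X_{\mathrm{in}(i)}\bigr) = 0, \qquad H(X_i) \le 1
      && \forall i \in V, \\
  & I(X_A ; X_B \mid X_C) \ge 0
      && \text{(Shannon inequalities)}, \\
  & 2I(C;D) \le I(A;B) + I(A;C,D) + 3I(C;D \mid A) + I(C;D \mid B)
      && \text{(Zhang--Yeung, optional)}.
\end{align*}
```

Dropping the last constraint gives the Shannon‑only bound; adding it (and further non‑Shannon inequalities) shrinks the feasible region and can strictly lower the optimum.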
To validate the theory, the authors evaluate a variety of graph families: directed cycles C_n, complete bipartite graphs K_{m,m}, and randomly generated digraphs. For each case they compute (or tightly bound) the entropy, guessing number, and shortest index code length, and then translate these results back to the corresponding network coding instances. The experimental data confirm that the inequality H(G) ≥ |V| − L(G) holds with equality in many symmetric cases, and that the entropy indeed predicts whether a linear or non‑linear coding solution exists. Moreover, when non‑Shannon inequalities are employed, the resulting entropy bounds match the known network capacities, illustrating the practical relevance of these newer information‑theoretic tools.
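The equality case can be checked directly on a small symmetric instance. The sketch below (our own toy computation over the binary alphabet, not the paper’s data) brute‑forces both quantities for the directed 4‑cycle and confirms H(G) = |V| − L(G):

```python
from itertools import product
from math import log2

# Toy verification (our own computation): for the directed 4-cycle over
# the binary alphabet, brute-force the guessing number and the minrank
# (shortest linear index code length over GF(2)), then check that
# H(G) = |V| - L(G) holds with equality.

n = 4
pred = lambda i: (i - 1) % n          # vertex i sees / knows x_{i-1}

# Guessing number: each player guesses its own bit from its predecessor's.
strategies = list(product(range(2), repeat=2))   # seen bit -> guessed bit
best = max(
    sum(all(prof[i][vals[pred(i)]] == vals[i] for i in range(n))
        for vals in product(range(2), repeat=n))
    for prof in product(strategies, repeat=n)
)
g = n + log2(best / 2 ** n)           # binary guessing number

# Minrank: GF(2) matrices with 1s on the diagonal and free entries only
# in positions (i, pred(i)); the minimum rank is the linear code length.
def rank_gf2(rows):
    rows, rank = [row[:] for row in rows], 0
    for col in range(n):
        piv = next((k for k in range(rank, n) if rows[k][col]), None)
        if piv is None:
            continue
        rows[rank], rows[piv] = rows[piv], rows[rank]
        for k in range(n):
            if k != rank and rows[k][col]:
                rows[k] = [a ^ b for a, b in zip(rows[k], rows[rank])]
        rank += 1
    return rank

L = min(
    rank_gf2([[1 if j == i else (bits[i] if j == pred(i) else 0)
               for j in range(n)] for i in range(n)])
    for bits in product(range(2), repeat=n)
)

print(g, n - L)                       # both sides equal 1 for the cycle
```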
In summary, the paper provides a unified framework that (1) defines graph entropy as a private information measure, (2) proves its equivalence to the guessing number, (3) relates it to index coding via a simple lower bound, (4) shows that the entropy of a graph obtained by merging source‑destination pairs fully characterizes the solvability of the associated multiple‑unicast network, and (5) demonstrates that non‑Shannon information inequalities are essential for accurately bounding entropy in complex graphs. This synthesis not only advances the theoretical understanding of network coding but also offers concrete computational techniques for assessing the feasibility of coding schemes in real‑world communication networks.