The Shannon and the Von Neumann entropy of random networks with heterogeneous expected degree


Entropic measures of complexity are able to quantify the information encoded in complex network structures. Several entropic measures have been proposed in this respect. Here we study the relation between the Shannon entropy and the Von Neumann entropy of networks with a given expected degree sequence. We find in different examples of network topologies that when the degree distribution contains some heterogeneity, an intriguing correlation emerges between the two entropies. This result suggests that this kind of heterogeneity implies an equivalence between a quantum and a classical description of networks, which respectively correspond to the Von Neumann and the Shannon entropy.


💡 Research Summary

The paper investigates the relationship between two fundamental measures of network complexity: the classical Shannon entropy and the quantum‑inspired von Neumann entropy. Both quantities are evaluated on ensembles of random graphs whose expected degree sequence is prescribed, a setting that captures the heterogeneity observed in many real‑world networks while remaining analytically tractable.

Model and Entropy Definitions
The authors adopt the Chung‑Lu (or generalized configuration) model, where each node $i$ is assigned an expected degree $k_i^{\text{exp}}$. An edge between $i$ and $j$ is placed independently with probability $p_{ij}=k_i^{\text{exp}}k_j^{\text{exp}}/\sum_\ell k_\ell^{\text{exp}}$. This construction guarantees that the ensemble's expected degree sequence matches the prescribed one, allowing the authors to control the degree heterogeneity directly.
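A minimal sketch of drawing one graph from this ensemble (the helper name `chung_lu_adjacency` and the cap of $p_{ij}$ at 1 are our additions, not taken from the paper):

```python
import numpy as np

def chung_lu_adjacency(expected_degrees, rng=None):
    """Sample one adjacency matrix from the Chung-Lu ensemble.

    Edge (i, j) is placed independently with probability
    p_ij = k_i k_j / sum_l k_l, capped at 1 (the cap only matters
    when the largest degrees violate k_max^2 < sum_l k_l).
    """
    rng = np.random.default_rng(rng)
    k = np.asarray(expected_degrees, dtype=float)
    p = np.minimum(np.outer(k, k) / k.sum(), 1.0)
    np.fill_diagonal(p, 0.0)                     # no self-loops
    upper = np.triu(rng.random(p.shape) < p, 1)  # decide each pair once
    return (upper | upper.T).astype(int)         # symmetrize
```

The returned matrix is symmetric with a zero diagonal, and its row sums fluctuate around the prescribed expected degrees.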

Shannon entropy $S_{\text{Sh}}$ is defined in the usual information‑theoretic way as the ensemble average of the negative log‑probability of a graph, $S_{\text{Sh}} = -\sum_G P(G)\log P(G)$. Because edges are placed independently, the total entropy decomposes into a sum over edge probabilities, leading to a closed‑form approximation that depends on the first two moments of the degree distribution, $\langle k\rangle$ and $\langle k^2\rangle$.
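Because each pair $(i,j)$ is an independent Bernoulli variable, the decomposition is simply a sum of binary entropies over node pairs. A sketch (the function name and the use of natural logarithms are our choices):

```python
import numpy as np

def ensemble_shannon_entropy(p):
    """Shannon entropy (in nats) of an independent-edge ensemble with
    edge-probability matrix p: the sum over pairs i < j of the binary
    entropy -p log p - (1 - p) log(1 - p)."""
    q = p[np.triu_indices_from(p, k=1)]
    with np.errstate(divide="ignore", invalid="ignore"):
        h = -(q * np.log(q) + (1 - q) * np.log(1 - q))
    return float(np.nansum(h))  # NaNs arise only from 0*log(0), taken as 0
```

For example, four nodes with every $p_{ij}=1/2$ give $6\log 2$, the entropy of six fair coin flips.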

Von Neumann entropy $S_{\text{VN}}$ is constructed from the graph Laplacian $L$. The Laplacian is normalized to a density matrix $\rho = L/\mathrm{Tr}(L)$, whose eigenvalues $\{\lambda_i\}$ form a probability distribution. The entropy is then $S_{\text{VN}} = -\sum_i \lambda_i \log \lambda_i$. Since the Laplacian spectrum is highly sensitive to degree heterogeneity, $S_{\text{VN}}$ captures a different, spectral aspect of network complexity.
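A direct sketch of this construction (names are ours; entropy in nats, using the combinatorial Laplacian $L = D - A$):

```python
import numpy as np

def von_neumann_entropy(A):
    """Von Neumann entropy of a graph with adjacency matrix A:
    rho = L / Tr(L) with L = D - A, S = -sum_i lambda_i log lambda_i."""
    L = np.diag(A.sum(axis=1)) - A
    lam = np.linalg.eigvalsh(L / np.trace(L))
    lam = lam[lam > 1e-12]  # drop (numerically) zero modes: 0 log 0 = 0
    return float(-(lam * np.log(lam)).sum())
```

As a sanity check, the complete graph $K_n$ has density‑matrix eigenvalues $0$ and $1/(n-1)$ (with multiplicity $n-1$), so its entropy is $\log(n-1)$.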

Experimental Design
To explore how degree heterogeneity influences the two entropies, the authors generate ensembles with three representative degree distributions:

  1. Poisson (approximately regular) – low variance, serving as a baseline.
  2. Exponential – moderate heterogeneity.
  3. Power‑law (scale‑free) with exponent $\gamma=2.5$ – strong heterogeneity.

For each case they simulate networks with sizes ranging from $N=10^4$ to $10^5$ nodes, sampling 500–1000 realizations per parameter set. Both entropies are computed for every realization, and statistical correlations are examined.
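A toy version of this pipeline is sketched below. As the per‑realization classical quantity we use the graph surprisal $-\log P(G)$, whose ensemble average is $S_{\text{Sh}}$; the paper's exact per‑realization estimator may differ, and the sizes here are kept far smaller than the paper's for speed:

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_real, gamma = 120, 40, 2.5

# Power-law expected degrees via inverse-transform sampling (illustrative)
k = 2.0 * (1.0 - rng.random(N)) ** (-1.0 / (gamma - 1.0))
p = np.clip(np.outer(k, k) / k.sum(), 1e-12, 1 - 1e-12)  # keep logs finite
np.fill_diagonal(p, 0.0)
iu = np.triu_indices(N, 1)

s_sh, s_vn = [], []
for _ in range(n_real):
    upper = np.triu(rng.random((N, N)) < p, 1)
    A = (upper | upper.T).astype(float)
    # Classical side: surprisal -log P(G) of this realization
    a, q = A[iu], p[iu]
    s_sh.append(float(-(a * np.log(q) + (1 - a) * np.log(1 - q)).sum()))
    # Quantum side: von Neumann entropy of rho = L / Tr(L)
    L = np.diag(A.sum(axis=1)) - A
    lam = np.linalg.eigvalsh(L / np.trace(L))
    lam = lam[lam > 1e-12]
    s_vn.append(float(-(lam * np.log(lam)).sum()))

r = np.corrcoef(s_sh, s_vn)[0, 1]
print(f"Pearson r over {n_real} realizations: {r:.3f}")
```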

Key Findings

  • Correlation Increases with Heterogeneity – In the Poisson regime the Pearson correlation between $S_{\text{Sh}}$ and $S_{\text{VN}}$ is modest (≈0.3). For exponential degrees it rises to ≈0.65, and for the power‑law case it exceeds 0.9, essentially forming a linear relationship $S_{\text{VN}} \approx a\,S_{\text{Sh}} + b$ with $a\approx0.98$ and $b\approx0.12$.

  • Spectral Interpretation – The authors derive an analytical approximation linking the Laplacian eigenvalue variance to the second moment of the degree distribution: $\mathrm{Var}(\lambda) \approx \langle k^2\rangle - \langle k\rangle^2$. Consequently, a larger $\langle k^2\rangle$ (i.e., more heterogeneous degrees) expands the eigenvalue spread, raising $S_{\text{VN}}$. Simultaneously, the entropy of the degree distribution itself, $-\sum_k p(k)\log p(k)$, also grows with $\langle k^2\rangle$. This dual dependence explains why the two entropies track each other when heterogeneity is strong.

  • Implication of “Classical‑Quantum Equivalence” – The near‑one‑to‑one mapping observed for highly heterogeneous networks suggests that, under such conditions, a classical description (Shannon) and a quantum‑like description (von Neumann) encode essentially the same amount of structural information. This equivalence does not hold for near‑regular graphs, where the spectral signature is too narrow to reflect the combinatorial richness captured by Shannon entropy.
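The variance approximation above can be traced to two exact identities of the combinatorial Laplacian $L = D - A$: $\mathrm{Tr}(L) = \sum_i k_i$ and $\mathrm{Tr}(L^2) = \sum_i k_i^2 + \sum_i k_i$, so the eigenvalue variance is exactly $\langle k^2\rangle + \langle k\rangle - \langle k\rangle^2$, and the $\langle k\rangle$ term becomes negligible precisely when the degree spread dominates the mean, i.e., in the heterogeneous regime. A quick numerical check of the identities (our illustration, not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 6
upper = np.triu(rng.random((N, N)) < 0.5, 1)
A = (upper | upper.T).astype(float)  # a random simple graph
k = A.sum(axis=1)
L = np.diag(k) - A
lam = np.linalg.eigvalsh(L)

# Exact for any simple graph: sum(lam) = sum(k), sum(lam^2) = sum(k^2) + sum(k)
assert np.isclose(lam.sum(), k.sum())
assert np.isclose((lam ** 2).sum(), (k ** 2).sum() + k.sum())
```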

Broader Significance
The study bridges two traditionally separate domains: classical information theory and quantum information theory applied to complex networks. Practically, it offers a shortcut: when a network exhibits a broad degree distribution (as many social, biological, and technological systems do), the easier‑to‑compute Shannon entropy can serve as a reliable proxy for the more computationally intensive von Neumann entropy. This could be valuable for tasks such as network design, robustness assessment, and the evaluation of quantum communication architectures that rely on underlying graph structures.

Future Directions
The authors propose extending the analysis to dynamic processes (e.g., epidemic spreading, synchronization) to see whether the entropy correlation persists under temporal evolution. Another promising avenue is the study of multilayer or temporal networks, where each layer may possess its own degree heterogeneity, potentially leading to richer cross‑entropy relationships. Finally, exploring analytical bounds that formalize the observed linear relationship could deepen our theoretical understanding of when and why classical‑quantum equivalence emerges in complex systems.

