Scale-free networks as preasymptotic regimes of superlinear preferential attachment
We study the following paradox associated with networks growing according to superlinear preferential attachment: superlinear preference cannot produce scale-free networks in the thermodynamic limit, but there are superlinearly growing network models that perfectly match the structure of some real scale-free networks, such as the Internet. We obtain an analytic solution, supported by extensive simulations, for the degree distribution in superlinearly growing networks with arbitrary average degree, and confirm that in the true thermodynamic limit these networks are indeed degenerate, i.e., almost all nodes have low degrees. We then show that superlinear growth has vast preasymptotic regimes whose depths depend both on the average degree in the network and on how superlinear the preference kernel is. We demonstrate that a superlinearly growing network model can reproduce, in its preasymptotic regime, the structure of a real network, if the model captures some sufficiently strong structural constraints – rich-club connectivity, for example. These findings suggest that real scale-free networks of finite size may exist in preasymptotic regimes of network evolution processes that lead to degenerate network formations in the thermodynamic limit.
💡 Research Summary
The paper tackles a paradox that arises in network growth models based on super‑linear preferential attachment (SLPA). Theory predicts that when the attachment probability grows faster than linearly with node degree (i.e., P(i) ∝ k_i^δ with δ > 1), the network inevitably collapses into a star‑like structure in the thermodynamic limit: almost all nodes have degree 1 (or a small constant), and only a finite number of hubs survive. Yet several SLPA‑based models, most notably the Positive‑Feedback Preference (PFP) model of Zhou and Mondragón, reproduce the degree distribution, clustering, and rich‑club connectivity observed in real‑world scale‑free networks such as the Internet.
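To make the kernel concrete, here is a minimal sketch (the function name is ours, not from the paper) of how SLPA attachment probabilities are formed from a degree sequence:

```python
def attachment_probs(degrees, delta):
    """Normalized SLPA attachment probabilities, P(i) proportional to k_i**delta."""
    weights = [k ** delta for k in degrees]
    total = sum(weights)
    return [w / total for w in weights]

# With delta = 2, a node of degree 4 is (4/2)**2 = 4 times as attractive
# as a node of degree 2, not merely twice as attractive.
probs = attachment_probs([1, 1, 2, 4], delta=2.0)
```

For δ > 1 this disproportionate advantage compounds as hubs grow, which is what ultimately drives the condensation into a star-like structure.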
The authors resolve this apparent contradiction by showing that SLPA dynamics possess extremely long pre‑asymptotic regimes. In these regimes, the network’s degree distribution mimics a power law over many orders of magnitude, even though the true asymptotic state is degenerate. The paper proceeds in four logical steps.
Asymptotic degree distribution for SLPA
For the simplest case where each new node adds a single link (m = 1), the authors recover known results: if the attachment-kernel exponent δ lies between successive critical values, 1 + 1/p < δ < 1 + 1/(p−1), the fraction of nodes with degree k ≤ p scales as N_k/N ∝ N^{(k−1)(1−δ)}, while the number of nodes with any higher degree remains finite. This yields an infinite series of "connectivity phase transitions" at δ_p = 1 + 1/p (p = 1, 2, …). When each new node adds multiple links (m ≥ 2), the degree distribution is simply shifted by m: N_k/N ∝ N^{(k−m)(1−δ)} for m ≤ k ≤ p + m − 1, and vanishes otherwise. Hence, regardless of m, the asymptotic network is a star-like object with only O(1) high-degree nodes.
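The degenerate asymptotic state is easy to see in a direct simulation. The following self-contained sketch (our own illustration, not the authors' code) grows an m = 1 SLPA network and shows that for δ > 2 nearly all nodes end up with degree 1, while a single macroscopic hub emerges:

```python
import random

def grow_slpa(n, delta, seed=42):
    """Grow a network by superlinear preferential attachment with m = 1.

    Each new node attaches one link to an existing node i chosen with
    probability proportional to k_i ** delta; returns the degree list.
    """
    rng = random.Random(seed)
    deg = [1, 1]                      # seed network: a single edge
    while len(deg) < n:
        weights = [k ** delta for k in deg]
        target = rng.choices(range(len(deg)), weights=weights)[0]
        deg[target] += 1              # the chosen existing node gains the new link
        deg.append(1)                 # the new node arrives with degree 1
    return deg

deg = grow_slpa(2000, delta=2.5)
print("max degree:", max(deg),
      "fraction of degree-1 nodes:", deg.count(1) / len(deg))
```

Rerunning with, say, delta=1.6 (the p = 2 phase) shows the intermediate regimes: the number of degree-2 nodes grows, but only subextensively, in line with N_2/N ∝ N^{1−δ}.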
Extremal growth and the “open m‑book” family
The authors introduce an extremal growth rule: a new node always connects to the m current highest-degree nodes. Under this rule the network evolves into a deterministic structure they call an "open m-book". For m = 2 the graph consists of two hub nodes (the binding) linked to a cascade of degree-2 nodes that form triangular "pages". For m = 3 the binding is a triangle and the pages become tetrahedra, and so on. By induction they derive exact degree sequences (e.g., (N−1, N−1, 2, …, 2) for m = 2, since each hub is adjacent to the other hub and to all N−2 page nodes) and show that these graphs are topologically equivalent to the open books known in algebraic topology.
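Because the extremal rule is deterministic, the open-book construction can be reproduced in a few lines. This sketch (our own, assuming oldest-first tie-breaking among equal degrees) builds the graph and recovers the m = 2 degree sequence:

```python
def open_book(n, m=2):
    """Extremal growth: each new node links to the m current
    highest-degree nodes, with ties broken by node age."""
    deg = {i: m - 1 for i in range(m)}                     # seed: complete graph K_m
    edges = [(i, j) for i in range(m) for j in range(i + 1, m)]
    for new in range(m, n):
        # m highest-degree nodes, oldest first among ties
        targets = sorted(deg, key=lambda v: (-deg[v], v))[:m]
        for t in targets:
            edges.append((t, new))
            deg[t] += 1
        deg[new] = m
    return deg, edges

deg, edges = open_book(10, m=2)
print(sorted(deg.values(), reverse=True))  # → [9, 9, 2, 2, 2, 2, 2, 2, 2, 2]
```

The two seed nodes become the binding with degree N−1 each, and every later node is a degree-2 page node, exactly the (N−1, N−1, 2, …, 2) sequence.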
Relaxing the extremal rule: pre‑asymptotic analysis
When the strict extremal rule is replaced by the probabilistic SLPA kernel, the authors estimate the probability P_N that a network of size N still retains the open-book topology. Using a mean-field approximation they obtain:
- P_N approaches a finite constant for δ > 2,
- P_N ∝ N^{−6} for δ = 2, and
- P_N decays as a stretched exponential, P_N ≈ exp(−c N^{2−δ}), for 1 < δ < 2.
Hence the open-book topology survives with finite probability only for δ > 2; for milder superlinearity it persists only through a (possibly very long) preasymptotic regime.
Matching real networks in the preasymptotic regime
Finally, the authors demonstrate that a superlinearly growing model can reproduce, within its preasymptotic regime, the structure of a real scale-free network such as the Internet, provided the model captures sufficiently strong structural constraints, rich-club connectivity for example. This supports the paper's central suggestion: real scale-free networks of finite size may exist in preasymptotic regimes of growth processes that are degenerate in the thermodynamic limit.
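The mean-field estimate of P_N can be reproduced numerically by multiplying, over the growth history, the probability that each arriving node picks both current hubs of the book. This is our own sketch under the stated mean-field assumptions (m = 2, two distinct targets drawn sequentially with probability ∝ k^δ):

```python
import math

def book_retention_prob(n, delta):
    """Mean-field P_N for m = 2: the probability that every arriving node
    attaches to both current hubs, so the open-book topology survives
    up to size n. Growth starts from a triangle (n = 3)."""
    log_p = 0.0
    for t in range(3, n):               # network size seen by the arriving node
        w_hub = (t - 1) ** delta        # each of the two hubs has degree t - 1
        w_leaf = 2.0 ** delta           # each of the t - 2 page nodes has degree 2
        total = 2 * w_hub + (t - 2) * w_leaf
        # two distinct targets drawn sequentially; both must be hubs
        p_step = 2 * (w_hub / total) * (w_hub / (total - w_hub))
        log_p += math.log(p_step)       # accumulate in log space to avoid underflow
    return math.exp(log_p)

# delta > 2: the per-step miss probability falls off as t**(1 - delta),
# the product converges, and P_N tends to a finite constant;
# delta < 2: the missing mass accumulates and P_N decays to zero.
print(book_retention_prob(200, 3.0), book_retention_prob(200, 1.5))
```

Rerunning with delta=2.0 exhibits the intermediate power-law decay: the per-step miss probability is ≈ 6/t, consistent with the N^{−6} scaling quoted above.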