Adaptive self-organization in a realistic neural network model
Information processing in complex systems is often found to be maximally efficient close to critical states associated with phase transitions. It is therefore conceivable that neural information processing also operates close to criticality. This is further supported by the observation of power-law distributions, which are a hallmark of phase transitions. An important open question is how neural networks could remain close to a critical point while undergoing continual change in the course of development, adaptation, and learning. An influential contribution was made by Bornholdt and Rohlf, who introduced a generic mechanism of robust self-organized criticality in adaptive networks. Here, we address the question of whether this mechanism is relevant for real neural networks. We show in a realistic model that spike-time-dependent synaptic plasticity can self-organize neural networks robustly toward criticality. Our model reproduces several empirical observations and makes testable predictions on the distribution of synaptic strengths, relating them to the critical state of the network. These results suggest that the interplay between dynamics and topology may be essential for neural information processing.
💡 Research Summary
The paper investigates whether neural circuits can maintain a critical state—a regime associated with optimal information processing—through intrinsic adaptive mechanisms rather than external fine‑tuning. Building on the generic self‑organized criticality (SOC) framework introduced by Bornholdt and Rohlf, the authors embed biologically realistic components: leaky‑integrate‑and‑fire (LIF) neurons, spike‑time‑dependent plasticity (STDP), and a weight‑driven rewiring rule. In the model, each neuron emits spikes when its membrane potential crosses a threshold; the timing of pre‑ and post‑synaptic spikes determines whether a synapse is potentiated or depressed according to an exponential STDP curve calibrated to experimental data. Synaptic weights that fall below a low‑weight cutoff are pruned, while new connections are probabilistically formed when the overall network activity warrants additional links. This dynamic rewiring allows the topology (average degree, connectivity pattern) to evolve together with the synaptic strength distribution.
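The coupled STDP-and-rewiring loop described above can be illustrated with a minimal sketch. The exponential STDP window and the prune-then-grow rewiring rule follow the general scheme summarized here; all numerical parameters (time constant, amplitudes, pruning cutoff, target activity) are illustrative assumptions, not the paper's calibrated values.

```python
import numpy as np

TAU_STDP = 20.0               # STDP time constant in ms (assumed)
A_PLUS, A_MINUS = 0.01, 0.012  # potentiation/depression amplitudes (assumed)
W_CUTOFF = 1e-3               # low-weight pruning threshold (assumed)

def stdp_update(w, dt):
    """Exponential STDP curve. dt = t_post - t_pre in ms:
    pre-before-post (dt > 0) potentiates, post-before-pre depresses."""
    if dt > 0:
        return w + A_PLUS * np.exp(-dt / TAU_STDP)
    return w - A_MINUS * np.exp(dt / TAU_STDP)

def rewire(weights, activity, target_activity=0.1, n_neurons=100,
           rng=np.random.default_rng(0)):
    """Prune synapses whose weight fell below the cutoff; when network
    activity is below target, probabilistically add a new random link."""
    weights = {k: w for k, w in weights.items() if w >= W_CUTOFF}
    if activity < target_activity:
        i, j = rng.integers(0, n_neurons, size=2)
        weights.setdefault((int(i), int(j)), 2 * W_CUTOFF)
    return weights
```

In this sketch the topology (which synapses exist) and the weight distribution evolve together, as in the model: activity drives STDP, STDP drives weights toward or below the cutoff, and pruning/growth reshapes the connectivity that generates future activity.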
Extensive simulations show that, irrespective of initial connectivity, weight distribution, or external drive, the network converges to a regime characterized by hallmark signatures of criticality. The size distribution of avalanche‑like firing clusters follows a power‑law P(s) ∝ s^–τ with τ matching values reported in cortical recordings. The power spectrum of the global voltage signal exhibits a 1/f^α scaling (α ≈ 0.8–1.2), indicating long‑range temporal correlations. Synaptic strengths settle into a log‑normal distribution, reproducing the experimentally observed coexistence of a few strong “backbone” synapses and many weak connections. Importantly, when the external input is abruptly altered, the system temporarily departs from the critical point but rapidly re‑stabilizes through the coupled STDP‑rewiring feedback, demonstrating robustness to environmental changes.
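The avalanche analysis described above can be sketched as follows: avalanches are extracted as runs of consecutive time bins with nonzero activity, and the exponent τ of P(s) ∝ s^(−τ) is estimated by maximum likelihood. The binning convention and the continuous MLE form are standard practice for this kind of analysis, not necessarily this paper's exact pipeline.

```python
import numpy as np

def avalanche_sizes(spike_counts):
    """An avalanche is a run of consecutive bins with nonzero activity;
    its size is the total spike count over the run."""
    sizes, current = [], 0
    for n in spike_counts:
        if n > 0:
            current += n
        elif current > 0:
            sizes.append(current)
            current = 0
    if current > 0:
        sizes.append(current)
    return sizes

def powerlaw_exponent(sizes, s_min=1):
    """Continuous maximum-likelihood estimate of tau for P(s) ~ s^-tau,
    using only sizes >= s_min."""
    s = np.asarray([x for x in sizes if x >= s_min], dtype=float)
    return 1.0 + len(s) / np.sum(np.log(s / s_min))
```

For cortical data, τ is typically reported near 1.5 for avalanche sizes; a discrete estimator and goodness-of-fit testing (e.g. via the `powerlaw` package) would be used in a full analysis.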
The authors argue that the interplay between neural dynamics (spiking activity) and network topology creates a self‑reinforcing feedback loop: activity reshapes connectivity, and the updated connectivity in turn shapes activity. This loop eliminates the need for external parameter tuning and provides a plausible mechanistic explanation for how brains could stay near criticality throughout development, adaptation, and learning. The model also generates testable predictions: (1) the distribution of synaptic weights should be log‑normal in networks operating near criticality, and (2) perturbations that disrupt the STDP‑rewiring balance should shift the avalanche exponent away from the critical value. These predictions can be examined with in‑vivo imaging of synaptic strengths and electrophysiological recordings of avalanche statistics.
In the discussion, the paper contrasts its approach with earlier SOC models that assume static topology, emphasizing that only by allowing the network structure to co‑evolve with the dynamics can true self‑organization be achieved. The authors suggest that deviations from the predicted critical signatures could underlie neurological disorders such as epilepsy (supercritical) or neurodegeneration (subcritical), opening avenues for therapeutic strategies that target plasticity mechanisms.
In summary, the study demonstrates that a realistic neural network endowed with STDP and activity‑dependent rewiring self‑organizes to a critical state, reproducing several empirical observations and offering concrete experimental predictions. This work supports the hypothesis that the brain’s remarkable computational efficiency arises from an intrinsic, dynamically maintained criticality driven by the mutual adaptation of dynamics and topology.