Dynamical synapses causing self-organized criticality in neural networks
We show that a network of spiking neurons exhibits robust self-organized criticality if the synaptic efficacies follow realistic dynamics. Deriving analytical expressions for the average coupling strengths and inter-spike intervals, we demonstrate that networks with dynamical synapses exhibit critical avalanche dynamics for a wide range of interaction parameters. We prove that in the thermodynamic limit the network becomes critical for all large enough coupling parameters. We thereby explain experimental observations in which cortical neurons show avalanche activity with the total intensity of firing events distributed as a power law.
💡 Research Summary
The paper investigates how realistic synaptic dynamics can endow a spiking neural network with robust self‑organized criticality (SOC). Traditional SOC models of neuronal avalanches require fine‑tuning of a global control parameter to sit at a critical point, which is at odds with experimental observations of scale‑free avalanche activity in cortical tissue that persists without external adjustment. Levina, Herrmann, and Geisel propose a minimal model in which synaptic efficacies are not static but evolve according to a biologically plausible use‑dependent depression rule: each presynaptic spike depletes a fraction u of the available neurotransmitter, reducing the synaptic weight Jᵢⱼ, while in the absence of spikes the synapse recovers exponentially toward a maximal value α/u with a recovery time constant τ_J.
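The depression-and-recovery rule described above can be sketched in a few lines of Python. This is a minimal illustration, not the paper's code; the parameter values are illustrative assumptions:

```python
import math

def recover(J, dt, alpha, u, tau_J):
    """Exponential recovery of the efficacy toward the maximum alpha/u
    over a silent interval dt (closed-form solution of the relaxation)."""
    J_max = alpha / u
    return J_max + (J - J_max) * math.exp(-dt / tau_J)

def on_spike(J, u):
    """A presynaptic spike uses up a fraction u of the available resources."""
    return (1.0 - u) * J

# Illustrative parameter values (not taken from the paper):
alpha, u, tau_J = 1.0, 0.2, 10.0
J = 0.5
J = recover(J, dt=5.0, alpha=alpha, u=u, tau_J=tau_J)  # partial recovery toward alpha/u = 5.0
J = on_spike(J, u)                                     # depression at a spike
```

The two operations pull in opposite directions: frequent spiking depresses J, silence lets it recover, and the balance between them is what the mean-field analysis below exploits.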
The network consists of N all‑to‑all coupled integrate‑and‑fire neurons with threshold θ = 1. External input arrives as a Poisson‑like process that selects a neuron at random and injects a constant current I_ext. When a neuron’s membrane potential exceeds threshold it fires, delivering its spike to all postsynaptic neurons after a fixed delay τ_d; the membrane potential is then reset by subtracting θ. The authors assume a separation of timescales, with the delay τ_d much shorter than the average inter‑spike interval ⟨Δτ⟩, which allows them to treat avalanches as isolated events.
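Under these assumptions a single avalanche can be sketched as a cascade of threshold crossings. The sketch below uses static, uniform couplings J/N for clarity (the model's weights are of course dynamic, and the 1/N coupling scale is an assumption of this illustration):

```python
import random

def avalanche_size(h, J, theta=1.0):
    """Total number of firings in one avalanche.  h: membrane potentials,
    with at least one neuron already at threshold.  Each firing neuron
    resets by subtracting theta and adds J/N to every other neuron; the
    delayed delivery is collapsed into discrete waves."""
    N = len(h)
    h = list(h)
    size = 0
    while True:
        spikers = [i for i, v in enumerate(h) if v >= theta]
        if not spikers:
            return size
        size += len(spikers)
        for i in spikers:
            h[i] -= theta              # reset by subtraction, as in the model
            for j in range(N):
                if j != i:
                    h[j] += J / N      # spike delivered to all other neurons

random.seed(0)
potentials = [random.uniform(0.0, 0.9) for _ in range(50)]
potentials[0] = 1.0                    # external input triggers the avalanche
size = avalanche_size(potentials, J=0.8)
```

Because each firing removes θ from the total potential but injects only (N−1)·J/N < θ for subcritical J, the cascade is guaranteed to terminate; for u⟨J⟩ near 1 the branching ratio approaches unity, which is where scale-free avalanches appear.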
Using a mean‑field approximation they derive self‑consistency equations for two macroscopic quantities: the average synaptic strength ⟨J⟩ and the average inter‑spike interval ⟨Δτ⟩. The key relation (Eq. 3) couples ⟨J⟩ and ⟨Δτ⟩ through the stationary conditions of the synaptic dynamics and the neuronal firing statistics. Graphical solution of the coupled equations (Fig. 4) shows a unique fixed point for any α > 0. Importantly, when α approaches 1 the fixed point satisfies u⟨J⟩ ≈ 1, which is precisely the critical connectivity condition known from static‑synapse models.
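As a plausibility check (our own reconstruction from the dynamics stated above, not the paper's exact Eq. 3), assume perfectly periodic firing with interval ⟨Δτ⟩ and require that one depression step followed by one recovery interval maps J onto itself:

```latex
% Between spikes: J(t) = \alpha/u + (J_0 - \alpha/u)\, e^{-t/\tau_J}
% At a spike:     J \to (1-u)\, J
% Fixed point over one period \langle\Delta\tau\rangle:
J^{*} \;=\; \frac{\alpha}{u}\,
  \frac{1 - e^{-\langle\Delta\tau\rangle/\tau_J}}
       {1 - (1-u)\, e^{-\langle\Delta\tau\rangle/\tau_J}}
```

In the slow-firing limit ⟨Δτ⟩ ≫ τ_J this gives J* → α/u (full recovery), while rapid firing drives J* → 0: exactly the negative feedback between activity and coupling strength that lets the network self-tune.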
Extensive simulations confirm the analytical predictions. For α < 1 the avalanche size distribution is sub‑critical (exponential decay), for α ≈ 1 the distribution follows a power law P(L) ∝ L^−3/2 over several decades, and for α > 1 the system becomes super‑critical, producing system‑spanning avalanches. The power‑law regime widens with increasing system size N, as shown in Fig. 2, indicating that the critical state is not a finite‑size artifact.
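One standard way to check such an exponent numerically is a maximum-likelihood fit of the tail. The sketch below applies a continuous-approximation MLE (Hill/Clauset-style estimator) to synthetic sizes drawn with a known exponent of 3/2; the data, cutoff, and sample count are illustrative, not the paper's simulations:

```python
import math
import random

def powerlaw_mle(samples, xmin=1.0):
    """Continuous-approximation maximum-likelihood estimate of a in
    P(x) ~ x^(-a) for x >= xmin."""
    logs = [math.log(x / xmin) for x in samples if x >= xmin]
    return 1.0 + len(logs) / sum(logs)

# Synthetic "avalanche sizes" with true exponent 3/2, drawn by inverting
# the cumulative distribution of a Pareto law on [1, inf):
random.seed(1)
a_true = 1.5
sizes = [(1.0 - random.random()) ** (-1.0 / (a_true - 1.0)) for _ in range(20_000)]
a_hat = powerlaw_mle(sizes)   # should land close to 1.5
```

In practice a careful fit would also estimate xmin and account for the finite-size cutoff at L ≈ N visible in the paper's distributions.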
The authors then examine the thermodynamic limit N → ∞. By scaling the external drive as I_ext ∼ N^−w (w > 0) they analyze four asymptotic regimes for ⟨Δτ⟩. In all cases where a solution exists, the product u⟨J⟩ tends to 1, meaning that the network self‑tunes to the critical point for any α ≥ 1. Thus, in the infinite‑size limit the model exhibits true parameter‑independent SOC.

To test robustness, they add a leak term (time constant τ_l) to the membrane equation together with a compensatory constant current C that depends on τ_l, mimicking homeostatic mechanisms. Numerical results show that the avalanche size distribution remains a power law for leak time constants up to ≈ 40 ms, with the exponent shifting within roughly −1.2 to −2 while scale‑free behavior is preserved.
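The leaky variant amounts to adding a decay term and the compensatory current to the membrane update. A one-line Euler discretization makes the balance explicit (the choice of C and all values here are illustrative assumptions; the paper ties C to τ_l):

```python
def leaky_step(V, dt, tau_l, C, I_in=0.0):
    """One Euler step of a leaky membrane dV/dt = -V/tau_l + C + I_in.
    The compensatory constant current C offsets the leak so that the
    working point of the network is preserved."""
    return V + dt * (-V / tau_l + C + I_in)

# Without spike input the potential settles at the balance point C * tau_l:
V = 0.0
for _ in range(100_000):
    V = leaky_step(V, dt=0.01, tau_l=20.0, C=0.02)
```

The fixed point V = C·τ_l shows why C must grow as τ_l shrinks: a stronger leak needs a stronger compensatory drive to keep neurons within reach of threshold.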
The paper also discusses extensions to partially connected networks, random connectivity with probability c, and small‑world topologies, finding that the critical region either persists or becomes even broader when connections are heterogeneous.
In summary, the study demonstrates that activity‑dependent synaptic depression alone can generate a self‑regulating feedback loop that drives a neural network toward a critical state without external fine‑tuning. The analytical mean‑field framework accurately predicts the macroscopic behavior, and the results align with experimental observations of neuronal avalanches in cortical slices and cultures. This work provides a compelling mechanistic explanation for the emergence of criticality in the brain and highlights the central role of dynamic synapses in maintaining optimal computational regimes.