Adaptive transitions in FitzHugh-Nagumo networks with Hebb-Oja coupling rules
Adaptive coupling in networks of interacting neurons has attracted recent attention owing to its many applications in both biological and artificial neural networks, where adaptive coupling, or synaptic plasticity, is considered a key factor in learning processes. In the present study, we apply adaptive connectivity rules to networks of interacting FitzHugh-Nagumo oscillators. Adaptive coupling is realized here via Hebbian learning adjusted by the Oja rule, which prevents the network link weights from growing without bound. Numerical investigations demonstrate that during the adaptation process the FitzHugh-Nagumo network undergoes adaptive transitions, realizing traveling waves, synchronized states, and chimera states of various multiplicities. These transitions become more evident when the time scales governing the coupling dynamics are much slower than those governing the nodal dynamics (nodal potentials): when the coupling time scales are slow, the network has time to pass through and display different synchronization regimes before reaching its final steady state. The transitions can be observed not only in the space-time plots but also in the abrupt changes of the average coupling weights as the network evolves in time. Regarding the asymptotic coupling distributions, we show that the limiting average coupling strength follows an inverse power law with respect to the Oja parameter (also called the "forgetting" parameter), which balances the learning growth. We also report abrupt transitions in the asymptotic coupling strengths when the parameter controlling the adaptation speed crosses from fast to slow time scales. These findings are in line with previous studies on spiking neural networks.
💡 Research Summary
The paper investigates how adaptive synaptic coupling, implemented via a Hebbian learning rule corrected by Oja’s forgetting term, shapes the collective dynamics of a ring network of FitzHugh‑Nagumo (FHN) oscillators. Each node is described by the classic two‑variable FHN equations (membrane potential u and recovery variable v) with a small time‑scale separation (ε = 0.01) and a bias term γ = 0.5 that places the uncoupled units in an oscillatory regime. Nodes are coupled to 2R nearest neighbours (periodic boundary conditions) and the instantaneous coupling strength σ_jk(t) evolves according to
τ_σ · dσ_jk/dt = u_j u_k − α u_j² σ_jk,
where the first term implements Hebb's "fire-together-wire-together" principle and the second term, introduced by Oja, prevents unbounded growth by providing a decay proportional to the squared post-synaptic activity. The product σ_c σ_jk(t), with σ_c a constant base coupling strength, defines the effective coupling weight σ_eff,jk(t).
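As a concrete illustration, the node and coupling equations above can be integrated together with a simple Euler scheme. The sketch below is a hedged reconstruction, not the authors' code: the cubic FHN nullcline form, the diffusive shape of the coupling term, the value of σ_c, and all numerical settings are assumptions, while ε = 0.01, γ = 0.5, the 2R-neighbour ring, and the Hebb-Oja rule follow the summary.

```python
import numpy as np

# Minimal Euler sketch of a ring of FitzHugh-Nagumo units with Hebb-Oja
# adaptive coupling. The diffusive coupling form, sigma_c, and all numerical
# settings are assumptions made for illustration.

N, R = 100, 10                 # nodes; each couples to its 2R nearest neighbours
eps, gamma = 0.01, 0.5         # time-scale separation and bias term (from summary)
sigma_c = 0.1                  # constant base coupling (assumed value)
tau_sigma, alpha = 100.0, 1.0  # adaptation time constant and Oja parameter
dt, steps = 1e-3, 20000

rng = np.random.default_rng(0)
u = rng.uniform(-1.0, 1.0, N)  # membrane potentials
v = rng.uniform(-1.0, 1.0, N)  # recovery variables

offsets = np.concatenate([np.arange(-R, 0), np.arange(1, R + 1)])
nb_idx = (np.arange(N)[:, None] + offsets) % N   # ring neighbour indices
sigma = np.zeros((N, 2 * R))                     # adaptive weights sigma_jk

for _ in range(steps):
    u_nb = u[nb_idx]
    # assumed diffusive coupling, weighted by the adaptive strengths
    coupling = sigma_c * np.mean(sigma * (u_nb - u[:, None]), axis=1)
    du = (u - u**3 / 3.0 - v + coupling) / eps
    dv = u + gamma
    # Hebb-Oja rule: tau_sigma * d(sigma_jk)/dt = u_j u_k - alpha u_j^2 sigma_jk
    dsigma = (u[:, None] * u_nb - alpha * (u**2)[:, None] * sigma) / tau_sigma
    u, v = u + dt * du, v + dt * dv
    sigma = sigma + dt * dsigma
```

With a large τ_σ, as here, the weights drift upward gradually, which is the slow-adaptation regime in which the transient patterns described below have time to appear.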
The authors perform extensive numerical simulations varying two key parameters: the adaptation time constant τ_σ (which controls how fast the synaptic weights evolve relative to the neuronal dynamics) and the Oja forgetting parameter α (which balances learning and decay). For each simulation they monitor three observables: (i) the Kuramoto order parameter z(t) computed from the instantaneous phase θ_j = arctan(v_j/u_j), (ii) the network‑averaged effective coupling ⟨σ_eff⟩(t), and (iii) the spatial variance D_σ(t) of the coupling strengths.
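All three observables follow directly from the state arrays. The sketch below is illustrative (the variable names `u`, `v`, `sigma`, `sigma_c` are assumptions, not the paper's code); the four-quadrant `arctan2` is used so that θ_j = arctan(v_j/u_j) is well defined in every quadrant.

```python
import numpy as np

def order_parameter(u, v):
    """Kuramoto order parameter z from phases theta_j = arctan(v_j/u_j)."""
    theta = np.arctan2(v, u)            # four-quadrant arctangent of v_j/u_j
    return np.abs(np.mean(np.exp(1j * theta)))

def mean_effective_coupling(sigma, sigma_c):
    """Network-averaged effective coupling <sigma_eff> = sigma_c * <sigma_jk>."""
    return sigma_c * np.mean(sigma)

def coupling_variance(sigma, sigma_c):
    """Spatial variance D_sigma of the effective coupling strengths."""
    return np.var(sigma_c * sigma)

# toy usage: phases spread uniformly around the circle give z close to 0
phi = np.linspace(0, 2 * np.pi, 50, endpoint=False)
print(order_parameter(np.cos(phi), np.sin(phi)))
```

In a full simulation these would be evaluated at each output step to produce the time traces z(t), ⟨σ_eff⟩(t), and D_σ(t).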
When τ_σ is large (slow adaptation), the system exhibits a cascade of distinct dynamical regimes as the weights gradually increase. Initially, a traveling‑wave pattern propagates around the ring. As ⟨σ_eff⟩ grows, the wave destabilizes and multi‑chimera states appear, characterized by coexisting coherent and incoherent domains; the number of incoherent domains (chimera multiplicity) depends on the instantaneous coupling strength and can change during the evolution. Eventually, for sufficiently large ⟨σ_eff⟩, the network settles into a fully synchronized state (z ≈ 1). Each transition is marked by a sharp jump in z(t) and a pronounced peak in D_σ(t), indicating that spatial heterogeneity of the synaptic weights is maximal at the moment a new pattern emerges.
If τ_σ is reduced to intermediate values, the same sequence occurs but over a much shorter time window; the system may skip the traveling‑wave stage and jump directly from a low‑coherence state to a chimera, then to synchronization. When τ_σ is very small (fast adaptation), the weights converge almost instantly to their steady‑state values determined by α, and the network typically jumps straight to the final attractor (often full synchrony) without displaying intermediate patterns. This demonstrates that the relative speed of synaptic plasticity versus neuronal dynamics is a decisive factor for the emergence of transient collective states.
The dependence on the Oja parameter α is equally revealing. By scanning α over several orders of magnitude, the authors find that the asymptotic average coupling follows an inverse power law ⟨σ_eff⟩ ∝ α^−β with β ≈ 1. Small α (weak forgetting) allows the Hebbian term to dominate, leading to large final weights and a richer repertoire of intermediate states. Large α (strong forgetting) suppresses weight growth, yielding weak coupling and often preventing the formation of chimera patterns. Moreover, a critical α exists where the system experiences an abrupt change in ⟨σ_eff⟩ and in the dynamical regime, reminiscent of a phase transition.
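An exponent near 1 is consistent with the fixed point of the Hebb-Oja rule itself: setting dσ_jk/dt = 0 in the coupling equation gives σ_jk* = u_j u_k / (α u_j²) = u_k / (α u_j), so the asymptotic weights scale as 1/α. In practice the exponent is estimated by a linear fit in log-log coordinates; the short sketch below uses synthetic data (with β = 1 built in) standing in for the measured asymptotic couplings.

```python
import numpy as np

# Estimating the power-law exponent beta in <sigma_eff> ~ alpha^(-beta)
# via a least-squares line fit in log-log coordinates. The data here are
# synthetic placeholders, not the paper's measurements.

alphas = np.logspace(-2, 1, 12)   # Oja parameters spanning three decades
sigma_inf = 0.8 / alphas          # illustrative asymptotic couplings, beta = 1

slope, intercept = np.polyfit(np.log(alphas), np.log(sigma_inf), 1)
beta = -slope
print(f"fitted exponent beta ~ {beta:.2f}")
```

For real simulation data the fit would be applied to the long-time averages of ⟨σ_eff⟩ obtained at each α.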
Overall, the study provides three major insights: (1) Hebb‑Oja adaptive coupling can drive a homogeneous FHN network through a hierarchy of spatiotemporal patterns—traveling waves, multi‑chimera states, and full synchrony—provided that synaptic plasticity evolves on a slower time scale than the neuronal dynamics; (2) the adaptation time constant τ_σ and the forgetting parameter α jointly control whether such intermediate regimes appear, how long they persist, and what final coupling strength is reached; (3) the steady‑state coupling obeys a robust inverse‑power scaling with α, confirming the theoretical expectation that Oja’s rule regularizes Hebbian growth.
These findings have direct relevance for neuroscience, where synaptic plasticity operates on time scales slower than membrane dynamics, and for artificial neural networks that incorporate online learning rules. The paper suggests that deliberately tuning the speed of weight updates relative to neuron updates could be a powerful tool to engineer desired transient dynamics, such as controlled wave propagation or temporary coexistence of coherent and incoherent activity. Future work could extend the analysis to higher‑dimensional lattices, heterogeneous node parameters, or external stimuli, thereby exploring how adaptive coupling interacts with more realistic brain‑like architectures.