A noise-driven attractor switching device
Problems with artificial neural networks stem from their deterministic nature and their unavoidable dependence on prior learning, which leave them poorly able to adapt to unpredictable, abrupt environmental change. Here we show that a stochastically excitable threshold unit can help such systems partially overcome environmental change. Using an excitable threshold system, attractors were created that represent quasi-equilibrium states into which the system settles until disrupted by environmental change. Noise-driven attractor stabilization and switching were then realized through inhibitory connections. Noise acts as a power source that stabilizes and switches attractors, and it endows the system with hysteresis resembling that of stereopsis and binocular rivalry in the human visual cortex. A canonical model of a ring network with inhibitory connections, composed of class 1 neurons, also shows properties similar to those of the simple threshold system.
💡 Research Summary
The paper addresses a fundamental limitation of conventional artificial neural networks (ANNs): their deterministic nature and reliance on prior learning make them poorly suited to abrupt, unpredictable environmental changes. To mitigate this rigidity, the authors introduce a stochastic, excitable threshold unit as a building block for adaptive networks. Each unit behaves like a binary switch that fires when its input exceeds a fixed threshold, with the firing dynamics modulated by external or internal noise. A network of several such units can host multiple attractors: quasi-stable states that the network settles into under steady conditions.
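A minimal sketch of such a unit, assuming additive Gaussian noise; the threshold `theta`, noise scale `sigma`, and sub-threshold drive are illustrative values, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def threshold_unit(x, theta=1.0, sigma=0.3):
    """Binary excitable unit: fires (returns 1) when the input plus
    Gaussian noise exceeds the threshold theta."""
    return 1 if x + sigma * rng.standard_normal() > theta else 0

# With a constant sub-threshold drive, every firing event is noise-induced.
drive = 0.8
rate = np.mean([threshold_unit(drive) for _ in range(10_000)])
print(f"noise-induced firing rate: {rate:.3f}")
```

With `sigma = 0` this unit is silent forever under the sub-threshold drive, so any firing in the stochastic case is attributable entirely to noise.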
A key innovation is the use of inhibitory connections to control attractor stability and transitions. Inhibitory neurons receive the output of the currently dominant attractor and suppress it, thereby lowering the barrier for the system to move into a different attractor. Noise plays a dual role: it perturbs the inhibitory pathway, temporarily weakening suppression, and it directly drives the inhibitory neurons, providing the energy needed to push the network over the transition barrier. Consequently, the probability of switching between attractors becomes a function of noise intensity and inhibitory strength.
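A one-variable caricature of this mechanism (a toy construction for illustration, not the paper's model): an inhibitory feedback variable `s` accumulates on the dominant attractor but saturates just below the switching barrier, so only noise can supply the final push. All names and values here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

def switch_rate(w_inh=1.0, barrier=1.1, sigma=0.15, tau=20.0,
                dt=0.1, steps=200_000):
    """Inhibition s builds up on the dominant attractor toward w_inh;
    since w_inh < barrier, crossing requires noise. Returns switches
    per unit time."""
    s, switches = 0.0, 0
    for _ in range(steps):
        # inhibitory feedback relaxes toward w_inh, perturbed by noise
        s += dt / tau * (w_inh - s) + sigma * np.sqrt(dt) * rng.standard_normal()
        if s > barrier:            # suppression plus noise overcame the barrier
            switches += 1
            s = 0.0                # inhibition re-targets the new dominant unit
    return switches / (steps * dt)

for sigma in (0.05, 0.10, 0.20):
    print(f"sigma={sigma:.2f}: switch rate ≈ {switch_rate(sigma=sigma):.4f}")
```

In this sketch the switch rate grows with `sigma`, and raising `w_inh` pushes the saturation level closer to the barrier, so the rate depends on both noise intensity and inhibitory strength, as the summary describes.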
The authors demonstrate that this noise-driven switching exhibits hysteresis. When the noise amplitude is increased, the network jumps from attractor A to attractor B; reducing the noise does not immediately return the system to A, and a lower noise level is required to trigger the reverse transition. This nonlinear, history-dependent behavior mirrors phenomena observed in the human visual cortex, specifically stereopsis and binocular rivalry, where perceptual dominance alternates in a noise-sensitive, hysteretic manner.
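One generic way to quantify hysteresis of this kind (a standard double-well model used purely for illustration; the paper's system and swept parameter differ) is to sweep a control parameter up and then down and compare the transition points of the two sweeps:

```python
import numpy as np

rng = np.random.default_rng(2)

def sweep(bias_values, sigma=0.2, dt=0.01, relax=2_000, x0=-1.0):
    """Overdamped double-well dynamics dx = (x - x**3 + u) dt + sigma dW.
    Returns the well (-1.0 or +1.0) occupied at the end of each bias step."""
    x = x0
    wells = []
    for u in bias_values:
        for _ in range(relax):
            x += (x - x**3 + u) * dt + sigma * np.sqrt(dt) * rng.standard_normal()
        wells.append(np.sign(x))
    return np.array(wells)

u = np.linspace(-0.6, 0.6, 61)
up = sweep(u, x0=-1.0)                 # forward sweep, starting in well A (-1)
down = sweep(u[::-1], x0=+1.0)[::-1]   # backward sweep, aligned with u
width = np.sum(up != down) * (u[1] - u[0])
print(f"hysteresis width ≈ {width:.2f} (in units of the swept bias)")
```

The region where the two sweeps disagree is the hysteresis loop; noise changes its width by letting the system escape a well before the deterministic transition point.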
To ground the concept in a biologically plausible framework, the paper presents a canonical ring network composed of class 1 neurons. Class 1 neurons show a continuous transition from quiescence to rhythmic firing, with an arbitrarily low firing rate just above the current threshold, making them natural analogues of the excitable threshold units. In the ring topology, each neuron inhibits its two neighbors, creating a distributed inhibitory lattice that can sustain multiple co-existing attractors. Numerical simulations explore the parameter space of noise variance and inhibitory coupling strength, revealing systematic changes in the number of stable attractors, switching rates, and hysteresis width. For example, increasing the noise standard deviation from 0.1 to 0.5 raises the transition probability from 0.2 to 0.8 and expands the hysteresis loop proportionally.
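A sketch of such a ring, using the theta neuron (the canonical model of class 1 excitability) with noisy drive and nearest-neighbour inhibition; the network size, coupling scheme, and parameter values below are illustrative assumptions rather than the paper's exact setup:

```python
import numpy as np

rng = np.random.default_rng(3)

def ring_of_theta_neurons(n=8, I=-0.02, w_inh=0.5, sigma=0.05,
                          tau_syn=1.0, dt=0.01, steps=100_000):
    """Ring of theta neurons in which every spike inhibits the two
    nearest neighbours. Returns the spike count of each neuron."""
    theta = rng.uniform(-np.pi, np.pi, n)  # phase of each neuron
    syn = np.zeros(n)                      # inhibitory synaptic input
    counts = np.zeros(n, dtype=int)
    for _ in range(steps):
        drive = I - syn                    # I < 0 keeps neurons excitable
        theta += dt * ((1 - np.cos(theta)) + (1 + np.cos(theta)) * drive) \
                 + sigma * np.sqrt(dt) * rng.standard_normal(n)
        spikes = (theta > np.pi).astype(float)
        theta -= 2 * np.pi * spikes        # phase reset after a spike
        counts += spikes.astype(int)
        # each spike deposits inhibition onto its left and right neighbours
        syn += w_inh * (np.roll(spikes, 1) + np.roll(spikes, -1))
        syn -= dt * syn / tau_syn          # exponential synaptic decay
    return counts

print(ring_of_theta_neurons())             # noise-induced, inhibition-shaped firing
```

Because the drive is sub-threshold, firing in this sketch is noise-induced, and the neighbour inhibition shapes which spatial activity patterns persist, mirroring the distributed inhibitory lattice described above.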
Overall, the study recasts noise from a nuisance into a functional power source that stabilizes and destabilizes attractors on demand. By coupling stochastic excitation with inhibition, the authors provide a mechanism for non-deterministic attractor selection and hysteresis, offering a pathway to more flexible, environment-responsive artificial systems. The work suggests future applications in robotics, real-time signal processing, and computational neuroscience, where rapid adaptation to unforeseen changes is essential.