The thermodynamic temperature of a rhythmic spiking network

Artificial neural networks built from two-state neurons are powerful computational substrates, whose computational ability is well understood by analogy with statistical mechanics. In this work, we introduce similar analogies in the context of spiking neurons in a fixed time window, where excitatory and inhibitory inputs drawn from a Poisson distribution play the role of temperature. For single neurons with a “bandgap” between their inputs and the spike threshold, this temperature allows for stochastic spiking. By imposing a global inhibitory rhythm over the fixed time windows, we connect neurons into a network that exhibits synchronous, clock-like updating akin to neural networks. We implement a single-layer Boltzmann machine without learning to demonstrate our model.


💡 Research Summary

The paper “The thermodynamic temperature of a rhythmic spiking network” proposes a novel bridge between statistical‑mechanics concepts and biologically inspired spiking neural networks. The authors begin by restricting each leaky integrate‑and‑fire neuron to a fixed observation window Δt and treat the excitatory and inhibitory Poisson input streams as stochastic “heat baths.” The mean input current over the window is (λ⁺‑λ⁻)Δt, which the authors identify as an effective temperature T. By introducing a “bandgap” ΔE = θ – Ī between the average input current Ī and the firing threshold θ, they obtain a spike probability that follows the Boltzmann form P(spike)=1/(1+e^{ΔE/T}). Thus, as T increases, the neuron can stochastically cross the bandgap and fire, mirroring the temperature‑driven activation of binary units in a Boltzmann machine.
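The single-neuron picture above can be sketched numerically. This is a minimal illustration, not the authors' implementation: the Poisson rates, window length, and threshold values are hypothetical, chosen only to show how the bandgap ΔE and the effective temperature T combine into a Boltzmann spike probability.

```python
import numpy as np

rng = np.random.default_rng(0)

def spike_probability(delta_e, temperature):
    """Boltzmann form from the summary: P(spike) = 1 / (1 + exp(ΔE / T))."""
    return 1.0 / (1.0 + np.exp(delta_e / temperature))

# Hypothetical parameters: excitatory/inhibitory Poisson rates and window Δt.
lam_exc, lam_inh, dt = 120.0, 80.0, 0.05   # Hz, Hz, seconds
theta = 3.0                                 # firing threshold (arbitrary units)

mean_input = (lam_exc - lam_inh) * dt       # Ī = (λ⁺ − λ⁻)Δt, mean input over the window
T = mean_input                              # effective temperature, per the summary
delta_e = theta - mean_input                # "bandgap" ΔE = θ − Ī

p = spike_probability(delta_e, T)
spike = rng.random() < p                    # one stochastic spike draw for this window
```

With these toy numbers, Ī = 2.0 and ΔE = 1.0, so the spike probability sits below one-half; raising either Poisson rate shifts both T and ΔE, which is what makes the temperature analogy tunable.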

To synchronize many such neurons, the paper adds a global inhibitory rhythm. This rhythm consists of a periodic inhibitory pulse that creates a short “active” sub‑window τₐ within each period Tₚ, followed by a longer inhibitory phase τᵢ. During τₐ the Poisson inputs are allowed to affect the membrane potential; during τᵢ the neuron is clamped to a hyperpolarized state, preventing spikes. Consequently, all neurons update only during the same active sub‑window, yielding a clock‑like, synchronous update rule reminiscent of discrete‑time neural networks while preserving the underlying spiking dynamics.
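The active/inhibited alternation can be captured by a simple gating predicate. The function below is a hypothetical sketch of that clock, not code from the paper: it only decides whether a given time falls inside the active sub-window of the current period.

```python
def in_active_window(t, period, tau_active):
    """Return True when time t falls inside the active sub-window τₐ.

    Hypothetical gating: each rhythm period Tₚ opens with an active phase of
    length τₐ, during which Poisson inputs may drive the membrane; for the
    remaining τᵢ = Tₚ − τₐ the neuron is clamped below threshold.
    """
    return (t % period) < tau_active
```

In a simulation loop, membrane updates would simply be skipped whenever this predicate is false, which is all the clamping described above requires.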

The authors then assemble a fully connected network of N such neurons, each pair linked by a fixed weight w_{ij}. No learning algorithm is employed; instead, the weight matrix is pre‑specified to implement a single‑layer Boltzmann machine. At each active window, neuron i computes its total synaptic input Σ_j w_{ij} s_j, where s_j denotes the spike state of neuron j in the previous window. Using the temperature T derived from the Poisson rates, the neuron draws a spike with probability given by the Boltzmann expression. After spiking, the neuron is forced into the inhibitory phase of the next cycle, ensuring that all neurons remain synchronized.
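One synchronous update of such a network can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the authors' code: the weight statistics, network size, threshold θ, and temperature value are all hypothetical, and the energy gap per neuron is taken as ΔE = θ − Σⱼ wᵢⱼ sⱼ, by analogy with the single-neuron bandgap.

```python
import numpy as np

def boltzmann_step(W, s, theta, T, rng):
    """One synchronous active-window update for all neurons at once.

    W : (N, N) fixed weight matrix (no learning), s : spike states in {0, 1}
    from the previous window, T : effective temperature from the Poisson rates.
    """
    net = W @ s                               # Σ_j w_ij s_j, total synaptic input
    delta_e = theta - net                     # per-neuron bandgap (assumption)
    p = 1.0 / (1.0 + np.exp(delta_e / T))     # Boltzmann spike probability
    return (rng.random(len(s)) < p).astype(int)

rng = np.random.default_rng(1)
N = 8
W = rng.normal(0.0, 0.5, (N, N))
W = 0.5 * (W + W.T)                           # symmetric weights, as in a Boltzmann machine
np.fill_diagonal(W, 0.0)                      # no self-coupling

s = rng.integers(0, 2, N)                     # initial spike states
for _ in range(100):                          # one step per active window of the rhythm
    s = boltzmann_step(W, s, theta=0.0, T=1.0, rng=rng)
```

The global rhythm is what licenses updating every neuron in one vectorized step: because all neurons share the same active sub-window, the discrete-time loop here is a faithful stand-in for the continuous spiking dynamics.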

Simulation results demonstrate two key findings. First, varying T reproduces the canonical Boltzmann distribution over binary states: low T drives rapid convergence to low‑energy configurations, while high T promotes broad exploration of the state space. Second, the global inhibitory clock enforces synchronous updates whose timing matches the discrete steps of a conventional binary Boltzmann machine, confirming that the spiking implementation reproduces the statistical‑mechanical dynamics. The authors also show that the spike stochasticity can be tuned continuously via the Poisson rates, offering a natural mechanism for annealing.
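Because T is set by the input statistics, annealing amounts to changing the Poisson rates from window to window. The schedule below is a hypothetical sketch of that idea (the paper does not specify a cooling rule): a geometric decay of the effective temperature T = (λ⁺ − λ⁻)Δt, realized in practice by, say, lowering the excitatory rate λ⁺.

```python
import numpy as np

def annealing_schedule(T_hot, T_cold, n_windows):
    """Geometric cooling from T_hot to T_cold over n_windows active windows.

    Hypothetical schedule: each window's effective temperature would be set
    by adjusting the net Poisson drive (λ⁺ − λ⁻)Δt to the target value.
    """
    ratio = (T_cold / T_hot) ** (1.0 / max(n_windows - 1, 1))
    return T_hot * ratio ** np.arange(n_windows)

Ts = annealing_schedule(4.0, 0.5, 10)   # high-T exploration → low-T convergence
```

Each entry of `Ts` would be used as the temperature for one active window, so early windows explore the state space and later windows settle into low-energy configurations.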

In the discussion, the authors acknowledge several limitations. The current model assumes stationary Poisson input and a simple sinusoidal inhibitory rhythm, whereas real cortical circuits exhibit non‑Poisson statistics, synaptic plasticity, and multiple co‑existing oscillations (e.g., theta, gamma). They propose future extensions that incorporate spike‑timing‑dependent plasticity (STDP) or Hebbian learning to adapt the weight matrix online, as well as richer rhythmic patterns to explore phase‑coding schemes. Scaling the architecture to multiple layers and introducing non‑linear weight transformations are suggested as pathways toward more powerful computation.

Overall, the paper introduces a compelling conceptual framework: temperature in a spiking network is realized through stochastic input statistics, and a global inhibitory rhythm provides a clock that aligns spiking updates across the population. This synthesis preserves the probabilistic foundations of Boltzmann machines while embedding them in a biologically plausible spiking substrate. The work opens avenues for neuromorphic hardware that leverages intrinsic noise and rhythmic inhibition to perform energy‑based inference, and it offers a fresh perspective for neuroscientists seeking statistical‑mechanical explanations of rhythmic cortical activity.

