Dynamic Mean Field Theories for Nonlinear Noise in Recurrent Neuronal Networks
Strong, correlated noise in recurrent neural circuits often passes through nonlinear transfer functions, complicating dynamical mean-field analyses of complex phenomena such as transients and bifurcations. We introduce a method that replaces nonlinear functions of Ornstein-Uhlenbeck (OU) noise with a Gaussian-equivalent process matched in mean and covariance, and combine this with a lognormal moment closure for expansive nonlinearities to derive a closed dynamical mean-field theory for recurrent neuronal networks. The resulting theory captures order-one transients, fixed points, and noise-induced shifts of bifurcation structure, and outperforms standard linearization-based approximations in the strong-fluctuation regime. More broadly, the approach applies whenever dynamics depend smoothly on OU processes via nonlinear transformations, offering a tractable route to noise-dependent phase diagrams in computational neuroscience models.
💡 Research Summary
The manuscript introduces a novel framework for constructing dynamical mean‑field theories (DMFTs) that remain accurate in the presence of strong, correlated noise that is processed through nonlinear neuronal transfer functions. Traditional approaches to mean‑field analysis in computational neuroscience typically treat noise as a small perturbation, linearize the dynamics around deterministic fixed points, or move the noise outside of the nonlinear transfer function. Such approximations break down when the noise amplitude is of order one (O(1)) relative to the mean activity, a regime that is biologically realistic because cortical circuits receive shared, low‑dimensional fluctuations from external sources and often operate with expansive nonlinearities (e.g., quadratic or power‑law firing‑rate functions).
The authors first formalize the microscopic model of a recurrent excitatory‑inhibitory (E/I) network, where each neuron’s membrane potential obeys a leaky‑integrator equation driven by synaptic inputs and an Ornstein‑Uhlenbeck (OU) process η(t). By assuming all‑to‑all connectivity, identical time constants and noise intensities within each population, and decomposing the noise into private and shared components (controlled by parameters ν and ρ), they derive a macroscopic set of stochastic differential equations for the population‑averaged firing rates r_E(t) and r_I(t). Crucially, the nonlinear transfer function ϕ is applied directly to the summed input before temporal integration, meaning that the OU noise enters the dynamics inside the nonlinearity.
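The microscopic setup described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's actual model: the transfer function, weights, time constants, and noise parameters below are all placeholder choices, and a single population stands in for the full E/I pair. The key structural point it demonstrates is that the OU noise (split into private and shared components via ρ) enters *inside* the nonlinearity ϕ, before temporal integration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters -- not taken from the paper
tau_m, tau_N = 10.0, 2.0     # membrane and OU-noise time constants
sigma, rho = 0.5, 0.5        # total noise intensity; shared-variance fraction
w, I_ext = -0.8, 1.0         # net recurrent weight and external drive
dt, steps, N = 0.1, 5000, 200

phi = lambda x: np.maximum(x, 0.0) ** 2   # thresholded quadratic transfer

v = np.zeros(N)        # population of leaky integrators
eta_p = np.zeros(N)    # private OU noise, stationary variance (1 - rho) * sigma^2
eta_s = 0.0            # shared OU noise, stationary variance rho * sigma^2

for _ in range(steps):
    # Euler-Maruyama updates of the private and shared OU processes
    eta_p += -eta_p / tau_N * dt + np.sqrt(2 * (1 - rho) * sigma**2 * dt / tau_N) * rng.standard_normal(N)
    eta_s += -eta_s / tau_N * dt + np.sqrt(2 * rho * sigma**2 * dt / tau_N) * rng.standard_normal()
    # The noise is applied INSIDE the nonlinearity, before integration
    drive = I_ext + w * v.mean()
    v += (-v + phi(drive + eta_p + eta_s)) * dt / tau_m

r_pop = v.mean()       # population-averaged activity after the transient
```

Because ϕ is applied to the noisy input directly, even the population-averaged dynamics retain non-Gaussian terms such as η², which is exactly what motivates the GEM construction below.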
To handle the resulting non‑Gaussian terms (e.g., η²(t) that appear when ϕ is quadratic), the paper proposes the Gaussian Equivalent Method (GEM). GEM replaces any smooth function of the OU process with a Gaussian process γ(t) whose first two moments (mean and autocorrelation) are matched to those of the original non‑Gaussian term. The authors construct a stochastic differential equation for γ(t) driven by a new Wiener process, carefully discuss the order of limits (Δt → 0 versus τ_N → 0) and adopt a non‑standard Stratonovich‑type interpretation that preserves the stationary distribution of η while allowing a white‑noise limit for analytical tractability. This substitution converts the original problem into one amenable to standard Fokker‑Planck analysis.
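The moment-matching step at the heart of GEM can be made concrete for the quadratic case. For a stationary OU process η with variance σ² and autocorrelation σ² exp(−|t−t′|/τ_N), Gaussianity gives ⟨η²⟩ = σ² and Cov(η²(t), η²(t′)) = 2σ⁴ exp(−2|t−t′|/τ_N), so the Gaussian-equivalent process γ(t) for η² is itself an OU process with mean σ², variance 2σ⁴, and correlation time τ_N/2. The sketch below (illustrative parameters, exact AR(1) discretization of the OU process) checks the two target moments empirically:

```python
import numpy as np

rng = np.random.default_rng(1)
tau_N, sigma = 1.0, 1.0
dt, T = 0.05, 200_000

# Simulate the OU process eta(t) via its exact AR(1) update
a = np.exp(-dt / tau_N)
b = sigma * np.sqrt(1 - a**2)
eta = np.empty(T)
eta[0] = 0.0
for t in range(1, T):
    eta[t] = a * eta[t - 1] + b * rng.standard_normal()

x = eta**2   # the non-Gaussian term produced by a quadratic transfer function

# GEM target moments for the Gaussian-equivalent process gamma(t):
# mean sigma^2, variance 2 sigma^4, correlation time tau_N / 2
mean_th, var_th, tau_gamma = sigma**2, 2 * sigma**4, tau_N / 2

print(x.mean(), mean_th)   # empirical vs. matched mean
print(x.var(), var_th)     # empirical vs. matched variance
```

With these two moments matched, γ(t) can be generated by a linear SDE driven by a fresh Wiener process, which is what makes the subsequent Fokker-Planck analysis tractable.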
The next obstacle is the closure of the moment hierarchy generated by the nonlinear transfer function. Because the quadratic (or more generally expansive) nonlinearity makes the evolution of the second moments depend on third moments, an infinite hierarchy would normally arise. The authors resolve this by assuming that the population firing rates are jointly log‑normally distributed. This ansatz is biologically plausible (rates are positive) and mathematically convenient: all higher‑order moments can be expressed exactly in terms of the first two moments. For example, ⟨r³⟩ = ⟨r²⟩³/⟨r⟩³, and mixed moments such as ⟨r_E² r_I⟩ factor similarly. With this closure, the GEM‑derived stochastic equations yield a closed set of ordinary differential equations for the means ⟨r_E⟩, ⟨r_I⟩, the variances, and the cross‑covariance.
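The lognormal closure identity is a standard property of the lognormal distribution (⟨rⁿ⟩ = exp(nμ + n²s²/2) implies ⟨r³⟩ = ⟨r²⟩³/⟨r⟩³), and is easy to verify by Monte Carlo. The parameters below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)
mu, s = 0.3, 0.5   # illustrative lognormal parameters
r = rng.lognormal(mean=mu, sigma=s, size=2_000_000)

m1, m2, m3 = r.mean(), (r**2).mean(), (r**3).mean()
closure = m2**3 / m1**3   # lognormal closure: <r^3> = <r^2>^3 / <r>^3

print(m3, closure)        # should agree up to sampling error
```

This is what allows the third moments appearing in the second-moment equations to be rewritten in terms of the first two moments, truncating the hierarchy exactly under the ansatz.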
The resulting DMFT is tested on several benchmark systems. In an E/I network with a thresholded quadratic transfer function, the theory accurately reproduces order‑one transients, stationary firing‑rate distributions, and noise‑induced shifts of bifurcation points (e.g., Hopf and saddle‑node bifurcations). Comparisons with direct Monte‑Carlo simulations show that the GEM‑Lognormal DMFT remains precise even when the noise intensity σ is comparable to the mean input, whereas linearization‑based mean‑field approximations deviate dramatically. The authors also apply the framework to a single‑population model with a logistic (sigmoidal) transfer function, demonstrating how shared OU noise can move the effective threshold and alter the location of the critical point. Finally, they provide analytic bounds on the approximation error for generic smooth nonlinearities, showing that the GEM is accurate when the nonlinearity is sufficiently smooth and the OU correlation time τ_N is short relative to the intrinsic neuronal time constants.
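The intuition behind the noise-induced shifts of thresholds and bifurcation points can be seen already at the level of the effective (noise-averaged) transfer function: averaging a sigmoid over O(1) input fluctuations flattens it and moves its effective threshold, which in turn relocates fixed points and critical points of the mean-field dynamics. The sketch below uses a plain logistic function and an illustrative noise amplitude; it is not the paper's specific model:

```python
import numpy as np

rng = np.random.default_rng(3)
phi = lambda x: 1.0 / (1.0 + np.exp(-x))   # logistic transfer function

mu = np.linspace(-4.0, 4.0, 9)             # mean inputs to probe
sigma = 1.5                                # illustrative O(1) noise amplitude
eta = sigma * rng.standard_normal(200_000)

# Effective transfer function: phi averaged over the input fluctuations
phi_eff = np.array([phi(m + eta).mean() for m in mu])

# Noise flattens the sigmoid: phi_eff exceeds phi below threshold and
# falls short of phi above it, shifting effective gain and bifurcations.
```

A linearization-based approximation would replace ⟨ϕ(μ + η)⟩ by ϕ(μ), discarding exactly this flattening, which is why it deviates strongly once σ is comparable to the mean input.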
In summary, the paper makes three substantive contributions: (1) a general Gaussian‑equivalent substitution that converts arbitrary smooth functions of OU noise into tractable Gaussian processes while preserving mean and covariance; (2) a log‑normal moment closure that enables exact truncation of the moment hierarchy for expansive firing‑rate nonlinearities; and (3) a unified dynamical mean‑field theory that remains valid in the strong‑fluctuation regime, capturing both transient dynamics and noise‑dependent phase‑diagram modifications. This methodology broadens the analytical toolkit for computational neuroscientists, enabling systematic exploration of how strong, correlated noise reshapes the dynamics of recurrent networks and informing the interpretation of experimental data in which such noise is ubiquitous.