Dynamical and Statistical Criticality in a Model of Neural Tissue

Reading time: 6 minutes

📝 Original Info

  • Title: Dynamical and Statistical Criticality in a Model of Neural Tissue
  • ArXiv ID: 0808.3996
  • Date: 2008-08-28 (arXiv v1)
  • Authors: **Marcelo O. Magnasco, Oreste Piro, Guillermo A. Cecchi**

📝 Abstract

For the nervous system to work at all, a delicate balance of excitation and inhibition must be achieved. However, when such a balance is sought by global strategies, only a few modes remain balanced close to instability, while all other modes are strongly stable. Here we present a simple model of neural tissue in which this balance is sought locally by neurons following 'anti-Hebbian' behavior: *all* degrees of freedom achieve a close balance of excitation and inhibition and become "critical" in the dynamical sense. At long timescales, the modes of our model oscillate around the instability line, so an extremely complex "breakout" dynamics ensues in which different modes of the system oscillate between prominence and extinction. We show that the system develops various anomalous statistical behaviours and hence becomes self-organized critical in the statistical sense.

📄 Full Content

arXiv:0808.3996v1 [q-bio.NC] 28 Aug 2008

Dynamical and Statistical Criticality in a Model of Neural Tissue

Marcelo O. Magnasco,1 Oreste Piro,2 and Guillermo A. Cecchi3

1 Laboratory of Mathematical Physics, The Rockefeller University, 10021 New York, NY USA
2 Departament de Física and IFISC (CSIC-UIB), Universitat de les Illes Balears, 07122 Palma de Mallorca, Spain
3 Computational Biology Center, T.J. Watson IBM Research Laboratory, 1101 Kitchawan Rd., Yorktown Heights, NY USA

PACS numbers: 87.10.+e, 05.20.-y, 89.70.+c

Dynamical systems theory holds that systems of interest should be structurally stable: their behavior should not drastically change under small perturbations of the defining dynamics [1]. Thus high-order criticality, the simultaneous presence of several critical features such as Hopf bifurcations, is not expected ever to be observed in a natural system.
However, natural systems lacking such structural stability are not infrequent: within neuroscience, examples include dynamically critical systems such as line attractors [2] in motor control [3] and decision making [4], and self-tuned Hopf bifurcations in the auditory periphery [5] and olfactory system [6]. There are also examples in neuroscience of statistical criticality [7]: spontaneous heavy-tailed or scale-free fluctuations typical of critical phase transitions, such as neuronal avalanches in cortical slices [8] and anomalous correlations in the retina [9] and in functional imaging [10]; models based on simulations of the highly non-linear dynamics of spiking elements display avalanche-like statistical criticality [11, 12]. There is no real understanding of the relation between these different concepts of criticality; developed turbulence, a well-studied example, displays both statistical criticality [13] and dynamical criticality (an extensive number of zero Lyapunov exponents [14]), but a relationship between them is far from clear.

We present a simple model of neural tissue, an anti-Hebbian network which constantly forgets; this network spontaneously poises itself at a dynamically critical state in which an extensive number of degrees of freedom approach Hopf bifurcations, becoming arbitrarily sensitive to external perturbations. As the dynamics controlling this state has itself a marginal fixed point, the eigenvalues do not converge but fluctuate close to the imaginary axis; when they become slightly unstable, the corresponding mode "breaks out" and becomes more prominent, and as they become slightly stable the mode slowly damps out. This breakout dynamics displays avalanche-like activity bursts whose sizes are power-law distributed. Within these epochs the neurons of our model are only slightly correlated; yet, as the number of small but significant correlations is high, the model has strongly correlated network states [9].
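The claim that avalanche sizes are power-law distributed is the kind of statement usually checked with a maximum-likelihood exponent fit. A minimal sketch of the standard continuous MLE (Hill-type) estimator, run here on synthetic data with a known exponent rather than on output of the paper's model:

```python
import numpy as np

def powerlaw_mle(sizes, s_min=1.0):
    """Maximum-likelihood exponent alpha for p(s) ~ s^(-alpha), s >= s_min
    (continuous-variable estimator)."""
    s = np.asarray(sizes, dtype=float)
    s = s[s >= s_min]
    return 1.0 + len(s) / np.sum(np.log(s / s_min))

# Synthetic check: inverse-CDF samples from a power law with alpha = 2.5
rng = np.random.default_rng(1)
u = rng.random(100_000)
samples = (1.0 - u) ** (-1.0 / 1.5)   # CDF F(s) = 1 - s^(-1.5), s_min = 1
alpha_hat = powerlaw_mle(samples)      # should recover ~2.5
```

The estimator is unbiased up to O(1/n) terms, so on 10^5 samples the recovered exponent sits within a few thousandths of the true value; applied to avalanche sizes from a simulation, the same function would give the empirical exponent.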
This system is, on the short time-scale, sensitive in bulk to any outside input, even if applied only to a small subset of the neurons; however, it does not learn. In fact, being anti-Hebbian, it constantly forgets. We show that we can enrich the dynamics by adding, to the term which is anti-Hebbian with respect to regular correlations, another term that is "positively" Hebbian with respect to directed correlations, i.e., those causal in the sense of Granger [15]. The network may then learn "predictable" stimuli, yet will remain unable to learn noise, and will display timing-dependent synaptic changes reminiscent of spike-timing dependent plasticity (STDP, [16]).

We now present our model. The activities of a set of neurons, encoded in the vector x, evolve under the synaptic connectivity matrix A; meanwhile A itself evolves, at a slower pace α, under an anti-Hebbian rule:

$$\dot{x} = A x \qquad (1)$$

$$\dot{A} = \alpha \,(I - x x^\top) \qquad (2)$$

where the matrix A encodes the synaptic connections, α is the speed of synaptic evolution, assumed slow, and I is the identity matrix. Inputs i(t), neuronal noise ξ(t), and nonlinear limiting terms such as x³ would normally be added to the RHS of eq. (1), but that shall not be necessary for now. From eq. (2), the matrix A would stop evolving when the components of x have unit variance and are

…(Full text truncated)…
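Equations (1)-(2) in the excerpt above are simple enough to explore numerically. A minimal forward-Euler sketch, tracking the largest real part of A's eigenvalues (the paper's claim is that it hovers near zero rather than converging); the parameter values `n`, `alpha`, `dt` and the initial conditions are arbitrary assumptions, not taken from the paper:

```python
import numpy as np

def simulate(n=10, alpha=0.1, dt=0.01, steps=5000, every=50, seed=0):
    """Forward-Euler integration of the anti-Hebbian model:
       dx/dt = A x,   dA/dt = alpha * (I - x x^T).
    Returns the final state and the largest real part of A's
    eigenvalues, sampled every `every` steps."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    A = 0.1 * rng.standard_normal((n, n))
    I = np.eye(n)
    max_re = []
    for t in range(steps):
        x = x + dt * (A @ x)                       # eq. (1)
        A = A + dt * alpha * (I - np.outer(x, x))  # eq. (2)
        if t % every == 0:
            max_re.append(np.linalg.eigvals(A).real.max())
    return x, A, np.array(max_re)

x, A, max_re = simulate()
```

Plotting `max_re` against time would show the self-tuning: the −xxᵀ feedback pushes eigenvalues back toward the imaginary axis whenever a mode grows, so the leading real part fluctuates around zero instead of settling, which is the "breakout" behavior described in the abstract.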

Reference

This content is AI-processed based on ArXiv data.
