Prediction of spatio-temporal patterns of neural activity from pairwise correlations
We designed a model-based analysis to predict the occurrence of population patterns in distributed spiking activity. Using a maximum entropy principle with a Markovian assumption, we obtain a model that accounts for both spatial and temporal pairwise correlations among neurons. This model is tested on data generated with a Glauber spin-glass system and is shown to predict the occurrence probabilities of spatio-temporal patterns significantly better than Ising models that take into account only instantaneous pairwise correlations. This increase in predictability was also observed in experimental data recorded in parietal cortex during slow-wave sleep. This approach can also be used to generate surrogates that reproduce the spatial and temporal correlations of a given data set.
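The Glauber spin-glass system used here as a benchmark can be sketched as single-site heat-bath updates of an Ising model with 0/1 units. A minimal illustration (the function name, parameters, and update schedule are illustrative, not taken from the paper):

```python
import numpy as np

def glauber_sample(J, h, beta=1.0, n_steps=10000, rng=None):
    """Sample binary states from an Ising system via Glauber dynamics.

    J : symmetric (N, N) coupling matrix with zero diagonal
    h : (N,) local fields
    Returns an (n_steps, N) array of 0/1 states, one row per update.
    """
    rng = np.random.default_rng(rng)
    N = len(h)
    sigma = rng.integers(0, 2, size=N)          # random initial 0/1 state
    samples = np.empty((n_steps, N), dtype=int)
    for t in range(n_steps):
        i = rng.integers(N)                     # pick a random unit
        field = h[i] + J[i] @ sigma             # local field on unit i (J[i, i] = 0)
        p_on = 1.0 / (1.0 + np.exp(-beta * field))  # heat-bath flip probability
        sigma[i] = rng.random() < p_on
        samples[t] = sigma
    return samples
```

Because each update conditions only on the current state, the resulting sequence has exactly the first-order Markov structure that the model class below is designed to capture.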
💡 Research Summary
The paper introduces a novel statistical framework for predicting and characterizing spatio‑temporal patterns of neural activity. Traditional maximum‑entropy approaches, such as the Ising model, capture only spatial pairwise correlations and treat neural activity as a static snapshot. Consequently, they ignore the intrinsic temporal dependencies that are essential for describing the dynamics of spiking populations. To overcome this limitation, the authors combine the maximum‑entropy principle with a first‑order Markov assumption, yielding a “Markov‑Ising” model that simultaneously incorporates spatial and temporal pairwise interactions.
In the formalism, binary variables $\sigma_i^t$ denote the presence (1) or absence (0) of a spike from neuron $i$ at discrete time bin $t$. The joint probability of a whole spike train is factorized into a product of conditional transition probabilities $P(\sigma^{t+1}\mid\sigma^t)$. Three constraints are imposed on the distribution: (1) the mean firing rate of each neuron, (2) the spatial pairwise correlation $\langle\sigma_i^t\sigma_j^t\rangle$, and (3) the temporal cross-correlation $\langle\sigma_i^t\sigma_j^{t+1}\rangle$. Maximizing entropy under these constraints leads to an energy function of the form
$$
P(\sigma^{t+1}\mid\sigma^t) = \frac{1}{Z(\sigma^t)}\exp\Big(\sum_i h_i\,\sigma_i^{t+1} + \sum_{i<j} J_{ij}\,\sigma_i^{t+1}\sigma_j^{t+1} + \sum_{i,j} K_{ij}\,\sigma_i^{t}\sigma_j^{t+1}\Big),
$$

where the fields $h_i$ enforce the firing-rate constraints, the symmetric couplings $J_{ij}$ enforce the spatial pairwise correlations, the (generally asymmetric) couplings $K_{ij}$ enforce the temporal cross-correlations, and $Z(\sigma^t)$ normalizes each transition.
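The three constrained statistics are straightforward to estimate from binned spike data. A minimal NumPy sketch (the function name and array layout are illustrative):

```python
import numpy as np

def empirical_constraints(s):
    """Empirical statistics matching the three constraint types.

    s : (T, N) binary array, s[t, i] = 1 if neuron i spiked in bin t.
    Returns (rates, spatial, temporal):
      rates[i]       = <sigma_i^t>            mean firing probability per bin
      spatial[i, j]  = <sigma_i^t sigma_j^t>      equal-time pairwise correlation
      temporal[i, j] = <sigma_i^t sigma_j^{t+1}>  one-step cross-correlation
    """
    T = s.shape[0]
    rates = s.mean(axis=0)
    spatial = (s.T @ s) / T           # symmetric by construction
    temporal = (s[:-1].T @ s[1:]) / (T - 1)  # generally asymmetric
    return rates, spatial, temporal
```

Fitting the model then amounts to adjusting $h_i$, $J_{ij}$, and $K_{ij}$ until the model's expectations of these same three statistics match the empirical values.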