An analytically tractable model of neural population activity in the presence of common input explains higher-order correlations and entropy


Simultaneously recorded neurons exhibit correlations whose underlying causes are not known. Here, we use a population of threshold neurons receiving correlated inputs to model neural population recordings. We show analytically that small changes in second-order correlations can lead to large changes in higher-order correlations, and that these higher-order correlations have a strong impact on the entropy, sparsity, and statistical heat capacity of the population. Remarkably, our findings for this simple model may explain two surprising effects recently observed in neural population recordings.


💡 Research Summary

The paper addresses a fundamental question in contemporary neuroscience: how do correlations observed in simultaneously recorded neuronal populations arise, and what are their consequences for the collective information processing of the network? To answer this, the authors construct a mathematically tractable model consisting of a population of binary “threshold” neurons that receive a mixture of independent noise and a shared, correlated input. Each neuron i receives an input xi(t) = √(1−ρ)·ηi(t) + √ρ·ξc(t), where ηi(t) are independent Gaussian white‑noise processes, ξc(t) is a common Gaussian process shared by all neurons, and ρ ∈ [0, 1] sets the fraction of the input variance contributed by the common source. A neuron emits a spike whenever its input exceeds a fixed threshold, so the shared term ξc(t) induces correlated spiking across the population.
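The model described above can be sketched in a few lines of NumPy. This is an illustrative simulation, not the authors' code: all parameter values (population size, threshold, ρ) are arbitrary choices for demonstration. Each binary neuron fires when its Gaussian input, a mixture of private noise and a common term weighted by √ρ, crosses the threshold.

```python
import numpy as np

def simulate_population(n_neurons=50, n_samples=200_000, rho=0.2,
                        theta=1.0, seed=0):
    """Draw binary population patterns from the thresholded-Gaussian model.

    Input to neuron i:  x_i = sqrt(1 - rho) * eta_i + sqrt(rho) * xi_c,
    with eta_i private standard-normal noise and xi_c a common
    standard-normal term; the neuron spikes when x_i > theta.
    """
    rng = np.random.default_rng(seed)
    eta = rng.standard_normal((n_samples, n_neurons))   # private noise
    xi_c = rng.standard_normal((n_samples, 1))          # shared input
    x = np.sqrt(1.0 - rho) * eta + np.sqrt(rho) * xi_c
    return (x > theta).astype(np.int8)                  # binary spike patterns

spikes = simulate_population()
rate = spikes.mean()                                    # mean firing probability

# Average pairwise correlation coefficient across distinct neuron pairs:
# nonzero because of the common input, even though eta_i are independent.
c = np.corrcoef(spikes.T)
pair_corr = c[np.triu_indices_from(c, k=1)].mean()

# Population spike count K = sum_i s_i. With common input its variance
# exceeds the independent-binomial value n*p*(1-p): the count distribution
# has heavy tails, reflecting strong higher-order correlations.
counts = spikes.sum(axis=1)
```

Comparing `counts.var()` against the binomial baseline `n_neurons * rate * (1 - rate)` makes the effect concrete: even a modest ρ produces a markedly overdispersed population spike count, which is the signature of the higher-order correlations the paper analyzes.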

