Network algorithmics and the emergence of the cortical synaptic-weight distribution
When a neuron fires and the resulting action potential travels down its axon toward other neurons’ dendrites, the effect on each of those neurons is mediated by the weight of the synapse connecting it to the firing neuron. This weight, in turn, is affected by the postsynaptic neuron’s response through a mechanism that is thought to underlie important processes such as learning and memory. Although difficult to quantify, cortical synaptic weights have been found to obey a long-tailed unimodal distribution peaking near the lowest values, thus confirming some of the predictive models built previously. These models are all causally local, in the sense that they refer to the situation in which a number of neurons all fire directly at the same postsynaptic neuron. Consequently, they necessarily embody assumptions regarding the generation of action potentials by the presynaptic neurons that have little biological interpretability. In this letter we introduce a network model of large groups of interconnected neurons and demonstrate, making none of the assumptions that characterize the causally local models, that its long-term behavior gives rise to a distribution of synaptic weights with the same properties that were experimentally observed. In our model the action potentials that create a neuron’s input are, ultimately, the product of network-wide causal chains relating what happens at a neuron to the firings of others. Our model is thus causally global and predicates the emergence of the synaptic-weight distribution on network structure and function. As such, it has the potential to become instrumental also in the study of other emergent cortical phenomena.
💡 Research Summary
The paper tackles a long‑standing puzzle in neuroscience: why the distribution of cortical synaptic weights observed experimentally is highly skewed, with a long tail and a peak near the lowest values. Earlier theoretical attempts have reproduced this shape only by assuming a “causally local” scenario, where a set of presynaptic neurons fire directly onto a single postsynaptic cell and each synapse updates its strength based solely on that immediate interaction. Such models necessarily impose unrealistic constraints on presynaptic firing patterns and ignore the rich, recurrent, and multi‑step pathways that dominate real cortical circuitry.
To overcome these limitations, the authors introduce a “causally global” network model. They construct a large directed graph whose nodes are neurons and whose edges are synapses. Each neuron’s spiking probability at a discrete time step is a function of its internal state and the sum of incoming spikes, which themselves are the result of cascades of activity propagating through the network. Synaptic plasticity follows a modified Hebbian rule: when the postsynaptic neuron spikes, the corresponding synapse is potentiated; when it is silent, the synapse is depressed. Crucially, the magnitude of potentiation or depression is modulated not only by the local pre‑ and postsynaptic activity but also by global properties of the causal chain that delivered the input—such as path length, branching factor, and overall network density.
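The plasticity rule described above can be illustrated with a minimal sketch. This is not the authors’ exact update: it assumes binary spikes, simple additive potentiation/depression with learning rates `eta_pot` and `eta_dep`, weight clipping to `[0, w_max]`, and it omits the global causal-chain modulation (path length, branching factor, density), which the summary does not specify in enough detail to reproduce.

```python
import numpy as np

def hebbian_update(W, pre_spikes, post_spikes,
                   eta_pot=0.01, eta_dep=0.005, w_max=1.0):
    """Illustrative Hebbian-style plasticity step (not the paper's exact rule).

    W[i, j] is the weight of the synapse from neuron i to neuron j.
    A synapse is potentiated when its presynaptic neuron spiked and the
    postsynaptic neuron also spikes; it is depressed when the presynaptic
    neuron spiked but the postsynaptic neuron stays silent. Synapses whose
    presynaptic neuron did not spike are left unchanged.
    """
    pre = pre_spikes[:, None].astype(float)    # column vector: did i spike?
    post = post_spikes[None, :].astype(float)  # row vector: did j spike?
    dW = eta_pot * pre * post - eta_dep * pre * (1.0 - post)
    return np.clip(W + dW, 0.0, w_max)
```

In a full simulation this step would run once per discrete time step, after the spiking probabilities of all neurons have been evaluated, with the global modulation scaling `eta_pot` and `eta_dep` per synapse.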
The simulation protocol starts from an Erdős‑Rényi random graph with N neurons and M directed connections, assigning initial synaptic weights uniformly in
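The network initialization can be sketched as follows, using the G(N, M) form of the Erdős–Rényi model: M distinct directed edges (no self-loops) drawn uniformly at random among N neurons. The weight interval `[w_low, w_high)` is an assumption made for illustration, since the summary cuts off before stating it.

```python
import numpy as np

def init_er_network(n_neurons, n_edges, w_low=0.0, w_high=1.0, seed=0):
    """Build a random directed weighted network in the spirit of the
    setup described above (illustrative sketch, not the authors' code).

    Returns a dict mapping (pre, post) -> initial weight, with exactly
    n_edges distinct directed edges and no self-loops. Weights are drawn
    uniformly in [w_low, w_high); this interval is an assumption.
    """
    rng = np.random.default_rng(seed)
    edges = set()
    while len(edges) < n_edges:           # rejection-sample distinct edges
        i, j = rng.integers(0, n_neurons, size=2)
        if i != j:
            edges.add((int(i), int(j)))
    return {e: float(rng.uniform(w_low, w_high)) for e in edges}
```

Rejection sampling is adequate here because cortical-scale simulations are sparse (M much smaller than N(N−1)), so collisions are rare.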