Metastable Neural Assemblies on a Wiring-Weight Continuum

Schmitt, F. J.; Müller, F. L.; Nawrot, M. P.

bioRxiv · 2026-03-18 · neuroscience
DOI: 10.64898/2026.03.16.712138
Neural population activity typically evolves on low-dimensional manifolds and can be described as trajectories in attractor-like state spaces, including metastable switching among quasi-stable assembly states. Here we develop a unified definition of clustered neural networks with local excitatory-inhibitory balance in which enhanced within-cluster effective coupling can be realized by connection probability (structural clustering), synaptic efficacy (weight clustering), or any mixture of both. We introduce a single mixing parameter κ ∈ [0, 1] that redistributes a defined clustering contrast between connection probabilities and synaptic efficacies while preserving the mean input of a balanced random network. Using mean-field theory and network simulations, we show that metastable dynamics are supported across the full κ continuum. Shifting contrast between structural and weight clustering changes higher-order input structure, reshaping multistable regimes, neuronal correlations, and the balance between single- and multi-cluster episodes. Because real nervous systems jointly organize topology and synaptic strength, our approach provides a biologically realistic assembly definition and a basis for future models combining structural and functional plasticity. In practical terms, κ offers a translation axis for neuromorphic and other constrained substrates, clarifying trade-offs between routing resources and synaptic weight resolution when implementing attractor-based computational primitives such as winner-take-all decisions and working-memory states for artificial agents.
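The core idea of the κ mixing parameter — redistributing a fixed within-cluster coupling contrast between connection probability and synaptic weight while keeping the effective coupling constant — can be illustrated with a minimal sketch. The multiplicative split below is an illustrative assumption, not the paper's exact mean-field parameterization; `g` denotes a hypothetical within-cluster gain on the effective coupling p·w:

```python
def split_clustering(p, w, g, kappa):
    """Split a within-cluster coupling gain g between structure and weights.

    The effective within-cluster coupling is p_in * w_in = g * p * w,
    regardless of kappa:
      kappa = 0 -> purely structural clustering (all contrast in p_in)
      kappa = 1 -> purely weight clustering (all contrast in w_in)
    This multiplicative split is an assumed parameterization chosen so the
    product (and hence the mean within-cluster input) is invariant in kappa.
    """
    p_in = p * g ** (1.0 - kappa)  # within-cluster connection probability
    w_in = w * g ** kappa          # within-cluster synaptic efficacy
    return p_in, w_in


# Demo: the effective coupling is the same at every point on the continuum.
p, w, g = 0.1, 0.5, 2.0
for kappa in (0.0, 0.5, 1.0):
    p_in, w_in = split_clustering(p, w, g, kappa)
    print(f"kappa={kappa}: p_in={p_in:.3f}, w_in={w_in:.3f}, "
          f"p_in*w_in={p_in * w_in:.3f}")
```

Note that this only captures the within-cluster side of the construction; the paper additionally rescales between-cluster parameters to preserve the mean input of the balanced random network, which is omitted here.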

Matching journals

The top 6 journals account for 50% of the predicted probability mass.

Rank | Journal | Papers in training set | Top % | Probability
---- | ------- | ---------------------- | ----- | -----------
1 | Proceedings of the National Academy of Sciences | 2130 | 3% | 14.2%
2 | PLOS Computational Biology | 1633 | 2% | 12.4%
3 | Frontiers in Computational Neuroscience | 53 | 0.2% | 10.0%
4 | Nature Communications | 4913 | 26% | 6.7%
5 | eLife | 5422 | 22% | 3.9%
6 | Physical Review X | 23 | 0.1% | 3.6%
— 50% of probability mass above —
7 | Cell Reports | 1338 | 15% | 3.6%
8 | Science Advances | 1098 | 9% | 2.8%
9 | PRX Life | 34 | 0.2% | 2.6%
10 | Scientific Reports | 3102 | 51% | 2.1%
11 | Bulletin of Mathematical Biology | 84 | 1% | 1.9%
12 | Physical Review Research | 46 | 0.3% | 1.9%
13 | Neural Computation | 36 | 0.3% | 1.9%
14 | Neuron | 282 | 5% | 1.9%
15 | Physical Review E | 95 | 0.6% | 1.8%
16 | Journal of The Royal Society Interface | 189 | 3% | 1.7%
17 | Nature Neuroscience | 216 | 4% | 1.6%
18 | PLOS ONE | 4510 | 57% | 1.5%
19 | Communications Physics | 12 | 0.2% | 1.3%
20 | Science | 429 | 16% | 1.3%
21 | Communications Biology | 886 | 13% | 1.3%
22 | Cell Systems | 167 | 10% | 0.9%
23 | Network Neuroscience | 116 | 1% | 0.9%
24 | PNAS Nexus | 147 | 1% | 0.9%
25 | Physical Review Letters | 43 | 0.5% | 0.9%
26 | Advanced Science | 249 | 18% | 0.8%
27 | Philosophical Transactions of the Royal Society B | 51 | 5% | 0.8%
28 | Neural Networks | 32 | 0.7% | 0.8%
29 | Entropy | 20 | 0.4% | 0.7%
30 | eNeuro | 389 | 10% | 0.7%