
Structured flexibility in recurrent neural networks via neuromodulation

Costacurta, J. C.; Bhandarkar, S.; Zoltowski, D. M.; Linderman, S. W.

2024-07-26 neuroscience
10.1101/2024.07.26.605315 bioRxiv

The goal of theoretical neuroscience is to develop models that help us better understand biological intelligence. Such models range broadly in complexity and biological detail. For example, task-optimized recurrent neural networks (RNNs) have generated hypotheses about how the brain may perform various computations, but these models typically assume a fixed weight matrix representing the synaptic connectivity between neurons. From decades of neuroscience research, we know that synaptic weights are constantly changing, controlled in part by chemicals such as neuromodulators. In this work we explore the computational implications of synaptic gain scaling, a form of neuromodulation, using task-optimized low-rank RNNs. In our neuromodulated RNN (NM-RNN) model, a neuromodulatory subnetwork outputs a low-dimensional neuromodulatory signal that dynamically scales the low-rank recurrent weights of an output-generating RNN. In empirical experiments, we find that the structured flexibility in the NM-RNN allows it to both train and generalize with a higher degree of accuracy than low-rank RNNs on a set of canonical tasks. Additionally, via theoretical analyses we show how neuromodulatory gain scaling endows networks with gating mechanisms commonly found in artificial RNNs. We end by analyzing the low-rank dynamics of trained NM-RNNs to show how task computations are distributed.
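The architecture described above can be illustrated with a minimal numpy sketch. This is not the authors' implementation: the dimensions, weight initializations, input wiring, and the choice of a sigmoid nonlinearity for the gains are all assumptions made for illustration. The key idea it shows is that a small neuromodulatory subnetwork emits a low-dimensional signal `s_t` that rescales each rank-1 component of the main RNN's low-rank recurrent weights, `W_t = U diag(s_t) V^T`.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (hypothetical): N neurons, rank R, M neuromodulatory units
N, R, M, T = 50, 3, 10, 200

# Low-rank recurrent factors of the output-generating RNN: W_t = U diag(s_t) V^T
U = rng.normal(size=(N, R)) / np.sqrt(N)
V = rng.normal(size=(N, R)) / np.sqrt(N)

# Small neuromodulatory subnetwork: its own recurrence, input weights,
# and a readout mapping its state to R multiplicative gains
W_m = rng.normal(size=(M, M)) / np.sqrt(M)
B_m = rng.normal(size=(M, N)) / np.sqrt(N)
C = rng.normal(size=(R, M)) / np.sqrt(M)

B = rng.normal(size=(N, N)) / np.sqrt(N)  # input weights for the main RNN

h = np.zeros(N)              # output-generating RNN state
z = np.zeros(M)              # neuromodulatory subnetwork state
x = rng.normal(size=(T, N))  # external input sequence

for t in range(T):
    # Neuromodulatory subnetwork evolves from the same input stream
    z = np.tanh(W_m @ z + B_m @ x[t])
    # Sigmoid keeps each gain in (0, 1), making the scaling gate-like
    s = 1.0 / (1.0 + np.exp(-C @ z))
    # Gain-scaled low-rank recurrence: h <- tanh(U diag(s) V^T h + B x_t)
    h = np.tanh(U @ (s * (V.T @ h)) + B @ x[t])
```

Because each gain multiplies one rank-1 component of the recurrence, driving a gain toward zero switches that component off entirely, which is the sense in which gain scaling can act like the gates in LSTM/GRU-style networks.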

Matching journals

The top 4 journals account for 50% of the predicted probability mass.

Rank  Journal                                                   Papers in training set  Percentile  Probability
 1    PLOS Computational Biology                                1633                    Top 0.5%    23.3%
 2    Neural Computation                                        36                      Top 0.1%    15.2%
 3    Frontiers in Computational Neuroscience                   53                      Top 0.3%     7.4%
 4    Nature Communications                                     4913                    Top 34%      4.5%
----- 50% of probability mass above this line -----
 5    Proceedings of the National Academy of Sciences           2130                    Top 15%      4.5%
 6    Scientific Reports                                        3102                    Top 29%      4.1%
 7    Neural Networks                                           32                      Top 0.2%     2.8%
 8    Network Neuroscience                                      116                     Top 0.4%     2.5%
 9    Physical Review E                                         95                      Top 0.4%     2.5%
10    Journal of Computational Neuroscience                     23                      Top 0.2%     2.0%
11    Neurocomputing                                            13                      Top 0.3%     1.5%
12    Biological Cybernetics                                    12                      Top 0.1%     1.4%
13    Journal of The Royal Society Interface                    189                     Top 3%       1.3%
14    Entropy                                                   20                      Top 0.2%     1.3%
15    Cell Reports                                              1338                    Top 29%      1.1%
16    PLOS ONE                                                  4510                    Top 61%      1.1%
17    iScience                                                  1063                    Top 24%      1.0%
18    Bulletin of Mathematical Biology                          84                      Top 2%       0.9%
19    Neuron                                                    282                     Top 8%       0.8%
20    PNAS Nexus                                                147                     Top 1%       0.8%
21    Chaos: An Interdisciplinary Journal of Nonlinear Science  16                      Top 0.2%     0.8%
22    Chaos, Solitons & Fractals                                32                      Top 2%       0.7%
23    eNeuro                                                    389                     Top 9%       0.7%
24    eLife                                                     5422                    Top 58%      0.7%
25    Physical Review Research                                  46                      Top 0.8%     0.7%
26    Physical Review X                                         23                      Top 0.7%     0.7%
27    Communications Biology                                    886                     Top 31%      0.5%
28    Nature Neuroscience                                       216                     Top 7%       0.5%
29    NeuroImage                                                813                     Top 7%       0.5%