Neurons need no adaptation to optimally code arbitrarily complex stimuli

Forkosh, O.
2020-05-22 · neuroscience
bioRxiv · doi:10.1101/2020.05.21.104638

Neural networks seem to be able to handle almost any task they face. This feat involves coping efficiently with different data types, at multiple scales, and with varying statistical properties. Here, we show that this so-called optimal coding can occur at the single-neuron level and does not require adaptation. Differentiator neurons, i.e., neurons that spike whenever there is an increase in the input stimuli, are capable of capturing arbitrary statistics and scale of practically any stimulus they encounter. We show this optimality both analytically and using simulations, which demonstrate how an ideal neuron can handle drastically different probability distributions. While the mechanism we present is an oversimplification of "real" neurons and does not necessarily capture all neuron types, this is also its strength since it can function alongside other neuronal goals such as data manipulation and learning. Depicting the simplicity of neural response to complex stimuli, this result may also indicate a straightforward way to improve current artificial neural networks.
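The differentiator neuron described above spikes whenever its input increases. A minimal sketch of that idea, assuming a simple thresholded-difference rule (a toy illustration, not the paper's exact formulation):

```python
def differentiator_spikes(stimulus, threshold=0.0):
    """Emit a spike (1) whenever the input rises by more than `threshold`
    relative to the previous sample; otherwise stay silent (0).
    Toy model of a differentiator neuron, not the paper's exact equations."""
    spikes = []
    prev = stimulus[0]
    for x in stimulus[1:]:
        spikes.append(1 if x - prev > threshold else 0)
        prev = x
    return spikes

# The spike train depends only on where the input increases, so the same
# pattern emerges regardless of the stimulus scale:
differentiator_spikes([0.0, 0.5, 0.3, 0.9, 0.9])  # → [1, 0, 1, 0]
differentiator_spikes([0, 500, 300, 900, 900])    # → [1, 0, 1, 0]
```

The scale-invariance of the output hints at why such a unit can cope with stimuli of drastically different statistics without adapting its parameters.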

Matching journals

The top three journals account for just over half (52.4%) of the predicted probability mass.

Rank  Journal                                          Papers in training set  Percentile  Probability
   1  PLOS Computational Biology                                        1633   Top 0.4%       26.3%
   2  Neural Computation                                                  36   Top 0.1%       19.7%
   3  Frontiers in Computational Neuroscience                             53   Top 0.4%        6.4%
      (50% of predicted probability mass above this line)
   4  PLOS ONE                                                          4510   Top 31%         4.9%
   5  Journal of Computational Neuroscience                               23   Top 0.1%        4.4%
   6  Neural Networks                                                     32   Top 0.1%        4.0%
   7  Scientific Reports                                                3102   Top 30%         4.0%
   8  Biological Cybernetics                                              12   Top 0.1%        2.8%
   9  Proceedings of the National Academy of Sciences                   2130   Top 26%         2.4%
  10  Nature Communications                                             4913   Top 50%         1.7%
  11  Frontiers in Neuroscience                                          223   Top 4%          1.7%
  12  Chaos, Solitons & Fractals                                          32   Top 1%          1.2%
  13  Entropy                                                             20   Top 0.2%        1.2%
  14  Physical Review E                                                   95   Top 0.9%        1.2%
  15  Neurocomputing                                                      13   Top 0.4%        0.9%
  16  Journal of The Royal Society Interface                             189   Top 4%          0.9%
  17  eLife                                                             5422   Top 55%         0.8%
  18  Bulletin of Mathematical Biology                                    84   Top 2%          0.8%
  19  Neuroscience                                                        88   Top 3%          0.7%
  20  Journal of Neurophysiology                                         263   Top 1%          0.7%
  21  Journal of Neural Engineering                                      197   Top 2%          0.5%
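The 50% cutoff noted above is just the rank at which the cumulative predicted probability first reaches half the mass. A small sketch of that calculation (`journals_to_half_mass` is a hypothetical helper, not part of the site):

```python
def journals_to_half_mass(probs, cutoff=50.0):
    """Return (rank, cumulative %) at which the running sum of predicted
    journal probabilities first reaches `cutoff` percent of the mass.
    `probs` must be sorted in descending order, as in the ranking above."""
    cum = 0.0
    for rank, p in enumerate(probs, start=1):
        cum += p
        if cum >= cutoff:
            return rank, cum
    return len(probs), cum

# Top-5 probabilities from the ranking above:
journals_to_half_mass([26.3, 19.7, 6.4, 4.9, 4.4])  # reaches 50% at rank 3 (≈52.4%)
```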