Neurons need no adaptation to optimally code arbitrarily complex stimuli
Forkosh, O.
Neural networks seem able to handle almost any task they face. This feat involves coping efficiently with different data types, at multiple scales, and with varying statistical properties. Here, we show that this so-called optimal coding can occur at the single-neuron level and does not require adaptation. Differentiator neurons, i.e., neurons that spike whenever there is an increase in the input stimulus, are capable of capturing the arbitrary statistics and scale of practically any stimulus they encounter. We show this optimality both analytically and using simulations, which demonstrate how an ideal neuron can handle drastically different probability distributions. While the mechanism we present is an oversimplification of "real" neurons and does not necessarily capture all neuron types, this is also its strength, since it can function alongside other neuronal goals such as data manipulation and learning. By illustrating the simplicity of the neural response to complex stimuli, this result may also indicate a straightforward way to improve current artificial neural networks.
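The core idea can be illustrated with a minimal simulation (my own sketch, not the paper's code): a differentiator neuron that spikes whenever its input increases. For an i.i.d. continuous stimulus, the probability that the next sample exceeds the current one is 1/2 regardless of the distribution, so the spike rate is invariant across drastically different input statistics. The distribution choices below are assumptions made purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def differentiator_spikes(stimulus):
    """Emit a spike (1) at each step where the input increased."""
    return (np.diff(stimulus) > 0).astype(int)

# Stimuli with drastically different probability distributions and scales.
stimuli = {
    "gaussian":  rng.normal(0.0, 1.0, 100_000),
    "lognormal": rng.lognormal(0.0, 2.0, 100_000),   # heavy-tailed
    "uniform":   rng.uniform(-500.0, 500.0, 100_000) # different scale
}

for name, s in stimuli.items():
    rate = differentiator_spikes(s).mean()
    print(f"{name:10s} spike rate = {rate:.3f}")  # all close to 0.5
```

Note that this only demonstrates the distribution-invariance of the spike rate for i.i.d. inputs; the paper's analytical argument covers more general stimuli.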