Local gated-Hebbian learning of deep cerebellar networks with quadratic classification capacity
Hiratani, N.
A central goal of neuroscience is to understand how neural circuit architecture supports learning. While recent work has clarified the computational role of depth in sensory cortical hierarchies, it remains unclear why predominantly feedforward, non-convolutional circuits such as the cerebellum and olfactory system also contain multiple processing layers. Theoretical work in deep learning has shown that two-hidden-layer networks can achieve classification capacity that scales quadratically with the number of intermediate neurons, but these results rely on nonlocal synaptic optimization and are therefore difficult to reconcile with biological learning rules. Here, we show analytically and numerically that a two-hidden-layer network with feedforward gating can achieve quadratic capacity using local three-factor Hebbian learning when intermediate activity is sparse. This architecture supports efficient one-shot learning and, in settings where backpropagation requires many repeated weight updates, offers an advantage in learning speed. Beyond random perceptron tasks, the model also performs well on structured cerebellum-related tasks, including reinforcement-learning-based motor control. Mapping the model onto cerebellar microcircuitry further suggests functional roles for dendritic compartmentalization, branch-specific inhibition, and disinhibitory interneuron pathways. Together, these results extend the Marr-Albus-Ito framework by showing how the presence of multiple intermediate layers in cerebellum-like circuits can support fast, local, and high-capacity learning.
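The core mechanism described above can be illustrated with a minimal sketch. Everything here is an assumption for illustration: the layer sizes, the top-k sparsification, the ReLU nonlinearities, and the exact form of the three-factor rule (learning rate times a global error signal times presynaptic activity) are placeholders, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; the paper's actual network sizes are not given here.
n_in, n_h1, n_h2 = 50, 200, 200

# Fixed random expansion into the first hidden layer (granule-cell-like).
W1 = rng.standard_normal((n_h1, n_in)) / np.sqrt(n_in)
# Fixed feedforward weights into the gated second hidden layer.
G = rng.standard_normal((n_h2, n_h1)) / np.sqrt(n_h1)
# Plastic readout weights, updated only by the local rule below.
w_out = np.zeros(n_h2)

def sparsify(x, frac=0.1):
    """Keep only the top `frac` fraction of units active (sparse coding)."""
    k = max(1, int(frac * x.size))
    thr = np.partition(x, -k)[-k]
    return np.where(x >= thr, x, 0.0)

def forward(x):
    """Two sparse hidden layers followed by a scalar readout."""
    h1 = sparsify(np.maximum(W1 @ x, 0.0))
    h2 = sparsify(np.maximum(G @ h1, 0.0))
    return h2, np.tanh(w_out @ h2)

def three_factor_update(x, target, lr=0.5):
    """Local three-factor rule: dw = lr * (global error) * (pre activity).

    Each synapse uses only its presynaptic activity and a shared scalar
    error signal -- no backpropagated gradients through G or W1.
    """
    global w_out
    h2, y = forward(x)
    err = target - y          # third factor: a globally broadcast error
    w_out += lr * err * h2    # purely local update on the readout weights
```

With a large learning rate, a single application of `three_factor_update` moves the output substantially toward the target on that input, loosely illustrating the one-shot learning regime the abstract refers to; quadratic capacity itself depends on the sparse second layer and is not demonstrated by this toy.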