A novel framework for expanding RNNs with biophysical detail to solve cognitive tasks

Tolley, N.; Jones, S.

2026-03-17 neuroscience
10.64898/2026.03.13.711746 bioRxiv
Recurrent neural networks (RNNs) have proven highly successful in emulating human-like cognitive functions such as working memory. In recent years, RNNs have increasingly incorporated biophysical realism to produce more plausible predictions of how cognitive tasks are solved in real neural circuits. However, constructing and training networks with the complex, nonlinear properties of real neurons poses major challenges. A major component of the success of RNNs is that they share the same mathematical basis as deep neural networks, permitting highly efficient optimization of model parameters with standard deep learning techniques. To achieve this efficiency, however, RNNs rely on abstract neuron representations that fail to capture the impact of cell-level biophysical and morphological properties that may benefit network-level function. Expanding task-trained RNNs with biophysical properties such as dendrites and active ionic currents poses substantial challenges, as it moves these models away from the validated training regimes known to be highly effective for RNNs. To address this gap, we developed a biophysically detailed reservoir computing (BRC) framework with the goal of extracting mechanistic insights from biophysical neural models, and we propose that these insights can guide model choices suited to specific categories of cognitive tasks. The BRC network was constructed from synaptically coupled excitatory and inhibitory cells, in which the excitatory cells include multicompartment, biophysically active dendrites, motivated by empirical studies suggesting that dendrites confer desirable computational benefits (e.g., pattern classification and coincidence detection). We trained the BRC network to perform a simplified working memory task in which it had to maintain the representation of an extrinsic "cue" input. We studied the impact of extrinsic input time constants (fast AMPA vs. slow NMDA) and location (dendrite vs. soma) on the ability of a network to solve the task.
Our results revealed that cue inputs delivered through NMDA receptors are particularly efficient for solving the working memory task. Further, the properties of NMDA receptors are uniquely suited to cue inputs delivered at the dendrite, as networks trained with dendritic AMPA cue inputs failed to solve the task. Detailed examination of the cell- and network-level dynamics that solve the task revealed distinct local network configurations and computing principles for the different types of extrinsic input. Overall, much like the body of mechanistic insights that has underpinned the success of training RNNs, this study lays the groundwork for applying the BRC framework to train biophysically detailed neural models to solve complex, human-like cognitive tasks.
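For readers unfamiliar with the reservoir-computing paradigm underlying the BRC framework, the general principle can be sketched in a few lines: the recurrent weights are fixed and random, and only a linear readout is trained on the reservoir's states. The sketch below is a minimal echo state network on a toy cue-maintenance task; it is NOT the authors' biophysical model — the network size, weight scalings, cue task, and ridge penalty are all illustrative assumptions.

```python
import numpy as np

# Minimal echo state network (reservoir computing) sketch.
# Principle only: recurrent weights stay fixed and random; a linear
# readout is trained on reservoir states. All sizes, scalings, and the
# toy cue task below are illustrative choices, not the BRC model.

rng = np.random.default_rng(0)
n_res, n_in = 200, 1

# Fixed random input and recurrent weights; the recurrent matrix is
# rescaled so its spectral radius is below 1 (echo state property).
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))

def run_reservoir(inputs):
    """Drive the reservoir with an input sequence; return all states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Toy working-memory task: after a brief cue pulse, the readout should
# keep reporting the cue's amplitude for the rest of the trial.
T, cue_len = 50, 5
def make_trial(amplitude):
    u = np.zeros(T)
    u[:cue_len] = amplitude
    target = np.full(T, amplitude)
    return u, target

# Collect reservoir states and targets from several training trials.
X, Y = [], []
for amp in [-1.0, -0.5, 0.5, 1.0]:
    u, y = make_trial(amp)
    X.append(run_reservoir(u))
    Y.append(y)
X, Y = np.vstack(X), np.concatenate(Y)

# Train only the linear readout, via ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)

# Run a held-out cue amplitude through the trained readout.
u, y = make_trial(0.75)
pred = run_reservoir(u) @ W_out
```

The BRC framework replaces the abstract tanh units above with synaptically coupled, biophysically detailed excitatory and inhibitory cells, but the same training logic applies: the recurrent dynamics are left untouched and only the readout is optimized.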
