Task-dependence of network-to-network variability in learning, performance, and dynamics of heterogeneous recurrent networks
Santhosh, A.; Narayanan, R.
Artificial recurrent networks are powerful models for studying neural dynamics and representations underlying complex cognitive tasks. However, the impact of neural-circuit heterogeneities on learning, dynamics, robustness, and generalization in these networks remains poorly understood. Here, we systematically investigated the impact of graded intrinsic heterogeneities in artificial recurrent networks trained on different cognitive tasks using reward-modulated Hebbian learning. Across networks trained with distinct hyperparameters and different levels of intrinsic heterogeneity, we observed pronounced network-to-network and task-to-task variability in training convergence, error dynamics during training, and task performance. These effects were strongly task dependent, with memory-dependent tasks exhibiting greater sensitivity to heterogeneity than memoryless tasks. We assessed these networks for robustness to multiple forms of graded post-training perturbations. Perturbations to intrinsic time constant distributions altered network dynamics, but had limited impact on final task accuracy in most cases. In contrast, perturbations to initial conditions, exploratory activity impulses, or task epoch durations strongly affected memory-dependent tasks. Among all perturbations, synaptic jitter was consistently the most detrimental, impairing performance across all tasks and heterogeneity levels. Importantly, despite such pronounced impact of heterogeneities, none of the metrics (spanning training, performance, dynamics, and robustness) varied monotonically with the level of training heterogeneity, instead showing additional dependencies on task demands, network configuration, and perturbation type. Finally, networks trained on a single task were able to perform structurally related untrained tasks, but failed on fundamentally distinct tasks.
Strikingly, similar task performances emerged from divergent activity trajectories across networks and training conditions, together revealing pronounced functional degeneracy in network dynamics. Collectively, our findings establish that heterogeneous recurrent networks operate in a complex systems regime, where robust function emerges from non-unique, task-specific interactions among hyperparameters, dynamics, and heterogeneities. Our analyses emphasize the need for population-of-networks approaches that focus on interactions among multiple forms of neural heterogeneities in shaping learning and computation.
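The two ingredients named in the abstract, graded intrinsic heterogeneity (per-unit time constants) and reward-modulated Hebbian learning driven by exploratory activity impulses, can be sketched in a few lines. This is a minimal, hypothetical illustration, not the authors' implementation: the network size, heterogeneity parameterization (`het_level`), noise scale, learning rate, and reward definition are all assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50      # recurrent units
T = 200     # timesteps per trial
dt = 1.0

# Graded intrinsic heterogeneity: per-unit time constants drawn from a
# distribution whose spread (het_level) sets the heterogeneity level.
het_level = 0.5                                  # 0 = homogeneous network
tau = 10.0 * (1.0 + het_level * rng.uniform(-1.0, 1.0, N))

W = rng.normal(0.0, 1.5 / np.sqrt(N), (N, N))    # recurrent weights (plastic)
w_out = rng.normal(0.0, 1.0 / np.sqrt(N), N)     # fixed linear readout
eta = 5e-4                                       # learning rate
r_bar = 0.0                                      # running reward baseline

def run_trial(target):
    """One trial of reward-modulated Hebbian learning on a scalar target."""
    global W, r_bar
    x = np.zeros(N)
    elig = np.zeros((N, N))                      # eligibility trace
    for t in range(T):
        r = np.tanh(x)
        noise = 0.1 * rng.normal(size=N)         # exploratory activity impulses
        x += (dt / tau) * (-x + W @ r) + noise
        # Hebbian eligibility: (exploratory fluctuation) x (presynaptic rate)
        elig += np.outer(noise, r)
    out = w_out @ np.tanh(x)
    reward = -(out - target) ** 2                # scalar end-of-trial reward
    # Reward-modulated Hebbian update: eligibility gated by reward
    # relative to its running baseline.
    W += eta * (reward - r_bar) * elig
    r_bar = 0.9 * r_bar + 0.1 * reward
    return reward
```

Because the reward is a single scalar delivered at the end of the trial, credit assignment rests entirely on the correlation between exploratory noise and presynaptic activity stored in the eligibility trace, which is one reason training outcomes can vary so strongly from network to network.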