Upgrading Voxel-wise Encoding Model via Integrated Integration over Features and Brain Networks
Li, Y.; Yang, H.; Gu, S.
A central goal of cognitive neuroscience is to build computational models that predict and explain cortical responses to sensory inputs. Recent studies borrow the representational power of deep neural networks (DNNs) to predict brain responses and suggest a correspondence between artificial and biological neural networks in their feature representations. However, each DNN instance is typically trained for a specific computer vision task, which may not yield optimal brain correspondence. Moreover, these voxel-wise encoding models predict each voxel independently, whereas brain activity during cognitive tasks exhibits rich, dynamic structure at the population and network levels. These two observations suggest that prevalent voxel-wise encoding models can be improved by integrating features across DNN models and by incorporating cortical network information. In this work, we propose a unified framework that addresses both aspects through DNN feature-level ensemble learning and brain atlas-level model integration. Our approach outperforms previous DNN-based encoding models in predicting whole-brain neural activity during naturalistic video perception. Furthermore, the unified framework facilitates investigation of the brain's neural representation mechanisms by accurately predicting neural responses to complex visual concepts.
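The voxel-wise encoding setup the abstract builds on can be sketched as a regularized linear map from DNN features to each voxel's response, with feature-level integration realized here as simple concatenation of layer features. This is an illustrative sketch only, not the authors' implementation; all array names, shapes, and the ridge penalty are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical features from two DNN layers for 100 video frames.
feats_layer1 = rng.standard_normal((100, 20))  # e.g. an early visual layer
feats_layer2 = rng.standard_normal((100, 30))  # e.g. a late semantic layer

# Feature-level integration: concatenate layer features (one simple ensemble).
X = np.hstack([feats_layer1, feats_layer2])    # (n_samples, n_features)

# Simulated fMRI responses for 5 voxels driven by the features plus noise.
true_w = rng.standard_normal((X.shape[1], 5))
Y = X @ true_w + 0.1 * rng.standard_normal((100, 5))

def ridge_fit(X, Y, alpha=1.0):
    """Closed-form ridge regression: one weight column per voxel."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ Y)

W = ridge_fit(X, Y)
pred = X @ W

# Voxel-wise encoding accuracy: correlation between predicted and measured
# responses, computed independently for each voxel.
r = np.array([np.corrcoef(pred[:, v], Y[:, v])[0, 1] for v in range(5)])
print(r.round(2))
```

In practice the per-voxel correlations would be computed on held-out data, and atlas-level integration would fit or combine such models separately within each cortical network rather than over all voxels at once.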