A hierarchical generative model reveals enhanced latent precision of brain-body interaction dynamics during interoceptive attention
Shinagawa, K.; Idei, H.; Umeda, S.; Yamashita, Y.
Brain-body interactions (BBIs) are fundamental to cognition and mental health, but their continuous multimodal dynamics remain difficult to extract. Previous approaches have been largely observational, and few frameworks enable these interacting processes to be modeled within an integrated generative system. Here, we applied a Predictive-Coding-Inspired Variational RNN (PV-RNN) to simultaneous EEG, ECG, and respiration recordings obtained from 33 participants during exteroceptive and interoceptive attention. The model learned a temporal hierarchy spanning modality-specific dynamics, multimodal associative integration, and sequence-level global states, and accurately reconstructed unseen physiological sequences. Specifically, the intermediate associative layer successfully captured the core complexities of BBI by extracting multiscale, nonlinear, and bidirectional coupling dynamics with variable temporal lags. Furthermore, the estimated precision (inverse variance) of latent variables representing BBI dynamics within this multimodal associative layer increased significantly during interoceptive attention. The magnitude of this condition-dependent precision enhancement correlated positively with subjective adaptive body controllability and negatively with psychiatric vulnerabilities, including rumination and trait anxiety. These findings identify a latent physiological signature of interoceptive attention and establish hierarchical generative modeling as an interpretable framework for extracting continuous BBI dynamics and linking multimodal physiology to cognitive and clinical characteristics.