NeuroNarrator: A Generalist EEG-to-Text Foundation Model for Clinical Interpretation via Spectro-Spatial Grounding and Temporal State-Space Reasoning

Wang, G.; Yang, S.; Ding, J.-e.; Zhu, H.; Liu, F.

2026-03-10 · bioRxiv (bioinformatics)
doi:10.64898/2026.03.07.707799

Electroencephalography (EEG) provides a non-invasive window into neural dynamics at high temporal resolution and plays a pivotal role in clinical neuroscience research. Despite this potential, prevailing computational approaches to EEG analysis remain largely confined to task-specific classification objectives or coarse-grained pattern recognition, offering limited support for clinically meaningful interpretation. To address these limitations, we introduce NeuroNarrator, the first generalist EEG-to-text foundation model designed to translate electrophysiological segments into precise clinical narratives. A cornerstone of this framework is the curation of NeuroCorpus-160K, the first harmonized, large-scale resource pairing over 160,000 EEG segments with structured, clinically grounded natural-language descriptions. Our architecture first aligns temporal EEG waveforms with spatial topographic maps via a rigorous contrastive objective, establishing spectro-spatially grounded representations. Building on this grounding, we condition a Large Language Model through a state-space-inspired formulation that integrates historical temporal and spectral context to support coherent clinical narrative generation. This approach establishes a principled bridge between continuous signal dynamics and discrete clinical language, enabling interpretable narrative generation that facilitates expert interpretation and supports clinical reporting workflows. Extensive evaluations across diverse benchmarks and zero-shot transfer tasks highlight NeuroNarrator's capacity to integrate temporal, spectral, and spatial dynamics, positioning it as a foundational framework for time-frequency-aware, open-ended clinical interpretation of electrophysiological data.
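The abstract does not give implementation details for the contrastive waveform-topography alignment. As an illustrative sketch only, such objectives are commonly realized as a CLIP-style symmetric InfoNCE loss over paired embeddings; every name, the temperature value, and the toy vectors below are assumptions, not taken from the paper:

```python
import math

def l2_normalize(v):
    """Scale a vector to unit length (guard against the zero vector)."""
    n = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / n for x in v]

def symmetric_infonce(wave_emb, topo_emb, temperature=0.07):
    """CLIP-style symmetric contrastive loss over a batch of paired
    waveform / topographic-map embeddings (lists of equal-length vectors).
    Matched pairs share an index; all other pairs act as negatives."""
    wave = [l2_normalize(v) for v in wave_emb]
    topo = [l2_normalize(v) for v in topo_emb]
    n = len(wave)
    # Cosine-similarity logits: waveform i vs topographic map j.
    logits = [[sum(a * b for a, b in zip(wave[i], topo[j])) / temperature
               for j in range(n)] for i in range(n)]

    def ce_row(row, target):
        # Cross-entropy of softmax(row) against the matched index.
        m = max(row)
        logsum = m + math.log(sum(math.exp(x - m) for x in row))
        return logsum - row[target]

    loss_w2t = sum(ce_row(logits[i], i) for i in range(n)) / n
    cols = [[logits[i][j] for i in range(n)] for j in range(n)]
    loss_t2w = sum(ce_row(cols[j], j) for j in range(n)) / n
    return (loss_w2t + loss_t2w) / 2
```

With perfectly matched pairs the loss approaches zero, while mismatched pairings drive it up, which is the property that pulls the two modalities into a shared embedding space.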

Matching journals

The top 9 journals account for 50% of the predicted probability mass.

1. Advanced Science (249 papers in training set; Top 1%): 10.1%
2. IEEE Journal of Biomedical and Health Informatics (34 papers in training set; Top 0.1%): 8.5%
3. Nature Communications (4913 papers in training set; Top 22%): 8.5%
4. Nature Machine Intelligence (61 papers in training set; Top 0.3%): 6.9%
5. Proceedings of the National Academy of Sciences (2130 papers in training set; Top 19%): 3.7%
6. Genome Medicine (154 papers in training set; Top 2%): 3.6%
7. Scientific Reports (3102 papers in training set; Top 36%): 3.6%
8. PLOS Computational Biology (1633 papers in training set; Top 11%): 3.1%
9. npj Digital Medicine (97 papers in training set; Top 1%): 2.7%
   [50% of probability mass above]
10. PLOS ONE (4510 papers in training set; Top 44%): 2.7%
11. Nature Methods (336 papers in training set; Top 3%): 2.6%
12. eLife (5422 papers in training set; Top 35%): 2.1%
13. npj Systems Biology and Applications (99 papers in training set; Top 0.8%): 2.1%
14. Communications Biology (886 papers in training set; Top 5%): 2.1%
15. Science Advances (1098 papers in training set; Top 12%): 2.1%
16. NeuroImage (813 papers in training set; Top 4%): 1.9%
17. Bioinformatics (1061 papers in training set; Top 7%): 1.9%
18. Computational and Structural Biotechnology Journal (216 papers in training set; Top 5%): 1.5%
19. Nature Medicine (117 papers in training set; Top 3%): 1.5%
20. Journal of Neural Engineering (197 papers in training set; Top 1%): 1.3%
21. Nature (575 papers in training set; Top 13%): 1.2%
22. Nature Biotechnology (147 papers in training set; Top 6%): 1.2%
23. iScience (1063 papers in training set; Top 21%): 1.2%
24. Nucleic Acids Research (1128 papers in training set; Top 13%): 1.2%
25. Cell Reports Methods (141 papers in training set; Top 4%): 1.1%
26. Nature Biomedical Engineering (42 papers in training set; Top 1%): 1.1%
27. Imaging Neuroscience (242 papers in training set; Top 3%): 0.9%
28. Patterns (70 papers in training set; Top 2%): 0.9%
29. Nature Computational Science (50 papers in training set; Top 1%): 0.8%
30. Bioinformatics Advances (184 papers in training set; Top 5%): 0.8%
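The statement that the top 9 journals account for 50% of the predicted probability mass can be checked from the listed percentages. The cumulative-sum rule below (smallest top-k whose probabilities reach the target) is an assumption about how the cutoff is computed:

```python
# Predicted probabilities (%) in rank order, as shown in the list above.
probs = [10.1, 8.5, 8.5, 6.9, 3.7, 3.6, 3.6, 3.1, 2.7,
         2.7, 2.6, 2.1, 2.1, 2.1, 2.1, 1.9, 1.9, 1.5, 1.5, 1.3,
         1.2, 1.2, 1.2, 1.2, 1.1, 1.1, 0.9, 0.9, 0.8, 0.8]

def top_k_for_mass(probs, target=50.0):
    """Return the smallest k such that the top-k probabilities sum to
    at least `target` percent; falls back to the full list length."""
    total = 0.0
    for k, p in enumerate(probs, start=1):
        total += p
        if total >= target:
            return k
    return len(probs)

print(top_k_for_mass(probs))  # → 9
```

The first nine values sum to 50.7%, so rank 9 is indeed where the cumulative mass first crosses 50%.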