Acoustic features of emotional vocalisations account for early modulations of event-related brain potentials

Tang, Y.; Corballis, P. M.; Hallum, L. E.

2026-01-21 | physiology
doi: 10.64898/2026.01.18.700181 | bioRxiv
Abstract

Emotion is key to human communication: inferring emotion from a speaker's voice is a cross-cultural and cross-linguistic capability. Electroencephalography (EEG) studies of the neural mechanisms supporting emotion perception have reported that early components of the event-related potential (ERP) are modulated by emotion. However, the nature of emotion's effect, especially on the P200 component, is disputed. We hypothesised that early acoustic features of emotional utterances might account for ERP modulations previously attributed to emotion. We recorded multi-channel EEG from healthy participants (n = 30) tasked with recognising the emotion of utterances. We used fifty vocalisations in five emotion categories - anger, happiness, neutral, sadness, and pleasure - drawn from the Montreal Affective Voices dataset. We statistically quantified instantaneous associations between ERP amplitudes, emotion categories, and acoustic features, specifically intensity, pitch, first formant, and second formant. We found that shortly after utterance onset (120-250 ms, i.e., P200 and early P300), ERP amplitude for sad vocalisations was less than for the other emotion categories. Moreover, ERP amplitude at around 180 ms for happy vocalisations was less than for anger, sadness, and pleasure. Our analysis showed that acoustic intensity explains most of these early-latency effects. We also found that, at longer latency (220-500 ms; late P200, P300), ERP amplitude for neutral vocalisations was less than for the other emotion categories. Furthermore, there were also ERP differences between anger and happiness, anger and pleasure, anger and sadness, happiness and pleasure, and happiness and sadness in shorter windows during this late period. Acoustic pitch and, to a lesser degree, acoustic intensity explain most of these later effects. We conclude that acoustic features can account for early ERP modulations evoked by emotional utterances. Because previous studies used a variety of stimuli, our result likely resolves previous disputes over emotion's effect on the P200.

Matching journals

The top 5 journals account for 50% of the predicted probability mass.

 1. Human Brain Mapping: 18.7% (295 papers in training set; top 0.2%)
 2. Scientific Reports: 12.5% (3102 papers in training set; top 4%)
 3. NeuroImage: 8.4% (813 papers in training set; top 1%)
 4. Cerebral Cortex: 8.4% (357 papers in training set; top 0.1%)
 5. PLOS ONE: 6.3% (4510 papers in training set; top 28%)
    -- 50% of probability mass above --
 6. Psychophysiology: 4.9% (64 papers in training set; top 0.1%)
 7. The Journal of Neuroscience: 4.0% (928 papers in training set; top 3%)
 8. eneuro: 3.6% (389 papers in training set; top 3%)
 9. European Journal of Neuroscience: 3.6% (168 papers in training set; top 0.1%)
10. International Journal of Psychophysiology: 3.1% (14 papers in training set; top 0.1%)
11. iScience: 1.7% (1063 papers in training set; top 16%)
12. PeerJ: 1.5% (261 papers in training set; top 8%)
13. Brain Topography: 1.3% (23 papers in training set; top 0.2%)
14. Proceedings of the National Academy of Sciences: 1.3% (2130 papers in training set; top 36%)
15. eLife: 1.2% (5422 papers in training set; top 49%)
16. Cognitive Neurodynamics: 1.2% (15 papers in training set; top 0.2%)
17. Hearing Research: 1.2% (49 papers in training set; top 0.3%)
18. Communications Biology: 1.0% (886 papers in training set; top 17%)
19. Journal of Cognitive Neuroscience: 0.9% (119 papers in training set; top 1%)
20. PLOS Biology: 0.9% (408 papers in training set; top 16%)
21. Neurobiology of Aging: 0.8% (95 papers in training set; top 2%)
22. Brain Stimulation: 0.7% (112 papers in training set; top 1%)
23. Frontiers in Physiology: 0.7% (93 papers in training set; top 6%)