Electrophysiological indices of hierarchical speech processing differentially reflect the comprehension of speech in noise

Synigal, S. R.; Anderson, A. J.; Lalor, E. C.

2023-03-31 neuroscience
10.1101/2023.03.30.534927 bioRxiv

The past few years have seen an increase in the use of encoding models to explain neural responses to natural speech. The goal of these models is to characterize how the human brain converts acoustic energy into the distinct linguistic representations that enable everyday speech comprehension. For example, researchers have shown that electroencephalography (EEG) data can be modeled in terms of acoustic features of speech, such as its amplitude envelope or spectrogram; linguistic features, such as phonemes and phoneme probability; and higher-level linguistic features, like context-based word predictability. However, it is unclear how reliably EEG indices of these speech feature representations reflect comprehension across different listening conditions. To address this, we recorded EEG from neurotypical adults who listened to segments of an audiobook at various levels of background noise. We modeled how their EEG responses reflected a range of acoustic and linguistic speech features and how this tracking varied with behavior across noise levels. EEG tracking of nearly all examined features showed signal-to-noise-ratio (SNR)-dependent changes in unique variance explained, with the largest changes occurring for linguistic features. We hypothesized that only higher-level feature tracking would predict behavior, but instead found that both high- and low-level features were associated with behavioral scores, depending on the noise level. EEG markers of the influence of top-down, context-based prediction on bottom-up acoustic processing also correlated with behavior. These findings help characterize the relationship between brain and behavior by comprehensively linking hierarchical indices of neural speech processing to language comprehension metrics.

Significance Statement

Acoustic and linguistic features of speech have been shown to be consistently tracked by neural activity, even in noisy conditions. However, it is unclear how signatures of low- and high-level features covary with one another and relate to behavior across these listening conditions. Here, we find that linguistic processing (of phonetic features and word probability-based features) is affected by noise more than low-level acoustic feature processing. We also find that behavioral performance is associated with acoustic, phonetic, and lexical surprisal tracking, and that these associations depend on background noise levels. These results extend our understanding of how various speech features are comparatively reflected in electrical brain activity and how they relate to perception in challenging listening conditions.
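The abstract does not specify an implementation, but the general approach it describes, fitting encoding models that predict EEG from time-lagged stimulus features and quantifying each feature set's unique variance explained by comparing full and reduced models, can be illustrated with a minimal sketch. The sketch below assumes numpy and scikit-learn; the array names (envelope, phonemes, surprisal, eeg_ch) and the lag count, regularization strength, and cross-validation settings are hypothetical placeholders, not the authors' actual pipeline.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

def lag_features(X, n_lags):
    """Build a time-lagged design matrix so the linear model can capture
    the delayed EEG response to each stimulus feature (TRF-style).
    X: (time, features) stimulus matrix."""
    T, F = X.shape
    lagged = np.zeros((T, F * n_lags))
    for lag in range(n_lags):
        lagged[lag:, lag * F:(lag + 1) * F] = X[:T - lag]
    return lagged

def variance_explained(X, y, n_lags=32, alpha=1e3):
    """Cross-validated R^2 of a ridge encoding model predicting one
    EEG channel (y) from lagged stimulus features (X).
    Note: a sketch only; time-series data would warrant
    contiguous, leakage-aware folds rather than default KFold."""
    Xl = lag_features(X, n_lags)
    pred = cross_val_predict(Ridge(alpha=alpha), Xl, y, cv=5)
    return 1 - np.var(y - pred) / np.var(y)

# Unique variance of a feature set = full-model R^2 minus the R^2 of
# a reduced model with that feature set left out (all names hypothetical):
# r2_full = variance_explained(np.hstack([envelope, phonemes, surprisal]), eeg_ch)
# r2_reduced = variance_explained(np.hstack([envelope, phonemes]), eeg_ch)
# unique_surprisal = r2_full - r2_reduced
```

Subtracting the reduced model's cross-validated R^2 from the full model's isolates the variance uniquely attributable to the left-out feature set, which is the quantity the abstract refers to as unique variance explained.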
