Watching Yourself Talk: Motor Experience Sharpens Sensitivity to Gesture-Speech Asynchrony

Vercillo, T.; Holler, J.; Noppeney, U.

2026-02-13 neuroscience
10.64898/2026.02.12.705486 bioRxiv
Language is inherently multisensory, with speech often accompanied by iconic gestures that convey semantic meaning related to actions, objects, or spatial relationships. Although the temporal coordination between speech and gesture is variable, the brain integrates these signals seamlessly. Yet the cognitive mechanisms behind this integration remain largely unclear. This study investigates whether sensorimotor experience, specifically with one's own speech and gestures, enhances temporal sensitivity through internal forward models that guide audiovisual prediction. Participants first produced sentences with corresponding iconic gestures, which were audiovisually recorded. These recordings were later temporally manipulated and presented in a simultaneity judgment task, in which participants evaluated both their own and others' recordings. Results revealed narrower temporal binding windows (TBWs), indicating heightened sensitivity to audiovisual asynchrony, when participants judged their own speech-gesture recordings compared to those of others. To further explore the role of motor experience, we analysed individual variability in gesture-speech timing during production and found no reliable relationship between production variability and perceptual sensitivity, suggesting that perceptual precision is not simply a reflection of motor consistency. These findings demonstrate that sensorimotor experience with self-generated movements sharpens multisensory temporal integration, likely via predictive internal models, and underscore the functional role of predictive motor mechanisms in supporting temporal integration across perceptual and action systems.
