TY - JOUR
T1 - More than words: Word predictability, prosody, gesture and mouth movements in natural language comprehension
JF - bioRxiv
DO - 10.1101/2020.01.08.896712
SP - 2020.01.08.896712
AU - Ye Zhang
AU - Diego Frassinelli
AU - Jyrki Tuomainen
AU - Jeremy I Skipper
AU - Gabriella Vigliocco
Y1 - 2020/01/01
UR - http://biorxiv.org/content/early/2020/03/23/2020.01.08.896712.abstract
N2 - The natural ecology of human language is face-to-face interaction, comprising cues such as co-speech gestures, mouth movements and prosody, tightly synchronized with speech. Yet this rich multimodal context is usually stripped away in experimental studies, as the dominant paradigm focuses on speech alone. We ask how these audiovisual cues impact brain activity during naturalistic language comprehension, how they are dynamically orchestrated and whether they are organized hierarchically. We quantified each cue in video clips of a speaker and used a well-established electroencephalographic marker of comprehension difficulty, an event-related potential peaking around 400 ms after word onset. We found that multimodal cues always modulated brain activity in interaction with speech, that their impact changed dynamically with their informativeness, and that there is a hierarchy: prosody showed the strongest effect, followed by gestures and mouth movements. Thus, this study provides a first snapshot of how the brain dynamically weights audiovisual cues in real-world language comprehension.
ER -