PT  - JOURNAL ARTICLE
AU  - Ganesan, Karthik
AU  - Plass, John
AU  - Beltz, Adriene M.
AU  - Liu, Zhongming
AU  - Grabowecky, Marcia
AU  - Suzuki, Satoru
AU  - Stacey, William C.
AU  - Wasade, Vibhangini S.
AU  - Towle, Vernon L.
AU  - Tao, James X.
AU  - Wu, Shasha
AU  - Issa, Naoum P.
AU  - Brang, David
TI  - Visual speech differentially modulates beta, theta, and high gamma bands in auditory cortex
AID - 10.1101/2020.09.07.284455
DP  - 2020 Jan 01
TA  - bioRxiv
PG  - 2020.09.07.284455
4099 - http://biorxiv.org/content/early/2020/09/07/2020.09.07.284455.short
4100 - http://biorxiv.org/content/early/2020/09/07/2020.09.07.284455.full
AB  - Speech perception is a central component of social communication. While speech perception is primarily driven by sounds, accurate perception in everyday settings is also supported by meaningful information extracted from visual cues (e.g., speech content, timing, and speaker identity). Previous research has shown that visual speech modulates activity in cortical areas subserving auditory speech perception, including the superior temporal gyrus (STG), likely through feedback connections from the multisensory posterior superior temporal sulcus (pSTS). However, it is unknown whether visual modulation of auditory processing in the STG is a unitary phenomenon or, rather, consists of multiple temporally, spatially, or functionally discrete processes. To explore these questions, we examined neural responses to audiovisual speech in electrodes implanted intracranially in the temporal cortex of 21 patients undergoing clinical monitoring for epilepsy. We found that visual speech modulates auditory processes in the STG in multiple ways, eliciting temporally and spatially distinct patterns of activity that differ across theta, beta, and high-gamma frequency bands. Before speech onset, visual information increased high-gamma power in the posterior STG and suppressed beta power in mid-STG regions, suggesting crossmodal prediction of speech signals in these areas.
After sound onset, visual speech decreased theta power in the middle and posterior STG, potentially reflecting a decrease in sustained feedforward auditory activity. These results are consistent with models that posit multiple distinct mechanisms supporting audiovisual speech perception.

Significance Statement - Visual speech cues are often needed to disambiguate distorted speech sounds in the natural environment. However, understanding how the brain encodes and transmits visual information for use by the auditory system remains a challenge. One persistent question is whether visual signals have a unitary effect on auditory processing or elicit multiple distinct effects throughout auditory cortex. To better understand how vision modulates speech processing, we measured neural activity produced by audiovisual speech from electrodes surgically implanted in auditory areas of 21 patients with epilepsy. Group-level statistics using linear mixed-effects models demonstrated distinct patterns of activity across different locations, timepoints, and frequency bands, suggesting the presence of multiple audiovisual mechanisms supporting speech perception processes in auditory cortex.

Competing Interest Statement - The authors have declared no competing interest.