bioRxiv
Visual and auditory cortices represent acoustic speech-related information during silent lip reading

Felix Bröhl, Anne Keitel, Christoph Kayser
doi: https://doi.org/10.1101/2022.02.21.481292
Felix Bröhl
1 Department for Cognitive Neuroscience, Faculty of Biology, Bielefeld University, Universitätsstr. 25, 33615 Bielefeld, Germany
Correspondence: felix.broehl@uni-bielefeld.de
Anne Keitel
2 Psychology, University of Dundee, Scrymgeour Building, Dundee DD1 4HN, UK
Christoph Kayser
1 Department for Cognitive Neuroscience, Faculty of Biology, Bielefeld University, Universitätsstr. 25, 33615 Bielefeld, Germany

Abstract

Speech is an intrinsically multisensory signal, and seeing the speaker’s lips forms a cornerstone of communication in acoustically impoverished environments. Still, it remains unclear how the brain exploits visual speech for comprehension, and previous work has debated whether lip signals are processed mainly along the auditory pathways or whether the visual system directly implements speech-related processes. To probe this question, we systematically characterized dynamic representations of multiple acoustic and visual speech-derived features in source-localized MEG recordings obtained while participants listened to speech or viewed silent speech. Using a mutual-information framework, we provide a comprehensive assessment of how well temporal and occipital cortices reflect the physically presented signals and speech-related features that were physically absent but may still be critical for comprehension. Our results demonstrate that both cortices are capable of a functionally specific form of multisensory restoration: during lip reading, both reflect unheard acoustic features, with occipital regions emphasizing spectral information and temporal regions emphasizing the speech envelope. Importantly, the degree of envelope restoration was predictive of lip-reading performance. These findings suggest that when seeing the speaker’s lips, the brain engages both visual and auditory pathways to support comprehension by exploiting multisensory correspondences between lip movements and spectro-temporal acoustic cues.
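The mutual-information framework mentioned above can be illustrated with a generic histogram-based estimator: mutual information quantifies, in bits, how much knowing one signal (e.g. a speech feature) reduces uncertainty about another (e.g. a neural time course). Note this is only a minimal sketch under simplifying assumptions; the study's actual estimator and preprocessing pipeline are not specified here, and the signal names below (`envelope`, `meg`, `noise`) are hypothetical stand-ins rather than the authors' data.

```python
import numpy as np

def mutual_information(x, y, n_bins=4):
    """Histogram-based estimate of I(X; Y) in bits between two 1-D signals.

    A simplified illustration only; real MEG analyses use bias-corrected
    or copula-based estimators rather than raw histograms.
    """
    # Discretize each signal into roughly equipopulated bins via quantiles
    xb = np.digitize(x, np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1]))
    yb = np.digitize(y, np.quantile(y, np.linspace(0, 1, n_bins + 1)[1:-1]))
    # Joint probability table and its marginals
    joint = np.histogram2d(xb, yb, bins=n_bins)[0]
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    # I(X;Y) = sum p(x,y) * log2(p(x,y) / (p(x) p(y))), skipping empty cells
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# Hypothetical stand-ins: a "speech envelope", a correlated "MEG" source
# time course, and an unrelated control signal.
rng = np.random.default_rng(0)
envelope = rng.standard_normal(5000)
meg = envelope + rng.standard_normal(5000)
noise = rng.standard_normal(5000)

# The correlated signal carries more information about the envelope
print(mutual_information(envelope, meg) > mutual_information(envelope, noise))
```

In this framing, "restoration" of an unheard feature corresponds to a neural signal carrying significant mutual information about an acoustic feature that was never physically presented.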

Highlights

  • Visual and auditory cortex represent unheard acoustic information during lip reading

  • Auditory cortex emphasizes the acoustic envelope

  • Visual cortex emphasizes a pitch signature

  • Tracking of unheard features in auditory cortex is associated with behavior

Competing Interest Statement

The authors have declared no competing interest.

Copyright 
The copyright holder for this preprint is the author/funder, who has granted bioRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY-NC-ND 4.0 International license.
Posted February 22, 2022.
Subject Area

  • Neuroscience