Electrophysiological evidence for speech-specific audiovisual integration

M Baart, JJ Stekelenburg, J Vroomen - Neuropsychologia, 2014 - Elsevier
Lip-read speech is integrated with heard speech at various neural levels. Here, we
investigated the extent to which lip-read induced modulations of the auditory N1 and P2 (measured …

Lip-reading enables the brain to synthesize auditory features of unknown silent speech

M Bourguignon, M Baart, EC Kapnoula… - Journal of …, 2020 - Soc Neuroscience
Lip-reading is crucial for understanding speech in challenging conditions. But how the brain
extracts meaning from silent, visual speech is still under debate. Lip-reading in silence …

Quantifying lip‐read‐induced suppression and facilitation of the auditory N1 and P2 reveals peak enhancements and delays

M Baart - Psychophysiology, 2016 - Wiley Online Library
Lip‐read speech suppresses and speeds up the auditory N1 and P2 peaks, but these
effects are not always observed or reported. Here, the robustness of lip‐read‐induced N1/P2 …

Implementing EEG hyperscanning setups

…, H Liu, G Blanco-Gomez, MI van den Heuvel, M Baart… - MethodsX, 2019 - Elsevier
Hyperscanning refers to obtaining simultaneous neural recordings from more than one
person (Montague et al., 2002 [1]), which can be used to study interactive situations. In particular, …

The late positive potential (LPP): A neural marker of internalizing problems in early childhood

MA McLean, BRH Van den Bergh, M Baart… - International Journal of …, 2020 - Elsevier
Background: One potentially relevant neurophysiological marker of internalizing problems (anxiety/depressive
symptoms) is the late positive potential (LPP), as it is related to processing …

Degrading phonetic information affects matching of audiovisual speech in adults, but not in infants

M Baart, J Vroomen, K Shaw, H Bortfeld - Cognition, 2014 - Elsevier
Infants and adults are well able to match auditory and visual speech, but the cues on which
they rely (viz. temporal, phonetic and energetic correspondence in the auditory and visual …

Turning a blind eye to the lexicon: ERPs show no cross-talk between lip-read and lexical context during speech sound processing

M Baart, AG Samuel - Journal of Memory and Language, 2015 - Elsevier
Electrophysiological research has shown that pseudowords elicit more negative Event-Related
Potentials (i.e., ERPs) than words within 250 ms after the lexical status of a speech token …

Phonetic recalibration only occurs in speech mode

J Vroomen, M Baart - Cognition, 2009 - Elsevier
Upon hearing an ambiguous speech sound dubbed onto lipread speech, listeners adjust
their phonetic categories in accordance with the lipread information (recalibration) that tells …

Lexical access versus lexical decision processes for auditory, visual, and audiovisual items: Insights from behavioral and neural measures

RAL Zunini, M Baart, AG Samuel, BC Armstrong - Neuropsychologia, 2020 - Elsevier
In two experiments, we investigated the relationship between lexical access processes and
processes that are specifically related to making lexical decisions. In Experiment 1, …

Lipread-induced phonetic recalibration in dyslexia

M Baart, L de Boer-Schellekens, J Vroomen - Acta Psychologica, 2012 - Elsevier
Auditory phoneme categories are less well-defined in developmental dyslexic readers than
in fluent readers. Here, we examined whether poor recalibration of phonetic boundaries …