The McGurk illusion

2021 ◽ Vol 149 (4) ◽ pp. A32-A32 ◽ Author(s): Kristin J. Van Engen

2017 ◽ Vol 38 (11) ◽ pp. 5691-5705 ◽ Author(s): Luis Morís Fernández, Emiliano Macaluso, Salvador Soto-Faraco

Author(s): Lucas Murrins Marques, Olivia Morgan Lapenta, Lotfi B. Merabet, Nadia Bolognini, Paulo Sérgio Boggio

2007 ◽ Vol 45 (3) ◽ pp. 587-597 ◽ Author(s): Dave Saint-Amour, Pierfilippo De Sanctis, Sophie Molholm, Walter Ritter, John J. Foxe

2018 ◽ Vol 120 (6) ◽ pp. 2988-3000 ◽ Author(s): Noelle T. Abbott, Antoine J. Shahin

In spoken language, audiovisual (AV) perception occurs when the visual modality influences encoding of acoustic features (e.g., phonetic representations) at the auditory cortex. We examined how visual speech (mouth movements) transforms phonetic representations, indexed by changes to the N1 auditory evoked potential (AEP). EEG was acquired while human subjects watched and listened to videos of a speaker uttering consonant-vowel (CV) syllables, /ba/ and /wa/, presented in auditory-only, AV congruent, or AV incongruent contexts, or in a context in which the consonants were replaced by white noise (noise replaced). Subjects reported whether they heard “ba” or “wa.” We hypothesized that the auditory N1 amplitude during illusory perception (caused by incongruent AV input, as in the McGurk illusion, or by white noise-replaced consonants in CV utterances) should shift to reflect the auditory N1 characteristics of the phonemes conveyed visually (by mouth movements) as opposed to acoustically. Indeed, the N1 AEP became larger and occurred earlier when listeners experienced illusory “ba” (video /ba/, audio /wa/, heard as “ba”) and vice versa when they experienced illusory “wa” (video /wa/, audio /ba/, heard as “wa”), mirroring the N1 AEP characteristics of /ba/ and /wa/ observed in natural acoustic situations (e.g., the auditory-only setting). This visually mediated N1 behavior was also observed for noise-replaced CVs. Taken together, the findings suggest that information relayed by the visual modality modifies phonetic representations at the auditory cortex and that similar neural mechanisms support the McGurk illusion and visually mediated phonemic restoration.

NEW & NOTEWORTHY Using a variant of the McGurk illusion experimental design (with the syllables /ba/ and /wa/), we demonstrate that lipreading influences phonetic encoding at the auditory cortex. We show that the N1 auditory evoked potential morphology shifts to resemble the N1 morphology of the syllable conveyed visually. We also show similar N1 shifts when the consonants are replaced by white noise, suggesting that the McGurk illusion and visually mediated phonemic restoration rely on common mechanisms.
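
The key dependent measure here is the amplitude and latency of the N1, the negative deflection of the auditory evoked potential that typically peaks around 100 ms after sound onset. The sketch below shows one conventional way to extract such a peak from a trial-averaged waveform; it is a minimal illustration in Python/NumPy on synthetic data, not the authors' analysis pipeline, and the electrode, search window, and all numbers are assumptions.

```python
import numpy as np

def n1_peak(evoked_uv, times_s, window=(0.075, 0.150)):
    """Find the N1 as the most negative point of an averaged
    auditory evoked potential within a post-stimulus window.

    evoked_uv : 1-D array, trial-averaged voltage in microvolts
                (e.g., from a fronto-central electrode such as Cz)
    times_s   : 1-D array of sample times in seconds, same length
    window    : (start, end) search window in seconds; 75-150 ms is a
                conventional N1 range, not necessarily the paper's choice
    """
    mask = (times_s >= window[0]) & (times_s <= window[1])
    segment = evoked_uv[mask]
    idx = np.argmin(segment)           # N1 is a negativity
    amplitude = segment[idx]           # more negative = "larger" N1
    latency = times_s[mask][idx]
    return amplitude, latency

# Illustrative use with synthetic data (values are made up):
times = np.arange(-0.1, 0.4, 0.002)    # 500 Hz sampling, -100 to 400 ms
evoked = -3.0 * np.exp(-((times - 0.10) ** 2) / (2 * 0.015 ** 2))
amp, lat = n1_peak(evoked, times)
print(f"N1 amplitude {amp:.2f} uV at {lat * 1000:.0f} ms")
```

Comparing the amplitude and latency returned for illusory "ba" versus illusory "wa" trials against the auditory-only conditions is the kind of contrast the abstract describes.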


Author(s): Yadira Roa Romero, Julian Keil, Johanna Balz, Michael Niedeggen, Jürgen Gallinat, ...

2016 ◽ Vol 28 (1) ◽ pp. 1-7 ◽ Author(s): Claudia S. Lüttke, Matthias Ekman, Marcel A. J. van Gerven, Floris P. de Lange

Auditory speech perception can be altered by concurrent visual information. The superior temporal cortex is an important site for this integration process and was previously found to be sensitive to audiovisual congruency. However, the direction of this congruency effect (i.e., stronger or weaker activity for congruent compared with incongruent stimulation) has been equivocal. Here, we used fMRI to examine the neural responses of human participants during the McGurk illusion, in which auditory /aba/ and visual /aga/ inputs are fused into a perceived /ada/, in a large homogeneous sample of participants who consistently experienced the illusion. This enabled us to compare neuronal responses during congruent audiovisual stimulation with incongruent audiovisual stimulation leading to the McGurk illusion, while avoiding the possible confound of sensory surprise that arises when McGurk stimuli are only occasionally perceived. We found larger activity for congruent audiovisual stimuli than for incongruent (McGurk) stimuli in bilateral superior temporal cortex, extending into primary auditory cortex. This finding suggests that superior temporal cortex responds preferentially when auditory and visual inputs support the same representation.
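
The central contrast is a within-participant comparison of responses to congruent versus incongruent (McGurk) audiovisual stimuli in a superior temporal region of interest. Below is a minimal sketch of such a group-level comparison, assuming per-participant ROI-averaged GLM beta estimates; the variable names, sample size, and values are hypothetical, and the paper's actual fMRI model is not reproduced here.

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant GLM beta estimates, averaged over a
# bilateral superior temporal cortex ROI (one value per condition).
rng = np.random.default_rng(0)
n_subjects = 26                                      # illustrative only
beta_congruent = rng.normal(1.2, 0.4, n_subjects)    # e.g., AV /aba/ + /aba/
beta_mcgurk = rng.normal(0.9, 0.4, n_subjects)       # e.g., A /aba/ + V /aga/

# Paired test: does the ROI respond more strongly to congruent than to
# incongruent (McGurk) audiovisual stimulation across participants?
t, p = stats.ttest_rel(beta_congruent, beta_mcgurk)
print(f"congruent > McGurk: t({n_subjects - 1}) = {t:.2f}, p = {p:.4f}")
```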


2018 ◽ Vol 5 (3) ◽ pp. 170909 ◽ Author(s): Claudia S. Lüttke, Alexis Pérez-Bellido, Floris P. de Lange

The human brain can quickly adapt to changes in the environment. One example is phonetic recalibration: a speech sound is interpreted differently depending on accompanying visual speech, and this interpretation persists in the absence of visual information. Here, we examined the mechanisms of phonetic recalibration. Participants categorized the auditory syllables /aba/ and /ada/, which were sometimes preceded by so-called McGurk stimuli (in which an /aba/ sound, because of visual /aga/ input, is often perceived as ‘ada’). We found that a single trial of exposure to the McGurk illusion was sufficient to induce a recalibration effect: an auditory /aba/ stimulus was subsequently more often perceived as ‘ada’. Phonetic recalibration took place only when auditory and visual inputs were integrated into ‘ada’ (the McGurk illusion), and it depended on the sensory similarity between the preceding and the current auditory stimulus. Finally, a signal detection theory analysis showed that McGurk-induced phonetic recalibration resulted in both a criterion shift towards /ada/ and reduced sensitivity to distinguish between /aba/ and /ada/ sounds. The current study shows that phonetic recalibration depends on the perceptual integration of audiovisual information and leads to a perceptual shift in phoneme categorization.
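
The signal detection analysis mentioned at the end can be made concrete with the standard equal-variance Gaussian model, in which sensitivity is d′ = z(H) − z(F) and the criterion is c = −[z(H) + z(F)]/2. Below is a small illustration with made-up response counts (not the paper's data) showing how a bias towards ‘ada’ responses appears as a negative criterion alongside a drop in d′.

```python
from scipy.stats import norm

def sdt(hits, n_signal, fas, n_noise):
    """Equal-variance signal detection theory: d-prime and criterion c.
    Coding: 'signal' = /ada/ trials, 'noise' = /aba/ trials; a hit is an
    'ada' response to /ada/, a false alarm an 'ada' response to /aba/.
    """
    h = hits / n_signal
    f = fas / n_noise
    zh, zf = norm.ppf(h), norm.ppf(f)
    d_prime = zh - zf            # sensitivity to the /aba/-/ada/ contrast
    c = -0.5 * (zh + zf)         # negative c = bias toward 'ada' responses
    return d_prime, c

# Made-up counts for illustration (50 trials per stimulus):
print(sdt(45, 50, 5, 50))    # baseline: d' ~ 2.56, criterion ~ 0 (neutral)
print(sdt(47, 50, 15, 50))   # after McGurk exposure: criterion shifts
                             # toward 'ada' (c ~ -0.52) and d' drops (~2.08)
```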

