Temporal and spectral information processing in the auditory cortex: a steady-state auditory-evoked potential study

2005 · Vol 1278 · pp. 27-30
Author(s): Tomomi Kurokawa-Kuroda, Takao Yamasaki, Yoshinobu Goto, Shozo Tobimatsu
Author(s): Felipe A. Araujo, Eduardo B. Jacobi, Juliana Avila-Souza, Jose F. Rodrigues, Renan C. Moioli, ...

2011 · Vol 123 (2) · pp. 65-73
Author(s): Koichi Shibata, Kiyomi Yamane, Yoshiko Nishimura, Hiromi Kondo, Kuniaki Otuka

2019 · Vol 10
Author(s): Javier Sanchez-Lopez, Silvia Savazzi, Caterina Annalaura Pedersini, Nicolò Cardobi, Carlo Alberto Marzi

2018 · Vol 120 (6) · pp. 2988-3000
Author(s): Noelle T. Abbott, Antoine J. Shahin

In spoken language, audiovisual (AV) perception occurs when the visual modality influences encoding of acoustic features (e.g., phonetic representations) at the auditory cortex. We examined how visual speech (mouth movements) transforms phonetic representations, indexed by changes to the N1 auditory evoked potential (AEP). EEG was acquired while human subjects watched and listened to videos of a speaker uttering consonant-vowel (CV) syllables, /ba/ and /wa/, presented in auditory-only, AV congruent, or AV incongruent contexts, or in a context in which the consonants were replaced by white noise (noise-replaced). Subjects reported whether they heard "ba" or "wa." We hypothesized that the auditory N1 amplitude during illusory perception (caused by incongruent AV input, as in the McGurk illusion, or by white-noise-replaced consonants in CV utterances) should shift to reflect the N1 characteristics of the phonemes conveyed visually (by mouth movements) rather than acoustically. Indeed, the N1 AEP became larger and occurred earlier when listeners experienced illusory "ba" (video /ba/, audio /wa/, heard as "ba"), and the reverse held when they experienced illusory "wa" (video /wa/, audio /ba/, heard as "wa"), mirroring the N1 AEP characteristics for /ba/ and /wa/ observed in natural acoustic situations (e.g., the auditory-only setting). This visually mediated N1 behavior was also observed for noise-replaced CVs. Taken together, the findings suggest that information relayed by the visual modality modifies phonetic representations at the auditory cortex and that similar neural mechanisms support the McGurk illusion and visually mediated phonemic restoration.

NEW & NOTEWORTHY Using a variant of the McGurk illusion experimental design (with the syllables /ba/ and /wa/), we demonstrate that lipreading influences phonetic encoding at the auditory cortex. We show that the N1 auditory evoked potential morphology shifts to resemble the N1 morphology of the syllable conveyed visually. We also show similar N1 shifts when the consonants are replaced by white noise, suggesting that the McGurk illusion and visually mediated phonemic restoration rely on common mechanisms.
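As an illustrative aside (not taken from the paper), the sketch below shows one common way N1 amplitude and latency are quantified from trial-averaged evoked responses: average the epochs, baseline-correct, and take the most negative peak in an early post-stimulus window. The sampling rate, epoch layout, channel, and 70-150 ms search window are assumptions for illustration only, and the placeholder data stand in for real EEG.

```python
import numpy as np

# Hypothetical epoched EEG for one condition at one fronto-central channel:
# shape (n_trials, n_samples), sampled at 500 Hz, epochs spanning -100..500 ms.
fs = 500
t = np.arange(-0.1, 0.5, 1.0 / fs)            # time axis in seconds
rng = np.random.default_rng(0)
epochs = rng.normal(0.0, 5.0, (120, t.size))  # placeholder data, in microvolts

# Average across trials to obtain the auditory evoked potential (AEP).
aep = epochs.mean(axis=0)

# Baseline-correct using the pre-stimulus interval.
aep -= aep[t < 0].mean()

# N1 is a negative deflection roughly 70-150 ms after sound onset;
# take the most negative sample in that window as a simple peak estimate.
win = (t >= 0.07) & (t <= 0.15)
idx = np.argmin(aep[win])
n1_amplitude = aep[win][idx]                  # microvolts (negative-going)
n1_latency_ms = t[win][idx] * 1000.0          # milliseconds after onset

print(f"N1 amplitude: {n1_amplitude:.2f} uV at {n1_latency_ms:.0f} ms")
```

Comparing such amplitude and latency estimates across auditory-only, AV congruent, AV incongruent, and noise-replaced conditions is the kind of contrast the abstract describes, though the authors' actual pipeline is not specified here.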

