The role of left inferior frontal cortex during audiovisual speech perception in infants

NeuroImage ◽  
2016 ◽  
Vol 133 ◽  
pp. 14-20 ◽  
Author(s):  
Nicole Altvater-Mackensen ◽  
Tobias Grossmann

2009 ◽  
Vol 51 (2) ◽  
pp. 184-193 ◽  
Author(s):  
Tobias S. Andersen ◽  
Kaisa Tiippana ◽  
Jari Laarni ◽  
Ilpo Kojo ◽  
Mikko Sams

Perception ◽  
10.1068/p3316 ◽  
2003 ◽  
Vol 32 (8) ◽  
pp. 921-936 ◽  
Author(s):  
Maxine V McCotter ◽  
Timothy R Jordan

We conducted four experiments to investigate the role of colour and luminance information in visual and audiovisual speech perception. In experiments 1a (stimuli presented in quiet conditions) and 1b (stimuli presented in auditory noise), face display types comprised naturalistic colour (NC), grey-scale (GS), and luminance inverted (LI) faces. In experiments 2a (quiet) and 2b (noise), face display types comprised NC, colour inverted (CI), LI, and colour and luminance inverted (CLI) faces. Six syllables and twenty-two words were used to produce auditory and visual speech stimuli. Auditory and visual signals were combined to produce congruent and incongruent audiovisual speech stimuli. Experiments 1a and 1b showed that perception of visual speech, and its influence on identifying the auditory components of congruent and incongruent audiovisual speech, was weaker for LI faces than for either NC or GS faces, which produced identical results. Experiments 2a and 2b showed that perception of visual speech, and its influence on perception of incongruent auditory speech, was weaker for LI and CLI faces than for NC and CI faces (which produced identical patterns of performance). Our findings for NC and CI faces suggest that colour is not critical for perception of visual and audiovisual speech. The effect of luminance inversion on performance accuracy was relatively small (5%), which suggests that the luminance information preserved in LI faces is important for the processing of visual and audiovisual speech.


2017 ◽  
Vol 142 (4) ◽  
pp. 2705-2705
Author(s):  
Chao-Yang Lee ◽  
Margaret Harrison ◽  
Seth Wiener

2020 ◽  
Vol 63 (7) ◽  
pp. 2245-2254 ◽  
Author(s):  
Jianrong Wang ◽  
Yumeng Zhu ◽  
Yu Chen ◽  
Abdilbar Mamat ◽  
Mei Yu ◽  
...  

Purpose The primary purpose of this study was to explore the audiovisual speech perception strategies adopted by normal-hearing and deaf people in processing familiar and unfamiliar languages. Our primary hypothesis was that they would adopt different perception strategies owing to different sensory experiences at an early age, limitations of the physical device, developmental gaps in language, and other factors. Method Thirty normal-hearing adults and 33 prelingually deaf adults participated in the study. They were asked to perform judgment and listening tasks while watching videos of a Uygur–Mandarin bilingual speaker in a familiar language (Standard Chinese) or an unfamiliar language (Modern Uygur) while their eye movements were recorded by eye-tracking technology. Results Task had a slight influence on the distribution of selective attention, whereas subject and language had significant influences. Specifically, the normal-hearing and the deaf participants mainly gazed at the speaker's eyes and mouth, respectively, during the experiment; moreover, while the normal-hearing participants stared longer at the speaker's mouth when confronted with the unfamiliar language Modern Uygur, the deaf participants did not change their attention allocation pattern when perceiving the two languages. Conclusions Normal-hearing and deaf adults adopt different audiovisual speech perception strategies: Normal-hearing adults mainly look at the eyes, and deaf adults mainly look at the mouth. Additionally, language and task can also modulate the speech perception strategy.


2021 ◽  
Author(s):  
Veith Weilnhammer ◽  
Merve Fritsch ◽  
Meera Chikermane ◽  
Anna-Lena Eckert ◽  
Katharina Kanthak ◽  
...  

2007 ◽  
Vol 11 (4) ◽  
pp. 233-241 ◽  
Author(s):  
Nancy Tye-Murray ◽  
Mitchell Sommers ◽  
Brent Spehar
