Supplemental Material for The Role of Audiovisual Integration in the Perception of Attractiveness

2009 ◽  
Vol 65 ◽  
pp. S171
Author(s):  
Junya Hirokawa ◽  
Osamu Sadakane ◽  
Shuzo Sakata ◽  
Miquel Bosch ◽  
Yoshio Sakurai ◽  
...  

2012 ◽  
Vol 25 (0) ◽  
pp. 154
Author(s):  
Luis Morís Fernández ◽  
Maya Visser ◽  
Salvador Soto-Faraco

We assessed the role of audiovisual integration in selective attention by testing selective attention to sound. Participants were asked to focus on one of two audio speech streams presented simultaneously at different pitches. We measured recall of words from the cued or the uncued sentence using a 2AFC task at the end of each trial. A video clip of a speaker's mouth was presented in the middle of the display, matching one of the two simultaneous auditory streams (50% of the time it matched the cued sentence, and the rest of the time the uncued one). In Experiment 1 the cue was 75% valid. Recall in valid trials was better than in invalid ones. The critical result, however, was that differences between audiovisually matching and audiovisually mismatching sentences were found only in the valid condition; in the invalid condition these differences were absent. In Experiment 2 the cue to the relevant sentence was 100% valid, and we included a control condition in which the lips did not match either sentence. When the lips matched the cued sentence, performance was better than when they matched the uncued sentence or neither of them, suggesting a benefit of audiovisual matching rather than a cost of mismatch. Our results indicate that attention to acoustic frequency (pitch) plays an important role in determining which sounds benefit from multisensory integration.


2019 ◽  
Vol 13 (1) ◽  
pp. 1-15
Author(s):  
Alexis T. Mook ◽  
Aaron D. Mitchel

2010 ◽  
Vol 11 (1) ◽  
pp. 4-11 ◽  
Author(s):  
Jordi Navarra ◽  
Agnès Alsius ◽  
Salvador Soto-Faraco ◽  
Charles Spence

2014 ◽  
Vol 10 (5) ◽  
pp. 713-720 ◽  
Author(s):  
Jenny Kokinous ◽  
Sonja A. Kotz ◽  
Alessandro Tavano ◽  
Erich Schröger

i-Perception ◽  
2020 ◽  
Vol 11 (6) ◽  
pp. 204166952098109
Author(s):  
Qingqing Li ◽  
Qiong Wu ◽  
Yiyang Yu ◽  
Fengxia Wu ◽  
Satoshi Takahashi ◽  
...  

Attentional processes play a complex and multifaceted role in the integration of input from different sensory modalities. However, whether increased attentional load disrupts the audiovisual (AV) integration of common objects that involve semantic content remains unclear. Furthermore, knowledge regarding how semantic congruency interacts with attentional load to influence the AV integration of common objects is limited. We investigated these questions by examining AV integration under various attentional-load conditions. AV integration was assessed with an animal identification task using unisensory stimuli (animal images and sounds) and AV stimuli (semantically congruent and semantically incongruent AV objects), while attentional load was manipulated using a rapid serial visual presentation task. Our results indicate that attentional load did not attenuate the integration of semantically congruent AV objects. However, semantically incongruent animal sounds and images were not integrated (there was no multisensory facilitation), and the interference effect produced by the semantically incongruent AV objects was reduced under increased attentional load. These findings highlight the critical role of semantic congruency in modulating the effect of attentional load on the AV integration of common objects.


2021 ◽  
pp. 1-45
Author(s):  
Grant M. Walker ◽  
Patrick Rollo ◽  
Nitin Tandon ◽  
Gregory Hickok

Speech perception ability and structural neuroimaging were investigated in two cases of bilateral opercular syndrome. Due to bilateral ablation of the motor control center for the lower face and surrounds, these rare cases provide an opportunity to evaluate the necessity of cortical motor representations for speech perception, a cornerstone of some neurocomputational theories of language processing. Speech perception, including audiovisual integration (i.e., the McGurk effect), was mostly unaffected in these cases, although verbal short-term memory impairment hindered performance on several tasks that are traditionally used to evaluate speech perception. The results suggest that the role of the cortical motor system in speech perception is context dependent and supplementary, not inherent or necessary.


2019 ◽  
Vol 2 (XXI) ◽  
pp. 65-76
Author(s):  
Konrad Rachut

This paper draws attention to the role of nonverbal communication in the process of simultaneous interpreting. The theoretical basis is provided by the phenomenon of audiovisual integration: the ability of the human brain to incorporate both verbal and nonverbal signals into comprehension. Referring to previous works by F. Poyatos, S. Viaggio and K. Seeber, the author attempts to distinguish the core functions of nonverbal signals in simultaneous interpreting and to analyse K. Seeber's model of the cognitive resource footprint. Additionally, the theoretical and practical ramifications of taking nonverbal signals into consideration for the psychology and quality of work of simultaneous interpreters are pinpointed.


JAMA ◽  
1966 ◽  
Vol 195 (12) ◽  
pp. 1005-1009 ◽  
Author(s):  
D. J. Fernbach

JAMA ◽  
1966 ◽  
Vol 195 (3) ◽  
pp. 167-172 ◽  
Author(s):  
T. E. Van Metre
