Audio-visual interactions in egocentric distance perception: Ventriloquism effect and aftereffect

2020 ◽  
Author(s):  
Ľuboš Hládek ◽  
Aaron R Seitz ◽  
Norbert Kopčo

Abstract
The processes of audio-visual integration and of visually-guided re-calibration of auditory distance perception are not well understood. Here, the ventriloquism effect (VE) and aftereffect (VAE) were used to study these processes in a real reverberant environment. Auditory and audio-visual (AV) stimuli were presented, in interleaved trials, over a range of distances from 0.7 to 2.04 m in front of the listener, whose task was to judge the distance of auditory stimuli or of the auditory components of AV stimuli. The relative location of the visual and auditory components of AV stimuli was fixed within a session such that the visual component was presented from a distance 30% closer (V-closer) than the auditory component, 30% farther (V-farther), or aligned (V-aligned). The study examined the strength of the VE and VAE as a function of the reference distance and of the direction of the visual-component displacement, as well as the temporal profile of the build-up/break-down of these effects. All observed effects were approximately independent of target distance when expressed in logarithmic units. The VE strength, measured in the AV trials, was roughly constant for both directions of visual-component displacement such that, on average, the responses shifted in the direction of the visual component by 72% of the audio-visual disparity. The VAE strength, measured on the interleaved auditory-only trials, was stronger in the V-farther than the V-closer condition (44% vs. 31% of the audio-visual disparity, respectively). The VAE persisted to post-adaptation auditory-only blocks of trials; however, it was weaker and the V-farther/V-closer asymmetry was reduced. The rates of build-up/break-down of the VAE were also asymmetrical, with slower adaptation in the V-closer condition.
These results suggest that, on a logarithmic scale, AV distance integration is symmetrical, independent of the direction of the induced shift, while visually-induced auditory distance re-calibration is asymmetrical, being stronger and faster when evoked by more distant visual stimuli.
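The reported 72% shift can be made concrete with a minimal sketch, assuming the shift acts multiplicatively in log-distance (consistent with the abstract's observation that the effects are distance-independent in logarithmic units). The weighting function and the default weight of 0.72 are illustrative assumptions, not the authors' fitted model.

```python
import math

def ventriloquism_response(auditory_dist, visual_dist, weight=0.72):
    """Predicted perceived distance of the auditory component of an AV
    stimulus: the response shifts toward the visual component by
    `weight` of the audio-visual disparity, computed in log-distance.
    Distances are in meters; `weight` is illustrative."""
    log_a = math.log(auditory_dist)
    log_v = math.log(visual_dist)
    return math.exp(log_a + weight * (log_v - log_a))

# Example: auditory source at 1.0 m with the visual component 30%
# closer (0.7 m, the V-closer condition) pulls the perceived
# distance to about 0.77 m.
pred = ventriloquism_response(1.0, 0.7)
```

With aligned components (V-aligned) the prediction reduces to the auditory distance itself, since the log disparity is zero.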

Mixed Reality ◽  
1999 ◽  
pp. 201-214 ◽  
Author(s):  
Jack M. Loomis ◽  
Roberta L. Klatzky ◽  
Reginald G. Golledge

1995 ◽  
Vol 48 (2) ◽  
pp. 320-333 ◽  
Author(s):  
Eugen Diesch

If a place-of-articulation contrast is created between the auditory and the visual component syllables of videotaped speech, the syllable that listeners report hearing frequently differs phonetically from the auditory component. These "McGurk effects", as they have come to be called, show that speech perception may involve some kind of intermodal process. There are two classes of these phenomena: fusions and combinations. Perception of the syllable /da/ when auditory /ba/ and visual /ga/ are presented provides a clear example of the former, and perception of the string /bga/ after presentation of auditory /ga/ and visual /ba/ an unambiguous instance of the latter. Besides perceptual fusions and combinations, hearing the visually presented component syllable also shows an influence of vision on audition. It is argued that these "visual" responses arise from basically the same underlying processes that yield fusions and combinations, respectively. In the present study, the visual component of audiovisually incongruous CV syllables was presented in either the left or the right visual hemifield. Audiovisual fusion responses showed a left-hemifield advantage, and audiovisual combination responses a right-hemifield advantage. This finding suggests that the process of audiovisual integration differs between audiovisual fusions and combinations and, furthermore, that the two cerebral hemispheres contribute differentially to the two classes of response.


2021 ◽  
Vol 12 (1) ◽  
pp. 348 ◽  
Author(s):  
Vincent Martin ◽  
Isabelle Viaud-Delmon ◽  
Olivier Warusfel

Audio-only augmented reality consists of enhancing a real environment with virtual sound events. A seamless integration of the virtual events within the environment requires processing them with artificial spatialization and reverberation effects that simulate the acoustic properties of the room. However, in augmented reality, the visual and acoustic environment of the listener may not be fully controlled. This study aims to gain some insight into the acoustic cues (intensity and reverberation) that listeners use to form auditory distance judgments, and to observe whether these strategies can be influenced by the listener's environment. To do so, we present a perceptual evaluation of two distance-rendering models informed by a measured Spatial Room Impulse Response. The rendering methods were chosen to design stimulus categories that differ in the availability and reproduction quality of acoustic cues. The proposed models were evaluated in an online experiment gathering 108 participants, who were asked to provide auditory distance judgments about a stationary source. To evaluate the importance of environmental cues, participants had to describe the environment in which they were running the experiment, and more specifically the volume of the room and the distance to the wall they were facing. These context cues were shown to have a limited, but significant, influence on the perceived auditory distance.


Perception ◽  
10.1068/p7153 ◽  
2012 ◽  
Vol 41 (2) ◽  
pp. 175-192 ◽  
Author(s):  
Esteban R Calcagno ◽  
Ezequiel L Abregú ◽  
Manuel C Eguía ◽  
Ramiro Vergara

In humans, multisensory interaction is an important strategy for improving the detection of stimuli of different natures and reducing response variability. It is known that the presence of visual information affects auditory perception in the horizontal plane (azimuth), but little research has examined the influence of vision on auditory distance perception. In general, the data obtained from these studies are contradictory and do not completely define the way in which visual cues affect the apparent distance of a sound source. Here, psychophysical experiments on auditory distance perception in humans were performed, both including and excluding visual cues. The results show that the apparent distance of the source is affected by the presence of visual information and that subjects can store in memory a representation of the environment that later improves distance perception.


2014 ◽  
Vol 8 ◽  
Author(s):  
Matthew G. Wisniewski ◽  
Eduardo Mercado ◽  
Barbara A. Church ◽  
Klaus Gramann ◽  
Scott Makeig
