Remote measurement of spectral weighting in sound localization using virtual reality headsets

2021 · Vol. 150 (4) · pp. A299-A299
Author(s): Jwala P. Rejimon, Monica L. Folkerts, G. Christopher Stecker

2021 · Vol. 150 (4) · pp. A304-A304
Author(s): Monica L. Folkerts, Erin M. Picou, G. Christopher Stecker

2021 · Vol. 2
Author(s): Lasse Embøl, Carl Hutters, Andreas Junker, Daniel Reipur, Ali Adjorlu, ...

Cochlear implants (CIs) enable hearing in individuals with sensorineural hearing loss, albeit with difficulties in speech perception and sound localization. In noisy environments, these difficulties are disproportionately greater for CI users than for children with no reported hearing loss. Parents of children with CIs are motivated to experience what CIs sound like, but options for doing so are limited. This study proposes using virtual reality to simulate having CIs in a school setting with two contrasting environments: a noisy playground and a quiet classroom. To investigate differences between hearing conditions, an evaluation used a between-subjects design with 15 parents of children with CIs (10 female, 5 male; age M = 38.5 years, SD = 6.6), none of whom reported hearing loss themselves. In the virtual environment, a word recognition and sound localization test based on an open-set speech corpus compared simulated unilateral CI, simulated bilateral CI, and normal hearing conditions in both settings. Results of both tests indicate that noise influences word recognition more than it influences sound localization, but ultimately affects both. Furthermore, simulated bilateral CIs were as beneficial as, or significantly more beneficial than, a simulated unilateral CI in both tests. A follow-up qualitative evaluation showed that the simulation enabled users to achieve a better understanding of what it means to be a hearing-impaired child.
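
For illustration only, the following Python sketch computes the two outcome measures described above, a word-recognition rate and a mean absolute localization error, per simulated hearing condition and setting. The trial data, labels, and scoring are invented assumptions, not the study's dataset or procedure.

    from statistics import mean

    # Hypothetical per-trial results:
    # (hearing condition, setting, word repeated correctly?, absolute localization error in degrees).
    # A real session would contain many trials per condition/setting cell.
    trials = [
        ("unilateral CI",  "playground", False, 45.0),
        ("unilateral CI",  "classroom",  True,  30.0),
        ("bilateral CI",   "playground", True,  20.0),
        ("bilateral CI",   "classroom",  True,  15.0),
        ("normal hearing", "playground", True,  10.0),
        ("normal hearing", "classroom",  True,   5.0),
    ]

    def summarize(trials):
        """Per (condition, setting) cell: word-recognition rate and mean absolute localization error."""
        groups = {}
        for condition, setting, correct, loc_error in trials:
            groups.setdefault((condition, setting), []).append((correct, loc_error))
        return {
            key: (mean(float(c) for c, _ in rows), mean(e for _, e in rows))
            for key, rows in groups.items()
        }

    for (condition, setting), (score, error) in summarize(trials).items():
        print(f"{condition:14s} / {setting:10s}: word recognition {score:.0%}, localization error {error:4.1f} deg")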


Author(s): Aleksei Tepljakov, Sergei Astapov, Eduard Petlenkov, Kristina Vassiljeva, Dirk Draheim

Author(s): Stephen D. Sechler, Alejandro Lopez Valdes, Saskia M. Waechter, Cristina Simoes-Franklin, Laura Viani, ...

2018
Author(s): Axel Ahrens, Kasper Duemose Lund, Marton Marschall, Torsten Dau

To achieve accurate spatial auditory perception, subjects typically require individualized head-related transfer functions (HRTFs) and the freedom to move their heads. Loudspeaker-based virtual sound environments allow for realism without individualized measurements. To study audio-visual perception in realistic environments, the combination of spatially tracked head-mounted displays (HMDs), also known as virtual reality glasses, with virtual sound environments may be valuable. However, HMDs were recently shown to affect the subjects' HRTFs and thus might influence sound localization performance. Furthermore, limitations in the reproduction of visual information on the HMD might influence audio-visual perception. Here, a sound localization experiment was conducted both with and without an HMD and with varying amounts of visual information provided to the subjects. In addition, errors in interaural time and level differences (ITDs and ILDs), as well as spectral perturbations induced by the HMD, were analyzed and compared to the perceptual localization data. The results showed a reduction in localization accuracy when the subjects were wearing an HMD and when they were blindfolded. The HMD-induced error in azimuth localization was found to be larger in the left than in the right hemisphere. Thus, the errors in ITD and ILD can only partly account for the perceptual differences. When visual information about the limited set of source locations was provided, the localization error induced by the HMD was found to be negligible. Presenting visual information about hand location, room dimensions, source locations, and pointing feedback on the HMD produced effects similar to those previously shown in real environments.
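
As a rough illustration of the ITD/ILD analysis mentioned above, the Python sketch below estimates the interaural time difference from the peak of the interaural cross-correlation and the broadband interaural level difference from the RMS ratio of a left/right pair of head-related impulse responses. The HRIRs here are synthetic placeholders; the measurement and analysis details of the study are not reproduced.

    import numpy as np

    def itd_ild(hrir_left, hrir_right, fs):
        """Estimate ITD (seconds) and ILD (dB) from one pair of head-related
        impulse responses sampled at fs (Hz). With this convention, a negative
        ITD and a positive ILD both indicate a source toward the left ear."""
        # ITD: lag of the peak of the interaural cross-correlation.
        xcorr = np.correlate(hrir_left, hrir_right, mode="full")
        lag = np.argmax(np.abs(xcorr)) - (len(hrir_right) - 1)
        itd = lag / fs
        # ILD: broadband RMS level difference between the ears.
        rms = lambda x: np.sqrt(np.mean(np.square(x)))
        ild = 20.0 * np.log10(rms(hrir_left) / rms(hrir_right))
        return itd, ild

    # Synthetic HRIRs: the right-ear response is delayed and attenuated
    # relative to the left, as for a source on the listener's left side.
    fs = 48_000
    hrir_l = np.zeros(256)
    hrir_l[10] = 1.0
    hrir_r = 0.5 * np.roll(hrir_l, 24)       # ~0.5 ms later, about -6 dB
    print(itd_ild(hrir_l, hrir_r, fs))       # -> roughly (-0.0005 s, +6.0 dB)

Comparing such cue estimates obtained with and without the HMD in place is one way to relate the acoustic perturbations to the behavioral localization errors.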


2020 · Vol. 148 (4) · pp. 2619-2620
Author(s): Monica L. Folkerts, Erin M. Picou, G. Christopher Stecker

2020
Author(s): V. Gaveau, A. Coudert, R. Salemme, E. Koun, C. Desoche, ...

In everyday life, localizing a sound source in the free field entails more than the mere extraction of monaural and binaural auditory cues to define its location in three dimensions (azimuth, elevation, and distance). In spatial hearing, we also take into account all the available visual information (e.g., cues to sound position and cues to the structure of the environment), and we resolve perceptual ambiguities through active listening behavior, exploring the auditory environment with head and/or body movements. Here we introduce a novel approach to sound localization in 3D, named SPHERE (European patent no. WO2017203028A1), which exploits a commercially available virtual reality head-mounted display system with real-time kinematic tracking to combine all of these elements: controlled positioning of a real sound source, recording of participants' responses in 3D, controlled visual stimulation, and active listening behavior. We show that SPHERE allows accurate sampling of the 3D spatial hearing abilities of normal-hearing adults, and that it allows the contribution of active listening to be detected and quantified. Specifically, comparing static versus free head movement during sound emission, we found improved sound localization accuracy and precision when head movements were allowed. By combining visual virtual reality, real-time kinematic tracking, and real-sound delivery, we have achieved a novel approach to the study of spatial hearing, with the potential to capture real-life behaviors under laboratory conditions. This approach also paves the way for clinical and industrial applications that can leverage the full potential of the active listening and multisensory stimulation intrinsic to SPHERE for the purposes of rehabilitation and product assessment.
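
To make the accuracy/precision distinction above concrete, here is a minimal Python sketch that computes localization accuracy (mean absolute azimuth error) and precision (standard deviation of the signed error) for hypothetical pointing responses under a static-head and a free-head-movement condition. The numbers are invented for illustration; this is not the SPHERE analysis pipeline.

    import numpy as np

    def accuracy_precision(target_az, response_az):
        """Accuracy = mean absolute azimuth error (degrees);
        precision = standard deviation of the signed error."""
        error = np.asarray(response_az, dtype=float) - np.asarray(target_az, dtype=float)
        return np.abs(error).mean(), error.std(ddof=1)

    # Hypothetical pointing responses (degrees azimuth) for one listener.
    targets     = np.array([-60.0, -30.0, 0.0, 30.0, 60.0])
    static_head = np.array([-48.0, -24.0, 2.0, 26.0, 49.0])   # compressed toward the midline
    free_head   = np.array([-58.0, -29.0, 1.0, 29.0, 58.0])   # closer to the targets

    for label, responses in [("static head", static_head), ("free head movement", free_head)]:
        acc, prec = accuracy_precision(targets, responses)
        print(f"{label:>18}: mean |error| = {acc:4.1f} deg, SD of error = {prec:4.1f} deg")

The same logic extends to elevation and distance, the other two dimensions sampled by the SPHERE setup.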


Author(s): Thirsa Huisman, Torsten Dau, Tobias Piechowiak, Ewen MacDonald

Despite more than 60 years of research, it has remained uncertain whether and how realism affects the ventriloquist effect. Here, a sound localization experiment was run using spatially disparate audio-visual stimuli. The visual stimuli were presented using virtual reality, allowing for easy manipulation of their degree of realism. Starting from stimuli commonly used in ventriloquist experiments, i.e., a light flash and a noise burst, a new factor was added or changed in each condition to investigate the effects of movement and realism without confounding them with an increased temporal correlation of the audio-visual stimuli. First, a distractor task was introduced to ensure that participants fixated their gaze during the experiment. Next, movement was added to the visual stimuli while maintaining a similar temporal correlation between the stimuli. Finally, by changing the stimuli from the flash and noise to the visuals of a bouncing ball that made a matching impact sound, the effect of realism was assessed. No evidence for an effect of either realism or movement of the stimuli was found, suggesting that, in simple scenarios, the ventriloquist effect might not be affected by stimulus realism.
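
A common way to quantify the ventriloquist effect in experiments of this kind is as the fraction of the audio-visual disparity captured by vision, i.e., the shift of the localization response toward the visual position divided by the disparity. The Python sketch below computes this bias for hypothetical trials; it illustrates the measure in general rather than the authors' specific analysis.

    import numpy as np

    def ventriloquist_bias(audio_az, visual_az, response_az):
        """Per-trial visual bias: 0 = response at the true sound position,
        1 = response fully captured by the visual stimulus."""
        audio_az, visual_az, response_az = (np.asarray(x, dtype=float)
                                            for x in (audio_az, visual_az, response_az))
        return (response_az - audio_az) / (visual_az - audio_az)

    # Hypothetical trials with a fixed 10-degree audio-visual disparity.
    audio    = np.array([-20.0, 0.0, 20.0])
    visual   = audio + 10.0
    response = np.array([-14.0, 6.0, 27.0])   # responses shifted toward the visual stimuli

    bias = ventriloquist_bias(audio, visual, response)
    print(bias, bias.mean())                  # per-trial bias and its average (~0.63 here)

Comparing this bias across the flash/noise and bouncing-ball conditions would correspond to testing the effect of realism described above.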



