The eardrums move when the eyes move: A multisensory effect on the mechanics of hearing

2017 ◽  
Author(s):  
K. G. Gruters ◽  
D. L. K. Murphy ◽  
Cole D. Jenson ◽  
D. W. Smith ◽  
C. A. Shera ◽  
...  

ABSTRACT: Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here we show a novel multimodal interaction evident at the eardrum. Ear canal microphone measurements in humans (n=19 ears in 16 subjects) and monkeys (n=5 ears in 3 subjects) performing a saccadic eye movement task to visual targets indicated that the eardrum moves in conjunction with the eye movement. The eardrum motion was oscillatory and began as early as 10 ms before saccade onset in humans or with saccade onset in monkeys. These eardrum movements, which we dub Eye Movement Related Eardrum Oscillations (EMREOs), occurred in the absence of a sound stimulus. The EMREOs' amplitude and phase depended on the direction and horizontal amplitude of the saccade. They lasted throughout the saccade and well into subsequent periods of steady fixation. We discuss the possibility that the mechanisms underlying EMREOs create eye movement-related binaural cues that may aid the brain in evaluating the relationship between visual and auditory stimulus locations as the eyes move.

SIGNIFICANCE STATEMENT: The peripheral hearing system contains several motor mechanisms that allow the brain to modify the auditory transduction process. Movements or tensioning of either the middle-ear muscles or the outer hair cells modify eardrum motion, producing sounds that can be detected by a microphone placed in the ear canal (e.g., as otoacoustic emissions). Here, we report a novel form of eardrum motion produced by the brain via these systems: oscillations synchronized with and covarying with the direction and amplitude of saccades. These observations suggest that a vision-related process modulates the first stage of hearing. In particular, these eye-movement related eardrum oscillations may help the brain connect sights and sounds despite changes in the spatial relationship between the eyes and the ears.

2018 ◽  
Vol 115 (6) ◽  
pp. E1309-E1318 ◽  
Author(s):  
Kurtis G. Gruters ◽  
David L. K. Murphy ◽  
Cole D. Jenson ◽  
David W. Smith ◽  
Christopher A. Shera ◽  
...  

Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here, we show a multimodal interaction evident at the eardrum. Ear canal microphone measurements in humans (n = 19 ears in 16 subjects) and monkeys (n = 5 ears in three subjects) performing a saccadic eye movement task to visual targets indicated that the eardrum moves in conjunction with the eye movement. The eardrum motion was oscillatory and began as early as 10 ms before saccade onset in humans or with saccade onset in monkeys. These eardrum movements, which we dub eye movement-related eardrum oscillations (EMREOs), occurred in the absence of a sound stimulus. The amplitude and phase of the EMREOs depended on the direction and horizontal amplitude of the saccade. They lasted throughout the saccade and well into subsequent periods of steady fixation. We discuss the possibility that the mechanisms underlying EMREOs create eye movement-related binaural cues that may aid the brain in evaluating the relationship between visual and auditory stimulus locations as the eyes move.
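The core analysis behind detecting EMREOs is aligning ear-canal microphone traces to saccade onset and averaging across trials, so that eye-movement-locked oscillations survive while unrelated noise averages out. A minimal sketch of such event-triggered averaging (illustrative only, not the authors' code; the function name and windowing convention are assumptions):

```python
def event_triggered_average(signal, onsets, pre, post):
    """Average segments of `signal` aligned to event `onsets`.

    signal: sequence of microphone samples
    onsets: sample indices of saccade onsets (e.g., from an eye tracker)
    pre/post: number of samples to keep before/after each onset
    Returns a list of length pre + post: the mean trace across events.
    """
    segments = []
    for t in onsets:
        # Keep only events whose full window fits inside the recording.
        if t - pre >= 0 and t + post <= len(signal):
            segments.append(signal[t - pre:t + post])
    n = len(segments)
    # Pointwise mean across aligned segments.
    return [sum(seg[i] for seg in segments) / n for i in range(pre + post)]
```

With real data one would also band-pass the microphone signal and baseline-correct each segment before averaging; those steps are omitted here for brevity.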


2009 ◽  
pp. 295-304
Author(s):  
Christopher D. Bauch ◽  
Wayne O. Olsen

Audiologic testing in the form of pure-tone air-conduction and bone-conduction audiograms provides diagnostic information about the type of hearing loss (conductive, sensorineural, or mixed) and the degree of hearing loss and attendant communication difficulties. The addition of speech tests that use specific types of speech stimuli directly assesses the patient’s ability to hear and to understand speech. Acoustic reflex and reflex decay tests are used to evaluate the integrity of a complicated neural network involving not only the auditory tracts to and through the brain stem but also decussating pathways in the brain stem and the course of CN VII to the innervation of the stapedius muscle. EOAE tests provide an objective measurement of the peripheral hearing system from the external ear through the cochlear outer hair cells. They are useful screening tests for hearing in infants, in patients suspected of auditory neuropathy/dys-synchrony, and in patients suspected to have pseudohypacusis, that is, feigned or exaggerated hearing loss.


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Chih-Wei Lin ◽  
Yu Hong ◽  
Jinfu Liu

Abstract

Background: Glioma is a malignant brain tumor; its location is complex, and it is difficult to remove surgically. Using medical images, doctors can precisely diagnose and localize the disease. However, computer-assisted diagnosis of brain tumors remains a problem, because rough segmentation of the tumor makes the internal grading of the tumor incorrect.

Methods: In this paper, we propose an Aggregation-and-Attention Network for brain tumor segmentation. The proposed network takes U-Net as the backbone, aggregates multi-scale semantic information, and focuses on crucial information to perform brain tumor segmentation. To this end, we propose an enhanced down-sampling module and an up-sampling layer to compensate for information loss. The multi-scale connection module constructs multi-receptive semantic fusion between the encoder and the decoder. Furthermore, we design a dual-attention fusion module that can extract and enhance the spatial relationship of magnetic resonance images, and we apply deep supervision in different parts of the proposed network.

Results: Experimental results show that the proposed framework performs best on the BraTS2020 dataset compared with state-of-the-art networks. The proposed framework surpasses all comparison networks; its average values on the four indexes are 0.860, 0.885, 0.932, and 1.2325, respectively.

Conclusions: The proposed framework and its modules are sound and practical: they extract and aggregate useful semantic information and enhance the quality of glioma segmentation.
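Segmentation results on BraTS-style benchmarks are typically scored with overlap indexes such as the Dice coefficient (the abstract does not name its four indexes, so this is an illustrative metric, not necessarily the one reported). A minimal sketch for binary masks:

```python
def dice_coefficient(pred, target):
    """Dice overlap between two binary masks (flattened lists of 0/1).

    Dice = 2|A ∩ B| / (|A| + |B|).
    Returns 1.0 when both masks are empty (a common convention).
    """
    inter = sum(p * t for p, t in zip(pred, target))  # |A ∩ B|
    total = sum(pred) + sum(target)                   # |A| + |B|
    return 1.0 if total == 0 else 2.0 * inter / total
```

For multi-class tumor labels (e.g., whole tumor, tumor core, enhancing tumor), the score is usually computed per class on the corresponding binary mask and then averaged.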


2013 ◽  
Vol 133 (4) ◽  
pp. EL331-EL337 ◽  
Author(s):  
Makram Zebian ◽  
Volker Schirkonyer ◽  
Johannes Hensel ◽  
Sven Vollbort ◽  
Thomas Fedtke ◽  
...  

2006 ◽  
Vol 119 (6) ◽  
pp. 3896-3907 ◽  
Author(s):  
Tiffany A. Johnson ◽  
Stephen T. Neely ◽  
Judy G. Kopun ◽  
Michael P. Gorga

Physiology ◽  
2001 ◽  
Vol 16 (5) ◽  
pp. 234-238 ◽  
Author(s):  
Bernhard J. M. Hess

The central vestibular system receives afferent information about head position as well as rotation and translation. This information is used to prevent blurring of the retinal image but also to control self-orientation and motion in space. Vestibular signal processing in the brain stem appears to be linked to an internal model of head motion in space.


2005 ◽  
Vol 132 (4) ◽  
pp. 550-553 ◽  
Author(s):  
Haralampos Gouveris ◽  
Jan Maurer ◽  
Wolf Mann

OBJECTIVE: To investigate cochlear outer hair cell function in patients with acute tonal tinnitus and normal or near-normal hearing thresholds. STUDY DESIGN AND SETTING: Prospective controlled study in an academic tertiary health center. Distortion-product otoacoustic emission (DPOAE) grams of 32 ears with acute tonal tinnitus and normal hearing or minimal hearing loss were compared with those of 17 healthy nontinnitus ears. RESULTS: Tinnitus ears exhibited relatively increased DPOAE amplitudes at high frequencies (4–6.3 kHz) compared with the group of healthy ears, and relatively decreased DPOAE amplitudes at middle frequencies (1.65–2.4 kHz). Statistically significantly ( P < 0.01) increased mean DPOAE amplitudes were observed only at an f2 frequency of 4.9 kHz. CONCLUSIONS AND SIGNIFICANCE: These findings suggest an altered functional state of the outer hair cells in a selected high-frequency region of the cochlea in ears with acute tonal tinnitus and normal or near-normal hearing thresholds.
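Group comparisons of DPOAE amplitudes like the one above are commonly made with a two-sample t statistic per frequency. A minimal sketch of Welch's t statistic for unequal group sizes and variances (illustrative; the paper does not state which test it used):

```python
import math

def welch_t(a, b):
    """Welch's two-sample t statistic (unequal variances and sizes).

    a, b: sequences of per-ear DPOAE amplitudes (e.g., dB SPL) for the
    tinnitus and control groups at one f2 frequency.
    """
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    # Unbiased sample variances.
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))
```

The statistic would then be referred to a t distribution with Welch-Satterthwaite degrees of freedom to obtain the P value; that step is omitted here.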


2016 ◽  
Vol 2 (8) ◽  
pp. e1501070 ◽  
Author(s):  
Liu Zhou ◽  
Teng Leng Ooi ◽  
Zijiang J. He

Our sense of vision reliably directs and guides our everyday actions, such as reaching and walking. This ability is especially fascinating because the optical images of natural scenes that project into our eyes are insufficient to adequately form a perceptual space. It has been proposed that the brain makes up for this inadequacy by using its intrinsic spatial knowledge. However, it is unclear what constitutes intrinsic spatial knowledge and how it is acquired. We investigated this question and found evidence of an ecological basis, which uses the statistical spatial relationship between the observer and the terrestrial environment, namely, the ground surface. We found that in dark and reduced-cue environments, where intrinsic knowledge makes a greater contribution, perceived target location is more accurate when referenced to the ground than to the ceiling. Furthermore, taller observers localized the target more accurately. Superior performance was also observed in the full-cue environment, even when we compensated for the observers' heights by having the taller observers sit on a chair and the shorter observers stand on a box. This finding dovetails with the prediction of the ecological hypothesis for intrinsic spatial knowledge. It suggests that an individual's accumulated lifetime experience of being tall and his or her constant interactions with ground-based objects not only shape intrinsic spatial knowledge but also endow him or her with an advantage in spatial ability in the intermediate distance range.
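Why eye height matters for ground-referenced localization is often formalized with the angular-declination relation: for a target on a flat ground plane, the distance d follows from eye height h and the angle θ of the target below eye level as d = h / tan(θ). A small sketch of that geometry (an illustration of the ecological account, not code from the study):

```python
import math

def ground_distance(eye_height_m, declination_deg):
    """Distance to a target on a flat ground plane.

    eye_height_m: observer's eye height above the ground (meters)
    declination_deg: angle of the target below eye level (degrees)
    d = h / tan(theta); a taller observer sees the same target at a
    larger declination-to-distance ratio, which is one way eye height
    enters ground-referenced distance judgments.
    """
    return eye_height_m / math.tan(math.radians(declination_deg))
```

For example, at a 45° declination the target lies exactly one eye height away, so a misestimate of eye height translates directly into a distance error.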


2007 ◽  
Vol 97 (1) ◽  
pp. 921-926 ◽  
Author(s):  
Mark T. Wallace ◽  
Barry E. Stein

Multisensory integration refers to the process by which the brain synthesizes information from different senses to enhance sensitivity to external events. In the present experiments, animals were reared in an altered sensory environment in which visual and auditory stimuli were temporally coupled but originated from different locations. Neurons in the superior colliculus developed a seemingly anomalous form of multisensory integration in which spatially disparate visual-auditory stimuli were integrated in the same way that neurons in normally reared animals integrated visual-auditory stimuli from the same location. The data suggest that the principles governing multisensory integration are highly plastic and that there is no a priori spatial relationship between stimuli from different senses that is required for their integration. Rather, these principles appear to be established early in life based on the specific features of an animal's environment to best adapt it to deal with that environment later in life.
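Multisensory integration in superior colliculus neurons is conventionally quantified by comparing the combined-modality response with the best unisensory response, as in Stein and Meredith's enhancement index (a standard metric in this literature; the abstract does not confirm it is the exact measure used here):

```python
def enhancement_index(multisensory_resp, best_unisensory_resp):
    """Percent multisensory enhancement.

    100 * (CM - SMmax) / SMmax, where CM is the response to the combined
    visual-auditory stimulus and SMmax is the larger of the two unisensory
    responses. Positive values indicate enhancement; negative values
    indicate response depression.
    """
    return 100.0 * (multisensory_resp - best_unisensory_resp) / best_unisensory_resp
```

Under this index, a neuron firing 15 spikes to a paired stimulus versus 10 to its best single-modality stimulus shows 50% enhancement.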

