vestibular cues
Recently Published Documents

Total documents: 57 (five years: 18)
H-index: 15 (five years: 2)

eLife, 2021, Vol 10
Author(s): Sebastian H Zahler, David E Taylor, Joey Y Wong, Julia M Adams, Evan H Feinberg

Animals investigate their environments by directing their gaze towards salient stimuli. In the prevailing view, mouse gaze shifts entail head rotations followed by brainstem-mediated eye movements, including saccades to reset the eyes. These 'recentering' saccades are attributed to head movement-related vestibular cues. However, microstimulating mouse superior colliculus (SC) elicits directed head and eye movements resembling SC-dependent sensory-guided gaze shifts in other species, suggesting that mouse gaze shifts may be more flexible than has been recognized. We investigated this possibility by tracking eye and attempted head movements in a head-fixed preparation that eliminates head movement-related sensory cues. We found that tactile stimuli evoke directionally biased saccades coincident with attempted head rotations. Differences in saccade endpoints across stimuli are associated with distinct, stimulus-dependent relationships between initial eye position and saccade direction and amplitude. Optogenetic perturbations revealed that the SC drives these gaze shifts. Thus, head-fixed mice make sensory-guided, SC-dependent gaze shifts involving coincident, directionally biased saccades and attempted head movements. Our findings uncover flexibility in mouse gaze shifts and provide a foundation for studying head-eye coupling.
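
The stimulus-dependent link between initial eye position and saccade metrics described above can be summarized with a simple per-stimulus regression. The sketch below is purely illustrative: the stimulus labels, data, and use of a linear fit are assumptions, not the authors' analysis pipeline.

```python
# Illustrative sketch (not the authors' pipeline): for each stimulus condition,
# fit a linear relationship between initial horizontal eye position and saccade
# amplitude to quantify the stimulus-dependent coupling described in the abstract.
import numpy as np

def eye_position_dependence(initial_positions, amplitudes):
    """Return slope and intercept of saccade amplitude vs. initial eye position."""
    slope, intercept = np.polyfit(initial_positions, amplitudes, deg=1)
    return slope, intercept

# Hypothetical trials, grouped by tactile stimulus (positions/amplitudes in degrees)
trials_by_stimulus = {
    "stimulus_left":  (np.array([-10.0, -5.0, 0.0, 5.0, 10.0]),
                       np.array([12.0, 9.5, 7.0, 4.5, 2.0])),
    "stimulus_right": (np.array([-10.0, -5.0, 0.0, 5.0, 10.0]),
                       np.array([2.5, 4.0, 6.5, 9.0, 11.5])),
}

for stimulus, (pos, amp) in trials_by_stimulus.items():
    slope, intercept = eye_position_dependence(pos, amp)
    print(f"{stimulus}: amplitude ~ {slope:.2f} * initial_position + {intercept:.2f}")
```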


2021, pp. 1-17
Author(s): Iqra Arshad, Paulo De Mello, Martin Ender, Jason D. McEwen, Elisa R. Ferré

Abstract Despite the technological advancements in Virtual Reality (VR), users constantly combat feelings of nausea and disorientation, so-called cybersickness. Cybersickness symptoms cause severe discomfort and hinder the immersive VR experience. Here we investigated cybersickness in 360-degree head-mounted display VR. In traditional 360-degree VR experiences, translational movement in the real world is not reflected in the virtual world, so self-motion information is not corroborated by matching visual and vestibular cues, which may trigger symptoms of cybersickness. We evaluated whether new Artificial Intelligence (AI) software designed to supplement the 360-degree VR experience with artificial six-degrees-of-freedom motion can reduce cybersickness. Explicit (Simulator Sickness Questionnaire (SSQ) and Fast Motion Sickness (FMS) rating) and implicit (heart rate) measurements were used to evaluate cybersickness symptoms during and after 360-degree VR exposure. Simulator sickness scores showed a significant reduction in feelings of nausea during the AI-supplemented six-degrees-of-freedom motion VR compared to traditional 360-degree VR. However, six-degrees-of-freedom motion VR did not reduce oculomotor or disorientation measures of sickness, and no changes were observed in FMS or heart rate measures. Improving the congruency between visual and vestibular cues in 360-degree VR, as provided by the AI-supplemented six-degrees-of-freedom motion system considered here, is essential for a more engaging, immersive and safe VR experience, which is critical for educational, cultural and entertainment applications.
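
As a rough illustration of the within-subjects comparison reported above (nausea scores in AI-supplemented six-degrees-of-freedom VR versus traditional 360-degree VR), the following sketch runs a paired non-parametric test on hypothetical per-participant SSQ nausea subscores. The scores, sample size, and choice of a Wilcoxon signed-rank test are assumptions for illustration, not details taken from the paper.

```python
# Hedged sketch: paired comparison of hypothetical SSQ nausea subscores between
# traditional 360-degree VR and AI-supplemented 6-DoF VR (same participants).
import numpy as np
from scipy import stats

nausea_360 = np.array([38.2, 57.2, 28.6, 47.7, 66.8, 19.1, 38.2, 47.7])   # traditional 360 VR
nausea_6dof = np.array([19.1, 38.2, 19.1, 28.6, 47.7, 9.5, 28.6, 38.2])   # AI-supplemented 6-DoF

# Non-parametric paired test (SSQ scores are ordinal and often non-normal)
statistic, p_value = stats.wilcoxon(nausea_360, nausea_6dof)
print(f"median 360 VR = {np.median(nausea_360):.1f}, "
      f"median 6-DoF VR = {np.median(nausea_6dof):.1f}, "
      f"Wilcoxon W = {statistic:.1f}, p = {p_value:.3f}")
```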


Author(s): Yasaman Jabbari, Darren M. Kenney, Martin von Mohrenschildt, Judith M. Shedden

Author(s): Benedict Wild, Stefan Treue

Primate visual cortex consists of dozens of distinct brain areas, each contributing a highly specialized component to the sophisticated task of encoding incoming sensory information and creating a representation of our visual environment that underlies our perception and action. One such area is the medial superior temporal cortex (MST), a motion-sensitive, direction-selective part of the primate visual cortex. It receives most of its input from the middle temporal (MT) area, but MST cells have larger receptive fields and respond to more complex motion patterns. The finding that MST cells are tuned for optic flow patterns has led to the suggestion that the area plays an important role in the perception of self-motion. This hypothesis has received further support from studies showing that some MST cells also respond selectively to vestibular cues. Furthermore, the area is part of a network that controls the planning and execution of smooth pursuit eye movements, and its activity is modulated by cognitive factors such as attention and working memory. This review of more than 90 studies focuses on clarifying the heterogeneous findings on MST in the macaque cortex and its putative homolog in the human cortex. From this analysis of MST's unique anatomical and functional position in the hierarchy of areas and processing steps in primate visual cortex, the area emerges as a gateway between perception, cognition, and action planning. Given this pivotal role, MST represents an ideal model system for studying the transition from sensation to cognition.


2021, Vol 125 (2), pp. 672-686
Author(s): Faisal Karmali, Adam D. Goodworth, Yulia Valko, Tania Leeder, Robert J. Peterka, ...

Vestibular feedback is important for postural control, but little is known about the relative roles of tilt, translation, and rotation cues. We studied healthy human subjects with no known vestibular pathology or symptoms. Our findings showed that vestibular encoding of lateral translation correlated with medial-lateral postural sway, consistent with lateral translation cues contributing to balance control. This adds support to the hypothesis that vestibular noise contributes to spontaneous postural sway.
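
A minimal sketch of the kind of relationship summarized above, correlating lateral-translation vestibular thresholds with medial-lateral sway, is given below; the data, units, and use of a Pearson correlation are illustrative assumptions rather than the study's own values or methods.

```python
# Illustrative sketch: correlate hypothetical lateral-translation perceptual
# thresholds with medial-lateral postural sway (e.g., RMS center-of-pressure).
import numpy as np
from scipy import stats

thresholds_cm_s = np.array([0.8, 1.1, 0.6, 1.5, 0.9, 1.3, 0.7, 1.0])          # perceptual thresholds
ml_sway_rms_cm = np.array([0.35, 0.48, 0.30, 0.60, 0.40, 0.52, 0.33, 0.45])   # M-L sway magnitude

r, p = stats.pearsonr(thresholds_cm_s, ml_sway_rms_cm)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```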


2021
Author(s): Sepiedeh Keshavarzi, Edward F. Bracey, Richard A. Faville, Dario Campagner, Adam L. Tyson, ...

The extent to which we successfully navigate the environment depends on our ability to continuously track our heading direction. Neurons that encode the speed and the direction of head turns during navigation, known as angular head velocity (AHV) cells, are fundamental to this process, but the sensory computations underlying their activity remain unknown. By performing chronic single-unit recordings in the retrosplenial cortex (RSP) of the mouse and tracking the activity of individual AHV neurons between freely moving and head-restrained conditions, we find that vestibular inputs dominate AHV signalling. In addition, we discover that self-generated optic flow input onto these neurons increases the gain and signal-to-noise ratio of angular velocity coding during navigation. Psychophysical experiments and neural decoding further reveal that vestibular-visual integration increases the perceptual accuracy of egocentric angular velocity and the fidelity of its representation by RSP ensembles. We propose that while AHV coding is dependent on vestibular cues, it also utilises vision to maximise navigation accuracy in nocturnal and diurnal environments.
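
The finding that combining vestibular and visual (optic-flow) signals improves angular-velocity estimation is commonly formalized as reliability-weighted (maximum-likelihood) cue integration, in which each cue is weighted by its inverse variance. The sketch below shows that standard model with assumed noise levels; it illustrates the principle rather than reproducing the decoding analysis used in the study.

```python
# Standard reliability-weighted (maximum-likelihood) cue integration:
# each cue is weighted by its inverse variance, and the combined estimate
# has lower variance than either cue alone. Values are illustrative assumptions.

def integrate_cues(est_vestibular, var_vestibular, est_visual, var_visual):
    w_vest = (1.0 / var_vestibular) / (1.0 / var_vestibular + 1.0 / var_visual)
    w_vis = 1.0 - w_vest
    combined_estimate = w_vest * est_vestibular + w_vis * est_visual
    combined_variance = 1.0 / (1.0 / var_vestibular + 1.0 / var_visual)
    return combined_estimate, combined_variance

# Hypothetical single-trial estimates of head angular velocity (deg/s) and noise variances
vest_est, vest_var = 55.0, 9.0    # vestibular cue
vis_est, vis_var = 66.0, 16.0     # optic-flow cue

est, var = integrate_cues(vest_est, vest_var, vis_est, vis_var)
print(f"combined estimate: {est:.1f} deg/s, variance: {var:.1f} "
      f"(vestibular-only: {vest_var}, visual-only: {vis_var})")
```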


2021, Vol 102, pp. 04022
Author(s): William L. Martens, Michael Cohen

When seated users of multimodal augmented reality (AR) systems attempt to navigate unfamiliar environments, they can become disoriented during their initial travel through a remote environment displayed to them via the AR display technology. Even when the multimodal displays provide mutually coherent visual, auditory, and vestibular cues to the movement of seated users through a remote environment (such as a maze), those users may misjudge their orientation and position relative to their starting point, and may have difficulty determining which moves would return them to it. In a number of investigations using multimodal AR systems featuring real-time, servo-controlled movement of seated users, the relative contribution of spatial auditory display technology was examined across a variety of spatial navigation scenarios. The results of those investigations have implications for the effective use of the auditory component of a multimodal AR system in applications supporting spatial navigation through a physical environment.


2020, Vol 33 (6), pp. 625-644
Author(s): Maria Gallagher, Reno Choi, Elisa Raffaella Ferrè

Abstract During exposure to Virtual Reality (VR) a sensory conflict may be present, whereby the visual system signals that the user is moving in a certain direction with a certain acceleration, while the vestibular system signals that the user is stationary. To reduce this conflict, the brain may down-weight vestibular signals, which may in turn affect vestibular contributions to self-motion perception. Here we investigated whether vestibular perceptual sensitivity is affected by VR exposure. Participants’ ability to detect artificial vestibular inputs was measured during optic flow or random motion stimuli on a VR head-mounted display. Sensitivity to vestibular signals was significantly reduced when optic flow stimuli were presented, but importantly this was only the case when both visual and vestibular cues conveyed information on the same plane of self-motion. Our results suggest that the brain dynamically adjusts the weight given to incoming sensory cues for self-motion in VR; however, this depends on the congruency of visual and vestibular cues.
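
Reduced "sensitivity to vestibular signals" during optic flow is the kind of effect typically quantified with a signal-detection measure such as d′. A minimal sketch follows, assuming hypothetical hit and false-alarm counts for detecting artificial vestibular stimulation under the two visual conditions; the counts, and the use of d′ with a log-linear correction, are assumptions rather than the paper's reported analysis.

```python
# Hedged sketch: compute d-prime for detecting an artificial vestibular input
# under two visual conditions, using hypothetical hit/false-alarm counts.
from scipy.stats import norm

def d_prime(hits, misses, false_alarms, correct_rejections):
    """d' with a simple log-linear correction to avoid infinite z-scores."""
    hit_rate = (hits + 0.5) / (hits + misses + 1.0)
    fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

# Hypothetical counts from 40 stimulation and 40 no-stimulation trials per condition
print("random motion:", round(d_prime(hits=32, misses=8, false_alarms=6, correct_rejections=34), 2))
print("optic flow:   ", round(d_prime(hits=24, misses=16, false_alarms=10, correct_rejections=30), 2))
```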


2020, Vol 238 (6), pp. 1423-1432
Author(s): Darren M. Kenney, Shannon O’Malley, Hannah M. Song, Ben Townsend, Martin von Mohrenschildt, ...

2020, Vol 123 (3), pp. 936-944
Author(s): Corey S. Shayman, Robert J. Peterka, Frederick J. Gallun, Yonghee Oh, Nai-Yuan N. Chang, ...

Recent evidence has shown that auditory information may be used to improve postural stability, spatial orientation, navigation, and gait, suggesting an auditory component of self-motion perception. To determine how auditory and other sensory cues integrate for self-motion perception, we measured motion perception during yaw rotations of the body and of the auditory environment. Psychophysical thresholds in humans were measured over a range of frequencies (0.1–1.0 Hz) during self-rotation without spatial auditory stimuli, rotation of a sound source around a stationary listener, and self-rotation in the presence of an earth-fixed sound source. Unisensory perceptual thresholds and the combined multisensory thresholds were found to be frequency dependent. Auditory thresholds were lower (better) at lower frequencies, and vestibular thresholds were lower at higher frequencies. Expressed in terms of peak angular velocity, multisensory vestibular and auditory thresholds ranged from 0.39°/s at 0.1 Hz to 0.95°/s at 1.0 Hz and were significantly better over low frequencies than either the auditory-only (0.54°/s to 2.42°/s at 0.1 and 1.0 Hz, respectively) or vestibular-only (2.00°/s to 0.75°/s at 0.1 and 1.0 Hz, respectively) unisensory conditions. Monaurally presented auditory cues were less effective than binaural cues in lowering multisensory thresholds. Frequency-independent thresholds were derived by assuming that vestibular thresholds depend on a weighted combination of velocity and acceleration cues, whereas auditory thresholds depend on displacement and velocity cues. These results elucidate fundamental mechanisms underlying the contribution of audition to balance and help explain previous findings indicating its significance in tasks requiring self-orientation. NEW & NOTEWORTHY Auditory information can be integrated with visual, proprioceptive, and vestibular signals to improve balance, orientation, and gait, but this process is poorly understood. Here, we show that auditory cues significantly improve sensitivity to self-motion perception below 0.5 Hz, whereas vestibular cues contribute more at higher frequencies. Motion thresholds are determined by a weighted combination of displacement, velocity, and acceleration information. These findings may help understand and treat imbalance, particularly in people with sensory deficits.
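
The multisensory gain reported above can be compared against the textbook maximum-likelihood prediction for combining two independent cues, where the combined threshold is sigma_av = sqrt(sigma_a^2 * sigma_v^2 / (sigma_a^2 + sigma_v^2)). The sketch below applies that standard formula to the unisensory values quoted in the abstract; treating thresholds as proportional to each cue's noise standard deviation is a modeling assumption, and this is not necessarily the authors' own model.

```python
# Standard maximum-likelihood (inverse-variance) prediction for a combined
# two-cue threshold, applied to the unisensory thresholds quoted in the abstract.
# Assumes each threshold is proportional to that cue's noise standard deviation.
import math

def mle_combined_threshold(sigma_a, sigma_v):
    return math.sqrt((sigma_a**2 * sigma_v**2) / (sigma_a**2 + sigma_v**2))

# Thresholds from the abstract (peak angular velocity, deg/s)
auditory = {0.1: 0.54, 1.0: 2.42}
vestibular = {0.1: 2.00, 1.0: 0.75}
measured_multisensory = {0.1: 0.39, 1.0: 0.95}

for freq in (0.1, 1.0):
    predicted = mle_combined_threshold(auditory[freq], vestibular[freq])
    print(f"{freq} Hz: MLE-predicted {predicted:.2f} deg/s, "
          f"measured multisensory {measured_multisensory[freq]:.2f} deg/s")
```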

