Eye and head movements while looking at rotated scenes in VR.

2019 ◽  
Vol 12 (7) ◽  
Author(s):  
Nicola C. Anderson ◽  
Walter F. Bischof

Video stream: https://vimeo.com/356859979 (production and publication of the video stream sponsored by SCIANS Ltd, http://www.scians.ch/).

We examined the extent to which image shape (square vs. circle), image rotation, and image content (landscapes vs. fractal images) influenced eye and head movements. Both the eyes and head were tracked while observers looked at natural scenes in a virtual reality (VR) environment. In line with previous work, we found a horizontal bias in saccade directions, but this was affected by both the image shape and its content. Interestingly, when viewing landscapes (but not fractals), observers rotated their head in line with the image rotation, presumably to make saccades in cardinal, rather than oblique, directions. We discuss our findings in relation to current theories on eye movement control, and how insights from VR might inform traditional eye-tracking studies.

Part 2: Observers looked at panoramic, 360-degree scenes using VR goggles while eye and head movements were tracked. Fixations were determined using IDT (Salvucci & Goldberg, 2000) adapted to a spherical coordinate system. We then analyzed (a) the spatial distribution of fixations and the distribution of saccade directions, (b) the spatial distribution of head positions and the distribution of head movements, and (c) the relation between gaze and head movements. We found that, for landscape scenes, gaze and head best fit the allocentric frame defined by the scene horizon, especially when taking head tilt (i.e., head rotation around the view axis) into account. For fractal scenes, which are isotropic on average, the bias toward a body-centric frame is weak for gaze and strong for the head. Furthermore, our data show that eye and head movements are closely linked in space and time in stereotypical ways, with volitional eye movements predominantly leading the head. We discuss our results in terms of models of visual exploratory behavior in panoramic scenes, both in virtual and real environments.
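The spherical adaptation of the I-DT algorithm mentioned above is not spelled out in the abstract; the following is a minimal sketch of dispersion-threshold fixation detection on spherical gaze data, assuming gaze is given as yaw/pitch angles and that dispersion is the maximum angular separation between samples in a window. The function name and thresholds (idt_fixations_spherical, dispersion_deg, min_dur) are illustrative, not the authors' implementation.

```python
import numpy as np

def idt_fixations_spherical(yaw, pitch, t, dispersion_deg=1.0, min_dur=0.1):
    """Dispersion-threshold (I-DT) fixation detection on spherical gaze data.

    A sketch of Salvucci & Goldberg's (2000) I-DT adapted to spherical
    coordinates: dispersion is the maximum angular distance between gaze
    samples in a window, rather than the x/y extent on a flat screen.
    Thresholds and names are illustrative.
    """
    # Convert gaze angles (degrees) to unit vectors on the sphere.
    yaw_r, pitch_r = np.radians(yaw), np.radians(pitch)
    vecs = np.column_stack([
        np.cos(pitch_r) * np.cos(yaw_r),
        np.cos(pitch_r) * np.sin(yaw_r),
        np.sin(pitch_r),
    ])

    def dispersion(window):
        # Maximum pairwise angular separation within the window (degrees).
        cosines = np.clip(window @ window.T, -1.0, 1.0)
        return np.degrees(np.arccos(cosines).max())

    fixations, start = [], 0
    while start < len(t):
        end = start
        # Grow the window until it spans at least min_dur seconds...
        while end + 1 < len(t) and t[end] - t[start] < min_dur:
            end += 1
        if t[end] - t[start] < min_dur:
            break
        # ...then keep growing while angular dispersion stays below threshold.
        if dispersion(vecs[start:end + 1]) <= dispersion_deg:
            while end + 1 < len(t) and dispersion(vecs[start:end + 2]) <= dispersion_deg:
                end += 1
            fixations.append((t[start], t[end]))  # fixation onset and offset
            start = end + 1
        else:
            start += 1
    return fixations
```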

2020 ◽  
Author(s):  
Nicola C. Anderson ◽ 
Walter F. Bischof ◽  
Tom Foulsham ◽  
Alan Kingstone

Research investigating gaze in natural scenes has identified a number of spatial biases in where people look, but it is unclear whether these are partly due to constrained testing environments (e.g., a participant with their head restrained and looking at a landscape image framed within a computer monitor). We examined the extent to which image shape (square vs. circle), image rotation, and image content (landscapes vs. fractal images) influenced eye and head movements in virtual reality (VR). Both the eyes and head were tracked while observers looked at natural scenes in a virtual environment. In line with previous work, we found a bias for saccade directions parallel to the image horizon, regardless of image shape or content. We found that, when allowed to do so, observers moved both their eyes and head to explore images. Head rotation, however, was idiosyncratic: some observers rotated a lot, while others did not. Interestingly, the head rotated in line with the rotation of landscape, but not fractal, images. That head rotation and gaze direction respond differently to image content suggests that they may be under different control systems. We discuss our findings in relation to current theories on head and eye movement control, and how insights from VR might inform more traditional eye-tracking studies.
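The horizon-parallel saccade bias described above can be quantified by re-expressing each saccade direction relative to the rotated image horizon. Below is a minimal sketch, assuming saccades are given as start and end gaze points in degrees and the image rotation angle is known; the function name and sign convention are illustrative, not taken from the paper.

```python
import numpy as np

def saccade_directions_relative_to_horizon(starts, ends, image_rotation_deg):
    """Saccade directions re-expressed relative to a rotated image horizon.

    starts, ends: (N, 2) arrays of gaze positions (x, y) in degrees.
    image_rotation_deg: rotation of the image, and hence of its horizon.
    Returns angles in (-180, 180]; 0 or 180 means parallel to the horizon.
    Illustrative sketch, not the authors' analysis code.
    """
    dx, dy = (ends - starts).T
    direction = np.degrees(np.arctan2(dy, dx))   # direction in screen coordinates
    relative = direction - image_rotation_deg    # rotate into the image frame
    return (relative + 180.0) % 360.0 - 180.0    # wrap to (-180, 180]

# Example: a rightward-and-up saccade in a scene rotated by 45 degrees is
# oblique on the screen but horizontal relative to the image horizon.
rel = saccade_directions_relative_to_horizon(
    np.array([[0.0, 0.0]]), np.array([[1.0, 1.0]]), image_rotation_deg=45.0)
print(rel)  # -> [0.]
```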


2020 ◽  
Author(s):  
Walter F. Bischof ◽  
Nicola C. Anderson ◽ 
Michael T. Doswell ◽  
Alan Kingstone

How do we explore the visual environment around us, and how are head and eye movements coordinated during our exploration? To investigate this question, we had observers look at omni-directional panoramic scenes, composed of both landscape and fractal images, using a virtual-reality (VR) viewer while their eye and head movements were tracked. We analyzed the spatial distribution of eye fixations and the distribution of saccade directions; the spatial distribution of head positions and the distribution of head shifts; as well as the relation between eye and head movements. The results show that, for landscape scenes, eye and head behaviour best fit the allocentric frame defined by the scene horizon, especially when head tilt (i.e., head rotation around the view axis) is considered. For fractal scenes, which have an isotropic texture, eye and head movements were executed primarily along the cardinal directions in world coordinates. The results also show that eye and head movements are closely linked in space and time in a complementary way, with stimulus-driven eye movements predominantly leading the head movements. Our study is the first to systematically examine eye and head movements in a panoramic VR environment, and the results demonstrate that a VR environment constitutes a powerful and informative research alternative to traditional methods for investigating looking behaviour.
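One way to compare how well movement directions fit an allocentric (scene-horizon) frame versus a head-based frame, as analyzed above, is to rotate the same direction samples into each frame and measure how strongly they cluster along the cardinal axes. The sketch below uses mean |cos 2θ| as a cardinal-clustering index; this index and the sign conventions are illustrative choices, not the measures reported in the paper.

```python
import numpy as np

def cardinal_alignment(directions_deg):
    """Index in [0, 1]: 1 when all directions lie on the cardinal axes
    (0/90/180/270 deg), 0 when all are oblique (45/135/... deg)."""
    theta = np.radians(directions_deg)
    return np.abs(np.cos(2 * theta)).mean()

def compare_reference_frames(saccade_dirs_head_deg, head_roll_deg, scene_roll_deg):
    """Cardinal clustering of the same saccade directions in two frames:
    the head frame (as measured) and the scene frame (corrected for head
    tilt and scene rotation). Returns (head_frame_index, scene_frame_index)."""
    head_frame = cardinal_alignment(saccade_dirs_head_deg)
    scene_frame = cardinal_alignment(
        saccade_dirs_head_deg + head_roll_deg - scene_roll_deg)
    return head_frame, scene_frame
```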


Author(s):  
Arne F. Meyer ◽  
John O’Keefe ◽  
Jasper Poort

Summary: Animals actively interact with their environment to gather sensory information. There is conflicting evidence about how mice use vision to sample their environment. During head restraint, mice make rapid eye movements strongly coupled between the eyes, similar to conjugate saccadic eye movements in humans. However, when mice are free to move their heads, eye movement patterns are more complex and often non-conjugate, with the eyes moving in opposite directions. Here, we combined eye tracking with head motion measurements in freely moving mice and found that both observations can be explained by the existence of two distinct types of coupling between eye and head movements. The first type comprised non-conjugate eye movements which systematically compensated for changes in head tilt to maintain approximately the same visual field relative to the horizontal ground plane. The second type of eye movements were conjugate and coupled to head yaw rotation to produce a "saccade and fixate" gaze pattern. During head-initiated saccades, the eyes moved together in the same direction as the head, but during subsequent fixation moved in the opposite direction to the head to compensate for head rotation. This "saccade and fixate" pattern is similar to that seen in humans, who use eye movements (with or without head movement) to rapidly shift gaze, but in mice relies on combined eye and head movements. Indeed, the two types of eye movements very rarely occurred in the absence of head movements. Even in head-restrained mice, eye movements were invariably associated with attempted head motion. Both types of eye-head coupling were seen in freely moving mice during social interactions and a visually guided object-tracking task. Our results reveal that mice use a combination of head and eye movements to sample their environment and highlight the similarities and differences between eye movements in mice and humans.

Highlights:
- Tracking of eyes and head in freely moving mice reveals two types of eye-head coupling
- Eye/head tilt coupling aligns gaze to the horizontal plane
- Rotational eye and head coupling produces a "saccade and fixate" gaze pattern with the head leading the eye
- Both types of eye-head coupling are maintained during visually guided behaviors
- Eye movements in head-restrained mice are related to attempted head movements
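The conjugate versus non-conjugate distinction above can be operationalized by comparing the horizontal velocities of the two eyes: samples where both eyes move in the same direction are conjugate, and samples where they move in opposite directions are non-conjugate. The following is a minimal sketch with an illustrative velocity threshold; it is not the authors' analysis pipeline.

```python
import numpy as np

def classify_eye_movements(left_eye_vel, right_eye_vel, speed_thresh=10.0):
    """Label each sample as 'conjugate', 'non_conjugate', or 'still'.

    left_eye_vel, right_eye_vel: horizontal eye velocities (deg/s), same length.
    speed_thresh: minimum speed for both eyes to count as moving (illustrative).
    """
    left = np.asarray(left_eye_vel)
    right = np.asarray(right_eye_vel)
    moving = (np.abs(left) > speed_thresh) & (np.abs(right) > speed_thresh)
    same_direction = np.sign(left) == np.sign(right)

    labels = np.full(left.shape, 'still', dtype=object)
    labels[moving & same_direction] = 'conjugate'       # eyes move together (saccade-like)
    labels[moving & ~same_direction] = 'non_conjugate'  # eyes move in opposite directions
    return labels
```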


2021 ◽  
pp. 1-9
Author(s):  
Chiheon Kwon ◽  
Yunseo Ku ◽  
Shinhye Seo ◽  
Eunsook Jang ◽  
Hyoun-Joong Kong ◽  
...  

BACKGROUND: The low success and high recurrence rates of benign paroxysmal positional vertigo (BPPV) after home-based, self-treated Epley and Barbecue (BBQ) roll maneuvers are an important issue.
OBJECTIVE: To quantify the causes of the low success rate of self-treated Epley and BBQ roll maneuvers and provide a clinically acceptable criterion to guide self-treatment head rotations.
METHODS: Twenty-five participants without active BPPV wore a custom head-mounted rotation monitoring device for objective measurements. Self-treatment and specialist-assisted maneuvers were compared for head rotation accuracy. Absolute differences between the head rotation evaluation criteria (American Academy of Otolaryngology guidelines) and the measured rotation angles were considered as errors. Errors in self-treated and specialist-treated maneuvers were compared. Between-trial variations and age effects were evaluated.
RESULTS: A significantly large error and between-trial variation occurred in step 4 of the self-treated Epley maneuver, with a considerable error in the second trial. The cumulative error across all steps of the self-treated BBQ roll maneuver was significantly large. An age effect occurred only in the self-treated BBQ roll maneuver. Errors in specialist-treated maneuvers ranged from 10 to 20 degrees.
CONCLUSIONS: Real-time feedback of head movements during simultaneous head-body rotations could increase the success rate of self-treatment. Specialist-treated maneuvers can be used as permissible rotation margin criteria.
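The error measure described in METHODS (the absolute difference between the guideline rotation angle for each maneuver step and the measured head rotation) can be written down directly. A minimal sketch follows; the step angles in the example are placeholders, not the guideline values.

```python
def maneuver_errors(target_angles_deg, measured_angles_deg):
    """Per-step and cumulative absolute errors between guideline target
    rotation angles and measured head rotation angles (degrees).

    Both inputs are lists with one entry per maneuver step.
    """
    per_step = [abs(t - m) for t, m in zip(target_angles_deg, measured_angles_deg)]
    return per_step, sum(per_step)

# Hypothetical example: four steps of a self-treated maneuver, with the
# largest deviation in the last step (angles are illustrative only).
per_step, cumulative = maneuver_errors([45, 90, 90, 90], [40, 85, 95, 60])
print(per_step, cumulative)  # -> [5, 5, 5, 30] 45
```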


1997 ◽  
Vol 7 (4) ◽  
pp. 303-310
Author(s):  
James R. Lackner ◽  
Paul DiZio

The reafference model has frequently been used to explain spatial constancy during eye and head movements. We have found that its basic concepts also form part of the information processing necessary for the control and recalibration of reaching movements. Reaching was studied in a novel force environment: a rotating room that creates centripetal forces of the type that could someday substitute for gravity in space flight, and Coriolis forces, which are side effects of rotation. We found that inertial, noncontacting Coriolis forces deviate the path and endpoint of reaching movements, a finding that shows the inadequacy of equilibrium-position models of movement control. Repeated movements in the rotating room quickly lead to normal movement patterns and to a failure to perceive the perturbing forces. The first movements made after rotation stops, without Coriolis forces present, show mirror-image deviations and evoke the perception of a perturbing force even though none is present. These patterns of sensorimotor control and adaptation can largely be explained on the basis of comparisons of efference copy, reafferent muscle spindle, and cutaneous mechanoreceptor signals. We also describe experiments on human locomotion using an apparatus similar to the one Mittelstaedt used to study the optomotor response of the Eristalis fly. These results show that the reafference principle relates as well to the perception of the forces acting on and exerted by the body during voluntary locomotion.
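The Coriolis force mentioned above follows the standard expression F = -2m(ω × v), where ω is the room's angular velocity and v is the limb's velocity in the rotating frame. The sketch below evaluates this expression with illustrative numbers; the rotation rate and arm parameters are not taken from the abstract.

```python
import numpy as np

def coriolis_force(omega_rad_s, velocity_m_s, mass_kg):
    """Coriolis force (N) on a moving limb in a rotating frame: F = -2 m (omega x v)."""
    return -2.0 * mass_kg * np.cross(omega_rad_s, velocity_m_s)

# Illustrative example: room rotating at 10 rpm about the vertical (z) axis,
# arm of effective mass 2 kg reaching forward (y) at 1 m/s.
omega = np.array([0.0, 0.0, 10 * 2 * np.pi / 60])   # rad/s
v_arm = np.array([0.0, 1.0, 0.0])                    # m/s
print(coriolis_force(omega, v_arm, mass_kg=2.0))     # lateral (+x) force of ~4.2 N
```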


2020 ◽  
Author(s):  
Nguyen Nguyen ◽  
Kyu-Sung Kim ◽  
Gyutae Kim

Abstract
Background: Because of the paired structure of the two labyrinths, their neural communication is conducted through the interconnected commissural pathway. Through this tight link, the neural response characteristics of the vestibular nucleus (VN) are formed, and these responses are initially generated by the mechanical movement of the hair cells in the semicircular canals and otoliths. Although the mechanism underlying neuronal responses to head movements is well described, little direct experimental evidence has been provided, especially regarding the directional preference of otolith-related neurons, one of the critical responses for elucidating the function of neurons in the VN.
Experimental Approach: The directional preference of otolith-related neurons was investigated in the VN. In addition, a chemically induced unilateral labyrinthectomy (UL) was performed to identify the origin of the directional preference. For model evaluation, static and dynamic behavioral tests were performed. Following the evaluation, extracellular neural activity was recorded in response to horizontal head rotation and linear head translation.
Results: Seventy-seven neuronal activities were recorded from thirty SD rats (270-450 g, male), and the total population was divided into three groups: left UL (20), sham (35), and right UL (22). Based on directional preference, two sub-groups were classified: contra- and ipsi-preferred neurons. There was no significant difference in the numbers of these sub-groups (contra-: 15/35, 43%; ipsi-: 20/35, 57%) in the sham group (p = 0.155). However, more ipsi-preferred neurons (19/22, 86%) were observed after right UL (p = 6.056×10⁻⁵), while left UL caused more contra-preferred neurons (13/20, 65%) (p = 0.058). In particular, convergent neurons mainly led this biased difference in the population (ipsi-: 100% after right UL; contra-: 89% after left UL) (p < 0.002).
Conclusion: Directional preference was evenly maintained under normal vestibular function, and unilateral loss of vestibular function biased the directional preference of the neurons, depending on the side of the lesion. Moreover, the dominance of directional preference was mainly led by the convergent neurons, which carried neural information related to both head rotation and linear translation.
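The abstract does not state which statistical test produced the quoted p-values; a two-sided binomial test against an even split is one simple way to check whether ipsi- and contra-preferred neurons are balanced in each group. The sketch below uses the counts reported above, but the choice of test is an assumption, so its p-values need not match those in the Results.

```python
from scipy.stats import binomtest

# Counts of ipsi-preferred neurons out of all recorded neurons per group,
# taken from the Results above. The two-sided binomial test against a
# 50/50 split is an assumption, not necessarily the authors' test.
groups = {"sham": (20, 35), "right UL": (19, 22), "left UL": (7, 20)}
for name, (ipsi, total) in groups.items():
    result = binomtest(ipsi, n=total, p=0.5)
    print(f"{name}: {ipsi}/{total} ipsi-preferred, p = {result.pvalue:.4f}")
```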


2008 ◽  
Vol 99 (5) ◽  
pp. 2558-2576
Author(s):  
Mario Ruiz-Ruiz ◽  
Julio C. Martinez-Trujillo

Previous studies have demonstrated that human subjects update the location of visual targets for saccades after head and body movements and in the absence of visual feedback. This phenomenon is known as spatial updating. Here we investigated whether a similar mechanism exists for the perception of motion direction. We recorded eye positions in three dimensions and behavioral responses in seven subjects during a motion task in two different conditions: when the subject's head remained stationary and when subjects rotated their heads around an anteroposterior axis (head tilt). We demonstrated that (a) after head tilt, subjects updated the direction of saccades made in the perceived stimulus direction (direction-of-motion updating); (b) the amount of updating varied across subjects and stimulus directions; (c) the amount of motion direction updating was highly correlated with the amount of spatial updating during a memory-guided saccade task; (d) subjects updated the stimulus direction during a two-alternative forced-choice direction discrimination task in the absence of saccadic eye movements (perceptual updating); (e) perceptual updating was more accurate than motion direction updating involving saccades; and (f) subjects updated motion direction similarly during active and passive head rotation. These results demonstrate the existence of an updating mechanism for the perception of motion direction in the human brain that operates during active and passive head rotations and that resembles that of spatial updating. Such a mechanism operates during different tasks involving different motor and perceptual skills (saccades and motion direction discrimination) with different degrees of accuracy.
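The degree of updating described above can be quantified by comparing the reported direction after the head tilt with two extreme predictions: full updating (the direction stays fixed in world coordinates) and no updating (the direction rotates entirely with the head). The index below is an illustrative measure with hypothetical numbers, not necessarily the one used in the study.

```python
def updating_index(stimulus_dir_world_deg, response_dir_world_deg, head_tilt_deg):
    """1.0 = full updating (response matches the world-fixed stimulus direction),
    0.0 = no updating (response rotates entirely with the head tilt).
    Sign conventions are illustrative."""
    error = response_dir_world_deg - stimulus_dir_world_deg
    return 1.0 - error / head_tilt_deg

# Hypothetical example: after a 30-degree head tilt, a response that deviates
# by 6 degrees from the true direction corresponds to ~80% updating.
print(updating_index(90.0, 96.0, 30.0))  # -> 0.8
```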


2000 ◽  
Vol 83 (1) ◽  
pp. 38-49 ◽  
Author(s):  
Benjamin T. Crane ◽  
Joseph L. Demer

Gain of the vestibuloocular reflex (VOR) not only varies with target distance and rotational axis, but can be chronically modified in response to prolonged wearing of head-mounted magnifiers. This study examined the effect of adaptation to telescopic spectacles on the variation of the VOR with changes in target distance and yaw rotational axis for head velocity transients having peak accelerations of 2,800 and 1,000°/s². Eye and head movements were recorded with search coils in 10 subjects who underwent whole body rotations around vertical axes that were 10 cm anterior to the eyes, centered between the eyes, between the otoliths, or 20 cm posterior to the eyes. Immediately before each rotation, subjects viewed a target 15 or 500 cm distant. Lighting was extinguished immediately before and was restored after completion of each rotation. After initial rotations, subjects wore 1.9× magnification binocular telescopic spectacles during their daily activities for at least 6 h. Test spectacles were removed and measurement rotations were repeated. Of the eight subjects tolerant of adaptation to the telescopes, six demonstrated VOR gain enhancement after adaptation, while gain in two subjects was not increased. For all subjects, the earliest VOR began 7–10 ms after onset of head rotation regardless of axis eccentricity or target distance. Regardless of adaptation, VOR gain for the proximate target exceeded that for the distant target beginning at 20 ms after onset of head rotation. Adaptation increased VOR gain as measured 90–100 ms after head rotation onset by an average of 0.12 ± 0.02 (SE) for the higher head acceleration and 0.19 ± 0.02 for the lower head acceleration. After adaptation, four subjects exhibited significant increases in the canal VOR gain only, whereas two subjects exhibited significant increases in both angular and linear VOR gains. The latencies of linear and early angular target distance effects on VOR gain were unaffected by adaptation. The earliest significant change in angular VOR gain in response to adaptation occurred 50 and 68 ms after onset of the 2,800 and 1,000°/s² peak head accelerations, respectively. The latency of the adaptive increase in linear VOR gain was ∼50 ms for the peak head acceleration of 2,800°/s², and 100 ms for the peak head acceleration of 1,000°/s². Thus VOR gain changes and latency were consistent with modification in the angular VOR in most subjects, and additionally in the linear VOR in a minority of subjects.
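The target-distance and axis-eccentricity effects on VOR gain described above follow from simple geometry: to keep a near target foveated while rotating about an axis behind the eyes, the eyes must rotate faster than the head, so the ideal gain is roughly 1 + (axis-to-eye distance)/(target distance) under a small-angle approximation. The sketch below applies this approximation to the distances used in the study; it is a geometric illustration, not the authors' gain computation.

```python
def ideal_vor_gain(axis_to_eye_m, target_distance_m):
    """Approximate ideal VOR gain for rotation about a vertical axis located
    behind the eyes (positive axis_to_eye_m; negative for an anterior axis),
    under a small-angle approximation."""
    return 1.0 + axis_to_eye_m / target_distance_m

# Axis 20 cm posterior to the eyes, targets at 15 cm and 500 cm:
print(ideal_vor_gain(0.20, 0.15))  # ~2.33: a near target demands a much higher gain
print(ideal_vor_gain(0.20, 5.00))  # ~1.04: a distant target demands roughly unity gain
```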


2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Takumi Mieda ◽  
Masahiro Kokubu

Abstract: In blind football, players predict the location of an approaching ball from its sound, which underpins the success of ball trapping. It is currently unknown whether blind footballers use head movements as a strategy for trapping a moving ball. This study investigated characteristics of head rotations in blind footballers during ball trapping compared to sighted nonathletes. Participants trapped an approaching ball using their right foot. Head and trunk rotation angles in the sagittal plane, and head rotation angles in the horizontal plane, were measured during ball trapping. The blind footballers showed a larger downward head rotation angle, as well as higher performance at the time of ball trapping, than did the sighted nonathletes. However, no significant differences between the groups were found with regard to the horizontal head rotation angle and the downward trunk rotation angle. The blind footballers consistently showed a larger relative angle of downward head rotation from an early time point after ball launching to the moment of ball trapping. These results suggest that blind footballers couple downward head rotation with the movement of an approaching ball, to ensure that the ball is kept in a consistent egocentric direction relative to the head throughout ball trapping.


2002 ◽  
Vol 87 (2) ◽  
pp. 912-924 ◽  
Author(s):  
H. Rambold ◽  
A. Churchland ◽  
Y. Selig ◽  
L. Jasmin ◽  
S. G. Lisberger

The vestibuloocular reflex (VOR) generates compensatory eye movements to stabilize visual images on the retina during head movements. The amplitude of the reflex is calibrated continuously throughout life and undergoes adaptation, also called motor learning, when head movements are persistently associated with image motion. Although the floccular complex of the cerebellum is necessary for VOR adaptation, it is not known whether this function is localized in its anterior or posterior portions, which comprise the ventral paraflocculus and flocculus, respectively. The present paper reports the effects of partial lesions of the floccular complex in five macaque monkeys, made either surgically or with stereotaxic injection of 3-nitropropionic acid (3-NP). Before and after the lesions, smooth pursuit eye movements were tested during sinusoidal and step-ramp target motion. Cancellation of the VOR was tested by moving a target exactly with the monkey during sinusoidal head rotation. The control VOR was tested during sinusoidal head rotation in the dark and during 30°/s pulses of head velocity. VOR adaptation was studied by having the monkeys wear ×2 or ×0.25 optics for 4–7 days. In two monkeys, bilateral lesions removed all of the flocculus except for parts of folia 1 and 2 but did not produce any deficits in smooth pursuit, VOR adaptation, or VOR cancellation. We conclude that the flocculus alone probably is not necessary for either pursuit or VOR learning. In two monkeys, unilateral lesions including a large fraction of the ventral paraflocculus produced small deficits in horizontal and vertical smooth pursuit, and mild impairments of VOR adaptation and VOR cancellation. We conclude that the ventral paraflocculus contributes to both behaviors. In one monkey, a bilateral lesion of the flocculus and ventral paraflocculus produced severe deficits in smooth pursuit and VOR cancellation, and a complete loss of VOR adaptation. Considering all five cases together, there was a strong correlation between the size of the deficits in VOR learning and pursuit. We found the strongest correlation between the behavior deficits and the size of the lesion of the ventral paraflocculus, a weaker but significant correlation for the full floccular complex, and no correlation with the size of the lesion of the flocculus. We conclude that 1) lesions of the floccular complex cause linked deficits in smooth pursuit and VOR adaptation, and 2) the relevant portions of the structure are primarily in the ventral paraflocculus, although the flocculus may participate.

