Opposing Resistance to the Head Movement Does Not Affect Space Perception During Head Rotations

1999 ◽  
pp. 193-201 ◽  
Author(s):  
Jean Blouin ◽  
Nicolas Amade ◽  
Jean-Louis Vercher ◽  
Gabriel Gauthier

2019 ◽ 
Vol 6 ◽  
pp. 205566831984130
Author(s):  
Nahal Norouzi ◽  
Luke Bölling ◽  
Gerd Bruder ◽  
Greg Welch

Introduction: A large body of research in the field of virtual reality focuses on making user interfaces more natural and intuitive by leveraging natural body movements to explore a virtual environment. For example, head-tracked user interfaces allow users to look around a virtual space naturally by moving their head. However, such approaches may not be appropriate for users with temporary or permanent limitations of their head movement. Methods: In this paper, we present techniques that allow these users to gain the benefits of full virtual rotations from a reduced range of physical movements. Specifically, we describe two techniques that augment virtual rotations relative to physical movement thresholds. Results: We describe how each of the two techniques can be implemented with either a head tracker or an eye tracker, e.g., in cases when no physical head rotations are possible. Conclusions: We discuss their differences and limitations, and we provide guidelines for the practical use of such augmented user interfaces.
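The abstract does not spell out the mapping, but a threshold-based amplification of this kind can be sketched in a few lines. The following is a minimal C++ sketch, not the authors' implementation: the function name, the 10-degree threshold, the 4x gain, and the 180-degree clamp are all illustrative assumptions.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Minimal sketch of threshold-based rotation amplification (hypothetical
// parameters; the paper's exact mapping is not specified here). Physical
// yaw within +/- threshold maps 1:1; beyond the threshold the remaining
// physical rotation is amplified by `gain` so a limited physical range of
// motion can still cover a full virtual half-turn.
double amplifiedYaw(double physicalYawDeg, double thresholdDeg, double gain) {
    double magnitude = std::fabs(physicalYawDeg);
    double sign = (physicalYawDeg < 0.0) ? -1.0 : 1.0;
    if (magnitude <= thresholdDeg) {
        return physicalYawDeg;  // 1:1 mapping inside the comfort threshold
    }
    // Amplify only the portion of the rotation beyond the threshold.
    double virtualYaw = thresholdDeg + (magnitude - thresholdDeg) * gain;
    return sign * std::min(virtualYaw, 180.0);  // clamp to a half-turn
}

int main() {
    // Within the threshold: 1:1 (8 -> 8); beyond it: amplified (20 -> 50).
    std::printf("%.1f\n", amplifiedYaw(8.0, 10.0, 4.0));
    std::printf("%.1f\n", amplifiedYaw(20.0, 10.0, 4.0));
    return 0;
}
```

Inside the threshold the mapping stays 1:1, so small, comfortable movements feel unamplified; beyond it, the gain lets a restricted physical range of motion reach the full virtual rotation.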


eLife ◽  
2021 ◽  
Vol 10 ◽  
Author(s):  
Sebastian H Zahler ◽  
David E Taylor ◽  
Joey Y Wong ◽  
Julia M Adams ◽  
Evan H Feinberg

Animals investigate their environments by directing their gaze towards salient stimuli. In the prevailing view, mouse gaze shifts entail head rotations followed by brainstem-mediated eye movements, including saccades to reset the eyes. These 'recentering' saccades are attributed to head movement-related vestibular cues. However, microstimulating mouse superior colliculus (SC) elicits directed head and eye movements resembling SC-dependent sensory-guided gaze shifts in other species, suggesting that mouse gaze shifts may be more flexible than has been recognized. We investigated this possibility by tracking eye and attempted head movements in a head-fixed preparation that eliminates head movement-related sensory cues. We found that tactile stimuli evoke directionally biased saccades coincident with attempted head rotations. Differences in saccade endpoints across stimuli are associated with distinct stimulus-dependent relationships between initial eye position and saccade direction and amplitude. Optogenetic perturbations revealed that the SC drives these gaze shifts. Thus, head-fixed mice make sensory-guided, SC-dependent gaze shifts involving coincident, directionally biased saccades and attempted head movements. Our findings uncover flexibility in mouse gaze shifts and provide a foundation for studying head-eye coupling.


2021 ◽  
Vol 23 (8) ◽  
pp. 2189-2209
Author(s):  
Fernanda Herrera ◽  
Jeremy N Bailenson

The present investigation examined the effect of avatar representation, choice, and head movement on prosocial behaviors and measures of presence after a virtual reality perspective-taking (VRPT) task. Participants were either represented by a set of virtual hands or had no representation during the VRPT task. Of those with hands, only half were able to choose their skin tone. Results showed no significant advantage to having an avatar representation. However, if participants had an avatar and were able to choose their own skin tone, a higher proportion of them performed prosocial behaviors and reported higher social presence scores compared with participants who had no choice. Regardless of condition, head rotations significantly predicted petition signatures: the more participants rotated their heads side to side, the more likely they were to sign the petition. Moreover, when participants did not consistently rotate their heads side to side, the proportion of petitions signed was on par with that of individuals who did not complete a VRPT task at all.


1999 ◽  
Vol 58 (3) ◽  
pp. 170-179 ◽  
Author(s):  
Barbara S. Muller ◽  
Pierre Bovet

Twelve blindfolded subjects localized two different pure tones, randomly played by eight sound sources in the horizontal plane. Subjects either could or could not use the information supplied by their pinnae (external ears) and by their head movements. We found that the pinnae, as well as head movements, had a marked influence on auditory localization performance with this type of sound. The effects of pinnae and head movements appeared to be additive; the absence of either factor produced the same loss of localization accuracy and much the same error pattern. Head movement analysis showed that subjects turned their faces towards the emitting sound source, except for sources exactly in front or exactly in the rear, which were identified by turning the head to both sides. Head movement amplitude increased smoothly as the sound source moved from the anterior to the posterior quadrant.


Author(s):  
Joanna Ganczarek ◽  
Vezio Ruggieri ◽  
Marta Olivetti Belardinelli ◽  
Daniele Nardi

1911 ◽  
Vol 8 (1) ◽  
pp. 31-31
Author(s):  
No authorship indicated

This project concerns a motion-controlled wheelchair for disabled users. We control a motorized wheelchair using a headband containing a motion sensor, with an Arduino as the controller. Problem: disabled people who cannot walk often find themselves a burden on their families or caretakers simply for moving around the house, and people who are paralysed below the head, and may not have functioning arms, cannot operate a joystick-controlled electric wheelchair. This project addresses that problem by using a head motion sensor to control the wheelchair. We aim to build a head-controlled wheelchair that is more affordable, low-maintenance, and accessible to all; a sketch of such a control loop follows.
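The abstract names only a headband motion sensor and an Arduino, so the control loop below is a minimal, illustrative Arduino-style sketch, not the project's firmware: it assumes an MPU-6050 accelerometer on the headband and a two-channel PWM motor driver, and all pin numbers, register addresses, and thresholds are assumptions to be tuned per user.

```cpp
#include <Wire.h>

// Illustrative hardware choices; actual wiring will differ.
const int MPU_ADDR = 0x68;        // default MPU-6050 I2C address (assumed IMU)
const int LEFT_MOTOR = 5;         // PWM pin to left motor driver (assumed)
const int RIGHT_MOTOR = 6;        // PWM pin to right motor driver (assumed)
const int TILT_THRESHOLD = 4000;  // raw accelerometer counts; tune per user

void setup() {
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);               // PWR_MGMT_1 register
  Wire.write(0);                  // wake the sensor from sleep mode
  Wire.endTransmission(true);
  pinMode(LEFT_MOTOR, OUTPUT);
  pinMode(RIGHT_MOTOR, OUTPUT);
}

void loop() {
  // Read the six accelerometer bytes starting at register 0x3B.
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);
  Wire.endTransmission(false);
  Wire.requestFrom(MPU_ADDR, 6, true);
  int16_t ax = Wire.read() << 8 | Wire.read();  // forward/backward tilt
  int16_t ay = Wire.read() << 8 | Wire.read();  // left/right tilt
  Wire.read(); Wire.read();                     // discard the vertical axis

  int left = 0, right = 0;          // motors stop when the head is level
  if (ax > TILT_THRESHOLD) {        // head tilted forward: drive straight
    left = 180;
    right = 180;
  } else if (ay > TILT_THRESHOLD) { // head tilted right: pivot right
    left = 180;
  } else if (ay < -TILT_THRESHOLD) { // head tilted left: pivot left
    right = 180;
  }
  analogWrite(LEFT_MOTOR, left);
  analogWrite(RIGHT_MOTOR, right);
  delay(50);                        // ~20 Hz control loop
}
```

The deadband around the level head position keeps the chair stationary during small involuntary movements; a real implementation would also need safety interlocks, smooth speed ramping, and a fail-safe stop.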

