Small head movements that accompany goal-directed arm movements provide various useful cues about the target’s distance

2015 ◽  
Vol 15 (12) ◽  
pp. 592
Author(s):  
Cristina de la Malla ◽  
Stijn Buiteman ◽  
Wilmer Otters ◽  
Jeroen Smeets ◽  
Eli Brenner

2021 ◽  
Vol 36 (6) ◽  
pp. 1185-1185
Author(s):  
Peii Chen ◽  
Denise Krch ◽  
Grigoriy Shekhtman

Abstract Objective: Examine the usability and feasibility of a virtual reality (VR) treatment for persons with spatial neglect using head-mounted display (HMD) and hand tracking technologies. Method: Recruited from a rehabilitation hospital, 9 stroke survivors with spatial neglect (3 females; mean age = 64.2 years, SD = 9.1; 8 left neglect) participated in user testing for ongoing software development. Participants tested one of four customized treatment modules and completed the System Usability Scale, the Presence Questionnaire, and the Simulator Sickness Questionnaire. Feedback from participants was integrated into iterative prototype revisions. Module 1 (n = 7) required arm movements gradually reaching toward the neglected side of space, while the virtual hand appeared to reach straight ahead. Module 2 (n = 4) required head movements from the non-neglected to the neglected side. Module 3 (n = 6) involved head and arm movements towards both sides of space to collect objects. Module 4 (n = 2) required stopping objects approaching from a distance ahead. Results: Despite reporting a lack of realism, participants preferred VR over conventional therapy. Participants felt comfortable and confident engaging in the virtual environment. Module 4 was more difficult than the other modules, as participants required more practice to perform the task. Two participants reported Module 3 being tiresome, with one reporting mild shoulder pain and eye strain, and moderate sweating. However, all reported symptoms were temporary and resolved following a short break. Conclusion: VR-based rehabilitation for spatial neglect using HMD and hand tracking technologies may be a viable treatment option for stroke survivors with spatial neglect. The modules benefited substantively from modifications based on participants’ feedback.


2020 ◽  
Vol 27 (2) ◽  
pp. 163-182 ◽  
Author(s):  
Fernanda Herrera ◽  
Soo Youn Oh ◽  
Jeremy N. Bailenson

Collaborative virtual environments (CVEs), wherein people can virtually interact with each other via avatars, are becoming increasingly prominent. However, CVEs differ in type of avatar representation and level of behavioral realism afforded to users. The present investigation compared the effect of behavioral realism on users' nonverbal behavior, self-presence, social presence, and interpersonal attraction during a dyadic interaction. Fifty-one dyads (aged 18 to 26) embodied either a full-bodied avatar with mapped hands and inferred arm movements, an avatar consisting of only a floating head and mapped hands, or a static full-bodied avatar. Planned contrasts compared the effect of behavioral realism against no behavioral realism, and compared the effect of low versus high behavioral realism. Results show that participants who embodied the avatar with only a floating head and hands experienced greater social presence, self-presence, and interpersonal attraction than participants who embodied a full-bodied avatar with mapped hands. In contrast, there were no significant differences on these measures between participants in the two mapped-hands conditions and those who embodied a static avatar. Participants in the static-avatar condition rotated their own physical head and hands significantly less than participants in the other two conditions during the dyadic interaction. Additionally, side-to-side head movements were negatively correlated with interpersonal attraction regardless of condition. We discuss implications of the finding that behavioral realism influences nonverbal behavior and communication outcomes.


Author(s):  
Benjamin P. Widlus ◽  
Keith S. Jones

Gibson (1979/1986) argued that exploratory movements generate information about agents’ action-capabilities within a given environment, that is, about the agent-environment system’s affordances. To date, the scant literature on exploratory movements has revealed two important findings. First, restricting exploratory movements degrades the accuracy of affordance judgments (Mark et al., 1990; Yu, Bardy, & Stoffregen, 2011). Second, exploratory movements can be very subtle (Stoffregen, Yang, & Bardy, 2005; Yu, Bardy, & Stoffregen, 2011). However, many questions regarding exploratory movements have yet to be answered. For example, what exploratory movements are necessary to perceive a given affordance, and how do exploratory movements differ from related movements? Our long-term goal is to address such gaps in the literature. We decided to begin by examining what exploratory movements must be executed in order to perceive whether the actor can reach an object. Reaching exploratory movements likely have two key components: 1) head movements and 2) shoulder movements. The former can generate information about the absolute distance between the actor and the to-be-reached object (Bingham & Stassen, 1994), and have been confirmed to be necessary for accurate reaching judgments (Mantel, Stoffregen, Campbell, & Bardy, 2015). The latter generates information about the actor’s arm length (Anderson & Turvey, 1998; Shibata, Gyoba, & Takeshima, 2012), but its necessity for reach-ability judgments has yet to be studied. The current experiment used a restriction paradigm to determine whether exploratory arm movements are necessary to make accurate reaching judgments. Participants (n = 32) judged their maximum reaching ability either while holding their arms behind their backs with their dominant hand grasping their non-dominant wrist (the Restricted condition), or while their arms swung naturally at their sides (the Unrestricted condition).
Judgments were made actively, by walking forward or backward, in order to allow participants to generate the exploratory movements they would normally create (with the exception of arm movements in the Restricted condition) when moving toward an object with the intention to perform a reach (Mantel, Bardy, & Stoffregen, 2010). The study utilized a within-subjects design, with starting condition counterbalanced. For each condition, participants completed 1 practice trial followed by 9 experimental trials. Starting distances (from the object) and angles were drawn equally and randomly from ranges of 1–24”, 25–48”, 49–72”, and 0–29°, 30–59°, 60–89°, respectively. Distances and angles were not repeated, to prevent memorization. In line with previous affordance perception research, the dependent variable, Accuracy, was computed as percentage of absolute error (|[judged maximum reach / actual maximum reach] − 1| × 100) (Oudejans, Michaels, Bakker, & Dolné, 1996). Accuracy was significantly greater when arm movements were unrestricted as compared to restricted, supporting the theory that exploratory arm movements are a component of reach-ability judgments. Reaching judgments were not perfectly accurate in either condition, which may have been due to the reaching judgment being the focal task (Heft, 1993). The present results have practical implications for operational situations in which actors’ arm movements might be restricted. For example, U.S. police and military personnel sometimes wear body armor that covers their shoulders, mounts ballistic plates to their upper arms, or some combination thereof. To the extent that such body armor restricts arm movements, our results suggest that their reach-ability judgments would be degraded.
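The accuracy measure above (percentage of absolute error, following Oudejans et al., 1996) can be sketched directly from the formula in the abstract; the function name and example values are illustrative, not from the study:

```python
def percent_absolute_error(judged_reach, actual_reach):
    """Percentage of absolute error: |(judged / actual) - 1| * 100.

    0 means a perfectly accurate judgment; larger values mean larger
    over- or underestimation relative to the actual maximum reach.
    """
    return abs(judged_reach / actual_reach - 1.0) * 100.0

# Hypothetical example: a participant judges their maximum reach
# as 70 cm when it is actually 80 cm.
print(percent_absolute_error(70.0, 80.0))  # 12.5
```

Because the error is absolute, over- and underestimation of the same magnitude yield the same score, so the measure reflects accuracy rather than direction of bias.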


Author(s):  
Kenneth Holmqvist ◽  
Saga Lee Örbom ◽  
Raimondas Zemblys

Abstract We empirically investigate the role of small, almost imperceptible balance and breathing movements of the head on the level and colour of noise in data from five commercial video-based P–CR eye trackers. By comparing noise from recordings with completely static artificial eyes to noise from recordings where the artificial eyes are worn by humans, we show that very small head movements increase the level and colouring of the noise in data recorded from all five eye trackers in this study. This increase in noise level is seen not only in the gaze signal, but also in the P and CR signals of the eye trackers that provide these camera-image features. The P and CR signals of the SMI eye trackers correlate strongly during small head movements, but less so or not at all when the head is completely still, indicating that head movements are registered by the P and CR images in the eye camera. By recording with artificial eyes, we can also show that the pupil size artefact has no major role in increasing and colouring noise. Our findings add to and replicate the observation by Niehorster et al. (2021) that lowpass filters in video-based P–CR eye trackers colour the data. Irrespective of source, filters or head movements, coloured noise can be confused for oculomotor drift. We also find that usage of the default head restriction in the EyeLink 1000+, the EyeLink II and the HiSpeed240 results in noisier data compared to less head restriction. Researchers investigating data quality in eye trackers should consider not using the Gen 2 artificial eye from SR Research / EyeLink. Data recorded with this artificial eye are much noisier than data recorded with other artificial eyes, on average 2.2–14.5 times worse for the five eye trackers.
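The two quantities the abstract measures, noise level and noise colour, can be estimated from a gaze trace with standard signal-processing steps. A minimal sketch, not the authors' actual pipeline: RMS of sample-to-sample differences for level, and the slope of the power spectrum in log-log space for colour (a slope near 0 indicates white noise; a more negative slope indicates coloured, low-frequency-dominated noise):

```python
import numpy as np

def noise_level_and_colour(gaze, fs):
    """Estimate noise level and colour of a 1-D gaze signal.

    Level: RMS of sample-to-sample differences (RMS-S2S), a common
    precision measure in eye-tracking data quality work.
    Colour: slope of a straight-line fit to log10(power) vs.
    log10(frequency), from an FFT periodogram of the mean-removed signal.
    """
    diffs = np.diff(gaze)
    rms_s2s = np.sqrt(np.mean(diffs ** 2))

    # Periodogram, skipping the DC bin.
    freqs = np.fft.rfftfreq(len(gaze), d=1.0 / fs)[1:]
    power = np.abs(np.fft.rfft(gaze - np.mean(gaze)))[1:] ** 2

    slope, _ = np.polyfit(np.log10(freqs), np.log10(power), 1)
    return rms_s2s, slope

# Synthetic white noise (0.1 deg SD) at 1000 Hz: slope should be near 0.
rng = np.random.default_rng(0)
level, slope = noise_level_and_colour(rng.normal(0.0, 0.1, 4096), fs=1000)
```

Filtering or head movements would shift power toward low frequencies, pushing the fitted slope negative, which is the "colouring" the abstract describes.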


1999 ◽  
Vol 58 (3) ◽  
pp. 170-179 ◽  
Author(s):  
Barbara S. Muller ◽  
Pierre Bovet

Twelve blindfolded subjects localized two different pure tones, randomly played by eight sound sources in the horizontal plane. Subjects either could or could not use information supplied by their pinnae (external ears) and their head movements. We found that pinnae, as well as head movements, had a marked influence on auditory localization performance with this type of sound. The effects of pinnae and head movements seemed to be additive; the absence of one or the other factor provoked the same loss of localization accuracy and even much the same error pattern. Head movement analysis showed that subjects turn their face towards the emitting sound source, except for sources exactly in the front or exactly in the rear, which are identified by turning the head to both sides. Head movement amplitude increased smoothly as the sound source moved from the anterior to the posterior quadrant.


2000 ◽  
Author(s):  
Frank E. Pollick ◽  
Helena Paterson ◽  
Andrew J. Calder ◽  
Armin Bruderlin ◽  
Anthony J. Sanford

2004 ◽  
Vol 35 (03) ◽  
Author(s):  
P Wagner ◽  
J Cunha ◽  
C Mauerer ◽  
C Vollmar ◽  
B Feddersen ◽  
...  

2003 ◽  
Vol 10 (5) ◽  
pp. 862-869 ◽  
Author(s):  
A. W. Floris Vos ◽  
Matteus A. M. Linsen ◽  
J. Tim Marcus ◽  
Jos C. van den Berg ◽  
Jan Albert Vos ◽  
...  
