head movements
Recently Published Documents


TOTAL DOCUMENTS

1330
(FIVE YEARS 306)

H-INDEX

66
(FIVE YEARS 6)

2022 ◽  
Vol 12 ◽  
Author(s):  
Chenhao Chiu ◽  
Yining Weng ◽  
Bo-wei Chen

Recent research on body and head positions has shown that postural changes may induce varying degrees of change in acoustic speech signals and articulatory gestures. While the preservation of formant profiles across different postures is suitably accounted for by the two-tube model and perturbation theory, it remains unclear whether this preservation results from the accommodation of tongue postures. Specifically, whether the tongue accommodates changes in head angle to maintain the target acoustics is yet to be determined. The present study examines vowel acoustics and their correspondence with the articulatory maneuvers of the tongue, including both tongue postures and movements of the tongue center, across different head angles. The results show that vowel acoustics, including pitch and formants, are largely unaffected by upward or downward tilting of the head. These preserved acoustics may be attributed to lingual gestures that compensate for the effects of gravity. Our results also reveal that tongue postures in response to head movements appear to be vowel-dependent, and that the tongue center may serve as an underlying drive that covaries with changes in head angle. These results imply a close relationship between vowel acoustics and tongue postures, as well as a target-oriented strategy across different head angles.
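
The two-tube account mentioned above builds on the quarter-wavelength resonances of a uniform tube closed at the glottis and open at the lips. A minimal sketch of that baseline calculation, assuming textbook illustrative values for the speed of sound and tract length (not measurements from the study):

```python
# Formant estimates from the uniform-tube baseline of the two-tube model.
# c and the tract length are illustrative textbook values, not study data.
c = 35000.0  # speed of sound in warm, humid air (cm/s)

def uniform_tube_formants(length_cm, n_formants=3):
    """Quarter-wavelength resonances of a tube closed at one end, open at
    the other: F_n = (2n - 1) * c / (4 * L)."""
    return [(2 * n - 1) * c / (4.0 * length_cm) for n in range(1, n_formants + 1)]

print(uniform_tube_formants(17.5))  # ~[500, 1500, 2500] Hz, a schwa-like profile
```

The two-tube model refines this baseline by letting the back and front cavities contribute their own resonances; perturbation theory then predicts how a constriction at a given point along the tube shifts each formant.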


2022 ◽  
Vol 15 ◽  
Author(s):  
Hui Ho Vanessa Chang ◽  
Barbara J. Morley ◽  
Kathleen E. Cullen

The functional role of the mammalian efferent vestibular system (EVS) is not fully understood. One proposal is that the mammalian EVS plays a role in the long-term calibration of central vestibular pathways, for example during development. Here, to test this possibility, we studied vestibular function in mice lacking a functional α9 subunit of the nicotinic acetylcholine receptor (nAChR) gene family, which mediates efferent activation of the vestibular periphery. We focused on an α9 (−/−) model with a deletion in exons 1 and 2. First, we quantified gaze stability by testing vestibulo-ocular reflex (VOR, 0.2–3 Hz) responses of both α9 (−/−) mouse models in dark and light conditions. VOR gains and phases were comparable for both α9 (−/−) mutants and wild-type controls. Second, we confirmed the lack of an effect from the α9 (−/−) mutation on central visuo-motor/eye movement pathways via analyses of the optokinetic reflex (OKR) and quick phases of the VOR; we found no differences between α9 (−/−) mutants and wild-type controls. Third and finally, we investigated postural abilities during instrumented rotarod and balance beam tasks. Head movements were quantified using a 6D microelectromechanical systems (MEMS) module fixed to the mouse's head. Compared to wild-type controls, head movements were strikingly altered in α9 (−/−) mice, most notably in the pitch axis. We confirmed these latter results in another α9 (−/−) model, with a deletion in the exon 4 region. Overall, we conclude that the absence of the α9 subunit of nAChRs predominantly results in an impairment of posture rather than gaze.
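
VOR gain and phase of the kind reported here are commonly obtained by fitting sinusoids to head and eye velocity at the stimulus frequency. A minimal sketch under that common assumption; the trace names, sampling rate, and phase convention are illustrative, not the authors' analysis code:

```python
# Sketch: VOR gain/phase at one stimulus frequency via least-squares sine fits.
import numpy as np

def sine_fit(signal, freq_hz, fs):
    """Least-squares amplitude and phase of a sinusoid at freq_hz."""
    t = np.arange(len(signal)) / fs
    X = np.column_stack([np.sin(2 * np.pi * freq_hz * t),
                         np.cos(2 * np.pi * freq_hz * t),
                         np.ones_like(t)])
    a, b, _ = np.linalg.lstsq(X, signal, rcond=None)[0]
    return np.hypot(a, b), np.arctan2(b, a)

def vor_gain_phase(head_vel, eye_vel, freq_hz, fs):
    """Gain ~1 and phase ~0 deg for an ideal compensatory VOR (eye velocity
    equal and opposite to head velocity)."""
    amp_h, ph_h = sine_fit(head_vel, freq_hz, fs)
    amp_e, ph_e = sine_fit(eye_vel, freq_hz, fs)
    gain = amp_e / amp_h
    phase = np.degrees(ph_e - ph_h) - 180.0  # reference the compensatory direction
    return gain, (phase + 180.0) % 360.0 - 180.0  # wrap to (-180, 180]
```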


2022 ◽  
Author(s):  
Vinicius Oliveira ◽  
Felisberto Pereira ◽  
Nuno Carvalho ◽  
Sérgio Lopes

Abstract This paper proposes a low-cost IoT solution to detect the head movements and positions of a patient by means of Force Sensing Resistors (FSRs) positioned on a pillow and connected to a microcontroller, which collects patient data throughout sleep, sends it to the cloud, and makes it available to healthcare professionals. The impact of this work is focused on monitoring sleep quality with low-cost, easy-to-use pillows in an ambulatory scenario, without the need for expensive, dedicated sleep-monitoring rooms, which often disturb patient sleep and degrade the quality of the measurement. In this way it is possible to monitor the patient's behavior throughout the entire sleep period, which is important for detecting factors that cause minor head and neck injuries, and even for detecting events of long pauses in breathing.
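
A minimal sketch of what the pillow-side acquisition loop could look like, assuming a generic ADC read, a paho-mqtt 1.x style client, and a hypothetical broker, topic, and sensor count; none of these details are specified in the abstract:

```python
# Sketch: sample an array of FSRs and publish readings to the cloud via MQTT.
# Channel count, broker address, and topic are hypothetical placeholders.
import json
import random
import time
import paho.mqtt.client as mqtt

N_SENSORS = 6  # assumed number of FSRs distributed across the pillow

def read_fsr(channel):
    """Stand-in for the microcontroller's ADC read (0-1023 counts);
    replace with the board's actual ADC call."""
    return random.randint(0, 1023)

def head_position(samples):
    """Crude position estimate: index of the most-loaded sensor."""
    return max(range(len(samples)), key=samples.__getitem__)

client = mqtt.Client()                   # paho-mqtt 1.x style constructor
client.connect("broker.example.com")     # hypothetical broker
while True:
    samples = [read_fsr(ch) for ch in range(N_SENSORS)]
    payload = {"t": time.time(), "fsr": samples, "pos": head_position(samples)}
    client.publish("pillow/patient-01/head", json.dumps(payload))
    time.sleep(1.0)  # 1 Hz is ample for sleep-posture monitoring
```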


2022 ◽  
pp. 194338752110530
Author(s):  
Thomas Pepper ◽  
Harry Spiers ◽  
Alex Weller ◽  
Clare Schilling

Introduction Cervical spine (C-spine) injury is present in up to 10% of patients with maxillofacial fractures. Uncertainty over the status of the C-spine and permitted head movements may delay maxillofacial surgical intervention, prolonging patient discomfort, delaying return to oral nutrition, and reducing quality of life. This study aimed to investigate the effects on the C-spine of positioning patients for maxillofacial procedures by simulating intraoperative positions for common maxillofacial procedures. Methods Magnetic resonance imaging was used to assess the effects of head position in common intraoperative configurations – neutral (anterior mandible position), extended (tracheostomy position) and laterally rotated (mandibular condyle position) – on the C-spine of a healthy volunteer. Results In the tracheostomy position, maximal movement occurred in the sagittal plane between the cervico-occipital junction and C4–C5, as well as at the cervico-thoracic junction. Minimal movement occurred at C2 (on C3), C5 (on C6) and C6 (on C7). In the mandibular condyle position, C-spine movements occurred in both rotational and sagittal planes. Maximal movement occurred above the level of C4, concentrated at the atlanto-occipital and atlanto-axial (C1–2) joints. Conclusion Neck extension is likely to be relatively safe in injuries that are stable in flexion and extension, such as odontoid peg fracture and fractures between C5 and C7. Head rotation is likely to be relatively safe in fractures below C4, as well as vertebral body fractures, and laminar fractures without disc disruption. Early dialogue with the neurosurgical team remains a central tenet of safe management of patients with combined maxillofacial and C-spine injuries.
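
Intersegmental movement of this kind is typically quantified as the change in angle between adjacent vertebral landmarks across positions. A minimal sketch, assuming anterior and posterior endplate landmarks digitized from the sagittal MRI; the study's exact measurement protocol may differ:

```python
# Sketch: intersegmental sagittal rotation from digitized MRI landmarks.
# Landmark coordinates are illustrative inputs, not study data.
import numpy as np

def sagittal_angle(sup_ant, sup_post, inf_ant, inf_post):
    """Angle (deg) between the endplate lines of adjacent vertebrae, each
    line defined by its anterior and posterior landmark (x, y in mm)."""
    u = np.asarray(sup_post, float) - np.asarray(sup_ant, float)
    v = np.asarray(inf_post, float) - np.asarray(inf_ant, float)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# Movement at a level = angle in the extended (tracheostomy) position minus
# the angle in the neutral position, from the same landmark pairs.
```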


2021 ◽  
Vol 12 ◽  
Author(s):  
Chloe Callahan-Flintoft ◽  
Christian Barentine ◽  
Jonathan Touryan ◽  
Anthony J. Ries

Using head-mounted displays (HMDs) in conjunction with virtual reality (VR), vision researchers are able to capture more naturalistic vision in an experimentally controlled setting. Namely, eye movements can be accurately tracked as they occur in concert with head movements as subjects navigate virtual environments. A benefit of this approach is that, unlike other mobile eye tracking (ET) setups in unconstrained settings, the experimenter has precise control over the location and timing of stimulus presentation, making it easier to compare findings between HMD studies and those that use monitor displays, which account for the bulk of previous work in eye movement research and the vision sciences more generally. Here, a visual discrimination paradigm is presented as a proof of concept to demonstrate the applicability of collecting eye and head tracking data from an HMD in VR for vision research. The current work's contribution is threefold: first, results demonstrate both the strengths and the weaknesses of recording and classifying eye and head tracking data in VR; second, a highly flexible graphical user interface (GUI) used to generate the current experiment is offered, to lower the software development start-up cost for future researchers transitioning to a VR space; and finally, the dataset analyzed here, comprising behavioral, eye, and head tracking data synchronized with environmental variables from a task specifically designed to elicit a variety of eye and head movements, could be an asset for testing future eye movement classification algorithms.
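
As a concrete example of the eye movement classification algorithms such a dataset could help test, here is a minimal velocity-threshold (I-VT) classifier; the 100 deg/s threshold, sampling-rate argument, and array layout are common defaults in the ET literature, not the paper's method:

```python
# Sketch: I-VT classification of gaze samples into saccades vs. fixations.
import numpy as np

def classify_ivt(gaze_deg, fs, threshold_deg_s=100.0):
    """Label each sample from its angular gaze velocity.
    gaze_deg: (N, 2) horizontal/vertical gaze-in-world angles in degrees;
    fs: sampling rate in Hz."""
    vel = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1) * fs  # deg/s
    labels = np.where(vel > threshold_deg_s, "saccade", "fixation")
    return np.append(labels, labels[-1])  # pad to the original length
```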


eLife ◽  
2021 ◽  
Vol 10 ◽  
Author(s):  
Sebastian H Zahler ◽  
David E Taylor ◽  
Joey Y Wong ◽  
Julia M Adams ◽  
Evan H Feinberg

Animals investigate their environments by directing their gaze towards salient stimuli. In the prevailing view, mouse gaze shifts entail head rotations followed by brainstem-mediated eye movements, including saccades to reset the eyes. These 'recentering' saccades are attributed to head movement-related vestibular cues. However, microstimulating mouse superior colliculus (SC) elicits directed head and eye movements resembling SC-dependent sensory-guided gaze shifts in other species, suggesting that mouse gaze shifts may be more flexible than has been recognized. We investigated this possibility by tracking eye and attempted head movements in a head-fixed preparation that eliminates head movement-related sensory cues. We found tactile stimuli evoke directionally biased saccades coincident with attempted head rotations. Differences in saccade endpoints across stimuli are associated with distinct stimulus-dependent relationships between initial eye position and saccade direction and amplitude. Optogenetic perturbations revealed SC drives these gaze shifts. Thus, head-fixed mice make sensory-guided, SC-dependent gaze shifts involving coincident, directionally biased saccades and attempted head movements. Our findings uncover flexibility in mouse gaze shifts and provide a foundation for studying head-eye coupling.


2021 ◽  
Vol 11 (4) ◽  
pp. 188-194
Author(s):  
Putri Ayu Zartika ◽  
Mila Kusumawardani ◽  
Koesmarijanto Koesmarijanto

People with physical disabilities that limit the use of their hands often face problems, one of which arises when they use a computer: the inability to grip and move a mouse is a frequent barrier to computer use. The purpose of this design is to give people with such disabilities the means to use a mouse that is moved based on head movements, without the noise interference inherent in the MPU-6050 sensor. The tests carried out show that the design of a mouse with the MPU-6050 sensor was successful: by implementing a Kalman filter as a noise reducer, the sensor achieved an average error percentage of 0.09% on the X axis and 0.12% on the Y axis. Data transmission from the mouse to the computer is done wirelessly using a Bluetooth HC-05 module, which received data well up to 12.5 meters with an error percentage of 0%. The mouse button, which performs a left click when bitten once, a right click when bitten twice, and a click-and-hold via a double left click, ran according to its commands with a 100% success rate.
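
The Kalman filter named above is, on this sensor, almost always the classic one-dimensional angle-plus-gyro-bias filter, run once per axis. A minimal sketch of that filter; the noise covariances are typical tuning values rather than the paper's, and the accelerometer angle and gyro rate are assumed to be already converted to degrees and deg/s:

```python
# Sketch: 1-D Kalman filter fusing MPU-6050 gyro rate and accelerometer tilt.
class TiltKalman:
    def __init__(self, q_angle=0.001, q_bias=0.003, r_measure=0.03):
        self.q_angle, self.q_bias, self.r = q_angle, q_bias, r_measure
        self.angle = 0.0                   # filtered tilt (deg)
        self.bias = 0.0                    # estimated gyro drift (deg/s)
        self.P = [[0.0, 0.0], [0.0, 0.0]]  # state covariance

    def update(self, accel_angle, gyro_rate, dt):
        # Predict: integrate the bias-corrected gyro rate.
        self.angle += dt * (gyro_rate - self.bias)
        P = self.P
        P[0][0] += dt * (dt * P[1][1] - P[0][1] - P[1][0] + self.q_angle)
        P[0][1] -= dt * P[1][1]
        P[1][0] -= dt * P[1][1]
        P[1][1] += self.q_bias * dt
        # Correct with the noisy accelerometer-derived angle.
        s = P[0][0] + self.r
        k0, k1 = P[0][0] / s, P[1][0] / s
        y = accel_angle - self.angle
        self.angle += k0 * y
        self.bias += k1 * y
        p00, p01 = P[0][0], P[0][1]
        P[0][0] -= k0 * p00
        P[0][1] -= k0 * p01
        P[1][0] -= k1 * p00
        P[1][1] -= k1 * p01
        return self.angle
```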


Neurology ◽  
2021 ◽  
Vol 98 (1 Supplement 1) ◽  
pp. S3.3-S4
Author(s):  
John Heick

Objective To compare equilibrium scores between computerized dynamic posturography tests of the Sensory Organization Test (SOT) and the Head Shake-Sensory Organization Test (HS-SOT) in healthy adults. Background Approximately 50% of the brain's pathways are related to vision, and many of these pathways are susceptible to injury in concussion. Visual-motor disruptions occur in 65%–90% of concussed patients. These disruptions impair balance and can be measured. The SOT is a computerized postural test that evaluates balance by altering visual, proprioceptive, and vestibular cues. The HS-SOT modifies 2 of the standard SOT conditions by including dynamic head motions that stimulate the semicircular canals within the vestibular system: unlike the SOT, where the head is static, the HS-SOT requires head movements, as if saying no repeatedly, at approximately 100°/second as measured by an accelerometer. Design/Methods Participants completed the Dizziness Handicap Inventory, Activities-specific Balance Confidence Scale, SOT, and HS-SOT in one session. Results Twenty-five individuals (17 females, 8 males; mean age, 21.08 ± 4.10 years; range, 18–33 years) completed the outcome measures and 3 trials of testing. There was a significant difference in mean values between the SOT and the HS-SOT for both condition 2 (t(16) = 3.034, p = 0.008) and condition 5 (t(16) = 5.706, p < 0.001). Additionally, there was a significant difference in mean values between the SOT and the foam HS-SOT for condition 2 (t(16) = 4.673, p < 0.001) and condition 5 (t(16) = 7.263, p < 0.001). There was not a significant difference in means between the foam and without-foam HS-SOT for condition 2 (t(16) = 1.77, p = 0.095) or condition 5 (t(16) = 1.825, p = 0.087). Conclusions The HS-SOT may quantify subtle balance deficits and enhance the clinical standard use of the SOT.
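
Each SOT/HS-SOT condition is scored with a NeuroCom-style equilibrium score that compares peak-to-peak sway to a theoretical ~12.5° stability limit, and the comparisons above are paired t-tests across subjects. A minimal sketch of both; the sway-limit constant is the standard published one, while the function arguments are placeholders rather than the study's data:

```python
# Sketch: equilibrium score plus the paired comparison reported above.
from scipy import stats

def equilibrium_score(theta_max_deg, theta_min_deg, sway_limit_deg=12.5):
    """100 = no sway; 0 = peak-to-peak sway reaching the stability limit."""
    sway = theta_max_deg - theta_min_deg
    return (sway_limit_deg - sway) / sway_limit_deg * 100.0

def compare_conditions(sot_scores, hssot_scores):
    """Paired t-test across subjects for one condition, as in the t(16)
    statistics above; inputs are per-subject equilibrium scores."""
    return stats.ttest_rel(sot_scores, hssot_scores)
```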


2021 ◽  
Author(s):  
Rachel Ege ◽  
A. John van Opstal ◽  
Marc Mathijs van Wanrooij

The ventriloquism aftereffect (VAE) describes the persistent shift of perceived sound location after adaptation to a ventriloquism condition, in which the sound is repeatedly paired with a displaced visual stimulus. Under that condition, participants consistently mislocalize the sound in the direction of the visual stimulus (the ventriloquism effect, VE). Previous studies provide conflicting reports regarding the strength of the VAE, ranging from 0 to nearly 100%. Moreover, there is controversy about its generalization to sounds other than the one inducing the VE, ranging from no transfer at all to full transfer across different sound spectra. Here, we imposed the VE for three different sounds: a low-frequency and a high-frequency narrow-band noise, and broadband Gaussian white noise (GWN). In the adaptation phase, listeners generated fast goal-directed head movements to localize the sound, presented across a 70 deg range in the horizontal plane, while ignoring a visual distracter that was consistently displaced 10 deg to the right of the sound. In the post-adaptation phase, participants localized narrow-band sounds with center frequencies from 0.5 to 8 kHz, as well as GWN, without the visual distracter. Our results show that the VAE amounted to approximately 40% of the VE and generalized well across the entire frequency domain. We also found that the strength of the VAE correlated with pre-adaptation sound-localization performance. We compare our results with previous reports and discuss different hypotheses regarding optimal audio-visual cue integration.
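
One common way to express the ~40% figure is to take the post-minus-pre shift in mean signed localization error and divide it by the 10 deg imposed audiovisual displacement. A minimal sketch under that assumption; variable names are illustrative, not the authors' code:

```python
# Sketch: aftereffect strength as a percentage of the imposed displacement.
import numpy as np

def response_bias(targets_deg, responses_deg):
    """Mean signed localization error (deg) across trials."""
    return float(np.mean(np.asarray(responses_deg) - np.asarray(targets_deg)))

def vae_percent(pre_targets, pre_resps, post_targets, post_resps,
                imposed_shift_deg=10.0):
    """Post-minus-pre bias shift relative to the audiovisual displacement;
    ~40% in this study."""
    shift = (response_bias(post_targets, post_resps)
             - response_bias(pre_targets, pre_resps))
    return 100.0 * shift / imposed_shift_deg
```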


Author(s):  
Pawan Lapborisuth ◽  
Sharath Koorathota ◽  
Qi Wang ◽  
Paul Sajda

Abstract Objective. Reorienting is central to how humans direct attention to different stimuli in their environment. Previous studies typically employ well-controlled paradigms with limited eye and head movements to study the neural and physiological processes underlying attention reorienting. Here, we aim to better understand the relationship between gaze and attention reorienting using a naturalistic virtual reality (VR)-based target detection paradigm. Approach. Subjects were navigated through a city and instructed to count the number of targets that appeared on the street. Subjects performed the task in a fixed condition with no head movement and in a free condition where head movements were allowed. Electroencephalography (EEG), gaze, and pupil data were collected. To investigate how neural and physiological reorienting signals are distributed across different gaze events, we used hierarchical discriminant component analysis (HDCA) to identify EEG- and pupil-based discriminating components. Mixed-effects general linear models (GLMs) were used to determine the correlation between these discriminating components and the timing of the different gaze events. HDCA was also used to combine EEG, pupil, and dwell time signals to classify reorienting events. Main results. In both EEG and pupil, dwell time contributes most significantly to the reorienting signals. However, when dwell times were orthogonalized against other gaze events, the distributions of the reorienting signals differed across the two modalities, with EEG reorienting signals leading the pupil reorienting signals. We also found that a hybrid classifier integrating EEG, pupil, and dwell time features detects the reorienting signals in both the fixed (AUC = 0.79) and the free (AUC = 0.77) conditions. Significance. We show that the neural and ocular reorienting signals are distributed differently across gaze events when a subject is immersed in VR, but can nevertheless be captured and integrated to classify target vs. distractor objects to which the subject orients.
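
HDCA itself learns discriminative spatial weights within successive time windows, but the fusion-and-AUC step it feeds can be sketched with an ordinary linear classifier. A minimal stand-in, assuming per-trial EEG and pupil component scores have already been extracted; this is not the authors' implementation:

```python
# Sketch: fuse per-trial EEG/pupil scores with dwell time and report
# cross-validated AUC for target vs. distractor fixations.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

def hybrid_auc(eeg_scores, pupil_scores, dwell_times, labels):
    """labels: 1 for target fixations, 0 for distractors; the three feature
    arrays are one value per trial."""
    X = np.column_stack([eeg_scores, pupil_scores, dwell_times])
    clf = LogisticRegression(max_iter=1000)
    probs = cross_val_predict(clf, X, labels, cv=5, method="predict_proba")[:, 1]
    return roc_auc_score(labels, probs)
```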

