Prospective head tracking in infants: Head movements, accuracy and timing in relation to a circular object motion

2009 ◽  
Author(s):  
Bert JONSSON ◽  
Louise RÖNNQVIST ◽  
Erik DOMELLÖF


2000 ◽  
Vol 84 (3) ◽  
pp. 1614-1626 ◽  
Author(s):  
Timothy Belton ◽  
Robert A. McCrea

The contribution of the flocculus region of the cerebellum to horizontal gaze pursuit was studied in squirrel monkeys. When the head was free to move, the monkeys pursued targets with a combination of smooth eye and head movements, with the majority of the gaze velocity produced by smooth tracking head movements. In the accompanying study we reported that the flocculus region was necessary for cancellation of the vestibuloocular reflex (VOR) evoked by passive whole body rotation. The question addressed in this study was whether the flocculus region of the cerebellum also plays a role in canceling the VOR produced by active head movements during gaze pursuit. The firing behavior of 121 Purkinje (Pk) cells that were sensitive to horizontal smooth pursuit eye movements was studied. The sample included 66 eye velocity Pk cells and 55 gaze velocity Pk cells. All of the cells remained sensitive to smooth pursuit eye movements during combined eye and head tracking. Eye velocity Pk cells were insensitive to smooth pursuit head movements. Gaze velocity Pk cells were nearly as sensitive to active smooth pursuit head movements as they were to passive whole body rotation, but they were less than half as sensitive (≈43%) to smooth pursuit head movements as they were to smooth pursuit eye movements. Considered as a whole, the Pk cells in the flocculus region of the cerebellar cortex were <20% as sensitive to smooth pursuit head movements as they were to smooth pursuit eye movements, which suggests that this region does not produce signals sufficient to cancel the VOR during smooth head tracking. The comparative effect of injections of muscimol into the flocculus region on smooth pursuit eye and head movements was studied in two monkeys. Muscimol inactivation of the flocculus region profoundly affected smooth pursuit eye movements but had little effect on smooth pursuit head movements or on smooth tracking of visual targets when the head was free to move. We conclude that the signals produced by flocculus region Pk cells are neither necessary nor sufficient to cancel the VOR during gaze pursuit.


2003 ◽  
Vol 14 (4) ◽  
pp. 340-346 ◽  
Author(s):  
Mark Wexler

Although visual input is egocentric, at least some visual perceptions and representations are allocentric, that is, independent of the observer's vantage point or motion. Three experiments investigated the visual perception of three-dimensional object motion during voluntary and involuntary motion in human subjects. The results show that the motor command contributes to the objective perception of space: Observers are more likely to apply, consciously and unconsciously, spatial criteria relative to an allocentric frame of reference when they are executing voluntary head movements than while they are undergoing similar involuntary displacements (which lead to a more egocentric bias). Furthermore, details of the motor command are crucial to spatial vision, as allocentric bias decreases or disappears when self-motion and motor command do not match.


2015 ◽  
Vol 3 ◽  
pp. 829-836 ◽  
Author(s):  
Ilja T. Feldstein ◽  
Alexander Güntner ◽  
Klaus Bengler

2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Tong Xiao ◽  
Xiaojun Qiu ◽  
Benjamin Halkon

One enduring challenge for controlling high frequency sound in local active noise control (ANC) systems is to obtain the acoustic signal at the specific location to be controlled. In some applications such as in ANC headrest systems, it is not practical to install error microphones in a person’s ears to provide the user a quiet or optimally acoustically controlled environment. Many virtual error sensing approaches have been proposed to estimate the acoustic signal remotely with the current state-of-the-art method using an array of four microphones and a head tracking system to yield sound reduction up to 1 kHz for a single sound source. In the work reported in this paper, a novel approach of incorporating remote acoustic sensing using a laser Doppler vibrometer into an ANC headrest system is investigated. In this “virtual ANC headphone” system, a lightweight retro-reflective membrane pick-up is mounted in each synthetic ear of a head and torso simulator to determine the sound in the ear in real-time with minimal invasiveness. The membrane design and the effects of its location on the system performance are explored, the noise spectra in the ears without and with ANC for a variety of relevant primary sound fields are reported, and the performance of the system during head movements is demonstrated. The test results show that at least 10 dB sound attenuation can be realised in the ears over an extended frequency range (from 500 Hz to 6 kHz) under a complex sound field and for several common types of synthesised environmental noise, even in the presence of head motion.
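The abstract does not spell out the control algorithm, but feedforward ANC systems of this kind are commonly built on the filtered-x LMS (FxLMS) update. The sketch below is a minimal single-channel illustration, not the authors' implementation; the filter length, step size, and the assumption of a known secondary-path model are all illustrative.

```python
import numpy as np

def fxlms_anc(reference, disturbance, sec_path, n_taps=32, mu=0.02):
    """Single-channel filtered-x LMS sketch: adapt a control filter w so
    that the anti-noise (w applied to the reference, then passed through
    the secondary path) cancels the disturbance at the error sensor.
    Assumes the secondary-path model sec_path is known and n_taps >= len(sec_path)."""
    sec_path = np.asarray(sec_path, dtype=float)
    w = np.zeros(n_taps)                        # adaptive control filter
    x_buf = np.zeros(n_taps)                    # reference history, newest first
    fx_buf = np.zeros(n_taps)                   # filtered-reference history
    y_buf = np.zeros(len(sec_path))             # control-signal history
    errors = np.empty(len(reference))
    for n, x in enumerate(reference):
        x_buf = np.roll(x_buf, 1); x_buf[0] = x
        y = w @ x_buf                           # control (anti-noise) signal
        y_buf = np.roll(y_buf, 1); y_buf[0] = y
        anti = sec_path @ y_buf                 # anti-noise at the error sensor
        e = disturbance[n] - anti               # residual noise at the ear
        fx = sec_path @ x_buf[:len(sec_path)]   # reference filtered by the
        fx_buf = np.roll(fx_buf, 1); fx_buf[0] = fx  # secondary-path model
        w = w + mu * e * fx_buf                 # LMS weight update
        errors[n] = e
    return errors
```

For a tonal disturbance and an accurate secondary-path model, the residual at the (real or virtual) error sensor decays toward zero; remote sensing approaches such as the one in the paper replace the physical error microphone with an estimate of this residual at the ear.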


2019 ◽  
Vol 8 (3) ◽  
pp. 3045-3050

A head tracker is a crucial part of head-mounted display systems, as it tracks the head of the pilot in the plane/cockpit simulator. The operational flaws of head trackers also depend on environmental conditions such as varying lighting and stray-light interference. In this paper, an optical tracker is employed to gather the 6-DoF data of head movements under different environmental conditions. The effects of these environmental conditions, and of the variation in distance between the receiver and the optical transmitter, on the 6-DoF data are analyzed. This can help in predicting the accuracy of an optical head tracker under different environmental conditions prior to its deployment in the aircraft.
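As a simple illustration of the kind of accuracy analysis described, per-axis RMS error of the tracker's 6-DoF readings against a reference pose can be computed under each environmental condition; the function and column layout below are assumptions for illustration, not from the paper.

```python
import numpy as np

def dof6_rmse(measured, reference):
    """Per-axis RMS error between measured and reference 6-DoF poses.
    Rows are samples; columns are assumed (x, y, z, roll, pitch, yaw)."""
    measured = np.asarray(measured, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return np.sqrt(np.mean((measured - reference) ** 2, axis=0))
```

Running this separately per lighting condition and per transmitter-receiver distance yields the condition-by-condition accuracy profile such a study reports.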


2021 ◽  
Vol 12 ◽  
Author(s):  
Chloe Callahan-Flintoft ◽  
Christian Barentine ◽  
Jonathan Touryan ◽  
Anthony J. Ries

Using head mounted displays (HMDs) in conjunction with virtual reality (VR), vision researchers are able to capture more naturalistic vision in an experimentally controlled setting. Namely, eye movements can be accurately tracked as they occur in concert with head movements as subjects navigate virtual environments. A benefit of this approach is that, unlike other mobile eye tracking (ET) set-ups in unconstrained settings, the experimenter has precise control over the location and timing of stimulus presentation, making it easier to compare findings between HMD studies and those that use monitor displays, which account for the bulk of previous work in eye movement research and vision sciences more generally. Here, a visual discrimination paradigm is presented as a proof of concept to demonstrate the applicability of collecting eye and head tracking data from an HMD in VR for vision research. The current work’s contribution is threefold: first, results demonstrating both the strengths and the weaknesses of recording and classifying eye and head tracking data in VR; second, a highly flexible graphical user interface (GUI) used to generate the current experiment, offered to lower the software development start-up cost for future researchers transitioning to a VR space; and third, the dataset analyzed here, of behavioral, eye, and head tracking data synchronized with environmental variables from a task specifically designed to elicit a variety of eye and head movements, which could be an asset in testing future eye movement classification algorithms.
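The abstract does not name a specific classification algorithm; a common baseline for labeling such data is velocity-threshold identification (I-VT), sketched below. The input layout and the 30 deg/s cutoff are illustrative assumptions, not details from the paper.

```python
import numpy as np

def classify_ivt(gaze_deg, timestamps, threshold_deg_s=30.0):
    """Velocity-threshold (I-VT) sketch: label each inter-sample interval
    a saccade if angular gaze speed exceeds the threshold, else a fixation.
    gaze_deg: (N, 2) array of gaze angles in degrees (e.g. a combined
    eye-plus-head direction); timestamps: (N,) array in seconds."""
    deltas = np.diff(gaze_deg, axis=0)           # per-interval angular change
    dt = np.diff(timestamps)                     # per-interval duration
    speed = np.linalg.norm(deltas, axis=1) / dt  # angular speed, deg/s
    return np.where(speed > threshold_deg_s, "saccade", "fixation")
```

In HMD data the head contribution matters: thresholding raw eye-in-head velocity versus gaze-in-world velocity can label the same interval differently, which is one reason a synchronized eye-plus-head dataset is useful for testing classifiers.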


2020 ◽  
Vol 2020 (3) ◽  
pp. 303-1-303-12
Author(s):  
Muratcan Cicek ◽  
Jinrong Xie ◽  
Qiaosong Wang ◽  
Robinson Piramuthu

Shopping is difficult for people with motor impairments, and this includes online shopping. Proprietary software can emulate mouse and keyboard via head tracking; however, such a solution is not common for smartphones, even though, unlike desktop and laptop computers, they are much easier to carry indoors and outdoors. To address this, we implement and open-source a button that is sensitive to head movements tracked from the front camera of the iPhone X. This allows developers to integrate it into eCommerce applications easily, without requiring specialized knowledge. Other applications include gaming and use in hands-free situations such as during cooking or auto repair. We built a sample online shopping application that allows users to easily browse items from various categories and take relevant action just by head movements. We present results of user studies on this sample application and also include sensitivity studies based on two independent tests performed at three different distances from the screen.
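The abstract does not describe the pointer mapping; a typical hands-free design maps head yaw/pitch (as reported by a face tracking API) to a screen point and uses dwell time for selection. The sketch below is hypothetical: the motion ranges, screen size, and dwell interval are illustrative, and real systems calibrate them per user.

```python
import math

def head_pose_to_point(yaw_rad, pitch_rad, screen_w, screen_h,
                       max_yaw=math.radians(15), max_pitch=math.radians(12)):
    """Map head yaw/pitch (radians, zero = facing the screen) to a screen
    point, clamping at an assumed comfortable range of motion."""
    nx = max(-1.0, min(1.0, yaw_rad / max_yaw))      # -1..1 across the range
    ny = max(-1.0, min(1.0, pitch_rad / max_pitch))
    x = (nx + 1.0) / 2.0 * screen_w
    y = (ny + 1.0) / 2.0 * screen_h
    return x, y

class DwellButton:
    """Fire once the pointer has stayed inside the button for dwell_s seconds."""
    def __init__(self, rect, dwell_s=1.0):
        self.rect = rect                 # (x, y, width, height)
        self.dwell_s = dwell_s
        self.inside_since = None         # time the pointer entered, or None
    def update(self, x, y, t):
        bx, by, bw, bh = self.rect
        if bx <= x <= bx + bw and by <= y <= by + bh:
            if self.inside_since is None:
                self.inside_since = t
            return t - self.inside_since >= self.dwell_s
        self.inside_since = None
        return False
```

Dwell-based activation avoids requiring a separate "click" gesture, which is one reason it is common in head- and gaze-controlled interfaces.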


2019 ◽  
Vol 9 (9) ◽  
pp. 1760 ◽  
Author(s):  
Rong Han ◽  
Ming Wu ◽  
Chen Gong ◽  
Shangshuai Jia ◽  
Tieli Han ◽  
...  

An active headrest can reduce low-frequency noise around the ears based on the principle of active noise control. This paper presents a combination of a robust algorithm and head tracking for a feedforward active headrest to reduce broadband noise for a sleeper on a high-speed train. A robust algorithm based on feedforward active noise control is proposed to improve noise control performance during head rotations. A head-tracking system with infrared rangefinders tracks the head position based on the Kalman filter to further improve system performance during head movements. Experiments were conducted on a model of a sleeper on a high-speed train. The experimental results show that the proposed active headrest system effectively controls broadband noise with head movements and rotations.
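The abstract mentions Kalman-filter-based head tracking from infrared rangefinder readings. A minimal constant-velocity Kalman filter for one head coordinate might look as follows; the process and measurement noise values are illustrative assumptions, not from the paper.

```python
import numpy as np

def kalman_track(measurements, dt=0.02, q=1e-3, r=1e-2):
    """Constant-velocity Kalman filter sketch for one head coordinate.
    State is [position, velocity]; measurements are noisy range readings
    (e.g. from an infrared rangefinder) of the position alone."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
    H = np.array([[1.0, 0.0]])              # only position is measured
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])     # process noise covariance
    R = np.array([[r]])                     # measurement noise covariance
    x = np.array([measurements[0], 0.0])    # initial state estimate
    P = np.eye(2)                           # initial state covariance
    out = []
    for z in measurements:
        # predict step
        x = F @ x
        P = F @ P @ F.T + Q
        # update step
        S = H @ P @ H.T + R                 # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x = x + K @ (np.array([z]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)
```

Smoothed position estimates like these allow the controller to keep its quiet zones centered on the ears as the head moves, rather than reacting to raw sensor jitter.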


Perception ◽  
1992 ◽  
Vol 21 (5) ◽  
pp. 569-582 ◽  
Author(s):  
Michael T Swanston ◽  
Nicholas J Wade

The motion aftereffect (MAE) was measured with retinally moving vertical gratings positioned above and below (flanking) a retinally stationary central grating (experiments 1 and 2). Motion over the retina was produced by leftward motion of the flanking gratings relative to the stationary eyes, and by rightward eye or head movements tracking the moving (but retinally stationary) central grating relative to the stationary (but retinally moving) surround gratings. In experiment 1 the motion occurred within a fixed boundary on the screen, and oppositely directed MAEs were produced in the central and flanking gratings with static fixation; but with eye or head tracking MAEs were reported only in the central grating. In experiment 2 motion over the retina was equated for the static and tracking conditions by moving blocks of grating without any dynamic occlusion and disclosure at the boundaries. Both conditions yielded equivalent leftward MAEs of the central grating in the same direction as the prior flanking motion, ie an MAE was consistently produced in the region that had remained retinally stationary. No MAE was recorded in the flanking gratings, even though they moved over the retina during adaptation. When just two gratings were presented, MAEs were produced in both, but in opposite directions (experiments 3 and 4). It is concluded that the MAE is a consequence of adapting signals for the relative motion between elements of a display.

