head tracker
Recently Published Documents

TOTAL DOCUMENTS: 46 (FIVE YEARS: 6)
H-INDEX: 7 (FIVE YEARS: 0)

2021 · Vol 263 (1) · pp. 5071-5082
Author(s): William D'Andrea Fonseca, Davi Rocha Carvalho, Jacob Hollebon, Paulo Henrique Mareze, Filippo Maria Fazi

Binaural rendering is a technique that seeks to generate virtual auditory environments that replicate the natural listening experience, including the three-dimensional perception of spatialized sound sources. Real-time knowledge of the listener's position, and more specifically of their head and ear orientation, allows movement in the real world to be transferred to the virtual space, enabling richer immersion and interaction with the virtual scene. This study presents the use of a simple laptop-integrated camera (webcam) as a head-tracking sensor, removing the need to mount any hardware on the listener's head. The software is built on top of a state-of-the-art face landmark detection model from Google's MediaPipe library for Python. The coordinate system is manipulated to translate the origin from the camera to the center of the subject's head and to extract rotation matrices and Euler angles. Low-latency communication is provided via the User Datagram Protocol (UDP), allowing the head tracker to run in parallel and asynchronously with the main application. Empirical experiments demonstrated reasonable accuracy and quick response, indicating suitability for real-time applications that do not necessarily require high precision.
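The pipeline described above (face landmark detection, moving the origin to the head center, extracting rotation angles, and streaming them over UDP) can be sketched roughly as follows. This is a minimal illustration, not the authors' code: the landmark indices, 3D reference points, camera intrinsics, and UDP address/port are assumptions, and head pose is recovered here with OpenCV's solvePnP as a common stand-in for whatever geometry step the paper actually uses.

```python
# Minimal sketch of a webcam head tracker: MediaPipe face landmarks -> head pose -> UDP.
# Landmark indices, 3D reference points, camera intrinsics and the UDP target below are
# illustrative assumptions, not values taken from the paper.
import socket

import cv2
import numpy as np
import mediapipe as mp

# Rough 3D face model points (mm) and the MediaPipe landmark indices assumed to match them.
MODEL_POINTS = np.array([
    (0.0, 0.0, 0.0),        # nose tip
    (0.0, -63.6, -12.5),    # chin
    (-43.3, 32.7, -26.0),   # left eye outer corner
    (43.3, 32.7, -26.0),    # right eye outer corner
    (-28.9, -28.9, -24.1),  # left mouth corner
    (28.9, -28.9, -24.1),   # right mouth corner
], dtype=np.float64)
LANDMARK_IDS = [1, 152, 33, 263, 61, 291]  # assumed indices for the points above

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
TARGET = ("127.0.0.1", 9000)  # assumed port where the binaural renderer listens

cap = cv2.VideoCapture(0)
face_mesh = mp.solutions.face_mesh.FaceMesh(max_num_faces=1)

while cap.isOpened():
    grabbed, frame = cap.read()
    if not grabbed:
        break
    h, w = frame.shape[:2]
    result = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if not result.multi_face_landmarks:
        continue
    lm = result.multi_face_landmarks[0].landmark
    image_points = np.array([(lm[i].x * w, lm[i].y * h) for i in LANDMARK_IDS],
                            dtype=np.float64)

    # Pinhole approximation: focal length ~ image width, principal point at the centre.
    cam = np.array([[w, 0, w / 2], [0, w, h / 2], [0, 0, 1]], dtype=np.float64)
    solved, rvec, tvec = cv2.solvePnP(MODEL_POINTS, image_points, cam, None)
    if not solved:
        continue
    R, _ = cv2.Rodrigues(rvec)  # head rotation matrix relative to the camera
    yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
    pitch = np.degrees(np.arctan2(-R[2, 0], np.hypot(R[2, 1], R[2, 2])))
    roll = np.degrees(np.arctan2(R[2, 1], R[2, 2]))

    # Fire-and-forget UDP packet so the audio application never blocks on the tracker.
    sock.sendto(f"{yaw:.2f},{pitch:.2f},{roll:.2f}".encode(), TARGET)
```

Because UDP is connectionless, the tracker can keep sending pose packets at camera rate while the renderer consumes only the most recent one, which is what allows the two processes to run asynchronously.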


Sensors · 2021 · Vol 21 (6) · pp. 2237
Author(s): Iosune Salinas-Bueno, Maria Francesca Roig-Maimó, Pau Martínez-Bueso, Katia San-Sebastián-Fernández, Javier Varona, ...

Vision-based interfaces are used to monitor human motion. In particular, camera-based head trackers interpret the movement of the user's head to interact with devices. Neck pain is one of the most significant musculoskeletal conditions in terms of prevalence and years lived with disability. A common treatment is therapeutic exercise, which requires high motivation and adherence to treatment. In this work, we conduct an exploratory experiment to validate the use of a non-invasive camera-based head tracker for monitoring neck movements, by means of an exergame for performing rehabilitation exercises on a mobile device. Two experiments explored its feasibility: (1) validating the neck range of motion (ROM) that the camera-based head tracker was able to detect; and (2) ensuring that the mobile application is safe in terms of the neck ROM it demands. The results not only confirmed safety, in terms of the ROM required of different preset patient profiles according to previously established safety parameters, but also demonstrated the effectiveness of the camera-based head tracker in monitoring neck movements for rehabilitation purposes.
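As a rough illustration of the safety check in point (2), the sketch below compares the neck angles an exergame target would demand against per-profile ROM limits. The profile names and threshold values are assumptions for demonstration, not figures from the paper.

```python
# Illustrative sketch: verify that the neck movement an exergame target demands stays
# within a patient profile's safe range of motion (ROM). Profiles and limits are
# assumed values, not the paper's safety parameters.
from dataclasses import dataclass


@dataclass
class RomLimits:
    flexion: float      # degrees, chin towards chest
    extension: float    # degrees, looking up
    rotation: float     # degrees, turning left/right
    lateral: float      # degrees, ear towards shoulder


PROFILES = {
    "healthy_adult": RomLimits(flexion=50, extension=55, rotation=70, lateral=40),
    "acute_neck_pain": RomLimits(flexion=30, extension=30, rotation=45, lateral=25),
}


def target_is_safe(profile: str, pitch: float, yaw: float, roll: float) -> bool:
    """Return True if reaching a target at (pitch, yaw, roll) stays within the profile's ROM."""
    lim = PROFILES[profile]
    flex_ok = -lim.extension <= pitch <= lim.flexion  # +pitch = flexion, -pitch = extension
    rot_ok = abs(yaw) <= lim.rotation
    lat_ok = abs(roll) <= lim.lateral
    return flex_ok and rot_ok and lat_ok


if __name__ == "__main__":
    print(target_is_safe("acute_neck_pain", pitch=10, yaw=40, roll=0))  # True
    print(target_is_safe("acute_neck_pain", pitch=10, yaw=60, roll=0))  # False: rotation too large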


2021 · Vol 13 (1) · pp. 47-58
Author(s): P. P. Debnath, M. G. Rashed, D. Das, M. R. Basar

This paper presents an approach to detect and track the focus of attention of a suspect, using eye gaze and head movement direction, in order to build an automatic interrogation system aimed at detecting lies. To this end, we classified interrogation conversations into different categories and identified the critical ones. We first conducted psychological experiments on a sampled population to identify the parameters associated with the symptoms that appear when a suspect lies, and built a knowledge base from the results. This knowledge base helps the system make strategic decisions and optimize accuracy. A monitoring camera continuously records the interrogation and feeds frames to the proposed system. A 3D head tracker tracks the head in the image, and an Active Shape Model (ASM) localizes facial landmark points. The Vector Field of Image Gradient (VFIG) is computed to track the eyeball and its rotation within the eye region. Random eye and head movements and changes of the eyebrows at critical points in the questionnaire indicate possible lying. Finally, experiments were conducted in a controlled environment to validate our psychological findings.
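The VFIG step is in the family of gradient-field eye-centre localisation methods, where the pupil centre is taken as the point whose displacement vectors to edge pixels best align with the image gradients there. The sketch below shows that general idea on a grayscale eye-region crop; it is not the authors' implementation, and the exact objective used in the paper may differ.

```python
# Sketch of gradient-based eye-centre localisation on a cropped grayscale eye region.
# Brute-force search over candidate centres; illustrates the idea behind gradient-field
# methods such as VFIG, not the paper's exact formulation.
import numpy as np


def eye_centre(eye: np.ndarray) -> tuple:
    """Return (row, col) of the estimated eye centre in a grayscale eye crop."""
    eye = eye.astype(np.float64)
    gy, gx = np.gradient(eye)                 # image gradients along rows and columns
    mag = np.hypot(gx, gy)
    mask = mag > mag.mean()                   # keep only strong gradients (pupil/limbus edges)
    ys, xs = np.nonzero(mask)
    gx, gy = gx[mask] / mag[mask], gy[mask] / mag[mask]

    h, w = eye.shape
    best, best_score = (h // 2, w // 2), -np.inf
    for cy in range(h):
        for cx in range(w):
            dy, dx = ys - cy, xs - cx
            norm = np.hypot(dx, dy) + 1e-9
            # Alignment between displacement vectors and gradient directions.
            dots = (dx / norm) * gx + (dy / norm) * gy
            score = np.mean(np.maximum(dots, 0.0) ** 2)
            score *= 255.0 - eye[cy, cx]      # favour dark (pupil) pixels as centres
            if score > best_score:
                best_score, best = score, (cy, cx)
    return best
```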


2019 · Vol 8 (3) · pp. 3045-3050

A head tracker is a crucial part of a head-mounted display system, as it tracks the pilot's head in the aircraft or cockpit simulator. The performance of a head tracker also depends on environmental conditions such as lighting and stray-light interference. In this paper, an optical tracker is employed to gather 6-DoF head-movement data under different environmental conditions. The effect of these conditions, and of the variation in distance between the receiver and the optical transmitter, on the 6-DoF data is analyzed. This can help predict the accuracy of an optical head tracker under different environmental conditions prior to its deployment in the aircraft.
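A hedged sketch of the kind of analysis described: grouping logged 6-DoF samples by lighting condition and transmitter-receiver distance and computing the error against reference poses. The file layout and column names are assumptions, not the paper's data format.

```python
# Illustrative analysis of 6-DoF head-tracker logs: per-condition accuracy (RMSE) against
# reference poses. CSV layout and column names are assumed for this sketch.
import numpy as np
import pandas as pd

DOF_COLS = ["x", "y", "z", "roll", "pitch", "yaw"]


def per_condition_rmse(log_csv: str, ref_csv: str) -> pd.DataFrame:
    """RMSE of each DoF, grouped by lighting condition and transmitter-receiver distance."""
    log = pd.read_csv(log_csv)   # assumed columns: sample_id, lighting, distance_m, x..yaw
    ref = pd.read_csv(ref_csv)   # assumed columns: sample_id, x..yaw (ground-truth pose)
    merged = log.merge(ref, on="sample_id", suffixes=("", "_ref"))
    err_cols = []
    for c in DOF_COLS:
        merged[f"err_{c}"] = merged[c] - merged[f"{c}_ref"]
        err_cols.append(f"err_{c}")
    rmse = (merged.groupby(["lighting", "distance_m"])[err_cols]
                  .agg(lambda s: np.sqrt(np.mean(s ** 2))))
    return rmse.rename(columns=dict(zip(err_cols, DOF_COLS)))


if __name__ == "__main__":
    print(per_condition_rmse("tracker_log.csv", "reference_poses.csv"))
```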


2019 · Vol 6 · pp. 205566831984130
Author(s): Nahal Norouzi, Luke Bölling, Gerd Bruder, Greg Welch

Introduction: A large body of research in the field of virtual reality is focused on making user interfaces more natural and intuitive by leveraging natural body movements to explore a virtual environment. For example, head-tracked user interfaces allow users to look around a virtual space naturally by moving their head. However, such approaches may not be appropriate for users with temporary or permanent limitations of their head movement. Methods: In this paper, we present techniques that allow these users to gain the benefits of such interfaces from a reduced range of physical movement. Specifically, we describe two techniques that augment virtual rotations relative to physical movement thresholds. Results: We describe how each of the two techniques can be implemented with either a head tracker or an eye tracker, e.g. in cases where no physical head rotations are possible. Conclusions: We discuss their differences and limitations and provide guidelines for the practical use of such augmented user interfaces.
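One simple way to augment virtual rotation relative to a physical movement threshold is sketched below: physical yaw within a comfort threshold maps one-to-one, and rotation beyond the threshold is amplified by a gain so a limited physical range still reaches large virtual rotations. The threshold and gain values are illustrative assumptions; the paper's two techniques may use different mappings.

```python
# Sketch of threshold-based rotation augmentation. Threshold and gain are illustrative;
# the paper's exact mapping functions may differ.
def augmented_yaw(physical_yaw_deg: float,
                  threshold_deg: float = 15.0,
                  gain: float = 4.0) -> float:
    """Map a physical head (or eye) yaw to a virtual yaw.

    Within +/- threshold_deg the mapping is one-to-one; beyond it, the extra physical
    rotation is amplified by `gain`, so a user with a limited range of head movement
    can still reach large virtual rotations.
    """
    if abs(physical_yaw_deg) <= threshold_deg:
        return physical_yaw_deg
    sign = 1.0 if physical_yaw_deg > 0 else -1.0
    excess = abs(physical_yaw_deg) - threshold_deg
    return sign * (threshold_deg + gain * excess)


if __name__ == "__main__":
    for phys in (5, 15, 20, 30):
        print(phys, "->", augmented_yaw(phys))  # 5->5, 15->15, 20->35, 30->75
```

The same mapping can be driven by an eye tracker instead of a head tracker by feeding in gaze yaw, which matches the case described above where no physical head rotation is possible.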


Author(s): Carlos Santos, Alexandre Freitas, Brunelli Miranda, Nikolas Carneiro, Tiago Araujo, ...

Author(s): Maria Francesca Roig-Maimó, Cristina Manresa-Yee, Javier Varona, I. Scott MacKenzie
