A real-time head tracker for autostereoscopic display

Author(s): Song Guo, Phil Surman, Zhenfeng Zhuang, Xiao Wei Sun
Optik, 2013, Vol 124 (4), pp. 297-300
Author(s): Xiao-Qing Xu, Qiong-Hua Wang, Jun Liu, Jun Gu, Lei Li

1993
Author(s): Steven C. Gustafson, Gordon R. Little, Thomas P. Staub, John S. Loomis, Jay M. Brown, ...

2021, Vol 263 (1), pp. 5071-5082
Author(s): William D'Andrea Fonseca, Davi Rocha Carvalho, Jacob Hollebon, Paulo Henrique Mareze, Filippo Maria Fazi

Binaural rendering is a technique that seeks to generate virtual auditory environments that replicate the natural listening experience, including the three-dimensional perception of spatialized sound sources. As such, real-time knowledge of the listener's position, or more specifically, of their head and ear orientations, allows the transfer of movement from the real world to virtual spaces, which in turn enables richer immersion and interaction with the virtual scene. This study presents the use of a simple laptop-integrated camera (webcam) as a head-tracking sensor, removing the need to mount any hardware on the listener's head. The software was built on top of a state-of-the-art face landmark detection model from Google's MediaPipe library for Python. Manipulations of the coordinate system are performed in order to translate the origin from the camera to the center of the subject's head and to extract rotation matrices and Euler angles. Low-latency communication is enabled via the User Datagram Protocol (UDP), allowing the head tracker to run in parallel and asynchronously with the main application. Empirical experiments have demonstrated reasonable accuracy and quick response, indicating suitability for real-time applications that do not require methodical precision.
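The pipeline the abstract describes (rotation matrix from the face-landmark model, Euler-angle extraction, then UDP datagrams to the audio renderer) can be sketched in a few lines of Python. This is a minimal illustration under stated assumptions, not the authors' code: the Z-Y-X (yaw-pitch-roll) angle convention, the packet layout, and the function names `rotation_to_euler` and `send_pose` are assumptions for the sketch, and the MediaPipe detection step is omitted.

```python
import math
import socket
import struct

def rotation_to_euler(R):
    """Extract (yaw, pitch, roll) in radians from a 3x3 rotation matrix,
    assuming the Z-Y-X (yaw-pitch-roll) composition R = Rz * Ry * Rx."""
    # sy = cos(pitch); near zero means gimbal lock (pitch ~ +/-90 degrees)
    sy = math.sqrt(R[0][0] ** 2 + R[1][0] ** 2)
    if sy > 1e-6:
        roll = math.atan2(R[2][1], R[2][2])
        pitch = math.atan2(-R[2][0], sy)
        yaw = math.atan2(R[1][0], R[0][0])
    else:
        # gimbal lock: yaw and roll are coupled; conventionally set yaw = 0
        roll = math.atan2(-R[1][2], R[1][1])
        pitch = math.atan2(-R[2][0], sy)
        yaw = 0.0
    return yaw, pitch, roll

def send_pose(sock, addr, yaw, pitch, roll):
    """Pack the three angles as little-endian floats and fire one UDP
    datagram; connectionless sends never block the tracking loop."""
    sock.sendto(struct.pack("<3f", yaw, pitch, roll), addr)

# Identity rotation (head facing the camera) yields all-zero angles.
yaw, pitch, roll = rotation_to_euler([[1, 0, 0], [0, 1, 0], [0, 0, 1]])
```

In the tracker loop, each camera frame would produce one rotation matrix, and `send_pose(sock, ("127.0.0.1", 9000), yaw, pitch, roll)` would push the result to the renderer; because UDP is fire-and-forget, a dropped packet simply means the renderer keeps the previous head pose for one frame.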


1979, Vol 44, pp. 41-47
Author(s): Donald A. Landman

This paper describes some recent results of our quiescent prominence spectrometry program at the Mees Solar Observatory on Haleakala. The observations were made with the 25 cm coronagraph/coudé spectrograph system using a silicon vidicon detector. This detector consists of 500 contiguous channels covering approximately 6 Å or 80 Å, depending on the grating used. The instrument is interfaced to the Observatory's PDP 11/45 computer system, and has the important advantages of wide spectral response, linearity, and signal averaging with real-time display. Its principal drawback is the relatively small target size. For the present work, the aperture was about 3″ × 5″. Absolute intensity calibrations were made by measuring quiet regions near sun center.


Author(s): Alan S. Rudolph, Ronald R. Price

We have employed cryoelectron microscopy to visualize events that occur during the freeze-drying of artificial membranes, using real-time video capture techniques. Artificial membranes, or liposomes, are spherical structures with an internal aqueous space; they are stabilized by water, which provides the driving force for their spontaneous self-assembly. Previous assays of the damage induced by freeze-drying reveal that the two principal deleterious events are 1) fusion of liposomes and 2) leakage of contents trapped within the liposome [1]. In the past, the only way to assess these events was to examine the liposomes after the dehydration event. This technique allows the events to be monitored in real time as the liposomes destabilize and as water is sublimed at cryo temperatures in the vacuum of the microscope. The mechanisms by which liposomes are compromised by freeze-drying are largely unknown. This technique has shown that cryoprotectants such as glycerol and carbohydrates are able to maintain liposomal structure throughout the drying process.

