73-2: A Wide View Glass-less 3D Display with Head-Tracking System for Horizontal and Vertical Directions

2016 ◽  
Vol 47 (1) ◽  
pp. 990-993 ◽  
Author(s):  
Daichi Suzuki ◽  
Takeo Koito ◽  
Taiki Kasai ◽  
Hiroki Sugiyama ◽  
Shuji Hayashi ◽  
...  

2013 ◽  
Vol 28 (2) ◽  
pp. 233-237 ◽  
Author(s):  
陈瑞改 CHEN Rui-gai ◽  
陶宇虹 TAO Yu-hong ◽  
谢佳 XIE Jia ◽  
张永栋 ZHANG Yong-dong ◽  
李曙新 LI Shu-xin

Drones ◽  
2019 ◽  
Vol 3 (2) ◽  
pp. 37 ◽  
Author(s):  
Rizwan ◽  
Shehzad ◽  
Awais

Air transport is the fastest way to reach areas with no direct land routes for ambulances. This paper presents the development of a quadcopter-based rapid response unit within an efficient aerial aid system that eliminates the delay in delivering first-aid supplies. The system comprises a health-monitoring and calling system for a field person working in open areas, and a base station with the quadcopter. In an emergency, the quadcopter is deployed from the base station to the field person for immediate help along a specified path, using persistent Global System for Mobile Communications (GSM) and Global Positioning System (GPS) connections. The entire operation can be monitored at the base station with a smartphone-supported Virtual Reality (VR) head-tracking system. The camera installed on the quadcopter is synchronized with the head movement of the operator wearing the VR head-tracking system at the base station. Moreover, an Infrared (IR)-based obstacle-evasion model is implemented separately to demonstrate the working of the autonomous collision-avoidance system. Testing confirmed that the system reduces the response time for supplying aid to the target locations.
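The abstract does not detail the IR obstacle-evasion model, but a minimal threshold-based evasion rule of the kind such systems typically use can be sketched as follows. All names, readings, and the clearance threshold here are illustrative assumptions, not taken from the paper.

```python
# Sketch of a threshold-based IR obstacle-evasion rule for a quadcopter,
# assuming three IR range readings (front, left, right) in centimetres.
# The threshold and command names are illustrative, not from the paper.

SAFE_DISTANCE_CM = 60  # assumed minimum clearance before evasive action


def evasive_action(front_cm, left_cm, right_cm):
    """Return a steering command given three IR distance readings."""
    if front_cm >= SAFE_DISTANCE_CM:
        return "forward"        # path ahead is clear
    # Obstacle ahead: turn toward the side with more clearance.
    if left_cm > right_cm:
        return "turn_left"
    if right_cm > left_cm:
        return "turn_right"
    return "hover"              # boxed in on both sides: hold position
```

For example, `evasive_action(30, 80, 40)` yields `"turn_left"`, since the front reading is below the safety threshold and the left side offers more clearance.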


2003 ◽  
Vol 12 (1) ◽  
pp. 1-18 ◽  
Author(s):  
Winyu Chinthammit ◽  
Eric J. Seibel ◽  
Thomas A. Furness

The operation and performance of a six degree-of-freedom (DOF) shared-aperture tracking system with image overlay are described. This unique tracking technology shares the same aperture, or scanned optical beam, with the visual display, a virtual retinal display (VRD). This display technology provides high brightness in an AR helmet-mounted display, especially in the extreme environment of a military cockpit. The VRD generates an image by optically scanning visible light directly onto the viewer's eye. By scanning both visible and infrared light, the head-worn display can be directly coupled to a head-tracking system. As a result, the proposed tracking system requires minimal calibration between the user's viewpoint and the tracker's viewpoint. This paper demonstrates that the proposed shared-aperture tracking system achieves high accuracy and computational efficiency. The current proof-of-concept system has a precision of ±0.05 and ±0.01 deg. in the horizontal and vertical axes, respectively. The static registration error was measured to be 0.08 ± 0.04 and 0.03 ± 0.02 deg. for the horizontal and vertical axes, respectively. The dynamic registration error, or system latency, was measured to be within 16.67 ms, equivalent to the display refresh rate of 60 Hz. In all testing, the VRD was fixed and the calibrated motion of a robot arm was tracked. By moving the robot arm within a restricted volume, this real-time shared-aperture method of tracking was extended to six-DOF measurements. A future AR application of our shared-aperture tracking and display system will be highly accurate head tracking with the VRD helmet-mounted and worn within an enclosed space, such as an aircraft cockpit.
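The quoted latency bound follows directly from the refresh rate: one frame period at 60 Hz is 1/60 s, so a latency of at most one frame is at most about 16.67 ms.

```python
# One display frame period at 60 Hz, in milliseconds: 1000 / 60 ≈ 16.67 ms.
refresh_hz = 60
frame_period_ms = 1000.0 / refresh_hz
print(round(frame_period_ms, 2))  # 16.67
```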


Sensors ◽  
2009 ◽  
Vol 9 (11) ◽  
pp. 8924-8943 ◽  
Author(s):  
Fernando Caballero ◽  
Iván Maza ◽  
Roberto Molina ◽  
David Esteban ◽  
Aníbal Ollero

2001 ◽  
Vol 10 (1) ◽  
pp. 1-21 ◽  
Author(s):  
Greg Welch ◽  
Gary Bishop ◽  
Leandra Vicci ◽  
Stephen Brumback ◽  
Kurtis Keller ◽  
...  

Since the early 1980s, the Tracker Project at the University of North Carolina at Chapel Hill has been working on wide-area head tracking for virtual and augmented environments. Our long-term goal has been to achieve the high performance required for accurate visual simulation throughout our entire laboratory, beyond into the hallways, and eventually even outdoors. In this article, we present results and a complete description of our most recent electro-optical system, the HiBall Tracking System. In particular, we discuss motivation for the geometric configuration and describe the novel optical, mechanical, electronic, and algorithmic aspects that enable unprecedented speed, resolution, accuracy, robustness, and flexibility.


2014 ◽  
Vol 4 (2) ◽  
pp. 1
Author(s):  
Vitor Reus ◽  
Márcio Mello ◽  
Luciana Nedel ◽  
Anderson Maciel

Head-mounted displays (HMDs) allow a personal and immersive view of virtual environments and can be used with almost any desktop computer. Most HMDs have embedded inertial sensors for tracking the user's head rotations. These low-cost sensors are widely available and of high quality. However, even though they are very sensitive and precise, inertial sensors provide incremental information, which easily introduces errors into the system; the most significant is that head tracking suffers from drift. In this paper we present important limitations that still prevent the wide use of inertial sensors for tracking. For instance, to compensate for drift, users of HMD-based immersive VEs move away from their comfortable pose. We also propose a software solution to two problems: preventing drift in incremental sensors, and preventing the user from moving their body relative to another tracking system that uses absolute sensors (e.g., the MS Kinect). We analyze and evaluate our solutions experimentally, including user tests. Results show that our comfortable-pose function is effective at eliminating drift, and that it can be inverted and applied to keep the user from moving their body out of the absolute sensor's range. The efficiency and accuracy of this method make it suitable for a number of applications in immersive VR.
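The paper's comfortable-pose function is not specified in this abstract. A standard alternative for suppressing drift in incremental sensors, when an absolute reference (such as a Kinect skeleton estimate) is available, is a complementary filter that blends the dead-reckoned heading with the absolute one. The sketch below is illustrative only; the gain and rates are assumptions.

```python
# Complementary-filter sketch: fuse a drifting gyro-integrated yaw with an
# absolute yaw reference. ALPHA and the bias value are illustrative.

ALPHA = 0.98  # weight on the fast but drifting incremental estimate


def fuse_yaw(prev_yaw, gyro_rate, dt, absolute_yaw):
    """Blend integrated gyro yaw with an absolute yaw reference (degrees)."""
    incremental = prev_yaw + gyro_rate * dt      # dead reckoning, drifts
    return ALPHA * incremental + (1 - ALPHA) * absolute_yaw


# With a constant 0.5 deg/s gyro bias, pure integration over 10 s would
# drift by 5 degrees; the fused estimate settles near the absolute value.
yaw = 0.0
for _ in range(1000):
    yaw = fuse_yaw(yaw, gyro_rate=0.5, dt=0.01, absolute_yaw=0.0)
drifted = 0.5 * 0.01 * 1000  # unfused drift after the same interval: 5 deg
```

At steady state the fused yaw converges to ALPHA·(bias·dt)/(1 − ALPHA) ≈ 0.245 deg here, a bounded residual instead of unbounded drift.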


2019 ◽  
Author(s):  
Walter Vanzella ◽  
Natalia Grion ◽  
Daniele Bertolini ◽  
Andrea Perissinotto ◽  
Davide Zoccolan

Tracking the position and orientation of the head of small mammals is crucial in many behavioral neurophysiology studies. Yet full reconstruction of the head's pose in 3D is a challenging problem that typically requires implanting custom headsets made of multiple LEDs or inertial units. These assemblies need to be powered in order to operate, thus preventing wireless experiments, and, while suitable for studying navigation in large arenas, they are impractical in the narrow operant boxes employed in perceptual studies. Here we propose an alternative approach based on passively imaging a 3D-printed structure painted with a pattern of black dots over a white background. We show that this method is highly precise and accurate, and we demonstrate that, given its minimal weight and encumbrance, it can be used to study how rodents sample sensory stimuli during a perceptual discrimination task and how hippocampal place cells represent head position over extremely small spatial scales.
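Once the dots of such a printed pattern are localized in 3D, recovering the head's pose reduces to fitting a rigid transform between the pattern's reference geometry and the observed dot positions, which has a closed-form least-squares solution (the Kabsch/Procrustes method). The following is a minimal sketch with synthetic data; it is a generic reconstruction step, not the paper's actual pipeline, and all names are ours.

```python
# Closed-form rigid-transform fit (Kabsch) between a known dot pattern and
# its observed 3D positions: observed ≈ model @ R.T + t.
import numpy as np


def rigid_transform(model_pts, observed_pts):
    """Least-squares rotation R and translation t mapping model to observed."""
    mc, oc = model_pts.mean(axis=0), observed_pts.mean(axis=0)
    H = (model_pts - mc).T @ (observed_pts - oc)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))         # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, oc - R @ mc


# Synthetic check: rotate a 4-dot pattern by 30 deg about z and shift it.
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([5.0, -2.0, 1.0])
model = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
observed = model @ R_true.T + t_true
R_est, t_est = rigid_transform(model, observed)
```

With noise-free correspondences the estimate recovers the true rotation and translation exactly (up to floating-point error); with noisy dot detections the same fit gives the least-squares pose.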

