People Tracking Using a Time-of-Flight Depth Sensor

Author(s):  
Alessandro Bevilacqua ◽  
Luigi Di Stefano ◽  
Pietro Azzari

Sensors ◽  
2021 ◽  
Vol 21 (13) ◽  
pp. 4488
Author(s):  
Otto Korkalo ◽  
Tapio Takala

Depth cameras are widely used in people tracking applications. They typically suffer from significant range measurement noise, which introduces uncertainty into the resulting people detections. The data fusion, state estimation and data association tasks require that the measurement uncertainty be modelled, especially in multi-sensor systems. Measurement noise models for different kinds of depth sensors have been proposed; however, the existing approaches require manual calibration procedures that can be impractical to conduct in real-life scenarios. In this paper, we present a new measurement noise model for depth camera-based people tracking. In our tracking solution, we utilise the so-called plan-view approach, in which the 3D measurements are transformed to the floor plane and the tracking problem is solved in 2D. We model the measurement noise directly in the plan-view domain, combining the errors that originate from the imaging process with those introduced by the geometric transformations of the 3D data. We also present a method for deriving the noise models directly from observations. Together with our depth sensor network self-calibration routine, the approach allows fast and practical deployment of depth-based people tracking systems.
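As a rough illustration of the plan-view approach described above, the sketch below back-projects a ToF depth image, moves the points into a world frame, keeps only the floor-plane (x, y) coordinates, and summarises one detection by its centroid and 2D scatter. The intrinsics fx, fy, cx, cy, the extrinsic matrix T_world_cam, and the use of the sample covariance as a stand-in for the measurement noise R are assumptions made for this sketch, not the noise model proposed in the paper.

```python
# Minimal plan-view sketch (illustrative assumptions, not the paper's model).
import numpy as np

def depth_to_plan_view(depth, fx, fy, cx, cy, T_world_cam):
    """Back-project a ToF depth image (metres) with pinhole intrinsics, move the
    points to the world frame and keep only the floor-plane (x, y) coordinates."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts_cam = np.stack([x, y, depth, np.ones_like(depth)], axis=-1).reshape(-1, 4)
    pts_world = (T_world_cam @ pts_cam.T).T      # 4x4 homogeneous extrinsics
    valid = depth.reshape(-1) > 0                # drop missing-range pixels
    return pts_world[valid, :2]                  # plan-view coordinates in metres

def plan_view_detection(plan_pts):
    """Centroid and 2x2 scatter of the plan-view points of one detected person;
    the scatter can stand in for the measurement noise R of a 2D tracker."""
    mean = plan_pts.mean(axis=0)
    R = np.cov(plan_pts, rowvar=False)           # position-dependent covariance
    return mean, R
```

In a plan-view tracker, such a position-dependent covariance could feed the measurement update of a 2D Kalman filter, which is one way a modelled uncertainty can enter the data fusion, state estimation and data association steps mentioned in the abstract.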


Sensors ◽  
2020 ◽  
Vol 20 (4) ◽  
pp. 1156
Author(s):  
Eu-Tteum Baek ◽  
Hyung-Jeong Yang ◽  
Soo-Hyung Kim ◽  
Gueesang Lee ◽  
Hieyong Jeong

A distance map captured with a time-of-flight (ToF) depth sensor suffers from fundamental problems such as ambiguous depth information on shiny or dark surfaces, optical noise, and mismatched boundaries. Severe depth errors arise on shiny and dark surfaces owing to excess reflection and excess absorption of light, respectively. Dealing with this problem has been challenging because of the inherent hardware limitations of ToF sensing, which measures distance from the number of reflected photons. This study proposes a distance error correction method that uses three ToF sensors set to different integration times to resolve the ambiguity in depth information. First, the three ToF depth sensors are installed horizontally and capture distance maps at their respective integration times. The amplitude maps are then used to estimate error regions based on the amount of received light, and the estimated error regions are refined by exploiting accurate depth information from the neighboring depth sensors that use different integration times. Moreover, we propose a new optical noise reduction filter that accounts for depth distributions biased toward one side. Experimental results verified that the proposed method overcomes these drawbacks of ToF cameras and provides enhanced distance maps.
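To make the multi-integration-time idea concrete, the sketch below fuses three co-registered ToF depth maps per pixel: measurements whose amplitude is too low (under-exposed, dark surface) or too high (saturated, shiny surface) are flagged as unreliable, and the depth from the remaining sensor whose amplitude is closest to a target level is kept. The thresholds, the target amplitude and the nearest-to-target selection rule are assumptions for illustration; the paper's method additionally refines the flagged regions using the neighboring sensors and applies its own noise reduction filter.

```python
# Illustrative amplitude-guided fusion of three ToF captures; all thresholds
# are assumed values, not the ones used in the paper.
import numpy as np

def fuse_depth_maps(depths, amplitudes, amp_min=50.0, amp_max=4000.0, amp_target=1500.0):
    """depths, amplitudes: lists of three co-registered HxW arrays captured at
    short, medium and long integration times."""
    depths = np.stack(depths).astype(float)    # (3, H, W)
    amps = np.stack(amplitudes).astype(float)  # (3, H, W)
    # A measurement is unreliable if under-exposed (too little light) or saturated.
    valid = (amps > amp_min) & (amps < amp_max)
    # Prefer the integration time whose amplitude is closest to the target level;
    # invalid measurements get an infinite score so they are never selected.
    score = np.where(valid, np.abs(amps - amp_target), np.inf)
    best = np.argmin(score, axis=0)                        # (H, W) chosen sensor
    fused = np.take_along_axis(depths, best[None], axis=0)[0]
    fused[~valid.any(axis=0)] = np.nan                     # no reliable sensor here
    return fused
```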


2013 ◽  
Vol 48 (2) ◽  
pp. 559-572 ◽  
Author(s):  
Cristiano Niclass ◽  
Mineki Soga ◽  
Hiroyuki Matsubara ◽  
Satoru Kato ◽  
Manabu Kagami

2018 ◽  
Vol 15 (1) ◽  
pp. 172988141775369 ◽  
Author(s):  
Dong-Hyun Lee ◽  
Brian Coltin ◽  
Theodore Morse ◽  
In-Won Park ◽  
Lorenzo Flückiger ◽  
...  

We present a handrail detection and pose estimation algorithm for the free-flying Astrobee robots that will operate inside the International Space Station. Astrobee will be equipped with a single time-of-flight depth sensor and a compliant perching arm to grab the handrails of the International Space Station. Autonomous perching enables a free-flying robot to minimize power consumption by holding its position without using propulsion. Astrobee is a small robot with many competing demands on its computing, power, and volume resources; therefore, for perching, we were limited to a single compact sensor and a lightweight detection algorithm. Moreover, the handrails on the International Space Station are surrounded by various instruments and cables, and the lighting conditions change significantly depending on the light sources, time, and robot location. The proposed algorithm uses a time-of-flight depth sensor for handrail perception under varying lighting conditions and exploits the geometric characteristics of the handrails for robust detection and pose estimation. We demonstrate the robustness and accuracy of the algorithm in a variety of environmental scenarios.
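The geometric cue mentioned above, a rail mounted at a roughly fixed stand-off from a planar surface, can be illustrated with a simple sketch: fit the dominant plane with RANSAC, keep the points lying near the expected stand-off distance from it, and take their principal direction as the handrail axis. The stand-off value, the tolerances and the overall pipeline are assumptions for illustration, not the Astrobee flight algorithm.

```python
# Hedged geometry-based handrail sketch; distances and tolerances are assumed.
import numpy as np

def ransac_plane(points, iters=200, inlier_tol=0.01, seed=0):
    """Return (n, d) of the dominant plane n.x + d = 0 in an Nx3 point cloud."""
    rng = np.random.default_rng(seed)
    best_count, best_model = 0, None
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue                              # degenerate (collinear) sample
        n = n / norm
        d = -n @ sample[0]
        count = np.count_nonzero(np.abs(points @ n + d) < inlier_tol)
        if count > best_count:
            best_count, best_model = count, (n, d)
    return best_model

def detect_handrail(points, standoff=0.06, band=0.015):
    """Estimate a handrail axis (point on axis, unit direction) by keeping points
    at the expected offset from the mounting plane and taking their principal axis."""
    n, d = ransac_plane(points)
    dist = np.abs(points @ n + d)                 # distance of each point to the plane
    rail_pts = points[np.abs(dist - standoff) < band]
    centre = rail_pts.mean(axis=0)
    _, _, vt = np.linalg.svd(rail_pts - centre, full_matrices=False)
    return centre, vt[0]                          # first principal direction
```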

