Pathfinding Decision-Making Using Proximity Sensors, Depth Camera and Active IR Marker Tracking Data Fusion for Human Following Companion Robot

Author(s):
Mark Tee Kit Tsun,
Lau Bee Theng,
Hudyjaya Siswoyo Jo

Sensors, 2020, Vol. 20 (10), pp. 2759
Author(s):
Lukas Wöhle,
Marion Gebhard

This paper presents the use of eye tracking data in Magnetic, Angular Rate, and Gravity (MARG)-sensor-based head orientation estimation. The approach presented here can be deployed in any motion measurement setup that includes MARG and eye tracking sensors (e.g., rehabilitation robotics or medical diagnostics). The challenge in these mostly indoor applications is the presence of magnetic field disturbances at the location of the MARG sensor. In this work, eye tracking data (visual fixations) are used to enable zero-orientation-change updates in the MARG-sensor data fusion chain. The approach is based on a MARG-sensor data fusion filter, an online visual fixation detection algorithm, and a dynamic angular rate threshold estimation for low-latency and adaptive head motion noise parameterization. In this work, an adaptation of Madgwick's gradient descent filter is used for MARG-sensor data fusion, but the approach could be combined with any other data fusion process. The presented approach does not rely on additional stationary or local environmental references and is therefore self-contained. The proposed system is benchmarked against a Qualisys motion capture system, a gold standard in human motion analysis, showing that heading accuracy of the MARG-sensor data fusion improves by up to a factor of 0.5 while magnetic disturbance is present.
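The core idea of the fusion chain described above, gating heading updates with visual fixations and an adaptive angular rate threshold, can be illustrated with a minimal yaw-only sketch. This is not the authors' filter (they adapt Madgwick's full gradient descent filter); the function names, the threshold rule (mean plus a multiple of the standard deviation of recent gyro magnitudes), and all parameters here are illustrative assumptions.

```python
import math

def dynamic_threshold(recent_rates, k=1.5):
    """Hypothetical adaptive angular-rate threshold: mean + k * std
    of recent gyroscope magnitudes (rad/s), tracking head-motion noise."""
    n = len(recent_rates)
    mean = sum(recent_rates) / n
    var = sum((r - mean) ** 2 for r in recent_rates) / n
    return mean + k * math.sqrt(var)

def fuse_heading(heading, gyro_z, dt, fixation, threshold):
    """Zero-orientation-change update (illustrative): during a visual
    fixation with angular rate below the threshold, hold the heading
    instead of integrating the (possibly biased/disturbed) gyro signal."""
    if fixation and abs(gyro_z) < threshold:
        return heading  # zero-orientation-change update
    return heading + gyro_z * dt  # plain gyro integration otherwise
```

In a real deployment this gating step would sit inside the full MARG fusion filter, suppressing drift that magnetic disturbances would otherwise introduce into the heading estimate.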


