Application of the extended Kalman filter to Lidar pose estimation

2021 ◽  
Author(s):  
Marcin Kuryllo


2020 ◽  
Author(s):  
Ayuko Saito ◽  
Satoru Kizawa ◽  
Yoshikazu Kobayashi ◽  
Kazuto Miyawaki

Abstract This paper presents an extended Kalman filter for pose estimation using noise covariance matrices based on sensor output. Compact and lightweight nine-axis motion sensors are used for motion analysis in a wide variety of fields, such as medical welfare and sports. A nine-axis motion sensor comprises a three-axis gyroscope, a three-axis accelerometer, and a three-axis magnetometer, and the information obtained from these three sensors is useful for estimating joint angles with a Kalman filter. The extended Kalman filter is widely used for state estimation because it can estimate the state with a small computational load. However, determining its process and observation noise covariance matrices is complicated. In this study, the noise covariance matrices of the extended Kalman filter were determined from the sensor output. Because human movement is produced by the rotational motion of the joints, postural changes appear in the gyroscope output; the process noise covariance matrix was therefore determined from the gyroscope output. The observation noise covariance matrix was determined from the accelerometer and magnetometer output, because these two sensors' outputs are used as the observation values. In a laboratory experiment, the lower-limb joint angles of three participants were measured while walking, using an optical 3D motion analysis system and nine-axis motion sensors. The lower-limb joint angles estimated by the extended Kalman filter with noise covariance matrices based on sensor output were generally consistent with the results obtained from the optical 3D motion analysis system. Furthermore, the lower-limb joint angles were measured with the nine-axis motion sensors while the participants ran in place for about 100 seconds. The experimental results demonstrate the effectiveness of the proposed method for human pose estimation.
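A minimal sketch (not the authors' code) of the idea the abstract describes, under the following assumptions: the state is a roll/pitch/yaw attitude, the process noise is scaled by the gyroscope magnitude, and the observation noise is inflated when the accelerometer deviates from 1 g or the magnetometer from its reference magnitude. All gains (Q0, Q_GYRO, R0_ACC, ...) and the horizontal magnetic reference are hypothetical placeholders.

```python
# Hedged sketch of an EKF with noise covariances derived from the sensor output.
import numpy as np

G_REF = np.array([0.0, 0.0, 1.0])   # direction a static accelerometer measures (z up), world frame
M_REF = np.array([1.0, 0.0, 0.0])   # assumed horizontal magnetic reference direction
Q0, Q_GYRO = 1e-6, 1e-4             # hypothetical process-noise gains
R0_ACC, R_ACC = 1e-2, 1.0           # hypothetical accelerometer-noise gains
R0_MAG, R_MAG = 1e-2, 1.0           # hypothetical magnetometer-noise gains

def rot_world_to_body(rpy):
    """Rotation from world to body frame for roll/pitch/yaw (ZYX convention)."""
    r, p, y = rpy
    cr, sr, cp, sp, cy, sy = np.cos(r), np.sin(r), np.cos(p), np.sin(p), np.cos(y), np.sin(y)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return (Rz @ Ry @ Rx).T

def f(x, gyro, dt):
    """Propagate Euler angles with the measured body rates over dt."""
    r, p, _ = x
    T = np.array([[1, np.sin(r) * np.tan(p), np.cos(r) * np.tan(p)],
                  [0, np.cos(r),            -np.sin(r)],
                  [0, np.sin(r) / np.cos(p), np.cos(r) / np.cos(p)]])
    return x + T @ gyro * dt

def h(x):
    """Predicted accelerometer and magnetometer directions in the body frame."""
    Rwb = rot_world_to_body(x)
    return np.concatenate([Rwb @ G_REF, Rwb @ M_REF])

def jacobian(fun, x, eps=1e-6):
    """Numerical Jacobian, kept simple for the sketch."""
    y0 = fun(x)
    J = np.zeros((y0.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x); dx[i] = eps
        J[:, i] = (fun(x + dx) - y0) / eps
    return J

def ekf_step(x, P, gyro, acc, mag, dt):
    # Process noise from the gyroscope output: more rotation, more uncertainty.
    Q = (Q0 + Q_GYRO * np.linalg.norm(gyro) ** 2) * np.eye(3) * dt
    F = jacobian(lambda s: f(s, gyro, dt), x)
    x_pred = f(x, gyro, dt)
    P_pred = F @ P @ F.T + Q
    # Observation noise from the accelerometer/magnetometer output: inflate the
    # variance when the measured norm deviates from the expected magnitude
    # (acc in units of g, mag in units of the local reference magnitude).
    r_acc = R0_ACC + R_ACC * abs(np.linalg.norm(acc) - 1.0)
    r_mag = R0_MAG + R_MAG * abs(np.linalg.norm(mag) - 1.0)
    R = np.diag([r_acc] * 3 + [r_mag] * 3)
    # Update with the normalised accelerometer and magnetometer directions.
    z = np.concatenate([acc / np.linalg.norm(acc), mag / np.linalg.norm(mag)])
    H = jacobian(h, x_pred)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new
```

Scaling Q with the squared gyroscope norm is one plausible reading of "process noise covariance based on the gyroscope output"; the paper may use a different functional form.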


2019 ◽  
Vol 158 ◽  
pp. 55-67 ◽  
Author(s):  
Francesco Cavenago ◽  
Pierluigi Di Lizia ◽  
Mauro Massari ◽  
Alexander Wittig

2015 ◽  
Vol 38 (9) ◽  
pp. 1625-1641 ◽  
Author(s):  
Nuno Filipe ◽  
Michail Kontitsis ◽  
Panagiotis Tsiotras

Sensors ◽  
2021 ◽  
Vol 21 (23) ◽  
pp. 7840
Author(s):  
Fabien Colonnier ◽  
Luca Della Vedova ◽  
Garrick Orchard

Event-based vision sensors show great promise for embedded applications requiring low-latency passive sensing at a low computational cost. In this paper, we present an event-based algorithm that relies on an Extended Kalman Filter for 6-degree-of-freedom (6-DoF) sensor pose estimation. The algorithm updates the sensor pose event-by-event with low latency (a worst case of less than 2 μs on an FPGA). Using a single handheld sensor, we test the algorithm on multiple recordings, ranging from a high-contrast printed planar scene to a more natural scene consisting of objects viewed from above. The pose is accurately estimated under rapid motions, up to 2.7 m/s. We then describe and test an extension to multiple sensors, highlighting the improved performance of such a setup, as well as the integration with an off-the-shelf mapping algorithm to allow point-cloud updates of the 3D scene and enhance the potential applications of this visual odometry solution.
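A minimal sketch (not the paper's implementation) of the event-by-event EKF idea: the 6-DoF pose is kept as a translation plus a rotation vector, the covariance is inflated by random-walk process noise for the time elapsed since the previous event, and each event contributes a single 2-pixel measurement, namely the reprojection of the 3D map point it is associated with. The camera intrinsics, the data association, and all noise values are hypothetical placeholders.

```python
# Hedged sketch of an event-by-event EKF update for 6-DoF pose estimation.
import numpy as np
from scipy.spatial.transform import Rotation

FX, FY, CX, CY = 200.0, 200.0, 120.0, 90.0     # hypothetical pinhole intrinsics
Q_RATE = np.diag([1e-4] * 3 + [1e-3] * 3)      # process noise per second (random walk)
R_PIX = np.diag([1.0, 1.0])                    # pixel measurement noise

def project(x, pw):
    """Pinhole projection of world point pw under pose x = [t(3), rotvec(3)]."""
    R_wc = Rotation.from_rotvec(x[3:]).as_matrix()   # camera-to-world rotation
    pc = R_wc.T @ (pw - x[:3])                       # point in the camera frame
    return np.array([FX * pc[0] / pc[2] + CX, FY * pc[1] / pc[2] + CY])

def numerical_jacobian(fun, x, eps=1e-6):
    y0 = fun(x)
    J = np.zeros((y0.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x); dx[i] = eps
        J[:, i] = (fun(x + dx) - y0) / eps
    return J

def ekf_event_update(x, P, t_last, event_uv, event_t, map_point):
    """One predict/update cycle triggered by a single event."""
    # Predict: the pose is modelled as a random walk, so only the covariance
    # grows with the time elapsed since the previous event.
    dt = max(event_t - t_last, 0.0)
    P = P + Q_RATE * dt
    # Update: innovation between the event pixel and the reprojection of the
    # map point that a (hypothetical) data-association step attached to it.
    H = numerical_jacobian(lambda s: project(s, map_point), x)
    z_pred = project(x, map_point)
    S = H @ P @ H.T + R_PIX
    K = P @ H.T @ np.linalg.inv(S)
    # Adding the correction to the rotation vector is a small-angle simplification.
    x = x + K @ (np.asarray(event_uv) - z_pred)
    P = (np.eye(6) - K @ H) @ P
    return x, P, event_t

# Example: feed events one by one, keeping the per-event work small.
x, P, t_last = np.zeros(6), np.eye(6) * 1e-2, 0.0
events = [((125.0, 95.0), 0.001, np.array([0.1, 0.05, 1.0]))]   # (pixel, time, 3D point)
for uv, t, pw in events:
    x, P, t_last = ekf_event_update(x, P, t_last, uv, t, pw)
```

Because each update involves only a 2x2 innovation covariance, the per-event cost stays small, which is the property that makes the event-by-event formulation attractive for low-latency embedded use.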

