Robust Tightly Coupled Pose Measurement Based on Multi-Sensor Fusion in Mobile Robot System

Sensors ◽  
2021 ◽  
Vol 21 (16) ◽  
pp. 5522
Author(s):  
Gang Peng ◽  
Zezao Lu ◽  
Jiaxi Peng ◽  
Dingxin He ◽  
Xinde Li ◽  
...  

Currently, simultaneous localization and mapping (SLAM) is one of the main research topics in the robotics field. Visual-inertial SLAM, which combines a camera with an inertial measurement unit (IMU), can significantly improve robustness and renders the scale weakly observable, whereas in monocular visual SLAM the scale is unobservable. For ground mobile robots, introducing a wheel speed sensor resolves this weak scale observability and improves robustness under abnormal conditions. In this paper, a multi-sensor fusion SLAM algorithm using monocular vision, inertial, and wheel speed measurements is proposed. The sensor measurements are combined in a tightly coupled manner, and a nonlinear optimization method is used to maximize the posterior probability and solve for the optimal state estimate. Loop detection and back-end optimization are added to reduce or even eliminate the cumulative error of the estimated poses, ensuring global consistency of the trajectory and map. The outstanding contributions of this paper are that the wheel odometer pre-integration algorithm, which combines the chassis speed and IMU angular velocity, avoids the repeated integration caused by linearization point changes during iterative optimization, and that state initialization based on the wheel odometer and IMU enables quick and reliable computation of the initial state values required by the state estimator in both stationary and moving states. Comparative experiments were conducted in room-scale scenes, building-scale scenes, and visual loss scenarios. The results showed that the proposed algorithm is highly accurate, with 2.2 m of cumulative error after moving 812 m (0.28%, loop closure optimization disabled), robust, and capable of effective localization even in the event of sensor loss, including visual loss. The accuracy and robustness of the proposed method are superior to those of monocular visual-inertial SLAM and traditional wheel odometry.
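The pre-integration idea this abstract highlights can be sketched as follows: the relative pose increment between two keyframes is accumulated in the frame of the first keyframe, so it remains valid when the global linearization point changes during iterative optimization. A minimal planar sketch, where the function names and the 2D ground-robot model are illustrative, not from the paper:

```python
import numpy as np

def yaw_rotation(theta):
    """2D rotation matrix for a planar (ground-robot) motion model."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def preintegrate_wheel_odometry(wheel_speeds, gyro_yaw_rates, dt):
    """Accumulate the relative pose delta between two keyframes,
    expressed in the frame of the first keyframe.  Because the delta is
    local, it does not depend on the global pose estimate, so the raw
    measurements need not be re-integrated when that estimate changes."""
    delta_theta = 0.0
    delta_p = np.zeros(2)
    for v, omega in zip(wheel_speeds, gyro_yaw_rates):
        # chassis forward speed rotated by the yaw integrated so far
        delta_p += yaw_rotation(delta_theta) @ np.array([v, 0.0]) * dt
        delta_theta += omega * dt
    return delta_p, delta_theta
```

Because `delta_p` and `delta_theta` are relative to the first keyframe, re-linearizing that keyframe's global pose only requires re-composing the delta, not re-integrating the raw wheel and gyro samples.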

Author(s):  
Mohamed Atia

The art of multi-sensor processing, or “sensor fusion,” is the ability to optimally infer state information from multiple noisy streams of data. One major application area where sensor fusion is commonly used is navigation technology. While global navigation satellite systems (GNSS) can provide centimeter-level location accuracy worldwide, they suffer from signal availability problems in dense urban environments and hardly work indoors. Several alternative backups have been proposed, but so far no single sensor or technology can provide the desired precise localization in such environments at reasonable cost and with affordable infrastructure. Therefore, to navigate through these complex areas, combining sensors is beneficial. Common sensors used to augment or replace GNSS in complex environments include the inertial measurement unit (IMU), range sensors, and vision sensors. This chapter discusses the design and implementation of a tightly coupled sensor fusion of GNSS, IMU, and light detection and ranging (LiDAR) measurements to navigate complex urban and indoor environments.
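In a tightly coupled design like the one described, each raw measurement (a single GNSS pseudorange or LiDAR range) corrects the shared state directly, typically through an extended Kalman filter, rather than fusing pre-computed position fixes. A generic EKF measurement-update sketch, where the linear 1D example is illustrative and not from the chapter:

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """One tightly coupled measurement update: a single raw measurement z
    with model h(x), Jacobian H, and noise covariance R corrects the
    shared state x and its covariance P."""
    y = z - h(x)                      # innovation
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x = x + K @ y
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```

In the tightly coupled GNSS case, `h(x)` would be the predicted pseudorange from the estimated position to one satellite; for LiDAR, the predicted range to a mapped surface.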


Sensors ◽  
2019 ◽  
Vol 19 (9) ◽  
pp. 2004 ◽  
Author(s):  
Linlin Xia ◽  
Qingyu Meng ◽  
Deru Chi ◽  
Bo Meng ◽  
Hanrui Yang

The development and maturation of simultaneous localization and mapping (SLAM) in robotics opens the door to applying visual-inertial odometry (VIO) in robot navigation systems. For a patrol robot with no available Global Positioning System (GPS) support, the embedded VIO components, generally composed of an Inertial Measurement Unit (IMU) and a camera, fuse inertial recursion with SLAM calculation tasks and enable the robot to estimate its location within a map. The highlights of the optimized VIO design lie in the simplified VIO initialization strategy as well as a fused point- and line-feature matching method for efficient pose estimation in the front-end. With a tightly coupled VIO anatomy, the system state is explicitly expressed as a vector and estimated by the state estimator. The associated back-end problems of data association, state optimization, sliding windows, and timestamp alignment are discussed in detail. Dataset tests and real substation scene tests were conducted, and the experimental results indicate that the proposed VIO realizes accurate pose estimation with favorable initialization efficiency and the expected map representations in the environments of concern. The proposed VIO design can therefore serve as a reference tool for the class of visual and inertial SLAM application domains that operate without external location reference support.


2021 ◽  
Vol 33 (1) ◽  
pp. 33-43
Author(s):  
Kazuhiro Funato ◽  
Ryosuke Tasaki ◽  
Hiroto Sakurai ◽  
Kazuhiko Terashima ◽  
...  

The authors have been developing a mobile robot to assist doctors in hospitals in managing medical tools and patient electronic medical records. The robot tracks behind a mobile medical worker while maintaining a constant distance from the worker. However, it is difficult to detect the target in the region invisible to the sensor, a problem known as occlusion. In this study, we propose a sensor fusion method that estimates the position of the robot's tracking target indirectly with an inertial measurement unit (IMU), in addition to the direct measurement by a laser range finder (LRF), and develop a human tracking system that allows the mobile robot to cope with occlusion. We then perform detailed experimental verification of tracking a specified person to validate the proposed method.
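The fallback described above can be sketched as a simple predictor: when the LRF loses the target to occlusion, an IMU-based dead-reckoned prediction bridges the gap. The function name, blending weight, and complementary blend below are illustrative assumptions, not the authors' exact estimator:

```python
import numpy as np

def track_target(lrf_position, target_velocity, last_estimate, dt, alpha=0.8):
    """Fuse a direct LRF position fix with a dead-reckoned prediction
    driven by the IMU-derived target velocity.  When the LRF measurement
    is lost (occlusion), fall back to the prediction alone."""
    predicted = np.asarray(last_estimate) + np.asarray(target_velocity) * dt
    if lrf_position is None:          # target occluded: indirect estimate only
        return predicted
    # simple complementary blend when both sources are available
    return alpha * np.asarray(lrf_position) + (1 - alpha) * predicted
```

The key property is continuity: the estimate degrades gracefully during occlusion instead of dropping out, so the follower can keep its constant-distance behavior.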


Sensors ◽  
2019 ◽  
Vol 19 (22) ◽  
pp. 4912 ◽  
Author(s):  
Wei Liu ◽  
Dan Song ◽  
Zhipeng Wang ◽  
Kun Fang

Considering the risk of inertial measurement unit (IMU) faults in an unmanned aerial vehicle (UAV), this paper analyzes the error overbounds of position estimation in a tightly coupled IMU/global navigation satellite system (GNSS) integrated architecture under IMU fault conditions using an error-state EKF-based approach, and compares it to a recently published EKF-based full-state method. Simulation results show that the error overbounds of both the error-state and full-state EKFs can bound the state error under IMU faults, but the error-state EKF is more suitable for UAV navigation system integrity assurance due to its higher computational efficiency. This study will be extended to the integrity monitoring of multi-sensor systems.
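An overbound of the kind evaluated here is typically a multiple of the filter's reported per-axis standard deviation; the state error is overbounded when it stays inside that envelope. A minimal sketch, where the inflation factor `k` and the assumption that position occupies the first three state slots are illustrative, not from the paper:

```python
import numpy as np

def position_overbound(P, k=5.33):
    """Protection-level style overbound on each position axis from the
    filter covariance: k * sigma, with k set by the target integrity
    risk (the value here is illustrative)."""
    sigmas = np.sqrt(np.diag(P)[:3])   # position variances assumed first
    return k * sigmas

def error_is_overbounded(x_true, x_est, P, k=5.33):
    """True when the actual position error lies inside the overbound."""
    return bool(np.all(np.abs(x_true[:3] - x_est[:3]) <= position_overbound(P, k)))
```

A fault case is detected in such an analysis when the true error escapes the envelope, i.e. `error_is_overbounded` returns False while the filter still reports a small covariance.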


Sensors ◽  
2019 ◽  
Vol 19 (7) ◽  
pp. 1584 ◽  
Author(s):  
Yushan Li ◽  
Wenbo Zhang ◽  
Xuewu Ji ◽  
Chuanxiang Ren ◽  
Jian Wu

The lane curvature output by the vision sensor can jump over short periods of time owing to shadows, lighting changes, and broken lane lines, which causes serious problems for unmanned driving control. It is therefore particularly important to predict or compensate for the real lane in real time while the sensor output jumps. This paper presents a lane compensation method based on multi-sensor fusion of global positioning system (GPS), inertial measurement unit (IMU) and vision sensors. To compensate for the lane, a cubic polynomial in the longitudinal distance is selected as the lane model. In this method, a Kalman filter estimates vehicle velocity and yaw angle from GPS and IMU measurements, and a vehicle kinematics model describes the vehicle motion. The geometric relationship between the vehicle and the relative lane motion at the current moment is used to solve for the coefficients of the lane polynomial at the next moment. Simulation and vehicle test results show that the predicted information can compensate for failures of the vision sensor with good real-time performance, robustness and accuracy.
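The compensation step can be approximated by pushing the current lane polynomial through the motion increment the Kalman filter provides (forward travel and heading change) and refitting. This sample-and-refit sketch is an illustrative stand-in for the paper's geometric coefficient update, not the method itself:

```python
import numpy as np

def compensate_lane(coeffs, v, yaw_rate, dt, x_range=(0.0, 50.0)):
    """Predict the next-frame cubic lane polynomial from vehicle motion.
    Sample the current lane y = c0 + c1*x + c2*x^2 + c3*x^3 in the
    vehicle frame, transform the points into the vehicle frame after the
    motion increment (forward travel v*dt, heading change yaw_rate*dt),
    then refit a cubic."""
    xs = np.linspace(*x_range, 50)
    ys = np.polyval(coeffs[::-1], xs)          # coeffs = [c0, c1, c2, c3]
    d, dpsi = v * dt, yaw_rate * dt
    # rigid transform of lane points into the new vehicle frame
    c, s = np.cos(dpsi), np.sin(dpsi)
    xn = c * (xs - d) + s * ys
    yn = -s * (xs - d) + c * ys
    return np.polyfit(xn, yn, 3)[::-1]         # back to [c0, c1, c2, c3]
```

With `v` and `yaw_rate` coming from the GPS/IMU Kalman filter, the refit coefficients stand in for the vision output whenever the camera's lane estimate jumps.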


Author(s):  
Chang Chen ◽  
Hua Zhu

Purpose
This study aims to present a visual-inertial simultaneous localization and mapping (SLAM) method for accurate positioning and navigation of mobile robots when the global positioning system (GPS) signal fails among buildings, trees and other obstacles.

Design/methodology/approach
In this framework, a feature extraction method distributes features evenly across the image in texture-less scenes. The constant-luminosity assumption is relaxed, and features are tracked by optical flow to enhance the stability of the system. The camera data and inertial measurement unit data are tightly coupled to estimate the pose by nonlinear optimization.

Findings
The method is successfully performed on the mobile robot and steadily extracts and tracks features in low-texture environments. The end-to-end error is 1.375 m with respect to the total length of 762 m. The authors achieve better relative pose error, scale and CPU load than ORB-SLAM2 on the EuRoC data sets.

Originality/value
The main contribution of this study is the theoretical derivation and experimental application of a new visual-inertial SLAM method that has excellent accuracy and stability in weak-texture scenes.


Sensors ◽  
2020 ◽  
Vol 20 (3) ◽  
pp. 590 ◽  
Author(s):  
Shizhuang Wang ◽  
Xingqun Zhan ◽  
Yawei Zhai ◽  
Baoyu Liu

To ensure navigation integrity for safety-critical applications, this paper proposes an efficient Fault Detection and Exclusion (FDE) scheme for tightly coupled navigation system of Global Navigation Satellite Systems (GNSS) and Inertial Navigation System (INS). Special emphasis is placed on the potential faults in the Kalman Filter state prediction step (defined as “filter fault”), which could be caused by the undetected faults occurring previously or the Inertial Measurement Unit (IMU) failures. The integration model is derived first to capture the features and impacts of GNSS faults and filter fault. To accommodate various fault conditions, two independent detectors, which are respectively designated for GNSS fault and filter fault, are rigorously established based on hypothesis-test methods. Following a detection event, the newly-designed exclusion function enables (a) identifying and removing the faulty measurements and (b) eliminating the effect of filter fault through filter recovery. Moreover, we also attempt to avoid wrong exclusion events by analyzing the underlying causes and optimizing the decision strategy for GNSS fault exclusion accordingly. The FDE scheme is validated through multiple simulations, where high efficiency and effectiveness have been achieved in various fault scenarios.
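GNSS fault detectors of the kind described are commonly hypothesis tests on the normalized Kalman innovation, flagged against a chi-square threshold set by the false-alarm budget. A minimal sketch (the threshold handling is illustrative; the paper's two detectors and its exclusion logic are more elaborate):

```python
import numpy as np

def gnss_fault_detected(y, S, threshold):
    """Chi-square test on the normalized innovation y with innovation
    covariance S.  A fault is declared when the statistic exceeds a
    threshold chosen for the false-alarm budget, e.g.
    chi2.ppf(1 - P_fa, df=len(y)) from scipy.stats."""
    stat = float(y @ np.linalg.solve(S, y))
    return stat > threshold, stat
```

Following a detection, an exclusion scheme like the paper's would drop the offending measurement and, for a filter fault, roll the filter back to a trusted state before continuing.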


Sensors ◽  
2018 ◽  
Vol 19 (1) ◽  
pp. 46 ◽  
Author(s):  
N. Koksal ◽  
M. Jalalmaab ◽  
B. Fidan

In this paper, an infinite-horizon adaptive linear quadratic tracking (ALQT) control scheme is designed for optimal attitude tracking of a quadrotor unmanned aerial vehicle (UAV). The proposed control scheme is experimentally validated in the presence of real-world uncertainties in the quadrotor system parameters and sensor measurements. The designed control scheme guarantees asymptotic stability of the closed-loop system, relying on the complete controllability of the attitude dynamics in applying the optimal control signals. To achieve robustness against parametric uncertainties, the optimal tracking solution is combined with an online least-squares-based parameter identification scheme that estimates the instantaneous inertia of the quadrotor. Measurement noise of the on-board Inertial Measurement Unit (IMU) sensors is also taken into account. To improve controller performance in the presence of this noise, two sensor fusion techniques are employed, one based on Kalman filtering and the other on complementary filtering. The ALQT controller performance is compared for the two sensor fusion techniques, and it is concluded that the Kalman filter based approach provides lower mean-square estimation error, better attitude estimation, and better attitude control performance.
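The complementary filter mentioned as the second fusion option is a one-line recursion: integrate the gyro at high frequency and correct the drift with the accelerometer-derived tilt angle at low frequency. A minimal single-axis sketch, where the gain `alpha` is an illustrative assumption:

```python
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """One step of a single-axis complementary filter: the gyro term
    tracks fast motion, while the accelerometer tilt angle slowly pulls
    the estimate back and cancels gyro drift."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle
```

Its appeal over a Kalman filter is the negligible computational cost and the single tuning gain; the paper's comparison found the Kalman filter nonetheless gave lower mean-square estimation error.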

