Tightly-Coupled Stereo Visual-Inertial Navigation Using Point and Line Features

Sensors ◽  
2015 ◽  
Vol 15 (6) ◽  
pp. 12816-12833 ◽  
Author(s):  
Xianglong Kong ◽  
Wenqi Wu ◽  
Lilian Zhang ◽  
Yujie Wang

2017 ◽  
Vol 70 (5) ◽  
pp. 1079-1097 ◽  
Author(s):  
Qigao Fan ◽  
Biwen Sun ◽  
Yan Sun ◽  
Yaheng Wu ◽  
Xiangpeng Zhuang

This paper proposes a novel sensor fusion approach using Ultra Wide Band (UWB) wireless radio and an Inertial Navigation System (INS), which aims to reduce the accumulated error of the low-cost Micro-Electro-Mechanical Systems (MEMS) inertial navigation systems used for real-time navigation and tracking of mobile robots in enclosed environments. A tightly-coupled INS/UWB model is established within the integrated positioning system. A two-dimensional kinematic model of the mobile robot is then established from kinematics analysis, and an Auto-Regressive (AR) algorithm is used to build third-order error equations for the gyroscope and the accelerometer. An Improved Adaptive Kalman Filter (IAKF) algorithm is proposed: an innovation-orthogonality test is used to identify outliers, and a covariance-matching technique is introduced to judge the filter state. Simulation results show that the IAKF algorithm achieves higher positioning accuracy than both the standard Kalman filter and the UWB system alone. Finally, static and dynamic experiments are performed on an indoor experimental platform. The results show that the INS/UWB integrated navigation system achieves a positioning accuracy within 0·24 m, which meets practical requirements and is superior to either independent subsystem.
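The innovation-based outlier rejection at the heart of an adaptive Kalman filter can be illustrated with a minimal measurement update in which a Mahalanobis-distance gate flags suspect measurements and de-weights them. The gate value, inflation rule, and 1-D toy model below are illustrative assumptions, not the paper's exact IAKF design:

```python
import numpy as np

def kf_update_gated(x, P, z, H, R, gate=9.0):
    """One Kalman measurement update with an innovation (Mahalanobis) gate.

    Measurements whose normalised innovation exceeds `gate` are treated as
    outliers: R is inflated so they barely influence the estimate. The gate
    value and inflation rule are illustrative, not the paper's IAKF tuning.
    """
    nu = z - H @ x                             # innovation
    S = H @ P @ H.T + R                        # innovation covariance
    d2 = float(nu.T @ np.linalg.inv(S) @ nu)   # squared Mahalanobis distance
    if d2 > gate:                              # suspect measurement ("outlier")
        R = R * (d2 / gate)                    # de-weight it
        S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x = x + K @ nu
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Toy 1-D example: a static target at 5.0 with one gross outlier in the stream.
x, P = np.array([[0.0]]), np.array([[10.0]])
H, R = np.array([[1.0]]), np.array([[0.5]])
for z in [5.1, 4.9, 50.0, 5.0, 5.05]:          # 50.0 is the outlier
    x, P = kf_update_gated(x, P, np.array([[z]]), H, R)
```

With the gate active, the 50.0 measurement barely moves the estimate; without it, a single outlier of that size would drag the state far from the true value.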


Sensors ◽  
2019 ◽  
Vol 19 (15) ◽  
pp. 3418 ◽  
Author(s):  
Junxiang Jiang ◽  
Xiaoji Niu ◽  
Ruonan Guo ◽  
Jingnan Liu

The fusion of visual and inertial measurements for motion tracking has become prevalent in the robotics community due to the sensors' complementary characteristics, low cost, and small footprint. This fusion task is known as the vision-aided inertial navigation system problem. We present a novel hybrid sliding-window optimizer that achieves information fusion for a tightly-coupled vision-aided inertial navigation system, combining the advantages of the conditioning-based and prior-based methods. A novel distributed marginalization method was also designed based on the multi-state constraints method, with a significant efficiency improvement over the traditional approach. The performance of the proposed algorithm was evaluated on the publicly available EuRoC datasets and showed competitive results compared with existing algorithms.
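Marginalization in a sliding-window optimizer is conventionally a Schur complement on the linearized Gauss-Newton system: old states are eliminated, leaving a dense prior on the states that stay in the window. A minimal sketch of that standard single-step construction (the 3-state system is a toy example; the paper's distributed variant reorganises this same computation around multi-state constraints):

```python
import numpy as np

def marginalize(H, b, m):
    """Marginalize the first `m` states out of a linearized system H x = b.

    H and b are the information (Hessian) matrix and vector of a
    least-squares problem, ordered so the states to drop come first.
    Returns the Schur-complement prior on the remaining states.
    """
    Hmm, Hmr = H[:m, :m], H[:m, m:]
    Hrm, Hrr = H[m:, :m], H[m:, m:]
    bm, br = b[:m], b[m:]
    Hmm_inv = np.linalg.inv(Hmm)
    H_prior = Hrr - Hrm @ Hmm_inv @ Hmr   # Schur complement
    b_prior = br - Hrm @ Hmm_inv @ bm
    return H_prior, b_prior

# Toy 3-state system; marginalize the first (oldest) state.
H = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])
H_prior, b_prior = marginalize(H, b, 1)
```

The defining property is that solving the reduced system gives exactly the same estimate for the remaining states as solving the full system, which is why marginalization preserves information while shrinking the window.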


2013 ◽  
Vol 46 (30) ◽  
pp. 251-256 ◽  
Author(s):  
Simon Lynen ◽  
Sammy Omari ◽  
Matthias Wüest ◽  
Markus Achtelik ◽  
Roland Siegwart

2019 ◽  
Vol 27 (3) ◽  
pp. 1084-1099 ◽  
Author(s):  
Jakob M. Hansen ◽  
Tor Arne Johansen ◽  
Nadezda Sokolova ◽  
Thor I. Fossen

2008 ◽  
Vol 41 (2) ◽  
pp. 15973-15978 ◽  
Author(s):  
M. Morgado ◽  
P. Oliveira ◽  
C. Silvestre ◽  
J.F. Vasconcelos

2018 ◽  
Vol 71 (6) ◽  
pp. 1312-1328 ◽  
Author(s):  
Qiangwen Fu ◽  
Yang Liu ◽  
Zhenbo Liu ◽  
Sihai Li ◽  
Bofan Guan

This paper describes a fully autonomous real-time in-motion alignment algorithm for Strapdown Inertial Navigation Systems (SINS) in land vehicle applications. Once the initial position is available, the vehicle can start a mission immediately, with accurate attitude, position and velocity determined within ten minutes. This is achieved by two tightly coupled stages: real-time Double-vector Attitude Determination Coarse Alignment (DADCA) and Backtracking Fine Alignment (BFA). In the DADCA process, the vehicle motion is omitted to roughly estimate the attitude at the very start of the alignment; meanwhile, attitude quaternions and velocity increments are extracted and recorded. The BFA process utilises the stored data and exploits the Non-Holonomic Constraints (NHC) of a vehicle to obtain virtual velocity measurements. A linear SINS/NHC Kalman filter with mounting angles as extended states is constructed to improve the fine alignment accuracy. The method is verified by three vehicle tests, which show that the alignment azimuth accuracy is 0·0358° (Root Mean Square, RMS) and the positioning accuracy is about 15 m (RMS) at the end of the alignment.
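Double-vector attitude determination, the idea behind the DADCA coarse stage, can be sketched with the classic TRIAD construction: two non-parallel vectors observed in both the body and navigation frames pin down the rotation between them. The vector pairs below are illustrative stand-ins, not the paper's exact DADCA observables:

```python
import numpy as np

def triad(v1_b, v2_b, v1_n, v2_n):
    """TRIAD double-vector attitude determination.

    Builds the rotation matrix C_nb (body to navigation) from two vector
    pairs seen in the body (b) and navigation (n) frames. In coarse SINS
    alignment the pairs are typically gravity/specific-force and
    Earth-rotation related vectors.
    """
    def frame(v1, v2):
        t1 = v1 / np.linalg.norm(v1)       # first axis along v1
        t2 = np.cross(v1, v2)              # second axis normal to both
        t2 /= np.linalg.norm(t2)
        t3 = np.cross(t1, t2)              # completes the orthonormal triad
        return np.column_stack((t1, t2, t3))
    return frame(v1_n, v2_n) @ frame(v1_b, v2_b).T

# Example: body frame yawed 30 degrees relative to the navigation frame.
a = np.deg2rad(30.0)
C_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                   [np.sin(a),  np.cos(a), 0.0],
                   [0.0,        0.0,       1.0]])
v1_n = np.array([0.0, 0.0, -9.81])         # e.g. gravity in the n-frame
v2_n = np.array([1.0, 0.0, 1.0])           # second, non-parallel observable
v1_b = C_true.T @ v1_n                     # the same vectors seen in the b-frame
v2_b = C_true.T @ v2_n
C_est = triad(v1_b, v2_b, v1_n, v2_n)      # recovers C_true
```

TRIAD needs no iteration or initial guess, which is why double-vector constructions suit the very start of alignment, before a Kalman filter such as the SINS/NHC fine-alignment stage can be initialised.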

