Optical flow and inertial navigation system fusion in the UAV navigation

Author(s): A. Popov, A. Miller, B. Miller, K. Stepanyan
2013, Vol. 46 (30), pp. 251-256
Author(s): Simon Lynen, Sammy Omari, Matthias Wüest, Markus Achtelik, Roland Siegwart
2021, Vol. 13 (4), pp. 772
Author(s): Changhui Xu, Zhenbin Liu, Zengke Li

Simultaneous Localization and Mapping (SLAM) has been a central problem in robot navigation for decades and has become a research hotspot in recent years. A SLAM system based on a vision sensor is vulnerable to environmental illumination and texture, and a monocular SLAM system additionally suffers from initial scale ambiguity. The fusion of a monocular camera and an inertial measurement unit (IMU) can effectively resolve the scale ambiguity, improve the robustness of the system, and achieve higher positioning accuracy. Building on the monocular visual-inertial navigation system (VINS-mono), which represents state-of-the-art fusion of monocular vision and an IMU, this paper designs a new initialization scheme that estimates the accelerometer bias as a variable during the initialization process, so that the system can be applied to low-cost IMU sensors. In addition, to obtain better initialization accuracy, a feature-point-based visual matching and positioning method is used to assist the initialization process. After initialization, the system switches to optical-flow tracking for visual positioning to reduce computational complexity. The proposed method thus combines the advantages of the feature-point method and the optical-flow method. This paper, the first to use both the feature-point method and the optical-flow method, achieves a better balance of positioning accuracy and robustness with low-cost sensors. Experiments conducted on the EuRoC dataset and in a campus environment show that the initial values obtained through the initialization process can be used to efficiently launch the nonlinear visual-inertial state estimator, and that the positioning accuracy of the improved VINS-mono is about 10% better than that of VINS-mono.
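The joint estimation of monocular scale and accelerometer bias during initialization can be illustrated with a deliberately simplified sketch. This is not the paper's VINS-mono initializer (which works on rotations, gravity, and pre-integrated IMU terms); it assumes a hypothetical 1-D motion model in which the IMU-derived position equals the up-to-scale visual position times an unknown scale `s`, plus a bias term `0.5 * b * t**2`, and solves for both unknowns by linear least squares.

```python
# Simplified sketch (NOT the VINS-mono initializer): jointly recover the
# monocular scale factor s and an accelerometer bias b from a 1-D model
#   p_imu ≈ s * p_vis + 0.5 * b * t**2
# using linear least squares.
import numpy as np

def estimate_scale_and_bias(p_vis, p_imu, t):
    """Solve p_imu ≈ s * p_vis + 0.5 * b * t**2 for (s, b)."""
    A = np.column_stack([p_vis, 0.5 * t**2])
    x, *_ = np.linalg.lstsq(A, p_imu, rcond=None)
    return x[0], x[1]

# Synthetic data: true scale 2.0, accelerometer bias 0.05 m/s^2.
t = np.linspace(0.1, 2.0, 20)
p_true = np.sin(t)                    # non-uniform motion (needed for observability)
p_vis = p_true / 2.0                  # up-to-scale visual positions
p_imu = p_true + 0.5 * 0.05 * t**2    # IMU positions corrupted by bias

s, b = estimate_scale_and_bias(p_vis, p_imu, t)
print(round(s, 3), round(b, 3))       # recovers s = 2.0, b = 0.05
```

Note that a constant-acceleration trajectory would make the two columns of `A` proportional and the system unobservable; real initializers likewise require sufficiently exciting motion.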


Author(s): Mohammad K. Al-Sharman, Mohammad Amin Al-Jarrah, Mamoun Abdel-Hafez

The high estimated position error of current commercial off-the-shelf GPS/INS units impedes precise autonomous takeoff and landing (TOL) flight operations. To overcome this problem, in this paper we propose an integrated global positioning system (GPS)/inertial navigation system (INS)/optical flow (OF) solution in which the OF provides an accurate augmentation to the GPS/INS. To ensure accurate and robust OF augmentation, we have used a robust modeling method to estimate the OF based on a set of real-time experiments conducted under various simulated helicopter-landing scenarios. Because the accuracy of the OF measurements depends on the accuracy of the height measurements, we have developed a real-time testing environment to model and validate the obtained dynamic OF model at various heights. The obtained OF model matches the real OF sensor with 87.70% fitting accuracy, and a mean error of 0.006 m/s between the velocity of the real OF sensor and that of the OF model is achieved. The velocity measurements of the OF model and the position from the GPS/INS are then used in a dynamic model-based sensor fusion algorithm. In the proposed solution, the OF sensor is engaged when the vehicle approaches a landing spot equipped with a predefined landing pattern. The proposed solution succeeded in performing helicopter auto TOL with a maximum position error of 27 cm.
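The core fusion idea, position from GPS/INS combined with velocity from optical flow, can be sketched with a generic linear Kalman filter. This is an illustrative stand-in, not the paper's helicopter dynamic model-based algorithm: it assumes a 1-D constant-velocity state, a noisy GPS position measurement, and a much more accurate OF velocity measurement, and shows how the filter blends them.

```python
# Hedged sketch of the fusion idea (a generic linear Kalman filter, not the
# paper's dynamic model-based algorithm): GPS supplies a noisy position,
# the optical-flow sensor supplies an accurate velocity, and a 1-D
# constant-velocity filter blends them into a smoother state estimate.
import numpy as np

def kf_fuse(gps_pos, of_vel, dt=0.1, q=0.01, r_gps=1.0, r_of=0.05):
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    Q = q * np.eye(2)                       # process noise
    H = np.eye(2)                           # we measure [position, velocity]
    R = np.diag([r_gps, r_of])              # GPS far noisier than OF
    x = np.array([gps_pos[0], of_vel[0]])   # initial state from first samples
    P = np.eye(2)
    out = []
    for z in zip(gps_pos, of_vel):
        x = F @ x                           # predict
        P = F @ P @ F.T + Q
        y = np.array(z) - H @ x             # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
        x = x + K @ y                       # update
        P = (np.eye(2) - K @ H) @ P
        out.append(x.copy())
    return np.array(out)

# Synthetic run: constant 2 m/s motion, GPS noise std 1 m, OF noise std 0.05 m/s.
rng = np.random.default_rng(0)
t = np.arange(0, 5, 0.1)
true_pos = 2.0 * t
gps = true_pos + rng.normal(0, 1.0, t.size)   # noisy GPS position
of = 2.0 + rng.normal(0, 0.05, t.size)        # accurate OF velocity
est = kf_fuse(gps, of)                        # est[:, 0] = fused position
```

The accurate velocity channel is what lets the filter reject most of the GPS position noise, which mirrors the paper's finding that OF augmentation tightens the position estimate near the landing pattern.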


2020, Vol. 75 (4), pp. 336-341
Author(s): A. V. Rzhevskiy, O. V. Snigirev, Yu. V. Maslennikov, V. Yu. Slobodchikov
