Navigation of Underwater Drones and Integration of Acoustic Sensing with Onboard Inertial Navigation System

Drones ◽  
2021 ◽  
Vol 5 (3) ◽  
pp. 83
Author(s):  
Alexander Miller ◽  
Boris Miller ◽  
Gregory Miller

The navigation of autonomous underwater vehicles (AUVs) is a major scientific and technological challenge. The principal difficulty is the opacity of the water medium to the usual types of radiation, with the exception of acoustic waves. Thus, an acoustic transducer array forming a sonar is the only tool for external measurement of the AUV attitude and position. Another difficulty is the variability of the acoustic propagation speed, which depends on temperature, salinity, and pressure. For these reasons, only the fusion of acoustic measurements with data from other onboard inertial navigation system sensors can provide the necessary estimation quality and robustness. This review presents common approaches to underwater navigation along with one novel method of velocity measurement. The latter is an analog of the well-known optical flow method, but based on a sequence of sonar array measurements.
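The paper's sonar-based optical-flow analog is not reproduced here, but the underlying idea can be sketched under stated assumptions: given two consecutive 2-D sonar intensity frames, the inter-ping displacement (and hence a velocity-over-ground estimate) can be found by exhaustive block matching. The function name `sonar_shift`, the frame sizes, and the plain correlation score are illustrative choices, not the authors' algorithm.

```python
import numpy as np

def sonar_shift(frame_a, frame_b, max_shift=5):
    """Estimate the integer (row, col) displacement between two
    consecutive sonar intensity frames by exhaustive block matching,
    a discrete analog of optical flow."""
    h, w = frame_a.shape
    best, best_shift = -np.inf, (0, 0)
    for dr in range(-max_shift, max_shift + 1):
        for dc in range(-max_shift, max_shift + 1):
            # overlap region where frame_a[r, c] faces frame_b[r + dr, c + dc]
            a = frame_a[max(0, -dr):h + min(0, -dr), max(0, -dc):w + min(0, -dc)]
            b = frame_b[max(0, dr):h + min(0, dr), max(0, dc):w + min(0, dc)]
            score = np.sum(a * b)          # cross-correlation score
            if score > best:
                best, best_shift = score, (dr, dc)
    return best_shift

# toy example: a bright patch moves 2 cells down and 1 cell right between pings
f0 = np.zeros((32, 32)); f0[10:14, 10:14] = 1.0
f1 = np.zeros((32, 32)); f1[12:16, 11:15] = 1.0
dr, dc = sonar_shift(f0, f1)
```

Dividing the recovered shift by the grid resolution and ping interval would convert it into a velocity over ground; a real implementation would also normalize the correlation and interpolate for sub-cell accuracy.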

2013 ◽  
Vol 389 ◽  
pp. 758-764 ◽  
Author(s):  
Qi Wang ◽  
Dong Li ◽  
Zi Jia Zhang ◽  
Chang Song Yang

To improve the navigation precision of autonomous underwater vehicles, a terrain-aided strapdown inertial navigation method based on an Improved Unscented Kalman Filter (IUKF) is proposed in this paper. The characteristics of the strapdown inertial navigation system and the terrain-aided navigation system are described, and the improved UKF method is applied to the information fusion. Simulation experiments on the proposed integrated navigation system were carried out in comparison with traditional Kalman filtering methods. The results suggest that the IUKF method greatly improves long-term navigation precision relative to the traditional information fusion method.
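The terrain-aided fusion idea can be sketched in miniature, assuming a one-dimensional along-track state, a known seabed depth profile, and a plain scalar UKF as a simplified stand-in for the paper's improved UKF; the `terrain` profile, noise levels, and the parameter `kappa` are illustrative assumptions.

```python
import numpy as np

def terrain(x):
    # assumed known seabed depth profile along the track (illustrative)
    return 50.0 + 8.0 * np.sin(0.1 * x)

def ukf_step(x, P, z, v, dt, q=0.05, r=0.5, kappa=2.0):
    """One predict/update cycle of a scalar unscented Kalman filter
    fusing dead-reckoned motion with a nonlinear depth measurement."""
    # predict with the dead-reckoned velocity from the strapdown INS
    x_pred = x + v * dt
    P_pred = P + q
    # sigma points of the predicted state (Julier's formulation, n = 1)
    n, lam = 1, kappa
    s = np.sqrt((n + lam) * P_pred)
    sigmas = np.array([x_pred, x_pred + s, x_pred - s])
    w = np.array([lam / (n + lam), 0.5 / (n + lam), 0.5 / (n + lam)])
    # propagate sigma points through the nonlinear terrain map
    zs = terrain(sigmas)
    z_hat = np.dot(w, zs)
    S = np.dot(w, (zs - z_hat) ** 2) + r      # innovation variance
    C = np.dot(w, (sigmas - x_pred) * (zs - z_hat))  # cross-covariance
    K = C / S                                  # Kalman gain
    return x_pred + K * (z - z_hat), P_pred - K * S * K

# track a vehicle moving at 1 m/s using noisy depth soundings
rng = np.random.default_rng(0)
x_true, x_est, P = 0.0, 0.0, 1.0
for _ in range(50):
    x_true += 1.0
    z = terrain(x_true) + rng.normal(0.0, 0.3)
    x_est, P = ukf_step(x_est, P, z, v=1.0, dt=1.0)
```

The unscented transform avoids linearizing `terrain` analytically, which is the property that makes UKF variants attractive for terrain-aided navigation.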


2001 ◽  
Vol 15 (5) ◽  
pp. 521-532 ◽  
Author(s):  
Xiaoping Yun ◽  
Eric R. Bachmann ◽  
Suat Arslan ◽  
Kadir Akyol ◽  
Robert B. McGhee

2021 ◽  
Vol 13 (4) ◽  
pp. 772
Author(s):  
Changhui Xu ◽  
Zhenbin Liu ◽  
Zengke Li

Simultaneous Localization and Mapping (SLAM) has been a focus of robot navigation research for decades and has become a hotspot in recent years. Because a SLAM system based on a vision sensor is vulnerable to environmental illumination and texture, the problem of initial scale ambiguity still exists in a monocular SLAM system. The fusion of a monocular camera and an inertial measurement unit (IMU) can effectively solve the scale ambiguity problem, improve the robustness of the system, and achieve higher positioning accuracy. Building on the monocular visual-inertial navigation system (VINS-mono), a state-of-the-art fusion of monocular vision and an IMU, this paper designs a new initialization scheme that estimates the accelerometer bias as a variable during the initialization process, so that the scheme can be applied to low-cost IMU sensors. In addition, to obtain better initialization accuracy, a feature-point-based visual matching positioning method is used to assist the initialization process. After initialization, the system switches to an optical-flow-tracking visual positioning mode to reduce computational complexity. The proposed method thus combines the advantages of the feature point method and the optical flow method; this paper, the first to use both methods in this way, achieves better overall positioning accuracy and robustness with low-cost sensors. Experiments conducted on the EuRoc dataset and in a campus environment show that the initial values obtained through the initialization process can be used efficiently to launch the nonlinear visual-inertial state estimator, and that the positioning accuracy of the improved VINS-mono is about 10% better than that of VINS-mono.
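The optical-flow tracking mode that the scheme switches to after initialization rests on the Lucas-Kanade least-squares solve, which can be sketched as follows; the window size, the synthetic test images, and the function name are illustrative assumptions, not code from the paper.

```python
import numpy as np

def lucas_kanade(img0, img1, pt, win=7):
    """Track one feature point from img0 to img1 with a single
    Lucas-Kanade step: solve the brightness-constancy equations
    in a small window by least squares."""
    r, c = pt
    h = win // 2
    patch0 = img0[r - h:r + h + 1, c - h:c + h + 1].astype(float)
    patch1 = img1[r - h:r + h + 1, c - h:c + h + 1].astype(float)
    # spatial gradients and temporal difference inside the window
    Iy, Ix = np.gradient(patch0)
    It = patch1 - patch0
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (dx, dy), *_ = np.linalg.lstsq(A, b, rcond=None)
    return r + dy, c + dx

# synthetic pair: a Gaussian blob shifted by (0.4, 0.6) pixels
yy, xx = np.mgrid[0:40, 0:40]
g = lambda cy, cx: np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / 18.0)
img0, img1 = g(20.0, 20.0), g(20.4, 20.6)
r1, c1 = lucas_kanade(img0, img1, (20, 20))
```

A single step like this is only valid for sub-pixel to small motions; production trackers (e.g. pyramidal Lucas-Kanade) iterate over image pyramids, which is why the mode is cheaper than re-extracting and matching feature descriptors every frame.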


2020 ◽  
Vol 75 (4) ◽  
pp. 336-341
Author(s):  
A. V. Rzhevskiy ◽  
O. V. Snigirev ◽  
Yu. V. Maslennikov ◽  
V. Yu. Slobodchikov
