Dynamics-Based Modified Fast Simultaneous Localization and Mapping for Unmanned Aerial Vehicles With Joint Inertial Sensor Bias and Drift Estimation

IEEE Access ◽  
2021 ◽  
Vol 9 ◽  
pp. 120247-120260
Author(s):  
Nargess Sadeghzadeh-Nokhodberiz ◽  
Aydin Can ◽  
Rustam Stolkin ◽  
Allahyar Montazeri

Sensors ◽  
2020 ◽  
Vol 20 (19) ◽  
pp. 5570
Author(s):  
Yiming Ding ◽  
Zhi Xiong ◽  
Wanling Li ◽  
Zhiguo Cao ◽  
Zhengchun Wang

The combination of biomechanics and inertial pedestrian navigation research provides a very promising approach for pedestrian positioning in environments where the Global Positioning System (GPS) signal is unavailable. However, in practical applications such as fire rescue and indoor security, inertial sensor-based pedestrian navigation systems face various challenges, especially step length estimation errors and heading drift during running and sprinting. In this paper, a trinal-node simultaneous localization and occupancy grid mapping method is proposed, based on two thigh-worn inertial measurement units (IMUs) and one waist-worn IMU. Specifically, gait detection and segmentation are realized by zero-crossing detection of the difference between the thighs' pitch angles. A piecewise function relating step length to the probability distribution of waist horizontal acceleration is established to achieve accurate step length estimation in both regular walking and drastic motions. In addition, a simultaneous localization and mapping method based on occupancy grids, which uses the historical trajectory to improve the pedestrian's pose estimation, is introduced. Experiments show that the proposed trinal-node pedestrian inertial odometer can identify and segment each gait cycle during walking, running, and sprinting. The average step length estimation error is no more than 3.58% of the total travel distance at motion speeds from 1.23 m/s to 3.92 m/s. In combination with the proposed occupancy-grid-based simultaneous localization and mapping method, the localization error is less than 5 m in a single-story building of 2643.2 m².
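The gait-segmentation step described in the abstract — detecting zero crossings of the difference between the two thighs' pitch angles — can be sketched as follows. This is a minimal illustration with hypothetical toy signals, not the authors' implementation; function and variable names are assumptions:

```python
def segment_gait_cycles(left_pitch, right_pitch):
    """Segment gait by zero crossings of the thigh pitch-angle
    difference (left minus right). Returns the sample indices at
    which the difference changes sign, i.e., candidate gait-cycle
    boundaries where the legs swap phase."""
    diff = [l - r for l, r in zip(left_pitch, right_pitch)]
    crossings = []
    for i in range(1, len(diff)):
        # A sign change between consecutive samples marks a crossing.
        if diff[i - 1] <= 0 < diff[i] or diff[i - 1] >= 0 > diff[i]:
            crossings.append(i)
    return crossings

# Toy thigh pitch-angle sequences (degrees) during walking; the two
# thighs oscillate roughly in anti-phase.
left = [10, 5, -2, -8, -3, 4, 9, 6, -1, -7]
right = [-8, -4, 3, 9, 4, -3, -9, -5, 2, 8]
print(segment_gait_cycles(left, right))  # → [2, 5, 8]
```

In a real system the pitch signals would come from the two thigh-worn IMUs and be low-pass filtered first, so that noise near zero does not produce spurious crossings.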


Sensors ◽  
2017 ◽  
Vol 17 (4) ◽  
pp. 802 ◽  
Author(s):  
Elena López ◽  
Sergio García ◽  
Rafael Barea ◽  
Luis Bergasa ◽  
Eduardo Molinos ◽  
...  

2015 ◽  
Vol 40 (5) ◽  
pp. 881-902 ◽  
Author(s):  
Pedro Lourenço ◽  
Bruno J. Guerreiro ◽  
Pedro Batista ◽  
Paulo Oliveira ◽  
Carlos Silvestre

Proceedings ◽  
2018 ◽  
Vol 4 (1) ◽  
pp. 44 ◽  
Author(s):  
Ankit Ravankar ◽  
Abhijeet Ravankar ◽  
Yukinori Kobayashi ◽  
Takanori Emaru

Mapping and exploration are important tasks for mobile robots in applications such as search and rescue, inspection, and surveillance. Unmanned aerial vehicles (UAVs) are well suited for such tasks because they have a larger field of view than ground robots. Autonomous operation of UAVs is desirable for exploration in unknown environments. In such environments, the UAV must build a map of the environment and simultaneously localize itself within it, a problem commonly known as SLAM (simultaneous localization and mapping). SLAM is also required to navigate safely through open spaces and to make informed decisions about exploration targets. UAVs have physical constraints, including limited payload, and are generally equipped with low-spec embedded computational devices and sensors. It is therefore often challenging to achieve robust SLAM on UAVs, which in turn affects exploration. In this paper, we present autonomous exploration by UAVs in completely unknown environments using low-cost sensors such as a LIDAR and an RGB-D camera. A sensor fusion method is proposed to build a dense 3D map of the environment. Multiple images of the scene are geometrically aligned as the UAV explores the environment, and a frontier exploration technique is then used to select the next target within the mapped area so as to cover the maximum possible area. The results show that the proposed algorithm can build precise maps even with low-cost sensors and explore the environment efficiently.
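The frontier exploration idea referenced above — picking the next target from cells on the boundary between known free space and unknown space — can be sketched on a 2-D occupancy grid. This is a generic illustration of the classic frontier definition, not the paper's implementation; the cell convention (0 = free, 1 = occupied, -1 = unknown) and the Manhattan-distance target selection are assumptions:

```python
def find_frontiers(grid):
    """Return (row, col) cells that are free and 4-adjacent to at
    least one unknown cell: the classic frontier definition.
    Assumed cell values: 0 = free, 1 = occupied, -1 = unknown."""
    rows, cols = len(grid), len(grid[0])
    frontiers = []
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] != 0:
                continue  # only free cells can be frontiers
            neighbors = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
            if any(0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == -1
                   for nr, nc in neighbors):
                frontiers.append((r, c))
    return frontiers

def nearest_frontier(grid, robot):
    """Pick the frontier closest to the robot (Manhattan distance) as
    the next exploration target; None when the map is fully explored."""
    frontiers = find_frontiers(grid)
    if not frontiers:
        return None
    return min(frontiers,
               key=lambda p: abs(p[0] - robot[0]) + abs(p[1] - robot[1]))

grid = [
    [0, 0, -1],
    [0, 1, -1],
    [0, 0,  0],
]
print(find_frontiers(grid))        # → [(0, 1), (2, 2)]
print(nearest_frontier(grid, (0, 0)))  # → (0, 1)
```

Repeatedly flying to the nearest frontier, updating the map, and recomputing frontiers drives the robot to cover the whole reachable area, which is the behavior the abstract describes.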


Drones ◽  
2021 ◽  
Vol 5 (4) ◽  
pp. 121
Author(s):  
Buğra ŞİMŞEK ◽  
Hasan Şakir BİLGE

Localization and mapping technologies are of great importance for all varieties of Unmanned Aerial Vehicles (UAVs) to perform their operations. In the near future, the use of micro/nano-size UAVs is expected to increase. Such vehicles are sometimes expendable platforms, and reuse may not be possible. Compact, body-mounted, low-cost cameras are preferred on these UAVs because of weight, cost, and size limitations. Visual simultaneous localization and mapping (vSLAM) methods are used to provide situational awareness for micro/nano-size UAVs. Fast rotational movements during flight with gimbal-free, body-mounted cameras cause motion blur. Above a certain level of motion blur, tracking losses occur, which prevents vSLAM algorithms from operating effectively. In this study, a novel vSLAM framework is proposed that prevents tracking losses in micro/nano-UAVs caused by motion blur. In the proposed framework, the blur level of each frame obtained from the platform camera is determined, and frames whose focus measure score falls below a threshold are restored by dedicated motion-deblurring methods. The major causes of tracking losses have been analyzed in experimental studies, and the proposed framework makes vSLAM algorithms robust to them. The framework has been observed to prevent tracking losses at processing speeds of 5, 10, and 20 fps, at which standard vSLAM algorithms previously failed to maintain normal operation, which can be considered a strength of this study.
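The blur-gating step described above — scoring each frame's sharpness and routing only low-scoring frames to deblurring — is commonly implemented with a variance-of-Laplacian focus measure. A minimal sketch on a grayscale image given as a 2-D list of intensities; the specific focus measure, the threshold value, and the `deblur` hook are assumptions for illustration, not the paper's choices:

```python
def laplacian_variance(img):
    """Variance of the 4-neighbor Laplacian response over interior
    pixels: a standard focus-measure score. Motion-blurred frames
    have weak edges, so they score low."""
    rows, cols = len(img), len(img[0])
    responses = []
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            lap = (img[r - 1][c] + img[r + 1][c] + img[r][c - 1]
                   + img[r][c + 1] - 4 * img[r][c])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((x - mean) ** 2 for x in responses) / len(responses)

def gate_frame(img, threshold=100.0, deblur=lambda f: f):
    """Pass sharp frames straight to the vSLAM front end; send frames
    below the focus-measure threshold through a deblurring routine
    first. `threshold` and `deblur` are placeholder hooks."""
    return img if laplacian_variance(img) >= threshold else deblur(img)

flat = [[5] * 4 for _ in range(4)]  # textureless patch, scores 0
spot = [[0, 0, 0, 0], [0, 100, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]]
print(laplacian_variance(flat), laplacian_variance(spot))
```

In practice the score would be computed per frame before feature extraction, so the vSLAM tracker only ever sees frames that are sharp enough to match features against the map.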


2014 ◽  
pp. 1183-1212
Author(s):  
Seungho Yoon ◽  
Seungkeun Kim ◽  
Jonghee Bae ◽  
Youdan Kim ◽  
Eung Tai Kim
