Monocular vision system for unmanned aerial vehicles

2013 ◽  
Author(s):  
Nasim Sepehri Boroujeni

Sensors ◽  
2015 ◽  
Vol 15 (7) ◽  
pp. 16848-16865 ◽  
Author(s):  
Kuo-Lung Huang ◽  
Chung-Cheng Chiu ◽  
Sheng-Yi Chiu ◽  
Yao-Jen Teng ◽  
Shu-Sheng Hao

Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 391
Author(s):  
Luca Bigazzi ◽  
Stefano Gherardini ◽  
Giacomo Innocenti ◽  
Michele Basso

In this paper, solutions for the precise maneuvering of an autonomous small (e.g., 350-class) Unmanned Aerial Vehicle (UAV) are designed and implemented through smart modifications of inexpensive mass-market technologies. The considered class of vehicles has a limited payload capacity, so only a small number of sensors and computing devices can be installed on board. To make the prototype capable of moving autonomously along a fixed trajectory, a “cyber-pilot”, able to replace the human operator on demand, has been implemented on an embedded control board. This cyber-pilot overrides the operator’s commands through a custom hardware signal mixer. The drone is able to localize itself in the environment without ground assistance by using a camera, possibly mounted on a 3 Degrees Of Freedom (DOF) gimbal suspension. A computer vision system processes the video stream to detect land markers with known absolute position and orientation. This information is fused with accelerations from a 6-DOF Inertial Measurement Unit (IMU) to create a “virtual sensor” that provides refined estimates of the pose, absolute position, speed, and angular velocities of the drone. Given the importance of this sensor, several fusion strategies have been investigated. The resulting data are finally fed to a control algorithm featuring a number of uncoupled digital PID controllers, which work to bring the displacement from the desired trajectory to zero.
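The final stage described above, independent digital PID loops driving the displacement from the desired trajectory to zero, can be illustrated with a short sketch. This is not the authors' implementation: the axis set, gains and sampling period below are hypothetical placeholders.

```python
# Minimal sketch of uncoupled digital PID position loops (hypothetical gains).
from dataclasses import dataclass, field

@dataclass
class DigitalPID:
    kp: float
    ki: float
    kd: float
    dt: float                      # fixed sampling period [s]
    _integral: float = field(default=0.0, init=False)
    _prev_error: float = field(default=0.0, init=False)

    def step(self, error: float) -> float:
        """One control update; returns the actuator command for this axis."""
        self._integral += error * self.dt
        derivative = (error - self._prev_error) / self.dt
        self._prev_error = error
        return self.kp * error + self.ki * self._integral + self.kd * derivative

# One independent controller per axis (assumed x, y, z, yaw); gains are placeholders.
axes = {name: DigitalPID(kp=1.2, ki=0.05, kd=0.3, dt=0.02)
        for name in ("x", "y", "z", "yaw")}

def control_update(displacement: dict) -> dict:
    """Map the displacement from the desired trajectory to per-axis commands."""
    return {name: pid.step(displacement[name]) for name, pid in axes.items()}
```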


Author(s):  
Oleksii Pikenin ◽  
Oleksander Marynoshenko

The chapter describes a control system developed for a group of unmanned aerial vehicles (UAVs) whose software is able to continue the flight in case of failures by switching to alternative control algorithms. The control system is built on a vision system that uses image-recognition methods. Coordinated group flight of UAVs can significantly improve the performance of surveillance tasks such as reconnaissance, image recognition, aerial photography, and industrial and environmental monitoring, but controlling a group of UAVs is a difficult task. In this chapter, the authors propose a model built around a leading UAV: the motion parameters of the system are determined by the direction of motion, the speed, and the acceleration of the leading UAV. A control system based on image-recognition methods expands the possibilities for coordinating the group of UAVs.
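The leader-based model outlined above, in which followers derive their motion from the leading UAV's direction, speed and acceleration, might look roughly as follows. The function name, the body-frame formation offset and the example values are illustrative assumptions, not the chapter's implementation.

```python
# Minimal sketch: each follower's set-point is the leader's state plus a fixed
# formation offset expressed in the leader's body frame (x along its heading).
import numpy as np

def follower_setpoint(leader_pos: np.ndarray,
                      leader_vel: np.ndarray,
                      leader_acc: np.ndarray,
                      formation_offset: np.ndarray) -> dict:
    """Return position/velocity/acceleration targets for one follower (2D sketch)."""
    heading = np.arctan2(leader_vel[1], leader_vel[0])
    c, s = np.cos(heading), np.sin(heading)
    rot = np.array([[c, -s], [s, c]])           # body-to-world rotation
    return {
        "position": leader_pos + rot @ formation_offset,
        "velocity": leader_vel.copy(),           # match leader speed and direction
        "acceleration": leader_acc.copy(),       # feed-forward leader acceleration
    }

# Example: follower holding station 5 m behind and 3 m to the right of the leader.
sp = follower_setpoint(np.array([10.0, 4.0]),
                       np.array([6.0, 0.0]),
                       np.array([0.2, 0.0]),
                       np.array([-5.0, -3.0]))
```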


Author(s):  
Mohammed Boulekchour ◽  
Nabil Aouf ◽  
Mark Richardson

In this paper, a system for real-time cooperative monocular visual motion estimation with multiple unmanned aerial vehicles is proposed. Distributing the system across a network of vehicles allows for efficient processing in terms of both computational time and estimation accuracy. The resulting global cooperative motion estimation employs state-of-the-art approaches for optimisation, individual motion estimation and registration. Three-view geometry algorithms are developed within a convex optimisation framework on board the monocular vision system of each vehicle. In the presented novel distributed cooperative strategy, a visual loop-closure module is deployed to detect any simultaneously overlapping fields of view of two or more of the vehicles. A positive detection from this module triggers the collaborative motion estimation algorithm between the vehicles involved in the loop closure. This scenario creates a flexible stereo set-up which jointly optimises the motion estimates of all vehicles in the cooperative scheme. Prior to that, vehicle-to-vehicle relative pose estimates are recovered with a novel robust registration solution in a global optimisation framework. Furthermore, as a complementary solution, a robust non-linear H∞ filter is designed to fuse measurements from the vehicles’ on-board inertial sensors with the visual estimates. The proposed cooperative navigation solution has been validated on real-world data, using two unmanned aerial vehicles equipped with monocular vision systems.
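The loop-closure trigger described above, overlapping fields of view enabling a flexible stereo set-up, can be sketched as follows. The ORB feature matching, ratio test and match-count threshold are stand-ins for the paper's visual loop-closure module, and the two estimation callbacks are hypothetical placeholders.

```python
# Sketch: invoke joint (stereo-like) estimation only when two vehicles' current
# frames appear to view the same scene; otherwise estimate motion individually.
import cv2

orb = cv2.ORB_create(nfeatures=1000)
matcher = cv2.BFMatcher(cv2.NORM_HAMMING)

def fields_of_view_overlap(frame_a, frame_b, min_matches: int = 40) -> bool:
    """Return True if the two grayscale frames share enough matched features."""
    _, des_a = orb.detectAndCompute(frame_a, None)
    _, des_b = orb.detectAndCompute(frame_b, None)
    if des_a is None or des_b is None:
        return False
    good = []
    for pair in matcher.knnMatch(des_a, des_b, k=2):
        # Lowe ratio test keeps only distinctive matches.
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
            good.append(pair[0])
    return len(good) >= min_matches

def cooperative_step(frame_a, frame_b, estimate_jointly, estimate_individually):
    """Run joint estimation only when the loop-closure check fires."""
    if fields_of_view_overlap(frame_a, frame_b):
        return estimate_jointly(frame_a, frame_b)        # flexible stereo set-up
    return estimate_individually(frame_a), estimate_individually(frame_b)
```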


Inspired by birds flying in flocks, for which vision is one of the most critical senses enabling them to respond to their neighbors’ motion, this paper introduces a novel approach to developing a vision system as the primary sensor for relative positioning in a leader-follower flight formation. To run the system in real time on board unmanned aerial vehicles (UAVs) with up to 1.5 kg of payload capacity, a few computing platforms are reviewed and evaluated; the study shows that the NVIDIA Jetson TX1 is the platform best suited to this project. In addition, several techniques and approaches for developing the algorithm are discussed. Based on the system requirements and the conducted study, the algorithm developed for this vision system is based on a tracking and on-line machine learning approach. Flight tests have been performed to check the accuracy and reliability of the system, and the results indicate a minimum accuracy of 83% for the vision system against ground-truth data.
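A minimal sketch of how a tracked bounding box of the leader could be turned into a relative position estimate is given below, assuming a pinhole camera model. The tracker itself (the paper's tracking and on-line machine learning stage) is abstracted away, and the calibration constants and leader wingspan are hypothetical.

```python
# Sketch: relative position of the leader from its tracked bounding box,
# using similar triangles with an assumed camera calibration.
import numpy as np

FX = FY = 800.0          # focal lengths [px] (assumed calibration)
CX, CY = 640.0, 360.0    # principal point [px] (assumed)
LEADER_SPAN = 0.9        # assumed physical width of the leader airframe [m]

def relative_position(bbox: tuple) -> np.ndarray:
    """Estimate (x, y, z) of the leader in the follower's camera frame.

    bbox = (u, v, w, h): top-left corner and size of the tracked box in pixels.
    Depth follows from similar triangles: z = fx * real_width / pixel_width.
    """
    u, v, w, h = bbox
    z = FX * LEADER_SPAN / w
    u_c, v_c = u + w / 2.0, v + h / 2.0          # box centre in the image
    x = (u_c - CX) * z / FX
    y = (v_c - CY) * z / FY
    return np.array([x, y, z])

# Example: a 120-px-wide detection centred slightly right of the image centre.
print(relative_position((700.0, 320.0, 120.0, 80.0)))
```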


Sensors ◽  
2021 ◽  
Vol 21 (21) ◽  
pp. 7360
Author(s):  
Paweł Rzucidło ◽  
Grzegorz Jaromi ◽  
Tomasz Kapuściński ◽  
Damian Kordos ◽  
Tomasz Rogalski ◽  
...  

In the near future, the integration of manned and unmanned aerial vehicles into common airspace will proceed. These changes mean that the safety of light aircraft, ultralight aircraft and unmanned aerial vehicles (UAVs) will become an increasing concern. The IDAAS project (Intruder Detection And collision Avoidance System) meets these new challenges, as it aims to produce technically advanced detection and collision avoidance systems for light and unmanned aerial vehicles. The work discusses selected elements of the research and practical tests of the intruder detection vision system, which is part of the IDAAS project. At the outset, the current formal requirements related to the necessity of installing anticollision systems on aircraft are presented. The concept of the IDAAS system and the structure of the image-processing algorithms are also discussed. The main part of the work presents the methodology developed for the dedicated flight tests, its implementation and the results obtained. The initial tests of the IDAAS system, carried out on an ultralight aircraft, generally indicate that intruders in the airspace can be detected effectively with vision methods, although they also revealed conditions in which this detection may prove difficult or even impossible.
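As an illustration of vision-based intruder detection in a sky image stream, a minimal sketch is given below. It does not reproduce the IDAAS algorithm structure; the background-subtraction detector and the area threshold are stand-in assumptions.

```python
# Sketch: flag small moving objects against the sky as intruder candidates.
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=25)

def detect_intruders(frame_bgr, min_area: int = 30):
    """Return bounding boxes of small moving objects in one video frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    mask = subtractor.apply(gray)
    # Remove single-pixel noise before extracting candidate regions.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN,
                            cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (3, 3)))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
```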

