Robust L∞ convex optimisation for UAVs cooperative motion estimation

Author(s): Mohammed Boulekchour, Nabil Aouf, Mark Richardson

In this paper, a system for real-time cooperative monocular visual motion estimation with multiple unmanned aerial vehicles is proposed. Distributing the system across a network of vehicles allows for efficient processing in terms of both computational time and estimation accuracy. The resulting global cooperative motion estimation employs state-of-the-art approaches for optimisation, individual motion estimation and registration. Three-view geometry algorithms are developed within a convex optimisation framework on board the monocular vision systems of each vehicle. In the presented novel distributed cooperative strategy, a visual loop-closure module is deployed to detect simultaneously overlapping fields of view of two or more vehicles. A positive detection from this module triggers the collaborative motion estimation algorithm between the vehicles involved in the loop closure. This scenario creates a flexible stereo set-up which jointly optimises the motion estimates of all vehicles in the cooperative scheme. Prior to that, vehicle-to-vehicle relative pose estimates are recovered with a novel robust registration solution in a global optimisation framework. Furthermore, as a complementary solution, a robust non-linear H∞ filter is designed to fuse measurements from the vehicles' on-board inertial sensors with the visual estimates. The proposed cooperative navigation solution has been validated on real-world data, using two unmanned aerial vehicles equipped with monocular vision systems.
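As a loose illustration of the L∞ convex-optimisation idea underlying the motion-estimation step, the minimal sketch below minimises the maximum absolute residual of a linearised measurement model with cvxpy. It is not the authors' three-view geometry formulation or their robust registration step; the matrix A, vector b and variable x are hypothetical placeholders standing in for stacked linearised measurements and the motion parameters.

```python
# Minimal sketch (hypothetical): generic L-infinity residual minimisation with cvxpy.
# This is NOT the paper's three-view formulation; A, b and x are placeholders
# standing in for a linearised measurement model and the motion parameters.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
A = rng.standard_normal((60, 6))                  # stacked measurement Jacobian (hypothetical)
x_true = rng.standard_normal(6)                   # "true" motion parameters (hypothetical)
b = A @ x_true + 0.01 * rng.standard_normal(60)   # noisy measurements

x = cp.Variable(6)
# L-infinity objective: minimise the worst-case residual, bounding the largest
# error rather than averaging it away as a least-squares (L2) fit would.
problem = cp.Problem(cp.Minimize(cp.norm(A @ x - b, "inf")))
problem.solve()

print("L-infinity optimal residual bound:", problem.value)
print("estimated parameters:", x.value)
```

The paper pairs this kind of worst-case-bounded estimation with vehicle-to-vehicle registration and a non-linear H∞ filter fusing inertial and visual measurements; neither of those components is shown in this sketch.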

2019, Vol 9 (15), pp. 3196
Author(s): Lidia María Belmonte, Rafael Morales, Antonio Fernández-Caballero

Personal assistant robots provide novel technological solutions for monitoring people's activities and helping them in their daily lives. In this sense, unmanned aerial vehicles (UAVs) can also serve as a present and future model of assistant robots. To develop aerial assistants, it is necessary to address the issue of autonomous navigation based on visual cues. Indeed, navigating autonomously is still a challenge in which computer vision technologies tend to play an outstanding role. Thus, the design of vision systems and algorithms for autonomous UAV navigation and flight control has become a prominent research field in the last few years. In this paper, a systematic mapping study is carried out in order to obtain a general view of this subject. The study provides an extensive analysis of papers that apply computer vision to the following autonomous UAV vision-based tasks: (1) navigation, (2) control, (3) tracking or guidance, and (4) sense-and-avoid. The works considered in the mapping study, a total of 144 papers from an initial set of 2081, have been classified under the four categories above. Moreover, the type of UAV, the features of the vision systems employed, and the validation procedures are also analyzed. The results obtained make it possible to draw conclusions about the research focuses, which UAV platforms are most used in each category, which vision systems are most frequently employed, and which types of tests are usually performed to validate the proposed solutions. The results of this systematic mapping study demonstrate the scientific community's growing interest in the development of vision-based solutions for autonomous UAVs. Moreover, they will make it possible to study the feasibility and characteristics of future UAVs taking the role of personal assistants.


2015, Vol 9 (5)
Author(s): Nikolay Vladimirovich Kim, Nikolay Evgenievich Bodunkov, Roman Igorevich Cherkezov
