Visual industrial inspection using aerial robots

Author(s):  
Sammy Omari ◽  
Pascal Gohl ◽  
Michael Burri ◽  
Markus Achtelik ◽  
Roland Siegwart


2019 ◽  
Vol 11 (10) ◽  
pp. 1157 ◽  
Author(s):  
Jorge Fuentes-Pacheco ◽  
Juan Torres-Olivares ◽  
Edgar Roman-Rangel ◽  
Salvador Cervantes ◽  
Porfirio Juarez-Lopez ◽  
...  

Crop segmentation is an important task in Precision Agriculture, where the use of aerial robots with an on-board camera has contributed to the development of new solution alternatives. We address the problem of fig plant segmentation in top-view RGB (Red-Green-Blue) images of a crop grown under difficult open-field circumstances: complex lighting conditions and the non-ideal crop maintenance practices of local farmers. We present a Convolutional Neural Network (CNN) with an encoder-decoder architecture that classifies each pixel as crop or non-crop using only raw colour images as input. Our approach achieves a mean accuracy of 93.85% despite the complexity of the background and the highly variable visual appearance of the leaves. We make our CNN code available to the research community, as well as the aerial image data set and a hand-made, pixel-precise ground truth segmentation, to facilitate comparison among different algorithms.
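The per-pixel crop/non-crop classification this abstract describes can be illustrated in miniature. The sketch below is not the authors' encoder-decoder network: it uses a single hand-rolled convolution per colour channel with random (untrained) filter weights and a toy image, purely to show the raw-RGB-in, binary-mask-out pipeline and the mean-accuracy metric:

```python
import numpy as np

def conv2d_same(img, kernel):
    """Naive 'same' 2D convolution of a single-channel image (zero padding)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

def segment_crop(rgb, weights, bias=0.0):
    """Score every pixel as crop/non-crop from raw RGB.
    weights: (3, 3, 3) array, one 3x3 filter per colour channel
    (random stand-ins here, not trained parameters)."""
    score = sum(conv2d_same(rgb[..., c], weights[c]) for c in range(3)) + bias
    prob = 1.0 / (1.0 + np.exp(-score))    # sigmoid -> P(pixel is crop)
    return (prob > 0.5).astype(np.uint8)   # binary segmentation mask

def mean_pixel_accuracy(pred, gt):
    return float((pred == gt).mean())

rng = np.random.default_rng(0)
rgb = rng.random((32, 32, 3))              # stand-in aerial image patch
gt = (rgb[..., 1] > 0.5).astype(np.uint8)  # toy "green = crop" ground truth
weights = rng.standard_normal((3, 3, 3)) * 0.1
mask = segment_crop(rgb, weights)
acc = mean_pixel_accuracy(mask, gt)
```

A real encoder-decoder would stack many learned convolutions with downsampling and upsampling, but the input/output contract — colour image in, same-size binary mask out — is the one sketched here.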


2021 ◽  
Author(s):  
Iman Shafieenejad ◽  
Elham Dehghan Rouzi ◽  
Jamshid Sardari ◽  
Mohammad Siami Araghi ◽  
Amirhosein Esmaeili ◽  
...  

2021 ◽  
Vol 11 (9) ◽  
pp. 3921
Author(s):  
Paloma Carrasco ◽  
Francisco Cuesta ◽  
Rafael Caballero ◽  
Francisco J. Perez-Grau ◽  
Antidio Viguria

The use of unmanned aerial robots has increased exponentially in recent years, and industrial applications in environments with degraded satellite signals are becoming increasingly relevant. This article presents a solution for the 3D localization of aerial robots in such environments. Truly exploiting these versatile platforms for added-value use cases in these scenarios requires a high level of reliability. Hence, the proposed solution is based on a probabilistic approach that fuses a 3D laser scanner, radio sensors, a previously built map of the environment, and input odometry to obtain pose estimates computed onboard the aerial platform. Experimental results show the feasibility of the approach in terms of accuracy, robustness and computational efficiency.
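The abstract does not name the estimator, but a particle filter over a known map is a common probabilistic choice for this sensor suite (odometry for prediction, range-like measurements for correction). The sketch below is a simplified 2D stand-in, not the article's system: the radio anchors, noise levels, and hovering odometry are all hypothetical values chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup (2D for brevity): radio anchors at known map positions
# provide range measurements; input odometry drives the prediction step.
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
true_pose = np.array([4.0, 6.0])

def predict(particles, odom, noise=0.1):
    """Propagate every particle with the input odometry plus motion noise."""
    return particles + odom + rng.normal(0.0, noise, particles.shape)

def update(particles, ranges, sigma=0.3):
    """Reweight particles by the likelihood of the measured anchor ranges,
    then resample (log-space weights avoid numerical underflow)."""
    d = np.linalg.norm(particles[:, None, :] - anchors[None, :, :], axis=2)
    logw = -0.5 * (((d - ranges) / sigma) ** 2).sum(axis=1)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

particles = rng.uniform(0.0, 10.0, (2000, 2))   # uniform prior over the map
for _ in range(20):
    odom = np.zeros(2)                          # hovering in place, for the demo
    particles = predict(particles, odom)
    ranges = np.linalg.norm(anchors - true_pose, axis=1) + rng.normal(0.0, 0.1, 3)
    particles = update(particles, ranges)

estimate = particles.mean(axis=0)               # onboard pose estimate
```

The real system would additionally match 3D laser scans against the prior map inside the update step; the predict/update/resample loop itself is the part this sketch captures.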


Sensors ◽  
2021 ◽  
Vol 21 (7) ◽  
pp. 2534
Author(s):  
Oualid Doukhi ◽  
Deok-Jin Lee

Autonomous navigation and collision avoidance missions represent a significant challenge for robotics systems, as they generally operate in dynamic environments that require a high level of autonomy and flexible decision-making capabilities. This challenge is even more acute for micro aerial vehicles (MAVs), given their limited size and computational power. This paper presents a novel approach for enabling a micro aerial vehicle equipped with a laser range finder to autonomously navigate among obstacles and reach a user-specified goal location in a GPS-denied environment, without the need for mapping or path planning. The proposed system uses an actor–critic reinforcement learning technique to train the aerial robot in a Gazebo simulator to perform a point-goal navigation task by directly mapping the MAV's noisy state and laser scan measurements to continuous motion control. The obtained policy can perform collision-free flight in the real world despite being trained entirely in a 3D simulator. Intensive simulations and real-time experiments were conducted and compared against a nonlinear model predictive control technique, demonstrating generalization to new unseen environments and robustness against localization noise. The results show that our system flies safely and reaches the desired points by planning smooth forward linear velocities and heading rates.
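The actor–critic update underlying such training can be shown on a deliberately tiny stand-in problem: a 1D agent that must learn a continuous action moving it to a goal. The linear Gaussian policy, linear critic, reward, and learning rates below are illustrative assumptions, not the paper's MAV setup or network architecture:

```python
import numpy as np

rng = np.random.default_rng(3)
sigma = 1.0            # fixed exploration noise of the Gaussian policy
alpha_pi, alpha_v = 0.01, 0.05
theta, w = 0.0, 0.0    # actor gain and critic weight (linear models)

def value_feature(s):
    return abs(s)      # expected cost grows with distance to the goal

for _ in range(20000):
    s = rng.uniform(-2.0, 2.0)               # state: signed distance to goal
    a = theta * s + rng.normal(0.0, sigma)   # sample action from pi(a|s)
    r = -abs(s - a)                          # reward: negative residual distance
    delta = r - w * value_feature(s)         # TD error against critic baseline
    w += alpha_v * delta * value_feature(s)  # critic update
    theta += alpha_pi * delta * (a - theta * s) * s / sigma**2  # actor update

# Deterministic evaluation: the learned gain should move the agent to the goal
# (theta near 1 means the action cancels the distance in one step).
states = np.linspace(-2.0, 2.0, 9)
residual = float(np.mean(np.abs(states - theta * states)))
baseline = float(np.mean(np.abs(states)))    # doing nothing (theta = 0)
```

The MAV system replaces the linear actor and critic with neural networks over laser scans and vehicle state, but the structure — sample an action, compute a TD error, nudge critic and policy with it — is the same.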


2000 ◽  
Vol 6 (S2) ◽  
pp. 1170-1171
Author(s):  
M. C. Henk ◽  
H. Silverman

LSU began introducing a prototype SCOPE-ON-A-ROPE (SOAR) to selected teachers in Louisiana and Tennessee three years ago as part of our K-12 outreach activities. It proved to be an invaluable aid to all K-12 classrooms as well as to college classrooms or laboratories in several disciplines. The SOAR is extremely easy to use in the normal classroom setting, but can also introduce sophisticated concepts usually possible only through complicated microscopy exercises with specialized instrumentation. The professional microscopist who occasionally teaches students how to use microscopes can only begin to appreciate the position of classroom teachers who are routinely faced with inadequate, insufficient microscopes for classes of 20-30 students at a time. This SOAR, inspired by industrial inspection devices, aids the teacher in introducing valuable concepts in microscopy and scale while easily serving the functions of many different microscopes and accessories. It is a comfortably hand-held device that can be used capably even by a five-year-old to provide excellent,


Robotics ◽  
2019 ◽  
Vol 8 (2) ◽  
pp. 24 ◽  
Author(s):  
Hang Cui ◽  
Catherine Maguire ◽  
Amy LaViers

This paper presents a method for creating expressive aerial robots through an algorithmic procedure for creating variable motion under given task constraints. This work is informed by the close study of the Laban/Bartenieff movement system, and movement observation from this discipline provides important analysis of the method, offering descriptive words and fitting contexts—a choreographic frame—for the motion styles produced. User studies that utilize this qualitative analysis then validate that the method can be used to generate appropriate motion in in-home contexts. The accuracy of an individual descriptive word for the developed motion is up to 77%, and context accuracy is up to 83%. A capacity for state discernment from motion profile is essential in the context of projects working toward developing in-home robots.


Author(s):  
Shehryar Khattak ◽  
Frank Mascarich ◽  
Tung Dang ◽  
Christos Papachristos ◽  
Kostas Alexis
