Visual odometry data fusion for indoor localization of an unmanned aerial vehicle

Author(s):  
Saptadeep Debnath ◽  
Jagadish Nayak
2021 ◽  
Author(s):  
Shuang Wu ◽  
Lei Deng ◽  
Lijie Guo ◽  
Yanjie Wu

Abstract Background: Leaf Area Index (LAI) is defined as one half of the total leaf area per unit horizontal ground surface area, so accurate vegetation extraction from remote sensing imagery is critical for LAI estimation. However, most studies do not fully exploit the advantages of Unmanned Aerial Vehicle (UAV) imagery with high spatial resolution, for example by not removing the background (soil, shadow, etc.). Furthermore, advances in multi-sensor synchronous observation and integration technology allow canopy spectral, structural, and thermal data to be collected simultaneously, making data fusion possible. Methods: To investigate the potential of high-resolution UAV imagery combined with multi-sensor data fusion for LAI estimation, high-resolution UAV imagery was obtained with a multi-sensor integrated MicaSense Altum camera to extract the wheat canopy's spectral, structural, and thermal features. After removing the soil background, all features were fused, and LAI was estimated using Random Forest and Support Vector Machine regression. Results: The results show that: (1) the soil background reduced the accuracy of LAI prediction, and it could be removed effectively by taking advantage of high-resolution UAV imagery; after removing the soil background, LAI prediction accuracy improved significantly, with R2 increasing by about 0.27 and RMSE falling by about 0.476. (2) The fusion of multi-sensor synchronous observation data improved LAI prediction accuracy and achieved the best result (R2 = 0.815 and RMSE = 1.023). (3) Compared with other variables, CHM, NRCT, NDRE, and BLUE were crucial for LAI estimation.
Even a simple Multiple Linear Regression model achieved high prediction accuracy (R2 = 0.679 and RMSE = 1.231), suggesting a route to rapid and efficient LAI prediction. Conclusions: The method of this study can be transferred to other sites with larger areas or similar agricultural structures, which will facilitate agricultural production and management.
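The pipeline described above — mask soil pixels, aggregate canopy features per plot, then regress LAI — can be sketched as follows. This is a minimal illustration, not the paper's code: the vegetation-index threshold, the synthetic features (stand-ins for NDRE, BLUE, CHM, NRCT), and the LAI values are all assumed for the example.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def masked_mean(band, ndvi, threshold=0.3):
    """Average a band over vegetation pixels only (NDVI above a threshold),
    which is one simple way to remove the soil background."""
    veg = ndvi > threshold
    return band[veg].mean() if veg.any() else 0.0

rng = np.random.default_rng(0)
n_plots = 120
# Synthetic per-plot features standing in for spectral (NDRE, BLUE),
# structural (CHM), and thermal (NRCT) variables -- illustrative only.
X = rng.uniform(0, 1, size=(n_plots, 4))
# Synthetic LAI loosely driven by the features, plus noise.
y = 3.0 * X[:, 0] + 1.5 * X[:, 2] - 1.0 * X[:, 3] + rng.normal(0, 0.2, n_plots)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X[:100], y[:100])
score = model.score(X[100:], y[100:])  # R^2 on held-out plots
```

The same fused feature matrix can be passed to a Support Vector Machine regressor (`sklearn.svm.SVR`) for the comparison the study reports.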


Sensors ◽  
2020 ◽  
Vol 20 (3) ◽  
pp. 919 ◽  
Author(s):  
Hao Du ◽  
Wei Wang ◽  
Chaowen Xu ◽  
Ran Xiao ◽  
Changyin Sun

How to estimate the state of an unmanned aerial vehicle (UAV) in real time across multiple environments remains a challenge. Although the global navigation satellite system (GNSS) has been widely applied, drones cannot perform position estimation when the GNSS signal is unavailable or disturbed. In this paper, the problem of state estimation in multiple environments is solved by employing an Extended Kalman Filter (EKF) algorithm to fuse the data from multiple heterogeneous sensors (MHS), including an inertial measurement unit (IMU), a magnetometer, a barometer, a GNSS receiver, an optical flow sensor (OFS), Light Detection and Ranging (LiDAR), and an RGB-D camera. Finally, the robustness and effectiveness of the multi-sensor data fusion system based on the EKF algorithm are verified by field flights in unstructured, indoor, outdoor, and indoor–outdoor transition scenarios.
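The predict/update cycle behind such a fusion filter can be illustrated with a deliberately small linear case (for a linear model the EKF reduces to the standard Kalman filter): a 1-D altitude state predicted from IMU acceleration and corrected by a barometer. All noise values and rates below are illustrative assumptions, not the paper's tuning, and the full system fuses seven heterogeneous sensors rather than two.

```python
import numpy as np

rng = np.random.default_rng(0)

dt = 0.01                                  # assumed 100 Hz IMU rate
F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition: [altitude, velocity]
B = np.array([[0.5 * dt**2], [dt]])        # acceleration input matrix
H = np.array([[1.0, 0.0]])                 # barometer measures altitude only
Q = np.diag([1e-4, 1e-3])                  # process noise (assumed)
R = np.array([[0.25]])                     # barometer variance (assumed)

def predict(x, P, accel):
    """Propagate the state with the IMU acceleration input."""
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z_meas):
    """Correct the state with an altitude measurement."""
    y = z_meas - H @ x                     # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

x = np.zeros((2, 1))
P = np.eye(2)
for k in range(500):                       # simulated constant 1 m/s^2 climb
    x, P = predict(x, P, accel=1.0)
    if k % 10 == 0:                        # barometer at 10 Hz
        true_z = 0.5 * 1.0 * ((k + 1) * dt) ** 2
        x, P = update(x, P, true_z + rng.normal(0, 0.5))
```

Fusing the remaining sensors follows the same pattern: each sensor contributes its own measurement model `H` and noise `R`, and updates are applied whenever that sensor produces data.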


2020 ◽  
Vol 12 (9) ◽  
pp. 1357 ◽  
Author(s):  
Maitiniyazi Maimaitijiang ◽  
Vasit Sagan ◽  
Paheding Sidike ◽  
Ahmad M. Daloye ◽  
Hasanjan Erkbol ◽  
...  

Non-destructive crop monitoring over large areas with high efficiency is of great significance in precision agriculture and plant phenotyping, as well as in decision making regarding grain policy and food security. The goal of this research was to assess the potential of combining canopy spectral information with canopy structure features for crop monitoring using satellite/unmanned aerial vehicle (UAV) data fusion and machine learning. Worldview-2/3 satellite data were tasked in synchronization with high-resolution RGB image collection by an inexpensive UAV at a heterogeneous soybean (Glycine max (L.) Merr.) field. Canopy spectral information (i.e., vegetation indices) was extracted from Worldview-2/3 data, and canopy structure information (i.e., canopy height and canopy cover) was derived from UAV RGB imagery. Canopy spectral and structure information, and their combination, were used to predict soybean leaf area index (LAI), aboveground biomass (AGB), and leaf nitrogen concentration (N) using partial least squares regression (PLSR), random forest regression (RFR), support vector regression (SVR), and extreme learning regression (ELR) with a newly proposed activation function.
The results revealed that: (1) UAV imagery-derived high-resolution and detailed canopy structure features, canopy height and canopy coverage, were significant indicators for crop growth monitoring; (2) integrating satellite imagery-based rich canopy spectral information with UAV-derived canopy structural features using machine learning improved soybean AGB, LAI, and leaf N estimation compared with using satellite or UAV data alone; (3) adding canopy structure information to spectral features reduced the background soil effect and the asymptotic saturation issue to some extent and led to better model performance; (4) the ELR model with the newly proposed activation function slightly outperformed PLSR, RFR, and SVR in the prediction of AGB and LAI, while RFR provided the best result for N estimation. This study highlighted the opportunities and limitations of satellite/UAV data fusion using machine learning in the context of crop monitoring.
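The ELR model belongs to the extreme learning machine (ELM) family, whose defining trait is that the hidden-layer weights are random and fixed while only the output weights are solved in closed form by least squares. The sketch below illustrates that structure; the tanh activation is a generic stand-in (the paper proposes its own activation function), and the fused feature matrix and target values are synthetic.

```python
import numpy as np

rng = np.random.default_rng(42)

class ELMRegressor:
    """Minimal extreme learning machine regressor: random fixed hidden
    weights, output weights fitted by linear least squares."""

    def __init__(self, n_hidden=50):
        self.n_hidden = n_hidden

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = rng.normal(size=(n_features, self.n_hidden))  # random, never trained
        self.b = rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                       # hidden activations
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)      # closed-form output weights
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Fused features: e.g. satellite vegetation indices plus UAV canopy
# height/cover (synthetic stand-ins here), predicting an LAI-like target.
X = rng.uniform(0, 1, size=(200, 5))
y = 2.0 * X[:, 0] + X[:, 3] + rng.normal(0, 0.05, 200)

elm = ELMRegressor().fit(X[:150], y[:150])
pred = elm.predict(X[150:])
```

Because only a linear solve is trained, ELM fits are fast, which is one reason the family is attractive for comparing many feature combinations.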


Sensors ◽  
2018 ◽  
Vol 18 (12) ◽  
pp. 4363 ◽  
Author(s):  
Qijun Gu ◽  
Drew R. Michanowicz ◽  
Chunrong Jia

The unmanned aerial vehicle (UAV) offers great potential for collecting air quality data with high spatial and temporal resolution. The objective of this study was to design and develop a modular UAV-based platform capable of real-time monitoring of multiple air pollutants. The system comprises five modules: the UAV, the ground station, the sensors, the data acquisition (DA) module, and the data fusion (DF) module. The hardware was constructed from off-the-shelf consumer parts, and the open-source software Ardupilot was used for flight control and data fusion. The prototype UAV system was tested in representative settings. Results show that this UAV platform can fly along pre-determined pathways with adequate flight time for various data collection missions. The system simultaneously collects air quality and high-precision X-Y-Z data and integrates and visualizes them in real time. While the system can accommodate multiple gas sensors, UAV operations may electronically interfere with the performance of chemical-resistant sensors. Our prototype and experiments prove the feasibility of the system and show that it provides a stable, high-precision spatial-temporal platform for air sample collection. Future work should focus on gas sensor development, plug-and-play interfaces, impacts of rotor wash, and all-weather designs.
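The core job of a data fusion module like the one described — attaching each gas reading to a position fix — amounts to a nearest-in-time join of two sensor streams. The sketch below is a hypothetical illustration of that step; the record layout, rates, and the 0.5 s gap limit are assumptions, not the platform's actual format.

```python
import bisect

def fuse(positions, readings, max_gap=0.5):
    """Tag each gas reading with the nearest-in-time X-Y-Z fix.

    positions: time-sorted list of (t, x, y, z) flight-controller fixes.
    readings:  list of (t, ppm) gas-sensor samples.
    Returns (t, x, y, z, ppm) tuples for readings with a fix within
    max_gap seconds; readings with no nearby fix are dropped.
    """
    times = [p[0] for p in positions]
    fused = []
    for t, ppm in readings:
        i = bisect.bisect_left(times, t)
        # Consider the fixes just before and just after t, keep the closer.
        candidates = [j for j in (i - 1, i) if 0 <= j < len(times)]
        j = min(candidates, key=lambda j: abs(times[j] - t))
        if abs(times[j] - t) <= max_gap:
            fused.append((t, *positions[j][1:], ppm))
    return fused

positions = [(0.0, 0, 0, 10), (0.2, 1, 0, 10), (0.4, 2, 0, 11)]
readings = [(0.19, 35.0), (5.0, 40.0)]
result = fuse(positions, readings)  # second reading dropped: no fix within 0.5 s
```

In a real pipeline the same join would run incrementally as samples arrive, which is what allows the real-time visualization the study describes.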

