Applications of UAS in Crop Biomass Monitoring: A Review

2021 ◽  
Vol 12 ◽  
Author(s):  
Tianhai Wang ◽  
Yadong Liu ◽  
Minghui Wang ◽  
Qing Fan ◽  
Hongkun Tian ◽  
...  

Biomass is an important indicator for evaluating crops. Rapid, accurate, and nondestructive monitoring of biomass is key to smart agriculture and precision agriculture. Traditional detection methods are based on destructive measurements. Although satellite remote sensing, manned airborne equipment, and vehicle-mounted equipment can collect measurements nondestructively, they are limited by low accuracy, poor flexibility, and high cost. As nondestructive remote sensing equipment with high precision, high flexibility, and low cost, unmanned aerial systems (UAS) have been widely used to monitor crop biomass. In this review, UAS platforms and sensors, biomass indices, and data analysis methods are presented. Recent improvements of UAS in monitoring crop biomass are introduced, and multisensor fusion, multi-index fusion, the consideration of features not directly related to biomass, the adoption of advanced algorithms, and the use of low-cost sensors are reviewed to highlight the potential of UAS for monitoring crop biomass. Considering the progress made on these problems, we also suggest directions for future research. It is expected that the remaining challenges to wider UAS adoption will be overcome, furthering the realization of smart agriculture and precision agriculture.

Electronics ◽  
2021 ◽  
Vol 10 (4) ◽  
pp. 517
Author(s):  
Seong-heum Kim ◽  
Youngbae Hwang

Owing to recent advancements in deep learning methods and relevant databases, it is becoming increasingly easy to recognize 3D objects using only RGB images from single viewpoints. This study investigates the major breakthroughs and current progress in deep learning-based monocular 3D object detection. For relatively low-cost data acquisition systems without depth sensors or cameras at multiple viewpoints, we first consider existing databases with 2D RGB photos and their relevant attributes. Based on this simple sensor modality for practical applications, deep learning-based monocular 3D object detection methods that overcome significant research challenges are categorized and summarized. We present the key concepts and detailed descriptions of representative single-stage and multiple-stage detection solutions. In addition, we discuss the effectiveness of the detection models on their baseline benchmarks. Finally, we explore several directions for future research on monocular 3D object detection.


Sensors ◽  
2019 ◽  
Vol 19 (22) ◽  
pp. 4916 ◽  
Author(s):  
Qiaoyun Wu ◽  
Yunzhe Zhang ◽  
Qian Yang ◽  
Ning Yuan ◽  
Wei Zhang

The vital importance of rapid and accurate detection of foodborne pathogens has driven the development of biosensors to prevent foodborne illness outbreaks. Electrochemical DNA biosensors offer such merits as rapid response, high sensitivity, low cost, and ease of use. This review covers three aspects: foodborne pathogens and conventional detection methods, the design and fabrication of electrochemical DNA biosensors, and techniques for improving biosensor sensitivity. We highlight the main bioreceptors and immobilization methods on the sensing interface, electrochemical techniques, electrochemical indicators, nanotechnology, and nucleic acid-based amplification. Finally, in view of the existing shortcomings of electrochemical DNA biosensors for foodborne pathogen detection, we predict future research focuses in five areas: specific bioreceptors (improving specificity), nanomaterials (enhancing sensitivity), microfluidic chip technology (enabling automated operation), paper-based biosensors (reducing detection cost), and smartphones or other mobile devices (simplifying signal readout).


Author(s):  
Charles Marseille ◽  
Martin Aubé ◽  
Africa Barreto Velasco ◽  
Alexandre Simoneau

Aerosol optical depth (AOD) is an important indicator of aerosol particle properties and associated radiative impacts, so its determination is very important for relevant climate modeling. Most remote sensing techniques for retrieving AOD are applicable only in daytime, given the high level of light available. Night represents half of the time, yet only a few remote sensing techniques work in such conditions; among these, the most reliable are moon photometers and star photometers. In this paper, we attempt to fill gaps in the aerosol detection performed with the aforementioned techniques using night sky brightness measurements during moonless nights with the novel CoSQM: a portable, low-cost, open-source multispectral photometer. We present an innovative method for estimating AOD using an empirical relationship between the zenith night sky brightness measured at night with the CoSQM and the AOD retrieved in daytime from the AErosol RObotic NETwork (AERONET). The method is especially suited to light-polluted regions with light pollution sources located within a few kilometers of the observation site. A coherent day-to-night evolution of AOD and Ångström exponent over a set of 354 days and nights from August 2019 to February 2021 was verified at Santa Cruz de Tenerife on the island of Tenerife, Spain. The preliminary uncertainty of the technique, evaluated from the variance under stable day-to-night conditions, is 0.02 for AOD and 0.75 for the Ångström exponent. These results indicate that the CoSQM combined with the proposed methodology is a promising tool for adding new information on aerosol optical properties at night, which could be of key importance for improving climate predictions.
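The core idea above, calibrating an empirical mapping from zenith night sky brightness (NSB) to AOD against daytime AERONET retrievals, can be sketched as a simple fitting step. The sketch below is a minimal illustration, not the authors' actual model: it assumes a linear NSB-to-AOD relation fitted by least squares, and the function names are my own.

```python
import numpy as np

def fit_nsb_aod_relation(nsb_zenith, aod_day):
    """Fit an empirical linear relation between zenith night sky
    brightness (mag/arcsec^2) and daytime AERONET AOD, using paired
    stable day/night observations. Returns (slope, intercept)."""
    slope, intercept = np.polyfit(nsb_zenith, aod_day, 1)
    return slope, intercept

def estimate_aod(nsb_zenith, slope, intercept):
    """Apply the fitted relation to new nighttime NSB readings."""
    return slope * np.asarray(nsb_zenith, dtype=float) + intercept
```

In practice the paper fits such a relation per spectral band and location, since the NSB-to-AOD mapping depends on the local light pollution sources.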


2020 ◽  
Vol 10 (19) ◽  
pp. 6668
Author(s):  
Laura García ◽  
Lorena Parra ◽  
Jose M. Jimenez ◽  
Jaime Lloret ◽  
Pedro V. Mauri ◽  
...  

The increase in the world population has led to new needs for food, and Precision Agriculture (PA) is one focus of policies to optimize crops and facilitate crop management using technology. Drones have been gaining popularity in PA for remote sensing activities such as photo and video capture, as well as other tasks such as fertilization or scaring animals away. These drones could also serve as mobile gateways, benefiting from their already designed flight plans. In this paper, we evaluate the adequacy of remote sensing drones to perform gateway functionality, providing a guide for choosing the best drone parameters for successful WiFi data transmission between sensor nodes and the gateway in PA systems for crop monitoring and management. The novelty of this paper compared with existing mobile gateway proposals is that we test the performance of a drone that acts as a remote sensing tool while carrying a low-cost gateway node to gather data from the nodes deployed in the field. With this in mind, we simulated different scenarios to determine whether the data can be transmitted correctly, considering flight parameters such as speed (1 to 20 m/s) and flying height (4 to 104 m), and wireless sensor network parameters such as node density (1 node per 60 m² to 1 node per 5000 m²) and antenna coverage (25 to 200 m). We calculated the time each node remains connected and the time required to send its data, to estimate whether the connection will be bad, good, or optimal. Results point out that, for the maximum node density, only one combination offers good connectivity (the lowest speed, a flying height of 24 m, and an antenna with 25 m of coverage). For the other node densities, several combinations of flying height and antenna coverage allow good and optimal connectivity.
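The connectivity estimate described above reduces to simple geometry: with spherical antenna coverage of radius r around a node and a drone passing directly overhead at height h and speed v, the time in range is the chord length 2·sqrt(r² − h²) divided by v. A minimal sketch follows; the bad/good/optimal thresholds here are my assumption, not the paper's criteria.

```python
import math

def connectivity_time(coverage_m, height_m, speed_ms):
    """Time (s) a drone flying straight over a node stays inside its
    spherical coverage of radius coverage_m. Returns 0 if the drone
    flies above the coverage sphere entirely."""
    if height_m >= coverage_m:
        return 0.0
    half_chord = math.sqrt(coverage_m**2 - height_m**2)
    return 2.0 * half_chord / speed_ms

def link_quality(coverage_m, height_m, speed_ms, tx_time_s):
    """Classify the link: 'bad' if there is not enough time in range to
    send the data, 'optimal' with at least 2x margin, else 'good'.
    (Illustrative thresholds, not the paper's.)"""
    t = connectivity_time(coverage_m, height_m, speed_ms)
    if t < tx_time_s:
        return "bad"
    return "optimal" if t >= 2 * tx_time_s else "good"
```

For the single good combination at maximum node density (speed 1 m/s, height 24 m, coverage 25 m), this geometry gives 2·sqrt(25² − 24²) = 14 s in range.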


Author(s):  
M. Hassanein ◽  
M. Khedr ◽  
N. El-Sheimy

Abstract. Precision Agriculture (PA) management systems are considered among the top ten revolutions in the agriculture industry of the last couple of decades. Generally, PA is a management system that aims to integrate different technologies, such as navigation and imagery systems, to control the use of agricultural inputs, enhancing the quality and quantity of output while preserving the surrounding environment from any harm these inputs might cause. During the last decade, Unmanned Aerial Vehicles (UAVs) have shown great potential to enhance the use of remote sensing and imagery sensors for different PA applications such as weed management, crop health monitoring, and crop row detection. UAV imagery systems are capable of filling the gap between aerial and terrestrial imagery systems. One important PA application of UAV imagery systems that has drawn much interest is crop row detection, particularly because it supports other applications such as weed detection and crop yield prediction. This paper introduces a new crop row detection methodology using a low-cost UAV RGB imagery system. The methodology has three main steps. First, the RGB images are converted into HSV color space and the Hue image is extracted. Then, sections are generated in the Hue image at different orientation angles; for each section, PCA of the Hue values evaluates their variance, and the crop row orientation is detected as the orientation angle of the section with minimum Hue variance. Finally, a scan line is passed over the Hue image at the detected crop row orientation, computing the average Hue value of each line. The resulting values form a profile of peaks and valleys that represent the crop and soil rows. The proposed methodology was evaluated using RGB images of a canola field acquired by a low-cost UAV at different flight heights and on different dates. The achieved results proved the ability of the proposed methodology to detect crop rows in the different cases.
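The orientation search can be sketched by scoring candidate angles directly on the Hue image. The sketch below substitutes a simpler within-line variance score for the paper's per-section PCA (that substitution, and all function names, are my assumptions): scan lines aligned with the crop rows pass through mostly one class, crop or soil, so their Hue variance is minimal.

```python
import numpy as np
from scipy.ndimage import rotate

def detect_row_orientation(hue, angles=range(0, 180, 5)):
    """Pick the angle whose scan lines (image rows after rotation)
    have the lowest mean within-line Hue variance."""
    best_angle, best_score = None, np.inf
    for a in angles:
        rot = rotate(hue, a, reshape=False, order=1, mode='nearest')
        score = rot.var(axis=1).mean()  # mean within-line variance
        if score < best_score:
            best_angle, best_score = a, score
    return best_angle

def row_profile(hue, angle):
    """Average Hue along each scan line at the detected orientation;
    the peaks and valleys correspond to crop and soil rows."""
    rot = rotate(hue, angle, reshape=False, order=1, mode='nearest')
    return rot.mean(axis=1)
```

Peaks in the returned profile can then be located with any 1D peak detector to count and position the crop rows.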


Author(s):  
R. A. Oliveira ◽  
E. Khoramshahi ◽  
J. Suomalainen ◽  
T. Hakala ◽  
N. Viljanen ◽  
...  

The use of drones and photogrammetric technologies is increasing rapidly in different applications. Currently, the drone processing workflow is in most cases based on sequential image acquisition and post-processing, but there is great interest in real-time solutions. Fast and reliable real-time drone data processing can benefit, for instance, environmental monitoring tasks in precision agriculture and forestry. Recent developments in miniaturized, low-cost inertial measurement systems and GNSS sensors, and real-time kinematic (RTK) position data, are offering new perspectives for comprehensive remote sensing applications. Combining these sensors with lightweight, low-cost multi- or hyperspectral frame sensors on drones provides the opportunity to create near-real-time or real-time remote sensing data of target objects. We have developed a system with direct georeferencing onboard the drone, to be used with hyperspectral frame cameras in real-time remote sensing applications. The objective of this study is to evaluate the real-time georeferencing by comparing it with post-processed solutions. Experimental data sets were captured with the system at agricultural and forested test sites. The accuracy of the onboard georeferencing data was better than 0.5 m. The results showed that real-time remote sensing is promising and feasible at both test sites.


Author(s):  
V. Manishankar ◽  
S. Harish ◽  
S. Lakshmanan ◽  
L.N. Selvan ◽  
R. Vinodhan

The use of Unmanned Aerial Systems (UAS) in precision agriculture applications has increased in the last three years. This is mainly due to the UAS capability to provide farmers with important information related to crop health for better input management, enabling continuous optimization of growing resources, an underlying concern for farmers. Furthermore, UAS are relatively cheap in comparison with manned aircraft or satellite-based systems, and they are also small and easy to use. All these facts promote the growing popularity of agricultural UAS. In this paper, an easy-to-implement, low-cost system is proposed for basic agriculture tasks, such as NDVI computation and crop imagery collection.
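NDVI, the index the proposed system computes, is a standard ratio of near-infrared and red reflectance: NDVI = (NIR − Red) / (NIR + Red), ranging from −1 to 1, with healthy vegetation typically well above 0. A minimal sketch (the band array names and the small epsilon guard against division by zero are my additions):

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index per pixel:
    NDVI = (NIR - Red) / (NIR + Red), in [-1, 1]."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)
```

Applied to co-registered NIR and red bands from the UAS camera, this yields an NDVI map in one vectorized operation.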


Smart agriculture plays a vital role in a country's economy and development. Automation in agriculture has led to the usage of autonomous robots for many field applications. In this scenario, remote monitoring of various field parameters is imperative for increased productivity. To demonstrate the usage of robots in agriculture, an environmental monitoring robot was designed in this work. The first part of the design collects soil moisture and temperature data from farmland, and the second part aggregates the data from the sensors. Finally, the sensor data are stored in the cloud for further processing and for decision making to control the actuator part. The overall system can increase crop productivity and reduce the wastage of resources.


Research on scene classification of remotely sensed images has shown significant improvement in recent years, as it is used in various applications such as urban planning, urban mapping, management of natural resources, precision agriculture, and target detection. The recent advancement of intelligent Earth observation systems has led to the generation of remote sensing images with high spatial, spectral, and temporal resolutions, which in turn has helped researchers raise the performance of Land Use Land Cover (LULC) classification techniques to a higher level. With the usage of different deep learning architectures and the availability of various high-resolution image datasets, the field of Remote Sensing Scene Classification of High-Resolution (RSSCHR) images has shown tremendous improvement in the past decade. In this paper, we present the publicly available datasets, various scene classification methods, and the future research scope of remotely sensed high-resolution images.


2015 ◽  
Vol 13 (34) ◽  
pp. 49-63 ◽  
Author(s):  
Liseth Viviana Campo Arcos ◽  
Juan Carlos Corrales Muñoz ◽  
Agapito Ledezma Espino

This paper presents a proposal for information gathering from crops by means of a low-cost quadcopter, the AR Drone 2.0. To achieve this, we designed a remote sensing system that addresses challenges identified in the present research, such as acquiring aerial photographs of an entire crop and AR Drone navigation over non-planar areas. The project is currently at an early stage of development. The first stage describes the platform and the hardware/software tools used to build the proposed prototype. The second stage characterizes experiments on the stability and altitude performance of the AR Drone's sensors, in order to design an altitude control strategy over non-flat crops. In addition, path planning algorithms based on shortest routes over graphs (Dijkstra, A*, and wavefront propagation) are evaluated with a simulated quadcopter; implementing the shortest-path algorithms is the first step toward full coverage of a crop. Observations of quadcopter behavior in the Gazebo simulator and in real tests demonstrate the viability of using the AR Drone as the platform of a remote sensing system for precision agriculture.
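Of the three shortest-path planners evaluated (Dijkstra, A*, wavefront propagation), Dijkstra on a grid map is the simplest to sketch. The implementation below is a generic, illustrative version over a 4-connected occupancy grid, not the authors' code:

```python
import heapq

def dijkstra(grid, start, goal):
    """Shortest 4-connected path on an occupancy grid
    (0 = free cell, 1 = obstacle). Returns the path as a list of
    (row, col) cells, or None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0}
    prev = {}
    pq = [(0, start)]
    while pq:
        d, cell = heapq.heappop(pq)
        if cell == goal:
            # Walk predecessor links back to the start.
            path = [cell]
            while cell in prev:
                cell = prev[cell]
                path.append(cell)
            return path[::-1]
        if d > dist.get(cell, float('inf')):
            continue  # stale queue entry
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1
                if nd < dist.get((nr, nc), float('inf')):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = cell
                    heapq.heappush(pq, (nd, (nr, nc)))
    return None
```

With unit edge costs this behaves like breadth-first search; A* would add a distance-to-goal heuristic to the queue priority, and wavefront propagation floods the same distances outward from the goal.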

