Unmanned Aerial System applications in construction

2020 ◽  
pp. 264-288
Author(s):  
Masoud Gheisari ◽  
Dayana Bastos Costa ◽  
Javier Irizarry
2014 ◽  
Author(s):  
Katie Feltman ◽  
F. Richard Ferraro ◽  
Kyle Bernhardt ◽  
Hannah Hill

2018 ◽  
Author(s):  
Mohammed S. Mayeed ◽  
Franklin Woods ◽  
Alexander Bryant

2019 ◽  
Vol 3 ◽  
pp. 1255
Author(s):  
Ahmad Salahuddin Mohd Harithuddin ◽  
Mohd Fazri Sedan ◽  
Syaril Azrad Md Ali ◽  
Shattri Mansor ◽  
Hamid Reza Jifroudi ◽  
...  

Unmanned aerial systems (UAS) have many advantages in the fields of surveillance and disaster management compared to space-borne observation, manned missions, and in situ methods, including cost effectiveness, operational safety, and mission efficiency. This has in turn underlined the importance of UAS technology and highlighted a growing need for more robust and efficient unmanned aerial vehicles to serve specific needs in surveillance and disaster management. This paper first gives an overview of the framework for surveillance, particularly in applications of border control and disaster management, and lists several phases of surveillance together with their service descriptions. Based on this overview and the surveillance phase descriptions, we show the areas and services in which UAS can have a significant advantage over traditional methods.


Author(s):  
Matthew B. Galles ◽  
Noah H. Schiller ◽  
Kasey A. Ackerman ◽  
Brett A. Newman

2020 ◽  
Vol 3 (2) ◽  
pp. 58-73
Author(s):  
Vijay Bhagat ◽  
Ajaykumar Kada ◽  
Suresh Kumar

An Unmanned Aerial System (UAS) is an efficient tool to bridge the gap between expensive satellite remote sensing, manned aerial surveys, and labor-intensive, time-consuming conventional fieldwork techniques of data collection. UAS can provide spatial data at very fine (up to a few mm) and desirable temporal resolution. Several studies have used vegetation indices (VIs) calculated from UAS-based optical and multispectral (MSS) datasets to model the parameters of biophysical units of the Earth's surface, applying different techniques of estimation, prediction, and classification. However, the results vary with the datasets and techniques used and appear highly site-specific. These existing approaches are not optimal or applicable in all cases and need to be tested according to sensor category and different geophysical and environmental conditions before global application. UAS remote sensing is a challenging and interesting area of research for sustainable land management.
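A minimal sketch of the kind of vegetation index computation the abstract refers to, using NDVI (one of the most common VIs) on hypothetical UAS reflectance rasters. The band names, array shapes, and values are assumptions for illustration, not data from the study:

```python
import numpy as np

# Hypothetical per-pixel reflectance from a UAS multispectral sensor,
# scaled to [0, 1]. Real UAS rasters would be much larger arrays.
red = np.array([[0.10, 0.12],
                [0.08, 0.30]])
nir = np.array([[0.45, 0.50],
                [0.40, 0.32]])

# NDVI = (NIR - Red) / (NIR + Red); healthy vegetation reflects strongly
# in NIR and absorbs red, so higher NDVI suggests denser/healthier cover.
ndvi = (nir - red) / (nir + red)
print(ndvi.round(3))
```

Because UAS imagery can reach millimetre-scale ground resolution, such indices can be mapped at far finer detail than satellite products, which is precisely why results tend to be site-specific.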


Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1369
Author(s):  
Hyojun Lee ◽  
Jiyoung Yoon ◽  
Min-Seong Jang ◽  
Kyung-Joon Park

To perform advanced operations with unmanned aerial vehicles (UAVs), components beyond the existing flight controller, network devices, and ground control station (GCS) must also be used. The inevitable addition of hardware and software to accomplish UAV operations may introduce security vulnerabilities through various attack vectors. Hence, we propose a security framework in this study to improve the security of an unmanned aerial system (UAS). The proposed framework operates in the robot operating system (ROS) and is designed around several considerations, such as the overhead arising from additional security elements and the security issues essential to flight missions. The UAS is operated in both nonnative and native ROS environments, and the performance of the proposed framework in both environments is verified through experiments.


2021 ◽  
Vol 13 (14) ◽  
pp. 2822
Author(s):  
Zhe Lin ◽  
Wenxuan Guo

An accurate stand count is a prerequisite to determining the emergence rate, assessing seedling vigor, and facilitating site-specific management for optimal crop production. Traditional manual counting methods in stand assessment are labor intensive and time consuming for large-scale breeding programs or production field operations. This study aimed to apply two deep learning models, MobileNet and CenterNet, to detect and count cotton plants at the seedling stage with unmanned aerial system (UAS) images. These models were trained with two datasets containing 400 and 900 images with variations in plant size and soil background brightness. The performance of these models was assessed with two testing datasets of different dimensions: testing dataset 1 with 300 by 400 pixels and testing dataset 2 with 250 by 1200 pixels. The model validation results showed that the mean average precision (mAP) and average recall (AR) were 79% and 73% for the CenterNet model, and 86% and 72% for the MobileNet model with 900 training images. The accuracy of cotton plant detection and counting was higher with testing dataset 1 for both the CenterNet and MobileNet models. The results showed that the CenterNet model had a better overall performance for cotton plant detection and counting with 900 training images. The results also indicated that more training images are required when applying object detection models to images with dimensions different from those of the training datasets. The mean absolute percentage error (MAPE), coefficient of determination (R2), and root mean squared error (RMSE) of the cotton plant counting were 0.07%, 0.98, and 0.37, respectively, with testing dataset 1 for the CenterNet model with 900 training images. Both the MobileNet and CenterNet models have the potential to detect and count cotton plants accurately and in a timely manner from high-resolution UAS images at the seedling stage.
This study provides valuable information for selecting the right deep learning tools and the appropriate number of training images for object detection projects in agricultural applications.
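The three counting metrics the study reports (MAPE, R2, RMSE) can be sketched as follows. The counts here are invented placeholders, not the study's data; only the metric definitions are standard:

```python
import numpy as np

# Hypothetical ground-truth vs. predicted plant counts per test image.
true_counts = np.array([50, 48, 52, 47, 51], dtype=float)
pred_counts = np.array([50, 47, 52, 48, 51], dtype=float)

# MAPE: mean absolute percentage error, in percent.
mape = np.mean(np.abs(true_counts - pred_counts) / true_counts) * 100

# RMSE: root mean squared error, in plants per image.
rmse = np.sqrt(np.mean((true_counts - pred_counts) ** 2))

# R^2: coefficient of determination (1 - residual / total variance).
ss_res = np.sum((true_counts - pred_counts) ** 2)
ss_tot = np.sum((true_counts - np.mean(true_counts)) ** 2)
r2 = 1 - ss_res / ss_tot

print(f"MAPE={mape:.2f}%  RMSE={rmse:.2f}  R2={r2:.2f}")
```

Low MAPE and RMSE with R2 near 1, as reported for the CenterNet model on testing dataset 1, indicate predicted counts that track the ground truth almost exactly.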

