Classification of Terrain Types in Unmanned Aerial Vehicle Images

Author(s):  
Inon Wiratsin ◽  
Veerapong Suchaiporn ◽  
Pojchara Trainorapong ◽  
Jirachaipat Chaichinvara ◽  
Sakwaroon Rattanajitdamrong ◽  
...  
Author(s):  
V. Y. Stepanov

The article gives a classification of the main components of unmanned aerial vehicle (UAV) systems and outlines the areas in which UAVs find practical application today. The UAV is then considered in more detail from the point of view of its flight dynamics: the equations needed to build a mathematical model are given, together with a model of the vehicle as an ordinary dynamic system treated as a non-stationary nonlinear controlled object. Next, the software developed for the modeling and its program algorithm are described. Finally, the conclusion outlines directions for further research.
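The abstract does not reproduce the model itself; as a sketch, a non-stationary nonlinear controlled object of the kind described above is commonly written in state-space form (the concrete right-hand sides for the UAV are the author's and are not given here):

```latex
\dot{\mathbf{x}}(t) = \mathbf{f}\bigl(\mathbf{x}(t), \mathbf{u}(t), t\bigr), \qquad
\mathbf{y}(t) = \mathbf{h}\bigl(\mathbf{x}(t), \mathbf{u}(t), t\bigr)
```

Here x(t) is the UAV state vector (e.g., position, velocity, attitude, angular rates), u(t) the control inputs, y(t) the observed outputs, and the explicit dependence on t reflects the non-stationarity of the object.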


Author(s):  
Caique Carvalho Medauar ◽  
Samuel de Assis Silva ◽  
Luis Carlos Cirilo Carvalho ◽  
Rafael Augusto Soares Tibúrcio ◽  
Paullo Augusto Silva Medauar

Currently, the efficiency of chemical weeding for controlling eucalyptus sprouts is measured by field sampling, but the inefficiency of these sampling methods has led to the investigation of new technologies, such as the use of unmanned aerial vehicles (UAVs) to help identify the vegetative vigor of eucalyptus after chemical weeding. This study therefore used aerial images obtained by a sensor-equipped UAV to identify the vegetative vigor and quantify the area occupied by eucalyptus sprouts 90 days after chemical weeding. The study was conducted in three fields planted with eucalyptus whose sprouts had previously been controlled by chemical weeding with the Scout® herbicide in November 2016. The vegetative vigor of the sprouts was evaluated from aerial images acquired by the UAV during flights in November 2016 and February 2017; these images were used to calculate the normalized difference vegetation index (NDVI), and a random sample grid was then constructed for each image, with supervised classification of the area (m²), to determine the percentage occupied by the sprouts. The chemical control method used neither eradicated the sprouts nor reduced the area they occupied. The NDVI and supervised classification tools made it possible to determine sprout health status and size with high precision, generating interpretable data for the different fields and evaluation periods. Processing the images obtained by the UAV thus provides a viable management alternative for evaluating sprout status in reforestation areas.
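As an illustration of the index used above, a minimal NDVI computation from co-registered near-infrared and red bands might look as follows (the band arrays, the 0.4 threshold, and pixel_area_m2 are hypothetical values for illustration, not taken from the study):

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    denom = nir + red
    # Guard against division by zero on pixels with no reflectance in either band.
    return np.where(denom == 0, 0.0, (nir - red) / denom)

# Hypothetical usage: nir_band and red_band are co-registered UAV image bands.
# sprout_mask = ndvi(nir_band, red_band) > 0.4      # threshold is illustrative only
# sprout_area_m2 = sprout_mask.sum() * pixel_area_m2
```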


2020 ◽  
Vol 12 (1) ◽  
pp. 146 ◽  
Author(s):  
Miao Liu ◽  
Tao Yu ◽  
Xingfa Gu ◽  
Zhensheng Sun ◽  
Jian Yang ◽  
...  

Fine classification of vegetation types has always been a focus and a difficulty in the application of remote sensing. Unmanned aerial vehicle (UAV) sensors and platforms have become important data sources in various application fields due to their high spatial resolution and flexibility. In particular, UAV hyperspectral images can play a significant role in the fine classification of vegetation types. However, it is not clear how ultrahigh-resolution UAV hyperspectral images perform in the fine classification of vegetation types in highly fragmented planting areas, or how variation in the spatial resolution of UAV images affects classification accuracy. Based on UAV hyperspectral images obtained from a commercial hyperspectral imaging sensor (S185) onboard a UAV platform, this paper examines the impact of spatial resolution on the classification of vegetation types in highly fragmented planting areas in southern China by aggregating a 0.025 m hyperspectral image to relatively coarse spatial resolutions (0.05, 0.1, 0.25, 0.5, 1, and 2.5 m). The object-based image analysis (OBIA) method was used, and the effects of several segmentation scale parameters and different numbers of features were examined. Classification accuracies from 84.3% to 91.3% were obtained for the multi-scale images. The results show that as spatial resolution decreases, the classification accuracy first fluctuates slightly and then declines steadily from the 0.5 m resolution onward. The best classification accuracy does not occur for the original image but at an intermediate resolution. The study also shows that the appropriate feature parameters vary across scales: as spatial resolution decreases, the importance of vegetation index features increases while that of textural features decreases, the appropriate segmentation scale gradually decreases, and the appropriate number of features is 30 to 40. It is therefore of vital importance to select appropriate feature parameters for images at different scales so as to ensure classification accuracy.
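A minimal sketch of the aggregation step described above, coarsening the 0.025 m hyperspectral cube by block-averaging (the array names and this NumPy-based approach are assumptions; the paper's actual resampling procedure may differ):

```python
import numpy as np

def aggregate(image: np.ndarray, factor: int) -> np.ndarray:
    """Coarsen a (rows, cols, bands) hyperspectral cube by averaging
    factor x factor pixel blocks, e.g. factor=2 turns 0.025 m into 0.05 m."""
    r, c, b = image.shape
    r, c = r - r % factor, c - c % factor            # crop to a multiple of factor
    blocks = image[:r, :c].reshape(r // factor, factor, c // factor, factor, b)
    return blocks.mean(axis=(1, 3))

# Hypothetical usage: cube is the 0.025 m S185 image as a NumPy array.
# coarse_005 = aggregate(cube, 2)    # 0.05 m
# coarse_010 = aggregate(cube, 4)    # 0.10 m
# coarse_025 = aggregate(cube, 10)   # 0.25 m
```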


Information ◽  
2021 ◽  
Vol 12 (1) ◽  
pp. 23
Author(s):  
Marzena Mięsikowska

The aim of this study was to perform discriminant analysis of voice commands in the presence of an unmanned aerial vehicle equipped with four rotating propellers, and to measure background sound levels and speech intelligibility. The measurements were taken under laboratory conditions both without and with the unmanned aerial vehicle present. Discriminant analysis of the speech commands (left, right, up, down, forward, backward, start, and stop) was performed on mel-frequency cepstral coefficients. Ten male speakers took part in the experiment. During the recordings, the unmanned aerial vehicle hovered at a height of 1.8 m, at a distance of 2 m from the speaker and 0.3 m above the measuring equipment. Discriminant analysis based on mel-frequency cepstral coefficients yielded a promising classification accuracy of 76.2% for the speech commands of the male speakers. The speech intelligibility and sound levels measured in the presence of the unmanned aerial vehicle did not preclude verbal communication with the unmanned aerial vehicle for male speakers.
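A minimal sketch of an MFCC-plus-linear-discriminant pipeline of the kind described above (librosa and scikit-learn are assumed here; the file layout, the number of coefficients, and the mean-pooling over frames are illustrative choices, not the study's exact settings):

```python
import numpy as np
import librosa
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

COMMANDS = ["left", "right", "up", "down", "forward", "backward", "start", "stop"]

def mfcc_features(wav_path: str, n_mfcc: int = 13) -> np.ndarray:
    """Fixed-length utterance descriptor: MFCCs averaged over all frames."""
    y, sr = librosa.load(wav_path, sr=None)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc)
    return mfcc.mean(axis=1)

# Hypothetical usage: `recordings` maps each command label to a list of wav files.
# X = np.array([mfcc_features(p) for cmd in COMMANDS for p in recordings[cmd]])
# y = np.array([cmd for cmd in COMMANDS for _ in recordings[cmd]])
# scores = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5)
# print(f"mean accuracy: {scores.mean():.1%}")
```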

