Correlation between Geochemical and Multispectral Patterns in an Area Severely Contaminated by Former Hg-As Mining

2020 ◽  
Vol 9 (12) ◽  
pp. 739
Author(s):  
Carlos Boente ◽  
Lorena Salgado ◽  
Emilio Romero-Macías ◽  
Arturo Colina ◽  
Carlos A. López-Sánchez ◽  
...  

In the context of soil pollution, plants suffer stress when exposed to extreme concentrations of potentially toxic elements (PTEs). The alterations to the plants caused by such stressors can be monitored by multispectral imagery in the form of vegetation indices, which can inform pollution management strategies. Here we combined geochemistry and remote sensing techniques to offer a preliminary soil pollution assessment of a vast abandoned spoil heap in the surroundings of the La Soterraña mining site (Asturias, Spain). To study the soil distribution of the PTEs over time, twenty-seven soil samples were randomly collected downstream of and around the main spoil heap. Furthermore, the area was covered by an unmanned aerial vehicle (UAV) carrying a high-resolution multispectral camera with four bands (red, green, red-edge and near infrared). Multielement analysis revealed mercury and arsenic as the principal pollutants. Two indices (from a database of 55 indices) correlated well with PTE concentrations: CARI2, with a Pearson coefficient (PC) of 0.89 for As concentrations >200 mg/kg, and NDVIg, with a PC of −0.67 for Hg concentrations >40 mg/kg. The combined approach helps predict the areas most susceptible to pollution, thus reducing the costs of geochemical campaigns.
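The screening step described above, correlating index values with element concentrations above a cut-off, can be sketched as follows. This is a minimal illustration only: the table is a placeholder for the 27 samples and their extracted index values, and the CARI2/NDVIg formulations themselves are defined in the paper, not here.

```python
import pandas as pd
from scipy.stats import pearsonr

# Placeholder table standing in for the geochemical samples and the
# per-sample vegetation-index values extracted from the UAV imagery.
samples = pd.DataFrame({
    "As_mg_kg": [120, 250, 310, 480, 90, 660, 205],
    "Hg_mg_kg": [15, 44, 52, 75, 9, 120, 41],
    "CARI2":    [0.31, 0.42, 0.47, 0.58, 0.28, 0.66, 0.40],
    "NDVIg":    [0.55, 0.48, 0.44, 0.37, 0.60, 0.30, 0.49],
})

def thresholded_correlation(df, element_col, index_col, threshold):
    """Pearson correlation restricted to samples above a concentration threshold."""
    subset = df[df[element_col] > threshold]
    r, p = pearsonr(subset[element_col], subset[index_col])
    return r, p, len(subset)

r_as, _, n_as = thresholded_correlation(samples, "As_mg_kg", "CARI2", 200)
r_hg, _, n_hg = thresholded_correlation(samples, "Hg_mg_kg", "NDVIg", 40)
print(f"As > 200 mg/kg: r = {r_as:.2f} (n = {n_as})")
print(f"Hg > 40 mg/kg:  r = {r_hg:.2f} (n = {n_hg})")
```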

2019 ◽  
Vol 11 (22) ◽  
pp. 2667 ◽  
Author(s):  
Jiang ◽  
Cai ◽  
Zheng ◽  
Cheng ◽  
Tian ◽  
...  

Commercially available digital cameras can be mounted on an unmanned aerial vehicle (UAV) for crop growth monitoring in open-air fields as a low-cost, highly effective observation system. However, few studies have investigated their potential for nitrogen (N) status monitoring, and the performance of camera-derived vegetation indices (VIs) under different conditions remains poorly understood. In this study, five commonly used VIs derived from normal color (RGB) images and two typical VIs derived from color near-infrared (CIR) images were used to estimate leaf N concentration (LNC). To explore the potential of digital cameras for monitoring LNC at all crop growth stages, two new VIs were proposed, namely, the true color vegetation index (TCVI) from RGB images and the false color vegetation index (FCVI) from CIR images. The relationships between LNC and the different VIs varied across growth stages. The commonly used VIs performed well at some stages, but the newly proposed TCVI and FCVI had the best performance at all stages. The performances of the VIs with red (or near-infrared) and green bands as the numerator were limited by saturation at intermediate to high LNCs (LNC > 3.0%), but the TCVI and FCVI were able to mitigate this saturation. The results of model validations further supported the superiority of the TCVI and FCVI for LNC estimation. Compared to the other VIs derived using RGB cameras, the relative root mean square errors (RRMSEs) of the TCVI were improved by 8.6% on average. For the CIR images, the best-performing VI for LNC was the FCVI (R² = 0.756, RRMSE = 14.18%). The LNC–TCVI and LNC–FCVI relationships were stable across different cultivars, N application rates, and planting densities. The results confirmed the applicability of UAV-based RGB and CIR cameras for crop N status monitoring under different conditions, which should assist the precision management of N fertilizers in agronomic practices.
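A minimal sketch of the calibration/validation workflow implied above follows: a camera-derived VI is regressed against measured LNC and evaluated with R² and RRMSE (RMSE expressed as a percentage of the mean observed LNC). The arrays are placeholders, and the TCVI/FCVI formulas are defined in the paper rather than reproduced here.

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score, mean_squared_error

# Placeholder calibration data: camera-derived VI values and measured LNC (%)
vi_cal = np.array([[0.21], [0.35], [0.42], [0.55], [0.61]])
lnc_cal = np.array([1.8, 2.4, 2.7, 3.2, 3.5])
vi_val, lnc_val = vi_cal, lnc_cal   # placeholder validation set

model = LinearRegression().fit(vi_cal, lnc_cal)
pred = model.predict(vi_val)

r2 = r2_score(lnc_val, pred)
rmse = np.sqrt(mean_squared_error(lnc_val, pred))
rrmse = 100 * rmse / lnc_val.mean()   # relative RMSE (%), as reported in the abstract
print(f"R2 = {r2:.3f}, RRMSE = {rrmse:.2f}%")
```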


2019 ◽  
Vol 2019 ◽  
pp. 1-16 ◽  
Author(s):  
Qian Sun ◽  
Lin Sun ◽  
Meiyan Shu ◽  
Xiaohe Gu ◽  
Guijun Yang ◽  
...  

Lodging is one of the main factors affecting the quality and yield of crops. Timely and accurate determination of the crop lodging grade is of great significance for the quantitative and objective evaluation of yield losses. The purpose of this study was to analyze the ability of multispectral imagery obtained by an unmanned aerial vehicle (UAV) to determine the maize lodging grade. The multispectral Parrot Sequoia camera is designed specifically for agricultural applications; its near-infrared band, which cannot be seen with the naked eye, enables a precise diagnosis of vegetation condition, making the acquired images an effective tool for analyzing plant health. Maize samples with different lodging grades were obtained by visual interpretation, and the spectral reflectance, texture feature parameters, and vegetation indices of the training samples were extracted. Different feature transformations were performed, texture features and vegetation indices were combined, and the resulting feature images were classified by maximum likelihood classification (MLC) to extract four lodging grades. Classification accuracy was evaluated using a confusion matrix based on the verification samples, and the features suitable for monitoring the maize lodging grade were screened. The results showed that, compared with classification of the original multispectral image, classification based on the principal components, the texture features, and the combination of texture features and vegetation indices improved accuracy to varying degrees. The overall accuracy of the combination of texture features and vegetation indices was 86.61%, with a Kappa coefficient of 0.8327, higher than that of the other features. Therefore, classification based on feature combinations derived from the UAV multispectral image is useful for monitoring maize lodging grades.
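The classification and evaluation steps named above (MLC on feature stacks, confusion matrix, Kappa) can be sketched as follows. Here a per-class Gaussian maximum-likelihood classifier is stood in for by scikit-learn's QuadraticDiscriminantAnalysis (equivalent up to the class priors), and the feature arrays are random placeholders for the texture-plus-VI stacks; this is an illustrative sketch, not the authors' implementation.

```python
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.metrics import confusion_matrix, cohen_kappa_score, accuracy_score

rng = np.random.default_rng(0)
X_train = rng.normal(size=(400, 6))      # placeholder: texture features + VIs per pixel
y_train = rng.integers(0, 4, size=400)   # placeholder: four lodging grades (0-3)
X_test = rng.normal(size=(100, 6))
y_test = rng.integers(0, 4, size=100)

# Gaussian maximum-likelihood classification of the verification samples
mlc = QuadraticDiscriminantAnalysis(store_covariance=True).fit(X_train, y_train)
y_pred = mlc.predict(X_test)

print(confusion_matrix(y_test, y_pred))
print(f"Overall accuracy: {accuracy_score(y_test, y_pred):.4f}")
print(f"Kappa coefficient: {cohen_kappa_score(y_test, y_pred):.4f}")
```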


OENO One ◽  
2015 ◽  
Vol 49 (2) ◽  
pp. 85 ◽  
Author(s):  
Rebecca Retzlaff ◽  
Daniel Molitor ◽  
Marc Behr ◽  
Christian Bossung ◽  
Gilles Rock ◽  
...  

Aims: The present investigation in a Luxembourgish vineyard aimed at evaluating the potential of multispectral, multi-angular UAS (unmanned aerial system) imagery to separate four soil management strategies, to predict physiological variables (chlorophyll, nitrogen, yield, etc.) and to follow seasonal changes in grapevine physiology in relation to soil management.
Methods and results: Multi-angular (nadir and 45° off-nadir) multispectral imagery (530-900 nm) was acquired in 2011 and 2012. Image grey values and reflectance-derived vegetation indices were computed, and canopy and vigour properties were monitored in the field. All four soil management strategies could be significantly discriminated (box plots, linear discriminant analysis) and vegetation properties estimated (linear regression) in 2011. For 2012, global models predicted chlorophyll contents and nitrogen balance index values with a cross-validated R² (R²cv) of 0.65 and 0.76, respectively.
Conclusions: Soil management strategies strongly affect plant vigour and reflectance. Differences were best detectable by oblique visible/near-infrared (Vis/nIR) UAS data of illuminated canopies.
Significance and impact of the study: UAS imaging is a flexible tool for applications in precision viticulture.
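The two analyses named in the methods, discrimination of the soil management strategies by linear discriminant analysis and cross-validated regression of physiological variables on image features, could be sketched along the following lines. All arrays are random placeholders for the plot-level imagery and field data; the sketch only illustrates the analysis pattern, not the study's actual feature set.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(80, 5))              # placeholder: per-plot grey values / VIs
strategy = rng.integers(0, 4, size=80)    # placeholder: four soil management strategies
chlorophyll = rng.normal(35, 5, size=80)  # placeholder: field-measured chlorophyll

# Discrimination of the soil management strategies
lda_acc = cross_val_score(LinearDiscriminantAnalysis(), X, strategy, cv=5).mean()
print(f"LDA cross-validated accuracy: {lda_acc:.2f}")

# Cross-validated R², analogous to the reported R²cv for the global models
r2_cv = cross_val_score(LinearRegression(), X, chlorophyll, cv=5, scoring="r2").mean()
print(f"Cross-validated R² for chlorophyll: {r2_cv:.2f}")
```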


2020 ◽  
Vol 12 (2) ◽  
pp. 317 ◽  
Author(s):  
Francisco-Javier Mesas-Carrascosa ◽  
Ana I. de Castro ◽  
Jorge Torres-Sánchez ◽  
Paula Triviño-Tarradas ◽  
Francisco M. Jiménez-Brenes ◽  
...  

Remote sensing applied in the digital transformation of agriculture and, more particularly, in precision viticulture offers methods to map field spatial variability to support site-specific management strategies; these can be based on crop canopy characteristics such as row height or vegetation cover fraction, which require accurate three-dimensional (3D) information. To derive canopy information, a set of dense 3D point clouds was generated using photogrammetric techniques on images acquired by an RGB sensor onboard an unmanned aerial vehicle (UAV) in two testing vineyards on two different dates. In addition to the geometry, each point also stores information from the RGB color model, which was used to discriminate between vegetation and bare soil. To the best of our knowledge, the methodology presented here, which links point clouds with their spectral information, had not previously been applied to automatically estimate vine height. The novelty of this work therefore lies in the application of color vegetation indices to point clouds for the automatic detection and classification of points representing vegetation, and in the subsequent determination of vine height using the heights of the points classified as soil as a reference. Results from on-ground measurements of the heights of individual grapevines were compared with the heights estimated from the UAV point cloud, showing high determination coefficients (R² > 0.87) and low root-mean-square error (0.070 m). This methodology offers new capabilities for the use of RGB sensors onboard UAV platforms as a tool for precision viticulture and digitizing applications.
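The point-cloud workflow described above can be sketched as follows: classify colored points as vegetation or soil with an RGB color vegetation index, then express vegetation heights relative to a soil reference. The abstract does not name the specific index, so the excess green index (ExG) and the 0.10 threshold below are illustrative assumptions, and the point cloud is synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)
# Placeholder photogrammetric point cloud: columns x, y, z (m), r, g, b (0-255)
points = np.column_stack([
    rng.uniform(0, 10, 500), rng.uniform(0, 10, 500),
    rng.uniform(0, 2, 500),
    rng.integers(0, 256, (500, 3)),
]).astype(float)

rgb = points[:, 3:6] / 255.0
total = rgb.sum(axis=1) + 1e-9
r, g, b = (rgb / total[:, None]).T      # chromatic coordinates
exg = 2 * g - r - b                     # excess green colour vegetation index (assumed choice)

vegetation = exg > 0.10                 # assumed classification threshold
soil = ~vegetation

# Vine height as elevation of vegetation points above the median soil elevation
soil_reference = np.median(points[soil, 2])
vine_heights = points[vegetation, 2] - soil_reference
print(f"Estimated maximum vine height: {vine_heights.max():.2f} m")
```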


2021 ◽  
Vol 13 (16) ◽  
pp. 3238
Author(s):  
Mirko Saponaro ◽  
Athos Agapiou ◽  
Diofantos G. Hadjimitsis ◽  
Eufemia Tarantino

The consolidation of unmanned aerial vehicle (UAV) photogrammetric techniques for campaigns at high and medium observation scales has triggered the development of new application areas. Most of these vehicles are equipped with common visible-band sensors capable of mapping areas of interest at various spatial resolutions. It is often necessary to identify vegetated areas for masking purposes during the postprocessing phase, for example to exclude them from digital elevation model (DEM) generation or for change detection purposes. Vegetation is usually extracted with sensors that capture the near-infrared part of the spectrum, which visible-band (RGB) cameras cannot record. In this study, after reviewing different visible-band vegetation indices in various environments using different UAV technology, the influence of the spatial resolution of orthomosaics generated by photogrammetric processes on vegetation extraction was examined. The triangular greenness index (TGI) provided a high level of separability between vegetation and nonvegetation areas for all case studies at any spatial resolution. The efficiency of the indices remained fundamentally linked to the context of the scenario under investigation, and the correlation between spatial resolution and index incisiveness was found to be more complex than might be trivially assumed.
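As a reference for the index highlighted above, a sketch of a TGI computation on an RGB orthomosaic follows. It uses the commonly cited formulation with nominal band centres of 670, 550 and 480 nm; whether the study used exactly these wavelengths is not stated in the abstract, and the image array and zero threshold are placeholders.

```python
import numpy as np

rng = np.random.default_rng(5)
# Placeholder 8-bit RGB orthomosaic (rows x cols x 3) standing in for UAV imagery
rgb = rng.integers(0, 256, size=(200, 200, 3)).astype(float)
red, green, blue = rgb[..., 0], rgb[..., 1], rgb[..., 2]

# TGI = -0.5 * [(λr - λb)(R - G) - (λr - λg)(R - B)], with λr=670, λg=550, λb=480 nm
tgi = -0.5 * ((670 - 480) * (red - green) - (670 - 550) * (red - blue))

vegetation_mask = tgi > 0           # assumed separability threshold
print(f"Fraction of pixels flagged as vegetation: {vegetation_mask.mean():.2%}")
```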


Drones ◽  
2019 ◽  
Vol 3 (1) ◽  
pp. 25 ◽  
Author(s):  
René Heim ◽  
Ian Wright ◽  
Peter Scarth ◽  
Angus Carnegie ◽  
Dominique Taylor ◽  
...  

Disease management in agriculture often assumes that pathogens are spread homogeneously across crops. In practice, pathogens can manifest in patches. Currently, disease detection is predominantly carried out by human assessors, which can be slow and expensive. A remote sensing approach holds promise. Current satellite sensors cannot spatially resolve individual plants or lack the temporal resolution to monitor pathogenesis. Here, we used multispectral imaging and unmanned aerial systems (UAS) to explore whether myrtle rust (Austropuccinia psidii) could be detected on a lemon myrtle (Backhousia citriodora) plantation. Multispectral aerial imagery was collected from fungicide-treated and untreated tree canopies, the fungicide being used to control myrtle rust. Spectral vegetation indices and single spectral bands were used to train a random forest classifier. Treated and untreated trees could be classified with high accuracy (95%). Important predictors for the classifier were the near-infrared (NIR) and red-edge (RE) spectral bands. Taking into account some limitations, which are discussed herein, our work suggests potential for mapping myrtle rust-related symptoms from aerial multispectral images. Similar studies could focus on pinpointing disease hotspots to adjust management strategies and to feed epidemiological models.
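A minimal sketch of the classification step described above: a random forest trained on per-canopy spectral bands and vegetation indices to separate treated from untreated trees, with feature importances used to inspect the contribution of the NIR and red-edge bands. The feature table and feature names below are placeholders, not the study's dataset.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(3)
feature_names = ["green", "red", "red_edge", "nir", "NDVI", "NDRE"]  # assumed feature set
X = rng.normal(size=(120, len(feature_names)))   # placeholder per-canopy features
y = rng.integers(0, 2, size=120)                 # 0 = untreated, 1 = fungicide-treated

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)

print(f"Accuracy: {accuracy_score(y_te, rf.predict(X_te)):.2f}")
for name, importance in sorted(zip(feature_names, rf.feature_importances_), key=lambda t: -t[1]):
    print(f"{name}: {importance:.3f}")
```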


Drones ◽  
2020 ◽  
Vol 4 (2) ◽  
pp. 27 ◽  
Author(s):  
Athos Agapiou

Red–green–blue (RGB) cameras attached to commercial unmanned aerial vehicles (UAVs) can support small-scale remote-observation campaigns by mapping an area of interest to within a few centimeters' accuracy. Vegetated areas need to be identified either for masking purposes (e.g., to exclude vegetated areas from the production of a digital elevation model (DEM)) or for monitoring vegetation anomalies, especially in precision agriculture applications. However, while the detection of vegetated areas is of great importance for several UAV remote sensing applications, this type of processing can be quite challenging. Usually, healthy vegetation is extracted in the near-infrared part of the spectrum (approximately 760–900 nm), which is not captured by visible (RGB) cameras. In this study, we explore several visible-band (RGB) vegetation indices in different environments using various UAV sensors and cameras to validate their performance. For this purpose, openly licensed UAV imagery was downloaded "as is" and analyzed. The green leaf index (GLI) was found to provide the optimum results for all case studies.
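For reference, the GLI named above is a purely visible-band index, GLI = (2G − R − B) / (2G + R + B), with positive values indicating green vegetation. The sketch below applies it to a placeholder RGB array; the positive-GLI threshold for masking is an assumption.

```python
import numpy as np

def green_leaf_index(red, green, blue):
    """GLI = (2G - R - B) / (2G + R + B); positive values indicate green vegetation."""
    return (2 * green - red - blue) / (2 * green + red + blue + 1e-9)

rng = np.random.default_rng(6)
rgb = rng.integers(0, 256, size=(100, 100, 3)).astype(float)  # placeholder RGB orthophoto
gli = green_leaf_index(rgb[..., 0], rgb[..., 1], rgb[..., 2])

vegetation_mask = gli > 0.0        # assumed threshold for masking vegetated pixels
print(f"Pixels flagged as vegetation: {vegetation_mask.mean():.2%}")
```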


2021 ◽  
Vol 13 (3) ◽  
pp. 536
Author(s):  
Eve Laroche-Pinel ◽  
Mohanad Albughdadi ◽  
Sylvie Duthoit ◽  
Véronique Chéret ◽  
Jacques Rousseau ◽  
...  

The main challenge encountered by Mediterranean winegrowers is water management. With climate change, drought events are becoming more intense each year, reducing yields; moreover, grape quality is affected and alcohol levels increase. Remote sensing data are a potential solution for measuring water status in vineyards. However, important questions remain open, such as which spectral, spatial, and temporal scales are suited to this task. This study uses hyperspectral measurements to investigate the spectral scale adapted to measuring vine water status. The final objective is to find out whether it would be possible to monitor vine water status with the spectral bands available in multispectral satellites such as Sentinel-2. Four Mediterranean vine plots with three grape varieties and different water status management systems are considered in the analysis. The results show the main spectral domains significantly related to vine water status (shortwave infrared, near infrared, and red-edge) and the best vegetation indices that combine these domains. These results offer promising perspectives for monitoring vine water status.
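As an illustration of how the highlighted domains could be combined at Sentinel-2 band positions, the sketch below forms a generic normalized-difference index from NIR (B8A, about 865 nm) and SWIR (B11, about 1610 nm) reflectance. This is only an illustrative, moisture-sensitive index consistent with the domains named above, not necessarily the specific index selected in the study; the reflectance values are placeholders.

```python
import numpy as np

def normalized_difference(band_a, band_b):
    """(a - b) / (a + b), with a small epsilon to avoid division by zero."""
    return (band_a - band_b) / (band_a + band_b + 1e-9)

b8a = np.array([0.42, 0.38, 0.35])   # placeholder NIR reflectance per vine plot
b11 = np.array([0.21, 0.24, 0.27])   # placeholder SWIR reflectance per vine plot

water_index = normalized_difference(b8a, b11)
print(water_index)   # lower values would suggest stronger water limitation
```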


2021 ◽  
Vol 13 (2) ◽  
pp. 233
Author(s):  
Ilja Vuorinne ◽  
Janne Heiskanen ◽  
Petri K. E. Pellikka

Biomass is a principal variable in crop monitoring and management and in assessing carbon cycling. Remote sensing combined with field measurements can be used to estimate biomass over large areas. This study assessed leaf biomass of Agave sisalana (sisal), a perennial crop whose leaves are grown for fibre production in tropical and subtropical regions. Furthermore, the residue from fibre production can be used to produce bioenergy through anaerobic digestion. First, biomass was estimated for 58 field plots using an allometric approach. Then, Sentinel-2 multispectral satellite imagery was used to model biomass in an 8851-ha plantation in semi-arid south-eastern Kenya. Generalised Additive Models were employed to explore how well biomass was explained by various spectral vegetation indices (VIs). The highest performance (explained deviance = 76%, RMSE = 5.15 Mg ha−1) was achieved with ratio and normalised difference VIs based on the green (R560), red-edge (R740 and R783), and near-infrared (R865) spectral bands. Heterogeneity of ground vegetation and resulting background effects seemed to limit model performance. The best performing VI (R740/R783) was used to predict plantation biomass that ranged from 0 to 46.7 Mg ha−1 (mean biomass 10.6 Mg ha−1). The modelling showed that multispectral data are suitable for assessing sisal leaf biomass at the plantation level and in individual blocks. Although these results demonstrate the value of Sentinel-2 red-edge bands at 20-m resolution, the difference from the best model based on green and near-infrared bands at 10-m resolution was rather small.
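The biomass-VI modelling step can be sketched as a smooth single-predictor fit of plot-level leaf biomass against the ratio index R740/R783. The study used Generalised Additive Models; in the sketch below a B-spline regression from scikit-learn stands in for the smooth term, and the 58-plot data are random placeholders.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(4)
vi = rng.uniform(0.85, 1.05, size=58).reshape(-1, 1)            # placeholder R740/R783 per plot
biomass = 200 * (vi.ravel() - 0.85) + rng.normal(0, 5, 58)      # placeholder leaf biomass (Mg/ha)

# Smooth fit of biomass on the vegetation index (GAM-style single smooth term)
model = make_pipeline(SplineTransformer(degree=3, n_knots=5), LinearRegression())
model.fit(vi, biomass)

rmse = np.sqrt(mean_squared_error(biomass, model.predict(vi)))
print(f"RMSE = {rmse:.2f} Mg/ha")
```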

