Improving Nitrogen Status Estimation in Malting Barley Based on Hyperspectral Reflectance and Artificial Neural Networks

Agronomy, 2021, Vol 11 (12), pp. 2592
Author(s): Karel Klem, Jan Křen, Ján Šimor, Daniel Kováč, Petr Holub, ...

Malting barley requires sensitive methods for estimating N status during the vegetation period, as inadequate N nutrition can significantly limit yield formation, while overfertilization often raises grain protein content above the limit for malting barley and also causes excessive lodging. We hypothesized that using the N nutrition index and N uptake combined with red-edge or green reflectance would provide extended linearity and higher accuracy in estimating N status across different years, genotypes, and sowing densities, and that the accuracy of N status estimation would be further improved by an artificial neural network based on multiple spectral reflectance wavelengths. Multifactorial field experiments on the interactive effects of N nutrition, sowing density, and genotype were conducted in 2011–2013 to develop methods for estimating N status and to reduce their dependency on changing environmental conditions, genotype, or barley management. The N nutrition index (NNI) and total N uptake were used to correct for the effects of biomass accumulation and N dilution during plant development. We employed an artificial neural network to integrate data from multiple reflectance wavelengths and thereby eliminate the effects of interfering factors such as genotype, sowing density, and year. Compared with N content, NNI and N uptake significantly reduced the interannual variation in their relationships to vegetation indices. The vegetation indices showing the best performance across years were mainly based on red-edge and carotenoid absorption bands. The use of an artificial neural network also significantly improved the estimation of all N status indicators, including N content. The critical reflectance wavelengths for neural network training were in the spectral bands 400–490, 530–570, and 710–720 nm. In summary, combining NNI or N uptake with a neural network increased the accuracy of N status estimation to up to 94%, compared to less than 60% for N concentration.
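
To make the approach concrete, below is a minimal sketch (not the authors' implementation) of training a small feed-forward network to predict NNI from reflectance restricted to the bands the study identifies as critical (400–490, 530–570, and 710–720 nm). The band sampling, network size, and the randomly generated placeholder data are illustrative assumptions.

# A minimal sketch, assuming scikit-learn; not the authors' code.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical canopy reflectance sampled every 10 nm within the critical bands.
bands = np.concatenate([np.arange(400, 500, 10),
                        np.arange(530, 580, 10),
                        np.arange(710, 730, 10)])
n_plots = 200
reflectance = rng.uniform(0.02, 0.55, size=(n_plots, bands.size))
nni = rng.uniform(0.4, 1.3, size=n_plots)  # placeholder NNI values, not real data

X_train, X_test, y_train, y_test = train_test_split(
    reflectance, nni, test_size=0.25, random_state=0)

# One small hidden layer is usually sufficient for a few dozen input wavelengths.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0))
model.fit(X_train, y_train)
print("R^2 on held-out plots:", model.score(X_test, y_test))

With real field data, the placeholder arrays would be replaced by measured canopy reflectance and laboratory-determined NNI or N uptake per plot.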

Sensors, 2019, Vol 19 (17), pp. 3712
Author(s): Lukas Prey, Urs Schmidhalter

Precise sensor-based, non-destructive estimation of crop nitrogen (N) status is essential for low-cost, objective optimization of N fertilization, as well as for early estimation of yield potential and N use efficiency. Several studies have assessed the performance of spectral vegetation indices (SVI) for winter wheat (Triticum aestivum L.), often either under conditions of low N status or across a wide range of the target traits N uptake (Nup), N concentration (NC), dry matter biomass (DM), and N nutrition index (NNI). This study aimed at a critical assessment of the estimation ability depending on the level of the target traits. It included seven years' data with nine measurement dates from early stem elongation until flowering in eight N regimes (0–420 kg N ha−1) for selected SVIs. Tested across years, a pronounced date-specific clustering was found, particularly for DM and NC. While for DM only the R900_970 gave moderate but saturated relationships (R2 = 0.47, p < 0.001) and no index was useful for NC across dates, NNI and Nup could be estimated better (REIP: R2 = 0.59, p < 0.001 for both traits). Tested within growth stages across N levels, the order of estimation quality of the traits was mostly Nup ≈ NNI > NC ≈ DM. Depending on the number (n = 1–3) and characteristics of the cultivars included, the relationships improved when testing within rather than across cultivars, with the weakest cultivar effect on the estimation of DM and the strongest on NC. To assess trait estimation under conditions of high to excessive N fertilization, the range of the target traits was divided into two intervals, with NNI values < 0.8 (interval 1: low N status) and NNI values > 0.8 (interval 2: high N status). Although better estimations were found in interval 1, useful relationships were also obtained in interval 2 from the best indices (DM: R780_740: average R2 = 0.35, RMSE = 567 kg ha−1; NC: REIP: average R2 = 0.40, RMSE = 0.25%; NNI: REIP: average R2 = 0.46, RMSE = 0.10; Nup: REIP: average R2 = 0.48, RMSE = 21 kg N ha−1). While in interval 1 all indices performed rather similarly, in interval 2 the three red edge-based indices were clearly better suited for the three N-related traits. The results are promising for applying SVIs also under conditions of high N status, aiming at detecting and avoiding excessive N use. While in canopies of lower N status the use of simple NIR/VIS indices may be sufficient without losing much precision, the red edge information appears crucial under conditions of higher N status. These findings can be transferred to the configuration and use of simpler multispectral sensors under conditions of contrasting N status in precision farming.
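
To illustrate the indices referred to above, the following is a minimal sketch of computing REIP, R780_740, and R900_970 from a canopy reflectance spectrum. The REIP is written in its common linear-interpolation form (after Guyot and Baret); the band positions and the synthetic spectrum are illustrative assumptions, not values from the study.

# A minimal sketch of spectral vegetation index computation, assuming NumPy.
import numpy as np

def reflectance_at(wavelengths, spectrum, target_nm):
    """Reflectance at the measured band closest to target_nm."""
    return spectrum[np.argmin(np.abs(wavelengths - target_nm))]

def vegetation_indices(wavelengths, spectrum):
    r = lambda nm: reflectance_at(wavelengths, spectrum, nm)
    # Red-edge inflection point by linear interpolation between 700 and 740 nm.
    reip = 700 + 40 * ((r(670) + r(780)) / 2 - r(700)) / (r(740) - r(700))
    return {
        "REIP": reip,                 # red-edge inflection point (nm)
        "R780_740": r(780) / r(740),  # red-edge simple ratio
        "R900_970": r(900) / r(970),  # NIR ratio often used for biomass/water
    }

# Example with a synthetic spectrum sampled every 2 nm (crude red-edge shape).
wl = np.arange(400, 1000, 2)
spec = 0.05 + 0.45 / (1 + np.exp(-(wl - 715) / 15))
print(vegetation_indices(wl, spec))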


2020, Vol 39 (6), pp. 8463-8475
Author(s): Palanivel Srinivasan, Manivannan Doraipandian

Rare event detection can be performed with spatial-domain and frequency-domain procedures. Footage from ubiquitous surveillance cameras is growing exponentially over time, and monitoring every event manually is impractical and time-consuming, so an automated rare event detection mechanism is required to keep the process manageable. In this work, a Context-Free Grammar (CFG) is developed for detecting rare events in a video stream, and an Artificial Neural Network (ANN) is used to train the CFG. A set of dedicated algorithms performs frame splitting, edge detection, and background subtraction, and converts the processed data into the CFG. The developed CFG is converted into nodes and edges to form a graph, which is fed to the input layer of an ANN to classify normal and rare event classes; the graph derived from the CFG for the input video stream is used to train the ANN. The performance of the developed Artificial Neural Network Based Context-Free Grammar – Rare Event Detection (ACFG-RED) is compared with existing techniques using metrics such as accuracy, precision, sensitivity, recall, average processing time, and average processing power. The ANN-CFG model achieved better values for these metrics than the other techniques, and the developed model provides an improved solution for detecting rare events in video streams.
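
As a rough illustration of the frame-level preprocessing mentioned above (frame splitting, background subtraction, edge detection), here is a minimal sketch using OpenCV. The CFG/graph construction and the ANN classifier are specific to the paper and are not reproduced; the video file name and the parameter values are assumptions for illustration only.

# A minimal per-frame preprocessing sketch, assuming OpenCV (cv2); not ACFG-RED itself.
import cv2

def extract_frame_features(video_path, max_frames=500):
    """Yield (foreground mask, edge map) pairs for each frame of the video."""
    capture = cv2.VideoCapture(video_path)
    background = cv2.createBackgroundSubtractorMOG2(history=200, varThreshold=32)
    count = 0
    while count < max_frames:
        ok, frame = capture.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        fg_mask = background.apply(frame)   # background subtraction
        edges = cv2.Canny(gray, 100, 200)   # edge detection
        yield fg_mask, edges
        count += 1
    capture.release()

# Usage with a hypothetical file: summarise foreground activity per frame.
for fg, ed in extract_frame_features("surveillance.mp4"):
    activity = (fg > 0).mean()  # fraction of pixels flagged as moving

In a full pipeline, per-frame features like these would be encoded into the grammar/graph representation that the classifier consumes.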

