Comparison of traditional method and triple collocation analysis for evaluation of multiple gridded precipitation products across Germany

Author(s):  
Zheng Duan ◽  
Edward Duggan ◽  
Cheng Chen ◽  
Hongkai Gao ◽  
Jianzhi Dong ◽  
...  

Evaluating the accuracy of precipitation products is essential for many applications. The traditional evaluation method is to calculate error metrics of products against gauge measurements that are considered ground truth. The multiplicative triple collocation (MTC) method has been demonstrated to be powerful for error quantification of precipitation products when the ground truth is not known. This study applied MTC to evaluate five precipitation products in Germany: two raw satellite-based products (CMORPH and PERSIANN), one reanalysis product (ERA-Interim), one soil moisture-based product (SM2RAIN-ASCAT), and one gauge-based product (REGNIE). Evaluation was performed at the 0.5°/daily spatiotemporal scale. MTC involves a log transformation of the data, which necessitates dealing with zero values in daily precipitation. The effects of 12 different strategies for handling zero values on MTC results were investigated. Seven different triplet combinations were tested to evaluate the stability of MTC. Results showed that different strategies for replacing zero values had considerable effects on MTC-derived error metrics, particularly the root mean squared error (RMSE). MTC with different triplet combinations generated different error metrics for individual products. The MTC-derived correlation coefficient (CC) was more reliable than RMSE. It is therefore more appropriate to use MTC to compare the relative accuracy of different precipitation products. Based on CC with unknown truth, MTC with different triplet combinations produced the same ranking of products as the traditional method. A comparison of results from MTC and the classic TC with an additive error model showed a potential limitation of MTC in arid areas or dry periods with a large proportion of zero daily precipitation.
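
The core of MTC can be sketched as follows: replace zeros with a small constant (one of the paper's substitution strategies; the value 0.01 here is illustrative, not the authors'), log-transform the three collocated series, and apply the classic covariance-based triple collocation estimators in log space. This is a minimal sketch of the general technique, not the paper's implementation.

```python
import math

def mtc_metrics(x, y, z, zero_fill=0.01):
    """Sketch of multiplicative triple collocation for the first
    product in a triplet: zeros are replaced by `zero_fill` (an
    illustrative choice), the data are log-transformed, and the
    standard covariance-based TC estimators are applied."""
    lx = [math.log(max(v, zero_fill)) for v in x]
    ly = [math.log(max(v, zero_fill)) for v in y]
    lz = [math.log(max(v, zero_fill)) for v in z]

    def cov(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        return sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b)) / n

    Qxx, Qyz = cov(lx, lx), cov(ly, lz)
    Qxy, Qxz = cov(lx, ly), cov(lx, lz)

    # Error variance of x (log space) and its correlation with the
    # unknown truth, from the classic TC covariance identities.
    err_var_x = Qxx - Qxy * Qxz / Qyz
    cc_x = math.sqrt(Qxy * Qxz / (Qxx * Qyz))
    return math.sqrt(max(err_var_x, 0.0)), cc_x
```

Swapping the roles of x, y, and z yields the metrics for the other two products in the triplet.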

Drones ◽  
2021 ◽  
Vol 5 (2) ◽  
pp. 37
Author(s):  
Bingsheng Wei ◽  
Martin Barczyk

We consider the problem of vision-based detection and ranging of a target UAV using the video feed from a monocular camera onboard a pursuer UAV. Our previously published work in this area employed a cascade classifier algorithm to locate the target UAV, which was found to perform poorly in complex background scenes. We thus study the replacement of the cascade classifier algorithm with newer machine learning-based object detection algorithms. Five candidate algorithms are implemented and quantitatively tested in terms of their efficiency (measured as frames per second processing rate), accuracy (measured as the root mean squared error between ground truth and detected location), and consistency (measured as mean average precision) in a variety of flight patterns, backgrounds, and test conditions. Assigning relative weights of 20%, 40% and 40% to these three criteria, we find that when flying over a white background, the top three performers are YOLO v2 (76.73 out of 100), Faster RCNN v2 (63.65 out of 100), and Tiny YOLO (59.50 out of 100), while over a realistic background, the top three performers are Faster RCNN v2 (54.35 out of 100), SSD MobileNet v1 (51.68 out of 100) and SSD Inception v2 (50.72 out of 100), leading us to recommend Faster RCNN v2 overall. We then provide a roadmap for further work in integrating the object detector into our vision-based UAV tracking system.
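
The 20%/40%/40% weighting can be expressed as a simple weighted sum over the three criterion scores. How each raw measurement (fps, RMSE, mAP) is normalised onto a 0-100 scale is an assumption of this sketch, not taken from the paper.

```python
def weighted_score(efficiency, accuracy, consistency,
                   weights=(0.20, 0.40, 0.40)):
    """Combine three criterion scores (each already normalised to a
    0-100 scale) using the paper's 20%/40%/40% weighting."""
    w_e, w_a, w_c = weights
    return w_e * efficiency + w_a * accuracy + w_c * consistency

# e.g. a detector scoring 90 on efficiency, 70 on accuracy, 75 on consistency
score = weighted_score(90, 70, 75)  # 0.2*90 + 0.4*70 + 0.4*75 = 76.0
```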


2010 ◽  
Vol 1 (4) ◽  
pp. 17-45
Author(s):  
Antons Rebguns ◽  
Diana F. Spears ◽  
Richard Anderson-Sprecher ◽  
Aleksey Kletsov

This paper presents a novel theoretical framework for swarms of agents. Before deploying a swarm for a task, it is advantageous to predict whether a desired percentage of the swarm will succeed. The authors present a framework that uses a small group of expendable “scout” agents to predict the success probability of the entire swarm, thereby preventing many agent losses. The scouts apply one of two formulas to predict – the standard Bernoulli trials formula or the new Bayesian formula. For experimental evaluation, the framework is applied to simulated agents navigating around obstacles to reach a goal location. Extensive experimental results compare the mean-squared error of the predictions of both formulas with ground truth, under varying circumstances. Results indicate the accuracy and robustness of the Bayesian approach. The framework also yields an intriguing result, namely, that both formulas usually predict better in the presence of (Lennard-Jones) inter-agent forces than when their independence assumptions hold.
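
The standard Bernoulli-trials predictor mentioned above can be sketched in a few lines: estimate the per-agent success probability from the scouts, then compute a binomial tail probability for the swarm. This is a generic sketch under the independence assumption the abstract discusses; the paper's Bayesian alternative would instead average over a posterior on the success probability.

```python
from math import comb

def predict_swarm_success(k_scouts, n_scouts, swarm_size, required):
    """Standard Bernoulli-trials predictor (sketch): estimate the
    per-agent success probability p from the scouts, then return the
    probability that at least `required` of `swarm_size` agents
    succeed, assuming independent agents."""
    p = k_scouts / n_scouts
    return sum(comb(swarm_size, j) * p**j * (1 - p)**(swarm_size - j)
               for j in range(required, swarm_size + 1))
```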


Author(s):  
Mehdi Azarafza ◽  
Mohammad Azarafza ◽  
Jafar Tanha

Since December 2019, the coronavirus disease (COVID-19) outbreak that began in China has infected more than 4,666,000 people and caused thousands of deaths. Unfortunately, infection numbers and deaths are still increasing rapidly, which has pushed the world to the edge of catastrophe. Artificial intelligence and spatiotemporal distribution techniques can play a key role in forecasting infections at the national and province levels in many countries. As its methodology, the present study employs long short-term memory (LSTM)-based deep learning for time series forecasting of confirmed cases at both the national and province levels in Iran. The data were collected from February 19 to March 22, 2020 at the provincial level and from February 19 to May 13, 2020 at the national level from nationally recognised sources. For comparison, we use recurrent neural network, seasonal autoregressive integrated moving average, Holt-Winters exponential smoothing, and moving average approaches. Furthermore, the mean absolute error, mean squared error, and mean absolute percentage error metrics are used as evaluation factors, together with trend analysis. The results of our experiments show that the LSTM model performed better than the other methods on the collected COVID-19 dataset for Iran.
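
The three evaluation metrics named in the abstract have standard definitions, sketched below with a generic implementation (not the authors' code):

```python
def forecast_errors(actual, predicted):
    """Mean absolute error (MAE), mean squared error (MSE) and mean
    absolute percentage error (MAPE) for a forecast, as used to
    compare the models in the abstract."""
    n = len(actual)
    mae = sum(abs(a - p) for a, p in zip(actual, predicted)) / n
    mse = sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n
    # MAPE assumes no zero values in `actual`
    mape = 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / n
    return mae, mse, mape
```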


2020 ◽  
Author(s):  
Noah J. Goodall ◽  
Brian L. Smith ◽  
B. Brian Park

The introduction of mobile sensors, i.e. probe vehicles with GPS-enabled smartphones or connected vehicle technology, will potentially provide more comprehensive information on roadway conditions than conventional point detection alone. Several mobility applications have been proposed that utilize this new vehicle-specific data rather than aggregated speed, density, and flow. Because of the bandwidth limitations of cellular networks and an expected slow deployment of connected vehicles, only a portion of vehicles on the roadway will be able to report their positions at any given time. This paper proposes a novel technique to analyze the behavior of freeway vehicles equipped with GPS receivers and accelerometers in order to estimate the quantity, locations, and speeds of those vehicles that do not have similar equipment. If an equipped vehicle deviates significantly from a car-following model’s expected behavior, the deviation is assumed to be the result of an interaction with an unequipped vehicle (i.e. an undetectable “ghost” vehicle). This unequipped vehicle is then inserted into a rolling estimation of individual vehicle movements. Because this technique depends on vehicles interacting during congestion, a second scenario uses an upstream detector to detect and insert unequipped vehicles at the point of detection, essentially “seeding” the network. An evaluation using the NGSIM US-101 dataset shows realistic vehicle density estimations during and immediately after congestion. Introducing an upstream detector to supply initial locations of unequipped vehicles improves accuracy in free-flow conditions, improving the root mean squared error of the number of vehicles within a 120-foot cell from 3.8 vehicles without a detector to 2.4 vehicles with a detector, as compared to ground truth.
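
The trigger for inserting a ghost vehicle can be sketched as a simple deviation test between observed and model-predicted accelerations. The 1.0 m/s² threshold here is illustrative, not the paper's calibrated value, and the car-following model supplying the expected accelerations is left abstract.

```python
def deviation_flags(observed_accels, expected_accels, threshold=1.0):
    """Flag time steps where an equipped vehicle's observed
    acceleration deviates from a car-following model's expectation by
    more than `threshold` m/s^2; in the paper's scheme such a
    deviation triggers insertion of an unequipped 'ghost' vehicle."""
    return [abs(obs - exp) > threshold
            for obs, exp in zip(observed_accels, expected_accels)]
```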


2012 ◽  
Vol 61 (2) ◽  
pp. 277-290 ◽  
Author(s):  
Ádám Csorba ◽  
Vince Láng ◽  
László Fenyvesi ◽  
Erika Michéli

Nowadays there is a growing demand for the development and application of technologies and methods that allow rapid, cost-effective and environmentally friendly soil data collection and evaluation. Reflectance spectroscopy meets these needs; it is based on reflectance measurements in the visible (VIS) and near-infrared (NIR) range (350–2500 nm) of the electromagnetic spectrum. Considering that the reflectance spectrum of soils is very rich in information, and that in the investigated range many soil constituents have characteristic spectral “fingerprints”, a single curve makes it possible to determine a large number of key soil parameters simultaneously. In this paper we present the first steps of a methodological development, built on reflectance spectroscopy, aimed at determining the composition of soils. We constructed and tested predictive models based on multivariate statistical methods (partial least squares regression, PLSR) for estimating the organic carbon and CaCO3 content of soils. Testing of the models showed that the procedure gave high R2 values for both soil parameters [R2(organic carbon) = 0.815; R2(CaCO3) = 0.907]. The root mean squared error (RMSE), which indicates the accuracy of the estimates, was moderate for both parameters [RMSE(organic carbon) = 0.467; RMSE(CaCO3) = 3.508], and can be substantially improved by standardizing the reflectance measurement protocols. Based on our investigations we conclude that the combined application of reflectance spectroscopy and multivariate chemometric methods yields a rapid and cost-effective method of data collection and evaluation.
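
The two statistics the abstract reports for the PLSR calibration models, R2 and RMSE, have standard definitions; the sketch below is a generic implementation, not the authors' code.

```python
import math

def r2_rmse(measured, predicted):
    """Coefficient of determination (R^2) and root mean squared
    error (RMSE) between laboratory-measured and spectrally
    predicted soil parameter values."""
    n = len(measured)
    mean_m = sum(measured) / n
    ss_res = sum((m - p) ** 2 for m, p in zip(measured, predicted))
    ss_tot = sum((m - mean_m) ** 2 for m in measured)
    return 1 - ss_res / ss_tot, math.sqrt(ss_res / n)
```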


Author(s):  
Nadia Hashim Al-Noor ◽  
Shurooq A.K. Al-Sultany

In real situations, observations and measurements are often not exact numbers but more or less imprecise, also called fuzzy. In this paper, we use approximate non-Bayesian computational methods to estimate the inverse Weibull parameters and reliability function from fuzzy data. The maximum likelihood and moment estimates are obtained as non-Bayesian estimates. The maximum likelihood estimators are derived numerically using two iterative techniques, namely the “Newton-Raphson” and “Expectation-Maximization” techniques. In addition, the estimates of the parameters and of the reliability function are compared numerically through a Monte-Carlo simulation study in terms of their mean squared error and integrated mean squared error values, respectively.
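
For crisp (non-fuzzy) data, maximum likelihood estimation for the inverse Weibull can be sketched compactly: the scale parameter has a closed-form profile solution, leaving a one-dimensional search over the shape. The sketch below uses a simple ternary search on the profile log-likelihood rather than the paper's Newton-Raphson or Expectation-Maximization machinery, and the search bounds are illustrative.

```python
import math

def inv_weibull_mle(data, lo=0.05, hi=20.0, iters=100):
    """Sketch of ML estimation for the inverse Weibull density
    f(x) = b * a**b * x**-(b+1) * exp(-(a/x)**b), x > 0.
    The scale a is profiled out in closed form via
    a**b = n / sum(x**-b); the shape b maximises the resulting
    profile log-likelihood, found here by ternary search."""
    n = len(data)
    S = sum(math.log(x) for x in data)

    def profile_ll(b):
        t = sum(x ** (-b) for x in data)
        # With a profiled out, sum((a/x)**b) reduces to n.
        return n * math.log(b) + n * math.log(n / t) - (b + 1) * S - n

    for _ in range(iters):
        m1 = lo + (hi - lo) / 3
        m2 = hi - (hi - lo) / 3
        if profile_ll(m1) < profile_ll(m2):
            lo = m1
        else:
            hi = m2
    b_hat = (lo + hi) / 2
    a_hat = (n / sum(x ** (-b_hat) for x in data)) ** (1 / b_hat)
    return a_hat, b_hat
```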


2014 ◽  
Vol 2 (2) ◽  
pp. 47-58
Author(s):  
Ismail Sh. Baqer

A two-level image quality enhancement is proposed in this paper. In the first level, the Dualistic Sub-Image Histogram Equalization (DSIHE) method decomposes the original image into two sub-images based on the median of the original image. The second level deals with spike-shaped noise that may appear in the image after processing. We present three methods of image enhancement, GHE, LHE and the proposed DSIHE, that improve the visual quality of images. Comparative calculations are carried out on the above-mentioned techniques to examine objective and subjective image quality parameters, e.g. Peak Signal-to-Noise Ratio (PSNR), entropy H and mean squared error (MSE), to measure the quality of grayscale enhanced images. For handling gray-level images, conventional histogram equalization methods such as GHE and LHE tend to shift the mean brightness of an image to the middle of the gray-level range, limiting their appropriateness for contrast enhancement in consumer electronics such as TV monitors. The DSIHE method seems to overcome this disadvantage as it tends to preserve both brightness and contrast enhancement. Experimental results show that the proposed technique gives better results in terms of discrete entropy, signal-to-noise ratio and mean squared error values than the global and local histogram-based equalization methods.
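
The GHE baseline mentioned above is a standard operation: build the image histogram, form the cumulative distribution, and map each gray level through it. The sketch below is a generic implementation of global histogram equalization for a small grayscale image given as a list of rows, not the paper's code; DSIHE would apply the same machinery separately to the two median-split sub-images.

```python
def global_hist_equalize(image, levels=256):
    """Global histogram equalization (GHE): remap gray levels through
    the normalised cumulative histogram of the image."""
    flat = [p for row in image for p in row]
    n = len(flat)
    hist = [0] * levels
    for p in flat:
        hist[p] += 1
    # Cumulative distribution function of gray levels
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total / n)
    # Look-up table stretching the CDF over the full gray range
    lut = [round((levels - 1) * c) for c in cdf]
    return [[lut[p] for p in row] for row in image]
```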


Geosciences ◽  
2020 ◽  
Vol 10 (9) ◽  
pp. 329
Author(s):  
Mahdi O. Karkush ◽  
Mahmood D. Ahmed ◽  
Ammar Abdul-Hassan Sheikha ◽  
Ayad Al-Rumaithi

The current study involves placing 135 boreholes drilled to a depth of 10 m below the existing ground level. Three standard penetration tests (SPT) are performed at depths of 1.5, 6, and 9.5 m for each borehole. To produce thematic maps with coordinates and depths for the bearing capacity variation of the soil, a numerical analysis was conducted using MATLAB software. Although interpolation polynomials of several orders were tried to estimate the bearing capacity of soil, the first-order polynomial was the best among them due to its simplicity and fast calculation. Additionally, the root mean squared error (RMSE) was almost the same for all of the tried models. The results of the study can be summarized by the production of thematic maps showing the variation of the bearing capacity of the soil over the whole area of Al-Basrah city at several depths. The bearing capacity of soil obtained from the suggested first-order polynomial matches well with that calculated from the results of SPTs, with a deviation of ±30% at a 95% confidence interval.
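
A first-order polynomial over map coordinates is simply a plane, q = a + b*x + c*y, fitted by least squares. The sketch below solves the 3x3 normal equations directly and reports the fit's RMSE; it is a generic illustration of the approach, not the authors' MATLAB code.

```python
import math

def fit_plane(points):
    """Least-squares fit of q = a + b*x + c*y to (x, y, q) triples
    by solving the normal equations with Gaussian elimination."""
    n = len(points)
    sx = sum(p[0] for p in points); sy = sum(p[1] for p in points)
    sxx = sum(p[0] ** 2 for p in points); syy = sum(p[1] ** 2 for p in points)
    sxy = sum(p[0] * p[1] for p in points)
    sq = sum(p[2] for p in points)
    sxq = sum(p[0] * p[2] for p in points); syq = sum(p[1] * p[2] for p in points)
    # Augmented matrix for (A^T A) beta = A^T q
    A = [[n, sx, sy, sq],
         [sx, sxx, sxy, sxq],
         [sy, sxy, syy, syq]]
    for i in range(3):                      # forward elimination with pivoting
        piv = max(range(i, 3), key=lambda r: abs(A[r][i]))
        A[i], A[piv] = A[piv], A[i]
        for r in range(i + 1, 3):
            f = A[r][i] / A[i][i]
            A[r] = [ar - f * ai for ar, ai in zip(A[r], A[i])]
    beta = [0.0, 0.0, 0.0]
    for i in (2, 1, 0):                     # back substitution
        beta[i] = (A[i][3] - sum(A[i][j] * beta[j]
                                 for j in range(i + 1, 3))) / A[i][i]
    return beta  # (a, b, c)

def plane_rmse(points, beta):
    """RMSE of the fitted plane against the observed values."""
    a, b, c = beta
    return math.sqrt(sum((q - (a + b * x + c * y)) ** 2
                         for x, y, q in points) / len(points))
```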

