The UAE Cloud Seeding Program: A Statistical and Physical Evaluation

Atmosphere ◽  
2021 ◽  
Vol 12 (8) ◽  
pp. 1013
Author(s):  
Taha Al Hosari ◽  
Abdulla Al Mandous ◽  
Youssef Wehbe ◽  
Abdeltawab Shalaby ◽  
Noor Al Shamsi ◽  
...  

Operational cloud seeding programs have been increasingly deployed in several countries to augment natural rainfall amounts, particularly over water-scarce and arid regions. However, evaluating operational programs by quantifying seeding impacts remains a challenging task subject to complex uncertainties. In this study, we investigate seeding impacts using both long-term rain gauge records and event-based weather radar retrievals within the framework of the United Arab Emirates (UAE) National Center of Meteorology’s operational cloud seeding program. First, seasonal rain gauge records are inter-compared between unseeded (1981–2002) and seeded (2003–2019) periods, after which a posteriori target/control regression is developed to decouple natural and seeded rainfall time series. Next, trend analyses and change point detection are carried out over the July–October seeding periods using the modified Mann-Kendall (mMK) test and the Cumulative Sum (CUSUM) method, respectively. Results indicate an average increase of 23% in annual surface rainfall over the seeded target area, along with statistically significant change points detected during 2011 with decreasing/increasing rainfall trends for pre-/post-change point periods, respectively. In contrast, rain gauge records over the control (non-seeded) area show non-significant change points. In line with the gauge-based statistical findings, a physical analysis using an archive of seeded (65) and unseeded (87) storms shows enhancements in radar-based storm properties within 15–25 min of seeding. The largest increases are recorded in storm volume (159%), area cover (72%), and lifetime (65%). The work provides new insights for assessing long-term seeding impacts and has significant implications for policy- and decision-making related to cloud seeding research and operational programs in arid regions.
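
A minimal sketch of the two statistical tools named above, applied to synthetic annual rainfall totals rather than the UAE gauge records: a classic Mann-Kendall trend statistic (the paper's modified version additionally corrects the variance for serial correlation, which is omitted here) and a CUSUM-style locator for the most likely change point.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
# Hypothetical annual rainfall totals (mm); the study itself uses 1981-2019 gauge records.
rain = np.concatenate([rng.normal(80, 15, 30), rng.normal(95, 15, 9)])

def mann_kendall(x):
    """Classic Mann-Kendall trend test (no autocorrelation correction, no tie handling)."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    return z, 2 * (1 - norm.cdf(abs(z)))           # two-sided p-value

def cusum_change_point(x):
    """Most likely mean-shift location: extremum of the cumulative sum of deviations from the mean."""
    c = np.cumsum(x - x.mean())
    return int(np.argmax(np.abs(c)))

z, p = mann_kendall(rain)
print(f"Mann-Kendall z = {z:.2f}, p = {p:.3f}; CUSUM change point after index {cusum_change_point(rain)}")
```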

Water ◽  
2021 ◽  
Vol 13 (12) ◽  
pp. 1633
Author(s):  
Elena-Simona Apostol ◽  
Ciprian-Octavian Truică ◽  
Florin Pop ◽  
Christian Esposito

Due to the exponential growth of Internet of Things networks and the massive amount of time series data collected from them, it is essential to apply efficient Big Data analysis methods in order to extract meaningful information and statistics. Anomaly detection is an important part of time series analysis, improving the quality of further analysis, such as prediction and forecasting. Thus, detecting sudden change points that still reflect normal behavior, and using them to discriminate genuinely abnormal behavior, i.e., outliers, is a crucial step for minimizing the false positive rate and building accurate machine learning models for prediction and forecasting. In this paper, we propose a rule-based decision system that enhances anomaly detection in multivariate time series using change point detection. Our architecture uses a pipeline that automatically detects real anomalies and removes the false positives introduced by change points. We employ both traditional and deep learning unsupervised algorithms: in total, five anomaly detection and five change point detection algorithms. Additionally, we propose a new confidence metric based on the support for a time series point to be an anomaly and the support for the same point to be a change point. In our experiments, we use a large real-world dataset containing multivariate time series of water consumption collected from smart meters. As an evaluation metric, we use Mean Absolute Error (MAE). The low MAE values show that the algorithms accurately determine anomalies and change points. The experimental results strengthen our assumption that anomaly detection can be improved by determining and removing change points, and validate the correctness of our proposed rules in real-world scenarios. Furthermore, the proposed rule-based decision support system enables users to make informed decisions regarding the status of the water distribution network and to perform predictive and proactive maintenance effectively.
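
A toy illustration, not the authors' pipeline, of the core idea of combining anomaly support with change-point support: points flagged as anomalous are retained only when they are not also well supported as change points. The support functions, thresholds, and confidence score below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic smart-meter-like series: one level shift (a change point) plus two isolated spikes (outliers).
x = np.concatenate([rng.normal(10, 1, 100), rng.normal(14, 1, 100)])
x[[40, 150]] += 15

def anomaly_support(x, k=3.5):
    """Support in [0, 1] from a robust (median/MAD) z-score scaled by a cutoff k."""
    mad = np.median(np.abs(x - np.median(x))) + 1e-9
    z = 0.6745 * (x - np.median(x)) / mad
    return np.clip(np.abs(z) / k, 0, 1)

def changepoint_support(x, w=10):
    """Support in [0, 1] for a level shift: normalised mean difference of windows before/after each point."""
    s = np.zeros(len(x))
    for t in range(w, len(x) - w):
        s[t] = abs(x[t:t + w].mean() - x[t - w:t].mean()) / (x.std() + 1e-9)
    return s / s.max()

a, c = anomaly_support(x), changepoint_support(x)
confidence = a * (1 - c)   # high only when a point is anomalous and not explained by a change point
print("retained anomalies at indices:", np.where(confidence > 0.5)[0])
```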


2019 ◽  
Vol 58 (10) ◽  
pp. 2177-2196 ◽  
Author(s):  
Yingxian Zhang ◽  
Yuyu Ren ◽  
Guoyu Ren ◽  
Guofu Wang

Abstract Typical rain gauge measurements have long been recognized to underestimate actual precipitation. Long-term daily precipitation records during 1961–2013 from a dense national network of 2379 gauges were corrected to remove systematic errors caused by trace precipitation, wetting losses, and wind-induced undercatch. The correction percentage was higher in cold seasons and lower in warm seasons. Both trace precipitation and wetting loss corrections were more important in arid regions than in wet regions. A greater correction percentage for wind-induced error was found in cold and arid regions, as well as in high wind speed areas. Generally, the annual precipitation amounts as well as the annual precipitation intensity increased to varying degrees after bias correction, with the maximum percentage being about 35%. More importantly, the bias-corrected snowfall and rainstorm amounts increased markedly, by more than 50% and 18%, respectively. Notably, the total number of actual rainstorm days during the past 53 years could be about 90 more than observed in some coastal areas of China. Therefore, the actual amounts of precipitation, snowfall, and intense rainfall were much higher than previously measured over China. Bias correction is thus needed to obtain accurate estimates of precipitation amounts and precipitation intensity.
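
A schematic of the kind of per-day correction described above; all coefficients, catch-ratio formulas, and loss terms are illustrative placeholders, since the real corrections depend on gauge type, precipitation phase, and site wind exposure.

```python
def correct_daily_precip(measured_mm, wind_ms, is_snow, trace_reported=False,
                         wetting_loss_mm=0.2, trace_amount_mm=0.1):
    """Bias-correct one daily gauge reading; every coefficient here is an illustrative placeholder.

    corrected = (measured + wetting loss) / catch_ratio (+ trace amount on trace-only days),
    where catch_ratio < 1 models wind-induced undercatch and is lower for snow and high wind.
    """
    if is_snow:
        catch_ratio = max(0.3, 1.0 - 0.08 * wind_ms)   # snow undercatch grows quickly with wind speed
    else:
        catch_ratio = max(0.7, 1.0 - 0.02 * wind_ms)   # rain undercatch grows slowly with wind speed
    corrected = (measured_mm + (wetting_loss_mm if measured_mm > 0 else 0.0)) / catch_ratio
    if trace_reported and measured_mm == 0:
        corrected += trace_amount_mm                   # assign a nominal amount to trace-only days
    return corrected

# Example: a windy snowfall day measured at 3.0 mm is corrected upward substantially.
print(round(correct_daily_precip(3.0, wind_ms=6.0, is_snow=True), 2))   # ~6.15 mm
```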


Author(s):  
Mehdi Moradi ◽  
Manuel Montesino-SanMartin ◽  
M. Dolores Ugarte ◽  
Ana F. Militino

Abstract We propose an adaptive-sliding-window approach (LACPD) for the problem of change-point detection in a set of time-ordered observations. The proposed method is combined with sub-sampling techniques to compensate for the lack of data near the time series' tails. Through a simulation study, we analyse its behaviour in the presence of an early/middle/late change-point in the mean, and compare its performance with some frequently used and recently developed change-point detection methods in terms of power, type I error probability, area under the ROC curve (AUC), absolute bias, variance, and root-mean-square error (RMSE). We conclude that LACPD outperforms other methods by maintaining a low type I error probability. Unlike some other methods, the performance of LACPD does not depend on the time index of change-points, and it generally has lower bias than the alternatives. Moreover, in terms of variance and RMSE, it outperforms other methods when change-points are close to the time series' tails, whereas it shows a similar (sometimes slightly poorer) performance when change-points are close to the middle of the time series. Finally, we apply our proposal to two sets of real data: the well-known annual flow of the Nile River at Aswan, Egypt, from 1871 to 1970, and a novel remote sensing application consisting of a 34-year time series of satellite images of the Normalised Difference Vegetation Index in Wadi As-Sirham valley, Saudi Arabia, from 1986 to 2019. We conclude that LACPD performs well in detecting the presence of a change, as well as the time and magnitude of change, under real conditions.
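
The LACPD algorithm itself (adaptive windows plus sub-sampling near the tails) is not reproduced here; as a baseline for comparison, the sketch below runs a plain fixed-window scan for a mean change on a synthetic series with a late change point, the regime in which the paper reports LACPD's main advantage.

```python
import numpy as np

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0.0, 1.0, 90), rng.normal(1.5, 1.0, 10)])  # change point near the tail

def window_scan(x, w=8):
    """Two-sample z-like statistic comparing the w points before and after each candidate index."""
    stats = np.full(len(x), np.nan)
    for t in range(w, len(x) - w):
        before, after = x[t - w:t], x[t:t + w]
        pooled = np.sqrt(before.var(ddof=1) / w + after.var(ddof=1) / w)
        stats[t] = abs(after.mean() - before.mean()) / pooled
    return stats

stats = window_scan(x)
t_hat = int(np.nanargmax(stats))
print(f"estimated change point at index {t_hat} (statistic {stats[t_hat]:.2f})")
```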


2020 ◽  
Author(s):  
Simon Letzgus

Abstract. Analysis of data from wind turbine supervisory control and data acquisition (SCADA) systems has attracted considerable research interest in recent years. The data are predominantly used to gain insights into turbine condition without the need for additional sensing equipment. Most successful approaches apply semi-supervised anomaly detection methods, also called normal behaviour models, that use clean training data sets to establish healthy component baseline models. However, one of the major challenges when working with wind turbine SCADA data in practice is the presence of systematic changes in signal behaviour induced by malfunctions or maintenance actions. Even though this problem is well described in the literature, it has not been systematically addressed so far. This contribution is the first to comprehensively analyse the presence of change-points in wind turbine SCADA signals and to introduce an algorithm for their automated detection. 600 signals from 33 turbines are analysed over an operational period of more than two years. During this time, one third of the signals are affected by change-points. Kernel change-point detection methods have shown promising results in similar settings, but their performance strongly depends on the choice of several hyperparameters. This contribution presents a comprehensive comparison between different kernels as well as kernel-bandwidth and regularisation-penalty selection heuristics. Moreover, an appropriate data pre-processing procedure is introduced. The results show that the combination of Laplace kernels with a newly introduced bandwidth and penalty selection heuristic robustly outperforms existing methods. In a signal validation setting, more than 90 % of the signals were classified correctly regarding the presence or absence of change-points, resulting in an F1-score of 0.86. For change-point-free sequence selection, the most severe 60 % of all change-points could be automatically removed with a precision of more than 0.96 and therefore without a significant loss of training data. These results indicate that the algorithm can be a meaningful step towards automated SCADA data pre-processing, which is key for data-driven methods to reach their full potential. The algorithm is open source and its Python implementation is publicly available.
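
A hedged sketch of kernel change-point detection on a SCADA-like signal using the open-source ruptures package with its built-in RBF kernel cost via PELT; the paper's Laplace kernel and its bandwidth/penalty selection heuristics are not reproduced, and the penalty value below is an arbitrary placeholder.

```python
import numpy as np
import ruptures as rpt  # pip install ruptures

rng = np.random.default_rng(3)
# Synthetic SCADA-like temperature signal with a step change after a hypothetical maintenance action.
signal = np.concatenate([rng.normal(55, 2, 500), rng.normal(62, 2, 500)]).reshape(-1, 1)

# PELT search with an RBF kernel cost; the penalty controls how many change points are accepted.
algo = rpt.Pelt(model="rbf", min_size=50).fit(signal)
breakpoints = algo.predict(pen=10)   # list of segment end indices; the last entry equals len(signal)
print("detected breakpoints:", breakpoints)
```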


2021 ◽  
Author(s):  
Seyed Arman Hashemi Monfared ◽  
Samaneh Poormohammadi ◽  
Mehran Fatemi ◽  
Faezeh Rasaei ◽  
Mahmood Khosravi

Abstract Water shortage is a challenge in many countries around the world. Today, the latest scientific and practical technologies are used to address water shortage in arid and semi-arid regions. The optimal use of water resources, as well as the use of novel methods of water extraction, plays a significant role in alleviating the effects of this crisis. One of the methods used for increasing rainfall and harvesting water from the atmosphere is cloud seeding technology. The first step of this technique involves studying the target area and selecting the appropriate time and place for cloud seeding. The purpose of this study is to investigate the feasibility of cloud seeding for rainmaking in Sistan and Baluchestan province, southeastern Iran. Using precipitation, minimum temperature, relative humidity and cloudiness as parameters, the feasibility of rainmaking in the province was evaluated and suitable months for cloud seeding were determined. Accordingly, December, January, February and March were found to provide suitable conditions for seeding. In order to select suitable places for cloud seeding, zoning maps of precipitation, temperature and relative humidity in the selected months, as well as the topographic map of the province, were prepared using GIS. After fuzzification and integration of these maps, a zoning map of suitable areas for cloud seeding in Sistan and Baluchestan province was drawn to select the most susceptible areas. The area surrounding the Khash synoptic station and the southern areas of the province were found to be suitable for cloud seeding.
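
A minimal sketch of the fuzzification-and-overlay step, assuming each criterion has already been rasterised to a common grid; the membership breakpoints, weights, and suitability threshold are illustrative and not taken from the study.

```python
import numpy as np

def fuzzy_linear(raster, low, high, increasing=True):
    """Linear fuzzy membership in [0, 1]; values outside [low, high] are clipped."""
    m = np.clip((raster - low) / (high - low), 0.0, 1.0)
    return m if increasing else 1.0 - m

rng = np.random.default_rng(4)
shape = (50, 50)                           # hypothetical grid covering the province
precip   = rng.uniform(0, 120, shape)      # mm/month
temp_min = rng.uniform(-5, 25, shape)      # deg C
rel_hum  = rng.uniform(10, 80, shape)      # %

suitability = (0.4 * fuzzy_linear(precip, 20, 80)                        # wetter cells favoured
               + 0.3 * fuzzy_linear(rel_hum, 30, 70)                     # more humid cells favoured
               + 0.3 * fuzzy_linear(temp_min, -5, 15, increasing=False)) # cooler minima favoured

mask = suitability > 0.6                   # candidate cloud-seeding zones
print(f"{mask.mean():.1%} of grid cells flagged as suitable")
```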


2021 ◽  
Vol 13 (2) ◽  
pp. 247
Author(s):  
Youssef Wehbe ◽  
Marouane Temimi

A better understanding of the spatiotemporal distribution of water resources is crucial for the sustainable development of hyper-arid regions. Here, we focus on the Arabian Peninsula (AP) and use remotely sensed data to (i) analyze the local climatology of total water storage (TWS), precipitation, and soil moisture; (ii) characterize their temporal variability and spatial distribution; and (iii) infer recent trends and change points within their time series. Remote sensing data for TWS, precipitation, and soil moisture are obtained from the Gravity Recovery and Climate Experiment (GRACE), the Tropical Rainfall Measuring Mission (TRMM), and the Advanced Microwave Scanning Radiometer for Earth Observing System (AMSR-E), respectively. The study relies on trend analysis, the modified Mann–Kendall test, and change point detection statistics. We first derive 10-year (2002–2011) seasonal averages from each of the datasets and intercompare their spatial organization. In the absence of large-scale in situ data, we then compare trends from GRACE TWS retrievals to in situ groundwater observations locally over the subdomain of the United Arab Emirates (UAE). TWS anomalies vary between −6.2 to 3.2 cm/month and −6.8 to −0.3 cm/month during the winter and summer periods, respectively. Trend analysis shows decreasing precipitation trends (−2.3 × 10⁻⁴ mm/day) spatially aligned with decreasing soil moisture trends (−1.5 × 10⁻⁴ g/cm³/month) over the southern part of the AP, whereas the highest decreasing TWS trends (−8.6 × 10⁻² cm/month) are recorded over areas of excessive groundwater extraction in the northern AP. Interestingly, change point detection reveals increasing precipitation trends pre- and post-change point breaks over the entire AP region. Significant spatial dependencies are observed between TRMM and GRACE change points, particularly over Yemen during 2010, revealing the dominant impact of climatic changes on TWS depletion.
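
For the trend component of such an analysis, a small sketch using SciPy's Theil-Sen slope estimator and Kendall's tau on a synthetic monthly TWS-anomaly series; the modified Mann-Kendall variance correction used in the paper is not applied here.

```python
import numpy as np
from scipy.stats import kendalltau, theilslopes

rng = np.random.default_rng(5)
months = np.arange(120)                                   # ten years of monthly values, as in the study window
tws_anomaly = -0.05 * months + rng.normal(0, 1.0, 120)    # synthetic depleting-storage signal (cm)

slope, intercept, lo, hi = theilslopes(tws_anomaly, months)
tau, p_value = kendalltau(months, tws_anomaly)

print(f"Theil-Sen slope: {slope:.3f} cm/month (95% CI {lo:.3f} to {hi:.3f})")
print(f"Kendall tau = {tau:.2f}, p = {p_value:.1e}")
```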


2021 ◽  
Author(s):  
Miriam Sieg ◽  
Lina Katrin Sciesielski ◽  
Karin Kirschner ◽  
Jochen Kruppa

Abstract Background: In longitudinal studies, observations are made over time; hence the single observations at each time point are dependent, making them repeated measurements. In this work, we explore a different, counterintuitive setting: at each developmental time point, a lethal observation is performed on the pregnant or nursing mother, so the single time points are independent. Furthermore, the observations in the offspring at each time point are correlated with each other because each litter consists of several (genetically linked) littermates. In addition, the observed time series is short from a statistical perspective, as animal ethics prevent killing more mother mice than absolutely necessary and murine development is short anyway. We solve these challenges by using multiple contrast tests and visualizing the change point by means of confidence intervals.

Results: We used linear mixed models to model the variability of the mother. The estimates from the linear mixed model are then used in multiple contrast tests. There is a variety of contrasts and, intuitively, we would use the Changepoint method; however, it does not deliver satisfying results. Interestingly, we found two other contrasts, each capable of answering a different research question in change point detection: (i) should a single point with a change direction be found, or (ii) should the overall progression be determined? The Sequen contrast answers the first, the McDermott contrast the second. Confidence intervals deliver effect estimates for the strength of the potential change point, so the scientist can define a biologically relevant limit of change depending on the research question.

Conclusion: We present a solution with effect estimates for short independent time series with observations nested at a given time point. Multiple contrast tests produce confidence intervals, which allow determining the position of change points or visualizing the expression course over time. We suggest using McDermott's method to determine whether there is an overall significant change within the time frame, while Sequen is better at determining specific change points. In addition, we offer a short formula for estimating the maximal length of the time series.
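
A rough sketch of the modelling step under stated assumptions: a hypothetical long-format table with columns expression, day, and litter, a random intercept per litter fitted with statsmodels' MixedLM, and sequential ("Sequen"-style) contrasts between consecutive time points computed directly from the fitted coefficients. The multiplicity adjustment of a proper multiple contrast test is omitted.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import norm

rng = np.random.default_rng(6)
days, rows = [1, 3, 7, 14], []
for day in days:
    for litter in range(4):                      # independent litters at each time point
        litter_effect = rng.normal(0, 0.3)
        for pup in range(5):                     # correlated littermates within a litter
            shift = 1.0 if day >= 7 else 0.0     # built-in change point between day 3 and day 7
            rows.append({"expression": 2.0 + shift + litter_effect + rng.normal(0, 0.4),
                         "day": day, "litter": f"d{day}_l{litter}"})
df = pd.DataFrame(rows)

# Random intercept per litter captures the within-litter correlation.
fit = smf.mixedlm("expression ~ C(day)", df, groups=df["litter"]).fit()
beta, V = fit.params, np.asarray(fit.cov_params())

# Sequential contrasts on the treatment-coded coefficients: day 3 vs 1, 7 vs 3, 14 vs 7.
coef_names = [f"C(day)[T.{d}]" for d in days[1:]]
for i, label in enumerate(["3 vs 1", "7 vs 3", "14 vs 7"]):
    c = np.zeros(len(beta))
    c[beta.index.get_loc(coef_names[i])] = 1.0
    if i > 0:
        c[beta.index.get_loc(coef_names[i - 1])] = -1.0
    est, se = float(c @ beta.values), float(np.sqrt(c @ V @ c))
    half = norm.ppf(0.975) * se
    print(f"day {label}: estimate {est:.2f}, 95% CI ({est - half:.2f}, {est + half:.2f})")
```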


Entropy ◽  
2019 ◽  
Vol 21 (5) ◽  
pp. 531
Author(s):  
Xiaomin Duan ◽  
Huafei Sun ◽  
Xinyu Zhao

A matrix information-geometric method was developed to detect the change-points of rigid body motions. Note that the set of all rigid body motions is the special Euclidean group SE(3), so the Riemannian mean based on the Lie group structure of SE(3) reflects the characteristics of change-points. Once a change-point occurs, the distance between the current point and the Riemannian mean of its neighbor points should be a local maximum. A gradient descent algorithm is proposed to calculate the Riemannian mean. Using the Baker–Campbell–Hausdorff formula, the first-order approximation of the Riemannian mean is taken as the initial value of the iterative procedure. The performance of our method was evaluated by numerical examples and manipulator experiments.
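
A small numerical sketch of a gradient-descent (Karcher) mean on SE(3) built from the matrix exponential and logarithm; the first-order Baker–Campbell–Hausdorff initialisation mentioned above is approximated by the exponential of the averaged matrix logarithms.

```python
import numpy as np
from scipy.linalg import expm, logm

rng = np.random.default_rng(7)

def random_se3(scale=0.2):
    """Random rigid-body motion: exponential of a random element of the Lie algebra se(3)."""
    w, v = rng.normal(0, scale, 3), rng.normal(0, scale, 3)
    xi = np.zeros((4, 4))
    xi[:3, :3] = [[0, -w[2], w[1]], [w[2], 0, -w[0]], [-w[1], w[0], 0]]   # skew-symmetric rotation part
    xi[:3, 3] = v                                                          # translation part
    return expm(xi)

poses = [random_se3() for _ in range(20)]

# Initial guess: exponential of the averaged logarithms (first-order, BCH-style approximation).
mean = expm(np.real(sum(logm(p) for p in poses) / len(poses)))

# Fixed-point gradient-descent iteration for the Karcher mean under the group exponential map.
for _ in range(50):
    delta = np.real(sum(logm(np.linalg.inv(mean) @ p) for p in poses) / len(poses))
    mean = mean @ expm(delta)
    if np.linalg.norm(delta) < 1e-12:
        break

# Distance of each pose to the mean; a change-point would show up as a local maximum of this distance.
dists = [np.linalg.norm(np.real(logm(np.linalg.inv(mean) @ p))) for p in poses]
print("max distance to the mean:", round(max(dists), 3))
```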


2014 ◽  
Vol 536-537 ◽  
pp. 499-511 ◽  
Author(s):  
Li Zhao ◽  
Qian Liu ◽  
Peng Du ◽  
Ge Fu ◽  
Wei Cao

Change-point detection is the problem of finding abrupt changes in time series. However, meaningful changes are usually difficult to identify in raw, massive traffic data, due to high dimensionality and strong periodicity. In this paper, we propose a novel change-point detection approach, which simultaneously detects change points across all dimensions of the traffic in three steps. We first reduce the dimensionality with the classical Principal Component Analysis (PCA), then we apply an extended time-series segmentation method to detect the nontrivial change times, and finally we identify the applications responsible for the changes by an F-test. We demonstrate through experiments on datasets collected from four distributed systems with 44 applications that the proposed approach can effectively detect the nontrivial change points in multivariate and periodic traffic. Our approach is more appropriate for mining nontrivial changes in traffic data compared with other clustering methods, such as centroid-based k-means and density-based DBSCAN.
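
A compact sketch of the three-step idea on synthetic multivariate traffic: PCA for dimensionality reduction, a simple variance-based scan over the first principal component as a stand-in for the paper's extended segmentation method, and per-application F-tests (one-way ANOVA) between the two resulting segments to attribute the change. All data and thresholds are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import f_oneway

rng = np.random.default_rng(8)
n, n_apps, shift_at = 400, 6, 250
traffic = rng.normal(100, 10, (n, n_apps))
traffic[shift_at:, 2] += 40                      # only application 2 changes its traffic level

# Step 1: reduce the multivariate traffic to its first principal component.
pc1 = PCA(n_components=1).fit_transform(traffic).ravel()

# Step 2: locate the change time as the split minimising the within-segment variance of PC1.
costs = [t * pc1[:t].var() + (n - t) * pc1[t:].var() for t in range(20, n - 20)]
t_hat = int(np.argmin(costs)) + 20
print("estimated change time:", t_hat)

# Step 3: attribute the change with a per-application F-test between the two segments.
for app in range(n_apps):
    f_stat, p = f_oneway(traffic[:t_hat, app], traffic[t_hat:, app])
    flag = "  <- responsible application" if p < 1e-3 else ""
    print(f"app {app}: F = {f_stat:8.1f}, p = {p:.1e}{flag}")
```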

