Statistical methods and weather prediction for ampacity forecasting in smart grids

Author(s):  
Rafael Alberdi ◽  
Elvira Fernandez ◽  
Igor Albizu ◽  
Victor Valverde ◽  
Miren T. Bedialauneta ◽  
...


2009 ◽  
Vol 137 (12) ◽  
pp. 4355-4368 ◽  
Author(s):  
Andrew E. Mercer ◽  
Chad M. Shafer ◽  
Charles A. Doswell ◽  
Lance M. Leslie ◽  
Michael B. Richman

Abstract Tornadoes often strike as isolated events, but many occur as part of a major outbreak of tornadoes. Nontornadic outbreaks of severe convective storms are more common across the United States but pose different threats than do those associated with a tornado outbreak. The main goal of this work is to distinguish between significant instances of these outbreak types objectively by using statistical modeling techniques on numerical weather prediction output initialized with synoptic-scale data. The synoptic-scale structure contains information that can be utilized to discriminate between the two types of severe weather outbreaks through statistical methods. The Weather Research and Forecasting (WRF) model is initialized with synoptic-scale input data (the NCEP–NCAR reanalysis dataset) on a set of 50 significant tornado outbreaks and 50 nontornadic severe weather outbreaks. Output from the WRF at 18-km grid spacing is used in the objective classification. Individual severe weather parameters forecast by the model near the time of the outbreak are analyzed from simulations initialized 24, 48, and 72 h prior to the outbreak. An initial candidate set of 15 variables expected to be related to severe storms is reduced, through permutation testing, to the set of 6 or 7 (depending on lead time) that possesses the greatest classification capability. These variables serve as inputs into two statistical methods, support vector machines and logistic regression, to classify outbreak type. Each technique is assessed based on bootstrap confidence limits of contingency statistics. An additional backward selection on the reduced variable set is conducted to determine which variable combination provides the optimal contingency statistics. The contingency statistics verifying discrimination capability are best at 24 h; at 48 h, modest degradation is present; by 72 h, they decline by up to 15%. Overall, results are encouraging, with probability of detection values often exceeding 0.8 and Heidke skill scores in excess of 0.7 at 24-h lead time.
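As a hedged illustration of the classification step described above, the sketch below fits the two statistical methods named in the abstract, support vector machines and logistic regression, to synthetic stand-ins for the reduced severe-weather parameters, and scores them with the contingency statistics the study reports (probability of detection and the Heidke skill score). The data, feature count, and class separation are assumptions for demonstration only, not the authors' WRF-derived dataset.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 100                                # 50 tornadic + 50 nontornadic outbreaks
X = rng.normal(size=(n, 6))            # stand-in for 6 reduced parameters
y = np.array([1] * 50 + [0] * 50)      # 1 = tornado outbreak
X[y == 1] += 0.8                       # synthetic class separation (assumed)

def heidke(y_true, y_pred):
    """Heidke skill score from the 2x2 contingency table."""
    a = np.sum((y_pred == 1) & (y_true == 1))   # hits
    b = np.sum((y_pred == 1) & (y_true == 0))   # false alarms
    c = np.sum((y_pred == 0) & (y_true == 1))   # misses
    d = np.sum((y_pred == 0) & (y_true == 0))   # correct negatives
    return 2 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))

for model in (SVC(), LogisticRegression(max_iter=1000)):
    pred = model.fit(X, y).predict(X)
    pod = np.mean(pred[y == 1] == 1)            # probability of detection
    print(type(model).__name__, "POD:", round(pod, 2),
          "HSS:", round(heidke(y, pred), 2))
```

In the study these statistics are computed on bootstrap resamples to obtain confidence limits; here they are evaluated once, on the training data, purely to show the mechanics.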


2021 ◽  
Author(s):  
Malik bader alazzam ◽  
Fawaz Alassery

Abstract The Internet of Things (IoT) has been applied to a wide variety of sectors, including smart grids, farming, weather prediction, power generation, and wastewater treatment. Although the IoT holds enormous promise across this range of applications, there are still areas where it can be improved. The present research focuses on reducing the energy consumption of devices in IoT networks, which results in a longer network lifetime. To optimize energy consumption, this study determines the most suitable Cluster Head (CH) in the IoT network. The proposed approach is a hybrid meta-heuristic that combines the Whale Optimization Algorithm (WOA) with an Evolutionary Algorithm (EA). Several measurable metrics, including the number of alive nodes, workload, temperature, remaining energy, and a fitness value, are used to form the IoT network clusters. The proposed method is then compared with several state-of-the-art optimization techniques, including the Artificial Bee Colony algorithm, neural networks, and Adapted Gravity Simulated Annealing. The findings show that the proposed hybrid method outperforms these conventional methods.
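The Whale Optimization Algorithm at the core of the proposed hybrid can be sketched in a few dozen lines. The version below is a minimal, standalone WOA minimizing a toy sphere function; in the study the fitness would instead score cluster-head candidates on metrics such as remaining energy, workload, and temperature, and WOA would be hybridized with an evolutionary algorithm. Everything here (function names, parameters, the toy objective) is an assumption for illustration, not the authors' implementation.

```python
import numpy as np

def sphere(p):
    # Toy objective standing in for the paper's CH-selection fitness.
    return float(np.sum(p ** 2))

def woa(fitness, dim=3, n_whales=20, iters=100, lb=-1.0, ub=1.0, seed=0):
    """Minimal Whale Optimization Algorithm (minimization)."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(lb, ub, (n_whales, dim))
    best = min(pos, key=fitness).copy()
    for t in range(iters):
        a = 2.0 * (1 - t / iters)              # decreases linearly 2 -> 0
        for i in range(n_whales):
            A = 2 * a * rng.random(dim) - a
            C = 2 * rng.random(dim)
            if rng.random() < 0.5:
                if np.all(np.abs(A) < 1):      # exploit: encircle the best whale
                    pos[i] = best - A * np.abs(C * best - pos[i])
                else:                          # explore: follow a random whale
                    rand = pos[rng.integers(n_whales)]
                    pos[i] = rand - A * np.abs(C * rand - pos[i])
            else:                              # spiral bubble-net update
                l = rng.uniform(-1, 1, dim)
                pos[i] = (np.abs(best - pos[i]) * np.exp(l)
                          * np.cos(2 * np.pi * l) + best)
            pos[i] = np.clip(pos[i], lb, ub)
        cand = min(pos, key=fitness)
        if fitness(cand) < fitness(best):
            best = cand.copy()
    return best

best = woa(sphere)
```

A hybrid along the lines of the abstract would interleave evolutionary operators (crossover, mutation) with these WOA position updates.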


2016 ◽  
Author(s):  
Vianney Courdent ◽  
Morten Grum ◽  
Thomas Munk-Nielsen ◽  
Peter S. Mikkelsen

Abstract. Precipitation is the major perturbation to the flow in urban drainage and wastewater systems. Flow forecasts, generated by coupling rainfall predictions with a hydrologic runoff model, can potentially be used to optimise the operation of Integrated Urban Drainage–Wastewater Systems (IUDWS) during both wet and dry weather periods. Numerical Weather Prediction (NWP) models have improved significantly in recent years, increasing their spatial and temporal resolution. Finer-resolution NWP models are suitable for urban-catchment-scale applications, providing longer lead times than radar extrapolation. However, forecasts are inevitably uncertain, and fine resolution is especially challenging for NWP. This uncertainty is commonly addressed in meteorology with Ensemble Prediction Systems (EPS). Handling uncertainty is challenging for decision makers, and hence tools are necessary to provide insight into ensemble forecast usage and to support the rationality of decisions (i.e. forecasts are uncertain, so errors will be made; decision makers need tools to justify their choices by demonstrating that these choices are beneficial in the long run). This study presents an economic framework to support the decision-making process by providing information on when acting on the forecast is beneficial and how to handle the EPS. The Relative Economic Value (REV) approach associates economic values with the potential outcomes and determines the preferential use of the EPS forecast. The envelope curve of the REV diagram combines the results from each probability forecast to provide the highest relative economic value for a given gain–loss ratio. This approach is traditionally used at larger scales to assess mitigation measures for adverse events (i.e. actions are taken when events are forecast). The specificity of this study is to optimise energy consumption in IUDWS during low-flow periods by exploiting the electrical smart grid market (i.e. actions are taken when no events are forecast). Furthermore, the results demonstrate the benefit of NWP neighbourhood post-processing methods to enhance the forecast skill and increase the range of beneficial use.
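The REV approach can be made concrete with the standard cost-loss formulation (a generic sketch, not code from the study): for a protective action costing C that avoids a loss L, the value of a forecast with hit rate H and false-alarm rate F, relative to climatology and to a perfect forecast, depends on the ratio α = C/L and the climatological event frequency s. The code below computes REV curves for a few assumed (H, F) pairs, one per EPS probability threshold, and takes their pointwise maximum as the envelope curve mentioned above. The threshold statistics and s are illustrative assumptions.

```python
import numpy as np

def rev(H, F, s, alpha):
    """Relative economic value:
    (E_climate - E_forecast) / (E_climate - E_perfect),
    with mean expenses expressed in units of the loss L."""
    e_forecast = F * (1 - s) * alpha + H * s * alpha + (1 - H) * s
    e_climate = np.minimum(alpha, s)   # cheaper of: always act, never act
    e_perfect = s * alpha              # act exactly on event days
    return (e_climate - e_forecast) / (e_climate - e_perfect)

alphas = np.linspace(0.01, 0.99, 99)   # gain-loss (cost-loss) ratio C/L
s = 0.3                                # assumed climatological event frequency
# Assumed (hit rate, false-alarm rate) for three EPS probability thresholds:
thresholds = [(0.95, 0.40), (0.80, 0.15), (0.55, 0.05)]
curves = np.array([rev(H, F, s, alphas) for H, F in thresholds])
envelope = curves.max(axis=0)          # best achievable value at each alpha
```

Lenient thresholds (high H, high F) dominate the envelope at low α, strict thresholds at high α, which is why the full ensemble outperforms any single deterministic cutoff.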


2012 ◽  
Vol 14 (4) ◽  
pp. 1006-1023 ◽  
Author(s):  
Getnet Y. Muluye

There are several statistical downscaling methods available for generating local-scale meteorological variables from large-scale model outputs. There is still no universal single method, or group of methods, that is clearly superior, particularly for downscaling daily precipitation. This paper compares different statistical methods for downscaling daily precipitation from numerical weather prediction model output. Three different methods are considered: (i) hybrids; (ii) neural networks; and (iii) nearest neighbor-based approaches. These methods are implemented in the Saguenay watershed in northeastern Canada. Suites of standard diagnostic measures are computed to evaluate and inter-compare the performances of the downscaling models. Although results of the downscaling experiment show mixed performances, clear patterns emerge with respect to the reproduction of variation in daily precipitation and skill values. Artificial neural network-logistic regression (ANN-Logst), partial least squares (PLS) regression and recurrent multilayer perceptron (RMLP) models yield greater skill values, and conditional resampling method (SDSM) and K-nearest neighbor (KNN)-based models show the potential to capture the variability in daily precipitation.
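Of the families compared above, the K-nearest-neighbour approach is the simplest to sketch: precipitation for a new day is resampled from the observed precipitation of the historical days whose large-scale predictors are most similar. The code below is a generic illustration on synthetic data and assumes nothing about the Saguenay implementation; the predictor dimensionality, k, and the analog weighting are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic training archive (assumed): daily large-scale predictors
# (e.g. NWP fields reduced to a few components) paired with observed precip.
n_train = 500
predictors = rng.normal(size=(n_train, 4))
precip = np.maximum(0.0, predictors[:, 0] * 3 + rng.normal(size=n_train))

def knn_downscale(x_new, k=10, seed=None):
    """Resample one precipitation value from the k closest analog days."""
    r = np.random.default_rng(seed)
    d = np.linalg.norm(predictors - x_new, axis=1)   # distance to each day
    neighbors = np.argsort(d)[:k]                    # k nearest analogs
    w = 1.0 / np.arange(1, k + 1)                    # favour closer analogs
    return precip[r.choice(neighbors, p=w / w.sum())]

sample = [knn_downscale(rng.normal(size=4), seed=i) for i in range(100)]
```

Because values are resampled from observations, this family preserves the observed precipitation distribution (including dry days) by construction, which is one reason KNN-based models capture daily variability well in the comparison above.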


1978 ◽  
Vol 48 ◽  
pp. 7-29 ◽  
Author(s):  
T. E. Lutz

This review paper deals with the use of statistical methods to evaluate systematic and random errors associated with trigonometric parallaxes. First, systematic errors that arise when using trigonometric parallaxes to calibrate luminosity systems are discussed. Next, the determination of the external errors of parallax measurement is reviewed. Observatory corrections are discussed. Schilt's point, that because the causes of these systematic differences between observatories are not known the computed corrections cannot be applied appropriately, is emphasized. However, modern parallax work is sufficiently accurate that observatory corrections must be determined if full use is to be made of the potential precision of the data. To this end, it is suggested that a prior experimental design is required. Past experience has shown that accidental overlap of observing programs will not suffice to determine observatory corrections that are meaningful.


1973 ◽  
Vol 18 (11) ◽  
pp. 562-562
Author(s):  
B. J. WINER
1996 ◽  
Vol 41 (12) ◽  
pp. 1224-1224
Author(s):  
Terri Gullickson