Improving hydrological forecasts through temporal hierarchical reconciliation

Author(s):  
Mohammad Sina Jahangir ◽  
John Quilty

Hydrological forecasts at different horizons are often made using different models. These forecasts are usually temporally inconsistent (e.g., monthly forecasts may not sum to the yearly forecast), which may lead to misaligned or conflicting decisions. Temporal hierarchical reconciliation (or simply, hierarchical reconciliation) methods can be used to obtain consistent forecasts across horizons. However, their effectiveness in the field of hydrology has not yet been investigated. This research therefore assesses hierarchical reconciliation for precipitation forecasting, given its high importance in hydrological applications (e.g., reservoir operations, irrigation, drought and flood forecasting). Original precipitation forecasts (ORF) were produced using three different models: 'automatic' Exponential Time-Series Smoothing (ETS), Artificial Neural Networks (ANN), and Seasonal Auto-Regressive Integrated Moving Average (SARIMA). The forecasts were produced at six timescales (monthly, 2-monthly, quarterly, 4-monthly, bi-annual, and annual) for 84 basins selected from the Canadian model parameter experiment (CANOPEX) dataset. Hierarchical reconciliation methods, including Hierarchical Least Squares (HLS), Weighted Least Squares (WLS), and Ordinary Least Squares (OLS), along with the Bottom-Up (BU) method, were applied to obtain consistent forecasts at all timescales.

Generally, ETS and ANN showed the best and worst performance, respectively, according to a wide range of performance metrics: root mean square error (RMSE), normalized RMSE (nRMSE), mean absolute error (MAE), normalized MAE (nMAE), and the Nash-Sutcliffe Efficiency index (NSE). The results indicated that hierarchical reconciliation affects ORF accuracy differently across basins and timescales, reducing the RMSE in some cases while increasing it in others. The hierarchical reconciliation methods also performed differently depending on the forecast model. According to the RMSE and MAE, the BU method outperformed the hierarchical methods for ETS forecasts, while HLS and OLS yielded the largest improvements for ANN and SARIMA forecasts, respectively. The sensitivity of the ORF to hierarchical reconciliation was assessed using the RMSE. Both accurate and inaccurate ORF could be improved through hierarchical reconciliation; in particular, the effectiveness of hierarchical reconciliation appears to depend more on ORF accuracy than on the type of reconciliation method.

While the present work assessed the effectiveness of hierarchical reconciliation for hydrological forecasting via data-driven models, the methodology can easily be extended to process-based or hybrid (process-based data-driven) models. Further, since hydrological forecasts at different timescales may have different levels of importance to water resources managers and/or policymakers, hierarchical reconciliation can be used to weight the different timescales according to the user's preferences/desired goals.
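To make the reconciliation step concrete, here is a minimal sketch of least-squares temporal reconciliation on a simplified monthly/quarterly/annual hierarchy (the paper's full hierarchy also includes 2-monthly, 4-monthly, and bi-annual levels). The summing matrix, the toy forecasts, and the `reconcile` helper are illustrative assumptions rather than the authors' code; with W = I the projection reduces to OLS reconciliation, while a diagonal W of forecast-error variances gives a WLS variant.

```python
import numpy as np

def summing_matrix():
    """Map the 12 bottom-level (monthly) values to all levels:
    1 annual row + 4 quarterly rows + 12 monthly rows."""
    annual = np.ones((1, 12))
    quarterly = np.kron(np.eye(4), np.ones((1, 3)))
    monthly = np.eye(12)
    return np.vstack([annual, quarterly, monthly])  # shape (17, 12)

def reconcile(y_hat, S, W=None):
    """Least-squares reconciliation: beta = (S' W^-1 S)^-1 S' W^-1 y_hat,
    reconciled = S @ beta. W = I gives OLS; a diagonal W of
    forecast-error variances gives a WLS variant."""
    if W is None:
        W = np.eye(S.shape[0])
    Winv = np.linalg.inv(W)
    G = np.linalg.solve(S.T @ Winv @ S, S.T @ Winv)
    return S @ (G @ y_hat)

S = summing_matrix()
# Incoherent base forecasts: 1 annual, 4 quarterly, 12 monthly (toy values).
monthly = np.random.default_rng(0).gamma(2.0, 40.0, 12)
y_hat = np.concatenate([[monthly.sum() * 1.1],               # annual, 10% high
                        monthly.reshape(4, 3).sum(1) * 0.95,  # quarterly, 5% low
                        monthly])
y_tilde = reconcile(y_hat, S)
# After reconciliation the levels are coherent: annual equals the monthly sum.
assert np.isclose(y_tilde[0], y_tilde[5:].sum())
```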

2009 ◽  
Vol 2009 ◽  
pp. 1-8 ◽  
Author(s):  
Janet Myhre ◽  
Daniel R. Jeske ◽  
Michael Rennie ◽  
Yingtao Bi

A heteroscedastic linear regression model is developed from plausible assumptions that describe the time evolution of performance metrics for equipment. The motivation the model inherits for the associated weighted least squares analysis is an essential and attractive selling point for engineers interested in equipment surveillance methodologies. A simple test for the significance of the heteroscedasticity suggested by a data set is derived, and a simulation study is used to evaluate the power of the test and compare it with several other applicable tests designed under different contexts. Tolerance intervals within the context of the model are derived, generalizing well-known tolerance intervals for ordinary least squares regression. Use of the model and its associated analyses is illustrated with an aerospace application in which hundreds of electronic components are continuously monitored by an automated system that flags components suspected of unusual degradation patterns.
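As a concrete illustration of the weighted least squares analysis the abstract motivates, here is a minimal sketch assuming a degradation metric whose error variance grows linearly with operating time, so the natural weights are the inverse variances. The variance law, data, and variable names are assumptions for illustration, not the paper's model.

```python
import numpy as np

# Assumed model: Var(y_i) = sigma^2 * t_i, so w_i = 1 / t_i.
rng = np.random.default_rng(1)
t = np.linspace(1.0, 100.0, 200)
y = 5.0 + 0.3 * t + rng.normal(0.0, 1.0, t.size) * np.sqrt(t)

X = np.column_stack([np.ones_like(t), t])
w = 1.0 / t                                    # inverse-variance weights

# WLS: solve (X' W X) beta = X' W y, with W = diag(w).
XtW = X.T * w
beta = np.linalg.solve(XtW @ X, XtW @ y)

# Weighted residual variance estimate (n - p degrees of freedom).
resid = y - X @ beta
sigma2 = (w * resid**2).sum() / (t.size - 2)
print("intercept, slope:", beta, "sigma^2:", sigma2)
```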


2021 ◽  
Vol 2021 ◽  
pp. 1-14
Author(s):  
Mathil K. Thamer ◽  
Raoudha Zine

We study the Lindley distribution, an important continuous mixture distribution with great ability to represent different systems. We consider a three-parameter version of this distribution because of its high flexibility in modelling lifetime data. The parameters were estimated by five different methods: maximum likelihood estimation, ordinary least squares, weighted least squares, maximum product of spacings, and Cramér-von Mises. Simulation experiments were performed with different sample sizes and different parameter values, and the methods were compared on the generated data by mean square error and mean absolute error. In addition, we compared the methods on real data, namely COVID-19 data from Anbar Province, Iraq.
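To illustrate two of the five estimators compared, the sketch below contrasts maximum likelihood and maximum product of spacings (MPS) on the classical one-parameter Lindley distribution; the three-parameter version studied in the paper changes only the pdf/cdf, not the estimation recipe. The simulated data and optimizer bounds are assumptions.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def lindley_cdf(x, theta):
    return 1.0 - (1.0 + theta * x / (theta + 1.0)) * np.exp(-theta * x)

def sample_lindley(theta, n, rng):
    # Lindley is a mixture: Exp(theta) w.p. theta/(theta+1), else Gamma(2, 1/theta).
    mix = rng.random(n) < theta / (theta + 1.0)
    return np.where(mix, rng.exponential(1 / theta, n), rng.gamma(2.0, 1 / theta, n))

rng = np.random.default_rng(2)
x = sample_lindley(theta=1.5, n=500, rng=rng)

# MLE has a closed form for the one-parameter Lindley.
xbar = x.mean()
theta_mle = (-(xbar - 1.0) + np.sqrt((xbar - 1.0)**2 + 8.0 * xbar)) / (2.0 * xbar)

# MPS: maximize the mean log spacing of the fitted CDF at the sorted sample.
def neg_mps(theta):
    u = np.concatenate([[0.0], lindley_cdf(np.sort(x), theta), [1.0]])
    return -np.log(np.clip(np.diff(u), 1e-300, None)).mean()

theta_mps = minimize_scalar(neg_mps, bounds=(0.01, 10.0), method="bounded").x
print(f"MLE: {theta_mle:.3f}  MPS: {theta_mps:.3f}  (true 1.5)")
```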


2009 ◽  
Vol 12 (03) ◽  
pp. 297-317 ◽  
Author(s):  
ANOUAR BEN MABROUK ◽  
HEDI KORTAS ◽  
SAMIR BEN AMMOU

In this paper, fractional integrating dynamics in the return and volatility series of stock market indices are investigated. The investigation is conducted using wavelet ordinary least squares, wavelet weighted least squares, and the approximate Maximum Likelihood estimator. It is shown that the long memory property in stock returns is associated mostly with emerging markets rather than developed ones, while strong evidence of long-range dependence is found for all volatility series. The relevance of the wavelet-based estimators, especially the approximate Maximum Likelihood and weighted least squares techniques, is demonstrated in terms of stability and estimation accuracy.
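A minimal sketch of the wavelet ordinary least squares idea: for a fractionally integrated FI(d) process, the variance of the wavelet detail coefficients scales roughly as 2^(2dj) across scales j, so an OLS fit of log2 variance against scale estimates 2d. The Haar transform, the truncated-MA simulator, and the plain-OLS fit below are illustrative simplifications of the paper's estimators.

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_fi(d, n, burn=2000):
    """FI(d) via truncated MA(inf): psi_k = psi_{k-1} * (k - 1 + d) / k."""
    k = np.arange(1, burn)
    psi = np.concatenate([[1.0], np.cumprod((k - 1 + d) / k)])
    eps = rng.standard_normal(n + burn)
    return np.convolve(eps, psi, mode="full")[burn:burn + n]

def haar_detail_variances(x, levels):
    """Variance of Haar detail coefficients at scales j = 1..levels."""
    a, out = x.copy(), []
    for _ in range(levels):
        a = a[: 2 * (a.size // 2)]
        d = (a[0::2] - a[1::2]) / np.sqrt(2.0)
        a = (a[0::2] + a[1::2]) / np.sqrt(2.0)
        out.append(d.var())
    return np.array(out)

x = simulate_fi(d=0.3, n=2**14)
j = np.arange(1, 9)
logvar = np.log2(haar_detail_variances(x, levels=j.size))
slope = np.polyfit(j, logvar, 1)[0]            # OLS over scales
print(f"estimated d = {slope / 2:.3f} (true 0.3)")
```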


Entropy ◽  
2022 ◽  
Vol 24 (1) ◽  
pp. 95
Author(s):  
Pontus Söderbäck ◽  
Jörgen Blomvall ◽  
Martin Singull

Liquid financial markets, such as the options market of the S&P 500 index, create vast amounts of data every day, i.e., so-called intraday data. However, this highly granular data is often reduced to a single time point when used to estimate financial quantities, and this under-utilization may reduce the quality of the estimates. In this paper, we study the impact on estimation quality of using intraday data to estimate dividends. The methodology is based on earlier linear regression (ordinary least squares) estimators, adapted here to intraday data. The method is also generalized in two aspects. First, dividends are expressed as present values of future dividends rather than as dividend yields. Second, to account for heteroscedasticity, the estimation is formulated as a weighted least squares problem, where the weights are determined from the market data. This method is compared with a traditional method on out-of-sample S&P 500 European options market data. The results show that estimates based on intraday data have, with statistical significance, higher quality than the corresponding single-time estimates. Additionally, the two generalizations of the methodology are shown to further improve estimation quality.
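The regression at the heart of such methods can be sketched from put-call parity: C − P = (S − PV(D)) − e^(−rT)·K, so regressing C − P on the strike K recovers the present value of dividends from the intercept and the discount factor from the slope. The inverse-spread weights and synthetic quotes below are assumptions standing in for the paper's market-data-driven weighting.

```python
import numpy as np

rng = np.random.default_rng(4)
S, pv_div, r, T = 4500.0, 60.0, 0.03, 0.5
K = np.linspace(4000.0, 5000.0, 80)
spread = 0.5 + 0.002 * np.abs(K - S)           # wider quotes away from ATM
y = (S - pv_div) - np.exp(-r * T) * K + rng.normal(0, spread)  # observed C - P

X = np.column_stack([np.ones_like(K), K])
w = 1.0 / spread**2                            # inverse-variance weights
XtW = X.T * w
intercept, slope = np.linalg.solve(XtW @ X, XtW @ y)

print(f"PV(dividends) = {S - intercept:.2f} (true {pv_div})")
print(f"implied rate  = {-np.log(-slope) / T:.4f} (true {r})")
```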


2014 ◽  
Vol 71 (1) ◽  
Author(s):  
Bello Abdulkadir Rasheed ◽  
Robiah Adnan ◽  
Seyed Ehsan Saffari ◽  
Kafi Dano Pati

In a linear regression model, the ordinary least squares (OLS) method is considered the best way to estimate the regression parameters when its assumptions are met. However, if the data do not satisfy the underlying assumptions, the results will be misleading. The violation of the constant-variance assumption in least squares regression is caused by the presence of outliers and heteroscedasticity in the data. This assumption of constant variance (homoscedasticity) is very important in linear regression because it is what gives the least squares estimators their minimum-variance property, so a robust regression method is required to handle outliers in the data. This research uses weighted least squares (WLS) techniques to estimate the regression coefficients when the assumption of constant error variance is violated. WLS estimation is equivalent to carrying out OLS on transformed variables, but it can easily be affected by outliers. To remedy this, we suggest a robust technique for estimating regression parameters in the presence of heteroscedasticity and outliers. We apply M-estimation via iteratively reweighted least squares (IRWLS) with the Huber and Tukey bisquare functions, along with the resistant least trimmed squares regression estimator, to model state-wide crime data of the United States in 1993. The outcomes of the study indicate that the estimators obtained from the M-estimation and least trimmed squares methods are more effective than those obtained from OLS.
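A minimal sketch of the M-estimation step, using iteratively reweighted least squares with the Huber weight function and a robust MAD scale; the tuning constant and the outlier-contaminated toy data are assumptions, and the paper additionally considers the Tukey bisquare function and least trimmed squares.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100
x = rng.uniform(0, 10, n)
y = 2.0 + 1.5 * x + rng.normal(0, 1, n)
y[:5] += 25.0                                  # gross outliers

X = np.column_stack([np.ones_like(x), x])

def huber_weights(r, c=1.345):
    a = np.abs(r)
    return np.where(a <= c, 1.0, c / a)

beta = np.linalg.lstsq(X, y, rcond=None)[0]    # OLS start
for _ in range(50):
    resid = y - X @ beta
    scale = 1.4826 * np.median(np.abs(resid - np.median(resid)))  # MAD scale
    w = huber_weights(resid / scale)
    XtW = X.T * w
    beta_new = np.linalg.solve(XtW @ X, XtW @ y)
    if np.allclose(beta_new, beta, atol=1e-8):
        break
    beta = beta_new

print("robust intercept, slope:", beta)        # close to (2.0, 1.5)
```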


1985 ◽  
Vol 15 (2) ◽  
pp. 331-340 ◽  
Author(s):  
T. Cunia ◽  
R. D. Briggs

To construct biomass tables for various tree components that are consistent with each other, one may use linear regression techniques with dummy variables. When the biomass of these components is measured on the same sample trees, one should also use the generalized rather than the ordinary least squares method. A procedure is shown which allows the estimation of the covariance matrix of the sample biomass values and circumvents the problem of storing and inverting large covariance matrices. Applied to 20 sets of sample tree data, the generalized least squares regressions generated estimates which, on average, were slightly higher (by about 1%) than the sample data. The confidence and prediction bands about the regression function were wider, sometimes considerably wider, than those estimated by ordinary weighted least squares.
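A minimal sketch of why generalized least squares matters in this setting: biomass components measured on the same trees have correlated errors, and GLS exploits that covariance. The two-component allometric form, the covariance values, and the Kronecker-structured inverse are illustrative assumptions, not the paper's procedure.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 60
dbh = rng.uniform(10, 40, n)                   # tree diameters (cm)

# Correlated errors for (stem, branch) measured on each tree.
cov = np.array([[4.0, 2.4], [2.4, 3.0]])
err = rng.multivariate_normal([0, 0], cov, n)
stem = 0.9 * dbh + err[:, 0]
branch = 0.3 * dbh + err[:, 1]

# Stack the two component equations; each has its own slope on dbh.
y = np.concatenate([stem, branch])
X = np.block([[dbh[:, None], np.zeros((n, 1))],
              [np.zeros((n, 1)), dbh[:, None]]])

# Full covariance of y is cov (x) I_n, so its inverse has the same structure.
Vinv = np.kron(np.linalg.inv(cov), np.eye(n))
beta_gls = np.linalg.solve(X.T @ Vinv @ X, X.T @ Vinv @ y)
beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
print("GLS slopes:", beta_gls, " OLS slopes:", beta_ols)
```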


Aerospace ◽  
2018 ◽  
Vol 5 (4) ◽  
pp. 104 ◽  
Author(s):  
Ilias Lappas ◽  
Michail Bozoudis

The development of a parametric model for the variable portion of the Cost Per Flying Hour (CPFH) of an 'unknown' aircraft platform and its application to diverse types of fixed and rotary wing aircraft development programs (F-35A, Su-57, Dassault Rafale, T-X candidates, AW189, Airbus RACER among others) is presented. The novelty of this paper lies in the utilization of a diverse sample of aircraft types, aiming to obtain a 'universal' Cost Estimating Relationship (CER) applicable to a wide range of platforms. Moreover, the model does not produce absolute cost figures but rather analogy ratios versus the F-16's CPFH, broadening the model's applicability. The model will enable an analyst to carry out timely and reliable Operational and Support (O&S) cost estimates for a wide range of 'unknown' aircraft platforms at the early stages of conceptual design, despite the lack of actual data from the utilization and support life cycle stages. The statistical analysis is based on Ordinary Least Squares (OLS) regression, conducted with R software (v3.5.1, released on 2 July 2018). The model's output is validated against officially published CPFH data of several existing 'mature' aircraft platforms, including one of the most prolific fighter jet types in the world, the F-16C/D, which is also used as a reference to compare CPFH estimates of various next generation aircraft platforms. Actual CPFH data of the Hellenic Air Force (HAF) have been used to develop the parametric model, the application of which is expected to significantly inform high-level decision making regarding aircraft procurement, budgeting and future force structure planning, including decisions related to large scale aircraft modifications and upgrades.
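A minimal sketch of the analogy-ratio CER idea: regress a platform's CPFH ratio versus the F-16 on candidate design drivers via OLS. The predictors (empty weight, engine count) and the synthetic fleet below are hypothetical; the paper's actual cost drivers and HAF data are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 40
empty_weight_t = rng.uniform(5, 30, n)         # tonnes (assumed driver)
engines = rng.integers(1, 3, n).astype(float)  # 1 or 2 engines (assumed driver)
log_ratio = 0.05 * empty_weight_t + 0.3 * (engines - 1) + rng.normal(0, 0.1, n)

X = np.column_stack([np.ones(n), empty_weight_t, engines])
beta, *_ = np.linalg.lstsq(X, log_ratio, rcond=None)  # OLS fit of the CER

# Predicted CPFH ratio vs the F-16 for a hypothetical 20 t twin-engine jet.
pred = np.exp(np.array([1.0, 20.0, 2.0]) @ beta)
print(f"predicted CPFH ratio vs F-16: {pred:.2f}")
```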


1986 ◽  
Vol 16 (2) ◽  
pp. 249-255 ◽  
Author(s):  
Edwin J. Green ◽  
William E. Strawderman

A Stein-rule estimator, which shrinks least squares estimates of regression parameters toward their weighted average, was employed to estimate the coefficient in the constant form factor volume equation for 18 species simultaneously. The Stein-rule procedure was applied to ordinary least squares estimates and weighted least squares estimates. Simulation tests on independent validation data sets revealed that the Stein-rule estimates were biased, but predicted better than the corresponding least squares estimates. The Stein-rule procedures also yielded lower estimated mean square errors for the volume equation coefficient than the corresponding least squares procedure. Different methods of withdrawing sample data from the total sample available for each species revealed that the superiority of Stein-rule procedures over least squares decreased as the sample size increased and that the Stein-rule procedures were robust to unequal sample sizes, at least on the scale studied here.
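A minimal sketch of the Stein-rule idea: shrink per-species least squares coefficients toward their average. The positive-part James-Stein form below assumes known, equal sampling variances, a simplification of the paper's weighted setting; the data are simulated.

```python
import numpy as np

rng = np.random.default_rng(8)
p = 18                                          # species
true_b = rng.normal(0.42, 0.05, p)              # form-factor coefficients
se = 0.08                                       # sampling s.e. of each estimate
b_hat = true_b + rng.normal(0, se, p)           # per-species least squares

# Shrink toward the (here: equal-weight) average of the estimates.
b_bar = b_hat.mean()
resid = b_hat - b_bar
shrink = max(0.0, 1.0 - (p - 3) * se**2 / (resid @ resid))  # positive-part JS
b_js = b_bar + shrink * resid

def mse(b):
    return ((b - true_b) ** 2).mean()

print(f"MSE least squares: {mse(b_hat):.5f}  Stein-rule: {mse(b_js):.5f}")
```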


2007 ◽  
Vol 40 (4) ◽  
pp. 694-701
Author(s):  
Thaung Lwin

Knudsen [X-Ray Spectrom. (1981), 10, 54–56] proposed and demonstrated a least-squares approach to estimating the unknown parameters of a system of equations required for calibration in X-ray diffraction analysis. The approach is an ordinary least-squares one that does not incorporate information on the errors of the measured intensities for the set of samples used as standards. The purpose of the present paper is to show that a functional relationship model can be applied to the problem to account for all the variation due to sampling and measurement error in the peak intensities. It is also shown that Knudsen's calibration estimator can be regarded as an approximation to a more general and potentially more efficient weighted least-squares estimator derived from the functional relationship model. The closeness of the approximation depends on the nature of the covariance structure of the intensity measurements.
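A minimal sketch of the functional-relationship (errors-in-variables) view: when both intensities carry measurement error, ordinary least squares calibration slopes are attenuated toward zero. Deming regression with a known error-variance ratio is one simple estimator of this type, shown here as an illustration; the variance ratio and data are assumptions, not Knudsen's system of equations.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 30
xi = np.linspace(1, 10, n)                      # true standard intensities
x = xi + rng.normal(0, 0.4, n)                  # measured, with error
y = 2.0 + 1.3 * xi + rng.normal(0, 0.4, n)      # measured analyte intensity

lam = 1.0                                       # Var(err_y)/Var(err_x), assumed known
sxx = np.var(x, ddof=1)
syy = np.var(y, ddof=1)
sxy = np.cov(x, y, ddof=1)[0, 1]

# Deming slope for the functional relationship model with ratio lam.
b = (syy - lam * sxx + np.sqrt((syy - lam * sxx) ** 2 + 4 * lam * sxy**2)) / (2 * sxy)
a = y.mean() - b * x.mean()

b_ols = sxy / sxx                               # attenuated toward zero
print(f"Deming slope: {b:.3f}  OLS slope: {b_ols:.3f} (true 1.3)")
```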

