runoff forecasting
Recently Published Documents


TOTAL DOCUMENTS

180
(FIVE YEARS 42)

H-INDEX

25
(FIVE YEARS 4)

Water ◽  
2021 ◽  
Vol 13 (23) ◽  
pp. 3390
Author(s):  
Zhanxing Xu ◽  
Jianzhong Zhou ◽  
Li Mo ◽  
Benjun Jia ◽  
Yuqi Yang ◽  
...  

Runoff forecasting is of great importance for flood mitigation and power generation planning. To make better use of time-frequency decomposition in runoff forecasting and improve prediction accuracy, this research developed a runoff forecasting framework named Decomposition-Integration-Prediction (DIP) using a parallel-input neural network, and proposed a novel runoff forecasting model under this framework combining Variational Mode Decomposition (VMD), Gated Recurrent Units (GRU), and the Stochastic Fractal Search (SFS) algorithm. In this model, the observed runoff series is first decomposed into several sub-series via VMD to extract information at different frequencies. Second, the parallel GRU layers of the parallel-input neural network are trained on the input samples of each subcomponent, and their outputs are integrated adaptively through concatenation layers. Finally, the output of the concatenation layers is taken as the final runoff forecast. In this process, the SFS algorithm is used to optimize the structure of the neural network. The prediction performance of the proposed model was evaluated on historical monthly runoff data from the Pingshan and Yichang hydrological stations in the Upper Yangtze River Basin of China, against seven single and decomposition-based hybrid models developed for comparison. The results show that the proposed model has clear advantages in overall prediction performance, model training time, and multi-step-ahead prediction, making it a reasonable and more efficient monthly runoff forecasting method based on time series decomposition and neural networks.
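The "decomposition" and "integration" stages of a DIP-style pipeline can be sketched in a few lines. This is a minimal illustration, not the paper's method: the moving-average split below is a hypothetical stand-in for VMD, and the per-component GRU branches and SFS optimization are omitted. The key property it demonstrates is that the sub-series sum back to the original, so per-component forecasts can be recombined without decomposition loss.

```python
import numpy as np

def decompose(series, windows=(12, 3)):
    # Stand-in for VMD: peel off progressively higher-frequency components
    # with moving averages (hypothetical substitute for the paper's VMD step).
    parts = []
    resid = np.asarray(series, dtype=float)
    for w in windows:
        kernel = np.ones(w) / w
        smooth = np.convolve(resid, kernel, mode="same")
        parts.append(smooth)
        resid = resid - smooth
    parts.append(resid)  # residual keeps the decomposition exactly additive
    return parts

rng = np.random.default_rng(0)
series = np.sin(np.linspace(0, 8 * np.pi, 120)) + 0.1 * rng.normal(size=120)
subs = decompose(series)
# The sub-series reconstruct the original series, so component-wise
# predictions can be integrated back into one forecast.
print(len(subs), np.allclose(np.sum(subs, axis=0), series))  # prints "3 True"
```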


2021 ◽  
Author(s):  
Wenchuan Wang ◽  
Yu-jin Du ◽  
Kwok-wing Chau ◽  
Chun-Tian Cheng ◽  
Dong-mei Xu ◽  
...  

Abstract The optimal planning and management of modern water resources depend highly on reliable and accurate runoff forecasting. Data preprocessing can improve the accuracy of runoff forecasting when the underlying physical relationships cannot be captured by a single prediction model. Yet few studies have so far evaluated the performance of different data preprocessing techniques for predicting monthly runoff time series. To fill this research gap, this paper investigates the potential of five data preprocessing techniques combined with a gated recurrent unit (GRU) network for monthly runoff prediction: variational mode decomposition (VMD), wavelet packet decomposition (WPD), complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN), extreme-point symmetric mode decomposition (ESMD), and singular spectrum analysis (SSA). In this study, the original monthly runoff data is first decomposed into a set of subcomponents using each of the five preprocessing methods; second, each component is predicted with an appropriately configured GRU model; finally, the forecasts of the resulting two-stage hybrid models are obtained by aggregating the forecasts of the corresponding subcomponents. Four performance metrics are employed as evaluation benchmarks. Experimental results from two hydropower stations in China show that all five preprocessing techniques attain satisfactory prediction results, while VMD and WPD yield better performance than CEEMDAN, ESMD, and SSA in both the training and testing periods across the four indices. It is therefore important to carefully choose a data preprocessing method according to the actual characteristics of the study area.
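The two-stage structure common to all five hybrids (model each subcomponent, then aggregate) can be shown with a toy per-component model. A least-squares AR(1) fit is a hypothetical stand-in for the per-component GRU models, and `two_stage_forecast` is an illustrative name, not from the paper:

```python
import numpy as np

def fit_ar1(x):
    # Least-squares AR(1) coefficient a in x[t] ~= a * x[t-1].
    return float(np.dot(x[:-1], x[1:]) / np.dot(x[:-1], x[:-1]))

def two_stage_forecast(subseries, steps=1):
    # Stage 2: fit one model per sub-series; stage 3: sum their forecasts.
    # AR(1) is a stand-in for the paper's per-component GRU models.
    return sum(fit_ar1(s) ** steps * s[-1] for s in subseries)

# Toy check: a single geometric component is predicted exactly by AR(1).
s = 0.8 ** np.arange(20)
forecast = two_stage_forecast([s], steps=1)
print(abs(forecast - 0.8 ** 20) < 1e-12)  # prints "True"
```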


2021 ◽  
Vol 596 ◽  
pp. 126067
Author(s):  
Jiangwei Zhang ◽  
Xiaohui Chen ◽  
Amirul Khan ◽  
You-kuan Zhang ◽  
Xingxing Kuang ◽  
...  

2021 ◽  
Author(s):  
Yani Lian ◽  
Jungang Luo ◽  
Jingmin Wang ◽  
Ganggang Zuo

Abstract Many previous studies have developed decomposition and ensemble models to improve runoff forecasting performance. However, these decomposition-based models usually introduce large decomposition errors into the modeling process. Although the variation in runoff time series is strongly driven by climate, many previous studies considering climate change have focused only on rainfall-runoff modeling, with few meteorological factors as input. Therefore, a climate-driven streamflow forecasting (CDSF) framework was proposed to improve runoff forecasting accuracy. This framework is realized using principal component analysis (PCA), long short-term memory (LSTM), and Bayesian optimization (BO), and is referred to as PCA-LSTM-BO. To validate the effectiveness and superiority of the PCA-LSTM-BO method, it was compared with an autoregressive LSTM model and two other CDSF models based on PCA, BO, and either support vector regression (SVR) or gradient boosting regression trees (GBRT), namely PCA-SVR-BO and PCA-GBRT-BO, respectively. A generalization performance index based on the Nash-Sutcliffe efficiency (NSE), called the GI(NSE) value, is proposed to evaluate the generalizability of the models. The results show that (1) the proposed model is significantly better than the other benchmark models in terms of mean square error (MSE<=185.782), NSE>=0.819, and GI(NSE)<=0.223 for all forecasting scenarios; (2) the PCA step in the CDSF framework improves forecasting capacity and generalizability; (3) the CDSF framework is superior to the autoregressive LSTM models for all forecasting scenarios; and (4) the GI(NSE) value is effective for selecting the model with better generalizability.
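The NSE metric underlying the proposed GI(NSE) index is standard and easy to compute; the GI(NSE) formula itself is defined in the paper and not reproduced here. A minimal sketch:

```python
import numpy as np

def nse(obs, sim):
    # Nash-Sutcliffe efficiency: 1 for a perfect fit, 0 when the simulation
    # is no better than forecasting the observed mean.
    obs = np.asarray(obs, dtype=float)
    sim = np.asarray(sim, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([3.0, 5.0, 4.0, 6.0, 8.0])
print(nse(obs, obs))                             # prints "1.0" (perfect fit)
print(nse(obs, np.full_like(obs, obs.mean())))   # prints "0.0" (mean forecast)
```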


2021 ◽  
Vol 9 ◽  
Author(s):  
Ruifang Yuan ◽  
Siyu Cai ◽  
Weihong Liao ◽  
Xiaohui Lei ◽  
Yunhui Zhang ◽  
...  

Hydrological series data are non-stationary and nonlinear. However, certain data-driven forecasting methods assume that streamflow series are stationary, which contradicts reality and causes simulated values to deviate from observed ones. In this study, ensemble empirical mode decomposition (EEMD) was employed to decompose runoff series into several stationary components and a trend. A long short-term memory (LSTM) model was used to build the prediction model for each sub-series. The model input set contained the historical flow series of the simulated station, the series of its upstream hydrological station, and historical meteorological element series; the final input of the LSTM model was selected by the MI method. To verify the effect of EEMD, this study used a Radial Basis Function (RBF) model to predict the same EEMD-decomposed sub-series. In addition, to study the simulation characteristics of the EEMD-LSTM model in different months, a GM (group by month)-EEMD-LSTM model was set up for comparison. The key difference between the GM-EEMD-LSTM model and the EEMD-LSTM model is that the GM model first divides the runoff sequence by month, and only then applies EEMD decomposition and LSTM prediction. The predictions of the sub-series obtained by the LSTM and RBF showed better statistical performance than those of the original series, especially for the EEMD-LSTM. The GM-EEMD-LSTM model performed better than the EEMD-LSTM model in low-water months, but slightly worse in the flood season. The simulation results of both models are significantly improved compared with those of the LSTM model alone.
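The GM (group by month) step that distinguishes GM-EEMD-LSTM can be sketched directly; `group_by_month` is an illustrative helper, not from the paper, and the subsequent EEMD decomposition and LSTM fitting per group are omitted:

```python
def group_by_month(series, start_month=1):
    # GM step of GM-EEMD-LSTM: split a monthly runoff series into 12
    # per-calendar-month subsequences, each of which is then decomposed
    # by EEMD and predicted by its own LSTM model.
    groups = {m: [] for m in range(1, 13)}
    for i, value in enumerate(series):
        month = (start_month - 1 + i) % 12 + 1
        groups[month].append(value)
    return groups

flows = list(range(24))  # two years of synthetic monthly runoff
groups = group_by_month(flows)
print(groups[1], groups[12])  # prints "[0, 12] [11, 23]"
```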


2021 ◽  
Author(s):  
Martin Gauch ◽  
Frederik Kratzert ◽  
Grey Nearing ◽  
Jimmy Lin ◽  
Sepp Hochreiter ◽  
...  

Rainfall–runoff predictions are generally evaluated on reanalysis datasets such as the DayMet, Maurer, or NLDAS forcings in the CAMELS dataset. While useful for benchmarking, this does not fully reflect real-world applications. There, meteorological information is much coarser, and fine-grained predictions are at best available until the present. For any prediction of future discharge, we must rely on forecasts, which introduce an additional layer of uncertainty. Thus, the model inputs need to switch from past data to forecast data at some point, which raises several questions: How can we design models that support this transition? How can we design tests that evaluate the performance of the model? Aggravating the challenge, the past and future data products may include different variables or have different temporal resolutions.

We demonstrate how to seamlessly integrate past and future meteorological data in one deep learning model, using the recently proposed Multi-Timescale LSTM (MTS-LSTM, [1]). MTS-LSTMs are based on LSTMs but can generate rainfall–runoff predictions at multiple timescales more efficiently. One MTS-LSTM consists of several LSTMs organized in a branched structure. Each LSTM branch processes a part of the input time series at a certain temporal resolution, then passes its states to the next LSTM branch, thus sharing information across branches. We generalize this layout to handovers across data products (rather than just timescales) through an additional branch. This way, we can integrate past and future data in one prediction pipeline, yielding more accurate predictions.

[1] M. Gauch, F. Kratzert, D. Klotz, G. Nearing, J. Lin, and S. Hochreiter. "Rainfall–Runoff Prediction at Multiple Timescales with a Single Long Short-Term Memory Network." Hydrology and Earth System Sciences Discussions, in review, 2020.
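The state handover between branches can be sketched with a toy recurrent cell. A plain tanh RNN cell stands in for each LSTM branch here (a hypothetical simplification; the actual MTS-LSTM branches are LSTMs with per-branch parameters and state-transfer layers), and the input data are random placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)
dim_in, dim_h = 4, 8
W = 0.1 * rng.normal(size=(dim_h, dim_in))
U = 0.1 * rng.normal(size=(dim_h, dim_h))
b = np.zeros(dim_h)

def branch(inputs, h0):
    # One branch: a plain tanh RNN cell as a stand-in for an LSTM branch.
    h = h0
    for x in inputs:
        h = np.tanh(W @ x + U @ h + b)
    return h

past = rng.normal(size=(30, dim_in))   # hindcast inputs (e.g. reanalysis)
future = rng.normal(size=(7, dim_in))  # forecast inputs (e.g. an NWP product)

h_past = branch(past, np.zeros(dim_h))
# Handover: the forecast branch is initialized from the hindcast branch's
# final state, so information flows across the data-product boundary.
h_future = branch(future, h_past)
print(h_past.shape, h_future.shape)  # prints "(8,) (8,)"
```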


2021 ◽  
Author(s):  
Paul Muñoz ◽  
David F. Muñoz ◽  
Johanna Orellana-Alvear ◽  
Hamed Moftakhari ◽  
Hamid Moradkhani ◽  
...  

Current efforts in Deep Learning-based modeling are aimed at solving real-world problems with complex, or even not fully understood, interactions between predictors and target variables. A special artificial neural network, the Long Short-Term Memory (LSTM), is a promising data-driven modeling approach for dynamic systems, yet it has been little explored in hydrological applications such as runoff forecasting. An additional challenge to the forecasting task arises from the uncertainties introduced when using readily available Remote Sensing (RS) imagery to overcome the lack of in-situ data describing the runoff-governing processes. Here, we propose a runoff forecasting framework for a 300-km² mountain catchment located in the tropical Andes of Ecuador. The framework consists of real-time data acquisition, preprocessing, and runoff forecasting for lead times between 1 and 12 hours. LSTM models were fed with 18 years of hourly runoff, and precipitation data from the novel PERSIANN-Dynamic Infrared Rain Rate near real-time (PDIR-Now) product. Model efficiencies according to the NSE metric ranged from 0.959 to 0.554 for the 1- to 12-hour models, respectively. Considering that the concentration time of the catchment is approximately 4 hours, the proposed framework becomes a useful tool for delivering runoff forecasts to decision makers, stakeholders, and the public. This study has shown the suitability of the PDIR-Now product in an LSTM modeling framework for real-time hydrological applications. Future endeavors must focus on improving data representation and data assimilation through feature engineering strategies.

Keywords: Long Short-Term Memory; PDIR-Now; Hydroinformatics; Runoff forecasting; Tropical Andes
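Training one model per lead time, as in the 1- to 12-hour framework above, starts from building a supervised dataset per lead. A minimal sketch of that windowing step, with illustrative names and lookback length not taken from the paper:

```python
import numpy as np

def make_samples(runoff, precip, lookback=24, lead=4):
    # Build (input window, target) pairs for a `lead`-hour-ahead model;
    # repeating this for lead = 1..12 gives the family of lead-time models.
    X, y = [], []
    for t in range(lookback, len(runoff) - lead + 1):
        window = np.concatenate([runoff[t - lookback:t], precip[t - lookback:t]])
        X.append(window)
        y.append(runoff[t + lead - 1])
    return np.array(X), np.array(y)

runoff = np.arange(100.0)  # synthetic hourly runoff
precip = np.zeros(100)     # synthetic hourly precipitation
X, y = make_samples(runoff, precip)
print(X.shape, y.shape)  # prints "(73, 48) (73,)"
```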


2021 ◽  
Vol 35 (4) ◽  
pp. 1167-1181
Author(s):  
Yun Bai ◽  
Nejc Bezak ◽  
Bo Zeng ◽  
Chuan Li ◽  
Klaudija Sapač ◽  
...  
