A New Forecast Model Based on the Analog Method for Persistent Extreme Precipitation

2016 ◽  
Vol 31 (4) ◽  
pp. 1325-1341 ◽  
Author(s):  
Baiquan Zhou ◽  
Panmao Zhai

Abstract This study aims to establish an analog prediction model for forecasting daily persistent extreme precipitation (PEP) during a PEP event (PEPE), using temporal sequences of predictors with different weights applied in the atmospheric spatial field. The predictors are atmospheric variables in areas where the key influential systems of a PEPE are active, taken from the THORPEX Interactive Grand Global Ensemble (TIGGE) dataset. By means of the cosine similarity measure and the cuckoo search technique, a forecast model was established and named the Key Influential Systems Based Analog Model (KISAM). Validation through threat scores (TSs) and root-mean-square errors for PEP during 17–25 June 2010 indicates that KISAM identifies approaching PEP earlier and yields more accurate forecasts of the location and intensity of PEP than direct model output (DMO) at 3-day and longer lead times in the Yangtze–Huai River valley. For the independent PEPE case on 17–19 June 2010, KISAM predicts the PEPE about 8 days in advance, much earlier than DMO. In addition, KISAM produces better intensity forecasts and captures the extent of the PEPE better than DMO at the same 5-day lead time. In forecast experiments during June 2010 and June 2015, KISAM shows stronger capability than DMO in predicting the occurrence and intensity of extreme precipitation (EP) and PEP events at lead times of one week or even longer. Validation against additional EP events further confirms that KISAM outperforms DMO on average at 3-day and longer lead times.
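KISAM's analog selection rests on the cosine similarity measure applied to predictor fields. A minimal sketch of that matching step (the function names are ours, and the per-predictor weights that the paper tunes with cuckoo search are omitted):

```python
import numpy as np

def cosine_similarity(field_a, field_b):
    """Cosine similarity between two flattened atmospheric predictor fields."""
    a, b = np.ravel(field_a), np.ravel(field_b)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_best_analog(forecast_field, archive_fields):
    """Index of the archived field most similar to the current forecast."""
    scores = [cosine_similarity(forecast_field, f) for f in archive_fields]
    return int(np.argmax(scores))
```

In the full model, one such score would be computed per predictor and lead time, then combined with the cuckoo-search-optimized weights before ranking the archive.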

2018 ◽  
Vol 33 (1) ◽  
pp. 221-238 ◽  
Author(s):  
Baiquan Zhou ◽  
Panmao Zhai ◽  
Ruoyun Niu

Abstract Two persistent extreme precipitation events (PEPEs) that caused severe flooding in the Yangtze–Huai River valley in summer 2016 presented a significant challenge to operational forecasters. To provide forecasters with useful references, the capacity of two objective forecast models in predicting these two PEPEs is investigated. The objective models include a numerical weather prediction (NWP) model from the European Centre for Medium-Range Weather Forecasts (ECMWF), and a statistical downscaling model, the Key Influential Systems Based Analog Model (KISAM). Results show that the ECMWF ensemble provides a skillful spectrum of solutions for determining the location of the daily heavy precipitation (≥25 mm day−1) during the PEPEs, despite its general underestimation of heavy precipitation. For lead times longer than 3 days, KISAM outperforms the ensemble mean and nearly one-half or more of all the ensemble members of ECMWF. Moreover, at longer lead times, KISAM generally performs better in reproducing the meridional location of accumulated rainfall over the two PEPEs compared to the ECMWF ensemble mean and the control run. Further verification of the vertical velocity that affects the production of heavy rainfall in ECMWF and KISAM implies the quality of the depiction of ascending motion during the PEPEs has a dominating influence on the models’ performance in predicting the meridional location of the PEPEs at all lead times. The superiority of KISAM indicates that statistical downscaling techniques are effective in alleviating the deficiency of global NWP models for PEPE forecasts in the medium range of 4–10 days.


2016 ◽  
Author(s):  
Elcin Tan

Abstract. Providing high accuracy in quantitative extreme precipitation forecasting (QEPF) is still a challenge. California is vulnerable to extreme precipitation, which occurs due to atmospheric rivers and may become more intense with climate change. Accordingly, this study evaluates the extreme precipitation forecasting performance of a QEPF model, the Weather Research and Forecasting (WRF) Model, version 3.1.7, for the extreme precipitation event that caused the 1997 New Year’s flood in California. Sensitivities of 19 microphysics (MP) schemes are tested using 18 Goodness of Fit (GoF) tests for hourly, point-wise comparisons between 3-km horizontal-resolution WRF simulations and observations. The results indicate that the coefficient of persistence (cp) is the first metric to evaluate, because it determines whether the simulated versus observed values are reasonable at all. Only 3 of the 8 stations in the American River Watershed passed this test. The results also show that the Normalized Root Mean Square Error (NRMSE) and Percent Bias (PBIAS) metrics are more representative than the others owing to their ability to discriminate between model performances. Further, the MP schemes are significantly sensitive to location. Although the 3 stations that passed the cp test are spatially quite close to each other, different MP schemes become prominent at different observation locations. For instance, for the ALP station, MP3, MP8, MP17, and MP28 perform better, whereas the errors of MP3, MP8, MP9, and MP17 are lower than those of the other MPs for the BTA station. For the CAP station, however, only MP11 yields reasonable results according to its cp value. The MPs are also evaluated for 72-hr, basin-averaged precipitation estimates of the WRF Model by means of true percent relative errors. The results show that the accuracy of the WRF Model is much higher for the 72-hr total basin-averaged evaluations than for the hourly, point-wise comparisons. In these evaluations, the Thompson scheme (MP8) gives the most trustworthy results, with a 3.1 % true percent relative error. Although the WRF simulations overestimate the 72-hr basin-averaged precipitation for most of the MP schemes, this overestimation may not appear for moderate, heavy, and extreme precipitation when hourly, point-wise evaluations are performed, but it does hold for light precipitation.
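The two metrics the study singles out as most discriminating, NRMSE and PBIAS, are standard goodness-of-fit measures. A minimal sketch, assuming range normalization for NRMSE and the sign convention in which positive PBIAS means overestimation (conventions vary between studies):

```python
import numpy as np

def nrmse(sim, obs):
    """Normalized RMSE: RMSE divided by the observed range (obs must vary)."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(np.sqrt(np.mean((sim - obs) ** 2)) / (obs.max() - obs.min()))

def pbias(sim, obs):
    """Percent bias: positive values indicate average overestimation."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return float(100.0 * (sim - obs).sum() / obs.sum())
```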


2011 ◽  
Vol 139 (6) ◽  
pp. 1960-1971 ◽  
Author(s):  
Jakob W. Messner ◽  
Georg J. Mayr

Abstract Three methods for making probabilistic weather forecasts using analogs are presented and tested. The basic idea of these methods is that finding past NWP model forecasts similar to the current one in an archive and taking the corresponding analyses as the prediction should remove all systematic errors of the model. Furthermore, this statistical postprocessing can convert NWP forecasts into forecasts for point locations and easily turn deterministic forecasts into probabilistic ones. The methods are tested in the idealized Lorenz96 system and compared to a benchmark bracket formed by ensemble relative frequencies from direct model output and by logistic regression. The analog methods excel at longer lead times.
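For illustration, the idealized test bed and the analog idea can be sketched as follows: the Lorenz96 tendency equation, plus a relative-frequency analog forecast that treats the verifying analyses of the k most similar archived forecasts as an ensemble (the Euclidean distance metric and the function names are our assumptions, not necessarily the authors' choices):

```python
import numpy as np

def lorenz96_tendency(x, forcing=8.0):
    """dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F, with cyclic indices."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def analog_probability(current_fc, past_fcs, past_analyses, threshold, k=20):
    """Event probability = fraction of the k closest archived forecasts whose
    verifying analyses exceeded the threshold."""
    dists = np.linalg.norm(past_fcs - current_fc, axis=1)
    nearest = np.argsort(dists)[:k]
    return float(np.mean(past_analyses[nearest] > threshold))
```

Because the prediction is read off the analyses rather than the model trajectory, any systematic model bias present in both the current and archived forecasts cancels out of the final probability.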


2014 ◽  
Vol 5 (2) ◽  
pp. 1-21 ◽  
Author(s):  
Arpita Sharma ◽  
Samiksha Goel

This paper proposes two novel nature-inspired decision-level fusion techniques, Cuckoo Search Decision Fusion (CSDF) and Improved Cuckoo Search Decision Fusion (ICSDF), for enhanced and refined extraction of terrain features from remote sensing data. The techniques derive their basis from the recently introduced bio-inspired meta-heuristic Cuckoo Search, modified suitably for use as a fusion technique. The algorithms are validated on remote sensing satellite images acquired by multispectral sensors, namely a LISS3 sensor image of the Alwar region in Rajasthan, India, and a LANDSAT sensor image of the Delhi region, India. The overall accuracies obtained are substantially better than those of the four individual terrain classifiers used for fusion. Results are also compared with majority-voting and average-weighting fusion strategies. A notable achievement of the proposed fusion techniques is that the two difficult-to-identify terrains, barren and urban, are identified with accuracies similar to those of the other, well-identified land cover types, which was not possible with any single analyzer.
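Cuckoo Search, on which both fusion techniques are built, explores the search space with Lévy flights. The heavy-tailed step generator is commonly implemented with Mantegna's algorithm; a generic sketch of that building block (not the authors' code, and the best-biased step form is one common variant):

```python
import numpy as np
from math import gamma, sin, pi

def levy_step(dim, beta=1.5, rng=None):
    """One Levy-flight step via Mantegna's algorithm: heavy-tailed random
    steps that mix many small moves with occasional long jumps."""
    rng = rng if rng is not None else np.random.default_rng()
    sigma = (gamma(1 + beta) * sin(pi * beta / 2)
             / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_step(x, best, alpha=0.01, rng=None):
    """Propose a new nest from x, with step size scaled by the distance
    to the current best solution (a common cuckoo search variant)."""
    rng = rng if rng is not None else np.random.default_rng()
    return x + alpha * levy_step(x.size, rng=rng) * (x - best)
```

The long jumps let the search escape poor local weightings of the individual classifiers, which is the property the fusion techniques exploit.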


Author(s):  
Guangyu Zhou ◽  
Aijia Ouyang ◽  
Yuming Xu

To overcome the shortcomings of the basic glowworm swarm optimization (GSO) algorithm, such as low accuracy, slow convergence, and a tendency to fall into local minima, a chaos algorithm and a cloud model algorithm are introduced to optimize the evolution mechanism of GSO, and a chaos GSO algorithm based on the cloud model (CMCGSO) is proposed in this paper. Simulation results on global optimization benchmark functions show that CMCGSO performs better than cuckoo search (CS), invasive weed optimization (IWO), hybrid particle swarm optimization (HPSO), and chaos glowworm swarm optimization (CGSO), with the advantages of high accuracy, fast convergence, and strong robustness in finding the global optimum. Finally, the CMCGSO algorithm is applied to face recognition, and the results are better than those of methods reported in the literature.


2017 ◽  
Vol 10 (5) ◽  
pp. 1813-1821
Author(s):  
Pengfei Xia ◽  
Shirong Ye ◽  
Kecai Jiang ◽  
Dezhong Chen

Abstract. In the GPS radio occultation technique, the atmospheric excess phase (AEP) can be used to derive the refractivity, an important quantity in numerical weather prediction. The AEP is conventionally estimated with GPS double-difference or single-difference techniques. These techniques, however, rely on reference data in the processing, which increases the computational complexity. In this study, an undifferenced (ND) processing strategy is proposed to estimate the AEP. First, we use the PANDA (Positioning and Navigation Data Analyst) software to perform precise orbit determination (POD) to acquire the position and velocity of the mass centre of the COSMIC (Constellation Observing System for Meteorology, Ionosphere and Climate) satellites and the corresponding receiver clock offset. The bending angle, refractivity, and dry temperature profiles are then derived from the estimated AEP using the Radio Occultation Processing Package (ROPP) software. The ND method is validated against COSMIC products for typical rising and setting occultation events. Results indicate that the rms (root mean square) errors of the relative refractivity differences between the undifferenced profiles and the atmospheric profiles (atmPrf) provided by UCAR/CDAAC (University Corporation for Atmospheric Research/COSMIC Data Analysis and Archive Centre) are below 4 % and 3 % for rising and setting occultation events, respectively. In addition, we compare the relative refractivity bias between ND-derived profiles and atmPrf profiles for 200 globally distributed COSMIC occultation events on 12 December 2013. The statistics indicate that the average rms relative refractivity deviation between ND-derived and COSMIC profiles is below 2 % for rising occultation events and below 1.7 % for setting occultation events.
Moreover, the COSMIC refractivity profiles obtained with the ND processing strategy are further validated against European Centre for Medium-Range Weather Forecasts (ECMWF) analysis data, and the results indicate that the undifferenced method reduces the noise level on the excess phase paths in the lower troposphere compared to the single-difference processing strategy.
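Two pieces of the retrieval and validation chain can be sketched generically (these are textbook formulas, not the paper's processing code): the Smith–Weintraub relation linking refractivity to pressure, temperature, and water vapour, and the rms relative refractivity difference used as the validation statistic:

```python
import numpy as np

def refractivity(p_hpa, t_kelvin, e_hpa):
    """Smith-Weintraub refractivity (N-units): N = 77.6 P/T + 3.73e5 e/T^2,
    with pressure P and water vapour pressure e in hPa, temperature T in K."""
    return 77.6 * p_hpa / t_kelvin + 3.73e5 * e_hpa / t_kelvin ** 2

def rms_relative_diff(n_test, n_ref):
    """rms of relative refractivity differences, in percent."""
    rel = 100.0 * (np.asarray(n_test) - np.asarray(n_ref)) / np.asarray(n_ref)
    return float(np.sqrt(np.mean(rel ** 2)))
```

The dry term dominates in the upper troposphere, while the water-vapour term makes the lower troposphere both informative and noisy, which is where the abstract reports the ND strategy helps most.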


2020 ◽  
Vol 21 (4) ◽  
pp. 751-771 ◽  
Author(s):  
Brian Henn ◽  
Rachel Weihs ◽  
Andrew C. Martin ◽  
F. Martin Ralph ◽  
Tashiana Osborne

Abstract The partitioning of rain and snow during atmospheric river (AR) storms is a critical factor in flood forecasting, water resources planning, and reservoir operations. Forecasts of atmospheric rain–snow levels from December 2016 to March 2017, a period of active AR landfalls, are evaluated using 19 profiling radars in California. Three forecast model products are assessed: a global forecast model downscaled to 3-km grid spacing, 4-km river forecast center operational forecasts, and 50-km global ensemble reforecasts. Model forecasts of the rain–snow level are compared with observations of rain–snow melting-level brightband heights. Models produce mean bias magnitudes of less than 200 m across a range of forecast lead times. Error magnitudes increase with lead time and are similar between models, averaging 342 m for lead times of 24 h or less and growing to 700–800 m for lead times of greater than 144 h. Observed extremes in the rain–snow level are underestimated, particularly for warmer events, and the magnitude of errors increases with rain–snow level. Storms with high rain–snow levels are correlated with larger observed precipitation rates in Sierra Nevada watersheds. Flood risk increases with rain–snow levels, not only because a greater fraction of the watershed receives rain, but also because warmer storms carry greater water vapor and thus can produce heavier precipitation. The uncertainty of flood forecasts grows nonlinearly with the rain–snow level for these reasons as well. High rain–snow level ARs are a major flood hazard in California and are projected to be more prevalent with climate warming.


2021 ◽  
Vol 2083 (3) ◽  
pp. 032010
Author(s):  
Rong Ma

Abstract The traditional BP neural network struggles to achieve the target accuracy in the prediction of waterway cargo turnover. To improve forecast accuracy, a waterway cargo turnover forecast model was created in which a genetic algorithm optimizes the neural network's parameters. The genetic algorithm avoids the trap into which general iterative methods easily fall, namely the “endless loop” of stagnating in a local minimum; it also requires little computation time and is highly robust. The BP neural network optimized by the genetic algorithm is used to predict waterway cargo turnover, and an empirical analysis of the forecast is carried out. The results show that the genetically optimized neural network predicts waterway cargo turnover with higher accuracy than the traditional BP neural network, and long-term analysis of the characteristics of waterway cargo turnover changes shows that the optimized model's prediction performance is far better than that of traditional neural networks.
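The core idea, searching the network's weight space with a genetic algorithm rather than gradient descent alone, can be sketched with a minimal real-coded GA (an illustration under our own operator choices; the paper's GA settings and network architecture are not specified here):

```python
import numpy as np

def ga_optimize(fitness, dim, pop_size=30, generations=100,
                mut_sigma=0.3, rng=None):
    """Minimal real-coded GA: truncation selection, blend crossover,
    Gaussian mutation, and elitism. Returns the best vector found."""
    rng = rng or np.random.default_rng(0)
    pop = rng.normal(0.0, 1.0, (pop_size, dim))
    for _ in range(generations):
        fit = np.array([fitness(ind) for ind in pop])
        order = np.argsort(fit)
        parents = pop[order[: pop_size // 2]]        # keep the better half
        children = []
        while len(children) < pop_size - 1:
            p1, p2 = parents[rng.integers(len(parents), size=2)]
            alpha = rng.random()
            child = alpha * p1 + (1.0 - alpha) * p2  # blend crossover
            mask = rng.random(dim) < 0.2             # mutate ~20% of genes
            child = child + mask * rng.normal(0.0, mut_sigma, dim)
            children.append(child)
        pop = np.vstack([pop[order[0]], *children])  # elitism + offspring
    fit = np.array([fitness(ind) for ind in pop])
    return pop[fit.argmin()]
```

For the forecasting application, `fitness` would be the BP network's training error as a function of its weight vector, and the GA's result would seed (or replace) gradient-based training; that coupling is our reading of the approach, not code from the paper.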


2011 ◽  
Vol 139 (2) ◽  
pp. 332-350 ◽  
Author(s):  
Charles Jones ◽  
Jon Gottschalck ◽  
Leila M. V. Carvalho ◽  
Wayne Higgins

Abstract Extreme precipitation events are among the most devastating weather phenomena since they are frequently accompanied by loss of life and property. This study uses reforecasts of the NCEP Climate Forecast System (CFS.v1) to evaluate the skill of nonprobabilistic and probabilistic forecasts of extreme precipitation in the contiguous United States (CONUS) during boreal winter for lead times up to two weeks. The CFS model realistically simulates the spatial patterns of extreme precipitation events over the CONUS, although the magnitudes of the extremes in the model are much larger than in the observations. Heidke skill scores (HSS) for forecasts of extreme precipitation at the 75th and 90th percentiles showed that the CFS model has good skill at week 1 and modest skill at week 2. Forecast skill is usually higher when the Madden–Julian oscillation (MJO) is active and has enhanced convection occurring over the Western Hemisphere, Africa, and/or the western Indian Ocean than in quiescent periods. HSS greater than 0.1 extends to lead times of up to two weeks in these situations. Approximately 10%–30% of the CONUS has HSS greater than 0.1 at lead times of 1–14 days when the MJO is active. Probabilistic forecasts for extreme precipitation events at the 75th percentile show improvements over climatology of 0%–40% at 1-day lead and 0%–5% at 7-day leads. The CFS has better skill in forecasting severe extremes (i.e., events exceeding the 90th percentile) at longer leads than moderate extremes (75th percentile). Improvements over climatology between 10% and 30% at leads of 3 days are observed over several areas across the CONUS—especially in California and in the Midwest.
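The Heidke skill score used in this verification measures categorical forecast accuracy relative to random chance, computed from a 2×2 contingency table of event/non-event forecasts and observations:

```python
def heidke_skill_score(hits, false_alarms, misses, correct_negatives):
    """HSS = 2(ad - bc) / [(a + c)(c + d) + (a + b)(b + d)];
    1 = perfect, 0 = no better than chance, < 0 = worse than chance."""
    a, b, c, d = hits, false_alarms, misses, correct_negatives
    return 2.0 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
```

Here the counts would come from thresholding forecast and observed precipitation at the 75th or 90th percentile at each grid point, so an HSS of 0.1 at week 2 is a small but genuine improvement over chance.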


Author(s):  
Quyen

Storm surge is a typical natural disaster originating from the ocean, so accurately forecasting surges is a vital task for avoiding property losses and reducing the risks posed by tropical storm surges. Genetic Programming (GP) is an evolution-based model learning technique that can simultaneously find the functional form and the numeric coefficients of a model, and it has been widely applied to build models for predictive problems. However, GP has seldom been applied to storm surge forecasting. In this paper, a new method for using GP to evolve storm surge forecasting models is proposed. Experimental results on datasets collected from the Tottori coast of Japan show that GP can produce more accurate storm surge forecasting models than other standard machine learning methods. Moreover, GP can automatically select relevant features while evolving the models, and the models it develops are interpretable.
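A stripped-down GP loop for symbolic regression conveys how both the functional form and the coefficients can be evolved at once (a mutation-only sketch with hypothetical predictor names; real GP systems, including presumably the paper's, add crossover and richer operator sets):

```python
import random
import operator

OPS = {'+': operator.add, '-': operator.sub, '*': operator.mul}
VARS = ['x0', 'x1']  # hypothetical predictors, e.g. lagged surge and wind

def random_expr(depth=3):
    """Random expression tree: nested (op, left, right) tuples over
    variables and constants."""
    if depth == 0 or random.random() < 0.3:
        return random.choice(VARS + [round(random.uniform(-1.0, 1.0), 2)])
    op = random.choice(list(OPS))
    return (op, random_expr(depth - 1), random_expr(depth - 1))

def evaluate(expr, env):
    """Evaluate an expression tree given a {variable: value} environment."""
    if isinstance(expr, tuple):
        op, left, right = expr
        return OPS[op](evaluate(left, env), evaluate(right, env))
    return env[expr] if isinstance(expr, str) else expr

def mse(expr, data):
    """Fitness: mean squared error over (environment, target) pairs."""
    return sum((evaluate(expr, env) - y) ** 2 for env, y in data) / len(data)

def mutate(expr, depth=2):
    """Replace a randomly chosen subtree with a fresh random one."""
    if not isinstance(expr, tuple) or random.random() < 0.3:
        return random_expr(depth)
    op, left, right = expr
    if random.random() < 0.5:
        return (op, mutate(left, depth), right)
    return (op, left, mutate(right, depth))

def evolve(data, pop_size=60, generations=40):
    """Truncation selection plus mutation; returns the best expression."""
    pop = [random_expr() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda e: mse(e, data))
        survivors = pop[: pop_size // 3]
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=lambda e: mse(e, data))
```

The returned tree is directly readable as a formula, which is the interpretability property the abstract highlights; variables that never appear in the winning tree have effectively been deselected as features.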

