probabilistic forecasts
Recently Published Documents

TOTAL DOCUMENTS: 591 (five years: 213)
H-INDEX: 55 (five years: 6)

2022
Author(s): Zachary J. Smith, J. Eric Bickel

In Weighted Scoring Rules and Convex Risk Measures, Dr. Zachary J. Smith and Prof. J. Eric Bickel (both at the University of Texas at Austin) present a general connection between weighted proper scoring rules and investment decisions involving the minimization of a convex risk measure. Weighted scoring rules are quantitative tools for evaluating the accuracy of probabilistic forecasts relative to a baseline distribution. In their paper, the authors demonstrate that this connection between convex risk measures and weighted scoring rules relates closely to previous economic characterizations of weighted scores based on expected utility maximization. As illustrative examples, the authors study two families of weighted scoring rules based on phi-divergences (generalizations of the Weighted Power and Weighted Pseudospherical scoring rules) along with their corresponding risk measures. The paper will be of particular interest to the decision analysis and mathematical finance communities, as well as to those interested in the elicitation and evaluation of subjective probabilistic forecasts.
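As a toy illustration of the general idea (not the authors' construction), the logarithmic score measured relative to a baseline distribution is a proper scoring rule of this weighted, baseline-relative form: reporting the true distribution maximizes the expected score. A minimal numerical check:

```python
import numpy as np

def weighted_log_score(forecast, outcome, baseline):
    """Logarithmic score of a categorical forecast, measured relative
    to a baseline distribution (higher is better)."""
    return np.log(forecast[outcome] / baseline[outcome])

def expected_score(forecast, truth, baseline):
    """Expected score when outcomes are drawn from `truth`."""
    return sum(truth[i] * weighted_log_score(forecast, i, baseline)
               for i in range(len(truth)))

truth = np.array([0.5, 0.3, 0.2])     # data-generating distribution
baseline = np.array([1 / 3] * 3)      # uniform baseline forecast
honest = expected_score(truth, truth, baseline)
hedged = expected_score(np.array([0.4, 0.4, 0.2]), truth, baseline)

# Propriety: reporting the true distribution maximizes the expected score.
assert honest > hedged
```

Relative to the uniform baseline, the honest expected score equals the Kullback-Leibler divergence from the baseline to the truth, which is why it is strictly positive here.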


2022, Vol 9
Author(s): Arnau Folch, Leonardo Mingari, Andrew T. Prata

Operational forecasting of volcanic ash and SO2 clouds is challenging due to the large uncertainties that typically exist in the eruption source term and in the mass removal mechanisms occurring downwind. Current operational forecast systems build on single-run deterministic scenarios that do not account for model input uncertainties and their propagation in time during transport. An ensemble-based forecast strategy has been implemented in the FALL3D-8.1 atmospheric dispersal model to configure, execute, and post-process an arbitrary number of ensemble members in a parallel workflow. In addition to intra-member model domain decomposition, a set of inter-member communicators defines a higher level of code parallelism to enable future incorporation of model data assimilation cycles. Two types of standard products are automatically generated by the ensemble post-processing task. On the one hand, deterministic forecast products result from some combination of the ensemble members (e.g., the ensemble mean or ensemble median), with an associated quantification of forecast uncertainty given by the ensemble spread. On the other hand, probabilistic products can be built from the percentage of members that verify a certain threshold condition. The novel aspect of FALL3D-8.1 is the automation of the ensemble-based workflow, including optional model validation. To this end, novel categorical forecast diagnostic metrics, originally defined in deterministic forecast contexts, are generalised here to probabilistic forecasts in order to obtain a single set of skill scores valid for both deterministic and probabilistic forecast contexts. Ensemble-based deterministic and probabilistic approaches are compared using different types of observation datasets (satellite cloud detection and retrieval, and deposit thickness observations) for the July 2018 Ambae eruption in the Vanuatu archipelago and the April 2015 Calbuco eruption in Chile. Both ensemble-based approaches outperform single-run simulations in all categorical metrics, although no clear conclusion can be drawn as to which of the two is the better option.
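The two product types can be sketched on a toy ensemble; the gamma-distributed "ash load" field and the threshold below are purely illustrative, not FALL3D output:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy ensemble: 12 members, each a 4x5 grid of ash column load (g/m^2).
members = rng.gamma(shape=2.0, scale=1.5, size=(12, 4, 5))

# Deterministic products: combinations of the ensemble members.
ens_mean = members.mean(axis=0)
ens_median = np.median(members, axis=0)
spread = members.std(axis=0)   # forecast-uncertainty quantification

# Probabilistic product: fraction of members exceeding a threshold.
threshold = 2.0
prob_exceed = (members > threshold).mean(axis=0)

assert 0.0 <= prob_exceed.min() and prob_exceed.max() <= 1.0
```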


Atmosphere, 2021, Vol 12 (12), pp. 1643
Author(s): Hee-Wook Choi, Yeon-Hee Kim, Keunhee Han, Chansoo Kim

Wind shear can occur at all flight levels; however, it is particularly dangerous at low levels, from the ground up to approximately 2000 feet. If this phenomenon occurs during take-off or landing, it may interfere with the aircraft's normal altitude changes, causing flight delays and cancellations as well as economic damage. In this paper, to estimate probabilistic forecasts of low-level wind shear at Gimpo, Gimhae, Incheon and Jeju International Airports, an Ensemble Model Output Statistics (EMOS) model based on a normal distribution left-truncated at zero was applied. Observations were obtained from Gimpo, Gimhae, Incheon and Jeju International Airports, along with 13 ensemble member forecasts generated by the Limited-Area Ensemble Prediction System (LENS), for the period December 2018 to February 2020. Prior to applying the EMOS model, statistical consistency was analyzed using a rank histogram and kernel density estimation to assess the uniformity of the ensembles with respect to the corresponding observations. Performance was evaluated using the mean absolute error, the continuous ranked probability score and the probability integral transform. The results showed that probabilistic forecasts obtained from the EMOS model exhibited better prediction skill compared to the raw ensembles.
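A zero-truncated EMOS predictive distribution and its evaluation can be sketched as follows; the regression coefficients and the Monte Carlo CRPS estimator are illustrative stand-ins, not the paper's fitted values:

```python
import numpy as np
from scipy.stats import truncnorm

def emos_truncnorm(ens_mean, ens_var, a=0.2, b=0.9, c=0.1, d=1.0):
    """Predictive distribution: normal with EMOS-style linear mean and
    variance, truncated at zero (coefficients are illustrative)."""
    mu = a + b * ens_mean
    sigma = np.sqrt(c + d * ens_var)
    return truncnorm(a=(0.0 - mu) / sigma, b=np.inf, loc=mu, scale=sigma)

def crps_sample(dist, obs, n=20000, seed=0):
    """Monte Carlo CRPS: E|X - y| - 0.5 * E|X - X'| (lower is better)."""
    x = dist.rvs(size=n, random_state=seed)
    x2 = dist.rvs(size=n, random_state=seed + 1)
    return np.abs(x - obs).mean() - 0.5 * np.abs(x - x2).mean()

dist = emos_truncnorm(ens_mean=3.0, ens_var=1.5)
pit = dist.cdf(2.8)        # probability integral transform of the obs
crps = crps_sample(dist, 2.8)
```

In practice the coefficients are estimated by minimizing the average CRPS or the negative log-likelihood over a training period.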


Atmosphere, 2021, Vol 12 (12), pp. 1630
Author(s): Andrew Wilkins, Aaron Johnson, Xuguang Wang, Nicholas A. Gasperoni, Yongming Wang

Convection-allowing model (CAM) ensembles have a distinctive ability to predict convective initiation location, mode, and morphology. Previous studies on CAM ensemble verification have primarily used neighborhood-based methods. A recently introduced object-based probabilistic (OBPROB) framework provides an alternative and novel framework in which to re-evaluate aspects of optimal CAM ensemble design, with an emphasis on ensemble storm mode and morphology prediction. Herein, we adopt and extend the OBPROB method in conjunction with a traditional neighborhood-based method to evaluate forecasts of four differently configured 10-member CAM ensembles. The configurations include two single-model/single-physics, one single-model/multi-physics, and one multi-model/multi-physics configuration. Both the OBPROB and neighborhood frameworks show that ensembles with more diverse member-to-member designs improve probabilistic forecasts over single-model/single-physics designs through greater sampling of different aspects of forecast uncertainty. Individual case studies are evaluated to reveal the distinct forecast features responsible for the systematic results identified by the different frameworks. Neighborhood verification, even at high reflectivity thresholds, is primarily impacted by the mesoscale locations of convective and stratiform precipitation across scales. In contrast, OBPROB verification explicitly focuses on convective precipitation only and is sensitive to the morphology of similarly located storms.
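The neighborhood side of such verification can be sketched with a fractions-type score on illustrative fields (the OBPROB framework itself requires object identification and is not reproduced here; the threshold and displacement below are made up):

```python
import numpy as np
from scipy.ndimage import uniform_filter

def neighborhood_prob(field, threshold, size):
    """Fraction of grid points in a size x size neighborhood exceeding
    the threshold (the basis of fractions-type neighborhood scores)."""
    return uniform_filter((field > threshold).astype(float), size=size,
                          mode="constant", cval=0.0)

def fbs(p_fcst, p_obs):
    """Fractions Brier score between forecast and observed
    neighborhood probabilities (lower is better)."""
    return np.mean((p_fcst - p_obs) ** 2)

rng = np.random.default_rng(1)
fcst = rng.gamma(2.0, 10.0, size=(50, 50))  # toy reflectivity-like field
obs = np.roll(fcst, shift=3, axis=1)        # spatially displaced "truth"

s_small = fbs(neighborhood_prob(fcst, 35, 3), neighborhood_prob(obs, 35, 3))
s_large = fbs(neighborhood_prob(fcst, 35, 11), neighborhood_prob(obs, 35, 11))
# Larger neighborhoods forgive small displacement errors.
assert s_large < s_small
```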


Author(s): Peter A. Gao, Hannah M. Director, Cecilia M. Bitz, Adrian E. Raftery

Abstract
In recent decades, warming temperatures have caused sharp reductions in the volume of sea ice in the Arctic Ocean. Predicting changes in Arctic sea ice thickness is vital in a changing Arctic for making decisions about shipping and resource management in the region. We propose a statistical spatio-temporal two-stage model for sea ice thickness and use it to generate probabilistic forecasts up to three months into the future. Our approach combines a contour model to predict the ice-covered region with a Gaussian random field to model ice thickness conditional on the ice-covered region. Using the most complete estimates of sea ice thickness currently available, we apply our method to forecast Arctic sea ice thickness. Point predictions and prediction intervals from our model offer comparable accuracy and improved calibration compared with existing forecasts. We show that existing forecasts produced by ensembles of deterministic dynamic models can have large errors and poor calibration. We also show that our statistical model can generate good forecasts of aggregate quantities such as overall and regional sea ice volume. Supplementary materials accompanying this paper appear on-line.
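A toy sketch of the two-stage idea under strong simplifications: a circular region with random radius stands in for the fitted contour model, and smoothed white noise for the Gaussian random field on the ice-covered region.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(2)
n = 40

# Stage 1: predict the ice-covered region (toy circular "contour"
# with a random radius in place of the fitted contour model).
yy, xx = np.mgrid[0:n, 0:n]
radius = rng.normal(loc=14.0, scale=1.5)
ice_mask = (xx - n / 2) ** 2 + (yy - n / 2) ** 2 <= radius ** 2

# Stage 2: thickness on the ice-covered region from a spatially
# correlated Gaussian field (smoothed white noise as a crude stand-in).
field = gaussian_filter(rng.normal(size=(n, n)), sigma=4.0)
thickness = np.where(ice_mask, np.clip(2.0 + 1.5 * field, 0.0, None), 0.0)

# Aggregate quantity: total "volume" (thickness summed over grid cells).
volume = thickness.sum()
```

Repeating both stages with fresh random draws yields an ensemble of fields from which probabilistic forecasts of thickness and volume can be read off.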


2021
Author(s): Estee Y Cramer, Yuxin Huang, Yijin Wang, Evan L Ray, Matthew Cornell, ...

Academic researchers, government agencies, industry groups, and individuals have produced forecasts at an unprecedented scale during the COVID-19 pandemic. To leverage these forecasts, the United States Centers for Disease Control and Prevention (CDC) partnered with an academic research lab at the University of Massachusetts Amherst to create the US COVID-19 Forecast Hub. Launched in April 2020, the Forecast Hub is a dataset with point and probabilistic forecasts of incident hospitalizations, incident cases, incident deaths, and cumulative deaths due to COVID-19 at national, state, and county levels in the United States. Included forecasts represent a variety of modeling approaches, data sources, and assumptions regarding the spread of COVID-19. The goal of this dataset is to establish a standardized and comparable set of short-term forecasts from modeling teams. These data can be used to develop ensemble models, communicate forecasts to the public, create visualizations, compare models, and inform policies regarding COVID-19 mitigation. These open-source data are available via download from GitHub, through an online API, and through R packages.
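The standardized quantile-based format can be illustrated with a few made-up rows (the column names follow the Hub's general scheme, but the values below are invented for illustration only):

```python
import pandas as pd

# Illustrative rows in a quantile-based forecast format; real files
# are available from the Forecast Hub's GitHub repository and API.
rows = pd.DataFrame({
    "target": ["1 wk ahead inc death"] * 4,
    "location": ["US"] * 4,
    "type": ["point", "quantile", "quantile", "quantile"],
    "quantile": [None, 0.025, 0.5, 0.975],
    "value": [9800.0, 7200.0, 9800.0, 13100.0],
})

# Extract a central 95% prediction interval from the quantile rows.
q = rows[rows["type"] == "quantile"].set_index("quantile")["value"]
lower, median, upper = q[0.025], q[0.5], q[0.975]
assert lower <= median <= upper
```

Storing forecasts as quantiles of the predictive distribution is what makes submissions from very different modeling approaches directly comparable and combinable into ensembles.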


2021, Vol 302, pp. 117498
Author(s): Jorge Ángel González-Ordiano, Tillmann Mühlpfordt, Eric Braun, Jianlei Liu, Hüseyin Çakmak, ...

Forecasting, 2021, Vol 3 (4), pp. 763-773
Author(s): Tae-Ho Kang, Ashish Sharma, Lucy Marshall

The verification of probabilistic forecasts in hydro-climatology is integral to their development, use, and adoption. We propose here a means of utilizing goodness-of-fit measures for verifying the reliability of probabilistic forecasts. The difficulty in measuring goodness of fit for a probabilistic prediction or forecast is that the predicted probability distributions for a target variable are not stationary in time, meaning that only one observation exists to quantify goodness of fit for each prediction issued. Therefore, we suggest a transformation that dissociates the target information from the time-variant part; the target to be verified in this study is the alignment of observations with the predicted probability distributions. For this dissociation, the probability integral transform is used. To measure the goodness of fit of the predicted probability distributions, this study uses the root mean squared deviation metric. If the observations after the transformation can be assumed to be independent, the squared deviation metric becomes a chi-square test statistic, which enables statistical testing of the hypothesis that the observations come from the same population as the predicted probability distributions. An illustration of our proposed rationale is provided using the multi-model ensemble prediction for the El Niño–Southern Oscillation.
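A minimal sketch of this rationale, using synthetic Gaussian forecasts that are reliable by construction (all distributions and bin counts below are illustrative choices, not the paper's setup):

```python
import numpy as np
from scipy.stats import norm, chi2

rng = np.random.default_rng(3)

# Each forecast issues its own predictive distribution; the PIT
# u_t = F_t(y_t) maps every (forecast, observation) pair onto [0, 1],
# removing the time-variant part.
mus = rng.normal(0.0, 2.0, size=400)
obs = rng.normal(mus, 1.0)           # reliable forecasts by design
pit = norm.cdf(obs, loc=mus, scale=1.0)

# Bin the PIT values; under reliability the counts are uniform, and the
# scaled squared deviation from uniformity is a chi-square statistic.
k = 10
counts, _ = np.histogram(pit, bins=k, range=(0.0, 1.0))
expected = len(pit) / k
stat = np.sum((counts - expected) ** 2 / expected)
p_value = chi2.sf(stat, df=k - 1)    # large p: no evidence against reliability
```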


2021
Author(s): Honglin Wen, Pierre Pinson, Jinghuan Ma, Jie Gu, Zhijiang Jin

We present a data-driven approach for probabilistic wind power forecasting based on conditional normalizing flows (CNF). In contrast with existing approaches, this approach is distribution-free (as are non-parametric and quantile-based approaches) and can directly yield continuous probability densities, hence avoiding quantile crossing. It relies on a base distribution and a set of bijective mappings. Both the shape parameters of the base distribution and the bijective mappings are approximated with neural networks. A spline-based conditional normalizing flow is considered owing to its universal approximation capability. During the training phase, the model sequentially maps input examples onto samples of the base distribution, where parameters are estimated through maximum likelihood. To issue probabilistic forecasts, one eventually maps samples of the base distribution into samples of the desired distribution. Case studies based on open datasets validate the effectiveness of the proposed model and allow us to discuss its advantages and caveats with respect to the state of the art. Code will be released upon publication.
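The change-of-variables mechanics behind a normalizing flow can be shown with a single affine mapping, a deliberately minimal stand-in for the paper's spline-based neural CNF:

```python
import numpy as np
from scipy.stats import norm

# One affine "flow" layer: x = a + b * z with z ~ N(0, 1).
a, b = 5.0, 2.0
f = lambda z: a + b * z           # bijective mapping
f_inv = lambda x: (x - a) / b

def flow_density(x):
    """Change of variables: p_X(x) = p_Z(f_inv(x)) * |d f_inv / dx|."""
    return norm.pdf(f_inv(x)) * abs(1.0 / b)

# Sampling: push base-distribution samples through the mapping.
rng = np.random.default_rng(4)
samples = f(rng.standard_normal(10000))

# Here the flow density coincides with the known N(a, b^2) density.
assert abs(flow_density(5.0) - norm.pdf(5.0, loc=5.0, scale=2.0)) < 1e-12
```

In a CNF, the mapping's parameters are produced by a neural network conditioned on the forecast inputs, and training maximizes exactly this change-of-variables likelihood.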



