STAR-ESDM: A New Bias Correction Method for High-Resolution Station- and Grid-Based Climate Projections

Author(s):  
Katharine Hayhoe ◽  
Anne Marie Stoner ◽  
Ian Scott-Fleming ◽  
Hamed Ibrahim

The Seasonal Trends and Analysis of Residuals (STAR) Empirical-Statistical Downscaling Model (ESDM) is a new bias correction and downscaling method that employs a signal processing approach to decompose observed and model-simulated temperature and precipitation into long-term trends, static and dynamic annual climatologies, and day-to-day variability. It then bias-corrects each signal individually, using a nonparametric kernel density estimation function for the daily anomalies, before reassembling them into a coherent time series.

Comparing the performance of this method in bias-correcting daily temperature and precipitation against 25 km high-resolution dynamical global model simulations shows significant improvement over commonly used ESDMs in North America for high and low quantiles of the distribution, and acceptably minimal bias overall for all but the most extreme precipitation amounts (beyond the 99.9th quantile of wet days) and for temperature at very high elevations during peak historical snowmelt months.

STAR-ESDM is a MATLAB-based code that minimizes computational demand to enable rapid bias correction and spatial downscaling of multiple datasets. Here, we describe new CMIP5- and CMIP6-based datasets of daily maximum and minimum temperature and daily precipitation for nearly 10,000 weather stations across North and Central America, as well as gridded datasets for the contiguous U.S., Canada, and the globe. In 2022, we plan to extend the station-based downscaling globally as well, since point-source projections can be of use in assessing climate impacts in many fields, from urban health to water supply.

The projections have furthermore been translated into a series of impact-relevant indicators at the seasonal, monthly, and daily scale, including multi-day heat waves, extreme precipitation events, threshold exceedances, and cumulative degree-days for individual RCP/SSP scenarios as well as by global mean temperature thresholds, as described in Hayhoe et al. (2018; U.S. Fourth National Climate Assessment Volume 1, Chapter 4).

In this presentation we describe the methodology, briefly highlight results from the evaluation and comparison analysis, and summarize available and forthcoming projections using this computational framework.
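The nonparametric step described above — mapping daily anomalies between model and observed distributions via kernel density estimates — can be sketched briefly. The following is an illustrative Python sketch (the released STAR-ESDM code is MATLAB), assuming KDE-smoothed CDFs built on a value grid; the published method's kernel choice, bandwidth, and trend/climatology handling are not reproduced here.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_quantile_map(model_anom, obs_anom, grid_size=512):
    """Map model daily anomalies onto the observed anomaly distribution
    using KDE-smoothed CDFs (an illustrative sketch, not the STAR-ESDM code)."""
    lo = min(model_anom.min(), obs_anom.min())
    hi = max(model_anom.max(), obs_anom.max())
    grid = np.linspace(lo - 1.0, hi + 1.0, grid_size)
    # Smoothed CDFs: integrate each kernel density estimate along the grid
    cdf_mod = np.cumsum(gaussian_kde(model_anom)(grid)); cdf_mod /= cdf_mod[-1]
    cdf_obs = np.cumsum(gaussian_kde(obs_anom)(grid)); cdf_obs /= cdf_obs[-1]
    # F_obs^{-1}(F_mod(x)): look up each model value's probability,
    # then invert the observed CDF by interpolation
    p = np.interp(model_anom, grid, cdf_mod)
    return np.interp(p, cdf_obs, grid)
```

The KDE smoothing is what distinguishes this from plain empirical quantile mapping: probabilities in the tails vary smoothly rather than in steps of 1/n.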

2019 ◽  
Vol 58 (12) ◽  
pp. 2617-2632 ◽  
Author(s):  
Qifen Yuan ◽  
Thordis L. Thorarinsdottir ◽  
Stein Beldring ◽  
Wai Kwok Wong ◽  
Shaochun Huang ◽  
...  

Abstract. In applications of climate information, coarse-resolution climate projections commonly need to be downscaled to a finer grid. One challenge of this requirement is the modeling of subgrid variability and the spatial and temporal dependence at the finer scale. Here, a postprocessing procedure for temperature projections is proposed that addresses this challenge. The procedure employs statistical bias correction and stochastic downscaling in two steps. In the first step, errors that are related to spatial and temporal features of the first two moments of the temperature distribution at model scale are identified and corrected. Second, residual space–time dependence at the finer scale is analyzed using a statistical model, from which realizations are generated and then combined with an appropriate climate change signal to form the downscaled projection fields. Using a high-resolution observational gridded data product, the proposed approach is applied in a case study in which projections of two regional climate models from the Coordinated Downscaling Experiment–European Domain (EURO-CORDEX) ensemble are bias corrected and downscaled to a 1 km × 1 km grid in the Trøndelag area of Norway. A cross-validation study shows that the proposed procedure generates results that better reflect the marginal distributional properties of the data product and have better consistency in space and time when compared with empirical quantile mapping.
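The first step — correcting errors in the first two moments of the temperature distribution — reduces, in its simplest form, to a rescaling toward the observed mean and standard deviation. A minimal Python sketch of that idea follows; the authors' full treatment of the spatial and temporal structure of these moments is not reproduced.

```python
import numpy as np

def moment_correct(model, obs_mean, obs_std):
    """Rescale a model temperature series so its first two moments
    match an observed climatology (sketch of the first step only)."""
    return obs_mean + (model - model.mean()) * (obs_std / model.std())
```

The second, stochastic step would then model the residual space–time dependence and draw realizations from it, which this sketch omits.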


2018 ◽  
Vol 22 (1) ◽  
pp. 673-687 ◽  
Author(s):  
Antoine Colmet-Daage ◽  
Emilia Sanchez-Gomez ◽  
Sophie Ricci ◽  
Cécile Llovel ◽  
Valérie Borrell Estupina ◽  
...  

Abstract. The climate change impact on mean and extreme precipitation events in the northern Mediterranean region is assessed using high-resolution EuroCORDEX and MedCORDEX simulations. The focus is on three regions: Lez and Aude, located in France, and Muga, located in northeastern Spain; eight pairs of global and regional climate models are analyzed with respect to the SAFRAN product. First, the model skills are evaluated in terms of bias in the precipitation annual cycle over the historical period. Then future changes in extreme precipitation, under two emission scenarios, are estimated through the computation of past/future change coefficients of quantile-ranked model precipitation outputs. Over the 1981–2010 period, cumulative precipitation is overestimated by most models over the mountainous regions and underestimated over the coastal regions in autumn and for higher-order quantiles. The ensemble mean and spread for the future period remain unchanged under the RCP4.5 scenario and decrease under the RCP8.5 scenario. Extreme precipitation events are intensified over the three catchments, with a smaller ensemble spread under RCP8.5 revealing more evident changes, especially in the latter part of the 21st century.


2021 ◽  
Author(s):  
Thomas Noël ◽  
Harilaos Loukos ◽  
Dimitri Defrance

A high-resolution climate projections dataset is obtained by statistically downscaling climate projections from the CMIP6 experiment using the ERA5-Land reanalysis from the Copernicus Climate Change Service. This global dataset has a spatial resolution of 0.1° × 0.1°, comprises five climate models, and includes two daily surface variables at monthly resolution: air temperature and precipitation. Two greenhouse gas emissions scenarios are available: one with mitigation policy (SSP126) and one without mitigation (SSP585). The downscaling method is a quantile mapping (QM) method called the Cumulative Distribution Function transform (CDF-t), which was first used for wind values and is now referenced in dozens of peer-reviewed publications. The data processing includes quality control of metadata according to climate modelling community standards and value checking for outlier detection.
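The CDF-t idea extends plain quantile mapping by estimating the *future* local CDF before mapping onto it: F_oF = F_oH ∘ F_mH⁻¹ ∘ F_mF, where o/m denote observations/model and H/F historical/future periods. A Python sketch using empirical CDFs follows; the operational implementation's treatment of tails and discretization may differ.

```python
import numpy as np

def ecdf(sample, x):
    """Empirical CDF of `sample` evaluated at points x."""
    return np.searchsorted(np.sort(sample), x, side="right") / len(sample)

def cdf_t(obs_hist, mod_hist, mod_fut, ngrid=2000):
    """Sketch of the CDF-transform: estimate the future local CDF as
    F_oF = F_oH(F_mH^-1(F_mF)), then quantile-map future model values onto it."""
    lo = min(a.min() for a in (obs_hist, mod_hist, mod_fut))
    hi = max(a.max() for a in (obs_hist, mod_hist, mod_fut))
    grid = np.linspace(lo, hi, ngrid)
    # F_oF evaluated on the grid: compose the three empirical CDF operations
    f_of = ecdf(obs_hist, np.quantile(mod_hist, ecdf(mod_fut, grid)))
    # Map each future model value through its probability onto F_oF^-1
    p = ecdf(mod_fut, mod_fut)
    return np.interp(p, f_of, grid)
```

If the model reproduces the historical observations exactly, the transform reduces to identity on the future values, so the projected change signal passes through unaltered.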


2021 ◽  
Author(s):  
Jérôme Kopp ◽  
Pauline Rivoire ◽  
S. Mubashshir Ali ◽  
Yannick Barton ◽  
Olivia Martius

Temporal clustering of extreme precipitation events on subseasonal time scales is a type of compound event, which can cause large precipitation accumulations and lead to floods. We present a novel count-based procedure to identify subseasonal clustering of extreme precipitation events. Furthermore, we introduce two metrics to characterise the frequency of subseasonal clustering episodes and their relevance for large precipitation accumulations. The advantage of this approach is that it does not require the investigated variable (here precipitation) to satisfy any specific statistical properties. Applying this methodology to the ERA5 reanalysis data set, we identify regions where subseasonal clustering of annual high precipitation percentiles occurs frequently and contributes substantially to large precipitation accumulations. Those regions are the east and northeast of the Asian continent (north of the Yellow Sea, in the Chinese provinces of Hebei, Jilin and Liaoning; North and South Korea; Siberia and east of Mongolia), central Canada and the south of California, Afghanistan, Pakistan, the southeast of the Iberian Peninsula, and the north of Argentina and south of Bolivia. Our method is robust with respect to the parameters used to define the extreme events (the percentile threshold and the run length) and the length of the subseasonal time window (here 2–4 weeks). The procedure could also be used to identify temporal clustering of other variables (e.g. heat waves) and can be applied on different time scales (e.g. for drought years). For a complementary study on the subseasonal clustering of European extreme precipitation events and its relationship to large-scale atmospheric drivers, please refer to Barton et al.
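A count-based detection of this kind can be illustrated in a few lines: flag days above a percentile threshold, merge exceedances closer together than a run length into single events, and count events in each sliding window. This Python sketch uses illustrative parameter names and defaults, not the paper's; the paper's two characterisation metrics are not reproduced.

```python
import numpy as np

def count_clusters(precip, pct=99, run=2, window=14):
    """Count extreme-precipitation events falling in each sliding
    `window`-day period (sketch of a count-based clustering procedure)."""
    thresh = np.percentile(precip, pct)
    exceed_days = np.flatnonzero(precip > thresh)
    # Runs declustering: an exceedance starts a new event only if it is
    # at least `run` days after the previous exceedance
    events = [d for i, d in enumerate(exceed_days)
              if i == 0 or d - exceed_days[i - 1] >= run]
    counts = np.zeros(len(precip) - window + 1, dtype=int)
    for d in events:
        # Event on day d is visible in windows starting at d-window+1 .. d
        start = max(0, d - window + 1)
        counts[start:min(d + 1, len(counts))] += 1
    return counts  # counts[t] = events in the window [t, t+window)
```

Windows where `counts` is large are candidate subseasonal clustering episodes; the robustness checks in the abstract correspond to varying `pct`, `run`, and `window`.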


2012 ◽  
Vol 16 (2) ◽  
pp. 305-318 ◽  
Author(s):  
I. Haddeland ◽  
J. Heinke ◽  
F. Voß ◽  
S. Eisner ◽  
C. Chen ◽  
...  

Abstract. Due to biases in the output of climate models, a bias correction is often needed to make the output suitable for use in hydrological simulations. In most cases only the temperature and precipitation values are bias corrected. However, there are often also biases in other variables, such as radiation, humidity and wind speed. In this study we tested to what extent it is also necessary to bias correct these variables. Responses to radiation, humidity and wind estimates from two climate models are analysed for four large-scale hydrological models. For the period 1971–2000 these hydrological simulations are compared to simulations using meteorological data based on observations and reanalysis, i.e. the baseline simulation. In both forcing datasets originating from climate models, precipitation and temperature are bias corrected to the baseline forcing dataset; hence, only the effects of the radiation, humidity and wind estimates are tested here. The direct use of climate model outputs results in substantially different evapotranspiration and runoff estimates compared to the baseline simulations. A simple bias correction method is implemented and tested by rerunning the hydrological models using bias-corrected radiation, humidity and wind values. The results indicate that bias correction can successfully be used to match the baseline simulations. Finally, historical (1971–2000) and future (2071–2100) model simulations using bias-corrected forcings are compared to the results using non-bias-corrected forcings. The relative changes in simulated evapotranspiration and runoff are similar for the bias-corrected and non-bias-corrected hydrological projections, although the absolute evapotranspiration and runoff numbers are often very different. The simulated relative and absolute differences when using bias-corrected versus non-bias-corrected climate model radiation, humidity and wind values are, however, smaller than differences reported in the literature resulting from using bias-corrected versus non-bias-corrected climate model precipitation and temperature values.
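A "simple bias correction" of this kind is often implemented as monthly scaling factors. The sketch below assumes a multiplicative monthly-mean correction (suitable for positive quantities like radiation or wind speed); the study's actual formulation is not specified in the abstract, so this is illustrative only.

```python
import numpy as np

def monthly_scaling(model, baseline, months):
    """Scale each calendar month of the model series by the ratio of
    baseline to model monthly means (a sketch of a simple bias correction)."""
    corrected = model.astype(float).copy()
    for m in np.unique(months):
        sel = months == m
        corrected[sel] *= baseline[sel].mean() / model[sel].mean()
    return corrected
```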


2017 ◽  
Vol 21 (6) ◽  
pp. 2649-2666 ◽  
Author(s):  
Matthew B. Switanek ◽  
Peter A. Troch ◽  
Christopher L. Castro ◽  
Armin Leuprecht ◽  
Hsin-I Chang ◽  
...  

Abstract. Commonly used bias correction methods such as quantile mapping (QM) assume that the function of error correction values between modeled and observed distributions is stationary, or time invariant. This article finds that this function of error correction values cannot be assumed to be stationary. As a result, QM lacks justification to inflate or deflate various moments of the climate change signal. Previous adaptations of QM, most notably quantile delta mapping (QDM), have been developed that do not rely on this assumption of stationarity. Here, we outline a methodology called scaled distribution mapping (SDM), which is conceptually similar to QDM but more explicitly accounts for the frequency of rain days and the likelihood of individual events. The SDM method is found to outperform QM, QDM, and detrended QM in its ability to preserve the raw climate model's projected changes to meteorological variables such as temperature and precipitation.
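The contrast with stationary QM is easiest to see in QDM itself: instead of applying a fixed historical correction function to future values, QDM preserves the model-projected change at each quantile. A minimal additive (temperature-style) Python sketch follows; SDM's explicit handling of rain-day frequency and event likelihood is not reproduced here.

```python
import numpy as np

def qdm_additive(obs_hist, mod_hist, mod_fut):
    """Additive quantile delta mapping: apply the model-projected change
    at each quantile to the corresponding observed quantile (sketch)."""
    # Non-exceedance probability of each future value in the future model CDF
    tau = np.searchsorted(np.sort(mod_fut), mod_fut, side="right") / len(mod_fut)
    # Model-projected change at that quantile (additive form, for temperature)
    delta = mod_fut - np.quantile(mod_hist, tau)
    # Corrected value: observed quantile plus the preserved change signal
    return np.quantile(obs_hist, tau) + delta
```

For precipitation the delta is usually taken multiplicatively; either way the projected change per quantile survives the correction, which plain QM does not guarantee.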


Proceedings ◽  
2018 ◽  
Vol 7 (1) ◽  
pp. 4
Author(s):  
Georgia Lazoglou ◽  
Christina Anagnostopoulou ◽  
Charalampos Skoulikaris ◽  
Konstantia Tolika

The projection, with higher accuracy and reliability, of extreme precipitation events that engender severe socioeconomic impacts with increasing frequency is considered a priority research topic in the scientific community. Although large-scale initiatives for monitoring meteorological and hydrological variables exist, the lack of data is still evident, particularly in regions with complex topographic characteristics. This leads to the use of reanalysis data or data derived from regional climate models; however, both datasets are biased relative to observations, yielding inaccurate results in hydrological studies. The current research presents a newly developed statistical method for the bias correction of the maximum rainfall amount at the watershed scale. In particular, the proposed approach couples a spatial distribution method, namely Thiessen polygons, with a multivariate probabilistic distribution method, namely copulas, for the bias correction of the maximum precipitation. The case study area is the Nestos River basin, where several recorded extreme episodes have had direct impacts on the regional agricultural economy. Using daily data from three monitoring stations and daily reanalysis precipitation values from the grid points closest to these stations, the results demonstrate that the bias-corrected maximum precipitation totals (greater than 90%) are much closer to the real maximum precipitation totals, while the respective reanalysis values underestimate the real precipitation totals. The overall improvement of the output shows that the proposed Thiessen-copula method could constitute a significant asset to hydrologic simulations.
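The spatial half of the coupling — Thiessen polygons — amounts to nearest-station assignment: each point of the basin is attributed to its closest station, and a station's weight is its share of the basin area. A discrete Python sketch over a grid of basin points follows (the copula-based probabilistic step is not reproduced).

```python
import numpy as np

def thiessen_weights(stations, grid_points):
    """Thiessen-polygon weights approximated on a point grid: assign each
    grid cell to its nearest station and return each station's area share."""
    # Pairwise distances: grid_points (N, 2) vs stations (M, 2) -> (N, M)
    d = np.linalg.norm(grid_points[:, None, :] - stations[None, :, :], axis=2)
    nearest = d.argmin(axis=1)
    return np.bincount(nearest, minlength=len(stations)) / len(grid_points)
```

Basin-average precipitation is then the weight-dot-product of the station values, which is where the station-wise bias-corrected maxima would enter.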


2019 ◽  
Vol 58 (8) ◽  
pp. 1763-1777
Author(s):  
Patrick J. Clemins ◽  
Gabriela Bucini ◽  
Jonathan M. Winter ◽  
Brian Beckage ◽  
Erin Towler ◽  
...  

Abstract. General circulation models (GCMs) are essential for projecting future climate; however, despite the rapid advances in their ability to simulate the climate system at increasing spatial resolution, GCMs cannot capture the local and regional weather dynamics necessary for climate impacts assessments. Temperature and precipitation, for which dense observational records are available, can be bias corrected and downscaled, but many climate impacts models require a larger set of variables such as relative humidity, cloud cover, wind speed and direction, and solar radiation. To address this need, we develop and demonstrate an analog-based approach, which we call a “weather estimator.” The weather estimator employs a highly generalizable structure, utilizing temperature and precipitation from previously downscaled GCMs to select analogs from a reanalysis product, resulting in a complete daily gridded dataset. The resulting dataset, constructed from the selected analogs, contains weather variables needed for impacts modeling that are physically, spatially, and temporally consistent. This approach relies on the weather variables’ correlation with temperature and precipitation, and our correlation analysis indicates that the weather estimator should best estimate evaporation, relative humidity, and cloud cover and do less well in estimating pressure and wind speed and direction. In addition, while the weather estimator has several user-defined parameters, a sensitivity analysis shows that the method is robust to small variations in important model parameters. The weather estimator recreates the historical distributions of relative humidity, pressure, evaporation, shortwave radiation, cloud cover, and wind speed well and outperforms a multiple linear regression estimator across all predictands.
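The core analog step can be sketched simply: for a downscaled GCM day, find the reanalysis day whose (temperature, precipitation) pair is closest, then carry over that day's full set of weather variables, which keeps them physically consistent with one another. In this Python sketch the field names (`t2m`, `precip`, `rh`) and the standardized Euclidean distance are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def analog_day(target_t, target_p, reanalysis):
    """Select the reanalysis day closest to a target (temperature,
    precipitation) pair and return all of that day's variables (sketch)."""
    t, p = reanalysis["t2m"], reanalysis["precip"]
    # Standardize so temperature and precipitation contribute comparably
    zt = (target_t - t.mean()) / t.std()
    zp = (target_p - p.mean()) / p.std()
    dist = ((t - t.mean()) / t.std() - zt) ** 2 \
         + ((p - p.mean()) / p.std() - zp) ** 2
    best = np.argmin(dist)
    # Carrying over the whole day preserves inter-variable consistency
    return {k: v[best] for k, v in reanalysis.items()}
```

In the paper's framework this selection happens per day on a grid, yielding a complete daily dataset of the analog days' humidity, radiation, wind, and other fields.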

