Evaluating quantile-based bias adjustment methods for climate change scenarios

2021 ◽  
Author(s):  
Fabian Lehner ◽  
Imran Nadeem ◽  
Herbert Formayer

Abstract. Daily meteorological data such as temperature or precipitation from climate models are needed for many climate impact studies, e.g. in hydrology or agriculture, but direct model output can contain large systematic errors. Statistical bias adjustment is therefore applied to correct climate model outputs. Here we review existing statistical bias adjustment methods and their shortcomings, and present a method which we call EQA (Empirical Quantile Adjustment), a development of the methods EDCDFm and PresRAT. We then test it against two existing methods using real and artificially created daily temperature and precipitation data for Austria. We compare the performance of the three methods in terms of the following demands: (1) the model data should match the climatological means of the observational data in the historical period; (2) the long-term climatological trends of means (the climate change signal), defined either as a difference or as a ratio, should not be altered during bias adjustment; and (3) even models with too few wet days (precipitation above 0.1 mm) should be corrected accurately, so that the wet-day frequency is conserved. EQA fulfills (1) almost exactly and (2) at least for temperature. For precipitation, an additional correction included in EQA ensures that the climate change signal is conserved, and for (3), we apply a further algorithm that adds precipitation days.
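
For readers unfamiliar with the quantile-mapping family that EQA extends, the following is a minimal sketch of plain empirical quantile mapping in Python; all names are illustrative, and this is not the authors' EQA implementation (which adds the trend and wet-day corrections described above).

```python
import numpy as np

def empirical_quantile_mapping(obs_hist, mod_hist, mod_fut):
    """Plain empirical quantile mapping (illustrative sketch).

    Each future model value is located at its quantile within the
    historical model distribution and replaced by the observed value
    at that same quantile.
    """
    obs_sorted = np.sort(obs_hist)
    mod_sorted = np.sort(mod_hist)
    # Empirical non-exceedance probability of each future value.
    tau = np.searchsorted(mod_sorted, mod_fut) / len(mod_sorted)
    tau = np.clip(tau, 0.0, 1.0)
    # Observed quantile function evaluated at those probabilities.
    probs = np.linspace(0.0, 1.0, len(obs_sorted))
    return np.interp(tau, probs, obs_sorted)

# Toy check: a model that runs 2 K too warm and slightly too variable.
rng = np.random.default_rng(0)
obs = rng.normal(10.0, 3.0, 10_000)   # "observed" daily temperature
mod = rng.normal(12.0, 3.5, 10_000)   # biased model, historical period
fut = rng.normal(14.0, 3.5, 10_000)   # biased model, future period
corr = empirical_quantile_mapping(obs, mod, fut)
# The +2 K model signal comes out rescaled by the variance ratio
# (3.0/3.5) -- the well-known signal alteration that motivates
# trend-preserving methods such as EQA.
print(round(corr.mean() - obs.mean(), 2))   # ~1.7, not 2.0
```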

2020 ◽  
Author(s):  
Fabian Lehner ◽  
Imran Nadeem ◽  
Herbert Formayer

Abstract. Daily meteorological data from climate models are needed for many climate impact studies, e.g. in hydrology or agriculture, but direct model output can contain large systematic errors. Thus, statistical bias correction is applied to the raw model data. However, no method introduced so far fulfills the following demands simultaneously: (1) the long-term climatological trends (climate change signal) should not be altered during bias correction; (2) the model data should match the observational data in the historical period as accurately as possible in a climatological sense; and (3) models with too few wet days (precipitation above 0.1 mm) should be corrected accurately, which means that the wet-day frequency is conserved. We improve the existing quantile mapping approach so that it satisfies all three conditions. Our new method, called empirical percentile–percentile mapping (EPPM), uses empirical distributions for meteorological variables and is therefore computationally inexpensive. The correction of precipitation is particularly challenging, so it is our main focus. EPPM corrects the historical model data so that precipitation sums and wet days are equal to those of the observational data.
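
Condition (3) can be made concrete with a small, hypothetical sketch of wet-day frequency adaptation; EPPM's actual procedure is not reproduced in this abstract, so the branch handling and the drizzle-filling rule below are assumptions for illustration only.

```python
import numpy as np

WET = 0.1  # wet-day threshold (mm/day), as in the abstract

def match_wet_day_frequency(obs, mod, rng=None):
    """Hypothetical wet-day frequency adaptation (not the EPPM code).

    If the model has too many wet days, its lightest wet days are set
    to zero; if it has too few (the harder case targeted here), random
    dry days are promoted to drizzle. Ties are ignored for brevity.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    mod = mod.copy()
    n_target = int(round((obs >= WET).mean() * len(mod)))
    n_wet = int((mod >= WET).sum())
    if n_wet > n_target:
        # Zero out the (n_wet - n_target) lightest wet days.
        cutoff = np.sort(mod[mod >= WET])[n_wet - n_target]
        mod[mod < cutoff] = 0.0
    elif n_wet < n_target:
        # Promote random dry days to light precipitation between the
        # threshold and the smallest observed wet amount.
        dry = np.flatnonzero(mod < WET)
        fill = rng.choice(dry, n_target - n_wet, replace=False)
        mod[fill] = rng.uniform(WET, max(WET, obs[obs >= WET].min()),
                                len(fill))
    return mod
```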


2011 ◽  
Vol 15 (9) ◽  
pp. 2777-2788 ◽  
Author(s):  
T. Bosshard ◽  
S. Kotlarski ◽  
T. Ewen ◽  
C. Schär

Abstract. The annual cycle of temperature and precipitation changes as projected by climate models is of fundamental interest in climate impact studies. Its estimation, however, is impaired by natural variability. Using a simple form of the delta change method, we show that on regional scales relevant for hydrological impact models, the projected changes in the annual cycle are prone to sampling artefacts. For precipitation at station locations, these artefacts may have amplitudes that are comparable to the climate change signal itself. Therefore, the annual cycle of the climate change signal should be filtered when generating climate change scenarios. We test a spectral smoothing method to remove the artificial fluctuations. Comparison against moving monthly averages shows that sampling artefacts in the climate change signal can successfully be removed by spectral smoothing. The method is tested at Swiss climate stations and applied to regional climate model output of the ENSEMBLES project. The spectral method performs well, except in cases with a strong annual cycle and large relative precipitation changes.
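
A minimal sketch of the kind of spectral smoothing described here, assuming the change signal is a length-365 array of daily delta-change values; the number of retained harmonics is an illustrative choice, not the paper's calibrated setting.

```python
import numpy as np

def smooth_annual_cycle(delta, n_harmonics=3):
    """Keep only the mean and the first few annual harmonics of a
    daily (length-365) climate change signal, removing the
    high-frequency sampling fluctuations discussed above."""
    spec = np.fft.rfft(delta)
    spec[n_harmonics + 1:] = 0.0          # discard higher harmonics
    return np.fft.irfft(spec, n=len(delta))

# Toy check: a smooth seasonal change signal plus sampling noise.
days = np.arange(365)
true = 2.0 + 1.5 * np.sin(2 * np.pi * days / 365)
noisy = true + np.random.default_rng(1).normal(0.0, 0.8, 365)
print(np.abs(smooth_annual_cycle(noisy) - true).max())  # noise largely gone
```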



2020 ◽  
Author(s):  
Torben O. Sonnenborg ◽  
Ernesto Pasten-Zapata ◽  
Theresa Eberhart ◽  
Karsten Høgh Jensen

The uncertainty of projections from climate models can be significant, especially with respect to precipitation. This represents a challenge for decision makers, as the spread of the climate model ensemble can be large and there may even be no consensus on the direction of the climate change signal. This problem is carried through to impact models such as hydrological models. Here, we evaluate different approaches to reduce the uncertainty using 16 Euro-CORDEX Regional Climate Models (RCMs) that drive three different setups of the integrated and distributed MIKE-SHE hydrological model for a catchment in Denmark. Each model is calibrated against an extensive database of hydrological observations (stream discharge, hydraulic head, actual evapotranspiration, soil moisture). We evaluate the skill of the raw and bias-corrected RCMs at simulating precipitation in a historical period using sets of nine, six, five, and three metrics over nine successive steps. After each step, the lowest-performing model is removed from the ensemble and the standard deviation of the new ensemble is estimated. Subsequently, the uncertainty of the projected groundwater head and stream discharge is evaluated. Based on the evaluation of raw RCM simulations, the largest decrease in the uncertainty of projected discharge (5th, 50th and 95th percentiles) is obtained using the set of five metrics. When evaluating the bias-corrected RCMs, the largest uncertainty reduction in stream discharge is obtained when the set of all nine metrics is considered. Similar results are obtained for groundwater head. The reduction of the initial uncertainty is almost a factor of two larger when the evaluation of models is based on bias-corrected rather than raw climate model results. This analysis gives insight into how different approaches can decrease the uncertainty of future projections for hydrological analyses of the impact of climate change.
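
The culling loop described above can be sketched generically as follows; the metric aggregation and the model names are placeholders, not the study's actual setup.

```python
import numpy as np

def cull_ensemble(projections, metric_errors, n_steps=9):
    """Iteratively drop the lowest-performing ensemble member.

    projections   : dict RCM name -> projected change (scalar)
    metric_errors : dict RCM name -> list of error metrics (lower is better)
    Returns the ensemble standard deviation after each removal,
    mirroring the stepwise evaluation described in the abstract.
    """
    members = dict(projections)
    spreads = []
    for _ in range(n_steps):
        if len(members) <= 2:
            break
        # Aggregate metrics per model as a plain sum (an assumption;
        # the study's aggregation may differ).
        worst = max(members, key=lambda m: sum(metric_errors[m]))
        del members[worst]
        spreads.append(float(np.std(list(members.values()))))
    return spreads
```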


2020 ◽  
Vol 11 (4) ◽  
pp. 995-1012
Author(s):  
Lukas Brunner ◽  
Angeline G. Pendergrass ◽  
Flavio Lehner ◽  
Anna L. Merrifield ◽  
Ruth Lorenz ◽  
...  

Abstract. The sixth Coupled Model Intercomparison Project (CMIP6) constitutes the latest update on expected future climate change based on a new generation of climate models. To extract reliable estimates of future warming and related uncertainties from these models, the spread in their projections is often translated into probabilistic estimates such as the mean and likely range. Here, we use a model weighting approach, which accounts for the models' historical performance based on several diagnostics as well as model interdependence within the CMIP6 ensemble, to calculate constrained distributions of global mean temperature change. We investigate the skill of our approach in a perfect model test, where we use previous-generation CMIP5 models as pseudo-observations in the historical period. The performance of the distribution weighted in the abovementioned manner with respect to matching the pseudo-observations in the future is then evaluated, and we find a mean increase in skill of about 17 % compared with the unweighted distribution. In addition, we show that our independence metric correctly clusters models known to be similar based on a CMIP6 “family tree”, which enables the application of a weighting based on the degree of inter-model dependence. We then apply the weighting approach, based on two observational estimates (the fifth generation of the European Centre for Medium-Range Weather Forecasts Retrospective Analysis – ERA5, and the Modern-Era Retrospective analysis for Research and Applications, version 2 – MERRA-2), to constrain CMIP6 projections under weak (SSP1-2.6) and strong (SSP5-8.5) climate change scenarios (SSP refers to the Shared Socioeconomic Pathways). Our results show a reduction in the projected mean warming for both scenarios because some CMIP6 models with high future warming receive systematically lower performance weights. The mean of end-of-century warming (2081–2100 relative to 1995–2014) for SSP5-8.5 with weighting is 3.7 °C, compared with 4.1 °C without weighting; the likely (66 %) uncertainty range is 3.1 to 4.6 °C, which equates to a 13 % decrease in spread. For SSP1-2.6, the weighted end-of-century warming is 1 °C (0.7 to 1.4 °C), which results in a reduction of −0.1 °C in the mean and −24 % in the likely range compared with the unweighted case.
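
As a hedged sketch of the performance-plus-independence weighting the abstract describes (in the spirit of the ClimWIP approach), each model's weight grows with its agreement with observations and shrinks with its similarity to other models; the shape parameters below are arbitrary placeholders, not the paper's calibrated values.

```python
import numpy as np

def performance_independence_weights(perf_dist, pair_dist,
                                     sigma_d=0.5, sigma_s=0.5):
    """Performance/independence model weights (illustrative sketch).

    perf_dist : (n,) distance of each model from observations
    pair_dist : (n, n) pairwise inter-model distances
    Weights follow the ClimWIP-style form
        w_i ~ exp(-(D_i/sigma_d)^2) / (1 + sum_{j!=i} exp(-(S_ij/sigma_s)^2));
    sigma_d and sigma_s are placeholder shape parameters.
    """
    performance = np.exp(-(np.asarray(perf_dist) / sigma_d) ** 2)
    similarity = np.exp(-(np.asarray(pair_dist) / sigma_s) ** 2)
    np.fill_diagonal(similarity, 0.0)        # ignore self-similarity
    w = performance / (1.0 + similarity.sum(axis=1))
    return w / w.sum()

# Toy 3-model ensemble: model 0 performs best and is independent.
warming = np.array([3.2, 4.4, 4.1])
w = performance_independence_weights(
    [0.3, 0.9, 0.8],
    [[0.0, 2.0, 2.0], [2.0, 0.0, 0.3], [2.0, 0.3, 0.0]])
print(round(float((w * warming).sum()), 2))  # pulled below the raw mean of 3.9
```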


2018 ◽  
Vol 31 (16) ◽  
pp. 6591-6610 ◽  
Author(s):  
Martin Aleksandrov Ivanov ◽  
Jürg Luterbacher ◽  
Sven Kotlarski

Climate change impact research and risk assessment require accurate estimates of the climate change signal (CCS). Raw climate model data include systematic biases that affect the CCS of high-impact variables such as daily precipitation and wind speed. This paper presents a novel, general, and extensible analytical theory of the effect of these biases on the CCS of the distribution mean and quantiles. The theory reveals that misrepresented model intensities and probability of nonzero (positive) events have the potential to distort raw model CCS estimates. We test the analytical description in a challenging application of bias correction and downscaling to daily precipitation over alpine terrain, where the output of 15 regional climate models (RCMs) is reduced to local weather stations. The theoretically predicted CCS modification well approximates the modification by the bias correction method, even for the station–RCM combinations with the largest absolute modifications. These results demonstrate that the CCS modification by bias correction is a direct consequence of removing model biases. Therefore, provided that application of intensity-dependent bias correction is scientifically appropriate, the CCS modification should be a desirable effect. The analytical theory can be used as a tool to 1) detect model biases with high potential to distort the CCS and 2) efficiently generate novel, improved CCS datasets. The latter are highly relevant for the development of appropriate climate change adaptation, mitigation, and resilience strategies. Future research needs to focus on developing process-based bias corrections that depend on simulated intensities rather than preserving the raw model CCS.
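
The core effect can be checked with a toy numerical example, assuming a purely multiplicative, intensity-dependent bias: removing the bias changes the additive change signal of the mean while leaving the relative signal intact. This is an illustration of the general point, not the paper's analytical derivation.

```python
import numpy as np

rng = np.random.default_rng(2)
truth_hist = rng.gamma(2.0, 5.0, 50_000)    # "true" daily intensities
truth_fut = truth_hist * 1.2                # a 20 % (relative) true change
bias = 1.3                                  # multiplicative, intensity-dependent bias
mod_hist, mod_fut = bias * truth_hist, bias * truth_fut

raw_ccs = mod_fut.mean() - mod_hist.mean()                     # inflated by the bias
corr_ccs = (mod_fut / bias).mean() - (mod_hist / bias).mean()  # bias removed
print(round(raw_ccs / corr_ccs, 2))  # 1.3: bias correction rightly modifies
                                     # the additive CCS of the mean, while the
                                     # relative CCS (a ratio) is untouched here
```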


2019 ◽  
Vol 23 (3) ◽  
pp. 1409-1429 ◽  
Author(s):  
Sjoukje Philip ◽  
Sarah Sparrow ◽  
Sarah F. Kew ◽  
Karin van der Wiel ◽  
Niko Wanders ◽  
...  

Abstract. In August 2017 Bangladesh faced one of its worst river flooding events in recent history. This paper presents, for the first time, an attribution of this precipitation-induced flooding to anthropogenic climate change from a combined meteorological and hydrological perspective. Experiments were conducted with three observational datasets and two climate models to estimate changes in the extreme 10-day precipitation event frequency over the Brahmaputra basin up to the present and, additionally, in an outlook to 2 °C of warming since pre-industrial times. The precipitation fields were then used as meteorological input for four different hydrological models to estimate the corresponding changes in river discharge, allowing for comparison between approaches and for the robustness of the attribution results to be assessed. In all three observational precipitation datasets the climate change trends for extreme precipitation similar to that observed in August 2017 are not significant; however, in two out of three series the sign of this insignificant trend is positive. One climate model ensemble shows a significant positive influence of anthropogenic climate change, whereas the other large-ensemble model simulates a cancellation between the increase due to greenhouse gases (GHGs) and a decrease due to sulfate aerosols. Considering discharge rather than precipitation, the hydrological models show that attribution of the change in discharge towards higher values is somewhat less uncertain than for precipitation, but the 95 % confidence intervals still encompass no change in risk. Extending the analysis to the future, all models project an increase in the probability of extreme events at 2 °C of global warming since pre-industrial times: such events become more than 1.7 times more likely for high 10-day precipitation and about 1.5 times more likely for discharge. Our best estimate of the trend in flooding events similar to the Brahmaputra event of August 2017 is derived by synthesizing the observational and model results: we find the change in risk to be greater than 1 and of a similar order of magnitude (between 1 and 2) for both the meteorological and hydrological approaches. This study shows that, for precipitation-induced flooding events, investigating changes in precipitation is useful, either as an alternative when hydrological models are not available or as an additional measure to confirm qualitative conclusions. Besides this, it highlights the importance of using multiple models in attribution studies, particularly where the climate change signal is not strong relative to natural variability or is confounded by other factors such as aerosols.
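
Attribution statements of this kind rest on probability ratios between two climates; as a minimal sketch (empirical exceedance counts with a bootstrap confidence interval, not necessarily the study's statistical model), one could estimate the change in likelihood of exceeding the 2017 event magnitude as follows.

```python
import numpy as np

def probability_ratio(factual, counterfactual, threshold,
                      n_boot=1000, seed=0):
    """Empirical probability ratio of exceeding `threshold`, with a
    bootstrap 95 % confidence interval (illustrative sketch)."""
    rng = np.random.default_rng(seed)

    def p_exceed(x):
        return max(float((x >= threshold).mean()), 1e-9)  # guard against zero

    pr = p_exceed(factual) / p_exceed(counterfactual)
    boot = [p_exceed(rng.choice(factual, factual.size)) /
            p_exceed(rng.choice(counterfactual, counterfactual.size))
            for _ in range(n_boot)]
    lo, hi = np.percentile(boot, [2.5, 97.5])
    # A confidence interval spanning 1 corresponds to "no change in
    # risk cannot be excluded", as for the discharge results above.
    return pr, lo, hi
```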


2019 ◽  
Vol 58 (5) ◽  
pp. 1061-1078 ◽  
Author(s):  
Abdelkader Mezghani ◽  
Andreas Dobler ◽  
Rasmus Benestad ◽  
Jan Erik Haugen ◽  
Kajsa M. Parding ◽  
...  

Abstract. Most impact studies using downscaled climate data as input assume that selecting a few global climate models (GCMs) representing the largest spread covers the likely range of future changes. This study shows that including more GCMs can result in very different behavior. We tested the influence of selecting various subsets of GCMs on the climate change signal over Poland in simulations based on dynamical and empirical–statistical downscaling methods. When the climate variable is well simulated by the GCMs, as is the case for temperature, both downscaling methods agree on a warming over Poland of up to 2°C or 5°C by 2071–2100, assuming intermediate or high emission scenarios, respectively. For precipitation, whose simulated signal is less robust across GCMs, an increase of up to 10 % is expected by 2071–2100 under the intermediate emission scenario. However, these changes are uncertain when the high emission scenario and the end of the twenty-first century are of interest. Further, an additional bootstrap test revealed an underestimation of the warming rate over Poland, varying from 0.5°C to more than 4°C, that was largely driven by the selection of a few driving GCMs instead of the full range of possible climate model outlooks. Furthermore, we found that differences between various combinations of small subsets of the GCM ensemble of opportunity can be as large as the climate change signal itself.
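
The bootstrap argument can be caricatured in a few lines: repeatedly draw small subsets from a full GCM ensemble and compare the spread of their mean change signal with the full-ensemble signal. All numbers below are synthetic.

```python
import numpy as np

def subset_signal_spread(ccs, subset_size=5, n_draws=10_000, seed=0):
    """Range of the ensemble-mean climate change signal across random
    small GCM subsets (a caricature of the bootstrap test above)."""
    rng = np.random.default_rng(seed)
    means = [rng.choice(ccs, subset_size, replace=False).mean()
             for _ in range(n_draws)]
    return min(means), max(means)

# Synthetic warming signals (degrees C) from a 30-member GCM ensemble.
ccs = np.random.default_rng(3).normal(3.0, 1.2, 30)
lo, hi = subset_signal_spread(ccs)
print(round(hi - lo, 2))  # subset choice alone spans a range comparable
                          # to the signal itself
```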


2021 ◽  
Author(s):  
Paola Nanni ◽  
David J. Peres ◽  
Rosaria E. Musumeci ◽  
Antonino Cancelliere

Climate change is claimed to be responsible for a significant alteration of the precipitation regime in different regions worldwide and for induced changes in related hydrological hazards. In particular, some consensus has emerged that climate change can induce a shift towards shorter but more intense rainfall events, intensifying urban and flash flooding hazards. Regional climate models (RCMs) are a useful tool for predicting the impacts of climate change on hydrological events, although their application may lead to significant differences when different models are adopted. For this reason, it is of key importance to assess the quality of RCMs, especially with reference to their ability to reproduce the main climatological regimes over a historical period. To this end, several studies have focused on the analysis of annual or monthly data, while few studies analyze the sub-daily data made available by regional climate projection initiatives. In this study, with reference to specific locations in eastern Sicily (Italy), we first evaluate historical simulations of precipitation from selected RCMs belonging to Euro-CORDEX (the Coordinated Regional Climate Downscaling Experiment for the Euro-Mediterranean area) at high temporal resolution (three-hourly), in order to understand how they compare with fine-resolution observations. In particular, we investigate their ability to reproduce rainfall event characteristics, as well as annual maximum precipitation at different durations. With reference to rainfall event characteristics, we specifically focus on duration, intensity, and inter-arrival time between events. Annual maxima are analyzed at sub-daily durations. We then analyze the future simulations under different Representative Concentration Pathways. The proposed analysis highlights the differences between the RCMs, supporting the selection of the most suitable climate model for assessing impacts at the considered locations and clarifying what trends in intense precipitation are to be expected in the future.
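
As a hedged sketch of the event statistics listed above (duration, mean intensity, inter-arrival time), the snippet below splits a three-hourly precipitation series into events separated by a minimum dry spell; the six-hour separation criterion is an illustrative assumption, not the study's definition.

```python
import numpy as np

STEP_H = 3         # temporal resolution of the series, in hours
MIN_DRY_STEPS = 2  # >= 6 consecutive dry hours end an event (assumption)

def rainfall_events(precip, wet=0.1):
    """Split a 3-hourly precipitation array into events and return
    (duration_h, mean_intensity, interarrival_h) tuples. Inter-arrival
    is measured from the end of one event to the start of the next."""
    events, start, dry_run, last_end = [], None, 0, None
    for i, p in enumerate(np.append(precip, 0.0)):   # sentinel flushes tail
        if p >= wet:
            if start is None:
                start = i
            dry_run = 0
        elif start is not None:
            dry_run += 1
            if dry_run >= MIN_DRY_STEPS or i == len(precip):
                end = i - dry_run + 1                # one past last wet step
                gap = ((start - last_end) * STEP_H
                       if last_end is not None else np.nan)
                events.append(((end - start) * STEP_H,
                               float(precip[start:end].mean()), gap))
                last_end, start, dry_run = end, None, 0
    return events
```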


2020 ◽  
Author(s):  
Ana Casanueva ◽  
Sixto Herrera ◽  
Maialen Iturbide ◽  
Stefan Lange ◽  
Martin Jury ◽  
...  

Systematic biases in climate models hamper their direct use in impact studies and, as a consequence, many bias adjustment methods, which merely correct for deficiencies in the distribution, have been developed. Despite adjusting the desired features in historical simulations, their application in a climate change context is subject to additional uncertainties and to modifications of the change signals, especially for climate indices not explicitly targeted by the methods. Some of the commonly used bias adjustment methods allow changes to the signals, which arise by construction in the case of intensity-dependent biases; others preserve the trends in some statistics of the original, raw models. Two relevant and often overlooked sources of further uncertainty are the sensitivity to the observational reference used to calibrate the method and the effect of the resolution mismatch between model and observations (the downscaling effect).

In the present work, we assess the impact of these factors on the climate change signal of a set of temperature and precipitation indices covering marginal, temporal, and extreme aspects. We use eight standard and state-of-the-art bias adjustment methods (spanning a variety of approaches in terms of their nature, empirical or parametric, their fitted parameters, and their preservation of the signals) for a case study in the Iberian Peninsula. The quantile trend-preserving methods (namely quantile delta mapping, QDM; scaled distribution mapping, SDM; and the method from the third phase of ISIMIP, ISIMIP3) better preserve the raw signals for the different indices and variables (not all preserved by construction). However, they rely largely on the reference dataset used for calibration and thus show a larger sensitivity to the observations, especially for precipitation intensity, spell, and extreme indices. High-quality observational datasets are therefore essential for comprehensive analyses in larger (continental) domains. Similar conclusions hold for experiments carried out at high (approximately 20 km) and low (approximately 120 km) spatial resolutions.
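
For concreteness, here is a minimal sketch of one of the trend-preserving methods named above, quantile delta mapping (QDM), in the multiplicative form commonly used for precipitation; this is an illustrative reduction (dry-day handling omitted), not any of the cited implementations.

```python
import numpy as np

def qdm_multiplicative(obs_hist, mod_hist, mod_fut):
    """Quantile delta mapping, multiplicative form (illustrative).

    The model's *relative* change at each quantile is preserved and
    applied to the observed value at that quantile, which is why
    QDM-like methods retain the raw climate change signal.
    """
    n = len(mod_fut)
    # Empirical quantile of each future value within the future run.
    tau = (np.argsort(np.argsort(mod_fut)) + 0.5) / n
    # Relative model change at that quantile ...
    delta = mod_fut / np.quantile(mod_hist, tau)
    # ... applied to the calibration-period observations.
    return np.quantile(obs_hist, tau) * delta
```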

