From climate simulations to statistics - Introducing the wux package

2016 ◽  
Vol 45 (1) ◽  
pp. 81-96 ◽  
Author(s):  
Thomas Mendlik ◽  
Georg Heinrich ◽  
Andreas Gobiet ◽  
Armin Leuprecht

We present the R package wux, a toolbox for analyzing climate change signals projected by numerical climate model simulations, together with their associated uncertainties. The focus of the package is on automatically processing large amounts of climate model data from multi-model ensembles in a user-friendly and flexible way. To this end, climate model output in the common binary format NetCDF is read in, aggregated to a desired temporal resolution, averaged over spatial domains of interest, and stored in a data frame. The processing can be carried out for any number of meteorological parameters in one go, which enables multivariate statistical analysis of the climate model ensemble.
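wux itself is an R package, but the processing chain the abstract describes (read gridded fields, aggregate in time, average in space, tabulate) can be sketched language-neutrally. The following Python sketch uses synthetic arrays in place of NetCDF input; all names and numbers are illustrative, not part of wux:

```python
import numpy as np

def process_ensemble(model_runs, region_mask, days_per_month):
    """Sketch of a wux-style workflow: for each simulation, aggregate
    daily fields to monthly means, average over a spatial domain, and
    collect the results as tabular (data-frame-like) rows."""
    rows = []
    for name, daily in model_runs.items():          # daily: (time, lat, lon)
        start = 0
        for month, ndays in enumerate(days_per_month, start=1):
            monthly = daily[start:start + ndays].mean(axis=0)  # temporal aggregation
            regional = float(monthly[region_mask].mean())      # spatial averaging
            rows.append({"model": name, "month": month, "tas": regional})
            start += ndays
    return rows

# synthetic stand-in for two NetCDF files (360 days on a 4x4 grid)
rng = np.random.default_rng(0)
runs = {"modelA": rng.normal(285.0, 5.0, (360, 4, 4)),
        "modelB": rng.normal(287.0, 5.0, (360, 4, 4))}
mask = np.zeros((4, 4), bool)
mask[1:3, 1:3] = True                               # spatial domain of interest
table = process_ensemble(runs, mask, [30] * 12)
print(len(table))                                   # 2 models x 12 months = 24 rows
```

Each row of the resulting table holds one model, one month, and one regional mean, which is the flat shape that makes subsequent multivariate ensemble statistics straightforward.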

2011 ◽  
Vol 4 (1) ◽  
pp. 45-63 ◽  
Author(s):  
T. Marke ◽  
W. Mauser ◽  
A. Pfeiffer ◽  
G. Zängl

Abstract. The present study investigates a statistical approach for the downscaling of climate simulations, focusing on the meteorological parameters most commonly required as input for climate change impact models (temperature, precipitation, air humidity and wind speed) and including the option to correct biases in the climate model simulations. The approach is evaluated using a hydrometeorological model chain consisting of (i) the regional climate model MM5 (driven by reanalysis data at the boundaries of the model domain), (ii) the downscaling and model interface SCALMET, and (iii) the hydrological model PROMET. The results of four hydrological model runs are compared to daily discharge recordings at the gauge of the Upper Danube Watershed (Central Europe) for the historical period 1972–2000. The comparison reveals that the presented approaches allow for a more accurate simulation of discharge for the catchment of the Upper Danube Watershed and the considered gauge at its outlet in Achleiten. The correction for subgrid-scale variability is shown to reduce biases in simulated discharge compared to bilinear interpolation. Further gains in model performance could be achieved by correcting biases in the RCM data within the downscaling process. Although the presented downscaling approach strongly improves the performance of the hydrological model, deviations from the observed discharge conditions persist that are not found when driving the hydrological model with spatially distributed meteorological observations.
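The abstract does not spell out how SCALMET corrects biases, so here is one standard technique, empirical quantile mapping, as a hedged illustration applied to synthetic daily temperatures (the +2 K warm bias and every other number are made up):

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_fut):
    """Empirical quantile mapping: replace each model value by the
    observed value at the same quantile of the historical model
    distribution, correcting both mean and variance biases."""
    ranks = np.searchsorted(np.sort(model_hist), model_fut)
    q = np.clip(ranks / len(model_hist), 0.0, 1.0)
    return np.quantile(obs_hist, q)

rng = np.random.default_rng(1)
obs = rng.normal(5.0, 2.0, 1000)     # "observed" daily temperature (deg C)
mod = rng.normal(7.0, 3.0, 1000)     # model output: warm bias, too variable
corrected = quantile_map(mod, obs, mod)
print(round(float(corrected.mean() - obs.mean()), 2))   # residual bias, near zero
```

After mapping, the corrected series approximately reproduces the observed distribution, which is the property bias-correcting interfaces in such model chains aim for before handing data to the impact model.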


2011 ◽  
Vol 2 (1) ◽  
pp. 161-177 ◽  
Author(s):  
L. A. van den Berge ◽  
F. M. Selten ◽  
W. Wiegerinck ◽  
G. S. Duane

Abstract. In the current multi-model ensemble approach, climate model simulations are combined a posteriori. In the method of this study, the models in the ensemble exchange information during the simulations and learn from historical observations, combining their strengths into a best representation of the observed climate. The method is developed and tested in the context of small chaotic dynamical systems, such as the Lorenz 63 system. Imperfect models are created by perturbing the standard parameter values. Three imperfect models are combined into one super-model through the introduction of connections between the model equations. The connection coefficients are learned from data generated by the unperturbed model, which is regarded as the truth. The main result of this study is that, after learning, the super-model approximates the truth far better than each imperfect model does separately. These illustrative examples suggest that the super-modeling approach is a promising strategy for improving weather and climate simulations.
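As a hedged sketch of the coupling idea (not the paper's actual training procedure), three Lorenz 63 models with perturbed parameters can be connected by nudging terms. The connection coefficient below is fixed by hand rather than learned from data, so only the synchronization mechanism is illustrated:

```python
import numpy as np

def lorenz_rhs(state, sigma, rho, beta):
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def supermodel_step(states, params, C, dt):
    """One Euler step of three imperfect Lorenz 63 models coupled through
    connection terms C * (state_j - state_i) added to every equation."""
    new = []
    for i, (s, p) in enumerate(zip(states, params)):
        coupling = sum(C * (states[j] - s) for j in range(3) if j != i)
        new.append(s + dt * (lorenz_rhs(s, *p) + coupling))
    return new

# standard parameters are (10, 28, 8/3); each model is perturbed differently
params = [(10.5, 28.0, 8 / 3), (10.0, 29.0, 8 / 3), (10.0, 28.0, 2.9)]
states = [np.array([1.0, 1.0, 20.0]) + 0.1 * k for k in range(3)]
for _ in range(5000):
    states = supermodel_step(states, params, C=20.0, dt=0.002)

# with sufficiently strong connections the imperfect models stay synchronized
spread = max(float(np.linalg.norm(states[k] - states[0])) for k in (1, 2))
print(spread)
```

In the paper the connection coefficients are instead optimized against data from the unperturbed "truth" model; that learning step is what turns the synchronized ensemble into a best representation of the truth.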


2020 ◽  
Author(s):  
Mohamadou Diallo ◽  
Hella Garny ◽  
Roland Eichinger ◽  
Valentina Aquila ◽  
Manfred Ern ◽  
...  

The stratospheric Brewer–Dobson circulation (BDC) is an important element of the climate system, as it determines the concentrations of radiatively active trace gases such as water vapor, ozone and aerosol above the tropopause. Climate models predict that increasing greenhouse gas levels speed up the stratospheric circulation. BDC changes are substantially modulated by different modes of climate variability (QBO, ENSO, the solar cycle), as well as by volcanic aerosols. However, such variability is often not reliably included or represented in current climate model simulations, which hampers the evaluation of model behavior against observations and constitutes a major uncertainty in current climate simulations.

Here, we investigate the main differences between the reanalysis and the CCMI/CMIP6 climate model responses to stratospheric volcanic forcings with regard to the depth and strength of the BDC, focusing on potential changes in its deep and shallow branches. We also discuss the key reasons for the discrepancies in the BDC response between reanalysis-driven and climate model simulations in the lower, middle and upper stratosphere, including uncertainties in the volcanic forcing datasets and missing direct aerosol heating in the reanalysis. Finally, we assess the dynamical mechanisms involved in the volcanically induced BDC changes in order to understand the opposite responses of the lower, middle and upper stratosphere after the Mt Pinatubo eruption.


2012 ◽  
Vol 8 (3) ◽  
pp. 919-933 ◽  
Author(s):  
I. Dorado Liñán ◽  
U. Büntgen ◽  
F. González-Rouco ◽  
E. Zorita ◽  
J. P. Montávez ◽  
...  

Abstract. Past temperature variations are usually inferred from proxy data or estimated using general circulation models. Comparisons between climate estimations derived from proxy records and from model simulations help to better understand mechanisms driving climate variations, and also offer the possibility to identify deficiencies in both approaches. This paper presents regional temperature reconstructions based on tree-ring maximum density series in the Pyrenees, and compares them with the output of global simulations for this region and with regional climate model simulations conducted for the target region. An ensemble of 24 reconstructions of May-to-September regional mean temperature was derived from 22 maximum density tree-ring site chronologies distributed over the larger Pyrenees area. Four different tree-ring series standardization procedures were applied, combining two detrending methods: 300-yr spline and the regional curve standardization (RCS). Additionally, different methodological variants for the regional chronology were generated by using three different aggregation methods. Calibration verification trials were performed in split periods and using two methods: regression and simple variance matching. The resulting set of temperature reconstructions was compared with climate simulations performed with global (ECHO-G) and regional (MM5) climate models. The 24 variants of May-to-September temperature reconstructions reveal a generally coherent pattern of inter-annual to multi-centennial temperature variations in the Pyrenees region for the last 750 yr. However, some reconstructions display a marked positive trend for the entire length of the reconstruction, indicating that the application of the RCS method to a suboptimal set of samples may lead to unreliable results.
Climate model simulations agree with the tree-ring based reconstructions at multi-decadal time scales, suggesting solar variability and volcanism as the main factors controlling preindustrial mean temperature variations in the Pyrenees. Nevertheless, the comparison also highlights differences with the reconstructions, mainly in the amplitude of past temperature variations and in the 20th century trends. Neither proxy-based reconstructions nor model simulations are able to perfectly track the temperature variations of the instrumental record, suggesting that both approaches still need further improvement.
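Of the two calibration methods mentioned, simple variance matching is easy to state: the proxy series is rescaled so that its mean and variance equal those of the instrumental target over the calibration period. A minimal sketch with synthetic series (all numbers illustrative):

```python
import numpy as np

def variance_matching(proxy, target):
    """Rescale a proxy so its calibration-period mean and standard
    deviation match the instrumental target (regression calibration
    would instead shrink the amplitude by the correlation coefficient)."""
    return (proxy - proxy.mean()) / proxy.std() * target.std() + target.mean()

rng = np.random.default_rng(2)
temp = rng.normal(12.0, 0.8, 50)          # instrumental May-Sep mean temperature
proxy = 0.5 * (temp - temp.mean()) + rng.normal(0.0, 0.3, 50)  # density index
recon = variance_matching(proxy, temp)
# by construction the reconstruction now has the target's mean and variance
print(round(float(recon.mean() - temp.mean()), 6),
      round(float(recon.std() / temp.std()), 6))
```

Because variance matching preserves the full target variance regardless of correlation, it yields larger reconstructed amplitudes than regression, which is one reason ensembles of methodological variants like the 24 used here are informative.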


Author(s):  
David A Stainforth ◽  
Thomas E Downing ◽  
Richard Washington ◽  
Ana Lopez ◽  
Mark New

There is a scientific consensus regarding the reality of anthropogenic climate change. This has led to substantial efforts to reduce atmospheric greenhouse gas emissions and thereby mitigate the impacts of climate change on a global scale. Despite these efforts, we are committed to substantial further changes over at least the next few decades. Societies will therefore have to adapt to changes in climate. Both adaptation and mitigation require action on scales ranging from local to global, but adaptation could directly benefit from climate predictions on regional scales while mitigation could be driven solely by awareness of the global problem; regional projections being principally of motivational value. We discuss how recent developments of large ensembles of climate model simulations can be interpreted to provide information on these scales and to inform societal decisions. Adaptation is most relevant as an influence on decisions which exist irrespective of climate change, but which have consequences on decadal time-scales. Even in such situations, climate change is often only a minor influence; perhaps helping to restrict the choice of ‘no regrets’ strategies. Nevertheless, if climate models are to provide inputs to societal decisions, it is important to interpret them appropriately. We take climate ensembles exploring model uncertainty as potentially providing a lower bound on the maximum range of uncertainty and thus a non-discountable climate change envelope. An analysis pathway is presented, describing how this information may provide an input to decisions, sometimes via a number of other analysis procedures and thus a cascade of uncertainty. An initial screening is seen as a valuable component of this process, potentially avoiding unnecessary effort while guiding decision makers through issues of confidence and robustness in climate modelling information. 
Our focus is the usage of decadal to centennial time-scale climate change simulations as inputs to decision making, but we acknowledge that robust adaptation to the variability of present day climate encourages the development of less vulnerable systems as well as building critical experience in how to respond to climatic uncertainty.
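The "non-discountable envelope" reading of an ensemble can be stated in two lines; the projection values below are invented for illustration:

```python
# invented perturbed-physics ensemble of regional warming projections (deg C)
members = [1.8, 2.4, 3.1, 2.0, 4.2, 2.7]

# the ensemble range is read as a lower bound on the full uncertainty:
# outcomes inside it cannot be ruled out, and the true range may be wider
envelope = (min(members), max(members))
print(envelope)  # (1.8, 4.2)
```

A decision screened against this envelope must remain acceptable across the whole interval, which is the sense in which the ensemble range informs adaptation choices without being a probabilistic forecast.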


2020 ◽  
Vol 16 (6) ◽  
pp. 2039-2054
Author(s):  
Suzanne Alice Ghislaine Leroy ◽  
Klaus Arpe ◽  
Uwe Mikolajewicz ◽  
Jing Wu

Abstract. Publications on temperate deciduous tree refugia in Europe are abundant, but little is known about the patterns of temperate tree refugia in eastern Asia, an area where biodiversity survived Quaternary glaciations and which has the world's most diverse temperate flora. Our goal is to compare climate model simulations with pollen data in order to establish the location of glacial refugia during the Last Glacial Maximum (LGM). Limits within which temperate deciduous trees can survive are taken from the literature. The model outputs are first tested for the present by comparing them with published modern pollen data. As this method turned out to be satisfactory for the present, the same approach was used for the LGM. Climate model simulations (ECHAM5 T106), statistically further downscaled, are used to infer the temperate deciduous tree distribution during the LGM. These were compared with available fossil temperate tree pollen occurrences. The impact of the LGM on the eastern Asian climate was much weaker than on the European climate. The area of possible tree growth shifts only by about 2° to the south between the present and the LGM. This contributes to explaining the greater biodiversity of forests in eastern Asia compared to Europe. Climate simulations and the available, although fragmentary, fossil pollen data agree. Therefore, climate estimations can safely be used to fill areas without pollen data by mapping potential refugia distributions. The results show two important areas with population connectivity: the emerged Yellow Sea shelf and the southern Himalayas. These two areas were suitable for temperate deciduous tree growth, providing corridors for population migration and connectivity (i.e. less population fragmentation) in glacial periods. Many tree populations live in interglacial refugia, not glacial ones. 
The fact that the model simulation for the LGM fits so well with observed pollen distribution is another indication that the model used is good enough to also simulate the LGM period.
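The step of turning climate fields into a potential tree distribution amounts to applying survival limits as a mask over the model grid. The thresholds below are hypothetical placeholders, not the literature limits the study actually uses:

```python
import numpy as np

# hypothetical survival limits: coldest-month mean above -15 deg C and
# warmest-month mean above 5 deg C (placeholder values, not the paper's)
t_cold = np.array([[-20.0, -12.0, -5.0],
                   [-18.0, -10.0,  0.0],
                   [-25.0, -14.0,  2.0]])   # coldest-month mean per grid cell
t_warm = np.array([[3.0, 6.0,  9.0],
                   [4.0, 7.0, 11.0],
                   [2.0, 6.0, 12.0]])       # warmest-month mean per grid cell

# grid cells where temperate deciduous trees could potentially grow
possible = (t_cold > -15.0) & (t_warm > 5.0)
print(possible.astype(int))
```

Cells passing all limits form the potential refugia map, which is then checked against fossil pollen occurrences and used to fill areas without pollen data.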


2018 ◽  
Vol 9 (1) ◽  
pp. 135-151 ◽  
Author(s):  
Nadja Herger ◽  
Gab Abramowitz ◽  
Reto Knutti ◽  
Oliver Angélil ◽  
Karsten Lehmann ◽  
...  

Abstract. End users studying impacts and risks caused by human-induced climate change are often presented with large multi-model ensembles of climate projections whose composition and size are arbitrarily determined. An efficient and versatile method that finds a subset which maintains certain key properties from the full ensemble is needed, but very little work has been done in this area. Therefore, users typically make their own somewhat subjective subset choices and commonly use the equally weighted model mean as a best estimate. However, different climate model simulations cannot necessarily be regarded as independent estimates due to the presence of duplicated code and shared development history. Here, we present an efficient and flexible tool that makes better use of the ensemble as a whole by finding a subset with improved mean performance compared to the multi-model mean while at the same time maintaining the spread and addressing the problem of model interdependence. Out-of-sample skill and reliability are demonstrated using model-as-truth experiments. This approach is illustrated with one set of optimisation criteria but we also highlight the flexibility of cost functions, depending on the focus of different users. The technique is useful for a range of applications that, for example, minimise present-day bias to obtain an accurate ensemble mean, reduce dependence in ensemble spread, maximise future spread, ensure good performance of individual models in an ensemble, reduce the ensemble size while maintaining important ensemble characteristics, or optimise several of these at the same time. As in any calibration exercise, the final ensemble is sensitive to the metric, observational product, and pre-processing steps used.
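A brute-force toy version of subset selection can make the cost-function idea concrete. The error fields below are synthetic, and the paper's tool formulates this as an optimisation problem with richer criteria (spread, independence) rather than exhaustive search:

```python
import numpy as np
from itertools import combinations

def best_subset(errors, k):
    """Exhaustively find the k-member subset whose ensemble-mean error
    field has the smallest RMSE, a toy stand-in for minimising a
    present-day-bias cost function over subsets."""
    best, best_rmse = None, np.inf
    for subset in combinations(range(len(errors)), k):
        mean_err = errors[list(subset)].mean(axis=0)
        rmse = float(np.sqrt((mean_err ** 2).mean()))
        if rmse < best_rmse:
            best, best_rmse = subset, rmse
    return best, best_rmse

rng = np.random.default_rng(3)
# synthetic model-minus-observation error fields: 8 models, 100 grid cells,
# with per-model offsets mimicking shared biases from interdependence
errors = rng.normal(0.3, 1.0, (8, 100)) + rng.normal(0.0, 0.5, (8, 1))
subset, rmse = best_subset(errors, 3)
full_rmse = float(np.sqrt((errors.mean(axis=0) ** 2).mean()))
print(subset, round(rmse, 3), round(full_rmse, 3))
```

When models share a common bias, a well-chosen small subset can cancel errors that the equally weighted all-model mean cannot, which is the intuition behind preferring an optimised subset over the full ensemble mean.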


2018 ◽  
Author(s):  
Patrick J. Bartlein ◽  
Sarah L. Shafer

Abstract. The “paleo calendar effect” is a common expression for the impact that the changes in the length of months or seasons over time, related to changes in the eccentricity of Earth's orbit and precession, have on the analysis or summarization of climate-model output. This effect can have significant implications for paleoclimate analyses. In particular, using a “fixed-length” definition of months (i.e. defined by a fixed number of days), as opposed to a “fixed-angular” definition (i.e. defined by a fixed number of degrees of the Earth's orbit), leads to comparisons of data from different positions along the Earth's orbit when comparing paleo with modern simulations. This effect can impart characteristic spatial patterns or signals in comparisons of time-slice simulations that otherwise might be interpreted in terms of specific paleoclimatic mechanisms, and we provide examples for 6, 97, 116, and 127 ka. The calendar effect is exacerbated in transient climate simulations, where, in addition to spatial or map-pattern effects, it can influence the apparent timing of extrema in individual time series and the characterization of phase relationships among series. We outline an approach for adjusting paleo simulations that have been summarized using a modern fixed-length definition of months and that can also be used for summarizing and comparing data archived as daily data. We describe the implementation of this approach in a set of Fortran 90 programs and modules (PaleoCalAdjust v1.0).
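The fixed-angular definition can be made concrete with Kepler's equation. This sketch is an independent toy implementation, not PaleoCalAdjust, and the orbital numbers are illustrative modern-like values:

```python
import numpy as np

def angular_month_lengths(ecc, perihelion_lon_deg, year_length=365.24):
    """Day lengths of twelve fixed-angular months (30 degrees of celestial
    longitude each) via Kepler's equation; different eccentricity or
    perihelion longitude shifts these lengths, which is the root of the
    paleo calendar effect."""
    def days_since_perihelion(true_anom):
        # true anomaly -> eccentric anomaly -> mean anomaly -> time
        E = 2.0 * np.arctan2(np.sqrt(1 - ecc) * np.sin(true_anom / 2),
                             np.sqrt(1 + ecc) * np.cos(true_anom / 2))
        M = E - ecc * np.sin(E)
        return (M % (2 * np.pi)) / (2 * np.pi) * year_length

    # month boundaries every 30 degrees of longitude, measured from
    # perihelion (where the true anomaly is zero)
    bounds = np.radians(np.arange(0, 361, 30, dtype=float) - perihelion_lon_deg)
    t = np.array([days_since_perihelion(b) for b in bounds])
    return np.diff(t) % year_length

months = angular_month_lengths(0.0167, 102.9)   # illustrative modern-like orbit
print(round(float(months.sum()), 2))            # twelve months span one year
```

With higher eccentricity or a shifted perihelion, months near perihelion shorten and those near aphelion lengthen, so averaging model output over fixed-length modern months samples different orbital positions than fixed-angular months would.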


2017 ◽  
Author(s):  
Nadja Herger ◽  
Gab Abramowitz ◽  
Reto Knutti ◽  
Oliver Angélil ◽  
Karsten Lehmann ◽  
...  

Abstract. End users studying impacts and risks caused by human-induced climate change are often presented with large multi-model ensembles of climate projections whose composition and size are arbitrarily determined. An efficient and versatile method that finds a subset which maintains certain key properties from the full ensemble is needed, but very little work has been done in this area. Therefore, users typically make their own somewhat subjective subset choices and commonly use the equally weighted model mean as a best estimate. However, different climate model simulations cannot necessarily be regarded as independent estimates due to the presence of duplicated code and shared development history. Here, we present an efficient and flexible tool that makes better use of the ensemble as a whole by finding a subset with improved mean performance compared to the multi-model mean while at the same time maintaining the spread and addressing the problem of model interdependence. Out-of-sample skill and reliability are demonstrated using model-as-truth experiments. This approach is illustrated with one set of optimisation criteria but we also highlight the flexibility of cost functions, depending on the focus of different users. The technique is useful for a range of applications that, for example, minimise present-day bias to obtain an accurate ensemble mean, reduce dependence in ensemble spread, maximise future spread, ensure good performance of individual models in an ensemble, reduce the ensemble size while maintaining important ensemble characteristics, or optimise several of these at the same time. As in any calibration exercise, the final ensemble is sensitive to the metric, observational product, and pre-processing steps used.


2018 ◽  
Vol 19 (11) ◽  
pp. 1881-1898 ◽  
Author(s):  
Sjoukje Philip ◽  
Sarah F. Kew ◽  
Geert Jan van Oldenborgh ◽  
Emma Aalbers ◽  
Robert Vautard ◽  
...  

Abstract. The extreme precipitation that resulted in historic flooding in central-northern France began on 26 May 2016 and was linked to a large cutoff low. The floods caused some casualties and over a billion euros in damage. To objectively answer the question of whether anthropogenic climate change played a role, a near-real-time "rapid" attribution analysis was performed, using well-established event attribution methods, the best available observational data, and as many climate simulations as possible within that time frame. This study confirms the results of the rapid attribution study. We estimate how anthropogenic climate change has affected the likelihood of exceedance of the observed amount of 3-day precipitation in April–June for the Seine and Loire basins. We find that the observed precipitation in the Seine basin was very rare, with a return period of hundreds of years. It was less rare on the Loire, roughly 1 in 20 years. We evaluated five climate model ensembles for 3-day basin-averaged precipitation extremes in April–June. The four ensembles that simulated these statistics realistically agree well with one another. Combining their results reduces the uncertainty and indicates that the probability of such rainfall has increased over the last century by about a factor of 2.2 (>1.4) on the Seine and 1.9 (>1.5) on the Loire due to anthropogenic emissions. These numbers are virtually the same as those in the near-real-time attribution study by van Oldenborgh et al. Together with the evaluation of the attribution of Storm Desmond by Otto et al., this shows that, for these types of events, near-real-time attribution studies are now possible.
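The return periods quoted here come from extreme-value statistics. As a hedged illustration (a simple method-of-moments Gumbel fit with synthetic data, not the study's actual fitting procedure), a return period can be estimated like this:

```python
import numpy as np

def gumbel_return_period(annual_max, value):
    """Return period in years of exceeding `value`, from a Gumbel fit by
    the method of moments, one simple form of the extreme-value analysis
    behind return-period estimates for precipitation extremes."""
    beta = float(np.sqrt(6.0) * annual_max.std(ddof=1) / np.pi)
    mu = float(annual_max.mean()) - 0.5772 * beta   # 0.5772: Euler-Mascheroni
    p_exceed = 1.0 - np.exp(-np.exp(-(value - mu) / beta))
    return float(1.0 / p_exceed)

rng = np.random.default_rng(4)
# synthetic century of April-June 3-day precipitation maxima (mm)
maxima = 40.0 + 10.0 * rng.gumbel(size=100)
print(round(gumbel_return_period(maxima, float(maxima.max())), 1))
```

Repeating such a fit with and without the anthropogenic trend in the fitted location parameter yields the probability ratio between today's climate and the climate of a century ago, which is how factors like the 2.2 on the Seine are framed.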

