Estimation of ECHAM5 climate model closure parameters with adaptive MCMC

2010 ◽  
Vol 10 (20) ◽  
pp. 9993-10002 ◽  
Author(s):  
H. Järvinen ◽  
P. Räisänen ◽  
M. Laine ◽  
J. Tamminen ◽  
A. Ilin ◽  
...  

Abstract. Climate models contain closure parameters to which the model climate is sensitive. These parameters appear in physical parameterization schemes where some unresolved variables are expressed by predefined parameters rather than being explicitly modeled. Currently, best expert knowledge, based on observations, process studies, large eddy simulations, etc., is used to define the optimal closure parameter values. Here, parameter estimation based on the adaptive Markov chain Monte Carlo (MCMC) method is applied to estimate the joint posterior probability density of a small number (n=4) of closure parameters appearing in the ECHAM5 climate model. The parameters considered are related to clouds and precipitation, and they are sampled by the adaptive random walk process of the MCMC method. The parameter probability densities are estimated simultaneously for all parameters, subject to an objective function. Five alternative formulations of the objective function are tested, all related to the net radiative flux at the top of the atmosphere. Conclusions of the closure parameter estimation tests with a low-resolution ECHAM5 climate model indicate that (i) adaptive MCMC is a viable option for parameter estimation in large-scale computational models, and (ii) the choice of the objective function is crucial for the identifiability of the parameter distributions.
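The adaptive MCMC approach referred to above can be sketched compactly: a random-walk Metropolis sampler whose proposal covariance is tuned from the chain history. The sketch below is illustrative only; the objective function, parameter bounds, and tuning constants are placeholder assumptions, and in the actual study each objective evaluation requires an ECHAM5 simulation.

```python
# Minimal sketch of adaptive random-walk MCMC for closure-parameter estimation.
# The objective J(theta) (e.g. a misfit in top-of-atmosphere net radiative flux)
# and the bounds are placeholders; they are not the paper's configuration.
import numpy as np

def adaptive_mcmc(objective, theta0, bounds, n_iter=5000, adapt_start=500):
    d = len(theta0)
    bounds = np.asarray(bounds, dtype=float)                 # shape (d, 2): [low, high]
    cov = np.diag((0.1 * (bounds[:, 1] - bounds[:, 0])) ** 2)  # initial proposal covariance
    chain = np.zeros((n_iter, d))
    theta = np.asarray(theta0, dtype=float)
    j_theta = objective(theta)
    for i in range(n_iter):
        proposal = np.random.multivariate_normal(theta, cov)
        if np.all((proposal >= bounds[:, 0]) & (proposal <= bounds[:, 1])):
            j_prop = objective(proposal)
            # Metropolis acceptance for a posterior proportional to exp(-J(theta))
            if np.log(np.random.rand()) < j_theta - j_prop:
                theta, j_theta = proposal, j_prop
        chain[i] = theta
        if i >= adapt_start:
            # Adapt the proposal covariance from the chain history (adaptive Metropolis)
            cov = (2.38 ** 2 / d) * np.cov(chain[: i + 1].T) + 1e-10 * np.eye(d)
    return chain

# Hypothetical usage with a quadratic toy objective standing in for the flux misfit:
# chain = adaptive_mcmc(lambda p: np.sum((p - 0.5) ** 2),
#                       theta0=[0.2, 0.2, 0.2, 0.2], bounds=[(0, 1)] * 4)
```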


2008 ◽  
Vol 21 (22) ◽  
pp. 6052-6059 ◽  
Author(s):  
B. Timbal ◽  
P. Hope ◽  
S. Charles

Abstract The consistency between rainfall projections obtained from direct climate model output and statistical downscaling is evaluated. Results are averaged across an area large enough to overcome the difference in spatial scale between these two types of projections and thus make the comparison meaningful. Undertaking the comparison using a suite of state-of-the-art coupled climate models for two forcing scenarios presents a unique opportunity to test whether statistical linkages established between large-scale predictors and local rainfall under the current climate remain valid in future climatic conditions. The study focuses on the southwest corner of Western Australia, a region that has experienced recent winter rainfall declines and for which climate models project, with great consistency, further winter rainfall reductions due to global warming. Results show that, as a first approximation, the magnitude of the modeled rainfall decline in this region is linearly related to the model global warming (a reduction of about 9% per degree), thus linking future rainfall declines to future emission paths. Two statistical downscaling techniques are used to investigate the influence of the choice of technique on projection consistency. In addition, one of the techniques was assessed using different large-scale forcings, to investigate the impact of large-scale predictor selection. Downscaled and direct model projections are consistent across the large number of models and two scenarios considered: there is no tendency for either to be biased, and there is only a small hint that large rainfall declines are reduced in downscaled projections. Of the two techniques, a nonhomogeneous hidden Markov model provides greater consistency with climate models than an analog approach; the differences are due to the choice of the optimal combination of predictors. Thus, statistically downscaled projections require a careful choice of large-scale predictors in order to be consistent with physically based rainfall projections. In particular, a relative humidity predictor, rather than specific humidity, was needed for downscaled projections to be consistent with direct model output projections.
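As a rough illustration of the linear scaling reported above (about a 9% winter rainfall reduction per degree of global-mean warming), the following back-of-the-envelope calculation applies that factor to hypothetical warming levels; the warming values are illustrative, not projections from the paper.

```python
# Back-of-the-envelope use of the reported scaling: ~9% winter rainfall decline
# per degree of global-mean warming over southwest Western Australia.
# The warming levels below are hypothetical examples, not values from the study.
decline_per_degree = 0.09
for warming_degc in (1.0, 2.0, 3.0):
    decline = decline_per_degree * warming_degc
    print(f"{warming_degc:.1f} degC warming -> ~{decline:.0%} rainfall decline")
```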


2021 ◽  
Vol 17 (4) ◽  
pp. 1665-1684
Author(s):  
Leonore Jungandreas ◽  
Cathy Hohenegger ◽  
Martin Claussen

Abstract. Global climate models experience difficulties in simulating the northward extension of the monsoonal precipitation over north Africa during the mid-Holocene as revealed by proxy data. A common feature of these models is that they usually operate on grids that are too coarse to explicitly resolve convection, but convection is the most essential mechanism leading to precipitation in the West African Monsoon region. Here, we investigate how the representation of tropical deep convection in the ICOsahedral Nonhydrostatic (ICON) climate model affects the meridional distribution of monsoonal precipitation during the mid-Holocene by comparing regional simulations of the summer monsoon season (July to September; JAS) with parameterized and explicitly resolved convection. In the explicitly resolved convection simulation, the more localized nature of precipitation and the absence of permanent light precipitation, as compared to the parameterized convection simulation, are closer to expectations. However, in the JAS mean, the parameterized convection simulation produces more precipitation that extends further north than in the explicitly resolved convection simulation, especially between 12 and 17° N. The higher precipitation rates in the parameterized convection simulation are consistent with a stronger monsoonal circulation over land. Furthermore, the atmosphere in the parameterized convection simulation is less stably stratified and notably moister. The differences in atmospheric water vapor are the result of substantial differences in the probability distribution function of precipitation and its resulting interactions with the land surface. The parameterization of convection produces light and large-scale precipitation, keeping the soils moist and supporting the development of convection. In contrast, less frequent but locally intense precipitation events lead to high amounts of runoff in the explicitly resolved convection simulation. The stronger runoff inhibits the moistening of the soil during the monsoon season and limits the amount of water available to evaporation in the explicitly resolved convection simulation.


2020 ◽  
Vol 13 (5) ◽  
pp. 2355-2377
Author(s):  
Vijay S. Mahadevan ◽  
Iulian Grindeanu ◽  
Robert Jacob ◽  
Jason Sarich

Abstract. One of the fundamental factors contributing to the spatiotemporal inaccuracy in climate modeling is the mapping of solution field data between different discretizations and numerical grids used in the coupled component models. The typical climate computational workflow involves evaluation and serialization of the remapping weights during the preprocessing step, which is then consumed by the coupled driver infrastructure during simulation to compute field projections. Tools like the Earth System Modeling Framework (ESMF) (Hill et al., 2004) and TempestRemap (Ullrich et al., 2013) offer the capability to generate conservative remapping weights, while the Model Coupling Toolkit (MCT) (Larson et al., 2001), which is utilized in many production climate models, exposes functionality to make use of the operators to solve the coupled problem. However, such multistep processes present several hurdles in terms of the scientific workflow and impede research productivity. In order to overcome these limitations, we present a fully integrated infrastructure based on the Mesh Oriented datABase (MOAB) (Tautges et al., 2004; Mahadevan et al., 2015) library, which allows for a complete description of the numerical grids and solution data used in each submodel. Through a scalable advancing-front intersection algorithm, the supermesh of the source and target grids is computed, which is then used to assemble the high-order, conservative, and monotonicity-preserving remapping weights between discretization specifications. The Fortran-compatible interfaces in MOAB are utilized to directly link the submodels in the Energy Exascale Earth System Model (E3SM) to enable online remapping strategies in order to simplify the coupled workflow process. We demonstrate the superior computational efficiency of the remapping algorithms in comparison with other state-of-the-science tools and present strong scaling results on large-scale machines for computing remapping weights between the spectral element atmosphere and finite volume discretizations on the polygonal ocean grids.
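Once the remapping weights have been assembled, applying them during the coupled run amounts to a sparse matrix-vector product between the weight matrix and the source field. A minimal sketch is given below; the tiny weight matrix is a made-up placeholder, not actual MOAB, ESMF, or TempestRemap output.

```python
# Minimal sketch of applying conservative remapping weights: each target-cell value
# is a weighted combination of overlapping source cells, i.e. W @ source_field.
# Rows summing to 1 preserve constant fields (consistency); full conservation
# additionally folds source/target cell areas into the weights.
import numpy as np
from scipy.sparse import csr_matrix

# Hypothetical 3-source-cell -> 2-target-cell map.
rows = np.array([0, 0, 1, 1])
cols = np.array([0, 1, 1, 2])
weights = np.array([0.7, 0.3, 0.4, 0.6])
W = csr_matrix((weights, (rows, cols)), shape=(2, 3))

source_field = np.array([290.0, 292.0, 295.0])   # e.g. temperatures on the source grid
target_field = W @ source_field                  # remapped values on the target grid
print(target_field)
```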


2021 ◽  
Author(s):  
Antoine Doury ◽  
Samuel Somot ◽  
Sébastien Gadat ◽  
Aurélien Ribes ◽  
Lola Corre

Abstract Providing reliable information on climate change at local scale remains a challenge of first importance for impact studies and policymakers. Here, we propose a novel hybrid downscaling method combining the strengths of both empirical statistical downscaling methods and Regional Climate Models (RCMs). The aim of this tool is to enlarge the size of high-resolution RCM simulation ensembles at low cost. We build a statistical RCM-emulator by estimating the downscaling function included in the RCM. This framework allows us to learn the relationship between large-scale predictors and a local surface variable of interest over the RCM domain in present and future climate. Furthermore, the emulator relies on a neural network architecture, which grants computational efficiency. The RCM-emulator developed in this study is trained to produce daily maps of the near-surface temperature at the RCM resolution (12 km). The emulator demonstrates an excellent ability to reproduce the complex spatial structure and daily variability simulated by the RCM, and in particular the way the RCM locally refines the low-resolution climate patterns. Training in future climate appears to be a key feature of our emulator. Moreover, there is a huge computational benefit in running the emulator rather than the RCM, since training the emulator takes about 2 hours on a GPU and the prediction is nearly instantaneous. However, further work is needed to improve the way the RCM-emulator reproduces some of the temperature extremes and the intensity of climate change, and to extend the proposed methodology to different regions, GCMs, RCMs, and variables of interest.
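The abstract does not detail the network architecture; the sketch below only illustrates the general emulator idea of learning a mapping from a stack of coarse large-scale predictors to a high-resolution near-surface temperature map with a small convolutional network. Layer sizes, predictor count, and grid dimensions are assumptions, not the configuration used in the study.

```python
# Toy sketch of an RCM-emulator: a small convolutional network mapping coarse
# large-scale predictors to a high-resolution near-surface temperature map.
# Architecture, predictor set, and grid sizes are illustrative assumptions.
import torch
import torch.nn as nn

class ToyRCMEmulator(nn.Module):
    def __init__(self, n_predictors=5, upscale=8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(n_predictors, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Upsample(scale_factor=upscale, mode="bilinear", align_corners=False),
            nn.Conv2d(32, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(16, 1, kernel_size=3, padding=1),  # daily 2 m temperature map
        )

    def forward(self, predictors):  # predictors: (batch, n_predictors, ny, nx)
        return self.net(predictors)

# Example forward pass on random data standing in for coarse large-scale predictors.
model = ToyRCMEmulator()
coarse = torch.randn(4, 5, 16, 16)   # 4 days, 5 predictors, 16x16 coarse grid
high_res = model(coarse)             # -> (4, 1, 128, 128) emulated fine grid
print(high_res.shape)
```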


2012 ◽  
Vol 16 (2) ◽  
pp. 305-318 ◽  
Author(s):  
I. Haddeland ◽  
J. Heinke ◽  
F. Voß ◽  
S. Eisner ◽  
C. Chen ◽  
...  

Abstract. Due to biases in the output of climate models, a bias correction is often needed to make the output suitable for use in hydrological simulations. In most cases only the temperature and precipitation values are bias corrected. However, often there are also biases in other variables such as radiation, humidity and wind speed. In this study, we tested to what extent these variables also need to be bias corrected. The responses of four large-scale hydrological models to radiation, humidity and wind estimates from two climate models are analysed. For the period 1971–2000, these hydrological simulations are compared to simulations using meteorological data based on observations and reanalysis, i.e. the baseline simulation. In both forcing datasets originating from climate models, precipitation and temperature are bias corrected to the baseline forcing dataset. Hence, only the effects of the radiation, humidity and wind estimates are tested here. The direct use of climate model outputs results in substantially different evapotranspiration and runoff estimates when compared to the baseline simulations. A simple bias correction method is implemented and tested by rerunning the hydrological models using bias-corrected radiation, humidity and wind values. The results indicate that bias correction can successfully be used to match the baseline simulations. Finally, historical (1971–2000) and future (2071–2100) model simulations resulting from using bias-corrected forcings are compared to the results using non-bias-corrected forcings. The relative changes in simulated evapotranspiration and runoff are similar for the bias-corrected and non-bias-corrected hydrological projections, although the absolute evapotranspiration and runoff numbers are often very different. The simulated relative and absolute differences when using bias-corrected and non-bias-corrected climate model radiation, humidity and wind values are, however, smaller than the differences reported in the literature resulting from using bias-corrected and non-bias-corrected climate model precipitation and temperature values.
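The abstract describes the method only as "a simple bias correction". One common simple choice, sketched below as an assumption, is a multiplicative monthly scaling that nudges the climate-model monthly means of radiation, humidity or wind toward the baseline climatology.

```python
# Sketch of a simple multiplicative monthly-scaling bias correction. The abstract
# does not specify the exact method used in the study; this form is an assumption
# for illustration and is suitable for positive variables (radiation, wind speed).
import numpy as np

def monthly_scaling_correction(model_hist, baseline, months_hist,
                               model_target, months_target):
    """Scale model_target so the model's historical monthly means match the baseline.

    model_hist, baseline : 1-D arrays over the same historical period
    months_hist, months_target : month index (1-12) for each time step
    """
    corrected = np.asarray(model_target, dtype=float).copy()
    for m in range(1, 13):
        hist_mean = model_hist[months_hist == m].mean()
        base_mean = baseline[months_hist == m].mean()
        if hist_mean > 0:
            corrected[months_target == m] *= base_mean / hist_mean
    return corrected
```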


2016 ◽  
Vol 20 (5) ◽  
pp. 2047-2061 ◽  
Author(s):  
Sebastiano Piccolroaz ◽  
Michele Di Lazzaro ◽  
Antonio Zarlenga ◽  
Bruno Majone ◽  
Alberto Bellin ◽  
...  

Abstract. We present HYPERstream, an innovative streamflow routing scheme based on the width function instantaneous unit hydrograph (WFIUH) theory, which is specifically designed to facilitate coupling with weather forecasting and climate models. The proposed routing scheme preserves geomorphological dispersion of the river network when dealing with horizontal hydrological fluxes, irrespective of the computational grid size inherited from the overlying climate model providing the meteorological forcing. This is achieved by simulating routing within the river network through suitable transfer functions obtained by applying the WFIUH theory to the desired level of detail. The underlying principle is similar to the block-effective dispersion employed in groundwater hydrology, with the transfer functions used to represent the effect on streamflow of morphological heterogeneity at scales smaller than the computational grid. Transfer functions are constructed for each grid cell with respect to the nodes of the network where streamflow is simulated, by taking advantage of the detailed morphological information contained in the digital elevation model (DEM) of the zone of interest. These characteristics make HYPERstream well suited for multi-scale applications, ranging from catchment up to continental scale, and for investigating extreme events (e.g., floods) that require an accurate description of routing through the river network. The routing scheme combines parsimony in the adopted parametrization with computational efficiency, leading to a dramatic reduction of the computational effort with respect to fully gridded models at a comparable level of accuracy. HYPERstream is designed with a simple and flexible modular structure that allows for the selection of any rainfall-runoff model to be coupled with the routing scheme and the choice of different hillslope processes to be represented, which makes the framework particularly suitable for massive parallelization, customization according to specific user needs and preferences, and continuous development and improvement.
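The core routing operation in a WFIUH-type scheme can be illustrated compactly: each grid cell's runoff is convolved with a cell-specific transfer function (unit hydrograph) and the contributions are summed at the network node. The transfer functions and runoff series in the sketch below are made-up placeholders, not HYPERstream configurations.

```python
# Minimal sketch of WFIUH-style routing: streamflow at a node is the sum over grid
# cells of runoff convolved with a cell-specific transfer function derived from
# flow-path lengths to that node. All numbers below are illustrative placeholders.
import numpy as np

def route_to_node(runoff, transfer_functions):
    """runoff: (n_cells, n_steps) runoff per cell; transfer_functions: list of
    1-D unit hydrographs (each summing to 1). Returns streamflow at the node."""
    n_steps = runoff.shape[1]
    streamflow = np.zeros(n_steps)
    for cell_runoff, tf in zip(runoff, transfer_functions):
        streamflow += np.convolve(cell_runoff, tf)[:n_steps]
    return streamflow

# Two cells with different travel-time distributions to the same node.
runoff = np.array([[1.0, 2.0, 0.5, 0.0, 0.0],
                   [0.5, 1.5, 1.0, 0.0, 0.0]])
tfs = [np.array([0.6, 0.3, 0.1]),       # cell close to the node
       np.array([0.1, 0.4, 0.3, 0.2])]  # cell further upstream
print(route_to_node(runoff, tfs))
```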


2020 ◽  
Author(s):  
Danijel Belusic ◽  
Petter Lind ◽  
Oskar Landgren ◽  
Dominic Matte ◽  
Rasmus Anker Pedersen ◽  
...  

Current literature strongly indicates large benefits of convection permitting models for subdaily summer precipitation extremes. There has been less insight about other variables, seasons and weather conditions. We examine new climate simulations over the Nordic region, performed with the HCLIM38 regional climate model at both convection permitting and coarser scales, searching for benefits of using convection permitting resolutions. The Nordic climate is influenced by the North Atlantic storm track and characterised by large seasonal contrasts in temperature and precipitation. It is also in rapid change, most notably in the winter season when feedback processes involving retreating snow and ice lead to larger warming than in many other regions. This makes the area an ideal testbed for regional climate models. We explore the effects of higher resolution and better reproduction of convection on various aspects of the climate, such as snow in the mountains, coastal and other thermal circulations, convective storms and precipitation, with a special focus on extreme events. We investigate how the benefits of convection permitting models change with different variables and seasons, and also their sensitivity to different circulation regimes.


2017 ◽  
Vol 98 (1) ◽  
pp. 79-93 ◽  
Author(s):  
Elizabeth J. Kendon ◽  
Nikolina Ban ◽  
Nigel M. Roberts ◽  
Hayley J. Fowler ◽  
Malcolm J. Roberts ◽  
...  

Abstract Regional climate projections are used in a wide range of impact studies, from assessing future flood risk to climate change impacts on food and energy production. These model projections are typically at 12–50-km resolution, providing valuable regional detail but with inherent limitations, in part because of the need to parameterize convection. The first climate change experiments at convection-permitting resolution (kilometer-scale grid spacing) are now available for the United Kingdom; the Alps; Germany; Sydney, Australia; and the western United States. These models give a more realistic representation of convection and are better able to simulate hourly precipitation characteristics that are poorly represented in coarser-resolution climate models. Here we examine these new experiments to determine whether future midlatitude precipitation projections are robust from coarse to higher resolutions, with implications also for the tropics. We find that the explicit representation of the convective storms themselves, only possible in convection-permitting models, is necessary for capturing changes in the intensity and duration of summertime rain on daily and shorter time scales. Other aspects of rainfall change, including changes in seasonal mean precipitation and event occurrence, appear robust across resolutions, and therefore coarse-resolution regional climate models are likely to provide reliable future projections, provided that large-scale changes from the global climate model are reliable. The improved representation of convective storms also has implications for projections of wind, hail, fog, and lightning. We identify a number of impact areas, especially flooding, but also transport and wind energy, for which very high-resolution models may be needed for reliable future assessments.


2016 ◽  
Vol 29 (11) ◽  
pp. 4099-4119 ◽  
Author(s):  
Shan Li ◽  
Shaoqing Zhang ◽  
Zhengyu Liu ◽  
Xiaosong Yang ◽  
Anthony Rosati ◽  
...  

Abstract Uncertainty in cumulus convection parameterization is one of the most important causes of model climate drift, through interactions between the large-scale background and local convection schemes that use empirically set parameters. Without addressing the large-scale feedback, the calibrated parameter values within a convection scheme are usually not optimal for a climate model. This study first designs a multiple-column atmospheric model that includes large-scale feedbacks for cumulus convection and then explores the role of large-scale feedbacks in cumulus convection parameter estimation using an ensemble filter. The performance of convection parameter estimation with or without the presence of large-scale feedback is examined. It is found that including large-scale feedbacks in cumulus convection parameter estimation can significantly improve the estimation quality. This is because large-scale feedbacks help transform local convection uncertainties into global climate sensitivities, and including these feedbacks enhances the statistical representation of the relationship between parameters and state variables. The results of this study provide insights for further understanding of climate drift induced by imperfect cumulus convection parameterization, which may help improve climate modeling.
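Ensemble-filter parameter estimation of the kind discussed above is commonly implemented by state augmentation: the uncertain convection parameter is updated from its ensemble covariance with observed state variables. The toy sketch below illustrates one such EnKF-style update with a stand-in scalar relation between parameter and state; it is not the multiple-column atmospheric model used in the study.

```python
# Toy sketch of ensemble-filter parameter estimation via state augmentation:
# an uncertain parameter is updated from its ensemble covariance with an observed
# state variable. The linear "model physics" and all numbers are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_ens = 40
true_param = 2.0

# Prior parameter ensemble and a stand-in model mapping parameter -> observable state.
params = rng.normal(1.0, 0.5, n_ens)
state = 3.0 * params + rng.normal(0.0, 0.1, n_ens)

obs, obs_err = 3.0 * true_param, 0.2

# Kalman-type gain for the parameter, built from the parameter-state covariance.
cov_ps = np.cov(params, state)[0, 1]
var_s = np.var(state, ddof=1)
gain = cov_ps / (var_s + obs_err ** 2)

# Perturbed-observation EnKF update of the parameter ensemble.
perturbed_obs = obs + rng.normal(0.0, obs_err, n_ens)
params_updated = params + gain * (perturbed_obs - state)
print(params.mean(), "->", params_updated.mean())
```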

