Multi-model ensemble mean of global climate models fails to reproduce early twentieth century Arctic warming

Polar Science ◽  
2021 ◽  
pp. 100677
Author(s):  
Mikhail M. Latonin ◽  
Igor L. Bashmachnikov ◽  
Leonid P. Bobylev ◽  
Richard Davy
2013 ◽  
Vol 26 (14) ◽  
pp. 4910-4929 ◽  
Author(s):  
Sharon C. Delcambre ◽  
David J. Lorenz ◽  
Daniel J. Vimont ◽  
Jonathan E. Martin

Abstract The present study focuses on diagnosing the intermodel variability of nonzonally averaged NH winter jet stream portrayal in 17 global climate models (GCMs) from phase three of the Coupled Model Intercomparison Project (CMIP3). Relative to the reanalysis, the ensemble-mean 300-hPa Atlantic jet is too zonally extended and located too far equatorward in GCMs. The Pacific jet varies significantly between modeling groups, with large biases in the vicinity of the jet exit region that cancel in the ensemble mean. After seeking relationships between twentieth-century model wind biases and 1) the internal modes of jet variability or 2) tropical sea surface temperatures (SSTs), it is found that biases in upper-level winds are strongly related to an ENSO-like pattern in winter-mean tropical Pacific Ocean SST biases. The spatial structure of the leading modes of variability of the upper-level jet in the twentieth century is found to be accurately modeled in all 17 GCMs. Also, it is shown that Pacific model biases in the longitude of EOFs 1 and 2 are strongly linked to the modeled longitude of the Pacific jet exit, indicating that the improved characterization of the mean state of the Pacific jet may positively impact the modeled variability. This work suggests that improvements in model portrayal of the tropical Pacific mean state may significantly advance the portrayal of the mean state of the Pacific and Atlantic jets, which will consequently improve the modeled jet stream variability in the Pacific. To complement these findings, a companion paper examines the twenty-first-century GCM projections of the nonzonally averaged NH jet streams.
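The EOF analysis of jet variability described above can be sketched with a standard SVD decomposition of an anomaly matrix. The data here are synthetic stand-ins, not the CMIP3 wind fields used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic winter-mean 300-hPa zonal wind anomalies: 50 winters x 200 grid points.
# (Illustrative stand-in for the reanalysis or GCM output analyzed in the paper.)
n_time, n_space = 50, 200
u_wind = rng.standard_normal((n_time, n_space))

# EOF analysis via SVD of the (time x space) anomaly matrix:
# rows of vt are the spatial EOF patterns, u*s the principal-component time series.
anomalies = u_wind - u_wind.mean(axis=0)
u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
variance_fraction = s**2 / np.sum(s**2)

eof1 = vt[0]          # leading spatial pattern (e.g., a jet-shift mode)
pc1 = u[:, 0] * s[0]  # its time series

print(f"EOF1 explains {variance_fraction[0]:.1%} of the variance")
```

Comparing the longitude of maximum amplitude of a modeled EOF against its reanalysis counterpart is one simple way to quantify the biases in the leading modes discussed above.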


Author(s):  
SOURABH SHRIVASTAVA ◽  
RAM AVTAR ◽  
PRASANTA KUMAR BAL

Global climate models (GCMs) with coarse horizontal resolution produce large biases over mountainous regions, and single-model outputs or simple multi-model ensemble (SMME) outputs carry similarly large biases. To improve the prediction of rainfall extremes, this study takes an alternative modeling approach, using five different machine learning (ML) algorithms to reduce the biases of North American Multi-Model Ensemble (NMME) GCMs for Indian summer monsoon rainfall from 1982 to 2009. Random forest (RF), AdaBoost (Ada), gradient boosting (Grad), bagging (Bag), and extra trees (Extra) regression models are used, and the results from each model are compared against observations. In the SMME, a wet bias of 20 mm/day and an RMSE of up to 15 mm/day are found over the Himalayan region. All the ML models, however, can bring the mean bias down to [Formula: see text] mm/day and the RMSE down to 2 mm/day. The interannual variability in the ML outputs is closer to observations than that of the SMME, and correlations of 0.5 to 0.8 are found for all the ML models, higher than for the SMME. Of the five ML models, RF and Grad perform best, showing high correlation over the Himalayan region. In conclusion, by taking full advantage of the different models, the proposed ML-based multi-model ensemble method is shown to be accurate and effective.
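The five regression families named above are all available in scikit-learn, so the approach can be sketched directly. The data below are a toy stand-in (random member forecasts and noisy "observations"), not the NMME hindcasts used in the study:

```python
import numpy as np
from sklearn.ensemble import (RandomForestRegressor, AdaBoostRegressor,
                              GradientBoostingRegressor, BaggingRegressor,
                              ExtraTreesRegressor)

rng = np.random.default_rng(42)

# Toy stand-in: NMME member forecasts (columns) as predictors, observed rainfall as target.
n_days, n_members = 300, 8
X = rng.gamma(2.0, 4.0, size=(n_days, n_members))   # member rainfall forecasts (mm/day)
y = X.mean(axis=1) + rng.normal(0.0, 1.0, n_days)   # synthetic "observations" with noise

X_train, X_test = X[:250], X[250:]
y_train, y_test = y[:250], y[250:]

models = {
    "RF": RandomForestRegressor(n_estimators=100, random_state=0),
    "Ada": AdaBoostRegressor(random_state=0),
    "Grad": GradientBoostingRegressor(random_state=0),
    "Bag": BaggingRegressor(random_state=0),
    "Extra": ExtraTreesRegressor(n_estimators=100, random_state=0),
}

# Simple MME baseline: equal-weight mean of the members.
smme_rmse = np.sqrt(np.mean((X_test.mean(axis=1) - y_test) ** 2))

for name, model in models.items():
    pred = model.fit(X_train, y_train).predict(X_test)
    rmse = np.sqrt(np.mean((pred - y_test) ** 2))
    print(f"{name}: RMSE = {rmse:.2f} mm/day (SMME baseline: {smme_rmse:.2f})")
```

In the study's setting, each ML model learns a nonlinear mapping from the member forecasts to the observations, which is what allows it to correct the systematic wet bias that a plain equal-weight average retains.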


2021 ◽  
Author(s):  
Thedini Asali Peiris ◽  
Petra Döll

<p>Unlike global climate models, hydrological models cannot simulate the feedbacks among atmospheric processes, vegetation, water, and energy exchange at the land surface. This severely limits their ability to quantify the impact of climate change and the concurrent increase of atmospheric CO<sub>2</sub> concentrations on evapotranspiration and thus runoff. Hydrological models generally calculate actual evapotranspiration as a fraction of potential evapotranspiration (PET), which is computed as a function of temperature and net radiation and sometimes of humidity and wind speed. Almost no hydrological model takes into account that PET changes because the vegetation responds to changing CO<sub>2</sub> and climate. This active vegetation response consists of three components. With higher CO<sub>2</sub> concentrations, 1) plant stomata close, reducing transpiration (physiological effect), and 2) plants may grow better, with more leaves, increasing transpiration (structural effect), while 3) climatic changes lead to changes in plant growth and even biome shifts, which change evapotranspiration. Global climate models, which include dynamic vegetation models, simulate all these processes, albeit with high uncertainty, and take into account the feedbacks to the atmosphere.</p><p>Milly and Dunne (2016) (MD) found that in the case of RCP8.5 the change of PET (computed using the Penman-Monteith equation) between 1981–2000 and 2081–2100 is much higher than the change of non-water-stressed evapotranspiration (NWSET) computed by an ensemble of global climate models. 
This overestimation is partially due to the neglect of the active vegetation response and partially due to the neglected feedbacks between the atmosphere and the land surface.</p><p>The objective of this paper is to present a simple approach for hydrological models that enables them to mimic the effect of active vegetation on potential evapotranspiration under climate change, thus improving the computation of freshwater-related climate change hazards by hydrological models. MD proposed an alternative approach to estimating changes in PET for impact studies that is a function only of changes in energy, not of temperature, and achieves a good fit to the ensemble-mean change of evapotranspiration computed by the ensemble of global climate models in months and grid cells without water stress. We developed an implementation of the MD idea for hydrological models using the Priestley-Taylor equation (PET-PT) to estimate PET as a function of net radiation and temperature. With PET-PT, an increasing temperature trend leads to strong increases in PET. Our proposed methodology (PET-MD) helps to remove this effect, retaining the impact of temperature on PET but not on long-term PET change.</p><p>We implemented the PET-MD approach in the global hydrological model WaterGAP2.2d and computed daily time series of PET between 1981 and 2099 using bias-adjusted climate data of four global climate models for RCP 8.5. We evaluated the computed PET-PT and PET-MD at the grid-cell level and globally, also comparing to the results of the Milly and Dunne study. The global analysis suggests that the application of PET-MD reduces the PET change until the end of this century from 3.341 mm/day according to PET-PT to 3.087 mm/day (ensemble mean over the four global climate models).</p><p>Milly, P.C.D., Dunne, K.A. (2016). DOI:10.1038/nclimate3046.</p>
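The Priestley-Taylor formulation named above (PET-PT) is a standard equation, so its temperature sensitivity can be illustrated directly. This is a minimal sketch using the usual FAO-style constants, not the WaterGAP2.2d implementation:

```python
import math

def priestley_taylor_pet(t_celsius, net_radiation_mj, alpha=1.26,
                         gamma=0.066, lam=2.45, ground_flux=0.0):
    """Daily PET (mm/day) from the Priestley-Taylor equation.

    t_celsius        : daily mean air temperature (degC)
    net_radiation_mj : net radiation (MJ m-2 day-1)
    gamma            : psychrometric constant (kPa/degC)
    lam              : latent heat of vaporization (MJ/kg)
    """
    # Saturation vapour pressure (kPa) and its slope with temperature (kPa/degC).
    es = 0.6108 * math.exp(17.27 * t_celsius / (t_celsius + 237.3))
    delta = 4098.0 * es / (t_celsius + 237.3) ** 2
    return alpha * (delta / (delta + gamma)) * (net_radiation_mj - ground_flux) / lam

# At fixed net radiation, PET rises with temperature; this is exactly the
# long-term effect the PET-MD correction is designed to remove.
print(priestley_taylor_pet(15.0, 12.0))
print(priestley_taylor_pet(25.0, 12.0))
```

The MD idea then amounts to letting the radiation term drive long-term PET change while the temperature term is held to its role in day-to-day variability.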


2020 ◽  
Author(s):  
James Murphy

<p>The challenge of combining initialised and uninitialised decadal projections</p><p>James Murphy, Robin Clark, Nick Dunstone, Glen Harris, Leon Hermanson and Doug Smith</p><p>During the past 10 years or so, exploratory work in initialised decadal climate prediction, using global climate models started from recent analyses of observations, has grown into a coordinated international programme that contributes to IPCC assessments. At the same time, countries have continued to develop and update their national climate change scenarios. These typically cover the full 21<sup>st</sup> century, including the initial decade that overlaps with the latest initialised forecasts. To date, however, national scenarios continue to be based exclusively on long-term (uninitialised) climate change simulations, with initialised information regarded as a separate stream of information.</p><p>We will use early results from the latest UK national scenarios (UKCP), and the latest CMIP6 initialised predictions, to illustrate the potential and challenges associated with combining both streams of information. This involves assessing the effects of initialisation on predictability and uncertainty (as indicated, for example, by the skill of ensemble-mean forecasts and the spread amongst constituent ensemble members). A particular challenge here involves interpretation of the "signal-to-noise" problem, in which ensemble-mean skill is sometimes larger than would be expected on the basis of the ensemble spread. In addition to initialisation, we will also emphasise the importance of understanding how the assessment of climate risks depends on other features of prediction system design, including the sampling of model uncertainties and the simulation of internal climate variability.</p>
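The "signal-to-noise" diagnostic mentioned above is commonly quantified with the ratio of predictable components (RPC) of Eade et al. (2014). A minimal sketch on synthetic hindcasts (not the UKCP or CMIP6 data discussed in the abstract) looks like this:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy hindcast set: 10 ensemble members x 40 start years, plus observations.
n_members, n_years = 10, 40
signal = rng.standard_normal(n_years)
members = 0.3 * signal + rng.standard_normal((n_members, n_years))
obs = signal + 0.5 * rng.standard_normal(n_years)

ens_mean = members.mean(axis=0)
r = np.corrcoef(ens_mean, obs)[0, 1]

# Ratio of predictable components: skill relative to the model's own
# signal fraction. RPC > 1 indicates the signal-to-noise paradox, i.e.
# ensemble-mean skill larger than the ensemble spread would suggest.
sigma_sig = ens_mean.std(ddof=1)   # predictable (ensemble-mean) spread
sigma_tot = members.std(ddof=1)    # total spread across all members
rpc = r / (sigma_sig / sigma_tot)
print(f"r = {r:.2f}, RPC = {rpc:.2f}")
```

Combining initialised and uninitialised streams is harder when RPC differs between them, since the ensemble spread then carries different meanings in the two sets of simulations.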


2021 ◽  
Vol 289 ◽  
pp. 01009
Author(s):  
Valeriya Petruhina

The problem of predicting climate change and its impact on humans has become especially important in recent times. Mechanisms and methods for predicting climate behavior in the various regions of our planet have been under development for a long time. Owing to climate change, aggressive human impact on nature, and other factors, the methods developed in the mid-twentieth century are becoming ineffective, and calculations now require several methods in combination, which is time-consuming but feasible. The article considers a technology for processing geoclimatic data that is used to form spatially distributed predictive estimates of the state of the atmosphere.


2018 ◽  
Vol 2018 ◽  
pp. 1-24 ◽  
Author(s):  
Jacob Agyekum ◽  
Thompson Annor ◽  
Benjamin Lamptey ◽  
Emmannuel Quansah ◽  
Richard Yao Kuma Agyeman

A selected number of global climate models (GCMs) from the fifth Coupled Model Intercomparison Project (CMIP5) were evaluated over the Volta Basin for precipitation. Model biases were computed by taking the differences between the averages over the period 1950–2004 of the models and the observations, normalized by the observed average, for the annual and seasonal timescales. The Community Earth System Model, version 1-Biogeochemistry (CESM1-BGC), the Community Climate System Model Version 4 (CCSM4), the Max Planck Institute Earth System Model, Medium Range (MPI-ESM-MR), the Norwegian Earth System Model (NorESM1-M), and the multimodel ensemble mean were able to simulate the observed climatological mean of the annual total precipitation well (average biases of 1.9% to 7.5%) and hence were selected for the seasonal and monthly timescales. Overall, all the models (CESM1-BGC, CCSM4, MPI-ESM-MR, and NorESM1-M) scored relatively low for correlation (<0.5) but differed in their simulation of the observed temporal variability, ranging from 1.0 to 3.0 for the seasonal total. For the annual cycle of the monthly total, the CESM1-BGC, the MPI-ESM-MR, and the NorESM1-M were able to simulate the peak of the observed rainy season well in the Soudano-Sahel, the Sahel, and the entire basin, respectively, while all the models had difficulty in simulating the bimodal pattern of the Guinea Coast. The ensemble mean performs better than the individual models across the various timescales.
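The normalized bias described above is a simple ratio; a short sketch (with made-up basin-mean totals, not the Volta Basin data) makes the convention explicit:

```python
import numpy as np

def normalized_bias_percent(model_clim, obs_clim):
    """Bias of a model climatology relative to observations, in percent:
    100 * (mean(model) - mean(obs)) / mean(obs),
    as applied here to annual or seasonal precipitation totals."""
    return 100.0 * (np.mean(model_clim) - np.mean(obs_clim)) / np.mean(obs_clim)

# Toy example: basin-mean annual precipitation totals (mm/yr) over 1950-2004.
obs = np.full(55, 1000.0)
model = np.full(55, 1075.0)
print(normalized_bias_percent(model, obs))  # -> 7.5
```

Under this convention, the 1.9% to 7.5% range quoted above corresponds to models whose long-term annual totals sit within a few percent of the observed climatology.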


2021 ◽  
Author(s):  
Matias Ezequiel Olmo ◽  
Rocio Balmaceda-Huarte ◽  
Maria Laura Bettolli

Abstract High-resolution climate information is required over southeastern South America (SESA) for a better understanding of the observed and projected climate changes due to their strong socio-economic and hydrological impacts. This work therefore focuses on the construction of an unprecedented multi-model ensemble of statistically downscaled global climate models (GCMs) for daily precipitation, considering different statistical techniques - including analogs, generalized linear models and neural networks - and a variety of CMIP5 and CMIP6 models. The skills and shortcomings of the different downscaled models were identified. Most of the methods added value in the representation of the main features of daily precipitation, especially in the spatial and intra-annual variability of extremes. The statistical methods proved to be sensitive to the driving GCMs, although the choice of ESD family also introduced differences in the simulations. The statistically downscaled projections depicted increases in mean precipitation associated with a rising frequency of extreme events - mostly during the warm season - following the registered trends over SESA. Change rates were consistent among downscaled models up to the mid-21st century, when model spread started to emerge. Furthermore, these projections were compared to the available CORDEX-CORE RCM simulations, evidencing a consistent agreement between statistical and dynamical downscaling procedures in terms of the sign of the changes, with some differences in their intensity. Overall, this study evidences the potential of statistical downscaling in a changing climate and contributes to its ongoing development over SESA.
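Of the techniques listed above, the analog method is the simplest to sketch: each large-scale GCM state is matched to its most similar historical state, and the observed local precipitation of that day supplies the downscaled value. The data here are synthetic stand-ins, not the SESA predictors used in the study:

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy analog downscaling setup.
n_hist, n_gcm, n_predictors = 500, 20, 6
hist_predictors = rng.standard_normal((n_hist, n_predictors))   # reanalysis fields
hist_precip = rng.gamma(1.5, 3.0, n_hist)                       # observed station rain (mm)
gcm_predictors = rng.standard_normal((n_gcm, n_predictors))     # GCM daily fields

def analog_downscale(gcm_day, hist_x, hist_y):
    # Euclidean distance in predictor space; the nearest historical day
    # supplies the local precipitation value.
    dist = np.linalg.norm(hist_x - gcm_day, axis=1)
    return hist_y[np.argmin(dist)]

downscaled = np.array([analog_downscale(day, hist_predictors, hist_precip)
                       for day in gcm_predictors])
print(downscaled.shape)  # one local precipitation value per GCM day
```

Generalized linear models and neural networks replace the nearest-neighbour lookup with a fitted predictor-to-precipitation mapping, which is one source of the between-method differences noted above.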


2012 ◽  
Vol 25 (7) ◽  
pp. 2456-2470 ◽  
Author(s):  
Koichi Sakaguchi ◽  
Xubin Zeng ◽  
Michael A. Brunke

Abstract Motivated by increasing interest in regional- and decadal-scale climate predictions, this study systematically analyzed the spatial- and temporal-scale dependence of the prediction skill of global climate models for surface air temperature (SAT) change in the twentieth century. The linear trends of annual mean SAT over moving time windows (running linear trends) from two observational datasets and simulations by three global climate models [Community Climate System Model, version 3.0 (CCSM3.0), Climate Model, version 2.0 (CM2.0), and Model E-H] that participated in CMIP3 are compared over several temporal (10-, 20-, 30-, 40-, and 50-yr trends) and spatial (5° × 5°, 10° × 10°, 15° × 15°, 20° × 20°, 30° × 30°, 30° latitudinal bands, hemispheric, and global) scales. The distribution of root-mean-square error (RMSE) improves with increasing spatial and temporal scales, approaching the observational uncertainty range at the largest scales. Linear correlation shows a similar tendency, but the limited observational record does not provide statistical significance over the longer temporal scales. The comparison of RMSE to climatology and a Monte Carlo test using preindustrial control simulations suggest that the multimodel ensemble mean is able to reproduce robust climate signals at 30° zonal-mean or larger spatial scales, while correlation requires hemispheric or global means for the twentieth-century simulations. Persistently lower performance is observed over the northern high latitudes and the North Atlantic southeast of Greenland. Although several caveats exist for the metrics used in this study, analyses across scales and/or over running time windows can be taken as one approach to climate system model evaluation.
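The running-linear-trend diagnostic described above is easy to sketch. The series below is a synthetic SAT anomaly record (trend plus noise), not the observational datasets used in the study:

```python
import numpy as np

def running_trends(series, window):
    """Linear trends (per time step) over all moving windows of a given length."""
    trends = []
    for start in range(len(series) - window + 1):
        segment = series[start:start + window]
        slope = np.polyfit(np.arange(window), segment, 1)[0]
        trends.append(slope)
    return np.array(trends)

rng = np.random.default_rng(3)
# Toy annual-mean SAT anomaly series, 100 years: forced trend + internal variability.
years = np.arange(100)
sat = 0.01 * years + 0.2 * rng.standard_normal(100)

for window in (10, 20, 30, 40, 50):
    tr = running_trends(sat, window)
    print(f"{window}-yr trends: spread = {tr.std():.4f} degC/yr")
```

The spread of the trend estimates shrinks as the window lengthens, which is the basic reason skill metrics improve toward the larger temporal scales in the analysis above.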


2017 ◽  
Vol 30 (16) ◽  
pp. 6279-6295 ◽  
Author(s):  
Martin B. Stolpe ◽  
Iselin Medhaug ◽  
Reto Knutti

Recent studies have suggested that significant parts of the observed warming in the early and the late twentieth century were caused by multidecadal internal variability centered in the Atlantic and Pacific Oceans. Here, a novel approach is used that searches for segments of unforced preindustrial control simulations from global climate models that best match the observed Atlantic and Pacific multidecadal variability (AMV and PMV, respectively). In this way, estimates of the influence of AMV and PMV on global temperature are made that are consistent both spatially and across variables. Combined Atlantic and Pacific internal variability affects global surface temperatures by up to 0.15°C peak to peak on multidecadal time scales. Internal variability contributed to the warming between the 1920s and 1940s, the subsequent cooling period, and the warming since then. However, variations in the rate of warming remain after removing the influence of internal variability associated with AMV and PMV on global temperatures. During most of the twentieth century, AMV dominates over PMV in the multidecadal internal-variability imprint on global and Northern Hemisphere temperatures. Less than 10% of the observed global warming during the second half of the twentieth century is caused by internal variability in these two ocean basins, reinforcing the attribution of most of the observed warming to anthropogenic forcings.
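The segment search described above can be sketched as a sliding-window correlation match. The indices below are synthetic (a sinusoid standing in for the observed AMV index, white noise for the control run), not the model output used in the study:

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy version of the segment search: slide a window along an unforced
# control-run AMV index and keep the segment best matching the observed one.
obs_amv = np.sin(np.linspace(0, 4 * np.pi, 80))   # "observed" index, 80 yr
control_amv = rng.standard_normal(1000)           # control-run index, 1000 yr

best_r, best_start = -np.inf, None
for start in range(len(control_amv) - len(obs_amv) + 1):
    segment = control_amv[start:start + len(obs_amv)]
    r = np.corrcoef(segment, obs_amv)[0, 1]
    if r > best_r:
        best_r, best_start = r, start

print(f"best-matching segment starts at control year {best_start}, r = {best_r:.2f}")
```

In the study, the matched segments then supply spatially complete, physically consistent estimates of the temperature imprint of AMV and PMV, which is what a statistical index regression alone cannot provide.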


2006 ◽  
Vol 19 (20) ◽  
pp. 5305-5318 ◽  
Author(s):  
Terry C. K. Lee ◽  
Francis W. Zwiers ◽  
Xuebin Zhang ◽  
Min Tsao

Abstract It is argued that simulations of the twentieth century performed with coupled global climate models with specified historical changes in external radiative forcing can be interpreted as climate hindcasts. A simple Bayesian method for postprocessing such simulations is described, which produces probabilistic hindcasts of interdecadal temperature changes on large spatial scales. Hindcasts produced for the last two decades of the twentieth century are shown to be skillful. The suggestion that skillful decadal forecasts can be produced on large regional scales by exploiting the response to anthropogenic forcing provides additional evidence that anthropogenic change in the composition of the atmosphere has influenced the climate. In the absence of large negative volcanic forcing on the climate system (which cannot presently be forecast), it is predicted that the global mean temperature for the decade 2000–09 will lie above the 1970–99 normal with a probability of 0.94. The global mean temperature anomaly for this decade relative to 1970–99 is predicted to be 0.35°C with a 5%–95% confidence range of 0.21°–0.48°C.
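The published summary numbers above can be used to illustrate how a probabilistic hindcast is read, assuming (purely for illustration) a Gaussian predictive distribution; the paper's own 0.94 probability comes from its full Bayesian calculation, so this shortcut need not reproduce it:

```python
from statistics import NormalDist

# Gaussian recovered from the published summary: mean 0.35 degC,
# 5%-95% range 0.21-0.48 degC relative to the 1970-99 normal.
mean = 0.35
sigma = (0.48 - 0.21) / (2 * 1.645)   # 5th/95th percentiles sit ~1.645 sigma out

dist = NormalDist(mean, sigma)
p_above_normal = 1.0 - dist.cdf(0.0)  # probability the decadal anomaly is positive
print(f"sigma ~ {sigma:.3f} degC, P(anomaly > 0) ~ {p_above_normal:.3f}")
```

The asymmetry of the published 5%-95% range around the mean (0.14 below, 0.13 above) hints that the actual predictive distribution is not exactly Gaussian, which is one reason the illustration above is only approximate.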

