Common EOFs in atmospheric science and large-scale flow

Author(s):  
Abdel Hannachi ◽  
Kathrin Finke ◽  
Nickolay Trendafilov

Conventional analyses of large-scale atmospheric variability and teleconnections are obtained using the empirical orthogonal function (EOF) method, which was developed mainly to deal with single fields. With the growing volume of observed and simulated large-scale atmospheric data, including climate model output such as CMIP, there is a need for methods with efficient algorithms that enable the analysis and comparison/validation of climate model simulations. Here we describe the common EOF method, which finds the shared patterns of a set of large-scale atmospheric fields and enables several model outputs to be compared simultaneously. A step-wise, sequential algorithm is presented that avoids a difficulty encountered in previous algorithms, namely the lack of simultaneously monotonic change of the eigenvalues of all fields. The theory and algorithm are presented, and applications to large-scale teleconnections from various reanalysis products and CMIP6 are discussed.
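The idea can be illustrated with a minimal numerical sketch. The field sizes and data are invented, and eigen-decomposing a summed covariance matrix is used here only as a simple stand-in for the full common-EOF optimization described in the abstract:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical anomaly fields (e.g. two model outputs), shape (time, space)
X1 = rng.standard_normal((200, 50))
X2 = rng.standard_normal((300, 50))

def covariance(X):
    """Sample covariance of an anomaly field (time x space)."""
    Xa = X - X.mean(axis=0)
    return Xa.T @ Xa / (Xa.shape[0] - 1)

# Simple proxy for common EOFs: eigenvectors of the summed covariances,
# yielding one set of spatial patterns shared by both fields.
C = covariance(X1) + covariance(X2)
eigvals, eigvecs = np.linalg.eigh(C)
order = np.argsort(eigvals)[::-1]
common_patterns = eigvecs[:, order[:5]]   # leading 5 shared patterns

# Each field then gets its own variance along the shared patterns,
# which is what makes side-by-side model comparison possible:
var1 = np.var(X1 @ common_patterns, axis=0)
var2 = np.var(X2 @ common_patterns, axis=0)
```

The key point is that a single set of spatial patterns is diagonalized against all fields at once, rather than computing separate EOFs per field.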

2013 ◽  
Vol 13 (3) ◽  
pp. 779-793 ◽  
Author(s):  
B. Ziv ◽  
Y. Kushnir ◽  
J. Nakamura ◽  
N. H. Naik ◽  
T. Harpaz

Abstract. The study aims to evaluate the ability of global, coupled climate models to reproduce the synoptic regime of the Mediterranean Basin. The output of simulations of the 9 models included in the IPCC CMIP3 effort is compared to the NCEP-NCAR reanalyzed data for the period 1961–1990. The study examined the spatial distribution of cyclone occurrence, the mean Mediterranean upper- and lower-level troughs, the inter-annual variation and trend in the occurrence of the Mediterranean cyclones, and the main large-scale circulation patterns, represented by rotated EOFs of 500 hPa and sea level pressure. The models successfully reproduce the two maxima in cyclone density in the Mediterranean and their locations, the location of the average upper- and lower-level troughs, the relative inter-annual variation in cyclone occurrences, and the structure of the four leading large-scale EOFs. The main discrepancy is the models' underestimation of the cyclone density in the Mediterranean, especially in its western part. The models' skill in reproducing the cyclone distribution is found to be correlated with their spatial resolution, especially in the vertical. The ongoing improvement in model spatial resolution suggests that their ability to reproduce the Mediterranean cyclones will improve as well.


2021 ◽  
Vol 12 (1) ◽  
Author(s):  
Thomas Slater ◽  
Andrew Shepherd ◽  
Malcolm McMillan ◽  
Amber Leeson ◽  
Lin Gilbert ◽  
...  

Abstract. Runoff from the Greenland Ice Sheet has increased over recent decades affecting global sea level, regional ocean circulation, and coastal marine ecosystems, and it now accounts for most of the contemporary mass imbalance. Estimates of runoff are typically derived from regional climate models because satellite records have been limited to assessments of melting extent. Here, we use CryoSat-2 satellite altimetry to produce direct measurements of Greenland’s runoff variability, based on seasonal changes in the ice sheet’s surface elevation. Between 2011 and 2020, Greenland’s ablation zone thinned on average by 1.4 ± 0.4 m each summer and thickened by 0.9 ± 0.4 m each winter. By adjusting for the steady-state divergence of ice, we estimate that runoff was 357 ± 58 Gt/yr on average – in close agreement with regional climate model simulations (root mean square difference of 47 to 60 Gt/yr). As well as being 21 % higher between 2011 and 2020 than over the preceding three decades, runoff is now also 60 % more variable from year to year as a consequence of large-scale fluctuations in atmospheric circulation. Because this variability is not captured in global climate model simulations, our satellite record of runoff should help to refine them and improve confidence in their projections.
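A back-of-envelope conversion shows the kind of altimetry-to-mass calculation the abstract describes. The ablation-zone area below is a hypothetical round number, not the value used in the study, and the steady-state divergence adjustment is omitted:

```python
# Convert a mean seasonal surface lowering over an assumed ablation-zone
# area into a mass change in gigatonnes.
RHO_ICE = 917.0                  # ice density, kg/m^3
area_m2 = 2.0e5 * 1e6            # hypothetical ablation-zone area (2e5 km^2)
summer_thinning_m = 1.4          # mean summer thinning from the abstract

# volume (m^3) * density (kg/m^3) -> kg, then / 1e12 -> Gt
mass_loss_gt = area_m2 * summer_thinning_m * RHO_ICE / 1e12
# about 257 Gt for these illustrative numbers
```

The study's actual estimate additionally accounts for ice flowing into the ablation zone, which is why its runoff figure is larger than a raw thinning-times-area product.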


2021 ◽  
Vol 118 (39) ◽  
pp. e2112797118 ◽  
Author(s):  
Michael E. Mann

More than two decades ago, my coauthors, Raymond Bradley and Malcolm Hughes, and I published the now iconic “hockey stick” curve. It was a simple graph, derived from large-scale networks of diverse climate proxy (“multiproxy”) data such as tree rings, ice cores, corals, and lake sediments, that captured the unprecedented nature of the warming taking place today. It became a focal point in the debate over human-caused climate change and what to do about it. Yet, the apparent simplicity of the hockey stick curve betrays the dynamism and complexity of the climate history of past centuries and how it can inform our understanding of human-caused climate change and its impacts. In this article, I discuss the lessons we can learn from studying paleoclimate records and climate model simulations of the “Common Era,” the period of the past two millennia during which the “signal” of human-caused warming has risen dramatically from the background of natural variability.


2008 ◽  
Vol 21 (22) ◽  
pp. 6052-6059 ◽  
Author(s):  
B. Timbal ◽  
P. Hope ◽  
S. Charles

Abstract The consistency between rainfall projections obtained from direct climate model output and statistical downscaling is evaluated. Results are averaged across an area large enough to overcome the difference in spatial scale between these two types of projections and thus make the comparison meaningful. Undertaking the comparison using a suite of state-of-the-art coupled climate models for two forcing scenarios presents a unique opportunity to test whether statistical linkages established between large-scale predictors and local rainfall under the current climate remain valid in future climatic conditions. The study focuses on the southwest corner of Western Australia, a region that has experienced recent winter rainfall declines and for which climate models project, with great consistency, further winter rainfall reductions due to global warming. Results show that as a first approximation the magnitude of the modeled rainfall decline in this region is linearly related to the model global warming (a reduction of about 9% per degree), thus linking future rainfall declines to future emission paths. Two statistical downscaling techniques are used to investigate the influence of the choice of technique on projection consistency. In addition, one of the techniques was assessed using different large-scale forcings, to investigate the impact of large-scale predictor selection. Downscaled and direct model projections are consistent across the large number of models and two scenarios considered; that is, there is no tendency for either to be biased, and there is only a small hint that large rainfall declines are reduced in downscaled projections. Of the two techniques, a nonhomogeneous hidden Markov model provides greater consistency with climate models than an analog approach. Differences were due to the choice of the optimal combination of predictors. Thus, statistically downscaled projections require a careful choice of large-scale predictors in order to be consistent with physically based rainfall projections. In particular it was noted that a relative humidity moisture predictor, rather than specific humidity, was needed for downscaled projections to be consistent with direct model output projections.
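The first-order linear relation reported above (roughly a 9 % winter-rainfall reduction per degree of global warming) amounts to a one-line scaling; the function name and default value are ours, taken from the abstract:

```python
def projected_rainfall_change(global_warming_degc, pct_per_degree=-9.0):
    """First-order scaling from the abstract: ~9% winter-rainfall decline
    in southwest Western Australia per degree of global warming."""
    return pct_per_degree * global_warming_degc

# e.g. 2 degrees of global warming -> about an 18% rainfall reduction
change = projected_rainfall_change(2.0)  # -18.0 (% change)
```

Such a scaling is only a first approximation, as the abstract stresses, but it is what links projected rainfall declines directly to emission pathways.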


2017 ◽  
Vol 10 (2) ◽  
pp. 889-901 ◽  
Author(s):  
Daniel J. Lunt ◽  
Matthew Huber ◽  
Eleni Anagnostou ◽  
Michiel L. J. Baatsen ◽  
Rodrigo Caballero ◽  
...  

Abstract. Past warm periods provide an opportunity to evaluate climate models under extreme forcing scenarios, in particular high ( >  800 ppmv) atmospheric CO2 concentrations. Although a post hoc intercomparison of Eocene ( ∼  50  Ma) climate model simulations and geological data has been carried out previously, models of past high-CO2 periods have never been evaluated in a consistent framework. Here, we present an experimental design for climate model simulations of three warm periods within the early Eocene and the latest Paleocene (the EECO, PETM, and pre-PETM). Together with the CMIP6 pre-industrial control and abrupt 4 ×  CO2 simulations, and additional sensitivity studies, these form the first phase of DeepMIP – the Deep-time Model Intercomparison Project, itself a group within the wider Paleoclimate Modelling Intercomparison Project (PMIP). The experimental design specifies and provides guidance on boundary conditions associated with palaeogeography, greenhouse gases, astronomical configuration, solar constant, land surface processes, and aerosols. Initial conditions, simulation length, and output variables are also specified. Finally, we explain how the geological data sets, which will be used to evaluate the simulations, will be developed.


Author(s):  
Raquel Barata ◽  
Raquel Prado ◽  
Bruno Sansó

Abstract. We present a data-driven approach to assess and compare the behavior of large-scale spatial averages of surface temperature in climate model simulations and in observational products. We rely on univariate and multivariate dynamic linear model (DLM) techniques to estimate both long-term and seasonal changes in temperature. The residuals from the DLM analyses capture the internal variability of the climate system and exhibit complex temporal autocorrelation structure. To characterize this internal variability, we explore the structure of these residuals using univariate and multivariate autoregressive (AR) models. As a proof of concept that can easily be extended to other climate models, we apply our approach to one particular climate model (MIROC5). Our results illustrate model versus data differences in both long-term and seasonal changes in temperature. Despite differences in the underlying factors contributing to variability, the different types of simulation yield very similar spectral estimates of internal temperature variability. In general, we find that there is no evidence that the MIROC5 model systematically underestimates the amplitude of observed surface temperature variability on multi-decadal timescales – a finding that has considerable relevance regarding efforts to identify anthropogenic “fingerprints” in observational surface temperature data. Our methodology and results present a novel approach to obtaining data-driven estimates of climate variability for purposes of model evaluation.
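A rough illustration of the residual-analysis step follows. A least-squares trend-plus-harmonic fit stands in for the DLM, the monthly series is synthetic, and the AR(1) coefficient is estimated from the lag-1 autocorrelation; none of this is the study's actual model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic monthly temperature series: linear trend + seasonal cycle
# + AR(1) internal variability with true coefficient 0.6
n = 600
t = np.arange(n)
noise = np.zeros(n)
for i in range(1, n):
    noise[i] = 0.6 * noise[i - 1] + rng.standard_normal()
series = 0.002 * t + 1.5 * np.sin(2 * np.pi * t / 12) + noise

# Crude stand-in for the DLM step: remove a linear trend and a mean
# seasonal cycle by least squares.
X = np.column_stack([np.ones(n), t,
                     np.sin(2 * np.pi * t / 12),
                     np.cos(2 * np.pi * t / 12)])
beta, *_ = np.linalg.lstsq(X, series, rcond=None)
resid = series - X @ beta

# Characterize internal variability with an AR(1) fit: the lag-1
# autocorrelation of the residuals estimates the AR coefficient.
phi = np.corrcoef(resid[:-1], resid[1:])[0, 1]  # should be near 0.6
```

The residuals left after removing long-term and seasonal components are exactly what the abstract treats as internal variability, and their autoregressive structure is what gets compared between model and observations.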


2012 ◽  
Vol 8 (3) ◽  
pp. 1653-1685 ◽  
Author(s):  
P. Brohan ◽  
R. Allan ◽  
E. Freeman ◽  
D. Wheeler ◽  
C. Wilkinson ◽  
...  

Abstract. The current assessment that twentieth-century global temperature change is unusual in the context of the last thousand years relies on estimates of temperature changes from natural proxies (tree-rings, ice-cores etc.) and climate model simulations. Confidence in such estimates is limited by difficulties in calibrating the proxies and systematic differences between proxy reconstructions and model simulations. As the difference between the estimates extends into the relatively recent period of the early nineteenth century it is possible to compare them with a reliable instrumental estimate of the temperature change over that period, provided that enough early thermometer observations, covering a wide enough expanse of the world, can be collected. One organisation which systematically made observations and collected the results was the English East-India Company (EEIC), and their archives have been preserved in the British Library. Inspection of those archives revealed 900 log-books of EEIC ships containing daily instrumental measurements of temperature and pressure, and subjective estimates of wind speed and direction, from voyages across the Atlantic and Indian Oceans between 1789 and 1834. Those records have been extracted and digitised, providing 273 000 new weather records offering an unprecedentedly detailed view of the weather and climate of the late eighteenth and early nineteenth centuries. The new thermometer observations demonstrate that the large-scale temperature response to the Tambora eruption and the 1809 eruption was modest (perhaps 0.5 °C). This provides a powerful out-of-sample validation for the proxy reconstructions – supporting their use for longer-term climate reconstructions. However, some of the climate model simulations in the CMIP5 ensemble show much larger volcanic effects than this – such simulations are unlikely to be accurate in this respect.


2021 ◽  
Vol 17 (4) ◽  
pp. 1665-1684
Author(s):  
Leonore Jungandreas ◽  
Cathy Hohenegger ◽  
Martin Claussen

Abstract. Global climate models experience difficulties in simulating the northward extension of the monsoonal precipitation over north Africa during the mid-Holocene as revealed by proxy data. A common feature of these models is that they usually operate on grids that are too coarse to explicitly resolve convection, but convection is the most essential mechanism leading to precipitation in the West African Monsoon region. Here, we investigate how the representation of tropical deep convection in the ICOsahedral Nonhydrostatic (ICON) climate model affects the meridional distribution of monsoonal precipitation during the mid-Holocene by comparing regional simulations of the summer monsoon season (July to September; JAS) with parameterized and explicitly resolved convection. In the explicitly resolved convection simulation, the more localized nature of precipitation and the absence of permanent light precipitation as compared to the parameterized convection simulation is closer to expectations. However, in the JAS mean, the parameterized convection simulation produces more precipitation and extends further north than the explicitly resolved convection simulation, especially between 12 and 17° N. The higher precipitation rates in the parameterized convection simulation are consistent with a stronger monsoonal circulation over land. Furthermore, the atmosphere in the parameterized convection simulation is less stably stratified and notably moister. The differences in atmospheric water vapor are the result of substantial differences in the probability distribution function of precipitation and its resulting interactions with the land surface. The parameterization of convection produces light and large-scale precipitation, keeping the soils moist and supporting the development of convection. In contrast, less frequent but locally intense precipitation events lead to high amounts of runoff in the explicitly resolved convection simulations. The stronger runoff inhibits the moistening of the soil during the monsoon season and limits the amount of water available to evaporation in the explicitly resolved convection simulation.


2020 ◽  
Vol 13 (5) ◽  
pp. 2355-2377
Author(s):  
Vijay S. Mahadevan ◽  
Iulian Grindeanu ◽  
Robert Jacob ◽  
Jason Sarich

Abstract. One of the fundamental factors contributing to the spatiotemporal inaccuracy in climate modeling is the mapping of solution field data between the different discretizations and numerical grids used in the coupled component models. The typical climate computational workflow involves evaluation and serialization of the remapping weights during the preprocessing step, which is then consumed by the coupled driver infrastructure during simulation to compute field projections. Tools like the Earth System Modeling Framework (ESMF) (Hill et al., 2004) and TempestRemap (Ullrich et al., 2013) offer the capability to generate conservative remapping weights, while the Model Coupling Toolkit (MCT) (Larson et al., 2001) that is utilized in many production climate models exposes functionality to make use of the operators to solve the coupled problem. However, such multistep processes present several hurdles in terms of the scientific workflow and impede research productivity. In order to overcome these limitations, we present a fully integrated infrastructure based on the Mesh Oriented datABase (MOAB) (Tautges et al., 2004; Mahadevan et al., 2015) library, which allows for a complete description of the numerical grids and solution data used in each submodel. Through a scalable advancing-front intersection algorithm, the supermesh of the source and target grids is computed, which is then used to assemble the high-order, conservative, and monotonicity-preserving remapping weights between discretization specifications. The Fortran-compatible interfaces in MOAB are utilized to directly link the submodels in the Energy Exascale Earth System Model (E3SM) to enable online remapping strategies in order to simplify the coupled workflow process. We demonstrate the superior computational efficiency of the remapping algorithms in comparison with other state-of-the-science tools and present strong scaling results on large-scale machines for computing remapping weights between the spectral element atmosphere and finite volume discretizations on the polygonal ocean grids.
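The supermesh-and-weights principle can be sketched in one dimension, where cell overlaps are simple intervals. This is an illustrative analogue of the 2-D conservative remapping the abstract describes, not MOAB's algorithm; the function is hypothetical:

```python
import numpy as np

def conservative_weights_1d(src_edges, tgt_edges):
    """First-order conservative remapping weights between two 1-D grids.
    Each weight is the overlap length of a source and target cell,
    normalized by the target cell width (the 1-D 'supermesh' idea)."""
    ns, nt = len(src_edges) - 1, len(tgt_edges) - 1
    W = np.zeros((nt, ns))
    for j in range(nt):
        for i in range(ns):
            overlap = (min(tgt_edges[j + 1], src_edges[i + 1])
                       - max(tgt_edges[j], src_edges[i]))
            if overlap > 0:
                W[j, i] = overlap / (tgt_edges[j + 1] - tgt_edges[j])
    return W

src = np.linspace(0.0, 1.0, 5)   # 4 source cells
tgt = np.linspace(0.0, 1.0, 3)   # 2 target cells
W = conservative_weights_1d(src, tgt)

# Rows sum to 1, so a constant field is remapped exactly:
field = np.array([3.0, 3.0, 3.0, 3.0])
remapped = W @ field
```

Precomputing such a sparse weight matrix once and reusing it every coupling step is the pattern the preprocessing workflow in the abstract refers to; the integrated MOAB approach instead builds the weights online from the mesh description.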


2021 ◽  
Author(s):  
Antoine Doury ◽  
Samuel Somot ◽  
Sébastien Gadat ◽  
Aurélien Ribes ◽  
Lola Corre

Abstract Providing reliable information on climate change at local scale remains a challenge of first importance for impact studies and policymakers. Here, we propose a novel hybrid downscaling method combining the strengths of both empirical statistical downscaling methods and Regional Climate Models (RCMs). The aim of this tool is to enlarge the size of high-resolution RCM simulation ensembles at low cost. We build a statistical RCM-emulator by estimating the downscaling function included in the RCM. This framework allows us to learn the relationship between large-scale predictors and a local surface variable of interest over the RCM domain in present and future climate. Furthermore, the emulator relies on a neural network architecture, which grants computational efficiency. The RCM-emulator developed in this study is trained to produce daily maps of the near-surface temperature at the RCM resolution (12 km). The emulator demonstrates an excellent ability to reproduce the complex spatial structure and daily variability simulated by the RCM, and in particular the way the RCM locally refines the low-resolution climate patterns. Training in future climate appears to be a key feature of our emulator. Moreover, there is a huge computational benefit in running the emulator rather than the RCM, since training the emulator takes about 2 hours on GPU, and the prediction is nearly instantaneous. However, further work is needed to improve the way the RCM-emulator reproduces some of the temperature extremes, the intensity of climate change, and to extend the proposed methodology to different regions, GCMs, RCMs, and variables of interest.
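The shape of such an emulator can be sketched as a plain feed-forward mapping from coarse predictors to a high-resolution map. The layer sizes, weights, and single hidden layer below are illustrative assumptions, not the architecture used in the study:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy emulator of the downscaling function: large-scale predictors on a
# coarse grid in, a high-resolution temperature map out.
n_coarse, n_fine, n_hidden = 8 * 8, 32 * 32, 128

# Untrained random weights; in practice these would be learned from
# paired coarse-predictor / RCM-output fields.
W1 = rng.standard_normal((n_coarse, n_hidden)) * 0.1
b1 = np.zeros(n_hidden)
W2 = rng.standard_normal((n_hidden, n_fine)) * 0.1
b2 = np.zeros(n_fine)

def emulator(coarse_field):
    """One forward pass: coarse predictors -> high-resolution map."""
    h = np.maximum(coarse_field @ W1 + b1, 0.0)  # ReLU hidden layer
    return h @ W2 + b2

coarse = rng.standard_normal(n_coarse)
fine_map = emulator(coarse).reshape(32, 32)
```

The computational argument in the abstract follows directly from this structure: once trained, a prediction is a couple of matrix multiplications, which is why it is nearly instantaneous compared with running the RCM itself.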

