Past Climates: How We Get Our Data

Author(s): Eelco J. Rohling

On the work floor, research on past climates is known as paleoclimatology, and research on past oceans as paleoceanography. The two are very tightly related, and we shall discuss both combined under the one term of paleoclimatology. Within paleoclimatology, interests are spread over three fundamental fields.

The first field is concerned with dating ancient evidence and is referred to as chronological studies. These studies are essential because all records of past climate change need to be dated as accurately as possible, to ensure that we know when the studied climate changes occurred, how fast they were, and whether changes seen in various components of the climate system happened at the same time or at different times.

The second field concerns observational studies, where the observations can be of different types. Some are direct measurements; for example, sunspot counts or temperature records. Some are historical, written accounts of anecdotal evidence, such as reports on the frequency of frozen rivers, floods, or droughts. Such records are very local and often subjective, so they are usually unsuitable as primary evidence, but they can offer great support and validation to reconstructions from other tools. Besides direct and anecdotal data, we encounter the dominant type of evidence used in the discipline: the so-called proxy data, or proxies. Proxies are indirect measures that approximate (hence the name proxy) changes in important climate-system variables, such as temperature, CO2 concentrations, nutrient concentrations, and so on. This chapter outlines some of the most important proxies.

The third field in paleoclimatology concerns modeling. It employs numerical models that simulate the climate system, along with simpler so-called box-models. Numerical climate models range from Earth System models that are relatively crude and can therefore be set to run simulations of many thousands of years, to very complex and refined coupled models that are computationally very greedy and thus give simulations of great detail but only over short intervals of time. Box-models are much simpler and faster to run, and they are mostly used in modeling the carbon cycle or other geochemical properties.
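To make the notion of a box model concrete, here is a minimal, purely illustrative Python sketch: two carbon reservoirs (atmosphere and surface ocean) exchanging carbon under a constant emissions perturbation. The reservoir sizes, exchange coefficients, and emissions rate are invented round numbers, not values from any particular study.

```python
import numpy as np

# Illustrative two-box carbon model: atmosphere <-> surface ocean.
# Reservoir sizes (GtC) and exchange coefficients are rough, made-up values.
atm, ocean = 600.0, 900.0                # carbon reservoirs (GtC)
k_ao = 0.2                               # atmosphere -> ocean rate (1/yr)
k_oa = 0.2 * 600.0 / 900.0               # ocean -> atmosphere rate, balanced at start
emissions = 10.0                         # GtC/yr perturbation into the atmosphere

dt, years = 0.1, 100
for _ in range(int(years / dt)):
    flux_down = k_ao * atm               # atmosphere -> ocean flux (GtC/yr)
    flux_up = k_oa * ocean               # ocean -> atmosphere flux (GtC/yr)
    atm += (emissions - flux_down + flux_up) * dt
    ocean += (flux_down - flux_up) * dt

print(f"after {years} yr: atmosphere = {atm:.0f} GtC, ocean = {ocean:.0f} GtC")
```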

2020
Author(s): Xiaoqing Liu, Matthew Huber, Gavin L. Foster, R. Mark Leckie, Yi Ge Zhang

When the Earth warms, the high latitudes often warm more than the low latitudes, a phenomenon commonly known as high-latitude amplification. Although high-latitude amplification has been observed in both climate data and model simulations, its trajectory in our changing future climate is uncertain. Pacific-wide reconstructions of sea surface temperature variability from past climates are important for establishing the historical record of high-latitude amplification. Multiple extratropical temperature records have been established for the past 10 million years (Myr). However, it is debated whether the warmest end member, the Western Pacific Warm Pool (WPWP), warmed during the late Miocene (~12 to 5 million years ago, Ma) and Pliocene (5 to 3 Ma). Here we present new multi-proxy, multi-site paleotemperature records from the WPWP. These results, based on lipid biomarkers and foraminiferal Mg/Ca, unequivocally show warmer temperatures in the past and a secular cooling over the last 10 Myr. We combine these new data with previously established paleotemperature records to reveal a persistent pattern of change in the Pacific, described by a high-latitude amplification factor of ~1.7 that does not seem to be affected by the major climate changes of the past 10 Myr. The evolution of spatial temperature gradients in the Pacific is also evident in climate model output and in instrumental observations covering the last 160 years, and thus appears to be a robust and predictable feature of the climate system. These results therefore confirm that climate models can capture the major features of past climate change, providing increased confidence in their predictions of future patterns, which are likely to resemble those reconstructed here.
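As an aside on method: an amplification factor of this kind is, in essence, a slope relating high-latitude temperature change to tropical temperature change. A minimal sketch, using synthetic series in place of the actual proxy reconstructions, of how such a factor might be estimated:

```python
import numpy as np

# Synthetic example: estimate a high-latitude amplification factor as the
# regression slope of extratropical SST anomalies on tropical (WPWP-like)
# SST anomalies. Real studies use proxy reconstructions; values are made up.
rng = np.random.default_rng(0)
trend = np.linspace(0.0, -3.0, 200)                       # secular cooling (degC)
tropical = trend + rng.normal(0, 0.2, 200)
high_lat = 1.7 * trend + rng.normal(0, 0.4, 200)          # amplified by construction

slope = np.polyfit(tropical, high_lat, 1)[0]
print(f"estimated amplification factor: {slope:.2f}")     # ~1.7
```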


1994, Vol. 34 (2), pp. 104
Author(s): C.D. Mitchell

New observations of the chemical composition of the atmosphere are reshaping scientific understanding of the global sources and sinks of the greenhouse gases. Current trends in the atmospheric concentrations of some of these gases are reviewed, with reference to new work emerging from Antarctic ice cores.

Accompanying an understanding of the composition of the atmosphere is the need to understand the processes that drive the global climate system, including interactions between the atmosphere and oceans. Studies of climatic processes therefore form the scientific underpinning for the development of numerical models that describe the response of the global climate system to observed changes in the composition of the atmosphere.

Success or failure in efforts to improve model simulations can be assessed using a variety of objective statistical tests. Examples of such tests show demonstrable progress in the ability of global climate models to simulate the present-day climate realistically.

Since confidence in the regional details of climate predictions from climate models is low, considerable effort is being devoted to developing models capable of providing improved regional estimates of climate change; in practice, a variety of models, not limited to global-scale models, are used in this work. In the meantime, several approaches to assessing the potential impacts of climate change are possible. These are discussed with special reference to tropical cyclones and east coast lows.

Throughout this review, emphasis is placed on recent Australian contributions to the field, most notably work conducted within CSIRO.
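One simple family of such objective tests compares a simulated climatology against an observed one, field by field. The sketch below, on synthetic data, computes two common scores, root-mean-square error and pattern correlation; it is illustrative only and not tied to any specific test used in the work reviewed:

```python
import numpy as np

# Toy skill assessment: compare a "simulated" 2-D climatology with
# "observations" via RMSE and pattern correlation (synthetic fields).
rng = np.random.default_rng(1)
obs = rng.normal(15.0, 5.0, (45, 72))            # observed field (e.g. degC)
sim = obs + rng.normal(0.0, 1.5, obs.shape)      # model field with random errors

rmse = np.sqrt(np.mean((sim - obs) ** 2))
pattern_corr = np.corrcoef(sim.ravel(), obs.ravel())[0, 1]
print(f"RMSE = {rmse:.2f} degC, pattern correlation = {pattern_corr:.3f}")
```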


Author(s): Luke Skinner

From a socio-economic perspective, the ‘sharp end’ of climate research is very much about looking forward in time. As far as possible, we need to know what to expect and approximately when to expect it. However, it is argued here that our approach to climate change (including its scientific basis and its policy implications) is firmly linked to our understanding of the past. This is mainly due to the role played by palaeoclimate reconstructions in shaping our expectations of the climate system, in particular via their ability to test the accuracy of our climate models. Importantly, this includes the intuitive models that each of us carries around in our mind, as well as the more complex numerical models hiding inside supercomputers. It is through such models that palaeoclimate insights may affect the scientific and political judgements that we must make in the face of persistent and ultimately irreducible predictive uncertainty. Already we can demonstrate a great deal of confidence in our current understanding of the global climate system based specifically on insights from the geological record. If further advances are to be made effectively, climate models should take advantage of both past and present constraints on their behaviour, and should be given added credence to the extent that they are compatible with an increasingly rich tapestry of past climatic phenomena. Furthermore, palaeoclimate data should be accompanied by clearly defined uncertainties, and organized in arrays that are capable of speaking directly to numerical models, and their limitations in particular.


2021
Author(s): Christian Zeman, Christoph Schär

Since their first operational application in the 1950s, atmospheric numerical models have become essential tools in weather and climate prediction. As such, they are constantly subject to change, thanks to advances in computer systems, numerical methods, and the ever-increasing knowledge about Earth's atmosphere. Many of the changes in today's models relate to seemingly innocuous modifications associated with minor code rearrangements, changes in hardware infrastructure, or software upgrades. Such changes are meant to preserve the model formulation, yet verifying them is challenged by the chaotic nature of our atmosphere: any small change, even a rounding error, can have a big impact on individual simulations. Overall, this represents a serious challenge to a consistent model development and maintenance framework.

Here we propose a new methodology for quantifying and verifying the impacts of minor changes to an atmospheric model, or to its underlying hardware/software system, by using ensemble simulations in combination with a statistical hypothesis test. The methodology can assess the effects of model changes on almost any output variable over time, and can also be used with different hypothesis tests.

We present first applications of the methodology with the regional weather and climate model COSMO. The changes considered include a major system upgrade of the supercomputer used, the change from double- to single-precision floating-point representation, changes in the update frequency of the lateral boundary conditions, and tiny changes to selected model parameters. While providing very robust results, the methodology also shows a large sensitivity to more significant model changes, making it a good candidate for an automated tool to guarantee model consistency in the development cycle.
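The core of such a procedure can be illustrated with two ensembles and a two-sample test. The sketch below uses a Kolmogorov-Smirnov test on one output variable with synthetic ensemble values; the authors' actual implementation for COSMO may use different tests and variables:

```python
import numpy as np
from scipy.stats import ks_2samp

# Toy verification: compare a reference ensemble against an ensemble run
# after a "minor" model/system change, using a two-sample Kolmogorov-Smirnov
# test on one output variable at one time step (synthetic values here).
rng = np.random.default_rng(2)
n_members = 50
reference = rng.normal(285.0, 1.0, n_members)   # e.g. domain-mean temperature (K)
modified = rng.normal(285.0, 1.0, n_members)    # statistically equivalent change

stat, p_value = ks_2samp(reference, modified)
alpha = 0.05
verdict = "consistent" if p_value > alpha else "significantly different"
print(f"KS statistic = {stat:.3f}, p = {p_value:.3f} -> ensembles {verdict}")
```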


2021
Author(s): Oliver Krueger, Frauke Feser, Christopher Kadow, Ralf Weisse

Global atmospheric reanalyses are commonly applied for the validation of climate models, for diagnostic studies, and for driving higher-resolution numerical models, with the emphasis on assessing climate variability and long-term trends. Over recent years, longer reanalyses spanning more than a hundred years have become available. In this study, the variability and long-term trends of storm activity over the northeast Atlantic are assessed in modern centennial reanalysis datasets, namely ERA-20CM, ERA-20C, CERA-20C, and the 20CR reanalysis suite, with 20CRv3 being the most recent one. All of these reanalyses, except for ERA-20CM, assimilate surface pressure observations, while ERA-20C and CERA-20C additionally assimilate surface winds. For the assessment, the well-established storm index of high annual percentiles of geostrophic wind speeds, derived from pressure observations at sea level over a relatively densely monitored marine area, is used.

The results indicate that the examined centennial reanalyses are not able to represent long-term trends of storm activity over the northeast Atlantic, particularly in the earlier years of the period examined, when compared with the geostrophic wind index based on pressure observations. Moreover, the reanalyses show inconsistent long-term behaviour when compared with each other. Only in the latter half of the 20th century does the variability of reanalysed and observed storminess time series start to agree. Additionally, 20CRv3, the most recent centennial reanalysis examined, shows markedly improved results with increased uncertainty, although multidecadal storminess variability does not match observed values before about 1920.

The behaviour shown by the centennial reanalyses is likely caused by the increasing number of assimilated observations, changes in the observational databases used, and the different underlying numerical model systems. Furthermore, the results derived from the ERA-20CM reanalysis, which does not assimilate any pressure or wind observations, suggest that the variability and uncertainty of storminess over the northeast Atlantic is high, making it difficult to determine storm activity when numerical models are not bound by observations. The results of this study imply and reconfirm previous findings that the assessment of long-term storminess trends and variability in centennial reanalyses remains a rather delicate matter, at least for the northeast Atlantic region.
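For readers unfamiliar with the storm index used here: it rests on the fact that the geostrophic wind can be estimated from the sea-level pressure gradient across a triangle of stations. The sketch below, with an invented station layout and synthetic pressures, fits a plane to three station pressures, converts the gradient to a geostrophic wind speed, and extracts a high annual percentile:

```python
import numpy as np

# Toy "geostrophic wind from a pressure triangle" storm index: fit a plane
# to sea-level pressures at three stations, take the horizontal pressure
# gradient, convert to geostrophic wind speed, then look at high annual
# percentiles. Station layout and pressures are invented.
RHO, F = 1.25, 1.2e-4                   # air density (kg/m3), Coriolis param (1/s)
xy = np.array([[0.0, 0.0], [300e3, 0.0], [150e3, 260e3]])  # station coords (m)

def geostrophic_speed(p):               # p: pressures at the 3 stations (Pa)
    A = np.column_stack([np.ones(3), xy[:, 0], xy[:, 1]])
    _, dpdx, dpdy = np.linalg.solve(A, p)           # plane fit: p = a + bx + cy
    ug, vg = -dpdy / (RHO * F), dpdx / (RHO * F)    # geostrophic balance
    return np.hypot(ug, vg)

rng = np.random.default_rng(3)
pressures = 101325 + rng.normal(0, 800, (365, 3))   # one synthetic year, daily
speeds = np.array([geostrophic_speed(p) for p in pressures])
print(f"annual 95th percentile geostrophic wind: {np.percentile(speeds, 95):.1f} m/s")
```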


1998, Vol. 27, pp. 427-432
Author(s): Anthony P. Worby, Xingren Wu

The importance of monitoring sea ice for studies of global climate has been well noted for several decades. Observations have shown that sea ice exhibits large seasonal variability in extent, concentration and thickness. These changes have a significant impact on climate, and the potential nature of many of these connections has been revealed in studies with numerical models. An accurate representation of the sea-ice distribution (including ice extent, concentration and thickness) in climate models is therefore important for modelling global climate change. This work presents an overview of the observed sea-ice characteristics in the East Antarctic pack ice (60-150° E) and outlines possible improvements to the simulation of sea ice over this region by modifying the ice-thickness parameterisation in a coupled sea-ice-atmosphere model, using observational data of ice thickness and concentration. Sensitivity studies indicate that the simulation of East Antarctic sea ice can be improved by modifying both the “lead parameterisation” and “rafting scheme” to be ice-thickness dependent. The modelled results are currently out of phase with the observed data, and the addition of a multilevel ice-thickness distribution would improve the simulation significantly.


2016
Author(s): Andrew Dawson, Peter Düben

Abstract. This paper describes the rpe library, which can emulate the use of arbitrary reduced floating-point precision within large numerical models written in Fortran. The rpe software allows model developers to test how reduced floating-point precision affects the results of their simulations without having to make extensive code changes or port the model onto specialised hardware. The software can be used to identify parts of a program that are problematic for numerical precision and to guide changes that allow a stronger reduction in precision. The development of rpe was motivated by the strong demand for more computing power. If numerical precision can be reduced for a given application while still achieving results of acceptable quality, computational cost can be lowered, since reduced precision may allow an increase in performance or a reduction in power consumption. For simulations with weather and climate models, savings due to reduced precision could be reinvested to allow model simulations at higher spatial resolution or complexity, or to increase the number of ensemble members and so improve predictions. rpe was developed with a particular focus on the weather and climate modelling community, but the software could be used with numerical simulations from other domains.
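rpe itself is a Fortran library, and the sketch below does not use its API; it only illustrates the underlying idea in Python by truncating the significand of a double to a chosen number of bits and watching how a simple computation degrades:

```python
import numpy as np

# Concept behind reduced-precision emulation: keep only `sbits` explicit
# significand bits of each double, zeroing the rest, so arithmetic behaves
# as if carried out at lower precision. (Illustrative; not the rpe API.)
def reduce_precision(x, sbits):
    bits = np.float64(x).view(np.uint64)
    mask = np.uint64(~((1 << (52 - sbits)) - 1) & 0xFFFFFFFFFFFFFFFF)
    return (bits & mask).view(np.float64)

total_full, total_low = 0.0, 0.0
for i in range(1, 100001):
    term = 1.0 / i
    total_full += term
    total_low = float(reduce_precision(total_low + term, 10))  # ~half precision
print(f"full double: {total_full:.6f}, 10-bit significand: {total_low:.6f}")
```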


2021
Author(s): Michael Hollaway, Peter Henrys, Rebecca Killick, Amber Leeson, John Watkins

Numerical models are essential tools for understanding the complex and dynamic nature of the natural environment and how it will respond to a changing climate. With ever-increasing volumes of environmental data and the increased availability of high-powered computing, these models are becoming more complex and detailed. The ability of these models to represent reality is therefore critical to their use and future development. This has presented a number of challenges, including providing research platforms for collaborating scientists to explore big data, develop and share new methods, and communicate their results to stakeholders and decision makers. This work presents an example of a cloud-based research platform known as DataLabs and shows how it can be used to simplify access to advanced statistical methods (in this case changepoint analysis) for environmental science applications.

A combination of changepoint analysis and fuzzy logic is used to assess the ability of numerical models to capture local-scale temporal events seen in observations. The fuzzy-union-based metric factors in the uncertainty of the changepoint location to calculate an individual similarity score between the numerical model and reality for each changepoint in the observed record. The application of the method is demonstrated through a case study on a high-resolution model dataset, which picked up observed changepoints in temperature records over Greenland with varying degrees of success. The case study is presented using the DataLabs framework, demonstrating how the method can be shared with other users of the platform and how the results can be visualised and communicated to users with different areas of expertise.
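The changepoint side of such an analysis can be sketched with an off-the-shelf detector; the similarity score below is a simplified, triangular-membership stand-in for the fuzzy-union metric described, not the authors' implementation:

```python
import numpy as np
import ruptures as rpt

# Detect changepoints in "observed" and "modelled" series, then score how
# well modelled changepoints line up with observed ones using a triangular
# membership function (simplified stand-in for the fuzzy-union metric).
rng = np.random.default_rng(4)
obs = np.concatenate([rng.normal(0, 1, 100), rng.normal(3, 1, 100)])
mod = np.concatenate([rng.normal(0, 1, 105), rng.normal(3, 1, 95)])

def detect(x):
    return rpt.Pelt(model="l2").fit(x).predict(pen=10)[:-1]  # drop series-end index

obs_cps, mod_cps = detect(obs), detect(mod)

def similarity(cp_obs, cp_mod, halfwidth=20):
    # 1 at exact agreement, decaying linearly to 0 at +/- halfwidth samples
    return max(0.0, 1.0 - abs(cp_obs - cp_mod) / halfwidth)

scores = [max((similarity(o, m) for m in mod_cps), default=0.0) for o in obs_cps]
print(f"observed cps {obs_cps}, modelled cps {mod_cps}, scores {scores}")
```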


Water, 2020, Vol. 12 (11), pp. 3221
Author(s): Lucie Dal Soglio, Charles Danquigny, Naomi Mazzilli, Christophe Emblanch, Gérard Massonnat

The main outlets of karst systems are springs, the hydrographs of which are largely shaped by flow processes in the unsaturated zone. These processes differ between the epikarst and the transmission zone on the one hand, and between the matrix and the conduits on the other. However, numerical models rarely consider the unsaturated zone, let alone distinguish its subsystems. Likewise, few models represent conduits as a second medium, and even fewer do so explicitly with discrete features. This paper focuses on the value of hybrid models that take into account both unsaturated subsystems and discrete conduits to simulate the reservoir-scale response, especially the outlet hydrograph. In a synthetic karst aquifer model, we performed simulations for several parameter sets and showed the ability of hybrid models to simulate the overall response of complex karst aquifers. Varying the parameters affects the pathway distribution and transit times, which results in a large variety of hydrograph shapes. We propose a classification of hydrographs and selected characteristics, which proves useful for analysing the results. The relationships between model parameters and hydrograph characteristics are not all linear; some have local extrema or threshold limits. The numerous simulations help to assess the sensitivity of hydrograph characteristics to the different parameters and, conversely, to identify the key parameters that can be manipulated to enhance the modelling of field cases.
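The reservoir-scale response that such models aim to reproduce can be caricatured with two parallel linear reservoirs, a fast "conduit" store and a slow "matrix" store feeding the spring. This toy sketch, with invented recession coefficients and recharge, is far simpler than the hybrid discrete-continuum models discussed, but it shows how routing and recession parameters reshape the hydrograph:

```python
import numpy as np

# Toy karst spring hydrograph: a recharge pulse split between a fast
# "conduit" reservoir and a slow "matrix" reservoir, each draining linearly.
# All coefficients are invented for illustration.
k_conduit, k_matrix = 0.5, 0.02   # recession coefficients (1/day)
split = 0.6                       # fraction of recharge routed to the conduit

recharge = np.zeros(120)
recharge[5:8] = 30.0              # a 3-day recharge event (mm/day)

s_c = s_m = 0.0                   # reservoir storages (mm)
discharge = []
for r in recharge:                # daily explicit time stepping
    s_c += split * r
    s_m += (1 - split) * r
    q = k_conduit * s_c + k_matrix * s_m   # spring discharge (mm/day)
    s_c -= k_conduit * s_c
    s_m -= k_matrix * s_m
    discharge.append(q)

peak = max(discharge)
print(f"peak spring discharge: {peak:.1f} mm/day on day {discharge.index(peak)}")
```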


Author(s): Gabriele Vissio, Valerio Lucarini

Abstract. The understanding of the fundamental properties of the climate system has long benefitted from the use of simple numerical models able to parsimoniously represent the essential ingredients of its processes. Here, we introduce a new model for the atmosphere that is constructed by supplementing the now-classic Lorenz '96 one-dimensional lattice model with temperature-like variables. The model features an energy cycle that allows energy to be converted between the kinetic form and the potential form, and that allows a notion of efficiency to be introduced. The model's evolution is controlled by two contributions, a quasi-symplectic one and a gradient one, which resemble (yet do not conform to) a metriplectic structure. After investigating the linear stability of the symmetric fixed point, we perform a systematic parametric investigation that allows us to define regions in parameter space where, at steady state, stationary, quasi-periodic, and chaotic motions are realised, and we study how the terms responsible for defining the energy budget of the system depend on the external forcing that injects energy into the kinetic and potential energy reservoirs. Finally, we find preliminary evidence that the model features extensive chaos. We also introduce a more complex version of the model that can accommodate multiscale dynamics and that features an energy cycle that more closely mimics that of the Earth's atmosphere.
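For reference, the classic Lorenz '96 lattice that the new model extends is only a few lines of code. The sketch below integrates the standard model; the temperature-like variables and energy cycle introduced by the authors are not included:

```python
import numpy as np

# Classic Lorenz '96 one-dimensional lattice, the base model extended here:
#   dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F
def l96_tendency(x, forcing=8.0):
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

n, dt = 36, 0.01
rng = np.random.default_rng(5)
x = rng.normal(0, 1, n)                   # random initial state on the lattice
for _ in range(5000):                     # 4th-order Runge-Kutta integration
    k1 = l96_tendency(x)
    k2 = l96_tendency(x + 0.5 * dt * k1)
    k3 = l96_tendency(x + 0.5 * dt * k2)
    k4 = l96_tendency(x + dt * k3)
    x += dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

print(f"mean = {x.mean():.2f}, variance = {x.var():.2f}")  # chaotic regime at F = 8
```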

