Revising Return Periods for Record Events in a Climate Event Attribution Context

2018 ◽  
Vol 31 (9) ◽  
pp. 3411-3422 ◽  
Author(s):  
Philippe Naveau ◽  
Aurélien Ribes ◽  
Francis Zwiers ◽  
Alexis Hannart ◽  
Alexandre Tuel ◽  
...  

Both climate and statistical models play an essential role in the process of demonstrating that the distribution of some atmospheric variable has changed over time and in establishing the most likely causes for the detected change. One statistical difficulty in the research field of detection and attribution resides in defining events that can be easily compared and accurately inferred from reasonable sample sizes. As many impact studies focus on extreme events, the inference of small probabilities and the computation of their associated uncertainties quickly become challenging. In the particular context of event attribution, the authors address the question of how to compare records between the counterfactual “world as it might have been” without anthropogenic forcings and the factual “world that is.” Records are often the most important events in terms of impact and receive much media attention. The authors show how to efficiently estimate the ratio of two small probabilities of records. The inferential gain is particularly substantial when a simple hypothesis-testing procedure is implemented. The theoretical justification of the proposed scheme can be found in extreme value theory. To illustrate this study’s approach, classical indicators in event attribution studies, like the risk ratio or the fraction of attributable risk, are modified and tailored to handle records. The authors illustrate the advantages of their method through theoretical results, simulation studies, temperature records in Paris, and outputs from a numerical climate model.
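As a concrete illustration of the quantities involved, the following sketch (an illustrative toy, not the authors' estimator) compares the probability of breaking an n-year record in a counterfactual world and in a factual world; the Gaussian distributions and the one-sigma shift are assumptions made purely for demonstration.

```python
# A minimal sketch of record-probability comparison between a counterfactual
# and a factual world. The Gaussian toy distributions are assumptions.
import numpy as np

rng = np.random.default_rng(42)
n = 30  # years of past observations defining the record

# Counterfactual world: stationary climate; factual world: shifted by 1 sigma.
counterfactual = rng.normal(loc=0.0, scale=1.0, size=(100_000, n + 1))
factual_new = rng.normal(loc=1.0, scale=1.0, size=100_000)

past_record = counterfactual[:, :n].max(axis=1)

# p0: probability a new counterfactual draw breaks the counterfactual record.
# For i.i.d. draws this is exactly 1/(n+1), a useful sanity check.
p0 = np.mean(counterfactual[:, n] > past_record)

# p1: probability a new factual draw breaks the counterfactual record.
p1 = np.mean(factual_new > past_record)

risk_ratio = p1 / p0   # how much more likely record-breaking has become
far = 1.0 - p0 / p1    # fraction of attributable risk, tailored to records

print(f"p0 = {p0:.4f} (theory: {1 / (n + 1):.4f})")
print(f"p1 = {p1:.4f}, risk ratio = {risk_ratio:.1f}, FAR = {far:.2f}")
```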

Author(s):  
Philippe Naveau ◽  
Alexis Hannart ◽  
Aurélien Ribes

Changes in the Earth's climate have been increasingly observed. Assessing the likelihood that each of these changes has been caused by human influence is important for decision making on mitigation and adaptation policy. Because of their large societal and economic impacts, extreme events have garnered much media attention—have they become more frequent and more intense, and if so, why? To answer such questions, extreme event attribution (EEA) tries to estimate extreme event likelihoods under different scenarios. Over the past decade, statistical methods and experimental designs based on numerical models have been developed, tested, and applied. In this article, we review the basic probability schemes, inference techniques, and statistical hypotheses used in EEA. To implement EEA analysis, the climate community relies on the use of large ensembles of climate model runs. We discuss, from a statistical perspective, how extreme value theory could help to deal with the different modeling uncertainties. In terms of interpretation, we stress that causal counterfactual theory offers an elegant framework that clarifies the design of event attributions. Finally, we pinpoint some remaining statistical challenges, including the choice of the appropriate spatio-temporal scales to enhance attribution power, the modeling of concomitant extreme events in a multivariate context, and the coupling of multi-ensemble and observational uncertainties.
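One building block of the causal counterfactual framework discussed here is the probability of necessary causation, PN = max(0, 1 − p0/p1), where p0 and p1 are the event probabilities without and with anthropogenic forcing. The sketch below is illustrative only; the exceedance counts and the two hypothetical ensembles are assumptions, not results from the paper.

```python
# A hedged sketch: compute the probability of necessary causation (PN) from
# exceedance counts in two hypothetical model ensembles, with a bootstrap
# interval to propagate sampling uncertainty. All numbers are assumptions.
import numpy as np

def prob_necessary_causation(p0: float, p1: float) -> float:
    """PN = max(0, 1 - p0/p1): chance the event would not have occurred
    without the forcing, given that it did occur with it."""
    return max(0.0, 1.0 - p0 / p1)

rng = np.random.default_rng(0)
k0, n0 = 3, 1000    # counterfactual ensemble: 3 exceedances in 1000 runs
k1, n1 = 30, 1000   # factual ensemble: 30 exceedances in 1000 runs

pn_hat = prob_necessary_causation(k0 / n0, k1 / n1)

# Bootstrap the counts to attach an uncertainty interval to PN.
boot = [prob_necessary_causation(rng.binomial(n0, k0 / n0) / n0,
                                 max(rng.binomial(n1, k1 / n1), 1) / n1)
        for _ in range(10_000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"PN = {pn_hat:.2f}, 95% bootstrap interval [{lo:.2f}, {hi:.2f}]")
```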


2013 ◽  
Vol 26 (20) ◽  
pp. 7929-7937 ◽  
Author(s):  
Elsa Bernard ◽  
Philippe Naveau ◽  
Mathieu Vrac ◽  
Olivier Mestre

Abstract One of the main objectives of statistical climatology is to extract relevant information hidden in complex spatial–temporal climatological datasets. To identify spatial patterns, most well-known statistical techniques are based on the concept of intra- and intercluster variances (like the k-means algorithm or EOFs). As the analysis of quantitative extremes like heavy rainfall has become increasingly prevalent among climatologists and hydrologists over recent decades, finding spatial patterns with methods based on deviations from the mean (i.e., variances) may not be the most appropriate strategy for studying such extremes. For practitioners, simple and fast clustering tools tailored for extremes have been lacking. A possible avenue for bridging this methodological gap resides in taking advantage of multivariate extreme value theory, a well-developed research field in probability, and adapting it to the context of spatial clustering. In this paper, a novel algorithm based on this idea is proposed and studied. The approach is compared and discussed with respect to the classical k-means algorithm through the analysis of weekly maxima of hourly precipitation recorded in France (fall season, 92 stations, 1993–2011).
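To make the idea concrete, here is a minimal sketch of clustering stations by extremal dependence rather than by variance. It uses the F-madogram, a standard pairwise dependence measure for block maxima from multivariate extreme value theory, followed by generic hierarchical clustering; it is not the paper's exact algorithm, and the synthetic two-group data are an assumption.

```python
# A minimal sketch (not the paper's exact algorithm): cluster stations by
# extremal dependence using pairwise F-madogram distances between weekly
# maxima, then hierarchical clustering. The synthetic data are assumptions.
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from scipy.spatial.distance import squareform
from scipy.stats import rankdata

rng = np.random.default_rng(1)
n_weeks, n_stations = 400, 10

# Synthetic weekly maxima: two spatial groups, each sharing a common driver.
common_a = rng.gumbel(size=(n_weeks, 1))
common_b = rng.gumbel(size=(n_weeks, 1))
maxima = np.hstack([
    common_a + 0.5 * rng.gumbel(size=(n_weeks, 5)),
    common_b + 0.5 * rng.gumbel(size=(n_weeks, 5)),
])

# F-madogram distance: d(i, j) = E|F_i(X_i) - F_j(X_j)| / 2, estimated with
# empirical CDFs (ranks). Near 0 = strong extremal dependence; 1/6 under
# independence.
U = rankdata(maxima, axis=0) / (n_weeks + 1)
dist = np.zeros((n_stations, n_stations))
for i in range(n_stations):
    for j in range(i + 1, n_stations):
        dist[i, j] = dist[j, i] = np.mean(np.abs(U[:, i] - U[:, j])) / 2

clusters = fcluster(linkage(squareform(dist), method="average"),
                    t=2, criterion="maxclust")
print("cluster labels:", clusters)
```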


2005 ◽  
Vol 18 (13) ◽  
pp. 2429-2440 ◽  
Author(s):  
Terry C. K. Lee ◽  
Francis W. Zwiers ◽  
Gabriele C. Hegerl ◽  
Xuebin Zhang ◽  
Min Tsao

Abstract A Bayesian analysis of the evidence for human-induced climate change in global surface temperature observations is described. The analysis uses the standard optimal detection approach and explicitly incorporates prior knowledge about uncertainty and the influence of humans on the climate. This knowledge is expressed through prior distributions that are noncommittal on the climate change question. Evidence for detection and attribution is assessed probabilistically using clearly defined criteria. Detection requires that there is high likelihood that a given climate-model-simulated response to historical changes in greenhouse gas concentration and sulphate aerosol loading has been identified in observations. Attribution entails a more complex process that involves both the elimination of other plausible explanations of change and an assessment of the likelihood that the climate-model-simulated response to historical forcing changes is correct. The Bayesian formalism used in this study deals with this latter aspect of attribution in a more satisfactory way than the standard attribution consistency test. Very strong evidence is found to support the detection of an anthropogenic influence on the climate of the twentieth century. However, the evidence from the Bayesian attribution assessment is not as strong, possibly due to the limited length of the available observational record or sources of external forcing on the climate system that have not been accounted for in this study. It is estimated that strong evidence from a Bayesian attribution assessment using a relatively stringent attribution criterion may be available by 2020.
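A toy, fully conjugate version of the underlying idea: regress observations on a model-simulated response pattern with a noncommittal prior on the scaling factor beta, then read detection (beta > 0) and amplitude consistency (beta near 1) off the posterior. The signal, noise level, prior width, and thresholds below are assumptions, not the study's settings.

```python
# A conjugate toy sketch of Bayesian optimal detection: estimate the scaling
# factor beta in y = beta * x + noise, with a diffuse prior centered at 0.
# All numerical settings here are illustrative assumptions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(7)
t = np.arange(100)
x = 0.01 * t                                     # model-simulated response
y = 0.95 * x + rng.normal(0, 0.2, size=t.size)   # synthetic "observations"

sigma2 = 0.2 ** 2   # assumed internal-variability variance
tau2 = 10.0 ** 2    # noncommittal prior: beta ~ N(0, tau2)

# Conjugate posterior for beta given Gaussian noise and a Gaussian prior.
post_var = 1.0 / (x @ x / sigma2 + 1.0 / tau2)
post_mean = post_var * (x @ y / sigma2)
post = norm(post_mean, np.sqrt(post_var))

p_detect = 1.0 - post.cdf(0.0)            # P(beta > 0): signal identified
p_attrib = post.cdf(1.5) - post.cdf(0.5)  # P(beta near 1): amplitude right

print(f"posterior beta: {post_mean:.2f} +/- {np.sqrt(post_var):.2f}")
print(f"P(detection) = {p_detect:.3f}, P(amplitude ok) = {p_attrib:.3f}")
```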


2021 ◽  
Vol 66 (3) ◽  
pp. 7-21
Author(s):  
Mirosław Szreder

An increasing number of non-random errors is observed in contemporary sample surveys – in particular, errors resulting from nonresponse or faulty measurements (imprecise statistical observation). Until recently, the consequences of these kinds of errors have not been widely discussed in the context of hypothesis testing. Researchers focused almost entirely on sampling errors (random errors), whose magnitude decreases as the size of the random sample grows. In consequence, researchers who often use samples of very large sizes tend to overlook the influence that random and non-random errors have on the results of their studies. The aim of this paper is to present how non-random errors can affect the decision-making process based on the classical hypothesis testing procedure. Particular attention is devoted to cases in which researchers manage samples of large sizes. The study supports the thesis that large sample sizes make statistical tests more sensitive to non-random errors. Systematic errors, as a special case of non-random errors, increase the probability of making the wrong decision to reject a true hypothesis as the sample size grows. Supplementing hypothesis testing with the analysis of confidence intervals may in this context provide substantive support for the researcher in drawing accurate inferences.
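The central mechanism can be simulated in a few lines: with a fixed systematic measurement bias, the rejection rate of a true null hypothesis climbs toward 1 as the sample grows, even though the null is correct. The bias magnitude and test setup below are illustrative assumptions; note that a confidence interval computed from the same samples would center on the biased mean, making the offset visible to the researcher.

```python
# A short simulation of the paper's central point: a fixed systematic error
# inflates the Type I error rate of a z-test as the sample size grows.
# The bias magnitude and test settings are illustrative assumptions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
mu0, sigma, bias = 100.0, 15.0, 1.0   # true mean, spread, systematic error
alpha = 0.05
z_crit = norm.ppf(1 - alpha / 2)

for n in (50, 500, 5000, 50000):
    rejections = 0
    for _ in range(2000):
        # H0 is true (the mean is mu0), but every measurement is biased.
        sample = rng.normal(mu0 + bias, sigma, size=n)
        z = (sample.mean() - mu0) / (sigma / np.sqrt(n))
        rejections += abs(z) > z_crit
    print(f"n = {n:>6}: empirical rejection rate = {rejections / 2000:.2f}")
```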


2015 ◽  
Vol 8 (7) ◽  
pp. 1943-1954 ◽  
Author(s):  
D. R. Feldman ◽  
W. D. Collins ◽  
J. L. Paige

Abstract. Top-of-atmosphere (TOA) spectrally resolved shortwave reflectances and longwave radiances describe the response of the Earth's surface and atmosphere to feedback processes and human-induced forcings. In order to evaluate proposed long-duration spectral measurements, we have projected 21st Century changes from the Community Climate System Model (CCSM3.0) conducted for the Intergovernmental Panel on Climate Change (IPCC) A2 Emissions Scenario onto shortwave reflectance spectra from 300 to 2500 nm and longwave radiance spectra from 2000 to 200 cm−1 at 8 nm and 1 cm−1 resolution, respectively. The radiative transfer calculations have been rigorously validated against published standards and produce complementary signals describing the climate system forcings and feedbacks. Additional demonstration experiments were performed with the Model for Interdisciplinary Research on Climate (MIROC5) and Hadley Centre Global Environment Model version 2 Earth System (HadGEM2-ES) models for the Representative Concentration Pathway 8.5 (RCP8.5) scenario. The calculations contain readily distinguishable signatures of low clouds, snow/ice, aerosols, temperature gradients, and water vapour distributions. The goal of this effort is both to understand how climate change alters reflected solar and emitted infrared spectra of the Earth and to determine whether spectral measurements enhance our detection and attribution of climate change. This effort also presents a path forward to understand the characteristics of hyperspectral observational records needed to confront models and inline instrument simulation. Such simulation will enable a diverse set of comparisons between model results from coupled model intercomparisons and existing and proposed satellite instrument measurement systems.


2008 ◽  
Vol 5 (3) ◽  
pp. 847-864 ◽  
Author(s):  
P. W. Boyd ◽  
S. C. Doney ◽  
R. Strzepek ◽  
J. Dusenberry ◽  
K. Lindsay ◽  
...  

Abstract. Concurrent changes in ocean chemical and physical properties influence phytoplankton dynamics via alterations in carbonate chemistry, nutrient and trace metal inventories and the upper ocean light environment. Using a fully coupled, global carbon-climate model (Climate System Model 1.4-carbon), we quantify anthropogenic climate change relative to the background natural interannual variability for the Southern Ocean over the period 2000 to 2100. Model results are interpreted using our understanding of the environmental control of phytoplankton growth rates – leading to two major findings. Firstly, comparison with results from phytoplankton perturbation experiments, in which environmental properties have been altered for key species (e.g., bloom formers), indicates that the predicted rates of change in oceanic properties over the next few decades are too subtle to be represented experimentally at present. Secondly, the rate of secular climate change will not exceed background natural variability, on seasonal to interannual time-scales, for at least several decades – which may not provide the constancy of conditions needed for phytoplankton adaptation. Taken together, the relatively subtle environmental changes due to climate change may result in adaptation by resident phytoplankton, but not for several decades, owing to the confounding effects of climate variability. This presents major challenges for the detection and attribution of climate change effects on Southern Ocean phytoplankton. We advocate the development of multi-faceted tests/metrics that will reflect the relative plasticity of different phytoplankton functional groups and/or species in responding to changing ocean conditions.
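The second finding rests on a signal-versus-noise comparison that can be sketched as a "time of emergence" calculation: the first year in which the accumulated secular trend exceeds the background interannual variability. The trend and noise values below are assumptions for illustration, not CSM1.4-carbon output.

```python
# A hedged time-of-emergence sketch: the first year a secular trend exceeds
# two standard deviations of interannual variability. Values are assumptions.
import numpy as np

years = np.arange(2000, 2101)
trend_per_year = 0.005   # secular change in some ocean property (units/yr)
noise_sd = 0.15          # interannual variability (same units)

signal = trend_per_year * (years - years[0])

# Emergence: accumulated signal first exceeds 2 sigma of natural variability.
emerged = signal > 2 * noise_sd
print("time of emergence:",
      years[emerged][0] if emerged.any() else "beyond 2100")
```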


2017 ◽  
Author(s):  
Pakawat Phalitnonkiat ◽  
Wenxiu Sun ◽  
Mircea D. Grigoriu ◽  
Peter G. M. Hess ◽  
Gennady Samorodnitsky ◽  
...  

Abstract. The co-occurrence of heat waves and pollution events, and the resulting high mortality rates, emphasizes the importance of studying pollution and temperature extremes jointly. Using extreme value theory and other statistical methods, ozone and temperature extremes and their joint occurrence are analyzed over the United States during the summer months (JJA) using Clean Air Status and Trends Network (CASTNET) measurement data and simulations of the present and future climate and chemistry in the Community Earth System Model (CESM1) CAM4-chem. Three simulations using CAM4-chem were analyzed: the Chemistry Climate Model Initiative (CCMI) reference experiment using specified dynamics (REFC1SD) between 1992 and 2010, a 25-year present-day simulation branched off the CCMI REFC2 simulation in the year 2000, and a 25-year future simulation branched off the CCMI REFC2 simulation in 2100. The latter two simulations differed in their concentrations of carbon dioxide (representative of the years 2000 and 2100) but were otherwise identical. A new metric is developed to measure the joint extremal dependence of ozone and temperature by evaluating the spectral dependence of their extremes. Two regions of the U.S. show the strongest measured extreme dependence of ozone and temperature: the northeast and the southeast. The simulations do not capture the relationship between temperature and ozone over the northeast but do simulate a strong dependence of ozone on extreme temperatures over the southeast. In general, the simulations do not capture the width of the measured temperature and ozone distributions. While on average the future increases in the 90th percentile temperature and the 90th percentile ozone slightly exceed the mean increases over the continental U.S., in many regions the widths of the temperature and ozone distributions decrease. The locations of future increases in the tails of the ozone distribution are only weakly related to those of temperature, with a correlation of 0.3.
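In the same spirit as the paper's dependence metric (though not identical to it), a simple empirical measure of joint extremal behavior is the tail dependence coefficient chi(u) = P(F_O(O) > u | F_T(T) > u) for ozone O and temperature T. The sketch below estimates it at several thresholds from synthetic correlated data, which are an assumption.

```python
# A small sketch of joint extremal dependence: the empirical tail dependence
# coefficient chi(u) between ozone and temperature. Data are synthetic.
import numpy as np
from scipy.stats import rankdata

rng = np.random.default_rng(11)
n = 5000
temperature = rng.normal(size=n)
ozone = 0.6 * temperature + 0.8 * rng.normal(size=n)  # correlated surrogate

u_t = rankdata(temperature) / (n + 1)   # empirical CDF values via ranks
u_o = rankdata(ozone) / (n + 1)

for u in (0.90, 0.95, 0.99):
    joint = np.mean((u_t > u) & (u_o > u))
    chi = joint / (1 - u)               # P(ozone extreme | temp extreme)
    print(f"u = {u:.2f}: chi = {chi:.2f}")
```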


2018 ◽  
Author(s):  
Ethan G. Hyland ◽  
Katharine W. Huntington ◽  
Nathan D. Sheldon ◽  
Tammo Reichgelt

Abstract. Paleogene greenhouse climate equability has long been a paradox in paleoclimate research. However, recent developments in proxy and modeling methods have suggested that strong seasonality may be a feature of at least some greenhouse periods. Here we present the first multi-proxy record of seasonal temperatures during the Paleogene from paleofloras, paleosol geochemistry, and carbonate clumped isotope thermometry in the Green River Basin (Wyoming, USA). These combined temperature records allow for the reconstruction of past seasonality in the continental interior, which shows that temperatures were warmer in all seasons during the peak early Eocene climatic optimum and that the mean annual range of temperature was high, similar to the modern value (~ 26 °C). Proxy data and downscaled Eocene regional climate model results suggest amplified seasonality during greenhouse events. Increased seasonality reconstructed for the early Eocene is similar in scope to the higher seasonal range predicted by downscaled climate model ensembles for future high-CO2 emissions scenarios. Overall, these data and model comparisons have substantial implications for understanding greenhouse climates in general, and may be important for predicting future seasonal climate regimes and their impacts in continental regions.


2003 ◽  
Vol 11 (4) ◽  
pp. 381-396 ◽  
Author(s):  
Joshua D. Clinton ◽  
Adam Meirowitz

Scholars of legislative studies typically use ideal point estimates from scaling procedures to test theories of legislative politics. We contend that theory and methods may be better integrated by directly incorporating maintained and to-be-tested hypotheses in the statistical model used to estimate legislator preferences. In this view of theory and estimation, formal modeling (1) provides auxiliary assumptions that serve as constraints in the estimation process, and (2) generates testable predictions. The estimation and hypothesis testing procedure uses roll call data to evaluate the validity of theoretically derived, to-be-tested hypotheses in a world where maintained hypotheses are presumed true. We articulate the approach using the language of statistical inference (both frequentist and Bayesian). The approach is demonstrated in analyses of the well-studied Powell amendment to the federal aid-to-education bill in the 84th House and the Compromise of 1790 in the 1st House.
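A compact sketch of the kind of constrained estimation described here: a one-dimensional probit roll-call (ideal point) model whose likelihood is maximized subject to a theory-driven constraint on legislator positions. The synthetic data, the treatment of item parameters as known, and the particular anchoring constraint are simplifying assumptions, not the authors' specification.

```python
# A hedged sketch: 1-D probit ideal-point estimation from roll calls, with a
# constraint standing in for a maintained hypothesis. Data are synthetic and
# the item parameters (alpha, beta) are treated as known for simplicity.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(13)
n_leg, n_votes = 20, 50
true_x = np.sort(rng.normal(size=n_leg))          # legislator ideal points
alpha, beta = rng.normal(size=n_votes), rng.normal(size=n_votes)
p_yea = norm.cdf(np.outer(true_x, beta) - alpha)  # P(yea) per legislator/vote
votes = rng.random((n_leg, n_votes)) < p_yea

def neg_log_lik(x):
    p = norm.cdf(np.outer(x, beta) - alpha).clip(1e-9, 1 - 1e-9)
    return -np.sum(np.where(votes, np.log(p), np.log(1 - p)))

# Maintained hypothesis as a constraint: pin one legislator's position to
# anchor the scale (a simple stand-in for richer theoretical restrictions).
cons = {"type": "eq", "fun": lambda x: x[0] - true_x[0]}
fit = minimize(neg_log_lik, x0=np.zeros(n_leg), constraints=cons)
print("correlation with truth:", np.corrcoef(fit.x, true_x)[0, 1].round(2))
```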


2021 ◽  
Author(s):  
Yoann Robin ◽  
Aurélien Ribes

We describe a statistical method to derive event attribution diagnoses combining climate model simulations and observations. We fit nonstationary Generalized Extreme Value (GEV) distributions to extremely hot temperatures from an ensemble of Coupled Model Intercomparison Project phase 5 (CMIP5) models. In order to select a common statistical model, we discuss which GEV parameters have to be nonstationary and which do not. Our tests suggest that the location and scale parameters of GEV distributions should be considered nonstationary. Then, a multimodel distribution is constructed and constrained by observations using a Bayesian method. This new method is applied to the July 2019 French heatwave. Our results show that both the probability and the intensity of that event have increased significantly in response to human influence. Remarkably, we find that the heat wave considered might not have been possible without climate change. Our results also suggest that combining model data with observations can improve the description of hot temperature distributions.
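A hedged sketch of a nonstationary GEV fit of the kind described above: location and scale vary linearly with a covariate t (for example, a smoothed forcing index), the shape is held constant, and parameters are obtained by maximum likelihood. The synthetic annual maxima and parameter values are assumptions; the multimodel Bayesian constraint step is not reproduced.

```python
# A minimal nonstationary GEV fit: location and scale linear in a covariate
# t, shape constant, by maximum likelihood. Synthetic data; an illustration
# of the modeling choice, not the paper's full method.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

rng = np.random.default_rng(21)
t = np.linspace(0.0, 1.0, 120)   # normalized covariate (e.g., forcing index)
# Synthetic warming hot extremes: drifting location and scale.
maxima = genextreme.rvs(c=0.1, loc=30 + 2 * t, scale=1 + 0.5 * t,
                        random_state=rng)

def neg_log_lik(theta):
    mu0, mu1, s0, s1, shape = theta
    loc = mu0 + mu1 * t
    scale = s0 + s1 * t
    if np.any(scale <= 0):
        return np.inf
    # Note: scipy's genextreme shape c equals minus the usual GEV xi.
    return -genextreme.logpdf(maxima, c=shape, loc=loc, scale=scale).sum()

fit = minimize(neg_log_lik,
               x0=[maxima.mean(), 0.0, maxima.std(), 0.0, 0.1],
               method="Nelder-Mead")
mu0, mu1, s0, s1, shape = fit.x
print(f"location trend = {mu1:.2f}, scale trend = {s1:.2f}, "
      f"shape = {shape:.2f}")
```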

