Optimal Fingerprinting
Recently Published Documents

Total documents: 22 (last five years: 6)
H-index: 11 (last five years: 1)

2021
Author(s): Ross McKitrick

Abstract Allen and Tett (1999, herein AT99) introduced a Generalized Least Squares (GLS) regression methodology for decomposing patterns of climate change for attribution purposes and proposed the "Residual Consistency Test" (RCT) to check the GLS specification. Their methodology has been widely used and highly influential ever since, in part because subsequent authors have relied on their claim that the GLS model satisfies the conditions of the Gauss-Markov (GM) Theorem and therefore yields unbiased and efficient estimators. But AT99 stated the GM Theorem incorrectly, omitting a critical condition altogether; their GLS method cannot satisfy the GM conditions; and their variance estimator is inconsistent by construction. Additionally, they did not formally state the null hypothesis of the RCT, identify which of the GM conditions it tests, or derive its distribution and critical values, rendering it uninformative as a specification test. The continuing influence of AT99 two decades later means these issues should be corrected. I identify six conditions that must be shown to hold for the AT99 method to be valid.
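
As context for the critique above: optimal fingerprinting in the AT99 tradition regresses observations y on model-simulated response patterns X by GLS, weighting by the inverse of an internal-variability covariance C, and the RCT compares the weighted residuals to a chi-squared reference. A minimal sketch on synthetic data (the dimensions, variable names, and the assumption of a known C are illustrative, not AT99's actual implementation):

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n, k = 50, 2                                  # observation dimension, number of signal patterns
X = rng.normal(size=(n, k))                   # model-simulated response patterns (synthetic)
C = np.diag(rng.uniform(0.5, 2.0, size=n))    # internal-variability covariance, assumed known here
y = X @ np.array([1.0, 0.4]) + rng.multivariate_normal(np.zeros(n), C)

# GLS estimator: beta_hat = (X' C^-1 X)^-1 X' C^-1 y
Cinv = np.linalg.inv(C)
A = X.T @ Cinv @ X
beta_hat = np.linalg.solve(A, X.T @ Cinv @ y)
beta_cov = np.linalg.inv(A)                   # nominal covariance under the GM conditions

# Residual consistency test statistic: weighted residual sum of squares,
# nominally chi-squared with n - k degrees of freedom
r = y - X @ beta_hat
rct_stat = r @ Cinv @ r
p_value = stats.chi2.sf(rct_stat, df=n - k)
print(beta_hat, np.sqrt(np.diag(beta_cov)), p_value)

In practice C is replaced by an estimate from control runs rather than known exactly, which is precisely where the abstract argues the GM conditions break down.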


Author(s): Yan Li, Kun Chen, Jun Yan, Xuebin Zhang

2021, Vol 34 (1), pp. 215-228
Author(s): William R. Hobbs, Christopher Roach, Tilla Roy, Jean-Baptiste Sallée, Nathaniel Bindoff

Abstract In this study, we compare observed Southern Ocean temperature and salinity changes with historical simulations from 13 models from phase 5 of the Coupled Model Intercomparison Project (CMIP5), using an optimal fingerprinting framework. We show that there is an unequivocal greenhouse gas–forced warming in the Southern Ocean. This warming is strongest in the Subantarctic Mode Waters but is also detectable in denser water masses, which has not been shown in previous studies. We also find greenhouse gas–forced salinity changes, most notably a freshening of Antarctic Intermediate Waters. Our analysis also shows that non–greenhouse gas anthropogenic forcings (anthropogenic aerosols and stratospheric ozone depletion) have played an important role in mitigating the Southern Ocean's warming. However, the detectability of these responses using optimal fingerprinting is model dependent, and this result is therefore not as robust as for the greenhouse gas response.
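
In studies of this kind, a forced signal is typically "detected" when the confidence interval of its fingerprint scaling factor excludes zero, and the simulated response is consistent with observations when the interval covers one. A hedged sketch of that bookkeeping (the function, the 90% level, and the normal approximation are illustrative conventions, not this paper's exact procedure):

import numpy as np
from scipy import stats

def detection_check(beta_hat, beta_se, level=0.90):
    """Two-sided confidence interval for a fingerprint scaling factor.

    Detection: the interval excludes 0.
    Consistency with the simulated response: the interval covers 1.
    (Illustrative convention; papers differ in level and formulation.)
    """
    z = stats.norm.ppf(0.5 + level / 2)
    lo, hi = beta_hat - z * beta_se, beta_hat + z * beta_se
    return {"interval": (lo, hi),
            "detected": lo > 0 or hi < 0,
            "consistent": lo <= 1 <= hi}

# e.g. a hypothetical greenhouse gas scaling factor and its standard error
print(detection_check(0.8, 0.2))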


2020, Vol 47 (15)
Author(s): L. Trenary, T. DelSole, M.K. Tippett

2019, Vol 10 (1)
Author(s): Xing Yuan, Linying Wang, Peili Wu, Peng Ji, Justin Sheffield, ...

Abstract Flash droughts refer to a type of drought that intensifies rapidly without sufficient early warning. To date, how flash drought risk will change in a warming climate remains unknown, owing to the diversity of flash drought definitions, the unclear role of anthropogenic fingerprints, and uncertain socioeconomic development. Here we propose a new method for explicitly characterizing flash drought events and find that the exposure risk over China will increase by about 23% ± 11% by the middle of this century under a socioeconomic scenario with medium challenges. Optimal fingerprinting shows that anthropogenic climate change induced by increased greenhouse gas concentrations accounts for 77% ± 26% of the upward trend in flash drought frequency; population increase is also an important factor enhancing the exposure risk of flash drought over the southernmost humid regions. Our results suggest that traditional drought-prone regions would expand given the human-induced intensification of flash drought risk.
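
A common way to characterize flash drought events explicitly is a percentile-based onset criterion on pentad soil moisture; the sketch below uses one such convention (the 40th/20th percentile thresholds and the four-pentad window are illustrative assumptions, not necessarily the definition proposed in this paper):

import numpy as np

def flash_drought_events(sm_pct, start_thresh=40, end_thresh=20, max_pentads=4):
    """Flag flash-drought onsets in a pentad soil-moisture percentile series.

    Criterion (one common convention): percentiles fall from above
    `start_thresh` to below `end_thresh` within `max_pentads` pentads.
    Returns (onset pentad, end of intensification) pairs; overlapping
    candidate onsets are all reported in this simple sketch.
    """
    events = []
    for t in range(len(sm_pct)):
        if sm_pct[t] <= start_thresh:
            continue
        window = sm_pct[t + 1 : t + 1 + max_pentads]
        drops = np.nonzero(window < end_thresh)[0]
        if drops.size:
            events.append((t, t + 1 + drops[0]))
    return events

pct = np.array([55, 48, 41, 33, 18, 12, 25, 44])   # illustrative pentad percentiles
print(flash_drought_events(pct))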


2018, Vol 52 (7-8), pp. 4111-4126
Author(s): Timothy DelSole, Laurie Trenary, Xiaoqin Yan, Michael K. Tippett

2016, Vol 2016 (4), pp. 470-487
Author(s): Gábor György Gulyás, Gergely Acs, Claude Castelluccia

Abstract Several recent studies have demonstrated that people exhibit a high degree of behavioural uniqueness. This has serious privacy implications, as most individuals become increasingly re-identifiable in large datasets, or can be tracked while browsing the web, using only a few of their attributes, called their fingerprints. Often the success of these attacks depends on explicit constraints on the number of attributes learnable about individuals, i.e., the size of their fingerprints. These can be budgetary as well as technical constraints imposed by the data holder. For instance, Apple restricts the number of applications that can be called by another application on iOS in order to mitigate the privacy threat of leaking the list of installed applications on a device. In this work, we address the problem of identifying the attributes (e.g., smartphone applications) that can serve as a fingerprint of users, given constraints on the size of the fingerprint. We give the best fingerprinting algorithms in general and evaluate their effectiveness on several real-world datasets. Our results show that current privacy guards limiting the number of attributes that can be queried about individuals are insufficient to mitigate privacy risks in many practical cases.
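
The core task, selecting a small set of attributes that best distinguishes users, can be sketched as a greedy search (a generic illustration of size-constrained fingerprint selection, not the paper's actual algorithms):

from collections import defaultdict

def greedy_fingerprint(records, max_size):
    """Greedily pick attributes that best separate users, up to `max_size`.

    `records`: one dict per user, mapping attribute -> value.
    At each step, adds the attribute that maximizes the number of users
    made unique by the resulting fingerprint (illustrative objective).
    """
    attrs = set().union(*records)
    chosen = []
    for _ in range(max_size):
        best, best_score = None, -1
        for a in attrs - set(chosen):
            counts = defaultdict(int)
            for r in records:
                counts[tuple(r.get(x) for x in chosen + [a])] += 1
            score = sum(1 for r in records
                        if counts[tuple(r.get(x) for x in chosen + [a])] == 1)
            if score > best_score:
                best, best_score = a, score
        if best is None:
            break
        chosen.append(best)
    return chosen

users = [{"app1": 1, "app2": 0, "app3": 1},
         {"app1": 1, "app2": 1, "app3": 0},
         {"app1": 0, "app2": 1, "app3": 0}]
print(greedy_fingerprint(users, 2))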


2016, Vol 29 (6), pp. 1977-1998
Author(s): Alexis Hannart

Abstract The present paper introduces and illustrates methodological developments intended for the so-called optimal fingerprinting methods that are in frequent use in detection and attribution studies. These methods have traditionally involved three independent steps: preliminary reduction of the dimension of the data, estimation of the covariance associated with internal climate variability, and finally linear regression inference with an associated uncertainty assessment. It is argued that such a compartmentalized treatment presents several issues; an integrated method is thus introduced to address them. The suggested approach is based on a single-piece statistical model that represents both the linear regression and the control runs. The unknown covariance is treated as a nuisance parameter that is eliminated by integration, which allows for the introduction of regularization assumptions. Point estimates and confidence intervals follow from the integrated likelihood. Further, it is shown that preliminary dimension reduction is not required for implementability and that the computational issues associated with using the raw, high-dimensional, spatiotemporal data can be resolved quite easily. Results on simulated data show improved performance compared to existing methods with respect to both estimation error and the accuracy of confidence intervals, and also highlight the need for further improvements regarding the latter. The method is illustrated on twentieth-century precipitation and surface temperature, suggesting a potentially high informational benefit of using the raw, non-dimension-reduced data in detection and attribution (D&A), provided model error is appropriately built into the inference.
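
One ingredient this integration enables, regularization of the internal-variability covariance, can be illustrated with simple linear shrinkage of the control-run sample covariance toward a scaled identity (a generic Ledoit-Wolf-style sketch under assumed dimensions, not Hannart's integrated-likelihood estimator itself):

import numpy as np

def shrunk_covariance(control_runs, alpha=0.2):
    """Linear shrinkage of a control-run sample covariance.

    control_runs: (m, p) array of m control-run segments in a
    p-dimensional spatio-temporal space. With m < p the raw sample
    covariance is singular; shrinking it toward a scaled identity
    keeps it invertible. `alpha` is an illustrative shrinkage weight.
    """
    m, p = control_runs.shape
    S = np.cov(control_runs, rowvar=False)    # raw, possibly singular, estimate
    target = (np.trace(S) / p) * np.eye(p)
    return (1 - alpha) * S + alpha * target

rng = np.random.default_rng(1)
runs = rng.normal(size=(30, 100))             # fewer runs than dimensions
C_hat = shrunk_covariance(runs)
print(np.linalg.cond(C_hat))                  # finite: the estimate is invertible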

