Influence of Bias Correcting Predictors on Statistical Downscaling Models

2017 ◽ Vol 56 (1) ◽ pp. 5-26
Author(s): Mathieu Vrac ◽ Pradeebane Vaittinada Ayar

Abstract. Statistical downscaling models (SDMs) and bias correction (BC) methods are commonly used to provide regional or debiased climate projections. However, most SDMs are used in a "perfect prognosis" context, meaning that they are calibrated on reanalysis predictors before being applied to GCM simulations. If the latter are biased, SDMs might suffer from discrepancies with observations and therefore provide unrealistic projections. It is then necessary to study the influence of applying bias correction to the large-scale predictors of SDMs, since it can affect the local-scale simulations: such an investigation for daily temperature and precipitation is the goal of this study. Hence, four temperature and three precipitation SDMs are calibrated over a historical period. First, the SDMs are forced by historical predictors from two GCMs, either bias corrected or not. The two types of simulations are compared with reanalysis-driven SDM outputs to characterize the quality of the simulations. Second, changes in basic statistical properties of the raw GCM projections and of the SDM simulations (driven by bias-corrected or raw predictors from GCM future projections) are compared. Third, the stationarity of the changes that BC of the predictors brings to the SDM simulations is investigated. Changes are computed over a historical (1976–2005) and a future (2071–2100) period and compared to assess nonstationarity. Overall, BC can affect the SDM simulations, although its influence varies from one SDM to another and from one GCM to another, with different spatial structures, and depends on the statistical properties considered. Nevertheless, corrected predictors generally improve the historical projections and can affect future evolutions, with potentially strong nonstationary behavior.
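As an illustration of the bias correction step discussed above, here is a minimal sketch of empirical quantile mapping, one common BC approach for large-scale predictors; the abstract does not name the specific BC method used in the paper, and all data below are synthetic stand-ins:

```python
import numpy as np

def quantile_map(gcm_hist, gcm_fut, ref):
    """Empirical quantile mapping: build a transfer function that maps the
    historical GCM distribution onto the reference (reanalysis) distribution,
    then apply it to the future GCM run."""
    quantiles = np.linspace(0.01, 0.99, 99)
    gcm_q = np.quantile(gcm_hist, quantiles)  # GCM historical quantiles
    ref_q = np.quantile(ref, quantiles)       # reanalysis quantiles
    return np.interp(gcm_fut, gcm_q, ref_q)   # map future values through it

# Hypothetical warm-biased GCM temperature predictor (synthetic daily data)
rng = np.random.default_rng(0)
ref      = rng.normal(15.0, 3.0, 10950)   # "reanalysis", ~30 years of days
gcm_hist = rng.normal(17.0, 3.5, 10950)   # biased GCM, historical period
gcm_fut  = rng.normal(19.0, 3.5, 10950)   # biased GCM, future period

corrected = quantile_map(gcm_hist, gcm_fut, ref)
print(f"raw future mean: {gcm_fut.mean():.2f}, corrected: {corrected.mean():.2f}")
```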

2010 ◽ Vol 4 (4) ◽ pp. 2233-2275
Author(s): G. Levavasseur ◽ M. Vrac ◽ D. M. Roche ◽ D. Paillard ◽ A. Martin ◽ ...

Abstract. We quantify the agreement between permafrost distributions from PMIP2 (Paleoclimate Modeling Intercomparison Project) climate models and permafrost data. We evaluate the ability of several climate models to represent permafrost and assess the inter-model variability between them. Studying a heterogeneous variable such as permafrost requires conducting analyses at a spatial scale smaller than the resolution of climate models. Our approach consists of applying statistical downscaling methods (SDMs) to large- or regional-scale atmospheric variables provided by climate models, leading to local permafrost modelling. Among the SDMs, we first choose a transfer function approach based on Generalized Additive Models (GAMs) to produce a high-resolution climatology of surface air temperature (SAT). Then, we define the permafrost distribution over Eurasia by SAT conditions. In a first validation step on the present climate (CTRL period), the GAM shows some limitations, with non-systematic improvements in comparison with the large-scale fields. We therefore develop an alternative statistical downscaling method based on a stochastic generator approach through a Multinomial Logistic Regression (MLR), which directly models the probabilities of local permafrost indices. The resulting permafrost distributions are in better agreement with the data. In both cases, the provided local information reduces the inter-model variability. Nevertheless, this also shows that a simple relationship between permafrost and SAT alone is not always sufficient to represent local permafrost. Finally, we apply each method to a very different climate, the Last Glacial Maximum (LGM) period, in order to quantify the ability of climate models to represent LGM permafrost. For this period, our SDMs do not significantly improve the permafrost distribution and do not reduce the inter-model variability. We show that the LGM permafrost distribution from climate models strongly depends on large-scale SAT. The differences with LGM data, larger than in the CTRL period, reduce the contribution of downscaling and depend on several factors deserving further study.
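To make the MLR step concrete, here is a minimal sketch of a multinomial logistic regression predicting local permafrost class probabilities from SAT, using scikit-learn; the class boundaries and data are invented for illustration and do not reproduce the paper's stochastic-generator setup:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical data: mean annual surface air temperature (SAT, degC) as the
# single large-scale predictor, and a 3-class local permafrost index
# (0 = none, 1 = discontinuous, 2 = continuous); thresholds are placeholders.
rng = np.random.default_rng(1)
sat = rng.uniform(-20, 10, size=(500, 1))
classes = np.digitize(sat.ravel() + rng.normal(0, 2, 500), bins=[-8.0, -1.0])

mlr = LogisticRegression().fit(sat, classes)  # multinomial for 3+ classes
probs = mlr.predict_proba([[-5.0]])           # P(each class | SAT = -5 degC)
print(probs.round(3))
```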


2019 ◽ Vol 58 (6) ◽ pp. 1267-1278
Author(s): Cristina L. Archer ◽ Joseph F. Brodie ◽ Sara A. Rauscher

Abstract. The goal of this study is to evaluate the effects of anthropogenic climate change on air quality, in particular on ozone, during the summer in the U.S. mid-Atlantic region. First, we establish a connection between high-ozone (HO) days, defined as those with observed 8-h average ozone concentration greater than 70 parts per billion (ppb), and certain weather patterns, called synoptic types. We identify four summer synoptic types that most often are associated with HO days based on a 30-yr historical period (1986–2015) using NCEP–NCAR reanalysis. Second, we define thresholds for mean near-surface temperature and precipitation that characterize HO days during the four HO synoptic types. Next, we look at climate projections from five models from phase 5 of the Coupled Model Intercomparison Project (CMIP5) for the early and late midcentury (2025–34 and 2045–54) and analyze the frequency of HO days. We find a general increasing trend, weaker in the early midcentury and stronger in the late midcentury, with 2 and 5 extra HO days per year, respectively, from 16 in 2015. These 5 extra days are the result of two processes. On one hand, the four HO synoptic types will increase in frequency, which explains about 1.5–2 extra HO days. The remaining 3–3.5 extra days are explained by the increase in near-surface temperatures during the HO synoptic types. Future air quality regulations, which have been successful in the historical period at reducing ozone concentrations in the mid-Atlantic, may need to become stricter to compensate for the underlying increasing trends from global warming.
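The counting logic behind the projected HO-day frequencies can be sketched as follows; the synoptic-type indices and the temperature and precipitation thresholds below are placeholders, not the values derived in the paper, and the daily data are synthetic:

```python
import numpy as np

# Hypothetical daily records: a synoptic type label (0-11 from a
# classification), mean near-surface temperature (degC), and precipitation (mm).
rng = np.random.default_rng(2)
n_days = 92 * 10                      # ten summers of daily data
syn_type = rng.integers(0, 12, n_days)
t2m = rng.normal(26, 4, n_days)
precip = rng.gamma(0.5, 4.0, n_days)

HO_TYPES = {2, 5, 7, 9}               # assumed indices of the four HO types
T_THRESH, P_THRESH = 30.0, 1.0        # assumed thresholds characterizing HO days

# A day is flagged HO-favorable if it falls in an HO synoptic type AND is
# hot and dry enough; dividing the total by ten summers gives a yearly rate.
ho_days = (np.isin(syn_type, list(HO_TYPES)) & (t2m > T_THRESH) & (precip < P_THRESH))
print(f"~{ho_days.sum() / 10:.1f} HO-favorable days per summer")
```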


2021 ◽ Vol 37 ◽ pp. 00049
Author(s): Michael Naumov ◽ Ludmila Reznichenko ◽ Yana Masalykina ◽ Ivan Styazhkin

This scientific article deals with the problem of antibiotic resistance. It is difficult to give a complete picture of the resistance of microorganisms to antibiotics, because the topic is very diverse and actively investigated, and reports of new cases of antibiotic resistance appear rapidly. Less than a century has passed since the beginning of large-scale use of antibiotics. In this short historical period, the threat of antibiotic resistance has reached a global level, and it would be wrong to deny that humanity has created this enemy through its own efforts. Antibiotic resistance is a property of a microorganism associated with a reduced effect of an antibiotic on a given culture. The driving force behind this phenomenon is evolution: through random mutations, individuals appear that are not susceptible to the effects of a previously used drug. The emergence of superbugs, cultures that do not respond to any currently existing antibiotic, would lead to a decline in people's quality of life. Diseases that no longer cause concern in modern society would once again become deadly.


2017
Author(s): Wil Roebroeks ◽ Sabine Gaudzinski-Windheuser ◽ Michael Baales ◽ Ralf-Dietrich Kahlke

Abstract. The database regarding the earliest occupation of Europe has increased significantly in the quantity and quality of data points over the last two decades, mainly through the addition of new sites resulting from long-term systematic excavations and large-scale prospections of Early and early Middle Pleistocene exposures. The site distribution pattern suggests an ephemeral presence of hominins in the south of Europe from around one million years ago, with occasional short northward expansions along the western coastal areas when temperate conditions permitted. From around 600,000-700,000 years ago, Acheulean artefacts appear in Europe, and somewhat later hominin presence seems to pick up, with more sites, now including some in colder climatic settings. It is only later still, around 350,000 years ago, that the first sites show up in the more continental, central parts of Europe, east of the Rhine. A series of recent papers on the Early Pleistocene palaeontological site of Untermassfeld (Germany) makes claims that are of great interest for studies of earliest Europe and are at odds with the pattern described above: the papers suggest that Untermassfeld has yielded stone tools and humanly modified faunal remains, evidence for a one-million-year-old hominin presence in the European continental mid-latitudes, and additional evidence that hominins were already well established in Europe around that time. Here we evaluate these claims and demonstrate that these studies are severely flawed in terms of the data on the provenance of the materials studied and in the interpretation of faunal remains and lithics as testifying to a hominin presence at the site. In fact, any reference to the Untermassfeld site as an archaeological one is unwarranted. Furthermore, it is not the only European Early Pleistocene site where inferred evidence for hominin presence is problematic. The strength of the spatiotemporal patterns of hominin presence and absence depends on the quality of the data points we work with, and database maintenance, including critical evaluation of new sites, is crucial to advance our knowledge of the expansions and contractions of hominin ranges during the Pleistocene.


2011 ◽ Vol 7 (3) ◽ pp. 1647-1692
Author(s): G. Levavasseur ◽ M. Vrac ◽ D. M. Roche ◽ D. Paillard ◽ A. Martin ◽ ...

Abstract. We quantify the agreement between permafrost distributions from PMIP2 (Paleoclimate Modeling Intercomparison Project) climate models and permafrost data. We evaluate the ability of several climate models to represent permafrost and assess the inter-model variability between them. Studying a heterogeneous variable such as permafrost requires conducting analyses at a spatial scale smaller than the resolution of climate models. Our approach consists of applying statistical downscaling methods (SDMs) to large- or regional-scale atmospheric variables provided by climate models, leading to local-scale permafrost modelling. Among the SDMs, we first choose a transfer function approach based on Generalized Additive Models (GAMs) to produce a high-resolution climatology of air temperature at the surface. Then, we define the permafrost distribution over Eurasia by air temperature conditions. In a first validation step on the present climate (CTRL period), this method shows some limitations, with non-systematic improvements in comparison with the large-scale fields. We therefore develop an alternative statistical downscaling method based on a Multinomial Logistic GAM (ML-GAM), which directly predicts the occurrence probabilities of local-scale permafrost. The resulting permafrost distributions are in better agreement with the data. On average over the nine PMIP2 models, we measure a global agreement (kappa statistic) of 0.80 with CTRL permafrost data, against 0.68 for the GAM method. In both cases, the provided local information reduces the inter-model variability. This also confirms that a simple relationship between permafrost and air temperature alone is not always sufficient to represent local-scale permafrost. Finally, we apply each method to a very different climate, the Last Glacial Maximum (LGM) period, in order to quantify the ability of climate models to represent LGM permafrost. The predictions of the SDMs are not significantly better than the large-scale fields, with a global agreement of 0.46 (GAM) and 0.49 (ML-GAM) with LGM permafrost data. At the LGM, neither method reduces the inter-model variability. We show that the LGM permafrost distribution from climate models strongly depends on large-scale air temperature at the surface. LGM simulations from climate models lead to larger differences with permafrost data than in the CTRL period. These differences reduce the contribution of downscaling and depend on several other factors deserving further study.
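The kappa statistic used above to score agreement between downscaled permafrost maps and data can be computed as in this minimal sketch; the 3-class scheme and the synthetic maps are assumptions for illustration:

```python
import numpy as np

def cohen_kappa(a, b):
    """Cohen's kappa between two categorical maps flattened to 1-D arrays."""
    cats = np.union1d(a, b)
    po = np.mean(a == b)                          # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c)    # agreement expected by chance
             for c in cats)
    return (po - pe) / (1.0 - pe)

# Hypothetical 3-class permafrost maps (0 none, 1 discontinuous, 2 continuous):
# the "model" map agrees with the data about 80% of the time by construction.
rng = np.random.default_rng(3)
data = rng.integers(0, 3, 5000)
model = np.where(rng.random(5000) < 0.8, data, rng.integers(0, 3, 5000))
print(f"kappa = {cohen_kappa(model, data):.2f}")
```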


2020 ◽ Vol 6 (15) ◽ pp. eaay4444
Author(s): Ernest N. Koffi ◽ Peter Bergamaschi ◽ Romain Alkama ◽ Alessandro Cescatti

Wetlands are a major source of methane (CH4) and contribute between 30% and 40% of total CH4 emissions. Wetland CH4 emissions depend on temperature, water table depth, and both the quantity and quality of organic matter. Global warming will affect these three drivers of methanogenesis, raising questions about the feedbacks between natural methane production and climate change. Until now, the large-scale response of wetland CH4 emissions to climate has been investigated with land-surface models, which have produced contrasting results. Here, we produce a novel global estimate of wetland methane emissions based on atmospheric inverse modeling of CH4 fluxes and observed temperature and precipitation. Our data-driven model suggests that by 2100, current emissions may increase by 50% to 80%, within the range of 50% to 150% reported in previous studies. This finding highlights the importance of limiting global warming to below 2°C to avoid substantial climate feedbacks driven by methane emissions from natural wetlands.
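A data-driven model of this kind can be sketched, in very reduced form, as a regression of inversion-derived emissions on observed climate drivers followed by extrapolation to a warming scenario; every number below is an invented placeholder, not a result from the paper:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical annual series: inversion-derived wetland CH4 emissions (Tg/yr)
# regressed on observed temperature and precipitation anomalies, then
# extrapolated to an assumed ~2 degC warming by 2100. Purely illustrative.
rng = np.random.default_rng(4)
t_anom = rng.normal(0.0, 0.3, 30)                  # temperature anomalies (degC)
p_anom = rng.normal(0.0, 0.1, 30)                  # precipitation anomalies
emis = 180 + 25 * t_anom + 60 * p_anom + rng.normal(0, 3, 30)

model = LinearRegression().fit(np.column_stack([t_anom, p_anom]), emis)
future = model.predict([[2.0, 0.05]])              # assumed 2100 climate state
print(f"projected emissions: {future[0]:.0f} Tg/yr (synthetic baseline ~180)")
```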


2016 ◽ Vol 78 (6-12)
Author(s): Mahiuddin Alamgir ◽ Sahar Hadi Pour ◽ Morteza Mohsenipour ◽ M. Mehedi Hasan ◽ Tarmizi Ismail

Reliable projection of future rainfall in Bangladesh is very important for the assessment of possible impacts of climate change and the implementation of necessary adaptation and mitigation measures. Statistical downscaling methods are widely used for downscaling coarse-resolution general circulation model (GCM) output to the local scale. The selection of predictors and their spatial domain is very important to facilitate downscaling of the future climate projected by GCMs. The present paper reports the findings of a study conducted to identify the GCM predictors and demarcate their climatic domain for statistical downscaling in Bangladesh at the local or regional scale. For this purpose, twenty-six large-scale atmospheric variables, widely simulated as GCM predictors, were analysed at 45 grid points around the country using various statistical methods. The study reveals that large-scale atmospheric variables at the grid points located in the central-west part of Bangladesh have the highest influence on rainfall. It is expected that the findings of the study will help different meteorological and agricultural organizations of Bangladesh to project rainfall and temperature at the local scale in order to provide various agricultural and hydrological services.
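A common first screening step for this kind of predictor selection is to correlate every candidate variable at every grid point with the local predictand. The sketch below illustrates that with synthetic data; the planted signal and the ranking rule are assumptions, though the 26 variables and 45 grid points mirror the study:

```python
import numpy as np

# Hypothetical screening: correlate each candidate large-scale predictor at
# each grid point with local rainfall, then rank variable/grid-point pairs.
rng = np.random.default_rng(5)
n_days, n_vars, n_grid = 3650, 26, 45
predictors = rng.normal(size=(n_days, n_vars, n_grid))
rainfall = predictors[:, 0, 20] * 0.6 + rng.normal(size=n_days)  # planted signal

corr = np.empty((n_vars, n_grid))
for v in range(n_vars):
    for g in range(n_grid):
        corr[v, g] = np.corrcoef(predictors[:, v, g], rainfall)[0, 1]

v_best, g_best = np.unravel_index(np.abs(corr).argmax(), corr.shape)
print(f"strongest predictor: variable {v_best} at grid point {g_best} "
      f"(r = {corr[v_best, g_best]:.2f})")
```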


2011 ◽ Vol 6 ◽ pp. 39-43
Author(s): Sabina Sultana ◽ Selina Parween ◽ M Altaf Hossain

Seven different species, viz. Chanda baculis, Chanda ranga, Amblypharyngodon mola, Oxygaster bacaila, Clupisoma atherinoides, Corica soborna and Mystus vittatus, and a group of mixed SIS fishes, viz. Mastacembelus pancalus, Xenentodon cancila, Chanda baculis and Glossogobius giuris, were used for the preparation of fish powder that can be preserved over an extended period. The fishes were sun dried or oven dried, which are also methods of preservation. The quality of the oven-dried fish was better than that of the sun-dried fish, but the sun-drying process is easy and can be used on a large scale. The fish powder remained in good condition for 7-9 months at normal room temperature, but at -18°C the powder stayed in good condition throughout the year. The highest yield of powder from 1 kg of fish was obtained from the mixed species (24.61%) and the lowest from O. bacaila (20.52%). Biochemical analysis showed that the maximum calcium content was 1.34% in M. vittatus and the minimum was 0.80% in the mixed SIS fishes. The maximum phosphorus content was 2.90% in C. ranga and the minimum was 1.72% in C. soborna. The maximum iron content was 45.20 mg/100 g in the mixed SIS fishes and the minimum was 16.85 mg/100 g in O. bacaila. The maximum moisture content was found in C. ranga (13.50%) and the minimum in the mixed SIS fishes (11.65%). The maximum protein content was recorded in the mixed SIS fishes (72.45%) and the minimum in C. ranga (52.65%). The experiment was replicated three times and conducted from July 2005 to July 2008. DOI: http://dx.doi.org/10.3329/jles.v6i0.9719 JLES 2011 6: 39-43


2017 ◽ Vol 30 (8) ◽ pp. 2829-2847
Author(s): Paul C. Loikith ◽ Benjamin R. Lintner ◽ Alex Sweeney

The self-organizing maps (SOMs) approach is demonstrated as a way to identify a range of archetypal large-scale meteorological patterns (LSMPs) over the northwestern United States and connect these patterns with local-scale temperature and precipitation extremes. SOMs are used to construct a set of 12 characteristic LSMPs (nodes) based on daily reanalysis circulation fields spanning the range of observed synoptic-scale variability for the summer and winter seasons for the period 1979–2013. Composites of surface variables are constructed for subsets of days assigned to each node to explore relationships between temperature, precipitation, and the node patterns. The SOMs approach also captures interannual variability in daily weather regime frequency related to El Niño–Southern Oscillation. Temperature and precipitation extremes in high-resolution gridded observations and in situ station data show robust relationships with particular nodes in many cases, supporting the approach as a way to identify LSMPs associated with local extremes. Assigning days from the extreme warm summer of 2015 and wet winter of 2016 to nodes illustrates how SOMs may be used to assess future changes in extremes. These results point to the applicability of SOMs to climate model evaluation and assessment of future projections of local-scale extremes without requiring simulations to reliably resolve extremes at high spatial scales.
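For readers unfamiliar with the technique, a self-organizing map of this kind can be built in a few lines. The sketch below trains a 3x4 (12-node) SOM on synthetic stand-ins for daily circulation fields; in practice one would feed it reduced representations of reanalysis fields (e.g. EOF coefficients) rather than random vectors:

```python
import numpy as np

def train_som(data, rows=3, cols=4, n_iter=2000, lr0=0.5, sigma0=1.5, seed=0):
    """Minimal self-organizing map: 'data' is (n_samples, n_features);
    returns node weights shaped (rows*cols, n_features)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=(rows * cols, data.shape[1]))
    # Grid coordinates of each node, used by the neighborhood function
    grid = np.array([(r, c) for r in range(rows) for c in range(cols)], float)
    for t in range(n_iter):
        x = data[rng.integers(len(data))]
        bmu = np.argmin(((w - x) ** 2).sum(axis=1))    # best-matching unit
        frac = t / n_iter
        lr = lr0 * (1 - frac)                          # decaying learning rate
        sigma = sigma0 * (1 - frac) + 0.5              # shrinking neighborhood
        d2 = ((grid - grid[bmu]) ** 2).sum(axis=1)
        h = np.exp(-d2 / (2 * sigma ** 2))             # neighborhood weights
        w += lr * h[:, None] * (x - w)                 # pull nodes toward x
    return w

# Synthetic stand-in for daily circulation fields, one feature vector per day
rng = np.random.default_rng(6)
days = rng.normal(size=(5000, 10))
nodes = train_som(days)                                # 12 LSMP archetypes
assignment = np.argmin(((days[:, None, :] - nodes[None]) ** 2).sum(-1), axis=1)
print(np.bincount(assignment, minlength=12))           # days assigned per node
```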


2018 ◽ Vol 99 (4) ◽ pp. 791-803
Author(s): John R. Lanzante ◽ Keith W. Dixon ◽ Mary Jo Nath ◽ Carolyn E. Whitlock ◽ Dennis Adams-Smith

Abstract. Statistical downscaling (SD) is commonly used to provide information for the assessment of climate change impacts. Using as input the output from large-scale dynamical climate models and observation-based data products, SD aims to provide a finer grain of detail and to mitigate systematic biases. It is generally recognized as providing added value. However, one of the key assumptions of SD is that the relationships used to train the method during a historical period are unchanged in the future, in the face of climate change. The validity of this assumption is typically quite difficult to assess in the normal course of analysis, as observations of future climate are lacking. We approach this problem using a "perfect model" experimental design in which high-resolution dynamical climate model output is used as a surrogate for both past and future observations. We find that while SD in general adds considerable value, in certain well-defined circumstances it can produce highly erroneous results. Furthermore, the breakdown of SD in these contexts could not be foreshadowed during the typical course of evaluation based only on available historical data. We diagnose and explain the reasons for these failures in terms of physical, statistical, and methodological causes. These findings highlight the need for caution in the use of statistically downscaled products and the need for further research to consider other hitherto unknown pitfalls, perhaps utilizing more advanced perfect model designs than the one we have employed.
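The logic of such a perfect-model evaluation can be condensed into a few lines: the high-resolution output serves as truth in both periods, its degraded version plays the role of the GCM predictor, and the future truth, which would be unobservable in reality, scores the downscaling. The sketch below uses synthetic data and a deliberately simple linear SD method, not the methods evaluated in the paper:

```python
import numpy as np

# Perfect-model sketch: high-resolution model output is "truth" for both
# periods; a noisy version of it stands in for the coarse GCM predictor.
rng = np.random.default_rng(7)
truth_hist = rng.normal(15, 3, 5000)                      # historical truth
truth_fut = truth_hist + 3.0 + rng.normal(0, 0.5, 5000)   # warmer future truth
coarse_hist = truth_hist + rng.normal(0, 1.5, 5000)       # "coarsened" predictor
coarse_fut = truth_fut + rng.normal(0, 1.5, 5000)

# Train: linear regression of truth on the coarse predictor, historical only
a, b = np.polyfit(coarse_hist, truth_hist, 1)

# Apply to the future and score against the future truth, which is known
# here only because of the perfect-model setup.
pred_fut = a * coarse_fut + b
rmse = np.sqrt(np.mean((pred_fut - truth_fut) ** 2))
print(f"future-period RMSE of the downscaled field: {rmse:.2f} degC")
```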

