Global sensitivity analysis of the climate–vegetation system to astronomical forcing: an emulator-based approach

2014
Vol 5 (2)
pp. 901-943
Author(s):  
N. Bounceur ◽  
M. Crucifix ◽  
R. D. Wilkinson

Abstract. A global sensitivity analysis is used to describe the response of the Earth system model of intermediate complexity LOVECLIM to components of the astronomical forcing (longitude of perihelion, obliquity, and eccentricity) assuming interglacial boundary conditions. Compared to previous studies, the sensitivity is global in the sense that it considers the full range of astronomical forcing that occurred during the Quaternary. We provide a geographical description of the variance due to the different components and their combinations and identify non-linear responses. The methodology relies on the estimation of sensitivity measures, which due to the computational cost of LOVECLIM cannot be obtained directly. Instead, we use a fast surrogate of the climate model, called an emulator, in place of the simulator. A space-filling design (a maximin Latin hypercube constrained to span the range of astronomical forcings characterising the Pleistocene) is used to determine a set of experiments to run, which are then used to train a reduced-rank Gaussian process emulator. The simulator outputs considered are the principal modes of the annual mean temperature, precipitation, and the growing degree days, extracted using a principal component analysis. The experiments are run on two distinct land surface schemes to address the effect of vegetation response on climate. Sensitivity to initial conditions is also explicitly assessed. Precession and obliquity are found to contribute equally to growing degree days (GDD) in the Northern Hemisphere, and the effects of obliquity on the response of Southern Hemisphere temperature dominate precession effects. Further, compared to the original land-surface scheme with fixed vegetation, the LOVECLIM interactive vegetation induces non-linear responses in the Sahel–Sahara region and the Arctic sea-ice area. Finally, we find that there is no synergy between obliquity and precession.
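The design-and-emulation workflow described in this abstract can be illustrated with a short sketch. The Python code below is a minimal, hypothetical example of the same steps (a Latin hypercube design, a principal component reduction of the simulated fields, and one Gaussian process per retained component); the parameter bounds, the run_loveclim placeholder, and the kernel and component choices are assumptions for illustration, not the paper's actual configuration.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.decomposition import PCA
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import ConstantKernel, RBF

# 1. Space-filling design over three forcing components; a maximin criterion
#    would be applied in practice when selecting the hypercube.
sampler = qmc.LatinHypercube(d=3, seed=0)
lower = np.array([-0.05, -0.05, 22.0])   # assumed bounds: e*sin(w), e*cos(w), obliquity (deg)
upper = np.array([0.05, 0.05, 24.5])
design = qmc.scale(sampler.random(n=30), lower, upper)

# 2. Run the simulator at each design point (placeholder standing in for a
#    LOVECLIM experiment; each row is a flattened annual-mean field).
def run_loveclim(x):
    rng = np.random.default_rng(abs(hash(x.tobytes())) % (2**32))
    return rng.normal(size=2048)

fields = np.array([run_loveclim(x) for x in design])

# 3. Reduced-rank step: keep a few principal components of the fields.
pca = PCA(n_components=5)
scores = pca.fit_transform(fields)        # shape (n_runs, 5)

# 4. Train one Gaussian-process emulator per retained component.
emulators = []
for j in range(scores.shape[1]):
    gp = GaussianProcessRegressor(
        kernel=ConstantKernel() * RBF(length_scale=np.ones(3)),
        normalize_y=True)
    gp.fit(design, scores[:, j])
    emulators.append(gp)

def emulate_field(x):
    """Predict the full field at a new astronomical configuration x."""
    pred = np.array([gp.predict(np.atleast_2d(x))[0] for gp in emulators])
    return pca.inverse_transform(pred.reshape(1, -1))[0]
```

Emulating a handful of principal-component scores rather than every grid cell is what makes the subsequent variance-based analysis affordable.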

2015
Vol 6 (1)
pp. 205-224
Author(s):  
N. Bounceur ◽  
M. Crucifix ◽  
R. D. Wilkinson

Abstract. A global sensitivity analysis is performed to describe the effects of astronomical forcing on the climate–vegetation system simulated by the model of intermediate complexity LOVECLIM in interglacial conditions. The methodology relies on the estimation of sensitivity measures, using a Gaussian process emulator as a fast surrogate of the climate model, calibrated on a set of well-chosen experiments. The outputs considered are the annual mean temperature and precipitation and the growing degree days (GDD). The experiments were run on two distinct land surface schemes to estimate the importance of vegetation feedbacks on climate variance. This analysis provides a spatial description of the variance due to the factors and their combinations, in the form of "fingerprints" obtained from the covariance indices. The results are broadly consistent with the current understanding of Earth's climate response to the astronomical forcing. In particular, precession and obliquity are found to contribute equally in LOVECLIM to GDD in the Northern Hemisphere, and the effect of obliquity on the response of Southern Hemisphere temperature dominates precession effects. Precession dominates precipitation changes in subtropical areas. Compared to standard approaches based on a small number of simulations, the methodology presented here allows us to identify more systematically regions susceptible to experiencing rapid climate change in response to the smooth astronomical forcing change. In particular, we find that using interactive vegetation significantly enhances the expected rates of climate change, specifically in the Sahel (up to 50% precipitation change in 1000 years) and in the Canadian Arctic region (up to 3° in 1000 years). None of the tested astronomical configurations were found to induce multiple steady states, but, at low obliquity, we observed the development of an oscillatory pattern that has already been reported in LOVECLIM. Although the mathematics of the analysis are fairly straightforward, the emulation approach still requires considerable care in its implementation. We discuss the effect of the choice of length scales and the type of emulator, and estimate uncertainties associated with specific computational aspects, to conclude that the principal component emulator is a good option for this kind of application.
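As a rough illustration of how the per-grid-cell variance decomposition behind such fingerprints could be computed once an emulator is available, here is a hedged sketch using SALib's Saltelli sampling and Sobol' analysis. emulate_field is the hypothetical emulator from the previous sketch, and the factor names and bounds are again assumptions rather than the study's exact setup.

```python
import numpy as np
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["esinw", "ecosw", "obliquity"],
    "bounds": [[-0.05, 0.05], [-0.05, 0.05], [22.0, 24.5]],
}

# Saltelli design: affordable because only the emulator is evaluated.
X = saltelli.sample(problem, 1024, calc_second_order=True)
Y = np.array([emulate_field(x) for x in X])     # (n_samples, n_gridcells)

# First-order (S1) and total (ST) indices per grid cell; mapping them over the
# grid gives the spatial "fingerprints" of each forcing component.
S1 = np.empty((Y.shape[1], problem["num_vars"]))
ST = np.empty_like(S1)
for cell in range(Y.shape[1]):
    res = sobol.analyze(problem, Y[:, cell], calc_second_order=True,
                        print_to_console=False)
    S1[cell], ST[cell] = res["S1"], res["ST"]
```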


2005
Vol 12 (3)
pp. 373-379
Author(s):  
C. Tiede ◽  
K. Tiampo ◽  
J. Fernández ◽  
C. Gerstenecker

Abstract. A quantitative global sensitivity analysis (SA) is applied to the non-linear inversion of gravity changes and displacement data measured in an active volcanic area. The joint inversion of these data is based on the solution of the generalized Navier equations, which couples both types of observation, gravity and displacement, in a homogeneous half-space. The sensitivity analysis has been carried out using Sobol's variance-based approach, which produces the total sensitivity indices (TSI), so that all interactions between the unknown input parameters are taken into account. Results of the SA show quite different sensitivities for the measured changes as they relate to the unknown parameters for the east, north and height components, as well as the pressure, radial and mass components of an elastic-gravitational source. The TSIs are implemented into the inversion in order to stabilize the computation of the unknown parameters, which showed wide dispersion ranges in earlier optimization approaches. Samples computed using a genetic algorithm (GA) optimization are compared to samples in which the results of the global sensitivity analysis are integrated through a reweighting of the cofactor matrix in the objective function. The comparison shows that the implementation of the TSIs can decrease the dispersion of the unknown input parameters, producing a great improvement in the reliable determination of the unknown parameters.
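One plausible reading of "reweighting of the cofactor matrix" is sketched below in Python: each observation's weight in the misfit is scaled by its total sensitivity index before the genetic-algorithm search evaluates a candidate source. The function name and the exact scaling are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def weighted_misfit(params, forward_model, observations, cofactor, tsi):
    """Misfit for one candidate parameter set of the elastic-gravitational source.

    cofactor : (n_obs, n_obs) a priori weight matrix of the observations
    tsi      : (n_obs,) total sensitivity index of each observation with respect
               to the unknown source parameters
    """
    scale = tsi / tsi.sum()                         # normalise the indices
    reweighted = cofactor * np.outer(scale, scale)  # TSI-scaled cofactor matrix
    residual = observations - forward_model(params)
    return float(residual @ reweighted @ residual)
```

A GA would then minimise weighted_misfit over the source parameters; the intention is that observations carrying little information about those parameters no longer dominate the objective.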


2019
Vol 11 (20)
pp. 2424
Author(s):  
Egor Prikaziuk ◽  
Christiaan van der Tol

The Sentinel-3 satellite has provided simultaneous observations in the optical (visible, near infrared (NIR), shortwave infrared (SWIR)) and thermal infrared (TIR) domains since 2016, with a revisit time of 1–2 days. The high temporal resolution and spectral coverage make the data of this mission attractive for vegetation monitoring. This study explores the possibilities of using the Soil Canopy Observation, Photochemistry and Energy fluxes (SCOPE) model together with Sentinel-3 data to exploit the two sensors on board Sentinel-3 (the ocean and land color instrument (OLCI) and the sea and land surface temperature radiometer (SLSTR)) in synergy. A Sobol' variance-based global sensitivity analysis (GSA) of top-of-atmosphere (TOA) radiance produced with a coupled SCOPE-6S model was conducted for the optical bands of OLCI and SLSTR, while another GSA of SCOPE was conducted for the land surface temperature (LST) product of SLSTR. The results show that, in addition to the ESA level-2 Sentinel-3 products, SCOPE is able to retrieve leaf area index (LAI), leaf chlorophyll content (Cab), leaf water content (Cw), leaf senescent material (Cs), and leaf inclination distribution (LAD). Leaf dry matter content (Cdm) and soil brightness, despite being important, were not confidently retrieved in some cases. The GSA of LST in the TIR domain showed that plant biochemical parameters, namely the maximum carboxylation rate (Vcmax) and the stomatal conductance-photosynthesis slope (Ball-Berry m), can be constrained if prior information on near-surface weather conditions is available. We conclude that the combination of the optical and thermal domains facilitates the constraint of the land surface energy balance using SCOPE.
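A hedged sketch of the retrieval step described above: leaf and canopy parameters are fitted to optical-band observations by bounded least squares. The scope_toa function below is a toy stand-in for the coupled SCOPE-6S forward simulation, and the parameter bounds and band set are illustrative guesses rather than the study's configuration.

```python
import numpy as np
from scipy.optimize import least_squares

param_names = ["LAI", "Cab", "Cw", "Cs", "LAD", "Cdm"]
lower = np.array([0.1,  5.0, 0.002, 0.0, 30.0, 0.002])
upper = np.array([7.0, 80.0, 0.050, 0.6, 80.0, 0.020])

bands = np.linspace(0.4, 2.2, 16)          # band centres in micrometres

def scope_toa(p):
    """Toy stand-in for the coupled SCOPE-6S forward model (pseudo-radiances)."""
    lai, cab, cw, cs, lad, cdm = p
    return (0.3 * np.exp(-0.5 * lai * bands)
            + 0.002 * cab * np.exp(-((bands - 0.68) / 0.05) ** 2)
            + cw * np.exp(-((bands - 1.45) / 0.10) ** 2)
            + 0.3 * cdm * np.exp(-((bands - 2.10) / 0.08) ** 2)
            + 0.10 * cs
            + 0.001 * lad * (bands - 1.3))

# Synthetic "observation" for one Sentinel-3 pixel (truth plus noise).
truth = np.array([3.0, 40.0, 0.02, 0.1, 57.0, 0.01])
observed = scope_toa(truth) + 0.001 * np.random.default_rng(0).normal(size=bands.size)

fit = least_squares(lambda p: scope_toa(p) - observed,
                    x0=0.5 * (lower + upper), bounds=(lower, upper))
retrieved = dict(zip(param_names, fit.x))
print(retrieved)
```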


2019
Author(s):  
Razi Sheikholeslami ◽  
Saman Razavi ◽  
Amin Haghnegahdar

Abstract. Complex, software-intensive, technically advanced, and computationally demanding models, presumably with ever-growing realism and fidelity, have been widely used to simulate and predict the dynamics of the Earth and environmental systems. The parameter-induced simulation crash (failure) problem is typical across most of these models, despite considerable efforts that modellers have directed at model development and implementation over the last few decades. A simulation failure mainly occurs due to the violation of the numerical stability conditions, non-robust numerical implementations, or errors in programming. However, the existing sampling-based analysis techniques such as global sensitivity analysis (GSA) methods, which require running these models under many configurations of parameter values, are ill-equipped to effectively deal with model failures. To tackle this problem, we propose a novel approach that allows users to cope with failed designs (samples) during the GSA, without knowing where they took place and without re-running the entire experiment. This approach deems model crashes as missing data and uses strategies such as median substitution, single nearest neighbour, or response surface modelling to fill in for model crashes. We test the proposed approach on a 10-parameter HBV-SASK rainfall-runoff model and a 111-parameter MESH land surface-hydrology model. Our results show that response surface modelling is a superior strategy, out of the data-filling strategies tested, and can scale well with the dimensionality of the model, the sample size, and the ratio of the number of failures to the sample size. Further, we conduct a "failure analysis" and discuss some possible causes of the MESH model failure.
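The three crash-handling strategies named in the abstract can be sketched as a missing-data imputation step before the GSA. The code below is a minimal toy illustration; the design matrix, model response, and failure mask are synthetic placeholders, not the HBV-SASK or MESH experiments.

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.gaussian_process import GaussianProcessRegressor

rng = np.random.default_rng(1)
X = rng.uniform(size=(500, 10))                  # 500 sampled parameter sets
y = np.sin(X @ rng.normal(size=10))              # model response (toy)
crashed = rng.uniform(size=500) < 0.1            # ~10 % simulated failures
y_obs = np.where(crashed, np.nan, y)

ok = ~crashed

# Strategy 1: median substitution.
y_median = np.where(crashed, np.nanmedian(y_obs), y_obs)

# Strategy 2: single nearest neighbour in parameter space.
knn = KNeighborsRegressor(n_neighbors=1).fit(X[ok], y_obs[ok])
y_knn = y_obs.copy()
y_knn[crashed] = knn.predict(X[crashed])

# Strategy 3: response-surface modelling (here a Gaussian-process surrogate
# trained on the successful runs only).
gp = GaussianProcessRegressor(normalize_y=True).fit(X[ok], y_obs[ok])
y_gp = y_obs.copy()
y_gp[crashed] = gp.predict(X[crashed])
# Any of y_median / y_knn / y_gp can then enter the GSA in place of y_obs.
```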


2019
Vol 12 (10)
pp. 4275-4296
Author(s):  
Razi Sheikholeslami ◽  
Saman Razavi ◽  
Amin Haghnegahdar

Abstract. Complex, software-intensive, technically advanced, and computationally demanding models, presumably with ever-growing realism and fidelity, have been widely used to simulate and predict the dynamics of the Earth and environmental systems. The parameter-induced simulation crash (failure) problem is typical across most of these models despite considerable efforts that modellers have directed at model development and implementation over the last few decades. A simulation failure mainly occurs due to the violation of numerical stability conditions, non-robust numerical implementations, or errors in programming. However, the existing sampling-based analysis techniques such as global sensitivity analysis (GSA) methods, which require running these models under many configurations of parameter values, are ill equipped to effectively deal with model failures. To tackle this problem, we propose a new approach that allows users to cope with failed designs (samples) when performing GSA without rerunning the entire experiment. This approach deems model crashes as missing data and uses strategies such as median substitution, single nearest-neighbor, or response surface modeling to fill in for model crashes. We test the proposed approach on a 10-parameter HBV-SASK (Hydrologiska Byråns Vattenbalansavdelning modified by the second author for educational purposes) rainfall–runoff model and a 111-parameter Modélisation Environmentale–Surface et Hydrologie (MESH) land surface–hydrology model. Our results show that response surface modeling is a superior strategy, out of the data-filling strategies tested, and can comply with the dimensionality of the model, sample size, and the ratio of the number of failures to the sample size. Further, we conduct a “failure analysis” and discuss some possible causes of the MESH model failure that can be used for future model improvement.
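The "failure analysis" mentioned here can be illustrated, under similar toy assumptions, by fitting an interpretable classifier to the success/failure labels to expose which parameter ranges coincide with crashes; the synthetic data and the choice of a shallow decision tree are illustrative, not the authors' method.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(1)
X = rng.uniform(size=(500, 10))                              # sampled parameter sets
crashed = (X[:, 2] > 0.9) | (rng.uniform(size=500) < 0.02)   # toy failure pattern

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, crashed)
print(export_text(clf, feature_names=[f"p{i}" for i in range(10)]))
# The printed splits show which parameter thresholds separate failed from
# successful runs, pointing at the regions of parameter space to inspect.
```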


2020
Vol 20 (7)
pp. 4047-4058
Author(s):  
Oliver Wild ◽  
Apostolos Voulgarakis ◽  
Fiona O'Connor ◽  
Jean-François Lamarque ◽  
Edmund M. Ryan ◽  
...  

Abstract. Projections of future atmospheric composition change and its impacts on air quality and climate depend heavily on chemistry–climate models that allow us to investigate the effects of changing emissions and meteorology. These models are imperfect as they rely on our understanding of the chemical, physical and dynamical processes governing atmospheric composition, on the approximations needed to represent these numerically, and on the limitations of the observations required to constrain them. Model intercomparison studies show substantial diversity in results that reflect underlying uncertainties, but little progress has been made in explaining the causes of this diversity or in identifying the weaknesses in process understanding or representation that could lead to improved models and to better scientific understanding. Global sensitivity analysis provides a valuable method of identifying and quantifying the main causes of diversity in current models. For the first time, we apply Gaussian process emulation with three independent global chemistry-transport models to quantify the sensitivity of ozone and hydroxyl radicals (OH) to important climate-relevant variables, poorly characterised processes and uncertain emissions. We show a clear sensitivity of tropospheric ozone to atmospheric humidity and precursor emissions that is similar across the models, but we find large differences between models in methane lifetime, highlighting substantial differences in the sensitivity of OH to primary and secondary production. This approach allows us to identify key areas where model improvements are required while providing valuable new insight into the processes driving tropospheric composition change.
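A schematic of the cross-model comparison: the same emulation and variance-based analysis is applied to each chemistry-transport model in turn, and the resulting first-order sensitivity indices are compared. Everything below (factor names, bounds, and the linear toy responses standing in for the three models) is an illustrative assumption.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from SALib.sample import saltelli
from SALib.analyze import sobol

problem = {
    "num_vars": 3,
    "names": ["humidity", "NOx_emissions", "CH4_abundance"],  # assumed factors
    "bounds": [[0.5, 1.5], [0.5, 1.5], [0.5, 1.5]],            # scaling factors
}

rng = np.random.default_rng(0)
design = rng.uniform(0.5, 1.5, size=(60, 3))                   # shared training design

# Toy stand-ins for each model's ozone response to the three factors.
model_runs = {
    "model_A": design @ np.array([1.0, 0.6, 0.3]),
    "model_B": design @ np.array([0.8, 0.9, 0.2]),
    "model_C": design @ np.array([1.1, 0.4, 0.5]),
}

X_eval = saltelli.sample(problem, 512, calc_second_order=False)
for name, y_train in model_runs.items():
    gp = GaussianProcessRegressor(normalize_y=True).fit(design, y_train)
    S = sobol.analyze(problem, gp.predict(X_eval), calc_second_order=False)
    print(name, dict(zip(problem["names"], np.round(S["S1"], 2))))
```

Differences between the printed index sets would indicate where the emulated models disagree about which factors drive the response.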

