Prospects for improving savanna biophysical models by using multiple-constraints model-data assimilation methods

2005 ◽  
Vol 53 (7) ◽  
pp. 689 ◽  
Author(s):  
Damian J. Barrett ◽  
Michael J. Hill ◽  
Lindsay B. Hutley ◽  
Jason Beringer ◽  
Johnny H. Xu ◽  
...  

A ‘multiple-constraints’ model-data assimilation scheme using a diverse range of data types offers the prospect of improved predictions of carbon and water budgets at regional scales. Global savannas, occupying more than 12% of total land area, are an economically and ecologically important biome but are relatively poorly covered by observations. In Australia, savannas are particularly poorly sampled across their extent, despite their amenability to ground-based measurement (largely intact vegetation, low relief and accessible canopies). In this paper, we describe the theoretical and practical requirements of integrating three types of data (ground-based observations, measurements of CO2/H2O fluxes and remote-sensing data) into a terrestrial carbon, water and energy budget model by using simulated observations for a hypothetical site of given climatic and vegetation conditions. The simulated data mimic specific errors, biases and uncertainties inherent in real data. Retrieval of model parameters and initial conditions by the assimilation scheme using only one data type led to poor representation of modelled plant-canopy production and ecosystem respiration fluxes because of errors and bias inherent in the underlying data. Combining two or more types of data improved parameter retrieval; however, the full complement of data types was necessary before all measurement errors and biases in the data were minimised. This demonstration illustrates the potential of these techniques to improve the performance of ecosystem biophysical models by examining consistency among datasets and thereby reducing uncertainty in model parameters and predictions. Furthermore, by using existing available data, it is possible to design field campaigns whose sampling network maximises uncertainty reduction for the available funding. Application of these techniques will not only help fill knowledge gaps in the carbon and water dynamics of savannas but will also provide better information for decision-support systems addressing natural-resource management problems in this biome worldwide.
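
As a rough illustration of how a multiple-constraints scheme weighs several data streams against a single process model, the sketch below fits a toy two-parameter canopy model to three simulated data types (flux-tower-like, remote-sensing-like and ground-based) through one weighted least-squares cost function, and compares single-constraint with multiple-constraint retrieval. The model, parameter names, error magnitudes and weights are invented for illustration and are not those of the scheme described in the abstract.

```python
# Toy "multiple-constraints" parameter retrieval: hypothetical model and data.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
t = np.arange(365.0)                                   # one year of daily steps

def toy_model(params, t):
    """Toy canopy model: production and respiration driven by two parameters."""
    a_gpp, r0 = params
    gpp = a_gpp * (1.0 + 0.5 * np.sin(2 * np.pi * t / 365.0))   # gross production
    resp = r0 * (1.0 + 0.3 * np.cos(2 * np.pi * t / 365.0))     # ecosystem respiration
    nee = resp - gpp                                            # net ecosystem exchange
    lai = 0.1 * np.cumsum(gpp) / 365.0                          # crude remote-sensing proxy
    return nee, lai, gpp.sum()

# Simulated "observations" from known true parameters, each with its own error model
true = np.array([6.0, 3.0])
nee_true, lai_true, npp_true = toy_model(true, t)
obs_flux = nee_true + rng.normal(0.0, 0.8, t.size)     # eddy-flux-like data, noisy
obs_lai  = lai_true[::16] * 1.1                        # remote sensing, multiplicative bias
obs_npp  = npp_true + rng.normal(0.0, 50.0)            # sparse ground-based estimate

def cost(params, use=("flux", "rs", "ground")):
    """Weighted least-squares cost over the selected data streams."""
    nee, lai, npp = toy_model(params, t)
    j = 0.0
    if "flux" in use:
        j += np.sum((nee - obs_flux) ** 2) / 0.8 ** 2
    if "rs" in use:
        j += np.sum((lai[::16] - obs_lai) ** 2) / 0.05 ** 2
    if "ground" in use:
        j += (npp - obs_npp) ** 2 / 50.0 ** 2
    return j

# Retrieval with one constraint versus all three constraints
for streams in [("flux",), ("flux", "rs", "ground")]:
    fit = minimize(cost, x0=np.array([3.0, 1.0]), args=(streams,), method="Nelder-Mead")
    print(streams, fit.x)
```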

2020 ◽  
Vol 24 (4) ◽  
pp. 1677-1689 ◽  
Author(s):  
Matthew J. Knowling ◽  
Jeremy T. White ◽  
Catherine R. Moore ◽  
Pawel Rakowski ◽  
Kevin Hayley

Abstract. It has been advocated that history matching numerical models to a diverse range of observation data types, particularly including environmental tracer concentrations and their interpretations and derivatives (e.g., mean age), constitutes an effective and appropriate means to improve model forecast reliability. This study presents two regional-scale modeling case studies that directly and rigorously assess the value of discrete tritium concentration observations and tritium-derived mean residence time (MRT) estimates in two decision-support contexts; “value” is measured herein as both the improvement (or otherwise) in the reliability of forecasts through uncertainty variance reduction and bias minimization as a result of assimilating tritium or tritium-derived MRT observations. The first case study (Heretaunga Plains, New Zealand) utilizes a suite of steady-state and transient flow models and an advection-only particle-tracking model to evaluate the worth of tritium-derived MRT estimates relative to hydraulic potential, spring discharge and river–aquifer exchange flux observations. The worth of MRT observations is quantified in terms of the change in the uncertainty surrounding ecologically sensitive spring discharge forecasts via first-order second-moment (FOSM) analyses. The second case study (Hauraki Plains, New Zealand) employs paired simple–complex transient flow and transport models to evaluate the potential for assimilation-induced bias in simulated surface-water nitrate discharge to an ecologically sensitive estuary system; formal data assimilation of tritium observations is undertaken using an iterative ensemble smoother. The results of these case studies indicate that, for the decision-relevant forecasts considered, tritium observations are of variable benefit and may induce damaging bias in forecasts; these biases are a result of an imperfect model's inability to properly and directly assimilate the rich information content of the tritium observations. The findings of this study challenge the advocacy of the increasing use of tracers, and of diverse data types more generally, whenever environmental model data assimilation is undertaken with imperfect models. This study also highlights the need for improved imperfect-model data assimilation strategies. While these strategies will likely require increased model complexity (including advanced discretization, processes and parameterization) to allow for appropriate assimilation of rich and diverse data types that operate across a range of spatial and temporal scales commensurate with a forecast of management interest, it is critical that increased model complexity does not preclude the application of formal data assimilation and uncertainty quantification techniques due to model instability and excessive run times.
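
For readers unfamiliar with the first-order second-moment (FOSM) data-worth calculation mentioned above, the following sketch shows its generic linear-Gaussian form: the posterior parameter covariance is a Schur-complement update of the prior, and the worth of an observation group (here, a pretend set of MRT data) is the additional forecast-variance reduction it provides. All matrices are random placeholders, not quantities from the Heretaunga Plains model.

```python
# Generic FOSM data-worth sketch with synthetic sensitivities and covariances.
import numpy as np

rng = np.random.default_rng(1)
n_par, n_obs = 20, 30

J = rng.normal(size=(n_obs, n_par))          # observation sensitivities (Jacobian)
y = rng.normal(size=n_par)                   # forecast sensitivity vector
C_p = np.eye(n_par)                          # prior parameter covariance
obs_var = np.full(n_obs, 0.1)                # observation noise variances
is_mrt = np.arange(n_obs) < 5                # pretend the first 5 rows are MRT data

def forecast_variance(rows):
    """Posterior forecast variance after assimilating the selected observation rows."""
    Jr, Cd = J[rows], np.diag(obs_var[rows])
    gain = C_p @ Jr.T @ np.linalg.inv(Jr @ C_p @ Jr.T + Cd)
    C_post = C_p - gain @ Jr @ C_p           # Schur-complement update of the prior
    return float(y @ C_post @ y)

prior_var = float(y @ C_p @ y)
with_mrt = forecast_variance(np.ones(n_obs, dtype=bool))
without_mrt = forecast_variance(~is_mrt)
print(f"prior {prior_var:.3f}  all obs {with_mrt:.3f}  without MRT {without_mrt:.3f}")
# The "worth" of the MRT group is the extra variance reduction it buys:
print(f"MRT data worth: {without_mrt - with_mrt:.3f}")
```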


2020 ◽  
Author(s):  
Arthur Filoche ◽  
Julien Brajard ◽  
Anastase Charantonis ◽  
Dominique Béréziat

The analogy between data assimilation and machine learning has already been shown and is still being investigated to address the problem of improving physics-based models. Although both techniques learn from data, machine learning focuses on inferring model parameters, while data assimilation concentrates on estimating hidden system states with the help of a dynamical model.

Neural networks, and more precisely ResNet-like architectures, can be seen as dynamical systems and numerical schemes, respectively. They are now considered state of the art for a wide range of spatio-temporal forecasting tasks. Training such networks, however, requires dense and representative data, which is rarely available in the Earth sciences. Data assimilation, on the other hand, offers a proper Bayesian framework for learning from partial, noisy and indirect observations. Each field can therefore benefit from the other, one providing a learnable class of dynamical models and the other dense data sets.

In this work, we benefit from the powerful and flexible automatic-differentiation tools provided by the deep learning community, which are well suited to variational data assimilation because they avoid explicit adjoint modelling. We use a hybrid model composed of two terms: the first is a numerical scheme derived from the discretisation of physics-based equations, and the second is a convolutional neural network representing the unresolved part of the dynamics. From the data assimilation point of view, the network can be seen as a particular parameterisation of the model error. We jointly learn this parameterisation and estimate the hidden system states within a variational data assimilation scheme. Indirectly, this also addresses the issue of incorporating physical knowledge into machine learning models.

We show that the hybrid model improves forecast skill compared with traditional data assimilation techniques. The generalisation of the method to different models and data will also be discussed.
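
A conceptual sketch of this hybrid formulation might look as follows: a known physical step plus a small convolutional network for the unresolved dynamics, with the initial state and the network weights optimised jointly under a 4D-Var-like cost, automatic differentiation supplying the gradient in place of a hand-coded adjoint. The toy advection model, network architecture and regularisation weights are assumptions made for illustration, not the authors' actual configuration.

```python
# Hypothetical hybrid physics + CNN model trained inside a 4D-Var-like cost.
import torch

torch.manual_seed(0)
nx, n_steps = 64, 10

def physics_step(x, c=1.0, dt=0.1, dx=1.0):
    """Resolved dynamics: explicit upwind step of periodic linear advection."""
    return x - c * dt / dx * (x - torch.roll(x, 1, dims=-1))

cnn = torch.nn.Sequential(                       # unresolved-dynamics correction
    torch.nn.Conv1d(1, 16, 5, padding=2), torch.nn.ReLU(),
    torch.nn.Conv1d(16, 1, 5, padding=2),
)

def hybrid_forecast(x0, n_steps):
    """Roll the hybrid model forward: physics step plus learned correction."""
    x, traj = x0, []
    for _ in range(n_steps):
        x = physics_step(x) + cnn(x.view(1, 1, -1)).view(-1)
        traj.append(x)
    return torch.stack(traj)

# Synthetic, partially observed "truth" with slightly different dynamics
grid = torch.linspace(0.0, 2.0 * torch.pi, nx)
obs_idx = torch.arange(0, nx, 8)
with torch.no_grad():
    xt, truth = torch.sin(grid), []
    for _ in range(n_steps):
        xt = physics_step(xt, c=1.3)
        truth.append(xt)
    obs = torch.stack(truth)[:, obs_idx] + 0.01 * torch.randn(n_steps, obs_idx.numel())

# Control variables: the initial state and the CNN weights, optimised jointly
x0 = torch.zeros(nx, requires_grad=True)
opt = torch.optim.Adam([x0, *cnn.parameters()], lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    traj = hybrid_forecast(x0, n_steps)
    j_obs = ((traj[:, obs_idx] - obs) ** 2).mean()   # observation misfit term
    j_b = 1e-3 * (x0 ** 2).mean()                    # crude background/regularisation term
    (j_obs + j_b).backward()                         # autodiff replaces explicit adjoint coding
    opt.step()
```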


2019 ◽  
Vol 147 (5) ◽  
pp. 1429-1445 ◽  
Author(s):  
Yuchu Zhao ◽  
Zhengyu Liu ◽  
Fei Zheng ◽  
Yishuai Jin

Abstract We performed parameter estimation in the Zebiak–Cane model for the real-world scenario using ensemble Kalman filter (EnKF) data assimilation and observational sea surface temperature and wind stress analyses. With real-world data assimilation in the coupled model, our study shows that the model parameters converge toward stable values. Furthermore, the new parameters improve real-world ENSO prediction skill, with the largest improvement coming from the parameter with the highest climate sensitivity (gam2), which controls the strength of the anomalous upwelling advection term in the SST equation. The improved prediction skill is contributed mainly by the improvement in the model dynamics and secondarily by the improvement in the initial field. Finally, geographically dependent parameter optimization further improves the prediction skill across all regions. Our study suggests that parameter optimization using ensemble data assimilation may provide an effective strategy for improving climate models and their real-world climate predictions in the future.
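
The generic mechanism behind EnKF parameter estimation is state augmentation: each ensemble member carries its own parameter value, and the analysis updates states and parameters together from their ensemble covariances with the observed quantity. The toy scalar example below illustrates that mechanism only; the model, the parameter (named gam merely by analogy) and all noise levels are invented and unrelated to the Zebiak–Cane setup.

```python
# Schematic parameter estimation by EnKF state augmentation (toy scalar model).
import numpy as np

rng = np.random.default_rng(2)
n_ens, n_cycles = 50, 300
gam_true, obs_err = 0.8, 0.1

def model_step(x, gam):
    """Toy scalar recursion standing in for the coupled model."""
    return gam * x + 0.1 * np.sin(x) + 0.5

# Augmented ensemble: each member carries a state and its own parameter value
x = rng.normal(0.0, 1.0, n_ens)
gam = rng.normal(0.5, 0.2, n_ens)            # poorly known prior for the parameter
x_truth = 1.0

for _ in range(n_cycles):
    # Forecast step: propagate each member with its own parameter
    x = model_step(x, gam) + rng.normal(0.0, 0.05, n_ens)
    x_truth = model_step(x_truth, gam_true) + rng.normal(0.0, 0.05)
    y = x_truth + rng.normal(0.0, obs_err)                 # observation of the state

    # Analysis step: Kalman update of state and parameter from ensemble statistics
    innov = y + rng.normal(0.0, obs_err, n_ens) - x        # perturbed-observation innovations
    denom = np.var(x, ddof=1) + obs_err ** 2
    k_x, k_gam = np.var(x, ddof=1) / denom, np.cov(gam, x)[0, 1] / denom
    x, gam = x + k_x * innov, gam + k_gam * innov

print(f"estimated parameter: {gam.mean():.3f} +/- {gam.std():.3f} (truth {gam_true})")
```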


2012 ◽  
Vol 48 (1) ◽  
Author(s):  
K. S. Barnhart ◽  
T. H. Illangasekare

2012 ◽  
Vol 12 (12) ◽  
pp. 3719-3732 ◽  
Author(s):  
L. Mediero ◽  
L. Garrote ◽  
A. Chavez-Jimenez

Abstract. High-performance computing offers considerable promise for enhancing the performance of real-time flood forecasting systems. In this paper, a real-time framework for probabilistic flood forecasting through data assimilation is presented. The distributed rainfall-runoff real-time interactive basin simulator (RIBS) model is selected to simulate the hydrological processes in the basin. Although the RIBS model is deterministic, it is run in a probabilistic way using the results of a calibration performed by the authors in previous work, which identifies the probability distribution functions that best characterise the most relevant model parameters. Adaptive techniques improve flood forecasts because the model can be adapted to observations in real time as new information becomes available. The new adaptive forecast model, which uses genetic programming as a data assimilation technique, is compared with the previously developed flood forecast model based on the calibration results. Both models are probabilistic in that they generate an ensemble of hydrographs, taking into account the different uncertainties inherent in any forecast process. The Manzanares River basin was selected as a case study; the process is computationally intensive because it requires simulation of many replicas of the ensemble in real time.
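
As a minimal illustration of running a deterministic model in a probabilistic way, the sketch below draws parameters from previously "calibrated" distributions and simulates one hydrograph per draw, yielding an ensemble from which exceedance probabilities can be computed. The linear-reservoir surrogate and the distribution choices are placeholders and stand in for neither the RIBS model nor its calibrated parameters.

```python
# Toy ensemble of hydrographs from a deterministic model with sampled parameters.
import numpy as np

rng = np.random.default_rng(3)
n_members, n_hours = 100, 48
rainfall = np.concatenate([np.full(12, 5.0), np.zeros(n_hours - 12)])  # mm/h burst

def linear_reservoir(rain, k, runoff_coef):
    """Very simple rainfall-runoff surrogate: single linear reservoir."""
    q, storage = np.zeros_like(rain), 0.0
    for t, r in enumerate(rain):
        storage += runoff_coef * r       # effective rainfall enters storage
        q[t] = storage / k               # outflow proportional to storage
        storage -= q[t]
    return q

# Sample the "calibrated" parameter distributions and build the ensemble
k_samples = rng.lognormal(mean=np.log(6.0), sigma=0.3, size=n_members)
c_samples = rng.beta(4.0, 6.0, size=n_members)
ensemble = np.array([linear_reservoir(rainfall, k, c)
                     for k, c in zip(k_samples, c_samples)])

# Probabilistic forecast summaries, e.g. probability of exceeding a warning threshold
peak = ensemble.max(axis=1)
print(f"median peak flow: {np.median(peak):.2f} mm/h, "
      f"P(peak > 2.0 mm/h) = {(peak > 2.0).mean():.2f}")
```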

