A Compensatory Approach of the Fixed Localization in EnKF

2014 · Vol. 142 (10) · pp. 3713–3733 · Author(s): Xinrong Wu, Wei Li, Guijun Han, Shaoqing Zhang, Xidong Wang

Abstract While fixed covariance localization can greatly increase the reliability of the background error covariance in filtering by suppressing the long-distance spurious correlations estimated from a finite ensemble, it may degrade the assimilation quality in an ensemble Kalman filter (EnKF) as a result of restricted longwave information. Tuning an optimal cutoff distance is usually very expensive and time consuming, especially for a general circulation model (GCM). Here the authors present an approach to compensate for this shortcoming of fixed localization. At each analysis step, after the standard EnKF is done, a multiple-scale analysis technique is used to extract longwave information from the observational residual (computed relative to the EnKF ensemble mean). Within a biased twin-experiment framework consisting of a global barotropic spectral model and an idealized observing system, the performance of the new method is examined. Compared to a standard EnKF, the hybrid method is superior when an overly small or large cutoff distance is used, and it depends less on the cutoff distance. The new scheme is also able to improve short-term weather forecasts, especially when an overly large cutoff distance is used. Sensitivity studies show that caution should be taken when the new scheme is applied to a dense observing system with an overly small cutoff distance in filtering. In addition, the new scheme has a computational cost nearly equivalent to that of the standard EnKF; thus, it is particularly suitable for GCM applications.
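
The fixed localization discussed above is typically implemented by tapering the ensemble-estimated covariance with a distance-dependent function controlled by the cutoff distance. The following is a minimal sketch of that idea on a 1-D periodic domain, using a Gaussian-shaped taper rather than the Gaspari-Cohn function often used in practice; all names and sizes are illustrative, not the authors' code.

```python
import numpy as np

def ensemble_covariance(X):
    """Sample covariance from an ensemble X of shape (n_state, n_members)."""
    anomalies = X - X.mean(axis=1, keepdims=True)
    return anomalies @ anomalies.T / (X.shape[1] - 1)

def localization_matrix(n, cutoff, domain_length=1.0):
    """Distance-dependent taper on a periodic 1-D grid (Gaussian shape here)."""
    x = np.linspace(0.0, domain_length, n, endpoint=False)
    d = np.abs(x[:, None] - x[None, :])
    d = np.minimum(d, domain_length - d)      # periodic distance
    return np.exp(-0.5 * (d / cutoff) ** 2)

# The Schur (element-wise) product suppresses long-distance spurious correlations,
# at the price of also damping genuine longwave covariance -- the shortcoming the
# multiple-scale analysis step is meant to compensate for.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))            # 100 state variables, 20 members
B_localized = localization_matrix(100, cutoff=0.1) * ensemble_covariance(X)
```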

2015 · Vol. 143 (11) · pp. 4714–4735 · Author(s): Xinrong Wu, Wei Li, Guijun Han, Lianxin Zhang, Caixia Shao, ...

Abstract Although fixed covariance localization in the ensemble Kalman filter (EnKF) can significantly increase the reliability of the background error covariance, it has been demonstrated that extreme impact radii can cause the EnKF to lose some useful information. Tuning an optimal impact radius, on the other hand, is always difficult for a general circulation model. The EnKF multiscale analysis (MSA) approach was previously presented to compensate for this drawback of fixed localization. As a follow-up, this study presents an adaptive compensatory approach to further improve the performance of the EnKF-MSA. The new method adaptively triggers a multigrid analysis (MGA) to extract multiscale information from the observational residual after the EnKF without inflation is completed at each analysis step. Within a biased twin-experiment framework consisting of a barotropic spectral model and an idealized observing system, the performance of the adaptive method is examined. Results show that the MGA reduces the computational cost of the MSA by 93%. In terms of assimilation quality, the adaptive method yields an incremental improvement over the EnKF-MSA: for moderate impact radii, the adaptive EnKF-MGA reduces to the EnKF without inflation, which is better than the EnKF-MSA. The proposed scheme works for a broader range of impact radii than the standard EnKF (i.e., the EnKF with inflation). For extreme impact radii, the adaptive EnKF-MGA produces smaller assimilation errors than the standard EnKF and shortens the spinup period by 53%. In addition, the computational cost of the MGA is negligible relative to that of the standard EnKF.
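
As a rough illustration of the coarse-to-fine idea behind a multigrid analysis of the observational residual, the sketch below fits the residual first on a coarse grid (capturing the longest waves), subtracts that part, and lets successively finer grids absorb what remains. The piecewise-linear basis, the least-squares fits, and all names are assumptions made for this toy example, not the authors' MGA implementation.

```python
import numpy as np

def basis(nodes, x_obs):
    """Piecewise-linear interpolation basis from coarse-grid nodes to obs locations."""
    A = np.zeros((x_obs.size, nodes.size))
    for j in range(nodes.size):
        hat = np.zeros(nodes.size)
        hat[j] = 1.0
        A[:, j] = np.interp(x_obs, nodes, hat)
    return A

def multigrid_analysis(x_obs, residual, x_model, levels=(3, 6, 12, 24)):
    """Coarse-to-fine fit of the observation-minus-analysis residual."""
    correction = np.zeros_like(x_model)
    r = residual.copy()
    for n_nodes in levels:                    # coarsest grid first: long waves
        nodes = np.linspace(x_model.min(), x_model.max(), n_nodes)
        A = basis(nodes, x_obs)
        coeff, *_ = np.linalg.lstsq(A, r, rcond=None)
        r = r - A @ coeff                     # pass what is left to finer grids
        correction += np.interp(x_model, nodes, coeff)
    return correction

# usage: the multigrid correction is added to the EnKF ensemble-mean analysis
rng = np.random.default_rng(1)
x_model = np.linspace(0.0, 1.0, 200)
x_obs = np.sort(rng.uniform(0.0, 1.0, 40))
residual = np.sin(2 * np.pi * x_obs) + 0.1 * rng.standard_normal(40)
correction = multigrid_analysis(x_obs, residual, x_model)
```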


2007 · Vol. 135 (11) · pp. 3785–3807 · Author(s): A. Bellucci, S. Masina, P. DiPietro, A. Navarra

Abstract In this paper results from the application of an ocean data assimilation (ODA) system, combining a multivariate reduced-order optimal interpolator (OI) scheme with a global ocean general circulation model (OGCM), are described. The present ODA system, designed to assimilate in situ temperature and salinity observations, has been used to produce ocean reanalyses for the 1962–2001 period. The impact of assimilating observed hydrographic data on the ocean mean state and temporal variability is evaluated. A special focus of this work is on the ODA system skill in reproducing a realistic ocean salinity state. Results from a hierarchy of different salinity reanalyses, using varying combinations of assimilated data and background error covariance structures, are described. The impact of the space and time resolution of the background error covariance parameterization on salinity is addressed.


Sensors · 2020 · Vol. 20 (3) · pp. 877 · Author(s): Elias David Nino-Ruiz, Alfonso Mancilla-Herrera, Santiago Lopez-Restrepo, Olga Quintero-Montoya

This paper proposes an efficient and practical implementation of the Maximum Likelihood Ensemble Filter via a Modified Cholesky decomposition (MLEF-MC). The method works as follows: from an ensemble of model realizations, a well-conditioned and full-rank square-root approximation of the background error covariance matrix is obtained. This square-root approximation serves as a control space in which analysis increments are computed via Line-Search (LS) optimization. We theoretically prove the convergence of the MLEF-MC. Experimental simulations were performed using an Atmospheric General Circulation Model (AT-GCM) and a highly nonlinear observation operator. The results reveal that the proposed method can obtain posterior error estimates with reasonable accuracy in terms of ℓ2 error norms. Furthermore, our analysis estimates are similar to those of the MLEF with large ensemble sizes and full observational networks.
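
A minimal sketch of the ensemble-based modified Cholesky estimator (in the spirit of Bickel and Levina's approach) that gives the MLEF-MC its well-conditioned, full-rank factorization of the background precision matrix: each state component is regressed on its local "predecessor" components within a radius, yielding a unit-triangular factor and a diagonal factor. The function name, the plain least-squares regressions, and the 1-D ordering are illustrative assumptions; the control-space construction and line-search step are not shown.

```python
import numpy as np

def precision_via_modified_cholesky(X, radius=5):
    """Estimate B^{-1} ~ L^T D^{-1} L from an ensemble X (n_state, n_members) by
    regressing each component on its preceding neighbours within a localization
    radius; L is unit lower triangular and D holds the residual variances."""
    n, m = X.shape
    A = X - X.mean(axis=1, keepdims=True)     # ensemble anomalies
    L = np.eye(n)
    d = np.empty(n)
    d[0] = A[0].var(ddof=1)
    for i in range(1, n):
        pred = A[max(0, i - radius):i]        # local predecessors only
        beta, *_ = np.linalg.lstsq(pred.T, A[i], rcond=None)
        resid = A[i] - beta @ pred
        L[i, max(0, i - radius):i] = -beta
        d[i] = resid.var(ddof=1)
    return L.T @ np.diag(1.0 / d) @ L         # well-conditioned, full rank

# usage with a toy ensemble (40 state variables, 10 members)
rng = np.random.default_rng(0)
Binv = precision_via_modified_cholesky(rng.standard_normal((40, 10)), radius=3)
```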


2008 · Vol. 1 (1) · pp. 53–68 · Author(s): R. S. Smith, J. M. Gregory, A. Osprey

Abstract. FAMOUS is an ocean-atmosphere general circulation model of low resolution, capable of simulating approximately 120 years of model climate per wallclock day using current high performance computing facilities. It uses most of the same code as HadCM3, a widely used climate model of higher resolution and computational cost, and has been tuned to reproduce the same climate reasonably well. FAMOUS is useful for climate simulations where the computational cost makes the application of HadCM3 unfeasible, either because of the length of simulation or the size of the ensemble desired. We document a number of scientific and technical improvements to the original version of FAMOUS. These improvements include changes to the parameterisations of ozone and sea-ice which alleviate a significant cold bias from high northern latitudes and the upper troposphere, and the elimination of volume-averaged drifts in ocean tracers. A simple model of the marine carbon cycle has also been included. A particular goal of FAMOUS is to conduct millennial-scale paleoclimate simulations of Quaternary ice ages; to this end, a number of useful changes to the model infrastructure have been made.


2019 · Vol. 9 · pp. A30 · Author(s): Sean Elvidge, Matthew J. Angling

The Advanced Ensemble electron density (Ne) Assimilation System (AENeAS) is a new data assimilation model of the ionosphere/thermosphere. The background model is provided by the Thermosphere Ionosphere Electrodynamics General Circulation Model (TIE-GCM) and the assimilation uses the local ensemble transform Kalman filter (LETKF). An outline derivation of the LETKF is provided and the equations are presented in a form analogous to the classic Kalman filter. An enhancement to the efficient LETKF implementation to reduce computational cost is also described. In a 3-day test in June 2017, AENeAS exhibits a total electron content (TEC) RMS error of 2.1 TECU, compared with 5.5 TECU for NeQuick and 6.8 TECU for TIE-GCM (with an NeQuick topside).
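
For reference, the LETKF analysis step mentioned above can be summarized, for a single local region, roughly as in the sketch below (following the standard Hunt et al. 2007 formulation). This is an illustration rather than AENeAS code, the efficiency enhancement described in the paper is not included, and `inflation` is a simple multiplicative covariance inflation factor.

```python
import numpy as np

def letkf_analysis(Xb, Yb, yo, R, inflation=1.0):
    """One local LETKF analysis.
    Xb: background ensemble (n_state, k); Yb: ensemble mapped to observation
    space (n_obs, k); yo: observations (n_obs,); R: obs error covariance."""
    k = Xb.shape[1]
    xb_mean = Xb.mean(axis=1)
    yb_mean = Yb.mean(axis=1)
    Xp = Xb - xb_mean[:, None]                # state perturbations
    Yp = Yb - yb_mean[:, None]                # obs-space perturbations
    C = Yp.T @ np.linalg.inv(R)
    Pa_tilde = np.linalg.inv((k - 1) / inflation * np.eye(k) + C @ Yp)
    w_mean = Pa_tilde @ C @ (yo - yb_mean)    # mean weight vector
    # symmetric square root of (k-1) * Pa_tilde gives the perturbation weights
    evals, evecs = np.linalg.eigh((k - 1) * Pa_tilde)
    W = evecs @ np.diag(np.sqrt(np.maximum(evals, 0.0))) @ evecs.T
    Wa = W + w_mean[:, None]                  # each column is one member's weights
    return xb_mean[:, None] + Xp @ Wa         # analysis ensemble
```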


2015 · Vol. 2015 · pp. 1–16 · Author(s): Guijun Han, Xinrong Wu, Shaoqing Zhang, Zhengyu Liu, Ionel Michael Navon, ...

Coupling parameter estimation (CPE), which uses observations to estimate the parameters in a coupled model through error covariance between variables residing in different media, may increase the consistency of estimated parameters in an air-sea coupled system. However, it is very challenging to accurately evaluate the error covariance between such variables because of the different characteristic time scales at which flows vary in different media. With a simple Lorenz-atmosphere and slab ocean coupled system that characterizes the interaction of two-timescale media in a coupled "climate" system, this study explores the feasibility of CPE with four-dimensional variational analysis and the ensemble Kalman filter within a perfect observing system simulation experiment framework. It is found that both algorithms can improve the representation of air-sea coupling processes through CPE compared to state estimation only. These simple-model studies provide some insights for implementing parameter estimation with a coupled general circulation model to improve climate estimation and prediction initialization.
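
In ensemble-based CPE, the parameters are commonly appended to the state vector so that the ensemble cross covariance between the parameters and the model equivalents of the observations, possibly residing in a different medium, carries observational information into the parameter update. A minimal sketch of such an update is given below; all names are illustrative, and details of the authors' 4D-Var and EnKF implementations (e.g., observation perturbation or a deterministic square-root form, inflation, and the treatment of the two time scales) are omitted.

```python
import numpy as np

def enkf_parameter_update(params_ens, model_equiv_ens, yo, obs_err_var):
    """Update an ensemble of model parameters from observations via the
    ensemble cross covariance (state augmentation).
    params_ens: (n_param, k); model_equiv_ens: (n_obs, k); yo: (n_obs,)."""
    k = params_ens.shape[1]
    Pp = params_ens - params_ens.mean(axis=1, keepdims=True)
    Hp = model_equiv_ens - model_equiv_ens.mean(axis=1, keepdims=True)
    cov_py = Pp @ Hp.T / (k - 1)              # parameter-observation covariance
    cov_yy = Hp @ Hp.T / (k - 1) + np.diag(obs_err_var)
    K = cov_py @ np.linalg.inv(cov_yy)        # Kalman gain for the parameters
    # in practice the observations would be perturbed (or a deterministic
    # square-root update used) to maintain parameter ensemble spread
    innovations = yo[:, None] - model_equiv_ens
    return params_ens + K @ innovations
```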


2013 · Vol. 10 (6) · pp. 6963–7001 · Author(s): S. Barthélémy, S. Ricci, O. Pannekoucke, O. Thual, P. O. Malaterre

Abstract. This study describes the emulation of an Ensemble Kalman Filter (EnKF) algorithm on a 1-D flood wave propagation model. The model is forced at the upstream boundary with a random variable with Gaussian statistics and a Gaussian-shaped correlation function in time. In the case without assimilation, this allows for an analytical study of the covariance functions of the propagated signal anomaly, which is validated numerically with an ensemble method. In the case with assimilation at one observation point, where synthetic observations are generated by adding an error to a true state, the dynamics of the background error covariance functions are not straightforward and a numerical approach using an EnKF algorithm is preferred. First, these numerical experiments show that both the background error variance and the correlation length scale are reduced at the observation point, and this reduction is propagated downstream by the dynamics of the model. Then it is shown that applying a Best Linear Unbiased Estimator (BLUE) algorithm with the background error covariance matrix to which the EnKF converges provides the same results as the EnKF but at a lower computational cost, thus allowing for the use of data assimilation in the context of real-time flood forecasting. Moreover, it was demonstrated that the reduction of the background error correlation length scale and variance at the observation point depends on the observation error statistics. This feature is quantified by abacuses built from linear regressions over a limited set of EnKF experiments. These abacuses, which describe the background error variance and correlation length scale in the neighborhood of the observation point, combined with analytical expressions that describe them away from the observation point, provide parametrized models for the variance and the correlation length scale. Using this parametrized variance and correlation length scale with a diffusion operator makes it possible to model the converged background error covariance matrix of the EnKF without actually integrating the EnKF algorithm. This method was finally applied to a case with two observation points with different error statistics. The results of this emulated EnKF (EEnKF), in terms of background error variance, correlation length scale, and analyzed water level, are close to those of the EnKF but at a significantly reduced computational cost.
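
The emulated EnKF described above amounts to reusing a fixed, converged background error covariance matrix inside a BLUE analysis instead of propagating an ensemble. A minimal sketch of that analysis step, assuming a linear observation operator and illustrative names, is shown below.

```python
import numpy as np

def blue_update(xb, B, H, yo, R):
    """Best Linear Unbiased Estimator analysis with a fixed background error
    covariance B (here, the matrix to which the EnKF has converged).
    xb: background state (n,); H: observation operator (p, n); yo: obs (p,);
    R: observation error covariance (p, p)."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # Kalman gain
    xa = xb + K @ (yo - H @ xb)                    # analysis state
    A = (np.eye(xb.size) - K @ H) @ B              # analysis error covariance
    return xa, A
```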


2020 · Author(s): Guillaume Le Gland, Sergio M. Vallina, S. Lan Smith, Pedro Cermeño

Abstract. Diversity plays a key role in the adaptive capacities of marine ecosystems to environmental changes. However, modeling phytoplankton trait diversity remains challenging because of the strength of the competitive exclusion of sub-optimal phenotypes. Trait diffusion (TD) is a recently developed approach to sustain diversity in plankton models by allowing the evolution of functional traits at ecological timescales. In this study, we present a model for Simulating Plankton Evolution with Adaptive Dynamics (SPEAD), in which phytoplankton phenotypes characterized by two traits, nitrogen half-saturation constant and optimal temperature, can mutate at each generation through the TD mechanism. SPEAD does not resolve the different phenotypes as discrete entities; it instead computes six aggregate properties: total phytoplankton biomass, the mean value of each trait, the trait variances, and the inter-trait covariance of a single population in a continuous trait space. SPEAD therefore resolves the dynamics of the population's continuous trait distribution by solving its statistical moments, where the variances of trait values represent the diversity of ecotypes. The ecological model is coupled to a vertically resolved (1-D) physical environment, so the adaptive dynamics of the simulated phytoplankton population are driven by seasonal variations in vertical mixing, nutrient concentration, water temperature, and solar irradiance. The simulated bulk properties are validated against observations from the Bermuda Atlantic Time-series Study (BATS) in the Sargasso Sea. We find that moderate mutation rates sustain trait diversity at decadal timescales and soften the almost total inter-trait correlation induced by the environment alone, without reducing annual primary production or promoting permanently maladapted phenotypes, as occurs with high mutation rates. To evaluate the performance of the continuous-trait approximation, we also compare the solutions of SPEAD to those of a classical discrete-entity approach, with both approaches including TD as a mechanism to sustain trait variance. We find only minor discrepancies between the continuous model SPEAD and the discrete model, with the computational cost of SPEAD being two orders of magnitude lower. SPEAD should therefore be an ideal eco-evolutionary plankton model to couple to a general circulation model (GCM) of the global ocean.
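
The trait diffusion mechanism can be illustrated with a toy discretized-trait model of the kind the abstract compares against: diffusion along the trait axis plays the role of mutation and prevents the trait variance from collapsing under competitive exclusion, while the aggregate properties that SPEAD tracks directly are recovered as moments of the distribution. The sketch below uses a single trait; the names, rates, and simple growth function are assumptions for illustration, not the SPEAD equations.

```python
import numpy as np

def step_trait_diffusion(p, growth, nu, dx, dt):
    """One time step of a discrete-trait toy model: trait-dependent growth
    (selection) plus diffusion along the trait axis (mutation), with a
    periodic trait axis for simplicity."""
    lap = (np.roll(p, 1) + np.roll(p, -1) - 2.0 * p) / dx**2
    return np.maximum(p + dt * (growth * p + nu * lap), 0.0)

def moments(p, x, dx):
    """Aggregate properties tracked by a moment-based model such as SPEAD:
    total biomass, mean trait, and trait variance."""
    biomass = np.sum(p) * dx
    mean = np.sum(x * p) * dx / biomass
    var = np.sum((x - mean) ** 2 * p) * dx / biomass
    return biomass, mean, var

# usage: selection toward an optimal trait x_opt; mutation (nu > 0) sustains variance
x = np.linspace(0.0, 1.0, 101)
dx = x[1] - x[0]
p = np.full_like(x, 1.0)
x_opt, nu, dt = 0.3, 1e-4, 0.01
growth = 1.0 - 10.0 * (x - x_opt) ** 2        # trait-dependent net growth rate
for _ in range(20000):
    p = step_trait_diffusion(p, growth, nu, dx, dt)
biomass, mean_trait, trait_var = moments(p, x, dx)
```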


2018 · Vol. 115 (39) · pp. 9684–9689 · Author(s): Stephan Rasp, Michael S. Pritchard, Pierre Gentine

The representation of nonlinear subgrid processes, especially clouds, has been a major source of uncertainty in climate models for decades. Cloud-resolving models better represent many of these processes and can now be run globally but only for short-term simulations of at most a few years because of computational limitations. Here we demonstrate that deep learning can be used to capture many advantages of cloud-resolving modeling at a fraction of the computational cost. We train a deep neural network to represent all atmospheric subgrid processes in a climate model by learning from a multiscale model in which convection is treated explicitly. The trained neural network then replaces the traditional subgrid parameterizations in a global general circulation model in which it freely interacts with the resolved dynamics and the surface-flux scheme. The prognostic multiyear simulations are stable and closely reproduce not only the mean climate of the cloud-resolving simulation but also key aspects of variability, including precipitation extremes and the equatorial wave spectrum. Furthermore, the neural network approximately conserves energy despite not being explicitly instructed to. Finally, we show that the neural network parameterization generalizes to new surface forcing patterns but struggles to cope with temperatures far outside its training manifold. Our results show the feasibility of using deep learning for climate model parameterization. In a broader context, we anticipate that data-driven Earth system model development could play a key role in reducing climate prediction uncertainty in the coming decade.
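
As a rough sketch of the approach, not the authors' architecture, the snippet below sets up a small feedforward network that maps a coarse-grained column state to subgrid tendencies, to be trained offline against output from a multiscale simulation with explicit convection and then called in place of the conventional parameterization. Layer sizes, variable names, and the training loop are assumptions.

```python
import torch
from torch import nn

# Toy emulator of subgrid tendencies: maps a coarse-grained atmospheric column
# (temperature, humidity, ... stacked into one vector) to the subgrid heating
# and moistening tendencies a conventional parameterization would supply.
n_inputs, n_outputs = 94, 65                  # illustrative sizes only

subgrid_net = nn.Sequential(
    nn.Linear(n_inputs, 256), nn.ReLU(),
    nn.Linear(256, 256), nn.ReLU(),
    nn.Linear(256, n_outputs),
)

def train(net, columns, tendencies, epochs=10, lr=1e-3):
    """Supervised training against tendencies diagnosed from a multiscale
    simulation in which convection is treated explicitly."""
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(net(columns), tendencies)
        loss.backward()
        opt.step()
    return net

# At run time the trained network replaces the traditional parameterization:
# d_state/dt = resolved_dynamics(state) + subgrid_net(column_features(state))
```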


2009 · Vol. 22 (11) · pp. 2850–2870 · Author(s): Shu-Chih Yang, Christian Keppenne, Michele Rienecker, Eugenia Kalnay

Abstract Coupled bred vectors (BVs) generated from the NASA Global Modeling and Assimilation Office (GMAO) coupled general circulation model are designed to capture the uncertainties related to slowly varying coupled instabilities. Two applications of the BVs are investigated in this study. First, the coupled BVs are used as initial perturbations for ensemble-forecasting purposes. Results show that the seasonal-to-interannual variability forecast skill can be improved when the oceanic and atmospheric perturbations are initialized with coupled BVs. The impact is particularly significant when the forecasts are initialized from the cold phase of tropical Pacific SST (e.g., August and November), because at these times the early coupled model errors, not accounted for in the BVs, are small. Second, the structure of the BVs is applied to construct hybrid background error covariances carrying flow-dependent information for the ocean data assimilation. Results show that the accuracy of the ocean analyses is improved when Gaussian background covariances are supplemented with a term obtained from the BVs. The improvement is especially noticeable for the salinity field.
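
Bred vectors are obtained by repeatedly rescaling the difference between a perturbed and a control integration, so that the perturbation comes to align with the growing (here, slowly varying coupled) instabilities. A generic sketch of the breeding cycle follows; the function and argument names are illustrative, and the GMAO system's particular choice of rescaling norm for coupled breeding is not represented.

```python
import numpy as np

def breed(model_step, control0, n_cycles, rescale_norm, dt_cycle, seed=0):
    """Generate a bred vector: run a control and a perturbed trajectory and
    periodically rescale their difference back to a fixed amplitude.
    `model_step(state, dt)` advances a state by one breeding interval."""
    rng = np.random.default_rng(seed)
    control = control0.copy()
    perturbed = control0 + rescale_norm * rng.standard_normal(control0.shape)
    for _ in range(n_cycles):
        control = model_step(control, dt_cycle)
        perturbed = model_step(perturbed, dt_cycle)
        bv = perturbed - control
        bv *= rescale_norm / np.linalg.norm(bv)   # rescale, keep the structure
        perturbed = control + bv                  # re-seed the perturbed run
    return bv                                     # bred vector after spinup

# usage with a toy linear model step (illustrative only)
A = np.array([[1.01, 0.3], [0.0, 0.97]])
bv = breed(lambda x, dt: A @ x, np.array([1.0, 1.0]),
           n_cycles=50, rescale_norm=0.05, dt_cycle=1.0)
```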

