background error: Recently Published Documents

Total documents: 429 (five years: 110)
H-index: 42 (five years: 5)

Abstract We describe a method for the efficient generation of the covariance operators of a variational data assimilation scheme which is suited to implementation on a massively parallel computer. The elementary components of this scheme are what we call ‘beta filters’, since they are based on the same spatial profiles possessed by the symmetric beta distributions of probability theory. These approximately Gaussian (bell-shaped) polynomials blend smoothly to zero at the ends of finite intervals, which makes them better suited to parallelization than the quasi-Gaussian ‘recursive filters’ presently used in operations at NCEP. These basic elements are further combined at a hierarchy of different spatial scales into an overall multigrid structure formulated to preserve the self-adjoint attribute that any valid covariance operator must possess. This paper describes the underlying idea of the beta filter and discusses how generalized Helmholtz operators can be enlisted to weight the elementary contributions additively so that the covariance operators may exhibit realistic negative sidelobes, which are not easily obtained through the recursive filter paradigm. The main focus of the paper is the basic logistics of the multigrid structure by which more general covariance forms are synthesized from the basic quasi-Gaussian elements. We describe several ideas on how best to organize the computation, which led us to a generalization of this structure that performs efficiently on any rectangular arrangement of processing elements. Some simple idealized examples of the application of these ideas are given.
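
As a rough illustration of the idea, the following minimal sketch builds a compact-support, bell-shaped smoothing kernel from the symmetric beta profile (1 - x^2)^p and applies it as a self-adjoint convolution. The shape parameter, grid, and normalization are illustrative assumptions, not the operational NCEP formulation.

```python
# Hypothetical sketch of a compact-support, quasi-Gaussian "beta filter" kernel.
# The (1 - x**2)**p weights follow the symmetric beta profile mentioned in the
# abstract; p, the half-width, and the normalization are illustrative choices.
import numpy as np

def beta_kernel(half_width, p=2):
    """Symmetric beta-profile weights on [-1, 1], blending to zero at the ends."""
    x = np.linspace(-1.0, 1.0, 2 * half_width + 1)
    w = (1.0 - x**2) ** p
    return w / w.sum()

def self_adjoint_smooth(field, half_width=10, p=2):
    """Apply the kernel by convolution; a symmetric kernel keeps the operator self-adjoint."""
    return np.convolve(field, beta_kernel(half_width, p), mode="same")

if __name__ == "__main__":
    spike = np.zeros(101)
    spike[50] = 1.0                         # unit impulse
    response = self_adjoint_smooth(spike)   # bell-shaped, compactly supported response
    print(round(response.sum(), 6))         # mass is preserved (~1.0)
```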


Sensors, 2021, Vol. 21 (23), pp. 8067
Author(s): Zhihong Liao, Bin Xu, Junxia Gu, Chunxiang Shi

Sea surface temperature (SST) is critical for global climate change analysis and research. In this study, we used visible and infrared scanning radiometer (VIRR) SST data from the Fengyun-3C (FY-3C) satellite for SST analysis and applied a Kalman filtering method with oriented elliptic correlation scales to construct the SST fields. First, the oriented elliptic correlation scale model was established for the SST analysis. Second, observation errors for each type of SST data source were estimated using the optimally matched datasets, and background field errors were calculated using the oriented elliptic correlation scale model. Finally, the blended SST analysis product was obtained using the Kalman filtering method, and SST fields produced with the optimum interpolation (OI) method were used for comparison to validate the results. The quality analysis for 2016 revealed that the Kalman analysis, with a root-mean-square error (RMSE) of 0.3243 °C, performed better than the OI analysis, with an RMSE of 0.3911 °C, and was closer to the OISST product RMSE of 0.2897 °C. The results demonstrate that the Kalman filtering method with dynamic observation error and background error estimation is significantly superior to the OI method for SST analysis of FY-3C SST data.
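
For readers unfamiliar with the kind of blend being described, the minimal sketch below shows a grid-point Kalman analysis update that weights the background and observed SST by their respective error variances. The scalar, uncorrelated treatment and all numbers are assumptions for illustration, not the FY-3C analysis code.

```python
# Minimal grid-point Kalman analysis update: background SST plus a
# gain-weighted observation increment. Toy values, uncorrelated errors.
import numpy as np

def kalman_update(background, obs, bkg_var, obs_var):
    """Return the analysis and analysis-error variance for co-located SST data."""
    gain = bkg_var / (bkg_var + obs_var)            # Kalman gain per grid point
    analysis = background + gain * (obs - background)
    analysis_var = (1.0 - gain) * bkg_var
    return analysis, analysis_var

if __name__ == "__main__":
    bkg = np.array([20.0, 21.5])    # background SST (deg C)
    obs = np.array([20.4, 21.0])    # satellite SST retrievals (deg C)
    ana, var = kalman_update(bkg, obs, bkg_var=0.16, obs_var=0.25)
    print(ana, var)
```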


Abstract Recent numerical weather prediction systems have significantly improved medium-range forecasts by implementing hybrid background error covariance, in which climatological (static) and ensemble-based (flow-dependent) error covariances are combined. While the hybrid approach has been investigated mainly in variational systems, this study explores methods for implementing it in the local ensemble transform Kalman filter (LETKF). Following Kretschmer et al. (2015), the present study constructed hybrid background error covariance by adding collections of climatological perturbations to the forecast ensemble. In addition, this study proposes a new localization method that attenuates the ensemble perturbations (Z-localization) instead of inflating the observation error variance (R-localization). A series of experiments with a simplified global atmospheric model revealed that the hybrid LETKF resulted in smaller forecast errors than the LETKF, especially in sparsely observed regions. Because the hybrid approach enables a larger ensemble, the optimal localization length scales for the hybrid LETKF were larger than those for the LETKF. With the LETKF, Z-localization resulted in forecast errors similar to those of R-localization. However, Z-localization has the advantage of allowing different localization scales to be applied to the flow-dependent and climatological (static) perturbations in the hybrid LETKF. The optimal localization scale for the climatological perturbations was slightly larger than that for the flow-dependent perturbations. This study also proposes an Optimal EigenDecomposition (OED) ETKF formulation to reduce computational costs. The computational expense of the OED ETKF formulation became significantly smaller than that of standard ETKF formulations as the number of climatological perturbations increased beyond a few hundred.
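
The contrast between the two localization routes can be sketched in a few lines: R-localization inflates the observation error variance with distance from the analysis point, while Z-localization attenuates the ensemble perturbations themselves. The taper function, distances, and array shapes below are illustrative assumptions, not the formulation used in the study.

```python
# Sketch contrasting R-localization and Z-localization under a generic smooth
# taper. The taper, localization radius and toy dimensions are assumptions.
import numpy as np

def taper(dist, loc_radius):
    """Simple smooth taper: 1 at zero separation, 0 at the localization radius."""
    r = np.clip(dist / loc_radius, 0.0, 1.0)
    return (1.0 - r) ** 2 * (1.0 + 2.0 * r)

def r_localize(obs_err_var, dist, loc_radius):
    """R-localization: inflate observation error variance as distance grows."""
    w = np.maximum(taper(dist, loc_radius), 1e-12)   # avoid division by zero at the cutoff
    return obs_err_var / w                            # effectively discards distant observations

def z_localize(perturbations, dist, loc_radius):
    """Z-localization: damp each member's perturbation in observation space."""
    return perturbations * taper(dist, loc_radius)

if __name__ == "__main__":
    dists = np.array([0.0, 500.0, 1000.0])            # km from the analysis point
    print(r_localize(1.0, dists, loc_radius=1000.0))  # inflated observation error variances
    Z = np.random.default_rng(0).normal(size=(3, 5))  # 3 observations x 5 members
    print(z_localize(Z, dists[:, None], loc_radius=1000.0).shape)
```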


2021
Author(s): Pascal Marquet, Pauline Martinet, Jean-François Mahfouf, Alina Lavinia Barbu, Benjamin Ménétrier

Abstract. This study aims at introducing two conservative thermodynamic variables (moist-air entropy potential temperature and total water content) into a one-dimensional variational data assimilation system (1D-Var) to demonstrate their benefit for future operational assimilation schemes. This system is assessed using microwave brightness temperatures from a ground-based radiometer installed during the SOFOG3D field campaign, dedicated to improving fog forecasts. An underlying objective is to ease the specification of background error covariance matrices, which are currently highly dependent on weather conditions, making optimal retrievals of cloud and thermodynamic properties during fog conditions difficult. Background error covariance matrices for these new conservative variables have thus been computed by an ensemble approach based on the French convective-scale model AROME, for both all-weather and fog conditions. A first result shows that the use of these matrices for the new variables reduces some of the dependence on meteorological conditions (diurnal cycle, presence or absence of clouds) compared with the usual variables (temperature, specific humidity). Two 1D-Var experiments (classical vs. conservative variables) are then evaluated over a full diurnal cycle characterized by a stratus-evolving radiative fog situation, using hourly brightness temperatures. Results show, as expected, that the brightness temperatures analysed by the 1D-Var are much closer to the observations than the background values for both variable choices. This is especially the case for channels sensitive to water vapour and liquid water. On the other hand, the analysis increments in model space (water vapour, liquid water) show significant differences between the two sets of variables.
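
A minimal sketch of the ensemble approach to building a background error covariance matrix for a conservative-variable control vector (entropy potential temperature and total water content stacked per level) is given below. The toy ensemble, array shapes, and Gaussian sampling are assumptions for illustration, not the AROME configuration.

```python
# Sketch of an ensemble-derived background error covariance matrix B for a
# control vector made of two stacked conservative variables. Toy ensemble only.
import numpy as np

def ensemble_b_matrix(members):
    """members: (n_members, n_control) array of control-vector samples."""
    perturbations = members - members.mean(axis=0, keepdims=True)
    # Unbiased sample covariance of the control vector
    return perturbations.T @ perturbations / (members.shape[0] - 1)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n_members, n_levels = 50, 30
    # Toy ensemble: [theta_s profile | q_t profile] for each member
    ens = rng.normal(size=(n_members, 2 * n_levels))
    B = ensemble_b_matrix(ens)
    print(B.shape)   # (60, 60): covariances within and across the two variables
```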


2021, Vol. 28 (4), pp. 565-583
Author(s): Zofia Stanley, Ian Grooms, William Kleiber

Abstract. Localization is widely used in data assimilation schemes to mitigate the impact of sampling errors on ensemble-derived background error covariance matrices. Strongly coupled data assimilation allows observations in one component of a coupled model to directly impact another component through the inclusion of cross-domain terms in the background error covariance matrix. When different components have disparate dominant spatial scales, localization between model domains must properly account for the multiple length scales at play. In this work, we develop two new multivariate localization functions, one of which is a multivariate extension of the fifth-order piecewise rational Gaspari–Cohn localization function; the within-component localization functions are standard Gaspari–Cohn with different localization radii, while the cross-localization function is newly constructed. The functions produce positive semidefinite localization matrices which are suitable for use in both Kalman filters and variational data assimilation schemes. We compare the performance of our two new multivariate localization functions to two other multivariate localization functions and to the univariate and weakly coupled analogs of all four functions in a simple experiment with the bivariate Lorenz 96 system. In our experiments, the multivariate Gaspari–Cohn function leads to better performance than any of the other multivariate localization functions.
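
For reference, the sketch below implements the standard univariate fifth-order piecewise rational Gaspari–Cohn (1999) taper that serves as the within-component building block; the paper's new cross-localization construction is not reproduced here, and the half-width used in the demo is arbitrary.

```python
# Standard univariate fifth-order piecewise rational Gaspari-Cohn taper.
import numpy as np

def gaspari_cohn(r, c):
    """Gaspari-Cohn localization weight for separation r and half-width c.

    The function equals 1 at r = 0 and vanishes for r >= 2c.
    """
    z = np.abs(np.atleast_1d(np.asarray(r, dtype=float))) / c
    weight = np.zeros_like(z)
    inner = z <= 1.0
    outer = (z > 1.0) & (z <= 2.0)
    zi = z[inner]
    weight[inner] = (-0.25 * zi**5 + 0.5 * zi**4 + 0.625 * zi**3
                     - 5.0 / 3.0 * zi**2 + 1.0)
    zo = z[outer]
    weight[outer] = (zo**5 / 12.0 - 0.5 * zo**4 + 0.625 * zo**3
                     + 5.0 / 3.0 * zo**2 - 5.0 * zo + 4.0 - 2.0 / (3.0 * zo))
    return weight

if __name__ == "__main__":
    dist = np.linspace(0.0, 3.0, 7)
    print(gaspari_cohn(dist, c=1.0))   # 1 at zero separation, 0 beyond 2c
```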


2021
Author(s): Ivette H. Banos, Will D. Mayfield, Guoqing Ge, Luiz F. Sapucci, Jacob R. Carley, ...

Abstract. The Rapid Refresh Forecast System (RRFS) is currently under development and aims to replace the National Centers for Environmental Prediction (NCEP) operational suite of regional and convective scale modeling systems in the next upgrade. In order to achieve skillful forecasts comparable to the current operational suite, each component of the RRFS needs to be configured through exhaustive testing and evaluation. The current data assimilation component uses the Gridpoint Statistical Interpolation (GSI) system. In this study, various data assimilation algorithms and configurations in GSI are assessed for their impacts on RRFS analyses and forecasts of a squall line over Oklahoma on 4 May 2020. Results show that a baseline RRFS run without data assimilation is able to represent the observed convection, but with stronger cells and large location errors. With data assimilation, these errors are reduced, especially in the 4 and 6 h forecasts using 75 % of the ensemble background error covariance (BEC) and with the supersaturation removal function activated in GSI. Decreasing the vertical ensemble localization radius in the first 10 layers of the hybrid analysis results in overall less skillful forecasts. Convection and precipitation are overforecast in most forecast hours when using planetary boundary layer pseudo-observations, but the root mean square error and bias of the 2 h forecast of 2 m dew point temperature are reduced by 1.6 K during the afternoon hours. Lighter hourly accumulated precipitation is predicted better when using 100 % ensemble BEC in the first 4 h forecast, but heavier hourly accumulated precipitation is better predicted with 75 % ensemble BEC. Our results provide insight into current capabilities of the RRFS data assimilation system and identify configurations that should be considered as candidates for the first version of RRFS.
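
The "75 % ensemble BEC" configuration can be read as a convex blend of static and ensemble covariances, sketched below. The toy matrices and the explicit matrix blend (rather than GSI's extended control variable implementation) are illustrative assumptions.

```python
# Sketch of a hybrid background error covariance as a convex combination of a
# static (climatological) B and an ensemble-derived B. Toy matrices only.
import numpy as np

def hybrid_bec(b_static, b_ensemble, ens_weight=0.75):
    """Convex combination of static and ensemble background error covariances."""
    return (1.0 - ens_weight) * b_static + ens_weight * b_ensemble

if __name__ == "__main__":
    n = 4
    b_static = np.eye(n)                             # toy climatological B
    rng = np.random.default_rng(0)
    perts = rng.normal(size=(20, n))                 # toy 20-member perturbations
    b_ens = perts.T @ perts / (perts.shape[0] - 1)   # toy ensemble B
    print(hybrid_bec(b_static, b_ens, ens_weight=0.75).shape)
```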


2021, Vol. 21 (18), pp. 13747-13761
Author(s): Xinghong Cheng, Zilong Hao, Zengliang Zang, Zhiquan Liu, Xiangde Xu, ...

Abstract. We develop a new inversion method, suitable for both linear and nonlinear emission source (ES) modeling, based on the three-dimensional decoupled direct (DDM-3D) sensitivity analysis module in the Community Multiscale Air Quality (CMAQ) model and the three-dimensional variational (3DVAR) data assimilation technique. We establish an explicit observation operator matrix relating the ES to receptor concentrations, together with a background error covariance (BEC) matrix of the ES that reflects the impact of ES uncertainties on the assimilation. We then construct the ES inversion model by combining the sensitivity analysis with the 3DVAR technique. We performed a simulation experiment with the inversion model for a heavy-haze case study in the Beijing–Tianjin–Hebei (BTH) region during 27–30 December 2016. Results show that the spatial distributions of the sensitivities of the SO2 and NOx ESs to their concentrations, as well as the BEC matrix of the ES, are reasonable. Using the a posteriori (inverted) ES, the underestimation of SO2 and NO2 during the heavy haze period is remarkably reduced, especially for NO2. The spatial distributions of SO2 and NO2 concentrations simulated with the constrained ES are more accurate than those obtained with the a priori ES in the BTH region. The temporal variations in regionally averaged SO2, NO2, and O3 concentrations modeled with the a posteriori ES are consistent with in situ observations at 45 stations over the BTH region, and the simulation errors decrease significantly. These results are of great significance for studies of the formation mechanism of heavy haze, for reducing ES uncertainties and enabling dynamic ES updating, and for providing accurate “virtual” emission inventories for air-quality forecasts and decision-making services for the optimal control of air pollution.
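
The kind of linear inversion step being described can be sketched as follows: a sensitivity matrix plays the role of the observation operator mapping ES scaling factors to receptor concentrations, and a Kalman-type gain built from the ES background error covariance updates the a priori emissions. All matrices and numbers below are toy assumptions, not the CMAQ/DDM-3D implementation.

```python
# Sketch of a linear 3DVAR-style emission inversion: posterior emissions from
# the analysis equation with a sensitivity (observation operator) matrix H.
import numpy as np

def invert_emissions(e_prior, H, B, R, y_obs):
    """A posteriori emission estimate from the linear analysis equation."""
    innovation = y_obs - H @ e_prior
    gain = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # Kalman-type gain
    return e_prior + gain @ innovation

if __name__ == "__main__":
    e_prior = np.array([1.0, 1.0])             # prior scaling of SO2 and NOx sources
    H = np.array([[0.8, 0.1],                   # toy sensitivities of two receptors
                  [0.2, 0.9]])
    B = 0.25 * np.eye(2)                        # toy emission background error covariance
    R = 0.04 * np.eye(2)                        # toy observation error covariance
    y_obs = np.array([1.1, 0.7])                # observed concentrations (toy units)
    print(invert_emissions(e_prior, H, B, R, y_obs))
```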


Symmetry, 2021, Vol. 13 (9), pp. 1563
Author(s): Xiang Xing, Bainian Liu, Weimin Zhang, Xiaoqun Cao, Jingzhe Sun

Mainstream numerical weather prediction (NWP) centers usually estimate the standard deviations of background error by using a randomization technique to calibrate specific parameters of the background error covariance model in variational data assimilation (VAR) systems. However, the sample size used by the randomization technique is typically several orders of magnitude smaller than the number of model state variables, and using finite-sized estimates as a proxy for the truth introduces sampling noise, which can contaminate the estimated standard deviations. The sampling noise is first investigated in an atmospheric model, showing that it has a symmetrical structure that oscillates around the truth at small scales. To alleviate the sampling noise, a heterogeneous local weighting filtering method is proposed, based on distance-weighted and similarity-weighted correlations. Local weighting filtering is easy to implement in operational VAR systems and has a low computational cost as a post-processing step for reducing the sampling noise. The validity and performance of the local weighting filtering method are examined in a realistic model framework, showing that the proposed filter eliminates most of the sampling noise, that details of the filtered results are more visible, and that the accuracy of the filtered results is almost the same as that of estimates obtained from a much larger sample. The signal-to-noise ratio of the optimally filtered field is improved by nearly 20%. A comparison with the spectral filtering approach widely used in operational systems shows that the proposed method is more efficient to implement in the filtering procedure and performs very well at preserving the local anisotropic features of the estimates. These results demonstrate the potential of the local weighting filtering method for addressing the noise issue in the randomization technique.
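
The randomization technique and its sampling noise can be illustrated with a toy diagonal B: standard deviations are estimated from a finite sample of B^(1/2)-transformed Gaussian vectors, and the finite sample size is exactly what produces the small-scale noise around the truth. The square-root operator and sample size below are assumptions for illustration.

```python
# Sketch of the randomization technique: estimate sqrt(diag(B)) from a finite
# sample of B^(1/2)-transformed Gaussian vectors. Toy diagonal B operator.
import numpy as np

def randomized_stddev(sqrt_b_apply, n_state, n_samples, seed=0):
    """Estimate background error standard deviations by randomization."""
    rng = np.random.default_rng(seed)
    acc = np.zeros(n_state)
    for _ in range(n_samples):
        xi = rng.standard_normal(n_state)
        acc += sqrt_b_apply(xi) ** 2
    return np.sqrt(acc / n_samples)

if __name__ == "__main__":
    n = 200
    true_std = 1.0 + 0.5 * np.sin(np.linspace(0.0, 4.0 * np.pi, n))

    def sqrt_b(xi):
        return true_std * xi               # diagonal toy B^(1/2)

    noisy = randomized_stddev(sqrt_b, n, n_samples=50)
    print(np.max(np.abs(noisy - true_std)))   # small-scale sampling noise around the truth
```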


2021, Vol. 9 (9), pp. 920
Author(s): Xiying Liu, Zicheng Sha, Chenchen Lu

To study the effectiveness of methods for reducing Arctic sea ice initialization errors caused by underestimation of the background error covariance, an advanced ensemble analysis system has been developed. The system integrates the local ensemble transform Kalman filter (LETKF) with the Community Ice Code (CICE). With a mixed-layer ocean model used to compute the sea surface temperature (SST), experiments assimilating observations of sea ice concentration (SIC) were carried out over a 3-month period from January to March 1997. The model was sequentially constrained with daily observation data. The effects of observation density, the amplification factor for the analysis error covariance, and the relaxation of disturbance and spread on the SIC initialization results were studied. It is shown that doubling the density of SIC observations does not bring significant further improvement to the analysis; when the ensemble size is doubled, the most severe SIC biases in the Labrador, Greenland, Norwegian, and Barents seas are reduced; and amplifying the analysis error covariance, relaxing the disturbance, and relaxing the spread all improve the reproduction of SIC, with covariance amplification having the largest effect.
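
Ensemble adjustments of this kind are commonly implemented as multiplicative amplification of the analysis covariance and relaxation to the prior perturbations or spread. The sketch below uses their textbook forms (RTPP and RTPS), which may differ from the exact formulations used with this CICE/LETKF system; all coefficients are illustrative assumptions.

```python
# Sketch of covariance amplification and relaxation applied to analysis
# perturbations (member deviations from the ensemble mean). Textbook forms only.
import numpy as np

def amplify(analysis_perts, factor=1.1):
    """Multiplicative inflation: scale perturbations so covariance grows by `factor`."""
    return np.sqrt(factor) * analysis_perts

def rtpp(analysis_perts, prior_perts, alpha=0.5):
    """Relaxation to prior perturbations: blend analysis and prior perturbations."""
    return (1.0 - alpha) * analysis_perts + alpha * prior_perts

def rtps(analysis_perts, prior_perts, alpha=0.5):
    """Relaxation to prior spread: rescale analysis spread toward the prior spread."""
    sig_a = analysis_perts.std(axis=0, ddof=1)
    sig_b = prior_perts.std(axis=0, ddof=1)
    scale = alpha * (sig_b - sig_a) / np.maximum(sig_a, 1e-12) + 1.0
    return analysis_perts * scale

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    prior = rng.normal(size=(20, 5))      # 20 members x 5 grid points
    prior -= prior.mean(axis=0)           # deviations from the ensemble mean
    post = 0.5 * prior                    # over-tightened analysis perturbations
    print(rtps(post, prior, alpha=0.8).std(axis=0, ddof=1))
```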

