The Effect of Thinning and Superobservations in a Simple One-Dimensional Data Analysis with Mischaracterized Error

2018, Vol 146 (4), pp. 1181-1195
Author(s): Ross N. Hoffman

A one-dimensional (1D) analysis problem is defined and analyzed to explore the interaction of observation thinning or superobservation with observation errors that are correlated or systematic. The general formulation might be applied to a 1D analysis of radiance or radio occultation observations in order to develop a strategy for the use of such data in a full data assimilation system, but is applied here to a simple analysis problem with parameterized error covariances. Findings for the simple problem include the following. For a variational analysis method that includes an estimate of the full observation error covariances, the analysis is more sensitive to variations in the estimated background and observation error standard deviations than to variations in the corresponding correlation length scales. Furthermore, if everything else is fixed, the analysis error increases with decreasing true background error correlation length scale and with increasing true observation error correlation length scale. For a weighted least squares analysis method that assumes the observation errors are uncorrelated, best results are obtained for some degree of thinning and/or tuning of the weights. Without tuning, the best strategy is superobservation with a spacing approximately equal to the observation error correlation length scale.
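
The simple 1-D analysis problem described above can be illustrated with a short numerical sketch. The snippet below is not the paper's configuration: the grid, the exponential correlation model, and all standard deviations and length scales are illustrative assumptions. It contrasts the analysis error obtained when the gain uses the full (correlated) observation error covariance with the error obtained when the observation errors are wrongly assumed uncorrelated.

```python
import numpy as np

def exp_cov(x, sigma, L):
    """Covariance with exponential (first-order autoregressive) correlation."""
    d = np.abs(x[:, None] - x[None, :])
    return sigma**2 * np.exp(-d / L)

nx = 101
x = np.linspace(0.0, 1000.0, nx)                 # 1-D analysis grid (km)
obs_idx = np.arange(0, nx, 2)                    # observe every other grid point
H = np.eye(nx)[obs_idx]                          # linear observation operator

B = exp_cov(x, sigma=1.0, L=200.0)               # true background error covariance
R_true = exp_cov(x[obs_idx], sigma=0.5, L=100.0) # true (correlated) obs errors
R_used = (0.5**2) * np.eye(len(obs_idx))         # assumed uncorrelated obs errors

# Gain actually applied when the correlations are ignored
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R_used)

# True analysis error covariance for that (suboptimal) gain
I_KH = np.eye(nx) - K @ H
A_subopt = I_KH @ B @ I_KH.T + K @ R_true @ K.T

# Analysis error covariance when the full R_true is used in the gain
A_opt = np.linalg.inv(np.linalg.inv(B) + H.T @ np.linalg.inv(R_true) @ H)

print("mean analysis error std, obs errors assumed uncorrelated: %.3f"
      % np.sqrt(np.diag(A_subopt)).mean())
print("mean analysis error std, full obs error covariance used : %.3f"
      % np.sqrt(np.diag(A_opt)).mean())
```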

2005, Vol 133 (8), pp. 2148-2162
Author(s): Diana J. M. Greenslade, Ian R. Young

Abstract One of the main limitations to current wave data assimilation systems is the lack of an accurate representation of the structure of the background errors. One method that may be used to determine background errors is the “NMC method.” This method examines the forecast divergence component of the background error growth by considering differences between forecasts of different ranges valid at the same time. In this paper, the NMC method is applied to global forecasts of significant wave height (SWH) and surface wind speed (U10). It is found that the isotropic correlation length scale of the SWH forecast divergence (LSWH) has considerable geographical variability, with the longest scales just to the south of the equator in the eastern Pacific Ocean, and the shortest scales at high latitudes. The isotropic correlation length scale of the U10 forecast divergence (LU10) has a similar distribution with a stronger latitudinal dependence. It is found that both LSWH and LU10 increase as the forecast period increases. The increase in LSWH is partly due to LU10 also increasing. Another explanation is that errors in the analysis or the short-range SWH forecast propagate forward in time and disperse, so that their scale becomes larger. It is shown that the forecast divergence component of the background error is strongly anisotropic with the longest scales perpendicular to the likely direction of propagation of swell. In addition, in regions where the swell propagation is seasonal, the forecast divergence component of the background error shows a similar strong seasonal signal. It is suggested that the results of this study provide a lower bound to the description of the total background error in global wave models.
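
As a concrete illustration of the NMC method described above, the sketch below forms differences between 48-h and 24-h forecasts valid at the same time and extracts a crude isotropic correlation length scale from their zonal autocorrelation. The array shapes, the synthetic forecasts, and the e-folding definition of the length scale are assumptions for illustration only, not the study's exact procedure.

```python
import numpy as np

def nmc_differences(fcst_48h, fcst_24h):
    """Forecast-divergence samples: 48-h minus 24-h forecasts valid at the
    same time.  Both arrays have shape (n_times, n_lat, n_lon)."""
    return fcst_48h - fcst_24h

def correlation_length(diff, dx_km):
    """Zonal autocorrelation of the difference field, averaged over time and
    latitude; return the lag (km) where it first drops below exp(-1)."""
    d = diff - diff.mean(axis=0, keepdims=True)
    var = (d * d).mean()
    nlon = d.shape[-1]
    corr = np.empty(nlon // 2)
    for lag in range(nlon // 2):
        corr[lag] = (d[..., :nlon - lag] * d[..., lag:]).mean() / var
    below = np.where(corr < np.exp(-1.0))[0]
    return float(below[0] if below.size else nlon // 2) * dx_km

# Synthetic forecasts on a 2-degree grid (~222 km zonal spacing at the equator)
rng = np.random.default_rng(0)
f24 = rng.standard_normal((30, 90, 180))
noise = rng.standard_normal((30, 90, 180))
for _ in range(3):                               # smooth the added error zonally
    noise = (noise + np.roll(noise, 1, axis=-1) + np.roll(noise, -1, axis=-1)) / 3.0
f48 = f24 + noise

L = correlation_length(nmc_differences(f48, f24), dx_km=222.0)
print("estimated isotropic correlation length scale: %.0f km" % L)
```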


2021
Author(s): David F. Baker, Emily Bell, Kenneth J. Davis, Joel F. Campbell, Bing Lin, ...

Abstract. To check the accuracy of column-average dry air CO2 mole fractions (XCO2) retrieved from Orbiting Carbon Observatory-2 (OCO-2) data, a similar quantity has been measured from the Multi-functional Fiber Laser Lidar (MFLL) aboard aircraft flying underneath OCO-2 as part of the Atmospheric Carbon and Transport (ACT)-America flight campaigns. Here we do a lagged correlation analysis of these MFLL-OCO-2 column CO2 differences and find that their correlation spectrum falls off rapidly at along-track separation distances of under 10 km, with a correlation length scale of about 10 km, and less rapidly at longer separation distances, with a correlation length scale of about 20 km. The OCO-2 satellite takes many CO2 measurements with small (~3 km2) fields of view (FOVs) in a thin (<10 km wide) swath running parallel to its orbit: up to 24 separate FOVs may be obtained per second (across a ~6.75 km distance on the ground), though clouds, aerosols, and other factors cause considerable data dropout. Errors in the CO2 retrieval method have long been thought to be correlated at these fine scales, and methods to account for these when assimilating these data into top-down atmospheric CO2 flux inversions have been developed. A common approach has been to average the data at coarser scales (e.g., in 10-second-long bins) along-track, then assign an uncertainty to the averaged value that accounts for the error correlations. Here we outline the methods used up to now for computing these 10-second averages and their uncertainties, including the constant-correlation-with-distance error model currently being used to summarize the OCO-2 version 9 XCO2 retrievals as part of the OCO-2 flux inversion model intercomparison project. We then derive a new one-dimensional error model using correlations that decay exponentially with separation distance, apply this model to the OCO-2 data using the correlation length scales derived from the MFLL-OCO-2 differences, and compare the results (for both the average and its uncertainty) to those given by the current constant-correlation error model. To implement this new model, the data are averaged first across 2-second spans, to collapse the cross-track distribution of the real data onto the 1-D path assumed by the new model. A small percentage of the data that cause nonphysical negative averaging weights in the model are thrown out. The correlation lengths over the ocean, which the land-based MFLL data do not clarify, are assumed to be twice those over the land. The new correlation model gives 10-second XCO2 averages that are only a few tenths of a ppm different from the constant-correlation model. Over land, the uncertainties in the mean are also similar, suggesting that the +0.3 constant correlation coefficient currently used in the model there is accurate. Over the oceans, the twice-the-land correlation lengths that we assume here result in a significantly lower uncertainty on the mean than the +0.6 constant correlation currently gives; measurements similar to the MFLL ones are needed over the oceans to do better. Finally, we show how our 1-D exponential error correlation model may be used to account for correlations in those inversion methods that choose to assimilate each XCO2 retrieval individually, and to account for correlations between separate 10-second averages when these are assimilated instead.
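
The averaging step described above can be sketched as a generalized-least-squares (BLUE) estimate of a common mean under a specified error covariance. The snippet below compares a constant-correlation model with a 1-D exponential-decay model; the number of soundings, their spacing, the per-sounding error, and the 0.3 correlation and 10 km length scale are illustrative stand-ins rather than the operational values.

```python
import numpy as np

def gls_mean_and_sigma(y, R):
    """BLUE / generalized-least-squares estimate of a common mean under error
    covariance R, and the standard error of that mean."""
    Rinv_1 = np.linalg.solve(R, np.ones_like(y))
    denom = Rinv_1.sum()
    w = Rinv_1 / denom                           # averaging weights (sum to 1)
    return w @ y, np.sqrt(1.0 / denom), w

n = 60                                           # soundings kept in one 10-s span
s = np.linspace(0.0, 67.5, n)                    # along-track positions (km)
sigma = 1.0                                      # per-sounding XCO2 error (ppm)
y = 410.0 + sigma * np.random.default_rng(1).standard_normal(n)

# Constant-correlation error model: correlation c between every sounding pair
c = 0.3
R_const = sigma**2 * ((1 - c) * np.eye(n) + c * np.ones((n, n)))

# One-dimensional exponential-decay model with a 10 km correlation length
L = 10.0
R_exp = sigma**2 * np.exp(-np.abs(s[:, None] - s[None, :]) / L)

for name, R in [("constant c=0.3", R_const), ("exponential L=10 km", R_exp)]:
    mean, err, w = gls_mean_and_sigma(y, R)
    print("%-20s mean = %.3f ppm, sigma of mean = %.3f ppm, weights in [%.3f, %.3f]"
          % (name, mean, err, w.min(), w.max()))
```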


2007, Vol 46 (6), pp. 714-725
Author(s): Louis Garand, Sylvain Heilliette, Mark Buehner

Abstract The interchannel observation error correlation (IOEC) associated with radiance observations is currently assumed to be zero in meteorological data assimilation systems. This assumption may lead to suboptimal analyses. Here, the IOEC is inferred for the Atmospheric Infrared Sounder (AIRS) hyperspectral radiance observations using a subset of 123 channels covering the spectral range of 4.1–15.3 μm. Observed minus calculated radiances are computed for a 1-week period using a 6-h forecast as atmospheric background state. A well-established technique is used to separate the observation and background error components for each individual channel and each channel pair. The large number of collocations combined with the 40-km horizontal spacing between AIRS fields of view allows robust results to be obtained. The resulting background errors are in good agreement with those inferred from the background error matrix used operationally in data assimilation at the Meteorological Service of Canada. The IOEC is in general high among the water vapor–sensing channels in the 6.2–7.2-μm region and among surface-sensitive channels. In contrast, it is negligible for channels within the main carbon dioxide absorption band (13.2–15.4 μm). The impact of incorporating the IOEC is evaluated from 1D variational retrievals at 381 clear-sky oceanic locations. Temperature increments differ on average by 0.25 K, and ln(q) increments by 0.10, where q is specific humidity. Without IOEC, the weight given to the observations appears to be too high; the assimilation attempts to fit the observations nearly perfectly. The IOEC better constrains the variational assimilation process, and the rate of convergence is systematically faster by a factor of 2.
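
One well-established way to perform the error separation mentioned above is a Hollingsworth and Lönnberg style analysis of innovation statistics: at zero separation the innovation covariance for a channel pair contains both background and observation error contributions, while at nonzero separation only the spatially correlated background part remains. The sketch below is a deliberately crude version of that idea on synthetic data; the bin limits, array names, and the absence of a simulated background error component are assumptions for illustration, and the abstract does not state that this exact procedure was used.

```python
import numpy as np

def interchannel_R(innov_i, innov_j, dist, r0=100.0, r1=300.0):
    """Estimate the interchannel observation error covariance R_ij from
    innovations (observed minus calculated) for channels i and j.

    At zero separation the innovation covariance contains B_ij + R_ij; at
    nonzero separation only the spatially correlated background part B_ij
    survives, so the difference isolates R_ij.
    """
    di = innov_i - innov_i.mean()
    dj = innov_j - innov_j.mean()
    cov0 = np.mean(di * dj)                      # collocated pairs: B_ij + R_ij
    mask = (dist > r0) & (dist < r1)             # a nearby separation bin
    cov_bg = np.mean(np.outer(di, dj)[mask])     # ~ background part B_ij
    return cov0 - cov_bg

# Synthetic example: two channels sharing part of their observation error and
# no spatially correlated background error, so the true R_12 is 1.0.
rng = np.random.default_rng(2)
n = 2000
xy = rng.uniform(0.0, 2000.0, size=(n, 2))       # observation locations (km)
dist = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
shared = rng.standard_normal(n)                  # obs-error part common to both
ch1 = shared + 0.5 * rng.standard_normal(n)
ch2 = shared + 0.5 * rng.standard_normal(n)
print("estimated R_12: %.2f (true 1.0)" % interchannel_R(ch1, ch2, dist))
```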


2013, Vol 10 (6), pp. 6963-7001
Author(s): S. Barthélémy, S. Ricci, O. Pannekoucke, O. Thual, P. O. Malaterre

Abstract. This study describes the emulation of an Ensemble Kalman Filter (EnKF) algorithm on a 1-D flood wave propagation model. This model is forced at the upstream boundary with a random variable with Gaussian statistics and a correlation function in time with Gaussian shape. In the case without assimilation, this allows for an analytical study of the covariance functions of the propagated signal anomaly; this analytical study is validated numerically with an ensemble method. In the case with assimilation at one observation point, where synthetic observations are generated by adding an error to a true state, the dynamics of the background error covariance functions are not straightforward, and a numerical approach using an EnKF algorithm is preferred. First, these numerical experiments show that both the background error variance and the correlation length scale are reduced at the observation point. This reduction of variance and correlation length scale is propagated downstream by the dynamics of the model. Then, it is shown that the application of a Best Linear Unbiased Estimator (BLUE) algorithm, using the background error covariance matrix converged from the EnKF algorithm, provides the same results as the EnKF but at a lower computational cost, thus allowing for the use of data assimilation in the context of real-time flood forecasting. Moreover, it was demonstrated that the reduction of the background error correlation length scale and variance at the observation point depends on the observation error statistics. This feature is quantified by abacuses built from linear regressions over a limited set of EnKF experiments. These abacuses, which describe the background error variance and correlation length scale in the neighborhood of the observation point, combined with analytical expressions that describe them away from the observation point, provide parameterized models for the variance and the correlation length scale. Using this parameterized variance and correlation length scale with a diffusion operator makes it possible to model the converged background error covariance matrix from the EnKF without actually integrating the EnKF algorithm. This method was finally applied to a case with two observation points with different error statistics. It was shown that the results of this emulated EnKF (EEnKF) in terms of background error variance, correlation length scale, and analyzed water level are close to those of the EnKF, but with a significantly reduced computational cost.
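
The BLUE step referred to above is compact enough to sketch directly: given a (converged) background error covariance matrix B, one matrix solve reproduces the analysis at a single observation point. The grid size, the Gaussian-shaped stand-in for B, and the error variances below are illustrative choices, not the flood-wave model's actual statistics.

```python
import numpy as np

def blue_update(xb, B, H, R, y):
    """BLUE analysis xa = xb + K (y - H xb) with K = B H^T (H B H^T + R)^-1,
    and the corresponding analysis error covariance (I - K H) B."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    xa = xb + K @ (y - H @ xb)
    A = (np.eye(len(xb)) - K @ H) @ B
    return xa, A

nx = 200                                         # 1-D water-level grid
xb = np.zeros(nx)                                # background water-level anomaly (m)
i_obs = 120                                      # single observation point
H = np.zeros((1, nx)); H[0, i_obs] = 1.0
R = np.array([[0.05**2]])                        # observation error variance (m^2)

# Gaussian-shaped stand-in for the converged background error covariance
s = np.arange(nx, dtype=float)
B = 0.2**2 * np.exp(-0.5 * ((s[:, None] - s[None, :]) / 15.0) ** 2)

y = np.array([0.3])                              # observed anomaly (m)
xa, A = blue_update(xb, B, H, R, y)
print("analysis at the observation point: %.3f m" % xa[i_obs])
print("error std at the observation point: background %.3f m, analysis %.3f m"
      % (np.sqrt(B[i_obs, i_obs]), np.sqrt(A[i_obs, i_obs])))
```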


2014, Vol 31 (10), pp. 2330-2349
Author(s): Andrea Storto, Simona Masina, Srdjan Dobricic

Abstract Optimally modeling background-error horizontal correlations is crucial in ocean data assimilation. This paper investigates the impact of releasing the assumption of uniform background-error correlations in a global ocean variational analysis system. Spatially varying horizontal correlations are introduced in the recursive filter operator, which is used for modeling horizontal covariances in the Centro Euro-Mediterraneo sui Cambiamenti Climatici (CMCC) analysis system. The horizontal correlation length scales (HCLSs) were defined on the full three-dimensional model space and computed from both a dataset of monthly anomalies with respect to the monthly climatology and through the so-called National Meteorological Center (NMC) method. Different formulas for estimating the correlation length scale are also discussed and applied to the two forecast error datasets. The new formulation is tested within a 12-yr period (2000–11) in the ½° resolution system. The comparison with the data assimilation system using uniform background-error horizontal correlations indicates the superiority of the spatially varying formulation, especially in eddy-dominated areas. Verification skill scores report a significant reduction of RMSE, and the use of nonuniform length scales improves the representation of the eddy kinetic energy at midlatitudes, suggesting that uniform, latitude, or Rossby radius-dependent formulations are insufficient to represent the geographical variations of the background-error correlations. Furthermore, a small tuning of the globally uniform value of the length scale was found to have a small impact on the analysis system. The use of either anomalies or NMC-derived correlation length scales also has a marginal effect with respect to the use of nonuniform HCLSs. On the other hand, the application of overestimated length scales has proved to be detrimental to the analysis system in all areas and for all parameters.
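
One common gradient-based (Daley-type) formula for estimating a correlation length scale from a set of error proxies is L = std(e) / std(de/dx). The sketch below applies it to synthetic samples with a known 50 km scale; it is offered only as an example of the kind of length-scale formula discussed above, and is not necessarily the estimator used in the CMCC system.

```python
import numpy as np

def daley_length_scale(err, dx_km):
    """Gradient-based length-scale estimate L = std(e) / std(de/dx).
    err has shape (n_samples, nx) along one horizontal direction."""
    e = err - err.mean(axis=0, keepdims=True)    # remove the sample mean
    dedx = np.gradient(e, dx_km, axis=1)         # horizontal derivative
    return e.std() / dedx.std()

# Synthetic check: samples of a Gaussian-correlated field with a 50 km scale
rng = np.random.default_rng(3)
nx, dx = 256, 10.0
x = np.arange(nx) * dx
C = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 50.0) ** 2) + 1e-6 * np.eye(nx)
err = rng.multivariate_normal(np.zeros(nx), C, size=200)
print("estimated length scale: %.1f km (true value 50 km)"
      % daley_length_scale(err, dx))
```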


2021, Vol 40 (3)
Author(s): Bo Hou, Yongbin Ge

Abstract In this paper, by using the local one-dimensional (LOD) method, Taylor series expansion, and correction for the third derivatives in the truncation error remainder, two high-order compact LOD schemes are established for solving the two- and three-dimensional advection equations, respectively. They have fourth-order accuracy in both time and space. Von Neumann analysis shows that the two schemes are unconditionally stable. In addition, their consistency and convergence are also proved. Finally, numerical experiments are given to confirm the accuracy and efficiency of the present schemes.
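
The core idea referenced above, splitting a multi-dimensional advection step into successive one-dimensional sweeps, can be illustrated with a short sketch. For brevity the sweeps below use simple first-order upwinding on a periodic grid rather than the fourth-order compact discretization derived in the paper, so this demonstrates only the LOD splitting, not the accuracy of the proposed schemes; the grid size, velocities, and time step are illustrative.

```python
import numpy as np

def upwind_sweep(u, c, axis):
    """One first-order upwind step along the given axis; c = speed*dt/dx >= 0."""
    return u - c * (u - np.roll(u, 1, axis=axis))

def lod_step(u, cx, cy):
    """Locally one-dimensional splitting: solve u_t + a u_x = 0, then u_t + b u_y = 0."""
    u = upwind_sweep(u, cx, axis=0)
    u = upwind_sweep(u, cy, axis=1)
    return u

# Advect a Gaussian pulse on a periodic unit square
n, a, b = 128, 1.0, 0.5
dx = 1.0 / n
dt = 0.4 * dx / max(a, b)                        # CFL-limited step for the upwind sweeps
X, Y = np.meshgrid(np.arange(n) * dx, np.arange(n) * dx, indexing="ij")
u = np.exp(-200.0 * ((X - 0.3) ** 2 + (Y - 0.3) ** 2))

for _ in range(200):
    u = lod_step(u, a * dt / dx, b * dt / dx)
print("peak of the advected pulse after 200 steps: %.3f" % u.max())
```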


1985, Vol 111 (8-9), pp. 419-422
Author(s): N.M. Bogoliubov, V.E. Korepin

2021, Vol 13 (3), pp. 426
Author(s): Zheng Qi Wang, Roger Randriamampianina

The assimilation of microwave and infrared (IR) radiance satellite observations within numerical weather prediction (NWP) models has been an important component in the effort to improve the accuracy of analyses and forecasts. Such capabilities were implemented during the development of the high-resolution Copernicus European Regional Reanalysis (CERRA), funded by the Copernicus Climate Change Service (C3S). The CERRA system couples the deterministic system with ensemble data assimilation to provide periodic updates of the background error covariance matrix. Several key factors for the assimilation of radiances were investigated, including the appropriate use of variational bias correction (VARBC), surface-sensitive AMSU-A observations, and observation error correlation. Twenty-one-day impact studies during the summer and winter seasons were conducted. Generally, the assimilation of radiances has a small impact on the analysis, while greater impacts are observed on short-range (12- and 24-h) forecasts, with an error reduction of 1–2% for the mid and high troposphere. However, the current configuration provided less accurate forecasts from the 0900 and 1800 UTC analysis times. With increased thinning distances and the rejection of IASI observations over land, the errors in the analyses and 3-h forecasts of geopotential height were reduced by up to 2%.

