How much information do extinction and backscattering measurements contain about the chemical composition of atmospheric aerosol?

2016
Author(s): Michael Kahnert, Emma Andersson

Abstract. We theoretically and numerically investigate the problem of assimilating lidar observations of extinction and backscattering coefficients of aerosols into a chemical transport model. More specifically, we consider the inverse problem of determining the chemical composition of aerosols from these observations. The main questions are how much information the observations contain to constrain the particles' chemical composition, and how one can optimise a chemical data assimilation system to make maximum use of the available information. We first quantify the information content of the measurements by computing the singular values of the observation operator. From the singular values we can compute the number of signal degrees of freedom and the reduction in Shannon entropy. For an observation standard deviation of 10 %, it is found that simultaneous measurements of extinction and backscattering allow us to constrain twice as many model variables as extinction measurements alone. The same holds for measurements at two wavelengths compared to measurements at a single wavelength. However, when we extend the set of measurements from two to three wavelengths, we observe only a small increase in the number of signal degrees of freedom and a minor change in the Shannon entropy. The information content is strongly sensitive to the observation error; both the number of signal degrees of freedom and the reduction in Shannon entropy decrease steeply as the observation standard deviation increases in the range between 1 % and 100 %. The right singular vectors of the observation operator can be employed to transform the model variables into a new basis in which the components of the state vector can be divided into signal-related and noise-related components. We incorporate these results in a chemical data assimilation algorithm by introducing weak constraints that restrict the assimilation algorithm to acting on the signal-related model variables only. This ensures that the information contained in the measurements is fully exploited, but not over-used. Numerical experiments confirm that the constrained data assimilation algorithm solves the inverse problem in a way that automatises the choice of control variables and restricts the minimisation of the cost function to the signal-related model variables.
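The signal degrees of freedom and the entropy reduction mentioned above follow directly from the singular values of the scaled observation operator. The sketch below (not the authors' code) illustrates the standard formulas under the assumption that the scaled operator is available as a matrix; names are illustrative.

```python
# Minimal sketch of the information-content diagnostics described above, following
# the standard formulation: the scaled observation operator K_tilde is assumed to be
# Se^{-1/2} K Sa^{1/2}, supplied here simply as a NumPy array.
import numpy as np

def information_content(K_tilde):
    """Return (N_s, H): signal degrees of freedom and entropy reduction in bits."""
    # Singular values of the scaled (prewhitened) observation operator.
    s = np.linalg.svd(K_tilde, compute_uv=False)
    # Degrees of freedom for signal: each singular value contributes s_i^2 / (1 + s_i^2).
    n_s = np.sum(s**2 / (1.0 + s**2))
    # Reduction in Shannon entropy (bits): 0.5 * sum log2(1 + s_i^2).
    h = 0.5 * np.sum(np.log2(1.0 + s**2))
    return n_s, h

# Toy example: 3 observations, 5 model variables.
rng = np.random.default_rng(0)
print(information_content(rng.normal(size=(3, 5))))
```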

2017, Vol. 17 (5), pp. 3423-3444
Author(s): Michael Kahnert, Emma Andersson

Abstract. We theoretically and numerically investigate the problem of assimilating multiwavelength lidar observations of extinction and backscattering coefficients of aerosols into a chemical transport model. More specifically, we consider the inverse problem of determining the chemical composition of aerosols from these observations. The main questions are how much information the observations contain to determine the particles' chemical composition, and how one can optimize a chemical data assimilation system to make maximum use of the available information. We first quantify the information content of the measurements by computing the singular values of the scaled observation operator. From the singular values we can compute the number of signal degrees of freedom, Ns, and the reduction in Shannon entropy, H. As expected, the information content as expressed by either Ns or H grows as one increases the number of observational parameters and/or wavelengths. However, the information content is strongly sensitive to the observation error. The larger the observation error variance, the lower the growth rate of Ns or H with increasing number of observations. The right singular vectors of the scaled observation operator can be employed to transform the model variables into a new basis in which the components of the state vector can be partitioned into signal-related and noise-related components. We incorporate these results in a chemical data assimilation algorithm by introducing weak constraints that restrict the assimilation algorithm to acting on the signal-related model variables only. This ensures that the information contained in the measurements is fully exploited, but not overused. Numerical tests show that the constrained data assimilation algorithm provides a solution to the inverse problem that is considerably less noisy than the corresponding unconstrained algorithm. This suggests that the restriction of the algorithm to the signal-related model variables suppresses the assimilation of noise in the observations.
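The partition into signal-related and noise-related components can be made concrete with the right singular vectors of the scaled observation operator. The following hedged sketch shows one way to build a projector onto the signal-related subspace; the threshold, variable names, and the simple projection are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the signal/noise partition described above: components whose singular
# value is resolved above the noise level (>~ 1 for a prewhitened operator) are
# treated as signal-related; the rest are left at the background.
import numpy as np

def signal_projection(K_tilde, threshold=1.0):
    """Projector (in the scaled state space) onto the signal-related subspace."""
    _, s, vt = np.linalg.svd(K_tilde, full_matrices=True)
    signal = s >= threshold              # singular values resolved above the noise
    v_sig = vt[: len(s)][signal]         # rows of V^T spanning the signal subspace
    return v_sig.T @ v_sig               # projection matrix P, so dx_signal = P @ dx

# Usage: restrict an analysis increment to the directions the observations constrain.
rng = np.random.default_rng(1)
K = rng.normal(size=(4, 8))
dx = rng.normal(size=8)
dx_constrained = signal_projection(K) @ dx   # noise-related components removed
print(np.round(dx_constrained, 3))
```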


2021
Author(s): Ross N. Hoffman, Katherine Lukens, Kayo Ide, Kevin Garrett

In this study we propose and test a feature track correction (FTC) observation operator for atmospheric motion vectors (AMVs). The FTC has four degrees of freedom corresponding to multiplicative and additive wind speed corrections (γ and δV), a vertical height assignment correction (h), and an estimate of the depth of the layer that contributes to the AMV (Δz). Since the effect of the FTC observation operator is to add a bias correction to a weighted average of the profile of background winds, an alternate formulation is in terms of a profile of weights (w_k) and δV.

The FTC observation operator is tested in the context of a collocation study between AMVs projected onto the collocated Aeolus horizontal line-of-sight (HLOS) and the Aeolus HLOS wind profiles. This is a prototype for an implementation in a variational data assimilation system, and here the Aeolus profiles act as the background in the FTC observation operator. Results were obtained for ten days of data using modest QC. The overall OMB (observation minus background) or collocation difference SD for a global solution applied to the independent sample is 5.49 m/s with negligible mean. For comparison, the corresponding simple (or pure) collocation SD is 7.85 m/s, and the null solution, which only interpolates the Aeolus profile to the reported height of the AMV and removes the overall bias, has an OMB SD of 7.23 m/s. These values correspond to reductions of variance of 51.0 % and 42.3 % due to the FTC observation operator relative to the simple collocation and the null solution, respectively.

These preliminary tests demonstrate the potential of the FTC observation operator for:

- Improving AMV collocations (including triple collocation) with profile wind data.
- Characterizing AMVs. For example, summary results for the HLOS winds show that AMVs compare best with wind profiles averaged over a 4.5 km layer centered 0.5 km above the reported AMV height.
- Improving AMV observation usage within data assimilation (DA) systems. Lower estimated error and a more realistic representation of AMVs with variational FTC (VarFTC) should result in more information being extracted. The FTC observation operator accomplishes this by accounting for the effects of h and Δz.
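A minimal sketch of an FTC-style operator, under stated assumptions, is given below: the simulated AMV is a weighted vertical average of the background wind profile (weights centred at the corrected height over a layer of depth Δz), scaled by γ and shifted by δV. The boxcar weighting and all names are illustrative; the operator used in the study may differ in detail.

```python
# Illustrative FTC-type observation operator: h(x) = gamma * sum_k(w_k * V_k) + dV,
# with layer weights w_k defined by a corrected centre height and a layer depth dz.
import numpy as np

def ftc_operator(v_profile, z_levels, z_amv, gamma, dV, h, dz):
    """Simulated AMV wind from a background profile via the FTC parameters."""
    z_center = z_amv + h                                    # corrected height assignment
    w = np.where(np.abs(z_levels - z_center) <= 0.5 * dz, 1.0, 0.0)
    w = w / w.sum() if w.sum() > 0 else w                   # normalised layer weights w_k
    return gamma * np.dot(w, v_profile) + dV                # weighted average, scaled and shifted

# Example: synthetic HLOS-like wind profile on 0.25 km levels, AMV reported at 5 km.
z = np.arange(0.0, 15.0, 0.25)
v = 10.0 + 0.5 * z                                          # synthetic wind profile (m/s)
print(ftc_operator(v, z, z_amv=5.0, gamma=1.05, dV=-0.3, h=0.5, dz=4.5))
```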


2017, Vol. 145 (2), pp. 709-725
Author(s): Alison Margaret Fowler

There is a vast amount of information about the atmosphere available from instruments on board satellites. One example is the Infrared Atmospheric Sounding Interferometer (IASI) instrument, which measures radiances emitted from Earth’s atmosphere and surface in 8461 channels. It is difficult to transmit, store, and assimilate such a large amount of data. A practical solution to this has been to select a subset of a few hundred channels based on those that contain the most useful information. Different measures of information content for objective channel selection have been suggested for application to variational data assimilation. These include mutual information and the degrees of freedom for signal. To date, the calculation of these measures of information content has been based on the linear theory that is at the heart of operational variational data assimilation. However, the retrieval of information about the atmosphere from the satellite radiances can be highly nonlinear. Here, a sampling method for calculating the mutual information that is free from assumptions about the linearity of the relationship between the observed radiances and the state variables is examined. It is found that large linearization errors can indeed lead to large discrepancies in the value of mutual information. How this new estimate of information content can be used in channel selection is addressed, with particular attention given to the efficiency of the new method. It is anticipated that accounting for the nonlinearity in the channel selection will be beneficial when using nonlinear data assimilation methods currently in development.
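Objective channel selection of the kind described above is typically a greedy loop driven by an information measure. The sketch below uses the linear-Gaussian mutual information for illustration; the paper's sampling-based, nonlinearity-free estimate could be substituted for `linear_mi` in the same loop. All matrices and names here are toy assumptions.

```python
# Greedy channel selection driven by a linear-Gaussian mutual information measure.
import numpy as np

def linear_mi(K, Sa, Se):
    """Mutual information (bits) of channels K under Gaussian prior Sa and noise Se."""
    A = np.eye(Sa.shape[0]) + Sa @ K.T @ np.linalg.inv(Se) @ K
    _, logdet = np.linalg.slogdet(A)
    return 0.5 * logdet / np.log(2.0)

def greedy_select(K, Sa, sigma_e, n_select):
    """Pick n_select channels, each time adding the one giving the largest MI gain."""
    selected = []
    for _ in range(n_select):
        best, best_mi = None, -np.inf
        for ch in range(K.shape[0]):
            if ch in selected:
                continue
            idx = selected + [ch]
            mi = linear_mi(K[idx], Sa, np.diag(sigma_e[idx] ** 2))
            if mi > best_mi:
                best, best_mi = ch, mi
        selected.append(best)
    return selected

# Toy example: 20 channels, 5 state variables.
rng = np.random.default_rng(2)
print(greedy_select(rng.normal(size=(20, 5)), np.eye(5), np.full(20, 0.5), n_select=4))
```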


2014, Vol. 31 (9), pp. 2008-2014
Author(s): Xin Zhang, Ying-Hwa Kuo, Shu-Ya Chen, Xiang-Yu Huang, Ling-Feng Hsiao

Abstract The nonlocal excess phase observation operator for assimilating global positioning system (GPS) radio occultation (RO) sounding data has been shown in previous studies to produce significantly better analyses for numerical weather prediction (NWP) than the local refractivity observation operator. However, the high computational cost and the difficulties in parallelization associated with the nonlocal GPS RO operator deter its application in research and operational NWP practice. In this article, two strategies are designed and implemented in the data assimilation system for the Weather Research and Forecasting Model to demonstrate the capability of parallel assimilation of GPS RO profiles with the nonlocal excess phase observation operator. In particular, to solve the parallel load imbalance problem caused by the uneven geographic distribution of the GPS RO observations, round-robin scheduling is adopted to distribute the GPS RO observations among the processing cores and balance the workload. The wall clock time required to complete a five-iteration minimization on a demonstration Antarctic case with 106 GPS RO observations is reduced from more than 3.5 h with a single processing core to 2.5 min with 106 processing cores. These strategies make it feasible to apply the nonlocal GPS RO excess phase observation operator in operational data assimilation systems with a cutoff time limit.
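The round-robin idea is simple to state in code: profiles are dealt out to cores in turn so that geographically clustered observations do not pile up on one core. The sketch below illustrates the scheduling idea only and is not the WRFDA implementation.

```python
# Round-robin distribution of observation indices over processing cores.
def round_robin(n_obs, n_cores):
    """Return a list mapping each core to the observation indices it processes."""
    assignment = [[] for _ in range(n_cores)]
    for i in range(n_obs):
        assignment[i % n_cores].append(i)   # deal observations out one core at a time
    return assignment

# Example from the case above: 106 GPS RO profiles spread over 8 cores.
for core, obs in enumerate(round_robin(106, 8)):
    print(f"core {core}: {len(obs)} profiles")
```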


2014, Vol. 10 (2), pp. 437-449
Author(s): P. Breitenmoser, S. Brönnimann, D. Frank

Abstract. We investigate relationships between climate and tree-ring data on a global scale using the process-based Vaganov–Shashkin Lite (VSL) forward model of tree-ring width formation. The VSL model requires as inputs only latitude, monthly mean temperature, and monthly accumulated precipitation. Hence, this simple, process-based model enables ring-width simulation at any location where monthly climate records exist. In this study, we analyse the growth response of simulated tree rings to monthly climate conditions obtained from the CRU TS3.1 data set back to 1901. Our key aims are (a) to assess the VSL model performance by examining the relations between simulated and observed growth at 2287 globally distributed sites, (b) to identify optimal growth parameters during the model calibration, and (c) to evaluate the potential of the VSL model as an observation operator for data-assimilation-based reconstructions of climate from tree-ring width. The assessment of the growth-onset threshold temperature of approximately 4–6 °C for most sites and species using a Bayesian estimation approach complements other studies on the lower temperature limits at which plant growth may be sustained. Our results suggest that the VSL model skilfully simulates site-level tree-ring series in response to climate forcing for a wide range of environmental conditions and species. Spatial aggregation of the tree-ring chronologies to reduce non-climatic noise at the site level yielded notable improvements in the coherence between modelled and actual growth. The resulting distinct and coherent patterns of significant relationships between the aggregated and simulated series further demonstrate the VSL model's ability to skilfully capture the climatic signal contained in tree-ring series. Finally, we propose that the VSL model can be used as an observation operator in data assimilation approaches to reconstruct past climate.
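A heavily simplified sketch of a VSL-type forward operator is given below, assuming monthly growth is the minimum of a temperature ramp and a moisture ramp scaled by an insolation term, with annual ring width as the sum over months. The parameter values (including the roughly 5 °C growth-onset threshold discussed above) are illustrative; the published VS-Lite code should be used for real applications.

```python
# Toy VSL-style ring-width forward operator (illustrative only).
import numpy as np

def ramp(x, lo, hi):
    """Piecewise-linear response: 0 below lo, 1 above hi, linear in between."""
    return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

def ring_width(temp, moist, insol, T1=5.0, T2=18.0, M1=0.02, M2=0.4):
    """Annual ring-width index from monthly temperature, moisture and insolation."""
    g = insol * np.minimum(ramp(temp, T1, T2), ramp(moist, M1, M2))
    return g.sum()

# Example year: 12 synthetic monthly means at a mid-latitude site.
temp = np.array([-5, -3, 2, 7, 13, 17, 19, 18, 13, 7, 1, -4], dtype=float)
moist = np.full(12, 0.25)
insol = np.sin(np.linspace(0, np.pi, 12))     # crude stand-in for day-length forcing
print(round(ring_width(temp, moist, insol), 2))
```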


2012, Vol. 518-523, pp. 1586-1591
Author(s): Hao Zhang, Ze Meng Zhao, Ahmet Palazoglu, Wei Sun

Surface ozone in the atmospheric boundary layer is one of the most harmful air pollutants. It is produced by photochemical reactions between nitrogen oxides and volatile hydrocarbons and causes great damage to human health and the environment. The prediction of surface ozone levels therefore plays an important role in the control and reduction of air pollution. Hidden Markov models (HMMs) are statistical prediction models that are rich in mathematical structure and work well in many important applications. Because of the complex structure of HMMs, however, long observation sequences increase the computational load geometrically. To reduce training time, wavelet decomposition is used to compress the original observations into shorter sequences. During the compression step, sequences compressed with different wavelet basis functions retain different amounts of information, which can affect the prediction results. In this paper, the ozone prediction performance of HMMs based on different wavelet basis functions is discussed. Shannon entropy is employed to measure how much information content is kept in the compressed sequence compared to the original one. Data from the Houston metropolitan area, TX, are used. Results show that the wavelet basis function used in the compression step can affect HMM performance significantly, and that the compressed sequence with the maximum Shannon entropy generates the best prediction result.
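A hedged sketch of the basis-selection step described above: each candidate wavelet basis compresses the ozone series to its level-N approximation coefficients, the Shannon entropy of the compressed sequence is estimated from a histogram, and the basis giving the largest entropy is retained for HMM training. The wavelet names, the level, and the histogram estimator are illustrative assumptions; PyWavelets (pywt) is assumed to be available.

```python
# Choose the wavelet basis whose compressed sequence keeps the most Shannon entropy.
import numpy as np
import pywt

def shannon_entropy(x, bins=32):
    """Histogram-based Shannon entropy (bits) of a 1-D sequence."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def best_basis(signal, candidates=("haar", "db4", "sym5", "coif3"), level=3):
    """Return the wavelet basis with the highest-entropy compressed sequence."""
    scores = {}
    for wav in candidates:
        approx = pywt.wavedec(signal, wav, level=level)[0]   # compressed observation sequence
        scores[wav] = shannon_entropy(approx)
    return max(scores, key=scores.get), scores

# Example with a synthetic hourly ozone-like series (60 days).
t = np.arange(24 * 60)
ozone = 40 + 20 * np.sin(2 * np.pi * t / 24) + np.random.default_rng(3).normal(0, 5, t.size)
print(best_basis(ozone))
```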



2020
Author(s): Giovanni Leone, Fortuna Munno, Rocco Pierri

This manuscript has been accepted for publication in IEEE Transactions on Antennas and Propagation.

Abstract: The paper adopts an inverse problem approach to examine the role of some 2-D geometries in source reconstruction from far-zone data. It aims at evaluating the number of independent pieces of information, i.e. the number of degrees of freedom (NDF), of the source, and at pointing out the set of far-zone fields corresponding to stable solutions of the inverse problem. Some of the results are relevant to the synthesis problem of conformal antennas, since a general comparison of different source geometries in providing radiation pattern specifications is proposed.
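A hedged numerical sketch of an NDF evaluation of this kind: the far-zone radiation operator of a 2-D source supported on a circular arc is discretized, its singular values are computed, and the NDF is estimated as the number of singular values above a relative threshold (the knee of the singular-value spectrum). The geometry, wavenumber, and threshold below are illustrative choices, not the paper's cases.

```python
# Estimate the NDF of a 2-D arc source from the singular values of its far-zone operator.
import numpy as np

k = 2 * np.pi                                    # wavenumber (wavelength = 1)
radius = 3.0                                     # arc radius in wavelengths
phi_src = np.linspace(0, np.pi, 200)             # source points on a half circle
obs = np.linspace(0, 2 * np.pi, 400)             # far-zone observation angles

# Far-zone operator: plane-wave phase factor exp(-j k r_hat . r_src) for each angle pair.
x, y = radius * np.cos(phi_src), radius * np.sin(phi_src)
A = np.exp(-1j * k * (np.outer(np.cos(obs), x) + np.outer(np.sin(obs), y)))

s = np.linalg.svd(A, compute_uv=False)
ndf = int(np.sum(s > 0.1 * s[0]))                # count singular values before the drop-off
print("estimated NDF:", ndf)
```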

