Improved retrieval of nitrogen dioxide (NO2) column densities by means of MKIV Brewer spectrophotometers

2014 ◽  
Vol 7 (11) ◽  
pp. 4009-4022 ◽  
Author(s):  
H. Diémoz ◽  
A. M. Siani ◽  
A. Redondas ◽  
V. Savastiouk ◽  
C. T. McElroy ◽  
...  

Abstract. A new algorithm to retrieve nitrogen dioxide (NO2) column densities using MKIV ("Mark IV") Brewer spectrophotometers is described. The method includes several improvements, such as a more recent spectroscopic data set, the reduction of measurement noise and of interference by other atmospheric species and instrumental settings, and a better determination of the zenith sky air mass factor. The technique was tested during an ad hoc calibration campaign at the high-altitude site of Izaña (Tenerife, Spain), and the results of the direct sun and zenith sky geometries were compared to those obtained by two reference instruments from the Network for the Detection of Atmospheric Composition Change (NDACC): a Fourier Transform Infrared Radiometer (FTIR) and an advanced visible spectrograph (RASAS-II) based on the differential optical absorption spectroscopy (DOAS) technique. To determine the extraterrestrial constant, an easily implementable extension of the standard Langley technique for very clean sites without tropospheric NO2 was developed which takes into account the daytime linear drift of stratospheric nitrogen dioxide due to photochemistry. The measurement uncertainty was thoroughly determined by using a Monte Carlo technique. Poisson noise and wavelength misalignments were found to be the most influential contributors to the overall uncertainty, and possible solutions are proposed for future improvements. The new algorithm is backward-compatible, thus allowing for the reprocessing of historical data sets.
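The extended Langley technique described above can be sketched numerically: instead of assuming a constant column during the calibration day, the column is modelled with a linear drift, and the extraterrestrial constant is recovered by a joint least-squares fit. The model, variable names, and units below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Synthetic Langley data: measured slant column S_i = ETC + (V0 + k*t_i) * mu_i,
# where mu_i is the air mass factor, t_i the time (hours), ETC the
# extraterrestrial constant, V0 the column at t = 0 and k the linear
# photochemical drift. All values are toy numbers in DU-like units.
rng = np.random.default_rng(0)
t = np.linspace(-4.0, 4.0, 60)                      # hours around noon
mu = 1.0 / np.cos(np.radians(40 + 5 * np.abs(t)))   # toy air-mass curve
ETC_true, V0_true, k_true = 0.35, 0.12, 0.004
S = ETC_true + (V0_true + k_true * t) * mu
S += rng.normal(0, 1e-4, S.size)                    # measurement noise

# Extended Langley fit: regress S on [1, mu, t*mu] so that ETC, V0 and the
# drift k are estimated simultaneously rather than assuming k = 0.
A = np.column_stack([np.ones_like(mu), mu, t * mu])
ETC_fit, V0_fit, k_fit = np.linalg.lstsq(A, S, rcond=None)[0]
print(ETC_fit, V0_fit, k_fit)
```

With a genuinely constant column the third regressor simply comes back near zero, so the extension reduces to the standard Langley plot.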

2014 ◽  
Vol 7 (7) ◽  
pp. 7367-7396
Author(s):  
H. Diémoz ◽  
A. M. Siani ◽  
A. Redondas ◽  
V. Savastiouk ◽  
C. T. McElroy

Abstract. A new algorithm to retrieve nitrogen dioxide (NO2) column densities using MKIV Brewer spectrophotometers is described. The method includes several improvements, such as a more recent spectroscopic data set, the reduction of measurement noise and of interferences by other atmospheric species and instrumental settings, and a better determination of the air mass enhancement factors. The technique was tested during an ad hoc calibration campaign at the high-altitude site of Izaña (Tenerife, Spain) and provided results compatible with those obtained from a spectrometer associated with the Network for the Detection of Atmospheric Composition Change (NDACC), with deviations of less than 0.02 DU. To determine the extraterrestrial constant, an easily implementable generalisation of the standard Langley technique was developed which takes into account the daytime linear drift of nitrogen dioxide due to photochemistry. Estimates obtained from different observation geometries, by collecting the light from either the sun or the zenith sky, were found to be comparable within the measurement uncertainty. The latter was thoroughly determined by using a Monte Carlo technique. Finally, a method to retrieve additional products, such as the degree of linear polarisation of the zenith sky and the oxygen dimer optical depth, is presented. The new algorithm is backward-compatible, thus allowing for the reprocessing of historical data sets.


2015 ◽  
Vol 8 (6) ◽  
pp. 2417-2435 ◽  
Author(s):  
F. Tack ◽  
F. Hendrick ◽  
F. Goutail ◽  
C. Fayt ◽  
A. Merlaud ◽  
...  

Abstract. We present an algorithm for retrieving tropospheric nitrogen dioxide (NO2) vertical column densities (VCDs) from ground-based zenith-sky (ZS) measurements of scattered sunlight. The method is based on a four-step approach consisting of (1) the differential optical absorption spectroscopy (DOAS) analysis of ZS radiance spectra using a fixed reference spectrum corresponding to low NO2 absorption, (2) the determination of the residual amount in the reference spectrum using a Langley-plot-type method, (3) the removal of the stratospheric content from the daytime total measured slant column, based on stratospheric VCDs measured at sunrise and sunset and a simulation of the rapid NO2 diurnal variation, and (4) the retrieval of tropospheric VCDs by dividing the resulting tropospheric slant columns by appropriate air mass factors (AMFs). These steps are fully characterized and recommendations are given for each of them. The retrieval algorithm is applied to a ZS data set acquired with a multi-axis (MAX-) DOAS instrument during the Cabauw (51.97° N, 4.93° E, sea level) Intercomparison campaign for Nitrogen Dioxide measuring Instruments (CINDI) held from 10 June to 21 July 2009 in the Netherlands. A median value of 7.9 × 1015 molec cm−2 is found for the retrieved tropospheric NO2 VCDs, with maxima up to 6.0 × 1016 molec cm−2. The error budget assessment indicates that the overall error σTVCD on the column values is less than 28%. In the case of a low tropospheric contribution, σTVCD is estimated to be around 39% and is dominated by uncertainties in the determination of the residual amount in the reference spectrum. For strong tropospheric pollution events, σTVCD drops to approximately 22%, with the largest uncertainties arising from the determination of the stratospheric NO2 abundance and the tropospheric AMFs. 
The tropospheric VCD amounts derived from ZS observations are compared to VCDs retrieved from off-axis and direct-sun measurements of the same MAX-DOAS instrument, as well as to data from a co-located Système d'Analyse par Observations Zénithales (SAOZ) spectrometer. The retrieved tropospheric VCDs agree well with the different data sets, with correlation coefficients and slopes close to or larger than 0.9. The potential of the presented ZS retrieval algorithm is further demonstrated by its successful application to a 2-year data set acquired at the NDACC (Network for the Detection of Atmospheric Composition Change) station Observatoire de Haute-Provence (OHP; southern France).
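The column arithmetic of step (4) can be captured in a few lines: the reference-spectrum residual is added back, the stratospheric slant column is subtracted, and the remainder is divided by the tropospheric AMF. Names and sample numbers below are illustrative, not the paper's values.

```python
# Tropospheric VCD = (measured differential SCD + residual in the reference
#                     spectrum - stratospheric SCD) / tropospheric AMF.
def tropospheric_vcd(dscd_meas, scd_ref_residual, scd_strat, amf_trop):
    """Convert a measured differential slant column into a tropospheric VCD."""
    scd_total = dscd_meas + scd_ref_residual   # undo the reference offset (step 2)
    scd_trop = scd_total - scd_strat           # remove the stratospheric part (step 3)
    return scd_trop / amf_trop                 # slant -> vertical conversion (step 4)

# Example in units of 1e15 molec cm-2 with an assumed tropospheric AMF of 2.0:
vcd = tropospheric_vcd(dscd_meas=30.0, scd_ref_residual=5.0,
                       scd_strat=15.0, amf_trop=2.0)
print(vcd)  # 10.0
```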


2012 ◽  
Vol 12 (5) ◽  
pp. 12357-12389
Author(s):  
F. Hendrick ◽  
E. Mahieu ◽  
G. E. Bodeker ◽  
K. F. Boersma ◽  
M. P. Chipperfield ◽  
...  

Abstract. The trend in stratospheric NO2 column at the NDACC (Network for the Detection of Atmospheric Composition Change) station of Jungfraujoch (46.5° N, 8.0° E) is assessed using ground-based FTIR and zenith-scattered visible sunlight SAOZ measurements over the period 1990 to 2009 as well as a composite satellite nadir data set constructed from ERS-2/GOME, ENVISAT/SCIAMACHY, and METOP-A/GOME-2 observations over the 1996–2009 period. To calculate the trends, a linear least squares regression model including explanatory variables for a linear trend, the mean annual cycle, the quasi-biennial oscillation (QBO), solar activity, and stratospheric aerosol loading is used. For the 1990–2009 period, statistically indistinguishable trends of −3.7 ± 1.1%/decade and −3.6 ± 0.9%/decade are derived for the SAOZ and FTIR NO2 column time series, respectively. SAOZ, FTIR, and satellite nadir data sets show a similar decrease over the 1996–2009 period, with trends of −2.4 ± 1.1%/decade, −4.3 ± 1.4%/decade, and −3.6 ± 2.2%/decade, respectively. The fact that these declines are opposite in sign to the globally observed +2.5%/decade trend in N2O suggests that factors other than N2O are driving the evolution of stratospheric NO2 at northern mid-latitudes. Possible causes of the decrease in stratospheric NO2 columns have been investigated. The most likely cause is a change in the NO2/NO partitioning in favor of NO, due to a possible stratospheric cooling and a decrease in stratospheric chlorine content, the latter being further confirmed by the negative trend in the ClONO2 column derived from FTIR observations at Jungfraujoch. Decreasing ClO concentrations slow the NO + ClO → NO2 + Cl reaction and a stratospheric cooling slows the NO + O3 → NO2 + O2 reaction, leaving more NOx in the form of NO. The slightly positive trends in ozone estimated from ground- and satellite-based data sets are also consistent with the decrease of NO2 through the NO2 + O3 → NO3 + O2 reaction. 
Finally, we cannot rule out the possibility that a strengthening of the Brewer-Dobson circulation, which reduces the time available for N2O photolysis in the stratosphere, could also contribute to the observed decline in stratospheric NO2 above Jungfraujoch.
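The regression model described in the abstract can be sketched as an ordinary least-squares fit of a monthly time series on a design matrix holding the trend, annual-cycle, and proxy terms. The proxies below are random placeholders standing in for the observed QBO, solar, and aerosol indices; all numbers are synthetic.

```python
import numpy as np

# Monthly series modelled as offset + linear trend + annual cycle + proxies.
rng = np.random.default_rng(1)
n = 168                                    # 14 years of monthly means
t = np.arange(n) / 12.0                    # time in years
qbo = rng.normal(size=n)                   # placeholder QBO index
solar = rng.normal(size=n)                 # placeholder solar proxy
aerosol = rng.normal(size=n)               # placeholder aerosol proxy

X = np.column_stack([
    np.ones(n),                                # offset
    t,                                         # linear trend term
    np.sin(2*np.pi*t), np.cos(2*np.pi*t),      # mean annual cycle
    qbo, solar, aerosol,                       # explanatory proxies
])
trend_true = -0.004                            # fractional change per year
y = (1.0 + trend_true*t + 0.05*np.sin(2*np.pi*t)
     + 0.01*qbo + rng.normal(0, 0.005, n))
beta = np.linalg.lstsq(X, y, rcond=None)[0]

# Express the fitted trend as a percentage per decade, the unit in the text:
trend_pct_per_decade = 100 * beta[1] * 10 / beta[0]
print(trend_pct_per_decade)
```

The fitted slope converted this way lands near −4%/decade for the synthetic series, the same order as the observed NO2 declines.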


Entropy ◽  
2020 ◽  
Vol 22 (9) ◽  
pp. 1050
Author(s):  
Masengo Ilunga

This study mainly assesses the uncertainty of the mean annual runoff (MAR) for quaternary catchments (QCs) considered as metastable nonextensive systems (from Tsallis entropy) in the Middle Vaal catchment. The study is applied to the surface water resources (WR) of the South Africa 1990 (WR90), 2005 (WR2005) and 2012 (WR2012) data sets. The q-information index (from the Tsallis entropy) is used here as a deviation indicator for the spatial evolution of uncertainty for the different QCs, using the Shannon entropy as a baseline. It enables the determination of a (virtual) convergence point, zone of positive and negative uncertainty deviation, zone of null deviation and chaotic zone for each data set. Such a determination is not possible on the basis of the Shannon entropy alone as a measure for the MAR uncertainty of QCs, i.e., when they are viewed as extensive systems. Finally, the spatial distributions for the zones of the q-uncertainty deviation (gain or loss in information) of the MAR are derived and lead to iso q-uncertainty deviation maps.
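The two entropies contrasted in the study can be illustrated directly. For a discrete probability distribution p, the Tsallis entropy S_q = (1 − Σ p_i^q)/(q − 1) generalises the Shannon entropy, which is recovered in the limit q → 1; the sample distribution below is arbitrary.

```python
import numpy as np

def shannon(p):
    """Shannon entropy (natural log) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # 0 * log 0 is taken as 0
    return -np.sum(p * np.log(p))

def tsallis(p, q):
    """Tsallis entropy S_q; reduces to Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    if q == 1.0:
        return shannon(p)
    return (1.0 - np.sum(p**q)) / (q - 1.0)

p = [0.5, 0.3, 0.2]
print(shannon(p))             # ~1.0297
print(tsallis(p, q=0.999))    # approaches the Shannon value as q -> 1
```

The deviation between S_q and the Shannon baseline at a fixed q is the kind of q-dependent information difference the study maps across catchments.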


Endocrinology ◽  
2019 ◽  
Vol 160 (10) ◽  
pp. 2395-2400 ◽  
Author(s):  
David J Handelsman ◽  
Lam P Ly

Abstract Hormone assay results below the assay detection limit (DL) can introduce bias into quantitative analysis. Although complex maximum likelihood estimation methods exist, they are not widely used, whereas simple substitution methods are often used ad hoc to replace the undetectable (UD) results with numeric values to facilitate data analysis with the full data set. However, the bias of substitution methods for steroid measurements is not reported. Using a large data set (n = 2896) of serum testosterone (T), DHT, and estradiol (E2) concentrations from healthy men, we created modified data sets with increasing proportions of UD samples (≤40%) to which we applied five different substitution methods (deleting UD samples as missing and substituting UD samples with DL, DL/√2, DL/2, or 0) to calculate univariate descriptive statistics (mean, SD) or bivariate correlations. For all three steroids and for univariate as well as bivariate statistics, bias increased progressively with increasing proportion of UD samples. Bias was worst when UD samples were deleted or substituted with 0 and least when UD samples were substituted with DL/√2, whereas the other methods (DL or DL/2) displayed intermediate bias. Similar findings were replicated in randomly drawn small subsets of 25, 50, and 100 samples. Hence, we propose that in steroid hormone data with ≤40% UD samples, substituting UD with DL/√2 is a simple, versatile, and reasonably accurate method to minimize left censoring bias, allowing for data analysis with the full data set.
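The substitution comparison is easy to reproduce on synthetic data: censor values below an assumed detection limit, apply each substitution rule, and compare the resulting mean with the true uncensored mean. The data below are synthetic lognormal draws, not the study's hormone measurements, and the detection limit is an arbitrary assumption.

```python
import math
import random

random.seed(42)
dl = 0.5                                             # assumed detection limit
data = [random.lognormvariate(0.0, 0.6) for _ in range(2000)]
true_mean = sum(data) / len(data)

def substituted_mean(values, dl, sub):
    """Replace values below dl with `sub` (None = delete them), then average."""
    kept = [v if v >= dl else sub for v in values]
    kept = [v for v in kept if v is not None]
    return sum(kept) / len(kept)

# Bias of each rule = substituted mean - true (uncensored) mean.
biases = {}
for label, sub in [("delete", None), ("zero", 0.0), ("DL", dl),
                   ("DL/2", dl / 2), ("DL/sqrt2", dl / math.sqrt(2))]:
    biases[label] = substituted_mean(data, dl, sub) - true_mean
    print(f"{label:9s} bias = {biases[label]:+.4f}")
```

On this synthetic set, deletion and zero substitution show the largest absolute bias and DL/√2 the smallest, mirroring the paper's ranking.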


2012 ◽  
Vol 12 (18) ◽  
pp. 8851-8864 ◽  
Author(s):  
F. Hendrick ◽  
E. Mahieu ◽  
G. E. Bodeker ◽  
K. F. Boersma ◽  
M. P. Chipperfield ◽  
...  

Abstract. The trend in stratospheric NO2 column at the NDACC (Network for the Detection of Atmospheric Composition Change) station of Jungfraujoch (46.5° N, 8.0° E) is assessed using ground-based FTIR and zenith-scattered visible sunlight SAOZ measurements over the period 1990 to 2009 as well as a composite satellite nadir data set constructed from ERS-2/GOME, ENVISAT/SCIAMACHY, and METOP-A/GOME-2 observations over the 1996–2009 period. To calculate the trends, a linear least squares regression model including explanatory variables for a linear trend, the mean annual cycle, the quasi-biennial oscillation (QBO), solar activity, and stratospheric aerosol loading is used. For the 1990–2009 period, statistically indistinguishable trends of −3.7 ± 1.1% decade−1 and −3.6 ± 0.9% decade−1 are derived for the SAOZ and FTIR NO2 column time series, respectively. SAOZ, FTIR, and satellite nadir data sets show a similar decrease over the 1996–2009 period, with trends of −2.4 ± 1.1% decade−1, −4.3 ± 1.4% decade−1, and −3.6 ± 2.2% decade−1, respectively. The fact that these declines are opposite in sign to the globally observed +2.5% decade−1 trend in N2O suggests that factors other than N2O are driving the evolution of stratospheric NO2 at northern mid-latitudes. Possible causes of the decrease in stratospheric NO2 columns have been investigated. The most likely cause is a change in the NO2/NO partitioning in favor of NO, due to a possible stratospheric cooling and a decrease in stratospheric chlorine content, the latter being further confirmed by the negative trend in the ClONO2 column derived from FTIR observations at Jungfraujoch. Decreasing ClO concentrations slow the NO + ClO → NO2 + Cl reaction and a stratospheric cooling slows the NO + O3 → NO2 + O2 reaction, leaving more NOx in the form of NO. The slightly positive trends in ozone estimated from ground- and satellite-based data sets are also consistent with the decrease of NO2 through the NO2 + O3 → NO3 + O2 reaction. 
Finally, we cannot rule out the possibility that a strengthening of the Brewer-Dobson circulation, which reduces the time available for N2O photolysis in the stratosphere, could also contribute to the observed decline in stratospheric NO2 above Jungfraujoch.


Geophysics ◽  
1993 ◽  
Vol 58 (3) ◽  
pp. 408-418 ◽  
Author(s):  
L. R. Jannaud ◽  
P. M. Adler ◽  
C. G. Jacquin

A method for determining the characteristic lengths of a heterogeneous medium from the spectral analysis of codas is presented, based on an extension of Aki's theory to anisotropic elastic media. An equivalent Gaussian model is obtained and seems to be in good agreement with the two experimental data sets that illustrate the method. The first set was obtained in a laboratory experiment with an isotropic marble sample. This sample is characterized by a submillimetric length scale that can be directly observed on a thin section. The spectral analysis of codas and their inversion yields an equivalent correlation length that is in good agreement with the observed one. The second data set was obtained in a crosshole experiment at the usual scale of a seismic survey. The codas are recorded, analysed, and inverted. The analysis yields a vertical characteristic length for the studied subsurface that compares well with the characteristic length measured by seismic and stratigraphic logs.


2008 ◽  
Vol 41 (1) ◽  
pp. 83-95 ◽  
Author(s):  
Alexander Dudka

New methods for the determination of site occupancy factors are described. The methods are based on the analysis of differences between intensities of Friedel reflections in noncentrosymmetric crystals. In the first method (Anomalous-Expert), the site occupancy factor is determined by the condition that it is identical for two data sets: (1) initial data without averaging of Friedel intensities and (2) data that are averaged over Friedel pairs after the reduction of the anomalous scattering contribution. In the second method (anomalous anisotropic intermeasurement minimization method, Anomalous-AniMMM), the site occupancy factor is refined to satisfy the condition that the differences between the intensities of Friedel reflections that are reduced on the anomalous scattering contribution must be minimal. The methods were checked for three samples of RbTi1−xZrxOPO4 crystals (A, B and C) with the KTiOPO4 structure, at 295 and 105 K (five experimental data sets). Microprobe measurements yield compositions xA,B = 0.034 (5) and xC = 0.022 (4). The corresponding site occupancy factors are QA,B = 0.932 (10) and QC = 0.956 (8). Using Anomalous-AniMMM and three independent refinements for the first and second samples, the initial occupancy factor of QA,B = 0.963 (15) was improved to QA,B = 0.938 (7). Of the three room-temperature data sets, one was improved to QA,B = 0.934 (2). For the third sample and one data set, the initial occupancy factor of QC = 1.000 was improved to QC = 0.956 (1). The methods improve the Hirshfeld rigid-bond test. It is discussed how the description of chemical bonding influences the site occupancy factor.


2021 ◽  
Vol 2021 ◽  
pp. 1-12
Author(s):  
Guoyu Du ◽  
Xuehua Li ◽  
Lanjie Zhang ◽  
Libo Liu ◽  
Chaohua Zhao

The K-means algorithm has been extensively investigated in the field of text clustering because of its linear time complexity and adaptation to sparse matrix data. However, it has two main problems, namely, the determination of the number of clusters and the location of the initial cluster centres. In this study, we propose an improved K-means++ algorithm based on the Davies-Bouldin index (DBI) and the largest sum of distance called the SDK-means++ algorithm. Firstly, we use the term frequency-inverse document frequency to represent the data set. Secondly, we measure the distance between objects by cosine similarity. Thirdly, the initial cluster centres are selected by comparing the distance to existing initial cluster centres and the maximum density. Fourthly, clustering results are obtained using the K-means++ method. Lastly, DBI is used to obtain optimal clustering results automatically. Experimental results on real bank transaction volume data sets show that the SDK-means++ algorithm is more effective and efficient than two other algorithms in organising large financial text data sets. The F-measure value of the proposed algorithm is 0.97. The running time of the SDK-means++ algorithm is reduced by 42.9% and 22.4% compared with that for K-means and K-means++ algorithms, respectively.
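The k-selection step of the pipeline can be sketched with plain NumPy: run K-means++ for several candidate k and keep the clustering with the lowest Davies-Bouldin index (DBI). The toy 2-D points below stand in for TF-IDF vectors, and this is only the selection idea the algorithm builds on, not the authors' SDK-means++ code.

```python
import numpy as np

rng = np.random.default_rng(3)
# Three well-separated synthetic groups standing in for document vectors.
X = np.vstack([rng.normal(c, 0.3, size=(40, 2))
               for c in ((0, 0), (6, 0), (0, 6))])

def kmeans_pp(X, k, iters=50):
    # K-means++ seeding: each new centre is drawn with probability
    # proportional to the squared distance to the nearest existing centre.
    centres = [X[rng.integers(len(X))]]
    while len(centres) < k:
        d2 = np.min([np.sum((X - c)**2, axis=1) for c in centres], axis=0)
        centres.append(X[rng.choice(len(X), p=d2 / d2.sum())])
    C = np.array(centres)
    for _ in range(iters):                          # Lloyd iterations
        lab = np.argmin(((X[:, None] - C)**2).sum(-1), axis=1)
        C = np.array([X[lab == j].mean(0) if np.any(lab == j) else C[j]
                      for j in range(k)])
    return C, lab

def davies_bouldin(X, C, lab):
    # DBI: average, over clusters, of the worst (scatter_i + scatter_j) /
    # centre-distance ratio; lower is better.
    k = len(C)
    s = np.array([np.mean(np.linalg.norm(X[lab == j] - C[j], axis=1))
                  for j in range(k)])
    return np.mean([max((s[i] + s[j]) / np.linalg.norm(C[i] - C[j])
                        for j in range(k) if j != i) for i in range(k)])

scores = {}
for k in range(2, 5):
    C, lab = kmeans_pp(X, k)
    scores[k] = davies_bouldin(X, C, lab)
best_k = min(scores, key=scores.get)
print(best_k, {k: round(v, 3) for k, v in scores.items()})
```

On data with three clear groups, the DBI minimum picks k = 3 automatically, which is the role the index plays in the SDK-means++ loop.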


Author(s):  
Tushar ◽  
Tushar ◽  
Shibendu Shekhar Roy ◽  
Dilip Kumar Pratihar

Clustering is a potential tool of data mining. A clustering method analyzes the pattern of a data set and groups the data into several clusters based on the similarity among themselves. Clusters may be either crisp or fuzzy in nature. The present chapter deals with clustering of some data sets using the Fuzzy C-Means (FCM) algorithm and the Entropy-based Fuzzy Clustering (EFC) algorithm. In the FCM algorithm, the nature and quality of clusters depend on the pre-defined number of clusters, the level of cluster fuzziness and a threshold value utilized for obtaining the number of outliers (if any). On the other hand, the quality of clusters obtained by the EFC algorithm depends on a constant used to establish the relationship between the distance and similarity of two data points, a threshold value of similarity and another threshold value used for determining the number of outliers. The clusters should ideally be distinct and at the same time compact in nature. Moreover, the number of outliers should be as small as possible. Thus, the above problem may be posed as an optimization problem, which will be solved using a Genetic Algorithm (GA). The best set of multi-dimensional clusters will be mapped into 2-D for visualization using a Self-Organizing Map (SOM).
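One ingredient of the chapter, the standard FCM update loop, can be sketched in a few lines: memberships follow the inverse-distance rule u_ij = 1 / Σ_k (d_ij/d_ik)^(2/(m−1)), where m > 1 is the fuzziness level mentioned in the text, and centres are membership-weighted means. The data and the farthest-point initialisation below are illustrative assumptions, not the chapter's setup.

```python
import numpy as np

def fcm(X, c=2, m=2.0, iters=100):
    """Standard Fuzzy C-Means: returns centres and the membership matrix U."""
    # Farthest-point initialisation so the starting centres are spread out.
    centres = [X[0]]
    for _ in range(c - 1):
        d = np.min([np.linalg.norm(X - ctr, axis=1) for ctr in centres], axis=0)
        centres.append(X[np.argmax(d)])
    centres = np.array(centres)
    for _ in range(iters):
        # Distances of every point to every centre, shape (n, c).
        d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)          # membership update
        Um = U ** m
        centres = (Um.T @ X) / Um.sum(axis=0)[:, None]    # weighted centres
    return centres, U

# Two well-separated synthetic blobs; FCM should assign each blob near-crisp
# memberships in its own cluster.
rng = np.random.default_rng(7)
X = np.vstack([rng.normal((0, 0), 0.2, (30, 2)),
               rng.normal((3, 3), 0.2, (30, 2))])
centres, U = fcm(X, c=2, m=2.0)
hard = U.argmax(axis=1)       # hardened labels for inspection
print(np.round(centres, 2))
```

Increasing m softens the memberships toward 1/c everywhere, which is the fuzziness trade-off the chapter's GA tunes alongside the thresholds.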

