Optimal use of buffer volumes for the measurement of atmospheric gas concentration in multi-point systems

2016 ◽  
Vol 9 (9) ◽  
pp. 4665-4672 ◽  
Author(s):  
Alessandro Cescatti ◽  
Barbara Marcolla ◽  
Ignacio Goded ◽  
Carsten Gruening

Abstract. Accurate multi-point monitoring systems are required to derive atmospheric measurements of greenhouse gas concentrations both for the calculation of surface fluxes with inversion transport models and for the estimation of non-turbulent components of the mass balance equation (i.e. advection and storage fluxes) at eddy covariance sites. When a single analyser is used to monitor multiple sampling points, the deployment of buffer volumes (BVs) along sampling lines can reduce the uncertainty due to the discrete temporal sampling of the signal. In order to optimize the use of buffer volumes we explored various set-ups by simulating their effect on time series of high-frequency CO2 concentration collected at three Fluxnet sites. In addition, we propose a novel scheme to calculate half-hourly weighted arithmetic means from discrete point samples, accounting for the probabilistic fraction of the signal generated in the averaging period. Results show that the use of BVs with the new averaging scheme reduces the mean absolute error (MAE) by up to 80 % compared to a set-up without BVs and by up to 60 % compared to the case with BVs and a standard, non-weighted averaging scheme. The MAE of CO2 concentration measurements was observed to depend on the variability of the concentration field and on the size of BVs, which therefore have to be carefully dimensioned. The optimal volume size depends on two main features of the instrumental set-up: the number of measurement points and the time needed to sample at one point (i.e. line purging plus sampling time). A linear and consistent relationship was observed at all sites between the sampling frequency, which summarizes the two features mentioned above, and the renewal frequency associated with the volume. Ultimately, this empirical relationship can be applied to estimate the optimal volume size according to the technical specifications of the sampling system.
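The weighted averaging scheme described above can be sketched in code. This is a minimal illustration, assuming the buffer volume behaves as a well-mixed reservoir with renewal time tau = V/Q, so that a sample drawn at time t contains an exponentially distributed mixture of past inlet air; the function names and the numbers are illustrative and not the authors' implementation:

```python
import math

def window_fraction(t, t0, t1, tau):
    """Fraction of the signal in a well-mixed buffer volume (renewal
    time tau = V/Q) that originated inside the averaging window
    [t0, t1], for a sample drawn at time t >= t0."""
    a_min = max(0.0, t - t1)   # age of the youngest air from the window
    a_max = t - t0             # age of the oldest air from the window
    return math.exp(-a_min / tau) - math.exp(-a_max / tau)

def weighted_half_hour_mean(samples, t0, t1, tau):
    """Weighted arithmetic mean of discrete (time, concentration)
    samples, each weighted by the probabilistic fraction of its signal
    generated in [t0, t1]."""
    pairs = [(window_fraction(t, t0, t1, tau), c)
             for t, c in samples if t >= t0]
    wsum = sum(w for w, _ in pairs)
    return sum(w * c for w, c in pairs) / wsum

# Example: one visit to this sampling point every 8 min, buffer renewal
# time tau = 10 min, averaging window [0, 30] min; times in minutes,
# concentrations in ppm (all values invented).
samples = [(4, 400.2), (12, 401.0), (20, 400.5), (28, 399.8), (36, 400.1)]
print(weighted_half_hour_mean(samples, 0.0, 30.0, 10.0))
```

Note that the sample at t = 36 min, taken after the window closed, still receives a substantial weight because the buffer retains air from the averaging period.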



Author(s):  
Johannes Gradl ◽  
Florian Schwertfirm ◽  
Hans-Christoph Schwarzer ◽  
Hans-Joachim Schmid ◽  
Michael Manhart ◽  
...  

Mixing, and consequently fluid dynamics, is a key parameter for tailoring the particle size distribution (PSD) in nanoparticle precipitation. Owing to its fast and intensive mixing, a static T-mixer configuration is capable of synthesizing nanoparticles continuously. The flow and concentration fields of the applied mixer are investigated experimentally at different flow rates by Particle Image Velocimetry (PIV) and Laser Induced Fluorescence (LIF). The PIV measurements characterize the flow field in the mixer qualitatively, while the mixing process itself is quantified by the subsequent LIF measurements. A special feature of the LIF set-up is its ability to detect structures in the flow field that are smaller than the Batchelor length, giving detailed insight into the mixing process in a static T-mixer. In this study a CFD-based approach using Direct Numerical Simulation (DNS), combined with solid-formation kinetics obtained by solving population balance equations (PBE), is applied, using barium sulfate as the model material. A Lagrangian particle-tracking strategy couples the flow-field information with a micromixing model and with the classical theory of nucleation. We found that the DNS-PBE approach, including macro- and micromixing combined with the population balance, is capable of predicting the full PSD in nanoparticle precipitation for different operating parameters. In addition to the resulting PSD, this approach delivers 3D information on all subprocesses running in the mixer, i.e. supersaturation build-up and nucleation, which is visualized for different process variables.
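The classical theory of nucleation invoked above ties the nucleation rate steeply to supersaturation. A minimal sketch of the standard rate expression, with order-of-magnitude, barium-sulfate-like property values (the kinetic prefactor A and the parameter values are illustrative assumptions, not taken from the study):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def nucleation_rate(S, gamma, v_m, T, A=1e30):
    """Homogeneous nucleation rate from classical nucleation theory:
    J = A * exp(-16*pi*gamma^3*v_m^2 / (3*(kB*T)^3 * (ln S)^2)),
    with S the supersaturation ratio, gamma the interfacial energy
    (J/m^2), v_m the molecular volume (m^3) and A an assumed
    order-of-magnitude kinetic prefactor (1/m^3/s)."""
    barrier = (16.0 * math.pi * gamma**3 * v_m**2
               / (3.0 * (K_B * T) ** 3 * math.log(S) ** 2))
    return A * math.exp(-barrier)

# Illustrative values: gamma ~ 0.13 J/m^2, v_m ~ 8.6e-29 m^3, T = 298 K.
# The rate rises by many orders of magnitude across this range of S,
# which is why locally resolved supersaturation (macro/micro mixing)
# controls the resulting PSD.
for S in (100.0, 1000.0, 10000.0):
    print(S, nucleation_rate(S, 0.13, 8.6e-29, 298.0))
```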


2015 ◽  
Vol 37 (1) ◽  
pp. 43-47
Author(s):  
Sławomir Wąsik ◽  
Michał Arabski ◽  
Karolina Maciejec ◽  
Grażyna Suchanek ◽  
Anna Świercz

The objective of the present study has been to test the laser interferometry method in terms of its usability for investigating the sorption properties of minerals. The method was used to test the adsorption capacity of halloysite with respect to glucose, which is often found in industrial wastewater and whose excess can disturb the environmental eco-balance. The sorption capacity of halloysite was thus determined indirectly, based on a comparison of the concentration profiles and time characteristics of the glucose quantities released from the control solution and from the solution incubated with a halloysite adsorbent. An analysis of glucose diffusion was conducted in a two-chamber membrane system. On the basis of the obtained concentration profiles, the evolution of the concentration field was determined, as were the removal efficiency (%) and the amount of glucose adsorbed at equilibrium (qe, mg/g). The obtained results confirm the good sorption properties of halloysite with respect to the investigated substance, as well as the usability of the method for this kind of investigation. The presented tests suggest that the measurement set-up can be optimised in such a way that visual rendering and testing of the kinetics of direct release of the adsorbed substance from the studied material become possible.
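The two reported quantities follow from standard mass-balance definitions; a small sketch with invented example numbers (not data from the study):

```python
def removal_efficiency(c0, ce):
    """Percentage of solute removed from solution, from initial (c0)
    and equilibrium (ce) concentrations in the same units."""
    return 100.0 * (c0 - ce) / c0

def adsorbed_at_equilibrium(c0, ce, volume_l, mass_g):
    """qe in mg/g: solute taken up per gram of adsorbent, with
    concentrations in mg/L, solution volume in L, adsorbent mass in g."""
    return (c0 - ce) * volume_l / mass_g

# Illustrative numbers only: 1 g of adsorbent in 0.1 L of glucose
# solution, with concentration reduced from 500 to 320 mg/L.
print(removal_efficiency(500.0, 320.0))                 # -> 36.0 (%)
print(adsorbed_at_equilibrium(500.0, 320.0, 0.1, 1.0))  # about 18 mg/g
```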


2014 ◽  
Vol 14 (20) ◽  
pp. 27663-27729 ◽  
Author(s):  
T. Launois ◽  
P. Peylin ◽  
S. Belviso ◽  
B. Poulter

Abstract. Clear analogies between carbonyl sulfide (OCS) and carbon dioxide (CO2) diffusion pathways through leaves have been revealed by experimental studies, with plant uptake playing an important role for the atmospheric budget of both species. Here we use atmospheric OCS to evaluate the gross primary production (GPP) of three dynamic global vegetation models (LPJ, NCAR-CLM4 and ORCHIDEE). Vegetation uptake of OCS is modeled as a linear function of GPP and LRU, the ratio of OCS to CO2 deposition velocities to plants. New parameterizations for the non-photosynthetic sinks (oxic soils, atmospheric oxidation) and biogenic sources (oceans and anoxic soils) of OCS are also provided. Despite new large oceanic emissions, the global OCS budgets constructed with each vegetation model show sinks exceeding sources by several hundred Gg S yr−1. An inversion of the surface fluxes (optimization of a global scalar which accounts for flux uncertainties) led to balanced OCS global budgets, as atmospheric measurements suggest, mainly through a drastic reduction (−30%) of soil and vegetation uptakes. The amplitude of variations in atmospheric OCS mixing ratios is mainly dictated by the vegetation sink over the Northern Hemisphere. This allows biases in the GPP representations of the three selected models to be identified. The main bias patterns are (i) the terrestrial GPP of ORCHIDEE at high Northern latitudes is currently overestimated, (ii) the seasonal variations of the GPP are out of phase in the NCAR-CLM4 model, showing a maximum carbon uptake too early in spring in the northernmost ecosystems, (iii) the overall amplitude of the seasonal variations of GPP in NCAR-CLM4 is too small, and (iv) for the LPJ model, the GPP is slightly out of phase for northernmost ecosystems and the respiration fluxes might be too large in summer in the Northern Hemisphere.
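The linear vegetation-uptake model can be sketched as follows, assuming the common form F_OCS = GPP x LRU x [OCS]/[CO2] (the exact formulation and units used in the paper may differ; the numbers below are illustrative):

```python
def ocs_plant_uptake(gpp, lru, ocs_ppt, co2_ppm):
    """Vegetation OCS uptake scaled linearly from GPP:
    F_OCS = GPP * LRU * [OCS]/[CO2].
    gpp: CO2 assimilation flux (any flux unit); lru: ratio of OCS to
    CO2 deposition velocities; ocs_ppt, co2_ppm: ambient mole
    fractions. Returns the OCS flux in the same unit family as gpp."""
    mole_fraction_ratio = (ocs_ppt * 1e-12) / (co2_ppm * 1e-6)
    return gpp * lru * mole_fraction_ratio

# Illustrative values: GPP = 10 umol CO2 m-2 s-1, LRU = 1.6,
# ambient 500 ppt OCS and 400 ppm CO2
# -> uptake of 2e-5 umol (i.e. 20 pmol) OCS m-2 s-1.
print(ocs_plant_uptake(10.0, 1.6, 500.0, 400.0))
```

Because the relation is linear, an atmospheric OCS constraint maps directly onto GPP once LRU and the other OCS budget terms are fixed, which is what makes the model evaluation above possible.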


2018 ◽  
Vol 75 (10) ◽  
pp. 3347-3363 ◽  
Author(s):  
Wojciech W. Grabowski

The influence of pollution on the dynamics of deep convection continues to be a controversial topic. Arguably, only carefully designed numerical simulations can clearly separate the impact of aerosols from the effects of meteorological factors that affect moist convection. This paper argues that such a separation is virtually impossible using observations because of the insufficient accuracy of atmospheric measurements and the fundamental nature of the interaction between deep convection and its environment. To support this conjecture, results from numerical simulations are presented that apply modeling methodology previously developed by the author. The simulations consider small modifications, difficult to detect in observations, of the initial sounding, surface fluxes, and large-scale forcing tendencies. All these represent variations of meteorological conditions that affect deep convective dynamics independently of aerosols. The setup follows the case of daytime convective development over land based on observations during the Large-Scale Biosphere–Atmosphere (LBA) field project in Amazonia. The simulated observable macroscopic changes of convection, such as the surface precipitation and upper-tropospheric cloudiness, are similar to or larger than those resulting from changes of cloud condensation nuclei from pristine to polluted conditions studied previously using the same modeling case. Observations from Phase III of the Global Atmospheric Research Program Atlantic Tropical Experiment (GATE) are also used to support the argument concerning the impact of the large-scale forcing. The simulations suggest that the aerosol impacts on the dynamics of deep convection cannot be isolated from meteorological effects, at least for the daytime development of unorganized deep convection considered in this study.


Author(s):  
Ioannis P. Chochliouros

The European Authorities have promoted a specific and innovative framework for the use of electronic signatures, allowing the free flow of electronic signature-related products and services across borders, and ensuring a basic legal recognition of such facilities. The core aim was to promote the emergence of the internal market for certification products, mainly intending to satisfy various requirements for the proper use and immediate "adoption" of electronic signature applications related to e-government and personal e-banking services. Thus, a number of technical, procedural, and quality standards for electronic signature products and solutions have been developed, all conforming to the requirements imposed by the EU regulation and the relevant market needs. In the present work, we examine the role of standardization activities in promoting the needs of an "open" European market based on the effective usage of e-signatures, activities able to affect a great variety of technological, business-commercial, regulatory, and other issues. In any case, the transposition of legal requirements into technical specifications (or business practices) needs to be harmonized at the European member-state level in order to enable adequate interoperability of the final solutions proposed. Appropriate technical standards for the sector can help to establish a presumption of conformity, namely that the electronic signature products following or implementing them comply with all the legal requirements imposed, against the background of actual European policies. Thus, we discuss recent European and/or national initiatives to fulfil such a fundamental objective.
The European Electronic Signature Standardization Initiative (EESSI) was set up under the auspices of the European Commission to carry out a work program aimed at the development of standards (be they technical specifications or policy practices) that would facilitate the implementation of the basic legal instrument (the "Electronic Signatures Directive"). Two major streams of possible standards-setting work have been determined, covering: (i) qualitative and procedural standards for the provision of certification services and (ii) technical standards for product interoperability. We identify (and evaluate at a primary level) the basic components/modules of EESSI's specific results, already developed and offered in the market either as technical regulations and/or as recognized standards, with respect to the essential requirements imposed by the European regulation. We also discuss relevant "feedback" already gained from various market areas, and we focus on challenges for further implementation, progress, adoption, and development, especially in the framework of the promotion of converged broadband (Internet-based) communications facilities. It is important for the market that the expected standardization work takes into account new technological developments, as, in the future, users will move their e-signature key from device to device in a connected world. The added value of standards in the e-signatures sector, for both end users and assessing parties (judge, arbitrator, conformity assessment body, etc.), is of extreme importance for the future of the European electronic communications market.


2008 ◽  
Vol 136 (11) ◽  
pp. 4373-4397 ◽  
Author(s):  
Agata Moscatello ◽  
Mario Marcello Miglietta ◽  
Richard Rotunno

Abstract The presence of a subsynoptic-scale vortex over the Mediterranean Sea in southeastern Italy on 26 September 2006 has been recently documented by the authors. The transit of the cyclone over land allowed an accurate diagnosis of the structure of the vortex, based on radar and surface station data, showing that the cyclone had features similar to those observed in tropical cyclones. To investigate the cyclone in greater depth, numerical simulations have been performed using the Weather Research and Forecasting (WRF) model, set up with two domains, in a two-way-nested configuration. Model simulations are able to properly capture the timing and intensity of the small-scale cyclone. Moreover, the present simulated cyclone agrees with the observational analysis of this case, identifying in this small-scale depression the typical characteristics of a Mediterranean tropical-like cyclone. An analysis of the mechanisms responsible for the genesis, development, and maintenance of the cyclone has also been performed. Sensitivity experiments show that cyclogenesis on the lee side of the Atlas Mountains is responsible for the generation of the cyclone. Surface sensible and latent heat fluxes become important during the subsequent phase of development in which the lee-vortex shallow depression evolved as it moved toward the south of Sicily. During this phase, the latent heating, associated with convective motions triggered by a cold front entering the central Mediterranean area, was important for the intensification and contraction of the horizontal scale of the vortex. The small-scale cyclone subsequently deepened as it moved over the Ionian Sea and then maintained its intensity during its later transit over the Adriatic Sea; in this later stage, latent heat release continued to play a major role in amplifying and maintaining the vortex, while the importance of the surface fluxes diminished.


2007 ◽  
Vol 7 (16) ◽  
pp. 4249-4256 ◽  
Author(s):  
M. Buchwitz ◽  
O. Schneising ◽  
J. P. Burrows ◽  
H. Bovensmann ◽  
M. Reuter ◽  
...  

Abstract. The reliable prediction of future atmospheric CO2 concentrations and associated global climate change requires an adequate understanding of the CO2 sources and sinks. The sparseness of the existing surface measurement network limits current knowledge about the global distribution of CO2 surface fluxes. The retrieval of CO2 total vertical columns from satellite observations is predicted to improve this situation. Such an application however requires very high accuracy and precision. We report on retrievals of the column-averaged CO2 dry air mole fraction, denoted XCO2, from the near-infrared nadir spectral radiance and solar irradiance measurements of the SCIAMACHY satellite instrument between 2003 and 2005. We focus on northern hemispheric large scale CO2 features such as the CO2 seasonal cycle and show - for the first time - that the atmospheric annual increase of CO2 can be directly observed using satellite measurements of the CO2 total column. The satellite retrievals are compared with global XCO2 obtained from NOAA's CO2 assimilation system CarbonTracker taking into account the spatio-temporal sampling and altitude sensitivity of the satellite data. We show that the measured CO2 year-to-year increase agrees within about 1 ppm/year with CarbonTracker. We also show that the latitude dependent amplitude of the northern hemispheric CO2 seasonal cycle agrees with CarbonTracker within about 2 ppm with the retrieved amplitude being systematically larger. The analysis demonstrates that it is possible using satellite measurements of the CO2 total column to retrieve information on the atmospheric CO2 on the level of a few parts per million.
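Accounting for the altitude sensitivity of the satellite when comparing retrievals with a model is commonly done with a column averaging kernel; a minimal sketch of that smoothing step (the layer values below are invented for illustration, and the paper's exact comparison procedure may differ):

```python
import numpy as np

def smoothed_column(x_model, x_prior, averaging_kernel, pressure_weights):
    """Model XCO2 as the satellite would see it (standard column
    averaging-kernel smoothing):
    XCO2 = sum_j h_j * [x_a,j + a_j * (x_model,j - x_a,j)],
    with h the pressure weights, a the column averaging kernel and
    x_a the retrieval's prior profile."""
    h, a = pressure_weights, averaging_kernel
    return float(np.sum(h * (x_prior + a * (x_model - x_prior))))

# Illustrative 5-layer example (ppm), surface to top; h sums to 1.
h = np.array([0.3, 0.25, 0.2, 0.15, 0.1])
a = np.array([0.9, 1.0, 1.1, 1.0, 0.7])   # sensitivity varies with altitude
x_model = np.array([395.0, 396.0, 397.0, 398.0, 399.0])
x_prior = np.full(5, 396.0)
print(round(smoothed_column(x_model, x_prior, a, h), 2))  # -> 396.46
```

Where the kernel departs from 1, part of the model-prior difference is suppressed, so skipping this step would bias any measured-minus-modeled XCO2 comparison.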


2011 ◽  
Vol 11 (4) ◽  
pp. 1659-1670 ◽  
Author(s):  
A. Font ◽  
J.-A. Morguí ◽  
X. Rodó

Abstract. In this study the differences in the measured atmospheric CO2 mixing ratio at three aircraft profiling sites in NE Spain separated by 60 km are analyzed with regard to the variability of the surface fluxes in the regional surface influence area. First, the Regional Potential Surface Influence (RPSI) for fifty-one days in 2006 is calculated to assess the vertical, horizontal and temporal extent of the surface influence for the three sites at the regional scale (10⁴ km²) at different altitudes of the profile (600, 1200, 2500 and 4000 meters above sea level, m a.s.l.). Second, three flights carried out in 2006 (7 February, 24 August and 29 November) following the Crown Atmospheric Sampling (CAS) design are presented to study the relation between the measured CO2 variability and the Potential Surface Influence (PSI) and RPSI concepts. At 600 and 1200 m a.s.l. the regional signal is confined to up to 50 h before the measurements, whereas at higher altitudes (2500 and 4000 m a.s.l.) the regional surface influence is only recovered during spring and summer months. The RPSI from sites separated by 60 km overlap by up to 70% of the regional surface influence at 600 and 1200 m a.s.l., while the overlap decreases to 10–40% at higher altitudes (2500 and 4000 m a.s.l.). The scale of the RPSI area is suitable to understand the differences in the measured CO2 concentration in the three vertices of the CAS, as CO2 differences are attributed to local surrounding fluxes (February) or to the variability of regional surface influence as for the August and November flights. For these two flights, the variability at the regional scale influences the variability measured at the local scale. The CAS sampling design for aircraft measurements appears to be a suitable method to cope with the variability of a typical grid for inversion models as measurements are intensified within the PBL and the background concentration is measured every ~10² km.
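The quoted overlap percentages imply a pairwise comparison of footprint maps; one plausible sketch, taking overlap as the shared fraction of two normalized surface-influence fields (this is an assumed definition for illustration, not necessarily the paper's):

```python
import numpy as np

def footprint_overlap(psi_a, psi_b):
    """Overlap (%) of two surface-influence maps on the same grid,
    measured as the shared fraction of total influence after each map
    is normalized to unit sum."""
    a = psi_a / psi_a.sum()
    b = psi_b / psi_b.sum()
    return 100.0 * float(np.minimum(a, b).sum())

# Synthetic example: a footprint and a spatially shifted copy of it,
# mimicking a second site some tens of km away on the same grid.
rng = np.random.default_rng(0)
base = rng.random((50, 50))
shifted = np.roll(base, 3, axis=1)
print(footprint_overlap(base, base))      # identical maps -> 100 %
print(footprint_overlap(base, shifted))   # partial overlap
```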

