A new analysis of wind on chloride deposition for long-term aerosol chloride deposition monitoring with weekly sampling frequency

2019 ◽  
Vol 198 ◽  
pp. 46-54 ◽  
Author(s):  
Ngoc Duc Pham ◽  
Yukihisa Kuriyama ◽  
Naoya Kasai ◽  
Shinji Okazaki ◽  
Katsuyuki Suzuki ◽  
...


2020 ◽
Vol 20 (16) ◽
pp. 9915-9938 ◽
Author(s):  
Kai-Lan Chang ◽  
Owen R. Cooper ◽  
Audrey Gaudel ◽  
Irina Petropavlovskikh ◽  
Valérie Thouret

Abstract. Detecting a tropospheric ozone trend from sparsely sampled ozonesonde profiles (typically once per week) is challenging due to the short-lived anomalies in the time series resulting from ozone's high temporal variability. To enhance trend detection, we have developed a sophisticated statistical approach that utilizes a geoadditive model to assess ozone variability across a time series of vertical profiles. Treating the profile time series as a set of individual time series on discrete pressure surfaces, a class of smoothing spline ANOVA (analysis of variance) models is used to jointly model multiple correlated time series (on separate pressure surfaces) by their associated seasonal and interannual variabilities. This integrated fit method filters out the unstructured variation through a statistical regularization (i.e., a roughness penalty) by taking advantage of the additional correlated data points available on the pressure surfaces above and below the surface of interest. We have applied this technique to the trend analysis of the vertically correlated time series of tropospheric ozone observations from (1) IAGOS (In-service Aircraft for a Global Observing System) commercial aircraft profiles above Europe and China throughout 1994–2017 and (2) NOAA GML's (Global Monitoring Laboratory) ozonesonde records at Hilo, Hawaii (1982–2018), and Trinidad Head, California (1998–2018). We illustrate the ability of this technique to detect a consistent trend estimate and its effectiveness in reducing the associated uncertainty in the profile data due to the low sampling frequency. We also conducted a sensitivity analysis of frequent IAGOS profiles above Europe (approximately 120 profiles per month) to determine how many profiles in a month are required for reliable long-term trend detection. When ignoring the vertical correlation, we found that a typical sampling strategy (i.e., four profiles per month) might result in 7 % of sampled trends falling outside the 2σ uncertainty interval derived from the full dataset, with an associated mean absolute percentage error of 10 %. Based on a series of sensitivity studies, we determined optimal sampling frequencies for (1) basic trend detection and (2) accurate quantification of the trend. When applying the integrated fit method, we find that a typical sampling frequency of four profiles per month is adequate for basic trend detection; however, accurate quantification of the trend requires 14 profiles per month. Accurate trend quantification can be achieved with only 10 profiles per month if a regular sampling frequency is applied. In contrast, the standard separated fit method, which ignores the vertical correlation between pressure surfaces, requires 8 profiles per month for basic trend detection and 18 profiles per month for accurate trend quantification. While our method improves trend detection from sparse datasets, the key to substantially reducing the uncertainty is to increase the sampling frequency.
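The sensitivity analysis lends itself to a compact numerical sketch. The Python snippet below is not the authors' method: it uses synthetic data and an ordinary least-squares fit with a fixed harmonic seasonal term in place of the geoadditive spline-ANOVA model, and every number in it (trend, noise level, number of subsamples) is an illustrative assumption. It repeatedly thins a dense record to four profiles per month and checks how often the subsampled trend leaves the full-data 2σ interval, mirroring the 7 %/10 % diagnostic quoted above.

    import numpy as np

    rng = np.random.default_rng(0)
    months = 24 * 12                    # 24 years of monthly bins
    per_month = 120                     # ~120 IAGOS profiles per month above Europe
    t = np.repeat(np.arange(months), per_month) / 12.0   # time in years
    ozone = (50.0 + 0.3 * t                              # assumed trend, ppb per year
             + 8.0 * np.sin(2 * np.pi * t)               # seasonal cycle
             + rng.normal(0, 10, t.size))                # short-lived anomalies

    def fit_trend(tt, yy):
        """OLS slope (ppb per year) and its standard error, with a fixed
        harmonic term absorbing the seasonal cycle."""
        X = np.column_stack([np.ones_like(tt), tt,
                             np.sin(2 * np.pi * tt), np.cos(2 * np.pi * tt)])
        beta = np.linalg.lstsq(X, yy, rcond=None)[0]
        resid = yy - X @ beta
        sigma2 = resid @ resid / (len(yy) - X.shape[1])
        cov = sigma2 * np.linalg.inv(X.T @ X)
        return beta[1], np.sqrt(cov[1, 1])

    full_trend, full_se = fit_trend(t, ozone)

    outside, ape = 0, []
    for _ in range(500):                # 500 random 4-profile-per-month subsamples
        idx = np.concatenate([rng.choice(np.arange(m * per_month,
                                                   (m + 1) * per_month),
                                         4, replace=False)
                              for m in range(months)])
        sub_trend, _ = fit_trend(t[idx], ozone[idx])
        outside += abs(sub_trend - full_trend) > 2 * full_se
        ape.append(abs(sub_trend - full_trend) / abs(full_trend))

    print(f"outside 2-sigma: {100 * outside / 500:.1f} %, "
          f"MAPE: {100 * np.mean(ape):.1f} %")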


2021 ◽  
Vol 8 ◽  
Author(s):  
Philipp Fischer ◽  
Peter Dietrich ◽  
Eric P. Achterberg ◽  
Norbert Anselm ◽  
Holger Brix ◽  
...  

A thorough and reliable assessment of changes in sea surface water temperatures (SSWTs) is essential for understanding the effects of global warming on long-term trends in marine ecosystems and their communities. The first long-term temperature measurements were established almost a century ago, especially in coastal areas, and some of them are still in operation. However, while in earlier times these measurements were done by hand every day, current environmental long-term observation stations (ELTOS) are often fully automated and integrated in cabled underwater observatories (UWOs). With this new technology, year-round measurements became feasible even in remote or difficult-to-access areas, such as coastal areas of the Arctic Ocean in winter, where measurements were almost impossible just a decade ago. In this context, the question arises as to what extent sampling frequency and sensor accuracy influence the results of long-term monitoring approaches. In this paper, we address this question with a combination of laboratory experiments on sensor accuracy and precision and a simulated sampling program with different sampling frequencies, based on a continuous water temperature dataset from Svalbard, Arctic, from 2012 to 2017. Our laboratory experiments showed that all 12 tested temperature sensor types, across different price ranges, provided measurements accurate enough to resolve interannual temperature changes at the level discussed in the literature on climate change effects in coastal waters. However, the experiments also revealed that some sensors are more suitable for measuring absolute temperature changes over time, while others are better suited to determining relative temperature changes. Our simulated sampling program in Svalbard coastal waters over 5 years revealed that the selection of a proper sampling frequency is most relevant for discriminating significant long-term temperature changes from random daily, seasonal, or interannual fluctuations. While hourly and daily sampling delivered reliable, stable, and comparable results concerning temperature increases over time, weekly sampling was less able to reliably detect overall significant trends. With even lower sampling frequencies (monthly sampling), no significant temperature trend over time could be detected. Although the results were obtained for a specific site, they are transferable to other aquatic research questions and non-polar regions.
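The simulated sampling program can be illustrated with a short sketch. The snippet below subsamples a synthetic hourly temperature record at hourly, daily, weekly, and monthly frequencies and tests whether a linear warming trend remains detectable; the 0.15 °C/yr trend, seasonal amplitude, and noise level are assumptions rather than the Svalbard values, and the plain regression t-test ignores the serial correlation a real analysis would have to address.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    hours = np.arange(5 * 365 * 24)                   # five years of hourly data
    t_years = hours / (365 * 24)
    seasonal = -4.0 * np.cos(2 * np.pi * t_years)     # assumed seasonal cycle
    temp = (2.0 + 0.15 * t_years                      # assumed 0.15 degC/yr trend
            + seasonal
            + rng.normal(0, 1.5, hours.size))         # short-term variability

    for label, step in [("hourly", 1), ("daily", 24),
                        ("weekly", 24 * 7), ("monthly", 24 * 30)]:
        tt = t_years[::step]
        # subtract the (here known) seasonal cycle; real data would need a
        # fitted climatology instead
        yy = temp[::step] - seasonal[::step]
        res = stats.linregress(tt, yy)
        print(f"{label:8s} slope = {res.slope:+.3f} degC/yr, p = {res.pvalue:.3g}")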


2002 ◽  
Vol 37 (1) ◽  
pp. 119-132 ◽  
Author(s):  
Richard B. Lowell ◽  
Joseph M. Culp

Abstract To estimate the effects of sampling frequency on the detection of temporal patterns during environmental effects monitoring, we used multivariate analyses and data subsampling to investigate long-term (spanning 20 years) patterns in benthic invertebrate community structure downriver of a large pulp mill in southern British Columbia. Using multidimensional scaling ordination, patterns in invertebrate abundance sampled yearly were related to long-term patterns in several physicochemical variables measured in the river. The only available physicochemical variables that were significantly correlated with invertebrate community structure over the 20-year period were the mill outputs of total phosphorus and suspended solids, and these were associated with increased abundances of five families of mayflies, stoneflies, and caddisflies. To evaluate the implications of sampling on a coarser (than yearly) time scale, the full 20-year data set was subsampled to produce a series of smaller data sets, each simulating a sampling frequency of once every three years. Ordination of the subsampled data sets showed that, on average, 71% of the important taxa and 50% of the important physicochemical variables highlighted in the full analysis were missed in the subset analyses. These results underscore the importance of ensuring adequate temporal replication of sampling effort when a major goal is to directly measure or test for temporal patterns of stressor impacts.
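A toy version of this subsampling exercise, assuming synthetic yearly data and per-variable Spearman correlations in place of the study's multivariate ordination, might look as follows: it forms the three possible triennial subsets of a 20-year record and counts how many of the taxa that respond significantly in the full record are missed.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    years = 20
    stressor = np.linspace(0, 1, years) + rng.normal(0, 0.2, years)  # e.g. mill output
    taxa = [stressor * b + rng.normal(0, 0.5, years)   # 10 taxa with varying
            for b in np.linspace(0.2, 2.0, 10)]        # response strengths

    def significant(idx):
        """Indices of taxa significantly correlated with the stressor."""
        return {i for i, y in enumerate(taxa)
                if stats.spearmanr(stressor[idx], y[idx]).pvalue < 0.05}

    full = significant(np.arange(years))
    missed = [len(full - significant(np.arange(start, years, 3)))
              for start in range(3)]                   # the 3 triennial subsets
    print(f"significant taxa in full record: {len(full)}")
    print(f"missed on average in triennial subsets: "
          f"{np.mean(missed) / max(len(full), 1):.0%}")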


2017 ◽  
Vol 21 (12) ◽  
pp. 6153-6165 ◽  
Author(s):  
Paul Floury ◽  
Jérôme Gaillardet ◽  
Eric Gayer ◽  
Julien Bouchez ◽  
Gaëlle Tallec ◽  
...  

Abstract. Our understanding of hydrological and chemical processes at the catchment scale is limited by our capacity to record the full breadth of the information carried by river chemistry, both in terms of sampling frequency and precision. Here, we present a proof-of-concept study of a lab in the field called the River Lab (RL), based on the idea of permanently installing a suite of laboratory instruments in the field next to a river. Housed in a small shed, this set of instruments performs analyses at a frequency of one every 40 min for major dissolved species (Na+, K+, Mg2+, Ca2+, Cl−, SO42−, NO3−) through continuous sampling and filtration of the river water using automated ion chromatographs. The RL was deployed in the Orgeval Critical Zone Observatory, France, for over a year of continuous analyses. Results show that the RL is able to capture long-term fine chemical variations with no drift and a precision significantly better than conventionally achieved in the laboratory (up to ±0.5 % for all major species for over a day and up to 1.7 % over 2 months). The RL is able to capture the abrupt changes in dissolved species concentrations during a typical 6-day rain event, as well as daily oscillations during a hydrological low-flow period of summer drought. Using the measured signals as a benchmark, we numerically assess the effects of a lower sampling frequency (typical of conventional field sampling campaigns) and of a lower precision (typically reached in the laboratory) on the hydrochemical signal. The high-resolution, high-precision measurements made possible by the RL open new perspectives for understanding critical zone hydro-bio-geochemical cycles. Finally, the RL also offers a solution for management agencies to monitor water quality in quasi-real time.
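The benchmark comparison can be sketched numerically. The snippet below degrades a synthetic 40-min concentration record in the two ways described: thinning to weekly grab samples and adding noise for an assumed laboratory precision of ±2 % (the ±0.5 % RL precision comes from the abstract; the flood pulse and daily oscillation are invented stand-ins for the Orgeval signals).

    import numpy as np

    rng = np.random.default_rng(3)
    n = 365 * 24 * 60 // 40                            # one year at 40-min steps
    t_days = np.arange(n) * 40 / (60 * 24)
    conc = (1.0                                        # baseline concentration
            + 0.05 * np.sin(2 * np.pi * t_days)        # daily oscillation
            - 0.4 * np.exp(-0.5 * ((t_days - 180) / 3.0) ** 2))  # ~6-day dilution event

    rl = conc * (1 + rng.normal(0, 0.005, n))          # RL precision, ~0.5 %
    lab = conc * (1 + rng.normal(0, 0.02, n))          # assumed lab precision, ~2 %
    weekly = conc[:: 7 * 24 * 60 // 40]                # weekly grab samples

    rmse = lambda x: np.sqrt(np.mean((x - conc) ** 2))
    print(f"RL rmse vs truth:  {rmse(rl):.4f}")
    print(f"lab rmse vs truth: {rmse(lab):.4f}")
    # at weekly resolution the daily oscillation is aliased away and the
    # ~6-day event is represented by at most one or two samples
    print(f"weekly samples per year: {weekly.size}")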

