Estimating Pollution Loads in Snow Removed from a Port Facility: Snow Pile Sampling Strategies

2021 ◽  
Vol 232 (2) ◽  
Author(s):  
Arya Vijayan ◽  
Heléne Österlund ◽  
Jiri Marsalek ◽  
Maria Viklander

Abstract. Choosing an appropriate sampling strategy is important when estimating the pollutant loads in a snow pile and assessing the environmental impacts of dumping snow into water bodies. This paper compares different snow pile sampling strategies, looking for the most efficient way to estimate the pollutant loads in a snow pile. For this purpose, 177 snow samples were collected from nine snow piles (average pile area 30 m², height 2 m) during four sampling occasions at Frihamnen, in the Ports of Stockholm's port area. The measured concentrations of TSS, LOI, pH, conductivity, and heavy metals (Zn, Cu, Cd, Cr, Pb, and V) in the collected samples indicated that pollutants are not uniformly distributed in the snow piles. Pollutant loads calculated from different sampling strategies were compared against the load calculated using all samples collected for each pile (the best estimate of mass load, BEML). The results showed that systematic grid sampling is the best choice when the objective of sampling is to estimate the pollutant loads accurately. Estimating pollutant loads from single snow column samples (collected at a point in the snow pile through its entire depth) produced up to 400% variation from the BEML, whereas samples composed by mixing volume-proportional subsamples from all samples (horizontal composite samples) produced only up to 50% variation. Around nine samples were required to estimate the pollutant loads within 50% deviation from the BEML for the studied snow piles. Converting pollutant concentrations in snow to equivalent concentrations in snowmelt and comparing them with available guideline values for receiving waters identified Zn as the critical pollutant.
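The load comparison in the abstract can be illustrated with a minimal sketch. All volumes and concentrations below are invented for illustration, not the study's measurements: the BEML sums the pollutant mass over every sample, while a single-column estimate extrapolates one sample's concentration to the whole pile.

```python
# Hypothetical grid samples from one pile: (melted volume in L, Zn in mg/L).
# Values are invented; they are not the study's data.
samples = [(2.0, 0.10), (2.5, 0.40), (1.5, 0.05),
           (2.0, 0.30), (3.0, 0.60), (2.0, 0.20),
           (1.0, 0.08), (2.5, 0.35), (2.0, 0.15)]

total_volume = sum(v for v, _ in samples)

# Best estimate of mass load (BEML): pollutant mass summed over all samples.
beml = sum(v * c for v, c in samples)

# Single-column strategy: extrapolate one sample's concentration to the pile.
# The spread of these estimates shows why single columns can deviate strongly.
deviations = [abs(c * total_volume - beml) / beml * 100 for _, c in samples]

print(round(beml, 2), round(max(deviations), 1))
```

Even in this small invented pile, the worst single-column estimate deviates by over 100% from the BEML, the same kind of spread (up to 400% in the study) that motivates grid or composite sampling.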

2007 ◽  
Vol 4 (3) ◽  
pp. 1069-1094
Author(s):  
M. Rivas-Casado ◽  
S. White ◽  
P. Bellamy

Abstract. River restoration appraisal requires the implementation of monitoring programmes that assess the river site before and after the restoration project. However, little work has yet been done on designing effective and efficient sampling strategies. Three main variables need to be considered when designing monitoring programmes: space, time and scale. The aim of this paper is to describe the methodology applied to analyse the variation of depth in space, scale and time so that more comprehensive monitoring programmes can be developed. Geostatistical techniques were applied to study the spatial dimension (sampling strategy and density), spectral analysis was used to study the scale at which depth shows cyclic patterns, whilst descriptive statistics were used to assess the temporal variation. A brief set of guidelines has been summarised in the conclusion.
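The spatial-dimension analysis can be sketched with the basic geostatistical tool, an empirical semivariogram, which shows how depth variance grows with separation distance and hence how dense sampling needs to be. The reach positions and depths below are invented, not the paper's data.

```python
from collections import defaultdict

def empirical_semivariogram(positions, depths, bin_width):
    """gamma(h) = mean of 0.5*(z_i - z_j)^2 over point pairs, binned by
    separation distance h (1-D positions along the channel)."""
    bins = defaultdict(list)
    n = len(positions)
    for i in range(n):
        for j in range(i + 1, n):
            h = abs(positions[i] - positions[j])
            bins[int(h // bin_width)].append(0.5 * (depths[i] - depths[j]) ** 2)
    return {b * bin_width: sum(v) / len(v) for b, v in sorted(bins.items())}

# Depth (m) measured every 5 m along a hypothetical reach
positions = [0, 5, 10, 15, 20, 25, 30, 35]
depths = [0.6, 0.7, 0.9, 1.1, 1.0, 0.8, 0.7, 0.6]
gamma = empirical_semivariogram(positions, depths, bin_width=10)
print(gamma)
```

Semivariance rising with lag distance (here, from the 0–10 m bin to the 10–20 m bin) indicates spatial correlation that a sampling design can exploit: points closer than the correlation range add little new information.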


Author(s):  
Stephen R. Lindemann ◽  
Anna Yershova ◽  
Steven M. LaValle

2018 ◽  
Vol 78 (6) ◽  
pp. 1407-1416
Author(s):  
Santiago Sandoval ◽  
Jean-Luc Bertrand-Krajewski ◽  
Nicolas Caradot ◽  
Thomas Hofer ◽  
Günter Gruber

Abstract The event mean concentrations (EMCs) that would have been obtained by four different stormwater sampling strategies are simulated by using total suspended solids (TSS) and flowrate time series (about one-minute time step and one year of data). These EMCs are compared to the reference EMCs calculated by considering the complete time series. The sampling strategies are assessed with datasets from four catchments: (i) Berlin, Germany, combined sewer overflow (CSO); (ii) Graz, Austria, CSO; (iii) Chassieu, France, separate sewer system; and (iv) Ecully, France, CSO. A sampling strategy in which samples are collected at constant time intervals over the rainfall event, with sampling volumes pre-set as proportional to the runoff volume discharged between two consecutive samples, leads to the most representative results. Recommended sampling time intervals are 5 min for Berlin and Chassieu (100 and 185 ha, respectively) and 10 min for Graz and Ecully (335 and 245 ha, respectively), with relative sampling errors between 7% and 20% and uncertainties in sampling errors of about 5%. Uncertainties related to sampling volumes, TSS laboratory analyses and the beginning/ending of rainstorm events are reported as the most influential sources of uncertainty in sampling errors and EMCs.
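The reference EMC and the recommended strategy (constant-time-interval samples composited with flow-proportional volumes) can be sketched as follows. The short flow and TSS series are invented for illustration; the study used roughly one-minute data over a full year.

```python
def emc(conc, flow):
    """Reference EMC = sum(C_i*Q_i) / sum(Q_i); with a constant time step
    the dt factors cancel out of numerator and denominator."""
    return sum(c * q for c, q in zip(conc, flow)) / sum(flow)

def emc_flow_weighted(conc, flow, step):
    """EMC from samples taken every `step` time steps, each sample weighted
    by the runoff volume discharged over its interval (flow-proportional
    compositing, as in the recommended strategy)."""
    mass = vol = 0.0
    for k in range(0, len(conc), step):
        v = sum(flow[k:k + step])      # runoff volume for this interval
        mass += conc[k] * v
        vol += v
    return mass / vol

flow = [0.2, 0.5, 1.0, 1.4, 1.2, 0.9, 0.6, 0.4, 0.3, 0.2]   # m3/s
conc = [300, 250, 200, 150, 120, 100, 90, 80, 75, 70]       # mg/L TSS

ref = emc(conc, flow)
est = emc_flow_weighted(conc, flow, step=3)
error = abs(est - ref) / ref * 100
print(round(ref, 1), round(est, 1), round(error, 1))
```

The gap between `est` and `ref` is the relative sampling error the study quantifies; shortening `step` (denser sampling) shrinks it, which is why the recommended intervals depend on catchment size.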


2015 ◽  
Vol 8 (2) ◽  
pp. 202-208 ◽  
Author(s):  
Shan Ran ◽  
Mengqiao Liu ◽  
Lisa A. Marchiondo ◽  
Jason L. Huang

Landers and Behrend (2015) question organizational researchers' stubborn reliance on sample source to infer the validity of research findings, and they challenge the arbitrary distinctions researchers often make between sample sources. Unconditional favoritism toward particular sampling strategies (e.g., organizational samples) can restrict choices in methodology, which in turn may limit opportunities to answer certain research questions. Landers and Behrend (2015) contend that no sampling strategy is inherently superior (or inferior); therefore, all types of samples warrant careful consideration before any validity-related conclusions can be drawn. Despite its sound arguments, the focal article focuses on external validity and deemphasizes the potential influence of sample source on internal validity. Agreeing with the position that no sample is the "gold standard" in organizational research and practice, we focus on insufficient effort responding (IER; Huang, Curran, Keeney, Poposki, & DeShon, 2012) as a threat to internal validity across sample sources.


2013 ◽  
Vol 49 (No. 1) ◽  
pp. 36-47 ◽  
Author(s):  
M. Studnicki ◽  
W. Mądry ◽  
J. Schmidt

Establishing a core collection that represents the genetic diversity of the entire collection with minimal loss of its original diversity and minimal redundancy is an important problem for gene bank curators and crop breeders. In this paper, we assess the representativeness of the original genetic diversity in core collections consisting of one-tenth of the entire collection, obtained according to 23 sampling strategies. The study was performed using the Polish orchardgrass (Dactylis glomerata L.) germplasm collection as a model. The representativeness of the core collections was validated by the difference of means (MD%) and the difference of mean squared Euclidean distance (d‒D%) for the studied traits in the core subsets and the entire collection. In this way, we compared the efficiency of simple random sampling and 22 stratified sampling strategies (20 cluster-based and 2 direct cluster-based). Each cluster-based stratified sampling strategy is a combination of 2 clustering methods, 5 allocation methods and 2 methods of sampling within a group. We used the accessions' genotypic predicted values for 8 quantitative traits tested in field trials. A sampling strategy is considered more effective for establishing core collections if the means of the traits in a core are maintained at the same level as the means in the entire collection (i.e., the mean of MD% in the simulated samples is close to zero) and, simultaneously, the overall variation in the core collection is greater than in the entire collection (i.e., the mean of d‒D% in the simulated samples is greater than that obtained for the simple random sampling strategy). Both cluster analyses (unweighted pair group method with arithmetic mean, UPGMA, and Ward) were similarly useful in constructing sampling strategies capable of establishing representative core collections. The allocation methods most useful for constructing efficient sampling strategies were proportional allocation and D2 allocation (which accounts for variation). Within the Ward clusters, random sampling was better than cluster-based sampling, but not within the UPGMA clusters.
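A minimal sketch of one of the compared strategies, proportional allocation within clusters validated by MD%, is shown below. The accession values and pre-assigned clusters are invented (a real workflow would form the clusters with UPGMA or Ward clustering over several traits), and only one trait is used.

```python
import random

def proportional_core(clusters, fraction, rng):
    """Stratified sampling: draw roughly `fraction` of each cluster at random."""
    core = []
    for members in clusters.values():
        k = max(1, round(fraction * len(members)))
        core.extend(rng.sample(members, k))
    return core

def md_percent(core, entire):
    """MD%: relative difference of trait means between core and collection."""
    m_core = sum(core) / len(core)
    m_all = sum(entire) / len(entire)
    return 100 * abs(m_core - m_all) / m_all

rng = random.Random(42)
# 60 hypothetical accessions with one quantitative trait, in three clusters
clusters = {
    0: [rng.gauss(10, 1) for _ in range(30)],
    1: [rng.gauss(14, 1) for _ in range(20)],
    2: [rng.gauss(20, 1) for _ in range(10)],
}
entire = [v for members in clusters.values() for v in members]

core = proportional_core(clusters, fraction=0.1, rng=rng)
print(len(core), round(md_percent(core, entire), 2))
```

Because each cluster contributes in proportion to its size, the core's trait mean tracks the full collection's mean (MD% near zero), which is the first of the two validation criteria described above.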


2005 ◽  
Vol 22 (8) ◽  
pp. 1267-1281 ◽  
Author(s):  
I. Shulman ◽  
J. C. Kindle ◽  
D. J. McGillicuddy ◽  
M. A. Moline ◽  
S. H. D. Haddock ◽  
...  

Abstract The focus of this paper is on the development of methodology for short-term (1–3 days) oceanic bioluminescence (BL) predictions and the optimization of spatial and temporal bioluminescence sampling strategies. The approach is based on predictions of bioluminescence with an advection–diffusion–reaction (tracer) model with velocities and diffusivities from a circulation model. In previous research, it was shown that short-term changes in some of the salient features in coastal bioluminescence can be explained and predicted by using this approach. At the same time, it was demonstrated that optimization of bioluminescence sampling prior to the forecast is critical for successful short-term BL predictions with the tracer model. In the present paper, the adjoint to the tracer model is used to study the sensitivity of the modeled bioluminescence distributions to the sampling strategies for BL. The locations and times of bioluminescence sampling prior to the forecast are determined by using the adjoint-based sensitivity maps. The approach is tested with bioluminescence observations collected during August 2000 and 2003 in the Monterey Bay, California, area. During August 2000, BL surveys were collected during a strong wind relaxation event, while in August 2003, BL surveys were conducted during an extended (longer than a week) upwelling-favorable event. The numerical bioluminescence predictability experiments demonstrated a close agreement between observed and model-predicted short-term spatial and temporal changes of the coastal bioluminescence.
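The tracer-model idea, bioluminescence advected and diffused by modeled velocities and diffusivities, can be caricatured in one dimension. The grid, velocity, and diffusivity below are invented, and the real model is three-dimensional with an adjoint used to build the sensitivity maps; this sketch shows only the forward tracer step.

```python
def step(c, u, kappa, dx, dt):
    """One explicit step: upwind advection + central diffusion, periodic BCs."""
    n = len(c)
    out = []
    for i in range(n):
        cm, cp = c[i - 1], c[(i + 1) % n]
        adv = -u * (c[i] - cm) / dx if u >= 0 else -u * (cp - c[i]) / dx
        dif = kappa * (cp - 2 * c[i] + cm) / dx ** 2
        out.append(c[i] + dt * (adv + dif))
    return out

c = [0.0] * 20
c[5] = 1.0                        # initial bioluminescence patch at cell 5
for _ in range(10):
    c = step(c, u=1.0, kappa=0.01, dx=1.0, dt=0.5)

peak = max(range(len(c)), key=lambda i: c[i])
print(round(sum(c), 6), peak)     # total tracer is conserved; patch advects
```

The patch drifts downstream while total tracer is conserved, which is the mechanism by which short-term changes in bioluminescence features can be predicted from circulation-model velocities.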


2011 ◽  
Vol 41 (9) ◽  
pp. 1819-1826 ◽  
Author(s):  
Piermaria Corona ◽  
Lorenzo Fattorini ◽  
Sara Franceschi

A two-stage sampling strategy is proposed to assess small woodlots outside forests scattered over extensive territories. The first stage selects a sample of small woodlots using fixed-size sampling schemes, and the second stage samples trees within the woodlots selected at the first stage. Usually, fixed- or variable-area plots are adopted to sample trees. However, the use of plot sampling in small patches such as woodlots is likely to induce a relevant amount of bias owing to edge effects. In this framework, sector sampling proves to be particularly effective. The present paper investigates the statistical properties of two-stage sampling strategies for estimating forest attributes of woodlot populations when sector sampling is adopted at the second stage. A two-stage estimator of population totals is derived, together with a conservative estimator of its sampling variance. By means of a simulation study, the performance of the proposed estimator is checked and compared with that achieved using traditional plot sampling with edge corrections. Simulation results prove the adequacy of sector sampling and provide some guidelines for the effective planning of the strategy. In some countries, the proposed strategy could be performed, with few modifications, within the framework of large-scale forest inventories.
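A generic two-stage expansion estimator gives the flavor of the design. This sketch uses simple random sampling at both stages with invented tree volumes; the paper's second stage uses sector sampling instead, precisely to avoid the edge effects of plots in small patches.

```python
import random

def two_stage_total(woodlots, n1, m2, rng):
    """Estimate the population total of a tree attribute.
    Stage 1: simple random sample of n1 woodlots out of N.
    Stage 2: simple random subsample of up to m2 trees per selected woodlot,
    expanded first to a woodlot total, then to the population."""
    N = len(woodlots)
    total = 0.0
    for trees in rng.sample(woodlots, n1):
        sub = rng.sample(trees, min(m2, len(trees)))
        total += len(trees) * sum(sub) / len(sub)   # woodlot total estimate
    return N / n1 * total                           # population expansion

rng = random.Random(7)
# 50 hypothetical woodlots, each a list of per-tree volumes (m3)
woodlots = [[rng.uniform(0.1, 1.0) for _ in range(rng.randint(5, 30))]
            for _ in range(50)]
true_total = sum(sum(w) for w in woodlots)

estimate = two_stage_total(woodlots, n1=20, m2=5, rng=rng)
print(round(true_total, 1), round(estimate, 1))
```

Both expansion factors (N/n1 across woodlots and tree count over subsample size within a woodlot) are unbiased under the stated sampling design, so the estimate clusters around the true total; the paper's contribution is the sector-sampling second stage and a conservative variance estimator for it.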


2013 ◽  
Vol 68 (12) ◽  
pp. 2683-2690 ◽  
Author(s):  
S. Sandoval ◽  
A. Torres ◽  
E. Pawlowsky-Reusing ◽  
M. Riechel ◽  
N. Caradot

The present study aims to explore the relationship between rainfall variables and water quality/quantity characteristics of combined sewer overflows (CSOs), by the use of multivariate statistical methods and online measurements at a principal CSO outlet in Berlin (Germany). Canonical correlation results showed that the maximum and average rainfall intensities are the most influential variables to describe CSO water quantity and pollutant loads whereas the duration of the rainfall event and the rain depth seem to be the most influential variables to describe CSO pollutant concentrations. The analysis of partial least squares (PLS) regression models confirms the findings of the canonical correlation and highlights three main influences of rainfall on CSO characteristics: (i) CSO water quantity characteristics are mainly influenced by the maximal rainfall intensities, (ii) CSO pollutant concentrations were found to be mostly associated with duration of the rainfall and (iii) pollutant loads seemed to be principally influenced by dry weather duration before the rainfall event. The prediction quality of PLS models is rather low (R² < 0.6) but results can be useful to explore qualitatively the influence of rainfall on CSO characteristics.
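A stripped-down, single-component PLS1 fit shows the mechanics of the regression the study relies on. The rainfall descriptors and CSO load below are synthetic, and real analyses (including this study's) use multi-component PLS with cross-validation rather than this one-component sketch.

```python
import random

def center(M):
    """Column-center a matrix given as a list of rows; return rows and means."""
    means = [sum(col) / len(col) for col in zip(*M)]
    return [[x - m for x, m in zip(row, means)] for row in M], means

def pls1_one_component(X, y):
    """Fit y ~ mean(y) + b*t with scores t = Xc.w and weights w prop. to
    Xc^T yc: a single PLS component (the NIPALS first step)."""
    Xc, xmeans = center(X)
    ymean = sum(y) / len(y)
    yc = [v - ymean for v in y]
    w = [sum(Xc[i][j] * yc[i] for i in range(len(Xc)))
         for j in range(len(Xc[0]))]
    norm = sum(v * v for v in w) ** 0.5
    w = [v / norm for v in w]
    t = [sum(x * v for x, v in zip(row, w)) for row in Xc]
    b = sum(ti * yi for ti, yi in zip(t, yc)) / sum(ti * ti for ti in t)
    return lambda row: ymean + b * sum((x - m) * v
                                       for x, m, v in zip(row, xmeans, w))

rng = random.Random(1)
# Rows: [max intensity, mean intensity, duration, dry-weather days] (invented)
X = [[rng.uniform(0, 10), rng.uniform(0, 5),
      rng.uniform(0, 24), rng.uniform(0, 14)] for _ in range(40)]
y = [3 * r[0] + 1.5 * r[3] + rng.gauss(0, 1) for r in X]   # synthetic CSO load

model = pls1_one_component(X, y)
pred = [model(r) for r in X]
ymean = sum(y) / len(y)
ss_res = sum((a - p) ** 2 for a, p in zip(y, pred))
ss_tot = sum((v - ymean) ** 2 for v in y)
r2 = 1 - ss_res / ss_tot
print(round(r2, 2))
```

The PLS weights `w` play the interpretive role the abstract describes: the predictors with the largest weights are read as the most influential rainfall variables for the response.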


2021 ◽  
Vol 15 (1) ◽  
pp. 99-114
Author(s):  
Ankit Agrawal ◽  
Sarsij Tripathi ◽  
Manu Vardhan

Active learning is a well-known method for labeling huge unannotated datasets with minimal effort in a cost-efficient way. The approach iteratively selects the most informative instances and adds them to the training set, so that the learner's performance improves with each iteration. Named entity recognition (NER) is a key task in information extraction, in which entities present in sequences are labeled with the correct class. Traditional query sampling strategies for active learning consider only the final probability value of the model when selecting the most informative instances. In this paper, we propose a new active learning algorithm based on a hybrid query sampling strategy that also considers sentence similarity along with the final probability value of the model, and compare it with four other well-known pool-based uncertainty query sampling strategies for active learning in NER: least confident sampling, margin of confidence sampling, ratio of confidence sampling, and entropy query sampling. The experiments were performed on three biomedical NER datasets from different domains and on a Spanish-language NER dataset. We found that all of the above approaches reach the performance of a supervised learning approach while requiring far less annotated training data. The proposed active learning algorithm performs well and, in most cases, further reduces the annotation cost compared with the other sampling-strategy-based active learning algorithms.
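The four pool-based uncertainty scores the comparison covers can each be written in a few lines. The class-probability vectors below are invented model outputs, not results from the paper's datasets, and the hybrid strategy's sentence-similarity term is omitted.

```python
import math

def least_confident(p):
    """Higher score = more informative (model less sure of its top class)."""
    return 1 - max(p)

def margin_of_confidence(p):
    """Smaller gap between the top two class probabilities = more informative."""
    a, b = sorted(p, reverse=True)[:2]
    return a - b

def ratio_of_confidence(p):
    """Second-best over best probability; closer to 1 = more informative."""
    a, b = sorted(p, reverse=True)[:2]
    return b / a

def entropy(p):
    """Higher entropy of the class distribution = more informative."""
    return -sum(q * math.log(q) for q in p if q > 0)

pool = {
    "sent_1": [0.90, 0.05, 0.05],   # confident tag prediction
    "sent_2": [0.40, 0.35, 0.25],   # uncertain tag prediction
}
query = max(pool, key=lambda s: entropy(pool[s]))
print(query)
```

All four scores agree here that `sent_2` should be queried for annotation; the paper's hybrid strategy would additionally weigh how similar a candidate sentence is to those already labeled.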

