Is the sampling strategy interfering with the study of spatial variability of zooplankton communities?

2000 · Vol 57 (9) · pp. 1940-1956
Author(s):  
Carol Avois ◽  
Pierre Legendre ◽  
Stéphane Masson ◽  
Bernadette Pinel-Alloul

Surveys at the whole-lake scale take some time to carry out: several hours or several days. For logistic reasons, the sites are not sampled simultaneously or in a random sequence. Traditional limnological sampling methods require an appreciable amount of time at each site. Any sampling strategy that is not random or simultaneous introduces dependencies among the observations, which must be taken into account during the analysis and interpretation of the data. What is the real nature of the variation measured using a given sampling design? This question is approached using sites sampled by two boat teams during two consecutive days. Statistical modelling was used to partition the variation of zooplankton size-class data into environmental and spatial components. The conclusions reached after an analysis that did not control for the sampling design are erroneous and quite different from those reached when the effect of the sampling design (factors Day, Boat, and Hour) was taken into account. Clearly, when a significant effect of the sampling design is found, one must control for it during the analysis and interpretation of ecological variation.
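The partitioning of variation into environmental and spatial components can be sketched with ordinary least squares on synthetic data; this is a minimal illustration of the fractions (pure environment, shared, pure space), not the canonical ordination-based analysis used in the paper, and all data below are invented:

```python
import numpy as np

def r2(X, y):
    # R^2 of an OLS fit, with an intercept column prepended.
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return 1.0 - ((y - X1 @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()

def partition(E, S, y):
    # Variation partitioning: pure environment, environment-space
    # overlap, pure space, and unexplained fractions.
    r2_e, r2_s = r2(E, y), r2(S, y)
    r2_es = r2(np.column_stack([E, S]), y)
    return {"pure_env": r2_es - r2_s,
            "shared": r2_e + r2_s - r2_es,
            "pure_space": r2_es - r2_e,
            "unexplained": 1.0 - r2_es}

rng = np.random.default_rng(0)
S = rng.normal(size=(60, 2))                   # spatial predictors (toy)
E = 0.5 * S[:, :1] + rng.normal(size=(60, 1))  # environment, partly spatially structured
y = E[:, 0] + S[:, 0] + rng.normal(size=60)    # response, e.g. a zooplankton size class
parts = partition(E, S, y)
```

The four fractions sum to 1 by construction; a sampling-design effect (Day, Boat, Hour) would enter as a third predictor block partialled out before interpretation.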

2012 · Vol 102 (1) · pp. 88-94
Author(s):  
Geruza L. Melo ◽  
Jonas Sponchiado ◽  
Nilton C. Cáceres

In order to evaluate the efficiency of different mammalian survey methods, we compared traditional sampling techniques (camera traps on roads and artificial trails, track censuses, and direct field visualization) with an alternative sampling design (camera traps positioned in natural areas such as natural trails and shelters). We conducted the study in a deciduous Atlantic Forest park in southern Brazil and additionally compared our results with a previous intensive study carried out in the same area. Our considerably smaller sampling effort (e.g., 336 trap-days for our camera traps versus 2,154 trap-days in the earlier study) registered the presence of 85% of the locally known species, with camera traps being 68% efficient. Moreover, shelter camera traps revealed a species composition different from that of most other sampling methods. This sampling strategy involving natural forest sites was therefore able to optimize the chances of evaluating species composition in a shorter period, especially with respect to low-density and cryptic species, and to detect species that avoid open, disturbed sites such as roads and man-made forest trails.


2021 · Vol 37 (3) · pp. 655-671
Author(s):  
Paolo Righi ◽  
Piero Demetrio Falorsi ◽  
Stefano Daddi ◽  
Epifania Fiorello ◽  
Pierpaolo Massoli ◽  
...  

Abstract For the first time in 2018 the Italian Institute of Statistics (Istat) implemented the annual Permanent Population Census which relies on the Population Base Register (PBR) and the Population Coverage Survey (PCS). This article provides a general overview of the PCS sampling design, which makes use of the PBR to correct population counts with the extended dual system estimator (Nirel and Glickman 2009). The sample allocation, proven optimal under a set of precision constraints, is based on preliminary estimates of individual probabilities of over-coverage and under-coverage. It defines the expected sample size in terms of individuals, and it oversamples the sub-populations subject to the risk of under/over coverage. Finally, the article introduces a sample selection method, which to the greatest extent possible satisfies the planned allocation of persons in terms of socio-demographic characteristics. Under acceptable assumptions, the article also shows that the sampling strategy enhances the precision of the estimates.
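The dual-system idea behind the extended estimator can be sketched in its classical (Petersen) form; the figures below are hypothetical, and the actual PCS estimator additionally models individual over-coverage and under-coverage probabilities:

```python
def dual_system_estimate(n_register, n_survey, n_matched):
    """Classical dual-system (Petersen) population estimate.

    n_register: persons counted in the register (PBR analogue)
    n_survey:   persons counted by the coverage survey (PCS analogue)
    n_matched:  persons found in both sources
    """
    if n_matched == 0:
        raise ValueError("no matched records: estimate undefined")
    return n_register * n_survey / n_matched

# Hypothetical figures: 9,500 in the register, 1,000 surveyed, 940 matched.
estimate = dual_system_estimate(9500, 1000, 940)  # about 10,106 people
```

Intuitively, the survey "recaptures" register persons at rate 940/1000, so the register itself is assumed to cover the same fraction of the true population.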


2015 · Vol 21 (1) · pp. 63-69
Author(s):  
Christopher A. Taylor ◽  
Bryan S. Engelbert ◽  
Robert J. DiStefano

Abstract We conducted a study to investigate methods for assessing crayfish populations typically found in low-gradient, lentic, floodplain habitats in Missouri. We used a random site-selection process that allowed us to capture all known species from this region of Missouri. We compared two sampling methods for primary burrowing crayfishes at our sites: the hook-and-line capture technique and burrow excavation. Adjacent standing-water habitats at the sites were also sampled using a timed-search method. Hook-and-line capture success was substantially lower than reported in the literature (0.7% versus 80%), whereas burrow-excavation success was higher than reported (64% versus 40.7%). We captured six crayfish species using burrow excavation, whereas lentic timed-search sampling captured nine species in adjacent standing waters. Our results suggest that additional effort sampling lentic habitats, rather than additional time searching for and excavating burrows, is more likely to capture total community richness. We found a seasonal influence on burrow-occupancy surveys: Julian day was positively correlated with finding active crayfish burrows. Crayfish capture in standing water was positively affected by soil temperature and negatively correlated with Julian day.


2017 · Vol 5 (1) · pp. 1-21
Author(s):  
Steven K Thompson

Abstract In this paper, I discuss some of the wider uses of adaptive and network sampling designs. Three uses of sampling designs are to select units from a population to make inferences about population values, to select units to use in an experiment, and to distribute interventions to benefit a population. The most useful approaches for inference from adaptively selected samples are design-based methods and Bayesian methods. Adaptive link-tracing network sampling methods are important for sampling populations that are otherwise hard to reach. Sampling in changing populations involves temporal network or spatial sampling design processes with units selected both into and out of the sample over time. Averaging or smoothing fast-moving versions of these designs provides simple estimates of network-related characteristics. The effectiveness of intervention programs to benefit populations depends a great deal on the sampling and assignment designs used in spreading the intervention.
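The link-tracing idea can be sketched in a few lines: starting from seed units, follow network links outward wave by wave. The contact graph below is a toy example, not data from the paper:

```python
def link_tracing_sample(adj, seeds, waves):
    """Snowball / link-tracing sample: follow links from the seed units
    outward for a fixed number of referral waves."""
    sample, frontier = set(seeds), set(seeds)
    for _ in range(waves):
        nxt = set()
        for u in frontier:
            nxt.update(adj.get(u, ()))
        frontier = nxt - sample   # only newly reached units form the next wave
        sample |= frontier
    return sample

# Toy hidden-population contact graph as an adjacency dict.
adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1, 4], 4: [3, 5], 5: [4]}
sampled = sorted(link_tracing_sample(adj, seeds=[0], waves=2))
```

Design-based or Bayesian inference from such a sample must account for the fact that well-connected units are much more likely to be reached.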


2021
Author(s):  
Souta Nakajima ◽  
Masanao Sueyoshi ◽  
Shun K. Hirota ◽  
Nobuo Ishiyama ◽  
Ayumi Matsuo ◽  
...  

A key piece of information for ecosystem management is the relationship between the environment and population genetic structure. However, it is difficult to quantify the effects of environmental factors on genetic differentiation because of spatial autocorrelation and other analytical problems. In this study, we focused on stream ecosystems and the environmental heterogeneity caused by groundwater, and constructed a sampling design in which geographic distance and environmental differences are not correlated. Using the multiplexed ISSR genotyping by sequencing (MIG-seq) method, we conducted a fine-scale population genetics study of the fluvial sculpin Cottus nozawae, for which summer water temperature is the determinant factor in distribution and survival. There was a clear genetic structure in the watershed. Although a significant isolation-by-distance pattern was detected, there was no association between genetic differentiation and water temperature. Instead, we detected asymmetric gene flow from relatively low-temperature streams to high-temperature streams, indicating the importance of low-temperature streams and continuous habitats. The groundwater-focused sampling strategy yielded unexpected results and provided important insights for conservation.
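Isolation-by-distance patterns like the one reported here are commonly tested with a Mantel test between genetic and geographic distance matrices. A simple permutation version on toy matrices (not the study's data) can be sketched as:

```python
import numpy as np

def mantel(d1, d2, n_perm=999, seed=0):
    """Mantel correlation between two distance matrices, with a
    permutation test that shuffles rows and columns together."""
    rng = np.random.default_rng(seed)
    iu = np.triu_indices_from(d1, k=1)          # upper-triangle entries only
    r_obs = np.corrcoef(d1[iu], d2[iu])[0, 1]
    hits = 0
    for _ in range(n_perm):
        p = rng.permutation(len(d1))
        if abs(np.corrcoef(d1[p][:, p][iu], d2[iu])[0, 1]) >= abs(r_obs):
            hits += 1
    return r_obs, (hits + 1) / (n_perm + 1)

rng = np.random.default_rng(1)
xy = rng.uniform(size=(15, 2))                              # toy site coordinates
geo = np.linalg.norm(xy[:, None] - xy[None, :], axis=-1)    # geographic distances
noise = rng.normal(scale=0.05, size=geo.shape)
gen = geo + (noise + noise.T) / 2                           # "genetic" distances tracking geography
np.fill_diagonal(gen, 0.0)
r, p = mantel(geo, gen)
```

Permuting whole rows and columns together preserves the distance-matrix structure under the null hypothesis of no spatial association.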


Author(s):  
Padmapriya Banada ◽  
David Elson ◽  
Naranjargal Daivaa ◽  
Claire Park ◽  
Samuel Desind ◽  
...  

ABSTRACT Sensitive, accessible, and biosafe sampling methods for COVID-19 reverse-transcriptase polymerase chain reaction (RT-PCR) assays are needed for frequent and widespread testing. We systematically evaluated diagnostic yield across different sample collection and transport workflows, including the incorporation of a viral inactivation buffer. We prospectively collected nasal swabs, oral swabs, and saliva from 52 COVID-19 RT-PCR-confirmed patients, and nasopharyngeal (NP) swabs from 37 patients. Nasal and oral swabs were placed in both viral transport media (VTM) and eNAT™, a sterilizing transport buffer, prior to testing with the Xpert Xpress SARS-CoV-2 (Xpert) test. The sensitivity of each sampling strategy was compared using a composite positive standard. Overall, swab specimens collected in eNAT showed superior sensitivity compared with swabs in VTM (70% vs 57%, P=0.0022). Direct saliva (90.5%, 95% CI: 82%–95%), followed by NP swabs in VTM and saliva in eNAT, was significantly more sensitive than nasal swabs in VTM (50%, P<0.001) or eNAT (67.8%, P=0.0012) and oral swabs in VTM (50%, P<0.0001) or eNAT (56%, P<0.0001). Saliva and the eNAT buffer each increased detection of SARS-CoV-2 with the Xpert test; however, no single sample matrix identified all positive cases.
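Sensitivity against a composite positive standard (a patient counts as positive if any workflow detected the virus) reduces to a simple proportion. A sketch with entirely hypothetical detection calls:

```python
def sensitivity(results, composite_positive):
    """Fraction of composite-standard positives that one sampling
    workflow detected (1 = detected, 0 = missed)."""
    detected = sum(1 for r, c in zip(results, composite_positive) if c and r)
    positives = sum(composite_positive)
    return detected / positives

# Hypothetical calls for 8 composite-positive patients.
saliva    = [1, 1, 1, 1, 1, 1, 1, 0]
nasal_vtm = [1, 1, 0, 0, 1, 0, 1, 0]
composite = [1] * 8
sens_saliva = sensitivity(saliva, composite)        # 7/8
sens_nasal  = sensitivity(nasal_vtm, composite)     # 4/8
```

A composite standard avoids circularity: no single specimen type is assumed to be the truth against which the others are judged.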


2021 · Vol 2021 · pp. 1-14
Author(s):  
Huanwei Xu ◽  
Xin Zhang ◽  
Hao Li ◽  
Ge Xiang

An ensemble of surrogate models with high robustness and accuracy can effectively avoid the difficult choice of a single surrogate model. However, most existing ensembles of surrogate models are constructed with static sampling methods. In this paper, we propose an ensemble of adaptive surrogate models built on an adaptive sampling strategy based on expected local errors. In the proposed method, the local error expectations of the surrogate models are calculated; then, according to these expectations, new sample points are added within the dominating radius of the existing samples. The ensemble, constructed from RBF and Kriging models, combines them with the adaptive sampling strategy. Benchmark test functions and an application problem concerning the driving-arm base of a palletizing robot show that the proposed method effectively improves both the global and local prediction accuracy of the surrogate model.
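A minimal 1-D sketch of the idea, using the disagreement between two simple RBF surrogates as a stand-in for the expected local error (the paper's actual error expectation and its RBF/Kriging ensemble are more elaborate; kernels, radius, and test function below are illustrative choices):

```python
import numpy as np

def rbf_fit(x, y, kernel):
    # Solve K w = y for RBF weights, K_ij = kernel(|x_i - x_j|).
    return np.linalg.solve(kernel(np.abs(x[:, None] - x[None, :])), y)

def rbf_predict(x_train, w, x_new, kernel):
    return kernel(np.abs(x_new[:, None] - x_train[None, :])) @ w

lin = lambda r: r                       # linear RBF surrogate
mq = lambda r: np.sqrt(1.0 + r * r)     # multiquadric RBF surrogate

f = lambda x: np.sin(6 * x)             # "expensive" function being modelled
x = np.linspace(0.0, 1.0, 5)            # static initial design
cand = np.linspace(0.0, 1.0, 201)       # candidate pool for new samples

for _ in range(5):                      # adaptive sampling loop
    p1 = rbf_predict(x, rbf_fit(x, f(x), lin), cand, lin)
    p2 = rbf_predict(x, rbf_fit(x, f(x), mq), cand, mq)
    score = np.abs(p1 - p2)             # surrogate disagreement = local-error proxy
    # Exclude candidates inside a small "dominating radius" of existing samples.
    score[np.min(np.abs(cand[:, None] - x[None, :]), axis=1) < 0.02] = 0.0
    x = np.sort(np.append(x, cand[np.argmax(score)]))
```

Each iteration places the next evaluation where the two surrogates disagree most, concentrating effort in poorly modelled regions rather than spreading points statically.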


2019 · pp. 001112871989026
Author(s):  
Kyle Vincent ◽  
Sheldon X. Zhang ◽  
Meredith Dank

Estimating the prevalence of sex trafficking requires a practical sampling strategy to reach the hidden population. In this study, we experimented with a network sampling design to obtain a sample of sex workers from the city of Muzaffarpur, India. Backed by census data and other auxiliary information, we obtained a stratified initial sample of 111 individuals and with two waves of referrals arrived at a final sample of 317 individuals. A sophisticated network-based approach is used to estimate the population size, and a respondent-driven sampling–based strategy is used to estimate characteristics related to sex trafficking violations. We detail the sampling design and present results from the study, highlighting significant findings and lessons learned that can be used for future studies.
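Referral-chain data are typically adjusted for unequal inclusion probabilities. A sketch of the RDS-II (inverse-degree-weighted) estimator with hypothetical values, simpler than the study's network-based approach:

```python
def rds_ii_proportion(values, degrees):
    """RDS-II (Volz-Heckathorn) estimator: inverse-degree-weighted
    proportion, correcting for the higher chance that well-connected
    respondents are reached through referral chains."""
    num = sum(v / d for v, d in zip(values, degrees))
    den = sum(1 / d for d in degrees)
    return num / den

# Hypothetical data: indicator of a trafficking-related violation and
# each respondent's reported network degree.
values  = [1, 0, 1, 1, 0, 0, 1, 0]
degrees = [10, 2, 8, 5, 3, 4, 6, 2]
est = rds_ii_proportion(values, degrees)   # below the naive mean of 0.5
```

Here the positives tend to have high degrees, so the weighted estimate falls below the unweighted sample proportion.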


Author(s):  
JUN LONG ◽  
JIANPING YIN ◽  
EN ZHU ◽  
WENTAO ZHAO

Active learning is an important approach to reducing data-collection costs in inductive learning problems by sampling only the most informative instances for labeling. We focus here on the sampling criterion used to select these most informative instances. This paper makes three contributions. First, in contrast to the leading sampling strategy of halving the volume of version space, we present a strategy that reduces the volume of version space by more than half, under the assumption that the target function is drawn from a nonuniform distribution over version space. Second, we propose the idea of sampling the instances most likely to be misclassified. Third, we develop a sampling method named CBMPMS (Committee Based Most Possible Misclassification Sampling), which samples the instances that have the largest probability of being misclassified by the current classifier. When the classifiers must reach the same accuracy, the proposed CBMPMS method requires fewer sampling rounds than existing active learning methods. Experiments show that the proposed method outperforms traditional sampling methods on most of the selected datasets.
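The selection rule can be sketched with a vote-based committee, taking the fraction of members that disagree with the current classifier as a crude estimate of its misclassification probability (the published CBMPMS criterion is more refined; all data below are invented):

```python
from collections import Counter

def misclassification_prob(committee_votes, current_prediction):
    """Estimated probability that the current classifier's prediction is
    wrong: the fraction of committee members that voted differently."""
    votes = Counter(committee_votes)
    return 1.0 - votes[current_prediction] / len(committee_votes)

def select_query(pool_votes, current_preds):
    """Pick the unlabeled instance most likely to be misclassified."""
    scores = [misclassification_prob(v, p)
              for v, p in zip(pool_votes, current_preds)]
    return max(range(len(scores)), key=scores.__getitem__)

# Hypothetical committee of 5 classifiers voting on 4 unlabeled instances.
pool_votes = [["a", "a", "a", "a", "a"], ["a", "b", "a", "a", "b"],
              ["b", "b", "a", "b", "a"], ["a", "b", "b", "b", "b"]]
current_preds = ["a", "a", "b", "a"]    # current classifier's predictions
query = select_query(pool_votes, current_preds)
```

Instance 3 is chosen: 4 of 5 committee members contradict the current prediction, so labeling it is most likely to correct an actual mistake.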


2017 · Vol 2017 · pp. 1-13
Author(s):  
Lurdes Borges Silva ◽  
Mário Alves ◽  
Rui Bento Elias ◽  
Luís Silva

Tree density is an important parameter affecting ecosystem functions and management decisions, while tree distribution patterns affect sampling design. Pittosporum undulatum stands in the Azores are being targeted by a biomass valorization program, for which efficient tree density estimators are required. We compared T-square sampling, the point-centered quarter method (PCQM), and N-tree sampling with benchmark quadrat (QD) sampling in six 900 m² plots established in P. undulatum stands on São Miguel Island. A total of 15 estimators were tested using a data-resampling approach. The estimated density range (344–5056 trees/ha) agreed with previous studies using PCQM only. Although it tended to underestimate tree density (in comparison with QD), T-square sampling was overall the most accurate and precise method, followed by PCQM. The tree distribution pattern was slightly aggregated in 4 of the 6 stands. Considering (1) the low bias and high precision, (2) the consistency among three estimators, (3) the applicability to aggregated patterns, and (4) the possibility of obtaining a larger number of independent tree-parameter estimates, we recommend T-square sampling in P. undulatum stands within the framework of a biomass valorization program.
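For reference, one common form of the compound T-square density estimator (attributed to Byth 1982) is N̂ = n² / (2 Σxᵢ · √2 Σzᵢ), where xᵢ is the random-point-to-nearest-tree distance and zᵢ the T-square tree-to-neighbour distance. A sketch with hypothetical distances; the formula and all numbers below are illustrative, not the paper's 15 tested estimators:

```python
import math

def t_square_density(x, z):
    """Compound T-square density estimate (one common form).

    x: point-to-nearest-tree distances (m)
    z: T-square nearest-neighbour distances (m)
    Returns trees per square metre.
    """
    n = len(x)
    return n * n / (2.0 * sum(x) * math.sqrt(2.0) * sum(z))

# Hypothetical distances (metres) measured at 5 random points.
x = [1.2, 0.8, 1.5, 1.0, 0.9]
z = [1.1, 1.4, 0.9, 1.3, 1.0]
density_ha = t_square_density(x, z) * 10_000   # convert to trees/ha
```

With these toy distances the estimate lands near 2,900 trees/ha, inside the density range the study reports.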

