Sampling Theory

Author(s):  
David Hankin ◽  
Michael S. Mohr ◽  
Kenneth B. Newman

We present a rigorous but understandable introduction to the field of sampling theory for ecologists and natural resource scientists. Sampling theory concerns itself with the development of procedures for random selection of a subset of units, a sample, from a larger finite population, and with how best to use sample data to make scientifically and statistically sound inferences about the population as a whole. The inferences fall into two broad categories: (a) estimation of simple descriptive population parameters, such as means, totals, or proportions, for variables of interest, and (b) estimation of the uncertainty associated with estimated parameter values. The targets of estimation are few and simple, yet estimates of means, totals, or proportions see important and often controversial uses in the management of natural resources and in fundamental ecological research; even so, few ecologists or natural resource scientists have formal training in sampling theory. We emphasize the classical design-based approach to sampling, in which variable values associated with units are regarded as fixed and the uncertainty of estimation arises from the randomization strategies used to select samples. In addition to covering standard topics such as simple random, systematic, cluster, unequal probability (stressing the generality of Horvitz–Thompson estimation), multi-stage, and multi-phase sampling, we also consider adaptive sampling, spatially balanced sampling, and sampling through time, three areas of special importance for ecologists and natural resource scientists. The text is directed to undergraduate seniors, graduate students, and practicing professionals. Problems emphasize application of the theory and R programming in ecological and natural resource settings.
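The abstract stresses the generality of Horvitz–Thompson estimation; a minimal sketch of the estimator for a population total is given below, with invented sample values and inclusion probabilities used purely for illustration.

```python
import numpy as np

# Hypothetical sample of 4 units: observed y-values and first-order inclusion probabilities pi_i
y = np.array([12.0, 3.0, 7.5, 20.0])
pi = np.array([0.10, 0.05, 0.08, 0.20])

# Horvitz–Thompson estimator of the population total: each observation is weighted by 1/pi_i,
# so units that were unlikely to be sampled stand in for more of the population.
t_hat = np.sum(y / pi)
print(f"Estimated population total: {t_hat:.1f}")
```

Dividing by the known population size gives the corresponding mean estimate; only the inclusion probabilities change from design to design, which is the sense in which the estimator is general.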

2008 ◽  
Vol 65 (2) ◽  
pp. 176-197 ◽  
Author(s):  
Kathryn L Mier ◽  
Susan J Picquelle

The main goal in estimating population abundance is to maximize the accuracy and precision of the estimate. This is difficult when the survey area is large and resources are limited. We implemented a feasible adaptive sampling survey for an aggregated population in a marine environment and compared its performance with that of five classical survey designs. Specifically, larval walleye pollock (Theragra chalcogramma) in the Gulf of Alaska was used as an example of a widespread aggregated population. The six sampling designs included (i) adaptive cluster, (ii) simple random, (iii) systematic, (iv) systematic cluster, (v) stratified systematic, and (vi) unequal probability. Of the five different adaptive estimators used for the adaptive cluster design, the modified Hansen–Hurwitz performed best overall. Of the six survey designs, the stratified systematic survey provided the best overall estimator, provided there was accurate prior information on which to base the strata. If no prior information was available, a systematic survey was best. A systematic survey using a single random starting point with a simple random estimator performed as well as and sometimes better than a systematic cluster survey with two starting points (clusters). The adaptive cluster survey showed no advantages when compared with these two designs and furthermore presented substantial logistical challenges.
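For readers unfamiliar with the winning estimator, a minimal sketch of the modified Hansen–Hurwitz estimator for an adaptive cluster design is given below; the network means are invented values, the initial sample is assumed to be a simple random sample of n = 5 units, and the finite-population correction is ignored.

```python
import numpy as np

# Modified Hansen–Hurwitz estimator for adaptive cluster sampling (sketch).
# For each initially selected unit i, w_i is the mean of the y-values over the network of
# units that the adaptive rule grew around it (w_i = y_i itself if the condition was not met).
network_means = np.array([0.0, 14.2, 0.0, 3.8, 22.5])   # hypothetical larval densities

mu_hat = network_means.mean()                                      # estimated mean density per unit
se_hat = network_means.std(ddof=1) / np.sqrt(len(network_means))   # SRS-style standard error, no fpc
print(f"mean = {mu_hat:.2f}, SE = {se_hat:.2f}")
```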


2014 ◽  
Vol 4 (2) ◽  
pp. 154-163 ◽  
Author(s):  
San-dang Guo ◽  
Sifeng Liu ◽  
Zhigeng Fang ◽  
Lingling Wang

Purpose – The purpose of this paper is to put forward a multi-stage information aggregation method based on grey incentive control lines to evaluate objects dynamically and comprehensively.
Design/methodology/approach – Based on the evaluation values of the objects, positive and negative incentive lines are set up and predicted values are obtained from the grey GM(1,1) model, so that the evaluation incorporates expected (forecast) information. In the evaluation, the part above the positive incentive line is "rewarded" and the part below the negative incentive line is "punished" appropriately. In this way, a double incentive reflecting both the current development situation and the future development trend of each object is implemented on the basis of these control lines.
Findings – The method captures the decision maker's expectations about how the evaluated objects will develop and gives the evaluation results greater practical value.
Research limitations/implications – Many comprehensive evaluations rely only on past information, yet the future development trend of the evaluated object is also very important. This study can be used in evaluations oriented toward future application and development.
Originality/value – The paper provides not only a method of multi-phase information aggregation with expectancy information, but also a simple and convenient way of determining nonlinear incentive lines objectively.
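The forecast values that feed the incentive lines come from the grey GM(1,1) model; a minimal sketch of the textbook GM(1,1) construction is given below, applied to a short, invented series of evaluation scores.

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Textbook grey GM(1,1) forecast of a short positive series."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                 # accumulated generating operation (AGO)
    z1 = 0.5 * (x1[1:] + x1[:-1])                      # mean sequence of the AGO series
    B = np.column_stack((-z1, np.ones(len(z1))))
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # develop coefficient a and grey input b
    k = np.arange(len(x0) + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time response of the whitened equation
    x0_hat = np.diff(x1_hat, prepend=0.0)              # inverse AGO recovers the fitted series
    return x0_hat[-steps:]

# Hypothetical yearly evaluation scores of one object; forecast the next period's value
print(gm11_forecast([3.2, 3.6, 4.1, 4.5, 5.0], steps=1))
```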


2018 ◽  
Vol 9 (1) ◽  
pp. 183
Author(s):  
Eleonora Sergeevna NAVASARDOVA ◽  
Roman Vladimirovich NUTRIKHIN ◽  
Tatyana Nikolaevna ZINOVYEVA ◽  
Vladimir Aleksandrovich SHISHKIN ◽  
Julia Valeryevna JOLUDEVA

The codification of the legislation on lands, forests, subsoil and other natural resources in the Russian Empire (1721-1917) is studied herein. Some sources of the systematization process in this field of legislation in the period preceding the formation of the empire, from the time of the 'Council Code' to the reforms of Peter I (1649-1720), are also revealed. Initially, legal regulation in this field took the form of numerous separate legal acts. Such law-making was casual in nature and produced internal contradictions in a body of legislation that became too extensive and inconsistent, creating an urgent need for systematization. Land law was the most developed branch in Russia in the pre-imperial and imperial periods, owing to the prevalence of agricultural production and the special importance of land relations, and land legislation was codified before the other natural resource branches. The second most important in this area was forest legislation, which was explained by the abundance of forests and their active use in economic activities, requiring serious legal regulation. The importance of subsoil legislation increased over time with the growing exploitation of mineral resources. Later, water and faunal law began to develop actively and systematically. The milestone in the development of the natural resource branches was M.M. Speransky's codification reform, whose main result was the appearance of the 'Code of Laws of the Russian Empire'. Several of its constituent codes were specifically devoted to land, forest and mineral relations: above all the 'Code of Survey Laws' (vol. X), the 'Code of Institutions and Forest Charters' (vol. VIII) and the 'Code of Institutions and Mineral Charters' (vol. VII), which, however, contained only part of the array of legal norms on lands, forests and subsoil; other volumes of the Code of Laws contained a large number of such norms. The norms of water and faunal law had no separate codes; their systematization was carried out within the charters of related branches of law. Alongside this codification, a large number of separate normative natural-resource acts were issued. Not all of them were organically incorporated into the relevant codes; many were simply attached to them as official annexes. The systematization of the legislation on natural resources in the empire was not very consistent and was not always successful (Engelstein 1993: 339). Even after the most extensive imperial codification, it remained extremely fragmented. Nevertheless, the separation of certain natural resource charters from the Code of Laws as distinct codification units marked the beginning of the formation of land, forest and mineral law in pre-revolutionary Russia as independent branches.


2015 ◽  
Vol 25 (4) ◽  
pp. 753-767 ◽  
Author(s):  
Pedro Mariano ◽  
Luís Correia

We analyse Give and Take, a multi-stage resource sharing game to be played between two players. The payoff is dependent on the possession of an indivisible and durable resource, and in each stage players may either do nothing or, depending on their roles, give the resource or take it. Despite these simple rules, we show that this game has interesting complex dynamics. Unique to Give and Take is the existence of multiple Pareto optimal profiles that can also be Nash equilibria, and a built-in punishment action. This game allows us to study cooperation in sharing an indivisible and durable resource. Since there are multiple strategies to cooperate, Give and Take provides a base to investigate coordination under implicit or explicit agreements. We discuss its position relative to other games and the real world situations that are better modelled by it. The paper presents an in-depth analysis of the game for the range of admissible parameter values. We show that, when taking is costly for both players, cooperation emerges as players prefer to give the resource.
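To make the stage structure concrete, a toy sketch is given below; the payoff and cost parameters are invented and are not the paper's specification, but they illustrate why costly taking pushes players toward voluntary giving.

```python
# Toy sketch of a single Give and Take stage (hypothetical parameters, not the paper's values).
# One player holds the indivisible, durable resource and earns a per-stage benefit; the holder
# may GIVE the resource away, the non-holder may TAKE it, and taking is costly for both players.
BENEFIT = 1.0            # per-stage payoff for possessing the resource (assumed)
TAKE_COST_TAKER = 0.6    # cost paid by the taker (assumed)
TAKE_COST_VICTIM = 0.4   # cost paid by the player the resource is taken from (assumed)

def stage(holder_action, other_action):
    """Return (holder_payoff, other_payoff, resource_transferred) for one stage."""
    holder_pay, other_pay = BENEFIT, 0.0
    transferred = False
    if holder_action == "give":
        transferred = True                   # voluntary transfer costs nobody anything
    elif other_action == "take":
        transferred = True                   # forcible transfer hurts both players
        holder_pay -= TAKE_COST_VICTIM
        other_pay -= TAKE_COST_TAKER
    return holder_pay, other_pay, transferred

print(stage("keep", "take"))   # (0.6, -0.6, True): taking transfers the resource but both pay
print(stage("give", "idle"))   # (1.0, 0.0, True): giving transfers it for free
```

Under parameters like these, repeated taking is dominated by alternating voluntary giving, which is the flavour of the cooperative outcome the authors report when taking is costly for both players.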


Information ◽  
2020 ◽  
Vol 11 (2) ◽  
pp. 65
Author(s):  
Marcin Lawnik ◽  
Arkadiusz Banasik

The Delphi method is one of the basic tools for forecasting values in various types of issues. It uses the knowledge of experts, which is suitably aggregated (e.g., in the form of descriptive statistics) and returned to the same group of experts, thus starting the next round of forecasting. Multi-stage prediction under the Delphi method allows for better stabilization of the results, which is extremely important in the forecasting process. Experts in the forecasting process often have access to time series forecasting software but do not necessarily use it. Therefore, it seems advisable to add to the aggregate the value obtained using forecasting software. The advantage of this approach lies in saving the time and cost of obtaining a forecast, which translates into a smaller burden on data analysts and better use of their work. In view of these factors, the main contribution of the article is the use of a virtual expert in the form of a computer-based mathematical tool, i.e., a programming library for time series forecasting. The chosen software tool is the Prophet library, a Facebook tool that can be used in the Python or R programming languages.
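A minimal sketch of how such a 'virtual expert' value could be produced with Prophet and folded into a Delphi-round aggregate is shown below; the historical series, forecast horizon, and panel responses are all assumptions made for illustration.

```python
import numpy as np
import pandas as pd
from prophet import Prophet  # Facebook's open-source forecasting library

# Hypothetical monthly history of the quantity the expert panel is asked to forecast
history = pd.DataFrame({
    "ds": pd.date_range("2016-01-01", periods=48, freq="MS"),
    "y": 100 + 0.8 * np.arange(48) + np.random.default_rng(0).normal(0, 2, 48),
})

model = Prophet()
model.fit(history)
future = model.make_future_dataframe(periods=12, freq="MS")
virtual_expert = model.predict(future)["yhat"].iloc[-1]   # software forecast at the horizon

# Delphi round: aggregate the human panel's estimates together with the virtual expert's value
human_experts = [138.0, 145.0, 150.0, 142.0]              # invented panel responses
round_estimate = np.median(human_experts + [virtual_expert])
print(round(float(virtual_expert), 1), round(float(round_estimate), 1))
```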


2022 ◽  
pp. 223386592110729
Author(s):  
Uwomano Benjamin Okpevra

The Isoko, like other peoples of Nigeria, played significant roles in the historical process and evolution of Nigeria and should be acknowledged as such. The paper teases out much more clearly the multiple stages of the British expansion into Isoko, and asks how that multi-stage, multi-phase process should affect how we think more broadly about British colonial expansion in Africa in the 19th century. The paper contends that the Isoko as a people did not accept British rule until the "punitive expedition" to the area in 1911 brought the whole of the Isoko country under British control. This is done within the context of the military conquest and subjugation of the people, colonial prejudices, and the resulting social, economic, and political changes. Deploying both primary and secondary data, the paper highlights the role played by the Isoko in resisting British penetration into and subjugation of their country between 1896 and 1911. The year 1896 marked the beginning of formal British contact with the Isoko, when the first treaty was signed with Owe (Owhe), while 1911 was the year the Isoko were conquered and brought under British control.


Author(s):  
Swati Saxena ◽  
Giridhar Jothiprasad ◽  
Corey Bourassa ◽  
Byron Pritchard

Aircraft engines ingest airborne particulate matter, such as sand, dirt, and volcanic ash, into their core. The ingested particulate is transported by the secondary flow circuits via compressor bleeds to the high pressure turbine, where it may deposit, resulting in turbine fouling and loss of cooling effectiveness. Prior publications focused on particulate deposition and sand erosion patterns in a single stage of a compressor or turbine. The current work addresses the migration of ingested particulate through the high pressure compressor and bleed systems. This paper describes a 3D CFD methodology for tracking particles along a multi-stage axial compressor and presents particulate ingestion analysis for a high pressure compressor section. The commercial CFD multi-phase solver ANSYS CFX has been used for flow and particulate simulations. Particle diameters of 20, 40, and 60 microns are analyzed, and particle trajectories and radial particulate profiles are compared across these diameters. The analysis demonstrates how the compressor centrifuges the particles radially toward the compressor case as they travel through the compressor, with larger-diameter particles more strongly affected. Non-spherical particles experience more drag than spherical particles, so a qualitative comparison between spherical and non-spherical particles is also shown.
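The size dependence of the centrifuging follows from the particles' inertial response time; a back-of-the-envelope sketch with assumed sand and air properties (not values from the paper) shows why 60-micron particles follow the flow far less faithfully than 20-micron particles.

```python
# Particle response time tau_p = rho_p * d^2 / (18 * mu) and Stokes number St = tau_p * U / L.
# Property values are rough assumptions for quartz sand in hot compressor air, not the paper's data.
RHO_P = 2650.0    # particle density, kg/m^3 (assumed)
MU = 3.0e-5       # air dynamic viscosity at compressor temperature, Pa*s (assumed)
U = 300.0         # characteristic through-flow velocity, m/s (assumed)
L = 0.03          # characteristic blade-passage length scale, m (assumed)

for d_um in (20, 40, 60):
    d = d_um * 1e-6
    tau_p = RHO_P * d**2 / (18.0 * MU)   # Stokes response time of the particle
    st = tau_p * U / L                   # St >> 1: particle inertia dominates and it ignores flow turning
    print(f"d = {d_um:2d} um   tau_p = {tau_p*1e3:5.1f} ms   St = {st:6.1f}")
```

Because tau_p scales with diameter squared, the 60-micron particles have roughly nine times the Stokes number of the 20-micron particles, consistent with the stronger radial centrifuging reported for the larger sizes.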


2013 ◽  
Vol 562-565 ◽  
pp. 311-316
Author(s):  
Xiao Wei Liu ◽  
Qiang Li ◽  
Guan Nan Sun ◽  
Wen Yan Liu

The theory of the Sigma-Delta modulator is introduced in this paper. Based on this theory, a feedback 2-1-1 multi-stage noise-shaping (MASH) sigma-delta modulator is designed, and the coefficients of the modulator are calculated. The system-level simulation results show that the effective number of bits (ENOB) is 24 bits when the signal bandwidth is 1 kHz and the over-sampling ratio (OSR) is 128. The circuits of the modulator are then designed, including the integrator, comparator, multi-phase clock, and noise-cancelling logic. The whole modulator is simulated in Cadence; the signal-to-noise ratio (SNR) of the modulator is 125.4 dB and the ENOB is 21.1 bits, which meets the technical requirements of the sensor.
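As a rough textbook-level sanity check on these figures (assuming a single-bit quantizer, which the abstract does not state), the ideal quantization-noise-limited SQNR of a 4th-order noise-shaping loop at OSR = 128 can be estimated as below; real circuits fall well short of this ideal because of thermal noise, component mismatch, and imperfect noise cancellation.

```python
import math

# Ideal peak SQNR of an L-th order noise-shaping modulator with an N-bit quantizer:
#   SQNR = (3/2) * (2**N - 1)**2 * (2L + 1) / pi**(2L) * OSR**(2L + 1)
# A MASH 2-1-1 cascade provides 4th-order shaping overall; N = 1 is assumed here.
L_ORDER, N_BITS, OSR = 4, 1, 128
sqnr = 1.5 * (2**N_BITS - 1)**2 * (2 * L_ORDER + 1) / math.pi**(2 * L_ORDER) * OSR**(2 * L_ORDER + 1)
print(f"ideal SQNR ~ {10 * math.log10(sqnr):.0f} dB")   # ~161 dB, comfortably above the simulated 125.4 dB
```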

