DATA PROCESSING UNDER A PRIORI STATISTICAL UNCERTAINTY

1986 ◽  
Vol 19 (5) ◽  
pp. 213-217
Author(s):  
A.R. Pankov ◽  
A.M. Skuridin

Author(s):  
Igor Klimenko ◽  
A. Ivlev

The study expands the rank scale for the a priori assessment of a chosen strategy, increasing the sensitivity with which the caution/negligence ratio can be assessed when risky as well as classical decision-making criteria are applied under conditions of statistical uncertainty.


2021 ◽  
pp. 000276422110216
Author(s):  
Kazimierz M. Slomczynski ◽  
Irina Tomescu-Dubrow ◽  
Ilona Wysmulek

This article proposes a new approach to analyze protest participation measured in surveys of uneven quality. Because single international survey projects cover only a fraction of the world’s nations in specific periods, researchers increasingly turn to ex-post harmonization of different survey data sets not a priori designed as comparable. However, very few scholars systematically examine the impact of the survey data quality on substantive results. We argue that the variation in source data, especially deviations from standards of survey documentation, data processing, and computer files—proposed by methodologists of Total Survey Error, Survey Quality Monitoring, and Fitness for Intended Use—is important for analyzing protest behavior. In particular, we apply the Survey Data Recycling framework to investigate the extent to which indicators of attending demonstrations and signing petitions in 1,184 national survey projects are associated with measures of data quality, controlling for variability in the questionnaire items. We demonstrate that the null hypothesis of no impact of measures of survey quality on indicators of protest participation must be rejected. Measures of survey documentation, data processing, and computer records, taken together, explain over 5% of the intersurvey variance in the proportions of the populations attending demonstrations or signing petitions.
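The quality–protest association reported above can be illustrated with a simple regression exercise. Everything below is synthetic and purely illustrative (the quality indicators, coefficients, and noise level are invented stand-ins, not the SDR data); it only shows how the share of intersurvey variance explained by quality measures can be computed:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: one row per national survey (the study covers 1,184),
# three survey-quality indicators (documentation, data processing, computer
# records) and the proportion of respondents attending demonstrations.
n = 1184
quality = rng.normal(size=(n, 3))
protest = 0.15 + quality @ np.array([0.01, 0.008, 0.005]) \
    + rng.normal(scale=0.05, size=n)

# OLS with an intercept; the share of intersurvey variance explained
# by the quality measures taken together is R^2.
X = np.column_stack([np.ones(n), quality])
beta, *_ = np.linalg.lstsq(X, protest, rcond=None)
resid = protest - X @ beta
r2 = 1.0 - resid.var() / protest.var()
print(f"R^2 = {r2:.3f}")
```

With the invented coefficients above, the quality measures explain a few percent of the variance, comparable in spirit to the "over 5%" figure the article reports.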


2021 ◽  
Vol 11 (4) ◽  
pp. 1399
Author(s):  
Jure Oder ◽  
Cédric Flageul ◽  
Iztok Tiselj

In this paper, we present uncertainties of statistical quantities of direct numerical simulations (DNS) with small numerical errors. The uncertainties are analysed for channel flow and a flow separation case in a confined backward facing step (BFS) geometry. The infinite channel flow case has two homogeneous directions, which is usually exploited to speed up the convergence of the results. As we show, such a procedure reduces the statistical uncertainties of the results by up to an order of magnitude. This effect is strongest in the near-wall regions. In the case of flow over a confined BFS, there are no such directions and thus very long integration times are required. The individual statistical quantities converge with the square root of the integration time, so, in order to reduce the uncertainty by a factor of two, the simulation has to be prolonged by a factor of four. We provide an estimator that can be used to evaluate a priori the relative statistical uncertainties of a DNS from results obtained with a Reynolds-averaged Navier–Stokes (RANS) simulation. In the DNS, the estimator can be used to predict the averaging time, and with it the simulation time, required to achieve a certain relative statistical uncertainty of the results. For accurate evaluation of averages and their uncertainties, it is not necessary to use every time step of the DNS. We observe that the statistical uncertainty of the results is unaffected by reducing the number of samples as long as the period between two consecutive samples, measured in Courant–Friedrichs–Lewy (CFL) condition units, remains below one. Beyond this limit, however, the estimated uncertainties start to grow significantly.
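The square-root convergence described above implies a simple scaling rule for planning averaging times. The helper below is an illustrative sketch of that scaling only, not the paper's RANS-based estimator; the reference time and uncertainty values are assumed inputs:

```python
import math

def required_averaging_time(t_ref: float, u_ref: float, u_target: float) -> float:
    """Averaging time needed to reach a target relative statistical
    uncertainty, given that the uncertainty of a time average decays
    as 1/sqrt(T): u(T) = u_ref * sqrt(t_ref / T)."""
    return t_ref * (u_ref / u_target) ** 2

# Halving the uncertainty requires four times the integration time.
print(required_averaging_time(100.0, 0.02, 0.01))  # -> 400.0
```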


2020 ◽  
Author(s):  
Emily S. Kappenman ◽  
Jaclyn Farrens ◽  
Wendy Zhang ◽  
Andrew X. Stewart ◽  
Steven J. Luck

Event-related potentials (ERPs) are noninvasive measures of human brain activity that index a range of sensory, cognitive, affective, and motor processes. Despite their broad application across basic and clinical research, there is little standardization of ERP paradigms and analysis protocols across studies. To address this, we created ERP CORE (Compendium of Open Resources and Experiments), a set of optimized paradigms, experiment control scripts, data processing pipelines, and sample data (N = 40 neurotypical young adults) for seven widely used ERP components: N170, mismatch negativity (MMN), N2pc, N400, P3, lateralized readiness potential (LRP), and error-related negativity (ERN). This resource makes it possible for researchers to 1) employ standardized ERP paradigms in their research, 2) apply carefully designed analysis pipelines and use a priori selected parameters for data processing, 3) rigorously assess the quality of their data, and 4) test new analytic techniques with standardized data from a wide range of paradigms.


2020 ◽  
Vol 12 (17) ◽  
pp. 2797
Author(s):  
Gabriel Vasile

This paper proposes a novel data processing framework dedicated to bedload monitoring in underwater environments. After calibration, by integrating the total energy in the nominal bandwidth, the proposed experimental set-up is able to accurately measure the mass of individual sediments hitting the steel plate. This requires a priori knowledge of the vibration transients in order to match a predefined dictionary. Based on unsupervised hierarchical agglomeration of complex vibration spectra, the proposed algorithms allow accurate localization of the transients corresponding to the shocks created by sediment impacts on the steel plate.
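The energy-integration step described above can be sketched as a generic band-limited energy computation. The function below is an illustrative stand-in, not the paper's calibrated set-up: the bandwidth, sampling rate, and Parseval-based scaling are assumptions for the sketch.

```python
import numpy as np

def band_energy(signal: np.ndarray, fs: float, f_lo: float, f_hi: float) -> float:
    """Total energy of `signal` within [f_lo, f_hi] (Hz), computed from the
    one-sided spectrum via Parseval's theorem."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    # Parseval: sum |x[n]|^2 = (1/N) sum |X[k]|^2; double the one-sided bins
    # (except DC and Nyquist) to account for negative frequencies.
    weights = np.full(freqs.size, 2.0)
    weights[0] = 1.0
    if signal.size % 2 == 0:
        weights[-1] = 1.0
    return float(np.sum(weights[band] * np.abs(spectrum[band]) ** 2) / signal.size)

# Example: a 100 Hz tone sampled at 1 kHz; its band energy near 100 Hz
# matches the total signal energy.
fs = 1000.0
tone = np.sin(2 * np.pi * 100 * np.arange(1000) / fs)
print(band_energy(tone, fs, 90.0, 110.0))  # ~ np.sum(tone**2)
```

In the actual system, such a band-limited energy would then be mapped to sediment mass via the calibration the abstract mentions.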


2021 ◽  
Vol 2 (4) ◽  
pp. 593-602
Author(s):  
Renil Septiano ◽  
Sarjon Defit ◽  
Laynita Sari

Data is a very valuable asset because it can provide accurate information easily and quickly. This study processes coffee shop transaction data from the city of Padang to determine how customers behave when choosing menu items. The researchers processed the data using the Apriori method. At a support value of 15%, the results show that the majority of customers still buy menu items individually.
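A minimal sketch of the Apriori-style support counting used in such a study might look as follows. The transactions and menu items below are invented for illustration; only the 15% support threshold comes from the abstract.

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support=0.15):
    """Levelwise (Apriori-style) search for itemsets whose support, i.e. the
    share of transactions containing them, is at least `min_support`."""
    transactions = [frozenset(t) for t in transactions]
    n = len(transactions)
    items = sorted({i for t in transactions for i in t})
    result, size = {}, 1
    current = [frozenset([i]) for i in items]
    while current:
        frequent = {}
        for cand in current:
            support = sum(cand <= t for t in transactions) / n
            if support >= min_support:
                frequent[cand] = support
        result.update(frequent)
        # Candidates of size k+1 built from unions of frequent k-itemsets.
        keys = list(frequent)
        size += 1
        current = list({a | b for a, b in combinations(keys, 2)
                        if len(a | b) == size})
    return result

# Hypothetical coffee shop orders; most customers buy a single item.
orders = [
    {"latte"}, {"latte", "croissant"}, {"espresso"}, {"latte"},
    {"espresso", "croissant"}, {"latte"}, {"tea"}, {"latte", "croissant"},
]
sets = frequent_itemsets(orders, min_support=0.15)
```

With these invented orders, only one pair ({"latte", "croissant"}) clears the threshold, mirroring the study's finding that most purchases are of single items.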


2016 ◽  
Author(s):  
Thierry Leblanc ◽  
Robert J. Sica ◽  
J. Anne E. van Gijsel ◽  
Alexander Haefele ◽  
Guillaume Payen ◽  
...  

Abstract. A standardized approach for the definition, propagation, and reporting of uncertainty in the temperature lidar data products contributing to the Network for the Detection of Atmospheric Composition Change (NDACC) database is proposed. One important aspect of the proposed approach is the ability to propagate all independent uncertainty components in parallel through the data processing chain. The individual uncertainty components are then combined at the very last stage of processing to form the combined standard uncertainty of temperature. The identified individual uncertainty components comprise signal detection uncertainty, uncertainty due to saturation correction, background noise extraction, the merging of multiple channels, the absorption cross sections of ozone and NO2, the molecular extinction cross sections, the a priori use of ancillary air, ozone, and NO2 number density, the a priori use of ancillary temperature to tie on the top of the profile, the acceleration of gravity, and the molecular mass of air. The expressions for the individual uncertainty components and their step-by-step propagation through the temperature data processing chain are derived in detail. All sources of uncertainty except detection noise imply correlated terms in the vertical dimension, which means that covariance terms must be taken into account when vertical filtering is applied and when temperature is integrated from the top of the profile. Quantitatively, the uncertainty budget is presented in a generic form (i.e., as a function of instrument performance and wavelength), so that any NDACC temperature lidar investigator can easily estimate the expected impact of individual uncertainty components for their own instrument. An example of a full uncertainty budget obtained from actual measurements by the JPL lidar at the Mauna Loa Observatory is also provided.
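The last-stage combination described above can be sketched as a root-sum-square of independent components. The function and the per-component values below are illustrative assumptions only; as the abstract notes, covariance terms would additionally be needed wherever vertical filtering or top-down integration introduces correlations, which this quadrature sketch deliberately ignores.

```python
import math

def combined_standard_uncertainty(components):
    """Combine independent uncertainty components, each already propagated
    to the final temperature product, in quadrature (root-sum-square)."""
    return math.sqrt(sum(u * u for u in components))

# Hypothetical per-component temperature uncertainties (K) at one altitude:
# detection noise, saturation correction, background extraction, ozone
# cross section. Values are invented for illustration.
print(combined_standard_uncertainty([0.8, 0.2, 0.1, 0.05]))
```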


2018 ◽  
Author(s):  
Ali Jalali ◽  
Robert J. Sica ◽  
Alexander Haefele

Abstract. Hauchecorne and Chanin (1980) developed a robust method to calculate middle atmosphere temperature profiles using measurements from Rayleigh-scatter lidars. This traditional method has been successfully used to greatly improve our understanding of middle atmospheric dynamics, but the method has some shortcomings in regard to the calculation of systematic uncertainties and vertical resolution of the retrieval. Sica and Haefele (2015) have shown the Optimal Estimation Method (OEM) addresses these shortcomings and allows temperatures to be retrieved with confidence over a greater range of heights than the traditional method. We have developed a temperature climatology from Purple Crow Lidar (PCL) Rayleigh-scatter measurements on 519 nights using an OEM. Our OEM retrieval is a first-principle retrieval where the forward model is the lidar equation and the measurements are the level 0 count returns. It includes a quantitative determination of the top altitude of the retrieval, the evaluation of 9 systematic plus random uncertainties, and vertical resolution of the retrieval on a profile-by-profile basis. By using the calculated averaging kernels our new retrieval extends our original climatology by an additional 5 to 10 km in altitude relative to the traditional method. The OEM statistical uncertainty makes the largest contribution in the uncertainty budget. However, significant contributions are also made from the systematic uncertainties, in particular the uncertainty due to choosing a tie-on pressure as required by the assumption of hydrostatic equilibrium, mean molecular mass variations with height, and ozone absorption cross section uncertainty. The vertical resolution of the PCL climatology is 1 km up to about 90 km and then increases to about 3 km around 100 km. The new PCL temperature climatology is compared with three sodium lidar climatologies. 
The comparison between the PCL and sodium lidar climatologies shows improved agreement relative to the climatology generated using the method of Hauchecorne and Chanin; that is, the PCL climatology is as similar to the sodium lidar climatologies as the sodium lidar climatologies are to each other. The height-extended OEM-derived climatology is highly insensitive to the choice of a priori temperature profile, in the sense that the a priori temperature profile contributes much less uncertainty than the statistical uncertainty.

