statistical certainty
Recently Published Documents


TOTAL DOCUMENTS

19
(FIVE YEARS 1)

H-INDEX

5
(FIVE YEARS 0)

Author(s):  
Angel Fernando Kuri-Morales

The exploitation of large databases demands considerable resources, both in storage and in processing time. Correct assessment of the data requires pre-processing before analysis; in particular, categorical data must be transformed by adequately encoding every instance of the categorical variables. The encoding must preserve the actual patterns in the data while avoiding the introduction of non-existing ones. The authors discuss CESAMO, an algorithm that statistically identifies such pattern-preserving codes. The resulting database is more economical and the approach may encompass mixed (numerical and categorical) databases. They thus obtain a transformed representation that is considerably more compact without impairing its informational content. To establish the equivalence of the original data set (FD) and the reduced data set (RD), they apply an algorithm that relies on a multivariate regression algorithm (AA). Through the combined application of CESAMO and AA, the equivalent behavior of FD and RD may be guaranteed with a high degree of statistical certainty.
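The abstract does not spell out CESAMO's internals, but the core idea of searching for codes that preserve statistical patterns can be sketched. The following is an illustrative simplification, not the published algorithm: it samples random candidate numeric codes for each category and keeps the assignment whose encoded column correlates most strongly with a reference numeric variable (the function name, the number of candidates, and the correlation criterion are all assumptions).

```python
import random

import numpy as np

def encode_categorical(values, target, n_candidates=200, seed=0):
    """Assign a numeric code in [0, 1) to each category by sampling random
    candidate code assignments and keeping the one whose encoded column has
    the strongest absolute Pearson correlation with a reference numeric
    variable -- a simplified stand-in for a pattern-preservation criterion."""
    rng = random.Random(seed)
    categories = sorted(set(values))
    best_codes, best_score = None, -1.0
    for _ in range(n_candidates):
        codes = {c: rng.random() for c in categories}
        encoded = np.array([codes[v] for v in values])
        score = abs(np.corrcoef(encoded, target)[0, 1])
        if score > best_score:
            best_score, best_codes = score, codes
    return best_codes, best_score
```

With a target that genuinely depends on the category, the best sampled code assignment recovers a near-linear relationship, which is the sense in which the encoding "preserves" the pattern instead of injecting an arbitrary one.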


2013, Vol. 20 (3), pp. 405-412
Author(s):  
Brian E. Bewer

Analyzer-based imaging has improved tissue X-ray imaging beyond what conventional radiography was able to achieve. The extent of the improvement depends on the crystal reflection used in the monochromator and analyzer combination, the imaging photon energy, the geometry of the sample and the imaging detector. Together these factors determine the ability of the system to distinguish between various bone tissues or soft tissues, with a specified statistical certainty between pixels in a counting detector, before any image processing. The following discussion details how the required number of imaging photons and the resulting surface absorbed dose change as the imaging variables are altered. The process of selecting, for an arbitrary analyzer-based imaging system, the imaging parameters that deliver the minimum surface absorbed dose to a sample while achieving a desired statistical certainty between sample materials is then described. Two-component samples consisting of bone and soft tissue are discussed as an imaging test case, and the two-component approach is then generalized to a multiple-component sample.
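To make the notion of a "required number of imaging photons for a specified statistical certainty" concrete, here is a minimal sketch for a two-component comparison under pure Poisson counting statistics. It asks how many incident photons per pixel are needed for the transmitted counts through two materials to differ by k standard deviations; the attenuation coefficients in the usage below and the simple k-sigma criterion are assumptions for illustration, not the paper's full treatment of analyzer reflectivity, geometry, and dose.

```python
import math

def photons_for_certainty(mu1, mu2, thickness, k=3.0):
    """Incident photons per pixel needed so that Poisson-distributed
    transmitted counts through two materials differ by k standard
    deviations.

    With N_i = N0 * exp(-mu_i * t), the criterion
        |N1 - N2| >= k * sqrt(N1 + N2)
    solves to
        N0 >= k**2 * (t1 + t2) / (t1 - t2)**2,
    where t_i = exp(-mu_i * t) are the transmission fractions.
    """
    t1 = math.exp(-mu1 * thickness)
    t2 = math.exp(-mu2 * thickness)
    return k * k * (t1 + t2) / (t1 - t2) ** 2
```

Because the required photon count scales with k squared, tightening the statistical certainty is expensive, which is why minimizing surface absorbed dose for a fixed certainty is a meaningful optimization.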


Author(s):  
Charles Malcolm,
Andrew Young,
Ellen Willmott,
Matthew Holmes

Author(s):  
Yura A. Sevcenco,
David Walters,
Andrew P. Crayford,
Richard Marsh,
Philip J. Bowen,
...  

This study is part of an ongoing European Aviation Safety Agency (EASA) programme ('SAMPLE'). The effects on the measured PM size distribution of gas stream flow regimes in the sample transport line, and of dilution strategies for removal of the volatile fraction, are evaluated behind a simulated aero-derivative gas turbine exhaust using a fast mobility DMS500 particle sizer. The PM size distribution and concentration within the primary transport sample was found to be relatively insensitive to flow regime, with conditions of turbulent flow (lowest residence time) providing the highest number concentrations and hence the least losses. However, given the natural variation of PM production from the combustor source, the statistical certainty of these observations requires consolidation. A 'bespoke' volatile particle removal system (VPR) based on the European automotive PMP protocol was constructed to allow the effects of dilution ratio and evaporation tube residence time to be investigated. Neither increasing the dilution ratio nor increasing the residence time in the evaporation tube affected the size distribution at the two distinct nucleation and accumulation modes to any degree of statistical certainty. When using high (420:1) dilution ratios in the VPR, a third, larger (200 nm) mode appears, which requires further investigation.
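As a small illustration of the dilution bookkeeping implied above: a PMP-style removal system typically dilutes in stages, and the raw exhaust concentration is recovered by multiplying the measured concentration by the total dilution ratio. The sketch below assumes ideal mixing and neglects particle losses (which the study evaluates separately); the stage values in the usage are hypothetical.

```python
def total_dilution_ratio(stage_ratios):
    """Total dilution ratio of a multi-stage diluter: the product of the
    individual stage ratios (e.g. a primary hot diluter followed by a
    secondary diluter)."""
    total = 1.0
    for ratio in stage_ratios:
        total *= ratio
    return total

def raw_concentration(measured, stage_ratios):
    """Recover the undiluted particle number concentration from a diluted
    measurement, assuming ideal mixing and no particle losses."""
    return measured * total_dilution_ratio(stage_ratios)
```

For example, two hypothetical stages of 10:1 and 42:1 give the 420:1 overall ratio mentioned in the abstract.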

