Effects of a Data Reduction Technique on Anthropometric Accommodation

Author(s):  
Pierre Meunier

Multivariate data reduction techniques such as principal components analysis (PCA) offer the potential to simplify the task of designing and evaluating workspaces for anthropometric accommodation of the user population. Simplification occurs by reducing the number of variables one has to consider while retaining most, e.g. 89%, of the original dataset's variability. The error introduced by choosing to ignore some (11%) of that variability is examined in this paper. A set of eight design mannequins was generated using a data reduction method developed for MIL-STD-1776A. These mannequins, located on the periphery of circles encompassing 90%, 95% and 99% of the population on two principal components, were compared with the true multivariate 90%, 95% and 99% accommodation limits. The PCA mannequins were found to include less of the population than originally intended. The degree to which the mannequins included the true percentage of the population depended mainly on the size of the initial envelope (larger envelopes were closer to the true accommodation limits). The paper also discusses some of the limitations of using small numbers of test cases to predict population accommodation.
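The boundary-case construction described in the abstract can be sketched as follows. The anthropometric variables, their means, and their covariances below are invented for illustration, not taken from MIL-STD-1776A; the sketch only shows the general idea of placing eight cases on a circle in the space of the first two principal components:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical anthropometric sample: stature, sitting height, arm length (mm)
X = rng.multivariate_normal(
    [1755, 910, 600],
    [[4900, 2800, 2100],
     [2800, 2500, 1200],
     [2100, 1200, 1600]],
    size=5000)
mean, std = X.mean(axis=0), X.std(axis=0)
Z = (X - mean) / std                         # standardise each measurement
evals, evecs = np.linalg.eigh(np.cov(Z, rowvar=False))
order = np.argsort(evals)[::-1]              # sort components by variance
evals, evecs = evals[order], evecs[:, order]

# For two components the chi-square quantile has a closed form:
# chi2.ppf(p, df=2) == -2 * ln(1 - p)
r = np.sqrt(-2 * np.log(1 - 0.95))           # circle radius enclosing ~95%
angles = np.deg2rad(np.arange(0, 360, 45))   # eight evenly spaced boundary cases
scores = r * np.column_stack([np.cos(angles), np.sin(angles)])
# Map standardised PC scores back to measurement units (two components only)
mannequins = mean + (scores * np.sqrt(evals[:2])) @ evecs[:, :2].T * std
```

Because the mannequins live on a 2-D circle rather than the full 3-D ellipsoid, they enclose less than 95% of the population in the original measurement space, which is exactly the accommodation shortfall the paper quantifies.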

2013, Vol 7 (1), pp. 19-24
Author(s):  
Kevin Blighe

Elaborate downstream methods are required to analyze large microarray datasets. At times, when the end goal is to look for relationships between (or patterns within) different subgroups or even individual samples, large datasets must first be filtered using statistical thresholds in order to reduce their overall volume. As an example, in anthropological microarray studies, such ‘dimension reduction’ techniques are essential to elucidate any links between polymorphisms and phenotypes for given populations. In such large datasets, a subset can first be taken to represent the larger dataset; for example, polling results taken during elections are used to infer the opinions of the population at large. However, what is the best and easiest method of capturing a subset of variation in a dataset that can represent the overall portrait of variation? In this article, principal components analysis (PCA) is discussed in detail, including its history, the mathematics behind the process, and the ways in which it can be applied to modern large-scale biological datasets. New methods of analysis using PCA are also suggested, with tentative results outlined.
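The dimension-reduction step the abstract describes can be sketched with PCA via eigendecomposition of the covariance matrix; the matrix below is synthetic stand-in data, not a real microarray, and the 90% variance threshold is an arbitrary illustrative choice:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical expression matrix: 100 samples x 50 probes
X = rng.normal(size=(100, 50)) @ rng.normal(size=(50, 50))
Xc = X - X.mean(axis=0)                 # centre each variable
cov = np.cov(Xc, rowvar=False)
evals, evecs = np.linalg.eigh(cov)      # eigendecomposition of covariance
order = np.argsort(evals)[::-1]         # descending variance order
evals, evecs = evals[order], evecs[:, order]

explained = evals / evals.sum()         # proportion of variance per component
k = np.searchsorted(np.cumsum(explained), 0.90) + 1   # components for ~90%
scores = Xc @ evecs[:, :k]              # reduced representation of the samples
```

The `scores` matrix replaces the 50 original probes with `k` uncorrelated components that together retain at least 90% of the variance, which is the "subset that represents the larger dataset" the abstract refers to.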


1998, Vol 63 (4), pp. 635-650
Author(s):  
William C. Prentiss

Sullivan and Rozen's (1985) debitage typology has been proposed as a method for measuring the effects of variation in lithic reduction by describing “distinctive assemblages.” This is in contrast to many traditional analytical methods oriented toward identifying the effects of lithic reduction techniques on individual flakes. Debate over the typology has focused primarily on its ability to accurately measure variation in lithic reduction behavior, and secondarily on the role of experimental studies in archaeology. In this paper I present an analysis designed to estimate the reliability and validity of the typology. An experimental design is developed to permit data collection with minimal analyst-induced random or systematic error. Principal components analysis and the coefficient theta demonstrate that the typology provides reliable, replicable results when applied to debitage assemblages of similar technological origin. Further principal components analysis suggests that the instrument is of limited utility in recognizing the effects of variation in reduction activities associated with highly vitreous lithic raw materials. A means of expanding the typology and increasing its accuracy in archaeological pattern recognition is presented.
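A PCA-based reliability coefficient of the kind the abstract mentions is commonly computed from the largest eigenvalue of the variables' correlation matrix (Armor's theta); this is an assumption about which theta is meant, and the debitage-count data below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical debitage counts: 40 assemblages x 4 flake categories,
# driven by a shared "reduction intensity" factor plus noise
intensity = rng.normal(size=(40, 1))
X = intensity @ rng.uniform(0.5, 1.0, size=(1, 4)) + 0.3 * rng.normal(size=(40, 4))

R = np.corrcoef(X, rowvar=False)        # correlation matrix of the 4 categories
lam1 = np.linalg.eigvalsh(R)[-1]        # largest eigenvalue
p = R.shape[0]
theta = (p / (p - 1)) * (1 - 1 / lam1)  # Armor's theta reliability
```

A theta near 1 indicates that the flake categories vary together along a single dominant component, i.e. the typology yields consistent results across assemblages of similar technological origin.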

