The Reliability and Validity of a Lithic Debitage Typology: Implications for Archaeological Interpretation

1998 ◽  
Vol 63 (4) ◽  
pp. 635-650 ◽  
Author(s):  
William C. Prentiss

Sullivan and Rozen's (1985) debitage typology has been proposed as a method for measuring the effects of variation in lithic reduction by describing “distinctive assemblages.” This is in contrast to many traditional analytical methods oriented toward identifying the effects of lithic reduction techniques on individual flakes. Debate over the use of the typology has focused primarily on its ability to accurately measure variation in lithic reduction behavior, and secondarily on the role of experimental studies in archaeology. In this paper I present an analysis designed to estimate the reliability and validity of the typology. An experimental design is developed to permit data collection with minimal analyst-induced random or systematic error. Principal components analysis and the coefficient theta demonstrate that the typology provides reliable, replicable results when applied to debitage assemblages of similar technological origin. Further principal components analysis suggests that the instrument is of limited utility in recognizing effects of variation in reduction activities associated with highly vitreous lithic raw materials. A means of expanding the typology and increasing its accuracy in archaeological pattern recognition is presented.
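The coefficient theta used in this abstract is a PCA-based reliability estimate (Armor's theta), computed from the largest eigenvalue of the correlation matrix of the assemblage variables. A minimal sketch, using synthetic counts standing in for the debitage categories rather than Prentiss's actual data:

```python
import numpy as np

def coefficient_theta(X):
    """Armor's coefficient theta: a PCA-based reliability estimate,
    theta = p/(p-1) * (1 - 1/lambda_1), where lambda_1 is the largest
    eigenvalue of the correlation matrix of the p variables (columns of X)."""
    p = X.shape[1]
    corr = np.corrcoef(X, rowvar=False)
    lam1 = np.linalg.eigvalsh(corr)[-1]  # eigvalsh returns ascending order
    return (p / (p - 1)) * (1.0 - 1.0 / lam1)

# Synthetic stand-in: 6 debitage-category measures driven by one
# underlying reduction factor plus noise (hypothetical, for illustration).
rng = np.random.default_rng(0)
factor = rng.normal(size=(200, 1))
X = factor + 0.5 * rng.normal(size=(200, 6))
theta = coefficient_theta(X)
```

Values near 1 indicate that a single principal component reproduces the assemblage variables consistently, i.e. the instrument yields replicable results.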

Author(s):  
Marcela Falsetti ◽  
Adriana Favieri ◽  
Roxana Scorzo ◽  
Betina Williner

<p class="p1"><span class="s1"><strong> Abstract</strong>. </span>This paper reports a cross-sectional descriptive study of mathematical skills, as part of mathematical competence, and their relationship with mathematical activities and specific content. We analyze the written work of engineering students taking their first Differential Calculus course. For this experience, the students worked in a workshop using Mathematica® software. We describe the criteria used for classifying the activities and skills, the instruments for evaluation, and the processing of the data. In the conclusions we establish relationships between the types of activities and the skills they promote, and we discuss the role of the software in the teaching and learning of introductory Differential Calculus. Finally, through descriptive statistical analysis and principal components analysis, we reinforce the hypothesis that a skill should be measured in close dependence on the content and the task performed.</p> <p class="p1"><span class="s2"><strong>Keywords</strong>: </span>Mathematical skills, activity design, symbolic and numerical mathematical calculus software, Differential Calculus, principal components analysis.</p>


1999 ◽  
Vol 85 (2) ◽  
pp. 579-582 ◽  
Author(s):  
Andile Mji

The article reports the reliability and validity of the Conceptions of Mathematics Questionnaire, based on the responses of 154 undergraduate mathematics majors from four universities in South Africa. Reliability, estimated as internal consistency, gave a Cronbach alpha of .84. To establish validity, principal components analysis with varimax rotation yielded a two-component solution accounting for 44% of the variance. The components were interpreted as Fragmented Conceptions and Cohesive Conceptions of mathematics. Since this factor solution was comparable to that reported in Australia, the result provides a sufficient basis for using the questionnaire in South Africa.
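Both statistics reported above, Cronbach's alpha and the variance accounted for by a two-component solution, are straightforward to compute. The sketch below uses synthetic questionnaire responses, not the actual questionnaire data, and omits the varimax step, which redistributes loadings between the retained components without changing their combined variance:

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach's alpha (internal consistency) for respondents x items X."""
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)
    total_var = X.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

def variance_explained(X, n_components):
    """Share of total variance carried by the leading principal
    components of the correlation matrix."""
    eigvals = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]
    return eigvals[:n_components].sum() / eigvals.sum()

# Synthetic responses: two independent latent conceptions, three items each.
rng = np.random.default_rng(42)
f1, f2 = rng.normal(size=(300, 1)), rng.normal(size=(300, 1))
items = np.hstack([f1 + 0.6 * rng.normal(size=(300, 3)),
                   f2 + 0.6 * rng.normal(size=(300, 3))])
alpha = cronbach_alpha(items)
ve = variance_explained(items, 2)  # two-component solution
```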


2013 ◽  
Vol 7 (1) ◽  
pp. 19-24
Author(s):  
Kevin Blighe

Elaborate downstream methods are required to analyze large microarray datasets. At times, where the end goal is to look for relationships between (or patterns within) different subgroups or even just individual samples, large datasets must first be filtered using statistical thresholds in order to reduce their overall volume. As an example, in anthropological microarray studies, such ‘dimension reduction’ techniques are essential to elucidate any links between polymorphisms and phenotypes for given populations. In such large datasets, a subset can first be taken to represent the larger dataset, much as polling results taken during elections are used to infer the opinions of the population at large. However, what is the best and easiest method of capturing a subset of variation in a dataset that can represent the overall portrait of variation? In this article, principal components analysis (PCA) is discussed in detail, including its history, the mathematics behind the process, and the ways in which it can be applied to modern large-scale biological datasets. New methods of analysis using PCA are also suggested, with tentative results outlined.
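The basic dimension-reduction step the article describes can be sketched via the singular value decomposition of mean-centred data; the dataset here is synthetic, not microarray measurements:

```python
import numpy as np

def pca(X, n_components):
    """SVD-based PCA: return component scores for the mean-centred data
    and the fraction of total variance those components retain."""
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt[:n_components].T
    explained = (s[:n_components] ** 2).sum() / (s ** 2).sum()
    return scores, explained

# Illustrative data: 10 observed variables generated from 2 latent ones,
# so two components should capture most of the variation.
rng = np.random.default_rng(7)
latent = rng.normal(size=(200, 2))
loadings = rng.normal(size=(2, 10))
X = latent @ loadings + 0.1 * rng.normal(size=(200, 10))
scores, explained = pca(X, 2)
```

The 200 samples are thereby summarised by two score columns instead of ten variables, which is the "subset of variation" the article asks about.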


Author(s):  
Ronan de Kervenoael ◽  
Alan Hallsworth ◽  
David Tng

Geography, retailing, and power are institutionally bound up together. Within these, the authors situate their research in Clegg's work on power. Online shopping poses a growing challenge to the apparent hegemony of the traditional physical retail store format. While novel e-formats appear regularly, blogshops in Singapore are enjoying astonishing success that has taken the large retailers by surprise. Even though there are well-developed theoretical frameworks for understanding the role of institutional entrepreneurs and other major stakeholders in bringing about change and innovation, much less attention has been paid to the role of unorganized, nonstrategic actors, such as blogshops, in catalyzing retail change. The authors explore how blogshops are perceived by consumers and how they challenge the power of other shopping formats. They use principal components analysis to analyze results from a survey of 349 blogshop users. While the results show that blogshops stay true to traditional online shopping attributes, deviations occur on the concept of value. Furthermore, consumer power is counterintuitively found to be strongly present in areas related to cultural ties, excitement, and the search for individualist novelty (as opposed to mass production), thereby encouraging researchers to think critically about emerging power behavior in media practices.


Author(s):  
Pierre Meunier

Multivariate data reduction techniques, such as principal components analysis (PCA), offer the potential to simplify the task of designing and evaluating workspaces for anthropometric accommodation of the user population. Simplification occurs by reducing the number of variables one has to consider while retaining most, e.g. 89%, of the original dataset's variability. The error introduced by choosing to ignore some (11%) of the variability is examined in this paper. A set of eight design mannequins was generated using a data reduction method developed for MIL-STD-1776A. These mannequins, located on the periphery of circles encompassing 90%, 95%, and 99% of the population on two principal components, were compared with the true multivariate 90%, 95%, and 99% accommodation limits of the population. The PCA mannequins were found to include less of the population than originally intended. The degree to which the mannequins included the true percentage of the population was found to depend mainly on the size of the initial envelope (larger envelopes were closer to the true accommodation limits). The paper also discusses some of the limitations of using limited numbers of test cases to predict population accommodation.
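The accommodation shortfall reported here can be reproduced in miniature: draw a correlated "anthropometric" population (entirely synthetic, not the MIL-STD-1776A data), place a 95% circle on the first two principal components, and compare with the true multivariate 95% region:

```python
import numpy as np

# Synthetic correlated population of 5 hypothetical body measurements.
rng = np.random.default_rng(2)
d = 5
A = rng.normal(size=(d, d))
cov = A @ A.T + d * np.eye(d)
X = rng.multivariate_normal(np.zeros(d), cov, size=20000)

# Standardised scores on the first two principal components.
eigvals, eigvecs = np.linalg.eigh(np.cov(X, rowvar=False))
order = np.argsort(eigvals)[::-1]
scores = (X @ eigvecs[:, order[:2]]) / np.sqrt(eigvals[order[:2]])

# 95% circle in 2-D standard-normal PC space: P(chi2_2 <= r^2) = 0.95.
r95_sq = -2.0 * np.log(0.05)
inside_circle = (scores ** 2).sum(axis=1) <= r95_sq

# True multivariate 95% region: Mahalanobis distance over all 5
# dimensions against the chi-square(5) 95% quantile (~11.07).
md2 = np.einsum('ij,jk,ik->i', X, np.linalg.inv(cov), X)
inside_true = md2 <= 11.07

# People inside the PC circle can still be extreme on the discarded
# components, so joint coverage falls below the 95% the circle suggests.
coverage = (inside_circle & inside_true).mean()
```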


Perception ◽  
10.1068/p7267 ◽  
2012 ◽  
Vol 41 (11) ◽  
pp. 1373-1391 ◽  
Author(s):  
Anna Lindqvist ◽  
Anders Höglund ◽  
Birgitta Berglund

Twenty participants scaled similarities in odour quality, odour intensity and pleasantness/unpleasantness of 10 binary and 5 higher-order mixtures of 5 odorous degradation products from the polymer Polyamide 6.6. The perceived odour qualities of all binary mixtures were represented well as intermediary vectors relative to their component-odour vectors in a three-component principal components analysis. The odour qualities of the “floral/fruity” 2-pentylcyclopentan-1-one and the “sharp/cheese-like” pentanoic acid contributed profoundly to their binary mixtures, as did the “minty” cyclopentanone, but in fewer cases. Conversely, the “ether-like” 2-methyl pyridine and “nutty” butanamide did not contribute much. Odour similarity was shown to be caused by odour quality, rather than odour intensity. Three out of five degradation products formed distinct clusters of odours and were therefore interpreted to be profound contributors to the odour quality of the binary mixtures. The higher-order mixtures created new odour qualities which were completely different and untraceable to their various parts as perceived alone. These results demonstrate that it is critical to research the perception of natural mixtures in order to be able to understand the human olfactory code.


2014 ◽  
Vol 2014 ◽  
pp. 1-11 ◽  
Author(s):  
Mario Marchetti ◽  
Lee Chapman ◽  
Abderrahmen Khalifa ◽  
Michel Buès

Thermal mapping uses IR thermometry to measure road pavement temperature at a high resolution to identify and to map sections of the road network prone to ice occurrence. However, measurements are time-consuming and ultimately only provide a snapshot of road conditions at the time of the survey. As such, there is a need for surveys to be restricted to a series of specific climatic conditions during winter. Typically, five to six surveys are used, but it is questionable whether the full range of atmospheric conditions is adequately covered. This work investigates the role of statistics in adding value to thermal mapping data. Principal components analysis is used to interpolate between individual thermal mapping surveys to build a thermal map (or even a road surface temperature forecast), for a wider range of climatic conditions than that permitted by traditional surveys. The results indicate that when this approach is used, fewer thermal mapping surveys are actually required. Furthermore, comparisons with numerical models indicate that this approach could yield a suitable verification method for the spatial component of road weather forecasts—a key issue currently in winter road maintenance.
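The interpolation idea, expressing each survey as a score on a small number of shared spatial components, can be sketched on synthetic route-temperature profiles (illustrative only; real thermal fingerprints are more complex):

```python
import numpy as np

# Hypothetical thermal fingerprints: 8 surveys of road-surface
# temperature at 100 points along one route, sharing a spatial pattern
# whose amplitude varies with the (synthetic) weather conditions.
rng = np.random.default_rng(1)
route = np.linspace(0.0, 2.0 * np.pi, 100)
base = np.sin(route)                       # persistent spatial pattern
amps = rng.uniform(0.5, 1.5, size=8)
surveys = np.array([20.0 + 3.0 * a * base + 0.1 * rng.normal(size=100)
                    for a in amps])

# PCA of the survey matrix: one component carries the shared pattern.
mean = surveys.mean(axis=0)
U, s, Vt = np.linalg.svd(surveys - mean, full_matrices=False)
recon = mean + (U[:, :1] * s[:1]) @ Vt[:1]  # rank-1 reconstruction
rmse = np.sqrt(((recon - surveys) ** 2).mean())
```

Building a map for an unsurveyed climatic condition then amounts to choosing a new score on the retained component rather than driving the route again.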


MAKSIMUM ◽  
2012 ◽  
Vol 1 (2) ◽  
pp. 109
Author(s):  
Wahyu Manuhara Putra

Abstract. This study aimed to test whether corporate governance is associated with agency conflicts. The research used exploratory principal components analysis and canonical analysis on 6 individual governance variables to obtain 5 factors representing different dimensions of corporate governance, and measured firms' agency conflicts based on 7 agency-conflict proxies used in the literature. The analysis found that companies with greater agency conflict have better corporate governance mechanisms; in particular, a low ownership structure has a high impact on institutional ownership. Overall, the results support the theory that the existence and role of corporate governance mechanisms in a firm are a function of agency conflict in the company. Keywords: agency conflicts, corporate governance, exploratory principal components analysis, canonical analysis


1996 ◽  
Vol 82 (1) ◽  
pp. 195-198 ◽  
Author(s):  
John Maltby ◽  
Christopher Alan Lewis

The present study examined the reliability and validity of the 15-item Breskin Rigidity Scale. Two samples of UK adults (n = 216; n = 277) completed the rigidity scale alongside the Wilson-Patterson Attitude Inventory and the trait measure of the Sandler-Hazari Obsessionality Inventory. Satisfactory reliability coefficients were obtained for the rigidity scale in both samples. A principal components analysis of the 15-item rigidity scale showed that 13 items loaded on the first component and 2 on a second component. The two second-component items were removed and the reliability and validity estimates were recalculated. Comparisons were drawn between the original and amended versions. The amended 13-item scale shows improved reliability and a higher association with other measures of rigidity and rigid character.
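The effect of dropping the two off-component items on reliability can be illustrated with Cronbach's alpha on synthetic item scores (13 items sharing one trait plus 2 unrelated items; not the Breskin scale data):

```python
import numpy as np

def cronbach_alpha(X):
    """Cronbach's alpha (internal consistency) for respondents x items X."""
    k = X.shape[1]
    return (k / (k - 1)) * (1.0 - X.var(axis=0, ddof=1).sum()
                            / X.sum(axis=1).var(ddof=1))

# Synthetic scale: 13 coherent items plus 2 items unrelated to the trait.
rng = np.random.default_rng(3)
trait = rng.normal(size=(300, 1))
coherent = trait + 0.6 * rng.normal(size=(300, 13))
unrelated = rng.normal(size=(300, 2))
alpha_15 = cronbach_alpha(np.hstack([coherent, unrelated]))
alpha_13 = cronbach_alpha(coherent)  # removing off-component items helps
```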

