Describing Quantitative Data with Frequency Distributions

2013 ◽  
pp. 49-75
Author(s):  
R. Lee Lyman

Graphs of quantitative data are analytical tools that facilitate visual thinking. In many disciplines, the use of graphs was preceded by tables summarizing quantitative data. Graphs known by North American archaeologists as “battleship curves” are temporal frequency distributions of relative abundances of specimens in each of several artifact types. They are unimodal frequency distributions known as spindle graphs. In the early 1950s, it was suggested that the idea of spindle graphs was borrowed by archaeologists from paleontology. Archaeologists occasionally used bar graphs and line graphs to diagram change in artifact inventories in the early twentieth century. The questions addressed in this volume are: (i) did North American archaeologists borrow the idea of spindle graphs from paleontology, and (ii) what was the frequency of use by North American archaeologists of each of the various graph types to diagram culture change during the early and middle twentieth century?


Author(s):  
Paul Cleary ◽  
Sam Ghebrehewet ◽  
David Baxter

This chapter provides a grounding in basic statistics, descriptive epidemiology, analytical epidemiology, and hypothesis testing appropriate for health protection practitioners. The analysis of categorical data using frequency distributions and charts, and the interpretation of epidemic curves, are described. The description of quantitative data, including measures of central tendency, the standard deviation, and the interquartile range, is concisely explained. The role of geographical information systems and different disease map types is used to demonstrate how disease clusters may be detected. Determining possible associations between specific risk factors and outcomes is described in the section on analytical epidemiology, using the risk ratio and the odds ratio, and the use of these in different study/investigation types is explained. The importance of confounding, matching, and standardization in study design is described. The final part of the chapter covers hypothesis testing to distinguish real differences from chance variation, and the use of confidence intervals.


Author(s):  
L.E. Murr

Ledges in grain boundaries can be identified by their characteristic contrast features (straight, black-white lines), which are distinct from those of lattice dislocations1,2 [see Fig. 1(a) and (b)]. Simple contrast rules, as pointed out by Murr and Venkatesh2, can be established so that ledges may be recognized with some confidence, and the number of ledges per unit length of grain boundary (referred to as the ledge density, m) measured by direct observation in the transmission electron microscope. Such measurements yield quantitative data that can be used to provide evidence for the influence of ledges on the physical and mechanical properties of materials. It has been shown that ledge density can be systematically altered in some metals by thermo-mechanical treatment3,4.
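As a minimal sketch of the quantity defined above (function name and units are illustrative assumptions, not from the source), the ledge density m is simply the number of ledges counted along a measured length of grain boundary:

```python
def ledge_density(n_ledges: int, boundary_length_um: float) -> float:
    """Ledge density m: ledges counted per unit length of grain boundary.

    n_ledges: number of ledges identified by their black-white contrast in TEM.
    boundary_length_um: measured grain-boundary trace length in micrometres.
    """
    if boundary_length_um <= 0:
        raise ValueError("boundary length must be positive")
    return n_ledges / boundary_length_um

# e.g. 12 ledges along a 3 um boundary segment -> m = 4.0 ledges/um
m = ledge_density(12, 3.0)
```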


Author(s):  
M.E. Rosenfeld ◽  
C. Karboski ◽  
M.F. Prescott ◽  
P. Goodwin ◽  
R. Ross

Previous research documenting the chronology of the cellular interactions that occur on or below the surface of the endothelium during the initiation and progression of arterial lesions has consisted primarily of descriptive studies. The recent development of lower-cost image analysis hardware and software has facilitated the collection of high-resolution quantitative data from microscopic images. In this report we present preliminary quantitative data on the sequence of cellular interactions that occur on the endothelium during the initiation of atherosclerosis or vasculitis, utilizing digital analysis of images obtained directly from the scanning electron microscope. Segments of both atherosclerotic and normal arteries were obtained from either diet-induced or endogenously (WHHL) hypercholesterolemic rabbits after 1-4 months of hypercholesterolemia, and from age-matched control rabbits. Vasculitis was induced in rats by placement of an endotoxin-soaked thread adjacent to the adventitial surface of arteries.


Author(s):  
Manoj Raje ◽  
Karvita B. Ahluwalia

In acute lymphocytic leukemia, the motility of lymphocytes is associated with the dissemination of malignancy and the establishment of metastatic foci. Normal and leukemic lymphocytes in circulation reach solid tissues where, owing to inadequate perfusion, some cells become trapped in tissue spaces. Although normal lymphocytes reenter the circulation, leukemic lymphocytes are thought to remain entrapped, owing to reduced motility, and to form secondary metastases. The cell surface, transmembrane interactions, the cytoskeleton, and the level of cell differentiation are implicated in lymphocyte motility. An attempt has been made to correlate ultrastructural information with quantitative data obtained by laser Doppler velocimetry (LDV). TEM of normal and leukemic lymphocytes revealed heterogeneity in the cell populations, ranging from well differentiated (Fig. 1) to poorly differentiated cells (Fig. 2). Unlike in other cells, surface extensions in differentiated lymphocytes appear to originate by extrusion of large vesicles into the extracellular space (Fig. 3). This results in a persistent unevenness of the lymphocyte surface, which arises from a phenomenon different from that producing surface extensions in other cells.


1963 ◽  
Vol 03 (02) ◽  
pp. 175-182 ◽  
Author(s):  
Bo Bergman ◽  
Rune Söremark

Summary By means of neutron activation and gamma-ray spectrometry, the concentrations of the following elements in the human mandibular articular disc have been determined: Na, Mn, Cu, Zn, Rb, Sr, Cd, W, and Au. The discs were obtained at necropsy from seven men and nine women, ranging in age from 56 to 71 years. The activation was carried out in a thermal neutron flux of about 1.7 × 10¹² neutrons × cm⁻² × sec⁻¹ for about 20 hours. A chemical group separation was performed before the gamma-ray spectrometry. Quantitative data based on the dry weight of the cartilage samples were obtained by comparing the photo-peak areas of the identified elements with those of appropriate standards.
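The comparison of photo-peak areas against standards described above can be sketched as a simple proportionality. This is a hedged illustration only; the function name, arguments, and the assumption that sample and standard were co-irradiated identically are mine, not the paper's:

```python
def concentration_by_comparator(peak_area_sample: float,
                                peak_area_standard: float,
                                standard_mass_ug: float,
                                sample_dry_weight_g: float) -> float:
    """Comparator-method sketch: the element mass in the sample is taken as
    proportional to the ratio of its photo-peak area to that of a standard
    of known content, then normalised to the sample's dry weight."""
    element_mass_ug = standard_mass_ug * peak_area_sample / peak_area_standard
    return element_mass_ug / sample_dry_weight_g  # ug element per g dry weight

# Half the standard's peak area with a 2 ug standard implies 1 ug in the
# sample; over 0.5 g dry weight that is 2.0 ug/g.
c = concentration_by_comparator(500.0, 1000.0, 2.0, 0.5)
```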


2005 ◽  
Vol 44 (01) ◽  
pp. 29-32 ◽  
Author(s):  
I. Garai ◽  
J. Varga ◽  
G. Szücs ◽  
Z. Galajda ◽  
C. András ◽  
...  

Summary Aim: We investigated the circulatory characteristics of patients suffering from primary and secondary Raynaud’s syndrome. Patients, methods: We examined 106 patients presenting with the classical symptoms of Raynaud’s syndrome (47 primary, 59 secondary) by hand perfusion scintigraphy developed by our Department of Nuclear Medicine. After visual evaluation we analyzed the images semiquantitatively, using the finger-to-palm ratio, and statistically compared the patients with primary and those with secondary Raynaud’s syndrome. Results: On visual evaluation we found regional perfusion disturbances in 42 of 59 patients with secondary Raynaud’s syndrome, but in only 3 of 47 patients with the primary form of the disease. This difference was statistically significant (p <0.001). Semiquantitative analysis showed that the finger/palm ratios (FPR) were significantly lower (p <0.05) for the patients with primary Raynaud’s syndrome. There were no differences in FPR values with respect to sex or between the right and left sides. Conclusion: Hand perfusion scintigraphy with 99mTc-DTPA is a noninvasive, cost-effective diagnostic tool that objectively reflects the global and regional microcirculatory abnormalities of the hands and provides quantitative data for follow-up.
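The finger-to-palm ratio used above reduces to dividing tracer counts over a finger region of interest by counts over the palm region. A minimal sketch, with hypothetical names (the paper does not specify its ROI or counting conventions):

```python
def finger_to_palm_ratio(finger_counts: float, palm_counts: float) -> float:
    """Semiquantitative index from hand perfusion scintigraphy: tracer
    counts in a finger ROI divided by counts in the palm ROI. Lower values
    suggest relatively reduced finger perfusion."""
    if palm_counts <= 0:
        raise ValueError("palm counts must be positive")
    return finger_counts / palm_counts

# A finger ROI with half the palm's counts gives FPR = 0.5
fpr = finger_to_palm_ratio(1500.0, 3000.0)
```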


2018 ◽  
Vol 4 (2) ◽  
pp. 43-55
Author(s):  
Ika Yulianti ◽  
Endah Masrunik ◽  
Anam Miftakhul Huda ◽  
Diana Elvianita

This study aims to compare the calculation of the cost of goods manufactured at CV Mitra Setia Blitar using the company's own method and using the Job Order Costing (JOC) method. The method used in this study is quantitative. The types of data used are quantitative and qualitative: the quantitative data take the form of map production cost data, while the qualitative data take the form of information about the map production process. Calculating the cost of production of the map by the two methods yields a difference of Rp 306 per unit, the company method being more expensive than the Job Order Costing method. The cost of goods manufactured using the company method is Rp 2,205,000, or Rp 2,205 per unit, while using the Job Order Costing (JOC) method it is Rp 1,899,000, or Rp 1,899 per unit. The appropriate method for calculating the cost of production is therefore the Job Order Costing (JOC) method.
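The arithmetic behind the comparison can be checked directly from the figures in the abstract. The unit count of 1,000 is inferred from the total and per-unit figures, not stated in the source:

```python
# Figures taken from the abstract; UNITS = 1000 is implied by
# Rp 2,205,000 total vs Rp 2,205 per unit.
UNITS = 1000

company_total = 2_205_000  # Rp, company's own costing method
joc_total = 1_899_000      # Rp, Job Order Costing (JOC) method

company_per_unit = company_total // UNITS          # Rp 2,205 per unit
joc_per_unit = joc_total // UNITS                  # Rp 1,899 per unit
difference_per_unit = company_per_unit - joc_per_unit  # Rp 306 per unit
```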


2019 ◽  
pp. 91-106 ◽  
Author(s):  
Rostislav I. Kapeliushnikov

Using published estimates of inequality for two countries (Russia and the USA), the paper demonstrates that the measurement of inequality remains in a state of “statistical cacophony”. Under this condition, it seems at least untimely to pass categorical normative judgments and offer radical policy advice to governments. Moreover, the very practice of drawing normative conclusions from quantitative data is ethically invalid, since ordinary people (non-intellectuals) tend to evaluate wealth and incomes as admissible or inadmissible not on the basis of their size but on whether they were obtained in observance or violation of the rules of “fair play”. The paper concludes that the current large-scale ideological campaign of “struggle against inequality” has been unleashed by left-wing intellectuals in order to strengthen still further their discursive power over the public.


2018 ◽  
Vol 2 (1) ◽  
pp. 45-54
Author(s):  
Qurotul Aini ◽  
Siti Ria Zuliana ◽  
Nuke Puji Lestari Santoso

A scale is usually used to check and determine the value of a qualitative factor in quantitative terms. A measurement scale is an agreed convention used as a reference to determine the length of the intervals in a measuring instrument, so that the instrument, when used, produces quantitative data. The results of scale calculations must be interpreted carefully because, in addition to yielding only a rough picture, respondents' answers cannot simply be taken at face value. Types of measurement scales include the Likert scale, Guttman scale, semantic differential scale, rating scale, Thurstone scale, Bogardus scale, and various others. One of the most difficult tasks for information technology researchers faced with the necessity of measuring variables is finding direction amid the many existing measures. If a good measure already exists for a particular variable, there is little reason to construct a new one. Keywords: Scale, Measurement, Variables.
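The conversion of qualitative answers into quantitative data that the abstract describes can be sketched for the Likert scale it lists first. The five verbal anchors and their 1-5 scores below are a common convention, assumed here for illustration, not taken from the source:

```python
# Hedged sketch: turning qualitative Likert responses into quantitative data.
# The five-point labels and their scores are illustrative assumptions.
LIKERT = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

def score_responses(responses):
    """Map each verbal answer to its scale value and return the mean score."""
    values = [LIKERT[r.lower()] for r in responses]
    return sum(values) / len(values)

mean_score = score_responses(["agree", "neutral", "strongly agree"])  # (4+3+5)/3
```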

