graph type
Recently Published Documents


TOTAL DOCUMENTS: 42 (FIVE YEARS: 19)

H-INDEX: 4 (FIVE YEARS: 1)

2022 · Vol 80 (1)
Author(s): Zainab Dawood, Naeem Majeed

Abstract Background: Almost 2.5 million neonates died in the first month of life in 2017, accounting for almost half of all deaths of children under the age of 5 years. Overall, child mortality has declined over the past two decades; comparatively, the pace of decline in neonatal mortality has remained much slower. Significant inequalities in health across several dimensions – including wealth, ethnicity, and geography – continue to exist both between and within countries, and these contribute to neonatal mortality. This study aims to quantify the magnitude of inequalities in neonatal mortality trends by wealth quintile and place of residence, segregated by province. Methods: The study used raw data from the last three Pakistan Demographic & Health Surveys (2017–18, 2012–13, and 2006–07). The concentration curves were drawn in Microsoft Excel 365 using a scatter plot as the graph type, while the frequencies were calculated using SPSS 24. Results: Inequity across provinces and between rural and urban areas has declined slightly; however, gross inequities continue to exist. Conclusions: Presenting outcome data, such as neonatal mortality across wealth quintiles, is an effective way to highlight inequities among income groups because it identifies vulnerable and at-risk populations. In other countries, rural–urban distribution or ethnic groupings may reflect similar differences and help in identifying high-risk groups.
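
The concentration curves described in the Methods can be reproduced with standard plotting tools; the sketch below draws one from hypothetical quintile shares (illustrative numbers, not PDHS data), assuming the usual construction of cumulative death share plotted against cumulative birth share ranked from poorest to richest.

```python
# Illustrative sketch (hypothetical numbers, not PDHS data): a concentration
# curve plots the cumulative share of neonatal deaths against the cumulative
# share of births ranked from poorest to richest wealth quintile.
import matplotlib.pyplot as plt

# Hypothetical share of neonatal deaths per wealth quintile (poorest -> richest)
death_shares = [0.30, 0.25, 0.20, 0.15, 0.10]
pop_shares = [0.20] * 5  # each quintile holds 20% of births

def cumulative(shares):
    total, out = 0.0, [0.0]
    for s in shares:
        total += s
        out.append(total)
    return out

x = cumulative(pop_shares)     # cumulative share of births
y = cumulative(death_shares)   # cumulative share of neonatal deaths

plt.plot(x, y, marker="o", label="concentration curve")
plt.plot([0, 1], [0, 1], linestyle="--", label="line of equality")
plt.xlabel("Cumulative share of births (poorest to richest)")
plt.ylabel("Cumulative share of neonatal deaths")
plt.legend()
plt.show()
```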


2021 · Vol 12
Author(s): Fang Zhao, Robert Gaschler

Different graph types might support group comparisons differently due to differences in their underlying graph schemas. Thus, this study examined whether graph schemas are based on perceptual features (i.e., each graph type has a specific schema) or common invariant structures (i.e., graph types share several common schemas), and which graph type (bar vs. dot vs. tally) is best for comparing discrete groups. Three experiments were conducted using the mixing-costs paradigm. Participants received graphs with quantities for three groups in randomized positions and were given the task of comparing two groups. The results suggested that graph schemas are based on a common invariant structure: tally charts mixed either with bar graphs or with dot graphs showed mixing costs, yet bar and dot graphs showed no mixing costs when paired together. Tally charts were a more efficient format for group comparison than bar graphs. Moreover, processing time increased as the position difference between the compared groups increased.
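
The mixing-costs measure used in these experiments is, at its core, a difference in mean response time between mixed-format and pure-format blocks; a minimal sketch of that computation with made-up response times (not the authors' data or analysis code) follows.

```python
# Minimal sketch of a mixing-cost computation (illustrative numbers only):
# mixing cost = mean RT in blocks mixing two graph formats
#             - mean RT in pure blocks of a single format.
from statistics import mean

pure_block_rts = [812, 790, 805, 798, 820]     # ms, single graph format
mixed_block_rts = [955, 940, 968, 950, 972]    # ms, two formats interleaved

mixing_cost = mean(mixed_block_rts) - mean(pure_block_rts)
print(f"Mixing cost: {mixing_cost:.0f} ms")
```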


2021 · Vol 11 (24) · pp. 12135
Author(s): László Beinrohr, Eszter Kail, Péter Piros, Erzsébet Tóth, Rita Fleiner, ...

Data science and machine learning are buzzwords of the early 21st century. Now that these concepts pervade human civilization, how do they translate to use by researchers and clinicians in the life-science and medical fields? Here, we describe a software toolkit that is just large enough in scale to be maintained and extended by a small team, and optimised for problems that arise in small and medium laboratories. In particular, the system may be managed by a single person, from data ingestion through statistics preparation to predictions. At the system’s core is a graph-type database, making it flexible with respect to irregular, constantly changing data types, as such data are common during explorative research. At the system’s outermost shell, the concept of ’user stories’ is introduced to help end-user researchers perform various tasks separated by their expertise: these range from simple data input, data curation, and statistics to predictions via machine learning algorithms. We compiled a sizable list of existing, modular Python libraries usable for data analysis, which may serve as a reference in the field and may be incorporated into this software. We also provide an insight into basic concepts, such as labelled vs. unlabelled data, supervised vs. unsupervised learning, regression vs. classification, and evaluation by different error metrics, as well as the more advanced concept of cross-validation. Finally, we show some examples from our laboratory using blood sample and blood clot data from thrombosis patients (sufferers of stroke, cardiac, and peripheral thrombotic disease) and how such tools can help set realistic expectations and expose caveats.
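
The supervised-learning and cross-validation concepts surveyed here can be illustrated with scikit-learn, one of the modular Python libraries of the kind the authors list; the snippet below is a generic sketch, not code from the toolkit described in the paper.

```python
# Generic illustration of supervised classification with cross-validation and
# an error metric, using scikit-learn; not code from the toolkit described above.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

X, y = load_breast_cancer(return_X_y=True)   # labelled data
model = LogisticRegression(max_iter=5000)    # a classification model

# 5-fold cross-validation: train on 4/5 of the data, evaluate on the held-out 1/5.
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print("Fold accuracies:", scores.round(3))
print("Mean accuracy:", round(scores.mean(), 3))
```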


2021
Author(s): Robert Claude Meffan, Julian Menges, Fabian Dolamore, Daniel Mak, Conan Fee, ...

A novel capillary-action microfluidic viscometer has been designed that can measure the relative viscosity of a sample compared to a control liquid. Using capillary-action circuits, the viscosity of a sample is transformed into a microfluidic bar-graph format without the use of external instrumentation. The bars in this case are represented by the distance that a liquid has flowed through a microfluidic channel, relative to another liquid in an identical channel. As the device does not require external instrumentation, its use is targeted at point-of-care (PoC) situations. This implementation is made practical through capillaric field-effect transistors and the conditional flow paths they enable. In this paper, we report on the design, operation, and performance of a two-channel version of the viscometer based exclusively on capillary-action circuits. Using polyethylene glycol solutions as viscous samples, we demonstrate that the device can transduce the relative viscosity consistently to within 2%. Enabled by the flexibility of the capillary-action circuits, we additionally present a modified device that can measure transparent liquids without the need to add colorants to the sample. The forms of the device presented in this work have applications in both medical care and scientific measurement, particularly for PoC measurements.
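
How a relative viscosity can be read off such a distance-based bar-graph readout can be illustrated with a Lucas-Washburn-style approximation, in which the distance travelled by capillary flow in a fixed time scales with 1/sqrt(viscosity); the sketch below rests on that assumption and is not the authors' calibration model.

```python
# Illustrative only: under a Lucas-Washburn-type approximation, the distance L
# travelled by capillary flow in a fixed time scales as 1/sqrt(viscosity), so
# eta_sample / eta_reference ~= (L_reference / L_sample) ** 2.
# This is an assumption for illustration, not the device's calibration model.

def relative_viscosity(sample_distance_mm: float, reference_distance_mm: float) -> float:
    """Estimate sample viscosity relative to the reference liquid."""
    return (reference_distance_mm / sample_distance_mm) ** 2

# Example: the sample advanced 18 mm while the reference advanced 24 mm.
print(f"Relative viscosity ~ {relative_viscosity(18.0, 24.0):.2f}")
```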


Author(s): R. Lee Lyman

Given explicit recognition between ~1915 and the 1930s that certain artifact types display unimodal frequency distributions over time, archaeologists initially presented tables of those frequencies but by the 1930s were experimenting with different types of graphs to present visual images of culture change. The lack of familiarity with graph theory and graph grammar meant numerous kinds of graph were published, often only once each as researchers sought effective (readily deciphered) and efficient (minimal ink and space) graph forms. These experimental graph types range from fairly simplistic to complex and virtually indecipherable. Lack of decipherability and errors in some graphs reflect poor understanding of the principles of graph construction and the precise nature of what a graph type is meant to illustrate. The analytical focus on culture history and recognition that artifact form varied along both time and geographic space led to some efforts to incorporate all three dimensions—form, time, space—into some graphs. It is not surprising that in the search for a useful graph type, the one-off graphs variously implied transformational, variational, and a combination of variational and transformational evolutionary change.


Author(s): R. Lee Lyman

Close examination of James A. Ford’s self-reported 1952 history of how he developed the centered and stacked bars style of spindle graph for which he is famous indicates he likely invented this kind of spindle graph with a bit of assistance from his colleagues George Quimby and Gordon Willey. In the 1930s, his diagrams of culture change were spatio-temporal rectangles or bar graphs; his first centered and stacked bars spindle diagram appeared in the 1949 published version of his doctoral dissertation. That graph style was picked up by American Southwest archaeologist Paul S(ydney) Martin that same year; Martin had, like many of his colleagues, initially used line graphs and bar graphs to illustrate culture change. Subsequently, numerous individuals adopted Ford’s centered and stacked bars form of spindle diagram. During the 1950s in Europe, French Paleolithic archaeologist François Bordes adopted ogive or cumulative relative frequency curves as a graphic means to compare assemblages of lithic tools. Quickly adopted by many European archaeologists, this graph type was only occasionally used in North America. After Ford, most graphs diagramed variational evolutionary change.


Author(s): R. Lee Lyman

Graphs are analytical tools and communication tools, and they summarize visually what has been learned. Granting that a major purpose of archaeology is to document and explain culture change, it is odd that the hows and whys of graphing culture change have received minimal attention in the archaeology literature. Spindle graphs will likely continue to be the most frequently used graph type for diagraming change, but continued development of computer software may result in new graph types and styles. Recent modifications to spindle graphs include scaling bar thickness to temporal duration of the represented assemblage. Classic data on temporal change in kaolin pipe stem hole diameters can be graphed using a regression line, a bar graph, and a spindle graph; the different graphs highlight that how phenomena are classified, how data are graphed, and one’s theory of change are mutually influential. Deciding which graph type to use in any particular situation will depend on what the researcher hopes to illustrate, along with the goal to produce a readily deciphered graph. The majority of archaeological graphs that appeared in the twentieth century depict variational evolution. Once developed in the late 1940s, spindle graphs quickly became the graph type preferred by North American archaeologists. There is weak circumstantial evidence archaeologists may have borrowed the idea of spindle graphs from paleontology, but it seems more likely the idea was stumbled upon by early archaeologists who perceived unimodal pulses in artifact frequencies over time and developed general models of those pulses.
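
A centered-bar spindle (battleship) graph of the kind discussed can be drawn with standard plotting tools by centering horizontal bars of relative frequency on a vertical time axis; the sketch below uses invented frequencies purely for illustration.

```python
# Sketch of a centered-bar spindle ("battleship") graph with invented data;
# each bar's width is the relative frequency of one artifact type in one period.
import matplotlib.pyplot as plt

periods = ["1620-1650", "1650-1680", "1680-1710", "1710-1740", "1740-1770"]
rel_freq = [0.05, 0.20, 0.45, 0.25, 0.05]   # hypothetical unimodal pulse

y = range(len(periods))
# Center each horizontal bar on x = 0 by starting it at -width/2.
plt.barh(y, rel_freq, left=[-w / 2 for w in rel_freq], height=0.6)
plt.yticks(y, periods)   # earliest period at the bottom by default
plt.xlabel("Relative frequency")
plt.title("Hypothetical spindle graph of one artifact type")
plt.show()
```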


2021 · Vol 29 (1) · pp. 35-44
Author(s): Irina A. Belskikh, Alina I. Belogurova

Aim. This study examines the parameters of the psychomotor components of nervous system (NS) types of personality across different styles of cognitive functioning. Materials and Methods. One hundred medical university students (mean age 22.01 ± 1.84 years; 15 males and 85 females) were examined. Experimental research methods: 1. An express method for determining the properties of the NS: a computer modification of E.P. Ilyin's tapping test, the Psychomotor Test of the NS (Neurosoft, Ivanovo), with criteria of strength, endurance, and lability of nervous processes in relation to the intensity of work. 2. Assessment of cognitive functioning: a method of discriminating the properties of concepts (the cognitive style of concrete/abstract conceptualization). Results. All participants were assigned to four poles of cognitive style: (1) abstract subjectivity of conceptualization (4.9%); (2) abstract realism of conceptualization (10.1%); (3) concrete subjectivity of conceptualization (9.5%); (4) concrete realism of conceptualization (5.3%). In the studied group, a descending graph of movement speed dominated (61%), which corresponds to a weak type of NS; 10% of participants had a strong type of NS, characterized by a convex graph type; a flat type, indicating a medium-strength NS, was identified in 14% of participants; intermediate and concave types were diagnosed in 15% of participants, corresponding to a moderately weak type of NS. Statistical analysis of the psychomotor NS parameters in participant groups with different poles of conceptualization, according to E.P. Ilyin's criteria, showed an interrelationship between a strong type of NS and subjective concrete conceptualization, and between an expressed style of realistic abstractness and a weak type of NS. Conclusion. The maximum frequency in the tapping test, as a parameter of the speed aspect of psychomotor activity, allows this criterion to be used to assess overall activity and the stylistic peculiarities of cognitive activity expressed in the different types of conceptualization. Peculiarities of cognitive activity expressed in increased subjectiveness of conceptualization correlate with enhanced functional mobility of cortical processes, increased information-processing speed, and the effectiveness of integrative brain activity.
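
The curve-shape categories above (descending, convex, flat, intermediate, concave) reflect how the tap rate in later intervals compares with the rate in the first interval; the rough heuristic below is only an illustrative sketch of that idea, not the classification rule implemented in the Neurosoft software.

```python
# Rough illustrative heuristic (not the actual Neurosoft classifier): classify
# a tapping-test curve by comparing tap counts in later intervals with the
# count in the first interval.

def classify_tapping_curve(taps_per_interval: list[int]) -> str:
    first = taps_per_interval[0]
    diffs = [t - first for t in taps_per_interval[1:]]
    if max(diffs) > 2:                      # rate rises above the initial level
        return "convex (suggests a strong NS)"
    if all(abs(d) <= 2 for d in diffs):     # rate stays near the initial level
        return "flat (suggests a medium-strength NS)"
    if all(d < 0 for d in diffs):           # rate falls steadily
        return "descending (suggests a weak NS)"
    return "intermediate/concave (suggests a moderately weak NS)"

print(classify_tapping_curve([35, 39, 36, 33, 31, 30]))  # illustrative tap counts
```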


2021 · pp. 002224372110021
Author(s): Junghan Kim, Arun Lakshmanan

This article shows that animated display of time-varying data (e.g., stock or commodity prices) inflates risk judgments. We outline a process whereby animated display enhances the visual salience of transitions in a trajectory (i.e., successive changes in data values), which leads to transitions being used more to form cognitive inferences about risk, which in turn leads to inflated risk judgments. The studies reported in this article provide converging evidence via eye-tracking (Study 1), serial mediation analyses (Studies 2 and 3), and experimental manipulations of the process factors, transition salience (graph type; Study 3) and utilization of transitions (global trend, Study 4; investment goals, Study 5), and in the process outline boundary conditions. We also demonstrate the effect of animated display on consequential investment decisions and behavior. This paper adds to the literature on salience effects by disambiguating the role of inference-making in how the salience of stimuli biases judgments. Broader implications for visual information processing, data visualization, financial decision-making, and public policy are also discussed.
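
An animated display of a time-varying trajectory of the kind studied here can be reproduced with standard plotting libraries; the sketch below reveals a made-up price series one point per frame (an illustrative setup, not the stimuli used in the studies).

```python
# Illustrative animated display of a made-up price series: each frame reveals
# one more data point, making the transitions between values visually salient.
# This is a generic sketch, not the stimulus code used in the studies.
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

prices = [100, 103, 99, 104, 108, 102, 107, 111, 106, 112]
fig, ax = plt.subplots()
line, = ax.plot([], [], marker="o")
ax.set_xlim(0, len(prices) - 1)
ax.set_ylim(min(prices) - 2, max(prices) + 2)
ax.set_xlabel("Time")
ax.set_ylabel("Price")

def update(frame):
    line.set_data(range(frame + 1), prices[:frame + 1])
    return (line,)

anim = FuncAnimation(fig, update, frames=len(prices), interval=400, blit=True)
plt.show()
```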


Author(s): Stephanie M. Gardner, Elizabeth Suazo-Flores, Susan Maruca, Joel K. Abraham, Anupriya Karippadath, ...

Abstract Graphing is an important practice for scientists and in K-16 science curricula. Graphs can be constructed using an array of software packages as well as by hand, with pen and paper. However, we have an incomplete understanding of how students' graphing practices vary by graphing environment; differences could affect how best to teach and assess graphing. Here we explore the role of two graphing environments in students' graphing practice. We studied the graphing practice of 43 undergraduate biology students using either pen-and-paper (PP) (n = 21 students) or the digital graphing tool GraphSmarts (GS) (n = 22 students). Participants' graphs and verbal justifications were analyzed to identify features such as the variables plotted, the number of graphs created, raw versus summarized data plotted, and graph types (e.g., scatter plot, line graph, or bar graph), as well as participants' reasoning for their graphing choices. Several aspects of participants' graphs were similar regardless of graphing environment, including plotting raw vs. summarized data, graph type, and overall graph quality, while GS participants were more likely to plot the most relevant variables. In GS, participants could make more graphs more easily than in PP, and this may have helped some participants reveal latent features of their graphing practice. Students using PP tended to focus more on the ease of constructing the graph than those using GS. This study illuminates how the characteristics of the graphing environment have implications for instruction and for the interpretation of assessments of student graphing practices.
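
The raw-versus-summarized distinction in the coding scheme can be illustrated with a short plotting sketch using hypothetical measurements (not GraphSmarts output or the study's data).

```python
# Illustration of the raw-vs-summarized distinction coded in the study
# (hypothetical measurements; not GraphSmarts output or the study's data):
# left panel plots every raw observation, right panel plots group means.
import matplotlib.pyplot as plt
from statistics import mean

groups = {"control": [4.1, 4.8, 5.0, 4.4, 4.6],
          "treated": [6.2, 5.9, 6.5, 6.1, 6.4]}

fig, (ax_raw, ax_summary) = plt.subplots(1, 2, figsize=(8, 3))

# Left panel: every raw observation as a scatter plot.
for i, (name, values) in enumerate(groups.items()):
    ax_raw.scatter([i] * len(values), values, label=name)
ax_raw.set_xticks(range(len(groups)))
ax_raw.set_xticklabels(list(groups))
ax_raw.set_title("Raw data")

# Right panel: one bar per group mean (summarized data).
ax_summary.bar(list(groups), [mean(v) for v in groups.values()])
ax_summary.set_title("Group means")

plt.tight_layout()
plt.show()
```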

