Preprint of Graph Construction: Setting the Range of the Y-Axis by Jessica K. Witt (under review at Meta-Psychology)

2018 ◽  
Author(s):  
Jessica K. Witt

Graphs are an effective and compelling way to present scientific results. With few rigid guidelines, researchers have many degrees of freedom regarding graph construction. One such choice is the range of the y-axis. A range set just beyond the data will bias readers to see all effects as big. Conversely, a range set to the full range of options will bias readers to see all effects as small. Researchers should maximize congruence between the visual size of an effect and its actual size. To achieve congruency in scientific fields for which effects are standardized, the y-axis range should be a function of the standard deviation. This improved graph comprehension by increasing sensitivity and reducing bias relative to the other options for the y-axis range.

2019 ◽  
Vol 3 ◽  
Author(s):  
Jessica K. Witt

Graphs are an effective and compelling way to present scientific results. With few rigid guidelines, researchers have many degrees of freedom regarding graph construction. One such choice is the range of the y-axis. A range set just beyond the data will bias readers to see all effects as big. Conversely, a range set to the full range of options will bias readers to see all effects as small. Researchers should maximize congruence between the visual size of an effect and its actual size. In the experiments presented here, participants viewed graphs with the y-axis set to the minimum range required for all the data to be visible, the full range from 0 to 100, or a range of approximately 1.5 standard deviations. The results showed that participants' sensitivity to the effect depicted in the graph was better when the y-axis range was between one and two standard deviations than with either the minimum range or the full range. In addition, bias was also smaller with the standardized axis range than with the minimum or full axis ranges. To achieve congruency in scientific fields for which effects are standardized, the y-axis range should be no less than 1 standard deviation, and ideally at least 1.5 standard deviations.
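The recommendation above can be sketched as a small helper that sets y-axis limits from the standard deviation of the plotted values. This is a minimal illustration of the idea, not code from the paper; the function name and interface are our own, and it assumes the relevant SD is that of the values being plotted.

```python
import statistics

def y_axis_limits(values, n_sds=1.5):
    """Y-axis limits centred on the mean with a total span of n_sds
    standard deviations, widened if needed so no point is clipped.
    A sketch of the paper's recommendation; helper is hypothetical."""
    m = statistics.mean(values)
    s = statistics.stdev(values)
    half = n_sds * s / 2
    lo, hi = m - half, m + half
    # Never clip data: expand the limits if the span is too narrow.
    return min(lo, min(values)), max(hi, max(values))

# Example: condition means on a 0-100 response scale
scores = [52.0, 55.0, 54.0, 57.0]
lo, hi = y_axis_limits(scores)
```

The returned pair could then be passed to a plotting call such as matplotlib's `set_ylim`; the point is only that the span is tied to the SD rather than to the data extremes or to the full 0-100 scale.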


2019 ◽  
Vol 20 (3) ◽  
pp. 285-295
Author(s):  
Chen Ling ◽  
Yuanhui Zhang ◽  
Jun Li ◽  
Wenli Chen ◽  
Changquan Ling

Traditional Chinese Medicine (TCM) has been practiced in China for thousands of years. As a complementary and alternative treatment, the herbal medicines frequently used in TCM are the most widely accepted in the Western world. However, animal materials, which are equally important in TCM practice, are not well known in other countries. Chinese doctors documented the toxic profiles of hundreds of animals and plants thousands of years ago. Moreover, they saw the potential benefits of these materials and used their toxic properties to treat a wide variety of diseases, such as severe pain and cancer. Since the 1950s, efforts by the Chinese government and scientific societies to modernize TCM have achieved substantial results in both the laboratory and the clinic. A number of toxic proteins have been isolated and their functions identified. Although most of this literature is written in Chinese, this review provides a summary, in English, of current knowledge regarding the clinical use of the toxic proteins isolated from a plant, Tian Hua Fen, and an animal, the scorpion, both of which are famous toxic prescriptions in TCM.


1. It is widely felt that any method of rejecting observations with large deviations from the mean is open to some suspicion. Suppose that by some criterion, such as Peirce’s and Chauvenet’s, we decide to reject observations with deviations greater than 4σ, where σ is the standard error, computed from the standard deviation by the usual rule; then we reject an observation deviating by 4·5σ, and thereby alter the mean by about 4·5σ/n, where n is the number of observations, and at the same time we reduce the computed standard error. This may lead to the rejection of another observation deviating from the original mean by less than 4σ, and if the process is repeated the mean may be shifted so much as to lead to doubt as to whether it is really sufficiently representative of the observations. In many cases, where we suspect that some abnormal cause has affected a fraction of the observations, there is a legitimate doubt as to whether it has affected a particular observation. Suppose that we have 50 observations. Then there is an even chance, according to the normal law, of a deviation exceeding 2·33σ. But a deviation of 3σ or more is not impossible, and if we make a mistake in rejecting it the mean of the remainder is not the most probable value. On the other hand, an observation deviating by only 2σ may be affected by an abnormal cause of error, and then we should err in retaining it, even though no existing rule will instruct us to reject such an observation. It seems clear that the probability that a given observation has been affected by an abnormal cause of error is a continuous function of the deviation; it is never certain or impossible that it has been so affected, and a process that completely rejects certain observations, while retaining with full weight others with comparable deviations, possibly in the opposite direction, is unsatisfactory in principle.
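The iterative rejection procedure criticized above is easy to simulate. The sketch below uses the sample standard deviation directly (a Chauvenet-style cutoff at k·SD) rather than the standard-error variant discussed in the text, and the data are invented for illustration; it simply shows that rejecting and recomputing can cascade.

```python
import statistics

def reject_iteratively(obs, k=4.0):
    """Repeatedly discard observations deviating by more than k sample
    standard deviations from the current mean, recomputing mean and SD
    after each pass. Each rejection shifts the mean and shrinks the SD,
    which can condemn further observations -- the cascade the passage
    warns about."""
    obs = list(obs)
    while True:
        m = statistics.mean(obs)
        s = statistics.stdev(obs)
        kept = [x for x in obs if abs(x - m) <= k * s]
        if len(kept) == len(obs):
            return kept
        obs = kept

# 20 well-behaved readings plus one grossly deviant one (made-up data)
readings = [-1.0, -0.5, 0.0, 0.5, 1.0] * 4 + [50.0]
kept = reject_iteratively(readings)
```

On this example the outlier is dropped on the first pass and the procedure then stabilizes; with less extreme contamination the same loop can keep shifting the mean pass after pass, which is precisely the objection raised.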


Algorithms ◽  
2021 ◽  
Vol 14 (7) ◽  
pp. 197
Author(s):  
Ali Seman ◽  
Azizian Mohd Sapawi

In the conventional k-means framework, seeding is the first step toward optimization before the objects are clustered. With random seeding, two main issues arise: the clustering results may be less than optimal, and different clustering results may be obtained on every run. In real-world applications, optimal and stable clustering is highly desirable. This report introduces a new clustering algorithm called the zero k-approximate modal haplotype (Zk-AMH) algorithm, which uses a simple and novel seeding mechanism known as zero-point multidimensional spaces. The Zk-AMH provides cluster optimality and stability, thereby resolving the aforementioned issues. Notably, the Zk-AMH algorithm yielded identical mean, maximum, and minimum scores across 100 runs, with zero standard deviation, demonstrating its stability. Additionally, when the Zk-AMH algorithm was applied to eight datasets, it achieved the highest mean scores on four datasets, an approximately equal score on one dataset, and marginally lower scores on the other three. With its optimality and stability, the Zk-AMH algorithm could be a suitable alternative for developing future clustering tools.
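The stability claim (zero standard deviation across 100 runs) follows from any fully deterministic seeding. The sketch below is not the Zk-AMH algorithm, whose zero-point mechanism the abstract describes only at a high level; it is a plain k-means toy showing the contrast the abstract draws: random seeds can vary run to run, while a fixed deterministic seed reproduces the same result every time.

```python
import random
import statistics

def kmeans_inertia(points, k, seeds, iters=20):
    """Plain Lloyd's k-means from the given seed centers; returns the
    final within-cluster sum of squared distances (inertia)."""
    centers = [list(s) for s in seeds]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            groups[j].append(p)
        for j, g in enumerate(groups):
            if g:  # keep an empty cluster's center where it was
                centers[j] = [sum(col) / len(g) for col in zip(*g)]
    return sum(min(sum((a - b) ** 2 for a, b in zip(p, c)) for c in centers)
               for p in points)

# Toy data: two well-separated blobs.
rng = random.Random(0)
pts = ([(rng.gauss(0, 0.3), rng.gauss(0, 0.3)) for _ in range(30)]
       + [(rng.gauss(5, 0.3), rng.gauss(5, 0.3)) for _ in range(30)])

# Random seeding: the outcome can differ from run to run.
rand_scores = [kmeans_inertia(pts, 2, rng.sample(pts, 2)) for _ in range(20)]

# Deterministic seeding (a stand-in for the zero-point idea, not the
# actual Zk-AMH mechanism): identical seeds give identical results,
# hence zero standard deviation across runs.
det_scores = [kmeans_inertia(pts, 2, [pts[0], pts[-1]]) for _ in range(20)]
```

The deterministic runs necessarily produce the same inertia every time, which is the "zero standard deviation over repeated runs" property reported for Zk-AMH.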


2021 ◽  
Author(s):  
Toon Maas ◽  
Mohamad Tuffaha ◽  
Laurent Ney

<p>“A bridge has to be designed”. Every bridge is the exploration of all the degrees of freedom of a project: the context, cultural processes, technology, engineering and industrial skills. A successful bridge aims to dialogue with these degrees of freedom to achieve a delicate equilibrium, one that invites the participation of its users and evokes new perceptions for its viewers. In short, a good design “makes the bridge talk.”</p><p>Too often, the bridge, as an object, is reduced to its functionality. The perceptions and experiences of its users are often not considered in the design process; they are relegated to chance or treated as mere decoration. The longevity of infrastructure projects in general, and bridges in particular, highlights the deficiencies of such an approach. The framework for designing bridges must include historical, cultural, and experiential dimensions. Technology and engineering are of paramount importance but cannot be considered “an end in themselves but a means to an end”. This paper discusses three projects by Ney &amp; Partners that illustrate such a comprehensive exploratory approach to footbridge design: the Poissy and Albi crossings and the Tintagel footbridge.</p><p>The footbridges of Poissy and Albi dialogue most clearly with their historical contexts, reconfiguring the relationship between old and new in their materiality and typology. In Tintagel, legend replaces history. Becoming a metaphor for the void it crosses, the Tintagel footbridge illustrates the delicate dialogue between technology and engineering on one side and imagination and experience on the other.</p>


2021 ◽  
Vol 263 (1) ◽  
pp. 5154-5160
Author(s):  
Koichi Makino ◽  
Naoaki Shinohara

In Japan, the yearly average of Lden (day-evening-night sound level) has been adopted as the cumulative noise index in the national noise guideline, the "Environmental Quality Standards for Aircraft Noise." Daily flight movements at civil airports are almost stable because of scheduled airline flights. On the other hand, daily total flight movements at military airfields change greatly from day to day because of training flights, etc. Thus, noise exposure around an airport may change significantly from day to day with changes in flight movements. This paper shows examples of the fluctuation, frequency distribution, and deviation of daily Lden using aircraft noise monitoring data around civil airports and military airfields. In the case of civil airports, the standard deviation of daily Lden was less than 5 dB at monitoring stations where the yearly average of Lden was about 55 dB or more. However, the standard deviation of daily Lden increased to 10 dB or more in some cases at points where the yearly average of Lden was less than 55 dB. Furthermore, in the case of military airfields, the standard deviation of daily Lden was 5 dB or more at all monitoring stations.
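For readers unfamiliar with the index, Lden is an energy average of the day, evening, and night period levels with penalties for the evening (+5 dB) and night (+10 dB). The sketch below uses the common 12/4/8-hour period split; national definitions, including the exact period boundaries used in Japan, can differ, so treat the hours as an assumption.

```python
import math

def lden(ld, le, ln, hours=(12, 4, 8)):
    """Day-evening-night sound level (dB) from period-average levels.
    Evening carries a +5 dB penalty, night +10 dB. The 12/4/8 h split
    follows the common EU convention; other definitions exist."""
    hd, he, hn = hours
    total = (hd * 10 ** (ld / 10)
             + he * 10 ** ((le + 5) / 10)
             + hn * 10 ** ((ln + 10) / 10))
    return 10 * math.log10(total / (hd + he + hn))
```

Because of the penalties, Lden always exceeds a flat 24-hour average of the same period levels; e.g. equal 50 dB day, evening, and night levels give an Lden of roughly 56 dB.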


2018 ◽  
Vol 44 (4) ◽  
pp. 354-360 ◽  
Author(s):  
Koji Moriya ◽  
Takea Yoshizu ◽  
Naoto Tsubokawa ◽  
Hiroko Narisawa ◽  
Yutaka Maki

We report seven patients requiring tenolysis after primary or delayed primary flexor tendon repair and early active mobilization, out of 148 fingers in 132 consecutive patients with Zone 1 or 2 injuries treated from 1993 to 2017. Three fingers had Zone 2A injuries, two Zone 2B, and two Zone 2C. Two fingers underwent tenolysis at Week 4 or 6 after repair because of suspected rupture of the repair. The other five fingers had tenolysis 12 weeks after repair. Adhesions were moderately dense between the flexor digitorum superficialis and profundus tendons or with the pulleys. According to the Strickland and Tang criteria, the outcomes were excellent in one finger, good in four, fair in one, and poor in one. Fingers requiring tenolysis after early active motion made up 5% of the 148 fingers so treated. The indications for tenolysis were to achieve a full range of active motion in the patients rated good, or to improve the range of active motion in the patients rated poor or fair. Not all of our patients with poor or fair outcomes wanted to have tenolysis. Level of evidence: IV


2010 ◽  
Vol 67 (5) ◽  
pp. 1655-1666 ◽  
Author(s):  
David M. Romps ◽  
Zhiming Kuang

Abstract Tracers are used in a large-eddy simulation of shallow convection to show that stochastic entrainment (and not cloud-base properties) determines the fate of convecting parcels. The tracers are used to diagnose the correlations between a parcel’s state above the cloud base and both the parcel’s state at the cloud base and its entrainment history. The correlation with the cloud-base state goes to zero a few hundred meters above the cloud base. On the other hand, correlations between a parcel’s state and its net entrainment are large. Evidence is found that the entrainment events may be described as a stochastic Poisson process. A parcel model is constructed with stochastic entrainment that is able to replicate the mean and standard deviation of cloud properties. Turning off cloud-base variability has little effect on the results, which suggests that stochastic mass-flux models may be initialized with a single set of properties. The success of the stochastic parcel model suggests that it holds promise as the framework for a convective parameterization.
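The stochastic-entrainment idea lends itself to a very small ensemble cartoon: entrainment events arrive as a Poisson process along the parcel's ascent, and each event mixes a fixed fraction of environmental air into the parcel. This is an illustrative toy, not the authors' parcel model; the rate, mixing fraction, and the scalar being tracked are all invented for the example.

```python
import random
import statistics

def parcel_scalar(z_top, lam=1e-3, frac=0.1, dz=10.0,
                  parcel0=1.0, env=0.0, rng=random):
    """Advance a normalized parcel scalar from cloud base to z_top (m).
    Entrainment events arrive as a Poisson process with rate lam per
    metre (small-dz Bernoulli approximation); each event relaxes the
    parcel toward the environmental value env by a fraction frac."""
    s = parcel0
    z = 0.0
    while z < z_top:
        if rng.random() < lam * dz:   # P(event in this dz)
            s = (1 - frac) * s + frac * env
        z += dz
    return s

# Ensemble of parcels launched with identical cloud-base properties:
# stochastic entrainment alone generates spread aloft.
rng = random.Random(1)
ensemble = [parcel_scalar(2000.0, rng=rng) for _ in range(2000)]
```

Even with a single set of cloud-base properties, the ensemble develops a nonzero standard deviation aloft, which is the sense in which the abstract argues a stochastic mass-flux model needs no cloud-base variability.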


2010 ◽  
Vol 24 (4) ◽  
pp. 538-543 ◽  
Author(s):  
R. D. Ranft ◽  
S. S. Seefeldt ◽  
M. Zhang ◽  
D. L. Barnes

The use of triclopyr for the removal of woody and broad-leaf vegetation in right-of-ways and agricultural settings has been proposed for Alaska. Triclopyr concentrations in soil after application are of concern because residual herbicide may affect the growth of subsequent vegetation. In order to measure triclopyr residues in soil and determine the amount of herbicide taken up by the plant, soil bioassays were developed. Four agricultural species, turnip, lettuce, mustard, and radish, were tested to determine sensitivity to triclopyr in a 1-wk bioassay. The sensitivity (I50) of turnip, lettuce, mustard, and radish was 0.33 ± 0.05 kg ai ha−1, 0.78 ± 0.11 kg ai ha−1, 0.78 ± 0.07 kg ai ha−1, and 0.85 ± 0.10 kg ai ha−1 (mean ± SE), respectively. Mustard was the most consistent crop in the bioassay, with a midrange response to triclopyr and the lowest standard deviation for germination compared with the other species. Thus, it was used in a bioassay to determine triclopyr concentrations in a field trial. The mustard bioassay closely matched residual amounts of triclopyr in the field trial determined by chemical extraction. Estimates of residual triclopyr concentrations using the bioassay method were sometimes less than the triclopyr concentrations determined using chemical extraction. These differences were most evident after spring thaw, when the chemical extraction indicated there was enough triclopyr in the soil to reduce mustard growth by over 60%, yet the bioassay measured only a 10% reduction. The chemical extraction method may have counted nonphytotoxic metabolites of triclopyr as the herbicidal triclopyr acid. These methods, when analyzed together with a dose–response curve, offer a more complete picture of triclopyr residues and the potential for carryover injury to other plant species.
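An I50 (the dose halving the response) is normally estimated by fitting a log-logistic dose-response curve; as a bare-bones illustration of the quantity itself, the sketch below just interpolates linearly between the two observed doses that bracket a 50% response. The data are hypothetical, not from the paper.

```python
def i50(doses, responses):
    """Dose giving a 50% response, by linear interpolation between the
    two observed points that bracket 50%. Doses in kg ai/ha, responses
    as percent of untreated-control growth. Illustrative only; real
    dose-response work fits a log-logistic model instead."""
    for (d0, r0), (d1, r1) in zip(zip(doses, responses),
                                  zip(doses[1:], responses[1:])):
        if (r0 - 50) * (r1 - 50) <= 0:   # bracket found
            return d0 + (50 - r0) * (d1 - d0) / (r1 - r0)
    raise ValueError("50% response not bracketed by the data")

# Hypothetical mustard bioassay series (made-up numbers)
doses = [0.0, 0.25, 0.5, 1.0, 2.0]
growth = [100.0, 80.0, 55.0, 30.0, 12.0]
```

Here the 50% growth level falls between the 0.5 and 1.0 kg ai/ha doses, so the interpolated I50 lands between them.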


