The shifting boxplot. A boxplot based on essential summary statistics around the mean.

2010 ◽  
Vol 3 (1) ◽  
pp. 37-45 ◽  
Author(s):  
Fernando Marmolejo-Ramos ◽  
Tian Siva Tian

Boxplots are a useful and widely used graphical technique for exploring data in order to better understand the information we are working with. Boxplots display the first, second, and third quartiles, as well as the interquartile range and outliers, of a data set. The information displayed by the boxplot, and by most of its variations, is based on the data's median. However, many scientific applications analyse and report data using the mean. In this paper, we propose a variation of the classical boxplot that displays information around the mean. Some information about the median is displayed as well.
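As a rough illustration of the idea (not the authors' exact construction), a mean-centred box can be assembled from summary statistics such as mean ± 1 SD for the box and mean ± 2 SD for the whiskers; matplotlib's bxp accepts such precomputed statistics directly. The data and fence choices below are hypothetical.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
data = rng.gamma(shape=2.0, scale=1.5, size=200)  # skewed synthetic sample

mean, sd = data.mean(), data.std(ddof=1)
stats = {
    "mean": mean,
    "med": np.median(data),                 # median kept for reference
    "q1": mean - sd, "q3": mean + sd,       # box edges at mean -/+ 1 SD
    "whislo": mean - 2 * sd,                # whiskers at mean -/+ 2 SD
    "whishi": mean + 2 * sd,
    "fliers": data[np.abs(data - mean) > 2 * sd],  # points beyond whiskers
}

fig, ax = plt.subplots()
ax.bxp([stats], showmeans=True)  # draw a box from precomputed statistics
ax.set_ylabel("value")
plt.show()
```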

1996 ◽  
Vol 89 (8) ◽  
pp. 688-692 ◽
Author(s):  
Charles Vonder Embse ◽  
Arne Engebretsen

Summary statistics used to describe a data set are some of the most commonly taught statistical concepts in the secondary curriculum. Mean, median, mode, range, and standard deviation are topics that can be found in nearly every program. Technology empowers us to access these concepts and to easily create visual displays that interpret and describe the data in ways that enhance students' understanding. Many graphing calculators allow students to display nonparametric statistical information using a box-and-whisker plot or a modified box plot, giving a visual representation of the median, the upper and lower quartiles, and the range of the data. But how can students visually display the mean of the data or show what it means to be within one standard deviation of the mean? One way to create this type of visual display is with a bar graph and constant functions. Unfortunately, graphing calculators, and some computer programs, display only histograms, not bar graphs. The tips in this issue focus on using graphing calculators to draw bar graphs that can help students visualize and interpret the mean and standard deviation of a data set.
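For readers without a graphing calculator handy, the same display translates directly to a few lines of Python; the scores below are hypothetical, and the horizontal lines play the role of the constant functions y = mean and y = mean ± 1 SD.

```python
import numpy as np
import matplotlib.pyplot as plt

scores = np.array([62, 71, 75, 78, 80, 83, 85, 88, 91, 97])  # hypothetical class scores
mean, sd = scores.mean(), scores.std(ddof=1)

fig, ax = plt.subplots()
ax.bar(np.arange(len(scores)), scores, color="lightgray", edgecolor="black")

# Constant functions drawn over the bar graph: the mean (solid) and
# mean +/- one standard deviation (dashed), as on a graphing calculator.
for level, style in [(mean, "-"), (mean - sd, "--"), (mean + sd, "--")]:
    ax.axhline(level, linestyle=style, color="black")

ax.set_xlabel("student")
ax.set_ylabel("score")
plt.show()
```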


2020 ◽  
Vol 72 (1) ◽  
Author(s):  
Chao Xiong ◽  
Claudia Stolle ◽  
Patrick Alken ◽  
Jan Rauberg

Abstract In this study, we have derived field-aligned currents (FACs) from magnetometers onboard the Defense Meteorological Satellite Program (DMSP) satellites. The magnetic latitude versus local time distribution of the DMSP FACs shows dependences on the intensity and orientation of the interplanetary magnetic field (IMF) By and Bz components that are comparable with previous findings, which confirms the reliability of the DMSP FAC data set. With simultaneous measurements of precipitating particles from DMSP, we further investigate the relation between large-scale FACs and precipitating particles. Our results show that precipitating electron and ion fluxes both increase in magnitude and extend to lower latitudes for enhanced southward IMF Bz, similar to the behavior of FACs. Under weak northward and southward Bz conditions, the locations of the R2 current maxima, on both the dusk and dawn sides and in both hemispheres, are found to be close to the maxima of the particle energy fluxes, while for the same IMF conditions the R1 currents are displaced further from the respective particle flux peaks. The largest displacement (about 3.5°) is found between the downward R1 current and the ion flux peak on the dawn side. Our results suggest that there exist systematic differences in the locations of electron/ion precipitation and large-scale upward/downward FACs. As outlined by the statistical means of these two parameters, the FAC peaks enclose the particle energy flux peaks in an auroral band on both the dusk and dawn sides. Our comparisons also show that particle precipitation at dawn and dusk, in both hemispheres, maximizes near the mean R2 current peaks. The particle precipitation flux maxima closer to the R1 current peaks are lower in magnitude, which is opposite to the known feature that R1 currents are on average stronger than R2 currents.
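For orientation, a common single-spacecraft FAC estimate (not necessarily the exact processing used here) treats the current as an infinite sheet crossed by the satellite and takes the along-track gradient of the transverse magnetic perturbation; the function and sample values below are a hypothetical sketch of that idea.

```python
import numpy as np

MU0 = 4e-7 * np.pi  # vacuum permeability (H/m)

def fac_single_satellite(b_perp_nt, dt_s, v_ms):
    """Field-aligned current density (A/m^2) from the along-track gradient
    of the transverse magnetic perturbation, assuming an infinite current
    sheet crossed perpendicularly by a single spacecraft."""
    db_dx = np.gradient(b_perp_nt * 1e-9) / (v_ms * dt_s)  # T per metre
    return db_dx / MU0

# Hypothetical 1 Hz magnetic residuals (nT) during an auroral crossing,
# at a typical ~7.5 km/s orbital speed:
b_perp = np.array([0.0, 5.0, 20.0, 60.0, 90.0, 100.0, 80.0, 40.0, 10.0, 0.0])
print(fac_single_satellite(b_perp, dt_s=1.0, v_ms=7500.0) * 1e6)  # uA/m^2
```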


Neurosurgery ◽  
2012 ◽  
Vol 72 (3) ◽  
pp. 353-366 ◽  
Author(s):  
Francesco Cardinale ◽  
Massimo Cossu ◽  
Laura Castana ◽  
Giuseppe Casaceli ◽  
Marco Paolo Schiariti ◽  
...  

Abstract BACKGROUND: Stereoelectroencephalography (SEEG) methodology, originally developed by Talairach and Bancaud, is progressively gaining popularity for the presurgical invasive evaluation of drug-resistant epilepsies. OBJECTIVE: To describe recent SEEG methodological implementations carried out in our center, to evaluate safety, and to analyze in vivo application accuracy in a consecutive series of 500 procedures with a total of 6496 implanted electrodes. METHODS: Four hundred nineteen procedures were performed with the traditional 2-step surgical workflow, which was modified for the subsequent 81 procedures. The new workflow entailed acquisition of brain 3-dimensional angiography and magnetic resonance imaging in frameless and markerless conditions, advanced multimodal planning, and robot-assisted implantation. Quantitative analysis of in vivo entry point and target point localization errors was performed on a subset of 118 procedures (1567 electrodes). RESULTS: The methodology allowed successful implantation in all cases. The major complication rate was 12 of 500 (2.4%), including 1 death from indirect morbidity. Median entry point localization error was 1.43 mm (interquartile range, 0.91-2.21 mm) with the traditional workflow and 0.78 mm (interquartile range, 0.49-1.08 mm) with the new one (P < 2.2 × 10⁻¹⁶). Median target point localization errors were 2.69 mm (interquartile range, 1.89-3.67 mm) and 1.77 mm (interquartile range, 1.25-2.51 mm; P < 2.2 × 10⁻¹⁶), respectively. CONCLUSION: SEEG is a safe and accurate procedure for the invasive assessment of the epileptogenic zone. The traditional Talairach methodology, implemented with multimodal planning and robot-assisted surgery, allows direct electrical recording from superficial and deep-seated brain structures, providing essential information in the most complex cases of drug-resistant epilepsy.
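The reported errors are presumably Euclidean distances between planned and post-implant electrode coordinates; a minimal sketch of that computation, with hypothetical coordinates, is:

```python
import numpy as np

def localization_errors(planned_xyz, actual_xyz):
    """Per-electrode Euclidean error (mm) between planned and actual
    coordinates, with the median and interquartile range."""
    err = np.linalg.norm(np.asarray(planned_xyz) - np.asarray(actual_xyz), axis=1)
    q1, med, q3 = np.percentile(err, [25, 50, 75])
    return err, med, (q1, q3)

# Hypothetical target-point coordinates (mm) for three electrodes:
planned = [[10.0, 42.5, 7.1], [-8.2, 30.0, 15.4], [22.1, -5.0, 33.3]]
actual  = [[10.6, 43.1, 7.5], [-7.8, 30.9, 14.9], [22.9, -4.1, 33.0]]
errors, median_err, iqr = localization_errors(planned, actual)
print(errors, median_err, iqr)
```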


2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Sidney R. Lehky ◽  
Keiji Tanaka ◽  
Anne B. Sereno

Abstract When measuring sparseness in neural populations as an indicator of efficient coding, an implicit assumption is that each stimulus activates a different random set of neurons. In other words, population responses to different stimuli are, on average, uncorrelated. Here we examine neurophysiological data from four lobes of macaque monkey cortex, including V1, V2, MT, anterior inferotemporal cortex, lateral intraparietal cortex, the frontal eye fields, and perirhinal cortex, to determine how correlated population responses are. We call the mean correlation the pseudosparseness index, because high pseudosparseness can mimic the statistical properties of sparseness without being authentically sparse. In every data set we find high levels of pseudosparseness, ranging from 0.59 to 0.98, substantially greater than the value of 0.00 for authentic sparseness. This was true for synthetic and natural stimuli, as well as for single-electrode and multielectrode data. A model indicates that a key variable producing high pseudosparseness is the standard deviation of spontaneous activity across the population. Consistently high values of pseudosparseness in the data demand reconsideration of the sparse coding literature, as well as consideration of the degree to which authentic sparseness provides a useful framework for understanding neural coding in the cortex.
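On one plausible reading of the index, pseudosparseness is the mean off-diagonal entry of the stimulus-by-stimulus correlation matrix of population response vectors; the sketch below, with synthetic data, shows how a shared spontaneous-activity profile drives the index toward 1.

```python
import numpy as np

def pseudosparseness(responses):
    """Mean pairwise Pearson correlation between population response
    vectors (rows = stimuli, columns = neurons). Near 0 means stimuli
    activate unrelated neuron sets; near 1 means one shared pattern."""
    c = np.corrcoef(responses)                        # stimulus-by-stimulus
    return c[~np.eye(c.shape[0], dtype=bool)].mean()  # average off-diagonal

# Synthetic data: a common baseline profile plus stimulus-specific noise.
rng = np.random.default_rng(1)
baseline = rng.exponential(scale=5.0, size=100)       # spontaneous-like profile
resp = baseline + 0.5 * rng.normal(size=(50, 100))    # 50 stimuli x 100 neurons
print(pseudosparseness(resp))  # close to 1: high pseudosparseness
```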


Ocean Science ◽  
2010 ◽  
Vol 6 (4) ◽  
pp. 887-900 ◽  
Author(s):  
M. Ezam ◽  
A. A. Bidokhti ◽  
A. H. Javid

Abstract. A three-dimensional numerical model, the Princeton Ocean Model (POM), and observational data are used to study the Persian Gulf outflow structure and its spreading pathways during 1992. In the model, the monthly wind speed data were taken from ICOADS (International Comprehensive Ocean-Atmosphere Data Set) and the monthly SST (sea surface temperature) fields from AVHRR (Advanced Very High Resolution Radiometer), with the addition of monthly net shortwave radiation from NCEP (National Centers for Environmental Prediction). The mean monthly precipitation rates from NCEP data and the calculated evaporation rates are used to impose the surface salinity fluxes. At the open boundaries the temperature and salinity were prescribed from the mean monthly climatological values of WOA05 (World Ocean Atlas 2005), and the four major tidal constituents were also prescribed there. The results show that the outflow mainly originates from two branches at different depths in the Persian Gulf. The permanent branch exists throughout the year at depths greater than 40 m along the Gulf axis and originates from the inner parts of the Persian Gulf. The other, seasonal branch forms in the vicinity of the shallow southern coasts due to high evaporation rates during winter. Near the Strait of Hormuz the two branches join and form the main outflow source water. The simulations reveal that during winter the outflow boundary current mainly detaches from the coast well before Ras Al Hamra Cape, whereas during summer the outflow seems to follow the coast even after this cape. This is due to the higher density of the colder winter outflow, which leads to more sinking near the coast. Thus, in winter part of the outflow sinks to a depth of about 500 m (for which some explanations are given) while the main part detaches and spreads at a depth of about 300 m; in summer it all moves at a depth of about 200–250 m. During winter, the deeper, stronger, and wider outflow is more affected by the steep topography, leading to separation from the coast, while during summer the weaker and shallower outflow is less influenced by bottom topography and so continues along the boundary.
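As a side note on the surface forcing, the usual way an E − P imbalance enters a model's surface salinity condition is as a virtual salt flux proportional to the sea surface salinity; the numbers below are rough, hypothetical Persian Gulf values, not taken from the paper.

```python
def virtual_salt_flux(evap_m_s, precip_m_s, sss_psu):
    """Virtual salt flux (psu * m/s) implied by net evaporation E - P,
    the standard form of a model's surface salinity boundary condition."""
    return (evap_m_s - precip_m_s) * sss_psu

YEAR_S = 365.25 * 24 * 3600  # seconds per year
# Hypothetical values: E ~ 2 m/yr, P ~ 0.1 m/yr, surface salinity ~ 40 psu
print(virtual_salt_flux(2.0 / YEAR_S, 0.1 / YEAR_S, 40.0))
```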


2020 ◽  
Vol 22 (Supplement_2) ◽  
pp. ii83-ii83 ◽
Author(s):  
Nilan Vaghjiani ◽  
Andrew Schwieder ◽  
Sravya Uppalapati ◽  
Zachary Kons ◽  
Elizabeth Kazarian ◽  
...  

Abstract PURPOSE Radiation-induced meningiomas (RIMs) are associated with previous exposure to therapeutic irradiation. RIMs are rare and have not been well characterized relative to spontaneous meningiomas (SMs). METHODS 1003 patients with proven or presumed meningiomas were identified from the VCU brain tumor database. Chart review classified RIM patients and their characteristics. RESULTS Of the 1003 total patients, 76.47% were female, with a mean ± SD age of 67.55 ± 15.50 years. Fifteen RIM patients were identified (66.67% female), with a mean ± SD age of 52.67 ± 15.46 years; 5 were African American and 10 were Caucasian. The incidence of RIMs was 1.49% in our data set. The mean age at diagnosis was 43.27 ± 15.06 years, the mean latency was 356.27 ± 116.96 months, and the mean initiating dose was 44.28 ± 14.68 Gy. Mean latency differed significantly by ethnicity: 258.3 months for the African American patients versus 405.2 months for the Caucasian patients (p = 0.003). There was also a significant difference in the mean number of lesions in females (2.8) versus males (1.2; p = 0.046). Of the RIMs with characterized histology, 6 (55%) were WHO grade II and 5 (45%) were WHO grade I, a prevalence of grade II tumors approximately double that found with SMs. RIMs were treated with combinations of observation, surgery, radiation, and medical therapy. Of the 8 patients treated with radiation, 4 demonstrated a response; 8 of the 15 patients (53%) demonstrated recurrence/progression despite treatment. CONCLUSION RIMs are important because of their association with higher-grade histology, their gender- and ethnicity-related incidence patterns, and their increased recurrence/progression compared with SMs. Despite the presumed contributory role of prior radiation, RIMs demonstrate a significant rate of responsiveness to radiation treatment.
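The abstract does not name the test behind p = 0.003; one conventional choice for comparing the two latency means would be a two-sample t-test. The samples below are invented solely so that their means match the reported 258.3 and 405.2 months.

```python
import numpy as np
from scipy import stats

# Hypothetical latency samples (months), n = 5 and n = 10, constructed
# to reproduce the reported group means only:
latency_aa = np.array([150.0, 220.0, 260.0, 300.0, 361.5])   # mean 258.3
latency_ca = np.array([300.0, 340.0, 365.0, 390.0, 410.0,
                       420.0, 430.0, 445.0, 470.0, 482.0])    # mean 405.2

t, p = stats.ttest_ind(latency_aa, latency_ca, equal_var=False)  # Welch's t-test
print(t, p)
```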


2021 ◽  
pp. 58-60 ◽
Author(s):  
Naziru Fadisanku Haruna ◽  
Ran Vijay Kumar Singh ◽  
Samsudeen Dahiru

In this paper a modified ratio-type estimator for the finite population mean under stratified random sampling using a single auxiliary variable has been proposed. The expressions for the mean square error and bias of the proposed estimator are derived up to the first order of approximation. The expression for the minimum mean square error of the proposed estimator is also obtained. The mean square error of the proposed estimator is compared with those of other existing estimators theoretically, and conditions are obtained under which the proposed estimator performs better. A real-life population data set has been considered to compare the efficiency of the proposed estimator numerically.
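For context (the paper's specific modification is not reproduced here), the classical combined ratio estimator under stratified random sampling, against which such proposals are typically benchmarked, and its first-order MSE are

$$\hat{\bar{Y}}_{Rc} = \frac{\bar{y}_{st}}{\bar{x}_{st}}\,\bar{X}, \qquad \mathrm{MSE}\!\left(\hat{\bar{Y}}_{Rc}\right) \approx \sum_{h} W_h^2\,\frac{1-f_h}{n_h}\left(S_{yh}^2 + R^2 S_{xh}^2 - 2R\,S_{yxh}\right),$$

where $W_h = N_h/N$ is the stratum weight, $f_h = n_h/N_h$ the sampling fraction, $R = \bar{Y}/\bar{X}$, and $S_{yh}^2$, $S_{xh}^2$, $S_{yxh}$ the stratum variances and covariance. The bias and MSE of a modified estimator are derived to the same first order and compared term by term with this expression.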


2021 ◽  
Vol 87 (4) ◽  
pp. 283-293 ◽
Author(s):  
Wei Wang ◽  
Yuan Xu ◽  
Yingchao Ren ◽  
Gang Wang

Recently, performance improvements in facade parsing from 3D point clouds have been achieved by designing more complex network structures, which consume substantial computing resources and do not take full advantage of prior knowledge of facade structure. Instead, from the perspective of data distribution, we construct a new hierarchical mesh multi-view data domain based on the characteristics of facade objects to fuse deep-learning models with prior knowledge, thereby significantly improving segmentation accuracy. We comprehensively evaluate the current mainstream methods on the RueMonge 2014 data set and demonstrate the superiority of our method. The mean intersection-over-union index on the facade-parsing task reached 76.41%, which is 2.75 percentage points higher than the previous best result. In addition, through comparative experiments, the reasons for the performance improvement of the proposed method are further analyzed.
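A minimal sketch of the metric itself (class ids and the error model are hypothetical) shows how mean intersection-over-union is computed from per-point labels:

```python
import numpy as np

def mean_iou(pred, gt, num_classes):
    """Mean intersection-over-union over classes for integer label arrays."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        if union > 0:                 # skip classes absent from both arrays
            ious.append(inter / union)
    return float(np.mean(ious))

# Hypothetical per-point labels for a 7-class facade task, with the
# prediction agreeing with ground truth on ~80% of points:
rng = np.random.default_rng(2)
gt = rng.integers(0, 7, size=10_000)
pred = np.where(rng.random(10_000) < 0.8, gt, rng.integers(0, 7, size=10_000))
print(mean_iou(pred, gt, num_classes=7))
```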


2014 ◽  
Vol 5 (2) ◽  
pp. 315-332 ◽  
Author(s):  
R. David Simpson

Abstract: It has occasionally been asserted that regulators typically overestimate the costs of the regulations they impose. A number of arguments have been proposed for why this might be the case. The most widely credited is that regulators fail to sufficiently appreciate the effects of innovation in reducing regulatory compliance costs. Most existing studies have found that regulators are more likely to over- than to underestimate costs. While it is difficult to develop summary statistics to aggregate the results of different studies of disparate industries, one such measure is the average ratio of ex ante estimates of compliance costs to ex post estimates of the same costs. This ratio is generally greater than one. In this paper I argue that neither the greater frequency of overestimates nor the fact that the average ratio of ex ante to ex post cost estimates is greater than one necessarily demonstrates that ex ante estimates are biased. There are several reasons to suppose that the distribution of compliance costs could be skewed, so that the median of the distribution would lie below the mean. It is not surprising, then, that most estimates would prove to be too high. Moreover, Jensen's inequality implies that the expected ratio of ex ante to ex post compliance costs would be greater than one. I propose a regression-based test of the bias of ex ante compliance cost estimates and cannot reject the hypothesis that estimates are unbiased. Failure to reject a hypothesis with limited and noisy data should not, of course, be interpreted as a strong argument to accept the hypothesis. Rather, this paper argues for the generation of more and better information. Despite the existence of a number of papers reporting ex ante and ex post compliance cost estimates, it is surprisingly difficult to assemble a large sample with which to make such comparisons.
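The Jensen's inequality step can be made explicit. Let $C$ denote the random ex post cost and suppose the ex ante estimate equals its mean $\mathbb{E}[C]$, i.e. is unbiased. Because $c \mapsto 1/c$ is convex for $c > 0$,

$$\mathbb{E}\!\left[\frac{\text{ex ante}}{\text{ex post}}\right] = \mathbb{E}\!\left[\frac{\mathbb{E}[C]}{C}\right] = \mathbb{E}[C]\,\mathbb{E}\!\left[\frac{1}{C}\right] \;\ge\; \mathbb{E}[C]\cdot\frac{1}{\mathbb{E}[C]} = 1,$$

with strict inequality whenever $C$ is non-degenerate. An average ratio above one is therefore exactly what unbiased ex ante estimates would produce.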

