Statistical measures for evaluating protected group under-representation: analysis of the conflicting inferences drawn from the same data in People v. Bryant and Ambrose v. Booker

2015 ◽  
pp. mgv011 ◽  
Author(s):  
Joseph L. Gastwirth ◽  
Wenjing Xu ◽  
Qing Pan

2019 ◽
Vol 42 (2) ◽  
pp. 180-195
Author(s):  
Shirley A. Jackson

In 2017, Oregon passed House Bill 2845 requiring an Ethnic Studies curriculum in grades K–12, the first state in the nation to do so. The bill passed almost fifty years after the founding of the country's first Ethnic Studies department. The passage of an Ethnic Studies bill in a state that once banned African Americans and removed Indigenous peoples from their land requires further examination. In addition, the bill mandates that the Ethnic Studies curriculum in Oregon's schools include "social minorities," such as Jewish and LGBTQ+ populations, which makes the bill even more remarkable. As such, some observers may see it as a watered-down version of its perceived original intent, one that focuses on racial and ethnic minorities. A similar analogy can be drawn to the revision of the Civil Rights Bill of 1964 when it added women as a protected group. Grounded in a socio-political history that would otherwise have been left out, this essay examines the productive and challenging aspects of HB 2845. Framing the bill to include racial, ethnic, and social minorities solved the problem of a host of bills that might not have passed on their own merit, while simultaneously and ironically making it easier to pass similar bills.


Author(s):  
Revati Kadu ◽  
U. A. Belorkar

Some of the most common and fastest-growing health problems in the world are related to the skin. Human skin disease is among the most unpredictable and difficult entities to detect and evaluate automatically, owing to complexities of texture and tone, the presence of hair, and other distinctive features. The many cases of skin disease worldwide have created a need for an effective automated screening method for detecting and diagnosing diseased areas. The objective of this work is therefore to develop a new technique for automated detection and analysis of skin disease images based on color and texture information. This paper proposes a system that detects skin diseases using wavelet techniques and an artificial neural network, presenting a wavelet-based texture analysis method for classifying five types of skin disease. The method applies a tree-structured wavelet transform to the red, green, and blue channels of dermoscopy images and computes various statistical measures and ratios on the wavelet coefficients, extracting 99 unique features per image. Using an artificial neural network, the system successfully detects different types of dermatological skin disease. It consists of three main phases: image processing, training, and detection and classification.
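A minimal sketch of this kind of feature extraction, assuming PyWavelets for the tree-structured (wavelet packet) transform; the wavelet family, decomposition depth, and the exact statistics are illustrative assumptions, not the authors' exact 99-feature set:

```python
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def channel_features(channel, wavelet="db1", level=2):
    """Statistical measures over tree-structured (packet) wavelet sub-bands."""
    wp = pywt.WaveletPacket2D(data=channel, wavelet=wavelet, maxlevel=level)
    feats = []
    for node in wp.get_level(level):          # all sub-bands at the deepest level
        c = np.asarray(node.data, dtype=float)
        feats += [c.mean(), c.std(), np.mean(c ** 2)]  # mean, std, energy
    return feats

def image_features(rgb_image):
    """Concatenate features from the red, green, and blue channels."""
    return np.hstack([channel_features(rgb_image[..., k]) for k in range(3)])

# X: stacked feature vectors of the dermoscopy images, y: labels (5 classes)
# clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000).fit(X, y)
```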


2018 ◽  
Vol 31 (1) ◽  
pp. 277 ◽  
Author(s):  
Methaq Talib Gaata

With the rapid progress of information technology and computer networks, it has become very easy to reproduce and share geospatial data because of its digital form. The use of geospatial data therefore suffers from problems such as data authentication, proof of ownership, and illegal copying, which pose a major challenge to its future use. This paper introduces a new watermarking scheme to ensure copyright protection of digital vector maps. The main idea of the proposed scheme is to transform the digital map into the frequency domain using the Singular Value Decomposition (SVD) in order to determine suitable areas in which to insert the watermark data. The digital map is separated into isolated parts, and watermark data are embedded within the nominated magnitudes of each part when definite criteria are satisfied. The efficiency of the proposed watermarking scheme is assessed with statistical measures of two factors: fidelity and robustness. Experimental results demonstrate that the proposed scheme represents an ideal trade-off between the amount of distortion and robustness, and that it is robust against many kinds of attacks.
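A hedged sketch of SVD-based embedding in one block of vertex coordinates; the block layout, quantization step, and parity rule are assumptions for illustration, not the authors' exact scheme:

```python
import numpy as np

def embed_bit(block, bit, step=0.01):
    """Embed one watermark bit by quantizing the largest singular value."""
    U, s, Vt = np.linalg.svd(block, full_matrices=False)
    q = np.floor(s[0] / step)
    # force the parity of the quantization cell to encode the bit
    if int(q) % 2 != bit:
        q += 1
    s[0] = (q + 0.5) * step          # re-center inside the chosen cell
    return U @ np.diag(s) @ Vt

# block: an (n x 2) array of x/y vertices from one isolated part of the map
# watermarked = embed_bit(block, bit=1)
```

A decoder would recover the bit as the parity of floor(s[0] / step) on the watermarked block, which is unchanged by small distortions within half a quantization cell.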


Author(s):  
Kyle Hoegh ◽  
Trevor Steiner ◽  
Eyoab Zegeye Teshale ◽  
Shongtao Dai

Available methods for assessing hot-mix asphalt pavements are typically restricted to destructive methods, such as coring, that damage the pavement and are limited in coverage. Recently, density profiling systems (DPS) have become available with the capability of measuring asphalt compaction continuously, giving instantaneous measurements a few hundred feet behind the final roller on freshly placed pavement. Further development of DPS processing methods has allowed for coreless calibration, in which dielectric measurements are correlated with asphalt specimens fabricated at variable air void contents using Superpave gyratory compaction. These developments make DPS technology an attractive potential tool for quality control, because of the real-time nature of the results, and for quality assurance, because of the ability to measure a more statistically significant amount of data than current quality assurance methods such as coring. To test the viability of these recently developed methods for implementation, multiple projects were selected for field trials. Each field trial was used to assess the coreless calibration prediction by comparison with field cores at locations where dielectric measurements were made. Ground-truth core validation on each project showed the reasonableness of the coreless calibration method. The validated dielectric-to-air-void prediction curves allowed for assessment of the tested pavements in relation to as-built characteristics, with the DPS providing the equivalent of approximately 100,000 cores per mile. Statistical measures were used to demonstrate how DPS can provide a comprehensive asphalt compaction evaluation that can inform construction-related decisions and has potential as a future quality assurance tool.
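A hedged sketch of what such a coreless calibration could look like: fitting a dielectric-to-air-void curve from gyratory-compacted specimens. The exponential model form and the example numbers are assumptions for illustration only, not the paper's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def air_voids(dielectric, a, b):
    # monotone decreasing: higher dielectric -> denser mix -> fewer voids
    return a * np.exp(-b * dielectric)

# hypothetical dielectric readings and lab-measured air voids (%) of specimens
eps = np.array([4.2, 4.6, 5.0, 5.4, 5.8])
voids = np.array([9.5, 7.8, 6.4, 5.2, 4.3])

(a, b), _ = curve_fit(air_voids, eps, voids, p0=(50.0, 0.5))

# apply the fitted curve to every continuous field dielectric measurement:
# field_voids = air_voids(field_eps, a, b)
```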


Science ◽  
2021 ◽  
Vol 372 (6539) ◽  
pp. eabf1941
Author(s):  
Sandipan Ray ◽  
Utham K. Valekunja ◽  
Alessandra Stangherlin ◽  
Steven A. Howell ◽  
Ambrosius P. Snijders ◽  
...  

Abruzzi et al. argue that the transcriptome oscillations found in our study in the absence of Bmal1 are of low amplitude, statistical significance, and consistency. However, their conclusions rely solely on a different statistical algorithm than the one we used. We provide statistical measures and additional analyses showing that our original analyses and observations are accurate. Further, we highlight independent lines of evidence indicating Bmal1-independent 24-hour molecular oscillations.


2020 ◽  
Vol 154 (Supplement_1) ◽  
pp. S5-S5
Author(s):  
Ridin Balakrishnan ◽  
Daniel Casa ◽  
Morayma Reyes Gil

Abstract The diagnostic approach for ruling out suspected acute pulmonary embolism (PE) in the ED setting includes several tests: ultrasound, plasma d-dimer assays, ventilation-perfusion scans, and computed tomography pulmonary angiography (CTPA). Importantly, a pretest probability scoring algorithm is highly recommended to triage high-risk cases while also preventing unnecessary testing and harm to low/moderate-risk patients. The d-dimer assay (both ELISA and immunoturbidometric) has been shown to be extremely sensitive for ruling out PE in conjunction with clinical probability. In particular, d-dimer testing is recommended for low/moderate-risk patients, in whom a negative d-dimer essentially rules out PE, sparing these patients CTPA radiation exposure, longer hospital stays, and anticoagulation. However, an unspecific increase in fibrin-degradation-related products has been seen with increasing age, resulting in a higher false positive rate in the older population. This study analyzed patient visits to the ED of a large academic institution over five years and examined the relationship between d-dimer values, age, and CTPA results to better understand the value of age-adjusted d-dimer cut-offs in ruling out PE in the older population. A total of 7660 ED visits had a CTPA done to rule out PE; of these, 1875 cases had a d-dimer done in conjunction with the CT and 5875 had only CTPA done. Of the 1875 cases, 1591 had positive d-dimer results (>0.50 µg/ml FEU), of which 910 (57%) were from patients fifty years of age or older. In these older patients, 779 (86%) had a negative CT result. The statistical measures of the d-dimer test before adjusting for age were: sensitivity 98%, specificity 12%, negative predictive value 98%, and false positive rate 88%. After adjusting for age in patients older than 50 years (d-dimer cut-off = age/100), 138 patients turned out to be d-dimer negative, and all but four had a CT result that was also negative for PE. The four cases included two non-diagnostic results and two with subacute/chronic/subsegmental PE on imaging; none of these four patients were prescribed anticoagulation. The statistical measures of the d-dimer test after adjusting for age were: sensitivity 96%, specificity 20%, negative predictive value 98%, and a decreased false positive rate of 80%. Therefore, imaging could potentially have been avoided in 138/779 (18%) of the patients in this older population who had eventually negative or not clinically significant findings on CTPA, had age-adjusted d-dimer cut-offs been used. These data strongly advocate for the clinical usefulness of an age-adjusted d-dimer cut-off to rule out PE.
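A minimal sketch of the age-adjusted rule described above (helper names are hypothetical; thresholds in µg/ml FEU, as in the study):

```python
def ddimer_cutoff(age_years):
    """Age-adjusted d-dimer cut-off: age/100 for patients over 50, else 0.50."""
    return age_years / 100.0 if age_years > 50 else 0.50

def ddimer_positive(ddimer_feu, age_years):
    return ddimer_feu > ddimer_cutoff(age_years)

# e.g., a 78-year-old with a d-dimer of 0.70 µg/ml FEU:
# conventional rule: 0.70 > 0.50 -> positive, CTPA ordered
# age-adjusted rule: 0.70 <= 0.78 -> negative, PE ruled out, imaging avoided
```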


Author(s):  
Andy H. Wong ◽  
Tae J. Kwon

Winter driving conditions pose a real hazard to road users, with an increased chance of collisions during inclement weather events. As such, road authorities strive to service hazardous roads and collision hot spots to improve road safety, mobility, and accessibility. One measure of a hot spot is winter collision statistics: using the ratio of winter collisions (WC) to all collisions, roads that show a high WC ratio should be given high priority for further diagnosis and countermeasure selection. This study presents a unique methodological framework built on one of the least explored yet most powerful geostatistical techniques, regression kriging (RK). Unlike other variants of kriging, RK uses auxiliary variables to gain a deeper understanding of contributing factors while also exploiting the spatial autocorrelation structure to predict WC ratios. The applicability and validity of RK for large-scale hot spot analysis are evaluated on the northeast quarter of the State of Iowa, spanning five winter seasons from 2013/14 to 2017/18. The findings of the case study, assessed via three statistical measures (mean squared error, root mean squared error, and root mean squared standardized error), suggest that RK is very effective for modeling WC ratios, further supporting its robustness and feasibility for statewide implementation.
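A hedged sketch of regression kriging in its usual form: a regression trend on auxiliary covariates plus ordinary kriging of the residuals (here via PyKrige). Variable names, covariates, and the variogram model are assumptions, not the paper's exact setup:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from pykrige.ok import OrdinaryKriging

def fit_rk(coords, covariates, wc_ratio):
    """coords: (n,2) x/y; covariates: (n,k) auxiliary variables; wc_ratio: target."""
    trend = LinearRegression().fit(covariates, wc_ratio)
    resid = wc_ratio - trend.predict(covariates)
    ok = OrdinaryKriging(coords[:, 0], coords[:, 1], resid,
                         variogram_model="spherical")
    return trend, ok

def predict_rk(trend, ok, coords_new, covariates_new):
    """RK prediction = regression trend + kriged residual."""
    resid_hat, _ = ok.execute("points", coords_new[:, 0], coords_new[:, 1])
    return trend.predict(covariates_new) + resid_hat

# validation, e.g.: rmse = np.sqrt(np.mean((predict_rk(...) - observed) ** 2))
```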


2021 ◽  
Vol 11 (14) ◽  
pp. 6405
Author(s):  
Pere Marti-Puig ◽  
Alejandro Bennásar-Sevillá ◽  
Alejandro Blanco-M. ◽  
Jordi Solé-Casals

Today, the use of SCADA data for predictive maintenance and forecasting of wind turbines in wind farms is gaining popularity due to the low cost of this solution compared to others that require the installation of additional equipment. SCADA data provide four statistical measures (mean, standard deviation, maximum value, and minimum value) of hundreds of wind turbine magnitudes, usually over a 5-min or 10-min interval. Several studies have analysed the loss of information incurred when using five minutes instead of four seconds as the sampling interval, or when compressing a time series recorded at 5-min resolution to 10-min resolution, concluding that some, but not all, of these magnitudes are seriously affected. However, to our knowledge, there are no studies on increasing beyond 10 min the time interval over which these four statistical values are taken, or on how this aggregation affects prognosis models. Our work shows that, despite the irreversible loss of information that occurs in the first 5 min, increasing the time window over which the four representative statistical values are taken improves the performance of the predicted targets in normality models.
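A minimal sketch of the aggregation studied here, assuming pandas: recomputing the four SCADA statistics over windows longer than 10 min. The column name and the 1-hour window are illustrative assumptions:

```python
import pandas as pd

def aggregate_scada(df, window="1h"):
    """df: a SCADA series indexed by timestamp, one column per magnitude.
    Returns the four representative statistics per aggregation window."""
    return df.resample(window).agg(["mean", "std", "max", "min"])

# scada = pd.read_csv("turbine.csv", index_col="timestamp", parse_dates=True)
# hourly = aggregate_scada(scada[["gen_speed"]], window="1h")
```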

