Added Value of Subscores and Hypothesis Testing

2018 ◽  
Vol 44 (1) ◽  
pp. 25-44
Author(s):  
Sandip Sinharay

The value-added method of Haberman is arguably one of the most popular methods to evaluate the quality of subscores. According to the method, a subscore has added value if the reliability of the subscore is larger than a quantity referred to as the proportional reduction in mean squared error of the total score. This article shows how well-known statistical tests can be used to determine the added value of subscores and augmented subscores. The usefulness of the suggested tests is demonstrated using two operational data sets.
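Under classical test theory the decision rule itself is compact: estimate the reliability of the subscore, estimate the PRMSE of the total score as a predictor of the true subscore, and declare added value when the former exceeds the latter. The following is a minimal sketch of that point-estimate rule (not the article's hypothesis tests), computed from an examinee-by-item score matrix; the split into sub_items and rest_items, the use of coefficient alpha as the reliability estimate, and the assumption of uncorrelated measurement errors across item sets are simplifications for illustration.

import numpy as np

def coefficient_alpha(items):
    # Cronbach's alpha for an examinee-by-item score matrix
    k = items.shape[1]
    total = items.sum(axis=1)
    return k / (k - 1) * (1 - items.var(axis=0, ddof=1).sum()
                          / total.var(ddof=1))

def has_added_value(sub_items, rest_items):
    # Haberman-style check: a subscore has added value if its reliability
    # exceeds the PRMSE of the total score as a predictor of the true subscore.
    s = sub_items.sum(axis=1)            # observed subscore
    x = s + rest_items.sum(axis=1)       # observed total score
    rel_s = coefficient_alpha(sub_items)
    var_s, var_x = s.var(ddof=1), x.var(ddof=1)
    cov_sr = np.cov(s, x - s)[0, 1]      # subscore vs. remaining items
    # covariance of the true subscore with the total score, assuming the
    # measurement errors of the two item sets are uncorrelated
    cov_st_x = rel_s * var_s + cov_sr
    prmse_x = cov_st_x ** 2 / (rel_s * var_s * var_x)
    return rel_s, prmse_x, rel_s > prmse_x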

2002 ◽  
Vol 27 (3) ◽  
pp. 255-270 ◽  
Author(s):  
J.R. Lockwood ◽  
Thomas A. Louis ◽  
Daniel F. McCaffrey

Accountability for public education often requires estimating and ranking the quality of individual teachers or schools on the basis of student test scores. Although the properties of estimators of teacher or school effects are well established, less is known about the properties of rank estimators. We investigate the performance of rank (percentile) estimators in a basic, two-stage hierarchical model capturing the essential features of the more complicated models that are commonly used to estimate effects. We use simulation to study the mean squared error (MSE) performance of percentile estimates and to find the operating characteristics of decision rules based on estimated percentiles. Each depends on the signal-to-noise ratio (the ratio of the teacher or school variance component to the variance of the direct, teacher- or school-specific estimator) and only moderately on the number of teachers or schools. Results show that even when using optimal procedures, MSE is large for the commonly encountered variance ratios, with an unrealistically large ratio required for ideal performance. Percentile-specific MSE results reveal interesting interactions between variance ratios and estimators, especially for extreme percentiles, which are of considerable practical import. These interactions are apparent in the performance of decision rules for the identification of extreme percentiles, underscoring the statistical and practical complexity of the multiple-goal inferences faced in value-added modeling. Our results highlight the need to assess whether even optimal percentile estimators perform sufficiently well to be used in evaluating teachers or schools.
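A minimal sketch of the two-stage setup described above: simulate unit effects and noisy direct estimates, shrink them, rank them, and measure the MSE of estimated percentiles as the signal-to-noise ratio varies. The normal-normal model with known, equal sampling variances is an assumption chosen for brevity, not the paper's full simulation design.

import numpy as np

rng = np.random.default_rng(0)

def percentile_mse(n_units=100, var_ratio=1.0, n_rep=500):
    # Two-stage normal model: true effects theta_j ~ N(0, tau2) and direct
    # estimates y_j = theta_j + e_j with e_j ~ N(0, sigma2); var_ratio is
    # the signal-to-noise ratio tau2 / sigma2 discussed in the abstract.
    tau2, sigma2 = var_ratio, 1.0
    mses = []
    for _ in range(n_rep):
        theta = rng.normal(0.0, np.sqrt(tau2), n_units)
        y = theta + rng.normal(0.0, np.sqrt(sigma2), n_units)
        # posterior means (shrinkage estimates) with known, equal variances;
        # with equal sampling variances their ranks coincide with those of y
        post = y * tau2 / (tau2 + sigma2)
        true_pct = (np.argsort(np.argsort(theta)) + 0.5) / n_units
        est_pct = (np.argsort(np.argsort(post)) + 0.5) / n_units
        mses.append(np.mean((est_pct - true_pct) ** 2))
    return float(np.mean(mses))

for ratio in (0.1, 1.0, 10.0):
    print(f"signal-to-noise {ratio:>4}: percentile MSE = "
          f"{percentile_mse(var_ratio=ratio):.4f}")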


2014 ◽  
Vol 2 (2) ◽  
pp. 47-58
Author(s):  
Ismail Sh. Baqer

A two-level image quality enhancement method is proposed in this paper. In the first level, the Dualistic Sub-Image Histogram Equalization (DSIHE) method decomposes the original image into two sub-images based on the median of the original image. The second level deals with spike-shaped noise that may appear in the image after processing. We present three image enhancement methods, GHE, LHE, and the proposed DSIHE, that improve the visual quality of images. A comparative analysis of the above-mentioned techniques is carried out using objective and subjective image quality measures, e.g., Peak Signal-to-Noise Ratio (PSNR), entropy (H), and mean squared error (MSE), to assess the quality of enhanced grayscale images. For gray-level images, conventional histogram equalization methods such as GHE and LHE tend to shift the mean brightness of an image to the middle of the gray-level range, limiting their appropriateness for contrast enhancement in consumer electronics such as TV monitors. The DSIHE method appears to overcome this disadvantage, as it tends to preserve brightness while still enhancing contrast. Experimental results show that the proposed technique gives better results in terms of discrete entropy, signal-to-noise ratio, and mean squared error than the global and local histogram-based equalization methods.
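For readers unfamiliar with DSIHE, the sketch below shows the first-level step in NumPy: the image is split at its median gray level and each sub-image is histogram-equalized within its own range, with simple MSE, PSNR, and entropy helpers for the quality measures mentioned. This is an illustrative reimplementation under common assumptions (8-bit grayscale input), not the authors' code, and the second-level noise treatment is omitted.

import numpy as np

def dsihe(img):
    # Dualistic Sub-Image Histogram Equalization (sketch).
    # img: 2-D uint8 grayscale array; returns the equalized uint8 image.
    med = int(np.median(img))
    out = np.empty_like(img)

    def equalize_range(mask, lo, hi):
        # equalize the pixels selected by `mask` within the gray range [lo, hi]
        vals = img[mask].astype(np.int64)
        if vals.size == 0:
            return
        hist = np.bincount(vals - lo, minlength=hi - lo + 1)
        cdf = np.cumsum(hist) / vals.size
        mapping = (lo + np.round(cdf * (hi - lo))).astype(np.uint8)
        out[mask] = mapping[vals - lo]

    equalize_range(img <= med, 0, med)        # lower sub-image -> [0, median]
    equalize_range(img > med, med + 1, 255)   # upper sub-image -> [median+1, 255]
    return out

def mse(a, b):
    return float(np.mean((a.astype(float) - b.astype(float)) ** 2))

def psnr(a, b):
    m = mse(a, b)
    return float("inf") if m == 0 else 10 * np.log10(255.0 ** 2 / m)

def entropy(img):
    p = np.bincount(img.ravel(), minlength=256) / img.size
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))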


2020 ◽  
Vol 10 (24) ◽  
pp. 8904
Author(s):  
Ana Isabel Montoya-Munoz ◽  
Oscar Mauricio Caicedo Rendon

The reliability of data collection is essential in Smart Farming supported by the Internet of Things (IoT). Several IoT- and Fog-based works consider the reliability concept, but they fall short in providing mechanisms at the network's edge for detecting and replacing outliers. Making decisions based on inaccurate data can diminish the quality of crops and, consequently, cause financial losses. This paper proposes an approach for providing reliable data collection that focuses on outlier detection and treatment in IoT-based Smart Farming. Our proposal includes an architecture based on the IoT-Fog-Cloud continuum, which incorporates a mechanism based on Machine Learning to detect outliers and another based on interpolation to infer data intended to replace them. We locate the data cleaning at the Fog so that Smart Farming applications operating on the farm work with reliable data. We evaluate our approach by carrying out a case study in a network based on the proposed architecture and deployed at a Colombian coffee Smart Farm. Results show that our mechanisms achieve high Accuracy, Precision, and Recall, as well as a low False Alarm Rate and Root Mean Squared Error, when detecting and replacing outliers with inferred data. Considering the obtained results, we conclude that our approach provides reliable data collection in Smart Farming.
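The abstract does not name the specific ML detector, so the sketch below uses scikit-learn's IsolationForest as a stand-in, followed by linear interpolation over the flagged readings, on a synthetic sensor stream. The detector choice, contamination rate, and the signal itself are all assumptions for illustration, not the paper's mechanism.

import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# synthetic temperature-like stream with injected spikes
n = 500
t = np.arange(n)
baseline = 22 + 2 * np.sin(2 * np.pi * t / 144)      # daily cycle
series = baseline + rng.normal(0, 0.2, n)
spike_idx = rng.choice(n, size=15, replace=False)
series[spike_idx] += rng.choice([-8.0, 8.0], size=15)

# 1) ML-based outlier detection at the fog node
detector = IsolationForest(contamination=0.05, random_state=0)
labels = detector.fit_predict(series.reshape(-1, 1))  # -1 = outlier
outliers = labels == -1

# 2) interpolation-based replacement of the flagged readings
clean = series.copy()
clean[outliers] = np.interp(t[outliers], t[~outliers], series[~outliers])

rmse = np.sqrt(np.mean((clean - baseline) ** 2))
print(f"flagged {outliers.sum()} readings; "
      f"RMSE vs. the noise-free baseline after replacement: {rmse:.3f}")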


2004 ◽  
Vol 61 (7) ◽  
pp. 1075-1082 ◽  
Author(s):  
Steven T Kalinowski

Genetic data can be used to estimate the stock composition of mixed-stock fisheries. Designing efficient strategies for estimating mixture proportions is important, but several aspects of study design remain poorly understood, particularly the relationship between genetic polymorphism and estimation error. In this study, computer simulation was used to investigate how the following variables affect expected squared error of mixture estimates: the number of loci examined, the number of alleles at those loci, and the size of baseline data sets. This work showed that (i) loci with more alleles produced estimates of stock proportions that had a lower expected squared error than less polymorphic loci, (ii) highly polymorphic loci did not require larger samples than less polymorphic loci, and (iii) the total number of independent alleles examined is a reasonable indicator of the quality of estimates of stock proportions.
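A hedged sketch of the kind of simulation described: baseline allele frequencies are drawn, a mixed-stock sample is simulated, mixture proportions are estimated by maximum likelihood via EM (assuming Hardy-Weinberg equilibrium and estimated baselines), and squared error is averaged over replicates. The parameter values, Dirichlet priors, and pseudo-counts are illustrative assumptions, not the study's settings.

import numpy as np

rng = np.random.default_rng(1)

def expected_sq_error(n_loci=10, n_alleles=4, baseline_n=100, mix_n=200,
                      true_p=(0.6, 0.4), n_rep=20):
    true_p = np.asarray(true_p)
    n_stocks = true_p.size
    errs = []
    for _ in range(n_rep):
        # true allele frequencies per stock and locus, and noisy baseline
        # estimates from finite baseline samples (with a small pseudo-count)
        freqs = rng.dirichlet(np.ones(n_alleles), size=(n_stocks, n_loci))
        est = np.array([[rng.multinomial(2 * baseline_n, freqs[s, l]) + 1.0
                         for l in range(n_loci)] for s in range(n_stocks)])
        est /= est.sum(axis=2, keepdims=True)

        # simulate the mixed-stock sample: two alleles per locus per fish
        stock = rng.choice(n_stocks, size=mix_n, p=true_p)
        geno = np.array([[rng.choice(n_alleles, size=2, p=freqs[stock[i], l])
                          for l in range(n_loci)] for i in range(mix_n)])

        # per-fish log-likelihood under each candidate stock (HWE assumed)
        loglik = np.zeros((mix_n, n_stocks))
        for s in range(n_stocks):
            for l in range(n_loci):
                loglik[:, s] += (np.log(est[s, l, geno[:, l, 0]]) +
                                 np.log(est[s, l, geno[:, l, 1]]))

        # EM updates for the mixture proportions
        p = np.full(n_stocks, 1.0 / n_stocks)
        for _ in range(200):
            w = np.exp(loglik - loglik.max(axis=1, keepdims=True)) * p
            w /= w.sum(axis=1, keepdims=True)
            p = w.mean(axis=0)
        errs.append(np.sum((p - true_p) ** 2))
    return float(np.mean(errs))

# fewer alleles per locus should give a larger expected squared error
print(expected_sq_error(n_alleles=2), expected_sq_error(n_alleles=8))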


2007 ◽  
Vol 89 (3) ◽  
pp. 135-153 ◽  
Author(s):  
JINLIANG WANG

Knowledge of the genetic relatedness among individuals is essential in diverse research areas such as behavioural ecology, conservation biology, quantitative genetics and forensics. How to estimate relatedness accurately from genetic marker information has been explored recently by many methodological studies. In this investigation I propose a new likelihood method that uses the genotypes of a triad of individuals in estimating pairwise relatedness (r). The idea is to use a third individual as a control (reference) in estimating the r between two other individuals, thus reducing the chance of genes identical in state being mistakenly inferred as identical by descent. The new method allows for inbreeding and accounts for genotype errors in data. Analyses of both simulated and human microsatellite and SNP datasets show that the quality of r estimates (measured by the root mean squared error, RMSE) is generally improved substantially by the new triadic likelihood method (TL) over the dyadic likelihood method and five moment estimators. Simulations also show that genotyping errors/mutations, when ignored, result in underestimates of r for related dyads, and that incorporating a model of typing errors in the TL method improves r estimates for highly related dyads but impairs those for loosely related or unrelated dyads. The effects of inbreeding were also investigated through simulations. It is concluded that, because most dyads in a natural population are unrelated or only loosely related, the overall performance of the new triadic likelihood method is the best, offering r estimates with an RMSE that is substantially smaller than those of the five commonly used moment estimators and the dyadic likelihood method.


2021 ◽  
Vol 5 (1) ◽  
pp. 126-133
Author(s):  
Afriyani Afriyani ◽  
Muhammad Yazid ◽  
Desi Aryani

Lahat is one of the Robusta coffee production centers in South Sumatra. The coffee beans produced by this district are often used as raw material by coffee shops in Palembang because of the distinctive taste and aroma that coffee lovers appreciate. Coffee shops open new opportunities for Robusta coffee farming. This study aims to analyze the flow of the supply chain and the added value of Lahat coffee beans used by coffee shops. The research was conducted through a survey of four coffee shops in Palembang. The results showed that there are two supply chain lines: (1) coffee farmers - collectors - retailers - market traders - consumers; (2) coffee farmers - processors - coffee shops - consumers. The second pattern is better and more profitable than the first because the quality of the coffee produced is higher. The average added value obtained from processing one kilogram of coffee cherries into ground coffee is Rp 158,132.94, from coffee beans into green beans is Rp 427,798.55, and from green beans into a cup of coffee is Rp 1,029,269.00. These values indicate that processing coffee cherries into powder and processing selected coffee cherries in coffee shops are profitable.


2015 ◽  
Vol 5 (1) ◽  
pp. 27 ◽  
Author(s):  
Hikmah Hikmah

Title: Strategy for Developing the E. cottonii Seaweed Processing Industry to Increase Value Added in Central Areas of Industrialization.
This paper aims to assess the potential, opportunities, and problems of developing a seaweed industry. Seaweed is an aquaculture commodity that can improve the economy, provide employment, and increase foreign exchange. The potential distribution of seaweed in Indonesia is very wide, both naturally grown and cultivated in the sea. Opportunities for developing a seaweed industry remain open in view of the potential for cultivation area, the availability of raw materials, and the demand for processed products. Problems and challenges related to Indonesia's ability to export and compete for world market share in meeting global seaweed demand include the low quality and continuity of raw materials, limited capital, weak human resources and institutions, and problems in marketing seaweed products. The policy strategy for developing the E. cottonii seaweed processing industry to increase value added consists of improving the productivity and quality of seaweed, gradually developing the semi-finished seaweed processing industry (ATC, SRC, and RC) in central seaweed production areas, and scaling up the processing of ready-to-consume seaweed from traditional scale to industrial scale.


2019 ◽  
Author(s):  
Shanaz A. Ghandhi ◽  
Igor Shuryak ◽  
Shad R. Morton ◽  
Sally A. Amundson ◽  
David J. Brenner

In the event of a nuclear attack or radiation event, there would be an urgent need for assessing and reconstructing the dose to which hundreds or thousands of individuals were exposed. These measurements would require a rapid assay to facilitate triage and medical management of individuals based on dose. Our approaches to developing rapid assays for reconstructing dose using transcriptomics have led to the identification of gene sets that have the potential to be used in the field but need further testing. This was a proof-of-principle study of new methods using radiation-responsive genes to generate quantitative, rather than categorical, radiation-dose reconstructions based on a blood sample. We used a new normalization method to reduce the effects of variability of gene signals in unirradiated samples across studies; developed a quantitative dose-reconstruction method that is generally under-utilized compared to categorical methods; and combined these to determine a gene set to serve as a reconstructor. Our dose-reconstruction biomarker was trained on two data sets and tested on two independent ones. It was able to predict dose up to 4.5 Gy with a root mean squared error (RMSE) of ± 0.35 Gy on test datasets from the same platform, and up to 6.0 Gy with an RMSE of 1.74 Gy on a dataset from a different platform.
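As a rough illustration of the quantitative (regression-style) reconstruction idea, the sketch below fits a ridge regression of dose on a small synthetic gene-expression panel and reports the RMSE in Gy on held-out samples. The gene set, the per-sample median centering, and ridge regression are all stand-ins chosen for the example, not the authors' actual genes, normalization method, or model.

import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

# hypothetical radiation-responsive gene panel with fixed per-gene dose slopes
n_genes = 20
slope = rng.normal(1.0, 0.3, n_genes)

def simulate(n_samples):
    # synthetic expression data: dose-dependent signal plus noise
    dose = rng.uniform(0.0, 4.5, n_samples)
    expr = dose[:, None] * slope + rng.normal(0, 0.5, (n_samples, n_genes))
    # per-sample median centering, a stand-in for the cross-study
    # normalization idea (assumption, not the paper's exact procedure)
    expr -= np.median(expr, axis=1, keepdims=True)
    return expr, dose

X_train, y_train = simulate(200)
X_test, y_test = simulate(100)

model = Ridge(alpha=1.0).fit(X_train, y_train)   # quantitative reconstructor
pred = model.predict(X_test)
rmse = np.sqrt(np.mean((pred - y_test) ** 2))
print(f"RMSE of reconstructed dose on held-out samples: {rmse:.2f} Gy")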


2021 ◽  
Vol 8 ◽  
Author(s):  
A. Christoper Tamilmathi ◽  
P. L. Chithra

This paper introduces a novel deep-learned quantization-based coding scheme for 3D airborne LiDAR (Light Detection and Ranging) point cloud (pcd) images (DLQCPCD). The raw pcd signals are sampled and transformed by applying Nyquist signal sampling and min-max signal transformation, respectively, to improve the efficiency of the training process. The transformed signals are then fed into the deep-learned quantization module for compression. To the best of our knowledge, the proposed DLQCPCD is the first deep learning-based model for 3D airborne LiDAR pcd compression. The Mean Squared Error loss and Stochastic Gradient Descent optimization functions enhance the quality of the decompressed image by 67.01 percent on average compared to other functions. The model's efficiency has been validated against established compression techniques such as 7-Zip, WinRAR, and the tensor Tucker decomposition algorithm on three inconsistent airborne datasets. The experimental results show that the proposed model compresses every pcd image into a constant 16 neurons of data and decompresses the image with approximately 160 dB PSNR, a 174.46 s execution time, and an execution speed of 0.6 s per instruction, and that it outperforms the other existing algorithms in terms of space and time.
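The paper's quantizer is learned, but the surrounding pipeline it describes (min-max transformation, quantized representation, reconstruction error, PSNR) can be illustrated with a plain uniform scalar quantizer. The sketch below is only that simpler stand-in on synthetic points; the bit depth, the synthetic data, and uniform quantization are assumptions, not the DLQCPCD model.

import numpy as np

def minmax_transform(x):
    # min-max transform of raw point-cloud coordinates to [0, 1]
    lo, hi = x.min(axis=0), x.max(axis=0)
    return (x - lo) / (hi - lo), lo, hi

def quantize(x01, bits=8):
    levels = 2 ** bits - 1
    return np.round(x01 * levels).astype(np.uint16), levels

def dequantize(q, levels, lo, hi):
    return q / levels * (hi - lo) + lo

# toy airborne-like point cloud: N x 3 (x, y, elevation), synthetic data
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(0, 1000, 10000),
                       rng.uniform(0, 1000, 10000),
                       rng.normal(200, 15, 10000)])

x01, lo, hi = minmax_transform(pts)
q, levels = quantize(x01, bits=8)
rec = dequantize(q, levels, lo, hi)

mse = np.mean((pts - rec) ** 2)
peak = (hi - lo).max()
psnr = 10 * np.log10(peak ** 2 / mse)
print(f"MSE = {mse:.4f}, PSNR = {psnr:.1f} dB")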


2018 ◽  
Vol 13 (02) ◽  
Author(s):  
Reika Fichristika Kutika ◽  
David P. E. Saerang ◽  
Natalia Y. T. Gerungai

Nowadays, changes in the business environment create competition between companies in terms of seizing the market. For these companies, one way to achieve excellence is to focus constantly on improving their processes and activities, paying attention to quality, flexibility, and cost efficiency. Given the competition among manufacturing companies today, PT. Indofood CBP Sukses Makmur, Tbk Bitung Branch must manage its activities effectively and efficiently in order to achieve competitive advantage. What companies need to do to improve efficiency is therefore to manage the activities that occur without reducing the quality of the products provided to customers. This study aims to find out how activity-based management is applied at PT. Indofood CBP Sukses Makmur, Tbk Bitung Branch and how eliminating non-value-added activities can improve the efficiency of the company. The data were analyzed using descriptive analysis with a qualitative approach and the activity-based management method. The research indicates that some non-value-added activities still occur and that, by applying the activity-based management method, costs can be reduced by eliminating activities that add no value. Using the activity-based management method, the total factory overhead cost is reduced by Rp 2,384,750,669.84, or 20.30%.
Keywords: Activity Based Management, Non Value Added Activity, Efficiency

