Comparison of Two Data Entry Methods

1965 ◽  
Vol 20 (2) ◽  
pp. 369-384 ◽  
Author(s):  
William K. Earl ◽  
James D. Goff

The purpose of this experiment was to measure the effects of a number of display and input variables on the relative speed and accuracy of input performance when using point-in and type-in data entry methods for entering alphabetical material into automatic data processing machines. The factors tested in the experimental design were: types of arrangement of display material, density of material, different types of input tasks, typing ability, sex, and relative location of the keypunch device to the operator. The major finding of this study was that the point-in data entry method was a more accurate input technique than either the type-in or mixed point-in type-in data entry methods when measured under the effects of the independent variables.

Membranes ◽  
2021 ◽  
Vol 11 (1) ◽  
pp. 70
Author(s):  
Jasir Jawad ◽  
Alaa H. Hawari ◽  
Syed Javaid Zaidi

The forward osmosis (FO) process is an emerging technology that has been considered as an alternative to desalination due to its low energy consumption and less severe reversible fouling. Artificial neural networks (ANNs) and response surface methodology (RSM) have become popular for the modeling and optimization of membrane processes. RSM requires data from a specific experimental design, whereas ANN does not. In this work, a combined ANN-RSM approach is presented to predict and optimize the membrane flux for the FO process. The ANN model, developed based on an experimental study, is used to predict the membrane flux for the experimental design in order to create the RSM model for optimization. A Box–Behnken design (BBD) is used to develop a response surface design in which the ANN model evaluates the responses. The input variables were osmotic pressure difference, feed solution (FS) velocity, draw solution (DS) velocity, FS temperature, and DS temperature. The R2 values obtained for the developed ANN and RSM models are 0.98036 and 0.9408, respectively. The weights of the ANN model and the response surface plots were used to optimize and study the influence of the operating conditions on the membrane flux.
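The surrogate-assisted RSM idea described above can be sketched in a few lines: a stand-in model (here a hypothetical smooth function, playing the role of the trained ANN) evaluates the responses on a Box–Behnken-style design, and a full quadratic response surface is then fit to those responses. The function, factor count, and coefficients are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical surrogate standing in for the trained ANN: maps the five
# coded operating variables (scaled to [-1, 1]) to a membrane-flux value.
def surrogate_flux(x):
    # x: (n, 5) array of coded variables; an arbitrary smooth quadratic.
    return 20 + 3*x[:, 0] + 1.5*x[:, 1] - 2*x[:, 2]**2 + 0.5*x[:, 0]*x[:, 3]

# Simplified Box-Behnken-style design for k factors: each pair of factors
# at +/-1 with the remaining factors held at the centre, plus a centre point.
def box_behnken(k):
    rows = []
    for i in range(k):
        for j in range(i + 1, k):
            for a in (-1.0, 1.0):
                for b in (-1.0, 1.0):
                    r = [0.0] * k
                    r[i], r[j] = a, b
                    rows.append(r)
    rows.append([0.0] * k)  # centre point
    return np.array(rows)

X = box_behnken(5)
y = surrogate_flux(X)

# Full quadratic RSM model: intercept, linear, square, and interaction terms.
def quad_features(X):
    n, k = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(k)]
    cols += [X[:, i]**2 for i in range(k)]
    cols += [X[:, i] * X[:, j] for i in range(k) for j in range(i + 1, k)]
    return np.column_stack(cols)

F = quad_features(X)
beta, *_ = np.linalg.lstsq(F, y, rcond=None)
pred = F @ beta
r2 = 1 - np.sum((y - pred)**2) / np.sum((y - y.mean())**2)
```

Because the stand-in surrogate is itself quadratic, the fitted surface recovers it almost exactly; with a real ANN surrogate, the R2 of the quadratic fit (0.9408 in the paper) measures how well the response surface approximates the network over the design region.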


2018 ◽  
Vol 119 (2) ◽  
pp. 377-379 ◽  
Author(s):  
Jack Brooks ◽  
Jennifer Nicholas ◽  
Jennifer J. Robertson

Odor discrimination is a complex task that may be improved by increasing sampling time to facilitate evidence accumulation. However, experiments testing this phenomenon in olfaction have produced conflicting results. To resolve this disparity, Frederick et al. (Frederick DE, Brown A, Tacopina S, Mehta N, Vujovic M, Brim E, Amina T, Fixsen B, Kay LM. J Neurosci 37: 4416–4426, 2017) conducted experiments that suggest that sampling time and performance are task dependent. Their findings have implications for understanding olfactory processing and experimental design, specifically the effect of subtle differences in experimental design on study results.


2015 ◽  
Author(s):  
Melissa E. Tribou ◽  
Geoffrey Swain

Ship hull grooming is proposed as an environmentally friendly method of controlling fouling on ship hulls. It is defined as the frequent and gentle cleaning of a coating while the ship is idle to prevent the establishment of fouling. Prior research by Tribou and Swain has evaluated the effectiveness of different methods and frequencies of grooming on different types of ship hull coatings. It was found that vertical rotating cup-style brushes provided the best method to maintain the coatings in a smooth and fouling-free condition. This study investigated brush design and operational parameters in relation to the normal forces imparted by the brushes to the surface. A brush stiffness factor was developed, and the independent variables for brush design were non-dimensionalized for the normal force. A load cell was used to measure the forces imparted by different brushes, and the models were validated using these non-dimensional terms. The knowledge gained by these studies will be used to optimize brush design for the implementation of grooming.


Author(s):  
Anuj Kumar ◽  
Pranay Mohadikar ◽  
Fiona Mary Anthony ◽  
Diwakar Z. Shende ◽  
Kailas L. Wasewar ◽  
...  

Abstract Glutaric acid is an attractive chemical compound that can be used for the manufacture of polyesters, polyamides, and polyols. It can be produced by synthesis (chemical method) or fermentation (biological method). Glutaric acid is present in only small quantities in fermentation broth and industrial waste streams, and its separation from fermentation broth is difficult, costly, and not environment-friendly. Reactive separation is a simple, inexpensive, and environment-friendly process for the recovery of carboxylic acids, and it can be employed for the separation of glutaric acid at lower cost. In this study, response surface methodology (RSM) was used as a mathematical technique to design the experiments and optimize the reactive separation of glutaric acid from the aqueous phase. Per the RSM study, 20 experiments were performed with different independent variables: glutaric acid concentration, % v/v of trioctylamine, and pH. The optimum condition, with a maximum efficiency (η) of 92.03%, was observed for 20% trioctylamine and pH = 3 at an initial glutaric acid concentration of 0.08 mol/L. A lower concentration of trioctylamine provides sufficient extraction efficiency of glutaric acid. The method can therefore also be used for separation from fermentation broth, because the lower trioctylamine concentration makes the process environment-friendly. The quadratic response surface model defined at the optimized conditions is significant, with an R2 of 0.9873, and the effects of the independent variables on the extraction efficiency were quantified. These data can be used for the separation of glutaric acid from industrial waste and fermentation broth.


Author(s):  
Christian Kaspar ◽  
Adam Melski ◽  
Britta Lietke ◽  
Madlen Boslau ◽  
Svenja Hagenhoff

Radio frequency identification (RFID) is a radio-supported identification technology that typically operates by saving a serial number on a radio transponder that contains a microchip for data storage. Via radio waves, the coded information is communicated to a reading device (Jones et al., 2005). RFID does not represent a new development; it was devised by the American military in the 1940s. Since the technology’s clearance for civil use in 1977, RFID has been successfully used for the identification of productive livestock, for electronic immobilizer systems in vehicles, or for the surveillance of building entrances (Srivastava, 2005). Due to decreasing unit costs (especially for passive transponders), RFID technologies now seem increasingly applicable for the labeling of goods and semi-finished products. By this, manual or semi-automatic data entry, for instance through the use of barcodes, can be avoided. This closes the technical gap between the real world (characterized by the lack of distribution transparency of its objects) and the digital world (characterized by logically and physically unambiguous and therefore distribution-transparent objects). In addition, RFID facilitates fully automated simultaneous recognition of more than one transponder without direct line of sight between reader and transponders.


2020 ◽  
pp. 014544552094632
Author(s):  
Chad E. L. Kinney ◽  
John C. Begeny ◽  
Scott A. Stage ◽  
Sierra Patterson ◽  
Amirra Johnson

Making treatment decisions based upon graphed data is important in helping professions. A small amount of research has compared usability between equal-interval and semi-log graphs, but no prior studies have compared different types of semi-log graphs. Using a randomized, cross-over, experimental design with 72 participants, this study examined the relative usability and acceptability of three types of graphs: Regular (equal-interval), Standard Celeration Chart (SCC; semi-log), and Standard Behavior Graph (SBG; semi-log). All participants used each graph across three usability tasks (Plotting Data, Writing Values, and Interpreting Trends). For the Plotting and Writing tasks, the equal-interval graph produced the greatest rate of correct responses. However, for the Interpreting task the SBG produced the greatest rate of correct responses, while the equal-interval graph produced the smallest rate. User acceptability mainly favored the equal-interval and SBG graphs. Study findings and implications are discussed with respect to graph usability and acceptability during day-to-day practice.


1970 ◽  
Vol 2 (2) ◽  
pp. 67-74
Author(s):  
AKM Rezanur Rahman

Attempts were made to examine the interactive relations of gender, residence and social stratification with different types of aggressive behaviour. The independent variables were gender, residence and socio-economic status. The types of behaviour examined were physical, verbal, anger, hostile and indirect aggression. A total of 240 respondents between 13 and 16 years of age constituted the sample of the study. The Measure of Aggressive Behaviour (MAB) was used for data collection. The study utilized a 2 × 2 × 3 factorial design consisting of two levels of gender (male/female), two levels of residential background (urban/rural), and three levels of socio-economic status (high/middle/low). The results were computed on each dimension separately using t-tests. The findings revealed interactive relations of gender, residence and social stratification with different types of aggression.
Key words: Aggression; antisocial behaviour; gang related violence; autism; attention deficit disorder
DOI: 10.3329/jles.v2i2.7500
J. Life Earth Sci., Vol. 2(2) 67-74, 2007
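The per-dimension t-test comparisons described above can be illustrated with a minimal sketch. The scores, group sizes, and effect are hypothetical stand-ins for one MAB dimension compared across one factor (e.g. the two gender levels); Welch's unequal-variance form is used here as a common, conservative choice, not necessarily the variant in the study.

```python
import numpy as np

# Hypothetical aggression scores for two independent groups, standing in
# for one dimension of the MAB measure across one two-level factor.
rng = np.random.default_rng(0)
group_a = rng.normal(loc=52.0, scale=8.0, size=60)
group_b = rng.normal(loc=47.0, scale=8.0, size=60)

def welch_t(a, b):
    """Welch's t statistic and Welch-Satterthwaite degrees of freedom."""
    va, vb = a.var(ddof=1), b.var(ddof=1)
    na, nb = len(a), len(b)
    se2 = va / na + vb / nb                      # squared standard error
    t = (a.mean() - b.mean()) / np.sqrt(se2)
    df = se2**2 / ((va / na)**2 / (na - 1) + (vb / nb)**2 / (nb - 1))
    return t, df

t, df = welch_t(group_a, group_b)
```

Each aggression dimension (physical, verbal, anger, hostile, indirect) would get its own such comparison, which is what "computed on each dimension separately using t-tests" implies.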


Author(s):  
Lauren Parikhal ◽  
Hillary Abraham ◽  
Alea Mehler ◽  
Thomas McWilliams ◽  
Jonathan Dobres ◽  
...  

Allergen information on food labels is not standardized, making allergen avoidance difficult for consumers. This study investigated the speed and accuracy of allergen identification on commercial packaging across different types of warning labels. The results identified packaging label characteristics significantly correlated with faster and more accurate identification of allergens. Standardizing warning and safe-to-consume labels may reduce risk of accidental allergen exposure for consumers managing food allergies.


Metals ◽  
2019 ◽  
Vol 9 (11) ◽  
pp. 1198 ◽  
Author(s):  
Saldaña ◽  
González ◽  
Jeldres ◽  
Villegas ◽  
Castillo ◽  
...  

Multivariate analytical models are quite successful in explaining one or more response variables based on one or more independent variables. However, they do not reflect the connections of conditional dependence between the variables that explain the model. In contrast, due to their qualitative and quantitative nature, Bayesian networks allow us to easily visualize the probabilistic relationships between variables of interest, as well as to make inferences such as prediction from specific evidence (partial or complete), diagnosis, and decision-making. The current work develops stochastic modeling of the leaching phase in piles by generating a Bayesian network that describes the ore recovery in terms of independent variables, after analyzing the uncertainty of the response to the sensitization of the input variables. These models allow us to recognize the relations of dependence and causality between the sampled variables and can estimate the output even in the absence of evidence. The network setting shows that the variables with the most significant impact on recovery are the time, the heap height and the superficial velocity of the leaching flow, while validation is supported by the low values of the error statistics and the normality test of the residuals. Finally, probabilistic networks are unique tools for determining and internalizing the risk or uncertainty present in the input variables, due to their ability to generate estimates of recovery based upon partial knowledge of the operational variables.
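The kind of inference the abstract describes can be sketched with a toy discrete network. The structure (leaching time and heap height as parents of recovery) loosely mirrors the variables named above, but every probability table here is an illustrative assumption, not a value from the paper.

```python
# Minimal discrete Bayesian network: Time and HeapHeight are parents of
# Recovery. All conditional probability tables (CPTs) are hypothetical.
p_time = {"short": 0.5, "long": 0.5}
p_height = {"low": 0.4, "high": 0.6}

# P(Recovery = "high" | Time, HeapHeight)
p_rec_high = {
    ("short", "low"): 0.20,
    ("short", "high"): 0.35,
    ("long", "low"): 0.60,
    ("long", "high"): 0.80,
}

def joint(t, h, r):
    """Joint probability P(Time=t, HeapHeight=h, Recovery=r) from the CPTs."""
    pr = p_rec_high[(t, h)]
    return p_time[t] * p_height[h] * (pr if r == "high" else 1 - pr)

def posterior_time_given_recovery(r):
    """P(Time | Recovery = r) by enumerating over the hidden variable."""
    scores = {t: sum(joint(t, h, r) for h in p_height) for t in p_time}
    z = sum(scores.values())
    return {t: s / z for t, s in scores.items()}

post = posterior_time_given_recovery("high")
```

Observing a high recovery shifts belief toward long leaching times, which is the "diagnosis from partial evidence" use case: evidence on any subset of nodes updates the distribution over the rest.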


2006 ◽  
Vol 18 (4) ◽  
pp. 749-759 ◽  
Author(s):  
Nicola Ancona ◽  
Sebastiano Stramaglia

We consider kernel-based learning methods for regression and analyze what happens to the risk minimizer when new variables, statistically independent of the input and target variables, are added to the set of input variables. This problem arises, for example, in the detection of causality relations between two time series. We find that the risk minimizer remains unchanged if we constrain the risk minimization to hypothesis spaces induced by suitable kernel functions. We show that not all kernel-induced hypothesis spaces enjoy this property. We present sufficient conditions ensuring that the risk minimizer does not change and show that they hold for inhomogeneous polynomial and Gaussian radial basis function kernels. We also provide examples of kernel-induced hypothesis spaces whose risk minimizer changes if independent variables are added as input.
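The setup can be probed empirically with a small kernel ridge regression sketch: fit a Gaussian RBF model on the original input, then again after appending a variable independent of both input and target. The data, kernel width, and regularization strength are arbitrary assumptions, and a finite-sample fit only approximates the population risk minimizer that the paper's invariance result concerns.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix K[i, j] = exp(-gamma * ||a_i - b_j||^2)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kernel_ridge_fit(X, y, gamma=1.0, lam=1e-3):
    """Dual coefficients alpha of kernel ridge regression."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def kernel_ridge_predict(alpha, X_train, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(80, 1))
y = np.sin(3 * X[:, 0])                 # target depends on x only

# Augment the input with a variable independent of both x and y.
Z = rng.normal(size=(80, 1))
X_aug = np.hstack([X, Z])

a1 = kernel_ridge_fit(X, y)
a2 = kernel_ridge_fit(X_aug, y)
pred1 = kernel_ridge_predict(a1, X, X)
pred2 = kernel_ridge_predict(a2, X_aug, X_aug)
```

With enough data, the paper's result predicts that for the Gaussian RBF kernel the two risk minimizers coincide on the original inputs; at finite sample sizes the augmented fit will differ somewhat, which is exactly why the asymptotic invariance property matters for causality detection.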

