Quantifying Operator Preference during Human Factors Test and Evaluation

1978 ◽  
Vol 22 (1) ◽  
pp. 24-28
Author(s):  
Michael L. Fineberg

The present paper describes the construction and pilot testing of a human factors evaluation instrument. The instrument was constructed using psychometric procedures generally applied to the development of attitude scales. Its goal was the quantification of operator preference in helicopter design within four major areas of human factors consideration: handling qualities, comfort/discomfort, human engineering design, and safety. Each area had a common scale against which ten specific parameters were evaluated. The ten items within each area were chosen using system operators' expertise, human factors standards, the human factors experimental literature, and the experience of the authors. The instrument was validated using a sample of 16 aviators during the conduct of an actual operational test. These validation studies indicated a test-retest reliability of .85 (p < .001) and an inter-rater reliability of .93 (p < .001). Use of the instrument provided statistically significant differences among aircraft candidates under various operational test conditions, as measured within each of the four indices of the instrument. It is concluded that the instrument provides an effective method of quantifying the preference of operational aviators. The scrupulous development process, using inputs from experienced aviators, experts in aviation field test design, and experts in test construction, has given the instrument a high measure of construct validity.
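The reliability coefficients reported above are product-moment correlations over paired ratings. As a minimal sketch of how such a test-retest coefficient is computed (the rating values and the `pearson_r` helper below are illustrative assumptions, not data or code from the paper):

```python
from math import sqrt
from statistics import mean

def pearson_r(x, y):
    """Pearson product-moment correlation between two paired lists of scores."""
    mx, my = mean(x), mean(y)
    # Covariance term (numerator) and product of standard-deviation terms (denominator)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den

# Hypothetical preference ratings from the same aviators at two test sessions
session_1 = [6.0, 4.5, 5.0, 3.5, 6.5, 4.0, 5.5, 3.0]
session_2 = [5.5, 4.0, 5.5, 3.0, 6.0, 4.5, 5.0, 3.5]

print(round(pearson_r(session_1, session_2), 2))
```

A coefficient near 1 indicates that aviators ordered the candidates almost identically on both occasions, which is the sense in which the paper's .85 figure supports test-retest reliability.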

1986 ◽  
Vol 30 (12) ◽  
pp. 1146-1148
Author(s):  
Michael L. Frazier ◽  
Bruce H. Taylor

Despite widespread policy support and increasingly sophisticated measurement tools, the evaluation of human factors issues within operational test and evaluation (OT&E) continues to lag behind the evaluation of other system elements. This situation can be traced, in part, to difficulties in integrating human factors findings with system performance measures. The present paper discusses one approach to this problem that is being implemented in the OT&E of the Consolidated Space Operations Center (CSOC).


1986 ◽  
Vol 30 (13) ◽  
pp. 1306-1310
Author(s):  
Brett A. Storey

This report describes a methodology of simulation research designed to meet the requirements of a human factors engineering simulation plan. This approach, accompanied by detailed test plans and schedules, will fulfill the data item DI-H-7052 (Human Engineering Dynamic Simulation Plan) for the intended use of dynamic simulation techniques in support of human engineering analysis, design support, and test and evaluation. The methodology covers the need for dynamic simulation; evaluation techniques, procedures, and guidelines; and the behavioral, subjective, and physiological methods recommended for use in human engineering evaluations.


1985 ◽  
Vol 29 (5) ◽  
pp. 499-503
Author(s):  
Thomas G. O'Brien

The paper summarizes current test and evaluation methods employed by the US Army. The role of human factors in weapons systems acquisition is discussed along with the author's perspective on problems related to the test and evaluation process. Utilization of human performance operational test data to improve the human-materiel interface before type classification or transition into the next phase of development is of particular concern. The paper suggests an alternative to current methods which would combine engineering and operational testing to address both technical and operational system critical issues.


1986 ◽  
Vol 30 (13) ◽  
pp. 1311-1315
Author(s):  
Gregory S. Krohn

The Fort Hood Field Unit of the U.S. Army Research Institute for the Behavioral and Social Sciences (ARI) supported the TRADOC Combined Arms Test Activity (TCATA) in its conduct of the Follow-On Evaluation (FOE) of the M9 ACE. The FOE was conducted at Fort Hood, Texas, over a 15-week period from March through June 1985, by TCATA for the U.S. Army Operational Test and Evaluation Agency (USAOTEA). This paper describes the human factors assessment support and findings that ARI provided to the evaluation. The primary objective of the assessment was to identify human factors engineering (man-machine) deficiencies that detract from M9 operational effectiveness and maintainability.



Author(s):  
Terence S. Andre ◽  
Samuel G. Charlton

Human factors operational test and evaluation (OT&E) at the function/characteristic level has not always provided an appropriate balance between the needs of the system user and those of the decision-maker. System users are primarily concerned with the characteristics and capabilities of their system. Acquisition decision-makers, on the other hand, are more concerned with force structure and how potential military systems fit within the national military strategy. Human factors OT&E has traditionally considered the user of the system by testing human factors at a characteristic, rather than mission or operational task, level. To address the needs of the decision-maker, OT&E has adopted a strategy-to-task formulation that can have the undesirable side effect of decreasing the visibility of human factors test results. Because human factors measures are considered at the system function/characteristic level, significant human performance and human-machine interface issues are not always visible at the level of higher task elements and missions. Systems that require significant human-in-control or human-in-the-loop operability may lend themselves to consideration at the task level. Testing human factors at the task level within the strategy-to-task framework provides both the decision-maker and the user with the information needed to buy and properly operate the system.


2010 ◽  
Author(s):  
Jennifer Ockerman ◽  
David Roberts ◽  
Albert A. Sciarretta ◽  
Dennis Folds ◽  
Susana McKee ◽  
...  

Author(s):  
Henry M. Parsons

Work in human factors encompasses research and application in human engineering, procedure development, training techniques, personnel requirements, test and evaluation, task description, and task allocation. Opportunities and needs exist in computer-based data processing systems for all these endeavors, especially with regard to on-line users. Within human engineering, only manual entry has so far received much research attention. Work is also needed on displays, integrated entry-display, workspace and other equipment aspects, on-line languages, and program production. Of greatest concern to human engineering is the computer output, designed by programmers, rather than the hardware. Human factors people will have to master a new field and provide guidance to a new discipline that has not yet understood human factors requirements.

