Human Factors in the Naval Air Systems Command: Computer Based Training

1988 ◽  
Vol 32 (16) ◽  
pp. 1095-1099
Author(s):  
Thomas L. Seamster ◽  
Cathrine E. Snyder ◽  
Michele Terranova ◽  
William J. Walker ◽  
D. Todd Jones

Military standards applied to private-sector contracts have a substantial effect on the quality of Computer Based Training (CBT) systems procured for the Naval Air Systems Command. This study evaluated standards regulating the following areas of CBT development and procurement: interactive training systems, cognitive task analysis, and CBT hardware. The objective was to develop high-level recommendations for evolving standards that will govern the next generation of CBT systems. A key recommendation is that the instructional systems development, human factors engineering, and software development standards be integrated. Recommendations were also made for task analysis and CBT hardware standards.

Author(s):  
James H. Hicinbothom

Human factors engineering research into human-computer interaction (HCI) requires means of studying HCI easily and objectively. The Instrumented Interface Construction (IICON) Evaluator, and its associated IICON Data Taps, provide such means. Any X Window System user interface built with development tools for which appropriate IICON Data Taps exist (e.g., IICON Data Taps/TAE+5.2) can be instrumented automatically. Instrumented interfaces can then be used by subjects while a complete transcript of user actions is recorded. Once an operator's session has been recorded, it can be replayed on-screen as needed for further observation and analysis. These capabilities are extremely helpful for developing cognitive task models (i.e., user models) for analytic purposes, or for constructing Intelligent Agents to be embedded in the operator's computer-based workstation. Furthermore, with a complete, objective record of all operator actions, the IICON Evaluator supports a wide variety of research investigations of HCI.
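The record-and-replay idea behind the IICON Data Taps can be illustrated with a minimal sketch: each UI event is timestamped as it occurs, and the resulting transcript can later be re-delivered to an observer in order. All class, method, and event names below are invented for illustration and are not part of the actual IICON tooling.

```python
import time

class EventRecorder:
    """Illustrative transcript recorder in the spirit of the IICON Data
    Taps: every user action is timestamped so a session can later be
    replayed in recorded order for observation and analysis."""

    def __init__(self):
        self.transcript = []

    def record(self, widget, action, **details):
        # Capture one user action as a timestamped transcript entry.
        self.transcript.append({
            "t": time.monotonic(),
            "widget": widget,
            "action": action,
            "details": details,
        })

    def replay(self, handler):
        # Re-deliver events to an observer (e.g., an analysis routine
        # or an on-screen playback component) in recorded order.
        for event in self.transcript:
            handler(event)

# Hypothetical session: two recorded actions, then a replay.
rec = EventRecorder()
rec.record("ok_button", "click", x=10, y=20)
rec.record("name_field", "keypress", key="a")
rec.replay(print)
```

Because the transcript is an ordinary list of dictionaries, the same record can feed cognitive-task-model construction, embedded agents, or aggregate HCI analyses without re-running the session.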


Author(s):  
Maria Lund Jensen ◽  
Jayme Coates

Development of implantable medical devices is becoming increasingly attractive to manufacturers, but identifying the right Human Factors Engineering (HFE) approach to ensure safe and effective use is challenging. Most active implantable devices are highly complex: they are built on extremely advanced, compact technology, often comprise systems of several device elements and accessories, and span various types of user interfaces that must support diverse interactions by several different user groups throughout the lifetime of the device. Furthermore, since treatment with implantable devices is often vital and by definition involves surgical procedures, potential risks related to use error can be severe. A systematic mapping of Product System Elements against Life Cycle Stages supports early identification of Use Cases, as well as user groups and high-level use risks, to be accounted for via HFE throughout development, optimizing Human Factors processes and patient outcomes. This paper presents a concrete matrix tool that facilitates an early, systematic approach to planning and frontloading Human Factors Engineering activities in complex medical device development.
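The element-by-life-cycle-stage mapping described above can be sketched as a simple matrix whose cells collect the use cases each combination implies. The specific elements, stages, and use cases below are invented for illustration; the paper's actual matrix content may differ.

```python
# Hypothetical Product System Elements and Life Cycle Stages for an
# active implantable device; real projects would enumerate their own.
elements = ["implant", "external controller", "charger"]
stages = ["implantation", "daily use", "follow-up", "explantation"]

# Each (element, stage) cell collects the use cases — and thereby the
# user groups and high-level use risks — that HFE must account for.
matrix = {(e, s): [] for e in elements for s in stages}
matrix[("implant", "implantation")].append("surgeon places and anchors device")
matrix[("external controller", "daily use")].append("patient adjusts therapy level")
matrix[("charger", "daily use")].append("caregiver recharges implant battery")

# Cells left empty flag combinations that still need early HFE attention.
uncovered = [cell for cell, cases in matrix.items() if not cases]
```

Walking the matrix cell by cell is what makes the approach systematic: every element is considered at every stage, so use cases that would otherwise surface late in development are frontloaded into HFE planning.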


Author(s):  
Kim J. Vicente

Cognitive task analysis (CTA) is increasingly being used to effectively address a wide variety of human factors problems. However, different researchers are using significantly different methods. In many cases, a particular method is used solely by its originators. Therefore, there are significant issues that must be worked through before CTA becomes a widely accepted and easily transferable human factors tool. The objectives of this symposium are to: bring CTA to the attention of a wider audience; develop a better understanding of the differences and similarities between different CTA methods; and demonstrate the practical advantages of CTA.


Author(s):  
Michael J. DeVries ◽  
Sallie E. Gordon

Because an increasing number of systems are being developed to support complex cognitive functioning, task analysis is commonly being augmented with cognitive task analysis, which identifies cognitive processes, knowledge, and mental models relevant to task performance. Cognitive task analysis tends to be lengthy and time-consuming, so designers frequently ask how they might know if it is actually necessary for a specific project. In this paper, we assume that much of the need for cognitive task analysis depends on the inherent “cognitive complexity” of the task. We present a model of cognitive complexity, and show how it was used to develop a computer-based tool for estimating relative cognitive complexity for a set of tasks. The tool, Cog-C, elicits task and subtask hierarchies, then guides the user in making relatively simple estimates on a number of scales. The tool calculates and displays the relative cognitive complexity scores for each task, along with subscores of cognitive complexity for different types of knowledge. Usability and reliability were evaluated in multiple domains, showing that the tool is relatively easy to use, reliable, and well-accepted.
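The kind of aggregation a tool like Cog-C performs can be sketched as follows: the analyst rates each subtask on simple scales per knowledge type, and the tool sums these into per-knowledge-type subscores and an overall relative complexity score. The rating scales, knowledge types, and weights below are invented for illustration, not taken from the actual tool.

```python
def cognitive_complexity(ratings):
    """Aggregate analyst ratings into a relative complexity score.

    ratings: {subtask_name: {knowledge_type: rating (1-5)}}
    Returns (total_score, per-knowledge-type subscores).
    """
    subscores = {}
    for scales in ratings.values():
        for ktype, value in scales.items():
            # Accumulate each knowledge type's contribution across subtasks.
            subscores[ktype] = subscores.get(ktype, 0) + value
    return sum(subscores.values()), subscores

# Hypothetical ratings for two subtasks of a supervisory-control task.
total, subscores = cognitive_complexity({
    "monitor alarms": {"declarative": 2, "procedural": 3, "mental model": 4},
    "diagnose fault": {"declarative": 4, "procedural": 2, "mental model": 5},
})
```

Scores produced this way are only meaningful relative to one another: a task with a markedly higher total than its peers is the one most likely to repay a full cognitive task analysis.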


1987 ◽  
Vol 31 (12) ◽  
pp. 1425-1428
Author(s):  
Robert M. Waters

The Human Factors Engineering products from the systems requirements phase of system development were transformed into techniques consistent with structured software development techniques. These techniques supported definition of the mission functions with the context diagram, made the task list compatible with the event list, and kept the high-level functional flow diagrams consistent with the structured data flow diagrams. In addition, the sequenced task analysis procedures provided a structured diagramming methodology in the form of state transition diagrams. This technique provided a method for defining MMI requirements in software engineering terminology.
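A state transition diagram of the kind mentioned above maps directly onto a transition table, which is one way such diagrams express MMI requirements in software engineering terms. The states and events below are invented examples, not content from the original paper.

```python
# Hypothetical state transition table derived from a sequenced task
# analysis: each (current state, event) pair names the next state.
transitions = {
    ("idle", "operator_selects_mission"): "planning",
    ("planning", "plan_approved"): "executing",
    ("executing", "mission_complete"): "idle",
    ("executing", "abort"): "idle",
}

def next_state(state, event):
    # A (state, event) pair absent from the table means "no transition",
    # which the diagram makes an explicit, testable MMI requirement.
    return transitions.get((state, event), state)
```

Expressing the diagram as data like this lets the same table drive both requirements review with human factors engineers and implementation by software developers.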


Author(s):  
Jennifer Herout ◽  
Jolie Dobre ◽  
William Plew ◽  
Jason J. Saleem

The coordination of site visits to execute human factors methods, such as onsite usability tests, interviews, or observations, in clinical settings requires a high level of management to attain successful data collection outcomes. Members of the Department of Veterans Affairs (VA) Veterans Health Administration (VHA) Human Factors Engineering (HFE) team occasionally visit VHA medical centers or outpatient clinics to complete our work. We have developed a site visit checklist as a practice innovation to facilitate logistical coordination when gathering data onsite. This Practice-Oriented paper includes the full checklist, as well as discussion of its use to enable other groups to benefit from lessons we have learned in conducting onsite work in health care settings.


Author(s):  
Bella Yigong Zhang ◽  
Mark Chignell

Human Factors Engineering (HFE) is an applied discipline that uses a wide range of methodologies to improve the design of systems and devices for human use. Underpinning all human factors design is the maxim to fit the task/machine/system to the human rather than vice versa. While some HFE methods such as task analysis and anthropometrics remain relatively fixed over time, areas such as human-technology interaction are strongly influenced by fast-evolving technological trends. In the era of big data, human factors engineers need a good understanding of topics like machine learning, advanced data analytics, and data visualization so that they can design data-driven products that involve big data sets. There is a natural lag between industrial trends and HFE curricula, leading to gaps between what people are taught and what they will need to know. In this paper, we present the results of a survey of HFE practitioners (N=101) and demonstrate the need to include data science and machine learning components in HFE curricula.


Author(s):  
Laura Lin ◽  
Racquel Isla ◽  
Karine Doniz ◽  
Heather Harkness ◽  
Kim J. Vicente ◽  
...  

The hypothesis explored in this paper is that, by adopting human factors design principles, the use of medical equipment can be made safer and more efficient. We selected a commercially available patient-controlled analgesia (PCA) machine as a vehicle to test this hypothesis. A cognitive task analysis of PCA usage, combined with a set of human factors design principles, led to a redesigned PCA interface. An experimental evaluation was conducted, comparing this new interface with the existing interface. The results show that the new interface leads to significantly faster, less effortful, and more reliable performance. These findings have implications for improving the design of other medical equipment.


1974 ◽  
Vol 18 (3) ◽  
pp. 343-349 ◽  
Author(s):  
Edward L. Holshouser

Department of Defense (DOD) Directive 5000.1 of 13 July 1971 provides direction for Test and Evaluation (T&E) of major weapon systems. In addition, there are Navy unique T&E requirements which must be satisfied. A Human Factors Engineering (HFE) test and evaluation concept has been proposed for implementation by the Naval Air Systems Command. The proposed T&E concept will make explicit the interface among the test and evaluation activities so that the role of HFE can be realistically implemented and managed during system acquisition. The HFE T&E concept will specifically address the developmental and operational tests designed to provide HFE data for answering the Defense System Acquisition Review Council (DSARC) milestones for evolving systems. The concept also features a HFE information system which will serve as a focal point and feedback network for use by personnel needing information or data on some element in the evaluation of a particular weapon system.

