Human Factors Guidelines for Computer Software Design

1983 ◽  
Vol 27 (13) ◽  
pp. 1035-1038
Author(s):  
Olga Towstopiat

A project sponsored by the U.S. Army Research Institute and conducted by the Synectics Corporation human factors staff focused on the development of a human/computer interface design handbook. The objective of this effort was to promote functional standardization and modularization of tasks and procedures in order to reduce the amount of training and the skill levels required of computer users and operators. This paper will 1) outline the contents of the computer software design handbook developed by Synectics' staff; 2) discuss the practical utility of these guidelines, as evaluated in the context of actual system software designs; 3) specify limitations of human factors design guidelines; 4) emphasize the need to link human factors software design guidelines to empirically based models of human performance; and 5) provide examples of empirical data and design guidelines that may support future efforts by human factors specialists to increase computer system effectiveness and reduce personnel costs.

Author(s):  
David R. Desaulniers ◽  
Stephen Fleger

Since 1980 the Institute of Electrical and Electronics Engineers (IEEE) has supported the development of human factors (HF) standards. Within IEEE, Subcommittee 5 (SC5) of the Nuclear Power Engineering Committee develops and maintains HF standards applicable to nuclear facilities. These standards are structured in a hierarchical fashion. The top-level standard (IEEE Std. 1023) defines the HF tasks required to support the integration of human performance into the design process. Five lower-tier documents (IEEE Std. 845, 1082, 1289, 1786, and 1707) expand upon the upper-tier standard. Presently, two new HF standards projects are underway: one to provide HF guidance for the validation of system interface design and integrated systems operation, and another for designing and developing computer-based displays for the monitoring and control of nuclear facilities. SC5 is also involved in outreach activities, including sponsorship of a series of conferences on human factors and nuclear power plants.


1986 ◽  
Vol 30 (14) ◽  
pp. 1358-1362 ◽  
Author(s):  
Louis Tijerina

The proliferation of computer systems in recent years has prompted a growing concern about the human factors of interface design. Industrial and military organizations have responded by supporting studies in user-computer interaction and, more recently, products that might aid in the design of interfaces. One type of design aid which attempts to make the findings of user-computer interface (UCI) research available to the system designer is the interface design guidelines document. This paper reviews literature about the design process and how design guidelines or standards might fit into that activity. Suggestions are offered about where future research and development might be directed in order to enhance the use of guidelines in the interface design process and thereby enhance the final product as well.


Author(s):  
Sylvia R. Mayer

Military information systems are surveyed in a historical context, starting with the SAGE system of the 1950s and projecting to the anticipated supersystems of the 1970s. Human functions in the development, operation, and use of these systems are considered from a human factors point of view. This evolutionary overview shows how hardware and software design affect human performance and how this impact has focused and expanded research in the computer sciences and in the behavioral sciences. The evolving human functions in military information systems are described. These descriptions serve as a basis for defining and researching critical human factors opportunities and problems. Paralleling this analysis is a review of several past, current, and future trends in human factors research for future military information systems.


Author(s):  
Christopher C. Heasly ◽  
Lisa A. Dutra ◽  
Mark Kirkpatrick ◽  
Thomas L. Seamster ◽  
Robert A. Lyons

The 21st century Navy combatant ship will experience exponential increases in shipboard information to be processed, disseminated, and integrated. High Definition System (HDS) technology will provide for the convergence of text, graphics, digital video, imagery, and complex computing to allow for a new range of advanced capabilities that exceed those of currently available workstations. These capabilities could result in unmanageable and overwhelming cognitive workloads for Navy tactical operators in the Combat Information Center (CIC). For this reason, a prototype user interface was designed using future combat system requirements, proposed HDS capabilities, and human-computer interface design standards and principles. Usability testing of the prototype user interface was conducted as part of an effort to identify integrated information management technologies that reduce operator workload, increase human performance, and improve combat system effectiveness. This demonstration will focus on explanation and demonstration of future concepts envisioned for the AEGIS operational environment; organization and functionality of the menu structures and window contents; the usability testing methods utilized; results from usability testing; and plans for utilization of the prototype shell in other operational environments.


1983 ◽  
Vol 27 (11) ◽  
pp. 892-895
Author(s):  
David M. Gilfoil ◽  
J. Thomas Murray ◽  
John Van Praag

There exists a voluminous body of human factors literature pertaining to various aspects of human/computer interface design. This literature is frequently reviewed and cited as source documentation by human factors industry professionals. Traditional “hard copy” methods of storage/retrieval of this information are inefficient because of people, resource, and location constraints. The Ergonomics department at Exxon Office Systems has developed a preliminary version of a computerized information storage and retrieval system. Using this Ergonomic Design Guidelines and Rules (E.D.G.A.R.) system, department members develop and maintain a closer working knowledge of the human factors research literature. They are also able to quickly and accurately retrieve and apply guidelines to a variety of human/computer design situations. The design objectives of the EDGAR system, details of the system itself, and a preliminary evaluation are presented in this paper.
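The abstract does not describe how EDGAR stores or retrieves guidelines; as an illustration only, the following minimal Python sketch shows the kind of keyword-indexed guideline storage and retrieval such a system implies. All class names, fields, and the sample guideline are hypothetical and are not taken from EDGAR itself.

```python
# Illustrative only: a minimal keyword-indexed store for ergonomic design
# guidelines, sketching the kind of storage/retrieval EDGAR is described as
# providing. Names and data below are hypothetical, not taken from EDGAR.
from dataclasses import dataclass, field


@dataclass
class Guideline:
    guideline_id: str
    topic: str          # e.g. "screen layout", "error messages"
    text: str           # the guideline itself
    sources: list[str] = field(default_factory=list)  # citations to the HF literature


class GuidelineStore:
    def __init__(self):
        self._by_id = {}
        self._index = {}  # keyword -> set of guideline ids

    def add(self, g: Guideline) -> None:
        self._by_id[g.guideline_id] = g
        for word in (g.topic + " " + g.text).lower().split():
            self._index.setdefault(word, set()).add(g.guideline_id)

    def search(self, query: str) -> list[Guideline]:
        # Return guidelines whose topic or text contains every query keyword.
        words = query.lower().split()
        if not words:
            return []
        hits = set.intersection(*(self._index.get(w, set()) for w in words))
        return [self._by_id[i] for i in sorted(hits)]


store = GuidelineStore()
store.add(Guideline("G-001", "error messages",
                    "Error messages should state what went wrong and how to recover.",
                    sources=["Shneiderman 1982"]))
print([g.guideline_id for g in store.search("error messages")])  # ['G-001']
```

Even a simple index of this kind addresses the "people, resource, and location constraints" the abstract attributes to hard-copy retrieval: any department member can query the same collection rather than locating and scanning the source documents.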


Author(s):  
Katsuhiko Ogawa

Many human-computer interface design guidelines have been developed to help designers produce good interfaces for various kinds of software. Database systems have also been developed for accessing the guidelines. This paper considers the role of the design guidelines, rather than the role of the database, in improving interface designs. Sixteen software designers with no human factors experience participated in a typical design review task. They were provided with a representation of a bad interface design. Eight designers (the UG participants) were instructed to improve the design individually by using the guidelines. The other designers (the NG participants) were instructed to improve it unaided (without the guidelines). The results indicated that both groups made similar numbers of improvements, but the UG participants produced higher-quality improvements. Quality was evaluated using a goodness measure defined in this paper. The NG participants made good improvements but also bad ones that conflicted with the guidelines, because they relied only on their own knowledge, experience, and preferences. The UG participants, on the other hand, made fewer bad proposals because they could refer to the guidelines. Guidelines can work as a filter, eliminating inappropriate or false improvements from the designers' original proposals. There is a possibility that guidelines may hinder designers from developing new and interesting proposals. Their value is, however, very clear for novice designers with no human factors experience, who can use them to easily develop high-quality proposals.


1976 ◽  
Vol 20 (12) ◽  
pp. 243-244
Author(s):  
Wade R. Helm

A human factors evaluation of the P-3C aircraft was conducted to determine the workload implications of design modifications at the sensor and tactical operator stations. The primary objectives of the evaluation were (a) to determine whether equipment and software design changes had significantly influenced the workload of the operators and (b) to determine whether the design changes resulted in improved system performance. To aid in this analysis, a method known as the Function Description Inventory (FDI) was used. This method requires a series of investigations analyzing the selected operational functions of specific P-3C crew members; an essential part is determining the roles, duties, and tasks performed by the crew members. Crew members' judgments were then compiled on how important these roles, duties, and tasks were for mission success, how frequently they were performed on a typical mission, how difficult the activities were for the typical operator, and, finally, how effective the systems were in accomplishing these operational functions. After combining the FDI results with the results of traditional human engineering analysis, it was concluded that there were substantial workload and system effectiveness changes at all three stations.
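The abstract names the four FDI rating dimensions (importance, frequency, difficulty, and system effectiveness) but not how they are combined. Purely as an illustration, the sketch below aggregates hypothetical ratings into per-station summaries under an assumed simple averaging scheme; the station labels, tasks, and numbers are invented for the example and are not from the study.

```python
# Illustrative only: the FDI's actual scoring procedure is not given in the
# abstract; this sketch assumes simple per-station averaging of crew ratings.
from statistics import mean
from collections import defaultdict

# Hypothetical ratings: (station, task, importance, frequency, difficulty,
# effectiveness), each on a 1-7 scale.
ratings = [
    ("SS-1", "classify acoustic contact", 7, 6, 5, 4),
    ("SS-2", "monitor non-acoustic sensors", 6, 5, 4, 5),
    ("TACCO", "coordinate attack plot", 7, 4, 6, 5),
]

by_station = defaultdict(list)
for station, _task, imp, freq, diff, eff in ratings:
    by_station[station].append((imp, freq, diff, eff))

for station, rows in by_station.items():
    imp, freq, diff, eff = (mean(col) for col in zip(*rows))
    # A crude workload indicator: functions that are important, frequent, and difficult.
    workload = mean([imp, freq, diff])
    print(f"{station}: workload index {workload:.1f}, effectiveness {eff:.1f}")
```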


Author(s):  
Betty P. Chao

A well-designed user interface is recognized as a benchmark for determining the success of a software product. The proliferation of user interface design guidelines, standards, prototyping tools, and techniques is indicative of the importance placed on quality user interfaces. However, even when software developers have access to the latest information, tools, and human factors practitioners, sub-optimal interfaces may result. This is because, within a large multidisciplinary software design team, issues such as communication, responsibilities, and cost and schedule constraints may override usability concerns. This paper describes the implementation of concurrent engineering used to successfully develop user interfaces for a large, complex system. Success is expressed in terms of quality and consistent user interfaces, the positive influence of human factors on software development, and customer satisfaction.

