Increasing the Breadth of the Human Factors Scientist-Practitioner

1992, Vol 36 (6), pp. 563-565
Author(s): Curt C. Braun

It has been 40 years since the Department of Defense first commissioned the development of the Human Engineering Guide to Equipment Design. In this, the 40th anniversary year, it is fitting to examine the training of human factors practitioners and to suggest supplemental training where little formal training exists. A review of current human factors education programs reveals that many of the published guidelines are largely being fulfilled. These findings should be encouraging, yet human factors educators, students, and practitioners alike are hesitant to conclude that psychologists or human factors specialists are complete upon attaining these skills. Many newly graduated human factors practitioners, while competent in human processes, lack the skills and background necessary to perform in a variety of domains. The goal of this symposium is to address these training issues by providing curriculum material designed to build upon fundamental skills.

1964, Vol 5 (1), p. 112
Author(s): Robert H. McKim, Clifford T. Morgan, Jesse S. Cook, Alphonse Chapanis, Max W. Lund

1977, Vol 21 (6), pp. 545-547
Author(s): Tyler Blake

With the advent of voting machines and computers, the voting process has evolved into a complex man-machine system. To date, however, no comprehensive human factors analysis of the voting process has been conducted. A systems analysis of the voting process yielded four major functions that critically affect voter behavior and performance: (a) design of voting instructions; (b) display of crucial voting information; (c) human engineering of voting equipment and procedures; and (d) distribution of voting machines and personnel across and within voting districts. Critical aspects of each area are discussed, along with additional points of interest for human factors specialists who wish to research this area.


Author(s): Nelda Melissa, Lisa Chavez, John Winters

Panelists from the military, government, and industry were asked to discuss efforts to educate their respective workforces on Human Factors (HF) or Human Systems Integration (HSI). The efforts varied in breadth (general to specific) and implementation (classroom or web-based). Still, some commonalities existed across the training and education programs, including a focus on embedding HSI and human factors into the acquisition and design process and on presenting human factors as a risk mitigation method.


1988, Vol 32 (18), pp. 1237-1240
Author(s): J. Peter Kincaid, Richard Braby, John E. Mears, A.J.G. Babu

This paper describes current developments in automating the processes for authoring technical information (TI) and delivering it on microcomputers. It describes desirable characteristics that support the presentation of TI to technicians of varying skill levels, and it addresses human factors issues relating to information access, user acceptance, and display formats. Programming is being done in Smalltalk/V, an object-oriented language, on a Zenith 248 computer, which is compatible with the IBM PC/AT. The project emphasizes low-cost authoring and delivery of information that traditionally has been contained in paper technical manuals. Our intent is to support the Department of Defense initiative to shift from paper to paperless technical manuals.


1978, Vol 22 (1), pp. 24-28
Author(s): Michael L. Fineberg

This paper describes the construction and pilot testing of a human factors evaluation instrument. The instrument was constructed using psychometric procedures generally applied to the development of attitude scales, with the goal of quantifying operator preference in helicopter design within four major areas of human factors consideration: handling qualities, comfort/discomfort, human engineering design, and safety. Each area had a common scale against which 10 specific parameters were evaluated. The ten items within each area were chosen using system operators' expertise, human factors standards, the human factors experimental literature, and the experience of the authors. The instrument was validated with a sample of 16 aviators during an actual operational test. These validation studies indicated a test-retest reliability of .85 (p < .001) and an inter-rater reliability of .93 (p < .001). Use of the instrument revealed statistically significant differences among aircraft candidates under various operational test conditions on each of the four indices within the instrument. It is concluded that the instrument provides an effective method of quantifying the preferences of operational aviators. The scrupulous development process, drawing on inputs from experienced aviators, experts in aviation field test design, and experts in test construction, gives the instrument a high measure of construct validity.
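The test-retest reliability reported above is conventionally computed as the Pearson correlation between two administrations of the same scale. The sketch below illustrates that computation only; the ratings are invented for the example and are not data from the paper.

```python
# Illustrative sketch (not the paper's code or data): test-retest
# reliability as the Pearson correlation between two rating sessions.

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical 10-item ratings from one aviator scoring the same
# aircraft on two occasions (made up for illustration).
session_1 = [7, 5, 6, 8, 4, 6, 7, 5, 6, 7]
session_2 = [7, 6, 6, 8, 4, 5, 7, 5, 7, 7]

print(round(pearson_r(session_1, session_2), 2))  # → 0.89
```

A high correlation between sessions indicates that the instrument yields stable scores over time; inter-rater reliability is computed the same way, correlating two raters' scores of the same aircraft rather than one rater's scores across sessions.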

