Mechanical and ergonomics hazards

Author(s):  
Julia Smedley ◽  
Finlay Dick ◽  
Steven Sadhra

Contents: Ergonomics hazards: overview; Lifting and handling; Posture; Repetitive work; Mechanical hazards.

Ergonomics (or human factors) is the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data, and methods to design in order to optimize human wellbeing and overall system performance....

1986 ◽  
Vol 30 (12) ◽  
pp. 1146-1148
Author(s):  
Michael L. Frazier ◽  
Bruce H. Taylor

Despite widespread policy support and increasingly sophisticated measurement tools, the evaluation of human factors issues within operational test and evaluation (OT&E) continues to lag behind the evaluation of other system elements. This situation can be traced, in part, to difficulties in integrating human factors findings with system performance measures. The present paper discusses one approach to this problem that is being implemented in the OT&E of the Consolidated Space Operations Center (CSOC).


Author(s):  
Michael E. Watson ◽  
Christina F. Rusnock ◽  
Michael E. Miller ◽  
John M. Colombi

Humans perform critical functions in nearly every system, making them vital to consider during system development. Human Systems Integration (HSI) would ideally permit the human’s impact on system performance to be effectively accounted for during the systems engineering (SE) process, but effective processes are often not applied, especially in the early design phases. Failure to properly account for human capabilities and limitations during system design may lead to unreasonable expectations of the human. The result is a system design that makes unrealistic assumptions about the human, leading to an overestimation of the human’s performance and thus the system’s performance. This research proposes a method of integrating HSI with SE that allows human factors engineers to apply Systems Modeling Language (SysML) and human performance simulation to describe and communicate human and system performance. Using these models, systems engineers can more fully understand the system’s performance to facilitate design decisions that account for the human.

A scenario is applied to illustrate the method, in which a system developer seeks to redesign an example system, Vigilant Spirit, by incorporating system automation to improve overall system performance. The example begins by performing a task analysis through physical observation and analysis of human subjects’ data from 12 participants employing Vigilant Spirit. This analysis is depicted in SysML Activity and Sequence Diagrams. A human-in-the-loop experiment is used to study performance and workload effects of humans applying Vigilant Spirit to conduct simulated remotely-piloted aircraft surveillance and tracking missions. The results of the task analysis and human performance data gathered from the experiment are used to build a human performance model in the Improved Performance Research Integration Tool (IMPRINT). IMPRINT allows the analyst to represent a mission in terms of functions and tasks performed by the system and human, and then run a discrete event simulation of the system and human accomplishing the mission to observe the effects of defined variables on performance and workload. The model was validated against performance data from the human-subjects’ experiment.

In the scenario, six different scan algorithms, which varied in terms of scan accuracy and speed, were simulated. These algorithms represented different potential system trades, as factors such as various technologies and hardware architectures could influence algorithm accuracy and speed. These automation trades were incorporated into the system’s block definition diagram (BDD), requirements, and parametric SysML diagrams. These diagrams were modeled from a systems engineer’s perspective; therefore they originally placed less emphasis on the human. The BDD portrayed the structural aspect of Vigilant Spirit, including the operator, automation, and system software. The requirements diagram levied a minimum system-level performance requirement. The parametric diagram further defined the performance and specification requirements, along with the automation’s scan settings, through the use of constraints. It was unclear from studying the SysML diagrams which automation setting would produce the best results, or whether any could meet the performance requirement. Existing system models were insufficient by themselves to evaluate these trades; thus, IMPRINT was used to perform a trade study to determine the effects of each of the automation options on overall system performance.
The results of the trade study revealed that all six automation conditions significantly improved performance scores from the baseline, but only two significantly improved workload. Once the trade study identified the preferred alternative, the results were integrated into existing system diagrams. Originally system-focused, SysML diagrams were updated to reflect the results of the trade analysis. The result is a set of integrated diagrams that accounts for both the system and human, which may then be used to better inform system design. Using human performance- and workload-modeling tools such as IMPRINT to perform tradeoff analyses, human factors engineers can attain data about the human subsystem early in system design. These data may then be integrated into existing SysML diagrams applied by systems engineers. In so doing, additional insights into the whole system can be gained that would not be possible if human factors and systems engineers worked independently. Thus, the human is incorporated into the system’s design and the total system performance may be predicted, achieving a successful HSI process.
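The core of the trade study described above — simulating automation alternatives that trade scan accuracy against scan speed and comparing their predicted mission performance and operator workload — can be illustrated outside of IMPRINT with a small Monte Carlo loop. The sketch below is a simplified stand-in for that kind of analysis; the alternative names, detection probabilities, and workload weights are assumptions for illustration, not values from the study.

```python
import random
from dataclasses import dataclass

# Hypothetical automation alternatives: each trades scan accuracy against scan time.
# These values are illustrative assumptions, not the six algorithms from the study.
@dataclass
class ScanAlgorithm:
    name: str
    accuracy: float   # probability the automation correctly cues a target
    scan_time: float  # seconds per scan (not used in this simplified score)

ALTERNATIVES = [
    ScanAlgorithm("baseline (manual)", 0.00, 0.0),
    ScanAlgorithm("fast / low accuracy", 0.70, 2.0),
    ScanAlgorithm("slow / high accuracy", 0.95, 6.0),
]

MISSION_TARGETS = 40          # targets appearing over one mission
MANUAL_DETECT_P = 0.60        # operator detection probability without an automation cue
MANUAL_TASK_WORKLOAD = 3.0    # arbitrary workload units for an unaided search task
AIDED_TASK_WORKLOAD = 1.0     # workload when the operator only confirms a cue

def run_mission(alg: ScanAlgorithm, rng: random.Random):
    """One simulated mission: returns (performance score, total workload)."""
    detected, workload = 0, 0.0
    for _ in range(MISSION_TARGETS):
        if alg.accuracy > 0 and rng.random() < alg.accuracy:
            detected += 1
            workload += AIDED_TASK_WORKLOAD    # operator confirms the automation cue
        elif rng.random() < MANUAL_DETECT_P:
            detected += 1
            workload += MANUAL_TASK_WORKLOAD   # operator finds the target unaided
        else:
            workload += MANUAL_TASK_WORKLOAD   # unaided search that misses the target
    return detected / MISSION_TARGETS, workload

def trade_study(replications: int = 500, seed: int = 1):
    rng = random.Random(seed)
    for alg in ALTERNATIVES:
        runs = [run_mission(alg, rng) for _ in range(replications)]
        perf = sum(r[0] for r in runs) / replications
        load = sum(r[1] for r in runs) / replications
        print(f"{alg.name:22s}  mean performance={perf:.2f}  mean workload={load:.1f}")

if __name__ == "__main__":
    trade_study()
```

Each alternative plays the role of one simulated condition; a tool such as IMPRINT would additionally model task sequencing and multiple workload channels rather than a single scalar, and its outputs would then feed back into the SysML parametric and requirements diagrams as described in the abstract.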


1989 ◽  
Vol 33 (18) ◽  
pp. 1187-1191 ◽  
Author(s):  
Mark F. Kanter ◽  
Frank J. O'Brien

Concept of Operations Experiments (COOPEXs) are conducted at the Naval Underwater Systems Center to evaluate submarine combat system operability through structured walkthroughs of submarine missions in a full-scale replica of the combat system environment. Data were collected from one COOPEX for the purpose of piloting human factors engineering methodologies. Partial results based on a different COOPEX scenario are reported and compared. The data reduction and analysis procedures of digraph analysis, Q-analysis, multidimensional scaling, and crew density were applied to assess combat system information flow and configuration effectiveness. Results suggest that these procedures, when applied to larger and more complex data sets, have the potential to significantly enhance submarine combat system performance. Plans for subsequent research are discussed.
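One of the data-reduction procedures named above, multidimensional scaling, can be sketched as follows: a matrix of inter-station communication counts from a walkthrough is converted to dissimilarities and projected into two dimensions, so that stations that exchange information frequently plot close together. The station names and counts below are hypothetical placeholders, not COOPEX data.

```python
import numpy as np
from sklearn.manifold import MDS

# Hypothetical inter-station communication counts from a mission walkthrough;
# higher counts mean more information flow between two watchstations.
stations = ["Station A", "Station B", "Sonar", "Plot", "Weapons"]
comm = np.array([
    [ 0, 12,  9,  7,  3],
    [12,  0,  5, 10,  8],
    [ 9,  5,  0,  4,  2],
    [ 7, 10,  4,  0,  6],
    [ 3,  8,  2,  6,  0],
], dtype=float)

# Convert communication frequency to dissimilarity: frequent talkers should
# end up close together in the low-dimensional layout.
dissimilarity = comm.max() - comm
np.fill_diagonal(dissimilarity, 0.0)

# Two-dimensional metric MDS on the precomputed dissimilarity matrix.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)

for name, (x, y) in zip(stations, coords):
    print(f"{name:10s}  x={x:+.2f}  y={y:+.2f}")
```

The resulting layout gives a quick visual check of whether the physical configuration of the combat system places heavily communicating stations near one another.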


1976 ◽  
Vol 20 (12) ◽  
pp. 243-244
Author(s):  
Wade R. Helm

A human factors evaluation of the P-3C aircraft was conducted to determine the workload implications of design modifications at the sensor and tactical operator stations. The primary objectives of the evaluation were: (a) to determine whether equipment and software design changes had significantly influenced the workload of the operators, and (b) to determine whether the design changes resulted in improved system performance. To aid in conducting this analysis, a method known as the Function Description Inventory (FDI) was used. This method requires a series of investigations analyzing the selected operational functions of specific P-3C crew members; an essential part is determining the roles, duties, and tasks performed by the crew members. Next, crew members' judgments were compiled on how important these roles, duties, and tasks were for mission success, how frequently they were performed on a typical mission, how difficult the activities were for the typical operator, and, finally, how effective the systems were in accomplishing these operational functions. After combining the FDI results with the results of traditional human engineering analysis, it was concluded that there were substantial workload and system effectiveness changes at all three stations.
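An inventory of this kind yields, for each task, crew ratings of importance, frequency, difficulty, and system effectiveness. The sketch below shows one simple way such ratings could be aggregated to rank tasks for design attention; the tasks, rating values, and composite formula are illustrative assumptions, not the published FDI scoring method.

```python
import statistics

# Hypothetical FDI-style ratings for a few sensor-station tasks, each judged by
# three crew members on 1-7 scales. Tasks and numbers are illustrative only.
ratings = {
    "monitor acoustic display": {"importance": [6, 7, 6], "frequency": [7, 6, 7],
                                 "difficulty": [4, 5, 4], "effectiveness": [5, 5, 6]},
    "classify contact":         {"importance": [7, 7, 6], "frequency": [4, 5, 4],
                                 "difficulty": [6, 6, 7], "effectiveness": [4, 5, 4]},
    "update tactical plot":     {"importance": [5, 5, 6], "frequency": [6, 6, 5],
                                 "difficulty": [3, 3, 4], "effectiveness": [6, 6, 5]},
}

def attention_index(task_ratings):
    """Tasks that are important, frequent, and difficult but poorly supported
    by the system (low effectiveness) rise to the top of the ranking."""
    m = statistics.fmean
    return (m(task_ratings["importance"])
            * m(task_ratings["frequency"])
            * m(task_ratings["difficulty"])
            / m(task_ratings["effectiveness"]))

for task, r in sorted(ratings.items(), key=lambda kv: -attention_index(kv[1])):
    print(f"{task:26s}  index={attention_index(r):6.1f}")
```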


Author(s):  
Michael F. Rayo ◽  
Emilie M. Roth ◽  
Alexander M. Morison ◽  
Daniel J. Zelik ◽  
...  

Although the majority of effort in Artificial Intelligence (AI) ideation, design, and development seeks to optimize the AI as the primary means of improving overall system performance, the evidence is clear that for risk-critical work in high-complexity, high-uncertainty settings it is the interactions between humans and machines that must be prioritized. Only by effectively coordinating the available machine and human agents can the system be resilient to an increasing set of system demands. The panelists will convey the work they are doing and the obstacles they are facing in the following areas: (1) demonstrating the critical importance of human-machine teaming, (2) hardening design patterns that result in successful human-machine teams, (3) designing and evaluating new automation solutions for their ability to team, and (4) ensuring that new automation solutions are implemented and adopted for risk-critical work.

