Safety Critical Tasks: Identification of Human Error to Control Risks from Major Accident Hazards

2015
Author(s):
George Petrie
Ann Rosbrook

Author(s):
Susanne Boll
Philippe Palanque
Alexander G. Mirnig
Jessica Cauchard
Margareta Holtensdotter Lützhöft
...

2021
Author(s):  
Girish Kamal

Abstract Safety Critical Elements (SCEs) are the equipment and systems that provide the foundation of risk management associated with Major Accident Hazards (MAHs). An SCE is classified as any equipment, structure or system whose failure could cause or contribute to a major accident, or whose purpose is to prevent or limit the effect of a major accident. Once an SCE has been identified, it is essential to describe its critical function in terms of a Performance Standard. Based on the Performance Standard, assurance tasks can be defined in the maintenance system so that the required performance is confirmed. By analyzing the data in the maintenance system, confidence can be gained that all the SCEs required to manage major accident and environmental hazards are functioning correctly; where shortcomings are identified, corrective actions can be taken to reinstate the integrity of the systems. This paper details how the MAH and SCE management process is initiated so as to follow best industry practice in the identification and integrity management of major accident hazards and safety critical equipment. The tutorial describes the following stages in detail:
1. Identification of Major Accident Hazards.
2. Identification of the Safety Critical Equipment involved in managing Major Accident Hazards.
3. Definition of Performance Standards for this Safety Critical Equipment.
4. Execution of the assurance processes that maintain or ensure the continued suitability of the SCE equipment, confirming that the Performance Standards are being met.
5. Verification that all stages have been undertaken and any deviations are being managed, and thus that Major Accident Hazards are being controlled.
6. Analysis and improvement.
Through the diligent application of these stages, it is possible to meet the requirements of the MAH and SCE management process, giving a better understanding and control of risks in the industry.
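
As a minimal sketch of how an assurance check against a Performance Standard might be modelled in code (the element tag, hazard, acceptance limit and test readings below are hypothetical illustrations, not data from the paper):

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class PerformanceStandard:
        criterion: str  # the critical function being assured
        limit: float    # acceptance limit for the assurance task
        unit: str

    @dataclass
    class SafetyCriticalElement:
        tag: str
        hazard: str                    # the Major Accident Hazard it helps manage
        standard: PerformanceStandard
        readings: List[float]          # assurance results from the maintenance system

        def deviations(self) -> List[float]:
            """Readings that fail the Performance Standard."""
            return [r for r in self.readings if r > self.standard.limit]

    # Hypothetical emergency shutdown valve assured against a closure-time standard
    esdv = SafetyCriticalElement(
        tag="ESDV-101",
        hazard="Loss of containment",
        standard=PerformanceStandard("valve closure time", 10.0, "s"),
        readings=[8.2, 9.1, 12.4],
    )
    for r in esdv.deviations():
        print(f"{esdv.tag}: {r} {esdv.standard.unit} exceeds "
              f"{esdv.standard.limit} {esdv.standard.unit} -> raise corrective action")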


Author(s):  
Caroline Morais
Raphael Moura
Michael Beer
Edoardo Patelli

Abstract Risk analyses require proper consideration and quantification of the interaction between humans, organization, and technology in high-hazard industries. Quantitative human reliability analysis approaches require the estimation of human error probabilities (HEPs), often obtained from human performance data on different tasks in specific contexts (also known as performance shaping factors, PSFs). Data on human errors are often collected from simulated scenarios, near-miss reporting systems, and experts with operational knowledge. However, these techniques usually miss the realistic context in which human errors occur. The present research proposes a realistic and innovative approach for estimating HEPs using data from major accident investigation reports. The approach is based on Bayesian networks used to model the relationship between performance shaping factors and human errors. The proposed methodology minimizes reliance on expert judgment for HEPs by using a strategy that can accommodate the absence of information for some conditional dependencies between variables, thereby increasing transparency about the uncertainties of the HEP estimates. The approach also identifies the most influential performance shaping factors, supporting assessors in recommending improvements or additional controls in risk assessments. Formal verification and validation processes are also presented.
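
A minimal sketch of the kind of Bayesian network involved, assuming the open-source pgmpy library (the PSF nodes, their states and all probability values below are illustrative assumptions, not figures from the paper):

    from pgmpy.models import BayesianNetwork
    from pgmpy.factors.discrete import TabularCPD
    from pgmpy.inference import VariableElimination

    # Two performance shaping factors conditioning a binary human-error node
    model = BayesianNetwork([("Fatigue", "HumanError"), ("Training", "HumanError")])

    # Marginal beliefs over PSF states (0 = favourable, 1 = unfavourable)
    cpd_fatigue = TabularCPD("Fatigue", 2, [[0.7], [0.3]])
    cpd_training = TabularCPD("Training", 2, [[0.8], [0.2]])

    # P(HumanError | Fatigue, Training); columns enumerate the evidence states,
    # with the last evidence variable varying fastest
    cpd_error = TabularCPD(
        "HumanError", 2,
        [[0.99, 0.90, 0.95, 0.70],   # no error
         [0.01, 0.10, 0.05, 0.30]],  # error
        evidence=["Fatigue", "Training"], evidence_card=[2, 2],
    )
    model.add_cpds(cpd_fatigue, cpd_training, cpd_error)
    assert model.check_model()

    # HEP conditioned on an unfavourable PSF observed in an accident report
    infer = VariableElimination(model)
    print(infer.query(["HumanError"], evidence={"Fatigue": 1}))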


2020
Vol 60 (1)
pp. 41
Author(s):  
Joelle Mitchell
Alice Turnbull

Analysis of incident investigation findings as a means of identifying common precursors or causal factors is a recurring topic of safety research. Historically this type of research has been conducted through a single lens, depending on the researcher’s discipline, with incidents analysed in accordance with a favoured theory, or grouped according to industry or region. This has led to the development of numerous frameworks and taxonomies that attempt to predict or analyse events at various levels of granularity. Such theories and disciplines include safety culture and climate, human factors, human error, management systems, systems theory, engineering and design, chemistry and maintenance. The intent of such research is ostensibly to assist organisations in understanding the degree to which their operations are vulnerable to known precursors or causal factors of major accident events, and to take proactive measures to improve the safety of their operations. However, the discipline-specific nature of much of this research may limit its application in practice. Specific frameworks and taxonomies may be of assistance once organisations have identified a relevant area of vulnerability within their operations, but are unlikely to help organisations identify those vulnerabilities in the first place. This paper seeks to fill that gap: a multidisciplinary approach was taken, in which investigation reports published by independent investigation agencies across various industries were analysed to determine common causal factors regardless of discipline or industry.
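
A minimal sketch of the cross-report frequency analysis this implies (the report identifiers and factor labels below are hypothetical):

    from collections import Counter

    # Causal factors coded from each investigation report, across industries
    reports = {
        "rail-2018-03": {"inadequate risk assessment", "deferred maintenance"},
        "chem-2019-11": {"inadequate risk assessment", "poor change management"},
        "marine-2020-07": {"deferred maintenance", "fatigue"},
    }

    # Rank factors by the number of reports they appear in, regardless of industry
    counts = Counter(factor for factors in reports.values() for factor in factors)
    for factor, n in counts.most_common():
        print(f"{factor}: {n} report(s)")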


2020
Vol 8 (6)
pp. 5749-5758

Safety critical systems are systems whose failure could result in loss of life, economic damage, incidents, accidents or other undesirable outcomes. There is no doubt that critical system safety has improved greatly with the development of technology, as the number of hardware- and software-induced accidents has been markedly reduced; however, the number of human deviations in decision making found in accidents remains high. We review traditional human error approaches and their limitations in depth, and propose the Human Cognitive Bias Identification Technique (H-CoBIT), which identifies and mitigates potential human cognitive biases and generates safety requirements during the initial phase of system design. The proposed method analyses the design of safety critical systems from a human factors perspective. It helps system analysts, designers and software engineers to identify potential cognitive biases (mental deviations in the operator’s decision-making process) during system use. To ensure the validity of the proposed method, we conducted an empirical experiment to validate it for accuracy and reliability, comparing different experimental outcomes using signal detection theory.
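
For the validation step, the sensitivity index d' from signal detection theory can be computed from hit and false-alarm rates; a minimal sketch (the counts below are hypothetical, not results from the paper):

    from scipy.stats import norm

    def d_prime(hits: int, misses: int,
                false_alarms: int, correct_rejections: int) -> float:
        """Sensitivity index d' = z(hit rate) - z(false-alarm rate)."""
        hit_rate = hits / (hits + misses)
        fa_rate = false_alarms / (false_alarms + correct_rejections)
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # e.g. analysts flag 45 of 50 seeded biases and wrongly flag 5 of 50 clean items
    print(round(d_prime(45, 5, 5, 45), 2))  # 2.56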


Author(s):  
M J Cook
T Simpson
D Garrett
M Thody

A philosophy of technology use has developed in many safety-critical industries based on the view that human operators are feckless and unreliable, and so, wherever possible, should not be trusted to execute safety-critical tasks. The implicit view of automation is that it invariably improves system performance and increases reliability. Yet after many decades, or even centuries, of machine and automation development, human error remains one of the dominant features in failures of modern systems. The drive towards introducing automation has claimed a larger performance envelope, lower operating costs with fewer people, less risk of hazard realisation, and a more economical development path. One aim of introducing automation is higher reliability, in the belief that this implicitly brings with it increases in safety. As Leveson (2011) points out, high reliability can be misleading, because interactions between elements that are each working as expected may still trigger system failure through transverse consequences. The view that human operators are the weakest operational link, and the pervasive myths about the reliability of automated solutions (which afford automation the easier scenarios of task execution), need to be revisited (Cook, Thody and Garrett, 2017). This should ensure that the best capability and optimal safety case are developed for future systems based upon operator and system in synergy. This may be especially true if the claims made for automation are treated more aggressively in terms of liability.


2016
Vol 26 (12)
pp. 285-288
Author(s):  
F Roche

Making mistakes is part of being human and human error is normal in all areas of life (Bromiley & Mitchell 2009). In some contexts this is of little consequence, but in environments where human safety and well-being are at stake it is vital that such error is minimised. The operating theatre is one such safety critical environment. Research suggests, however, that certain factors predispose to human error. Some or all of these factors may be present in the operating theatre and, therefore, have the potential to compromise patient safety.

