Comprehensive analysis of the strength and safety of potentially hazardous facilities subject to uncertainties

Dependability ◽  
2020 ◽  
Vol 20 (1) ◽  
pp. 47-56
Author(s):  
N. A. Makhutov ◽  
D. O. Reznikov

Aim. This paper aims to compare the two primary approaches to ensuring the structural strength and safety of potentially hazardous facilities: the deterministic approach, which is based on ensuring standard values of the strength margin for the primary limit state mechanisms, and the probabilistic approach, under which the strength criterion requires that the target values of the probability of damage for the various damage modes not exceed the standard maximum allowable values. The key problem of ensuring structural strength is the high level of uncertainty, conventionally subdivided into two types: (1) uncertainties due to the natural variation of the parameters that define the load-carrying capacity of a system and the load it is exposed to, and (2) uncertainties due to the human factor (the limited nature of human knowledge of a system and the possibility of human error at various stages of system operation). The methods of uncertainty mitigation depend on the approach applied to strength assurance. Under the deterministic approach, the random variables “load” and “carrying capacity” are replaced with deterministic values, i.e. their mathematical expectations, and the fulfillment of the strength conditions subject to uncertainties is ensured by requiring that the ratio of the mathematical expectations of load-carrying capacity and load exceed the standard value of the strength margin, which in turn must be greater than unity. Under the probabilistic approach, structural strength is assumed to be ensured if the estimated probability of damage for a given mechanism of limit state attainment does not exceed the standard value of the probability of damage.
Conclusions. The two approaches (deterministic and probabilistic) can be deemed equivalent only in particular cases.
The disadvantage of both is the limited capability to mitigate the uncertainties of the second type defined by the effects of the human factor, as well as the absence of a correct procedure of accounting for the severity of consequences caused by the attainment of the limit state. The above disadvantages can be overcome if risk-based methods are used in ensuring structural strength and safety. Such methods allow considering uncertainties of the second type and explicitly taking into consideration the criticality of consequences of facility destruction.
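The two strength conditions can be sketched numerically for the textbook case of independent, normally distributed load and carrying capacity. The numbers below are illustrative assumptions, not values from the paper:

```python
import math

def failure_probability(mu_r, sigma_r, mu_s, sigma_s):
    """P(R - S < 0) for independent, normally distributed capacity R and load S."""
    beta = (mu_r - mu_s) / math.sqrt(sigma_r**2 + sigma_s**2)  # reliability index
    return 0.5 * (1.0 - math.erf(beta / math.sqrt(2.0)))       # Phi(-beta)

# Illustrative values (assumed): capacity R ~ N(500, 40), load S ~ N(300, 30)
mu_r, sigma_r, mu_s, sigma_s = 500.0, 40.0, 300.0, 30.0

margin = mu_r / mu_s  # deterministic check: require margin >= [n] > 1
pf = failure_probability(mu_r, sigma_r, mu_s, sigma_s)  # probabilistic check: pf <= [P_f]

print(f"strength margin n = {margin:.2f}")
print(f"failure probability P_f = {pf:.2e}")
```

The sketch shows why the two criteria coincide only in particular cases: the margin depends only on the means, while the failure probability also depends on the scatter of both variables.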

Author(s):  
O. M. Reva ◽  
V. V. Kamyshin ◽  
S. P. Borsuk ◽  
V. A. Shulhin ◽  
A. V. Nevynitsyn

The negative and persistent impact of the human factor on the statistics of aviation accidents and serious incidents makes proactive studies of the attitude of “front line” aviation operators (air traffic controllers, flight crewmembers) toward dangerous actions or professional conditions a key component of the current ICAO safety paradigm. This attitude is determined through indicators of the influence of the human factor on decision-making, which include the air traffic controllers’ systems of preferences over indicators and characteristics of professional activity, illustrating both the individual perception of potential risks and dangers and the peculiarities of generalized group thinking that have developed in a particular society. A preference system is an ordered (ranked) series of n = 21 errors, from the most dangerous to the least dangerous, and characterizes only the preference of one error over another in terms of danger. The degree of this preference is determined only by the difference in the ranks of the errors and does not answer the question of how many times more dangerous one error is than another. The differential method for identifying the comparative danger of errors, as well as a multistep technology for identifying and filtering out marginal opinions, was applied. From the initial sample of m = 37 professional air traffic controllers, two subgroups of mB = 20 and mG = 7 people were identified whose within-group consistency of opinions was statistically significant at a high significance level of α = 1%. Nonparametric optimization of the corresponding group preference systems resulted in Kemeny medians in which tied (middle) ranks were absent. Based on these medians, weighted error hazard coefficients were determined by the mathematical prioritization method. It is substantiated that, with the accepted accuracy of calculations, the results obtained at the second iteration of this method are more acceptable.
The values of the error hazard coefficients, together with their ranks established in the preference systems, allow a more complete quantitative and qualitative analysis of the attitude of both individual air traffic controllers and their professional groups to hazardous actions or conditions.
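The Kemeny median used here can be illustrated by brute force on a toy set of four error types and three hypothetical expert rankings; the paper's n = 21 errors and m = 37 experts would require a combinatorial optimization method rather than enumeration:

```python
from itertools import combinations, permutations

def kendall_tau(r1, r2):
    """Number of item pairs ordered differently in the two rankings."""
    pos1 = {item: i for i, item in enumerate(r1)}
    pos2 = {item: i for i, item in enumerate(r2)}
    return sum(
        1
        for a, b in combinations(r1, 2)
        if (pos1[a] - pos1[b]) * (pos2[a] - pos2[b]) < 0
    )

def kemeny_median(rankings):
    """Ranking minimizing total Kendall tau distance to all experts (brute force)."""
    items = rankings[0]
    return min(
        permutations(items),
        key=lambda cand: sum(kendall_tau(cand, r) for r in rankings),
    )

# Toy example: four error types ranked by three hypothetical experts
experts = [
    ("E1", "E2", "E3", "E4"),
    ("E1", "E3", "E2", "E4"),
    ("E2", "E1", "E3", "E4"),
]
print(kemeny_median(experts))  # consensus ranking without tied ranks
```

Because enumeration is factorial in the number of items, real applications with 21 errors rely on heuristic or integer-programming solvers for the same objective.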


Author(s):  
Sylwia Agata Bęczkowska ◽  
Iwona Grabarek

This article discusses issues related to the safety of the transport of dangerous goods by road. Research on transport accidents unambiguously points to the human factor as the one most responsible for causing accidents. Determining the causes of driver unreliability in the human−vehicle−environment system requires thorough research. Unfortunately, experimental research with human involvement is limited in scope in this case, which leaves modeling and simulation of the behavior of the human factor, i.e., the driver transporting dangerous goods. The human being, because of their complexity, is a challenging element to parameterize. The literature presents various attempts to model human actions. Herein, the authors used heuristic methods, specifically fuzzy set techniques, to build a human factor model. In such models, human actions are specified using a verbal or linguistic description. The specificity of fuzzy sets allowed for “naturally” limiting the “precision” in describing human behavior. The model was built based on the authors’ questionnaire and expert research, on the basis of which individual features were selected. The traits were then assigned appropriate states. The output parameter of the model was λL, the intensity of human error. The obtained values of the intensity of accidents caused by driver error were implemented into the authors’ risk assessment method. They constituted one of the factors determining the probability of an accident in the transport of dangerous goods, which allowed for determining the optimal route for these goods, i.e., the route characterized by the lowest risk of an undesirable event. The article presents the model’s assumptions, structure, and the features included in the model, all of which have the most significant influence on the intensity of human error. The results of the simulation studies showed a diversified effect of the analyzed characteristics on driver efficiency.
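A minimal sketch of such a fuzzy model, with two assumed input features (fatigue and experience) and a hypothetical rule base producing the error intensity λL, might look as follows; the features, scales and consequent values are illustrative only, not the authors' questionnaire-derived model:

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b on support (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def error_intensity(fatigue, experience):
    """Sugeno-style weighted average over an assumed rule base (inputs on 0..10)."""
    fatigue_high = tri(fatigue, 4, 10, 16)
    fatigue_low = tri(fatigue, -6, 0, 6)
    exp_low = tri(experience, -6, 0, 6)
    exp_high = tri(experience, 4, 10, 16)

    # Rules: (activation degree, crisp lambda_L consequent, errors per hour)
    rules = [
        (min(fatigue_high, exp_low), 1e-3),   # tired novice: worst case
        (min(fatigue_high, exp_high), 4e-4),
        (min(fatigue_low, exp_low), 3e-4),
        (min(fatigue_low, exp_high), 1e-4),   # rested expert: best case
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

lam = error_intensity(fatigue=7, experience=3)
print(f"lambda_L = {lam:.2e}")
```

The verbal rule base is where the expert knowledge enters; the triangular memberships encode the deliberately limited “precision” of the linguistic description.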


2021 ◽  
Vol 11 (9) ◽  
pp. 3921
Author(s):  
Paloma Carrasco ◽  
Francisco Cuesta ◽  
Rafael Caballero ◽  
Francisco J. Perez-Grau ◽  
Antidio Viguria

The use of unmanned aerial robots has increased exponentially in recent years, and the relevance of industrial applications in environments with degraded satellite signals is rising. This article presents a solution for the 3D localization of aerial robots in such environments. In order to truly use these versatile platforms for added-value cases in these scenarios, a high level of reliability is required. Hence, the proposed solution is based on a probabilistic approach that makes use of a 3D laser scanner, radio sensors, a previously built map of the environment and input odometry, to obtain pose estimations that are computed onboard the aerial platform. Experimental results show the feasibility of the approach in terms of accuracy, robustness and computational efficiency.
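A common probabilistic localization scheme of the kind described is the particle filter. The 1D sketch below, with an assumed single landmark and Gaussian noise, shows one predict-update-resample cycle; the paper's onboard 3D implementation fuses laser, radio, map and odometry data:

```python
import math
import random

def particle_filter_step(particles, control, measurement, landmark, noise=0.5):
    """One predict-update-resample cycle of Monte Carlo localization (1D sketch)."""
    # Predict: propagate each particle with the odometry input plus motion noise
    moved = [p + control + random.gauss(0.0, noise) for p in particles]
    # Update: weight each particle by how well its expected range to the
    # landmark agrees with the measured range
    weights = [
        math.exp(-0.5 * ((measurement - (landmark - p)) / noise) ** 2)
        for p in moved
    ]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample proportionally to weight
    return random.choices(moved, weights=weights, k=len(particles))

random.seed(0)
particles = [random.uniform(0.0, 10.0) for _ in range(500)]
# Robot at x = 2.0 moves +1.0, then measures range 7.0 to a landmark at x = 10.0
particles = particle_filter_step(particles, control=1.0, measurement=7.0, landmark=10.0)
estimate = sum(particles) / len(particles)
print(f"estimated x = {estimate:.2f}")
```

The same cycle generalizes to 6-DoF poses by replacing the scalar state with a pose and the range model with scan matching against the prior map.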


Author(s):  
Goran Alpsten

This paper is based on the experience from investigating over 400 structural collapses, incidents and serious structural damage cases with steel structures which have occurred over the past four centuries. The cause of the failures is most often a gross human error rather than a combination of “normal” variations in parameters affecting the load-carrying capacity, as considered in normal design procedures and structural reliability analyses. Human errors in execution are more prevalent as cause for the failures than errors in the design process, and the construction phase appears particularly prone to human errors. For normal steel structures with quasi-static (non-fatigue) loading, various structural instability phenomena have been observed to be the main collapse mode. An important observation is that welds are not as critical a cause of structural steel failures for statically loaded steel structures as implicitly understood in current regulations and rules for design and execution criteria.


Author(s):  
U Yildirim ◽  
O Ugurlu ◽  
E Basar ◽  
E Yuksekyildiz

Investigation of maritime accidents is a very important tool in identifying human factor-related problems. This study examines the causes of accidents, in particular the reasons for the grounding of container ships. These are analysed and evaluated according to their contribution rates using Monte Carlo simulation, run with the OpenFTA program. The study data are obtained from 46 accident reports from 1993 to 2011, prepared in the International Maritime Organization (IMO) Global Integrated Shipping Information System (GISIS), which compiles accidents reported within an international framework and by national shipping companies. The Monte Carlo simulation determined that 23.96% of the causes were mental problems, 26.04% physical problems, 38.58% voyage management errors, and 11.42% team management errors. Consequently, 50% of the human error was found to be attributable to human performance failures and 50% to team failures.
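Monte Carlo evaluation of a fault tree of the kind run in OpenFTA can be sketched as follows; the gate structure and basic-event probabilities are hypothetical, not taken from the 46 accident reports:

```python
import random

def simulate_top_event(p_basic, trials=100_000, seed=1):
    """Monte Carlo estimate of a toy grounding fault tree:
    top event = (mental OR physical error) AND (voyage OR team management error)."""
    random.seed(seed)
    hits = 0
    for _ in range(trials):
        occurs = {e: random.random() < p for e, p in p_basic.items()}
        human = occurs["mental"] or occurs["physical"]
        mgmt = occurs["voyage"] or occurs["team"]
        if human and mgmt:
            hits += 1
    return hits / trials

# Hypothetical basic-event probabilities (illustration only)
p_basic = {"mental": 0.10, "physical": 0.12, "voyage": 0.20, "team": 0.05}
estimate = simulate_top_event(p_basic)
analytic = (1 - 0.90 * 0.88) * (1 - 0.80 * 0.95)  # exact value for this tree
print(f"MC = {estimate:.4f}, analytic = {analytic:.4f}")
```

Counting how often each basic event participates in a simulated top event is what yields contribution rates like those reported in the study.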


Author(s):  
Eric Brehm ◽  
Robert Hertle ◽  
Markus Wetzel

In common structural design, random variables such as material strength or loads are represented by fixed numbers defined in design codes. This is also referred to as deterministic design. Addressing the random character of these variables directly, the probabilistic design procedure allows the determination of the probability of exceeding a defined limit state. This probability is referred to as the failure probability. From there, the structural reliability, representing the survival probability, can be determined. Structural reliability is thus a property of a structure or structural member, depending on the relevant limit states, failure modes and basic variables. This is the basis for the determination of partial safety factors, which are, for the sake of simpler design, applied within deterministic design procedures. In addition to the basic variables describing materials and loads, further basic variables representing the structural model have to be considered. These depend strongly on the experience of the design engineer and the level of detailing of the model. However, in the clear majority of cases [1], failure does not occur due to unexpectedly high or low values of loads or material strength. The most common reasons for failure are human errors in design and execution. This paper provides practical examples of original designs affected by human error and assesses the impact on structural reliability.
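The failure probability, and from it the reliability, can be estimated by Monte Carlo sampling over the basic variables, including a model-uncertainty variable of the kind the abstract mentions; all distributions below are assumed for illustration:

```python
import random

def monte_carlo_reliability(n=200_000, seed=42):
    """Estimate P_f for the limit state g = theta * R - S < 0 by sampling
    the basic variables (assumed normal distributions, illustration only)."""
    random.seed(seed)
    failures = 0
    for _ in range(n):
        theta = random.gauss(1.0, 0.05)  # model uncertainty of the resistance model
        r = random.gauss(500.0, 40.0)    # member resistance
        s = random.gauss(300.0, 30.0)    # load effect
        if theta * r - s < 0:
            failures += 1
    pf = failures / n
    return pf, 1.0 - pf

pf, reliability = monte_carlo_reliability()
print(f"P_f = {pf:.1e}, reliability = {reliability:.5f}")
```

Note what such a sampling scheme cannot capture: a gross human error is not a tail value of these distributions but a different basic variable altogether, which is the paper's point.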


Aviation ◽  
2013 ◽  
Vol 17 (2) ◽  
pp. 76-79 ◽  
Author(s):  
Peter Trifonov-Bogdanov ◽  
Leonid Vinogradov ◽  
Vladimir Shestakov

During an operational process, activity is implemented through an ordered sequence of certain actions united by a common motive. Actions can be simple or complex. Simple actions cannot be split into elements having independent objectives. Complex actions can be presented in the form of a set of simple actions. If the logical organisation of this set is open, a complex action can be described as an algorithm consisting of simple actions. That means various kinds of operational activities develop from the same simple and typical actions, but in various sequences. Therefore, human error is always generated by a more elementary error of action. Thus, errors of action are the primary parameter that is universal for any kind of activity of an aviation specialist and can serve as a measure for estimating the negative influence of the human factor (HF) on flight safety. Aviation personnel are various groups of experts having various specialisations and working in various areas of civil aviation. It is obvious that their influence on conditions is also unequal and is defined by their degree of interaction with the performance of flights. In this article, the results of an analysis of air incidents will be presented.


2016 ◽  
Vol 50 (0) ◽  
Author(s):  
Gisele Pinto de Oliveira ◽  
Ana Luiza de Souza Bierrenbach ◽  
Kenneth Rochel de Camargo Júnior ◽  
Cláudia Medina Coeli ◽  
Rejane Sobrino Pinheiro

ABSTRACT OBJECTIVE To analyze the accuracy of deterministic and probabilistic record linkage in identifying duplicate tuberculosis (TB) records, as well as the characteristics of discordant pairs. METHODS The study analyzed all TB records from 2009 to 2011 in the state of Rio de Janeiro. A deterministic record linkage algorithm was developed using a set of 70 rules based on combinations of fragments of the key variables, with or without modification (Soundex or substring); each rule was formed by three or more fragments. The probabilistic approach required a cutoff point for the score, above which links would be automatically classified as belonging to the same individual. The cutoff point was obtained by linking the Notifiable Diseases Information System – Tuberculosis database with itself, followed by manual review and by ROC and precision-recall curves. Sensitivity and specificity were calculated for the accuracy analysis. RESULTS Sensitivity ranged from 87.2% to 95.2% and specificity from 99.8% to 99.9% for probabilistic and deterministic record linkage, respectively. The occurrence of missing values for the key variables and low similarity measures for name and date of birth were mainly responsible for failures to identify records of the same individual with the techniques used. CONCLUSIONS The two techniques showed a high level of agreement in pair classification. Although deterministic linkage identified more duplicate records than probabilistic linkage, the latter retrieved records not identified by the former. User need and experience should be considered when choosing the technique to be used.
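The two linkage strategies can be contrasted in a toy sketch; the single rule, similarity measure and weights below are illustrative stand-ins for the study's 70 rules and scoring scheme:

```python
from difflib import SequenceMatcher

WEIGHTS = {"name": 0.6, "dob": 0.4}  # assumed field weights, not from the study

def similarity(a, b):
    """Crude string similarity in [0, 1] (stand-in for the study's measures)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def deterministic_match(rec1, rec2):
    """One example rule: date of birth and first-name fragment must agree exactly."""
    return (rec1["dob"] == rec2["dob"]
            and rec1["name"].split()[0].lower() == rec2["name"].split()[0].lower())

def probabilistic_score(rec1, rec2):
    """Weighted similarity score; a cutoff classifies a pair as the same person."""
    return (WEIGHTS["name"] * similarity(rec1["name"], rec2["name"])
            + WEIGHTS["dob"] * (1.0 if rec1["dob"] == rec2["dob"] else 0.0))

a = {"name": "Maria da Silva", "dob": "1980-04-12"}
b = {"name": "Maria D. Silva", "dob": "1980-04-12"}

print(deterministic_match(a, b))           # the example rule fires for this pair
print(f"{probabilistic_score(a, b):.2f}")  # compare against an assumed cutoff, e.g. 0.85
```

The sketch also shows why the techniques diverge: a missing date of birth kills the deterministic rule outright, while the probabilistic score can still clear the cutoff on name similarity alone.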


Author(s):  
Sharif E. Guseynov ◽  
Sergey Matyukhin ◽  
Misir J. Mardanov ◽  
Jekaterina V. Aleksejeva ◽  
Olga Sidorenko

The present paper deals with a problem of quantitatively controlling the seeding of a sown area with agricultural crops in different agroclimatic conditions. The problem is studied from the standpoint of three strategies: planning the seeding so as to minimize the risk associated with possible unfavourable agroclimatic conditions (a probabilistic approach); obtaining the maximum crop sales profit (a deterministic approach); and obtaining the maximum crop harvest. For the considered problem, mathematical models are constructed (one probabilistic model and two deterministic models, respectively), their analytical solutions are found, and the application of the constructed and solved models is then illustrated, and the obtained numerical results analysed, using a specific example.
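The flavour of such a seeding-allocation problem can be conveyed with a toy two-crop example under assumed weather scenarios; the data and the simple enumeration below are illustrative only and do not reproduce the paper's analytical models:

```python
# Hypothetical data: three weather scenarios with probabilities
scenarios = [("dry", 0.3), ("normal", 0.5), ("wet", 0.2)]
# Assumed profit per hectare of each crop under each scenario
profit = {
    "wheat": {"dry": 120.0, "normal": 200.0, "wet": 150.0},
    "barley": {"dry": 180.0, "normal": 160.0, "wet": 100.0},
}
TOTAL_AREA = 100.0  # hectares

def expected_profit(wheat_share):
    """Expected profit of splitting the area between wheat and barley."""
    area_w = wheat_share * TOTAL_AREA
    area_b = TOTAL_AREA - area_w
    return sum(
        p * (area_w * profit["wheat"][s] + area_b * profit["barley"][s])
        for s, p in scenarios
    )

# Scenario-weighted strategy: pick the share maximizing expected profit
best = max((expected_profit(x / 100), x / 100) for x in range(101))
print(f"best wheat share = {best[1]:.2f}, expected profit = {best[0]:.0f}")
```

With a linear objective the optimum sits at a corner; the paper's models add the structure (risk criteria, constraints) that makes interior allocations optimal.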


1999 ◽  
Vol 122 (1) ◽  
pp. 167-174 ◽  
Author(s):  
M. R. HUTCHINGS ◽  
S. HARRIS

Despite strong circumstantial evidence to suggest that the main route of TB transmission from badgers to cattle is via contaminated badger excreta, it is unclear whether the associated risks are high enough to account for the prevalence of the disease in south-west England. To decide whether this was a viable route of transmission, cattle contact with badger excreta was investigated using a deterministic approach to quantify the risks to cattle posed by badger excreta. Levels of investigative and grazing contacts between cattle and badger urine and faeces could each account for the disease prevalence in south-west England. An infection probability of 3·7×10−4 per bite from pasture contaminated with badger urine infected with Mycobacterium bovis could account for the prevalence of TB in cattle in south-west England. Infection probabilities of 6·9×10−7 per investigation and 1·1×10−7 per bite from badger latrines could each account for the prevalence of TB in cattle in the south-west. When considering only the high risk areas of south-west England these bounds fell by a factor of eight. However, badger excreta may still constitute a high level of risk to cattle. The levels of cattle contact with badger excreta are far higher than previously thought, suggesting that it is the probability of infection per given contact with infected badger excreta which has the greater influence on the probability of transmission and not the level of contact. The infection probability per cattle contact with infected badger excreta is in all likelihood extremely low.
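Per-contact probabilities of the order reported here compound over repeated contacts as 1 - (1 - p)^n for n independent contacts; the contact counts below are hypothetical, chosen only to show how a tiny per-bite probability can still drive prevalence:

```python
def infection_probability(p_per_contact, n_contacts):
    """Probability of at least one infection over n independent contacts."""
    return 1.0 - (1.0 - p_per_contact) ** n_contacts

# Per-bite probability from the paper for urine-contaminated pasture
p_bite = 3.7e-4
# Hypothetical numbers of contaminated bites per cow per grazing season
for n in (100, 1000, 10000):
    print(f"n = {n:5d}: P(infection) = {infection_probability(p_bite, n):.3f}")
```

This compounding is consistent with the paper's conclusion: with contact levels far higher than previously thought, the per-contact probability, not the contact level, dominates the transmission probability.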

