interval approach
Recently Published Documents


TOTAL DOCUMENTS: 221 (five years: 58)
H-INDEX: 22 (five years: 3)

2022 ◽  
Vol 18 (6) ◽  
pp. 84-92
Author(s):  
E. Yu. Kolesnikov ◽  
F. Vasileios

The purpose of the article was to analyze the problems that currently stand in the way of more effective application of the risk-based approach methodology in the field of technogenic safety management. Methods: theoretical and inductive methods, analysis of the authors' own experience, adopted normative legal acts, and other publications. The main results of the work include the following:
• despite the very broad use of the concept of "risk" in technogenic safety management, it still has no generally accepted interpretation;
• the evaluative concept of "risk" is often mistakenly used in place of objectively existing risk factors;
• quantitatively, technogenic risk should be characterized by indicators of a vector nature, since a complete specification of an indicator requires two components: the probability and the amount of damage;
• experience shows that the methods for assessing the probabilistic component of risk indicators recommended by regulatory documents on the analysis and quantification of technogenic risk carry very large uncertainty; instead of the traditional point estimate, a more adequate method of assessment is the interval approach, which takes this uncertainty into account and allows it to be quantified;
• the analysis showed that the so-called frequency approach, most often used to assess the probabilistic component of technogenic risk indicators, is applied improperly and without justification, since the phenomenon of statistical stability is, as a rule, not observed in the technosphere and no general populations exist;
• in society, and even among specialists, there is still no understanding of the need to express all three components of damage from an accident (explosion/fire) in monetary terms, without which the total damage cannot be estimated and expressed.
In conclusion, four key problems that hinder more effective use of the risk-based approach methodology in technogenic safety management are listed:
— imperfection of the existing methodological base for the analysis and quantitative assessment of technogenic risk;
— the problem of staffing in the field of technogenic risk management;
— the lack of national criteria for acceptable risk;
— complete disregard of the uncertainty of COR results and the lack of methodological support for the procedure of analyzing and quantifying this uncertainty.
Conclusion: solving the tasks specified in the article requires the efforts of the entire community of specialists, researchers, legislators, and practitioners engaged in the various aspects of technosphere safety management.
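The interval treatment of a risk indicator advocated above can be sketched in a few lines: each component of the vector indicator (probability, damage) is an interval rather than a point, and interval arithmetic propagates the uncertainty. The rates and damage figures below are hypothetical, not taken from the article.

```python
# Minimal sketch (hypothetical numbers): a technogenic risk indicator as a
# vector (probability, damage), each component given as an interval to
# quantify uncertainty instead of a single point estimate.

def interval_mul(a, b):
    """Product of two intervals [a_lo, a_hi] * [b_lo, b_hi]."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

p = (1e-5, 1e-4)        # annual accident probability, interval estimate
damage = (2e6, 8e6)     # total monetary damage, interval estimate

# The expected annual loss inherits an interval, making the uncertainty
# of the point estimate explicit.
expected_loss = interval_mul(p, damage)
print(expected_loss)
```

The width of the resulting interval is itself informative: a wide expected-loss interval signals that the input data cannot support a confident point statement of risk.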


2021 ◽  
Vol 27 (12) ◽  
pp. 2719-2745
Author(s):  
Mikhail V. POMAZANOV

Subject. This article deals with validating the consistency of rating-based model forecasts. Objectives. The article aims to provide developers and validators of rating-based models with a practical fundamental test for benchmarking the estimated default probability values produced by the models used in a rating system. Methods. The study uses the classical interval approach to testing statistical hypotheses, focused on the subject area of rating-system calibration. Results. In addition to the generally accepted tests of whether the predicted default probabilities of credit risk objects correspond to historically realized values, the article proposes a new statistical test that corrects the shortcomings of the accepted ones and is focused on "diagnosing" the consistency of the discrimination of objects implemented by the rating model. Examples are given of recognizing the reasons for a negative test result and of the negative consequences for lending if the current settings of the rating model are kept. Beyond the bias in the assessment of the overall default frequency in the loan portfolio, the proposed method makes it possible to objectively reveal inadequate discrimination of borrowers by a calibrated rating model, i.e., to diagnose the "disease" of the rating model. Conclusions and Relevance. The new practical benchmark test makes it possible to reject the hypothesis that the rating model's default probability estimates are consistent, at a given confidence level and with the available historical data. The test has the advantage of practical interpretability: based on its results, a conclusion can be drawn about the direction of model correction. The test can be used in a bank's internal validation of its own rating models, which the Bank of Russia requires for approaches based on internal ratings.
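The kind of classical interval check the article builds on can be illustrated with a toy per-grade binomial calibration test; the grades, predicted PDs, and default counts below are hypothetical, and this is an illustration of the generally accepted approach, not the article's new test.

```python
# Illustrative sketch (hypothetical numbers): for each rating grade, compare
# the realized default count against a normal-approximation confidence
# interval implied by the predicted probability of default (PD).
import math

def binomial_interval(pd, n, z=1.96):
    """Approximate 95% interval for the default count under predicted PD."""
    mean = n * pd
    sd = math.sqrt(n * pd * (1 - pd))
    return (mean - z * sd, mean + z * sd)

grades = [  # (predicted PD, borrowers in grade, observed defaults)
    (0.01, 1000, 14),
    (0.05, 400, 19),
    (0.15, 100, 31),
]

for pd, n, defaults in grades:
    lo, hi = binomial_interval(pd, n)
    consistent = lo <= defaults <= hi
    print(f"PD={pd:.2f}: observed {defaults}, "
          f"interval [{lo:.1f}, {hi:.1f}], consistent={consistent}")
```

In this toy example the third grade falls outside its interval, which is the kind of signal that would prompt a closer look at the model's calibration for that grade.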


2021 ◽  
pp. 175857322110654
Author(s):  
E. Fleischhacker ◽  
G. Siebenbürger ◽  
J. Gleich ◽  
T. Helfen ◽  
W. Böcker ◽  
...  

Background Open reduction and internal fixation (ORIF) of humeral head split fractures is challenging because of high instability and limited visibility. The aim of this retrospective study was to investigate the effect of extending the approach through the rotator interval (RI) on reduction quality and functional outcome. Methods 37 patients (mean age: 59 ± 16 years, 16 female) treated by ORIF through a standard deltopectoral (DP) approach were evaluated. The follow-up period was at least two years. In 17 cases, the approach was extended through the RI. Evaluation was based on radiographs, Constant scores (CS), and DASH scores. Results In group DP, reduction was "anatomic" in 9 cases (45%), "acceptable" in 5 cases (25%), and "malreduced" in 6 cases (30%). In group RI, reduction was "anatomic" in 12 cases (71%), "acceptable" in 5 cases (29%), and "malreduced" in none (p = 0.04). In the DP group, the CS was 60.2 ± 16.2 and the %CS was 63.9 ± 22.3, while in the RI group, the CS was 74.5 ± 17.4 and the %CS was 79.1 ± 24.1 (p = 0.07, p = 0.08). The DASH score was 22.8 ± 19.5 in the DP group compared to 25.2 ± 20.6 in the RI group (p = 0.53). Conclusions The RI approach improves visualization and enhances the quality of fracture reduction; however, functional outcomes may not differ significantly. Type of study and level of proof Retrospective, level III.


2021 ◽  
Vol 2096 (1) ◽  
pp. 012127
Author(s):  
A A Lavrukhin ◽  
A S Tukanova

Abstract The article presents a new approach to estimating the frequency characteristics of the impedance tensor for processing magnetotelluric data. The approach is based on applying interval analysis methods to the solution of a system of linear equations. As a reference for comparison, a combined robust algorithm is used (with data discarded by a coherence criterion, median estimation, and weighted least squares). This algorithm is compared with the results of the proposed interval computational algorithm, which is based on the method of J. Rohn as implemented in the intvalpy Python library. Computational experiments were performed using natural magnetotelluric field data. The interval approach can be successfully applied to the processing of magnetotelluric data.
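To make the notion of an interval linear system concrete, the toy sketch below bounds the solution set of a 2×2 system whose matrix and right-hand-side entries are intervals, by solving every endpoint (vertex) point system. This brute-force enumeration only illustrates the idea behind Rohn-type hull methods such as the one implemented in intvalpy; the entries are hypothetical and unrelated to impedance-tensor estimation.

```python
# Hedged sketch (hypothetical 2x2 system): for a regular interval matrix,
# the component-wise extrema of the united solution set are attained at
# endpoint systems, so enumerating all vertex systems bounds the solution.
import itertools
import numpy as np

# Interval matrix and right-hand side as (lower, upper) pairs.
A_iv = [[(2.0, 3.0), (0.0, 1.0)],
        [(1.0, 2.0), (2.0, 3.0)]]
b_iv = [(0.0, 120.0), (60.0, 240.0)]

entries = [e for row in A_iv for e in row] + b_iv
lo = np.full(2, np.inf)
hi = np.full(2, -np.inf)

# Enumerate all endpoint combinations (2**6 here) and solve each point system.
for combo in itertools.product(*entries):
    A = np.array(combo[:4]).reshape(2, 2)
    b = np.array(combo[4:])
    x = np.linalg.solve(A, b)
    lo = np.minimum(lo, x)
    hi = np.maximum(hi, x)

for i in range(2):
    print(f"x[{i}] in [{lo[i]:.2f}, {hi[i]:.2f}]")
```

Vertex enumeration scales exponentially, which is why practical algorithms (Rohn's method among them) characterize the hull without brute force.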


2021 ◽  
Vol 11 (10) ◽  
pp. 2584-2597
Author(s):  
M. L. Sworna Kokila ◽  
V. Gomathi

Efficient tracking of vehicle drivers can be used to prevent collisions through visual analysis of human behaviour. Many existing methods, such as iris–sclera analysis, approximation of the driver's gaze, and Hough-transform-based techniques, have not been satisfactory, since they make it difficult to detect drivers' sleepiness and carelessness. This paper therefore proposes estimating the head profile after locating the left eye, right eye, mouth, and nose; the absence of any of these features marks a non-frontal pose. The Rectangular Face Classification control system detects frontal faces by moving a rectangular filter over the image to test the dullness of the face area. Once the facial regions are tracked, the Hybrid Balanced Network separates the eye area from them based on the major and minor axes. The Heavy Eyed approach is then used to detect drowsiness and eye twitching: the intensity of the horizontal plot is measured, and successive frames in an eye twitch are not counted as a closed eye for three seconds. The proposed work therefore effectively improves accuracy.
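The three-second closed-eye rule described above amounts to a consecutive-frame counter over the per-frame eye state. The sketch below assumes a hypothetical frame rate and a boolean closed-eye signal standing in for the paper's detector output.

```python
# Illustrative sketch (hypothetical frame rate and signal): flag drowsiness
# only when the eye stays closed for three consecutive seconds of frames,
# so that short blinks and eye twitches are not counted.
FPS = 30
THRESHOLD_FRAMES = 3 * FPS  # three seconds of consecutive closed-eye frames

def detect_drowsiness(eye_closed_frames):
    """eye_closed_frames: iterable of booleans, True = eye closed in frame.
    Returns the frame index where the alarm fires, or None."""
    run = 0
    for i, closed in enumerate(eye_closed_frames):
        run = run + 1 if closed else 0
        if run >= THRESHOLD_FRAMES:
            return i
    return None

blink = [True] * 10 + [False] * 200            # short blink: no alarm
long_closure = [False] * 5 + [True] * (3 * FPS)  # sustained closure: alarm
print(detect_drowsiness(blink), detect_drowsiness(long_closure))
```

Resetting the run counter on any open-eye frame is what distinguishes a twitch or blink from genuine drowsiness in this rule.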


2021 ◽  
pp. 77-83
Author(s):  
Т.V. Potanina ◽  
O.V. Yefimov ◽  
M.M. Pylypenko

The applications of the interval and standard probabilistic approaches for verifying the reliability of experimental results on the mechanical properties of nuclear materials are compared. The presence of "outliers" in a sample of hardness values for hafnium ingots is studied for fixed oxygen mass content. The situation where the measurement error is bounded but no reliable information about its distribution is available is considered. The correctness of applying numerical methods of interval analysis to the processing of experimental data under conditions of uncertainty and noise is shown. The dependence of the Brinell hardness of refined hafnium samples on the oxygen mass content was determined by a combination of methods: removal of anomalous measurements by interval analysis, and approximation of the data from the truncated samples by Levenberg-Marquardt minimization.
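The Levenberg-Marquardt fitting step named above can be sketched with a minimal hand-rolled implementation. The data points and the linear hardness-vs-oxygen model below are hypothetical stand-ins, not the paper's data or model.

```python
# Hedged sketch (hypothetical data and model): fitting Brinell hardness as a
# linear function of oxygen mass content with a minimal Levenberg-Marquardt
# loop (damped Gauss-Newton with adaptive damping).
import numpy as np

# Synthetic "truncated sample" after outlier removal: oxygen (wt %) vs hardness.
oxygen = np.array([0.02, 0.05, 0.08, 0.11, 0.15])
hardness = np.array([95.0, 110.0, 126.0, 140.0, 161.0])

def residuals(p):
    a, b = p
    return a + b * oxygen - hardness          # assumed linear dependence

def jacobian(p):
    return np.column_stack([np.ones_like(oxygen), oxygen])

def levenberg_marquardt(res_fn, jac_fn, p0, lam=1e-3, iters=50):
    p = np.asarray(p0, float)
    for _ in range(iters):
        r, J = res_fn(p), jac_fn(p)
        JTJ = J.T @ J
        # LM step: (J^T J + lam * diag(J^T J)) dp = -J^T r
        step = np.linalg.solve(JTJ + lam * np.diag(np.diag(JTJ)), -J.T @ r)
        if np.linalg.norm(res_fn(p + step)) < np.linalg.norm(r):
            p, lam = p + step, lam * 0.5      # accept step, reduce damping
        else:
            lam *= 2.0                        # reject step, damp harder
    return p

a, b = levenberg_marquardt(residuals, jacobian, p0=[80.0, 300.0])
print(f"hardness ~= {a:.1f} + {b:.1f} * oxygen")
```

For this linear model the problem is actually solvable in closed form; LM earns its keep on the nonlinear models such studies typically fit, where the damping stabilizes Gauss-Newton steps.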


Author(s):  
Joanna Akrouche ◽  
Mohamed Sallak ◽  
Eric Châtelet ◽  
Fahed Abdallah ◽  
Hiba Haj Chhade

Abstract An essential step in the safe design of systems is choosing the system configuration that maximizes the overall availability of the system and minimizes its overall cost. The main objective of this paper is to propose a method for optimizing multi-state system availability in the presence of both aleatory and epistemic uncertainty, in order to choose the best configuration for the system in terms of availability, cost, and imprecision. The problem is formulated as follows: consider several configurations of a system, each consisting of components with different working states and with imprecise failure and repair rates provided in the form of intervals. The aim is to find the best configuration with respect to the system's imprecise availability, cost, and imprecision. First, the imprecise steady-state availability of each configuration is computed using an original method based on Markovian approaches combined with interval contraction techniques. Then an objective function incorporating cost, the lower and upper bounds of availability, and imprecision is defined and computed to select the best configuration. A use case is discussed to illustrate the proposed method.
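A drastically simplified version of this setup can be sketched for a two-state repairable component, whose steady-state availability is A = μ/(λ+μ); with interval failure and repair rates, monotonicity gives exact bounds, and a scalar objective can rank configurations. The rates, costs, and objective weights below are hypothetical, and this sketch omits the paper's Markovian multi-state modeling and interval contraction.

```python
# Minimal sketch (hypothetical numbers, two-state simplification): bound the
# steady-state availability A = mu / (lambda + mu) under interval rates, then
# score configurations by availability lower bound, imprecision, and cost.

def availability_bounds(lam, mu):
    """lam, mu: (lower, upper) interval failure and repair rates."""
    a_lo = mu[0] / (lam[1] + mu[0])   # worst case: frequent failures, slow repair
    a_hi = mu[1] / (lam[0] + mu[1])   # best case: rare failures, fast repair
    return a_lo, a_hi

def score(config, w_avail=1.0, w_impr=0.5, w_cost=1e-4):
    a_lo, a_hi = availability_bounds(config["lam"], config["mu"])
    imprecision = a_hi - a_lo
    # Reward guaranteed availability; penalize imprecision and cost.
    return w_avail * a_lo - w_impr * imprecision - w_cost * config["cost"]

configs = {
    "A": {"lam": (1e-4, 2e-4), "mu": (0.05, 0.10), "cost": 1200.0},
    "B": {"lam": (5e-5, 4e-4), "mu": (0.08, 0.12), "cost": 900.0},
}
best = max(configs, key=lambda name: score(configs[name]))
print("best configuration:", best)
```

Here the cheaper configuration wins despite a slightly wider availability interval, showing how the weights trade guaranteed availability against imprecision and cost.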

