Modeling System Audit as a Sequential Test with Discovery as a Failure Time Endpoint

2011 ◽  
Author(s):  
Craig S Wright ◽  
Tanveer A. Zia
Author(s):  
A.Yu. Kulakov

Goal. Assess the reliability of a complex technical system with periodic reconfiguration and compare the results with those obtained for a similar system without reconfiguration. Materials and methods. This article uses statistical modeling (the Monte Carlo method) to assess the reliability of a complex system. Normal and exponential failure-time distributions are used to model failures of system elements. The reconfiguration algorithm is the one proposed for the attitude and orbit control system of a spacecraft. Results. A computer program based on the statistical modeling method has been developed for assessing reliability; it makes it possible to evaluate systems of varying complexity with exponential and normal distributions, both with and without periodic reconfiguration. A quantitative estimate of reliability as a function of the probability of system failure is obtained. Conclusion. It has been demonstrated that the system with reconfiguration has better reliability characteristics for both the exponential and the normal distribution of failures.
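The Monte Carlo approach described above can be sketched in a few lines. This is a toy illustration, not the authors' program: reconfiguration is reduced here to switching in a cold spare on each failure (an assumed simplification), and only the exponential case is shown, since the memoryless property lets us add independent exponential inter-failure times.

```python
import random

def simulate_reliability(n_trials, mission_time, rate, n_active, n_spares, seed=0):
    """Crude Monte Carlo estimate of P(system survives mission_time).

    The system needs n_active working elements, each with exponential
    failure time (parameter `rate`). On each failure a cold spare, if any
    remain, instantly replaces the failed element -- a hypothetical
    stand-in for the paper's reconfiguration step, not its exact algorithm.
    """
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_trials):
        spares = n_spares
        t = 0.0
        while True:
            # Time to next failure: min of n_active exponentials ~ Exp(n_active * rate)
            t += rng.expovariate(n_active * rate)
            if t >= mission_time:
                survived += 1
                break
            if spares == 0:
                break  # system failure before mission end
            spares -= 1  # reconfigure: activate a spare
    return survived / n_trials

# Compare a reconfigurable system (2 spares) with a plain one (no spares)
p_reconf = simulate_reliability(100_000, 1.0, 0.5, 3, 2)
p_plain = simulate_reliability(100_000, 1.0, 0.5, 3, 0)
```

As expected, the estimate with spares exceeds the one without, matching the article's qualitative conclusion.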


Author(s):  
Ehtesham Husain ◽  
Masood ul Haq

The reliability (unreliability) and life testing are important topics in the fields of engineering, electronics, medicine, economics, and many more, where we are interested in the life of components, human organs, subsystems, and systems. Statistically, a probability distribution of failure time (lifetime) of a certain form is usually assumed to give the reliability of a component of a system at each time t. Some well-known parametric lifetime models (T ≥ 0) are the Exponential, Weibull, Inverse Weibull, Gamma, Lognormal, and Normal (T > 0; left-truncated) distributions.

In this paper we consider a system that has two components with independent but non-identical lifetime probabilities, described by two distinct random variables, say T1 and T2, where T1 has a constant hazard rate and T2 has an increasing hazard rate, respectively.
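A minimal sketch of such a two-component model: T1 exponential (constant hazard) and T2 Weibull with shape k > 1 (increasing hazard). The series configuration below is an illustrative assumption, not stated in the abstract, and the function names are hypothetical.

```python
import math

def r_exponential(t, lam):
    """Survival function of T1: constant hazard lam, so R1(t) = exp(-lam * t)."""
    return math.exp(-lam * t)

def r_weibull(t, k, scale):
    """Survival function of T2: Weibull, R2(t) = exp(-(t/scale)^k).

    Shape k > 1 gives an increasing hazard rate h(t) = (k/scale) * (t/scale)^(k-1).
    """
    return math.exp(-((t / scale) ** k))

def r_series(t, lam, k, scale):
    """Independent components in series: the system survives iff both survive,
    so R(t) = R1(t) * R2(t)."""
    return r_exponential(t, lam) * r_weibull(t, k, scale)
```

For a parallel (redundant) configuration the combination rule would instead be 1 - (1 - R1)(1 - R2); the independence assumption is what lets the marginal survival functions multiply.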


2021 ◽  
Vol 11 (6) ◽  
pp. 2521
Author(s):  
Feng Jiang ◽  
Jianyong Liu ◽  
Wei Yuan ◽  
Jianbo Yan ◽  
Lin Wang ◽  
...  

Improving the fire resistance of the key cables connected to firefighting and safety equipment is of great importance. Based on the engineering practice of an oil storage company, this study proposes a modification scheme that entails spraying fire-retardant coatings on the outer surface of a cable tray to delay the failure times of the cables in the tray. To verify the effect, 12 specimens were prepared using five kinds of fire-retardant coatings and two kinds of fire-resistant cotton to coat the cable tray. The specimens were installed in a vertical fire resistance test furnace, and a fire resistance test was carried out under the ISO 834 standard fire condition. Data on the surface temperature and the insulation resistance of the cables in the trays were collected, and the fireproofing effect was analyzed. The results showed that, compared with the control group, the failure time of the cables could be delayed by a factor of 1.57–14.86, and the thicker the fire-retardant coating, the better the fireproofing effect. In general, the fire protection effect of the fire-retardant coatings was better than that of the fire-resistant cotton.


2020 ◽  
Vol 22 (Supplement_2) ◽  
pp. ii85-ii86
Author(s):  
Ping Zhu ◽  
Xianglin Du ◽  
Angel Blanco ◽  
Leomar Y Ballester ◽  
Nitin Tandon ◽  
...  

Abstract OBJECTIVES To investigate the impact of biopsy preceding resection compared to upfront resection on glioblastoma overall survival (OS) and post-operative outcomes using the National Cancer Database (NCDB). METHODS A total of 17,334 GBM patients diagnosed between 2010 and 2014 were derived from the NCDB. Patients were categorized into two groups: “upfront resection” versus “biopsy followed by resection”. The primary outcome was OS. Post-operative outcomes, including 30-day readmission/mortality, 90-day mortality, and prolonged length of inpatient hospital stay (LOS), were secondary endpoints. Kaplan-Meier methods and accelerated failure time (AFT) models with gamma distribution were applied for survival analysis. Multivariable binary logistic regression models were performed to compare differences in the post-operative outcomes between these groups. RESULTS Patients undergoing “upfront resection” experienced superior survival compared to those undergoing “biopsy followed by resection” (median OS: 12.4 versus 11.1 months, log-rank test: P=0.001). In multivariable AFT models, significant survival benefits were observed among patients undergoing “upfront resection” (time ratio [TR]: 0.83, 95% CI: 0.75–0.93, P=0.001). Patients undergoing upfront GTR had the longest survival compared to upfront STR, GTR following STR, or GTR and STR following an initial biopsy (14.4 vs. 10.3, 13.5, 13.3, and 9.1 months; TR: 1.00 [Ref.], 0.75, 0.82, 0.88, and 0.67, respectively). Recent years of diagnosis, higher income, and treatment at academic facilities were significantly associated with the likelihood of undergoing upfront resection after adjusting for covariates. Multivariable logistic regression revealed that 30-day mortality and 90-day mortality were decreased by 73% and 44%, respectively, for patients undergoing “upfront resection” over “biopsy followed by resection” (both P < 0.001).
CONCLUSIONS Pre-operative biopsies for surgically accessible tumors with characteristic imaging features of glioblastoma lead to worse survival despite subsequent resection compared to upfront resection.
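The time-ratio interpretation used in this abstract can be made concrete with a small sketch. In an AFT model, log(T) = μ + βx + σε, so a binary covariate multiplies survival time by exp(β), the time ratio (TR). The arithmetic below only restates the reported numbers; the assumption that "upfront resection" is the reference group is ours, inferred from the direction of the comparisons.

```python
import math

# In an AFT model a covariate acts multiplicatively on time:
#   T | x=1  is distributed like  exp(beta) * (T | x=0),
# so TR = exp(beta). TR < 1 means shorter survival time.
def time_ratio(beta):
    return math.exp(beta)

# Reported adjusted TR of 0.83 corresponds to ~17% shorter survival
# time for the disadvantaged group:
beta = math.log(0.83)
tr = time_ratio(beta)

# Crude unadjusted analogue from the reported medians (11.1 vs 12.4 months):
unadjusted_ratio = 11.1 / 12.4  # roughly 0.90
```

Note that the unadjusted median ratio (about 0.90) is closer to 1 than the adjusted TR of 0.83, which is consistent with confounders partially masking the effect before adjustment.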


2021 ◽  
Vol 11 (9) ◽  
pp. 4280
Author(s):  
Iurii Katser ◽  
Viacheslav Kozitsin ◽  
Victor Lobachev ◽  
Ivan Maksimov

Offline changepoint detection (CPD) algorithms are used for optimal signal segmentation. Generally, these algorithms are based on the assumption that the signal's changed statistical properties are known, so that appropriate models (metrics, cost functions) for changepoint detection can be used. Otherwise, the process of proper model selection can become laborious and time-consuming, with uncertain results. Although the ensemble approach is well known for increasing the robustness of individual algorithms and dealing with the challenges mentioned, it is weakly formalized and much less explored for CPD problems than for outlier detection or classification problems. This paper proposes an unsupervised CPD ensemble (CPDE) procedure, with pseudocode for the proposed ensemble algorithms and a link to their Python implementation. The novelty of the approach lies in aggregating several cost functions before running the changepoint search procedure during offline analysis. The numerical experiment showed that the proposed CPDE outperforms non-ensemble CPD procedures. Additionally, we analyzed and compared common CPD algorithms, scaling functions, and aggregation functions in the numerical experiment. The results were obtained on two anomaly benchmarks that contain industrial faults and failures: the Tennessee Eastman Process (TEP) and the Skoltech Anomaly Benchmark (SKAB). One possible application of our research is the estimation of failure time for fault identification and isolation problems in technical diagnostics.
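The core idea, aggregating scaled cost curves before a single search pass, can be sketched as follows. This is a toy single-changepoint illustration, not the authors' CPDE implementation: the two cost functions, the min-max scaling, and the mean aggregation are illustrative choices.

```python
import math
import random

def mean(xs):
    return sum(xs) / len(xs)

def l2_cost(x, t):
    """Sum of squared deviations around each segment mean (mean-shift cost)."""
    total = 0.0
    for seg in (x[:t], x[t:]):
        m = mean(seg)
        total += sum((v - m) ** 2 for v in seg)
    return total

def var_cost(x, t):
    """Gaussian log-variance cost, sensitive to variance changes."""
    total = 0.0
    for seg in (x[:t], x[t:]):
        m = mean(seg)
        var = sum((v - m) ** 2 for v in seg) / len(seg)
        total += len(seg) * math.log(var + 1e-8)
    return total

def minmax(curve):
    lo, hi = min(curve), max(curve)
    return [(c - lo) / (hi - lo + 1e-12) for c in curve]

def ensemble_changepoint(x, cost_fns, margin=5):
    """Scale each cost curve to [0, 1] and average them, so the aggregation
    happens *before* the single argmin search -- the CPDE ordering."""
    ts = list(range(margin, len(x) - margin))
    curves = [minmax([fn(x, t) for t in ts]) for fn in cost_fns]
    agg = [sum(col) / len(col) for col in zip(*curves)]
    return ts[agg.index(min(agg))]

# Synthetic signal with a mean shift at index 100
rng = random.Random(1)
x = [rng.gauss(0, 1) for _ in range(100)] + [rng.gauss(3, 1) for _ in range(100)]
cp = ensemble_changepoint(x, [l2_cost, var_cost])
```

Scaling before aggregation matters here: the two cost functions live on different numeric scales, so averaging the raw curves would let one cost dominate the vote.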

