Bayesian reliability models of Weibull systems: State of the art

Author(s): Abdelaziz Zaidi, Belkacem Ould Bouamama, Moncef Tagina

Abstract: In the reliability modeling field, we sometimes encounter systems with uncertain structures for which fault trees and reliability block diagrams cannot be used. Bayesian approaches offer considerable efficiency in overcoming this problem. This paper reviews recent contributions to reliability modeling with the Bayesian network (BN) approach. Bayesian reliability models are applied to systems whose failure times follow a Weibull distribution. To formulate the reliability model, Bayesian estimation of the Weibull parameters and the model's goodness of fit are discussed. The advantages of this modeling approach are presented for systems with an unknown reliability structure, systems subject to common cause failures, and redundant systems. Finally, we raise the issue of using BNs in the fault diagnosis area.
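The abstract does not spell out the estimation machinery, so the following is only a minimal sketch of Bayesian Weibull inference: a grid approximation of the posterior over the shape and scale parameters from complete failure-time data, and the induced posterior of the reliability R(t) = exp(-(t/eta)^beta). The failure times, grid ranges, and flat priors are assumptions made for illustration.

```python
# Minimal sketch (not the paper's model): grid-approximation Bayesian estimation
# of Weibull shape/scale and of the reliability R(t) = exp(-(t/eta)**beta).
import numpy as np
from scipy.stats import weibull_min

t_fail = np.array([120., 340., 510., 760., 980., 1420.])   # hypothetical failure times (hours)

beta_grid = np.linspace(0.5, 4.0, 200)     # shape parameter grid
eta_grid = np.linspace(100., 3000., 300)   # scale parameter grid
B, E = np.meshgrid(beta_grid, eta_grid, indexing="ij")

# Log-likelihood on the grid, with flat priors over the grid ranges.
loglik = sum(weibull_min.logpdf(t, c=B, scale=E) for t in t_fail)
post = np.exp(loglik - loglik.max())
post /= post.sum()

# Posterior mean of the reliability at a mission time of interest.
t_mission = 500.0
R_grid = np.exp(-(t_mission / E) ** B)
print("Posterior mean R(500 h):", np.sum(post * R_grid))
print("Posterior mean shape beta:", np.sum(post * B))
```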

Author(s): Shinji Inoue, Naoki Iwamoto, Shigeru Yamada

This paper discusses a new approach to discrete-time software reliability growth modeling based on a discrete-time infinite-server queueing model, which describes the debugging process in the testing phase. Our approach enables us to develop discrete-time software reliability growth models (SRGMs) that could not be developed under conventional discrete-time modeling approaches. This paper also discusses goodness-of-fit comparisons of our discrete-time SRGMs with conventional continuous-time SRGMs in terms of mean squared error, and shows numerical examples of software reliability analysis with our models using actual data.
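The queueing-based formulation itself is not reproduced here; as a stand-in, the sketch below fits a classic discrete exponential SRGM, H(n) = a(1 - (1 - b)^n), to cumulative fault counts by least squares and reports the mean squared error used for goodness-of-fit comparison. The fault-count data and starting values are hypothetical.

```python
# Minimal sketch: least-squares fit of a discrete exponential SRGM and its MSE.
import numpy as np
from scipy.optimize import curve_fit

n = np.arange(1, 16)                                   # testing periods
y = np.array([5, 11, 17, 21, 26, 29, 33, 35, 38,
              40, 41, 43, 44, 45, 46], dtype=float)    # cumulative faults (made up)

def discrete_exp_srgm(n, a, b):
    # Expected cumulative number of detected faults by period n.
    return a * (1.0 - (1.0 - b) ** n)

(a_hat, b_hat), _ = curve_fit(discrete_exp_srgm, n, y, p0=[50.0, 0.1],
                              bounds=([1.0, 1e-4], [1e4, 1.0]))
mse = np.mean((y - discrete_exp_srgm(n, a_hat, b_hat)) ** 2)
print(f"a = {a_hat:.1f}, b = {b_hat:.3f}, MSE = {mse:.3f}")
```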


Materials (2022), Vol. 15 (1), pp. 384
Author(s): António Gaspar-Cunha, José A. Covas, Janusz Sikora

Given the global economic and societal importance of the polymer industry, the continuous search for improvements in the various processing techniques is of primordial practical importance. This review evaluates the application of optimization methodologies to the main polymer processing operations. The most important characteristics of the use of optimization techniques, such as the nature of the objective function, the type of optimization algorithm, the modelling approach used to evaluate candidate solutions, and the parameters to optimize, are discussed. The aim is to identify the most important features of an optimization system for polymer processing problems and to define the best procedure for each practical situation. For this purpose, the state of the art of the optimization methodologies usually employed is first presented, followed by an extensive review of the literature dealing with the major processing techniques; the discussion is completed by considering both the characteristics identified and the available optimization methodologies. This first part of the review focuses on extrusion, namely single- and twin-screw extruders, extrusion dies, and calibrators. It is concluded that there is a set of methodologies that can be applied confidently in polymer processing with very good performance and without demanding computational requirements.
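As a toy illustration of the ingredients the review enumerates (decision parameters, an objective function built on a process model, and an optimization algorithm), the sketch below tunes screw speed and barrel temperature against an entirely hypothetical surrogate process model with a weighted-sum objective; none of the numbers or the surrogate come from the review.

```python
# Toy illustration: weighted-sum objective over a made-up extrusion surrogate,
# optimized with differential evolution.
import numpy as np
from scipy.optimize import differential_evolution

T_MELT_TARGET = 210.0  # degC, assumed target melt temperature

def surrogate_process(x):
    """Made-up stand-in for a process simulation: returns (output_kg_h, melt_T)."""
    screw_rpm, barrel_T = x
    output = 0.4 * screw_rpm * (1.0 - np.exp(-(barrel_T - 150.0) / 60.0))
    melt_T = barrel_T + 0.15 * screw_rpm
    return output, melt_T

def objective(x, w=0.5):
    output, melt_T = surrogate_process(x)
    # Weighted sum: maximize output, penalize deviation from the target melt T.
    return -w * output + (1.0 - w) * abs(melt_T - T_MELT_TARGET)

bounds = [(20.0, 200.0), (160.0, 240.0)]   # screw speed (rpm), barrel T (degC)
res = differential_evolution(objective, bounds, seed=0)
print("best (rpm, degC):", res.x, "objective:", res.fun)
```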


2019, Vol. 22 (4), pp. 903-931
Author(s): Rakhyun E Kim

Abstract: International institutions such as treaties and organizations shape, and are shaped by, the large web-like architecture of global governance. Yet we know little about what this architecture looks like, why certain structures are observed, and how they are linked to the functioning of international institutions as well as the overall effectiveness of global governance. Over the past decade, network science has emerged as a promising and indispensable approach to unraveling structural nuances and complexities of the system of international institutions. This article presents a state-of-the-art review of this emerging field of research and seeks to stimulate its further development. In this article, I draw connections between various network analyses of global governance that are found in different bodies of literature. In so doing, I integrate three separate but overlapping strands of work on institutional fragmentation, polycentricity, and complexity and bring much-needed conceptual clarity to the debate. Building on previous studies, I propose a framework for operationalizing fragmentation, polycentricity, and complexity in network terms in order to enable systematic and comparative analysis of global governance systems. This article argues that there is much potential in the network approach and makes a case for advancing the “network science of global governance.”
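The article's own framework is not reproduced here; the sketch below shows one hypothetical way standard network metrics could proxy the three notions on a toy institution graph using networkx (connected components for fragmentation, modularity communities for polycentricity, density as one complexity indicator). The graph and the choice of proxies are illustrative assumptions only.

```python
# Hypothetical operationalization on a toy institution network with networkx.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy graph: nodes are institutions, edges are formal links between them;
# the data are invented for illustration.
G = nx.Graph()
G.add_edges_from([
    ("UNFCCC", "ParisAgreement"), ("UNFCCC", "KyotoProtocol"),
    ("CBD", "NagoyaProtocol"), ("CBD", "CartagenaProtocol"),
    ("ParisAgreement", "CBD"), ("CITES", "CMS"),
])

print("components (fragmentation proxy):", nx.number_connected_components(G))
print("communities (polycentricity proxy):", len(greedy_modularity_communities(G)))
print("density (one complexity proxy):", round(nx.density(G), 3))
```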


2018, Vol. 6, pp. 357-371
Author(s): Edwin Simpson, Iryna Gurevych

We introduce a scalable Bayesian preference learning method for identifying convincing arguments in the absence of gold-standard ratings or rankings. In contrast to previous work, we avoid the need for separate methods to perform quality control on training data, predict rankings and perform pairwise classification. Bayesian approaches are an effective solution when faced with sparse or noisy training data, but have not previously been used to identify convincing arguments. One issue is scalability, which we address by developing a stochastic variational inference method for Gaussian process (GP) preference learning. We show how our method can be applied to predict argument convincingness from crowdsourced data, outperforming the previous state-of-the-art, particularly when trained with small amounts of unreliable data. We demonstrate how the Bayesian approach enables more effective active learning, thereby reducing the amount of data required to identify convincing arguments for new users and domains. While word embeddings are principally used with neural networks, our results show that word embeddings in combination with linguistic features also benefit GPs when predicting argument convincingness.
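The paper's scalable stochastic variational inference is not reproduced here; the sketch below only shows the underlying model idea: MAP inference for Gaussian process preference learning with a probit pairwise likelihood over latent convincingness scores. The items, features, kernel hyperparameters, and preference pairs are invented.

```python
# Minimal sketch: MAP estimate of latent convincingness under a GP prior and a
# probit pairwise-preference likelihood (not the paper's SVI method).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 3))                 # 6 arguments, 3 features (made up)
pairs = [(0, 1), (2, 1), (3, 4), (0, 5)]    # (winner, loser) crowdsourced labels

def rbf_kernel(X, lengthscale=1.0, var=1.0):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return var * np.exp(-0.5 * d2 / lengthscale**2)

K = rbf_kernel(X) + 1e-6 * np.eye(len(X))
K_inv = np.linalg.inv(K)

def neg_log_posterior(f):
    # Probit likelihood of the observed preferences plus the GP prior on f.
    ll = sum(norm.logcdf((f[w] - f[l]) / np.sqrt(2.0)) for w, l in pairs)
    return -ll + 0.5 * f @ K_inv @ f

f_map = minimize(neg_log_posterior, np.zeros(len(X)), method="L-BFGS-B").x
print("MAP convincingness scores:", np.round(f_map, 2))
print("P(arg0 beats arg1):", norm.cdf((f_map[0] - f_map[1]) / np.sqrt(2.0)))
```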


Diversity (2020), Vol. 12 (2), pp. 70
Author(s): Juan C. Garcia-R, Emily Moriarty Lemmon, Alan R. Lemmon, Nigel French

The integration of state-of-the-art molecular techniques and analyses, together with a broad taxonomic sampling, can provide new insights into bird interrelationships and divergence. Despite their evolutionary significance, the relationships among several rail lineages remain unresolved, as does the general timescale of rail evolution. Here, we disentangle the deep phylogenetic structure of rails using anchored phylogenomics. We analysed a set of 393 loci from 63 species, representing approximately 40% of the extant familial diversity. Our phylogenomic analyses reconstruct the phylogeny of rails and robustly infer several previously contentious relationships. Concatenated maximum likelihood and coalescent species-tree approaches recover identical topologies with strong node support. The results are concordant with previous phylogenetic studies based on small DNA datasets, but they also provide additional resolution. Our dating analysis, using fossils and both Bayesian and non-Bayesian approaches, provides contrasting divergence time estimates. Our study refines the evolutionary history of rails, offering a foundation for future evolutionary studies of birds.


Author(s): W Mechri, C Simon, K Ben Othman

This paper analyses the problem of epistemic uncertainty in assessing the performance of safety instrumented systems (SIS) using fault trees. The imperfect knowledge concerns the common cause failures (CCF) involved in a SIS operating in low-demand mode. Point-valued CCF factors are replaced by fuzzy numbers, allowing experts to express their uncertainty about the CCF values. This paper shows how these uncertainties propagate through the fault tree and how they induce uncertainty in the SIS probability of failure on demand and in its safety integrity level. For verification and comparison, and to show the accuracy of the approach, a Monte Carlo sampling approach is proposed in which a uniform or triangular second-order probability distribution of the CCF factors is considered.
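The authors' fault tree is not reproduced here; the sketch below illustrates the second-order Monte Carlo idea on a simplified 1oo2 beta-factor PFD formula, sampling the CCF factor from a triangular distribution and reading off the spread of the average probability of failure on demand against the IEC 61508 SIL bands. The failure rate, proof-test interval, and triangular bounds are assumed values.

```python
# Second-order Monte Carlo sketch: epistemic uncertainty on the CCF factor beta
# propagated through a simplified 1oo2 beta-factor PFD formula.
import numpy as np

rng = np.random.default_rng(1)
lam_du = 2e-6          # dangerous undetected failure rate, per hour (assumed)
T1 = 8760.0            # proof-test interval, hours (assumed: 1 year)

# Epistemic uncertainty on beta expressed as a triangular distribution.
beta = rng.triangular(left=0.02, mode=0.05, right=0.10, size=100_000)

# Simplified average PFD of a 1oo2 architecture with a beta-factor CCF model.
pfd = ((1.0 - beta) * lam_du * T1) ** 2 / 3.0 + beta * lam_du * T1 / 2.0

lo, hi = np.percentile(pfd, [5, 95])
print(f"PFDavg 90% interval: [{lo:.2e}, {hi:.2e}]")
# IEC 61508 bands: SIL 3 if 1e-4 <= PFDavg < 1e-3, SIL 2 if 1e-3 <= PFDavg < 1e-2.
print("fraction of samples in the SIL 3 band:", np.mean((pfd >= 1e-4) & (pfd < 1e-3)))
```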


2010, Vol. 118-120, pp. 319-326
Author(s): Jin Yu Zhou, Kui Zhou Sun, Xiu Lian Li

As a new tool of statistical analysis, the copula is introduced to build a reliability model for a structural system consisting of identical components, by which the complex features of failure dependence can be described. Focusing on symmetric structural systems, the typical failure-dependence mechanism of components is discussed first. Taking this mechanism into account, modeling steps based on the Gaussian copula and Archimedean copulas are put forward, in which the twin stresses and component strengths are chosen as the basic variables and the safety margins as the analytic variables. Compared with the Gaussian copula, Archimedean copulas are more powerful at describing the failure-dependence mechanism because their adjustable parameters can be determined from the rank correlation coefficient and information about the critical failure point. Archimedean copula-based reliability models are also applicable to non-normal situations. A numerical example shows that the new method is reasonable and feasible. Copula-based reliability models thus offer a new path for the reliability analysis of complex systems with failure dependence.
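A minimal Monte Carlo sketch of the contrast the paper draws: the failure probability of a two-component series system whose safety margins are coupled either by a Gaussian copula or by a Clayton (Archimedean) copula. The normal marginals, correlation, and copula parameter are assumptions, not the paper's numerical example.

```python
# Monte Carlo comparison of Gaussian vs Clayton copula dependence between the
# safety margins of two identical components in a series system.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
N = 200_000
mu, sigma = 2.5, 1.0   # margin mean/std (assumed); P(M < 0) = Phi(-2.5) per component

def gaussian_copula_uniforms(rho, n):
    z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
    return norm.cdf(z)                       # (n, 2) dependent uniforms

def clayton_copula_uniforms(theta, n):
    # Conditional-inversion sampler for the bivariate Clayton copula.
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    v = (u ** (-theta) * (w ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)
    return np.column_stack([u, v])

def series_failure_prob(uniforms):
    margins = norm.ppf(uniforms, loc=mu, scale=sigma)   # map uniforms to margins
    return np.mean((margins < 0).any(axis=1))           # either component fails

print("independent case:", 1 - (1 - norm.cdf(-mu / sigma)) ** 2)
print("Gaussian copula (rho=0.6):", series_failure_prob(gaussian_copula_uniforms(0.6, N)))
print("Clayton copula (theta=2):", series_failure_prob(clayton_copula_uniforms(2.0, N)))
```

Because the Clayton copula concentrates dependence in the lower tail, joint failures become more likely and the series-system failure probability moves closer to the single-component value; this is the kind of failure-dependence effect that copula-based reliability models are meant to capture.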

