particle filter
Recently Published Documents

TOTAL DOCUMENTS: 5044 (five years: 950)
H-INDEX: 68 (five years: 11)

Author(s):  
Nicola Esposito ◽  
Agostino Mele ◽  
Bruno Castanier ◽  
Massimiliano Giorgio

In this paper, a new gamma-based degradation process with random effects is proposed that accounts for measurement error depending, in a stochastic sense, on the measured degradation level. The new model extends a perturbed gamma model recently suggested in the literature by allowing for unit-to-unit variability. Like the original, the extended model is not mathematically tractable. The main features of the proposed model are illustrated, and maximum likelihood estimation of its parameters from perturbed degradation measurements is addressed. The likelihood function is formulated, and a new maximization procedure is suggested that combines a particle filter with an expectation-maximization algorithm to overcome the numerical issues posed by direct maximization. Moreover, a simple algorithm based on the same particle filter is described that computes the cumulative distribution function of the remaining useful life and the conditional probability density function of the hidden degradation level, given the past noisy measurements. Finally, two numerical applications are developed in which the model parameters are estimated from two sets of perturbed degradation measurements, of carbon-film resistors and of fuel cell membranes. In both applications, the presence of the random effect is checked via appropriate statistical procedures: it is statistically significant in the first example but not in the second. In both examples, the influence of accounting for the random effect on the estimated cumulative distribution function of the remaining useful life of the considered units is also discussed. The results demonstrate the practicality of the proposed approach and the usefulness of the proposed model.
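The particle-filter ingredient of the procedure above can be illustrated with a minimal bootstrap filter that estimates the log-likelihood of noisy measurements of a hidden gamma degradation process. This is a hedged sketch under simplified assumptions (stationary Gamma increments and additive Gaussian measurement error with known standard deviation `sigma`), not the authors' exact perturbed-gamma specification; all names are hypothetical.

```python
import numpy as np

def pf_loglik(y, shape, rate, sigma, n_particles=1000, seed=None):
    """Bootstrap particle filter estimate of the log-likelihood of noisy
    observations y of a hidden gamma degradation process.

    Simplified model (illustrative, not the paper's): per unit time the
    hidden level gains a Gamma(shape, rate) increment, and each
    measurement is y_t = x_t + Gaussian(0, sigma) noise.
    """
    rng = np.random.default_rng(seed)
    x = np.zeros(n_particles)  # hidden degradation level per particle
    loglik = 0.0
    for yt in y:
        # propagate: add a gamma-distributed degradation increment
        x = x + rng.gamma(shape, 1.0 / rate, size=n_particles)
        # weight particles by the Gaussian measurement density
        logw = -0.5 * ((yt - x) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))
        m = logw.max()
        w = np.exp(logw - m)
        # accumulate the log of the average unnormalised weight
        loglik += m + np.log(w.mean())
        # multinomial resampling
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        x = x[idx]
    return loglik
```

Embedding a call like this inside an EM loop (weighting particles as posterior samples of the hidden path in the E-step) is one standard way to combine the two algorithms, as the abstract describes.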


2022 ◽  
Author(s):  
Kai Sasaki ◽  
Mayu Muramatsu ◽  
Kenta Hirayama ◽  
Katsuhiro Endo ◽  
Mitsuhiro Murayama

Observation of dynamic processes by transmission electron microscopy (TEM) is an attractive technique for experimentally analyzing materials' nanoscale phenomena and understanding microstructure-property relationships at the nanoscale. Even as the spatial and temporal resolutions of real-time TEM increase significantly, it remains difficult to say that researchers can quantitatively evaluate the dynamic behavior of defects. Images in a TEM video are two-dimensional projections of three-dimensional phenomena, so information is necessarily missing, which makes a uniquely accurate interpretation of the images challenging. Consequently, even when the high-dimensional image data are clustered and compressed to two dimensions, conventional statistical methods may not be powerful enough to track nanoscale behavior while removing the various artifacts associated with the experiment; automated and unbiased processing tools for such big data are becoming mission-critical for discovering unforeseen behavior. We have developed a quantitative image-analysis framework that resolves these problems by uniquely combining machine learning and particle filter estimation. To validate the framework, the dislocation velocity in an Fe-31Mn-3Al-3Si austenitic steel subjected to tensile deformation was measured quantitatively and automatically, and the intermittent motion of the dislocations was analyzed. The framework successfully classifies, identifies, and tracks nanoscale objects; these tasks cannot be implemented accurately by conventional mean-path-based analysis.
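The tracking half of such a pipeline can be sketched in miniature: an ML detector reduces each video frame to a noisy position measurement, and a particle filter smooths the trajectory. The constant-velocity motion model, the noise levels, and the function name below are all assumptions for illustration, not the authors' actual framework.

```python
import numpy as np

def track_positions(meas, q=0.5, r=2.0, n=2000, seed=0):
    """Toy 1-D particle-filter tracker. `meas` holds noisy per-frame
    positions of one object (e.g. a detected dislocation); the filter
    assumes roughly constant velocity with process noise q and
    measurement noise r, and returns posterior-mean position estimates.
    """
    rng = np.random.default_rng(seed)
    pos = np.full(n, float(meas[0])) + rng.normal(0, r, n)  # initial cloud
    vel = rng.normal(0, 1.0, n)                             # unknown velocity
    est = []
    for z in meas:
        vel += rng.normal(0, q, n)        # perturb velocities
        pos += vel + rng.normal(0, q, n)  # propagate positions
        w = np.exp(-0.5 * ((z - pos) / r) ** 2)
        w /= w.sum()
        est.append(float(np.sum(w * pos)))  # posterior-mean estimate
        idx = rng.choice(n, size=n, p=w)    # resample
        pos, vel = pos[idx], vel[idx]
    return est
```

Because the filter carries a velocity state, intermittent (stop-and-go) motion of the kind the abstract mentions shows up directly as changes in the resampled velocity distribution.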


Author(s):  
П.В. Полухин

Filtering algorithms are used to assess the state of dynamic systems in various practical problems, such as voice synthesis, geo-positioning, and monitoring the movement of objects. For complex hierarchical dynamic systems with a large number of time slices, computing the probabilistic characteristics becomes very time-consuming because of the need to generate a large number of samples. The essence of the optimization is to reduce the number of samples generated by the filter, increase their consistency, and speed up the computational operations. The paper offers mathematical tools based on sufficient statistics and sample decomposition, in combination with distributed computing algorithms, that can significantly improve the efficiency of the filtering procedure.
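The decomposition idea can be sketched as follows: because quantities such as a weighted posterior mean depend on the particle set only through a few sufficient statistics, each worker can reduce its chunk of the sample to those statistics and the coordinator combines them. This is a hedged illustration of the general principle, not the paper's specific construction; the function names are hypothetical.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def chunk_stats(particles, weights):
    """Per-chunk sufficient statistics for a weighted mean: the total
    weight and the weighted sum. These two numbers are all a
    coordinator needs from each worker."""
    return weights.sum(), np.dot(weights, particles)

def distributed_mean(particles, weights, n_workers=4):
    """Posterior-mean estimate computed by decomposing the particle
    sample into chunks, reducing each chunk to sufficient statistics in
    parallel, and combining the results."""
    p_chunks = np.array_split(particles, n_workers)
    w_chunks = np.array_split(weights, n_workers)
    with ThreadPoolExecutor(n_workers) as ex:
        stats = list(ex.map(chunk_stats, p_chunks, w_chunks))
    w_tot = sum(s[0] for s in stats)
    return sum(s[1] for s in stats) / w_tot
```

The combined result is exactly equal to the single-machine weighted mean, so the decomposition buys parallelism without changing the estimate.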


Risks ◽  
2021 ◽  
Vol 9 (12) ◽  
pp. 228
Author(s):  
Karol Gellert ◽  
Erik Schlögl

This paper presents the construction of a particle filter that incorporates elements inspired by genetic algorithms in order to achieve accelerated adaptation of the estimated posterior distribution to changes in model parameters. Specifically, the filter is designed for the situation where subsequent data in online sequential filtering do not match the posterior filtered from data up to the current point in time. The examples considered encompass parameter regime shifts and stochastic volatility. The filter adapts to regime shifts extremely rapidly and delivers a clear heuristic for distinguishing between regime shifts and stochastic volatility, even though the model dynamics assumed by the filter exhibit neither feature.
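One simple genetic-algorithm-inspired mechanism is to mutate a fraction of particles after each resampling step, so the cloud can jump to a new parameter regime that plain resampling would never reach. The sketch below tracks a hidden level parameter through a regime shift; the mutation scheme, noise model, and names are illustrative assumptions, not the filter constructed in the paper.

```python
import numpy as np

def adaptive_pf(y, sigma=1.0, n=1000, mutate_frac=0.1, mutate_sd=1.0, seed=0):
    """Particle filter over a hidden level theta with a genetic-style
    mutation step. Observation model (assumed): y_t = theta_t + N(0, sigma).
    After resampling, a fraction of particles receives a random jitter,
    keeping diversity so the posterior can adapt quickly to a regime shift.
    Returns the posterior-mean estimate of theta at each step."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(0, 5, n)  # diffuse initial particle cloud
    means = []
    for z in y:
        w = np.exp(-0.5 * ((z - theta) / sigma) ** 2)
        w /= w.sum()
        means.append(float(np.sum(w * theta)))
        theta = theta[rng.choice(n, size=n, p=w)]  # resample
        # mutation: jitter a random subset of the survivors
        k = int(mutate_frac * n)
        idx = rng.choice(n, size=k, replace=False)
        theta[idx] += rng.normal(0, mutate_sd, k)
    return means
```

Without the mutation step the resampled cloud collapses onto the old regime and can take very long to recover after a shift; with it, the mutated tail is rapidly selected toward the new level.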

