Probabilistic Finite State Automata
Recently Published Documents

Total documents: 25 (five years: 7)
H-index: 7 (five years: 1)

Author(s): Chandrachur Bhattacharya, Susheel Dharmadhikari, Amrita Basak, Asok Ray

Abstract: Fatigue failure occurs ubiquitously in mechanical structures subjected to cyclic loading well below the material's yield stress. The tell-tale sign of fatigue failure is the emergence of cracks at internal or surface defects. In general, a machinery component has a finite fatigue life based on the number of cycles it can sustain before fracture occurs. However, the estimated life is generally conservative, and a large factor of safety is often applied to make the component fail-safe. From the perspective of better utilization, it is nevertheless desirable to use a component as long as possible without risking catastrophic failure. It is therefore useful to have a measure that captures precursors to failure, facilitating active diagnosis of machinery health. In this letter, a precursor detection method is developed based on modifications of probabilistic finite state automata (PFSA). The efficacy of the proposed method is demonstrated on cold-rolled AL7075-T6 notched specimens in a computer-instrumented and computer-controlled fatigue testing apparatus. The proposed method detects the emergence of cracks at approximately 95% accuracy and also captures precursors with good fidelity.
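The abstract does not give the authors' exact formulation, but the general PFSA workflow it refers to can be sketched as follows: discretize a sensor time series into a symbol sequence, estimate a state-transition probability matrix from symbol pairs, and track a divergence between the current matrix and a nominal (healthy) one as an anomaly measure. The partitioning scheme, smoothing, and Frobenius-norm divergence below are illustrative assumptions, not the method of the paper.

```python
import numpy as np

def symbolize(signal, n_symbols=4):
    """Discretize a time series into symbols via equal-frequency partitioning."""
    edges = np.quantile(signal, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(signal, edges)  # integer symbols in 0..n_symbols-1

def transition_matrix(symbols, n_symbols=4):
    """Estimate a depth-1 PFSA state-transition probability matrix."""
    counts = np.ones((n_symbols, n_symbols))  # Laplace smoothing avoids zero rows
    for a, b in zip(symbols[:-1], symbols[1:]):
        counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)  # row-stochastic

def anomaly_measure(P_nominal, P_current):
    """A simple divergence between nominal and current PFSA models."""
    return np.linalg.norm(P_current - P_nominal, ord="fro")
```

In such a scheme, the anomaly measure stays near zero while the structure is healthy and grows as crack initiation alters the statistics of the measured signal; a threshold on it would serve as the precursor detector.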


2021, Vol 63, pp. 102200
Author(s): Zhi Li, Harm Derksen, Jonathan Gryak, Cheng Jiang, Zijun Gao, ...

2020, Vol 516, pp. 388-400
Author(s): Joan Andreu Sánchez, Verónica Romero

2018, Vol 44 (1), pp. 17-37
Author(s): Joan Andreu Sánchez, Martha Alicia Rocha, Verónica Romero, Mauricio Villegas

Probabilistic finite-state automata are a formalism that is widely used in many problems of automatic speech recognition and natural language processing. Probabilistic finite-state automata are closely related to other finite-state models such as weighted finite-state automata, word lattices, and hidden Markov models; therefore, they share many similar properties and problems. Entropy measures of finite-state models have been investigated in the past in order to study the information capacity of these models. The derivational entropy quantifies the uncertainty that the model has about the probability distribution it represents. The derivational entropy of a finite-state automaton is computed from the probability that is accumulated over all of its individual state sequences. The computation of the entropy from a weighted finite-state automaton requires a normalized model. This article studies an efficient computation of the derivational entropy of left-to-right probabilistic finite-state automata, and it introduces an efficient algorithm for normalizing weighted finite-state automata. The efficient computation of the derivational entropy is also extended to continuous hidden Markov models.
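For a left-to-right PFSA (transitions only move to higher-numbered states, so the automaton is acyclic), the derivational entropy admits a simple one-pass computation: it equals the sum over states of the expected number of visits to the state times the entropy of the state's outgoing distribution. The sketch below is a minimal illustration of that standard decomposition, not the specific algorithm of the article; the matrix encoding (transition matrix P plus a per-state stopping probability) is an assumption.

```python
import numpy as np

def derivational_entropy(P, final):
    """
    Derivational entropy (in bits) of a left-to-right PFSA.
    P[i, j] is the probability of moving from state i to state j (j > i only);
    final[i] is the probability of stopping at state i, so each row of P
    together with final[i] sums to 1. The result is
    H = sum_i xi_i * H_i, where xi_i is the expected number of visits to
    state i and H_i is the entropy of state i's outgoing distribution.
    """
    n = len(final)
    xi = np.zeros(n)
    xi[0] = 1.0  # the start state is visited exactly once
    for i in range(n):
        for j in range(i + 1, n):  # forward pass over the acyclic topology
            xi[j] += xi[i] * P[i, j]
    H = 0.0
    for i in range(n):
        probs = np.append(P[i, i + 1:], final[i])  # outgoing moves plus stopping
        probs = probs[probs > 0]
        H += xi[i] * -(probs * np.log2(probs)).sum()
    return H
```

For example, an automaton that stops at the start state with probability 0.5 and otherwise moves to a single absorbing final state has exactly two equiprobable derivations, so its derivational entropy is 1 bit.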

