pdf estimation
Recently Published Documents


TOTAL DOCUMENTS

42
(FIVE YEARS 4)

H-INDEX

6
(FIVE YEARS 0)

2020 ◽  
Vol 12 (1) ◽  
pp. 11
Author(s):  
Francisco Roberto Jaramillo Montoya ◽  
Martín Valderrama ◽  
Vanessa Quintero ◽  
Aramis Pérez ◽  
Marcos Orchard

One of the main challenges in prognostics is the estimation of the probability density function (PDF) of a system's time-of-failure (ToF) before a fault condition is reached. An appropriate characterization of the ToF-PDF informs the user of the remaining useful life of the system or component, allowing catastrophic failures to be prevented through optimal maintenance scheduling. However, ToF-PDF estimation is not an easy task, because it involves both the computation of long-term predictions of a fault indicator and the definition of a hazard zone. In most cases, the trajectory of the fault indicator is assumed to be monotonic, and the hazard zone may be characterized as a deterministic or probabilistic threshold. Monotonic behavior of the fault indicator justifies the assumption that the system fails only once, when the indicator reaches the hazard zone, so the ToF-PDF can be estimated according to mathematical definitions proposed in the state of the art. Nevertheless, not all fault indicators can be considered monotonic, due to their nature as stochastic processes or to regeneration phenomena, and this may lead to errors in ToF-PDF estimation. To overcome this issue, this paper presents an approach for estimating the ToF-PDF using the first-passage-time (FPT) method, which computes the FPT-PDF at the instant when the stochastic process under analysis reaches a specified threshold for the first time. Accordingly, this work analyzes the impact on the estimated ToF-PMF (probability mass function) when particle-filter-based prognostic algorithms are used to perform long-term predictions of the fault indicator and to compute the probability of failure for specific hazard zones (characterized either by a deterministic value or by a failure likelihood function).
A hypothetical self-regenerative degradation process is used as a case study to evaluate the performance of the proposed methods.
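The first-passage-time idea described above can be sketched with a minimal Monte Carlo simulation. This is not the paper's particle-filter algorithm; it is an illustrative sketch in which the fault indicator is a hypothetical random walk with drift (hence non-monotonic), the hazard zone is a deterministic threshold, and all parameter values are assumptions chosen for the example. Each simulated trajectory contributes only the first instant at which it crosses the threshold, which is exactly the FPT restriction.

```python
import numpy as np

rng = np.random.default_rng(0)

def first_passage_tof_pmf(n_particles=5000, horizon=200,
                          drift=0.05, noise=0.2, threshold=5.0):
    """Monte Carlo estimate of the ToF probability mass function:
    for each simulated trajectory of a (non-monotonic) fault
    indicator, record the FIRST step at which it crosses the
    hazard threshold, and ignore any later re-crossings."""
    x = np.zeros(n_particles)           # fault indicator per trajectory
    tof = np.full(n_particles, -1)      # -1 = no failure within horizon
    for t in range(1, horizon + 1):
        # hypothetical degradation model: random walk with drift
        x += drift + noise * rng.standard_normal(n_particles)
        newly_failed = (tof < 0) & (x >= threshold)
        tof[newly_failed] = t           # first passage only
    # empirical PMF over failure times 0..horizon
    pmf = np.bincount(tof[tof > 0], minlength=horizon + 1) / n_particles
    return pmf

pmf = first_passage_tof_pmf()
# pmf[t] approximates P(ToF = t); the total mass can be below 1 when
# some trajectories never reach the hazard zone within the horizon
```

Because later threshold crossings of the same trajectory are discarded, a regenerating indicator that dips back below the threshold does not inflate the failure probability, which is the error mode the FPT method avoids.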


2020 ◽  
Vol 9 (10) ◽  
pp. 8675-8683
Author(s):  
R. Kumar ◽  
C. M. Velu ◽  
C. Karthikeyan ◽  
S. Sivakumar ◽  
S. Nimmagadda ◽  
...  

Author(s):  
S J Schmidt ◽  
A I Malz ◽  
J Y H Soo ◽  
I A Almosallam ◽  
M Brescia ◽  
...  

Many scientific investigations of photometric galaxy surveys require redshift estimates, whose uncertainty properties are best encapsulated by photometric redshift (photo-z) posterior probability density functions (PDFs). A plethora of photo-z PDF estimation methodologies abound, producing discrepant results with no consensus on a preferred approach. We present the results of a comprehensive experiment comparing twelve photo-z algorithms applied to mock data produced for the Rubin Observatory Legacy Survey of Space and Time (LSST) Dark Energy Science Collaboration (DESC). By supplying perfect prior information, in the form of the complete template library and a representative training set, as inputs to each code, we demonstrate the impact of the assumptions underlying each technique on the output photo-z PDFs. In the absence of a notion of true, unbiased photo-z PDFs, we evaluate and interpret multiple metrics of the ensemble properties of the derived photo-z PDFs, as well as traditional reductions to photo-z point estimates. We report systematic biases and overall over/under-breadth in the photo-z PDFs of many popular codes, which may indicate avenues for improvement in the algorithms or implementations. Furthermore, we draw attention to the limitations of established metrics for assessing photo-z PDF accuracy; though we identify the conditional density estimate (CDE) loss as a promising metric of photo-z PDF performance in the case where true redshifts are available but true photo-z PDFs are not, we emphasize the need for science-specific performance metrics.
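The CDE loss singled out in the abstract has a simple empirical form: up to a constant that does not depend on the estimator, it is the average of the integrated squared PDF minus twice the PDF evaluated at the true redshift. The sketch below is a minimal numpy-only implementation for PDFs tabulated on a shared uniform redshift grid; it is not the DESC codebase, and the grid, Gaussian test PDFs, and parameter values are illustrative assumptions.

```python
import numpy as np

def cde_loss(pdf_grid, z_grid, z_true):
    """Empirical CDE loss for an ensemble of photo-z PDFs:
      L = mean_i [ integral p_i(z)^2 dz  -  2 * p_i(z_true_i) ]
    Lower is better; it rewards the full PDF shape, not just a
    point estimate such as the mode or mean."""
    pdf_grid = np.asarray(pdf_grid, dtype=float)   # shape (n_gal, n_grid)
    z_grid = np.asarray(z_grid)
    dz = z_grid[1] - z_grid[0]                     # assumes a uniform grid
    term1 = (pdf_grid ** 2).sum(axis=1) * dz       # integral of p^2
    idx = np.clip(np.searchsorted(z_grid, z_true), 0, len(z_grid) - 1)
    term2 = pdf_grid[np.arange(len(z_true)), idx]  # p at the true redshift
    return float(np.mean(term1 - 2.0 * term2))

# tiny sanity demo with hypothetical Gaussian photo-z PDFs
z_grid = np.linspace(0.0, 3.0, 301)
dz = z_grid[1] - z_grid[0]
def _gauss(mu, s=0.1):
    p = np.exp(-0.5 * ((z_grid - mu) / s) ** 2)
    return p / (p.sum() * dz)                      # normalize on the grid
z_true = np.array([1.0, 1.5])
loss_matched = cde_loss(np.stack([_gauss(1.0), _gauss(1.5)]), z_grid, z_true)
loss_shifted = cde_loss(np.stack([_gauss(1.5), _gauss(1.0)]), z_grid, z_true)
# PDFs centered on the true redshifts score a lower (better) loss
```

A useful property, noted in the abstract, is that this loss needs only true redshifts, not true PDFs: the intractable constant term of the exact L2 distance between estimated and true conditional densities cancels out of any comparison between estimators.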


Author(s):  
E. G. Parmehr ◽  
C. S. Fraser ◽  
C. Zhang ◽  
J. Leach

Accurate co-registration of multi-sensor data is a primary step in data integration for photogrammetric and remote sensing applications. A proven intensity-based registration approach is Mutual Information (MI). However, the effectiveness of MI for automated registration of multi-sensor remote sensing data can be impacted, to the point of failure, by its non-monotonic convergence surface. Since MI-based methods rely on joint probability density functions (PDFs) for the datasets, errors in PDF estimation directly affect the MI value. Certain PDF estimation parameters, such as the bin size of the joint histogram and the smoothing kernel, need to be assigned in advance, since they play a key role in forming the convergence surface. The lack of a general approach to assigning these parameter values for various data types reduces both the automation level and the robustness of registration. This paper proposes a new approach for selecting optimal parameter values for PDF estimation in MI-based registration of optical imagery to LiDAR point clouds. The proposed method determines the best parameters for PDF estimation via an analysis of the relationship between the similarity measure values of the data and the adopted geometric transformation, so as to achieve optimal registration reliability. The performance of the proposed parameter selection method is experimentally evaluated, and the results are compared with those achieved by a feature-based registration method.
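The two PDF-estimation parameters the abstract highlights, joint-histogram bin count and smoothing kernel width, appear explicitly in a histogram-based MI estimator. The sketch below is a minimal numpy-only version for two co-registered intensity images; it is not the authors' method (which concerns how to choose `bins` and `sigma`), and both default values are illustrative assumptions.

```python
import numpy as np

def _gaussian_smooth(hist, sigma):
    """Separable Gaussian (Parzen-style) smoothing of a 2-D histogram."""
    r = max(1, int(3 * sigma))
    k = np.exp(-0.5 * (np.arange(-r, r + 1) / sigma) ** 2)
    k /= k.sum()
    hist = np.apply_along_axis(np.convolve, 0, hist, k, mode='same')
    hist = np.apply_along_axis(np.convolve, 1, hist, k, mode='same')
    return hist

def mutual_information(img_a, img_b, bins=64, sigma=1.0):
    """MI between two images from a smoothed joint histogram.
    `bins` and `sigma` are exactly the PDF-estimation parameters
    whose choice shapes the MI convergence surface."""
    hist, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    hist = _gaussian_smooth(hist, sigma)
    pxy = hist / hist.sum()                  # joint PDF estimate
    px = pxy.sum(axis=1, keepdims=True)      # marginal of image A
    py = pxy.sum(axis=0, keepdims=True)      # marginal of image B
    nz = pxy > 1e-12                         # avoid log(0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px * py)[nz])))
```

In a registration loop, one geometric transformation of the moving image is scored per candidate alignment; too-fine bins or too little smoothing makes this score noisy and multi-modal over the transformation parameters, which is the non-monotonic convergence surface the abstract warns about.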


2014 ◽  
Vol 138 ◽  
pp. 248-259 ◽  
Author(s):  
Ming Gao ◽  
Xia Hong ◽  
Sheng Chen ◽  
Chris J. Harris ◽  
Emad Khalaf