Optimal Censoring
Recently Published Documents


TOTAL DOCUMENTS: 15 (FIVE YEARS: 1)

H-INDEX: 4 (FIVE YEARS: 0)

2021
Vol. ahead-of-print (ahead-of-print)
Author(s): Soumya Roy, Biswabrata Pradhan, Annesha Purakayastha

Purpose
This article considers the Inverse Gaussian distribution as the basic lifetime model for the test units. The unknown model parameters are estimated by the method of moments, the method of maximum likelihood and Bayesian methods. As part of the maximum likelihood analysis, the article employs an expectation-maximization (EM) algorithm to simplify numerical computation. Bayesian estimates are then obtained using the Metropolis-Hastings algorithm. The article also presents the design of optimal censoring schemes under a criterion based on the precision of a particular system-lifetime quantile, with the optimal schemes obtained subject to budget constraints.

Design/methodology/approach
This article first presents classical and Bayesian statistical inference for Progressive Type-I Interval censored data, and then considers the design of optimal Progressive Type-I Interval censoring schemes incorporating budget constraints.

Findings
A real dataset is analyzed to demonstrate the methods developed in this article, and the adequacy of the lifetime model is checked with a simulation-based goodness-of-fit test. A detailed simulation experiment shows that the maximum likelihood estimator outperforms the method-of-moments estimator, that the posterior median fares best among the Bayesian estimators even in the absence of subjective prior information, and that budget constraints have real implications for the optimal design of censoring schemes.

Originality/value
The proposed methodology may be used to analyze Progressive Type-I Interval censored data under any lifetime model. The approach to obtaining optimal censoring schemes may be particularly useful to reliability engineers in real-life applications.
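As a rough illustration of the interval-censored likelihood underlying this kind of analysis (a minimal sketch, not the authors' implementation), the fragment below fits an Inverse Gaussian model to binned failure counts by direct numerical maximization. It assumes a single common inspection schedule and no progressive removals, and uses SciPy's parametrization, in which `invgauss(mu/lam, scale=lam)` corresponds to IG with mean `mu` and shape `lam`; all variable names are illustrative.

```python
import numpy as np
from scipy.stats import invgauss
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# True parameters of IG(mean mu, shape lam), used only to simulate data
mu_true, lam_true = 2.0, 4.0
n = 500

# Simulate lifetimes and record only the inspection interval each falls in
t = invgauss.rvs(mu_true / lam_true, scale=lam_true, size=n, random_state=rng)
edges = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 3.0, np.inf])
counts, _ = np.histogram(t, bins=edges)

def neg_loglik(theta):
    # Work on the log scale so mu, lam stay positive during the search
    mu, lam = np.exp(theta)
    cdf = invgauss.cdf(edges, mu / lam, scale=lam)
    p = np.diff(cdf)  # probability of failing in each inspection interval
    return -np.sum(counts * np.log(np.clip(p, 1e-300, None)))

res = minimize(neg_loglik, x0=np.log([1.0, 1.0]), method="Nelder-Mead")
mu_hat, lam_hat = np.exp(res.x)
```

In the article's setting one would additionally account for the units progressively removed at each inspection time; the EM algorithm the authors employ replaces this direct numerical maximization.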



Sankhya A
2020
Author(s): Samir K. Ashour, Ahmed A. El-Sheikh, Ahmed Elshahhat


2019
Author(s): John C. Williams, Philip N. Tubiolo, Jacob R. Luceno, Jared X. Van Snellenberg

Abstract
Multiband-accelerated fMRI provides dramatically improved temporal and spatial resolution for resting state functional connectivity (RSFC) studies of the human brain, but poses unique challenges for denoising of subject-motion-induced data artifacts, a major confound in RSFC research. We comprehensively evaluated existing and novel approaches to volume censoring-based motion denoising in the Human Connectome Project dataset. We show that assumptions underlying common metrics for evaluating motion denoising pipelines, especially those based on quality control-functional connectivity (QC-FC) correlations and differences between high- and low-motion participants, are problematic, making these criteria inappropriate for quantifying pipeline performance. We further develop two new quantitative metrics that are free from these issues and demonstrate their use as benchmarks for comparing volume censoring methods. Finally, we develop rigorous, quantitative methods for determining optimal censoring thresholds and provide straightforward recommendations and code for all investigators to apply this optimized approach to their own RSFC datasets.
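The core operation of volume censoring can be sketched in a few lines: flag each fMRI volume whose motion estimate (e.g. framewise displacement) exceeds a threshold, and drop those volumes before computing connectivity. This is a toy sketch on synthetic data, not the authors' pipeline or their optimized thresholds; the function name `censor_volumes` and the 0.2 mm cutoff are illustrative assumptions.

```python
import numpy as np

def censor_volumes(ts, fd, threshold=0.2):
    """Keep only volumes whose framewise displacement (mm) is at or
    below the censoring threshold; return the cleaned series and mask."""
    keep = fd <= threshold
    return ts[keep], keep

# Toy data: 200 volumes from two regions, plus a synthetic motion trace
rng = np.random.default_rng(0)
ts = rng.standard_normal((200, 2))
fd = rng.exponential(scale=0.1, size=200)

clean, keep = censor_volumes(ts, fd, threshold=0.2)
r = np.corrcoef(clean.T)[0, 1]  # RSFC estimate from surviving volumes only
```

The substantive contribution of the article lies in how the threshold is chosen, which this sketch does not reproduce.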



Author(s): Ehab Mohamed Almetwally, Hisham Mohamed Almongy, Amaal El sayed Mubarak

In this paper, we consider estimation of the parameters of the Weibull Generalized Exponential Distribution (WGED) under progressive censoring schemes. To obtain the optimal censoring scheme for the WGED, more than one method of estimation is used, so that the best scheme can be identified together with the best method of estimation. The maximum likelihood method and Bayesian estimation under the squared-error and LINEX loss functions are employed. Monte Carlo simulation is used to compare the two methods of estimation under the censoring schemes. To show how the schemes work in practice, we analyze strength data for single carbon fibers as a real-data example.
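The Monte Carlo comparison described above can be sketched as follows. For tractability this sketch simplifies the setting to an ordinary Weibull distribution under Type-II censoring (observe only the r smallest of n lifetimes) rather than the WGED with progressive censoring, and compares only the maximum likelihood estimator's bias and MSE; the function name `mc_mse` and all settings are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
shape_true, scale_true = 1.5, 2.0
n, r = 50, 40  # observe the 40 smallest of 50 lifetimes (Type-II censoring)

def mc_mse(reps=100):
    """Monte Carlo bias and MSE of the censored-data Weibull MLE."""
    ests = []
    for _ in range(reps):
        t = np.sort(scale_true * rng.weibull(shape_true, size=n))
        obs, t_r = t[:r], t[r - 1]

        def nll(theta):
            # Censored log-likelihood: density for observed failures,
            # survival at t_(r) for the n - r censored units
            k, s = np.exp(theta)
            logf = np.log(k / s) + (k - 1) * np.log(obs / s) - (obs / s) ** k
            logS = -(t_r / s) ** k
            return -(logf.sum() + (n - r) * logS)

        res = minimize(nll, np.log([1.0, 1.0]), method="Nelder-Mead")
        ests.append(np.exp(res.x))

    ests = np.asarray(ests)
    bias = ests.mean(axis=0) - [shape_true, scale_true]
    mse = ((ests - [shape_true, scale_true]) ** 2).mean(axis=0)
    return bias, mse

bias, mse = mc_mse()
```

The paper's actual study repeats this kind of loop for the WGED likelihood, for each candidate progressive censoring scheme, and for the Bayesian estimators as well.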



2017
Vol. 60 (4), pp. 1349-1367
Author(s): Uoseph Hamdi Salemi, Sadegh Rezaei, Saralees Nadarajah


2016
Vol. 48 (A), pp. 119-144
Author(s): Miles B. Gietzmann, Adam J. Ostaszewski

Abstract
Following the approach of standard filtering theory, we analyse investor valuation of firms, when these are modelled as geometric Brownian state processes that are privately and partially observed, at random (Poisson) times, by agents. Tasked with disclosing forecast values, agents are able purposefully to withhold their observations; explicit filtering formulae are derived for downgrading the valuations in the absence of disclosures. The analysis is conducted for both a solitary firm and m co-dependent firms.
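As a point of reference for the filtering problem above, the sketch below shows only the naive no-withholding baseline: for a geometric Brownian state process, the conditional expectation given the last observation is E[X_t | X_s = x] = x·exp(mu·(t − s)), verified here by Monte Carlo. The paper's contribution, the downgraded valuation when agents may strategically withhold observations, differs from this baseline and is not reproduced; the function name and parameter values are illustrative assumptions.

```python
import numpy as np

mu = 0.05  # drift of the firm-value process (assumed for illustration)

def undisclosed_value(last_obs, last_time, t):
    """Naive forecast ignoring strategic withholding: for geometric
    Brownian motion, E[X_t | X_s = x] = x * exp(mu * (t - s))."""
    return last_obs * np.exp(mu * (t - last_time))

# Monte Carlo check of the closed form at t = 1 from x0 = 100
rng = np.random.default_rng(42)
sigma, x0, t = 0.2, 100.0, 1.0
paths = x0 * np.exp((mu - 0.5 * sigma**2) * t
                    + sigma * np.sqrt(t) * rng.standard_normal(100_000))
mc_mean = paths.mean()
```

Under strategic withholding, the absence of a disclosure is itself informative (it may conceal a bad observation), which is why the paper's filtering formulae downgrade the valuation below this naive conditional expectation.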


