hazard rate
Recently Published Documents


TOTAL DOCUMENTS: 1011 (FIVE YEARS: 224)
H-INDEX: 47 (FIVE YEARS: 3)

2022
Author(s): Fei Huang, Ross Maller, Brandon Milholland, Xu Ning

Close analysis of an extensive data set, combined with independent evidence, prompts our proposal to view human lifetimes as individually finite but collectively unbounded. We formulate a model incorporating this idea whose predictions agree very well with the observed data. In the model, human lifetimes are theoretically unbounded, but the probability of an individual living to an extreme age is negligible, so lifetimes are effectively limited. Our model incorporates a mortality hazard rate plateau and a late-life mortality deceleration effect in conjunction with a newly observed advanced-age mortality acceleration, reconciling many previously observed effects. The model is temporally stable: consistent with observation, its parameters do not change over time. As an application, assuming no major medical advances, we predict the emergence of many individuals living past 120 but, due to accelerating mortality, find it unlikely that any will subsequently survive to an age of 125.
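As a rough numerical illustration of the shape of hazard curve this abstract describes, the sketch below combines Gompertz growth, a plateau, and a late-life acceleration, and integrates the hazard to get survival probabilities. The functional form and every parameter value are assumptions chosen for illustration, not the authors' fitted model.

```python
import numpy as np

# Illustrative hazard: Gompertz growth capped by a plateau (mortality
# deceleration), plus a linear advanced-age acceleration term.
def hazard(age, a=3e-5, b=0.11, plateau=0.7, accel_age=110.0, c=0.08):
    gompertz = a * np.exp(b * age)
    h = np.minimum(gompertz, plateau)                # plateau / deceleration
    return h + c * np.maximum(age - accel_age, 0.0)  # late acceleration

def survival(age, step=0.01):
    ts = np.arange(0.0, age, step)
    return np.exp(-np.sum(hazard(ts)) * step)        # S(t) = exp(-integral of h)

for t in (105, 115, 120, 125):
    print(t, survival(t))  # probabilities fall off sharply past the plateau
```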


Author(s): Chun Su, Kui Huang, Zejun Wen

To improve the probability that an engineering system successfully completes its next mission, it is crucial to implement timely maintenance activities, especially when maintenance time or maintenance resources are limited. Taking a series-parallel system as the object of study, this paper develops a multi-objective imperfect selective maintenance optimization model. In the model, potential maintenance actions, ranging from minimal repair to replacement, are implemented on the components during scheduled breaks. Since the level of a maintenance action is closely related to its cost, an age reduction coefficient and a hazard rate adjustment coefficient are taken into account. Moreover, an improved hybrid hazard rate approach is adopted to describe the reliability improvement of the components, and the mission duration is treated as a random variable. On this basis, a nonlinear stochastic optimization model is established with dual objectives: minimizing the total maintenance cost while maximizing the system reliability. The fast elitist non-dominated sorting genetic algorithm (NSGA-II) is adopted to solve the model. Numerical experiments verify the effectiveness of the proposed approach. The results indicate that the proposed model obtains better schedules for the maintenance resources and more flexible maintenance plans.
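The age reduction and hazard rate adjustment coefficients can be illustrated with a minimal sketch of the hybrid hazard rate idea: imperfect maintenance rewinds part of a component's effective age while also inflating its hazard curve. The Weibull baseline and all coefficient values below are assumptions for illustration, not the paper's model.

```python
import numpy as np

# Weibull baseline hazard; beta and eta are illustrative.
def weibull_hazard(t, beta=2.5, eta=1000.0):
    return (beta / eta) * (t / eta) ** (beta - 1)

# Hybrid hazard rate sketch: a maintenance action at component age
# `age_at_pm` removes only a fraction of the accumulated age (age
# reduction coefficient a) and scales the hazard upward (hazard rate
# adjustment coefficient b). Hypothetical coefficients.
def post_maintenance_hazard(t_since_pm, age_at_pm, a=0.3, b=1.05):
    effective_age = a * age_at_pm + t_since_pm
    return b * weibull_hazard(effective_age)

# hazard 100 h after imperfect maintenance performed at age 800 h,
# versus the hazard had no maintenance been done
print(post_maintenance_hazard(100.0, age_at_pm=800.0))
print(weibull_hazard(900.0))
```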


Author(s): Showkat Ahmad Dar, Anwar Hassan, Peer Bilal Ahmad

In this paper, a new model for count data is introduced by compounding the Poisson distribution with the size-biased three-parameter Lindley distribution. Statistical properties, such as reliability, hazard rate, reverse hazard rate, Mills ratio, moments, skewness, kurtosis, the moment generating function, the probability generating function, and order statistics, are discussed. Moreover, the collective risk model is discussed, with the proposed distribution as the primary distribution and the exponential and Erlang distributions as the secondary ones. Parameter estimation is done using maximum likelihood estimation (MLE). Finally, a real dataset is analyzed to demonstrate the suitability and applicability of the proposed distribution for modeling count data.
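A minimal simulation sketch of the compounding construction follows. For simplicity it uses the one-parameter Lindley distribution via its well-known gamma-mixture representation rather than the paper's size-biased three-parameter version; the compounding mechanism (a Poisson rate drawn from the mixing distribution) is the same.

```python
import numpy as np

rng = np.random.default_rng(0)

# Lindley(theta) is a mixture of Gamma(1, theta) and Gamma(2, theta)
# with weights theta/(1+theta) and 1/(1+theta).
def sample_lindley(theta, size):
    w = theta / (1.0 + theta)                        # P(shape = 1)
    shape = np.where(rng.random(size) < w, 1.0, 2.0)
    return rng.gamma(shape, 1.0 / theta, size)

# Compound: draw a random Poisson rate, then a Poisson count.
def sample_poisson_lindley(theta, size):
    lam = sample_lindley(theta, size)
    return rng.poisson(lam)

x = sample_poisson_lindley(theta=1.5, size=100_000)
print(x.mean(), x.var())   # overdispersed: variance exceeds the mean
```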


Author(s): Umme Habibah Rahman, Tanusree Deb Roy

In this paper, a new kind of distribution is suggested based on the concept of exponentiation. A reliability analysis covering the survival function, hazard rate function, reverse hazard rate function, and Mills ratio is presented, along with the quantile function and order statistics. The parameters of the distribution are estimated by the method of maximum likelihood, and the Fisher information matrix and confidence intervals are also given. As an application, 30 years of temperature data for Silchar city, Assam, India, are analyzed. The goodness of fit of the proposed distribution is compared with that of the Fréchet distribution; for all 12 months, the proposed distribution fits better than the Fréchet distribution.
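A small sketch of the exponentiation idea: raising a baseline CDF F to a power alpha gives G = F**alpha, from which the survival, hazard, reverse hazard, and Mills ratio quantities all follow. The Fréchet baseline and the parameter values are assumptions, chosen only because the abstract compares against the Fréchet distribution.

```python
import numpy as np

def frechet_cdf(x, shape=2.0, scale=1.0):
    return np.exp(-(scale / x) ** shape)

def frechet_pdf(x, shape=2.0, scale=1.0):
    return (shape / scale) * (scale / x) ** (shape + 1) * frechet_cdf(x, shape, scale)

# Reliability quantities for the exponentiated family G = F**alpha.
def exponentiated_quantities(x, alpha=1.8):
    F, f = frechet_cdf(x), frechet_pdf(x)
    G = F ** alpha                     # exponentiated CDF
    g = alpha * F ** (alpha - 1) * f   # density
    S = 1.0 - G                        # survival function
    return {"hazard": g / S, "reverse_hazard": g / G, "mills_ratio": S / g}

print(exponentiated_quantities(np.array([0.5, 1.0, 2.0])))
```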


Author(s): Nelson Doe Dzivor, Henry Otoo, Eric Neebo Wiah

The quest to improve the flexibility of probability distributions motivated this research. A four-parameter generalization of the Janardan distribution, known as the Kumaraswamy-Janardan distribution, is proposed through parameterization and studied. The probability density function, cumulative distribution function, survival function, and hazard rate function of the distribution are established. Statistical properties such as the moments, the moment generating function, and maximum likelihood estimation of the model are discussed. The parameters are estimated using the simulated annealing optimization algorithm. The flexibility of the model, in comparison with the baseline model and other competing sub-models, is verified using the Akaike Information Criterion (AIC). The model is tested on real data and proves more flexible in fitting them than any of the sub-models considered.
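The estimation recipe described here (simulated annealing on a negative log-likelihood, then model comparison via AIC) can be sketched as follows. Since the abstract does not give the Kumaraswamy-Janardan density, a Weibull likelihood stands in as a placeholder; only the workflow is the point.

```python
import numpy as np
from scipy.optimize import dual_annealing
from scipy.stats import weibull_min

# Synthetic data in place of the paper's real dataset.
data = weibull_min.rvs(1.7, scale=2.0, size=200, random_state=0)

# Negative log-likelihood to be minimized by simulated annealing.
def neg_log_lik(params):
    shape, scale = params
    return -np.sum(weibull_min.logpdf(data, shape, scale=scale))

res = dual_annealing(neg_log_lik, bounds=[(0.1, 10.0), (0.1, 10.0)], seed=0)

k = len(res.x)                  # number of fitted parameters
aic = 2 * k + 2 * res.fun       # AIC = 2k - 2 log L; res.fun = -log L
print(res.x, aic)               # compare AICs across candidate sub-models
```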


Entropy, 2021, Vol. 23 (12), pp. 1662
Author(s): Ahmed Sayed M. Metwally, Amal S. Hassan, Ehab M. Almetwally, B. M. Golam Kibria, Hisham M. Almongy

The inverted Topp–Leone distribution is a new, appealing model for reliability analysis. In this paper, a new distribution, named the new exponential inverted Topp–Leone (NEITL) distribution, is presented; it adds an extra shape parameter to the inverted Topp–Leone distribution. Graphical representations of its density, survival, and hazard rate functions are provided. The following properties are explored: quantile function, mixture representation, entropies, moments, and stress–strength reliability. We plot the skewness and kurtosis measures of the proposed model based on the quantiles. Three different estimation procedures are suggested to estimate the distribution parameters, reliability, and hazard rate functions, along with their confidence intervals. Additionally, stress–strength reliability estimators for the NEITL model are obtained. To illustrate the findings of the paper, two real datasets from the engineering and medical fields are analyzed.
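The stress–strength reliability R = P(X > Y) mentioned above can be approximated by plain Monte Carlo. The sketch below samples ordinary inverted Topp–Leone variables by inverse transform, with illustrative shape parameters, rather than the NEITL model itself, whose extra parameter is not specified in the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)

# Topp-Leone CDF on (0,1) is F(t) = (t(2-t))**nu, so inverting gives
# t = 1 - sqrt(1 - u**(1/nu)); the inverted variable is then 1/t.
def sample_inverted_topp_leone(nu, size):
    u = rng.random(size)
    t = 1.0 - np.sqrt(1.0 - u ** (1.0 / nu))
    return 1.0 / t

x = sample_inverted_topp_leone(nu=2.0, size=200_000)   # strength
y = sample_inverted_topp_leone(nu=0.8, size=200_000)   # stress
print((x > y).mean())                                  # Monte Carlo estimate of R
```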


Mathematics, 2021, Vol. 9 (23), pp. 3113
Author(s): Muhammed Rasheed Irshad, Christophe Chesneau, Soman Latha Nitin, Damodaran Santhamani Shibu, Radhakumari Maya

Many studies have underlined the importance of the log-normal distribution in the modeling of phenomena occurring in biology. With this in mind, in this article we offer a new and motivated transformed version of the log-normal distribution, primarily for use with biological data. The hazard rate function, quantile function, and several other significant aspects of the new distribution are investigated. In particular, we show that the hazard rate function can have increasing, decreasing, bathtub, and upside-down bathtub shapes. Both maximum likelihood and Bayesian techniques are used to estimate the unknown parameters. Based on the proposed distribution, we also present a parametric regression model and a Bayesian regression approach. To assess long-run performance, simulation studies of both the maximum likelihood and Bayesian estimation procedures are conducted. Two real datasets are used to demonstrate the applicability of the new distribution. The significance of the third parameter in the new model is tested using the likelihood ratio test. Furthermore, the parametric bootstrap approach is used to determine the effectiveness of the suggested model for the datasets.
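The likelihood ratio test for the third parameter follows the usual nested-model recipe: fit the full and reduced models, then refer twice the log-likelihood gap to a chi-squared distribution with one degree of freedom. In the sketch below, scipy's three-parameter log-normal (shape, location, scale) stands in for the paper's transformed log-normal, with the location pinned to zero as the reduced model.

```python
import numpy as np
from scipy import stats

# Synthetic positive data in place of the paper's biological datasets.
data = stats.lognorm.rvs(0.6, loc=0.5, scale=2.0, size=300, random_state=0)

full = stats.lognorm.fit(data)               # shape, loc, scale all free
reduced = stats.lognorm.fit(data, floc=0.0)  # loc fixed: the nested model

ll_full = np.sum(stats.lognorm.logpdf(data, *full))
ll_reduced = np.sum(stats.lognorm.logpdf(data, *reduced))

lr = 2.0 * (ll_full - ll_reduced)            # likelihood ratio statistic
print(lr, stats.chi2.sf(lr, df=1))           # p-value for the extra parameter
```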


2021, Vol. 2 (4)
Author(s): R. Mukherjee, N. Muehlemann, A. Bhingare, G. W. Stone, C. Mehta

Abstract
Background: Cardiovascular trials increasingly require large sample sizes and long follow-up periods. Several approaches have been developed to optimize sample size, such as adaptive group sequential trials, sample size re-estimation based on the promising zone, and the win ratio. Traditionally, the log-rank test or the Cox proportional hazards model is used to test for treatment effects, based on a constant hazard rate and proportional hazards alternatives, which, however, may not always hold. Large sample sizes and/or long follow-up periods are especially challenging for trials evaluating the efficacy of acute care interventions.
Purpose: We propose an adaptive design wherein, using interim data, Bayesian computation of predictive power guides the increase in sample size and/or the minimum follow-up duration. These computations do not depend on the constant hazard rate and proportional hazards assumptions, thus yielding more robust interim decision making for the future course of the trial.
Methods: PROTECT IV is designed to evaluate mechanical circulatory support with the Impella CP device vs. standard of care during high-risk PCI. The primary endpoint is a composite of all-cause death, stroke, MI, or hospitalization for cardiovascular causes, with an initial minimum follow-up of 12 months and initial enrolment of 1252 patients, with expected recruitment in 24 months. The study will employ an adaptive increase in sample size and/or minimum follow-up at the interim analysis, when approximately 80% of patients have been enrolled. The adaptations utilize extensive simulations to choose a new sample size of up to 2500 and a new minimum follow-up time of up to 36 months that provide a Bayesian predictive power of 85%. Bayesian calculations are based on patient-level information rather than summary statistics, enabling more reliable interim decisions. Constant or proportional hazard assumptions are not required for this approach, because two separate piecewise constant hazard models with Gamma priors are fitted to the interim data; Bayesian predictive power is then calculated using Monte Carlo methodology. Via extensive simulations, we have examined the utility of the proposed design for situations with time-varying hazards and non-proportional hazard ratios, such as a delayed treatment effect (Figure 1) and crossing survival curves. The heat map of Bayesian predictive power obtained when the interim Kaplan-Meier curves reflected a delayed response shows that, for this scenario, an optimal combination of increased sample size and increased follow-up time would be needed to attain 85% predictive power.
Conclusion: The proposed adaptive design, with sample size and minimum follow-up adaptation based on Bayesian predictive power at interim looks, allows de-risking the trial against uncertainties regarding effect size, control arm outcome rate, hazard ratio, and recruitment rate.
Funding Acknowledgement: Type of funding sources: Private company. Main funding source(s): Abiomed, Inc.
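A sketch of the conjugate machinery behind such a design: with a piecewise constant hazard and Gamma(a, b) priors, the posterior for the hazard in interval j is Gamma(a + d_j, b + E_j), where d_j is the event count and E_j the person-time observed in that interval. The interim counts below are invented for illustration; the actual design would feed such posterior draws into Monte Carlo simulation of the remaining follow-up to obtain predictive power.

```python
import numpy as np

rng = np.random.default_rng(2)

a, b = 0.1, 0.1                                    # weak Gamma prior
deaths   = {"control": [14, 9], "treated": [8, 6]}            # events per interval
exposure = {"control": [400.0, 250.0], "treated": [410.0, 265.0]}  # person-time

# Posterior draws of the hazard in each interval, for each arm:
# Gamma(a + events, rate b + exposure), i.e. scale 1/(b + exposure).
draws = {
    arm: np.column_stack([
        rng.gamma(a + d, 1.0 / (b + e), size=10_000)
        for d, e in zip(deaths[arm], exposure[arm])
    ])
    for arm in deaths
}

# Interval-wise posterior hazard ratios; no proportionality is imposed
# across intervals, so time-varying effects are captured directly.
hr = draws["treated"] / draws["control"]
print(np.median(hr, axis=0), (hr < 1.0).mean(axis=0))
```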

