Estimating Housing Mortality with Standard Loss Curves

1986 ◽  
Vol 18 (11) ◽  
pp. 1521-1530 ◽  
Author(s):  
M E Gleeson

Tests of fit using one set of data on mobile homes and another on conventional housing indicate that standard loss curves, such as the Pearl-Reed and Weibull curves, can be used to approximate housing survivorship functions. This finding opens up the possibility of analytical work using standard curves and the application of time-to-failure statistical models that are based on such curves. Tests of fit of standard curves to the two housing survivorship functions using truncated data are also encouraging, suggesting means of estimating housing mortality and computing life tables with incomplete cohort survival data.
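
To make the curve-fitting idea concrete, here is a minimal Python sketch of fitting a two-parameter Weibull survivorship curve S(t) = exp(-(t/eta)^beta) by nonlinear least squares; the survivorship proportions and starting values below are hypothetical placeholders, not the paper's data.

import numpy as np
from scipy.optimize import curve_fit

# Two-parameter Weibull survivorship function:
# S(t) = exp(-(t/eta)**beta), eta = scale (years), beta = shape.
def weibull_surv(t, eta, beta):
    return np.exp(-(t / eta) ** beta)

# Hypothetical cohort survivorship data (age in years, fraction surviving);
# these numbers are illustrative only, not from the paper.
age  = np.array([0, 5, 10, 15, 20, 25, 30, 40, 50])
surv = np.array([1.00, 0.97, 0.92, 0.84, 0.73, 0.60, 0.46, 0.22, 0.08])

# Fit by nonlinear least squares, with rough starting values and
# positivity bounds on both parameters.
(eta_hat, beta_hat), _ = curve_fit(weibull_surv, age, surv,
                                   p0=(30.0, 1.5), bounds=(1e-3, np.inf))
print(f"estimated scale eta = {eta_hat:.1f} years, shape beta = {beta_hat:.2f}")

Restricting the fit to a limited age range would mirror the paper's truncated-data tests.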

2009 ◽  
Vol 92 (11) ◽  
pp. 5730-5738 ◽  
Author(s):  
M. Holtsmark ◽  
B. Heringstad ◽  
J. Ødegård

2014 ◽  
Vol 2 (1) ◽  
pp. 62-69 ◽  
Author(s):  
Jimin Lee ◽  
Robert Yearout ◽  
Donna Parsons

There are circumstances in which an item is intentionally tested to destruction. The purpose of this technique is to determine the failure rate (λ) of the tested item. For such items, the quality attribute is how long the item lasts until failure. Once the failure rate has been determined from the number of survivors and the total test time of all items, the mean time to failure (MTTF), a typical statistic in survival data analysis, can be computed as MTTF = 1/λ. From this one obtains the reliability function R(t) = e^(−λt), where t is time, which in turn gives the cumulative distribution function F(t) = 1 − e^(−λt). The corresponding density function, f(t) = λe^(−λt), is a negative exponential with standard deviation σ = 1/λ. Because the standard deviation equals the mean, setting a warranty policy for the tested item is difficult for the practitioner. An important property of the exponential distribution is that it is memoryless: its conditional probability satisfies P(T > s + t | T > s) = P(T > t) for all s, t ≥ 0. The exponential distribution can also be used to describe the interval lengths between any two consecutive arrival times in a homogeneous Poisson process. The purpose of this research paper is to present a simple technique for determining a realistic confidence level; using the same technique, the warranty level for the tested item can be predicted.
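
The exponential relations above translate directly into a few lines of code. Below is a minimal Python sketch, with hypothetical test counts, that computes λ, MTTF, R(t), and a warranty period chosen so that a target fraction of items survive it; it illustrates the standard formulas, not the paper's specific technique.

import math

# Hypothetical test-to-destruction result: 5 failures over 10,000 unit-hours.
failures = 5
total_time = 10_000.0            # total time on test across all items

lam = failures / total_time      # failure rate lambda (per hour)
mttf = 1.0 / lam                 # mean time to failure; also sigma = 1/lambda

def reliability(t):
    """R(t) = exp(-lambda * t): probability of surviving past time t."""
    return math.exp(-lam * t)

# Warranty period so that a target fraction of items survives it:
# solve R(t_w) = target  =>  t_w = -ln(target) / lambda.
target = 0.90
t_w = -math.log(target) / lam
print(f"MTTF = {mttf:.0f} h, 90%-survival warranty period = {t_w:.0f} h")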


2019 ◽  
Vol 5 (1) ◽  
Author(s):  
Joseph Homer Saleh

Abstract Popular culture associates the lives of Roman emperors with luxury, cruelty, and debauchery, sometimes rightfully so. One attribute missing from this list is, surprisingly, that this mighty office was exceptionally dangerous for its holder. Of the 69 rulers of the unified Roman Empire, from Augustus (d. 14 CE) to Theodosius (d. 395 CE), 62% suffered violent death. This has been known for a while, if not quantitatively then at least qualitatively. What is not known, however, and has never been examined, is the time-to-violent-death of Roman emperors. This work applies the statistical tools of survival data analysis to an unlikely population, Roman emperors, and examines a particular event in their rule, not unlike the focus of reliability engineering, but instead of time-to-failure, time-to-violent-death. We investigate the temporal signature of the seemingly haphazard stochastic process that is the violent death of a Roman emperor, and we examine whether there is some structure underlying the randomness in this process. Nonparametric and parametric results show that: (i) emperors faced a significantly elevated risk of violent death in the first year of their rule, reminiscent of infant mortality in reliability engineering; (ii) their risk of violent death increased again after 12 years, reminiscent of the wear-out period in reliability engineering; (iii) their failure rate displayed a bathtub-like curve, similar to that of a host of mechanical engineering items and electronic components. Results also showed that the stochastic process underlying the violent deaths of emperors is remarkably well captured by a (mixture) Weibull distribution. We discuss the interpretation of and possible reasons for this uncanny result, and we propose a number of fruitful avenues for future work to help better understand the deeper etiology of the spectacle of regicide of Roman emperors.
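
A two-component Weibull mixture of the kind the abstract mentions can indeed produce a bathtub-shaped hazard. The Python sketch below evaluates such a mixture hazard h(t) = f(t)/S(t); the weights and (shape, scale) parameters are illustrative choices, not the paper's fitted values.

import numpy as np

def weibull_pdf(t, k, lam):
    # Weibull density with shape k and scale lam.
    return (k / lam) * (t / lam) ** (k - 1) * np.exp(-(t / lam) ** k)

def weibull_sf(t, k, lam):
    # Weibull survival function S(t) = exp(-(t/lam)**k).
    return np.exp(-(t / lam) ** k)

# Hypothetical mixture: an early-failure component (shape < 1, "infant
# mortality") and a wear-out component (shape > 1); all values illustrative.
w = 0.35
params = [(0.6, 2.0), (3.0, 15.0)]     # (shape, scale) in years of rule

t = np.linspace(0.1, 20, 50)           # start above 0: hazard diverges at t=0
f = w * weibull_pdf(t, *params[0]) + (1 - w) * weibull_pdf(t, *params[1])
S = w * weibull_sf(t, *params[0]) + (1 - w) * weibull_sf(t, *params[1])
hazard = f / S                         # high early, flat mid, rising late

print(np.round(hazard[::10], 3))       # coarse view of the bathtub shape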


1986 ◽  
Vol 64 (3) ◽  
pp. 602-605 ◽  
Author(s):  
Richard M. Zammuto ◽  
Paul W. Sherman

Eight years of age-specific survival data and six years of fecundity data from a free-living population of Belding's ground squirrels (Spermophilus beldingi) at Tioga Pass, California, were used to test the hypothesis that time-specific life tables, based on data from individual years, were different from the cohort-specific life table, based on the combined data from all years. The results indicated that the age structure of neither the male nor the female population differed significantly among years (all P > 0.05). Furthermore, the means and variances of weaned litter sizes did not differ among years, either in the population at large or within individual age classes (all P > 0.05). A 27-day snowstorm in the spring of 1977 increased mortality and reduced reproduction, but it did not change the ground squirrels' age-specific survival or fecundity patterns. Taken together, our analyses revealed that each time-specific life table provided age-specific survival and fecundity estimates that were statistically indistinguishable (P > 0.05) from the composite, cohort-specific life table for each sex, regardless of severe environmental conditions. This is the first demonstration of the equivalence of time- and cohort-specific life tables for a free-living mammal population.


Author(s):  
Øystein Arild ◽  
Hans Petter Lohne ◽  
Hans Joakim Skadsem ◽  
Eric Patrick Ford ◽  
Jon Tømmerås Selvik

Abstract With the increasing number of aging fields and wells worldwide, a large number of wells will have to be permanently plugged and abandoned in the coming decades. Today's technical solutions for P&A design are primarily driven by legislation or recognized standards such as NORSOK D-010 or the Oil & Gas UK Well Decommissioning Guidelines. NORSOK D-010 says that the well should seal to "eternity", without providing any link between the recommended solution and time-to-failure. During the last few years, there has been a drive towards a risk-based approach to P&A design. With such an approach, the quality of a P&A design can be formulated in terms of the associated leakage risk, which consists of two components: i) the time-to-failure of the barrier system, and ii) the leakage rate given that the barrier system has failed. When failure data are available, a wide range of statistical tools exists for establishing the time-to-failure probability distribution. However, barrier failure data on permanently plugged and abandoned wells are scarce in the North Sea region. To estimate the time-to-failure for wells in this region, all relevant information should therefore be taken into account: survival data, expert input, and physicochemical degradation models. In this paper, we show how this can be accomplished by means of a Bayesian reliability approach. The paper first describes the general framework for Bayesian time-to-failure estimation. Thereafter, information pertaining to barrier system lifetimes for wells on the Norwegian Continental Shelf (NCS), and the relevant assumptions, are discussed. Finally, the methodology is applied to a synthetic case.
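
To illustrate the Bayesian updating step in its simplest form, here is a Python sketch assuming an exponential time-to-failure model with a conjugate Gamma prior on the failure rate; this is a deliberate simplification of the paper's more general framework, and all numbers are hypothetical. Expert input sets the prior, while censored survival data enter through the total time at risk.

from scipy import stats

# Prior on failure rate lambda from expert input (Gamma is conjugate to
# the exponential likelihood); hypothetical prior with mean a0/b0 = 0.01 /yr.
a0, b0 = 2.0, 200.0

# Hypothetical field observations: d barrier failures, with T total
# well-years at risk (censored wells contribute exposure, not failures).
d, T = 1, 350.0

# Conjugate update: posterior is Gamma(a0 + d, b0 + T).
post = stats.gamma(a=a0 + d, scale=1.0 / (b0 + T))

print(f"posterior mean rate: {post.mean():.4f} per well-year")
print(f"95% credible interval: {post.interval(0.95)}")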


2019 ◽  
Author(s):  
Kaiqiao Li ◽  
Xuefeng Wang ◽  
Pei Fen Kuan

Abstract High-dimensional genomics data in the biomedical sciences are an invaluable resource for constructing statistical prediction models. With increasing knowledge of gene networks and pathways, this information can be utilized in statistical models to improve prediction accuracy and enhance model interpretability. However, in some scenarios the network structure may be only partially known or inaccurately specified, and the performance of statistical models incorporating such a network structure may be compromised. In this paper, we propose a weighted sparse network learning method that addresses this issue by optimally combining a sparse, data-driven network with a known or partially known prior network. We show that our proposed model attains the oracle property, which improves the accuracy of parameter estimation, and achieves a parsimonious model in the high-dimensional setting for continuous, binary, and survival outcomes in extensive simulation studies. Case studies on ovarian cancer proteomics and melanoma gene expression further demonstrate that our proposed model achieves good operating characteristics in predicting response to chemotherapy and survival risk. An R package, glmaag, implementing our method is available on the Comprehensive R Archive Network (CRAN).
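
The authors' implementation is the R package glmaag; purely as a language-neutral illustration of the underlying idea of network-guided regularization, here is a Python sketch of a graph-Laplacian penalty, minimizing ||y − Xb||² + λ·bᵀLb. This is not the authors' estimator (which additionally has sparsity and adaptive weighting), and the tiny network below is synthetic.

import numpy as np

def laplacian_ridge(X, y, L, lam):
    """Network-penalized least squares:
    minimize ||y - X b||^2 + lam * b' L b,
    where L is the graph Laplacian of a prior gene network.
    Closed form: b = (X'X + lam * L)^(-1) X'y."""
    return np.linalg.solve(X.T @ X + lam * L, X.T @ y)

# Tiny synthetic example: 3 genes, genes 0 and 1 linked in the prior network.
A = np.array([[0, 1, 0],
              [1, 0, 0],
              [0, 0, 0]], dtype=float)    # adjacency of the prior network
L = np.diag(A.sum(axis=1)) - A            # graph Laplacian

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
y = X @ np.array([1.0, 1.0, 0.0]) + 0.1 * rng.standard_normal(50)

print(laplacian_ridge(X, y, L, lam=5.0))  # linked genes are shrunk together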


2007 ◽  
Vol 33 (3) ◽  
pp. 233-239 ◽  
Author(s):  
Karl B Christensen ◽  
Per Kragh Andersen ◽  
Lars Smith-Hansen ◽  
Martin L Nielsen ◽  
Tage S Kristensen

1996 ◽  
Vol 2 (2) ◽  
pp. 429-448 ◽  
Author(s):  
A.S. Macdonald

ABSTRACT This paper surveys some statistical models of survival data. Competing risks models are described; the unidentifiability of net decrements suggests a sceptical approach to the use of underlying single-decrement tables. Approaches based on observations of complete lifetimes (with censoring) are surveyed, including the Kaplan-Meier and Nelson-Aalen estimators. Regression models for lifetimes depending on covariates are discussed, in particular the Cox model and partial likelihood estimation.
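
As a minimal illustration of the product-limit idea the survey covers, here is a from-scratch Kaplan-Meier sketch in Python for right-censored lifetimes; the data are hypothetical.

import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimate for right-censored data:
    S(t) = prod over death times t_i <= t of (1 - d_i / n_i), where d_i
    is the number of deaths at t_i and n_i the number at risk just before."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    order = np.lexsort((-events, times))   # by time; deaths before censorings
    times, events = times[order], events[order]
    n = len(times)
    curve, s = [], 1.0
    for i, (t, e) in enumerate(zip(times, events)):
        if e:                              # observed death
            s *= 1.0 - 1.0 / (n - i)       # censored items only leave the risk set
        curve.append((t, s))
    return curve

# Hypothetical lifetimes: event = 1 is an observed death, 0 is censored.
times  = [2, 3, 3, 5, 8, 9, 12]
events = [1, 1, 0, 1, 0, 1, 1]
for t, s in kaplan_meier(times, events):
    print(f"t = {t:4.0f}  S(t) = {s:.3f}")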

