A Review of Survival Analysis Methods Used in NICE Technology Appraisals of Cancer Treatments: Consistency, Limitations, and Areas for Improvement

2019
Vol 39 (8)
pp. 899-909
Author(s):  
Helen Bell Gorrod ◽  
Ben Kearns ◽  
John Stevens ◽  
Praveen Thokala ◽  
Alexander Labeit ◽  
...  

Objectives. In June 2011, the National Institute for Health and Care Excellence (NICE) Decision Support Unit published a Technical Support Document (TSD) providing recommendations on survival analysis for NICE technology appraisals (TAs). Survival analysis outputs are influential inputs into economic models estimating the cost-effectiveness of new cancer treatments. Hence, it is important that systematic and justifiable model selection approaches are used. This study investigates the extent to which the TSD recommendations have been followed since its publication.

Methods. We reviewed NICE cancer TAs completed between July 2011 and July 2017. Information on survival analyses undertaken and associated critiques for overall survival (OS) and progression-free survival was extracted from the company submissions, Evidence Review Group (ERG) reports, and final appraisal determination documents.

Results. Information was extracted from 58 TAs. Only 4 (7%) followed all TSD recommendations for OS outcomes. The vast majority compared a range of common parametric models (91%) and assessed their fit to the data (86%). Only a minority of TAs included an assessment of the shape of the hazard function (38%) or of the proportional hazards assumption (40%). Validation of the extrapolated portion of the survival function using external data was attempted in a minority of TAs (40%). Extrapolated survival functions were frequently criticized by ERGs (71%).

Conclusions. Survival analysis within NICE TAs remains suboptimal, despite publication of the TSD. Model selection is not undertaken in a systematic way, resulting in inconsistencies between TAs. More attention needs to be given to assessing hazard functions and to validating extrapolated survival functions. Novel methods not described in the TSD have been used, particularly in the context of immuno-oncology, suggesting that an updated TSD may be of value.
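The model comparison that most TAs did perform is typically driven by an information criterion. A minimal Python sketch, with hypothetical maximised log-likelihoods, of ranking candidate OS models by AIC (the TSD stresses that fit statistics alone are not sufficient: hazard shape, the proportional hazards assumption, and external validity also need checking):

```python
def aic(log_lik, n_params):
    """Akaike information criterion: 2k - 2*log L, lower is better."""
    return 2 * n_params - 2 * log_lik

# Hypothetical maximised log-likelihoods (and parameter counts) for
# candidate overall-survival models
candidates = {
    "exponential": (-412.7, 1),
    "weibull":     (-401.3, 2),
    "gompertz":    (-400.9, 2),
    "log-normal":  (-399.8, 2),
}

scores = {name: aic(ll, k) for name, (ll, k) in candidates.items()}
best = min(scores, key=scores.get)
```

The 2k term penalises the extra parameter of the two-parameter models, but it cannot reveal whether any of the candidates extrapolates plausibly beyond the trial follow-up.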

Neurology
2020
Vol 95 (5)
pp. e508-e518
Author(s):  
Daigo Yoshida ◽  
Tomoyuki Ohara ◽  
Jun Hata ◽  
Mao Shibata ◽  
Yoichiro Hirakawa ◽  
...  

Objective. To estimate the lifetime cumulative incidence of dementia and its subtypes in a community-dwelling elderly population in Japan.

Methods. A total of 1,193 community-dwelling Japanese individuals without dementia, aged 60 years or older, were followed up prospectively for 17 years. The cumulative incidence of dementia was estimated from a death- and dementia-free survival function and the hazard functions of dementia at each year, computed using a Weibull proportional hazards model. The lifetime risk of dementia was defined as the cumulative incidence of dementia at the point in time when the survival probability of the population was estimated to be less than 0.5%.

Results. During the follow-up, 350 participants developed some type of dementia; among them, 191 developed Alzheimer disease (AD) and 117 developed vascular dementia (VaD). The lifetime risk of dementia was 55% (95% confidence interval, 49%–60%). Women had an approximately 1.5 times greater lifetime risk of dementia than men (65% [57%–72%] vs 41% [33%–49%]). The lifetime risks of developing AD and VaD were 42% (35%–50%) and 16% (12%–21%) in women vs 20% (7%–34%) and 18% (13%–23%) in men, respectively.

Conclusion. The lifetime risk of all dementia for elderly Japanese was substantial, at approximately 50% or higher. This study suggests that the lifetime burden attributable to dementia in contemporary Japanese communities is immense.
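The lifetime-risk construction described here, accumulating yearly dementia hazards while discounting by death- and dementia-free survival, can be sketched as follows; the yearly hazards below are illustrative, not the study's estimates:

```python
def lifetime_risk(h_dementia, h_death):
    """Cumulative incidence of dementia under the competing risk of death,
    from yearly discrete hazards."""
    alive_free = 1.0   # P(alive and dementia-free at start of year)
    cum_inc = 0.0
    for hd, hm in zip(h_dementia, h_death):
        cum_inc += alive_free * hd             # new dementia cases this year
        alive_free *= (1.0 - hd) * (1.0 - hm)  # survive both risks
    return cum_inc

# Illustrative yearly hazards over 40 years of follow-up
years = 40
h_dem = [0.005 + 0.002 * t for t in range(years)]
h_die = [0.010 + 0.003 * t for t in range(years)]
risk = lifetime_risk(h_dem, h_die)
```

Ignoring the competing risk of death (setting the death hazards to zero) overstates the lifetime risk, which is why the study works from a joint death- and dementia-free survival function.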


2016
Vol 17 (2)
pp. 130-151
Author(s):  
Scott Dellana ◽  
David West

Purpose. The purpose of this paper is to apply survival analysis, using Cox proportional hazards regression (CPHR), to the problem of predicting if and when supply chain (SC) customers or suppliers might file a petition for bankruptcy, so that proactive steps may be taken to avoid a SC disruption.

Design/methodology/approach. CPHR is first compared to multiple discriminant analysis (MDA) and logistic regression (LR) to assess its suitability and accuracy for SC applications, using three years of quarterly financial data for 69 non-bankrupt and 74 bankrupt organizations. A k-means clustering approach is then applied to the survival curves of all 143 organizations to explore heuristics for predicting the timing of bankruptcy petitions.

Findings. CPHR makes bankruptcy predictions at least as accurately as MDA and LR. The survival function also provides valuable information on when bankruptcy might occur. This information allows SC members to be prioritized into three groups: financially healthy companies at no immediate risk, companies at imminent risk of bankruptcy, and companies with intermediate levels of risk that need monitoring.

Originality/value. This paper proposes a new analytical approach to scanning and assessing the financial risk of SC members (suppliers or customers). Traditional models can predict if, but not when, a financial failure will occur. Lacking this information, it is impossible for SC managers to prioritize risk mitigation activities. A simple decision rule is developed to guide SC managers in setting these priorities.
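The clustering step can be illustrated with a tiny one-dimensional k-means on a summary of each company's survival curve; the hypothetical feature here is a predicted 2-year survival probability per firm, a sketch of the idea rather than the paper's exact feature set:

```python
# Hypothetical predicted 2-year survival probabilities for 10 SC members,
# summarising each company's fitted survival curve in one number
survival_2yr = [0.95, 0.92, 0.90, 0.88, 0.55, 0.50, 0.48, 0.12, 0.08, 0.05]

def kmeans_1d(xs, k=3, iters=100):
    """Plain k-means on scalars with deterministic spread-out initialisation."""
    s = sorted(xs)
    cents = [s[i * (len(s) - 1) // (k - 1)] for i in range(k)]
    groups = [[] for _ in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in xs:                      # assign to nearest centroid
            j = min(range(k), key=lambda c: abs(x - cents[c]))
            groups[j].append(x)
        new = [sum(g) / len(g) if g else cents[j] for j, g in enumerate(groups)]
        if new == cents:                  # converged
            break
        cents = new
    return cents, groups

centroids, clusters = kmeans_1d(survival_2yr)
# low-survival cluster = imminent bankruptcy risk; high = financially healthy;
# the middle cluster is the "monitor" group
```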


2020
Vol 4 (349)
pp. 81-92
Author(s):  
Dominik Kubacki ◽  
Robert Kubacki

One of the key elements in calculating Customer Lifetime Value is estimating the future duration of a client's relationship with a bank. This can be done using survival analysis. The aim of the article is to examine which of the distributions commonly used in survival analysis (Weibull, Exponential, Gamma, Log-normal) best describes the churn phenomenon among a bank's clients. If the aim is to estimate the distribution according to which certain units (bank customers) survive, and the factors driving churn are of secondary interest, then parametric models can be used. Estimating the parameters of a survival function is faster than estimating a full Cox model with a properly selected set of explanatory variables. The authors used censored data from a retail bank for the study. The article also draws attention to the most common problems in preparing data for survival analysis.
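For the simplest of the four candidate distributions, the Exponential, the censored-data MLE has a closed form (events divided by total exposure), which makes the speed advantage over a full Cox model concrete. A sketch on hypothetical churn data:

```python
import math

# Hypothetical churn data: months observed, and 1 = churned, 0 = right-censored
months  = [3, 8, 12, 12, 15, 20, 24, 24, 24, 30]
churned = [1, 1,  1,  0,  1,  1,  0,  0,  1,  0]

# Exponential MLE with right censoring has a closed form:
# rate = number of churn events / total exposure time
events = sum(churned)
exposure = sum(months)
rate = events / exposure

# Log-likelihood: churn events contribute log f(t) = log(rate) - rate*t,
# censored observations contribute log S(t) = -rate*t
loglik = events * math.log(rate) - rate * exposure
aic = 2 * 1 - 2 * loglik   # one free parameter
```

The same AIC comparison extends to the Weibull, Gamma, and Log-normal candidates, whose likelihoods need numerical maximisation rather than a closed form.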


Author(s):  
Peter Watson

This chapter explores survival analysis. It includes data censoring, functions of duration time (the survival function, and hazard function), Cox’s proportional hazards model, log-linearity, time varying predictors, and odds ratios.
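The survival function under censoring is usually estimated with the product-limit (Kaplan–Meier) estimator, which can be computed in a few lines; a minimal sketch on toy right-censored data:

```python
# Toy right-censored data: (duration, event) with event 1 = observed, 0 = censored
data = [(2, 1), (3, 1), (3, 0), (5, 1), (7, 0), (8, 1)]

def kaplan_meier(data):
    """Product-limit estimate of the survival function S(t)."""
    s = 1.0
    curve = []
    for t in sorted({u for u, e in data if e}):       # distinct event times
        at_risk = sum(1 for u, _ in data if u >= t)   # still under observation
        deaths = sum(1 for u, e in data if u == t and e)
        s *= 1 - deaths / at_risk
        curve.append((t, s))
    return curve

km = kaplan_meier(data)
```

Censored durations never multiply the product directly; they only shrink the risk set, which is exactly how censoring enters the estimator.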


Cancers
2021
Vol 13 (22)
pp. 5778
Author(s):  
Anne Clavreul ◽  
Jean-Michel Lemée ◽  
Gwénaëlle Soulard ◽  
Audrey Rousseau ◽  
Philippe Menei

Purpose: The survival times of glioblastoma (GB) patients after the standard therapy, including safe maximal resection followed by radiotherapy plus concomitant and adjuvant temozolomide, are heterogeneous. In order to define a simple, reliable method for predicting whether patients with isocitrate dehydrogenase (IDH)-wildtype GB treated with the standard therapy will be short- or long-term survivors, we analyzed the correlation of preoperative blood counts, and their combined forms, with progression-free survival (PFS) and overall survival (OS) in these patients.

Methods: Eighty-five patients with primary IDH-wildtype GB treated with the standard therapy between 2012 and 2019 were analyzed retrospectively. Cox proportional hazards models and Kaplan–Meier analysis were used to investigate the prognostic value of preoperative hematological parameters for survival.

Results: Preoperative high neutrophil-to-lymphocyte ratio (NLR, >2.42), high platelet count (>2.36 × 10⁹/L), and low red blood cell (RBC) count (≤4.59 × 10¹²/L) were independent prognostic factors for poorer OS (p = 0.030, p = 0.030, and p = 0.004, respectively). Moreover, a high NLR was an independent prognostic factor for shorter PFS (p = 0.010). We also found that, like NLR, preoperative high derived NLR (dNLR, >1.89) was of poor prognostic value for both PFS (p = 0.002) and OS (p = 0.033). A significant correlation was observed between NLR and dNLR (r = 0.88, p < 0.001), which had a similar prognostic power for OS (NLR: AUC = 0.58; 95% CI: [0.48; 0.68]; dNLR: AUC = 0.62; 95% CI: [0.51; 0.72]). Two scores, one based on preoperative platelet and RBC counts plus NLR and the other on preoperative platelet and RBC counts plus dNLR, were found to be independent prognostic factors for PFS (p = 0.006 and p = 0.002, respectively) and OS (p < 0.001 for both scores).

Conclusion: Cheap, routinely ordered, preoperative assessments of blood markers, such as NLR, dNLR, RBC, and platelet counts, can predict the survival outcomes of patients with IDH-wildtype GB treated with the standard therapy.
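Using the cutoffs reported above, a simple count-of-adverse-markers score can be formed. The 0/1 scoring scheme below is an illustrative construction, not necessarily the authors' exact score, and the cutoff values are quoted in the abstract's units:

```python
def risk_score(nlr, platelets, rbc):
    """Count of adverse preoperative markers (0-3); higher = worse prognosis.
    Cutoffs from the abstract; the additive 0/1 scheme is an assumption."""
    score = 0
    score += 1 if nlr > 2.42 else 0        # high neutrophil-to-lymphocyte ratio
    score += 1 if platelets > 2.36 else 0  # high platelet count
    score += 1 if rbc <= 4.59 else 0       # low red blood cell count
    return score
```

For example, a hypothetical patient with NLR 3.1, a platelet count of 2.5, and an RBC count of 4.2 carries all three adverse markers.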


Author(s):  
Jonathan Golub

This article discusses survival analysis, another way to incorporate temporal information into analysis, with advantages similar to those of time-series methods. It describes the main choices researchers face when conducting survival analysis and offers a set of methodological steps that should become standard practice. After introducing the basic terminology, it argues that there is little to lose and much to gain by employing Cox models instead of parametric models. Cox models are superior to parametric models in three main respects: they provide a more reliable treatment of the baseline hazard, superior handling of the proportional hazards assumption, and the best handling of tied data. Moreover, the supposed benefits of parametric models are shown to be largely illusory. Greater use of Cox models enables researchers to elicit more useful information from their data and allows for more reliable substantive inferences about important political processes.


Author(s):  
Constantin Ruhe

In many applications of the Cox model, the proportional-hazards assumption is implausible. In these cases, the solution to nonproportional hazards usually consists of modeling the effect of the variable of interest and its interaction effect with some function of time. Although Stata provides a command to implement this interaction in stcox, it does not allow the typical visualizations using stcurve if stcox was estimated with the tvc() option. In this article, I provide a short workaround that estimates the survival function after stcox with time-dependent coefficients. I introduce and describe the scurve_tvc command, which automates this procedure and allows users to easily visualize survival functions for models with time-varying effects.
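Outside Stata, the quantity that scurve_tvc computes can be approximated directly: with a piecewise-constant baseline hazard and a coefficient that varies with log time (one common tvc() specification), the survival function is the exponential of minus the integrated hazard. A hypothetical sketch, with made-up baseline hazard and coefficients:

```python
import math

def survival_tvc(x, baseline_hazard, b0, b1, dt=1.0):
    """S(t | x) with a piecewise-constant baseline hazard and a
    time-varying coefficient beta(t) = b0 + b1*log(t)."""
    cum = 0.0
    curve = []
    for j, h0 in enumerate(baseline_hazard):
        t = (j + 1) * dt   # right endpoint of interval j
        # accumulate the integrated hazard over this interval
        cum += h0 * math.exp((b0 + b1 * math.log(t)) * x) * dt
        curve.append((t, math.exp(-cum)))
    return curve

h0 = [0.05] * 10  # flat baseline hazard over 10 periods (hypothetical)
treated = survival_tvc(1, h0, b0=0.8, b1=-0.4)  # harmful early, effect wanes
control = survival_tvc(0, h0, b0=0.8, b1=-0.4)
```

With b1 negative, the hazard ratio exp(b0 + b1·log t) shrinks toward and below 1 over time, which is exactly the nonproportional pattern the tvc() interaction is meant to capture.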


2021

Background: Simulation studies are an important statistical tool for investigating the performance, properties, and adequacy of statistical models in pre-specified situations. The proportional hazards model is one of the most important statistical models in medical studies. This study aimed to investigate one-month survival of road traffic accident (RTA) victims at a Level 1 Trauma Center in Iran, using parametric and semi-parametric survival analysis models, from the viewpoint of the post-crash care provider in 2017.

Materials and Methods: This retrospective cohort study was conducted at the Level 1 Trauma Center of Shiraz, Iran, from January to December 2017. Because certain covariates acting on survival may exhibit a non-homogeneous risk pattern, violating the proportional hazards assumption of the Cox model, parametric survival models were also employed to inspect the multiplicative effect of all covariates on the hazard. The distributions of choice were Exponential, Weibull, and Lognormal, and candidate models were compared using the Akaike information criterion (AIC).

Results: Survival analysis was conducted on 8,621 individuals whose length of stay (observation period) was between 1 and 89 days. In total, 141 deaths occurred during this time. The log-rank test revealed inequality of survival functions across categories of age, injury mechanism, injured body region, injury severity score, and nosocomial infection. Although the risk levels in the Cox model were almost the same as those from the parametric models, the Weibull model yielded better results in the multivariate analysis according to the Akaike criterion.

Conclusion: In multivariate analysis, parametric models were more efficient than the semi-parametric model, although some results were similar across both; among the parametric models, the Weibull model was the most efficient.
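The log-rank test used here to compare survival functions across categories reduces to an observed-minus-expected calculation at each event time; a two-group sketch on toy data:

```python
# Toy two-group sample: (time, event, group), event 1 = death observed
data = [(1, 1, "A"), (2, 1, "A"), (1, 1, "B"), (3, 1, "B")]

def logrank(data):
    """Two-group log-rank chi-squared statistic (1 degree of freedom)."""
    times = sorted({t for t, e, _ in data if e})
    o_minus_e = 0.0   # observed minus expected deaths in group A
    var = 0.0         # hypergeometric variance accumulated over event times
    for t in times:
        n = sum(1 for u, _, _ in data if u >= t)              # at risk, total
        n_a = sum(1 for u, _, g in data if u >= t and g == "A")
        d = sum(1 for u, e, _ in data if u == t and e)        # deaths at t
        d_a = sum(1 for u, e, g in data if u == t and e and g == "A")
        o_minus_e += d_a - d * n_a / n
        if n > 1:
            var += d * (n_a / n) * (1 - n_a / n) * (n - d) / (n - 1)
    return o_minus_e ** 2 / var

stat = logrank(data)
```

The statistic is compared against a chi-squared distribution with one degree of freedom; a large value signals unequal survival functions, as reported for age, injury mechanism, and the other factors above.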


Risks
2021
Vol 9 (6)
pp. 103
Author(s):  
Morne Joubert ◽  
Tanja Verster ◽  
Helgard Raubenheimer ◽  
Willem D. Schutte

Survival analysis is one of the techniques that can be used to predict loss given default (LGD) for regulatory capital (Basel) purposes. When using survival analysis to model LGD, one proposed methodology is the default weighted survival analysis (DWSA) method. This paper adapts the DWSA method (used to model Basel LGD) to estimate LGD for International Financial Reporting Standard (IFRS) 9 impairment requirements. The DWSA methodology allows for over-recoveries, default weighting, and negative cashflows. For IFRS 9, this methodology must be adapted, as the estimated LGD is an input into the expected credit losses (ECL). Our proposed IFRS 9 LGD methodology uses survival analysis to estimate the LGD. The Cox proportional hazards model allows a baseline survival curve to be adjusted to produce survival curves for different segments of the portfolio. The forward-looking LGD values are adjusted for different macro-economic scenarios, and the ECL is calculated for each scenario. These ECL values are probability weighted to produce a final ECL estimate. We illustrate our proposed IFRS 9 LGD methodology and ECL estimation on a dataset from a retail portfolio of a South African bank.
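The final probability weighting across macro-economic scenarios is a straightforward expectation; a sketch with hypothetical scenario probabilities and scenario-adjusted LGDs (the EAD and PD figures are also illustrative, not from the paper):

```python
# All figures hypothetical
ead = 1_000_000   # exposure at default
pd_12m = 0.04     # 12-month probability of default

scenarios = [     # (scenario probability, scenario-adjusted forward-looking LGD)
    (0.30, 0.35), # upside
    (0.50, 0.45), # base
    (0.20, 0.60), # downside
]

# probability-weighted expected credit loss across scenarios
ecl = sum(p * ead * pd_12m * lgd for p, lgd in scenarios)
```

In the paper's setup, each scenario LGD would itself come from the Cox-adjusted, segment-specific survival curves rather than being a fixed input.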


2020
Vol 72 (2)
pp. 111-121
Author(s):  
Abdurakhim Akhmedovich Abdushukurov ◽  
Rustamjon Sobitkhonovich Muradov

Several approaches currently exist for estimating the survival functions of vectors of lifetimes. However, some of these estimators are either inconsistent or not fully defined over the range of joint survival functions, and are therefore not applicable in practice. In this article, we consider three types of estimates, of exponential-hazard, product-limit, and relative-risk power structure, for the bivariate survival function, obtained by replacing the number of summands in the empirical estimates with a sequence of Poisson random variables. It is shown that these estimates are asymptotically equivalent. AMS 2000 subject classification: 62N01
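In the univariate case, the exponential-hazard and product-limit structures referred to here differ only in how the cumulative hazard enters; a sketch contrasting the two on toy right-censored data (the paper's bivariate, Poisson-sample-size setting is a generalisation of this):

```python
import math

# Toy right-censored sample: (time, event), event 1 = observed
data = [(1, 1), (2, 0), (3, 1), (4, 1), (6, 0)]

def survival_estimates(data, t):
    """Product-limit and exponential-hazard estimates of S(t)."""
    pl, cum_haz = 1.0, 0.0
    for u in sorted({u for u, e in data if e and u <= t}):
        n = sum(1 for v, _ in data if v >= u)         # at risk just before u
        d = sum(1 for v, e in data if v == u and e)   # events at u
        pl *= 1 - d / n          # product-limit structure
        cum_haz += d / n         # Nelson-Aalen cumulative hazard
    return pl, math.exp(-cum_haz)

pl, eh = survival_estimates(data, 5)
```

Since exp(-x) ≥ 1 - x, the exponential-hazard estimate never falls below the product-limit estimate, and the two agree asymptotically, mirroring the equivalence result stated above.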

