Causal inference for long-term survival in randomised trials with treatment switching: Should re-censoring be applied when estimating counterfactual survival times?

2018 ◽  
Vol 28 (8) ◽  
pp. 2475-2493 ◽  
Author(s):  
NR Latimer ◽  
IR White ◽  
KR Abrams ◽  
U Siebert

Treatment switching often has a crucial impact on estimates of effectiveness and cost-effectiveness of new oncology treatments. Rank preserving structural failure time models (RPSFTM) and two-stage estimation (TSE) methods estimate ‘counterfactual’ (i.e. had there been no switching) survival times and incorporate re-censoring to guard against informative censoring in the counterfactual dataset. However, re-censoring causes a loss of longer term survival information which is problematic when estimates of long-term survival effects are required, as is often the case for health technology assessment decision making. We present a simulation study designed to investigate applications of the RPSFTM and TSE with and without re-censoring, to determine whether re-censoring should always be recommended within adjustment analyses. We investigate a context where switching is from the control group onto the experimental treatment in scenarios with varying switch proportions, treatment effect sizes, treatment effect changes over time, survival function shapes, disease severity and switcher prognosis. Methods were assessed according to their estimation of control group restricted mean survival that would be observed in the absence of switching, up to the end of trial follow-up. We found that analyses which re-censored usually produced negative bias (i.e. underestimating control group restricted mean survival and overestimating the treatment effect), whereas analyses that did not re-censor consistently produced positive bias which was often smaller in magnitude than the bias associated with re-censored analyses, particularly when the treatment effect was high and the switching proportion was low. The RPSFTM with re-censoring generally resulted in increased bias compared to the other methods. We believe that analyses should be conducted with and without re-censoring, as this may provide decision-makers with useful information on where the true treatment effect is likely to lie. Incorporating re-censoring should not always represent the default approach when the objective is to estimate long-term survival times and treatment effects.
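For readers unfamiliar with how RPSFTM counterfactual times are constructed, the sketch below illustrates the core calculation under an acceleration factor exp(ψ), with and without re-censoring. It is a minimal illustration on made-up numbers, not the authors' simulation code; the variable names, the single common acceleration factor, and the administrative censoring structure are assumptions.

```python
# Minimal sketch (not the authors' code) of RPSFTM counterfactual survival times
# with and without re-censoring. All names and values are illustrative.
import numpy as np

def counterfactual_times(t_off, t_on, censor_time, event, psi, recensor=True):
    """Return counterfactual times and event indicators for control patients.

    t_off       : time spent off the experimental treatment
    t_on        : time spent on the experimental treatment (after switching)
    censor_time : administrative censoring time for each patient
    event       : 1 if the observed time ended in an event, 0 if censored
    psi         : log acceleration factor; exp(psi) < 1 shrinks treated time
    """
    u = t_off + np.exp(psi) * t_on          # counterfactual (untreated) time
    if not recensor:
        return u, event
    # Re-censoring: censor at the minimum possible counterfactual censoring
    # time, min(C, exp(psi) * C), to guard against informative censoring.
    d_star = np.minimum(censor_time, np.exp(psi) * censor_time)
    u_recensored = np.minimum(u, d_star)
    event_recensored = np.where(u <= d_star, event, 0)
    return u_recensored, event_recensored

# Example: a switcher who spent 6 months off and 10 months on treatment,
# administrative censoring at 24 months, psi = -0.4.
u, e = counterfactual_times(np.array([6.0]), np.array([10.0]),
                            np.array([24.0]), np.array([1]), psi=-0.4)
```

The loss of long-term information discussed in the abstract comes from exactly this step: when exp(psi) is small, the re-censoring time min(C, exp(psi)·C) can cut off events that were actually observed.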

2021 ◽  
Vol 28 ◽  
pp. 107327482199743
Author(s):  
Ke Chen ◽  
Xiao Wang ◽  
Liu Yang ◽  
Zheling Chen

Background: Treatment options for advanced gastric esophageal cancer are quite limited. Chemotherapy is unavoidable at certain stages, and research on targeted therapies has mostly failed. The advent of immunotherapy has brought hope for the treatment of advanced gastric esophageal cancer. The aim of this study was to analyze the safety of anti-PD-1/PD-L1 immunotherapy and the long-term survival of patients who were diagnosed with gastric esophageal cancer and received anti-PD-1/PD-L1 immunotherapy. Method: Studies on anti-PD-1/PD-L1 immunotherapy for advanced gastric esophageal cancer published before February 1, 2020 were searched online. Survival outcomes (e.g. 6-month overall survival, 12-month overall survival (OS), progression-free survival (PFS), objective response rate (ORR)) and adverse effects of immunotherapy were compared with those of control therapy (physician’s choice of therapy). Results: After screening 185 studies, 4 comparative cohort studies that reported the long-term survival of patients receiving immunotherapy were included. Compared to the control group, 12-month survival (OR = 1.67, 95% CI: 1.31 to 2.12, P < 0.0001) and 18-month survival (OR = 1.98, 95% CI: 1.39 to 2.81, P = 0.0001) were significantly better in the immunotherapy group. The 3-month survival rate (OR = 1.05, 95% CI: 0.36 to 3.06, P = 0.92) and 18-month survival rate (OR = 1.44, 95% CI: 0.98 to 2.12, P = 0.07) were not significantly different between the immunotherapy and control groups. The ORR was not significantly different between the immunotherapy and control groups (OR = 1.54, 95% CI: 0.65 to 3.66, P = 0.01). Meta-analysis indicated that in the subgroup of patients with PD-L1 CPS ≥ 10, immunotherapy significantly improved tumor response rates (OR = 3.80, 95% CI: 1.89 to 7.61, P = 0.0002). Conclusion: For the treatment of advanced gastric esophageal cancer, the therapeutic efficacy of anti-PD-1/PD-L1 immunotherapy was superior to that of chemotherapy or palliative care.
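As a rough illustration of how study-level odds ratios such as the 12-month survival OR above are combined, the sketch below applies inverse-variance (fixed-effect) pooling of log odds ratios to fabricated 2×2 counts. It is not the authors' meta-analysis code; the counts, number of studies, and fixed-effect choice are all assumptions.

```python
# Inverse-variance (fixed-effect) pooling of study-level odds ratios
# on fabricated survival counts. Illustration only.
import numpy as np
from scipy import stats

# (events, total) in immunotherapy and control arms for four hypothetical studies
immuno  = [(60, 150), (45, 120), (70, 160), (30, 90)]
control = [(40, 150), (30, 118), (50, 155), (22, 92)]

log_or, weights = [], []
for (a, n1), (c, n0) in zip(immuno, control):
    b, d = n1 - a, n0 - c                      # non-events in each arm
    lor = np.log((a * d) / (b * c))            # study log odds ratio
    var = 1 / a + 1 / b + 1 / c + 1 / d        # approximate variance of the log OR
    log_or.append(lor)
    weights.append(1 / var)

pooled = np.average(log_or, weights=weights)
se = np.sqrt(1 / np.sum(weights))
ci = np.exp([pooled - 1.96 * se, pooled + 1.96 * se])
p = 2 * stats.norm.sf(abs(pooled / se))
print(f"pooled OR = {np.exp(pooled):.2f}, 95% CI {ci[0]:.2f} to {ci[1]:.2f}, p = {p:.3f}")
```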


2021 ◽  
Vol 14 (8) ◽  
pp. 705
Author(s):  
Hideki Houzen ◽  
Takahiro Kano ◽  
Kazuhiro Horiuchi ◽  
Masahiro Wakita ◽  
Azusa Nagai ◽  
...  

Reports on the long-term survival effect of edaravone, which was approved for the treatment of amyotrophic lateral sclerosis (ALS) in 2015 in Japan, are rare. Herein, we report our retrospective analysis of 45 consecutive patients with ALS who initially visited our hospital between 2013 and 2018. Of these, 22 patients were treated with edaravone for an average duration of 26.6 (range, 2–64) months, whereas the remaining patients were not treated with edaravone and comprised the control group. There were no differences in baseline demographics between the two groups. The primary endpoint was tracheostomy positive-pressure ventilation (TPPV) or death, and the follow-up period ended in December 2020. The survival rate was significantly better in the edaravone group than in the control group based on the Kaplan–Meier analysis, which revealed that the median survival durations were 49 (9–88) and 25 (8–41) months in the edaravone and control groups, respectively (p = 0.001, log-rank test). There were no serious edaravone-associated adverse effects during the study period. Overall, the findings of this single-center retrospective study suggest that edaravone might prolong survival in patients with ALS.
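The comparison described above is a standard Kaplan–Meier estimate with a log-rank test. As a hedged illustration only, the sketch below shows how such a comparison can be run with the lifelines package; the column names and the survival times are fabricated assumptions, not the study dataset.

```python
# Kaplan-Meier curves and log-rank test on fabricated data, using lifelines.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.DataFrame({
    "months":    [49, 60, 12, 88, 30, 25, 18, 41, 8, 33],
    "event":     [1, 0, 1, 0, 1, 1, 1, 1, 1, 0],   # 1 = TPPV or death
    "edaravone": [1, 1, 1, 1, 1, 0, 0, 0, 0, 0],
})

kmf = KaplanMeierFitter()
for name, grp in df.groupby("edaravone"):
    kmf.fit(grp["months"], grp["event"], label=f"edaravone={name}")
    print(name, kmf.median_survival_time_)        # median survival per group

res = logrank_test(df.loc[df.edaravone == 1, "months"],
                   df.loc[df.edaravone == 0, "months"],
                   df.loc[df.edaravone == 1, "event"],
                   df.loc[df.edaravone == 0, "event"])
print(res.p_value)
```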


Heart ◽  
2021 ◽  
Vol 107 (5) ◽  
pp. 389-395
Author(s):  
Jianhua Wu ◽  
Alistair S Hall ◽  
Chris P Gale

Aims: ACE inhibition reduces mortality and morbidity in patients with heart failure after acute myocardial infarction (AMI). However, there are limited randomised data about the long-term survival benefits of ACE inhibition in this population. Methods: In 1993, the Acute Infarction Ramipril Efficacy (AIRE) study randomly allocated patients with AMI and clinical heart failure to ramipril or placebo. The duration of masked trial therapy in the UK cohort (603 patients, mean age=64.7 years, 455 male patients) was 12.4 and 13.4 months for ramipril (n=302) and placebo (n=301), respectively. We estimated life expectancy and extensions of life (difference in median survival times) according to duration of follow-up (range 0–29.6 years). Results: By 9 April 2019, death from all causes occurred in 266 (88.4%) patients in the placebo arm and 275 (91.1%) patients in the ramipril arm. The extension of life between ramipril and placebo groups was 14.5 months (95% CI 13.2 to 15.8). Ramipril increased life expectancy more for patients with than without diabetes (life expectancy difference 32.1 vs 5.0 months), previous AMI (20.1 vs 4.9 months), previous heart failure (19.5 vs 4.9 months), hypertension (16.6 vs 8.3 months), angina (16.2 vs 5.0 months) and age >65 years (11.3 vs 5.7 months). Given potential treatment switching, the true absolute treatment effect could be underestimated by 28%. Conclusion: For patients with clinically defined heart failure following AMI, ramipril results in a sustained survival benefit, and is associated with an extension of life of up to 14.5 months for, on average, 13 months of treatment duration.
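Life expectancy within trial follow-up is commonly estimated as the area under the Kaplan–Meier curve (the restricted mean survival time), and the between-arm difference gives an extension-of-life figure analogous to the one reported above. The sketch below, on fabricated data and with an assumed 29.6-year horizon, shows one way to compute this with lifelines; it is an illustration of the idea, not the AIRE analysis.

```python
# Restricted mean survival time (area under the KM curve) per arm and the
# between-arm difference, on fabricated follow-up data.
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.utils import restricted_mean_survival_time

df = pd.DataFrame({
    "years":    [2.1, 10.4, 5.3, 22.0, 7.7, 1.2, 15.8, 3.4, 9.9, 27.5],
    "died":     [1, 1, 1, 0, 1, 1, 1, 1, 1, 0],
    "ramipril": [1, 0, 1, 1, 0, 0, 1, 0, 1, 0],
})

rmst = {}
for arm, grp in df.groupby("ramipril"):
    kmf = KaplanMeierFitter().fit(grp["years"], grp["died"])
    rmst[arm] = restricted_mean_survival_time(kmf, t=29.6)   # follow-up horizon in years

print(f"extension of life ≈ {12 * (rmst[1] - rmst[0]):.1f} months")
```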


2020 ◽  
Author(s):  
Yun Xu ◽  
Cong Li ◽  
Charlie Zhi-Lin Zheng ◽  
Yu-Qin Zhang ◽  
Tian-An Guo ◽  
...  

Background: Lynch syndrome (LS) is the most common hereditary colorectal cancer (CRC) syndrome. Comparisons of prognosis between LS and sporadic CRC (SCRC) are rare, with conflicting results. This study aimed to compare the long-term outcomes of patients with LS and SCRC. Methods: Between June 2008 and September 2018, a total of 47 patients were diagnosed with LS by genetic testing at Fudan University Shanghai Cancer Center. A 1:2 propensity score matching was performed to obtain a homogeneous cohort from the SCRC group, and 94 SCRC patients were enrolled as the control group. Long-term survival rates were compared between the two groups, and prognostic factors were analyzed. Results: The 5-year OS rate in the LS group was 97.6%, significantly higher than the 82.6% in the SCRC group (p = 0.029). The 5-year PFS rate showed no significant difference between the two groups (78.0% for the LS group vs. 70.6% for the SCRC group; p = 0.262). The 5-year TFS rate was 62.1% in the LS group, significantly lower than the 70.6% in the SCRC group (p = 0.039). By multivariate analysis, tumor progression of primary CRC and TNM staging were independent risk factors for OS. Conclusion: LS patients have better long-term survival than SCRC patients. Strict regular follow-up monitoring, detection at earlier tumor stages, and effective treatment are key to ensuring a better long-term prognosis.
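The 1:2 propensity score matching mentioned above can be illustrated with a simple nearest-neighbour procedure: estimate each patient's probability of being an LS case from baseline covariates, then pair every LS patient with the two closest unmatched SCRC patients. The sketch below uses scikit-learn on fabricated covariates; the covariates and matching details are assumptions, not the study's actual procedure.

```python
# 1:2 nearest-neighbour propensity score matching on fabricated covariates.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_ls, n_scrc = 47, 400
df = pd.DataFrame({
    "lynch": np.r_[np.ones(n_ls), np.zeros(n_scrc)].astype(int),
    "age":   np.r_[rng.normal(45, 10, n_ls), rng.normal(60, 12, n_scrc)],
    "stage": rng.integers(1, 5, n_ls + n_scrc),
})

# 1. Propensity score: probability of being an LS case given covariates.
model = LogisticRegression(max_iter=1000).fit(df[["age", "stage"]], df["lynch"])
df["ps"] = model.predict_proba(df[["age", "stage"]])[:, 1]

# 2. For each LS patient, take the 2 unmatched SCRC patients closest in score.
controls = df[df.lynch == 0].copy()
matched_ids = []
for _, case in df[df.lynch == 1].iterrows():
    controls["dist"] = (controls["ps"] - case["ps"]).abs()
    picks = controls.nsmallest(2, "dist").index
    matched_ids.extend(picks)
    controls = controls.drop(picks)            # match without replacement

matched = pd.concat([df[df.lynch == 1], df.loc[matched_ids]])
```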


2016 ◽  
Vol 32 (3) ◽  
pp. 160-166 ◽  
Author(s):  
Nicholas R. Latimer ◽  
Chris Henshall ◽  
Uwe Siebert ◽  
Helen Bell

Objectives: Treatment switching refers to the situation in a randomized controlled trial where patients switch from their randomly assigned treatment onto an alternative. Often, switching is from the control group onto the experimental treatment. In this instance, a standard intention-to-treat analysis does not identify the true comparative effectiveness of the treatments under investigation. We aim to describe statistical methods for adjusting for treatment switching in a comprehensible way for nonstatisticians, and to summarize views on these methods expressed by stakeholders at the 2014 Adelaide International Workshop on Treatment Switching in Clinical Trials. Methods: We describe three statistical methods used to adjust for treatment switching: marginal structural models, two-stage adjustment, and rank preserving structural failure time models. We draw upon discussion heard at the Adelaide International Workshop to explore the views of stakeholders on the acceptability of these methods. Results: Stakeholders noted that adjustment methods are based on assumptions, the validity of which may often be questionable. There was disagreement on the acceptability of adjustment methods, but consensus that when these are used, they should be justified rigorously. The utility of adjustment methods depends upon the decision being made and the processes used by the decision-maker. Conclusions: Treatment switching makes estimating the true comparative effect of a new treatment challenging. However, many decision-makers have reservations with adjustment methods. These, and how they affect the utility of adjustment methods, require further exploration. Further technical work is required to develop adjustment methods to meet real world needs, to enhance their acceptability to decision-makers.
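Of the three adjustment methods named above, the two-stage adjustment idea is perhaps the easiest to sketch: among control-arm patients followed from a secondary baseline (for example, disease progression), fit an accelerated failure time model with switching as a covariate, then use the estimated time ratio to shrink switchers' post-switch survival back towards its untreated counterfactual. The code below is a minimal illustration on simulated data with assumed variable names, not a validated implementation of the method.

```python
# Two-stage adjustment sketch: AFT model from a secondary baseline, then
# counterfactual post-switch times for switchers. Simulated data only.
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(1)
n = 200
post = pd.DataFrame({                      # control patients after progression
    "switched": rng.integers(0, 2, n),
    "ecog":     rng.integers(0, 3, n),     # prognostic covariate at progression
})
post["time"] = rng.weibull(1.2, n) * 12 * np.exp(0.5 * post["switched"])
post["event"] = 1

aft = WeibullAFTFitter().fit(post, duration_col="time", event_col="event")
accel = np.exp(aft.params_.loc[("lambda_", "switched")])   # time ratio for switching

# Counterfactual post-progression time for switchers: divide out the benefit.
post["cf_time"] = np.where(post["switched"] == 1, post["time"] / accel, post["time"])
```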


2019 ◽  
Vol 111 (11) ◽  
pp. 1186-1191 ◽  
Author(s):  
Julien Péron ◽  
Alexandre Lambert ◽  
Stephane Munier ◽  
Brice Ozenne ◽  
Joris Giai ◽  
...  

Background: The treatment effect in survival analysis is commonly quantified as the hazard ratio, and tested statistically using the standard log-rank test. Modern anticancer immunotherapies are successful in a proportion of patients who remain alive even after a long-term follow-up. This new phenomenon induces a nonproportionality of the underlying hazards of death. Methods: The properties of the net survival benefit were illustrated using the dataset from a trial evaluating ipilimumab in metastatic melanoma. The net survival benefit was then investigated through simulated datasets under typical scenarios of proportional hazards, delayed treatment effect, and cure rate. The net survival benefit test was computed according to the value of the minimal survival difference considered clinically relevant. As comparators, the standard and the weighted log-rank tests were also performed. Results: In the illustrative dataset, the net survival benefit favored ipilimumab [Δ(0) = 15.8%, 95% confidence interval = 4.6% to 27.3%, P = .006]. This favorable effect was maintained when the analysis was focused on long-term survival differences (e.g. >12 months), with Δ(12) = 12.5% (95% confidence interval = 4.4% to 20.6%, P = .002). Under the scenarios of a delayed treatment effect and cure rate, the power of the net survival benefit test compared favorably to the standard log-rank test power and was comparable to the power of the weighted log-rank test for large values of the threshold of clinical relevance. Conclusion: The net long-term survival benefit is a measure of treatment effect that is meaningful whether or not hazards are proportional. The associated statistical test is more powerful than the standard log-rank test when a delayed treatment effect is anticipated.
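The net survival benefit Δ(τ) is built from generalized pairwise comparisons: the proportion of treatment-control pairs in which the treated patient survives at least τ longer, minus the proportion in which the control patient does. The sketch below shows that scoring rule on fabricated, fully observed survival times; it deliberately ignores censoring, which the published method handles, so it illustrates the idea rather than the actual estimator.

```python
# Pairwise scoring behind the net survival benefit Delta(tau), ignoring censoring.
import numpy as np

def net_survival_benefit(t_treat, t_control, tau=0.0):
    wins = losses = 0
    for a in t_treat:
        for b in t_control:
            if a >= b + tau:          # treated patient lives at least tau longer
                wins += 1
            elif b >= a + tau:        # control patient lives at least tau longer
                losses += 1
    n_pairs = len(t_treat) * len(t_control)
    return (wins - losses) / n_pairs

t_ipi  = np.array([28.0, 14.5, 40.2, 6.3, 55.0])   # fabricated survival times (months)
t_ctrl = np.array([9.8, 12.1, 20.4, 5.0, 16.7])
print(net_survival_benefit(t_ipi, t_ctrl, tau=12.0))  # long-term threshold of 12 months
```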


HortScience ◽  
1996 ◽  
Vol 31 (6) ◽  
pp. 988-991 ◽  
Author(s):  
K.A. Jacobs ◽  
G.R. Johnson

Seedlings of eight Prunus taxa were evaluated for variation in susceptibility to a single, 4- or 5-day flooding period and root rot caused by Phytophthora cryptogea Pethybr. & Lafferty. Survival, plant defoliation, disease severity index, root necrosis, and net photosynthesis indicated that the combination of flooding and pathogen was significantly more severe to all taxa than either individual treatment. Most response variables reflected early plant dysfunction but were not correlated with long-term survival. Long-term survival was 70% in the combination treatment compared to 99% in the control group. Flooding injured seedlings more than the pathogen in most taxa. Taxa differed only slightly in tolerance to the treatments, as measured by survival rate. Prunus takesimensis Nakai had the highest survival rate of 100% and along with P. mahaleb L. and P. yedoensis Matsum. showed some tolerance to flooding and the pathogen. Prunus sargentii Rehd. had the lowest survival rate of 81% and appeared to be least tolerant to the pathogen.


2020 ◽  
Vol 37 (9) ◽  
pp. 707-715
Author(s):  
Ala Abudayyeh ◽  
Juhee Song ◽  
Maen Abdelrahim ◽  
Ibrahim Dahbour ◽  
Valda D. Page ◽  
...  

Introduction: In patients with advanced cancer, prolongation of life with treatment often incurs substantial emotional and financial expense. Because acute kidney injury (AKI) is known to be associated with much higher odds of hospital mortality among hospitalized patients with cancer, we investigated whether renal replacement therapy (RRT) use in the intensive care unit (ICU) was a significant independent predictor of worse outcomes. Methods: We retrospectively reviewed patients admitted from 2005 to 2014 who were diagnosed with stage IV solid tumors, had AKI, and received a nephrology consult. The main outcomes were survival times from the landmark time points, inpatient mortality, and longer term survival after hospital discharge. Logistic regression and Cox proportional hazards regression were used to compare inpatient mortality and longer term survival between the RRT and non-RRT groups. Propensity score-matched landmark survival analyses were performed with two landmark time points, chosen at day 2 and day 7 from ICU admission. Results: Of the 465 patients with stage IV cancer admitted to the ICU with AKI, 176 needed RRT. In the multivariate logistic regression model, after adjusting for baseline serum albumin and baseline maximum Sequential Organ Failure Assessment (SOFA) score, patients who received RRT did not differ significantly from non-RRT patients in inpatient mortality (odds ratio: 1.004 [95% confidence interval: 0.598-1.684], P = .9892). In total, 189 patients were evaluated for the impact of RRT on long-term survival; RRT was not significantly associated with long-term survival after discharge among patients who were discharged alive. Landmark analyses at day 2 and day 7 confirmed the same findings. Conclusions: Our study found that receiving RRT in the ICU was not significantly associated with inpatient mortality, survival times from the landmark time points, or long-term survival after discharge in patients with stage IV cancer and AKI.
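A landmark analysis of the kind described above conditions on being alive and in follow-up at the landmark, restarts the clock there, and then compares groups, for example with a Cox model. The sketch below, on fabricated data with assumed column names, illustrates a day-7 landmark comparison of RRT versus no RRT; it omits the propensity score matching step used in the study.

```python
# Day-7 landmark survival analysis with a Cox model, on fabricated ICU data.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "days_from_icu": [3, 10, 45, 21, 8, 60, 15, 90, 33, 12],
    "died":          [1, 1, 1, 0, 1, 0, 1, 0, 0, 1],
    "rrt":           [1, 1, 0, 0, 1, 0, 1, 0, 1, 0],
    "sofa_max":      [12, 9, 7, 6, 14, 5, 10, 4, 11, 8],
})

landmark = 7
lm = df[df["days_from_icu"] > landmark].copy()   # still alive and in follow-up at day 7
lm["time_from_landmark"] = lm["days_from_icu"] - landmark

cph = CoxPHFitter().fit(lm, duration_col="time_from_landmark", event_col="died",
                        formula="rrt + sofa_max")
print(cph.summary[["exp(coef)", "p"]])
```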


2021 ◽  
pp. ijgc-2020-002023
Author(s):  
Joanna Baum ◽  
Elena Ioana Braicu ◽  
Oliver Hunsicker ◽  
Ignace Vergote ◽  
Nicole Concin ◽  
...  

Introduction: Long-term survivors of ovarian cancer are a unique group of patients in whom prognostic factors for long-term survival have been poorly described. Such factors may provide information for a more personalized therapeutic approach. The objective of this study is to determine further characteristics of long-term survivors with high-grade serous ovarian cancer. Methods: Long-term survivors were defined as patients living longer than 8 years after first diagnosis and were recruited within seven high volume centers across Europe from November 1988 to November 2008. The control group included patients with high-grade serous ovarian cancer with less than 5 years' survival identified from the systematic ‘Tumorbank ovarian cancer’ database. A subanalysis of Charité patients only was performed separately for in-depth analysis of tumor dissemination. Propensity score matching with nearest-neighbor caliper width was used to match long-term survivors and the control group regarding age, FIGO stage, and residual tumor. Results: A total of 276 patients with high-grade serous ovarian cancer were included, divided into 131 long-term survivors and 145 control group patients. After propensity score matching and multivariable adjustment, platinum sensitivity (p=0.002) was an independent favorable prognostic factor whereas recurrence (p<0.001) and ascites (p=0.021) were independent detrimental predictors for long-term survival. Significantly more long-term survivors tested positive for mutation in the BRCA1 gene than the BRCA2 gene (p=0.016). Intraoperatively, these patients had less tumor involvement of the upper abdomen at initial surgery (p=0.024). Complexity of surgery and surgical techniques were similar in both cohorts. Conclusion: Platinum sensitivity constitutes a favorable factor for long-term survival whereas tumor involvement of the upper abdomen, ascites, and recurrence have a negative impact. Based on clinical estimation, long-term survival is associated with combinations of clinical, surgical, and molecular factors.
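Nearest-neighbour propensity score matching with a caliper, as used above, accepts a match only when the closest control lies within a fixed distance on the (logit) propensity score, commonly 0.2 standard deviations of the logit score. The sketch below illustrates the mechanics on fabricated covariates with scikit-learn; the covariates, caliper value, and 1:1 ratio are assumptions, not the study's procedure.

```python
# Nearest-neighbour propensity score matching with a caliper on the logit score.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n_lts, n_ctrl = 131, 145
df = pd.DataFrame({
    "lts":            np.r_[np.ones(n_lts), np.zeros(n_ctrl)].astype(int),
    "age":            np.r_[rng.normal(55, 9, n_lts), rng.normal(62, 10, n_ctrl)],
    "figo":           rng.integers(1, 5, n_lts + n_ctrl),
    "residual_tumor": rng.integers(0, 2, n_lts + n_ctrl),
})

X = df[["age", "figo", "residual_tumor"]]
df["ps"] = LogisticRegression(max_iter=1000).fit(X, df["lts"]).predict_proba(X)[:, 1]
df["logit_ps"] = np.log(df["ps"] / (1 - df["ps"]))
caliper = 0.2 * df["logit_ps"].std()          # common caliper choice

controls = df[df.lts == 0].copy()
pairs = []
for idx, case in df[df.lts == 1].iterrows():
    dist = (controls["logit_ps"] - case["logit_ps"]).abs()
    if len(dist) and dist.min() <= caliper:   # accept only within the caliper
        best = dist.idxmin()
        pairs.append((idx, best))
        controls = controls.drop(best)        # match without replacement
```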

