Risk factors associated with major adverse cardiac and cerebrovascular events following percutaneous coronary intervention: a 10-year follow-up comparing random survival forest and Cox proportional-hazards model

2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Maryam Farhadian ◽  
Sahar Dehdar Karsidani ◽  
Azadeh Mozayanimonfared ◽  
Hossein Mahjub

Abstract Background Due to the limited number of studies with long-term follow-up of patients undergoing Percutaneous Coronary Intervention (PCI), we investigated the occurrence of Major Adverse Cardiac and Cerebrovascular Events (MACCE) during 10 years of follow-up after coronary angioplasty using Random Survival Forest (RSF) and Cox proportional hazards models. Methods The current retrospective cohort study was performed on 220 patients (69 women and 151 men) who underwent coronary angioplasty from March 2009 to March 2012 at the Farshchian Medical Center in Hamadan, Iran. Survival time (months), the response variable, was measured from the date of angioplasty to the main endpoint or the end of the follow-up period (September 2019). To identify the factors influencing the occurrence of MACCE, the performance of the Cox and RSF models was compared in terms of the C index, Integrated Brier Score (IBS), and prediction error. Results Ninety-six patients (43.7%) experienced MACCE by the end of the follow-up period, and the median survival time was estimated at 98 months. Survival decreased from 99% in the first year to 39% at 10 years of follow-up. The Cox model identified the following predictors: age (HR = 1.03, 95% CI 1.01–1.05), diabetes (HR = 2.17, 95% CI 1.29–3.66), smoking (HR = 2.41, 95% CI 1.46–3.98), and stent length (HR = 1.74, 95% CI 1.11–2.75). Predictive performance was slightly better for the RSF model (IBS 0.124 vs. 0.135, C index 0.648 vs. 0.626, and out-of-bag error rate 0.352 vs. 0.374 for RSF vs. Cox). In addition to age, diabetes, smoking, and stent length, the RSF model also ranked coronary artery disease (acute or chronic) and hyperlipidemia among the most important variables. Conclusion Machine-learning prediction models such as RSF showed better performance than the Cox proportional hazards model for the prediction of MACCE during long-term follow-up after PCI.
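A minimal sketch of this kind of Cox-vs-RSF comparison with scikit-survival, on synthetic stand-in data (the cohort is not public, so the column names, data generator, and hyperparameters below are illustrative assumptions, not the authors' code):

```python
import numpy as np
import pandas as pd
from sksurv.ensemble import RandomSurvivalForest
from sksurv.linear_model import CoxPHSurvivalAnalysis
from sksurv.metrics import concordance_index_censored, integrated_brier_score
from sksurv.util import Surv

rng = np.random.default_rng(0)
n = 220  # cohort size from the abstract
X = pd.DataFrame({
    "age": rng.normal(60, 10, n),
    "diabetes": rng.integers(0, 2, n).astype(float),
    "smoking": rng.integers(0, 2, n).astype(float),
    "stent_length": rng.normal(20, 5, n),
})
time = rng.exponential(80, n).clip(1, 128)  # follow-up in months
event = rng.random(n) < 0.44                # ~44% experience MACCE
y = Surv.from_arrays(event=event, time=time)

cox = CoxPHSurvivalAnalysis().fit(X, y)
rsf = RandomSurvivalForest(n_estimators=500, min_samples_leaf=10,
                           random_state=0).fit(X, y)

# C index: fraction of comparable patient pairs ranked correctly by risk.
for name, model in [("Cox", cox), ("RSF", rsf)]:
    c = concordance_index_censored(event, time, model.predict(X))[0]
    print(f"{name} C index: {c:.3f}")

# Integrated Brier Score over a grid of event times (lower is better).
times = np.quantile(time[event], [0.1, 0.25, 0.5, 0.75])
for name, model in [("Cox", cox), ("RSF", rsf)]:
    surv = np.vstack([fn(times) for fn in model.predict_survival_function(X)])
    print(f"{name} IBS: {integrated_brier_score(y, y, surv, times):.3f}")
```

Both models' `predict` outputs are risk scores (higher means riskier), so the same concordance routine applies to each; in practice the metrics would be computed on held-out or out-of-bag data rather than the training set.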

Author(s):  
Yuko Yamaguchi ◽  
Marta Zampino ◽  
Toshiko Tanaka ◽  
Stefania Bandinelli ◽  
Yusuke Osawa ◽  
...  

Abstract Background Anemia is common in older adults and associated with greater morbidity and mortality. The causes of anemia in older adults have not been completely characterized. Although elevated circulating growth and differentiation factor 15 (GDF-15) has been associated with anemia in older adults, it is not known whether elevated GDF-15 predicts the development of anemia. Methods We examined the relationship between baseline plasma GDF-15 concentrations and incident anemia during 15 years of follow-up in 708 non-anemic adults aged 60 years and older participating in the Invecchiare in Chianti (InCHIANTI) Study. Results During follow-up, 179 (25.3%) participants developed anemia. The proportion of participants who developed anemia, from the lowest to the highest quartile of plasma GDF-15, was 12.9%, 20.1%, 21.2%, and 45.8%, respectively. Adults in the highest quartile of plasma GDF-15 had an increased risk of developing anemia (hazard ratio 1.15, 95% confidence interval 1.09–1.21, P<.0001) compared to those in the lower three quartiles in a multivariable Cox proportional hazards model adjusting for age, sex, serum iron, soluble transferrin receptor, ferritin, vitamin B12, congestive heart failure, diabetes mellitus, and cancer. Conclusions Circulating GDF-15 is an independent predictor of the development of anemia in older adults.
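As a sketch of the quartile analysis in lifelines (the InCHIANTI data are not reproduced here, so the file and column names are assumptions), the top-quartile indicator and the adjusted Cox fit could look like this:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("inchianti_subset.csv")  # hypothetical extract

# Indicator for the highest plasma GDF-15 quartile vs. the lower three.
df["gdf15_q4"] = (df["gdf15"] >= df["gdf15"].quantile(0.75)).astype(int)

cols = ["gdf15_q4", "age", "sex", "serum_iron", "stfr", "ferritin",
        "vitamin_b12", "chf", "diabetes", "cancer",
        "followup_years", "anemia"]
cph = CoxPHFitter()
cph.fit(df[cols], duration_col="followup_years", event_col="anemia")
cph.print_summary()  # exp(coef) for gdf15_q4 is the adjusted hazard ratio
```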


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
A Fujino ◽  
H Ogawa ◽  
S Ikeda ◽  
K Doi ◽  
Y Hamatani ◽  
...  

Abstract Background Atrial fibrillation (AF) commonly progresses from paroxysmal to sustained type in the natural course of the disease, and we previously demonstrated that progression of AF was associated with an increased risk of clinical adverse events. Some patients, though less frequently, regress from sustained to paroxysmal AF, but the clinical impact of this regression remains unknown. Purpose We sought to investigate whether regression from sustained to paroxysmal AF is associated with better clinical outcomes. Methods Using the dataset of the Fushimi AF Registry, patients diagnosed with sustained (persistent or permanent) AF at baseline were studied. Conversion of sustained to paroxysmal AF during follow-up was defined as regression of AF. Major adverse cardiac events (MACE) were defined as the composite of cardiac death, stroke, and hospitalization for heart failure (HF). Event rates were compared between patients with and without regression of AF. In patients with sustained AF at baseline, predictors of MACE were identified using a Cox proportional hazards model. Results Among 2,253 patients diagnosed with sustained AF at baseline, regression of AF was observed in 9.0% (202/2,253; 2.0 per 100 patient-years) during a median follow-up of 4.0 years. Of these, 24.3% (49/202; 4.6 per 100 patient-years) eventually reverted to sustained AF during follow-up. The proportion of asymptomatic patients was lower in patients with regression of AF than in those without (with vs. without regression: 49.0% vs. 69.5%, p<0.01). The percentage of beta-blocker use at baseline was similar between the two groups (37.2% vs. 33.8%, p=0.34). The proportion of patients who underwent catheter ablation or electrical cardioversion during follow-up was higher in patients with regression of AF (catheter ablation: 15.8% vs. 5.5%, p<0.01; cardioversion: 4.0% vs. 1.4%, p<0.01). The rate of MACE was significantly lower in patients with regression of AF than in patients who maintained sustained AF (3.7 vs. 6.2 per 100 patient-years, log-rank p<0.01). The figure shows Kaplan-Meier curves for MACE, cardiac death, hospitalization for heart failure, and stroke. In patients with sustained AF at baseline, a multivariable Cox proportional hazards model demonstrated that regression of AF was an independent predictor of lower rates of MACE (adjusted hazard ratio [HR]: 0.50, 95% confidence interval [CI]: 0.28 to 0.88, p=0.02), stroke (HR: 0.51, 95% CI: 0.30 to 0.88, p=0.02), and hospitalization for HF (HR: 0.50, 95% CI: 0.29 to 0.85, p=0.01). Conclusion Regression from sustained to paroxysmal AF was associated with a lower incidence of adverse cardiac events. Funding Acknowledgement Type of funding source: None
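The group contrast and log-rank comparison described above map onto a standard Kaplan-Meier sketch; the registry data are not public, so the file and column names below are assumptions:

```python
import matplotlib.pyplot as plt
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("fushimi_sustained_af.csv")  # hypothetical extract
reg = df[df["af_regression"] == 1]
no_reg = df[df["af_regression"] == 0]

# Kaplan-Meier curves for MACE-free survival in each group.
ax = plt.subplot(111)
for label, g in [("Regression of AF", reg), ("Sustained AF", no_reg)]:
    KaplanMeierFitter().fit(g["years"], event_observed=g["mace"],
                            label=label).plot_survival_function(ax=ax)

result = logrank_test(reg["years"], no_reg["years"],
                      event_observed_A=reg["mace"],
                      event_observed_B=no_reg["mace"])
print(f"log-rank p = {result.p_value:.4f}")
plt.show()
```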


2019 ◽  
Vol 104 (1) ◽  
pp. 81-86 ◽  
Author(s):  
Sung Uk Baek ◽  
Ahnul Ha ◽  
Dai Woo Kim ◽  
Jin Wook Jeoung ◽  
Ki Ho Park ◽  
...  

Background/Aims To investigate the risk factors for disease progression of normal-tension glaucoma (NTG) with pretreatment intraocular pressure (IOP) in the low teens. Methods One hundred and two (102) eyes of 102 patients with NTG with pretreatment IOP≤12 mm Hg who had been followed up for more than 60 months were retrospectively enrolled. Patients were divided into progressor and non-progressor groups according to visual field (VF) progression as correlated with change of optic disc or retinal nerve fibre layer defect. Baseline demographic and clinical characteristics, including diurnal IOP and 24-hour blood pressure (BP), were compared between the two groups. The Cox proportional hazards model was used to identify the risk factors for disease progression. Results Thirty-six patients (35.3%) were classified as progressors and 66 (64.7%) as non-progressors. Between the two groups, no significant differences were found in follow-up period (8.7±3.4 vs 7.7±3.2 years; p=0.138), baseline VF mean deviation (−4.50±5.65 vs −3.56±4.30 dB; p=0.348) or pretreatment IOP (11.34±1.21 vs 11.17±1.06 mm Hg; p=0.121). The multivariate Cox proportional hazards model indicated that greater diurnal IOP at baseline (HR=1.609; p=0.004), greater fluctuation of diastolic BP (DBP; HR=1.058; p=0.002) and presence of optic disc haemorrhage (DH) during follow-up (HR=3.664; p=0.001) were risk factors for glaucoma progression. Conclusion Among low-teens NTG eyes, 35.3% showed glaucoma progression during the average 8.7 years of follow-up. Fluctuation of DBP and diurnal IOP, as well as DH, were significantly associated with a greater probability of disease progression.
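A hedged sketch of the multivariable model (treating disc haemorrhage as a simple baseline-coded covariate for brevity, although the abstract records it during follow-up; all file and column names are assumptions):

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ntg_low_teens.csv")  # hypothetical
cph = CoxPHFitter()
cph.fit(df[["years", "progressed", "diurnal_iop",
            "dbp_fluctuation", "disc_haemorrhage"]],
        duration_col="years", event_col="progressed")

# Hazard ratios with 95% CIs, in the form quoted in the abstract.
print(cph.summary[["exp(coef)", "exp(coef) lower 95%",
                   "exp(coef) upper 95%", "p"]].round(3))
```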


2017 ◽  
Vol 35 (15_suppl) ◽  
pp. 8014-8014
Author(s):  
Arjun Lakshman ◽  
Muhamad Alhaj Moustafa ◽  
S. Vincent Rajkumar ◽  
Angela Dispenzieri ◽  
Morie A. Gertz ◽  
...  

8014 Background: t(11;14) is a standard-risk cytogenetic marker in MM. Methods: We reviewed 366 patients with MM who had t(11;14) by FISH and 732 age- and period-matched controls without t(11;14), seen at our institution from 2004 to 2014; outcomes were analyzed using time to first progression or death (PFS1) and overall survival (OS). Results: In the t(11;14) group at diagnosis, the median age was 63.7 yr (range, 22.1-95.4) and 64.5% of patients were male. Eighty-nine (24.3%) patients were above 70 yr of age at diagnosis. 33.8%, 40.3% and 25.9% of patients had ISS stage I, II and III disease, respectively. 13% of patients had elevated LDH. Monosomy 17 or del 17p was identified in 10.6% of patients. The median follow-up period was 56.9 months (m) (95% CI: 54.6-62.2) and 209 (57.1%) patients were alive at last follow-up. Among patients receiving proteasome inhibitor (PI)-based, immunomodulator (IMiD)-based, PI+IMiD-based or other agent-based induction therapy, 71.2%, 70.3%, 90.4% and 37.5% of patients respectively attained ≥PR as best response to induction (p < 0.01). During their course, 223 (60.9%) patients underwent stem cell transplant. Median PFS1 and OS were 23.1 (CI: 20.8-27.9) and 78.6 (CI: 66.7-105.9) m, respectively. Among the controls, high-risk cytogenetics (HRC) was present in 142 (19.4%), and the median OS was 83.8 m (CI: 70.8-97.0), comparable to the t(11;14) group (p = 0.8). For all 1,098 patients, in a Cox proportional hazards model with age > 70 years, induction therapy (novel agent-based vs others), cytogenetics [HRC vs t(11;14) without HRC vs no HRC or t(11;14)], and ISS stage III vs II/I as predictors, age > 70 years [HR 2.2 (CI: 1.8-2.8), p < 0.01], ISS III vs ISS II/I [HR 1.4 (CI: 1.1-1.8), p < 0.01] and HRC [HR 2.1 (CI: 1.6-2.8) vs no HRC or t(11;14), p < 0.01; HR 1.9 (CI: 1.4-2.6) vs t(11;14) without HRC, p < 0.01] were associated with reduced OS. The risk of reduced OS did not differ between t(11;14) without HRC and those without t(11;14) or HRC [HR 1.1 (CI: 0.9-1.4), p = 0.4]. Conclusions: Our study characterizes the outcomes of a large cohort of MM patients with t(11;14) at diagnosis. Advanced age, HRC and advanced stage at diagnosis were associated with worse OS in our cohort. t(11;14) MM without HRC does not differ in outcome from non-t(11;14) MM without HRC.
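The pooled model over all 1,098 patients uses a three-level cytogenetics factor; one way to sketch that encoding in lifelines is dummy variables against a "neither" reference level (all file, column, and level names below are hypothetical):

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("mm_t11_14_cohort.csv")  # hypothetical
# cytogenetics levels assumed: "none", "t11_14_only", "hrc"
dummies = pd.get_dummies(df["cytogenetics"], prefix="cyto", dtype=float)
model_df = pd.concat([df[["os_months", "death", "age_over_70",
                          "novel_induction", "iss_3"]],
                      dummies.drop(columns="cyto_none")], axis=1)

cph = CoxPHFitter().fit(model_df, duration_col="os_months",
                        event_col="death")
cph.print_summary()  # HRs for cyto_hrc and cyto_t11_14_only vs. reference
```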


2016 ◽  
Author(s):  
Michael S. Lauer

Abstract To inform the retirement of NIH-owned chimpanzees, we analyzed the outcomes of 764 NIH-owned chimpanzees that were located, at various points in time, in at least one of 4 specific locations. All chimpanzees considered were alive and at least 10 years of age on January 1, 2005; transfers to a federal sanctuary began a few months later. During a median follow-up of just over 7 years, there were 314 deaths. In a Cox proportional hazards model that accounted for age, sex, and location (treated as a time-dependent covariate), age and sex were strong predictors of mortality, but location was only marginally predictive. Among 273 chimpanzees who were transferred to the federal sanctuary, we found no materially increased risk of mortality in the first 30 days after arrival. During a median follow-up at the sanctuary of 3.5 years, age was strongly predictive of mortality, but other variables (sex, season of arrival, and ambient temperature on the day of arrival) were not. We confirmed our regression findings using random survival forests. In summary, in a large cohort of captive chimpanzees, we found no evidence of materially important associations of location of residence or recent transfer with premature mortality.
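Treating location as a time-dependent covariate corresponds to a counting-process data layout, one row per residence interval; a sketch with lifelines, with the layout and column names assumed:

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# One row per (animal, residence interval): start/stop in years since
# Jan 1, 2005, with location dummies holding the value for that interval.
long_df = pd.read_csv("chimp_intervals.csv")  # hypothetical
# columns: id, start, stop, died, age, male, loc_sanctuary, loc_b, loc_c

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="died",
        start_col="start", stop_col="stop")
ctv.print_summary()  # per the abstract: age/sex strong, location marginal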


2005 ◽  
Vol 32 (3) ◽  
pp. 302-328 ◽  
Author(s):  
Mike Stoolmiller ◽  
Elaine A. Blechman

How well does substance use predict adolescent recidivism? When the Cox proportional hazards model was applied to officially recorded first rearrest of 505 juvenile offenders, a best-fitting complex multivariate model indicated that: (a) parent reports that youths "often" use substances more than double first-rearrest risk, (b) averaged youth and parent substance use reports predict recidivism better than a single source, (c) parent or youth denial of youth substance use predicts recidivism, (d) age at first arrest does not predict recidivism, (e) non-White/non-Asians have a 79% higher recidivism risk than peers, (f) parent-reported delinquency predicts recidivism with declining accuracy, and (g) substance use robustly predicts recidivism independent of prior reported delinquency, gender, ethnicity, age, follow-up time, or data source. Findings are related to host-provocation theory.


Blood ◽  
2019 ◽  
Vol 134 (Supplement_1) ◽  
pp. 2274-2274
Author(s):  
Annelies J. Van Vuren ◽  
Laurel Mendelsohn ◽  
Richard van Wijk ◽  
Caterina P. Minniti ◽  
John Baird ◽  
...  

Background Chronic hemolysis is a hallmark of sickle cell disease (SCD). Intravascular hemolysis in particular is associated with severe vasculopathic complications, including pulmonary hypertension (PH) and early mortality. Free heme causes oxidative damage and was recently identified as an erythrocyte-derived danger-associated molecular pattern (e-DAMP) associated with endothelial activation and vaso-occlusion in SCD (Belcher et al., Blood. 2014; Ghosh et al., J. Clin. Invest. 2013). Intravascular hemolysis is associated with elevated levels of serum lactate dehydrogenase (LDH). Heme catabolism leads to endogenous carbon monoxide (CO) production by heme oxygenase-1 (HO1), and CO is eliminated in exhaled breath. CO is carried in blood primarily as carboxyhemoglobin (HbCO), and end-alveolar CO (EACO) is an accepted proxy marker for its concentration in blood. We evaluated several laboratory values and ratios that might reflect the relative contribution of intravascular heme release and overall heme processing. Methods We investigated the relationship between EACO, HbCO (NCT01547793, cohort A) and other biomarkers of hemolysis in adults with SCD at steady state as part of the clinical cohort at the National Institutes of Health Clinical Center, Bethesda, Maryland, USA (NCT00011648, cohort B). For patients in cohort B, all routine samples with HbCO results were included in the analyses. In a subgroup of cohort B with data available on HbCO, echocardiography and/or mortality, we evaluated the correlation between the LDH/HbCO ratio and echocardiographic markers of PH and all-cause mortality (cohort C). Combining all recognized available markers of hemolysis (total bilirubin, AST, absolute reticulocyte count, hemoglobin, median LDH, median HbCO and LDH/HbCO ratio) in a multivariate Cox proportional hazards model for survival led to selection of a predictive model encompassing three biomarkers: the LDH/HbCO ratio, AST and hemoglobin. Of these three markers, the LDH/HbCO ratio was the most predictive factor. We also conducted univariate correlations with clinical outcome indicators. Main findings Erythropoietic and hemolytic laboratory parameters of the cohorts are provided in Table 1. HbCO concentrations and EACO were strongly correlated (Pearson's correlation r=0.66, p<0.01). In both cohort A and cohort B, HbCO and EACO were not correlated with LDH. However, EACO and HbCO did correlate with absolute reticulocyte counts (r=0.46, p<0.01 and r=0.58, p<0.01, respectively). The patients of cohort C were divided into low (peak TRV <2.5 m/s, N=34), intermediate (peak TRV 2.5-3 m/s, N=38) and high-risk (peak TRV >=3.0 m/s, N=13) categories, based upon prior cut-points determined by risk of development of PH and early mortality (Mehari A. et al. JAMA. 2012) (Figure 1, panel A). LDH/HbCO ratios were positively correlated with TRV (r=0.38, p<0.01), and were significantly higher in patients with TRV >=3.0 m/s (Mann-Whitney U test; p=0.02). In contrast, LDH values alone were not discriminative. All patients (25/25) with an LDH/HbCO ratio <1,200 had a TRV <3.0 m/s; 94% (15/16) of the patients with catheterization-proven PH had an LDH/HbCO ratio >1,200. In the intermediate-risk subgroup, PH was only diagnosed in individuals with LDH/HbCO ratios exceeding 1,200. Median follow-up was 12.1 years (IQR 10.3; 16.3), and 25% (23/91) of the patients died during follow-up.
Five-year, 10-year and 15-year overall survival in the group with LDH/HbCO ratio >1,200 were 92.1%, 76.0% and 69.1%, respectively, whereas 5-year, 10-year and 15-year overall survival in the group with LDH/HbCO ratio <1,200 were 100%, 92.9% and 88.0%, respectively (Figure 1, panel B). LDH/HbCO ratios were associated with all-cause mortality in a Cox proportional hazards model (p<0.01) and remained significantly associated with all-cause mortality when adjusted for age, C-reactive protein and ferritin (p=0.02). LDH alone was not associated with all-cause mortality in the unadjusted analysis. Main conclusions A ratio of two readily available clinical laboratory markers, LDH and HbCO, is promising as a potential biomarker in SCD. Increased LDH/HbCO ratios are strongly associated with elevated TRV and all-cause mortality, and thereby might improve individual risk prediction in SCD patients. These markers deserve additional validation in future prospective trials. Disclosures van Wijk: Agios Pharmaceuticals: Consultancy, Research Funding; RR Mechatronics: Research Funding. Minniti: Doris Duke Foundation: Research Funding. Kato: Novartis, Global Blood Therapeutics: Consultancy, Research Funding; Bayer: Research Funding. van Beers: Novartis: Consultancy, Research Funding; Pfizer: Research Funding; RR Mechatronics: Research Funding; Agios Pharmaceuticals, Inc.: Membership on an entity's Board of Directors or advisory committees, Research Funding.
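The correlation and group-comparison steps reported above are plain scipy calls; the arrays below are synthetic stand-ins, not cohort data:

```python
import numpy as np
from scipy.stats import mannwhitneyu, pearsonr

rng = np.random.default_rng(1)
hbco = rng.normal(1.5, 0.4, 100)              # % carboxyhemoglobin
eaco = 2.0 * hbco + rng.normal(0, 0.5, 100)   # ppm, correlated by design
r, p = pearsonr(hbco, eaco)
print(f"HbCO vs EACO: r = {r:.2f}, p = {p:.3g}")

# LDH/HbCO ratio in high-TRV (>=3.0 m/s) vs. lower-TRV patients.
ratio_high = rng.normal(1600, 300, 13)
ratio_low = rng.normal(1100, 300, 72)
stat, p = mannwhitneyu(ratio_high, ratio_low, alternative="greater")
print(f"Mann-Whitney U: p = {p:.3g}")
```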


2016 ◽  
Vol 34 (4) ◽  
pp. 337-344 ◽  
Author(s):  
Caroline E. Weibull ◽  
Sandra Eloranta ◽  
Karin E. Smedby ◽  
Magnus Björkholm ◽  
Sigurdur Y. Kristinsson ◽  
...  

Purpose Many patients and clinicians are worried that pregnancy after the diagnosis of Hodgkin lymphoma (HL) may increase the risk of relapse despite a lack of empirical evidence to support such concerns. We investigated if an association exists between pregnancy and relapse in women with a diagnosis of HL. Materials and Methods Using Swedish healthcare registers combined with medical records, we included 449 women who received a diagnosis of HL between 1992 and 2009 and who were age 18 to 40 years at diagnosis. Follow-up started 6 months after diagnosis, when the patients' condition was assumed to be in remission. Pregnancy-associated relapse was defined as a relapse during pregnancy or within 5 years after delivery. Hazard ratios (HRs) with 95% CIs were estimated by using the Cox proportional hazards model. Results Among the 449 women, 144 (32%) became pregnant during follow-up. Overall, 47 relapses were recorded, of which one was a pregnancy-associated relapse. The adjusted HR for the comparison of the pregnancy-associated relapse rate to the non–pregnancy-associated relapse rate was 0.29 (95% CI, 0.04 to 2.18). The expected number of relapses in women with a recent pregnancy, given that they would experience the same relapse rate as that of women without a recent pregnancy, was 3.76; the observed-to-expected ratio was 0.27 (95% exact CI, 0.01 to 1.51). Conclusion We found no evidence that a pregnancy after diagnosis increases the relapse rate among women whose HL is in remission. Survivors of HL need to consider a range of factors when deciding about future reproduction. However, given the results of this study, the risk of pregnancy-associated relapse does not need to be considered.
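The observed-to-expected ratio and its exact CI can be reconstructed from the two numbers quoted in the abstract; a sketch using the standard exact Poisson (Garwood) interval, which lands close to, though not exactly on, the published 0.01 to 1.51 (small differences can arise from the exact method used):

```python
from scipy.stats import chi2

observed, expected = 1, 3.76
oe = observed / expected  # ~0.27, as reported

# Exact 95% CI for a Poisson count, divided by the expected count.
lower = chi2.ppf(0.025, 2 * observed) / 2 / expected
upper = chi2.ppf(0.975, 2 * (observed + 1)) / 2 / expected
print(f"O/E = {oe:.2f} (95% exact CI {lower:.2f}-{upper:.2f})")
```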


Blood ◽  
2009 ◽  
Vol 114 (22) ◽  
pp. 3448-3448
Author(s):  
Nicolas Batty ◽  
Xavier C. Badoux ◽  
Michael Keating ◽  
E. Lin ◽  
Susan Lerner ◽  
...  

Abstract 3448, Poster Board III-336. Introduction Patients (pts) with CLL have more than twice the risk of developing second malignancies [1]. In a retrospective analysis, frontline therapy with FCR was the strongest independent determinant of survival compared with FC in patients with CLL [2]. This study aims to identify pretreatment factors that may be associated with the development of 2nd malignancies in patients with CLL treated with FCR as initial therapy. Methods Our analysis includes pts with CLL treated between July 1999 and November 2003 on a Phase II trial of FCR as initial therapy. Patients who had developed a 2nd malignancy prior to initiation of therapy were excluded. Patients were divided into 2 groups according to whether they developed a 2nd malignancy during the follow-up period. Time to 2nd malignancy was defined as the time from treatment to the first occurrence of a 2nd malignancy. Chromosomal abnormalities were detected by metaphase karyotype of bone marrow leukemia cells. Patient characteristics, response to FCR, and overall survival were compared between the two groups using the Wilcoxon rank-sum test for continuous variables and chi-square tests for categorical variables; the Kaplan-Meier method was used to generate survival curves, and the log-rank test was used to assess differences in survival between subgroups. Responses were assessed by 1996 NCI-WG criteria after completion of treatment. Univariate and multivariate Cox proportional hazards models were fitted to assess the association between pts' characteristics and second malignancy-free survival. Results Among 300 pts with CLL treated with frontline FCR, 46 had a 2nd cancer diagnosed prior to FCR and were therefore excluded from this analysis, leaving 254 pts (85%). With a median follow-up of 76 months, 58 pts (23%) have developed a 2nd malignancy. These included hematological malignancies (n=20), non-melanoma skin cancer (n=18) and solid tumors (n=15); 5 patients had more than 1 type of malignancy. Pts who developed a 2nd malignancy had a significantly higher pretreatment percentage of lymphocytes in the bone marrow (p=0.04), were less likely to have an enlarged spleen (p=0.024), and were more likely to have deletion of or abnormal chromosome 17 (p=0.008). Overall survival and responses to treatment did not differ between the 2 groups of pts. In the Cox proportional hazards model, abnormalities of chromosomes 17 and 13 were statistically significantly associated with a shorter time to 2nd malignancy: HR, 9.79 (95% CI, 2.84-33.82), p=0.0003 and HR, 4.019 (95% CI, 1.41-11.42), p=0.009, respectively. Conclusion Chromosome 17 and 13 abnormalities identified by standard metaphase karyotype analysis were more common in patients with CLL who developed a 2nd malignancy after FCR therapy. Response rates and overall survival did not differ between patients with CLL with or without a 2nd malignancy after frontline therapy with FCR. Disclosures No relevant conflicts of interest to declare.
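The univariate screening step ("Univariate and multivariate Cox proportional hazards models were fitted") amounts to looping single-covariate fits and collecting the hazard ratios; a sketch with hypothetical file and column names:

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("fcr_frontline_cohort.csv")  # hypothetical
candidates = ["marrow_lymph_pct", "spleen_enlarged",
              "chr17_abnormal", "chr13_abnormal"]

rows = []
for var in candidates:
    cph = CoxPHFitter().fit(
        df[[var, "months_to_2nd_malignancy", "second_malignancy"]],
        duration_col="months_to_2nd_malignancy",
        event_col="second_malignancy")
    s = cph.summary.loc[var]
    rows.append((var, s["exp(coef)"], s["exp(coef) lower 95%"],
                 s["exp(coef) upper 95%"], s["p"]))
print(pd.DataFrame(rows, columns=["variable", "HR", "lo95", "hi95", "p"]))
```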


2020 ◽  
Vol 14 (Supplement_1) ◽  
pp. S316-S317
Author(s):  
S Mahmmod ◽  
J P D Schultheiss ◽  
A C I T L Tan ◽  
M W M D Lutgens ◽  
L P L Gilissen ◽  
...  

Abstract Background In current clinical practice, patients with inflammatory bowel disease (IBD) treated with originator infliximab (IFX) have been or are being switched to biosimilar infliximab (CT-P13) because of lower costs and seemingly similar effectiveness of biosimilars. Over one year of follow-up, 7%-26% of patients discontinue CT-P13 treatment. Common reasons for CT-P13 discontinuation are (subjective) loss of response or adverse events. As a result of these newly experienced symptoms, patients are occasionally switched back to treatment with originator IFX. However, little is currently known about reverse switching to originator IFX. We aimed to assess the prevalence of, and the specific reasons for, reverse switching to originator infliximab within 52 weeks after an initial conversion from originator infliximab to CT-P13 in patients with IBD. Additionally, we evaluated whether reinitiating originator infliximab led to the desired favourable effect and sustained infliximab use. Methods In this retrospective, multicentre cohort study, data of IBD patients from eight hospitals in the Netherlands were collected. Adult patients with IBD were eligible for inclusion if they were switched from originator infliximab to CT-P13 and had a follow-up time of at least 52 weeks after the initial conversion. Reasons for re-switching were categorised into adverse events or loss of response on the biosimilar. Drug survival was analysed with a time-varying Cox proportional hazards model. Results A total of 684 patients with IBD were switched (516 Crohn's disease, 168 ulcerative colitis). Reverse switching was seen in 74 (10.8%) patients after a median of 140 days (IQR 86–231). Reverse switchers were more often female (70.3% vs. 49.7%, p < 0.001) and had a shorter duration of prior originator treatment (4.0 [IQR 2.6–6.5] vs. 5.2 [IQR 3.0–7.5] years, p = 0.037) than those who stayed on CT-P13. A total of 105 symptoms prompting reverse switching were reported; IBD-like symptoms (25.7%) and dermatological symptoms (21.9%) were the most common. Four patients had objectified loss of response; all regained response after switching back. IBD-like symptoms and dermatological symptoms were reversible in 74.1% and 87% of cases, respectively; overall reversibility of symptoms was 73.3%. A Cox proportional hazards model with CT-P13/originator infliximab as a time-varying covariate yielded no difference in drug-survival risk (hazard ratio 1.23, 95% CI 0.65–2.33). Conclusion Switching back to originator infliximab seems effective in patients with IBD who experience adverse effects or loss of response after switching from originator to CT-P13. Switching patients back to originator infliximab might therefore be justified when patients experience new side effects or loss of response after switching to CT-P13.
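The time-varying drug-survival analysis can be sketched with lifelines' timeline helpers: reshape each patient's follow-up into start/stop intervals, then flag which product the patient is on in each interval. File and column names are assumptions:

```python
import pandas as pd
from lifelines import CoxTimeVaryingFitter
from lifelines.utils import add_covariate_to_timeline, to_long_format

base = pd.read_csv("ibd_switch_cohort.csv")     # id, weeks, discontinued
# One row per covariate change; include a week-0 baseline row per patient
# (on_originator = 0 after the initial switch to CT-P13).
switches = pd.read_csv("reverse_switches.csv")  # id, week, on_originator

long_df = to_long_format(base, duration_col="weeks")
long_df = add_covariate_to_timeline(long_df, switches, id_col="id",
                                    duration_col="week",
                                    event_col="discontinued")

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="discontinued",
        start_col="start", stop_col="stop")
ctv.print_summary()  # HR for on_originator exposure over time
```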

