Relationship Between Chelation and Clinical Outcomes in Lower-Risk Patients with Myelodysplastic Syndrome (MDS): Registry Analysis at 5 Years

Blood ◽  
2014 ◽  
Vol 124 (21) ◽  
pp. 1350-1350 ◽  
Author(s):  
Roger M. Lyons ◽  
Billie J. Marek ◽  
Carole Paley ◽  
Jason Esposito ◽  
Katie McNamara ◽  
...  

Abstract Introduction: We prospectively collected data from lower-risk patients (pts) with MDS in an ongoing US registry in order to assess the association between chelation and clinical outcomes. In addition, we evaluated the association between chelation and overall survival (OS). Here we report outcomes at 5 years. Methods: The registry enrolled 600 pts from 107 US centers. Pts were ≥18 years old with lower-risk MDS (WHO, FAB, and/or IPSS criteria) and transfusional iron overload (serum ferritin ≥1000 µg/L and/or ≥20 packed red blood cell units and/or ≥6 units every 12 weeks). Pts were analyzed by iron chelation status, i.e., those who had never been chelated vs those who had ever used iron chelation, plus a subgroup of the latter group (pts with ≥6 mo of chelation). Pts were evaluated every 6 mo for 5 years or until death with respect to characteristics, survival, disease status, comorbidities, cause of death, and MDS therapy. Results: 600 pts (median age, 76 years [range, 21-99]; 346 [57.8%] male; 519 [86.6%] Caucasian) were evaluated. IPSS status was similar across chelation groups. Chelated pts (n=271) had a greater median number of lifetime units transfused at the time of enrollment vs nonchelated pts (n=328): 38.5 vs 20.0. At baseline, cardiac and vascular comorbidities (CVC) were significantly more frequent in nonchelated vs chelated pts (52.4% vs 34.3% [P<0.0001] and 59.8% vs 48.0% [P=0.0039], respectively). Endocrine comorbidities (EC) were numerically more frequent in nonchelated vs ≥6-mo chelated pts (44.2% vs 35.6%). As of May 1, 2014, 61 pts continue in the registry; 538 discontinued (400 died, 66 lost to follow-up, 46 completed the study, and 26 discontinued for other reasons). Of the 271 chelated pts, 187 (69.0%) were chelated with deferasirox, 40 (14.8%) with deferasirox and deferoxamine, 32 (11.8%) with deferoxamine, and 1 (0.4%) with an unknown chelator; in 11 (4.1%), the chelator name was not provided. 
Cumulative duration of chelation was 18.9 mo in pts who had ever used iron chelation and 27.0 mo in pts with ≥6 mo of iron chelation. OS from diagnosis of MDS and time to acute myeloid leukemia (AML) were significantly longer in chelated vs nonchelated pts (P<0.0001 for both). In pts with CVC, median OS was also significantly longer in chelated vs nonchelated pts (67.66 vs 43.40 mo; P<0.0001). In pts with EC, median OS was likewise longer in chelated pts (74.98 vs 44.63 mo; P<0.0001) (Table). Pts with ≥6 mo of chelation had numerically fewer deaths in the registry and numerically longer OS, time to death, and time to AML transformation vs pts with any chelation (Table). Conclusions: Limitations of these analyses include variation in time from diagnosis, duration of chelation, the impact of pt clinical status on the decision to chelate, and the optional conduct of clinical assessments. Nonetheless, the results after 5 years of follow-up of lower-risk pts with MDS suggest that iron chelation therapy is associated with improved OS and a longer time to AML transformation. Causation has not been established. Abstract 1350. TABLE. 
Characteristics of Patients: Nonchelated, Chelated, and Chelated ≥6 Months

                                          Nonchelated (n=328) | Chelated (n=271)              | Chelated ≥6 Months (n=202)
Time to death, median (min/max) mo        47.8 (43.4, 53.1)   | 88.0 (78.4, 103.0) *P<0.0001  | 100.0 (83.4, 118.2) *P<0.0001
Deaths, n (%)                             239 (72.9)          | 161 (59.4) *P=0.0005          | 115 (56.9) *P=0.0002
Median OS (mo): no CVC                    34.0 (n=42)         | 69.3 (n=72)                   | 79.3 (n=60)
Median OS (mo): with CVC                  43.4 (n=286)        | 67.7 (n=199) *P<0.0001        | 72.6 (n=142) *P<0.0001
Median OS (mo): no EC                     38.5 (n=162)        | 67.1 (n=149)                  | 69.6 (n=114)
Median OS (mo): with EC                   44.6 (n=166)        | 75.0 (n=122) *P<0.0001        | 81.8 (n=88) *P<0.0001
Time to AML transformation from
  diagnosis, median (min, max) mo         46.4 (6.9, 82.5)    | 72.1 (16.4, 176.6) *P<0.0001  | 78.8 (16.4, 176.6) *P<0.0001
AML transformation, n (%)                 34 (10.4)           | 17 (6.3)                      | 14 (6.9)
Cause of death, n (%):
  MDS/AML                                 103 (31.4)          | 73 (26.9)                     | 53 (26.2)
  Cardiac                                 36 (11.0)           | 21 (7.7)                      | 15 (7.4)
  Infection                               27 (8.2)            | 14 (5.2)                      | 14 (6.9)
  Other                                   16 (4.9)            | 16 (5.9)                      | 10 (5.0)
  Unknown                                 29 (8.8)            | 18 (6.6)                      | 12 (5.9)
  Malignancy                              14 (4.3)            | 2 (0.7)                       | 0 (0.0)
  Respiratory                             7 (2.1)             | 7 (2.6)                       | 4 (2.0)
  Multiorgan failure                      3 (0.9)             | 3 (1.1)                       | 3 (1.5)
  CVA                                     1 (0.3)             | 5 (1.8)                       | 3 (1.5)
  GvHD/transplant                         3 (0.9)             | 2 (0.7)                       | 1 (0.5)

CVC, cardiovascular comorbidity; EC, endocrine comorbidity; CVA, cerebrovascular accident; GvHD, graft-vs-host disease. *Versus nonchelated.

Disclosures Paley: Novartis Pharma: Employment. Esposito: Novartis Pharma: Employment. McNamara: Novartis Pharmaceuticals Corporation: Employment. Garcia-Manero: Novartis Pharma: Research Funding.
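The OS and time-to-event comparisons above rest on standard survival-curve estimation. As an illustrative sketch only (this is not the registry's analysis code, and the follow-up times below are invented), a minimal Kaplan-Meier estimator can be written in plain Python:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.

    times  -- follow-up time for each patient (e.g., months)
    events -- 1 if death was observed at that time, 0 if censored
    Returns [(event_time, survival_probability), ...] at each death time.
    """
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = removed = 0
        # group all subjects leaving the risk set at this time point
        while i < len(data) and data[i][0] == t:
            deaths += data[i][1]
            removed += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
    return curve

# Hypothetical follow-up data: months, with death indicator (1) or censoring (0)
curve = kaplan_meier([2, 3, 3, 5, 8], [1, 0, 1, 1, 0])
```

The median OS values in the table correspond to the first time the estimated curve drops to 0.5 or below.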

Blood ◽  
2011 ◽  
Vol 118 (21) ◽  
pp. 2800-2800 ◽  
Author(s):  
Roger M. Lyons ◽  
Billie J. Marek ◽  
Surabhi Sharma ◽  
Carole Paley ◽  
Jason Esposito ◽  
...  

Abstract Abstract 2800 Introduction: Many patients with MDS require regular transfusions. Several reviews have documented poorer clinical outcomes and overall survival (OS) in transfusion-dependent MDS patients. A US registry of 600 lower-risk MDS patients prospectively collected data on clinical outcomes in chelated and non-chelated transfused patients. This 24-month interim analysis reports on cardiac events, leukemic transformation and OS. Methods: This is a 5-year, non-interventional registry in MDS patients (aged ≥18 years) with lower-risk MDS (based on WHO, FAB and/or IPSS criteria) from 107 US centers. Patients had to have transfusional iron overload (serum ferritin ≥1000 μg/L and/or ≥20 packed red blood cell units and/or ongoing transfusion requirement of ≥6 units every 12 weeks). Follow-up was every 6 months for up to 60 months or death. Use of chelation therapy was not required. Chelated patients were those who had ever used iron chelation; a sub-analysis was done on patients with ≥6 months chelation. Assessments included demographics, disease status, MDS therapy, comorbidities, and causes of death. Differences between non-chelated and chelated patients are reported. Results: 600 patients enrolled; as of May 26, 2011, 249 continued in the registry. 351 patients discontinued due to: lost to follow-up (n=51, 8.5%); death (n=278, 46.3%); other (n=22, 3.7%). 263/600 patients received chelation therapy, of whom 191 received ≥6 months. Leukemic transformation and cardiac events were more common in non-chelated patients (Table 2). Time to leukemic transformation was significantly shorter in non-chelated versus chelated patients. A greater percentage of deaths occurred in non-chelated patients; time to death was significantly shorter in non-chelated versus chelated patients. The most frequent reasons for death were MDS/AML, cardiac, and infection. At baseline, non-chelated patients had a higher incidence of cardiac disorders than chelated patients (51.3% vs 35%). 
While on the registry, non-chelated patients had a higher incidence of comorbidities than did chelated patients, predominantly vascular, cardiac, and endocrine. Lifetime use of MDS therapies (pre- and on-registry) was lower among non-chelated versus chelated patients (88.4% vs 94.3%). Conclusions: At the 24-month analysis, use of chelation was associated with lower AML transformation, fewer cardiac events, and better OS. The two patient groups had similar age, gender, and risk status breakdown (IPSS); however, the non-chelated group had a higher prevalence of cardiac comorbidities. Ongoing follow-up for the 5-year duration of this registry will provide further data on differences in outcomes between chelated and non-chelated patients. Disclosures: Lyons: Amgen: Consultancy, Honoraria, Research Funding; Celgene: Consultancy, Honoraria, Research Funding; Incyte: Research Funding; Telik: Research Funding; Alexion: Consultancy, Honoraria; Novartis: Research Funding. Sharma: Novartis: Employment. Paley: Novartis: Employment. Esposito: Novartis: Employment.


2021 ◽  
Vol 10 (22) ◽  
pp. 5355
Author(s):  
Gabby Elbaz-Greener ◽  
Shemy Carasso ◽  
Elad Maor ◽  
Lior Gallimidi ◽  
Merav Yarkoni ◽  
...  

(1) Introduction: Most studies rely on in-hospital data to predict cardiovascular risk and do not include prehospital information that is substantially important for early decision making. The aim of the study was to define clinical parameters in the prehospital setting which may affect clinical outcomes. (2) Methods: In this population-based study, we performed a retrospective analysis of emergency calls made to the largest private emergency medical services (EMS) provider in Israel, SHL Telemedicine Ltd., by patients who were treated on-site by the EMS team. Demographics, clinical characteristics, and clinical outcomes were analyzed. Mortality was evaluated at three time points: 1, 3, and 12 months' follow-up. The first EMS prehospital measurements of systolic blood pressure (SBP) were recorded and analyzed. Logistic regression analyses were performed. (3) Results: A total of 64,320 emergency calls were included, with a follow-up of 12 months post index EMS call. Fifty-five percent of patients were men, and the mean age was 70.2 ± 13.1 years. During the 12-month follow-up, 7.6% of patients died. Age above 80 years (OR 3.34; 95% CI 3.03–3.69, p < 0.005), first EMS SBP ≤ 130 mm Hg (OR 2.61; 95% CI 2.36–2.88, p < 0.005), dyspnea at presentation (OR 2.55; 95% CI 2.29–2.83, p < 0.001), and chest pain with ischemic ECG changes (OR 1.95; 95% CI 1.71–2.23, p < 0.001) were the strongest predictors of 1-month mortality and remained so for mortality at 3 and 12 months. In contrast, a history of hypertension and a first EMS prehospital SBP ≥ 160 mm Hg were significantly associated with decreased mortality at 1, 3, and 12 months. (4) Conclusions: We identified risk predictors for all-cause mortality in a large cohort of patients during prehospital EMS calls. Age over 80 years, first EMS-documented prehospital SBP ≤ 130 mm Hg, and dyspnea at presentation were the most profound risk predictors for short- and long-term mortality. 
The current study demonstrates that in prehospital EMS call settings, several parameters can be used to improve prioritization and management of high-risk patients.
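The abstract's odds ratios come from logistic regression; for a single binary predictor, the logistic-regression odds ratio coincides with the cross-product ratio of the 2×2 table. A minimal sketch of that calculation, with a Woolf (log-scale) confidence interval and invented counts that are not the study's data:

```python
from math import exp, log, sqrt

def odds_ratio_2x2(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table with a Woolf (log) confidence interval.

    a -- exposed, event        b -- exposed, no event
    c -- unexposed, event      d -- unexposed, no event
    """
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = exp(log(or_) - z * se)
    hi = exp(log(or_) + z * se)
    return or_, lo, hi

# Invented counts: deaths among patients > 80 vs <= 80 years old
or_, lo, hi = odds_ratio_2x2(30, 70, 10, 90)
```

An interval that excludes 1 (as here, since `lo > 1`) corresponds to a statistically significant association at the chosen z level.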


2020 ◽  
Vol 11 ◽  
Author(s):  
Fabio Cofano ◽  
Carlotta Giambra ◽  
Paolo Costa ◽  
Pietro Zeppa ◽  
Andrea Bianconi ◽  
...  

Objective: Intradural extramedullary (IDEM) tumors are usually treated with surgical excision. The aim of this study was to investigate the impact on clinical outcomes of pre-surgical clinical conditions, intraoperative neurophysiological monitoring (IONM), surgical access to the spinal canal, histology, degree of resection, and intra/postoperative complications. Methods: This is a retrospective observational study analyzing data from patients suffering from IDEM tumors who underwent surgical treatment over a 12-year period in a double-center experience. Data were extracted from a prospectively maintained database and included: sex, age at diagnosis, clinical status according to the modified McCormick Scale (Grades I-V) at admission, discharge, and follow-up, tumor histology, type of surgical access to the spinal canal (bilateral laminectomy vs. monolateral laminectomy vs. laminoplasty), degree of surgical removal, use and type of IONM, occurrence and type of intraoperative complications, use of ultrasonic aspirator (CUSA), and radiological follow-up. Results: A total of 249 patients were included, with a mean follow-up of 48.3 months. Gross total resection was achieved in 210 patients (84.3%), mostly in schwannomas (45.2%) and meningiomas (40.4%). IONM was performed in 162 procedures (65%), and the D-wave was recorded in 64.2% of all cervical and thoracic locations (99 patients). The linear regression diagram for McCormick grades before and after surgery (follow-up) showed a correlation between preoperative and postoperative clinical status. A statistically significant association was found between the use of IONM and the absence of clinical worsening at follow-up (p = 0.01), but not at discharge. No associations were found between the choice of surgical approach and the extent of resection (p = 0.79), the presence of recurrence or residual tumor (p = 0.14), or CSF leakage (p = 0.25). The extent of resection was not associated with the use of IONM (p = 0.91) or CUSA (p = 0.19). Conclusion: A reliable prediction of clinical improvement could be made based on pre-operative clinical status. The use of IONM was associated with better clinical outcomes at follow-up (but not at discharge), and no association was found with the extent of resection. Minimally invasive approaches such as monolateral laminectomy proved effective and were not associated with worse outcomes or increased complications.


2020 ◽  
Vol 41 (Supplement_1) ◽  
Author(s):  
W Sun ◽  
B P Y Yan

Abstract Background We have previously demonstrated that unselected screening for atrial fibrillation (AF) in patients ≥65 years old in an out-patient setting yielded 1-2% new AF each time screen-negative patients underwent repeated screening at a 12- to 18-month interval. Selection criteria to identify high-risk patients for repeated AF screening may be more efficient than repeat screening of all patients. Aims This study aimed to validate the CHA2DS2VASC score as a predictive model to select the target population for repeat AF screening. Methods 17,745 consecutive patients underwent 24,363 index AF screenings (26.9% of patients underwent repeated screening) using a handheld single-lead ECG (AliveCor) from Dec 2014 to Dec 2017 (NCT02409654). Adverse clinical outcomes to be predicted included (i) new AF detected by repeated screening; (ii) new AF clinically diagnosed during follow-up; and (iii) ischemic stroke/transient ischemic attack (TIA) during follow-up. Performance evaluation and validation of the CHA2DS2VASC score as a prediction model were based on 15,732 subjects, 35,643 person-years of follow-up, and 765 outcomes. Internal validation was conducted by k-fold cross-validation (k = n = 15,732, i.e., leave-one-out cross-validation). Performance measures included the c-index for discriminatory ability and decision curve analysis for clinical utility. Risk groups were defined as CHA2DS2VASC scores of ≤1, 2-3, or ≥4. Calibration was assessed by comparing proportions of actual observed events. Results The CHA2DS2VASC score achieved acceptable discrimination, with a c-index of 0.762 (95%CI: 0.746-0.777) for derivation and 0.703 for cross-validation. Decision curve analysis showed that using CHA2DS2VASC to select patients for rescreening was superior to rescreening all or no patients in terms of net benefit across all reasonable threshold probabilities (Figure 1, left). 
Predicted and observed probabilities of adverse clinical outcomes increased progressively with increasing CHA2DS2VASC score (Figure 1, right): 0.7% outcome events in the low-risk group (CHA2DS2VASC ≤1, predicted prob. ≤0.86%), 3.5% in the intermediate-risk group (CHA2DS2VASC 2-3, predicted prob. 2.62%-4.43%), and 11.3% in the high-risk group (CHA2DS2VASC ≥4, predicted prob. ≥8.50%). The odds ratios for outcome events were 4.88 (95%CI: 3.43-6.96) for the intermediate- versus low-risk group and 17.37 (95%CI: 12.36-24.42) for the high- versus low-risk group. Conclusion Repeat AF screening of a high-risk population may be more efficient than rescreening all screen-negative individuals. The CHA2DS2VASC score may be used as a selection tool to identify high-risk patients to undergo repeat AF screening. Abstract P9 Figure 1
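The CHA2DS2-VASc score itself is a simple additive rule (the standard point assignments), and the abstract's risk strata are fixed cut points on it. A minimal sketch of the scoring and grouping:

```python
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 stroke_or_tia, vascular_disease):
    """CHA2DS2-VASc score with the standard point assignments:
    CHF 1, hypertension 1, age >=75 2 (65-74: 1), diabetes 1,
    prior stroke/TIA 2, vascular disease 1, female sex 1."""
    score = 2 if age >= 75 else (1 if 65 <= age < 75 else 0)
    score += 2 if stroke_or_tia else 0
    score += sum(map(bool, (female, chf, hypertension,
                            diabetes, vascular_disease)))
    return score

def risk_group(score):
    """Risk strata used in the abstract: <=1 low, 2-3 intermediate, >=4 high."""
    if score <= 1:
        return "low"
    return "intermediate" if score <= 3 else "high"

# e.g., an 80-year-old woman with hypertension: 2 (age) + 1 (sex) + 1 (HTN) = 4
group = risk_group(cha2ds2_vasc(80, True, False, True, False, False, False))
```

Under the abstract's rule, such a patient lands in the high-risk (≥4) stratum and would be prioritized for rescreening.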


2017 ◽  
Vol 35 (15_suppl) ◽  
pp. 7053-7053 ◽  
Author(s):  
Guillermo Montalban-Bravo ◽  
Ana Alfonso Pierola ◽  
Koichi Takahashi ◽  
Marina Konopleva ◽  
Elias Jabbour ◽  
...  

7053 Background: Clinical outcomes of patients with myelodysplastic syndromes (MDS) and myelodysplastic/myeloproliferative neoplasms (MDS/MPN) are heterogeneous. Specific mutations and mutation patterns are known to define prognostic groups in normal-karyotype acute myeloid leukemia. Whether this is the case in MDS and MDS/MPN remains unknown. Methods: We evaluated 325 previously untreated patients with MDS or MDS/MPN with normal karyotype evaluated from 2012 to 2016. Next-generation sequencing (NGS) of whole bone marrow DNA analyzing a panel of 28 or 53 genes was performed at the time of diagnosis. Results: A total of 225 (69%) patients had MDS and 100 (31%) had MDS/MPN, including 77 (24%) patients with chronic myelomonocytic leukemia (CMML). Median age was 69 years (31-92). Among patients with MDS, 189 (84%) had lower-risk and 36 (16%) had higher-risk disease based on IPSS. NGS data were obtained by the 53-gene panel in 93 (29%) patients and by the 28-gene panel in 232 (71%). A total of 202 (62%) patients had detectable mutations. Median number of mutations was 1 (range 0-6). Detected mutations are detailed in Table 1. A total of 111 (34%) patients, 70 with MDS and 41 with MDS/MPN, received therapy with hypomethylating agents. Median follow-up was 12 months (0-167). By univariate analysis, NRAS (HR 3.28, CI 1.25-8.62, p=0.016) and TP53 (HR 4.9, CI 1.44-16.67, p=0.011) predicted shorter overall survival (OS) among MDS patients. After multivariate analysis including IPSS-R, only TP53 retained its impact on OS (HR 5.25, CI 1.44-19.13, p=0.012). Among MDS/MPN patients, no mutation was found to significantly impact OS. Conclusions: With the exception of TP53 mutations, no other identified mutation seemed to independently define the prognosis of patients with MDS or MDS/MPN with normal karyotype. In view of the high proportion of lower-risk patients, longer follow-up is required to better define the prognostic impact of mutations in this population. [Table: see text]


2017 ◽  
Vol 56 ◽  
pp. 88-95 ◽  
Author(s):  
Roger M. Lyons ◽  
Billie J. Marek ◽  
Carole Paley ◽  
Jason Esposito ◽  
Katie McNamara ◽  
...  

2021 ◽  
pp. 036354652110420
Author(s):  
Zachariah Gene Wing Ow ◽  
Chin Kai Cheong ◽  
Hao Han Hai ◽  
Cheng Han Ng ◽  
Dean Wang ◽  
...  

Background: Meniscal allograft transplant (MAT) is an important treatment option for young patients with deficient menisci; however, there is a lack of consensus on the optimal method of allograft fixation. Hypothesis: The various methods of MAT fixation have measurable and significant differences in outcomes. Study Design: Meta-analysis; Level of evidence, 4. Methods: A single-arm meta-analysis of studies reporting graft failure, reoperations, and other clinical outcomes after MAT was performed. Studies were stratified by suture-only, bone plug, and bone bridge fixation methods. Proportionate rates of failure and reoperation for each fixation technique were pooled with a mixed-effects model, after which reconstruction of relative risks with confidence intervals was performed using the Katz logarithmic method. Results: A total of 2604 patients underwent MAT. Weighted mean follow-up was 4.3 years (95% CI, 3.2-5.6 years). During this follow-up period, graft failure rates were 6.2% (95% CI, 3.2%-11.6%) for bone plug fixation, 6.9% (95% CI, 4.5%-10.3%) for suture-only fixation, and 9.3% (95% CI, 6.2%-13.9%) for bone bridge fixation. Transplanted menisci secured using bone plugs displayed a lower risk of failure compared with menisci secured via bone bridges (RR = 0.97; 95% CI, 0.94-0.99; P = .02). Risks of failure were not significantly different when comparing suture fixation to bone bridge (RR = 1.02; 95% CI, 0.99-1.06; P = .12) and bone plugs (RR = 0.99; 95% CI, 0.96-1.02; P = .64). Allografts secured using bone plugs were at a lower risk of requiring reoperations compared with those secured using sutures (RR = 0.91; 95% CI, 0.87-0.95; P < .001), whereas allografts secured using bone bridges had a higher risk of reoperation when compared with those secured using either sutures (RR = 1.28; 95% CI, 1.19-1.38; P < .001) or bone plugs (RR = 1.41; 95% CI, 1.32-1.51; P < .001). 
Improvements in Lysholm and International Knee Documentation Committee scores were comparable among the different groups. Conclusion: This meta-analysis demonstrates that bone plug fixation of transplanted meniscal allografts carries a lower risk of failure than the bone bridge method and has a lower risk of requiring subsequent operations than both suture-only and bone bridge methods of fixation. This suggests that the technique used in the fixation of a transplanted meniscal allograft is an important factor in the clinical outcomes of patients receiving MATs.
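The Methods name the Katz logarithmic method for reconstructing relative risks with confidence intervals: the interval is built on the log scale and exponentiated back. A minimal sketch with invented counts (not the meta-analysis data):

```python
from math import exp, log, sqrt

def katz_rr(a, n1, c, n2, z=1.96):
    """Relative risk with a Katz logarithmic-method confidence interval.

    a of n1 events in group 1; c of n2 events in group 2.
    """
    rr = (a / n1) / (c / n2)
    # SE of log(RR) under the Katz approximation
    se = sqrt(1 / a - 1 / n1 + 1 / c - 1 / n2)
    return rr, exp(log(rr) - z * se), exp(log(rr) + z * se)

# Invented example: 10/100 graft failures with one fixation vs 20/100 with another
rr, lo, hi = katz_rr(10, 100, 20, 100)
```

In this invented example the interval spans 1, so the halved failure rate would not be statistically significant at these sample sizes; the pooled comparisons in the abstract reach significance because of their much larger denominators.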


2017 ◽  
Vol 6 (7) ◽  
pp. 522-527 ◽  
Author(s):  
Danuta Gąsior-Perczak ◽  
Iwona Pałyga ◽  
Monika Szymonek ◽  
Artur Kowalik ◽  
Agnieszka Walczyk ◽  
...  

Purpose The delayed risk stratification (DRS) system of Momesso and coworkers was accepted by the American Thyroid Association as a diagnostic tool for the risk stratification of unfavorable clinical outcomes and for monitoring the clinical outcomes of differentiated thyroid cancer (DTC) patients treated without radioactive iodine (RAI). The aim of this study was to evaluate the DRS system in patients with pT1aN0/Nx stage disease. Methods The study included 304 low-risk patients after thyroidectomy (n = 202) or lobectomy (n = 102) without RAI who were treated at a single center. The median age was 50.5 years, 91.1% were women, and the median follow-up was 4 years. DRS of the treatment response was performed based on medical records and according to the criteria of Momesso and coworkers. Disease course (recurrence, death) and status (remission, persistent disease) as of December 31, 2016 were evaluated, as was the relationship between unfavorable outcomes and the DRS system. Results Response to initial therapy was excellent in 272 patients (89.5%), indeterminate in 31 (10.2%), and biochemically incomplete (increased TgAb levels) in one (0.3%). Two patients in the excellent response group experienced recurrence at 6 and 7 years of follow-up (after lobectomy). None of the patients with indeterminate or biochemically incomplete responses developed structural disease, and none of the patients died during follow-up. Conclusions The DRS system was not useful for predicting the risk of unfavorable clinical outcomes and cannot be used to personalize disease monitoring in patients at pT1aN0/Nx stage who are not treated with RAI.


Neurosurgery ◽  
2000 ◽  
Vol 47 (6) ◽  
pp. 1332-1342 ◽  
Author(s):  
Satoshi Tateshima ◽  
Yuichi Murayama ◽  
Y. Pierre Gobin ◽  
Gary R. Duckwiler ◽  
Guido Guglielmi ◽  
...  

ABSTRACT OBJECTIVE Seventy-three consecutive patients with 75 basilar tip aneurysms were treated with Guglielmi detachable coil (GDC) technology. Their anatomic and clinical outcomes are discussed. METHODS Seventy-five basilar tip aneurysms were treated with the GDC system at the University of California, Los Angeles Medical Center from 1990 to 1999. The average age of the population was 48.3 years (range, 28–82 yr). Forty-two patients (57.5%) presented with acute subarachnoid hemorrhage, 8 patients (10.9%) had unruptured aneurysms with mass effect, and 23 patients (31.5%) had incidental aneurysms. Thirty-one aneurysms (41.3%) were small with a small neck, 18 (24%) were small with a wide neck, 16 (21.3%) were large, and 10 (13.3%) were giant aneurysms. RESULTS Immediate anatomic outcomes demonstrated complete or near-complete occlusion in 64 aneurysms (85.3%) and incomplete occlusion in 7 aneurysms (9.3%). Four aneurysms (5.3%) could not be embolized because of anatomic difficulties. Of the 69 patients treated with GDCs, 63 patients (91.3%) remained neurologically intact or unchanged from their initial clinical status. Procedure-related morbidity and mortality were 4.1% and 1.4%, respectively. Long-term follow-up angiograms were obtained in 41 patients with 42 aneurysms. Thirty aneurysms (71.4%) demonstrated complete or near-complete occlusion. One incompletely embolized giant aneurysm ruptured during the follow-up period. CONCLUSION In contrast to surgical clipping of basilar tip aneurysms, the main technical challenge of the Guglielmi detachable coiling procedure depends on the shape of the aneurysm, not its location. The results of this study indicate that endovascular GDC technology is an appropriate therapeutic alternative in ruptured or unruptured basilar tip aneurysms regardless of patient age, clinical presentation, clinical status, or timing of treatment.


2020 ◽  
Vol 9 (8) ◽  
pp. 2464
Author(s):  
Jiesuck Park ◽  
Jung-Kyu Han ◽  
Mineok Chang ◽  
You-Jeong Ki ◽  
Jeehoon Kang ◽  
...  

We investigated whether intensive glucose control after percutaneous coronary intervention (PCI) improves clinical outcomes in diabetic patients. From the Grand-DES registry, we analyzed 2576 diabetic patients (median age 66 years, male 65.6%) who underwent PCI and had at least 2 records of HbA1c during follow-up. Patients were categorized according to mean HbA1c (≥7% or <7%). The primary outcome was major adverse cardiovascular events (MACE), a composite of cardiac death, non-fatal myocardial infarction, and any revascularization. During a median follow-up of 33.6 months, MACE occurred in 335 (13.0%) patients. Intensive glucose control with a follow-up mean HbA1c < 7.0% (42.2%; n = 1087) was not associated with a lower risk of MACE compared to control with a mean HbA1c ≥ 7.0% (adjusted hazard ratio [aHR] [95% confidence interval] 1.06 [0.82–1.37], p = 0.672). In subgroup analysis, sustained HbA1c < 7.0% throughout the follow-up was not associated with a lower risk of MACE compared to sustained HbA1c ≥ 7.0% (aHR 1.15 [0.71–1.89], p = 0.566). More intensive glucose control with a mean HbA1c ≤ 6.5% was also not associated with a lower risk of MACE compared to lenient control with a mean HbA1c ≥ 8.0% (aHR 1.15 [0.71–1.86], p = 0.583). In conclusion, intensive glucose control after PCI was not associated with better clinical outcomes than lenient control in diabetic patients.

