Coalescence of the German-Austrian and IMRAW Cytogenetic MDS Databases: Modification of Patient Risk Groups.

Blood ◽  
2007 ◽  
Vol 110 (11) ◽  
pp. 2468-2468 ◽  
Author(s):  
Christian Steidl ◽  
Julie Schanz ◽  
Michelle M. Le Beau ◽  
John M. Bennett ◽  
Ulrich Germing ◽  
...  

Abstract Introduction The International Prognostic Scoring System (IPSS) for evaluating prognosis in myelodysplastic syndromes (MDS) has been the standard for risk assessment in this disease for the past ten years. Based on a cohort of 816 primary MDS patients from the IMRAW, a refined bone marrow cytogenetic classification system was introduced. Recently, the GACMSG published cytogenetic data on 1155 primary MDS patients treated with supportive care only. Coalescence of these two large databases offered the opportunity to analyze the cytogenetic data jointly and to propose a modified cytogenetic risk stratification system. Patients and Methods 1971 patients with karyotype and survival data originating from the IMRAW and GACMSG cohorts were included in this study. Both cohorts comprised patients with primary MDS treated with supportive care only, allowing short courses of low-dose oral chemotherapy or hemopoietic growth factors. By reviewing the ISCN karyotypes, patients were grouped into cytogenetic categories defined by median survival (MS) (Haase et al, Blood in press). The categories comprised karyotypes with the respective abnormality alone or in combination with one additional anomaly. Karyotypes with 3 or with more than 3 abnormalities were considered separate categories. Results We found 15 cytogenetic categories, each comprising 10 or more patients. These categories could be combined into 4 prognostic groups according to MS: Group 1 (MS>3 years): normal karyotype, del(5q), del(12p), del(20q), +21, −Y, −X; Group 2 (1.5–3 years): +1/+1q/t(1q), add(3q)/inv(3q)/del(3q)/t(3q), +8, del(11q); Group 3 (1–1.5 years): 3 anomalies, −7, del(7q); Group 4 (MS<1 year): >3 anomalies. Further stratification of these categories led to a system with 4 distinct risk strata (number of patients): good (1374), int-1 (160), int-2 (99), and poor (166). Only 172 patients (9% of all patients) could not be classified according to this system. 
Survival analysis of these 4 groups showed distinct MS (log-rank test: p<0.0001): good, 50 months; int-1, 24 months; int-2, 15 months; poor, 6 months. When the non-classified patients were combined into one group, their MS was 31 months. When comparing this new classification system with the original IPSS system, 66 formerly intermediate-risk patients shifted into the good-risk group and 114 poor-risk patients into the intermediate-risk group. Discussion Combined examination of the two databases introduces 7 new cytogenetic categories with distinct survival times as compared to the IPSS; Group 1: del(12p), +21, −X; Group 2: +1/+1q/t(1q), add(3q)/inv(3q)/del(3q)/t(3q), del(11q); Group 3: 3 anomalies. Based on previously published data, the proposed system combines non-complex karyotypes in one category and distinguishes karyotypes with 3 abnormalities from those with more than 3. With respect to future refined integrative scoring in MDS, we present an approach that distinguishes groups of intermediate risk and a heterogeneous group of as yet unclassified rare cases with uncertain prognoses. In the latter cases, risk assessment should be based on other prognostic parameters rather than assigning an intermediate risk to this group. This new cytogenetic risk stratification system needs to be validated and tested using multivariate approaches.
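The category-to-group assignment described in this abstract is effectively a lookup table. A minimal sketch in Python (illustrative only, not the authors' code; the category labels are shorthand assumptions, not the study's exact nomenclature):

```python
# Illustrative sketch of the proposed cytogenetic grouping; category labels
# are shorthand assumptions, not the study's exact nomenclature.

PROGNOSTIC_GROUP = {
    # Group 1 (median survival > 3 years)
    "normal": 1, "del(5q)": 1, "del(12p)": 1, "del(20q)": 1,
    "+21": 1, "-Y": 1, "-X": 1,
    # Group 2 (1.5-3 years)
    "+1/+1q/t(1q)": 2, "3q abnormality": 2, "+8": 2, "del(11q)": 2,
    # Group 3 (1-1.5 years)
    "3 anomalies": 3, "-7": 3, "del(7q)": 3,
    # Group 4 (median survival < 1 year)
    ">3 anomalies": 4,
}

# Median survival (months) reported for the four risk strata
MEDIAN_SURVIVAL_MONTHS = {1: 50, 2: 24, 3: 15, 4: 6}

def prognostic_group(category):
    """Return the prognostic group (1-4) for a cytogenetic category,
    or None if the category is not classified by the system."""
    return PROGNOSTIC_GROUP.get(category)
```

Unclassified categories (9% of patients in the combined cohort) fall through to None, mirroring the abstract's recommendation to assess those cases on other prognostic parameters instead.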

2019 ◽  
Vol 40 (Supplement_1) ◽  
Author(s):  
E Puymirat ◽  
V Tea ◽  
F Schiele ◽  
C Baixas ◽  
X Lamit ◽  
...  

Abstract Background High-dose statin prescription is strongly recommended by current guidelines for patients after acute myocardial infarction (AMI). Aim We aimed to assess the clinical impact on major adverse cardiovascular events (MACE) of high-dose statin prescription at discharge according to atherothrombotic risk stratification in a routine-practice population of AMI patients, and to determine the relative efficacy of currently recommended high-dose statins according to risk level. Methods We used data from the 2005 and 2010 FAST-MI nationwide registries, including 7,839 patients with AMI (54% STEMI) admitted to cardiac intensive care units in France. Atherothrombotic risk stratification was performed using the TIMI Risk Score for Secondary Prevention (TRS-2P). Patients were divided into 3 categories: Group 1 (low risk; TRS-2P=0/1); Group 2 (intermediate risk; TRS-2P=2); and Group 3 (high risk; TRS-2P≥3). Baseline characteristics and the rate of MACE (defined as death, stroke or re-MI) at 5 years were analyzed according to TRS-2P category, and the impact of high-dose statins (i.e. atorvastatin 80 mg/day or rosuvastatin 20 mg/day) at discharge was compared among the risk groups using Cox multivariate analysis. Results A total of 7,348 patients were discharged alive with an available TRS-2P. The prevalence of Groups 1, 2 and 3 was 41.5%, 25% and 33.5%, respectively. Between the two registry periods, the proportion of AMI patients in high-risk Group 3 decreased from 41% to 27% (P<0.001). Optimal medical therapy at discharge (defined as dual antiplatelet therapy and statins for all, plus beta-blocker and ACE-I or ARB when appropriate) was used in 53% of Group 3, 67% of Group 2, and 80% of Group 1 (P<0.001). High-dose statin prescription at discharge was 18.5% (Group 3), 31.3% (Group 2), and 41.3% (Group 1). 
High-dose statin prescription was associated with lower 5-year MACE in the overall population compared with intermediate/low-dose statins or no statin prescription (14.3% vs. 29.6%; absolute risk difference 15.3%; HR adjusted for baseline characteristics and management: 0.86, 95% CI 0.76–0.97, P=0.018). The decrease in 5-year MACE was observed in all TRS-2P categories (Group 1: 8.1% vs. 10.7%, Δ=2.6%; Group 2: 14.8% vs. 21.6%, Δ=6.8%; Group 3: 30.8% vs. 51.6%, Δ=20.8%). However, the benefit of high-dose statins was smaller in low- and intermediate-risk patients (HR=0.97; 95% CI 0.74–1.26; P=0.81 and HR=1.06; 95% CI 0.81–1.38; P=0.81, respectively) than in high-risk patients (HR=0.78; 95% CI 0.65–0.94; P=0.008). (Figure: five-year event-free survival.) Conclusions High-dose statin prescription at discharge after AMI was associated with lower 5-year MACE regardless of atherothrombotic risk stratification, although the largest absolute reduction was found in the high-risk TRS-2P class. Acknowledgement/Funding The FAST-MI 2010 registry is a registry of the French Society of Cardiology, supported by unrestricted grants from: Merck, the Eli-Lilly-Daiichi-Sanky
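The TRS-2P-based grouping used in this abstract is a simple threshold rule on the score. A hedged sketch (illustrative Python, not the registry's analysis code):

```python
# Illustrative helper: assign the TRS-2P-based risk category as defined in
# the abstract (Group 1: TRS-2P 0/1; Group 2: TRS-2P=2; Group 3: TRS-2P>=3).

def trs2p_group(trs2p_score):
    """Return 1 (low risk), 2 (intermediate risk) or 3 (high risk)."""
    if trs2p_score < 0:
        raise ValueError("TRS-2P score cannot be negative")
    if trs2p_score <= 1:
        return 1  # low risk
    if trs2p_score == 2:
        return 2  # intermediate risk
    return 3      # high risk
```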


2020 ◽  
Vol 56 (3) ◽  
pp. 2000513
Author(s):  
Stefano Ghio ◽  
Valentina Mercurio ◽  
Federico Fortuni ◽  
Paul R. Forfia ◽  
Henning Gall ◽  
...  

Question addressed: Echocardiography is not currently considered to provide sufficient prognostic information to serve as an integral part of treatment goals in pulmonary arterial hypertension (PAH). We tested the hypothesis that incorporating multiple parameters reflecting right heart function would improve the prognostic value of this imaging modality. Methods and main results: We pooled individual patient data from a total of 517 patients (mean age 52±15 years, 64.8% females) included in seven observational studies conducted at five European and United States academic centres. Patients were subdivided into three groups representing progressive degrees of right ventricular dysfunction based on a combination of echocardiographic measurements, as follows. Group 1 (low risk): normal tricuspid annular plane systolic excursion (TAPSE) and nonsignificant tricuspid regurgitation (TR) (n=129); group 2 (intermediate risk): normal TAPSE and significant TR, or impaired TAPSE and nondilated inferior vena cava (IVC) (n=256); group 3 (high risk): impaired TAPSE and dilated IVC (n=132). The 5-year cumulative survival rate was 82% in group 1, 63% in group 2 and 43% in group 3. Low-risk patients had better survival rates than intermediate-risk patients (log-rank Chi-squared 12.25; p<0.001), and intermediate-risk patients had better survival rates than high-risk patients (log-rank Chi-squared 26.25; p<0.001). Inclusion of other parameters such as right atrial area and pericardial effusion did not provide added prognostic value. Answer to the question: The proposed echocardiographic approach integrating the evaluation of TAPSE, TR grade and IVC is effective in stratifying the risk of all-cause mortality in PAH patients, outperforming the prognostic parameters suggested by current guidelines.
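The three-group rule described above can be expressed as a small decision function. Since the study's numeric cut-offs for "impaired" TAPSE and "significant" TR are not restated in the abstract, the sketch below takes those classifications as already-made booleans (an assumption):

```python
# Hedged sketch of the three-group echocardiographic rule from the abstract.
# The inputs are assumed to be pre-classified per the study's definitions.

def echo_risk_group(tapse_impaired, tr_significant, ivc_dilated):
    """Return 'low', 'intermediate' or 'high' risk per the proposed integration."""
    if not tapse_impaired and not tr_significant:
        return "low"       # normal TAPSE and nonsignificant TR
    if tapse_impaired and ivc_dilated:
        return "high"      # impaired TAPSE and dilated IVC
    return "intermediate"  # normal TAPSE + significant TR,
                           # or impaired TAPSE + nondilated IVC
```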


2018 ◽  
Vol 26 (4) ◽  
pp. 411-419 ◽  
Author(s):  
Victoria Tea ◽  
Marc Bonaca ◽  
Chekrallah Chamandi ◽  
Marie-Christine Iliou ◽  
Thibaut Lhermusier ◽  
...  

Background A full secondary prevention medication regimen is often under-prescribed after acute myocardial infarction. Design The purpose of this study was to analyse the relationship between prescription of appropriate secondary prevention treatment at discharge and long-term clinical outcomes according to risk level defined by the Thrombolysis In Myocardial Infarction (TIMI) Risk Score for Secondary Prevention (TRS-2P) after acute myocardial infarction. Methods We used data from the 2010 French Registry of Acute ST-Elevation or Non-ST-Elevation Myocardial Infarction (FAST-MI), including 4169 consecutive acute myocardial infarction patients admitted to cardiac intensive care units in France. Level of risk was stratified in three groups using the TRS-2P score: group 1 (low risk; TRS-2P=0/1); group 2 (intermediate risk; TRS-2P=2); and group 3 (high risk; TRS-2P≥3). Appropriate secondary prevention treatment was defined according to the latest guidelines (dual antiplatelet therapy and moderate/high-dose statins for all; newer P2Y12 inhibitors, angiotensin-converting-enzyme inhibitors/angiotensin-receptor blockers and beta-blockers as indicated). Results The prevalence of groups 1, 2 and 3 was 46%, 25% and 29%, respectively. Appropriate secondary prevention treatment at discharge was used in 39.5%, 37% and 28% of each group, respectively. After multivariate adjustment, evidence-based treatment at discharge was associated with lower rates of major adverse cardiovascular events (death, re-myocardial infarction or stroke) at five years, especially in high-risk patients: hazard ratio = 0.82 (95% confidence interval: 0.59–1.12, p = 0.21) in group 1, 0.74 (0.54–1.01; p = 0.06) in group 2, and 0.64 (0.52–0.79, p < 0.001) in group 3. Conclusions Use of appropriate secondary prevention treatment at discharge was inversely correlated with patient risk. The increased hazard related to lack of prescription of recommended medications was much larger in high-risk patients. 
Specific efforts should be directed at better prescription of recommended treatment, particularly in high-risk patients.


2021 ◽  
Author(s):  
Evert F.S. van Velsen ◽  
Robin P. Peeters ◽  
Merel T. Stegenga ◽  
F.J. van Kemenade ◽  
Tessa M. van Ginhoven ◽  
...  

Objective Recent research suggests that the addition of age improves the 2015 American Thyroid Association (ATA) Risk Stratification System for differentiated thyroid cancer (DTC). The aim of our study was to investigate the influence of age on disease outcome in ATA High Risk patients, with a focus on differences between patients with papillary (PTC) and follicular thyroid cancer (FTC). Methods We retrospectively studied adult patients with High Risk DTC from a Dutch university hospital. Logistic regression and Cox proportional hazards models were used to estimate the effects of age at diagnosis, and of several age cutoffs (at five-year increments between 20 and 80 years), on (i) response to therapy, (ii) developing no evidence of disease (NED), (iii) recurrence, and (iv) disease-specific mortality (DSM). Results We included 236 ATA High Risk patients (32% FTC) with a median follow-up of 6 years. Age, whether treated continuously or dichotomously, had a significant influence on achieving an excellent response after initial therapy, developing NED, recurrence, and DSM for both PTC and FTC. For FTC, an age cutoff of 65 or 70 years showed the best statistical model performance, versus 50 or 60 years for PTC. Conclusions In a population of patients with High Risk DTC, older age has a significant negative influence on disease outcomes. Slightly different optimal age cutoffs were identified for the different outcomes, and these cutoffs differed between PTC and FTC. The ATA Risk Stratification System may therefore be further improved by incorporating age as an additional risk factor.


Blood ◽  
2007 ◽  
Vol 110 (11) ◽  
pp. 830-830
Author(s):  
J. Alejandro Madrigal ◽  
Neema P. Mayor ◽  
Hazael Maldonado-Torres ◽  
Bronwen E. Shaw ◽  
Steven G.E. Marsh

Abstract Haematopoietic Stem Cell Transplantation (HSCT) using volunteer Unrelated Donors (UD) has become an important and viable option in the treatment of Acute Leukaemia (AL). While matching donors and recipients usually refers to five of the classical HLA genes (HLA-A, -B, -C, -DRB1 and -DQB1), the impact of a sixth gene, HLA-DPB1, on the outcome of UD-HSCT is increasingly emerging. We have previously shown an increased risk of relapse with HLA-DPB1 matching and, independently, with NOD2/CARD15 genotype. In light of these data, we have analysed a larger UD-HSCT cohort in order to establish the impact on transplant outcome when both HLA-DPB1 matching status and NOD2/CARD15 genotype are considered. HLA typing and NOD2/CARD15 genotyping were performed on 304 AL patients and their Anthony Nolan Trust volunteer unrelated donors. Transplants occurred between 1996 and 2005 at UK transplant centres. Diagnoses were ALL (47%) and AML (53%). 67% of the cohort were a 10/10 HLA match, with 16% also matched for HLA-DPB1. Myeloablative conditioning regimens were used in 74% of transplants. T-cell depletion was included in 82% of conditioning protocols. Bone marrow was used in 72% of transplants, with the remaining 28% using peripheral blood stem cells. Two forms of post-transplant immunosuppression predominated: Cyclosporine A and Methotrexate (47%), and Cyclosporine A alone (38%). Previous studies on a subgroup of this cohort showed that HLA-DPB1 matching and NOD2/CARD15 SNPs were independently associated with an increase in disease relapse. Consequently, the cohort was grouped into three categories to reflect this risk: group 1 (HLA-DPB1 matched, NOD2/CARD15 SNP; n=24), group 2 (HLA-DPB1 matched with NOD2/CARD15 wild-type (WT), or HLA-DPB1 mismatched with NOD2/CARD15 SNP; n=112) and group 3 (HLA-DPB1 mismatched, NOD2/CARD15 WT; n=168). There was a significant difference in disease relapse between the three groups (1 year: group 1, 68%; group 2, 48%; group 3, 30%; p=0.0038). 
This finding persisted in multivariate analysis where being in either group 2 or 3 was protective towards relapse as compared to group 1 (RR 0.321; 95% CI 0.167–0.616; p=0.001 and RR 0.478; 95% CI 0.244–0.934; p=0.031 respectively). In the group with the highest relapse risk (group 1), this resulted in a decrease in Overall Survival (OS) (33% vs 54% in group 3, RR 0.617; 95% CI 0.359–1.060; p=0.080). The best OS was seen in the group with the lowest risk of relapse (group 3). Here, in addition to low relapse, there was increased acute and chronic Graft-versus-Host Disease (GvHD) (p=0.0019 and p=0.0058 respectively). In this cohort, cGvHD (in its limited form) was associated with a significantly lower incidence of relapse (p=0.0066) and better OS (p<0.0001). In concordance with our previous theories, it appears that being HLA-DPB1 matched and having NOD2/CARD15 SNPs predicts for the worst outcome with a significant increase in relapse and reduced OS. Conversely, the ideal pairing would be HLA-DPB1 mismatched and NOD2/CARD15 WT. These data suggest that prospectively typing AL patients for HLA-DPB1 and NOD2/CARD15 SNPs will allow the prediction of disease relapse, aGvHD and cGvHD and in addition will allow the effects of being independently HLA-DPB1 matched or having a NOD2/CARD15 SNP to be offset by intelligently selecting a suitable, less precarious donor.
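The three risk categories in this abstract reduce to counting two binary risk factors, HLA-DPB1 match status and NOD2/CARD15 SNP. A minimal illustrative sketch (not the authors' code):

```python
# Illustrative sketch of the relapse-risk grouping from the abstract:
# both risk factors -> group 1 (highest relapse risk),
# exactly one -> group 2, neither -> group 3 (lowest relapse risk).

def relapse_risk_group(dpb1_matched, nod2_snp):
    """Return the relapse-risk group (1 = highest risk, 3 = lowest)."""
    if dpb1_matched and nod2_snp:
        return 1  # HLA-DPB1 matched with a NOD2/CARD15 SNP
    if not dpb1_matched and not nod2_snp:
        return 3  # HLA-DPB1 mismatched, NOD2/CARD15 wild-type
    return 2      # exactly one of the two risk factors present
```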


Author(s):  
Ivo Pavlik ◽  
Vit Ulmann ◽  
Helena Modra ◽  
Milan Gersl ◽  
Barbora Rantova ◽  
...  

A total of 281 guano samples were collected from caves (N = 181) in 8 European countries (Bulgaria, Czech Republic, France, Hungary, Italy, Romania, Slovakia and Slovenia) and from attics in the Czech Republic (N = 100). The correlation between detection of mycobacteria by Ziehl-Neelsen (ZN) microscopy, culture examination and qPCR was strong. ZN microscopy positivity in guano from caves (58.6%) was more than double that in guano from attics (21.0%; P < 0.01). Of 89 mycobacterial isolates (73 from cave guano and 16 from attic guano), 68 (76.4%) isolates of 19 species, subspecies and complexes were identified as members of 3 groups (M. fortuitum, M. chelonae, and M. mucogenicum) and 4 complexes (M. avium, M. terrae, M. vaccae, and M. smegmatis). A total of 20 isolates (22.5%) belonged to risk group 1 (environmental saprophytes), 48 isolates (53.9%) to risk group 2 (potential pathogens), and none to risk group 3 (obligatory pathogens). When comparing bat guano collected from caves and attics, differences (P < 0.01; Mann-Whitney test) were observed for electrical conductivity, total carbon, and total organic and total inorganic carbon. No difference (P > 0.05; Mann-Whitney test) was found for pH or oxidation-reduction potential.


Blood ◽  
2015 ◽  
Vol 126 (23) ◽  
pp. 4731-4731 ◽  
Author(s):  
Jeffrey Paul Liles ◽  
Christopher Wanderling ◽  
Jordan Lee Liles ◽  
Debra Hoppensteadt ◽  
Jawed Fareed ◽  
...  

Abstract Background: Oral anticoagulants such as warfarin (W) have conventionally been used for the management of atrial fibrillation (AF). Despite the effectiveness of W, its use in AF patients requiring anticoagulation is suboptimal, with even greater underuse seen in elderly patients who are at higher risk of stroke. Newer oral anticoagulants such as rivaroxaban (R) and apixaban (A) have been approved to manage thrombotic and cardiovascular disorders including AF. The newer anticoagulants do not require continuous monitoring as W does and are much more convenient for patients with AF. Objective: To profile the baseline levels of the circulating thrombogenic biomarkers von Willebrand factor (vWF), prothrombin fragment 1+2 (F1+2), microparticle-bound tissue factor (MP-TF) and plasminogen activator inhibitor-1 (PAI-1) in patients with AF, and to assess the effect of both newer (R and A) and traditional (W) anticoagulants on the levels of these biomarkers. Materials: Citrated blood was drawn from thirty AF patients prior to ablation surgery and spun at 3000 rpm to obtain platelet-poor plasma. Normal plasma samples from healthy controls were purchased from a commercial source (George King Biomedical, Overland Park, KS). The plasma samples were analyzed using a biochip array (Randox, London, UK) for metabolic syndrome biomarkers including PAI-1, and ELISA kits for vWF, MP-TF (Hyphen BioMed, Neuville-sur-Oise, France) and prothrombin F1+2 (Siemens, Newark, DE). Results: Circulating levels of vWF, MP-TF and PAI-1 were significantly increased in patients with AF compared to normal (P<0.0001, P<0.0001, and P=0.0014, respectively). Circulating levels of prothrombin F1+2 showed no difference between the AF and normal groups (P=0.2696). AF patients (n=30) were divided into two groups based on their usage (Group 1, n=21) or non-usage (Group 2, n=9) of any anticoagulant. 
Furthermore, those on anticoagulants were divided based on their use of newer (R and A; Group 3, n=16) or traditional (W; Group 4, n=4) anticoagulants. A significant increase in vWF (P<0.0001), MP-TF (P<0.0001) and PAI-1 (P=0.011) remained in Group 1 compared to normal, while a significant increase in prothrombin F1+2 (P=0.0343) and PAI-1 (P=0.0040) was noted in Group 2 compared to normal. vWF (P=0.0036) and MP-TF (P=0.0059) were elevated in Group 1 compared to Group 2, while prothrombin F1+2 (P=0.0697) and PAI-1 (P=0.4548) showed no difference between the two groups. Furthermore, there was no significant difference in the level of any thrombogenic biomarker between Group 3 (R and A) and Group 4 (W). (Table 1) Discussion: Elevated levels of vWF, MP-TF and PAI-1 in AF patients compared to normal provide insight into an additional risk of thrombogenesis associated with AF that is not targeted by current anticoagulant medications. Most patients are assessed using a stroke risk stratification scale (such as CHA2DS2-VASc or CHADS2) to determine whether anticoagulants should be used to prevent stroke associated with AF. Of the 30 patients examined in this study, 8/9 (89%) patients who were not on anticoagulants had a stroke risk stratification score of 0, while 20/21 (95%) patients who were on anticoagulants had a score of >1. These data support studies suggesting that adding levels of prothrombotic biomarkers to current risk stratification scales could make assessment of stroke risk in patients with AF more effective. These data also suggest that although very effective in lowering prothrombin F1+2 levels in AF, the newer anticoagulants, R and A, and the traditional anticoagulant, W, still leave other prothrombotic biomarkers unaffected. 
These unaffected biomarkers could be potential targets of future drug therapies to lower the risk of stroke in patients with AF beyond the use of newer/traditional anticoagulants alone.

Table 1. Biomarkers of thrombogenesis in AF and normal groups

Group | vWF (concentration %) | Prothrombin F1+2 (pmol/L) | MP-TF (pg/mL) | PAI-1 (ng/mL)
Normal | 4140 ± 919 (n=46) | 106.1 ± 52.7 (n=50) | 0.38 ± 0.25 (n=48) | 3.21 ± 4.13 (n=25)
AF Group 1: anticoagulant (n=21) | 6616 ± 1173 | 107.9 ± 61.2 | 0.93 ± 0.71 | 5.69 ± 4.15
AF Group 2: non-anticoagulant (n=9) | 4788 ± 1338 | 162.5 ± 93.3 | 0.51 ± 0.13 | 6.49 ± 3.14
AF Group 3: new anticoagulants, R and A (n=16) | 6721 ± 1127 | 103.5 ± 37.6 | 0.76 ± 0.34 | 5.68 ± 4.53
AF Group 4: traditional anticoagulant, W (n=4) | 6387 ± 1580 | 75.2 ± 52.6 | 0.94 ± 0.33 | 5.63 ± 3.52

Disclosures: No relevant conflicts of interest to declare.
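The stroke-risk stratification scales mentioned in the discussion are simple additive scores. As an illustration (standard published CHA2DS2-VASc point values, not code from this study):

```python
# Illustrative CHA2DS2-VASc calculator using the standard published point
# values; this is general background, not code or data from this study.

def cha2ds2_vasc(chf, hypertension, age, diabetes, stroke_tia,
                 vascular_disease, female):
    """Return the CHA2DS2-VASc stroke-risk score for a patient with AF."""
    score = 0
    score += 1 if chf else 0               # C: congestive heart failure
    score += 1 if hypertension else 0      # H: hypertension
    if age >= 75:                          # A2: age >= 75 scores 2 points
        score += 2
    elif age >= 65:                        # A: age 65-74 scores 1 point
        score += 1
    score += 1 if diabetes else 0          # D: diabetes mellitus
    score += 2 if stroke_tia else 0        # S2: prior stroke/TIA/thromboembolism
    score += 1 if vascular_disease else 0  # V: vascular disease
    score += 1 if female else 0            # Sc: sex category (female)
    return score
```

The study's suggestion is that appending prothrombotic biomarker levels to such an additive scale might sharpen stroke-risk assessment in AF.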


2021 ◽  
Vol 11 ◽  
Author(s):  
Xue Shi ◽  
Xiaoqian Liu ◽  
Xiaomei Li ◽  
Yahan Li ◽  
Dongyue Lu ◽  
...  

The baseline International Prognostic Index (IPI) is not sufficient for the initial risk stratification of patients with diffuse large B-cell lymphoma (DLBCL) treated with R-CHOP (rituximab plus cyclophosphamide, doxorubicin, vincristine, and prednisone). The aims of this study were to evaluate the prognostic relevance of early risk stratification in DLBCL and to develop a new stratification system that combines an interim evaluation with the IPI. This multicenter retrospective study enrolled 314 newly diagnosed DLBCL patients with baseline and interim evaluations. All patients were treated with R-CHOP or R-CHOP-like regimens as first-line therapy. Survival differences were evaluated for different risk stratification systems: the IPI, the interim evaluation, and the combined system. When stratified by IPI, the high-intermediate and high-risk groups presented overlapping survival curves with no significant differences, and the high-risk group still had >50% 3-year overall survival (OS). The interim evaluation also stratified patients into three groups, since 3-year OS and progression-free survival (PFS) rates did not differ significantly between patients with stable disease (SD) and progressive disease (PD). The SD and PD patients had significantly lower 3-year OS and PFS rates than complete remission and partial response patients, but they made up only ~10% of the cohort. The combined IPI and interim evaluation risk stratification system separated the patients into low-, intermediate-, high-, and very-high-risk groups. The 3-year OS rates were 96.4%, 86.7%, 46.4%, and 40%, while the 3-year PFS rates were 87.1%, 71.5%, 42.5%, and 7.2%, respectively. The OS comparison between the high-risk and very-high-risk groups was marginally significant, and OS and PFS comparisons between any other two groups were significantly different. This combined risk stratification system could be a useful tool for prognostic prediction in DLBCL patients.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Yaobin Lin ◽  
Lei Wang ◽  
Lingdong Shao ◽  
Xueqing Zhang ◽  
Huaqin Lin ◽  
...  

Abstract The clinical efficacy of adjuvant radiotherapy in sigmoid colon cancer remains in question. This study evaluated the clinical efficacy of adjuvant external beam radiotherapy (EBRT) for patients with pathologic stage T4b sigmoid colon cancer. Patients with stage pT4b sigmoid colon cancer who underwent surgery with or without adjuvant EBRT between 2004 and 2016 were extracted from the Surveillance, Epidemiology, and End Results database. Overall survival (OS) was analyzed using Kaplan–Meier curves, and prognostic factors were identified using Cox proportional hazards regression models with 95% confidence intervals within the entire cohort. A risk-stratification system was then developed based on the β regression coefficients. Among 2073 patients, 284 (13.7%) underwent adjuvant EBRT. Median OS in the group receiving adjuvant EBRT was significantly longer than in the non-radiotherapy group (p < 0.001). Age, serum carcinoembryonic antigen (CEA) level, perineural invasion, lymph node dissection (LND) number, and adjuvant EBRT were independent factors associated with OS. A risk-stratification system was generated, which showed that low-risk patients had a higher 5-year survival rate than high-risk patients (75.6% vs. 42.3%, p < 0.001). Adjuvant EBRT significantly prolonged the 5-year survival rate of high-risk patients (62.6% vs. 38.3%, p = 0.009) but showed no survival benefit among low-risk patients (87.7% vs. 73.2%, p = 0.100). Our risk-stratification model comprising age, serum CEA, perineural invasion, and LND number predicted the outcomes of patients with stage pT4b sigmoid colon cancer and identified the high-risk subgroup that should receive adjuvant EBRT.

