FC 065 PREDICTING OUTCOMES IN ANCA ASSOCIATED VASCULITIS: THE COMPLETE SCOTTISH EXPERIENCE

2021 ◽  
Vol 36 (Supplement_1) ◽  
Author(s):  
Dominic McGovern ◽  
Jennifer Lees ◽  
Dana Kidder ◽  
James Smith ◽  
Jamie Traynor ◽  
...  

Abstract Background and Aims Outcomes in ANCA vasculitis remain difficult to predict and therapeutic decision-making can be challenging. We aimed to establish whether a renal risk score (RRS) could predict outcomes in this population. Method The Scottish Renal Biopsy Registry is a complete national dataset of all renal biopsies performed in Scotland. Patients who had a first renal biopsy between 01/01/2014 and 31/12/2017 with evidence of ANCA vasculitis were included. Demographic data, treatment regimens, episodes of relapse and patient and kidney survival were recorded retrospectively. The RRS was calculated using the system proposed by Brix et al (1). Each patient was categorised according to the percentage of normal glomeruli (N0 >25%, N1 10 to 25%, N2 <10%), the percentage of tubular atrophy/interstitial fibrosis (T0 ≤25%, T1 >25%) and eGFR (CKD-EPI) at the time of biopsy (G0 >15 mL/min/1.73 m2, G1 ≤15 mL/min/1.73 m2). Individual scores were summated and patients were classified as low, medium or high risk. Cox proportional hazards models were created for survival to ESKD, relapse and death, stratified by risk category. Analyses were conducted using R statistical software. Results Two hundred and forty-six patients with biopsy-proven ANCA vasculitis were identified. Fifty percent (n=123), 46% (n=112) and 5% (n=11) were stratified as low, medium and high risk, respectively. Fifty-two percent (n=129) were male and mean age at biopsy was 66.7±12.2 years; this was similar across the risk categories. Mean eGFR was lower in the high risk category (high risk 8.6±6.1 vs low risk 45.7±26.0 mL/min/1.73 m2, p<0.001) and proteinuria was higher (high risk 405 (IQR 170-767) vs low risk 81 (IQR 41-155) mg/mmol, p<0.001). Thirty-seven percent (n=91) were PR3 antigen positive and 2% (n=5) had dual positivity; in the high risk category, 8 (73%) were PR3 or dual positive. Eighteen (7%) patients experienced pulmonary haemorrhage, with similar representation across all risk categories. Those categorised as medium or high risk were more likely to receive plasma exchange and/or haemodialysis at presentation than those in the low risk category (p<0.001). Overall, 16% (n=40) of patients relapsed, with a trend towards a higher risk of relapse in the low risk group (27% of these patients, p=0.05). Thirty-seven (15%) patients developed ESKD. The Cox proportional hazards model for development of ESKD (Figure 1) shows that those in the high risk category were more likely to reach ESKD than those in the low risk category (HR 124.8, 95% CI 26.4-590.3, p<0.001). Patient survival was similar between risk categories. Conclusion A simple RRS, using routinely reported data, can help to predict the development of ESKD in patients with renal biopsy-proven ANCA vasculitis. It may also be predictive of future relapse in those with a lower RRS, most likely explained by reduced irreversible damage in this group. The RRS could inform monitoring and treatment decisions. Whilst the numbers are small, a unique strength of these data is that they come from a complete national dataset, making them less susceptible to bias from regional variations in diagnostic and therapeutic practice.
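As an illustration of the scoring described above, a minimal Python sketch of the RRS categorisation; the point weights (N0=0, N1=4, N2=6; T0=0, T1=2; G0=0, G1=3) and the low/medium/high cut-offs (0, 2-7 and 8-11 points) follow Brix et al (1) and are stated here as assumptions, since the abstract only defines the N, T and G categories.

```python
def renal_risk_score(pct_normal_glomeruli: float, pct_ifta: float, egfr: float) -> tuple[int, str]:
    """Categorise a biopsy with the ANCA renal risk score.

    Point weights and cut-offs are assumed from Brix et al; the abstract
    itself only defines the N/T/G categories.
    """
    # Normal glomeruli: N0 >25% (0 pts), N1 10-25% (4 pts), N2 <10% (6 pts)
    if pct_normal_glomeruli > 25:
        points = 0
    elif pct_normal_glomeruli >= 10:
        points = 4
    else:
        points = 6

    # Tubular atrophy / interstitial fibrosis: T0 <=25% (0 pts), T1 >25% (2 pts)
    points += 0 if pct_ifta <= 25 else 2

    # eGFR at biopsy: G0 >15 (0 pts), G1 <=15 mL/min/1.73 m2 (3 pts)
    points += 0 if egfr > 15 else 3

    if points == 0:
        category = "low"
    elif points <= 7:
        category = "medium"
    else:
        category = "high"
    return points, category


# Example: a biopsy resembling the abstract's mean high-risk patient (eGFR ~8.6 mL/min/1.73 m2)
print(renal_risk_score(pct_normal_glomeruli=5, pct_ifta=40, egfr=8.6))  # -> (11, 'high')
```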

Blood ◽  
2012 ◽  
Vol 120 (21) ◽  
pp. 431-431
Author(s):  
Paola Guglielmelli ◽  
Flavia Biamonte ◽  
Arturo Pereira ◽  
Johannah Score ◽  
Carmela Mannarelli ◽  
...  

Abstract 431 Background. Primary myelofibrosis (PMF) has the worst prognosis among myeloproliferative neoplasms, with a median overall survival (OS) of 4.6y in the International Prognostic Scoring System (IPSS) series (Cervantes F, Blood 2009;113:2895) and 6.5y in patients (pts) seen more recently (Cervantes F, JCO, 2012 in press). OS is predicted by the four risk categories of the IPSS, dynamic IPSS (DIPSS) and DIPSS-plus systems, and these scores are used for therapeutic choices, particularly allogeneic stem cell transplantation. Nevertheless, pts heterogeneity remains within these categories, necessitating improved risk stratification. A number of molecular abnormalities have been reported in PMF pts, but their prognostic relevance is incompletely understood, particularly with regard to transformation to acute leukemia (AL). The aim of this work was to analyze the prognostic impact of known mutations detected close to diagnosis in an international series of 429 pts. Patients and methods. PMF diagnosis had to satisfy the 2008 WHO criteria. Mutations in JAK2V617F, MPLW515, EZH2, ASXL1, TET2, IDH1/2, DNMT3A, CBL and SRSF2 were genotyped in whole blood or granulocytes using allele-specific RTQ-PCR, HRM and direct sequencing; all mutations were confirmed at least twice. Missense, nonsense and frameshift mutations were considered; in the case of novel mutations, SNPs were excluded by database searching and, when feasible, by germline DNA genotyping. The prognostic value of the molecular variables with regard to overall survival (OS) was analyzed by Cox regression and adjusted for the IPSS category. The association of molecular features with the risk of progression to AL was investigated in the framework of competing risks by the Fine & Gray regression method. The replicability of the prognostic models for both OS and progression to AL was assessed in 1000 bootstrap samples randomly drawn from the original series. Results. Patient median age was 60y. Median follow-up was 3.7y (95% CI, 0.02–27.9) and death occurred in 157 pts (32%). The frequency of pts with constitutional symptoms was 28%, splenomegaly 74%, anemia 27%, leukocytosis 8%, >1% blasts 16%, thrombocytopenia 12%. An abnormal karyotype was found in 24% (of 229 evaluated). IPSS risk category: low-risk 35%, Int-1 30%, Int-2 21%, high-risk 14%. The frequency of mutations was: JAK2V617F 59.8%, with 49% of pts having <25% allele burden; MPLW515 14%; EZH2 5%; ASXL1 21.3%; TET2 9.5%; IDH1-2 2.4%; DNMT3A 5.6%; CBL 4.3%; SRSF2 8.4%. Survival Model. Median survival was 9.7y (CI, 7.9–12.2): 22y in low-risk, 10y in Int-1, 6.2y in Int-2 and 2.5y in high-risk pts (P<0.0001). We found a strong association between IPSS risk categories and ASXL1 and SRSF2 mutated cases, which clustered in the high-risk category (41.5% and 25.4%, respectively, P<0.001). ASXL1 mutations were associated with leukocytosis, blasts and constitutional symptoms; mutations in SRSF2 with older age, leukocytosis and symptoms. No other relevant associations between IPSS and molecular parameters were found. In the final prognostic model, only mutations in ASXL1 (hazard ratio, HR 2.02; P<0.001) added to the IPSS (HR 2.40; P<0.001) with regard to OS, independently of the association between ASXL1 mutations and the high-risk category. Within the Int-2/high-risk category, ASXL1 mutations were associated with significantly shorter survival (median survival 2.6 years for mutated versus 5.8 years for unmutated; P=0.0004). 
According to the bootstrap analysis, ASXL1 mutations were selected as a significant predictor of OS in 74.6% of the samples. Leukemia Model. AL occurred in 75 pts (15.2%) after a median of 3.8y (95% CI, 0.04–26.5) from diagnosis. Mutations in ASXL1 and IDH1/2 were the only molecular variables associated with a higher risk of AL, with SHRs of 2.33 (P<0.001) and 3.63 (P=0.008), respectively. The bootstrap validation approach resulted in ASXL1 and IDH1/2 mutations being significant predictors of AL in 60% and 54% of the samples, respectively. Conclusions. In this comprehensive series of mutations profiled in PMF pts, mutations in ASXL1 emerged as a powerful prognostic variable for survival, refining prognosis within the Int-2/high-risk IPSS categories. ASXL1 and IDH1/2 mutations predicted death due to AL. Therefore, genotyping for ASXL1 and IDH1/2 mutations at diagnosis may help to tailor therapy for pts with IPSS Int-2/high-risk PMF. Disclosures: No relevant conflicts of interest to declare.
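The bootstrap replication described above, in which a mutation counts as validated according to the proportion of resamples where it remains a significant predictor of OS alongside the IPSS, can be sketched as follows. The use of lifelines' CoxPHFitter, the column names, and treating the IPSS as a single ordinal score are illustrative assumptions rather than the authors' actual pipeline.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter


def bootstrap_selection_rate(df: pd.DataFrame, predictor: str,
                             n_boot: int = 1000, alpha: float = 0.05,
                             seed: int = 0) -> float:
    """Fraction of bootstrap resamples in which `predictor` stays significant
    in a Cox model for OS adjusted for IPSS.

    Assumed columns: 'os_years' (follow-up), 'dead' (event indicator),
    'ipss_score' (IPSS treated as an ordinal score for simplicity), and
    `predictor` (0/1 mutation status).
    """
    rng = np.random.default_rng(seed)
    selected = 0
    for _ in range(n_boot):
        sample = df.sample(n=len(df), replace=True,
                           random_state=int(rng.integers(1 << 31)))
        cph = CoxPHFitter()
        cph.fit(sample[["os_years", "dead", "ipss_score", predictor]],
                duration_col="os_years", event_col="dead")
        if cph.summary.loc[predictor, "p"] < alpha:
            selected += 1
    return selected / n_boot
```

With data in this shape, the 74.6% figure reported above for ASXL1 would correspond to bootstrap_selection_rate(df, "ASXL1_mut") returning roughly 0.75.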


10.2196/16069 ◽  
2020 ◽  
Vol 8 (4) ◽  
pp. e16069
Author(s):  
Kenneth B Chapman ◽  
Martijn M Pas ◽  
Diana Abrar ◽  
Wesley Day ◽  
Kris C Vissers ◽  
...  

Background Several pain management guidelines recommend regular urine drug testing (UDT) in patients who are being treated with chronic opioid analgesic therapy (COAT) to monitor compliance and improve safety. Guidelines also recommend more frequent testing in patients who are at high risk of adverse events related to COAT; however, there is no consensus on how to identify high-risk patients or on the testing frequency that should be used. Using previously described clinical risk factors for UDT results that are inconsistent with the prescribed COAT, we developed a web-based tool to adjust drug testing frequency in patients treated with COAT. Objective The objective of this study was to evaluate a risk stratification tool, the UDT Randomizer, to adjust UDT frequency in patients treated with COAT. Methods Patients were stratified using an algorithm based on readily available clinical risk factors into categories of presumed low, moderate, high, and high+ risk of presenting with UDT results inconsistent with the prescribed COAT. The algorithm was integrated into a website to facilitate adoption across practice sites. To test the performance of this algorithm, we performed a retrospective analysis of patients treated with COAT between June 2016 and June 2017. The primary outcome was compliance with the prescribed COAT as defined by UDT results consistent with the prescribed COAT. Results In total, 979 drug tests (867 UDT, 88.6%; 112 oral fluid tests, 11.4%) were performed in 320 patients. An inconsistent drug test result was registered in 76/979 tests (7.8%). The incidences of inconsistent test results across the risk tool categories were 7/160 (4.4%) in the low risk category, 32/349 (9.2%) in the moderate risk category, 28/338 (8.3%) in the high risk category, and 9/132 (6.8%) in the high+ risk category. Generalized estimating equation analysis demonstrated that the moderate risk (odds ratio [OR] 2.1, 95% CI 0.9-5.0; P=.10), high risk (OR 2.0, 95% CI 0.8-5.0; P=.14), and high+ risk (OR 2.0, 95% CI 0.7-5.6; P=.20) categories were associated with a nonsignificantly increased risk of inconsistency versus the low risk category. Conclusions The developed tool stratified patients at individual visits into risk categories for presenting with drug test results inconsistent with the prescribed COAT; the higher risk categories showed nonsignificantly higher risk compared with the low risk category. Further development of the tool with additional risk factors in a larger cohort may further clarify and enhance its performance.
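A hedged sketch of the generalized estimating equation analysis reported above: the outcome is a binary inconsistent-result flag, tests are clustered within patients, and risk category is the exposure with the low risk group as reference. The column names and the exchangeable working correlation are assumptions, not details given in the abstract.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf


def gee_odds_ratios(df: pd.DataFrame) -> pd.Series:
    """Odds ratios of an inconsistent result per risk category vs low risk,
    accounting for repeated tests within patients.

    Assumed columns: 'inconsistent' (0/1 per test), 'risk_cat'
    ('low'/'moderate'/'high'/'high_plus'), 'patient_id' (clustering unit).
    """
    model = smf.gee(
        "inconsistent ~ C(risk_cat, Treatment(reference='low'))",
        groups="patient_id",
        data=df,
        family=sm.families.Binomial(),
        cov_struct=sm.cov_struct.Exchangeable(),
    )
    result = model.fit()
    # Exponentiated coefficients are the odds ratios vs the low risk reference.
    return np.exp(result.params).round(2)
```

The exchangeable working correlation treats repeated tests from the same patient as equally correlated, the usual default when the within-patient ordering is not modelled.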


Blood ◽  
2015 ◽  
Vol 126 (23) ◽  
pp. 1672-1672
Author(s):  
Meritxell Nomdedeu ◽  
Xavier Calvo ◽  
Dolors Costa ◽  
Montserrat Arnan ◽  
Helena Pomares ◽  
...  

Abstract Introduction: The MDS are a group of clonal hematopoietic disorders characterized by blood cytopenias and an increased risk of transformation into acute myeloid leukemia (AML). The MDS predominantly affect older people (median age at diagnosis >70 years), so a fraction of the observed mortality is driven by age-related factors shared with the general population rather than by the MDS themselves. Distinguishing between the MDS-related and unrelated mortality rates will allow a better assessment of the population health impact of the MDS and more accurate prognostication. This study was aimed at quantifying the MDS-attributable mortality and its relationship with the IPSSR risk categories. Methods: The database of the Spanish Group of Myelodysplastic Syndromes (GESMD) was queried for patients diagnosed with primary MDS after 1980 according to the WHO 2001 classification. Patients with CMML, those younger than 16 years, and those who lacked basic demographic or follow-up data were excluded. Relative survival and MDS-attributable mortality were calculated by the cohort method and statistically compared by Poisson multivariate regression as described by Dickman (Stat Med 2004; 23: 51). Three main parameters were calculated: the observed (all-cause) mortality, the MDS-attributable mortality (both as percentages of the initial cohort), and the fraction of the observed mortality attributed to the MDS. Results: In total, 7408 patients met the inclusion criteria and constitute the basis for this study. Among these patients, 5307 had enough data to be classified according to the IPSSR. Median age was 74 (IQR: 16-99) years and 58% were males. The most frequent WHO categories were RAEB, type I or II (29% of cases), RCMD (28%), and RA with ring sideroblasts (16%). Most patients (72%) were classified within the very low and low risk categories of the IPSSR. At the study closing date (December 2014), 1022 patients had progressed to AML, 3198 had died (974 after AML) and 3210 were censored alive. The median actuarial survival for the whole series was 4.8 (95% CI: 4.6-5.1) years and 30% of patients are projected to survive longer than 10 years. The overall MDS-attributable mortality at 5 years from diagnosis was 39%, which accounted for three-quarters of the observed mortality (51%, figure). The corresponding figures at 10 years for the MDS-attributable and observed mortality were 55% and 71%, respectively. According to the IPSSR, the 5-year MDS-attributable mortality rate was 19% for the very low risk category, 39% (low risk), 70% (intermediate risk), 78% (high risk), and 92% (very high risk). On average, the incidence rate ratio for the MDS-attributable mortality increased 1.9 times (95% CI: 1.7-2.3, p<0.001) as the IPSSR worsened from one risk category to the next. The fraction of the observed mortality attributed to the MDS was 0.55 for the very low risk category, 0.79 (low risk), 0.93 (intermediate risk), 0.96 (high risk), and 0.99 (very high risk). After distinguishing between AML-related and unrelated mortality, the 5-year MDS-attributable mortality not related to AML was 10% for the very low risk category, 20% (low risk), 33% (intermediate risk), 42% (high risk), and 44% (very high risk). By comparing these figures with the above ones, we estimated that about 50% of the MDS-attributable mortality was AML-unrelated and that this fraction remained nearly constant across the five IPSSR categories. 
Conclusions: About three-quarters of the mortality observed in patients with MDS is caused by the disease, with the remaining one-quarter due to MDS-independent factors shared with the general population. The MDS-attributable mortality increases with the IPSSR risk category, from about half the observed mortality in the very low risk group to nearly all the mortality observed in the high and very high risk groups. Half of the MDS-attributable mortality is driven by factors unrelated to leukemic transformation, a proportion that remains constant across the five IPSSR risk categories. Disclosures Valcarcel: AMGEN: Honoraria, Membership on an entity's Board of Directors or advisory committees, Speakers Bureau; NOVARTIS: Honoraria, Membership on an entity's Board of Directors or advisory committees; GSK: Membership on an entity's Board of Directors or advisory committees, Speakers Bureau; CELGENE: Honoraria, Membership on an entity's Board of Directors or advisory committees, Speakers Bureau. Ramos: AMGEN: Consultancy, Honoraria; NOVARTIS: Consultancy, Honoraria; JANSSEN: Honoraria, Membership on an entity's Board of Directors or advisory committees; CELGENE: Consultancy, Honoraria, Membership on an entity's Board of Directors or advisory committees, Research Funding. Esteve: Celgene: Consultancy, Honoraria; Janssen: Consultancy, Honoraria.
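As a worked check on the figures above, the fraction of observed mortality attributed to the MDS is simply the MDS-attributable rate divided by the observed rate; a minimal sketch using the 5-year values reported in the abstract:

```python
# Cumulative mortality at 5 years from diagnosis, as reported above.
observed_5y = 0.51       # all-cause mortality, fraction of the initial cohort
attributable_5y = 0.39   # MDS-attributable (excess) mortality, fraction of the initial cohort

fraction_attributed = attributable_5y / observed_5y
print(f"{fraction_attributed:.2f}")  # ~0.76, i.e. about three-quarters of observed deaths
```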


Blood ◽  
2015 ◽  
Vol 126 (23) ◽  
pp. 4459-4459 ◽  
Author(s):  
Dr. Muhammad Irfan Khan ◽  
Catriona O'Leary ◽  
Mary Ann Hayes ◽  
Patricia O'Flynn ◽  
Pauline Suzanne Chappell ◽  
...  

Abstract Background Evidence-based consensus guidelines for venous thromboembolism (VTE) prevention have been broadly accepted as effective and safe for more than three decades (Clagett GP et al, 1992). However, VTE continues to carry a major global burden of disease, with 3.9 million cases of hospital-acquired thrombosis (HAT) in one year among 1.1 billion citizens of high-income countries (Jha AK et al, 2013). Prevention is therefore the key to reducing death and disability resulting from VTE (Kahn S et al, Gould MK et al & Falck-Yitter Y et al, 2012). Ireland, like many other countries, has yet to implement a mandatory risk assessment tool and thromboprophylaxis (TP) policy nationally. Aims The aim of this study was to calculate the proportion of inpatients who had a VTE risk assessment performed and received appropriate TP in a large tertiary referral hospital. This information will provide vital baseline data for implementation of a new national policy for prevention of HAT. Methods This audit was performed at Cork University Hospital on 4 pre-specified days between November 2014 and February 2015. All adult inpatients (medical and surgical), excluding maternity and psychiatric patients, were included. Patients on therapeutic anticoagulation were also excluded. Each patient's medical chart and drug prescription chart were reviewed to determine whether a VTE risk assessment was documented and whether appropriate TP had been received. If no risk assessment had been performed, trained researchers applied the National Institute for Health and Care Excellence (NICE) guideline 92 (January 2010) for VTE risk assessment and prevention. Following the risk assessment, patients were divided into three categories: high risk of VTE with low risk of bleeding; high risk of VTE with significant risk of bleeding; and low risk of VTE. From this, the proportion of patients in each group who received appropriate TP was calculated. Results A total of 1019 patients were enrolled; the majority were medical patients (63.5%, n=648). The mean age of patients was 69 years and females accounted for 52%. The average length of hospitalisation at the time of the audit was 6 days (range 1-664 days). Overall, a formal VTE risk assessment was documented in only 24% (n=244) of all charts reviewed; however, TP was prescribed in 43.2% (n=441) of patients (Table 1).
Table 1. VTE risk assessment and thromboprophylaxis by risk category.
- High risk of VTE, low risk of bleeding: 80.3% of patients (n=819); VTE risk assessment documented in 21.9% (n=180); TP received by 46.3% (n=380).
- High risk of VTE, significant risk of bleeding: 16.6% of patients (n=170); VTE risk assessment documented in 28.2% (n=55); TP received by 28.8% (n=49).
- Low risk of VTE: 2.9% of patients (n=30); VTE risk assessment documented in 30% (n=9); TP received by 40% (n=12).
Within the high risk category, 64.3% (n=526) of patients were medical. TP was administered to only 46.3% (n=380) of patients in the high risk category, distributed almost evenly between surgical (50.1%, n=147) and medical (43.4%, n=233) patients. Conclusion This audit was performed as the initial step in developing a national policy to prevent HAT. As suspected, it highlights that a large proportion of hospitalised patients, both surgical and medical, continue to be at high risk of VTE despite the availability of preventative measures, and it clearly illustrates under-prescription of safe, effective and recommended means of VTE prevention. The current overall figure of less than 50% prescription of VTE thromboprophylaxis in high risk patients is a major patient safety concern. 
There are numerous recognised international guidelines for prevention of VTE, and an efficient method to implement these guidelines needs to be developed. Beyond developing national guidelines for TP, we need a co-ordinated approach to implement and monitor compliance with them. Once the preliminary results of this audit became available in March 2015, urgent measures were taken to reduce the identified risk, such as the establishment of a Hospital Thrombosis Group, which developed a user-friendly VTE risk assessment tool and TP policy. The VTE risk assessment tool was incorporated into the patient's drug prescription chart and included a pre-printed prescription for TP. It is now mandatory for all patients to have a VTE risk assessment completed, and TP prescribed if appropriate, within 24 hours of admission. This was successfully piloted for four weeks in the acute medical assessment unit and is now incorporated into each patient's drug chart throughout the hospital. The audit will be repeated 6 months after the introduction of this initiative, with a target of >90% compliance. Disclosures No relevant conflicts of interest to declare.
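As an illustration of the audit logic above, a minimal Python sketch of the three-way categorisation and the per-category thromboprophylaxis rate; the boolean risk flags, and how they would be derived from the NICE-based assessment, are assumptions for illustration rather than the audit's actual data capture.

```python
from dataclasses import dataclass


@dataclass
class Inpatient:
    high_vte_risk: bool              # per the NICE-based assessment (assumed flag)
    significant_bleeding_risk: bool
    received_tp: bool                # thromboprophylaxis prescribed


def audit_category(p: Inpatient) -> str:
    """Assign one of the three audit categories described above."""
    if p.high_vte_risk and not p.significant_bleeding_risk:
        return "high VTE risk, low bleeding risk"
    if p.high_vte_risk:
        return "high VTE risk, significant bleeding risk"
    return "low VTE risk"


def tp_rate(patients: list[Inpatient], category: str) -> float:
    """Proportion of patients in a given category who received TP."""
    group = [p for p in patients if audit_category(p) == category]
    return sum(p.received_tp for p in group) / len(group) if group else float("nan")
```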


PeerJ ◽  
2021 ◽  
Vol 9 ◽  
pp. e11092
Author(s):  
Mutsuaki Edama ◽  
Hiromi Inaba ◽  
Fumi Hoshino ◽  
Saya Natsui ◽  
Sae Maruyama ◽  
...  

Background This study aimed to clarify the relationship between the triad risk assessment score and the sports injury rate in 116 female college athletes (average age, 19.8 ± 1.3 years) competing at the national level in seven sports; 67 were teenagers and 49 were in their 20s. Methods Those with an absence of menses for >3 months or <6 menses in 12 months were classified as amenorrheic athletes. Low energy availability was defined as a body weight <85% of ideal body weight for adolescent athletes and a body mass index ≤17.5 kg/m2 for athletes in their 20s. Bone mineral density (BMD) was measured at the right heel using an ultrasonic bone densitometer. Low BMD was defined as a BMD Z-score <−1.0. The total score for each athlete was calculated. The cumulative risk assessment was defined as follows: low risk (a total score of 0–1), moderate risk (2–5), and high risk (6). The injury survey recorded injuries with reference to the injury survey items used by the International Olympic Committee. Results In swimming, significantly more athletes were in the low-risk category than in the moderate and high-risk categories (p = 0.004). In long-distance athletics, significantly more athletes were in the moderate-risk category than in the low and high-risk categories (p = 0.004). Significantly more athletes in the moderate and high-risk categories were in the injury group, whereas significantly more athletes in the low-risk category were in the non-injury group (p = 0.01). Significantly more athletes in the moderate and high-risk categories had bone stress fractures and bursitis than athletes in the low-risk category (p = 0.023). Discussion These results suggest that athletes with relative energy deficiency may have an increased injury risk.
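A minimal Python sketch of the component definitions and the cumulative risk categorisation described above; how the individual factors are converted into the points that make up the total score is not spelled out in the abstract, so only the reported definitions and cut-offs are encoded here.

```python
def is_amenorrheic(months_without_menses: float, menses_last_12_months: int) -> bool:
    # Abstract definition: absence of menses for >3 months or <6 menses in 12 months.
    return months_without_menses > 3 or menses_last_12_months < 6


def low_energy_availability(age: int, weight_kg: float, ideal_weight_kg: float, bmi: float) -> bool:
    # Teens: body weight <85% of ideal; athletes in their 20s: BMI <= 17.5 kg/m2.
    return weight_kg < 0.85 * ideal_weight_kg if age < 20 else bmi <= 17.5


def low_bmd(z_score: float) -> bool:
    # Abstract definition: BMD Z-score below -1.0.
    return z_score < -1.0


def triad_risk_category(total_score: int) -> str:
    # Cut-offs as reported in the abstract: 0-1 low, 2-5 moderate, 6 high.
    if total_score <= 1:
        return "low"
    if total_score <= 5:
        return "moderate"
    return "high"
```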


2021 ◽  
Vol 108 (Supplement_6) ◽  
Author(s):  
P Holden ◽  
G Wilson ◽  
M Daniel ◽  
R Srivastava

Abstract Aim Tonsillectomy represents 17% of the elective workload in ENT and post-tonsillectomy haemorrhage is the most significant complication of this procedure. Accordingly, the GIRFT (Getting It Right First Time) report for ENT surgery focusses on the prevention of post-tonsillectomy bleeding. However, there is little guidance on the management of post-tonsillectomy haemorrhage. A local guideline for the management of post-tonsillectomy haemorrhage was introduced in 2020 based on expert consensus. This audit examines the management of patients readmitted with post-tonsillectomy haemorrhage in 2019 and compares this to the management suggested in the new guideline. Method Patients readmitted with post-tonsillectomy haemorrhage within 30 days of a tonsillectomy performed in 2019 were identified. These were retrospectively stratified into risk categories according to both patient and clinical factors. Management was audited against the new guideline including both the initial patient assessment and the treatment suggested for their respective risk category. Results Fifteen patients were identified and stratified into low, medium and high-risk categories. All patients in the “low risk” category were successfully treated conservatively. One patient from the “medium risk” category had a further bleed as an inpatient during the proposed period of observation in the new guideline and was thereafter treated as “high risk”. Within the “high risk” category two patients required return to theatre for arrest of post-tonsillectomy haemorrhage. Conclusions These results show that the risk stratification proposed in these guidelines may be useful in the management of post-tonsillectomy haemorrhage. Amendments to the guideline and a re-audit are in progress.


Blood ◽  
2014 ◽  
Vol 124 (21) ◽  
pp. 4545-4545
Author(s):  
Massimo Breccia ◽  
Matteo Molica ◽  
Irene Zacheo ◽  
Giuliana Alimena

Abstract Nilotinib is currently approved for the treatment of chronic myeloid leukemia (CML) in chronic phase (CP) and accelerated phase (AP) after failure of imatinib and in newly diagnosed patients. Atherosclerotic events have been reported retrospectively in patients with baseline cardiovascular risk factors during nilotinib treatment. We estimated the risk of developing atherosclerotic events in patients treated with first- or second-line nilotinib, with a median follow-up of 48 months, by retrospectively applying the SCORE chart proposed by the European Society of Cardiology (ESC) and evaluating risk factors at baseline (diabetes, obesity, smoking and hypertension). Overall, we enrolled 82 CP patients treated frontline (42 patients, at a dose of 300 mg BID) or after failure of other tyrosine kinase inhibitors (40 patients, treated with 400 mg BID). The SCORE chart is based on stratification by sex (male vs female), age (from 40 to 65 years), smoker vs non-smoker, systolic pressure (from 120 to 180 mm Hg) and cholesterol (recorded in mmol/L, corresponding to 150 to 300 mg/dL). For statistical purposes, patients were subdivided into low, moderate, high and very high risk categories. There were 48 males and 34 females, with a median age of 51 years (range 22-84). According to the WHO classification, 42 patients were classified as normal weight (BMI <25), 26 patients were overweight (BMI 26 to <30) and 14 were obese (BMI >30). Retrospective classification according to the SCORE chart revealed that 27 patients (33%) were in the low risk category, 30 patients (36%) in the moderate risk category and 24 patients (29%) in the high risk category. Regarding risk factors, 17 patients (20.7%) had concomitant controlled type II diabetes (without organ damage), 23 patients (28%) were smokers, 29 patients (35%) were receiving concomitant drugs for hypertension and 15 patients (18%) had concomitant dyslipidaemia. Overall, the cumulative incidence of atherosclerotic events at 48 months was 8.5% (95% CI: 4.55-14.07): none of the low-risk patients according to the SCORE chart experienced atherosclerotic events, compared with 10% in the moderate risk and 29% in the high risk category (p=0.002). Atherosclerotic event-free survival was 100%, 89% and 69% in the low, moderate and high-risk populations, respectively (p=0.001). SCORE chart evaluation at disease baseline could be a valid tool to identify patients at high risk of atherosclerotic events during nilotinib treatment. Disclosures Breccia: Novartis: Consultancy; BMS: Consultancy; Celgene: Consultancy.
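A minimal sketch of the four-way classification used above, mapping a SCORE estimate of 10-year cardiovascular mortality to a risk group; the percentage thresholds follow the usual ESC convention and are an assumption here, since the abstract only names the categories.

```python
def score_risk_category(ten_year_cv_death_risk_pct: float) -> str:
    """Map a SCORE 10-year cardiovascular mortality estimate (in %) to the
    four categories used above. Thresholds (<1% low, 1-<5% moderate,
    5-<10% high, >=10% very high) follow the common ESC convention and
    are assumed, not stated in the abstract."""
    if ten_year_cv_death_risk_pct < 1:
        return "low"
    if ten_year_cv_death_risk_pct < 5:
        return "moderate"
    if ten_year_cv_death_risk_pct < 10:
        return "high"
    return "very high"
```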


Blood ◽  
2013 ◽  
Vol 122 (21) ◽  
pp. 104-104
Author(s):  
Paola Guglielmelli ◽  
Terra L. Lasho ◽  
Flavia Biamonte ◽  
Johannah Score ◽  
Carmela Mannarelli ◽  
...  

Abstract Background The median overall survival (OS) of patients with primary myelofibrosis (PMF; 6.5y) is significantly shortened compared with the reference population (Cervantes et al, JCO 2012; 30:2981). OS is predicted by the four risk categories of the IPSS, dynamic IPSS (DIPSS) and DIPSS-plus scores; nevertheless, heterogeneity among pts persists within these categories, necessitating improved risk stratification. We recently reported that pts harboring mutations in any one of the prognostically relevant ASXL1, EZH2, IDH1/2 and SRSF2 genes constitute an IPSS- and DIPSS-plus-independent molecular high risk category (MHR+) characterized by a significant reduction of OS and leukemia-free survival (LFS) (Vannucchi et al, Leukemia 2013). The aim of this work was to analyze the impact of the number of prognostically relevant mutated genes on OS and LFS in PMF. Patients and methods Two independent cohorts were included: a “test cohort” from Europe, analyzed at diagnosis, and a “validation cohort” from the Mayo Clinic, analyzed at any time after diagnosis. Mutation analysis was performed in DNA from whole blood or granulocytes using RTQ-PCR, HRM and direct sequencing; all mutations were confirmed at least twice. The prognostic value of the molecular variables with regard to OS and LFS was analyzed by Cox regression. Test cohort This cohort included 490 pts (median age 61y; males = 301) risk stratified by IPSS into high (n=74, 15%), intermediate-2 (n=93, 19%), intermediate-1 (n=147, 30%) and low (n=176, 36%). The median follow-up was 3.63y (95% CI, 0.06-28.33); 161 pts died (33.0%), of whom 76 (15.6%) had progressed to acute leukemia after a median of 3.4y (0.04-28.3) from diagnosis. One hundred forty-six pts (29.8%) presented at least one of the four aforementioned mutated genes and were classified as MHR+. The OS of MHR+ pts was significantly reduced compared with patients with no mutations (n=344): 80.7 vs 148.9 mo (HR 2.2, 95% CI 1.6-3.03). One hundred twelve pts (22.8%) had 1 mutation and 34 (6.9%) had 2 or more mutations. In univariate analysis, the presence of 2 or more mutated genes was significantly more detrimental for OS (29.5 mo; HR 4.12, 95% CI 2.6-6.4) than having 1 mutated gene (84.2 mo; HR 1.8, 95% CI 1.2-2.5), with no mutations as the reference (148.9 mo). Multivariate analysis adjusted for IPSS indicated that having two or more mutations was an independent prognostic factor for OS (HR 2.9, 95% CI 1.8-4.5). Notably, the prognostic relevance of harboring two or more mutations involved both the lower (low and intermediate-1; HR 1.87, 95% CI 1.3-2.6) and higher (intermediate-2 and high; HR 1.6, 95% CI 1.2-2.1) categories of the IPSS. LFS was also significantly shorter (129 mo; HR 3.1, 95% CI 1.9-4.8) in MHR+ pts compared with pts with no adverse mutations (323 mo). Having 2 or more mutations correlated with a greater reduction of LFS (79.6 mo; HR 7.02, 95% CI 3.9-12.6) than having one mutation only (133.9 mo; HR 2.2, 95% CI 1.3-3.7), as compared with no mutations (323.2 mo). Multivariate analysis showed that the IPSS high risk category (HR 4.5; 95% CI 2.3-8.8) and two or more mutated genes (HR 5.3; 95% CI 2.9-9.8) independently predicted a significant reduction in LFS. The negative impact on LFS of having 2 or more mutated genes compared with one or no mutations was maintained in the lower (HR 6.4, 95% CI 2.6-15.2) and higher (HR 5.5, 95% CI 2.3-12.8) risk categories. 
Validation cohort The validation cohort included 262 patients (median age 64 years; males = 167) risk stratified by DIPSS-plus into high (n=89, 34%), intermediate-2 (n=92, 35%), intermediate-1 (n=46, 18%) and low (n=34, 13%). One hundred forty-six pts (56%) displayed none of the four aforementioned mutations, 93 (36%) harbored one mutation and 23 (9%) harbored two or more mutations. In univariate analysis, having two or more mutations (HR 3.7; 95% CI 2.2-6.1) or a single mutation (HR 1.9; 95% CI 1.4-2.7) was significantly more detrimental for OS than having no mutations. When adjusted for DIPSS-plus, the presence of two or more mutations retained its significance (HR 2.1; 95% CI 1.3-3.6) and outperformed ASXL1 mutation alone in its prognostic relevance. Conclusions Overall, these results show that the number of prognostically relevant mutated genes correlates with OS and LFS in pts with PMF, suggesting that screening for these mutations might help to improve risk stratification. Disclosures: No relevant conflicts of interest to declare.
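A minimal Python sketch of the molecular classification used in both cohorts above: a patient is MHR+ if any of the prognostically relevant genes is mutated, and the number of mutated genes (0, 1, or 2 or more) carries the additional prognostic weight reported. The gene symbols come from the abstract; counting IDH1 and IDH2 separately is an assumption.

```python
PROGNOSTIC_GENES = ("ASXL1", "EZH2", "IDH1", "IDH2", "SRSF2")


def mutation_burden(mutated_genes: set[str]) -> tuple[int, str]:
    """Count prognostically relevant mutations and assign the molecular risk group."""
    n = sum(gene in mutated_genes for gene in PROGNOSTIC_GENES)
    if n == 0:
        group = "no adverse mutations"
    elif n == 1:
        group = "MHR+ (1 mutated gene)"
    else:
        group = "MHR+ (2 or more mutated genes)"
    return n, group


# Example: ASXL1 and SRSF2 mutated; JAK2 is not in the prognostic panel.
print(mutation_burden({"ASXL1", "SRSF2", "JAK2"}))  # -> (2, 'MHR+ (2 or more mutated genes)')
```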


2021 ◽  
Vol 19 (1) ◽  
Author(s):  
Heba Alshaker ◽  
Robert Mills ◽  
Ewan Hunter ◽  
Matthew Salter ◽  
Aroul Ramadass ◽  
...  

Abstract Background Current diagnostic blood tests for prostate cancer (PCa) are unreliable for early-stage disease, resulting in numerous unnecessary prostate biopsies in men with benign disease and false reassurance from negative biopsies in men with PCa. Predicting the risk of PCa is pivotal for making an informed decision on treatment options, as the 5-year survival rate in the low-risk group is more than 95% and most of these men would benefit from surveillance rather than active treatment. Three-dimensional genome architecture and chromosome structures undergo early changes during tumourigenesis, both in tumour cells and in circulating cells, and can serve as disease biomarkers. Methods In this prospective study we screened whole blood of newly diagnosed, treatment-naïve PCa patients (n = 140) and cancer-free controls (n = 96) for the presence of 14,241 chromosomal loops in the loci of 425 genes. Results We detected specific chromosome conformation changes in the loci of the ETS1, MAP3K14, SLC22A3 and CASP2 genes in peripheral blood from PCa patients, yielding PCa detection with 80% sensitivity and 80% specificity. Further analysis between PCa risk groups yielded prognostic validation sets consisting of the HSD3B2, VEGFC, APAF1, BMP6, ERG, MSR1, MUC1, ACAT1 and DAPK1 genes that achieved 80% sensitivity and 93% specificity in stratifying high-risk (category 3) versus low-risk (category 1) disease, and 84% sensitivity and 89% specificity in stratifying high-risk (category 3) versus intermediate-risk (category 2) disease. Conclusions Our results demonstrate specific chromosome conformations in the blood of PCa patients that allow PCa diagnosis and risk stratification with high sensitivity and specificity.
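For reference, the sensitivity and specificity figures above follow directly from a classifier's confusion counts; a minimal sketch with illustrative counts chosen only to be consistent with the reported 80%/80% detection performance in 140 patients and 96 controls.

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int) -> tuple[float, float]:
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)


# Illustrative counts only, not the study's actual confusion matrix.
print(sensitivity_specificity(tp=112, fn=28, tn=77, fp=19))  # -> (0.8, ~0.80)
```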

