Pharmacokinetics of Cyclophosphamide Metabolites Influence Outcome in Patients with β-Thalassemia Major Undergoing Allogeneic HSCT

Blood ◽  
2011 ◽  
Vol 118 (21) ◽  
pp. 1941-1941
Author(s):  
Poonkuzhali Balasubramanian ◽  
Salamun Desire ◽  
John C Panetta ◽  
Kavitha M Lakshmi ◽  
Ezhilpavai Mohanan ◽  
...  

Abstract 1941 Busulfan in combination with cyclophosphamide (Cy) is a commonly used conditioning regimen for hematopoietic stem cell transplantation (HSCT) for various hematological diseases. Cy, a prodrug, undergoes hepatic biotransformation to 4-hydroxycyclophosphamide (4-HCy) and subsequently to its active metabolite, phosphoramide mustard (PM), and to carboxyethylphosphoramide mustard (CEPM), a nontoxic oxidation product of 4-HCy/aldophosphamide. Although toxic complications such as hemorrhagic cystitis (HC) and hepatic sinusoidal obstruction syndrome (SOS) have been associated with Cy metabolites such as acrolein and CEPM, respectively, there are no data correlating the pharmacokinetics (PK) of Cy, 4-HCy or CEPM with toxicity and outcome of HSCT for β-thalassemia major. The aim of the present study was to evaluate Cy, 4-HCy and CEPM PK and the influence of these PK parameters on clinical outcome in patients with β-thalassemia major undergoing HSCT. Between January 2001 and June 2009, out of 168 HSCTs for thalassemia major conditioned with the Bu/Cy regimen (including 8 second transplants), the 90 patients for whom PK samples were collected were included in this study. Cy was administered at 50 mg/kg/day for 4 days (days −5 to −2) following 4 days of busulfan (days −9 to −6). Peripheral blood samples were collected during Cy infusion at various time intervals, and plasma samples were stored for Cy, 4-HCy and CEPM PK analysis. Levels of Cy and 4-HCy were measured by high-performance liquid chromatography, and CEPM by a modified LC-MS/MS method. Population PK estimates were determined using non-linear mixed-effects modeling performed with Monolix (version 3.1, www.monolix.org). Specifically, a compartmental model with two compartments each for Cy, 4-HCy and CEPM was used to describe the data. Clinical outcome endpoints including graft rejection, event-free survival (EFS), overall survival (OS), transplant-related mortality (TRM), SOS and HC were evaluated using standard criteria. The influence of Cy and metabolite PK on clinical outcome endpoints was assessed using logistic regression analysis. The age range of the patients was 2 to 24 years. Four patients belonged to Lucarelli risk class I, 49 to class II and 37 to class III. Based on the risk stratification we have previously defined using liver size and age (high risk: age >7 years and liver size >5 cm; the rest as low risk; Mathews et al, 2007), 39 patients were low risk, 40 intermediate risk and 11 high risk. The overall incidence of OS, EFS, rejection, TRM, SOS and HC in this cohort was 77, 70, 14, 10, 18 and 31%, respectively. It should be noted that this does not completely represent the outcome of HSCT in thalassemia during this period, as only patients with available Cy PK data were included in the analysis. The high-risk patients had significantly reduced OS (RR 2.59; p=0.04) and EFS (RR 2.23; p=0.058) and an increased risk of TRM (RR 3.56; p=0.03) and HC (RR 3.19; p=0.036) compared to the others, while outcomes did not differ significantly by Lucarelli class except for an increased incidence of HC in Lucarelli class III patients (RR 2.6; p=0.04). On univariate analysis, Cy AUC was significantly higher in patients who developed HC than in those who did not (1887 ng*h/ml, range 679–8546, vs. 1544, range 662–4434; p=0.028). 4-HCy AUC was significantly lower in patients who died than in those who survived (median 5.172 µg*h/L, range 0.795–6.457, vs. 6.224, range 2.536–12.003; p=0.007); a similar but stronger association was seen with EFS. CEPM Cl/F was lower in patients who rejected their grafts (0.013 vs. 0.028 L/h/kg; p=0.016), while it was significantly higher in patients who developed SOS (median 0.029 vs. 0.013 L/h/kg; p=0.05). On forward stepwise multivariate analysis including all Cy and metabolite PK parameters, only 4-HCy AUC significantly influenced EFS and OS in these patients. We show here for the first time that Cy and metabolite PK influence HSCT outcome in a uniform cohort of patients with thalassemia major. However, given the complex metabolism of Cy, and because outcome was associated with metabolite PK rather than with levels of the parent compound, targeted dose adjustment of Cy to improve HSCT outcome may be more challenging than targeted dose adjustment of other drugs used in HSCT. Disclosures: No relevant conflicts of interest to declare.
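The analysis above relates metabolite exposure to clinical endpoints by first fitting a population PK model and then regressing outcomes on the resulting PK parameters. The following is a minimal, illustrative Python sketch of that second step only: a trapezoidal AUC computed from sparse concentration-time samples and a univariate logistic regression of AUC against a binary toxicity outcome. It uses entirely hypothetical data and is not the authors' two-compartment Monolix model.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

def trapezoidal_auc(times_h, conc):
    # Linear trapezoidal rule over the sampled interval (AUC in ng*h/ml).
    return float(np.sum((conc[1:] + conc[:-1]) / 2.0 * np.diff(times_h)))

# Hypothetical sampling times (h) and mono-exponential concentration profiles (ng/ml)
times = np.array([0.5, 1, 2, 4, 6, 8, 12, 24])
aucs = np.array([trapezoidal_auc(times, 2000 * np.exp(-times / rng.uniform(3, 8)))
                 for _ in range(90)])

# Hypothetical binary toxicity (e.g. hemorrhagic cystitis) whose risk rises with AUC
toxicity = rng.binomial(1, 0.15 + 0.5 * (aucs - aucs.min()) / (aucs.max() - aucs.min()))

# Univariate logistic regression: toxicity ~ AUC
fit = sm.Logit(toxicity, sm.add_constant(aucs)).fit(disp=False)
print(fit.params, fit.pvalues)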

Blood ◽  
2010 ◽  
Vol 116 (21) ◽  
pp. 518-518
Author(s):  
Poonkuzhali Balasubramanian ◽  
Salamun Desire ◽  
Pranavi Sugumaran ◽  
Kavitha ML ◽  
Aby Abraham ◽  
...  

Abstract 518 Targeted dosing of busulfan (Bu) has been shown to improve the outcome of allogeneic HSCT (aHSCT) in patients with leukemia. There are limited data on the correlation of Bu PK with outcome in patients with thalassemia major (TM) undergoing aHSCT. We have previously shown that the first-dose trough level of Bu (Cmin1) predicts graft rejection (Chandy et al, BMT 2005) and that Bu Css is significantly lower in patients with hepatic veno-occlusive disease (HVOD) (Srivastava et al, Blood 2004). The aim of the present study was to evaluate the correlation of Bu PK with outcome of HSCT in a larger cohort of patients and to evaluate the pharmacogenetic basis for the differences. We retrospectively analyzed oral Bu PK after the 1st and 13th doses of Bu in 255 of the 291 thalassemic patients who underwent aHSCT from matched related donors between 1991 and February 2010 at our centre. All patients received busulfan (at a dose of 14 or 16 mg/kg/day or 600 mg/m2/day) in combination with Cy (at a dose of 160 mg/kg for those >15 years or 200 mg/kg for all others) as part of the conditioning regimen. Bu levels were measured by HPLC as previously described (Poonkuzhali et al, 1999). We also analyzed GSTA1*1B, GSTM1 and GSTT1 deletion polymorphisms in these patients. Based on Lucarelli's risk stratification, 18/291 patients belonged to class I, 121/291 to class II and 151/291 to class III. The class III patients were further stratified into class III high-risk and low-risk based on age and liver size (high risk: age >7 years and liver size >5 cm; the rest as low risk; Mathews et al, 2007). None of the Bu PK parameters differed significantly either between Lucarelli classes or between class III high- and low-risk patients. For the entire group, EFS was 77%, OS 81%, NRM (non-rejection mortality) 15% and graft rejection 8.6%. Class III patients had significantly lower EFS (p=0.0007) and OS (p=0.0051) compared to class I and II. Bu Cmin1 (p=0.007), but not Bu Css1, was significantly lower in those who rejected their graft compared to those who did not. On quartile analysis, patients with Cmin1 <156 ng/ml had 30% (18/60) rejection compared to 8% (14/169) rejection in patients with Bu Cmin1 >156 ng/ml (RR=9.8; p=0.0001). Those with Bu Css1 in the lowest quartile also had a significant risk of rejection (14/57 with Css1 <490 vs. 18/169 with Css1 >490 ng/ml; RR=3.8, p=0.027), but the correlation was not as strong as that with Cmin1. On multivariate analysis of all the variables that significantly influenced aHSCT outcome in univariate analysis, only Lucarelli class III high risk (p=0.034), SGOT level (p=0.036) and Bu Cmin1 (p=0.0001) significantly influenced graft rejection. In addition, the GSTA1*1B homozygous variant genotype was significantly associated with higher Bu Cmin1 (p=0.008) and Css1 (p=0.009). Since Bu Cmin1 was significantly influenced by GSTA1*1B genotype, we compared the combined risk of Cmin1 <156 ng/ml and GSTA1*1B wild-type genotype. None of the 6 patients with Bu Cmin1 <156 ng/ml and GSTA1*1B homozygous mutant genotype rejected their graft. The incidence of graft rejection in patients with GSTA1*1B wild-type (17/113; 15%) and heterozygous (15/106; 9.4%) genotypes was higher than in those with the homozygous mutant genotype (1/34; 2.9%) (p=0.059). This is the largest available dataset on the PK of oral busulfan in patients with a single genetic disorder.
We conclude that Bu Cmin1 is a stronger predictor of graft rejection than Css1 and that it is influenced by the GSTA1*1B polymorphism in patients with β-thalassemia major undergoing aHSCT. Disclosures: Krishnamoorthy: INSERM U763: Employment, Research Funding.
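As an illustration of the quartile-style analysis described above, the short sketch below dichotomizes a hypothetical set of busulfan first-dose trough levels at the 156 ng/ml cut-off from the abstract and compares graft-rejection rates between the two groups with a crude risk ratio and Fisher's exact test. The data are simulated, and the published RR of 9.8 came from the authors' own analysis, so it is not reproduced here.

import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(1)
cmin1 = rng.lognormal(mean=np.log(400), sigma=0.7, size=255)    # ng/ml, hypothetical
rejection = rng.binomial(1, np.where(cmin1 < 156, 0.30, 0.08))  # higher risk below cut-off

low = cmin1 < 156
a = int(rejection[low].sum());  b = int(low.sum()) - a          # low trough: rejected / not
c = int(rejection[~low].sum()); d = int((~low).sum()) - c       # higher trough: rejected / not

crude_rr = (a / (a + b)) / (c / (c + d))
odds_ratio, p = fisher_exact([[a, b], [c, d]])
print(f"rejection {a}/{a+b} vs {c}/{c+d}; crude RR={crude_rr:.1f}; Fisher p={p:.4g}")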


Blood ◽  
2009 ◽  
Vol 114 (22) ◽  
pp. 659-659 ◽  
Author(s):  
Vikram Mathews ◽  
Kavitha M Lakshmi ◽  
Auro Viswabandya ◽  
Biju George ◽  
Mammen Chandy ◽  
...  

Abstract 659 An allogeneic stem cell transplant (SCT) remains the only curative option for patients with β thalassemia major (TM). Conventional risk stratification requires a liver biopsy prior to transplant, includes inadequate chelation therapy as a risk factor, which is poorly defined with currently available therapies, and results in a large, heterogeneous high-risk group. We have previously shown that survival can differ significantly between subsets within this high-risk group (BBMT 2007;13:889). From October 1991 to December 2008, 271 patients with TM underwent a SCT at our center. The median age was 7 years (range: 2-24) and there were 175 males (64.6%). 133 (49%) were conventional Class III patients. The myeloablative conditioning regimen in the majority was oral busulphan and cyclophosphamide with or without anti-lymphocyte globulin. GVHD prophylaxis was cyclosporine and short-course methotrexate. Two hundred and sixty-six (98%) received a bone marrow graft. At a median follow-up of 41 months (range: 0-209), the 5-year Kaplan-Meier estimates of overall survival (OS) and event-free survival (EFS) were 70.79±2.9% and 63.75±3%, respectively. On univariate analysis, factors associated with an adverse impact on EFS were patient age, donor age, liver size, serum AST level, serum ferritin level, number of packed cell transfusions received, spleen size and splenectomy. On multivariate analysis, only liver size (both 2-5 cm and >5 cm) retained its significant adverse impact. The remaining parameters that were significant on univariate analysis as continuous variables were further evaluated after dividing them into quartiles. On Cox regression analysis of the quartiles, only age retained significance in all quartiles, while the rest were significant only in the highest quartile. For the latter, the cut-off of the highest quartile was used to dichotomize the cohort into two groups for each parameter, and a score was assigned proportional to the relative risk (Table 1). The total score could therefore range from 0 to 13 for each patient. On multivariate analysis, none of these generated values, including age in the different quartiles, had an independent significant impact on EFS. They were still retained in the scoring system because of the established biological relevance of these parameters to transplant outcomes. Splenectomy was excluded in view of the small number of cases. Kaplan-Meier estimates of EFS were generated for each of these scores and clustered into groups. Three groups could be recognized: Group A with a score <3.5 (n=125 [46%]), Group B 3.5-7.5 (n=87 [32%]) and Group C >7.5 (n=59 [22%]). Figure 1 illustrates the Kaplan-Meier estimates of EFS and the cumulative risk of rejection for these groups, which were significantly different. There were 133 (49%) patients in this cohort who belonged to the conventional Lucarelli Class III subset. Of these, using the current risk stratification, 18 (13%) would be in Group A, 58 (44%) in Group B and 57 (43%) in Group C. The proposed risk stratification does not require a liver biopsy, has a good distribution of cases across the defined groups and better identifies a high-risk subset of patients than the conventional risk stratification system. This high-risk subset may need innovative strategies for improving outcomes following an allogeneic SCT.
The proposed risk stratification system will need to be validated prospectively.

Table 1. Pre-transplant variables and score categories (scores of 0, 1, 1.5, 2 or 4 were assigned in proportion to relative risk; categories are listed in order of increasing score):
Liver size (cm): <2, 2-5, >5
Age (years): <5, 5-7, 7-11, >11
Packed cell transfusions (units): <90, >90
Serum AST (IU/L): <75, >75
Serum ferritin (ng/ml): <3500, >3500
Spleen size (cm): <3.0, >3.0

Disclosures: No relevant conflicts of interest to declare.
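As a rough illustration of how the score in Table 1 could be applied and the groups compared, the sketch below assigns a per-patient score from the published cut-offs, splits patients into Groups A/B/C at 3.5 and 7.5, and compares event-free survival with Kaplan-Meier estimates and a log-rank test (lifelines). The individual category weights and all patient data are placeholders, since the exact weight assigned to each category is not fully recoverable from the abstract.

import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

def risk_score(row):
    # Placeholder weights: cut-offs follow Table 1, weights are illustrative only.
    score = {0: 0, 1: 2, 2: 4}[int(np.digitize(row.liver_cm, [2, 5]))]
    score += {0: 0, 1: 1, 2: 1.5, 3: 2}[int(np.digitize(row.age_y, [5, 7, 11]))]
    score += 1 if row.transfusions > 90 else 0
    score += 2 if row.ast_iu_l > 75 else 0
    score += 2 if row.ferritin_ng_ml > 3500 else 0
    score += 2 if row.spleen_cm > 3.0 else 0
    return score

rng = np.random.default_rng(2)
n = 271
df = pd.DataFrame({
    "liver_cm": rng.uniform(0, 8, n), "age_y": rng.uniform(2, 24, n),
    "transfusions": rng.integers(20, 200, n), "ast_iu_l": rng.uniform(20, 150, n),
    "ferritin_ng_ml": rng.uniform(500, 6000, n), "spleen_cm": rng.uniform(0, 6, n),
    "efs_months": rng.exponential(60, n), "event": rng.binomial(1, 0.35, n),
})
df["group"] = pd.cut(df.apply(risk_score, axis=1), [-0.1, 3.5, 7.5, 13.1], labels=list("ABC"))

kmf = KaplanMeierFitter()
for g, sub in df.groupby("group", observed=True):
    kmf.fit(sub.efs_months, sub.event, label=f"Group {g}")
    print(f"Group {g}: n={len(sub)}, 5-year EFS ~ {kmf.predict(60):.2f}")

print("log-rank p =", multivariate_logrank_test(df.efs_months, df.group, df.event).p_value)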


2017 ◽  
Vol 117 (07) ◽  
pp. 1432-1439 ◽  
Author(s):  
Joseph S. Biedermann ◽  
Willem M. H. Rademacher ◽  
Hendrika C. A. M. Hazendonk ◽  
Denise E. van Diermen ◽  
Frank W. G. Leebeek ◽  
...  

Summary: Patients on vitamin K antagonists (VKA) often undergo invasive dental procedures. International guidelines consider all dental procedures to be low-risk procedures, while bleeding risk may differ between standard low-risk (e.g. extraction of 1–3 elements) and extensive high-risk (e.g. extraction of >3 elements) procedures. Therefore, current guidelines may need refinement. In this cohort study, we identified predictors of oral cavity bleeding (OCB) and evaluated clinical outcome after low-risk and high-risk dental procedures in patients on VKA. Perioperative management strategy, procedure risk, and 30-day outcomes were assessed for each procedure. We identified 1845 patients undergoing 2004 low-risk and 325 high-risk procedures between 2013 and 2015. OCB occurred after 67/2004 (3.3%) low-risk and 21/325 (6.5%) high-risk procedures (p=0.006). In low-risk procedures, VKA continuation with tranexamic acid mouthwash was associated with a lower OCB risk compared to continuation without mouthwash [OR=0.41, 95% CI 0.23–0.73] or interruption with bridging [OR=0.49, 95% CI 0.24–1.00], and a similar risk to interruption without bridging [OR=1.44, 95% CI 0.62–3.64]. In high-risk procedures, VKA continuation was associated with an increased OCB risk compared to interruption [OR=3.08, 95% CI 1.05–9.04]. Multivariate analyses revealed bridging, antiplatelet therapy, and a supratherapeutic or unobjectified INR before the procedure as the strongest predictors of OCB. Non-oral cavity bleeding (NOCB) and thromboembolic event (TE) rates were 2.1% and 0.2%. Bridging therapy was associated with a two-fold increased risk of NOCB [OR=1.93, 95% CI 1.03–3.60], but not with lower TE rates. In conclusion, predictors of OCB were mostly related to perioperative management and differed between low-risk and high-risk procedures. Perioperative management should be differentiated accordingly.
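The odds ratios above come from the authors' models; as a simple illustration of how a crude odds ratio and Wald 95% confidence interval are derived from a 2x2 table of bleeding counts, here is a short sketch with hypothetical counts for two management strategies.

import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    # 2x2 table: [bleed, no bleed] for the exposed group (a, b) and the comparator (c, d).
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical: VKA continued with tranexamic acid mouthwash vs. continued without it
print(odds_ratio_ci(a=20, b=780, c=47, d=1157))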


Blood ◽  
2006 ◽  
Vol 108 (11) ◽  
pp. 2996-2996
Author(s):  
Vikram Mathews ◽  
Biju George ◽  
Kavitha M. Lakshmi ◽  
Auro Viswabandya ◽  
Mammen Chandy ◽  
...  

Abstract The current risk stratification of patients with β thalassemia major undergoing an allogeneic stem cell transplantation (SCT) is based on liver size (>2 cm), presence of liver fibrosis and inadequate iron chelation (Lucarelli et al, NEJM 1990). Our clinical observation suggested that patients in Class III (presence of all three adverse features) were a heterogeneous group and included a large number of patients who would otherwise have a good prognosis. We therefore undertook a retrospective analysis to study the pre-transplant variables that have an impact on outcome. Between 1991 and 2005, 189 patients underwent 196 HLA-matched related allogeneic SCTs for a diagnosis of β thalassemia major at our center. Except for two cases, all patients were less than 18 years of age at the time of transplant. The majority (97.5%) of patients received a myeloablative (BuCy) conditioning regimen. The median (±SD) age of this cohort was 7±4.1 years with 68% males. There were 11 (5.6%), 81 (41.1%) and 105 (53.3%) patients in Lucarelli Class I, II and III, respectively. The Kaplan-Meier 5-year event-free survival (event defined as rejection, relapse or death) for Class II and III patients was 78.53±4.53 and 51.97±5.14, respectively. Table 1 summarizes the impact of pre-transplant variables on EFS. Patient age and liver size as continuous variables were significantly associated with an adverse outcome. Using a receiver operating characteristic (ROC) curve analysis, cut-off values of 7 years and 5 cm for age and liver size, respectively, gave the highest likelihood ratios for an adverse effect on EFS (1.6 and 2.7, respectively). These cut-off values significantly discriminated patients' EFS on univariate analysis.
Table 1: Unadjusted adverse effect of pre-transplant variables on EFS
Pre-transplant variable | RR (95% CI) | P-value
Age (≥7 years) | 2.9 (1.6–5.2) | 0.000
Sex (F) | 1.5 (0.9–2.6) | 0.082
F>M transplant | 0.9 (0.5–1.5) | 0.715
Liver size (≥5 cm) | 3.5 (2.1–5.9) | 0.000
Chelation (inadequate) | 2.9 (0.7–12.2) | 0.130
Liver fibrosis (yes) | 1.7 (0.8–3.3) | 0.106
SGPT | 1.0 (1–1.006) | 0.080
Ferritin | 1 (0.8–1.2) | 0.056
On forward stepwise multivariate analysis, only age ≥7 years and liver size ≥5 cm retained their significance (RR 2.2 and 3.6, P-values 0.014 and 0.000, respectively). Using these two variables, patients were categorized as high risk if they were ≥7 years of age and had a liver size ≥5 cm. There were 41 cases in this subgroup (all were Class III). The 5-year EFS and OS in this high-risk group (n=41) were 23.93±6.88 and 39.01±7.96, respectively, while in the remaining Class III patients (n=64) the 5-year EFS and OS were 73.23±5.56 and 81.22±4.89. Statistical analysis of these survival curves by a log-rank test showed that both differences were statistically significant (P=0.000 for both EFS and OS). The majority of the events in the high-risk group occurred in the first 100 days [TRM=17 (41.4%), rejection=3 (7.3%) and death from GVHD=3 (7.3%)]. Using age ≥7 years and liver size ≥5 cm, we were able to identify a significant subset of patients in class III (39%) who have a poor outcome with allogeneic SCT and could benefit from novel approaches, while the others, with clinical outcomes comparable to those in class II, should probably be classified with them and managed accordingly.
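As an illustration of the cut-off selection described above, the sketch below scans candidate thresholds for a continuous pre-transplant variable and picks the one maximizing the positive likelihood ratio, LR+ = sensitivity / (1 - specificity), for an adverse event. The data are hypothetical and this is not the authors' ROC analysis.

import numpy as np

rng = np.random.default_rng(3)
liver_cm = rng.uniform(0, 10, 189)
event = rng.binomial(1, np.clip(liver_cm / 12, 0.05, 0.9))   # hypothetical: risk rises with liver size

def best_cutoff_by_lr(x, y, candidates):
    best = None
    for c in candidates:
        tp = np.sum((x >= c) & (y == 1)); fn = np.sum((x < c) & (y == 1))
        fp = np.sum((x >= c) & (y == 0)); tn = np.sum((x < c) & (y == 0))
        sens, spec = tp / (tp + fn), tn / (tn + fp)
        if spec < 1:                                          # avoid division by zero
            lr_pos = sens / (1 - spec)
            if best is None or lr_pos > best[1]:
                best = (c, lr_pos)
    return best

print(best_cutoff_by_lr(liver_cm, event, candidates=np.arange(1, 9, 0.5)))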


BMJ Open ◽  
2021 ◽  
Vol 11 (2) ◽  
pp. e043837
Author(s):  
Usha Dutta ◽  
Anurag Sachan ◽  
Madhumita Premkumar ◽  
Tulika Gupta ◽  
Swapnajeet Sahoo ◽  
...  

Objectives: Healthcare personnel (HCP) are at an increased risk of acquiring COVID-19 infection, especially in resource-restricted healthcare settings, and return to homes unfit for self-isolation, making them apprehensive about COVID-19 duty and the risk of transmission to their families. We aimed to implement a novel, multidimensional, HCP-centric, evidence-based, dynamic policy with the objectives of reducing the risk of HCP infection, ensuring the welfare and safety of the HCP, and improving willingness to accept and return to duty. Setting: Our tertiary care university hospital, with 12 600 HCP, was divided into high-risk, medium-risk and low-risk zones. In the high-risk and medium-risk zones, we organised training, logistic support and postduty HCP welfare, collected feedback, and sent HCP home after they tested negative for COVID-19. We supervised the use of appropriate personal protective equipment (PPE) and kept communication paperless. Participants: We recruited willing low-risk HCP, aged <50 years, with no comorbidities to work in COVID-19 zones. Social distancing, hand hygiene and universal masking were advocated in the low-risk zone. Results: Between 31 March and 20 July 2020, we clinically screened 5553 outpatients, of whom 3012 (54.2%) were COVID-19 suspects managed in the medium-risk zone. Among them, 346 (11.4%) tested COVID-19 positive (57.2% male) and were managed in the high-risk zone, with 19 (5.4%) deaths. One (0.08%) of the 1224 HCP in the high-risk zone, 6 (0.62%) of the 960 HCP in the medium-risk zone and 23 (0.18%) of the 12 600 HCP in the low-risk zone tested positive at the end of their shift. All 30 COVID-19-positive HCP have since recovered. This HCP-centric policy resulted in low transmission rates (<1%), ensured satisfaction with training (92%), PPE (90.8%), and medical and psychosocial support (79%), and improved acceptance of COVID-19 duty, with 54.7% volunteering for re-deployment. Conclusion: A multidimensional HCP-centric policy was effective in ensuring the safety, satisfaction and welfare of HCP in a resource-poor setting and resulted in a willing workforce to fight the pandemic.


2019 ◽  
Author(s):  
J. Tremblay ◽  
M. Haloui ◽  
F. Harvey ◽  
R. Tahir ◽  
F.-C. Marois-Blanchet ◽  
...  

Abstract: Type 2 diabetes increases the risk of cardiovascular and renal complications, but early risk prediction can lead to timely intervention and better outcomes. Using summary statistics from meta-analyses of published genome-wide association studies performed in over 1.2 million individuals, we combined 9 polygenic risk scores (PRS) capturing genomic variants associated with cardiovascular and renal diseases and their key risk factors into one logistic regression model to predict micro- and macrovascular endpoints of diabetes. Its clinical utility in predicting complications of diabetes was tested in 4098 participants with diabetes of the ADVANCE trial followed over a period of 10 years and replicated in three independent non-trial cohorts. The prediction model, adjusted for ethnicity, sex, age at onset and diabetes duration, identified the top 30% of ADVANCE participants as being at 3.1-fold increased risk of major micro- and macrovascular events (p=6.3×10−21 and p=9.6×10−31, respectively) and at 4.4-fold (p=6.8×10−33) increased risk of cardiovascular death compared to the remainder of T2D subjects. While in ADVANCE overall, combined intensive therapy of blood pressure and glycaemia decreased cardiovascular mortality by 24%, the prediction model identified a high-risk group in whom this therapy decreased mortality by 47%, and a low-risk group in whom the therapy had no discernible effect. Patients with a high PRS had the greatest absolute risk reduction, with a number needed to treat of 12 to prevent one cardiovascular death over 5 years. This novel polygenic prediction model identified people with diabetes at low and high risk of complications and improved the targeting of those who would benefit most from intensive therapy while avoiding unnecessary intensification in low-risk subjects.
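As a worked example of the number-needed-to-treat figure quoted above, the sketch below computes NNT as the reciprocal of the absolute risk reduction. The 5-year risk values are made-up placeholders, chosen only to show that an absolute reduction of about 8 percentage points corresponds to an NNT of roughly 12.

def nnt(risk_control, risk_treated):
    # Number needed to treat = 1 / absolute risk reduction.
    return 1.0 / (risk_control - risk_treated)

# Hypothetical 5-year cardiovascular mortality in a high-PRS group: 17% without vs 9% with intensive therapy
print(f"NNT ~ {nnt(0.17, 0.09):.1f}")   # -> 12.5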


2007 ◽  
Vol 14 (9) ◽  
pp. 1102-1107 ◽  
Author(s):  
Richard M. Novak ◽  
Betty A. Donoval ◽  
Parrie J. Graham ◽  
Lucy A. Boksa ◽  
Gregory Spear ◽  
...  

ABSTRACT Innate immune factors in mucosal secretions may influence human immunodeficiency virus type 1 (HIV-1) transmission. This study examined the levels of three such factors, genital tract lactoferrin [Lf], secretory leukocyte protease inhibitor [SLPI], and RANTES, in women at risk for acquiring HIV infection, as well as cofactors that may be associated with their presence. Women at high risk for HIV infection meeting established criteria (n = 62) and low-risk controls (n = 33) underwent cervicovaginal lavage (CVL), and the CVL fluid samples were assayed for Lf and SLPI. Subsets of 26 and 10 samples, respectively, were assayed for RANTES. Coexisting sexually transmitted infections and vaginoses were also assessed, and detailed behavioral information was collected. Lf levels were higher in high-risk (mean, 204 ng/ml) versus low-risk (mean, 160 ng/ml, P = 0.007) women, but SLPI levels did not differ, and RANTES levels were higher in only the highest-risk subset. Lf was positively associated only with the presence of leukocytes in the CVL fluid (P < 0.0001). SLPI levels were lower in women with bacterial vaginosis [BV] than in those without BV (P = 0.04). Treatment of BV reduced RANTES levels (P = 0.05). The influence, if any, of these three cofactors on HIV transmission in women cannot be determined from this study. The higher Lf concentrations observed in high-risk women were strongly associated with the presence of leukocytes, suggesting a leukocyte source and consistent with greater genital tract inflammation in the high-risk group. Reduced SLPI levels during BV infection are consistent with an increased risk of HIV infection, which has been associated with BV. However, the increased RANTES levels in a higher-risk subset of high-risk women were reduced after BV treatment.
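For illustration, comparing an analyte such as lactoferrin between the high-risk and low-risk groups can be done with a nonparametric Mann-Whitney U test, as in the sketch below. The values are simulated and do not reproduce the study's measurements.

import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(4)
lf_high_risk = rng.gamma(shape=4.0, scale=50.0, size=62)   # hypothetical Lf, mean ~200 ng/ml
lf_low_risk = rng.gamma(shape=4.0, scale=40.0, size=33)    # hypothetical Lf, mean ~160 ng/ml

stat, p = mannwhitneyu(lf_high_risk, lf_low_risk, alternative="two-sided")
print(f"Mann-Whitney U={stat:.0f}, p={p:.3f}")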


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
C Van Der Aalst ◽  
S.J.A.M Denissen ◽  
M Vonder ◽  
J.-W.C Gratema ◽  
H.J Adriaansen ◽  
...  

Abstract. Aims: Screening for a high cardiovascular disease (CVD) risk followed by preventive treatment can potentially reduce coronary heart disease (CHD)-related morbidity and mortality. ROBINSCA (Risk Or Benefit IN Screening for CArdiovascular disease) is a population-based randomized controlled screening trial that investigates the effectiveness of CVD screening in asymptomatic participants using the Systematic COronary Risk Evaluation (SCORE) model or Coronary Artery Calcium (CAC) scoring. This study describes the distributions of risk and treatment in the ROBINSCA trial. Methods and results: Individuals at expected elevated CVD risk were randomized (1:1:1) into the control arm (n=14,519; usual care), screening arm A (n=14,478; SCORE, 10-year fatal and non-fatal risk) or screening arm B (n=14,450; CAC scoring). Preventive treatment was largely advised according to current Dutch guidelines. Risk and treatment differences between the screening arms were analysed. 12,185 participants (84.2%) in arm A and 12,950 (89.6%) in arm B were screened. 48.7% were women, and the median age was 62 (interquartile range 10) years. SCORE screening identified 45.1% at low risk (SCORE <10%), 26.5% at intermediate risk (SCORE 10–20%), and 28.4% at high risk (SCORE ≥20%). According to CAC screening, 76.0% were at low risk (Agatston <100), 15.1% at high risk (Agatston 100–399), and 8.9% at very high risk (Agatston ≥400). CAC scoring significantly reduced the number of individuals indicated for preventive treatment compared to SCORE (relative reduction women: 37.2%; men: 28.8%). Conclusion: Compared to risk stratification based on SCORE, CAC scoring classified significantly fewer men and women as being at increased risk, and less preventive treatment was indicated. Funding Acknowledgement: Type of funding source: Public grant(s) – EU funding. Main funding source(s): Advanced Research Grant
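For reference, the risk categories quoted above follow fixed cut-offs. The sketch below bins hypothetical SCORE percentages and Agatston scores into those categories; the thresholds are taken from the abstract, but the data are simulated.

import numpy as np
import pandas as pd

def score_category(score_pct):
    return pd.cut(score_pct, [-np.inf, 10, 20, np.inf], right=False,
                  labels=["low (<10%)", "intermediate (10-20%)", "high (>=20%)"])

def cac_category(agatston):
    return pd.cut(agatston, [-np.inf, 100, 400, np.inf], right=False,
                  labels=["low (<100)", "high (100-399)", "very high (>=400)"])

rng = np.random.default_rng(5)
df = pd.DataFrame({"score_pct": rng.uniform(2, 35, 1000),
                   "agatston": rng.exponential(120, 1000)})
print(score_category(df.score_pct).value_counts(normalize=True).round(2))
print(cac_category(df.agatston).value_counts(normalize=True).round(2))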


2020 ◽  
Vol 154 (Supplement_1) ◽  
pp. S8-S9
Author(s):  
Nicholas E Larkey ◽  
Leslie J Donato ◽  
Allan S Jaffe ◽  
Jeffrey W Meeusen

Abstract Plasma concentrations of low-density lipoprotein cholesterol (LDL-C) are directly associated with risk for coronary artery disease (CAD). Multisociety guidelines define LDL-C >160 mg/dL as a risk factor for CAD and LDL-C >190 mg/dL as an indication for lipid-lowering medication, regardless of other clinical factors. Subfractionation of LDL according to size (LDL-s) enables differentiation between two LDL phenotypes: large-buoyant LDL and small-dense LDL. The small-dense LDL phenotype reportedly conveys increased risk for CAD. Major societies do not recommend that LDL subfractions be used for clinical decision making, and most payers do not cover LDL subfraction testing. Despite these restrictions, LDL subfraction testing is routinely requested by clinicians. Nuclear magnetic resonance (NMR) spectroscopy measures LDL-C and LDL-s. Following inquiries regarding interpretation of conflicting LDL-C and LDL-s results, we investigated associations between LDL-C and LDL-s measured by NMR in order to determine how often they provide contradictory or additive information. Verification of NMR LDL-C accuracy was confirmed by β-quantification in a subset of patient samples (n=250); the average bias was -4.5 mg/dL and the correlation coefficient was 0.92. High risk was defined as LDL-C >160 mg/dL or LDL-s <20.5 nm (small-dense LDL), and low risk was defined as LDL-C <70 mg/dL or LDL-s >20.5 nm (large-buoyant LDL). In 26,710 clinical NMR analyses, the median LDL-C was 94.0 mg/dL (range: 5–436 mg/dL) with a median LDL-s of 20.8 nm (range: 19.4–23.0 nm). LDL-s correlated moderately with LDL-C (ρ=0.51; p<0.01). Small-dense LDL was identified in only 18% (407/2,191) of patients with elevated LDL-C (>160 mg/dL) and was more common (73.2% of 6,093) in patients with low LDL-C (<70 mg/dL; p<0.001). Associations with CAD were investigated among patients not on cholesterol-lowering medication who were referred for angiography (n=356). CAD (defined as stenosis >50% in one or more coronary arteries) was diagnosed in 14% (1/7) of subjects with low LDL-C (<70 mg/dL) compared to 59% (47/80) of subjects with elevated LDL-C (p=0.01). When stratifying by LDL-s, CAD was diagnosed in 50% (57/115) of subjects with small-dense LDL compared to 43% (104/241) of subjects with large-buoyant LDL (p=0.2). Small-dense LDL was identified in only 33% (26/80) of cases with elevated LDL-C. Limiting the analysis to subjects with elevated LDL-C, CAD was diagnosed in 50% (13/26) of subjects with concordant (high-risk) small-dense LDL compared to 61% (33/54) of subjects with discordant (low-risk) large-buoyant LDL (LDL-s >20.5 nm) (p=0.3). Our data confirm that the LDL-s subfraction measured by NMR is reported discordantly in most cases when LDL-C is unequivocally high or low. Furthermore, CAD diagnosis was significantly associated with LDL-C, but not with LDL-s. Our data also show that in discrepant samples, elevated LDL-C correlates better with disease state than LDL-s does. Therefore, LDL-s should not be used to justify treatment decisions in patients with elevated LDL-C. Laboratories should consider carefully whether or not to report LDL-s when it is known that misleading and discordant values will be reported in a majority of cases.
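To make the concordant/discordant classification above concrete, the sketch below cross-tabulates hypothetical LDL-C values against LDL particle-size phenotype using the thresholds quoted in the abstract (>160 and <70 mg/dL for LDL-C; 20.5 nm for LDL-s). The data are simulated and are not the study's measurements.

import numpy as np
import pandas as pd

rng = np.random.default_rng(6)
df = pd.DataFrame({"ldl_c": rng.normal(110, 40, 5000).clip(5, 436),      # mg/dL, hypothetical
                   "ldl_size": rng.normal(20.8, 0.5, 5000).clip(19.4, 23.0)})  # nm, hypothetical

df["ldl_c_class"] = pd.cut(df.ldl_c, [-np.inf, 70, 160, np.inf], right=False,
                           labels=["low (<70)", "intermediate", "high (>160)"])
df["phenotype"] = np.where(df.ldl_size < 20.5, "small-dense", "large-buoyant")

# Row-normalized cross-tabulation: share of each LDL-C class falling in each phenotype
print(pd.crosstab(df.ldl_c_class, df.phenotype, normalize="index").round(2))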

