Impact of longer term phosphorus control on cardiovascular mortality in hemodialysis patients using an area under the curve approach: results from the DOPPS

2020, Vol 35 (10), pp. 1794-1801
Author(s): Marcelo Barreto Lopes, Angelo Karaboyas, Brian Bieber, Ronald L Pisoni, Sebastian Walpen, et al.

Abstract Background Serial assessment of phosphorus is currently recommended by the Kidney Disease: Improving Global Outcomes (KDIGO) guidelines, but its additional value versus a single measurement is uncertain. Methods We studied data from 17 414 hemodialysis (HD) patients in the Dialysis Outcomes and Practice Patterns Study, a prospective cohort study, and calculated the area under the curve (AUC) by multiplying the time spent with serum phosphorus >4.5 mg/dL over a 6-month run-in period by the extent to which this threshold was exceeded. We estimated the association between the monthly average AUC and cardiovascular (CV) mortality using Cox regression. We formally assessed whether the AUC was a better predictor of CV mortality than other measures of phosphorus control according to the Akaike information criterion. Results Compared with the reference group of AUC = 0, the adjusted hazard ratio (HR) of CV mortality was 1.12 [95% confidence interval (CI) 0.90–1.40] for AUC > 0–0.5, 1.26 (95% CI 0.99–1.62) for AUC > 0.5–1, 1.44 (95% CI 1.11–1.86) for AUC > 1–2 and 2.03 (95% CI 1.53–2.69) for AUC > 2. The AUC was predictive of CV mortality within strata of the most recent phosphorus level and had a better model fit than other serial measures of phosphorus control (mean phosphorus, months out of target). Conclusions Worse phosphorus control over a 6-month period was strongly associated with CV mortality: the less time phosphorus spent above 4.5 mg/dL, and the smaller the excursions above that threshold, the better the survival. Phosphorus AUC was a better predictor of CV death than the single most recent phosphorus level, supporting, with real-world data, KDIGO's recommendation of serial phosphorus assessment to guide clinical decisions.
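A minimal sketch of how such an exposure metric could be computed, assuming one phosphorus measurement per month over the 6-month run-in; the exact weighting used by the DOPPS analysis may differ:

```python
import numpy as np

THRESHOLD = 4.5  # mg/dL, the upper phosphorus target used in the abstract

def phosphorus_auc(monthly_phosphorus):
    """Monthly average area under the curve above 4.5 mg/dL.

    Each month contributes (P - 4.5) mg/dL x 1 month when phosphorus exceeds
    the threshold and 0 otherwise; the result is averaged over the months
    observed. This is a hypothetical reconstruction of the DOPPS AUC metric.
    """
    p = np.asarray(monthly_phosphorus, dtype=float)
    excess = np.clip(p - THRESHOLD, 0.0, None)  # mg/dL above target, per month
    return excess.mean()

def auc_category(auc):
    """Bin the AUC into the exposure groups reported in the abstract."""
    if auc == 0:
        return "0 (reference)"
    if auc <= 0.5:
        return ">0-0.5"
    if auc <= 1:
        return ">0.5-1"
    if auc <= 2:
        return ">1-2"
    return ">2"

# Example: six monthly phosphorus values (mg/dL) from the run-in period
auc = phosphorus_auc([4.2, 5.1, 6.0, 4.4, 5.6, 4.9])
print(round(auc, 2), auc_category(auc))  # 0.6, ">0.5-1"
```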

2020, Vol 35 (Supplement_3)
Author(s): Rebecca Winzeler, Patrice Max Ambühl

Abstract Background and Aims Anemia is highly prevalent in dialysis patients and is associated with increased morbidity and mortality. The purpose of the present analysis is to evaluate current anemia management in dialysis patients in Switzerland, using data collected in the Swiss Dialysis Registry (srrqap), which covers all dialysis patients in Switzerland. Method All medical establishments in Switzerland (both public and private; N=92) providing chronic hemodialysis and/or peritoneal dialysis treatment reported the relevant data for the year 2018. All individuals on chronic dialysis therapy in 2018 were enrolled (N=4646). To calculate survival probabilities, all deaths among incident dialysis patients between 2014 and 2018 were analyzed. Results 65% of all dialysis patients receive iron and EPO. Regardless of anemia management, 82% of patients reach target hemoglobin levels of at least 10 g/dL; in the remaining 18% of patients, inadequate management to reach Hb targets may be suspected. The distribution of iron and EPO substitution is similar across age groups, although 26% of the 20-44-year age group receive EPO but no iron, compared with only 15% in the other age groups. Survival analysis by Cox regression adjusted for age, Charlson score and treatment modality revealed that patients with Hb levels of 11 g/dL or higher have the best survival (reference group). In comparison, patients in the Hb categories below 9, 9-9.9 and 10-10.9 g/dL have hazard ratios for death of 3.9, 2.0 and 1.3, respectively. Conclusion Anemia management to reach Hb target levels following the KDIGO guidelines appears to be adequately implemented among dialysis patients in Switzerland. In 18% of patients, treatment might be optimized to achieve Hb targets. As expected, patients with Hb levels of 11 g/dL or higher have better survival than patients with lower Hb values.
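A sketch of the kind of adjusted Cox model described above, using the lifelines package on simulated registry-style data (all values hypothetical); the Hb category ≥11 g/dL is dummy-coded as the reference group:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500

# Simulated registry-style data: follow-up time (years), death indicator, and
# the adjustment covariates named in the abstract (age, Charlson score,
# treatment modality) plus Hb category.
df = pd.DataFrame({
    "years": rng.uniform(0.1, 5.0, size=n).round(2),
    "died": rng.integers(0, 2, size=n),
    "age": rng.normal(68, 12, size=n).round(),
    "charlson": rng.integers(2, 10, size=n),
    "hemodialysis": rng.integers(0, 2, size=n),  # 1 = HD, 0 = peritoneal dialysis
    "hb_cat": rng.choice(["<9", "9-9.9", "10-10.9", ">=11"], size=n, p=[0.1, 0.2, 0.3, 0.4]),
})

# Dummy-code Hb categories with >=11 g/dL as the reference group.
dummies = pd.get_dummies(df["hb_cat"], prefix="hb").drop(columns="hb_>=11").astype(float)
model_df = pd.concat([df.drop(columns="hb_cat"), dummies], axis=1)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="years", event_col="died")
cph.print_summary()  # exp(coef) = adjusted hazard ratio for each Hb category vs >=11 g/dL
```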


2020, Vol 41 (Supplement_2)
Author(s): V.L Malavasi, E Fantecchi, V Tordoni, L Melara, A Barbieri, et al.

Abstract Background The natural history of atrial fibrillation (AF) shows a progression of the arrhythmia from non-permanent to permanent AF, and permanent AF has been found to be associated with a worse prognosis than non-permanent AF. Aim To assess the factors associated with progression to permanent AF in an unselected population of patients with non-permanent AF. Methods In this prospective study we enrolled in- and out-patients with non-permanent AF, aged ≥18 years, with at least one episode of ECG-documented AF within 1 year. Patients were followed up at 1 month and every 6 months thereafter. Results Out of 523 patients, 314 (60%) were in non-permanent AF (80 [25.5%] paroxysmal AF, 165 [52.5%] persistent AF, 69 [22%] first-diagnosed AF), mostly male (188, 59.9%), median age 71 years (IQ range 62–77), median CHA2DS2VASc 3 (1–4), median HATCH score 1 (1–2). After a median follow-up of 701 (IQ range 437–902) days, 66 patients (21%) had progressed to permanent AF. CHA2DS2VASc and HATCH scores were incrementally associated with progression to permanent AF (CHA2DS2VASc χ2 p=0.001; HATCH χ2 p=0.017; p for trend <0.001 for CHA2DS2VASc and p=0.001 for HATCH). In multivariable Cox proportional hazards regression, the following variables were significantly associated with AF progression: age (hazard ratio [HR] 1.041; 95% CI: 1.004–1.079; p=0.028), at least moderate left atrial (LA) enlargement (>42 mL/m2) (HR 2.092; 95% CI: 1.132–3.866; p=0.018), antiarrhythmic drugs after enrollment (HR 0.087; 95% CI: 0.011–0.662; p=0.018), EHRA score >2 (HR 0.351; 95% CI: 0.158–0.779; p=0.010) and valvular heart disease (HR 2.161; 95% CI: 1.057–4.420; p=0.035). Adding LA dilation to the HATCH score (HATCH-LA) and assigning it 2 points based on the multivariable Cox regression yielded a score that was statistically better than the HATCH score in ROC analysis for predicting AF progression (area under the curve 0.695 vs 0.636; DeLong p=0.0225). Curves of freedom from permanent AF stratified by HATCH-LA score ≤2 vs >2 differed significantly (χ2=16.080, p<0.001), whereas the same was not found for the HATCH score (χ2=3.099; p=0.078). Conclusions In patients without permanent AF, progression of AF was independently related to age, LA dilation, AF symptom severity, antiarrhythmic drugs and valvular heart disease. The HATCH score predicted AF progression, and adding at least moderate LA dilation to it improved patient stratification for the risk of evolution to permanent AF. Funding Acknowledgement Type of funding source: None
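A sketch of the score-based stratification and the freedom-from-permanent-AF comparison, using lifelines on toy data; the HATCH items follow the commonly used definition (hypertension 1 point, age >75 1, prior TIA/stroke 2, COPD 1, heart failure 2) and the extra 2 points for at least moderate LA enlargement follow the abstract, but the patient values are hypothetical:

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

def hatch_la(hypertension, age, tia_stroke, copd, heart_failure, la_volume_index):
    """HATCH score plus 2 points for at least moderate LA enlargement (>42 mL/m2)."""
    hatch = hypertension + (age > 75) + 2 * tia_stroke + copd + 2 * heart_failure
    return hatch + (2 if la_volume_index > 42 else 0)

print(hatch_la(1, 78, 0, 0, 1, 45))  # -> 6

# Hypothetical follow-up: days to progression to permanent AF (or censoring).
df = pd.DataFrame({
    "days":      [700, 320, 880, 150, 640, 900, 410, 760],
    "permanent": [0,   1,   0,   1,   1,   0,   1,   0],
    "hatch_la":  [1,   4,   2,   5,   3,   0,   6,   2],
})
high = df["hatch_la"] > 2

kmf = KaplanMeierFitter()
for label, grp in [("HATCH-LA <=2", df[~high]), ("HATCH-LA >2", df[high])]:
    kmf.fit(grp["days"], event_observed=grp["permanent"], label=label)
    print(label, "freedom from permanent AF at 2 years:", round(kmf.predict(730), 2))

res = logrank_test(df.loc[~high, "days"], df.loc[high, "days"],
                   event_observed_A=df.loc[~high, "permanent"],
                   event_observed_B=df.loc[high, "permanent"])
print("log-rank chi-square:", round(res.test_statistic, 2), "p =", round(res.p_value, 4))
```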


2021, Vol 11 (1)
Author(s): Hongshuai Li, Jie Yang, Guohui Yang, Jia Ren, Yu Meng, et al.

Abstract Sarcoma is a rare malignancy with an unfavorable prognosis. Accumulating evidence indicates that aberrant alternative splicing (AS) events are generally involved in cancer pathogenesis. The aim of this study was to identify the prognostic value of AS-related survival genes as potential biomarkers and to highlight the functional roles of AS events in sarcoma. RNA-sequencing and AS-event datasets were downloaded from The Cancer Genome Atlas (TCGA) sarcoma cohort and TCGA SpliceSeq, respectively. Survival-related AS events were assessed using univariate analysis. A multivariate Cox regression analysis was then performed to establish a survival-gene signature to predict patient survival, and the area-under-the-curve method was used to evaluate prognostic reliability. KOBAS 3.0 and Cytoscape were used to functionally annotate AS-related genes and to assess their network interactions. We detected 40,184 AS events in 9674 genes from 236 sarcoma samples, and the 15 most significant genes were then used to construct a survival regression model. We further validated the involvement of ten potential survival-related genes (TUBB3, TRIM69, ZNFX1, VAV1, KCNN2, VGLL3, AK7, ARMC4, LRRC1, and CRIP1) in the occurrence and development of sarcoma. Multivariate survival model analyses confirmed that a model based on these ten genes provided good classification for predicting patient outcomes. The present study has increased our understanding of AS events in sarcoma, and the gene-based model using AS-related events may serve as a potential predictor of survival in sarcoma patients.
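A condensed sketch of the univariate screening step on simulated percent-spliced-in (PSI) values, using lifelines; the event names, scale, and data are all hypothetical, and the real analysis screened tens of thousands of events:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n_samples, n_events = 120, 30  # toy scale

# Simulated inputs: PSI values per AS event plus overall survival per sample.
psi = pd.DataFrame(rng.uniform(0, 1, size=(n_samples, n_events)),
                   columns=[f"AS_event_{i}" for i in range(n_events)])
surv = pd.DataFrame({"months": rng.uniform(1, 80, size=n_samples).round(1),
                     "death": rng.integers(0, 2, size=n_samples)})

# Univariate Cox screening: one model per AS event, keep the smallest p-values.
results = []
for col in psi.columns:
    cph = CoxPHFitter().fit(pd.concat([surv, psi[[col]]], axis=1),
                            duration_col="months", event_col="death")
    results.append((col, cph.summary.loc[col, "p"]))

screened = pd.DataFrame(results, columns=["event", "p"]).sort_values("p")
print(screened.head(15))  # candidates for the multivariate signature model
```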


Biomedicines, 2021, Vol 9 (8), pp. 891
Author(s): Cheng-Maw Ho, Chi-Ling Chen, Chia-Hao Chang, Meng-Rui Lee, Jann-Yuan Wang, et al.

Background: Anti-tuberculosis (TB) medications are common causes of drug-induced liver injury (DILI). Limited data are available on systemic inflammatory mediators as biomarkers for predicting DILI before treatment. We aimed to select predictive markers among potential candidates and to formulate a predictive model of DILI for TB patients. Methods: Adult patients with active TB from a prospective cohort were enrolled, and all participants received standard anti-TB treatment. Development of DILI, defined as ≥5× ULN for alanine transaminase or ≥2.6× ULN for total bilirubin with causality assessment (RUCAM, Roussel Uclaf causality assessment method), was regularly monitored. Pre-treatment plasma was assayed for 15 candidate markers, and a set of risk prediction scores was established using Cox regression and receiver operating characteristic (ROC) analyses. Results: A total of 19 of 240 patients (7.9%), including six carriers of hepatitis B virus, developed DILI following anti-TB treatment. Interleukin (IL)-22 binding protein (IL-22BP), interferon gamma-induced protein 10 (IP-10), soluble CD163 (sCD163), IL-6, and CD206 were significant univariable factors associated with DILI development; the former three were retained by backward selection as multivariable factors, with adjusted hazard ratios of 0.20 (0.07–0.58), 3.71 (1.35–10.21), and 3.28 (1.07–10.06), respectively. A score set composed of IL-22BP, IP-10, and sCD163 had an improved area under the curve of 0.744 (p < 0.001). Conclusions: Pre-treatment IL-22BP was a protective biomarker against DILI development under anti-TB treatment, and a score set incorporating the additional risk factors IP-10 and sCD163 provided adequate DILI prediction.
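A minimal sketch of how a biomarker score set could be evaluated with ROC analysis; the weights below are illustrative placeholders (their signs mirror the direction of the reported hazard ratios), not the coefficients of the published model:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical pre-treatment biomarker values and DILI outcomes.
il22bp = np.array([4.1, 2.0, 3.8, 1.5, 4.5, 2.2, 3.9, 1.8])  # protective marker
ip10   = np.array([120, 480, 150, 520, 100, 460, 140, 500])  # risk marker
scd163 = np.array([310, 900, 280, 950, 260, 880, 300, 910])  # risk marker
dili   = np.array([0,   1,   0,   1,   0,   1,   0,   1])

# Simple linear risk score: risk markers enter positively, the protective one negatively.
score = -1.0 * il22bp + 0.01 * ip10 + 0.002 * scd163

print("AUC of the composite score:", roc_auc_score(dili, score))
```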


2019, Vol 13 (8)
Author(s): Guan Hee Tan, Antonio Finelli, Ardalan Ahmad, Marian Wettstein, Alexandre Zlotta, et al.

Introduction: Active surveillance (AS) is the standard of care in low-risk prostate cancer (PC). This study describes a novel total cancer location (TCLo) density metric and aims to determine its performance in predicting clinical progression (CP) and grade progression (GP). Methods: This was a retrospective study of patients on AS after confirmatory biopsy (CBx). We excluded patients with Gleason ≥7 at CBx and <2 years of follow-up. TCLo was the number of locations with positive cores at diagnostic biopsy (DBx) and CBx, and TCLo density was TCLo divided by prostate volume (PV). CP was progression to any active treatment, while GP occurred if Gleason ≥7 was identified on repeat biopsy or surgical pathology. Independent predictors of time to CP or GP were estimated with Cox regression. Kaplan-Meier analysis compared progression-free survival curves between TCLo density groups, and test characteristics of TCLo were explored with receiver operating characteristic (ROC) curves. Results: We included 181 patients who had CBx between 2012 and 2015 and met the inclusion criteria. The mean age of patients was 62.58 years (SD=7.13) and median follow-up was 60.9 months (IQR=23.4). A high TCLo density score (>0.05) was independently associated with time to CP (HR 4.70, 95% CI: 2.62-8.42, p<0.001) and GP (HR 3.85, 95% CI: 1.91-7.73, p<0.001). ROC curves showed that TCLo density has a greater area under the curve than the number of positive cores at CBx in predicting progression. Conclusion: TCLo density is able to stratify patients on AS for risk of CP and GP. With further validation, it could be added to the decision-making algorithm in AS for low-risk localized PC.
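A small sketch of the TCLo density calculation as described above, interpreting TCLo as the number of distinct positive biopsy locations across DBx and CBx (the location labels and the interpretation as a set union are assumptions):

```python
def tclo_density(positive_locations_dbx, positive_locations_cbx, prostate_volume_ml):
    """Total cancer location density: distinct positive locations at diagnostic
    and confirmatory biopsy, divided by prostate volume (mL)."""
    tclo = len(set(positive_locations_dbx) | set(positive_locations_cbx))
    return tclo / prostate_volume_ml

# Example: two positive locations at DBx, two at CBx (one overlapping), PV = 45 mL.
density = tclo_density({"left apex", "right base"}, {"left apex", "left mid"}, 45.0)
print(round(density, 3), "high TCLo density" if density > 0.05 else "low TCLo density")
```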


2019, Vol 40 (Supplement_1)
Author(s): G Denas, G Costa, E Ferroni, N Gennaro, U Fedeli, et al.

Abstract Introduction Anticoagulation therapy is central to the management of stroke in patients with non-valvular atrial fibrillation (NVAF), and persistence with oral anticoagulation is essential to prevent thromboembolic complications. Purpose To assess persistence with direct oral anticoagulants (DOACs) and to look for possible predictors of treatment discontinuation in NVAF patients. Methods We performed a population-based retrospective cohort study in the Veneto Region (north-eastern Italy, about 5 million inhabitants) using the regional health system databases. Naïve patients initiating DOACs for stroke prevention in NVAF from July 2013 to September 2017 were included in the study. Patients were identified using Anatomical Therapeutic Chemical (ATC) codes, and other indications for anticoagulation therapy were excluded using ICD-9-CM codes. Treatment persistence was defined as the time from initiation to discontinuation of therapy. Baseline characteristics and comorbidities associated with persistence of DOAC therapy were explored by means of Kaplan-Meier curves and assessed through Cox regression. Results Overall, 17,920 patients initiated anticoagulation with DOACs in the study period. Most patients were older than 74 years, and the sexes were almost equally represented. Comorbidities included hypertension (72%), diabetes mellitus (17%), congestive heart failure (9%), previous stroke/TIA (20%), and prior myocardial infarction (2%). After one year, persistence with anticoagulation treatment was 82.7% and persistence with DOAC treatment was 72.9%, with about 10% of discontinuations due to a switch to VKAs. On multivariate analysis, factors negatively affecting persistence were female gender, younger age (<65 years), renal disease and history of bleeding. Conversely, persistence was better in patients with hypertension, previous cerebral ischemic events, and previous acute myocardial infarction. (Figure: persistence with DOAC therapy.) Conclusion These real-world data show that within 12 months, one out of four anticoagulation-naïve patients stops DOACs, while one out of five stops anticoagulation altogether. Efforts should be made to correct modifiable predictors and intensify patient education.
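A sketch of how treatment persistence (time from initiation to discontinuation) could be derived from dispensing records; the 30-day permissible gap and the example fill dates are assumptions for illustration, not the rule used in the registry analysis:

```python
from datetime import date, timedelta

GRACE_DAYS = 30                # assumed permissible gap after a fill's supply runs out
STUDY_END = date(2017, 9, 30)  # end of follow-up

def persistence(fills, grace_days=GRACE_DAYS, study_end=STUDY_END):
    """Return (days persistent, discontinued flag) from (dispensing date, days supplied)
    records. Therapy is considered discontinued when the gap between the end of one
    fill's supply and the next fill (or end of follow-up) exceeds the grace period."""
    fills = sorted(fills)
    start = fills[0][0]
    for (dispensed, supply), nxt in zip(fills, fills[1:] + [(study_end, 0)]):
        covered_until = dispensed + timedelta(days=supply)
        if (nxt[0] - covered_until).days > grace_days:
            return (covered_until - start).days, True   # discontinued
    return (min(covered_until, study_end) - start).days, False  # persistent / censored

# Example: three monthly fills, then no further dispensing before the end of follow-up.
fills = [(date(2016, 1, 10), 30), (date(2016, 2, 12), 30), (date(2016, 3, 20), 30)]
print(persistence(fills))  # -> (100, True)
```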


2020, Vol 35 (Supplement_3)
Author(s): Tadao Akizawa, Hironori Kanda, Masayuki Takanuma, Jun Kinoshita, Masafumi Fukagawa

Abstract Background and Aims Phosphate binders (PB) are usually prescribed to dialysis patients with hyperphosphatemia, and several studies have reported that a higher PB pill burden may reduce adherence and lead to insufficient phosphorus control. Tenapanor is an investigational, minimally absorbed, orally administered, non-binder small molecule that inhibits sodium/hydrogen exchanger isoform 3 (NHE3), in development for the control of serum phosphorus. A previous Ph3 study sponsored by Ardelyx, Inc. (NCT02675998) showed a significant phosphorus decrease compared with placebo in patients with hyperphosphatemia undergoing hemodialysis (HD) in the US. Tenapanor was expected to reduce the PB pill burden since it is administered as one small pill taken twice a day. This was the first study in Japanese HD patients, and it aimed to confirm whether tenapanor reduces the pill burden of PB. Method This was a multicenter, open-label, single-arm Ph2 study consisting of a screening period, a 3-week observation period, and a 26-week treatment period. Patients with a serum phosphorus level ≥3.5 and ≤7.0 mg/dL who were taking at least two PB pills three times a day were enrolled. Patients started on 30 mg of tenapanor twice daily; the dose could be reduced in a step-wise manner (60, 40, 20 and 10 mg/day) at the investigator's discretion, based on GI tolerability. PB treatment was continued according to individual regimens, but the dose could be adjusted as appropriate to maintain the serum phosphorus level within ±0.5 mg/dL of baseline. The primary endpoint was achievement of at least a 30% decrease in the mean total number of PB and tenapanor pills compared with the number of PB pills at baseline. The proportion of patients who achieved at least a 30% decrease was tested using a binomial test with a threshold level of 20% and a one-sided significance level of 0.025. The analysis was conducted using data as of December 25, 2019. Results The primary endpoint was met. Of the 67 patients enrolled at the time of the analysis, 48 (71.6%, [95% CI: 59.3% - 82.0%], p<0.001) achieved a 30% decrease in the total number of PB and tenapanor pills; of those, 35 patients (52.2%, [95% CI: 39.7% - 64.6%]) achieved a 50% decrease, and 18 patients (26.9%) no longer required any PB at week 26. Mean phosphorus levels were maintained during the study, from 5.2 mg/dL at baseline to 4.7 mg/dL at week 26. The most frequent adverse event was diarrhea (76.1%), which was mostly mild to moderate; only four patients discontinued the study due to diarrhea. Serious adverse events were reported in five patients, only two of which were related to tenapanor (diarrhea and acute myocardial infarction). Conclusion Tenapanor provided phosphorus control with significantly fewer pills than PB alone, and the adverse event profile was similar to that in previous US studies. These results suggest that tenapanor, a non-binder phosphate absorption inhibitor offering a novel approach to the management of hyperphosphatemia, could potentially improve drug adherence by reducing the PB pill burden while maintaining effective phosphorus control.
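The primary-endpoint analysis maps directly onto an exact binomial test; a sketch with SciPy, plugging in the 48/67 responders and the 20% threshold reported above (one-sided, alpha = 0.025):

```python
from scipy.stats import binomtest

# Null proportion = 20% threshold; alternative: observed proportion is greater.
result = binomtest(k=48, n=67, p=0.20, alternative="greater")

print(f"observed proportion: {48 / 67:.3f}")        # 0.716
print(f"one-sided p-value:   {result.pvalue:.3g}")  # far below 0.025
print("exact 95% CI:", result.proportion_ci(confidence_level=0.95, method="exact"))
```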


2021, Vol 15, pp. 117955492110241
Author(s): Hongkai Zhuang, Zixuan Zhou, Zuyi Ma, Shanzhou Huang, Yuanfeng Gong, et al.

Background: The prognosis of patients with pancreatic ductal adenocarcinoma (PDAC) of the pancreatic head remains poor, even after potentially curative R0 resection. The aim of this study was to develop an accurate model to predict prognosis in patients with PDAC of the pancreatic head following pancreaticoduodenectomy. Methods: We retrospectively reviewed 112 patients with PDAC of the pancreatic head who underwent pancreaticoduodenectomy in Guangdong Provincial People's Hospital between 2014 and 2018. Results: Five prognostic factors were identified using univariate Cox regression analysis: age, histologic grade, American Joint Committee on Cancer (AJCC) 8th edition stage, total bilirubin (TBIL), and CA19-9. Using all-subset analysis and multivariate Cox regression analysis, we developed a nomogram consisting of age, AJCC 8th edition stage, perineural invasion, TBIL, and CA19-9, which had higher C-indexes for overall survival (OS; 0.73) and recurrence-free survival (RFS; 0.69) than AJCC 8th edition stage alone (OS: 0.66; RFS: 0.67). The area under the curve (AUC) values of the receiver operating characteristic (ROC) curves for the nomogram for OS and RFS were also significantly higher than those for any single parameter (AJCC 8th edition stage, age, perineural invasion, TBIL, and CA19-9). Importantly, our nomogram displayed a higher C-index for OS than previously reported models, indicating better predictive value. Conclusions: A simple and practical nomogram for patient prognosis in PDAC of the pancreatic head following pancreaticoduodenectomy was established, which shows satisfactory predictive efficacy and deserves further evaluation in the future.
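A minimal sketch of the C-index comparison between a combined nomogram-style score and stage alone, using lifelines on toy data (all scores hypothetical); concordance_index expects higher values to indicate longer survival, so the negated risk scores are passed in:

```python
import numpy as np
from lifelines.utils import concordance_index

months = np.array([30, 8, 22, 5, 40, 14, 26, 11])  # follow-up (months)
death  = np.array([0,  1, 1,  1, 0,  1,  0,  1])    # 1 = died
nomogram_score = np.array([1.2, 3.4, 2.1, 3.9, 0.8, 2.8, 1.5, 3.1])  # higher = higher risk
stage_score    = np.array([1,   2,   2,   3,   1,   2,   1,   3])

print("C-index, nomogram:", round(concordance_index(months, -nomogram_score, death), 2))
print("C-index, stage:   ", round(concordance_index(months, -stage_score, death), 2))
```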


2018, Vol 40 (2), pp. 354-364
Author(s): Myriam G Jaarsma-Coes, Rashid Ghaznawi, Jeroen Hendrikse, Cornelis Slump, Theo D Witkamp, et al.

Neurodegenerative and neurovascular diseases lead to heterogeneous brain abnormalities, and a combined analysis of these abnormalities by brain phenotype might give a more accurate representation of the underlying aetiology. We aimed to identify different MRI phenotypes of the brain and to assess the risk of future stroke and mortality within these subgroups. In 1003 patients (59 ± 10 years) from the Second Manifestations of ARTerial disease-Magnetic Resonance (SMART-MR) study, quantitative 1.5T brain MRI markers were used in a hierarchical clustering analysis to identify 11 distinct subgroups that differed in the distribution of brain MRI markers and cardiovascular risk factors, and in the risk of stroke (Cox regression: from no increased risk, compared with the reference group with relatively few brain abnormalities, to HR = 10.34; 95% CI 3.80–28.12 for the multi-burden subgroup) and mortality (from no increased risk compared with the reference group to HR = 4.00; 95% CI 2.50–6.40 for the multi-burden subgroup). In conclusion, within a group of patients with manifest arterial disease, we showed that different MRI phenotypes of the brain can be identified and that these are associated with different risks of future stroke and mortality. These MRI phenotypes may help classify individual patients and assess their risk of future stroke and mortality.
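A compact sketch of the clustering step on standardized quantitative MRI markers; the marker set, Ward linkage, and random data are assumptions for illustration, with the number of clusters fixed at the 11 subgroups reported by the study:

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)

# Hypothetical quantitative 1.5T brain MRI markers per patient (e.g. brain volume,
# white matter hyperintensity volume, infarct measures): 1003 patients x 6 markers.
markers = rng.normal(size=(1003, 6))

Z = linkage(StandardScaler().fit_transform(markers), method="ward")
subgroup = fcluster(Z, t=11, criterion="maxclust")  # assign each patient to 1 of 11 phenotypes

print(np.bincount(subgroup)[1:])  # number of patients per MRI phenotype subgroup
```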

