Prospective Study of Serum Uric Acid Levels and First Stroke Events in Chinese Adults With Hypertension

2021 ◽  
Vol 12 ◽  
Author(s):  
Feng Hu ◽  
Longlong Hu ◽  
Rihua Yu ◽  
Fengyu Han ◽  
Wei Zhou ◽  
...  

Objectives: We investigated the association between serum uric acid (SUA) levels and the risk of first stroke in Chinese adults with hypertension. Methods: A total of 11,841 hypertensive patients were selected from the Chinese Hypertension Registry for analysis. The relationship between SUA levels and first stroke was determined using multivariable Cox proportional hazards regression, smoothing curve fitting, and Kaplan–Meier survival curve analysis. Results: During a median follow-up of 614 days, 99 first strokes occurred. Cox proportional hazards models indicated that SUA levels were not significantly associated with first stroke [adjusted hazard ratio (HR) per SD increase: 0.98, 95% CI 0.76–1.26, P = 0.889]. Compared with the group without hyperuricemia (HUA), the population with HUA did not have a significantly higher risk of first stroke (adjusted HR: 1.22, 95% CI 0.79–1.90, P = 0.373). However, among subjects younger than 60 years, those with HUA had a significantly higher risk of first stroke than those without (adjusted HR: 4.89, 95% CI 1.36–17.63, P = 0.015). In subjects older than 60 years, we did not find a significant relationship between HUA and first stroke (adjusted HR: 0.97, 95% CI 0.60–1.56, P = 0.886). Survival analysis further confirmed this discrepancy (log-rank P = 0.013 in the younger group and 0.899 in the older group). Conclusion: The present study found no significant evidence that increased SUA levels were associated with the risk of first stroke in Chinese adults with hypertension. Age played an interactive role in the relationship between HUA and first stroke.
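
The core analysis above is a Cox proportional hazards model reporting an adjusted HR per SD increase in SUA. Below is a minimal sketch of that kind of model using Python's lifelines library; all column names and values are simulated stand-ins, not the registry's data.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "sua": rng.normal(6.5, 1.5, n),    # serum uric acid, mg/dL (simulated)
    "age": rng.normal(62, 10, n),
    "male": rng.integers(0, 2, n),
    "time": rng.exponential(600, n),   # follow-up, days
    "stroke": rng.integers(0, 2, n),   # event indicator
})

# Standardize SUA so the coefficient is interpretable per SD increase
df["sua_per_sd"] = (df["sua"] - df["sua"].mean()) / df["sua"].std()

cph = CoxPHFitter()
cph.fit(df[["sua_per_sd", "age", "male", "time", "stroke"]],
        duration_col="time", event_col="stroke")
cph.print_summary()  # exp(coef) on sua_per_sd is the adjusted HR per SD
```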

2021 ◽  
Vol 8 ◽  
Author(s):  
Qiu-hong Tan ◽  
Lin Liu ◽  
Yu-qing Huang ◽  
Yu-ling Yu ◽  
Jia-yi Huang ◽  
...  

Background: Few studies have focused on the association between change in serum uric acid (SUA) and ischemic stroke, and their results remain controversial. The present study aimed to investigate the relationship between change in SUA and ischemic stroke among hypertensive patients. Methods: This was a retrospective cohort study. We recruited adult hypertensive patients who had two consecutive measurements of SUA levels from 2013 to 2014 and reported no history of stroke. Change in SUA was assessed as the SUA concentration measured in 2014 minus that measured in 2013. Multivariable Cox proportional hazards models were used to estimate adjusted hazard ratios (HRs) and 95% confidence intervals (CIs). Kaplan–Meier analysis and the log-rank test were performed to quantify differences in cumulative event rates. Additionally, subgroup analyses and interaction tests were conducted to investigate heterogeneity. Results: A total of 4,628 hypertensive patients were included, and 93 cases of ischemic stroke occurred during a mean follow-up of 3.14 years. Participants were categorized into three groups according to tertiles of SUA change [low (substantial SUA decrease): <−32.6 μmol/L; middle (stable SUA): ≥−32.6 to <40.2 μmol/L; high (substantial SUA increase): ≥40.2 μmol/L]. In the fully adjusted model, with the stable group as reference, participants in the substantial-increase group had a significantly elevated risk of ischemic stroke [HR (95% CI), 1.76 (1.01, 3.06), P = 0.0451], whereas the hazard for the substantial-decrease group was not significant [HR (95% CI), 1.31 (0.75, 2.28), P = 0.3353]. Age played an interactive role in the relationship between SUA change and ischemic stroke: younger participants (age < 65 years) tended to have a higher risk of ischemic stroke when SUA increased substantially. Conclusion: A substantial increase in SUA was significantly associated with an elevated risk of ischemic stroke among patients with hypertension.
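
As a rough illustration of the tertile grouping and log-rank comparison described above, the sketch below cuts a simulated SUA-change variable into tertiles and tests for differences in event rates; the data and resulting cut points are invented, not the study's.

```python
import numpy as np
import pandas as pd
from lifelines.statistics import multivariate_logrank_test

rng = np.random.default_rng(1)
n = 1500
df = pd.DataFrame({
    "sua_2013": rng.normal(380, 80, n),   # μmol/L (simulated)
    "sua_2014": rng.normal(385, 80, n),
    "time": rng.exponential(3.1, n),      # years of follow-up
    "stroke": rng.integers(0, 2, n),
})
df["delta_sua"] = df["sua_2014"] - df["sua_2013"]

# Tertiles of SUA change: decrease / stable / increase
df["group"] = pd.qcut(df["delta_sua"], 3,
                      labels=["decrease", "stable", "increase"])

result = multivariate_logrank_test(df["time"], df["group"], df["stroke"])
print(result.p_value)  # difference in cumulative event rate across groups
```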


2019 ◽  
Vol 105 (3) ◽  
pp. e597-e609 ◽  
Author(s):  
Lihua Hu ◽  
Guiping Hu ◽  
Benjamin Ping Xu ◽  
Lingjuan Zhu ◽  
Wei Zhou ◽  
...  

Abstract Background In addition to the controversy regarding the association of hyperuricemia with mortality, uncertainty also remains regarding the association between low serum uric acid (SUA) and mortality. We aimed to assess the relationship between SUA and all-cause and cause-specific mortality. Methods This cohort study included 9118 US adults from the National Health and Nutrition Examination Survey (1999-2002). Multivariable Cox proportional hazards models were used to evaluate the relationship between SUA and mortality. To address nonlinearity between SUA and mortality, our analysis also used a generalized additive model with smooth curve fitting (penalized spline method) and 2-piecewise Cox proportional hazards models. Results During a median follow-up of 5.83 years, 448 all-cause deaths occurred, with 100 cardiovascular disease (CVD) deaths, 118 cancer deaths, and 37 respiratory disease deaths. Compared with the reference group, there was an increased risk of all-cause, CVD, cancer, and respiratory disease mortality for participants in the first and third tertiles of SUA. We further found a nonlinear, U-shaped association between SUA and mortality. The inflection point of the curve was at an SUA level of 5.7 mg/dL. The hazard ratios (95% confidence intervals) for all-cause mortality were 0.80 (0.65-0.97) and 1.24 (1.10-1.40) to the left and right of the inflection point, respectively. This U-shaped association was observed in both sexes; the inflection point for SUA was 6 mg/dL in males and 4 mg/dL in females. Conclusion Both low and high SUA levels were associated with increased all-cause and cause-specific mortality, supporting a U-shaped association between SUA and mortality.
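
The 2-piecewise Cox model described above can be sketched with hinge-coded covariates that allow separate log-hazard slopes on either side of the inflection point (5.7 mg/dL, taken from the abstract). The data below are simulated for illustration only.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "sua": rng.normal(5.5, 1.4, n),    # mg/dL (simulated)
    "time": rng.exponential(5.8, n),   # years
    "death": rng.integers(0, 2, n),
})

knot = 5.7
# Hinge coding: slope below the knot, plus additional slope above it
df["sua_below"] = np.minimum(df["sua"], knot)
df["sua_above"] = np.maximum(df["sua"] - knot, 0.0)

cph = CoxPHFitter()
cph.fit(df[["sua_below", "sua_above", "time", "death"]],
        duration_col="time", event_col="death")
# exp(coef) on sua_below / sua_above are the HRs per mg/dL on each side
cph.print_summary()
```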


2021 ◽  
Author(s):  
Somaya Albhaisi ◽  
Rehan Qayyum

Abstract BACKGROUND & AIMS: Interpreting liver enzyme levels is often challenging because they may be influenced by metabolic processes beyond the liver. Given their pathophysiologic roles in inflammation and oxidative stress, higher levels of these enzymes may be associated with increased risk of mortality; however, studies have reported inconsistent results. We therefore examined the association of liver enzyme levels with cancer mortality in the general U.S. adult population. METHODS: We used the US National Health and Nutrition Examination Survey from 1999 to 2016. Kaplan-Meier survival curves were compared across quartiles of liver enzymes. Cox proportional hazards models were built to examine the relationship between cancer mortality and liver enzyme quartiles, without and with adjustment for potential confounding factors. RESULTS: During 338,882 person-years of follow-up, 1059 participants had cancer-related deaths. There was a nonlinear, U-shaped relationship between serum alanine and aspartate aminotransferase (ALT and AST) levels and cancer mortality. Overall, gamma glutamyltransferase (GGT) showed no relationship with cancer mortality; however, above the median, each 10 IU/L increase in GGT was associated with a 1% higher mortality risk (HR=1.01; 95% CI=1.00, 1.02; P=0.001). Only subjects with high levels of alkaline phosphatase (ALP) had higher cancer mortality (HR=1.63; 95% CI=1.30, 2.05; P<0.001 and HR=1.52; 95% CI=1.20, 1.94; P=0.001, respectively). CONCLUSIONS: Only the lowest and highest serum ALT and AST levels are associated with increased cancer mortality. For ALP, the relationship is present at higher levels. The association with GGT was not robust across analyses. The mechanisms underlying these relationships need further exploration.
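
A minimal sketch of the quartile-based Kaplan-Meier comparison described above, assuming a lifelines/matplotlib workflow; the ALT values and event rates are simulated, not NHANES data.

```python
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
n = 4000
df = pd.DataFrame({
    "alt": rng.gamma(4, 6, n),                  # ALT, IU/L (simulated)
    "years": rng.exponential(12, n),
    "cancer_death": rng.binomial(1, 0.05, n),
})
df["alt_q"] = pd.qcut(df["alt"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

# One survival curve per enzyme quartile
ax = plt.subplot(111)
kmf = KaplanMeierFitter()
for q, grp in df.groupby("alt_q", observed=True):
    kmf.fit(grp["years"], grp["cancer_death"], label=str(q))
    kmf.plot_survival_function(ax=ax)
plt.xlabel("Years of follow-up")
plt.ylabel("Survival free of cancer death")
plt.show()
```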


2021 ◽  
Author(s):  
Pingping Ren ◽  
Qilong Zhang ◽  
Yixuan Pan ◽  
Yi Liu ◽  
Chenglin Li ◽  
...  

Abstract Background: Studies on the correlation between serum uric acid (SUA) and all-cause mortality in peritoneal dialysis (PD) patients have mainly been based on baseline SUA. We aimed to analyze the change in SUA level after PD initiation and the correlation between follow-up SUA and prognosis in PD patients. Methods: All patients who received PD catheterization and were maintained on PD in our center from March 2, 2001 to March 8, 2017 were screened. Kaplan-Meier and Cox proportional hazards regression models were used to analyze the effect of SUA levels on the risk of death. We graded SUA levels at baseline and at 6, 12, 18, and 24 months post PD, using the mean SUA plus or minus one standard deviation as cut-off values, and compared all-cause and cardiovascular mortality among patients with different SUA grades. Results: A total of 1402 patients were included: 763 males (54.42%) and 639 females (45.58%). Their average age at PD start was 49.50±14.20 years. SUA levels were 7.97±1.79 mg/dl at baseline, 7.12±1.48 mg/dl at 6 months, 7.05±1.33 mg/dl at 12 months, 7.01±1.30 mg/dl at 18 months, and 6.93±1.26 mg/dl at 24 months. During a median follow-up of 31 (18–49) months, 173 (12.34%) all-cause deaths occurred, including 68 (4.85%) cardiovascular deaths. There were no significant differences in all-cause mortality among groups graded by SUA at baseline, 12, 18, or 24 months, nor in cardiovascular mortality among groups graded by SUA at any time point. At 6 months post PD, Kaplan-Meier analysis showed a significant difference in all-cause mortality among SUA grades (χ²=11.315, P=0.010), with the lowest all-cause mortality in the grade 5.65 mg/dl ≤ SUA < 7.13 mg/dl. Conclusion: SUA level decreased during follow-up post PD. At 6 months post PD, an SUA of 5.65 to <7.13 mg/dl was associated with better patient survival.
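
The mean ± SD grading described above can be sketched as follows. The distribution parameters echo the abstract's 6-month figures, but the generated values are simulated, so the printed cut-offs will only approximate 5.65 and 7.13 mg/dl.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
sua_6mo = pd.Series(rng.normal(7.12, 1.48, 1402))  # mg/dl at 6 months (simulated)

# Cut-offs at mean - SD, mean, and mean + SD give four grades
mean, sd = sua_6mo.mean(), sua_6mo.std()
bins = [-np.inf, mean - sd, mean, mean + sd, np.inf]
labels = ["<mean-SD", "mean-SD to mean", "mean to mean+SD", ">mean+SD"]
grade = pd.cut(sua_6mo, bins=bins, labels=labels)
print(grade.value_counts())
```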


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
S Kochav ◽  
R.C Chen ◽  
J.M.D Dizon ◽  
J.A.R Reiffel

Abstract Background Theoretical concern exists regarding AV block (AVB) with class I antiarrhythmic drugs (AADs) when bundle branch block (BBB) is present. Whether this is substantiated in real-world populations is unknown. Purpose To determine the relationship between AAD class and incidence of AVB in patients with preexisting BBB. Methods We retrospectively studied all patients with BBB who received class I and III AADs between 1997 and 2019 to compare the incidence of AVB. We defined index time as first exposure to either drug class and excluded patients with prior AVB or exposure to both classes. The time-at-risk window ended at the first outcome occurrence or when patients were no longer observed in the database. We estimated hazard ratios for incident AVB using Cox proportional hazards models with propensity score stratification, adjusting for over 32,000 covariates from the electronic health record. Kaplan-Meier methods were used to determine treatment effects over time. Results Of 40,120 individuals with BBB, 148 were exposed to a class I AAD and 2401 to a class III AAD. Over nearly 4,200 person-years of follow-up, there were 22 and 620 outcome events in the class I and class III cohorts, respectively (Figure). In adjusted analyses, AVB risk was markedly lower in patients exposed to class I AADs than in those exposed to class III AADs (HR 0.48 [95% CI 0.30–0.75]). Conclusion Among patients with BBB, exposure to class III AADs was strongly associated with greater risk of incident AVB. This likely reflects differences in the natural history of patients receiving class I vs class III AADs rather than adverse class III effects; however, the lack of acutely worse outcomes with class I AADs suggests that they may be safer in BBB than suspected. Funding Acknowledgement Type of funding source: None
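
A hedged sketch of propensity score stratification as described above: fit a treatment model, bin the score into quintile strata, and fit a Cox model stratified on those bins. The real analysis regularized over more than 32,000 EHR covariates; this toy version uses 10 simulated ones.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 2549
X = rng.normal(size=(n, 10))             # stand-in covariates
treat = rng.binomial(1, 0.06, n)         # 1 = class I AAD (simulated)
df = pd.DataFrame(X, columns=[f"x{i}" for i in range(10)])
df["class_i"] = treat
df["time"] = rng.exponential(2.0, n)     # years at risk
df["avb"] = rng.binomial(1, 0.2, n)      # incident AV block

# Propensity score for receiving a class I AAD, then quintile strata
ps = LogisticRegression(max_iter=1000).fit(X, treat).predict_proba(X)[:, 1]
df["ps_stratum"] = pd.qcut(ps, 5, labels=False)

cph = CoxPHFitter()
cph.fit(df[["class_i", "time", "avb", "ps_stratum"]],
        duration_col="time", event_col="avb", strata=["ps_stratum"])
cph.print_summary()  # exp(coef) on class_i is the stratum-adjusted HR
```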


2017 ◽  
Vol 117 (06) ◽  
pp. 1072-1082 ◽  
Author(s):  
Xiaoyan Li ◽  
Steve Deitelzweig ◽  
Allison Keshishian ◽  
Melissa Hamilton ◽  
Ruslan Horblyuk ◽  
...  

Summary The ARISTOTLE trial showed a reduction in the risk of stroke/systemic embolism (SE) and major bleeding in non-valvular atrial fibrillation (NVAF) patients treated with apixaban compared to warfarin. This retrospective study used four large US claims databases (MarketScan, PharMetrics, Optum, and Humana) of NVAF patients newly initiating apixaban or warfarin from January 1, 2013 to September 30, 2015. After 1:1 warfarin-apixaban propensity score matching (PSM) within each database, the resulting patient records were pooled. Kaplan-Meier curves and Cox proportional hazards models were used to estimate the cumulative incidence and hazard ratios (HRs) of stroke/SE and major bleeding (identified using the first listed diagnosis of inpatient claims) within one year of therapy initiation. The study included a total of 76,940 patients (38,470 warfarin and 38,470 apixaban). Among the 38,470 matched pairs, 14,563 were from MarketScan, 7,683 from PharMetrics, 7,894 from Optum, and 8,330 from Humana. Baseline characteristics were balanced between the two cohorts, with a mean (standard deviation [SD]) age of 71 (12) years and a mean (SD) CHA2DS2-VASc score of 3.2 (1.7). Apixaban initiators had a significantly lower risk of stroke/SE (HR: 0.67, 95% CI: 0.59–0.76) and major bleeding (HR: 0.60, 95% CI: 0.54–0.65) than warfarin initiators. All subtypes of stroke/SE and major bleeding, including ischaemic stroke, haemorrhagic stroke, SE, intracranial haemorrhage, gastrointestinal bleeding, and other major bleeding, were significantly lower for apixaban than for warfarin treatment. Subgroup analyses (apixaban dosage, age strata, CHA2DS2-VASc or HAS-BLED score strata, or dataset source) all showed consistently lower risks of stroke/SE and major bleeding with apixaban compared to warfarin. This is the largest "real-world" study of apixaban effectiveness and safety to date, showing that apixaban initiation was associated with significant risk reductions in stroke/SE and major bleeding compared to warfarin initiation after PSM. These benefits were consistent across various high-risk subgroups and for both the standard- and low-dose apixaban regimens.
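
A minimal sketch of 1:1 propensity score matching of the kind described above: greedy nearest-neighbor matching on the logit of the propensity score within a caliper. This simplified version matches with replacement (a production analysis would match without replacement and check covariate balance); all data are simulated.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(6)
n = 5000
X = rng.normal(size=(n, 5))              # stand-in baseline covariates
apixaban = rng.binomial(1, 0.5, n)       # 1 = apixaban initiator (simulated)

# Propensity score and its logit (matching on the logit is conventional)
ps = LogisticRegression(max_iter=1000).fit(X, apixaban).predict_proba(X)[:, 1]
logit = np.log(ps / (1 - ps)).reshape(-1, 1)

treated = np.where(apixaban == 1)[0]
control = np.where(apixaban == 0)[0]

# One nearest control per treated patient, caliper = 0.2 SD of the logit
nn = NearestNeighbors(n_neighbors=1).fit(logit[control])
dist, idx = nn.kneighbors(logit[treated])
caliper = 0.2 * logit.std()
keep = dist.ravel() <= caliper
pairs = pd.DataFrame({"treated": treated[keep],
                      "control": control[idx.ravel()[keep]]})
print(len(pairs), "matched pairs")
```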


2021 ◽  
Vol 39 (6_suppl) ◽  
pp. 59-59
Author(s):  
Umang Swami ◽  
Taylor Ryan McFarland ◽  
Benjamin Haaland ◽  
Adam Kessel ◽  
Roberto Nussenzveig ◽  
...  

Background: In mCSPC, baseline CTC counts have been shown to correlate with PSA responses and progression-free survival (PFS) in small studies conducted in the context of androgen deprivation therapy (ADT) without modern intensification with docetaxel or novel hormonal therapy. A similar correlation of CTC count with PSA responses and PFS was recently reported from an ongoing phase 3 trial in the mCSPC setting (SWOG1216), without reporting the association in the context of ADT intensification. Furthermore, none of these studies correlated CTCs with overall survival (OS). Herein we evaluated whether CTCs were associated with outcomes, including OS, in a real-world mCSPC population treated with intensified as well as non-intensified ADT. Methods: Eligibility criteria: newly diagnosed mCSPC receiving ADT with or without intensification and enumeration of baseline CTCs by the FDA-cleared CellSearch CTC assay. The relationship between CTC counts (categorized as 0, 1-4, and ≥5 per 7.5 ml) and both PFS and OS was assessed using Cox proportional hazards models, both unadjusted and adjusted for age, Gleason score, PSA at ADT initiation, de novo vs. non-de novo status, and ADT intensification vs. non-intensification. Results: Overall, 99 pts were identified. Baseline characteristics are summarized in the Table. In unadjusted analyses, CTC counts of ≥5 as compared to 0 were strongly associated with inferior PFS (hazard ratio [HR] 3.38, 95% CI 1.85-6.18; p < 0.001) and OS (HR 4.44, 95% CI 1.63-12.10; p = 0.004). In multivariate analyses, CTC counts of ≥5 as compared to 0 continued to be associated with inferior PFS (HR 5.49, 95% CI 2.64-11.43; p < 0.001) and OS (HR 4.00, 95% CI 1.31-12.23; p = 0.015). Within the ADT-intensification subgroup as well, high CTC counts were associated with poor PFS and OS: for PFS, the univariate HR for CTC ≥5 vs. 0 was 4.87 (95% CI 1.66-14.30; p = 0.004) and the multivariate HR was 7.43 (95% CI 1.92-28.82; p = 0.004); for OS, the univariate HR was 15.88 (95% CI 1.93-130.58; p = 0.010) and the multivariate HR was 24.86 (95% CI 2.03-304.45; p = 0.012). Conclusions: To the best of our knowledge, this is the first study to show that high baseline CTC counts are strongly associated with inferior PFS as well as OS in pts with newly diagnosed mCSPC, even in those who received intensified ADT. Identifying the pts at highest risk of progression and death can help with counselling and prognostication in the clinic, as well as with the design and enrollment of future clinical trials.
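
The CTC categorization and reference-group coding described above might look like the following sketch; the counts are simulated and the study's full covariate adjustment is omitted.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(7)
n = 99
df = pd.DataFrame({
    "ctc": rng.poisson(2.5, n),       # CTCs per 7.5 ml (simulated)
    "time": rng.exponential(24, n),   # months
    "event": rng.integers(0, 2, n),
})
# Bin counts into 0 / 1-4 / >=5, with 0 as the reference category
df["ctc_cat"] = pd.cut(df["ctc"], bins=[-1, 0, 4, np.inf],
                       labels=["0", "1-4", ">=5"])
dummies = pd.get_dummies(df["ctc_cat"], prefix="ctc", drop_first=True)
model_df = pd.concat([df[["time", "event"]], dummies.astype(float)], axis=1)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="time", event_col="event")
cph.print_summary()  # exp(coef) on ctc_>=5 vs the 0-CTC reference
```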


2020 ◽  
Vol 35 (6) ◽  
pp. 1032-1042
Author(s):  
Duk-Hee Kang ◽  
Yuji Lee ◽  
Carola Ellen Kleine ◽  
Yong Kyu Lee ◽  
Christina Park ◽  
...  

Abstract Background Eosinophils are traditionally known as moderators of allergic reactions; however, they have now emerged as one of the principal immune-regulating cells as well as predictors of vascular disease and mortality in the general population. Although eosinophilia has been demonstrated in hemodialysis (HD) patients, the associations of eosinophil count (EOC) and its changes with mortality in HD patients are still unknown. Methods In 107,506 incident HD patients treated by a large dialysis organization during 2007–11, we examined the relationships of baseline and time-varying EOC and its changes (ΔEOC) over the first 3 months with all-cause mortality, using Cox proportional hazards models with three levels of hierarchical adjustment. Results The baseline median EOC was 231 (interquartile range [IQR] 155–339) cells/μL, and eosinophilia (>350 cells/μL) was observed in 23.4% of patients. There was a gradual increase in EOC over time after HD initiation, with a median ΔEOC of 5.1 (IQR −53 to 199) cells/μL, which did not parallel changes in white blood cell count. In fully adjusted models, mortality risk was highest in subjects with lower baseline and time-varying EOC (<100 cells/μL) and was also slightly higher in patients with higher levels (≥550 cells/μL), resulting in a reverse J-shaped relationship. The relationship of ΔEOC with all-cause mortality was likewise reverse J-shaped, with both increases and decreases associated with higher mortality risk. Conclusions Both lower and higher EOCs, and changes in EOC over the first 3 months after HD initiation, were associated with higher all-cause mortality in incident HD patients.
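
A sketch of a time-varying Cox model of the kind described above, where the eosinophil covariate is updated at intervals so each patient contributes multiple (start, stop] rows; patient data and interval lengths are simulated.

```python
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(8)
rows = []
for pid in range(500):
    t, dead = 0, 0
    while t < 36 and not dead:
        stop = min(t + 3, 36)                 # 3-month intervals
        dead = int(rng.random() < 0.03)       # simulated death in interval
        rows.append({"id": pid, "start": t, "stop": stop,
                     "eoc": rng.normal(250, 90), "event": dead})
        t = stop
df = pd.DataFrame(rows)

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", event_col="event",
        start_col="start", stop_col="stop")
ctv.print_summary()  # exp(coef) on eoc: hazard per cell/μL, time-updated
```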


RMD Open ◽  
2019 ◽  
Vol 5 (2) ◽  
pp. e001015 ◽  
Author(s):  
Fernando Pérez Ruiz ◽  
Pascal Richette ◽  
Austin G Stack ◽  
Ravichandra Karra Gurunath ◽  
Ma Jesus García de Yébenes ◽  
...  

Objective: To determine the impact of achieving a serum uric acid (sUA) level of <0.36 mmol/L on overall and cardiovascular (CV) mortality in patients with gout. Methods: Prospective cohort of patients with gout recruited from 1992 to 2017. Exposure was defined as the average sUA recorded during the first year of follow-up, dichotomised as ≤0.36 or >0.36 mmol/L. Bivariate and multivariate Cox proportional hazards models were used to determine mortality risks, expressed as HRs with 95% CIs. Results: Of 1193 patients, 92% were men, with a mean age of 60 years, 6.8 years' disease duration, an average of three to four flares in the previous year, a mean sUA of 9.1 mg/dL at baseline, and a mean follow-up of 48 months; 158 patients died. Crude mortality rates were significantly higher for an sUA of ≥0.36 mmol/L, at 80.9 per 1000 patient-years (95% CI 59.4 to 110.3), than for an sUA of <0.36 mmol/L, at 25.7 per 1000 patient-years (95% CI 21.3 to 30.9). After adjustment for age, sex, CV risk factors, previous CV events, observation period and baseline sUA concentration, an sUA of ≥0.36 mmol/L was associated with elevated overall mortality (HR=2.33, 95% CI 1.60 to 3.41) and CV mortality (HR=2.05, 95% CI 1.21 to 3.45). Conclusions: Failure to reach a target sUA level of <0.36 mmol/L in patients with hyperuricaemia of gout is an independent predictor of overall and CV-related mortality. Targeting sUA levels of <0.36 mmol/L should be a principal goal in these high-risk patients in order to reduce CV events and extend patient survival.
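
The crude mortality rates per 1000 patient-years quoted above come from a simple deaths-over-person-time calculation, sketched below with simulated follow-up times rather than the cohort's data.

```python
import numpy as np

rng = np.random.default_rng(9)
years = rng.exponential(4.0, 1193)   # follow-up per patient (simulated)
died = rng.binomial(1, 0.13, 1193)   # death indicator

# Crude rate: events divided by accumulated patient-years, per 1000
rate = died.sum() / years.sum() * 1000
print(f"{rate:.1f} deaths per 1000 patient-years")
```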


2006 ◽  
Vol 24 (18_suppl) ◽  
pp. 560-560 ◽  
Author(s):  
D. A. Patt ◽  
Z. Duan ◽  
G. Hortobagyi ◽  
S. H. Giordano

Background: Adjuvant chemotherapy for breast cancer is associated with the development of secondary AML, but this risk in an older population has not been previously quantified. Methods: We queried data from the Surveillance, Epidemiology, and End Results-Medicare (SEER-Medicare) database for women diagnosed with nonmetastatic breast cancer from 1992–1999. We compared the risk of AML in patients with and without adjuvant chemotherapy (C), and across C regimens. The primary endpoint was a claim with an inpatient or outpatient diagnosis of AML (ICD-9 codes 205–208). Risk of AML was estimated using the Kaplan-Meier method. Cox proportional hazards models were used to determine factors independently associated with AML. Results: 36,904 patients were included in this observational study: 4,572 who had received adjuvant C and 32,332 who had not. The median patient age was 75.3 years (66.0–103.3). The median follow-up was 63 months (13–132). Patients who received C were significantly younger, had more advanced-stage disease, and had lower comorbidity scores (p<0.001). The unadjusted risk of developing AML at 10 years was 1.6% after any adjuvant C for breast cancer versus 1.1% for women who had not received C. The adjusted HR for AML with adjuvant C was 1.72 (1.16–2.54) compared to women who did not receive C; the HR for radiation was 1.21 (0.86–1.70). The HR increased with age, but not significantly (p>0.05). An analysis was performed among women who received C. Compared to other C regimens, anthracycline-based therapy (A) conveyed a significantly higher hazard of AML (HR 2.17, 1.08–4.38), while patients who received A plus taxanes (T) did not have a significant increase in risk (HR 1.29, 0.44–3.82), nor did patients who received T with some other C (HR 1.50, 0.34–6.67). GCSF use was another significant independent predictor of AML (HR 2.21, 1.14–4.25). In addition, increasing A dose was associated with higher risk of AML (p<0.05). Conclusions: There is a small but real increase in AML after adjuvant chemotherapy for breast cancer in older women. The risk appears to be highest with A-based regimens, most of which also contained cyclophosphamide, and may be dose-dependent. Taxanes do not appear to increase risk. The role of GCSF should be further explored. No significant financial relationships to disclose.
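
A sketch of the unadjusted 10-year risk comparison described above, reading cumulative risk off Kaplan-Meier fits at a fixed time point; the cohort sizes echo the abstract, but all event data are simulated.

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(10)
n = 36904
chemo = rng.binomial(1, 0.12, n)               # 1 = adjuvant chemotherapy
years = rng.exponential(40, n).clip(max=11)    # follow-up, censored at 11 y
aml = rng.binomial(1, 0.012, n)                # simulated AML diagnosis

for label, mask in [("chemo", chemo == 1), ("no chemo", chemo == 0)]:
    kmf = KaplanMeierFitter()
    kmf.fit(years[mask], aml[mask])
    # Cumulative risk at 10 years = 1 - S(10)
    risk10 = 1 - kmf.survival_function_at_times(10).iloc[0]
    print(f"{label}: 10-year AML risk = {risk10:.2%}")
```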

