Abstract P108: Side Effects Of Initial Combination Versus Monotherapy For Patients With Hypertension

Hypertension ◽  
2021 ◽  
Vol 78 (Suppl_1) ◽  
Author(s):  
Jaejin An ◽  
Matthew Mefford ◽  
Liang Ni ◽  
Rong Wei ◽  
Hui Zhou ◽  
...  

Clinical guidelines recommend initiating combination antihypertensive therapy for many patients with hypertension. However, data on the risk of side effects are limited. We evaluated side effects associated with initiating combination therapy versus monotherapy among patients with hypertension from Kaiser Permanente Southern California between 2008 and 2014. Patient characteristics, antihypertensive medication use, and possible side effects were collected using electronic health records. We examined the association of initial combination therapy with the incidence of side effects, including acute kidney injury, hypotension, injurious fall, hyperkalemia, hypokalemia, hyponatremia, and hyperuricemia, using multivariable Cox proportional hazards models. Of 164,805 patients, 44% initiated combination therapy (34% angiotensin-converting enzyme inhibitor (ACEI)-thiazide diuretic (TD); 10% other combinations) and 56% initiated monotherapy (22% ACEIs; 16% TDs; 11% beta blockers (BB); 7% calcium channel blockers). Incidence rates of side effects ranged from 3.8 (hyperkalemia) to 55.5 (hypokalemia) per 1000 person-years during a median follow-up of 0.27-0.45 years. Initiation of ACEI-TD combination therapy was associated with a lower risk of hyperkalemia than ACEI monotherapy and a lower risk of hypokalemia than TD monotherapy (Table). Initiation of ACEI-TD combination therapy was associated with a higher risk of hyponatremia, hyperuricemia, and hypotension, but not with injurious falls, when compared with the monotherapy groups. Monitoring for side effects following initiation of combination antihypertensive therapy may be useful.
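
The core analysis here is a multivariable Cox proportional hazards model fit separately for each side effect. A minimal sketch of that kind of fit with the Python lifelines library follows; the file, data frame, and column names are hypothetical stand-ins, not the study's variables.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical analytic file: one row per patient with follow-up time,
# an event indicator for one side effect, the exposure, and covariates.
df = pd.read_csv("cohort.csv")

cph = CoxPHFitter()
cph.fit(
    df[["followup_years", "hyperkalemia", "combo_acei_td", "age", "male", "egfr"]],
    duration_col="followup_years",
    event_col="hyperkalemia",
)
cph.print_summary()  # the exp(coef) column gives hazard ratios with 95% CIs
```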

Hypertension ◽  
2021 ◽  
Vol 77 (1) ◽  
pp. 103-113
Author(s):  
Jaejin An ◽  
Tiffany Luong ◽  
Lei Qian ◽  
Rong Wei ◽  
Ran Liu ◽  
...  

Many patients with hypertension require 2 or more drug classes to achieve their blood pressure (BP) goal. We compared antihypertensive medication treatment patterns and BP control between patients who initiated combination therapy versus monotherapy. We identified adults with hypertension enrolled in a US integrated healthcare system who initiated antihypertensive medication between 2008 and 2014. Patient demographics, clinical characteristics, antihypertensive medication, and BP were extracted from electronic health records. Antihypertensive medication patterns and multivariable-adjusted prevalence ratios (PRs) of achieving the 2017 American College of Cardiology/American Heart Association guideline-recommended BP <130/80 mm Hg were evaluated for 2 years following treatment initiation. Of 135 971 patients, 43% initiated antihypertensive combination therapy (35% ACEI [angiotensin-converting enzyme inhibitor]-thiazide diuretic; 8% other combinations) and 57% initiated monotherapy (22% ACEIs; 16% thiazide diuretics; 11% β blockers; 8% calcium channel blockers). After multivariable adjustment including premedication BP levels, patients who initiated ACEI-thiazide diuretic combination therapy were more likely to achieve BP <130/80 mm Hg than their counterparts who initiated monotherapy with an ACEI (PR, 1.10 [95% CI, 1.08–1.12]), thiazide diuretic (PR, 1.21 [95% CI, 1.18–1.24]), β blocker (PR, 1.17 [95% CI, 1.14–1.20]), or calcium channel blocker (PR, 1.25 [95% CI, 1.22–1.29]). Compared with patients initiating monotherapy, those initiating ACEI-thiazide diuretic combination therapy were more likely to achieve BP goals.
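
The abstract reports adjusted prevalence ratios for a binary outcome; its exact estimator is not given here, but modified Poisson regression with robust (sandwich) standard errors is one standard way to obtain a PR. A sketch with statsmodels, using hypothetical column names:

```python
import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

# df: hypothetical frame, one row per patient;
# bp_controlled = 1 if BP <130/80 mm Hg within 2 years, else 0.
fit = smf.glm(
    "bp_controlled ~ combo_acei_td + age + male + baseline_sbp",
    data=df,
    family=sm.families.Poisson(),
).fit(cov_type="HC0")  # robust SEs make the Poisson fit valid for binary data

pr = np.exp(fit.params["combo_acei_td"])          # prevalence ratio
ci = np.exp(fit.conf_int().loc["combo_acei_td"])  # 95% CI on the PR scale
```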


2021 ◽  
Vol 20 (1) ◽  
Author(s):  
Djibril M. Ba ◽  
Xiang Gao ◽  
Joshua Muscat ◽  
Laila Al-Shaar ◽  
Vernon Chinchilli ◽  
...  

Abstract Background Whether consumption of mushrooms, which are rich in several bioactive compounds, including the key antioxidants ergothioneine and glutathione, is associated with lower all-cause and cause-specific mortality remains uncertain. This study aimed to prospectively investigate the association between mushroom consumption and all-cause and cause-specific mortality risk. Methods We conducted longitudinal analyses of participants from the Third National Health and Nutrition Examination Survey (NHANES III) extant data (1988–1994). Mushroom intake was assessed by a single 24-h dietary recall using the US Department of Agriculture food codes for recipe foods. All-cause and cause-specific mortality were assessed in all participants linked to the National Death Index mortality data (1988–2015). We used Cox proportional hazards regression models to calculate multivariable-adjusted hazard ratios (HRs) and 95% confidence intervals (95% CIs) for all-cause and cause-specific mortality. Results Among 15,546 participants included in the current analysis, the mean (SE) age was 44.3 (0.5) years. During a mean (SD) follow-up duration of 19.5 (7.4) years, a total of 5826 deaths were documented. Participants who reported consuming mushrooms had a lower risk of all-cause mortality compared with those without mushroom intake (adjusted hazard ratio (HR) = 0.84; 95% CI: 0.73–0.98) after adjusting for demographic and major lifestyle factors, overall diet quality, and other dietary factors including total energy. When cause-specific mortality was examined, we did not observe any statistically significant associations with mushroom consumption. Consuming 1 serving of mushrooms per day in place of 1 serving of processed or red meat was associated with a lower risk of all-cause mortality (adjusted HR = 0.65; 95% CI: 0.50–0.84). We also observed a dose-response relationship between higher mushroom consumption and lower risk of all-cause mortality (P-trend = 0.03). Conclusion Mushroom consumption was associated with a lower risk of total mortality in this nationally representative sample of US adults.
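
The 1-serving substitution estimate is commonly obtained by entering both foods (plus total energy) in the same Cox model and exponentiating the difference of their coefficients; whether this study used exactly that parameterization is an assumption, as are the variable names below.

```python
import numpy as np
from lifelines import CoxPHFitter

# df: hypothetical frame with servings/day of mushrooms and processed/red
# meat, total energy, other covariates, and an all-cause death indicator.
cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died")

b, V = cph.params_, cph.variance_matrix_
diff = b["mushrooms"] - b["red_meat"]  # swap 1 serving of meat for mushrooms
se = np.sqrt(V.loc["mushrooms", "mushrooms"] + V.loc["red_meat", "red_meat"]
             - 2 * V.loc["mushrooms", "red_meat"])
hr = np.exp(diff)
lo, hi = np.exp(diff - 1.96 * se), np.exp(diff + 1.96 * se)
```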


2021 ◽  
pp. 1-26
Author(s):  
Qi Gao ◽  
Jia-Yi Dong ◽  
Renzhe Cui ◽  
Isao Muraki ◽  
Kazumasa Yamagishi ◽  
...  

Abstract We sought to examine the prospective associations of specific fruit consumption, in particular flavonoid-rich fruit (FRF) consumption, with the risk of stroke and stroke subtypes in a Japanese population. The study followed a total of 39,843 men and 47,334 women aged 44-76 years who were free of cardiovascular disease, diabetes, and cancer at baseline, from 1995 and 1998 to the end of 2009 and 2012, respectively. Data on total and specific FRF consumption for each participant were obtained using a self-administered food frequency questionnaire. The hazard ratios (HRs) of stroke in relation to total and specific FRF consumption were estimated using Cox proportional hazards regression models. During a median follow-up of 13.1 years, 4092 incident stroke cases (2557 cerebral infarctions and 1516 hemorrhagic strokes) were documented. After adjustment for age, body mass index, study area, lifestyle, dietary factors, and other risk factors, total FRF consumption was associated with a significantly lower risk of stroke in women (HR = 0.70; 95% CI, 0.58-0.84), while the association in men was not significant (HR = 0.93; 95% CI, 0.79-1.09). Among specific FRFs, consumption of citrus fruits, strawberries, and grapes was associated with a lower stroke risk in women. Higher consumption of FRFs, in particular citrus fruits, strawberries, and grapes, was associated with a lower risk of developing stroke in Japanese women.
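
Because the HRs are reported separately for women and men, the same multivariable Cox fit is repeated within each sex stratum. A compact sketch under that assumption (all column names hypothetical):

```python
import numpy as np
from lifelines import CoxPHFitter

# df: hypothetical frame with follow-up time, stroke indicator,
# FRF servings/day, sex, and covariates.
for sex, sub in df.groupby("sex"):
    cph = CoxPHFitter()
    cph.fit(sub.drop(columns="sex"),  # drop the constant column per stratum
            duration_col="followup_years", event_col="stroke")
    hr = cph.hazard_ratios_["frf_servings"]
    lo, hi = np.exp(cph.confidence_intervals_.loc["frf_servings"])
    print(f"{sex}: HR {hr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```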


Circulation ◽  
2021 ◽  
Vol 143 (Suppl_1) ◽  
Author(s):  
Shutong Du ◽  
Hyunju Kim ◽  
Josef Coresh ◽  
Casey M Rebholz

Introduction: Ultra-processed foods are food and drink products formulated through sequences of industrial processes, and they generally contain additives with no culinary use. Previous studies have linked higher ultra-processed food intake with several cardiometabolic and cardiovascular diseases. However, longitudinal evidence from US populations remains scarce. Hypothesis: We hypothesized that higher intake of ultra-processed food is associated with a higher risk of coronary heart disease (CHD). Methods: We selected 12,607 adults aged 44-66 years in 4 US communities from the ARIC study at baseline. Dietary intake data were collected through a validated 66-item food frequency questionnaire. Ultra-processed foods were defined using the NOVA classification, and the level of intake was calculated for each participant. We used Cox proportional hazards models to study the association between quartiles of ultra-processed food intake and incident CHD. Nonlinearity was assessed using restricted cubic spline regression. Results: There were 1,899 incident CHD cases documented over a median follow-up of 27 years (291,285.2 person-years). Incidence rates were higher in the highest quartile of ultra-processed food intake (71.6 per 10,000 person-years; 95% CI, 65.8-78.0) compared with the lowest quartile (59.7 per 10,000 person-years; 95% CI, 54.3-65.7). Participants in the highest vs lowest quartile had an 18% higher risk of CHD (hazard ratio, 1.18 [95% CI, 1.04-1.34]; P-trend = 0.010) after adjusting for sociodemographic factors and health behaviors. An approximately linear relationship was observed between ultra-processed food intake and risk of CHD beyond 4 servings/day (Figure). Conclusion: Higher ultra-processed food intake was associated with a higher risk of coronary heart disease among middle-aged US adults. Further prospective studies are needed to confirm these findings and to investigate the mechanisms by which ultra-processed food may affect health.
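
Nonlinearity was assessed with restricted cubic splines. Below is a minimal sketch of adding Harrell-style RCS terms for ultra-processed food servings to a Cox model; the knot percentiles and column names are illustrative assumptions, not the paper's specification.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def rcs_basis(x, knots):
    """Restricted cubic spline basis terms (Harrell's parameterization)."""
    x, k = np.asarray(x, float), np.asarray(knots, float)
    p3 = lambda u: np.clip(u, 0, None) ** 3  # truncated cubic (u)+^3
    cols = {}
    for j in range(len(k) - 2):
        cols[f"rcs{j + 1}"] = (
            p3(x - k[j])
            - p3(x - k[-2]) * (k[-1] - k[j]) / (k[-1] - k[-2])
            + p3(x - k[-1]) * (k[-2] - k[j]) / (k[-1] - k[-2])
        ) / (k[-1] - k[0]) ** 2
    return pd.DataFrame(cols)

# 4 knots at illustrative percentiles -> linear term + 2 spline terms
knots = np.percentile(df["upf_servings"], [5, 35, 65, 95])
X = pd.concat(
    [df[["followup_years", "chd", "upf_servings"]].reset_index(drop=True),
     rcs_basis(df["upf_servings"], knots)],
    axis=1,
)
cph = CoxPHFitter().fit(X, duration_col="followup_years", event_col="chd")
```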


Circulation ◽  
2015 ◽  
Vol 131 (suppl_1) ◽  
Author(s):  
Yariv Gerber ◽  
Susan A Weston ◽  
Maurice E Sarano ◽  
Sheila M Manemann ◽  
Alanna M Chamberlain ◽  
...  

Background: Little is known about the association between the extent of coronary artery disease (CAD) and the risk of heart failure (HF) after myocardial infarction (MI), and whether it differs by reduced (HFrEF) or preserved (HFpEF) ejection fraction (EF) has yet to be determined. Subjects and Methods: Olmsted County, Minnesota residents (n=1,924; mean age, 64 years; 66% male) with first MI diagnosed in 1990-2010 and no prior HF were followed through 2013. Framingham Heart Study criteria were used to define HF, which was further classified according to EF (applying a 50% cutoff). The extent of angiographic CAD was defined at index MI according to the number of major epicardial coronary arteries with ≥50% lumen diameter obstruction. Fine & Gray and Cox proportional hazards regression models were used to assess the association of CAD categories with incidence of HF, and multiple imputation methodology was applied to account for the 19% of patients with missing EF data. Results: During a mean (SD) follow-up of 6.7 (5.9) years, 594 patients developed HF. Adjusted for age and sex, with death considered a competing risk, the cumulative incidence rates of HF among patients with 1- (n=581), 2- (n=622), and 3-vessel disease (n=721) were 11.2%, 14.6%, and 20.5% at 30 days, and 18.1%, 22.3%, and 29.4% at 5 years after MI, respectively. The increased risk of HF with a greater number of diseased vessels was only modestly attenuated after further adjustment for patient and MI characteristics, and did not differ materially by EF (Table). Conclusions: The extent of angiographic CAD, expressed as the number of diseased vessels, is independently associated with HF incidence after MI. The association is evident promptly after MI and applies to both HFrEF and HFpEF.
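
With death treated as a competing risk, cumulative incidence (not one minus Kaplan-Meier) is the quantity reported. lifelines does not provide Fine & Gray regression, but its Aalen-Johansen estimator yields the analogous nonparametric cumulative incidence curves per CAD category; a sketch with hypothetical event coding:

```python
from lifelines import AalenJohansenFitter

# df: hypothetical frame; event coded 0 = censored, 1 = incident HF,
# 2 = death before HF (the competing risk).
for vessels, sub in df.groupby("n_diseased_vessels"):
    ajf = AalenJohansenFitter()
    ajf.fit(sub["followup_years"], sub["event"], event_of_interest=1)
    # cumulative incidence function for HF, accounting for death
    print(vessels, ajf.cumulative_density_.iloc[-1].values)
```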


Circulation ◽  
2014 ◽  
Vol 130 (suppl_2) ◽  
Author(s):  
Raffaele De Caterina ◽  
Ulrika Andersson ◽  
John H Alexander ◽  
M.Cecilia Bahit ◽  
Patrick J Commerford ◽  
...  

Background: A history of bleeding weighs importantly in decisions about anticoagulation. We analyzed outcomes in relation to history of bleeding and randomized treatment in patients with atrial fibrillation (AF) in the ARISTOTLE trial. Methods: The on-treatment safety population included 18,140 patients receiving ≥1 dose of study drug, either apixaban 5 mg bd (2.5 mg bd if 2 of the following: age >80 yrs; body weight <60 kg; or creatinine >133 μmol/L) or warfarin targeting an INR of 2.0-3.0 (median TTR 66%), for a median of 1.8 yrs. Adjudicated outcomes in relation to randomization and history of bleeding were analyzed using a Cox proportional hazards model. Efficacy endpoints were analyzed in the intention-to-treat population. Results: A history of bleeding was reported in 3033 patients (16.7%), who were more often male (68% vs 64%, p<0.0005) and more often had a history of prior stroke/TIA/systemic embolism (23% vs 19%, p<0.0001) and diabetes (27% vs 24%, p=0.0010); they had higher CHADS2 scores (CHADS2 >3: 35% vs 29%), age (mean [SD] 71 [9] vs 69 [10], p<0.0001), and body weight (86 [21] vs 84 [21] kg, p<0.0001), and lower creatinine clearance (77 [33] vs 80 [33] mL/min, p=0.0007) and mean systolic blood pressure (131 [17] vs 132 [16] mm Hg, p=0.0027). Calcium channel blockers, statins, non-steroidal anti-inflammatory drugs, and proton pump inhibitors were used more often in patients with vs without a history of bleeding. Major bleeding was the only outcome event occurring more frequently in patients with vs without a history of bleeding: HR 1.7 (95% CI 1.4-2.3) with apixaban and 1.5 (1.2-1.0) with warfarin. For primary efficacy and safety outcomes in relation to randomization, see the Table. Conclusions: In patients with AF, a history of bleeding was associated with several risk factors for stroke and bleeding and, accordingly, a higher bleeding risk during anticoagulation. The benefits of apixaban vs warfarin for stroke, mortality, and major bleeding were, however, consistent irrespective of bleeding history.
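
Comparing outcomes "in relation to randomization and history of bleeding" is naturally expressed as a treatment-by-history interaction in the Cox model. A hedged sketch; the column names and covariate set are illustrative, not the trial's analysis code.

```python
from lifelines import CoxPHFitter

# df: hypothetical on-treatment frame with time on study drug,
# a major-bleeding event flag, treatment arm, and bleeding history.
cph = CoxPHFitter()
cph.fit(df, duration_col="years_on_treatment", event_col="major_bleed",
        formula="apixaban * bleed_history + age + weight_kg + creatinine")
cph.print_summary()  # the interaction term tests for effect modification
```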


OBJECTIVE The challenges of posterior cervical fusions (PCFs) at the cervicothoracic junction (CTJ) are widely known, including the development of adjacent-segment disease when fusions stop at C7. One solution has been to cross the CTJ (T1/T2) rather than stopping at C7. This approach may have undue consequences, however, including increased reoperations for symptomatic nonunion (operative nonunion). The authors sought to investigate whether there is a difference in operative nonunion between PCFs that stop at C7 and those that stop at T1/T2. METHODS A retrospective analysis identified patients from the authors’ spine registry (Kaiser Permanente) who underwent PCFs with caudal fusion levels at C7 and T1/T2. Demographics, diagnoses, operative times, lengths of stay, and reoperations were extracted from the registry. Operative nonunion was adjudicated via chart review. Patients were followed until validated operative nonunion, membership termination, death, or end of study (March 31, 2020). Descriptive statistics and 2-year crude incidence rates with 95% confidence intervals for operative nonunion were reported for PCFs stopping at C7 or T1/T2. Time-dependent crude and adjusted multivariable Cox proportional hazards models were used to evaluate operative nonunion rates. RESULTS The authors identified 875 patients with PCFs (beginning at C3, C4, C5, or C6) stopping at either C7 (n = 470) or T1/T2 (n = 405), with a mean follow-up time of 4.6 ± 3.3 years and a mean time to operative nonunion of 0.9 ± 0.6 years. There were 17 operative nonunions, and, after adjustment for age at surgery and smoking status, the cumulative incidence rates were similar between constructs stopping at C7 and those extending to T1/T2 (C7: 1.91% [95% CI 0.88%–3.60%]; T1/T2: 1.98% [95% CI 0.86%–3.85%]). In both the crude model and the model adjusted for age at surgery and smoking status, no difference in risk was found for constructs extended to T1/T2 compared with those stopping at C7 (adjusted HR 1.09 [95% CI 0.42–2.84], p = 0.86). CONCLUSIONS In one of the largest cohorts of patients with PCFs stopping at C7 or T1/T2, with an average follow-up of > 4 years, the authors found no statistically significant difference in reoperation rates for symptomatic nonunion (operative nonunion). This finding suggests that there is no added risk of operative nonunion in extending PCFs to T1/T2 rather than stopping at C7.
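
Crude incidence rates with 95% CIs, as reported here, are commonly given exact Poisson intervals. A small sketch; the person-time below is reconstructed from the reported cohort size and mean follow-up and is illustrative only.

```python
from scipy.stats import chi2

def exact_rate_ci(events, person_years, alpha=0.05):
    """Crude incidence rate with an exact (Poisson) confidence interval."""
    rate = events / person_years
    lo = chi2.ppf(alpha / 2, 2 * events) / (2 * person_years) if events else 0.0
    hi = chi2.ppf(1 - alpha / 2, 2 * (events + 1)) / (2 * person_years)
    return rate, lo, hi

# Illustrative: 17 operative nonunions over roughly 875 * 4.6 person-years.
rate, lo, hi = exact_rate_ci(17, 875 * 4.6)
print(f"{1000 * rate:.2f} ({1000 * lo:.2f}-{1000 * hi:.2f}) per 1000 person-years")
```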


Circulation ◽  
2017 ◽  
Vol 135 (suppl_1) ◽  
Author(s):  
James M Shikany ◽  
Monika M Safford ◽  
Joanna Bryan ◽  
PK Newby ◽  
Joshua S Richman ◽  
...  

Background: We have shown that the Southern dietary pattern, characterized by added fats, fried foods, organ and processed meats, and sugar-sweetened beverages, is associated with a greater risk of incident CHD in REGARDS, a national, population-based, longitudinal cohort. We sought to determine whether the Southern pattern, other dietary patterns, and the Mediterranean diet score were associated with CHD events and mortality in REGARDS participants who previously reported CHD. Methods: REGARDS enrolled white and black adults aged ≥45 years between 2003 and 2007. Data were analyzed from 3,562 participants with CHD at baseline. Participants completed an FFQ at baseline, from which 5 dietary patterns were derived through factor analysis (Table). The Mediterranean diet score was calculated for each participant. Expert-adjudicated CHD events included myocardial infarction and CHD death. Cox proportional hazards regression was used to model the association of the dietary patterns and score with CHD events and death, adjusting for sociodemographics, lifestyle factors, energy intake, anthropometrics, and medical conditions. Results: Over 7 years of follow-up, there were 581 recurrent CHD events and 1,098 deaths. In fully adjusted analyses, the highest quartile of adherence to the alcohol/salads pattern and the highest group of the Mediterranean diet score were associated with lower risk of recurrent CHD compared with the lowest quartile/group (HR: 0.76; 95% CI: 0.59–0.98 and HR: 0.78; 95% CI: 0.62–0.98, respectively). The highest quartile of adherence to the Southern pattern was associated with higher mortality (HR: 1.57; 95% CI: 1.28–1.91), while the highest group of the Mediterranean diet score was associated with lower mortality (HR: 0.80; 95% CI: 0.68–0.95). Conclusions: While the Southern dietary pattern was not related to risk of recurrent CHD, it was associated with higher mortality in REGARDS participants with existing CHD. Greater adherence to a Mediterranean diet was associated with lower risk of recurrent CHD and mortality.
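
The dietary patterns here are derived by factor analysis of FFQ food groups. A minimal sketch with scikit-learn; the five-factor choice mirrors the abstract, but the `ffq` frame, scaling, and varimax rotation are assumptions for illustration.

```python
import pandas as pd
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

# ffq: hypothetical frame, one row per participant,
# one column per FFQ food group (servings/day).
Z = StandardScaler().fit_transform(ffq)
fa = FactorAnalysis(n_components=5, rotation="varimax").fit(Z)

# Loadings name the patterns (e.g., a "Southern" factor loading on added
# fats, fried foods, processed meats, and sugar-sweetened beverages).
loadings = pd.DataFrame(fa.components_.T, index=ffq.columns)
scores = fa.transform(Z)  # per-participant adherence scores, then quartiled
```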


2019 ◽  
Vol 48 (2) ◽  
pp. 240-249 ◽  
Author(s):  
Alpesh Amin ◽  
Allison Keshishian ◽  
Oluwaseyi Dina ◽  
Amol Dhamane ◽  
Anagha Nadkarni ◽  
...  

Abstract Atrial fibrillation (AF) prevalence increases with age; >80% of US adults with AF are aged ≥65 years. We compared the risk of stroke/systemic embolism (SE), major bleeding (MB), net clinical outcome (NCO), and major adverse cardiac events (MACE) among elderly non-valvular AF (NVAF) Medicare patients prescribed direct oral anticoagulants (DOACs) vs warfarin. NVAF patients aged ≥65 years who initiated DOACs (apixaban, dabigatran, or rivaroxaban) or warfarin were selected from CMS Medicare data from 01JAN2013 to 31DEC2015. Propensity score matching was used to balance the DOAC and warfarin cohorts. Cox proportional hazards models estimated the risk of stroke/SE, MB, NCO, and MACE. 37,525 apixaban–warfarin, 18,131 dabigatran–warfarin, and 55,359 rivaroxaban–warfarin pairs were included. Compared with warfarin, apixaban (HR: 0.69; 95% CI 0.59–0.81) and rivaroxaban (HR: 0.82; 95% CI 0.73–0.91) had lower risk of stroke/SE, and dabigatran (HR: 0.88; 95% CI 0.72–1.07) had similar risk of stroke/SE. Apixaban (MB: HR: 0.61; 95% CI 0.57–0.67; NCO: HR: 0.64; 95% CI 0.60–0.69) and dabigatran (MB: HR: 0.79; 95% CI 0.71–0.89; NCO: HR: 0.84; 95% CI 0.76–0.93) had lower risk of MB and NCO, while rivaroxaban had higher risk of MB (HR: 1.08; 95% CI 1.02–1.14) and similar risk of NCO (HR: 1.04; 95% CI 0.99–1.09). Compared with warfarin, apixaban had a lower risk of stroke/SE, MB, and NCO; dabigatran had a lower risk of MB and NCO; and rivaroxaban had a lower risk of stroke/SE but a higher risk of MB. All DOACs had lower risk of MACE compared with warfarin.
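
Propensity score matching as described is typically logistic-regression scores followed by 1:1 nearest-neighbor matching on the logit within a caliper. A greedy, with-replacement sketch for illustration; the paper's exact matching algorithm, caliper, and covariates are not specified here, so everything below is an assumption.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# df: hypothetical frame; covariate_cols is a hypothetical list of
# baseline covariates; "doac" flags DOAC vs warfarin initiators.
X = df[covariate_cols].to_numpy()
treated = df["doac"].to_numpy().astype(bool)

ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
logit = np.log(ps / (1 - ps))
caliper = 0.2 * logit.std()  # a common caliper choice (an assumption here)

# For each DOAC patient, find the closest warfarin patient on the logit scale.
nn = NearestNeighbors(n_neighbors=1).fit(logit[~treated].reshape(-1, 1))
dist, idx = nn.kneighbors(logit[treated].reshape(-1, 1))
keep = dist.ravel() <= caliper  # matched pairs falling within the caliper
```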


Author(s):  
Hao-Ming Li ◽  
Shi-Zuo Liu ◽  
Ying-Kai Huang ◽  
Yuan-Chih Su ◽  
Chia-Hung Kao

Appendicitis is a common surgical condition in children. However, the effects of environmental factors, such as piped water supply, on pediatric appendicitis risk remain unclear. This longitudinal, nationwide cohort study aimed to compare the risk of appendicitis among children with different levels of piped water supply. Using data from the Taiwan Water Resource Agency and the National Health Insurance Research Database, we identified 119,128 children born in 1996–2010 in areas with the lowest piped water supply (prevalence 51.21% to 63.06%) as the study cohort; an additional 119,128 children born in the same period in areas with the highest piped water supply (prevalence 98.97% to 99.63%) were selected as controls. The cohorts were propensity-score matched on baseline variables. We calculated the hazard ratios (HRs) and 95% confidence intervals (CIs) of appendicitis in the study cohort compared with the controls using Cox proportional hazards regression. The study cohort had a higher overall incidence rate of appendicitis than the control cohort (12.8 vs. 8.7 per 10,000 person-years). After covariate adjustment, the risk of appendicitis was significantly increased in the study cohort (adjusted HR = 1.46, 95% CI: 1.35, 1.58, p < 0.001). Subgroup and sensitivity analyses showed consistent results: children with low piped water supply had a higher risk of appendicitis than those with high piped water supply. This study demonstrated that children with low piped water supply were at increased risk of appendicitis. Enhancing piped water availability in areas lacking an adequate, secure, and sanitary water supply may protect children against appendicitis.
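
The crude comparison of 12.8 vs 8.7 per 10,000 person-years is an incidence rate ratio; a log-normal CI sketch follows. The event counts and person-time below are hypothetical stand-ins chosen only to reproduce those rates, not the study's actual counts.

```python
import numpy as np

def irr_ci(a, pt_a, b, pt_b, z=1.96):
    """Incidence rate ratio with a log-normal 95% CI."""
    irr = (a / pt_a) / (b / pt_b)
    se = np.sqrt(1 / a + 1 / b)  # standard error of log(IRR)
    return irr, irr * np.exp(-z * se), irr * np.exp(z * se)

# Stand-in counts: 1280 and 870 events over 1,000,000 person-years each
# reproduce the reported 12.8 vs 8.7 per 10,000 person-years.
irr, lo, hi = irr_ci(1280, 1_000_000, 870, 1_000_000)
```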

