Central Blood Pressure and Cardiovascular Outcomes in Chronic Kidney Disease

2018 ◽  
Vol 13 (4) ◽  
pp. 585-595 ◽  
Author(s):  
Mahboob Rahman ◽  
Jesse Yenchih Hsu ◽  
Niraj Desai ◽  
Chi-yuan Hsu ◽  
Amanda H. Anderson ◽  
...  

Background and objectives: Central BP measurements provide noninvasive measurement of aortic BP; our objectives were to examine the association of central and brachial BP measurements with risk of cardiovascular outcomes and mortality in patients with CKD and to determine the role of central BP measurement in conjunction with brachial BP in estimating cardiovascular risk. Design, setting, participants, & measurements: In a prospective, longitudinal study (the Chronic Renal Insufficiency Cohort), central BP was measured in participants with CKD using the SphygmoCor PVx System. Cox proportional hazards models were used for analyses. Results: Mean age of the participants (n=2875) was 60 years. After a median follow-up of 5.5 years, participants in the highest quartile of brachial systolic BP (≥138 mm Hg) were at higher risk for the composite cardiovascular outcome (hazard ratio, 1.59; 95% confidence interval, 1.17 to 2.17; c statistic, 0.76) but not all-cause mortality (hazard ratio, 1.28; 95% confidence interval, 0.90 to 1.80) compared with those in the lowest quartile. Participants in the highest quartile of central systolic BP were also at higher risk for the composite cardiovascular outcome (hazard ratio, 1.69; 95% confidence interval, 1.24 to 2.31; c statistic, 0.76) compared with participants in the lowest quartile. Conclusions: We show that elevated brachial and central BP measurements are both associated with higher risk of cardiovascular disease outcomes in patients with CKD. Measurement of central BP does not improve the ability to predict cardiovascular disease outcomes or mortality in patients with CKD compared with brachial BP measurement.
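None of the abstracts in this listing include analysis code, but the kind of model described here, a Cox regression with the exposure coded in quartiles and discrimination summarized by a c-statistic, can be sketched in Python with the lifelines package. Everything below (variable names, data) is a synthetic illustration, not the CRIC analysis.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic stand-in data; the real analysis used CRIC participants and many more covariates.
rng = np.random.default_rng(0)
n = 2875
df = pd.DataFrame({
    "brachial_sbp": rng.normal(130, 18, n),
    "age": rng.normal(60, 11, n),
    "years": rng.exponential(5.5, n),
    "cv_event": rng.integers(0, 2, n),
})

# Code the exposure as quartiles; the lowest quartile (Q1) is the reference group.
df["sbp_quartile"] = pd.qcut(df["brachial_sbp"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
design = pd.get_dummies(df[["age", "years", "cv_event", "sbp_quartile"]],
                        columns=["sbp_quartile"], drop_first=True, dtype=int)

cph = CoxPHFitter()
cph.fit(design, duration_col="years", event_col="cv_event")
cph.print_summary()                                      # hazard ratios and 95% CIs per quartile vs. Q1
print("c-statistic:", round(cph.concordance_index_, 2))  # Harrell's C, the discrimination measure reported above
```

Comparing the c-statistic of a brachial-only model with one that also includes central BP is how one would test whether central BP improves prediction, which the authors report it does not.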

2021 ◽  
pp. 140349482199025
Author(s):  
Rand Jarroch ◽  
Tomi-Pekka Tuomainen ◽  
Behnam Tajik ◽  
Jussi Kauhanen

Aims: Little is known about the effect of economic recessions on cardiovascular disease. Therefore, we investigated the association of the economic recession in Finland in the 1990s with the incidence of cardiovascular disease among middle-aged and older women. Methods: A total of 918 women aged 53–73 years were examined for health and socioeconomic position in 1998–2001, as part of the population-based prospective Kuopio Ischaemic Heart Disease Risk Factor Study. The participants were asked whether Finland’s economic recession in the early 1990s had affected their lives socially or economically. The cohort was followed for 18 years, and incident physician-diagnosed cases of cardiovascular disease were obtained through record linkage with the national hospital discharge registry, which covers every hospitalisation in Finland. Cox proportional hazards regression models were used to estimate the risk of cardiovascular disease among those with and without exposure to socioeconomic hardships during the recession, after adjusting for possible confounders. Results: At baseline, 587 women reported having experienced socioeconomic hardships due to the recession. During follow-up, 501 women developed cardiovascular disease. After adjustment for age, the risk of cardiovascular disease was 27% higher among women exposed to socioeconomic hardships compared to those who were not (hazard ratio 1.27, 95% confidence interval 1.06–1.53, P=0.012). Further adjustments for overall socioeconomic position at baseline, prior cardiovascular health, and lifestyle factors did not attenuate the association (hazard ratio 1.23, 95% confidence interval 1.02–1.50, P=0.029). Conclusions: The early 1990s economic recession was associated with a subsequently increased risk of cardiovascular disease among Finnish women.


2021 ◽  
pp. 000486742110096
Author(s):  
Oleguer Plana-Ripoll ◽  
Patsy Di Prinzio ◽  
John J McGrath ◽  
Preben B Mortensen ◽  
Vera A Morgan

Introduction: An association between schizophrenia and urbanicity has long been observed, with studies in many countries, including several from Denmark, reporting that individuals born/raised in densely populated urban settings have an increased risk of developing schizophrenia compared to those born/raised in rural settings. However, these findings have not been replicated in all studies. In particular, a Western Australian study showed a gradient in the opposite direction which disappeared after adjustment for covariates. Given the different findings for Denmark and Western Australia, our aim was to investigate the relationship between schizophrenia and urbanicity in these two regions to determine which factors may be influencing the relationship. Methods: We used population-based cohorts of children born alive between 1980 and 2001 in Western Australia (N = 428,784) and Denmark (N = 1,357,874). Children were categorised according to the level of urbanicity of their mother’s residence at time of birth and followed up through to 30 June 2015. Linkage to State-based registers provided information on schizophrenia diagnosis and a range of covariates. Rates of being diagnosed with schizophrenia for each category of urbanicity were estimated using Cox proportional hazards models adjusted for covariates. Results: During follow-up, 1618 (0.4%) children in Western Australia and 11,875 (0.9%) children in Denmark were diagnosed with schizophrenia. In Western Australia, those born in the most remote areas did not experience lower rates of schizophrenia than those born in the most urban areas (hazard ratio = 1.02 [95% confidence interval: 0.81, 1.29]), unlike their Danish counterparts (hazard ratio = 0.62 [95% confidence interval: 0.58, 0.66]). However, when the Western Australian cohort was restricted to children of non-Aboriginal Indigenous status, results were consistent with the Danish findings (hazard ratio = 0.46 [95% confidence interval: 0.29, 0.72]). Discussion: Our study highlights the potential for disadvantaged subgroups to mask the contribution of urban-related risk factors to the risk of schizophrenia, and the importance of stratified analysis in such cases.
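The key analytic move in this abstract is refitting the same Cox model after restricting the cohort to a subgroup. A minimal sketch of that pattern, using lifelines with entirely hypothetical variable names and simulated data (not the Western Australian or Danish registers), might look like this:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 50_000
df = pd.DataFrame({
    "years": rng.uniform(1, 35, n),                 # follow-up time
    "schizophrenia": rng.binomial(1, 0.005, n),     # diagnosis during follow-up
    "remote_birth": rng.binomial(1, 0.3, n),        # 1 = born in the most remote category
    "subgroup": rng.binomial(1, 0.9, n),            # hypothetical restriction flag
})

def remote_vs_urban_hr(data: pd.DataFrame) -> float:
    """Hazard ratio for the most remote vs. most urban birth category."""
    cph = CoxPHFitter()
    cph.fit(data[["years", "schizophrenia", "remote_birth"]],
            duration_col="years", event_col="schizophrenia")
    return cph.hazard_ratios_["remote_birth"]

hr_full = remote_vs_urban_hr(df)                             # whole cohort
hr_restricted = remote_vs_urban_hr(df[df["subgroup"] == 1])  # restricted / stratified analysis
print(f"HR full cohort: {hr_full:.2f} | HR restricted subgroup: {hr_restricted:.2f}")
```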


Neurosurgery ◽  
2015 ◽  
Vol 77 (6) ◽  
pp. 880-887 ◽  
Author(s):  
Eric J. Heyer ◽  
Joanna L. Mergeche ◽  
Shuang Wang ◽  
John G. Gaudet ◽  
E. Sander Connolly

BACKGROUND: Early cognitive dysfunction (eCD) is a subtle form of neurological injury observed in ∼25% of carotid endarterectomy (CEA) patients. Statin use is associated with a lower incidence of eCD in asymptomatic patients having CEA. OBJECTIVE: To determine whether eCD status is associated with worse long-term survival in patients taking and not taking statins. METHODS: This is a post hoc analysis of a prospective observational study of 585 CEA patients. Patients were evaluated with a battery of neuropsychometric tests before and after surgery. Survival was compared for patients with and without eCD, stratified by statin use. At enrollment, 366 patients were on statins and 219 were not. Survival was assessed using Kaplan-Meier methods and multivariable Cox proportional hazards models. RESULTS: By log-rank test, age ≥75 years (P = .003), diabetes mellitus (P < .001), cardiac disease (P = .02), and statin use (P = .014) were each univariately associated with survival. In Cox proportional hazards models adjusting for these univariate factors and stratified by statin use, eCD was associated with significantly shorter survival among patients not taking statins (hazard ratio, 1.61; 95% confidence interval, 1.09–2.40; P = .018) but not among patients taking statins (hazard ratio, 0.98; 95% confidence interval, 0.59–1.66; P = .95). CONCLUSION: eCD is associated with shorter survival in patients not taking statins. This finding validates eCD as an important neurological outcome and suggests that eCD is a surrogate measure for overall health, comorbidity, and vulnerability to neurological insult.
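For the survival comparison described here, a univariate log-rank screen followed by Cox models fitted separately within statin strata, a hedged sketch with lifelines and simulated data could look like the following; the column names are placeholders, not the study's variables.

```python
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)
n = 585
df = pd.DataFrame({
    "years": rng.exponential(8, n) + 0.1,
    "died": rng.binomial(1, 0.4, n),
    "eCD": rng.binomial(1, 0.25, n),          # early cognitive dysfunction after CEA
    "statin": rng.binomial(1, 366 / 585, n),  # roughly the enrollment split reported above
    "age_ge_75": rng.binomial(1, 0.3, n),
    "diabetes": rng.binomial(1, 0.3, n),
})

# Kaplan-Meier estimate of overall survival.
km = KaplanMeierFitter().fit(df["years"], df["died"])

# Univariate screen with the log-rank test (shown for eCD; repeat per candidate factor).
with_ecd, without_ecd = df[df["eCD"] == 1], df[df["eCD"] == 0]
res = logrank_test(with_ecd["years"], without_ecd["years"],
                   event_observed_A=with_ecd["died"], event_observed_B=without_ecd["died"])
print("log-rank p-value for eCD:", round(res.p_value, 3))

# Multivariable Cox model fitted separately in the statin and non-statin strata.
for statin_use, grp in df.groupby("statin"):
    cph = CoxPHFitter()
    cph.fit(grp.drop(columns="statin"), duration_col="years", event_col="died")
    print(f"statin={statin_use}: HR for eCD = {cph.hazard_ratios_['eCD']:.2f}")
```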


Circulation ◽  
2017 ◽  
Vol 135 (suppl_1) ◽  
Author(s):  
Gabriel S Tajeu ◽  
Monika M Safford ◽  
George Howard ◽  
Rikki M Tanner ◽  
Paul Muntner

Introduction: Black Americans have higher rates of cardiovascular disease (CVD) mortality compared with whites. Differences in sociodemographic, psychosocial, CVD, and other risk factors may explain this increased mortality risk. Methods: We analyzed data from 29,015 REasons for Geographic and Racial Differences in Stroke study participants to determine factors that may explain the higher hazard ratios for CVD and non-CVD mortality in blacks compared with whites. Cause of death was adjudicated by trained investigators. Within age-sex subgroups, we used Cox proportional hazards regression with progressive adjustment to estimate black:white hazard ratios. Results: Overall, 41.0% of participants were black, and 54.9% were women. Over a mean follow-up of 7.1 years (maximum 12.3 years), 5,299 participants died (1,797 CVD and 3,502 non-CVD deaths). Among participants < 65 years of age, the age- and region-adjusted black:white hazard ratios for CVD mortality were 2.28 (95% CI: 1.68-3.10) for women and 2.32 (95% CI: 1.80-3.00) for men; among participants ≥ 65 years of age, they were 1.54 (95% CI: 1.30-1.82) and 1.35 (95% CI: 1.16-1.57) for women and men, respectively. The higher black:white hazard ratios for CVD mortality were no longer statistically significant after multivariable adjustment, with the largest attenuation occurring with sociodemographic and CVD risk factor adjustment. Among participants < 65 years of age, the age- and region-adjusted black:white hazard ratios for non-CVD mortality were 1.51 (95% CI: 1.24-1.85) for women and 1.76 (95% CI: 1.46-2.13) for men; among participants ≥ 65 years of age, they were 1.12 (95% CI: 1.00-1.26) and 1.34 (95% CI: 1.20-1.49) for women and men, respectively. The higher black:white hazard ratios for non-CVD mortality were attenuated after adjustment for sociodemographics. Conclusions: Black:white differences are larger for CVD than for non-CVD causes of death. The increased CVD mortality for blacks compared with whites is primarily explained by sociodemographic and CVD risk factors.
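The "progressive adjustment" mentioned in the Methods is simply a sequence of nested Cox models, each adding a block of covariates, with the black:white hazard ratio read off at every step. A minimal illustration with lifelines and made-up covariates (not the REGARDS variables) follows.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 29_015
df = pd.DataFrame({
    "years": rng.uniform(0.1, 12.3, n),
    "cvd_death": rng.binomial(1, 0.06, n),
    "black": rng.binomial(1, 0.41, n),
    "age": rng.normal(64, 9, n),
    "low_income": rng.binomial(1, 0.3, n),   # stand-in sociodemographic factor
    "sbp": rng.normal(130, 18, n),           # stand-in CVD risk factor
})

# Nested covariate sets: each model keeps the previous covariates and adds a new block.
models = {
    "age-adjusted":        ["black", "age"],
    "+ sociodemographics": ["black", "age", "low_income"],
    "+ CVD risk factors":  ["black", "age", "low_income", "sbp"],
}
for label, covariates in models.items():
    cph = CoxPHFitter()
    cph.fit(df[["years", "cvd_death"] + covariates],
            duration_col="years", event_col="cvd_death")
    print(f"{label:>22}: black:white HR = {cph.hazard_ratios_['black']:.2f}")
```

Attenuation of the hazard ratio as covariate blocks are added is what the authors interpret as those covariates "explaining" the racial difference.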


2019 ◽  
Vol 26 (14) ◽  
pp. 1510-1518 ◽  
Author(s):  
Claudia T Lissåker ◽  
Fredrika Norlund ◽  
John Wallert ◽  
Claes Held ◽  
Erik MG Olsson

Background Patients with symptoms of depression and/or anxiety – emotional distress – after a myocardial infarction (MI) have been shown to have worse prognosis and increased healthcare costs. However, whether specific subgroups of patients with emotional distress are more vulnerable is less well established. The purpose of this study was to identify the association between different patterns of emotional distress over time and late cardiovascular and non-cardiovascular mortality among first-MI patients aged <75 years in Sweden. Methods We utilized data on 57,602 consecutive patients with a first-time MI from the national SWEDEHEART registers. Emotional distress was assessed using the anxiety/depression dimension of the European Quality of Life Five Dimensions questionnaire 2 and 12 months after the MI, combined into persistent (emotional distress at both time-points), remittent (emotional distress at the first follow-up only), new (emotional distress at the second follow-up only) or no distress. Data on cardiovascular and non-cardiovascular mortality were obtained until the end of the study period. We used multiple imputation to create complete datasets and adjusted Cox proportional hazards models to estimate hazard ratios. Results Patients with persistent emotional distress were more likely to die from cardiovascular (hazard ratio: 1.46, 95% confidence interval: 1.16, 1.84) and non-cardiovascular causes (hazard ratio: 1.54, 95% confidence interval: 1.30, 1.82) than those with no distress. Those with remittent emotional distress were not statistically significantly more likely to die from any cause than those without emotional distress. Discussion Among patients who survive 12 months, persistent, but not remittent, emotional distress was associated with increased cardiovascular and non-cardiovascular mortality. This indicates a need to identify subgroups of individuals with emotional distress who may benefit from further assessment and specific treatment.
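The exposure in this study is a pattern built from two repeated measurements. Purely as an illustration of that coding step (hypothetical column names, not the SWEDEHEART data), the four categories could be derived like this before entering a Cox model with "none" as the reference level:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(4)
n = 1000
df = pd.DataFrame({
    "distress_2mo": rng.binomial(1, 0.30, n).astype(bool),   # emotional distress at 2 months
    "distress_12mo": rng.binomial(1, 0.25, n).astype(bool),  # emotional distress at 12 months
})

conditions = [
    df["distress_2mo"] & df["distress_12mo"],    # distress at both time points
    df["distress_2mo"] & ~df["distress_12mo"],   # first follow-up only
    ~df["distress_2mo"] & df["distress_12mo"],   # second follow-up only
]
df["distress_pattern"] = np.select(conditions, ["persistent", "remittent", "new"], default="none")
print(df["distress_pattern"].value_counts())
```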


2020 ◽  
Vol 189 (10) ◽  
pp. 1096-1113 ◽  
Author(s):  
Shawn A Zamani ◽  
Kathleen M McClain ◽  
Barry I Graubard ◽  
Linda M Liao ◽  
Christian C Abnet ◽  
...  

Abstract Recent epidemiologic studies have examined the association of fish consumption with upper gastrointestinal cancer risk, but the associations with n-3 and n-6 polyunsaturated fatty acid (PUFA) subtypes remain unclear. Using the National Institutes of Health–AARP Diet and Health Study (United States, 1995–2011), we prospectively investigated the associations of PUFA subtypes, ratios, and fish with the incidence of head and neck cancer (HNC; n = 2,453), esophageal adenocarcinoma (EA; n = 855), esophageal squamous cell carcinoma (n = 267), and gastric cancer (cardia: n = 603; noncardia: n = 631) among 468,952 participants (median follow-up, 15.5 years). A food frequency questionnaire assessed diet. Multivariable-adjusted hazard ratios were estimated using Cox proportional hazards regression. A Benjamini-Hochberg (BH) procedure was used for false-discovery control. Long-chain n-3 PUFAs were associated with an approximately 20% decreased HNC and EA risk (for HNC, quintile 5 vs. quintile 1 hazard ratio = 0.81, 95% confidence interval: 0.71, 0.92, BH-adjusted P for trend = 0.001; for EA, quintile 5 vs. quintile 1 hazard ratio = 0.79, 95% confidence interval: 0.64, 0.98, BH-adjusted P for trend = 0.1). Similar associations were observed for nonfried fish, but only at high intake. Further, the ratio of long-chain n-3 to n-6 PUFAs was associated with a decreased HNC and EA risk. No consistent associations were observed for gastric cancer. Our results indicate that dietary long-chain n-3 PUFA and nonfried fish intake are associated with lower HNC and EA risk.
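The Benjamini-Hochberg step named in the Methods controls the false-discovery rate across the many exposure-outcome tests. A small, self-contained example of how such an adjustment is applied to a set of raw trend P values (the values below are invented, not the study's) using statsmodels:

```python
import numpy as np
from statsmodels.stats.multitest import multipletests

# Hypothetical raw P-for-trend values from several PUFA-outcome Cox models.
raw_p = np.array([0.0002, 0.008, 0.03, 0.04, 0.20, 0.65])

reject, p_bh, _, _ = multipletests(raw_p, alpha=0.05, method="fdr_bh")
for p, padj, keep in zip(raw_p, p_bh, reject):
    print(f"raw p = {p:.4f}  ->  BH-adjusted p = {padj:.4f}  (significant: {keep})")
```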


2019 ◽  
Vol 14 (7) ◽  
pp. 994-1001 ◽  
Author(s):  
Eli Farhy ◽  
Clarissa Jonas Diamantidis ◽  
Rebecca M. Doerfler ◽  
Wanda J. Fink ◽  
Min Zhan ◽  
...  

Background and objectives: Poor disease recognition may jeopardize the safety of CKD care. We examined safety events and outcomes in patients with CKD piloting a medical-alert accessory intended to improve disease recognition and in an observational subcohort from the same population. Design, setting, participants, & measurements: We recruited 350 patients with stage 2–5 predialysis CKD. The first (pilot) 108 participants were given a medical-alert accessory (bracelet or necklace) indicating the diagnosis of CKD and displaying a website with safe CKD practices. The subsequent (observation) subcohort (n=242) received usual care. All participants underwent annual visits with ascertainment of patient-reported events (class 1) and actionable safety findings (class 2). Secondary outcomes included 50% GFR reduction, ESKD, and death. Cox proportional hazards models assessed the association of the medical-alert accessory with outcomes. Results: Median follow-up of the pilot and observation subcohorts was 52 (interquartile range, 44–63) and 37 (interquartile range, 27–47) months, respectively. The frequency of class 1 and class 2 safety events reported at annual visits was not different in the pilot versus observation group, with 108.7 versus 100.6 events per 100 patient-visits (P=0.13) and 38.3 versus 41.2 events per 100 patient-visits (P=0.23), respectively. The medical-alert accessory was associated with lower crude and adjusted rates of ESKD versus the observation group (hazard ratio, 0.42; 95% confidence interval, 0.20 to 0.89; and hazard ratio, 0.38; 95% confidence interval, 0.16 to 0.94, respectively). The association of the medical-alert accessory with the composite endpoint of ESKD or 50% reduction in GFR was variable over time but appeared to show an early benefit (up to 23 months) with its use. There was no significant difference in incidence of hospitalization, death, or a composite of all outcomes between medical-alert accessory users and the observational group. Conclusions: The medical-alert accessory was not associated with the incidence of safety events but was associated with a lower rate of ESKD relative to usual care.
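The Results note that the accessory's association with the composite endpoint "was variable over time", that is, the hazard ratio was not constant. One standard way to probe this is a Schoenfeld-residual test of the proportional-hazards assumption; the sketch below uses lifelines with simulated data and invented column names, not the study's dataset.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from lifelines.statistics import proportional_hazard_test

rng = np.random.default_rng(5)
n = 350
df = pd.DataFrame({
    "months": rng.uniform(1, 70, n),
    "eskd": rng.binomial(1, 0.15, n),
    "alert_accessory": rng.binomial(1, 108 / 350, n),  # pilot vs. observation split, roughly as above
    "baseline_egfr": rng.normal(45, 15, n),
})

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="eskd")
print(cph.hazard_ratios_)

# A small p-value for a covariate suggests its effect changes over follow-up
# (non-proportional hazards), consistent with a benefit concentrated early on.
ph_test = proportional_hazard_test(cph, df, time_transform="rank")
ph_test.print_summary()
```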


Neurosurgery ◽  
2017 ◽  
Vol 81 (6) ◽  
pp. 935-948 ◽  
Author(s):  
Joan Margaret O’Donnell ◽  
Michael Kerin Morgan ◽  
Gillian Z Heller

Abstract BACKGROUND The evidence for the risk of seizures following surgery for brain arteriovenous malformations (bAVM) is limited. OBJECTIVE To determine the risk of seizures after discharge from surgery for supratentorial bAVM. METHODS A prospectively collected cohort database of 559 supratentorial bAVM patients (excluding patients where surgery was not performed with the primary intention of treating the bAVM) was analyzed. Cox proportional hazards regression models (Cox regression) were generated to assess risk factors, a receiver operating characteristic (ROC) curve was generated to identify a cut-point for size, and Kaplan–Meier life table curves were created to identify the cumulative freedom from postoperative seizure. RESULTS A preoperative history of more than 2 seizures and increasing maximum diameter (size, cm) of the bAVM were found to be significantly (P < .01) associated with the development of postoperative seizures and remained significant in the Cox regression (size as continuous variable: P = .01; hazard ratio: 1.2; 95% confidence interval: 1.0-1.3; more than 2 seizures: P = .02; hazard ratio: 2.1; 95% confidence interval: 1.1-3.8). The cumulative risk of first seizure after discharge from hospital following resection surgery for all patients with bAVM was 5.8% and 18% at 12 mo and 7 yr, respectively. The 7-yr risk of developing postoperative seizures ranged from 11% for patients with bAVM ≤4 cm and with 0 to 2 preoperative seizures, to 59% for patients with bAVM >4 cm and with >2 preoperative seizures. CONCLUSION The risk of seizures after discharge from hospital following surgery for bAVM increases with the maximum diameter of the bAVM and a patient history of more than 2 preoperative seizures.
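The ROC-based cut-point for bAVM size can be reproduced in spirit with scikit-learn: score every patient by maximum diameter, trace the ROC curve against postoperative seizure status, and take the threshold maximising Youden's J. The data below are simulated, so the study's 4-cm result will not be reproduced exactly.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(6)
n = 559
size_cm = rng.gamma(shape=2.5, scale=1.3, size=n)        # hypothetical bAVM diameters
p_seizure = 1.0 / (1.0 + np.exp(2.5 - 0.4 * size_cm))    # simulated risk increasing with size
postop_seizure = rng.binomial(1, p_seizure)

fpr, tpr, thresholds = roc_curve(postop_seizure, size_cm)
youden_j = tpr - fpr
cut_point = thresholds[np.argmax(youden_j)]              # diameter maximising sensitivity + specificity - 1
print(f"AUC = {roc_auc_score(postop_seizure, size_cm):.2f}, suggested cut-point ≈ {cut_point:.1f} cm")
```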


2015 ◽  
Vol 22 (8) ◽  
pp. 1086-1093 ◽  
Author(s):  
Saeed Akhtar ◽  
Raed Alroughani ◽  
Samar F Ahmed ◽  
Jasem Y Al-Hashel

Background: The frequency of paediatric-onset multiple sclerosis (POMS) and the precise risk of secondary progression of disease are largely unknown in the Middle East. This cross-sectional cohort study assessed the risk and examined prognostic factors for time to onset of secondary progressive multiple sclerosis (SPMS) in a cohort of POMS patients. Methods: The Kuwait National MS Registry database was used to identify a cohort of POMS cases (diagnosed at age <18 years) from 1994 to 2013. Data were abstracted from patients’ records. A Cox proportional hazards model was used to evaluate the prognostic significance of the variables considered. Results: Of 808 multiple sclerosis (MS) patients, 127 (15.7%) were POMS cases. The median age at disease onset was 16.0 years (range 6.5–17.9). Of the 127 POMS cases, 20 (15.8%) developed SPMS. A multivariable Cox proportional hazards model showed that brainstem involvement at MS onset (adjusted hazard ratio 5.71; 95% confidence interval 1.53–21.30; P=0.010) and age at MS onset (adjusted hazard ratio 1.38; 95% confidence interval 1.01–1.88; P=0.042) were significantly associated with an increased risk of a secondary progressive disease course. Conclusions: This study showed that POMS patients with brainstem/cerebellar presentation and a relatively higher age at MS onset were predisposed to SPMS and warrant an aggressive therapeutic approach.


2020 ◽  
Vol 189 (10) ◽  
pp. 1163-1172
Author(s):  
Tracy A Becerra-Culqui ◽  
Darios Getahun ◽  
Vicki Chiu ◽  
Lina S Sy ◽  
Hung Fu Tseng

Abstract As prenatal vaccinations become more prevalent, it is important to assess potential safety events. In a retrospective cohort study of Kaiser Permanente Southern California (Pasadena, California) mother-child pairs with birth dates during January 1, 2011–December 31, 2014, we investigated the association between prenatal tetanus, diphtheria, and acellular pertussis (Tdap) vaccination and risk of attention-deficit/hyperactivity disorder (ADHD) in offspring. Information on Tdap vaccination during pregnancy was obtained from electronic medical records. ADHD was defined by International Classification of Diseases codes (Ninth or Tenth Revision) and dispensed ADHD medication after age 3 years. Children were followed to the date of their first ADHD diagnosis, the end of Kaiser Permanente membership, or the end of follow-up (December 31, 2018), whichever came first. In Cox proportional hazards models, we estimated unadjusted and adjusted hazard ratios for the association between maternal Tdap vaccination and ADHD, with inverse probability of treatment weighting (IPTW) used to adjust for confounding. Of 128,756 eligible mother-child pairs, 85,607 were included in the final sample. The ADHD incidence rate was 3.41 per 1,000 person-years among children of Tdap-vaccinated women and 3.93 per 1,000 person-years among children of unvaccinated women (hazard ratio = 1.01, 95% confidence interval: 0.88, 1.16). The IPTW-adjusted analyses showed no association between prenatal Tdap vaccination and ADHD in offspring (hazard ratio = 1.00, 95% confidence interval: 0.88, 1.14). In this study, prenatal Tdap vaccination was not associated with ADHD risk in offspring, supporting recommendations to vaccinate pregnant women.
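Inverse probability of treatment weighting (IPTW), as used here, has three steps: estimate the propensity of vaccination from measured confounders, convert it to (stabilised) weights, and fit a weighted Cox model with a robust variance. A minimal sketch with scikit-learn and lifelines, on synthetic data with invented confounders rather than the Kaiser Permanente variables:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
n = 20_000
df = pd.DataFrame({
    "maternal_age": rng.normal(31, 5, n),
    "parity": rng.integers(1, 4, n),
    "tdap": rng.binomial(1, 0.5, n),
    "years": rng.uniform(0.1, 7.9, n),   # child follow-up time after age 3
    "adhd": rng.binomial(1, 0.02, n),
})

# 1) Propensity score: probability of prenatal Tdap given measured confounders.
confounders = ["maternal_age", "parity"]
ps = LogisticRegression(max_iter=1000).fit(df[confounders], df["tdap"]).predict_proba(df[confounders])[:, 1]

# 2) Stabilised inverse-probability-of-treatment weights.
p_tdap = df["tdap"].mean()
df["iptw"] = np.where(df["tdap"] == 1, p_tdap / ps, (1 - p_tdap) / (1 - ps))

# 3) Cox model weighted by IPTW, with a robust (sandwich) variance estimator.
cph = CoxPHFitter()
cph.fit(df[["years", "adhd", "tdap", "iptw"]], duration_col="years",
        event_col="adhd", weights_col="iptw", robust=True)
print("IPTW-adjusted HR for Tdap:", round(cph.hazard_ratios_["tdap"], 2))
```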

