MO495 UNDERSTANDING INTERNATIONAL VARIATION IN KIDNEY FAILURE INCIDENCE: IMPACT OF DISPARITIES IN RAAS INHIBITOR PRESCRIPTION AND BLOOD PRESSURE CONTROL

2021 ◽  
Vol 36 (Supplement_1) ◽  
Author(s):  
Natalia Alencar de Pinho ◽  
Roberto Pecoits-Filho ◽  
Brian Bieber ◽  
Daniel Muenz ◽  
Antonio Lopes ◽  
...  

Abstract Background and Aims Blood pressure (BP) control and renin-angiotensin-aldosterone system (RAAS) blockade are key measures to slow CKD progression, and achievement of targets for these measures varies greatly across countries. We sought to evaluate to what extent this might explain international variation in kidney failure incidence. Method We used data from the CKD Outcomes and Practice Patterns Study (CKDopps), a cohort study of adult patients recruited from national samples of nephrology clinics. Patients with CKD G3 or G4 from Brazil (n=498), France (n=2702), Germany (n=2314), and the US (n=905) were included. Patients with neither hypertension nor albuminuria were excluded (n=103). We assessed systolic BP and RAAS inhibitor prescription at baseline, and their association with time to kidney failure, defined as an estimated glomerular filtration rate (eGFR) <15 ml/min/1.73m² or kidney replacement therapy initiation. Death was treated as a competing event. A Cox proportional hazards model was used to estimate cause-specific hazard ratios (cs-HR) and 95% confidence intervals (CI) for kidney failure according to country, before and after adjusting for systolic BP and RAAS inhibitor prescription, as well as demographics and known risk factors for CKD progression. Results Median age (years) ranged from 67 in Brazil to 75 in Germany, and mean baseline eGFR (ml/min/1.73m²) from 27 in Germany to 33 in France. Prevalence of diabetes ranged from 20% in France to 36% in Brazil, and that of stage A3 albuminuria (>300 mg/g) from 31% in Brazil to 44% in the US. Mean systolic BP (mm Hg) ranged from 132 in Brazil to 143 in France, and the percentage of patients prescribed a RAAS inhibitor from 58% in the US to 81% in Germany. After a median follow-up of 4.0 (2.6-5.0) years, 1897 participants progressed to kidney failure and 522 died before meeting this outcome. 
Two-year crude cumulative incidence of kidney failure was the lowest in France (14%), where patients were recruited at an earlier CKD stage, and similar across Germany (25%), the US (26%), and Brazil (27%); that for all-cause death, the lowest in Brazil (2.5%), followed by France (3.4%), the US (4.4%), and Germany (4.6%). Sequential adjustment for demographics and progression risk factors, in particular baseline eGFR and albuminuria, significantly reduced the gap between France and the other countries (Figure). Despite the associations of systolic BP (cs-HR 1.14, 95%CI 0.95-1.38 for 120-129; 1.18, 95%CI 0.95-1.46 for 130-139; and 1.46, 95%CI 1.23-1.74 for ≥140 versus <120 mm Hg) and RAAS inhibitor prescription (cs-HR 0.81, 95%CI 0.70-0.95 at 6 months of follow-up) with kidney failure, adjustment for these two treatment targets only marginally changed comparisons across studied countries. Conclusion In CKD patients under nephrology care, BP control and RAAS inhibitor prescription were associated with lower risk of kidney failure and substantially varied across countries. Despite this variation in practice, BP control and RAAS inhibitor prescription appear to explain little of the differences in risk of kidney failure by country.
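The competing-risks setup in this abstract (kidney failure as the event of interest, death as a competing event, crude cumulative incidence per country) can be sketched with a minimal Aalen-Johansen cumulative-incidence estimator. This is an illustrative toy implementation on invented data, not the authors' analysis code; real analyses would use a survival package such as lifelines or R's cmprsk.

```python
def cumulative_incidence(data, cause=1):
    """Aalen-Johansen cumulative incidence for `cause`, treating
    other nonzero event codes as competing events.
    data: list of (time, event) pairs, event 0 = censored."""
    event_times = sorted({t for t, e in data if e != 0})
    surv = 1.0   # overall event-free survival just before t
    cif = 0.0
    curve = []
    for t in event_times:
        at_risk = sum(1 for ti, _ in data if ti >= t)
        d_cause = sum(1 for ti, ei in data if ti == t and ei == cause)
        d_any = sum(1 for ti, ei in data if ti == t and ei != 0)
        cif += surv * d_cause / at_risk     # uses S(t-), before updating
        surv *= 1 - d_any / at_risk
        curve.append((t, cif))
    return curve

# (time, event): 0 = censored, 1 = kidney failure, 2 = death (made-up data)
demo = [(1, 1), (2, 2), (3, 1), (4, 0)]
print(cumulative_incidence(demo, cause=1))  # [(1, 0.25), (2, 0.25), (3, 0.5)]
```

Note that the death at time 2 lowers the event-free survival without adding to the kidney-failure incidence, which is exactly why naive 1-minus-Kaplan-Meier overestimates incidence in the presence of competing death.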

Circulation ◽  
2018 ◽  
Vol 137 (suppl_1) ◽  
Author(s):  
Jennifer C D'Souza ◽  
Jennifer Weuve ◽  
Robert D Brook ◽  
Denis A Evans ◽  
Joel D Kaufman ◽  
...  

Objectives: Over half the US population experiences noise levels above WHO recommendations, yet little research within the US has examined the health effects of these exposures. Our objective is to investigate the associations between community noise and blood pressure in residents of Chicago. Methods: Participants were from two prospective cohort studies: the Multi-Ethnic Study of Atherosclerosis (MESA) and the Chicago Health and Aging Project (CHAP). MESA is a multi-site study of persons aged 45-84 years and free of clinical cardiovascular disease. CHAP is an open cohort initiated to study chronic conditions of aging among persons aged ≥65 years. This analysis focuses on the 5,167 participants of these cohorts living in Chicago, with an average of 2.5 (CHAP) and 4.5 (MESA) assessments per participant for systolic (SBP) and diastolic (DBP) blood pressure between 1999-2011. In both cohorts, hypertension was defined as taking antihypertensive medication, SBP ≥140, or DBP ≥90 mmHg. We estimated noise at participant addresses using land use regression models weighted according to participants' 5-year residential history before each exam. Among those taking antihypertensive medication, blood pressure was adjusted using multiple imputation. Associations between noise and blood pressure were estimated using linear mixed models. A Cox proportional hazards model was used to estimate the relative risk (RR) of incident hypertension. All models included calendar time, age, sex, race, income, education, neighborhood socioeconomic score, smoking, cohort, interactions between cohort and age, race, and gender, and NOx (a traffic-related air pollutant). Findings: At baseline, MESA participants were younger (63 vs 73 years) and more educated (36 vs 3% with ≥graduate degree) than CHAP participants. MESA participants had higher noise levels (60 vs 56 dB) and lower blood pressures (e.g. SBP: 124 vs 135 mmHg) than CHAP participants. 
After adjusting for cohort and other confounders, we found that 10 dB higher residential noise levels were associated with 0.9 (95% CI: -0.2, 0.2; p=0.1) and 0.5 mmHg greater (95% CI: -0.1, 0.11; p=0.08) SBP and DBP, respectively. Similar associations were found within each cohort. Noise was not associated with incident hypertension overall (RR: 1.00; 95% CI: 0.8, 1.3; p=0.98) or within either cohort. Conclusions: We found a suggestive association between noise and blood pressure levels, but no association with hypertension. This could be due to the lack of nighttime noise information, which has been shown to be more strongly associated with blood pressure outcomes than daytime levels, or to the selection of healthy older participants.


2019 ◽  
Vol 104 (1) ◽  
pp. 81-86 ◽  
Author(s):  
Sung Uk Baek ◽  
Ahnul Ha ◽  
Dai Woo Kim ◽  
Jin Wook Jeoung ◽  
Ki Ho Park ◽  
...  

Background/Aims To investigate the risk factors for disease progression of normal-tension glaucoma (NTG) with pretreatment intraocular pressure (IOP) in the low teens. Methods One hundred and two (102) eyes of 102 patients with NTG with pretreatment IOP ≤12 mm Hg who had been followed up for more than 60 months were retrospectively enrolled. Patients were divided into progressor and non-progressor groups according to visual field (VF) progression as correlated with change of optic disc or retinal nerve fibre layer defect. Baseline demographic and clinical characteristics, including diurnal IOP and 24-hour blood pressure (BP), were compared between the two groups. The Cox proportional hazards model was used to identify the risk factors for disease progression. Results Thirty-six patients (35.3%) were classified as progressors and 66 (64.7%) as non-progressors. Between the two groups, no significant differences were found in the follow-up periods (8.7±3.4 vs 7.7±3.2 years; p=0.138), baseline VF mean deviation (−4.50±5.65 vs −3.56±4.30 dB; p=0.348) or pretreatment IOP (11.34±1.21 vs 11.17±1.06 mm Hg; p=0.121). The multivariate Cox proportional hazards model indicated that greater diurnal IOP at baseline (HR=1.609; p=0.004), greater fluctuation of diastolic BP (DBP; HR=1.058; p=0.002) and presence of optic disc haemorrhage (DH) during follow-up (HR=3.664; p=0.001) were risk factors for glaucoma progression. Conclusion In the low-teens NTG eyes, 35.3% showed glaucoma progression during the average 8.7 years of follow-up. Fluctuation of DBP and diurnal IOP, as well as DH, were significantly associated with greater probability of disease progression.
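A quick arithmetic aside (not from the paper): under the proportional-hazards assumption, a per-unit hazard ratio compounds multiplicatively across units, so a seemingly modest HR such as 1.058 per unit of DBP fluctuation implies a much larger contrast across a 10-unit difference.

```python
# Under proportional hazards, the HR for a k-unit covariate
# difference is HR_per_unit ** k (multiplicative compounding).
hr_per_unit = 1.058            # per-unit DBP-fluctuation HR from the abstract
hr_10_units = hr_per_unit ** 10
print(round(hr_10_units, 2))   # about 1.76
```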


Author(s):  
Jiwei Bai ◽  
Mingxuan Li ◽  
Jianxin Shi ◽  
Liwei Jing ◽  
Yixuan Zhai ◽  
...  

Abstract Objective Skull base chordoma (SBC) is rare and one of the most challenging diseases to treat. We aimed to assess the optimal timing of adjuvant radiation therapy (RT) and to evaluate the factors that influence resection and long-term outcomes. Methods In total, 284 patients with 382 surgeries were enrolled in this retrospective study. Postsurgically, 64 patients underwent RT before recurrence (pre-recurrence RT), and 47 patients underwent RT after recurrence. During the first attempt to achieve gross-total resection (GTR), when the entire tumor was resected, 268 patients were treated with an endoscopic midline approach, and 16 patients were treated with microscopic lateral approaches. Factors associated with the success of GTR were identified using χ2 and logistic regression analyses. Risk factors associated with chordoma-specific survival (CSS) and progression-free survival (PFS) were evaluated with the Cox proportional hazards model. Results In total, 74.6% of tumors were marginally resected [GTR (40.1%), near-total resection (34.5%)]. History of surgery, large tumor volumes, and tumor locations in the lower clivus were associated with a lower GTR rate. The mean follow-up period was 43.9 months. At the last follow-up, 181 (63.7%) patients were alive. RT history, histologic subtype (dedifferentiated and sarcomatoid), non-GTR, no postsurgical RT, and the presence of metastasis were associated with poorer CSS. Patients with pre-recurrence RT had the longest PFS and CSS, while patients without postsurgical RT had the worst outcome. Conclusion GTR is the goal of initial surgical treatment. Pre-recurrence RT would improve outcome regardless of GTR.


Circulation ◽  
2020 ◽  
Vol 142 (Suppl_3) ◽  
Author(s):  
Hirofumi SOEJIMA ◽  
Takeshi Morimoto ◽  
Sadanori Okada ◽  
Chisa Matsumoto ◽  
Masafumi Nakayama ◽  
...  

Introduction: Blood pressure (BP) is a significant predictor of chronic kidney disease (CKD). Hypothesis: We sought to evaluate whether the progression of CKD in diabetic patients without a history of atherosclerotic events is dependent on BP control. Methods: The Japanese primary prevention of atherosclerosis with aspirin for diabetes (JPAD) trial was a multicenter, prospective, randomized, open-label, blinded end-point study done from 2002 to 2008. After completion of the JPAD trial, we followed up the patients until 2019. We defined late-stage kidney disease (LSKD) as an estimated glomerular filtration rate (eGFR) <30 ml/min/1.73m² or hemodialysis. Among 2,536 JPAD patients, 27 patients were excluded for eGFR <30 ml/min/1.73m² at registration. BP of the JPAD patients was recorded on average 8 times. Based on the mean value of systolic BP (SBP), we divided the patients into three groups: a High BP Group (n=607, SBP ≥140 mm Hg); a Moderate BP Group (n=989, 140 > SBP ≥130 mm Hg); and a Low BP Group (n=913, SBP <130 mm Hg). We compared the incidence of LSKD among the three groups. Results: The mean eGFR (ml/min/1.73m²) at registration was 75.1 in the High BP Group, 72.6 in the Moderate BP Group, and 75.7 in the Low BP Group. During an 11.2-year follow-up, the incidence of LSKD was significantly higher in the High BP and Moderate BP Groups than in the Low BP Group (P<0.0018, Figure). Cox proportional hazards model analysis revealed that High BP (HR, 1.57, P=0.049) and Moderate BP (HR, 1.52, P=0.037) were independent factors after adjustment for proteinuria (≥±), age ≥65 years, male sex, body mass index ≥24 kg/m², duration of diabetes ≥7.0 years, statin usage, aspirin usage, eGFR ≥60 ml/min/1.73m², and hemoglobin A1c ≥7.2% (Figure). Conclusions: Our study demonstrated that SBP was independently associated with progression to LSKD in diabetic patients without a history of atherosclerotic events. 
SBP less than 130 mm Hg is recommended for diabetic patients to prevent progression to LSKD.


Author(s):  
Marlise P. dos Santos ◽  
Armin Sabri ◽  
Dar Dowlatshahi ◽  
Ali Muraback Bakkai ◽  
Abed Elallegy ◽  
...  

Abstract Background: Recurrence after intracranial aneurysm coiling is a highly prevalent outcome that is yet to be fully understood. We investigated clinical, radiological and procedural factors associated with major recurrence of coiled intracranial aneurysms. Methods: We retrospectively analyzed prospectively collected coiling data (2003-12). We recorded characteristics of aneurysms, patients and interventional techniques, as well as pre-discharge and angiographic follow-up occlusion. The Raymond-Roy classification was used; major recurrence was a change from class I or II to class III, an increase in a class III remnant, or any recurrence requiring any type of retreatment. Identification of risk factors associated with major recurrence used a univariate Cox proportional hazards model followed by multivariate regression analysis of covariates with P<0.1. Results: A total of 467 aneurysms were treated in 435 patients: 283 (65%) harboring acutely ruptured aneurysms; 44 (10.1%) patients died before discharge and 33 (7.6%) were lost to follow-up. A total of 1367 angiographic follow-up studies (range: 1-108 months; median [interquartile range (IQR)]: 37 [14-62]) were performed in 384 (82.2%) aneurysms. Major recurrence occurred in 98 (21%) aneurysms after a median of 6 (IQR 3.5-22.5) months. Multivariate analysis (358 patients with 384 aneurysms) revealed the following risk factors for major recurrence: age >65 y (hazard ratio (HR): 1.61; P=0.04), male sex (HR: 2.13; P<0.01), hypercholesterolemia (HR: 1.65; P=0.03), neck size ≥4 mm (HR: 1.79; P=0.01), dome size ≥7 mm (HR: 2.44; P<0.01), non-stent-assisted coiling (HR: 2.87; P=0.01), and baseline class III (HR: 2.18; P<0.01). Conclusion: Approximately one fifth of the intracranial aneurysms resulted in major recurrence. Modifiable factors for major recurrence were the choice of stent-assisted technique and confirmation of adequate baseline occlusion (class I/II) in the first coiling procedure.


1983 ◽  
Vol 3 (3_suppl) ◽  
pp. 14-17 ◽  
Author(s):  
Paul N. Corey ◽  
Cathy Steele

The Cox proportional hazards model was used to identify prognostic risk factors for time to first infection and time to failure among 183 patients on chronic ambulatory peritoneal dialysis (CAPD). This methodology permits continuous variables such as albumin and blood pressure to be used in the predictive equation, avoiding arbitrary categorization. Initial serum creatinine and albumin were found to be related to the risk of first infection: serum creatinine increases the risk, whereas albumin is protective. Age and blood pressure are related to an increased risk of failure on CAPD, whereas albumin is associated with a lower risk. The occurrence of the first infection almost doubles the risk of failure. Patients who have "high" albumin and "low" blood pressure have a 75th percentile for time to failure on CAPD which is more than 1000 days longer than those who have both "low" albumin and "high" blood pressure.


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Maryam Farhadian ◽  
Sahar Dehdar Karsidani ◽  
Azadeh Mozayanimonfared ◽  
Hossein Mahjub

Abstract Background Due to the limited number of studies with long-term follow-up of patients undergoing Percutaneous Coronary Intervention (PCI), we investigated the occurrence of Major Adverse Cardiac and Cerebrovascular Events (MACCE) during 10 years of follow-up after coronary angioplasty using Random Survival Forest (RSF) and Cox proportional hazards models. Methods The current retrospective cohort study was performed on 220 patients (69 women and 151 men) undergoing coronary angioplasty from March 2009 to March 2012 in Farchshian Medical Center in Hamadan, Iran. Survival time (months) as the response variable was considered from the date of angioplasty to the main endpoint or the end of the follow-up period (September 2019). To identify the factors influencing the occurrence of MACCE, the performance of the Cox and RSF models was investigated in terms of C index, Integrated Brier Score (IBS) and prediction error criteria. Results Ninety-six patients (43.7%) experienced MACCE by the end of the follow-up period, and the median survival time was estimated to be 98 months. Survival decreased from 99% during the first year to 39% at 10 years' follow-up. By applying the Cox model, the predictors were identified as follows: age (HR = 1.03, 95% CI 1.01–1.05), diabetes (HR = 2.17, 95% CI 1.29–3.66), smoking (HR = 2.41, 95% CI 1.46–3.98), and stent length (HR = 1.74, 95% CI 1.11–2.75). The predictive performance was slightly better for the RSF model (IBS of 0.124 vs. 0.135, C index of 0.648 vs. 0.626 and out-of-bag error rate of 0.352 vs. 0.374 for RSF). In addition to age, diabetes, smoking, and stent length, RSF also included coronary artery disease (acute or chronic) and hyperlipidemia as the most important variables. Conclusion Machine-learning prediction models such as RSF showed better performance than the Cox proportional hazards model for the prediction of MACCE during long-term follow-up after PCI.
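The C index used above to compare the Cox and RSF models can be illustrated with a bare-bones Harrell's concordance computation. This is a simplified sketch on toy data (it only skips pairs whose earlier time is censored); packaged implementations such as scikit-survival's `concordance_index_censored` handle tied times and censoring more carefully.

```python
def c_index(times, events, risks):
    """Harrell's concordance index on right-censored data.
    A pair (i, j) is usable when the earlier time belongs to an
    observed event; it is concordant when the higher predicted
    risk goes with the earlier event. Risk ties count as 0.5."""
    conc = ties = usable = 0
    for i in range(len(times)):
        for j in range(len(times)):
            if events[i] == 1 and times[i] < times[j]:
                usable += 1
                if risks[i] > risks[j]:
                    conc += 1
                elif risks[i] == risks[j]:
                    ties += 1
    return (conc + 0.5 * ties) / usable

# Perfectly ranked toy predictions: earliest event has highest risk.
print(c_index([1, 2, 3, 4], [1, 1, 1, 1], [0.9, 0.7, 0.4, 0.1]))  # 1.0
```

A C index of 0.5 corresponds to random ranking, which puts the abstract's 0.626 vs. 0.648 contrast in context: both models rank patients only moderately better than chance.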


Author(s):  
Yuko Yamaguchi ◽  
Marta Zampino ◽  
Toshiko Tanaka ◽  
Stefania Bandinelli ◽  
Yusuke Osawa ◽  
...  

Abstract Background Anemia is common in older adults and associated with greater morbidity and mortality. The causes of anemia in older adults have not been completely characterized. Although elevated circulating growth and differentiation factor 15 (GDF-15) has been associated with anemia in older adults, it is not known whether elevated GDF-15 predicts the development of anemia. Methods We examined the relationship between plasma GDF-15 concentrations at baseline in 708 non-anemic adults, aged 60 years and older, with incident anemia during 15 years of follow-up among participants in the Invecchiare in Chianti (InCHIANTI) Study. Results During follow-up, 179 (25.3%) participants developed anemia. The proportion of participants who developed anemia from the lowest to highest quartile of plasma GDF-15 was 12.9%, 20.1%, 21.2%, and 45.8%, respectively. Adults in the highest quartile of plasma GDF-15 had an increased risk of developing anemia (Hazard Ratio 1.15, 95% Confidence Interval 1.09, 1.21, P<.0001) compared to those in the lower three quartiles in a multivariable Cox proportional hazards model adjusting for age, sex, serum iron, soluble transferrin receptor, ferritin, vitamin B12, congestive heart failure, diabetes mellitus, and cancer. Conclusions Circulating GDF-15 is an independent predictor for the development of anemia in older adults.
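The exposure contrast above (top GDF-15 quartile vs. the lower three) can be sketched with a small quartile-assignment helper. This is an illustrative sketch on invented values, not the study's code, and it uses simple order-statistic cut points rather than interpolated percentiles.

```python
def quartile(values):
    """Assign each value a quartile index 0-3 using rank cut
    points at the 25th, 50th and 75th percentiles (simple
    order-statistic cuts; no interpolation)."""
    s = sorted(values)
    n = len(s)
    cuts = [s[n // 4], s[n // 2], s[3 * n // 4]]
    return [sum(v >= c for c in cuts) for v in values]

gdf15 = [120, 340, 210, 980, 450, 160, 700, 290]  # invented plasma values
q = quartile(gdf15)
top_vs_rest = [int(qi == 3) for qi in q]          # 1 = highest quartile
```

The binary `top_vs_rest` indicator is the kind of covariate the abstract's Cox model contrasts against the pooled lower three quartiles.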


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
A Fujino ◽  
H Ogawa ◽  
S Ikeda ◽  
K Doi ◽  
Y Hamatani ◽  
...  

Abstract Background Atrial fibrillation (AF) commonly progresses from paroxysmal type to sustained type in the natural course of the disease, and we previously demonstrated that the progression of AF was associated with increased risk of clinical adverse events. There are some patients, though less frequently, who regress from sustained to paroxysmal AF, but the clinical impact of the regression of AF remains unknown. Purpose We sought to investigate whether regression from sustained to paroxysmal AF is associated with better clinical outcomes. Methods Using the dataset of the Fushimi AF Registry, patients who were diagnosed with sustained (persistent or permanent) AF at baseline were studied. Conversion of sustained AF to paroxysmal AF during follow-up was defined as regression of AF. Major adverse cardiac events (MACE) were defined as the composite of cardiac death, stroke, and hospitalization for heart failure (HF). Event rates were compared between the patients with and without regression of AF. In patients with sustained AF at baseline, predictors of MACE were identified using a Cox proportional hazards model. Results Among 2,253 patients who were diagnosed with sustained AF at baseline, regression of AF was observed in 9.0% (202/2,253, 2.0 per 100 patient-years) during a median follow-up of 4.0 years. Of these, 24.3% (49/202, 4.6 per 100 patient-years) eventually reverted to sustained AF during follow-up. The proportion of asymptomatic patients was lower in patients with regression of AF than in those without (with vs without regression; 49.0% vs 69.5%, p<0.01). The percentage of beta-blocker use at baseline was similar between the two groups (37.2% vs 33.8%, p=0.34). The proportion of patients who underwent catheter ablation or electrical cardioversion during follow-up was higher in patients with regression of AF (catheter ablation: 15.8% vs 5.5%; p<0.01, cardioversion: 4.0% vs 1.4%; p<0.01, respectively). 
The rate of MACE was significantly lower in patients with regression of AF as compared with patients who maintained sustained AF (3.7 vs 6.2 per 100 patient-years, log-rank p<0.01). Figure shows the Kaplan-Meier curves for MACE, cardiac death, hospitalization for heart failure, and stroke. In patients with sustained AF at baseline, a multivariable Cox proportional hazards model demonstrated that regression of AF was an independent predictor of a lower risk of MACE (adjusted hazard ratio [HR]: 0.50, 95% confidence interval [CI]: 0.28 to 0.88, p=0.02), stroke (HR: 0.51, 95% CI: 0.30 to 0.88, p=0.02), and hospitalization for HF (HR: 0.50, 95% CI: 0.29 to 0.85, p=0.01). Conclusion Regression from sustained to paroxysmal AF was associated with a lower incidence of adverse cardiac events. Funding Acknowledgement Type of funding source: None
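The Kaplan-Meier curves referenced in this abstract can be sketched with a minimal estimator. This is a toy implementation on invented (time, event) data for illustration only; registry analyses would use a survival package.

```python
def kaplan_meier(data):
    """Kaplan-Meier survival estimates at each distinct event time.
    data: list of (time, event) pairs, event 1 = event, 0 = censored."""
    event_times = sorted({t for t, e in data if e == 1})
    surv, curve = 1.0, []
    for t in event_times:
        at_risk = sum(1 for ti, _ in data if ti >= t)
        d = sum(1 for ti, ei in data if ti == t and ei == 1)
        surv *= 1 - d / at_risk        # multiply in this time point's factor
        curve.append((t, surv))
    return curve

# Invented follow-up data: events at t=1 and t=3, censoring at t=2 and t=4.
print(kaplan_meier([(1, 1), (2, 0), (3, 1), (4, 0)]))  # [(1, 0.75), (3, 0.375)]
```

Computing one curve per group (regression vs. no regression) and comparing them is what the log-rank test in the abstract formalizes.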

