Optimal surgeon and hospital volume thresholds to reduce mortality and length of stay for CABG

PLoS ONE ◽  
2021 ◽  
Vol 16 (4) ◽  
pp. e0249750
Author(s):  
Ying-Yi Chou ◽  
Juey-Jen Hwang ◽  
Yu-Chi Tung

Objective We used nationwide population-based data to identify optimal hospital and surgeon volume thresholds and to assess the effects of these thresholds on operative mortality and length of stay (LOS) for coronary artery bypass surgery (CABG). Design Retrospective cohort study. Setting General acute care hospitals throughout Taiwan. Participants A total of 12,892 CABG patients admitted between 2011 and 2015 were extracted from Taiwan National Health Insurance claims data. Main Outcome Measures Operative mortality and LOS. Restricted cubic splines were applied to identify the hospital and surgeon volume thresholds needed to reduce operative mortality. Generalized estimating equation regression modeling, Cox proportional-hazards modeling, and instrumental variables analysis were employed to examine the effects of the hospital and surgeon volume thresholds on operative mortality and LOS. Results The volume thresholds for hospitals and surgeons were 55 and 5 cases per year, respectively. Patients who underwent CABG at hospitals that did not reach the volume threshold had higher operative mortality than those treated at hospitals that did. Patients whose surgeons did not reach the volume threshold had higher operative mortality and longer LOS than those whose surgeons did. Conclusions This is the first study to identify optimal hospital and surgeon volume thresholds for reducing operative mortality and LOS. The findings support policies regionalizing CABG at high-volume hospitals, and the thresholds themselves could help patients, providers, and policymakers pursue optimal care.
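The threshold search above rests on restricted cubic splines, which let the volume-mortality curve bend between knots while staying linear beyond the outer ones. A minimal sketch of a Harrell-style spline basis; the function name and knot values are illustrative, not the study's:

```python
def rcs_basis(x, knots):
    """Restricted cubic spline basis (Harrell parameterization).
    Returns [x, C_1(x), ..., C_{k-2}(x)] for k knots; the fitted
    curve is cubic between knots and linear beyond the outer ones."""
    t = sorted(knots)
    pos3 = lambda u: max(u, 0.0) ** 3  # truncated cubic (x - t)_+^3
    denom = t[-1] - t[-2]
    cols = [x]
    for j in range(len(t) - 2):
        # the last two truncated terms cancel the cubic and quadratic
        # parts beyond the final knot, enforcing linearity in the tail
        cols.append(pos3(x - t[j])
                    - pos3(x - t[-2]) * (t[-1] - t[j]) / denom
                    + pos3(x - t[-1]) * (t[-2] - t[j]) / denom)
    return cols
```

Feeding these basis columns into a logistic or Cox model for mortality, then reading off where the fitted curve levels out, is one common way such volume thresholds are located.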

2016 ◽  
Vol 34 (4_suppl) ◽  
pp. 191-191 ◽  
Author(s):  
Margaret T Mandelson ◽  
Vincent J. Picozzi

Background: Surgical outcomes for resected PC are known to be superior at HVCs. However, the impact of adjuvant therapy (Rx) performed at HVCs is less studied. We examined the impact of site of adjuvant Rx administration on our resected patients (pts). Methods: Eligible pts were diagnosed 2003-2014 and resected at HVC. Pts were excluded for neoadjuvant Rx, synchronous cancer, death/loss to follow-up within 3 months, or contraindications (e.g. morbidity) to adjuvant Rx. Pts were also excluded if they refused adjuvant treatment or if a community oncologist (CC) was not identified in the medical record or in the western Washington population-based cancer registry. Pt and tumor characteristics were compared in univariate analysis, and survival was calculated from date of diagnosis to death or last follow-up. Five-year OS was estimated by the Kaplan-Meier method and compared using Cox proportional hazards modeling to evaluate the impact of HVC adjuvant Rx on OS while adjusting for potential confounding factors. Results: 245 pts were eligible for study: 139 (57%) treated at HVC, 106 (43%) treated at CC. HVC and CC pts were similar with respect to stage, tumor size, nodal status, resection margins and average distance travelled to HVC. They differed by age (HVC: 63.1, CC: 68.2; p < 0.01). Median and 5-yr OS were 36 mos and 33%. Median OS for HVC vs CC was 44 mos vs. 28 mos (p < 0.01), and 5-yr OS was 38.6% vs. 24.8% (p < 0.01); adjustment for age did not alter our findings. Conclusions: 1) With respect to adjuvant Rx for resected PC, HVC and CC pts differed with respect to age only. 2) Both median and 5-yr OS were statistically superior at HVC vs CC. 3) Our study supports the use of HVCs for all Rx components for PC treated with curative intent. 4) Ongoing investigation of patterns of care and their impact on OS in PC is warranted.
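The 5-yr OS figures above come from the Kaplan-Meier product-limit estimator, which drops the risk set as patients die or are censored. A minimal pure-Python sketch (toy data, not the study's):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier product-limit estimator.
    times:  follow-up in months; events: 1 = death, 0 = censored.
    Returns the survival curve as (time, S(t)) steps at event times."""
    data = sorted(zip(times, events))
    n_at_risk, surv, curve, i = len(data), 1.0, [], 0
    while i < len(data):
        t, deaths, j = data[i][0], 0, i
        while j < len(data) and data[j][0] == t:
            deaths += data[j][1]   # count deaths tied at time t
            j += 1
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= j - i         # remove deaths and censorings from risk set
        i = j
    return curve
```

Reading S(t) at t = 60 months from such a curve yields the 5-year OS estimate; group differences are then tested with Cox models as the abstract describes.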


Nutrients ◽  
2021 ◽  
Vol 13 (3) ◽  
pp. 1034
Author(s):  
Vincenza Gianfredi ◽  
Annemarie Koster ◽  
Anna Odone ◽  
Andrea Amerio ◽  
Carlo Signorelli ◽  
...  

Our aim was to assess the association between a priori defined dietary patterns and incident depressive symptoms. We used data from The Maastricht Study, a population-based cohort study (n = 2646, mean (SD) age 59.9 (8.0) years, 49.5% women; 15,188 person-years of follow-up). Levels of adherence to the Dutch Healthy Diet (DHD), Mediterranean Diet, and Dietary Approaches To Stop Hypertension (DASH) diet were derived from a validated Food Frequency Questionnaire. Depressive symptoms were assessed at baseline and annually over the seven-year follow-up (using the 9-item Patient Health Questionnaire). We used Cox proportional hazards regression analyses to assess the association between dietary patterns and depressive symptoms. One standard deviation (SD) higher adherence to the DHD and DASH was associated with a lower hazard ratio (HR) of depressive symptoms, with HRs (95%CI) of 0.78 (0.69–0.89) and 0.87 (0.77–0.98), respectively, after adjustment for sociodemographic and cardiovascular risk factors. After further adjustment for lifestyle factors, the HR per one SD higher DHD adherence was 0.83 (0.73–0.96), whereas adherence to the Mediterranean and DASH diets was not associated with incident depressive symptoms. Higher adherence to the DHD lowered the risk of incident depressive symptoms, and adherence to a healthy diet could be an effective non-pharmacological preventive measure to reduce the incidence of depression.
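The hazard ratios above are reported per one-SD-higher adherence: a Cox coefficient estimated on the raw diet score converts to that scale as HR = exp(beta × SD). A small sketch of the conversion (function names are ours, not the study's):

```python
import math

def standardize(scores):
    """z-score diet-adherence scores; returns (z_scores, population SD)."""
    n = len(scores)
    mean = sum(scores) / n
    sd = (sum((s - mean) ** 2 for s in scores) / n) ** 0.5
    return [(s - mean) / sd for s in scores], sd

def hr_per_sd(beta_per_unit, sd):
    """Convert a Cox log-hazard coefficient per raw score unit into a
    hazard ratio per one-SD-higher score: HR = exp(beta * SD)."""
    return math.exp(beta_per_unit * sd)
```

Equivalently, fitting the Cox model on the z-scored exposure makes exp(beta) directly the per-SD HR; the two routes give the same number.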


2021 ◽  
Vol 19 (1) ◽  
Author(s):  
Lindsay Hedden ◽  
Megan A. Ahuja ◽  
M. Ruth Lavergne ◽  
Kimberlyn M. McGrail ◽  
Michael R. Law ◽  
...  

Abstract Background The retirement of a family physician can represent a challenge in accessibility and continuity of care for patients. In this population-based, longitudinal cohort study, we assess whether and how long it takes for patients to find a new majority source of primary care (MSOC) when theirs retires, and we investigate the effect of demographic and clinical characteristics on this process. Methods We used provincial health insurance records to identify the complete cohort of patients whose majority source of care left clinical practice in either 2007/2008 or 2008/2009 and then calculated the number of days between their last visit with their original MSOC and their first visit with their new one. We compared the clinical and sociodemographic characteristics of patients who did and did not find a new MSOC in the three years following their original physician's retirement using chi-square and Fisher's exact tests. We also used Cox proportional hazards models to determine the adjusted association between patient age, sex, socioeconomic status, location and morbidity level (measured using the Johns Hopkins Aggregated Diagnosis Groups), and time to finding a new primary care physician. We produced survival curves stratified by patient age, sex, income and morbidity. Results Fifty-four percent of patients found a new MSOC within the first 12 months following their physician's retirement. Six percent of patients still had not found a new physician after 36 months. Patients who were older and had higher levels of morbidity were more likely to find a new MSOC, and found one faster, than younger, healthier patients. Patients located in more urban regional health authorities also took longer to find a new MSOC compared to those in rural areas. Conclusions Primary care physician retirements represent a potential threat to accessibility; patients followed in this study took more than a year on average to find a new MSOC after their physician retired. 
Providing programmatic support to retiring physicians and their patients, as well as addressing shortages of longitudinal primary care more broadly could help to ensure smoother retirement transitions.
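The outcome construction described in the Methods, days from the last visit with the retiring physician to the first visit with a new MSOC, right-censored for patients who had not found one within three years, can be sketched as follows (function and field names hypothetical):

```python
from datetime import date

def time_to_new_msoc(last_visit, first_new_visit, censor_days=3 * 365):
    """Days from the last visit with the retiring MSOC to the first
    visit with the new one, right-censored at three years.
    Returns (days, event): event = 1 if a new MSOC was found in time."""
    if first_new_visit is None:                      # never found one
        return censor_days, 0
    days = (first_new_visit - last_visit).days
    if days > censor_days:                           # found, but after the window
        return censor_days, 0
    return days, 1
```

The (days, event) pairs produced this way are exactly the inputs a Cox model or survival curve routine expects.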


Author(s):  
Majdi Imterat ◽  
Tamar Wainstock ◽  
Eyal Sheiner ◽  
Gali Pariente

Abstract Recent evidence suggests that a long inter-pregnancy interval (IPI: the time from a live birth to the estimated conception of the subsequent pregnancy) poses a risk for adverse short-term perinatal outcomes. We aimed to study the effect of short (<6 months) and long (>60 months) IPI on long-term cardiovascular morbidity of the offspring. A population-based cohort study was performed in which all singleton live births in parturients with at least one previous birth were included. Hospitalizations of the offspring up to the age of 18 years involving cardiovascular diseases were evaluated according to IPI length. The intermediate interval, between 6 and 60 months, was considered the reference. Kaplan–Meier survival curves were used to compare the cumulative morbidity incidence between the groups, and a Cox proportional hazards model was used to control for confounders. During the study period, 161,793 deliveries met the inclusion criteria. Of them, 14.1% (n = 22,851) occurred in parturients following a short IPI, 78.6% (n = 127,146) following an intermediate IPI, and 7.3% (n = 11,796) following a long IPI. Total hospitalizations of the offspring involving cardiovascular morbidity were comparable between the groups, and the Kaplan–Meier survival curves demonstrated similar cumulative incidences of cardiovascular morbidity in all groups. In a Cox proportional hazards model, short and long IPI did not appear as independent risk factors for later pediatric cardiovascular morbidity of the offspring (adjusted HR 0.97, 95% CI 0.80–1.18 and adjusted HR 1.01, 95% CI 0.83–1.37 for short and long IPI, respectively). In our population, extreme IPIs do not appear to impact long-term cardiovascular hospitalizations of offspring.
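The exposure classification above (short <6 months, long >60 months, with the 6-60 month interval as reference) maps directly to a small categorizer:

```python
def ipi_group(months):
    """Classify an inter-pregnancy interval: months from a live birth
    to the estimated conception of the subsequent pregnancy."""
    if months < 6:
        return "short"
    if months > 60:
        return "long"
    return "intermediate"  # 6-60 months, the reference group
```

Encoding the two extreme groups as indicator variables against this reference is what yields the two adjusted HRs reported in the Cox model.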


Circulation ◽  
2007 ◽  
Vol 116 (suppl_16) ◽  
Author(s):  
Billie Jean Martin ◽  
Dimitri Kalavrouziotis ◽  
Roger Baskett

Introduction While there are rigorous assessments of trainees' knowledge through formal examinations, objective assessments of technical skills are not available, and little is known about the safety of allowing resident trainees to perform cardiac surgical operations. Methods Peri-operative data were prospectively collected on all patients who underwent coronary artery bypass grafting (CABG), aortic valve replacement (AVR) or a combined procedure between 1998 and 2005. Teaching cases were identified from resident records and defined as cases in which the resident performed the operation skin to skin. Pre-operative characteristics were compared between teaching and non-teaching cases. Short-term adverse events were defined as a composite of: in-hospital mortality, stroke, intra- or post-operative intra-aortic balloon pump (IABP) insertion, myocardial infarction, renal failure, wound infection, sepsis or return to the operating room. Intermediate adverse outcomes were defined as hospital readmission for any cardiac disease or late mortality. Logistic regression and Cox proportional hazards models were used to adjust for differences in age, acuity, and medical co-morbidities. Outcomes were compared between teaching and non-teaching cases. Results 6929 cases were included, 895 of which were identified as teaching cases. Teaching cases were more likely to involve an EF<40%, pre-operative IABP, CHF, combined CABG/AVR or total arterial grafting (all p<0.01). However, being a teaching case was not a predictor of in-hospital mortality (OR=1.02, 95%CI 0.67–1.55) or of the composite short-term outcome (OR=0.97, 95%CI 0.75–1.24). The Kaplan-Meier event-free survival of staff and teaching cases was equivalent at 1, 3, and 5 years: 80% vs. 78%, 67% vs. 66%, and 58% vs. 55% (log-rank p=0.06). Cox proportional hazards regression modeling did not demonstrate teaching case to be a predictor of late death or re-hospitalization (HR=1.05, 95%CI 0.94–1.18). 
Conclusions Teaching cases were more likely to have greater acuity and complexity than non-teaching cases. Despite this, teaching cases did no worse than staff cases in the short or intermediate term. Allowing residents to perform cardiac surgery does not appear to adversely affect patient outcomes.
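The log-rank comparison behind the reported p=0.06 tests whether two event-free survival curves differ, by comparing observed with expected events in one group at each event time. A compact sketch of the two-group chi-square statistic (toy data, not the study's):

```python
def log_rank_statistic(times1, events1, times2, events2):
    """Two-group log-rank chi-square statistic (1 degree of freedom)."""
    pooled = [(t, e, 0) for t, e in zip(times1, events1)] + \
             [(t, e, 1) for t, e in zip(times2, events2)]
    obs_minus_exp, var = 0.0, 0.0
    for t in sorted({t for t, e, _ in pooled if e == 1}):
        n1 = sum(1 for ti, _, g in pooled if ti >= t and g == 0)  # at risk, group 1
        n2 = sum(1 for ti, _, g in pooled if ti >= t and g == 1)  # at risk, group 2
        d1 = sum(1 for ti, e, g in pooled if ti == t and e == 1 and g == 0)
        d2 = sum(1 for ti, e, g in pooled if ti == t and e == 1 and g == 1)
        n, d = n1 + n2, d1 + d2
        if n < 2:
            continue
        obs_minus_exp += d1 - d * n1 / n       # observed - expected in group 1
        var += d * (n1 / n) * (n2 / n) * (n - d) / (n - 1)
    return obs_minus_exp ** 2 / var
```

Referring the statistic to a chi-square distribution with one degree of freedom gives the quoted p-value.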


Author(s):  
Joshua R Ehrlich ◽  
Bonnielin K Swenor ◽  
Yunshu Zhou ◽  
Kenneth M Langa

Abstract Background Vision impairment (VI) is associated with incident cognitive decline and dementia. However, it is not known whether VI is associated only with the transition to cognitive impairment, or whether it is also associated with later transitions to dementia. Methods We used data from the population-based Aging, Demographics and Memory Study (ADAMS) to investigate the association of visual acuity impairment (VI; defined as binocular presenting visual acuity <20/40) with transitions from cognitively normal (CN) to cognitive impairment no dementia (CIND) and from CIND to dementia. Multivariable Cox proportional hazards models and logistic regression were used to model the association of VI with cognitive transitions, adjusted for covariates. Results There were 351 participants included in this study (weighted percentages: 45% male, 64% age 70-79 years) with a mean follow-up time of 4.1 years. In a multivariable model, the hazard of dementia was elevated among those with VI (HR=1.63, 95%CI=1.04-2.58). Participants with VI had a greater hazard of transitioning from CN to CIND (HR=1.86, 95%CI=1.09-3.18). However, among those with CIND and VI, a similar percentage transitioned to dementia (48%) and remained CIND (52%); there was no significant association between VI and transitioning from CIND to dementia (HR=0.94, 95%CI=0.56-1.55). Logistic regression models identified the same associations between VI and cognitive transitions. Conclusions Poor vision is associated with the development of CIND. The association of VI and dementia appears to be due to the higher risk of dementia among individuals with CIND. Findings may inform the design of future interventional studies.
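The 48%/52% split among CIND participants with VI is, at heart, a transition tabulation stratified by baseline vision status. A sketch of that bookkeeping (state labels and record layout are ours):

```python
from collections import Counter

def transition_table(records):
    """Counts of cognitive-state transitions, keyed by
    (vision_impaired, state_at_baseline, state_at_follow_up)."""
    return Counter(records)

def share_transitioning(table, vi, src, dst):
    """Proportion of subjects starting in state `src` (vision status
    `vi`) observed in state `dst` at follow-up."""
    total = sum(n for (v, s, _), n in table.items() if v == vi and s == src)
    return table[(vi, src, dst)] / total
```

The Cox and logistic models in the abstract then test whether these transition proportions differ by vision status after covariate adjustment.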


Neurology ◽  
2021 ◽  
pp. 10.1212/WNL.0000000000012973
Author(s):  
Sokratis Charisis ◽  
Eva Ntanasi ◽  
Mary Yannakoulia ◽  
Costas A Anastasiou ◽  
Mary H Kosmidis ◽  
...  

Background and objectives: Aging is characterized by a functional shift of the immune system towards a proinflammatory phenotype. This derangement has been associated with cognitive decline and has been implicated in the pathogenesis of dementia. Diet can modulate systemic inflammation; thus, it may be a valuable tool to counteract the associated risks for cognitive impairment and dementia. The present study aimed to explore the associations between the inflammatory potential of diet, assessed using an easily applicable, population-based, biomarker-validated dietary inflammatory index (DII), and the risk for dementia in community-dwelling older adults. Methods: Individuals from the Hellenic Longitudinal Investigation of Aging and Diet (HELIAD) were included in the present cohort study. Participants were recruited through random population sampling and were followed for a mean of 3.05 (SD=0.85) years. Dementia diagnosis was based on standard clinical criteria. Those with baseline dementia and/or missing cognitive follow-up data were excluded from the analyses. The inflammatory potential of diet was assessed through a DII score that considers literature-derived associations of 45 food parameters with blood levels of pro- and anti-inflammatory cytokines; higher values indicate a more pro-inflammatory diet. Consumption frequencies were derived from a detailed food frequency questionnaire and were standardized to representative dietary intake normative data from 11 different countries. Analysis of dementia incidence as a function of baseline DII scores was performed with Cox proportional hazards models. Results: Analyses included 1059 individuals (mean age=73.1 years; 40.3% males; mean education=8.2 years), 62 of whom developed incident dementia. Each additional unit of DII was associated with a 21% increase in the risk of incident dementia [HR=1.21 (1.03–1.42); p=0.023]. 
Compared to participants in the lowest DII tertile, participants in the highest tertile (maximal pro-inflammatory diet potential) were 3 times more likely to develop incident dementia [(1.2–7.3); p=0.014]. The test for trend was also significant, indicating a potential dose-response relationship (p=0.014). Conclusions: In the present study, higher DII scores (indicating greater pro-inflammatory diet potential) were associated with an increased risk of incident dementia. These findings might inform the development of primary dementia prevention strategies through tailored and precise dietary interventions.
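The DII construction described above, z-scoring each food parameter against global normative intakes, mapping to a centered percentile, and weighting by a literature-derived inflammatory effect score, can be sketched as follows. This is a simplified one-parameter illustration; the parameter name, its norms, and its weight are placeholders, and the published index combines 45 parameters:

```python
import math

def dii_score(intakes, norms, effect_weights):
    """Simplified dietary inflammatory index.
    intakes:        {parameter: subject's daily intake}
    norms:          {parameter: (global mean, global SD)}
    effect_weights: {parameter: inflammatory effect score,
                     positive = pro-inflammatory}"""
    total = 0.0
    for p, x in intakes.items():
        mean, sd = norms[p]
        z = (x - mean) / sd
        percentile = 0.5 * (1 + math.erf(z / math.sqrt(2)))  # normal CDF
        total += (2 * percentile - 1) * effect_weights[p]    # center on 0
    return total
```

Summing centered, weighted percentiles means an average diet scores near zero, while diets rich in pro-inflammatory parameters accumulate positive values, the scale on which the per-unit HR of 1.21 is reported.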


2019 ◽  
Vol 8 (5) ◽  
pp. 733
Author(s):  
Jooyoung Chang ◽  
Seulggie Choi ◽  
Kyuwoong Kim ◽  
Sang Min Park

Several studies suggest that 5-alpha reductase inhibitors (5ARIs) may be associated with elevated risk of cardiovascular disease (CVD). We investigated the association of 5ARI exposure and CVD incidence using the National Health Insurance Service-Health Screening Cohort, a nationally representative population-based sample of Koreans. We calculated the 4-year cumulative exposure to 5ARI for 215,003 men without prior 5ARI use. Participants were followed from January 1st, 2008 to December 31st, 2015. A subcohort of benign prostatic hyperplasia (BPH) patients newly diagnosed during 2004–2010 was also analyzed. The primary study outcome was CVD and secondary outcomes were myocardial infarction (MI) or stroke. Hazard ratios (HRs) were estimated using Cox proportional hazards models adjusted for conventional risk factors. In both the main cohort and the BPH subcohort, use of any 5ARI did not increase the risk of cardiovascular disease (HR = 1.06; 95% CI = 0.91–1.23 and HR = 0.95; 95% CI = 0.88–1.03, respectively). Furthermore, as an unexpected finding, a dose-response analysis among the BPH subcohort showed that the highest tertile of 5ARI exposure was associated with a reduced risk of CVD (HR = 0.82; 95% CI = 0.72–0.92; p-trend = 0.001), MI (HR = 0.69; 95% CI = 0.50–0.95), and stroke (HR = 0.84; 95% CI = 0.72–0.98) compared to non-users. Among men overall and among BPH patients, 5ARI did not increase the risk of CVD. Among BPH patients, 5ARI use may reduce the risk of CVD.
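The dose-response analysis groups cumulative 5ARI exposure into tertiles and tests for trend across them. A minimal tertile assignment is sketched below; cut-point handling at ties is one of several reasonable conventions, and the study's exact rule is not stated:

```python
def tertiles(values):
    """Assign each subject's cumulative exposure to a tertile (1-3).
    The tertile index can then enter a Cox model as an ordinal term
    to produce the p-trend reported above."""
    ranked = sorted(values)
    n = len(ranked)
    c1, c2 = ranked[n // 3], ranked[2 * n // 3]  # empirical cut points
    return [1 if v <= c1 else 2 if v <= c2 else 3 for v in values]
```

Comparing the top tertile against non-users gives the reported HRs, while treating the tertile index as a continuous covariate yields the trend test.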


2022 ◽  
Vol 14 (1) ◽  
Author(s):  
Jiacheng He

Abstract Purpose The creatinine to body weight (Cre/BW) ratio is considered an independent risk factor for incident type 2 diabetes mellitus (T2DM), but research on this relationship is limited, and the relationship between the Cre/BW ratio and T2DM among Chinese individuals is still ambiguous. This study aimed to evaluate the correlation between the Cre/BW ratio and the risk of T2DM in the Chinese population. Methods This is a retrospective cohort study from a prospectively collected database. We included a total of 200,658 adults free of T2DM at baseline. The risk of incident T2DM according to Cre/BW ratio was estimated using multivariable Cox proportional hazards models, and a two-piecewise linear regression model was developed to identify the threshold effect. Results Over a median follow-up of 3.13 ± 0.94 years, a total of 4001 (1.99%) participants developed T2DM. Overall, there was an L-shaped relation of Cre/BW ratio with the risk of incident T2DM (P for non-linearity < 0.001). When the Cre/BW ratio (× 100) was less than 0.86, the risk of T2DM decreased significantly as the Cre/BW ratio increased [0.01 (0.00, 0.10), P < 0.001]. When the Cre/BW ratio (× 100) was between 0.86 and 1.36, the reduction in the risk of developing T2DM was less pronounced [0.22 (0.12, 0.38), P < 0.001]. When the Cre/BW ratio (× 100) was greater than 1.36, the curve flattened and the association was no longer significant [0.73 (0.29, 1.8), P = 0.49]. Conclusion There was an L-shaped relation of Cre/BW ratio with incidence of T2DM in general Chinese adults: a negative curvilinear association with a saturation effect at 0.86 and 1.36 of the Cre/BW ratio (× 100).
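The piecewise (segmented) model behind these threshold estimates lets the slope change at candidate knots while keeping the fitted line continuous, typically via hinge terms. A sketch using ordinary least squares; knot values, variable names, and data are illustrative, and the study fits the analogous form inside a Cox model rather than OLS:

```python
def piecewise_design(x, knots):
    """Design row for a continuous piecewise-linear model: intercept,
    x, and one hinge term max(x - k, 0) per knot, so the slope may
    change at each threshold without a jump in the fitted line."""
    return [1.0, x] + [max(x - k, 0.0) for k in knots]

def ols(X, y):
    """Least squares via the normal equations (Gaussian elimination)."""
    p = len(X[0])
    A = [[sum(r[i] * r[j] for r in X) for j in range(p)] for i in range(p)]
    c = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(p)]
    for i in range(p):                      # forward elimination w/ pivoting
        piv = max(range(i, p), key=lambda r: abs(A[r][i]))
        A[i], A[piv], c[i], c[piv] = A[piv], A[i], c[piv], c[i]
        for r in range(i + 1, p):
            f = A[r][i] / A[i][i]
            for j in range(i, p):
                A[r][j] -= f * A[i][j]
            c[r] -= f * c[i]
        # c now holds the eliminated right-hand side
    b = [0.0] * p
    for i in range(p - 1, -1, -1):          # back substitution
        b[i] = (c[i] - sum(A[i][j] * b[j] for j in range(i + 1, p))) / A[i][i]
    return b
```

The coefficient on each hinge term is the change in slope at that knot; scanning candidate knot locations for the best fit is how the 0.86 and 1.36 inflection points are typically located.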


PLoS ONE ◽  
2020 ◽  
Vol 15 (11) ◽  
pp. e0242429
Author(s):  
Shian-Ying Sung ◽  
Trang Thi Huynh Le ◽  
Jin- Hua Chen ◽  
Teng-Fu Hsieh ◽  
Chia-Ling Hsieh

Elevated renal cell carcinoma (RCC) risk has been associated with the use of several antihypertensive medications but has not yet been elucidated in populations prescribed alpha-1 blockers, which are commonly used in the treatment of hypertension and of lower urinary tract symptoms associated with benign prostatic hyperplasia (LUTS-BPH). The aim of the present study was to investigate the association between alpha-1 blocker use and the risk of developing RCC using a nationwide population-based database in Taiwan. Patients who were treated with alpha-1 blockers for at least 28 days were identified through the Taiwan National Health Insurance Research Database from 2000 to 2010. Unexposed participants were matched with the exposed cases according to age, sex, and index year at a ratio of 3:1. Cox proportional hazards regression, stratified by sex and comorbidities and adjusted for age, was performed to estimate hazard ratios (HRs) for the risk of subsequent RCC. Among 2,232,092 subjects, patients who received alpha-1 blocker treatment had a higher risk of RCC than the unexposed group. Taking hypertension and BPH into account, the adjusted HR was significantly higher in male alpha-1 blocker users without BPH, whether with (HR: 1.63, 95% confidence interval [CI] = 1.22–2.18) or without (HR: 2.31, 95% CI = 1.40–3.81) hypertension, than in men not receiving these drugs. Taken together, male alpha-1 blocker users without comorbid BPH exhibited an increased risk of developing RCC independent of hypertension. Further study is warranted to elucidate the underlying mechanisms of this association.
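The 3:1 matching of unexposed to exposed subjects on age, sex, and index year can be sketched as exact matching within strata, sampling controls without replacement. Field names and the tie-handling details are hypothetical; the study does not specify its matching algorithm:

```python
import random
from collections import defaultdict

def match_unexposed(exposed, unexposed, ratio=3, seed=42):
    """Exact 1:ratio matching on (age, sex, index_year), sampling
    controls without replacement. Subjects are dicts with those keys."""
    pools = defaultdict(list)
    for u in unexposed:
        pools[(u["age"], u["sex"], u["index_year"])].append(u)
    rng = random.Random(seed)
    for pool in pools.values():
        rng.shuffle(pool)                  # randomize which controls are drawn
    matched = []
    for case in exposed:
        pool = pools[(case["age"], case["sex"], case["index_year"])]
        if len(pool) >= ratio:             # skip cases lacking enough controls
            matched.append((case, [pool.pop() for _ in range(ratio)]))
    return matched
```

Matching on the index year ensures exposed and unexposed subjects enter follow-up at comparable calendar times, so the subsequent Cox regression compares like with like.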

