Increasing Age Is Not Independently Associated with Increased 30-Day Morbidity after Hip Arthroscopy

2018
Vol 02 (03)
pp. 148-154
Author(s):
Venkat Boddapati
Jamie Confino
Michael Fu
Kyle Duchman
Robert Westermann
...  

Abstract The purpose of this study is to examine 30-day postoperative complications after hip arthroscopy as a function of patient age. The American College of Surgeons National Surgical Quality Improvement Program database from 2005 to 2016 was used to identify all patients undergoing hip arthroscopy using Current Procedural Terminology and International Classification of Disease codes. In this retrospective cohort study (level of evidence 3), patient characteristics and postoperative complications were compared across patient age cohorts using bivariate and multivariate analyses that corrected for differences in baseline patient characteristics. In total, 2,427 patients undergoing hip arthroscopy were identified. Of these, 667 (27.5%) were under 30 years of age, 596 (24.5%) were between 31 and 40, 599 (24.6%) were between 41 and 50, and 566 (23.3%) were older than 50. Chondroplasty, abrasion arthroplasty, and/or resection of the labrum were the most commonly performed procedures in all age groups. As age increased, patients were more likely to be female and to have a higher body mass index, more medical comorbidities, a shorter operative duration, and a higher American Society of Anesthesiologists class. The rate of any 30-day postoperative complication was 1.35% in patients under 30 years of age, 1.68% in patients between 31 and 40, 2.67% in patients between 41 and 50, and 5.12% in patients older than 50 (p < 0.001). Older patients also had higher rates of deep surgical site infection and blood transfusion (p ≤ 0.001). However, no significant differences were identified on multivariate analysis. While older patients had higher rates of short-term complications following hip arthroscopy, age alone was not an independent predictor of adverse outcomes. Further investigation is necessary to determine the risk factors associated with significant postoperative morbidity in older patients undergoing hip arthroscopy.
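The bivariate-then-adjusted comparison described above can be sketched in Python with statsmodels. This is a minimal illustration, not the authors' code: the file name and column names (age_group, any_complication, sex, bmi, asa_class) are hypothetical stand-ins for NSQIP-derived fields.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("hip_arthroscopy_cohort.csv")  # hypothetical extract

# Bivariate view: crude 30-day complication rate per age cohort
print(df.groupby("age_group")["any_complication"].mean())

# Multivariable logistic regression adjusting for baseline characteristics
model = smf.logit(
    "any_complication ~ C(age_group, Treatment(reference='<30'))"
    " + sex + bmi + C(asa_class)",
    data=df,
).fit()
print(np.exp(model.params).round(2))      # adjusted odds ratios
print(np.exp(model.conf_int()).round(2))  # 95% confidence intervals
```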

Stroke
2016
Vol 47 (suppl_1)
Author(s):
Isibor J Arhuidese
Tammam Obeid
Besma Nejim
Kanhua Yin
Sophie Wang
...  

Introduction: The increasing prevalence and earlier onset of risk factors have resulted in an expanding population of younger patients undergoing carotid endarterectomy (CEA) in recent times. Outcomes after CEA are largely unreported in these patients. In this study, we evaluate 30-day postoperative outcomes after CEA in an exclusive cohort of young and middle-aged patients. Methods: We studied all patients aged 64 years and younger who underwent CEA in the American College of Surgeons National Surgical Quality Improvement Program (NSQIP) from January 2005 to December 2013. Univariate methods (chi-square, t-test) were employed to compare patient characteristics. Multivariate regression adjusting for patient characteristics was used to identify predictors of adverse outcomes. Results: There were 15,830 CEAs performed in this cohort, with a mean age of 58 (SD: 5.1) years. The majority of patients were male (59%), Caucasian (85%), and hypertensive (81%). Nearly half (46%) were symptomatic. Overall, 266 (1.7%) patients suffered a stroke in the 30-day postoperative period, while mortality and myocardial infarction rates were 0.6% and 0.4%, respectively. The significant predictors of stroke or death were female gender (OR: 1.49; 95% CI: 1.15-1.92; p=0.002), symptomatic status (OR: 1.69; 95% CI: 1.30-2.21; p<0.001), previous cardiac intervention (OR: 1.42; 95% CI: 1.04-1.93; p=0.026), and physical dependence (OR: 1.81; 95% CI: 1.16-2.82; p=0.01). The mean length of in-hospital stay was 3 (SD: 5.6) days, and complications within 30 days of surgery are shown in Table 1. Conclusions: Absolute stroke and mortality rates after CEA in young and middle-aged patients are not different from those reported in the general population. Stroke and mortality are significantly higher in symptomatic and physically dependent patients and in those with prior cardiac intervention. We recommend extra vigilance in the management of these patients in order to improve CEA outcomes.
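A minimal sketch of the univariate comparisons named in the Methods (chi-square for categorical characteristics, Welch's t-test for continuous ones) is shown below; the columns sex, age, and stroke_30d are hypothetical placeholders for NSQIP fields, and the multivariate regression step would follow the logistic-model pattern sketched earlier.

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("cea_under65.csv")  # hypothetical extract

# Chi-square: association between sex and 30-day stroke
table = pd.crosstab(df["sex"], df["stroke_30d"])
chi2, p, dof, _ = stats.chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.4f}")

# Welch's t-test: mean age in patients with vs. without 30-day stroke
with_stroke = df.loc[df["stroke_30d"] == 1, "age"]
without_stroke = df.loc[df["stroke_30d"] == 0, "age"]
t, p = stats.ttest_ind(with_stroke, without_stroke, equal_var=False)
print(f"t={t:.2f}, p={p:.4f}")
```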


2018
Vol 7 (12)
pp. 506
Author(s):
Katarina Boršič
Rok Blagus
Tjaša Cerar
Franc Strle
Daša Stupica

Infected elderly people often present with signs and symptoms that differ from those in younger adults, but data on the association between patient age and the presentation of early Lyme borreliosis (LB) are limited. In this study, the association between patient age (young, 18–44 years; middle-aged, 45–64 years; elderly, ≥ 65 years) and disease course, microbiologic characteristics, and the long-term outcome of treatment was investigated prospectively in 1220 adult patients with early LB manifesting as erythema migrans (EM) at a single-center university hospital. Patients were assessed at enrolment and followed up for 12 months. Age was associated with comorbidities, previous LB, presenting with multiple EM, and seropositivity to borreliae at enrolment. The time to resolution of EM after starting antibiotic treatment was longer in older patients. At 12 months, 59/989 (6.0%) patients showed incomplete response. The odds for incomplete response decreased with time from enrolment (odds ratio (OR) of 0.49, 0.50, and 0.48 for the 2-month vs. 14-day, 6-month vs. 2-month, and 12-month vs. 6-month follow-up visits, respectively), but were higher with advancing age (OR 1.57 for middle-aged vs. young, and 1.95 for elderly vs. young), in women (OR 1.41, 95% confidence interval (CI) 1.01–1.96), in patients who reported LB-associated constitutional symptoms at enrolment (OR 7.69, 95% CI 5.39–10.97), and in those who presented with disseminated disease (OR 1.65, 95% CI 1.09–2.51). The long-term outcome of EM was excellent in patients of all age groups. However, older patients had slower resolution of EM and higher odds for an unfavorable outcome of treatment (OR 1.57, 95% CI 1.05–2.34 for middle-aged vs. young; and OR 1.95, 95% CI 1.14–3.32 for elderly vs. young), manifested predominantly as post-LB symptoms. The presence of LB-associated constitutional symptoms at enrolment was the strongest predictor of incomplete response.
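One way to obtain visit-to-visit and age-group odds ratios of the kind reported above is a repeated-measures logistic model fitted by generalized estimating equations. This is only an illustrative sketch under assumed long-format data (columns patient_id, visit, incomplete, age_group, sex are hypothetical); the authors' actual modelling approach may differ.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Hypothetical long-format table: one row per patient per follow-up visit
long = pd.read_csv("em_followup_long.csv")

model = smf.gee(
    "incomplete ~ C(visit) + C(age_group, Treatment(reference='young')) + sex",
    groups="patient_id",
    data=long,
    family=sm.families.Binomial(),
).fit()
print(np.exp(model.params).round(2))  # odds ratios for visit and age-group terms
```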


2020
Vol 35 (12)
pp. 2331-2338
Author(s):
Vera E. R. Asscher
Quirine van der Vliet
Karen van der Aalst
Anniek van der Aalst
...  

Abstract Purpose To assess the safety and effectiveness of anti-tumor necrosis factor (anti-TNF) therapy in IBD patients ≥ 60 years. Methods Ninety IBD patients ≥ 60 years at initiation of anti-TNF therapy, 145 IBD patients ≥ 60 years without anti-TNF therapy, and 257 IBD patients < 60 years at initiation of anti-TNF therapy were retrospectively included in this multicentre study. The primary outcome was the occurrence of severe adverse events (SAEs), serious infections, and malignancies. The secondary outcome was effectiveness of therapy. Cox regression analyses were used to assess differences in safety and effectiveness. In the safety analyses, older patients with and without anti-TNF therapy were compared first, followed by older and younger patients with anti-TNF therapy. Results In older IBD patients, the use of anti-TNF therapy was associated with serious infections (aHR 3.920, 95% CI 1.185–12.973, p = .025). In anti-TNF-exposed patients, cardiovascular disease was associated with serious infections (aHR 3.279, 95% CI 1.098–9.790, p = .033) and the presence of multiple comorbidities with malignancies (aHR 9.138, 95% CI 1.248–66.935, p = .029), while patient age was not associated with safety outcomes. Effectiveness of therapy was not affected by age or comorbidity. Conclusion Older patients receiving anti-TNF therapy have a higher risk of serious infections compared with older IBD patients without anti-TNF therapy, but not compared with younger patients receiving anti-TNF therapy. However, in anti-TNF-exposed patients, comorbidity rather than age was an indicator of risk for SAEs. Effectiveness was comparable between older and younger patients.
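A Cox regression for time to first serious infection, in the spirit of the analysis above, could be sketched with lifelines as follows. This is not the study dataset or code; the column names (followup_months, serious_infection, anti_tnf, age, cardiovascular_disease, n_comorbidities) are hypothetical.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ibd_cohort.csv")  # hypothetical extract

cph = CoxPHFitter()
cph.fit(
    df[["followup_months", "serious_infection", "anti_tnf", "age",
        "cardiovascular_disease", "n_comorbidities"]],
    duration_col="followup_months",
    event_col="serious_infection",
)
cph.print_summary()  # adjusted hazard ratios (exp(coef)) with 95% CIs
```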


2017
Vol 8 (2)
pp. 164-171
Author(s):
Kevin Phan
Jun S. Kim
Joshua Xu
John Di Capua
Nathan J. Lee
...  

Study Design: Retrospective analysis of prospectively collected data. Objective: The effect of malnutrition on outcomes after general surgery has been well reported in the literature. However, there is a paucity of data on the effect of malnutrition on postoperative complications after adult deformity surgery. This study attempts to explore and quantify the association between hypoalbuminemia and postoperative complications. Methods: A retrospective cohort analysis was performed on the American College of Surgeons National Surgical Quality Improvement Program (ACS NSQIP) database from 2010 to 2014. Patients (≥18 years of age) from the NSQIP database undergoing adult deformity surgery were separated into cohorts based on serum albumin (<3.5 vs. ≥3.5 g/dL). Chi-square and multivariate logistic regression models were used to identify independent risk factors. Results: A total of 2236 patients met the inclusion criteria for the study, of which 2044 (91.4%) were nutritionally sufficient and 192 (8.6%) were nutritionally insufficient. Multivariate logistic regressions revealed nutritional insufficiency as a risk factor for mortality (odds ratio [OR] = 15.67, 95% confidence interval [CI] = 6.01-40.84, P < .0001), length of stay ≥5 days (OR = 2.22, 95% CI = 1.61-3.06, P < .0001), any complication (OR = 1.82, 95% CI = 1.31-2.51, P < .0001), pulmonary complications (OR = 2.29, 95% CI = 1.29-4.06, P = .005), renal complications (OR = 2.71, 95% CI = 1.05-7.00, P = .039), and intra-/postoperative red blood cell transfusion (OR = 1.52, 95% CI = 1.08-2.12, P = .015). Conclusions: This study demonstrates that preoperative hypoalbuminemia is a significant and independent risk factor for postoperative complications, 30-day mortality, and increased length of hospital stay in patients undergoing adult deformity surgery. Nutritional status is a modifiable risk factor that can potentially improve surgical outcomes after adult deformity surgery.
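For orientation, an unadjusted odds ratio for hypoalbuminemia versus any complication can be computed from a 2x2 table as below; the multivariable models in the study additionally adjust for baseline covariates. The counts are invented purely for illustration and are not taken from the paper.

```python
import numpy as np
from statsmodels.stats.contingency_tables import Table2x2

# Rows: hypoalbuminemia (<3.5 g/dL) yes/no; columns: any complication yes/no
table = np.array([[20, 80],      # invented counts, not study data
                  [50, 850]])
t22 = Table2x2(table)
print(round(t22.oddsratio, 2))               # unadjusted odds ratio
print(np.round(t22.oddsratio_confint(), 2))  # 95% confidence interval
```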


Blood
2004
Vol 104 (11)
pp. 298-298
Author(s):
Christoph Schmid
Myriam Labopin
Juergen Finke
Gerhard Ehninger
Olle Ringden
...  

Abstract Relapsed AML after allogeneic SCT has a poor prognosis, and so far no standard therapy has been defined. Donor lymphocyte transfusion (DLT) has been effective in a minority of patients; however, no data are available to identify patients who will benefit from the procedure, nor have the outcomes of patients treated with or without DLT been compared. We retrospectively evaluated overall survival (OS) of 489 adults with de novo AML in hematological relapse after SCT who received DLT (n=190) or not (n=299). The DLT and no-DLT groups were well balanced in terms of patient age (median: 37 years in both groups), donor age, cytogenetics (good: 5 vs. 7%, intermediate: 83 vs. 79%, poor: 12 vs. 14%), WBC at diagnosis, donor type (geno-identical: 71 vs. 72%, MUD: 18% in both, mismatched: 11 vs. 10%), status at transplantation (CR1: 38 vs. 41%, CR2: 13 vs. 15%, advanced: 49 vs. 44%), conditioning, source of stem cells, and time from transplant to relapse (5 vs. 4.5 months). However, DLT patients had a median of 39% BM blasts, compared with 54% in the no-DLT group (p=0.03). Follow-up was 32 and 30 months, respectively. Within the DLT group, chemotherapy was additionally given in 130 cases. Nevertheless, only 33% of patients received DLT in CR or aplasia; 67% had measurable disease. Acute GVHD developed in 41% of patients following DLT. CR and PR were achieved in 31.1% and 4.8% of DLT patients, respectively. In a multivariate analysis, younger patient age (<36 years) (HR=1.53, p=0.02) and a longer interval (>5 months) from transplant to relapse (HR=7.74, p=0.002) were associated with better OS after DLT. When comparing the outcome of patients receiving DLT or not, OS at 2 years was 10±1% for the entire cohort, 18±3% for DLT, and 6±1% for no DLT (p<0.0001). In a multivariate analysis, use of DLT (HR=2.11, p<0.0001), recipient age <36 years (HR=1.69, p<0.001), a longer interval (>5 months) from transplant to relapse (HR=2.40, p<0.0001), and a lower number of BM blasts (<48%) at relapse (HR=1.56, p=0.002) were favorable for OS. The results of this retrospective analysis suggest that DLT may be of advantage in the treatment of AML relapse after transplant, at least in younger patients with a longer post-transplant remission who relapse with a smaller amount of blasts in the BM. However, patients receiving DLT might represent a positive selection among all relapsed cases, since a considerable number of patients in the no-DLT cohort died too early to proceed to DLT. An intention-to-treat analysis and further prospective studies should investigate the role of DLT and other approaches, such as a second reduced-intensity SCT.
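A two-group overall-survival comparison of the kind reported above (2-year OS for DLT vs. no DLT) can be sketched with Kaplan-Meier curves and a log-rank test. This is a hypothetical illustration, not the study analysis; the columns os_months, died, and dlt are assumed names.

```python
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

df = pd.read_csv("aml_relapse_post_sct.csv")  # hypothetical extract
dlt = df[df["dlt"] == 1]
no_dlt = df[df["dlt"] == 0]

kmf = KaplanMeierFitter()
kmf.fit(dlt["os_months"], event_observed=dlt["died"], label="DLT")
print(kmf.predict(24))  # estimated OS probability at 24 months

res = logrank_test(dlt["os_months"], no_dlt["os_months"],
                   event_observed_A=dlt["died"],
                   event_observed_B=no_dlt["died"])
print(res.p_value)
```

Note that such a naive comparison inherits the selection bias the authors acknowledge (patients must survive long enough to receive DLT); a landmark or time-dependent analysis would be one way to address it.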


2018
Vol 06 (05)
pp. E610-E615
Author(s):
Heechan Kang
Mo Thoufeeq

Abstract Background and study aims Polypectomy and endoscopic mucosal resection (EMR) are effective and safe ways of removing polyps from the colon at endoscopy. Guidelines exist for advising the time allocation for diagnostic endoscopy but not for polypectomy and EMR. The aim of this study was to identify whether the time allocated for polypectomy and EMR on planned therapeutic lists in our endoscopy unit is sufficient for procedures to be carried out. We also wanted to identify factors that might be associated with procedures taking longer than the allocated time and factors that might predict the duration of these procedures. Patients and methods A retrospective case study of 100 planned lower gastrointestinal EMR and polypectomy procedures at colonoscopy and sigmoidoscopy was performed and analyzed quantitatively. Results The mean actual procedural time (APT) for the 100 procedures was 52 minutes and the mean allocated time (AT) was 43.05 minutes; hence the mean APT was 9 minutes longer than the mean AT. Factors that were significantly associated with procedures taking longer than the allocated time were patient age (P = 0.029) and polyp size (P = 0.005). Factors that significantly changed the actual procedure time were patient age (P = 0.018), morphology (P = 0.002), and polyp size (P < 0.001). Procedures involving flat and laterally spreading tumor (LST) type polyps took longer than those involving protruding ones. On multivariate analysis, polyp size was the only factor associated with actual procedure time. Number of polyps, quality of bowel preparation, and distance of the polyp from insertion did not significantly change procedure duration. Conclusion Factors that significantly contribute to the duration of polypectomy and EMR at lower gastrointestinal endoscopy include patient age, polyp size, and morphology on univariate analysis, with polyp size being the only factor with a significant association on multivariate analysis. We recommend that endoscopy units take these factors into consideration locally when allocating time for these procedures, so that they remain safe and effective.
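The univariable-versus-multivariable modelling of procedure duration described above could be sketched with ordinary least squares, as below. This is a minimal illustration under assumed data; the columns apt_min, age, polyp_size_mm, and morphology are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("emr_polypectomy_times.csv")  # hypothetical extract

# Univariable: polyp size alone
print(smf.ols("apt_min ~ polyp_size_mm", data=df).fit().summary())

# Multivariable: size, age, and morphology together
multi = smf.ols("apt_min ~ polyp_size_mm + age + C(morphology)", data=df).fit()
print(multi.params.round(2))   # extra minutes attributable to each factor
print(multi.pvalues.round(3))
```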


2014
Vol 121 (3)
pp. 580-586
Author(s):
Timothy Wen
Shuhan He
Frank Attenello
Steven Y. Cen
May Kim-Tenser
...  

Object As health care administrators focus on patient safety and cost-effectiveness, methodical assessment of quality outcome measures is critical. In 2008 the Centers for Medicare and Medicaid Services (CMS) published a series of “never events” that included 11 hospital-acquired conditions (HACs) for which related costs of treatment are not reimbursed. Cerebrovascular procedures (CVPs) are complex and are often performed in patients with significant medical comorbidities. Methods This study examines the impact of patient age and medical comorbidities on the occurrence of CMS-defined HACs, as well as the effect of these factors on length of stay (LOS) and hospitalization charges, in patients undergoing common CVPs. Results HACs occurred at a frequency of 0.49% (1.33% in intracranial procedures and 0.33% in carotid procedures). Falls/trauma (n = 4610, 72.3% of HACs, 357 HACs per 100,000 CVPs) and catheter-associated urinary tract infections (n = 714, 11.2% of HACs, 55 HACs per 100,000 CVPs) were the most common events. Age and the presence of ≥ 2 comorbidities were strong independent predictors of HACs (p < 0.0001). The occurrence of HACs negatively impacts both LOS and hospital costs: after adjusting for patient and hospital factors, patients with at least 1 HAC were 10 times more likely to have a prolonged LOS (≥ 90th percentile) (p < 0.0001) and 8 times more likely to have high inpatient costs (≥ 90th percentile) (p < 0.0001). Conclusions Improved quality protocols focused on individual patient characteristics might help to decrease the frequency of HACs in this high-risk population. These data suggest that risk adjustment according to underlying patient factors may be warranted when considering reimbursement for costs related to HACs in the setting of CVPs.
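The outcome definitions used above (HAC frequency per 100,000 procedures, and prolonged LOS or high charges at the 90th percentile) could be derived from a discharge-level table as sketched below; the columns hac_count, los_days, and total_charges are hypothetical and the file is not the study dataset.

```python
import pandas as pd

df = pd.read_csv("cvp_discharges.csv")  # hypothetical extract

any_hac = df["hac_count"] >= 1
print(f"HAC frequency: {any_hac.mean():.2%}")
print(f"HACs per 100,000 CVPs: {df['hac_count'].sum() / len(df) * 100_000:.0f}")

# Outcome flags used as endpoints in the adjusted models
df["prolonged_los"] = df["los_days"] >= df["los_days"].quantile(0.90)
df["high_charges"] = df["total_charges"] >= df["total_charges"].quantile(0.90)
print(df[["prolonged_los", "high_charges"]].mean())
```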


Blood
2016
Vol 128 (22)
pp. 4665-4665
Author(s):
Zain Bashey
Scott R. Solomon
Lawrence E. Morris
H. Kent Holland
Xu Zhang
...  

Abstract Introduction: Outcomes of non-transplant therapy in older adults (age > 60 years) with AML or high-grade MDS have historically been poor. Allogeneic hematopoietic cell transplantation (allo-HCT) may improve these outcomes. However, many older patients will lack a suitably HLA-matched sibling donor (MRD). Furthermore, many patients from ethnic minorities will lack an optimally matched unrelated donor (MUD). Additionally, the greater incidence and severity of chronic GVHD typically seen following MUD transplants may be particularly difficult to tolerate in older patients. T-replete haploidentical donor transplants (HAPLO) using post-transplant cyclophosphamide to mitigate alloreactivity may provide a suitable donor option for some older patients. However, no detailed comparison of outcomes after HAPLO versus MRD and MUD transplants in elderly patients with AML and high-grade MDS has been reported in the modern era. Methods: We analyzed outcomes of patients aged > 60 years with AML or high-grade MDS who received an allo-HCT at our center between 2005 and 2015. Ex vivo T-cell-depleted transplants and cord blood transplants were excluded. Supportive care measures were identical between the three donor groups. Patient characteristics and outcome parameters were extracted from our institutional database, where they had been prospectively entered. Kaplan-Meier estimates of overall survival (OS) and disease-free survival (DFS) were calculated, and the cumulative incidence method with competing risks was used to calculate rates of non-relapse mortality (NRM) and relapse. Cumulative incidences of acute and chronic GVHD were estimated with death treated as the competing risk. Cox proportional hazards models, stratified on the three transplant donor groups, were developed using OS, DFS, NRM, and relapse as endpoints and other parameters as covariates. GVHD was prospectively documented by a single dedicated nurse using established criteria, including NIH consensus criteria for chronic GVHD, and rates were calculated using the cumulative incidence method. Results: Patient characteristics (n=127; 33 HAPLO, 37 MRD, 57 MUD) were as follows: median age 64 (range 60-77); male 57%; regimen: myeloablative (24%), non-myeloablative (76%); graft: PBSC (80%), BM (20%); diagnosis: AML 59%, MDS 41%; DRI: low (2%), intermediate (58%), high (39%), very high (1%); Sorror HCT comorbidity index: 0-2 (46%), ≥3 (54%). The median number of HLA mismatches was 5/10 (range 2/10 to 5/10) for HAPLO patients. Estimated rates of OS, DFS, NRM, and relapse for the entire group at 2 years were 60%, 49%, 18%, and 33%, respectively. When compared with MRD and MUD patients, HAPLO patients had similar characteristics but were less likely to have received myeloablative conditioning (6% vs. 32% and 30%, respectively, for MRD and MUD; p=0.016) and were more likely to have received a BM graft (52% vs. 0% and 21%; p<0.001). Median follow-up of surviving patients following MRD, MUD, and HAPLO transplants was 34, 26, and 17 months, respectively. For MRD, MUD, and HAPLO transplants, respectively, estimated outcomes were as follows: TRM at 1 year: 14%, 14%, and 9%; at 2 years: 17%, 23%, and 9%. Relapse at 1 year: 25%, 34%, and 22%; at 2 years: 32%, 34%, and 33%. OS at 1 year: 72%, 72%, and 77%; at 2 years: 62%, 55%, and 67%. DFS at 1 year: 61%, 52%, and 69%; at 2 years: 51%, 43%, and 58% (Fig. 1) (p = NS for all endpoints on pointwise and global comparison).
The cumulative incidences of acute GVHD at 180 days were: grade 2-4: 27%, 37%, and 39%; grade 3-4: 8%, 18%, and 15% (p = NS for all). The cumulative incidences of chronic GVHD at 2 years were: moderate to severe: 38%, 35%, and 15% (p=0.028 MUD vs. HAPLO; p=0.026 MRD vs. HAPLO); severe: 12%, 11%, and 0% (p=0.030 MUD vs. HAPLO; p=0.009 MRD vs. HAPLO). On multivariable Cox analysis, donor type was not a significant predictor of OS, DFS, NRM, or relapse (Table 1). Conclusions: These results show that in the current era, using predominantly non-myeloablative conditioning regimens, 2-year OS and DFS rates of 60% and 49% with an NRM <20% can be achieved in patients aged >60 years who undergo allo-HCT for AML and high-grade MDS. Outcomes of patients transplanted from HAPLO donors are comparable to those from matched donors, although the rate of clinically significant chronic GVHD appears significantly lower following HAPLO transplants, which may translate into improved quality of life. Figure 1. Table 1. Multivariate analysis of overall survival, disease-free survival, transplant-related mortality, and relapse. Disclosures: No relevant conflicts of interest to declare.
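The cumulative-incidence-with-competing-risks approach named in the Methods can be sketched with the Aalen-Johansen estimator in lifelines. This is an illustrative example under assumed data, not the authors' code: the columns months and event are hypothetical, with event coded 0 = censored, 1 = relapse, 2 = non-relapse mortality.

```python
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.read_csv("allo_hct_over60.csv")  # hypothetical extract

# Cumulative incidence of relapse, with NRM as the competing risk
ajf_relapse = AalenJohansenFitter()
ajf_relapse.fit(df["months"], df["event"], event_of_interest=1)
print(ajf_relapse.cumulative_density_.tail(1))

# Cumulative incidence of NRM, with relapse as the competing risk
ajf_nrm = AalenJohansenFitter()
ajf_nrm.fit(df["months"], df["event"], event_of_interest=2)
print(ajf_nrm.cumulative_density_.tail(1))
```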

