The prognostic significance of lymphovascular tumor invasion in localized high-grade osteosarcoma: Outcomes of a single institution over ten years.

2021, Vol 39 (15_suppl), pp. e22013-e22013
Author(s):  
Charles Gusho ◽  
Ira Miller ◽  
Bishir Clayton ◽  
Matthew W. Colman ◽  
Steven Gitelis ◽  
...  

Background: Lymphovascular tumor invasion (LVI) has been associated with worse survival in high-grade osteosarcoma. The purpose of this investigation was to evaluate LVI as a predictor of survival in these patients. Methods: This study was a retrospective review of patients with high-grade, localized osteosarcoma diagnosed over a consecutive ten-year period. Cox proportional hazards regression was used to assess the prognostic significance of LVI for overall survival (OS). Results: A total of 42 cases met the inclusion criteria, with a median follow-up of 64 months (range, 6-158 months). LVI was present in 21.4% (n = 9) of cases. The five- and ten-year probabilities of OS in LVI (+) cases were 40% and 20%, respectively, compared with five- and ten-year estimates of 93% and 81% in LVI (-) cases (p < 0.001). After controlling for confounding variables, advanced age at diagnosis (HR, 1.134; 95% CI, 1-1.2; p = 0.01) and LVI (HR, 21.768; 95% CI, 3-135; p = 0.001) were significant negative predictors of OS. Using a competing risk analysis and Gray's test of equality, LVI (+) and LVI (-) cases did not differ significantly in cumulative incidence of recurrence (p = 0.8118) but differed markedly in cumulative incidence of mortality over time (p = 0.0029). Conclusions: The presence of LVI in the setting of high-grade, localized osteosarcoma is associated with greater mortality and portends a dismal prognosis.
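Survival probabilities like the five- and ten-year OS estimates above are typically read off a Kaplan-Meier curve. As a minimal sketch of the product-limit estimator (with made-up follow-up data, not the study's):

```python
def kaplan_meier(times, events):
    """Product-limit (Kaplan-Meier) survival estimate.

    times  : follow-up time for each subject
    events : 1 if the event (death) occurred, 0 if censored
    Returns a list of (time, survival probability) steps.
    """
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])  # sort by follow-up time
    at_risk = n
    surv = 1.0
    curve = []
    i = 0
    while i < n:
        t = times[order[i]]
        deaths = 0
        removed = 0
        # Group ties at the same time point.
        while i < n and times[order[i]] == t:
            deaths += events[order[i]]
            removed += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= removed
    return curve

# Toy data: deaths at t=1, 3, 5; censoring at t=2, 4.
print(kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 0, 1]))
```

Each step multiplies the running survival by (1 - deaths/at-risk) at an event time; censored subjects leave the risk set without producing a step.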

BMJ Open, 2017, Vol 7 (9), pp. e015101
Author(s):  
Hsien-Feng Lin ◽  
Kuan-Fu Liao ◽  
Ching-Mei Chang ◽  
Cheng-Li Lin ◽  
Shih-Wei Lai

Objective: This study aimed to investigate the association between splenectomy and empyema in Taiwan. Methods: A population-based cohort study was conducted using the hospitalisation dataset of the Taiwan National Health Insurance Program. A total of 13 193 subjects aged 20–84 years who newly underwent splenectomy from 2000 to 2010 were enrolled in the splenectomy group, and 52 464 randomly selected subjects without splenectomy were enrolled in the non-splenectomy group. The groups were matched by sex, age, comorbidities and the index year of splenectomy. The incidence of empyema at the end of 2011 was calculated. A multivariable Cox proportional hazards regression model was used to estimate the HR with 95% CI of empyema associated with splenectomy and other comorbidities. Results: The overall incidence rate of empyema was 2.56-fold higher in the splenectomy group than in the non-splenectomy group (8.85 vs 3.46 per 1000 person-years). Kaplan-Meier analysis revealed a higher cumulative incidence of empyema in the splenectomy group than in the non-splenectomy group (6.99% vs 3.37% at the end of follow-up). After adjusting for confounding variables, the adjusted HR of empyema was 2.89 for the splenectomy group compared with the non-splenectomy group. Further analysis revealed that the HR of empyema was 4.52 for subjects with splenectomy alone. Conclusion: The incidence rate ratio between the splenectomy and non-splenectomy groups decreased from 2.87 in the first 5 years of follow-up to 1.73 in the period after 5 years. Future studies are required to confirm whether a longer follow-up period would further reduce this average ratio. For the splenectomy group, the overall HR of developing empyema was 2.89 after adjusting for age, sex and the comorbidities identified from previous literature. The risk of empyema following splenectomy remains high even in the absence of these comorbidities.
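The incidence rates and the 2.56-fold ratio reported above follow directly from events divided by person-time. A small sketch (function names are ours; the event counts below are hypothetical values chosen only to reproduce the reported rates):

```python
def incidence_rate(events, person_years, per=1000):
    """Crude incidence rate, expressed per `per` person-years."""
    return events / person_years * per

def rate_ratio(rate_exposed, rate_unexposed):
    """Ratio of two incidence rates."""
    return rate_exposed / rate_unexposed

# Hypothetical counts (not from the paper) that yield the published rates:
splenectomy_rate = incidence_rate(177, 20000)    # 8.85 per 1000 person-years
control_rate = incidence_rate(519, 150000)       # 3.46 per 1000 person-years
print(round(rate_ratio(splenectomy_rate, control_rate), 2))
```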


Author(s):  
Annina Ropponen ◽  
Jurgita Narusyte ◽  
Mo Wang ◽  
Sanna Kärkkäinen ◽  
Lisa Mather ◽  
...  

Abstract Purpose: To investigate associations between social benefits and disability pension (DP), long-term sickness absence (LTSA, ≥ 90 days), or unemployment among Swedish twins with sickness absence (SA) due to mental diagnoses. Methods: This population-based prospective twin study included register data on the first incident SA spell (< 90 days) due to mental diagnoses (ICD-10 codes F00-F99) during the follow-up 2005–2016. SA < 90 days due to diagnoses other than mental diagnoses, or receipt of any other social insurance benefit, was identified for the year preceding the first incident SA spell due to mental diagnoses (coded yes/no). Those with any previous social benefit were compared with those without, using cumulative incidence curves to compare time to event and Cox proportional hazards models for cause-specific hazard ratios (HR, 95% confidence intervals, CI), treating first incident DP, LTSA, and unemployment as competing risks. Results: During follow-up, 21 DP, 1619 LTSA, and 808 unemployment events took place. Compared with those without, those with at least one benefit had a higher risk of DP (HR 5.03; 95% CI 1.80, 14.01), LTSA (1.67; 1.50, 1.84) and unemployment (1.24; 1.03, 1.50). The cumulative incidence of DP was very low (< 1%); for LTSA it was 80% with any previous social benefit vs. 60% without, and for unemployment it was ≤ 5%. Conclusion: Social benefits received during the year preceding SA due to mental diagnoses (< 90 days) predict DP, LTSA, and unemployment. Hence, previous social benefits may provide a means for early identification of persons at risk of exit from the labor market.
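Treating DP, LTSA, and unemployment as competing risks means estimating cause-specific cumulative incidence rather than one-minus-Kaplan-Meier. A minimal Aalen-Johansen-style sketch (toy data, not the register data):

```python
def cumulative_incidence(times, causes):
    """Aalen-Johansen cumulative incidence under competing risks.

    times  : follow-up times
    causes : 0 = censored, k >= 1 = event of cause k
    Returns a dict mapping cause -> list of (time, cumulative incidence) steps.
    """
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])
    at_risk = n
    overall_surv = 1.0  # event-free survival from all causes combined
    cif = {}
    running = {}
    i = 0
    while i < n:
        t = times[order[i]]
        counts = {}
        removed = 0
        while i < n and times[order[i]] == t:  # group ties
            c = causes[order[i]]
            if c:
                counts[c] = counts.get(c, 0) + 1
            removed += 1
            i += 1
        for c, d in counts.items():
            # Increment: probability of surviving to t, then failing from cause c.
            running[c] = running.get(c, 0.0) + overall_surv * d / at_risk
            cif.setdefault(c, []).append((t, running[c]))
        total_events = sum(counts.values())
        if total_events:
            overall_surv *= 1.0 - total_events / at_risk
        at_risk -= removed
    return cif

# Toy data: cause-1 events at t=1 and t=4, a cause-2 event at t=2, censoring at t=3.
print(cumulative_incidence([1, 2, 3, 4], [1, 2, 0, 1]))
```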


2018, Vol 77 (7), pp. 1048-1052
Author(s):  
Nicola Dalbeth ◽  
Amanda Phipps-Green ◽  
Christopher Frampton ◽  
Tuhina Neogi ◽  
William J Taylor ◽  
...  

Objectives: To provide estimates of the cumulative incidence of gout according to baseline serum urate. Methods: Using individual participant data from four publicly available cohorts (the Atherosclerosis Risk in Communities Study, the Coronary Artery Risk Development in Young Adults Study, and both the Original and Offspring cohorts of the Framingham Heart Study), the cumulative incidence of clinically evident gout was calculated according to baseline serum urate category. Cox proportional hazards modelling was used to evaluate the relation of baseline urate categories to risk of incident gout. Results: This analysis included 18 889 participants who were gout-free at baseline, with mean (SD) 11.2 (4.2) years and 212 363 total patient-years of follow-up. The cumulative incidence at each time point varied according to baseline serum urate concentration, with 15-year cumulative incidence (95% CI) ranging from 1.1% (0.9 to 1.4) for <6 mg/dL to 49% (31 to 67) for ≥10 mg/dL. Compared with baseline serum urate <6 mg/dL, the adjusted HR for baseline serum urate 6.0–6.9 mg/dL was 2.7; for 7.0–7.9 mg/dL, 6.6; for 8.0–8.9 mg/dL, 15; for 9.0–9.9 mg/dL, 30; and for ≥10 mg/dL, 64. Conclusions: Serum urate level is a strong non-linear concentration-dependent predictor of incident gout. Nonetheless, only about half of those with serum urate concentrations ≥10 mg/dL develop clinically evident gout over 15 years, implying a role for prolonged hyperuricaemia and additional factors in the pathogenesis of gout.
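The baseline urate bands above (<6.0 up through ≥10.0 mg/dL) are right-open 1 mg/dL bins; a small helper (ours, for illustration only) assigns a measurement to its band:

```python
import bisect

# Category boundaries (mg/dL) matching the bands used in the analysis.
URATE_CUTS = [6.0, 7.0, 8.0, 9.0, 10.0]
URATE_LABELS = ["<6.0", "6.0-6.9", "7.0-7.9", "8.0-8.9", "9.0-9.9", ">=10.0"]

def urate_category(urate_mg_dl):
    """Map a baseline serum urate value to its analysis category."""
    return URATE_LABELS[bisect.bisect_right(URATE_CUTS, urate_mg_dl)]

print(urate_category(5.4), urate_category(6.0), urate_category(10.2))
```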


2018, Vol 13 (4), pp. 628-637
Author(s):  
Laura C. Plantinga ◽  
Raymond J. Lynch ◽  
Rachel E. Patzer ◽  
Stephen O. Pastan ◽  
C. Barrett Bowling

Background and objectives: Serious fall injuries in the setting of ESKD may be associated with poor access to kidney transplant. We explored the burden of serious fall injuries among patients on dialysis and patients on the deceased donor waitlist, and the associations of these fall injuries with waitlisting and transplantation. Design, setting, participants, & measurements: Our analytic cohorts for the outcomes of (1) waitlisting and (2) transplantation included United States adults ages 18–80 years old who (1) initiated dialysis (n=183,047) and (2) were waitlisted for the first time (n=37,752) in 2010–2013. Serious fall injuries were determined by diagnostic codes for falls plus injury (fracture, joint dislocation, or head trauma) in inpatient and emergency department claims; the first serious fall injury after cohort entry was included as a time-varying exposure. Follow-up ended at the specified outcome, death, or the last date of follow-up (September 30, 2014). We used multivariable Cox proportional hazards models to determine the independent associations between serious fall injury and waitlisting or transplantation. Results: Overall, the 2-year cumulative incidence of serious fall injury was 6% among patients on incident dialysis; with adjustment, patients who had serious fall injuries were 61% less likely to be waitlisted than patients who did not (hazard ratio, 0.39; 95% confidence interval, 0.35 to 0.44). Among incident waitlisted patients (4% 2-year cumulative incidence), those with serious fall injuries were 29% less likely than their counterparts to be subsequently transplanted (hazard ratio, 0.71; 95% confidence interval, 0.63 to 0.80). Conclusions: Serious fall injuries among United States patients on dialysis are associated with a substantially lower likelihood of waitlisting for and receipt of a kidney transplant. Podcast: This article contains a podcast at https://www.asn-online.org/media/podcast/CJASN/2018_03_06_CJASNPodcast_18_4_P.mp3
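Including the first serious fall injury as a time-varying exposure is commonly handled by splitting each subject's follow-up into counting-process (start, stop] rows at the exposure date. A minimal sketch (the field layout is ours, not the authors' dataset):

```python
def split_time_varying(entry, exit, event, exposure_time=None):
    """Split one subject's follow-up into counting-process rows
    (start, stop, event, exposed) for a time-varying binary exposure.

    entry, exit   : follow-up interval (e.g., months)
    event         : 1 if the outcome occurred at `exit`, else 0
    exposure_time : time of first exposure (None if never exposed)
    """
    if exposure_time is None or exposure_time >= exit:
        # Never exposed during follow-up: one unexposed row.
        return [(entry, exit, event, 0)]
    if exposure_time <= entry:
        # Exposed before entry: one exposed row.
        return [(entry, exit, event, 1)]
    # Exposure mid-follow-up: unexposed row, then exposed row carrying the event.
    return [(entry, exposure_time, 0, 0), (exposure_time, exit, event, 1)]

# Subject followed 0-24 months, fall injury at month 10, outcome at month 24:
print(split_time_varying(0, 24, 1, 10))
```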


2013, Vol 31 (4_suppl), pp. 277-277
Author(s):  
Ryan Thomas Groeschl ◽  
T. Clark Gamblin ◽  
Kiran Turaga

Background: Although many previous studies of ablation outcomes for hepatocellular carcinoma (HCC) have dichotomized tumor size at a 3-cm cutoff to determine prognostic significance, a growing number of reports describe excellent outcomes for larger tumors. To assess the validity of this somewhat arbitrary 3-cm cutoff, we stratified patients by 1-cm tumor size intervals and hypothesized that disease-specific survival (DSS) would not vary significantly between adjacent groups. Methods: Patients treated with local ablation for T1 HCC (≤8 cm) were identified from the Surveillance, Epidemiology, and End Results database (2004-2008). Log-rank tests were used to compare DSS curves of adjacent study groups, and multivariable Cox proportional hazards models were used to adjust for confounding variables. Results: There were 1,093 patients included in the study (26% female; median age, 62 years). The 3-year DSS was significantly lower in patients with 3-4 cm tumors than in those with 2-3 cm tumors (58% vs 72%, p=0.002, Table). In adjusted models, DSS did not vary significantly between any size intervals up to 3 cm. Patients with 3-4 cm tumors, however, had a poorer prognosis than patients with 2-3 cm tumors (hazard ratio: 1.60; 95% confidence interval: 1.18-2.18; p=0.002). DSS also fell significantly when tumor size increased from 5-6 cm to 6-7 cm (53% vs 21%, p=0.006). Age and alpha-fetoprotein level were also independently predictive of DSS in most multivariable models; however, the presence or absence of cirrhosis was not predictive in any model (smallest p=0.382). Conclusions: This study supports the use of a 3-cm breakpoint when studying outcomes after ablation for HCC. Although some have advocated that ablation is more successful in patients with cirrhosis, we found no evidence for this in our study. [Table: see text]


2020, Vol 132 (4), pp. 998-1005
Author(s):  
Haihui Jiang ◽  
Yong Cui ◽  
Xiang Liu ◽  
Xiaohui Ren ◽  
Mingxiao Li ◽  
...  

OBJECTIVE: The aim of this study was to investigate the relationship between extent of resection (EOR) and survival in terms of clinical, molecular, and radiological factors in high-grade astrocytoma (HGA). METHODS: Clinical and radiological data from 585 cases of molecularly defined HGA were reviewed. In each case, the EOR was evaluated twice: once according to contrast-enhanced T1-weighted images (CE-T1WI) and once according to fluid-attenuated inversion recovery (FLAIR) images. The ratio of the volume of the region of abnormality in FLAIR images to that in CE-T1WI (VFLAIR/VCE-T1WI) was calculated, and a receiver operating characteristic curve was used to determine the optimal cutoff value for that ratio. Univariate and multivariate analyses were performed to identify the prognostic value of each factor. RESULTS: Both the EOR evaluated from CE-T1WI and the EOR evaluated from FLAIR divided the whole cohort into 4 subgroups with different survival outcomes (p < 0.001). Cases were stratified into 2 subtypes based on VFLAIR/VCE-T1WI with a cutoff of 10: a proliferation-dominant subtype and a diffusion-dominant subtype. Kaplan-Meier analysis showed a significant survival advantage for the proliferation-dominant subtype (p < 0.0001). The prognostic implication was further confirmed in the Cox proportional hazards model (HR 1.105, 95% CI 1.078–1.134, p < 0.0001). The survival of patients with proliferation-dominant HGA was significantly prolonged in association with extensive resection of the FLAIR abnormality region beyond the contrast-enhancing tumor (p = 0.03), while no survival benefit was observed in association with extensive resection in the diffusion-dominant subtype (p = 0.86). CONCLUSIONS: VFLAIR/VCE-T1WI is an important classifier that divides HGA into 2 subtypes with distinct invasive features. Patients with proliferation-dominant HGA can benefit from extensive resection of the FLAIR abnormality region, which provides the theoretical basis for a personalized resection strategy.
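An ROC-derived threshold such as the cutoff of 10 for VFLAIR/VCE-T1WI is the kind of value obtained by maximizing Youden's J over candidate cutoffs. A generic sketch with toy ratios (the authors' actual cutoff came from their own ROC analysis, not this code):

```python
def youden_cutoff(values, labels):
    """Return (cutoff, J) maximizing Youden's J = sensitivity + specificity - 1,
    predicting class 1 when value >= cutoff."""
    pos = sum(labels)
    neg = len(labels) - pos
    best_c, best_j = None, -1.0
    for c in sorted(set(values)):  # every observed value is a candidate cutoff
        tp = sum(1 for v, y in zip(values, labels) if v >= c and y == 1)
        tn = sum(1 for v, y in zip(values, labels) if v < c and y == 0)
        j = tp / pos + tn / neg - 1.0
        if j > best_j:
            best_c, best_j = c, j
    return best_c, best_j

# Toy volume ratios: class 1 = high-ratio (diffusion-dominant-like) cases.
print(youden_cutoff([2, 3, 4, 12, 15], [0, 0, 0, 1, 1]))
```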


Nutrients, 2021, Vol 13 (3), pp. 1034
Author(s):  
Vincenza Gianfredi ◽  
Annemarie Koster ◽  
Anna Odone ◽  
Andrea Amerio ◽  
Carlo Signorelli ◽  
...  

Our aim was to assess the association between a priori defined dietary patterns and incident depressive symptoms. We used data from The Maastricht Study, a population-based cohort study (n = 2646; mean (SD) age 59.9 (8.0) years; 49.5% women; 15,188 person-years of follow-up). Levels of adherence to the Dutch Healthy Diet (DHD), the Mediterranean Diet, and Dietary Approaches To Stop Hypertension (DASH) were derived from a validated Food Frequency Questionnaire. Depressive symptoms were assessed at baseline and annually over seven years of follow-up (using the 9-item Patient Health Questionnaire). We used Cox proportional hazards regression analyses to assess the association between dietary patterns and depressive symptoms. A one standard deviation (SD) higher adherence to the DHD and DASH was associated with a lower hazard of depressive symptoms, with HRs (95% CI) of 0.78 (0.69–0.89) and 0.87 (0.77–0.98), respectively, after adjustment for sociodemographic and cardiovascular risk factors. After further adjustment for lifestyle factors, the HR per one SD higher DHD adherence was 0.83 (0.73–0.96), whereas adherence to the Mediterranean and DASH diets was not associated with incident depressive symptoms. Higher adherence to the DHD was associated with a lower risk of incident depressive symptoms. Adherence to a healthy diet could be an effective non-pharmacological preventive measure to reduce the incidence of depression.
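A "per one SD higher adherence" hazard ratio comes from standardizing the diet score before it enters the Cox model, so the coefficient is expressed per 1 SD. A minimal stdlib sketch (function name ours):

```python
import statistics

def per_sd(values):
    """Standardize a diet-adherence score to mean 0, SD 1, so that a
    Cox regression coefficient on it is interpreted per 1 SD."""
    mu = statistics.mean(values)
    sd = statistics.stdev(values)
    return [(v - mu) / sd for v in values]

print(per_sd([1, 2, 3]))
```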


2021, Vol 21 (1)
Author(s):  
Maryam Farhadian ◽  
Sahar Dehdar Karsidani ◽  
Azadeh Mozayanimonfared ◽  
Hossein Mahjub

Abstract Background: Due to the limited number of studies with long-term follow-up of patients undergoing Percutaneous Coronary Intervention (PCI), we investigated the occurrence of Major Adverse Cardiac and Cerebrovascular Events (MACCE) during 10 years of follow-up after coronary angioplasty using Random Survival Forest (RSF) and Cox proportional hazards models. Methods: This retrospective cohort study was performed on 220 patients (69 women and 151 men) who underwent coronary angioplasty from March 2009 to March 2012 at Farchshian Medical Center in Hamadan, Iran. Survival time (months), the response variable, was measured from the date of angioplasty to the main endpoint or the end of the follow-up period (September 2019). To identify the factors influencing the occurrence of MACCE, the performance of the Cox and RSF models was compared in terms of the C index, Integrated Brier Score (IBS) and prediction error. Results: Ninety-six patients (43.7%) experienced MACCE by the end of the follow-up period, and the median survival time was estimated to be 98 months. Survival decreased from 99% during the first year to 39% at 10 years of follow-up. In the Cox model, the predictors were age (HR = 1.03, 95% CI 1.01–1.05), diabetes (HR = 2.17, 95% CI 1.29–3.66), smoking (HR = 2.41, 95% CI 1.46–3.98), and stent length (HR = 1.74, 95% CI 1.11–2.75). Predictive performance was slightly better for the RSF model (IBS of 0.124 vs. 0.135, C index of 0.648 vs. 0.626 and out-of-bag error rate of 0.352 vs. 0.374 for RSF). In addition to age, diabetes, smoking, and stent length, RSF also identified coronary artery disease (acute or chronic) and hyperlipidemia among the most important variables. Conclusion: Machine-learning prediction models such as RSF showed better performance than the Cox proportional hazards model for the prediction of MACCE during long-term follow-up after PCI.
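The C index used above to compare the Cox and RSF models is Harrell's concordance: among comparable patient pairs, the fraction in which the model assigns the higher risk to the patient who fails earlier. A minimal sketch (toy data; ties in event time are not specially handled):

```python
def concordance_index(times, events, risk_scores):
    """Harrell's C index for right-censored data.

    A pair (i, j) is comparable when subject i has an observed event
    and subject j's follow-up extends past i's event time.
    """
    num, den = 0.0, 0
    n = len(times)
    for i in range(n):
        if not events[i]:
            continue  # censored subjects cannot anchor a comparable pair
        for j in range(n):
            if times[j] > times[i]:  # i failed first, j still at risk
                den += 1
                if risk_scores[i] > risk_scores[j]:
                    num += 1.0          # concordant
                elif risk_scores[i] == risk_scores[j]:
                    num += 0.5          # tied scores count half
    return num / den

# Perfectly ranked toy cohort: higher score = earlier event.
print(concordance_index([1, 2, 3, 4], [1, 1, 0, 0], [4, 3, 2, 1]))
```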


Author(s):  
Yuko Yamaguchi ◽  
Marta Zampino ◽  
Toshiko Tanaka ◽  
Stefania Bandinelli ◽  
Yusuke Osawa ◽  
...  

Abstract Background: Anemia is common in older adults and associated with greater morbidity and mortality. The causes of anemia in older adults have not been completely characterized. Although elevated circulating growth and differentiation factor 15 (GDF-15) has been associated with anemia in older adults, it is not known whether elevated GDF-15 predicts the development of anemia. Methods: We examined the relationship between plasma GDF-15 concentrations at baseline and incident anemia during 15 years of follow-up in 708 non-anemic adults, aged 60 years and older, participating in the Invecchiare in Chianti (InCHIANTI) Study. Results: During follow-up, 179 (25.3%) participants developed anemia. The proportion of participants who developed anemia from the lowest to highest quartile of plasma GDF-15 was 12.9%, 20.1%, 21.2%, and 45.8%, respectively. Adults in the highest quartile of plasma GDF-15 had an increased risk of developing anemia (Hazard Ratio 1.15, 95% Confidence Interval 1.09, 1.21, P < .0001) compared with those in the lower three quartiles, in a multivariable Cox proportional hazards model adjusting for age, sex, serum iron, soluble transferrin receptor, ferritin, vitamin B12, congestive heart failure, diabetes mellitus, and cancer. Conclusions: Circulating GDF-15 is an independent predictor of the development of anemia in older adults.
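Quartile groupings like the GDF-15 quartiles above are formed by cutting the cohort's distribution at its three quartile points; a small stdlib sketch (helper name ours, toy data):

```python
import bisect
import statistics

def quartile_of(value, data):
    """Assign `value` to quartile 1-4 of the distribution in `data`."""
    cuts = statistics.quantiles(data, n=4)  # three quartile cut points
    return bisect.bisect_right(cuts, value) + 1

# Toy "biomarker" distribution: the integers 1..100.
data = list(range(1, 101))
print(quartile_of(10, data), quartile_of(60, data), quartile_of(99, data))
```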

