Mortality of Japanese Olympic athletes in the 1964 Tokyo Olympic Games

2021 ◽  
Vol 7 (1) ◽  
pp. e000896
Author(s):  
Taro Takeuchi ◽  
Yuri Kitamura ◽  
Soya Ishizuka ◽  
Sachiko Yamada ◽  
Hiroshi Aono ◽  
...  

Objectives To compare the mortality of Japanese athletes in the 1964 Tokyo Olympic Games with that of the Japanese population, and to elucidate factors associated with their mortality. Methods We obtained study subjects' biographical information, lifestyle information and medical data from the Japan Sport Association; missing data were obtained from online databases. The standardised mortality ratio (SMR) was calculated to compare athletes' mortality with that of the Japanese population. A Cox proportional hazards model was applied to estimate the HR for each category of body mass index (BMI), smoking history and handgrip strength; this analysis was limited to male athletes because of the small number of female athletes. Results Among 342 athletes (283 men, 59 women), deaths were confirmed for 70 (64 men, 6 women) between September 1964 and December 2017. Total person-years were 15,974.8, and the SMR was 0.64 (95% CI 0.50 to 0.81). Multivariate analysis was performed on 181 male athletes. Mortality was significantly higher for BMI ≥25 kg/m² than for BMI 21–23 kg/m² (HR: 3.03, 95% CI 1.01 to 9.07). We found no statistically significant association between smoking history and mortality; the HRs (95% CI) for occasional and daily smokers were 0.82 (0.26 to 2.57) and 1.30 (0.55 to 3.03) compared with never smokers. We also found no statistically significant association between handgrip strength and mortality (P for trend: 0.51). Conclusion Japanese athletes in the 1964 Tokyo Olympic Games lived longer than the Japanese population. BMI ≥25 kg/m² was associated with higher mortality, but smoking history and handgrip strength were not associated with mortality.
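For readers unfamiliar with the SMR calculation, the following Python sketch shows how an SMR and its exact Poisson 95% CI are typically obtained. It is not the authors' code, and the expected number of deaths below is back-calculated from the reported SMR purely for illustration.

```python
# Sketch: standardised mortality ratio with an exact Poisson 95% CI.
# `expected` would normally come from applying national mortality rates
# to the cohort's person-years; here it is back-calculated and illustrative.
from scipy.stats import chi2

def smr_with_ci(observed: int, expected: float, alpha: float = 0.05):
    """Return (SMR, lower, upper) using the exact chi-square/Poisson relation."""
    smr = observed / expected
    lower = chi2.ppf(alpha / 2, 2 * observed) / (2 * expected) if observed > 0 else 0.0
    upper = chi2.ppf(1 - alpha / 2, 2 * (observed + 1)) / (2 * expected)
    return smr, lower, upper

print(smr_with_ci(observed=70, expected=109.4))  # roughly reproduces SMR 0.64
```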

2012 ◽  
Vol 30 (4_suppl) ◽  
pp. 294-294 ◽  
Author(s):  
Linh My Alejandro ◽  
Nelly G. Adel ◽  
Eileen Mary O'Reilly ◽  
Elyn Riedel ◽  
Mario E. Lacouture

294 Background: Rash is a common adverse event of E, an epidermal growth factor receptor (EGFR) inhibitor approved for advanced pancreatic cancer (PC). Clinical trial results have shown that E-related rash of grade 2 or higher is associated with a survival benefit in PC. We examined the correlation between all grades of rash and overall survival (OS) in PC patients receiving E at MSKCC. Methods: This was a retrospective single-institution study that reviewed all PC patients treated with E between March 1, 2005 and December 15, 2009 at MSKCC. The association of rash development with OS was examined using a Cox proportional hazards model, with rash development as a time-dependent covariate. Associations were examined univariately and after adjusting for gender, race, smoking history, prior lines of treatment for metastatic disease, and chemotherapy. An intervention was defined as a dose change, interruption, discontinuation or medical intervention for rash. Results: The analysis cohort comprised 193 patients. The median age was 64; 116 (60%) were male and 162 (84%) were Caucasian. Most patients (n=111, 58%) had not received any prior medical treatment for pancreatic cancer. Skin rash occurred in 113 (59%) patients. The median OS was 6.7 months (95% confidence interval, 5.7-7.9 months). In univariate analysis, rash was protective compared with no skin rash (grade 1 HR 0.71, 95% CI 0.50-1.00; grade 2+ HR 0.57, 95% CI 0.40-0.82; P=0.007). In the multivariate model, rash appeared to have a protective effect on survival, but this was not statistically significant (grade 1 HR 0.69, 95% CI 0.49-0.97; grade 2+ HR 0.78, 95% CI 0.48-1.27). Non-medical interventions for rash included dose adjustment (5%), dose interruption (6%) and dose discontinuation (9%); 33 (29%) patients received a medical intervention for rash. Conclusions: Our findings suggest that grade 1 or higher E-related rash may be a surrogate for survival. Appropriate symptom interventions are recommended to enhance patient comfort and avoid discontinuation of treatment.
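The time-dependent handling of rash described in the Methods can be sketched with the lifelines library as follows. This is illustrative code on invented data, not the MSKCC analysis, and the column names are assumptions.

```python
# Sketch: Cox model with rash onset as a time-dependent covariate (lifelines).
# Each patient contributes (start, stop] intervals; `rash` flips to 1 at onset.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

long_df = pd.DataFrame({
    "id":    [1, 1, 2, 3, 3, 4],
    "start": [0.0, 1.2, 0.0, 0.0, 2.0, 0.0],
    "stop":  [1.2, 6.5, 4.0, 2.0, 9.1, 7.8],  # months of follow-up
    "rash":  [0,   1,   0,   0,   1,   0],    # 1 after rash is first observed
    "event": [0,   1,   1,   0,   0,   1],    # death in the final interval only
})

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", event_col="event", start_col="start", stop_col="stop")
ctv.print_summary()  # hazard ratio for `rash` with its 95% CI
```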


Circulation ◽  
2020 ◽  
Vol 142 (Suppl_3) ◽  
Author(s):  
Morgan Harloff ◽  
Laura Piechura ◽  
Farhang Yazdchi ◽  
Mohamed Keshk ◽  
Hunbo Shim ◽  
...  

Introduction: Prolonged cardiopulmonary resuscitation (CPR) duration remains a source of apprehension with regard to the acceptance of donor hearts for orthotopic heart transplantation (OHT). Unfortunately, many of these organs are declined because of concern for adverse outcomes after OHT, further straining an already limited donor pool. Nevertheless, donor hearts with a history of prolonged CPR may represent an opportunity to expand the donor pool for patients with end-stage heart failure on the waiting list for OHT. Therefore, we sought to examine the duration of donor CPR and its impact on recipient survival after OHT. Methods: The United Network for Organ Sharing (UNOS) database was retrospectively queried to identify all adult patients who underwent first-time OHT between 2000 and 2019 from a donor who had experienced cardiac arrest with a quantified downtime duration. The population was divided into five groups with a granular focus on longer downtimes: donors with CPR < 30 minutes, 30-39 minutes, 40-49 minutes, 50-59 minutes, and ≥ 60 minutes. The primary outcome of interest was post-transplant survival. Kaplan-Meier analysis was used to compare recipient survival between groups after OHT. Results: In total, 7,470 patients were identified during the study period. Overall survival by Kaplan-Meier analysis was not statistically different among the five groups (p=0.69) (Figure 1). In a Cox proportional-hazards model, duration of CPR was not associated with survival (HR 1.00, p=0.56). Significant predictors of mortality included donor age (HR 1.01, p=0.013), donor smoking history (HR 1.11, p<0.005), and recipient diabetes (HR 1.27, p<0.0001). Conclusions: These findings suggest that, for hearts determined appropriate for transplant, the duration of CPR performed on the donor does not significantly impact survival after OHT. Therefore, donor hearts with a prolonged downtime should be fully evaluated for OHT to maximize the donor pool.
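The group comparison described above (Kaplan-Meier curves plus a k-sample log-rank test across CPR-duration strata) can be sketched in Python with lifelines. The data below are simulated for illustration and do not come from UNOS.

```python
# Sketch: Kaplan-Meier curves and a k-sample log-rank test across
# donor-CPR-duration groups (simulated data, not the UNOS cohort).
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter
from lifelines.statistics import multivariate_logrank_test

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "years":     rng.exponential(scale=8.0, size=n),   # time to death or censoring
    "died":      rng.integers(0, 2, size=n),            # 1 = death observed
    "cpr_group": rng.choice(["<30", "30-39", "40-49", "50-59", ">=60"], size=n),
})

ax = plt.subplot(111)
for name, grp in df.groupby("cpr_group"):
    KaplanMeierFitter().fit(grp["years"], grp["died"], label=name).plot_survival_function(ax=ax)

result = multivariate_logrank_test(df["years"], df["cpr_group"], df["died"])
print(result.p_value)   # the study's analogous comparison gave p = 0.69
```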


Crisis ◽  
2018 ◽  
Vol 39 (1) ◽  
pp. 27-36 ◽  
Author(s):  
Kuan-Ying Lee ◽  
Chung-Yi Li ◽  
Kun-Chia Chang ◽  
Tsung-Hsueh Lu ◽  
Ying-Yeh Chen

Abstract. Background: We investigated the age at exposure to parental suicide and the risk of subsequent suicide completion in young people. The impact of parental and offspring sex was also examined. Method: Using a cohort study design, we linked Taiwan's Birth Registry (1978–1997) with Taiwan's Death Registry (1985–2009) and identified 40,249 children who had experienced maternal suicide (n = 14,431), paternal suicide (n = 26,887), or the suicide of both parents (n = 281). Each exposed child was matched to 10 children of the same sex and birth year whose parents were still alive. This yielded a total of 398,081 children for our non-exposed cohort. A Cox proportional hazards model was used to compare the suicide risk of the exposed and non-exposed groups. Results: Compared with the non-exposed group, offspring who were exposed to parental suicide were 3.91 times (95% confidence interval [CI] = 3.10–4.92) more likely to die by suicide after adjusting for baseline characteristics. The risk of suicide seemed to be lower in older male offspring (HR = 3.94, 95% CI = 2.57–6.06) but higher in older female offspring (HR = 5.30, 95% CI = 3.05–9.22). Stratified analyses based on parental sex revealed patterns similar to those of the combined analysis. Limitations: As only register-based data were used, we were not able to explore the impact of variables not contained in the data set, such as the role of mental illness. Conclusion: Our findings suggest a prominent elevation in the risk of suicide among offspring who lost their parents to suicide. The risk elevation differed according to the sex of the afflicted offspring as well as their age at exposure.
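A matched exposed/non-exposed design like this one is often analysed with a Cox model stratified on the matching set. The lifelines sketch below illustrates that idea on a toy dataset; it is not the registry analysis, and all values are invented.

```python
# Sketch: Cox model stratified on the matching set, so each exposed child is
# compared only with its own matched controls (toy data, lifelines).
import pandas as pd
from lifelines import CoxPHFitter

df = pd.DataFrame({
    "match_id": [1, 1, 1, 2, 2, 2, 3, 3, 3],   # exposed child + its matched controls
    "exposed":  [1, 0, 0, 1, 0, 0, 1, 0, 0],   # 1 = parent died by suicide
    "years":    [5, 15, 20, 8, 22, 10, 12, 6, 18],
    "suicide":  [1, 1, 0, 1, 0, 1, 0, 1, 0],   # 1 = offspring suicide observed
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="suicide", strata=["match_id"])
cph.print_summary()   # HR for `exposed`, analogous to the reported 3.91
```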


2020 ◽  
Vol 132 (4) ◽  
pp. 998-1005 ◽  
Author(s):  
Haihui Jiang ◽  
Yong Cui ◽  
Xiang Liu ◽  
Xiaohui Ren ◽  
Mingxiao Li ◽  
...  

OBJECTIVE The aim of this study was to investigate the relationship between extent of resection (EOR) and survival in terms of clinical, molecular, and radiological factors in high-grade astrocytoma (HGA). METHODS Clinical and radiological data from 585 cases of molecularly defined HGA were reviewed. In each case, the EOR was evaluated twice: once according to contrast-enhanced T1-weighted images (CE-T1WI) and once according to fluid-attenuated inversion recovery (FLAIR) images. The ratio of the volume of the region of abnormality on FLAIR images to that on CE-T1WI (VFLAIR/VCE-T1WI) was calculated, and a receiver operating characteristic curve was used to determine the optimal cutoff value for that ratio. Univariate and multivariate analyses were performed to identify the prognostic value of each factor. RESULTS Both the EOR evaluated from CE-T1WI and the EOR evaluated from FLAIR divided the whole cohort into 4 subgroups with different survival outcomes (p < 0.001). Cases were stratified into 2 subtypes based on VFLAIR/VCE-T1WI with a cutoff of 10: a proliferation-dominant subtype and a diffusion-dominant subtype. Kaplan-Meier analysis showed a significant survival advantage for the proliferation-dominant subtype (p < 0.0001). The prognostic implication was further confirmed in the Cox proportional hazards model (HR 1.105, 95% CI 1.078–1.134, p < 0.0001). The survival of patients with proliferation-dominant HGA was significantly prolonged in association with extensive resection of the FLAIR abnormality region beyond the contrast-enhancing tumor (p = 0.03), while no survival benefit of such extensive resection was observed in the diffusion-dominant subtype (p = 0.86). CONCLUSIONS VFLAIR/VCE-T1WI is an important classifier that divides HGA into 2 subtypes with distinct invasive features. Patients with proliferation-dominant HGA can benefit from extensive resection of the FLAIR abnormality region, which provides the theoretical basis for a personalized resection strategy.
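Choosing a cutoff for a continuous ratio from an ROC curve is commonly done with Youden's J statistic. The sketch below illustrates that step on simulated data, with an assumed binary outcome label, since the abstract does not state which endpoint anchored its ROC analysis.

```python
# Sketch: ROC-based cutoff selection for the V_FLAIR / V_CE-T1WI ratio using
# Youden's J statistic (simulated data; the binary outcome label is assumed).
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(1)
ratio = rng.lognormal(mean=2.0, sigma=0.8, size=300)             # V_FLAIR / V_CE-T1WI
outcome = (ratio + rng.normal(0, 5, size=300) > 10).astype(int)  # assumed endpoint

fpr, tpr, thresholds = roc_curve(outcome, ratio)
best = np.argmax(tpr - fpr)            # Youden's J = sensitivity + specificity - 1
print("optimal cutoff:", thresholds[best])   # the study settled on a cutoff of 10
```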


2020 ◽  
Vol 32 (2) ◽  
pp. 160-167 ◽  
Author(s):  
Alessandro Siccoli ◽  
Victor E. Staartjes ◽  
Marlies P. de Wispelaere ◽  
Marc L. Schröder

OBJECTIVE While it has been established that lumbar discectomy should only be performed after a certain waiting period unless neurological deficits are present, little is known about the association of late surgery with outcome. Using data from a prospective registry, the authors aimed to quantify the association of time to surgery (TTS) with leg pain outcome after lumbar discectomy and to identify a maximum TTS cutoff anchored to the minimum clinically important difference (MCID). METHODS TTS was defined as the time, in weeks, from the onset of leg pain caused by radiculopathy to surgery. MCID was defined as a minimum 30% reduction in the numeric rating scale score for leg pain from baseline to 12 months. A Cox proportional hazards model was used to quantify the association of TTS with MCID attainment. Maximum TTS cutoffs were derived both quantitatively, anchored to the area under the curve (AUC), and qualitatively, based on cutoff-specific MCID rates. RESULTS From a prospective registry, 372 patients who had undergone first-time tubular microdiscectomy were identified; 308 of these patients (83%) achieved the MCID. Attaining the MCID was associated with a shorter TTS (HR 0.718, 95% CI 0.546–0.945, p = 0.018). The effect size was preserved after adjustment for potential confounders. The optimal maximum TTS was estimated at 23.5 weeks based on the AUC, while the cutoff-specific method suggested 24 weeks. Discectomy after this cutoff starts to yield MCID rates under 80%. The 24-week cutoff also coincided with the time point after which the specificity for MCID first drops below 50% and after which the negative predictive value for nonattainment of the MCID first surpasses 20%. CONCLUSIONS The study findings suggest that late lumbar discectomy is linked with poorer patient-reported outcomes and that, in accordance with the literature, a maximum TTS of 6 months should be aimed for.
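The qualitative, cutoff-specific approach to a maximum TTS can be sketched as a simple scan over candidate cutoffs, computing the MCID rate within each window. The data-generating rule below is an assumption used only to produce numbers; it is not the registry data.

```python
# Sketch: cutoff-specific MCID rates for candidate maximum TTS values.
# The data-generating rule below is an assumption used only to produce numbers.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
df = pd.DataFrame({"tts_weeks": rng.exponential(scale=15, size=372).round(1)})
df["mcid"] = (rng.random(372) < np.clip(0.9 - 0.004 * df["tts_weeks"], 0.3, 0.95)).astype(int)

for cutoff in range(8, 41, 4):                     # candidate maximum TTS (weeks)
    subset = df[df["tts_weeks"] <= cutoff]
    print(f"TTS <= {cutoff:2d} wk: MCID rate = {subset['mcid'].mean():.2f} (n={len(subset)})")
```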


2020 ◽  
Vol 30 (Supplement_5) ◽  
Author(s):  
M V Tancredi ◽  
S Sakabe ◽  
C S B Domingues ◽  
G F M Pereira ◽  
E A Waldman

Abstract Background To estimate the median survival time of AIDS patients, with and without tuberculosis (TB), in a cohort in Sao Paulo, Brazil, and to investigate survival predictors. Methods Retrospective cohort study of AIDS patients older than 12 years, registered in the Ministry of Health AIDS surveillance system between 2003 and 2007 and followed until 2014. Survival analysis used the Kaplan-Meier method and a Cox proportional hazards model to estimate hazard ratios (HR) with 95% confidence intervals (95%CI). Results 35,515 patients were included, of whom 4,581 (12.9%) were co-infected with TB. Among the latter, the probability of survival 12 years after AIDS diagnosis was 95.2% for patients receiving at least one third-line ARV (HAART2), 82.9% for those receiving triple therapy (HAART1) and 21.9% for those not on ARV. In the same period, and in the same order of therapeutic regimen, the probability of survival for patients without TB was 95.2%, 90.5%, and 40.9%, respectively. The main factors associated with survival, adjusted for the year of diagnosis, were: living in the city of Sao Paulo (HR = 1.16; 95%CI 1.01-1.32); living away from the capital (HR = 1.43; 95%CI 1.25-1.62) or on the coast (HR = 1.49; 95%CI 1.21-1.82); having TB (HR = 1.70; 95%CI 1.49-1.87); age above 49 years (HR = 1.35; 95%CI 1.18-1.54); black race (HR = 1.27; 95%CI 1.12-1.45); IV drug use (HR = 1.73; 95%CI 1.49-2.02); CD4+ count below 200 cells/mm³ at AIDS diagnosis (HR = 2.31; 95%CI 1.97-2.72); viral load above 500 copies at AIDS diagnosis (HR = 1.99; 95%CI 1.72-2.30); HAART1 regimen (HR = 1.94; 95%CI 1.47-2.55); and no ARV (HR = 8.22; 95%CI 2.95-22.87). Conclusions A large proportion of patients did not receive ARVs or were diagnosed late with AIDS, especially those with TB, whose survival was shorter. Survival is heterogeneous across the state, being lower in regions with higher TB rates. The results point to the need for specific strategies for patients with TB-HIV co-infection. Key messages Tuberculosis is the main cause of death among HIV-infected people, responsible for one third of deaths in this group and having a great impact on the survival of this population. The Brazilian policy of universal access to ARV and TB treatment has increased survival up to 12 years after diagnosis from 22% to 95% in AIDS-TB patients and from 50% to 95% in patients without TB.
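The 12-year survival probabilities quoted above are Kaplan-Meier estimates evaluated at a fixed horizon within subgroups. The sketch below shows that computation with lifelines on simulated data; the group labels follow the abstract, everything else is invented.

```python
# Sketch: Kaplan-Meier survival probability at 12 years (144 months) within
# subgroups (simulated data; group labels follow the abstract).
import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(3)
n = 2000
df = pd.DataFrame({
    "months": rng.exponential(scale=200, size=n).clip(max=144),
    "died":   rng.integers(0, 2, size=n),
    "group":  rng.choice(["TB+/HAART2", "TB+/HAART1", "TB+/no ARV"], size=n),
})
df.loc[df["months"] >= 144, "died"] = 0            # administratively censored at 12 years

for name, grp in df.groupby("group"):
    kmf = KaplanMeierFitter().fit(grp["months"], grp["died"], label=name)
    print(name, round(float(kmf.survival_function_at_times(144).iloc[0]), 3))
```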


Risks ◽  
2021 ◽  
Vol 9 (6) ◽  
pp. 103
Author(s):  
Morne Joubert ◽  
Tanja Verster ◽  
Helgard Raubenheimer ◽  
Willem D. Schutte

Survival analysis is one of the techniques that can be used to predict loss given default (LGD) for regulatory capital (Basel) purposes. When survival analysis is used to model LGD, one proposed methodology is the default weighted survival analysis (DWSA) method. This paper aims to adapt the DWSA method (used to model Basel LGD) to estimate LGD for International Financial Reporting Standard (IFRS) 9 impairment requirements. The DWSA methodology allows for over-recoveries, default weighting and negative cash flows. For IFRS 9, the methodology should be adapted, as the estimated LGD is used in the calculation of the expected credit losses (ECL). Our proposed IFRS 9 LGD methodology makes use of survival analysis to estimate the LGD. The Cox proportional hazards model allows a baseline survival curve to be adjusted to produce survival curves for different segments of the portfolio. The forward-looking LGD values are adjusted for different macro-economic scenarios and the ECL is calculated for each scenario. These ECL values are probability-weighted to produce a final ECL estimate. We illustrate our proposed IFRS 9 LGD methodology and ECL estimation on a dataset from a retail portfolio of a South African bank.
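The final aggregation step, probability-weighting scenario-level ECLs into one estimate, reduces to simple arithmetic. The sketch below assumes a stylised ECL = PD x LGD x EAD per scenario with invented figures; in the paper, the LGD itself comes from the survival-analysis model rather than being a fixed input.

```python
# Sketch: probability-weighted ECL across macro-economic scenarios, assuming a
# stylised ECL = PD * LGD * EAD per scenario; all figures are invented.
scenarios = {
    #            weight,  PD,    LGD,  EAD
    "baseline": (0.60,  0.030, 0.25, 1_000_000.0),
    "upside":   (0.15,  0.020, 0.20, 1_000_000.0),
    "downside": (0.25,  0.055, 0.35, 1_000_000.0),
}

ecl_by_scenario = {name: pd_ * lgd * ead for name, (_, pd_, lgd, ead) in scenarios.items()}
weighted_ecl = sum(w * ecl_by_scenario[name] for name, (w, *_) in scenarios.items())

print(ecl_by_scenario)
print(f"probability-weighted ECL: {weighted_ecl:,.0f}")
```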


2021 ◽  
Vol 21 (1) ◽  
Author(s):  
Maryam Farhadian ◽  
Sahar Dehdar Karsidani ◽  
Azadeh Mozayanimonfared ◽  
Hossein Mahjub

Abstract Background Due to the limited number of studies with long-term follow-up of patients undergoing Percutaneous Coronary Intervention (PCI), we investigated the occurrence of Major Adverse Cardiac and Cerebrovascular Events (MACCE) during 10 years of follow-up after coronary angioplasty using Random Survival Forest (RSF) and Cox proportional hazards models. Methods This retrospective cohort study was performed on 220 patients (69 women and 151 men) who underwent coronary angioplasty from March 2009 to March 2012 at the Farchshian Medical Center in Hamadan, Iran. Survival time (in months), the response variable, was measured from the date of angioplasty to the main endpoint or the end of the follow-up period (September 2019). To identify the factors influencing the occurrence of MACCE, the performance of the Cox and RSF models was assessed in terms of the C index, the Integrated Brier Score (IBS) and the prediction error. Results Ninety-six patients (43.7%) experienced MACCE by the end of the follow-up period, and the median survival time was estimated to be 98 months. Survival decreased from 99% during the first year to 39% at 10 years of follow-up. By applying the Cox model, the predictors were identified as follows: age (HR = 1.03, 95% CI 1.01–1.05), diabetes (HR = 2.17, 95% CI 1.29–3.66), smoking (HR = 2.41, 95% CI 1.46–3.98), and stent length (HR = 1.74, 95% CI 1.11–2.75). The predictive performance of the RSF model was slightly better (IBS 0.124 vs. 0.135, C index 0.648 vs. 0.626, and out-of-bag error rate 0.352 vs. 0.374 for RSF vs. Cox). In addition to age, diabetes, smoking, and stent length, RSF also identified coronary artery disease (acute or chronic) and hyperlipidemia among the most important variables. Conclusion Machine-learning prediction models such as RSF showed better performance than the Cox proportional hazards model for the prediction of MACCE during long-term follow-up after PCI.
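A comparison of RSF and Cox models by concordance index on held-out data can be sketched with scikit-survival as follows. The data are simulated, the four covariates only loosely mirror those in the study, and the IBS and out-of-bag metrics reported in the abstract are not reproduced here.

```python
# Sketch: comparing a random survival forest with a Cox model by concordance
# index on a held-out split (scikit-survival, simulated data).
import numpy as np
from sklearn.model_selection import train_test_split
from sksurv.util import Surv
from sksurv.linear_model import CoxPHSurvivalAnalysis
from sksurv.ensemble import RandomSurvivalForest
from sksurv.metrics import concordance_index_censored

rng = np.random.default_rng(4)
n = 400
X = rng.normal(size=(n, 4))                              # e.g. age, diabetes, smoking, stent length
risk = X @ np.array([0.5, 0.8, 0.7, 0.4])
event_time = rng.exponential(scale=60 * np.exp(-risk))   # months to MACCE
cens_time = rng.exponential(scale=60, size=n)            # months to censoring
y = Surv.from_arrays(event=event_time <= cens_time, time=np.minimum(event_time, cens_time))

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

cox = CoxPHSurvivalAnalysis().fit(X_tr, y_tr)
rsf = RandomSurvivalForest(n_estimators=200, min_samples_leaf=10, random_state=0).fit(X_tr, y_tr)

for name, model in [("Cox", cox), ("RSF", rsf)]:
    cindex = concordance_index_censored(y_te["event"], y_te["time"], model.predict(X_te))[0]
    print(name, round(cindex, 3))
```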


Author(s):  
Yuko Yamaguchi ◽  
Marta Zampino ◽  
Toshiko Tanaka ◽  
Stefania Bandinelli ◽  
Yusuke Osawa ◽  
...  

Abstract Background Anemia is common in older adults and associated with greater morbidity and mortality. The causes of anemia in older adults have not been completely characterized. Although elevated circulating growth and differentiation factor 15 (GDF-15) has been associated with anemia in older adults, it is not known whether elevated GDF-15 predicts the development of anemia. Methods We examined the relationship between baseline plasma GDF-15 concentrations and incident anemia during 15 years of follow-up in 708 non-anemic adults aged 60 years and older participating in the Invecchiare in Chianti (InCHIANTI) Study. Results During follow-up, 179 (25.3%) participants developed anemia. The proportion of participants who developed anemia from the lowest to the highest quartile of plasma GDF-15 was 12.9%, 20.1%, 21.2%, and 45.8%, respectively. Adults in the highest quartile of plasma GDF-15 had an increased risk of developing anemia (hazard ratio 1.15, 95% confidence interval 1.09–1.21, P < .0001) compared with those in the lower three quartiles, in a multivariable Cox proportional hazards model adjusting for age, sex, serum iron, soluble transferrin receptor, ferritin, vitamin B12, congestive heart failure, diabetes mellitus, and cancer. Conclusions Circulating GDF-15 is an independent predictor of the development of anemia in older adults.
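The quartile analysis described above amounts to flagging the top GDF-15 quartile and fitting a covariate-adjusted Cox model. The lifelines sketch below illustrates this on invented data with assumed variable names; it is not the InCHIANTI dataset or the authors' adjustment set.

```python
# Sketch: flag the top GDF-15 quartile and fit a covariate-adjusted Cox model
# for incident anemia (lifelines, invented data and assumed variable names).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(5)
n = 708
df = pd.DataFrame({
    "gdf15":  rng.lognormal(mean=7.0, sigma=0.5, size=n),   # pg/mL, invented
    "age":    rng.normal(74, 6, size=n),
    "sex":    rng.integers(0, 2, size=n),
    "years":  rng.uniform(0.5, 15.0, size=n),               # follow-up time
    "anemia": rng.integers(0, 2, size=n),                   # 1 = incident anemia
})
df["gdf15_q4"] = (df["gdf15"] > df["gdf15"].quantile(0.75)).astype(int)

cph = CoxPHFitter()
cph.fit(df[["gdf15_q4", "age", "sex", "years", "anemia"]],
        duration_col="years", event_col="anemia")
print(cph.hazard_ratios_["gdf15_q4"])    # analogous to the reported HR of 1.15
```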


2020 ◽  
Vol 41 (Supplement_2) ◽  
Author(s):  
A Funada ◽  
Y Goto ◽  
T Maeda ◽  
H Okada ◽  
M Takamura

Abstract Background/Introduction A shockable rhythm after cardiac arrest is more likely when bystander cardiopulmonary resuscitation (CPR) is started early, owing to increased coronary perfusion. However, the relationship between bystander CPR and an initial shockable rhythm in patients with out-of-hospital cardiac arrest (OHCA) remains unclear. We hypothesized that chest-compression-only CPR (CC-CPR) before emergency medical service (EMS) arrival has an effect on the likelihood of an initial shockable rhythm equivalent to that of standard CPR (chest compression plus rescue breathing [S-CPR]). Purpose We aimed to examine the rate of initial shockable rhythm and 1-month outcomes in patients who received bystander CPR after OHCA. Methods The study included 59,688 patients (age ≥18 years) who received bystander CPR after an OHCA of presumed cardiac origin witnessed by a layperson, drawn from a prospectively recorded Japanese nationwide Utstein-style database from 2013 to 2017. Patients who received public-access defibrillation before arrival of the EMS personnel were excluded. The patients were divided into CC-CPR (n=51,520) and S-CPR (n=8,168) groups according to the type of bystander CPR received. The primary end point was an initial shockable rhythm recorded by the EMS personnel immediately after arrival at the scene. The secondary end point was the 1-month outcomes (survival and neurologically intact survival) after OHCA. In the statistical analyses, a Cox proportional hazards model was applied to reflect the different bystander CPR durations before and after propensity score (PS) matching. Results The crude rate of initial shockable rhythm in the CC-CPR group (21.3%, 10,946/51,520) was significantly higher than that in the S-CPR group (17.6%, 1,441/8,168; p<0.0001) before PS matching. However, no significant difference in the rate of initial shockable rhythm was found between the 2 groups after PS matching (18.3% [1,493/8,168] vs 17.6% [1,441/8,168], p=0.30). In the Cox proportional hazards model, CC-CPR was more negatively associated with initial shockable rhythm before PS matching (unadjusted hazard ratio [HR], 0.97; 95% confidence interval [CI], 0.94–0.99; p=0.012; adjusted HR, 0.92; 95% CI, 0.89–0.94; p<0.0001) than S-CPR. After PS matching, however, no significant difference was found between the 2 groups (adjusted HR of CC-CPR compared with S-CPR, 0.97; 95% CI, 0.94–1.00; p=0.09). No significant differences were found between CC-CPR and S-CPR in the 1-month outcomes after PS matching, respectively: survival, 8.5% and 10.1% (adjusted odds ratio, 0.89; 95% CI, 0.79–1.00; p=0.07); cerebral performance category 1 or 2, 5.5% and 6.9% (adjusted odds ratio, 0.86; 95% CI, 0.74–1.00; p=0.052). Conclusions Compared with S-CPR, CC-CPR before EMS arrival had an equivalent multivariable-adjusted association with the likelihood of an initial shockable rhythm in patients with OHCA of presumed cardiac cause witnessed by a layperson. Funding Acknowledgement Type of funding source: None
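The propensity-score matching step can be sketched with scikit-learn as a logistic propensity model followed by nearest-neighbour matching on the score. The code below is a simplified illustration (1:1 matching with replacement, no caliper) on invented data and covariates; it is not the registry's matching procedure.

```python
# Sketch: logistic propensity model plus 1:1 nearest-neighbour matching on the
# propensity score (with replacement, no caliper; invented data and covariates).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(6)
n = 5000
df = pd.DataFrame({
    "cc_cpr":       rng.integers(0, 2, size=n),      # 1 = chest-compression-only CPR
    "age":          rng.normal(70, 12, size=n),
    "male":         rng.integers(0, 2, size=n),
    "cpr_duration": rng.exponential(10, size=n),     # minutes of bystander CPR
})

covars = ["age", "male", "cpr_duration"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["cc_cpr"])
df["ps"] = ps_model.predict_proba(df[covars])[:, 1]

treated, control = df[df["cc_cpr"] == 1], df[df["cc_cpr"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]])   # matched CC-CPR / S-CPR sets
print(matched["cc_cpr"].value_counts())
```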

