Long-Term Survivors of Gastric Cancer: A California Population-Based Study

2012 ◽  
Vol 30 (28) ◽  
pp. 3507-3515 ◽  
Author(s):  
Pamela L. Kunz ◽  
Matthew Gubens ◽  
George A. Fisher ◽  
James M. Ford ◽  
Daphne Y. Lichtensztajn ◽  
...  

Purpose In the United States, gastric cancer is rapidly fatal with a 25% 5-year survival. Of the few patients who survive, little is known about their demographic, clinical, and tumor characteristics. Patients and Methods Data regarding all cases of gastric and gastroesophageal junction (GEJ) adenocarcinoma diagnosed in California between 1988 and 2005 were obtained from the California Cancer Registry, a member of the National Cancer Institute's Surveillance, Epidemiology, and End Results (SEER) program. A Cox proportional hazards model was constructed to understand the independent relationships of patient demographic, disease, and treatment factors with survival. Results We identified 47,647 patients diagnosed with gastric or GEJ cancer. Of those, only 9,325 (20%) survived at least 3 years. Variables associated with longer survival were localized stage (hazard ratio [HR], 0.20), surgery with diagnosis in 2002 or later (HR, 0.34), surgery with diagnosis in 2001 or before (HR, 0.37), regional stage (HR, 0.53), chemotherapy (HR, 0.56), intestinal histology (HR, 0.74), well- or moderately differentiated tumors (HR, 0.76), radiation (HR, 0.80), Asian/Pacific Islander race (HR, 0.81), treatment at an academic hospital (HR, 0.85), fundus/body/antrum location (HR, 0.90), highest socioeconomic status quintile (HR, 0.91), female sex (HR, 0.92), Hispanic race (HR, 0.92), and hospital size more than 150 beds (HR, 0.94). Kaplan-Meier curves showed longer median disease-specific survival (DSS) in patients with tumors originating in the fundus/body/antrum compared with the esophagus/cardia (13.4 v 10.8 months). Intestinal histology had significantly longer median DSS (28.9 months) compared with other (11.0 months) or diffuse (10.1 months) histology. Conclusion Patients who survive gastric and GEJ cancer more than 3 years after diagnosis have demographic and pathologic characteristics distinct from those who do not survive.
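The median disease-specific survival figures above come from Kaplan–Meier curves. A minimal sketch of the estimator is below, using made-up (time, event) pairs rather than the SEER cohort data; event=1 marks a disease-specific death and event=0 marks censoring.

```python
# Minimal Kaplan-Meier estimator sketch. Toy data only, not the study cohort.

def kaplan_meier(data):
    """Return [(time, survival_probability)] evaluated at each event time."""
    data = sorted(data)
    n_at_risk = len(data)
    surv = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for (ti, ei) in data if ti == t and ei == 1)
        removed = sum(1 for (ti, ei) in data if ti == t)  # deaths + censored at t
        if deaths > 0:
            surv *= 1 - deaths / n_at_risk
            curve.append((t, surv))
        n_at_risk -= removed
        i += removed
    return curve

def median_survival(curve):
    """First time at which the survival curve drops to 0.5 or below."""
    for t, s in curve:
        if s <= 0.5:
            return t
    return None  # median not reached

toy = [(3, 1), (5, 0), (7, 1), (9, 1), (11, 0), (14, 1), (20, 1), (25, 0)]
curve = kaplan_meier(toy)
print(curve)
print(median_survival(curve))  # prints 14 for this toy data
```

Censored subjects leave the risk set without contributing a death, which is how the estimator uses incomplete follow-up; the reported 13.4 v 10.8 month medians are read off such curves in the same way.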

Author(s):  
Majdi Imterat ◽  
Tamar Wainstock ◽  
Eyal Sheiner ◽  
Gali Pariente

Abstract Recent evidence suggests that a long inter-pregnancy interval (IPI: time interval between live birth and estimated time of conception of the subsequent pregnancy) poses a risk for adverse short-term perinatal outcomes. We aimed to study the effect of short (<6 months) and long (>60 months) IPI on long-term cardiovascular morbidity of the offspring. A population-based cohort study was performed in which all singleton live births in parturients with at least one previous birth were included. Hospitalizations of the offspring up to 18 years of age involving cardiovascular diseases were evaluated according to IPI length. An intermediate interval, between 6 and 60 months, was considered the reference. Kaplan–Meier survival curves were used to compare the cumulative morbidity incidence between the groups. A Cox proportional hazards model was used to control for confounders. During the study period, 161,793 deliveries met the inclusion criteria. Of them, 14.1% (n = 22,851) occurred in parturients following a short IPI, 78.6% (n = 127,146) following an intermediate IPI, and 7.3% (n = 11,796) following a long IPI. Total hospitalizations of the offspring involving cardiovascular morbidity were comparable between the groups. The Kaplan–Meier survival curves demonstrated similar cumulative incidences of cardiovascular morbidity in all groups. In a Cox proportional hazards model, short and long IPI did not emerge as independent risk factors for later pediatric cardiovascular morbidity of the offspring (adjusted HR 0.97, 95% CI 0.80–1.18; adjusted HR 1.01, 95% CI 0.83–1.37, for short and long IPI, respectively). In our population, extreme IPIs do not appear to impact long-term cardiovascular hospitalizations of offspring.
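The null finding above rests on reading adjusted hazard ratios against their 95% confidence intervals: an interval that includes 1 (as for both IPI groups) indicates no detectable effect. A short sketch of that reading, where the beta/SE values are illustrative assumptions chosen to be consistent with the reported short-IPI interval, not the study's fitted coefficients:

```python
import math

# Reading a Cox hazard ratio and its Wald 95% CI for significance.
# beta and se below are illustrative, not the study's fitted values.

def hr_with_ci(beta, se, z=1.96):
    """Hazard ratio and 95% CI from a Cox log-hazard coefficient."""
    hr = math.exp(beta)
    return hr, math.exp(beta - z * se), math.exp(beta + z * se)

def is_significant(ci_low, ci_high):
    """Effect is significant at the 5% level iff the CI excludes 1."""
    return not (ci_low <= 1.0 <= ci_high)

hr, lo, hi = hr_with_ci(beta=-0.03, se=0.10)  # hypothetical short-IPI term
print(round(hr, 2), round(lo, 2), round(hi, 2), is_significant(lo, hi))
```

With these assumed values the interval spans 1, so the covariate would be reported, as here, as not an independent risk factor.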


Author(s):  
Tzu-Wei Yang ◽  
Chi-Chih Wang ◽  
Ming-Chang Tsai ◽  
Yao-Tung Wang ◽  
Ming-Hseng Tseng ◽  
...  

The prognosis of different etiologies of liver cirrhosis (LC) is not well understood. Previous studies performed on alcoholic LC-dominated cohorts have reported conflicting results. We aimed to compare the outcome and the effect of comorbidities on survival between alcoholic and non-alcoholic LC in a viral hepatitis-dominated LC cohort. We identified newly diagnosed alcoholic and non-alcoholic LC patients, aged ≥40 years, between 2006 and 2011, by using the Longitudinal Health Insurance Database. The hazard ratios (HRs) were calculated using the Cox proportional hazards model and the Kaplan–Meier method. A total of 472 alcoholic LC and 4313 non-alcoholic LC patients were identified in our study cohort. We found that alcoholic LC patients were predominantly male (94.7% of alcoholic LC and 62.6% of non-alcoholic LC patients were male) and younger (78.8% of alcoholic LC and 37.4% of non-alcoholic LC patients were less than 60 years old) compared with non-alcoholic LC patients. Non-alcoholic LC patients had a higher rate of concomitant comorbidities than alcoholic LC patients (79.6% vs. 68.6%, p < 0.001). LC patients with chronic kidney disease demonstrated the highest adjusted HRs: 2.762 in alcoholic LC and 1.751 in non-alcoholic LC (all p < 0.001). In contrast, LC patients with hypertension and hyperlipidemia had a decreased risk of mortality. The six-year survival rates showed no difference between the two study groups (p = 0.312). In conclusion, alcoholic LC patients were younger and had lower rates of concomitant comorbidities compared with non-alcoholic LC patients. However, all-cause mortality was not different between alcoholic and non-alcoholic LC patients.


Author(s):  
Min-Hua Lin ◽  
She-Yu Chiu ◽  
Pei-Hsuan Chang ◽  
Yu-Liang Lai ◽  
Pau-Chung Chen ◽  
...  

Background: Previous research found that statins, in addition to their efficacy in treating hyperlipidemia, may also incur adverse drug reactions, mainly myopathies and abnormalities in liver function. Aim: This study aims to assess the risk of newly onset sarcopenia among patients with chronic kidney disease using statins. Material and Method: In a nationwide retrospective population-based cohort study, 75,637 clinically confirmed cases of chronic kidney disease between 1997 and 2011 were selected from the National Health Insurance Research Database of Taiwan. The selection of the chronic kidney disease cohort required a discharge diagnosis of chronic kidney disease or more than 3 outpatient visits with the diagnosis of chronic kidney disease within 1 year. After applying the exclusion criteria, a total of 67,001 patients with chronic kidney disease were included in the study. The Cox proportional hazards model was used to perform preliminary analysis of the effect of statin usage on the occurrence of newly diagnosed sarcopenia; the Cox proportional hazards model with time-dependent covariates was then conducted to take into account individual temporal differences in medication usage, and the hazard ratio (HR) and 95% confidence interval were calculated after controlling for gender, age, income, and urbanization. Results: Our main findings indicated that statin use in patients with chronic kidney disease appeared to protect against the occurrence of sarcopenia; higher dosages of statins showed stronger protective effects, and the results were similar over long-term follow-up. In addition, the risk of newly diagnosed sarcopenia among patients treated with lipophilic statins was lower than that among patients treated with hydrophilic statins. Conclusion: It seems that patients with chronic kidney disease could receive statin treatment to reduce the occurrence of newly diagnosed sarcopenia. Additionally, a higher dosage of statins could reduce the incidence of newly diagnosed sarcopenia in patients with chronic kidney disease.


2019 ◽  
Vol 2019 ◽  
pp. 1-11 ◽  
Author(s):  
Zhang Haiyu ◽  
Pei Xiaofeng ◽  
Mo Xiangqiong ◽  
Qiu Junlan ◽  
Zheng Xiaobin ◽  
...  

Purpose. The incidence of esophageal adenocarcinoma (EAC) has significantly increased in Western countries. We aimed to identify trends in incidence and survival in patients with EAC over the past 30 years and then analyzed potential risk factors, including race, sex, age, and socioeconomic status (SES). Methods. All data were collected from the Surveillance, Epidemiology, and End Results (SEER) database. Kaplan–Meier analysis and the Cox proportional hazards model were conducted to compare the differences in survival between variables, including sex, race, age, and SES, as well as to evaluate the association of these factors with prognosis. Results. A total of 16,474 patients with EAC were identified from 1984 to 2013 in the United States. Overall incidence increased in each 10-year period, from 1.8 to 3.1 to 3.9 per 100,000. Overall survival gradually improved (p<0.0001), which was evident in male patients (hazard ratio [HR] = 1.111; 95% CI (1.07, 1.15)); however, the 5-year survival rate remained low (20.1%). The Cox proportional hazards model identified old age, black ethnicity, and medium/high poverty as risk factors for EAC (HR = 1.018, 95% CI (1.017, 1.019); HR = 1.240, 95% CI (1.151, 1.336); HR = 1.000, 95% CI (1.000, 1.000), respectively). Conclusions. The incidence of EAC in the United States increased over time. A survival advantage was observed in white patients and patients in the low-poverty group. Sex was an independent prognostic factor for EAC, but this finding has to be confirmed by further research.
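A per-unit hazard ratio for a continuous covariate, such as the HR of 1.018 per year of age reported above, looks small until it is compounded: because proportional hazards multiply, the ratio implied across a k-unit gap is HR to the power k. A short reading aid (no new analysis, just arithmetic on the reported coefficient):

```python
# Compounding a per-unit Cox hazard ratio across a k-unit covariate gap.
# 1.018 is the per-year age HR reported in the abstract; the gaps are
# illustrative.

def hr_over_gap(hr_per_unit, k):
    """Hazard ratio implied across k units of a continuous covariate."""
    return hr_per_unit ** k

print(round(hr_over_gap(1.018, 10), 3))  # 10-year age gap -> about 1.195
print(round(hr_over_gap(1.018, 20), 3))  # 20-year age gap -> about 1.429
```

So a 20-year age difference corresponds to roughly a 43% higher hazard under this model, which is why "old age" registers as a meaningful risk factor despite the near-unity per-year coefficient.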


2015 ◽  
Vol 33 (3_suppl) ◽  
pp. 721-721
Author(s):  
Doug Baughman ◽  
Krishna Bilas Ghimire ◽  
Binay Kumar Shah

721 Background: Combination chemoradiotherapy is the standard of care for treatment of non-metastatic squamous cell carcinoma of the anus (SCCA). This population-based study evaluated disparities in receipt of radiotherapy (RT) and its effect on survival in patients with localized and regional SCCA in the United States. Methods: The Surveillance, Epidemiology, and End Results (SEER) 18 database was used to identify patients with localized and regional SCCA diagnosed between 1998 and 2008. We used univariate and multivariate logistic regression to model the relationships between receipt of RT and age, sex, marital status, stage, and race. Relative survival rates were calculated and compared using two-sample z-tests. A Cox proportional hazards model was used to find adjusted hazard ratios (HR). Results: A total of 3,971 patients with localized or regional SCCA as the only primary malignancy were included in the study, of whom 3,278 (82.6%) received RT. After adjusting for covariates, those 65 years and older (adjusted OR 0.82, p=0.029) were less likely to receive RT. Females were more likely to receive RT compared to males (adjusted OR 1.54, p<0.001). We found no difference in receipt of RT by race. Comparisons of 1- and 5-year relative survival rates showed lower survival for blacks (p-value <0.01 at 1 year and <0.0001 at 5 years), those 65 years and older, and males. A 1-year survival disparity was found for those not receiving RT (p-value <0.0001 at 1 year), but no difference was observed at 5 years. A Cox proportional hazards model adjusting for all covariates showed greater hazard for blacks (adjusted HR 1.36, p=0.001), those not receiving RT (adjusted HR 1.23, p=0.03), patients 65 years or older, and males. Conclusions: This population-based study identified older patients as less likely to receive RT and females as more likely to receive RT. Survival analysis identified blacks, males, older patients, and those not receiving RT as having lower rates of survival.
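The survival-rate comparisons above use two-sample z-tests on proportions. A hedged sketch of that test follows; the counts are made-up illustrative numbers, not the SEER figures from this study.

```python
import math

# Two-sample z-test for equality of two proportions (pooled standard
# error), the kind of test used to compare survival rates between groups.
# Counts below are illustrative only.

def two_sample_z(x1, n1, x2, n2):
    """z statistic for H0: the two underlying proportions are equal."""
    p1, p2 = x1 / n1, x2 / n2
    p = (x1 + x2) / (n1 + n2)  # pooled proportion under H0
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

# e.g. hypothetical 70% vs 60% 1-year survival in two groups of 1,000
z = two_sample_z(700, 1000, 600, 1000)
print(round(z, 2))  # prints 4.69, far beyond the 1.96 two-sided cutoff
```

A |z| above 1.96 corresponds to p < 0.05 two-sided, which is how the 1-year and 5-year disparities quoted above reach significance.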


2020 ◽  
Author(s):  
Yue Zhao ◽  
Deepika Dilip

Abstract Background: The outbreak of coronavirus disease 2019 (COVID-19) has affected countries very differently: China and South Korea found effective measures to contain the virus, whereas the United States and European countries are struggling to fight it. China is not considered a democracy, and South Korea is less democratic than the United States. We therefore wanted to explore the association between COVID-19 deaths and democracy. Methods: We collected data on COVID-19 deaths for each country from the Johns Hopkins University website and 2018 democracy indices from the Economist Intelligence Unit website in May 2020. We then conducted a survival analysis, treating each country as a subject, with the Cox proportional hazards model, adjusting for other selected variables. Results: The association between democracy and COVID-19 deaths was significant (P=0.04) after adjusting for other covariates. Conclusion: In conclusion, less democratic governments performed better in containing the virus and controlling the number of deaths.


2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Yeonghee Eun ◽  
Keun Hye Jeon ◽  
Kyungdo Han ◽  
Dahye Kim ◽  
Hyungjin Kim ◽  
...  

Abstract In previous literature regarding the development of rheumatoid arthritis (RA), female reproductive factors have been described as protective factors, risk factors, or irrelevant, leading to inconsistent results. The aim of this study was to investigate the effect of female reproductive factors on the incidence of seropositive RA. A large population-based retrospective cohort from the National Health Insurance Service data in South Korea was used. Postmenopausal women who participated in both cardiovascular and breast cancer screening in 2009 were included and followed until the date of seropositive RA diagnosis, death, or December 31, 2018. A multivariable-adjusted Cox proportional hazards model was used to assess the association between reproductive factors and incident seropositive RA. Of 1,357,736 postmenopausal women, 6056 women were diagnosed with seropositive RA, and the incidence rate was 54.16 cases/100,000 person-years. Reproductive factors other than hormone replacement therapy (HRT) were not significantly associated with seropositive RA incidence. Postmenopausal women who used HRT for ≥ 5 years had a higher adjusted hazard ratio (aHR) of incident seropositive RA than never-users (aHR 1.25; 95% CI 1.09–1.44). Alcohol consumption of less than 30 g per day (aHR 0.80; 95% CI 0.74–0.87), regular physical activity (aHR 0.90; 95% CI 0.84–0.97), diabetes mellitus (aHR 0.85; 95% CI 0.78–0.93), and cancer (aHR 0.77; 95% CI 0.64–0.92) were associated with a lower risk of seropositive RA. Most female reproductive factors did not significantly affect the development of seropositive RA in postmenopausal women. Only HRT was associated with a small but significant increase in risk of seropositive RA.


2020 ◽  
Vol 7 ◽  
pp. 205435812090697
Author(s):  
Mohamed Shantier ◽  
Yanhong Li ◽  
Monika Ashwin ◽  
Olsegun Famure ◽  
Sunita K. Singh

Background: The Living Kidney Donor Profile Index (LKDPI) was derived in a cohort of kidney transplant recipients (KTR) from the United States to predict the risk of total graft failure. There are important differences in patient demographics, listing practices, access to transplantation, delivery of care, and posttransplant mortality in Canada as compared with the United States, and the generalizability of the LKDPI in the Canadian context is unknown. Objective: The purpose of this study was to externally validate the LKDPI in a large contemporary cohort of Canadian KTR. Design: Retrospective cohort validation study. Setting: Toronto General Hospital, University Health Network, Toronto, Ontario, Canada. Patients: A total of 645 adult (≥18 years old) living donor KTR between January 1, 2006 and December 31, 2016 with follow-up until December 31, 2017 were included in the study. Measurements: The predictive performance of the LKDPI was evaluated. The outcome of interest was total graft failure, defined as the need for chronic dialysis, retransplantation, or death with graft function. Methods: The Cox proportional hazards model was used to examine the relation between the LKDPI and total graft failure, and for external validation and performance assessment of the model. Discrimination and calibration were used to assess model performance. Discrimination was assessed using Harrell's C statistic, and calibration was assessed graphically, comparing observed versus predicted probabilities of total graft failure. Results: A total of 645 living donor KTR were included in the study. The median LKDPI score was 13 (interquartile range [IQR] = 1.1, 29.9). Higher LKDPI scores were associated with an increased risk of total graft failure (hazard ratio = 1.01; 95% confidence interval [CI] = 1.0-1.02; P = .02). Discrimination was poor (C statistic = 0.55; 95% CI = 0.48-0.61). Calibration was good at 1 year posttransplant but suboptimal at 3 and 5 years posttransplant. Limitations: Limitations include a relatively small sample size, predicted probabilities for assessment of calibration only available for scores of 0 to 100, and some missing data handled by imputation. Conclusions: In this external validation study, the predictive ability of the LKDPI was modest in a cohort of Canadian KTR. Validation of prediction models is an important step to assess performance in external populations. Potential recalibration of the LKDPI may be useful prior to clinical use in external cohorts.
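Discrimination above is summarized by Harrell's C statistic (0.55, barely better than the 0.5 of a coin flip). A minimal sketch of how C is computed over censored data follows; the (risk_score, time, event) triples are illustrative, not the Toronto cohort.

```python
from itertools import combinations

# Minimal Harrell's C statistic over (risk, time, event) triples;
# event=1 means an observed graft failure, event=0 means censoring.
# Toy data only.

def harrells_c(subjects):
    """Fraction of comparable pairs in which the higher-risk subject
    fails earlier. A pair is comparable only when the earlier time is
    an observed event (a censored earlier subject tells us nothing)."""
    concordant = comparable = 0.0
    for a, b in combinations(subjects, 2):
        if b[1] < a[1]:
            a, b = b, a  # ensure a has the earlier time
        if a[1] == b[1] or a[2] == 0:
            continue  # tied times or earlier subject censored: skip
        comparable += 1
        if a[0] > b[0]:
            concordant += 1
        elif a[0] == b[0]:
            concordant += 0.5  # tied risk scores count half
    return concordant / comparable

toy = [(30, 2.0, 1), (10, 5.0, 1), (20, 4.0, 0), (40, 8.0, 1)]
print(harrells_c(toy))  # prints 0.5: no discrimination on this toy set
```

This simplified version skips some tie-handling refinements of the full definition, but it shows why C = 0.5 means the score orders failures no better than chance.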


2020 ◽  
Vol 2020 ◽  
pp. 1-8
Author(s):  
Ya-Hsu Yang ◽  
Chih-Chiang Chiu ◽  
Hao-Wei Teng ◽  
Chun-Teng Huang ◽  
Chun-Yu Liu ◽  
...  

Background. Late-onset depression (LOD) often occurs in the context of vascular disease and may be associated with risk of dementia. Aspirin is widely used to reduce the risk of cardiovascular disease and stroke. However, its role in patients with LOD and the risk of dementia remains inconclusive. Materials and Methods. A population-based study was conducted using data from the National Health Insurance of Taiwan during 1996–2009. Patients fulfilling the diagnostic criteria for LOD, with or without subsequent dementia (incident dementia), were identified, as were users of aspirin (75 mg daily for at least 6 months) among them. The time-dependent Cox proportional hazards model was applied for multivariate analyses. Propensity scores with the one-to-one nearest-neighbor matching model were used to select matched patients. Cumulative incidence of incident dementia after diagnosis of LOD was calculated by the Kaplan–Meier method. Results. A total of 6,028 (13.4%) and 40,411 (86.6%) patients were identified with and without a diagnosis of LOD, respectively; among those with LOD, 2,424 (41.9%) were aspirin users. Patients with LOD had more comorbidities, such as cardiovascular diseases, diabetes, and hypertension, compared with those without LOD. Among patients with LOD, aspirin users had a lower incidence of subsequent incident dementia than non-users (hazard ratio = 0.734, 95% CI 0.641–0.841, p<0.001). After matching aspirin users with non-users by the propensity score-matching method, the cumulative incidence of incident dementia was significantly lower in aspirin users among LOD patients (p=0.022). Conclusions. Aspirin may be associated with a lower risk of incident dementia in patients with LOD. This beneficial effect of aspirin in LOD patients needs validation in prospective clinical trials, and our results should be interpreted with caution.
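The matching step above pairs each aspirin user with the non-user whose propensity score is closest. A hedged sketch of one common variant, greedy 1:1 nearest-neighbor matching without replacement and with a caliper, is below; the scores, IDs, and caliper are illustrative assumptions, not details taken from the study.

```python
# Greedy 1:1 nearest-neighbor propensity-score matching sketch.
# All scores, IDs, and the caliper value are illustrative.

def nearest_neighbor_match(treated, controls, caliper=0.1):
    """treated/controls: {subject_id: propensity_score}. Returns a list
    of (treated_id, control_id) pairs; each control is used at most once,
    and a pair is kept only if the score gap is within the caliper."""
    available = dict(controls)
    pairs = []
    # match the hardest-to-match (highest-score) treated subjects first
    for tid, ps in sorted(treated.items(), key=lambda kv: -kv[1]):
        if not available:
            break
        cid = min(available, key=lambda c: abs(available[c] - ps))
        if abs(available[cid] - ps) <= caliper:
            pairs.append((tid, cid))
            del available[cid]
    return pairs

treated = {"t1": 0.80, "t2": 0.55, "t3": 0.30}
controls = {"c1": 0.78, "c2": 0.52, "c3": 0.90, "c4": 0.10}
print(nearest_neighbor_match(treated, controls))
# prints [('t1', 'c1'), ('t2', 'c2')]; t3 has no control within the caliper
```

Subjects left unmatched (like t3 here) are dropped from the matched comparison, which is the price paid for comparing groups with similar covariate profiles.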


2020 ◽  
Vol 38 (15_suppl) ◽  
pp. 7503-7503
Author(s):  
Muna Qayed ◽  
Carrie L Kitko ◽  
Kwang Woo Ahn ◽  
Mariam H Johnson ◽  
Kirk R. Schultz ◽  
...  

7503 Background: Characteristics such as disease, disease status, and cytogenetic abnormalities impact relapse and survival after transplantation for acute myeloid (AML) and acute lymphoblastic (ALL) leukemia. In adults, these attributes were used to derive the disease risk index for survival. Thus, the current analysis sought to develop and validate a pediatric disease risk index (p-DRI). Methods: Eligible were patients aged <18 years with AML (n=1135) and ALL (n=1228) transplanted between 2008 and 2017 in the United States. Separate analyses were performed for AML and ALL. Patients were randomly assigned (1:1) to a training and validation cohort. A Cox proportional hazards model with stepwise selection was used to select significant variables (2-sided p<0.05). The primary outcome was leukemia-free survival (LFS; relapse and death were events). Based on the magnitude of log(HR), a weighted score was assigned to each characteristic that met the level of significance, and risk groups were created. Results: Four risk groups were identified for AML and three risk groups for ALL (Table). The 5-year probabilities of LFS for AML were 81% (68-91), 56% (51-61), 44% (39-49), and 21% (15-28) for the good, intermediate, high, and very high-risk groups, respectively. The 5-year probabilities of LFS for ALL were 68% (63-72), 50% (45-54), and 15% (3-34) for the good, intermediate, and high-risk groups, respectively. Conclusions: This validated p-DRI successfully stratified children with AML and ALL undergoing allogeneic transplantation for prognostication. [Table: see text]
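The scoring step described above, assigning each significant characteristic a weight based on the magnitude of log(HR) and binning the summed score into risk groups, can be sketched as follows. The HRs, the scaling constant, and the cut points are illustrative assumptions, not the fitted p-DRI coefficients.

```python
import math

# Sketch of log(HR)-weighted risk scoring and grouping. All numeric
# values (HRs, scale, cut points) are illustrative assumptions.

def weight(hr, scale=0.35):
    """Integer weight for one characteristic, proportional to log(HR)."""
    return round(math.log(hr) / scale)

def risk_group(total, cuts=(1, 3, 5)):
    """Bin a summed score into good/intermediate/high/very high."""
    labels = ["good", "intermediate", "high", "very high"]
    for cut, label in zip(cuts, labels):
        if total < cut:
            return label
    return labels[-1]

# a hypothetical patient with two adverse features, HR 2.0 and HR 1.5
score = weight(2.0) + weight(1.5)
print(score, risk_group(score))  # prints: 3 high
```

Rounding log(HR) to integer weights trades a little precision for a score clinicians can sum by hand, which is the usual motivation for DRI-style indices.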

