Age, Sickness, and Longevity in the Late Nineteenth and the Early Twentieth Centuries

2006 ◽  
Vol 30 (4) ◽  
pp. 571-600
Author(s):  
Martin Gorsky ◽  
Bernard Harris ◽  
Andrew Hinde

We examine the relationship between age, sickness, and longevity among men who were members of the Hampshire Friendly Society (HFS) in southern England during the late nineteenth and the early twentieth centuries. The HFS insured its members against sickness, death, and old age, keeping detailed records of the claims for sick pay submitted by its members from 1868 onward. From 1892 onward these records included information about the cause of the sickness for which compensation was paid. We can therefore use this information to construct individual “sickness biographies” for men who joined the society during this period. This article uses these sickness histories to address two questions. The first concerns the relationship between the age of the society’s members and the nature of the claims they submitted. We find that both the incidence and the duration of periods of sickness increased with age. Older men experienced longer periods of sickness both because they experienced different types of sickness and because it took them longer to recover from the same illnesses as those suffered by younger men. The second question is whether sickness in early adulthood was associated with increased mortality. We find that repeated bouts of sickness, as revealed by the number of claims made for sick pay, at ages under 50 years were associated with an increased risk of death at ages over 50 years.
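As a concrete illustration of the kind of analysis these sickness histories support, the sketch below (Python with pandas and lifelines; hypothetical file and column names, not the authors' code) tabulates claim frequency and duration by age band and relates the number of claims made before age 50 to mortality after age 50 with a Cox model, which is one reasonable way to frame the mortality comparison; the abstract does not specify the model used.

```python
# Minimal sketch, not the HFS analysis itself; file names and columns are assumptions.
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical per-claim records: member_id, age_at_claim, days_paid
claims = pd.read_csv("hfs_claims.csv")
claims["age_band"] = pd.cut(claims["age_at_claim"], bins=[15, 30, 40, 50, 60, 70, 85])

# Claim counts and mean duration by age band (both rise with age in the HFS data)
print(claims.groupby("age_band", observed=True)["days_paid"].agg(["count", "mean"]))

# Hypothetical per-member records for survival after age 50:
# years_observed_after_50, died (0/1), n_claims_under_50
members = pd.read_csv("hfs_members.csv")
cph = CoxPHFitter()
cph.fit(members[["years_observed_after_50", "died", "n_claims_under_50"]],
        duration_col="years_observed_after_50", event_col="died")
cph.print_summary()  # an HR above 1 for n_claims_under_50 would mirror the reported finding
```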

2004 ◽  
Vol 24 (6) ◽  
pp. 903-920 ◽  
Author(s):  
OMAR RAHMAN ◽  
JANE MENKEN ◽  
RANDALL KUHN

The purpose of this study is to examine whether the co-residence of spouses and children affects self-reported general health among older men and women in a rural area of Bangladesh. Binary logistic regression was used to explore the impact of spouses and children on self-reported health, with particular attention to the gender of children and interactions with chronic disease. The data are from the Matlab Health and Socio-Economic Survey. A sample of 765 women and 979 men aged 60 or more years with at least one surviving child was available. The principal result is that for an older woman, optimum self-reported health is most likely when a spouse and at least one son and one daughter are present. Any deviation from this family pattern (either no spouse or children of only one sex) leads to a significantly increased risk of poor self-reported health. Among older men, by contrast, there were no differences in self-reported health across the various spouse-child combinations. The relationship between a balanced gender distribution of children and optimum self-reported health among older women may explain the levelling out of fertility at roughly three children per woman despite intensive family planning promotion in the area. Further reductions in fertility (an important policy concern) may depend on improving the substitutability of sons and daughters in the support of their elderly mothers.
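A minimal sketch of the modelling approach described above, using statsmodels with hypothetical variable names rather than the actual MHSS codebook: poor self-reported health is regressed on the spouse/child co-residence pattern, with a chronic-disease interaction, separately for women and men.

```python
# Sketch only; 'family_type', 'chronic', 'poor_health' and the file are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("matlab_mhss.csv")  # hypothetical extract of the survey

# 'family_type' codes the spouse/child combinations, with 'spouse_son_daughter'
# (spouse plus at least one son and one daughter) as the reference category.
for sex, sub in df.groupby("sex"):
    fit = smf.logit("poor_health ~ C(family_type, Treatment('spouse_son_daughter'))"
                    " * chronic + age", data=sub).fit(disp=False)
    print(sex)
    print(np.exp(fit.params).round(2))  # odds ratios for poor self-reported health
```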


BMJ Open ◽  
2019 ◽  
Vol 9 (11) ◽  
pp. e030330
Author(s):  
Erin Grinshteyn ◽  
Peter Muennig ◽  
Roman Pabayo

Objectives: Fear of crime is associated with adverse mental health outcomes and reduced social interaction, independent of crime itself. Because poor mental health and reduced social interaction are associated with poor physical health, fear of crime may also be associated with death. The main objective was to determine whether neighbourhood fear is associated with time to death. Setting and participants: Data from the 1978–2008 General Social Survey were linked to mortality data using the National Death Index (GSS-NDI) (n=20 297). Methods: GSS-NDI data were analysed to assess the relationship between fear of crime at baseline and time to death among adults, after removing violent deaths. Fear was measured by asking respondents whether they were afraid to walk alone at night within a mile of their home. Crude and adjusted hazard ratios (HRs) for time to death were calculated using survival analysis. Analyses were stratified by sex. Results: Among those who responded that they were fearful of walking in their neighbourhood at night, there was a 6% increased risk of death during follow-up in the adjusted model, though this was not significant (HR=1.06, 95% CI 0.99 to 1.13). In the fully adjusted models examining risk of mortality stratified by sex, findings were significant among men but not women. Among men, in the adjusted model, there was an 8% increased risk of death during follow-up among those who reported fear at baseline compared with those who did not (HR=1.08, 95% CI 1.02 to 1.14). Conclusions: Research has recently begun to examine fear as a public health issue. Given the relationship with mortality identified among men, this is a potential public health problem that should be examined more fully.
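A minimal sketch of the survival analysis described in the Methods, using lifelines with hypothetical column names (not the GSS-NDI extract itself): Cox models for time to death by baseline fear, fitted overall and stratified by sex after removing violent deaths.

```python
# Sketch only; the file and column names are assumptions.
import pandas as pd
from lifelines import CoxPHFitter

gss = pd.read_csv("gss_ndi.csv")             # assumed linked GSS-NDI file
gss = gss[gss["violent_death"] == 0]         # violent deaths are removed from the analysis

# 'fear_walking' = 1 if afraid to walk alone at night near home; covariates are numeric codes.
cols = ["years_to_death_or_censor", "died", "fear_walking", "age", "female"]

for label, frame in [("all", gss), ("men", gss[gss["female"] == 0]),
                     ("women", gss[gss["female"] == 1])]:
    use = cols if label == "all" else [c for c in cols if c != "female"]
    cph = CoxPHFitter()
    cph.fit(frame[use], duration_col="years_to_death_or_censor", event_col="died")
    print(label, round(cph.hazard_ratios_["fear_walking"], 2))  # HR for fear at baseline
```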


Blood ◽  
2011 ◽  
Vol 118 (21) ◽  
pp. 3033-3033 ◽  
Author(s):  
Ann Dahlberg ◽  
Filippo Milano ◽  
Ted A. Gooley ◽  
Colleen Delaney

Background: Cord blood transplant (CBT) recipients have higher infection-related morbidity and mortality than recipients of other stem cell sources following allogeneic transplant, owing to delayed hematopoietic recovery and immune reconstitution. Even in recipients of myeloablative (MA) double cord blood transplant (dCBT), time to engraftment (defined as the first of two consecutive days with an absolute neutrophil count (ANC) ≥ 500/μl) is delayed more than three weeks, resulting in higher rates of infection in the first 100 days post-transplant. However, a better understanding of the relationship between duration of neutropenia and risk of early transplant-related mortality is needed to assess the clinical impact of methodologies aimed at reducing post-transplant neutropenia. Previously, the relationship between severe neutropenia (ANC ≤ 100/μl) and risk of death was evaluated in allogeneic bone marrow transplant recipients using proportional hazards models, which demonstrated a significantly increased risk for those with severe neutropenia at day 15 or beyond following transplantation (Offner et al, Blood, 1996; 88(10): 4058–62). Here we use a similar model to determine how duration of severe neutropenia relates to risk of death following CBT. Methods: All patients (n=137) who received a CBT on a research protocol at a single institution from 2006 to 2010 were eligible. On each day from day 0 to day +100, surviving patients were divided into those with ANC ≤ 100/μl and those with ANC > 100/μl, and the number of patients who died by day +100 was determined for each group. Hazard ratios (HR) with 95% confidence intervals for day +100 mortality were then calculated for each day post-CBT, with the HR representing the risk of death by day +100 among those with ANC ≤ 100/μl relative to those with ANC > 100/μl on that day. Results: Of the 137 patients who received a CBT on a research protocol, 99 (72%) received MA conditioning regimens and 38 (28%) received reduced-intensity conditioning (RIC) regimens. Twenty-two patients (16%) received a single cord blood unit, while the remainder received dCBT. As the overall results and trends observed for patients receiving MA or RIC regimens were similar, only the combined results are presented. Thirty-one patients (23%) died before day +100. Causes of death were primary graft failure (17), infection (7), disease relapse (5), multi-organ failure (1), and leukoencephalopathy (1). The median time to engraftment was the same (20 days) for those who died before day +100 (range 9–39 days) as for those alive at day +100 (range 6–69 days). The hazard ratio for day +100 mortality by day post-CBT was significantly higher for patients with ANC ≤ 100/μl beginning on day +16 and remained so through day +50, at which point the number of patients with ANC ≤ 100/μl was small and the calculations were no longer significant. The HRs for each day post-CBT from day 0 to day +50 are plotted in Figure 1 along with their 95% upper and lower confidence intervals. From the graphical plot of these HRs, one can identify date ranges (days 0–12, days 13–23, days 24–40, days 41–100) within which the HRs are roughly equivalent, with a clear increase in HR in the next date range. Modeling ANC ≤ 100/μl as a time-dependent covariate, we calculated HRs for each of these date ranges and found a significantly increased risk of day +100 mortality for patients with ANC ≤ 100/μl during days 12–23 (HR=2.96, p=0.01), days 24–40 (HR=5.53, p=0.0004), and days 41–100 (HR=14.59, p<0.0001). The HR was 1.16 (0.49–2.76, p=0.73) for days 0–12; however, some of these patients had initial autologous recovery following RIC regimens and poor outcomes. Conclusions: Our study demonstrates that severe neutropenia, defined as ANC ≤ 100/μl, poses a significantly increased risk of day +100 mortality for recipients of CBT as early as 12–23 days post-CBT. Importantly, this patient cohort was transplanted in the modern era of aggressive supportive care, including the use of newer antimicrobials; despite this, severe neutropenia remains a significant risk factor for day +100 mortality in recipients of CBT. Interestingly, median time to engraftment (ANC ≥ 500/μl) was the same for those who died before day +100 as for those alive at day +100, suggesting that time to an ANC of 100/μl may be a better predictor of early death following CBT. Thus, strategies that result in more rapid myeloid recovery (to an ANC of 100/μl) remain essential for recipients of CBT. Disclosures: No relevant conflicts of interest to declare.
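The time-dependent treatment of severe neutropenia can be sketched with lifelines' CoxTimeVaryingFitter, using a long-format dataset with hypothetical column names; this illustrates the general technique rather than reproducing the study code.

```python
# Sketch only; 'cbt_anc_long.csv' and its columns are assumptions.
import pandas as pd
from lifelines import CoxTimeVaryingFitter

# Long format: one row per patient per interval (start, stop] in days post-CBT,
# with 'anc_le_100' switching value on the day ANC crosses 100/ul and 'event'
# marking death by day +100 on the patient's final interval.
long_df = pd.read_csv("cbt_anc_long.csv")  # columns: id, start, stop, anc_le_100, event

ctv = CoxTimeVaryingFitter()
ctv.fit(long_df, id_col="id", start_col="start", stop_col="stop", event_col="event")
ctv.print_summary()  # splitting follow-up into the date ranges above and adding
                     # interaction terms would give window-specific HRs as reported
```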


2014 ◽  
Vol 32 (4_suppl) ◽  
pp. 31-31
Author(s):  
Alicia Katherine Morgans ◽  
Kang-Hsien Fan ◽  
Tatsuki Koyama ◽  
Peter C. Albertsen ◽  
Michael Goodman ◽  
...  

Background: Androgen deprivation therapy (ADT) has been associated with an increased risk of developing diabetes (DM) and cardiovascular disease (CVD), though this remains controversial, particularly for CVD. We prospectively assessed the relationship between ADT and incident DM and CVD in the Prostate Cancer Outcomes Study (PCOS), a population-based cohort of prostate cancer survivors followed longitudinally for 15 years from diagnosis. Methods: We identified men in the PCOS with non-metastatic prostate cancer diagnosed from 1994 to 1995 and followed through 2009 to 2010. We used multivariable logistic regression models to compare groups receiving short-term ADT (less than 2 years), prolonged ADT (2 years or more), and no ADT to assess the relationship between ADT exposure and subsequent diagnoses of DM and CVD (determined by patient report and cause-of-death data). We evaluated the effects of age at diagnosis, race, stage, and comorbidity on the development of DM and CVD. Results: Among 3,526 men with comorbidity and treatment data, 2,985 men without baseline DM and 3,112 men without baseline CVD constituted the DM and CVD cohorts, respectively. Regardless of duration of ADT exposure, there was no increased risk of DM or CVD in men younger than 70 at diagnosis. Compared with no ADT exposure, prolonged ADT was associated with an increased risk of DM and CVD that increased steadily over age 76 at diagnosis for DM (OR 2.11 at age 74, 95% CI 1.02–4.36; OR 2.65 at age 80, 95% CI 1.09–6.47) and over age 74 at diagnosis for CVD (OR 1.89 at age 74, 95% CI 1.02–3.49; OR 3.19 at age 80, 95% CI 1.25–8.17). Increasing comorbidity burden modified the risk of DM and CVD (3 or more comorbidities vs. no comorbidities: for DM, OR 4.25, 95% CI 2.3–7.9; for CVD, OR 8.1, 95% CI 4.3–15.5; P<0.001). Conclusions: The relationship between ADT and the development of CVD and DM may depend on age at diagnosis in addition to length of ADT administration, with longer ADT exposure increasing risk predominantly among older men. Men with a greater comorbid burden had an increased risk of developing DM and CVD. Closer monitoring for the development of DM and CVD may be most important among older men receiving prolonged ADT, especially those with other comorbidities.
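A minimal sketch of an age-interaction model of the kind described above, using statsmodels with hypothetical variable names and restricted, for simplicity, to prolonged ADT versus no ADT; this is not the PCOS analysis code.

```python
# Sketch only; file, variable names, and the prolonged-vs-none restriction are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

pcos = pd.read_csv("pcos_dm_cohort.csv")  # assumed cohort without baseline DM

fit = smf.logit("incident_dm ~ prolonged_adt * age_dx + C(race) + C(stage) + comorbidity_count",
                data=pcos).fit(disp=False)

# OR for prolonged ADT vs no ADT at a given age = exp(b_adt + b_interaction * age)
b = fit.params
for age in (74, 80):
    print(age, round(np.exp(b["prolonged_adt"] + b["prolonged_adt:age_dx"] * age), 2))
```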


Stroke ◽  
2021 ◽  
Vol 52 (4) ◽  
pp. 1322-1329
Author(s):  
Ivã Taiuan Fialho Silva ◽  
Pedro Assis Lopes ◽  
Tiago Timotio Almeida ◽  
Saint Clair Ramos ◽  
Ana Teresa Caliman Fontes ◽  
...  

Background and Purpose: Delirium is an acute and fluctuating impairment of attention, cognition, and behavior. Although common in stroke, studies that associate the clinical subtypes of delirium with functional outcome and death are lacking. We aimed to evaluate the influence of delirium occurrence and its different motor subtypes on stroke patients’ prognosis. Methods: Prospective cohort of stroke patients with symptom onset within 72 hours before study admission. Delirium was diagnosed with the Confusion Assessment Method for the Intensive Care Unit, and its motor subtypes were defined according to the Richmond Agitation-Sedation Scale. The main outcome was functional dependence or death (modified Rankin Scale score >2) at 90 days, compared between patients with and without delirium and between motor subtypes. Secondary outcomes included a modified Rankin Scale score >2 at 30 days and 90-day mortality. Results: Two hundred twenty-seven patients were enrolled. Delirium occurred in 71 patients (31.3%), the hypoactive subtype being the most frequent (41 subjects, 57.8%). Delirium was associated with an increased risk of functional dependence or death at 30 and 90 days and with higher 90-day mortality. Multivariate analysis showed delirium (odds ratio, 3.28 [95% CI, 1.17–9.22]) to be an independent predictor of a modified Rankin Scale score >2 at 90 days. Conclusions: Delirium is frequent in stroke patients in the acute phase. Its occurrence, particularly the mixed and hypoactive subtypes, seems to predict worse outcomes in this population. To our knowledge, this is the first study to prospectively investigate differences between delirium motor subtypes in functional outcome three months post-stroke. Larger studies are needed to elucidate the relationship between motor subtypes of delirium and functional outcomes in the context of acute stroke.
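A minimal sketch of the multivariate step, using statsmodels with assumed variable names; the adjustment set here is illustrative, not the authors' exact model.

```python
# Sketch only; the file, variable names, and covariates are assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

stroke = pd.read_csv("stroke_delirium_cohort.csv")  # hypothetical cohort extract

# 'delirium' is 0/1 from CAM-ICU; 'mrs_gt2_90d' is 1 if mRS > 2 at 90 days.
fit = smf.logit("mrs_gt2_90d ~ delirium + age + nihss_baseline", data=stroke).fit(disp=False)
print(round(np.exp(fit.params["delirium"]), 2))  # reported OR was 3.28 (95% CI, 1.17-9.22)

# For the subtype comparison, replace 'delirium' with a categorical variable coding
# hypoactive, hyperactive, and mixed subtypes against no delirium as the reference.
```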


2020 ◽  
pp. bjgp20X713981
Author(s):  
Fergus W Hamilton ◽  
Rupert Payne ◽  
David T Arnold

Background: Lymphopenia (a reduced lymphocyte count) during infections such as pneumonia is common and is associated with increased mortality. Little is known about the relationship between lymphocyte count prior to developing infection and mortality risk. Aim: To identify whether patients with lymphopenia who develop pneumonia have an increased risk of death. Design and Setting: A cohort study in the Clinical Practice Research Datalink (CPRD), linked to national death records. This database is representative of the UK population and is extracted from routine records. Methods: Patients aged >50 years with a pneumonia diagnosis were included. We measured the relationship between lymphocyte count and mortality using a time-to-event (multivariable Cox regression) approach, adjusted for age, sex, social factors, and potential causes of lymphopenia. Our primary analysis used the most recent test prior to pneumonia. The primary outcome was 28-day all-cause mortality. Results: 40,909 participants with pneumonia were included from 1998 until 2019, of whom 28,556 had a lymphocyte test prior to pneumonia (median time between test and diagnosis 677 days). When lymphocyte count was categorised (0–1×10⁹/L, 1–2×10⁹/L, 2–3×10⁹/L, >3×10⁹/L, never tested), both 28-day and one-year mortality varied significantly: 14%, 9.2%, 6.5%, 6.1%, and 25% respectively for 28-day mortality, and 41%, 29%, 22%, 20%, and 52% for one-year mortality. In multivariable Cox regression, lower lymphocyte count was consistently associated with an increased hazard of death. Conclusion: Lymphopenia is an independent predictor of mortality in primary care pneumonia. Even a low-normal lymphocyte count (1–2×10⁹/L) is associated with an increase in short- and long-term mortality compared with higher counts.
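A minimal sketch of the categorised-count Cox model, using pandas and lifelines with hypothetical CPRD-style column names; this is not the study code.

```python
# Sketch only; the file and column names are assumptions.
import pandas as pd
from lifelines import CoxPHFitter

cohort = pd.read_csv("cprd_pneumonia.csv")            # assumed extract with linked death records
cohort["lymph_cat"] = pd.cut(cohort["lymph_count"],   # most recent count, in 10^9/L
                             bins=[0, 1, 2, 3, float("inf")],
                             labels=["0-1", "1-2", "2-3", ">3"])
cohort = cohort.dropna(subset=["lymph_cat"])          # keep only patients with a prior test

X = pd.get_dummies(cohort[["days_to_death_or_censor", "died_28d", "age", "male", "lymph_cat"]],
                   columns=["lymph_cat"], drop_first=True, dtype=float)

cph = CoxPHFitter()
cph.fit(X, duration_col="days_to_death_or_censor", event_col="died_28d")
cph.print_summary()  # hazards fall as the count category rises, in line with the reported pattern
```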


2016 ◽  
Vol 6 (1) ◽  
Author(s):  
Yen-Yuan Chen ◽  
Yih-Sharng Chen ◽  
Tzong-Shinn Chu ◽  
Kuan-Han Lin ◽  
Chau-Chung Wu

2011 ◽  
Vol 39 (1) ◽  
pp. 54-59 ◽  
Author(s):  
KALEB MICHAUD ◽  
MONTSERRAT VERA-LLONCH ◽  
GERRY OSTER

Objective: Patients with rheumatoid arthritis (RA) are at increased risk of death. Modern RA therapy has been shown to improve health status, but the relationship of such improvements to mortality risk is unknown. We assessed the relationship between health status and all-cause mortality in patients with RA, using the Health Assessment Questionnaire (HAQ) and the Medical Outcomes Study Short Form-36 questionnaire (SF-36) physical and mental component summary scores (PCS, MCS). Methods: Subjects (n = 10,319) were selected from the National Data Bank for Rheumatic Diseases, a prospective longitudinal observational US study with semiannual assessments of HAQ, PCS, and MCS. Deaths over up to 7 years of follow-up (through 2006) were identified from the US National Death Index. The relationships of HAQ, PCS, and MCS to mortality were assessed using Cox regression models; prediction accuracy was compared using Harrell’s concordance coefficient (C). Results: Over 64,888 patient-years of follow-up, there were 1317 deaths. Poorer baseline health status was associated with greater mortality risk. Adjusting for age, sex, and baseline PCS and MCS, declines in PCS and HAQ were associated with a higher risk of death. HAQ improvement was associated with reduced mortality risk from 6 months through 3 years; a similar relationship was not observed for PCS or MCS improvement. Controlling for baseline values, change in PCS or HAQ did not improve prediction accuracy. Conclusion: The HAQ and the SF-36 PCS are similarly and strongly associated with mortality risk in patients with RA. Change in these measures over time does not appear to add predictive accuracy over baseline levels.
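A minimal sketch of comparing the predictors by concordance, using lifelines with hypothetical column names; this is not the NDB analysis code.

```python
# Sketch only; the file and column names are assumptions (sex coded 0/1 here).
import pandas as pd
from lifelines import CoxPHFitter

ndb = pd.read_csv("ndb_ra_cohort.csv")  # hypothetical extract: follow-up, death, HAQ, PCS, MCS

for predictor in ("haq", "pcs", "mcs"):
    cph = CoxPHFitter()
    cph.fit(ndb[["years_followup", "died", predictor, "age", "sex"]],
            duration_col="years_followup", event_col="died")
    print(predictor, round(cph.concordance_index_, 3))  # Harrell's C for the fitted model
```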


2009 ◽  
Vol 102 (5) ◽  
pp. 750-756 ◽  
Author(s):  
Jukka Montonen ◽  
Ritva Järvinen ◽  
Antti Reunanen ◽  
Paul Knekt

Studies of the beneficial role of fish consumption in the prevention of CVD are not consistent in their findings, particularly those studies that focus on the risk of stroke. The aim of the present study is to investigate the relationship between the consumption of different types of fish and the subsequent incidence of cerebrovascular disease (CVA). We prospectively evaluated the association between consumption of different types of fish and CVA in 3958 men and women aged 40–79 years who were free of heart disease and had participated in a health examination survey from 1967 to 1972. A total of 659 incident cases of CVA occurred during follow-up until the end of 1994. At baseline, a dietary history interview provided data on habitual consumption of fish and other foods over the preceding year. Total fish intake did not predict CVA, but consumption of salted fish was suggestive of an increased risk of intracerebral haemorrhage. The relative risk of intracerebral haemorrhage for the highest tertile of salted fish consumption compared with non-consumers was 1.98 (95% CI 1.02, 3.84; P for trend = 0.06) after adjustment for age, sex, energy intake, smoking, BMI, physical activity, geographic area, occupation, diabetes, use of post-menopausal hormones, serum cholesterol, hypertension, and consumption of butter, vegetables, fruits and berries. The relationship between fish consumption and stroke risk is not straightforward: how the fish is prepared for consumption may play an important role in the association.
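A minimal sketch of a tertile-based analysis of this kind, using pandas and lifelines with assumed column names; relative risks in cohort studies like this are commonly estimated with Cox models, and the specification here is an illustration rather than the authors' code.

```python
# Sketch only; the file and column names are assumptions (sex and covariates numeric).
import pandas as pd
from lifelines import CoxPHFitter

fmc = pd.read_csv("fish_cohort.csv")  # hypothetical baseline dietary-history data

# Tertiles among salted-fish consumers; non-consumers form the reference group (code 0).
consumers = fmc["salted_fish_g_day"] > 0
fmc["salted_tertile"] = 0
fmc.loc[consumers, "salted_tertile"] = pd.qcut(
    fmc.loc[consumers, "salted_fish_g_day"], 3, labels=[1, 2, 3]).astype(int)

cph = CoxPHFitter()
cph.fit(fmc[["years_followup", "ich_event", "salted_tertile", "age", "sex", "energy_kcal"]],
        duration_col="years_followup", event_col="ich_event")
cph.print_summary()  # treating the tertile as a numeric score gives a test for linear trend
```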

