O5D.3 Methods of estimating lifetime occupational exposure in the general population, based on job-exposure matrices

2019 ◽  
Vol 76 (Suppl 1) ◽  
pp. A48.1-A48
Author(s):  
Marie Houot ◽  
Julie Homère ◽  
Hélène Goulard ◽  
Loïc Garras ◽  
Laurène Delabre ◽  
...  

Objectives To estimate the proportion of pathologies attributable to occupational exposure, lifetime occupational exposure prevalence (LOEP) and relative risk are necessary. LOEP estimates are commonly used but are often obtained with different methods. The choice of method and its impact on estimates are rarely discussed in the literature. This study presents and discusses the most widely used means of estimating LOEP and their respective impacts on estimates. Methods A sample of individuals representative of the French population in 2007 was linked with four Matgéné job-exposure matrices: flour dust, cement dust, silica dust and benzene. LOEP and its 95% confidence interval were estimated using five methods: one using the maximum exposure probability during the career (Method 1) and four using individual exposure probabilities, three of which subdivide careers into job-periods (Methods 2–4) and one which subdivides them into job-years (Method 5). To quantify differences between methods, percentages of variation were calculated for the prevalence values of Methods 2 to 5 versus Method 1. Results For each agent, the LOEP estimated from the maximum probability during the career (Method 1) was consistently lower than the prevalences taking account of job-periods or job-years. LOEP under Method 1 for flour dust, cement dust, silica dust and benzene was respectively 4.4%–95% CI [4.0–4.7], 4.3% [3.9–4.6], 6.1% [5.7–6.5] and 3.9% [3.6–4.2]. Percentages of variation ranged from 0% to 25.0% for flour dust, from 11.6% to 55.8% for cement dust, from 11.5% to 49.1% for silica dust and from 0% to 53.8% for benzene. Conclusions The present study provides a description of several LOEP estimation methods in the general population based on job-exposure matrices. It specifies the strengths and weaknesses of each of the five chosen methods. For health monitoring purposes, LOEP should be reported as an interval, with low and high estimates obtained from the different methods using job-periods (Methods 2–4).
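The between-method comparison above reduces to a simple percentage-of-variation calculation: the relative difference between an alternative prevalence estimate and the Method 1 reference. A minimal sketch, using illustrative values rather than the study's data:

```python
def pct_variation(reference: float, estimate: float) -> float:
    """Percentage of variation of an alternative prevalence estimate
    relative to the reference method (here, Method 1)."""
    return (estimate - reference) / reference * 100.0

# Illustrative values only (not taken from the study):
method1_loep = 4.3  # LOEP (%) under Method 1
method3_loep = 4.8  # hypothetical LOEP (%) under a job-period method
variation = pct_variation(method1_loep, method3_loep)  # relative difference in %
```

With these toy numbers the variation is about 11.6%, i.e. the job-period method yields a prevalence 11.6% higher than the maximum-probability method.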

2019 ◽  
Vol 2019 ◽  
pp. 1-7 ◽  
Author(s):  
Theresa Tiffe ◽  
Caroline Morbach ◽  
Viktoria Rücker ◽  
Götz Gelbrich ◽  
Martin Wagner ◽  
...  

Background. Effective antihypertensive treatment depends on patient compliance with prescribed medications. We assessed the impact of beliefs about antihypertensive medication on blood pressure control in a population-based sample treated for hypertension. Methods. We used data from the Characteristics and Course of Heart Failure Stages A-B and Determinants of Progression (STAAB) study investigating 5000 inhabitants aged 30 to 79 years from the general population of Würzburg, Germany. The Beliefs about Medicines Questionnaire German Version (BMQ-D) was administered in a subsample without established cardiovascular disease (CVD) treated for hypertension. We evaluated the association between inadequately controlled hypertension (blood pressure >140/90 mmHg; >140/85 mmHg in diabetics) and reported concerns about, and perceived necessity of, antihypertensive medication. Results. Data from 293 participants (49.5% women, median age 64 years [quartiles 56.0; 69.0]) entered the analysis. Despite medication, half of the participants (49.8%) were above the recommended blood pressure target. Stratified by sex, inadequately controlled hypertension was less frequent in women reporting higher levels of concern (OR 0.36; 95% CI 0.17-0.74), whereas no such association was apparent in men. We found no association for specific-necessity in any model. Conclusion. Beliefs regarding the necessity of prescribed medication did not affect hypertension control. An inverse association between concerns about medication and inadequately controlled hypertension was found for women only. Our findings highlight that medication-related beliefs can constitute a serious barrier to the successful implementation of treatment guidelines, and underline the role of educational interventions that take sex-related differences into account.


Lupus ◽  
2018 ◽  
Vol 27 (8) ◽  
pp. 1247-1258 ◽  
Author(s):  
N McCormick ◽  
C A Marra ◽  
M Sadatsafavi ◽  
J A Aviña-Zubieta

Objective We estimated the incremental (extra) direct medical costs of a population-based cohort of newly diagnosed systemic lupus erythematosus (SLE) for five years before and after diagnosis, and the impact of sex and socioeconomic status (SES) on pre-index costs for SLE. Methods We identified all adults newly diagnosed with SLE over 2001–2010 in British Columbia, Canada, and obtained a sample of non-SLE individuals from the general population, matched on sex, age, and calendar-year of study entry. We captured costs for all outpatient encounters, hospitalisations, and dispensed medications each year. Using generalised linear models, we estimated incremental costs of SLE each year before/after diagnosis (difference in costs between SLE and non-SLE, controlling for covariates). Similar models were used to examine the impact of sex and SES on costs within SLE. Results We included 3632 newly diagnosed SLE patients (86% female, mean age 49.6 ± 15.9) and 18,060 non-SLE individuals. Over the five years leading up to diagnosis, per-person healthcare costs for SLE patients increased year-over-year by 35%, on average, with the biggest increases in the final two years by 39% and 97%, respectively. Per-person all-cause medical costs for SLE the year after diagnosis (Year +1) averaged C$12,019 (2013 Canadian), with 58% from hospitalisations, 24% from outpatient care, and 18% from prescription medications; Year +1 costs for non-SLE averaged C$2412. Following adjustment for age, sex, urban/rural residence, socioeconomic status, and prior year's comorbidity score, SLE was associated with significantly greater hospitalisation, outpatient, and medication costs than non-SLE in each year of study. Altogether, adjusted incremental costs of SLE rose from C$1131 per person in Year –5 (fifth year before diagnosis) to C$2015 (Year –2), C$3473 (Year –1) and C$6474 (Year +1). In Years –2, –1 and +1, SLE patients in the lowest SES group had significantly greater costs than those in the highest SES group.
Unlike the non-SLE cohort, male patients with SLE had higher costs than females. Annual incremental costs of SLE males (vs. SLE females) rose from C$540 per person in Year –2, to C$1385 in Year –1, and C$2288 in Year + 1. Conclusion Even years before diagnosis, SLE patients incur significantly elevated direct medical costs compared with the age- and sex-matched general population, for hospitalisations, outpatient care, and medications.


2019 ◽  
Author(s):  
Lukas Salasevicius ◽  
Giedre Rutkauskiene ◽  
Ieva Vincerzevskiene ◽  
Jelena Rascon

Abstract Background: Pediatric very rare tumors (VRTs) represent a heterogeneous subset of childhood malignancies, with reliable survival rate estimations depending dramatically on each (un)registered case. The current study aimed to evaluate the number of VRTs among Lithuanian children and the change in treatment outcome over the 16 year study period as well as to assess the impact of the registration status on survival estimation. Methods . We performed a population-based retrospective analysis across children below 18 years old diagnosed with VRTs in Lithuania between the years 2000 to 2015. The identified cases were then crosschecked with the Lithuanian Cancer Registry (a population-based epidemiology cancer registry) for the registration and survival status. A five year overall survival (OS 5y ) was calculated using Kaplan-Meier estimation method. Results . Forty-four children affected by VRTs were identified within the defined time frame. Nine of them (20.5%) were not reported to the Lithuanian Cancer Registry at the time of diagnosis. The OS 5y of the entire cohort was 55.8%. The cure rate did not improve over the analyzed time periods – 54.2% in 2000-2007 vs 49.4% in 2008-2015. The OS 5y differed significantly between registered (n=35) and unregistered (n=9) cohorts: 45.1% vs 100%, respectively (p=0.016). The tumor progression was responsible for treatment failure in 95% of cases. Conclusions. The OS 5y of all analyzed children affected by VRT was lower as compared to the other childhood cancers. The survival rate of the unregistered patients was significantly superior that mislead interpretation of treatment outcome. Meticulous registration of VRTs is crucial for correct evaluation of treatment outcome, especially across small countries with fewer numbers of cases.
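The five-year overall survival figures above are Kaplan-Meier estimates: at each observed death time the survival probability is multiplied by (1 − deaths/at-risk). A minimal sketch of the estimator on toy data (not the study's cohort); ties between deaths and censorings at the same time are handled with the full risk set, a simplification relative to standard library implementations:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimator.
    times: follow-up time per patient; events: 1 = death, 0 = censored.
    Returns (event_times, survival_probabilities) at each death time."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, out_t, out_s = 1.0, [], []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(e for tt, e in data if tt == t)
        leaving = sum(1 for tt, _ in data if tt == t)
        if deaths:
            surv *= 1.0 - deaths / at_risk  # multiply by (1 - d/n) at each death time
            out_t.append(t)
            out_s.append(surv)
        at_risk -= leaving  # everyone observed at time t leaves the risk set
        i += leaving
    return out_t, out_s

# Toy example: 5 patients, follow-up in years (1 = death, 0 = censored).
t, s = kaplan_meier([1, 2, 3, 4, 5], [1, 0, 1, 0, 0])
```

Here deaths occur at years 1 and 3, so survival drops to 0.8 and then to 0.8 × 2/3 ≈ 0.533; censored patients only shrink the risk set.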


Blood ◽  
2015 ◽  
Vol 126 (23) ◽  
pp. 2779-2779 ◽  
Author(s):  
Magnus Bjorkholm ◽  
Hannah Bower ◽  
Paul W Dickman ◽  
Paul C Lambert ◽  
Martin Höglund ◽  
...  

Abstract Background Chronic Myeloid Leukemia (CML) is a myeloproliferative neoplasm with an incidence of 1-1.5 cases per 100,000 adults, accounting for ~15-20% of newly diagnosed patients with myeloid leukemia in adults. Treatment for CML has changed dramatically with the introduction of imatinib mesylate (IM), the first tyrosine kinase inhibitor (TKI) targeting the BCR-ABL1 oncoprotein. Previous population-based research (Björkholm et al. JCO, 2011) showed a major improvement in outcome of patients with CML up to 79 years of age diagnosed from 2001 to 2008. The elderly still had poorer outcomes, partly because of a limited use of IM. However, increasing recognition of IM resistance and intolerance has led to the development of additional (second- and third-generation) TKIs, which have demonstrated effectiveness as salvage therapies or alternative first-line treatments. Here we quantify how the life years lost due to a diagnosis of CML have changed between 1973 and 2013, using a measure called the loss in expectation of life (LEL). Methods This population-based study included 3,684 CML patients diagnosed in Sweden between 1973 and 2013; diagnoses were obtained from the Swedish Cancer Registry. The LEL was estimated using flexible parametric models. The LEL is the difference between the life expectancy in the diseased population and that in a matched subset of the general population. This measure has a simple interpretation as the number of life years lost, or the reduction in the life expectancy, due to a diagnosis of cancer. Results The life expectancy increased dramatically between 1990 and 2013 for CML patients of all ages; see figure. Patients in 2013, on average, lose less than 3 life years due to their diagnosis of CML. 
The largest increase in life expectancy, and thus the largest decrease in LEL over time, was seen in younger patients; a diagnosis of CML in 1990 for a 55-year-old male, on average, reduced his life expectancy by approximately 20.6 (95% CI: 20.3-21.1) years, whereas a diagnosis in 2010 in the same male would on average reduce his life expectancy by only 2.6 (95% CI: 1.4-3.8) years. Although the greatest improvements were seen in those diagnosed at a younger age, those diagnosed at 85 years still benefited from better survival over year of diagnosis; a diagnosis of CML in 1990 for an 85-year-old male, on average, reduced his life expectancy by approximately 3.6 (95% CI: 3.5-3.8) years, whereas a diagnosis in 2010 in the same male would on average reduce his life expectancy by only 1.6 (95% CI: 1.0-2.2) years. Conclusions The reduction in life expectancy, or the number of life years lost due to a diagnosis of CML, has decreased greatly over the years. Patients who are diagnosed at a younger age lose dramatically fewer years in the most recent calendar years compared to previous years due to their CML diagnosis. Improvements in survival in the late 1990s were at least as great as those from 2001 in the youngest patients. An increased number of allogeneic stem cell transplantations, the introduction of interferon-alpha, improved supportive care and second-line treatment with IM have all contributed. Less improvement was seen in the older patients, which is probably explained by the relatively slow implementation of IM in this patient group. The impact of second-generation TKIs on long-term survival remains to be determined. Figure 1. Life expectancy of the general population and CML patients aged 55, 65, 75 and 85 years over year of diagnosis, by sex. Disclosures No relevant conflicts of interest to declare.
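As the abstract defines it, the LEL is the difference between the life expectancy of a matched general-population group and that of the patients; numerically, each life expectancy is the area under the corresponding survival curve. A minimal sketch with made-up step survival curves (restricted to a fixed horizon, so this is a restricted-mean approximation, not the study's flexible parametric models):

```python
def area_under_survival(times, surv):
    """Restricted mean survival time: area under a right-continuous
    step survival curve, starting from S(0) = 1 at t = 0."""
    area, prev_t, prev_s = 0.0, 0.0, 1.0
    for t, s in zip(times, surv):
        area += prev_s * (t - prev_t)  # survival is flat until the next drop
        prev_t, prev_s = t, s
    return area

def loss_in_expectation(times, surv_pop, surv_pat):
    """LEL = life expectancy (general population) - life expectancy (patients),
    both computed over the same time grid."""
    return area_under_survival(times, surv_pop) - area_under_survival(times, surv_pat)

# Illustrative curves over a 30-year horizon (values are made up):
years = [10, 20, 30]
pop = [0.95, 0.85, 0.70]  # matched general population
pat = [0.80, 0.55, 0.30]  # patient cohort
lel = loss_in_expectation(years, pop, pat)
```

With these toy curves the population lives 28.0 years on average within the horizon and the patients 23.5, giving an LEL of 4.5 years.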


2006 ◽  
Vol 24 (18_suppl) ◽  
pp. 4508-4508 ◽  
Author(s):  
S. D. Fosså ◽  
J. Chen ◽  
G. M. Dores ◽  
K. A. McGlynn ◽  
S. J. Schonfeld ◽  
...  

4508 Background: Multiple reports address the incidence of second cancer (SC) and long-term morbidity in TCSs, yet few analyses address the impact of non-malignant late sequelae on mortality. Methods: 39,657 one-year TCSs were identified in 14 population-based cancer registries in North America and Europe, with 17,856, 13,084 and 6,298 men followed for 10, 20 and 30 years, respectively. Standardized mortality ratios (SMRs), comparing TCSs to the general population, were calculated for deaths due to all non-cancer causes (n = 2,942) and specific sites. Further, absolute mortality due to TC, non-TC SC and all non-cancer disorders was estimated. Results: The SMR for all non-malignant diseases combined was 0.99 (95% CI: 0.95–1.02), with a significant reduction of deaths due to circulatory diseases (SMR: 0.92, n = 1,117). However, following initial treatment with chemotherapy and radiotherapy, the SMR for circulatory diseases was significantly elevated (SMR: 1.76), with a non-significant 29% excess after chemotherapy alone. Mortality due to digestive diseases was significantly increased (SMR: 1.32, n = 222), including gastric and duodenal ulcers (SMR: 1.52; excess deaths were observed between 10 and 25 years after initial radiotherapy). For the first 20 years after TC diagnosis, deaths due to infection were significantly elevated (SMR: 1.52, n = 211). Absolute mortality due to non-cancer disorders always exceeded that due to SC, and was 15% after 30 years in a TCS diagnosed at age 35, compared with about 11% for SC. Conclusions: Compared with the general population, the overall risk of mortality due to all non-cancer causes combined does not appear to be increased in TCSs. However, they experience excess non-cancer deaths due to infection and digestive diseases, but not circulatory diseases. Additional analytic studies with detailed data on treatment and co-morbidities are required to further evaluate associations with specific causes of death.
No significant financial relationships to disclose.
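The SMRs above are ratios of observed deaths in the cohort to the deaths expected if general-population mortality rates applied to the cohort's person-years in each stratum. A minimal sketch with illustrative numbers (not the study's data):

```python
def smr(observed_deaths: int, person_years_by_stratum, rates_by_stratum) -> float:
    """Standardized mortality ratio: observed / expected deaths.
    Expected deaths = sum over strata of (cohort person-years in stratum
    x general-population death rate in that stratum)."""
    expected = sum(py * rate for py, rate in
                   zip(person_years_by_stratum, rates_by_stratum))
    return observed_deaths / expected

# Illustrative example with two age strata (values are made up):
person_years = [50_000, 30_000]   # cohort follow-up per stratum
pop_rates = [0.002, 0.010]        # population death rates per person-year
value = smr(observed_deaths=380,
            person_years_by_stratum=person_years,
            rates_by_stratum=pop_rates)
```

Here 400 deaths would be expected (100 + 300), so 380 observed deaths give an SMR of 0.95, i.e. mortality slightly below the general population, analogous to the circulatory-disease SMR of 0.92 reported above.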


Author(s):  
Oluwaseun Esan ◽  
Daniella Schlüter ◽  
Rhiannon Phillips ◽  
Rebecca Cosgriff ◽  
Shantini Parajothy ◽  
...  

Objective To estimate the pregnancy rates and outcomes for women with cystic fibrosis (wwCF) in the UK compared to the general population, and to explore the impact of the introduction of disease-modifying treatments on pregnancy rates. Design A population-based cross-sectional study. Setting Electronic records of UK CF Registry Data (~99% of all CF), and conceptions data for England and Wales (E&W). Population All women aged 15-44 years who were pregnant between 2003 and 2017. Methods We calculated 3-yearly crude and age-specific pregnancy rates per 1,000 woman-years (wys), pregnancy rates for wwCF with a G551D mutation before and after Ivacaftor was introduced in 2012, and compared live birth rates. Main outcome measures Crude rates, age-specific fertility, and maternal morbidity. Results The overall pregnancy rate was 23.5 (95% CI 21.9-25.3) per 1,000 wys, approximately 3.4-fold lower than that of E&W women (77.7). This pattern was evident in the age-specific rates, except for those aged 40-44 years, where the difference in rates was much smaller (wwCF 8.2 per 1,000 wys vs. 13.3 in E&W). Live birth rate differences mirrored pregnancy rates (wwCF 17.4 per 1,000 wys vs. 61.4 for E&W women). Following the introduction of Ivacaftor, pregnancy rates in wwCF with G551D increased from 29.5 to 56.9 per 1,000 wys (2012-2014 to 2015-2017). Conclusions Pregnancy rates in wwCF are about a third of the rates in the general population but on the rise following the introduction of Ivacaftor. There is no indication that there is a reduced chance of a live birth in wwCF who become pregnant.
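A crude rate per 1,000 woman-years, as used throughout the abstract above, is simply the event count divided by the total follow-up time, scaled to 1,000. A minimal sketch with illustrative numbers (not the registry's data):

```python
def rate_per_1000_wys(n_events: int, woman_years: float) -> float:
    """Crude event rate per 1,000 woman-years of follow-up."""
    return n_events / woman_years * 1000.0

# Illustrative example: 470 pregnancies observed over 20,000 woman-years.
rate = rate_per_1000_wys(470, 20_000)  # pregnancies per 1,000 woman-years
```

With these toy inputs the rate is 23.5 per 1,000 wys; the same formula applies to live-birth rates by swapping the numerator.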


2020 ◽  
Vol 109 (11) ◽  
pp. 1352-1357 ◽  
Author(s):  
Benedikt Schrage ◽  
Nicole Rübsamen ◽  
Andreas Schulz ◽  
Thomas Münzel ◽  
Norbert Pfeiffer ◽  
...  

Abstract Background Iron deficiency is now accepted as an independent entity beyond anemia. Recently, a new functional definition of iron deficiency was proposed and has demonstrated strong efficacy in randomized cardiovascular clinical trials of intravenous iron supplementation. Here, we characterize the impact of iron deficiency on all-cause mortality in the non-anemic general population based on two distinct definitions. Methods The Gutenberg Health Study is a population-based, prospective, single-center cohort study. A total of 5000 individuals aged 35 to 74 years underwent a baseline visit and a planned follow-up visit at year 5. The tested definitions of iron deficiency were (1) functional iron deficiency: ferritin levels below 100 µg/l, or ferritin levels between 100 and 299 µg/l and transferrin saturation below 20%, and (2) absolute iron deficiency: ferritin below 30 µg/l. Results At baseline, a total of 54.5% of participants showed functional iron deficiency, at a mean hemoglobin of 14.3 g/dl, while the rate of absolute iron deficiency was 11.8%, at a mean hemoglobin level of 13.4 g/dl. At year 5, the proportion of newly diagnosed subjects was 18.5% and 4.8%, respectively. The rate of all-cause mortality was 7.2% (n = 361); median follow-up was 10.1 years. After adjustment for hemoglobin and major cardiovascular risk factors, the hazard ratio (95% confidence interval) of the association of iron deficiency with mortality was 1.3 (1.0–1.6; p = 0.023) for the functional definition, and 1.9 (1.3–2.8; p = 0.002) for absolute iron deficiency. Conclusions Iron deficiency is very common in the apparently healthy general population and independently associated with all-cause mortality in the mid to long term.
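The two definitions in the abstract are simple threshold rules, so they can be expressed directly as predicates; the thresholds below are taken from the abstract, and the function names are ours:

```python
def functional_iron_deficiency(ferritin_ug_l: float, tsat_pct: float) -> bool:
    """Functional definition: ferritin < 100 ug/l, or ferritin
    between 100 and 299 ug/l with transferrin saturation < 20%."""
    return ferritin_ug_l < 100 or (100 <= ferritin_ug_l < 300 and tsat_pct < 20)

def absolute_iron_deficiency(ferritin_ug_l: float) -> bool:
    """Absolute definition: ferritin < 30 ug/l."""
    return ferritin_ug_l < 30

# Example: ferritin 150 ug/l with 15% transferrin saturation meets the
# functional definition but not the absolute one.
meets_functional = functional_iron_deficiency(150, 15)
meets_absolute = absolute_iron_deficiency(150)
```

The functional definition is strictly broader, which is consistent with the much higher baseline prevalence reported for it (54.5% vs 11.8%).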


2016 ◽  
Vol 102 (2) ◽  
pp. 139-144 ◽  
Author(s):  
Amit Assa ◽  
Yael Frenkel-Nir ◽  
Ya'ara Leibovici-Weissman ◽  
Dorit Tzur ◽  
Arnon Afek ◽  
...  

Objectives To investigate the impact of coeliac disease (CD) diagnosis on anthropometric measures in late adolescence and to assess trends in the prevalence of diagnosed CD over time. Design A population-based study. Patients Prior to enlistment, at the age of 17 years, most of the Israeli Jewish population undergoes a general health examination. Subjects' medical diagnoses are entered into a structured database. Interventions The enlistment database was searched thoroughly for CD cases between the years 1988 and 2015. Medical records of 2 001 353 subjects were reviewed. Main outcome measures Anthropometric measures at the age of 17 years. Results Overall, 10 566 CD cases (0.53%) were identified and analysed. Median age at data ascertainment was 17.1 years (IQR 16.9–17.4). Multivariable analysis demonstrated that boys with CD were leaner (body mass index 21.2±3.7 vs 21.7±3.8, p=0.02) while girls with CD were shorter (161.5±6 cm vs 162.1±6 cm, p=0.017) than the general population. The prevalence of diagnosed CD increased from 0.5% to 1.1% over the last 20 years, with a female predominance (0.64% vs 0.46%). CD prevalence was significantly lower in subjects of lower socioeconomic status and in those of African, Asian and former Soviet Union origin. Conclusions Adolescent boys with CD were leaner and girls with CD were shorter compared with the general population. However, the small size of these differences suggests that when CD is diagnosed during childhood, final weight and height are not severely impaired. Our cohort reinforces the observed increase in diagnosed CD.

