VALIDITY OF COMMUNITY-BASED FRAILTY CHECK-UP BY SENIOR VOLUNTEERS FOR PREDICTING ADVERSE HEALTH OUTCOMES

2019 ◽  
Vol 3 (Supplement_1) ◽  
pp. S680-S681
Author(s):  
Tomoki Tanaka ◽  
Kyo Takahashi ◽  
Masahiro Akishita ◽  
Katsuya Iijima

Abstract Aim: Multi-faceted frailty is a serious obstacle to achieving healthy aging for all in super-aged societies such as Japan. We developed a community-based frailty check-up program performed by trained senior volunteers. In this study, we aimed to validate the ability of the check-up results to predict the need for long-term support or care insurance, or death, in a community-dwelling older population. Methods: A total of 1,536 older adults (mean age, 73.0±6.1 years; 74% women; not eligible for long-term support or care) participated in check-ups held from April 2015 to March 2018 in Kashiwa City, Japan. At the check-up site, 21 items covering nutrition, oral and physical function, and social conditions were assessed. The outcome was new certification for long-term support or care insurance, or death, from the day of the check-up until October 2018. Results: During follow-up (median 678 days; inter-quartile range, 199-1263), 116 participants (7.6%) newly required long-term support (n=50) or care (n=49), or died (n=18). A higher number of positive responses among the 21 items was associated with a decreased risk of the outcome (age- and sex-adjusted hazard ratio 0.87, 95% confidence interval 0.81-0.92). Compared with those with >18 positive responses (third tertile), individuals with <14 positive responses (first tertile) had a markedly increased risk of the outcome (age- and sex-adjusted hazard ratio 2.44, 95% confidence interval 1.22-4.49). Conclusions: A community-based frailty check-up program could predict the need for long-term support or care insurance, or death, in a community-dwelling older population. Appropriate intervention for individuals with poor check-up results might contribute to the early prevention of multi-faceted frailty.
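The prediction analysis described above is, in essence, an age- and sex-adjusted Cox proportional hazards model with the check-up score (the count of positive responses among the 21 items) as the exposure, reported both per additional positive response and as a first-versus-third tertile contrast. A minimal sketch of that kind of model is given below using the Python lifelines library; the file name and column names (days, event, age, female, n_positive) are hypothetical, and this is an illustration of the technique rather than the authors' analysis code.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical cohort data: one row per participant.
# days       = follow-up from check-up to certification/death or censoring
# event      = 1 if new long-term support/care certification or death, else 0
# n_positive = number of positive responses among the 21 check-up items
df = pd.read_csv("checkup_cohort.csv")

# Continuous exposure: HR per additional positive response, adjusted for age and sex
cph = CoxPHFitter()
cph.fit(df[["days", "event", "age", "female", "n_positive"]],
        duration_col="days", event_col="event")
cph.print_summary()

# Tertile contrast using the cut points quoted in the abstract (<14 vs >18 responses)
df["tertile_low"] = (df["n_positive"] < 14).astype(int)
df["tertile_mid"] = df["n_positive"].between(14, 18).astype(int)  # reference: >18
cph_tertile = CoxPHFitter()
cph_tertile.fit(df[["days", "event", "age", "female", "tertile_low", "tertile_mid"]],
                duration_col="days", event_col="event")
print(cph_tertile.hazard_ratios_)  # tertile_low corresponds to first vs third tertile
```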

2021 ◽  
Vol 25 (71) ◽  
pp. 1-174
Author(s):  
Jonathan Bedford ◽  
Laura Drikite ◽  
Mark Corbett ◽  
James Doidge ◽  
Paloma Ferrando-Vivas ◽  
...  

Background New-onset atrial fibrillation occurs in around 10% of adults treated in an intensive care unit. New-onset atrial fibrillation may lead to cardiovascular instability and thromboembolism, and has been independently associated with increased length of hospital stay and mortality. The long-term consequences are unclear. Current practice guidance is based on patients outside the intensive care unit; however, new-onset atrial fibrillation that develops while in an intensive care unit differs in its causes and the risks and clinical effectiveness of treatments. The lack of evidence on new-onset atrial fibrillation treatment or long-term outcomes in intensive care units means that practice varies. Identifying optimal treatment strategies and defining long-term outcomes are critical to improving care. Objectives In patients treated in an intensive care unit, the objectives were to (1) evaluate existing evidence for the clinical effectiveness and safety of pharmacological and non-pharmacological new-onset atrial fibrillation treatments, (2) compare the use and clinical effectiveness of pharmacological and non-pharmacological new-onset atrial fibrillation treatments, and (3) determine outcomes associated with new-onset atrial fibrillation. Methods We undertook a scoping review that included studies of interventions for treatment or prevention of new-onset atrial fibrillation involving adults in general intensive care units. To investigate the long-term outcomes associated with new-onset atrial fibrillation, we carried out a retrospective cohort study using English national intensive care audit data linked to national hospital episode and outcome data. To analyse the clinical effectiveness of different new-onset atrial fibrillation treatments, we undertook a retrospective cohort study of two large intensive care unit databases in the USA and the UK. Results Existing evidence was generally of low quality, with limited data suggesting that beta-blockers might be more effective than amiodarone for converting new-onset atrial fibrillation to sinus rhythm and for reducing mortality. Using linked audit data, we showed that patients developing new-onset atrial fibrillation have more comorbidities than those who do not. After controlling for these differences, patients with new-onset atrial fibrillation had substantially higher mortality in hospital and during the first 90 days after discharge (adjusted odds ratio 2.32, 95% confidence interval 2.16 to 2.48; adjusted hazard ratio 1.46, 95% confidence interval 1.26 to 1.70, respectively), and higher rates of subsequent hospitalisation with atrial fibrillation, stroke and heart failure (adjusted cause-specific hazard ratio 5.86, 95% confidence interval 5.33 to 6.44; adjusted cause-specific hazard ratio 1.47, 95% confidence interval 1.12 to 1.93; and adjusted cause-specific hazard ratio 1.28, 95% confidence interval 1.14 to 1.44, respectively), than patients who did not have new-onset atrial fibrillation. From intensive care unit data, we found that new-onset atrial fibrillation occurred in 952 out of 8367 (11.4%) UK and 1065 out of 18,559 (5.7%) US intensive care unit patients in our study. The median time to onset of new-onset atrial fibrillation in patients who received treatment was 40 hours, with a median duration of 14.4 hours. The clinical characteristics of patients developing new-onset atrial fibrillation were similar in both databases. 
New-onset atrial fibrillation was associated with significant average reductions in systolic blood pressure of 5 mmHg, despite significant increases in vasoactive medication (vasoactive-inotropic score increase of 2.3; p < 0.001). After adjustment, intravenous beta-blockers were not more effective than amiodarone in achieving rate control (adjusted hazard ratio 1.14, 95% confidence interval 0.91 to 1.44) or rhythm control (adjusted hazard ratio 0.86, 95% confidence interval 0.67 to 1.11). Digoxin therapy was associated with a lower probability of achieving rate control (adjusted hazard ratio 0.52, 95% confidence interval 0.32 to 0.86) and calcium channel blocker therapy was associated with a lower probability of achieving rhythm control (adjusted hazard ratio 0.56, 95% confidence interval 0.39 to 0.79) than amiodarone. Findings were consistent across both the combined and the individual database analyses. Conclusions Existing evidence for new-onset atrial fibrillation management in intensive care unit patients is limited. New-onset atrial fibrillation in these patients is common and is associated with significant short- and long-term complications. Beta-blockers and amiodarone appear to be similarly effective in achieving cardiovascular control, but digoxin and calcium channel blockers appear to be inferior. Future work Our findings suggest that a randomised controlled trial of amiodarone and beta-blockers for management of new-onset atrial fibrillation in critically ill patients should be undertaken. Studies should also be undertaken to provide evidence for or against anticoagulation for patients who develop new-onset atrial fibrillation in intensive care units. Finally, given that readmission with heart failure and thromboembolism increases following an episode of new-onset atrial fibrillation while in an intensive care unit, a prospective cohort study to demonstrate the incidence of atrial fibrillation and/or left ventricular dysfunction at hospital discharge and at 3 months following the development of new-onset atrial fibrillation should be undertaken. Trial registration Current Controlled Trials ISRCTN13252515. Funding This project was funded by the National Institute for Health Research (NIHR) Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 25, No. 71. See the NIHR Journals Library website for further project information.
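The subsequent-hospitalisation results are reported as adjusted cause-specific hazard ratios. A standard way to estimate these is to fit one Cox model per event type while censoring competing events (such as death without the event of interest) at the time they occur. The sketch below illustrates that approach with the lifelines library; the file and column names (time, event_type, noaf for the new-onset atrial fibrillation indicator, and the adjustment covariates) are hypothetical stand-ins for the study's much richer linked dataset.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("icu_linked_cohort.csv")      # assumed file
covars = ["noaf", "age", "severity_score"]     # 'noaf' = exposure; others illustrative

def cause_specific_hr(data, cause):
    """Cause-specific Cox model: competing event types are treated as censoring."""
    d = data.copy()
    d["event"] = (d["event_type"] == cause).astype(int)
    cph = CoxPHFitter()
    cph.fit(d[["time", "event"] + covars], duration_col="time", event_col="event")
    return cph.hazard_ratios_["noaf"]

for cause in ["af_hospitalisation", "stroke", "heart_failure"]:
    print(cause, round(cause_specific_hr(df, cause), 2))
```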


2007 ◽  
Vol 106 (6) ◽  
pp. 1088-1095 ◽  
Author(s):  
Elisabeth Mahla ◽  
Anneliese Baumann ◽  
Peter Rehak ◽  
Norbert Watzinger ◽  
Martin N. Vicenzi ◽  
...  

Background Preoperative N-terminal pro-BNP (NT-proBNP) is independently associated with adverse cardiac outcome but does not anticipate the dynamic consequences of anesthesia and surgery. The authors hypothesized that a single postoperative NT-proBNP level provides additional prognostic information for in-hospital and late cardiac events. Methods Two hundred eighteen patients scheduled to undergo vascular surgery were enrolled and followed up for 24-30 months. Logistic regression and Cox proportional hazards models were used to evaluate predictors of in-hospital and long-term cardiac outcome. The optimal discriminatory level of preoperative and postoperative NT-proBNP was determined by receiver operating characteristic analysis. Results During a median follow-up of 826 days, 44 patients (20%) experienced 51 cardiac events. Perioperatively, median NT-proBNP increased from 215 to 557 pg/ml (interquartile range, 83-457 to 221-1178 pg/ml; P<0.001). The optimum discriminate threshold for preoperative and postoperative NT-proBNP was 280 pg/ml (95% confidence interval, 123-400) and 860 pg/ml (95% confidence interval, 556-1,054), respectively. Adjusted for age, previous myocardial infarction, preoperative fibrinogen, creatinine, high-sensitivity C-reactive protein, type, duration, and surgical complications, only postoperative NT-proBNP remained significantly associated with in-hospital (adjusted hazard ratio, 19.8; 95% confidence interval, 3.4-115) and long-term cardiac outcome (adjusted hazard ratio, 4.88; 95% confidence interval, 2.43-9.81). Conclusion A single postoperative NT-proBNP determination provides important additional prognostic information to preoperative levels and may support therapeutic decisions to prevent subsequent structural myocardial damage.
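The "optimum discriminate threshold" for NT-proBNP comes from receiver operating characteristic analysis; the abstract does not state which cut-point criterion was used, but a common choice is Youden's J statistic (sensitivity + specificity − 1). The sketch below shows that criterion with scikit-learn on synthetic stand-in data, so the numbers it prints are not the study's values.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

# Synthetic stand-in data: postoperative NT-proBNP (pg/ml) and cardiac event indicator
rng = np.random.default_rng(0)
cardiac_event = rng.integers(0, 2, size=218)
nt_probnp_postop = np.where(cardiac_event == 1,
                            rng.lognormal(mean=6.8, sigma=0.8, size=218),
                            rng.lognormal(mean=6.1, sigma=0.8, size=218))

fpr, tpr, thresholds = roc_curve(cardiac_event, nt_probnp_postop)
best = np.argmax(tpr - fpr)                      # Youden's J statistic
print("AUC:", round(roc_auc_score(cardiac_event, nt_probnp_postop), 3))
print("Optimal threshold (pg/ml):", round(thresholds[best], 1))
```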


BMJ ◽  
2020 ◽  
pp. m2533 ◽  
Author(s):  
Casey Crump ◽  
Jan Sundquist ◽  
Kristina Sundquist

Abstract Objectives To examine the long term mortality associated with preterm delivery in a large population based cohort of women, and to assess for potential confounding by shared familial factors. Design National cohort study. Setting Sweden. Participants All 2 189 477 women with a singleton delivery in 1973-2015. Main outcome measures All cause and cause specific mortality up to 2016, identified from nationwide death records. Cox regression was used to calculate hazard ratios while adjusting for confounders, and co-sibling analyses assessed the potential influence of unmeasured shared familial (genetic and environmental) factors. Results In 50.7 million person years of follow-up, 76 535 (3.5%) women died (median age at death was 57.6). In the 10 years after delivery, the adjusted hazard ratio for all cause mortality associated with preterm delivery (<37 weeks) was 1.73 (95% confidence interval 1.61 to 1.87), and when further stratified was 2.20 (1.63 to 2.96) for extremely preterm delivery (22-27 weeks), 2.28 (2.01 to 2.58) for very preterm delivery (28-33 weeks), 1.52 (1.39 to 1.67) for late preterm delivery (34-36 weeks), and 1.19 (1.12 to 1.27) for early term delivery (37-38 weeks) compared with full term delivery (39-41 weeks). These risks declined but remained significantly raised after longer follow-up times: for preterm versus full term births, 10-19 years after delivery, the adjusted hazard ratio was 1.45 (95% confidence interval 1.37 to 1.53); 20-44 years after delivery, the adjusted hazard ratio was 1.37 (1.33 to 1.41). These findings did not seem to be attributable to shared genetic or environmental factors within families. Several causes were identified, including cardiovascular and respiratory disorders, diabetes, and cancer. Conclusions In this large national cohort of women, the findings suggested that preterm and early term delivery were independent risk factors for premature mortality from several major causes. These associations declined over time but remained raised up to 40 years later.
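The co-sibling analysis mentioned above is often implemented as a Cox model stratified on a sibship identifier, so that each set of sisters contributes its own baseline hazard and the preterm-versus-full-term comparison is made within families; this controls for unmeasured factors the sisters share. A minimal sketch with lifelines follows; the file name, covariates and the sibship_id column are hypothetical and do not reproduce the paper's full adjustment set or its time-window analyses.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("delivery_cohort.csv")   # assumed file
# Hypothetical columns:
# time, died          = follow-up (years) and all-cause death indicator
# preterm             = 1 if the woman had a preterm (<37 weeks) delivery, else 0
# birth_year, parity  = illustrative measured confounders
# sibship_id          = identifier shared by sisters

cph = CoxPHFitter()
cph.fit(df[["time", "died", "preterm", "birth_year", "parity", "sibship_id"]],
        duration_col="time", event_col="died",
        strata=["sibship_id"])            # within-family (co-sibling) comparison
cph.print_summary()
```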


2021 ◽  
Vol 42 (Supplement_1) ◽  
Author(s):  
S.J Kiddle ◽  
A Abdul-Sultan ◽  
K Andersson Sundell ◽  
S Nolan ◽  
S Perl ◽  
...  

Abstract Background There is a strong association between hyperuricemia (elevated serum uric acid) and the risk of heart failure. However, it remains unclear whether prescribing urate lowering therapy has any bearing on long-term clinical outcomes. Purpose In this study, we assessed the impact of urate lowering therapy on the risk of adverse health outcomes (hospitalisation for heart failure and all-cause mortality) in patients with hyperuricemia and heart failure. Methods We utilised data from Clinical Practice Research Datalink (CPRD) GOLD, a UK-based primary care database linked to secondary care (Hospital Episode Statistics) and mortality data (Office for National Statistics). The study population included patients with a first record of hyperuricemia (serum uric acid >7 mg/dl for men and >6 mg/dl for women, or a gout diagnosis) between 1990 and 2019 and a history of heart failure. Incident urate lowering therapy users were identified after the hyperuricemia diagnosis. To account for potential confounding variables and potential changes in treatment paradigms over the study period, a propensity score matched cohort was constructed for urate lowering therapy initiators and non-initiators within 6 month accrual blocks. Adverse health outcomes were compared between matched treatment groups using Cox regression analysis adjusted for the same variables used in the propensity score. Because of extensive treatment switching and discontinuation, the on-treatment analysis was the main analysis. Results A total of 2,174 propensity score matched pairs were identified. We found that urate lowering therapy was associated with a 43% lower risk of all-cause mortality or hospitalization for heart failure (Figure 1; adjusted hazard ratio 0.57, 95% confidence interval 0.51–0.65), and a 19% lower risk of cardiovascular mortality or hospitalization for heart failure (Figure 2; adjusted hazard ratio 0.81, 95% confidence interval 0.71–0.92), within five years compared with those not on therapy (on-treatment analysis). In an intention-to-treat sensitivity analysis, urate lowering therapy was associated with a 17% lower risk of all-cause mortality or hospitalization for heart failure (adjusted hazard ratio 0.83, 95% confidence interval 0.76–0.91), and an 11% lower risk of cardiovascular mortality or hospitalization for heart failure (adjusted hazard ratio 0.89, 95% confidence interval 0.81–0.98), within five years compared with those not on urate lowering therapy. Adjusted and non-adjusted hazard ratios were consistent for all outcomes. Conclusion We found that urate lowering therapy was associated with a lower risk of adverse outcomes in patients with hyperuricemia or gout and a history of heart failure. These results are consistent with the hypothesis that uric acid lowering may lead to improved outcomes in patients with heart failure and hyperuricemia, emphasizing the need to investigate the potential benefits of intensive uric acid lowering in prospective randomized controlled trials. Funding Acknowledgement Type of funding sources: Private company. Main funding source(s): AstraZeneca. Figure 1: HF = heart failure. Figure 2: CV = cardiovascular.
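The design pairs propensity score matching of urate lowering therapy initiators and non-initiators with a Cox model on the matched cohort. The sketch below shows the bare bones of that workflow in Python (a logistic regression propensity score, 1:1 nearest-neighbour matching, then a Cox model); it is deliberately simplified and omits the 6 month accrual blocks, calipers, matching without replacement and on-treatment censoring described in the abstract, and all file and column names (ult for the treatment indicator, the covariate list, time, event) are hypothetical.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors
from lifelines import CoxPHFitter

df = pd.read_csv("cprd_hyperuricemia_hf.csv")          # assumed file
covars = ["age", "sex", "egfr", "diuretic_use"]        # illustrative confounders, assumed numeric

# 1. Propensity score: probability of initiating urate lowering therapy (ult = 0/1)
ps = LogisticRegression(max_iter=1000).fit(df[covars], df["ult"])
df["ps"] = ps.predict_proba(df[covars])[:, 1]

# 2. 1:1 nearest-neighbour matching on the propensity score (with replacement, for brevity)
treated, control = df[df["ult"] == 1], df[df["ult"] == 0]
nn = NearestNeighbors(n_neighbors=1).fit(control[["ps"]])
_, idx = nn.kneighbors(treated[["ps"]])
matched = pd.concat([treated, control.iloc[idx.ravel()]])

# 3. Cox model on the matched cohort, adjusted for the same covariates
cph = CoxPHFitter()
cph.fit(matched[["time", "event", "ult"] + covars],
        duration_col="time", event_col="event")
cph.print_summary()
```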


2010 ◽  
Vol 21 (2) ◽  
pp. 197-203 ◽  
Author(s):  
Morten Olsen ◽  
Vibeke E. Hjortdal ◽  
Laust H. Mortensen ◽  
Thomas D. Christensen ◽  
Henrik T. Sørensen ◽  
...  

Abstract Background Congenital heart defect patients may experience neurodevelopmental impairment. We investigated their educational attainments from basic schooling to higher education. Patients and methods Using administrative databases, we identified all Danish patients with a cardiac defect diagnosis born from 1 January, 1977 to 1 January, 1991 and alive at age 13 years. As a comparison cohort, we randomly sampled 10 persons per patient. We obtained information on educational attainment from Denmark's Database for Labour Market Research. The study population was followed until achievement of educational levels, death, emigration, or 1 January, 2006. We estimated the hazard ratio of attaining given educational levels, conditional on completing preceding levels, using discrete-time Cox regression and adjusting for socio-economic factors. Analyses were repeated for a sub-cohort of patients and controls born at term and without extracardiac defects or chromosomal anomalies. Results We identified 2986 patients. Their probability of completing compulsory basic schooling was approximately 10% lower than that of control individuals (adjusted hazard ratio = 0.79; 95% confidence interval: 0.75–0.82). Their subsequent probability of completing secondary school was lower than that of the controls, both for all patients (adjusted hazard ratio = 0.74; 95% confidence interval: 0.69–0.80) and for the sub-cohort (adjusted hazard ratio = 0.80; 95% confidence interval: 0.73–0.86). The probability of attaining a higher degree, conditional on completion of youth education, was affected both for all patients (adjusted hazard ratio = 0.88; 95% confidence interval: 0.76–1.01) and for the sub-cohort (adjusted hazard ratio = 0.92; 95% confidence interval: 0.79–1.07). Conclusion The probability of educational attainment was reduced among long-term congenital heart defect survivors.
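Discrete-time Cox (proportional hazards) regression of the kind used here is commonly fitted as a binary regression on a person-period dataset: one row per person per time interval, with an indicator for whether the educational level was attained in that interval. The sketch below uses a plain logistic link via statsmodels for simplicity; a complementary log-log link would give coefficients that map more directly onto hazard ratios. The file, column names and covariates are hypothetical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("education_person_period.csv")   # assumed: one row per person per year

# Hypothetical columns:
# attained (0/1)  = completed the educational level in that interval
# period          = the discrete time interval (e.g. age in years)
# chd (0/1)       = congenital heart defect cohort vs comparison cohort
# parental_edu, parental_income = illustrative socio-economic adjustment variables

model = smf.logit("attained ~ chd + C(period) + parental_edu + parental_income",
                  data=df).fit()
print(model.summary())
print("Approximate HR, patients vs comparison cohort:", np.exp(model.params["chd"]))
```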


2018 ◽  
Vol 8 (2) ◽  
pp. 153-160 ◽  
Author(s):  
Patrick Sulzgruber ◽  
Sebastian Schnaubelt ◽  
Lorenz Koller ◽  
Georg Goliasch ◽  
Jan Niederdöckl ◽  
...  

Background: The development of cardiac arrhythmias resulting in cardiac arrest represents a severe complication in patients with acute myocardial infarction. While the worsening of prognosis in this vulnerable patient collective is well known, less attention has been paid to its age-specific relevance from a long-term perspective. Methods: Based on a clinical acute myocardial infarction registry, we analysed 832 patients with acute myocardial infarction. Patients were stratified into equal groups (n=208 per group) according to age (less than 45 years, 45–64 years, 65–84 years, and 85 years and older) via propensity score matching. Multivariate Cox regression analysis was used to assess the age-dependent influence of cardiac arrest on mortality. Results: The total number of cardiac arrests differed significantly between age groups, with the highest incidence in the youngest population at 18.8% (n=39) and a significantly lower incidence with increasing age (−11.6%; P=0.01). After a mean follow-up time of 8 years, a total of 264 patients (31.7%) died due to cardiovascular causes. While cardiac arrest was a strong and independent predictor of mortality in the total study population, with an adjusted hazard ratio of 3.21 (95% confidence interval 2.23–4.61; P<0.001), there was no significant independent association with mortality in very young patients (<45 years; adjusted hazard ratio 1.73, 95% confidence interval 0.55–5.53; P=0.35). Conclusion: We found that arrhythmias resulting in cardiac arrest are more common in very young acute myocardial infarction patients (<45 years) than in their older counterparts, and we were able to demonstrate that the prognostic value of cardiac arrest for long-term mortality in patients with acute myocardial infarction is clearly age dependent.
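The age-dependence of the prognostic value of cardiac arrest can be examined by fitting the adjusted Cox model separately within each propensity-matched age stratum, which is one common way to produce subgroup estimates such as those quoted above. A minimal sketch with lifelines follows; the file name, age_group labels and adjustment covariates are hypothetical.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("ami_registry_matched.csv")     # assumed propensity-matched cohort
covars = ["cardiac_arrest", "diabetes", "lvef"]  # exposure plus illustrative covariates

# Age-group-specific Cox models for cardiovascular mortality
for group, sub in df.groupby("age_group"):       # e.g. '<45', '45-64', '65-84', '>=85'
    cph = CoxPHFitter()
    cph.fit(sub[["years", "cv_death"] + covars],
            duration_col="years", event_col="cv_death")
    print(group, "adjusted HR for cardiac arrest:",
          round(cph.hazard_ratios_["cardiac_arrest"], 2))
```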


Author(s):  
Chisato Hayashi ◽  
Soshiro Ogata ◽  
Tadashi Okano ◽  
Hiromitsu Toyoda ◽  
Sonoe Mashino

Abstract Background The effects of group exercise on the physical function of community-dwelling older adults remain unclear. The changes over time in lower extremity muscle strength, timed up and go (TUG) time, and motor fitness scale (MFS) score among older adults who expressed a willingness to participate in community-based physical exercise groups were determined using multilevel modelling. Methods We analyzed data from 2407 older adults collected between April 2010 and December 2019 from the registry of physical tests of community-based physical exercise groups. We conducted a retrospective cohort study to assess the effect of physical exercise on lower extremity muscle strength, TUG time, and MFS score. The duration of exercise participation was evaluated by the frequency of participation in the physical tests. Results A deterioration in lower extremity muscle strength was found in the short-term participation group only; in the mid-term and long-term participation groups, lower extremity muscle strength showed a trend of improvement. TUG time and MFS score were negatively correlated with increasing age in the groups divided by the duration of participation, but the rate of deterioration was slower in the long-term participation group. Discussion Lower extremity muscle strength, TUG time, and MFS scores decline with increasing age, and the slope of deterioration differed depending on the duration of participation in community-based group exercise. Conclusion Participation in group exercise improved the lower extremity muscle strength, TUG time, and MFS scores of older adults living in a community. The positive effects of group exercise were dependent on long-term participation.
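The multilevel modelling referred to above handles repeated physical-test measurements nested within participants. A random-intercept linear mixed model with an age-by-participation-duration interaction is one plausible specification for the TUG outcome; the statsmodels sketch below is illustrative only, and the file and column names (participant_id, age_at_test, tug_time, duration_group) are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("exercise_group_tests.csv")   # assumed: one row per test occasion

# Hypothetical long-format columns:
# participant_id, age_at_test, tug_time (seconds),
# duration_group = short/mid/long-term participation

# Random intercept per participant; the interaction term lets the age slope of
# TUG time differ by participation duration, as described in the results
model = smf.mixedlm("tug_time ~ age_at_test * C(duration_group)",
                    data=df, groups=df["participant_id"]).fit()
print(model.summary())
```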


Stroke ◽  
2016 ◽  
Vol 47 (suppl_1) ◽  
Author(s):  
Yilong Wang ◽  
Xiaomeng Yang ◽  
Jing Jing ◽  
Xingquan Zhao ◽  
Liping Liu ◽  
...  

Objective: We aim to investigate the effects and safety of clopidogrel plus aspirin in patients with different types of single small subcortical infarction (SSSI) in the Clopidogrel in High-risk patients with Acute Non-disabling Cerebrovascular Events (CHANCE) trial. Methods: In this subgroup analysis, SSSI was defined as a single DWI lesion of ≤2.0 cm, and SSSI with stenosis of any degree of the parent artery was regarded as SSSI+PAD. We assessed the interaction of the treatment effects of clopidogrel plus aspirin versus aspirin alone between patients with and without PAD. Efficacy was assessed by intention-to-treat analysis and safety was assessed in the on-treatment population. Results: A total of 338 patients with SSSI were included in the final analysis, 105 with SSSI+PAD and 233 with SSSI-PAD. Among SSSI+PAD patients, 10.9% (5/46) had recurrent stroke in the clopidogrel-aspirin group compared with 13.6% (8/59) in the aspirin group (adjusted hazard ratio, 0.66; 95% confidence interval, 0.20-2.20; P=0.50). Among SSSI-PAD patients, 8.9% (11/124) had recurrent stroke in the clopidogrel-aspirin group compared with 7.3% (8/109) in the aspirin group (adjusted hazard ratio, 1.64; 95% confidence interval, 0.61-4.38; P=0.32). The number of bleeding events was similar between the clopidogrel-aspirin group and the aspirin group regardless of SSSI+PAD or SSSI-PAD status. Conclusions: Although dual antiplatelet therapy did not significantly reduce the risk of recurrent stroke compared with aspirin alone in patients with SSSI overall, it was potentially beneficial to patients with SSSI+PAD. Dual antiplatelet treatment did not increase the risk of bleeding in patients with any kind of SSSI.
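The stated aim of assessing the interaction between treatment effect and parent artery disease status is usually implemented by adding a treatment-by-subgroup product term to the Cox model; its coefficient tests whether the effect of clopidogrel plus aspirin differs between SSSI+PAD and SSSI-PAD patients. A minimal sketch with lifelines is below; the file and column names are hypothetical and the adjustment covariates of the original analysis are omitted.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("chance_sssi_subset.csv")   # assumed subset of the 338 SSSI patients

# Hypothetical columns: time (days), recurrent_stroke (0/1),
# dual_therapy (clopidogrel plus aspirin = 1), pad (parent artery disease = 1)
df["dual_x_pad"] = df["dual_therapy"] * df["pad"]

cph = CoxPHFitter()
cph.fit(df[["time", "recurrent_stroke", "dual_therapy", "pad", "dual_x_pad"]],
        duration_col="time", event_col="recurrent_stroke")
cph.print_summary()   # the dual_x_pad term is the treatment-by-subgroup interaction test
```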


2019 ◽  
Author(s):  
Nicolai A Lund-Blix ◽  
German Tapia ◽  
Karl Mårild ◽  
Anne Lise Brantsaeter ◽  
Pål R Njølstad ◽  
...  

Abstract Objective To examine the association between maternal and child gluten intake and risk of type 1 diabetes in children. Design Pregnancy cohort. Setting Population-based, nation-wide study in Norway. Participants 86,306 children in The Norwegian Mother and Child Cohort Study born from 1999 through 2009, followed to April 15, 2018. Main outcome measures Clinical type 1 diabetes, ascertained in a nation-wide childhood diabetes registry. Hazard ratios were estimated using Cox regression for the exposures maternal gluten intake up to week 22 of pregnancy and the child's gluten intake when the child was 18 months old. Results During a mean follow-up of 12.3 years (range 0.7-16.0), 346 children (0.4%) developed type 1 diabetes (incidence rate 32.6 per 100,000 person-years). The average gluten intake was 13.6 grams/day for mothers during pregnancy and 8.8 grams/day for the child at 18 months of age. Maternal gluten intake in mid-pregnancy was not associated with the development of type 1 diabetes in the child (adjusted hazard ratio 1.02 (95% confidence interval 0.73 to 1.43) per 10 grams/day increase in gluten intake). However, the child's gluten intake at 18 months of age was associated with an increased risk of later developing type 1 diabetes (adjusted hazard ratio 1.46 (95% confidence interval 1.06 to 2.01) per 10 grams/day increase in gluten intake). Conclusions This study suggests that the child's gluten intake at 18 months of age, and not the maternal intake during pregnancy, could increase the risk of type 1 diabetes in the child. What is already known on this topic A national prospective cohort study from Denmark found that a high maternal gluten intake during pregnancy could increase the risk of type 1 diabetes in the offspring (adjusted hazard ratio 1.31 (95% confidence interval 1.001 to 1.72) per 10 grams/day increase in gluten intake). No studies have investigated the relation between the amount of gluten intake by both the mother during pregnancy and the child in early life and the risk of developing type 1 diabetes in childhood. What this study adds In this prospective population-based pregnancy cohort of 86,306 children, of whom 346 developed type 1 diabetes, we found that the child's gluten intake at 18 months of age was associated with the risk of type 1 diabetes (adjusted hazard ratio 1.46 (95% confidence interval 1.06 to 2.01) per 10 grams/day increase in gluten intake). This study suggests that the child's gluten intake at 18 months of age, and not the maternal intake during pregnancy, could increase the child's risk of type 1 diabetes.
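The hazard ratios above are expressed per 10 grams/day of gluten, which simply means the continuous exposure is divided by 10 before entering the Cox model. The sketch below illustrates that rescaling with lifelines; the file name, covariates and outcome coding are hypothetical and do not reflect the study's actual adjustment set.

```python
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("gluten_cohort.csv")    # assumed file

# Rescale so the coefficient is a hazard ratio per 10 grams/day, as reported
df["child_gluten_per10"] = df["child_gluten_gday"] / 10.0

cph = CoxPHFitter()
cph.fit(df[["followup_years", "t1d", "child_gluten_per10",
            "maternal_age", "celiac_in_family"]],
        duration_col="followup_years", event_col="t1d")
cph.print_summary()   # HR per 10 g/day increase in the child's gluten intake
```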


BMJ ◽  
2018 ◽  
pp. k3547 ◽  
Author(s):  
Julie C Antvorskov ◽  
Thorhallur I Halldorsson ◽  
Knud Josefsen ◽  
Jannet Svensson ◽  
Charlotta Granström ◽  
...  

Abstract Objective To examine the association between prenatal gluten exposure and offspring risk of type 1 diabetes in humans. Design National prospective cohort study. Setting National health information registries in Denmark. Participants Pregnant Danish women enrolled into the Danish National Birth Cohort between January 1996 and October 2002. Main outcome measures Maternal gluten intake, based on maternal consumption of gluten containing foods, was reported in a 360 item food frequency questionnaire at week 25 of pregnancy. Information on type 1 diabetes occurrence in the participants' children, from 1 January 1996 to 31 May 2016, was obtained through registry linkage to the Danish Registry of Childhood and Adolescent Diabetes. Results The study comprised 101 042 pregnancies in 91 745 women, of whom 70 188 filled out the food frequency questionnaire. After excluding multiple pregnancies, pregnancies ending in abortion or stillbirth, pregnancies lacking information, and pregnancies with implausibly high or low energy intake, 67 565 pregnancies (63 529 women) were included. The average gluten intake was 13.0 g/day, ranging from less than 7 g/day to more than 20 g/day. The incidence of type 1 diabetes among children in the cohort was 0.37% (n=247), with a mean follow-up period of 15.6 years (standard deviation 1.4). The risk of type 1 diabetes in offspring increased proportionally with maternal gluten intake during pregnancy (adjusted hazard ratio 1.31 (95% confidence interval 1.001 to 1.72) per 10 g/day increase of gluten). Women with the highest gluten intake versus those with the lowest gluten intake (≥20 v <7 g/day) had double the risk of type 1 diabetes development in their offspring (adjusted hazard ratio 2.00 (95% confidence interval 1.02 to 4.00)). Conclusions High gluten intake by mothers during pregnancy could increase the risk of their children developing type 1 diabetes. However, confirmation of these findings is warranted, preferably in an intervention setting.

