What Do Adults in Prince Edward Island Know About Nutrition?

2007 ◽  
Vol 68 (3) ◽  
pp. 123-130 ◽  
Author(s):  
Kathy Gottschall-Pass ◽  
Lauren Reyno ◽  
Debbie MacLellan ◽  
Mark Spidel

Purpose: To assess adults’ knowledge of dietary recommendations, food sources of key nutrients, food choices, and diet-disease relationships. Methods: A previously validated survey, designed to assess nutrition knowledge, was adapted for use in Prince Edward Island and mailed to a random sample of 3,500 adults (aged 18 to 74). Dillman's Total Design Method was followed, and a response rate of 26.4% was achieved. Mean scores and 95% confidence intervals (CIs) were calculated for the overall survey and for each section. Demographic variations were assessed by univariate analysis. Results: Of an overall possible score of 110 points, the mean score with 95% CI was 71.0 (70.1, 71.9). Respondents scored higher on the sections on dietary recommendations, food sources, and food choices than on diet-disease relationships. Demographic differences existed by gender, age, education, and income. Findings suggest that adults have good general knowledge of dietary recommendations but lack knowledge about how to make healthier food choices and about the impact of diet on disease risk. Conclusion: When designing intervention strategies, dietitians should consider targeted messages to provide adults with the information they need to make healthy food choices.
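The summary statistic reported above (a mean of 71.0 with a 95% CI of 70.1 to 71.9) is a standard t-based confidence interval for a mean. A minimal sketch of that calculation, using hypothetical scores rather than the survey's data:

```python
# Hypothetical knowledge scores out of 110; not the study's dataset.
import numpy as np
from scipy import stats

scores = np.array([68, 74, 71, 65, 80, 72, 69, 77, 70, 73])

mean = scores.mean()
sem = stats.sem(scores)  # standard error of the mean
ci_low, ci_high = stats.t.interval(0.95, df=len(scores) - 1, loc=mean, scale=sem)

print(f"mean = {mean:.1f}, 95% CI = ({ci_low:.1f}, {ci_high:.1f})")
```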

2020 ◽  
Vol 79 (OCE2) ◽  
Author(s):  
Daniela Prozorovscaia ◽  
Emma Jacquier ◽  
Florent Dudan ◽  
Suttipong Mungkala ◽  
Hilary Green

Abstract Introduction: Food choice is complex. Digital nutrition applications are emerging to help decisions about food choices. Nestlé's Meal Nutritional Score (MNS) is a number between 0 and 100 that provides a measure of the extent to which a meal meets US dietary recommendations. The objective was to evaluate whether the MNS influences food choices in a workplace restaurant. Materials and Methods: A workplace education campaign ran over two weeks to introduce the MNS to employees. This was done during the lunch break at the entrance to the cafeteria of Nestlé's research centre, which serves around 250 hot meals at lunchtime every day. Each day, employees chose among a typical Western meal, a healthy meal, and a vegetarian meal. During the campaign, and for three weeks afterwards, LED screens displayed the MNS for the three meal types every day. Employees’ voluntary feedback on the MNS was collected using a closed-ended questionnaire. Descriptive analyses were done for the scores and sales of each type of meal for one week before the education campaign, immediately after the campaign, and three weeks later. Data are reported as mean ± 1 standard deviation. Results: Feedback was obtained from 152 employees, of whom 96% said the MNS helped them to understand the nutritional balance of the meals, and 38% said the MNS influenced their meal choices. The MNS scores pre-campaign, post-campaign, and 3 weeks later were 52 ± 14, 50 ± 16, and 56 ± 11 for the Western meal; 54 ± 14, 62 ± 6, and 67 ± 6 for the healthy meal; and 64 ± 11, 57 ± 14, and 57 ± 12 for the vegetarian meal, respectively. The percentages of sales pre-campaign, post-campaign, and 3 weeks later were 48 ± 10%, 43 ± 9%, and 33 ± 9% for the Western meal; 25 ± 10%, 30 ± 8%, and 36 ± 10% for the healthy meal; and 27 ± 4%, 27 ± 4%, and 30 ± 8% for the vegetarian meal, respectively. Discussion: The number of people selecting the healthy meal, which usually had the best score, increased during the three weeks following the education campaign, suggesting that the MNS positively influences food choices. The MNS may also help chefs to design more nutritionally balanced meals. Longer-term follow-up is necessary to evaluate whether these behaviour changes are sustained, as well as to test the impact of the MNS in a different workplace environment.
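The descriptive analysis described (mean ± SD of daily scores and sales per period and per meal type) amounts to a grouped aggregation. A minimal sketch, where the DataFrame contents are hypothetical illustrative values rather than the campaign's data:

```python
# Hypothetical daily records: one row per meal type per day.
import pandas as pd

daily = pd.DataFrame({
    "period":    ["pre", "pre", "post", "post", "week3", "week3"],
    "meal_type": ["western", "healthy"] * 3,
    "mns_score": [52, 54, 50, 62, 56, 67],   # hypothetical daily MNS values
    "sales_pct": [48, 25, 43, 30, 33, 36],   # hypothetical % of daily sales
})

# Mean +/- SD of score and sales share, per period and meal type.
summary = daily.groupby(["period", "meal_type"]).agg(["mean", "std"])
print(summary)
```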


2019 ◽  
Vol 7 (3_suppl) ◽  
pp. 2325967119S0018
Author(s):  
Neeraj M. Patel ◽  
Christopher R. Gajewski ◽  
Anthony M. Ascoli ◽  
J. Todd Lawrence

Background: The use of a washer to supplement screw fixation can prevent fragmentation and penetration during the surgical treatment of medial epicondyle fractures. However, concerns may arise regarding screw prominence and the need for subsequent implant removal. The purpose of this study is to evaluate the impact of washer utilization on the need for hardware removal and elbow range of motion (ROM). Methods: All surgically treated pediatric medial epicondyle fractures over a 7-year period were queried for this retrospective case-control study. Patients were included only if their fracture was fixed with a single screw, with or without a washer. Per institutional protocol, implants were not routinely removed after fracture healing. Hardware removal was performed only if the patient experienced a complication or implant-related symptoms that were refractory to non-operative management. Full ROM was considered flexion beyond 130 degrees and less than a 10-degree loss of extension. Univariate analysis was followed by creation of Kaplan-Meier (one minus survival) curves to analyze the time until full ROM was regained after surgery. Curves for patients with and without a washer were compared with a log-rank test. Results: Among the 137 patients included, the mean age was 12.2±2.3 years, and 85 (62%) were male. A total of 31 (23%) patients ultimately underwent hardware removal. A washer was utilized in 90 (66%) cases overall. There was not an increased need for subsequent implant removal in these patients compared with those who underwent screw fixation alone (p=0.11). The mean BMI of patients who underwent hardware removal (19.1±2.5) was similar to that of children who did not (20.4±3.5, p=0.06). In a subgroup analysis of 102 athletes, there was similarly no difference in the rate of implant removal when a washer was used (p=0.64). Overall, 107 (78%) patients regained full ROM at a mean of 13.9±9.7 weeks after surgery (Figure 1). There was no statistically significant difference in the proportions of patients with and without a washer who achieved full ROM (p=0.46). Full ROM was achieved at a mean of 14.1±11.0 weeks in those with a washer, compared with 13.6±6.2 weeks in those without one (p=0.21). Conclusions: Use of a washer did not affect the need for subsequent implant removal or elbow ROM after fixation of pediatric medial epicondyle fractures, even in thinner patients or competitive athletes. If there is concern for fracture fragmentation or penetration, a washer can be included without concern that future unplanned surgeries may be required.
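A minimal sketch of the time-to-full-ROM analysis named above: Kaplan-Meier "one minus survival" (cumulative incidence) curves compared with a log-rank test, using the lifelines package. The durations and event flags below are hypothetical, not the study cohort:

```python
# Hypothetical weeks until full ROM (event=1) or last follow-up (event=0).
import numpy as np
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

weeks_washer = np.array([10, 14, 9, 22, 30])
event_washer = np.array([1, 1, 1, 1, 0])
weeks_nowasher = np.array([12, 13, 8, 18, 25])
event_nowasher = np.array([1, 1, 1, 0, 1])

kmf = KaplanMeierFitter()
kmf.fit(weeks_washer, event_washer, label="washer")
cumulative_incidence = 1 - kmf.survival_function_  # "one minus survival"

# Log-rank comparison between the washer and no-washer groups.
result = logrank_test(weeks_washer, weeks_nowasher,
                      event_observed_A=event_washer,
                      event_observed_B=event_nowasher)
print(cumulative_incidence.tail())
print(f"log-rank p = {result.p_value:.2f}")
```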


Children ◽  
2021 ◽  
Vol 8 (9) ◽  
pp. 800
Author(s):  
Pilar Alfageme-García ◽  
Julián Fernando Calderón-García ◽  
Alfonso Martínez-Nova ◽  
Sonia Hidalgo-Ruiz ◽  
Belinda Basilio-Fernández ◽  
...  

Background: Schoolchildren often spend long periods carrying a backpack with school equipment, which can be very heavy. The impact a backpack may have on the pronated feet of schoolchildren is unknown. Aims: The objective of this study was to evaluate the association of backpack use with static foot posture in schoolchildren with a pronated foot posture over 36 months of follow-up. Methods: This observational longitudinal prospective study was based on a cohort of consecutive healthy schoolchildren with pronated feet from fifteen different schools in Plasencia (Spain). The following parameters were collected and measured in all children included in the study: sex, age, height, weight, body mass index, metatarsal formula, foot shape, type of shoes, and type of schoolbag (non-backpack or backpack). Static foot posture was determined by means of the foot posture index (FPI). The FPI was assessed again after 36 months. Results: A total of 112 participants used a backpack when going to school. Over the 36-month follow-up period, 76 schoolchildren who had a static pronated foot posture developed a neutral foot posture. Univariate analysis showed that schoolchildren using backpacks were at greater risk of not developing a neutral foot posture (odds ratio [OR]: 2.09; 95% CI: 1.08–4.09). The multivariate analysis provided similar results, in which schoolchildren using a backpack (adjusted OR [aOR]: 1.94; 95% CI: 1.02–3.82) had a significantly greater risk of not developing a neutral foot posture. Conclusions: A weak relationship was found between backpack use and schoolchildren aged five to eleven years with static pronated feet not developing a neutral foot posture over a follow-up period of 36 months.
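The odds ratios above come from logistic regression of the binary outcome (not developing a neutral foot posture) on backpack use. A minimal sketch with statsmodels, on simulated data; the variable names and effect size are hypothetical, not the study cohort:

```python
# Univariate logistic regression: odds ratio for backpack use.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({"backpack": rng.integers(0, 2, 200)})  # 1 = uses a backpack

# Hypothetical outcome: backpack users more often remain pronated.
p = 0.3 + 0.2 * df["backpack"]
df["stayed_pronated"] = (rng.random(200) < p).astype(int)

model = smf.logit("stayed_pronated ~ backpack", data=df).fit(disp=0)
odds_ratio = np.exp(model.params["backpack"])
ci = np.exp(model.conf_int().loc["backpack"])
print(f"OR = {odds_ratio:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

A multivariate version would simply add the adjustment covariates (age, BMI, and so on) to the right-hand side of the formula.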


2017 ◽  
Vol 30 (2) ◽  
pp. 149-190 ◽  
Author(s):  
Alison M. Stephen ◽  
Martine M.-J. Champ ◽  
Susan J. Cloran ◽  
Mathilde Fleith ◽  
Lilou van Lieshout ◽  
...  

Abstract Research into the analysis, physical properties, and health effects of dietary fibre has continued steadily over the last 40–50 years. From the knowledge gained, countries have developed guidelines for their populations on the optimal amount of fibre to be consumed each day. Food composition tables from many countries now contain values for the dietary fibre content of foods, and, from these, combined with dietary surveys, population intakes have been determined. The present review assessed the uniformity of the analytical methods used, the health claims permitted, and the recommendations and intakes, particularly from national surveys across Europe and around the world. It also assessed current knowledge of the health effects of dietary fibre and the impact of different fibre types on health. The overall intent was to provide more detailed guidance on the types of fibre that should be consumed for good health, rather than simply a total intake figure, as is currently the case. Analysis of the data indicated a fair degree of uniformity in the definition of dietary fibre, the method used for analysis, and the recommended amount to be consumed, as well as a growing literature on effects on digestive health and disease risk. However, national dietary survey data showed that intakes do not reach recommendations, and very few countries provide guidance on the types of fibre that are preferable for achieving recommended intakes. Research gaps were identified, and ideas are suggested for providing the public with more detailed advice about the specific food sources that should be consumed to achieve health benefits.


2017 ◽  
Vol 18 (2) ◽  
pp. 428-444 ◽  
Author(s):  
Sanjay Dhamija ◽  
Ravinder Kumar Arora

The study examines how quality certification of initial public offerings (IPOs), arising from the lead manager’s reputation, grading by credit rating agencies, the presence of anchor investors, and the reputation of auditors, affects the level of IPO underpricing. The mean initial excess return, which measures the level of IPO underpricing, is 22 per cent, based on a sample of 399 IPOs made by Indian companies during the period from April 2005 to March 2015. Contrary to expectations, nearly 37 per cent of the IPOs do not provide a positive initial excess return. Univariate analysis reveals that, except for IPO grading, the quality certification variables do not have a significant impact on the level of underpricing. Graded issues are more fairly priced than non-graded issues. The Securities and Exchange Board of India (SEBI), the capital market regulator, has recently done away with mandatory grading of IPOs. As graded issues have been observed to improve pricing efficiency, SEBI should reconsider its decision and reintroduce compulsory IPO grading. Multivariate analysis, which includes other variables such as issue size, level of subscription, and promoters’ holding, reveals that the two variables with a significant influence on initial excess returns from IPOs are the issue size and the level of oversubscription of the IPO.
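A minimal sketch of the underpricing measure and the multivariate step described above: compute the initial excess return (here simplified to the listing-day return over the offer price, omitting the market-index adjustment) and regress it on issue size and oversubscription. All values and names are hypothetical, not the 399-IPO sample:

```python
# Initial excess return (IER) and a simple OLS on its drivers.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 100
df = pd.DataFrame({
    "offer_price": rng.uniform(50, 500, n),
    "oversubscription": rng.uniform(1, 40, n),  # times subscribed
    "issue_size": rng.uniform(1, 50, n),        # hypothetical units
})
# Hypothetical listing-day price: underpricing rises with oversubscription.
df["listing_price"] = df["offer_price"] * (
    1 + 0.01 * df["oversubscription"] + rng.normal(0, 0.1, n))

df["ier"] = (df["listing_price"] - df["offer_price"]) / df["offer_price"]

ols = smf.ols("ier ~ np.log(issue_size) + oversubscription", data=df).fit()
print(ols.summary().tables[1])
```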


2021 ◽  
pp. 019459982110045
Author(s):  
Alana Aylward ◽  
Morganne Murphy-Meyers ◽  
Chelsea McCarty Allen ◽  
Neil S. Patel ◽  
Richard K. Gurgel

Objective: To examine the relationship among frailty index, hearing measures, and hearing-related quality of life (QOL) in older recipients of cochlear implants. Study Design: Cross-sectional survey. Setting: Academic medical center. Methods: Adults aged ≥65 years at the time of receiving cochlear implants between July 13, 2000, and April 3, 2019, were asked to complete a questionnaire on hearing-related QOL. Chart review was performed to identify patients’ characteristics. Correlations were calculated between frailty index and audiologic outcome measures as well as between speech recognition scores and QOL scores. Linear regression models were developed to examine the impact of clinical characteristics, frailty index, and hearing measures on hearing-related QOL. Results: Data for 143 respondents were included. The mean age was 80.7 years (SD, 7.1), with a mean 27.8 years of hearing loss (SD, 17.4) before implantation. The mean frailty index was 11.1 (SD, 10.6), indicating that patients had 1 or 2 of the measured comorbidities on average. No correlation was found between lower frailty index (better health) and hearing scores, including pure tone averages (PTAs) and speech recognition scores. Lower frailty index and larger improvement in PTA after cochlear implantation predicted better QOL scores on univariate analysis (respectively, P = .002, β = −0.42 [95% CI, −0.68 to −0.16]; P = .008, β = −0.15 [95% CI, −0.26 to −0.04]) and multivariate analysis (P = .047, β = −0.28 [95% CI, −0.55 to −0.01]; P = .006, β = −0.16 [95% CI, −0.28 to −0.05]). No speech recognition scores correlated with QOL after cochlear implantation. Conclusions: Frailty index does not correlate with hearing scores after cochlear implantation in older adults. Lower frailty index and more improvement in PTA predict better QOL scores after cochlear implantation in older adults.
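A minimal sketch of the two analysis steps named above: a Pearson correlation between frailty index and a hearing measure, followed by a linear model for QOL. The arrays are hypothetical illustrative data, not the 143-respondent dataset:

```python
# Correlation and linear regression for frailty, hearing, and QOL.
import numpy as np
from scipy import stats
import statsmodels.api as sm

rng = np.random.default_rng(2)
frailty = rng.uniform(0, 40, 143)           # hypothetical frailty index
pta_improvement = rng.normal(30, 10, 143)   # hypothetical dB improvement in PTA
qol = 70 - 0.3 * frailty + 0.2 * pta_improvement + rng.normal(0, 5, 143)

r, p = stats.pearsonr(frailty, pta_improvement)
print(f"frailty vs PTA improvement: r = {r:.2f}, p = {p:.2f}")

# Multivariate linear model: QOL on frailty and PTA improvement.
X = sm.add_constant(np.column_stack([frailty, pta_improvement]))
fit = sm.OLS(qol, X).fit()
print(fit.params)  # intercept, frailty beta, PTA-improvement beta
```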


2019 ◽  
Author(s):  
Georgina Milne ◽  
Adrian Allen ◽  
Jordon Graham ◽  
Angela Lahuerta-Marin ◽  
Carl McCormick ◽  
...  

Background: Despite rigorous controls placed on herds that disclose ante-mortem test-positive cattle for bovine tuberculosis (bTB), caused by infection with Mycobacterium bovis, many herds in Northern Ireland (NI) experience prolonged breakdowns. These herds represent a considerable administrative and financial burden to the State and farming community. Methods: A retrospective observational study was conducted to better understand the factors associated with breakdown duration, which was modelled using both negative binomial and ordinal regression approaches. Six explanatory variables were important predictors of breakdown length in both models: herd size, the number of reactors testing positive in the initial SICCT test, the presence of a lesioned animal at routine slaughter (LRS), the count of M. bovis genotypes during the breakdown (MLVA richness), the local herd-level bTB prevalence, and the presence of herds linked via management factors (associated herds). Results: We report that between 2008 and 2014, mean breakdown duration in NI was 226 days (approximately seven months; median: 188 days). In the same period, however, more than 6% of herds in the region remained under movement restriction for more than 420 days (13 months), almost twice as long as the mean. The MLVA richness variable was a particularly important predictor of breakdown duration. We contend that this variable primarily represents a proxy for beef fattening herds, which can operate by purchasing cattle and selling animals straight to slaughter, despite prolonged trading restrictions. For other herd types, the model supports the hypothesis that prolonged breakdowns are a function of both residual infection within the herd and infection from the environment (e.g. infected wildlife, contiguous herds, and/or a contaminated environment). The impact of badger density on breakdown duration was assessed by including data on main sett (burrow) density. Whilst a positive association was observed in the univariate analysis, confounding with other variables means that the contribution of badgers to prolonged breakdowns was not clear from our study. We do not fully reject the hypothesis that badgers are implicated in prolonging bTB breakdowns via spillback infection, but given our results, we posit that increased disease risk from badgers is unlikely to simply be a function of increasing badger density measured using sett metrics.
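A minimal sketch of the first duration model named above: negative binomial regression of breakdown length (in days) on herd-level predictors, using statsmodels. The data and coefficients are hypothetical, not the NI herd dataset:

```python
# Negative binomial regression for an overdispersed count outcome.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 500
df = pd.DataFrame({
    "herd_size": rng.integers(20, 400, n),
    "initial_reactors": rng.poisson(2, n),
    "mlva_richness": rng.integers(1, 4, n),
})
# Hypothetical mean duration rising with each predictor.
mu = np.exp(4.8 + 0.001 * df["herd_size"] + 0.05 * df["initial_reactors"]
            + 0.15 * df["mlva_richness"])
df["breakdown_days"] = rng.negative_binomial(n=5, p=(5 / (5 + mu)).to_numpy())

nb = smf.negativebinomial(
    "breakdown_days ~ herd_size + initial_reactors + mlva_richness",
    data=df).fit(disp=0)
print(np.exp(nb.params))  # rate ratios per unit increase in each predictor
```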


2020 ◽  
pp. 1-10
Author(s):  
Eliseu Verly-Jr ◽  
Alessandra da Silva Pereira ◽  
Emanuele Souza Marques ◽  
Paula Martins Horta ◽  
Daniela Silva Canella ◽  
...  

Abstract The aim was to design culturally acceptable and healthy diets with a reduced energetic share of ultra-processed foods (UPF%) at no cost increment, and to evaluate the impact of the change in the UPF% on diet quality. Food consumption and price data were obtained from the Household Budget Survey (n 55 970 households) and the National Dietary Survey (n 32 749 individuals). Linear programming models were used to design diets in which the mean population UPF% was reduced up to 5 % with no cost increment relative to the observed costs. The models were either isoenergetic or allowed the energy content to vary with the UPF%, and they either were not constrained to nutritional goals (nutrient-free models) or maximised compliance with dietary recommendations (nutrient-constrained models). Constraints regarding food preference were introduced into the models to obtain culturally acceptable diets. The mean population UPF% was 23·8 %. The lowest UPF% attained was approximately 10 %. The optimised diet cost was up to 20 % cheaper than the observed cost, depending on the model and the income level. In the optimised diets of the nutrient-constrained models, the reduction in the UPF% was accompanied by an increase in fruits, vegetables, beans, tubers, dairy products, nuts, fibre, K, Mg, vitamin A, and vitamin C, compared with the observed consumption in the population. There was little variation in most nutrients across the UPF% reduction. The UPF% reduction in the nutrient-free models affected only trans-fat and added-sugar content. Reducing the UPF% while increasing diet quality is possible at no cost increment.
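A minimal sketch of the linear-programming approach described above: minimise the ultra-processed share of energy subject to a cost ceiling and a fixed energy total, via scipy's linprog. The three foods, their nutrient values, and the bounds standing in for "acceptability" constraints are all hypothetical, not the survey data:

```python
# Diet optimisation: minimise UPF energy at fixed total energy and capped cost.
import numpy as np
from scipy.optimize import linprog

# Decision variables: grams/day of [beans, vegetables, packaged snack].
kcal_per_g = np.array([1.3, 0.3, 5.0])
cost_per_g = np.array([0.006, 0.008, 0.012])  # currency units per gram
upf_kcal_per_g = np.array([0.0, 0.0, 5.0])    # only the snack is ultra-processed

target_kcal, max_cost = 2000.0, 15.0

res = linprog(
    c=upf_kcal_per_g,                       # objective: minimise UPF energy
    A_ub=[cost_per_g], b_ub=[max_cost],     # cost must not exceed the ceiling
    A_eq=[kcal_per_g], b_eq=[target_kcal],  # keep total energy fixed
    bounds=[(50, 1500)] * 3,                # crude "acceptability" bounds
    method="highs",
)
print(res.x, "grams/day; UPF share =", res.fun / target_kcal * 100, "%")
```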


PeerJ ◽  
2020 ◽  
Vol 8 ◽  
pp. e8319 ◽  
Author(s):  
Georgina Milne ◽  
Adrian Allen ◽  
Jordon Graham ◽  
Angela Lahuerta-Marin ◽  
Carl McCormick ◽  
...  

Background: Despite rigorous controls placed on herds that disclose ante-mortem test-positive cattle for bovine tuberculosis (bTB), caused by infection with Mycobacterium bovis, many herds in Northern Ireland (NI) experience prolonged breakdowns. These herds represent a considerable administrative and financial burden to the State and farming community. Methods: A retrospective observational study was conducted to better understand the factors associated with breakdown duration, which was modelled using both negative binomial and ordinal regression approaches. Results: Six explanatory variables were important predictors of breakdown length in both models: herd size, the number of reactors testing positive in the initial SICCT test, the presence of a lesioned animal at routine slaughter (LRS), the count of M. bovis genotypes during the breakdown (MLVA richness), the local herd-level bTB prevalence, and the presence of herds linked via management factors (associated herds). We report that between 2008 and 2014, mean breakdown duration in NI was 226 days (approximately seven months; median: 188 days). In the same period, however, more than 6% of herds in the region remained under movement restriction for more than 420 days (13 months), almost twice as long as the mean. The MLVA richness variable was a particularly important predictor of breakdown duration. We contend that this variable primarily represents a proxy for beef fattening herds, which can operate by purchasing cattle and selling animals straight to slaughter, despite prolonged trading restrictions. For other herd types, the model supports the hypothesis that prolonged breakdowns are a function of both residual infection within the herd and infection from the environment (e.g. infected wildlife, contiguous herds, and/or a contaminated environment). The impact of badger density on breakdown duration was assessed by including data on main sett (burrow) density. Whilst a positive association was observed in the univariate analysis, confounding with other variables means that the contribution of badgers to prolonged breakdowns was not clear from our study. We do not fully reject the hypothesis that badgers are implicated in prolonging bTB breakdowns via spillback infection, but given our results, we posit that increased disease risk from badgers is unlikely to simply be a function of increasing badger density measured using sett metrics.
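This published version names the same two duration models; as a complement to the negative binomial sketch after the preprint record above, here is a hypothetical proportional-odds (ordinal) version, with breakdown duration binned into ordered bands and fitted with statsmodels' OrderedModel. The data, bins, and coefficients are illustrative assumptions only:

```python
# Ordinal (proportional-odds) regression of binned breakdown duration.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(4)
n = 500
X = pd.DataFrame({
    "herd_size": rng.integers(20, 400, n),
    "mlva_richness": rng.integers(1, 4, n),
})
# Hypothetical latent duration driving the ordered outcome bands.
latent = 0.004 * X["herd_size"] + 0.4 * X["mlva_richness"] + rng.logistic(size=n)
duration_band = pd.cut(latent, bins=[-np.inf, 1.0, 2.0, np.inf],
                       labels=["short", "medium", "long"], ordered=True)

mod = OrderedModel(duration_band, X, distr="logit").fit(method="bfgs", disp=0)
print(np.exp(mod.params.iloc[:2]))  # odds ratios for herd_size, mlva_richness
```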


2018 ◽  
Vol 148 (4) ◽  
pp. 573-580 ◽  
Author(s):  
Didier Brassard ◽  
Benoît J Arsenault ◽  
Marjorie Boyer ◽  
Daniela Bernic ◽  
Maude Tessier-Grenier ◽  
...  

Abstract Background: Recent evidence suggests that the association between dietary saturated fatty acids (SFAs) and coronary artery disease risk varies according to food sources. How SFAs from butter and cheese influence HDL-mediated cholesterol efflux capacity (CEC), a key process in reverse cholesterol transport, is currently unknown. Objective: In a predefined secondary analysis of a previously published trial, we have examined how diets rich in SFAs from either cheese or butter influence HDL-mediated CEC, compared with diets rich in either monounsaturated fatty acids (MUFAs) or polyunsaturated fatty acids (PUFAs). Methods: In a randomized crossover controlled consumption trial, 46 men and women with abdominal obesity consumed 5 isocaloric diets, each for 4 wk. Two diets were rich in SFAs either from cheese (CHEESE) or butter (BUTTER) [12.4–12.6% of energy (%E) as SFAs, 32%E as fat, 52%E as carbohydrates]. In 2 other diets, SFAs (5.8%E) were replaced with either MUFAs from refined olive oil (MUFA) or PUFAs from corn oil (PUFA). Finally, a lower-fat, higher-carbohydrate diet was used as a control (5.8%E as SFAs, 25.0%E as fat, 59%E as carbohydrates; CHO). Post-diet HDL-mediated CEC was determined ex vivo using radiolabelled J774 macrophages incubated with apolipoprotein B–depleted serum from the participants. Results: Mean (±SD) age was 41.4 ± 14.2 y, and waist circumference was 107.6 ± 11.5 cm in men and 94.3 ± 12.4 cm in women. BUTTER and MUFA increased HDL-mediated CEC compared with CHEESE (+4.3%, P = 0.026 and +4.7%, P = 0.031, respectively). Exploring the significant diet × sex interaction (P = 0.044) revealed that the increase in HDL-mediated CEC after BUTTER compared with CHEESE was significant among men (+6.0%, P = 0.047) but not women (+2.9%, P = 0.19), whereas the increase after MUFA compared with CHEESE was significant among women (+9.1%, P = 0.008) but not men (–0.6%, P = 0.99). Conclusion: These results provide evidence of a food matrix effect modulating the impact of dairy SFAs on HDL-mediated CEC, with potential sex-related differences that deserve further investigation. This trial was registered at clinicaltrials.gov as NCT02106208.
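A minimal sketch of the diet × sex interaction test highlighted above, as a linear model with an interaction term. A proper analysis of a crossover trial would model repeated measures per participant; this sketch only illustrates how the interaction term is specified, on hypothetical data:

```python
# Linear model with a diet x sex interaction on efflux capacity (CEC).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 92  # hypothetical layout: 46 participants x 2 contrasted diets
df = pd.DataFrame({
    "diet": np.tile(["CHEESE", "BUTTER"], n // 2),
    "sex":  np.repeat(["M", "F"], n // 2),
})
# Hypothetical effect: BUTTER raises efflux more in men than in women.
bump = (df["diet"] == "BUTTER") * np.where(df["sex"] == "M", 6.0, 3.0)
df["cec"] = 100 + bump + rng.normal(0, 4, n)

fit = smf.ols("cec ~ diet * sex", data=df).fit()
print(fit.summary().tables[1])  # includes the diet:sex interaction term
```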

