How does a simplified recipe collection procedure in dietary assessment tools affect the food group and nutrient intake distributions of the population?

2020 ◽  
Vol 124 (2) ◽  
pp. 189-198 ◽  
Author(s):  
Liangzi Zhang ◽  
Hendriek Boshuizen ◽  
Marga Ocké

Abstract. Technology advancements have driven the use of self-administered dietary assessment methods in large-scale dietary surveys. Interviewer-assisted methods generally have a complicated recipe-recording procedure that enables adjustment of a standard recipe. In order to decide whether this functionality can be omitted from self-administered dietary assessment, this study aimed to assess the extent of standard recipe modifications in the Dutch National Food Consumption Survey and to measure the impact on the food group and nutrient intake distributions of the population when the modifications were disregarded. A two-scenario simulation analysis was conducted. First, the individual recipe scenario omitted the full modifications to the standard recipes made by people who knew their recipes. Second, the modified recipe scenario omitted the modifications made by those who partially modified the standard recipe owing to their limited knowledge. The weighted percentage differences for the nutrient and food group intake distributions between the scenarios and the original data set were calculated. The highest percentage of energy consumed through mixed dishes was 10 % for females aged 19–79 years. Comparing the combined scenario and the original data set, the average of the absolute percentage difference for the population mean intakes was 1·6 % across all food groups and 0·6 % for nutrients. The soup group (−6·6 %) and DHA (−2·3 %) showed the largest percentage differences. The recipe simplification caused a slight underestimation of the consumed amount of both foods (−0·2 %) and nutrients (−0·4 %). These results are promising for developing self-administered 24-hour recalls or food diary applications without a complex recipe function.
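The scenario comparison rests on a simple statistic: the percentage difference of each population mean intake between the simplified-recipe scenario and the original data set, averaged in absolute value across food groups. A minimal sketch with invented intake means (the survey's sampling weights are omitted here):

```python
# Invented population mean intakes (g/day), not the survey data.
original = {"soup": 45.0, "bread": 120.0, "dairy": 210.0}
scenario = {"soup": 42.0, "bread": 119.5, "dairy": 209.0}

# Percentage difference per food group, scenario vs. original.
pct_diff = {g: 100.0 * (scenario[g] - original[g]) / original[g] for g in original}
# Average absolute percentage difference across groups.
mean_abs_diff = sum(abs(v) for v in pct_diff.values()) / len(pct_diff)

print({g: round(v, 2) for g, v in pct_diff.items()})
print(round(mean_abs_diff, 2))
```

A negative percentage indicates that disregarding recipe modifications underestimates the group's intake, as reported for the soup group above.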

2014 ◽  
Vol 18 (11) ◽  
pp. 1922-1931 ◽  
Author(s):  
Marc A Mason ◽  
Marie Fanelli Kuczmarski ◽  
Deanne Allegro ◽  
Alan B Zonderman ◽  
Michele K Evans

Abstract. Objective: Analysing dietary data to capture how individuals typically consume foods is dependent on the coding variables used. Individual foods consumed simultaneously, like coffee with milk, are given codes to identify these combinations. Our literature review revealed a lack of discussion about using combination codes in analysis. The present study identified foods consumed at mealtimes and by race when combination codes were or were not utilized. Design: Duplicate analysis methods were performed on separate data sets. The original data set consisted of all foods reported; each food was coded as if it was consumed individually. The revised data set was derived from the original data set by first isolating coded foods consumed as individual items from those foods consumed simultaneously and assigning a code to designate a combination. Foods assigned a combination code, like pancakes with syrup, were aggregated and associated with a food group, defined by the major food component (i.e. pancakes), and then appended to the isolated coded foods. Setting: Healthy Aging in Neighborhoods of Diversity across the Life Span study. Subjects: African-American and White adults with two dietary recalls (n 2177). Results: Differences existed in the lists of foods most frequently consumed by mealtime and race when comparing results based on the original and revised data sets. African Americans reported consumption of sausage/luncheon meat and poultry, while ready-to-eat cereals and cakes/doughnuts/pastries were reported by Whites. Conclusions: Use of combination codes provided a more accurate representation of how foods were consumed by populations. This information is beneficial when creating interventions and exploring diet–health relationships.
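The revised-data-set construction described above can be sketched as follows; the record layout and field names are assumptions for illustration, not the study's actual coding scheme:

```python
# Hypothetical dietary-recall records; "combo" marks foods eaten together and
# "major" flags the main component that determines the combination's food group.
records = [
    {"id": 1, "food": "pancakes", "group": "grains", "combo": "A", "major": True},
    {"id": 1, "food": "syrup", "group": "sweets", "combo": "A", "major": False},
    {"id": 1, "food": "banana", "group": "fruit", "combo": None, "major": True},
]

def revise(recs):
    """Collapse simultaneously consumed foods into one combination record."""
    revised = [r for r in recs if r["combo"] is None]  # individually eaten foods
    combos = {}
    for r in recs:
        if r["combo"] is not None:
            combos.setdefault(r["combo"], []).append(r)
    for code, items in combos.items():
        major = next(r for r in items if r["major"])
        revised.append({
            "id": major["id"],
            "food": "+".join(r["food"] for r in items),
            "group": major["group"],  # food group of the major component
            "combo": code,
            "major": True,
        })
    return revised

revised = revise(records)
print([(r["food"], r["group"]) for r in revised])
```

In this toy example the pancakes-with-syrup pair becomes a single record in the grains group, so frequency counts by food group reflect how the foods were actually eaten.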


2015 ◽  
Vol 8 (1) ◽  
pp. 421-434 ◽  
Author(s):  
M. P. Jensen ◽  
T. Toto ◽  
D. Troyan ◽  
P. E. Ciesielski ◽  
D. Holdridge ◽  
...  

Abstract. The Midlatitude Continental Convective Clouds Experiment (MC3E) took place during the spring of 2011, centered in north-central Oklahoma, USA. The main goal of this field campaign was to capture the dynamical and microphysical characteristics of precipitating convective systems in the US Central Plains. A major component of the campaign was a six-site radiosonde array designed to capture the large-scale variability of the atmospheric state, with the intent of deriving model forcing data sets. Over the course of the 46-day MC3E campaign, a total of 1362 radiosondes were launched from the enhanced sonde network. This manuscript provides details on the instrumentation used as part of the sounding array, the data processing activities, including quality checks and humidity bias corrections, and an analysis of the impacts of bias correction and algorithm assumptions on the determination of convective levels and indices. It is found that corrections for known radiosonde humidity biases and assumptions regarding the characteristics of the surface convective parcel result in significant differences in the derived values of convective levels and indices in many soundings. In addition, the impact of including the humidity corrections and quality controls on the thermodynamic profiles used in the derivation of a large-scale model forcing data set is investigated. The results show a significant impact on the derived large-scale vertical velocity field, illustrating the importance of addressing these humidity biases.
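To illustrate why humidity biases shift derived convective levels, here is a rough sketch using Espy's rule-of-thumb approximation for the lifting condensation level (roughly 125 m per °C of dewpoint depression). This is a textbook approximation, not the parcel algorithm used in the campaign processing:

```python
def lcl_height_m(temp_c, dewpoint_c):
    """Espy's approximation: the LCL rises ~125 m per degC of
    dewpoint depression (temperature minus dewpoint)."""
    return 125.0 * (temp_c - dewpoint_c)

# A 1 degC dry bias in the dewpoint raises the estimated LCL by ~125 m,
# which is why radiosonde humidity corrections matter for convective levels.
print(lcl_height_m(30.0, 20.0))  # 1250.0
print(lcl_height_m(30.0, 19.0))  # 1375.0
```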


2016 ◽  
Vol 5 ◽  
Author(s):  
Sanna Nybacka ◽  
Heléne Bertéus Forslund ◽  
Elisabet Wirfält ◽  
Ingrid Larsson ◽  
Ulrika Ericson ◽  
...  

Abstract. Two web-based dietary assessment tools have been developed for use in large-scale studies: the Riksmaten method (4-d food record) and MiniMeal-Q (food-frequency method). The aim of the present study was to examine the ability of these methods to capture energy intake against objectively measured total energy expenditure (TEE) with the doubly labelled water technique (TEEDLW), and to compare reported energy and macronutrient intake. This study was conducted within the pilot study of the Swedish CArdioPulmonary bioImage Study (SCAPIS), which included 1111 randomly selected men and women aged 50–64 years from the Gothenburg general population. Of these, 200 were enrolled in the SCAPIS diet substudy. TEEDLW was measured in a subsample (n 40). Compared with TEEDLW, both methods underestimated energy intake: −2·5 (sd 2·9) MJ with the Riksmaten method; −2·3 (sd 3·6) MJ with MiniMeal-Q. Mean reporting accuracy was 80 and 82 %, respectively. The correlation between reported energy intake and TEEDLW was r 0·4 for the Riksmaten method (P < 0·05) and r 0·28 (non-significant) for MiniMeal-Q. Women reported similar average intakes of energy and macronutrients with both methods, whereas men reported higher intakes with the Riksmaten method. Energy-adjusted correlations ranged from 0·14 (polyunsaturated fat) to 0·77 (alcohol). Bland–Altman plots showed acceptable agreement for energy and energy-adjusted protein and carbohydrate intake, whereas the agreement for fat intake was poorer. Both methods displayed similar precision in energy intake reporting; however, MiniMeal-Q was less successful than the Riksmaten method in ranking individuals. The development of methods that limit under-reporting is a major challenge for future research.
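The reporting-accuracy figures quoted above are the ratio of reported energy intake to DLW-measured TEE. A small sketch with a hypothetical TEE value reproduces numbers of the same magnitude as the mean deficits reported:

```python
def reporting_accuracy(reported_mj, tee_mj):
    """Reported energy intake as a percentage of measured TEE;
    values below 100 indicate under-reporting."""
    return 100.0 * reported_mj / tee_mj

tee = 12.5  # MJ/day, hypothetical TEE from doubly labelled water
riksmaten = reporting_accuracy(tee - 2.5, tee)   # -2.5 MJ mean deficit
minimeal_q = reporting_accuracy(tee - 2.3, tee)  # -2.3 MJ mean deficit
print(round(riksmaten, 1), round(minimeal_q, 1))
```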


2017 ◽  
Vol 10 (5) ◽  
pp. 2031-2055 ◽  
Author(s):  
Thomas Schwitalla ◽  
Hans-Stefan Bauer ◽  
Volker Wulfmeyer ◽  
Kirsten Warrach-Sagi

Abstract. Increasing computational resources and the demands of impact modelers, stakeholders, and society envision seasonal and climate simulations at convection-permitting resolution. So far, such a resolution has only been achieved with limited-area models, whose results are impacted by the zonal and meridional boundaries. Here, we present the setup of a latitude-belt domain that reduces disturbances originating from the western and eastern boundaries and therefore allows for studying the impact of model resolution and physical parameterization. The Weather Research and Forecasting (WRF) model coupled to the NOAH land–surface model was operated during July and August 2013 at two different horizontal resolutions, namely 0.03° (HIRES) and 0.12° (LOWRES). Both simulations were forced by the European Centre for Medium-Range Weather Forecasts (ECMWF) operational analysis data at the northern and southern domain boundaries, and by the high-resolution Operational Sea Surface Temperature and Sea Ice Analysis (OSTIA) data at the sea surface. The simulations are compared to the operational ECMWF analysis for the representation of large-scale features. To analyze the simulated precipitation, the operational ECMWF forecast, the CPC MORPHing (CMORPH) data, and the ENSEMBLES gridded observation precipitation data set (E-OBS) were used as references. Analyzing pressure, geopotential height, wind, and temperature fields as well as precipitation revealed (1) a benefit from the higher resolution in terms of reduced monthly biases and root mean square error and an improved Pearson skill score, and (2) deficiencies in the physical parameterizations leading to notable biases in distinct regions, such as the polar Atlantic for the LOWRES simulation and the North Pacific and Inner Mongolia for both resolutions. In summary, the application of a latitude belt at a convection-permitting resolution shows promising results that are beneficial for future seasonal forecasting.
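The verification statistics named here (monthly bias, root mean square error, Pearson correlation) can be sketched on toy one-dimensional fields; actual model verification would apply the same formulas to gridded monthly fields:

```python
import math

def bias(sim, ref):
    """Mean difference between simulation and reference."""
    return sum(s - r for s, r in zip(sim, ref)) / len(sim)

def rmse(sim, ref):
    """Root mean square error."""
    return math.sqrt(sum((s - r) ** 2 for s, r in zip(sim, ref)) / len(sim))

def pearson(sim, ref):
    """Pearson correlation coefficient."""
    n = len(sim)
    ms, mr = sum(sim) / n, sum(ref) / n
    cov = sum((s - ms) * (r - mr) for s, r in zip(sim, ref))
    vs = math.sqrt(sum((s - ms) ** 2 for s in sim))
    vr = math.sqrt(sum((r - mr) ** 2 for r in ref))
    return cov / (vs * vr)

# Toy simulated and reference values (invented).
sim = [1.0, 2.0, 3.5, 4.0]
ref = [1.2, 2.1, 3.0, 4.3]
print(round(bias(sim, ref), 3), round(rmse(sim, ref), 3), round(pearson(sim, ref), 3))
```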


BMC Nutrition ◽  
2019 ◽  
Vol 5 (1) ◽  
Author(s):  
Linda A. Bush ◽  
Jayne Hutchinson ◽  
Jozef Hooson ◽  
Marisol Warthon-Medina ◽  
Neil Hancock ◽  
...  

Abstract. Background: Measuring dietary intake in children and adolescents can be challenging due to misreporting, difficulties in establishing portion size and reliance on recording dietary data via proxy reporters. The aim of this review was to present results from a recent systematic review of reviews reporting and comparing validated dietary assessment tools used in younger populations in the UK. Methods: Validation data for dietary assessment tools used in younger populations (≤18 years) were extracted and summarised using results from a systematic review of reviews of validated dietary assessment tools. Mean differences and Bland–Altman limits of agreement (LOA) between the test and reference tool were extracted or calculated and compared for energy, macronutrients and micronutrients. Results: Seventeen studies which reported validation of 14 dietary assessment tools (DATs) were identified with relevant nutrition information. The most commonly validated nutrients were energy, carbohydrate, protein, fat, calcium, iron, folate and vitamin C. There were no validated DATs reporting assessment of zinc, iodine or selenium intake. The most frequently used reference method was the weighed food diary, followed by doubly labelled water and 24 h recall. Summary plots were created to facilitate comparison between tools. On average, the test tools reported higher mean intakes than the reference methods, with some studies consistently reporting wide LOA. Out of the 14 DATs, absolute values for LOA and mean difference were obtained for 11 DATs for energy intake (EI). Of the 24 validation results assessing EI, 16 (67 %) reported higher mean intakes than the reference. For the seven (29 %) validation studies using doubly labelled water (DLW) as the reference, results for the test DATs were not substantially better or worse than those using other reference measures. Further information on the studies from this review is available on the www.nutritools.org website.
Conclusions: Validated dietary assessment tools for use with children and adolescents in the UK have been identified and compared. Whilst tools are generally validated for macronutrient intakes, micronutrients are poorly evaluated. Validation studies that include estimates of zinc, selenium, dietary fibre, sugars and sodium are needed.
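The Bland–Altman limits of agreement used throughout this review are the mean test-minus-reference difference plus or minus 1.96 standard deviations of the differences. A minimal sketch with hypothetical energy intakes:

```python
import statistics

# Hypothetical paired energy intakes (MJ/day) from a test tool and a
# reference method; not data from any study in the review.
test_tool = [8.1, 9.4, 7.9, 10.2, 8.8]
reference = [7.5, 9.0, 8.2, 9.5, 8.1]

diffs = [t - r for t, r in zip(test_tool, reference)]
mean_diff = statistics.mean(diffs)          # positive => test over-reports
sd_diff = statistics.stdev(diffs)           # sample SD of the differences
loa = (mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff)
print(round(mean_diff, 2), tuple(round(x, 2) for x in loa))
```

Wide limits of agreement, as flagged for several tools in the review, mean that individual-level differences between test and reference can be large even when the mean difference is small.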


2009 ◽  
Vol 2 (1) ◽  
pp. 87-98 ◽  
Author(s):  
C. Lerot ◽  
M. Van Roozendael ◽  
J. van Geffen ◽  
J. van Gent ◽  
C. Fayt ◽  
...  

Abstract. Total O3 columns have been retrieved from six years of SCIAMACHY nadir UV radiance measurements using SDOAS, an adaptation of the GDOAS algorithm previously developed at BIRA-IASB for the GOME instrument. GDOAS and SDOAS have been implemented by the German Aerospace Center (DLR) in version 4 of the GOME Data Processor (GDP) and in version 3 of the SCIAMACHY Ground Processor (SGP), respectively. The processors are being run at the DLR processing centre on behalf of the European Space Agency (ESA). We first focus on the description of the SDOAS algorithm, with particular attention to the impact of uncertainties in the reference O3 absorption cross-sections. Second, the resulting SCIAMACHY total ozone data set is globally evaluated through large-scale comparisons with results from GOME and OMI as well as with ground-based correlative measurements. The various total ozone data sets are found to agree within 2 % on average. However, a negative trend of 0.2–0.4 %/year has been identified in the SCIAMACHY O3 columns; this probably originates from instrumental degradation effects that have not yet been fully characterized.
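A drift like the 0.2–0.4 %/year figure reported here is typically estimated by a least-squares fit to the ratio of the instrument's columns to a reference record. A sketch with invented ratios:

```python
# Invented yearly ratios of SCIAMACHY to reference total ozone columns.
years = [0, 1, 2, 3, 4, 5]
ratio = [1.000, 0.997, 0.994, 0.990, 0.988, 0.985]

# Ordinary least-squares slope of ratio vs. time.
n = len(years)
mx, my = sum(years) / n, sum(ratio) / n
slope = sum((x - mx) * (y - my) for x, y in zip(years, ratio)) / sum(
    (x - mx) ** 2 for x in years)
trend_pct_per_year = 100 * slope  # negative => drift low over time
print(round(trend_pct_per_year, 2))
```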


2018 ◽  
Vol 39 (2) ◽  
pp. 329-358 ◽  
Author(s):  
Colin Provost ◽  
Brian J. Gerber

Abstract. Environmental justice (EJ) has represented an important equity challenge in policymaking for decades. President Clinton’s executive order (EO) 12898 in 1994 represented a significant federal action, requiring agencies to account for EJ issues in new rulemakings. We examine the impact of EO 12898 within the larger question of how EOs are implemented in complex policymaking. We argue that presidential preferences will affect bureaucratic responsiveness and fire-alarm oversight. However, EJ policy complexity produces uncertainty, leading to bureaucratic risk aversion that constrains presidential efforts to steer policy. We utilise an original data set of nearly 2,000 final federal agency rules citing EO 12898 and find significant variation in its utilisation across administrations. Uncertainty over the nature of the order has an important influence on bureaucratic responsiveness. Our findings are instructive for the twin influences of political control and policymaking uncertainty and raise useful questions for future EJ and policy implementation research.


2014 ◽  
Vol 7 (4) ◽  
pp. 5087-5139 ◽  
Author(s):  
R. Pommrich ◽  
R. Müller ◽  
J.-U. Grooß ◽  
P. Konopka ◽  
F. Ploeger ◽  
...  

Abstract. Variations in the mixing ratio of trace gases of tropospheric origin entering the stratosphere in the tropics are of interest for assessing both troposphere-to-stratosphere transport fluxes in the tropics and the impact of these transport fluxes on the composition of the tropical lower stratosphere. Anomaly patterns of carbon monoxide (CO) and long-lived tracers in the lower tropical stratosphere allow conclusions about the rate and the variability of tropical upwelling to be drawn. Here, we present a simplified chemistry scheme for the Chemical Lagrangian Model of the Stratosphere (CLaMS) for the simulation, at comparatively low numerical cost, of CO, ozone, and long-lived trace substances (CH4, N2O, CCl3F (CFC-11), CCl2F2 (CFC-12), and CO2) in the lower tropical stratosphere. For the long-lived trace substances, the boundary conditions at the surface are prescribed in the lowest model level based on ground-based measurements. The boundary condition for CO in the free troposphere is deduced from MOPITT measurements (at ≈ 700–200 hPa). Due to the lack of a specific representation of mixing and convective uplift in the troposphere in this model version, enhanced CO values, in particular those resulting from convective outflow, are underestimated. However, in the tropical tropopause layer and the lower tropical stratosphere, there is relatively good agreement of simulated CO with in-situ measurements (with the exception of the TROCCINOX campaign, where CO in the simulation is biased low by ≈ 10–20 ppbv). Further, the model results are of sufficient quality to describe large-scale anomaly patterns of CO in the lower stratosphere. In particular, the zonally averaged tropical CO anomaly patterns (the so-called "tape recorder" patterns) simulated by this model version of CLaMS are in good agreement with observations.
The simulations show too rapid upwelling compared with observations, as a consequence of the overestimated vertical velocities in the ERA-Interim reanalysis data set. Moreover, the simulated tropical anomaly patterns of N2O are in good agreement with observations. In the simulations, anomaly patterns for CH4 and CFC-11 were found to be consistent with those of N2O; for all long-lived tracers, positive anomalies are simulated because of the enhanced tropical upwelling in the easterly phase of the quasi-biennial oscillation.
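The "tape recorder" anomaly patterns mentioned above are built by removing, at each level, the time mean of the zonally averaged mixing ratio. A toy sketch with an invented time-by-level array of CO values:

```python
# Toy time x level array of zonally averaged CO (ppbv); real input would be
# model output or satellite retrievals on many levels and time steps.
field = [
    [50.0, 40.0],  # time step 0
    [54.0, 41.0],  # time step 1
    [52.0, 45.0],  # time step 2
]
n_t = len(field)
n_z = len(field[0])

# Time mean at each level, then the anomaly (deviation from that mean).
level_mean = [sum(row[k] for row in field) / n_t for k in range(n_z)]
anomaly = [[row[k] - level_mean[k] for k in range(n_z)] for row in field]
print(anomaly)
```

Plotting such anomalies against time and altitude shows positive signals ascending with the upwelling, which is what the tape-recorder comparison in the abstract evaluates.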


2019 ◽  
Vol 34 (9) ◽  
pp. 1369-1383 ◽  
Author(s):  
Dirk Diederen ◽  
Ye Liu

Abstract. With the ongoing development of distributed hydrological models, flood risk analysis calls for synthetic, gridded precipitation data sets. The availability of large, coherent, gridded reanalysis data sets, in combination with the increase in computational power, accommodates the development of new methodology to generate such synthetic data. We tracked moving precipitation fields and classified them using self-organising maps. For each class, we fitted a multivariate mixture model and generated a large set of synthetic, coherent descriptors, which we used to reconstruct moving synthetic precipitation fields. We introduced randomness by replacing the observed precipitation fields in the original data set with the synthetic precipitation fields. The output is a continuous, gridded, hourly precipitation data set of a much longer duration, containing physically plausible and spatio-temporally coherent precipitation events. The proposed methodology implicitly provides an important improvement in the spatial coherence of precipitation extremes. We investigate the issue of unrealistic, sudden changes on the grid and demonstrate how a dynamic spatio-temporal generator can provide spatial smoothness in the probability distribution parameters and hence in the return level estimates.
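A much-reduced sketch of the generator idea: fit a multivariate distribution to event descriptors and sample synthetic ones. For brevity this uses a single Gaussian rather than the per-class mixture models described in the abstract, and the descriptors are invented:

```python
import numpy as np

# Hypothetical descriptors per tracked precipitation event:
# [duration_h, peak_intensity_mm_h, propagation_speed_km_h]
events = np.array([
    [6.0, 12.0, 30.0],
    [3.0, 25.0, 45.0],
    [9.0, 8.0, 25.0],
    [4.0, 18.0, 40.0],
])

mean = events.mean(axis=0)
cov = np.cov(events, rowvar=False)

# Draw synthetic descriptors from the fitted distribution; in the paper,
# each self-organising-map class would get its own mixture model instead.
rng = np.random.default_rng(0)
synthetic = rng.multivariate_normal(mean, cov, size=1000)
print(np.round(synthetic.mean(axis=0), 1))  # close to the empirical mean
```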


2020 ◽  
pp. 089976402097768
Author(s):  
Noah D. Drezner ◽  
Oren Pizmony-Levy

Although Sense of Belonging has long been an important construct in understanding student success in higher education, it has not been examined in the alumni context. In this article, we explore the association between graduate students’ Sense of Belonging and alumni engagement. We draw on an original data set (n = 1,601) that combines administrative records on alumni giving and data from a 2017 survey. Using multivariate analyses, we show that alumni with a stronger Sense of Belonging are more likely to give to their alma mater and to hold pro-philanthropic attitudes. Furthermore, Sense of Belonging is positively associated with other forms of alumni engagement and participation, including volunteering. Our findings highlight the need to examine the link between unintentional social interactions and alumni engagement and giving.

