Supply and Demand: A Framework for Explaining Variability in Dietary Intake and its Impact on Data

1997 ◽  
Vol 11 (4) ◽  
pp. 289-299 ◽  
Author(s):  
Gloria Joachim

Nutrition has an important relationship with health and illness. One difficulty in measuring intake is related to variability. The purpose of this paper is to examine 1) the impact of supply and demand on variability in data collected for dietary studies and 2) the relationship between data and estimates of usual intake. The forces of supply and demand over time generate a consumption curve for each food. Two types of consumption curves are identified. One curve is horizontal and represents staples that are steadily consumed. The other exhibits peaks and dips and is unique to each food whose consumption varies with time. The measurement of usual intake is discussed in light of these two types of curves. Usual intake of foods whose consumption curve is horizontal can be read at any time, since consumption does not vary with time. For all other foods, measuring usual consumption presents problems, since the data vary with time. This examination indicates that foods whose consumption varies with time have unique properties that must be considered when attempting to calculate consumption. Suggestions are given to enhance measurement of consumption of these foods. Although excellent methodology currently exists for the calculation of intake, attention to the forces of supply and demand will only serve to strengthen existing methods.
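The two curve types can be illustrated with a small sketch. The intake series and function names below are hypothetical, not the paper's: a staple's horizontal curve means a single observation approximates usual intake, while a food whose curve has peaks and dips is only captured by averaging over a longer window.

```python
import statistics

# Hypothetical daily intake series (grams) over four weeks.
# Bread behaves like a staple: its consumption curve is roughly horizontal.
bread = [120, 118, 122, 121, 119, 120, 120] * 4
# Strawberries peak seasonally: the curve shows peaks and dips over time.
strawberries = [0, 0, 0, 0, 200, 250, 180] * 4

def usual_intake(series):
    """Estimate usual intake as the long-run mean of the observed series."""
    return statistics.mean(series)

def is_stable(series, tolerance=0.1):
    """Crude horizontality check: coefficient of variation below tolerance."""
    mean = statistics.mean(series)
    return statistics.pstdev(series) / mean < tolerance

# For a staple, any single reading sits close to the usual intake;
# for a seasonal food, only the long-run average is meaningful.
print(usual_intake(bread))        # 120 g/day
print(is_stable(bread))           # True: readable at any time
print(is_stable(strawberries))    # False: timing of measurement matters
```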

Author(s):  
Melanie K. T. Takarangi ◽  
Deryn Strange

When people are told that their negative memories are worse than other people’s, do they later remember those events differently? We asked participants to recall a recent negative memory then, 24 h later, we gave some participants feedback about the emotional impact of their event – stating it was more or less negative compared to other people’s experiences. One week later, participants recalled the event again. We predicted that if feedback affected how participants remembered their negative experiences, their ratings of the memory’s characteristics should change over time. That is, when participants are told that their negative event is extremely negative, their memories should be more vivid, recollected strongly, and remembered from a personal perspective, compared to participants in the other conditions. Our results provide support for this hypothesis. We suggest that external feedback might be a potential mechanism in the relationship between negative memories and psychological well-being.


Author(s):  
Frode Eika Sandnes

Purpose: Some universal accessibility practitioners have voiced that they experience a mismatch between the research focus and the need for knowledge within specialized problem domains. This study thus set out to identify the balance of research across the main areas of accessibility, the impact of this research, and how the research profile varies over time and across geographical regions. Method: All UAIS papers indexed in Scopus were analysed using bibliometric methods. The WCAG taxonomy of accessibility was used for the analysis, namely perceivable, operable, and understandable. Results: The results confirm the expectation that research into visual impairment (the perceivable category) has received more attention than papers addressing operable and understandable. Although papers focussing on understandable made up the smallest group, papers in this group attracted more citations. Funded research attracted fewer citations than research without funding. The breakdown of research efforts appears consistent over time and across different geographical regions. Researchers in Europe and North America have been active throughout the last two decades, while Southeast Asia, Latin America, and the Middle East became active during the last five years. There is also seemingly a growing trend of out-of-scope papers. Conclusions: Based on the findings, several recommendations are proposed to the UAIS editorial board.


1989 ◽  
Vol 18 (2) ◽  
pp. 187-210 ◽  
Author(s):  
Aidan Kelly

ABSTRACT The theory of incrementalism is a long-standing and influential perspective on policy making and resource allocation in the public sector. Previous research on social services budgeting suggests that resources are allocated incrementally, although there has been some debate as to whether this would persist in an era of prolonged expenditure restraint. Incremental budgetary outcomes are operationalised as percentage changes in budgets pro-rata with percentage changes in the total budget, and as stable shares of total expenditure for each activity. Data for 99 English social service departments supports incrementalism in that budget shares change by only 1.8 per cent, but percentage allocations depart from pro-rata incrementalism by a mean of 74 per cent. The comparison of the two summary indices over time supports those who have argued that prolonged restraint would encourage non-incremental budgeting, but change in the agency's total budget does not consistently predict budgetary outcomes. The effect of restraint on incrementalism varies with the measure used and across the component activities of the measures, but there is enough evidence to suggest a significant decline in the level of incrementalism in social service departments. In particular, non-incremental budgeting is strongly associated with the growth of day centre expenditure on the mentally ill and the elderly before 1982–3, and after that with the pursuit of the ‘community care’ strategy within state provided services for the elderly and children. Incrementalism as a general theory of agency budgeting is limited in its ability to explain variations in the degree of incrementalism between agencies, between component budgets and over time.
The conclusion suggests that further research should seek explanations for these variations in the varying balance of the competing forces which shape outcomes in welfare bureaucracies and in the relationship between these forces and the organisation's environment.
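The two operationalisations of incrementalism described in the abstract (percentage changes pro rata with the total budget, and stable shares of total expenditure) can be sketched as follows. The budgets are illustrative, not the study's data:

```python
# Hypothetical activity budgets (£000s) for one department in two years.
before = {"residential": 500, "day_centres": 100, "fieldwork": 400}
after  = {"residential": 520, "day_centres": 150, "fieldwork": 380}

total_before = sum(before.values())            # 1000
total_after = sum(after.values())              # 1050
total_growth = total_after / total_before - 1  # 5% overall change

def departure_from_pro_rata(activity):
    """Percentage-point departure of an activity's growth from pro-rata growth.

    Under strict incrementalism every activity grows at the total's rate,
    so this departure is zero."""
    growth = after[activity] / before[activity] - 1
    return abs(growth - total_growth) * 100

def share_change(activity):
    """Change in the activity's share of total expenditure (percentage points).

    Under incrementalism, shares stay stable and this is near zero."""
    return abs(after[activity] / total_after - before[activity] / total_before) * 100

for a in before:
    print(a, round(departure_from_pro_rata(a), 1), round(share_change(a), 2))
# day_centres departs sharply from pro-rata growth: non-incremental budgeting.
```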


2014 ◽  
Vol 44 (4) ◽  
pp. 310-323 ◽  
Author(s):  
Ana Tominc

Purpose – The purpose of this study is to demonstrate the impact of global celebrity chefs and their discourse about food on the genre of cookbooks in Slovenia. Design/methodology/approach – Focusing this discourse study on cookbook topics only, the analysis demonstrates the relationship between the aspirations of local celebrity chefs for the food culture represented globally by global celebrity chefs, such as Oliver, and the necessity for a local construction of specific tastes. While the central genre of TV celebrity chefs remains TV cooking shows, their businesses include a number of side products, such as cookbooks, which can be seen as recontextualisations of TV food discourse. Findings – Hence, despite this study being limited to analysis of cookbooks only, it can be claimed that the findings extend to other genres. The analysis shows that local chefs aspire to follow current trends, such as an emphasis on the local and sustainable production of food as well as enjoyment and pleasure in the form of a postmodern hybrid genre, while, on the other hand, they strive to include topics that will resonate locally, as they aim to represent themselves as the “new middle class”. Originality/value – Such an analysis brings new insights into the relationship between discourse and globalisation as well as discourse and food.


2015 ◽  
Vol 23 (4) ◽  
pp. 306-327 ◽  
Author(s):  
Christian Geisler Asmussen ◽  
Bo Bernhard Nielsen ◽  
Tom Osegowitsch ◽  
Andre Sammartino

Purpose – The purpose of this paper is to model and test the dynamics of home-regional and global penetration by multi-national enterprises (MNEs). Design/methodology/approach – Drawing on international business (IB) theory, the authors model MNEs adjusting their home-regional and global market presence over time. The authors test the resulting hypotheses using sales data from a sample of 220 of the world’s largest MNEs over the period 1995-2005. The authors focus specifically on the relationship between levels of market penetration inside and outside the home region and rates of change in each domain. Findings – The authors demonstrate that MNEs do penetrate both home-regional and global markets, often simultaneously, and that penetration levels often oscillate within an MNE over time. The authors show firms’ rates of regional and global expansion to be affected by their existing regional and global penetration, as well as their interplay. Finally, the authors identify differences in the steady states at which firms stabilize their penetration levels in the home-regional and the global space. The findings broadly confirm the MNE as an interdependent portfolio with important regional demarcations. Originality/value – The authors identify complex interdependencies between home-regional and global penetration and growth, paving the way for further studies of the impact of regions on MNE expansion.


Author(s):  
Stepan Dankevych

The problem of ensuring the balanced use of forest lands motivates the search for new economic and environmental tools that can influence this process. The need to improve certification as part of the financial and economic mechanism for ensuring balanced forestry land use corresponds to the directions of state policy, Ukraine's European integration intentions, and modern requirements of the ecological aspect of forestry land use. The work examines forest certification practice in Ukraine from the point of view of balanced land use. A spatial-temporal analysis and assessment of the scale and dynamics of the spread of FSC forest certification in Ukraine has been carried out. The study was structured in three stages: (I) study of changes over time in the volume of forest certification on a national scale, (II) assessment of trends over time for indicators on a regional scale, and (III) study of the relationship between individual indicators. The impact of FSC certification of forest management in Ukraine on the environmental indicators of forestry land use was analysed based on correlations between the statistical characteristics of selected economic and environmental indicators, such as the area of certified forests, capital investments, and reforestation. Analysis of statistical data showed the relationship between environmental and economic performance over time and changes in specific characteristics on a regional scale. The study makes it possible, on the basis of an objectively existing causal relationship between phenomena and indicators, to identify the course of certain positive or negative processes in forestry land use. Forest certification can play a role in maintaining the balanced use of forest lands, preventing illegal logging and forest degradation, and contributing to reforestation and capital investment.
The study helps to identify key variables that limit the ability of forestry operators to ensure balanced use of forest lands, and how forest certification can affect this. Foreign experience in stimulating forest certification has been investigated for the possibility of borrowing management tools to motivate forest certification in Ukraine. It has been shown that certification is a significant environmental tool for ensuring a balanced level of land use and has the potential for further development.


2015 ◽  
Vol 19 (14) ◽  
pp. 1-504 ◽  
Author(s):  
Karl Claxton ◽  
Steve Martin ◽  
Marta Soares ◽  
Nigel Rice ◽  
Eldon Spackman ◽  
...  

Background: Cost-effectiveness analysis involves the comparison of the incremental cost-effectiveness ratio of a new technology, which is more costly than existing alternatives, with the cost-effectiveness threshold. This indicates whether or not the health expected to be gained from its use exceeds the health expected to be lost elsewhere as other health-care activities are displaced. The threshold therefore represents the additional cost that has to be imposed on the system to forgo 1 quality-adjusted life-year (QALY) of health through displacement. There are no empirical estimates of the cost-effectiveness threshold used by the National Institute for Health and Care Excellence. Objectives: (1) To provide a conceptual framework to define the cost-effectiveness threshold and to provide the basis for its empirical estimation. (2) Using programme budgeting data for the English NHS, to estimate the relationship between changes in overall NHS expenditure and changes in mortality. (3) To extend this mortality measure of the health effects of a change in expenditure to life-years and to QALYs by estimating the quality-of-life (QoL) associated with effects on years of life and the additional direct impact on QoL itself. (4) To present the best estimate of the cost-effectiveness threshold for policy purposes. Methods: Earlier econometric analysis estimated the relationship between differences in primary care trust (PCT) spending, across programme budget categories (PBCs), and associated disease-specific mortality. This research is extended in several ways including estimating the impact of marginal increases or decreases in overall NHS expenditure on spending in each of the 23 PBCs. Further stages of work link the econometrics to broader health effects in terms of QALYs. Results: The most relevant ‘central’ threshold is estimated to be £12,936 per QALY (2008 expenditure, 2008–10 mortality).
Uncertainty analysis indicates that the probability that the threshold is < £20,000 per QALY is 0.89 and the probability that it is < £30,000 per QALY is 0.97. Additional ‘structural’ uncertainty suggests, on balance, that the central or best estimate is, if anything, likely to be an overestimate. The health effects of changes in expenditure are greater when PCTs are under more financial pressure and are more likely to be disinvesting than investing. This indicates that the central estimate of the threshold is likely to be an overestimate for all technologies which impose net costs on the NHS and the appropriate threshold to apply should be lower for technologies which have a greater impact on NHS costs. Limitations: The central estimate is based on identifying a preferred analysis at each stage based on the analysis that made the best use of available information, whether or not the assumptions required appeared more reasonable than the other alternatives available, and which provided a more complete picture of the likely health effects of a change in expenditure. However, the limitation of currently available data means that there is substantial uncertainty associated with the estimate of the overall threshold. Conclusions: The methods go some way to providing an empirical estimate of the scale of opportunity costs the NHS faces when considering whether or not the health benefits associated with new technologies are greater than the health that is likely to be lost elsewhere in the NHS. Priorities for future research include estimating the threshold for subsequent waves of expenditure and outcome data, for example by utilising expenditure and outcomes available at the level of Clinical Commissioning Groups as well as additional data collected on QoL and updated estimates of incidence (by age and gender) and duration of disease.
Nonetheless, the study also starts to make the other NHS patients, who ultimately bear the opportunity costs of such decisions, less abstract and more ‘known’ in social decisions. Funding: The National Institute for Health Research-Medical Research Council Methodology Research Programme.
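The threshold logic described in the abstract can be sketched numerically. The £12,936/QALY figure is the abstract's central estimate; the example technology's extra cost and QALY gain are invented for illustration:

```python
# Central threshold estimate from the abstract (£ per QALY, 2008 expenditure).
THRESHOLD = 12_936

def icer(extra_cost, extra_qalys):
    """Incremental cost-effectiveness ratio: extra £ per QALY gained."""
    return extra_cost / extra_qalys

def net_health_benefit(extra_cost, extra_qalys, threshold=THRESHOLD):
    """QALYs gained minus QALYs forgone elsewhere as the extra cost
    displaces other health-care activity (extra_cost / threshold)."""
    return extra_qalys - extra_cost / threshold

# A hypothetical technology costing £50,000 more and gaining 5 QALYs:
print(icer(50_000, 5))                # 10,000 £/QALY, below the threshold
print(net_health_benefit(50_000, 5))  # positive, so a net health gain
```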


Author(s):  
Angel L. Meroño-Cerdan ◽  
Pedro Soto-Acosta ◽  
Carolina Lopez-Nicolas

This study seeks to assess the impact of collaborative technologies on innovation at the firm level. Collaborative technologies’ influence on innovation is considered here as a multi-stage process that starts at adoption and extends to use. Thus, the effect of collaborative technologies on innovation is examined not only directly, through the simple presence of collaborative technologies, but also based on actual collaborative technologies’ use. Given that firms can use this technology for different purposes, collaborative technologies’ use is measured according to three orientations: e-information, e-communication and e-workflow. To achieve these objectives, a research model is developed for assessing, on the one hand, the impact of the adoption and use of collaborative technologies on innovation and, on the other hand, the relationship between adoption and use of collaborative technologies. The research model is tested using a dataset of 310 Spanish SMEs. The results showed that collaborative technologies’ adoption is positively related to innovation. Also, as hypothesized, distinct collaborative technologies were found to be associated with different uses. In addition, the study found that while e-information had a positive and significant impact on innovation, e-communication and e-workflow did not.


2018 ◽  
Vol 16 (3) ◽  
pp. 78-93
Author(s):  
Yongtao Peng ◽  
Yaya Li ◽  
Meiling He

To provide a qualitative and quantitative description of the degree of matching between the elements of logistics supply and demand networks, logistics super-network models are constructed using super-network theory. Faced with the problems of diverse demand and large-scale commodity circulation, this article studies the structure of the logistics super network for multi-commodity circulation and establishes continuous cost functions for logistics demand and supply, reflecting the logistics cost of different commodities in different phases. The article establishes an optimization model of the logistics super network that aims to maximize the matching of supply and demand across commodities. The model is transformed into a variational inequality problem, and the existence and uniqueness of the equilibrium solution are proved. Using the case of a coal logistics super network, a modified projection algorithm is applied; the results show that improving supply capacity raises the network matching degree from the original 81.3% to 90.5%, while improving the relationships between trades raises the matching degree to 90.1%.
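One simple way to quantify a supply-demand matching degree, consistent with the percentages reported, is the fraction of total demand met across commodities. This definition and the vectors below are illustrative assumptions, not the paper's model:

```python
def matching_degree(supply, demand):
    """Fraction of total demand satisfied, capped per commodity:
    sum over commodities of min(supply, demand), divided by total demand."""
    met = sum(min(s, d) for s, d in zip(supply, demand))
    return met / sum(demand)

demand = [100, 100, 100]   # demand per commodity (illustrative units)
supply = [90, 84, 70]      # current supply capacity per commodity

print(round(matching_degree(supply, demand), 3))   # 0.813

# Raising supply capacity moves matching toward the ~90.5% the paper reports.
boosted = [100, 91, 80]
print(round(matching_degree(boosted, demand), 3))  # 0.903
```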


Author(s):  
Emily Zackin

The study of constitutionalism often begins with the question of what a constitution is. Sometimes the term refers to a single legal document with that name, but the term “constitution” may also refer to something unwritten, such as important political traditions or established customs. As a result, scholars sometimes distinguish between the “Big-C” constitution, that is, the constitutional document, and the “small-c” constitution, the set of unwritten practices and understandings that structure political life. Constitutionalism is typically associated with documents and practices that restrict the arbitrary exercise of power. Most constitutions contain guarantees of rights and outline the structures of government. Constitutions are often enforced in court, but nonjudicial actors, like legislatures or popular movements, may also enforce constitutional provisions. The relationship between democracy and constitutionalism is not at all straightforward, and it has received an enormous amount of scholarly attention. Constitutionalism seems to both undergird and restrain democracy. On the one hand, constitutions establish the institutions that allow for self-government. On the other, they are often said to restrict majoritarian decision-making. Related to this question of the relationship between constitutionalism and democracy are questions about how constitutions change and how they ought to change. Can written constitutions change without changes to the text, and can judges bring about these changes? Do extratextual changes threaten or promote democracy? Finally, not only do individual constitutions change, but the practice of writing constitutions and governing with them has also changed over time. In general, constitutions have grown more specific and flexible over time, arguably, allowing for a different kind of constitutional politics.

