Effect of Incentive upon Information Use in a Choice Situation

1963 ◽  
Vol 13 (2) ◽  
pp. 547-550 ◽  
Author(s):  
Donald R. Hoffeld

Using a static decision situation, 120 Ss were run in a 2 × 2 × 2 factorial design in which the main variables were number of choice alternatives (2 or 8), likelihood that the information provided was correct (higher or lower than chance), and presence or absence of monetary incentive. The following conclusions are drawn: (1) Ss in monetary incentive groups tend to behave in a manner more statistically advantageous in relation to use of information than Ss in non-incentive groups. (2) The more choice alternatives available, the more likely Ss are to use information. (3) Previous results were confirmed showing that, when faced with the option of making a random choice between equally likely alternatives and using available information, an S is inclined to use the information some of the time, even when it is statistically disadvantageous to do so, i.e., when S could do better by simple guessing behavior.
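The statistical logic behind conclusion (3) can be sketched numerically (illustrative values, not taken from the study): with k equally likely alternatives, random guessing succeeds with probability 1/k, so information that is correct with probability p is only advantageous when p exceeds 1/k.

```python
# Illustrative sketch (values are not from the study): when is it
# statistically advantageous to use the information rather than guess?
def best_strategy(k, p):
    """k: number of choice alternatives, p: P(information is correct)."""
    p_guess = 1 / k            # success probability of a random choice
    return "use information" if p > p_guess else "guess"

print(best_strategy(2, 0.4))   # guess: 0.4 < 1/2, the information hurts
print(best_strategy(8, 0.4))   # use information: 0.4 > 1/8
```

This also shows why conclusion (2) is plausible: the guessing baseline 1/k drops as alternatives are added, so the same information quality clears the bar more easily in the 8-choice condition.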

1964 ◽  
Vol 15 (1) ◽  
pp. 107-110 ◽  
Author(s):  
Donald R. Hoffeld ◽  
S. Carolyn Kent

Using a static decision situation, 90 Ss were tested for frequency of information use as a function of the number of choice alternatives and the likelihood that the information was correct. Extreme values were used in both high and low probability categories. The following major conclusions were reached. When a simple, i.e., 2-choice situation is used, a significant number of Ss actively avoid a very low information probability, while with a more complex choice, i.e., 8 alternatives, Ss behave in a random manner. The use of extreme probability values for the information likelihood pushes behavior toward, but not all the way to, a statistically good solution, for both high and low probability values.


2020 ◽  
Vol 21 (10) ◽  
pp. 3518 ◽  
Author(s):  
A. Mark Evans ◽  
D. Grahame Hardie

We live, and to do so we must breathe and eat: are we, then, a combination of what we eat and breathe? Here, we will consider this question, and the role in this respect of the AMP-activated protein kinase (AMPK). Emerging evidence suggests that AMPK facilitates central and peripheral reflexes that coordinate breathing and oxygen supply, and contributes to the central regulation of feeding and food choice. We propose, therefore, that oxygen supply to the body is aligned not only with the quantity we eat, but also with nutrient-based diet selection, and that the cell-specific expression pattern of AMPK subunit isoforms is critical to appropriate system alignment in this respect. Currently available information on how oxygen supply may be aligned with feeding and food choice, or vice versa, through our motivation to breathe and select particular nutrients is sparse, fragmented and lacks any integrated understanding. By addressing this, we aim to provide the foundations for a clinical perspective that reveals untapped potential, by highlighting how aberrant cell-specific changes in the expression of AMPK subunit isoforms could give rise, in part, to known associations between metabolic disease, such as obesity and type 2 diabetes, sleep-disordered breathing, pulmonary hypertension and acute respiratory distress syndrome.


1971 ◽  
Vol 5 (3) ◽  
pp. 374-387 ◽  
Author(s):  
Lewis H. Roberts

Although unnecessary assumptions are something we all try to avoid, advice on how to do so is much harder to come by than admonition. The most widely quoted dictum on the subject, often referred to by writers on philosophy as “Ockham's razor” and attributed generally to William of Ockham, states “Entia non sunt multiplicanda praeter necessitatem”. (Entities are not to be multiplied without necessity.) As pointed out in reference [1], however, the authenticity of this attribution is questionable. The same reference mentions Newton's essentially similar statement in his Principia Mathematica of 1726. Hume [3] is credited by Tribus [2c] with pointing out in 1740 that the problem of statistical inference is to find an assignment of probabilities that “uses the available information and leaves the mind unbiased with respect to what is not known.” The difficulty is that our data are often incomplete, and we do not know how to construct an intelligible interpretation without filling in some gaps. Assumptions, like sin, are much more easily condemned than avoided.

In the author's opinion, important results have been achieved in recent years toward solving the problem of how best to utilize data that might heretofore have been regarded as inadequate. The approach taken and the relevance of this work to certain actuarial problems will now be discussed.

Bias and Prejudice

One type of unnecessary assumption lies in the supposition that a given estimator is unbiased when in fact it has a bias. We need not discuss this aspect of our subject at length here, since what we might consider the scalar case of the general problem is well covered in textbooks and papers on sampling theory. Suffice it to say that an estimator is said to be biased if its expected value differs systematically from the quantity being estimated. Such differences can arise either through faulty procedures of data collection or through the use of biased mathematical formulas.

It should be realized that biased formulas and procedures are not necessarily improper: when the variance of a biased estimator, added to its squared bias, yields a mean square error lower than the variance of an alternative, unbiased estimator, the biased estimator is to be preferred.
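The bias-variance tradeoff behind that last point can be made concrete. A minimal sketch (the distribution, sample size, and shrinkage factor below are illustrative assumptions, not from the paper): estimating the mean mu of a normal population, a deliberately shrunken sample mean is biased yet achieves a lower mean square error than the unbiased sample mean.

```python
# Sketch under assumed values: estimating mu of N(mu, sigma^2) from n samples.
# Compare the unbiased sample mean xbar with a shrunken estimator c * xbar.
mu, sigma, n = 1.0, 2.0, 10   # illustrative parameters
c = 0.8                       # shrinkage factor; introduces bias (c - 1) * mu

# Unbiased estimator xbar: bias = 0, so MSE = variance = sigma^2 / n
mse_unbiased = sigma**2 / n

# Biased estimator c * xbar: MSE = variance + bias^2
#   variance = c^2 * sigma^2 / n,  bias = (c - 1) * mu
mse_biased = c**2 * sigma**2 / n + ((c - 1) * mu) ** 2

print(mse_unbiased, mse_biased)  # 0.4 vs 0.296: the biased estimator wins
```

Whether shrinkage helps depends on the unknown mu itself, which is precisely why such estimators embody an assumption that should be made explicitly rather than smuggled in.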


2021 ◽  
Author(s):  
Jennifer M Taber ◽  
Clarissa A. Thompson ◽  
Pooja Gupta Sidney ◽  
Abigail O'Brien ◽  
John Updegraff

Background: In May 2021, U.S. states began implementing “vaccination lotteries” to encourage vaccine-hesitant individuals to get a COVID-19 vaccine. Purpose: Drawing on theories from math cognition and behavioral economics, we tested several monetary lottery structures and their framing to determine which would best motivate unvaccinated individuals. Methods: In two online experiments conducted in May 2021, U.S. adults were asked to imagine that their state had implemented a vaccination lottery. In Experiment 1, participants (n=589) were randomly assigned to 1 of 12 conditions that varied the monetary amount and number of winners, holding constant the total payout ($5 million). In Experiment 2, participants (n=274) were randomly assigned to 1 of 4 conditions in a 2 (Message Framing: Gain versus Loss) by 2 (Numeric Framing: 5 total winners versus 1 winner per week for 5 weeks) factorial design; in all four conditions, 5 people would each win $1 million. Following the manipulation, participants rated their COVID-19 vaccination intentions, perceived likelihood of winning, and anticipated regret. Results: Vaccination intentions did not differ across conditions in either experiment, and post-manipulation vaccination intentions were strongly associated with baseline vaccination willingness. When asked to choose among the 12 lottery structures, participants tended to prefer options that awarded less money to more people; 41.9% of participants across experiments indicated they would not vaccinate for any lottery-based monetary incentive. Conclusion: Findings suggest that multiple lottery structures could be equally motivating for unvaccinated adults, although states could consider structures that distribute incentives across more people.
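Holding the total payout fixed is what makes the Experiment 1 comparison clean: every structure offers the same expected value per entrant, while the odds of winning differ by orders of magnitude, which is the lever the math-cognition framings manipulate. A minimal sketch (the structures and entrant pool below are hypothetical, not the paper's 12 conditions):

```python
# Hypothetical lottery structures (not the paper's actual conditions): each
# splits the same fixed payout, so expected value per entrant is identical
# while the probability of winning varies a thousandfold.
TOTAL_PAYOUT = 5_000_000
ENTRANTS = 1_000_000          # assumed pool size, for illustration only

structures = {                # winners -> prize per winner
    5: 1_000_000,
    500: 10_000,
    5_000: 1_000,
}

for winners, prize in structures.items():
    ev = winners * prize / ENTRANTS      # identical ($5) in every structure
    p_win = winners / ENTRANTS           # this is what varies across them
    print(f"{winners} winners of ${prize:,}: EV=${ev:.2f}, P(win)={p_win:.6f}")
```

Any preference between such structures must therefore come from psychology (perceived likelihood, anticipated regret) rather than from expected value.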


This chapter initiates the pathway through the ISSP framework described in the previous chapter. The chapter focuses on three aspects. It provides the approach for conducting the organisational information systems strategic review applying McFarlan's Strategic Grid. This model helps an organisation to determine the focus of its current application systems, where the entity wishes to be in the future, and generally assess where the competition (or best practice benchmark entity) is currently positioned. The chapter then provides a method to carry out the organisational data usage strategic review applying Marchand's Area of Information Use model. This model aids management to determine to what purpose the current available information is being utilised. Finally, the chapter illustrates how a government entity may conduct a review of its customer persona profiles to enable it to explicitly know its customers, thus enabling it to promote and target its specific services. All the models and techniques applied in this chapter are supported by examples.


Author(s):  
Ulrich Florian Simo ◽  
Henri Gwét

We present a new class of fuzzy aggregation operators that we call fuzzy triangular aggregation operators. To do so, we focus on the situation where the available information cannot be assessed with exact numbers and it is necessary to use another approach to assess uncertain or imprecise information such as fuzzy numbers. We also use the concept of triangular norms (t-norms and t-conorms) as pseudo-arithmetic operations. As a result, we get notably the fuzzy triangular weighted arithmetic (FTWA), the fuzzy triangular ordered weighted arithmetic (FTOWA), the fuzzy generalized triangular weighted arithmetic (FGTWA), the fuzzy generalized triangular ordered weighted arithmetic (FGTOWA), the fuzzy triangular weighted quasi-arithmetic (Quasi-FTWA), and the fuzzy triangular ordered weighted quasi-arithmetic (Quasi-FTOWA) operators. Main properties of these operators are discussed as well as their comparison with other existing ones. The fuzzy triangular aggregation operators not only cover a wide range of useful existing fuzzy aggregation operators but also provide new interesting cases. Finally, an illustrative example is also developed regarding the selection of strategies.
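To fix intuition, here is a minimal sketch of what an FTWA-style aggregation could look like, under simplifying assumptions: triangular fuzzy numbers are represented as (a, b, c) triples with a ≤ b ≤ c and combined component-wise with ordinary weighted sums, whereas the paper's operators use t-norm-based pseudo-arithmetic. The function name and inputs are illustrative, not from the source.

```python
# Hedged sketch of a fuzzy triangular weighted arithmetic (FTWA) operator.
# Triangular fuzzy numbers are (a, b, c) triples, a <= b <= c; this sketch
# aggregates them component-wise with plain weighted sums, a simplification
# of the paper's t-norm-based pseudo-arithmetic.

def ftwa(fuzzy_numbers, weights):
    """Aggregate triangular fuzzy numbers with weights summing to 1."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights must sum to 1"
    return tuple(
        sum(w * x[i] for w, x in zip(weights, fuzzy_numbers))
        for i in range(3)
    )

# Two expert assessments as triangular fuzzy numbers, weighted 0.6 / 0.4
result = ftwa([(2, 3, 4), (1, 2, 5)], [0.6, 0.4])
print(result)  # (1.6, 2.6, 4.4): still a valid triangular fuzzy number
```

The ordered variants (FTOWA) would differ only in reordering the arguments before weighting, which is the standard OWA construction.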


Author(s):  
LOVE EKENBERG ◽  
JOHAN THORBIÖRNSON

The purpose of this work is to provide theoretical foundations of, as well as some computational aspects of, a theory for analysing decisions under risk when the available information is vague and imprecise. Many approaches to modelling imprecise information have been proposed, e.g., interval methods. However, such representation models are unnecessarily restrictive, since they do not admit discrimination between beliefs in different values, i.e., all epistemologically possible values carry equal weight. In many situations, for instance when the underlying information results from learning techniques based on variance analyses of statistical data, the expressibility must be extended for a more perceptive treatment of the decision situation. Our contribution herein is an approach for enabling a refinement of the representation model, allowing for an elaborated discrimination of possible values by using belief distributions with weak restrictions. We show how to derive admissible classes of local distributions from sets of global distributions and introduce measures expressing to what extent explicit local distributions can be used for modelling decision situations. As it turns out, this results in a theory with very attractive features from a computational viewpoint.


1957 ◽  
Vol 147 (929) ◽  
pp. 552-553

I shall not attempt to summarize the content of today’s papers and discussions. We have, as expected, travelled from molecules to mammals, and from physics to physiology. We have considered much of the available information about the changes which occur when a living cell freezes and how these changes may be destructive to the cell. The removal of water by freezing sets up a variety of stresses and strains in or on the cell. These stresses may be mechanical, biophysical or biochemical in nature; it is difficult at present to assign an exact role to each. What is clear is that knowledge in this field is in its infancy and that factors hitherto unsuspected may play a major part. The likelihood of ‘bubble trouble’ in thawed tissues, discussed today, is a good example of a hitherto unsuspected complication arising from the temporary removal of water. The extent of the direct effect of cold, as such, on the biochemical components of the cell is even less certain, and it may well be that the destructive effects on the lipoproteins are accompanied by equal damage to the enzymic and other active systems. The capacity of various neutral solutes, notably glycerol, to decrease or even prevent the damage caused by freezing and thawing has featured largely in today’s proceedings and here even more problems appear. It seems that according to the type of cell, the glycerol can be applied abruptly and the cell returned to a normal medium abruptly, or it can be applied abruptly, but must be removed gradually to avoid osmotic damage to the cell, or it must be both applied and removed slowly. There seems little doubt that the glycerol passes to the interior of the cell, and some evidence that it must do so to be effective. The optimal concentration of the substance evidently varies according to the type of cell, condition of freezing, nature of the suspending medium and so on. 
Once glycerol has been satisfactorily applied, the freezing procedure itself can still vary in temperature gradient and terminal temperature. Today's discussions have made it clear that we are only at the beginning of the study of the effect of cooling velocity on the viability of cells.


1972 ◽  
Vol 30 (3) ◽  
pp. 839-845
Author(s):  
Harold R. Keller ◽  
Ronald K. Parker

Two probability learning studies were conducted, each employing 64 Ss (one with sixth graders and one with college females) in a 2 × 4 factorial design combining external incentive (noncontingent feedback vs. monetary incentive for accuracy) and event schedule (25%, 50%, 75%, and 100%). All Ss were given state instructions which, in effect, required them to indicate (via a written “yes” or “no” response) whether they were in an experimental state of acquisition or extinction. While there were no differences in acquisition for the college Ss, there were differences in terminal acquisition among the sixth graders, contrary to prediction. The partial reinforcement effect was supported. The interaction between external incentive, event schedule, and trials was also significant, suggesting that monetary incentive for accuracy, with Ss responding under state instructions, served to elicit more accurate discriminations of the acquisition and extinction phases.


Author(s):  
Marta Serra-Garcia

This chapter provides a review of the literature on deception in the field of economics. Until recently, the standard assumption in economics was that individuals would lie whenever there was a material incentive to do so. Recent work in behavioral economics and psychology has shown that this assumption is wrong. In fact, many will not lie, even if there is a large monetary incentive to do so. This chapter begins with a review of studies that measure individuals’ aversion to lying, discussing the advantages and disadvantages of different methodologies. Thereafter, there is an overview of studies examining factors that influence lying, and the chapter concludes with a discussion of future potential venues of research on deception.

