Information Theory for Agents in Artificial Intelligence, Psychology, and Economics

Entropy ◽  
2021 ◽  
Vol 23 (3) ◽  
pp. 310
Author(s):  
Michael S. Harré

This review covers some of the central relationships between artificial intelligence, psychology, and economics through the lens of information theory, specifically focusing on formal models of decision theory. In doing so we look at a particular approach that each field has adopted and how information theory has informed the development of each field's ideas. A key theme is expected utility theory, its connection to information theory, and the Bayesian approach to decision-making and forms of (bounded) rationality. What emerges from this review is a broadly unified formal perspective derived from three very different starting points that reflect the unique principles of each field. Each of the three approaches reviewed can, in principle at least, be implemented in a computational model in such a way that, with sufficient computational power, they could be compared with human abilities in complex tasks. However, a central critique that can be applied to all three approaches was first put forward by Savage in The Foundations of Statistics and recently brought to the fore by the economist Binmore: Bayesian approaches to decision-making work in what Savage called 'small worlds' but cannot work in 'large worlds'. This point, in various guises, is central to some of the current debates about the power of artificial intelligence and its relationship to human-like learning and decision-making. Recent work on artificial intelligence has gone some way to bridging this gap, but significant questions still need to be answered in all three fields in order to make progress on these problems.

2016 ◽  
Vol 104 (8) ◽  
pp. 1647-1661 ◽  
Author(s):  
Carlo Cappello ◽  
Daniele Zonta ◽  
Branko Glisic

2020 ◽  
Vol 19 (3) ◽  
pp. 790-810
Author(s):  
Nicholas Kluge Corrêa ◽  
Nythamar Fernandes de Oliveira

How can someone reconcile the desire to eat meat with a tendency toward vegetarian ideals? How should we reconcile contradictory moral values? How can we aggregate different moral theories? How can individual preferences be fairly aggregated to represent a will, norm, or social decision? Conflict resolution and preference aggregation are tasks that intrigue philosophers, economists, sociologists, decision theorists, and many other scholars, making this a rich interdisciplinary area of research. When trying to resolve questions of moral uncertainty, a meta-level understanding of the concept of normativity can help us develop strategies for dealing with norms themselves. Second-order normativity, or norms about norms, is a hierarchical way to think about how to combine many different normative structures and preferences into a single coherent decision. That is what metanormativity is all about: a way to answer the question, what should we do when we don't know what to do? In this study, we review a decision-making strategy for dealing with moral uncertainty, Maximization of Expected Choice-Worthiness. This strategy, proposed by William MacAskill, allows for the aggregation and inter-theoretic comparison of different normative structures, both cardinal and ordinal theories. We exemplify the metanormative methods proposed by MacAskill using a series of vegetarian dilemmas as an example. Given the similarity of this metanormative strategy to expected utility theory, we also show that it is possible to integrate both models to address decision-making problems under combined empirical and moral uncertainty. We believe that this kind of ethical-mathematical formalism can be useful in developing strategies to better aggregate moral preferences and resolve conflicts.
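The core of Maximization of Expected Choice-Worthiness can be sketched in a few lines: weight each theory's choice-worthiness score by the credence assigned to that theory, then pick the option with the highest weighted sum. The theories, credences, and scores below are illustrative assumptions, not values from the paper, and inter-theoretic comparability on a common cardinal scale is taken for granted.

```python
# A minimal sketch of Maximization of Expected Choice-Worthiness (MEC).
# Credences and choice-worthiness scores are illustrative assumptions.

# Credence assigned to each normative theory (must sum to 1).
credences = {"utilitarian": 0.6, "animal_rights": 0.4}

# Choice-worthiness each theory assigns to each option, on a shared
# cardinal scale (inter-theoretic comparability is assumed here).
choice_worthiness = {
    "eat_meat":       {"utilitarian": 5.0, "animal_rights": -10.0},
    "eat_vegetarian": {"utilitarian": 3.0, "animal_rights": 8.0},
}

def expected_choice_worthiness(option):
    """Credence-weighted average of an option's choice-worthiness."""
    return sum(credences[t] * cw for t, cw in choice_worthiness[option].items())

best = max(choice_worthiness, key=expected_choice_worthiness)
```

With these toy numbers, eating meat scores 0.6·5 + 0.4·(−10) = −1, while the vegetarian option scores 0.6·3 + 0.4·8 = 5, so MEC recommends the vegetarian option even though the higher-credence theory mildly favors meat.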


Author(s):  
Alexander Krasilnikov

The paper discusses the evolution of the concept of risk in economics. The history of probabilistic methods and of approaches to risk and uncertainty analysis is considered. Expected utility theory, behavioral approaches, heuristic models, and methods of neuroeconomics are analyzed. The author investigates the stability of the neoclassical research program in risk analysis and suggests further directions for its development.


2018 ◽  
pp. 261-280
Author(s):  
Ivan Moscati

Chapter 16 shows how the validity of expected utility theory (EUT) was increasingly called into question between the mid-1960s and the mid-1970s and discusses how a series of experiments performed from 1974 to 1985 undermined the earlier confidence that EUT makes it possible to measure utility. Beginning in the mid-1960s, in a series of experiments seminal to the field later called behavioral economics, Sarah Lichtenstein, Paul Slovic, Amos Tversky, and others showed that decision patterns violating EUT are systematic. The new experimenters who engaged with the EUT-based measurement of utility from the mid-1970s, namely Uday Karmarkar, Richard de Neufville, Paul Schoemaker, and coauthors, showed that different elicitation methods to measure utility, which according to EUT should produce the same outcome, generate different measures. These findings contributed to destabilizing EUT, undermined the confidence in EUT-based utility measurement, and helped foster a blossoming of novel behavioral models of decision-making under risk.


2008 ◽  
Vol 98 (1) ◽  
pp. 38-71 ◽  
Author(s):  
Thierry Post ◽  
Martijn J van den Assem ◽  
Guido Baltussen ◽  
Richard H Thaler

We examine the risky choices of contestants in the popular TV game show “Deal or No Deal” and related classroom experiments. Contrary to the traditional view of expected utility theory, the choices can be explained in large part by previous outcomes experienced during the game. Risk aversion decreases after earlier expectations have been shattered by unfavorable outcomes or surpassed by favorable outcomes. Our results point to reference-dependent choice theories such as prospect theory, and suggest that path-dependence is relevant, even when the choice problems are simple and well defined, and when large real monetary amounts are at stake. (JEL D81)
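The reference-dependent pattern the paper points to can be illustrated with a prospect-theory-style value function: outcomes are valued relative to a reference point, with losses weighted more heavily than gains. The parameter values below follow Tversky and Kahneman's commonly cited 1992 estimates; treating prior game outcomes as shifting the reference point is an assumption for illustration, not the paper's estimated model.

```python
# A sketch of a reference-dependent (prospect-theory-style) value function.
# Parameters (0.88, 0.88, 2.25) are Tversky & Kahneman's 1992 estimates;
# the choice of reference point is an illustrative assumption.

def value(outcome, reference):
    """Kahneman-Tversky value function over gains/losses vs. a reference."""
    x = outcome - reference
    alpha, beta, lam = 0.88, 0.88, 2.25  # curvature for gains/losses, loss aversion
    return x ** alpha if x >= 0 else -lam * ((-x) ** beta)
```

Because losses loom larger than gains, a contestant whose expectations have been shattered (outcome far below the reference point) sits on the steep loss branch, which is consistent with the shifts in risk attitude the paper documents.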


2021 ◽  
Vol 13 (5) ◽  
pp. 2448
Author(s):  
Li-Ming Chien ◽  
Kung-Jen Tu

The purpose of this study is to propose a feasible operational evaluation model for property-management mergers. The expectation is that merging enterprises will comprehensively improve business management and promote the sharing of logistics supply resources, allowing companies to reduce operating costs and maximize profits. This study uses the modified Delphi method and the analytic hierarchy process (AHP) to identify the key factors behind the dilemmas commonly faced by Taiwan's property management companies, along with the weight of each factor's impact on operations. Finally, we use expected utility theory to develop a valuation model for assessing whether a property company is suitable for integration; the result serves as a reference indicator for merger decisions. After 30 years of vigorous development, Taiwan's property management companies face fierce market competition, and most have seen their profitability decline in the face of these common dilemmas. The study found that merger plans should be accurately evaluated with such a model: sharing logistics resources can indeed bring investment and marketing benefits to a merger and improve a company's profitability. At the time of writing, there is no comparable combined analysis of the property management industry in Taiwan. The method developed here combines multiple-criteria decision analysis with utility theory to produce a decision-making model for assessing consolidation, and it can also be applied to merger assessments in other fields, such as the cleaning service industry and real estate brokerage. This is the main contribution of this paper.
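The AHP weighting step the abstract describes can be sketched as follows: experts fill in a pairwise-comparison matrix on Saaty's 1-9 scale, and priority weights are extracted from it, here via the common geometric-mean approximation. The 3×3 matrix and factor labels are illustrative assumptions, not data from the study.

```python
import math

# A sketch of the AHP priority-weighting step. The pairwise-comparison
# matrix (Saaty's 1-9 scale) is an illustrative assumption, not study data.
pairwise = [
    [1.0, 3.0, 5.0],   # factor A compared against A, B, C
    [1/3, 1.0, 2.0],   # factor B
    [1/5, 1/2, 1.0],   # factor C
]

# Approximate the principal-eigenvector weights via the geometric-mean
# method: take each row's geometric mean, then normalize to sum to 1.
geo_means = [math.prod(row) ** (1 / len(row)) for row in pairwise]
total = sum(geo_means)
weights = [g / total for g in geo_means]
```

In a full AHP application the matrix's consistency ratio would also be checked before the weights are used; here the matrix is small and nearly consistent by construction.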


2009 ◽  
pp. 46-61
Author(s):  
I. Gilboa ◽  
W. A. Postlewaite ◽  
D. Schmeidler

The article considers the paradigm of subjective probability and expected utility theory with respect to their applications in the theory of decision-making. The advantages and shortcomings of Savage's axiomatics in subjective probability theory are analyzed, and models of belief formation are considered. The authors propose a new approach to the analysis of decision-making: a multiple priors model, in which an agent attributes to each event not a single probability but a range of probabilities.
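One standard way to make a multiple-priors model operational is the maxmin expected utility rule associated with Gilboa and Schmeidler: each act is evaluated by its worst expected utility over the set of priors, and the act with the best worst case is chosen. The priors, states, and payoffs below are illustrative assumptions, not an example from the article.

```python
# A minimal sketch of a maxmin (multiple-priors) decision rule.
# Priors and payoffs are illustrative assumptions.

# The agent entertains a set of probability distributions over states,
# rather than a single subjective prior.
priors = [
    {"rain": 0.3, "sun": 0.7},
    {"rain": 0.6, "sun": 0.4},
]

# Utility of each act in each state.
payoffs = {
    "umbrella":    {"rain": 5, "sun": 2},
    "no_umbrella": {"rain": -4, "sun": 6},
}

def maxmin_value(act):
    """Worst-case expected utility of an act over the set of priors."""
    return min(
        sum(p[state] * u for state, u in payoffs[act].items())
        for p in priors
    )

chosen = max(payoffs, key=maxmin_value)
```

With these numbers, carrying the umbrella guarantees at least 2.9 in expectation under every prior, while going without can fall to 0.0 under the pessimistic prior, so the maxmin agent carries the umbrella.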

