Fuzzy Rationality in Quantitative Decision Analysis

Author(s):  
Natalia Nikolova ◽  
Aleksei Shulus ◽  
Daniela Toneva ◽  
Kiril Tenekedjiev ◽  
...  

The paper presents a discussion on fuzzy rationality in the elicitation of subjective probabilities and utilities. Extending previous research, two functions measuring the degree of preference of the real decision maker on both sides of the uncertainty interval are introduced, and their relationship with the indifference function, which measures the degree of indifference of the decision maker over gambles, is analyzed and graphically interpreted. A new relation, hesitation, is introduced to better describe the actual process of subjective elicitation by real decision makers. The influence of the preference-hesitation combination in an elicitation process is presented graphically, and it is argued that the resulting uncertainty interval is much tighter than the one resulting from preference-indifference based elicitation.

2010 ◽  
Vol 7 (3) ◽  
pp. 511-528 ◽  
Author(s):  
Goran Devedzic ◽  
Danijela Milosevic ◽  
Lozica Ivanovic ◽  
Dragan Adamovic ◽  
Miodrag Manic

Negative-positive-neutral (NPN) logic provides an alternative framework for fuzzy cognitive map development and decision analysis. This paper reviews the basic notions of NPN logic and NPN relations and proposes an adaptive approach to the assessment of causality weights. It employs linguistic models of causality weights activated by the concept values of measurement-based fuzzy cognitive maps. These models allow for quasi-dynamical adaptation to changes in concept values, providing a deeper understanding of possible side effects. Since in real-world environments almost every decision has consequences, which themselves represent a valuable portion of the information upon which we make decisions, knowledge about side effects enables more reliable decision analysis and directs the actions of the decision maker.
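The quasi-dynamical inference step of a fuzzy cognitive map can be sketched as below. This is a minimal generic FCM iteration, not the linguistic models of the paper; the three-concept map, the weight values, and the sigmoid squashing function are illustrative assumptions. Negative weights stand for the "negative" causal influences of NPN logic.

```python
import numpy as np

def sigmoid(x, lam=1.0):
    """Squashing function keeping concept values in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-lam * x))

def fcm_step(concepts, weights, lam=1.0):
    """One inference step of a fuzzy cognitive map.

    concepts : 1-D array of current concept activation values.
    weights  : weights[i, j] is the causal influence of concept i
               on concept j (negative, positive, or zero).
    """
    return sigmoid(concepts + concepts @ weights, lam)

# Hypothetical 3-concept map: C0 promotes C1, C1 inhibits C2
# (a possible "side effect" of acting on C0).
W = np.array([[0.0, 0.7,  0.0],
              [0.0, 0.0, -0.5],
              [0.0, 0.0,  0.0]])
c = np.array([0.8, 0.3, 0.6])
for _ in range(20):  # iterate toward a (near) fixed point
    c = fcm_step(c, W)
```

Iterating the map until the concept values stabilise exposes indirect effects, such as the suppression of C2 that follows from activating C0.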


Author(s):  
Soumana Fomba ◽  
Pascale Zarate ◽  
Marc Kilgour ◽  
Guy Camilleri ◽  
Jacqueline Konate ◽  
...  

Recommender systems aim to support decision-makers by providing decision advice. We briefly review tools of Multi-Criteria Decision Analysis (MCDA), including aggregation operators, that could serve as the basis for a recommender system. Then we develop a multi-criteria recommender system, STROMa (SysTem of RecOmmendation Multi-criteria), to support decisions by aggregating measures of performance contained in a performance matrix. The system makes inferences about preferences using a partial order on criteria input by the decision-maker. To determine a total ordering of the alternatives, STROMa uses a multi-criteria aggregation operator, the Choquet integral of a fuzzy measure. Thus, recommendations are calculated using partial preferences provided by the decision maker and updated by the system. An integrated web platform is under development.
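The discrete Choquet integral mentioned above can be sketched as follows; the criteria names and fuzzy-measure values are invented for illustration, and how STROMa identifies its measure from the decision-maker's partial order is not shown here.

```python
def choquet(scores, mu):
    """Discrete Choquet integral of `scores` w.r.t. the capacity `mu`.

    scores : dict mapping each criterion to a non-negative value.
    mu     : dict mapping frozensets of criteria to their measure,
             with mu(empty set) = 0 and mu(all criteria) = 1.
    """
    # Sort criteria by increasing score; at each step the criteria
    # still remaining form the coalition whose measure weights the
    # score increment.
    items = sorted(scores.items(), key=lambda kv: kv[1])
    total, prev = 0.0, 0.0
    for i, (_, x) in enumerate(items):
        coalition = frozenset(c for c, _ in items[i:])
        total += (x - prev) * mu[coalition]
        prev = x
    return total

# Hypothetical two-criteria example (names and measure values invented).
# mu is sub-additive here: the criteria are partly redundant.
mu = {frozenset(): 0.0,
      frozenset({"price"}): 0.4,
      frozenset({"quality"}): 0.5,
      frozenset({"price", "quality"}): 1.0}
score = choquet({"price": 0.6, "quality": 0.9}, mu)
# score = (0.6 - 0)*1.0 + (0.9 - 0.6)*0.5 = 0.75
```

Unlike a weighted mean, the capacity can encode interaction between criteria: when the measure is additive the Choquet integral reduces to the ordinary weighted average.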


Author(s):  
Marion Ledwig

Spohn's decision model, an advancement of Fishburn's theory, is valuable for making explicit the principle used also by other thinkers that 'any adequate quantitative decision model must not explicitly or implicitly contain any subjective probabilities for acts.' This principle is not used in the decision theories of Jeffrey or of Luce and Krantz. According to Spohn, this principle is important because it has effects on the term of action, on Newcomb's problem, and on the theory of causality and the freedom of the will. On the one hand, I will argue against Spohn with Jeffrey that the principle has to be given up. On the other, I will try to argue against Jeffrey that the decision-maker ascribes subjective probabilities to actions on the condition of the given decision situation.


Facilities ◽  
2016 ◽  
Vol 34 (13/14) ◽  
pp. 891-905 ◽  
Author(s):  
Peter Palm

Purpose: The purpose of this paper is to examine how the real estate owner (decision maker) ensures being able to make informed decisions, and how approaches differ according to organisational form.
Design/methodology/approach: This research is based on an interview study of nineteen firm representatives, six decision makers and thirteen management representatives, all from the Swedish commercial real estate sector.
Findings: The study concludes that, regardless of organisational setting, the industry has a plan for handling information. The decision makers have all secured access to the required/desired information. How this is done, and what kind of information is involved, differs depending on whether the real estate management is in-house or outsourced. Furthermore, a clear focus on financial and contractual information is evident in both organisational settings.
Research limitations/implications: The research in this paper is limited to the Swedish commercial real estate sector.
Practical implications: The insight the paper provides regarding required information can shed light on how information systems are built and how information sharing can be improved.
Originality/value: The paper provides insight into how the industry, depending on organisational setting, prioritises different information and how the decision maker secures access to it.


2018 ◽  
pp. 49-68 ◽  
Author(s):  
M. E. Mamonov

Our analysis documents that the existence of hidden “holes” in the capital of not-yet-failed banks, while creating intertemporal pressure on the actual level of capital, leads to a change in the maturity of loans supplied rather than to a contraction of their volume. Long-term loans decrease, whereas short-term loans rise, and, most remarkably, by approximately the same amounts. Standardly, the higher the maturity of loans, the higher the credit risk and thus the more loan loss provisions (LLP) banks are forced to create, increasing the pressure on capital. Banks that already hide “holes” in their capital, but have not yet faced license withdrawal, must possess strong incentives to shorten the maturity of supplied loans. On the one hand, this raises the turnover of LLP and facilitates the flexibility of capital management; on the other hand, it increases the speed at which attracted deposits are shifted to loans to related parties in domestic or foreign jurisdictions. This enlarges the potential size of the ex post revealed “hole” in the capital and therefore suggests that not every loan is a good for the economy: excessive short-term and insufficient long-term lending can become a source of future losses.


Author(s):  
Vivek Raich ◽  
Pankaj Maurya

In the era of information technology, big data stores are growing continuously. As a result, huge amounts of data are available to decision makers, which has driven the progress of information technology and its wide adoption in many areas of business, engineering, medicine, and scientific study. Big data refers not only to datasets that are large in size but also to data of several types that are not easy to handle, so dedicated technology is required to process them. Because data keeps increasing in this way, it is important to study and manage these datasets in line with requirements so that the necessary information can be obtained. The aim of this paper is to analyze some of the analytic methods and tools that can be applied to big data. In addition, applications of big data are analyzed, in which the decision maker works on big data and uses the resulting insights for different applications.


Author(s):  
J Ph Guillet ◽  
E Pilon ◽  
Y Shimizu ◽  
M S Zidi

Abstract This article is the first of a series of three presenting an alternative method of computing the one-loop scalar integrals. This novel method enjoys a couple of interesting features as compared with the method closely following ’t Hooft and Veltman adopted previously. It directly proceeds in terms of the quantities driving algebraic reduction methods. It applies to the three-point functions and, in a similar way, to the four-point functions. It also extends to complex masses without much complication. Lastly, it extends to kinematics more general than that of physical, e.g., collider, processes relevant at one loop. This last feature may be useful when considering the application of this method beyond one loop, using generalized one-loop integrals as building blocks.


Axioms ◽  
2021 ◽  
Vol 10 (2) ◽  
pp. 124
Author(s):  
Dragiša Stanujkić ◽  
Darjan Karabašević ◽  
Gabrijela Popović ◽  
Predrag S. Stanimirović ◽  
Florentin Smarandache ◽  
...  

Some decision-making problems, i.e., multi-criteria decision analysis (MCDA) problems, require taking into account the attitudes of a large number of decision-makers and/or respondents. Therefore, this article considers an approach for transforming crisp ratings collected from respondents into grey interval numbers based on the median of the collected scores. In this way, the simplicity of collecting respondents’ attitudes using crisp values, i.e., by applying some form of Likert scale, is combined with the advantages that can be achieved by using grey interval numbers, and a grey extension of MCDA methods is obtained. The application of the proposed approach is illustrated on the example of evaluating the websites of tourism organizations by using several MCDA methods. Additionally, an analysis of the application of the proposed approach in the case of a large number of respondents, done in Python, is presented. The advantages of the proposed method, as well as its possible limitations, are summarized.
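A median-based crisp-to-grey transformation of this general kind can be sketched in Python. The construction below, taking the medians of the scores at or below and at or above the overall median as the grey bounds, is one plausible reading and an assumption on our part; the article's exact rule may differ.

```python
from statistics import median

def crisp_to_grey(ratings):
    """Transform crisp Likert ratings into a grey interval number.

    Assumed rule (for illustration only): the lower bound is the
    median of the scores <= the overall median, and the upper bound
    is the median of the scores >= it.
    """
    m = median(ratings)
    lower = median([r for r in ratings if r <= m])
    upper = median([r for r in ratings if r >= m])
    return (lower, upper)

# Ratings from seven hypothetical respondents on a 1-5 Likert scale:
g = crisp_to_grey([3, 4, 4, 5, 2, 4, 3])
# g == (3.5, 4.0)
```

The resulting interval, rather than a single averaged score, then feeds a grey extension of the chosen MCDA method, preserving some of the dispersion in the respondents' attitudes.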


2020 ◽  
Vol 36 (S1) ◽  
pp. 37-37
Author(s):  
Americo Cicchetti ◽  
Rossella Di Bidino ◽  
Entela Xoxi ◽  
Irene Luccarini ◽  
Alessia Brigido

Introduction: Different value frameworks (VFs) have been proposed in order to translate available evidence on the risk-benefit profiles of new treatments into pricing and reimbursement (P&R) decisions. However, limited evidence is available on the impact of their implementation. It is relevant to distinguish between VFs proposed by scientific societies and providers, which are usually applicable to all treatments, and VFs elaborated by regulatory agencies and health technology assessment (HTA) bodies, which focus on specific therapeutic areas. Such heterogeneity in VFs has significant implications in terms of the value dimensions considered and the criteria adopted to define or support a price decision.
Methods: A literature review was conducted to identify VFs already proposed or adopted for onco-hematology treatments. Both the scientific and the grey literature were investigated. Then, an ad hoc data collection was conducted for multiple myeloma; breast, prostate and urothelial cancer; and non-small cell lung cancer (NSCLC) therapies. Pharmaceutical products authorized by the European Medicines Agency from January 2014 to December 2019 were identified. The primary sources of data were European Public Assessment Reports and P&R decisions taken by the Italian Medicines Agency (AIFA) up to September 2019.
Results: The analysis allowed us to define a taxonomy to distinguish categories of VF relevant to onco-hematological treatments. We identified the “real-world” VF that emerged from past P&R decisions taken at the Italian level. Data were collected both for clinical and economic outcomes/indicators, as well as for decisions taken on the innovativeness of therapies. Relevant differences emerge between the real-world value framework and the one that should be applied given the normative framework of the Italian health system.
Conclusions: The value framework that emerged from the analysis addressed specific aspects of onco-hematological treatments identified during an ad hoc analysis conducted on treatments authorized in the last five years. The perspective adopted to elaborate the VF was that of an HTA agency responsible for P&R decisions at the national level. Furthermore, by comparing a real-world value framework with one based on the general criteria defined by the national legislation, our analysis allowed identification of the most critical points of the current national P&R process in terms of the sustainability of current and future therapies, such as advanced therapies and tumor-agnostic therapies.


AI & Society ◽  
2021 ◽  
Author(s):  
Simona Chiodo

Abstract We continuously talk about autonomous technologies. But how can words qualifying technologies be the very same words chosen by Kant to define what is essentially human, i.e. being autonomous? The article focuses on a possible answer by reflecting upon both etymological and philosophical issues, as well as upon the case of autonomous vehicles. Most interestingly, on the one hand, we have the notion of (human) “autonomy”, meaning that there is a “law” that is “self-given”, and, on the other hand, we have the notion of (technological) “automation”, meaning that there is something “offhand” that is “self-given”. Yet, we are experiencing a kind of twofold shift: on the one hand, the shift from defining technologies in terms of automation to defining technologies in terms of autonomy and, on the other hand, the shift from defining humans in terms of autonomy to defining humans in terms of automation. From a philosophical perspective, the shift may mean that we are trying to escape precisely from what autonomy founds, i.e. individual responsibility of humans that, in the Western culture, have been defined for millennia as rational and moral decision-makers, even when their decisions have been the toughest. More precisely, the shift may mean that we are using technologies, and in particular emerging algorithmic technologies, as scapegoats that bear responsibility for us by making decisions for us. Moreover, if we consider the kind of emerging algorithmic technologies that increasingly surround us, starting from autonomous vehicles, then we may argue that we also seem to create a kind of technological divine that, by being always with us through its immanent omnipresence, omniscience, omnipotence and inscrutability, can always be our technological scapegoat freeing us from the most unbearable burden of individual responsibility resulting from individual autonomy.

