The Poliheuristic Theory of Crisis Decision Making and Applied Decision Analysis

Author(s): Inbal Hakman, Alex Mintz, Steven B. Redd

Poliheuristic theory addresses the “why” and “how” of decision making. It focuses on how decision makers use heuristics en route to choice, addressing both the process and the choice related to the decision task. More specifically, decision makers use a two-stage process wherein a complicated choice set is first reduced to a more manageable one through these heuristics, or cognitive shortcuts. In the second stage, decision makers are more likely to employ maximizing and analytical strategies in making a choice. Poliheuristic theory also focuses on the political consequences of decision making, arguing that decision makers will refrain from making politically costly decisions. While poliheuristic theory helps us better understand how decision makers process information and make choices, it does not specifically address how choice sets and decision matrices are created in the first place. Applied decision analysis (ADA) rectifies this shortcoming by focusing on how leaders create particular choice sets and matrices and then arrive at a choice. It does so by first identifying the decision maker’s choice set or decision matrix, that is, the alternatives or options available to choose from as well as the criteria or dimensions upon which the options will be evaluated. ADA then focuses on uncovering the decision maker’s decision code through the use of multiple decision models. Combining poliheuristic theory with ADA allows researchers to more fully explain decision making in general and crisis decision making in particular. An application of poliheuristic theory and ADA to decision making during the Fukushima nuclear disaster reveals that even in this high-stress crisis environment decision makers followed the two-stage process predicted by poliheuristic theory. More specifically, in the first stage, decision makers simplified the decision task by resorting to cognitive heuristics (i.e., decision-making shortcuts) to eliminate politically damaging alternatives such as voluntary evacuation. In the second stage, decision makers conducted a more analytical evaluation of the compulsory evacuation options.

1997, Vol 91 (3), pp. 553-566
Author(s): Alex Mintz, Nehemia Geva, Steven B. Redd, Amy Carnes

Previous studies of political decision making have used only “static” choice sets, where alternatives are “fixed” and are a priori known to the decision maker. We assess the effect of a dynamic choice set (new alternatives appear during the decision process) on strategy selection and choice in international politics. We suggest that decision makers use a mixture of decision strategies when making decisions in a two-stage process consisting of an initial screening of available alternatives, and a selection of the best one from the subset of remaining alternatives. To test the effects of dynamic and static choice sets on the decision process we introduce a computer-based “process tracer” in a study of top-ranking officers in the U.S. Air Force. The results show that (1) national security decision makers use a mixture of strategies in arriving at a decision, and (2) strategy selection and choice are significantly influenced by the structure of the choice set (static versus dynamic).


Author(s): Alex Mintz, Steven B. Redd, Eldad Tal-Shir

Poliheuristic theory focuses on the why and how of decision-making. The primary argument is that decision-makers are sensitive to both cognitive and environmental constraints and are particularly likely to focus on the political consequences of their decisions. Decision-makers use a two-stage process en route to choice, wherein heuristic shortcuts are applied in the first stage to reduce complexity, and in the second stage a maximizing strategy is applied to the remaining alternatives in the choice set. The theory focuses on five main information-processing characteristics: order-sensitive, nonholistic, and dimension-based search, and noncompensatory and satisficing decision rules. The theory has been tested in numerous case studies and statistical and experimental analyses, which have provided strong empirical support for it. In 2013, the United States decided not to attack Syria, despite domestic and international pressure to do so. This case shows the importance of political constraints in President Obama’s decision calculus, which led to the adoption of the chemical disarmament of Syria.


Axioms, 2021, Vol 10 (2), pp. 124
Author(s): Dragiša Stanujkić, Darjan Karabašević, Gabrijela Popović, Predrag S. Stanimirović, Florentin Smarandache, ...

Some decision-making problems, i.e., multi-criteria decision analysis (MCDA) problems, require taking into account the attitudes of a large number of decision-makers and/or respondents. This article therefore considers an approach for transforming crisp ratings collected from respondents into grey interval numbers, based on the median of the collected scores. In this way, the simplicity of collecting respondents’ attitudes as crisp values, i.e., by applying some form of Likert scale, is combined with the advantages of grey interval numbers, yielding a grey extension of MCDA methods. The application of the proposed approach is illustrated with an example of evaluating the websites of tourism organizations using several MCDA methods. Additionally, an analysis of the proposed approach in the case of a large number of respondents, done in Python, is presented. The advantages of the proposed method, as well as its possible limitations, are summarized.
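A minimal Python sketch of the kind of median-based aggregation the abstract describes; the interval construction below (bounding the grey number by the medians of the lower and upper halves of the ratings) is an illustrative assumption, not the authors' exact formula:

```python
import statistics

def crisp_to_grey(ratings):
    """Aggregate crisp Likert ratings into a grey interval number.

    Assumed construction for illustration: the interval is bounded by
    the medians of the ratings at or below and at or above the overall
    median of the collected scores.
    """
    med = statistics.median(ratings)
    lower = statistics.median([r for r in ratings if r <= med])
    upper = statistics.median([r for r in ratings if r >= med])
    return (lower, upper)

# Seven respondents rate a website criterion on a 1-5 Likert scale.
print(crisp_to_grey([1, 2, 3, 4, 5, 4, 3]))  # -> (2.5, 4)
```

The point of such a transformation is that the spread of respondents' crisp scores survives as interval width, which a grey extension of an MCDA method can then propagate through the ranking.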


Author(s): Sahinya Susindar, Harrison Wissel-Littmann, Terry Ho, Thomas K. Ferris

In studying naturalistic human decision-making, it is important to understand how emotional states shape decision-making processes and outcomes. Emotion regulation techniques can improve the quality of decisions, but there are several challenges to evaluating these techniques in a controlled research context. Determining the effectiveness of emotion regulation techniques requires methodology that can: 1) reliably elicit desired emotions in decision-makers; 2) include decision tasks with response measures that are sensitive to emotional loading; and 3) support repeated exposures/trials with relatively consistent emotional loading and response sensitivity. The current study investigates one common method, the Balloon Analog Risk Task (BART), for its consistency and reliability in measuring the risk propensity of decision-makers, and specifically how the method’s effectiveness might change over the course of repeated exposures. With the PANAS-X subjective assessment serving for comparison, results suggest that the BART, when applied over repeated exposures, loses sensitivity to emotional stimuli and exhibits task-related learning effects that influence the observed trends in response data in complex ways. This work is valuable for researchers in decision-making and for guiding designs that account for users’ affective states.


2010, Vol 46 (4), pp. 777-783
Author(s): Antônio Edson de Souza Lucena, Divaldo de Almeida Sampaio, Ednaldo Rosas da Silva, Virgínia Florêncio de Paiva, Ana Cláudia Santiago, ...

Highly purified intravenous immunoglobulin G concentrate (IV IgG) was produced using polyethylene glycol combined with a single-stage ethanol precipitation, instead of the classic Cohn-Oncley process, which employs cold alcohol as the precipitating agent in a three-stage process. The crude fraction, containing more than 95% immunoglobulin G, was purified by liquid chromatography with a cation exchanger, CM-Sepharose, as the stationary phase. During the process, the product was subjected to two-stage viral inactivation: the first stage by the action of sodium caprylate, 30 mM at pH 5.1 +/- 0.1, and the second by a solvent-detergent mixture. The finished product was formulated at 5% with 10% sucralose as the stabilizing agent. The process yields 3.3 g of IgG per liter of plasma. Analysis of the finished product showed an anti-complementary activity lower than 1 CH50. Polymer and aggregate levels were lower than 3% in the five batches studied. Analysis of neutralizing capacity showed antibacterial and antiviral antibodies at concentrations at least three times higher than those found in the source plasma. The finished product fulfilled all purity requirements stated in the 4th edition of the European Pharmacopoeia.


2019, Vol 19 (1), pp. 26-35
Author(s): Xuan Luo, Gaoming Jiang, Honglian Cong

Abstract This paper focuses on achieving a better trade-off between garment simulation quality and simulation speed. For simplicity and clarity, the notation “PART” is defined to indicate an area between the garment and the human body that satisfies certain constraints. The discrete mechanical model is obtained by a two-stage process. In the first stage, the garment is divided into several PARTs constrained by distance. In the second stage, the mechanical model of each PART is formulated as a mathematical expression, yielding the mechanical model of the whole garment. By changing the constrained distance, the simulation result and the simulation speed can be observed, and a desired distance can be chosen as the optimal value. The results of simulations and experiments demonstrate that better performance can be achieved at a higher speed, saving runtime while keeping acceptable simulation results, and the efficiency of the proposed scheme is verified as well.
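The first-stage partition can be sketched as follows. This Python fragment is a hypothetical illustration (the abstract does not give the paper's actual mesh representation or constraint), grouping garment vertices into PARTs by banding their distances to the body:

```python
def partition_garment(distances, band_width):
    """Group garment vertices into PARTs by distance to the body.

    `distances[i]` is the distance from garment vertex i to the human
    body; vertices falling in the same distance band are assigned to
    the same PART. The banding rule is an assumption made for
    illustration, not the paper's exact distance constraint.
    """
    parts = {}
    for vertex, dist in enumerate(distances):
        band = int(dist // band_width)  # which distance band this vertex falls in
        parts.setdefault(band, []).append(vertex)
    return parts

# Four vertices, 0.5-wide distance bands: two PARTs emerge.
print(partition_garment([0.1, 0.2, 0.6, 0.7], 0.5))  # -> {0: [0, 1], 1: [2, 3]}
```

Widening `band_width` merges PARTs (fewer, coarser mechanical models, hence faster simulation); narrowing it splits them (more fidelity, more runtime), which mirrors the speed/quality trade-off the paper tunes.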


eLife, 2021, Vol 10
Author(s): Sashank Pisupati, Lital Chartarifsky-Lynn, Anup Khanal, Anne K Churchland

Perceptual decision-makers often display a constant rate of errors independent of evidence strength. These 'lapses' are treated as a nuisance arising from noise tangential to the decision, e.g. inattention or motor errors. Here, we use a multisensory decision task in rats to demonstrate that these explanations cannot account for lapses' stimulus dependence. We propose a novel explanation: lapses reflect a strategic trade-off between exploiting known rewarding actions and exploring uncertain ones. We tested this model's predictions by selectively manipulating one action's reward magnitude or probability. As uniquely predicted by this model, changes were restricted to lapses associated with that action. Finally, we show that lapses are a powerful tool for assigning decision-related computations to neural structures based on disruption experiments (here, posterior striatum and secondary motor cortex). These results suggest that lapses reflect an integral component of decision-making and are informative about action values in normal and disrupted brain states.


2021
Author(s): Eva D. Regnier, Joel W. Feldmeier

General Eisenhower’s decisions to postpone and, one day later, to launch the “D-Day” invasion of Normandy are a gripping illustration of sequential decisions under uncertainty, suitable for any introductory decision analysis class. They’re also the archetypal example of weather-sensitive decision making using a forecast. This paper develops a framework for analyzing weather-sensitive decisions with a focus on the less-familiar strategic decisions that determine how forecasts are produced and what operational alternatives are available so that decision makers can extract value from forecasts. We tell the story of the decisions made in the months before D-Day regarding how to set up the forecasting process and the myriad decisions implicating nation-level resources that prepared Allied forces not just to invade, but to hold open that decision until the last possible hour so that Eisenhower and his staff could use the critical forecasts. Finally, we overview the current state of the weather-forecasting enterprise, the current challenges of interest to decision analysts, and what this means for decision analysts seeking opportunities to help the weather enterprise improve forecasts and to help operational decision makers extract more value from modern weather forecasts.


2014, Vol 2014, pp. 1-11
Author(s): Thais Cristina Sampaio Machado, Plácido Rogerio Pinheiro, Isabelle Tamanini

Decision making is present in every human activity, from simple day-to-day problems to complex situations inside an organization. Because emotions and reasons can become hard to separate, decision support methods were created to help decision makers make complex decisions, and Decision Support Systems (DSS) were created to aid the application of such methods. The paper presents the development of a new tool that reproduces the procedure for applying the Verbal Decision Analysis (VDA) methodology ORCLASS. The tool, called OrclassWeb, is software that supports the process of the mentioned DSS method, and the paper provides a proof of concept that demonstrates its reliability with respect to ORCLASS.


Author(s): Rui Zheng, Chun Su, Yuqiao Zheng

Most existing warranty policies are rigid, and downtime loss is not taken into account. This study develops a two-stage decision framework to design flexible warranty policies in which downtime loss is considered. In the first stage, a fixed warranty policy is obtained by minimizing the warranty service cost, which determines the baseline number of preventive maintenance actions and their effort level. In the second stage, customers have three options: increase the number of preventive maintenance actions, increase the preventive maintenance effort, or both, resulting in three types of flexible warranty policies. The additional maintenance cost for the increased preventive maintenance actions and/or effort is paid by the customers. The flexible policies are then optimized to minimize the customer’s cost, which is the sum of the downtime loss and the shared maintenance cost. A practical example illustrates the effectiveness of the proposed flexible warranty policies. The results indicate that, compared with fixed warranty policies, both the manufacturer and customers can benefit from the proposed flexible policies, especially when the downtime loss is substantial. Moreover, the proposed policy is more effective when the warranty period is longer.
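The two-stage structure can be sketched numerically. The cost functions below are toy assumptions for illustration only (the abstract does not give the paper's actual reliability or cost models): stage 1 picks the manufacturer's cost-minimizing fixed policy, and stage 2 lets the customer add preventive maintenance (PM) times and/or effort to minimize downtime loss plus shared maintenance cost.

```python
# Toy two-stage flexible-warranty optimization. All cost formulas are
# hypothetical stand-ins for the paper's models.

def warranty_cost(n_pm, effort):
    """Manufacturer's warranty service cost: more PM lowers expected
    repair cost but raises maintenance spend (toy model)."""
    return 100.0 / (1 + n_pm * effort) + 5.0 * n_pm * effort

def customer_cost(n_pm, effort, base_n, base_e):
    """Customer's cost under a flexible policy: downtime loss plus the
    shared cost of PM added beyond the fixed-policy baseline."""
    downtime = 200.0 / (1 + n_pm * effort)
    extra_pm = 5.0 * (n_pm * effort - base_n * base_e)
    return downtime + extra_pm

grid = [(n, e) for n in range(1, 11) for e in (0.5, 1.0, 1.5, 2.0)]

# Stage 1: fixed policy -- minimize the manufacturer's warranty cost.
base_n, base_e = min(grid, key=lambda p: warranty_cost(*p))

# Stage 2: the customer may raise PM times, effort, or both.
options = [(n, e) for n, e in grid if n >= base_n and e >= base_e]
flex_n, flex_e = min(options, key=lambda p: customer_cost(*p, base_n, base_e))

print((base_n, base_e), (flex_n, flex_e))  # fixed baseline vs. flexible choice
```

With these toy numbers the customer adds PM beyond the stage-1 baseline because the downtime loss term dominates, which is the regime where the paper finds flexible policies most beneficial.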

