Developing A Theoretical Framework of Task Complexity for Research on Visualization in Support of Decision Making Under Uncertainty

Author(s):  
Stephen M. Fiore ◽  
Samantha F. Warta ◽  
Andrew Best ◽  
Olivia Newton ◽  
Joseph J. LaViola

This paper describes the initial validation of a theoretical framework to support research on the visualization of uncertainty. Two experiments replicated and extended prior findings, illustrating how manipulating task complexity produces differences in performance. Additionally, by combining workload and performance into a single measure, this framework provides a new metric for assessing uncertainty visualizations. We describe how this work acts as a theoretical scaffold for examining different forms of uncertainty visualization by providing a means for systematically varying task context.

2016 ◽  
Vol 106 (7) ◽  
pp. 1775-1801 ◽  
Author(s):  
Eran Shmaya ◽  
Leeat Yariv

The analysis of lab data entails a joint test of the underlying theory and of subjects' conjectures regarding the experimental design itself, that is, how subjects frame the experiment. We provide a theoretical framework for analyzing such conjectures. We use experiments of decision making under uncertainty as a case study. Absent restrictions on subjects' framing of the experiment, we show that any behavior is consistent with standard updating ("anything goes"), including behavior suggestive of anomalies such as overconfidence, excess belief stickiness, etc. When the experimental protocol restricts subjects' conjectures (plausibly, by generating information during the experiment), standard updating has nontrivial testable implications. (JEL C91, D11, D81, D83)
