Hierarchical Expertise-Level Modeling for User Specific Robot-Behavior Explanations

2020 · Vol 34 (03) · pp. 2518-2526
Author(s): Sarath Sreedharan, Tathagata Chakraborti, Christian Muise, Subbarao Kambhampati

In this work, we present a new planning formalism called Expectation-Aware planning for decision-making with humans in the loop, where the human's expectations about an agent may differ from the agent's own model. We show how this formulation allows agents not only to leverage existing strategies for handling model differences, such as explanations (Chakraborti et al. 2017) and explicability (Kulkarni et al. 2019), but also to exhibit novel behaviors generated by combining these strategies. Our formulation also reveals a deep connection to existing approaches in epistemic planning. Specifically, we show how classical planning compilations for epistemic planning can be leveraged to solve Expectation-Aware planning problems. To the best of our knowledge, the proposed formulation is the first complete solution to planning with diverging user expectations that is amenable to a classical planning compilation while successfully combining previous work on explanation and explicability. We empirically show how our approach provides a computational advantage over earlier approaches that rely on search in the space of models.
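The interplay between explanation and explicability described above can be pictured with a deliberately simplified cost trade-off (an illustrative sketch, not the paper's formal compilation; all plan names and costs below are invented): for each candidate plan, the agent weighs its own execution cost plus the cheaper of explaining the model difference or tolerating the plan's inexplicability to the human.

```python
def best_tradeoff(plans, agent_cost, human_cost, explain_cost):
    """Pick the plan minimizing execution cost plus the cheaper of
    (a) explaining the model difference or (b) accepting the human's
    surprise, measured against the human-expected optimum."""
    human_optimum = min(human_cost[p] for p in plans)
    best_plan, best_total = None, float("inf")
    for p in plans:
        surprise = human_cost[p] - human_optimum      # inexplicability of p
        total = agent_cost[p] + min(explain_cost(p), surprise)
        if total < best_total:
            best_plan, best_total = p, total
    return best_plan, best_total

# The human believes the shortcut is blocked, so the detour looks optimal
# to them; a cheap explanation makes the shortcut worth taking anyway.
plans = ["shortcut", "detour"]
agent_cost = {"shortcut": 4, "detour": 7}
human_cost = {"shortcut": 9, "detour": 5}
choice = best_tradeoff(plans, agent_cost, human_cost, lambda p: 2)
print(choice)  # ('shortcut', 6)
```

A purely explicable agent would take the detour; a purely explanatory agent always explains. Minimizing the combined cost lets either behavior, or a mixture, emerge from the same objective.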

1999 · Vol 21 (s-1) · pp. 63-73
Author(s): Anne M. Magro

Prior research in psychology and accounting suggests that features of the decision-making task and context affect information processing, yet the decision-making context is often ignored in tax judgment and decision-making research. Two primary decision contexts in the tax setting are planning and compliance. If these two contexts differ on significant features, the information processing of tax professionals in these settings is also likely to differ. An analysis of the characteristics of tax planning and compliance contexts suggests that planning problems are generally characterized by greater complexity, ambiguity, and justifiability demands than are compliance problems. Experienced tax professionals' knowledge of these differences in complexity, ambiguity, and justifiability demands of problems in the planning and compliance contexts was tested in an experiment in which decision-making context was manipulated. Each participant rated the complexity, ambiguity, and justifiability demands of six research cases. As predicted, participants in the planning condition rated the cases as higher in complexity, ambiguity, and justifiability demands than did participants in the compliance condition. Behavioral implications of these differences were demonstrated in that managers in the planning context budgeted significantly more time for staff to complete tax research than did those in the compliance context.


Author(s): Elzbieta Malinowski

Data warehouses (DWs) integrate data from different source systems in order to provide historical information that supports the decision-making process. The design of a DW is a complex and costly task since the inclusion of different data items in a DW depends on both users’ needs and data availability in source systems. Currently, there is still a lack of a methodological framework that guides developers through the different stages of the DW design process. On the one hand, there are several proposals that informally describe the phases used for developing DWs based on the authors’ experience in building such systems (Inmon, 2002; Kimball, Reeves, Ross, & Thornthwaite, 1998). On the other hand, the scientific community proposes a variety of approaches for developing DWs, discussed in the next section. Nevertheless, they either include features that are meant for the specific conceptual model used by the authors, or they are very complex. This situation arose because the need to build DW systems that fulfill user expectations ran ahead of methodological and formal approaches for DW development comparable to those already established for operational databases.


1984 · Vol 1 (4) · pp. 76-79
Author(s): Alan R. Ek, Dietmar W. Rose, Hans M. Gregersen

Abstract Common Lake States practices of stand description and inventory, and continuous forest inventory (CFI) are criticized. The origin of current practices is discussed and suggestions are made for refinement of practices in light of emerging forest sampling and management decision-making techniques. Emphasis is placed on developing data of varying precision levels that meet the requirements for specific forest management and planning problems. North. J. Appl. For. 1:76-79, Dec. 1984.


2015 · Vol 10 (3) · pp. 595-600
Author(s): Rajeeb Ghimire

This paper deals with the concept of ‘community acceptance testing (CAT)’, which is perhaps a new concept in the water supply sector. To understand this, it is necessary to view the water supply system as a product of engineering works and water as a social good. While the engineering approach verifies the product against predefined specifications, CAT validates the capability of that product to satisfy user expectations. In the water supply, sanitation and hygiene sector, there is a culture of verification, but validation should also be given due importance. The validation process is based on user stories and is done before handing over the project to the community. It establishes the community's supremacy over system decision-making and service delivery. The CAT approach promotes the design of community-engineered systems.


2019
Author(s): David R. Mandel, Mandeep K. Dhami, Serena Tran, Daniel Irwin

Probability information is regularly communicated to experts who must fuse multiple estimates to support decision-making. Such information is often communicated verbally (e.g., “likely”) rather than with precise numeric (point) values (e.g., “.75”), yet people are not taught to perform arithmetic on verbal probabilities. We hypothesized that the accuracy and logical coherence of averaging and multiplying probabilities will be poorer when individuals receive probability information in verbal rather than numerical point format. In four experiments (N = 213, 201, 26, and 343, respectively), we manipulated probability communication format between-subjects. Participants averaged and multiplied sets of four probabilities. Across experiments, arithmetic accuracy and coherence were significantly better with point than with verbal probabilities. These findings generalized between expert (intelligence analysts) and non-expert samples and when controlling for calculator use. Experiment 4 revealed an important qualification: whereas accuracy and coherence were better among participants presented with point probabilities than with verbal probabilities, imprecise numeric probability ranges (e.g., “.70 to .80”) afforded no computational advantage over verbal probabilities. Experiment 4 also revealed that the advantage of the point over the verbal format is partially mediated by strategy use. Participants presented with point estimates are more likely to use mental computation than guesswork, and mental computation was found to be associated with better accuracy. Our findings suggest that where computation is important, probability information should be communicated to end users with precise numeric probabilities.
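The two arithmetic tasks participants performed are simple to state numerically. A minimal sketch follows; the verbal-to-number mapping is an invented illustration, not the study's materials, and points to why verbal estimates are hard to compute with, since no canonical mapping exists.

```python
# Invented illustrative mapping from verbal probability phrases to numbers;
# real readers disagree on these values, which is part of the problem.
VERBAL_MAP = {"unlikely": 0.25, "even chance": 0.50, "likely": 0.70, "very likely": 0.90}

def average(probs):
    """Average of a set of probability estimates."""
    return sum(probs) / len(probs)

def product(probs):
    """Product, e.g. the chance that several independent events all occur."""
    result = 1.0
    for p in probs:
        result *= p
    return result

point_estimates = [0.70, 0.80, 0.60, 0.90]
print(average(point_estimates))  # ≈ 0.75
print(product(point_estimates))  # ≈ 0.3024
```

With point estimates both operations are mechanical; with phrases like “likely” the respondent must first commit to a number, which is where accuracy and coherence degrade.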


1971 · Vol 3 (3) · pp. 253-266
Author(s): A. Faludi

This paper develops conceptual tools for the analysis of planning behaviour. These are, firstly, a model of planning systems as learning systems, and then three dimensions of planning behaviour, each described by defining a pair of dichotomous concepts at their far ends: ‘blueprint’ versus ‘process’ modes of planning; ‘rational-deductive’ decision-making versus ‘disjointed incrementalism’; ‘normative’ versus ‘functional’ planning. Each of these concepts is discussed in detail, and some indicators for the analysis of planning behaviour are suggested. Finally, a more complex model is constructed which combines the three dimensions. Elements of this model are firstly the level at which planning is conducted within a hierarchy of planning systems, and secondly, the ‘planning sub-structure’, that is, the technology-image reflecting the nature of planning problems and available planning technologies. From this model one can derive a number of researchable hypotheses about planning behaviour.


2021
Author(s): Rusne Sileryte, Alexander Wandl, Arjan van Timmeren

With circular economy being high on governmental agendas, there is an increasing request from governing bodies for circularity measurements. Yet currently existing macro-level monitoring frameworks are widely criticized for not being able to inform decision-making. The reasons behind their failure stem from a lack of consensus on terminologies and definitions among scholars, politicians and practitioners, a lack of supporting data and tools and, consequently, a lack of transparency and trustworthiness. To fulfill those needs, a bottom-up approach to building a shared terminology is suggested, involving macro-framework users within a government, data providers and tool developers. Their expertise and expectations for monitoring the transition are elicited through the process of formal ontology development and alignment. The ontology development experiment builds upon a use case of the Amsterdam Circular Economy Monitor (2020). First, four ontology development approaches are used to create a theory-centered, a user-centered, a tool-centered and a data-centered ontology. The ontologies are later compared, merged, and aligned with each other to arrive at one single ontology. The notes taken during the process are used to provide a detailed discussion on common concepts, identified conflicts, and gaps in monitoring expectations between the monitor users, data, tools, and the latest theory.
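The comparison step across the four ontologies can be pictured with plain set operations over each ontology's concept vocabulary (an illustrative sketch; the concept names below are invented and not taken from the Amsterdam monitor):

```python
# Invented concept vocabularies for the four independently developed views.
ontologies = {
    "theory": {"material flow", "reuse", "recycling", "waste"},
    "user":   {"reuse", "jobs", "waste", "emissions"},
    "tool":   {"material flow", "waste", "emissions"},
    "data":   {"waste", "material flow", "recycling"},
}

shared = set.intersection(*ontologies.values())  # concepts all four agree on
union = set.union(*ontologies.values())          # everything mentioned anywhere
gaps = {name: union - terms for name, terms in ontologies.items()}

print(shared)           # {'waste'}
print(gaps["theory"])   # concepts the theory-centered view is missing
```

Real ontology alignment also has to resolve near-synonyms and conflicting definitions, which simple set intersection cannot see; this sketch only shows the vocabulary-overlap part of the exercise.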


Author(s): Andrew Mitchell, Wheeler Ruml, Fabian Spaniol, Jörg Hoffmann, Marek Petrik

In real-time planning, an agent must select the next action to take within a fixed time bound. Many popular real-time heuristic search methods approach this by expanding nodes using time-limited A* and selecting the action leading toward the frontier node with the lowest f value. In this paper, we reconsider real-time planning as a problem of decision-making under uncertainty. We propose treating heuristic values as uncertain evidence and we explore several backup methods for aggregating this evidence. We then propose a novel lookahead strategy that expands nodes to minimize risk, the expected regret in case a non-optimal action is chosen. We evaluate these methods in a simple synthetic benchmark and the sliding tile puzzle and find that they outperform previous methods. This work illustrates how uncertainty can arise even when solving deterministic planning problems, due to the inherent ignorance of time-limited search algorithms about those portions of the state space that they have not computed, and how an agent can benefit from explicitly metareasoning about this uncertainty.
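The baseline selection rule the paper reconsiders can be sketched as node-limited A* lookahead that commits to the first action on the path toward the minimum-f frontier node. This is an illustrative sketch with an invented toy graph; the paper's risk-minimizing expansion and uncertainty backups are not reproduced here.

```python
import heapq

def realtime_astar_step(start, neighbors, h, expansion_budget):
    """Node-limited A* lookahead: return the first action (successor of
    `start`) on the path toward the frontier node with the lowest f = g + h."""
    frontier = [(h(start), 0, start, None)]  # (f, g, node, first successor)
    seen = {start: 0}
    expanded = 0
    while frontier and expanded < expansion_budget:
        f, g, node, first = heapq.heappop(frontier)
        expanded += 1
        for succ, cost in neighbors(node):
            g2 = g + cost
            if succ in seen and seen[succ] <= g2:
                continue                      # already reached at least as cheaply
            seen[succ] = g2
            heapq.heappush(frontier, (g2 + h(succ), g2, succ,
                                      succ if first is None else first))
    return min(frontier)[3] if frontier else None

# Invented toy graph: two routes from s toward goal g; h estimates cost-to-go.
graph = {"s": [("a", 1), ("b", 1)], "a": [("g", 1)], "b": [("g", 3)], "g": []}
hvals = {"s": 2, "a": 1, "b": 1, "g": 0}
action = realtime_astar_step("s", graph.__getitem__, hvals.__getitem__, 3)
print(action)  # a
```

The uncertainty the paper exploits is visible here: after three expansions the agent has only heuristic estimates for unexpanded frontier nodes, yet the baseline trusts the single lowest f value as if it were certain.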

