POSSIBILISTIC EVALUATION OF SETS

Author(s):  
ANTOON BRONSELAER ◽  
JOSÉ ENRIQUE PONS ◽  
GUY DE TRÉ ◽  
OLGA PONS

In the past decades, possibility theory has been developed as a theory of uncertainty that is compatible with probability theory. Whereas probability theory quantifies uncertainty caused by variability (or, equivalently, randomness), possibility theory quantifies uncertainty caused by incomplete information. A specific case of incomplete information is that of ill-known sets, which is of particular interest in the study of temporal databases. However, the construction of possibility distributions for ill-known sets is known to be highly complex. This paper contributes to the study of ill-known sets by investigating the inference of uncertainty when constraints are specified over ill-known values. More specifically, it investigates how knowledge about constraint satisfaction can be inferred when the constraints themselves are defined by means of ill-known values. It is shown how such reasoning can contribute to the study of (fuzzy) temporal databases.
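
As a rough illustration of the underlying idea (not the authors' exact construction), the following sketch infers the possibility and necessity that a crisp constraint is satisfied by an ill-known value described by a possibility distribution; the temporal example and all numbers are hypothetical.

```python
# Illustrative sketch: degrees to which a constraint on an ill-known
# value is possibly / necessarily satisfied.

def possibility(pi, constraint):
    """Pi(C) = max of pi(v) over the values v that satisfy the constraint."""
    return max((p for v, p in pi.items() if constraint(v)), default=0.0)

def necessity(pi, constraint):
    """N(C) = 1 - Pi(not C): how certain is it that the constraint holds?"""
    return 1.0 - max((p for v, p in pi.items() if not constraint(v)), default=0.0)

# Hypothetical ill-known start year of a temporal database record.
pi_start = {1998: 0.3, 1999: 1.0, 2000: 1.0, 2001: 0.4}
before_2001 = lambda year: year < 2001

print(possibility(pi_start, before_2001))  # 1.0 -> satisfaction fully possible
print(necessity(pi_start, before_2001))    # 0.6 -> but only somewhat certain
```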

Author(s):  
K. DEMIRLI ◽  
M. MOLHIM ◽  
A. BULGAK

Sonar sensors are widely used in mobile robot applications such as navigation, map building, and localization. The performance of these sensors is affected by environmental phenomena, sensor design, and target characteristics. Therefore, the readings obtained from these sensors are uncertain. This uncertainty is often modeled using probability theory. However, the probabilistic approach is valid when the available knowledge is precise, which is not the case for sonar readings. In this paper, the behavior of sonar readings reflected from walls and corners is studied, and new models of angular uncertainty and radial imprecision for sonar readings obtained from corners and walls are proposed. These models are represented using possibility theory, mainly by possibility distributions.
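
As a minimal sketch of what such a model might look like (the triangular shape and all numbers below are assumptions, not the paper's fitted models), a possibility distribution can encode the angular uncertainty of a single sonar return:

```python
# Sketch: triangular possibility distribution over the bearing of a
# sonar reading reflected from a wall (hypothetical parameters).

def triangular_possibility(theta, center, half_width):
    """pi(theta) falls linearly from 1 at `center` to 0 at center +/- half_width."""
    return max(0.0, 1.0 - abs(theta - center) / half_width)

# Hypothetical reading: target believed at 0 deg, beam half-width 12.5 deg.
for theta in (-15.0, -10.0, 0.0, 5.0, 12.5):
    print(theta, triangular_possibility(theta, center=0.0, half_width=12.5))
```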


2018 ◽  
Vol 13 (4) ◽  
pp. 815-839 ◽  
Author(s):  
Qinglong Gou ◽  
Fangdi Deng ◽  
Yanyan He

Purpose Selective crowdsourcing is an important type of crowdsourcing that has been widely used in practice. However, because selective crowdsourcing uses a winner-takes-all mechanism, the efforts of all participants except the final winner are in vain. The purpose of this paper is to explore why this costly mechanism has become popular over the past decade and which types of task fit it well. Design/methodology/approach The authors propose a game model between a sponsor and N participants. The sponsor determines its reward and the participants optimize their effort-spending strategies. In this model, each participant's ability is private information, and thus all roles in the system face incomplete information. Findings The results demonstrate the following: whether the sponsor can obtain a positive expected payoff is determined by the type of task; complex tasks with a strong learning effect are well suited to selective crowdsourcing, whereas for the other two types of task the sponsor either cannot obtain a positive payoff or gains only a rather low one. Besides the task type, the sponsor's efficiency in using the solutions and the public's marginal cost also influence whether the sponsor can obtain a positive surplus from the winner-takes-all mechanism. Originality/value The model presented in this paper is innovative in the following respects. First, each participant's ability is private information, and thus all roles in the system face incomplete information. Second, the winner-takes-all mechanism is used, implying that the sponsor's reward is given entirely to the participant with the highest-quality solution. Third, the sponsor's utility from the solutions, as well as the public's cost to complete the task, are both assumed to be functions satisfying only general properties.
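
A stylized Monte Carlo sketch of the winner-takes-all setting is given below; the paper derives its results analytically from a game model, so the quality model, the efficiency parameter, and all numbers here are hypothetical:

```python
# Toy simulation: N participants with private abilities submit solutions;
# only the best solution earns the reward R (all values hypothetical).

import random

def simulate(n_participants=50, reward=100.0, efficiency=1.5, trials=10000):
    payoffs = []
    for _ in range(trials):
        # Hypothetical: solution quality equals a privately drawn ability.
        qualities = [random.random() for _ in range(n_participants)]
        best = max(qualities)
        # Sponsor's utility scales with the best quality; only the winner is paid.
        payoffs.append(efficiency * reward * best - reward)
    return sum(payoffs) / trials

print(simulate())  # average sponsor payoff under these assumed parameters
```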


2018 ◽  
Author(s):  
Alan Frost

For the past 225 years, the story of the Bounty's voyage has captured the public's imagination. Two compelling characters emerge at the forefront of the mutiny: Lieutenant William Bligh, and his deputy – and ringleader of the mutiny – Acting Lieutenant Fletcher Christian. One is a villain and the other a hero – who plays each role depends on how you view the story. With multiple narratives and incomplete information, some paint Bligh as tyrannical and abusive, and Christian as his deputy who broke under extreme emotional pressure. Others view Bligh as a victim and a hero, and Christian as self-indulgent and underhanded. Alan Frost looks past these common narrative structures to shed new light on what truly happened during the infamous expedition. Reviewing previous accounts and explanations of the voyage and subsequent mutiny, and placing them within a broader historical context, Frost investigates the mayhem, mutiny and mythology of the Bounty.


2014 ◽  
Vol 15 (1) ◽  
pp. 79-116 ◽  
Author(s):  
KIM BAUTERS ◽  
STEVEN SCHOCKAERT ◽  
MARTINE DE COCK ◽  
DIRK VERMEIR

Answer Set Programming (ASP) is a popular framework for modelling combinatorial problems. However, ASP cannot easily be used for reasoning about uncertain information. Possibilistic ASP (PASP) is an extension of ASP that combines possibilistic logic and ASP. In PASP a weight is associated with each rule, where this weight is interpreted as the certainty with which the conclusion can be established when the body is known to hold. As such, it allows us to model and reason about uncertain information in an intuitive way. In this paper we present new semantics for PASP in which rules are interpreted as constraints on possibility distributions. Special models of these constraints are then identified as possibilistic answer sets. In addition, since ASP is a special case of PASP in which all the rules are entirely certain, we obtain a new characterization of ASP in terms of constraints on possibility distributions. This allows us to uncover a new form of disjunction, called weak disjunction, that has not been previously considered in the literature. In addition to introducing and motivating the semantics of weak disjunction, we also pinpoint its computational complexity. In particular, while the complexity of most reasoning tasks coincides with standard disjunctive ASP, we find that brave reasoning for programs with weak disjunctions is easier.
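
The weight-propagation intuition behind PASP (not the constraint-based semantics this paper introduces) can be sketched in a few lines: a derived atom inherits the minimum of the rule's weight and the certainties of the body atoms. The program below is a hypothetical toy example.

```python
# Sketch of possibilistic forward chaining: certainty of a conclusion is
# min(rule weight, certainties of the body atoms).

def forward_chain(facts, rules):
    """facts: {atom: certainty}; rules: [(head, [body atoms], weight)]."""
    changed = True
    while changed:
        changed = False
        for head, body, weight in rules:
            if all(b in facts for b in body):
                c = min([weight] + [facts[b] for b in body])
                if c > facts.get(head, 0.0):
                    facts[head] = c
                    changed = True
    return facts

# Hypothetical program: certainly a bird; fairly certain that birds fly.
facts = {"bird": 1.0}
rules = [("flies", ["bird"], 0.8)]
print(forward_chain(facts, rules))  # {'bird': 1.0, 'flies': 0.8}
```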


Author(s):  
HENRI PRADE ◽  
RONALD R. YAGER

This note investigates how various ideas of "expectedness" can be captured in the framework of possibility theory. In particular, we are interested in trying to introduce estimates of the kind of lack of surprise expressed by people when saying "I would not be surprised that…" before an event takes place, or by saying "I knew it" after its realization. In possibility theory, a possibility distribution is supposed to model the relative levels of possibility of mutually exclusive alternatives in a set, or equivalently, the alternatives are assumed to be rank-ordered according to their level of possibility of taking place. Four basic set-functions associated with a possibility distribution, including standard possibility and necessity measures, are discussed from the point of view of what they estimate when applied to potential events. Extensions of these estimates based on the notions of Q-projection or OWA operators are proposed when only significant parts of the possibility distribution are retained in the evaluation. The case of partially-known possibility distributions is also considered. Some potential applications are outlined.
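
For concreteness, a commonly used quadruple of such set-functions (possibility, necessity, guaranteed possibility, and potential necessity) can be computed as follows; the distribution and the event are illustrative, and this sketch does not cover the Q-projection or OWA extensions discussed in the paper.

```python
# Sketch: the four basic set-functions over an event A, for a
# possibility distribution pi on rank-ordered alternatives.

pi = {"a": 1.0, "b": 0.7, "c": 0.2}  # illustrative distribution

def Pi(A):    return max((pi[x] for x in A), default=0.0)   # possibility
def N(A):     return 1.0 - Pi(set(pi) - set(A))             # necessity
def Delta(A): return min((pi[x] for x in A), default=1.0)   # guaranteed possibility
def Nabla(A): return 1.0 - Delta(set(pi) - set(A))          # potential necessity

A = {"a", "b"}
print(Pi(A), N(A), Delta(A), Nabla(A))  # 1.0 0.8 0.7 0.8
```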


2006 ◽  
Vol 932 ◽  
Author(s):  
K. Yanagizawa ◽  
S. Takeda ◽  
H. Osawa ◽  
Y. Suyama ◽  
H. Takase ◽  
...  

JNC has developed an uncertainty analysis methodology for application to the spatially heterogeneous characteristics of a geological environment. The developed methodology adopts a new approach that identifies all the possible options in concepts and parameter ranges that cannot be excluded in the light of the available evidence. This approach enables uncertainties associated with the understanding at a given stage of site characterization to be made explicit, using probability theory and possibility theory. The uncertainties can be reduced by screening, i.e. by excluding concepts and parameter ranges that can be denied in the light of additional evidence obtained in subsequent investigation stages. This paper outlines the developed methodology and its applicability in the Tono area.
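
The screening step can be pictured with a toy sketch like the one below; the option names, parameter ranges, and evidence are entirely hypothetical:

```python
# Sketch of evidence-based screening: start from every option that the
# current evidence cannot exclude, then drop those denied by new evidence.

candidates = {
    "high_permeability":  (1e-6, 1e-4),   # m/s, hypothetical range
    "low_permeability":   (1e-10, 1e-8),
    "fracture_dominated": (1e-8, 1e-6),
}

def screen(options, excluded):
    """Keep only the options not denied by the additional evidence."""
    return {name: rng for name, rng in options.items() if name not in excluded}

# Hypothetical borehole data from a later investigation stage:
candidates = screen(candidates, excluded={"high_permeability"})
print(candidates)
```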


2005 ◽  
Vol 19 (4) ◽  
pp. 519-531 ◽  
Author(s):  
F. A. Campos ◽  
J. Villar ◽  
J. Barquín

Cournot game theory is one of the theoretical approaches most often used to model electricity market behavior. Nevertheless, this approach is highly influenced by the residual demand curves of the market agents, which are usually not precisely known. This imperfect information has normally been studied with probability theory, but possibility theory may sometimes be more helpful in modeling not only uncertainty but also imprecision and vagueness. In this paper, two dual approaches are proposed to compute a robust Cournot equilibrium when the residual demand uncertainty is modeled with possibility distributions. Additionally, it is shown that these two approaches can be combined into a bicriteria programming model, which can be solved with an iterative algorithm. Some interesting results for a real-size electricity system show the robustness of the proposed methodology.
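
As background for the iterative idea, the classical deterministic case can be sketched as a best-response fixed-point iteration; the linear demand, costs, and parameters below are assumptions, and the paper itself handles possibilistic residual demand:

```python
# Sketch: two-firm Cournot equilibrium by iterating best responses under a
# linear inverse demand P(Q) = a - b*Q (all parameters hypothetical).

def best_response(a, b, cost, q_other):
    """Profit-maximising quantity given the rival's quantity."""
    return max(0.0, (a - cost - b * q_other) / (2.0 * b))

def cournot(a=100.0, b=1.0, costs=(10.0, 20.0), iters=200):
    q = [0.0, 0.0]
    for _ in range(iters):
        q = [best_response(a, b, costs[0], q[1]),
             best_response(a, b, costs[1], q[0])]
    return q

print(cournot())  # ~[33.33, 23.33], matching the analytic Cournot solution
```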


2019 ◽  
Vol 18 (4) ◽  
pp. 358-367
Author(s):  
Nguyen Trung Thanh ◽  
Nguyen Minh Huan ◽  
Tran Quang Tien

Ocean data assimilation has advanced with the growth of computing power over the past 30 years, but the theory of data assimilation has a long history rooted in probability theory, mathematical simulation, and traditional numerical calculation. Research on and application of data assimilation technology in oceanography have therefore been widely carried out around the world and have achieved considerable results. In Vietnam, data assimilation has also been applied in meteorology and has attracted increasing interest in oceanography. This paper presents the results of a study that assimilates satellite wave height data together with wave heights measured at the MSP1 station of Vietsovpetro, using the Ensemble Kalman Filter (EnKF) method with the SWAN wave model in the Eastern Vietnam Sea.
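
A minimal sketch of the EnKF analysis step for a scalar state such as significant wave height is given below; the ensemble, observation, and error values are hypothetical, and the actual study couples the filter to the SWAN model:

```python
# Sketch: stochastic EnKF analysis step for a scalar state, using
# perturbed observations and the sample-variance Kalman gain.

import random

def enkf_update(ensemble, obs, obs_err_std):
    """Nudge each member toward a perturbed observation by the Kalman gain."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    gain = var / (var + obs_err_std ** 2)
    return [x + gain * (obs + random.gauss(0.0, obs_err_std) - x)
            for x in ensemble]

# Hypothetical forecast ensemble of wave heights (m) and an altimeter obs.
forecast = [1.8, 2.1, 2.4, 2.0, 2.3]
analysis = enkf_update(forecast, obs=2.6, obs_err_std=0.2)
print(analysis)  # members pulled toward the observation
```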

