Estimation of Mutual Information and Conditional Entropy for Surveillance Optimization

SPE Journal ◽  
2014 ◽  
Vol 19 (04) ◽  
pp. 648-661 ◽  
Author(s):  
Duc H. Le ◽  
Albert C. Reynolds

Summary Given a suite of potential surveillance operations, we define surveillance optimization as the problem of choosing the operation that gives the minimum expected value of P90 minus P10 (i.e., P90 – P10) of a specified reservoir variable J (e.g., cumulative oil production) that will be obtained by conditioning J on the observed data. Two questions can be posed: (1) Which surveillance operation is expected to provide the greatest uncertainty reduction in J? and (2) What is the expected value of the reduction in uncertainty that would be achieved if we were to undertake each surveillance operation to collect the associated data and then history match the data obtained? In this work, we extend and apply a conceptual idea that we recently proposed for surveillance optimization to 2D and 3D waterflooding problems. Our method is based on information theory, in which the mutual information between J and the random observed data vector Dobs is estimated by use of an ensemble of prior reservoir models. This mutual information reflects the strength of the relationship between J and the potential observed data and provides a qualitative answer to Question 1. Question 2 is answered by calculating the conditional entropy of J to generate an approximation of the expected value of the reduction in (P90 – P10) of J. The reliability of our method depends on obtaining a good estimate of the mutual information. We consider several ways to estimate the mutual information and suggest how a good estimate can be chosen. We validate the results of our proposed method with an exhaustive history-matching procedure. The methodology provides an approximate way to decide which data should be collected to maximize the uncertainty reduction in a specified reservoir variable and to estimate the reduction in uncertainty that could be obtained. We expect this paper will stimulate significant research on the application of information theory and lead to practical methods and workflows for surveillance optimization.
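The abstract above describes estimating the mutual information between J and the observed-data vector Dobs from an ensemble of prior models. The sketch below is a minimal illustration of that idea using a k-nearest-neighbor estimator; the placeholder "simulator," the ensemble size, and the variable names are assumptions for illustration and do not reproduce the authors' workflow.

```python
# Minimal sketch (not the authors' implementation): estimate the mutual
# information between a reservoir response J (e.g., cumulative oil) and a
# candidate observed-data vector d_obs using an ensemble of prior models.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
n_ensemble = 200                                   # prior realizations m_1..m_Ne

# Placeholder "simulator": in practice each row would come from a reservoir
# simulation run on one prior model realization.
m = rng.normal(size=(n_ensemble, 5))               # prior model parameters
J = m @ np.array([2.0, -1.0, 0.5, 0.0, 0.0])       # target variable J(m)
d_obs = m[:, :2] + 0.1 * rng.normal(size=(n_ensemble, 2))  # noisy data D_obs(m)

# k-NN mutual-information estimate I(J; each data channel); summing the
# per-channel estimates is a crude surrogate for I(J; D_obs) that ignores
# redundancy between channels.
mi_per_channel = mutual_info_regression(d_obs, J, n_neighbors=5, random_state=0)
print("MI estimate per data channel (nats):", mi_per_channel)
print("Crude aggregate MI surrogate:", mi_per_channel.sum())
```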

Author(s):  
Qinghua Hu ◽  
Daren Yu

Yager's entropy was proposed to compute the information of a fuzzy indiscernibility relation. In this paper, we present a novel interpretation of Yager's entropy from the point of view of the discernibility power of a relation. Then some basic definitions in Shannon's information theory are generalized based on Yager's entropy. We introduce joint entropy, conditional entropy, mutual information, and relative entropy to compute the information changes under fuzzy indiscernibility relation operations. Conditional entropy and relative conditional entropy are proposed to measure the information increment, which is interpreted as the significance of an attribute in the fuzzy rough set model. As an application, we redefine the independence of an attribute set, reduct, and relative reduct in the fuzzy rough set model based on Yager's entropy. Experimental results show that the proposed approach is suitable for fuzzy and numeric data reduction.
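As a rough illustration of the entropies described above, the sketch below computes an entropy on a fuzzy similarity relation and uses a conditional entropy as an attribute-significance score. The Gaussian-kernel relation, the specific entropy formula (log of the normalized fuzzy equivalence-class cardinality), and the elementwise minimum for the joint relation are assumptions chosen for illustration; they follow the general style of fuzzy-rough entropy measures rather than Yager's exact definition.

```python
# Minimal sketch, not Hu and Yu's exact formulation.
import numpy as np

def fuzzy_relation(X, sigma=0.3):
    """Gaussian-kernel fuzzy similarity relation over the samples in X."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def relation_entropy(R):
    """Entropy of a fuzzy indiscernibility relation R (n x n, values in [0, 1])."""
    n = R.shape[0]
    card = R.sum(axis=1)              # fuzzy cardinality of each class [x_i]_R
    return -np.mean(np.log2(card / n))

def conditional_entropy(R_p, R_q):
    """H(Q | P) = H(P, Q) - H(P), with the joint relation taken elementwise (min)."""
    return relation_entropy(np.minimum(R_p, R_q)) - relation_entropy(R_p)

rng = np.random.default_rng(1)
X = rng.random((50, 3))
R_p = fuzzy_relation(X[:, :2])        # relation induced by attributes {a1, a2}
R_q = fuzzy_relation(X[:, 2:3])       # relation induced by attribute a3 alone
# Information increment from adding a3, interpreted as its significance.
print("H(a3 | {a1, a2}) =", conditional_entropy(R_p, R_q))
```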


2009 ◽  
Vol 48 (06) ◽  
pp. 552-557 ◽  
Author(s):  
W. A. Benish

Summary Objectives: Mutual information is a fundamental concept of information theory that quantifies the expected value of the amount of information that diagnostic testing provides about a patient’s disease state. The purpose of this report is to provide both intuitive and axiomatic descriptions of mutual information and, thereby, promote the use of this statistic as a measure of diagnostic test performance. Methods: We derive the mathematical expression for mutual information from the intuitive assumption that diagnostic information is the average amount by which diagnostic testing reduces our surprise upon ultimately learning a patient’s diagnosis. This concept is formalized by defining “surprise” as the surprisal, a function that quantifies the unlikelihood of an event. Mutual information is also shown to be the only function that conforms to a set of axioms that are reasonable requirements of a measure of diagnostic information. These axioms are related to the axioms of information theory used to derive the expression for entropy. Results: Both approaches to defining mutual information lead to the known relationship that mutual information is equal to the pre-test uncertainty of the disease state minus the expected value of the post-test uncertainty of the disease state. Mutual information also has the property of being additive when a test provides information about independent health problems. Conclusion: Mutual information is the best single measure of the ability of a diagnostic test to discriminate among the possible disease states.
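As a concrete illustration of the relationship stated in the Results (mutual information equals the pre-test uncertainty minus the expected post-test uncertainty), the sketch below evaluates both sides from an assumed 2x2 joint distribution of disease state and test result; the prevalence, sensitivity, and specificity values are made up for the example.

```python
import numpy as np

# P(D, T): rows = disease present/absent, columns = test positive/negative
p_joint = np.array([[0.08, 0.02],     # diseased: prevalence 0.1, sensitivity 0.8
                    [0.09, 0.81]])    # healthy:  specificity 0.9

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

p_d = p_joint.sum(axis=1)                      # pre-test distribution of disease
p_t = p_joint.sum(axis=0)                      # distribution of test results
pre_test_uncertainty = entropy(p_d)            # H(D)
post_test_uncertainty = sum(                   # E_T[ H(D | T = t) ]
    p_t[t] * entropy(p_joint[:, t] / p_t[t]) for t in range(2)
)
mutual_info = pre_test_uncertainty - post_test_uncertainty
print(f"I(D;T) = {mutual_info:.4f} bits")
```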


2005 ◽  
Vol 17 (4) ◽  
pp. 741-778 ◽  
Author(s):  
Eric E. Thomson ◽  
William B. Kristan

Performance in sensory discrimination tasks is commonly quantified using either information theory or ideal observer analysis. These two quantitative frameworks are often assumed to be equivalent. For example, higher mutual information is said to correspond to improved performance of an ideal observer in a stimulus estimation task. To the contrary, drawing on and extending previous results, we show that five information-theoretic quantities (entropy, response-conditional entropy, specific information, equivocation, and mutual information) violate this assumption. More positively, we show how these information measures can be used to calculate upper and lower bounds on ideal observer performance, and vice versa. The results show that the mathematical resources of ideal observer analysis are preferable to information theory for evaluating performance in a stimulus discrimination task. We also discuss the applicability of information theory to questions that ideal observer analysis cannot address.
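To make the distinction above concrete, the sketch below computes the mutual information I(S; R) and the accuracy of an ideal (maximum a posteriori) observer for two assumed joint stimulus-response distributions; the distributions are illustrative and are not taken from the paper.

```python
import numpy as np

def mutual_information(p_sr):
    """I(S; R) in bits for a joint distribution p_sr[s, r]."""
    p_s = p_sr.sum(axis=1, keepdims=True)
    p_r = p_sr.sum(axis=0, keepdims=True)
    nz = p_sr > 0
    return (p_sr[nz] * np.log2(p_sr[nz] / (p_s @ p_r)[nz])).sum()

def map_accuracy(p_sr):
    """Ideal observer: for each response, guess the most probable stimulus."""
    return p_sr.max(axis=0).sum()

# Two joint distributions over 2 stimuli x 2 responses with equal MAP accuracy.
A = np.array([[0.4, 0.1],
              [0.1, 0.4]])
B = np.array([[0.5, 0.0],
              [0.2, 0.3]])
for name, P in (("A", A), ("B", B)):
    print(f"{name}: I(S;R) = {mutual_information(P):.3f} bits, "
          f"MAP accuracy = {map_accuracy(P):.2f}")
```

Here both distributions yield the same ideal-observer accuracy (0.8) but different mutual information, consistent with the paper's point that the two frameworks need not agree.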


2018 ◽  
Author(s):  
Forlan La Rosa Almeida ◽  
Helena Nandi Formentin ◽  
Célio Maschio ◽  
Alessandra Davolio ◽  
Denis José Schiozer

Author(s):  
M. Vidyasagar

This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy refers to the level of uncertainty associated with a random variable (or more precisely, the probability distribution of the random variable). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. The last concept is relative entropy, also known as the Kullback–Leibler divergence, which measures the “disparity” between two probability distributions. The chapter first considers convex and concave functions before discussing the properties of the entropy function, conditional entropy, uniqueness of the entropy function, and the Kullback–Leibler divergence.
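The sketch below gives minimal working definitions of the three quantities the chapter covers (entropy, conditional entropy, and the Kullback-Leibler divergence) for discrete distributions; the example distributions are arbitrary.

```python
import numpy as np

def entropy(p):
    """Shannon entropy H(X) in bits of a probability vector p."""
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def conditional_entropy(p_xy):
    """H(Y | X) = H(X, Y) - H(X) for a joint distribution p_xy[x, y]."""
    return entropy(p_xy.ravel()) - entropy(p_xy.sum(axis=1))

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in bits (assumes q > 0 wherever p > 0)."""
    mask = p > 0
    return (p[mask] * np.log2(p[mask] / q[mask])).sum()

p_xy = np.array([[0.3, 0.2],
                 [0.1, 0.4]])
print("H(X)      =", entropy(p_xy.sum(axis=1)))
print("H(Y | X)  =", conditional_entropy(p_xy))
print("D(p || q) =", kl_divergence(np.array([0.5, 0.5]), np.array([0.9, 0.1])))
```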


This chapter presents a higher-order-logic formalization of the main concepts of information theory (Cover & Thomas, 1991), such as the Shannon entropy and mutual information, using the formalization of the foundational theories of measure, Lebesgue integration, and probability. The main results of the chapter include the formalizations of the Radon-Nikodym derivative and the Kullback-Leibler (KL) divergence (Coble, 2010). The latter provides a unified framework based on which most of the commonly used measures of information can be defined. The chapter provides general definitions that are valid for both the discrete and continuous cases and then proves the corresponding reduced expressions when the measures considered are absolutely continuous over finite spaces.
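For reference, the standard (informal) definitions that the chapter formalizes can be written as follows; the notation is conventional measure-theoretic shorthand, not the higher-order-logic syntax of the formalization itself.

```latex
% KL divergence in the general (measure-theoretic) setting, via the
% Radon--Nikodym derivative d\mu/d\nu:
D(\mu \,\|\, \nu) = \int \log\!\frac{d\mu}{d\nu}\, d\mu .

% Reduced expression when the measures are absolutely continuous
% over a finite space:
D(p \,\|\, q) = \sum_{x \in \mathcal{X}} p(x) \log \frac{p(x)}{q(x)} .

% Shannon entropy and mutual information expressed through the KL divergence:
H(X) = -\sum_{x} p(x) \log p(x), \qquad
I(X; Y) = D\big(p_{XY} \,\|\, p_X \otimes p_Y\big).
```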

