Information Theory and the Role of Intermediaries

Author(s): Reinhard H. Schmidt, Marcel Tyrell
1998, Vol 44 (6), pp. 2045-2056
Author(s): A.D. Wyner, J. Ziv, A.J. Wyner

2020, pp. 464-490
Author(s): Miquel Feixas, Mateu Sbert

Around seventy years ago, Claude Shannon, then working at Bell Laboratories, introduced information theory with the main purpose of modeling the communication channel between a source and a receiver. The communication channel, or information channel as it later became known, establishes the shared information between the source or input and the receiver or output, both of which are represented by random variables, that is, by probability distributions over their possible states. Thanks to its generality and flexibility, the information channel concept can be applied robustly to many different areas of science and technology, and even to the social sciences. In this chapter, we will present examples of its application to selecting the best viewpoints of an object, segmenting an image, and computing the global illumination of a three-dimensional virtual scene. We hope that these examples illustrate how practitioners in different disciplines can use the information channel to organize and understand the interplay of information between the corresponding source and receiver.
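As a concrete illustration of the channel idea, the minimal sketch below computes the mutual information shared by a discrete source and receiver from their joint probability table. It is not taken from the chapter; the toy joint distribution is an assumption made up for illustration.

```python
import numpy as np

def mutual_information(p_xy):
    """Mutual information I(X;Y) in bits for a joint distribution p(x, y).

    p_xy: 2-D array of non-negative entries summing to 1,
    rows indexing source states X, columns indexing receiver states Y.
    """
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal of the source
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal of the receiver
    mask = p_xy > 0                         # treat 0 * log 0 as 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])))

# Toy channel with two source states and two receiver states (hypothetical numbers).
p = np.array([[0.4, 0.1],
              [0.1, 0.4]])
print(mutual_information(p))  # about 0.278 bits of shared information
```

The same routine applies unchanged whether the "source" is a set of viewpoints, image regions, or scene patches; only the joint distribution changes.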


Author(s): Esther Gal-Or

This chapter describes how methodologies developed in the fields of game theory and information theory can assist in understanding the interaction of competitors in markets and, more generally, the study of managerial economics. The chapter highlights, in particular, the role of incomplete information in generating market failures, and provides examples of mechanisms that can alleviate such failures. Topics addressed include first- and second-mover advantages, long-term strategic commitments versus short-term tactical choices made by competitors, the erection of entry barriers to secure market power, choices of product mix, special pricing mechanisms to enhance profitability, and issues related to vertical control and the internal organization of the firm.
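As a stylized illustration of one of the topics above, the sketch below works through the textbook first-mover (Stackelberg) advantage in a linear-demand duopoly and compares it with the simultaneous-move Cournot outcome. The demand and cost parameters a, b, c are hypothetical, and the model is a standard benchmark rather than anything drawn from the chapter itself.

```python
# First-mover advantage in a linear Cournot/Stackelberg duopoly.
# Inverse demand P = a - b*(q1 + q2), constant marginal cost c (hypothetical parameters).

def cournot_profit(a, b, c):
    # Simultaneous moves: each firm produces (a - c) / (3b).
    q = (a - c) / (3 * b)
    price = a - b * 2 * q
    return (price - c) * q

def stackelberg_profits(a, b, c):
    # The leader commits first to (a - c) / (2b); the follower best-responds with (a - c) / (4b).
    q_leader = (a - c) / (2 * b)
    q_follower = (a - c) / (4 * b)
    price = a - b * (q_leader + q_follower)
    return (price - c) * q_leader, (price - c) * q_follower

a, b, c = 100.0, 1.0, 10.0
print("Cournot profit per firm:", cournot_profit(a, b, c))           # (a-c)^2 / (9b) = 900
print("Stackelberg leader, follower:", stackelberg_profits(a, b, c))  # 1012.5 and 506.25
```

The leader's commitment raises its profit above the Cournot level while lowering the follower's, which is the sense in which moving first confers an advantage.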


2005, Vol 62 (9), pp. 3368-3381
Author(s): Timothy DelSole

Abstract This paper presents a framework for quantifying predictability based on the behavior of imperfect forecasts. The critical quantity in this framework is not the forecast distribution, as used in many other predictability studies, but the conditional distribution of the state given the forecasts, called the regression forecast distribution. The average predictability of the regression forecast distribution is given by a quantity called the mutual information. Standard inequalities in information theory show that this quantity is bounded above by the average predictability of the true system and by the average predictability of the forecast system. These bounds clarify the role of potential predictability, about which many incorrect statements can be found in the literature. Mutual information has further attractive properties: it is invariant with respect to nonlinear transformations of the data, cannot be improved by manipulating the forecast, and reduces to familiar measures of correlation skill when the forecast and verification are jointly normally distributed. The concept of potential predictable components is shown to define a lower-dimensional space that captures the full predictability of the regression forecast without loss of generality. The predictability of stationary, Gaussian, Markov systems is examined in detail. Some simple numerical examples suggest that imperfect forecasts are not always useful for jointly normally distributed systems, since greater predictability often can be obtained directly from observations. Rather, the usefulness of imperfect forecasts appears to lie in the fact that they can identify potential predictable components and capture nonstationary and/or nonlinear behavior, which are difficult to capture by low-dimensional, empirical models estimated from short historical records.
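For the jointly normal case mentioned in the abstract, mutual information reduces to a simple function of the forecast-verification correlation, I = -0.5 ln(1 - rho^2). The sketch below checks that standard identity against synthetic data; it is an illustrative assumption-based example, not the paper's own code.

```python
import numpy as np

def gaussian_mutual_information(rho):
    """Mutual information (in nats) between a jointly normal forecast and verification
    with correlation rho: I = -0.5 * ln(1 - rho**2)."""
    return -0.5 * np.log(1.0 - rho**2)

# Sanity check against a sample correlation estimated from synthetic data.
rng = np.random.default_rng(0)
rho = 0.8
n = 200_000
x = rng.standard_normal(n)                                    # "verification"
f = rho * x + np.sqrt(1.0 - rho**2) * rng.standard_normal(n)  # correlated "forecast"
r = np.corrcoef(x, f)[0, 1]
print(gaussian_mutual_information(rho))  # about 0.511 nats
print(gaussian_mutual_information(r))    # close to the value above
```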


2006, Vol 21 (37), pp. 2799-2811
Author(s): Gian Paolo Beretta

A seldom-recognized fundamental difficulty undermines the concept of individual "state" in the present formulations of quantum statistical mechanics (and in its quantum information theory interpretation as well). The difficulty is an unavoidable consequence of an almost forgotten corollary proved by Schrödinger in 1936 and perused by Park, Am. J. Phys. 36, 211 (1968). To resolve it, we must either reject as unsound the concept of state, or else undertake a serious reformulation of quantum theory and the role of statistics. We restate the difficulty and discuss a possible resolution proposed in 1976 by Hatsopoulos and Gyftopoulos, Found. Phys. 6, 15; 127; 439; 561 (1976).


Author(s): Daniela Gîfu, Mirela Teodorescu

Scientists such as Bateson and Watzlawick express the determining role of interaction in the axiom of the "impossibility of not communicating": all behavior (verbal and nonverbal) occurring between persons who are conscious of each other's presence has behavioral effects, whether intended or not. Such effects have interpersonal message value and are thus communicative in nature. Since it is impossible for humans not to behave in one way or another, it follows that in interaction it is impossible not to communicate (Bateson, 1963; Watzlawick et al., 1967). Communication theory is relatively new as a science and interacts with other scientific disciplines. During its development it adopted notions that were already well established and comprehensive, and this article aims to present some of them. Terms such as system, input, output, feedback, and entropy, specific to disciplines such as systems theory, cybernetics, information theory, and physics, are widely used in communication theory.

