Information: Problems, Paradoxes, and Solutions

Author(s):  
Mark Burgin

The information age is upon us, and the main paradox is that there is no satisfactory, commonly accepted answer to the crucial question of what information is. This results in a multitude of contradictions, misconceptions, and paradoxes related to the world of information. In the first part of this paper, we consider the existing situation in information studies, which is highly paradoxical and inconsistent. To remedy the situation, a new approach in information theory, called the general theory of information, is developed. The main achievement of the general theory of information is the explication of a relevant and adequate definition of information. The theory is built on an axiomatic base as a system of two classes of principles and their consequences. The first class consists of the ontological principles, which reveal general properties and regularities of information and its functioning. Principles from the second class explain how to measure information.


Information, 2020, Vol. 11 (9), pp. 406
Author(s):  
Mark Burgin, Jaime F. Cárdenas-García

The goal of this paper is to present two approaches to the phenomenon of information, explicating its nature and essence. In this context, Mark Burgin demonstrates how the general theory of information (GTI) describes and elucidates the phenomenon of information by explaining the axiomatic foundations for information studies and presenting the accompanying mathematical theory of information. The perspective promoted by Jaime F. Cárdenas-García is based on Gregory Bateson’s description of information as “difference which makes a difference” and involves the process of info-autopoiesis as a sensory-commensurable, self-referential feedback process.


Author(s):  
Tian-qing Qiao

It has been 60 years since the essential attributes of information were first explored in the field of philosophy, resulting in many contentious schools of thought and a wide division of opinions. Some scholars in China and abroad have been trying to build a new system of information philosophy from an ontological perspective, so as to explain the world. In this paper, the author puts forward a definition of information and its mathematical expressions in order to demonstrate that information is the collection of three kinds of attributes of things. Analysis suggests that the essence of information is the interaction of matter and the representation of the law of causality in philosophy. The paper also explores the ways in which information, as a noun, is a term that people have customarily used and confused. Ultimately, the induction, differentiation, and utilization of information, as conventionally understood, should be applied to the study of matters themselves.


Author(s):  
Mark Burgin

The general theory of information is a synthetic approach that organizes and encompasses all main directions in information theory. It is developed on three levels: conceptual, methodological, and theoretical. On the conceptual level, the concept of information is purified and information operations are separated and described. On the methodological level, the theory is formulated as a system of principles explaining what information is and how to measure it. On the theoretical level, mathematical models of information are constructed and studied. The goal of this paper is to clarify the concept of information and discuss its mathematical models, establishing relations with physics as the most developed science.


Author(s):  
Fred Dretske

The mathematical theory of information (also called communication theory) defines a quantity called mutual information that exists between a source, s, and receiver, r. Mutual information is a statistical construct, a quantity defined in terms of conditional probabilities between the events occurring at r and s. If what happens at r depends on what happens at s to some degree, then there is a communication ‘channel’ between r and s, and mutual information at r about s. If, on the other hand, the events at two points are statistically independent, there is zero mutual information. Philosophers and psychologists are attracted to information theory because of its potential as a useful tool in describing an organism’s cognitive relations to the world. The attractions are especially great for those who seek a naturalistic account of knowledge, an account that avoids normative – and, therefore, scientifically unusable – ideas such as rational warrant, sufficient reason and adequate justification. According to this approach, philosophically problematic notions like evidence, knowledge, recognition and perception – perhaps even meaning – can be understood in communication terms. Perceptual knowledge, for instance, might best be rendered in terms of a brain (r) receiving mutual information about a worldly source (s) via sensory channels. When incoming signals carry appropriate information, suitably equipped brains ‘decode’ these signals, extract information and thereby come to know what is happening in the outside world. Perception becomes information-produced belief.
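The quantities described above can be made concrete. As a small illustration (not part of Dretske’s text), mutual information between a source s and receiver r can be computed directly from a joint probability table; the sketch below assumes finite alphabets and probabilities given as a nested list:

```python
import math

def mutual_information(joint):
    """Mutual information I(S;R) in bits, given a joint probability
    table joint[i][j] = P(s_i, r_j) over finite source/receiver alphabets."""
    # Marginal distributions of source (row sums) and receiver (column sums).
    p_s = [sum(row) for row in joint]
    p_r = [sum(col) for col in zip(*joint)]
    mi = 0.0
    for i, row in enumerate(joint):
        for j, p_sr in enumerate(row):
            if p_sr > 0:  # terms with zero probability contribute nothing
                mi += p_sr * math.log2(p_sr / (p_s[i] * p_r[j]))
    return mi

# A noiseless channel: r always equals s, so r carries 1 bit about s.
print(mutual_information([[0.5, 0.0], [0.0, 0.5]]))   # 1.0
# Statistically independent events: zero mutual information.
print(mutual_information([[0.25, 0.25], [0.25, 0.25]]))  # 0.0
```

The two test cases mirror the abstract’s two situations: when what happens at r depends completely on what happens at s, mutual information is maximal; when the events are statistically independent, it is zero.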


Author(s):  
Zeynep Altan

The NATO conference held in Garmisch in 1968 addressed the future of the computer and software world, and the ideas discussed there have been progressively realized from that time to the present day. This chapter also examines the development of software systems since 1968 in light of technological developments. The contributions of mathematics and physics to the development of information systems are explained in chronological order by comparing the possibilities of yesterday and today. The complementary contributions of science and technology are evaluated across the evolutionary and revolutionary developments ranging from the definition of information theory in 1948 to teleportation. It can clearly be seen that discrete mathematics directly affects the improvements in computer science. This review clearly shows that it would not be possible to talk about digital transformation and quantum computation had the discoveries of Shannon, Turing, and von Neumann, and the studies of other scientists before them, not existed.

