Safety in aviation - critical situations in the aspect of the theory of subjective analysis

2018, Vol. 19 (12), pp. 242-245
Author(s): Krzysztof Szafran

The article discusses one of the hypotheses of the theory of decision-making in emergency situations. The methodology developed in recent years for estimating the probability of a dangerous situation by means of subjective analysis, and in particular the entropy function, offers a new view of the agent as the decision-maker responsible for safety within the structure of an active system. The publication signals issues that lie on the border between several areas of knowledge. By analogy with information theory, in which the mathematician and engineer Claude E. Shannon introduced the probabilistic concept of entropy, an entropy can also be defined within the theory of choice, which is directly related to decision-making in critical situations. In some of the works cited in the paper this function is called "subjective entropy"; it is closely related to the concept of the value of information. It is shown how applying the maximum entropy principle to the state of a dynamic system can serve as an effective research instrument for determining the safety margin.
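As a rough illustration of the idea (not the article's own model): in subjective analysis the decision-maker's preferences over alternatives can be treated as a normalized distribution whose Shannon-style entropy measures indecision, and a maximum-entropy (canonical) preference distribution follows from fixing the expected utility of the choice. The utilities, the parameter beta, and the exponential form below are illustrative assumptions, not quantities taken from the article.

```python
import numpy as np

def subjective_entropy(preferences):
    """Shannon-style entropy of a normalized preference distribution.
    High entropy = the decision-maker has no clear preference."""
    p = np.asarray(preferences, dtype=float)
    p = p / p.sum()          # normalize, just in case
    p = p[p > 0]             # treat 0 * log(0) as 0
    return -np.sum(p * np.log(p))

def canonical_preferences(utilities, beta):
    """Maximum-entropy (canonical) preference distribution over alternatives,
    subject to a fixed expected utility; beta weights the utilities
    (illustrative assumption)."""
    u = np.asarray(utilities, dtype=float)
    w = np.exp(beta * u)
    return w / w.sum()

# Three emergency-response alternatives with hypothetical utilities
utilities = [1.0, 0.4, 0.1]

for beta in (0.0, 2.0, 10.0):
    pi = canonical_preferences(utilities, beta)
    print(f"beta={beta:5.1f}  preferences={np.round(pi, 3)}  "
          f"H_pi={subjective_entropy(pi):.3f}")

# beta = 0 gives the uniform (maximum-entropy) distribution: total indecision;
# large beta concentrates preference on one alternative and entropy drops,
# which can serve as an indicator of how much decision reserve remains.
```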

1990, Vol. 27 (2), pp. 303-313
Author(s): Claudine Robert

The maximum entropy principle is used to model uncertainty by a maximum entropy distribution subject to appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large-deviation techniques) that provides a mathematical justification of this statistical modelling principle. We then indicate how the principle can be used in artificial intelligence, and how relevant prior knowledge is supplied by some classical descriptive statistical methods. It furthermore appears that the maximum entropy principle yields a natural link between descriptive methods and certain statistical structures.
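A minimal sketch of the modelling principle summarized above, using Jaynes' classic die example rather than anything from the paper: with a single linear (mean) constraint, the maximum entropy distribution has exponential form, so only one Lagrange multiplier has to be solved for. The target mean of 4.5 and the use of SciPy are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import brentq

# Maximum-entropy distribution on {1, ..., 6} subject to one linear constraint:
# the mean must equal a prescribed value (Jaynes' "loaded die" example).
# The maximizer has the exponential form p_i ∝ exp(lam * x_i), so we only need
# the multiplier lam that satisfies the constraint.

x = np.arange(1, 7)
target_mean = 4.5

def mean_given_lam(lam):
    w = np.exp(lam * x)
    p = w / w.sum()
    return p @ x

lam = brentq(lambda l: mean_given_lam(l) - target_mean, -10.0, 10.0)
p = np.exp(lam * x)
p /= p.sum()

print("multiplier lambda:", round(lam, 4))
print("max-entropy pmf:  ", np.round(p, 4))
print("entropy (nats):   ", round(float(-(p @ np.log(p))), 4))
```

With no constraint beyond normalization the same principle returns the uniform distribution; each additional linear constraint adds one multiplier to solve for.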


Author(s): Kai Yao, Jinwu Gao, Wei Dai

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory it has so far been quantified by logarithmic entropy; however, logarithmic entropy sometimes fails to measure this uncertainty. This paper proposes another type of entropy, named sine entropy, as a supplement, and explores its properties. The maximum entropy principle is then introduced, and arc-cosine distributed variables are proved to have the maximum sine entropy among variables with a given expected value and variance.
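For orientation only (the definitions follow the standard forms used in uncertainty theory; the specific linear uncertain variable and the use of SciPy are illustrative assumptions, not examples from the paper): sine entropy replaces the logarithmic integrand with sin(pi * Phi(x)), and both functionals can be evaluated numerically from the uncertainty distribution Phi.

```python
import numpy as np
from scipy.integrate import quad

# Two entropy functionals for an uncertain variable with uncertainty
# distribution Phi:
#   logarithmic entropy  H = ∫ [-Phi ln Phi - (1 - Phi) ln(1 - Phi)] dx
#   sine entropy         S = ∫ sin(pi * Phi(x)) dx
# The linear uncertain variable below is just an illustrative choice.

a, b = 0.0, 2.0                       # linear uncertain variable L(a, b)
Phi = lambda x: (x - a) / (b - a)     # its uncertainty distribution on [a, b]

def log_integrand(x):
    t = Phi(x)
    if t <= 0.0 or t >= 1.0:
        return 0.0
    return -t * np.log(t) - (1 - t) * np.log(1 - t)

def sine_integrand(x):
    return np.sin(np.pi * Phi(x))

H, _ = quad(log_integrand, a, b)      # expected: (b - a) / 2      = 1.0
S, _ = quad(sine_integrand, a, b)     # expected: 2 * (b - a) / pi ≈ 1.273

print(f"logarithmic entropy: {H:.4f}")
print(f"sine entropy:        {S:.4f}")
```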

