2018 ◽  
Vol 12 (2) ◽  
pp. 1492-1498 ◽  
Author(s):  
Qi Guo ◽  
Bo-Wei Chen ◽  
Seungmin Rho ◽  
Wen Ji ◽  
Feng Jiang ◽  
...  

Author(s):  
Francisco Torrens ◽  
Gloria Castellano

Numerous definitions of complexity have been proposed, with little consensus. The definition used here is related to Kolmogorov complexity and to Shannon entropy measures. The price, however, is that context dependence enters the definition of complexity. Such context dependence is an inherent property of complexity, yet scientists are uncomfortable with it because it smacks of subjectivity, which is why so little agreement exists on the meaning of the term. In an article published in Molecules, Lin presented a novel approach to assessing molecular diversity based on Shannon information theory: a set of compounds is viewed as a static collection of microstates that can register information about their environment. The method shows a strong tendency to oversample remote areas of feature space and to produce unbalanced designs. This chapter demonstrates this limitation with simple examples and provides a rationale for the method's failure to produce consistent results.
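Lin's diversity measure itself is not reproduced in this abstract; as a minimal sketch of the underlying idea, the Shannon entropy of a discrete distribution of states (here, hypothetical compound "microstate" labels) quantifies how evenly a set covers its states, and is maximal for a uniform distribution:

```python
import math
from collections import Counter

def shannon_entropy(labels):
    """Shannon entropy (in bits) of the empirical distribution of labels."""
    counts = Counter(labels)
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A diverse set (uniform over 4 states) vs. a redundant set.
diverse = ["A", "B", "C", "D"]
redundant = ["A", "A", "A", "B"]
print(shannon_entropy(diverse))    # → 2.0 bits, maximal for 4 states
print(shannon_entropy(redundant))  # → ~0.811 bits
```

Maximizing such an entropy-based score over candidate subsets rewards spreading picks across rarely occupied states, which is one intuition for why remote regions of feature space tend to be oversampled.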


2004 ◽  
Vol 01 (01) ◽  
pp. 1-22 ◽  
Author(s):  
HAROLD SZU

The geometric topology of a point per event, written in the higher-dimensional μ-space of data (e.g. the 6 W's: who, where, when, what, how, and why), can help in the design of information acquisition (IA) systems. The measurement intensity of each W's sensor, or the number of words used to describe a specific W's attribute, represents the length of each vector dimension. Then, N concurrent reports of the same event become a distribution set of N points scattered over μ-space. To discover the statistically independent components, an unsupervised, unbiased Artificial Neural Network (ANN) methodology called Independent Component Analysis (ICA) can be used to reveal a new subspace called the feature space. The major and minor axes of this subspace correspond to highly precise and efficient combinations of the old attributes (e.g. 2-D feature domains consisting of "where-who-when" and "what-how-why" could be good choices for Internet search indices). Thus, one realizes that communicating an event is not just a matter of the where-address; the who and when are equally important attributes. In principle, the number of new sensors can be reduced (e.g. from 6 W's to 2 features), provided that they are physically realizable. In the combined 6N-dimensional Γ-space, one point can represent all N concurrent measurements, and the flow of these points generates the event behavior in time. The time flow over the reduced 2N-dimensional feature space generates invariant features called knowledge. For surveillance against terrorists, legacy electrical power line communication (PLC) will offer a useful relay for the last mile of mobile communications for a Surveillance Sensor Web (SSW) employing ANN: there is no need for "where" addressing for switching, because of smart coding and decoding of "who-when." After reviewing Auto-Regression (AR), we generalize AR to a supervised ANN implementation of Principal Component Analysis (PCA) (Appendix A), and then to unsupervised ANN learning for ICA (Appendix B).
This is possible non-statistically because the classical closed information theory (CIT), based on the maximum Shannon entropy S of a closed system, must be generalized to an open brain information theory (BIT) having non-zero energy exchange E at the minimum Helmholtz free energy H = E - T₀S at isothermal equilibrium (T₀ = 37 °C). For such an open BIT system, we prove the Lyapunov convergence theorem. We compute the ICA features of image textures in order to measure the ICA classifier's information content.
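Szu's ANN implementations of PCA and ICA are given in the paper's appendices and are not reproduced here; as a minimal non-ANN sketch, assuming synthetic 6-D attribute vectors (hypothetical who/where/when/what/how/why intensities) driven by two latent factors, ordinary PCA via eigendecomposition of the covariance matrix recovers a reduced 2-D feature space:

```python
import numpy as np

rng = np.random.default_rng(0)

# N concurrent "reports" of one event: points in a 6-D mu-space whose
# coordinates are hypothetical who/where/when/what/how/why intensities,
# generated from 2 latent factors plus small sensor noise.
N = 200
latent = rng.normal(size=(N, 2))        # two underlying factors
mixing = rng.normal(size=(2, 6))        # factor loadings on the 6 W's
X = latent @ mixing + 0.05 * rng.normal(size=(N, 6))

# PCA by eigendecomposition of the sample covariance matrix.
Xc = X - X.mean(axis=0)
cov = Xc.T @ Xc / (N - 1)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# The two leading axes span the reduced "feature space".
explained = eigvals[:2].sum() / eigvals.sum()
features = Xc @ eigvecs[:, :2]          # N points in the 2-D feature space
print(f"variance explained by 2 features: {explained:.3f}")
```

PCA yields only decorrelated (orthogonal) axes; obtaining statistically independent components, as ICA does, requires an additional criterion such as non-Gaussianity, which is where the unsupervised ANN learning of Appendix B goes beyond this sketch.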

