Input Vector Identification and System Model Construction by Average Mutual Information

2000 ◽  
Author(s):  
Paul B. Deignan ◽  
Peter H. Meckl ◽  
Matthew A. Franchek ◽  
Salim A. Jaliwala ◽  
George G. Zhu

Abstract: A methodology for the intelligent, model-independent selection of an appropriate set of input signals for the system identification of an unknown process is demonstrated. In modeling this process, it is shown that the terms of a simple nonlinear polynomial model may also be determined through the analysis of the average mutual information between inputs and the output. Average mutual information can be thought of as a nonlinear correlation coefficient and can be calculated from input/output data alone. The methodology described here is especially applicable to the development of virtual sensors.
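The abstract gives no closed form for average mutual information, but a histogram-based estimate is easy to sketch. The Python fragment below is illustrative only (the bin count, the synthetic signals, and the ranking loop are assumptions, not the authors' implementation); it ranks candidate inputs by their AMI with an output and picks up a purely nonlinear dependence that a linear correlation coefficient would score near zero.

```python
import numpy as np

def average_mutual_information(x, y, bins=16):
    """Histogram estimate of the mutual information I(X;Y) in bits."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()               # joint probability p(x, y)
    px = pxy.sum(axis=1, keepdims=True)     # marginal p(x)
    py = pxy.sum(axis=0, keepdims=True)     # marginal p(y)
    nz = pxy > 0                            # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

# Rank hypothetical candidate input signals by their AMI with the output.
rng = np.random.default_rng(0)
t = rng.uniform(-1, 1, 5000)
candidates = {"relevant": t,
              "noisy": t + 0.5 * rng.normal(size=t.size),
              "irrelevant": rng.uniform(-1, 1, t.size)}
output = t**2 + 0.1 * rng.normal(size=t.size)  # nonlinear dependence on t
scores = {name: average_mutual_information(x, output)
          for name, x in candidates.items()}
for name, s in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(f"{name:11s} AMI = {s:.3f} bits")
```

Because the output depends on t quadratically, the linear correlation between t and the output is near zero, yet the AMI score correctly flags t as the most informative input.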

1997 ◽  
Vol 9 (3) ◽  
pp. 595-606 ◽  
Author(s):  
Marc M. Van Hulle

This article introduces an extremely simple and local learning rule for topographic map formation. The rule, called the maximum entropy learning rule (MER), maximizes the unconditional entropy of the map's output for any type of input distribution. The aim of this article is to show that MER is a viable strategy for building topographic maps that maximize the average mutual information of the output responses to noiseless input signals when only input noise and noise-added input signals are available.
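The abstract does not reproduce MER's update equation, so the sketch below illustrates only the objective it pursues: a discrete output attains maximal unconditional entropy when every unit is active equally often, i.e. when the decision boundaries sit at the quantiles of the input distribution. The Robbins-Monro boundary update, lattice size, and input distribution here are illustrative assumptions, not Van Hulle's rule.

```python
import numpy as np

# Minimal 1-D sketch of entropy-maximizing quantization (not MER verbatim).
rng = np.random.default_rng(1)
N = 8                                   # number of output units (assumed)
b = np.linspace(-1.0, 1.0, N - 1)       # decision boundaries between units
eta = 0.01                              # learning rate (assumed)
targets = np.arange(1, N) / N           # desired P(x <= b_k) = k/N

for _ in range(200_000):
    x = rng.normal()                    # any stationary input distribution
    # Robbins-Monro step: at equilibrium E[x > b_k] = 1 - k/N, so b_k is the
    # k/N quantile and every unit is active with probability 1/N.
    b += eta * ((x > b) - (1.0 - targets))
    b.sort()                            # keep the boundaries ordered

samples = rng.normal(size=100_000)
p = np.bincount(np.searchsorted(b, samples), minlength=N) / samples.size
print("unit activation probabilities:", np.round(p, 3))
print(f"output entropy {-(p * np.log2(p)).sum():.3f} vs. maximum {np.log2(N):.3f} bits")
```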


Author(s):  
F Heister ◽  
M Froehlich

In recent years, after a period of disillusionment in the field of neural processing and adaptive algorithms, neural networks have been reconsidered for solving complex technical tasks. A central problem in neural network training is the presentation of input/output data whose information content adequately represents the given problem. Training a neural structure will lead to poor results if the relation between input and output signals shows no functional dependence but purely stochastic behaviour. This paper is concerned with the identification of the most relevant input-output data pairs for neural networks, using the concept of mutual information. A general, quantitative method is demonstrated for identifying the most relevant points from the transient measured data of a combustion engine. In this context, mutual information is employed to determine the 50 per cent energy conversion point solely from the combustion chamber pressure during one combustion cycle.
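As a rough illustration of the selection step (reusing the average_mutual_information helper sketched earlier), the fragment below scores synthetic stand-ins for crank-angle-resolved pressure samples by their mutual information with a target quantity. The data shapes, the informative index range, and the target construction are all invented for the example and are not the paper's measurements.

```python
import numpy as np

rng = np.random.default_rng(2)
n_cycles, n_angles = 4000, 60
# Hypothetical stand-in for measured pressure traces: one row per combustion
# cycle, one column per crank-angle sample.
pressure = rng.normal(size=(n_cycles, n_angles))
# Hypothetical target (e.g. the 50% energy conversion point): here it depends
# nonlinearly on the samples around index 30, and on nothing else.
target = np.tanh(pressure[:, 28:33].sum(axis=1)) + 0.05 * rng.normal(size=n_cycles)

# Score every crank-angle sample by its mutual information with the target
# (assumes the average_mutual_information helper from the earlier sketch)
# and keep the most informative ones as network inputs.
scores = [average_mutual_information(pressure[:, k], target)
          for k in range(n_angles)]
best = np.argsort(scores)[::-1][:5]
print("most relevant crank-angle indices:", sorted(best.tolist()))
```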


Author(s):  
Zheng Liu ◽  
Jorge Angeles

Abstract: Rank deficiencies and ill-conditioning of the synthesis matrix in the optimization of function-generating linkages are often caused by an improper selection of the input-output data points, given by the pairs {ψᵢ, ϕᵢ}, i = 1, …, m, where ψ and ϕ denote the input and output values, respectively. Ten basic cases of rank deficiency in the synthesis matrix are discussed in this paper, the associated curves, termed singularity curves, being plotted in the ψ-ϕ plane. Measures to remedy the ill-conditioning that arises in the optimization procedure, and means to find the best-conditioned synthesis matrices by minimizing their condition number, are also proposed.
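The paper's ten singularity cases are specific to its synthesis matrix, but the conditioning issue itself is easy to reproduce. The sketch below assumes a standard Freudenstein-type formulation for a four-bar function generator (a common but here assumed choice, not necessarily the authors' exact matrix) and compares the condition number of the synthesis matrix for well-spread versus clustered input-output data points.

```python
import numpy as np

def synthesis_matrix(psi, phi):
    """Freudenstein-type synthesis matrix: row i is [cos(phi_i), -cos(psi_i), 1],
    from K1*cos(phi) - K2*cos(psi) + K3 = cos(psi - phi)."""
    return np.column_stack([np.cos(phi), -np.cos(psi), np.ones_like(psi)])

# Well-spread input/output data points (angles in degrees, illustrative).
psi_good = np.radians([20.0, 45.0, 70.0, 95.0, 120.0])
phi_good = np.radians([30.0, 50.0, 80.0, 100.0, 115.0])
# Degenerate choice: nearly repeated pairs lead to near rank deficiency.
psi_bad = np.radians([40.0, 40.1, 40.2, 40.3, 40.4])
phi_bad = np.radians([60.0, 60.1, 60.2, 60.3, 60.4])

for label, psi, phi in [("spread", psi_good, phi_good),
                        ("clustered", psi_bad, phi_bad)]:
    A = synthesis_matrix(psi, phi)
    print(f"{label:9s} cond(A) = {np.linalg.cond(A):.3e}")
```

Clustered data points make the rows of the synthesis matrix nearly identical, which is exactly the kind of ill-conditioning that minimizing the condition number over the choice of data points is meant to avoid.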


Author(s):  
Nguyen N. Tran ◽  
Ha X. Nguyen

A capacity analysis for generally correlated wireless multi-hop multi-input multi-output (MIMO) channels is presented in this paper. The channel at each hop is spatially correlated, the source symbols are mutually correlated, and the additive Gaussian noises are colored. First, by invoking the Karush-Kuhn-Tucker conditions for the optimality of convex programming, we derive the optimal source symbol covariance for the maximum mutual information between the channel input and the channel output when full channel knowledge is available at the transmitter. Second, we formulate the average mutual information maximization problem when only the channel statistics are available at the transmitter. Since this problem is almost impossible to solve analytically, a numerical interior-point method is employed to obtain the optimal solution. Furthermore, to reduce the computational complexity, an asymptotic closed-form solution is derived by maximizing an upper bound of the objective function. Simulation results show that the average mutual information obtained by the asymptotic design is very close to that obtained by the optimal design, while incurring far lower computational complexity.
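For the full-channel-knowledge case, the KKT conditions reduce, after whitening the colored noise, to classical water-filling over the channel eigenmodes. The sketch below collapses the multi-hop channel to a single hop and invents the dimensions, power budget, and noise covariance; it illustrates the water-filling step, not the paper's multi-hop design.

```python
import numpy as np

def waterfill_capacity(H, Rn, P):
    """Optimal input covariance via water-filling (full CSI, from the KKT
    conditions): whiten the colored noise, then pour power over eigenmodes."""
    L = np.linalg.cholesky(Rn)                # Rn = L L^T
    Hw = np.linalg.solve(L, H)                # noise-whitened channel
    U, s, Vh = np.linalg.svd(Hw, full_matrices=False)
    g = s**2                                  # eigenmode power gains
    # Bisection on the water level mu so the allocated power sums to P.
    lo, hi = 0.0, P + (1.0 / g).max()
    for _ in range(100):
        mu = 0.5 * (lo + hi)
        p = np.maximum(mu - 1.0 / g, 0.0)
        lo, hi = (mu, hi) if p.sum() < P else (lo, mu)
    p = np.maximum(mu - 1.0 / g, 0.0)
    Q = (Vh.conj().T * p) @ Vh                # optimal source covariance
    rate = np.sum(np.log2(1.0 + g * p))      # achieved mutual information
    return Q, rate

rng = np.random.default_rng(3)
H = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
A = rng.normal(size=(4, 4))
Rn = A @ A.T + 4 * np.eye(4)                  # colored (non-white) noise
Q, rate = waterfill_capacity(H, Rn, P=10.0)
print(f"mutual information: {rate:.2f} bits/channel use, tr(Q) = {np.trace(Q).real:.2f}")
```

Eigenmodes with a gain below the water level receive zero power, which is the closed-form structure the KKT analysis yields; the statistics-only case in the paper has no such closed form, hence the interior-point and asymptotic designs.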

