Minimax Mutual Information Approach for Independent Component Analysis

2004 · Vol. 16 (6) · pp. 1235-1252
Author(s): Deniz Erdogmus, Kenneth E. Hild, Yadunandana N. Rao, José C. Príncipe

Minimum output mutual information is regarded as a natural criterion for independent component analysis (ICA) and is used as the performance measure in many ICA algorithms. Two common approaches in information-theoretic ICA algorithms are minimum mutual information and maximum output entropy approaches. In the former approach, we substitute some form of probability density function (pdf) estimate into the mutual information expression, and in the latter we incorporate the source pdf assumption in the algorithm through the use of nonlinearities matched to the corresponding cumulative distribution functions (cdf). Alternative solutions to ICA use higher-order cumulant-based optimization criteria, which are related to either one of these approaches through truncated series approximations for densities. In this article, we propose a new ICA algorithm motivated by the maximum entropy principle (for estimating signal distributions). The optimality criterion is the minimum output mutual information, where the estimated pdfs are from the exponential family and are approximate solutions to a constrained entropy maximization problem. This approach yields an upper bound for the actual mutual information of the output signals—hence, the name minimax mutual information ICA algorithm. In addition, we demonstrate that for a specific selection of the constraint functions in the maximum entropy density estimation procedure, the algorithm relates strongly to ICA methods using higher-order cumulants.
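As a rough illustration of the density-estimation step described in the abstract, the sketch below fits a maximum-entropy density from the exponential family to a separated output by matching the sample moments of a small set of constraint functions (here y^2, y^3, y^4, chosen only for illustration; the paper's constraint functions, optimization details, and full minimax ICA update are not reproduced). Names such as max_entropy_fit are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp

def max_entropy_fit(samples, grid=np.linspace(-8.0, 8.0, 2001)):
    """Fit an exponential-family density p(y) = exp(lam . f(y) - A(lam))
    so that its moments of f(y) = (y^2, y^3, y^4) match the sample moments."""
    def feats(y):
        return np.stack([y**2, y**3, y**4], axis=-1)

    target = feats(samples).mean(axis=0)      # empirical moments m_k
    dy = grid[1] - grid[0]
    F = feats(grid)                           # constraint functions on the grid

    def dual(lam):
        # log-partition A(lam) = log integral exp(lam . f(y)) dy, approximated on the grid;
        # the dual objective A(lam) - lam . m is convex in lam
        logA = logsumexp(F @ lam) + np.log(dy)
        return logA - lam @ target

    res = minimize(dual, x0=np.array([-0.5, 0.0, -0.01]), method="Nelder-Mead")
    return res.x, -res.fun                    # multipliers and entropy H = A(lam*) - lam* . m

# Example: entropy estimate for a standardized super-Gaussian output.
rng = np.random.default_rng(0)
y = rng.laplace(size=5000)
y = (y - y.mean()) / y.std()
lam, H = max_entropy_fit(y)
print("multipliers:", lam, "estimated marginal entropy:", H)
```

Because the maximum-entropy density overestimates each marginal entropy, summing such estimates over the outputs of an orthonormal demixing matrix (after whitening) gives an upper bound on the output mutual information, which is the quantity the minimax criterion minimizes.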

2014 · Vol. 6 (12) · pp. 4305-4311
Author(s): Jiguang Li, Jun Gao, Hua Li, Xiaofeng Yang, Yu Liu

The synthesis mechanism of 4-amino-3,5-dimethyl pyrazole was investigated using in-line FT-IR spectroscopy combined with a Fast-ICA algorithm.
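As context for readers unfamiliar with the chemometric use of ICA, the sketch below shows, on synthetic data, how a FastICA decomposition can separate overlapping contributions in a matrix of time-resolved spectra. It is a generic illustration, not the authors' pipeline; the band positions, concentration profiles, and number of components are invented.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic stand-in for time-resolved in-line FT-IR data: three Gaussian
# "bands" play the role of pure-component spectra and three smooth curves
# play the role of concentration profiles over the reaction time.
rng = np.random.default_rng(0)
wavenumbers = np.linspace(600, 1800, 400)
time = np.linspace(0, 1, 120)

def band(center, width):
    return np.exp(-0.5 * ((wavenumbers - center) / width) ** 2)

pure_spectra = np.stack([band(800, 30), band(1150, 40), band(1600, 25)])
profiles = np.stack([np.exp(-3 * time),
                     3 * time * np.exp(-3 * time),
                     1 - np.exp(-3 * time)])
spectra = profiles.T @ pure_spectra \
          + 0.01 * rng.standard_normal((time.size, wavenumbers.size))

# FastICA decomposition: rows are spectra over time, columns are channels.
ica = FastICA(n_components=3, random_state=0)
time_profiles = ica.fit_transform(spectra)   # temporal behaviour of each component
component_spectra = ica.mixing_.T            # corresponding spectral signatures
print(time_profiles.shape, component_spectra.shape)
```

In a real in-line FT-IR study, the rows would be spectra recorded during the reaction, and the recovered profiles and spectral signatures would be inspected for mechanistic information.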


2020
Author(s): Adam Borowicz

Abstract: Independent component analysis (ICA) is a popular technique for demixing multi-channel data. The performance of a typical ICA algorithm depends strongly on many factors, such as the presence of additive noise, the actual distribution of the source signals, and the estimated number of non-Gaussian components. Often a linear mixing model is assumed, and the source signals are extracted by data whitening followed by a sequence of plane (Jacobi) rotations. In this article, we develop a four-unit, symmetric algorithm based on the quaternionic factorization of rotation matrices and the Newton-Raphson iterative scheme. Unlike conventional rotational techniques such as the JADE algorithm, our method exploits 4 × 4 rotation matrices and uses a negentropy approximation as the contrast function. Consequently, the proposed method can be adapted to a given data distribution (e.g., super-Gaussian sources) by selecting an appropriate nonlinear function that approximates the negentropy. Compared to the widely used symmetric FastICA algorithm, the proposed method does not require an orthogonalization step and offers better numerical stability in the presence of multiple Gaussian sources.
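The quaternionic 4 × 4 update and the Newton-Raphson step themselves are not reproduced here, but the sketch below illustrates the conventional pairwise plane-rotation scheme that the article generalizes: after whitening, each pair of channels is rotated by the angle that maximizes a log-cosh negentropy approximation (found here by a crude grid search over the angle). All function names are illustrative.

```python
import numpy as np

# Baseline E[G(nu)] for a standard Gaussian nu, with G(u) = log cosh(u).
GAUSS_BASELINE = np.mean(np.log(np.cosh(
    np.random.default_rng(0).standard_normal(200_000))))

def negentropy_proxy(y):
    # One-unit negentropy approximation J(y) ~ (E[G(y)] - E[G(nu)])^2.
    return (np.mean(np.log(np.cosh(y))) - GAUSS_BASELINE) ** 2

def rotation(theta):
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def jacobi_ica(Z, sweeps=5, angles=np.linspace(0.0, np.pi / 2, 90)):
    """Z: whitened data, shape (n_channels, n_samples)."""
    Y = Z.copy()
    n = Y.shape[0]
    for _ in range(sweeps):
        for i in range(n - 1):
            for j in range(i + 1, n):
                pair = Y[[i, j], :]
                # Pick the plane rotation of this pair that maximizes the contrast.
                best = max(angles, key=lambda th: sum(
                    negentropy_proxy(row) for row in rotation(th) @ pair))
                Y[[i, j], :] = rotation(best) @ pair
    return Y

# Usage: mix two super-Gaussian sources, whiten, then separate by rotations.
rng = np.random.default_rng(1)
S = rng.laplace(size=(2, 5000))
X = np.array([[1.0, 0.6], [0.4, 1.0]]) @ S
Xc = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(np.cov(Xc))
Z = (E / np.sqrt(d)) @ E.T @ Xc               # whitening transform
Y = jacobi_ica(Z)
# Cross-correlations between recovered and true sources; one large entry per row
# (up to permutation and sign) indicates successful separation.
print(np.round(np.abs(np.corrcoef(Y, S))[:2, 2:], 2))
```

The article's method replaces these 2 × 2 rotations and the grid search with 4 × 4 rotations obtained from a quaternionic factorization, updated by a Newton-Raphson scheme.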

