Mutual Information Minimization for Under-Determined Blind Source Separation

Author(s): Fuxiang Wang ◽ Jun Zhang

2001 ◽ Vol 8 (6) ◽ pp. 174-176 ◽ Author(s): K.E. Hild ◽ D. Erdogmus ◽ J. Principe

2005 ◽ Vol 17 (2) ◽ pp. 425-452 ◽ Author(s): Kun Zhang ◽ Lai-Wan Chan

The linear mixture model has been investigated in most articles tackling the problem of blind source separation. Recently, several articles have addressed a more complex model: blind source separation (BSS) of post-nonlinear (PNL) mixtures. These mixtures are assumed to be generated by applying an unknown invertible nonlinear distortion to linear instantaneous mixtures of some independent sources. The gaussianization technique for BSS of PNL mixtures emerged from the assumption that the distribution of a linear mixture of independent sources is gaussian. In this letter, we review the gaussianization method and then extend it to PNL mixtures in which the linear mixture is close to gaussian. Our proposed method approximates the linear mixture using the Cornish-Fisher expansion. We choose mutual information as the independence measure and develop a learning algorithm to separate PNL mixtures. This method provides better applicability and accuracy. We then discuss a sufficient condition for the method to be valid. The characteristics of the nonlinearity do not affect the performance of this method. With only a few parameters to tune, our algorithm has a comparatively low computational cost. Finally, we present experiments to illustrate the efficiency of our method.
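The gaussianization idea in the abstract can be sketched in a few lines: if each observed channel is a gaussian (or near-gaussian) linear mixture passed through an unknown monotonic nonlinearity, mapping the observation through its empirical CDF and then the inverse gaussian CDF approximately undoes that nonlinearity. The sketch below is a minimal illustration under these assumptions, not the authors' algorithm; the tanh distortion and sample size are chosen only for the demo.

```python
import numpy as np
from scipy.stats import norm

def gaussianize(x):
    """Map a 1-D signal to a standard normal via its empirical CDF.

    If x = f(z) for an unknown monotonic f and near-gaussian z, the
    result approximates z up to an affine transformation, since a
    monotonic f preserves the rank order of the samples.
    """
    n = len(x)
    ranks = np.argsort(np.argsort(x))   # rank 0..n-1 of each sample
    u = (ranks + 0.5) / n               # empirical CDF values in (0, 1)
    return norm.ppf(u)                  # inverse gaussian CDF

# Toy demo: a gaussian "linear mixture" z observed through tanh.
rng = np.random.default_rng(0)
z = rng.standard_normal(5000)           # near-gaussian linear mixture
x = np.tanh(z)                          # unknown invertible nonlinearity
z_hat = gaussianize(x)
# z_hat should track z closely (correlation near 1)
print(np.corrcoef(z, z_hat)[0, 1])
```

After this per-channel step, the remaining problem is (approximately) linear BSS, which is where an independence measure such as mutual information comes in.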

