Fully Coupled and Feedforward Neural Networks with Complex-Valued Neurons

Author(s): Jacek M. Zurada ◽ Igor Aizenberg

2017 ◽ Vol 47 (3) ◽ pp. 1271-1284
Author(s): Rongrong Wu ◽ He Huang ◽ Xusheng Qian ◽ Tingwen Huang

1995 ◽ Vol 06 (04) ◽ pp. 435-446
Author(s): P. Arena ◽ L. Fortuna ◽ R. Re ◽ M.G. Xibilia

In this paper the approximation capabilities of different structures of complex feedforward neural networks reported in the literature are theoretically analyzed. In particular, a new density theorem for Complex Multilayer Perceptrons with complex-valued non-analytic sigmoidal activation functions is proven. This result makes Multilayer Perceptrons with complex-valued neurons universal interpolators of continuous complex-valued functions. Moreover, the approximation properties of superpositions of analytic activation functions are investigated, and it is proven that such combinations are not dense in the set of continuous complex-valued functions. Several numerical examples are also reported to show the advantages of Complex Multilayer Perceptrons over the classical real-valued MLP in terms of computational complexity.
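The distinction the abstract draws hinges on the activation function: a bounded activation that is analytic on all of ℂ would be constant by Liouville's theorem, so complex MLPs typically use non-analytic activations instead. As a minimal sketch (not the paper's exact construction), one common non-analytic choice is the "split" sigmoid, which applies the real logistic function to the real and imaginary parts separately; the NumPy toy network below is an assumption-laden illustration, not the authors' implementation:

```python
import numpy as np

def split_sigmoid(z):
    # Non-analytic "split" sigmoid: the real logistic function is applied
    # to the real and imaginary parts separately, so the output lies in
    # the open unit square (0,1) + (0,1)i. This is one common choice of
    # complex-valued non-analytic activation (an assumption here).
    sig = lambda x: 1.0 / (1.0 + np.exp(-x))
    return sig(z.real) + 1j * sig(z.imag)

def complex_layer(z, W, b):
    # One fully connected layer with complex-valued weights and biases.
    return split_sigmoid(W @ z + b)

# Toy two-layer complex MLP applied to a complex input vector.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((4, 2)) + 1j * rng.standard_normal((4, 2))
b1 = np.zeros(4, dtype=complex)
W2 = rng.standard_normal((1, 4)) + 1j * rng.standard_normal((1, 4))
b2 = np.zeros(1, dtype=complex)

z = np.array([0.3 + 0.5j, -0.2 + 0.1j])
out = complex_layer(complex_layer(z, W1, b1), W2, b2)
```

Because the split sigmoid is bounded and non-analytic, it avoids the Liouville obstruction while remaining a genuinely complex map of both real and imaginary components.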

