Hidden-layer size reducing for multilayer neural networks using the orthogonal least-squares method

Author(s): Zi-Jiang Yang; Hélène Paugam-Moisy
1995 · Vol 03 (04) · pp. 1177-1191

This article is a survey of recent advances on multilayer neural networks. The first section is a short summary of multilayer neural networks: their history, their architecture, and their learning rule, the well-known back-propagation algorithm. In the following section, several theorems are cited which present one-hidden-layer neural networks as universal approximators. The next section points out that two hidden layers are often required for exactly realizing d-dimensional dichotomies. Defining the frontier between one-hidden-layer and two-hidden-layer networks is still an open problem. Several bounds on the size of a multilayer network which learns from examples are presented, and we emphasize the fact that, even if everything can be done with only one hidden layer, things can often be done better with two or more hidden layers. Finally, this assertion is supported by the behaviour of multilayer neural networks in two applications: prediction of pollution and odor recognition modelling.
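The one-hidden-layer setting the survey discusses can be sketched in a few lines of NumPy: a single hidden layer of tanh units with a linear output, trained by back-propagation on a simple 1-D target. The hidden-layer size, learning rate, and target function below are illustrative assumptions, not values from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

X = np.linspace(-1.0, 1.0, 64).reshape(-1, 1)  # 1-D inputs
Y = np.sin(np.pi * X)                          # target function (illustrative)

H = 16                                         # hidden-layer size (assumed)
W1 = rng.normal(0, 0.5, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.5, (H, 1)); b2 = np.zeros(1)
lr = 0.1

def forward(X):
    A = np.tanh(X @ W1 + b1)   # hidden-layer activations
    return A, A @ W2 + b2      # linear output layer

losses = []
for _ in range(2000):
    A, out = forward(X)
    err = out - Y
    losses.append(float(np.mean(err ** 2)))
    # back-propagation: gradients of the mean-squared error
    g_out = 2 * err / len(X)
    gW2 = A.T @ g_out; gb2 = g_out.sum(0)
    g_hid = (g_out @ W2.T) * (1 - A ** 2)      # tanh derivative
    gW1 = X.T @ g_hid; gb1 = g_hid.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print("initial MSE:", losses[0], "final MSE:", losses[-1])
```

With enough tanh units, such a network can approximate any continuous target on a compact interval, which is the content of the universal-approximation theorems cited in the survey; the sketch simply shows the training loop driving the squared error down.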

