input-output unit

Author(s): Martin H. Weik
1995, Vol 06 (03), pp. 225-231
Author(s): Marcelo Blatt, Eytan Domany, Ido Kanter

We consider two-layered perceptrons consisting of N binary input units, K binary hidden units, and one binary output unit, in the limit N ≫ K ≥ 1. We prove that the weights of a regular irreducible network are uniquely determined by its input-output map, up to some obvious global symmetries. A network is regular if its K weight vectors from the input layer to the K hidden units are linearly independent. A (single-layered) perceptron is said to be irreducible if its output depends on every one of its input units, and a two-layered perceptron is irreducible if the K+1 perceptrons that constitute such a network are irreducible. By global symmetries we mean, for instance, permuting the labels of the hidden units. Hence, two irreducible regular two-layered perceptrons that implement the same Boolean function must have the same number of hidden units and must be composed of equivalent perceptrons.
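The definitions above translate directly into a rank check and a symmetry test. Below is a minimal NumPy sketch, not taken from the paper: the sizes, random weights, and all names are illustrative assumptions. It builds a two-layered perceptron with ±1 units, verifies regularity by checking that the K input-to-hidden weight vectors are linearly independent, and demonstrates one global symmetry: relabeling the hidden units (together with the matching output weights) leaves the input-output map unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)
N, K = 8, 3  # illustrative sizes; the theorem concerns the limit N >> K >= 1

# Hypothetical weights: hidden unit k computes sign(w_k . x);
# the output unit combines the K binary hidden activations.
W = rng.standard_normal((K, N))  # rows are the weight vectors w_1, ..., w_K
v = rng.standard_normal(K)       # hidden-to-output weights

def io_map(x, W, v):
    """Binary output of the two-layered perceptron on a +/-1 input x."""
    h = np.sign(W @ x)           # K binary hidden units
    return np.sign(v @ h)        # one binary output unit

# Regularity: the K weight vectors must be linearly independent.
assert np.linalg.matrix_rank(W) == K, "network is not regular"

# Global symmetry: permuting the hidden-unit labels (and the matching
# output weights) implements the same Boolean function.
perm = rng.permutation(K)
W_perm, v_perm = W[perm], v[perm]

inputs = np.sign(rng.standard_normal((1000, N)))  # random +/-1 inputs
assert all(io_map(x, W, v) == io_map(x, W_perm, v_perm) for x in inputs)
```

A second symmetry of the same kind, flipping the sign of a weight vector w_k together with v_k, also leaves the map invariant because sign is an odd function; the theorem says such obvious relabelings exhaust the freedom a regular irreducible network has.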


1953, Vol 41 (10), pp. 1483-1486
Author(s): P. Vance, D. Haas

1968
Author(s): Gordon L. Wilson, A.H. McMorris, E.G. Rogers

1970, Vol 15 (2), pp. 115, 118
Author(s): William E. Coleman
