AVERAGE CASE ANALYSIS OF A HEBB-TYPE RULE THAT FINDS THE NETWORK CONNECTIVITY

1994 ◽ Vol 05 (02) ◽ pp. 115-122
Author(s): Mostefa Golea

We describe a Hebb-type algorithm for learning unions of nonoverlapping perceptrons with binary weights. Two perceptrons are said to be nonoverlapping if they do not share any input variables. The learning algorithm finds both the network architecture and the weight values needed to represent the target function. Moreover, the algorithm is local, homogeneous, and simple enough to be biologically plausible. We investigate the average behavior of this algorithm as a function of the size of the training set. We find that, as the training set grows, the hypothesis network built by the algorithm “converges” to the target network, both in the number of perceptrons and in the connectivity. Moreover, the generalization rate converges exponentially to perfect generalization as a function of the number of training examples. The analytic expressions are in excellent agreement with the numerical simulations. To our knowledge, this is the first average-case analysis of an algorithm that finds both the weight values and the network connectivity.
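The abstract does not reproduce the learning rule itself. The following is a minimal, hypothetical sketch of how a local, Hebb-type correlation rule can recover both the connectivity (which inputs are relevant) and the binary weight signs of a union of nonoverlapping perceptrons from labeled examples. The toy target (an OR of two perceptrons on disjoint input subsets), the sample size, and the detection threshold are illustrative assumptions, not details taken from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical toy target: a union (logical OR) of two nonoverlapping
    # perceptrons, each with binary +/-1 weights on its own disjoint input subset.
    n = 20                                            # total number of +/-1 inputs (assumed)
    groups = [list(range(0, 5)), list(range(5, 10))]  # disjoint supports of the two perceptrons
    w_true = np.zeros(n)
    for g in groups:
        w_true[g] = rng.choice([-1.0, 1.0], size=len(g))

    def target(x):
        """+1 if any of the nonoverlapping perceptrons fires, else -1."""
        return 1.0 if any((x[g] @ w_true[g]) > 0 for g in groups) else -1.0

    # Random +/-1 training examples labeled by the target network.
    m = 5000
    X = rng.choice([-1.0, 1.0], size=(m, n))
    y = np.array([target(x) for x in X])

    # Hebb-type statistic: each input locally accumulates x_i * y over the training set.
    c = X.T @ y / m

    # Inputs whose accumulated correlation clearly exceeds the noise level of an
    # irrelevant input (std ~ 1/sqrt(m)) are declared connected; the sign of the
    # correlation is taken as the binary weight estimate.
    threshold = 3.0 / np.sqrt(m)
    connected = np.abs(c) > threshold
    w_hat = np.where(connected, np.sign(c), 0.0)

    print("true support:     ", np.flatnonzero(w_true != 0.0))
    print("recovered support:", np.flatnonzero(connected))
    print("weights recovered on true support:",
          bool(np.all(w_hat[w_true != 0.0] == w_true[w_true != 0.0])))

In this sketch each input unit only accumulates the product of its own activity with the output label, which is what makes such a rule local and homogeneous in the sense described above: irrelevant inputs average to a correlation near zero, while connected inputs retain a correlation whose sign matches their weight. This is only an illustration of the general idea, not the algorithm analyzed in the paper.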

