Fully-Programmable Analogue VLSI Devices for the Implementation of Neural Networks

Author(s):  
Alan Murray ◽  
Anthony Smith ◽  
Lionel Tarassenko
1993 ◽  
Vol 04 (04) ◽  
pp. 419-426 ◽  
Author(s):  
Lionel Tarassenko ◽  
Jon Tombs ◽  
Graham Cairns

Results from simulations of weight perturbation as an on-chip learning scheme for analogue VLSI neural networks are presented. The limitations of analogue hardware are modelled as realistically as possible: synaptic weight precision is defined as the smallest change in the weight-setting voltage that produces a measurable change at the output of the corresponding neuron. Tests are carried out on a hard classification problem constructed from mobile robot navigation data. The simulations show that the degradation in classification performance on a 500-pattern test set caused by introducing realistic hardware constraints is acceptable: with 8-bit weights, probabilistic updates, and a simplified output error criterion, the error rate increases by no more than 7% compared with weight perturbation implemented at full 32-bit precision.
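The core idea of weight perturbation is to estimate each weight's gradient by finite differences on the measured output error, which suits analogue hardware because it needs only forward passes. The sketch below is a minimal illustration of that scheme with weights quantized to a fixed precision; the function names, the toy single-layer tanh network, and the random data are assumptions for illustration and are not taken from the paper (which used mobile robot navigation data and probabilistic updates).

```python
import numpy as np

def quantize(w, bits=8, w_max=1.0):
    """Clip and round weights to the given bit precision, mimicking the
    limited resolution of an analogue weight-setting voltage."""
    levels = 2 ** bits - 1
    w = np.clip(w, -w_max, w_max)
    return np.round((w + w_max) / (2 * w_max) * levels) / levels * (2 * w_max) - w_max

def mse(w, x, t):
    """Mean-squared output error of a toy single-layer tanh network."""
    y = np.tanh(x @ w)
    return np.mean((y - t) ** 2)

def weight_perturbation_step(w, x, t, delta=0.01, lr=0.1, bits=8):
    """One pass of weight perturbation: perturb each weight in turn, measure
    the change in output error, and move against the estimated gradient.
    Weights are re-quantized after every update."""
    base_err = mse(w, x, t)
    new_w = w.copy()
    for idx in np.ndindex(w.shape):
        w_pert = w.copy()
        w_pert[idx] += delta
        grad_est = (mse(w_pert, x, t) - base_err) / delta  # finite-difference gradient
        new_w[idx] -= lr * grad_est
    return quantize(new_w, bits=bits)

# Illustrative use on random data (not the robot navigation set from the paper)
rng = np.random.default_rng(0)
x = rng.standard_normal((50, 4))
t = np.sign(x[:, :1])                          # toy binary targets
w = quantize(rng.standard_normal((4, 1)) * 0.1)
for _ in range(20):
    w = weight_perturbation_step(w, x, t)
print("final error:", mse(w, x, t))
```

Because only the measured error is needed, the same update rule works whether the forward pass is simulated or performed by the analogue chip itself.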


