Knowledge of Rules in Connectionist Networks
1990 ◽ Vol 9 (1) ◽ pp. 81-126
Author(s): Martin Davies

2002 ◽ Vol 14 (7) ◽ pp. 1755-1769
Author(s): Robert M. French ◽ Nick Chater

In error-driven distributed feedforward networks, new information typically interferes, sometimes severely, with previously learned information. We show how noise can be used to approximate the error surface of previously learned information. By combining this approximated error surface with the error surface associated with the new information to be learned, the network's retention of previously learned items can be improved and catastrophic interference significantly reduced. Further, we show that the noise-generated error surface is produced using only first-derivative information and without recourse to any explicit error information.
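Below is a minimal sketch of one way the idea described in the abstract might be realized: after training on the old items, random noise inputs are passed through the network and its own outputs are recorded as pseudo-targets, so that the gradient on these noise patterns approximates the error surface of the old knowledge; this gradient is then combined with the gradient on the new items during subsequent training. The network architecture, sizes, learning rates, and equal weighting of the two gradients are illustrative assumptions, not the authors' exact procedure.

```python
# Hedged sketch: approximating the "old knowledge" error surface with noise
# and combining it with the error on new items. All hyperparameters here
# (sizes, learning rates, equal loss weighting) are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

def init(n_in, n_hid, n_out):
    return {"W1": rng.normal(0, 0.5, (n_in, n_hid)),
            "W2": rng.normal(0, 0.5, (n_hid, n_out))}

def forward(p, x):
    h = np.tanh(x @ p["W1"])                 # hidden activations
    return h, 1 / (1 + np.exp(-h @ p["W2"]))  # sigmoid outputs

def grads(p, x, t):
    # Standard backprop gradients for a tanh-hidden, sigmoid-output net (MSE loss):
    # uses first-derivative information only.
    h, y = forward(p, x)
    d_out = (y - t) * y * (1 - y)
    d_hid = (d_out @ p["W2"].T) * (1 - h ** 2)
    return {"W1": x.T @ d_hid, "W2": h.T @ d_out}

def train(p, x, t, lr=0.5, steps=500):
    for _ in range(steps):
        g = grads(p, x, t)
        for k in p:
            p[k] -= lr * g[k]

# 1) Learn the "old" items normally.
old_x = rng.uniform(0, 1, (8, 10))
old_t = rng.integers(0, 2, (8, 3)).astype(float)
params = init(10, 16, 3)
train(params, old_x, old_t)

# 2) Approximate the old error surface with noise: feed random inputs through
#    the trained network and record its outputs as pseudo-targets.
noise_x = rng.uniform(0, 1, (32, 10))
_, noise_t = forward(params, noise_x)

# 3) Learn new items while also descending the noise-approximated surface,
#    by summing the two gradients at each step.
new_x = rng.uniform(0, 1, (8, 10))
new_t = rng.integers(0, 2, (8, 3)).astype(float)
for _ in range(500):
    g_new = grads(params, new_x, new_t)
    g_old = grads(params, noise_x, noise_t)
    for k in params:
        params[k] -= 0.5 * (g_new[k] + g_old[k])

# Retention check on the old items after learning the new ones.
_, y_old = forward(params, old_x)
print("mean abs error on old items:", np.abs(y_old - old_t).mean())
```

In this sketch, dropping the `g_old` term reproduces ordinary sequential training and, typically, much worse retention of the old items; the comparison is left as a usage exercise.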

