An Information Theoretic View on Learning of Artificial Neural Networks

Author(s):
Emilio Rafael Balda,
Arash Behboodi,
Rudolf Mathar
2008, Vol. 18 (05), pp. 389-403
Author(s):
Thomas D. Jorgensen,
Barry P. Haynes,
Charlotte C. F. Norlund

This paper describes a new method for pruning artificial neural networks, using a measure of the neural complexity of the network to determine which connections should be pruned. The measure computes the information-theoretic complexity of a neural network, an approach related to, yet distinct from, previous research on pruning. The proposed method shows how overly large and complex networks can be reduced in size whilst retaining learnt behaviour and fitness, and it helps to discover a network topology that matches the complexity of the problem the network is meant to solve. The pruning technique is tested in a robot control domain, simulating a racecar. The proposed method is shown to be a significant improvement over the most commonly used pruning method, magnitude-based pruning. Furthermore, some of the pruned networks prove to be faster learners than the benchmark network they originate from. This pruning method can therefore also help to unleash hidden potential in a network: because pruning reduces the dimensionality of the network, the learning time decreases substantially.
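The abstract names magnitude-based pruning as the baseline but does not spell out the paper's information-theoretic complexity measure. The sketch below (Python with NumPy) illustrates the general prune-by-score pattern with magnitude scoring standing in as that baseline; the names prune_by_score, magnitude_score, score_fn, and prune_fraction are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch, not the paper's method: prune_by_score, magnitude_score,
# and prune_fraction are illustrative names chosen for this example.
import numpy as np


def magnitude_score(weights):
    """Baseline importance score used by magnitude-based pruning: |w|."""
    return np.abs(weights)


def prune_by_score(weights, score_fn=magnitude_score, prune_fraction=0.5):
    """Zero out the lowest-scoring fraction of connections in one weight matrix.

    weights        : 2-D array of connection weights for a layer.
    score_fn       : maps weights to per-connection importance scores.
    prune_fraction : fraction of connections to remove (between 0 and 1).
    Returns the pruned weights and the boolean keep-mask.
    """
    scores = score_fn(weights)
    threshold = np.quantile(scores, prune_fraction)
    keep = scores > threshold
    return weights * keep, keep


if __name__ == "__main__":
    rng = np.random.default_rng(seed=0)
    w = rng.normal(size=(8, 8))           # toy layer weights
    pruned_w, keep = prune_by_score(w, prune_fraction=0.5)
    print(f"kept {keep.sum()} of {keep.size} connections")
```

Swapping score_fn for a complexity-based importance estimate would reproduce the structure of the comparison described in the abstract, with the resulting keep-mask carried into further training to assess learning speed.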

