Parallelizing Feed-Forward Artificial Neural Networks on Transputers
<p>This thesis is about parallelizing the training phase of a feed-forward artificial neural network. More specifically, we develop and analyze a number of parallelizations of the widely used neural net learning algorithm called <em>back-propagation</em>.</p><p>We describe two different strategies for parallelizing the back-propagation algorithm. A number of parallelizations employing these strategies have been implemented on a system of 48 transputers, permitting us to evaluate and analyze their performance based on the results of actual runs.</p>
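The abstract does not name the two parallelization strategies, but a common way to parallelize back-propagation training is training-set parallelism: each processor computes gradients for its own slice of the patterns, and the partial gradients are summed into a full-batch update. The sketch below illustrates that arithmetic in NumPy (the thesis itself targets transputers, so this is an assumption-laden illustration, not the author's implementation); the network sizes, XOR data, and sequential "workers" are all hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny 2-3-1 feed-forward net; layer sizes are illustrative, not from the thesis.
W1 = rng.normal(size=(2, 3))
W2 = rng.normal(size=(3, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop_grads(X, y, W1, W2):
    """One forward + backward pass over a batch of patterns.
    Returns gradients of the summed squared error w.r.t. W1 and W2."""
    h = sigmoid(X @ W1)                       # hidden activations
    out = sigmoid(h @ W2)                     # network outputs
    d_out = (out - y) * out * (1.0 - out)     # output-layer deltas
    d_h = (d_out @ W2.T) * h * (1.0 - h)      # hidden-layer deltas
    return X.T @ d_h, h.T @ d_out

# Four training patterns (XOR, as a stand-in training set).
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([[0.], [1.], [1.], [0.]])

# Training-set parallelism: split the patterns across "workers", let each
# compute gradients on its slice, then sum the partial gradients. On a
# transputer network each slice would live on its own processor; here the
# workers run sequentially just to show the arithmetic.
gW1 = np.zeros_like(W1)
gW2 = np.zeros_like(W2)
for Xs, ys in zip(np.array_split(X, 2), np.array_split(y, 2)):
    g1, g2 = backprop_grads(Xs, ys, W1, W2)
    gW1 += g1
    gW2 += g2

# Because batch gradients are sums over patterns, the summed partial
# gradients equal the full-batch gradient exactly.
full1, full2 = backprop_grads(X, y, W1, W2)
assert np.allclose(gW1, full1) and np.allclose(gW2, full2)
```

The key property exploited here is that the batch gradient is a sum over individual patterns, so partitioning the training set across processors changes where the work is done but not the resulting update.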