Neural Networks and Sigmoid Activation Function in Multi-Layer Networks
Backpropagation neural networks are known for solving problems that cannot easily be computed otherwise, such as training on and analyzing large datasets. The main idea of this paper is to implement the XOR logic gate with an artificial neural network, using backpropagation of errors and the sigmoid activation function. The network realizes a non-linear threshold gate: binary inputs are passed through a hidden layer, the output error is computed, and the weights and thetas (biases) are adjusted according to that error. The sigmoid activation function is sig(x) = 1/(1 + e^(-x)) and its derivative is Dsig(x) = sig(x)(1 - sig(x)). Both sig(x) and Dsig(x) lie between 0 and 1.
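The training loop described above can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: the hidden-layer size (4 units), learning rate, iteration count, and random initialization are all assumed choices made here for demonstration.

```python
import numpy as np

def sig(x):
    """Sigmoid activation: output lies strictly between 0 and 1."""
    return 1.0 / (1.0 + np.exp(-x))

def dsig(s):
    """Derivative of the sigmoid, written in terms of its output s = sig(x)."""
    return s * (1.0 - s)

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # binary inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

# Illustrative 2-4-1 architecture; the paper does not specify these sizes.
W1 = rng.normal(size=(2, 4))   # input -> hidden weights
b1 = np.zeros((1, 4))          # hidden thetas (biases)
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))          # output theta (bias)

out0 = sig(sig(X @ W1 + b1) @ W2 + b2)  # predictions before training

lr = 2.0
for _ in range(20000):
    # Forward pass: inputs through the hidden layer to the output.
    h = sig(X @ W1 + b1)
    out = sig(h @ W2 + b2)
    # Backward pass: propagate the output error back toward the input.
    err_out = (out - y) * dsig(out)
    err_h = (err_out @ W2.T) * dsig(h)
    # Change the weights and thetas according to the errors.
    W2 -= lr * h.T @ err_out
    b2 -= lr * err_out.sum(axis=0, keepdims=True)
    W1 -= lr * X.T @ err_h
    b1 -= lr * err_h.sum(axis=0, keepdims=True)
```

After training, the squared error on the four XOR patterns should be far below its initial value, with the outputs approaching 0 or 1 as appropriate.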
2019, Vol. 12 (3), pp. 156-161