Fast learning in a backpropagation algorithm with a sine-type thresholding function

1992 ◽  
Vol 31 (14) ◽  
pp. 2414 ◽  
Author(s):  
Yan-Xin Zhang ◽  
Dong-Xue Wang

2016 ◽  
Vol 7 (2) ◽  
pp. 105-112 ◽
Author(s):  
Adhi Kusnadi ◽  
Idul Putra

Stress is experienced by every human being, and the level of stress differs from one individual to another. Stress experienced by students will disturb their studies if it is not handled quickly and appropriately. We have therefore created an expert system based on a backpropagation neural network to help counselors predict students' stress levels. The network used in the experiment consists of 26 input nodes, 5 hidden nodes, and 2 output nodes, with a learning rate of 0.1, momentum of 0.1, and 5000 epochs, and achieved a 100% accuracy rate. Index Terms - Stress on study, expert system, neural network, stress prediction
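The 26-5-2 network above can be sketched as a plain backpropagation loop with momentum. This is a minimal illustration, assuming sigmoid activations and a squared-error loss (the abstract does not state either); the data are random placeholders, not the authors' student questionnaire, so no particular accuracy should be expected.

```python
import numpy as np

# Sketch of the described network: 26 input, 5 hidden, 2 output nodes,
# learning rate 0.1, momentum 0.1, 5000 epochs.
rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_in, n_hid, n_out = 26, 5, 2
lr, momentum, epochs = 0.1, 0.1, 5000

W1 = rng.normal(0.0, 0.5, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0.0, 0.5, (n_hid, n_out)); b2 = np.zeros(n_out)
vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)

X = rng.random((40, n_in))                     # placeholder questionnaire answers
Y = np.eye(n_out)[rng.integers(0, n_out, 40)]  # placeholder stress-level labels

mse0 = ((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) - Y) ** 2).mean()

for _ in range(epochs):
    h = sigmoid(X @ W1 + b1)                   # forward pass
    y = sigmoid(h @ W2 + b2)
    d_out = (y - Y) * y * (1 - y) / len(X)     # backprop of mean squared error
    d_hid = (d_out @ W2.T) * h * (1 - h)
    vW2 = momentum * vW2 - lr * (h.T @ d_out); W2 += vW2   # momentum updates
    vb2 = momentum * vb2 - lr * d_out.sum(0);  b2 += vb2
    vW1 = momentum * vW1 - lr * (X.T @ d_hid); W1 += vW1
    vb1 = momentum * vb1 - lr * d_hid.sum(0);  b1 += vb1

mse = ((y - Y) ** 2).mean()
```

The momentum term reuses a fraction of the previous weight update, which smooths the descent direction across epochs; with real labeled questionnaire data, the predicted stress level would be read off the larger of the two output activations.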


2012 ◽  
Vol 38 (11) ◽  
pp. 1831 ◽
Author(s):  
Wen-Jun HU ◽  
Shi-Tong WANG ◽  
Juan WANG ◽  
Wen-Hao YING

2020 ◽  
Vol 15 ◽  
pp. 155892501990083 ◽
Author(s):  
Xintong Li ◽  
Honglian Cong ◽  
Zhe Gao ◽  
Zhijia Dong

In this article, thermal resistance and water vapor resistance tests were conducted to obtain heat and moisture performance data. Canonical correlation analysis was used to determine the influence of basic fabric parameters on heat and moisture performance. Thermal resistance and water vapor resistance models were then established with a three-layer feedforward neural network. To improve the generalization of the network and address the difficulty of determining the optimal network structure, trainbr was chosen as the training algorithm to find the relationship between the input factors and the output data. After training and verification, the thermal resistance model used 12 hidden-layer neurons and the water vapor resistance model used 10; in both cases the error reached 10^−3.
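A three-layer feedforward regression net of this shape can be sketched as follows. Note that trainbr is MATLAB's Bayesian-regularization backpropagation; as a rough stand-in, this sketch uses a plain L2 weight penalty, and both the input fabric parameters (their number and identity are assumptions) and the target values are synthetic placeholders, not the measured fabric data.

```python
import numpy as np

# Sketch: fabric parameters in, one resistance value out, 12 tanh hidden
# neurons (the thermal resistance model's reported size), linear output.
rng = np.random.default_rng(1)

n_in, n_hid = 4, 12              # four assumed fabric parameters, 12 hidden neurons
lr, l2, epochs = 0.1, 1e-4, 5000

X = rng.random((60, n_in))
t = (X @ np.array([0.5, 0.3, -0.2, 0.1]))[:, None]   # synthetic target values

W1 = rng.normal(0.0, 0.3, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0.0, 0.3, (n_hid, 1));    b2 = np.zeros(1)

mse0 = (((np.tanh(X @ W1 + b1) @ W2 + b2) - t) ** 2).mean()

for _ in range(epochs):
    h = np.tanh(X @ W1 + b1)
    y = h @ W2 + b2                          # linear output for regression
    d_out = (y - t) / len(X)                 # gradient of mean squared error
    d_hid = (d_out @ W2.T) * (1 - h ** 2)
    W2 -= lr * (h.T @ d_out + l2 * W2); b2 -= lr * d_out.sum(0)
    W1 -= lr * (X.T @ d_hid + l2 * W1); b1 -= lr * d_hid.sum(0)

mse = float(((y - t) ** 2).mean())
```

Bayesian regularization proper adapts the penalty weight during training rather than fixing it, which is what makes trainbr attractive when the optimal hidden-layer size is unknown; the fixed `l2` here only imitates its shrinking effect on the weights.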


2019 ◽  
Vol 116 (16) ◽  
pp. 7723-7731 ◽  
Author(s):  
Dmitry Krotov ◽  
John J. Hopfield

It is widely believed that end-to-end training with the backpropagation algorithm is essential for learning good feature detectors in early layers of artificial neural networks, so that these detectors are useful for the task performed by the higher layers of that neural network. At the same time, the traditional form of backpropagation is biologically implausible. In the present paper we propose an unusual learning rule, which has a degree of biological plausibility and which is motivated by Hebb’s idea that change of the synapse strength should be local—i.e., should depend only on the activities of the pre- and postsynaptic neurons. We design a learning algorithm that utilizes global inhibition in the hidden layer and is capable of learning early feature detectors in a completely unsupervised way. These learned lower-layer feature detectors can be used to train higher-layer weights in a usual supervised way so that the performance of the full network is comparable to the performance of standard feedforward networks trained end-to-end with a backpropagation algorithm on simple tasks.
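The flavor of such a local, competitive rule can be sketched in a few lines. This is a deliberately simplified illustration, not the paper's exact update: each hidden unit's weight vector competes for the input; the winner receives a Hebbian pull toward the input, and the runner-up is pushed away, standing in for the global inhibition described above. No labels are used anywhere, and the data are random placeholder patterns.

```python
import numpy as np

# Simplified competitive Hebbian learning: winner moves toward the input,
# runner-up is repelled (a stand-in for global inhibition in the hidden layer).
rng = np.random.default_rng(2)

n_in, n_hid = 16, 8
lr, delta = 0.02, 0.4            # anti-Hebbian strength (assumed value)
W = rng.normal(0.0, 1.0, (n_hid, n_in))
W /= np.linalg.norm(W, axis=1, keepdims=True)

X = rng.random((200, n_in))      # placeholder input patterns, no labels

for _ in range(20):              # passes over the data
    for v in X:
        I = W @ v                            # each unit's input current (local)
        order = np.argsort(I)[::-1]
        win, second = order[0], order[1]
        # Oja-style updates keep the weight norms bounded:
        W[win]    += lr * (v - I[win] * W[win])            # Hebbian pull
        W[second] -= lr * delta * (v - I[second] * W[second])  # anti-Hebbian push

W /= np.linalg.norm(W, axis=1, keepdims=True)
```

Each update depends only on the presynaptic activity `v` and the unit's own current `I`, which is the locality property the abstract emphasizes; after this unsupervised phase, `W` would be frozen and a conventional supervised readout trained on top of the hidden activations.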

