Effects of hidden nodes on noisy network dynamics

2021 ◽ Vol 103 (6) ◽ Author(s): Beverly Gemao, Pik-Yin Lai

2015 ◽ Vol E98.B (9) ◽ pp. 1749-1757 ◽ Author(s): Yun WEN, Kazuyuki OZAKI, Hiroshi FUJITA, Teruhisa NINOMIYA, Makoto YOSHIDA

2016 ◽ Vol 7 (2) ◽ pp. 105-112 ◽ Author(s): Adhi Kusnadi, Idul Putra

Stress is experienced by every human being, and the level of stress differs from one individual to another. Stress experienced by students will disturb their studies if it is not handled quickly and appropriately. We therefore built an expert system based on a backpropagation neural network to help counselors predict students' stress levels. The network used in the experiment consists of 26 input nodes, 5 hidden nodes, and 2 output nodes, with a learning rate of 0.1, a momentum of 0.1, and 5000 epochs, and achieved a 100% accuracy rate. Index Terms - stress on study, expert system, neural network, stress prediction
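As a rough illustration of the architecture described above, the sketch below trains a 26-5-2 backpropagation network with a learning rate of 0.1, momentum of 0.1, and 5000 epochs. The synthetic data and weight initialization are assumptions for demonstration only, not the authors' questionnaire data or code.

```python
# Minimal sketch (assumed setup, not the authors' code) of the described network:
# 26 input nodes, 5 hidden nodes, 2 output nodes, lr 0.1, momentum 0.1, 5000 epochs.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 26, 5, 2
lr, momentum, epochs = 0.1, 0.1, 5000

# Synthetic stand-in for the questionnaire data: 26 features, 2-class one-hot labels.
X = rng.random((100, n_in))
y = np.eye(n_out)[rng.integers(0, n_out, 100)]

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Weights, biases, and their momentum buffers.
W1 = rng.normal(0, 0.5, (n_in, n_hid)); b1 = np.zeros(n_hid)
W2 = rng.normal(0, 0.5, (n_hid, n_out)); b2 = np.zeros(n_out)
vW1 = np.zeros_like(W1); vb1 = np.zeros_like(b1)
vW2 = np.zeros_like(W2); vb2 = np.zeros_like(b2)

for epoch in range(epochs):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass (mean squared error, sigmoid derivatives).
    d_out = (out - y) * out * (1 - out)
    d_hid = (d_out @ W2.T) * h * (1 - h)

    # Gradient descent step with classical momentum.
    vW2 = momentum * vW2 - lr * h.T @ d_out / len(X)
    vb2 = momentum * vb2 - lr * d_out.mean(axis=0)
    vW1 = momentum * vW1 - lr * X.T @ d_hid / len(X)
    vb1 = momentum * vb1 - lr * d_hid.mean(axis=0)
    W2 += vW2; b2 += vb2; W1 += vW1; b1 += vb1

out = sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)
print("training accuracy:", (out.argmax(1) == y.argmax(1)).mean())
```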


Author(s): Roni Tibon, Kamen A. Tsvetanov, Darren Price, David Nesbitt, Cam CAN, ...

Entropy ◽ 2020 ◽ Vol 23 (1) ◽ pp. 56 ◽ Author(s): Haoyu Niu, Jiamin Wei, YangQuan Chen

The Stochastic Configuration Network (SCN) has a powerful capability for regression and classification analysis. Traditionally, it is quite challenging to determine an appropriate architecture for a neural network so that the trained model achieves excellent performance in both learning and generalization. Compared with known randomized learning algorithms for single-hidden-layer feed-forward neural networks, such as Randomized Radial Basis Function (RBF) Networks and the Random Vector Functional-Link (RVFL) network, the SCN randomly assigns the input weights and biases of the hidden nodes under a supervisory mechanism. Since the hidden-layer parameters are randomly generated from a uniform distribution, there may hypothetically exist a distribution with better randomness properties. Heavy-tailed distributions have shown optimal randomness for finding targets in unknown environments. Therefore, in this research, the authors used heavy-tailed distributions to randomly initialize the weights and biases and examined whether the new SCN models can achieve better performance than the original SCN. Heavy-tailed distributions such as the Lévy, Cauchy, and Weibull distributions were used. Since some mixed distributions also show heavy-tailed properties, a mixture of Gaussian and Laplace distributions was studied as well. Experimental results showed improved performance for SCNs with heavy-tailed distributions. For the regression model, SCN-Lévy, SCN-Mixture, SCN-Cauchy, and SCN-Weibull used fewer hidden nodes to achieve performance similar to the original SCN. For the classification model, SCN-Mixture, SCN-Lévy, and SCN-Cauchy achieved higher test accuracies of 91.5%, 91.7%, and 92.4%, respectively, all above the test accuracy of the original SCN.
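To make the initialization idea concrete, the sketch below shows one way of drawing hidden-node input weights and biases from the heavy-tailed (and mixture) distributions named above instead of a uniform distribution. The scale parameters, Weibull shape, and mixture weights are illustrative assumptions rather than the paper's settings, and the SCN supervisory candidate-selection step is omitted.

```python
# Minimal sketch (assumed parameters, not the paper's code) of heavy-tailed
# initialization of hidden-node weights and biases, as an alternative to the
# uniform draw used by the original SCN.
import numpy as np

rng = np.random.default_rng(0)

def sample_weights(dist, shape, scale=1.0):
    if dist == "uniform":            # original SCN-style initialization
        return rng.uniform(-scale, scale, shape)
    if dist == "cauchy":             # heavy-tailed, symmetric
        return scale * rng.standard_cauchy(shape)
    if dist == "levy":               # heavy-tailed, one-sided: c / Z^2 with Z ~ N(0, 1)
        z = rng.standard_normal(shape)
        return scale / np.square(z)
    if dist == "weibull":            # heavier-tailed for shape parameter < 1
        return scale * rng.weibull(0.5, shape)
    if dist == "mixture":            # Gaussian-Laplace mixture, equal weights assumed
        pick = rng.random(shape) < 0.5
        return np.where(pick, rng.normal(0, scale, shape), rng.laplace(0, scale, shape))
    raise ValueError(f"unknown distribution: {dist}")

# Example: input weights and biases for 25 candidate hidden nodes fed by 10 inputs.
n_inputs, n_candidates = 10, 25
W = sample_weights("cauchy", (n_inputs, n_candidates))
b = sample_weights("cauchy", (n_candidates,))
print(W.shape, b.shape)
```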


2021 ◽ Vol 12 (1) ◽ Author(s): Giacomo Baggio, Danielle S. Bassett, Fabio Pasqualetti

Our ability to manipulate the behavior of complex networks depends on the design of efficient control algorithms and, critically, on the availability of an accurate and tractable model of the network dynamics. While the design of control algorithms for network systems has seen notable advances in the past few years, knowledge of the network dynamics is a ubiquitous assumption that is difficult to satisfy in practice. In this paper we overcome this limitation and develop a data-driven framework to control a complex network optimally and without any knowledge of the network dynamics. Our optimal controls are constructed using a finite set of data, where the unknown network is stimulated with arbitrary and possibly random inputs. Although our controls are provably correct for networks with linear dynamics, we also characterize their performance against noisy data and in the presence of nonlinear dynamics, as they arise in power grid and brain networks.
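As a hedged illustration of the data-driven idea (a generic least-squares sketch, not the authors' algorithm), the code below excites an unknown linear network x(t+1) = A x(t) + B u(t) with random inputs, identifies the T-step controllability map from the recorded data, and computes a minimum-norm input that steers the network to a target state. The simulated dynamics, horizon, and experiment count are assumptions used only to generate data.

```python
# Minimal sketch of data-driven steering of a linear network from input/state data.
import numpy as np

rng = np.random.default_rng(0)
n, m, T, N = 8, 2, 10, 200           # nodes, inputs, horizon, experiments

# Unknown ground-truth network (used only to simulate the experiments).
A = 0.9 * rng.standard_normal((n, n)) / np.sqrt(n)
B = rng.standard_normal((n, m))

def run_experiment(u_seq):
    """Simulate the network from x(0) = 0 under a stacked input sequence of length T*m."""
    x = np.zeros(n)
    for t in range(T):
        x = A @ x + B @ u_seq[t * m:(t + 1) * m]
    return x

# Collect data: random input sequences and the final states they produce.
U = rng.standard_normal((T * m, N))                       # each column is one input sequence
X = np.column_stack([run_experiment(U[:, k]) for k in range(N)])

# Least-squares estimate of the T-step controllability map (X = C_T U).
C_hat = X @ np.linalg.pinv(U)

# Minimum-norm input that (approximately) reaches the target state.
x_target = rng.standard_normal(n)
u_star = np.linalg.pinv(C_hat) @ x_target
print("reaching error:", np.linalg.norm(run_experiment(u_star) - x_target))
```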


Author(s): Daisuke Koshiyama, Makoto Miyakoshi, Yash B. Joshi, Juan L. Molina, Kumiko Tanaka-Koshiyama, ...
