Advances in Nonlinear Network Models and Algorithms

Author(s):  
John M. Mulvey

Author(s):  
Shuo Cheng ◽  
Guohui Zhou

Because a shallow neural network has limited capacity to represent complex functions with limited samples and computational units, its generalization ability is restricted on complex classification problems. The essence of deep learning is to learn a nonlinear network structure that forms distributed representations of the input data, giving it a powerful ability to learn deep features from a small set of samples. To achieve accurate classification of expression images under normal conditions, this paper proposes an expression recognition model based on an improved Visual Geometry Group (VGG) deep convolutional neural network (CNN). Starting from VGG-19, the model optimizes the network structure and network parameters. Most expression databases lack sufficient data to train an entire network from scratch, so this paper uses transfer learning to overcome the shortage of image training samples. A shallow CNN, AlexNet, and the improved VGG-19 deep CNN are trained on the facial expression data of the Extended Cohn–Kanade expression database, and their experimental results are compared. The results indicate that the improved VGG-19 network model achieves 96% accuracy in facial expression recognition, clearly superior to the results of the other network models.
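The transfer-learning recipe the abstract describes — keep the pretrained feature layers frozen and retrain only the classifier head on the small target dataset — can be sketched in miniature. The feature map, data, and hyperparameters below are illustrative stand-ins, not the paper's VGG-19 pipeline:

```python
import math

# Toy transfer-learning sketch: a frozen, pretend-pretrained feature
# extractor plus a trainable logistic-regression head. In the paper the
# extractor would be the convolutional layers of a pretrained VGG-19.

def pretrained_features(x):
    # Stand-in for the frozen feature layers: a fixed nonlinear map.
    return [x[0] * x[1], x[0] + x[1], max(x)]

def train_head(samples, labels, lr=0.1, epochs=200):
    # Only the head's weights are updated; the extractor never changes.
    w = [0.0, 0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            f = pretrained_features(x)
            z = sum(wi * fi for wi, fi in zip(w, f)) + b
            p = 1.0 / (1.0 + math.exp(-z))  # sigmoid activation
            g = p - y                       # logistic-loss gradient
            w = [wi - lr * g * fi for wi, fi in zip(w, f)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    f = pretrained_features(x)
    z = sum(wi * fi for wi, fi in zip(w, f)) + b
    return 1 if z > 0 else 0

# Tiny synthetic two-class problem standing in for expression classes.
samples = [(0.0, 0.0), (0.1, 0.2), (0.9, 1.0), (1.0, 0.8)]
labels = [0, 0, 1, 1]
w, b = train_head(samples, labels)
preds = [predict(w, b, x) for x in samples]
```

Freezing the extractor keeps the number of trainable parameters small, which is exactly why the technique works when the target database is too small to train a full network from scratch.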


1989 ◽  
Vol 37 (3) ◽  
pp. 353-372 ◽  
Author(s):  
Ron S. Dembo ◽  
John M. Mulvey ◽  
Stavros A. Zenios

2012 ◽  
Vol 2012 ◽  
pp. 1-12 ◽  
Author(s):  
Weisong Zhou ◽  
Zhichun Yang

A class of dynamical neural network models with time-varying delays is considered. By employing the Lyapunov–Krasovskii functional method and the linear matrix inequality (LMI) technique, some new sufficient conditions ensuring the input-to-state stability (ISS) of the nonlinear network systems are obtained. Finally, numerical examples illustrate the efficiency of the derived results.
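A representative instance of this setup (a generic delayed neural network, not necessarily the paper's exact system) is

```latex
\dot{x}(t) = -C x(t) + A f(x(t)) + B f(x(t - \tau(t))) + u(t),
```

where $u$ is the external input and $0 \le \tau(t) \le \bar{\tau}$ is the time-varying delay. ISS asks for a class-$\mathcal{KL}$ function $\beta$ and a class-$\mathcal{K}$ function $\gamma$ with

```latex
\|x(t)\| \le \beta\big(\|x(t_0)\|,\, t - t_0\big)
          + \gamma\Big(\sup_{t_0 \le s \le t} \|u(s)\|\Big).
```

A typical Lyapunov–Krasovskii functional for such systems is

```latex
V(x_t) = x^{\top}(t) P x(t)
       + \int_{t-\tau(t)}^{t} x^{\top}(s)\, Q\, x(s)\, ds,
\qquad P \succ 0,\; Q \succ 0,
```

and the "sufficient conditions" are LMIs in $P, Q$ whose feasibility guarantees a dissipation bound of the form $\dot{V} \le -\alpha \|x(t)\|^2 + \sigma \|u(t)\|^2$ along trajectories, which yields ISS.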


2009 ◽  
Vol 88 (6) ◽  
pp. 68001 ◽  
Author(s):  
F. Piazza ◽  
Y.-H. Sanejouand

2007 ◽  
Vol 99 (23) ◽  
Author(s):  
B. Juanico ◽  
Y.-H. Sanejouand ◽  
F. Piazza ◽  
P. De Los Rios

1997 ◽  
pp. 371-377 ◽  
Author(s):  
D. D. Lee ◽  
B. Y. Reis ◽  
H. S. Seung ◽  
D. W. Tank

2019 ◽  
Vol 42 ◽  
Author(s):  
Hanna M. van Loo ◽  
Jan-Willem Romeijn

Network models block reductionism about psychiatric disorders only if the models are interpreted in a realist manner – that is, taken to represent “what psychiatric disorders really are.” A flexible and more instrumentalist view of models is needed to improve our understanding of the heterogeneity and multifactorial character of psychiatric disorders.


2019 ◽  
Vol 42 ◽  
Author(s):  
Don Ross

Use of network models to identify causal structure typically blocks reduction across the sciences. Entanglement of mental processes with environmental and intentional relationships, as Borsboom et al. argue, makes reduction of psychology to neuroscience particularly implausible. However, in psychiatry, a mental disorder can involve no brain disorder at all, even when the former crucially depends on aspects of brain structure. Gambling addiction constitutes an example.


Author(s):  
S. R. Herd ◽  
P. Chaudhari

Electron diffraction and direct transmission electron microscopy have been used extensively to study the local atomic arrangement in amorphous solids, and in particular in Ge. Nearest-neighbor distances have been calculated from electron diffraction profiles, and the results have been interpreted in terms of either the microcrystalline or the random network model. Direct transmission electron microscopy appears to be the most direct and accurate method to resolve this issue, since the spatial resolution of the better instruments is of the order of 3 Å. In particular, the tilted-beam interference method is used regularly to show fringes corresponding to 1.5 to 3 Å lattice planes in crystals as resolution tests.
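The geometry behind those 1.5–3 Å fringe tests can be illustrated with Bragg's law, $\lambda = 2d\sin\theta$, using the relativistically corrected electron wavelength. The 100 kV accelerating voltage below is an assumed, typical value, not one stated in the text:

```python
import math

# Illustrative calculation (100 kV is an assumed accelerating voltage):
# relativistically corrected electron wavelength, and the first-order
# Bragg angle for a 3 A lattice-plane spacing as quoted in the text.
h = 6.62607e-34  # Planck constant, J s
m = 9.10938e-31  # electron rest mass, kg
e = 1.60218e-19  # elementary charge, C
c = 2.99792e8    # speed of light, m/s

V = 100e3        # accelerating voltage, V (assumption)
# lambda = h / sqrt(2 m e V (1 + e V / (2 m c^2)))
lam = h / math.sqrt(2 * m * e * V * (1 + e * V / (2 * m * c**2)))

d = 3.0e-10                        # lattice-plane spacing, m (3 A)
theta = math.asin(lam / (2 * d))   # Bragg condition, first order

lam_angstrom = lam * 1e10          # ~0.037 A at 100 kV
theta_deg = math.degrees(theta)    # a fraction of a degree
```

Because the electron wavelength is two orders of magnitude smaller than the lattice spacings of interest, the Bragg angles involved are a fraction of a degree, which is what makes tilted-beam interference imaging of such fine fringes feasible as a resolution test.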

