Neural Networks for Reliable Information Transmission

Author(s):  
M. Hussain ◽  
J.S. Bedi
2008 ◽  
Vol 9 (S1) ◽  
Author(s):  
Andreas Herzog ◽  
Bernd Michaelis ◽  
Ana D de Lima ◽  
Thomas Baltz ◽  
Thomas Voigt

Author(s):  
Nikolay Anatolievich Vershkov ◽  
Mikhail Grigoryevich Babenko ◽  
Viktor Andreevich Kuchukov ◽  
Natalia Nikolaevna Kuchukova

The article deals with the problem of handwritten digit recognition using feedforward neural networks (perceptrons) with a correlation indicator. The proposed method is based on a mathematical model of the neural network as an oscillatory system, similar to an information transmission system. The article draws on the authors' theoretical work on searching for the global extremum of the error function in artificial neural networks. The handwritten digit image is treated as a one-dimensional discrete input signal: a combination of an "ideal digit writing" and noise that describes the deviation of the input realization from the ideal writing. The ideal observer criterion (Kotelnikov criterion), which is widely used in information transmission systems and describes the probability of correct recognition of the input signal, is used to form the loss function. The article presents a comparative analysis, on experimentally obtained sequences, of learning convergence with the correlation indicator and with the CrossEntropyLoss function widely used in classification tasks, both with and without an optimizer. Based on the experiments carried out, it is concluded that the proposed correlation indicator offers an advantage by a factor of two to three.
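As a rough illustration of the comparison described above, the sketch below trains a single-layer perceptron on synthetic one-dimensional "digit" signals (an ideal template plus noise) with two loss functions: a correlation-style indicator, here approximated as one minus the cosine similarity between the network output and a one-hot target, and the standard CrossEntropyLoss. The synthetic data, the cosine-based loss, and the training settings are illustrative assumptions, not the authors' exact formulation.

```python
# Minimal sketch (not the authors' exact method): a single-layer perceptron
# trained on synthetic 1-D "digit" signals, comparing a correlation-style
# loss against CrossEntropyLoss.

import torch
import torch.nn as nn

torch.manual_seed(0)

n_classes, n_features, n_samples = 10, 64, 512

# Synthetic data: each class has an "ideal writing" template plus noise.
templates = torch.randn(n_classes, n_features)
labels = torch.randint(0, n_classes, (n_samples,))
inputs = templates[labels] + 0.3 * torch.randn(n_samples, n_features)

def correlation_loss(outputs, targets):
    """1 - cosine similarity between outputs and one-hot targets
    (one plausible reading of a 'correlation indicator')."""
    one_hot = torch.nn.functional.one_hot(targets, n_classes).float()
    cos = torch.nn.functional.cosine_similarity(outputs, one_hot, dim=1)
    return (1.0 - cos).mean()

def train(loss_fn, epochs=50, lr=0.1):
    model = nn.Linear(n_features, n_classes)  # single-layer perceptron
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(inputs), labels)
        loss.backward()
        opt.step()
    preds = model(inputs).argmax(dim=1)
    return (preds == labels).float().mean().item()

print("correlation-loss accuracy:", train(correlation_loss))
print("cross-entropy accuracy   :", train(nn.CrossEntropyLoss()))
```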


2012 ◽  
Vol 22 (02) ◽  
pp. 1230008 ◽  
Author(s):  
M. S. BAPTISTA ◽  
J. X. DE CARVALHO ◽  
M. S. HUSSEIN ◽  
C. GREBOGI

This work clarifies the relationship between network circuit (topology) and behavior (information transmission and synchronization) in active networks, e.g. neural networks. As an application, we show how to determine a network topology that is optimal for information transmission. By optimal, we mean that the network transmits a large amount of information, possesses a large number of communication channels, and is robust under large variations of the network coupling configuration. This theoretical approach is general and does not depend on the particular dynamics of the elements forming the network, since the network topology can be determined by finding a Laplacian matrix (the matrix that describes the connections and the coupling strengths among the elements) whose eigenvalues satisfy some special conditions. To illustrate our ideas and theoretical approaches, we use neural networks of electrically connected chaotic Hindmarsh–Rose neurons.
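As a small numerical illustration of the Laplacian-based view described above (not the paper's specific eigenvalue conditions), the sketch below builds the Laplacian of a candidate coupling topology, a six-node ring of electrically coupled elements with one shortcut link, and reports its spectrum together with the eigenratio lambda_N / lambda_2, a quantity commonly used as a proxy for synchronizability. The example graph and the eigenratio criterion are illustrative assumptions.

```python
# Illustrative sketch only: computing the Laplacian spectrum of a candidate
# coupling topology. The paper's specific eigenvalue conditions are not
# reproduced here.

import numpy as np

def laplacian(adjacency):
    """Graph Laplacian L = D - A for a symmetric weighted adjacency matrix."""
    degree = np.diag(adjacency.sum(axis=1))
    return degree - adjacency

# Example topology: a ring of 6 electrically coupled elements plus one shortcut.
A = np.zeros((6, 6))
for i in range(6):
    A[i, (i + 1) % 6] = A[(i + 1) % 6, i] = 1.0  # nearest-neighbour coupling
A[0, 3] = A[3, 0] = 1.0                           # shortcut link

L = laplacian(A)
eigvals = np.linalg.eigvalsh(L)  # real, non-negative, ascending

print("Laplacian eigenvalues:", np.round(eigvals, 3))
print("eigenratio lambda_N/lambda_2:", eigvals[-1] / eigvals[1])
```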


Author(s):  
Saïd Kourrich ◽  
Antonello Bonci

The brain is an extraordinarily complex organ that constantly has to process information to adapt appropriately to internal and external stimuli. This information is received, processed, and transmitted within neural networks by neurons through specialized connections called synapses. While information transmission at synapses is primarily chemical, it propagates through a neuron via electrical signals made of patterns of action potentials. The present chapter describes the fundamental types of plastic changes that can affect neuronal transmission. Importantly, these various types of neural plasticity have been associated with both adaptive processes, such as learning and memory, and pathological conditions, such as neurological and psychiatric disorders.

