Optimal CNN–Hopfield Network for Pattern Recognition Based on a Genetic Algorithm

Algorithms ◽  
2021 ◽  
Vol 15 (1) ◽  
pp. 11
Author(s):  
Fekhr Eddine Keddous ◽  
Amir Nakib

Convolutional neural networks (CNNs) have powerful representation learning capabilities, automatically learning and extracting features directly from inputs. In classification applications, CNN models are typically composed of convolutional layers, pooling layers, and fully connected (FC) layers. In a chain-based deep neural network, the FC layers contain most of the parameters of the network, which affects memory occupancy and computational complexity. For many real-world problems, speeding up inference time is an important matter because of the hardware design implications. To deal with this problem, we propose replacing the FC layers with a Hopfield neural network (HNN). The proposed architecture combines a CNN and an HNN: a pretrained CNN model is used for feature extraction, followed by an HNN that serves as an associative memory storing all features created by the CNN. Then, to deal with the limited storage capacity of the HNN, the proposed work uses multiple HNNs. To optimize this step, a knapsack problem formulation is proposed, and a genetic algorithm (GA) is used to solve it. According to the results obtained on the Noisy MNIST dataset, our work outperforms the state-of-the-art algorithms.
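The associative-memory role the HNN plays here can be illustrated with a minimal Hopfield network: bipolar feature vectors are stored with the Hebbian rule and recalled from corrupted inputs. This is a generic sketch of Hopfield storage and recall, not the paper's implementation; the toy 8-bit patterns stand in for CNN feature vectors, and the multi-HNN partitioning via the knapsack/GA step is omitted.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian storage: W = (1/N) * sum_p x_p x_p^T, with zero diagonal."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, state, steps=20):
    """Synchronous sign-updates until the state stops changing."""
    for _ in range(steps):
        new = np.sign(w @ state)
        new[new == 0] = 1          # break ties toward +1
        if np.array_equal(new, state):
            break
        state = new
    return state

# Store two orthogonal bipolar vectors (stand-ins for CNN feature vectors).
pats = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                 [1, 1, -1, -1, 1, 1, -1, -1]])
w = train_hopfield(pats)
noisy = pats[0].copy()
noisy[0] *= -1                     # corrupt one bit
print(recall(w, noisy))            # recovers the first stored pattern
```

With orthogonal patterns the Hebbian rule gives clean recall; in the paper's setting, capacity limits of a single such network motivate splitting the stored features across multiple HNNs.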

2016 ◽  
pp. 1099-1114
Author(s):  
Zongyuan Zhao ◽  
Shuxiang Xu ◽  
Byeong Ho Kang ◽  
Mir Md Jahangir Kabir ◽  
Yunling Liu ◽  
...  

Artificial Neural Networks (ANNs) have shown impressive ability on many real-world problems such as pattern recognition, classification, and function approximation. An extension of the ANN, the higher order neural network (HONN), improves the ANN's computational and learning capabilities. However, the large number of higher order attributes leads to long learning times and a complex network structure, and irrelevant higher order attributes can also hinder the performance of a HONN. In this chapter, feature selection algorithms are used to simplify the HONN architecture. Comparisons of fully connected HONNs with feature-selected HONNs demonstrate that proper feature selection can be effective in decreasing the number of inputs, reducing computational time, and improving the prediction accuracy of a HONN.
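As a rough illustration of why feature selection matters for HONNs, the sketch below expands inputs with second-order (pairwise-product) attributes and then ranks them by correlation with the target. Both the expansion and the correlation ranking are illustrative stand-ins, not the chapter's specific algorithms.

```python
import numpy as np

def second_order_features(x):
    """Augment inputs with pairwise products -- a simple second-order
    (HONN-style) attribute expansion."""
    n = x.shape[1]
    pairs = [x[:, i] * x[:, j] for i in range(n) for j in range(i, n)]
    return np.column_stack([x] + pairs)

def select_top_k(feats, y, k):
    """Rank features by absolute correlation with the target and keep
    the k best -- a stand-in for more elaborate selection algorithms."""
    scores = np.array([abs(np.corrcoef(f, y)[0, 1]) for f in feats.T])
    return np.argsort(scores)[::-1][:k]

rng = np.random.default_rng(1)
x = rng.standard_normal((200, 4))
y = x[:, 0] * x[:, 1] + 0.1 * rng.standard_normal(200)  # target depends on a product term
feats = second_order_features(x)                        # 4 raw + 10 product columns
kept = select_top_k(feats, y, 3)                        # includes the x0*x1 column
```

The expansion grows quadratically with the input dimension, which is exactly the cost the chapter's feature selection is meant to contain.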


Author(s):  
Olga RUZAKOVA

The article presents a methodological approach to assessing the investment attractiveness of an enterprise based on the mathematical apparatus of the Hopfield neural network. An extended set of evaluation parameters of the investment process has been compiled, and an algorithm for formalizing the decision-making process regarding the investment attractiveness of the enterprise has been developed on the basis of neural networks. The proposed approach takes into account constantly changing sets of quantitative and qualitative parameters and identifies the appropriate level of investment attractiveness of the enterprise with minimal expense of money and time, by selecting the stored Hopfield-network reference pattern most similar to the one characterizing the activity of the enterprise. The developed formalization of the investment process makes it possible to take investment decisions under incomplete and heterogeneous information, based on the methodological tools of neural networks.


2021 ◽  
Vol 15 ◽  
Author(s):  
Corentin Delacour ◽  
Aida Todri-Sanial

An Oscillatory Neural Network (ONN) is an emerging neuromorphic architecture in which oscillators represent neurons and information is encoded in the oscillators' phase relations. In an ONN, oscillators are coupled with electrical elements that define the network's weights and achieve massively parallel computation. As the weights determine the network's functionality, mapping weights to coupling elements plays a crucial role in ONN performance. In this work, we investigate relaxation oscillators based on VO2 material, and we propose a methodology to map Hebbian coefficients to ONN coupling resistances, enabling large-scale ONN design. We develop an analytical framework that maps weight coefficients to coupling resistor values in order to analyze ONN architecture performance. We report on an ONN with 60 fully connected oscillators that performs pattern recognition as a Hopfield Neural Network.
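A generic way to turn Hebbian weights into coupling resistances is to map weight magnitude to conductance and let the weight's sign choose in-phase versus anti-phase coupling. The sketch below uses a simple linear magnitude-to-conductance map with assumed resistance bounds r_min and r_max; it does not reproduce the paper's analytical, VO2-specific mapping.

```python
import numpy as np

def hebbian(patterns):
    """Hebbian coefficients for the stored phase patterns."""
    w = patterns.T @ patterns / patterns.shape[1]
    np.fill_diagonal(w, 0.0)
    return w

def weights_to_resistances(w, r_min=10e3, r_max=1e6):
    """Illustrative linear map |w| -> conductance; the weight's sign
    selects the coupling type (+1 in-phase, -1 anti-phase, 0 uncoupled)."""
    g_min, g_max = 1.0 / r_max, 1.0 / r_min
    mag = np.abs(w) / np.abs(w).max()        # normalize magnitudes to [0, 1]
    g = g_min + mag * (g_max - g_min)        # stronger weight -> higher conductance
    return 1.0 / g, np.sign(w)

pats = np.array([[1, -1, 1, -1],
                 [1, 1, -1, -1]])
r, coupling_sign = weights_to_resistances(hebbian(pats))
```

Zero-weight pairs land at r_max (weakest coupling) in this scheme; a hardware design might instead leave them physically unconnected.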


2019 ◽  
Vol 8 (2) ◽  
pp. 4928-4937 ◽  

Odia character and digit recognition are vital issues in computer vision today. This paper discusses a Hopfield neural network designed to solve printed Odia character recognition. Optical Character Recognition (OCR) converts images of handwritten, printed, or typewritten text into a machine-encoded text version. An Artificial Neural Network (ANN) was trained as a classifier based on the Hopfield network rule, using code written in MATLAB. Preprocessing of the data (image acquisition, binarization, skeletonization, skew detection and correction, image cropping, resizing, and digitalization) was also carried out in MATLAB. Segmentation, feature extraction, and classification, the well-known stages of Odia character recognition, are outlined with their relative strengths and weaknesses, so the paper should be useful to those interested in working in the field. Recognition of printed Odia characters, numerals, and machine characters finds valuable applications in banks, industries, and offices. In the proposed work we develop an efficient and robust mechanism in which Odia characters are recognized by Hopfield Neural Networks (HNNs).


Entropy ◽  
2019 ◽  
Vol 21 (8) ◽  
pp. 726 ◽  
Author(s):  
Giorgio Gosti ◽  
Viola Folli ◽  
Marco Leonetti ◽  
Giancarlo Ruocco

In a neural network, an autapse is a particular kind of synapse that links a neuron onto itself. Autapses are almost never allowed in either artificial or biological neural networks. Moreover, redundant or similar stored states tend to interact destructively. This paper shows how autapses, together with stable-state redundancy, can improve the storage capacity of a recurrent neural network. Recent research shows how, in an N-node Hopfield neural network with autapses, the number of stored patterns (P) is not limited to the well-known bound 0.14N, as it is for networks without autapses. More precisely, it describes how, as the number of stored patterns increases well beyond the 0.14N threshold, for P much greater than N, the retrieval error asymptotically approaches a value below unity. Consequently, the reduction of retrieval errors allows a number of stored memories that largely exceeds what was previously considered possible. Unfortunately, soon after, new results showed that, in the thermodynamic limit, given a network with autapses in this high-storage regime, the basin of attraction of the stored memories shrinks to a single state. This means that, for each stable state associated with a stored memory, even a single bit error in the initial pattern would lead the system to a stationary state associated with a different memory state, which limits the potential use of this kind of Hopfield network as an associative memory. This paper presents a strategy to overcome this limitation by improving the error-correcting characteristics of the Hopfield neural network. The proposed strategy allows us to form what we call an absorbing neighborhood of states surrounding each stored memory: a set defined by a Hamming distance around a network state, which is absorbing because, in the long-time limit, states inside it are absorbed by stable states in the set.
We show that this strategy allows the network to store an exponential number of memory patterns, each surrounded with an absorbing-neighborhood with an exponentially growing size.
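The distinction the paper builds on, autapses versus no autapses, corresponds to keeping or zeroing the diagonal of the weight matrix. A minimal sketch under the generic Hebbian rule (not the paper's high-storage construction):

```python
import numpy as np

def hebbian_weights(patterns, autapses=False):
    """Hebbian rule W = (1/N) * sum_p x_p x_p^T; autapses correspond
    to keeping the nonzero diagonal entries w_ii."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    if not autapses:
        np.fill_diagonal(w, 0.0)   # classical Hopfield: no self-coupling
    return w

pats = np.array([[1, -1, 1, -1],
                 [1, 1, -1, -1]])
w_no = hebbian_weights(pats)                 # diagonal is all zeros
w_auto = hebbian_weights(pats, autapses=True)  # diagonal is P/N = 2/4 = 0.5
```

For bipolar patterns each diagonal entry of the raw Hebbian matrix equals P/N, so autapses add a uniform self-excitation that stabilizes states, which is precisely what shrinks the basins of attraction in the high-storage regime discussed above.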



2012 ◽  
Vol 241-244 ◽  
pp. 1900-1903
Author(s):  
Na Wei ◽  
Zhe Cheng ◽  
Xiao Meng Wu

In accordance with the radial operating characteristic of distribution networks, an algorithm for distribution network reconfiguration based on a Hopfield neural network is put forward. The Hopfield neural network determines the in-degree of each node; from the in-degrees it is determined which lines are in service, and from the line states the state of each loop switch is determined, thus finally fixing the distribution network reconfiguration scheme. The energy function of the neural network and its solution method are presented; the energy function accounts for the radial operation of the distribution network, minimal distribution network loss, and the absence of loop switches in certain lines. On the IEEE distribution network with three power sources, the structure obtained by the algorithm is basically consistent with that obtained by a genetic algorithm, but the former requires less computation time.
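The guarantee that a Hopfield network settles into a local minimum, which such optimization formulations rely on, rests on its energy function: with symmetric weights and zero diagonal, each asynchronous update never increases the energy. The sketch below shows the generic form E = -1/2 s^T W s + theta^T s and checks this monotonicity; the paper's actual energy function additionally encodes the radiality, loss-minimization, and loop-switch constraints, which are not reproduced here.

```python
import numpy as np

def hopfield_energy(w, theta, s):
    """Generic Hopfield energy E = -1/2 s^T W s + theta^T s."""
    return -0.5 * s @ w @ s + theta @ s

def async_step(w, theta, s, i):
    """Update neuron i; for symmetric W with zero diagonal this
    never increases the energy."""
    s = s.copy()
    s[i] = 1 if w[i] @ s - theta[i] >= 0 else -1
    return s

rng = np.random.default_rng(0)
n = 8
a = rng.standard_normal((n, n))
w = (a + a.T) / 2                  # symmetric weights
np.fill_diagonal(w, 0.0)           # no self-coupling
theta = rng.standard_normal(n)
s = rng.choice([-1, 1], size=n)
s0 = s.copy()
for i in range(n):                 # one full asynchronous sweep
    s_new = async_step(w, theta, s, i)
    assert hopfield_energy(w, theta, s_new) <= hopfield_energy(w, theta, s) + 1e-12
    s = s_new
```

Encoding a constraint as a penalty term in E is what lets the network "solve" reconfiguration: feasible radial configurations become low-energy states.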

