Recent results on neural network architectures for vision and pattern recognition

2003 · Author(s): S. Grossberg

2000 · Vol 15 (2) · pp. 151-170 · Author(s): MIROSLAV KUBAT

An appropriately designed neural network architecture is essential to many realistic pattern-recognition tasks. Choosing just the right number of neurons, and the right interconnections between them, can cut learning costs by orders of magnitude while still guaranteeing high classification accuracy. Surprisingly, textbooks often neglect this issue. A specialist seeking systematic information will soon realize that the relevant material is scattered over diverse sources, each with its own perspective, terminology, and goals. This brief survey attempts to rectify the situation by explaining the aspects involved and by describing some of the fundamental techniques.
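The trade-off between network size, training cost, and accuracy that the survey refers to can be made concrete with a small experiment. The sketch below (Python with scikit-learn, not taken from the survey) trains multilayer perceptrons with three different hidden-layer sizes on the small digits dataset and reports fit time and test accuracy; the specific sizes and the dataset are arbitrary choices for illustration only.

```python
import time

from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Small handwritten-digit dataset (8x8 images) as a stand-in classification task.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Candidate hidden-layer sizes; the "right" width is task-dependent.
for hidden in [(4,), (32,), (512, 512)]:
    clf = MLPClassifier(hidden_layer_sizes=hidden, max_iter=500, random_state=0)
    start = time.perf_counter()
    clf.fit(X_train, y_train)
    elapsed = time.perf_counter() - start
    print(f"hidden={hidden}: fit {elapsed:.2f}s, "
          f"test accuracy {clf.score(X_test, y_test):.3f}")
```

On a toy task like this the accuracy differences are small, but the fit time already varies by roughly an order of magnitude across the three sizes, which is the kind of cost gap the survey has in mind.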


Author(s): О.А. Лукьянова · О.Ю. Никитин · А.С. Кунин

The paper presents the results of research on the automatic construction of neural network architectures composed of a set of modules. An algorithmic approach based on forming matrices of active modules is implemented, and methods for the procedural generation of neural network architectures for classification problems are proposed. The paper describes the automatic construction of the PathNet architecture on the basis of the new approaches and examines examples of generating three new deep neural network architectures (3DNN, GraphNet, and BraidNet). The BraidNet architecture builds the network's connection graph using braid theory. A study on the MNIST image classification task showed that all four proposed neural networks are applicable to pattern recognition.

There are various approaches to the algorithmic specification of network structure in deep learning that are used successfully in applications. These methods can be generalized under the concept of procedural generation of neural network architectures.

Methodology. The work uses binary matrix filters obtained with the help of the Hadamard product. Such filters define the active network modules and thereby change the way information is transmitted between layers. Braid theory is used to build the various architectures. The article reproduces the well-known PathNet architecture and examines examples of generating three new deep neural network architectures (3DNN, GraphNet, and BraidNet).

Findings. The paper shows how procedural generation of neural network architectures makes it possible to form the network structure automatically rather than setting it by hand. Matrix filters simplify the generation of architectures owing to the large number of possible combinations of modules and connections between them. Using the MNIST classification problem as an example, it is shown how the architectures presented in the article solve real-world pattern recognition problems. The results indicate a reduced tendency to overfit, owing to subsequent convergence and the presence of stochastic dynamics in the learning process.

Originality/value. Learning methods with dynamic, adaptive changes in the network architecture reach satisfactory accuracy faster and should also be less prone to overfitting. The BraidNet algorithm presented in the article also serves as a convenient, compact encoding of a neural network's structure for use in genetic algorithms. These features make BraidNet a promising algorithm for further application and research in complex pattern recognition problems, including neuroevolutionary approaches.
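As an illustration of the matrix-filter idea described above, the following Python/NumPy sketch gates a small modular network with a binary mask obtained as an element-wise (Hadamard) product of two binary matrices. It is only a minimal sketch under assumed dimensions: the layer and module counts, the make_module and forward helpers, and the random mask generation are hypothetical and do not reproduce the authors' PathNet/BraidNet implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: L layers, M candidate modules per layer (not taken from the paper).
L, M = 3, 4
IN_DIM, HIDDEN, OUT_DIM = 784, 32, 10   # e.g. flattened MNIST input, 10 classes

def make_module(in_dim, out_dim):
    """A 'module' here is just one small dense block (a weight matrix)."""
    return rng.normal(scale=0.1, size=(in_dim, out_dim))

modules = [[make_module(IN_DIM if l == 0 else HIDDEN, HIDDEN) for _ in range(M)]
           for l in range(L)]
readout = make_module(HIDDEN, OUT_DIM)

# Binary matrix filter: entry (l, m) == 1 means module m is active in layer l.
# Procedural generation is reduced here to random sampling; the paper derives
# such masks from Hadamard products of binary matrices and braid-theoretic rules.
base_mask = rng.integers(0, 2, size=(L, M))
path_mask = rng.integers(0, 2, size=(L, M))
active = base_mask * path_mask            # element-wise (Hadamard) product
active[active.sum(axis=1) == 0, 0] = 1    # keep at least one module per layer

def forward(x, active):
    """Average the outputs of the active modules in each layer, then apply a readout."""
    h = x
    for l in range(L):
        outs = [np.tanh(h @ modules[l][m]) for m in range(M) if active[l, m]]
        h = np.mean(outs, axis=0)
    return h @ readout

x = rng.normal(size=(8, IN_DIM))          # a dummy batch of 8 "images"
logits = forward(x, active)
print(active)                             # which modules the filter switched on
print(logits.shape)                       # (8, 10)
```

Because the mask is just an L-by-M binary matrix, it is cheap to mutate or recombine, which is what makes such filters a compact genotype for the genetic and neuroevolutionary search mentioned in the abstract.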

