Interrelation of Pattern Recognition, Machine Thinking and Learning Problems

2020 ◽  
Vol 52 (5) ◽  
pp. 51-78
Author(s):  
Vladimir I. Gritsenko ◽  
Mikhail I. Schlesinger
1993 ◽  
Vol 5 (5) ◽  
pp. 750-766 ◽  
Author(s):  
A. Norman Redlich

Factorial learning, finding a statistically independent representation of a sensory “image”—a factorial code—is applied here to solve multilayer supervised learning problems that have traditionally required backpropagation. This lends support to Barlow's argument for factorial sensory processing by demonstrating how it can solve actual pattern recognition problems. Two techniques for supervised factorial learning are explored, one of which gives a novel distributed solution requiring only positive examples. Also, a new nonlinear technique for factorial learning is introduced that uses neural networks based on almost reversible cellular automata. Due to the special functional connectivity of these networks—which resemble some biological microcircuits—learning requires only simple local algorithms. Finally, supervised factorial learning is shown to be a viable alternative to backpropagation. One significant advantage is the existence of a measure for the performance of intermediate learning stages.
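To make the notion of a factorial code concrete, here is a minimal sketch, not Redlich's algorithm, of Barlow's independence criterion: the redundancy of a binary code, i.e. the sum of the marginal entropies of its components minus their joint entropy, which vanishes exactly when the components are statistically independent. The function names and toy data are illustrative assumptions.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def redundancy(codes):
    """Sum of marginal entropies minus joint entropy of binary code rows.

    Near zero when the code components are statistically independent,
    i.e. when the code is (approximately) factorial.
    """
    n, d = codes.shape
    marg = 0.0
    for j in range(d):  # marginal entropy of each code component
        p1 = codes[:, j].mean()
        marg += entropy(np.array([p1, 1.0 - p1]))
    # joint entropy from empirical pattern frequencies
    _, counts = np.unique(codes, axis=0, return_counts=True)
    return marg - entropy(counts / n)

rng = np.random.default_rng(0)
# independent bits: redundancy near 0
indep = rng.integers(0, 2, size=(4096, 2))
# perfectly correlated bits: redundancy near 1 bit
dup = np.repeat(rng.integers(0, 2, size=(4096, 1)), 2, axis=1)
```

Here `redundancy(indep)` is close to 0 while `redundancy(dup)` is close to 1 bit, since a duplicated bit carries one bit of redundant structure.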


1993 ◽  
Vol 5 (6) ◽  
pp. 893-909 ◽  
Author(s):  
V. Vapnik ◽  
L. Bottou

In previous publications (Bottou and Vapnik 1992; Vapnik 1992) we described local learning algorithms, which result in performance improvements for real problems. We present here the theoretical framework on which these algorithms are based. First, we present a new formulation of certain learning problems, namely local risk minimization. We review the basic results of the uniform convergence theory of learning and extend these results to local risk minimization. We also extend the structural risk minimization principle to both pattern recognition and regression problems. This extended induction principle is the basis for a new class of algorithms.
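As a toy instance of the local learning idea, the sketch below minimizes risk only in a neighborhood of each query point by weighting training examples with a Gaussian locality function and fitting a simple affine model there. The kernel choice, width, and function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def local_fit_predict(X, y, x0, width=0.5):
    """Locally weighted least squares: fit an affine model near query x0.

    Training examples are weighted by a Gaussian locality function
    centred on x0 (an illustrative choice), so the empirical risk being
    minimized is a local one rather than the global average loss.
    """
    w = np.exp(-((X - x0) ** 2) / (2.0 * width ** 2))  # locality weights
    A = np.column_stack([np.ones_like(X), X])          # affine features
    W = np.diag(w)
    beta, *_ = np.linalg.lstsq(A.T @ W @ A, A.T @ W @ y, rcond=None)
    return beta[0] + beta[1] * x0

rng = np.random.default_rng(1)
X = np.linspace(-3.0, 3.0, 200)
y = np.sin(X) + 0.05 * rng.standard_normal(200)
pred = local_fit_predict(X, y, x0=1.0)
```

A globally fitted affine model cannot track the sine curve, but the locally weighted fit recovers a value close to `sin(1.0)` at the query point.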


2019 ◽  
Vol 8 (3) ◽  
pp. 1463-1465

Deep learning, which builds on the fundamentals of machine learning, has become popular over the years because of its rapid adaptability and its ability to handle complex problems. Before this breakthrough, traditional machine learning methods were used in image processing, pattern recognition, and analytics applications. With the advent of CNNs, it has become easier to tackle complex learning problems by exploiting the specificity and accuracy of CNN architectures and methodologies. This paper gives introductory insights into CNNs, including feed-forward propagation and backpropagation networks. It explains the steps a CNN follows to classify an input and generate a predefined output, and it traces the evolution of several image CNN architectures that find applications in multiple domains of computer science, such as image processing and segmentation, pattern recognition and predictive analytics, and text analytics.
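The forward-propagation steps a CNN applies to an input (convolution, nonlinearity, pooling) can be sketched in plain NumPy. This is a minimal illustration with made-up filter values, not a trained or production network.

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation: the core CNN forward-pass step."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    """Elementwise nonlinearity applied after convolution."""
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    """Non-overlapping max pooling, discarding trailing rows/columns."""
    h, w = x.shape
    return x[:h - h % size, :w - w % size].reshape(
        h // size, size, w // size, size).max(axis=(1, 3))

# toy 6x6 input with a vertical edge, and an edge-detecting filter
# (illustrative values only)
img = np.tile([0, 0, 0, 1, 1, 1], (6, 1)).astype(float)
edge = np.array([[-1.0, 1.0], [-1.0, 1.0]])
feat = max_pool(relu(conv2d(img, edge)))
```

The filter responds only where the image changes from 0 to 1, so the pooled feature map is nonzero exactly in the columns containing the edge.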


Author(s):  
G.Y. Fan ◽  
J.M. Cowley

In recent developments, the ASU HB5 has been modified so that the timing, positioning, and scanning of the finely focused electron probe can be entirely controlled by a host computer. This makes asynchronous handshaking possible between the HB5 STEM and the image processing system, which consists of a host computer (PDP 11/34), a DeAnza image processor (IP 5000) interfaced with a low-light-level TV camera, an array processor (AP 400), and various peripheral devices. This greatly facilitates the pattern recognition technique initiated by Monosmith and Cowley. Software called NANHB5 is under development which, instead of employing a set of photodiodes to detect strong spots on a TV screen, uses various software techniques, including on-line fast Fourier transforms (FFTs), to recognize patterns of greater complexity, taking advantage of the sophistication of our image processing system and the flexibility of computer software.
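The FFT-based pattern recognition described here can be illustrated with a small sketch: locating the strongest non-DC peak of a 2D power spectrum in a synthetic fringe image. The function and test data are illustrative stand-ins, not the NANHB5 software.

```python
import numpy as np

def dominant_frequency(signal_2d):
    """Locate the strongest non-DC peak in the 2D power spectrum.

    Sharp spectral peaks mark periodic structure in the image, the same
    cue the on-line FFT approach exploits to recognize patterns.
    """
    F = np.fft.fft2(signal_2d)
    power = np.abs(F) ** 2
    power[0, 0] = 0.0  # suppress the DC term
    return np.unravel_index(np.argmax(power), power.shape)

# synthetic fringe pattern: 8 cycles across a 64-pixel field
n = 64
x = np.arange(n)
fringes = np.cos(2.0 * np.pi * 8 * x / n)[None, :] * np.ones((n, 1))
peak = dominant_frequency(fringes)
```

Because the fringes run along one axis with 8 cycles, the peak appears at horizontal frequency index 8 (or its conjugate mirror at 56), with zero vertical frequency.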


Author(s):  
L. Fei ◽  
P. Fraundorf

Interface structure is of major interest in microscopy. With high-resolution transmission electron microscopes (TEMs) and scanning probe microscopes, it is possible to reveal the structure of interfaces unit cell by unit cell, in some cases with atomic resolution. A. Ourmazd et al. proposed quantifying such observations by using vector pattern recognition to map chemical composition changes across the interface in TEM images with unit-cell resolution. The sensitivity of the mapping process, however, is limited by the repeatability of unit-cell images of the perfect crystal, and hence by the amount of delocalized noise, e.g. due to ion milling or beam radiation damage. Bayesian removal of noise, based on statistical inference, can be used to reduce the amount of non-periodic noise in images after acquisition. The basic principle of Bayesian phase-model background subtraction, according to our previous study, is that the optimum (rms-error-minimizing) Fourier phases of the noise can be obtained provided the amplitudes of the noise are given, while the noise amplitude can often be estimated from the image itself.
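The amplitude-then-phase logic above can be illustrated with a loose sketch: estimate a noise amplitude from the image itself (here, crudely, as the median spectral amplitude, since a periodic lattice concentrates its power in sharp peaks well above that floor), subtract it from every Fourier amplitude, and keep the image's own phases. The estimator is an illustrative assumption, not the authors' Bayesian phase model.

```python
import numpy as np

def spectral_background_subtract(image):
    """Reduce non-periodic noise by shrinking Fourier amplitudes.

    The noise floor is estimated from the image itself as the median
    spectral amplitude (a crude stand-in for a proper noise model),
    subtracted from each amplitude, and the result is recombined with
    the image's original Fourier phases.
    """
    F = np.fft.fft2(image)
    amp, phase = np.abs(F), np.angle(F)
    noise_amp = np.median(amp)                 # crude noise-floor estimate
    amp_clean = np.maximum(amp - noise_amp, 0.0)
    return np.real(np.fft.ifft2(amp_clean * np.exp(1j * phase)))

rng = np.random.default_rng(2)
n = 64
x = np.arange(n)
# periodic "lattice" image plus non-periodic (white) noise
lattice = (np.cos(2.0 * np.pi * 4 * x[:, None] / n)
           + np.cos(2.0 * np.pi * 4 * x[None, :] / n))
noisy = lattice + 0.5 * rng.standard_normal((n, n))
cleaned = spectral_background_subtract(noisy)
```

The lattice peaks dwarf the noise floor, so subtracting the floor removes mostly noise: the cleaned image is closer to the noise-free lattice than the noisy input is.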


1989 ◽  
Vol 34 (11) ◽  
pp. 988-989
Author(s):  
Erwin M. Segal