discrete sequences
Recently Published Documents


TOTAL DOCUMENTS

52
(FIVE YEARS 12)

H-INDEX

9
(FIVE YEARS 1)

Author(s):  
N.A. Gupal

Introduction. Numberings (codings) of discrete sequences play a fundamental role in the theory of recognition and estimation. A coding assigns code numbers, or indices, to programs and computable functions. It has been established that there exist universal programs, i.e., programs that can simulate all other programs; this is one of the basic results of the theory. On the basis of a numbering of discrete sequences, Gödel proved his famous theorem on the incompleteness of arithmetic. Purpose of the article. To construct mutually one-to-one numberings, by natural numbers, of finite discrete sequences, programs, and computable functions. Results. Based on the numbering of finite discrete sequences, numberings are constructed for the four commands of the machine with unlimited registers (MUR), mapping them to natural numbers of the forms 4u, 4u+1, 4u+2, and 4u+3, respectively. Every program is a finite list of such commands, so the bijection for the four MUR commands yields a mutually one-to-one numbering of all MUR programs. Thus, given a program one can effectively compute its code number, and conversely, given a number one can effectively recover the program. Conclusions. Mutually one-to-one numberings by natural numbers are constructed for finite discrete sequences, MUR programs, and computable functions. Using the numbering of programs, it is established in the theory of computable functions that universal programs exist, i.e., programs that simulate all other programs. Applying computable functions and the s-m-n theorem yields operations on computable functions: the composition of φx and φy, giving the product φxφy; transformation of functions; and an effective recursion operation. In particular, an index of the function φxφy can be found effectively from the indices x and y [2]. Keywords: numbering, Gödel code number, diagonal method.
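
To make the coding concrete, here is a minimal sketch of one standard way to realize a bijection of this kind: the four MUR command types are sent to the residue classes 4u, 4u+1, 4u+2, 4u+3, with command arguments packed by the pairing function pair(m, n) = 2^m(2n+1) − 1. The pairing function, the argument conventions, and the command names (Z, S, T, J) are assumptions made for illustration; the article does not spell out its exact construction.

```python
def pair(m, n):
    """Assumed bijection N x N -> N: pair(m, n) = 2**m * (2*n + 1) - 1."""
    return 2**m * (2*n + 1) - 1

def unpair(u):
    """Inverse of pair(): strip the power of two, then recover n."""
    m, v = 0, u + 1
    while v % 2 == 0:
        v //= 2
        m += 1
    return m, (v - 1) // 2

def encode(cmd):
    """Map a MUR command to a unique natural number:
    Z(n) -> 4n, S(n) -> 4n+1, T(m,n) -> 4*pair(m,n)+2, J(m,n,q) -> 4*pair(m,pair(n,q))+3."""
    op, *args = cmd
    if op == 'Z':
        return 4 * args[0]
    if op == 'S':
        return 4 * args[0] + 1
    if op == 'T':
        return 4 * pair(args[0], args[1]) + 2
    if op == 'J':
        return 4 * pair(args[0], pair(args[1], args[2])) + 3
    raise ValueError(op)

def decode(u):
    """Recover the command from its code number (inverse of encode)."""
    q, r = divmod(u, 4)
    if r == 0:
        return ('Z', q)
    if r == 1:
        return ('S', q)
    if r == 2:
        m, n = unpair(q)
        return ('T', m, n)
    m, rest = unpair(q)
    n, j = unpair(rest)
    return ('J', m, n, j)

# Round trip: every command has exactly one code and every natural number
# decodes to exactly one command, so the coding is a bijection.
assert decode(encode(('J', 2, 0, 5))) == ('J', 2, 0, 5)
```

Because each step is invertible, a program (a finite list of commands) can in turn be coded by packing the list of its command codes, which is what allows one to pass effectively between a program and its code number.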


eLife ◽  
2021 ◽  
Vol 10 ◽  
Author(s):  
Ian Cone ◽  
Harel Z Shouval

Multiple brain regions are able to learn and express temporal sequences, and this functionality is an essential component of learning and memory. We propose a substrate for such representations via a network model that learns and recalls discrete sequences of variable order and duration. The model consists of a network of spiking neurons placed in a modular, microcolumn-based architecture. Learning is performed via a biophysically realistic learning rule that depends on synaptic ‘eligibility traces’. Before training, the network contains no memory of any particular sequence. After training, presentation of only the first element in that sequence is sufficient for the network to recall an entire learned representation of the sequence. An extended version of the model also demonstrates the ability to successfully learn and recall non-Markovian sequences. This model provides a possible framework for biologically plausible sequence learning and memory, in agreement with recent experimental results.


2020 ◽  
Author(s):  
I. Cone ◽  
H. Z. Shouval

The ability to express and learn temporal sequences is an essential part of learning and memory. Learned temporal sequences are expressed in multiple brain regions, and as such there may be a common design in the circuits that mediate them. This work proposes a substrate for such representations, via a biophysically realistic network model that can robustly learn and recall discrete sequences of variable order and duration. The model consists of a network of spiking leaky-integrate-and-fire model neurons placed in a modular architecture designed to resemble cortical microcolumns. Learning is performed via a learning rule with “eligibility traces”, which hold a history of synaptic activity before being converted into changes in synaptic strength upon neuromodulator activation. Before training, the network responds to incoming stimuli but contains no memory of any particular sequence. After training, presentation of only the first element in that sequence is sufficient for the network to recall an entire learned representation of the sequence. An extended version of the model also demonstrates the ability to successfully learn and recall non-Markovian sequences. This model provides a possible framework for biologically realistic sequence learning and memory, and is in agreement with recent experimental results, which have shown sequence-dependent plasticity in sensory cortex.
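
As a rough illustration of the kind of learning rule described above (a minimal sketch, not the authors' full spiking-network model), the loop below keeps a decaying eligibility trace of coincident pre- and postsynaptic activity and converts it into a weight change only when a delayed neuromodulatory signal arrives. All parameter names and values (tau_e, eta, the spike probabilities) are illustrative assumptions.

```python
import numpy as np

# Illustrative eligibility-trace plasticity loop (a sketch, not the paper's exact rule).
# The trace e keeps a decaying record of pre/post coincidences; the weight w changes
# only when a neuromodulatory signal R arrives and is gated by the stored trace.
rng = np.random.default_rng(0)
tau_e, eta, dt = 0.5, 0.05, 0.001    # trace time constant (s), learning rate, time step (illustrative)
e, w = 0.0, 0.2                      # eligibility trace and synaptic weight

for step in range(2000):
    pre = rng.random() < 0.02        # presynaptic spike this step (toy random activity)
    post = rng.random() < 0.02       # postsynaptic spike this step
    e -= (dt / tau_e) * e            # trace decays between events
    if pre and post:                 # coincident activity tags the synapse
        e += 1.0
    R = 1.0 if step == 1500 else 0.0 # delayed neuromodulator (e.g., reward) pulse
    w += eta * R * e                 # trace converted into a lasting weight change

print(f"final weight: {w:.4f}")
```

The point of the trace is the temporal gap it bridges: activity that happened well before the neuromodulatory pulse can still be credited, as long as its trace has not fully decayed.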


2020 ◽  
Vol 34 (04) ◽  
pp. 3757-3764
Author(s):  
Thomas Demeester

Established recurrent neural networks are well-suited to solve a wide variety of prediction tasks involving discrete sequences. However, they do not perform as well in the task of dynamical system identification, when dealing with observations from continuous variables that are unevenly sampled in time, for example due to missing observations. We show how such neural sequence models can be adapted to deal with variable step sizes in a natural way. In particular, we introduce a ‘time-aware’ and stationary extension of existing models (including the Gated Recurrent Unit) that allows them to deal with unevenly sampled system observations by adapting to the observation times, while facilitating higher-order temporal behavior. We discuss the properties and demonstrate the validity of the proposed approach, based on samples from two industrial input/output processes.
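
As a rough sketch of the general idea (not the authors' exact formulation), one simple way to make a recurrent cell "time-aware" is to let the hidden state decay according to the elapsed time between observations before applying the usual gated update. The decay rule, parameter names, and dimensions below are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TimeAwareGRUCell:
    """Illustrative GRU cell whose state decays with the elapsed gap dt between
    observations before the usual gated update (a sketch, not the paper's model)."""

    def __init__(self, n_in, n_hid, tau=1.0, seed=0):
        rng = np.random.default_rng(seed)
        s = 0.1
        self.Wz = rng.normal(0, s, (n_hid, n_in + n_hid)); self.bz = np.zeros(n_hid)
        self.Wr = rng.normal(0, s, (n_hid, n_in + n_hid)); self.br = np.zeros(n_hid)
        self.Wh = rng.normal(0, s, (n_hid, n_in + n_hid)); self.bh = np.zeros(n_hid)
        self.tau = tau                       # decay time constant (illustrative)

    def step(self, h, x, dt):
        h = h * np.exp(-dt / self.tau)       # decay the state over the irregular gap dt
        xh = np.concatenate([x, h])
        z = sigmoid(self.Wz @ xh + self.bz)  # update gate
        r = sigmoid(self.Wr @ xh + self.br)  # reset gate
        h_tilde = np.tanh(self.Wh @ np.concatenate([x, r * h]) + self.bh)
        return (1 - z) * h + z * h_tilde     # standard GRU blend

# Usage: irregularly sampled scalar observations, each with its time gap dt.
cell = TimeAwareGRUCell(n_in=1, n_hid=8)
h = np.zeros(8)
for x, dt in [(0.3, 0.1), (0.5, 0.7), (0.1, 0.05)]:
    h = cell.step(h, np.array([x]), dt)
print(h)
```

In this sketch only the state decay depends on the elapsed time; richer variants also condition the gates themselves on the observation times, which is closer to what is needed for higher-order temporal behavior.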


This article discusses techniques for hiding information messages in a cover image using direct-sequence spread spectrum technology. The approach relies on weakly correlated pseudorandom (noise) sequences: modulating the information data with such signals gives the message a noise-like form, which makes it very difficult to detect. Hiding amounts to adding the modulated message to the cover image. If the image is interpreted as noise on a communication channel, then hiding user data is equivalent to transmitting a noise-like modulated message over a noisy channel. It is also assumed that the noise-like signals are weakly correlated both with each other and with the cover image (or its fragment). However, the latter assumption may not hold, because a realistic image is not a realization of a random process; its pixels are strongly correlated. The selection of pseudorandom spreading signals must clearly take this feature into account. We investigate various ways of forming spreading sequences, assessing the bit error rate (BER) of the information data as well as the cover-image distortion via the mean squared error (MSE) and the peak signal-to-noise ratio (PSNR). The experimental results clearly confirm the advantage of using Walsh sequences, which yielded the lowest BER values in our study. Even at low spreading-signal power (P≈5), the BER in most cases did not exceed 0.01, the best result among all the sequences considered in this work. The PSNR values obtained with orthogonal Walsh sequences are, in most cases, comparable to the other options considered; however, for a fixed PSNR, the Walsh transform yields significantly lower BER. A promising direction is the use of adaptively generated discrete sequences: for example, if the rule for generating the spreading signals takes the statistical properties of the container into account, the BER can be reduced significantly, or the PSNR can be increased at a fixed (given) BER. The purpose of this work is to justify the choice of spreading sequences that reduce BER and MSE (i.e., increase PSNR).
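
As an illustration of the general direct-spreading scheme described above (a minimal sketch, not the article's exact algorithm or parameters), the snippet below embeds a few bits in an image block by adding Walsh (Hadamard) rows scaled by a gain P, and recovers them blindly by correlating the stego block with the same rows. The block size, gain, and demo data are assumptions chosen so the toy example works.

```python
import numpy as np

def walsh_matrix(n):
    """Sylvester construction of an n x n Hadamard/Walsh matrix (n a power of 2)."""
    H = np.array([[1]])
    while H.shape[0] < n:
        H = np.block([[H, H], [H, -H]])
    return H

def embed(cover_block, bits, P=8):
    """Hide bits by adding P * (+/-1) * (a Walsh row) to the flattened cover block."""
    W = walsh_matrix(cover_block.size)
    stego = cover_block.astype(float).ravel()
    for i, b in enumerate(bits):
        stego = stego + P * (1 if b else -1) * W[i + 1]  # skip row 0 (all ones, pure DC)
    return stego.reshape(cover_block.shape)

def extract(stego_block, n_bits):
    """Blind extraction: correlate the stego block with the same Walsh rows."""
    W = walsh_matrix(stego_block.size)
    flat = stego_block.ravel()
    return [int(flat @ W[i + 1] > 0) for i in range(n_bits)]

# Toy demo on a random 32x32 "image block" with pixel values 0..255.
rng = np.random.default_rng(1)
block = rng.integers(0, 256, (32, 32))
bits = [1, 0, 1, 1]
stego = embed(block, bits, P=8)
print(extract(stego, len(bits)))   # should recover [1, 0, 1, 1]; detection is statistical
mse = np.mean((stego - block) ** 2)
print(f"MSE = {mse:.1f}, PSNR = {10 * np.log10(255**2 / mse):.1f} dB")
```

Raising P (or spreading each bit over more pixels) lowers the BER but also lowers the PSNR, which is exactly the trade-off the article evaluates for the different families of spreading sequences.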


2019 ◽  
Vol 53 (5) ◽  
pp. 3787-3812 ◽  
Author(s):  
Rémi Domingues ◽  
Pietro Michiardi ◽  
Jérémie Barlet ◽  
Maurizio Filippone
