Nondeterministic Ordered Restarting Automata

2018 ◽ Vol 29 (04) ◽ pp. 663-685
Author(s): Kent Kwee, Friedrich Otto

While (stateless) deterministic ordered restarting automata accept exactly the regular languages, it has been observed that nondeterministic ordered restarting automata (ORWW-automata for short) are more expressive. Here we show that the class of languages accepted by the latter automata is an abstract family of languages that is incomparable to the linear, the context-free, and the growing context-sensitive languages with respect to inclusion, and that the emptiness problem is decidable for these languages. In addition, we give a construction that turns a stateless ORWW-automaton into a nondeterministic finite-state acceptor for the same language.

2003 ◽ Vol 14 (06) ◽ pp. 1007-1018
Author(s): CEZAR CÂMPEANU, KAI SALOMAA, SHENG YU

Regular expressions are used in many practical applications. Practical regular expressions are commonly called "regex", and it is known that regex differ from regular expressions. In this paper, we give regex a formal treatment. We make a distinction between regex and extended regex: while regex represent regular languages, extended regex represent a family of languages larger than the regular languages. We prove a pumping lemma for the languages expressed by extended regex. We show that the languages represented by extended regex are incomparable with the context-free languages and form a proper subset of the context-sensitive languages. Other properties of the languages represented by extended regex are also studied.
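
A backreference, the feature that separates extended regex from classical regular expressions, can force two substrings to be identical and thereby push the matched language beyond the context-free class. A minimal sketch in Python (the pattern and test strings are our own illustration, not taken from the paper):

```python
import re

# An extended regex with a backreference: group "w" captures a nonempty
# word over {a, b}, and (?P=w) demands an exact repeat. The full-match
# language is the copy language { ww : w in {a,b}+ }, which is not
# context-free.
copy_pattern = re.compile(r"(?P<w>[ab]+)(?P=w)")

print(bool(copy_pattern.fullmatch("abab")))    # True:  w = "ab"
print(bool(copy_pattern.fullmatch("aabaab")))  # True:  w = "aab"
print(bool(copy_pattern.fullmatch("abba")))    # False: no split into ww
```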


Author(s): Holger Bock Axelsen, Martin Kutrib, Andreas Malcher, Matthias Wendlandt

It is well known that reversible finite automata do not accept all regular languages, that reversible pushdown automata do not accept all deterministic context-free languages, and that reversible queue automata are less powerful than deterministic real-time queue automata. Closing these gaps is of significant interest from both a practical and a theoretical point of view. Here we extend these reversible models with a preprocessing unit that is essentially a reversible, injective, and length-preserving finite-state transducer. It turns out that preprocessing the input with such a weak device increases the computational power of reversible deterministic finite automata to the acceptance of all regular languages, whereas for reversible pushdown automata the accepted family of languages lies strictly between the reversible deterministic context-free languages and the real-time deterministic context-free languages. For reversible queue automata, preprocessing the input leads to machines that are stronger than real-time reversible queue automata but less powerful than real-time deterministic (irreversible) queue automata. Moreover, it is shown that the computational power of all three types of machines is unchanged when the preprocessing finite-state transducer is allowed to work irreversibly. Finally, we examine the closure properties of the families of languages accepted by such machines.
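
To make the preprocessing step concrete, here is a toy sketch (our own, not the paper's construction) of an injective, length-preserving finite-state transducer composed with a downstream acceptor:

```python
# Toy length-preserving finite-state transducer: it annotates each input
# symbol with the parity of its position. The transduction is injective,
# since dropping the annotations recovers the input, and it emits exactly
# one output symbol per input symbol.
def preprocess(word):
    out, state = [], 0            # the single bit of state is the parity
    for sym in word:
        out.append((sym, state))
        state = 1 - state
    return out

# A downstream acceptor running on the annotated word, e.g. membership
# in (ab)*: 'a' exactly on even positions, 'b' exactly on odd ones.
def accepts(annotated):
    return all((sym == "a") == (parity == 0) for sym, parity in annotated)

print(accepts(preprocess("abab")))  # True
print(accepts(preprocess("aabb")))  # False
```

The example language is regular either way; the sketch only illustrates the mechanics of feeding transducer output into an acceptor.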


2013 ◽ Vol 24 (06) ◽ pp. 747-763
Author(s): STEFANO CRESPI REGHIZZI, PIERLUIGI SAN PIETRO

A recently introduced language-definition device, termed consensual, is based on agreement between similar words. Over a bipartite alphabet made of pairs of unmarked/marked letters, a match relation specifies when such words agree. A set (the “base”) over the bipartite alphabet thus consensually specifies another language: it includes every terminal word for which a set of corresponding matching words lies in the base. We show that all and only the regular languages are consensually generated by a strictly locally testable base; the result is based on a generalization of Medvedev's homomorphic characterization of regular languages. The consensually context-free languages strictly include the base family. The consensual and base families coincide if the base is context-sensitive.
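
A sketch of one simple reading of the match relation (our own encoding, with marked letters rendered as uppercase and unmarked as lowercase; an illustration, not the paper's formal definition):

```python
# Simplified sketch of the consensual match relation. A set of equal-length
# words over the bipartite alphabet matches when, at every position, all
# words share the same base letter and exactly one word carries the marked
# (here: uppercase) copy; the terminal word is the sequence of base letters.
def consensual_match(words):
    if len({len(w) for w in words}) != 1:
        return None                     # words must have equal length
    terminal = []
    for column in zip(*words):
        bases = {c.lower() for c in column}
        marks = sum(c.isupper() for c in column)
        if len(bases) != 1 or marks != 1:
            return None                 # disagreement at this position
        terminal.append(bases.pop())
    return "".join(terminal)

print(consensual_match(["Aab", "aAb", "aaB"]))  # "aab": one mark per column
print(consensual_match(["Aab", "Aab", "aaB"]))  # None: column 0 has two marks
```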


2012 ◽ Vol 367 (1598) ◽ pp. 1956-1970
Author(s): Gerhard Jäger, James Rogers

The first part of this article gives a brief overview of the four levels of the Chomsky hierarchy, with a special emphasis on context-free and regular languages. It then recapitulates the arguments why neither regular nor context-free grammars are sufficiently expressive to capture all phenomena of natural language syntax. The second part reviews two refinements of the Chomsky hierarchy, both relevant to current research in cognitive science: the mildly context-sensitive languages (located between the context-free and the context-sensitive languages) and the sub-regular hierarchy (which distinguishes several levels of complexity within the class of regular languages).
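
To illustrate one of the lowest levels of the sub-regular hierarchy: a strictly 2-local language is fixed entirely by a set of permitted adjacent symbol pairs, with '#' marking the word boundaries. A hedged sketch (the example language is our own):

```python
# Sketch of a strictly 2-local (SL-2) recognizer: a word is accepted iff
# every adjacent pair of symbols, including the boundary markers '#',
# belongs to a fixed permitted set of bigrams.
def sl2_accepts(word, permitted_bigrams):
    padded = "#" + word + "#"
    return all(padded[i:i + 2] in permitted_bigrams
               for i in range(len(padded) - 1))

# Example: the words of (ab)+ are fixed by exactly these bigrams.
bigrams = {"#a", "ab", "ba", "b#"}
print(sl2_accepts("abab", bigrams))  # True
print(sl2_accepts("abba", bigrams))  # False: "bb" is not permitted
```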


2021 ◽ Vol 9 ◽ pp. 528-537
Author(s): Andrew Lamont

Phonological generalizations are finite-state. While Optimality Theory is a popular framework for modeling phonology, it is known to generate non-finite-state mappings and languages. This paper demonstrates that Optimality Theory is capable of generating non-context-free languages, contributing to the characterization of its generative capacity. This is achieved with minimal modification to the theory as it is standardly employed.


2005 ◽ Vol 16 (04) ◽ pp. 645-662
Author(s): JÜRGEN DASSOW, MARKUS HOLZER

We formalize the hairpin inverted repeat excision, known from ciliate genetics, as an operation on words and languages by defining HI_P(w) as the set of all words xαy^Rα^Rz where w = xαyα^Rz and the pointer α is in P. We extend this concept to language families, which results in the families HI(F_1, F_2). For F_1 and F_2 ranging over the families of finite, regular, context-free, context-sensitive, and recursively enumerable languages, we determine the hierarchy of the families HI(F_1, F_2) and compare these families with those of the Chomsky hierarchy. Furthermore, we present the decidability status of the membership, emptiness, and finiteness problems for the families HI(F_1, F_2).
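
Read operationally, the definition can be sketched as follows (HI_P is our stand-in name for the operator whose typeset symbol was lost in extraction; the example word is invented):

```python
# Sketch of HI_P following the definition above: for every factorization
# w = x·α·y·α^R·z with pointer α in P, the result set contains
# x·α·y^R·α^R·z, i.e. the segment between the pointer and its reverse
# is itself reversed.
def hairpin_hi(w, pointers):
    results = set()
    for alpha in pointers:
        rev_alpha = alpha[::-1]
        for i in range(len(w) - len(alpha) + 1):
            if w[i:i + len(alpha)] != alpha:
                continue
            x, rest = w[:i], w[i + len(alpha):]
            for j in range(len(rest) - len(rev_alpha) + 1):
                if rest[j:j + len(rev_alpha)] == rev_alpha:
                    y, z = rest[:j], rest[j + len(rev_alpha):]
                    results.add(x + alpha + y[::-1] + rev_alpha + z)
    return results

# {'cabaatbacg'}: the loop y = 'taa' is reversed between 'ab' and 'ba'.
print(hairpin_hi("cabtaabacg", {"ab"}))
```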


2001 ◽ Vol 13 (9) ◽ pp. 2093-2118
Author(s): Paul Rodriguez

It has been shown that if a recurrent neural network (RNN) learns to process a regular language, one can extract a finite-state machine (FSM) by treating regions of phase-space as FSM states. However, it has also been shown that one can construct an RNN to implement Turing machines by using RNN dynamics as counters. But how does a network learn languages that require counting? Rodriguez, Wiles, and Elman (1999) showed that a simple recurrent network (SRN) can learn to process a simple context-free language (CFL) by counting up and down. This article extends that work to a range of language tasks in which an SRN develops solutions that not only count but also copy and store counting information. In one case, the network stores information much as an explicit storage mechanism would. In other cases, the network stores information more indirectly, in trajectories that are sensitive to slight, context-dependent displacements. In this sense, an SRN can learn analog computation as a set of interdependent counters. This demonstrates how SRNs may serve as an alternative psychological model of language or sequence processing.
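
The counting strategy is easy to state symbolically. A minimal, non-neural sketch of the counter that such a network approximates for the context-free language a^n b^n (our illustration, not the SRN itself):

```python
# Symbolic sketch of the counting strategy described above: the dynamics
# an SRN approximates for a^n b^n amount to counting up on 'a' and down
# on 'b', accepting when the count returns exactly to zero.
def accepts_anbn(word):
    count, seen_b = 0, False
    for sym in word:
        if sym == "a":
            if seen_b:            # no 'a' may follow a 'b'
                return False
            count += 1
        elif sym == "b":
            seen_b = True
            count -= 1
            if count < 0:         # more b's than a's so far
                return False
        else:
            return False
    return count == 0 and seen_b  # equal counts, nonempty word

print(accepts_anbn("aaabbb"))  # True
print(accepts_anbn("aabbb"))   # False
```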


1996 ◽ Vol 2 (4) ◽ pp. 287-290
Author(s): ANDRÁS KORNAI

In spite of the wide availability of more powerful (context-free, mildly context-sensitive, and even Turing-equivalent) formalisms, the bulk of applied work on language and sublanguage modeling, especially for the purposes of recognition and topic search, is still performed with various finite-state methods. In fact, the use of such methods in research labs as well as in applied work has actually increased in the past five years. To bring together those developing and applying extended finite-state methods to text analysis, speech/OCR language modeling, and related CL and NLP tasks with those in AI and CS interested in analyzing and possibly extending the domain of finite-state algorithms, a workshop was held in August 1996 in Budapest as part of the European Conference on Artificial Intelligence (ECAI'96).


2002 ◽ Vol 14 (9) ◽ pp. 2039-2041
Author(s): J. Schmidhuber, F. Gers, D. Eck

In response to Rodriguez's recent article (2001), we compare the performance of simple recurrent nets and long short-term memory recurrent nets on context-free and context-sensitive languages.

