A finite state probabilistic automaton that accepts a context-sensitive language that is not context-free

1977 ◽  
Vol 35 (3) ◽  
pp. 196-208 ◽  
Author(s):  
Kuo An Chen ◽  
Ming-Kuei Hu

2021 ◽  
Vol 9 ◽  
pp. 528-537
Author(s):  
Andrew Lamont

Abstract Phonological generalizations are finite-state. While Optimality Theory is a popular framework for modeling phonology, it is known to generate non-finite-state mappings and languages. This paper demonstrates that Optimality Theory is capable of generating non-context-free languages, contributing to the characterization of its generative capacity. This is achieved with minimal modification to the theory as it is standardly employed.
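As a concrete point of reference (an illustration added here, not taken from the paper), the textbook example of a context-sensitive language that is not context-free is {aⁿbⁿcⁿ : n ≥ 1}. A minimal membership check:

```python
def in_anbncn(s: str) -> bool:
    """Check membership in {a^n b^n c^n : n >= 1}, the classic
    context-sensitive language that is not context-free."""
    n = len(s) // 3
    if n == 0 or len(s) != 3 * n:
        return False
    # The string must be exactly n a's, then n b's, then n c's.
    return s == "a" * n + "b" * n + "c" * n
```

A pushdown automaton can match two of the three blocks against each other but not all three, which is why this language falls outside the context-free class.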


2018 ◽  
Vol 29 (04) ◽  
pp. 663-685 ◽  
Author(s):  
Kent Kwee ◽  
Friedrich Otto

While (stateless) deterministic ordered restarting automata accept exactly the regular languages, it has been observed that nondeterministic ordered restarting automata (ORWW-automata for short) are more expressive. Here we show that the class of languages accepted by the latter automata is an abstract family of languages that is incomparable to the linear, the context-free, and the growing context-sensitive languages with respect to inclusion, and that the emptiness problem is decidable for these languages. In addition, we give a construction that turns a stateless ORWW-automaton into a nondeterministic finite-state acceptor for the same language.


2001 ◽  
Vol 13 (9) ◽  
pp. 2093-2118 ◽  
Author(s):  
Paul Rodriguez

It has been shown that if a recurrent neural network (RNN) learns to process a regular language, one can extract a finite-state machine (FSM) by treating regions of phase-space as FSM states. However, it has also been shown that one can construct an RNN to implement Turing machines by using RNN dynamics as counters. But how does a network learn languages that require counting? Rodriguez, Wiles, and Elman (1999) showed that a simple recurrent network (SRN) can learn to process a simple context-free language (CFL) by counting up and down. This article extends that work, showing a range of language tasks in which an SRN develops solutions that not only count but also copy and store counting information. In one case, the network stores information like an explicit storage mechanism. In other cases, the network stores information more indirectly, in trajectories that are sensitive to slight displacements that depend on context. In this sense, an SRN can learn analog computation as a set of interdependent counters. This demonstrates how SRNs may serve as an alternative psychological model of language or sequence processing.
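The count-up/count-down strategy the SRN discovers is the analog counterpart of a discrete one-counter recognizer. As an illustrative sketch (not the paper's network), the CFL {aⁿbⁿ : n ≥ 1} can be recognized with a single counter:

```python
def accepts_anbn(s: str) -> bool:
    """Recognize {a^n b^n : n >= 1} with a single counter,
    mirroring the count-up/count-down dynamics described above."""
    count = 0
    seen_b = False
    for ch in s:
        if ch == "a":
            if seen_b:          # no 'a' may follow a 'b'
                return False
            count += 1          # count up on each 'a'
        elif ch == "b":
            seen_b = True
            count -= 1          # count down on each 'b'
            if count < 0:       # more b's than a's so far
                return False
        else:
            return False
    return seen_b and count == 0
```

The SRN's learned solution behaves like this counter, except that the count is held in continuous hidden-state dynamics rather than an integer variable.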


1996 ◽  
Vol 2 (4) ◽  
pp. 287-290 ◽  
Author(s):  
ANDRÁS KORNAI

In spite of the wide availability of more powerful (context-free, mildly context-sensitive, and even Turing-equivalent) formalisms, the bulk of applied work on language and sublanguage modeling, especially for recognition and topic search, is still performed with various finite-state methods. Indeed, the use of such methods in research labs as well as in applied work has increased over the past five years. To bring together those applying extended finite-state methods to text analysis, speech/OCR language modeling, and related CL and NLP tasks with those in AI and CS interested in analyzing and possibly extending the domain of finite-state algorithms, a workshop was held in August 1996 in Budapest as part of the European Conference on Artificial Intelligence (ECAI'96).


2002 ◽  
Vol 14 (9) ◽  
pp. 2039-2041 ◽  
Author(s):  
J. Schmidhuber ◽  
F. Gers ◽  
D. Eck

In response to Rodriguez's recent article (2001), we compare the performance of simple recurrent nets and long short-term memory recurrent nets on context-free and context-sensitive languages.


1996 ◽  
Vol 25 (4) ◽  
pp. 587-612 ◽  
Author(s):  
Maria M. Egbert

ABSTRACT: Just as turn-taking has been found to be both context-free and context-sensitive (Sacks, Schegloff & Jefferson 1974), the organization of repair is also shown here to be both context-free and context-sensitive. In a comparison of American and German conversation, repair can be shown to be context-free in that, basically, the same mechanism can be found across these two languages. However, repair is also sensitive to the linguistic inventory of a given language; in German, morphological marking, syntactic constraints, and grammatical congruity across turns are used as interactional resources. In addition, repair is sensitive to certain characteristics of social situations. The selection of a particular repair initiator, German bitte? 'pardon?', indexes that there is no mutual gaze between interlocutors; i.e., there is no common course of action. The selection of bitte? not only initiates repair; it also spurs establishment of mutual gaze, and thus displays that there is attention to a common focus. (Conversation analysis, context, cross-linguistic analysis, repair, gaze, telephone conversation, co-present interaction, grammar and interaction)


2020 ◽  
Vol 30 (02) ◽  
pp. 339-378
Author(s):  
Jared Adams ◽  
Eric M. Freden

Denote the Baumslag–Solitar family of groups as [Formula: see text]. When [Formula: see text] we study the Bass–Serre tree [Formula: see text] for [Formula: see text] as a geometric object. We suggest that the irregularity of [Formula: see text] is the principal obstruction for computing the growth series for the group. In the particular case [Formula: see text] we exhibit a set [Formula: see text] of normal form words having minimal length for [Formula: see text] and use it to derive various counting algorithms. The language [Formula: see text] is context-sensitive but not context-free. The tree [Formula: see text] has a self-similar structure and contains infinitely many cone types. All cones have the same asymptotic growth rate as [Formula: see text] itself. We derive bounds for this growth rate, the lower bound also being a bound on the growth rate of [Formula: see text].


2011 ◽  
Vol 14 ◽  
pp. 34-71 ◽  
Author(s):  
Eric M. Freden ◽  
Teresa Knudson ◽  
Jennifer Schofield

Abstract: The computation of growth series for the higher Baumslag–Solitar groups is an open problem first posed by de la Harpe and Grigorchuk. We study the growth of the horocyclic subgroup as the key to the overall growth of these Baumslag–Solitar groups BS(p,q), where 1<p<q. In fact, the overall growth series can be represented as a modified convolution product with one of the factors being based on the series for the horocyclic subgroup. We exhibit two distinct algorithms that compute the growth of the horocyclic subgroup and discuss the time and space complexity of these algorithms. We show that when p divides q, the horocyclic subgroup has a geodesic combing whose words form a context-free (in fact, one-counter) language. A theorem of Chomsky–Schützenberger allows us to compute the growth series for this subgroup, which is rational. When p does not divide q, we show that no geodesic combing for the horocyclic subgroup forms a context-free language, although there is a context-sensitive geodesic combing. We exhibit a specific linearly bounded Turing machine that accepts this language (with quadratic time complexity) in the case of BS(2,3) and outline the Turing machine construction in the general case.
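To make the "one-counter" notion concrete: a one-counter automaton is a finite-state machine augmented with a single non-negative counter it can increment, decrement, and test for zero. The generic simulator below is an illustrative sketch only (it is not the paper's combing construction), shown recognizing the standard one-counter example {aⁿbⁿ : n ≥ 1}:

```python
def run_one_counter(transitions, start, accept, word):
    """Simulate a one-counter automaton.

    transitions maps (state, symbol, counter_is_zero) to
    (next_state, counter_delta); the word is accepted if the run
    ends in an accepting state with the counter back at zero.
    """
    state, counter = start, 0
    for ch in word:
        key = (state, ch, counter == 0)
        if key not in transitions:
            return False        # no applicable move: reject
        state, delta = transitions[key]
        counter += delta
        if counter < 0:         # counter must stay non-negative
            return False
    return state in accept and counter == 0

# Example machine for {a^n b^n : n >= 1}: count up in state "p"
# on a's, count down in state "q" on b's.
T = {
    ("p", "a", True):  ("p", +1),
    ("p", "a", False): ("p", +1),
    ("p", "b", False): ("q", -1),
    ("q", "b", False): ("q", -1),
}
```

The Chomsky–Schützenberger-style counting argument mentioned above applies to languages of exactly this shape, which is what makes the resulting growth series rational.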

