Recursive Structures: Recently Published Documents

TOTAL DOCUMENTS: 102 (five years: 14)
H-INDEX: 13 (five years: 1)

Author(s):  
Tanmai Khanna ◽  
Jonathan N. Washington ◽  
Francis M. Tyers ◽  
Sevilay Bayatlı ◽  
Daniel G. Swanson ◽  
...  

Abstract: This paper presents an overview of Apertium, a free and open-source rule-based machine translation platform. Translation in Apertium happens through a pipeline of modular tools, and the platform continues to be improved as more language pairs are added. Several advances have been implemented since the last publication, including some new optional modules: a module that allows rules to process recursive structures at the structural transfer stage, a module that deals with contiguous and discontiguous multi-word expressions, and a module that resolves anaphora to aid translation. Also highlighted is the hybridisation of Apertium through statistical modules that augment the pipeline, and statistical methods that augment existing modules. This includes morphological disambiguation, weighted structural transfer, and lexical selection modules that learn from limited data. The paper also discusses how a platform like Apertium can be a critical part of access to language technology for so-called low-resource languages, which might be ignored or deemed unapproachable by popular corpus-based translation technologies. Finally, the paper presents some of the released and unreleased language pairs, concluding with a brief look at some supplementary Apertium tools that prove valuable to users as well as language developers. All Apertium-related code, including language data, is free/open-source and available at https://github.com/apertium.
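The modular pipeline described in the abstract can be illustrated in miniature. The stage names below follow the general analysis-transfer-generation design of rule-based systems like Apertium, but the lexicon, transfer rule, and data are hypothetical toy examples, not Apertium's actual resources or APIs.

```python
# A miniature rule-based MT pipeline in the spirit of Apertium's
# modular design. All lexical data below is hypothetical toy data.

# Morphological analysis: surface form -> (lemma, tag)
ANALYSES = {
    "red": ("red", "adj"),
    "house": ("house", "n"),
}

# Bilingual lexicon: source lemma -> target lemma (English -> Spanish)
BIDIX = {"red": "rojo", "house": "casa"}

# Generation: (lemma, tag) -> target surface form
FORMS = {("rojo", "adj"): "roja", ("casa", "n"): "casa"}

def analyse(tokens):
    """Morphological analysis stage."""
    return [ANALYSES[t] for t in tokens]

def transfer(units):
    """Structural transfer stage: reorder adj + noun into noun + adj
    and translate lemmas through the bilingual dictionary."""
    out = []
    i = 0
    while i < len(units):
        if (i + 1 < len(units)
                and units[i][1] == "adj" and units[i + 1][1] == "n"):
            adj, noun = units[i][0], units[i + 1][0]
            out.append((BIDIX[noun], "n"))
            out.append((BIDIX[adj], "adj"))
            i += 2
        else:
            out.append((BIDIX[units[i][0]], units[i][1]))
            i += 1
    return out

def generate(units):
    """Morphological generation stage."""
    return " ".join(FORMS[u] for u in units)

print(generate(transfer(analyse(["red", "house"]))))  # casa roja
```

Each stage consumes the previous stage's output, which is what makes the modules independently replaceable, the property the abstract highlights when describing how statistical modules can augment or replace rule-based ones in the pipeline.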


2021 ◽  
Vol 11 (5) ◽  
pp. 62
Author(s):  
Bing Bai ◽  
Xin Dong ◽  
Tyler Poisson ◽  
Caimei Yang

The recursive computational mechanism of language generates an infinite range of expressions. However, little is known about how different concepts interact with each other within recursive structures. The current study investigated how Mandarin-speaking children dealt with possessives and generics in recursive structures. A picture-matching task showed that Mandarin-speaking children aged 4 to 6 had a bias for generics in ambiguous possessive constructions in Mandarin, where the genitive marker is covert (e.g., Yuehan de baobao chuang 'John's kid bed', where baobao chuang 'kid bed' has both a generic interpretation and a referential interpretation). It was found that Mandarin-speaking children below 6 had a non-recursive interpretation of the possessive John's kid('s) bed, instead understanding kid's bed to refer generically to a type of bed. This finding suggests that semantics does not parallel syntax in the acquisition of indirect recursion, in line with the prediction of the generic-as-default hypothesis, which claims that generics are the default mode of representation for statements that are ambiguous between a generic and a non-generic reading. The delayed recursive possessive interpretation suggests that the full determiner phrase is acquired later than noun phrase modification, which is universal across languages. We also discuss the role of the overt functional category in the acquisition of indirect recursion.


2021 ◽  
Author(s):  
Daoxin Li ◽  
Kathryn Schuler

Languages differ regarding the depth, structure, and syntactic domains of recursive structures. Even within a single language, some structures allow infinite self-embedding while others are more restricted. For example, English allows infinite free embedding of the prenominal genitive -s, whereas the postnominal genitive of is largely restricted to one level and to a limited set of items. Therefore, while the ability for recursion is considered a crucial part of the language faculty, speakers need to learn from experience which specific structures allow free embedding and which do not. One effort to account for the mechanism underlying this learning process, the distributional learning proposal, suggests that the recursion of a structure (e.g. X1's-X2) is licensed if the X1 position and the X2 position are productively substitutable in the input. A series of corpus studies has confirmed the availability of such distributional cues in child-directed speech. The present study further tests the distributional learning proposal with an artificial language learning experiment. We found that, as predicted, participants exposed to productive input were more likely to accept unattested strings at both one- and two-embedding levels than participants exposed to unproductive input. Our results therefore suggest that speakers can indeed use distributional information at one level to learn whether or not a structure is freely recursive.
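The distributional cue at the heart of this proposal, that the X1 and X2 slots of a frame like X1's-X2 are "productively substitutable", can be sketched as a simple overlap measure over slot fillers. The corpus pairs and the overlap measure below are hypothetical illustrations, not the authors' stimuli or metric.

```python
# Toy sketch of the distributional cue: a frame X1's-X2 counts as
# productively substitutable if the sets of nouns filling the two
# slots largely overlap. Data and measure are hypothetical.

def substitutability(pairs):
    """pairs: list of (X1, X2) tokens observed in the X1's-X2 frame.
    Returns the Jaccard overlap between the two slot vocabularies."""
    x1 = {a for a, _ in pairs}
    x2 = {b for _, b in pairs}
    return len(x1 & x2) / len(x1 | x2)

# Productive input: the same nouns appear in both slots,
# licensing recursion (mom's friend's dog's ...).
productive = [("mom", "friend"), ("friend", "dog"),
              ("dog", "mom"), ("teacher", "friend")]

# Unproductive input: X2 is a fixed item, so slot fillers never overlap.
unproductive = [("mom", "hat"), ("dad", "hat"), ("teacher", "hat")]

print(substitutability(productive))    # 0.75
print(substitutability(unproductive))  # 0.0
```

On this sketch, a learner tracking slot overlap would generalize to unattested two-level embeddings only from the productive input, which is the pattern the experiment reports.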


2021 ◽  
pp. 103-124
Author(s):  
Kevin Ohi

The openings of Dickens’s novels schematically isolate the elements that will form the texts: beginning with the “two figures” of the novel’s opening, this chapter examines the positing of character in Our Mutual Friend. It traces the novel’s fascination with proto-, incomplete, or newly emerging persons: the partial assembly of skeletons, the emergence of the nouveau riche or of “made” men, the awakening to consciousness after a near-drowning, the looming of forms that might be (but that are not yet) human out of the darkness or at the borders of perception. The novel repeatedly produces scenarios in which recursive structures of gazing (fond spouses attending to their spouses’ looks, a daughter watching to see what her father sees) seem to produce faces that loom out of the void. It also repeatedly dramatizes forms of reading that aren’t literally reading: Silas Wegg teaching Boffin to “decline and fall”; Lizzie Hexam seeing stories in the fire; Charley Hexam looking at the spines of books; Gaffer “reading” posters illegible to him, and so on. The novel’s concern with incipient forms and, as it were, proto-reading, indexes the way its major and minor plots and subplots are structured by an overarching concern with inception.


Languages ◽  
2021 ◽  
Vol 6 (2) ◽  
pp. 65
Author(s):  
Junko Ito ◽  
Armin Mester

This paper investigates the role recursive structures play in prosody. In current understanding, phonological phrasing is computed by a general syntax–prosody mapping algorithm. Here, we are interested in recursive structure that arises in response to morphosyntactic structure that needs to be mapped. We investigate the types of recursive structures found in prosody, specifically: for a prosodic category κ, besides the adjunctive types of recursion κ[κ x] and κ[x κ], is there also the coordinative type κ[κ κ]? Focusing on the prosodic forms of compounds in two typologically rather different languages, Danish and Japanese, we encounter three types of recursive word structure, coordinative ω[ω ω], left-adjunctive ω[f ω], and right-adjunctive ω[ω f], alongside the non-recursive, strictly layered compound structure ω[f f]. In addition, two kinds of coordinative φ-compounds are found in Japanese, one with a non-recursive (strictly layered) structure φ[ω ω], a mono-phrasal compound consisting of two words, and one with coordinative recursion φ[φ φ], a bi-phrasal compound. A cross-linguistically rare type of post-syntactic compound has this bi-phrasal structure, a fact to be explained by its sentential origin.


2021 ◽  
Vol 6 (1) ◽  
pp. 133
Author(s):  
Adina Camelia Bleotu ◽  
Tom Roeper

The current paper examines Romanian 5-year-olds’ comprehension and production of recursive structures involving multiple adjectives, such as florile mici mari ‘flowers-the small big’, i.e., ‘the big small flowers’. On the basis of an experiment we conducted on 20 Romanian 5-year-olds, we show that children tend to reduce recursion to coordination, the default interpretation at this stage of language acquisition. Moreover, children avoid producing recursive structures, preferring simpler forms instead, while producing coordinative structures to a much greater extent. Since children’s performance with recursive adjectives in Romanian seems to be worse than their performance with recursive prepositional phrases (Bleotu 2020), we argue that this supports the idea that, unlike prepositional phrases, multiple adjectives in Romance are derived through the complex operation of Roll-Up (Cinque 1994, 2005, 2010).


2021 ◽  
Author(s):  
Jasmin Straub

Over the last thirty years, the contraction method has become an important tool for the distributional analysis of random recursive structures. While it was mainly developed to show weak convergence, the contraction approach can additionally be used to obtain bounds on the rate of convergence in an appropriate metric. Based on ideas of the contraction method, we develop a general framework to bound rates of convergence for sequences of random variables as they mainly arise in the analysis of random trees and divide-and-conquer algorithms. The rates of convergence are bounded in the Zolotarev distances. In essence, we present three different versions of convergence theorems: a general version, an improved version for normal limit laws (providing significantly better bounds in some examples with normal limits), and a third version with a relaxed independence condition. Moreover, concrete applications are given, including parameters of random trees, quantities from stochastic geometry, and complexity measures of recursive algorithms under either a random input or some randomization within the algorithm.
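The setting described above is commonly formalized as a distributional fixed-point recurrence. The notation below is the generic form from the contraction-method literature, not necessarily the thesis's own:

```latex
% A quantity X_n of a random recursive structure typically satisfies,
% in distribution, a recurrence of the form
X_n \;\stackrel{d}{=}\; \sum_{r=1}^{K} A_r(n)\, X^{(r)}_{I_r(n)} + b(n),
% where X^{(1)}, \dots, X^{(K)} are independent copies of the sequence,
% the I_r(n) are random subproblem sizes, and A_r(n), b(n) are random
% coefficients. Rates of convergence toward the limit are then measured
% in the Zolotarev metric
\zeta_s(X, Y) \;=\; \sup_{f \in \mathcal{F}_s} \bigl| \mathbb{E}\, f(X) - \mathbb{E}\, f(Y) \bigr|,
% with \mathcal{F}_s the class of functions whose \lfloor s \rfloor-th
% derivative is H\"older continuous of exponent s - \lfloor s \rfloor.
```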


Author(s):  
Bruno Guillon ◽  
Giovanni Pighizzini ◽  
Luca Prigioniero

Non-self-embedding grammars are a restriction of context-free grammars that cannot describe recursive structures and hence characterize only the class of regular languages. A double exponential gap in size from non-self-embedding grammars to deterministic finite automata is known. The same size gap is also known from constant-height pushdown automata and 1-limited automata to deterministic finite automata. Here, constant-height pushdown automata and 1-limited automata are compared with non-self-embedding grammars. It is proved that non-self-embedding grammars and constant-height pushdown automata are polynomially related in size. Furthermore, a polynomial-size simulation by 1-limited automata is presented. The converse transformation, however, is proved to cost exponential. Finally, a different simulation shows that the conversion of deterministic constant-height pushdown automata into deterministic 1-limited automata also costs polynomial.
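The self-embedding property the abstract refers to is the classical one: a grammar is self-embedding if some nonterminal A derives a form u A v with both u and v nonempty (Chomsky's condition separating genuinely context-free structure from regular structure). A minimal sketch of the test, assuming an epsilon-free grammar with no useless symbols; the toy grammars are hypothetical, not from the paper:

```python
# Sketch of the self-embedding test: a grammar is self-embedding if
# some nonterminal A derives u A v with u and v both nonempty.
# Assumes an epsilon-free grammar with no useless symbols.

def is_self_embedding(grammar):
    """grammar: dict nonterminal -> list of right-hand sides (tuples).
    Nonterminals are the dict keys; all other symbols are terminals."""
    nts = set(grammar)
    # facts (A, B, left, right): A derives a form containing B,
    # with `left`/`right` recording nonempty material on that side.
    facts = set()
    for a, rhss in grammar.items():
        for rhs in rhss:
            for i, sym in enumerate(rhs):
                if sym in nts:
                    facts.add((a, sym, i > 0, i < len(rhs) - 1))
    # Transitive closure, accumulating context flags.
    changed = True
    while changed:
        changed = False
        for (a, b, l1, r1) in list(facts):
            for (b2, c, l2, r2) in list(facts):
                if b == b2:
                    f = (a, c, l1 or l2, r1 or r2)
                    if f not in facts:
                        facts.add(f)
                        changed = True
    return any(a == b and l and r for (a, b, l, r) in facts)

nested = {"S": [("a", "S", "b"), ("c",)]}   # a^n c b^n: self-embedding
right_linear = {"S": [("a", "S"), ("b",)]}  # a* b: regular, not self-embedding

print(is_self_embedding(nested), is_self_embedding(right_linear))  # True False
```

A non-self-embedding grammar like the right-linear one above never stacks unbounded material on both sides of a nonterminal, which is why its language stays regular and why conversions to finite automata, though possibly large, always exist.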


2020 ◽  
Author(s):  
Ruhai Zhang ◽  
Feifei Li ◽  
Shan Jiang ◽  
Kexin Zhao ◽  
Chi Zhang ◽  
...  

The current research aimed to investigate the role that prior knowledge plays in what structures can be implicitly learnt, and the nature of the memory buffer required for learning such structures. It is already established that people can implicitly learn to detect an inversion symmetry (i.e. a cross-serial dependency) based on linguistic tone types. The present study investigated the ability of the Simple Recurrent Network (SRN) to explain implicit learning of such recursive structures. We found that the SRN learnt the symmetry over tone types more effectively when given prior knowledge of the tone types (i.e. of the two categories the tones were grouped into). The role of prior knowledge of the tone types in learning the inversion symmetry was then tested on people: when an arbitrary classification of tones was used (i.e. in the absence of prior knowledge of categories), participants did not implicitly learn the inversion symmetry, unlike when they did have prior knowledge of the tone types. These results indicate the importance of prior knowledge in the implicit learning of symmetrical structures. We further contrasted the learning of inversion symmetry and retrograde symmetry and showed that inversion was learnt more easily than retrograde by the SRN, matching our previous findings with people, thus showing that the type of memory buffer used in the SRN is suitable for modelling the implicit learning of symmetry in people.
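The two symmetries contrasted above can be illustrated with toy sequences over two tone categories. The category labels and mapping below are hypothetical; what matters is the order of the dependencies: inversion pairs elements in the same serial order (a cross-serial dependency), while retrograde pairs them in reverse (a mirror, nested dependency).

```python
# Toy illustration of inversion vs. retrograde symmetry over two
# hypothetical tone categories: each tone in category A has a
# paired counterpart in category B.

MAP = {"t1": "u1", "t2": "u2", "t3": "u3"}  # category A -> category B

def inversion(first_half):
    """Cross-serial dependency: counterparts in the same serial order,
    e.g. t1 t2 t3 -> t1 t2 t3 u1 u2 u3."""
    return first_half + [MAP[t] for t in first_half]

def retrograde(first_half):
    """Mirror (nested) dependency: counterparts in reverse order,
    e.g. t1 t2 t3 -> t1 t2 t3 u3 u2 u1."""
    return first_half + [MAP[t] for t in reversed(first_half)]

seq = ["t1", "t3", "t2"]
print(inversion(seq))   # ['t1', 't3', 't2', 'u1', 'u3', 'u2']
print(retrograde(seq))  # ['t1', 't3', 't2', 'u2', 'u3', 'u1']
```

Only a learner that has the A/B category distinction can detect either regularity, which is the sense in which prior knowledge of the tone types gates implicit learning in both the SRN simulations and the behavioural results.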

