Do dependency lengths explain constraints on crossing dependencies?

2021 · Vol 7 (s3)
Author(s): Himanshu Yadav, Samar Husain, Richard Futrell

Abstract: In syntactic dependency trees, when arcs are drawn from syntactic heads to dependents, they rarely cross. Constraints on these crossing dependencies are critical for determining the syntactic properties of human language, because they define the position of natural language in formal language hierarchies. We study whether the apparent constraints on crossing syntactic dependencies in natural language might be explained by constraints on dependency lengths (the linear distance between heads and dependents). We compare real dependency trees from treebanks of 52 languages against baselines of random trees that are matched with the real trees in terms of their dependency lengths. We find that these baseline trees have many more crossing dependencies than real trees, indicating that a constraint on dependency lengths alone cannot explain the empirical rarity of crossing dependencies. However, we find evidence that a combined constraint on dependency length and the rate of crossing dependencies might be able to explain two of the most-studied formal restrictions on dependency trees: gap degree and well-nestedness.
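
To make the crossing notion concrete, here is a minimal sketch in Python (with toy example trees of our own, not data from the paper's treebanks) that counts crossing dependencies in a tree encoded as a list of head indices; two arcs cross when exactly one endpoint of one arc falls strictly inside the span of the other.

```python
# Minimal sketch: count crossing dependencies in a dependency tree
# encoded as a list of head indices (head[i] is the head of word i;
# the root carries -1). Toy examples only, not the paper's treebank data.

def count_crossings(heads):
    # Represent each dependency as an arc (left endpoint, right endpoint).
    arcs = [(min(i, h), max(i, h)) for i, h in enumerate(heads) if h >= 0]
    crossings = 0
    for a in range(len(arcs)):
        for b in range(a + 1, len(arcs)):
            (l1, r1), (l2, r2) = arcs[a], arcs[b]
            # Arcs cross iff exactly one endpoint of one arc lies
            # strictly inside the other arc's span.
            if l1 < l2 < r1 < r2 or l2 < l1 < r2 < r1:
                crossings += 1
    return crossings

def total_dependency_length(heads):
    # The paper's other quantity: summed linear head-dependent distance.
    return sum(abs(i - h) for i, h in enumerate(heads) if h >= 0)

print(count_crossings([1, -1, 1]))     # projective tree: 0 crossings
print(count_crossings([2, 3, -1, 2]))  # arcs (0,2) and (1,3) cross: 1
```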

2020
Author(s): Steven Samuel

Research and thinking on the cognitive aspects of language evolution have usually attempted to account for how the capacity to learn even one modern human language developed. Bilingualism has perhaps been treated as something to think about only once the ‘real’ puzzle of monolingualism is solved, but this assumes in turn (and without evidence) that bilingualism evolved after monolingualism. All typically developing children (and adults) are capable of learning multiple languages, and the majority of modern humans are at least bilingual. In this paper I ask whether, by leaving bilingualism out of language evolution, we have missed a trick. I propose that exposure to synonymous signs, such as food and alarm calls, is a necessary precondition for abstracting sound away from referent. In support of this possibility is evidence that modern-day bilingual children are better at breaking this ‘word magic’ spell. More generally, language evolution should be viewed through the lens of bilingualism, as this is the end state we are attempting to explain.


2019
Author(s): Edward Gibson, Richard Futrell, Steven T. Piantadosi, Isabelle Dautriche, Kyle Mahowald, ...

Cognitive science applies diverse tools and perspectives to study human language. Recently, an exciting body of work has examined linguistic phenomena through the lens of efficiency in usage: what otherwise puzzling features of language find explanation in formal accounts of how language might be optimized for communication and learning? Here, we review studies that deploy formal tools from probability and information theory to understand how and why language works the way that it does, focusing on phenomena ranging from the lexicon through syntax. These studies show how a pervasive pressure for efficiency guides the forms of natural language and indicate that a rich future for language research lies in connecting linguistics to cognitive psychology and mathematical theories of communication and inference.
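
As a flavor of the information-theoretic tooling such studies deploy, here is a minimal sketch (a toy corpus of our own, not data from the reviewed work) that computes the Shannon entropy of a word-frequency distribution, a standard ingredient in efficiency-based accounts of the lexicon:

```python
import math
from collections import Counter

# Toy corpus for illustration; the reviewed studies use large corpora.
corpus = "the cat sat on the mat and the cat ran".split()
counts = Counter(corpus)
total = sum(counts.values())

# Shannon entropy H = -sum_w p(w) log2 p(w): the average surprisal,
# in bits, of drawing a word from this distribution.
entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
print(f"H = {entropy:.3f} bits")
```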


2019 · pp. 1-18
Author(s): Michael Yoshitaka Erlewine, Theodore Levin

Pronominal paradigms in Philippine-type Austronesian languages show a robust and curious gap: in transitive clauses, pivot arguments and nonpivot agents may have bound pronominal forms, appearing as second-position clitics, but pronominal nonpivot themes must be full, free pronouns. This gap is instructive regarding the organization of the lower phase edge. Because cliticization involves a syntactic dependency between the host and the argument position, and all syntactic dependencies are constrained by phases, the gap is explained if pivots and nonpivot agents are specifiers of the phase head, making them the only DPs accessible to operations from outside the lower phase.


2021 · Vol 34
Author(s): Laura A. Janda, Anna Endresen, Valentina Zhukova, Daria Mordashova, Ekaterina Rakhilina

Abstract: We provide a practical, step-by-step methodology for building a full-scale constructicon resource for a natural language, sharing our experience from the nearly completed Russian Constructicon project, an open-access searchable database of over 2,200 Russian constructions (https://site.uit.no/russian-constructicon/). The constructions are organized in families, clusters, and networks based on their semantic and syntactic properties, illustrated with corpus examples, and tagged for CEFR level of language proficiency. The resource is designed for both researchers and L2 learners of Russian and offers the largest electronic database of constructions built for any language. We explain what makes the Russian Constructicon different from other constructicons, report on the major stages of our work, and share the methods used to systematically expand the inventory of constructions. Our objective is to encourage colleagues to build constructicon resources for additional natural languages, thus taking Construction Grammar to a new quantitative and qualitative level and facilitating cross-linguistic comparison.
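
To give a sense of what one searchable record in such a resource might contain, here is a hypothetical sketch in Python; the field names are our own illustration, not the Russian Constructicon's actual schema:

```python
from dataclasses import dataclass, field

# Hypothetical record layout; field names are illustrative only,
# not the Russian Constructicon's actual schema.
@dataclass
class ConstructionEntry:
    form: str                  # the pattern, with open slots marked
    meaning: str               # a gloss of the constructional semantics
    family: str                # semantic/syntactic family grouping
    cefr_level: str            # proficiency tag (A1-C2)
    examples: list[str] = field(default_factory=list)  # corpus examples

entry = ConstructionEntry(
    form="NP-Dat не до NP-Gen",
    meaning="X cannot attend to Y right now",
    family="negated predicatives",   # hypothetical family label
    cefr_level="B2",
    examples=["Мне сейчас не до шуток."],
)
print(entry.family, entry.cefr_level)
```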


Author(s): Stephen Neale

Syntax (more loosely, ‘grammar’) is the study of the properties of expressions that distinguish them as members of different linguistic categories, and of ‘well-formedness’, that is, the ways in which expressions belonging to these categories may be combined to form larger units. Typical syntactic categories include noun, verb and sentence. Syntactic properties have played an important role not only in the study of ‘natural’ languages (such as English or Urdu) but also in the study of logic and computation. For example, in symbolic logic, classes of well-formed formulas are specified without mentioning what formulas (or their parts) mean, or whether they are true or false; similarly, the operations of a computer can be fruitfully specified using only syntactic properties, a fact that has a bearing on the viability of computational theories of mind.

The study of the syntax of natural language has taken on significance for philosophy in the twentieth century, partly because of the suspicion, voiced by Russell, Wittgenstein and the logical positivists, that philosophical problems often turned on misunderstandings of syntax (or the closely related notion of ‘logical form’). Moreover, an idea that has been fruitfully developed since the pioneering work of Frege is that a proper understanding of syntax offers an important basis for any understanding of semantics, since the meaning of a complex expression is compositional, that is, built up from the meanings of its parts as determined by syntax.

In the mid-twentieth century, philosophical interest in the systematic study of the syntax of natural language was heightened by Noam Chomsky’s work on the nature of syntactic rules and on the innateness of mental structures specific to the acquisition (or growth) of grammatical knowledge. This work formalized traditional work on grammatical categories within an approach to the theory of computability, and also revived proposals of traditional philosophical rationalists that many twentieth-century empiricists had regarded as bankrupt. Chomskian theories of grammar have become the focus of most contemporary work on syntax.
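
As a concrete illustration of the point about symbolic logic, here is a minimal sketch (a toy grammar of our own, not from the entry) of a recognizer that decides well-formedness for a small propositional language purely by shape, without ever consulting meaning or truth:

```python
# Toy recognizer for a fragment of propositional logic. Well-formedness
# is decided purely by syntactic shape; meaning and truth never enter.
# (Our own illustration, not from the encyclopedia entry.)

ATOMS = {"p", "q", "r"}

def is_wff(s: str) -> bool:
    s = s.strip()
    if s in ATOMS:                      # an atom is a wff
        return True
    if s.startswith("~"):               # ~A is a wff if A is
        return is_wff(s[1:])
    if s.startswith("(") and s.endswith(")"):
        depth = 0                       # locate the main connective
        for i, ch in enumerate(s):
            if ch == "(":
                depth += 1
            elif ch == ")":
                depth -= 1
            elif ch in "&|>" and depth == 1:
                # (A&B), (A|B), (A>B) are wffs if A and B are
                return is_wff(s[1:i]) and is_wff(s[i + 1:-1])
    return False

print(is_wff("(p&~q)"))   # True
print(is_wff("p&"))       # False
```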


2015 · Vol 370 (1664) · pp. 20140097
Author(s): Martin Rohrmeier, Willem Zuidema, Geraint A. Wiggins, Constance Scharff

Human language, music and a variety of animal vocalizations constitute ways of sonic communication that exhibit remarkable structural complexity. While the complexities of language and possible parallels in animal communication have been discussed intensively, reflections on the complexity of music and animal song, and comparisons between them, are underrepresented. In some ways, music and animal songs are more comparable to each other than to language, as propositional semantics cannot be used as an indicator of communicative success or well-formedness, and notions of grammaticality are less easily defined. This review brings together accounts of the principles of structure building in music and animal song. It relates them to corresponding models in formal language theory, the extended Chomsky hierarchy (CH), and their probabilistic counterparts. We further discuss common misunderstandings and shortcomings concerning the CH and suggest ways to move beyond them. We discuss language, music and animal song in the context of their function and motivation, and further integrate problems and issues that are less commonly addressed in the context of language, including continuous event spaces, features of sound and timbre, representation of temporality, and interactions of multiple parallel feature streams. We discuss these aspects in the light of recent theoretical, cognitive, neuroscientific and modelling research in the domains of music, language and animal song.
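
For readers unfamiliar with the hierarchy being invoked, here is a minimal sketch (our own toy illustration, not from the review) contrasting a regular pattern with a context-free one; the latter requires matching counts, which finite-state devices cannot do in general:

```python
import re

# (ab)^n is regular: a finite-state pattern suffices to recognize it.
def is_ab_star(s: str) -> bool:
    return re.fullmatch(r"(ab)*", s) is not None

# a^n b^n is context-free but not regular: recognizing it requires
# matching the count of a's against the count of b's.
def is_anbn(s: str) -> bool:
    n = len(s) // 2
    return len(s) == 2 * n and s == "a" * n + "b" * n

print(is_ab_star("ababab"), is_anbn("aaabbb"))  # True True
print(is_ab_star("aabb"), is_anbn("abab"))      # False False
```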

