Monosyllabisme Originel, Fiction Et Reconstruction

Diachronica ◽  
1991 ◽  
Vol 8 (1) ◽  
pp. 17-44
Author(s):  
Guy Jucquois

SUMMARY The monosyllabic character of language is often presented as an original trait. This tendency seems to underlie much of the earlier thinking on the origin of language as well as the more recent glottogonic theories, in which, with variations, an original system of monosyllabic cries ('calls') is hypothesized which evolves, in parallel with humanity, into a more complex system of communication. This view reappears in the so-called 'constructed' or artificial languages, whose goal it has been to facilitate international communication through a simplified system. Finally, in reconstructed natural languages, the procedures usually lead to the hypostatization of an initial monosyllabic root. Across the different sectors of linguistic study, a linear evolution from the simple to the complex is regularly implied, which finds its expression in the passage of language structures from an original monosyllabism to a subsequent polysyllabism.

2018 ◽  
Vol 45 (5) ◽  
pp. 1054-1072 ◽  
Author(s):  
Jessica F. SCHWAB ◽  
Casey LEW-WILLIAMS ◽  
Adele E. GOLDBERG

Abstract Children tend to regularize their productions when exposed to artificial languages, an advantageous response to unpredictable variation. But generalizations in natural languages are typically conditioned by factors that children ultimately learn. In two experiments, adult and six-year-old learners witnessed two novel classifiers, probabilistically conditioned by semantics. Whereas adults displayed high accuracy in their productions – applying the semantic criteria to familiar and novel items – children were oblivious to the semantic conditioning. Instead, children regularized their productions, over-relying on only one classifier. However, in a two-alternative forced-choice task, children's performance revealed greater respect for the system's complexity: they selected both classifiers equally, without bias toward one or the other, and displayed better accuracy on familiar items. Given that natural languages are conditioned by multiple factors that children successfully learn, we suggest that their tendency to simplify in production stems from retrieval difficulty when a complex system has not yet been fully learned.


Author(s):  
Alan Reed Libert

Artificial languages—languages which have been consciously designed—have been created for more than 900 years, although the number of them has increased considerably in recent decades, and by the early 21st century the total figure probably was in the thousands. There have been several goals behind their creation; the traditional one (which applies to some of the best-known artificial languages, including Esperanto) is to make international communication easier. Some other well-known artificial languages, such as Klingon, have been designed in connection with works of fiction. Still others are simply personal projects. A traditional way of classifying artificial languages involves the extent to which they make use of material from natural languages. Those artificial languages which are created mainly by taking material from one or more natural languages are called a posteriori languages (which again include well-known languages such as Esperanto), while those which do not use natural languages as sources are a priori languages (although many a posteriori languages have a limited amount of a priori material, and some a priori languages have a small number of a posteriori components). Between these two extremes are the mixed languages, which have large amounts of both a priori and a posteriori material. Artificial languages can also be classified typologically (as natural languages are) and by how and how much they have been used. Many linguists seem to be biased against research on artificial languages, although some major linguists of the past have been interested in them.


2017 ◽  
Vol 18 (1) ◽  
pp. 105-127
Author(s):  
Sunyoung Park ◽  
Jin-young Tak

Author(s):  
Kathleen Haney

An international conference that takes Philosophy Educating Humanity as its theme does well to revisit the liberal arts tradition. Although the liberal arts are most often assimilated to studies brought together as the Humanities, the old usage included the arts which employed artificial languages in mathematics, music, and astronomy, as well as the literature and letters of the various natural languages. The current conflation of liberal education with the humanities does violence to the historical tradition in education, reducing it to fluff in the eyes of tough-minded scientists who know that only numbers deliver objectivity. The liberal arts of the traditional undergraduate curriculum provided the skills to liberate the student's linguistic powers so that he or she could read, speak, and understand natural language in all its functions. To educate human persons to master language is to encourage students to take possession of their natural powers so that they can express themselves, understand what others say, and reason together. The arts of natural language lead to mastery of the mathematical arts which use a language that is no one's mother tongue. Together, the seven arts rid students of the worst enemies of humankind: ignorance and prejudice.


1995 ◽  
Vol 1 (3) ◽  
pp. 217-234 ◽  
Author(s):  
Stephen G. Pulman

Abstract Artificial languages for person-machine communication seldom display the most characteristic properties of natural languages, such as the use of anaphoric or other referring expressions, or ellipsis. This paper argues that good use could be made of such devices in artificial languages, and proposes a mechanism for the resolution of ellipsis and anaphora in them using finite-state transduction techniques. This yields an interpretation system displaying many desirable properties: it is easily implementable, efficient, incremental, and reversible. Linguists in general, and computational linguists in particular, do well to employ finite-state devices wherever possible. They are theoretically appealing because they are computationally weak and well understood from a mathematical point of view. They are computationally appealing because they make for simple, elegant, and highly efficient implementations. In this paper, I hope I have shown how they can be applied to a problem… which seems initially to require heavier machinery.
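The flavor of a finite-state approach to reference resolution can be suggested with a toy sketch (my illustration, not Pulman's mechanism; the mini-lexicon and rule are invented for the example): a transducer whose entire state is a single register holding the most recently seen noun, which rewrites the pronoun "it" to that stored antecedent.

```python
# Toy finite-state transducer for anaphora resolution (illustrative only):
# the machine's whole state is one register holding the last noun seen.
NOUNS = {"file", "report", "printer"}  # invented mini-lexicon

def resolve(tokens):
    last_noun = None          # the single state register
    out = []
    for tok in tokens:
        if tok == "it" and last_noun is not None:
            out.append(last_noun)     # substitute the stored antecedent
        else:
            out.append(tok)
            if tok in NOUNS:
                last_noun = tok       # transition: remember the new noun
    return out

print(resolve("open the file then print it".split()))
# ['open', 'the', 'file', 'then', 'print', 'file']
```

Because the state is one bounded register over a finite lexicon, the device remains finite-state, which is what makes such resolvers simple, efficient, and incremental in the spirit the abstract describes.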


Complexity ◽  
2017 ◽  
Vol 2017 ◽  
pp. 1-7 ◽  
Author(s):  
Vasilii A. Gromov ◽  
Anastasia M. Migrina

A natural language (represented by texts generated by native speakers) is considered as a complex system, and the type of complex system to which natural languages belong is ascertained. Namely, the authors hypothesize that a language is a self-organized critical system and that the texts of a language are "avalanches" flowing down its word co-occurrence graph. The corresponding statistical characteristics of the distributions of the number of words in English and Russian texts are calculated; the samples were constructed from corpora of literary texts and from a set of social media messages (as a substitute for oral speech). The analysis found that the number of words in the texts obeys a power-law distribution.
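As a sketch of what a power-law check can look like (this is not the paper's actual procedure; it is the standard continuous maximum-likelihood estimator in the style of Clauset, Shalizi, and Newman, run here on synthetic Pareto data rather than real text lengths):

```python
import math
import random

def powerlaw_alpha(samples, x_min):
    """Continuous MLE for a power-law exponent: alpha = 1 + n / sum(ln(x/x_min))."""
    xs = [x for x in samples if x >= x_min]
    return 1.0 + len(xs) / sum(math.log(x / x_min) for x in xs)

# Synthetic "text lengths": inverse-CDF samples from a Pareto law, alpha = 2.5
random.seed(0)
alpha_true = 2.5
data = [(1 - random.random()) ** (-1.0 / (alpha_true - 1)) for _ in range(100_000)]

print(round(powerlaw_alpha(data, x_min=1.0), 2))  # recovers a value near 2.5
```

On real corpora one would additionally need to choose the cutoff x_min and test goodness of fit before claiming a power law, as the abstract's authors do with their own statistical characteristics.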


Author(s):  
Elyse Piquette

Transformational grammar has attempted to outline the systematic nature of language structure while also stressing the creative aspect of language. Language is systematic in that speakers use a finite number of means to make up their messages, and yet it is creative in that an infinite number of distinct messages are possible in any natural language. In natural languages, however, there is not a perfect one-to-one correspondence between possible messages—intended or perceived—and possible linguistic realizations, as exists in conventional or artificial languages. Often in natural languages a single linguistic form has two or more meanings. Homonymy, whether lexical or syntactic, is an important notion, not only because syntactic ambiguity plays a central role in linguistic theory, but also because its study gives us a better understanding of the systematics of language and of the way we attach meaning to linguistic representations. Hence the importance of evaluating how speakers deal with syntactic ambiguity in their attempts to understand and to be understood.


2006 ◽  
Vol 72 (1) ◽  
pp. 233-252 ◽  
Author(s):  
Ernesto Napoli

The paper is concerned with negation in artificial and natural languages. "Negation" is an ambiguous word. It can mean three different things: an operation (negating), an operator (a sign of negation), or the result of an operation. The three things, however, are intimately linked. An operation such as negation is realized through an operator of negation, i.e., consists in adding a symbol of negation to an entity to obtain an entity of the same type; and which operation it is depends on what it applies to and on what results from its application. I argue that negation is not an operation on linguistic acts but rather an operation on the objects of linguistic acts, namely sentences. And I assume that the negation of a sentence is a sentence that contradicts it. If so, the negation of a sentence may be obtained, in case the sentence is molecular, by applying the operation of negation not to the sentence itself but to a constituent sentence. To put it in a succinct and paradoxical-sounding way, we could say that in order to negate a sentence it is sufficient but not necessary to negate it. However, that negation applies to sentences is true only for artificial languages, in which the sign of negation is a monadic sentential connective. In natural language, negation applies to expressions other than sentences, namely words and non-sentential phrases. Still, words and non-sentential phrases are interesting and valuable only as ultimate or immediate constituents of sentences, as a means of saying (something that can be true or false), and the concern with negation is ultimately the concern with the negation of sentences. So the problem is what sub-sentential and non-sentential expressions negation should apply to in order to obtain the negation of the containing sentence. The standard answer is that the negation of a natural-language sentence is equivalent to the negation of its predicate.
Yet, I argue, predicate negation is necessary but not sufficient, due to the existence of molecular sentences. Finally, I notice that if to apply negation to an artificial sentence is to put the negation sign in front of it, to negate the predicate of a natural-language sentence may or may not be to put the negation sign in front of it.


Author(s):  
Zulfiyya Abilova

Many natural languages contain a large number of borrowed words, which usually enter the language as a result of cultural-historical, socio-economic, and other relations between peoples. The article is devoted to the English language, which, in the course of its historical development, was crossed with the Scandinavian languages and the Norman dialect of French. In addition, English, throughout almost its entire history, has interacted with Latin, French, Spanish, Russian, German, and other languages of the world. This article examines the influence of the Latin, French, and Scandinavian languages, as well as the development of English as a language of international communication.


Entropy ◽  
2021 ◽  
Vol 23 (10) ◽  
pp. 1335
Author(s):  
Shane Steinert-Threlkeld

While the languages of the world vary greatly, they also exhibit systematic patterns. Semantic universals are restrictions on the variation in meaning exhibited cross-linguistically (e.g., that, in all languages, expressions of a certain type can only denote meanings with a certain special property). This paper pursues an efficient-communication analysis to explain the presence of semantic universals in a domain of function words: quantifiers. Two experiments measure how well languages do in optimally trading off between the competing pressures of simplicity and informativeness. First, we show that artificial languages which more closely resemble natural languages are more optimal. Then, we introduce information-theoretic measures of degrees of semantic universals and show that these are not correlated with optimality in a random sample of artificial languages. These results suggest both that efficient communication shapes semantic typology in content and function word domains alike, and that semantic universals may not stand in need of independent explanation.
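The tradeoff being measured can be pictured with a small sketch (illustrative only; the paper's actual measures are information-theoretic, not these invented scores): give each language a complexity score and a communicative-cost score, both lower-is-better, and call a language optimal when no other language is at least as good on both axes, i.e., when it lies on the Pareto frontier.

```python
# Toy Pareto-frontier computation for a simplicity/informativeness tradeoff.
# Each "language" is a pair (complexity, communicative_cost); lower is better.
def dominates(a, b):
    """a dominates b if it is at least as good on both axes and is not b itself."""
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def pareto_frontier(langs):
    return [l for l in langs if not any(dominates(o, l) for o in langs)]

langs = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 2)]
print(pareto_frontier(langs))
# [(1, 5), (2, 3), (4, 1)] — the dominated points (3, 4) and (5, 2) drop out
```

A language's degree of optimality can then be quantified as its distance to this frontier, which is the shape of analysis the abstract's "more optimal" comparisons rely on.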

