Using signed language collocations to investigate acquisition: A commentary on Ambridge (2020)

2020 ◽  
Vol 40 (5-6) ◽  
pp. 585-591
Author(s):  
Lynn Hou ◽  
Jill P. Morford

The visual-manual modality of sign languages renders them a unique test case for theories of language acquisition and processing. In this commentary the authors describe evidence from signed languages and ask whether it is consistent with Ambridge’s proposal. The evidence includes recent research on collocations in American Sign Language that reveals collocational frequency effects and patterns that do not constitute syntactic constituents. While these collocations appear to resist fully abstract schematization, further consideration is warranted of how speakers create exemplars, how they link exemplar clouds based on tokens, and how much abstraction is involved in their creation.

1999 ◽  
Vol 26 (2) ◽  
pp. 321-338 ◽  
Author(s):  
E. DAYLENE RICHMOND-WELTY ◽  
PATRICIA SIPLE

Signed languages make unique demands on gaze during communication. Bilingual children acquiring both a spoken and a signed language must learn to differentiate gaze use for their two languages. Gaze during utterances was examined for a set of bilingual-bimodal twins acquiring spoken English and American Sign Language (ASL) and a set of monolingual twins acquiring ASL when the twins were aged 2;0, 3;0 and 4;0. The bilingual-bimodal twins differentiated their languages by age 3;0. Like the monolingual ASL twins, the bilingual-bimodal twins established mutual gaze at the beginning of their ASL utterances and either maintained gaze to the end or alternated gaze to include a terminal look. In contrast, like children acquiring spoken English monolingually, the bilingual-bimodal twins established mutual gaze infrequently for their spoken English utterances. When they did establish mutual gaze, it occurred later in their spoken utterances and they tended to look away before the end.


Gesture ◽  
2013 ◽  
Vol 13 (1) ◽  
pp. 1-27 ◽  
Author(s):  
Rachel Sutton-Spence ◽  
Donna Jo Napoli

Sign Language poetry is especially valued for its presentation of strong visual images. Here, we explore the highly visual signs that British Sign Language and American Sign Language poets create as part of the ‘classifier system’ of their languages. Signed languages, as they create visually-motivated messages, utilise categoricity (more traditionally considered ‘language’) and analogy (more traditionally considered extra-linguistic and the domain of ‘gesture’). Classifiers in sign languages arguably show both these characteristics (Oviedo, 2004). In our discussion of sign language poetry, we see that poets take elements that are widely understood to be highly visual, closely representing their referents, and make them even more highly visual — so going beyond categorisation and into new areas of analogue.


Author(s):  
Anne Therese Frederiksen ◽  
Rachel I. Mayberry

Implicit causality (IC) biases, the tendency of certain verbs to elicit re-mention of either the first-mentioned noun phrase (NP1) or the second-mentioned noun phrase (NP2) from the previous clause, are important in psycholinguistic research. Understanding IC verbs and the source of their biases in signed as well as spoken languages helps elucidate whether these phenomena are language general or specific to the spoken modality. As the first of its kind, this study investigates IC biases in American Sign Language (ASL) and provides IC bias norms for over 200 verbs, facilitating future psycholinguistic studies of ASL and comparisons of spoken versus signed languages. We investigated whether native ASL signers continued sentences with IC verbs (e.g., ASL equivalents of ‘Lisa annoys Maya because…’) by mentioning NP1 (i.e., Lisa) or NP2 (i.e., Maya). We found a tendency towards more NP2-biased verbs. Previous work has found that a verb’s thematic roles predict bias direction: stimulus-experiencer verbs (e.g., ‘annoy’), where the first argument is the stimulus (causing annoyance) and the second argument is the experiencer (experiencing annoyance), elicit more NP1 continuations. Verbs with experiencer-stimulus thematic roles (e.g., ‘love’) elicit more NP2 continuations. We probed whether the trend towards more NP2-biased verbs was related to an existing claim that stimulus-experiencer verbs do not exist in sign languages. We found that stimulus-experiencer structure, while permitted, is infrequent, impacting the IC bias distribution in ASL. Nevertheless, thematic roles predict IC bias in ASL, suggesting that the thematic role-IC bias relationship is stable across languages as well as modalities.


Target ◽  
1995 ◽  
Vol 7 (1) ◽  
pp. 135-149 ◽  
Author(s):  
William P. Isham

Research using interpreters who work with signed languages can aid us in understanding the cognitive processes of interpretation in general. Using American Sign Language (ASL) as an example, the nature of signed languages is outlined first. Then the difference between signed languages and manual codes for spoken languages is delineated, and it is argued that these two manners of communicating through the visual channel offer a unique research opportunity. Finally, an example from recent research is used to demonstrate how comparisons between spoken-language interpreters and signed-language interpreters can be used to test hypotheses regarding interpretation.


2016 ◽  
Vol 19 (1) ◽  
pp. 82-123 ◽  
Author(s):  
Erin Wilkinson

This study explores whether American Sign Language (ASL) users exhibit frequency effects on two-sign combinations like those observed in spoken languages. Studies of spoken languages have demonstrated that frequency of usage influences the emergence of grammatical constructions; however, this question has received less investigation for signed languages. To examine frequency effects in ASL, this study analyzes patterns of a grammatical manual negation morpheme, glossed as NOT, produced sequentially with other signs. Findings reveal that NOT is produced with specific signs, demonstrating that the degree of grammaticalization of NOT increases with frequency in ASL collocations. The analysis shows that a few signs are highly phonologically fused with the negation marker, providing emerging evidence that these collocations have undergone chunking, as they are schematic, fused constituent structures in ASL. Given the frequency effects found in the study, chunking appears to be a domain-general cognitive processing mechanism independent of modality effects.


Author(s):  
Greg Evans

Linguistic theory has traditionally defined language in terms of speech and has, as a result, labelled sign languages as non-linguistic systems. Recent advances in sign language linguistic research, however, indicate that modern linguistic theory must include sign language research and theory. This paper examines the historical bias linguistic theory has maintained towards sign languages and refutes the classification of sign languages as contrived artificial systems by surveying current linguistic research into American Sign Language. The growing body of American Sign Language research demonstrates that a signed language can have all the structural levels of spoken language despite its visual-spatial mode. This research also indicates that signed languages are an important source of linguistic data that can help further develop a cognitive linguistic theory.


2021 ◽  
pp. 026765832110376
Author(s):  
Emily Saunders ◽  
David Quinto-Pozos

Studies have shown that iconicity can provide a benefit to non-signers during the learning of single signs, but other aspects of signed messages that might also be beneficial have received less attention. In particular, do other features of signed languages help support comprehension of a message during the process of language learning? The following exploratory study investigates the comprehension of sentences in two signed and two spoken languages by non-signers and by American Sign Language (ASL) learners. The design allows for the examination of message comprehension, with a comparison of unknown spoken and signed languages. Details of the stimulus sentences are provided in order to contextualize features of the signing that might be providing benefits for comprehension. Included in this analysis are aspects of the sentences that are iconic and spatially deictic – some of which resemble common gestural forms of communication. The results indicate that iconicity and referential points in signed language likely assist with comprehension of sentences, even for non-signers and for a signed language that the ASL signers have not studied.


2021 ◽  
pp. 095679762199155
Author(s):  
Amanda R. Brown ◽  
Wim Pouw ◽  
Diane Brentari ◽  
Susan Goldin-Meadow

When we use our hands to estimate the length of a stick in the Müller-Lyer illusion, we are highly susceptible to the illusion. But when we prepare to act on sticks under the same conditions, we are significantly less susceptible. Here, we asked whether people are susceptible to illusion when they use their hands not to act on objects but to describe them in spontaneous co-speech gestures or conventional sign languages of the deaf. Thirty-two English speakers and 13 American Sign Language signers used their hands to act on, estimate the length of, and describe sticks eliciting the Müller-Lyer illusion. For both gesture and sign, the magnitude of illusion in the description task was smaller than the magnitude of illusion in the estimation task and not different from the magnitude of illusion in the action task. The mechanisms responsible for producing gesture in speech and sign thus appear to operate not on percepts involved in estimation but on percepts derived from the way we act on objects.


2021 ◽  
Author(s):  
Kathryn Woodcock ◽  
Steven L. Fischer

"This Guide is intended for working interpreters, interpreting students and educators, and those who employ or purchase the services of interpreters. Occupational health education is essential for professionals in training, to avoid early attrition from practice. 'Sign language interpreting' is considered to include interpretation between American Sign Language (ASL) and English, other spoken languages and corresponding sign languages, and between sign languages (e.g., Deaf Interpreters). Some of the occupational health issues may also apply equally to Communication Access Realtime Translation (CART) reporters, oral interpreters, and intervenors. The reader is encouraged to make as much use as possible of the information provided here." -- Introduction.
