Symbol and Symptom

2009 ◽  
Vol 7 ◽  
pp. 89-110 ◽  
Author(s):  
Sherman Wilcox

This study examines the developmental routes by which gesture is codified into a linguistic system in the context of the natural signed languages of the deaf. I suggest that gestures follow two routes as they codify, and thus that signed languages provide evidence of how material which begins its developmental life external to the conventional linguistic system, as spontaneous or conventional gestures, is codified as language. The Italian Sign Language modal form ‘impossible’ is studied in detail, exploring the developmental route that led from Roman gestures, through liturgical gestures as depicted in medieval Italian art, through everyday Italian and Neapolitan gestures to its modal meaning.

Gesture ◽  
2004 ◽  
Vol 4 (1) ◽  
pp. 43-73 ◽  
Author(s):  
Sherman Wilcox

In this paper I explore the role of gesture in the development of signed languages. Using data from American Sign Language, Catalan Sign Language, French Sign Language, and Italian Sign Language, as well as historical sources describing gesture in the Mediterranean region, I demonstrate that gesture enters the linguistic system via two distinct routes. In one, gesture serves as a source of lexical and grammatical morphemes in signed languages. In the second, elements become directly incorporated into signed language morphology, bypassing the lexical stage. Finally, I propose a unifying framework for understanding the gesture-language interface in signed and spoken languages.


2014 ◽  
Vol 17 (2) ◽  
pp. 215-238 ◽  
Author(s):  
Sotaro Kita ◽  
Ingeborg van Gijn ◽  
Harry van der Hulst

Since Battison (1978), it has been noted in many signed languages that the Symmetry Condition constrains the form of two-handed signs in which two hands move independently. The Condition states that the form features (e.g., the handshapes and movements) of the two hands are ‘symmetrical’. The Symmetry Condition has been regarded in the literature as a part of signed language phonology. In this study, we examine the linguistic status of the Symmetry Condition by comparing the degree of symmetry in signs from Sign Language of the Netherlands and speech-accompanying representational gestures produced by Dutch speakers. Like signed language, such gestures use hand movements to express concepts, but they do not constitute a linguistic system in their own right. We found that the Symmetry Condition holds equally well for signs and spontaneous gestures. This indicates that this condition is a general cognitive constraint, rather than a constraint specific to language. We suggest that the Symmetry Condition is a manifestation of the mind having one active ‘mental articulator’ when expressing a concept with hand movements.


2016 ◽  
Vol 28 (1) ◽  
pp. 20-40 ◽  
Author(s):  
Velia Cardin ◽  
Eleni Orfanidou ◽  
Lena Kästner ◽  
Jerker Rönnberg ◽  
Bencie Woll ◽  
...  

The study of signed languages allows the dissociation of sensorimotor and cognitive neural components of the language signal. Here we investigated the neurocognitive processes underlying the monitoring of two phonological parameters of sign languages: handshape and location. Our goal was to determine if brain regions processing sensorimotor characteristics of different phonological parameters of sign languages were also involved in phonological processing, with their activity being modulated by the linguistic content of manual actions. We conducted an fMRI experiment using manual actions varying in phonological structure and semantics: (1) signs of a familiar sign language (British Sign Language), (2) signs of an unfamiliar sign language (Swedish Sign Language), and (3) invented nonsigns that violate the phonological rules of British Sign Language and Swedish Sign Language or consist of nonoccurring combinations of phonological parameters. Three groups of participants were tested: deaf native signers, deaf nonsigners, and hearing nonsigners. Results show that the linguistic processing of different phonological parameters of sign language is independent of the sensorimotor characteristics of the language signal. Handshape and location were processed by different perceptual and task-related brain networks but recruited the same language areas. The semantic content of the stimuli did not influence this process, but phonological structure did, with nonsigns being associated with longer RTs and stronger activations in an action observation network in all participants and in the supramarginal gyrus exclusively in deaf signers. These results suggest higher processing demands for stimuli that contravene the phonological rules of a signed language, independently of previous knowledge of signed languages. We suggest that the phonological characteristics of a language may arise as a consequence of more efficient neural processing for its perception and production.


2021 ◽  
Author(s):  
Lorna C Quandt ◽  
Athena Willis ◽  
Carly Leannah

Signed language users communicate in a wide array of sub-optimal environments, such as in dim lighting or from a distance. While fingerspelling is a common and essential part of signed languages, the perception of fingerspelling in varying visual environments is not well understood. Signed languages such as American Sign Language (ASL) rely on visuospatial information that combines hand and bodily movements, facial expressions, and fingerspelling. Linguistic information in ASL is conveyed with movement and spatial patterning, which lends itself well to using dynamic Point Light Display (PLD) stimuli to represent sign language movements. We created PLD videos of fingerspelled location names. The location names were either Real (e.g., KUWAIT) or Pseudo-names (e.g., CLARTAND), and the PLDs showed either a High or a Low number of markers. In an online study, Deaf and Hearing ASL users (total N = 283) watched 27 PLD stimulus videos that varied by Realness and Number of Markers. We calculated accuracy and confidence scores in response to each video. We predicted that when signers see ASL fingerspelled letter strings in a suboptimal visual environment, language experience in ASL will be positively correlated with accuracy and self-rated confidence scores. We also predicted that Real location names would be understood better than Pseudo names. Our findings show that participants were more accurate and confident in response to Real place names than Pseudo names and for stimuli with High rather than Low markers. We also discovered a significant interaction between Age and Realness, which shows that as people age, they can better use outside world knowledge to inform their fingerspelling success. Finally, we examined the accuracy and confidence in fingerspelling perception in sub-groups of people who had learned ASL before the age of four. 
Studying the relationship between language experience and PLD fingerspelling perception allows us to explore how hearing status, ASL fluency levels, and age of language acquisition affect the core abilities of understanding fingerspelling.


Gesture ◽  
2013 ◽  
Vol 13 (1) ◽  
pp. 1-27 ◽  
Author(s):  
Rachel Sutton-Spence ◽  
Donna Jo Napoli

Sign Language poetry is especially valued for its presentation of strong visual images. Here, we explore the highly visual signs that British Sign Language and American Sign Language poets create as part of the ‘classifier system’ of their languages. Signed languages, as they create visually-motivated messages, utilise categoricity (more traditionally considered ‘language’) and analogy (more traditionally considered extra-linguistic and the domain of ‘gesture’). Classifiers in sign languages arguably show both these characteristics (Oviedo, 2004). In our discussion of sign language poetry, we see that poets take elements that are widely understood to be highly visual, closely representing their referents, and make them even more highly visual — so going beyond categorisation and into new areas of analogue.


Gesture ◽  
2016 ◽  
Vol 15 (2) ◽  
pp. 192-223 ◽  
Author(s):  
Orit Fuks

The study describes certain structural modifications applied to the citation forms of ISL during signing for intensification purposes. In signed languages, citation forms are considered relatively immune to modification. Nine signers signed several scenarios describing some intense quality. The signers used conventional adverbs existing in ISL for intensification purposes. Yet they also applied idiosyncratic modifications to the formational components of adjectives simultaneously with their realization. These optional modifications enriched the messages beyond what the conventional forms alone conveyed. They show that signers can incorporate gradient modes of expression directly into the production of lexical items to communicate more diverse and explicit messages in context. Using a comparative semiotic approach allowed us to describe the synergetic cooperation, manifested at the stage of utterance construction, between formational elements that were well suited to conveying gradient and analog meanings in context and those that were less suited and thus not modified.


Gesture ◽  
2001 ◽  
Vol 1 (1) ◽  
pp. 51-72 ◽  
Author(s):  
Evelyn McClave

This paper presents evidence of non-manual gestures in American Sign Language (ASL). The types of gestures identified are identical to non-manual, spontaneous gestures used by hearing non-signers, which suggests that the gestures co-occurring with ASL signs are borrowings from hearing culture. A comparison of direct quotes in ASL with spontaneous movements of hearing non-signers suggests a history of borrowing and eventual grammaticization in ASL of features previously thought to be unique to signed languages. The electronic edition of this article includes audio-visual data.


Multilingua ◽  
2016 ◽  
Vol 35 (3) ◽  
Author(s):  
Elizabeth S. Parks

In this paper, I use a holographic metaphor to explain the identification of overlapping sign language communities in Panama. By visualizing Panama’s complex signing communities as emitting community “hotspots” through social drama on multiple stages, I employ ethnographic methods to explore overlapping contours of Panama’s sign language communities in both time and space, similar to what a hologram accomplishes. Based on rapid appraisal of Panama’s signed languages through 2 weeks of participant observation, interviews, and lexical comparisons, and contextualization of this data in a broad 5-year project that included fieldwork in 15 countries in Latin America and the Caribbean, I propose recognition of overlapping Chiriquí and Panamanian Signing Communities using distinct signed languages: Lengua de Señas Panameñas and Lengua de Señas de Chiriquí.


Author(s):  
Cayley Guimarães ◽  
Rita Cassia Maestri

Sign language is fundamental for Deaf communication, culture, and citizenship. Brazilian Sign Language (Libras) is a complete linguistic system of visual-spatial modality, with specificities that present a challenge for teaching and learning it as an L2. Non-manual expression is an attribute of the language used for meaning attribution. Meaning attribution occurs through visual symbolic processes in which non-manual expressions acquire a central role, and it differs from the processes used in oral language. This requires adequate educational practices and pedagogical material for the acquisition of Libras as an L2. This research proposes a learning object and a methodology for the teaching and learning of Libras in the form of a game that focuses on non-manual expressions. The proposed methodology comprises the context, the theme of the communication situation, and the utterance of the sign in Libras, along with learning fixation activities. Validation shows the importance of valuing the grammar of Libras as a pedagogical strategy adequate for teaching and learning non-manual expression in Libras.


Author(s):  
Anne Therese Frederiksen ◽  
Rachel I. Mayberry

Implicit causality (IC) biases, the tendency of certain verbs to elicit re-mention of either the first-mentioned noun phrase (NP1) or the second-mentioned noun phrase (NP2) from the previous clause, are important in psycholinguistic research. Understanding IC verbs and the source of their biases in signed as well as spoken languages helps elucidate whether these phenomena are language general or specific to the spoken modality. As the first of its kind, this study investigates IC biases in American Sign Language (ASL) and provides IC bias norms for over 200 verbs, facilitating future psycholinguistic studies of ASL and comparisons of spoken versus signed languages. We investigated whether native ASL signers continued sentences with IC verbs (e.g., ASL equivalents of ‘Lisa annoys Maya because…’) by mentioning NP1 (i.e., Lisa) or NP2 (i.e., Maya). We found a tendency towards more NP2-biased verbs. Previous work has found that a verb’s thematic roles predict bias direction: stimulus-experiencer verbs (e.g., ‘annoy’), where the first argument is the stimulus (causing annoyance) and the second argument is the experiencer (experiencing annoyance), elicit more NP1 continuations. Verbs with experiencer-stimulus thematic roles (e.g., ‘love’) elicit more NP2 continuations. We probed whether the trend towards more NP2-biased verbs was related to an existing claim that stimulus-experiencer verbs do not exist in sign languages. We found that stimulus-experiencer structure, while permitted, is infrequent, impacting the IC bias distribution in ASL. Nevertheless, thematic roles predict IC bias in ASL, suggesting that the thematic role-IC bias relationship is stable across languages as well as modalities.
