Mouth gestures in British Sign Language

2011 ◽  
Vol 14 (1) ◽  
pp. 94-114 ◽  
Author(s):  
Donna Lewin ◽  
Adam C. Schembri

This article investigates the claim that tongue protrusion (‘th’) acts as a nonmanual adverbial morpheme in British Sign Language (BSL) (Brennan 1992; Sutton-Spence & Woll 1999), drawing on narrative data produced by two deaf native signers as part of the European Cultural Heritage Online (ECHO) corpus. Data from ten BSL narratives have been analysed to observe the frequency and form of tongue protrusion. The results from this preliminary investigation indicate that tongue protrusion occurs as part of the phonological formation of lexical signs (i.e., ‘echo phonology’; see Woll 2001), as well as functioning as a separate meaningful unit that co-occurs (sometimes as part of constructed action) with classifier constructions and lexical verb signs. In the latter cases, the results suggest that ‘th’ sometimes appears to function as an adverbial morpheme in BSL, but with a greater variety of meanings than previously suggested in the BSL literature. One use of the adverbial appears similar to a nonmanual signal in American Sign Language described by Liddell (1980), although the form of the mouth gesture in our BSL data differs from what is reported in Liddell’s work. Thus, these findings suggest that the mouth gesture ‘th’ in BSL has a broad range of functions. Some uses of tongue protrusion, however, remain difficult to categorise, and further research with a larger dataset is needed.

Gesture ◽  
2013 ◽  
Vol 13 (1) ◽  
pp. 1-27 ◽  
Author(s):  
Rachel Sutton-Spence ◽  
Donna Jo Napoli

Sign Language poetry is especially valued for its presentation of strong visual images. Here, we explore the highly visual signs that British Sign Language and American Sign Language poets create as part of the ‘classifier system’ of their languages. Signed languages, as they create visually-motivated messages, utilise categoricity (more traditionally considered ‘language’) and analogy (more traditionally considered extra-linguistic and the domain of ‘gesture’). Classifiers in sign languages arguably show both these characteristics (Oviedo, 2004). In our discussion of sign language poetry, we see that poets take elements that are widely understood to be highly visual, closely representing their referents, and make them even more highly visual — so going beyond categorisation and into new areas of analogue.


Sign languages are visual languages that use hand, facial, and body movements as a means of communication. There are over 135 sign languages around the world, including American Sign Language (ASL), Indian Sign Language (ISL), and British Sign Language (BSL). Sign language is commonly the main form of communication for people who are deaf or hard of hearing, but sign languages also have much to offer everyone else. Our proposed system is a web application with two modules: the first accepts information in natural language (input text) and displays the corresponding information as sign language images (in GIF format); the second accepts information in sign language (an input hand gesture for any ASL letter), detects the letter, and displays it as output text. The system is built to bridge the communication gap between deaf and hearing people: those who do not know ASL can use it either to learn the sign language or to communicate with someone who knows it. This approach supports quick communication without waiting for a human interpreter to translate the sign language. The application is developed using the Django and Flask frameworks and incorporates natural language processing (NLP) and a neural network. We focus on improving the living standards of deaf and hard-of-hearing people, for whom everyday tasks can be very difficult when the people around them do not know sign language. The application can also serve as a teaching tool for relatives and friends of deaf people, as well as anyone interested in learning sign language.
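To make the first module's pipeline concrete, below is a minimal sketch of the text-to-sign step: input text is tokenized, words with a dedicated sign GIF are mapped directly, and anything else falls back to letter-by-letter fingerspelling images. The asset layout and all names here (GIF_DIR, LETTER_DIR, PHRASE_GIFS, text_to_gif_sequence) are hypothetical illustrations, not the published system's actual code.

# Hypothetical sketch of the text-to-sign module: map input text to an
# ordered list of sign-image files (GIFs), falling back to per-letter
# fingerspelling images when no whole-word GIF exists.
from pathlib import Path

GIF_DIR = Path("static/gifs")        # assumed layout: static/gifs/<word>.gif
LETTER_DIR = Path("static/letters")  # assumed layout: static/letters/<a-z>.gif

# Words assumed to have dedicated GIFs in the asset directory.
PHRASE_GIFS = {"hello", "thank", "please"}

def text_to_gif_sequence(text: str) -> list[Path]:
    """Return the ordered list of image files to display for `text`."""
    sequence: list[Path] = []
    for word in text.lower().split():
        if word in PHRASE_GIFS:
            sequence.append(GIF_DIR / f"{word}.gif")
        else:
            # Fingerspell: one letter image per alphabetic character.
            sequence.extend(LETTER_DIR / f"{ch}.gif" for ch in word if ch.isalpha())
    return sequence

if __name__ == "__main__":
    for frame in text_to_gif_sequence("hello world"):
        print(frame)

In a Django or Flask view, the returned file paths would simply be handed to the template for sequential display; the gesture-recognition module runs in the opposite direction, classifying a hand image into an ASL letter with the trained neural network.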


2020 ◽  
Vol 37 (4) ◽  
pp. 571-608 ◽
Author(s):  
Diane Brentari ◽  
Laura Horton ◽  
Susan Goldin-Meadow

Abstract Two differences between signed and spoken languages that have been widely discussed in the literature are: the degree to which morphology is expressed simultaneously (rather than sequentially), and the degree to which iconicity is used, particularly in predicates of motion and location, often referred to as classifier predicates. In this paper we analyze a set of properties marking agency and number in four sign languages for their crosslinguistic similarities and differences regarding simultaneity and iconicity. Data from American Sign Language (ASL), Italian Sign Language (LIS), British Sign Language (BSL), and Hong Kong Sign Language (HKSL) are analyzed. We find that iconic, cognitive, phonological, and morphological factors contribute to the distribution of these properties. We conduct two analyses—one of verbs and one of verb phrases. The analysis of classifier verbs shows that, as expected, all four languages exhibit many common formal and iconic properties in the expression of agency and number. The analysis of classifier verb phrases (VPs)—particularly, multiple-verb predicates—reveals (a) that it is grammatical in all four languages to express agency and number within a single verb, but also (b) that there is crosslinguistic variation in expressing agency and number across the four languages. We argue that this variation is motivated by how each language prioritizes, or ranks, several constraints. The rankings can be captured in Optimality Theory. Some constraints in this account, such as a constraint to be redundant, are found in all information systems and might be considered non-linguistic; however, the variation in constraint ranking in verb phrases reveals the grammatical and arbitrary nature of linguistic systems.
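The constraint-ranking account can be illustrated with a toy Optimality Theory evaluator: each candidate form accrues violation counts on each constraint, the counts are ordered by the language's ranking, and lexicographic comparison picks the winner. The constraint names and violation profiles below are invented for illustration (loosely echoing the paper's mention of a redundancy constraint); they are not the authors' actual analysis.

# Toy Optimality Theory evaluation: a candidate's violation counts, ordered
# by constraint ranking, form a tuple; the lexicographically smallest wins.

# Hypothetical ranking for one language; the paper argues that languages
# differ in how such constraints are ordered.
RANKING = ["*MultiVerb", "BeRedundant"]

# Invented candidates with invented violation counts.
CANDIDATES = {
    "single-verb predicate":   {"*MultiVerb": 0, "BeRedundant": 1},
    "multiple-verb predicate": {"*MultiVerb": 1, "BeRedundant": 0},
}

def optimal(candidates: dict, ranking: list[str]) -> str:
    """Return the candidate whose ranked violation tuple is smallest."""
    return min(candidates, key=lambda c: tuple(candidates[c][k] for k in ranking))

print(optimal(CANDIDATES, RANKING))                        # single-verb predicate
print(optimal(CANDIDATES, ["BeRedundant", "*MultiVerb"]))  # multiple-verb predicate

Re-ranking the same two constraints flips the winner, which is the kind of grammatical, language-particular variation the authors capture by assigning different rankings to ASL, LIS, BSL, and HKSL.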


2016 ◽  
Vol 28 (1) ◽  
pp. 20-40 ◽  
Author(s):  
Velia Cardin ◽  
Eleni Orfanidou ◽  
Lena Kästner ◽  
Jerker Rönnberg ◽  
Bencie Woll ◽  
...  

The study of signed languages allows the dissociation of sensorimotor and cognitive neural components of the language signal. Here we investigated the neurocognitive processes underlying the monitoring of two phonological parameters of sign languages: handshape and location. Our goal was to determine if brain regions processing sensorimotor characteristics of different phonological parameters of sign languages were also involved in phonological processing, with their activity being modulated by the linguistic content of manual actions. We conducted an fMRI experiment using manual actions varying in phonological structure and semantics: (1) signs of a familiar sign language (British Sign Language), (2) signs of an unfamiliar sign language (Swedish Sign Language), and (3) invented nonsigns that violate the phonological rules of British Sign Language and Swedish Sign Language or consist of nonoccurring combinations of phonological parameters. Three groups of participants were tested: deaf native signers, deaf nonsigners, and hearing nonsigners. Results show that the linguistic processing of different phonological parameters of sign language is independent of the sensorimotor characteristics of the language signal. Handshape and location were processed by different perceptual and task-related brain networks but recruited the same language areas. The semantic content of the stimuli did not influence this process, but phonological structure did, with nonsigns being associated with longer reaction times (RTs) and stronger activations in an action observation network in all participants and in the supramarginal gyrus exclusively in deaf signers. These results suggest higher processing demands for stimuli that contravene the phonological rules of a signed language, independently of previous knowledge of signed languages. We suggest that the phonological characteristics of a language may arise as a consequence of more efficient neural processing for its perception and production.


2021 ◽  
Vol 7 (2) ◽  
pp. 156-171 ◽
Author(s):  
Ilaria Berteletti ◽  
SaraBeth J. Sullivan ◽  
Lucas Lancaster

With two simple experiments we investigate the overlooked influence of handshape similarity on the processing of numerical information conveyed on the hands. In most finger-counting sequences there is a tight relationship between the number of fingers raised and the numerical value represented. This creates a possible confound whereby numbers closer to each other are also represented by handshapes that are more similar. By using the American Sign Language (ASL) number signs, we are able to dissociate the two variables orthogonally. First, we test the effect of handshape similarity in a same/different judgment task in a group of hearing non-signers, and then we test the interference of handshape in a number judgment task in a group of native ASL signers. Our results show an effect of handshape similarity and its interaction with numerical value even in the group of native signers, for whom these handshapes are linguistic symbols and not a learning tool for acquiring numerical concepts. Because prior studies have never considered handshape similarity, these results open new directions for understanding the relationship between finger-based counting, internal hand representations, and numerical proficiency.


2021 ◽  
pp. 1-12 ◽
Author(s):  
William Matchin ◽  
Deniz İlkbaşaran ◽  
Marla Hatrak ◽  
Austin Roth ◽  
Agnes Villwock ◽  
...  

Abstract Areas within the left-lateralized neural network for language have been found to be sensitive to syntactic complexity in spoken and written language. Previous research has revealed that these areas are active for sign language as well, but whether they are specifically responsive to syntactic complexity in sign language, independent of lexical processing, has yet to be established. To investigate this question, we used fMRI to neuroimage deaf native signers' comprehension of 180 sign strings in American Sign Language (ASL) with a picture-probe recognition task. The ASL strings were all six signs in length but varied across three levels of syntactic complexity: sign lists, two-word sentences, and complex sentences. Syntactic complexity significantly affected comprehension and memory, both behaviorally and neurally: it improved accuracy and response time on the picture-probe recognition task and elicited a left-lateralized activation pattern in anterior and posterior superior temporal sulcus (aSTS and pSTS). Minimal or absent syntactic structure reduced picture-probe recognition and elicited activation in bilateral pSTS and occipital-temporal cortex. These results provide evidence from a sign language, ASL, that the combinatorial processing of aSTS and pSTS is supramodal in nature. The results further suggest that the neurolinguistic processing of ASL is characterized by overlapping and separable neural systems for syntactic and lexical processing.

