British Sign Language Vocabulary Test--American Sign Language Version

2016 ◽  
Author(s):  
Wolfgang Mann ◽  
Penny Roy ◽  
Gary Morgan

Gesture ◽  
2013 ◽  
Vol 13 (1) ◽  
pp. 1-27 ◽  
Author(s):  
Rachel Sutton-Spence ◽  
Donna Jo Napoli

Sign Language poetry is especially valued for its presentation of strong visual images. Here, we explore the highly visual signs that British Sign Language and American Sign Language poets create as part of the ‘classifier system’ of their languages. Signed languages, as they create visually-motivated messages, utilise categoricity (more traditionally considered ‘language’) and analogy (more traditionally considered extra-linguistic and the domain of ‘gesture’). Classifiers in sign languages arguably show both these characteristics (Oviedo, 2004). In our discussion of sign language poetry, we see that poets take elements that are widely understood to be highly visual, closely representing their referents, and make them even more highly visual — so going beyond categorisation and into new areas of analogue.


2011 ◽  
Vol 14 (1) ◽  
pp. 94-114 ◽  
Author(s):  
Donna Lewin ◽  
Adam C. Schembri

This article investigates the claim that tongue protrusion (‘th’) acts as a nonmanual adverbial morpheme in British Sign Language (BSL) (Brennan 1992; Sutton-Spence & Woll 1999) drawing on narrative data produced by two deaf native signers as part of the European Cultural Heritage Online (ECHO) corpus. Data from ten BSL narratives have been analysed to observe the frequency and form of tongue protrusion. The results from this preliminary investigation indicate tongue protrusion occurs as part of the phonological formation of lexical signs (i.e., ‘echo phonology’, see Woll 2001), as well as a separate meaningful unit that co-occurs (sometimes as part of constructed action) with classifier constructions and lexical verb signs. In the latter cases, the results suggest ‘th’ sometimes appears to function as an adverbial morpheme in BSL, but with a greater variety of meanings than previously suggested in the BSL literature. One use of the adverbial appears similar to a nonmanual signal in American Sign Language described by Liddell (1980), although the form of the mouth gesture in our BSL data differs from what is reported in Liddell’s work. Thus, these findings suggest the mouth gesture ‘th’ in BSL has a broad range of functions. Some uses of tongue protrusion, however, remain difficult to categorise and further research with a larger dataset is needed.


Sign languages are visual languages that use hand, facial, and body movements as a means of communication. There are over 135 sign languages around the world, including American Sign Language (ASL), Indian Sign Language (ISL), and British Sign Language (BSL). Sign language is commonly the main form of communication for people who are Deaf or hard of hearing, but sign languages also have much to offer everyone. Our proposed system is a web application with two modules: the first accepts information in natural language (input text) and displays the corresponding information as sign language images (GIF format); the second accepts information in sign language (an input hand gesture for any ASL letter), detects the letter, and displays it as text output. The system is built to bridge the communication gap between deaf people and hearing people: those who do not know American Sign Language can use it either to learn sign language or to communicate with someone who does know it. This approach enables quick communication without waiting for a human interpreter to translate the sign language. The application is developed with the Django and Flask frameworks and incorporates natural language processing and a neural network. We focus on improving the living standards of hearing-impaired people, for whom everyday tasks can be very difficult when those around them do not know sign language. The application can also serve as a teaching tool for relatives and friends of deaf people, as well as anyone interested in learning sign language.
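The first module described above (natural-language text to sign-language GIFs) can be sketched as a simple lookup with a fingerspelling fallback. This is a minimal illustrative sketch, not the authors' actual implementation: the `signs/` directory and the word-to-file naming convention are assumptions introduced here for the example.

```python
# Minimal sketch of the text-to-sign-image mapping: words that have a
# whole-sign GIF (e.g. "hello.gif") use it directly; anything else is
# fingerspelled letter by letter from per-letter GIFs ("h.gif", "i.gif", ...).
import os

def text_to_sign_files(text, sign_dir="signs"):
    """Map input text to an ordered list of sign-image file names."""
    files = []
    for word in text.lower().split():
        # Keep only letters so punctuation does not break the lookup.
        word = "".join(ch for ch in word if ch.isalpha())
        if not word:
            continue
        whole = f"{word}.gif"
        if os.path.exists(os.path.join(sign_dir, whole)):
            files.append(whole)          # whole-word sign available
        else:
            files.extend(f"{ch}.gif" for ch in word)  # fingerspell
    return files
```

In a Django or Flask view, a function like this would feed the resulting file names into a template that plays the GIFs in sequence; the second module (letter recognition from a hand-gesture image) would instead pass the input frame through the trained neural network classifier.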


2018 ◽  
Author(s):  
Leslie Pertz ◽  
Missy Plegue ◽  
Kathleen Diehl ◽  
Philip Zazove ◽  
Michael McKee

2020 ◽  
Vol 37 (4) ◽  
pp. 571-608
Author(s):  
Diane Brentari ◽  
Laura Horton ◽  
Susan Goldin-Meadow

Two differences between signed and spoken languages that have been widely discussed in the literature are: the degree to which morphology is expressed simultaneously (rather than sequentially), and the degree to which iconicity is used, particularly in predicates of motion and location, often referred to as classifier predicates. In this paper we analyze a set of properties marking agency and number in four sign languages for their crosslinguistic similarities and differences regarding simultaneity and iconicity. Data from American Sign Language (ASL), Italian Sign Language (LIS), British Sign Language (BSL), and Hong Kong Sign Language (HKSL) are analyzed. We find that iconic, cognitive, phonological, and morphological factors contribute to the distribution of these properties. We conduct two analyses—one of verbs and one of verb phrases. The analysis of classifier verbs shows that, as expected, all four languages exhibit many common formal and iconic properties in the expression of agency and number. The analysis of classifier verb phrases (VPs)—particularly, multiple-verb predicates—reveals (a) that it is grammatical in all four languages to express agency and number within a single verb, but also (b) that there is crosslinguistic variation in expressing agency and number across the four languages. We argue that this variation is motivated by how each language prioritizes, or ranks, several constraints. The rankings can be captured in Optimality Theory. Some constraints in this account, such as a constraint to be redundant, are found in all information systems and might be considered non-linguistic; however, the variation in constraint ranking in verb phrases reveals the grammatical and arbitrary nature of linguistic systems.


2014 ◽  
Author(s):  
Katherine Rogers ◽  
Chris Evans ◽  
Malcolm Campbell ◽  
Alys Young ◽  
Karina Lovell

1976 ◽  
Vol 41 (3) ◽  
pp. 339-347 ◽  
Author(s):  
John D. Bonvillian ◽  
Keith E. Nelson

A mute autistic boy learned to communicate extensively through American Sign Language. Over a six-month period he produced many spontaneous signs and sign combinations, and analyses of the child’s sign combinations indicated the presence of a full range of semantic relations. Further evidence of conceptual progress was provided by the child’s increased score on the Peabody Picture Vocabulary Test. In addition, parents' and teacher’s reports indicated that the child’s social behavior improved. The extent of the boy’s linguistic progress and associated improvement in social behavior markedly exceeds that usually reported for mute autistic children.
