“It is natural, really deaf signing” – script development for fictional programmes involving sign languages

Multilingua ◽  
2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Annelies Kusters ◽  
Jordan Fenlon

Abstract Historically, fictional productions which use sign language have often begun with scripts that use the written form of a spoken language. This can be a challenge for deaf actors, as they must translate the written word into a performed sign language text. Here, we explore script development in Small World, a television comedy which attempted to avoid this challenge by using improvisation to create its script. The creators framed this process as a response to what they saw as “inauthentic” sign language use on television, foregrounding the need to present “natural signing” on the screen. According to them, “natural signing” is not influenced by an English script but is varied language use that reflects a character’s background, their setting, and the characters they interact with. We describe how this authentic language use is derived primarily from improvisation and is in competition with other demands, which are textual (e.g., the need to ensure comedic value), studio-based (e.g., operating within the practical confines of the studio), or related to audience design (e.g., the need to ensure comprehensibility). We discuss how the creative team negotiated the tension between the quest for authentic language use and characteristics of the genre, medium, and audience.

1999 ◽  
Vol 2 (2) ◽  
pp. 187-215 ◽  
Author(s):  
Wendy Sandler

In natural communication, the medium through which language is transmitted plays an important and systematic role. Sentences are broken up rhythmically into chunks; certain elements receive special stress; and, in spoken language, intonational tunes are superimposed onto these chunks in particular ways — all resulting in an intricate system of prosody. Investigations of prosody in Israeli Sign Language demonstrate that sign languages have comparable prosodic systems to those of spoken languages, although the phonetic medium is completely different. Evidence for the prosodic word and for the phonological phrase in ISL is examined here within the context of the relationship between the medium and the message. New evidence is offered to support the claim that facial expression in sign languages corresponds to intonation in spoken languages, and the term “superarticulation” is coined to describe this system in sign languages. Interesting formal differences between the intonational tunes of spoken language and the “superarticulatory arrays” of sign language are shown to offer a new perspective on the relation between the phonetic basis of language, its phonological organization, and its communicative content.


2018 ◽  
Vol 44 (3-4) ◽  
pp. 123-208 ◽  
Author(s):  
Philippe Schlenker

Abstract While it is now accepted that sign languages should inform and constrain theories of ‘Universal Grammar’, their role in ‘Universal Semantics’ has been under-studied. We argue that they have a crucial role to play in the foundations of semantics, for two reasons. First, in some cases sign languages provide overt evidence on crucial aspects of the Logical Form of sentences, ones that are only inferred indirectly in spoken language. For instance, sign language ‘loci’ are positions in signing space that can arguably realize logical variables, and the fact that they are overt makes it possible to revisit foundational debates about the syntactic reality of variables, about mechanisms of temporal and modal anaphora, and about the existence of dynamic binding. Another example pertains to mechanisms of ‘context shift’, which were postulated on the basis of indirect evidence in spoken language, but which are arguably overt in sign language. Second, along one dimension sign languages are strictly more expressive than spoken languages because iconic phenomena can be found at their logical core. This applies to loci themselves, which may simultaneously function as logical variables and as schematic pictures of what they denote (context shift comes with some iconic requirements as well). As a result, the semantic system of spoken languages can in some respects be seen as a simplified version of the richer semantics found in sign languages. Two conclusions could be drawn from this observation. One is that the full extent of Universal Semantics can only be studied in sign languages. An alternative possibility is that spoken languages have comparable expressive mechanisms, but only when co-speech gestures are taken into account (as recently argued by Goldin-Meadow and Brentari). Either way, sign languages have a crucial role to play in investigations of the foundations of semantics.


2019 ◽  
Vol 39 (4) ◽  
pp. 367-395 ◽  
Author(s):  
Matthew L. Hall ◽  
Wyatte C. Hall ◽  
Naomi K. Caselli

Deaf and Hard of Hearing (DHH) children need to master at least one language (spoken or signed) to reach their full potential. Providing access to a natural sign language supports this goal. Despite evidence that natural sign languages are beneficial to DHH children, many researchers and practitioners advise families to focus exclusively on spoken language. We critique the Pediatrics article ‘Early Sign Language Exposure and Cochlear Implants’ (Geers et al., 2017) as an example of research that makes unsupported claims against the inclusion of natural sign languages. We refute the claims that (1) sign language has harmful effects and (2) listening and spoken language are necessary for the optimal development of deaf children. While practical challenges remain (and are discussed) for providing a sign language-rich environment, research evidence suggests that such challenges are worth tackling in light of natural sign languages providing a host of benefits for DHH children – especially in the prevention and reduction of language deprivation.


Sign language is the primary method of communication for hearing- and speech-impaired people around the world. Most speech- and hearing-impaired people know a single sign language, so there is an increasing demand for sign language interpreters. For hearing people, learning a sign language is difficult, and for a speech- and hearing-impaired person, learning a spoken language may be impossible. A great deal of research is being done in the domain of automatic sign language recognition. Different methods, such as computer vision, data gloves, and depth sensors, can be used to train a computer to interpret sign language. Interpretation is carried out from sign to text, text to sign, speech to sign, and sign to speech. Because different countries use different sign languages, signers of different sign languages are unable to communicate with each other. Analyzing the characteristic features of gestures provides insights into a sign language, and identifying features common to sign language gestures helps in designing a sign language recognition system. Such a system can help reduce the communication gap between sign language users and spoken language users.
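As a rough illustration of the gesture-feature idea mentioned above, the sketch below classifies a single hand posture from pre-extracted landmark coordinates. It is a minimal sketch under stated assumptions, not part of any of the cited studies: it assumes a vision front end (camera, data glove, or depth-sensor pipeline) has already produced 21 (x, y) hand landmarks per frame, and the nearest-neighbour matching and all names are illustrative.

```python
# Minimal sketch of landmark-based gesture classification (illustrative only).
# Assumes a vision front end has already produced 21 (x, y) hand landmarks
# per frame; a real system would use recorded data and a trained classifier.
import numpy as np

def normalise(landmarks: np.ndarray) -> np.ndarray:
    """Translate to the wrist and rescale so the feature vector describes
    hand shape rather than hand position in the camera frame."""
    centred = landmarks - landmarks[0]          # landmark 0 = wrist
    scale = np.linalg.norm(centred) or 1.0      # avoid division by zero
    return (centred / scale).flatten()

def classify(frame: np.ndarray, templates: dict[str, np.ndarray]) -> str:
    """Nearest-neighbour match of one frame against stored sign templates."""
    feats = normalise(frame)
    return min(templates,
               key=lambda sign: np.linalg.norm(feats - normalise(templates[sign])))

# Toy usage with random stand-in data.
rng = np.random.default_rng(0)
templates = {"HELLO": rng.random((21, 2)), "THANK-YOU": rng.random((21, 2))}
frame = templates["HELLO"] + 0.01 * rng.random((21, 2))
print(classify(frame, templates))               # expected: HELLO
```

A practical recognizer would replace the nearest-neighbour step with a trained model and handle sequences of frames rather than single postures, but the normalisation step reflects the kind of shared gesture feature the abstract alludes to.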


2020 ◽  
pp. 016502542095819
Author(s):  
Julia Krebs ◽  
Dietmar Roehm ◽  
Ronnie B. Wilbur ◽  
Evie A. Malaia

Acquisition of natural language has been shown to fundamentally impact both one’s ability to use the first language and the ability to learn subsequent languages later in life. Sign languages offer a unique perspective on this issue because Deaf signers receive access to signed input at varying ages. The majority acquires sign language in (early) childhood, but some learn sign language later—a situation that is drastically different from that of spoken language acquisition. To investigate the effect of age of sign language acquisition and its potential interplay with age in signers, we examined grammatical acceptability ratings and reaction time measures in a group of Deaf signers (age range = 28–58 years) with early (0–3 years) or later (4–7 years) acquisition of sign language in childhood. Behavioral responses to grammatical word order variations (subject–object–verb [SOV] vs. object–subject–verb [OSV]) were examined in sentences that included (1) simple sentences, (2) topicalized sentences, and (3) sentences involving manual classifier constructions, uniquely characteristic of sign languages. Overall, older participants responded more slowly. Age of acquisition had subtle effects on acceptability ratings, whereby the direction of the effect depended on the specific linguistic structure.


PEDIATRICS ◽  
1994 ◽  
Vol 93 (1) ◽  
pp. A62-A62

Just as no one can pinpoint the origins of spoken language in prehistory, the roots of sign language remain hidden from view. What linguists do know is that sign languages have sprung up independently in many different places. Signing probably began with simple gestures, but then evolved into a true language with structured grammar. "In every place we've ever found deaf people, there's sign," says anthropological linguist Bob Johnson. But it's not the same language. "I went to a Mayan village where, out of 400 people, 13 were deaf, and they had their own Mayan Sign - I'd guess it's been maintained for thousands of years." Today at least 50 native sign languages are "spoken" worldwide, all mutually incomprehensible, from British and Israeli Sign to Chinese Sign.


2021 ◽  
Vol 102 ◽  
pp. 01008
Author(s):  
Maria Papatsimouli ◽  
Lazaros Lazaridis ◽  
Konstantinos-Filippos Kollias ◽  
Ioannis Skordas ◽  
George F. Fragulis

Sign Language is used to facilitate communication between Deaf and non-Deaf people. It uses signs (words) with basic structural elements such as handshape, parts of the face, the body or space, and the orientation of the fingers and palm. Sign languages vary from one community to another and from country to country, and they evolve just as spoken languages do. In the current study, an application aimed at Greek Sign Language and English Sign Language learning by hard-of-hearing and hearing people has been developed. The application includes signs grouped in alphabetical order. The user can look up Greek Sign Language signs and English Sign Language signs and translate from Greek Sign Language to English Sign Language. The written word of each sign and the corresponding meaning are displayed. In addition, sound is provided so that users with partial hearing loss can hear the pronunciation of each word. The user is also given various tasks that reinforce the knowledge acquired, mainly multiple-choice tasks incorporating text or video. The application is not a simple sign language dictionary, as it provides for the interactive participation of users; it is a platform for active learning of Greek and English sign language.
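To make the description of the application more concrete, here is a small sketch of the kind of data model and multiple-choice task such a learning platform might use. It is a hypothetical illustration, not the authors' code: the SignEntry class, its field names, and the quiz-building function are all assumptions.

```python
# Hypothetical sketch of a sign-dictionary entry and a multiple-choice task,
# loosely mirroring the features described in the abstract; not the authors' code.
from dataclasses import dataclass
import random

@dataclass
class SignEntry:
    written_word: str      # written form shown to the learner
    meaning: str           # corresponding meaning displayed with the sign
    gsl_video: str         # clip of the Greek Sign Language sign
    other_sl_video: str    # clip of the sign in the second sign language
    audio: str             # pronunciation audio for users with partial hearing loss

def multiple_choice_task(entries: list[SignEntry], target: SignEntry,
                         n_options: int = 4) -> tuple[str, list[str]]:
    """Build one quiz item: show the target sign's video, offer word choices."""
    distractors = [e.written_word for e in entries if e is not target]
    options = random.sample(distractors, n_options - 1) + [target.written_word]
    random.shuffle(options)
    return target.gsl_video, options
```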


2019 ◽  
Vol 16 (1) ◽  
pp. 123-144
Author(s):  
Emilija Mustapić ◽  
Frane Malenica

The paper presents an overview of sign languages and co-speech gestures as two means of communication realised through the visuo-spatial modality. We look at previous research to examine the correlation between spoken and sign language phonology, but also provide an insight into the basic features of co-speech gestures. By analysing these features, we are able to see how these means of communication utilise phases of production (in the case of gestures) or parts of individual signs (in the case of sign languages) to convey or complement the meaning. Recent insights into sign languages as bona fide linguistic systems and co-speech gestures as a system which has no linguistic features but accompanies spoken language have shown that communication does not take place within just a single modality but is rather multimodal. By comparing gestures and sign languages to spoken languages, we are able to trace the transition from systems of communication involving simple form-meaning pairings to fully fledged morphological and syntactic complexities in spoken and sign languages, which gives us a new outlook on the emergence of linguistic phenomena.


2021 ◽  
Vol 8 (3) ◽  
pp. 110-132
Author(s):  
Khunaw Sulaiman Pirot ◽  
Wrya Izaddin Ali

This paper, entitled ‘The Common Misconceptions about Sign Language’, is concerned with the most common misconceptions about sign language. It also deals with sign language and its relation to spoken language. Sign language, primarily used by deaf people, is a fully developed human language that does not use sounds for communication; rather, it is a visual-gestural system that uses the hands, body and facial gestures. One of the misconceptions is that all sign languages are the same worldwide. Such assumptions cause problems. Accordingly, some questions have been raised: first, is sign language universal? Second, is sign language based on spoken language? And third, is sign language invented by hearing people? The aim of the paper is to reach a deeper understanding of sign language. It also demonstrates the similarities and differences between the two modalities: sign language and spoken language. The paper is based on several hypotheses. One is that sign languages are pantomimes and gestures. Another is that the process of language acquisition in sign language for deaf people differs from language acquisition in spoken language for hearing people. To answer the questions raised, a qualitative approach is adopted: data about the subject are collected from books and articles and then analysed to meet the aim of the study. One of the conclusions is that sign language is not universal. It is recommended that further work be carried out on the differences between American Sign Language (ASL) or British Sign Language (BSL) and zmânî âmâžaî kurdî (ZAK, Kurdish Sign Language) at all linguistic levels.


1991 ◽  
Vol 39 ◽  
pp. 75-82
Author(s):  
Beppie van den Bogaerde

Sign Language of the Netherlands (SLN) is considered to be the native language of many prelingually deaf people in the Netherlands. Although research has provided evidence that sign languages are fully fledged natural languages, many misconceptions still abound about sign languages and deaf people. The low status of sign languages all over the world, the attitude of hearing people towards deaf people and their languages, and the resulting attitude of deaf people towards their own languages restricted the development of these languages until recently. Due to the poor results of deaf education and the dissatisfaction amongst educators of the deaf, parents of deaf children and deaf people themselves, a change of attitude towards the function of sign language in interaction with deaf people can be observed; many hearing people who deal with deaf people in one way or another wish to learn the sign language of the deaf community of their country. Many hearing parents of deaf children, teachers of the deaf, student interpreters and linguists are interested in sign language and want to follow a course to improve their signing ability. In order to develop sign language courses, sign language teachers and teaching materials are needed, and precisely these are missing. This is caused by several factors. First, deaf people in general do not receive the same education as hearing people, because they cannot learn the spoken language of their environment well enough to have access to the full educational programme. This prevents them, among other things, from becoming teachers in elementary and secondary schools, or from becoming sign language teachers: although they are fluent "signers", they lack the competence in the spoken language of their country to obtain a teacher's degree in their sign language. A second problem is caused by the fact that sign languages are visual languages: no adequate system has yet been found to write down a sign language, so until now hardly any teaching materials have been available. Sign language courses should be developed with the help of native signers, who should be educated to become language teachers; with their help, and with the help of video material and computer software, it will be possible in future to teach sign languages like any other language. But in order to reach this goal, it is imperative that deaf children get a better education so that they can contribute to the emancipation of their language.

