Age of exposure and subject/object asymmetries when wh-movement goes rightward

Author(s):  
Carlo Cecchetto ◽  
Alessandra Checchetto ◽  
Beatrice Giustolisi ◽  
Mirko Santoro

Abstract We report an experiment addressing the comprehension of LIS interrogatives in three adult populations with different ages of exposure to sign language: native signers, early signers, and late signers. We investigate whether delayed exposure to language affects the comprehension of interrogatives and whether there is an advantage for subject dependencies over object dependencies, as systematically reported for spoken languages. The answer to the first question is positive: there is evidence that natives outperform non-native signers, confirming that the effects of delayed exposure to sign language persist even decades after childhood. However, performance on subject interrogatives was lower than on object interrogatives in all groups of participants. We discuss several possible reasons for this unexpected finding.

2012 ◽  
Vol 15 (2) ◽  
pp. 185-211 ◽  
Author(s):  
Susanne Mohr

The article analyses cross-modal language contact between signed and spoken languages, with special reference to the Irish Deaf community. This is exemplified by an examination of the phenomenon of mouthings in Irish Sign Language (ISL), including their origins, dynamics, forms and functions. The article first discusses the nature of language contact in Deaf communities and the sociolinguistics of the Irish Deaf community; the main part then analyses elicited data in the form of personal stories by twelve native signers from the Republic of Ireland. The major aim of the investigation is to determine whether mouthings are fully integrated into ISL and, if so, whether this integration has ultimately caused language change. Finally, it is asked whether traditional sociolinguistic frameworks of language contact can adequately handle cross-modal language contact between signed and spoken languages.


2016 ◽  
Vol 28 (1) ◽  
pp. 20-40 ◽  
Author(s):  
Velia Cardin ◽  
Eleni Orfanidou ◽  
Lena Kästner ◽  
Jerker Rönnberg ◽  
Bencie Woll ◽  
...  

The study of signed languages allows the dissociation of sensorimotor and cognitive neural components of the language signal. Here we investigated the neurocognitive processes underlying the monitoring of two phonological parameters of sign languages: handshape and location. Our goal was to determine whether brain regions processing sensorimotor characteristics of different phonological parameters of sign languages were also involved in phonological processing, with their activity being modulated by the linguistic content of manual actions. We conducted an fMRI experiment using manual actions varying in phonological structure and semantics: (1) signs of a familiar sign language (British Sign Language), (2) signs of an unfamiliar sign language (Swedish Sign Language), and (3) invented nonsigns that violate the phonological rules of British Sign Language and Swedish Sign Language or consist of nonoccurring combinations of phonological parameters. Three groups of participants were tested: deaf native signers, deaf nonsigners, and hearing nonsigners. Results show that the linguistic processing of different phonological parameters of sign language is independent of the sensorimotor characteristics of the language signal. Handshape and location were processed by different perceptual and task-related brain networks but recruited the same language areas. The semantic content of the stimuli did not influence this process, but phonological structure did, with nonsigns being associated with longer response times (RTs) and stronger activations in an action observation network in all participants, and in the supramarginal gyrus exclusively in deaf signers. These results suggest higher processing demands for stimuli that contravene the phonological rules of a signed language, independently of previous knowledge of signed languages. We suggest that the phonological characteristics of a language may arise as a consequence of more efficient neural processing for its perception and production.


2014 ◽  
Vol 15 (1) ◽  
Author(s):  
Barbara Hänel-Faulhaber ◽  
Nils Skotara ◽  
Monique Kügow ◽  
Uta Salden ◽  
Davide Bottari ◽  
...  

2021 ◽  
Vol 7 (2) ◽  
pp. 156-171
Author(s):  
Ilaria Berteletti ◽  
SaraBeth J. Sullivan ◽  
Lucas Lancaster

With two simple experiments we investigate the overlooked influence of handshape similarity on the processing of numerical information conveyed on the hands. In most finger-counting sequences there is a tight relationship between the number of fingers raised and the numerical value represented. This creates a possible confound, because numbers closer to each other are also represented by handshapes that are more similar. By using American Sign Language (ASL) number signs we are able to dissociate the two variables orthogonally. First, we test the effect of handshape similarity in a same/different judgment task in a group of hearing non-signers, and then we test the interference of handshape in a number judgment task in a group of native ASL signers. Our results show an effect of handshape similarity and its interaction with numerical value even in the group of native signers, for whom these handshapes are linguistic symbols rather than a learning tool for acquiring numerical concepts. Because prior studies have never considered handshape similarity, these results open new directions for understanding the relationship between finger-based counting, internal hand representations, and numerical proficiency.
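
The confound described in this abstract can be made concrete with a toy calculation: if handshapes are coded as sets of extended fingers, numerical distance and handshape dissimilarity are strongly correlated in a canonical finger-counting sequence, whereas the ASL number signs can break that link. The sketch below is illustrative only; the finger codings (especially for ASL 6 through 9) are rough assumptions and not the feature coding used in the study.

    # Toy check of the confound: compare the correlation between numerical distance
    # and handshape dissimilarity for finger counting vs. ASL number signs.
    # Finger codings are rough, illustrative assumptions.
    from itertools import combinations
    from statistics import correlation  # Python >= 3.10

    # Handshapes as sets of extended fingers: T(humb), I(ndex), M(iddle), R(ing), P(inky).
    finger_counting = {
        1: {"I"}, 2: {"I", "M"}, 3: {"I", "M", "R"},
        4: {"I", "M", "R", "P"}, 5: {"T", "I", "M", "R", "P"},
    }
    asl_numbers = {  # approximate ASL number handshapes, for illustration only
        1: {"I"}, 2: {"I", "M"}, 3: {"T", "I", "M"},
        4: {"I", "M", "R", "P"}, 5: {"T", "I", "M", "R", "P"},
        6: {"I", "M", "R"},   # thumb contacts pinky
        7: {"I", "M", "P"},   # thumb contacts ring finger
        8: {"I", "R", "P"},   # thumb contacts middle finger
        9: {"M", "R", "P"},   # thumb contacts index finger
    }

    def dissimilarity(a, b):
        """Handshape dissimilarity: 1 minus the Jaccard overlap of extended fingers."""
        return 1 - len(a & b) / len(a | b)

    def confound(handshapes):
        """Correlation between numerical distance and handshape dissimilarity."""
        pairs = list(combinations(sorted(handshapes), 2))
        num_dist = [abs(a - b) for a, b in pairs]
        hs_diff = [dissimilarity(handshapes[a], handshapes[b]) for a, b in pairs]
        return correlation(num_dist, hs_diff)

    print("finger counting r =", round(confound(finger_counting), 2))
    print("ASL number signs r =", round(confound(asl_numbers), 2))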


2021 ◽  
pp. 1-12
Author(s):  
William Matchin ◽  
Deniz İlkbaşaran ◽  
Marla Hatrak ◽  
Austin Roth ◽  
Agnes Villwock ◽  
...  

Abstract Areas within the left-lateralized neural network for language have been found to be sensitive to syntactic complexity in spoken and written language. Previous research has revealed that these areas are active for sign language as well, but whether they respond specifically to syntactic complexity in sign language, independent of lexical processing, has yet to be established. To investigate this question, we used fMRI to neuroimage deaf native signers' comprehension of 180 sign strings in American Sign Language (ASL) with a picture-probe recognition task. The ASL strings were all six signs in length but varied across three levels of syntactic complexity: sign lists, two-word sentences, and complex sentences. Syntactic complexity significantly affected comprehension and memory, both behaviorally and neurally, improving accuracy and response time on the picture-probe recognition task and eliciting a left-lateralized activation pattern in anterior and posterior superior temporal sulcus (aSTS and pSTS). Minimal or absent syntactic structure reduced picture-probe recognition and elicited activation in bilateral pSTS and occipital-temporal cortex. These results provide evidence from a sign language, ASL, that the combinatorial processing of aSTS and pSTS is supramodal in nature. The results further suggest that the neurolinguistic processing of ASL is characterized by overlapping yet separable neural systems for syntactic and lexical processing.
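
As background on the "left-lateralized activation pattern" reported here: lateralization of ROI activity is commonly summarized with a laterality index computed over homologous left/right regions. The sketch below applies that generic formula to made-up ROI values; it is not necessarily the analysis performed in this study.

    # Generic laterality-index calculation over homologous left/right ROIs.
    # ROI values are hypothetical and purely illustrative.
    def laterality_index(left, right):
        """LI = (L - R) / (L + R); +1 = fully left-lateralized, -1 = fully right."""
        return (left - right) / (left + right)

    # Hypothetical mean contrast estimates (e.g., complex sentences > sign lists).
    rois = {
        "aSTS": (0.82, 0.31),  # (left, right)
        "pSTS": (1.10, 0.55),
    }
    for name, (left, right) in rois.items():
        print(f"{name}: LI = {laterality_index(left, right):+.2f}")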


2020 ◽  
pp. 026553222092459
Author(s):  
Justyna Kotowicz ◽  
Bencie Woll ◽  
Rosalind Herman

The evaluation of sign language proficiency needs to be based on measures with well-established psychometric properties. To date, no valid and reliable test is available to assess Polish Sign Language (Polski Język Migowy, PJM) skills in deaf children. Hence, our aim with this study was to adapt the British Sign Language Receptive Skills Test (the first standardized test of sign language proficiency in children) into PJM, a less researched sign language. In this paper, we present the first steps in the adaptation process and highlight linguistic and cultural similarities and differences between the British Sign Language Receptive Skills Test and the PJM adaptation. We collected data from 20 deaf children who were native signers (age range: 6 to 12) and 30 deaf children who were late learners of PJM (age range: 6 to 13). Preliminary analyses showed that the PJM Receptive Skills Test has acceptable psychometric characteristics (item analysis, validity, reliability, and sensitivity to age). Our long-term goal with this work is to include younger children (age range: 3 to 6) and to standardize the PJM Receptive Skills Test, so that it can be used in educational settings and in scientific research.
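
For readers unfamiliar with the psychometric checks listed above, the sketch below shows one simple way to compute internal-consistency reliability (Cronbach's alpha), corrected item-total correlations for item analysis, and the correlation of total score with age as a sensitivity check. The data are simulated and the 0/1 item-response layout is an assumption; this is not the authors' analysis pipeline.

    # Sketch of basic psychometric checks: reliability, item analysis, age sensitivity.
    # Data are simulated; the 0/1 item-response matrix layout is an assumption.
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_children, n_items) matrix of 0/1 item scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    def item_total_correlations(items):
        """Correlation of each item with the total score over the remaining items."""
        items = np.asarray(items, dtype=float)
        rests = [np.delete(items, j, axis=1).sum(axis=1) for j in range(items.shape[1])]
        return np.array([np.corrcoef(items[:, j], r)[0, 1] for j, r in enumerate(rests)])

    # Simulated responses: 50 children, 20 items, ability increasing with age.
    rng = np.random.default_rng(0)
    n_children, n_items = 50, 20
    age = rng.integers(6, 14, size=n_children)
    ability = rng.normal(size=n_children) + 0.3 * (age - 9)
    difficulty = np.linspace(-1.5, 1.5, n_items)
    p_correct = 1 / (1 + np.exp(difficulty - ability[:, None]))
    responses = rng.random((n_children, n_items)) < p_correct

    total = responses.sum(axis=1)
    print("Cronbach's alpha:", round(float(cronbach_alpha(responses)), 2))
    print("weak items (r_it < .2):", np.where(item_total_correlations(responses) < 0.2)[0])
    print("total-score vs age r:", round(float(np.corrcoef(age, total)[0, 1]), 2))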


Gesture ◽  
2016 ◽  
Vol 15 (3) ◽  
pp. 291-305 ◽  
Author(s):  
David P. Corina ◽  
Eva Gutierrez

Little is known about how individual signs that occur in naturally produced signed languages are recognized. Here we examine whether sign understanding may be grounded in sensorimotor properties by evaluating signers' ability to make lexical decisions to American Sign Language (ASL) signs that are articulated either congruently or incongruently with the observer's own handedness. Our results show little evidence for handedness congruency effects in native signers' perception of ASL; however, handedness congruency effects were seen in non-native late learners of ASL and in hearing ASL-English bilinguals. The data are compatible with a theory of sign recognition that makes reference to internally simulated articulatory control signals: a forward model based upon the sensorimotor properties of one's own body. The data suggest that sign recognition may rely upon an internal body schema when processing is non-optimal as a result of having learned ASL later in life. Native signers, however, may have developed representations of signs that are less bound to the hand with which they are performed, suggesting a different engagement of an internal forward model for rapid lexical decisions.
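
A minimal illustration of how a handedness-congruency effect on lexical-decision response times could be quantified per group follows: per-participant congruent vs. incongruent means compared with a paired t-test. The numbers, group sizes, and effect sizes are simulated for illustration and do not come from the study, whose actual analysis may well differ.

    # Minimal sketch: handedness-congruency effect on lexical-decision RTs per group,
    # tested with a paired t-test. All values are simulated, not data from the study.
    import numpy as np
    from scipy.stats import ttest_rel

    rng = np.random.default_rng(1)

    def simulate_group(n, congruency_effect_ms):
        congruent = rng.normal(700, 60, n)                              # mean RT, ms
        incongruent = congruent + rng.normal(congruency_effect_ms, 30, n)
        return congruent, incongruent

    groups = {
        "native signers": simulate_group(20, 2),         # ~no congruency effect
        "late learners of ASL": simulate_group(20, 35),  # slower when incongruent
        "hearing ASL-English bilinguals": simulate_group(20, 30),
    }
    for name, (congruent, incongruent) in groups.items():
        t, p = ttest_rel(incongruent, congruent)
        effect = np.mean(incongruent - congruent)
        print(f"{name}: effect = {effect:.0f} ms, t({len(congruent) - 1}) = {t:.2f}, p = {p:.3f}")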


2002 ◽  
Vol 14 (7) ◽  
pp. 1064-1075 ◽  
Author(s):  
Mairéad MacSweeney ◽  
Bencie Woll ◽  
Ruth Campbell ◽  
Gemma A. Calvert ◽  
Philip K. McGuire ◽  
...  

In all signed languages used by deaf people, signs are executed in "sign space" in front of the body. Some signed sentences use this space to map detailed "real-world" spatial relationships directly. Such sentences can be considered to exploit sign space "topographically." Using functional magnetic resonance imaging, we explored the extent to which increasing the topographic processing demands of signed sentences was reflected in the differential recruitment of brain regions in deaf and hearing native signers of British Sign Language (BSL). When BSL signers performed a sentence anomaly judgement task, the occipito-temporal junction was activated bilaterally to a greater extent for topographic than for nontopographic processing. The differential role of movement in the processing of the two sentence types may account for this finding. In addition, enhanced activation was observed in the left inferior and superior parietal lobules during processing of topographic BSL sentences. We argue that the left parietal lobe is specifically involved in processing the precise configuration and location of hands in space to represent objects, agents, and actions. Importantly, no differences in these regions were observed when hearing people heard and saw English translations of these sentences. Despite the high degree of similarity in the neural systems underlying signed and spoken languages, exploring the linguistic features that are unique to each of them broadens our understanding of the systems involved in language comprehension.


2012 ◽  
Vol 15 (2) ◽  
pp. 402-412 ◽  
Author(s):  
Diane Brentari ◽  
Marie A. Nadolske ◽  
George Wolford

In this paper the prosodic structure of American Sign Language (ASL) narratives is analyzed in deaf native signers (L1-D), hearing native signers (L1-H), and highly proficient hearing second language signers (L2-H). The results of this study show that the prosodic patterns used by these groups are associated both with their ASL language experience (L1 or L2) and with their hearing status (deaf or hearing), suggesting that experience using co-speech gesture (i.e. gesturing while speaking) may have some effect on the prosodic cues used by hearing signers, similar to the effects of the prosodic structure of an L1 on an L2.


2011 ◽  
Vol 14 (1) ◽  
pp. 94-114 ◽  
Author(s):  
Donna Lewin ◽  
Adam C. Schembri

This article investigates the claim that tongue protrusion (‘th’) acts as a nonmanual adverbial morpheme in British Sign Language (BSL) (Brennan 1992; Sutton-Spence & Woll 1999), drawing on narrative data produced by two deaf native signers as part of the European Cultural Heritage Online (ECHO) corpus. Data from ten BSL narratives have been analysed to observe the frequency and form of tongue protrusion. The results from this preliminary investigation indicate that tongue protrusion occurs as part of the phonological formation of lexical signs (i.e., ‘echo phonology’, see Woll 2001), as well as functioning as a separate meaningful unit that co-occurs (sometimes as part of constructed action) with classifier constructions and lexical verb signs. In the latter cases, the results suggest that ‘th’ sometimes functions as an adverbial morpheme in BSL, but with a greater variety of meanings than previously suggested in the BSL literature. One use of the adverbial appears similar to a nonmanual signal in American Sign Language described by Liddell (1980), although the form of the mouth gesture in our BSL data differs from what is reported in Liddell’s work. These findings suggest that the mouth gesture ‘th’ in BSL has a broad range of functions. Some uses of tongue protrusion, however, remain difficult to categorise, and further research with a larger dataset is needed.

