Event representations in signed languages

Author(s):  
Aslı Özyürek
Pamela Perniss
2016
Vol 28 (1)
pp. 20-40
Author(s):
Velia Cardin
Eleni Orfanidou
Lena Kästner
Jerker Rönnberg
Bencie Woll
...

The study of signed languages allows the dissociation of sensorimotor and cognitive neural components of the language signal. Here we investigated the neurocognitive processes underlying the monitoring of two phonological parameters of sign languages: handshape and location. Our goal was to determine whether brain regions processing sensorimotor characteristics of different phonological parameters of sign languages were also involved in phonological processing, with their activity being modulated by the linguistic content of manual actions. We conducted an fMRI experiment using manual actions varying in phonological structure and semantics: (1) signs of a familiar sign language (British Sign Language), (2) signs of an unfamiliar sign language (Swedish Sign Language), and (3) invented nonsigns that violate the phonological rules of British Sign Language and Swedish Sign Language or consist of nonoccurring combinations of phonological parameters. Three groups of participants were tested: deaf native signers, deaf nonsigners, and hearing nonsigners. Results show that the linguistic processing of different phonological parameters of sign language is independent of the sensorimotor characteristics of the language signal. Handshape and location were processed by different perceptual and task-related brain networks but recruited the same language areas. The semantic content of the stimuli did not influence this process, but phonological structure did, with nonsigns being associated with longer reaction times (RTs) and stronger activations in an action observation network in all participants and in the supramarginal gyrus exclusively in deaf signers. These results suggest higher processing demands for stimuli that contravene the phonological rules of a signed language, independently of previous knowledge of signed languages. We suggest that the phonological characteristics of a language may arise as a consequence of more efficient neural processing for its perception and production.


2021
Vol 6 (1)
Author(s):
Mark Aronoff
Jonathan Rawski
Wendy Sandler
Iris Berent

Spoken and signed languages differ because of the affordances of the human body and the limits of each medium. But can commonalities between the two be compared to find abstract language universals?


2011
Vol 14 (2)
pp. 213-247
Author(s):
Rachel McKee
Sophia Wallingford

This study investigates the frequency and functions of a ubiquitous form in conversational NZSL discourse glossed as palm-up. Dictionaries show that it is a polysemous vocabulary item in NZSL, although many of its uses in discourse are not accounted for in the lexicon. Analysis of discourse data from 20 signers shows it to be the second most frequently occurring item, and to exhibit phonological variation. We identify and discuss four (non-exclusive) functions of palm-up in these data: cohesive, modal, interactive, and manual frame for unpredictable mouthings (codemixing). Correspondences in form, linguistic context, and meaning are found between uses of palm-up in NZSL, similar forms in other signed languages, and co-speech palm gestures. The study affirms previous descriptions of this form as having properties of both gesture and sign, and suggests that it also has features of a discourse marker.


2019
Author(s):
Chloé Stoll
Matthew William Geoffrey Dye

While a substantial body of work has suggested that deafness brings about an increased allocation of visual attention to the periphery, there has been much less work on how using a signed language may also influence this attentional allocation. Signed languages are visual-gestural and produced using the body and perceived via the human visual system. Signers fixate upon the face of interlocutors and do not directly look at the hands moving in the inferior visual field. It is therefore reasonable to predict that signed languages require a redistribution of covert visual attention to the inferior visual field. Here we report a prospective and statistically powered assessment of the spatial distribution of attention to inferior and superior visual fields in signers – both deaf and hearing – in a visual search task. Using a Bayesian Hierarchical Drift Diffusion Model, we estimated decision making parameters for the superior and inferior visual field in deaf signers, hearing signers and hearing non-signers. Results indicated a greater attentional redistribution toward the inferior visual field in adult signers (both deaf and hearing) than in hearing sign-naïve adults. The effect was smaller for hearing signers than for deaf signers, suggestive of either a role for extent of exposure or greater plasticity of the visual system in the deaf. The data provide support for a process by which the demands of linguistic processing can influence the human attentional system.
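The hierarchical Bayesian fitting used in the study is beyond a short example, but the underlying drift diffusion process it estimates can be illustrated directly. Below is a minimal, illustrative simulation, not the authors' code; the parameter names v (drift rate), a (boundary separation), and t0 (non-decision time) follow standard DDM conventions rather than anything specified in the abstract.

```python
import numpy as np

def simulate_ddm(v, a, t0, n_trials=1000, dt=0.001, sigma=1.0, seed=0):
    """Simulate a simple (non-hierarchical) drift diffusion model.

    v  : drift rate, the average speed of evidence accumulation
    a  : boundary separation, indexing response caution
    t0 : non-decision time in seconds (stimulus encoding + motor response)

    Evidence starts at a/2 and takes Gaussian-noise steps until it
    crosses 0 (error) or a (correct). Returns per-trial reaction
    times and a boolean array of correct responses.
    """
    rng = np.random.default_rng(seed)
    rts = np.empty(n_trials)
    correct = np.empty(n_trials, dtype=bool)
    for i in range(n_trials):
        x, t = a / 2.0, 0.0
        while 0.0 < x < a:
            x += v * dt + sigma * np.sqrt(dt) * rng.standard_normal()
            t += dt
        rts[i] = t0 + t
        correct[i] = x >= a
    return rts, correct
```

Higher drift rates produce faster and more accurate responses; group or visual-field differences in fitted parameters of this kind are what a hierarchical DDM analysis compares across deaf signers, hearing signers, and non-signers.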


2021
Author(s):
Samantha Syd Cohen
Christopher Baldassano

How does the representation of naturalistic life events change with age? Here we analyzed fMRI data from 415 children and adolescents (5–19 years) as they watched a narrative movie. In addition to changes in the degree of inter-subject correlation (ISC) with age in sensory and medial parietal regions, we used a novel measure (between-groups ISC) to reveal age-related shifts in the responses across the majority of the neocortex. Over the course of development, brain responses became more discretized into stable and coherent events, and shifted earlier in time to anticipate upcoming event transitions. However, hippocampal responses to event boundaries actually decreased with age, suggesting a shifting division of labor between episodic encoding processes and schematic event representations between the ages of 5 and 19.
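Inter-subject correlation measures how similarly a brain region responds across people watching the same stimulus. A minimal sketch of the standard leave-one-out ISC and a between-groups variant, assuming one regional time course per subject in a NumPy array (the function names and the exact between-groups formulation are illustrative, not taken from the paper):

```python
import numpy as np

def isc_loo(data):
    """Leave-one-out inter-subject correlation.

    data: array of shape (n_subjects, n_timepoints), one region's
    time course per subject. Each subject's time course is correlated
    with the mean time course of all remaining subjects.
    """
    n = data.shape[0]
    out = np.empty(n)
    for i in range(n):
        others = np.delete(data, i, axis=0).mean(axis=0)
        out[i] = np.corrcoef(data[i], others)[0, 1]
    return out

def isc_between_groups(group_a, group_b):
    """Between-groups ISC: correlate each subject in one group with
    the mean time course of the other group, in both directions."""
    mean_a = group_a.mean(axis=0)
    mean_b = group_b.mean(axis=0)
    a_to_b = np.array([np.corrcoef(s, mean_b)[0, 1] for s in group_a])
    b_to_a = np.array([np.corrcoef(s, mean_a)[0, 1] for s in group_b])
    return a_to_b, b_to_a
```

Comparing within-group ISC (e.g., young vs. young) against between-groups ISC (young vs. old) is one way such a measure can reveal whether two age groups share the same response timing or have systematically shifted responses.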

