Journeys as Communicative Gestures: My Relationships with/in the Sciences

Author(s):  
Tristan Gleason
2021
Author(s):  
Wim Pouw ◽  
Jan de Wit ◽  
Sara Bögels ◽  
Marlou Rasenberg ◽  
Branka Milivojevic ◽  
...  

Most manual communicative gestures that humans produce cannot be looked up in a dictionary, as these gestures derive their meaning in large part from the communicative context and are not conventionalized. However, it remains understudied to what extent the communicative signal itself (bodily postures in movement, or kinematics) can inform about gesture semantics. Can we construct, in principle, a distribution-based semantics of gesture kinematics, similar to how word vectorization methods in Natural Language Processing (NLP) are now widely used to study semantic properties of text and speech? For such a project to get off the ground, we need to know the extent to which semantically similar gestures are also more likely to be kinematically similar. In Study 1 we assess whether semantic word2vec distances between the concepts participants were explicitly instructed to convey in silent gestures relate to the kinematic distances between these gestures, as obtained from Dynamic Time Warping (DTW). In a second, dyadic director-matcher study, we assess kinematic similarity between spontaneous co-speech gestures produced by interacting participants. Participants were asked before and after the interaction how they would name the objects; the semantic distances between the resulting names were then related to the kinematic distances between the gestures made while conveying those objects in the interaction. We find that the gestures’ semantic relatedness reliably predicts their kinematic relatedness across these highly divergent studies, which suggests that developing an NLP-style method for deriving semantic relatedness from kinematics is a promising avenue for automated multimodal recognition. Deeper implications for statistical learning processes in multimodal language are discussed.
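The analysis pipeline the abstract describes can be sketched in a few lines. This is a simplified, assumed reconstruction, not the authors' actual code: toy three-dimensional vectors stand in for trained word2vec embeddings, short 1-D position traces stand in for motion-capture kinematics, and the concept labels and gesture data are hypothetical. Semantic distance is cosine distance between embeddings, kinematic distance is classic dynamic-programming DTW, and the two pairwise distance sets are related with a Spearman rank correlation.

```python
import math

def cosine_distance(u, v):
    # 1 - cosine similarity between two embedding vectors.
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return 1.0 - dot / (nu * nv)

def dtw_distance(x, y):
    # Classic dynamic-programming Dynamic Time Warping on 1-D series.
    n, m = len(x), len(y)
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(x[i - 1] - y[j - 1])
            cost[i][j] = d + min(cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
    return cost[n][m]

def spearman(xs, ys):
    # Rank correlation (no ties in this toy example).
    def ranks(vs):
        order = sorted(range(len(vs)), key=lambda i: vs[i])
        r = [0] * len(vs)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = math.sqrt(sum((a - mx) ** 2 for a in rx) *
                    sum((b - my) ** 2 for b in ry))
    return num / den

# Hypothetical concept embeddings (stand-ins for word2vec vectors).
embeddings = {
    "hammer": [0.9, 0.1, 0.0],
    "axe":    [0.8, 0.2, 0.1],
    "ball":   [0.1, 0.9, 0.3],
}
# Hypothetical gesture kinematics (stand-ins for hand-position traces).
gestures = {
    "hammer": [0.0, 1.0, 0.0, 1.0, 0.0],   # repeated striking motion
    "axe":    [0.0, 0.9, 0.1, 0.9, 0.1],   # similar striking motion
    "ball":   [0.0, 0.2, 0.5, 0.2, 0.0],   # rounded trajectory
}

pairs = [("hammer", "axe"), ("hammer", "ball"), ("axe", "ball")]
sem = [cosine_distance(embeddings[a], embeddings[b]) for a, b in pairs]
kin = [dtw_distance(gestures[a], gestures[b]) for a, b in pairs]

print(spearman(sem, kin))  # prints 1.0: perfect rank agreement in this toy data
```

The prediction tested in both studies is simply that this correlation is reliably positive: concept pairs that are close in embedding space should also be close in DTW space. In practice one would substitute a trained word2vec model and multi-dimensional motion-tracking data, and assess significance with a permutation test over the distance matrices.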


2018
Vol 9
Author(s):  
Brian Ravenet ◽  
Catherine Pelachaud ◽  
Chloé Clavel ◽  
Stacy Marsella

1984
Vol 5 (14)
pp. 129-143
Author(s):  
Raymond J. Folven ◽  
John D. Bonvillian ◽  
Michael D. Orlansky

1989
Vol 155 (S7)
pp. 66-67

Diminished emotional responsiveness, characterised by a reduction in facial expression, in the modulation of feelings, and in communicative gestures. Basis for rating: observation of physical manifestations of affective tone and emotional responsiveness during the course of the interview.


1988
Vol 53 (2)
pp. 115-124
Author(s):  
Guila Glosser ◽  
Morton Wiener ◽  
Edith Kaplan

This study reports intraindividual variations in the semantic and syntactic complexity of language and in the linguistic errors produced by mildly and moderately impaired aphasic and nonneurologically impaired control subjects in different communication contexts. Aphasic patients, compared to control subjects, evidenced as many, if not more, linguistic variations in response to changing communication requirements. In conditions that restricted visual contact between speaker and listener, aphasic patients produced fewer communicative gestures and more complex verbalizations. Verbal complexity and language errors also varied significantly with different contents of communication. Measures of verbal complexity and errors in verbal communications were found to vary independently across different communication contexts, contents, and tasks. These findings demonstrate that despite their linguistic impairments, aphasic patients show appropriate and predictable linguistic changes in response to nonlinguistic social contextual variables.

