Joint epistemic engineering: The neglected process of context construction in human communication

2020 ◽  
Author(s):  
Arjen Stolk ◽  
Jana Bašnáková ◽  
Ivan Toni

This contribution argues that a common language and its statistics do not explain how people overcome fundamental communicative obstacles. We introduce joint epistemic engineering, a neurosemiotic account of how asymmetric interlocutors can communicate effectively despite using ambiguous signals that are referentially contingent on the current communicative circumstances. The basic insight is that a communicative signal carries a multiplicity of functions, and that interlocutors use those multi-layered signals simultaneously to coordinate a space of possible interpretations, declare a communicative intent, and reduce uncertainty over the identity of a referent.

2015 ◽  
Vol 27 (12) ◽  
pp. 2352-2368 ◽  
Author(s):  
David Peeters ◽  
Mingyuan Chu ◽  
Judith Holler ◽  
Peter Hagoort ◽  
Aslı Özyürek

In everyday human communication, we often express our communicative intentions by manually pointing out referents in the material world around us to an addressee, often in tight synchronization with referential speech. This study investigated whether and how the kinematic form of index finger pointing gestures is shaped by the gesturer's communicative intentions and how this is modulated by the presence of concurrently produced speech. Furthermore, we explored the neural mechanisms underpinning the planning of communicative pointing gestures and speech. Two experiments were carried out in which participants pointed at referents for an addressee while the informativeness of their gestures and speech was varied. Kinematic and electrophysiological data were recorded online. Participants prolonged the duration of the stroke and poststroke hold phases of their gestures when being more communicative, in particular when the gesture carried the main informational burden in their multimodal utterance. Frontal and P300 effects in the ERPs suggested the importance of intentional and modality-independent attentional mechanisms during the planning phase of informative pointing gestures. These findings contribute to a better understanding of the complex interplay between action, attention, intention, and language in the production of pointing gestures, a communicative act core to human interaction.


2009 ◽  
Vol 23 (2) ◽  
pp. 63-76 ◽  
Author(s):  
Silke Paulmann ◽  
Sarah Jessen ◽  
Sonja A. Kotz

The multimodal nature of human communication has been well established. Yet few empirical studies have systematically examined the widely held belief that multimodal perception is facilitated in comparison to unimodal or bimodal perception. In the current experiment we first explored the processing of unimodally presented facial expressions. Furthermore, auditory (prosodic and/or lexical-semantic) information was presented together with the visual information to investigate the processing of bimodal (facial and prosodic cues) and multimodal (facial, lexical, and prosodic cues) human communication. Participants engaged in an identity identification task while event-related potentials (ERPs) were recorded to examine early processing mechanisms as reflected in the P200 and N300 components. While the former component has repeatedly been linked to the processing of physical stimulus properties, the latter has been linked to more evaluative, "meaning-related" processing. A direct relationship between P200 and N300 amplitude and the number of information channels present was found. The multimodal-channel condition elicited the smallest amplitudes in the P200 and N300 components, followed by larger amplitudes in each component for the bimodal-channel condition. The largest amplitudes were observed for the unimodal condition. These data suggest that multimodal information induces clear facilitation in comparison to unimodal or bimodal information. The advantage of multimodal perception as reflected in the P200 and N300 components may thus reflect one of the mechanisms allowing for fast and accurate information processing in human communication.


2015 ◽  
Vol 20 (3) ◽  
pp. 176-189 ◽  
Author(s):  
John F. Rauthmann

Abstract. There is as yet no agreed-upon situational taxonomy. The current work addresses this issue and reviews extant taxonomic approaches by highlighting a "road map" of six research stations that lead to the observed diversity in taxonomies: (1) theoretical and conceptual guidelines, (2) the "type" of situational information studied, (3) the general taxonomic approach taken, (4) the generation of situation pools, (5) the assessment and rating of situational information, and (6) the statistical analyses of situation data. Current situational taxonomies are difficult to integrate because they follow different paths along these six stations. Some suggestions are given on how to spur integrated taxonomies toward a unified psychology of situations that speaks a common language.


1988 ◽  
Vol 33 (10) ◽  
pp. 920-921 ◽  
Author(s):  
L. Kristine Pond

Author(s):  
Patricia L. McDermott ◽  
Jason Luck ◽  
Laurel Allender ◽  
Alia Fisher
