"Was that successful?" On Integrating Proactive Meta-Dialogue in a DIY-Assistant using Multimodal Cues

Author(s):  
Matthias Kraus ◽  
Marvin Schiller ◽  
Gregor Behnke ◽  
Pascal Bercher ◽  
Michael Dorna ◽  
...  
2019
Author(s):
Gabriella Vigliocco, Yasamin Motamedi, Margherita Murgiano, Elizabeth Wonnacott, Chloë Marshall, ...

Most research on how children learn the mapping between words and the world has assumed that language is arbitrary, and has investigated language learning in contexts in which the objects referred to are present in the environment. Here, we report analyses of a semi-naturalistic corpus of caregivers talking to their 2- to 3-year-old children. We focus on caregivers' use of non-arbitrary cues across different expressive channels, both iconic (onomatopoeia and representational gestures) and indexical (points and actions with objects). We ask whether these cues are used differently when talking about objects known or unknown to the child, and when the referred objects are present or absent. We hypothesize that caregivers would use these cues more often with objects novel to the child, and that they would use iconic cues especially when objects are absent, because iconic cues bring properties of referents to the mind's eye. We find that cue distribution differs: all cues except points are more common for unknown objects, indicating their potential role in learning; onomatopoeia and representational gestures are more common in displaced contexts, whereas indexical cues are more common when objects are present. Thus, caregivers provide multimodal non-arbitrary cues to support children's vocabulary learning, and iconicity in particular can support linking mental representations of objects and labels.

