pointing gesture
Recently Published Documents


TOTAL DOCUMENTS

117
(FIVE YEARS 24)

H-INDEX

19
(FIVE YEARS 1)

Gesture ◽  
2021 ◽  
Vol 20 (1) ◽  
pp. 1-29
Author(s):  
María Fernández-Flecha ◽  
María Blume ◽  
Andrea Junyent ◽  
Talía Tijero Neyra

Abstract We examine gestural development, and correlations between gesture types, vocalizations, and vocabulary at ages 8 to 15 months, employing data from the MacArthur-Bates Communicative Development Inventories for Peruvian Spanish, in the first such study with Peruvian children. Results show (1) significant change with age in the production of gesture types, with older children producing more; (2) important correlations between gesture types and both vocalization types and vocabulary after controlling for age effects; and (3) correlations between the trajectory of the pointing gesture in its two modalities (whole-hand and index-finger) and age, vocalizations, and vocabulary, an effect that persists with respect to vocalizations after controlling for age. Our findings, based on a sample from a non-WEIRD (Western, Educated, Industrialized, Rich, and Democratic) population, support a key role for gesture production in early communicative and linguistic development.


2021 ◽  
Author(s):  
Ulf Liszkowski

Human pointing is foundational to language acquisition and sociality. The current chapter explores the ontogenetic origins of the human pointing gesture in infancy. First, the authors define infant pointing in terms of function, cognition, motivation, and morphology. Then, the authors review current evidence for predictors of infant pointing on child and caregiver levels, because any predictors provide insights into the basic developmental factors. From this review, the authors introduce and discuss a number of pertinent accounts on the emergence of pointing: social shaping accounts (pointing-from-reaching; pointing-from-non-communicative pointing) and social cognition accounts (pointing-from-imitation; pointing-from-gaze-following). The authors end by presenting a synthesis, which holds that child-level cognitive factors, specifically directedness and social motivation, interact with caregiver-level social factors, specifically responsiveness and assisting actions relevant to infants’ directed activity. The interaction of these factors creates social goals and formats that scale up to pointing acts expressing triadic relations between infant, caregiver, and entities at a distance in the context of joint activity and experience.


2021 ◽  
pp. 104425
Author(s):  
Carla J. Eatherington ◽  
Paolo Mongillo ◽  
Miina Lõoke ◽  
Lieta Marinelli

2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Anna C. S. Medeiros ◽  
Photchara Ratsamee ◽  
Jason Orlosky ◽  
Yuki Uranishi ◽  
Manabu Higashida ◽  
...  

Abstract Firefighters need to gain information from both inside and outside of buildings in first response emergency scenarios. For this purpose, drones are beneficial. This paper presents an elicitation study that showed firefighters’ desire to collaborate with autonomous drones. We developed a Human–Drone Interaction (HDI) method for indicating a target to a drone using 3D pointing gestures estimated solely from a monocular camera. The participant first points to a window without using any wearable or body-attached device. Through the drone’s front-facing camera, the drone detects the gesture and computes the target window. This work includes a description of the process for choosing the gesture, detecting and localizing objects, and carrying out the transformations between coordinate systems. Our proposed 3D pointing gesture interface improves on 2D interfaces by integrating depth information from SLAM and resolving the ambiguity of multiple objects aligned on the same plane in a large-scale outdoor environment. Experimental results showed that our 3D pointing gesture interface obtained average F1 scores of 0.85 and 0.73 in simulation and real-world experiments, respectively, and an F1 score of 0.58 at the maximum distance of 25 m between the drone and the building.
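The target-window computation described above can be illustrated with a minimal geometric sketch: cast a ray from the pointer's shoulder through the fingertip, intersect it with the (assumed planar) building facade, and pick the detected window closest to the hit point. All coordinates and keypoints below are fabricated for illustration; the paper's actual pipeline estimates them from the drone's monocular camera and SLAM map, and this is a common ray-casting approach, not necessarily the authors' exact method.

```python
# Sketch: selecting a pointed-at window via ray-plane intersection.

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Return the point where the ray hits the plane, or None if the ray
    is parallel to the plane or points away from it."""
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:
        return None
    t = dot(sub(plane_point, origin), plane_normal) / denom
    if t < 0:  # facade is behind the pointer
        return None
    return tuple(o + t * d for o, d in zip(origin, direction))

def pick_window(hit, window_centers):
    """Choose the window whose centre is closest to the ray-facade hit point."""
    return min(window_centers, key=lambda w: dot(sub(w, hit), sub(w, hit)))

# Toy scene: facade is the plane x = 10, pointer stands near the origin.
shoulder = (0.0, 1.5, 0.0)
fingertip = (0.5, 1.6, 0.1)
direction = sub(fingertip, shoulder)
hit = ray_plane_intersection(shoulder, direction, (10.0, 0.0, 0.0), (1.0, 0.0, 0.0))
windows = [(10.0, 3.0, 2.0), (10.0, 3.5, 1.9), (10.0, 3.0, 6.0)]
target = pick_window(hit, windows)
```

Selecting the nearest detected window centre, rather than using the raw hit point, is what lets depth information disambiguate objects that project to nearby image positions.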


Author(s):  
Ulf Liszkowski ◽  
Johanna Rüther

Human pointing is foundational to language acquisition and sociality. The current chapter explores the ontogenetic origins of the human pointing gesture in infancy. First, the authors define infant pointing in terms of function, cognition, motivation, and morphology. Then, the authors review current evidence for predictors of infant pointing on child and caregiver levels, because any predictors provide insights into the basic developmental factors. From this review, the authors introduce and discuss a number of pertinent accounts on the emergence of pointing: social shaping accounts (pointing-from-reaching; pointing-from-non-communicative pointing) and social cognition accounts (pointing-from-imitation; pointing-from-gaze-following). The authors end by presenting a synthesis, which holds that child-level cognitive factors, specifically directedness and social motivation, interact with caregiver-level social factors, specifically responsiveness and assisting actions relevant to infants’ directed activity. The interaction of these factors creates social goals and formats that scale up to pointing acts expressing triadic relations between infant, caregiver, and entities at a distance in the context of joint activity and experience.


2021 ◽  
Author(s):  
Anna C S Medeiros ◽  
Photchara Ratsamee ◽  
Jason Orlosky ◽  
Yuki Uranishi ◽  
Manabu Higashida ◽  
...  

Abstract Firefighters need to gain information from both inside and outside of buildings in first response emergency scenarios. For this purpose, drones are beneficial. This paper presents an elicitation study that showed the firefighters’ desire to collaborate with autonomous drones. We developed a Human–Drone Interaction (HDI) method for indicating a target to a drone using 3D pointing gestures estimated solely from a monocular camera. The participant first points to a window without using any wearable or body-attached device. Through the drone’s front-facing camera, the drone detects the gesture and computes the target window. This work includes a description of the process for choosing the gesture, detecting and localizing objects, and carrying out the transformations between coordinate systems. Our proposed 3D pointing gesture interface improves on a 2D pointing gesture interface by integrating depth information from SLAM and resolving the ambiguity of multiple objects aligned on the same plane in a large-scale outdoor environment. Experimental results showed that our 3D pointing gesture interface obtained average F1 scores of 0.85 and 0.73 in simulation and real-world experiments, respectively, and an F1 score of 0.58 at the maximum distance of 25 meters between the drone and the building.


2021 ◽  
Vol 31 ◽  
Author(s):  
Juliana Prieto Bruckner ◽  
Eliene Novais Costa ◽  
Cláudia Cardoso-Martins

Abstract There is evidence of a strong association between the pointing gesture and early vocabulary acquisition. This study examined the extent to which this association is moderated by the communicative function of children’s pointing. A total of 35 children participated in the study. Their use of the pointing gesture and their expressive vocabulary were assessed at 13 and 18 months using the Early Social Communication Scales and the MacArthur-Bates Communicative Development Inventory, respectively. The results of multiple linear regression analyses indicated that variations in the frequency of declarative pointing at 13 months significantly contributed to variations in vocabulary size at both 13 and 18 months, independently of variations in maternal education. In contrast, variations in the frequency of imperative pointing did not correlate, either concurrently or longitudinally, with children’s vocabulary sizes. These results suggest that the relation between pointing and early vocabulary acquisition is moderated by the communicative function of the pointing gesture.


2020 ◽  
Vol 11 ◽  
Author(s):  
Marieke Hoetjes ◽  
Lieke van Maastricht

Most language learners have difficulties acquiring the phonemes of a second language (L2). Unfortunately, they are often judged on their L2 pronunciation, and segmental inaccuracies contribute to miscommunication. Therefore, we aim to determine how to facilitate phoneme acquisition. Given the close relationship between speech and co-speech gesture, previous work unsurprisingly reports that gestures can benefit language acquisition, e.g., in (L2) word learning. However, gesture studies on L2 phoneme acquisition present contradictory results, implying that both specific properties of gestures and phonemes used in training, and their combination, may be relevant. We investigated the effect of phoneme and gesture complexity on L2 phoneme acquisition. In a production study, Dutch natives received instruction on the pronunciation of two Spanish phonemes, /u/ and /θ/. Both are typically difficult to produce for Dutch natives because their orthographic representation differs between the two languages. Moreover, /θ/ is considered more complex than /u/, since the Dutch phoneme inventory contains /u/ but not /θ/. The instruction participants received contained Spanish examples presented either via audio only, audio-visually without gesture, audio-visually with a simple, pointing gesture, or audio-visually with a more complex, iconic gesture representing the relevant speech articulator(s). Preceding and following training, participants read aloud Spanish sentences containing the target phonemes. In a perception study, Spanish natives rated the target words from the production study on accentedness and comprehensibility.
Our results show that combining gesture and speech in L2 phoneme training can lead to significant improvement in L2 phoneme production, but both gesture and phoneme complexity affect successful learning: Significant learning only occurred for the less complex phoneme /u/ after seeing the more complex iconic gesture, whereas for the more complex phoneme /θ/, seeing the more complex gesture actually hindered acquisition. The perception results confirm the production findings and show that items containing /θ/ produced after receiving training with a less complex pointing gesture are considered less foreign-accented and more easily comprehensible as compared to the same items after audio-only training. This shows that gesture can facilitate task performance in L2 phonology acquisition, yet complexity affects whether certain gestures work better for certain phonemes than others.


2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Dimitrios Kourtis ◽  
Pierre Jacob ◽  
Natalie Sebanz ◽  
Dan Sperber ◽  
Günther Knoblich

Abstract We investigated whether communicative cues help observers to make sense of human interaction. We recorded EEG from an observer monitoring two individuals who were occasionally communicating with each other via either mutual eye contact and/or pointing gestures, and then jointly attending to the same object or attending to different objects that were placed on a table in front of them. The analyses were focussed on the processing of the interaction outcome (i.e. presence or absence of joint attention) and showed that its interpretation is a two-stage process, as reflected in the N300 and the N400 potentials. The N300 amplitude was reduced when the two individuals shared their focus of attention, which indicates the operation of a cognitive process that involves the relatively fast identification and evaluation of actor–object relationships. On the other hand, the N400 was insensitive to the sharing or distribution of the two individuals’ attentional focus. Interestingly, the N400 was reduced when the interaction outcome was preceded either by mutual eye contact or by a perceived pointing gesture. This shows that observation of communication “opens up” the mind to a wider range of action possibilities and thereby helps to interpret unusual outcomes of social interactions.


2020 ◽  
Author(s):  
Anna C S Medeiros ◽  
Photchara Ratsamee ◽  
Jason Orlosky ◽  
Yuki Uranishi ◽  
Manabu Higashida ◽  
...  

Abstract Firefighters need to gain information from both inside and outside of buildings in first response emergency scenarios. For this purpose, drones are beneficial. This paper presents an elicitation study that showed the firefighters' desire to collaborate with autonomous drones. We developed a Human–Drone Interaction (HDI) method for indicating a target to a drone using 3D pointing gestures estimated solely from a monocular camera. The participant first points to a window without using any wearable or body-attached device. Through the drone's front-facing camera, the drone detects the gesture and computes the target window. This work includes a description of the process for choosing the gesture, detecting and localizing objects, and carrying out the transformations between coordinate systems. Our proposed 3D pointing gesture interface improves on a 2D pointing gesture interface by integrating depth information from SLAM and resolving the ambiguity of multiple objects aligned on the same plane in a large-scale outdoor environment. Experimental results showed that our 3D pointing gesture interface obtained average F1 scores of 0.85 and 0.73 in simulation and real-world experiments, respectively, and an F1 score of 0.58 at the maximum distance of 25 meters between the drone and the building.
