SPeeding Up the Detection of Non-iconic and Iconic Gestures (SPUDNIG): a toolkit for the automatic detection of hand movements and gestures in video data.

2019
Author(s):
Jordy Ripperda
Linda Drijvers
Judith Holler

In human face-to-face communication, speech is frequently accompanied by visual signals, especially communicative hand gestures. Analyzing these visual signals requires detailed manual annotation of video data, which is often a labor-intensive and time-consuming process. To facilitate this process, we here present SPUDNIG (SPeeding Up the Detection of Non-iconic and Iconic Gestures), a tool to automate the detection and annotation of hand movements in video data. We provide a detailed description of how SPUDNIG detects hand movement initiation and termination, as well as open-source code and a short tutorial on an easy-to-use graphical user interface (GUI) for our tool. We then provide a proof of principle and validation of our method by comparing SPUDNIG’s output to manual annotations of gestures by a human coder. While the tool does not entirely eliminate the need for a human coder (e.g., for the detection of false positives), our results demonstrate that SPUDNIG can detect both iconic and non-iconic gestures with very high accuracy, and it successfully detected all iconic gestures in our validation dataset. Importantly, SPUDNIG’s output can be imported directly into commonly used annotation tools such as ELAN and ANVIL. We therefore believe that SPUDNIG will be highly relevant for researchers studying multimodal communication, because its annotations significantly accelerate the analysis of large video corpora.
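The core idea of detecting movement initiation and termination from video can be illustrated with a minimal sketch: given a time series of a hand keypoint's pixel coordinates (e.g., from a pose estimator), frames whose inter-frame displacement exceeds a threshold are labeled "moving" and merged into segments. The function name, threshold, and minimum-duration parameters below are illustrative assumptions, not SPUDNIG's actual defaults or implementation.

```python
import numpy as np

def detect_movement_segments(xy, threshold=2.0, min_frames=5):
    """Label frames as 'moving' when the inter-frame keypoint displacement
    exceeds a pixel threshold, then merge runs into (start, end) segments.

    xy: (n_frames, 2) array of one hand keypoint's pixel coordinates.
    threshold and min_frames are illustrative, not SPUDNIG's defaults.
    """
    disp = np.linalg.norm(np.diff(xy, axis=0), axis=1)  # per-frame movement
    moving = disp > threshold
    segments, start = [], None
    for i, m in enumerate(moving):
        if m and start is None:
            start = i                          # candidate movement initiation
        elif not m and start is not None:
            if i - start >= min_frames:        # discard very short blips
                segments.append((start, i))    # movement termination
            start = None
    if start is not None and len(moving) - start >= min_frames:
        segments.append((start, len(moving)))  # movement runs to end of clip
    return segments
```

Segments of this form map naturally onto annotation-tier intervals once frame indices are converted to timestamps for import into a tool such as ELAN.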

2016
Vol 28 (11)
pp. 1828-1837
Author(s):
Emiliano Brunamonti
Aldo Genovesio
Pierpaolo Pani
Roberto Caminiti
Stefano Ferraina

Reaching movements require the integration of both somatic and visual information. These signals can have different relevance depending on whether reaches are performed toward visual or memorized targets. We tested the hypothesis that under such conditions, that is, depending on target visibility, posterior parietal neurons integrate somatic and visual signals differently. Monkeys were trained to execute both types of reaches from different hand resting positions and in total darkness. Neural activity was recorded in Area 5 (PE) and analyzed with a focus on the preparatory epoch, that is, before movement initiation. Many neurons were influenced by the initial hand position, and most of them were further modulated by target visibility. For the same starting position, we found a prevalence of neurons whose activity differed depending on whether the hand movement was performed toward memorized or visual targets. This result suggests that the posterior parietal cortex integrates the available signals flexibly, based on contextual demands.


2017
Vol 118 (6)
pp. 3293-3310
Author(s):
Kiyoshi Kurata

To determine the role of the periarcuate cortex during coordinated eye and hand movements in monkeys, the present study examined neuronal activity in this region during movement with the hand, eyes, or both as effectors toward a visuospatial target. Similar to the primary motor cortex (M1), the dorsal premotor cortex contained a higher proportion of neurons that were closely related to hand movements, whereas saccade-related neurons were frequently recorded from the frontal eye field (FEF). Interestingly, neurons that exhibited activity related to both eye and hand movements were recorded most frequently in the ventral premotor cortex (PMv), located between the FEF and M1. Neuronal activity in the periarcuate cortex was highly modulated during coordinated movements compared with either eye or hand movement only. Additionally, a small number of neurons were active specifically during one of the three task modes, which could be dissociated from the effector activity. In this case, neuron onset was either ahead of or behind the onset of eye and/or hand movement, and some neuronal activity lasted until reward delivery signaled successful completion of reaching. The present findings indicate that the periarcuate cortex, particularly the PMv, plays important roles in orchestrating coordinated movements from the initiation to the termination of reaching.

NEW & NOTEWORTHY Movement-related neuronal activity was recorded throughout the periarcuate cortex of monkeys that performed a task requiring them to move their hand only, eyes only, or both hand and eyes toward visuospatial targets. Most typically, neurons were found that were commonly active regardless of different effectors, from movement initiation to completion of a successful outcome. The findings suggest that the periarcuate cortex as a whole plays a crucial role in initiating and completing coordinated eye-hand movements.


2020
Vol 132 (5)
pp. 1358-1366
Author(s):
Chao-Hung Kuo
Timothy M. Blakely
Jeremiah D. Wander
Devapratim Sarma
Jing Wu
...

OBJECTIVE The activation of the sensorimotor cortex as measured by electrocorticographic (ECoG) signals has been correlated with contralateral hand movements in humans, as precisely as the level of individual digits. However, the relationship between individual and multiple synergistic finger movements and the neural signal as detected by ECoG has not been fully explored. The authors used intraoperative high-resolution micro-ECoG (µECoG) on the sensorimotor cortex to link neural signals to finger movements across several context-specific motor tasks.

METHODS Three neurosurgical patients with cortical lesions over eloquent regions participated. During awake craniotomy, a sensorimotor cortex area of hand movement was localized by high-frequency responses measured by an 8 × 8 µECoG grid of 3-mm interelectrode spacing. Patients performed a flexion movement of the thumb or index finger, or a pinch movement of both, based on a visual cue. High-gamma (HG; 70–230 Hz) filtered µECoG was used to identify dominant electrodes associated with thumb and index movement. Hand movements were recorded by a dataglove simultaneously with µECoG recording.

RESULTS In all 3 patients, the electrodes controlling thumb and index finger movements were identifiable approximately 3–6 mm apart by the HG-filtered µECoG signal. For HG power of cortical activation measured with µECoG, the thumb and index signals in the pinch movement were similar to those observed during thumb-only and index-only movement, respectively (all p > 0.05). Index finger movements, measured by the dataglove joint angles, were similar in both the index-only and pinch movements (p > 0.05). However, despite similar activation across the conditions, markedly decreased thumb movement was observed in pinch relative to independent thumb-only movement (all p < 0.05).

CONCLUSIONS HG-filtered µECoG signals effectively identify dominant regions associated with thumb and index finger movement. 
For pinch, the µECoG signal comprises a combination of the signals from individual thumb and index movements. However, while the relationship between the index finger joint angle and HG-filtered signal remains consistent between conditions, there is not a fixed relationship for thumb movement. Although the HG-filtered µECoG signal is similar in both thumb-only and pinch conditions, the actual thumb movement is markedly smaller in the pinch condition than in the thumb-only condition. This implies a nonlinear relationship between the cortical signal and the motor output for some, but importantly not all, movement types. This analysis provides insight into the tuning of the motor cortex toward specific types of motor behaviors.
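The high-gamma band-power measure underlying this kind of analysis can be sketched minimally: isolate the 70–230 Hz band and take the mean power of the band-limited signal. The brick-wall FFT filter and function name below are illustrative assumptions; the study's actual filtering pipeline is not specified here.

```python
import numpy as np

def high_gamma_power(sig, fs, band=(70.0, 230.0)):
    """Zero out spectral components outside the high-gamma band and
    return the mean power of the band-limited signal.

    An illustrative FFT 'brick-wall' filter, not a clinical pipeline;
    practical analyses typically use FIR/IIR band-pass filters instead.
    """
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fs)   # bin frequencies in Hz
    spec = np.fft.rfft(sig)
    spec[(freqs < band[0]) | (freqs > band[1])] = 0.0  # keep HG band only
    filtered = np.fft.irfft(spec, n=len(sig))
    return np.mean(filtered ** 2)                   # mean band-limited power
```

Comparing this quantity per electrode across task conditions (thumb-only, index-only, pinch) is the kind of contrast the abstract's p-value comparisons describe.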


1979
Vol 48 (1)
pp. 207-214
Author(s):
Luis R. Marcos

16 subordinate bilingual subjects produced 5-min. monologues in their nondominant languages, i.e., English or Spanish. Hand-movement activity manifested during the videotaped monologues was scored and related to measures of fluency in the nondominant language. The hand-movement behavior categorized as Groping Movement was significantly related to all of the nondominant-language fluency measures. These correlations support the assumption that Groping Movement may have a function in the process of verbal encoding. The results are discussed in terms of the possibility of monitoring central cognitive processes through the study of “visible” motor behavior.


2020
Author(s):
Marlen Fröhlich
Natasha Bartolotta
Caroline Fryns
Colin Wagner
Laurene Momon
...

From early infancy, human face-to-face communication is “multimodal”, comprising a plethora of interlinked articulators and sensory modalities. Although there is also growing evidence for this in nonhuman primates, the functions of integrating articulators (i.e. multiplex or multi-articulator acts) and channels (i.e. multimodal or multi-sensory acts) remain poorly understood. Here, we studied close-range social interactions within and beyond mother-infant pairs of Bornean and Sumatran orang-utans living in wild and captive settings, to examine to what extent species, setting and recipient-dependent factors affected the use of and responses to multi-sensory as well as multi-articulator communication. Results showed that both multi-sensory and multi-articulatory acts were more effective at eliciting responses (i.e. “apparently satisfactory outcomes”) than their respective uni-component parts, and generally played a larger role in wild populations. However, only multi-articulator acts were used more when the presumed goal did not match the dominant outcome for a specific communicative act, and were more common among non-mother-infant dyads and Sumatrans across settings. We suggest that communication through multiple sensory channels primarily facilitates effectiveness, whereas a flexible combination of articulators is relevant when social tolerance and interaction outcomes are less predictable. These different functions underscore the importance of distinguishing between these forms of multi-component communication.


Author(s):
Bruce Dienes
Michael Gurstein

A province-wide network of Community Access Internet sites was supported during the summers of 1996 and 1997 by Wire Nova Scotia (WiNS), a government-funded program to provide staffing, training and technical support for these centres. The program was managed remotely from an office in Sydney, Nova Scotia (Canada) using a variety of Internet-based technologies, including email, a web site, conference boards, real-time chat, and mailing lists. Remote management enabled the efficient and low-cost operation of a program involving 67 sites with field placements, plus six regional coordinators and the technical and administrative staff at the hub in Sydney. The effectiveness of remote management was enhanced when employees participated in an initial face-to-face regional training workshop. This training not only familiarized the employees with the communications technologies but, perhaps more importantly, put a human face and personality to the messages that later came electronically over the Intranet.


2019
Vol 121 (5)
pp. 1967-1976
Author(s):
Niels Gouirand
James Mathew
Eli Brenner
Frederic R. Danion

Adapting hand movements to changes in our body or the environment is essential for skilled motor behavior. Although eye movements are known to assist hand movement control, how eye movements might contribute to the adaptation of hand movements remains largely unexplored. To determine to what extent eye movements contribute to visuomotor adaptation of hand tracking, participants were asked to track a visual target that followed an unpredictable trajectory with a cursor using a joystick. During blocks of trials, participants were either allowed to look wherever they liked or required to fixate a cross at the center of the screen. Eye movements were tracked to ensure gaze fixation as well as to examine free gaze behavior. The cursor initially responded normally to the joystick, but after several trials, the direction in which it responded was rotated by 90°. Although fixating the eyes had a detrimental influence on hand tracking performance, participants exhibited a rather similar time course of adaptation to rotated visual feedback in the gaze-fixed and gaze-free conditions. More importantly, there was extensive transfer of adaptation between the gaze-fixed and gaze-free conditions. We conclude that although eye movements are relevant for the online control of hand tracking, they do not play an important role in the visuomotor adaptation of such tracking. These results suggest that participants do not adapt by changing the mapping between eye and hand movements, but rather by changing the mapping between hand movements and the cursor’s motion independently of eye movements.

NEW & NOTEWORTHY Eye movements assist hand movements in everyday activities, but their contribution to visuomotor adaptation remains largely unknown. We compared adaptation of hand tracking under free gaze and fixed gaze. Although our results confirm that following the target with the eyes increases the accuracy of hand movements, they unexpectedly demonstrate that gaze fixation does not hinder adaptation. These results suggest that eye movements make distinct contributions to the online control and visuomotor adaptation of hand movements.
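The 90° perturbation described above amounts to applying a rotation matrix to the joystick-to-cursor mapping. The sketch below shows the standard form of such a visuomotor rotation; the function name and velocity-mapping assumption are illustrative, not the authors' experimental code.

```python
import numpy as np

def cursor_velocity(joystick_xy, rotation_deg=0.0):
    """Map a 2-D joystick displacement to cursor velocity, optionally
    rotated counterclockwise as in a visuomotor-rotation perturbation.

    rotation_deg = 0 is the baseline mapping; 90 reproduces the kind of
    perturbation used in the adaptation blocks described above.
    """
    theta = np.deg2rad(rotation_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])  # 2-D rotation matrix
    return rot @ np.asarray(joystick_xy, dtype=float)
```

Under a 90° rotation, a rightward joystick deflection drives the cursor upward, so participants must learn a new hand-to-cursor mapping to track the target.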


1981
Vol 75 (8)
pp. 327-331
Author(s):
Diane P. Wormsley

Twenty-one children ages 6 through 13 were taught to use their hands independently when reading braille to determine how this pattern of hand movements affected reading variables, excluding character recognition. Although all the children learned this pattern of hand movements during the 20 days scheduled for training, only nine children exhibited a dramatic decrease in inefficient tracking movements such as pauses and scrubbing motions. Because these children were younger and more intelligent than the others, read braille more slowly, and had received less training in braille at school, the results strongly suggested that skill in tracking and use of an efficient hand movement pattern are closely tied to perceptual ability. Thus when teaching children to read braille, the motor aspects of the task should be combined with the perceptual aspects from the beginning.


2015
Vol 113 (7)
pp. 2845-2858
Author(s):
Yoshihisa Nakayama
Osamu Yokoyama
Eiji Hoshi

The caudal cingulate motor area (CMAc) and the supplementary motor area (SMA) play important roles in movement execution. The present study aimed to characterize the functional organization of these regions during movement by investigating laterality representations in the CMAc and SMA of monkeys via an examination of neuronal activity during a button press movement with either the right or left hand. Three types of movement-related neuronal activity were observed: 1) with only the contralateral hand, 2) with only the ipsilateral hand, and 3) with either hand. Neurons in the CMAc represented contralateral and ipsilateral hand movements to the same degree, whereas neuronal representations in the SMA were biased toward contralateral hand movement. Furthermore, recording neuronal activities using a linear-array multicontact electrode with 24 contacts spaced 150 μm apart allowed us to analyze the spatial distribution of neurons exhibiting particular hand preferences at the submillimeter scale. The CMAc and SMA displayed distinct microarchitectural organizations. The contralateral, ipsilateral, and bilateral CMAc neurons were distributed homogeneously, whereas SMA neurons exhibiting identical hand preferences tended to cluster. These findings indicate that the CMAc, which is functionally organized in a less structured manner than the SMA is, controls contralateral and ipsilateral hand movements in a counterbalanced fashion, whereas the SMA, which is more structured, preferentially controls contralateral hand movements.

