gestural control
Recently Published Documents


TOTAL DOCUMENTS: 36 (five years: 4)
H-INDEX: 4 (five years: 1)

3D Audio, 2021, pp. 64-81
Author(s): Diego Quiroz
Keyword(s): 3D Audio

2019, Vol 11 (1), pp. 190-208
Author(s): Janneke Adema, Kamila Kuc

Unruly gestures presents a hybrid performative intervention by means of video, text, and still images. With this experimental essay we aspire to break down various preconceptions about reading/writing gestures. Breaking away from a narrative that sees these gestures foremost as passive entities – as either embodiments of pure subjective intentionality, or as bodily movements shaped and controlled by media technologies (enabling specific sensory engagements with texts) – we aim to reappraise them. Indeed, in this essay we identify numerous dominant narratives that relate to gestural agency, to the media-specificity of gestures, and to their (linear) historicity, naturalness and humanism. This essay disrupts these preconceptions, and by doing so, it unfolds an alternative genealogy of ‘unruly gestures.’ These are gestures that challenge gestural conditioning through particular media technologies, cultural power structures, hegemonic discourses, and the biopolitical self. We focus on reading/writing gestures that have disrupted gestural hegemonies and material-discursive forms of gestural control through time and across media. Informed by Tristan Tzara’s cut-up techniques, where through the gesture of cutting the Dadaists subverted established traditions of authorship, intentionality, and linearity, this essay has been cut-up into seven semi-autonomous cine-paragraphs (accessible in video and print). Each of these cine-paragraphs confronts specific gestural preconceptions while simultaneously showcasing various unruly gestures.


Sensors, 2019, Vol 19 (3), pp. 641
Author(s): Jessica D’Abbraccio, Luca Massari, Sahana Prasanna, Laura Baldini, Francesca Sorgini, ...

Advancements in the study of the human sense of touch are fueling the field of haptics. This is paving the way for augmenting sensory perception during object palpation in tele-surgery and reproducing the sensed information through tactile feedback. Here, we present a novel tele-palpation apparatus that enables the user to detect nodules of distinct stiffness buried in an ad hoc polymeric phantom. The contact force measured by the platform was encoded using a neuromorphic model and reproduced on the index fingertip of a remote user through a haptic glove embedding a piezoelectric disk. We assessed the effectiveness of this feedback in allowing nodule identification under two experimental conditions of real-time telepresence: In Line of Sight (ILS), where the platform was placed in the visible range of the user; and the more demanding Not In Line of Sight (NILS), with the platform and the user being 50 km apart. We found that the identification rate was higher for stiffer inclusions than for softer ones (average of 74% within the duration of the task) in both telepresence conditions evaluated. These promising results call for further exploration of tactile augmentation technology for telepresence in medical interventions.
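To make the encoding step concrete, here is a minimal sketch that assumes a leaky integrate-and-fire neuron as the neuromorphic encoder; the model choice, parameter values, and function names are illustrative only and are not taken from the paper.

    # Minimal sketch (not the authors' code): encoding a contact-force trace
    # into spike events with a leaky integrate-and-fire (LIF) neuron, where
    # each spike would gate a short drive pulse on the glove's piezoelectric
    # disk. All parameter values are illustrative.
    import numpy as np

    def lif_encode(force, dt=1e-3, tau=0.02, gain=150.0, v_thresh=1.0):
        """Convert a force trace (N, sampled at dt) into spike times (s)."""
        v, spikes = 0.0, []
        for i, f in enumerate(force):
            v += dt * (-v / tau + gain * f)   # leaky integration of the input
            if v >= v_thresh:                 # threshold crossing: emit a spike
                spikes.append(i * dt)
                v = 0.0                       # reset membrane potential
        return spikes

    # Example: a ramping contact force, as if pressing onto a stiff inclusion.
    t = np.arange(0.0, 1.0, 1e-3)
    force = np.clip(0.5 * t, 0.0, 0.4)        # newtons, illustrative
    print(len(lif_encode(force)), "spikes in 1 s of palpation")
    # A stiffer inclusion produces larger contact forces and hence a higher
    # spike (vibration) rate at the fingertip.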


Author(s): Francesca Sorgini, Giuseppe Airò Farulla, Nikola Lukic, Ivan Danilov, Bozica Bojovic, ...

Research on bidirectional human-machine interfaces will enable smooth interaction with robotic platforms in contexts ranging from industry to tele-medicine and rescue. This paper introduces a bidirectional communication system to achieve multisensory telepresence during the gestural control of an industrial robotic arm. We complement the gesture-based control with a tactile-feedback strategy grounded in a spiking artificial neuron model. Force and motion from the robot are converted into neuromorphic haptic stimuli delivered to the user’s hand through a vibro-tactile glove. Untrained personnel participated in an experimental task benchmarking a pick-and-place operation: the robot end-effector was used to sequentially press six buttons, illuminated in a random sequence, and the task was compared when executed with and without tactile feedback. The results demonstrated the reliability of the hand-tracking strategy developed for controlling the robotic arm, and the effectiveness of a neuronal spiking model for encoding hand displacement and exerted forces in order to promote a fluid embodiment of the haptic interface and control strategy. The main contribution of this paper is in presenting a robotic arm under gesture-based remote control with multisensory telepresence, demonstrating for the first time that a spiking haptic interface can be used to effectively deliver, on the skin surface, a sequence of stimuli emulating the neural code of the mechanoreceptors beneath it.
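As a rough illustration of what a hand-tracking control strategy of this kind involves, the sketch below scales tracked hand displacement into clamped Cartesian velocity commands for the end-effector; the function, scaling factors, and limits are hypothetical and do not reproduce the authors' implementation.

    # Minimal sketch (hypothetical): mapping tracked hand displacement to a
    # Cartesian velocity command for the robot end-effector, with a speed
    # clamp for safety. Scale and limit values are illustrative.
    import numpy as np

    def hand_to_velocity(hand_pos, hand_ref, scale=0.8, v_max=0.25):
        """Map hand displacement (m) from a reference pose to a clamped
        end-effector velocity command (m/s)."""
        v = scale * (np.asarray(hand_pos, dtype=float) - np.asarray(hand_ref, dtype=float))
        speed = np.linalg.norm(v)
        if speed > v_max:                     # clamp to a safe Cartesian speed
            v *= v_max / speed
        return v

    # One control tick: the tracked hand has moved 10 cm along x.
    cmd = hand_to_velocity([0.10, 0.0, 0.0], [0.0, 0.0, 0.0])
    print(cmd)   # -> [0.08 0.   0.  ] m/s, to be streamed to the robot controller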


Author(s): Jessica D'Abbraccio, Luca Massari, Sahana Prasanna, Laura Baldini, Francesca Sorgini, ...

Advancements in the study of the human sense of touch are fueling the field of haptics. This is paving the way for augmenting sensory perception during object palpation in tele-surgery and reproducing the sensed information through tactile feedback. Here, we present a novel tele-palpation apparatus that enables the user to detect nodules of distinct stiffness buried in an ad hoc polymeric phantom. The contact force measured by the platform was encoded using a neuromorphic model and reproduced on the index fingertip of a remote user through a haptic glove embedding a piezoelectric disk. We assessed the effectiveness of this feedback in allowing nodule identification under two experimental conditions of real-time telepresence: In Line of Sight (ILS), where the platform was placed in the visible range of the user; and the more demanding Not In Line of Sight (NILS), with the platform and the user being 50 km apart. We found that the identification rate was higher for stiffer inclusions than for softer ones (average of 74% within the duration of the task) in both telepresence conditions evaluated. These promising results call for further exploration of tactile augmentation technology for telepresence in medical interventions.


Author(s): M. David Keller, Patrick Mead, Megan Kozub

Gaze-supported, non-tactile gestural control combines gesture-based body movements with eye-gaze positioning to provide an input method for controlling a system. Combining body gestures with eye movements allows for computer control methods beyond the traditional mouse. However, research on the effectiveness of emerging control types, such as gestures and eye tracking, is mixed: some studies show positive performance outcomes for one or more control aspects, while others report performance detriments that would prohibit the use of such novel control methods. One important aspect that is often ignored is familiarity with the control method. Unlike the mouse, users are typically unfamiliar with eye- and gesture-based control methods. To truly understand the benefit of new concepts like gaze-supported gestural controls, testing experienced users is necessary. In the current experiment, participants were trained on the gaze-supported gesture system in order to become “experts” and achieve similar levels of proficiency across the control methods to be assessed: mouse, non-gaze, and gaze-supported gestural controls. Results showed that after as few as five practice sessions, participants were able to perform a simple point-and-click task with gaze-supported gestures as well as, or better than, with the mouse.
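As a rough illustration of the input method described above, the sketch below places the pointer with gaze fixations and confirms a selection with a pinch gesture; the data structure, thresholds, and dwell logic are illustrative assumptions, not the system tested in the study.

    # Minimal sketch (illustrative assumptions only) of gaze-supported gestural
    # selection: gaze fixations position the pointer and a pinch gesture
    # confirms the click.
    from dataclasses import dataclass

    @dataclass
    class Frame:
        gaze_xy: tuple   # gaze point on screen, in pixels
        pinch: bool      # True while the pinch gesture is held

    def select_events(frames, dwell_frames=3, radius=30):
        """Emit (x, y) click events when a pinch occurs during a stable gaze."""
        clicks, stable, last = [], 0, None
        for f in frames:
            # crude fixation test: gaze stays within a small pixel window
            if last and abs(f.gaze_xy[0] - last[0]) < radius and abs(f.gaze_xy[1] - last[1]) < radius:
                stable += 1
            else:
                stable = 0
            last = f.gaze_xy
            if f.pinch and stable >= dwell_frames:
                clicks.append(f.gaze_xy)
        return clicks

    stream = [Frame((400, 300), False)] * 4 + [Frame((402, 301), True)]
    print(select_events(stream))   # -> [(402, 301)]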

