real hand
Recently Published Documents

TOTAL DOCUMENTS: 41 (FIVE YEARS: 17)
H-INDEX: 11 (FIVE YEARS: 1)

2021 ◽  
Vol 11 (2) ◽  
pp. 95-102
Author(s):  
Nur Ameerah Abdul Halim ◽  
Ajune Wanis Ismail

Augmented Reality (AR) has been widely explored worldwide for its potential as a technology that enhances information representation. As technology progresses, smartphones (handheld devices) now have sophisticated processors and cameras for capturing static photographs and video, as well as a variety of sensors for tracking the user's position, orientation, and motion. Hence, this paper discusses a real-time finger-ray pointing technique for interaction in handheld AR and compares it with the conventional touch-screen interaction technique on handheld devices. The aim of this paper is to explore ray pointing interaction in handheld AR for 3D object selection. Previous works in handheld AR, also covering Mixed Reality (MR), are recapped.
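The abstract does not detail the selection test itself; a common core of ray-based 3D selection is an intersection check between the pointing ray and an object's bounding volume. The sketch below is an illustrative assumption (spherical bounds, names invented here), not the paper's implementation:

```python
import numpy as np

def ray_sphere_hit(origin, direction, center, radius):
    """Return True if a ray cast from `origin` along `direction`
    intersects a sphere used as a 3D object's bounding volume."""
    d = direction / np.linalg.norm(direction)
    oc = center - origin
    t = np.dot(oc, d)              # projection of the center onto the ray
    if t < 0:
        return False               # object lies behind the ray origin
    closest = oc - t * d           # offset from the ray to the sphere center
    return np.dot(closest, closest) <= radius ** 2

# Pointing straight at an object two units ahead of the finger ray:
print(ray_sphere_hit(np.zeros(3), np.array([0., 0., 1.]),
                     np.array([0., 0., 2.]), 0.5))  # True
```

In a finger-ray technique, `origin` and `direction` would come from the tracked fingertip pose rather than the screen-touch point used by conventional handheld interaction.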


2021 ◽  
Vol 11 (15) ◽  
pp. 6932
Author(s):  
Yongseok Lee ◽  
Somang Lee ◽  
Dongjun Lee

We propose a novel wearable haptic device that can provide kinesthetic haptic feedback for stiffness rendering of virtual objects in augmented reality (AR). Rendering the stiffness of objects using haptic feedback is crucial for realistic finger-based object manipulation, yet challenging particularly in AR due to the co-presence of a real hand, the haptic device, and rendered AR objects in the scene. By adopting passive actuation with a tendon-based transmission mechanism, the proposed haptic device can generate kinesthetic feedback strong enough for immersive manipulation and prevention of inter-penetration in a small form factor, while maximizing wearability and minimizing occlusion in AR usage. A selective locking module is adopted in the device to allow for rendering the elasticity of objects. We perform an experimental study of two-finger grasping to verify the efficacy of the proposed haptic device for finger-based manipulation in AR. We also quantitatively compare and articulate, for the first time, the effects of different types of feedback across the haptic and visual senses (i.e., kinesthetic haptic feedback, vibrotactile haptic feedback, and visuo-haptic feedback) for stiffness rendering of virtual objects in AR.
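The abstract does not state the rendering law the device uses; a standard baseline for stiffness rendering in haptics is a penalty (virtual-spring) model, where the resisting force grows linearly with penetration depth. The sketch below assumes that model (positions in meters, stiffness in N/m; all names are illustrative):

```python
def render_stiffness_force(finger_pos, surface_pos, k):
    """Penalty-based stiffness rendering: once the finger passes the
    virtual surface, push back with a Hooke's-law spring force F = k * x."""
    penetration = surface_pos - finger_pos  # > 0 once the finger is inside
    if penetration <= 0.0:
        return 0.0                          # no contact, no force
    return k * penetration

# 0.25 m penetration against a 200 N/m virtual spring:
print(render_stiffness_force(0.75, 1.0, 200.0))  # 50.0
```

A passive, tendon-locking device as described would approximate such a force by braking the tendon rather than actively generating it, which is what keeps the form factor small.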


2021 ◽  
Vol 15 ◽  
Author(s):  
Akihiro Iida ◽  
Hidekazu Saito ◽  
Hisaaki Ota

Although the mirror image of a hand or limb can be recognized as a part of one's body behind the mirror, the effect of adding tactile stimulation to this illusion remains unknown. We therefore examined how the timing of tactile stimulation affects the induction of body ownership over the mirror image. Twenty-one healthy, right-handed participants (mean age = 23.0 ± 1.0 years, no medical history of neurological and/or psychiatric disorders) were enrolled, and a crossover design was adopted in this study. Participants' right and left hands were placed on the front and back sides of the mirror, respectively, and they were asked to keep looking at their right hand in the mirror. All participants underwent two experiments: one with tactile stimulation synchronized with the movement of the mirror image (synchronous condition), and the other with tactile stimulation that was not synchronized (asynchronous condition). The qualitative degree of body ownership over the mirrored hand was evaluated by a questionnaire. Proprioceptive drift (PD), an illusory shift of the felt position of the real hand toward the mirrored hand, was used for quantitative evaluation of body ownership and measured at “baseline,” “immediately after stimulation,” “2 min after stimulation,” and “4 min after stimulation.” The results of the questionnaire revealed that some items of the body ownership rating were higher in the synchronous condition than in the asynchronous condition (p < 0.05). We found that PD occurred from immediately after to 4 min after stimulation in both conditions (p < 0.01), with no difference between the conditions. From the dissociation of these results, we interpreted that body ownership could be elicited by different mechanisms depending on the task demand. Our results may contribute to the understanding of the multisensory integration mechanism of visual and tactile stimulation during mirror illusion induction.


2021 ◽  
Vol 12 ◽  
Author(s):  
Arran T. Reader ◽  
Victoria S. Trifonova ◽  
H. Henrik Ehrsson

The rubber hand illusion (RHI) is one of the most commonly used paradigms to examine the sense of body ownership. Touches are synchronously applied to the real hand, hidden from view, and a false hand in an anatomically congruent position. During the illusion one may perceive that the feeling of touch arises from the false hand (referral of touch), and that the false hand is one's own. The relationship between referral of touch and body ownership in the illusion is unclear, and some articles average responses to statements addressing these experiences, which may be inappropriate depending on the research question of interest. To address these concerns, we re-analyzed three freely available datasets to better understand the relationship between referral of touch and feeling of ownership in the RHI. We found that most participants who report a feeling of ownership also report referral of touch, and that referral of touch and ownership show a moderately strong positive relationship that was highly replicable. In addition, referral of touch tends to be reported more strongly and more frequently than the feeling of ownership over the hand. The former observations confirm that referral of touch and ownership are related experiences in the RHI. The latter, however, indicate that when pooling the statements one may obtain a higher number of illusion ‘responders’ compared to considering the ownership statements in isolation. These results have implications for the RHI as an experimental paradigm.
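The "moderately strong positive relationship" reported here is the kind of result a rank correlation over questionnaire items yields. As an illustration only (the ratings below are made up; the actual analysis used three public RHI datasets), a Spearman correlation between referral-of-touch and ownership ratings might be computed as:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical 7-point Likert ratings (-3..+3) from a small sample.
referral  = np.array([3, 2, 1, 0, -1, 2, 3, 1])   # "touch felt on the false hand"
ownership = np.array([2, 1, 0, -2, -3, 1, 3, 0])  # "false hand felt like my own"

rho, p = spearmanr(referral, ownership)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")
```

A rank-based measure is the usual choice for ordinal Likert data, since it does not assume interval spacing between response options.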


2021 ◽  
Author(s):  
Daisuke Mine ◽  
Kazuhiko Yokosawa

The space surrounding our body is called peripersonal space (PPS). It has been reported that visuo-tactile facilitation occurs more strongly within PPS than outside it. Furthermore, previous research has revealed several methods by which PPS can be extended. The present study provides the first behavioral evidence of the transfer of PPS in a virtual environment by a novel technique. PPS representation was investigated using a remote-controlled hand avatar presented far from the participant's body in a virtual environment. Participants showed the strongest visuo-tactile facilitation in the far space around the remote hand and no facilitation in the near space around the real hand, suggesting that PPS transfers from near the body to the space around the hand avatar. The present results extend previous findings on the plasticity of PPS and demonstrate the flexibility of PPS representation beyond the physical and anatomical limits of body representation.


2020 ◽  
Vol 11 (1) ◽  
pp. 6
Author(s):  
Yu-Wei Hsieh ◽  
Meng-Ta Lee ◽  
Yu-Hsuan Lin ◽  
Li-Ling Chuang ◽  
Chih-Chi Chen ◽  
...  

Both action observation (AO) and virtual reality (VR) provide visual stimuli to trigger brain activations during the observation of actions. However, the mechanism of observing video movements performed by a person's real hand versus those performed by a computer graphic hand remains uncertain. We aimed to investigate the differences between observing videos of real versus computer graphic hand movements on primary motor cortex (M1) activation by magnetoencephalography. Twenty healthy adults completed 3 experimental conditions: the resting state, the video of real hand movements (VRH), and the video of computer graphic hand movements (CGH) conditions, with intermittent electrical stimuli applied simultaneously to the median nerve by an electrical stimulator. The beta oscillatory activity (~20 Hz) in the M1 was collected, with lower values indicating greater activation. To compare the beta oscillatory activities among the 3 conditions, the Friedman test with Bonferroni correction (p-value < 0.017 indicating statistical significance) was used. The beta oscillatory activities of the VRH and CGH conditions were significantly lower than that of the resting state condition. No significant difference in beta oscillatory activity was found between the VRH and CGH conditions. Observing hand movements in a video performed by a real hand and those performed by a computer graphic hand evoked comparable M1 activations in healthy adults. This study provides some neuroimaging support for the use of AO and VR in rehabilitation, but no differential activations were found.
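The analysis described (repeated measures on the same participants across three conditions) maps directly onto `scipy.stats.friedmanchisquare`. The beta-power values below are invented for illustration (lower = greater M1 activation), matching the pattern the abstract reports:

```python
import numpy as np
from scipy.stats import friedmanchisquare

# Hypothetical ~20 Hz beta power per participant (one value per row position)
# in the three conditions; lower values indicate greater M1 activation.
rest = np.array([1.00, 0.95, 1.10, 1.05, 0.98, 1.02])  # resting state
vrh  = np.array([0.80, 0.78, 0.85, 0.82, 0.79, 0.81])  # real-hand video
cgh  = np.array([0.81, 0.77, 0.86, 0.80, 0.80, 0.83])  # graphic-hand video

stat, p = friedmanchisquare(rest, vrh, cgh)
alpha = 0.05 / 3   # Bonferroni correction over 3 pairwise contrasts ~= 0.017
print(f"chi2 = {stat:.2f}, p = {p:.4f}, significant: {p < alpha}")
```

The 0.017 threshold in the abstract is exactly this Bonferroni adjustment: 0.05 divided by the three pairwise comparisons.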


Author(s):  
Roland Pfister ◽  
Annika L. Klaffehn ◽  
Andreas Kalckert ◽  
Wilfried Kunde ◽  
David Dignath

Body representations are readily expanded based on sensorimotor experience. A dynamic view of body representations, however, holds that these representations can not only be expanded but also narrowed down by disembodying elements of the body representation that are no longer warranted. Here we induced illusory ownership in terms of a moving rubber hand illusion and studied the maintenance of this illusion across different conditions. We observed ownership experience to decrease gradually unless participants continued to receive confirmatory multisensory input. Moreover, a single instance of multisensory mismatch (a hammer striking the rubber hand but not the real hand) triggered substantial and immediate disembodiment. Together, these findings support and extend previous theoretical efforts to model body representations through basic mechanisms of multisensory integration. They further support an updating model suggesting that embodied entities fade from the body representation if they are not refreshed continuously.


Author(s):  
Jabbar Salman Hussain ◽  
Ahmed Al-Khazzar ◽  
Mithaq Nama Raheema

Myoelectric prostheses are a viable solution for people with amputations. The challenge in implementing a usable myoelectric prosthesis lies in accurately recognizing different hand gestures. Current myoelectric devices usually implement very few hand gestures. In order to approximate real hand functionality, a myoelectric prosthesis should implement a large number of hand and finger gestures. However, increasing the number of gestures can lead to a decrease in recognition accuracy. In this work, a Myo armband device is used to recognize fourteen gestures (the five built-in gestures of the Myo armband in addition to nine new gestures). The data in this research were collected from three able-bodied subjects for a period of 7 seconds per gesture. The proposed method uses a pattern recognition technique based on a Multi-Layer Perceptron Neural Network (MLPNN). The results show an average accuracy of 90.5% in recognizing the proposed fourteen gestures.
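The paper's network architecture is not given in the abstract; as a structural sketch only, an MLP classifier for this task maps the Myo armband's 8 EMG channels to 14 gesture probabilities. The hidden size, random weights, and function names below are assumptions (real use would train the weights on the recorded EMG windows):

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp_forward(x, W1, b1, W2, b2):
    """One-hidden-layer MLP: EMG feature vector -> gesture probabilities."""
    h = np.tanh(W1 @ x + b1)            # hidden-layer activation
    logits = W2 @ h + b2
    e = np.exp(logits - logits.max())   # numerically stable softmax
    return e / e.sum()

# The Myo armband has 8 EMG channels; a hidden size of 32 and the
# 14 output gestures follow the setup described in the abstract.
W1, b1 = rng.normal(size=(32, 8)), np.zeros(32)
W2, b2 = rng.normal(size=(14, 32)), np.zeros(14)

probs = mlp_forward(rng.normal(size=8), W1, b1, W2, b2)
print(probs.argmax())   # index of the predicted gesture
```

In practice the input would be a feature vector (e.g., windowed EMG statistics) rather than raw samples, and the weights would be fit by backpropagation on the labeled 7-second recordings.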


Author(s):  
C S Yusof ◽  
N A A Halim ◽  
M N A Nor’a ◽  
A W Ismail
