Intuitive Spatial Tactile Feedback for Better Awareness about Robot Trajectory during Human–Robot Collaboration

Sensors, 2021, Vol. 21 (17), pp. 5748
Author(s):  
Stefan Grushko ◽  
Aleš Vysocký ◽  
Dominik Heczko ◽  
Zdenko Bobovský

In this work, we extend our previously proposed approach to improving mutual perception during human–robot collaboration, in which the robot's motion intentions and status are communicated to a human worker through hand-worn haptic feedback devices. The improvement consists of introducing spatial tactile feedback, which gives the human worker more intuitive information about the spatial configuration of the robot's currently planned trajectory. The enhanced feedback devices communicate directional information by activating six tactors spatially organised to represent an orthogonal coordinate frame: the vibration activates on the side of the feedback device that is closest to the future path of the robot. To test the effectiveness of the improved human–machine interface, two user studies were prepared and conducted. The first study quantitatively evaluated how easily users could differentiate the activation of individual tactors of the notification devices. The second study assessed the overall usability of the enhanced notification mode for improving human awareness of the planned trajectory of a robot. The results of the first experiment made it possible to identify the tactors whose vibration intensity was most often confused by users. The results of the second experiment showed that the enhanced notification system allowed the participants to complete the task faster and, in general, improved user awareness of the robot's movement plan, according to both objective and subjective data. Moreover, the majority of participants (82%) favoured the improved notification system over its previous non-directional version and over vision-based inspection.
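The directional mapping described above (six tactors along the axes of an orthogonal frame, with the tactor nearest the robot's future path vibrating) can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation; the function name, frame convention, and tactor labels are assumptions.

```python
def select_tactor(direction):
    """Return the label of the tactor closest to a 3D direction vector.

    `direction` points from the wearer's hand toward the robot's planned
    path, expressed in a frame fixed to the device, with one tactor on
    each of the +X, -X, +Y, -Y, +Z, -Z faces (an assumed layout).
    Assumes `direction` is non-zero.
    """
    axes = ("X", "Y", "Z")
    # The dominant component of the vector decides the axis ...
    i = max(range(3), key=lambda k: abs(direction[k]))
    # ... and its sign decides which of the two opposing tactors fires.
    sign = "+" if direction[i] >= 0 else "-"
    return sign + axes[i]

print(select_tactor((0.2, -0.9, 0.1)))  # -Y
```

A real device would additionally scale vibration intensity with proximity; this sketch only resolves the direction to one of the six tactors.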

Sensors, 2021, Vol. 21 (11), pp. 3673
Author(s):  
Stefan Grushko ◽  
Aleš Vysocký ◽  
Petr Oščádal ◽  
Michal Vocetka ◽  
Petr Novák ◽  
...  

In a collaborative scenario, communication between humans and robots is fundamental to achieving good efficiency and ergonomics in task execution. Much research has been devoted to enabling a robot system to understand and predict human behaviour, allowing the robot to adapt its motion to avoid collisions with human workers. When the production task has a high degree of variability, the robot's movements can be difficult to predict, and the worker may feel anxious when the robot changes its trajectory and approaches, since the worker has no information about the robot's planned movement. Additionally, without information about the robot's movement, the human worker cannot effectively plan their own activity without forcing the robot to constantly replan its movement. We propose a novel approach to communicating the robot's intentions to a human worker. The improvement to the collaboration consists of introducing haptic feedback devices whose task is to notify the human worker about the robot's currently planned trajectory and changes in its status. To verify the effectiveness of the developed human–machine interface in the conditions of a shared collaborative workspace, a user study was designed and conducted among 16 participants, whose objective was to accurately recognise the goal position of the robot during its movement. Data collected during the experiment included both objective and subjective parameters. Statistically significant results indicated that all participants improved their task completion time by over 45% and were generally more subjectively satisfied when completing the task while wearing the haptic feedback devices. The results also suggest the usefulness of the developed notification system, since it improved users' awareness of the robot's motion plan.


2021, Vol. 2021 (6), pp. 5475-5480
Author(s):  
STEFAN GRUSHKO ◽  
◽  
ALES VYSOCKY ◽  
JIRI SUDER ◽  
LADISLAV GLOGAR ◽  
...  

Human–robot collaboration is a widespread topic within the concept of Industry 4.0. Such collaboration brings new opportunities to improve ergonomics as well as innovative options for manufacturing automation; however, most modern collaborative industrial applications are limited by the fact that neither side of the collaboration is fully aware of its partner: the human operator may not see the robot's movement due to engagement in the work process, and the collaborative robot simply has no means of knowing the position of the operator. Dynamic replanning of the robot trajectory with respect to the operator's current position can increase the efficiency and safety of cooperation, since the robot can avoid collisions and proceed with task completion; however, the other side of the communication remains unresolved. This paper provides a review of methods for improving human awareness during collaboration with a robot. The covered techniques include graphical, acoustic, and haptic feedback implementations. The work focuses on the practical applicability of the approaches and analyses the challenges associated with each method.


2020, Vol. 4 (4), pp. 78
Author(s):  
Andoni Rivera Pinto ◽  
Johan Kildal ◽  
Elena Lazkano

In the context of industrial production, a worker who wants to program a robot using the hand-guidance technique requires the robot to be available for programming and not in operation. This means that production with that robot is stopped during that time. A way around this constraint is to perform the same manual guidance steps on a holographic representation of the digital twin of the robot, using augmented reality technologies. However, this approach has the limitation that the visual holograms the user tries to grab lack tangibility. We present an interface in which some of this tangibility is provided through ultrasound-based mid-air haptic actuation. We report a user study that evaluates the impact such haptic feedback may have on a pick-and-place task involving the wrist of a holographic robot arm; we found the feedback to be beneficial.


2021, Vol. 8
Author(s):  
Min Li ◽  
Jiazhou Chen ◽  
Guoying He ◽  
Lei Cui ◽  
Chaoyang Chen ◽  
...  

Active engagement in rehabilitation training yields better treatment outcomes. This paper introduces an exoskeleton-assisted hand rehabilitation system. It is the first attempt to combine fingertip cutaneous haptic stimulation with exoskeleton-assisted hand rehabilitation to enhance training participation. For the first time, soft-material 3D printing techniques are adopted to make soft pneumatic fingertip haptic feedback actuators, achieving cheaper and faster iterations of prototype designs with consistent quality. The fingertip haptic stimulation is synchronized with the motion of our hand exoskeleton. The fingertip contact force resulting from a virtual interaction with a glass of water was based on data collected from natural hand motions when grasping a glass of water. System characterization experiments were conducted, and exoskeleton-assisted hand motion with and without the fingertip cutaneous haptic stimulation was compared in an experiment involving healthy human subjects. Users' attention levels were monitored during the motion control process using a Brainlink EEG-recording device and software. The characterization experiments show that the haptic actuators are lightweight (6.8 ± 0.23 g each, including a PLA fixture and Velcro) and that their performance is consistent and stable with small hysteresis. The user study shows that participants had significantly higher attention levels with the additional haptic stimulation than when only the exoskeleton was deployed, and that a heavier simulated grasping weight (a 300 g glass) was associated with significantly higher attention levels than a lighter one (a 150 g glass). We conclude that haptic stimulation increases the involvement of human subjects during exoskeleton-assisted hand exercises. Potentially, the proposed exoskeleton-assisted hand rehabilitation with fingertip stimulation may better hold users' attention during treatment.


2021, Vol. 33 (5), pp. 1104-1116
Author(s):  
Yoshihiro Tanaka ◽  
Shogo Shiraki ◽  
Kazuki Katayama ◽  
Kouta Minamizawa ◽  
Domenico Prattichizzo ◽  
...  

Tactile sensations are crucial for achieving precise operations. A haptic connection between a human operator and a robot has the potential to promote smooth human–robot collaboration (HRC). In this study, we assemble a bilaterally shared haptic system for grasping operations, analogous to a human using both hands, demonstrated on a bottle cap-opening task. A robot arm controls its grasping force according to tactile information from the human, who opens the cap while wearing a finger-attached acceleration sensor. The grasping force of the robot arm is then fed back to the human through a wearable squeezing display. Three experiments were conducted: measurement of the just-noticeable difference of the tactile display; a collaborative task with different bottles under two conditions, with and without tactile feedback, including psychological evaluations using a questionnaire; and a collaborative task under an explicit strategy. The results showed that the tactile feedback gave operators confidence that the cooperative robot was adjusting its action, and that it improved the stability of the task under the explicit strategy. These results indicate the effectiveness of the tactile feedback and the need for an explicit operator strategy, providing insight into the design of HRC with bilaterally shared haptic perception.


2005, Vol. 14 (3), pp. 345-365
Author(s):  
Sangyoon Lee ◽  
Gaurav Sukhatme ◽  
Gerard Jounghyun Kim ◽  
Chan-Mo Park

The problem of teleoperating a mobile robot using shared autonomy is addressed: an onboard controller performs close-range obstacle avoidance while the operator uses the manipulandum of a haptic probe to designate the desired speed and rate of turn. Sensors on the robot measure obstacle-range information. A strategy for converting this range information into forces, which are reflected to the operator's hand via the haptic probe, is described. This haptic information supplements the imagery from a front-facing camera mounted on the mobile robot. Extensive experiments with a user population, both in virtual and in real environments, show that the added haptic feedback significantly improves operator performance, as well as presence, in several ways (reduced collisions, increased minimum distance between the robot and obstacles, etc.) without a significant increase in navigation time.
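One common way to turn range readings into reflected forces is a potential-field-style mapping in which the force grows as an obstacle gets closer and vanishes beyond a cutoff distance. The sketch below is an illustrative assumption, not the paper's actual conversion strategy; the gain `k`, cutoff `d_max`, and the 1/d profile are invented for the example.

```python
def range_to_force(distance, d_max=2.0, k=1.0):
    """Map an obstacle range reading (metres) to a repulsive force magnitude.

    Hypothetical potential-field-style profile: zero force beyond d_max,
    increasing force as the obstacle approaches. Assumes distance > 0;
    k and d_max are illustrative tuning constants.
    """
    if distance >= d_max:
        return 0.0
    # Stronger push the closer the obstacle: k * (1/d - 1/d_max).
    return k * (1.0 / distance - 1.0 / d_max)
```

In a teleoperation loop, a force computed this way would be applied to the haptic probe along the direction away from the nearest obstacle, so the operator literally feels the robot being "pushed" clear.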


2019, Vol. 30 (17), pp. 2521-2533
Author(s):  
Alex Mazursky ◽  
Jeong-Hoi Koo ◽  
Tae-Heon Yang

Realistic haptic feedback is needed to provide information to users of numerous technologies, such as virtual reality, mobile devices, and robotics. For a device to convey realistic haptic feedback, two touch sensations must be present: tactile feedback and kinesthetic feedback. Although many devices today convey tactile feedback through vibrations, most neglect to incorporate kinesthetic feedback. To address this issue, this study investigates a haptic device with the aim of conveying both kinesthetic and vibrotactile information to users. A prototype based on electrorheological fluids was designed and fabricated. By controlling the electrorheological fluid flow via applied electric fields, the device can generate a range of haptic sensations. The design centered on an elastic membrane that acts as the actuator's contact surface. Moreover, the control electronics and structural components were integrated into a compact printed circuit board, resulting in a slim device suitable for mobile applications. The device was tested using a dynamic mechanical analyzer to evaluate its performance. The device design was supported by mathematical modeling, which agreed with the experimental results. According to a just-noticeable difference analysis, the measured force range is sufficient to transmit distinct kinesthetic and vibrotactile sensations to users, indicating that the electrorheological-fluid-based actuator is capable of conveying haptic feedback.


2019, Vol. 9 (23), pp. 5123
Author(s):  
Diego Vaquero-Melchor ◽  
Ana M. Bernardos

Nowadays, Augmented-Reality (AR) head-mounted displays (HMD) deliver a more immersive visualization of virtual contents, but the available means of interaction, mainly based on gesture and/or voice, are still limited and lack the realism and expressivity of traditional physical means. The integration of haptics within AR may therefore help to deliver an enriched experience, while facilitating specific actions, such as repositioning or resizing tasks, that otherwise depend heavily on the user's skills. In this direction, this paper describes a flexible architecture designed to deploy haptically enabled AR applications for both mobile and wearable visualization devices. The haptic feedback may be generated through a variety of devices (e.g., wearable, graspable, or mid-air ones), and the architecture facilitates handling the specificity of each. For this reason, the paper discusses how to generate a haptic representation of a 3D digital object depending on the application and the target device. Additionally, the paper analyses practical issues that arise when setting up a system to work with specific devices such as HMD (e.g., HoloLens) and mid-air haptic devices (e.g., Ultrahaptics), such as the alignment between the real world and the virtual one. The applicability of the architecture is demonstrated through the implementation of two applications: (a) Form Inspector and (b) Simon Game, built for HoloLens and iOS mobile phones for visualization and for UHK for mid-air haptics delivery. These applications were used to explore, with nine users, the efficiency, meaningfulness, and usefulness of mid-air haptics for form perception, object resizing, and push interaction tasks. Results show that, although mobile interaction is preferred when available, haptics turn out to be more meaningful in identifying shapes than users initially expect, and contribute to the execution of resizing tasks. Moreover, this preliminary user study reveals some design issues for haptic AR: for example, users may expect a tailored interface metaphor, not necessarily inspired by natural interaction. This was the case with our proposed virtual pressable buttons, which mimic real buttons using haptics but were interpreted differently by the study participants.


Author(s):  
Lei Tian ◽  
Aiguo Song ◽  
Dapeng Chen

To enhance the realism of image-based haptic display, it is desirable to express the various characteristics of the objects in an image using different kinds of haptic feedback. To this end, a multi-mode haptic display method for images is proposed in this paper, comprising multi-feature extraction from the image and image expression through several types of haptic rendering. First, a device structure integrating force and vibrotactile feedback was designed for multi-mode haptic display. Meanwhile, the three-dimensional geometric shape, detail texture, and outline of the object in the image were extracted by various image processing algorithms. Then, a rendering method for the object in the image was proposed based on psychophysical experiments on the piezoelectric ceramic actuator. The 3D geometric shape of the object was rendered by force feedback, while the detail texture and outline were rendered by vibrotactile feedback. Finally, these three features of the image were haptically expressed simultaneously by the integrated device. Haptic perception experiments show that the multi-mode haptic display method can effectively improve the realism of haptic perception.


2019, Vol. 24 (2), pp. 191-209
Author(s):  
Mohammed Al-Sada ◽  
Keren Jiang ◽  
Shubhankar Ranade ◽  
Mohammed Kalkattawi ◽  
Tatsuo Nakajima

Haptic feedback plays a large role in enhancing immersion and presence in VR. However, previous research and commercial products are limited in the variety and body locations of the feedback they deliver. To address these challenges, we present HapticSnakes, snake-like waist-worn robots that can deliver multiple types of feedback to various body locations, including tap-, gesture-, airflow-, brushing-, and gripper-based feedback. We developed two robots: one is lightweight and suitable for taps and gestures, while the other is capable of multiple types of feedback. We present a design space based on our implementations and conducted two evaluations. Since taps are versatile, easy to deliver, and largely unexplored, our first evaluation focused on the distinguishability of tap strengths and locations on the front and back torso. Participants were most accurate at distinguishing feedback on the uppermost regions and were better overall at distinguishing feedback strengths than locations. Our second user study investigated HapticSnakes' ability to deliver multiple feedback types within VR experiences, as well as users' impressions of wearing our robots and receiving novel feedback in VR. The results indicate that participants had distinct feedback preferences and were in favor of using our robots throughout. Based on the results of our evaluations, we extract design considerations and discuss research challenges and opportunities for developing multi-haptic feedback robots.

