On‐Skin Stimulation Devices for Haptic Feedback and Human–Machine Interfaces

2021 ◽  
pp. 2100452
Author(s):  
Wei Guo ◽  
Yijia Hu ◽  
Zhouping Yin ◽  
Hao Wu
2022 ◽  
Vol 8 (2) ◽  
Author(s):  
Yiming Liu ◽  
Chunki Yiu ◽  
Zhen Song ◽  
Ya Huang ◽  
Kuanming Yao ◽  
...  

The closed-loop HMI system could compliantly interface with the human body for teleoperating various robots with haptic feedback.
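The abstract sketches a closed-loop pipeline: on-skin sensor signals drive a remote robot while measured contact forces are rendered back to the wearer. A minimal sketch of such a loop appears below; the interface functions (`read_skin_sensor`, `send_joint_command`, `drive_haptic_actuator`) and the 50 Hz rate are hypothetical placeholders, not the authors' actual hardware API.

```python
import time

# Hypothetical hardware interfaces; the paper's actual devices differ.
def read_skin_sensor() -> float:
    """Return a normalized bending signal (0..1) from an on-skin sensor."""
    return 0.5  # placeholder reading

def send_joint_command(angle_deg: float) -> float:
    """Command a remote robot joint; return measured contact force (N)."""
    return 0.2  # placeholder force

def drive_haptic_actuator(intensity: float) -> None:
    """Drive an on-skin stimulator with intensity in 0..1."""
    pass

MAX_ANGLE = 90.0  # joint range mapped from the sensor signal
MAX_FORCE = 5.0   # force that saturates the haptic channel

def control_step() -> None:
    signal = read_skin_sensor()                       # human intent
    force = send_joint_command(signal * MAX_ANGLE)    # teleoperation
    drive_haptic_actuator(min(force / MAX_FORCE, 1.0))  # feedback to skin

if __name__ == "__main__":
    for _ in range(10):  # short closed-loop demo at ~50 Hz
        control_step()
        time.sleep(0.02)
```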


Procedia CIRP ◽  
2021 ◽  
Vol 100 ◽  
pp. 601-606
Author(s):  
Lasse Schölkopf ◽  
Mario Lorenz ◽  
Mareike Stamer ◽  
Lina Albrecht ◽  
Philipp Klimant ◽  
...  

2021 ◽  
Vol 1 (1) ◽  
pp. 81-120
Author(s):  
Zhongda Sun ◽  
Minglu Zhu ◽  
Chengkuo Lee

Entering the 5G and Internet of Things (IoT) era, human–machine interfaces (HMIs) that give humans more intuitive interaction with the digitalized world have flourished in the past few years. Although advanced sensing techniques based on complementary metal-oxide-semiconductor (CMOS) or microelectromechanical system (MEMS) solutions, e.g., cameras, microphones, and inertial measurement units (IMUs), and flexible solutions, e.g., stretchable conductors and optical fibers, have been widely used as sensing components for wearable and non-wearable HMIs, the relatively high power consumption of these sensors remains a concern, especially in wearable and portable scenarios. Recent progress in triboelectric nanogenerator (TENG) self-powered sensors opens a new route to low-power, self-sustainable HMIs by directly converting biomechanical energy into useful sensory information. Leveraging wide material choices and diversified structural designs, TENGs have been developed into various forms of HMIs, including gloves, glasses, touchpads, exoskeletons, and electronic skin, for applications such as collaborative operation, personal healthcare, robot perception, and smart homes. With evolving artificial intelligence (AI) and haptic-feedback technologies, more advanced HMIs could be realized toward intelligent and immersive human–machine interaction. Hence, in this review, we systematically introduce current TENG HMIs across application scenarios, i.e., wearable, robot-related, and smart-home, and the prospective future development enabled by AI and haptic-feedback technology. We also discuss implementing self-sustainable, zero-power, passive HMIs in the 5G/IoT era and offer our perspectives.
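As an illustration of the AI-enabled HMI concept the review describes, a minimal sketch of gesture classification from multichannel TENG glove signals follows; the data are synthetic and the 5-channel/3-gesture setup is an assumption for illustration, not taken from any cited device.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for peak voltages from a 5-channel TENG glove:
# each gesture class bends a different subset of fingers.
n_samples, n_channels, n_gestures = 300, 5, 3
labels = rng.integers(0, n_gestures, n_samples)
base = np.zeros((n_gestures, n_channels))
base[0, :2] = 1.0   # gesture 0: thumb + index bent
base[1, 2:4] = 1.0  # gesture 1: middle + ring bent
base[2, :] = 0.5    # gesture 2: all fingers half-bent
signals = base[labels] + 0.1 * rng.standard_normal((n_samples, n_channels))

X_train, X_test, y_train, y_test = train_test_split(
    signals, labels, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X_train, y_train)
print(f"gesture accuracy: {clf.score(X_test, y_test):.2f}")
```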


2018 ◽  
Author(s):  
Hellen van Rees ◽  
Angelika Mader ◽  
Merlijn Smits ◽  
Geke Ludden ◽  
...  

Author(s):  
E. Willuth ◽  
S. F. Hardon ◽  
F. Lang ◽  
C. M. Haney ◽  
E. A. Felinska ◽  
...  

Abstract
Background: Robotic-assisted surgery (RAS) potentially reduces workload and shortens the surgical learning curve compared to conventional laparoscopy (CL). The present study aimed to compare robotic-assisted cholecystectomy (RAC) to laparoscopic cholecystectomy (LC) in the initial learning phase for novices.
Methods: In a randomized crossover study, medical students (n = 40) in their clinical years performed both LC and RAC on a cadaveric porcine model. After standardized instructions and basic skill training, group 1 started with RAC and then performed LC, while group 2 started with LC and then performed RAC. The primary endpoint was surgical performance measured with the Objective Structured Assessment of Technical Skills (OSATS) score; secondary endpoints included operating time, complications (liver damage, gallbladder perforations, vessel damage), force applied to tissue, and subjective workload assessment.
Results: Surgical performance was better for RAC than for LC for total OSATS (RAC = 77.4 ± 7.9 vs. LC = 73.8 ± 9.4; p = 0.025), global OSATS (RAC = 27.2 ± 1.0 vs. LC = 26.5 ± 1.6; p = 0.012), and task-specific OSATS score (RAC = 50.5 ± 7.5 vs. LC = 47.1 ± 8.5; p = 0.037). There were fewer complications with RAC than with LC (10 (25.6%) vs. 26 (65.0%); p = 0.006) but no difference in operating times (RAC = 77.0 ± 15.3 vs. LC = 75.5 ± 15.3 min; p = 0.517). Force applied to tissue was similar. Students found RAC less physically demanding and less frustrating than LC.
Conclusions: Novices performed their first cholecystectomies with better performance and fewer complications with RAS than with CL, while operating time showed no difference. Students perceived less subjective workload for RAS than for CL. Contrary to our expectations, the lack of haptic feedback on the robotic system did not lead to higher force application during RAC than LC and did not increase tissue damage. These results show potential advantages of RAS over CL for surgical novices performing their first RAC and LC on an ex vivo cadaveric porcine model.
Registration number: researchregistry6029
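Because each student performed both procedures, the crossover design yields paired observations, so score comparisons like those above are naturally analyzed with paired tests. A minimal sketch with simulated scores (the study's per-subject data are not reproduced here; only the reported RAC mean and SD are reused):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 40  # number of students, each performing both RAC and LC

# Simulated total OSATS scores; real per-subject data are not public.
rac = rng.normal(77.4, 7.9, n)
lc = rac - rng.normal(3.6, 5.0, n)  # induce a paired RAC-vs-LC difference

t, p = stats.ttest_rel(rac, lc)  # paired t-test across subjects
print(f"mean RAC = {rac.mean():.1f}, mean LC = {lc.mean():.1f}")
print(f"paired t = {t:.2f}, p = {p:.4f}")
```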


2020 ◽  
Vol 6 (1) ◽  
Author(s):  
Maximilian Neidhardt ◽  
Nils Gessert ◽  
Tobias Gosau ◽  
Julia Kemmling ◽  
Susanne Feldhaus ◽  
...  

Abstract
Minimally invasive robotic surgery offers benefits such as reduced physical trauma, faster recovery, and less pain for the patient. For these procedures, visual and haptic feedback to the surgeon is crucial when operating surgical tools without line of sight with a robot. External force sensors are biased by friction at the tool shaft and therefore cannot estimate forces between tool tip and tissue. As an alternative, vision-based force estimation has been proposed: interaction forces are learned directly from deformation observed by an external imaging system. Recently, an approach based on optical coherence tomography (OCT) and deep learning has shown promising results. However, most experiments are performed on ex vivo tissue. In this work, we demonstrate that models trained on dead tissue do not perform well on in vivo data. We performed multiple experiments on a human tumor xenograft mouse model, on both in vivo, perfused tissue and dead tissue, and compared two deep learning models in different training scenarios. Training on perfused, in vivo data improved model performance by 24% for in vivo force estimation.
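A minimal sketch of the vision-based force-estimation idea, assuming single-channel deformation images and a scalar force target; random tensors stand in for OCT data, and the network architecture is illustrative, not the authors' model.

```python
import torch
import torch.nn as nn

# Tiny CNN regressor: deformation image -> scalar tool-tissue force.
model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(16, 1),
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Random stand-ins for (image, force) pairs; real training would use
# OCT images of perfused tissue with measured ground-truth forces.
images = torch.randn(32, 1, 64, 64)
forces = torch.randn(32, 1)

for step in range(5):  # a few illustrative gradient steps
    pred = model(images)
    loss = loss_fn(pred, forces)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    print(f"step {step}: mse = {loss.item():.3f}")
```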

