Wearable multichannel haptic device for encoding proprioception in the upper limb

2020, Vol. 17(5), pp. 056035
Author(s): Patrick G. Sagastegui Alva, Silvia Muceli, S. Farokh Atashzar, Lucie William, Dario Farina

2018, Vol. 42(1), pp. 43-52
Author(s): S. Mazzoleni, E. Battini, R. Crecchi, P. Dario, F. Posteraro

2011, Vol. 35(5), pp. 459-467
Author(s): Ho-Kyoo Lee, Young-Tark Kim, Yoshiyuki Takahashi, Tasuku Miyoshi, Keisuke Suzuki, ...

Author(s): Shoichi Sakamoto, Hokyoo Lee, Yoshiyuki Takahashi, Tasuku Miyoshi, Tadashi Suzuki, ...

Author(s): Norali Pernalete, Amar Raheja, Stephanie Carey

In this paper, we discuss the possibility of deriving assessment metrics for eye-hand coordination and upper-limb disability therapy, using a mapping between a robotic haptic device and a virtual environment together with a training algorithm based on Complex Valued Neural Networks that quantifies how closely a given movement pattern matches one traced by a healthy individual. Most current robotic therapy systems rely on the patient's performance on standardized clinical tests such as the Functional Independence Measure (FIM) and the upper-limb subsection of the Fugl-Meyer (FM) scale, and offer no other standardized assessment metrics. There is a need for a more intelligent and tailored therapy that patients could use at home between therapy sessions or over the long term. Such therapy should be based on performance data gathered by the robotic/computer system, providing an assessment procedure with improved objectivity and precision. A set of complex, movement-demanding virtual environments, representing labyrinths of various difficulty levels, was developed. Participants were instructed to use a haptic device (Omni) to follow the trajectories while video data were collected with a Vicon motion capture system. Traced trajectories, completion times, and upper-limb motions were recorded for further analysis.
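To make the comparison idea concrete, the sketch below shows one way such a score could be computed: a traced labyrinth trajectory is encoded as a sequence of complex samples (x + iy), passed through a single complex-valued layer, and compared against a healthy reference trace. This is a minimal illustration assuming NumPy; the helper names (to_complex, resample, similarity_score, ComplexValuedLayer), the layer size, and the activation are assumptions made for illustration only, not the authors' implementation, and the layer is left untrained.

```python
# Hypothetical sketch: score how closely a patient's traced trajectory matches
# a healthy reference using a small complex-valued layer. Not the paper's code.
import numpy as np


def to_complex(trajectory_xy):
    """Encode an (N, 2) array of planar stylus positions as complex samples."""
    xy = np.asarray(trajectory_xy, dtype=float)
    return xy[:, 0] + 1j * xy[:, 1]


def resample(z, n_points=128):
    """Resample a complex trajectory to a fixed length so traces are comparable."""
    old = np.linspace(0.0, 1.0, len(z))
    new = np.linspace(0.0, 1.0, n_points)
    return np.interp(new, old, z.real) + 1j * np.interp(new, old, z.imag)


class ComplexValuedLayer:
    """One fully connected layer with complex weights and a split tanh activation."""

    def __init__(self, n_in, n_out, rng):
        scale = 1.0 / np.sqrt(n_in)
        self.W = scale * (rng.standard_normal((n_out, n_in))
                          + 1j * rng.standard_normal((n_out, n_in)))
        self.b = np.zeros(n_out, dtype=complex)

    def forward(self, z):
        a = self.W @ z + self.b
        # Split-type activation: tanh applied to real and imaginary parts separately.
        return np.tanh(a.real) + 1j * np.tanh(a.imag)


def similarity_score(patient_xy, healthy_xy, seed=0):
    """Return a score in [0, 1]; values near 1 mean the traces closely match."""
    rng = np.random.default_rng(seed)
    zp = resample(to_complex(patient_xy))
    zh = resample(to_complex(healthy_xy))
    layer = ComplexValuedLayer(len(zp), 32, rng)  # weights would normally be learned
    fp, fh = layer.forward(zp), layer.forward(zh)
    # Cosine-style similarity between the two complex feature vectors.
    num = abs(np.vdot(fh, fp))
    den = np.linalg.norm(fh) * np.linalg.norm(fp) + 1e-12
    return float(num / den)


if __name__ == "__main__":
    # Made-up example traces: a noisy trace scored against a clean reference.
    healthy = [(t, np.sin(t)) for t in np.linspace(0, 3, 50)]
    patient = [(t, np.sin(t) + 0.3 * np.random.rand()) for t in np.linspace(0, 3, 40)]
    print(similarity_score(patient, healthy))
```

The split tanh activation and the cosine-style comparison are common, simple choices for complex-valued models; the abstract does not specify the actual network architecture or training procedure.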

