In this paper, we discuss the possibility of determining assessment metrics for eye-hand coordination and upper-limb disability therapy by mapping a robotic haptic device to a virtual environment and applying a training algorithm based on Complex-Valued Neural Networks (CVNNs) that calculates how closely a given movement pattern matches the one traced by a healthy individual.
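As an illustrative sketch only (the function names, network size, and weights below are hypothetical, not the paper's actual model), a planar trace lends itself to a CVNN because each sampled point (x, y) maps naturally to the complex number x + iy, so a whole trajectory becomes a complex vector that a complex-valued neuron can process directly:

```python
import numpy as np

def to_complex(path_xy):
    """Encode an (N, 2) array of (x, y) samples as an N-vector of complex numbers."""
    path_xy = np.asarray(path_xy, dtype=float)
    return path_xy[:, 0] + 1j * path_xy[:, 1]

def cvnn_neuron(z, weights, bias):
    """Single complex-valued neuron: complex weighted sum followed by a
    split activation (tanh applied to the real and imaginary parts separately)."""
    s = np.dot(weights, z) + bias
    return np.tanh(s.real) + 1j * np.tanh(s.imag)

# Toy comparison: pass a reference (healthy) trace and a patient trace
# through the same neuron and take the distance between the outputs as a
# crude dissimilarity score (smaller = closer to the healthy pattern).
reference = to_complex([[0, 0], [1, 0], [1, 1], [2, 1]])
patient = to_complex([[0, 0], [0.9, 0.2], [1.1, 1.0], [2.0, 1.1]])

rng = np.random.default_rng(0)
w = rng.normal(size=4) + 1j * rng.normal(size=4)  # untrained toy weights
b = 0.1 + 0.1j

score = abs(cvnn_neuron(reference, w, b) - cvnn_neuron(patient, w, b))
print(score)
```

In a trained system the weights would be fitted on traces from healthy individuals rather than drawn at random; the point here is only the complex encoding and the split-activation neuron.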
Most current robotic therapy systems rely on the patient's performance on standardized clinical tests such as the Functional Independence Measure (FIM) and the upper-limb subsection of the Fugl-Meyer (FM) assessment; these systems offer no other standardized metrics for assessment purposes. There is a need for a more intelligent, tailored therapy that patients could use at home between therapy sessions or over the long term. Such therapy should be based on performance data gathered by the robotic/computer system, providing an assessment procedure with improved objectivity and precision.
A set of complex, movement-demanding virtual environments, consisting of labyrinths at various levels of difficulty, was developed. Participants were instructed to use a haptic device (Omni) to follow the trajectories while motion data were collected with a Vicon motion capture system. The traced trajectories, completion times, and upper-limb motions were recorded for further analysis.
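From recordings like these, simple candidate metrics can be computed offline. The sketch below is a minimal, hypothetical example (not the paper's actual analysis pipeline): mean deviation of the traced path from the reference labyrinth path, and total completion time from the logged timestamps.

```python
import numpy as np

def mean_deviation(traced, reference):
    """Mean Euclidean distance from each traced sample to its nearest
    reference sample; both inputs are (N, 2) arrays of (x, y) points."""
    traced = np.asarray(traced, dtype=float)
    reference = np.asarray(reference, dtype=float)
    # Pairwise distance matrix of shape (num_traced, num_reference).
    d = np.linalg.norm(traced[:, None, :] - reference[None, :, :], axis=-1)
    return d.min(axis=1).mean()

def completion_time(timestamps):
    """Elapsed time from the first to the last logged sample."""
    t = np.asarray(timestamps, dtype=float)
    return t[-1] - t[0]

# Toy usage: a straight reference segment and a trace with a constant offset.
ref = np.column_stack([np.linspace(0.0, 1.0, 50), np.zeros(50)])
trace = ref + np.array([0.0, 0.02])  # uniformly offset by 0.02 units

print(mean_deviation(trace, ref))  # ≈ 0.02
print(completion_time([0.0, 0.5, 1.3, 2.1]))  # → 2.1
```

A nearest-point deviation is deliberately simple; time-aligned measures such as dynamic time warping would be a natural refinement when traced and reference paths are sampled at different rates.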