Eye Gaze Controlled Robotic Arm for Persons with Severe Speech and Motor Impairment

Author(s):  
Vinay Krishna Sharma ◽  
Kamalpreet Saluja ◽  
Vimal Mollyn ◽  
Pradipta Biswas

2016 ◽
Vol 55 (01) ◽  
pp. 79-83 ◽  
Author(s):  
A. Vourvopoulos ◽  
A. Bernardino ◽  
S. Bermúdez i Badia ◽
J. Alves

Summary Introduction: This article is part of the Focus Theme of Methods of Information in Medicine on “Methodologies, Models and Algorithms for Patients Rehabilitation”. Objective: To identify eye gaze correlates of motor impairment in a virtual reality motor observation task in a study with healthy participants and stroke patients. Methods: Participants consisted of a group of healthy subjects (N = 20) and a group of stroke survivors (N = 10). Both groups were required to observe a simple reach-and-grab and place-and-release task in a virtual environment. Additionally, healthy subjects observed the task both in a normal condition and in a constrained movement condition. Eye movements were recorded during the observation task for later analysis. Results: For healthy participants, results showed differences in gaze metrics when comparing the normal and arm-constrained conditions. Differences in gaze metrics were also found when comparing dominant and non-dominant arm for saccades and smooth pursuit events. For stroke patients, results showed longer smooth pursuit segments in action observation when observing the paretic arm, providing evidence that the affected circuitry may be activated for eye gaze control during observation of the simulated motor action. Conclusions: This study suggests that neural motor circuits are involved, at multiple levels, in the observation of motor actions displayed in a virtual reality environment. Thus, eye tracking combined with action observation tasks in a virtual reality display can be used to monitor motor deficits derived from stroke, and consequently can also be used for rehabilitation of stroke patients.
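The saccade and smooth-pursuit metrics referred to above are typically obtained by segmenting the gaze signal by angular velocity. The sketch below shows a minimal velocity-threshold (I-VT-style) segmentation; the thresholds and the three-way fixation/pursuit/saccade split are illustrative assumptions, not the study's actual event-detection pipeline.

```python
import numpy as np

# Illustrative velocity thresholds (deg/s); real pipelines tune these.
SACCADE_VEL = 100.0   # above this -> saccade (assumed)
PURSUIT_VEL = 5.0     # between this and SACCADE_VEL -> smooth pursuit

def classify_gaze(x_deg, y_deg, fs_hz):
    """Label each inter-sample interval as fixation, pursuit, or saccade."""
    vel = np.hypot(np.diff(x_deg), np.diff(y_deg)) * fs_hz  # deg/s
    return np.where(vel >= SACCADE_VEL, "saccade",
           np.where(vel >= PURSUIT_VEL, "pursuit", "fixation"))

def segment_durations(labels, fs_hz, kind):
    """Durations (s) of contiguous runs of a given event kind."""
    runs, count = [], 0
    for lab in labels:
        if lab == kind:
            count += 1
        else:
            if count:
                runs.append(count / fs_hz)
            count = 0
    if count:
        runs.append(count / fs_hz)
    return runs
```

Comparing `segment_durations(labels, fs, "pursuit")` across conditions (e.g., observing the paretic versus the non-paretic arm) would mirror the smooth-pursuit comparison reported above.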


Author(s):  
Ayush Agarwal ◽  
DV JeevithaShree ◽  
Kamalpreet Singh Saluja ◽  
Atul Sahay ◽  
Pullikonda Mounika ◽  
...  
Keyword(s):  
Eye Gaze ◽  

2018 ◽  
Vol 28 (08) ◽  
pp. 1850018 ◽  
Author(s):  
Xiaogang Chen ◽  
Bing Zhao ◽  
Yijun Wang ◽  
Shengpu Xu ◽  
Xiaorong Gao

Although robot technology has been successfully used to empower people with motor disabilities to interact more fully with their physical environment, it remains a challenge for individuals with severe motor impairment who lack the motor control needed to operate robots or prosthetic devices manually. In this study, to mitigate this issue, a noninvasive brain-computer interface (BCI)-based robotic arm control system using gaze-based steady-state visual evoked potentials (SSVEP) was designed and implemented using a portable wireless electroencephalogram (EEG) system. A 15-target SSVEP-based BCI using a filter bank canonical correlation analysis (FBCCA) method allowed users to control the robotic arm directly, without system calibration. Online results from 12 healthy subjects indicated that a command for the proposed brain-controlled robot system could be selected from 15 possible choices in 4 s (i.e. 2 s for visual stimulation and 2 s for gaze shifting) with an average accuracy of 92.78%, yielding a transfer rate of 15 commands/min. Furthermore, all subjects (even naive users) were able to complete the entire move-grasp-lift task without user training. These results demonstrate that an SSVEP-based BCI can provide accurate and efficient high-level control of a robotic arm, showing the feasibility of a BCI-based robotic arm control system for hand assistance.
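The FBCCA step can be made concrete with a short sketch: band-pass the EEG into several sub-bands, correlate each sub-band with sine-cosine references at every candidate stimulus frequency via canonical correlation analysis (CCA), and pick the frequency with the highest weighted score. The sub-band edges, harmonic count, sampling-rate assumption, and weight exponents a and b below are placeholders, not the paper's settings.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def cca_corr(X, Y):
    """Largest canonical correlation between X and Y (samples x features)."""
    qx, _ = np.linalg.qr(X - X.mean(0))
    qy, _ = np.linalg.qr(Y - Y.mean(0))
    return np.linalg.svd(qx.T @ qy, compute_uv=False)[0]

def reference(freq, fs, n_samples, n_harmonics=5):
    """Sine-cosine reference matrix for one stimulus frequency."""
    t = np.arange(n_samples) / fs
    return np.column_stack([f(2 * np.pi * h * freq * t)
                            for h in range(1, n_harmonics + 1)
                            for f in (np.sin, np.cos)])

def fbcca_classify(eeg, freqs, fs, n_bands=5, a=1.25, b=0.25):
    """eeg: (samples x channels); returns index of the detected frequency.
    Assumes fs is high enough for the 88 Hz upper band edge (e.g. 250 Hz)."""
    n = eeg.shape[0]
    weights = [(k + 1) ** (-a) + b for k in range(n_bands)]
    scores = np.zeros(len(freqs))
    for k in range(n_bands):
        # k-th sub-band: low edge rises in 8 Hz steps (assumed layout)
        bb, ba = butter(4, [8.0 * (k + 1), 88.0], btype="band", fs=fs)
        sub = filtfilt(bb, ba, eeg, axis=0)
        for i, f in enumerate(freqs):
            scores[i] += weights[k] * cca_corr(sub, reference(f, fs, n)) ** 2
    return int(np.argmax(scores))
```

Because the references are fixed sinusoids rather than learned templates, nothing here is fitted per user, which is what allows such a system to run without the calibration session mentioned above.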


2015 ◽  
Vol 58 ◽  
pp. e98 ◽  
Author(s):  
M. Beaudoin ◽  
F. Routhier ◽  
J. Lettre ◽  
P. Archambault ◽  
M. Lemay

2014 ◽  
Vol 23 (1) ◽  
pp. 42-54 ◽  
Author(s):  
Tanya Rose Curtis

As the field of telepractice grows, perceived barriers to service delivery must be anticipated and addressed in order to provide appropriate services to individuals who will benefit from this model. When telepractice is applied to the field of AAC, additional barriers are encountered: clients with complex communication needs are unable to speak, often present with severe quadriplegia and are unable to position themselves or access the computer independently, and/or may have cognitive impairments and limited computer experience. Some access methods, such as eye gaze, can also present technological challenges in the telepractice environment. These barriers can be overcome, and telepractice is not only practical and effective but often a preferred means of service delivery for persons with complex communication needs.


2014 ◽  
Vol 23 (3) ◽  
pp. 132-139 ◽  
Author(s):  
Lauren Zubow ◽  
Richard Hurtig

Children with Rett Syndrome (RS) are reported to use multiple modalities to communicate, although their intentionality is often questioned (Bartolotta, Zipp, Simpkins, & Glazewski, 2011; Hetzroni & Rubin, 2006; Sigafoos et al., 2000; Sigafoos, Woodyatt, Tucker, Roberts-Pennell, & Pittendreigh, 2000). This paper presents the results of a study analyzing the unconventional vocalizations of a child with RS. The primary research question addresses the ability of familiar and unfamiliar listeners to interpret unconventional vocalizations as “yes” or “no” responses. The paper also addresses the acoustic analysis and perceptual judgments of these vocalizations. Pre-recorded isolated vocalizations of “yes” and “no” were presented to 5 listeners (mother, father, 1 unfamiliar clinician, and 2 familiar clinicians), who were asked to rate each vocalization as either “yes” or “no.” The ratings were compared to the original identification made by the child's mother during the face-to-face interaction from which the samples were drawn. The findings suggest that, in this case, the child's vocalizations were intentional and could be interpreted by familiar and unfamiliar listeners as either “yes” or “no” without contextual or visual cues. The results suggest that communication partners should be trained to attend to eye gaze and vocalizations to ensure the child's intended choice is accurately understood.
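The comparison of listener judgments to the mother's reference labels can be illustrated with a simple agreement computation. This is a hypothetical sketch with placeholder data, not the study's analysis.

```python
import numpy as np

def percent_agreement(ratings, reference):
    ratings, reference = np.asarray(ratings), np.asarray(reference)
    return float((ratings == reference).mean())

def cohens_kappa(ratings, reference):
    """Chance-corrected agreement for two binary label sequences."""
    ratings, reference = np.asarray(ratings), np.asarray(reference)
    po = (ratings == reference).mean()                    # observed agreement
    pe = sum((ratings == c).mean() * (reference == c).mean()
             for c in ("yes", "no"))                      # chance agreement
    return float((po - pe) / (1 - pe))

reference = ["yes", "no", "yes", "yes", "no", "no"]  # mother's labels (hypothetical)
listener  = ["yes", "no", "yes", "no",  "no", "no"]  # one rater (hypothetical)
print(percent_agreement(listener, reference))  # 0.833...
print(cohens_kappa(listener, reference))       # 0.666...
```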


2000 ◽  
Vol 42 (01) ◽  
pp. 69 ◽  
Author(s):  
Bernard Dan ◽  
Guy Cheron