Control of a 9-DoF Wheelchair-mounted robotic arm system using a P300 Brain Computer Interface: Initial experiments

Author(s):  
M. Palankar ◽  
K.J. De Laurentis ◽  
R. Alqasemi ◽  
E. Veras ◽  
R. Dubey ◽  
...  
Author(s):  
Kathryn J. De Laurentis ◽  
Yael Arbel ◽  
Rajiv Dubey ◽  
Emanuel Donchin

Three decades ago, Farwell and Donchin [1] developed a communication system, based on electroencephalography (EEG), that enables individuals to interact with their environment without using any neuromuscular function. This P300 BCI speller exploits the well-studied observation that the brain responds differently to a stimulus depending on the attention paid to it and on the specific processing it triggers. Since that first report in 1988, numerous brain-computer interface (BCI) systems have been developed and steadily improved. We have previously demonstrated that the P300 BCI can control a wheelchair-mounted robotic arm (WMRA) system [2].
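To make that selection principle concrete, the sketch below shows, in plain Python/NumPy, how a row/column P300 speller can pick the attended item: EEG epochs time-locked to each flash are averaged per row/column, and the code with the largest mean amplitude in the typical P300 window is chosen. This is illustrative code only, not the system from [1] or [2]; the array shapes, the 250-450 ms window, and the function name are assumptions.

```python
import numpy as np

def pick_target(epochs: np.ndarray, labels: np.ndarray, fs: float) -> int:
    """Return the row/column code whose averaged response has the largest
    mean amplitude in an assumed P300 window (~250-450 ms post-stimulus).

    epochs: EEG segments time-locked to each flash, shape (n_flashes, n_samples)
    labels: which row/column code flashed on each trial, shape (n_flashes,)
    fs:     sampling rate in Hz
    """
    start, stop = int(0.25 * fs), int(0.45 * fs)
    scores = {}
    for code in np.unique(labels):
        avg = epochs[labels == code].mean(axis=0)   # averaging suppresses non-time-locked activity
        scores[code] = avg[start:stop].mean()       # P300 appears as a positive deflection here
    return max(scores, key=scores.get)

# Toy usage: 120 flashes of 6 codes, 1-second epochs at 250 Hz of simulated noise.
rng = np.random.default_rng(0)
fs = 250
epochs = rng.normal(size=(120, fs))
labels = np.tile(np.arange(6), 20)
print("attended row/column (guess):", pick_target(epochs, labels, fs))
```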


Author(s):  
Abhay Patil

Abstract: There are roughly 21 million disabled people in India, about 2.2% of the total population, many of whom are affected by various neuromuscular disorders. To enable them to express their thoughts, they can be provided with alternative and augmentative communication, and a Brain-Computer Interface (BCI) system has been built to address this specific need. This project reports the design, operation, and testing of an imitation of a human arm that is intended to be dynamically as well as kinematically accurate. The device attempts to follow the motion of the human hand by analyzing the signals produced by brain waves. The brain waves are sensed by sensors in the Neurosky headset, which outputs alpha, beta, and gamma signals. This signal is then analyzed by the microcontroller and reproduced on the synthetic hand by means of servo motors. A patient with a below-elbow amputation can benefit from this biomechanical arm. Keywords: Brainwaves, Brain Computer Interface, Arduino, EEG sensor, Neurosky Mindwave Headset, Robotic arm
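The pipeline described in this abstract (headset signal, band analysis on a microcontroller, servo command) can be sketched as below. This is a minimal Python illustration, not the project's Arduino code: the band definitions, the beta-fraction-to-angle mapping, and the simulated input are assumptions, and the actual Neurosky output format and serial protocol are not reproduced here.

```python
import numpy as np

def band_powers(eeg: np.ndarray, fs: float) -> dict:
    """Crude alpha/beta/gamma band-power estimate from a short EEG window via the FFT."""
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)
    power = np.abs(np.fft.rfft(eeg)) ** 2
    bands = {"alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}  # Hz, conventional ranges
    return {name: power[(freqs >= lo) & (freqs < hi)].sum()
            for name, (lo, hi) in bands.items()}

def servo_angle(powers: dict) -> int:
    """Map the beta fraction (used here as a rough proxy for focused intent) to 0-180 degrees."""
    total = sum(powers.values()) or 1.0
    return int(round(180 * powers["beta"] / total))

# Toy usage: one second of simulated EEG at 512 Hz.
rng = np.random.default_rng(1)
fs = 512
eeg = rng.normal(size=fs)
angle = servo_angle(band_powers(eeg, fs))
print(f"servo command: {angle} deg")  # in the real system this value would be sent to the Arduino
```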


BIOPHILIA ◽  
2011 ◽  
Vol 1 (4) ◽  
pp. 4_28-4_28
Author(s):  
Gelu Onose ◽  
Cristian Grozea ◽  
Aurelian Anghelescu ◽  
Cristina Daia ◽  
Crina Julieta Sinescu ◽  
...  

2018 ◽  
Vol 10 (1) ◽  
pp. 35-40 ◽  
Author(s):  
Saad Abdullah ◽  
Muhammad A. Khan ◽  
Mauro Serpelloni ◽  
Emilio Sardini ◽  
...  

2019 ◽  
Vol 7 (2) ◽  
pp. 480-483
Author(s):  
Chengyu Li ◽  
Weijie Zhao

Abstract: What can the brain–computer interface (BCI) do? Wearing an electroencephalogram (EEG) headcap, you can control the flight of a drone in the laboratory by thought alone; with electrodes implanted in the brain, paralyzed patients can drink by controlling a robotic arm with their thoughts. Both invasive and non-invasive BCIs try to connect human brains to machines. Over the past several decades, BCI technology has continued to develop, turning science fiction into reality and laboratory inventions into indispensable gadgets. In July 2019, Neuralink, a company founded by Elon Musk, proposed a sewing-machine-like device that can drill holes in the skull and implant 3072 electrodes onto the cortex, promising a more accurate reading of what you are thinking, although many serious scientists consider the claim misleading to the public. Recently, National Science Review (NSR) interviewed Professor Bin He, head of the Department of Biomedical Engineering at Carnegie Mellon University and a leading scientist in the non-invasive BCI field. His team developed new methods for non-invasive BCI control of drones by thought. In 2019, Bin's team demonstrated non-invasive control of a robotic arm that continuously followed a randomly moving target on a screen. In this interview, Bin He recounted the history of BCI, as well as the opportunities and challenges of non-invasive BCI.

