Cerebral Pulse Controlled Robotic Hand for Effective Paralytic Movements

Author(s):  
Abhay Patil

Abstract: There are roughly 21 million disabled people in India, equivalent to about 2.2% of the total population, many of whom are affected by various neuromuscular disorders. To enable them to express themselves, they can be provided with alternative and augmentative communication. To address this need, a Brain-Computer Interface (BCI) system has been built. The project reports the design, operation, and test evaluation of an imitation of a human arm that is intended to be both dynamically and kinematically accurate. The device mimics the motion of the human hand by analyzing the signals produced by brain waves. The brain waves are sensed by the NeuroSky headset, which outputs alpha, beta, and gamma signals. This signal is then analyzed by the microcontroller and actuated on the synthetic hand via servo motors. A patient with a below-elbow amputation can benefit from this biomechanical arm.

Keywords: Brainwaves, Brain-Computer Interface, Arduino, EEG sensor, NeuroSky MindWave Headset, Robotic arm
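The abstract describes a simple pipeline: headset brainwave values are read, translated into servo angles, and forwarded to the microcontroller driving the hand. The sketch below is a minimal host-side illustration of that idea, not the authors' code; the serial ports, baud rates, and the packet-parsing stand-in read_attention() are assumptions.

    import random  # used only to simulate headset output below
    import serial  # pyserial

    ARDUINO_PORT = "/dev/ttyUSB0"   # assumption: adjust for your setup
    HEADSET_PORT = "/dev/ttyACM0"   # assumption

    def read_attention(headset):
        """Hypothetical stand-in: a real version would parse the NeuroSky
        ThinkGear byte stream and return the 0-100 eSense attention value."""
        return random.randint(0, 100)

    def attention_to_angle(attention):
        """Scale a 0-100 attention value to a 0-180 degree servo angle."""
        return max(0, min(180, int(attention * 1.8)))

    def main():
        headset = serial.Serial(HEADSET_PORT, 57600, timeout=1)
        arduino = serial.Serial(ARDUINO_PORT, 9600, timeout=1)
        while True:
            angle = attention_to_angle(read_attention(headset))
            # The Arduino sketch is assumed to read one angle per line
            # and apply it to the hand's servo with Servo.write().
            arduino.write(f"{angle}\n".encode())

    if __name__ == "__main__":
        main()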

Brain-Computer Interface (BCI) is a technology that enables a human to communicate with an external device to achieve a desired result. This paper presents Motor Imagery (MI) Electroencephalography (EEG) signal based robotic hand movements, namely the lifting and dropping motions of an external robotic arm. The MI-EEG signals were extracted using a 3-channel electrode system with the AD8232 amplifier. The electrodes were placed at three locations: C3, C4, and the right mastoid. Signal processing methods, namely a Butterworth filter and Sym-9 Wavelet Packet Decomposition (WPD), were applied to de-noise the raw EEG signals. Statistical features such as entropy, variance, standard deviation, covariance, and spectral centroid were extracted from the de-noised signals. The statistical features were then used to train a Multi-Layer Perceptron (MLP) Deep Neural Network (DNN) to classify the hand movement into two classes: 'No Hand Movement' and 'Hand Movement'. The resulting k-fold cross-validated accuracy was 85.41%, and other classification metrics, such as precision, recall (sensitivity), specificity, and F1 score, were also calculated. The trained model was interfaced with Arduino to move the robotic arm according to the class predicted by the DNN model in a real-time environment. The proposed end-to-end low-cost deep learning framework provides a substantial improvement in real-time BCI.
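As a rough illustration of the processing chain this abstract describes (Butterworth filtering, Sym-9 WPD, statistical features, MLP classification with k-fold cross-validation), the following sketch uses SciPy, PyWavelets, and scikit-learn on synthetic single-channel data. The sampling rate, band edges, decomposition level, and network size are assumptions for illustration, not values from the paper.

    import numpy as np
    import pywt
    from scipy.signal import butter, filtfilt
    from scipy.stats import entropy
    from sklearn.model_selection import cross_val_score
    from sklearn.neural_network import MLPClassifier

    FS = 250  # Hz, assumed sampling rate

    def denoise(epoch):
        """Band-pass the raw epoch, then keep the Sym-9 WPD approximation."""
        b, a = butter(4, [8, 30], btype="bandpass", fs=FS)  # assumed MI band
        filtered = filtfilt(b, a, epoch)
        wp = pywt.WaveletPacket(filtered, wavelet="sym9", maxlevel=3)
        return wp["aaa"].data  # level-3 low-frequency approximation node

    def features(epoch):
        """Entropy, variance, std, and spectral centroid of one channel."""
        x = denoise(epoch)
        spectrum = np.abs(np.fft.rfft(x))
        freqs = np.fft.rfftfreq(len(x), d=1 / FS)
        centroid = np.sum(freqs * spectrum) / np.sum(spectrum)
        power = spectrum / np.sum(spectrum)
        return [entropy(power), np.var(x), np.std(x), centroid]

    # Synthetic stand-in for labelled MI-EEG epochs: 200 epochs x 500 samples.
    rng = np.random.default_rng(0)
    X = np.array([features(e) for e in rng.standard_normal((200, 500))])
    y = rng.integers(0, 2, 200)  # 0 = no hand movement, 1 = hand movement

    clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000)
    print(cross_val_score(clf, X, y, cv=5).mean())  # k-fold CV accuracy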



Author(s):  
Judy Flavia ◽  
Aviraj Patel ◽  
Diwakar Kumar Jha ◽  
Navnit Kumar Jha

In this project we demonstrate the combined use of Augmented Reality (AR) and a Brain-Computer Interface (BCI) to control a robotic actuator. This method is simpler and more user-friendly. Here the brainwave sensor works in its normal setting, detecting alpha, beta, and gamma signals. These signals are decoded to detect eye movements. On their own, such commands are very limited, since few combinations are available for higher-level and more complex tasks. As a solution, AR is integrated with the BCI application to make the control interface more user-friendly. This application can be used in many cases, including many robotics and device-control scenarios. Here we use BCI-AR to detect eye paralysis, which can be achieved by detecting the eyelid movement of a person wearing the headband.
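A minimal sketch of the eyelid-movement idea: detect blinks as large, short-lived excursions in the raw signal and group them into commands. The threshold, refractory window, command mapping, and synthetic trace are assumptions for illustration; a real system would read the headband's blink output instead.

    import numpy as np

    FS = 512                    # Hz, assumed headband sampling rate
    BLINK_THRESHOLD = 150.0     # microvolts, assumed amplitude threshold
    REFRACTORY = int(0.3 * FS)  # ignore samples this close to the last blink

    def detect_blinks(eeg):
        """Return sample indices where the signal crosses the blink threshold,
        enforcing a refractory period so one blink is counted once."""
        blinks, last = [], -REFRACTORY
        for i, v in enumerate(eeg):
            if abs(v) > BLINK_THRESHOLD and i - last >= REFRACTORY:
                blinks.append(i)
                last = i
        return blinks

    def blinks_to_command(count):
        """Map blink counts in a window to actuator commands (illustrative)."""
        return {1: "select", 2: "open gripper", 3: "close gripper"}.get(count, "idle")

    # Synthetic 2-second trace: baseline noise with two injected blink artefacts.
    rng = np.random.default_rng(1)
    trace = rng.normal(0, 10, 2 * FS)
    trace[200], trace[800] = 400.0, 420.0
    blinks = detect_blinks(trace)
    print(len(blinks), blinks_to_command(len(blinks)))  # -> 2 open gripper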


2019 ◽  
Vol 7 (2) ◽  
pp. 480-483
Author(s):  
Chengyu Li ◽  
Weijie Zhao

Abstract: What can the brain-computer interface (BCI) do? Wearing an electroencephalogram (EEG) headcap, you can control the flight of a drone in the laboratory with your thoughts; with electrodes inserted inside the brain, paralyzed patients can drink by controlling a robotic arm with their thoughts. Both invasive and non-invasive BCIs try to connect human brains to machines. In the past several decades, BCI technology has continued to develop, turning science fiction into reality and laboratory inventions into indispensable gadgets. In July 2019, Neuralink, a company founded by Elon Musk, proposed a sewing-machine-like device that can drill holes in the skull and implant 3072 electrodes onto the cortex, promising more accurate reading of what you are thinking, although many serious scientists consider the claim misleading to the public. Recently, National Science Review (NSR) interviewed Professor Bin He, the department head of Biomedical Engineering at Carnegie Mellon University and a leading scientist in the non-invasive-BCI field. His team developed new methods for non-invasive BCI control of drones by thought. In 2019, his team demonstrated the control of a robotic arm to follow a continuously and randomly moving target on a screen. In this interview, Bin He recounted the history of BCI, as well as the opportunities and challenges of non-invasive BCI.

