Classification of EEG Signals for Hand Gripping Motor Imagery and Hardware Representation of Neural States Using Arduino-Based LED Sensors

Author(s):  
Deepanshi Dabas ◽  
Ayushi ◽  
Mehak Lakhani ◽  
Bharti Sharma

Sensors ◽  
2019 ◽  
Vol 19 (13) ◽  
pp. 2854 ◽  
Author(s):  
Kwon-Woo Ha ◽  
Jin-Woo Jeong

Various convolutional neural network (CNN)-based approaches have recently been proposed to improve the performance of motor imagery-based brain-computer interfaces (BCIs). However, the classification accuracy of CNNs is compromised when the target data are distorted. Specifically, for motor imagery electroencephalogram (EEG), the measured signals, even from the same person, are not consistent and can be significantly distorted. To overcome these limitations, we propose to apply a capsule network (CapsNet) for learning various properties of EEG signals, thereby achieving better and more robust performance than previous CNN methods. The proposed CapsNet-based framework classifies two-class motor imagery, namely imagined right-hand and left-hand movements. The motor imagery EEG signals are first transformed into 2D images using the short-time Fourier transform (STFT) and then used for training and testing the capsule network. The performance of the proposed framework was evaluated on the BCI Competition IV dataset 2b. The proposed framework outperformed state-of-the-art CNN-based methods and various conventional machine learning approaches. The experimental results demonstrate the feasibility of the proposed approach for the classification of motor imagery EEG signals.
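The STFT step described above, turning a 1D EEG trial into a 2D time-frequency image, can be sketched as follows. The sampling rate, window length, and frequency band below are illustrative assumptions (typical for BCI Competition IV 2b), not the paper's exact settings:

```python
import numpy as np
from scipy.signal import stft

# Assumed parameters: 250 Hz sampling rate and a 1-second window.
# The paper's exact STFT configuration may differ.
FS = 250          # sampling rate in Hz (assumption)
NPERSEG = 250     # window length in samples, i.e., 1 second (assumption)

def eeg_to_spectrogram(trial: np.ndarray) -> np.ndarray:
    """Convert a 1-D EEG trial into a 2-D time-frequency image via STFT."""
    freqs, times, Zxx = stft(trial, fs=FS, nperseg=NPERSEG)
    power = np.abs(Zxx)  # magnitude spectrogram
    # Keep the mu (8-13 Hz) and beta (13-30 Hz) bands, which carry
    # the discriminative information for motor imagery.
    band = (freqs >= 8) & (freqs <= 30)
    return power[band, :]

# Example: one 4-second single-channel trial of synthetic EEG.
rng = np.random.default_rng(0)
trial = rng.standard_normal(4 * FS)
image = eeg_to_spectrogram(trial)
print(image.shape)  # (frequency bins in 8-30 Hz, time frames)
```

In the paper's pipeline, images like this (one per channel) form the input to the capsule network in place of raw time series.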


Author(s):  
S. R. Sreeja ◽  
Joytirmoy Rabha ◽  
Debasis Samanta ◽  
Pabitra Mitra ◽  
Monalisa Sarma

Author(s):  
Subrota Mazumdar ◽  
Rohit Chaudhary ◽  
Suruchi Suruchi ◽  
Suman Mohanty ◽  
Divya Kumari ◽  
...  

In this chapter, a k-nearest neighbor (k-NN)-based method for efficient classification of motor imagery using EEG for brain-computer interfacing (BCI) applications is proposed. Electroencephalogram (EEG) signals are recorded from multiple channels on the scalp. These EEG signals are taken as input features and fed to the k-NN-based classifier to classify motor imagery. More specifically, the chapter gives an outline of the Berlin brain-computer interface that can be operated with minimal subject change. All design and simulation work is carried out in MATLAB. The k-NN-based classifier is trained with data from continuous EEG channel signals. After the classifier is trained, it is tested with various test cases. The performance of the classifier is evaluated in terms of percentage accuracy, which is found to be 99.25%. The results suggest that the proposed method is accurate for BCI applications.
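The k-NN classification step above can be sketched as follows; since the chapter's pipeline is implemented in MATLAB and its features are not reproduced here, the feature matrix below is a synthetic stand-in, and k = 5 is an assumed neighbor count, not the chapter's setting:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for EEG-derived features: 200 trials x 4 features
# (e.g., band powers from several channels); labels 0/1 for left- vs.
# right-hand motor imagery. Real features would come from the EEG channels.
rng = np.random.default_rng(42)
X = rng.standard_normal((200, 4))
y = (X[:, 0] > 0).astype(int)  # separable toy labels for illustration

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# k=5 is a common default; the chapter does not fix k in this summary.
clf = KNeighborsClassifier(n_neighbors=5)
clf.fit(X_train, y_train)
accuracy = clf.score(X_test, y_test)
print(f"test accuracy: {accuracy:.2%}")
```

The appeal of k-NN here is that it needs no explicit training phase beyond storing the feature vectors, which keeps the BCI pipeline simple.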


Sensors ◽  
2019 ◽  
Vol 19 (3) ◽  
pp. 551 ◽  
Author(s):  
Mengxi Dai ◽  
Dezhi Zheng ◽  
Rui Na ◽  
Shuai Wang ◽  
Shuailei Zhang

Successful applications of brain-computer interface (BCI) approaches to motor imagery (MI) are still limited. In this paper, we propose a classification framework for MI electroencephalogram (EEG) signals that combines a convolutional neural network (CNN) architecture with a variational autoencoder (VAE). The decoder of the VAE outputs a Gaussian distribution, so it can be used to fit the Gaussian distribution of EEG signals. A new input representation was developed by combining the time, frequency, and channel information of the EEG signal, and the CNN-VAE method was designed and optimized for this form of input. In this network, the classification of the extracted CNN features is performed by the deep VAE. Our framework, with an average kappa value of 0.564, outperforms the best classification method in the literature on BCI Competition IV dataset 2b by 3%. Furthermore, on our own dataset, the CNN-VAE framework also yields the best performance for both three-electrode and five-electrode EEG, achieving average kappa values of 0.568 and 0.603, respectively. Our results show that the proposed CNN-VAE method raises performance to the current state of the art.
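The kappa values reported here are Cohen's kappa, kappa = (p_o - p_e) / (1 - p_e), which corrects raw accuracy p_o for the chance agreement p_e expected from the class distributions. A quick illustration with scikit-learn (the labels below are made up for demonstration, not from either dataset):

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical two-class MI predictions vs. ground truth (0 = left, 1 = right).
y_true = [0, 0, 1, 1, 0, 1, 1, 0, 1, 0]
y_pred = [0, 0, 1, 1, 0, 1, 0, 0, 1, 1]

# Here observed agreement p_o = 0.8 and chance agreement p_e = 0.5
# (balanced classes), so kappa = (0.8 - 0.5) / (1 - 0.5) = 0.6.
kappa = cohen_kappa_score(y_true, y_pred)
print(f"kappa = {kappa:.3f}")  # kappa = 0.600
```

Because chance agreement is 0.5 for balanced two-class MI, a kappa of 0.564 corresponds to roughly 78% accuracy, which is why kappa is the standard metric for this benchmark.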

