Adaptive-projection intrinsically transformed multivariate empirical mode decomposition in cooperative brain–computer interface applications

Author(s):  
Apit Hemakom ◽  
Valentin Goverdovsky ◽  
David Looney ◽  
Danilo P. Mandic

An extension to multivariate empirical mode decomposition (MEMD), termed adaptive-projection intrinsically transformed MEMD (APIT-MEMD), is proposed to cater for power imbalances and inter-channel correlations in real-world multichannel data. It is shown that APIT-MEMD exhibits similar or better performance than MEMD for a large number of projection vectors, and outperforms MEMD in the critical case of a small number of projection vectors within the sifting algorithm. We also employ the noise-assisted APIT-MEMD within our proposed intrinsic multiscale analysis framework and illustrate the advantages of such an approach in notoriously noise-dominated cooperative brain–computer interface (BCI) paradigms based on steady-state visual evoked potentials and P300 responses. Finally, we show that for a joint cognitive BCI task, the proposed intrinsic multiscale analysis framework improves system performance in terms of the information transfer rate.
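At the core of MEMD sifting is the projection of the multichannel signal onto a set of direction vectors on a hypersphere, whose projection extrema locate the envelope knots; APIT-MEMD adapts these directions to the data. A rough sketch of the generic projection step follows (random direction sampling and all names are illustrative stand-ins, not the authors' implementation, which adapts directions and commonly uses low-discrepancy sequences):

```python
import numpy as np

def direction_vectors(n_dirs: int, n_channels: int, seed: int = 0) -> np.ndarray:
    """Unit vectors on the (n_channels-1)-sphere. MEMD typically uses
    low-discrepancy Hammersley points; random directions are a stand-in."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal((n_dirs, n_channels))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

def project(signal: np.ndarray, dirs: np.ndarray) -> np.ndarray:
    """Project a (samples, channels) signal onto each direction vector.
    The extrema of each 1-D projection give the knot locations for the
    multivariate envelopes used in the sifting step."""
    return signal @ dirs.T  # shape: (samples, n_dirs)

# Two-channel toy signal projected onto four directions
t = np.linspace(0, 8 * np.pi, 200)
x = np.column_stack([np.sin(t), np.cos(t)])
p = project(x, direction_vectors(4, 2))
print(p.shape)  # → (200, 4)
```

With few projection vectors, where each direction carries more weight, adapting them to the signal's covariance structure (as APIT-MEMD does) matters most, which is consistent with the performance gap the abstract reports.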

2018 ◽  
Vol 28 (10) ◽  
pp. 1850034 ◽  
Author(s):  
Wei Li ◽  
Mengfan Li ◽  
Huihui Zhou ◽  
Genshe Chen ◽  
Jing Jin ◽  
...  

Increasing the command generation rate of an event-related potential-based brain-robot system is challenging because of the limited information transfer rate of a brain-computer interface system. To improve the rate, we propose a dual-stimuli approach that flashes one robot image while simultaneously scanning another. Two kinds of event-related potentials, the N200 and P300, evoked under this dual-stimuli condition are decoded by a convolutional neural network. Compared with the traditional approaches, the proposed approach significantly improves the online information transfer rate from 23.0 or 17.8 to 39.1 bits/min at an accuracy of 91.7%. These results suggest that combining multiple types of stimuli to evoke distinguishable ERPs might be a promising direction for improving the command generation rate of a brain-computer interface.
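Information transfer rates like those reported above are conventionally computed with the Wolpaw formula, which combines the number of selectable targets, the classification accuracy, and the time per command. A minimal sketch (the target count and command period in the example are hypothetical, not taken from the paper):

```python
import math

def wolpaw_itr(n_targets: int, accuracy: float, trial_seconds: float) -> float:
    """Bits per minute under the Wolpaw ITR definition.
    Valid for accuracies between chance (1/n_targets) and 1."""
    p, n = accuracy, n_targets
    bits = math.log2(n)
    if p < 1.0:  # entropy penalty for misclassified commands
        bits += p * math.log2(p) + (1 - p) * math.log2((1 - p) / (n - 1))
    return bits * (60.0 / trial_seconds)

# Perfect binary selection, one command per minute: exactly 1 bit/min
print(wolpaw_itr(2, 1.0, 60.0))  # → 1.0
# Hypothetical setting: 4 targets, 91.7% accuracy, 4 s per command
print(round(wolpaw_itr(4, 0.917, 4.0), 1))
```

Note that at chance accuracy the formula correctly yields zero bits, so raising accuracy and shortening the command period are the two levers behind the improvement the abstract describes.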


Micromachines ◽  
2019 ◽  
Vol 10 (10) ◽  
pp. 681
Author(s):  
Bor-Shyh Lin ◽  
Bor-Shing Lin ◽  
Tzu-Hsiang Yen ◽  
Chien-Chin Hsu ◽  
Yao-Chin Wang

Brain–computer interface (BCI) is a system that allows people to communicate directly with external machines by recognizing brain activities, without manual operation. However, most current BCI systems require conventional electroencephalography (EEG) machines and computers to acquire EEG signals and translate them into control commands, respectively. These machines are usually large, which limits daily applications. Moreover, conventional EEG electrodes also require conductive gels to improve EEG signal quality. This causes discomfort and inconvenience, and the conductive gels may dry out during prolonged measurements. To address these issues, a wearable headset with a steady-state visually evoked potential (SSVEP)-based BCI is proposed in this study. Active dry electrodes were designed and implemented to acquire good-quality EEG signals from hairy sites without conductive gels. The SSVEP BCI algorithm was also implemented in the designed field-programmable gate array (FPGA)-based BCI module to translate SSVEP signals into control commands in real time. Moreover, a commercial tablet was used as the visual stimulus device to provide graphic control icons. The whole system was designed as a wearable device to improve convenience of use in daily life, and it could acquire and translate EEG signals directly in the front-end headset. Finally, the performance of the proposed system was validated, and the results showed that it had excellent performance (information transfer rate = 36.08 bits/min).


2013 ◽  
Vol 2013 ◽  
pp. 1-8 ◽  
Author(s):  
Shih Chung Chen ◽  
Aaron Raymond See ◽  
Yeou Jiunn Chen ◽  
Chia Hong Yeng ◽  
Chih Kuo Liang

People suffering from paralysis caused by serious neural disorders or spinal cord injury also need a means of recreation beyond general living aids. Although there has been a proliferation of brain–computer interface (BCI) applications, developments for recreational activities are scarcely seen. The objective of this study is to develop a BCI-based remote control integrated with commercial devices such as the remote-controlled Air Swimmer. The brain is visually stimulated using boxes flickering at preprogrammed frequencies to activate a brain response. After acquiring and processing these brain signals, the frequency of the resulting peak, which corresponds to the user's selection, is determined by a decision model. Consequently, a command signal is sent from the computer to the wireless remote controller via a data acquisition (DAQ) module. A command selection training (CST) and a simulated path test (SPT) were conducted by 12 subjects using the BCI control system, and the experimental results showed recognition accuracy rates of 89.51% and 92.31% for the CST and SPT, respectively. The fastest information transfer rates were 105 bits/min and 41.79 bits/min for the CST and SPT, respectively. The BCI system was proven to provide a fast and accurate response for a remote-controller application.
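The decision model described above picks the flicker frequency whose spectral peak dominates the recorded signal. A minimal sketch of such a frequency-peak classifier (the sampling rate, target frequencies, and window length are hypothetical, not the paper's values):

```python
import numpy as np

FS = 256                           # sampling rate in Hz (hypothetical)
TARGETS = [7.0, 9.0, 11.0, 13.0]   # flicker frequencies in Hz (hypothetical)

def classify_ssvep(eeg: np.ndarray) -> float:
    """Return the candidate frequency with the strongest spectral peak."""
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(len(eeg))))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / FS)
    # Magnitude at the FFT bin nearest each candidate flicker frequency
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in TARGETS]
    return TARGETS[int(np.argmax(powers))]

# Synthetic check: a noisy 11 Hz oscillation should map to the 11 Hz target
t = np.arange(2 * FS) / FS
rng = np.random.default_rng(0)
signal = np.sin(2 * np.pi * 11.0 * t) + 0.5 * rng.standard_normal(t.size)
print(classify_ssvep(signal))  # → 11.0
```

In a full system, the winning frequency would then be mapped to a remote-control command and forwarded through the DAQ module; practical implementations usually also require the peak to exceed a confidence threshold before issuing a command.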


Sensors ◽  
2021 ◽  
Vol 21 (13) ◽  
pp. 4578
Author(s):  
Jihyeon Ha ◽  
Sangin Park ◽  
Chang-Hwan Im ◽  
Laehyun Kim

Assistive devices such as meal-assist robots aid individuals with disabilities and support the elderly in performing daily activities. However, existing meal-assist robots are inconvenient to operate due to non-intuitive user interfaces, requiring additional time and effort. Thus, we developed a hybrid brain–computer interface-based meal-assist robot system using three features that can be measured with scalp electrodes for electroencephalography. The following three procedures comprise a single meal cycle. (1) Triple eye-blinks (EBs) from the prefrontal channel were treated as activation for initiating the cycle. (2) Steady-state visual evoked potentials (SSVEPs) from occipital channels were used to select the food per the user's intention. (3) Electromyograms (EMGs) were recorded from temporal channels as the users chewed the food, to mark the end of a cycle and indicate readiness for the following meal. The accuracy, false positive rate (FPR), and information transfer rate (ITR) during experiments on five subjects were as follows: accuracy (EBs/SSVEPs/EMGs) (%): 94.67/83.33/97.33; FPR (EBs/EMGs) (times/min): 0.11/0.08; ITR (SSVEPs) (bits/min): 20.41. These results revealed the feasibility of this assistive system. The proposed system allows users to eat on their own more naturally. Furthermore, it can increase the self-esteem of disabled and elderly people and enhance their quality of life.
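The three-stage meal cycle above is naturally expressed as a small state machine that advances only on the expected signal for the current stage. A minimal sketch (the event names and the food label are illustrative, not from the paper):

```python
from enum import Enum, auto

class Stage(Enum):
    IDLE = auto()     # waiting for triple eye-blink activation
    SELECT = auto()   # SSVEP-based food selection
    CHEWING = auto()  # EMG chewing marks the end of the cycle

class MealCycle:
    """Minimal state machine for the three-stage hybrid BCI cycle."""
    def __init__(self) -> None:
        self.stage = Stage.IDLE

    def on_event(self, event: str):
        if self.stage is Stage.IDLE and event == "triple_blink":
            self.stage = Stage.SELECT            # (1) activation
        elif self.stage is Stage.SELECT and event.startswith("ssvep:"):
            self.stage = Stage.CHEWING           # (2) food chosen
            return event.split(":", 1)[1]        # selected food item
        elif self.stage is Stage.CHEWING and event == "emg_chew":
            self.stage = Stage.IDLE              # (3) ready for next cycle
        return None

cycle = MealCycle()
cycle.on_event("triple_blink")
food = cycle.on_event("ssvep:rice")
cycle.on_event("emg_chew")
print(food, cycle.stage.name)  # → rice IDLE
```

Gating each transition on its own modality is what keeps the reported false positive rates low: an SSVEP detection cannot trigger a selection unless the eye-blink activation has already occurred.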

