Brain-Computer Interface: Generic Control Interface for Social Interaction Applications

Author(s):  
C. Hintermüller ◽  
C. Guger ◽  
G. Edlinger

Author(s):  
Judy Flavia ◽  
Aviraj Patel ◽  
Diwakar Kumar Jha ◽  
Navnit Kumar Jha

In this project we demonstrate the combined use of Augmented Reality (AR) and a brain–computer interface (BCI) to control a robotic actuator. This approach is simpler and more user friendly. The brainwave sensor operates in its normal setting, detecting alpha, beta, and gamma signals, and these signals are decoded to detect eye movements. On their own, such signals are very limited, since they offer too few combinations to support higher and more complex tasks. As a solution, AR is integrated with the BCI application to make the control interface more user friendly. This application can be used in many scenarios, including robotic and device control. Here we use BCI-AR to detect eye paralysis, which is achieved by detecting the eyelid movements of a person wearing a headband.
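As a rough illustration of the eyelid-detection idea described above, the following Python snippet shows one plausible way to flag blink-like activity from band-power estimates of a short EEG window. It is a minimal sketch under stated assumptions: the sampling rate, band limits, threshold, and the alpha/beta ratio heuristic are hypothetical and are not the pipeline used in the paper.

```python
# Illustrative sketch only: a hypothetical eyelid/blink detector built on
# band-power estimates such as a consumer EEG headband might expose. The
# sampling rate, band limits, threshold, and alpha/beta heuristic are all
# assumptions, not the authors' actual method.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz
BANDS = {"alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}


def band_powers(window: np.ndarray) -> dict:
    """Crude band-power estimate for one EEG window (1-D array of samples)."""
    freqs, psd = welch(window, fs=FS, nperseg=min(len(window), FS))
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = float(psd[mask].sum())  # proxy for power in the band
    return powers


def detect_eyelid_movement(window: np.ndarray, ratio_threshold: float = 3.0) -> bool:
    """Flag a blink-like event when lower-frequency power spikes relative to
    beta power - a stand-in for a real eyelid-movement classifier."""
    p = band_powers(window)
    return p["alpha"] / max(p["beta"], 1e-12) > ratio_threshold
```

In a full BCI-AR system, a detector of this kind would only supply the low-level events; the AR overlay would map those few events onto a richer menu of actuator commands.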


2010 ◽  
Vol 19 (1) ◽  
pp. 71-81 ◽  
Author(s):  
Francisco Velasco-Álvarez ◽  
Ricardo Ron-Angevin ◽  
Maria José Blanca-Mena

In this paper, an asynchronous brain–computer interface is presented that enables the control of a wheelchair in virtual environments using only one motor imagery task. Control is achieved through a graphical intentional control interface with three navigation commands (move forward, turn right, and turn left) displayed around a circle. A bar rotates in the center of the circle, pointing successively at the three possible commands. By motor imagery, the user can extend the length of this bar to select the command at which it is pointing. Once a command is selected, the virtual wheelchair moves in a continuous way, so the user controls the length of the advance or the amplitude of the turns. Users can voluntarily switch from this interface to a non-control interface (and vice versa) when they do not want to generate any command. After cue-based feedback training, three subjects carried out an experiment in which they had to navigate the same fixed path to reach an objective. The results obtained support the viability of the system.
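The selection mechanism described above amounts to a timed loop: the bar dwells on each command in turn, motor imagery extends it, and crossing a threshold triggers continuous execution of the pointed command. The Python sketch below illustrates that logic under assumed parameters; `read_imagery_score`, `send_command`, and `stop` are hypothetical caller-supplied interfaces, and the dwell time and threshold are not taken from the paper.

```python
# Illustrative sketch only: rotating-bar command selection driven by a
# hypothetical motor-imagery score in [0, 1]. Thresholds and dwell times
# are assumptions, not values from the paper.
import time
from itertools import cycle

COMMANDS = ["move forward", "turn right", "turn left"]
SELECT_THRESHOLD = 0.8   # bar length needed to select the pointed command
DWELL_SECONDS = 2.0      # how long the bar points at each command


def control_loop(read_imagery_score, send_command, stop):
    """read_imagery_score() -> float in [0, 1]; send_command(cmd) drives the
    virtual wheelchair; stop() -> bool ends the loop. All three are supplied
    by the caller (hypothetical interfaces)."""
    for command in cycle(COMMANDS):
        deadline = time.monotonic() + DWELL_SECONDS
        bar_length = 0.0
        while time.monotonic() < deadline:
            # Motor imagery extends the bar toward the pointed command.
            bar_length = max(bar_length, read_imagery_score())
            if bar_length >= SELECT_THRESHOLD:
                # Continuous control: keep issuing the command while the user
                # sustains the imagery, so they set the advance length or the
                # turn amplitude themselves.
                while read_imagery_score() >= SELECT_THRESHOLD and not stop():
                    send_command(command)
                break
            time.sleep(0.05)
        if stop():
            break
```

A dwell-based scheme like this is what makes the interface asynchronous with a single imagery task: the user simply produces no imagery until the bar points at the command they want.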


NeuroSci ◽  
2021 ◽  
Vol 2 (2) ◽  
pp. 109-119
Author(s):  
Szczepan Paszkiel ◽  
Ryszard Rojek ◽  
Ningrong Lei ◽  
Maria António Castro

The article describes the practical use of Unity technology in neurogaming. For this purpose, the article describes Unity technology and brain–computer interface (BCI) technology based on the Emotiv EPOC+ NeuroHeadset device. The process of creating the game world and the test results for using the BCI-based device as a control interface for the created game are also presented. The game was created in the Unity graphics engine and the Visual Studio environment in C#. The game presented in the article is called “NeuroBall” after the player’s object, a big red ball. Making the ball move requires the player’s full focus; the game aims to improve concentration and train the user’s brain in a user-friendly environment. Through neurogaming, it will be possible to exercise and train a healthy brain, as well as to diagnose and treat various symptoms of brain disorders. The project was created entirely in the Unity graphics engine, version 2020.1.
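The core control idea of such a focus-driven neurogame is a simple mapping from a concentration metric to the ball's motion. The sketch below shows that mapping in Python rather than the authors' Unity/C# code; the focus source, threshold, and speed scaling are hypothetical stand-ins.

```python
# Illustrative sketch only (Python, not the authors' Unity/C# implementation):
# a concentration metric from the headset is mapped onto the ball's forward
# speed. The metric source, threshold, and speed constants are assumptions.
import random
import time

FOCUS_THRESHOLD = 0.6   # below this the ball stays still
MAX_SPEED = 5.0         # units per second at full focus


def read_focus() -> float:
    """Stand-in for the headset's concentration metric in [0, 1]."""
    return random.random()


def step_ball(position: float, focus: float, dt: float) -> float:
    """Advance the ball only while the player's focus exceeds the threshold."""
    if focus < FOCUS_THRESHOLD:
        return position
    speed = MAX_SPEED * (focus - FOCUS_THRESHOLD) / (1.0 - FOCUS_THRESHOLD)
    return position + speed * dt


if __name__ == "__main__":
    pos = 0.0
    for _ in range(100):          # roughly five seconds of simulated play
        pos = step_ball(pos, read_focus(), dt=0.05)
        time.sleep(0.05)
    print(f"ball travelled {pos:.2f} units")
```

In the actual game, the same update would run per frame inside a Unity script, with the ball's transform moved instead of a plain position variable.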




2013 ◽  
Vol 133 (3) ◽  
pp. 635-641
Author(s):  
Genzo Naito ◽  
Lui Yoshida ◽  
Takashi Numata ◽  
Yutaro Ogawa ◽  
Kiyoshi Kotani ◽  
...  
