A 3D Visual Stimuli Based P300 Brain-computer Interface

Author(s): Javier F. Castillo-Garcia, Sandra Müller, Eduardo F. Caicedo-Bravo, Teodiano F. Bastos, Alberto F. De Souza

This work describes the development of a non-fatiguing Brain–Computer Interface (BCI) based on Steady-State Visually Evoked Potentials (SSVEP) and Event-Related Desynchronization (ERD) to control an autonomous car. A graphical interface presented to the user inside the autonomous car displays the available destination places. Commands are selected through visual stimuli and brain signals. The signals are captured over the occipital region of the scalp and processed to obtain the data needed by the planning system of the autonomous car. Tests achieved success rates of 90% for a synchronous BCI and 83% for an asynchronous BCI. The proposed system is a hybrid BCI that includes the ability to enable and disable the visual stimuli, reducing the fatigue associated with the use of SSVEP-based BCIs. A video of this development can be accessed at cbeb2020.org/AutonomousCarVideo.mp4.
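The abstract does not specify the signal-processing method, but a common way to select an SSVEP command from occipital EEG is to compare spectral power at each candidate flicker frequency. The sketch below is a minimal, hypothetical illustration of that idea (the function name, sampling rate, and frequencies are assumptions, not details from the paper), using only NumPy:

```python
import numpy as np

def classify_ssvep(eeg, fs, stim_freqs):
    """Pick the candidate stimulus frequency with the highest spectral
    power in a single-channel occipital EEG segment (illustrative only)."""
    n = len(eeg)
    # Windowed FFT power spectrum of the segment.
    spectrum = np.abs(np.fft.rfft(eeg * np.hanning(n))) ** 2
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    # Sum power in a narrow band around each candidate flicker frequency.
    scores = [spectrum[(freqs > f - 0.5) & (freqs < f + 0.5)].sum()
              for f in stim_freqs]
    return stim_freqs[int(np.argmax(scores))]

# Synthetic check: a 10 Hz "SSVEP" component riding on noise.
fs = 250
t = np.arange(0, 4, 1 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * rng.standard_normal(t.size)
print(classify_ssvep(eeg, fs, [8.0, 10.0, 12.0]))
```

In a real system each destination shown on the graphical interface would flicker at its own frequency, and the detected frequency would be mapped to the corresponding command for the car's planning system; production BCIs typically use more robust detectors such as canonical correlation analysis across multiple channels.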

