Augmented Reality (AR) and Brain-Computer Interface (BCI): Two Enabling Technologies for Empowering the Fruition of Sensor Data in the 4.0 Era

Author(s):  
Annarita Tedesco ◽  
Dominique Dallet ◽  
Pasquale Arpaia
2020 ◽  
Vol 69 (4) ◽  
pp. 1530-1539 ◽  
Author(s):  
Leopoldo Angrisani ◽  
Pasquale Arpaia ◽  
Antonio Esposito ◽  
Nicola Moccaldi

Sensors ◽  
2021 ◽  
Vol 21 (17) ◽  
pp. 5765
Author(s):  
Soram Kim ◽  
Seungyun Lee ◽  
Hyunsuk Kang ◽  
Sion Kim ◽  
Minkyu Ahn

Since the emergence of head-mounted displays (HMDs), researchers have attempted to introduce virtual and augmented reality (VR, AR) into brain–computer interface (BCI) studies. However, few studies incorporate both AR and VR to compare performance between the two environments, so a BCI application usable in both is needed to enable such a comparison. In this study, we developed an open-source drone-control application using a P300-based BCI that can be used in both VR and AR. Twenty healthy subjects participated in the experiment with this application; they were asked to control the drone in the two environments and filled out questionnaires before and after the experiment. We found no significant (p > 0.05) difference between VR and AR in online performance (classification accuracy and amplitude/latency of the P300 component) or user experience (satisfaction with session length, program, environment, interest, difficulty, immersion, and feeling of self-control). This indicates that the P300 BCI paradigm is relatively reliable and may work well in various situations.
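
As a rough illustration of the P300 paradigm used above, the sketch below classifies target versus non-target flash epochs with a linear discriminant, a classifier commonly paired with P300 interfaces. The array shapes, decimation factor, and synthetic data are assumptions made for the example, not the authors' pipeline.

# Minimal P300 target/non-target classification sketch (assumed shapes,
# synthetic data); illustrative only, not the study's implementation.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# epochs: (n_trials, n_channels, n_samples) of band-pass-filtered EEG
# time-locked to each stimulus flash
n_trials, n_channels, n_samples = 200, 8, 128
epochs = rng.standard_normal((n_trials, n_channels, n_samples))
labels = rng.integers(0, 2, n_trials)        # 1 = target flash, 0 = non-target

# Feature vector: decimated post-stimulus samples from every channel
features = epochs[:, :, ::8].reshape(n_trials, -1)

clf = LinearDiscriminantAnalysis()
acc = cross_val_score(clf, features, labels, cv=5).mean()
print(f"Cross-validated accuracy: {acc:.2f}")

With real epochs in place of the synthetic array, the same cross-validated accuracy would serve as the online-performance measure compared between the VR and AR conditions.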


Author(s):  
Judy Flavia ◽  
Aviraj Patel ◽  
Diwakar Kumar Jha ◽  
Navnit Kumar Jha

In this project we demonstrate the combined use of Augmented Reality (AR) and a brain-computer interface (BCI) to control a robotic actuator, a method that is simpler and more user-friendly. The brainwave sensor operates in its normal setting, detecting alpha, beta, and gamma signals, which are decoded to detect eye movements. On their own these signals are very limited, since few combinations are available for higher and more complex tasks. As a solution, AR is integrated with the BCI application to make the control interface more user-friendly. The application can be used in many scenarios, including the control of robots and other devices. Here we use BCI-AR to detect eye paralysis, which is achieved by detecting the eyelid movement of a person wearing a headband; a band-power sketch follows below.
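
A minimal sketch of the band-power idea described above, assuming a single-channel headband signal: alpha, beta, and gamma power are estimated with Welch's method, and a naive alpha/beta ratio threshold stands in for eyelid-closure detection. The sampling rate, channel, and threshold are illustrative assumptions, not the project's actual settings.

# Hedged band-power sketch for eyelid-closure detection from one EEG channel;
# parameters are assumptions, not taken from the project.
import numpy as np
from scipy.signal import welch

fs = 256                                   # sampling rate in Hz (assumed)
eeg = np.random.randn(fs * 4)              # 4 s of synthetic single-channel EEG

def band_power(signal, fs, lo, hi):
    # Welch PSD, then integrate power over the requested band
    freqs, psd = welch(signal, fs=fs, nperseg=fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return np.sum(psd[mask]) * (freqs[1] - freqs[0])

alpha = band_power(eeg, fs, 8, 13)
beta  = band_power(eeg, fs, 13, 30)
gamma = band_power(eeg, fs, 30, 45)

# Eyelid closure typically raises alpha power relative to beta;
# the 2.0 ratio threshold below is illustrative only.
eyes_closed = alpha / (beta + 1e-12) > 2.0
print(alpha, beta, gamma, eyes_closed)

In a full system, such band-power features would be streamed from the headband and the detected eyelid state fed to the AR interface as a selection command for the robotic actuator.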


2020 ◽  
Vol 14 ◽  
Author(s):  
Amaia Benitez-Andonegui ◽  
Rodion Burden ◽  
Richard Benning ◽  
Rico Möckel ◽  
Michael Lührs ◽  
...  

2015 ◽  
Vol 75 (4) ◽  
Author(s):  
Faris Amin M. Abuhashish ◽  
Hoshang Kolivand ◽  
Mohd Shahrizal Sunar ◽  
Dzulkifli Mohamad

A Brain-Computer Interface (BCI) is a device that can read and acquire brain activity. The human body is controlled by brain signals, which act as its main controller. Human emotions and thoughts are translated by the brain into brain signals and expressed as mood; these signals are the key component of the electroencephalogram (EEG). Through signal processing, features representing human mood (behavior) can be extracted, with emotion as the major feature. This paper proposes a new framework for recognizing human inner emotions from EEG signals using a BCI device as controller. The framework comprises five steps: reading and classifying the brain signal to obtain the emotion, mapping the emotion, synchronizing the animation of the 3D virtual human, testing, and evaluating the work. To the best of our knowledge, there is no existing framework for controlling a 3D virtual human in this way. Implementing the framework will enhance games by controlling the emotional walking of 3D virtual humans and making it more realistic. Commercial games and Augmented Reality systems are possible beneficiaries of this technique.
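
To make the mapping and synchronization steps of the five-step framework concrete, the sketch below shows one way a classified emotion label could be translated into walking-animation parameters for a 3D virtual human. The emotion set, parameter names, and values are assumptions for illustration, not the framework's actual mapping.

# Illustrative mapping from a classified EEG emotion label to walking-animation
# parameters; the emotion set and values are assumptions, not the paper's.
from dataclasses import dataclass

@dataclass
class WalkStyle:
    speed: float        # relative walking speed
    posture: float      # 0 = slumped, 1 = upright
    arm_swing: float    # relative arm-swing amplitude

# Step "map the emotion": emotion label -> animation parameters
EMOTION_TO_WALK = {
    "happy":   WalkStyle(speed=1.2, posture=1.0, arm_swing=1.1),
    "sad":     WalkStyle(speed=0.7, posture=0.3, arm_swing=0.5),
    "angry":   WalkStyle(speed=1.4, posture=0.8, arm_swing=1.3),
    "neutral": WalkStyle(speed=1.0, posture=0.8, arm_swing=1.0),
}

def synchronize_avatar(emotion: str) -> WalkStyle:
    """Step "synchronize": hand the mapped parameters to the 3D animation engine."""
    return EMOTION_TO_WALK.get(emotion, EMOTION_TO_WALK["neutral"])

print(synchronize_avatar("sad"))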

