Brain-Computer Interfaces and Human-Computer Interaction

Author(s):  
Desney Tan ◽  
Anton Nijholt


Photonics ◽
2019 ◽  
Vol 6 (3) ◽  
pp. 90 ◽  
Author(s):  
Bosworth ◽  
Russell ◽  
Jacob

Over the past decade, the Human–Computer Interaction (HCI) Lab at Tufts University has been developing real-time, implicit Brain–Computer Interfaces (BCIs) using functional near-infrared spectroscopy (fNIRS). This paper reviews the work of the lab; we explore how we have used fNIRS to develop BCIs based on a variety of human states, including cognitive workload, multitasking, musical learning, and preference detection. Our work indicates that fNIRS is a robust tool for the real-time classification of brain states, which can provide programmers with useful information for developing interfaces that are more intuitive and beneficial for the user than is currently possible with today’s human-input devices (e.g., mouse and keyboard).
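To make the pipeline concrete, the sketch below shows one plausible shape for such a system: window the optical signal, extract simple per-channel features, and classify workload in real time. This is a minimal illustration only; the channel count, window length, features, and classifier are assumptions, not the Tufts lab's actual implementation.

```python
# Minimal sketch of a real-time fNIRS workload classifier in the spirit of
# the pipeline described above. NOT the Tufts HCI Lab's actual code; the
# channel count, window size, features, and classifier are assumptions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

N_CHANNELS = 16  # assumed number of fNIRS channels
WINDOW = 30      # assumed samples per classification window

def extract_features(window):
    """Summarize each channel's hemoglobin trace by its mean level and
    linear slope, two features commonly used in fNIRS workload studies."""
    means = window.mean(axis=1)
    t = np.arange(window.shape[1])
    slopes = np.polyfit(t, window.T, 1)[0]  # per-channel slope
    return np.concatenate([means, slopes])

# Train on labeled calibration windows (synthetic stand-ins here).
rng = np.random.default_rng(0)
low = rng.normal(0.0, 1.0, (40, N_CHANNELS, WINDOW))   # low-workload trials
high = rng.normal(0.5, 1.0, (40, N_CHANNELS, WINDOW))  # high-workload trials
X = np.array([extract_features(w) for w in np.concatenate([low, high])])
y = np.array([0] * 40 + [1] * 40)
clf = LinearDiscriminantAnalysis().fit(X, y)

# At runtime, classify each incoming window so the interface can adapt.
incoming = rng.normal(0.5, 1.0, (N_CHANNELS, WINDOW))
state = clf.predict([extract_features(incoming)])[0]
print("estimated workload:", "high" if state else "low")
```

In an implicit BCI, the predicted state would not trigger commands directly; it would quietly adjust the interface, for example by deferring notifications while workload is high.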


2017 ◽  
Vol 2017 ◽  
pp. 1-12 ◽  
Author(s):  
Luz María Alonso-Valerdi ◽
Víctor Rodrigo Mercado-García

Tridimensional representations stimulate cognitive processes that are the core and foundation of human-computer interaction (HCI). Those cognitive processes take place while a user navigates and explores a virtual environment (VE) and are mainly related to spatial memory storage, attention, and perception. VEs have many distinctive features (e.g., involvement, immersion, and presence) that can significantly improve HCI in highly demanding and interactive systems such as brain-computer interfaces (BCI). A BCI is a nonmuscular communication channel that attempts to reestablish the interaction between an individual and his/her environment. Although BCI research started in the 1960s, this technology is not yet efficient or reliable enough for everyone at all times. Over the past few years, researchers have argued that the main BCI flaws could be associated with HCI issues. The evidence presented thus far shows that VEs can (1) set out working environmental conditions, (2) maximize the efficiency of BCI control panels, (3) implement navigation systems based not only on user intentions but also on user emotions, and (4) regulate the user's mental state to increase the differentiation between control and noncontrol modalities.
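Point (4) is the most algorithmic of these. The sketch below shows one way such control/noncontrol gating might look, with an assumed confidence threshold and hypothetical command names; the paper itself does not prescribe this implementation.

```python
# Illustrative sketch (not from the paper) of gating BCI commands in a
# virtual environment by separating "control" from "non-control" states.
# Threshold and command names are assumptions made for this example.
from dataclasses import dataclass

@dataclass
class Decoded:
    command: str       # e.g. "move_forward", "turn_left" (hypothetical)
    confidence: float  # decoder posterior for the control state, 0..1

CONTROL_THRESHOLD = 0.8  # assumed; in practice tuned per user

def gate(decoded: Decoded):
    """Forward a command to the VE only when the decoder is confident the
    user is intentionally controlling; suppress non-control activity."""
    if decoded.confidence >= CONTROL_THRESHOLD:
        return decoded.command
    return None  # treated as non-control: the VE keeps its current state

for sample in [Decoded("move_forward", 0.93), Decoded("turn_left", 0.41)]:
    action = gate(sample)
    print(action or "non-control: command suppressed")
```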


Author(s):  
Thorsten O. Zander ◽  
Laurens R. Krol

Brain-computer interfaces can provide an input channel from humans to computers that depends only on brain activity, bypassing traditional means of communication and interaction. This input channel can be used to send explicit commands, but also to provide implicit input to the computer. As such, the computer can obtain information about its user that not only bypasses, but also goes beyond what can be communicated using traditional means. In this form, implicit input can potentially provide significant improvements to human-computer interaction. This paper describes a selection of work done by Team PhyPA (Physiological Parameters for Adaptation) at the Technische Universität Berlin to use brain-computer interfacing to enrich human-computer interaction.
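The implicit-input loop this describes can be summarized in a few lines: the computer periodically reads a passive-BCI estimate of the user's state and adapts its behavior without any explicit command. The sketch below is a stand-in illustration; the estimator and the adaptation rules are assumptions, not Team PhyPA's actual EEG classifiers.

```python
# Minimal sketch of an implicit-input loop: the system adapts to a
# passive-BCI state estimate rather than to explicit user commands.
# estimate_workload() is a random stand-in for a real EEG-based classifier.
import random

def estimate_workload() -> float:
    """Stand-in for a passive-BCI workload estimate in [0, 1]."""
    return random.random()

def adapt_interface(workload: float) -> str:
    # Implicit adaptation: reduce information rate when workload is high.
    if workload > 0.7:
        return "reduce update rate, hide secondary panels"
    if workload < 0.3:
        return "increase detail, surface suggestions"
    return "keep current presentation"

for _ in range(3):
    w = estimate_workload()
    print(f"workload={w:.2f} -> {adapt_interface(w)}")
```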


2016 ◽  
pp. 251-269
Author(s):  
Andéol Evain ◽  
Nicolas Roussel ◽  
Gry Casiez ◽  
Fernando Argelaguet-Sanz ◽  
Anatole Lécuyer

2015 ◽  
Vol 9 (3-4) ◽  
pp. 263 ◽  
Author(s):  
J. Harry Whalley ◽  
Panagiotis Mavros ◽  
Peter Furniss

This paper explores questions of agency, control, and interaction, and the embodied nature of musical performance in relation to the use of human-computer interaction (HCI), through the experimental work Clasp Together (beta) [1] for small ensemble and live electronics by J. Harry Whalley. This practice-led research is situated at the intersection of music neurotechnology for sound synthesis and brain-computer interfaces (BCI, a subdomain of HCI), and explores the use of neural patterns from electroencephalography (EEG) as a control instrument. The composition departed from the traditional composer/performer paradigm by using both non-instrumental physical gestures and cognitive or emotive instructions integrated into the score.
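As a rough illustration of the general technique, and not the actual Clasp Together (beta) implementation, the sketch below extracts an EEG alpha-band power feature and maps it onto a live-electronics control parameter; the sampling rate, frequency band, and mapping function are illustrative assumptions.

```python
# Hedged sketch: derive a band-power feature from EEG and map it to a
# synthesis parameter. Not the Clasp Together (beta) patch; FS, the alpha
# band, and the cutoff mapping are assumptions for illustration.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed EEG sampling rate in Hz

def alpha_power(eeg: np.ndarray) -> float:
    """Mean power in the 8-12 Hz alpha band for one channel."""
    freqs, psd = welch(eeg, fs=FS, nperseg=FS)
    band = (freqs >= 8) & (freqs <= 12)
    return float(psd[band].mean())

def to_cutoff(power: float, lo=200.0, hi=4000.0) -> float:
    """Map band power onto an assumed filter-cutoff range (Hz)."""
    norm = np.tanh(power)  # squash to 0..1 (assumed normalization)
    return lo + norm * (hi - lo)

eeg = np.random.default_rng(1).normal(size=FS * 2)  # 2 s of fake EEG
print(f"cutoff = {to_cutoff(alpha_power(eeg)):.0f} Hz")
```

In a live setting, the resulting parameter would typically be streamed to the synthesis engine each window, so the performer's measured neural activity continuously shapes the electronics alongside the notated score.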

