Real-Time Access to Attention and Attention-Based Brain-Machine Interfaces

2021 ◽  
pp. 545-549
Author(s):  
C. Gaillard ◽  
C. De Sousa ◽  
J. Amengual ◽  
S. Ben Hamed

Author(s):  
Jack DiGiovanna ◽  
Loris Marchal ◽  
Prapaporn Rattanatamrong ◽  
Ming Zhao ◽  
Shalom Darmanjian ◽  
...  

Author(s):  
Vasileios G. Kanas ◽  
Iosif Mporas ◽  
Heather L. Benz ◽  
Kyriakos N. Sgarbas ◽  
Anastasios Bezerianos ◽  
...  

2013 ◽  
Vol 10 (3) ◽  
pp. 036008 ◽  
Author(s):  
Julie Dethier ◽  
Paul Nuyujukian ◽  
Stephen I Ryu ◽  
Krishna V Shenoy ◽  
Kwabena Boahen

2011 ◽  
Vol 2011 ◽  
pp. 1-7 ◽  
Author(s):  
Gustavo Sudre ◽  
Lauri Parkkonen ◽  
Elizabeth Bock ◽  
Sylvain Baillet ◽  
Wei Wang ◽  
...  

To date, the majority of studies using magnetoencephalography (MEG) rely on off-line analysis of the spatiotemporal properties of brain activity. Real-time MEG feedback could potentially benefit multiple areas of basic and clinical research: brain-machine interfaces, neurofeedback rehabilitation of stroke and spinal cord injury, and new adaptive paradigm designs, among others. We have developed a software interface to stream MEG signals in real time from the 306-channel Elekta Neuromag MEG system to an external workstation. The signals can be accessed with a minimal delay (≤45 ms) when data are sampled at 1000 Hz, which is sufficient for most real-time studies. We also show here that real-time source imaging is possible by demonstrating real-time monitoring and feedback of alpha-band power fluctuations over parieto-occipital and frontal areas. The interface is made available to the academic community as an open-source resource.
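The alpha-band power monitoring described above can be sketched in a few lines. The snippet below is a minimal illustration only (NumPy, a single hypothetical channel buffer sampled at 1000 Hz, FFT-based band power); it is not the published software's interface.

```python
import numpy as np

FS = 1000.0  # sampling rate (Hz), as stated in the abstract

def band_power(x: np.ndarray, low: float, high: float) -> float:
    """Summed FFT power of a 1-D signal buffer within [low, high] Hz."""
    freqs = np.fft.rfftfreq(x.size, d=1.0 / FS)
    psd = np.abs(np.fft.rfft(x)) ** 2
    mask = (freqs >= low) & (freqs <= high)
    return float(psd[mask].sum())

# One second of a 10 Hz sinusoid (inside the 8-12 Hz alpha band) carries
# far more alpha power than a 40 Hz sinusoid of the same amplitude.
t = np.arange(0, 1.0, 1.0 / FS)
alpha_sig = np.sin(2 * np.pi * 10 * t)
gamma_sig = np.sin(2 * np.pi * 40 * t)
print(band_power(alpha_sig, 8, 12) > band_power(gamma_sig, 8, 12))  # True
```

In a real-time loop, such a function would be applied to each newly streamed buffer to drive the feedback signal.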


2020 ◽  
Author(s):  
Samuel R. Nason ◽  
Matthew J. Mender ◽  
Alex K. Vaskov ◽  
Matthew S. Willsey ◽  
Parag G. Patil ◽  
...  

Summary
Modern brain-machine interfaces can return function to people with paralysis, but current hand neural prostheses are unable to reproduce control of individuated finger movements. Here, for the first time, we present a real-time, high-speed, linear brain-machine interface in nonhuman primates that utilizes intracortical neural signals to bridge this gap. We created a novel task that systematically individuates two finger groups, the index finger and the middle-ring-small fingers combined, presenting separate targets for each group. During online brain control, the ReFIT Kalman filter demonstrated the capability of individuating movements of each finger group with high performance, enabling a nonhuman primate to acquire two targets simultaneously at 1.95 targets per second, resulting in an average information throughput of 2.1 bits per second. To understand this result, we performed single-unit tuning analyses. Cortical neurons were active for movements of an individual finger group, combined movements of both finger groups, or both. Linear combinations of neural activity representing individual finger group movements predicted the neural activity during combined finger group movements with high accuracy, and vice versa. Hence, a linear model was able to explain how cortical neurons encode information about multiple dimensions of movement simultaneously. Additionally, training ridge regression decoders on independent component movements was sufficient to predict untrained higher-complexity movements. Our results suggest that linear decoders for brain-machine interfaces may be sufficient to execute high-dimensional tasks with the performance levels required for naturalistic neural prostheses.
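The linear-encoding result above (ridge decoders trained on individual finger group movements generalizing to combined movements) can be illustrated on synthetic data. Everything below is a hypothetical toy model, not the authors' data or pipeline: channel counts, tuning weights, and noise levels are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def ridge_fit(X, Y, lam=1.0):
    """Closed-form ridge regression: W = (X'X + lam*I)^-1 X'Y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)

# Synthetic neural data: 40 channels, each linearly tuned to the positions
# of two finger groups (index; middle-ring-small), plus Gaussian noise.
n_channels, n_trials = 40, 200
tuning = rng.normal(size=(2, n_channels))          # per-group tuning weights
moves_single = rng.normal(size=(n_trials, 2))
moves_single[:n_trials // 2, 1] = 0.0              # index-only trials
moves_single[n_trials // 2:, 0] = 0.0              # MRS-only trials
X_train = moves_single @ tuning + 0.1 * rng.normal(size=(n_trials, n_channels))

W = ridge_fit(X_train, moves_single, lam=1.0)      # neural -> kinematics decoder

# Evaluate on untrained *combined* movements of both finger groups.
moves_comb = rng.normal(size=(n_trials, 2))
X_test = moves_comb @ tuning + 0.1 * rng.normal(size=(n_trials, n_channels))
pred = X_test @ W
r = np.corrcoef(pred.ravel(), moves_comb.ravel())[0, 1]
print(round(r, 2))  # high correlation: the linear decoder generalizes
```

Because the synthetic encoding is exactly linear, the decoder trained only on single-group movements predicts combined movements well, mirroring the paper's observation in cortex.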


2020 ◽  
Author(s):  
C. De Sousa Ferreira ◽  
C. Gaillard ◽  
F. Di Bello ◽  
S. Ben Hadj Hassen ◽  
S. Ben Hamed

Abstract
The ability to access brain information in real time is crucial both for a better understanding of cognitive functions and for the development of therapeutic applications based on brain-machine interfaces. Great success has been achieved in the field of neural motor prostheses. Progress is still needed in the real-time decoding of higher-order cognitive processes such as covert attention. Recently, we showed that we can track the location of the attentional spotlight using classification methods applied to prefrontal multi-unit activity (MUA) in the non-human primate (Astrand et al., 2016). Importantly, we demonstrated that the decoded (x,y) attentional spotlight parametrically correlates with the behavior of the monkeys, thus validating our decoding of attention. We also demonstrated that this spotlight is extremely dynamic (Gaillard et al., 2020). Here, in order to get closer to non-invasive decoding applications, we extend our previous work to local field potential (LFP) signals. Specifically, we achieve, for the first time, high decoding accuracy of the (x,y) location of the attentional spotlight from prefrontal LFP signals, to a degree comparable to that achieved from MUA signals, and we show that this LFP content is predictive of behavior. This LFP attention-related information is maximal in the gamma band. In addition, we introduce a novel two-step decoding procedure based on the labelling of maximally attention-informative trials during the decoding procedure. This procedure strongly improves the correlation between our real-time MUA- and LFP-based decoding and behavioral performance, thus further refining the functional relevance of this real-time decoding of the (x,y) locus of attention. This improvement is more marked for LFP signals than for MUA signals, suggesting that LFP signals may contain sources of task-related variability other than spatial attention information.
Overall, this study demonstrates that the attentional spotlight can be accessed from LFP frequency content, in real time, and can be used to drive high-information-content cognitive brain-machine interfaces for the development of new therapeutic strategies.

Highlights
- We use machine learning to decode the attention spotlight from prefrontal MUA & LFP.
- We achieve high decoding accuracy of the (x,y) spatial attention spotlight.
- (x,y) attention spotlight accuracy is maximal in the LFP gamma frequency range.
- MUA- and LFP-decoded attention position predicts behavioral performance.
- Selecting high-information signals improves decoding and behavioral correlates.
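The core decoding idea, classifying the attended (x,y) target from gamma-band LFP features and then keeping only the most informative trials, can be sketched on synthetic data. This is a loose analogue only: the nearest-centroid classifier, the margin-based trial selection, and all numbers below are illustrative stand-ins, not the authors' actual two-step procedure.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic setup: 4 possible (x, y) attention targets, 24 LFP channels
# whose gamma-band power is noisily tuned to target position.
targets = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
n_channels = 24
tuning = rng.normal(size=(2, n_channels))

def simulate(n):
    labels = rng.integers(0, 4, size=n)
    feats = targets[labels] @ tuning + 0.8 * rng.normal(size=(n, n_channels))
    return feats, labels

X_tr, y_tr = simulate(400)
X_te, y_te = simulate(100)

# Step 1: nearest-centroid decoder of the attended (x, y) target.
centroids = np.stack([X_tr[y_tr == k].mean(axis=0) for k in range(4)])

def decode(X):
    d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return d.argmin(axis=1)

# Step 2 (two-step idea, loosely): keep only "high-information" trials,
# i.e. those with a large distance margin between the best and
# second-best centroid.
d = ((X_te[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
sorted_d = np.sort(d, axis=1)
margin = sorted_d[:, 1] - sorted_d[:, 0]
keep = margin > np.median(margin)

acc_all = (decode(X_te) == y_te).mean()
acc_kept = (decode(X_te[keep]) == y_te[keep]).mean()
print(acc_all, acc_kept)
```

In this toy version, trials retained by the margin criterion are those the decoder is most confident about, mirroring how labelling maximally informative trials can sharpen the link between decoded position and behavior.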

