Neural Engineering
Recently Published Documents


TOTAL DOCUMENTS: 223 (five years: 37)
H-INDEX: 13 (five years: 2)

2021 · Vol 72 · pp. 29-38
Author(s): Hannah Wunderlich, Kristen L Kozielski

2021 · Vol 15
Author(s): Tianyu Liu, Zhixiong Xu, Lei Cao, Guowei Tan

Hybrid-modality brain-computer interfaces (BCIs), which combine motor imagery (MI) bio-signals and steady-state visual evoked potentials (SSVEPs), have attracted wide attention in the field of neural engineering. For real-life applications, the number of channels should be as small as possible. However, most recent work on channel selection focuses on either the performance of the classification task or the effectiveness of device control; few studies perform channel selection for the MI and SSVEP classification tasks simultaneously. In this paper, a multitasking-based multiobjective evolutionary algorithm (EMMOA) was proposed to select appropriate channels for both classification tasks at the same time. Moreover, a two-stage framework was introduced to balance the number of selected channels against the classification accuracy. The experimental results verified the feasibility of multiobjective optimization for channel selection in hybrid BCI tasks.
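For a concrete picture of the multiobjective formulation, the toy sketch below evolves binary channel masks against three objectives to be minimized: the number of selected channels and a stand-in error for each of the MI and SSVEP tasks. The error model, mutation operator, and population settings are illustrative assumptions, not the paper's EMMOA; in practice the task errors would come from cross-validated classifiers trained on the selected channels.

```python
import numpy as np

rng = np.random.default_rng(0)
N_CHANNELS = 32

# Assumed per-channel "relevance" weights for each task; real trade-offs would
# come from classifier performance on the selected channels, not fixed weights.
MI_W = rng.uniform(size=N_CHANNELS)
SSVEP_W = rng.uniform(size=N_CHANNELS)

def evaluate(mask):
    """Objectives to minimise: (channel count, MI error proxy, SSVEP error proxy)."""
    n_sel = int(mask.sum())
    if n_sel == 0:
        return (N_CHANNELS, 1.0, 1.0)                  # penalise the empty selection
    mi_err = 1.0 / (1.0 + MI_W[mask].sum())            # more relevant channels -> lower error
    ssvep_err = 1.0 / (1.0 + SSVEP_W[mask].sum())
    return (n_sel, mi_err, ssvep_err)

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all <=, at least one <)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Evolve a population of binary channel masks with single-bit-flip mutation,
# keeping only the non-dominated masks (the current Pareto front) each round.
pop = [rng.integers(0, 2, N_CHANNELS).astype(bool) for _ in range(40)]
for _ in range(50):
    children = []
    for mask in pop:
        child = mask.copy()
        child[rng.integers(N_CHANNELS)] ^= True        # flip one channel on/off
        children.append(child)
    pop = pop + children
    scores = [evaluate(m) for m in pop]
    pop = [pop[i] for i, s in enumerate(scores)
           if not any(dominates(t, s) for j, t in enumerate(scores) if j != i)]

for mask in pop[:8]:
    n, mi, ss = evaluate(mask)
    print(f"{n:2d} channels  MI err {mi:.3f}  SSVEP err {ss:.3f}")
```

The printed front shows the expected tension: masks with fewer channels sit at higher task errors, which is the trade-off a two-stage selection scheme has to resolve.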


IEEE Pulse · 2021 · Vol 12 (5) · pp. 19-23
Author(s): Junio Alves de Lima, Ashley Dalrymple, Maria Jantz, Chantel Charlebois, Cynthia Weber

2021 · Vol 33 (3) · pp. 827-852
Author(s): Omri Barak, Sandro Romani

Empirical estimates of the dimensionality of neural population activity are often much lower than the population size. Similar phenomena are also observed in trained and designed neural network models. These experimental and computational results suggest that mapping low-dimensional dynamics to high-dimensional neural space is a common feature of cortical computation. Despite the ubiquity of this observation, the constraints arising from such mapping are poorly understood. Here we consider a specific example of mapping low-dimensional dynamics to high-dimensional neural activity—the neural engineering framework. We analytically solve the framework for the classic ring model—a neural network encoding a static or dynamic angular variable. Our results provide a complete characterization of the success and failure modes for this model. Based on similarities between this and other frameworks, we speculate that these results could apply to more general scenarios.
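As a numerical companion to the analytical treatment described above, the minimal sketch below builds the NEF representation step for the ring: neurons with preferred-direction encoders on the unit circle are given LIF tuning curves, and linear decoders for x = (cos θ, sin θ) are obtained by regularized least squares. The neuron count, gains, biases, and regularization constant are illustrative assumptions rather than values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n_neurons, tau_ref, tau_rc = 200, 0.002, 0.02

# Encoders: preferred angles spread around the ring.
enc_angles = rng.uniform(0, 2 * np.pi, n_neurons)
E = np.column_stack([np.cos(enc_angles), np.sin(enc_angles)])    # (n, 2)

# Random gains and biases (intercepts), as in a standard NEF ensemble.
gain = rng.uniform(0.5, 2.0, n_neurons)
bias = rng.uniform(-1.0, 1.0, n_neurons)

def lif_rate(J):
    """Steady-state LIF firing rate for input current J (0 below threshold)."""
    out = np.zeros_like(J)
    m = J > 1.0
    out[m] = 1.0 / (tau_ref + tau_rc * np.log1p(1.0 / (J[m] - 1.0)))
    return out

# Sample the ring, compute tuning curves, and solve for linear decoders.
thetas = np.linspace(0, 2 * np.pi, 400, endpoint=False)
X = np.column_stack([np.cos(thetas), np.sin(thetas)])            # (samples, 2)
A = lif_rate(gain * (X @ E.T) + bias)                            # (samples, n)
reg = 0.1 * A.max()
D = np.linalg.solve(A.T @ A + reg**2 * len(thetas) * np.eye(n_neurons), A.T @ X)

X_hat = A @ D
print(f"RMS decoding error on the ring: {np.sqrt(np.mean((X_hat - X) ** 2)):.4f}")
```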


2021 · Vol 15
Author(s): Avi Hazan, Elishai Ezra Tsur

Brain-inspired hardware designs realize neural principles in electronics to provide high-performing, energy-efficient frameworks for artificial intelligence. The Neural Engineering Framework (NEF) provides a theoretical approach for representing high-dimensional mathematical constructs with spiking neurons in order to implement functional large-scale neural networks. Here, we present OZ, a programmable analog implementation of NEF-inspired spiking neurons. OZ neurons can be dynamically programmed to feature varying high-dimensional response curves with positive and negative encoders for a neuromorphic distributed representation of normalized input data. Our hardware design demonstrates full correspondence with NEF across firing rates, encoding vectors, and intercepts. OZ neurons can be independently configured in real time to allow efficient spanning of a representation space, thus using fewer neurons and therefore less power for neuromorphic data representation.
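The sketch below is a software analogue (not the OZ analog circuit) of the programmability described above: each NEF-style neuron is configured by an encoder sign and an intercept, and its gain and bias are solved so that it turns on at the intercept and reaches an assumed maximum rate at the edge of the represented range. Tiling intercepts with both positive and negative encoders is what lets a small set of neurons span the representation space; the time constants and maximum rate are assumptions.

```python
import numpy as np

tau_ref, tau_rc = 0.002, 0.02
max_rate = 100.0   # assumed firing rate at encoder * x = 1

def lif_rate(J):
    """Steady-state LIF firing rate for input current J."""
    out = np.zeros_like(J)
    m = J > 1.0
    out[m] = 1.0 / (tau_ref + tau_rc * np.log1p(1.0 / (J[m] - 1.0)))
    return out

def tuning_curve(x, encoder, intercept):
    """Rate of one neuron with the given encoder (+1/-1) and intercept.

    Gain and bias are solved so the neuron is silent at the intercept and
    fires at max_rate when encoder * x = 1 (the standard NEF construction).
    """
    J_max = 1.0 / (1.0 - np.exp((tau_ref - 1.0 / max_rate) / tau_rc))
    gain = (J_max - 1.0) / (1.0 - intercept)
    bias = 1.0 - gain * intercept
    return lif_rate(gain * encoder * x + bias)

x = np.linspace(-1, 1, 201)
# Four "programmed" neurons: two positive and two negative encoders,
# with intercepts tiling the input range.
for enc, icpt in [(+1, -0.5), (+1, 0.2), (-1, -0.3), (-1, 0.4)]:
    r = tuning_curve(x, enc, icpt)
    print(f"encoder {enc:+d}, intercept {icpt:+.1f}: peak rate {r.max():.1f} Hz")
```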


2021 · Vol 33 (1) · pp. 96-128
Author(s): Andreas Stöckel, Chris Eliasmith

Nonlinear interactions in the dendritic tree play a key role in neural computation. Nevertheless, modeling frameworks aimed at the construction of large-scale, functional spiking neural networks, such as the Neural Engineering Framework, tend to assume a linear superposition of postsynaptic currents. In this letter, we present a series of extensions to the Neural Engineering Framework that facilitate the construction of networks incorporating Dale's principle and nonlinear conductance-based synapses. We apply these extensions to a two-compartment LIF neuron that can be seen as a simple model of passive dendritic computation. We show that it is possible to incorporate neuron models with input-dependent nonlinearities into the Neural Engineering Framework without compromising high-level function and that nonlinear postsynaptic currents can be systematically exploited to compute a wide variety of multivariate, band-limited functions, including the Euclidean norm, controlled shunting, and nonnegative multiplication. By avoiding an additional source of spike noise, the function approximation accuracy of a single layer of two-compartment LIF neurons is on a par with or even surpasses that of two-layer spiking neural networks up to a certain target function bandwidth.
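To make the role of the conductance nonlinearity concrete, the toy calculation below evaluates the equilibrium current that a passive dendritic compartment delivers to a soma held at rest, given excitatory and inhibitory conductances g_E and g_I. The divisive form of this current is the kind of input-dependent nonlinearity referred to above; the reversal potentials, leak, and coupling values are assumptions for illustration, and the sketch does not reproduce the paper's weight-solving procedure.

```python
import numpy as np

# Normalised membrane parameters (assumed values, not taken from the paper).
E_E, E_I, E_L = 4.33, -0.33, 0.0   # excitatory, inhibitory, leak reversal potentials
g_L = 1.0                          # leak conductance of the dendritic compartment
g_C = 0.5                          # coupling conductance between dendrite and soma

def dendritic_current(g_E, g_I):
    """Equilibrium current into a soma clamped at rest (v_soma = E_L = 0).

    Current balance in the dendrite gives its equilibrium potential, and the
    coupling conductance then determines the current passed to the soma.
    """
    v_eq = (g_L * E_L + g_E * E_E + g_I * E_I) / (g_L + g_C + g_E + g_I)
    return g_C * v_eq

g = np.linspace(0.0, 2.0, 5)
for g_E in g:
    row = [dendritic_current(g_E, g_I) for g_I in g]
    print(" ".join(f"{j:6.3f}" for j in row))

# Linearity check: J(gE, gI) differs from J(gE, 0) + J(0, gI), so excitation and
# inhibition interact divisively rather than superposing as independent currents.
gE, gI = 1.0, 1.0
print(dendritic_current(gE, gI), dendritic_current(gE, 0.0) + dendritic_current(0.0, gI))
```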

