Any neuron can perform linearly non-separable computations

F1000Research ◽  
2021 ◽  
Vol 10 ◽  
pp. 539
Author(s):  
Romain D. Cazé

Multiple studies have shown how dendrites enable some neurons to perform linearly non-separable computations. These works focus on cells with an extended dendritic arbor, where voltage can vary independently across branches, turning dendritic branches into local non-linear subunits. However, these studies leave a large fraction of the nervous system unexplored. Many neurons, e.g. granule cells, have modest dendritic trees and are electrically compact, making it impossible to decompose them into multiple independent subunits. Here, we upgraded the integrate-and-fire neuron to account for saturating dendrites. This artificial neuron has a single membrane voltage and can be seen as a single layer. We present a class of linearly non-separable computations and show how our neuron can perform them. We thus demonstrate that even a single-layer neuron has more computational capacity with dendrites than without. Because every neuron comprises at least one layer and all dendrites saturate, we conclude that any neuron with dendrites can implement linearly non-separable computations.
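The mechanism described in the abstract can be illustrated with a minimal sketch (illustrative parameters, not the paper's exact model): a unit with two saturating dendrites computes the feature-binding function (x1 OR x2) AND (x3 OR x4), while a brute-force search over a small integer weight grid finds no single linear threshold unit that matches it.

```python
from itertools import product

def target(x):
    # feature-binding computation: (x1 OR x2) AND (x3 OR x4)
    return int((x[0] or x[1]) and (x[2] or x[3]))

def dendritic_neuron(x, sat=1.0, theta=2.0):
    # two dendrites sum their inputs but saturate at `sat`;
    # the soma fires if the summed dendritic output reaches `theta`
    d1 = min(x[0] + x[1], sat)
    d2 = min(x[2] + x[3], sat)
    return int(d1 + d2 >= theta)

inputs = list(product([0, 1], repeat=4))
assert all(dendritic_neuron(x) == target(x) for x in inputs)

def separable(f):
    # brute-force search for a linear threshold unit matching f
    # over integer weights in [-3, 3] and thresholds in [-6, 6]
    for w in product(range(-3, 4), repeat=4):
        for t in range(-6, 7):
            if all((sum(wi * xi for wi, xi in zip(w, x)) >= t) == bool(f(x))
                   for x in inputs):
                return True
    return False

print(separable(target))  # no linear threshold unit on this grid matches
```

The saturating unit succeeds because each dendrite reports only "at least one input active", which a purely additive point neuron cannot reproduce: any linear weighting that fires for (1,0,1,0) and (0,1,0,1) must also fire for (1,1,0,0) or (0,0,1,1).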



2021 ◽  
Vol 1 (132) ◽  
pp. 116-123
Author(s):  
Alexey Gnilenko

The hardware implementation of an artificial neuron is a key problem in the design of neuromorphic chips, a promising architectural solution for massively parallel computing. In this paper, an analog neuron circuit design is presented for use as a building element of spiking neural networks. The neuron is designed at the transistor level based on the leaky integrate-and-fire neuron model and is simulated with an EDA tool to verify the design. Signal waveforms at key nodes of the neuron are obtained and the neuron's functionality is demonstrated.
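The leaky integrate-and-fire dynamics that such a circuit realizes can be sketched in a few lines (a textbook forward-Euler simulation with illustrative parameters, not the paper's transistor-level values):

```python
# Leaky integrate-and-fire: tau * dV/dt = -(V - V_rest) + R * I
# All parameters are illustrative (arbitrary units), not circuit values.
tau, v_rest, v_thresh, v_reset, r_m = 10.0, 0.0, 1.0, 0.0, 1.0
dt, t_max = 0.1, 100.0

v = v_rest
spikes = []
i_in = 1.5  # constant suprathreshold input current
for k in range(int(t_max / dt)):
    # forward-Euler step: leak toward rest plus driven input
    v += dt / tau * (-(v - v_rest) + r_m * i_in)
    if v >= v_thresh:          # threshold crossing -> emit a spike
        spikes.append(k * dt)  # record spike time
        v = v_reset            # reset membrane voltage
print(len(spikes))
```

With a constant drive, the model fires regularly at a period near tau * ln(3) here, since V(t) = 1.5 * (1 - exp(-t/tau)) crosses threshold 1.0 when exp(-t/tau) = 1/3.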


1996 ◽  
Vol 8 (3) ◽  
pp. 611-624 ◽  
Author(s):  
Anthony M. Zador ◽  
Barak A. Pearlmutter

We compute the VC dimension of a leaky integrate-and-fire neuron model. The VC dimension quantifies the ability of a function class to partition an input pattern space, and can be considered a measure of computational capacity. In this case, the function class is the class of integrate-and-fire models generated by varying the integration time constant τ and the threshold θ, the input space they partition is the space of continuous-time signals, and the binary partition is specified by whether or not the model reaches threshold at some specified time. We show that the VC dimension diverges only logarithmically with the input signal bandwidth N. We also extend this approach to arbitrary passive dendritic trees. The main contributions of this work are (1) a novel treatment of the computational capacity of this class of dynamic system, and (2) a framework for analyzing the computational capabilities of the dynamic systems defined by networks of spiking neurons.
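The partition described here can be illustrated numerically (a toy sketch with assumed signals and parameter grids, not a VC-dimension computation): each (τ, θ) pair labels every input signal by whether the model reaches threshold within the simulated window, inducing one dichotomy of the sample.

```python
import numpy as np

rng = np.random.default_rng(0)
dt, t_max = 0.01, 1.0
t = np.arange(0, t_max, dt)
# a small sample of noisy input currents (assumed, for illustration)
signals = rng.standard_normal((4, t.size)) * 0.5 + 1.0

def reaches_threshold(i_t, tau, theta):
    # forward-Euler leaky integration; returns 1 if theta is crossed
    v = 0.0
    for i in i_t:
        v += dt / tau * (-v + i)
        if v >= theta:
            return 1
    return 0

# each (tau, theta) setting assigns a 0/1 label to every signal:
# the set below collects the distinct labelings (dichotomies) realized
dichotomies = {
    tuple(reaches_threshold(s, tau, theta) for s in signals)
    for tau in np.linspace(0.05, 1.0, 20)
    for theta in np.linspace(0.2, 1.5, 20)
}
print(len(dichotomies))
```

The VC dimension asks how large a sample can be so that *all* 2^n labelings are realized; the paper's result is that for this model class the achievable sample size grows only logarithmically with signal bandwidth.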


2021 ◽  
Author(s):  
Romain Daniel Caze

Multiple studies show how dendrites might extend some neurons' computational capacity, yet these studies leave a large fraction of the nervous system unexplored. Here we demonstrate how a modest dendritic tree can allow cerebellar granule cells to implement linearly non-separable computations. Granule cells' dendrites do not spike, and these cells' membrane voltage is isopotential. Combining Boolean algebra and biophysical modelling enables us to make an experimental prediction: granule cells can perform linearly non-separable computations. The standard neuron model used in artificial networks, the integrate-and-fire, cannot perform such computations. Confirming the prediction made in the present work would change how we understand the nervous system.


Fractals ◽  
1993 ◽  
Vol 01 (02) ◽  
pp. 171-178 ◽  
Author(s):  
KLAUS-D. KNIFFKI ◽  
MATTHIAS PAWLAK ◽  
CHRISTIANE VAHLE-HINZ

The morphology of Golgi-impregnated thalamic neurons was investigated quantitatively. In particular, it was tested whether the dendritic bifurcations can be described by the scaling law d0^n = d1^n + d2^n with a single value of the diameter exponent n, where d0 is the diameter of the parent branch and d1 and d2 are the diameters of the two daughter branches. Neurons from two functionally distinct regions were compared: the somatosensory ventrobasal complex (VB) and its nociceptive ventral periphery (VBvp). It is shown that for the neuronal trees studied in both regions, the scaling law was fulfilled. The diameter exponent n, however, was not constant: it increased from n = 1.76 for 1st-order branches to n = 3.92 for 7th-order branches of neurons from both regions. These findings suggest that more than one simple intrinsic rule is involved in the neuronal growth process, and it is assumed that the branching ratio d0/d1 is not required to be encoded genetically. Furthermore, the results support the concept that the dendritic trees have a statistically identical topology in neurons of VB and VBvp and thus may be regarded as integrative modules.
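Given measured branch diameters, the scaling law can be solved for n numerically. The sketch below (synthetic diameters, not the paper's data) uses bisection and recovers Rall's classic exponent n = 3/2 for two equal daughter branches with d1 = d2 = d0 * 2^(-2/3):

```python
def diameter_exponent(d0, d1, d2, lo=0.5, hi=8.0, tol=1e-9):
    """Solve d0**n == d1**n + d2**n for n by bisection.

    f(n) = (d1/d0)**n + (d2/d0)**n - 1 is strictly decreasing in n
    whenever d1, d2 < d0, so a sign change brackets the unique root.
    """
    f = lambda n: (d1 / d0) ** n + (d2 / d0) ** n - 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0:
            lo = mid   # root lies to the right
        else:
            hi = mid   # root lies to the left
    return 0.5 * (lo + hi)

# equal daughters d1 = d2 = d0 * 2**(-2/3) satisfy the law with n = 3/2
n = diameter_exponent(1.0, 2 ** (-2 / 3), 2 ** (-2 / 3))
print(round(n, 3))  # ≈ 1.5
```

Fitting this per branch order, as in the study, would reveal whether a single exponent holds across the tree or drifts with depth.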


2018 ◽  
Vol 8 (03) ◽  
pp. 835-841 ◽  
Author(s):  
Coline Adda ◽  
Laurent Cario ◽  
Julien Tranchant ◽  
Etienne Janod ◽  
Marie-Paule Besland ◽  
...  


