It may be time to improve the neuron of artificial neural network

Author(s):  
Gang Liu

<p>Artificial neural networks (ANNs) have won numerous contests in pattern recognition, machine learning, and artificial intelligence in recent years. The neuron of ANNs was designed from the stereotypical knowledge of biological neurons available 70 years ago: the artificial neuron is expressed as f(wx+b) or f(WX). This design does not consider dendrites' information-processing capacity. However, recent studies show that biological dendrites participate in the pre-calculation of input data; concretely, they play a role in extracting the interaction information among inputs (features). Therefore, it may be time to improve the neuron of ANNs. Following our previous studies (DD), this paper adds the dendrites' function to the artificial neuron. The dendrite function can be expressed as W<sup>i,i-1</sup>A<sup>i-1</sup> ○ A<sup>0|1|2|...|i-1</sup>. The generalized new neuron can be expressed as f(W(W<sup>i,i-1</sup>A<sup>i-1</sup> ○ A<sup>0|1|2|...|i-1</sup>)), and the simplified new neuron can be expressed as f(∑(WA ○ X)). After improving the neuron, there are many networks to try; this paper shows some basic architectures for future reference.</p> <p>Interesting points: (1) The computational complexity of dendrite modules (W<sup>i,i-1</sup>A<sup>i-1</sup> ○ A<sup>i-1</sup>) connected in series is far lower than that of Horner's method. Will this speed up the calculation of basic functions in computers? (2) The range of sight of animals has a gradient, but the convolution layer does not have this characteristic; this paper proposes receptive fields with a gradient. (3) Networks using Gang neurons can delete the fully-connected layers of traditional networks. In other words, the fully-connected layers' parameters are assigned to a single neuron, which reduces the parameters of a network for the same mapping capacity.</p><p>One important point: ResDD can replace the neurons of all current ANNs (ResDD modules + one linear module)!
ResDD has controllable precision for better generalization capability!</p><p>Gang neuron code is available at https://github.com/liugang1234567/Gang-neuron.</p>
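As a rough sketch of the simplified neuron f(∑(WA ○ X)), the following NumPy code iterates dendrite layers A^i = (W^{i,i-1} A^{i-1}) ○ X before a conventional weighted sum and activation. The function name, shapes, and two-layer setup are illustrative assumptions, not the repository's actual API:

```python
import numpy as np

def dd_neuron(x, dendrite_weights, w_out, f=np.tanh):
    """Simplified dendritic neuron f(sum(W A o X)).

    x: input feature vector, shape (n,)
    dendrite_weights: list of (n, n) matrices, one per dendrite layer
    w_out: output weights, shape (n,)
    Each dendrite layer forms A^i = (W^{i,i-1} @ A^{i-1}) * x,
    where '*' is the Hadamard product 'o' from the abstract.
    """
    a = x
    for w in dendrite_weights:
        a = (w @ a) * x  # multiplicative interaction term among inputs
    return f(w_out @ a)

rng = np.random.default_rng(0)
x = rng.standard_normal(4)
ws = [rng.standard_normal((4, 4)) for _ in range(2)]
w_out = rng.standard_normal(4)
y = dd_neuron(x, ws, w_out)
```

Note that with i dendrite layers the pre-activation is a polynomial of degree i+1 in the inputs, which is what connects these modules to polynomial evaluation and the Horner's-method comparison in point (1) above.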



2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Jermyn Z. See ◽  
Natsumi Y. Homma ◽  
Craig A. Atencio ◽  
Vikaas S. Sohal ◽  
Christoph E. Schreiner

Neuronal activity in auditory cortex is often highly synchronous between neighboring neurons. Such coordinated activity is thought to be crucial for information processing. We determined the functional properties of coordinated neuronal ensembles (cNEs) within primary auditory cortical (AI) columns relative to the contributing neurons. Nearly half of AI cNEs showed robust spectro-temporal receptive fields, whereas the remaining cNEs showed little or no acoustic feature selectivity. cNEs can therefore either capture specific, time-locked information about spectro-temporal stimulus features or reflect stimulus-unspecific, less time-specific aspects of processing. By contrast, we show that individual neurons can represent both of these aspects through membership in multiple cNEs with either high or absent feature selectivity. These associations produce functionally heterogeneous spikes identifiable by their instantaneous association with different cNEs. This demonstrates that single-neuron spike trains can sequentially convey multiple aspects that contribute to cortical processing, including stimulus-specific and unspecific information.
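The spectro-temporal receptive fields referred to above are commonly estimated by spike-triggered averaging. The following is a minimal sketch on synthetic data; all names, shapes, and the toy stimulus are illustrative assumptions, not the study's actual pipeline:

```python
import numpy as np

def spike_triggered_average(stimulus, spikes, n_lags):
    """Estimate an STRF as the spike-weighted average stimulus window
    preceding each spike.

    stimulus: (n_freq, n_time) spectrogram
    spikes:   (n_time,) spike counts aligned to stimulus frames
    n_lags:   number of time bins of stimulus history per window
    """
    n_freq, n_time = stimulus.shape
    acc = np.zeros((n_freq, n_lags))
    total = 0.0
    for t in range(n_lags, n_time):
        if spikes[t] > 0:
            acc += spikes[t] * stimulus[:, t - n_lags:t]
            total += spikes[t]
    return acc / max(total, 1.0)

# Toy data: spikes fire when frequency channel 4 was strong one bin earlier,
# so the recovered STRF should peak at (channel 4, last lag).
rng = np.random.default_rng(0)
stim = rng.random((16, 200))
spk = np.zeros(200)
spk[1:] = (stim[4, :-1] > 0.8).astype(float)
strf = spike_triggered_average(stim, spk, n_lags=5)
```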


2020 ◽  
Vol 9 (1) ◽  
pp. 7-10
Author(s):  
Hendry Fonda

ABSTRACT Riau batik has been known since the 18th century and was used by royal families. Riau batik is made using a stamp dipped in dye and then printed on fabric, usually silk. As it developed, and compared with Javanese batik, Riau batik has been accepted by the public only slowly. Convolutional Neural Networks (CNNs) combine artificial neural networks with deep-learning methods. A CNN consists of one or more convolutional layers, often each followed by a subsampling layer, and one or more fully connected layers as in a standard neural network. In this process, the CNN is trained and tested on Riau batik so that a collection of batik models, classified by the characteristic features of Riau batik, can be used to decide whether an image is Riau batik or not. Classification using the CNN distinguishes Riau batik from non-Riau batik with an accuracy of 65%. The accuracy of 65% is largely because many motifs are shared between Riau batik and other batik styles, with the difference lying mainly in the absorbed colors of Riau batik. Keywords: Batik; Batik Riau; CNN; Image; Deep Learning
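As a rough illustration of the convolution, subsampling, and fully-connected pipeline described above, here is a minimal NumPy sketch of a forward pass. The image size, single kernel, and sigmoid readout are illustrative assumptions, not the classifier actually used in the study:

```python
import numpy as np

def conv2d(img, kernel):
    """Valid 2-D cross-correlation of one image with one kernel."""
    kh, kw = kernel.shape
    h, w = img.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(x, s=2):
    """Non-overlapping s x s max-pooling (the subsampling layer)."""
    h, w = (x.shape[0] // s) * s, (x.shape[1] // s) * s
    return x[:h, :w].reshape(h // s, s, w // s, s).max(axis=(1, 3))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Forward pass: conv -> ReLU -> pool -> fully connected -> binary score
img = np.random.default_rng(1).random((8, 8))
kernel = np.ones((3, 3)) / 9.0
feat = max_pool(np.maximum(conv2d(img, kernel), 0.0))
w_fc = np.random.default_rng(2).standard_normal(feat.size)
score = sigmoid(w_fc @ feat.ravel())  # hypothetical 'is Riau batik' probability
```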


2003 ◽  
Vol 13 (02) ◽  
pp. 87-91
Author(s):  
Allan Kardec Barros ◽  
Andrzej Cichocki ◽  
Noboru Ohnishi

Redundancy reduction as a form of neural coding has been a topic of broad research interest since the early sixties. A number of strategies have been proposed, but the one attracting most attention recently assumes that this coding is carried out so that the output signals are mutually independent. In this work we go one step further and suggest a strategy that also deals with non-orthogonal (i.e., "dependent") signals. Moreover, instead of working with the usual squared error, we design a neuron in which the nonlinearity operates on the error. It is computationally more economical and, importantly, the permutation/scaling problem [10] is avoided. The framework is given a biological background, as we advocate throughout the manuscript that the algorithm fits well with the single-neuron and redundancy-reduction doctrine [5]. Moreover, we show that wavelet-like receptive fields emerge from natural images processed by this algorithm.
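To illustrate the idea of letting the nonlinearity operate on the error rather than squaring it, here is a loose single-neuron sketch in NumPy. The anti-Hebbian-style update, normalization step, and all names are assumptions for illustration only, not the authors' exact algorithm:

```python
import numpy as np

def train_neuron(X, eta=0.01, n_epochs=50, g=np.tanh, rng=None):
    """Single linear neuron y = w.x, updated via g(error) where the
    'error' is the residual after removing the neuron's reconstruction.
    Illustrative sketch; NOT the exact rule from the paper.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    n = X.shape[1]
    w = rng.standard_normal(n)
    w /= np.linalg.norm(w)
    for _ in range(n_epochs):
        for x in X:
            y = w @ x
            e = x - y * w           # residual ('error') vector
            w += eta * g(e) * y     # nonlinearity acts on the error itself
            w /= np.linalg.norm(w)  # keep the weight vector bounded
    return w

rng = np.random.default_rng(3)
# Data with one dominant direction of variance
X = rng.standard_normal((200, 3)) @ np.diag([3.0, 1.0, 0.5])
w = train_neuron(X, rng=rng)
```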

