It may be time to improve the neuron of artificial neural network

Author(s):  
Gang Liu

Artificial neural networks (ANNs) have won numerous contests in pattern recognition, machine learning, and artificial intelligence in recent years. The neuron of ANNs was designed from the stereotypical knowledge of biological neurons available 70 years ago. The artificial neuron is expressed as f(wx+b) or f(WX). This design does not consider the dendrites' information-processing capacity. However, some recent studies show that biological dendrites participate in the pre-calculation of input data. Concretely, biological dendrites play a role in extracting the interaction information among inputs (features). Therefore, it may be time to improve the neuron of ANNs. Following our previous studies (DD), this paper adds the dendrites' function to the artificial neuron. The dendrite function can be expressed as W^{i,i-1} A^{i-1} ∘ A^{0|1|2|...|i-1}. The generalized new neuron can be expressed as f(W(W^{i,i-1} A^{i-1} ∘ A^{0|1|2|...|i-1})). The simplified new neuron can be expressed as f(∑(WA ∘ X)). After improving the neuron, there are many networks to try; this paper shows some basic architectures for future reference.

Interesting things: (1) The computational complexity of dendrite modules (W^{i,i-1} A^{i-1} ∘ A^{i-1}) connected in series is far lower than that of Horner's method. Will this speed up the calculation of basic functions in computers? (2) The range of sight of animals has a gradient, but the convolution layer does not have this characteristic; this paper proposes receptive fields with a gradient. (3) Networks using Gang neurons can delete traditional networks' fully-connected layers. In other words, the fully-connected layers' parameters are assigned to a single neuron, which reduces the parameters of a network for the same mapping capacity.
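
For readers who want to experiment, here is a minimal numpy sketch of one reading of the simplified neuron f(∑(WA ∘ X)), taking A^0 = X, reading ∘ as the Hadamard product, and letting the soma perform a weighted sum before the activation. All names are hypothetical; this is not code from the authors' repository.

```python
import numpy as np

def gang_neuron(x, Ws, w_out, f=np.tanh):
    """Sketch of the simplified Gang neuron f(sum(WA ∘ X)).
    Each dendrite module updates A <- (W @ A) * x (Hadamard with the
    input), so module i mixes degree-(i+1) interaction terms of the
    features before the soma applies a weighted sum and activation."""
    a = x
    for W in Ws:                 # serial dendrite modules
        a = (W @ a) * x          # W^{i,i-1} A^{i-1} ∘ X
    return f(w_out @ a)          # soma: weighted sum + activation

rng = np.random.default_rng(0)
x = rng.normal(size=5)
Ws = [rng.normal(size=(5, 5)) * 0.1 for _ in range(3)]
w_out = rng.normal(size=5)
print(gang_neuron(x, Ws, w_out))
```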

2020 ◽  
Author(s):  
Gang Liu

Artificial neural networks (ANNs) have won numerous contests in pattern recognition, machine learning, and artificial intelligence in recent years. The neuron of ANNs was designed from the stereotypical knowledge of biological neurons available 70 years ago. The artificial neuron is expressed as f(wx+b) or f(WX). This design does not consider the dendrites' information-processing capacity. However, some recent studies show that biological dendrites participate in the pre-calculation of input data. Concretely, biological dendrites play a role in extracting the interaction information among inputs (features). Therefore, it may be time to improve the neuron of ANNs. Following our previous studies (DD), this paper adds the dendrites' function to the artificial neuron. The dendrite function can be expressed as W^{i,i-1} A^{i-1} ∘ A^{0|1|2|...|i-1}. The generalized new neuron can be expressed as f(W(W^{i,i-1} A^{i-1} ∘ A^{0|1|2|...|i-1})). The simplified new neuron can be expressed as f(∑(WA ∘ X)). After improving the neuron, there are many networks to try; this paper shows some basic architectures for future reference.

Interesting things: (1) The computational complexity of dendrite modules (W^{i,i-1} A^{i-1} ∘ A^{i-1}) connected in series is far lower than that of Horner's method. Will this speed up the calculation of basic functions in computers? (2) The range of sight of animals has a gradient, but the convolution layer does not have this characteristic; this paper proposes receptive fields with a gradient. (3) Networks using Gang neurons can delete traditional networks' fully-connected layers. In other words, the fully-connected layers' parameters are assigned to a single neuron, which reduces the parameters of a network for the same mapping capacity.

One important thing: ResDD can replace the neurons of all current ANNs (ResDD modules + one linear module). ResDD has controllable precision for better generalization capability.

Gang neuron code is available at https://github.com/liugang1234567/Gang-neuron.
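
The abstract does not specify ResDD's structure beyond "ResDD modules + one linear module". A plausible residual reading, offered purely as an assumption, is sketched below; the residual update form and all names are hypothetical.

```python
import numpy as np

def resdd_neuron(x, Ws, w_out):
    """Hypothetical ResDD-style neuron: dendrite modules with residual
    (skip) connections, followed by one linear module. The residual
    form A <- A + (W @ A) * x is an assumption; the abstract only
    states 'ResDD modules + one linear module'."""
    a = x
    for W in Ws:
        a = a + (W @ a) * x      # residual dendrite update (assumed form)
    return w_out @ a             # final linear module

rng = np.random.default_rng(1)
x = rng.normal(size=4)
Ws = [rng.normal(size=(4, 4)) * 0.1 for _ in range(2)]
print(resdd_neuron(x, Ws, rng.normal(size=4)))
```

Under this reading, "controllable precision" could correspond to choosing the number of residual dendrite modules, since each added module raises the order of feature interactions the unit can express.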


2020 ◽  
Author(s):  
Gang Liu

In recent years, artificial neural networks (ANNs) have won numerous contests in pattern recognition, machine learning, and artificial intelligence. The neuron in ANNs was designed from knowledge about the brain's biological neurons available 70 years ago. The neuron in ANNs is expressed as f(wx+b) or f(WX). This architecture does not consider the information-processing capabilities of dendrites. However, recent studies show that dendrites participate in the pre-calculation of input data in the brain. Concretely, biological dendrites play a role in pre-processing the interaction information of input data. Therefore, it may be time to perfect the neuron of ANNs. Following our previous studies (Gang transform), this paper adds a dendrite processing section to the neurons of ANNs. The dendrite processing section can be expressed as W^{i,i-1} A^{i-1} ∘ A^{0|1|2|...|i-1}. The generalized new neuron can be expressed as f(W(W^{i,i-1} A^{i-1} ∘ A^{0|1|2|...|i-1})). The simplified new neuron can be expressed as f(∑(WA ∘ X)). After perfecting the neuron, there are many networks to try; this paper shows some basic architectures for future reference.

Interesting things: (1) The computational complexity of dendrite modules (W^{i,i-1} A^{i-1} ∘ A^{i-1}) connected in series is far lower than that of Horner's method. Will this speed up the calculation of basic functions in computers? (2) The range of sight of animals has a gradient, but the convolution layer does not have this characteristic; this paper proposes receptive fields with a gradient. (3) Networks using Gang neurons can delete the fully-connected layers of traditional networks. In other words, the fully-connected layers' parameters are assigned to a single neuron, which reduces the parameters of a network for the same mapping capacity.
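
Claim (1) can be illustrated with a scalar toy: chaining dendrite modules of the form a_i = (w_i · a_{i-1}) · a_{i-1} squares the polynomial degree at every step, so k modules reach degree 2^k with only k multiplicative steps, whereas Horner's method needs about 2^k multiplications for a polynomial of that degree. Note this demonstrates only the degree growth; the polynomials reachable this way form a restricted family, not arbitrary ones.

```python
# Scalar illustration of claim (1): each chained dendrite module
# a_i = (w_i * a_{i-1}) * a_{i-1} doubles the degree in x, so k modules
# reach degree 2^k with k multiplicative steps, while Horner's method
# needs roughly 2^k multiplications for a degree-2^k polynomial.
def chained_dendrites(x, ws):
    a = x                      # degree 1
    for w in ws:
        a = (w * a) * a        # degree doubles at every module
    return a

# 3 modules with unit weights compute x**8: 2.0**8 = 256.0
print(chained_dendrites(2.0, [1.0, 1.0, 1.0]))
```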


2020 ◽  
Author(s):  
Gang Liu

In recent years, artificial neural networks (ANNs) have won numerous contests in pattern recognition, machine learning, and artificial intelligence. The basic unit of an ANN is designed to mimic neurons in the brain and is expressed as f(wx+b) or f(wx). This structure does not consider the information-processing capabilities of dendrites. However, recent studies have shown that dendrites participate in pre-calculation in the brain. Concretely, biological dendrites play a role in pre-processing the interaction information of input data. Therefore, it is time to perfect the neuron of the neural network. Building on our previous studies (CR-PNN or Gang transform), this paper adds a dendrite processing section and presents a novel artificial neuron. The dendrite processing section can be expressed as WA ∘ X. Because the basic unit of ANNs, the neuron, has been perfected, there are many networks to try; this article gives basic architectures for reference in future research.
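
As a hedged illustration of "extracting interaction information", the toy below trains a single dendrite section (W x) ∘ x with plain gradient descent to fit the pure interaction target y = x1·x2, something a purely linear unit wx + b cannot fit. The training setup and all names are assumptions for illustration, not the paper's experiments.

```python
import numpy as np

# Fit y = x1 * x2 with one dendrite section A = (X @ W.T) * X followed
# by a linear soma. Gradient descent on mean squared error should drive
# the loss toward zero, since the target is a representable quadratic.
rng = np.random.default_rng(3)
X = rng.normal(size=(256, 2))
y = X[:, 0] * X[:, 1]

W = rng.normal(size=(2, 2)) * 0.1
w_out = rng.normal(size=2) * 0.1
lr = 0.05
for _ in range(2000):
    A = (X @ W.T) * X                          # dendrite section: (W x) ∘ x
    err = A @ w_out - y                        # residuals, shape (256,)
    grad_w_out = A.T @ err / len(X)
    grad_W = ((err[:, None] * w_out) * X).T @ X / len(X)
    w_out -= lr * grad_w_out
    W -= lr * grad_W
print("MSE:", float(np.mean((((X @ W.T) * X) @ w_out - y) ** 2)))
```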


Author(s):  
Rafael Stahl ◽  
Alexander Hoffman ◽  
Daniel Mueller-Gritschneder ◽  
Andreas Gerstlauer ◽  
Ulf Schlichtmann

Performing inference of Convolutional Neural Networks (CNNs) on Internet of Things (IoT) edge devices ensures both privacy of input data and possible run time reductions when compared to a cloud solution. As most edge devices are memory- and compute-constrained, they cannot store and execute complex CNNs. Partitioning and distributing layer information across multiple edge devices to reduce the amount of computation and data on each device presents a solution to this problem. In this article, we propose DeeperThings, an approach that supports a full distribution of CNN inference tasks by partitioning fully-connected as well as both feature- and weight-intensive convolutional layers. Additionally, we jointly optimize memory, computation and communication demands. This is achieved using techniques to combine both feature and weight partitioning with a communication-aware layer fusion method, enabling holistic optimization across layers. For a given number of edge devices, the schemes are applied jointly using Integer Linear Programming (ILP) formulations to minimize data exchanged between devices, to optimize run times and to find the entire model’s minimal memory footprint. Experimental results from a real-world hardware setup running four different CNN models confirm that the scheme is able to evenly balance the memory footprint between devices. For six devices on 100 Mbit/s connections the integration of layer fusion additionally leads to a reduction of communication demands by up to 28.8%. This results in run time speed-up of the inference task by up to 1.52x compared to layer partitioning without fusing.
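
As a rough, hypothetical sketch (not the DeeperThings implementation), feature partitioning can be pictured as each device computing a horizontal stripe of a convolutional layer's output, for which it only needs its input stripe plus a small halo of overlapping rows; all names here are made up for illustration.

```python
import numpy as np

def conv2d_valid(x, k):
    """Naive single-channel 'valid' 2-D convolution (cross-correlation)."""
    kh, kw = k.shape
    out = np.zeros((x.shape[0] - kh + 1, x.shape[1] - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def partitioned_conv(x, k, n_devices):
    """Split output rows among devices; each device holds only its
    input stripe plus (kh - 1) halo rows, mimicking feature partitioning."""
    kh = k.shape[0]
    out_h = x.shape[0] - kh + 1
    bounds = np.linspace(0, out_h, n_devices + 1, dtype=int)
    parts = []
    for d in range(n_devices):
        lo, hi = bounds[d], bounds[d + 1]
        x_slice = x[lo:hi + kh - 1]      # stripe + halo this device needs
        parts.append(conv2d_valid(x_slice, k))
    return np.vstack(parts)

x = np.random.rand(16, 16)
k = np.random.rand(3, 3)
# The distributed result matches the single-device computation exactly.
assert np.allclose(conv2d_valid(x, k), partitioned_conv(x, k, 4))
```

The halo rows are precisely the data that must be exchanged between neighboring devices, which is why the article's layer fusion and ILP formulations aim to minimize such communication.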


2020 ◽  
Author(s):  
Gang Liu

In recent years, artificial neural networks (ANNs) have won numerous contests in pattern recognition, machine learning, and artificial intelligence. The basic unit of an ANN is designed to mimic neurons in the brain and is expressed as f(wx+b) or f(wx). This structure does not consider the information-processing capabilities of dendrites. However, recent studies have shown that dendrites participate in pre-calculation in the brain. Concretely, biological dendrites play a role in pre-processing the interaction information of input data. Therefore, it is time to perfect the neuron of the neural network. Building on our previous studies (CR-PNN or Gang transform), this paper adds a dendrite processing section and presents a novel artificial neuron. The dendrite processing section can be expressed as W^{i,i-1} A^{i-1} ∘ A^{0|1|2|...|i-1}. Because the basic unit of ANNs, the neuron, has been perfected, there are many networks to try; this article gives basic architectures for reference in future research.
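
The notation A^{0|1|2|...|i-1} suggests that each dendrite module may take its Hadamard partner from any earlier layer, with A^0 = X. A minimal numpy sketch of that reading follows; the partner-selection mechanism and all names are hypothetical.

```python
import numpy as np

def dd_layers(x, Ws, partners):
    """Sketch of the generalized dendrite function
    A^i = (W^{i,i-1} @ A^{i-1}) ∘ A^{partners[i-1]},
    where partners[i-1] picks any earlier layer 0..i-1 (A^0 = X).
    This is one hypothetical reading of A^{0|1|2|...|i-1}."""
    As = [x]                                   # A^0 = X
    for i, W in enumerate(Ws, start=1):
        As.append((W @ As[i - 1]) * As[partners[i - 1]])
    return As[-1]

rng = np.random.default_rng(2)
x = rng.normal(size=3)
Ws = [rng.normal(size=(3, 3)) for _ in range(3)]
print(dd_layers(x, Ws, partners=[0, 1, 0]))    # each module picks a partner
```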

