NAND Flash Based Novel Synaptic Architecture for Highly Robust and High-Density Quantized Neural Networks With Binary Neuron Activation of (1, 0)

IEEE Access ◽  
2020 ◽  
Vol 8 ◽  
pp. 114330-114339
Author(s):  
Sung-Tae Lee ◽  
Dongseok Kwon ◽  
Hyeongsu Kim ◽  
Honam Yoo ◽  
Jong-Ho Lee
2019 ◽  
Vol 7 ◽  
pp. 1085-1093 ◽  
Author(s):  
Sung-Tae Lee ◽  
Suhwan Lim ◽  
Nag Yong Choi ◽  
Jong-Ho Bae ◽  
Dongseok Kwon ◽  
...  

2020 ◽  
Vol 20 (7) ◽  
pp. 4138-4142
Author(s):  
Sung-Tae Lee ◽  
Suhwan Lim ◽  
Nagyong Choi ◽  
Jong-Ho Bae ◽  
Dongseok Kwon ◽  
...  

NAND flash memory, a mature technology, offers high density and large storage capacity per chip because cells are connected in series between a bit line and a source line. A NAND flash cell can therefore serve as a synaptic device, which is very useful for a high-density synaptic array. In this paper, the effect of the word-line bias on the linearity of the multi-level conductance steps of the NAND flash cell is investigated. A 3-layer perceptron network (784×200×10) is trained on the MNIST data set with a weight-update method suited to NAND flash memory. The linearity of the multi-level conductance steps improves as the word-line bias increases from Vth − 0.5 V to Vth + 1 V at a fixed bit-line bias of 0.2 V. As a result, the learning accuracy also improves as the word-line bias increases from Vth − 0.5 V to Vth + 1 V.
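For illustration, the sketch below trains a 784×200×10 perceptron with binary (1, 0) hidden activations and weights snapped to a uniform grid of conductance levels, loosely mimicking multi-level NAND cells. It is a minimal sketch under stated assumptions: the paper's actual pulse scheme and weight-update method are not reproduced, the straight-through estimator, level count, learning rate, and the random stand-in data are all placeholder choices.

```python
# Hedged sketch: 784x200x10 perceptron, binary (1, 0) hidden activations,
# straight-through estimator, weights quantized to N_LEVELS conductance steps.
# N_LEVELS, LR, and the random stand-in data are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
N_IN, N_HID, N_OUT = 784, 200, 10
N_LEVELS = 32          # assumed number of conductance steps per synapse
LR = 0.05

W1 = rng.normal(0, 0.1, (N_IN, N_HID))
W2 = rng.normal(0, 0.1, (N_HID, N_OUT))

def quantize(w, levels=N_LEVELS, w_max=1.0):
    """Snap weights to a uniform grid of conductance levels in [-w_max, w_max]."""
    step = 2 * w_max / (levels - 1)
    return np.clip(np.round(w / step) * step, -w_max, w_max)

def softmax(z):
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

# Random data stand in for MNIST (28x28 images, 10 classes) in this sketch.
x = rng.random((64, N_IN))
y = np.eye(N_OUT)[rng.integers(0, N_OUT, 64)]

for epoch in range(10):
    # Forward pass: hidden neurons fire 1 or 0 (hard threshold).
    pre_h = x @ W1
    h = (pre_h > 0).astype(float)          # binary activation of (1, 0)
    p = softmax(h @ W2)

    # Backward pass with a straight-through estimator for the hard threshold.
    d_out = (p - y) / len(x)
    d_hid = d_out @ W2.T                   # gradient passed "straight through"

    W2 = quantize(W2 - LR * (h.T @ d_out))
    W1 = quantize(W1 - LR * (x.T @ d_hid))

print("training loss:", -np.mean(np.sum(y * np.log(p + 1e-9), axis=1)))
```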


Author(s):  
Ihor Konovalenko ◽  
Pavlo Maruschak ◽  
Vitaly Brevus

Abstract Steel defect diagnostics is an important industrial task, as it is tied to product quality and production efficiency. The aim of this paper is to evaluate the application of residual neural networks to the recognition of three classes of industrial steel defects. Models based on deep residual neural networks were developed and investigated for the recognition and classification of surface defects of rolled steel. The influence of various loss functions, optimizers and hyperparameters on the results was investigated, and optimal model parameters were selected. Based on an ensemble of two deep residual neural networks, ResNet50 and ResNet152, a classifier was constructed to detect defects of three classes on flat metal surfaces. The proposed technique classifies images with high accuracy: the average binary accuracy on the test data is 96.7% for all images (including defect-free ones). The fields of neuron activation in the convolutional layers of the model were investigated, and the feature maps formed in the process were found to reflect the position, size and shape of the objects of interest very well. The proposed ensemble model has proven to be robust and able to accurately recognize steel surface defects. Cases of erroneous recognition by the classifier are also investigated; errors most often occur in ambiguous situations where surface artifacts of different types look similar.
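As a rough illustration of the ensemble idea, the sketch below averages the softmax outputs of a ResNet50 and a ResNet152 head for three defect classes. It is an assumption-laden sketch: the paper's training setup (loss, optimizer, augmentation) and its exact ensembling rule are not specified in the abstract, so simple soft voting and untrained torchvision backbones are used here purely for illustration.

```python
# Hedged sketch: ResNet50 + ResNet152 ensemble for three defect classes,
# combined by averaging softmax probabilities (simple soft voting).
import torch
import torch.nn.functional as F
from torchvision import models

NUM_CLASSES = 3  # three classes of surface defects

# Separate backbones; classification heads sized for three class scores.
resnet50 = models.resnet50(weights=None, num_classes=NUM_CLASSES)
resnet152 = models.resnet152(weights=None, num_classes=NUM_CLASSES)

@torch.no_grad()
def ensemble_predict(images):
    """Average the softmax probabilities of the two ensemble members."""
    resnet50.eval()
    resnet152.eval()
    p50 = F.softmax(resnet50(images), dim=1)
    p152 = F.softmax(resnet152(images), dim=1)
    return (p50 + p152) / 2            # shape: (batch, NUM_CLASSES)

# Example: a batch of four 224x224 RGB surface images.
dummy = torch.randn(4, 3, 224, 224)
print(ensemble_predict(dummy).argmax(dim=1))
```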


2007 ◽  
Vol 17 (03) ◽  
pp. 207-218 ◽  
Author(s):  
BAOYONG ZHANG ◽  
SHENGYUAN XU ◽  
YONGMIN LI

This paper considers the problem of robust exponential stability for a class of recurrent neural networks with time-varying delays and parameter uncertainties. The time delays are not necessarily differentiable, and the uncertainties are assumed to be time-varying but norm-bounded. Sufficient conditions, which guarantee that the concerned uncertain delayed neural network is robustly globally exponentially stable for all admissible parameter uncertainties, are obtained under a weak assumption on the neuron activation functions. These conditions are dependent on the size of the time delay and are expressed in terms of linear matrix inequalities. Numerical examples are provided to demonstrate the effectiveness and reduced conservatism of the proposed stability results.
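For concreteness, a common form for this class of systems is shown below. This is an assumption: the abstract does not state the exact model, so the standard uncertain delayed Hopfield-type formulation is used, with nominal matrices A, W_0, W_1, norm-bounded time-varying perturbations, a bounded delay that need not be differentiable, and activation f subject to a weak (e.g. Lipschitz-type) condition.

```latex
% Assumed standard form (not quoted from the paper):
\begin{aligned}
\dot{x}(t) &= -\bigl(A+\Delta A(t)\bigr)x(t)
              + \bigl(W_0+\Delta W_0(t)\bigr)f\bigl(x(t)\bigr)
              + \bigl(W_1+\Delta W_1(t)\bigr)f\bigl(x(t-\tau(t))\bigr),
              \qquad 0 \le \tau(t) \le \bar{\tau}, \\
% Robust global exponential stability then requires, for all admissible
% \Delta A, \Delta W_0, \Delta W_1, constants \alpha>0 and \kappa\ge 1 with
\|x(t)\| &\le \kappa\, e^{-\alpha t} \sup_{-\bar{\tau}\le s\le 0}\|x(s)\| .
\end{aligned}
```

The delay-dependent LMI conditions referred to in the abstract are sufficient conditions on these matrices guaranteeing the decay bound in the second line.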


2004 ◽  
Vol 14 (05) ◽  
pp. 1807-1811 ◽  
Author(s):  
M. DI MARCO ◽  
M. FORTI ◽  
P. NISTRI ◽  
A. TESI

The paper addresses the robustness of complete stability with respect to perturbations of the interconnections of nominal symmetric neural networks. The influence of the maximum neuron activation gain on complete stability robustness is discussed for a class of third-order neural networks. It is shown that high values of the gain lead to an extremely small complete stability margin for all nominal symmetric neural networks, which leads to the conclusion that complete stability robustness cannot, in general, be guaranteed.
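As an illustration of the setting, a hedged model is sketched below; it is not taken from the paper, but shows one standard third-order symmetric network of Hopfield type in which β plays the role of the maximum neuron activation gain and the interconnection matrix is perturbed about a nominal symmetric one.

```latex
% Assumed illustrative model (not the paper's exact equations):
\dot{x}_i(t) = -x_i(t) + \sum_{j=1}^{3} \bigl(T_{ij} + \Delta T_{ij}\bigr)\,
               g\!\bigl(\beta x_j(t)\bigr),
\qquad T = T^{\top},\quad g \text{ sigmoidal},\quad \beta > 0 .
```

In this reading, the complete stability margin is the largest size of the perturbation ΔT for which every trajectory still converges to an equilibrium, and the result states that this margin becomes extremely small as β grows.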

