Memristor Crossbars
Recently Published Documents


TOTAL DOCUMENTS: 39 (five years: 21)

H-INDEX: 7 (five years: 2)

2022, pp. 111706
Author(s): Abdelouadoud El Mesoudy, Gwénaëlle Lamri, Raphaël Dawant, Javier Arias-Zapata, Pierre Gliech, ...

2021
Author(s): Thomas Dalgaty, Filippo Moro, Alessio De Pra, Giacomo Indiveri, Elisa Vianello, ...

Abstract Thanks to their non-volatile and multi-bit properties, memristors have been extensively used as synaptic weight elements in neuromorphic architectures. However, their use to define and re-program the network connectivity has been overlooked. Here, we propose, implement, and experimentally demonstrate Mosaic, a neuromorphic architecture based on a systolic array of memristor crossbars. For the first time, we use distributed non-volatile memristors not only for computation but also for routing, i.e., to define the network connectivity. Mosaic is particularly well suited to implementing re-configurable small-world graphical models, with dense local and sparse global connectivity, a pattern found extensively in the brain. We show mathematically that, as networks scale up, Mosaic requires less memory than conventional memristor approaches. We map a spiking recurrent neural network onto Mosaic to solve an electrocardiogram (ECG) anomaly-detection task. While its performance is equivalent to or better than that of software models, Mosaic's advantage is clearest in energy: it requires one and two orders of magnitude less energy than a micro-controller and an address-event-representation-based processor, respectively. Mosaic opens a new approach to designing neuromorphic hardware on graph-theoretic principles, with less memory and energy.
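The memory-scaling argument above can be illustrated with a toy device-count model. This is my own sketch, not the authors' derivation: it assumes a monolithic crossbar needs N² devices for N neurons, while a Mosaic-style tiling uses T small dense tiles of n neurons each (T·n² devices) plus a fixed sparse routing budget per tile for global connections. The function names and parameter values (`tile_size=16`, `routes_per_tile=64`) are hypothetical.

```python
# Toy device-count model (illustrative assumption, not the paper's math):
# dense monolithic crossbar vs. tiled small-world layout with dense local
# crossbars and a sparse per-tile routing budget.

def dense_devices(n_neurons: int) -> int:
    """Devices for one monolithic N x N crossbar."""
    return n_neurons ** 2

def mosaic_devices(n_neurons: int, tile_size: int, routes_per_tile: int) -> int:
    """Devices for a small-world tiling: dense local + sparse global routing."""
    tiles = -(-n_neurons // tile_size)  # ceiling division
    return tiles * tile_size ** 2 + tiles * routes_per_tile

for n in (256, 1024, 4096):
    d = dense_devices(n)
    m = mosaic_devices(n, tile_size=16, routes_per_tile=64)
    print(f"N={n}: dense={d}, tiled={m}, ratio={d / m:.1f}x")
```

Under these assumptions the dense layout grows quadratically while the tiled layout grows linearly in N, so the advantage widens as the network scales, consistent with the abstract's claim.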


Author(s): Xiaoyang Liu, Zhigang Zeng

Abstract The paper presents memristor crossbar architectures for implementing the layers of deep neural networks, including the fully connected layer, the convolutional layer, and the pooling layer. The crossbars realize both positive and negative weight values and approximately realize various nonlinear activation functions. The layers built from these crossbars are then assembled into a memristor-based multi-layer neural network (MMNN) and a memristor-based convolutional neural network (MCNN). Two in-situ weight-update schemes, fixed-voltage update and approximately linear update, are used to train the networks. Considering variations resulting from the inherent characteristics of memristors and from programming-voltage errors, the robustness of the MMNN and MCNN to these variations is analyzed. Simulation results on standard datasets show that deep neural networks (DNNs) built from memristor crossbars perform satisfactorily on pattern-recognition tasks and exhibit a degree of robustness to memristor variations.
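A common way for crossbars to achieve signed weights, as the abstract describes, is a differential pair of non-negative conductances. The following is a minimal sketch of that idea, my own illustration rather than the paper's code: each weight w is split as w = g⁺ − g⁻, and the analog matrix-vector product is the difference of two column-current sums.

```python
import numpy as np

# Differential-pair mapping of signed weights onto two crossbars of
# non-negative conductances: I = V @ G_pos - V @ G_neg = V @ W.

rng = np.random.default_rng(0)
weights = rng.uniform(-1.0, 1.0, size=(4, 3))   # signed software weights

g_pos = np.clip(weights, 0.0, None)             # positive-weight crossbar
g_neg = np.clip(-weights, 0.0, None)            # negative-weight crossbar

v_in = rng.uniform(0.0, 0.5, size=4)            # input (row) voltages
i_out = v_in @ g_pos - v_in @ g_neg             # differential column currents

# In the ideal (variation-free) case this matches the signed product.
assert np.allclose(i_out, v_in @ weights)
```

The robustness analysis in the paper then asks how much this equality degrades when the conductances g⁺ and g⁻ deviate from their programmed values.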


Author(s): Gianluca Zoppo, Anil Korkmaz, Francesco Marrone, Samuel Palermo, Fernando Corinto, ...
