Adiabatic superconducting cells for ultra-low-power artificial neural networks

2016 ◽  
Vol 7 ◽  
pp. 1397-1403 ◽  
Author(s):  
Andrey E Schegolev ◽  
Nikolay V Klenov ◽  
Igor I Soloviev ◽  
Maxim V Tereshonok

We propose the concept of using superconducting quantum interferometers for the implementation of neural network algorithms with extremely low power dissipation. These adiabatic elements are Josephson cells with sigmoid- and Gaussian-like activation functions. We optimize their parameters for application in three-layer perceptron and radial basis function networks.
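For intuition, the two cell responses map onto standard network building blocks: a sigmoid-like activation in a three-layer perceptron and a Gaussian-like activation in a radial basis function (RBF) network. Below is a minimal NumPy sketch of how these two activation shapes enter the respective forward passes; all weights, centers, and widths are illustrative placeholders, not the optimized Josephson-cell parameters from the paper.

```python
import numpy as np

def sigmoid(x):
    # Sigmoid-like activation, the response shape of the perceptron cell.
    return 1.0 / (1.0 + np.exp(-x))

def gaussian(x, width=1.0):
    # Gaussian-like activation, the response shape of the RBF cell.
    return np.exp(-(x / width) ** 2)

def three_layer_perceptron(x, W1, b1, W2, b2):
    # Input -> sigmoid hidden layer -> linear output layer.
    h = sigmoid(W1 @ x + b1)
    return W2 @ h + b2

def rbf_network(x, centers, widths, weights):
    # Each hidden unit responds to the distance from its center.
    d = np.linalg.norm(centers - x, axis=1)
    return weights @ gaussian(d, widths)
```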

2011 ◽  
Vol 64 (1) ◽  
pp. 47-53 ◽  
Author(s):  
Giuseppe Moschetti ◽  
Niklas Wadefalk ◽  
Per-Åke Nilsson ◽  
Yannick Roelens ◽  
Albert Noudeviwa ◽  
...  

Author(s):  
Andrew Lishchytovych ◽  
Volodymyr Pavlenko

The object of this study is to analyse the effectiveness of document ranking algorithms in search engines that use artificial neural networks to match texts. The purpose of the study was to inspect a neural network model of text document ranking that uses clustering, factor analysis, and a multi-layered network architecture. The neural network algorithms were compared with the standard statistical search algorithm Okapi BM25. The result of the study is an evaluation of the effectiveness of the particular models and recommendations on model selection for specific datasets.
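For reference, the statistical baseline named here, Okapi BM25, ranks a document by summing IDF-weighted, length-normalized term-frequency contributions over the query terms. A minimal sketch of that baseline scorer follows, assuming the common default parameters k1 = 1.5 and b = 0.75 rather than any values from the study.

```python
import math
from collections import Counter

def bm25_score(query_terms, doc_terms, doc_freq, n_docs, avg_len, k1=1.5, b=0.75):
    # Okapi BM25: sum of IDF-weighted, length-normalized term frequencies.
    tf = Counter(doc_terms)
    score = 0.0
    for term in query_terms:
        df = doc_freq.get(term, 0)  # number of documents containing the term
        if df == 0:
            continue
        idf = math.log((n_docs - df + 0.5) / (df + 0.5) + 1.0)
        norm = tf[term] * (k1 + 1) / (
            tf[term] + k1 * (1 - b + b * len(doc_terms) / avg_len)
        )
        score += idf * norm
    return score
```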


2019 ◽  
Vol 2019 ◽  
pp. 1-8 ◽  
Author(s):  
Ismail Gassoumi ◽  
Lamjed Touil ◽  
Bouraoui Ouni ◽  
Abdellatif Mtibaa

Quantum-dot cellular automata (QCA) technology is one of the emerging technologies that could replace CMOS technology. It has attracted significant attention in recent years due to its extremely low power dissipation, high operating frequency, and small size. In this study, we demonstrate an n-bit parity generator circuit built with QCA technology. A novel XOR gate, based on electrostatic interactions between cells, is used in the synthesis of the proposed circuit. The comparison results demonstrate that the designed QCA circuits have advantages over other circuits in terms of cell count, area, delay, and power consumption. The QCADesigner software, a widely used QCA circuit design and verification tool, has been used to implement and verify all of the designs in this study. Power dissipation has been computed for the proposed circuit using the accurate QCAPro power estimator tool.
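At the logic level, an n-bit parity generator is a cascade of XOR gates over the input bits, which is exactly the function the proposed QCA XOR gate is composed into. A short behavioral sketch of that Boolean function follows; the QCA cell layout itself, of course, cannot be captured in software.

```python
from functools import reduce

def parity(bits):
    # n-bit parity generator: cascaded XOR over all input bits.
    # Returns 1 when an odd number of inputs are 1, else 0.
    return reduce(lambda acc, b: acc ^ b, bits, 0)

# Example: the 4-bit input 1011 has three 1s, so its parity bit is 1.
assert parity([1, 0, 1, 1]) == 1
```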


2006 ◽  
Vol 42 (12) ◽  
pp. 688 ◽  
Author(s):  
W. Kruppa ◽  
J.B. Boos ◽  
B.R. Bennett ◽  
N.A. Papanicolaou ◽  
D. Park ◽  
...  

2021 ◽  
Vol 15 ◽  
Author(s):  
Pavan Kumar Chundi ◽  
Dewei Wang ◽  
Sung Justin Kim ◽  
Minhao Yang ◽  
Joao Pedro Cerqueira ◽  
...  

This paper presents a novel spiking neural network (SNN) classifier architecture for enabling always-on artificial intelligence (AI) functions, such as keyword spotting (KWS) and visual wake-up, in ultra-low-power internet-of-things (IoT) devices. Such always-on hardware tends to dominate the power efficiency of an IoT device, so minimizing its power dissipation is paramount. A key observation is that the input signal to always-on hardware is typically sparse in time. An SNN classifier can exploit this sparsity because the switching activity, and hence the power consumption, of SNN hardware can scale with spike rate. To leverage this scalability, the proposed SNN classifier employs an event-driven architecture, in particular fine-grained clock generation and gating and fine-grained power gating, to obtain very low static power dissipation. The prototype is fabricated in 65 nm CMOS and occupies an area of 1.99 mm². At a 0.52 V supply voltage, it consumes 75 nW with no input activity and less than 300 nW at 100% input activity, while maintaining competitive inference accuracy for KWS and other always-on classification workloads. The prototype achieves a power consumption reduction of over three orders of magnitude compared to state-of-the-art SNN hardware and of about 2.3× compared to state-of-the-art KWS hardware.
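The scaling argument is easiest to see in an event-driven update loop: neuron state is touched only when a spike event arrives, so computation, like the chip's switching activity, grows with spike rate and is nearly free for quiet inputs. Below is a toy leaky integrate-and-fire sketch of this principle; the threshold, leak factor, and weights are illustrative assumptions, not values from the prototype.

```python
import numpy as np

def event_driven_lif(spike_events, weights, threshold=1.0, leak=0.9):
    """Process a sparse stream of (time, input_index) spike events.

    Work is proportional to the number of events, mirroring how the
    hardware's switching activity scales with the input spike rate.
    """
    n_out = weights.shape[0]
    v = np.zeros(n_out)              # membrane potentials
    last_t = 0
    out_spikes = []
    for t, i in spike_events:
        v *= leak ** (t - last_t)    # leak applied lazily, only on events
        last_t = t
        v += weights[:, i]           # integrate the incoming spike
        fired = np.nonzero(v >= threshold)[0]
        out_spikes.extend((t, j) for j in fired)
        v[fired] = 0.0               # reset neurons that fired
    return out_spikes
```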


2019 ◽  
Vol 1 (1) ◽  
pp. p8
Author(s):  
Jamilu Auwalu Adamu

One objective of this paper is to incorporate fat-tail effects into activation functions such as the sigmoid, in order to introduce transparency and stability into the existing stochastic activation functions. Secondly, the author argues that, according to the literature reviewed, the existing activation functions entered deep learning artificial neural networks through trial and error and arbitrary assumptions rather than through rigorous derivation. The author therefore proposes scientific facts, definite rules (Jameel's Stochastic ANNAF Criterion), and a lemma to substitute for, though not necessarily replace, the existing stochastic activation functions such as the sigmoid. This research is expected to help open the "black box" of deep learning artificial neural networks. The author proposes a new set of advanced, optimized, fat-tailed stochastic activation functions derived from AI/ML-purified stock data, namely the Log-Logistic (3P) (1st), Cauchy (2nd), Pearson 5 (3P) (3rd), Burr (4P) (4th), Fatigue Life (3P) (5th), Inverse Gaussian (3P) (6th), Dagum (4P) (7th), and Lognormal (3P) (8th) probability distributions, for conducting both forward and backward propagation in deep learning artificial neural networks. The paper does not, however, check the monotone differentiability of the proposed distributions. Appendices A, B, and C present and test the performance of the stressed sigmoid and the optimized activation functions using stock data (1991-2014) for Microsoft Corporation (MSFT), Exxon Mobil (XOM), Chevron Corporation (CVX), Honda Motor Corporation (HMC), and General Electric (GE), together with U.S. fundamental macroeconomic parameters; the results were found to be fascinating. The first three distributions proved to be excellent activation functions for stock-oriented deep learning artificial neural networks, and distributions four to eight are also good advanced optimized activation functions. Generally, this research revealed that whether the advanced optimized activation functions satisfy Jameel's Stochastic ANNAF Criterion depends on the reference purified AI dataset, the time period, and the area of application, in contrast to the existing trial-and-error, arbitrary-assumption choices of sigmoid, tanh, softmax, ReLU, and leaky ReLU.
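Reading "distribution as activation function" as "use the distribution's CDF in place of the sigmoid" (a hedged reading, since the abstract does not spell out the construction), a minimal sketch with the Cauchy distribution, second on the author's list, shows the fat-tail effect: the Cauchy CDF saturates only polynomially fast, so tail gradients decay far more slowly than the sigmoid's. The location and scale values below are illustrative assumptions, not parameters fitted in the paper.

```python
import numpy as np

def sigmoid(x):
    # Standard logistic activation, for comparison.
    return 1.0 / (1.0 + np.exp(-x))

def cauchy_activation(x, loc=0.0, scale=1.0):
    # Cauchy CDF used as a fat-tailed activation: it approaches 0 and 1
    # only polynomially fast, so gradients in the tails decay slowly.
    return 0.5 + np.arctan((x - loc) / scale) / np.pi

x = np.array([-10.0, -1.0, 0.0, 1.0, 10.0])
print(sigmoid(x))            # tails saturate exponentially fast
print(cauchy_activation(x))  # tails saturate only polynomially
```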

