Near-Threshold Computing: Reclaiming Moore's Law Through Energy Efficient Integrated Circuits

2010 ◽  
Vol 98 (2) ◽  
pp. 253-266 ◽  
Author(s):  
Ronald G. Dreslinski ◽  
Michael Wieckowski ◽  
David Blaauw ◽  
Dennis Sylvester ◽  
Trevor Mudge


Author(s):  
David Segal

Chapter 3 highlights the critical role materials play in the development of digital computers. It traces developments from the cat’s whisker detector through valves and relays to the transistor. Accounts are given of the transistor and of the manufacture of integrated circuits (silicon chips) by photolithography. Future potential computing techniques, namely quantum computing and the DNA computer, are covered. The history of computability and Moore’s Law are discussed.


2017 ◽  
Vol 8 ◽  
pp. 2689-2710 ◽  
Author(s):  
Igor I Soloviev ◽  
Nikolay V Klenov ◽  
Sergey V Bakurskiy ◽  
Mikhail Yu Kupriyanov ◽  
Alexander L Gudkov ◽  
...  

The predictions of Moore’s law are considered by experts to be valid until 2020, giving rise to “post-Moore’s” technologies afterwards. Energy efficiency is one of the major challenges in high-performance computing that must be addressed. Superconductor digital technology is a promising post-Moore’s alternative for the development of supercomputers. In this paper, we consider the operating principles of energy-efficient superconductor logic and memory circuits, with a short retrospective review of their evolution. We analyze their shortcomings with respect to computer circuit design. Possible directions for further research are outlined.


2021 ◽  
pp. 187-198
Author(s):  
Laszlo Solymar

This is the story of the birth of the transistor and of the growing understanding of the theory and technology of solid state devices. The transistor was invented at Bell Laboratories by William Shockley, John Bardeen and Walter Brattain, who received the Nobel Prize in 1956. The next advance was putting more and more units on a single substrate, initiating the age of integrated circuits. Moore’s Law in its original form states that the number of transistors on a substrate doubles every year. As the price of computers built with transistors plummeted, the number of computers sold rose rapidly.


Author(s):  
Chien-Ping Lu

Artificial Intelligence (AI) was the inspiration that shaped computing as we know it today. In this article, I explore why and how AI will continue to inspire computing and reinvent it as Moore's Law runs out of steam. At the dawn of computing, Alan Turing proposed that instead of comprising many different specific machines, the computing machinery for AI should be a Universal Digital Computer, modeled after human computers, who carry out calculations with pencil and paper. Based on the belief that a digital computer would be significantly faster, more diligent, and more patient than a human, he anticipated that AI would be advanced as software. In modern terminology, a universal computer is designed to understand a language known as an Instruction Set Architecture (ISA), and software is translated into that ISA. Since then, universal computers have become exponentially faster and more energy efficient through Moore's Law, while software has grown more sophisticated. Even though software has not yet made a machine think, it has fundamentally changed how we live. The computing revolution started when software was decoupled from the computing machinery. Since the slowdown of Moore's Law in 2005, the universal computer has no longer improved exponentially in speed and energy efficiency. It must carry its ISA legacy and cannot be aggressively modified to save energy. Turing's proposition of AI as software is challenged, and the temptation to build many domain-specific AI machines emerges. Thanks to Deep Learning, software can stay decoupled from the computing machinery in the language of linear algebra, which it shares with supercomputing. A new universal computer for AI understands that language natively, becoming a Native Supercomputer. AI has been, and will remain, the inspiration for computing. The quest to make machines think continues amid the slowdown of Moore's Law. AI might not only maximize the remaining benefits of Moore's Law, but also revive Moore's Law beyond current technology.


2006 ◽  
Vol 7 (12) ◽  
pp. 1961-1967 ◽  
Author(s):  
L. Thylén ◽  
Sailing He ◽  
L. Wosinski ◽  
Daoxin Dai


Author(s):  
James A. Anderson

Digital computers are built from hardware of great simplicity. First, they are built from devices with two states: on or off, one or zero, high voltage or low voltage, logical TRUE or FALSE. Second, the devices are connected by extremely fine wires, currently on the order of the size of a large virus. Their utility, value, and perceived extreme complexity lie in the software controlling them. Different devices have been used to build computers: relays, vacuum tubes, transistors, and integrated circuits. Theoretically, all can run the same software, only slower or faster. More exotic technologies have not proved commercially viable. Digital computer hardware has increased in power by roughly a factor of 2 every 2 years for five decades, an observation called Moore’s Law. Engineering problems with very small devices, such as quantum effects, heat, and difficulty of fabrication, are mounting and may soon end Moore’s Law.
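As a rough illustration of the compound growth described in the abstract above (not code from any of the cited works), doubling every 2 years sustained for five decades multiplies out as follows:

```python
# Back-of-the-envelope arithmetic for the Moore's Law figure quoted above:
# a factor of 2 every 2 years, sustained over 50 years.
years = 50
doubling_period = 2
doublings = years // doubling_period      # 25 doublings in five decades
growth_factor = 2 ** doublings            # 2^25
print(f"{doublings} doublings -> ~{growth_factor:,}x improvement")
# prints "25 doublings -> ~33,554,432x improvement"
```

That is, the observation implies roughly a 30-million-fold improvement over the period, which is why even a modest change in the doubling period compounds into an enormous difference in capability.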

