Computing Hardware

Author(s):  
James A. Anderson

Digital computers are built from hardware of great simplicity. First, they are built from devices with two states: on or off, one or zero, high voltage or low voltage, or logical TRUE or FALSE. Second, the devices are connected by extremely fine wires, currently on the order of the size of a large virus. Their utility, value, and perceived extreme complexity lie in the software controlling them. Different devices have been used to build computers: relays, vacuum tubes, transistors, and integrated circuits. Theoretically, all can run the same software, only slower or faster. More exotic technologies have not proved commercially viable. Digital computer hardware has increased in power by roughly a factor of 2 every 2 years for five decades, an observation called Moore's Law. Engineering problems with very small devices, such as quantum effects, heat, and difficulty of fabrication, are mounting and may soon end Moore's Law.
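As a back-of-the-envelope illustration of that compounding (a sketch, not part of the abstract above): one doubling every 2 years sustained over five decades corresponds to 2^25, roughly a 33.5-million-fold increase.

```python
# Illustrative arithmetic only: compound growth implied by Moore's Law,
# assuming one doubling every 2 years sustained over five decades.
years = 50
doubling_period_years = 2
factor = 2 ** (years / doubling_period_years)  # 2**25
print(f"growth over {years} years: about {factor:,.0f}x")  # ~33,554,432x
```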

Author(s):  
David Segal

Chapter 3 highlights the critical role materials have played in the development of digital computers. It traces developments from the cat's whisker to valves, through to relays and transistors. Accounts are given of the transistor and of the manufacture of integrated circuits (silicon chips) using photolithography. Potential future computing techniques, namely quantum computing and the DNA computer, are covered. The history of computability and Moore's Law are discussed.


2010, Vol 98 (2), pp. 253-266
Author(s):  
Ronald G. Dreslinski ◽  
Michael Wieckowski ◽  
David Blaauw ◽  
Dennis Sylvester ◽  
Trevor Mudge

2021, pp. 187-198
Author(s):  
Laszlo Solymar

This is the story of the birth of the transistor and of the growing understanding of the theory and technology of solid-state devices. The transistor was invented at Bell Laboratories by William Shockley, John Bardeen, and Walter Brattain, who received the Nobel Prize in 1956. The next advance was putting more and more units on a substrate, initiating the age of integrated circuits. Moore's Law in its original form states that the number of transistors on a substrate will double every year. As the price of computers using transistors plummeted, the number of computers sold rose fast.


2006, Vol 7 (12), pp. 1961-1967
Author(s):  
L. Thylén ◽  
Sailing He ◽  
L. Wosinski ◽  
Daoxin Dai

2020, Vol 20 (1&2), pp. 1-13
Author(s):  
Nicolas F. Lori ◽  
Jos Neves ◽  
Alex H. Blin ◽  
Victor Alves

The contemporary development of quantum computers has opened new possibilities for improving computation, but the limits of the validity of Moore's law are starting to show. We analyze here the possibility that miniaturization will continue to underwrite the validity of Moore's law in the near future, and we conclude that miniaturization is no longer a reliable answer for the future development of computer science; instead, we suggest that lateralization is the correct approach. By lateralization, we mean the use of biology as the correct format for the implementation of ubiquitous computerized systems, a format that might in many circumstances eschew miniaturization as an overly expensive and useless advantage, while in other cases miniaturization might play a key role. Thus, the future of computer science lies not in miniaturization from the atom scale (its present application scale) towards the nucleus scale, but rather in developing more integrated circuits at the micrometer to nanometer scale, so as to better mimic and interact with biological systems. We analyze some "almost sci-fi" approaches to the development of better computer systems near the Bekenstein bound limit and, unsurprisingly, find that none of them is realistically feasible. Then, we use the difference between the classical and quantum versions of the Hammerstein-Clifford theorem to explain why biological systems eschewed quantum computation to represent the world and chose classical computation instead. Finally, we analyze examples of recent work that indicate future possibilities of integration between computers and biological systems. As a corollary of that choice by biological systems, we propose that the predicted lateralization-driven evolution in computer science will be based not on quantum computers but on classical computers.
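For reference, the Bekenstein bound invoked above has the following standard form (a textbook statement, not quoted from this paper), bounding the information I, in bits, that a system of total energy E enclosed in a sphere of radius R can hold:

```latex
% Bekenstein bound on information content (standard form):
I \le \frac{2\pi R E}{\hbar c \ln 2}
```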


The number of transistors per chip, feature sizes, frequencies, transistor densities, numbers of cores, thermal design powers, die areas, and storage capacities of integrated circuits (ICs) used in different processing units and memories were collected from various websites for 1973 to 2019 and plotted against the year of introduction on a semi-log scale, with trend lines and R-squared (R²) values computed in Microsoft Excel. The R² values of the trend lines exceeded 0.922, indicating that more than 92% of the data fit the trend lines, except for thermal design power (R² = 0.7) and die area (R² = 0.4 to 0.6). Transistor counts, transistor densities, frequencies, and thermal design powers (TDP) for the different processing units grew exponentially, doubling every 16.8 to 24 months from 1973 to 2019, except that the exponential growth of TDP and frequency held only up to 2003; since then, their growth has been nearly linear. The corresponding parameters for memory ICs grew somewhat faster, doubling every 14 to 16 months. Feature sizes shrank by a factor of 2 every 18 months. A strong relation was found between feature size and transistor density (R² = 0.9): each one-fold decrease in feature size corresponded to a 2- to 3-fold increase in transistor density. The various IC design parameters from 1973 to 2019 thus kept pace with Moore's law, and it may be concluded that decreasing feature sizes and increasing transistor counts and densities will continue to follow Moore's law for some years more, within the limits of IC frequency and power.
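A minimal sketch of the semi-log fitting procedure described above: regressing log₂(transistor count) on year makes the doubling time the reciprocal of the slope. The data points below are illustrative placeholders, not the study's collected values.

```python
import numpy as np

# Sketch of the semi-log trend fit: regress log2(transistor count) on year;
# the reciprocal of the slope is the doubling time. Placeholder data only.
years = np.array([1973, 1983, 1993, 2003, 2013, 2019], dtype=float)
counts = np.array([4e3, 1e5, 3e6, 2e8, 4e9, 4e10])  # illustrative values

slope, intercept = np.polyfit(years, np.log2(counts), 1)
doubling_months = 12.0 / slope  # months per doubling

# R-squared of the linear fit on the semi-log scale
pred = slope * years + intercept
resid = np.log2(counts) - pred
r2 = 1.0 - resid.var() / np.log2(counts).var()
print(f"doubling time: {doubling_months:.1f} months, R^2 = {r2:.3f}")
```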


2008, Vol 600-603, pp. 1091-1094
Author(s):  
Y. Zhang ◽  
Kuang Sheng ◽  
Ming Su ◽  
Jian Hui Zhao ◽  
Petre Alexandrov ◽  
...  

A series of high-voltage (HV) and low-voltage (LV) lateral JFETs has been successfully developed in 4H-SiC on the vertical-channel LJFET (VC-LJFET) device platform. Characterizations at both room temperature and 300 °C are presented. The HV JFET shows a specific on-resistance of 12.8 mΩ·cm² and is capable of conducting currents larger than 3 A at room temperature. A threshold-voltage drop of about 0.5 V is observed for the HV and LV JFETs as the temperature rises from room temperature to 300 °C. The measured increase of specific on-resistance with temperature, due to the reduction of electron mobility, agrees with the numerical prediction. The first demonstration of a SiC power integrated circuit (PIC) is also reported, showing 5 MHz switching at a VDS of 200 V and an on-state current of 0.4 A.
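A hedged sketch of the mobility-limited temperature scaling the abstract alludes to: if the channel resistance is dominated by bulk electron mobility and mobility follows a lattice-scattering power law μ(T) ∝ T⁻ⁿ, then R_on scales as (T/T₀)ⁿ. The exponent n = 2.4 is an assumed literature-style value for 4H-SiC electrons, not a number from this paper.

```python
# Hedged estimate of on-resistance scaling, R_on ~ 1/mu(T), assuming a
# lattice-scattering mobility power law mu(T) ~ T**(-n). The exponent
# n = 2.4 is an assumed value for 4H-SiC electrons, not from this paper.
T0 = 300.0      # reference temperature, K (room temperature)
T1 = 573.0      # 300 degrees C expressed in kelvin
n = 2.4         # assumed mobility exponent
r_on_rt = 12.8  # specific on-resistance at T0, mOhm*cm^2 (from the abstract)

r_on_hot = r_on_rt * (T1 / T0) ** n
print(f"estimated R_on at 300 C: {r_on_hot:.1f} mOhm*cm^2")  # ~60
```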


Author(s):  
M.G. Rosenfield

Minimum feature sizes in experimental integrated circuits are approaching 0.5 μm and below. During the fabrication process it is usually necessary to non-destructively measure the critical dimensions in resist and after the various process steps. This can be accomplished using the low-voltage SEM. Submicron linewidth measurement is typically done by manually measuring SEM micrographs. Since it is desirable to make as many measurements as possible in the shortest period of time, it is important that this technique be automated.

Linewidth measurement using the scanning electron microscope is not well understood. The basic intent is to measure the size of a structure from the secondary electron signal generated by that structure. Thus, it is important to understand how the actual dimension of the line being measured relates to the secondary electron signal. Since different features generate different signals, the same method of relating linewidth to signal cannot be used for all of them. For example, the peak-to-peak method may accurately measure the linewidth of an isolated resist line, but a threshold technique may be required for an isolated space in resist.
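A minimal sketch contrasting the two width criteria named above on a synthetic one-dimensional secondary-electron line scan. The profile shape, peak positions, and the 50% threshold level are assumptions for illustration, not values from this work.

```python
import numpy as np

# Synthetic secondary-electron line scan: bright edge peaks on both
# sidewalls of a nominal 0.5 um line, with a modest plateau over the line.
x = np.linspace(0.0, 2.0, 2001)  # position, um
profile = (np.exp(-((x - 0.75) / 0.03) ** 2)
           + np.exp(-((x - 1.25) / 0.03) ** 2)
           + 0.3 * ((x > 0.75) & (x < 1.25)))

# Peak-to-peak criterion: distance between the two edge-peak maxima.
i_left = np.argmax(np.where(x < 1.0, profile, -np.inf))
i_right = np.argmax(np.where(x >= 1.0, profile, -np.inf))
print(f"peak-to-peak width: {x[i_right] - x[i_left]:.3f} um")

# Threshold criterion: outer width of the region above 50% of the maximum.
above = x[profile >= 0.5 * profile.max()]
print(f"threshold width:    {above[-1] - above[0]:.3f} um")
```

The two criteria give different answers for the same structure, which is the point the abstract makes about choosing the measurement method per feature type.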

