Solid state quantum computers: a nanoscopic solution to the Moore's law problem

2001 ◽  
Author(s):  
Joseph Ng ◽  
Derek Abbott

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Prasanna Date ◽  
Davis Arthur ◽  
Lauren Pusey-Nazzaro

Abstract
Training machine learning models on classical computers is usually a time- and compute-intensive process. With Moore's law nearing its inevitable end and an ever-increasing demand for large-scale data analysis using machine learning, we must leverage non-conventional computing paradigms like quantum computing to train machine learning models efficiently. Adiabatic quantum computers can approximately solve NP-hard problems, such as quadratic unconstrained binary optimization (QUBO), faster than classical computers. Since many machine learning problems are also NP-hard, we believe adiabatic quantum computers might be instrumental in training machine learning models efficiently in the post-Moore's-law era. To solve a problem on an adiabatic quantum computer, however, it must first be formulated as a QUBO problem, which is itself a challenging task. In this paper, we formulate the training problems of three machine learning models, namely linear regression, support vector machine (SVM), and balanced k-means clustering, as QUBO problems, making them amenable to training on adiabatic quantum computers. We also analyze the computational complexities of our formulations and compare them to those of the corresponding state-of-the-art classical approaches. We show that the time and space complexities of our formulations are better (for SVM and balanced k-means clustering) than or equivalent (for linear regression) to their classical counterparts.
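To make the QUBO idea concrete, the following is a minimal sketch (not the paper's exact formulation) of how linear-regression training, min_w ||Xw - y||^2, can be encoded as a QUBO: each real weight is approximated by a fixed-point binary expansion, the quadratic objective is expanded over those bits, and the linear terms are folded onto the QUBO diagonal (valid because b^2 = b for binary b). The precision vector and problem sizes below are illustrative assumptions; a brute-force loop stands in for the annealer.

```python
import numpy as np

# Tiny synthetic regression problem; true weights are [1, -2].
rng = np.random.default_rng(0)
n, d = 8, 2                             # samples, features
X = rng.normal(size=(n, d))
y = X @ np.array([1.0, -2.0])

# Fixed-point encoding: w_j = sum_k p[k] * b[j, k] over binary bits.
# This precision vector (sign bit first) is an illustrative choice.
p = np.array([-2.0, 1.0, 0.5, 0.25])
K = len(p)

# Effective design matrix: column (j, k) carries p[k] * X[:, j].
Xt = np.kron(X, p.reshape(1, -1))       # shape (n, d*K)

# QUBO matrix: quadratic part Xt^T Xt; the linear part -2 Xt^T y
# goes on the diagonal since b^2 = b for binary variables.
Q = Xt.T @ Xt
Q[np.diag_indices_from(Q)] += -2.0 * (Xt.T @ y)

# Exhaustively minimize b^T Q b (the step an adiabatic annealer
# would perform); feasible only because d*K = 8 here.
best_bits, best_val = None, np.inf
for m in range(2 ** (d * K)):
    b = np.array([(m >> i) & 1 for i in range(d * K)], dtype=float)
    val = b @ Q @ b
    if val < best_val:
        best_bits, best_val = b, val

# Decode the bits back into real-valued weights.
w = (best_bits.reshape(d, K) * p).sum(axis=1)
print(w)  # recovers [ 1. -2.] since both weights are exactly representable
```

The objective b^T Q b equals ||Xw - y||^2 minus the constant y^T y, so the QUBO minimizer coincides with the least-squares solution whenever the weights are representable in the chosen precision; in practice the bit budget d*K is what drives the qubit count on the annealer.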


2021 ◽  
pp. 187-198
Author(s):  
Laszlo Solymar

This is the story of the birth of the transistor and of the growing understanding of the theory and technology of solid-state devices. The transistor was invented at Bell Laboratories by William Shockley, John Bardeen, and Walter Brattain, who received the Nobel Prize in 1956. The next advance was putting more and more units on a single substrate, initiating the age of integrated circuits. Moore's Law in its original form states that the number of transistors on a substrate will double every year. As the price of transistor-based computers plummeted, the number of computers sold rose fast.


2020 ◽  
Vol 20 (1&2) ◽  
pp. 1-13
Author(s):  
Nicolas F. Lori ◽  
Jos Neves ◽  
Alex H. Blin ◽  
Victor Alves

The contemporary development of quantum computers has opened new possibilities for improvements in computation, but the limits of Moore's law's validity are starting to show. We analyze here the possibility that miniaturization will continue to sustain Moore's law in the near future, and our conclusion is that miniaturization is no longer a reliable path for the future development of computer science; instead, we suggest that lateralization is the correct approach. By lateralization, we mean the use of biology as the correct format for the implementation of ubiquitous computerized systems, a format that might in many circumstances eschew miniaturization as an overly expensive and useless advantage, whereas in other cases miniaturization might play a key role. Thus, the future of computer science lies not in miniaturization that proceeds from the atom scale (its present application scale) towards the nucleus scale, but rather in developing more integrated circuits at the micrometer-to-nanometer scale, so as to better mimic and interact with biological systems. We analyze some "almost sci-fi" approaches to developing better computer systems near the Bekenstein bound, and, unsurprisingly, they fail to have any realistic feasibility. We then use the difference between the classical and quantum versions of the Hammerstein-Clifford theorem to explain why biological systems eschewed quantum computation for representing the world and chose classical computation instead. Finally, we analyze examples of recent work that indicate future possibilities of integration between computers and biological systems. As a corollary of that choice by biological systems, we propose that the predicted lateralization-driven evolution in computer science will be based not on quantum computers but on classical computers.


2010 ◽  
pp. 111-114
Author(s):  
Tadhg Morgan

The relentless progression of technology is something we are all familiar with. Computers have gone from filling entire rooms to taking up only some desk space, while at the same time becoming incredibly fast. Music was once stored on vinyl records, but we can now store hundreds of albums on portable MP3 players. This progression is described by Moore's law, which says that technology becomes twice as small and twice as fast every eighteen months. However, this progression can only continue unhindered for so long before it hits a fundamental wall. The problem is that the miniaturization of technology is moving it out of the classical, everyday world and into the quantum world, and devices will soon reach the size of a single atom or a few atoms. Whilst moving into the quantum world presents a number of challenges, the benefits far outweigh them. Quantum computers, computers which utilize quantum mechanics, ...


Author(s):  
David Segal

Chapter 3 highlights the critical role materials have played in the development of digital computers. It traces developments from the cat's whisker to valves, and on to relays and transistors. Accounts are given of transistors and of the manufacture of integrated circuits (silicon chips) by photolithography. Future potential computing techniques, namely quantum computing and the DNA computer, are covered. The history of computability and Moore's Law are discussed.


Author(s):  
Daniel Pargman ◽  
Aksel Biørn-Hansen ◽  
Elina Eriksson ◽  
Jarmo Laaksolahti ◽  
Markus Robèrt

2015 ◽  
Vol 59 (1) ◽  
pp. 33-35 ◽  
Author(s):  
Michael A. Cusumano ◽  
David B. Yoffie
