Evolutionary Model of Moore’s Law

2014 ◽  
Vol 2014 ◽  
pp. 1-7 ◽  
Author(s):  
Joachim Kaldasch

Moore suggested an exponential growth of the number of transistors in integrated electronic circuits. In this paper, Moore’s law is derived from a preferential growth model of successive production technology generations. The theory suggests that products manufactured with a new production technology, which generates lower costs per unit, have a competitive advantage on the market. Previous technology generations are therefore replaced according to a Fisher-Pry law. The case in which a production technology is governed by a cost-relevant characteristic is then discussed. If this characteristic is bounded by a technological or physical limit, the presented evolutionary model predicts an asymptotic approach to that limit. The model is applied to the evolution of wafer size and to the long-term evolution of Moore’s law in the case of a physical limit on lithographic production technology. It predicts that the miniaturization of electronic devices will slow down considerably over the next two decades.
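The abstract's core mechanism, in which successive production technology generations displace one another along Fisher-Pry logistic curves and in aggregate yield exponential growth, can be sketched numerically. The Python snippet below is a minimal illustration with made-up parameters (the substitution rate alpha, a 3-year generation spacing, one doubling per generation); it is not the paper's calibrated model.

```python
import numpy as np

def fisher_pry_share(t, t0, alpha):
    # Fisher-Pry substitution: the new technology's market share f obeys
    # f / (1 - f) = exp(alpha * (t - t0)), i.e. a logistic takeover curve.
    return 1.0 / (1.0 + np.exp(-alpha * (t - t0)))

# Illustrative parameters (assumed, not taken from the paper): a new
# production technology generation launches every 3 years and displaces
# its predecessor at substitution rate alpha.
alpha, spacing = 1.5, 3.0
years = np.arange(1970.0, 2001.0)
takeovers = np.arange(1971.5, 2001.0, spacing)  # midpoint t0 of each takeover

# Each completed substitution raises the "generation index" by one; summing
# the logistic steps gives the effective generation reached at time t.
gen_index = sum(fisher_pry_share(years, t0, alpha) for t0 in takeovers)

# If every generation doubles the transistor count, the chain of Fisher-Pry
# substitutions reproduces Moore-style exponential growth.
transistors = 2.0 ** gen_index
for y, n in zip(years[::10], transistors[::10]):
    print(int(y), round(float(n), 1))
```

Under these toy assumptions the count roughly doubles every generation spacing, which is exactly how the paper's preferential growth picture recovers Moore's law; a hard physical limit would enter by capping the characteristic that drives each new generation.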

2019 ◽  
Vol 13 (1) ◽  
pp. 106-109
Author(s):  
Thomas J. Misa

This talk presents the theme that anchors the new third edition of Leonardo to the Internet: Technology and Culture from the Renaissance to the Present, which is organized around technical-economic-political “eras” spotlighting the long-term interactions of technology and culture. The book’s first edition (2004) concluded with an optimistic assessment of global culture; the second edition (2011) added a pessimistic assessment of systemic risk. The eras point to socio-economic structures that foster and channel the development of certain technologies (and not others). This approach steers a middle ground between social constructivism and technological determinism. The talk analyzes Moore’s Law (1975–2005), widely hailed to explain, well, everything. By 1975 Gordon Moore appeared to accurately “predict” the doubling every 18 months of the number of components on each integrated circuit. During these years chips expanded from roughly 2,000 to 600 million transistors; more importantly, the “law” guided a technical revolution and an industry transformation. First national and then international cooperative “roadmapping” exercises predicted the exact dimensions of future chips, and semiconductor companies all aimed exactly where their peers were aiming. Moore’s Law was thus a self-fulfilling prophecy, supported for three decades by inter-firm cooperation and synchronized R&D.
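As a back-of-the-envelope check on the figures quoted above (the numbers are the abstract's; the arithmetic is an illustration, not part of the talk), the doubling time implied by the 2,000-to-600-million growth can be computed directly:

```python
import math

# Rough consistency check: growing from ~2,000 to ~600 million transistors
# between 1975 and 2005 implies log2(600e6 / 2000) ~ 18.2 doublings
# over 360 months.
doublings = math.log2(600e6 / 2000)
months = 30 * 12
print(f"implied doubling time: {months / doublings:.1f} months")  # ~19.8
```

The implied cadence of roughly 20 months sits close to the popularized 18-month figure, consistent with the talk's point that the industry steered itself toward the predicted trajectory.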


Author(s):  
David Segal

Chapter 3 highlights the critical role materials have played in the development of digital computers. It traces developments from the cat’s whisker to valves, and on through relays and transistors. Accounts are given of transistors and of the manufacture of integrated circuits (silicon chips) by photolithography. Potential future computing techniques, namely quantum computing and the DNA computer, are covered. The history of computability and Moore’s Law are discussed.


Author(s):  
Daniel Pargman ◽  
Aksel Biørn-Hansen ◽  
Elina Eriksson ◽  
Jarmo Laaksolahti ◽  
Markus Robèrt

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Prasanna Date ◽  
Davis Arthur ◽  
Lauren Pusey-Nazzaro

Training machine learning models on classical computers is usually a time- and compute-intensive process. With Moore’s law nearing its inevitable end and an ever-increasing demand for large-scale data analysis using machine learning, we must leverage non-conventional computing paradigms like quantum computing to train machine learning models efficiently. Adiabatic quantum computers can approximately solve NP-hard problems, such as quadratic unconstrained binary optimization (QUBO), faster than classical computers. Since many machine learning problems are also NP-hard, we believe adiabatic quantum computers might be instrumental in training machine learning models efficiently in the post-Moore’s-law era. To be solved on an adiabatic quantum computer, a problem must first be formulated as a QUBO problem, which is itself challenging. In this paper, we formulate the training problems of three machine learning models (linear regression, support vector machine (SVM), and balanced k-means clustering) as QUBO problems, making them amenable to training on adiabatic quantum computers. We also analyze the computational complexities of our formulations and compare them to corresponding state-of-the-art classical approaches. We show that the time and space complexities of our formulations are better than those of the classical approaches for SVM and balanced k-means clustering, and equivalent for linear regression.
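To make the QUBO recipe concrete, here is a minimal sketch for the linear regression case: encode each weight as a few binary variables, expand the least-squares loss into a QUBO matrix, and minimize. The toy data, the two-bit fixed-point encoding, and the brute-force search (a stand-in for the adiabatic annealer) are my own illustrative assumptions, not the authors' exact formulation.

```python
import itertools
import numpy as np

# Toy data: four samples whose targets are generated by weights [1, 2],
# which we hope the QUBO minimizer recovers.
X = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]])
y = X @ np.array([1.0, 2.0])

# Binary encoding: each weight w_j = sum_k 2^k * b_{j,k}, so with two bits
# per weight w_j ranges over {0, 1, 2, 3}. P maps the bit vector b to w = P b.
bits_per_weight = 2
d = X.shape[1]
P = np.kron(np.eye(d), [2.0**k for k in range(bits_per_weight)])

# Expanding ||X P b - y||^2 and folding the linear term onto the diagonal
# (valid because b_i^2 = b_i for binary variables) gives a matrix Q such that
# minimizing b^T Q b over b in {0,1}^n minimizes the least-squares loss.
A = X @ P
Q = A.T @ A - 2.0 * np.diag(A.T @ y)

# Brute-force the QUBO over all 2^n bit vectors; on real hardware this step
# would be handed to an adiabatic quantum annealer instead.
n = Q.shape[0]
best_b = min((np.array(b) for b in itertools.product((0, 1), repeat=n)),
             key=lambda b: b @ Q @ b)
print("recovered weights:", P @ best_b)  # expect [1. 2.]
```

The design choice worth noting is the diagonal fold: because binary variables are idempotent, the linear term of the expanded loss can be absorbed into the QUBO matrix, which is what makes the unconstrained binary-quadratic form sufficient.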


2015 ◽  
Vol 59 (1) ◽  
pp. 33-35 ◽  
Author(s):  
Michael A. Cusumano ◽  
David B. Yoffie
