John von Neumann: The Founding Father of Artificial Life

1998 ◽  
Vol 4 (3) ◽  
pp. 229-235 ◽  
Author(s):  
Pierre Marchal

Aside from being known for his contributions to mathematics and physics, John von Neumann is considered one of the founding fathers of computer science and engineering. Not only did he do pioneering work on sequential computing systems, but he also carried out a major investigation of parallel architectures, leading to his work on cellular automata. His exceptional vision and daring, borrowing from biology the concept of genomic information even before the discovery of DNA's double helix, led him to propose the concept of self-reproducing automata.

Informatics ◽  
2021 ◽  
Vol 8 (4) ◽  
pp. 71
Author(s):  
János Végh

Today’s computing is based on the classic paradigm proposed by John von Neumann three-quarters of a century ago. That paradigm, however, was justified only for (the timing relations of) vacuum tubes. Technological development has invalidated the classic paradigm (but not the model!), leading to catastrophic performance losses in computing systems, from the gate level to large networks, including neuromorphic ones. The model is perfect, but the paradigm is applied outside its range of validity. The classic paradigm is completed here by providing the “procedure” missing from the “First Draft”, which enables computing science to handle cases where the transfer time is not negligible compared with the processing time. The paper reviews whether we can describe the implemented computing processes using an accurate interpretation of the computing model, and whether we can explain the issues experienced in different fields of today’s computing by correcting the unjustified omissions. Furthermore, it discusses some of the consequences of improper technological implementations, from shared media to parallelized operation, and suggests ideas on how computing performance could be improved to meet growing societal demands.
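
As a toy illustration of why this matters (a minimal sketch of the general idea, not Végh's actual procedure; the simple additive timing model is an assumption), suppose the total time of an operation is its data-transfer time plus its processing time. The useful fraction of that time collapses once transfer is no longer negligible:

# Minimal sketch (assumed additive timing model, not Végh's formalism):
# if each operation needs a data transfer of duration t_transfer around a
# computation of duration t_processing, the fraction of time spent doing
# useful work shrinks as the transfer time grows.

def payload_efficiency(t_processing: float, t_transfer: float) -> float:
    """Fraction of the total operation time spent on actual processing."""
    return t_processing / (t_processing + t_transfer)

# Vacuum-tube era assumption: transfer time negligible next to processing.
print(payload_efficiency(t_processing=1.0, t_transfer=0.001))  # ~0.999

# Modern reality: transfer comparable to, or exceeding, processing time.
print(payload_efficiency(t_processing=1.0, t_transfer=3.0))    # 0.25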


Author(s):  
Eleonora Bilotta ◽  
Pietro Pantano

In Complexity Science (Bak, 1996; Morin, 2001; Gell-Mann, 1994; Prigogine & Stengers, 1984) and Artificial Life (Langton, 1995; Adami, 1998), almost all attempts to simulate or synthesize living systems in new media are in some way related to the influential work of John von Neumann (1966). These studies can be grouped into four basic categories (Sipper et al., 1998).


Robotica ◽  
2006 ◽  
Vol 24 (2) ◽  
pp. 145-150
Author(s):  
Brian H. Rudall

When John von Neumann gave his lecture on the ‘General and Logical Theory of Automata’ in 1948, his ideas were met with some scepticism. The suggestion that life is a logical process, one that could result in the making of a new kind of creature, was the forerunner of all our discussions on whether ‘artificial life’ was at all possible.


2004 ◽  
Vol 174 (12) ◽  
pp. 1371 ◽  
Author(s):  
Mikhail I. Monastyrskii

Author(s):  
Sandip Tiwari

Information is physical, so its manipulation through devices is subject to its own mechanics: the science and engineering of behavioral description, which is intermingled with classical, quantum and statistical mechanics principles. This chapter unifies these principles and physical laws with their implications for the nanoscale. It covers state machines, the Church-Turing thesis and its embodiment in various state machines, probabilities, Bayesian principles, and entropy in its various forms (Shannon, Boltzmann, von Neumann, algorithmic), with an eye on the principle of maximum entropy as an information manipulation tool. Notions of conservation and non-conservation are applied to example circuit forms, folding in adiabatic, isothermal, reversible and irreversible processes. This brings out the implications of fluctuations and transitions, the interplay of errors and stability, and the energy cost of determinism. The chapter concludes by discussing networks as tools to understand information flow and decision making, and with an introduction to entanglement in quantum computing.
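
As a small, self-contained illustration of two of the entropy forms listed above (a sketch using standard textbook definitions, not code from the chapter), the Shannon entropy of a probability vector and the von Neumann entropy of a density matrix can be computed as follows:

import numpy as np

def shannon_entropy(p: np.ndarray) -> float:
    """Shannon entropy H(p) = -sum_i p_i log2 p_i of a probability vector."""
    p = p[p > 0]  # zero-probability outcomes contribute nothing
    return float(-np.sum(p * np.log2(p)))

def von_neumann_entropy(rho: np.ndarray) -> float:
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho), via the eigenvalues of rho."""
    eigenvalues = np.linalg.eigvalsh(rho)
    eigenvalues = eigenvalues[eigenvalues > 1e-12]
    return float(-np.sum(eigenvalues * np.log2(eigenvalues)))

# A fair coin and a maximally mixed qubit each carry exactly one bit of entropy.
print(shannon_entropy(np.array([0.5, 0.5])))   # 1.0
print(von_neumann_entropy(np.eye(2) / 2))      # 1.0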


Electronics ◽  
2020 ◽  
Vol 9 (9) ◽  
pp. 1526 ◽  
Author(s):  
Choongmin Kim ◽  
Jacob A. Abraham ◽  
Woochul Kang ◽  
Jaeyong Chung

Crossbar-based neuromorphic computing to accelerate neural networks is a popular alternative to conventional von Neumann computing systems; it is also referred to as processing-in-memory or in-situ analog computing. The crossbars have a fixed number of synapses per neuron, so it is necessary to decompose neurons to map networks onto the crossbars. This paper proposes the k-spare decomposition algorithm, which can trade off predictive performance against neuron usage during the mapping. The proposed algorithm performs a two-level hierarchical decomposition. In the first, global decomposition, it decomposes the neural network such that each crossbar has k spare neurons; these neurons are then used to improve the accuracy of the partially mapped network in the subsequent local decomposition. Our experimental results using modern convolutional neural networks show that the proposed method can improve accuracy substantially with only about 10% extra neurons.
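
The global step can be pictured with a minimal sketch (hypothetical code, not the authors' implementation; the names crossbar_size and global_decomposition, and the contiguous index partitioning, are assumptions for illustration): each crossbar is filled only up to its capacity minus k, so that k slots remain free for the later, accuracy-improving local decomposition.

from typing import List

def global_decomposition(num_neurons: int, crossbar_size: int, k: int) -> List[List[int]]:
    """Hypothetical sketch of the global step: partition neuron indices so that
    each crossbar holds at most (crossbar_size - k) neurons, keeping k spares."""
    usable = crossbar_size - k
    if usable <= 0:
        raise ValueError("k must be smaller than the crossbar size")
    return [list(range(start, min(start + usable, num_neurons)))
            for start in range(0, num_neurons, usable)]

# Example: 300 neurons onto 128-wide crossbars, reserving 16 spares in each.
print(len(global_decomposition(300, crossbar_size=128, k=16)))  # 3 crossbars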


2000 ◽  
Vol 6 (4) ◽  
pp. 347-361 ◽  
Author(s):  
Barry McMullin

In the late 1940s John von Neumann began to work on what he intended as a comprehensive “theory of [complex] automata.” He started to develop a book length manuscript on the subject in 1952. However, he put it aside in 1953, apparently due to pressure of other work. Due to his tragically early death in 1957, he was never to return to it. The draft manuscript was eventually edited, and combined for publication with some related lecture transcripts, by Burks in 1966. It is clear from the time and effort that von Neumann invested in it that he considered this to be a very significant and substantial piece of work. However, subsequent commentators (beginning even with Burks) have found it surprisingly difficult to articulate this substance. Indeed, it has since been suggested that von Neumann's results in this area either are trivial, or, at the very least, could have been achieved by much simpler means. It is an enigma. In this paper I review the history of this debate (briefly) and then present my own attempt at resolving the issue by focusing on an analysis of von Neumann's problem situation. I claim that this reveals the true depth of von Neumann's achievement and influence on the subsequent development of this field, and further that it generates a whole family of new consequent problems, which can still serve to inform—if not actually define—the field of artificial life for many years to come.


Physics World ◽  
2021 ◽  
Vol 34 (12) ◽  
pp. 59-60
Author(s):  
Andrew Robinson

Andrew Robinson reviews The Man from the Future: The Visionary Life of John von Neumann by Ananyo Bhattacharya.

