Why do we need to Introduce Temporal Behavior in Both Modern Science and Modern Computing, with an Outlook to Researching Modern Effects/Materials and Technologies

Author(s):  
Janos Vegh

Classic science seemed to be complete more than a century ago, facing only a few (but a growing number of!) unexplained issues. Introducing time-dependence into classic science explained those issues, and its consistent use led to the birth of a series of modern sciences, including relativistic and quantum physics. Classic computing is based on the paradigm proposed by von Neumann for vacuum tubes only, and it seems to be complete in the same sense. Von Neumann warned, however, that implementing computers under more advanced technological conditions, using the paradigm without considering the transfer time (and especially attempting to imitate neural operation), would be unsound. Nevertheless, classic computing science persists in neglecting the transfer time; it faces a few (but a growing number of!) unexplained issues, and its development has stalled in most of its fields. Introducing time-dependence into classic computing science explains those issues and reveals the reasons for the experienced stalling. It can lead to a revolution in computing, resulting in a modern computing science, in the same way as time-dependence resulted in modern science's birth.

2021
Author(s):
Janos Vegh
Ádám József Berki

Abstract Both the growing demand to cope with "big data" (based on, or assisted by, artificial intelligence) and the interest in understanding the operation of our brain more completely have stimulated efforts to build biology-mimicking computing systems from inexpensive conventional components and to build different ("neuromorphic") computing systems. On one side, those systems require an unusually large number of processors, which introduces performance limitations and nonlinear scaling. On the other side, the neuronal operation drastically differs from conventional workloads. The conduction time (transfer time) is ignored both in conventional computing and in "spatiotemporal" computational models of neural networks, although von Neumann warned: "In the human nervous system the conduction times along the lines (axons) can be longer than the synaptic delays, hence our above procedure of neglecting them aside of τ [the processing time] would be unsound" [1], section 6.3. This difference alone makes imitating biological behavior in technical implementations hard. Besides, recent issues in computing have called attention to the fact that temporal behavior is a general feature of computing systems, too. Some of its effects in both biological and technical systems have already been noticed. Instead of introducing some "looks like" models, the correct handling of the transfer time is suggested here. Introducing a temporal logic based on the Minkowski transform gives quantitative insight into the operation of both kinds of computing systems and provides a natural explanation of decades-old empirical phenomena. Without considering their temporal behavior correctly, neither an effective implementation nor a true imitation of biological neural systems is possible.
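To make the quoted warning concrete, here is a minimal runnable sketch (a toy model with hypothetical names, not the authors' formalism): an operation's observable duration is its processing time plus the conduction (transfer) time of its input signal, so neglecting transfer is justified only while it is vanishingly small next to τ.

```python
# Toy space-time model of a computing event (hypothetical names, not the
# authors' code). An operation at position x_dst can only begin once the
# signal from x_src arrives; the observable duration therefore includes
# the transfer (conduction) time, exactly the term the classic paradigm drops.

def transfer_time(x_src: float, x_dst: float, speed: float) -> float:
    """Travel time of a signal between two positions at a finite speed."""
    return abs(x_dst - x_src) / speed

def observable_time(processing_time: float, x_src: float, x_dst: float,
                    speed: float) -> float:
    """Apparent duration of one operation, transfer included."""
    return transfer_time(x_src, x_dst, speed) + processing_time

# Vacuum-tube regime: transfer negligible, the classic paradigm holds.
print(observable_time(1.0, 0.0, 0.001, speed=1.0))   # ~1.001
# Neural (or on-chip) regime: conduction time rivals the processing time.
print(observable_time(1.0, 0.0, 2.5, speed=1.0))     # 3.5
```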


Informatics
2021
Vol 8 (4)
pp. 71
Author(s):  
János Végh

Today's computing is based on the classic paradigm proposed by John von Neumann three-quarters of a century ago. That paradigm, however, was justified for (the timing relations of) vacuum tubes only. Technological development invalidated the classic paradigm (but not the model!). This led to catastrophic performance losses in computing systems, from the gate level to large networks, including neuromorphic ones. The model is perfect, but the paradigm is applied outside its range of validity. The classic paradigm is completed here by providing the "procedure" missing from the "First Draft", which enables computing science to handle cases where the transfer time is not negligible compared to the processing time. The paper reviews whether we can describe the implemented computing processes by using the accurate interpretation of the computing model, and whether we can explain the issues experienced in different fields of today's computing by correcting the improper omissions. Furthermore, it discusses some of the consequences of improper technological implementations, from shared media to parallelized operation, suggesting ideas on how computing performance could be improved to meet the growing societal demands.
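A hedged numeric sketch of the performance-loss argument (an assumed toy model; the parameters are illustrative, not taken from the paper): if every parallel step carries an overhead that grows with the number of processors, say arbitration on a shared medium, the payload fraction of the total time decays nonlinearly with the processor count.

```python
# Toy scaling model (illustrative assumption, not the paper's derivation):
# each step spends a fixed payload time plus an overhead that grows with the
# number of processors sharing the medium; efficiency is the payload fraction.

def efficiency(n_processors: int, payload: float, per_node_overhead: float) -> float:
    """Payload time divided by total time for one parallel step."""
    total = payload + per_node_overhead * n_processors
    return payload / total

for n in (1, 10, 100, 1000, 10000):
    print(f"{n:>6} processors: efficiency = {efficiency(n, 1.0, 0.001):.3f}")
# 1 -> 0.999, 100 -> 0.909, 1000 -> 0.500, 10000 -> 0.091: nonlinear decay.
```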


2020
Author(s):
Janos Vegh
Ádám József Berki

Abstract Both the growing demand to cope with "big data" (based on, or assisted by, artificial intelligence) and the interest in understanding the operation of our brain more completely have stimulated efforts to build biology-mimicking computing systems from inexpensive conventional components and to build different ("neuromorphic") computing systems. On one side, those systems require an unusually large number of processors, which introduces performance limitations and nonlinear scaling. On the other side, the neuronal operation drastically differs from conventional workloads. The conduction time (transfer time) is ignored both in conventional computing and in "spatiotemporal" computational models of neural networks, although von Neumann warned: "In the human nervous system the conduction times along the lines (axons) can be longer than the synaptic delays, hence our procedure of neglecting them aside of the processing time would be unsound" [1], section 6.3. This difference alone makes imitating biological behavior in technical implementations hard. Besides, recent issues in computing have called attention to the fact that temporal behavior is a general feature of computing systems, too. Some of its effects in both biological and technical systems have already been noticed. Instead of introducing some "looks like" models, the correct handling of the transfer time is suggested here. Introducing a temporal logic based on the Minkowski transform gives quantitative insight into the operation of both kinds of computing systems and provides a natural explanation of decades-old empirical phenomena. Without considering their temporal behavior correctly, neither an effective implementation nor a true imitation of biological neural systems is possible.


Author(s):  
Cheng-yang Zhang
Zhi-hua Guo
H.X. Cao

Quantum coherence is an important physical resource in quantum information science and is also one of the most fundamental and striking features of quantum physics. In this paper, we obtain a symmetry-like relation of the relative entropy measure $C_r(\rho)$ of coherence for $n$-partite quantum states $\rho$, which gives lower and upper bounds for $C_r(\rho)$. Meanwhile, we discuss the conjecture about the validity of the inequality $C_r(\rho)\leq C_{\ell_1}(\rho)$ for any state $\rho$. We observe that every mixture $\eta$ of a state $\rho$ satisfying $C_r(\rho)\leq C_{\ell_1}(\rho)$ and any incoherent state $\sigma$ also satisfies the conjecture. We also note that if the von Neumann entropy is defined with the natural logarithm $\ln$ instead of $\log_2$, then the reduced relative entropy measure of coherence $\bar{C}_r(\rho)=-\mathrm{Tr}(\rho_{\rm diag}\ln \rho_{\rm diag})+\mathrm{Tr}(\rho\ln \rho)$ satisfies the inequality $\bar{C}_r(\rho)\leq C_{\ell_1}(\rho)$ for any mixed state $\rho$.
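For readers who want to probe the conjecture numerically, the following sketch uses the standard textbook definitions ($C_r(\rho) = S(\rho_{\rm diag}) - S(\rho)$ and $C_{\ell_1}(\rho) = \sum_{i\neq j}|\rho_{ij}|$); the code is ours, not the authors'.

```python
# Numerical check of C_r(rho) <= C_l1(rho) on a sample qubit state, using the
# standard definitions of the relative entropy of coherence and the l1-norm
# of coherence (not code from the paper).
import numpy as np

def von_neumann_entropy(rho: np.ndarray, base: float = 2.0) -> float:
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues."""
    eigs = np.linalg.eigvalsh(rho)
    eigs = eigs[eigs > 1e-12]                  # drop numerically zero eigenvalues
    return float(-np.sum(eigs * np.log(eigs)) / np.log(base))

def c_rel_entropy(rho: np.ndarray, base: float = 2.0) -> float:
    """C_r(rho) = S(rho_diag) - S(rho), the relative entropy of coherence."""
    rho_diag = np.diag(np.diag(rho))           # fully dephased state
    return von_neumann_entropy(rho_diag, base) - von_neumann_entropy(rho, base)

def c_l1(rho: np.ndarray) -> float:
    """C_l1(rho): sum of the absolute values of the off-diagonal entries."""
    return float(np.sum(np.abs(rho)) - np.sum(np.abs(np.diag(rho))))

# Example: a mixed qubit state with off-diagonal coherence.
rho = np.array([[0.6, 0.2], [0.2, 0.4]])
print(c_rel_entropy(rho))        # ~0.120 (log base 2)
print(c_rel_entropy(rho, np.e))  # ~0.083 (natural log, the "reduced" variant)
print(c_l1(rho))                 # 0.4 -- both variants satisfy C_r <= C_l1 here
```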


2021
Vol 3 (4)
pp. 643-655
Author(s):  
Louis Narens

In 1933, Kolmogorov synthesized the basic concepts of probability that were in general use at the time into concepts and deductions from a simple set of axioms that said probability was a σ-additive function from a Boolean algebra of events into [0, 1]. In 1932, von Neumann realized that the use of probability in quantum mechanics required a different concept, which he formulated as a σ-additive function from the closed subspaces of a Hilbert space onto [0, 1]. In 1935, Birkhoff and von Neumann replaced Hilbert space with an algebraic generalization. Today, a slight modification of the Birkhoff-von Neumann generalization is called "quantum logic". A central problem in the philosophy of probability is the justification of the definition of probability used in a given application. This is usually done by arguing for the rationality of that approach to the situation under consideration. A version of the Dutch book argument given by de Finetti in 1972 is often used to justify the Kolmogorov theory, especially in scientific applications. As von Neumann noted in 1955, and his criticisms still hold, there is no acceptable foundation for quantum logic. While it is not argued here that a rational approach has been carried out for quantum physics, it is argued (1) that for many important situations found in behavioral science, quantum probability theory is a reasonable choice, and (2) that it has an arguably rational foundation in certain areas of behavioral science, for example, the behavioral paradigm of between-subjects experiments.
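The contrast between the two formulations can be illustrated in a few lines (a standard textbook construction, not code from the article): the quantum "probability" Tr(ρP) is additive over orthogonal projectors, von Neumann's closed subspaces, yet it also assigns values to non-commuting subspaces that no single Boolean algebra of events accommodates.

```python
# Quantum probability as a function on closed subspaces (projectors): the
# Born-rule value Tr(rho @ P) is additive for orthogonal projectors, mirroring
# von Neumann's sigma-additive function on the subspaces of a Hilbert space.
import numpy as np

rho = np.array([[0.7, 0.1], [0.1, 0.3]])      # a density matrix (the state)

def prob(projector: np.ndarray) -> float:
    """Born-rule probability of the subspace onto which `projector` projects."""
    return float(np.trace(rho @ projector).real)

P0 = np.array([[1.0, 0.0], [0.0, 0.0]])       # projector onto |0>
P1 = np.array([[0.0, 0.0], [0.0, 1.0]])       # its orthogonal complement
print(prob(P0) + prob(P1))                    # additivity on orthogonal events: 1.0

# A projector onto a non-commuting subspace: |+> = (|0> + |1>)/sqrt(2).
Pplus = 0.5 * np.array([[1.0, 1.0], [1.0, 1.0]])
print(prob(Pplus))                            # 0.6 -- an "event" with no home in
                                              # a single Boolean algebra with P0, P1
```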


Author(s):  
Alexey A. Iakovlev
Ekaterina A. Pchelko-Tolstova
Gennadiy P. Andreev
Modern science is, as Thomas Kuhn pointed out in his fundamental work The Structure of Scientific Revolutions, a science that develops in stages: "scientific revolution" – "normal science" – "scientific revolution". During periods of normal functioning of science, testing follows established paradigms that are accepted without reflection; in the case of technical disciplines, however, this rule does not work, since technical paradigms do not change according to the same rules as other scientific paradigms. For technical paradigms to change, it is not enough that the expert community accepts them; technical consumers must accept them as well. The article discusses the difficulties of defining the concepts of "technology" and "technical knowledge" as knowledge about artifacts, their use, and the consequences of their use (Bernhard Irrgang). Two examples of the control of technical paradigms in two totalitarian regimes of the twentieth century (Lysenkoism, or Lysenkovshchina, and "Aryan physics") illustrate the role of paradigms in a situation of ideological control. In these cases, attempts were made to "correct" genetics and quantum physics (more precisely, to abandon the latter completely), respectively. This control brought biology in the Soviet Union and physics in Nazi Germany to the brink of disaster. In this article, with the help of Gisle Solbu's theory of epi-knowing (knowledge at the general educational level), we propose solutions to the problem of purposeful ideological interference in the adjustment of not only scientific but also technical paradigms.

