What Is Consciousness? Artificial Intelligence, Real Intelligence, Quantum Mind, and Qualia

2021 ◽  
Author(s):  
Stuart Kauffman ◽  
Andrea Roli

We approach the question "What is consciousness?" in a new way: not as Descartes' "systematic doubt", but as the question of how organisms find their way in their world. Finding one's way involves finding possible uses of features of the world that might be beneficial, or avoiding those that might be harmful. "Possible uses of X to accomplish Y" are "affordances". The number of uses of X is indefinite, the different uses are unordered, and they are not deducible from one another. All biological adaptations are affordances, seized either by heritable variation and selection or, far faster, by the organism acting in its world and finding uses of X to accomplish Y. On this basis, we reach rather astonishing conclusions: 1) Strong AI is not possible: universal Turing machines cannot "find" novel affordances. 2) Brain-mind is not purely classical physics, for no classical-physics system can be an analogue computer whose dynamical behavior is isomorphic to "possible uses". 3) Brain-mind must be partly quantum, supported by increasing evidence at 6.0 to 7.3 sigma. 4) Based on Heisenberg's interpretation of the quantum state as "potentia" converted to "actuals" by measurement, a natural hypothesis is that mind actualizes potentia; this is supported at 5.2 sigma. Mind's actualization of entangled brain-mind-world states is then experienced as qualia and allows "seeing" or "perceiving" of uses of X to accomplish Y. We can and do jury-rig; computers cannot. 5) Beyond familiar quantum computers, we consider Trans-Turing Systems.

2018 ◽  
Author(s):  
Rajendra K. Bera

It now appears that quantum computers are poised to enter the world of computing and establish their dominance, especially in the cloud. Turing machines (classical computers), tied to the laws of classical physics, will not vanish from our lives, but they will begin to play a subordinate role to quantum computers, which are tied to the enigmatic laws of quantum physics that deal with such non-intuitive phenomena as superposition, entanglement, collapse of the wave function, and teleportation, all occurring in Hilbert space. The aim of this three-part paper is to introduce readers to a core set of quantum algorithms based on the postulates of quantum mechanics, and to reveal the amazing power of quantum computing.
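(The following is not part of Bera's paper; it is a minimal NumPy sketch, with illustrative names and a fixed random seed, of the postulates the abstract names: state vectors in Hilbert space, unitary evolution, superposition, entanglement, and measurement via the Born rule.)

```python
import numpy as np

# Computational basis states of one qubit in a 2-dimensional Hilbert space.
ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)

# Hadamard gate: a unitary that puts |0> into an equal superposition.
H = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ ket0                         # unitary evolution: (|0> + |1>)/sqrt(2)
probs = np.abs(psi) ** 2               # Born rule: outcome probabilities [0.5, 0.5]

# "Collapse": sampling a measurement outcome according to those probabilities.
rng = np.random.default_rng(seed=0)
outcome = rng.choice([0, 1], p=probs)

# Entanglement: a two-qubit Bell state in the 4-dimensional tensor-product space.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)

print("superposed amplitudes:", psi)
print("measurement probabilities:", probs)
print("sampled outcome:", outcome)
print("Bell state amplitudes:", bell)
```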


Author(s):  
Maysoon M. Aziz et al.

In this paper, we use the differential equations of the SIR model as a non-linear system, applying the Runge-Kutta numerical method to compute simulated time series for known epidemiological diseases, including COVID-19. The aim is to obtain hypothetical results, compare them with the daily real-world statistics of the disease reported by countries around the world, and characterize the behavior of the disease through mathematical analysis, in terms of both stability and chaos, using several applied methods. The simulated data were produced with Matlab programs, and the comparison between real and simulated data showed good agreement and a close degree of fit. We took the data for Italy as an application. The results show that the disease is unstable, dissipative and chaotic, with Kcorr equal to 0.9621. The power spectrum was also used as an indicator to confirm the chaotic character of the disease. These results indicate that it is a spreading, outbreak-prone, chaotic epidemic disease.
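(The abstract names the SIR differential equations and the Runge-Kutta method but does not reproduce its equations, parameter values, or Matlab code; the sketch below is a Python stand-in, with assumed transmission rate beta, recovery rate gamma, and initial conditions, showing how the SIR system can be integrated with a classical fourth-order Runge-Kutta step.)

```python
import numpy as np

def sir_rhs(t, y, beta, gamma):
    """SIR right-hand side: returns (dS/dt, dI/dt, dR/dt)."""
    S, I, R = y
    N = S + I + R
    dS = -beta * S * I / N
    dI = beta * S * I / N - gamma * I
    dR = gamma * I
    return np.array([dS, dI, dR])

def rk4_step(f, t, y, h, *args):
    """One classical fourth-order Runge-Kutta step of size h."""
    k1 = f(t, y, *args)
    k2 = f(t + h / 2, y + h / 2 * k1, *args)
    k3 = f(t + h / 2, y + h / 2 * k2, *args)
    k4 = f(t + h, y + h * k3, *args)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

# Assumed (illustrative) parameters and Italy-scale initial conditions,
# not the values used in the paper.
beta, gamma = 0.30, 0.10                     # per-day rates
y = np.array([60_000_000.0, 100.0, 0.0])     # S, I, R at t = 0
h, days = 1.0, 180

trajectory = [y.copy()]
for day in range(days):
    y = rk4_step(sir_rhs, day * h, y, h, beta, gamma)
    trajectory.append(y.copy())

trajectory = np.array(trajectory)
print("simulated peak of infections:", trajectory[:, 1].max())
```

The simulated infection curve can then be compared point by point against a reported daily case series, as the paper does for Italy.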


Author(s):  
Jeremy Butterfield

Over the centuries, the doctrine of determinism has been understood, and assessed, in different ways. Since the seventeenth century, it has been commonly understood as the doctrine that every event has a cause; or as the predictability, in principle, of the entire future. To assess the truth of determinism, so understood, philosophers have often looked to physical science; they have assumed that their current best physical theory is their best guide to the truth of determinism. It seems that most have believed that classical physics, especially Newton’s physics, is deterministic. And in this century, most have believed that quantum theory is indeterministic. Since quantum theory has superseded classical physics, philosophers have typically come to the tentative conclusion that determinism is false. In fact, these impressions are badly misleading. The above formulations of determinism are unsatisfactory. Once we use a better formulation, we see that there is a large gap between the determinism of a given physical theory, and the bolder, vague idea that motivated the traditional formulations: the idea that the world in itself is deterministic. Admittedly, one can make sense of this idea by adopting a sufficiently bold metaphysics; but it cannot be made sense of just by considering determinism for physical theories. As regards physical theories, the traditional impression is again misleading. Which theories are deterministic turns out to be a subtle and complicated matter, with many open questions. But broadly speaking, it turns out that much of classical physics, even much of Newton’s physics, is indeterministic. Furthermore, the alleged indeterminism of quantum theory is very controversial: it enters, if at all, only in quantum theory’s account of measurement processes, an account which remains the most controversial part of the theory.


2003 ◽  
Vol 14 (05) ◽  
pp. 853-870 ◽  
Author(s):  
HARUMICHI NISHIMURA

In this paper, we explore the power of quantum computers with restricted transition amplitudes. In 1997, Adleman, DeMarrais, and Huang showed that quantum Turing machines (QTMs) with amplitudes drawn from [Formula: see text] are computationally equivalent, as machines implementing bounded-error polynomial-time algorithms, to QTMs with polynomial-time computable amplitudes. We show that QTMs with amplitudes drawn from [Formula: see text] are polynomial-time equivalent to deterministic Turing machines as machines implementing exact algorithms, i.e., algorithms that output correct answers with certainty. By extending this result, we show that exact quantum computers with rational biased coins are equivalent to classical computers. Moreover, we discuss the computational power of exact quantum computers with multiple types of coins. We also show that, from the viewpoint of zero-error polynomial-time algorithms, [Formula: see text] is no more powerful than [Formula: see text] as the set of amplitudes taken by QTMs; it is, however, sufficient to solve the factoring problem.
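(Not taken from Nishimura's paper: a small Python illustration of the flavor of a "rational biased coin". The 2x2 unitary below, with rational entries 3/5 and 4/5, is an assumed example; the point is that its action on a state can be tracked with exact rational arithmetic, so outcome probabilities can in principle be analyzed exactly, which is the setting of exact and zero-error quantum computation with restricted amplitudes.)

```python
from fractions import Fraction

# A "rational biased quantum coin": a 2x2 unitary with rational entries.
# Its columns are orthonormal: (3/5)^2 + (4/5)^2 = 1, and they are orthogonal.
U = [[Fraction(3, 5), Fraction(-4, 5)],
     [Fraction(4, 5), Fraction(3, 5)]]

def apply(M, state):
    """Apply a 2x2 matrix to a length-2 amplitude vector, exactly."""
    return [M[0][0] * state[0] + M[0][1] * state[1],
            M[1][0] * state[0] + M[1][1] * state[1]]

# Start in basis state |0> and flip the coin twice.
state = [Fraction(1), Fraction(0)]
for _ in range(2):
    state = apply(U, state)

# Born-rule probabilities are exact rationals: no floating-point error ever enters.
probs = [a * a for a in state]
print("amplitudes:", state)        # [Fraction(-7, 25), Fraction(24, 25)]
print("probabilities:", probs)     # [Fraction(49, 625), Fraction(576, 625)]
assert sum(probs) == 1
```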


2021 ◽  
Vol 7 (1) ◽  
pp. 1-9
Author(s):  
Zion Elani

Quantum computing, a fancy word resting on equally fancy fundamentals in quantum mechanics, has become a media hype, a mainstream topic in popular culture, and eye candy for high-tech company researchers and investors alike. Quantum computing has the power to provide faster, more efficient, secure and accurate computing solutions for emerging future innovations. Governments the world over, in collaboration with high-tech companies, pour billions of dollars into the advancement of quantum-based computing solutions and the development of fully functioning quantum computers that may one day aid, or even replace, classical computers. Despite much hype and publicity, most people do not understand what quantum computing is, nor do they comprehend the significance of the developments required in this field, or the impact it may have on the future. Through these lecture notes, we embark on a pedagogic journey toward understanding quantum computing, gradually revealing the concepts that form its basis, then diving into the vast pool of future possibilities that lie ahead, and concluding by acknowledging some of the major hindrances and speed bumps in its path.


Author(s):  
Jeremy Butterfield

Over the centuries, the doctrine of determinism has been understood, and assessed, in different ways. Since the seventeenth century, it has been commonly understood as the doctrine that every event has a cause; or as the predictability, in principle, of the entire future. To assess the truth of determinism, so understood, philosophers have often looked to physical science; they have assumed that their current best physical theory is their best guide to the truth of determinism. Most have believed that classical physics, especially Newton’s physics, is deterministic. And in this century, most have believed that quantum theory is indeterministic. Since quantum theory has superseded classical physics, philosophers have typically come to the tentative conclusion that determinism is false. In fact, these impressions are badly misleading, on three counts. First of all, formulations of determinism in terms of causation or predictability are unsatisfactory, since ‘event’, ‘causation’ and ‘prediction’ are vague and controversial notions, and are not used (at least not univocally) in most physical theories. So if we propose to assess determinism by considering physical theories, our formulation of determinism should be more closely tied to such theories. To do this, the key idea is that determinism is a property of a theory. Imagine a theory that ascribes properties to objects of a certain kind, and claims that the sequence through time of any such object’s properties satisfies certain regularities. Then we say that the theory is deterministic if and only if for any two such objects: if their properties match exactly at a given time, then according to the theory, they will match exactly at all future times. Second, this improved formulation reveals that there is a large gap between the determinism of a given physical theory, and the bolder, vague idea that motivated the traditional formulations: the idea that the world as a whole, independent of any single theory, is deterministic. Admittedly, one can make sense of this idea by adopting a sufficiently bold metaphysics: namely, a metaphysics that accepts the idea of a theory of the world as a whole, so that its objects are possible worlds, and determinism becomes the requirement that any two possible worlds described by the theory that match exactly at a given time also match exactly at all future times. But this idea cannot be made sense of using the more cautious strategy of considering determinism as a feature of a given physical theory. Third, according to this more cautious strategy, the traditional consensus is again misleading. Which theories are deterministic turns out to be a subtle and complicated matter, with many questions still open. But broadly speaking, it turns out that much of classical physics, even much of Newton’s physics, is indeterministic. Furthermore, the alleged indeterminism of quantum theory is very controversial: it enters, if at all, only in quantum theory’s account of measurement processes, an account which remains the most controversial part of the theory. These subtleties and controversies mean that physics does not pass to philosophers any simple verdict about determinism. But more positively, they also mean that determinism remains an exciting topic in the philosophy of science.
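(A compact restatement, in standard notation of my own choosing rather than Butterfield's symbols: treating a theory T's models as state-histories s that assign a state s(t) to each time t, the definition above says that T is deterministic just in case the following holds.)

```latex
\[
  \forall\, s_1, s_2 \in \mathrm{Hist}(T),\ \forall\, t_0 :\quad
  s_1(t_0) = s_2(t_0) \;\Longrightarrow\; s_1(t) = s_2(t) \ \text{ for all } t \geq t_0 .
\]
```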


Author(s):  
Jill North

This chapter argues against formal accounts of theoretical equivalence in physics. It defends the importance of a theory’s picture of the world and its explanations of the phenomena, drawing on examples from classical physics, Newtonian gravitation, classical electromagnetism, special relativity, and quantum mechanics. The discussion draws a distinction between metaphysical equivalence and informational equivalence and argues that these are equally important to the equivalence of physical theories. The chapter concludes that there are fewer cases of wholly equivalent theories in physics than usually thought. However, this is not a problem, for it is still possible to talk about the various respects in which physical theories are, or are not, equivalent to one another.

