Turing Machine-Inspired Computer Science Results

Author(s):  
Juris Hartmanis


Author(s):  
Arlindo Oliveira

This chapter covers the development of computing, from its origins with the Analytical Engine to modern computer science. Babbage's and Ada Lovelace's contributions to the science of computing led, in time, to the idea of universal computers, proposed by Alan Turing. These universal computers are conceptual devices that can compute anything that can possibly be computed. The basic concepts created by Turing and Church were further developed into the edifice of modern computer science, in particular the concepts of algorithms, computability, and complexity covered in this chapter. The chapter ends by describing the Church-Turing thesis, which states that anything that can be computed can be computed by a Turing machine.


1987 ◽  
Vol 52 (1) ◽  
pp. 1-43 ◽  
Author(s):  
Larry Stockmeyer

One of the more significant achievements of twentieth century mathematics, especially from the viewpoints of logic and computer science, was the work of Church, Gödel and Turing in the 1930's which provided a precise and robust definition of what it means for a problem to be computationally solvable, or decidable, and which showed that there are undecidable problems which arise naturally in logic and computer science. Indeed, when one is faced with a new computational problem, one of the first questions to be answered is whether the problem is decidable or undecidable. A problem is usually defined to be decidable if and only if it can be solved by some Turing machine, and the class of decidable problems defined in this way remains unchanged if “Turing machine” is replaced by any of a variety of other formal models of computation. The division of all problems into two classes, decidable or undecidable, is very coarse, and refinements have been made on both sides of the boundary. On the undecidable side, work in recursive function theory, using tools such as effective reducibility, has exposed much additional structure such as degrees of unsolvability. The main purpose of this survey article is to describe a branch of computational complexity theory which attempts to expose more structure within the decidable side of the boundary.

Motivated in part by practical considerations, the additional structure is obtained by placing upper bounds on the amounts of computational resources which are needed to solve the problem. Two common measures of the computational resources used by an algorithm are time, the number of steps executed by the algorithm, and space, the amount of memory used by the algorithm.
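To make the two resource measures concrete, here is a minimal Python sketch (our illustration, not part of the surveyed article; the example machine, state names, and function names are assumptions) of a deterministic single-tape Turing machine simulator that reports time as the number of steps executed and space as the number of tape cells visited.

```python
# Minimal sketch: simulate a deterministic single-tape Turing machine and
# record time (steps executed) and space (tape cells visited).
# delta maps (state, symbol) -> (new_state, symbol_to_write, head_move).
# This toy machine accepts binary strings containing an even number of 1s.

BLANK = "_"

delta = {
    ("even", "0"): ("even", "0", +1),
    ("even", "1"): ("odd",  "1", +1),
    ("odd",  "0"): ("odd",  "0", +1),
    ("odd",  "1"): ("even", "1", +1),
    ("even", BLANK): ("accept", BLANK, 0),
    ("odd",  BLANK): ("reject", BLANK, 0),
}

def run(tm, word, start="even", max_steps=10_000):
    tape = dict(enumerate(word))          # sparse tape: position -> symbol
    state, head = start, 0
    steps, visited = 0, {0}               # time and space counters
    while state not in ("accept", "reject") and steps < max_steps:
        symbol = tape.get(head, BLANK)
        state, write, move = tm[(state, symbol)]
        tape[head] = write
        head += move
        visited.add(head)
        steps += 1
    return state == "accept", steps, len(visited)

if __name__ == "__main__":
    for w in ["110", "1101"]:
        accepted, time_used, space_used = run(delta, w)
        print(w, accepted, "time:", time_used, "space:", space_used)
```

On this toy parity check the simulator uses time and space linear in the input length, which is the kind of upper bound the survey uses to classify decidable problems.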


Author(s):  
Manuel Blum ◽  
Lenore Blum

The quest to understand consciousness, once the purview of philosophers and theologians, is now actively pursued by scientists of many stripes. This paper studies consciousness from the perspective of theoretical computer science. It formalizes the Global Workspace Theory (GWT) originated by the cognitive neuroscientist Bernard Baars and further developed by him, Stanislas Dehaene, and others. Our major contribution lies in the precise formal definition of a Conscious Turing Machine (CTM), also called a Conscious AI. We define the CTM in the spirit of Alan Turing’s simple yet powerful definition of a computer, the Turing Machine (TM). We are not looking for a complex model of the brain or of cognition, but for a simple model of (the admittedly complex concept of) consciousness. After formally defining the CTM, we give a formal definition of consciousness in the CTM. We later suggest why the CTM has the feeling of consciousness. The reasonableness of the definitions and explanations can be judged by how well they agree with commonly accepted intuitive concepts of human consciousness, the range of related concepts that the model explains easily and naturally, and the extent of its agreement with scientific evidence.


Author(s):  
Subrata Dasgupta

The modern computer is a hierarchically organized system of computational artefacts. Inventing, understanding, and applying rules and principles of hierarchy is a subdiscipline of computer science. ‘Computational artefacts’ explains the concepts of compositional hierarchy, the abstraction/refinement principle, and hierarchy by construction. There are three classes of computational artefacts—abstract, material, and liminal. An important example of an abstract artefact is the Turing machine. Sciences involving artefacts are sciences of the artificial, entailing the study of the relationship between means and ends. The ‘science’ in computer science is, thus, a science of means and ends. It asks: how can a computational artefact demonstrably achieve a given human need, goal, or purpose?


Author(s):  
Robin Whitty

In 1936 Turing invented a mathematical model of computation, known today as the Turing machine. He intended it as a representation of human computation and in particular as a vehicle for refuting a central part of David Hilbert’s early 20th-century programme to mechanize mathematics. By a nice irony it came to define what is achievable by non-human computers and has become deeply embedded in modern computer science. A simple example is enough to convey the essentials of a Turing machine. We then describe the background to Hilbert’s programme and Turing’s challenge—and explain how Turing’s response to Hilbert resolves a host of related problems in mathematics and logic. If I had to portray, in less than 30 seconds, what Alan Turing achieved in 1936 it seems to me that drawing the picture shown in Fig. 37.1 would be a reasonable thing to do. That this might be so is a testament to the quite extraordinary merging of the concrete and the abstract in Turing’s 1936 paper on computability. It is regarded by, I suppose, a large majority of mathematical scientists as his greatest work. The details of our picture are not especially important. As it happens, it is a machine for deciding which whole numbers, written in binary form, are multiples of 3. It works thus: suppose the number is 105, whose binary representation is 1101001, because (1 × 2⁶) + (1 × 2⁵) + (0 × 2⁴) + (1 × 2³) + (0 × 2²) + (0 × 2¹) + (1 × 2⁰) = 64 + 32 + 8 + 1 = 105. We start at the node labelled A and use the binary digits to drive us from node to node. The first couple of 1s take us to node B and back to A again. The third digit, 0, loops us around at A. Now a 1 and a 0 take us across to node C; and the final 0 and 1 take us back via B to A once more.
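As an illustration only (the identifiers below are ours, not from the chapter), the three-node machine just described can be written as a small Python transition table: each node tracks the remainder modulo 3 of the binary prefix read so far, and reading digit d moves from remainder r to (2r + d) mod 3; the input is a multiple of 3 exactly when the walk ends back at node A.

```python
# A minimal sketch of the three-node machine described above.
# Nodes A, B, C correspond to remainders 0, 1, 2 modulo 3.
TRANSITIONS = {
    ("A", "0"): "A", ("A", "1"): "B",
    ("B", "0"): "C", ("B", "1"): "A",
    ("C", "0"): "B", ("C", "1"): "C",
}

def divisible_by_3(binary: str) -> bool:
    node = "A"                            # start at node A (remainder 0)
    for digit in binary:
        node = TRANSITIONS[(node, digit)]
    return node == "A"                    # back at A: remainder 0

if __name__ == "__main__":
    print(divisible_by_3("1101001"))      # 105 = 3 * 35 -> True
    print(divisible_by_3("1101010"))      # 106 -> False
```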


2015 ◽  
Author(s):  
Yubin Huang

Background. P and NP are two classes (sets) of languages in computer science. An open problem is whether P = NP. This paper tests a new idea for comparing the two language sets and attempts to prove that they consist of the same languages, using elementary mathematical methods and basic knowledge of Turing machines. Methods. We introduce a filter function C(M,w): the number of configurations that have more than one child (nondeterministic moves) in the shortest accepting computation path of a nondeterministic Turing machine M on input w. For any language L(M) ∈ NP, we can then define a series of its subsets, Li(M) = {w | w ∈ L(M) ∧ C(M,w) ≤ i}, and a series of subsets of NP, Li = {Li(M) | ∀M ∙ L(M) ∈ NP}. A nondeterministic multi-tape Turing machine is used to bridge the two language sets Li and Li+1, by simulating the (i+1)-th nondeterministic move deterministically on multiple work tapes, thereby removing one (the last) nondeterministic move. Results. The main result is that, with the above methods, the language set Li+1, which seems more powerful, can be proved to be a subset of Li. This result collapses the chain of inclusions, giving Li ⊆ P for all i ∈ ℕ. With NP = ⋃i∈ℕ Li, it follows that NP ⊆ P. Because P ⊆ NP by definition, we have P = NP. Discussion. There may be other ways to define the subsets Li and prove the same result. The result can be extended to cover any set of time functions C: if ∀f ∙ f ∈ C ⇒ f² ∈ C, then DTIME(C) = NTIME(C). This paper does not show any way to find a polynomial-time solution for a problem known to be in NP.
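Purely to illustrate the filter function C(M,w), and not the paper's proof, the following Python sketch counts the configurations with more than one successor along a shortest accepting computation path of a toy nondeterministic machine; the machine encoding and all helper names are assumptions made for this example.

```python
# Illustrative sketch only: compute C(M, w), the number of configurations with
# more than one successor (nondeterministic moves) on a shortest accepting
# computation path of a toy nondeterministic Turing machine.
from collections import deque

BLANK = "_"

def successors(delta, config):
    state, head, tape = config
    symbol = tape[head] if 0 <= head < len(tape) else BLANK
    result = []
    for new_state, write, move in delta.get((state, symbol), []):
        new_tape = list(tape)
        if 0 <= head < len(new_tape):
            new_tape[head] = write
        result.append((new_state, max(head + move, 0), tuple(new_tape)))
    return result

def filter_C(delta, word, start, accept):
    """Breadth-first search over configurations; count branch points on a
    shortest accepting path, or return None if the word is not accepted."""
    start_config = (start, 0, tuple(word) or (BLANK,))
    parent = {start_config: None}
    queue = deque([start_config])
    while queue:
        config = queue.popleft()
        if config[0] == accept:
            count, node = 0, config
            while parent[node] is not None:        # walk back along the path
                node = parent[node]
                if len(successors(delta, node)) > 1:
                    count += 1
            return count
        for nxt in successors(delta, config):
            if nxt not in parent:
                parent[nxt] = config
                queue.append(nxt)
    return None

# Toy NTM: scans right and may nondeterministically accept upon reading a '1'.
delta = {
    ("q0", "0"): [("q0", "0", +1)],
    ("q0", "1"): [("q0", "1", +1), ("qa", "1", 0)],   # nondeterministic choice
}
print(filter_C(delta, "0010", "q0", "qa"))            # one branch point -> 1
```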


2019 ◽  
Vol 44 (1) ◽  
pp. 27-43 ◽  
Author(s):  
Paweł Stacewicz

In this article I defend the thesis that modern computer science has a significant philosophical potential, which is expressed in the form of a worldview, called here the informational worldview (IWV). It includes theses such as: a) each being contains a certain informational content (which may be revealed by computer science concepts, such as code or algorithm), b) the mind is an information processing system (which should be modeled by means of data processing systems), c) cognition is a type of computation. These (pre)philosophical theses are accepted in many sciences (e.g. in cognitive science), and this is both an expression and a strengthening of the IWV. After a general discussion of the relations between philosophy, the particular sciences, and worldview, and a presentation of the basic assumptions and theses of the IWV, I analyze a certain specification of thesis b), expressed in the statement that “the mind is the Turing machine”. I distinguish three concepts of mind (static, variable, and minimal) and explain how each of them is connected with the concept of the Turing machine.

