Probabilistic Recursion Theory and Implicit Computational Complexity

2014 · Vol 24 (2) · pp. 177-216
Author(s): Ugo Dal Lago, Sara Zuppiroli, Maurizio Gabbrielli

- J. C. Shepherdson. Algorithmic procedures, generalized Turing algorithms, and elementary recursion theory. Harvey Friedman's research on the foundations of mathematics, edited by L. A. Harrington, M. D. Morley, A. Ščedrov, and S. G. Simpson, Studies in logic and the foundations of mathematics, vol. 117, North-Holland, Amsterdam, New York, and Oxford, 1985, pp. 285–308.
- J. C. Shepherdson. Computational complexity of real functions. Harvey Friedman's research on the foundations of mathematics, edited by L. A. Harrington, M. D. Morley, A. Ščedrov, and S. G. Simpson, Studies in logic and the foundations of mathematics, vol. 117, North-Holland, Amsterdam, New York, and Oxford, 1985, pp. 309–315.
- A. J. Kfoury. The pebble game and logics of programs. Harvey Friedman's research on the foundations of mathematics, edited by L. A. Harrington, M. D. Morley, A. Ščedrov, and S. G. Simpson, Studies in logic and the foundations of mathematics, vol. 117, North-Holland, Amsterdam, New York, and Oxford, 1985, pp. 317–329.
- R. Statman. Equality between functionals revisited. Harvey Friedman's research on the foundations of mathematics, edited by L. A. Harrington, M. D. Morley, A. Ščedrov, and S. G. Simpson, Studies in logic and the foundations of mathematics, vol. 117, North-Holland, Amsterdam, New York, and Oxford, 1985, pp. 331–338.
- Robert E. Byerly. Mathematical aspects of recursive function theory. Harvey Friedman's research on the foundations of mathematics, edited by L. A. Harrington, M. D. Morley, A. Ščedrov, and S. G. Simpson, Studies in logic and the foundations of mathematics, vol. 117, North-Holland, Amsterdam, New York, and Oxford, 1985, pp. 339–352.

1990 · Vol 55 (2) · pp. 876-878
Author(s): J. V. Tucker

Author(s): Harold Hodes

A reducibility is a relation of comparative computational complexity (which can be made precise in various non-equivalent ways) between mathematical objects of appropriate sorts. Much of recursion theory concerns such relations, initially between sets of natural numbers (in so-called classical recursion theory), but later between sets of other sorts (in so-called generalized recursion theory). This article considers only the classical setting. Turing first defined such a relation, now called Turing- (or just T-) reducibility; probably most logicians regard it as the most important such relation. Turing- (or T-) degrees are the units of computational complexity when comparative complexity is taken to be T-reducibility.
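For orientation, and not as part of the abstract above, the standard definitions behind these notions can be sketched in the usual oracle-machine formulation:

\[
A \le_T B \iff \text{the characteristic function of } A \text{ is computable by an oracle Turing machine with oracle } B,
\]
\[
A \equiv_T B \iff A \le_T B \ \text{and}\ B \le_T A, \qquad \deg_T(A) = \{\, X \subseteq \mathbb{N} : X \equiv_T A \,\}.
\]

The Turing degrees are the equivalence classes \(\deg_T(A)\), partially ordered by \(\le_T\); a degree is "higher" when its members are harder to compute relative to the sets in lower degrees.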


DOI: 10.29007/t77g · 2018
Author(s): Daniel Leivant

We use notions originating in Computational Complexity to provide insight into the analogies between computational complexity and Higher Recursion Theory. We consider alternating Turing machines, but with a modified, global, definition of acceptance. We show that a language is accepted by such a machine iff it is Pi-1-1. Moreover, total alternating machines, which either accept or reject each input, accept precisely the hyperarithmetical (Delta-1-1) languages. Also, by bounding the permissible number of alternations, we obtain a characterization of the levels of the arithmetical hierarchy. The novelty of these characterizations lies primarily in the use of finite computing devices, with finitary, discrete computation steps. We thereby elucidate the correspondence between the polynomial-time and the arithmetical hierarchies, as well as that between the computably enumerable, the inductive (Pi-1-1), and the PSpace languages.
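As a point of reference (a standard fact, not a claim specific to this abstract), Pi-1-1 sets admit a Kleene normal form with a single universal function quantifier over a recursive matrix, which is the quantifier shape that a global notion of acceptance can mirror:

\[
x \in A \iff \forall f \in \mathbb{N}^{\mathbb{N}}\ \exists n\ R\bigl(x, \langle f(0), \dots, f(n-1) \rangle\bigr),
\]

with \(R\) recursive; equivalently, \(x \in A\) iff an associated recursive tree of finite sequences is well-founded.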


Author(s): Harold Hodes

In mathematics, a hierarchy is a 'bottom up' system classifying entities of some particular sort, a system defined inductively, starting with a 'basic' class of such entities, with further ('higher') classes of such entities defined in terms of previously defined ('lower') classes. Such a classification reflects complexity in some respect, one entity being less complex than another if it appears 'earlier' ('lower') than that other. Many of the hierarchies studied by logicians construe complexity as complexity of definition, placing such hierarchies within the purview of model theory; but even such notions of complexity are closely tied to species of computational complexity, placing them also in the purview of recursion theory.
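A standard concrete instance of such a bottom-up classification (added here for illustration; it is not part of the abstract) is the arithmetical hierarchy, defined inductively from the computably enumerable sets:

\[
\Sigma^0_1 = \text{the computably enumerable sets}, \qquad
\Pi^0_n = \{\, \mathbb{N} \setminus A : A \in \Sigma^0_n \,\},
\]
\[
\Sigma^0_{n+1} = \{\, \{\, x : \exists y\, (x,y) \in B \,\} : B \in \Pi^0_n \,\}, \qquad
\Delta^0_n = \Sigma^0_n \cap \Pi^0_n .
\]

A set first appearing at level n+1 is more complex, in the definitional sense of the abstract, than any set appearing at level n.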


2001 · Vol 11 (1) · pp. 1-1
Author(s): Daniel Leivant, Bob Constable

This issue of the Journal of Functional Programming is dedicated to work presented at the Workshop on Implicit Computational Complexity in Programming Languages, affiliated with the 1998 meeting of the International Conference on Functional Programming in Baltimore.

Several machine-independent approaches to computational complexity have been developed in recent years; they establish a correspondence linking computational complexity to conceptual and structural measures of complexity of declarative programs and of formulas, proofs and models of formal theories. Examples include descriptive complexity of finite models, restrictions on induction in arithmetic and related first order theories, complexity of set-existence principles in higher order logic, and specifications in linear logic. We refer to these approaches collectively as Implicit Computational Complexity. This line of research provides a framework for a streamlined incorporation of computational complexity into areas such as formal methods in software development, programming language theory, and database theory.

A fruitful thread in implicit computational complexity is based on exploring the computational complexity consequences of introducing various syntactic control mechanisms in functional programming, including restrictions (akin to static typing) on scoping, data re-use (via linear modalities), and iteration (via ramification of data). These forms of control, separately and in combination, can certify bounds on the time and space resources used by programs. In fact, all results in this area establish that each restriction considered yields precisely a major computational complexity class. The complexity classes thus obtained range from very restricted ones, such as NC and Alternating logarithmic time, through the central classes Poly-Time and Poly-Space, to broad classes such as the Elementary and the Primitive Recursive functions.

Considerable effort has been invested in recent years to relax as much as possible the structural restrictions considered, allowing for more flexible programming and proof styles, while still guaranteeing the same resource bounds. Notably, more flexible control forms have been developed for certifying that functional programs execute in Poly-Time.

The 1998 workshop covered both the theoretical foundations of the field and steps toward using its results in various implemented systems, for example in controlling the computational complexity of programs extracted from constructive proofs. The five papers included in this issue nicely represent this dual concern of theory and practice. As they are going to print, we should note that the field of Implicit Computational Complexity continues to thrive: successful workshops dedicated to it were affiliated with both the LICS'99 and LICS'00 conferences. Special issues, of Information and Computation dedicated to the former, and of Theoretical Computer Science to the latter, are in preparation.
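One well-known control mechanism of the kind described above is the ramification of data into "normal" and "safe" tiers in the style of Bellantoni and Cook. The following Haskell sketch is illustrative only, not taken from the papers in this issue, and enforces the tiering purely by convention in the comments; it shows the shape of a predicative recursion that such disciplines certify as Poly-Time.

-- A minimal sketch of the safe/normal (ramified) recursion discipline:
-- each function separates its arguments into normal ones, which may drive
-- recursion, and safe ones, which may only be extended by a constant
-- amount per step.  Tiering is by convention here, not by the type system.

module SafeRecursionSketch where

-- Natural numbers in binary notation, least significant bit first.
data Bin = End | Zero Bin | One Bin

-- Ramified recursion: the recursion is driven by the normal argument x,
-- while the safe argument y is only passed along and extended by one
-- constructor per step.  The output length is |x| + |y|, so growth stays
-- polynomial under composition.
append :: Bin  -- normal argument: drives the recursion
       -> Bin  -- safe argument: never recursed on
       -> Bin
append End       y = y
append (Zero x') y = Zero (append x' y)
append (One  x') y = One  (append x' y)

-- Exponential growth would require recursing on a value that was itself
-- produced by a recursion (a safe value), which the discipline forbids;
-- informally, this is how ramification certifies a Poly-Time bound.

Systems that enforce such tiering through types or modalities, rather than by convention, are among the control mechanisms the editorial refers to.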

