Consistency proofs
Recently Published Documents

Total documents: 57 (five years: 3)
H-index: 10 (five years: 0)

2020, Vol. 10 (1), pp. 21-27
Author(s): Qiqing Yu

Objective: We studied the consistency of the semi-parametric maximum likelihood estimator (SMLE) under the Cox regression model with right-censored (RC) data. Methods: Consistency proofs of the MLE are often based on the Shannon-Kolmogorov inequality, which requires E(ln L) to be finite, where L is the likelihood function. Results: We establish the consistency of the SMLE under the Cox model with RC data. Conclusion: Under the Cox model with RC data, E(ln L) may not exist, so the Shannon-Kolmogorov argument does not apply; our proof uses the Kullback-Leibler information inequality instead.
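To make the contrast concrete (our notation, not the paper's): write f_0 for the true density of an observation X and f_θ for a candidate density under the model. The Shannon-Kolmogorov inequality compares E[ln f_θ(X)] with E[ln f_0(X)] and therefore needs these expectations to be finite, whereas the Kullback-Leibler information inequality bounds the expectation of the log-ratio, which is always well defined in [0, ∞]. A minimal LaTeX sketch of the latter:

% Kullback-Leibler information inequality (standard form; f_0 and f_theta are
% assumed notation, not taken from the paper). It follows from Jensen's
% inequality: E_{f_0}[ln(f_theta/f_0)] <= ln E_{f_0}[f_theta/f_0] <= ln 1 = 0.
\[
  D\bigl(f_0 \,\|\, f_\theta\bigr)
  = \mathrm{E}_{f_0}\!\left[\ln \frac{f_0(X)}{f_\theta(X)}\right]
  \;\ge\; 0,
  \qquad \text{with equality iff } f_\theta = f_0 \ \text{a.e.}
\]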


Author(s): Michael Hallett

In §§138–47 of the Basic Laws of Arithmetic, Frege attacks the notion that mathematical objects can be ‘created’, criticising Stolz, Hankel, and Dedekind directly, and Cantor and Hilbert indirectly. This paper tries to assess exactly what Frege’s criticism targets, concentrating particularly on Frege’s opposition to Dedekind and Hilbert. Frege’s ostensible target is arbitrariness, and the need for consistency proofs and the method of achieving them. However, the analysis here argues that the real target is the hidden existential assumptions that are relied on, as well as the attempt to avoid what Frege would consider proper definitions. This is then compared to Frege’s description of his own procedure. In the light of this, the paper concludes that Frege’s criticism is unfair, that he attacks these other mathematicians for not doing what he himself is unable to do. At the end, attention is also drawn to another attack on the idea that we create mathematics, that of Gödel. Gödel’s core concerns and core arguments differ from Frege’s, many of them being rooted in the various incompleteness phenomena discovered long after Frege’s work. Nevertheless, there are parallels, which it would be instructive to pursue.


Author(s): Volker Peckhaus

The German mathematician and logician Gerhard Gentzen devoted his life to proving the consistency of arithmetic and analysis. His work should be seen as contributing to the post-Gödelian development of Hilbert’s programme. In this connection he developed several logical calculi. The main device used in his proofs was a theorem establishing the eliminability of the inference known as ‘cut’ from a wide variety of proofs. This ‘cut-elimination theorem’ yields the consistency of both classical and intuitionistic logic, and the consistency of arithmetic without complete induction. His later work was aimed at providing consistency proofs for less restricted systems of arithmetic and analysis.
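For concreteness, here is a standard two-sided sequent-calculus rendering of the cut rule (a textbook formulation; Gentzen's original notation differs in detail). The cut-elimination theorem states that any derivation using this rule can be transformed into a cut-free derivation of the same sequent, and consistency follows because the empty sequent admits no cut-free derivation.

% The cut rule: Gamma, Sigma, Delta, Pi are finite lists of formulas and A is
% the cut formula, which disappears from the conclusion.
\[
  \frac{\Gamma \vdash \Delta, A \qquad\quad A, \Sigma \vdash \Pi}
       {\Gamma, \Sigma \vdash \Delta, \Pi}
  \;(\mathrm{cut})
\]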


2016, Vol. 9 (3), pp. 429-455
Author(s): Luca Bellotti

Abstract: We consider the consistency proof for a weak fragment of arithmetic published by von Neumann in 1927. This proof is rather neglected in the literature on the history of consistency proofs in the Hilbert school. We explain von Neumann’s proof and argue that it fills a gap between Hilbert’s consistency proofs for the so-called elementary calculus of free variables with a successor and a predecessor function and Ackermann’s consistency proof for second-order primitive recursive arithmetic. In particular, von Neumann’s proof is the first rigorous proof of the consistency of an axiomatization of the first-order theory of a successor function.
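By way of illustration only (a common textbook axiomatization, not necessarily the exact system von Neumann treated), a first-order theory of a successor function S with a constant 0 can be axiomatized by statements such as:

% Illustrative axioms for a successor function: 0 is not a successor,
% successor is injective, and every nonzero element has a predecessor.
% Details of von Neumann's own axiomatization may differ.
\[
  \forall x\,\bigl(S(x) \neq 0\bigr), \qquad
  \forall x\,\forall y\,\bigl(S(x) = S(y) \rightarrow x = y\bigr), \qquad
  \forall x\,\bigl(x \neq 0 \rightarrow \exists y\,(x = S(y))\bigr)
\]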

