Realists Without a Cause: Deflationary Theories of Truth and Ethical Realism

1996, Vol 26 (4), pp. 561-589
Author(s):  
Sergio Tenenbaum

In ‘The Status of Content,’ Paul Boghossian points out an embarrassment in which A.J. Ayer finds himself, given his extensive irrealism. Ayer embraces both an emotivist theory of ethics and a deflationary theory of truth. According to an emotivist theory, sentences that look like perfectly good declarative sentences, such as ‘One ought not to kill,’ should be interpreted as non-declarative. According to a deflationary theory of truth, ‘true’ is not a genuine predicate of sentences, and sentences of the form ‘“p” is true’ are equivalent to sentences of the form ‘p.’ Boghossian argues that emotivism and deflationism turn out to be incompatible. Before presenting Boghossian's criticism, however, we should ask what motivates Ayer's subversion of surface grammar. A typical motivation for providing an analysis of a region of discourse is to find a way in which its patterns of inference and compositionality can be more perspicuously presented.
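For illustration (a standard formulation of the deflationary schema, not a quotation from the paper): every instance of

“p” is true if and only if p

is obtained by substituting a declarative sentence for p; the emotivist, of course, denies that sentences such as ‘One ought not to kill’ are genuinely declarative.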

Author(s):  
Josefine Papst

John Gibbons tries to show that attending to the similarities and differences between different cases of events reveals the causal relevance of relational properties. On the basis of such considerations, Gibbons' main claim is that the truth value somebody assigns to his or her beliefs has causal power; if so, the deflationary theory of truth would be false. The questions, therefore, are: (1) What are the similarities and differences between different cases? (2) What kind of properties are relational properties? (3) What is the causal relevance of such relational properties, and why should truth value be causally relevant? (4) Why can Gibbons not show that truth value has the relevant causal power?


The article discusses the Frege-Geach problem, which is considered one of the most serious difficulties for emotive meta-ethics. The paper describes how treating the content of moral statements as emotive, that is, as expressions of attitudes, gives rise to the Frege-Geach problem. The essence of the problem is explained: it consists in the impossibility of drawing a logical inference in mixed contexts, in which the antecedent is evaluative while the consequent is descriptive. The authors consider one way to solve this problem, which involves adopting a deflationary theory of truth. It is argued that accepting deflationism about truth makes it possible to draw inferences in mixed contexts. They also raise the question of whether the deflationary conception of truth is sufficient to shield normative ethics from the relativistic consequences of emotivism. The authors note that the synthesis of deflationism and emotivism is unable to explain the internalism of moral statements, which always carry a hidden prescriptive modality. The conclusion is that deflationism does not allow emotivism to avoid relativistic consequences in the field of normative ethics. The authors therefore conclude that emotivism should be called a nihilistic theory of the justification of moral statements.
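A textbook variant of Geach’s example (not taken from the article) illustrates the difficulty:

(P1) If lying is wrong, then getting your little brother to lie is wrong.
(P2) Lying is wrong.
(C) Therefore, getting your little brother to lie is wrong.

If (P2) merely expresses an attitude of disapproval while ‘lying is wrong’ occurs unasserted in the antecedent of (P1), the two occurrences appear to differ in meaning and the modus ponens seems to equivocate; the deflationary move discussed by the authors is intended to restore the validity of such inferences.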


2013, Vol 12 (2), pp. 191-216
Author(s):  
Antonius Widyarsono

Abstract (translated from Indonesian): John Langshaw Austin became famous as a philosopher of the Oxford circle who emphasized the importance of performative utterances. In his article “Truth” (1950), however, he used the correspondence theory to understand the problem of truth. Austin criticized Strawson, who employed a deflationary theory of truth based on an analysis of the importance of performative utterances. This paper explains why Austin prefers the correspondence theory to the deflationary theory in understanding truth. It also shows Austin’s distinctive contribution in revising the usual correspondence theory, which relies on the metaphors of a “mirror” and a “map” of reality, by emphasizing the conventional character of the idea of correspondence. In the author’s view, this is a serious and useful attempt to articulate the way we use arbitrarily determined linguistic symbols to represent the reality of the world.

Keywords: truth, correspondence theory, coherence theory, deflationary theory, speech-act theory, illocutionary aspect of language, descriptive utterance, performative utterance, descriptive convention, demonstrative convention.

Abstract: John Langshaw Austin is an “Ordinary Language Philosopher” of Oxford, who is famous for emphasizing the importance of performative statements. In his article, “Truth” (1950), however, he used the correspondence theory for understanding the problem of truth. Austin criticized Strawson, who uses a deflationary theory of truth that is compatible with the analysis of performative utterances. This article will explain why Austin chooses the correspondence theory of truth rather than the deflationary one. It will also elaborate Austin’s specific contribution in changing the version of the correspondence theory that uses the metaphor of “mirroring” or “mapping” the world into a conventional correspondence theory. It is, in my opinion, a serious and notable attempt to articulate our use of arbitrary symbols in the representation of brute reality.

Keywords: truth, correspondence theory, coherence theory, deflationary theory, speech-act theory, the illocutionary aspect of language, descriptive utterance, performative utterance, descriptive convention, demonstrative convention.


Author(s):  
Daniel Boyd

Kripke’s Wittgenstein is standardly understood as a non-factualist about meaning ascription. Non-factualism about meaning ascription is the idea that sentences like “Joe means addition by ‘plus’” are not used to state facts about the world. Byrne and Kusch have argued that Kripke’s Wittgenstein is not a non-factualist about meaning ascription. They are aware that their interpretation is non-standard, but cite arguments from Boghossian and Wright to support their view. Boghossian argues that non-factualism about meaning ascription is incompatible with a deflationary theory of truth. Wright argues that non-factualism about meaning ascription is incoherent. To support the standard interpretation, I’ll respond to each argument in turn. To the degree that my responses are successful, Byrne and Kusch have an unmotivated interpretation of Kripke’s Wittgenstein. Wilson provides a factualist interpretation that is not based on Boghossian and Wright’s arguments. Miller argues for a non-factualist interpretation against Wilson, but I’ll show that Miller’s interpretation faces a dilemma. Miller’s argument cannot be maintained if a coherent interpretation of the skeptical solution is to be provided. I’ll show how this dilemma can be avoided and provide an independent argument against Wilson so that a non-factualist interpretation of the skeptical solution can be maintained.


Author(s):  
Jamin Asay,
Sam Baron

Abstract: In this paper we confront a challenge to truthmaker theory that is analogous to the objections raised by deflationists against substantive theories of truth. Several critics of truthmaker theory espouse a ‘deflationary’ attitude about truthmaking, though it has not been clearly presented as such. Our goal is to articulate and then object to the underlying rationale behind deflationary truthmaking. We begin by developing the analogy between deflationary truth and deflationary truthmaking, and then show how the latter can be found in the work of Dodd, Hornsby, Schnieder, Williamson, and others. These philosophers believe that the ambitions of truthmaker theory are easily satisfied, without recourse to ambitious ontological investigation; hence the analogy with deflationary truth. We argue that the deflationists’ agenda fails: there is no coherent deflationary theory of truthmaking. Truthmaking, once deflated, fails to address the questions at the heart of truthmaking investigation. Truthmaking cannot be had on the cheap.


Author(s):  
L.J. Chen,
Y.F. Hsieh

One measure of the maturity of a device technology is the ease and reliability of applying contact metallurgy. Compared to metal contacts on silicon, GaAs metallization is still at a primitive stage. With the advent of GaAs MESFETs and integrated circuits, very stringent requirements have been placed on their metal contacts. During the past few years, extensive research has been conducted on the Au-Ge-Ni system in order to lower contact resistance and improve uniformity. In this paper, we report the results of a TEM study of interfacial reactions between Ni and GaAs, as part of an attempt to understand the role of nickel in Au-Ge-Ni contacts to GaAs.

N-type, Si-doped, (001)-oriented GaAs wafers, 15 mil in thickness, were grown by the gradient-freeze method. Nickel thin films, 300 Å in thickness, were e-gun deposited on the GaAs wafers. The samples were then annealed in dry N2 in a three-zone diffusion furnace at temperatures of 200°C to 600°C for 5 to 180 minutes. Thin foils for TEM examination were prepared by chemical polishing from the GaAs side. TEM investigations were performed with JEOL-100B and JEOL-200CX electron microscopes.

