A short argument for truthmaker maximalism

Analysis
2019
Author(s):  
Mark Jago

Each truth has a truthmaker: an entity in virtue of whose existence that truth is true. So say truthmaker maximalists. Arguments for maximalism are hard to find, whereas those against are legion. Most accept that maximalism comes at a significant cost, which many judge to be too high. The scales would seem to be tipped against maximalism. Yet, as I show here, maximalism can be derived from a premise that many will find pre-theoretically acceptable.

2013
Vol 43 (4)
pp. 460-474
Author(s):  
Mark Jago

According to truthmaker theory, particular truths are true in virtue of the existence of particular entities. Truthmaker maximalism holds that this is so for all truths. Negative existential and other ‘negative’ truths threaten the position. Despite this, maximalism is an appealing thesis for truthmaker theorists. This motivates interest in parsimonious maximalist theories, which do not posit extra entities for truthmaker duty. Such theories have been offered by David Lewis and Gideon Rosen, Ross Cameron, and Jonathan Schaffer. However, it will be argued here that these theories cannot be sustained, and hence maximalism comes with a serious ontological cost. Neither Armstrong's invocation of totality facts nor the Martin-Kukso line on absences can meet this cost satisfactorily. I'll claim that negative facts are the best (and perhaps only) way out of the problem for the truthmaker maximalist.


Author(s):  
Ross P. Cameron

Truthmaker theory says that what is true depends on what exists. This chapter spells out this thesis, its implications, and why we should believe it. It looks at the connection between truth-making and the in virtue of relation. It then examines reasons to accept or reject truthmaker maximalism (the claim that absolutely every truth has a truthmaker) and truthmaker necessitarianism (the claim that if A makes p true then it is impossible for A to exist without p being true). It asks what views on essentialism are compatible with truthmaker theory. Three reasons for accepting the view are discussed. The views that truthmaker theory is required by the correct theory of truth, and that it is a commitment of realism, are rejected. It is argued that the best reason to accept truthmaker theory is that it yields a parsimonious account of which truths are brute.
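Stated schematically, the two theses come apart cleanly. The following is a rough formalization sketch in my own notation (not Cameron's), writing TM(x, p) for "x makes p true" and E!x for "x exists":

```latex
% Rough formalization sketch (notation mine, not Cameron's).
% Maximalism: every truth has some truthmaker.
% Necessitarianism: a truthmaker necessitates the truth it makes true.
\begin{align*}
\text{Maximalism:}       &\quad \forall p\,\bigl(p \rightarrow \exists x\,\mathrm{TM}(x,p)\bigr) \\
\text{Necessitarianism:} &\quad \forall x\,\forall p\,\bigl(\mathrm{TM}(x,p) \rightarrow \Box(\mathrm{E!}x \rightarrow p)\bigr)
\end{align*}
```

On this rendering one can accept either thesis without the other, which is why the chapter weighs reasons for and against each separately.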


2007
pp. 5-27
Author(s):  
J. Searle

The author claims that an institution is any collectively accepted system of rules (procedures, practices) that enable us to create institutional facts. These rules typically have the form of X counts as Y in C, where an object, person, or state of affairs X is assigned a special status, the Y status, such that the new status enables the person or object to perform functions that it could not perform solely in virtue of its physical structure, but requires as a necessary condition the assignment of the status. The creation of an institutional fact is, thus, the collective assignment of a status function. The typical point of the creation of institutional facts by assigning status functions is to create deontic powers. So typically when we assign a status function Y to some object or person X we have created a situation in which we accept that a person S who stands in the appropriate relation to X is such that (S has power (S does A)). The whole analysis then gives us a systematic set of relationships between collective intentionality, the assignment of function, the assignment of status functions, constitutive rules, institutional facts, and deontic powers.
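Searle's schema has an almost data-model flavor. As a purely illustrative sketch (my construction, not Searle's own formalism), the "X counts as Y in C" rule and the deontic powers it confers can be rendered as a small structure:

```python
# Toy model (illustrative only, not Searle's formalism) of the schema
# "X counts as Y in C": an entity X is assigned a status Y relative to a
# context C, and that status, not X's physical structure, confers powers.
from dataclasses import dataclass, field

@dataclass
class StatusFunction:
    status: str                # the Y term, e.g. "twenty-dollar bill"
    context: str               # the context C, e.g. "the US economy"
    deontic_powers: list[str]  # powers conferred by collective acceptance

@dataclass
class Entity:
    name: str                                         # the X term
    statuses: list[StatusFunction] = field(default_factory=list)

    def powers_in(self, context: str) -> list[str]:
        """Powers X has solely in virtue of its assigned statuses in C."""
        return [p for s in self.statuses if s.context == context
                for p in s.deontic_powers]

paper = Entity("this piece of paper")
paper.statuses.append(
    StatusFunction("twenty-dollar bill", "the US economy",
                   ["pay debts", "purchase goods"]))
print(paper.powers_in("the US economy"))  # ['pay debts', 'purchase goods']
```

The point the toy makes concrete is that `powers_in` consults only the assigned statuses, never any physical property of the entity.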


Author(s):  
José M. Ariso Salgado

Did Wittgenstein hold a foundationalist position in On Certainty? When this question is tackled, it is often discussed whether On Certainty fits the foundationalist model devised by Avrum Stroll. After expounding the main lines of this model, I hold that On Certainty does not fit Stroll's model because of the important role Wittgenstein attaches to contextualism. Furthermore, I add that Wittgenstein cannot be seen as a foundationalist, or as a coherentist, because he does not admit any feature in virtue of which the whole of our basic beliefs is justified without considering particular circumstances at all.

Keywords: Wittgenstein, certainty, foundationalism, contextualism


2014
Vol 17 (1)
pp. 72-93
Author(s):  
Christian Tapp

In this paper, Anselm's argument for the uniqueness of God, or, more precisely, of something through which everything that exists has its being (Monologion 3), is reconstructed. A first reading of the argument leads to a preliminary reconstruens with one major weakness, namely the incompleteness of a central case distinction. In the successful attempt to construct a more tenable reconstruens, some additional premises are identified which are deeply rooted in an Anselmian metaphysics. Anselm's argument seems to depend on premises such as: if two things have the same nature, then there is one common thing from which they have this nature and in virtue of which they exist. Furthermore, it appears that infinite regresses are excluded by the premise that if everything that exists is through something, then there is something through which it is “most truly”.
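The regress-excluding premise has the shape of a quantifier shift. A rough rendering in my own notation (not Tapp's), writing T(x, y) for "x is through y", might be:

```latex
% Rough sketch (notation mine, not Tapp's) of the regress-excluding premise:
% from "everything that exists is through something" to "there is some one
% thing through which everything is, most truly".
\[
  \forall x\,\exists y\,T(x,y) \;\rightarrow\; \exists y\,\forall x\,T^{\ast}(x,y)
\]
% T(x,y): "x is through y";  T*(x,y): "x is, most truly, through y".
```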


2019
Author(s):  
Robert C. Hockett

This white paper lays out the guiding vision behind the Green New Deal Resolution proposed to the U.S. Congress by Representative Alexandria Ocasio-Cortez and Senator Ed Markey in February of 2019. It explains the senses in which the Green New Deal is 'green' on the one hand, and a new 'New Deal' on the other hand. It also 'makes the case' for a shamelessly ambitious, not a low-ball or slow-walked, Green New Deal agenda. At the core of the paper's argument lies the observation that only a true national mobilization on the scale of those associated with the original New Deal and the Second World War will be up to the task of comprehensively revitalizing the nation's economy, justly growing our middle class, and expeditiously achieving carbon-neutrality within the twelve-year time-frame that climate science tells us we have before reaching an environmental 'tipping point.' But this is actually good news, the paper argues. For, paradoxically, an ambitious Green New Deal also will be the most 'affordable' Green New Deal, in virtue of the enormous productivity, widespread prosperity, and attendant public revenue benefits that large-scale public investment will bring. In effect, the Green New Deal will amount to that very transformative stimulus which the nation has awaited since the crash of 2008 and its debt-deflationary sequel.


2020
Vol 29 (5)
pp. 519-535
Author(s):  
Levi Tenen

Aesthetic and historical values are commonly distinguished from each other. Yet there has not been sustained discussion of how, precisely, they differ. In fact, recent scholarship has focused on various ways in which the two are related. I argue, though, that historical value can differ in an interesting way from aesthetic value and that this difference may have significant implications for environmental preservation. In valuing something for its historical significance, it need not always be the case that there is a reason to want people to experience the entity. Valuing something for its aesthetic merit, by contrast, does imply a reason to want people to experience the entity. I suggest that in virtue of this difference, some historical values may offer better justification for preserving natural environments than do aesthetic considerations.


1997
Vol 36 (6-7)
pp. 107-115
Author(s):  
Gregory J. Wilson
Amid P. Khodadoust
Makram T. Suidan
Richard C. Brenner

An integrated reactor system has been developed to remediate pentachlorophenol (PCP) containing wastes using sequential anaerobic and aerobic biodegradation. Anaerobically, PCP was degraded to predominantly equimolar concentrations (>99%) of monochlorophenol (MCP) in two granular activated carbon (GAC) fluidized bed reactors at Empty Bed Contact Times (EBCTs) ranging from 18.6 to 1.15 hours. However, at lower EBCTs, MCP concentrations decreased to less than 10% of the influent PCP concentration, suggesting mineralization. The optimal EBCT was determined to be 2.3 hours based on PCP conversion to MCPs and stable reactor operation. Decreasing the EBCT fourfold did not inhibit degradation of PCP and its intermediates, thus allowing removal of PCP at a much lower detention time and resulting in a significant cost advantage. Analytical grade PCP was fed via syringe pumps into the two fluidized bed reactors at influent concentrations of 100 mg/l and 200 mg/l, respectively. Acting as the primary substrate, ethanol was also fed into the reactors at concentrations of 697 and 1388 mg/l. Effluent PCP and chlorinated phenolic compounds were analyzed weekly to evaluate reactor performance. Biodegradation pathways were also identified. 3-chlorophenol (3-CP) was the predominant MCP, and its concentration varied simultaneously with that of 3,5-dichlorophenol (3,5-DCP). Likewise, 4-CP concentrations varied simultaneously with 3,4-DCP concentrations. A second-stage aerobic GAC fluidized bed reactor was added after the anaerobic reactor to completely mineralize the remaining MCP and phenols. The data show no phenol or MCP in the effluent or on the GAC. Overall, the chemical oxygen demand (COD) fed to the system was reduced from 75 g/d in the influent to less than 1.5 g/d in the effluent.
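Two of the reported quantities are easy to sanity-check. A minimal sketch follows; the bed volume and flow rate below are assumed for illustration, since the abstract reports only the EBCTs themselves:

```python
# Back-of-envelope checks (mine, not from the paper) of two quantities in
# the abstract: empty bed contact time (EBCT) and overall COD removal.

def ebct_hours(bed_volume_l: float, flow_l_per_h: float) -> float:
    """EBCT = empty bed volume / volumetric flow rate."""
    return bed_volume_l / flow_l_per_h

def removal_percent(influent_g_per_d: float, effluent_g_per_d: float) -> float:
    """Percent reduction from influent load to effluent load."""
    return 100.0 * (influent_g_per_d - effluent_g_per_d) / influent_g_per_d

# A hypothetical 23 L GAC bed at 10 L/h reproduces the optimal 2.3 h EBCT;
# actual reactor dimensions are not given in the abstract.
print(f"EBCT: {ebct_hours(23.0, 10.0):.1f} h")             # EBCT: 2.3 h
# COD load reported: 75 g/d in, under 1.5 g/d out, i.e. >98% removal.
print(f"COD removal: >{removal_percent(75.0, 1.5):.0f}%")  # COD removal: >98%
```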


Author(s):  
Elia Nathan Bravo

The purpose of this paper is two-fold. On the one hand, it offers a general analysis of stigmas (a person has one when, in virtue of his or her belonging to a certain group, such as that of women, homosexuals, etc., he or she is subjugated or persecuted). On the other hand, I argue that stigmas are “invented”. More precisely, I claim that they are not descriptive of real inequalities. Rather, they are socially created, or invented in a lax sense, in so far as the real differences to which they refer are socially valued or construed as negative, and used to justify social inequalities (that is, the placing of a person in the lower positions within an economic, cultural, etc., hierarchy), or persecutions. Finally, I argue that in some cases, such as that of the witch persecutions of early modern times, we find the extreme situation in which a stigma was invented in the strict sense of the word, that is, one without any empirical content.

