Foundations of Science
Latest Publications


TOTAL DOCUMENTS

815
(FIVE YEARS 243)

H-INDEX

21
(FIVE YEARS 4)

Published by Springer-Verlag

ISSN: 1572-8471, 1233-1821

Author(s):  
J. L. Usó-Doménech ◽  
J. Nescolarde-Selva

Author(s):  
Mateusz Kotowski ◽  
Krzysztof Szlachcic

Abstract: For many decades, Duhem has been considered a paradigmatic instrumentalist, and while some commentators have argued against classifying him in this way, it still seems prevalent as an interpretation of his philosophy of science. Yet such a construal bears scant resemblance to the views presented in his own works; so little, indeed, that it might be said to constitute no more than a mere phantom with respect to his actual thought. In this article, we aim to deconstruct this phantom, tracing the misconceptions surrounding his ideas back to their sources and pinpointing the causes of their proliferation. We then point out and discuss those elements of his philosophy that, taken together, support the view of him as a scientific realist of a sophisticated kind. Finally, we defend our interpretation of his thought against suggestions that it is oriented towards neither instrumentalism nor scientific realism.


Author(s):  
J. L. Usó-Doménech ◽  
J. A. Nescolarde-Selva ◽  
H. Gash

Abstract: In this paper, the authors try to clarify the relations between Meinong's and Russell's thought on the ontological question of existence. The Meinongian theory of non-existent objects does not in itself violate the principle of non-contradiction, since the problem this hypothesis poses for the theory of definite descriptions is not so much a logical problem as an ontological one. To demonstrate this, we establish what we believe are the two main theses basic to the theory of descriptions: the epistemological thesis and the logical thesis.
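To fix ideas, Russell's theory of definite descriptions analyzes a sentence such as "The present King of France is bald" not as a subject-predicate claim about a non-existent object but as a quantified conjunction; a standard textbook rendering (not drawn from the article itself) is

$$\exists x\,\big(Kx \land \forall y\,(Ky \to y = x) \land Bx\big)$$

where $Kx$ reads "x is presently King of France" and $Bx$ reads "x is bald". Since nothing satisfies $Kx$, the sentence comes out false, with no need for a Meinongian non-existent referent.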


Author(s):  
Mithun Bantwal Rao

Abstract: This paper is a contribution to a discussion in the philosophy of technology, focusing on the epistemological status of the example. Of the various developments in the emerging, inchoate field of philosophy of technology, the “empirical turn” stands out as having left the most enduring mark on the trajectory contemporary research takes. From a historical point of view, the empirical turn is best understood as a corrective to the overly “transcendentalizing” tendencies of “classical” philosophers of technology, such as Heidegger. Empirically oriented philosophy of technology emphasizes actual technologies through case-study research into the formation of technical objects and systems (constructivist studies) and into how those technologies, for example, transform our perceptions and conceptions (the phenomenological tradition) or pass on and propagate relations of power (critical theory). This paper explores the point of convergence of classical and contemporary approaches by means of the notion of the “example” or “paradigm.” It starts with a discussion of the quintessential modern philosopher of technology, Martin Heidegger, and his thinking about technology in terms of the ontological difference. Heidegger’s framing of technology in terms of this difference places the weight of intelligibility entirely on the side of being, to such an extent that his examples become heuristic rather than constitutive. The second part of the paper discusses the methodological and epistemological import of the “example” and the form of intelligibility it affords. Drawing on the work of Wittgenstein (the standard metre), Foucault (panopticism), and Agamben (the paradigm), we argue that the example offers a way of understanding the study of technologies that is an alternative to empirical case studies.


Author(s):  
Fabio Sterpetti

Abstract: This article presents a challenge that philosophers who deny the causal interpretation of the explanations provided by population genetics may have to address. Some philosophers, known as statisticalists, claim that the concept of natural selection is statistical in character and cannot be construed in causal terms. Other philosophers, known as causalists, argue against the statistical view and support the causal interpretation of natural selection. The problem I am concerned with here arises for the statisticalists because the debate on the nature of natural selection intersects the debate on whether mathematical explanations of empirical facts are genuine scientific explanations. I argue that if the statisticalists regard the explanations provided by population genetics as non-causal explanations of that kind, then statisticalism risks being incompatible with a naturalist stance. The statisticalist faces a dilemma: either she maintains statisticalism but has to renounce naturalism, or she maintains naturalism but has to content herself with an account of the explanations provided by population genetics that she deems unsatisfactory. This challenge is relevant because many statisticalists see themselves as naturalists.
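As an illustration of the kind of mathematical explanation at issue (a textbook example, not one taken from the article), the deterministic one-locus haploid selection model gives the next-generation frequency $p'$ of an allele whose relative fitness is $1+s$ as

$$p' = \frac{p\,(1+s)}{1 + p\,s}$$

where the denominator $1 + p\,s$ is the population's mean fitness. The statisticalist reads such a recursion as a statistical summary of individual births and deaths; the causalist reads the selection coefficient $s$ as quantifying a causal influence acting on the population.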


Author(s):  
Alger Sans Pinillos

Abstract: Many parts of the contemporary philosophical debate have been built on the radicalization of conclusions derived from the acceptance of a certain set of classical dichotomies. This article analyzes the impact of those dichotomies on the philosophy of science and logic, and discusses how pragmatism and abduction are currently presented as solutions to the problems arising from them. The starting point is that accepting abduction implies, in many ways, accepting the foundations of pragmatism, and that analyzing these problems from pragmatism, with the particular use it makes of abduction, dissolves the dichotomies and thereby also modifies the philosophical problems related to them. I therefore propose to understand abduction as the right conceptual device for reviewing the problems and debates of twentieth-century epistemology from a pragmatic perspective. In doing so, the aim is to show that the current use of abduction in contemporary debates may imply a change of philosophical perspective.
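For orientation, one common way of schematizing the Peircean abduction the article presupposes (a sketch, not the article's own notation) is as a defeasible inference rule:

$$\frac{C \qquad A \to C}{\therefore\ A \ \text{(as a plausible conjecture)}}$$

read: the surprising fact C is observed; if A were true, C would be a matter of course; hence there is reason to suspect that A is true.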


Author(s):  
Fabio Rigat

Abstract: “What data will show the truth?” is a fundamental question emerging early in any empirical investigation. From a statistical perspective, experimental design is the appropriate tool to address this question, by ensuring control of the error rates of planned data analyses and of the ensuing decisions. From an epistemological standpoint, planned data analyses describe in mathematical and algorithmic terms a pre-specified mapping of observations into decisions. The value of exploratory data analyses is often less clear, resulting in confusion about which characteristics of design and analysis are necessary for decision making and which may merely be useful for inspiring new questions. This point is addressed here by illustrating the Popper-Miller theorem in plain terms and with graphical support. Popper and Miller proved that probability estimates cannot generate hypotheses on behalf of investigators. Consistent with Popper-Miller, we show that probability estimation can only reduce uncertainty about the truth of a merely possible hypothesis. This fact clearly identifies exploratory analysis as one of the tools supporting a dynamic process of hypothesis generation and refinement that cannot be purely analytic. A clear understanding of these facts will enable stakeholders, mathematical modellers and data analysts to engage on a level playing field when designing experiments and when interpreting the results of planned and exploratory data analyses.
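A compact statement of the theorem may help (standard notation from the literature, not the paper's own). Write $s(x \mid e) = p(x \mid e) - p(x)$ for the probabilistic support that evidence $e$ gives to $x$. Any hypothesis $h$ factors as $h \equiv (h \lor e) \land (h \lor \neg e)$, where $h \lor e$ is deductively entailed by $e$ and $h \lor \neg e$ is the part of $h$ that goes beyond $e$. Popper and Miller showed that support is additive over this factorization and that the part going beyond the evidence is never positively supported:

$$s(h \mid e) = s(h \lor e \mid e) + s(h \lor \neg e \mid e), \qquad s(h \lor \neg e \mid e) = -\,p(\neg h \mid e)\,\big(1 - p(e)\big) \le 0.$$

Whatever positive support $e$ lends to $h$ therefore attaches entirely to the part of $h$ already entailed by $e$; the genuinely conjectural content must come from elsewhere, which is the sense in which probability estimation can reduce uncertainty about a hypothesis already on the table but cannot generate a new one.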

