Incommensurability and scientific progress: an essay concerning the nature of the relation between successive scientific theories

1977 ◽  
Author(s):  
Craig Dilworth
2005 ◽  
Vol 48 (1-2) ◽  
pp. 57-82
Author(s):  
Slobodan Negic

The text presented here offers a model of analysis based on a comparison of the cumulative and relativistic approaches in contemporary methodology and philosophy of science. The analysis bears on some of the most important problems of these disciplines, such as the possibility of defining falsification criteria for scientific theories or the growth of scientific knowledge. We thus have Popper (and later Lakatos) on one side versus Kuhn on the other. The analysis takes the form of an internal critique of the uniformist notion of scientific progress, in compliance with all the normative and formal requirements of such argumentation. Emphasis is placed on the development of conventionalism in modern science, which led to its acceptance as a legitimate point of view in contemporary methodology. This happened because the criteria for refuting scientific theories were constantly lowered when confronted with logical arguments derived from the very structure of the development of contemporary science. The main thesis of this work is that the aforementioned development of conventionalism entails devastating consequences for methodology itself: conventionalism by definition admits external (social) factors of influence into every further development of methodology, seriously damaging the autonomy of science and of scientific knowledge as well.


Perspectives ◽  
2018 ◽  
Vol 7 (1) ◽  
pp. 32-39
Author(s):  
Andrea Roselli

The Verisimilitudinarian approach to scientific progress (VS, for short) is traditionally considered a realist-correspondist model explaining the proximity of our best scientific theories to the way things really are in the world out there ('the Truth', with a capital 'T'). However, VS is based on notions, such as 'estimated verisimilitude' or 'approximate truth', that dilute the model into a functionalist-like theory. My thesis, then, is that VS tries to incorporate notions such as 'progress' into a pre-constituted metaphysical conception of the world, but fails to provide a fitting framework. The main argument I develop to support this claim is that the notions used to explain scientific progress ('estimated verisimilitude' or 'approximate truth') have nothing to do with 'the Truth'. After presenting Cevolani and Tambolo's (2013) answer to Bird's (2007) arguments, I claim that VS sacrifices realist-correspondist truth in favor of an epistemic notion of truth, which can of course be compatible with certain kinds of realism, but not with the one the authors have in mind (the correspondence between our theories and the way things really are).


Physiology ◽  
2002 ◽  
Vol 17 (1) ◽  
pp. 43-46 ◽  
Author(s):  
Ewald R. Weibel

Physiologists are bound to test their scientific theories in experiments on living matter and, ultimately, on living organisms—animals or humans. This confronts the physiologist with ethical dilemmas: can we engage in physiological experiments in the face of possibly harming the interests of living beings, or should we refrain from such studies, thus preventing the good that can be derived from scientific progress?


2018 ◽  
Vol 1 (2) ◽  
pp. 245-258 ◽  
Author(s):  
Richard D. Morey ◽  
Saskia Homer ◽  
Travis Proulx

Scientific theories explain phenomena using simplifying assumptions—for instance, that the speed of light does not depend on the direction in which the light is moving, or that the shape of a pea plant’s seeds depends on a small number of alleles randomly obtained from its parents. These simplifying assumptions often take the form of statistical null hypotheses; hence, supporting these simplifying assumptions with statistical evidence is crucial to scientific progress, though it might involve “accepting” a null hypothesis. We review two historical examples in which statistical evidence was used to accept a simplifying assumption (that there is no luminiferous ether and that genetic traits are passed on in discrete forms) and one in which the null hypothesis was not accepted despite repeated failures (gravitational waves), drawing lessons from each. We emphasize the role of the scientific context in acceptance of the null: Accepting a null hypothesis is never a purely statistical affair.
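The abstract's point that accepting a null hypothesis requires quantified evidence *for* the null, not merely a failure to reject it, can be illustrated with a Bayes factor. The sketch below is my own illustration, not the authors' method: it uses the BIC approximation BF01 ≈ exp((BIC_alt − BIC_null)/2) for a one-sample point-null comparison, and the function name is invented for this example.

```python
import numpy as np

def bic_bayes_factor_null(x, mu0=0.0):
    """Approximate Bayes factor favoring the null (mean == mu0)
    over the alternative (free mean), via the BIC approximation
    BF01 ~= exp((BIC_alt - BIC_null) / 2)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # Residual sum of squares under each model
    rss_null = np.sum((x - mu0) ** 2)        # mean fixed at mu0
    rss_alt = np.sum((x - np.mean(x)) ** 2)  # mean estimated freely
    # Gaussian-model BIC up to a shared constant: n*ln(RSS/n) + k*ln(n)
    bic_null = n * np.log(rss_null / n) + 1 * np.log(n)  # k=1: variance only
    bic_alt = n * np.log(rss_alt / n) + 2 * np.log(n)    # k=2: mean + variance
    return np.exp((bic_alt - bic_null) / 2)

# Data scattered tightly around 0: BF01 > 1, i.e. evidence FOR the null
print(bic_bayes_factor_null([0.1, -0.2, 0.05, -0.1, 0.15,
                             -0.05, 0.0, 0.1, -0.15, 0.05]))
```

A value above 1 is positive evidence for the simplifying assumption itself, which is exactly what a nonsignificant p-value cannot supply; this is one conventional way of making "accepting the null" a graded, evidential matter rather than a default.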


2018 ◽  
Author(s):  
Richard Donald Morey ◽  
Saskia Homer ◽  
Travis Proulx

Scientific theories explain phenomena using simplifying assumptions: for instance, that the speed of light does not depend on the direction in which the light is moving, or that the height of a pea plant depends on a small number of alleles randomly obtained from its parents. The ability to support these simplifying assumptions with statistical evidence is crucial to scientific progress, though it might involve "accepting" the null hypothesis. We review two historical examples in which statistical evidence was used to accept a simplifying assumption (that there is no luminiferous aether, and that genetic traits are passed on in discrete forms) and one in which the null hypothesis was not accepted in spite of repeated failures (gravitational waves), drawing lessons from each. We emphasize the role of the scientific context in the acceptance of the null: accepting the null is never a purely statistical affair.


1984 ◽  
Vol 16 (48) ◽  
pp. 53-78
Author(s):  
León Olivé

This paper discusses Laudan's claims (1981) that neither reference nor approximate truth explains the success of science as some realists have maintained; that the main realist theses about conceptual change and scientific progress are wanting; and that the history of science decisively refutes naturalistic scientific-realist theses. Laudan's arguments are examined in detail, and it is shown that there are possible realist answers to his objections, provided one assumes a view of scientific theories different from the syntactic one normally accepted by naturalistic realists. This alternative view must include the notion of model as a central component of scientific theories, as developed e.g. by Harré (1970). It is also argued that Laudan's arguments rest on too narrow a conception of reference. It is shown that a more elaborate notion, e.g. that suggested by Kitcher (1978), can fruitfully be used by realists to explain convergence and also to rebut Laudan's claim that there are theories, e.g. phlogiston or ether theories, whose central terms did not refer but which were nonetheless successful. The alternative view of reference sketched here, following Kitcher, shows that some tokens of terms like 'phlogiston' and 'ether' as used by the original phlogiston and ether theorists did have genuine reference. The paper goes on to argue against the naturalistic idea that reference and approximate truth alone can explain why theories are accepted by scientists and why they follow, as a matter of fact, a retentionist methodology. Laudan shares the naturalistic idea that this is an empirical hypothesis, and so he tries to refute it on the basis of historical examples. The paper argues that this naturalistic view will not do. A broader theory of science is required which, besides realist theses, should develop adequate concepts to deal with the social factors of science, e.g. experimental practices, communication processes, and the exercise of power through them.
It is advocated that a theory of science of this type should be developed in order to defend realism. But then most of the naturalistic premises shared by realists and antirealists must be abandoned. An important consequence is that the history of science, although not irrelevant to the realism-antirealism debate, cannot be taken as a basis of neutral, hard facts against which theories of science can founder. On the contrary, historical studies of science will necessarily presuppose a theory of science. Therefore scientific realism must be seen as a philosophical doctrine to be disputed via philosophical arguments, and the idea that it is an empirical hypothesis should be abandoned. [L.O.]


Author(s):  
Menachem Fisch

William Whewell’s two seminal works, History of the Inductive Sciences, from the Earliest to the Present Time (1837) and The Philosophy of the Inductive Sciences, Founded upon their History (1840), began a new era in the philosophy of science. Equally critical of the British ‘sensationalist’ school, which founded all knowledge on experience, and of the German Idealists, who based science on a priori ideas, Whewell undertook to survey the history of all known sciences in search of a better explanation of scientific discovery. His conclusions were as bold as his undertaking. All real knowledge, he argued, is ‘antithetical’, requiring mutually irreducible, ever-present, and yet inseparable empirical and conceptual components. Scientific progress is achieved not by induction, the reading-out of theories from previously collected data, but by the imaginative ‘superinduction’ of novel hypotheses upon known but seemingly unrelated facts. He thus broke radically with traditional inductivism, and for nearly a century was all but ignored. In the Philosophy, the antithetical structure of scientific theories and the hypothetico-deductive account of scientific discovery form the basis for novel analyses of scientific and mathematical truth and scientific methodology, for critiques of rival philosophies of science, and for an account of the emergence and refinement of scientific ideas.


Author(s):  
Alexander Yu. Antonovskiy ◽  
Raisa E. Barash

The article proposes a solution to the paradox of scientific progress formulated by Max Weber. Science formulates true and objective judgments, and only this distinguishes it from the world of value judgments, ideology, religion, and art. On the other hand, the “lifespan of truths” is extremely short, and any claim of scientific progress looks unconvincing precisely in comparison with the progress of value discourses, where each stage of development (a style or a work of art) is not replaced by a “better” one but retains or even increases its value over the centuries. A way out of this paradox, according to the authors, can be offered by a socio-evolutionary interpretation of science, in which the “criterion” of a better (or better-grounded) theory is viewed as “fitness”: the ability to respond to the challenge of the external environment, to which the better theory adapts more successfully and is consequently selected. The article is devoted to the problems that the biologically based general theory of evolution faces today when extrapolated to the problem of scientific progress. It investigates in what sense scientific theories can be interpreted as replacing and competing with one another by analogy with organic formations (genotypes, phenotypes, populations); what the external environment of scientific communication is and which institutions are responsible for the selection of the best theories; and to what extent the autonomous mechanisms of scientific evolution are differentiated, namely the mechanisms of random variation, natural selection, and stabilization of newly acquired traits.
The authors analyze the solutions to these problems in David Hull’s concept of the causal individuation of scientific theories and Stephen Gould’s concept of semantic individuation, and explore the possibilities of reconciling and synthesizing these evolutionary approaches within Niklas Luhmann’s system-communicative theory of evolution.

