epistemic opacity
Recently Published Documents


TOTAL DOCUMENTS: 23 (FIVE YEARS: 14)
H-INDEX: 3 (FIVE YEARS: 1)

SIMULATION ◽ 2021 ◽ pp. 003754972110288 ◽ Author(s): Alejandro Cassini

Some philosophers of science have recently argued that the epistemic assessment of complex simulation models, such as climate models, cannot be free of the influence of social values. In their view, the assignment of probabilities to the different hypotheses or predictions that result from simulations presupposes some methodological decisions that rest on value judgments. In this article, I criticize this claim and put forward a Bayesian response to the arguments from inductive risk according to which the influence of social values on the calculation of probabilities is negligible. I conclude that the epistemic opacity of complex simulations, such as climate models, does not preclude the application of Bayesian methods.


AI & Society ◽ 2021 ◽ Author(s): Bert Heinrichs

Abstract: In this paper, I examine whether the use of artificial intelligence (AI) and automated decision-making (ADM) aggravates issues of discrimination, as several authors have argued. For this purpose, I first take up the lively philosophical debate on discrimination and present my own definition of the concept. Equipped with this account, I then review some of the recent literature on the use of AI/ADM and discrimination. I explain how my account of discrimination helps to show that the general claim that AI/ADM aggravates discrimination is unwarranted. Finally, I argue that the use of AI/ADM can, in fact, increase issues of discrimination, but in a different way than most critics assume: it is due to its epistemic opacity that AI/ADM threatens to undermine the moral deliberation that is essential for reaching a common understanding of what should count as discrimination. As a consequence, it turns out that algorithms may actually help to detect hidden forms of discrimination.


2020 ◽ Vol 28 (5) ◽ pp. 610-629 ◽ Author(s): Frédéric Wieber ◽ Alexandre Hocquet

Computational chemistry grew in a new era of “desktop modeling,” which coincided with a growing demand for modeling software, especially from the pharmaceutical industry. Parameterization of models in computational chemistry is an arduous enterprise, and we argue that, in this specific context, this activity leads to tensions among scientists regarding the transparency of parameterized methods and the software implementing them. We relate one flame war from the Computational Chemistry mailing list in order to assess in detail the relationships between modeling methods, parameterization, software, and the various forms of their enclosure or disclosure. Our claim is that parameterization issues are an important and often neglected source of epistemic opacity, and that this opacity is entangled in methods and software alike. Models and software must be addressed together to understand the epistemological tensions at stake.


2020 ◽ pp. 153-166 ◽ Author(s): Steve McKinlay

In this chapter, McKinlay argues that the use of big data algorithms introduces a key problem in terms of epistemic opacity. Opacity in various forms is an issue which many authors identify as posing problems for democratic functioning and accountability. In McKinlay’s case, the argument focuses on the impact that epistemic opacity has on our ability to trust non-human agents. He holds that while the outputs of big data-derived decisions can be significant for citizens, where we do not have the ability to understand how these decisions were made, we cannot ultimately trust the decider. Decisions based on mere probability are not, he argues, sufficiently grounded for democratic systems and risk harming citizens.


2019 ◽ pp. 34-47 ◽ Author(s): Paul Humphreys

Reasons are given to justify the claim that computer simulations and computational science constitute a distinctively new set of scientific methods, as compared with traditional analytic methods, and that these computational methods introduce new issues into the philosophy of science. These issues are both epistemological and methodological in kind. Definitions of epistemic opacity and essential epistemic opacity are given, the syntactic and semantic accounts of theories are shown to address different problems than those addressed by computational science, the important role of concrete dynamics in simulations is stressed, and differences between in-principle and in-practice approaches to the philosophy of science are explored.

