Is the hybrid formal theory of arguments, stories and criminal evidence well suited for negative causation?

2019 · Vol 28 (3) · pp. 361-384
Author(s): Charles A. Barclay
1972 · Vol 17 (6) · pp. 358-359
Author(s): Kurt W. Back

Author(s): Jacob Stegenga

This chapter introduces the book, describes the key arguments of each chapter, and summarizes the master argument for medical nihilism. It offers a brief survey of prominent articulations of medical nihilism throughout history, and describes the contemporary evidence-based medicine movement, to set the stage for the skeptical arguments. The main arguments are based on an analysis of the concepts of disease and effectiveness, the malleability of methods in medical research, and widespread empirical findings which suggest that many medical interventions are barely effective. The chapter-level arguments are unified by our best formal theory of inductive inference in what is called the master argument for medical nihilism. The book closes by considering what medical nihilism entails for medical practice, research, and regulation.
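
Although the chapter itself is a summary, the reference to "our best formal theory of inductive inference" invites a compact illustration. The sketch below assumes that theory is Bayes' theorem (a reading suggested by the abstract, not quoted from the book), with H the hypothesis that some intervention is effective and E a typical body of favorable trial evidence.

```latex
% A minimal sketch, assuming the inductive framework is Bayes' theorem;
% the symbols are illustrative, not taken from the book.
\[
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}
                       {P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}
\]
% Skeptical reading: a low prior P(H) (few candidate interventions are truly
% effective) combined with a non-negligible P(E | \neg H) (malleable methods
% can yield favorable evidence even for ineffective interventions) keeps the
% posterior P(H | E) modest despite seemingly positive trial results.
```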


Author(s): Juan de Lara, Esther Guerra

Abstract Modelling is an essential activity in software engineering. It typically involves two meta-levels: one includes meta-models that describe modelling languages, and the other contains models built by instantiating those meta-models. Multi-level modelling generalizes this approach by allowing models to span an arbitrary number of meta-levels. A scenario that profits from multi-level modelling is the definition of language families that can be specialized (e.g., for different domains) by successive refinements at subsequent meta-levels, hence promoting language reuse. This enables an open set of variability options given by all possible specializations of the language family. However, multi-level modelling lacks the ability to express closed variability regarding the availability of language primitives or to choose between alternative primitive realizations. This limits the reuse opportunities of a language family. To improve this situation, we propose a novel combination of product lines with multi-level modelling to cover both open and closed variability. Our proposal is backed by a formal theory that guarantees correctness, enables top-down and bottom-up language variability design, and is implemented atop the MetaDepth multi-level modelling tool.
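
The distinction between open variability (unbounded refinement across meta-levels) and closed variability (a fixed menu of optional primitives) can be made concrete with a small sketch. The Python below is purely illustrative: it is not MetaDepth syntax, and all names (Clabject, LANGUAGE_FEATURES, Task) are hypothetical.

```python
# Hypothetical sketch of open vs. closed variability (not MetaDepth syntax).
from dataclasses import dataclass, field

@dataclass
class Clabject:
    """An element that acts as both class and object across meta-levels."""
    name: str
    level: int                      # meta-levels still available for instantiation
    fields: dict = field(default_factory=dict)

    def instantiate(self, name, **values):
        # Open variability: arbitrarily many refinements at lower meta-levels.
        assert self.level > 0, "cannot instantiate below level 0"
        return Clabject(name, self.level - 1, {**self.fields, **values})

# Closed variability: a product-line configuration fixes which optional
# primitives a specialized language actually offers.
LANGUAGE_FEATURES = {"timed_tasks": True, "priorities": False}

task_language = Clabject("Task", level=2)          # language family (top meta-level)
if LANGUAGE_FEATURES["timed_tasks"]:
    task_language.fields["duration"] = int         # primitive enabled for this product

domain_task = task_language.instantiate("ProductionTask")      # domain specialization
model_task = domain_task.instantiate("Assemble", duration=5)   # concrete model (level 0)
print(model_task)
```

The levels stand in for the open dimension (how far a family can be refined), while the feature dictionary stands in for the closed dimension (which primitives a given product of the line exposes).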


2021 · Vol 0 (0)
Author(s): John Capps

Abstract John Dewey’s theory of truth is widely viewed as proposing to substitute “warranted assertibility” for “truth,” a proposal that has faced serious objections since the late 1930s. By examining Dewey’s theory in its historical context – and, in particular, by drawing parallels with Otto Neurath’s concurrent attempts to develop a non-correspondence, non-formal theory of truth – I aim to shed light on Dewey’s underlying objectives. Dewey and Neurath were well known to each other and, as their writing and correspondence make clear, they took similar paths over the mid-century philosophical terrain. I conclude that Dewey’s account of truth is more principled, and more relevant to historical debates, than it first appears.


2021 · Vol 3 (1) · pp. 1-46
Author(s): Alexander Krüger, Jan Tünnermann, Lukas Stratmann, Lucas Briese, Falko Dressler, ...

Abstract As a formal theory, Bundesen’s theory of visual attention (TVA) enables the estimation of several theoretically meaningful parameters involved in attentional selection and visual encoding. So far, TVA has almost exclusively been used in restricted empirical scenarios such as whole and partial report and with strictly controlled stimulus material. We present a series of experiments in which we test whether the advantages of TVA can be exploited in more realistic scenarios with varying degrees of stimulus control. These include brief experimental sessions conducted on different mobile devices, computer games, and a driving simulator. Overall, six experiments demonstrate that the TVA parameters for processing capacity and attentional weight can be measured with sufficient precision in less controlled scenarios and that the results do not deviate strongly from typical laboratory results, although some systematic differences were found.
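
For readers unfamiliar with TVA, the two parameters named above enter the model through its exponential encoding race. The sketch below uses the standard textbook form of that race and is not the authors' estimation code; parameter values and variable names are illustrative.

```python
# Illustrative sketch of TVA's fixed-capacity encoding race (not the authors' code).
import math

def encoding_probabilities(weights, C=50.0, exposure=0.08, t0=0.02):
    """P(object encoded) for each display object, ignoring VSTM capacity limits.

    weights  : attentional weights w_x (arbitrary units)
    C        : total processing capacity (items per second)
    exposure : stimulus exposure duration in seconds
    t0       : longest ineffective exposure duration in seconds
    """
    total_w = sum(weights)
    effective = max(exposure - t0, 0.0)
    probs = []
    for w in weights:
        v = C * w / total_w                 # rate: capacity shared by relative weight
        probs.append(1.0 - math.exp(-v * effective))
    return probs

# Example: one strongly weighted target among three equally weighted distractors.
print(encoding_probabilities([2.0, 1.0, 1.0, 1.0]))
```

Raising C or an object's relative weight both raise its encoding probability; in TVA-based experiments, C is typically read off the growth of performance with exposure duration and the weights off how report is distributed across objects.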


Measurement · 2018 · Vol 116 · pp. 644-651
Author(s): Giovanni Battista Rossi, Francesco Crenna

The two-beam dynamical theory of electron diffraction in absorbing crystals has been applied to explain features of bend and thickness extinction contours and of images of stacking faults observed on transmission electron micrographs of metal foils. Inelastic scattering processes affect the intensities of the elastically scattered waves and give rise to ‘anomalous’ transmission (Borrmann) effects. The formal theory takes account of these effects phenomenologically by the use of a complex lattice potential but ignores the contribution of the inelastically scattered electrons to the image. In the theory, absorption is described by certain parameters ξ'_0 and ξ'_g with dimensions of length. These parameters are determined by the Fourier coefficients of the imaginary part of the potential in the same manner as the extinction distance ξ_g is determined by the Fourier coefficient of the real part. A simple physical explanation of the ‘anomalous’ absorption effect is developed in terms of the two crystal wave fields. This explanation is particularly helpful in understanding details of bend and thickness contours and of images of stacking faults. The theory is at present phenomenological because the detailed mechanism of the absorption process is not understood. Nevertheless, comparison of the theory with observations enables the absorption parameters to be roughly estimated.
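
A compact way to see where these parameters sit is the standard two-beam result for the two Bloch-wave branches; the expression below is the textbook form (at the exact Bragg condition) and is a sketch, not a formula quoted from this paper.

```latex
% Two-beam theory with a complex lattice potential, exact Bragg condition
% (a sketch of the standard result, not quoted from the paper): the two
% Bloch-wave branches are attenuated with different effective absorption
% coefficients,
\[
\mu^{(1,2)} \;\propto\; \frac{1}{\xi'_0} \mp \frac{1}{\xi'_g},
\]
% so one branch is anomalously weakly absorbed (Borrmann transmission) and the
% other strongly absorbed, while the real part of the potential fixes the
% extinction distance $\xi_g$ that sets the period of thickness fringes.
```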


1999 · Vol 11 (1) · pp. 21-66
Author(s): Douglas A. Miller, Steven W. Zucker

We present a model of visual computation based on tightly interconnected cliques of pyramidal cells. It leads to a formal theory of cell assemblies, a specific relationship between correlated firing patterns and abstract functionality, and a direct calculation relating estimates of cortical cell counts to orientation hyperacuity. Our network architecture is unique in that (1) it supports a mode of computation that is both reliable and efficient; (2) the current-spike relations are modeled as an analog dynamical system in which the requisite computations can take place on the time scale required for an early stage of visual processing; and (3) the dynamics are triggered by the spatiotemporal response of cortical cells. This final point could explain why moving stimuli improve vernier sensitivity.
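
The phrase "analog dynamical system" can be made concrete with a generic rate-based sketch; the code below is a standard leaky recurrent system, offered only to illustrate the kind of dynamics meant, not the authors' clique model.

```python
# Generic leaky analog (rate-based) dynamics: tau * dx/dt = -x + W @ f(x) + inp.
# Illustrative only; not the pyramidal-cell clique model of the paper.
import numpy as np

def simulate(W, inp, steps=200, dt=1e-3, tau=0.01):
    f = np.tanh                               # saturating current-to-rate nonlinearity
    x = np.zeros(W.shape[0])                  # membrane-like state variables
    for _ in range(steps):
        x += (dt / tau) * (-x + W @ f(x) + inp)
    return f(x)                               # steady firing rates

# Example: a small, all-to-all "clique" of mutually excitatory units settling
# to a common activity level under a shared input.
n = 5
W = 0.3 * (np.ones((n, n)) - np.eye(n))       # mutual excitation, no self-connection
print(simulate(W, inp=np.full(n, 0.5)))
```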

