Coherence and probability in legal evidence

2019 ◽  
Vol 18 (4) ◽  
pp. 275-294 ◽  
Author(s):  
Christian Dahlman ◽  
Anne Ruth Mackor

The authors investigate to what extent an evaluation of legal evidence in terms of coherence (suggested by Thagard, Amaya, Van Koppen and others) is reconcilable with a probabilistic (Bayesian) approach to legal evidence. The article is written by one author (Dahlman) with a background in the Bayesian approach to legal evidence and one author (Mackor) with a background in scenario theory. The authors find common ground but partly diverge in their conclusions. Their findings support the claim (reductionism) that coherence can be translated into probability without loss. Dahlman therefore concludes that the probabilistic vocabulary is superior to the coherence vocabulary, since it is more precise. Mackor is more agnostic about reductionism; in her view, the findings of their joint investigation do not imply that the probabilistic approach is superior to the coherentist approach.

Author(s):  
M. Azarkhail ◽  
M. Modarres

The physics-of-failure (POF) modeling approach is a proven and powerful method for predicting the reliability of mechanical components and systems. Most POF models were originally developed from empirical data drawn from a wide range of applications (e.g., the fracture-mechanics approach to fatigue life). Available curve-fitting methods, such as least squares, calculate the best estimate of the parameters by minimizing a distance function. Such point-estimate approaches overlook other possibilities for the parameters and fail to incorporate the real uncertainty of the empirical data into the process. Another important issue with traditional methods arises when new data points become available: the best estimates must be recalculated using the new and old data sets together, but the original data sets used to develop the POF models may no longer be available to combine with new data in a point-estimate framework. In this research, a Bayesian framework is proposed for efficient uncertainty management in POF models. The Bayesian approach provides practical features such as fair coverage of uncertainty and the concept of updating, which offers a powerful means of knowledge management: Bayesian models store the available information as probability densities over the model parameters. These distributions serve as priors to be updated in the light of new data as they become available. The first part of this article presents a brief review of the classical and probabilistic approaches to regression, examines the accuracy of the traditional normal-distribution assumption for the error, and proposes a new, flexible likelihood function. The Bayesian approach to regression and its connections with the classical and probabilistic methods are explained next. The Bayesian section discusses how the likelihood functions introduced in the probabilistic approach can be combined with prior information through conditional probability. To highlight its advantages, the Bayesian approach is further clarified with case studies in which the results are compared with traditional methods such as least squares and maximum likelihood estimation (MLE). The mathematical complexity of the Bayesian inference equations was overcome using the Markov chain Monte Carlo (MCMC) simulation technique.
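The prior-times-likelihood updating and MCMC machinery described above can be sketched in miniature. The example below is a hedged illustration, not the authors' model: a simple linear model with a Gaussian likelihood and wide Gaussian priors stands in for a POF model with the article's flexible likelihood, and the data, parameter names, and step sizes are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "empirical" data from a toy linear model y = a + b*x + noise
a_true, b_true, sigma = 1.0, 2.0, 0.5
x = np.linspace(0, 5, 40)
y = a_true + b_true * x + rng.normal(0, sigma, x.size)

def log_posterior(theta):
    """Log posterior = Gaussian log likelihood + wide N(0, 10^2) priors on (a, b)."""
    a, b = theta
    log_lik = -0.5 * np.sum((y - (a + b * x)) ** 2) / sigma**2
    log_prior = -0.5 * (a**2 + b**2) / 100.0
    return log_lik + log_prior

def metropolis(log_post, theta0, n_steps=5000, step=0.05):
    """Random-walk Metropolis-Hastings: accept a proposal with probability
    min(1, posterior ratio); the kept states form a sample of the posterior."""
    current = np.asarray(theta0, dtype=float)
    lp = log_post(current)
    samples = [current]
    for _ in range(n_steps):
        proposal = current + rng.normal(0, step, size=2)
        lp_prop = log_post(proposal)
        if np.log(rng.uniform()) < lp_prop - lp:
            current, lp = proposal, lp_prop
        samples.append(current)
    return np.array(samples)

chain = metropolis(log_posterior, [0.0, 0.0])
burned = chain[1000:]                 # discard burn-in
a_hat, b_hat = burned.mean(axis=0)    # posterior means of the parameters
```

The retained chain approximates the joint posterior of `(a, b)`; in the updating spirit of the abstract, that posterior (rather than the raw data) is what would be stored and used as the prior when new observations arrive.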


2015 ◽  
Vol 15 (08) ◽  
pp. 1540026 ◽  
Author(s):  
Q. Hu ◽  
H. F. Lam ◽  
S. A. Alabi

The identification of railway ballast damage under a concrete sleeper is investigated following the Bayesian approach. The use of a discrete modeling method to capture the distribution of ballast stiffness under the sleeper introduces artificial stiffness discontinuities between different ballast regions, which increases the effects of modeling errors and reduces the accuracy of the ballast damage detection results. In this paper, a continuous modeling method was developed to overcome this difficulty. The uncertainties induced by modeling error and measurement noise are the major difficulties of vibration-based damage detection methods, and in the proposed methodology a Bayesian probabilistic approach is adopted to explicitly address the uncertainties associated with the identified model parameters. In the model updating process, the stiffness of the ballast foundation is assumed to be continuous along the sleeper and is represented by a polynomial of order N. One contribution of this paper is to select the order N conditional on a given set of measurements using the Bayesian model class selection method. The proposed ballast damage detection methodology was verified with vibration data obtained from a segment of full-scale ballasted track under laboratory conditions. The experimental verification results are very encouraging, showing that the Bayesian approach, together with the newly developed continuous modeling method, can be used for ballast damage detection.
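The model-class-selection step above can be illustrated with a hedged sketch. The paper selects the polynomial order N by Bayesian model class selection; the toy below approximates the log evidence of each order with the BIC (a common large-sample stand-in for the full evidence), on an entirely hypothetical stiffness profile, to show how the Occam penalty trades off fit against complexity.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical continuous stiffness profile along the sleeper: quadratic + noise
xi = np.linspace(0.0, 1.0, 30)                 # normalized position
k_true = 1.0 - 0.8 * (xi - 0.5) ** 2           # assumed "true" stiffness shape
obs = k_true + rng.normal(0, 0.02, xi.size)    # noisy measurements

def log_evidence_bic(order):
    """BIC approximation to the log evidence of a polynomial model class:
    maximized Gaussian log likelihood minus 0.5*k*log(n) (the Occam penalty)."""
    coeffs = np.polyfit(xi, obs, order)
    resid = obs - np.polyval(coeffs, xi)
    n, k = xi.size, order + 1
    sigma2 = np.mean(resid ** 2)
    log_lik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    return log_lik - 0.5 * k * np.log(n)

orders = range(0, 6)
best = max(orders, key=log_evidence_bic)       # most plausible model class
```

Low orders underfit the profile badly, so their evidence is poor, while orders above the true one buy only tiny likelihood gains at the cost of the complexity penalty; the same trade-off is what the full Bayesian model class selection in the paper formalizes.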


2021 ◽  
Vol 14 (2) ◽  
pp. 231-232
Author(s):  
Adnan Kastrati ◽  
Alexander Hapfelmeier

Author(s):  
Daiane Aparecida Zuanetti ◽  
Luis Aparecido Milan

In this paper, we propose a new Bayesian approach for QTL mapping of family data. The main purpose is to model a phenotype as a function of the QTLs' effects. The model accounts for the detailed familial dependence and does not rely on random effects. It combines the probability of Mendelian inheritance of the parents' genotypes with the correlation between flanking markers and QTLs. This is an advance over models that use only Mendelian segregation or only the correlation between markers and QTLs to estimate transmission probabilities. We use the Bayesian approach to estimate the number of QTLs, their locations, and their additive and dominance effects. We compare the performance of the proposed method with variance-component and LASSO models using simulated and GAW17 data sets. Under the tested conditions, the proposed method outperforms the other methods in estimating the number of QTLs, the accuracy of the QTLs' positions, and the estimates of their effects. The results of applying the proposed method to these data sets exceeded all of our expectations.
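The Mendelian-segregation ingredient of the transmission probabilities can be sketched for a single biallelic locus. The function name and the string genotype encoding are illustrative assumptions; the paper's model additionally combines such probabilities with the marker-QTL correlations, which this sketch omits.

```python
from itertools import product

def transmission_probs(parent1, parent2):
    """P(offspring genotype | parent genotypes) for one biallelic locus under
    Mendelian segregation: each parent transmits either allele with probability 1/2.
    Genotypes are two-character strings such as 'Aa'."""
    probs = {}
    for allele1, allele2 in product(parent1, parent2):  # 4 equally likely pairs
        genotype = ''.join(sorted(allele1 + allele2))   # 'aA' and 'Aa' are the same
        probs[genotype] = probs.get(genotype, 0.0) + 0.25
    return probs

# An Aa x Aa cross recovers the classic 1:2:1 genotype ratio
print(transmission_probs('Aa', 'Aa'))  # {'AA': 0.25, 'Aa': 0.5, 'aa': 0.25}
```

In a full QTL model, terms like these enter the likelihood of each offspring's genotype given its parents, alongside the flanking-marker information.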


Author(s):  
Christian Dahlman ◽  
Alex Stein ◽  
Giovanni Tuzet

Philosophical Foundations of Evidence Law presents a cross-disciplinary overview of the core issues in the theory and methodology of adjudicative evidence and factfinding, assembling the major philosophical and interdisciplinary insights that define evidence theory, as related to law, in a single book. The volume presents contemporary debates on truth, knowledge, rational beliefs, proof, argumentation, explanation, coherence, probability, economics, psychology, bias, gender, and race. It covers different theoretical approaches to legal evidence, including the Bayesian approach, scenario theory, and inference to the best explanation. The volume’s contributions come from scholars spread across three continents and twelve different countries, whose common interest is evidence theory as related to law.

