DETECTION OF TRACKS IN AERIAL PHOTOS BY THE GIBBS SAMPLER

Author(s):  
A. TETERUKOVSKIY

A problem of automatic detection of tracks in aerial photos is considered. We adopt a Bayesian approach and base our inference on a priori knowledge of the structure of tracks. The probability that a pixel belongs to a track depends on how the pixel's gray level differs from the gray levels of pixels in its neighborhood and on additional prior information. Several suggestions are made on how to formalize the prior knowledge about the shape of the tracks. The Gibbs sampler is used to construct the most probable configuration of tracks in the area. The method is applied to aerial photos with a cell size of 1 sq. m. Even for detection of trails of width comparable to or smaller than the cell size, positive results can be achieved.
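The abstract does not give the model's details; a minimal illustrative sketch of what a Gibbs sweep over binary track labels might look like, assuming a hypothetical Ising-style smoothness prior and Gaussian gray-level likelihoods (all parameter values made up), is:

```python
import numpy as np

# Illustrative sketch (not the paper's exact model): binary track labels
# with an Ising-style neighborhood prior and a Gaussian gray-level likelihood.
rng = np.random.default_rng(0)

def gibbs_sweep(labels, gray, mu0, mu1, sigma, beta):
    """One Gibbs sweep: resample each pixel label given its 4-neighbors."""
    h, w = labels.shape
    for i in range(h):
        for j in range(w):
            # Count neighboring pixels currently labeled "track".
            nb = sum(labels[a, b]
                     for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                     if 0 <= a < h and 0 <= b < w)
            # Log-odds of "track" = Ising coupling term + likelihood ratio term.
            log_odds = beta * (2 * nb - 4)
            log_odds += ((gray[i, j] - mu0) ** 2
                         - (gray[i, j] - mu1) ** 2) / (2 * sigma ** 2)
            p_track = 1.0 / (1.0 + np.exp(-log_odds))
            labels[i, j] = rng.random() < p_track
    return labels

gray = rng.normal(120, 10, size=(8, 8))        # bright background
gray[3, :] = rng.normal(60, 10, size=8)        # a dark horizontal "trail"
labels = np.zeros((8, 8), dtype=int)
for _ in range(20):
    labels = gibbs_sweep(labels, gray, mu0=120.0, mu1=60.0, sigma=10.0, beta=0.8)
print(labels)
```

With these settings the sampler should concentrate the "track" labels on the dark row, since the likelihood term strongly separates the two gray-level populations while the coupling term `beta` only smooths the labeling.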

Author(s):  
Alessandro Ferrero ◽  
Simona Salicone ◽  
Harsha Vardhana Jetti

Since the GUM was published, measurement uncertainty has been defined in terms of the standard deviation of the probability distribution of the values that can reasonably be attributed to the measurand, and it has been evaluated using statistical or probabilistic methods. A debate has long been alive among metrologists on whether a frequentist or a Bayesian approach should be followed to evaluate uncertainty. The Bayesian approach, based on available a-priori knowledge about the measurand, seems to prevail nowadays. This paper starts from the consideration that the Bayesian approach rests on the well-known Bayes theorem, which, like all mathematical theorems, is valid only to the extent that the assumptions made to prove it are valid. The main question, when following the Bayesian approach, is hence whether these assumptions are satisfied in practical cases, especially when the a-priori information is combined with the information coming from the measurement data to refine the uncertainty evaluation. This paper considers several case studies to analyze when the Bayesian approach can be usefully and reliably employed, by discussing the amount and pertinence of the available a-priori knowledge.
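The combination of a-priori knowledge with measurement data that the paper discusses is, at its core, a Bayes-theorem update. A minimal numerical sketch (with made-up values, not the paper's case studies) on a discrete grid of candidate measurand values:

```python
import numpy as np

# Bayes' theorem on a grid: a-priori knowledge about the measurand
# combined with the likelihood of the observed readings.
# All numbers here are illustrative assumptions.
grid = np.linspace(9.0, 11.0, 2001)                  # candidate measurand values
prior = np.exp(-0.5 * ((grid - 10.0) / 0.2) ** 2)    # a-priori: roughly N(10, 0.2)
prior /= prior.sum()

readings = [10.12, 10.08, 10.15]                     # instrument SD assumed 0.1
likelihood = np.ones_like(grid)
for y in readings:
    likelihood *= np.exp(-0.5 * ((y - grid) / 0.1) ** 2)

posterior = prior * likelihood                       # Bayes' theorem, then normalize
posterior /= posterior.sum()

mean = (grid * posterior).sum()
std = np.sqrt(((grid - mean) ** 2 * posterior).sum())
print(round(mean, 3), round(std, 3))
```

The posterior standard deviation (here about 0.055, versus the prior's 0.2) is exactly the quantity the GUM identifies with standard uncertainty; whether the update is trustworthy depends, as the paper stresses, on the pertinence of the assumed prior.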


1995 ◽  
Vol 32 (2) ◽  
pp. 152-162 ◽  
Author(s):  
Greg M. Allenby ◽  
Neeraj Arora ◽  
James L. Ginter

The authors use conjoint analysis to provide interval-level estimates of part-worths allowing tradeoffs among attribute levels to be examined. Researchers often possess prior information about the part-worths, such as the order and range restrictions of product attribute levels. It is known, for example, that consumers would rather pay less for a specific product given that all other product attribute levels are unchanged. The authors present a Bayesian approach to incorporate prior ordinal information about these part-worths into the analysis of conjoint studies. Their method results in parameter estimates with greater face validity and predictive performance than estimates that do not utilize prior information or those that use traditional methods such as LINMAP. Unlike existing methods, the authors’ methods apply to both rating and choice-based conjoint studies.
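One simple way to see how ordinal prior information reshapes part-worth estimates (a sketch only, not the authors' estimator) is to reject posterior draws that violate the order restriction, here the assumption that consumers prefer paying less, all else equal; the part-worth means and SDs below are hypothetical:

```python
import numpy as np

# Illustrative order-restricted inference: keep only draws in which the
# price part-worths decrease as price increases.
rng = np.random.default_rng(2)

# Hypothetical unconstrained posterior for part-worths of three price
# levels ($10 < $15 < $20); note the unconstrained fit puts $20 above $15.
means = np.array([0.9, 0.6, 0.7])
sds = np.array([0.2, 0.2, 0.2])

draws = rng.normal(means, sds, size=(20000, 3))
ok = draws[(draws[:, 0] >= draws[:, 1]) & (draws[:, 1] >= draws[:, 2])]
constrained = ok.mean(axis=0)
print(constrained.round(2))
```

Because every retained draw satisfies the ordering pointwise, the constrained posterior means are guaranteed to respect it too, which is the source of the improved face validity the abstract mentions.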


1970 ◽  
Vol 9 ◽  
pp. 41-48 ◽  
Author(s):  
R. P. Khatiwada ◽  
A. B. Sthapit

The conventional method of making statistical inferences about food quality measures is based entirely on experimental data: it does not incorporate prior knowledge or historical data on the parameter of interest, and is therefore not well suited to food quality control problems. We propose a Bayesian approach to inferring the conformance of data from a quality run. This approach integrates information about the parameter of interest from historical data or expert knowledge, and the prior information is used along with the experimental data for meaningful inference. In this study, we use the Bayesian approach to infer the weight of pouched ghee. Data are taken as random samples from a dairy industry. The prior information about the average weight and the process standard deviation is taken from prior knowledge of process specifications and standards. A Normal-Normal model is used to combine the prior and experimental data in the Bayesian framework. We used the user-friendly computer programmes 'First Bayes' and 'WinBUGS' to obtain the posterior distribution and to estimate the process precision, credible intervals, and the predictive distribution. Results are presented in comparison with conventional methods. The fit of the model is shown using kernel density plots and a triplot of the distributions.
Key words: credible interval; kernel density; posterior distribution; predictive distribution; triplot
DOI: 10.3126/njst.v9i0.3163
Nepal Journal of Science and Technology 9 (2008) 41-48
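The Normal-Normal model the study uses has a closed-form posterior, so the update can be written in a few lines. A minimal sketch, with hypothetical numbers for a 500 g pouch process rather than the study's data:

```python
import math

# Normal-Normal conjugate update: known process SD, normal prior on the mean.
prior_mean, prior_sd = 500.0, 5.0    # hypothetical process specification
process_sd = 8.0                     # hypothetical known process SD
data = [497.2, 501.5, 498.9, 503.1, 499.4, 500.8]  # sample pouch weights (g)

n = len(data)
xbar = sum(data) / n
prior_prec = 1.0 / prior_sd ** 2     # precision = 1 / variance
data_prec = n / process_sd ** 2

# Posterior precision adds; posterior mean is a precision-weighted average.
post_prec = prior_prec + data_prec
post_mean = (prior_prec * prior_mean + data_prec * xbar) / post_prec
post_sd = math.sqrt(1.0 / post_prec)

# Approximate 95% credible interval for the mean weight.
lo, hi = post_mean - 1.96 * post_sd, post_mean + 1.96 * post_sd
print(round(post_mean, 2), round(post_sd, 2), (round(lo, 1), round(hi, 1)))
```

This is the same computation that 'First Bayes' or 'WinBUGS' would perform for this conjugate model; the credible interval here is the Bayesian counterpart of the conventional confidence interval the study compares against.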


1994 ◽  
Vol 19 (1) ◽  
pp. 1-21
Author(s):  
James H. Albert

The problem of interest is to estimate a two-way table of means of grade point averages (GPAs) of University of Iowa freshmen grouped by levels of ACT scores and high school percentiles. Suppose one believes a priori that the table means satisfy a specific partial order. The use of two different classes of prior distributions is considered in modeling this prior information. The use of simulation and, in particular, the Gibbs sampler are outlined to summarize various posterior distributions of interest. The posterior distribution is used to predict the grade point averages of future Iowa freshmen.
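The paper's details are not given in the abstract; an illustrative two-cell reduction of the idea, assuming normal GPA data, a flat prior subject to the order restriction, and known within-cell SD (so each full conditional is a truncated normal), might look like:

```python
import numpy as np

# Illustrative Gibbs sampler (not Albert's exact model): two cell means
# under the partial-order constraint mu_low <= mu_high.
rng = np.random.default_rng(1)

def trunc_normal(mean, sd, low, high):
    """One truncated-normal draw by rejection (fine when truncation is mild)."""
    while True:
        x = rng.normal(mean, sd)
        if low <= x <= high:
            return x

y_low = rng.normal(2.4, 0.5, size=30)    # GPAs, lower ACT cell (made-up data)
y_high = rng.normal(3.1, 0.5, size=30)   # GPAs, higher ACT cell (made-up data)
sd = 0.5                                 # assumed known within-cell SD
mu_low, mu_high = 2.0, 3.5               # initial values
draws = []
for _ in range(2000):
    # mu_low | mu_high: normal around its cell mean, truncated above at mu_high.
    mu_low = trunc_normal(y_low.mean(), sd / np.sqrt(len(y_low)), -np.inf, mu_high)
    # mu_high | mu_low: truncated below at mu_low.
    mu_high = trunc_normal(y_high.mean(), sd / np.sqrt(len(y_high)), mu_low, np.inf)
    draws.append((mu_low, mu_high))

draws = np.array(draws[500:])            # drop burn-in
print(draws.mean(axis=0).round(2))
```

Every retained draw satisfies the order constraint by construction, and posterior summaries (means, intervals, predictions for future freshmen) are then read off the retained draws, which is the general pattern the abstract describes for the full two-way table.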


2020 ◽  
Author(s):  
Laetitia Zmuda ◽  
Charlotte Baey ◽  
Paolo Mairano ◽  
Anahita Basirat

It is well known that individuals can identify novel words in a stream of an artificial language using statistical dependencies. While the underlying computations are thought to be similar from one stream to another (e.g. transitional probabilities between syllables), performance is not. According to the "linguistic entrenchment" hypothesis, this is because individuals have prior knowledge about co-occurrences of elements in speech, which intervenes during verbal statistical learning. The focus of previous studies was on task performance. The goal of the current study is to examine the extent to which prior knowledge impacts metacognition (i.e. the ability to evaluate one's own cognitive processes). Participants were exposed to two different artificial languages. Using a fully Bayesian approach, we estimated an unbiased measure of metacognitive efficiency and compared the two languages in terms of task performance and metacognition. While task performance was higher in one of the languages, metacognitive efficiency was similar in both. In addition, a model assuming no correlation between the two languages accounted for our results better than a model in which correlations were introduced. We discuss the implications of our findings for the computations that underlie the interaction between input and prior knowledge during verbal statistical learning.


Author(s):  
Robert Audi

This book provides an overall theory of perception and an account of knowledge and justification concerning the physical, the abstract, and the normative. It has the rigor appropriate for professionals but explains its main points using concrete examples. It accounts for two important aspects of perception on which philosophers have said too little: its relevance to a priori knowledge—traditionally conceived as independent of perception—and its role in human action. Overall, the book provides a full-scale account of perception, presents a theory of the a priori, and explains how perception guides action. It also clarifies the relation between action and practical reasoning; the notion of rational action; and the relation between propositional and practical knowledge. Part One develops a theory of perception as experiential, representational, and causally connected with its objects: as a discriminative response to those objects, embodying phenomenally distinctive elements; and as yielding rich information that underlies human knowledge. Part Two presents a theory of self-evidence and the a priori. The theory is perceptualist in explicating the apprehension of a priori truths by articulating its parallels to perception. The theory unifies empirical and a priori knowledge by clarifying their reliable connections with their objects—connections many have thought impossible for a priori knowledge as about the abstract. Part Three explores how perception guides action; the relation between knowing how and knowing that; the nature of reasons for action; the role of inference in determining action; and the overall conditions for rational action.


Author(s):  
Donald C. Williams

This chapter begins with a systematic presentation of the doctrine of actualism. According to actualism, all that exists is actual, determinate, and of one way of being. There are no possible objects, nor is there any indeterminacy in the world. In addition, there are no ways of being. It is proposed that actual entities stand in three fundamental relations: mereological, spatiotemporal, and resemblance relations. These relations govern the fundamental entities. Each fundamental entity stands in parthood relations, spatiotemporal relations, and resemblance relations to other entities. The resulting picture is one that represents the world as a four-dimensional manifold of actual ‘qualitied contents’—upon which all else supervenes. It is then explained how actualism accounts for classes, quantity, number, causation, laws, a priori knowledge, necessity, and induction.


Author(s):  
Keith DeRose

In this chapter the contextualist Moorean account of how we know by ordinary standards that we are not brains in vats (BIVs) utilized in Chapter 1 is developed and defended, and the picture of knowledge and justification that emerges is explained. The account (a) is based on a double-safety picture of knowledge; (b) has it that our knowledge that we’re not BIVs is in an important way a priori; and (c) is knowledge that is easily obtained, without any need for fancy philosophical arguments to the effect that we’re not BIVs; and the account is one that (d) utilizes a conservative approach to epistemic justification. Special attention is devoted to defending the claim that we have a priori knowledge of the deeply contingent fact that we’re not BIVs, and to distinguishing this a prioritist account of this knowledge from the kind of “dogmatist” account prominently championed by James Pryor.

