Análise Bayesiana no estudo do tempo de retorno das precipitações pluviais máximas em Jaboticabal (SP) [Bayesian analysis of the return period of maximum rainfall in Jaboticabal (SP)]

2009 ◽  
Vol 33 (1) ◽  
pp. 261-270 ◽  
Author(s):  
Luiz Alberto Beijo ◽  
Mário Javier Ferrua Vivanco ◽  
Joel Augusto Muniz

Historical maximum precipitation data are used to forecast extreme rainfall, knowledge of which is of great importance in the design of agricultural and hydraulic-engineering projects. The generalized extreme value (GEV) distribution has frequently been applied in such studies, but difficulties in obtaining reliable estimates have arisen because, in most situations, only scarce data are available. One alternative for improving the quality of the estimates is to use information from experts in the area under study. This work therefore analyzes the application of Bayesian inference with a prior distribution based on extreme quantiles, which eases the incorporation of expert knowledge, to estimate maximum precipitation for return periods of 10 and 20 years, together with the respective 95% upper bounds, for the annual period and for the months of the rainy season in Jaboticabal (SP). Markov chain Monte Carlo (MCMC) was used for posterior inference on each parameter. The Bayesian methodology gave more accurate and precise results, both in estimating the parameters of the GEV distribution and in obtaining probable maximum precipitation values for the Jaboticabal region, proving a good alternative for incorporating prior knowledge into the study of extreme data.
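As a rough illustration of the approach (not the authors' code), the sketch below fits a GEV by random-walk Metropolis on synthetic annual maxima and computes the 10-year return level with its 95% upper bound. The synthetic data, the weakly informative priors standing in for the expert-elicited quantile prior, and all step sizes are assumptions; note scipy's shape convention c = -ξ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Synthetic annual-maximum rainfall (mm); real work would use the
# Jaboticabal series and an expert-elicited quantile prior.
data = stats.genextreme.rvs(c=-0.1, loc=80, scale=15, size=30, random_state=rng)

def log_posterior(theta):
    mu, log_sigma, xi = theta
    sigma = np.exp(log_sigma)
    # Weakly informative priors stand in for the expert quantile prior.
    lp = stats.norm.logpdf(mu, 80, 50) + stats.norm.logpdf(xi, 0, 0.5)
    ll = stats.genextreme.logpdf(data, c=-xi, loc=mu, scale=sigma).sum()
    return lp + ll if np.isfinite(ll) else -np.inf

# Random-walk Metropolis over (mu, log sigma, xi).
theta = np.array([80.0, np.log(15.0), 0.1])
cur = log_posterior(theta)
samples = []
for i in range(20000):
    prop = theta + rng.normal(0, [2.0, 0.05, 0.05])
    new = log_posterior(prop)
    if np.log(rng.uniform()) < new - cur:
        theta, cur = prop, new
    if i >= 5000:  # discard burn-in
        samples.append(theta.copy())
samples = np.array(samples)

# Return level for period T: z_T = mu + (sigma/xi) * ((-log(1 - 1/T))**(-xi) - 1)
mu, sigma, xi = samples[:, 0], np.exp(samples[:, 1]), samples[:, 2]
yT = -np.log(1 - 1 / 10)
z10 = mu + sigma / xi * (yT ** (-xi) - 1)
print(np.median(z10), np.quantile(z10, 0.95))
```

The posterior 0.95 quantile of z10 plays the role of the 95% upper bound reported for each return period.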

2005 ◽  
Vol 34 (5) ◽  
pp. 1531-1539 ◽  
Author(s):  
Tarcísio de Moraes Gonçalves ◽  
Henrique Nunes de Oliveira ◽  
Henk Bovenhuis ◽  
Marco Bink ◽  
Johan Van Arendonk

A segregation analysis using Bayesian inference was carried out to estimate variance components and to verify the presence of major genes (MG) influencing two carcass traits, intramuscular fat (IMF, %) and backfat thickness (BT, mm), and one growth trait, weight gain from 25 to 90 kg live weight (WG, g/day). Data on 1,257 animals from an F2 design, obtained by crossing Meishan boars with Large White and Landrace sows, were used. In animal breeding, finite polygenic models (FPM) can be an alternative to infinitesimal polygenic models (IPM) for the genetic evaluation of quantitative traits using complex pedigrees. IPM, FPM, and IPM combined with FPM were tested empirically to estimate variance components and the number of genes in the FPM. Marginal posterior means of variance components and parameters were estimated with a Bayesian methodology using Markov chain Monte Carlo (MCMC) algorithms: the Gibbs sampler and the reversible-jump sampler (Metropolis-Hastings). The results provide evidence of four major genes, two for IMF and two for BT. For BT, the major gene explained most of the genetic variation, whereas for IMF the major gene significantly reduced the polygenic variation. For WG, the influence of a major gene could not be determined. Heritabilities estimated by fitting the IPM for IMF, BT, and WG were 0.37, 0.24, and 0.37, respectively. Future studies based on this experiment that use molecular markers to map the major genes affecting IMF and BT in particular may prove successful.
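The Gibbs-sampling machinery for variance components can be illustrated in a much simpler setting than the paper's finite/infinitesimal polygenic models: a one-way sire model with conjugate conditionals. The data are synthetic and the flat variance priors (scaled inverse-chi-square conditionals) are an assumption, not the authors' specification.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic one-way sire model y_ij = mu + s_i + e_ij, a toy stand-in
# for the polygenic model; true variances chosen so h2 = 4*0.1/1.0 = 0.4.
n_sires, n_prog = 50, 20
s_true = rng.normal(0, np.sqrt(0.1), n_sires)
y = 10 + s_true[:, None] + rng.normal(0, np.sqrt(0.9), (n_sires, n_prog))

mu, sigma_s2, sigma_e2 = y.mean(), 1.0, 1.0
draws = []
for it in range(2000):
    # Sire effects | rest: normal, shrunk toward zero.
    prec = n_prog / sigma_e2 + 1 / sigma_s2
    mean = (y - mu).sum(axis=1) / sigma_e2 / prec
    s_hat = rng.normal(mean, np.sqrt(1 / prec))
    # Overall mean | rest (flat prior).
    resid = y - s_hat[:, None]
    mu = rng.normal(resid.mean(), np.sqrt(sigma_e2 / y.size))
    # Variances | rest: scaled inverse-chi-square under flat priors.
    sigma_s2 = np.sum(s_hat**2) / rng.chisquare(n_sires - 2)
    e = y - mu - s_hat[:, None]
    sigma_e2 = np.sum(e**2) / rng.chisquare(y.size - 2)
    if it >= 500:  # discard burn-in
        draws.append((sigma_s2, sigma_e2))
draws = np.array(draws)
h2 = 4 * draws[:, 0] / (draws[:, 0] + draws[:, 1])  # sire-model heritability
print(draws.mean(axis=0), h2.mean())
```

The marginal posterior means of the variance components are simply the averages of the retained draws, mirroring how the abstract's heritabilities were obtained.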


Geophysics ◽  
2012 ◽  
Vol 77 (2) ◽  
pp. H19-H31 ◽  
Author(s):  
Knud Skou Cordua ◽  
Thomas Mejer Hansen ◽  
Klaus Mosegaard

We present a general Monte Carlo full-waveform inversion strategy that integrates a priori information described by geostatistical algorithms with Bayesian inverse problem theory. The extended Metropolis algorithm can be used to sample the a posteriori probability density of highly nonlinear inverse problems, such as full-waveform inversion. Sequential Gibbs sampling is a method that allows efficient sampling of a priori probability densities described by geostatistical algorithms based on either two-point (e.g., Gaussian) or multiple-point statistics. We outline the theoretical framework for a full-waveform inversion strategy that integrates the extended Metropolis algorithm with sequential Gibbs sampling such that arbitrarily complex geostatistically defined a priori information can be included. At the same time, we show how temporally and/or spatially correlated data uncertainties can be taken into account during the inversion. The suggested inversion strategy is tested on synthetic tomographic crosshole ground-penetrating radar full-waveform data using multiple-point-based a priori information. This is, to our knowledge, the first example of obtaining a posteriori realizations of a full-waveform inverse problem. Benefits of the proposed methodology compared with deterministic inversion approaches include: (1) The a posteriori model variability reflects the states of information provided by the data uncertainties and a priori information, which provides a means of obtaining resolution analysis. (2) Based on a posteriori realizations, complicated statistical questions can be answered, such as the probability of connectivity across a layer. (3) Complex a priori information can be included through geostatistical algorithms. These benefits, however, require more computing resources than traditional methods do. Moreover, an adequate knowledge of data uncertainties and a priori information is required to obtain meaningful uncertainty estimates.
The latter may be a key challenge when considering field experiments, which will not be addressed here.
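The core recipe — prior samples generated by a sequential Gibbs step, likelihood-only Metropolis acceptance — can be sketched on a toy linear problem. Everything below is an illustrative stand-in: a two-point Gaussian prior replaces the geostatistical algorithm, and a random linear operator replaces the full-waveform forward solver.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 40

# Two-point (Gaussian) prior: zero mean, exponential covariance.
x = np.arange(n)
C = np.exp(-np.abs(x[:, None] - x[None, :]) / 5.0)

# Toy linear forward problem standing in for full-waveform modeling.
G = rng.normal(size=(20, n)) / np.sqrt(n)
m_true = rng.multivariate_normal(np.zeros(n), C)
sigma_d = 0.2
d_obs = G @ m_true + rng.normal(0, sigma_d, 20)

def loglik(m):
    r = d_obs - G @ m
    return -0.5 * np.sum(r**2) / sigma_d**2

def sequential_gibbs_propose(m, k=5):
    """Resample k model entries from their prior conditional on the rest."""
    idx = rng.choice(n, size=k, replace=False)
    rest = np.setdiff1d(np.arange(n), idx)
    Cab = C[np.ix_(idx, rest)]
    Cbb_inv = np.linalg.inv(C[np.ix_(rest, rest)])
    mean = Cab @ Cbb_inv @ m[rest]
    cov = C[np.ix_(idx, idx)] - Cab @ Cbb_inv @ Cab.T
    m_new = m.copy()
    m_new[idx] = rng.multivariate_normal(mean, cov)
    return m_new

# Extended Metropolis: proposals already sample the prior,
# so the acceptance ratio involves the likelihood only.
m = rng.multivariate_normal(np.zeros(n), C)
ll = ll0 = loglik(m)
accepted = 0
for _ in range(3000):
    m_prop = sequential_gibbs_propose(m)
    ll_prop = loglik(m_prop)
    if np.log(rng.uniform()) < ll_prop - ll:
        m, ll = m_prop, ll_prop
        accepted += 1
print(accepted / 3000, ll)
```

Collecting the visited models after burn-in yields the a posteriori realizations from which statistics such as connectivity probabilities could be computed.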


2021 ◽  
Author(s):  
Fabio Ciabarri ◽  
Marco Pirrone ◽  
Cristiano Tarchiani ◽  
...  

Log-facies classification aims to predict a vertical profile of facies at the well location, with log readings or rock properties calculated in the formation evaluation and/or rock-physics modeling analysis as input. Various classification approaches are described in the literature, and new ones continue to appear based on emerging Machine Learning techniques. However, most of the available classification methods assume that the inputs are accurate, and their inherent uncertainty, related to measurement errors and interpretation steps, is usually neglected. Accounting for facies uncertainty is not a mere exercise in style; rather, it is fundamental for understanding the reliability of the classification results, and it also represents critical information for 3D reservoir modeling and/or seismic characterization processes. This is particularly true in wells characterized by high vertical heterogeneity of rock properties or thinly bedded stratigraphy. Among classification methods, probabilistic classifiers, which rely on Bayes decision theory, offer an intuitive way to model and propagate measurement/rock-property uncertainty into the classification process. In this work, the Bayesian classifier is enhanced such that the most likely classification of facies is expressed by maximizing the integral of the product of three probability functions, which describe: (1) the a priori information on facies proportions, (2) the likelihood of a set of measurements/rock properties to belong to a certain facies class, and (3) the uncertainty of the inputs to the classifier (log data or rock properties derived from them). Reliability of the classification outcome is therefore improved by accounting for both the global uncertainty, related to facies-class overlap in the classification model, and the depth-dependent uncertainty related to log data.
As derived in this work, although the formulation is generally valid for any type of probability function, its most interesting feature is that it can be solved analytically by representing the input distributions as a Gaussian mixture model and their related uncertainty as additive white Gaussian noise. This gives a robust, straightforward and fast approach that can be effortlessly integrated into existing classification workflows. The proposed classifier is tested in various well-log characterization studies on clastic depositional environments, where Monte Carlo realizations of rock-property curves, the output of a statistical formation evaluation analysis, are used to infer rock-property distributions. Uncertainty on rock properties, modeled as additive white Gaussian noise, is then statistically estimated (independently at each depth along the well profile) from the ensemble of Monte Carlo realizations. At the same time, a classifier based on a Gaussian mixture model is parametrically inferred from the pointwise mean of the Monte Carlo realizations given an a priori reference profile of facies. Classification results, given by the a posteriori facies proportions and the maximum a posteriori prediction profiles, are finally computed. The classification outcomes clearly highlight that neglecting uncertainty leads to an erroneous final interpretation, especially at the transition zones between different facies. As mentioned, this becomes particularly evident in complex environments and highly heterogeneous scenarios.
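The analytic step admits a compact sketch: the integral of a Gaussian class density against a Gaussian input-noise density is again Gaussian, with the two covariances added, so the noisy-input Bayes rule reduces to evaluating each class Gaussian with an inflated covariance. All facies parameters and the log reading below are hypothetical illustrations, not values from the paper.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Hypothetical two-facies model in a 2D rock-property space
# (e.g., porosity fraction, Vp in km/s) -- all numbers illustrative.
priors = np.array([0.6, 0.4])  # a priori facies proportions
means = [np.array([0.10, 3.2]), np.array([0.25, 2.4])]
covs = [np.diag([0.02**2, 0.2**2]), np.diag([0.04**2, 0.3**2])]

def classify(y, noise_cov):
    """Bayes rule with AWGN input uncertainty: integrating the class
    Gaussian against the noise Gaussian yields a Gaussian whose
    covariance is the class covariance plus the noise covariance."""
    post = np.array([
        p * multivariate_normal.pdf(y, m, S + noise_cov)
        for p, m, S in zip(priors, means, covs)
    ])
    return post / post.sum()

y = np.array([0.18, 2.8])                        # one log reading
no_noise = classify(y, np.zeros((2, 2)))         # uncertainty neglected
noisy = classify(y, np.diag([0.05**2, 0.4**2]))  # depth-dependent uncertainty
print(no_noise, noisy)
```

As the abstract argues, the noisy-input posterior is markedly less confident than the noise-free one at points between facies, which is precisely the transition-zone behavior the study highlights.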


Author(s):  
Heinrich Schepers ◽  
Giorgio Tonelli ◽  
Rudolf Eisler
Keyword(s):  
A Priori ◽  

1994 ◽  
Vol 11 (4) ◽  
pp. 475-503
Author(s):  
Masudul Alum Choudhury

Is it the realm of theoretical constructs or positive applications that defines the essence of scientific inquiry? Is there unison between the normative and the positive, between the inductive and deductive contents, between perception and reality, between the micro- and macro-phenomena of reality as technically understood? In short, is there a possibility for unification of knowledge in modernist epistemological comprehension? Is knowledge perceived in conception and application as systemic dichotomy between the purely epistemic (in the metaphysically a priori sense) and the purely ontic (in the purely positivistically a posteriori sense) at all a reflection of reality? Is knowledge possible in such a dichotomy or plurality? Answers to these foundational questions are primal in order to understand a critique of modernist synthesis in Islamic thought that has been raging among Muslim scholars for some time now. The consequences emanating from the modernist approach underlie much of the nature of development in methodology, thinking, institutions, and behavior in the Muslim world throughout its history. They are found to pervade more intensively, I will argue here, as the consequence of a taqlid of modernism among Islamic thinkers. I will then argue that this debility has arisen not because of a comparative modern scientific investigation, but due to a failure to fathom the uniqueness of a truly Qur'anic epistemological inquiry in the understanding of the nature of the Islamic socioscientific worldview ...


2019 ◽  
Vol 11 ◽  
pp. 51-64
Author(s):  
M. LE MOAL

Geographic information systems (GIS) have become indispensable for managing water and wastewater networks, and their effectiveness rests largely on the quality of the data they exploit. At the same time, regulatory changes and user practices, notably the growth of information exchange, reinforce the central role of data and data quality. Although most GIS solutions on the market offer functions dedicated to assessing data quality, they require data specifications to be translated into computational rules before the quality tests can be run. This time-consuming approach demands domain expertise. To avoid these constraints, Axes Conseil has developed a fast GIS data-checking process accessible to water and sanitation professionals. Rather than a heavy a priori modeling approach, the principle is to generate a set of explicit indicators that practitioners can easily exploit a posteriori. This approach offers great analytical flexibility and requires no advanced computing skills.
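The a posteriori indicator idea can be sketched as, for example, completeness ratios computed directly over network records, with no prior rule modeling. The record structure and field names below are hypothetical, not Axes Conseil's actual data model.

```python
# Hypothetical water-network records; field names are illustrative.
pipes = [
    {"id": "P1", "diameter_mm": 110, "material": "PVC", "length_m": 52.0},
    {"id": "P2", "diameter_mm": None, "material": "PVC", "length_m": 18.5},
    {"id": "P3", "diameter_mm": 63, "material": "", "length_m": None},
]

def quality_indicators(records, required=("diameter_mm", "material", "length_m")):
    """A posteriori completeness indicators: share of filled values per field."""
    out = {}
    for field in required:
        filled = sum(1 for r in records if r.get(field) not in (None, ""))
        out[field] = filled / len(records)
    return out

print(quality_indicators(pipes))  # each indicator here is 2/3
```

Each indicator is explicit and self-describing, so a network operator can read it without translating specifications into test rules first.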


Author(s):  
Barry Stroud

This chapter presents a straightforward structural description of Immanuel Kant’s conception of what the transcendental deduction is supposed to do, and how it is supposed to do it. The ‘deduction’ Kant thinks is needed for understanding the human mind would establish and explain our ‘right’ or ‘entitlement’ to something we seem to possess and employ in ‘the highly complicated web of human knowledge’. This is: experience, concepts, and principles. The chapter explains the point and strategy of the ‘deduction’ as Kant understands it, as well as the demanding conditions of its success, without entering into complexities of interpretation or critical assessment of the degree of success actually achieved. It also analyses Kant’s arguments regarding a priori concepts as well as a posteriori knowledge of the world around us, along with his claim that our position in the world must be understood as ‘empirical realism’.


2017 ◽  
Vol 58 (3) ◽  
pp. 313-342 ◽  
Author(s):  
Barbara S. Held

The positive/negative distinction works well in many fields—for example, in mathematics negative numbers hold their own, and in medical pathology negative results are usually celebrated. But in positive psychology negativity should be replaced with positivity for flourishing/optimal functioning to occur. That the designation of the psychological states and processes deemed positive (good/desirable) and negative (bad/undesirable) is made a priori, independent of circumstantial particularity, both intrapersonal and interpersonal, does not seem to bother positive psychologists. But it should, as it results in conceptual muddles and dead ends that cannot be solved within their conceptual framework of positivity and negativity. Especially problematic is an ambiguity I find in positive psychologists’ a priori and a posteriori understandings of positivity and negativity, an ambiguity about constitutive and causal relations that pervades their science and the conclusions drawn from it. By eliminating their a priori dichotomy of positivity and negativity, positive psychologists might well find themselves in a better position to put back together the psychological reality that they have fractured in their ontologically dubious move of carving up psychological reality a priori into positive and negative phenomena. They then might find themselves better placed to “broaden and build” their own science of flourishing.

