Sensitivity of the Posterior Mean on the Prior Assumptions: An Application of the Ellipsoid Bound Theorem

Author(s):  
Olawale B. Akanbi ◽  
Olusanya E. Olubusoye ◽  
Oluwaseun O. Odeyemi

This study examines the sensitivity of the posterior mean to changes in the prior assumptions. Three plausible choices of prior are considered: informative, relatively non-informative, and non-informative. The paper asks how much prior information is needed to produce a notable change in the Bayesian posterior point estimate, and develops a framework for evaluating a bound for a robust posterior point estimate. The Ellipsoid Bound theorem is employed to derive the ellipsoid bound for an independent normal-gamma prior distribution. A modified ellipsoid bound for a large prior was established by varying the size of the prior variance-covariance matrix of the independent normal-gamma prior. This bound represents the range of the posterior mean when it is insensitive, and when it is sensitive, in both location and spread. The results show that for a large prior parameter value (greater than the OLS estimate), a positive definite prior variance-covariance matrix, and a prior parameter interval that contains the OLS estimate, the posterior estimate will be less than both the OLS and the prior estimates. Similarly, if the lower bound of the prior parameter interval is greater than the OLS estimate, the posterior estimate will be greater than the OLS estimate but smaller than the prior estimate. Furthermore, it is observed that, no matter the degree of confidence in the prior values, the data information is powerful enough to modify them.
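
The interplay the abstract describes between the OLS estimate, the prior mean, and the prior variance-covariance size can be illustrated with a minimal sketch of a conjugate normal-prior posterior mean. This is not the authors' ellipsoid-bound construction; the noise variance is treated as known for simplicity, whereas the paper works with a full independent normal-gamma prior, and all numbers are illustrative.

```python
# A minimal sketch (not the paper's code): posterior mean of a linear regression
# coefficient under a normal prior, showing how the prior variance-covariance size
# controls the pull between the prior mean and the OLS estimate.
# Assumption: known noise variance sigma2; prior mean chosen larger than the OLS estimate.
import numpy as np

rng = np.random.default_rng(0)
n, sigma2 = 50, 1.0
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.normal(scale=np.sqrt(sigma2), size=n)

beta_ols = np.linalg.solve(X.T @ X, X.T @ y)          # OLS estimate

beta_prior = np.array([3.0, 4.0])                     # "large" prior mean (> OLS)
for scale in [0.01, 1.0, 100.0]:                      # prior variance-covariance size
    V0 = scale * np.eye(2)                            # positive definite prior covariance
    precision = np.linalg.inv(V0) + X.T @ X / sigma2  # posterior precision
    mean = np.linalg.solve(precision,
                           np.linalg.inv(V0) @ beta_prior + X.T @ y / sigma2)
    print(scale, mean)                                # moves from prior toward OLS as scale grows
```

Tightening V0 pulls the posterior mean toward the prior, while widening it lets the data dominate, which is consistent with the abstract's observation that the data information can always modify the prior.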

2018 ◽  
Vol 12 (10) ◽  
pp. 128
Author(s):  
Jeroen Provoost ◽  
Stephanie Nouwen ◽  
Jan Bronders

This study presents an evaluation of three screening-level models applied within the frameworks for contaminated land management (CLM) in Sweden: the Dilution Factor (DF) model from 1996, its updated version from 2005, and the Johnson and Ettinger model (JEM) from 1997. The evaluation applies, besides a deterministic approach (point estimate), a probabilistic assessment plus sensitivity analysis. The latter approach allows the models to be ranked according to conservatism, accuracy and parsimony by contrasting predicted and observed soil and indoor air concentrations for two contaminants (benzene and ethylbenzene), in order to determine their suitability for application within CLM. The results suggest that the most accurate model for predicting the soil and indoor air concentrations is the JEM, followed by the DF 2005 and the DF 1996. Predictions of the soil air concentration are primarily driven by variation in physico-chemical parameters. The variation in indoor air concentration is driven by physico-chemical and/or soil parameters for the DF models, while for the JEM soil parameters dominate. The deterministic analysis showed that default parameter values could be revised to increase conservatism and bring the point estimates closer to the probabilistic 95th-percentile predicted indoor air concentration. The DF 1996 model includes a limited number of parameters, and this analysis shows that a model with more parameters is more accurate and less conservative. The DF 2005 appears to be the most parsimonious model as it is accurate, sufficiently conservative, and has 14 parameters, whereas the DF 1996 with 9 parameters is the most conservative and the JEM with 27 parameters is the most accurate, with an increased probability of producing false negative predictions. For the latter, some of the dominant parameters cannot easily be measured on site.
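
The contrast between a deterministic point estimate and a probabilistic 95th-percentile prediction can be sketched with a toy Monte Carlo screening calculation. The attenuation-factor relation and the input distributions below are hypothetical stand-ins, not the DF or JEM equations.

```python
# A hedged sketch of deterministic vs probabilistic screening: sample uncertain inputs,
# propagate them through a simple screening relation, and compare the point estimate
# with the 95th percentile of the predicted indoor air concentration.
import numpy as np

rng = np.random.default_rng(1)

def indoor_air_conc(c_soil_air, attenuation):
    """Generic screening relation: indoor air = soil air concentration x attenuation factor."""
    return c_soil_air * attenuation

# Deterministic run with default parameter values (illustrative numbers).
point_estimate = indoor_air_conc(c_soil_air=100.0, attenuation=1e-3)

# Probabilistic run: assumed lognormal spread around the defaults.
n = 10_000
c_soil_air = rng.lognormal(mean=np.log(100.0), sigma=0.5, size=n)
attenuation = rng.lognormal(mean=np.log(1e-3), sigma=0.7, size=n)
predictions = indoor_air_conc(c_soil_air, attenuation)

p95 = np.percentile(predictions, 95)
print(f"point estimate: {point_estimate:.3g}, probabilistic 95th percentile: {p95:.3g}")
# If the point estimate falls below the 95th percentile, the deterministic defaults
# are less conservative than the probabilistic screening level.
```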


1987 ◽  
Vol 26 (06) ◽  
pp. 248-252 ◽  
Author(s):  
M. J. van Eenige ◽  
F. C. Visser ◽  
A. J. P. Karreman ◽  
C. M. B. Duwel ◽  
G. Westera ◽  
...  

Optimal fitting of a myocardial time-activity curve is accomplished with a monoexponential plus a constant, resulting in three parameters: amplitude and half-time of the monoexponential and the constant. The aim of this study was to estimate the precision of the calculated parameters. The variability of the parameter values as a function of the acquisition time was studied in 11 patients with cardiac complaints. Of the three parameters the half-time value varied most strongly with the acquisition time. An acquisition time of 80 min was needed to keep the standard deviation of the half-time value within ±10%. To estimate the standard deviation of the half-time value as a function of the parameter values, of the noise content of the time-activity curve and of the acquisition time, a model experiment was used. In most cases the SD decreased by 50% if the acquisition time was increased from 60 to 90 min. A low amplitude/constant ratio and a high half-time value result in a high SD of the half-time value. Tables are presented to estimate the SD in a particular case.
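
The fitting procedure described here, a monoexponential plus a constant with parameter standard deviations, can be sketched as below. This is not the authors' implementation, and the curve and noise level are simulated rather than patient data.

```python
# A minimal sketch: fit a time-activity curve with A * exp(-ln(2) * t / T_half) + C
# and read the standard deviations of the three parameters (amplitude, half-time,
# constant) from the covariance matrix returned by the fit.
import numpy as np
from scipy.optimize import curve_fit

def model(t, amplitude, t_half, constant):
    return amplitude * np.exp(-np.log(2) * t / t_half) + constant

rng = np.random.default_rng(2)
t = np.linspace(0, 80, 40)                      # acquisition time in minutes
true = (100.0, 25.0, 20.0)                      # amplitude, half-time, constant
counts = model(t, *true) + rng.normal(scale=5.0, size=t.size)  # noisy simulated curve

popt, pcov = curve_fit(model, t, counts, p0=(80.0, 20.0, 10.0))
perr = np.sqrt(np.diag(pcov))                   # SD of amplitude, half-time, constant
print("half-time:", popt[1], "+/-", perr[1])
# Truncating t to a shorter acquisition window inflates the half-time SD, mirroring
# the dependence on acquisition time reported in the study.
```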


Vestnik MEI ◽  
2020 ◽  
Vol 5 (5) ◽  
pp. 132-139
Author(s):  
Ivan E. Kurilenko ◽  
Igor E. Nikonov

A method for classifying short text messages, in the form of customer utterances spoken over an organization's telephone line, is considered. To solve this problem, a classifier was developed that combines two methods: a description of the subject area as a hierarchy of entities, and plausible reasoning based on the case-based reasoning approach, which is actively used in artificial intelligence systems. In various artificial-intelligence-based data analysis problems, these methods have shown a high degree of efficiency, scalability, and independence from data structure. As part of the case-based reasoning approach used in the classifier, it is proposed to modify the TF-IDF (Term Frequency - Inverse Document Frequency) measure of text content so that it takes into account known information about the distribution of documents by topic. The proposed modification improves classification quality compared with the classical measure, since it accounts for the distribution of words not only within a single document or topic but across the entire case base. Experimental results are presented that confirm the effectiveness of the proposed metric and of the developed classifier as applied to classifying customer utterances and providing customers with the necessary information depending on the classification result. The developed text classification service prototype is used as part of a voice interaction module, with the goal of robotizing the telephone call routing system and shifting user-system interaction from button presses to voice.
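
One plausible reading of a topic-aware TF-IDF weighting is sketched below. The abstract does not give the authors' exact formula, so the inverse-topic-frequency term, the function name, and the toy case base here are illustrative assumptions, not their metric.

```python
# A hedged sketch of a topic-aware TF-IDF variant: weight a term by term frequency,
# inverse document frequency over the whole case base, and an additional inverse
# topic frequency (how many topics the term appears in). Hypothetical formula.
import math
from collections import Counter

def topic_aware_tfidf(term, doc_tokens, corpus, topic_of_doc):
    tf = Counter(doc_tokens)[term] / len(doc_tokens)
    n_docs = len(corpus)
    df = sum(1 for d in corpus if term in corpus[d])
    idf = math.log((1 + n_docs) / (1 + df)) + 1
    topics_with_term = {topic_of_doc[d] for d in corpus if term in corpus[d]}
    n_topics = len(set(topic_of_doc.values()))
    itf = math.log((1 + n_topics) / (1 + len(topics_with_term))) + 1  # topic-level analogue of IDF
    return tf * idf * itf

# Toy case base of customer utterances grouped by topic.
corpus = {
    "d1": {"card", "blocked", "help"},
    "d2": {"card", "balance"},
    "d3": {"schedule", "branch"},
}
topic_of_doc = {"d1": "cards", "d2": "cards", "d3": "branches"}
print(topic_aware_tfidf("card", ["card", "blocked", "card"], corpus, topic_of_doc))
```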


Author(s):  
Yaniv Aspis ◽  
Krysia Broda ◽  
Alessandra Russo ◽  
Jorge Lobo

We introduce a novel approach for the computation of stable and supported models of normal logic programs in continuous vector spaces by a gradient-based search method. Specifically, the application of the immediate consequence operator of a program reduct can be computed in a vector space. To do this, Herbrand interpretations of a propositional program are embedded as 0-1 vectors in $\mathbb{R}^N$ and program reducts are represented as matrices in $\mathbb{R}^{N \times N}$. Using these representations we prove that the underlying semantics of a normal logic program is captured through matrix multiplication and a differentiable operation. As supported and stable models of a normal logic program can now be seen as fixed points in a continuous space, non-monotonic deduction can be performed using an optimisation process such as Newton's method. We report the results of several experiments using synthetically generated programs that demonstrate the feasibility of the approach and highlight how different parameter values can affect the behaviour of the system.
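
The core idea, computing one application of the immediate consequence operator as a matrix-vector product over 0-1 interpretation vectors, can be illustrated with a simplified sketch. The encoding below (one rule per head, body atoms weighted 1/m, a constant-true atom for facts) is an assumption for illustration and is not the paper's exact construction.

```python
# A simplified, hedged illustration: one T_P step of a definite program computed as a
# matrix-vector product followed by a threshold. The row for a head with body
# b1, ..., bm holds 1/m in the body columns, so (M @ v)[head] >= 1 exactly when all
# body atoms are true in the interpretation vector v.
import numpy as np

atoms = ["top", "p", "q", "r"]          # "top" is a constant-true atom used for facts
idx = {a: i for i, a in enumerate(atoms)}
N = len(atoms)

# Program: p.   q :- p.   r :- p, q.
M = np.zeros((N, N))
M[idx["top"], idx["top"]] = 1.0                      # top stays true
M[idx["p"], idx["top"]] = 1.0                        # fact p
M[idx["q"], idx["p"]] = 1.0                          # q :- p
M[idx["r"], idx["p"]] = M[idx["r"], idx["q"]] = 0.5  # r :- p, q

def tp(v):
    return (M @ v >= 1.0 - 1e-9).astype(float)       # one T_P step: multiply, then threshold

v = np.zeros(N); v[idx["top"]] = 1.0                 # start from the empty interpretation
for _ in range(N):                                   # iterate to the least fixed point
    v = tp(v)
print(dict(zip(atoms, v)))                           # all atoms end up true
```

In the paper's setting the hard threshold is relaxed to a differentiable operation, so that supported and stable models, viewed as fixed points of this operator on reduct matrices, can be searched for with optimisation methods such as Newton's method.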


2006 ◽  
Vol 41 (1) ◽  
pp. 72-83 ◽  
Author(s):  
Zhe Zhang ◽  
Eric R. Hall

Abstract Parameter estimation and wastewater characterization are crucial for modelling of the membrane enhanced biological phosphorus removal (MEBPR) process. Prior to determining the values of a subset of kinetic and stoichiometric parameters used in ASM No. 2 (ASM2), the carbon, nitrogen and phosphorus fractions of influent wastewater at the University of British Columbia (UBC) pilot plant were characterized. It was found that the UBC wastewater contained fractions of volatile acids (SA), readily fermentable biodegradable COD (SF) and slowly biodegradable COD (XS) that fell within the ASM2 default value ranges. The contents of soluble inert COD (SI) and particulate inert COD (XI) were somewhat higher than the ASM2 default values. Mixed liquor samples from pilot-scale MEBPR and conventional enhanced biological phosphorus removal (CEBPR) processes, operated under parallel conditions, were then analyzed experimentally to assess the impact of operation in a membrane-assisted mode on the growth yield (YH), decay coefficient (bH) and maximum specific growth rate of heterotrophic biomass (µH). The resulting values for YH, bH and µH were slightly lower for the MEBPR train than for the CEBPR train, but the differences were not statistically significant. It is suggested that MEBPR simulation using ASM2 could be accomplished satisfactorily using parameter values determined for a conventional biological phosphorus removal process, if MEBPR parameter values are not available.


1989 ◽  
Vol 21 (4-5) ◽  
pp. 305-314
Author(s):  
J. P. Lumbers ◽  
S. C. Cook ◽  
G. A. Thomas

An application of a dynamic model of the activated sludge process is described within the context of real-time river basin management. The model has been calibrated and validated on independent data and then applied to investigate losses of nitrification at the Mogden Works. Monte Carlo simulation and generalised sensitivity analysis were found to be effective ways of identifying appropriate parameter values and their importance. The prediction of unmeasured states such as the autotroph population enabled the effects of alternative control actions to be better understood and the most suitable measures found.
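
The generalised sensitivity analysis mentioned here can be sketched in the regionalised (behavioural vs non-behavioural) style: sample parameters with Monte Carlo, split the runs by whether they reproduce observed behaviour, and rank parameters by how strongly the two groups' distributions differ. The stand-in model and parameter names below are hypothetical, not the calibrated activated sludge model.

```python
# A hedged sketch of Monte Carlo simulation plus generalised sensitivity analysis
# using a toy nitrification stand-in model and a Kolmogorov-Smirnov separation score.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(3)
n_runs = 5000

# Hypothetical parameters: autotroph maximum growth rate and decay rate.
mu_aut = rng.uniform(0.2, 1.0, n_runs)
b_aut = rng.uniform(0.01, 0.2, n_runs)

# Toy stand-in for effluent ammonia: nitrification deteriorates when growth barely
# exceeds decay (purely illustrative relation).
effluent_nh4 = 20.0 * np.exp(-8.0 * (mu_aut - 4.0 * b_aut)) + rng.normal(0, 0.5, n_runs)

behavioural = effluent_nh4 < 5.0                     # runs matching observed behaviour
for name, values in [("mu_aut", mu_aut), ("b_aut", b_aut)]:
    stat, _ = ks_2samp(values[behavioural], values[~behavioural])
    print(f"{name}: KS separation = {stat:.2f}")     # larger = more influential parameter
```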


2018 ◽  
Vol 21 (6) ◽  
pp. 411-419 ◽  
Author(s):  
Conghua Wang ◽  
Fang Yan ◽  
Yuan Zhang ◽  
Haihong Liu ◽  
Linghai Zhang

Aims and Objective: A large body of experimental evidence indicates that the oscillatory dynamics of p53 regulate cell fate decisions. Moreover, multiple time delays are ubiquitous in gene expression and have been demonstrated to have important consequences for the dynamics of genetic networks. Although delay-driven sustained oscillation in p53-based networks is commonplace, the precise roles of such delays during these processes are not completely known. Method: Herein, an integrated model of the network with five basic components and two time delays is developed. Using the time delays as the bifurcation parameter, the existence of a Hopf bifurcation is established by analyzing the relevant characteristic equations. Moreover, the effects of the time delays are studied, and the expression levels of the main components of the system are compared for different parameter values and time delays. Result and Conclusion: The theoretical results indicate that the transcriptional and translational delays can induce oscillation through a supercritical Hopf bifurcation. More interestingly, the length of these delays can control the amplitude and period of the oscillation. Furthermore, a certain range of model parameter values is essential for oscillation. Finally, the main results are illustrated in detail through numerical simulations.
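
The mechanism analysed here, a delay pushing a steady state through a Hopf bifurcation into sustained oscillation, can be illustrated with a generic one-variable delayed negative-feedback model. This is a simplified stand-in with illustrative parameters, not the authors' five-component p53 network.

```python
# A hedged sketch: Euler integration of dx/dt = beta / (1 + (x(t - tau)/K)^n) - gamma*x.
# A short delay leaves the steady state stable; a longer delay yields sustained
# oscillation, i.e. the delay-induced Hopf bifurcation discussed in the abstract.
import numpy as np

def simulate(tau, beta=10.0, K=1.0, n=4, gamma=1.0, dt=0.01, t_end=100.0):
    steps = int(t_end / dt)
    lag = int(tau / dt)
    x = np.full(steps + lag, 0.5)                 # constant history before t = 0
    for i in range(lag, steps + lag - 1):
        x_delayed = x[i - lag]
        # production repressed by the delayed state, minus first-order decay
        dxdt = beta / (1.0 + (x_delayed / K) ** n) - gamma * x[i]
        x[i + 1] = x[i] + dt * dxdt
    return x[lag:]

for tau in (0.2, 3.0):                            # short vs long transcriptional delay
    tail = simulate(tau)[-2000:]                  # discard the transient
    print(f"tau={tau}: late-time swing = {tail.max() - tail.min():.2f}")
# Small delay: the swing decays toward zero (stable steady state); larger delay: a
# persistent swing whose amplitude and period depend on the delay length.
```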


2020 ◽  
Vol 14 (2) ◽  
pp. 229-233
Author(s):  
Yongbin Zhang

Background: The challenges to nanoporous filtration membranes are small fluxes and low membrane mechanical strengths. Objective: To introduce newly invented nanoporous filtration membranes with complex pores, improved fluxes and improved mechanical strengths, as registered in patents. Methods: The analytical results are presented for the addressed membranes. Results: The geometrical parameter values of the addressed membranes can be optimized for the highest fluxes. Conclusion: The overall performances of nanoporous filtration membranes with complex cylindrical and/or conical pores can be significantly better than those of conventional nanoporous filtration membranes with single cylindrical or conical pores.


Author(s):  
David Hankin ◽  
Michael S. Mohr ◽  
Kenneth B. Newman

We present a rigorous but understandable introduction to the field of sampling theory for ecologists and natural resource scientists. Sampling theory concerns itself with development of procedures for random selection of a subset of units, a sample, from a larger finite population, and with how to best use sample data to make scientifically and statistically sound inferences about the population as a whole. The inferences fall into two broad categories: (a) estimation of simple descriptive population parameters, such as means, totals, or proportions, for variables of interest, and (b) estimation of uncertainty associated with estimated parameter values. Although the targets of estimation are few and simple, estimates of means, totals, or proportions see important and often controversial uses in management of natural resources and in fundamental ecological research, but few ecologists or natural resource scientists have formal training in sampling theory. We emphasize the classical design-based approach to sampling in which variable values associated with units are regarded as fixed and uncertainty of estimation arises via various randomization strategies that may be used to select samples. In addition to covering standard topics such as simple random, systematic, cluster, unequal probability (stressing the generality of Horvitz–Thompson estimation), multi-stage, and multi-phase sampling, we also consider adaptive sampling, spatially balanced sampling, and sampling through time, three areas of special importance for ecologists and natural resource scientists. The text is directed to undergraduate seniors, graduate students, and practicing professionals. Problems emphasize application of the theory and R programming in ecological and natural resource settings.
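
The Horvitz-Thompson estimation stressed in the text can be shown in a minimal sketch: under any design with known inclusion probabilities, the population total is estimated by summing y_i / pi_i over sampled units. The population values, auxiliary variable, and Poisson sampling design below are made up for illustration (the book's own exercises use R).

```python
# A minimal sketch of the Horvitz-Thompson estimator under an unequal-probability
# (Poisson) sampling design with size-related inclusion probabilities.
import numpy as np

rng = np.random.default_rng(4)
x = rng.gamma(shape=2.0, scale=10.0, size=200)       # auxiliary size measure per plot
y = 5.0 * x + rng.normal(scale=10.0, size=200)       # variable of interest (e.g. biomass)

pi = np.clip(0.3 * x / x.mean(), 0.05, 1.0)          # known inclusion probabilities
sampled = rng.random(200) < pi                       # Poisson sampling design

ht_total = np.sum(y[sampled] / pi[sampled])          # Horvitz-Thompson estimate of the total
print(f"true total = {y.sum():.0f}, HT estimate = {ht_total:.0f}")
# Design-based view: the y values are fixed; averaging ht_total over repeated samples
# drawn with these probabilities recovers the true total.
```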

