probabilistic generalization
Recently Published Documents


TOTAL DOCUMENTS: 24 (FIVE YEARS: 4)

H-INDEX: 7 (FIVE YEARS: 1)

2021 ◽  
Author(s):  
Santosh Manicka ◽  
Kathleen Johnson ◽  
David Murrugarra ◽  
Michael Levin

Nonlinearity is a characteristic of complex biological regulatory networks that has implications ranging from therapy to control. To better understand its nature, we analyzed a suite of published Boolean network models, containing a variety of complex nonlinear interactions, using an approach based on a probabilistic generalization of Boolean logic that George Boole himself had proposed. Leveraging the continuous nature of this formulation with Taylor-decomposition methods revealed the distinct layers of nonlinearity in the models. A comparison of the resulting series of model approximations with the corresponding sets of randomized ensembles furthermore revealed that the biological networks are relatively more linearly approximable. We hypothesize that this is a result of optimization by natural selection for the purpose of controllability.
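The idea can be sketched concretely. In Boole's probabilistic reading of logic, truth values become probabilities of truth, and (under independence) each gate becomes a multilinear polynomial; Taylor-expanding that polynomial then separates its linear and nonlinear layers. The sketch below is an illustration of this general technique, not the authors' code:

```python
# Multilinear (probabilistic) extensions of Boolean gates: inputs are
# probabilities of truth, assumed independent.
def NOT(x): return 1.0 - x
def AND(x, y): return x * y
def OR(x, y): return x + y - x * y
def XOR(x, y): return x + y - 2.0 * x * y

def linearize(f, h=1e-6):
    """First-order Taylor approximation of a 2-input gate about the
    unbiased point (0.5, 0.5), via central finite differences."""
    f0 = f(0.5, 0.5)
    dfx = (f(0.5 + h, 0.5) - f(0.5 - h, 0.5)) / (2 * h)
    dfy = (f(0.5, 0.5 + h) - f(0.5, 0.5 - h)) / (2 * h)
    return lambda x, y: f0 + dfx * (x - 0.5) + dfy * (y - 0.5)

lin_and = linearize(AND)
lin_xor = linearize(XOR)
print(round(lin_and(1.0, 1.0), 3))  # 0.75: AND is partly captured linearly
print(round(lin_xor(1.0, 1.0), 3))  # 0.5: XOR's linear term vanishes
```

The contrast between AND and XOR illustrates what "layers of nonlinearity" means here: XOR's linear Taylor term is identically zero, so no linear approximation improves on a constant.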


Author(s):  
Tsuyoshi Idé

This paper proposes a new method for change detection and analysis using tensor regression. In our setting, change detection means detecting changes in the relationship between the input tensor and the output scalar, while change analysis means computing the responsibility score of individual tensor modes and dimensions for the detected change. We develop a new probabilistic tensor regression method, which can be viewed as a probabilistic generalization of the alternating least squares algorithm. Thanks to the probabilistic formulation, the derived change scores have a clear information-theoretic interpretation. We apply our method to semiconductor manufacturing to demonstrate its utility. To the best of our knowledge, this is the first work on change analysis based on probabilistic tensor regression.
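To make the alternating-least-squares structure concrete, here is a minimal sketch (not the paper's algorithm, and with made-up synthetic data): a rank-1 matrix regression y ≈ ⟨X, u vᵀ⟩, where fixing one factor makes the model linear in the other, so each factor can be updated by an ordinary least-squares solve:

```python
import numpy as np

# Synthetic rank-1 regression problem: y_n = sum_ij X_n[i,j] * u[i] * v[j]
rng = np.random.default_rng(0)
I, J, N = 4, 3, 200
u_true, v_true = rng.normal(size=I), rng.normal(size=J)
X = rng.normal(size=(N, I, J))                  # N input matrices
y = np.einsum('nij,i,j->n', X, u_true, v_true)  # noiseless outputs

u, v = np.ones(I), np.ones(J)
for _ in range(200):
    A = np.einsum('nij,j->ni', X, v)   # design matrix for u, given v
    u = np.linalg.lstsq(A, y, rcond=None)[0]
    B = np.einsum('nij,i->nj', X, u)   # design matrix for v, given u
    v = np.linalg.lstsq(B, y, rcond=None)[0]

resid = np.linalg.norm(np.einsum('nij,i,j->n', X, u, v) - y)
print(resid)  # near zero on this noiseless rank-1 problem
```

The paper's contribution, as the abstract describes it, is to replace these point-estimate solves with a probabilistic formulation, so that change scores inherit an information-theoretic reading.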


2018 ◽  
Vol 10 (9) ◽  
pp. 3051 ◽  
Author(s):  
Xingji Zhu ◽  
Zaixian Chen ◽  
Hao Wang ◽  
Yabin Chen ◽  
Longjun Xu

In some extreme corrosion environments, the erosion caused by chloride ions and carbon dioxide can occur simultaneously, deteriorating reinforced concrete (RC) structures. This study presents a probabilistic model for the sustainability prediction of the service life of RC structures, taking this combined deterioration into account. Because of the high computational cost, we also present a series of simplifications to improve the model. A semi-empirical method is likewise developed for this combined effect. Through probabilistic generalization, this simplified method can swiftly handle the original reliability analysis, which would otherwise need to be based on large amounts of data. A comparison of results obtained by the models with and without the above simplifications supports the significance of these improvements.
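The kind of probabilistic service-life analysis described here is typically carried out by Monte Carlo simulation over a limit state. The sketch below is only a generic illustration of that workflow, not the paper's model: it uses the standard Fick's-law solution for chloride-induced corrosion initiation, with hypothetical parameter values, and treats cover depth and diffusion coefficient as random:

```python
import math, random

random.seed(1)

def erfinv(y, lo=-6.0, hi=6.0):
    """Invert math.erf by bisection (erf is strictly increasing)."""
    for _ in range(80):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if math.erf(mid) < y else (lo, mid)
    return (lo + hi) / 2

def init_time(c_mm, D_mm2_per_yr, Cs=0.6, Ccr=0.05):
    """Corrosion initiation time from Fick's second law:
    t_i = c^2 / (4 D erfinv(1 - Ccr/Cs)^2). Cs, Ccr are surface and
    critical chloride contents (hypothetical values)."""
    z = erfinv(1.0 - Ccr / Cs)
    return c_mm**2 / (4.0 * D_mm2_per_yr * z * z)

def p_failure(t_years, n=20000):
    """Monte Carlo estimate of P(initiation before t_years)."""
    fails = 0
    for _ in range(n):
        c = random.gauss(50.0, 8.0)                      # cover depth, mm
        D = random.lognormvariate(math.log(20.0), 0.4)   # mm^2/year
        if init_time(max(c, 1.0), D) < t_years:
            fails += 1
    return fails / n

print(p_failure(50.0))
```

The paper's simplifications address exactly the pain point visible here: each probability estimate requires many repeated evaluations of the deterioration model, which becomes expensive once chloride and carbonation interact.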


2016 ◽  
Vol 49 (4) ◽  
Author(s):  
Abderrahim Mbarki ◽  
Rachid Naciri

Abstract We give a probabilistic generalization of the theory of generalized metric spaces [2]. We then prove a fixed point theorem for a self-mapping of a probabilistic generalized metric space satisfying a very general nonlinear contraction condition, without the assumption that the space is Hausdorff.
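For orientation, here is the standard Menger-style axiom that probabilistic metric theories of this kind build on (a textbook definition, not reproduced from the paper): distances become distribution functions, and the triangle inequality is stated through a t-norm.

```latex
% In a Menger probabilistic metric space $(S, F, T)$, the distance between
% points $p$ and $q$ is a distribution function $F_{p,q}$, and the triangle
% inequality is expressed via a t-norm $T$:
\[
  F_{p,r}(s + t) \;\ge\; T\bigl(F_{p,q}(s),\, F_{q,r}(t)\bigr)
  \qquad \text{for all } s, t \ge 0 .
\]
```

In a generalized metric space in the sense of [2], the three-point triangle inequality is replaced by a four-point inequality passing through two intermediate points; the paper carries that weaker structure over to distribution-valued distances.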


2015 ◽  
Vol 30 (2) ◽  
pp. 261-280 ◽  
Author(s):  
Antonio Di Crescenzo ◽  
Barbara Martinucci ◽  
Julio Mulero

For non-negative random variables with finite means we introduce an analog of the equilibrium residual-lifetime distribution based on the quantile function. This allows us to construct new distributions with support (0, 1), and to obtain a new quantile-based version of the probabilistic generalization of Taylor's theorem. Similarly, for pairs of stochastically ordered random variables we obtain a new quantile-based form of the probabilistic mean value theorem. The latter involves a distribution that generalizes the Lorenz curve. We investigate the special case of proportional quantile functions and apply the given results to various models based on classes of distributions and measures of risk theory. Motivated by some stochastic comparisons, we also introduce the "expected reversed proportional shortfall order", and a new characterization of random lifetimes involving the reversed hazard rate function.
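The classical (non-quantile) results that this paper generalizes can be stated briefly for orientation; these are standard forms, not the paper's quantile-based versions.

```latex
% The equilibrium distribution of a non-negative $X$ with finite mean has
% density
\[
  f_e(x) \;=\; \frac{\Pr(X > x)}{\mathbb{E}[X]}, \qquad x \ge 0 ,
\]
% and the probabilistic generalization of Taylor's theorem reads
\[
  \mathbb{E}[g(X)] \;=\; g(0) \;+\; \mathbb{E}[X]\,\mathbb{E}[g'(X_e)] .
\]
% For $X \le_{\mathrm{st}} Y$, the probabilistic mean value theorem states
\[
  \mathbb{E}[g(Y)] - \mathbb{E}[g(X)]
  \;=\; \bigl(\mathbb{E}[Y]-\mathbb{E}[X]\bigr)\,\mathbb{E}[g'(Z)],
  \qquad
  f_Z(z) \;=\; \frac{\Pr(Y>z)-\Pr(X>z)}{\mathbb{E}[Y]-\mathbb{E}[X]} .
\]
```

The paper's contribution is to restate these identities through quantile functions, which yields distributions supported on (0, 1) and connects the mean value theorem to a generalization of the Lorenz curve.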


2014 ◽  
Vol 116 (4) ◽  
pp. 1-28
Author(s):  
Kadriye Ercikan ◽  
Wolff-Michael Roth

Context Generalization is a critical concept in all research designed to generate knowledge that applies to all elements of a unit (population) while studying only a subset of these elements (sample). Commonly applied criteria for generalizing focus on experimental design or on how representative the sample is of the population. These criteria tend to neglect population diversity and the targeted uses of the knowledge generated.

Objectives This article has two connected purposes: (a) to articulate the structure and discuss the limitations of different forms of generalization across the spectrum of quantitative and qualitative research, and (b) to argue for considering population heterogeneity and future uses of knowledge claims when judging the appropriateness of generalizations.

Research Design In the first part of the paper, we present two forms of generalization that rely on statistical analysis of between-group variation: analytic and probabilistic generalization. We then describe a third form, essentialist generalization, which moves from the particular to the general in small-sample studies, and we discuss the limitations of each kind. In the second part of the paper, we propose two additional criteria for evaluating the validity of evidence based on generalizations from education research: population heterogeneity and future use of knowledge claims.

Conclusions/Recommendations The proposed criticisms of research generalizations have implications for how research is conducted and how research findings are summarized. The main limitation of analytic generalization is that it does not provide evidence of a causal link for subgroups or individuals. In addition to making explicit the uses that knowledge claims may target, some changes in how research is conducted are needed: demonstrating the mechanisms of causality; describing intervention outcomes as positive, negative, or neutral; and accompanying latent class analysis with discriminant analysis. The main criticism of probabilistic generalization is that it may not apply to subgroups and may have limited value for guiding policy and practice, which highlights a need to define grouping variables by the intended uses of knowledge claims. With respect to essentialist generalization, too few qualitative studies currently attempt to identify invariants that hold across the range of relevant situations. There is a need to study the ways in which a kind of phenomenon is produced, which would allow researchers to understand the various ways in which it manifests itself.


2013 ◽  
Vol 13 (4-5) ◽  
pp. 769-781 ◽  
Author(s):  
JON SNEYERS ◽  
DANNY DE SCHREYE ◽  
THOM FRÜHWIRTH

Abstract Riveret et al. have proposed a framework for probabilistic legal reasoning. Their goal is to determine the chance of winning a court case, given the probabilities of the judge accepting certain claimed facts and legal rules. In this paper we tackle the same problem by defining and implementing a new formalism, called probabilistic argumentation logic, which can be seen as a probabilistic generalization of Nute's defeasible logic. Not only does this automate the computations in Riveret et al., which were previously performed only by hand; it also provides a solution to one of their open problems: a method to determine the initial probabilities from a given body of precedents.
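The underlying computation can be illustrated with a toy sketch (not the paper's formalism): if each uncertain premise is independently accepted by the judge with a given probability, the chance of winning is the total probability of the "worlds" in which the claim is defeasibly derivable. All names and numbers below are hypothetical:

```python
from itertools import product

# Hypothetical premises and the judge's acceptance probabilities.
premises = {'fact_contract_signed': 0.9,
            'rule_signature_binds': 0.7,
            'fact_duress': 0.2}

def wins(accepted):
    # Hypothetical defeasible structure: the claim holds if the supporting
    # fact and rule are accepted AND the defeating fact (duress) is not.
    return ('fact_contract_signed' in accepted
            and 'rule_signature_binds' in accepted
            and 'fact_duress' not in accepted)

names = list(premises)
p_win = 0.0
for bits in product([False, True], repeat=len(names)):
    accepted = {n for n, b in zip(names, bits) if b}
    p_world = 1.0
    for n, b in zip(names, bits):
        p_world *= premises[n] if b else 1.0 - premises[n]
    if wins(accepted):
        p_win += p_world

print(round(p_win, 4))  # 0.9 * 0.7 * (1 - 0.2) = 0.504
```

Brute-force enumeration is exponential in the number of uncertain premises; a formalism like the paper's replaces the hand-enumeration of worlds with an automated inference procedure over the argumentation structure.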


2012 ◽  
Vol 38 (5) ◽  
pp. 219-230 ◽  
Author(s):  
E. E. Vityaev ◽  
A. V. Demin ◽  
D. K. Ponomaryov
