Probably Approximately Correct Search

Author(s):  
Ingemar J. Cox ◽  
Ruoxun Fu ◽  
Lars Kai Hansen

Entropy ◽  
2021 ◽  
Vol 23 (3) ◽  
pp. 313
Author(s):  
Imon Banerjee ◽  
Vinayak A. Rao ◽  
Harsha Honnappa

Datasets displaying temporal dependencies abound in science and engineering applications, with Markov models representing a simplified and popular view of the temporal dependence structure. In this paper, we consider Bayesian settings that place prior distributions over the parameters of the transition kernel of a Markov model, and seek to characterize the resulting, typically intractable, posterior distributions. We present a Probably Approximately Correct (PAC)-Bayesian analysis of variational Bayes (VB) approximations to tempered Bayesian posterior distributions, bounding the model risk of the VB approximations. Tempered posteriors are known to be robust to model misspecification, and their variational approximations do not suffer the usual problems of overconfident approximations. Our results tie the risk bounds to the mixing and ergodic properties of the Markov data-generating model. We illustrate the PAC-Bayes bounds through a number of example Markov models, and also consider the situation where the Markov model is misspecified.
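As a concrete illustration of a tempered posterior over the transition kernel of a Markov model, the sketch below assumes a conjugate Dirichlet prior on each row of the kernel, so that tempering the likelihood amounts to scaling the transition counts. The function name and the conjugate setting are illustrative assumptions, not the paper's general variational construction or bounds.

```python
import numpy as np

def tempered_dirichlet_posterior(seq, n_states, alpha=0.5, prior=1.0):
    """Row-wise Dirichlet parameters of a tempered posterior over a Markov
    transition kernel (Dirichlet prior + multinomial transition counts).

    Tempering raises the likelihood to the power alpha, which in this
    conjugate sketch simply scales the observed transition counts.
    """
    counts = np.zeros((n_states, n_states))
    for s, t in zip(seq[:-1], seq[1:]):
        counts[s, t] += 1
    return prior + alpha * counts  # one Dirichlet parameter vector per row

# Toy usage: a short two-state chain
seq = [0, 0, 1, 0, 1, 1, 1, 0]
params = tempered_dirichlet_posterior(seq, n_states=2, alpha=0.5)
posterior_mean_kernel = params / params.sum(axis=1, keepdims=True)
print(posterior_mean_kernel)
```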


2014 ◽  
Vol 61 (10) ◽  
pp. 1222
Author(s):  
Marcus Feldman

1996 ◽  
Vol 8 (3) ◽  
pp. 625-628 ◽  
Author(s):  
Peter L. Bartlett ◽  
Robert C. Williamson

We give upper bounds on the Vapnik-Chervonenkis dimension and pseudodimension of two-layer neural networks that use the standard sigmoid function or radial basis functions and have inputs from {−D, …, D}^n. In Valiant's probably approximately correct (PAC) learning framework for pattern classification, and in Haussler's generalization of this framework to nonlinear regression, the results imply that the number of training examples necessary for satisfactory learning performance grows no more rapidly than W log(WD), where W is the number of weights. The previous best bound for these networks was O(W^4).
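The following short sketch only contrasts the growth rates quoted in the abstract, W log(WD) versus the earlier O(W^4), for a few network sizes; the constants and the dependence on accuracy and confidence parameters are deliberately omitted, and the values of W and D are arbitrary examples.

```python
import math

def pac_growth(W, D):
    """Growth rate W * log(W * D) from the stated sample-complexity bound;
    constants and accuracy/confidence terms are omitted."""
    return W * math.log(W * D)

# Compare the W log(WD) rate with the previous O(W^4) rate for a few sizes
for W in (10, 100, 1000):
    print(W, round(pac_growth(W, D=8)), W ** 4)
```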

