Conditional Deep Gaussian Processes: Empirical Bayes Hyperdata Learning

Entropy ◽  
2021 ◽  
Vol 23 (11) ◽  
pp. 1387
Author(s):  
Chi-Ken Lu ◽  
Patrick Shafto

It is desirable to combine the expressive power of deep learning with Gaussian processes (GPs) in one Bayesian learning model. Deep kernel learning showed success by using a deep network for feature extraction and a GP as the function model. Recently, it was suggested that, despite training with the marginal likelihood, the deterministic nature of the feature extractor might lead to overfitting, and that replacing it with a Bayesian network seemed to cure the problem. Here, we propose the conditional deep Gaussian process (DGP), in which the intermediate GPs in the hierarchical composition are supported by hyperdata while the exposed GP remains zero mean. Motivated by the inducing points in sparse GPs, the hyperdata also play the role of function supports, but are hyperparameters rather than random variables. Following our previous moment matching approach, the marginal prior of the conditional DGP is approximated by a GP carrying an effective kernel. Thus, as in empirical Bayes, the hyperdata are learned by optimizing the approximate marginal likelihood, which depends on the hyperdata implicitly through the kernel. We show equivalence with deep kernel learning in the limit of dense hyperdata in latent space. However, the conditional DGP and the corresponding approximate inference enjoy the benefit of being more Bayesian than deep kernel learning. Preliminary extrapolation results demonstrate expressive power from the depth of the hierarchy by exploiting the exact covariance and hyperdata learning, in comparison with GP kernel composition, DGP variational inference, and deep kernel learning. We also address the non-Gaussian aspect of our model as well as a way of upgrading to full Bayesian inference.
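For intuition, the moment matching step can be illustrated with squared-exponential kernels in one dimension: conditioning the inner GP on hyperdata gives a Gaussian conditional for g(x), and averaging the outer kernel over that conditional yields an effective kernel for the composition f(g(x)). The sketch below is a generic illustration under these assumptions (scalar inputs, SE kernels, hypothetical hyperdata U and z), not the paper's exact construction or parameterization.

```python
import numpy as np

def se(a, b, ell=1.0, var=1.0):
    """Squared-exponential kernel on scalar inputs."""
    return var * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def effective_kernel(x, U, z, ell_in=1.0, ell_out=1.0, var_out=1.0, jitter=1e-8):
    """Moment-matched effective kernel of a two-layer composition f(g(x)):
    the inner GP g is conditioned on hyperdata (U, z); the outer SE kernel is
    averaged over the Gaussian conditional of g.  A sketch only -- details
    and parameterization differ from the paper."""
    Kuu = se(U, U, ell_in) + jitter * np.eye(len(U))
    Kxu = se(x, U, ell_in)
    A = np.linalg.solve(Kuu, Kxu.T).T                 # K_xu K_uu^{-1}
    m = A @ z                                         # conditional mean of g at x
    C = se(x, x, ell_in) - A @ Kxu.T                  # conditional covariance of g at x
    c = np.diag(C)
    s2 = c[:, None] + c[None, :] - 2 * C              # Var[g(x) - g(x')]
    dm2 = (m[:, None] - m[None, :]) ** 2              # (E[g(x)] - E[g(x')])^2
    return (var_out * ell_out / np.sqrt(ell_out ** 2 + s2)
            * np.exp(-0.5 * dm2 / (ell_out ** 2 + s2)))

# Hypothetical hyperdata: five latent support points and their values
U = np.linspace(-2, 2, 5)
z = np.tanh(U)
x = np.linspace(-3, 3, 50)
K_eff = effective_kernel(x, U, z)
```

In an empirical Bayes treatment, K_eff plus a noise term would enter the usual GP log marginal likelihood, which is then optimized over the hyperdata (U, z) and the kernel hyperparameters.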

2020 ◽  
Vol 0 (0) ◽  
Author(s):  
Fumin Zhu ◽  
Michele Leonardo Bianchi ◽  
Young Shin Kim ◽  
Frank J. Fabozzi ◽  
Hengyu Wu

This paper studies the option valuation problem of non-Gaussian and asymmetric GARCH models from a state-space structure perspective. Assuming that innovations follow an infinitely divisible distribution, we apply different estimation methods, including filtering and learning approaches. We then investigate the performance in pricing S&P 500 index short-term options after obtaining a proper change of measure. We find that the sequential Bayesian learning approach (SBLA) significantly and robustly decreases the option pricing errors. Our theoretical and empirical findings also suggest that, when stock returns are non-Gaussian distributed, their innovations under the risk-neutral measure may present more non-normality, exhibit higher volatility, and have a stronger leverage effect than under the physical measure.
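As a point of reference for the state-space view, the sketch below simulates a plain GARCH(1,1) return path with Gaussian innovations; the paper's models instead use asymmetric dynamics and infinitely divisible (non-Gaussian) innovations, and the filtering and learning estimators operate on the latent conditional variance. Parameter values here are illustrative assumptions only.

```python
import numpy as np

def simulate_garch(n, omega=1e-6, alpha=0.08, beta=0.90, mu=0.0, seed=0):
    """Simulate a plain GARCH(1,1) return path with Gaussian innovations.
    The latent conditional variance h_t is the hidden state that a
    filtering/learning approach would estimate from observed returns."""
    rng = np.random.default_rng(seed)
    h = np.empty(n)                              # conditional variances
    r = np.empty(n)                              # returns
    h[0] = omega / (1.0 - alpha - beta)          # start at the unconditional variance
    for t in range(n):
        z = rng.standard_normal()
        r[t] = mu + np.sqrt(h[t]) * z
        if t + 1 < n:
            h[t + 1] = omega + alpha * (r[t] - mu) ** 2 + beta * h[t]
    return r, h

returns, variances = simulate_garch(1000)
```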


2021 ◽  
pp. 1-13
Author(s):  
Haitao Liu ◽  
Yew-Soon Ong ◽  
Ziwei Yu ◽  
Jianfei Cai ◽  
Xiaobo Shen

2011 ◽  
Vol 328-330 ◽  
pp. 524-529
Author(s):  
Jun Yan Ma ◽  
Xiao Ping Liao ◽  
Wei Xia ◽  
Xue Lian Yan

As a powerful modeling tool, the Gaussian process (GP) employs a Bayesian statistical approach and a highly nonlinear regression technique for general scientific and engineering tasks. The first step in constructing a Gaussian process model is to estimate the best values of the hyperparameters, which are then used in the second step, where a satisfactory nonlinear model is fitted. In this paper, a modified Wolfe line search approach for hyperparameter estimation, which maximizes the marginal likelihood with a conjugate gradient method, is proposed. We then analyze parameter correlations based on the estimated hyperparameters in order to control warpage, a main defect of thin-shell parts in injection molding.
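As a rough illustration of the optimization scheme (not the authors' modified line search), the sketch below maximizes a GP log marginal likelihood over a log-lengthscale with a Polak-Ribiere conjugate gradient loop that uses SciPy's strong Wolfe line search; the SE kernel, toy data, and fallback step are assumptions made for the example.

```python
import numpy as np
from scipy.optimize import line_search

def nll(theta, X, y, noise=1e-2):
    """Negative log marginal likelihood of a zero-mean GP with an SE kernel;
    theta[0] is the log-lengthscale."""
    ell = np.exp(theta[0])
    K = np.exp(-0.5 * (X[:, None] - X[None, :]) ** 2 / ell ** 2) + noise * np.eye(len(X))
    L = np.linalg.cholesky(K)
    a = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ a + np.log(np.diag(L)).sum() + 0.5 * len(y) * np.log(2 * np.pi)

def nll_grad(theta, X, y, noise=1e-2):
    """Analytic gradient of nll with respect to the log-lengthscale."""
    ell = np.exp(theta[0])
    d2 = (X[:, None] - X[None, :]) ** 2
    K = np.exp(-0.5 * d2 / ell ** 2) + noise * np.eye(len(X))
    Kinv = np.linalg.inv(K)
    a = Kinv @ y
    dK = np.exp(-0.5 * d2 / ell ** 2) * d2 / ell ** 2    # dK / d(log ell)
    return np.array([0.5 * np.trace(Kinv @ dK) - 0.5 * a @ dK @ a])

def cg_wolfe(fun, grad, x0, iters=30, tol=1e-6):
    """Polak-Ribiere nonlinear conjugate gradient with a strong Wolfe line search."""
    x, g = x0.copy(), grad(x0)
    d = -g
    for _ in range(iters):
        step = line_search(fun, grad, x, d, gfk=g)[0]
        if step is None:                    # line search failed: steepest-descent fallback
            step, d = 1e-3, -g
        x_new = x + step * d
        g_new = grad(x_new)
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))   # PR+ update
        d = -g_new + beta * d
        x, g = x_new, g_new
        if np.linalg.norm(g) < tol:
            break
    return x

X = np.linspace(-3, 3, 30)
y = np.sin(X) + 0.1 * np.random.default_rng(1).standard_normal(30)
theta = cg_wolfe(lambda t: nll(t, X, y), lambda t: nll_grad(t, X, y), np.array([0.0]))
print("learned lengthscale:", np.exp(theta[0]))
```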


2015 ◽  
Vol 2015 ◽  
pp. 1-5
Author(s):  
Naiyi Li ◽  
Yuan Li ◽  
Yongming Li ◽  
Yang Liu

This research is based on ranked set sampling. Through analysis and proof, the empirical Bayes test rule and its asymptotic property for the parameter of the power distribution are obtained.
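For concreteness, the sketch below only shows how a ranked set sample is drawn from a power-function distribution with F(x) = x^theta on (0, 1) under perfect ranking; the empirical Bayes test rule and the asymptotic analysis of the paper are not reproduced here, and the distributional form is an assumption for the example.

```python
import numpy as np

def power_rvs(theta, size, rng):
    """Power-function distribution on (0, 1): F(x) = x**theta, sampled by inversion."""
    return rng.random(size) ** (1.0 / theta)

def ranked_set_sample(theta, n, cycles=1, seed=0):
    """One ranked set sample: for each rank i, draw a set of n units, sort it,
    and keep the i-th order statistic (perfect ranking assumed)."""
    rng = np.random.default_rng(seed)
    sample = []
    for _ in range(cycles):
        for i in range(n):
            ordered = np.sort(power_rvs(theta, n, rng))
            sample.append(ordered[i])
    return np.array(sample)

rss = ranked_set_sample(theta=2.0, n=5, cycles=4)
```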


Author(s):  
Hau-Tieng Wu ◽  
Tze Leung Lai ◽  
Gabriel G. Haddad ◽  
Alysson Muotri

Herein we describe new frontiers in mathematical modeling and statistical analysis of oscillatory biomedical signals, motivated by our recent studies of network formation in the human brain during the early stages of life and by studies forty years ago on cardiorespiratory patterns during sleep in infants and animal models. The frontiers involve new nonlinear-type time–frequency analysis of signals with multiple oscillatory components, and efficient particle filters for joint state and parameter estimation, together with uncertainty quantification, in hidden Markov models and empirical Bayes inference.
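As a minimal illustration of the filtering side, the sketch below runs a bootstrap particle filter on a toy linear-Gaussian hidden Markov model; joint state-and-parameter estimation as described in the abstract would augment the particles with the unknown parameters (or use schemes such as particle MCMC), and the model and settings here are assumptions for the example.

```python
import numpy as np

def bootstrap_particle_filter(y, n_particles=500, phi=0.9, q=0.5, r=1.0, seed=0):
    """Bootstrap particle filter for the hidden Markov model
    x_t = phi * x_{t-1} + N(0, q^2),  y_t = x_t + N(0, r^2).
    Returns the filtered posterior means of the latent state."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n_particles)                   # initial particles
    means = []
    for obs in y:
        x = phi * x + q * rng.standard_normal(n_particles)  # propagate
        logw = -0.5 * ((obs - x) / r) ** 2                   # Gaussian observation likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()                                         # normalized importance weights
        means.append(w @ x)                                  # filtered mean
        x = rng.choice(x, size=n_particles, p=w)             # multinomial resampling
    return np.array(means)

# Simulate data from the same model and filter it
rng = np.random.default_rng(1)
xs = np.zeros(100)
for t in range(1, 100):
    xs[t] = 0.9 * xs[t - 1] + 0.5 * rng.standard_normal()
ys = xs + rng.standard_normal(100)
est = bootstrap_particle_filter(ys)
```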


Genetics ◽  
2007 ◽  
Vol 177 (2) ◽  
pp. 861-873 ◽  
Author(s):  
Shuichi Kitada ◽  
Toshihide Kitakado ◽  
Hirohisa Kishino
