Posterior concentration and fast convergence rates for generalized Bayesian learning
2020 · Vol 538 · pp. 372-383
Author(s): Lam Si Tung Ho, Binh T. Nguyen, Vu Dinh, Duy Nguyen
2017 · Vol 62 (11) · pp. 5538-5553
Author(s): Angelia Nedic, Alex Olshevsky, Cesar A. Uribe

2013 · Vol 29 (4) · pp. 838-856
Author(s): Minjing Tao, Yazhen Wang, Xiaohong Chen

Financial applications often require estimating the integrated volatility matrix of a large number of assets from noisy high-frequency data. Many existing estimators, designed for volatility matrices of small dimensions, become inconsistent when the size of the matrix is close to or larger than the sample size. This paper introduces a new type of large volatility matrix estimator based on non-synchronized high-frequency data, allowing for the presence of microstructure noise. We show that when both the number of assets and the sample size go to infinity, the new estimator is consistent and achieves a fast convergence rate that is optimal with respect to the sample size. A simulation study checks the finite-sample performance of the proposed estimator.
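To make the setting concrete, here is a minimal sketch of the common estimate-then-threshold recipe for large volatility matrices: form a realized covariance matrix from high-frequency log returns, then hard-threshold small off-diagonal entries so consistency survives when the number of assets grows with the sample size. This is not the authors' estimator; it assumes synchronized, noise-free observations, which the paper explicitly does not, and the function names and threshold constant are illustrative only.

    import numpy as np

    def realized_covariance(log_prices):
        # High-frequency log returns: differences of log prices along time.
        returns = np.diff(log_prices, axis=0)
        # Realized covariance: sum of outer products r_t r_t' over intervals.
        return returns.T @ returns

    def hard_threshold(cov, tau):
        # Zero out off-diagonal entries below tau in magnitude (sparsity step).
        keep = (np.abs(cov) >= tau) | np.eye(cov.shape[0], dtype=bool)
        return np.where(keep, cov, 0.0)

    # Toy data: p = 50 assets, n = 1000 intervals, independent assets,
    # so the true integrated covariance matrix is diagonal.
    rng = np.random.default_rng(0)
    n, p, sigma = 1000, 50, 0.01
    log_prices = np.cumsum(rng.normal(scale=sigma, size=(n + 1, p)), axis=0)

    rc = realized_covariance(log_prices)
    # Threshold of order sqrt(log p / n), the rate that typically drives
    # consistency as p grows with n (constant chosen by eye for this toy).
    tau = n * sigma**2 * np.sqrt(np.log(p) / n)
    est = hard_threshold(rc, tau)
    print(np.count_nonzero(est - np.diag(np.diag(est))))  # surviving off-diagonals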


Algorithmica · 1999 · Vol 23 (4) · pp. 363-373
Author(s): J. Belanger, A. Pavan, J. Wang

Author(s): Bin Gu, Wenhan Xian, Heng Huang

Asynchronous parallel stochastic optimization for non-convex problems has become increasingly important in machine learning, especially with the popularity of deep learning. The Frank-Wolfe (a.k.a. conditional gradient) algorithm has regained much interest because of its projection-free property and its ability to handle structured constraints. However, our understanding of asynchronous stochastic Frank-Wolfe algorithms is extremely limited, especially in the non-convex setting. To address this problem, in this paper we propose an asynchronous stochastic Frank-Wolfe algorithm (AsySFW) and its variance-reduced version (AsySVFW) for solving constrained non-convex optimization problems. More importantly, we prove fast convergence rates for AsySFW and AsySVFW in the non-convex setting. To the best of our knowledge, AsySFW and AsySVFW are the first asynchronous parallel stochastic algorithms with convergence guarantees for solving constrained non-convex optimization problems. Experimental results on real high-dimensional gray-scale images not only confirm the fast convergence of our algorithms, but also show a near-linear speedup on a shared-memory parallel system thanks to the lock-free implementation.
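As a rough illustration of the projection-free mechanics the abstract relies on, the following is a serial, single-threaded sketch of stochastic Frank-Wolfe over an l1-ball constraint. It omits the asynchronous lock-free scheduling and the variance reduction that define AsySFW and AsySVFW, and all names (stochastic_fw, lmo_l1, the toy objective) are illustrative rather than taken from the paper.

    import numpy as np

    def lmo_l1(grad, radius):
        # Linear minimization oracle for the l1-ball of the given radius:
        # argmin over ||s||_1 <= radius of <grad, s> is a signed, scaled
        # basis vector at the largest-magnitude gradient coordinate.
        i = int(np.argmax(np.abs(grad)))
        s = np.zeros_like(grad)
        s[i] = -radius * np.sign(grad[i])
        return s

    def stochastic_fw(stoch_grad, x0, radius, steps, batch=16):
        # Projection-free loop: average a mini-batch of stochastic gradients,
        # call the oracle, and move by a convex combination (no projection).
        x = x0.astype(float).copy()
        for t in range(steps):
            g = np.mean([stoch_grad(x) for _ in range(batch)], axis=0)
            s = lmo_l1(g, radius)
            gamma = 2.0 / (t + 2)  # classic diminishing step size
            x = (1.0 - gamma) * x + gamma * s
        return x

    # Toy non-convex objective f(x) = 2*sum(sin(x)) + 0.5*||x||^2
    # (its Hessian diagonal 1 - 2*sin(x) changes sign), with gradient noise.
    rng = np.random.default_rng(1)
    def noisy_grad(x):
        return 2.0 * np.cos(x) + x + rng.normal(scale=0.1, size=x.shape)

    x_hat = stochastic_fw(noisy_grad, x0=np.zeros(10), radius=5.0, steps=200)
    print(np.round(x_hat, 3), np.sum(np.abs(x_hat)))  # iterate stays in the ball

Because every iterate is a convex combination of points inside the constraint set, the loop never needs a projection step, which is the property that makes the lock-free parallel variants attractive for structured constraints.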

