Minimax-rate adaptive nonparametric regression with unknown correlations of errors
2019 ◽ Vol 62 (2) ◽ pp. 227-244
Author(s): Guowu Yang ◽ Yuhong Yang

Biometrika ◽ 2018 ◽ Vol 106 (1) ◽ pp. 87-107
Author(s): Asad Haris ◽ Ali Shojaie ◽ Noah Simon

SUMMARY We consider the problem of nonparametric regression with a potentially large number of covariates. We propose a convex, penalized estimation framework that is particularly well suited to high-dimensional sparse additive models and combines the appealing features of finite basis representation and smoothing penalties. In the case of additive models, a finite basis representation provides a parsimonious representation for fitted functions but is not adaptive when component functions possess different levels of complexity. In contrast, a smoothing spline-type penalty on the component functions is adaptive but does not provide a parsimonious representation. Our proposal simultaneously achieves parsimony and adaptivity in a computationally efficient way. We demonstrate these properties through empirical studies and show that our estimator converges at the minimax rate for functions within a hierarchical class. We further establish minimax rates for a large class of sparse additive models. We also develop an efficient algorithm that scales similarly to the lasso with the number of covariates and sample size.
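The convex penalized framework summarized above can be loosely illustrated with a simplified sketch. This is not the authors' estimator: it uses a plain group-lasso penalty on per-covariate polynomial basis coefficients (a stand-in for the finite basis representation), solved by proximal gradient descent; the basis choice, penalty weight `lam`, step size `lr`, and all function names are assumptions for illustration only.

```python
# Illustrative sketch (NOT the estimator from the paper): a sparse additive
# model fit by group lasso on per-covariate basis expansions. A group
# soft-thresholding step zeroes out entire component functions, giving the
# parsimony that the abstract describes. All parameters are assumptions.
import numpy as np

def basis(x, degree=3):
    # Simple polynomial basis for one covariate (a stand-in for splines).
    return np.column_stack([x ** d for d in range(1, degree + 1)])

def fit_sparse_additive(X, y, lam=0.1, degree=3, lr=0.2, iters=1500):
    n, p = X.shape
    B = [basis(X[:, j], degree) for j in range(p)]  # per-covariate bases
    coefs = [np.zeros(degree) for _ in range(p)]
    for _ in range(iters):
        # Residual at the current iterate; all blocks step from this point.
        resid = y - sum(B[j] @ coefs[j] for j in range(p))
        for j in range(p):
            grad = -B[j].T @ resid / n            # gradient of 0.5*MSE
            w = coefs[j] - lr * grad              # gradient step
            norm = np.linalg.norm(w)
            # Group soft-thresholding (proximal step for the group penalty):
            # shrinks the whole block, possibly exactly to zero.
            coefs[j] = max(0.0, 1 - lr * lam / norm) * w if norm > 0 else w
    return coefs
```

On data where only the first covariate carries signal, the fitted coefficient blocks for the remaining covariates are driven to (near) zero, mimicking the sparsity pattern such penalties are designed to recover.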


2021 ◽ Vol 1842 (1) ◽ pp. 012044
Author(s): Narita Yuri Adrianingsih ◽ I Nyoman Budiantara ◽ Jerry Dwi Trijoyo Purnomo
