Mallows Model Averaging Estimator for the MIDAS Model with Almon Polynomial Weight

2023
Author(s): Hsin-Chieh Wong, Wen-Jen Tsay

2020 · Vol 36 (6) · pp. 1099-1126
Author(s): Jen-Che Liao, Wen-Jen Tsay

This article proposes frequentist multiple-equation least-squares averaging approaches for multistep forecasting with vector autoregressive (VAR) models. The proposed VAR forecast averaging methods are based on the multivariate Mallows model averaging (MMMA) and multivariate leave-h-out cross-validation averaging (MCVAh) criteria (with h denoting the forecast horizon), which are valid for iterative and direct multistep forecast averaging, respectively. Under the framework of stationary VAR processes of infinite order, we provide theoretical justifications by establishing asymptotic unbiasedness and asymptotic optimality of the proposed forecast averaging approaches. Specifically, MMMA exhibits asymptotic optimality for one-step-ahead forecast averaging, whereas for direct multistep forecast averaging, the asymptotically optimal combination weights are determined separately for each forecast horizon based on the MCVAh procedure. To complement the theory, we investigate the finite-sample behavior of the proposed averaging procedures under model misspecification via simulation experiments.
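To make the distinction between the two schemes concrete, here is a minimal sketch contrasting iterative multistep forecasting (iterate an estimated one-step VAR forward h times) with direct multistep forecasting (regress y_{t+h} on y_t in a single h-step regression). The bivariate VAR(1), the simulated data, and plain least squares are illustrative assumptions, not the article's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
T, h = 500, 4
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])                 # true VAR(1) coefficient matrix
y = np.zeros((T, 2))
for t in range(1, T):
    y[t] = y[t - 1] @ A.T + rng.standard_normal(2)

# Iterative scheme: estimate the one-step map y_t -> y_{t+1}, then iterate.
A1, *_ = np.linalg.lstsq(y[:-1], y[1:], rcond=None)
f_iter = y[-1]
for _ in range(h):
    f_iter = f_iter @ A1                   # apply the one-step map h times

# Direct scheme: estimate the h-step map y_t -> y_{t+h} in one regression.
Ah, *_ = np.linalg.lstsq(y[:-h], y[h:], rcond=None)
f_direct = y[-1] @ Ah

print("iterative h-step forecast:", np.round(f_iter, 3))
print("direct h-step forecast:   ", np.round(f_direct, 3))
```

The MMMA criterion of the article would average iterative forecasts of this kind across candidate lag orders, while MCVAh would choose separate weights for each horizon h in the direct scheme.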


2019
Author(s): Yang Feng, Qingfeng Liu, Ryo Okui

2020 · Vol 187 · pp. 108916
Author(s): Yang Feng, Qingfeng Liu, Ryo Okui

2019 · Vol 47 (3) · pp. 336-351
Author(s): Jun Liao, Guohua Zou, Yan Gao

2018 · Vol 35 (4) · pp. 816-841
Author(s): Xinyu Zhang, Chu-An Liu

This article considers the problem of inference for nested least squares averaging estimators. We study the asymptotic behavior of the Mallows model averaging estimator (MMA; Hansen, 2007) and the jackknife model averaging estimator (JMA; Hansen and Racine, 2012) under standard asymptotics with fixed parameters. We find that both the MMA and JMA estimators asymptotically assign zero weight to under-fitted models, while their weights on just-fitted and over-fitted models are asymptotically random. Building on this asymptotic behavior of the model weights, we derive the asymptotic distributions of the MMA and JMA estimators and propose a simulation-based confidence interval for the least squares averaging estimator. Monte Carlo simulations show that the coverage probabilities of the proposed confidence intervals achieve the nominal level.
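For reference, both estimators studied here build on Hansen's (2007) Mallows criterion, C(w) = ||y - yhat(w)||^2 + 2*sigma^2*k(w), minimized over the weight simplex. The sketch below computes MMA weights for a sequence of nested least-squares models; the simulated data and the SLSQP solver are illustrative choices, not the article's own code.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, K = 200, 8
X = rng.standard_normal((n, K))
beta = 1.0 / (1.0 + np.arange(K))          # slowly decaying coefficients
y = X @ beta + rng.standard_normal(n)

# Fitted values and model sizes for the nested models m = 1, ..., K.
fits = np.column_stack([
    X[:, :m] @ np.linalg.lstsq(X[:, :m], y, rcond=None)[0]
    for m in range(1, K + 1)
])
k = np.arange(1, K + 1)

# sigma^2 is estimated from the largest model, as in Hansen (2007).
sigma2 = np.sum((y - fits[:, -1]) ** 2) / (n - K)

def mallows(w):
    """Mallows criterion: SSE of the averaged fit plus the penalty 2*sigma^2*k(w)."""
    resid = y - fits @ w
    return resid @ resid + 2.0 * sigma2 * (k @ w)

# Minimize over the weight simplex {w >= 0, sum(w) = 1}.
cons = ({"type": "eq", "fun": lambda w: np.sum(w) - 1.0},)
res = minimize(mallows, np.full(K, 1.0 / K), bounds=[(0.0, 1.0)] * K,
               constraints=cons, method="SLSQP")
print("MMA weights:", np.round(res.x, 3))
```

The article's finding is about the limiting behavior of weights like these: the weight on under-fitted models vanishes asymptotically, while the weights spread over just-fitted and over-fitted models remain random in the limit.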


Author(s): Hui Xiao, Yiguo Sun

Model selection and model averaging are popular approaches for handling model uncertainty. Fan and Li (2006) laid out a unified framework for variable selection via penalized likelihood, in which the choice of tuning parameter is vital for the penalized estimators to achieve consistent selection and optimal estimation. Since the OLS post-LASSO estimator of Belloni and Chernozhukov (2013), few studies have examined the finite-sample performance of the class of OLS post-penalty estimators whose tuning parameters are determined by different selection approaches. We aim to supplement the existing model selection literature by studying this class of OLS post-selection estimators. Inspired by the shrinkage averaging estimator (SAE) of Schomaker (2012) and the Mallows model averaging (MMA) criterion of Hansen (2007), we further propose a shrinkage Mallows model averaging (SMMA) estimator for averaging high-dimensional sparse models. Building on the Monte Carlo design of Wang et al. (2009), which features a sparse parameter space that expands with the sample size, our design further considers the effects of the effective sample size and the degree of model sparsity on the finite-sample performance of model selection and model averaging estimators. From our data examples, we find that the OLS post-SCAD(BIC) estimator outperforms most current penalized least squares estimators in finite samples as long as the number of parameters does not exceed the sample size. In addition, the SMMA estimator performs better the sparser the model, which supports its use when averaging high-dimensional sparse models.
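As a concrete illustration of the two-step idea behind these OLS post-penalty estimators, here is a minimal sketch of OLS post-LASSO: the LASSO selects the support, then ordinary least squares refits the selected variables to remove shrinkage bias. Using scikit-learn's LassoCV as the tuning-parameter rule and simulated sparse data are assumptions for illustration; the paper compares several tuning-parameter selection approaches.

```python
import numpy as np
from sklearn.linear_model import LassoCV, LinearRegression

rng = np.random.default_rng(0)
n, p = 100, 50
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = [3.0, 2.0, 1.5, 1.0, 0.5]       # sparse true model
y = X @ beta + rng.standard_normal(n)

# Step 1: LASSO for support selection (tuning parameter chosen by CV here).
lasso = LassoCV(cv=5).fit(X, y)
support = np.flatnonzero(lasso.coef_)

# Step 2: OLS refit on the selected variables removes the shrinkage bias.
ols = LinearRegression().fit(X[:, support], y)
beta_post = np.zeros(p)
beta_post[support] = ols.coef_
print("selected variables:", support)
```

The SMMA estimator proposed in the paper goes one step further, averaging candidate sparse fits of this kind with weights chosen by a Mallows-type criterion.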


2020 · Vol 13 (11) · pp. 278
Author(s): Hui Xiao, Yiguo Sun

This paper aims to enrich the understanding and modelling of cryptocurrency markets by investigating the determinants of major cryptocurrencies' returns and forecasting those returns. To handle model uncertainty, we conduct model selection for an autoregressive distributed lag (ARDL) model using several popular penalized least squares estimators to explain the cryptocurrencies' returns. We further introduce a novel model averaging approach, the shrinkage Mallows model averaging (SMMA) estimator, for forecasting. First, we find that the returns of most cryptocurrencies are sensitive to volatilities in major financial markets, as well as to changes in gold prices and to current and lagged information from the Forex market. Then, when forecasting cryptocurrencies' returns, we find that an ARDL(p,q) model estimated by the SMMA estimator outperforms the competing estimators and models out-of-sample.
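As a rough sketch of the forecasting setup, the code below builds the lagged design matrix of an ARDL(p, q) regression and produces a one-step-ahead forecast by least squares. The simulated series, the stand-in exogenous regressor, and the lag orders are hypothetical, and the paper's SMMA weighting step is omitted.

```python
import numpy as np

def ardl_design(y, x, p, q):
    """Stack rows [y_{t-1},...,y_{t-p}, x_{t-1},...,x_{t-q}] with target y_t."""
    m = max(p, q)
    rows = [np.concatenate([y[t - p:t][::-1], x[t - q:t][::-1]])
            for t in range(m, len(y))]
    return np.asarray(rows), y[m:]

rng = np.random.default_rng(0)
T, p, q = 400, 2, 1
x = rng.standard_normal(T)                 # hypothetical exogenous driver, e.g. gold returns
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.3 * y[t - 1] + 0.2 * x[t - 1] + 0.5 * rng.standard_normal()

Z, target = ardl_design(y, x, p, q)
coef, *_ = np.linalg.lstsq(Z, target, rcond=None)

# One-step-ahead forecast from the most recent p and q lags.
next_row = np.concatenate([y[-p:][::-1], x[-q:][::-1]])
print("one-step-ahead forecast:", next_row @ coef)
```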

