Finite Sample Performance of Robust Bayesian Regression

Author(s):  
Michael Smith ◽  
Simon Sheather ◽  
Robert Kohn
Econometrics ◽  
2019 ◽  
Vol 7 (3) ◽  
pp. 29
Author(s):  
Emanuela Ciapanna ◽  
Marco Taboga

This paper deals with instability in regression coefficients. We propose a Bayesian regression model with time-varying coefficients (TVC) that allows us to jointly estimate the degree of instability and the time-path of the coefficients. Thanks to the computational tractability of the model and to the fact that it is fully automatic, we are able to run Monte Carlo experiments and analyze its finite-sample properties. We find that the estimation precision and the forecasting accuracy of the TVC model compare favorably to those of other methods commonly employed to deal with parameter instability. A distinguishing feature of the TVC model is its robustness to mis-specification: its performance is also satisfactory when regression coefficients are stable or when they experience discrete structural breaks. As a demonstrative application, we used our TVC model to estimate the exposures of S&P 500 stocks to market-wide risk factors: we found that a vast majority of stocks had time-varying exposures and that the TVC model helped to better forecast these exposures.
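The time-varying-coefficient setup described above is, at its core, a state-space regression. A minimal sketch, assuming random-walk coefficients with the state and observation variances q and r treated as known (the paper estimates the degree of instability itself; here it is fixed purely for illustration), can be filtered with a standard Kalman recursion:

```python
import numpy as np

rng = np.random.default_rng(0)
T, k = 200, 2
X = np.column_stack([np.ones(T), rng.normal(size=T)])

# Simulate random-walk coefficient paths (the "instability")
q, r = 0.05, 1.0  # state and observation noise variances (assumed known here)
beta_true = np.array([1.0, 2.0]) + np.cumsum(
    rng.normal(scale=np.sqrt(q), size=(T, k)), axis=0)
y = np.einsum("tk,tk->t", X, beta_true) + rng.normal(scale=np.sqrt(r), size=T)

def kalman_tvc(y, X, q, r):
    """Filtered coefficient paths for y_t = x_t' beta_t + eps_t,
    with beta_t = beta_{t-1} + eta_t (random walk)."""
    T, k = X.shape
    b = np.zeros(k)          # filtered state mean
    P = 10.0 * np.eye(k)     # vague initial covariance
    path = np.empty((T, k))
    for t in range(T):
        P = P + q * np.eye(k)          # predict step
        x = X[t]
        S = x @ P @ x + r              # innovation variance
        K = P @ x / S                  # Kalman gain
        b = b + K * (y[t] - x @ b)     # update step
        P = P - np.outer(K, x @ P)
        path[t] = b
    return path

beta_hat = kalman_tvc(y, X, q, r)
# tracking error after a burn-in of T//2 observations
rmse = np.sqrt(np.mean((beta_hat[T // 2:] - beta_true[T // 2:]) ** 2))
```

With correctly specified variances the filter tracks the drifting coefficients closely; setting q = 0 recovers ordinary recursive least squares, which is one way to see why a stable-coefficient regression is a special case the TVC model handles gracefully.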


Author(s):  
Pranab K. Sen ◽  
Julio M. Singer ◽  
Antonio C. Pedroso de Lima

Methodology ◽  
2012 ◽  
Vol 8 (1) ◽  
pp. 23-38 ◽  
Author(s):  
Manuel C. Voelkle ◽  
Patrick E. McKnight

The use of latent curve models (LCMs) has increased almost exponentially during the last decade. Oftentimes, researchers regard LCM as a “new” method to analyze change, with little attention paid to the fact that the technique was originally introduced as an “alternative to standard repeated measures ANOVA and first-order auto-regressive methods” (Meredith & Tisak, 1990, p. 107). In the first part of the paper, this close relationship is reviewed, and it is demonstrated how “traditional” methods, such as repeated measures ANOVA and MANOVA, can be formulated as LCMs. Given that latent curve modeling is essentially a large-sample technique, compared to “traditional” finite-sample approaches, the second part of the paper uses a Monte Carlo simulation to address the question of to what degree the more flexible LCMs can actually replace some of the older tests. In addition, a structural equation modeling alternative to Mauchly’s (1940) test of sphericity is explored. Although “traditional” methods may be expressed as special cases of more general LCMs, we found that the equivalence holds only asymptotically. For practical purposes, however, no approach always outperformed the alternatives in terms of power and Type I error, so the best method depends on the situation. We provide detailed recommendations of when to use which method.


2006 ◽  
Vol 54 (3) ◽  
pp. 343-350 ◽  
Author(s):  
C. F. H. Longin ◽  
H. F. Utz ◽  
A. E. Melchinger ◽  
J.C. Reif

The optimum allocation of breeding resources is crucial for the efficiency of breeding programmes. The objectives were to (i) compare selection gain ΔGk for finite and infinite sample sizes, (ii) compare ΔGk and the probability of identifying superior hybrids (Pk), and (iii) determine the optimum allocation of the number of hybrids and test locations in hybrid maize breeding using doubled haploids. Infinite compared to finite sample sizes led to almost identical optimum allocation of test resources, but to an inflation of ΔGk. This inflation decreased as the budget and the number of finally selected hybrids increased. A reasonable Pk was reached for hybrids belonging to the q = 1% best of the population. The optimum allocations for Pk(q) and ΔGk were similar, indicating that Pk(q) is promising for optimizing breeding programmes.
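The inflation of ΔGk under an infinite-sample assumption can be illustrated with selection intensities: the infinite-population intensity i = φ(z)/q overstates the expected standardized gain from picking the best k of a finite n candidates. A small Monte Carlo sketch (illustrative only; the paper's allocation problem also involves test locations and variance components, which are not modelled here):

```python
import numpy as np
from statistics import NormalDist

def intensity_infinite(q):
    """Selection intensity under an infinite population:
    i = phi(z) / q, with z the (1 - q)-quantile of the standard normal."""
    z = NormalDist().inv_cdf(1.0 - q)
    phi = np.exp(-z * z / 2.0) / np.sqrt(2.0 * np.pi)
    return phi / q

def intensity_finite(n, k, reps=20000, seed=1):
    """Expected standardized mean of the best k out of n candidates,
    estimated by Monte Carlo over standard-normal 'genotypic values'."""
    rng = np.random.default_rng(seed)
    draws = rng.normal(size=(reps, n))
    top = np.sort(draws, axis=1)[:, -k:]   # best k per replicate
    return top.mean()

# Selecting the best 1 of 10 (q = 10%): the finite-sample intensity
# falls short of the infinite-population value, i.e. the latter inflates gain.
i_fin = intensity_finite(10, 1)
i_inf = intensity_infinite(0.1)
```

Multiplying either intensity by the square root of heritability and the genotypic standard deviation gives the familiar breeder's-equation form of ΔG; the gap between `i_fin` and `i_inf` shrinks as n grows, matching the abstract's observation that the inflation decreases with larger budgets.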


2008 ◽  
Vol 47 (02) ◽  
pp. 167-173 ◽  
Author(s):  
A. Pfahlberg ◽  
O. Gefeller ◽  
R. Weißbach

Summary Objectives: In oncological studies, the hazard rate can be used to differentiate subgroups of the study population according to their patterns of survival risk over time. Nonparametric curve estimation has been suggested as an exploratory means of revealing such patterns. The decision about the type of smoothing parameter is critical for performance in practice. In this paper, we study data-adaptive smoothing. Methods: A decade ago, the nearest-neighbor bandwidth was introduced for censored data in survival analysis. It is specified by one parameter, namely the number of nearest neighbors. Bandwidth selection in this setting has rarely been investigated, although the heuristic advantages over the frequently studied fixed bandwidth are quite obvious. The asymptotic relationship between the fixed and the nearest-neighbor bandwidth can be used to generate novel approaches. Results: We develop a new selection algorithm termed double-smoothing for the nearest-neighbor bandwidth in hazard rate estimation. Our approach uses a finite sample approximation of the asymptotic relationship between the fixed and nearest-neighbor bandwidth. By so doing, we identify the nearest-neighbor bandwidth as an additional smoothing step and achieve further data-adaptation after fixed bandwidth smoothing. We illustrate the application of the new algorithm in a clinical study and compare the outcome to the traditional fixed bandwidth result, thus demonstrating the practical performance of the technique. Conclusion: The double-smoothing approach enlarges the methodological repertoire for selecting smoothing parameters in nonparametric hazard rate estimation. The slight increase in computational effort is rewarded with a substantial amount of estimation stability, thus demonstrating the benefit of the technique for biostatistical applications.
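The nearest-neighbor bandwidth the abstract refers to is specified by a single parameter R, the number of nearest event times; plugging it into a kernel-smoothed Nelson-Aalen estimator gives a simple hazard-rate sketch. (Epanechnikov kernel; the paper's double-smoothing selection of R itself is not reproduced here, R is just fixed by hand.)

```python
import numpy as np

def nn_bandwidth(t, event_times, R):
    """Nearest-neighbor bandwidth: distance from t to the R-th nearest event time."""
    d = np.sort(np.abs(event_times - t))
    return d[min(R, len(d)) - 1]

def hazard_nn(t, times, events, R):
    """Kernel-smoothed Nelson-Aalen hazard estimate at t with a
    nearest-neighbor bandwidth.
    times:  observed (possibly right-censored) times
    events: 1 = event observed, 0 = censored"""
    event_times = times[events == 1]
    b = nn_bandwidth(t, event_times, R)
    # number at risk just before each event time
    n_at_risk = np.array([(times >= s).sum() for s in event_times])
    u = (t - event_times) / b
    K = np.where(np.abs(u) <= 1, 0.75 * (1 - u ** 2), 0.0) / b  # Epanechnikov
    return np.sum(K / n_at_risk)

# Exponential survival times have a constant hazard, a convenient sanity check
rng = np.random.default_rng(3)
times = rng.exponential(scale=2.0, size=500)   # true hazard = 0.5
events = np.ones(500, dtype=int)               # no censoring in this toy example
h = hazard_nn(1.0, times, events, R=50)
```

Because b adapts to the local density of event times, the estimate automatically widens in sparse regions (late follow-up) and narrows where events cluster, which is the heuristic advantage over a fixed bandwidth mentioned in the abstract.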


2020 ◽  
Vol 2020 ◽  
pp. 1-12
Author(s):  
Hanji He ◽  
Guangming Deng

We extend the mean empirical likelihood inference for response mean with data missing at random. The empirical likelihood ratio confidence regions are poor when the response is missing at random, especially when the covariate is high-dimensional and the sample size is small. Hence, we develop three bias-corrected mean empirical likelihood approaches to obtain efficient inference for the response mean. From the three bias-corrected estimating equations, we obtain a new set by constructing a pairwise-mean dataset. This increases the effective sample size for estimation and reduces the impact of the curse of dimensionality. Consistency and asymptotic normality of the maximum mean empirical likelihood estimators are established. The finite sample performance of the proposed estimators is presented through simulation, and an application to the Boston Housing dataset is shown.
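The pairwise-mean construction enlarges a sample of size n to n(n-1)/2 averages while preserving the sample mean, and Owen-style empirical likelihood for a mean reduces to one-dimensional root finding in the Lagrange multiplier. A sketch of both ingredients, assuming complete data (the paper's bias corrections for missing-at-random responses are not reproduced):

```python
import numpy as np

def pairwise_means(x):
    """All n(n-1)/2 pairwise averages (x_i + x_j)/2, i < j.
    Enlarges the sample while leaving the overall mean unchanged."""
    i, j = np.triu_indices(len(x), k=1)
    return (x[i] + x[j]) / 2.0

def el_log_ratio(x, mu, iters=50):
    """-2 log empirical likelihood ratio for the mean mu, with weights
    w_i = 1 / (n (1 + lam * (x_i - mu))); Newton's method in lam."""
    z = x - mu
    lam = 0.0
    for _ in range(iters):
        d = 1.0 + lam * z
        g = np.sum(z / d)              # estimating equation in lam
        h = -np.sum(z ** 2 / d ** 2)   # its derivative
        step = g / h
        lam -= step
        while np.any(1.0 + lam * z <= 0):   # backtrack to keep weights positive
            lam += step / 2
            step /= 2
    return 2.0 * np.sum(np.log1p(lam * z))

rng = np.random.default_rng(2)
x = rng.normal(size=40)
pm = pairwise_means(x)
```

At mu equal to the sample mean the log-ratio is zero, and it grows as mu moves away; asymptotically it is chi-squared with one degree of freedom, which is what makes empirical likelihood confidence regions calibration-free.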


Econometrics ◽  
2021 ◽  
Vol 9 (1) ◽  
pp. 10
Author(s):  
Šárka Hudecová ◽  
Marie Hušková ◽  
Simos G. Meintanis

This article considers goodness-of-fit tests for bivariate INAR and bivariate Poisson autoregression models. The test statistics are based on an L2-type distance between two estimators of the probability generating function of the observations: one being entirely nonparametric and the second one being semiparametric, computed under the corresponding null hypothesis. The asymptotic distribution of the proposed test statistics is derived both under the null hypothesis and under alternatives, and consistency is proved. The case of testing bivariate generalized Poisson autoregression and the extension of the methods to dimension higher than two are also discussed. The finite-sample performance of a parametric bootstrap version of the tests is illustrated via a series of Monte Carlo experiments. The article concludes with applications to real data sets and a discussion.
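The L2-type statistic compares a fully nonparametric estimate of the joint probability generating function with a model-based one. As a simplified stand-in for the semiparametric null-model PGF, the sketch below uses independent Poisson margins with moment-estimated rates, and approximates the integral over [0,1]^2 on a grid; the autoregressive structure and bootstrap calibration of the actual tests are not reproduced:

```python
import numpy as np

def pgf_empirical(x, y, u, v):
    """Nonparametric joint PGF estimate: (1/n) sum_i u^{x_i} v^{y_i},
    evaluated on a grid of (u, v) values."""
    return np.mean(u[..., None] ** x * v[..., None] ** y, axis=-1)

def pgf_indep_poisson(lam1, lam2, u, v):
    """PGF under a simple illustrative null: independent Poisson margins,
    g(u, v) = exp(lam1 (u - 1) + lam2 (v - 1))."""
    return np.exp(lam1 * (u - 1.0) + lam2 * (v - 1.0))

def l2_statistic(x, y, grid=64):
    """n times the Riemann approximation of the L2 distance on [0,1]^2
    between the empirical PGF and the null-model PGF."""
    t = np.linspace(0.0, 1.0, grid)
    U, V = np.meshgrid(t, t)
    g_np = pgf_empirical(x, y, U, V)
    g_0 = pgf_indep_poisson(x.mean(), y.mean(), U, V)
    return len(x) * np.mean((g_np - g_0) ** 2)

rng = np.random.default_rng(5)
x = rng.poisson(2.0, size=400)
y = rng.poisson(2.0, size=400)
t_indep = l2_statistic(x, y)   # data consistent with the null: small statistic
t_dep = l2_statistic(x, x)     # perfectly dependent pair: large statistic
```

In the article the null distribution of such statistics is nonstandard, which is why critical values come from a parametric bootstrap rather than from asymptotic tables.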


Author(s):  
Rhys Morris ◽  
Tony Myers ◽  
Stacey Emmonds ◽  
Dave Singleton ◽  
Kevin Till

Abstract Purpose Sled towing has been shown to be an effective method to enhance physical qualities in youth athletes. The aim of this study was to evaluate the impact of a 6-week sled towing intervention on muscular strength, speed and power in elite youth soccer players of differing maturity status. Method Seventy-three male elite youth soccer players aged 12–18 years (Pre-Peak Height Velocity [PHV] n = 25; Circa-PHV n = 24; Post-PHV n = 24) from one professional soccer academy participated in this study. Sprint assessments (10 m and 30 m), countermovement jump and isometric mid-thigh pull were undertaken before (T1) and after (T2) a 6-week intervention. The training intervention consisted of 6 weeks (2 × per week, 10 sprints over 20 m distance) of resisted sled towing (linear progression 10%–30% of body mass) during the competitive season. Bayesian regression models analysed differences between T1 and T2 within each maturity group. Results There were minimal changes in strength, speed and power (P = 0.35–0.80) for each maturity group across the 6-week intervention. Where there were changes with greater certainty, they are unlikely to represent a real effect, owing to regression to the mean (RTM). Conclusion It appears that a 6-week sled towing training programme with loadings of 10%–30% body mass only maintains physical qualities in elite youth soccer players pre-, circa-, and post-PHV. Further research is required to determine the effectiveness of this training method in long-term athletic development programmes.

