Statistical Learning With Time Series Dependence: An Application to Scoring Sleep in Mice

2013 · Vol 108 (504) · pp. 1147-1162
Author(s): Blakeley B. McShane, Shane T. Jensen, Allan I. Pack, Abraham J. Wyner

2022 · pp. 261-299
Author(s): Dag Tjøstheim, Håkon Otneim, Bård Støve

2010 · Vol 85 (2) · pp. 483-512
Author(s): Ian D. Gow, Gaizka Ormazabal, Daniel J. Taylor

ABSTRACT: We review and evaluate the methods commonly used in the accounting literature to correct for cross-sectional and time-series dependence. While much of the accounting literature studies settings in which variables are cross-sectionally and serially correlated, we find that the extant methods are not robust to both forms of dependence. Contrary to claims in the literature, we find that the Z2 statistic and Newey-West corrected Fama-MacBeth standard errors do not correct for both cross-sectional and time-series dependence. We show that extant methods produce misspecified test statistics in common accounting research settings, and that correcting for both forms of dependence substantially alters inferences reported in the literature. Specifically, several findings in the implied cost of equity capital literature, the cost of debt literature, and the conservatism literature appear not to be robust to the use of well-specified test statistics.
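Among the corrections this abstract evaluates are Newey-West adjusted Fama-MacBeth standard errors. A minimal sketch of that procedure follows, assuming simulated panel data; the data-generating process, variable names, and lag length are illustrative choices, not taken from the paper.

```python
# Sketch of Fama-MacBeth estimation with a Newey-West adjustment on the time
# series of cross-sectional slope estimates (one of the corrections evaluated
# in the abstract). The simulated panel below is purely illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_firms, n_years = 500, 20

# Simulated panel: y_it = 0.5 * x_it + common time shock + idiosyncratic noise.
x = rng.normal(size=(n_years, n_firms))
time_shock = rng.normal(size=(n_years, 1))   # induces cross-sectional dependence
y = 0.5 * x + time_shock + rng.normal(size=(n_years, n_firms))

# Step 1: one cross-sectional OLS per period; keep the slope estimates.
slopes = np.empty(n_years)
for t in range(n_years):
    X = np.column_stack([np.ones(n_firms), x[t]])
    beta, *_ = np.linalg.lstsq(X, y[t], rcond=None)
    slopes[t] = beta[1]

# Step 2: the Fama-MacBeth point estimate is the time-series mean of the slopes.
fm_estimate = slopes.mean()

# Step 3: Newey-West (Bartlett-kernel) standard error of that mean, intended to
# absorb serial correlation in the period-by-period estimates.
def newey_west_se_of_mean(series, lags):
    e = series - series.mean()
    T = len(e)
    long_run_var = e @ e / T
    for lag in range(1, lags + 1):
        w = 1.0 - lag / (lags + 1.0)            # Bartlett weights
        long_run_var += 2.0 * w * (e[lag:] @ e[:-lag]) / T
    return np.sqrt(long_run_var / T)

se = newey_west_se_of_mean(slopes, lags=3)
print(f"Fama-MacBeth slope: {fm_estimate:.3f}, Newey-West SE: {se:.3f}, "
      f"t-stat: {fm_estimate / se:.2f}")
```

Per the abstract's findings, this adjustment targets serial correlation in the period-by-period slope estimates and is not, by itself, robust to both cross-sectional and time-series dependence.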


2014 · Vol 1051 · pp. 1009-1015
Author(s): Ya Li Ning, Xin You Wang, Xi Ping He

Support Vector Machines (SVM), a new generation of learning methods based on advances in statistical learning theory, are characterized by the use of several standard machine learning techniques, such as the maximal margin hyperplane, Mercer kernels, and quadratic programming. Because SVMs achieve the best performance in many currently challenging applications, they have attracted sustained attention and have become standard tools of machine learning and data mining. As a still-developing technology, however, SVMs retain some open problems and their applications remain limited. This paper surveys SVMs and their applications to chaotic time series, including the prediction of chaotic time series, focusing on the comparison of regression type selection and of kernel type selection within the same type of regression machine.
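The comparison the abstract describes, across regression machine types and kernel types, can be sketched as follows. The logistic-map data, delay embedding, and hyperparameters are assumptions chosen only for illustration and do not come from the paper.

```python
# Compare epsilon-SVR vs. nu-SVR and RBF vs. polynomial kernels on
# one-step-ahead prediction of a chaotic series (logistic map).
import numpy as np
from sklearn.svm import SVR, NuSVR
from sklearn.metrics import mean_squared_error

# Chaotic series from the logistic map x_{t+1} = 4 x_t (1 - x_t).
x = np.empty(1200)
x[0] = 0.2
for t in range(1199):
    x[t + 1] = 4.0 * x[t] * (1.0 - x[t])

# Delay embedding: predict x_{t+1} from the previous d values.
d = 4
X = np.column_stack([x[i:len(x) - d + i] for i in range(d)])
y = x[d:]
X_train, X_test = X[:800], X[800:]
y_train, y_test = y[:800], y[800:]

# Regression machine type (eps-SVR vs. nu-SVR) crossed with kernel type.
models = {
    "eps-SVR / RBF":  SVR(kernel="rbf", C=10.0, epsilon=0.001),
    "eps-SVR / poly": SVR(kernel="poly", degree=3, C=10.0, epsilon=0.001),
    "nu-SVR / RBF":   NuSVR(kernel="rbf", C=10.0, nu=0.5),
    "nu-SVR / poly":  NuSVR(kernel="poly", degree=3, C=10.0, nu=0.5),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))
    print(f"{name:16s} test MSE = {mse:.2e}")
```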


2011 · Vol 268-270 · pp. 1017-1020
Author(s): Man Xiang Miao, Yi Jin Gang

Prediction of the Lorenz chaotic time series is an important problem in nonlinear dynamics. The support vector machine (SVM) is a novel machine learning method based on statistical learning theory that provides an efficient algorithmic approach to the prediction of chaotic time series. This paper combines the SVM with a neural network, exploiting the structural similarity between SVMs and RBF networks: the SVM is used to obtain the centers of the RBF network, which then predicts the Lorenz chaotic time series. Simulation results show that the method performs better than the alternatives considered.
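A minimal sketch of the idea described above, not the paper's implementation: an SVR with an RBF kernel is fitted first, its support vectors are reused as RBF-network centers, and the network's output weights are then fitted by least squares. The Lorenz integration settings, delay embedding, and kernel width are assumed values chosen only for illustration.

```python
# SVM-derived centers for an RBF network predicting the Lorenz x-component.
import numpy as np
from sklearn.svm import SVR

# Generate the Lorenz x-component by simple Euler integration (illustrative).
def lorenz_x(n, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = 1.0, 1.0, 1.0
    out = np.empty(n)
    for i in range(n):
        dx, dy, dz = sigma * (y - x), x * (rho - z) - y, x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        out[i] = x
    return out

s = lorenz_x(2500)[500:]        # drop the initial transient
d = 6                           # embedding dimension (assumed)
X = np.column_stack([s[i:len(s) - d + i] for i in range(d)])
y = s[d:]
X_tr, X_te, y_tr, y_te = X[:1400], X[1400:], y[:1400], y[1400:]

# Step 1: SVR with an RBF kernel; its support vectors become the RBF centers.
gamma = 0.05
svr = SVR(kernel="rbf", C=10.0, epsilon=0.01, gamma=gamma).fit(X_tr, y_tr)
centers = svr.support_vectors_

# Step 2: RBF network with those centers; output weights by least squares.
def rbf_design(X, centers, gamma):
    sq = ((X ** 2).sum(axis=1)[:, None]
          + (centers ** 2).sum(axis=1)[None, :]
          - 2.0 * X @ centers.T)
    return np.exp(-gamma * np.maximum(sq, 0.0))

Phi_tr = rbf_design(X_tr, centers, gamma)
w, *_ = np.linalg.lstsq(Phi_tr, y_tr, rcond=None)

# One-step-ahead prediction on held-out data.
pred = rbf_design(X_te, centers, gamma) @ w
print("RBF-network test MSE:", float(np.mean((pred - y_te) ** 2)))
print("Number of centers (support vectors):", len(centers))
```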


2013 · Vol 1
Author(s): Pierre Alquier, Xiaoyin Li, Olivier Wintenberger
