structured regularization
Recently Published Documents

Total documents: 15 (five years: 3)
H-index: 4 (five years: 1)
2021, pp. 1471082X2110410. Author(s): Elena Tuzhilina, Leonardo Tozzi, Trevor Hastie.

Canonical correlation analysis (CCA) is a technique for measuring the association between two multivariate data matrices. A regularized modification of canonical correlation analysis (RCCA), which imposes an ℓ2 penalty on the CCA coefficients, is widely used in applications with high-dimensional data. One limitation of such regularization is that it ignores any data structure and treats all features equally, which can be ill-suited for some applications. In this article we introduce several approaches to regularizing CCA that take the underlying data structure into account. In particular, the proposed group regularized canonical correlation analysis (GRCCA) is useful when the variables are correlated in groups. We also describe computational strategies that avoid excessive computation when applying regularized CCA in high dimensions. We demonstrate these methods on our motivating application from neuroscience, as well as in a small simulation example.
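To make the ridge-penalized baseline concrete, here is a minimal sketch of RCCA computed by whitening each block and taking an SVD of the cross-covariance. The function name, penalty parameters, and plain-NumPy implementation are illustrative assumptions; this is the RCCA baseline the article starts from, not the group-regularized GRCCA estimator or the article's computational shortcuts.

```python
import numpy as np

def rcca(X, Y, lam_x=0.1, lam_y=0.1, n_components=2):
    """Ridge-regularized CCA: a minimal sketch (illustrative, not GRCCA).

    X : (n, p) array, Y : (n, q) array, both assumed column-centred.
    lam_x, lam_y : ridge penalties added to the within-set covariances.
    Returns canonical weights (Wx, Wy) and the leading canonical correlations.
    """
    n = X.shape[0]
    Cxx = X.T @ X / n + lam_x * np.eye(X.shape[1])   # regularized within-set covariance
    Cyy = Y.T @ Y / n + lam_y * np.eye(Y.shape[1])
    Cxy = X.T @ Y / n                                 # cross-covariance

    # Whiten each block with a Cholesky factor, then SVD the whitened cross-covariance.
    Lx = np.linalg.cholesky(Cxx)
    Ly = np.linalg.cholesky(Cyy)
    M = np.linalg.solve(Lx, Cxy) @ np.linalg.inv(Ly).T
    U, s, Vt = np.linalg.svd(M)

    # Map the whitened directions back to canonical weight vectors.
    Wx = np.linalg.solve(Lx.T, U[:, :n_components])
    Wy = np.linalg.solve(Ly.T, Vt[:n_components].T)
    return Wx, Wy, s[:n_components]
```

Group-structured variants replace the isotropic ridge terms above with penalties that shrink coefficients within predefined groups of variables toward each other.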


PLoS ONE, 2020, Vol 15 (11), pp. e0242099. Author(s): Tomokaze Shiratori, Ken Kobayashi, Yuichi Takano.

This paper discusses the prediction of hierarchical time series, where each upper-level time series is calculated by summing appropriate lower-level time series. Forecasts for such hierarchical time series should be coherent, meaning that the forecast for an upper-level time series equals the sum of forecasts for the corresponding lower-level time series. Previous methods for making coherent forecasts consist of two phases: first computing base (incoherent) forecasts and then reconciling those forecasts based on the inherent hierarchical structure. To improve time series predictions, we propose a structured regularization method that performs both phases simultaneously. The proposed method is based on a prediction model for bottom-level time series and uses a structured regularization term to incorporate upper-level forecasts into the prediction model. We also develop a backpropagation algorithm specialized for applying our method to artificial neural networks for time series prediction. Experimental results on synthetic and real-world datasets demonstrate that our method is comparable to other time series prediction methods in both prediction accuracy and computational efficiency.
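To illustrate the flavor of combining the two phases, the sketch below fits a linear one-step-ahead model for the bottom-level series and adds a structured penalty that pulls the aggregated bottom-level forecasts toward the upper-level observations. The summing matrix S, the penalty weight lam, and the plain gradient-descent fit are illustrative assumptions, not the paper's neural-network formulation or its specialized backpropagation algorithm.

```python
import numpy as np

def fit_structured(X, Y_bottom, Y_upper, S, lam=1.0, lr=1e-3, n_iter=2000):
    """Sketch of a bottom-level forecaster with a coherence-style penalty.

    X        : (n, d) lagged features.
    Y_bottom : (n, m) bottom-level targets.
    Y_upper  : (n, k) upper-level targets.
    S        : (k, m) summing matrix mapping bottom-level series to upper levels.
    """
    n, d = X.shape
    m = Y_bottom.shape[1]
    W = np.zeros((d, m))
    for _ in range(n_iter):
        pred = X @ W                          # bottom-level forecasts
        resid_b = pred - Y_bottom             # bottom-level fit error
        resid_u = pred @ S.T - Y_upper        # aggregated forecasts vs. upper-level data
        # Gradient of 1/(2n)*||XW - Yb||^2 + lam/(2n)*||XW S^T - Yu||^2
        grad = X.T @ resid_b / n + lam * X.T @ (resid_u @ S) / n
        W -= lr * grad
    return W
```

Setting lam = 0 recovers an ordinary bottom-up forecaster, while larger values trade bottom-level fit for agreement with the upper-level series.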


2019, Vol 38 (5), pp. 39-53. Author(s): Jing Ren, Mikhail Panine, Peter Wonka, Maks Ovsjanikov.

2017, Vol 132, pp. 102-118. Author(s): Loïc Landrieu, Hugo Raguet, Bruno Vallet, Clément Mallet, Martin Weinmann.

2016, Vol 66 (3), pp. 401-424. Author(s): Wei Wan, Lorenz T. Biegler.

2016, Vol 27 (3), pp. 789-804. Author(s): Julien Chiquet, Tristan Mary-Huard, Stéphane Robin.

2016, Vol 3 (2), pp. 39-46. Author(s): Keisuke Nagata, Yoshinobu Kawahara, Takashi Washio, Akira Unami.

2014, Vol 2, pp. 393-404. Author(s): Jonathan H. Clark, Chris Dyer, Alon Lavie.

Linear models, which support efficient learning and inference, are the workhorses of statistical machine translation; however, linear decision rules are less attractive from a modeling perspective. In this work, we introduce a technique for learning arbitrary, rule-local, non-linear feature transforms that improve model expressivity, but do not sacrifice the efficient inference and learning associated with linear models. To demonstrate the value of our technique, we discard the customary log transform of lexical probabilities and drop the phrasal translation probability in favor of raw counts. We observe that our algorithm learns a variation of a log transform that leads to better translation quality compared to the explicit log transform. We conclude that non-linear responses play an important role in SMT, an observation that we hope will inform the efforts of feature engineers.
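As a rough illustration of how a linear model can learn a non-linear response to a raw feature, the sketch below discretizes a real-valued feature (e.g., a raw count instead of its log) into quantile bins represented by indicator features, so that one weight per bin yields an arbitrary step-shaped transform. The bin count and binning scheme are illustrative assumptions and omit the paper's structured regularization over adjacent bins.

```python
import numpy as np

def binned_indicators(values, n_bins=8):
    """Map a 1-D array of raw feature values to one-hot indicators over quantile bins.

    A linear model over these indicators assigns one weight per bin, which is
    equivalent to learning a step-function (possibly log-like) transform of the
    original feature.
    """
    # Interior quantile cut points define the bin edges.
    edges = np.quantile(values, np.linspace(0, 1, n_bins + 1)[1:-1])
    bins = np.digitize(values, edges)              # bin index in 0..n_bins-1 per example
    onehot = np.zeros((len(values), n_bins))
    onehot[np.arange(len(values)), bins] = 1.0
    return onehot
```

A smoothness penalty on the differences between weights of adjacent bins (fused-lasso style) is one way to regularize such a transform, which is the structured-regularization angle of this line of work.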

