Backfitting algorithm: Recently Published Documents

Total documents: 7 (last five years: 2)
H-index: 3 (last five years: 0)

2021
Author(s): ML Gamiz, Anton Kalen, Rafael Nozal-Cañadas, Rocio Raya-Miranda

Abstract Our practical motivation is the analysis of potential correlations between spectral noise current and threshold voltage from common on-wafer MOSFETs. The usual strategy leads to standard techniques based on normal linear regression, which are readily available in all statistical software (both free and commercial). However, these methods are not appropriate here because the assumptions they rely on are not met, and more sophisticated methods are required. We propose a new strategy based on recent nonparametric techniques, which are data-driven and thus free from questionable parametric assumptions. A backfitting algorithm accounting for random effects and nonparametric regression is designed and implemented. The nature of the correlation between threshold voltage and noise is examined with a statistical test based on a novel technique that summarizes all the relevant information in the data in a color map. The way the results are presented in the plot makes it easy for a non-expert in data analysis to understand what the data show. The good performance of the method is demonstrated through simulations, and it is applied to a data case in a field where these modern statistical techniques are new and prove very efficient.
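
The sketch below is a minimal illustration of the kind of backfitting iteration described in this abstract: it alternates between a nonparametric update of a smooth mean curve and a shrinkage update of group-level (e.g. wafer-level) random effects. The Nadaraya-Watson smoother, the bandwidth h, and the shrinkage parameter lam are illustrative assumptions, not the estimator developed in the paper.

```python
# A minimal backfitting sketch, assuming a model y_ij = m(x_ij) + b_i + e_ij
# with a smooth curve m and a group-level (e.g. wafer-level) random effect b_i.
# Smoother, bandwidth h, and shrinkage lam are illustrative choices only.
import numpy as np

def nw_smooth(x, y, grid, h):
    """Nadaraya-Watson (Gaussian kernel) smoother evaluated at the grid points."""
    w = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def backfit_random_effects(x, y, groups, h=0.3, lam=1.0, n_iter=100, tol=1e-8):
    """Alternate between smoothing the fixed curve m(x) and a shrinkage
    (BLUP-style) update of the group-level random effects."""
    labels = np.unique(groups)
    b = np.zeros(labels.size)
    m = np.zeros_like(y, dtype=float)
    for _ in range(n_iter):
        m_old = m.copy()
        # 1) update the nonparametric component on residuals y - b_group
        m = nw_smooth(x, y - b[np.searchsorted(labels, groups)], x, h)
        # 2) update each random effect on residuals y - m, shrunk by lam
        for k, g in enumerate(labels):
            idx = groups == g
            b[k] = (y[idx] - m[idx]).sum() / (idx.sum() + lam)
        b -= b.mean()                      # center for identifiability
        if np.max(np.abs(m - m_old)) < tol:
            break
    return m, dict(zip(labels, b))
```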


Biometrika, 2020
Author(s): M Hiabu, J P Nielsen, T H Scheike

Summary We consider an extension of Aalen’s additive regression model that allows covariates to have effects that vary on two different time scales. The two time scales considered are equal up to a constant for each individual and vary across individuals, such as follow-up time and age in medical studies or calendar time and age in longitudinal studies. The model was introduced in Scheike (2001), where it was solved using smoothing techniques. We present a new backfitting algorithm for estimating the structured model without having to use smoothing. Estimators of the cumulative regression functions on the two time scales are suggested by solving local estimating equations jointly on the two time scales. We provide large-sample properties and simultaneous confidence bands. The model is applied to data on myocardial infarction, providing a separation of the two effects stemming from time since diagnosis and age.
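
The paper's estimator solves local estimating equations jointly on the two time scales for cumulative regression functions on survival data; the generic sketch below only illustrates the underlying backfitting idea of alternating updates of two additive components living on two time scales (follow-up time and age). The kernel smoother, bandwidth, and response model are illustrative assumptions.

```python
# A generic backfitting sketch for two additive effects on two time scales,
# e.g. y = f1(follow-up time) + f2(age) + noise, where age equals follow-up
# time plus a per-individual entry age.  Illustrative only; not the paper's
# counting-process estimator.
import numpy as np

def kernel_smooth(x, y, h):
    """Gaussian-kernel smoother of y against x, evaluated at the data points."""
    w = np.exp(-0.5 * ((x[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def backfit_two_scales(t, a, y, h=0.5, n_iter=200, tol=1e-8):
    mu = y.mean()
    f1 = np.zeros_like(y, dtype=float)     # effect on the follow-up-time scale
    f2 = np.zeros_like(y, dtype=float)     # effect on the age scale
    for _ in range(n_iter):
        f1_old = f1.copy()
        f1 = kernel_smooth(t, y - mu - f2, h)
        f1 -= f1.mean()                    # centering for identifiability
        f2 = kernel_smooth(a, y - mu - f1, h)
        f2 -= f2.mean()
        if np.max(np.abs(f1 - f1_old)) < tol:
            break
    return mu, f1, f2
```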


2017, Vol 31 (8), pp. 3609-3618
Author(s): Jérôme Mendes, Francisco Souza, Rui Araújo, Saeid Rastegar

2002, Vol 14 (10), pp. 2415-2437
Author(s): Robert A. Jacobs, Wenxin Jiang, Martin A. Tanner

Previous researchers developed new learning architectures for sequential data by extending conventional hidden Markov models through the use of distributed state representations. Although exact inference and parameter estimation in these architectures are computationally intractable, Ghahramani and Jordan (1997) showed that approximate inference and parameter estimation in one such architecture, factorial hidden Markov models (FHMMs), is feasible in certain circumstances. However, the learning algorithm proposed by these investigators, based on variational techniques, is difficult to understand and implement and is limited to the study of real-valued data sets. This article proposes an alternative method for approximate inference and parameter estimation in FHMMs based on the perspective that FHMMs are a generalization of a well-known class of statistical models known as generalized additive models (GAMs; Hastie & Tibshirani, 1990). Using existing statistical techniques for GAMs as a guide, we have developed the generalized backfitting algorithm. This algorithm computes customized error signals for each hidden Markov chain of an FHMM and then trains each chain one at a time using conventional techniques from the hidden Markov models literature. Relative to previous perspectives on FHMMs, we believe that the viewpoint taken here has a number of advantages. First, it places FHMMs on firm statistical foundations by relating them to a class of models that are well studied in the statistics community, yet it generalizes this class of models in an interesting way. Second, it leads to an understanding of how FHMMs can be applied to many different types of time-series data, including Bernoulli and multinomial data, not just data that are real valued. Finally, it leads to an effective learning procedure for FHMMs that is easier to understand and easier to implement than existing learning procedures. Simulation results suggest that FHMMs trained with the generalized backfitting algorithm are a practical and powerful tool for analyzing sequential data.
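
The sketch below shows only the outer loop of a generalized backfitting scheme of this kind, assuming an additive Gaussian output so that each chain's "customized error signal" is simply the residual after subtracting the other chains' current predictions. hmmlearn's GaussianHMM stands in as the conventional single-chain learner; the chain count, state count, and additive output assumption are illustrative, not the exact procedure of the paper.

```python
# Outer loop of a generalized-backfitting-style fit of an additive factorial
# HMM: each chain is refit on the residual left by the other chains.
import numpy as np
from hmmlearn.hmm import GaussianHMM

def chain_prediction(model, x):
    """Posterior-weighted mean emission of one chain for the sequence x."""
    post = model.predict_proba(x)          # (T, n_states) state posteriors
    return post @ model.means_             # (T, 1) expected output

def generalized_backfit(y, n_chains=2, n_states=3, n_sweeps=5, seed=0):
    y = np.asarray(y, dtype=float).reshape(-1, 1)
    models = [GaussianHMM(n_components=n_states, random_state=seed + c)
              for c in range(n_chains)]
    preds = [np.zeros_like(y) for _ in range(n_chains)]
    for _ in range(n_sweeps):
        for c in range(n_chains):
            # customized error signal: remove the other chains' contributions
            resid = y - sum(p for k, p in enumerate(preds) if k != c)
            models[c].fit(resid)           # conventional single-HMM training
            preds[c] = chain_prediction(models[c], resid)
    return models
```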


Author(s): Craig F. Ansley, Robert Kohn

Abstract The backfitting algorithm is an iterative procedure for fitting additive models in which, at each step, one component is estimated keeping the other components fixed, the algorithm proceeding component by component and iterating until convergence. Convergence of the algorithm has been studied by Buja, Hastie, and Tibshirani (1989). We give a simple, but more general, geometric proof of the convergence of the backfitting algorithm when the additive components are estimated by penalized least squares. Our treatment covers spline smoothers and structural time series models, and we give a full discussion of the degenerate case. Our proof is based on Halperin's (1962) generalization of von Neumann's alternating projection theorem.
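
To make the penalized-least-squares setting concrete, the sketch below runs Gauss-Seidel backfitting in which each component update applies a linear ridge smoother S_j = B_j (B_j' B_j + lam I)^{-1} B_j' to the current partial residual. The bases, penalty, and synthetic data are illustrative assumptions, not the paper's examples.

```python
# Backfitting with components estimated by penalized least squares: every
# update applies a linear smoother to the current partial residual.
import numpy as np

def ridge_smoother(B, lam):
    """Hat matrix of a penalized least-squares fit on the basis matrix B."""
    p = B.shape[1]
    return B @ np.linalg.solve(B.T @ B + lam * np.eye(p), B.T)

def backfit_pls(y, smoothers, n_iter=500, tol=1e-10):
    """Gauss-Seidel backfitting: each component is refit to its partial residual."""
    f = [np.zeros_like(y, dtype=float) for _ in smoothers]
    for _ in range(n_iter):
        change = 0.0
        for j, S in enumerate(smoothers):
            resid = y - sum(f[k] for k in range(len(f)) if k != j)
            new = S @ resid
            change = max(change, np.max(np.abs(new - f[j])))
            f[j] = new
        if change < tol:
            break
    return f

# Illustrative usage with synthetic data and polynomial bases
rng = np.random.default_rng(0)
x1, x2 = rng.uniform(size=(2, 200))
y = np.sin(2 * np.pi * x1) + x2 ** 2 + 0.1 * rng.standard_normal(200)
S1 = ridge_smoother(np.vander(x1, 6), lam=1e-3)
S2 = ridge_smoother(np.vander(x2, 6), lam=1e-3)
f1_hat, f2_hat = backfit_pls(y, [S1, S2])
```

With lam > 0 each ridge smoother has operator norm strictly below one, so the iteration contracts to a unique fixed point, consistent with the projection-based view of convergence discussed in the abstract.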

