Response transformation models

2017 ◽ pp. 174-200 ◽ Author(s): T.J. Hastie, R.J. Tibshirani

2015 ◽ Vol 3 (2) ◽ pp. 135 ◽ Author(s): Sunil K. Sapra

The paper studies various response transformation models for discrete choice and categorical data. These models are fitted to binary response data on beverage choice; several models are compared, and the best is selected using AICs and deviances. The transformations include extensions to categorical data of the Box–Cox transformation to normality that is widely used for continuous data. The econometric techniques employed in the paper are widely applicable to the analysis of count, binary response, and duration data encountered in business and economics.
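
As a rough sketch of the kind of comparison described above (not the paper's own code), one can fit binary response models under several common link functions and rank them by AIC and deviance; the data file, outcome, and covariate names below are hypothetical placeholders.

```python
# Hedged sketch: compare binary response models under different link
# functions by AIC and deviance, in the spirit of the comparison above.
# "beverage_choice.csv", the outcome `choice`, and the covariates `price`,
# `income`, `age` are illustrative assumptions, not the paper's data.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

df = pd.read_csv("beverage_choice.csv")  # binary `choice` plus covariates

links = {
    "logit": sm.families.links.Logit(),
    "probit": sm.families.links.Probit(),
    "cloglog": sm.families.links.CLogLog(),
}

fits = {name: smf.glm("choice ~ price + income + age",
                      data=df,
                      family=sm.families.Binomial(link=link)).fit()
        for name, link in links.items()}

# Lower AIC / deviance indicates the better-fitting specification.
for name, res in fits.items():
    print(f"{name:8s}  AIC = {res.aic:8.2f}  deviance = {res.deviance:8.2f}")
```

A Box–Cox-type extension would follow the same pattern, with the transformation parameter profiled over a grid before comparing AICs.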


2020 ◽ pp. 1471082X2096691 ◽ Author(s): Amani Almohaimeed, Jochen Einbeck

Random effect models have been a mainstream statistical technique for several decades, and the same can be said for response transformation models such as the Box–Cox transformation. The latter aims to ensure that the assumptions of normality and homoscedasticity of the response distribution are fulfilled, which are essential conditions for inference based on a linear model or a linear mixed model. However, methodology for response transformation with simultaneous inclusion of random effects has been developed and implemented only scarcely, and is so far restricted to Gaussian random effects. We develop such methodology without requiring parametric assumptions on the distribution of the random effects. This is achieved by extending the 'nonparametric maximum likelihood' approach to a 'nonparametric profile maximum likelihood' technique, which allows us to deal with overdispersion as well as two-level data scenarios.
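
As a minimal illustration of combining a response transformation with a random effect (assuming a Gaussian random intercept, which is precisely the restriction the paper relaxes), one can profile the Box–Cox parameter over a grid for a linear mixed model; all file, column, and grouping names below are hypothetical.

```python
# Hedged sketch: profile the Box-Cox parameter for a linear mixed model and
# keep the lambda with the highest profile log-likelihood. This uses a
# Gaussian random intercept from statsmodels, i.e. it does NOT implement the
# paper's nonparametric profile maximum likelihood; data and column names
# ("two_level_data.csv", y, x, group) are illustrative assumptions.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def boxcox(y, lam):
    """Box-Cox transform: (y**lam - 1) / lam, or log(y) when lam == 0."""
    return np.log(y) if lam == 0 else (y ** lam - 1.0) / lam

df = pd.read_csv("two_level_data.csv")  # positive response y, covariate x, cluster id group
log_y_sum = np.sum(np.log(df["y"]))

best_lam, best_ll = None, -np.inf
for lam in np.linspace(-2.0, 2.0, 41):
    work = df.assign(z=boxcox(df["y"], lam))
    res = smf.mixedlm("z ~ x", data=work, groups="group").fit(reml=False)
    # The Jacobian term (lam - 1) * sum(log y) makes log-likelihoods
    # comparable across different values of lambda.
    ll = res.llf + (lam - 1.0) * log_y_sum
    if ll > best_ll:
        best_lam, best_ll = lam, ll

print(f"profile-likelihood choice of lambda: {best_lam:.2f}")
```

In the setting of the paper, the Gaussian random-intercept fit inside the loop would be replaced by a nonparametric maximum likelihood fit over a discrete mixing distribution.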


2006 ◽ Author(s): Antonio Miguel, Eduardo Lleida, Alfons Juan, Luis Buera, Alfonso Ortega, ...

1989 ◽ Vol 17 (1) ◽ pp. 195-208 ◽ Author(s): Niels Christian Bang Jespersen
