Elements for Building Supervised Statistical Machine Learning Models
Abstract
This chapter gives details of the linear multiple regression model, including its assumptions, some of its pros and cons, and maximum likelihood estimation of its parameters. Gradient descent methods are described for learning the parameters under this model. Penalized linear multiple regression is derived under Ridge and Lasso penalties, with emphasis on the estimation of the regularization parameter, which is important for a successful implementation. Examples are given for both penalties (Ridge and Lasso) and for the non-penalized multiple regression framework to illustrate the circumstances under which the penalized versions should be preferred. Finally, the fundamentals of penalized and non-penalized logistic regression are provided under a gradient descent framework, and examples of logistic regression are given. Each example comes with the corresponding R code to facilitate its quick understanding and use.
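As a flavor of the kind of R examples the chapter develops, the following minimal sketch (not taken from the chapter) fits a Ridge-penalized linear multiple regression by gradient descent; the simulated data, learning rate, and regularization value are illustrative assumptions only.

set.seed(1)
n <- 100; p <- 5
X <- cbind(1, matrix(rnorm(n * p), n, p))   # design matrix with an intercept column
beta_true <- c(2, 1, -1, 0.5, 0, 0)         # assumed "true" coefficients
y <- X %*% beta_true + rnorm(n, sd = 0.5)

lambda <- 0.1   # assumed regularization parameter (Ridge penalty)
eta    <- 0.01  # assumed learning rate for gradient descent
beta   <- rep(0, ncol(X))                   # initialize coefficients at zero

for (iter in 1:5000) {
  resid <- X %*% beta - y
  # gradient of (1/2n)*RSS + (lambda/2)*||beta||^2, leaving the intercept unpenalized
  grad <- crossprod(X, resid) / n + lambda * c(0, beta[-1])
  beta <- beta - eta * grad
}

print(round(cbind(estimate = drop(beta), truth = beta_true), 3))

Replacing the Ridge term with a soft-thresholding step would give the Lasso analogue, and swapping the squared-error loss for the logistic log-likelihood yields the gradient descent scheme for logistic regression discussed later in the chapter.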