Abstract
The fundamentals of Reproducing Kernel Hilbert Spaces (RKHS) regression methods are described in this chapter. We first point out the virtues of RKHS regression methods and why these methods are gaining broad acceptance in statistical machine learning. Key elements for the construction of RKHS regression methods are provided, the kernel trick is explained in some detail, and the main kernel functions for building kernel matrices are presented. This chapter explains some loss functions under a fixed model framework with examples of Gaussian, binary, and categorical response variables. We illustrate the use of mixed models with kernels by providing examples for continuous response variables. Practical issues for tuning the kernels are illustrated. We extend the RKHS regression methods to a Bayesian framework with practical examples applied to continuous and categorical response variables, including in the predictor the main effects of environments, genotypes, and the genotype × environment interaction. We show examples of multi-trait RKHS regression methods for continuous response variables. Finally, we address some practical issues of kernel compression methods, which are important for reducing the computational cost of implementing conventional RKHS methods.
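As a preview of the methods developed in the chapter, the following is a minimal, hypothetical sketch of RKHS regression for a continuous response, implemented as kernel ridge regression with a Gaussian (radial basis function) kernel; the data, bandwidth, and penalty values are illustrative assumptions, not examples from the chapter.

```python
import numpy as np

def rbf_kernel(A, B, bandwidth=1.0):
    # Gram matrix K[i, j] = exp(-||a_i - b_j||^2 / (2 * bandwidth^2));
    # the kernel trick lets us work with K instead of explicit basis functions.
    sq_dist = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dist / (2.0 * bandwidth ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(60, 1))                # training inputs (illustrative)
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=60)     # noisy continuous response

lam = 0.1                                           # ridge (regularization) penalty
K = rbf_kernel(X, X)
# Dual coefficients: solve (K + lam * I) alpha = y
alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)

# Predictions at new points: f(x) = sum_i alpha_i k(x, x_i)
X_new = np.linspace(-3, 3, 100)[:, None]
y_hat = rbf_kernel(X_new, X) @ alpha
```

The bandwidth and penalty here are fixed by hand; the chapter's discussion of kernel tuning addresses how such hyperparameters are chosen in practice.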