Summary
We propose a novel estimator of the error variance based on ridge regression and random matrix theory, and establish its asymptotic properties. The proposed estimator is valid under both low- and high-dimensional models, and performs well not only in nonsparse settings but also in sparse ones. Its finite-sample performance is assessed through an extensive numerical study, which indicates that the method compares favourably with its competitors in many scenarios of interest.
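To fix ideas, a common naive baseline for this problem (not the estimator proposed in the paper) divides the ridge residual sum of squares by the residual effective degrees of freedom, n - tr(H), where H = X(X'X + λI)⁻¹X' is the ridge hat matrix. A minimal sketch, with an illustrative simulated design and a hypothetical choice of λ:

```python
import numpy as np

def ridge_error_variance(X, y, lam=1.0):
    """Naive ridge-based estimate of the error variance sigma^2.

    Divides the residual sum of squares of a ridge fit by the
    residual effective degrees of freedom n - tr(H), where
    H = X (X'X + lam I)^{-1} X' is the ridge hat matrix.
    This is an illustrative baseline, not the paper's estimator.
    """
    n, p = X.shape
    # Ridge coefficients: (X'X + lam I)^{-1} X'y
    A = X.T @ X + lam * np.eye(p)
    beta_hat = np.linalg.solve(A, X.T @ y)
    resid = y - X @ beta_hat
    # Effective degrees of freedom of the ridge smoother: tr(H)
    df = np.trace(X @ np.linalg.solve(A, X.T))
    return resid @ resid / (n - df)

# Simulated check under a null model (pure noise, sigma = 1.5)
rng = np.random.default_rng(0)
n, p, sigma = 200, 50, 1.5
X = rng.standard_normal((n, p))
y = sigma * rng.standard_normal(n)
print(ridge_error_variance(X, y, lam=10.0))  # should be near sigma^2 = 2.25
```

Such a baseline degrades when the signal is strong or p exceeds n, which is the regime the proposed random-matrix-theory correction is designed to handle.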