High-Dimensional Least-Squares with Perfect Positive Correlation
The least-squares method is a common and important tool in linear regression. However, it often overfits when dealing with high-dimensional problems, and various regularization schemes incorporating prior information for specific problems have been studied to remedy this deficiency. Drawing on Kendall's τ from the nonparametric analysis community, we establish a new model in which ordinary least-squares is equipped with a perfect positive correlation constraint, designed to keep the rankings of the observations concordant with those of the systematic components. By sorting the observations into ascending order, we reduce the perfect positive correlation constraint to a linear inequality system. The resulting linearly constrained least-squares problem, together with its dual problem, is shown to be solvable. In particular, we introduce a mild assumption on the observations and the measurement matrix that excludes the zero vector from the optimal solution set, which indicates that the proposed model is statistically meaningful. To handle large-scale instances, we propose an efficient alternating direction method of multipliers (ADMM) that solves the proposed model from the dual perspective. The effectiveness of our model relative to ordinary least-squares is evaluated in terms of the rank correlation coefficient between the outputs and the systematic components, and the efficiency of our dual algorithm is demonstrated by comparison with three efficient solvers via CVX in terms of computation time, solution accuracy, and rank correlation coefficient.
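The reduction described above can be sketched directly: once the observations are sorted in ascending order, concordance of the rankings amounts to requiring the fitted systematic components to be nondecreasing in that order, a linear inequality system. The following is a minimal illustrative sketch using a generic SLSQP solver; the function name `concordant_least_squares` and the solver choice are assumptions for illustration, not the paper's dual ADMM algorithm.

```python
import numpy as np
from scipy.optimize import minimize

def concordant_least_squares(X, y):
    """Least-squares with a perfect positive correlation constraint
    (illustrative sketch, not the paper's dual ADMM solver).

    After sorting the observations y in ascending order, concordance of
    the rankings of y and X @ beta reduces to requiring the fitted
    values (X @ beta), taken in that order, to be nondecreasing.
    """
    n = len(y)
    order = np.argsort(y)                 # sort observations ascending
    Xs = X[order]
    # First-difference matrix D: (D @ Xs) @ beta >= 0 enforces that
    # consecutive fitted values are nondecreasing in the sorted order.
    D = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)
    A = D @ Xs

    obj = lambda b: 0.5 * np.sum((X @ b - y) ** 2)
    grad = lambda b: X.T @ (X @ b - y)
    cons = {"type": "ineq", "fun": lambda b: A @ b, "jac": lambda b: A}

    b0 = np.linalg.lstsq(X, y, rcond=None)[0]   # OLS warm start
    res = minimize(obj, b0, jac=grad, constraints=[cons], method="SLSQP")
    return res.x
```

A specialized dual method such as the paper's ADMM is needed for large-scale instances; a general-purpose solver like the one above is only practical for small problems, but it makes the structure of the linear inequality system explicit.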