Population Risk Improvement with Model Compression: An Information-Theoretic Approach

Entropy ◽  
2021 ◽  
Vol 23 (10) ◽  
pp. 1255
Author(s):  
Yuheng Bu ◽  
Weihao Gao ◽  
Shaofeng Zou ◽  
Venugopal V. Veeravalli

It has been reported in many recent works on deep model compression that the population risk of a compressed model can be even better than that of the original model. In this paper, an information-theoretic explanation for this population risk improvement phenomenon is provided by jointly studying the decrease in the generalization error and the increase in the empirical risk that results from model compression. It is first shown that model compression reduces an information-theoretic bound on the generalization error, which suggests that model compression can be interpreted as a regularization technique to avoid overfitting. The increase in empirical risk caused by model compression is then characterized using rate distortion theory. These results imply that the overall population risk could be improved by model compression if the decrease in generalization error exceeds the increase in empirical risk. A linear regression example is presented to demonstrate that such a decrease in population risk due to model compression is indeed possible. Our theoretical results further suggest a way to improve a widely used model compression algorithm, i.e., Hessian-weighted K-means clustering, by regularizing the distance between the clustering centers. Experiments with neural networks are provided to validate our theoretical assertions.
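The "information-theoretic bound on the generalization error" invoked above is, in the line of work this paper builds on, the mutual-information bound of Xu and Raginsky. A hedged sketch of why compression tightens it, assuming a σ-sub-Gaussian loss (the notation Ŵ for the compressed model is ours, not from the abstract):

```latex
% For a \sigma-sub-Gaussian loss, training set S of n samples,
% and learned model W, the expected generalization error satisfies
\[
  \bigl|\mathbb{E}[\mathrm{gen}(S, W)]\bigr|
  \;\le\;
  \sqrt{\frac{2\sigma^2}{n}\, I(S; W)}.
\]
% Compression produces \hat{W} from W alone, giving the Markov chain
% S \to W \to \hat{W}; by the data-processing inequality,
\[
  I(S; \hat{W}) \;\le\; I(S; W),
\]
% so the same bound evaluated at the compressed model is no larger.
% This is the sense in which compression acts as a regularizer.
```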

2020 ◽  
Vol 34 (04) ◽  
pp. 3300-3307
Author(s):  
Yuheng Bu ◽  
Weihao Gao ◽  
Shaofeng Zou ◽  
Venugopal Veeravalli

We show that model compression can improve the population risk of a pre-trained model, by studying the tradeoff between the decrease in the generalization error and the increase in the empirical risk with model compression. We first prove that model compression reduces an information-theoretic bound on the generalization error; this allows for an interpretation of model compression as a regularization technique to avoid overfitting. We then characterize the increase in empirical risk with model compression using rate distortion theory. These results imply that the population risk could be improved by model compression if the decrease in generalization error exceeds the increase in empirical risk. We show through a linear regression example that such a decrease in population risk due to model compression is indeed possible. Our theoretical results further suggest that the Hessian-weighted K-means clustering compression approach can be improved by regularizing the distance between the clustering centers. We provide experiments with neural networks to support our theoretical assertions.
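To make the "Hessian-weighted K-means clustering" step concrete, here is a minimal sketch of clustering scalar weights with a diagonal-Hessian importance weighting, plus a shrinkage term `lam` that pulls the cluster centers toward their weighted mean. The shrinkage is one simple way to "regularize the distance between the clustering centers" as suggested above; the function name, the shrinkage form, and all variable names are our own illustrative choices, not the paper's code.

```python
def hessian_weighted_kmeans(weights, hess_diag, k, lam=0.0, iters=50):
    """Cluster scalar weights into k centers, weighting each point's
    squared error by its diagonal-Hessian value hess_diag[i].
    lam > 0 shrinks every center toward the global Hessian-weighted
    mean, penalizing widely spread centers; lam = 0 recovers plain
    Hessian-weighted K-means."""
    # Initialize centers spread uniformly over the weight range.
    lo, hi = min(weights), max(weights)
    centers = [lo + (hi - lo) * (j + 0.5) / k for j in range(k)]
    assign = [0] * len(weights)
    for _ in range(iters):
        # Assignment: nearest center (the per-point Hessian weight
        # does not change the argmin for a single scalar point).
        for i, w in enumerate(weights):
            assign[i] = min(range(k), key=lambda j: (w - centers[j]) ** 2)
        # Update: Hessian-weighted mean per cluster, shrunk toward the
        # global weighted mean. Minimizes, per cluster j with members M,
        #   sum_{i in M} h_i (w_i - c)^2 + lam * (sum_{i in M} h_i) (c - m)^2
        total_h = sum(hess_diag) or 1.0
        global_mean = sum(h * w for w, h in zip(weights, hess_diag)) / total_h
        for j in range(k):
            num = sum(hess_diag[i] * weights[i]
                      for i in range(len(weights)) if assign[i] == j)
            den = sum(hess_diag[i]
                      for i in range(len(weights)) if assign[i] == j)
            if den > 0:
                centers[j] = (num + lam * den * global_mean) / (den * (1.0 + lam))
    return centers, assign
```

With `lam = 0` this quantizes weights by importance alone; increasing `lam` trades a little extra empirical risk (centers move off the weighted means) for closer-together centers, mirroring the regularization-versus-distortion tradeoff the abstract describes.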

