Contributed Talk: Provable Benefits of Overparameterization in Model Compression: From Double Descent to Pruning Neural Networks

Speaker: Xiangyu Chang, UC Riverside
Talk title: Provable Benefits of Overparameterization in Model Compression: From Double Descent to Pruning Neural Networks

Time: Tuesday, April 20, 10:45am-11:10am (PT)

Abstract:
Extensive empirical evidence indicates that pruning an overparameterized model leads to better model compression schemes. This paper sheds light on these findings by characterizing the high-dimensional asymptotics of model pruning in the overparameterized regime. Our approach is based on characterizing the exact asymptotic distribution of the overparameterized least-squares estimator. The intuition gained by analytically studying simpler models is numerically verified on neural networks.

Joint work with Yingcong Li, Samet Oymak, and Christos Thrampoulidis.
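Below is a minimal illustrative sketch, not taken from the talk or the paper, of the double-descent phenomenon that motivates this line of work: the test error of the minimum-norm least-squares fit plotted against model size typically peaks near the interpolation threshold and then improves again in the overparameterized regime. All dimensions, noise levels, and the feature-subset setup are arbitrary assumptions chosen only for illustration.

```python
# Illustrative sketch (not from the talk): double descent for minimum-norm
# least squares as the number of used features p crosses the sample size n.
# The specific dimensions and noise level below are arbitrary choices.
import numpy as np

rng = np.random.default_rng(0)
n, n_test, d_true, noise = 50, 1000, 200, 0.5

# Latent linear model: y = x @ beta + noise, with x of dimension d_true.
beta = rng.standard_normal(d_true) / np.sqrt(d_true)
X_train = rng.standard_normal((n, d_true))
y_train = X_train @ beta + noise * rng.standard_normal(n)
X_test = rng.standard_normal((n_test, d_true))
y_test = X_test @ beta + noise * rng.standard_normal(n_test)

for p in [10, 25, 45, 50, 55, 75, 100, 150, 200]:
    # Fit using only the first p features ("model size").
    # np.linalg.pinv returns the minimum-norm solution when p > n.
    beta_hat = np.linalg.pinv(X_train[:, :p]) @ y_train
    test_err = np.mean((X_test[:, :p] @ beta_hat - y_test) ** 2)
    print(f"p = {p:4d}  test MSE = {test_err:.3f}")
```

Running this, the test error typically rises sharply as p approaches n = 50 and then falls again for p > n, the qualitative curve whose exact asymptotics the talk analyzes for least squares.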

