Publication Date
11-2022
Conference/Sponsorship/Institution
Thirty-sixth Conference on Neural Information Processing Systems (NeurIPS 2022)
Description
Pruning techniques have been successfully used in neural networks to trade accuracy for sparsity. However, the impact of network pruning is not uniform: prior work has shown that the recall for underrepresented classes in a dataset may be more negatively affected. In this work, we study such relative distortions in recall by hypothesizing an intensification effect that is inherent to the model: pruning makes recall relatively worse for a class whose recall is below the overall accuracy and, conversely, relatively better for a class whose recall is above it. In addition, we propose a new pruning algorithm aimed at attenuating this effect. Through statistical analysis, we observe that intensification is less severe with our algorithm, but it remains more pronounced with relatively harder tasks, less complex models, and higher pruning ratios. Conversely, and more surprisingly, we observe a de-intensification effect at lower pruning ratios, which indicates that moderate pruning may have a corrective effect on such distortions.
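The intensification hypothesis in the abstract can be stated operationally: a class's recall distortion intensifies if pruning moves its recall further from the overall accuracy, in the same direction as its original gap. Below is a minimal sketch of that check; the function names and the numbers are illustrative assumptions, not taken from the paper.

```python
# Hypothetical illustration of the intensification hypothesis:
# pruning pushes per-class recall further away from overall accuracy.
def recall_gap(recall, accuracy):
    """Signed gap between a class's recall and the model's overall accuracy."""
    return recall - accuracy

def intensified(dense_recall, dense_acc, pruned_recall, pruned_acc):
    """True if pruning widened the class's recall gap in its original direction."""
    before = recall_gap(dense_recall, dense_acc)
    after = recall_gap(pruned_recall, pruned_acc)
    return (before < 0 and after < before) or (before > 0 and after > before)

# A class already below accuracy falls further behind after pruning.
print(intensified(dense_recall=0.80, dense_acc=0.90,
                  pruned_recall=0.72, pruned_acc=0.88))  # True
```

Under this reading, de-intensification at low pruning ratios would correspond to the post-pruning gap shrinking toward zero rather than widening.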
Type
Conference Paper
Department
Analytics & Operations Management
Link to published version
https://openreview.net/forum?id=5hgYi4r5MDp
Recommended Citation
Good, Aidan; Lin, Jiaqi; Sieg, Hannah; Ferguson, Mikey; Yu, Xin; Zhe, Shandian; Wieczorek, Jerzy; and Serra, Thiago, "Recall Distortion in Neural Network Pruning and the Undecayed Pruning Algorithm" (2022). Faculty Conference Papers and Presentations. 69.
https://digitalcommons.bucknell.edu/fac_conf/69
Included in
Artificial Intelligence and Robotics Commons, Data Science Commons, Operational Research Commons