Even your Teacher Needs Guidance: Ground-Truth Targets Dampen Regularization Imposed by Self-Distillation

Kenneth Borup, Lars N. Andersen

Publication: Conference contribution › Paper › Research › peer review

Abstract

Knowledge distillation is classically a procedure in which a neural network is trained on the output of another network, along with the original targets, in order to transfer knowledge between the architectures. The special case of self-distillation, where the network architectures are identical, has been observed to improve generalization accuracy. In this paper, we consider an iterative variant of self-distillation in a kernel regression setting, in which successive steps incorporate both model outputs and the ground-truth targets. This allows us to provide the first theoretical results on the importance of using weighted ground-truth targets in self-distillation. Our focus is on fitting nonlinear functions to training data with a weighted mean squared error objective suitable for distillation, subject to $\ell_2$ regularization of the model parameters. We show that any such function obtained with self-distillation can be calculated directly as a function of the initial fit, and that infinitely many distillation steps yield the same optimization problem as the original, albeit with amplified regularization. Finally, we examine empirically, both in a regression setting and with ResNet networks, how the choice of weighting parameter influences the generalization performance after self-distillation.
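
As an illustrative sketch only (not the authors' exact formulation), the Python snippet below shows one plausible reading of the iterative procedure described in the abstract: each step fits kernel ridge regression, i.e. least squares with $\ell_2$ regularization, to targets that are a convex combination of the ground-truth labels and the previous step's predictions, governed by a hypothetical weighting parameter alpha.

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    # Gaussian (RBF) kernel matrix between two sets of points.
    sq_dists = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq_dists)

def self_distill(X, y, alpha=0.5, lam=0.1, steps=5, gamma=1.0):
    # Iterative self-distillation for kernel ridge regression (a sketch, not
    # the paper's exact setup). At every step, the regression targets are a
    # weighted combination of the ground-truth labels y (weight alpha) and the
    # previous fit's predictions (weight 1 - alpha); each fit solves the
    # l2-regularized least-squares problem in closed form.
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    targets = y.copy()
    coefs = []
    for _ in range(steps):
        coef = np.linalg.solve(K + lam * np.eye(n), targets)  # closed-form kernel ridge fit
        preds = K @ coef                                       # predictions on the training set
        coefs.append(coef)
        targets = alpha * y + (1.0 - alpha) * preds            # next-step distillation targets
    return coefs

# Toy usage: fit noisy samples of a sine function and distill for 10 steps.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=50)
coefs = self_distill(X, y, alpha=0.7, lam=0.1, steps=10)
```

In this sketch, alpha = 1 simply refits the ground-truth labels at every step, while alpha = 0 corresponds to pure self-distillation on model outputs alone; intermediate values mix the two, which is the regime the abstract describes as incorporating both model outputs and ground-truth targets.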
Original language: English
Publication date: Dec. 2021
Status: Published - Dec. 2021
Event: Conference on Neural Information Processing Systems 2021
Duration: 6 Dec 2021 – 14 Dec 2021
https://nips.cc/

Conference

Conference: Conference on Neural Information Processing Systems 2021
Period: 06/12/2021 – 14/12/2021
Internet address: https://nips.cc/
