
This paper introduces Logit Calibration (LoCa) to enhance knowledge distillation by correcting teacher model predictions while preserving valuable information, improving student model performance without extra parameters.
https://arxiv.org/abs/2409.04778
YouTube: https://www.youtube.com/@ArxivPapers
TikTok: https://www.tiktok.com/@arxiv_papers
Apple Podcasts: https://podcasts.apple.com/us/podcast/arxiv-papers/id1692476016
Spotify: https://podcasters.spotify.com/pod/show/arxiv-papers
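The summary above only gestures at the mechanism, so here is a minimal PyTorch sketch of the general idea: before distillation, correct teacher samples whose top prediction disagrees with the ground-truth label, then distill from the calibrated logits. The correction rule shown (raising the true-class logit just above the wrong top prediction), the function names, and the margin/temperature/alpha parameters are illustrative assumptions, not the paper's exact LoCa formulation; see arXiv:2409.04778 for the actual method.

```python
import torch
import torch.nn.functional as F

def calibrate_teacher_logits(teacher_logits, labels, margin=1e-3):
    """Hypothetical calibration: for samples the teacher misclassifies,
    lift the ground-truth logit just above the current top prediction;
    correctly classified samples and non-target logits are left untouched."""
    logits = teacher_logits.clone()
    wrong = logits.argmax(dim=1) != labels          # samples needing correction
    if wrong.any():
        top_vals = logits[wrong].max(dim=1).values
        logits[wrong, labels[wrong]] = top_vals + margin
    return logits

def kd_loss_with_calibration(student_logits, teacher_logits, labels,
                             T=4.0, alpha=0.5):
    """Standard KD objective, but distilling from calibrated teacher logits.
    No extra trainable parameters are introduced."""
    calibrated = calibrate_teacher_logits(teacher_logits, labels)
    soft_teacher = F.softmax(calibrated / T, dim=1)
    log_soft_student = F.log_softmax(student_logits / T, dim=1)
    kd = F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * (T * T)
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce
```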
By Igor Melnyk
