Bridging the Gap: Unpacking the Hidden Challenges in Knowledge Distillation for Online Ranking Systems
Topics: AI (Deep Learning), AIOverviews, Data Mining, LLMO, SGE
The paper from Google DeepMind, titled “Bridging the Gap: Unpacking the Hidden Challenges in Knowledge Distillation for Online Ranking Systems,” examines the application of knowledge distillation (KD) to recommender systems, focusing on large-scale online video platforms. It describes challenges unique to these systems that do not arise in traditional KD applications in computer vision and NLP: data distribution shifts between the teacher’s and student’s training data, finding the optimal teacher configuration, and efficiently sharing a single teacher across multiple students. The authors present solutions to these challenges, including an auxiliary distillation strategy and continuous updates to the teacher model, which have been tested on Google’s video recommendation systems.
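To make the auxiliary distillation idea concrete, here is a minimal PyTorch sketch of a student ranker with a separate auxiliary head that learns the teacher’s soft labels while the main head learns the ground-truth labels. The network sizes, the binary-label setup, and the `alpha` weighting are illustrative assumptions, not details taken from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StudentWithAuxiliaryHead(nn.Module):
    """Toy student ranker: a shared trunk feeding a main head (trained on
    ground-truth labels) and an auxiliary head (trained on teacher logits).
    Layer sizes here are illustrative, not from the paper."""
    def __init__(self, num_features: int, hidden_dim: int = 64):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(num_features, hidden_dim), nn.ReLU())
        self.main_head = nn.Linear(hidden_dim, 1)  # fits hard labels
        self.aux_head = nn.Linear(hidden_dim, 1)   # fits teacher soft labels

    def forward(self, x):
        h = self.shared(x)
        return self.main_head(h).squeeze(-1), self.aux_head(h).squeeze(-1)

def distillation_loss(main_logits, aux_logits, labels, teacher_logits, alpha=0.5):
    """Hard-label loss on the main head plus a soft-label distillation loss
    on the auxiliary head; `alpha` (an assumed hyperparameter) weights the
    auxiliary term."""
    hard = F.binary_cross_entropy_with_logits(main_logits, labels)
    soft = F.binary_cross_entropy_with_logits(aux_logits, torch.sigmoid(teacher_logits))
    return hard + alpha * soft

# Toy usage: a batch of 8 examples with 16 features each.
model = StudentWithAuxiliaryHead(num_features=16)
x = torch.randn(8, 16)
labels = torch.randint(0, 2, (8,)).float()
teacher_logits = torch.randn(8)  # in practice, produced by the teacher model
main_logits, aux_logits = model(x)
loss = distillation_loss(main_logits, aux_logits, labels, teacher_logits)
loss.backward()
```

The design intuition behind routing the teacher’s signal through a separate head is to let the student absorb the teacher’s knowledge via the shared trunk without letting stale or shifted teacher labels directly distort the main prediction head.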