Researchers developed a knowledge distillation framework that transfers knowledge from Graph Neural Networks (GNNs) to non-neural student models such as tree-based ensembles. Applying it to cell graphs for disease diagnosis, they found that the teacher's logits act as a regularizer, improving student performance under distribution shift (a minimal sketch of the idea follows the source list below).

Sources:
1) Knowledge Distillation on Graphs: A Survey. Yijun Tian, Shichao Pei, Xiangliang Zhang, Chuxu Zhang, Nitesh V. Chawla (University of Notre Dame), February 2023. https://arxiv.org/pdf/2302.00219
2) InfGraND: An Influence-Guided GNN-to-MLP Knowledge Distillation. Amir Eskandari, Aman Anand, Elyas Rashno, Farhana Zulkernine (Queen's University), January 2026. https://arxiv.org/pdf/2601.08033
3) Fairness Implications of GNN-to-MLP Knowledge Distillation. Margaret Capetz, Yizhou Sun, Arjun Subramonian (UCLA), 2025. https://openreview.net/pdf?id=6LPz8LlfeK
4) Distilling knowledge from graph neural networks trained on cell graphs to non-neural student models. Vasundhara Acharya, Bulent Yener, Gillian Beamer (Rensselaer Polytechnic Institute), August 10, 2025. https://www.nature.com/articles/s41598-025-13697-7.pdf
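
To make the core idea concrete, here is a minimal sketch of soft-target distillation into a tree ensemble, assuming the common recipe in which the student regresses the teacher GNN's logits rather than fitting hard labels. The node features and teacher logits below are synthetic stand-ins (none of this reproduces the papers' actual pipelines); in practice the logits would come from a trained GNN over the cell graph.

```python
# Sketch: distill a (hypothetical) GNN teacher's logits into a tree ensemble.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.multioutput import MultiOutputRegressor

rng = np.random.default_rng(0)
n_nodes, n_features, n_classes = 1000, 16, 3

# Per-node features (e.g., per-cell descriptors in a cell graph).
X = rng.normal(size=(n_nodes, n_features))

# Stand-in teacher logits; a trained GNN would produce these per node.
W = rng.normal(size=(n_features, n_classes))
teacher_logits = X @ W + 0.1 * rng.normal(size=(n_nodes, n_classes))

# Student: a tree ensemble trained to match the teacher's logits,
# one regressor per output dimension via MultiOutputRegressor.
student = MultiOutputRegressor(
    RandomForestRegressor(n_estimators=200, random_state=0)
)
student.fit(X, teacher_logits)

# At inference the student predicts logits; argmax yields class labels.
pred_labels = student.predict(X).argmax(axis=1)

# Fidelity: how often the student agrees with the teacher's decision.
teacher_labels = teacher_logits.argmax(axis=1)
print("teacher-student agreement:", (pred_labels == teacher_labels).mean())
```

Regressing continuous logits is what gives the regularization effect the summary mentions: the student sees the teacher's relative confidence across classes rather than brittle one-hot targets, which can help it degrade more gracefully under distribution shift.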