A Survey on Knowledge Distillation with Graphs: Methods and Application Analysis
Affiliation: Institute of Computing Technology, Chinese Academy of Sciences

Abstract:

Graph data, such as citation networks, social networks, and transportation networks, are ubiquitous in the real world. Graph neural networks (GNNs) have attracted wide attention due to their strong expressive power and excellent performance in a variety of graph analysis applications. However, the excellent performance of GNNs relies on labeled data, which is difficult to obtain, and on complex network models, which are computationally expensive. To alleviate the scarcity of labeled data and the high complexity of GNNs, knowledge distillation (KD) has been introduced to enhance existing GNNs. KD trains a small, purpose-built model (the student) under the supervision of soft labels produced by a larger model (the teacher), so that the student achieves better performance and accuracy. How to apply KD to graph data has therefore become a major research challenge, yet a review of graph-based KD research is still lacking. This paper aims to provide a comprehensive overview of graph-based KD, systematically surveying the existing work for the first time and filling the gap left by the absence of a review in this field. Specifically, the paper first introduces the background of graphs and KD. It then summarizes three categories of graph knowledge distillation methods: graph knowledge distillation for deep neural networks (DNNs), graph knowledge distillation for GNNs, and self-distillation-based graph knowledge distillation. Each category is further divided into methods based on the output layer, the intermediate layer, and the constructed graph. The paper then analyzes and compares the ideas behind the various graph-based knowledge distillation algorithms and summarizes their advantages and disadvantages in light of experimental results. In addition, it surveys applications of graph-based knowledge distillation in computer vision, natural language processing, recommendation systems, and other fields. Finally, it summarizes the development of graph-based knowledge distillation and discusses future directions.
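    As a concrete illustration of the soft-label supervision mentioned above, the sketch below shows the classic temperature-scaled distillation loss (Hinton et al., 2015), assuming PyTorch. The function name, the temperature, and the weighting factor alpha are illustrative choices for exposition, not the specific formulation of any method surveyed in this paper.

```python
# Minimal sketch of soft-label knowledge distillation, assuming PyTorch.
# The student is trained against both the ground-truth labels and the
# teacher's temperature-softened output distribution.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Weighted sum of hard-label cross-entropy and a KL-divergence
    term against the teacher's soft labels (illustrative helper)."""
    # Soft targets: teacher probabilities at a raised temperature.
    soft_targets = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # The KL term is scaled by T^2 so its gradient magnitude stays
    # comparable to the cross-entropy term as T varies.
    kd_loss = F.kl_div(soft_student, soft_targets,
                       reduction="batchmean") * temperature ** 2
    ce_loss = F.cross_entropy(student_logits, labels)
    return alpha * kd_loss + (1 - alpha) * ce_loss
```

    A higher temperature flattens the teacher's distribution and exposes the relative similarities among non-target classes ("dark knowledge"), which is precisely the supervision signal that the graph-based KD methods surveyed here adapt to node, edge, and graph-level outputs.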

History
  • Received: September 07, 2022
  • Revised: February 03, 2023
  • Accepted: March 16, 2023