Comprehensive Graph Gradual Pruning for Sparse Training in Graph Neural Networks

Open-source implementation for our TNNLS 2023 paper.

Abstract

  1. We propose a graph gradual pruning framework, named CGP, to reduce the training and inference computation costs of GNN models while preserving their accuracy (a generic sketch of gradual pruning follows this list).

  2. We comprehensively sparsify the elements of GNNs, including graph structures, node feature dimensions, and model parameters, to significantly improve the efficiency of GNN models.

  3. Experimental results on various GNN models and datasets consistently validate the effectiveness and efficiency of the proposed CGP.
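
For intuition, the sketch below implements the cubic sparsity schedule used in standard gradual magnitude pruning (Zhu & Gupta, 2017), where sparsity grows from an initial to a final level over training. This is a generic, illustrative example only; the function names and schedule parameters are our assumptions and are not CGP's actual pruning criterion or code from this repository.

    import torch

    def sparsity_at_step(step, begin, end, final_sparsity, initial_sparsity=0.0):
        """Cubic schedule: sparsity rises from initial to final between begin and end."""
        if step < begin:
            return initial_sparsity
        if step >= end:
            return final_sparsity
        progress = (step - begin) / (end - begin)
        return final_sparsity + (initial_sparsity - final_sparsity) * (1.0 - progress) ** 3

    def magnitude_mask(weight, sparsity):
        """Zero out the smallest-magnitude fraction `sparsity` of entries."""
        k = int(sparsity * weight.numel())
        if k == 0:
            return torch.ones_like(weight)
        threshold = weight.abs().flatten().kthvalue(k).values
        return (weight.abs() > threshold).float()

    # Example: gradually prune a layer's weight to 90% sparsity over steps 100..1000.
    weight = torch.randn(64, 32)
    for step in range(0, 1201, 100):
        mask = magnitude_mask(weight, sparsity_at_step(step, 100, 1000, 0.9))
        # During training one would apply the mask in-place: weight.data *= mask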

Python Dependencies

Our proposed CGP is implemented in Python 3.7, and the major libraries include:

  • PyTorch == 1.11.0+cu113
  • PyG (torch-geometric) == 2.2.0

More dependencies are provided in requirements.txt.
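
A possible installation sequence for the versions above (assuming a CUDA 11.3 environment; the index URL is PyTorch's official wheel repository):

    pip install torch==1.11.0+cu113 --extra-index-url https://download.pytorch.org/whl/cu113
    pip install torch-geometric==2.2.0
    pip install -r requirements.txt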

To Run

Once the requirements are installed, run:

sh xx.sh

Datasets

All datasets used in this paper can be downloaded from PyG.
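
As an illustration, the snippet below downloads a standard dataset through PyG's loaders (Planetoid/Cora here; the dataset and root path are only an example, not a statement of which benchmarks the paper uses):

    from torch_geometric.datasets import Planetoid

    dataset = Planetoid(root='./data/Planetoid', name='Cora')
    data = dataset[0]  # a single graph with node features, edges, and labels
    print(data)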
