# Unifews

This repository contains the original code for *Unifews: Unified Entry-Wise Sparsification for Efficient Graph Neural Network*.
## Dependencies

The Python environment is listed in `env.txt` and can be installed with conda:

```bash
conda create --name <env> --file env.txt
```

Additional requirements:
- C++ 14
- CMake 3.16
- eigen3
## Data Preparation

Use `utils/data_transfer.py` to generate processed files under the path `data/[dataset_name]`, similar to the example folder `data/cora`:
- `adj.npz`: adjacency matrix as a `scipy.sparse.csr_matrix`
- `feats.npy`: features as a `.npy` array
- `labels.npz`: node label information
  - `'label'`: labels (numeric or one-hot)
  - `'idx_train'/'idx_val'/'idx_test'`: indices of training/validation/test nodes
- `adj_el.bin`, `adj_pl.bin`, `attribute.txt`, `degree.npz`: graph files for precomputation
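The generation itself is handled by `utils/data_transfer.py`; below is only a minimal sketch of the documented `adj.npz`/`feats.npy`/`labels.npz` layout on a toy random graph (the `data/toy` path and all sizes are illustrative, and the binary precomputation files are left to the script itself):

```python
import os

import numpy as np
import scipy.sparse as sp

n, d, m = 100, 16, 500                         # toy sizes: nodes, feature dim, edges
rng = np.random.default_rng(0)
os.makedirs("data/toy", exist_ok=True)

# adj.npz: adjacency matrix stored as a scipy.sparse.csr_matrix
row = rng.integers(0, n, size=m)
col = rng.integers(0, n, size=m)
adj = sp.csr_matrix((np.ones(m, dtype=np.float32), (row, col)), shape=(n, n))
sp.save_npz("data/toy/adj.npz", adj)

# feats.npy: dense feature array
np.save("data/toy/feats.npy", rng.random((n, d), dtype=np.float32))

# labels.npz: labels plus split indices under the documented keys
np.savez("data/toy/labels.npz",
         label=rng.integers(0, 7, size=n),
         idx_train=np.arange(0, 60),
         idx_val=np.arange(60, 80),
         idx_test=np.arange(80, 100))
```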
## Running Experiments

Compile the Cython precomputation module:

```bash
cd precompute
python setup.py build_ext --inplace
```

Run a full-batch experiment:

```bash
python run_fb.py -f [seed] -c [config_file] -v [device]
```

Run a mini-batch experiment:

```bash
python run_mb.py -f [seed] -c [config_file] -v [device]
```
## Dataset Sources

- cora, citeseer, pubmed: PyTorch Geometric
- arxiv, products, papers100m: OGB
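For the small citation graphs, the raw data can be pulled through PyTorch Geometric and exported to the layout above. A minimal sketch, paralleling rather than reproducing `utils/data_transfer.py` (the `raw/` download directory is illustrative):

```python
import os

import numpy as np
import scipy.sparse as sp
from torch_geometric.datasets import Planetoid

dataset = Planetoid(root="raw", name="Cora")   # downloads the PyG copy of cora
data = dataset[0]

# adjacency as csr_matrix, built from the [2, num_edges] edge_index tensor
row, col = data.edge_index.numpy()
n = data.num_nodes
adj = sp.csr_matrix((np.ones(row.shape[0], dtype=np.float32), (row, col)),
                    shape=(n, n))

# PyG exposes boolean split masks; labels.npz expects index arrays
idx_train = np.where(data.train_mask.numpy())[0]
idx_val = np.where(data.val_mask.numpy())[0]
idx_test = np.where(data.test_mask.numpy())[0]

os.makedirs("data/cora", exist_ok=True)
sp.save_npz("data/cora/adj.npz", adj)
np.save("data/cora/feats.npy", data.x.numpy())
np.savez("data/cora/labels.npz", label=data.y.numpy(),
         idx_train=idx_train, idx_val=idx_val, idx_test=idx_test)
```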
## Baselines

- GLT: A Unified Lottery Ticket Hypothesis for Graph Neural Networks
- GEBT: Early-Bird GCNs: Graph-Network Co-optimization towards More Efficient GCN Training and Inference via Drawing Early-Bird Lottery Tickets
- CGP: Comprehensive Graph Gradual Pruning for Sparse Training in Graph Neural Networks
- DSpar: An Embarrassingly Simple Strategy for Efficient GNN Training and Inference via Degree-Based Sparsification
- NDLS: Node Dependent Local Smoothing for Scalable Graph Learning
- NIGCN: Node-wise Diffusion for Scalable Graph Learning