Empowering Graph Wavelet Convolution for Node
Classification: A Novel Approach with Local Lifting
Scheme
RME 5401: Thesis
Swakshar Deb
Exam Roll: 120706
Registration No. 2016-614-703
Dr. Sejuti Rahman
Associate Professor
Robotics and Mechatronics Engineering
University of Dhaka
Supervisor
M.Sc. Thesis Defence Presentation
Overview
Part 1
Part 2
Motivation for Node Classification
Fraudster Detection
Friend Suggestion
Drug Discovery
Homophilic and Heterophilic Graphs
Homophilic Graph
Heterophilic Graph
Because neighbours tend to share labels, homophilic graphs are well suited to low-pass filtering
Because neighbours tend to differ, heterophilic graphs require high-pass filters to capture high-frequency patterns
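The homophilic/heterophilic distinction can be made concrete with the edge-homophily ratio, a standard diagnostic; a minimal sketch (the toy graph and labels are illustrative, not from the thesis):

```python
def edge_homophily(edges, labels):
    """Fraction of edges joining same-label nodes:
    near 1 -> homophilic, near 0 -> heterophilic."""
    same = sum(1 for u, v in edges if labels[u] == labels[v])
    return same / len(edges)

# Toy graph: four nodes, two classes
edges = [(0, 1), (1, 2), (2, 3), (0, 2)]
labels = [0, 0, 0, 1]
print(edge_homophily(edges, labels))  # 0.75 -> mostly homophilic
```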
Problem Statement
Input Graph
Wavelet Based Algorithms
G, X
Error
Backpropagate Error
[2,3,4,5,6]
[1]
Related Work
Input Graph
Wavelet Based Algorithms
G, X
Node Classification
Predefined Wavelet-based Approaches [2,3,4,5]
Utilize fixed wavelet filters
Adaptive Wavelet-based Approaches [6,7,8]
Require domain-specific knowledge
Lead to undesirable wavelet filters
A data preprocessing step is mandatory
[2,3,4,5,6,7,8]
Related Work: Adaptive Wavelet-based Approaches
Input Graph
Random Split
Modified Graph
Undesirable
Wavelet
Node Classifier
Vanilla Lifting Scheme [6,7,8]
G, X
Undesirable Wavelet: a filter produced over a disrupted graph structure
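The classical lifting scheme [1] underlying these methods factors a wavelet transform into split, predict, and update steps. A minimal Haar-style sketch on a 1-D signal (the classical construction, not the thesis's graph variant):

```python
import numpy as np

def haar_lifting(x):
    """One level of the classical Haar lifting scheme:
    split -> predict (high-frequency detail) -> update (low-frequency approx)."""
    even, odd = x[0::2], x[1::2]   # split into even/odd samples
    detail = odd - even            # predict: residual = high-frequency part
    approx = even + detail / 2     # update: preserves the pairwise mean
    return approx, detail

x = np.array([2.0, 4.0, 6.0, 8.0])
approx, detail = haar_lifting(x)
print(approx)  # pairwise means (low frequency)
print(detail)  # pairwise differences (high frequency)
```

The lifting factorization is what makes the transform invertible and cheap: each step can be undone by reversing the update and predict operations.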
Proposed Generalized Adaptive Graph Wavelet Neural Network (GA-GWNN)
Node Classifier
Input Graph
Adjacency Matrix
Tree Lifting Scheme
G, X
Structure has been preserved
Representation contains complete information
Computational Tree
Desirable
Wavelet
Desirable Wavelet: Filter over the original graph structure
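The tree lifting idea can be illustrated with a toy predict/update pass over a rooted computation tree; everything here (the tree structure, the averaging rules) is a hypothetical sketch, since GA-GWNN learns its lifting operators:

```python
import numpy as np

def tree_lifting_step(x, tree):
    """Toy lifting over a rooted computation tree.
    `tree` maps parent -> list of children (hypothetical structure).
    Predict each child from its parent (high-frequency residual),
    then update the parent with the mean residual (low-frequency part)."""
    x = np.array(x, dtype=float)
    high = np.zeros_like(x)
    for parent, children in tree.items():
        for c in children:
            high[c] = x[c] - x[parent]            # predict
        if children:
            x[parent] += high[children].mean() / 2  # update
    return x, high

low, high = tree_lifting_step([1.0, 3.0, 5.0], {0: [1, 2]})
print(low, high)
```

Because the tree is built from the original adjacency, the resulting filters act on the intact graph structure, which is the sense in which the wavelet is "desirable."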
Our Contributions
Message Propagation of GA-GWNN
Soft Thresholding
Tree, T_i
High Frequency
Low Frequency
High-pass (red) and low-pass (yellow) filters.
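The soft-thresholding operator referenced above shrinks small (noisy) high-frequency wavelet coefficients toward zero while keeping large ones; a minimal sketch of the standard operator:

```python
import numpy as np

def soft_threshold(w, tau):
    """Standard soft-thresholding: shrink magnitudes by tau, clamp at zero."""
    return np.sign(w) * np.maximum(np.abs(w) - tau, 0.0)

# Coefficients below the threshold (|w| <= 1.0) are zeroed out
print(soft_threshold(np.array([-3.0, 0.5, 0.2, 2.0]), 1.0))  # -> -2, 0, 0, 1
```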
Theorem 4.1
Fusion Module/Operation
Input Graph
Learned wavelet through lifting
Fusion module
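A hypothetical sketch of the fusion operation as a convex combination of the low- and high-frequency node representations; the real module's weighting is learned, and `alpha` here is purely illustrative:

```python
import numpy as np

def fuse(low, high, alpha=0.7):
    """Toy fusion: convex combination of low- and high-frequency
    node representations (illustrative; the actual weights are learned)."""
    return alpha * low + (1.0 - alpha) * high

low = np.array([[1.0, 2.0]])   # smoothed (low-pass) features
high = np.array([[0.0, 4.0]])  # detail (high-pass) features
print(fuse(low, high))         # weighted blend of both bands
```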
Dataset and Evaluation Metrics
Dataset Statistics
Overall Results
Mean accuracy on semi-supervised node classification. Best results are highlighted in bold
Predefined Wavelet Filter
Adaptive Wavelet method but undesirable filter
[9]Hamilton et. al., “Inductive representation learning on large graphs,”Neurips, 2017
[10] Velickovic et. al., “ Graph attention networks,” ICLR, 2018.
Adaptive wavelet method with desirable filters
Overall Results (2)
Mean precision, recall, and F1-score on semi-supervised node classification
Overall Results (3)
Mean accuracy on fully supervised node classification. Best results are highlighted in bold
Homophilic Graph Neural Network
Heterophilic Graph Neural Network
Adaptive wavelet filters
Overall Results (4)
Only utilizes the graph signal
Only utilizes the graph structure
Greater depth incurs greater cost
Mean accuracy on large-scale graphs (fully supervised). Best results are highlighted in bold.
OOM denotes out of memory
Computational Cost
GA-GWNN Time Complexity: O(L|N|d_in d_out + L |E| d_out)
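Plugging illustrative sizes into the bound makes the two terms concrete; the numbers below are assumptions (roughly Cora-scale: 2708 nodes, 10556 directed edges), not figures from the thesis:

```python
# Rough operation counts for O(L|N|d_in d_out + L|E|d_out)
L, N, E, d_in, d_out = 2, 2708, 10556, 1433, 64  # illustrative sizes
transform = L * N * d_in * d_out  # per-layer feature transformation term
propagate = L * E * d_out         # per-layer message-propagation term
print(transform, propagate)       # transformation dominates on small graphs
```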
Ablation Studies
Performance of GA-GWNN with deeper layers in the supervised task
Semi-supervised performance with a deeper architecture
Feature map visualization
Simple and Effective GWNN (SEA-GWNN)
GA-GWNN
SEA-GWNN
SEA-GWNN(2)
A1
A2
A3
Multiscale Information
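One common realization of multiscale information, and a plausible (assumed) reading of the A1/A2/A3 blocks above, is successive powers of the adjacency matrix, each capturing a wider hop neighbourhood:

```python
import numpy as np

# Path graph 0-1-2; the (i, j) entry of A^k counts k-step walks,
# so A, A^2, A^3 expose structure at increasing "scales"
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
A1, A2, A3 = A, A @ A, A @ A @ A
print(A2)  # two-hop connectivity: nodes 0 and 2 become reachable
```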
Overall Results
Mean accuracy on fully supervised node classification. Best results are highlighted in bold
Mean accuracy on semi-supervised node classification. Best results are highlighted in bold
Overall Results(2)
Mean accuracy on large-scale graph node classification. Best results are highlighted in bold
Overall Results(3)
Graph-level classification and prediction
Model Analysis
Computational Cost
Model performance on deeper architecture
O(|N|d_in d_out + L|E|d_out)
Conclusion
References
[1] Mallat, "A Wavelet Tour of Signal Processing," Elsevier, 1999.
[2] Xu et al., "Graph wavelet neural network," ICLR, 2019.
[3] Zheng et al., "How framelets enhance graph neural networks," ICML, 2021.
[4] Zheng et al., "MathNet: Haar-like wavelet multiresolution analysis for graph representation learning," Knowledge-Based Systems, 2023.
[5] Li et al., "Fast Haar transforms for graph neural networks," Neural Networks, 2020.
[6] Xu et al., "Graph neural networks with lifting-based adaptive graph wavelets," IEEE Transactions on Signal and Information Processing over Networks, 2022.
[7] Shen et al., "Optimized distributed 2D transforms for irregularly sampled sensor network grids using wavelet lifting," IEEE ICASSP, 2008.
[8] Narang et al., "Lifting based wavelet transforms on graphs," APSIPA, 2009.
[9] Hamilton et al., "Inductive representation learning on large graphs," NeurIPS, 2017.
[10] Velickovic et al., "Graph attention networks," ICLR, 2018.
[11] Kipf et al., "Semi-supervised classification with graph convolutional networks," ICLR, 2017.
[12] Gasteiger et al., "Predict then propagate: Graph neural networks meet personalized PageRank," ICLR, 2019.
[13] Perozzi et al., "DeepWalk: Online learning of social representations," KDD, 2014.
[14] Pei et al., "Geom-GCN: Geometric graph convolutional networks," ICLR, 2020.
[15] Abu-El-Haija et al., "MixHop: Higher-order graph convolutional architectures via sparsified neighborhood mixing," ICML, 2019.
[16] Zhu et al., "Beyond homophily in graph neural networks: Current limitations and effective designs," NeurIPS, 2020.
[17] Zhu et al., "Graph neural networks with heterophily," AAAI, 2021.
[18] Chien et al., "Adaptive universal generalized PageRank graph neural network," ICLR, 2021.
[19] Wang et al., "AM-GCN: Adaptive multi-channel graph convolutional networks," KDD, 2020.
[20] Hu et al., "Open Graph Benchmark: Datasets for machine learning on graphs," NeurIPS, 2020.
[21] Grover et al., "node2vec: Scalable feature learning for networks," KDD, 2016.
[22] Deng et al., "GraphZoom: A multi-level spectral approach for accurate and scalable graph embedding," ICLR, 2020.
[23] Li et al., "DeeperGCN: All you need to train deeper GCNs," arXiv preprint, 2020.
[24] Frasca et al., "SIGN: Scalable inception graph neural networks," ICML Workshop, 2020.
[25] Sen et al., "Collective classification in network data," AI Magazine, 2008.