1 of 26


Empowering Graph Wavelet Convolution for Node

Classification: A Novel Approach with Local Lifting

Scheme

RME 5401: Thesis

Swakshar Deb

Exam Roll: 120706

Registration No. 2016-614-703

Dr. Sejuti Rahman

Associate Professor

Robotics and Mechatronics Engineering

University of Dhaka

Supervisor

M.Sc Defence Thesis Presentation


2 of 26

Overview

Part 1
    • Motivation
    • Homophilic and heterophilic graphs
    • Problem statement
    • Related works
    • Proposed Generalized Adaptive Graph Wavelet Neural Network (GA-GWNN)
    • Limitations and our contributions
    • Experimental analysis

Part 2
    • Simple and Effective Graph Wavelet Neural Network (SEA-GWNN)
    • Experimental analysis
    • Conclusion

3 of 26

Motivation for Node Classification


Fraudster Detection

Friend Suggestion

Drug Discovery

4 of 26

Homophilic and Heterophilic Graphs


Homophilic Graph

Heterophilic Graph

Because neighbouring nodes tend to share labels, homophilic graphs are well suited to low-pass filtering.

Because neighbouring nodes are dissimilar, heterophilic graphs require high-pass filters to capture high-frequency patterns.
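This distinction can be quantified with the edge homophily ratio: the fraction of edges whose endpoints share a label (near 1 for homophilic graphs, near 0 for heterophilic ones). A minimal sketch; the function and toy graph are illustrative, not from the thesis:

```python
# Hypothetical example: measuring edge homophily of a labelled graph.
def edge_homophily(edges, labels):
    """edges: list of (u, v) pairs; labels: dict mapping node -> class label."""
    same = sum(1 for u, v in edges if labels[u] == labels[v])
    return same / len(edges)

# Toy graph: two of four edges connect same-label nodes.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
labels = {0: "A", 1: "A", 2: "A", 3: "B"}
print(edge_homophily(edges, labels))  # -> 0.5
```

A ratio this low would suggest that low-pass filtering alone discards useful neighbourhood information.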

5 of 26


Problem Statement

[Figure: the input graph (G, X) is fed to wavelet-based algorithms [2,3,4,5,6], and the classification error is backpropagated to adapt the learned filters [1].]

[1] Mallat, "A Wavelet Tour of Signal Processing," Elsevier, 1999.

[2] Xu et al., "Graph wavelet neural network," ICLR, 2019.

[3] Zheng et al., "How framelets enhance graph neural networks," ICML, 2021.

[4] Zheng et al., "MathNet: Haar-like wavelet multiresolution analysis for graph representation learning," Knowledge-Based Systems, 2023.

[5] Li et al., "Fast Haar transforms for graph neural networks," Neural Networks, 2020.

[6] Xu et al., "Graph neural networks with lifting-based adaptive graph wavelets," IEEE Transactions on Signal and Information Processing over Networks, 2022.

6 of 26


Related Work

[Figure: the input graph (G, X) is fed to wavelet-based algorithms [2,3,4,5,6,7,8] for node classification.]

Predefined wavelet-based approaches [2,3,4,5]:
    • Utilize fixed wavelet filters
    • Require domain-specific knowledge

Adaptive wavelet-based approaches [6,7,8]:
    • Lead to undesirable wavelet filters
    • Require a mandatory data preprocessing step

[7] Shen et al., "Optimized distributed 2D transforms for irregularly sampled sensor network grids using wavelet lifting," IEEE ICASSP, 2008.

[8] Narang et al., "Lifting based wavelet transforms on graphs," APSIPA, 2009.

7 of 26

Related Works: Adaptive Wavelet based Approach


[Figure: vanilla lifting scheme [6,7,8] — the input graph (G, X) is randomly split into a modified graph, producing an undesirable wavelet that is fed to the node classifier.]

Undesirable wavelet: a filter produced over a disrupted graph structure.
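For intuition, the classical lifting scheme that [6,7,8] adapt to graphs proceeds by splitting a signal into even/odd parts, predicting one from the other, and updating. A minimal Haar-style sketch on a 1-D signal (all names here are illustrative, not from the cited papers):

```python
# Sketch of one level of a Haar-style lifting scheme on a 1-D signal.
def lift(signal):
    even, odd = signal[0::2], signal[1::2]  # split step
    # Predict step: approximate each odd sample by its even neighbour;
    # the residual is the high-frequency (detail) coefficient.
    detail = [o - e for e, o in zip(even, odd)]
    # Update step: smooth the even samples with the details to obtain the
    # low-frequency (approximation) coefficients.
    approx = [e + d / 2 for e, d in zip(even, detail)]
    return approx, detail

approx, detail = lift([2, 4, 6, 8])
print(approx, detail)  # [3.0, 7.0] [2, 2]
```

On graphs, the random even/odd split is exactly the step that disrupts the original edge structure and yields undesirable filters.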

8 of 26

Proposed Generalized Adaptive Graph Wavelet Neural Network (GA-GWNN)

[Figure: tree lifting scheme — the adjacency matrix of the input graph (G, X) is converted into a computational tree; the graph structure is preserved and the representation contains complete information, producing a desirable wavelet that is fed to the node classifier.]

Desirable wavelet: a filter learned over the original graph structure.
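A computational tree rooted at a node can be built, for example, by a breadth-first traversal that keeps only original graph edges; the sketch below is a hypothetical illustration of this idea, not the thesis implementation:

```python
# Hypothetical sketch: a breadth-first computational tree rooted at a node.
# Unlike a random even/odd split, every tree edge is an original graph edge,
# so the graph structure is preserved.
from collections import deque

def bfs_tree(adj, root):
    parent = {root: None}
    queue = deque([root])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in parent:
                parent[v] = u
                queue.append(v)
    return parent  # child -> parent map defines the tree edges

adj = {0: [1, 2], 1: [0, 3], 2: [0], 3: [1]}
print(bfs_tree(adj, 0))  # {0: None, 1: 0, 2: 0, 3: 1}
```

Lifting along parent-child pairs of such a tree then operates entirely over edges that exist in the input graph.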

9 of 26

Our Contributions

  • Limitations of existing wavelet-based methods

    • Predefined wavelet filter based methods consider only the homophily assumption.

    • Predefined wavelet filter based methods require domain-specific knowledge.

    • Adaptive wavelet filter based methods produce undesirable filters.

  • Contributions

    • Our proposed GA-GWNN generalizes to both homophilic and heterophilic graphs.

    • Since GA-GWNN is an adaptive wavelet-based approach, it does not require domain-specific knowledge.

    • GA-GWNN is able to produce desirable wavelet filters.

    • We also propose a simpler and more effective version, SEA-GWNN:
      • No need for an inverse transform.
      • Attention detachment.
      • Multiscale information.


10 of 26

Message Propagation of GA-GWNN


[Figure: message propagation on a computational tree T_i — soft thresholding separates high-frequency (red, high-pass) and low-frequency (yellow, low-pass) components; see Theorem 4.1.]
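The soft-thresholding operator shrinks small detail coefficients to zero while preserving large responses. A minimal sketch; the threshold value below is arbitrary, not taken from the thesis:

```python
# Illustrative soft-thresholding operator, commonly applied to the
# high-frequency (detail) wavelet coefficients to suppress noise.
def soft_threshold(x, t):
    if x > t:
        return x - t
    if x < -t:
        return x + t
    return 0.0

coeffs = [-2.0, -0.25, 0.25, 1.5]
print([soft_threshold(c, 0.5) for c in coeffs])  # [-1.5, 0.0, 0.0, 1.0]
```

Coefficients with magnitude below the threshold (here 0.5) are zeroed; the rest are shrunk toward zero by the threshold amount.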

11 of 26

Fusion Module/Operation

  • Fuses both low- and high-frequency information to enhance the representation.

[Figure: input graph → wavelet learned through lifting → fusion module.]
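One simple way to realize such a fusion is a gated combination of the two frequency bands. The sketch below is an assumption for illustration; the thesis module may use a different parameterization, and the gate weights here are arbitrary:

```python
# Minimal sketch of a fusion operation: combine low- and high-frequency node
# features with scalar gates (in a trained model the gates would be learned).
def fuse(low, high, w_low=0.5, w_high=0.5):
    return [w_low * l + w_high * h for l, h in zip(low, high)]

low = [1.0, 2.0]    # smoothed (low-pass) features of one node
high = [0.5, -1.0]  # detail (high-pass) features of the same node
print(fuse(low, high))  # [0.75, 0.5]
```

On homophilic graphs a trained gate would be expected to weight the low-pass band more heavily, and vice versa on heterophilic graphs.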

12 of 26

Dataset and Evaluation Metrics

  • Homophilic dataset
    • Citation graph[24] (Cora, Citeseer, PubMed)
  • Heterophilic dataset
    • Webpage graph[13] (Cornell, Texas, Wisconsin)
    • Film industry graph[13] (Film)
  • Large scale graphs
    • Ogbn-Arxiv[19]
    • Ogbn-Products[19]
  • Evaluation Metrics
    • Accuracy
      • Percentage of correct predictions
    • Precision
    • Recall
    • F1-Score
      • Harmonic mean of precision and recall
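For a binary case, these metrics can be sketched as follows (multi-class results would presumably be averaged per class; this helper is illustrative, not the thesis evaluation code):

```python
# Hedged sketch of the reported metrics for a binary classification case.
def precision_recall_f1(y_true, y_pred, positive=1):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return precision, recall, f1

y_true = [1, 1, 0, 0, 1]
y_pred = [1, 0, 0, 1, 1]
print(precision_recall_f1(y_true, y_pred))  # precision = recall = f1 = 2/3
```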


Dataset Statistics

13 of 26

Overall Results


[Table: mean accuracy on semi-supervised node classification; best results highlighted in bold. Columns group predefined wavelet filter methods, adaptive wavelet methods with undesirable filters, and our adaptive wavelet method with desirable filters.]

[9] Hamilton et al., "Inductive representation learning on large graphs," NeurIPS, 2017.

[10] Veličković et al., "Graph attention networks," ICLR, 2018.

14 of 26

Overall Results (2)


Mean precision, recall, and F1-score on semi-supervised node classification

15 of 26

Overall Results (3)


[Table: mean accuracy on fully supervised node classification; best results highlighted in bold. Columns group homophilic graph neural networks, heterophilic graph neural networks, and adaptive wavelet filter methods.]

16 of 26

Overall Results (4)


[Table: mean accuracy on large-scale graphs (fully supervised); best results highlighted in bold. OOM denotes out of memory. Baselines that utilize only the graph signal or only the graph structure are grouped; the deeper the model, the higher the cost.]

17 of 26

Computational Cost

17

GA-GWNN Time Complexity: O(L|N| d_in d_out + L|E| d_out), where L is the number of layers, |N| the number of nodes, |E| the number of edges, and d_in, d_out the input and output feature dimensions.
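The two terms of the stated complexity correspond to per-node feature transforms and per-edge message passing. A back-of-the-envelope operation count, plugged in with Cora-sized statistics (2708 nodes, 5429 edges, 1433 input features; the output width of 64 is an assumed hyperparameter):

```python
# Rough operation count from the stated complexity
# O(L*|N|*d_in*d_out + L*|E|*d_out): feature transforms + message passing.
def gagwnn_ops(L, n_nodes, n_edges, d_in, d_out):
    return L * n_nodes * d_in * d_out + L * n_edges * d_out

# Cora-sized graph, 2 layers, assumed hidden width 64.
print(gagwnn_ops(L=2, n_nodes=2708, n_edges=5429, d_in=1433, d_out=64))
```

The count is dominated by the dense feature transform, since |E| d_out is far smaller than |N| d_in d_out here.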

18 of 26

Ablation Studies


Performance of GA-GWNN with deeper layers in the supervised task

Semi-supervised performance with deeper architectures

Feature map visualization

19 of 26

Simple and Effective GWNN (SEA-GWNN)

19

GA-GWNN

SEA-GWNN

20 of 26

SEA-GWNN(2)


[Figure: adjacency powers A^1, A^2, A^3 capture multiscale information.]
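Powers of the adjacency matrix expose neighbourhoods at growing scales (A^1 counts one-hop walks, A^2 two-hop walks, and so on), which a multiscale model can combine. A tiny illustration with a pure-Python matrix multiply (illustrative only):

```python
# Illustrative multiscale view: (A^k)[i][j] counts walks of length k
# between nodes i and j.
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

A = [[0, 1, 0],
     [1, 0, 1],
     [0, 1, 0]]  # path graph 0-1-2
A2 = matmul(A, A)
A3 = matmul(A2, A)
print(A2[0][2])  # number of 2-hop walks from node 0 to node 2 -> 1
```

Nodes 0 and 2 are invisible to each other at scale A^1 but connected at scale A^2, which is the kind of information a single-hop filter misses.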

21 of 26

Overall Results


Mean accuracy on fully supervised node classification. Best results are highlighted in bold.

Mean accuracy on semi-supervised node classification. Best results are highlighted in bold.

22 of 26

Overall Results(2)


Mean accuracy on node classification over large-scale graphs. Best results are highlighted in bold.

23 of 26

Overall Results(3)


Graph-level classification and prediction

[13] Pei et al., "Geom-GCN: Geometric graph convolutional networks," ICLR, 2020.

24 of 26

Model Analysis


Computational Cost

Model performance on deeper architecture

SEA-GWNN Time Complexity: O(|N| d_in d_out + L|E| d_out)

25 of 26

Conclusion

  • Proposed a novel class of algorithms, GA-GWNN, that produces desirable wavelet filters.
  • Proposed a novel lifting scheme, the tree lifting scheme, that preserves the original graph structure.
  • Our proposed GA-GWNN can learn wavelet filters on arbitrary graphs.
  • Experimental results demonstrate the superiority of our proposed algorithms.
  • Further proposed a simpler and more scalable version of GA-GWNN, namely SEA-GWNN.


26 of 26


References

[1] Mallat, "A Wavelet Tour of Signal Processing," Elsevier, 1999.

[2] Xu et al., "Graph wavelet neural network," ICLR, 2019.

[3] Zheng et al., "How framelets enhance graph neural networks," ICML, 2021.

[4] Zheng et al., "MathNet: Haar-like wavelet multiresolution analysis for graph representation learning," Knowledge-Based Systems, 2023.

[5] Li et al., "Fast Haar transforms for graph neural networks," Neural Networks, 2020.

[6] Xu et al., "Graph neural networks with lifting-based adaptive graph wavelets," IEEE Transactions on Signal and Information Processing over Networks, 2022.

[7] Shen et al., "Optimized distributed 2D transforms for irregularly sampled sensor network grids using wavelet lifting," IEEE ICASSP, 2008.

[8] Narang et al., "Lifting based wavelet transforms on graphs," APSIPA, 2009.

[9] Hamilton et al., "Inductive representation learning on large graphs," NeurIPS, 2017.

[10] Veličković et al., "Graph attention networks," ICLR, 2018.

[11] Kipf et al., "Semi-supervised classification with graph convolutional networks," ICLR, 2017.

[12] Perozzi et al., "DeepWalk: Online learning of social representations," KDD, 2014.

[13] Pei et al., "Geom-GCN: Geometric graph convolutional networks," ICLR, 2020.

[14] Abu-El-Haija et al., "MixHop: Higher-order graph convolutional architectures via sparsified neighborhood mixing," ICML, 2019.

[15] Zhu et al., "Beyond homophily in graph neural networks: Current limitations and effective designs," NeurIPS, 2020.

[16] Zhu et al., "Graph neural networks with heterophily," AAAI, 2021.

[17] Chien et al., "Adaptive universal generalized PageRank graph neural network," ICLR, 2021.

[18] Wang et al., "AM-GCN: Adaptive multi-channel graph convolutional networks," KDD, 2020.

[19] Hu et al., "Open Graph Benchmark: Datasets for machine learning on graphs," NeurIPS, 2020.

[20] Grover et al., "node2vec: Scalable feature learning for networks," KDD, 2016.

[21] Deng et al., "GraphZoom: A multi-level spectral approach for accurate and scalable graph embedding," ICLR, 2020.

[22] Li et al., "DeeperGCN: All you need to train deeper GCNs," arXiv, 2020.

[23] Frasca et al., "SIGN: Scalable inception graph neural networks," ICML Workshop on Graph Representation Learning, 2020.

[24] Sen et al., "Collective classification in network data," AI Magazine, 2008.

[25] Gasteiger et al., "Predict then propagate: Graph neural networks meet personalized PageRank," ICLR, 2019.