1 of 27

The Logic of Graph Neural Networks

Paper by: Martin Grohe

RWTH Aachen University, Germany

Presented by: Hritam Basak

Some slides borrowed from the author’s original presentation

2 of 27

Motivation: Neuro-symbolic Integration

Part I: Logic for GNNs

We characterise the logical expressiveness of graph neural networks (GNNs)

Part II: GNNs for Logic

We propose a generic architecture for compiling GNN-based solvers for constraint satisfaction problems

3 of 27

Graph Neural Networks

4 of 27

Graph Neural Networks

  • Graph Neural Networks (GNNs) are deep learning architectures for machine learning problems on graphs.
  • They can be viewed as a generalization of convolutional neural networks from a rigid grid structure to more flexible structured data.
  • GNNs have a wide range of applications, for example in computational biology, chemical engineering, and physics.
  • By now, there is a large variety of GNN architectures. We consider the core architecture, known as message-passing graph neural networks or aggregate-combine graph neural networks.

5–13 of 27

Graph Neural Networks

[figure-only slides]

14 of 27

Computation of GNNs

A GNN N with d layers maps a graph G to a sequence of functions $\xi_0, \xi_1, \ldots, \xi_d$, where $\xi_i : V(G) \to \mathbb{R}^{d_i}$ assigns a state vector to every node.

Initialization: $\xi_0(v)$ encodes the node label of v.

Aggregation (layer i + 1): every node combines its own state with an aggregate of its neighbours' states,

$\xi_{i+1}(v) := \operatorname{comb}_{i+1}\big(\xi_i(v),\ \operatorname{agg}_{i+1}(\{\!\{\xi_i(w) \mid w \in N(v)\}\!\})\big)$

  • Sufficient to use summation as aggregation (Xu et al. 2019): $\operatorname{agg}_{i+1}(\{\!\{\xi_i(w) \mid w \in N(v)\}\!\}) = \sum_{w \in N(v)} \xi_i(w)$

  • The combination function $\operatorname{comb}_{i+1}$ is computed by a feedforward neural network or some more complex neural network architecture
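To make the aggregate-combine update above concrete, here is a minimal NumPy sketch of a single layer with sum aggregation. The two-layer perceptron used as the combination function, and all names (gnn_layer, W1, W2), are illustrative assumptions rather than the architecture used in the talk.

```python
# Minimal sketch of one aggregate-combine (message-passing) GNN layer
# with sum aggregation; weights are random for illustration only.
import numpy as np

def gnn_layer(adj, states, W1, b1, W2, b2):
    """One layer: xi_{i+1}(v) = comb(xi_i(v), sum of xi_i(w) over neighbours w).

    adj:    (n, n) 0/1 adjacency matrix of the graph G
    states: (n, d_in) current node states xi_i
    comb is an illustrative two-layer perceptron applied to [own state, aggregate].
    """
    agg = adj @ states                               # sum aggregation over neighbours
    combined = np.concatenate([states, agg], axis=1) # pair (xi_i(v), aggregate)
    hidden = np.maximum(combined @ W1 + b1, 0)       # ReLU
    return hidden @ W2 + b2                          # new states xi_{i+1}

# Example: a triangle whose 1-dimensional initial states encode the node labels.
adj = np.array([[0, 1, 1],
                [1, 0, 1],
                [1, 1, 0]], dtype=float)
xi0 = np.array([[1.0], [0.0], [0.0]])                # xi_0: one distinguished node
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)
W2, b2 = rng.normal(size=(4, 2)), np.zeros(2)
xi1 = gnn_layer(adj, xi0, W1, b1, W2, b2)            # states after one layer
```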

15 of 27

Functions computed by GNNs

[formula-only slide; content not recoverable]

16 of 27

Recurrent GNNs

 

Recurrent GNNs: Instead of using d distinct layers, we can take a single layer and apply it repeatedly. That is, we have a single aggregation function agg and a single combination function comb and let

$\xi_{i+1}(v) := \operatorname{comb}\big(\xi_i(v),\ \operatorname{agg}(\{\!\{\xi_i(w) \mid w \in N(v)\}\!\})\big)$ for all $i \geq 0$.
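A hedged sketch of this recurrent scheme in NumPy: a single set of parameters is applied at every round. The tanh combination, the parameter names W and b, and the choice of num_steps are all illustrative assumptions, not details from the talk.

```python
# Recurrent GNN sketch: a single aggregate-combine layer applied repeatedly.
# The state dimension stays fixed so the output can be fed back in.
import numpy as np

def recurrent_gnn(adj, states, W, b, num_steps):
    """Apply the same layer num_steps times: xi_{i+1}(v) = comb(xi_i(v), sum over neighbours)."""
    for _ in range(num_steps):
        agg = adj @ states                               # sum aggregation
        combined = np.concatenate([states, agg], axis=1) # (xi_i(v), aggregate)
        states = np.tanh(combined @ W + b)               # shared parameters every round
    return states

# Example: 4-cycle, 2-dimensional states, 5 message-passing rounds.
adj = np.array([[0, 1, 0, 1],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [1, 0, 1, 0]], dtype=float)
rng = np.random.default_rng(1)
xi0 = rng.normal(size=(4, 2))
W, b = rng.normal(size=(4, 2)), np.zeros(2)
out = recurrent_gnn(adj, xi0, W, b, num_steps=5)
```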

17 of 27

Part I

The Expressiveness of Graph Neural Networks

The Weisfeiler-Leman Algorithm

18 of 27

A simple combinatorial algorithm

The color refinement algorithm (also known as 1-dimensional Weisfeiler-Leman, or 1-WL) iteratively computes a coloring of the vertices of a graph G.

Initialization: All vertices get the same color.

Refinement Step: Two nodes v, w get different colors if there is some color c such that v and w have different numbers of neighbors of color c.

The refinement step is repeated until the coloring stays stable.

The stable coloring of a given graph with n nodes and m edges can be computed in time O((n + m) log n).
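The refinement step can be written down compactly. Below is an illustrative Python sketch that encodes each vertex's pair (old color, multiset of neighbour colors) and relabels until the partition stops splitting; this naive version does not achieve the O((n + m) log n) bound, which requires a more careful partition-refinement data structure.

```python
# Color refinement (1-WL) sketch: refine until the vertex coloring is stable.
from collections import Counter

def color_refinement(adj):
    """adj: dict mapping each vertex to a list of its neighbors.
    Returns the stable coloring as a dict vertex -> color id."""
    colors = {v: 0 for v in adj}                 # initialization: one color for all
    while True:
        # New color = (old color, multiset of neighbor colors), canonically encoded.
        signatures = {
            v: (colors[v], tuple(sorted(Counter(colors[w] for w in adj[v]).items())))
            for v in adj
        }
        relabel = {sig: i for i, sig in enumerate(sorted(set(signatures.values())))}
        new_colors = {v: relabel[signatures[v]] for v in adj}
        if len(set(new_colors.values())) == len(set(colors.values())):
            return new_colors                    # no class was split: coloring is stable
        colors = new_colors

def histogram(coloring):
    """Color histogram used to compare two graphs."""
    return Counter(coloring.values())
```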

19 of 27

Color refinement (1-WL) distinguishes two graphs G, H if their color histograms differ, that is, if some color appears a different number of times in G and in H.

Thus 1-WL can be used as an incomplete isomorphism test.

    • works on almost all graphs (Babai, Erdős, Selkow 1980)

    • fails on some very simple graphs: for example, it cannot distinguish the 6-cycle from the disjoint union of two triangles, since both are 2-regular on 6 vertices (checked in the sketch below).
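Reusing color_refinement and histogram from the sketch above, a quick check of this failure: refine the disjoint union of the two graphs (so the color ids are comparable) and compare the per-graph histograms. The vertex numbering is just an illustrative choice.

```python
# 1-WL as an (incomplete) isomorphism test: refine the disjoint union of C_6 and
# two triangles, then compare the color histograms of the two parts.
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}   # vertices 0..5
triangles = {6: [7, 8], 7: [8, 6], 8: [6, 7],                # vertices 6..11
             9: [10, 11], 10: [11, 9], 11: [9, 10]}
union = {**cycle6, **triangles}

coloring = color_refinement(union)
hist_G = histogram({v: c for v, c in coloring.items() if v < 6})
hist_H = histogram({v: c for v, c in coloring.items() if v >= 6})
print(hist_G == hist_H)   # True: color refinement cannot tell the graphs apart
```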

20 of 27

Higher-Dimensional Weisfeiler-Leman

The k-dimensional Weisfeiler-Leman algorithm (k-WL) iteratively colours k-tuples of nodes (Weisfeiler and Leman 1968, Babai 1980)

Running time: O(n^{k+1} log n)

k-WL is much more powerful than 1-WL, but still not a complete isomorphism test.

Theorem (Cai, Fürer, Immerman 1992)

For every k there are non-isomorphic graphs G_k, H_k of size O(k) that are not distinguished by k-WL.

22 of 27

Part II

GNNs for Constraint Satisfaction Problems

23 of 27

Constraint Satisfaction Problem

Constraint satisfaction problems (CSPs) provide a general framework for combinatorial search problems:

    • The task is to assign values to variables subject to constraints

    • A specific CSP is given by its constraint language specifying which kinds of constraints can be used

    • In the maximization version (MAX-CSP), the objective is to satisfy as many constraints as possible (see the sketch below)
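To fix the terminology, here is a small, purely illustrative Python sketch of a binary Max-CSP instance: a finite domain, a constraint language consisting of the inequality relation (so this particular Max-CSP is Max-Cut), and the number of satisfied constraints as the objective. The representation is an assumption made for this sketch, not the encoding used later by RUN-CSP.

```python
# Illustrative binary Max-CSP: variables, a finite domain, and binary constraints.
# A constraint (u, v, R) is satisfied by an assignment a if (a[u], a[v]) is in R.
from itertools import product

domain = [0, 1]                                   # e.g. two colors
NEQ = {(x, y) for x, y in product(domain, repeat=2) if x != y}

# Constraint language {NEQ} over domain {0, 1}: Max-CSP(NEQ) is Max-Cut.
variables = ["a", "b", "c", "d"]
constraints = [("a", "b", NEQ), ("b", "c", NEQ), ("c", "d", NEQ),
               ("d", "a", NEQ), ("a", "c", NEQ)]

def satisfied(assignment, constraints):
    """Number of constraints satisfied by the assignment (the MAX-CSP objective)."""
    return sum((assignment[u], assignment[v]) in R for u, v, R in constraints)

# Brute-force optimum on this tiny instance, just to show the objective.
best = max(product(domain, repeat=len(variables)),
           key=lambda vals: satisfied(dict(zip(variables, vals)), constraints))
print(dict(zip(variables, best)),
      satisfied(dict(zip(variables, best)), constraints), "of", len(constraints))
```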

24 of 27

RUN-CSP

RUN-CSP (Toenshoff, Ritzert, Wolf, Grohe 2021)

(Recurrent Unsupervised NN for Max-CSPs)

  • MPNN architecture for solving binary Max-CSPs

  • Applicable to all binary CSPs: the constraint language is compiled into the loss function

  • Training is unsupervised

  • Scales from small training instances to inference on large instances

25 of 27

Loss Function
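The loss-function formula from this slide is not reproduced here. As a stand-in, the sketch below shows the kind of unsupervised loss such a network can minimize for the Max-Cut case (constraint language consisting of the binary inequality relation): each variable gets a soft value in (0, 1), and each constraint contributes the negative log-probability of being satisfied. This construction is only illustrative and is not the exact RUN-CSP loss.

```python
# Illustrative unsupervised loss for Max-Cut viewed as a binary Max-CSP with the
# inequality constraint: soft assignments p in (0, 1), one value per variable.
# A NEQ constraint (u, v) is softly violated with probability
# p[u]*p[v] + (1-p[u])*(1-p[v]), i.e. when both endpoints agree.
# This is an illustrative construction, not the loss used in the RUN-CSP paper.
import numpy as np

def soft_maxcut_loss(p, edges, eps=1e-9):
    agree = np.array([p[u] * p[v] + (1 - p[u]) * (1 - p[v]) for u, v in edges])
    return -np.log(1 - agree + eps).sum()   # -log prob. that each constraint is satisfied

# Example: soft assignments produced by some network for a 4-cycle.
p = np.array([0.9, 0.1, 0.8, 0.2])          # nodes 0..3
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
print(soft_maxcut_loss(p, edges))           # low loss: neighbors mostly disagree
```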

26 of 27

Concluding Remarks

  • GNNs are a very flexible learning architecture that allows us to adapt them to logical formalisms such as CSPs

  • We have a good understanding of their expressiveness, yet many interesting questions remain open

  • Expressiveness results only tell half the story, because they ignore learning. However, most of the results have good experimental support.

27 of 27

References

  • Slides on GNNs by the author
  • YouTube presentation by the author
  • Grohe, M. The Logic of Graph Neural Networks. In: Proceedings of the 36th Annual ACM/IEEE Symposium on Logic in Computer Science (LICS 2021), pp. 1-17. IEEE, 2021.
  • Toenshoff, J., Ritzert, M., Wolf, H., Grohe, M. Graph Neural Networks for Maximum Constraint Satisfaction. Frontiers in Artificial Intelligence, 3:580607, 2021.