1 of 35

Session-based Recommendation with Graph Neural Networks

Zongyue Qin

Oct 4 2022

2 of 35

Background

  • When shopping online, people usually click several items before deciding which ones to purchase.
  • Many recommender systems rely on user profiles to make recommendations, but such profiles may be unavailable in practice.
  • Therefore, recommending items based solely on the previously visited items in the current session is very useful for online shopping platforms.

3 of 35

Problem Definition

  • Given an anonymous session, i.e., an ordered sequence of clicked items, predict the item the user will click next.
  • The model scores every item in the item set and recommends the top-K items, without relying on any user profile.
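
A compact formal statement in LaTeX, paraphrasing the setup shared by all three cited papers (the notation is mine, not copied from the slides):

    \text{Item set: } V=\{v_1,\dots,v_m\}, \qquad
    \text{anonymous session: } s=[v_{s,1},\dots,v_{s,n}],\ v_{s,i}\in V .
    \text{Goal: score every item, } \hat{\mathbf{y}}\in\mathbb{R}^{m},\quad
    \hat{y}_j \approx P(v_{s,n+1}=v_j \mid s),
    \text{ and recommend the top-}K \text{ items by } \hat{\mathbf{y}} .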

4 of 35

Non-GNN Solutions

  • Conventional Methods
    • Markov Chains: past clicks are combined independently; this independence assumption is too strong in practice.
    • K-NN: strong baselines, with performance competitive with state-of-the-art models.
  • RNN-based Methods (GRU4REC, STAMP, etc.)
    • Tend to overlook the complex transitions among distant items.
    • The unidirectional (strictly sequential) assumption is too strong; users can go back and forth within a session.

5 of 35

GNN Solutions

  • SR-GNN (Session-based Recommendation with Graph Neural Networks, AAAI-19)
  • GCE-GNN
  • DHCN

6 of 35

SR-GNN: Session Graphs

  • Each session sequence is modeled as a directed session graph: nodes are the unique items in the session, and an edge connects each pair of consecutively clicked items.
  • The graph structure is summarized by outgoing and incoming adjacency matrices, normalized by node degree (see the sketch below).
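
A minimal NumPy sketch of this construction (illustrative names such as build_session_graph are mine, not from the SR-GNN code):

    import numpy as np

    def build_session_graph(session):
        """Build normalized outgoing/incoming adjacency matrices for one session."""
        items = list(dict.fromkeys(session))          # unique items, click order kept
        idx = {v: i for i, v in enumerate(items)}
        n = len(items)
        a_out = np.zeros((n, n))
        for u, v in zip(session[:-1], session[1:]):   # one edge per consecutive click
            a_out[idx[u], idx[v]] = 1.0
        out_deg = a_out.sum(axis=1, keepdims=True)    # normalize by out-degree
        a_out_n = np.divide(a_out, out_deg, out=np.zeros_like(a_out), where=out_deg > 0)
        a_in = a_out.T                                # incoming edges, by in-degree
        in_deg = a_in.sum(axis=1, keepdims=True)
        a_in_n = np.divide(a_in, in_deg, out=np.zeros_like(a_in), where=in_deg > 0)
        return items, a_out_n, a_in_n

    items, A_out, A_in = build_session_graph([5, 9, 5, 3])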

7 of 35

SR-GNN: Learning Node Embeddings on the Graph

  • As each node represents an item, the input node features are the corresponding item embeddings.
  • The node embeddings are then updated with a Gated Graph Neural Network (GGNN): messages are aggregated from neighboring nodes through the normalized adjacency matrices and fed into a GRU-style gated update (a sketch follows below).
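
A rough PyTorch sketch of one GGNN propagation step in this spirit; shapes and names are assumptions for illustration, not the authors' implementation:

    import torch
    import torch.nn as nn

    class GGNNStep(nn.Module):
        """One gated propagation step over a session graph (sketch)."""
        def __init__(self, dim):
            super().__init__()
            self.w_in = nn.Linear(dim, dim)    # transform for incoming messages
            self.w_out = nn.Linear(dim, dim)   # transform for outgoing messages
            self.gru = nn.GRUCell(2 * dim, dim)

        def forward(self, h, a_in, a_out):
            # h: (n, dim) node embeddings; a_in, a_out: (n, n) normalized adjacencies
            m_in = a_in @ self.w_in(h)         # aggregate along incoming edges
            m_out = a_out @ self.w_out(h)      # aggregate along outgoing edges
            return self.gru(torch.cat([m_in, m_out], dim=-1), h)

    step = GGNNStep(dim=16)
    h = torch.randn(4, 16)                     # 4 unique items in the session
    a = torch.rand(4, 4)
    h_next = step(h, a, a.t())                 # one propagation step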

8 of 35

SR-GNN: Generate Session Embeddings

  • The local embedding is the embedding of the last clicked item.
  • The global embedding is a soft-attention weighted sum of all item embeddings in the session, with attention conditioned on the last item.
  • The two are concatenated and linearly transformed into a hybrid session embedding (see the sketch below).
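
A hedged PyTorch sketch of this step (parameter names are illustrative):

    import torch
    import torch.nn as nn

    class SessionEmbedding(nn.Module):
        """Hybrid session embedding: local (last item) + attention-pooled global."""
        def __init__(self, dim):
            super().__init__()
            self.w1 = nn.Linear(dim, dim, bias=False)
            self.w2 = nn.Linear(dim, dim)
            self.q = nn.Linear(dim, 1, bias=False)
            self.w3 = nn.Linear(2 * dim, dim, bias=False)

        def forward(self, h):
            # h: (n, dim) embeddings of the session's items, in click order
            s_local = h[-1]                                             # last clicked item
            alpha = self.q(torch.sigmoid(self.w1(h[-1]) + self.w2(h)))  # (n, 1) attention
            s_global = (alpha * h).sum(dim=0)                           # weighted sum
            return self.w3(torch.cat([s_local, s_global]))              # hybrid embedding

    s_h = SessionEmbedding(16)(torch.randn(5, 16))                      # (16,)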

9 of 35

SR-GNN: Making Recommendation and Training

  • Each candidate item is scored by the inner product of its embedding with the session embedding.
  • A softmax over the scores gives the predicted probability of each item being clicked next, and the model is trained with the cross-entropy loss (sketch below).
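
A minimal sketch of scoring and training under that description (names are illustrative):

    import torch
    import torch.nn.functional as F

    def recommend_and_loss(session_emb, item_table, target, k=20):
        """session_emb: (dim,); item_table: (num_items, dim); target: next-item id."""
        scores = item_table @ session_emb                # inner-product score per item
        loss = F.cross_entropy(scores.unsqueeze(0),      # softmax + cross-entropy
                               torch.tensor([target]))
        return loss, scores.topk(k).indices              # loss and top-k recommendations

    loss, rec = recommend_and_loss(torch.randn(16), torch.randn(100, 16), target=7)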

10 of 35

SR-GNN: Sum-up

  • One of the earliest papers to use GNNs for session-based recommendation.
  • Uses a GGNN and an attention mechanism to obtain session embeddings.
  • Uses the cross-entropy loss to optimize the model.

11 of 35

GNN Solutions

  • SR-GNN
  • GCE-GNN
  • DHCN

12 of 35

Global Context Enhanced Graph Neural Networks for Session-based Recommendation

  • SR-GNN relies heavily on the assumption that the last clicked item reflects the user's preference in the current session.
  • Previous studies consider only the current session and ignore useful item-transition patterns from other sessions.
  • Therefore, GCE-GNN explicitly exploits item transitions over all sessions.

13 of 35

Session Graph and Global Graph

  • GCE-GNN uses two types of graphs to capture item-transition information.
  • The session graph is basically the same as in SR-GNN, except that a self-loop is added for each item and there are four types of edges: in, out, in-out, and self.

14 of 35

Global Graph

  • The global graph connects each item to the items that co-occur with it within a small window (its ε-neighborhood) in any session.
  • Edges are weighted by co-occurrence frequency over all sessions, and only the most frequent neighbors of each item are kept (sketch below).
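
A rough sketch of such a construction, under my reading of the GCE-GNN paper; the window size eps, the top_n cut-off, and all names are illustrative assumptions:

    from collections import defaultdict

    def build_global_graph(sessions, eps=2, top_n=12):
        """Link items that co-occur within an eps-sized window in any session."""
        weight = defaultdict(float)
        for s in sessions:
            for i, u in enumerate(s):
                for v in s[max(0, i - eps):i]:           # items within the window
                    if u != v:
                        weight[(u, v)] += 1.0            # co-occurrence frequency
                        weight[(v, u)] += 1.0
        neighbors = defaultdict(list)
        for (u, v), w in weight.items():
            neighbors[u].append((v, w))
        # keep only the top-N most frequent neighbors of each item
        return {u: sorted(nb, key=lambda t: -t[1])[:top_n] for u, nb in neighbors.items()}

    g = build_global_graph([[1, 2, 3, 2], [2, 4, 1]])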

15 of 35

GCE-GNN: Overview

16 of 35

Global Level Item Representation

  • An item's global-level representation aggregates its neighbors in the global graph with attention.
  • A neighbor's attention weight depends on its relevance to the current session, estimated from the session's item embeddings, so that noisy global neighbors are down-weighted (sketch below).
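
A simplified sketch of session-aware neighbor attention in this spirit; the paper's exact formulation (e.g., how edge weights enter the attention) differs in detail, and every name here is an assumption:

    import torch
    import torch.nn.functional as F

    def global_level_repr(item_emb, neighbor_emb, session_items_emb):
        """item_emb: (dim,); neighbor_emb: (k, dim); session_items_emb: (n, dim)."""
        s = session_items_emb.mean(dim=0)                # rough session feature
        logits = (neighbor_emb * s).sum(dim=-1)          # neighbor-session relevance
        alpha = F.softmax(logits, dim=0).unsqueeze(-1)   # attention over neighbors
        agg = (alpha * neighbor_emb).sum(dim=0)          # aggregated global neighborhood
        return item_emb + agg                            # combine with the item itself

    h_g = global_level_repr(torch.randn(16), torch.randn(6, 16), torch.randn(4, 16))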

17 of 35

Session Level Item Representation

  • Within the session graph, item representations are updated with graph attention over neighboring items.
  • Each of the four edge types (in, out, in-out, self) has its own attention parameters (simplified sketch below).
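
A simplified sketch of edge-type-aware graph attention over the session graph; this paraphrases the mechanism rather than reproducing the paper's exact equations, and all names are mine:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SessionLevelAttention(nn.Module):
        """Graph attention with 4 edge types: in, out, in-out, self (sketch)."""
        def __init__(self, dim, num_edge_types=4):
            super().__init__()
            self.a = nn.Parameter(torch.randn(num_edge_types, dim))  # per-type params

        def forward(self, h, edge_index, edge_type):
            # h: (n, dim); edge_index: (2, e) source/target ids; edge_type: (e,)
            src, dst = edge_index
            e = F.leaky_relu((h[src] * h[dst] * self.a[edge_type]).sum(-1))  # (e,)
            out = torch.zeros_like(h)
            for i in range(h.size(0)):               # softmax over each node's in-edges
                mask = dst == i
                if mask.any():
                    alpha = F.softmax(e[mask], dim=0).unsqueeze(-1)
                    out[i] = (alpha * h[src[mask]]).sum(dim=0)
            return out

    att = SessionLevelAttention(16)
    h_s = att(torch.randn(3, 16),
              torch.tensor([[0, 1, 0], [1, 2, 0]]),   # edges 0->1, 1->2, 0->0
              torch.tensor([1, 1, 3]))                # edge types: out, out, self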

18 of 35

Session Representation

  • The global-level and session-level item representations are combined into final item representations.
  • Reversed positional embeddings are incorporated, and the session representation is a weighted sum of the item representations, with weights from a position-aware soft attention that also uses the average of the session's items (sketch below).
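
A hedged sketch of position-aware attention pooling along these lines; the exact gating and parameters in the paper differ, so treat this as an illustration:

    import torch
    import torch.nn as nn

    class PositionAwarePooling(nn.Module):
        """Pool item representations into a session embedding with reversed positions."""
        def __init__(self, dim, max_len=50):
            super().__init__()
            self.pos = nn.Embedding(max_len, dim)            # positional embeddings
            self.w1 = nn.Linear(2 * dim, dim)
            self.w2 = nn.Linear(dim, dim)
            self.q = nn.Linear(dim, 1, bias=False)

        def forward(self, h):
            # h: (n, dim) final item representations, in click order
            n = h.size(0)
            rev_pos = self.pos(torch.arange(n - 1, -1, -1))          # reversed positions
            z = torch.tanh(self.w1(torch.cat([h, rev_pos], dim=-1)))
            s_avg = h.mean(dim=0)                                    # session average
            beta = self.q(torch.sigmoid(z + self.w2(s_avg)))         # (n, 1) attention
            return (beta * h).sum(dim=0)                             # session representation

    s = PositionAwarePooling(16)(torch.randn(5, 16))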

19 of 35

Prediction Layer

  • The prediction layer is the same as in SR-GNN

20 of 35

Experiments

21 of 35

Experiments

22 of 35

Experiments

  • GCE-GNN-NP: GCE-GNN with forward positional encoding
  • GCE-GNN-SA: with self-attention instead of the position-aware attention.

23 of 35

GNN Solutions

  • SR-GNN
  • GCE-GNN
  • DHCN

24 of 35

Self-Supervised Hypergraph Convolutional Networks for Session-based Recommendation

  • GNN-based methods regard item transitions as pairwise relations, which neglects the complex high-order information among items.
  • Hypergraphs provide a natural way to capture beyond-pairwise relations.
  • To overcome data sparsity, DHCN introduces a line-graph channel and integrates self-supervised learning.

25 of 35

Definition 1. Hypergraph
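
For reference, the standard definition, paraphrased in LaTeX (notation is mine, not copied from the slide):

    G=(V,E),\quad V=\{v_1,\dots,v_N\},\quad E=\{e_1,\dots,e_M\},\quad e_j\subseteq V,
    \qquad \mathbf{H}\in\{0,1\}^{N\times M},\quad H_{ij}=1 \iff v_i\in e_j .

A hyperedge may connect any number of vertices; H is the incidence matrix of the hypergraph.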

26 of 35

Definition 2. Line graph of Hypergraph
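
Paraphrased in LaTeX; the Jaccard-style edge weight follows my reading of the DHCN paper and should be treated as an assumption:

    L(G)=(V_L,E_L),\quad V_L=E,\qquad
    (e_p,e_q)\in E_L \iff e_p\cap e_q\neq\emptyset,\qquad
    W_{pq}=\frac{|e_p\cap e_q|}{|e_p\cup e_q|} .

Each node of the line graph is a hyperedge (i.e., a session), and two nodes are connected if the corresponding sessions share at least one item.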

27 of 35

Hypergraph Construction

  • Each session is represented as a hyperedge.
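
A small NumPy sketch of this construction (build_incidence is an illustrative name, not from the authors' code):

    import numpy as np

    def build_incidence(sessions, num_items):
        """H[i, j] = 1 iff item i appears in session (hyperedge) j."""
        h = np.zeros((num_items, len(sessions)))
        for j, s in enumerate(sessions):
            for item in set(s):
                h[item, j] = 1.0
        return h

    H = build_incidence([[0, 2, 3], [2, 4], [1, 2, 0]], num_items=5)   # shape (5, 3)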

28 of 35

DHCN: Overview

29 of 35

Hypergraph Convolutional Networks

  • The node representation is updated by aggregating the representations of its neighbors, i.e., the items that share a hyperedge (session) with it.

  • This can be written in matrix form with row normalization (see the sketch below).

  • The final node representation is the average of the node representations from all layers.
  • Positional embeddings are added to the item representations in the same way as in GCE-GNN.
  • The session embedding is obtained in the same way as in SR-GNN.
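
A sketch of the matrix-form propagation as I understand it from the DHCN paper (propagation without learnable weights or nonlinearity, averaging the outputs of all layers); treat the exact form as an assumption:

    import numpy as np

    def hypergraph_conv(x, H, num_layers=3):
        """x: (N, dim) item embeddings; H: (N, M) session incidence matrix."""
        d = np.maximum(H.sum(axis=1, keepdims=True), 1)   # vertex degrees (N, 1)
        b = np.maximum(H.sum(axis=0, keepdims=True), 1)   # hyperedge degrees (1, M)
        layers = [x]
        for _ in range(num_layers):
            # node -> hyperedge -> node propagation with row normalization,
            # i.e. X^(l+1) = D^-1 H B^-1 H^T X^(l)
            x = (H / d) @ ((H / b).T @ x)
            layers.append(x)
        return np.mean(layers, axis=0)                    # average over all layers

    H = np.array([[1, 0], [1, 1], [0, 1], [1, 0]], dtype=float)   # 4 items, 2 sessions
    out = hypergraph_conv(np.random.rand(4, 8), H)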

30 of 35

Enhancing SBR with Self-Supervised Learning

  • The idea is to maximize the mutual information between the hypergraph-channel and line-graph-channel session representations.
  • The line graph can be seen as a simple graph that contains cross-session information and depicts the connectivity of hyperedges.
  • Since each node in the line graph represents a session, its input features are the average of the item embeddings in that session.
  • The line-graph embeddings are computed with a GCN (a sketch of the pairing follows below).
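
A hedged sketch of the pairing: session embeddings from the hypergraph channel and from a GCN over the line graph act as two views of each session, and a simple contrastive loss (negatives by shuffling) maximizes their agreement. This paraphrases the idea rather than reproducing DHCN's exact loss; all names are mine.

    import torch
    import torch.nn.functional as F

    def line_graph_gcn(session_feats, adj):
        """One GCN-style propagation over the line graph.
        session_feats: (S, dim) average item embeddings per session; adj: (S, S)."""
        a_hat = adj + torch.eye(adj.size(0))                   # add self-loops
        d_inv = a_hat.sum(dim=1, keepdim=True).clamp(min=1).reciprocal()
        return d_inv * (a_hat @ session_feats)                 # row-normalized aggregation

    def ssl_loss(view_a, view_b):
        """Contrast matching session pairs against shuffled (negative) pairs."""
        pos = (view_a * view_b).sum(dim=-1)                    # agreement of the two views
        neg = (view_a * view_b[torch.randperm(view_b.size(0))]).sum(dim=-1)
        return (-torch.log(torch.sigmoid(pos) + 1e-8)
                - torch.log(1 - torch.sigmoid(neg) + 1e-8)).mean()

    hyper_view = torch.randn(8, 16)                            # from the hypergraph channel
    line_view = line_graph_gcn(torch.randn(8, 16), torch.rand(8, 8))
    loss = ssl_loss(hyper_view, line_view)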

31 of 35

Enhancing SBR with Self-Supervised Learning

  • The session embeddings from the two channels are treated as two views of the same session; a contrastive objective maximizes their agreement, with negative pairs obtained by corrupting (shuffling) the embeddings.
  • The self-supervised loss is jointly optimized with the recommendation loss, weighted by a hyperparameter.

32 of 35

Experiments


33 of 35

Ablation Studies

  • DHCN-P: without positional encodings
  • DHCN-NA: without self-attention mechanism

34 of 35

Ablation Studies

35 of 35

References

  • Wu, Shu, et al. "Session-based recommendation with graph neural networks." Proceedings of the AAAI conference on artificial intelligence. Vol. 33. No. 01. 2019.
  • Wang, Ziyang, et al. "Global context enhanced graph neural networks for session-based recommendation." Proceedings of the 43rd international ACM SIGIR conference on research and development in information retrieval. 2020.
  • Xia, Xin, et al. "Self-supervised hypergraph convolutional networks for session-based recommendation." Proceedings of the AAAI conference on artificial intelligence. Vol. 35. No. 5. 2021.