Shallow Representation Learning
Node Embedding
Unit Objectives
Node Embedding
Embedding Methods: Intuition
A good node representation should be able to reconstruct the information we want to preserve
Mapping: a function that maps each node to a low-dimensional embedding vector
Extractor: extracts the information to be preserved from the graph
Reconstructor: reconstructs the extracted information from the embeddings
Objective: minimize the reconstruction loss
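A minimal sketch of this framework, assuming a toy adjacency matrix, an inner-product reconstructor, and plain gradient descent (all choices here are illustrative, not the slides' specific method):

```python
# A minimal numpy sketch of the extract/reconstruct view of node embedding.
# The "extractor" here preserves first-order proximity (the adjacency matrix)
# and the "reconstructor" is an inner-product decoder.
import numpy as np

rng = np.random.default_rng(0)

A = np.array([[0, 1, 1, 0],             # toy graph: adjacency matrix
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

n, d = A.shape[0], 2
Z = rng.normal(scale=0.1, size=(n, d))  # mapping: one d-dim embedding per node

lr = 0.05
for step in range(2000):
    A_hat = Z @ Z.T                     # reconstructor: inner-product decoder
    grad = 4 * (A_hat - A) @ Z          # gradient of ||Z Z^T - A||_F^2 (A symmetric)
    Z -= lr * grad                      # minimize the reconstruction loss

print("reconstruction loss:", np.sum((Z @ Z.T - A) ** 2))
```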
Node Embedding: Visual Example
[Figure: Zachary's Karate Club graph (input) and its learned 2D node embedding (output)]
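As a sketch of how such a figure could be produced, here is a simple 2D spectral embedding of the Karate Club graph; the slide's actual figure may come from a different embedding method (e.g., a random-walk based one):

```python
# Embed Zachary's Karate Club graph in 2D using Laplacian eigenvectors,
# then print each node's embedding next to its ground-truth community.
import networkx as nx
import numpy as np

G = nx.karate_club_graph()                      # 34 nodes, 2 communities
L = nx.normalized_laplacian_matrix(G).toarray()
eigvals, eigvecs = np.linalg.eigh(L)
Z = eigvecs[:, 1:3]                             # two smallest nontrivial eigenvectors

for v in G.nodes():
    club = G.nodes[v]["club"]                   # ground-truth community label
    print(f"node {v:2d} ({club:10s}) -> ({Z[v, 0]:+.3f}, {Z[v, 1]:+.3f})")
```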
Graph Embedding
Simple Graphs
Node Co-occurrence
Structural Role
Node Status
Community Structure
Complex Graphs
Heterogeneous Graphs
Bipartite Graphs
Multi-dimensional Graphs
Signed Graphs
Hypergraphs
Dynamic Graphs
Embedding Simple Graphs
Node Co-occurrence Methods
Notions of Co-occurrence
… if the degree distribution of a connected graph follows a power law, we observe that the frequency with which vertices appear in short random walks will also follow a power-law distribution …
Language
Co-occurrence is the basis of fundamental representations in NLP, such as vector representations of words and neural language models
Notion of Co-Occurrence
Source: School of Information, Pratt Institute
Co-Authorship Network
An author is connected to a limited set of other authors
This can be used as a notion of co-occurrence
Can we learn node representations that preserve the co-occurrence relations in the network?
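A toy sketch of this notion of co-occurrence (the paper and author names are made up): authors who appear on the same paper co-occur, and the counts of shared papers give co-occurrence strengths.

```python
# Count co-author co-occurrences: one co-occurrence per shared paper.
from itertools import combinations
from collections import Counter

papers = [
    {"ana", "bo", "chen"},
    {"ana", "bo"},
    {"chen", "dee"},
]

cooccur = Counter()
for authors in papers:
    for a, b in combinations(sorted(authors), 2):
        cooccur[(a, b)] += 1

print(cooccur.most_common())    # ('ana', 'bo') co-occurs most often
```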
Language Models
Word Frequency in Natural Language
Word frequency in natural language follows a power law (Zipf's law)
Slide from Bryan Perozzi et al.
The second most used word appears half as often as the most used word; the third most used word appears one-third as often, and so on
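A quick numeric sketch of this rank-frequency pattern (the top-word count is hypothetical):

```python
# Zipf's law sketch: the word of rank r occurs with frequency roughly
# proportional to 1/r, matching the "half as often", "one-third as often"
# pattern described above.
top_count = 60_000                         # hypothetical count of the top word
for rank in range(1, 6):
    print(rank, round(top_count / rank))   # 60000, 30000, 20000, 15000, 12000
```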
Connection: Language and Graphs
Scale-Free Graph
Artificial Language: Short truncated random walks as sentences
Random Walk on Graph
Short random walks
Vertex frequency in random walks on scale-free graphs also follows a power law.
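A sketch of this connection, assuming a Barabási-Albert graph as the scale-free graph and uniform random walks (the graph size, walk length, and walks per node are all illustrative):

```python
# Generate short truncated random walks on a scale-free graph and check
# that vertex frequency in the walks is heavy-tailed, like word frequency.
import random
from collections import Counter
import networkx as nx

random.seed(0)
G = nx.barabasi_albert_graph(n=1000, m=2, seed=0)   # scale-free graph

def random_walk(G, start, length):
    walk = [start]
    for _ in range(length - 1):
        walk.append(random.choice(list(G.neighbors(walk[-1]))))
    return walk

# Treat each short walk as a "sentence" over the vertex "vocabulary".
walks = [random_walk(G, v, length=10) for v in G.nodes() for _ in range(5)]
freq = Counter(v for walk in walks for v in walk)

# Frequencies at exponentially spaced ranks fall off roughly like a power law.
ranked = [count for _, count in freq.most_common()]
for r in (1, 10, 100, 1000):
    print(f"rank {r:4d}: frequency {ranked[r - 1]}")
```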
Connection: Language and Graphs
Language Model in Concrete Terms
The Problems
The Hope