Graphite: Iterative Generative Modeling of Graphs
Aditya Grover, Aaron Zweig, Stefano Ermon
Computer Science Department, Stanford University
Graphs are ubiquitous
Social, biological, and information networks, etc.
How do we learn representations of nodes in a graph? Such representations are useful for several prediction tasks, e.g., friendship links on social networks (link prediction) or the living status of organisms in ecological networks (node classification).
Latent Variable Model of a Graph
• Graphs are represented as adjacency matrices A ∈ {0,1}^{n×n}
• For every node i, we associate a latent vector representation z_i ∈ ℝ^k
Example graph: the 4-node cycle 1-2-3-4-1.
Adjacency matrix:
A =
  0 1 0 1
  1 0 1 0
  0 1 0 1
  1 0 1 0
Latent feature matrix: Z with rows z_1, z_2, z_3, z_4.
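As a concrete illustration of these two objects (a minimal sketch; the latent dimension k = 2 is chosen only for this example), the adjacency matrix of the 4-node cycle and a random latent feature matrix can be written as plain arrays:

```python
import numpy as np

# Adjacency matrix of the 4-node example graph (the cycle 1-2-3-4-1).
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]])

n, k = A.shape[0], 2            # k = latent dimension (illustrative choice)
Z = np.random.randn(n, k)       # one k-dimensional latent vector z_i per node

print(A.shape, Z.shape)         # (4, 4) (4, 2)
```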
Graphite: A VAE for Graphs
• Latent matrix Z ∈ ℝ^{n×k}
• Adjacency matrix A ∈ {0,1}^{n×n}
• Decoder: generate data, p_θ(A|Z)
Graphite: A VAE for Graphs
• Latent matrix Z ∈ ℝ^{n×k}
• Adjacency matrix A ∈ {0,1}^{n×n}
• Encoder: infer representations, q_φ(Z|A)
• Decoder: generate data, p_θ(A|Z)
Graphite: Learning & Inference
Given: dataset of adjacency matrices, D
Graphite: Learning & Inference
Given: dataset of adjacency matrices, D
Learning objective: max_{θ,φ} ELBO(θ, φ; D)
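A minimal sketch of this objective for a single graph, assuming a standard-normal prior over Z and a Bernoulli likelihood over edges (the `encoder` and `decoder` callables are placeholders for the modules introduced on the following slides):

```python
import torch
import torch.nn.functional as F

def elbo(A, encoder, decoder):
    """Single-graph ELBO: E_q[log p_theta(A|Z)] - KL(q_phi(Z|A) || p(Z))."""
    # A: float {0,1} adjacency matrix of shape (n, n)
    mu, logvar = encoder(A)                                   # q_phi(Z|A): per-node mean / log-variance
    Z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()      # reparameterized sample of Z
    probs = decoder(Z)                                        # p_theta(A_ij = 1 | Z)
    recon = -F.binary_cross_entropy(probs, A, reduction="sum")        # log-likelihood of observed edges
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp())      # KL to N(0, I)
    return recon - kl                                         # maximize w.r.t. theta, phi
```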
Graphite: Learning & Inference
Given: dataset of adjacency matrices, D
Learning objective: max_{θ,φ} ELBO(θ, φ; D)
Test-time use cases
• Generative modeling tasks: density estimation, clustering nodes, compressing graphs, etc.
• Graph tasks:
  - Link prediction: denoise the graph
  - Semi-supervised node classification: feed z_i for labelled nodes to a classifier (see the sketch below)
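A minimal sketch of these two graph tasks, assuming the encoder's posterior means `mu` (one row per node) have already been computed; the indices and sizes below are placeholders:

```python
import torch

mu = torch.randn(4, 16)                       # placeholder: 4 nodes, 16-dim posterior means

# Link prediction: score a candidate edge (i, j) by the inner product of embeddings.
i, j = 0, 2
edge_prob = torch.sigmoid(mu[i] @ mu[j])      # higher = more likely to be a true edge

# Semi-supervised node classification: feed z_i of labelled nodes to a classifier.
num_classes, labelled_idx = 3, torch.tensor([0, 1])
classifier = torch.nn.Linear(mu.shape[1], num_classes)
class_logits = classifier(mu[labelled_idx])   # train with the available labels
```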
Parameterizing Graph Autoencoders
• Encoding q_φ(Z|A): Graph Neural Network (GNN)
Parameterizing Graph Autoencoders
• Encoding q_φ(Z|A): Graph Neural Network (GNN), sketched below
• Decoding p_θ(A|Z): challenging to "upsample" graphs given latent representations
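A minimal sketch of such a GNN encoder, in the style of the two-layer graph convolutional encoder of Kipf & Welling (2016); the layer sizes, the ReLU nonlinearity, and the identity fallback for node features are illustrative assumptions:

```python
import torch
import torch.nn as nn

class GCNEncoder(nn.Module):
    """q_phi(Z|A): two rounds of message passing producing per-node mean and log-variance."""
    def __init__(self, in_dim, hidden_dim, latent_dim):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hidden_dim)
        self.lin_mu = nn.Linear(hidden_dim, latent_dim)
        self.lin_logvar = nn.Linear(hidden_dim, latent_dim)

    @staticmethod
    def normalize(A):
        # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
        A_hat = A + torch.eye(A.shape[0])
        d_inv_sqrt = A_hat.sum(dim=1).pow(-0.5)
        return d_inv_sqrt[:, None] * A_hat * d_inv_sqrt[None, :]

    def forward(self, A, X=None):
        if X is None:                         # featureless graphs: identity node features (in_dim = n)
            X = torch.eye(A.shape[0])
        A_norm = self.normalize(A)
        H = torch.relu(A_norm @ self.lin1(X))          # first round of message passing
        H = A_norm @ H                                 # second round of message passing
        return self.lin_mu(H), self.lin_logvar(H)
```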
Decoding Graphs - MLP
Option 1: Multi-layer Perceptrons (MLP) [Simonovsky et al., 2018]
• Z ∈ ℝ^{n×k}, A ∈ {0,1}^{n×n}, decoder p_θ(A|Z) is an MLP
• O(nd + dk) total parameters for a single hidden layer of width d
Decoding Graphs - RNN
Option 2: Recurrent Neural Network (RNN) [You et al., 2018]
• Z ∈ ℝ^{n×k}, A ∈ {0,1}^{n×n}, decoder p_θ(A|Z) is an RNN
• An arbitrary ordering of the nodes is required for training, e.g., BFS, DFS
Graphite – Decoding Graphs using GNN
Key idea: learn the low-rank structure of the adjacency matrix A in the latent space Z
• Z ∈ ℝ^{n×k}, A ∈ {0,1}^{n×n}, decoder p_θ(A|Z) is a GNN
Graphite – Decoding Graphs using GNN
• For a fixed number of iterations:
  Step 1 (Low-rank matrix reconstruction): map Z to an intermediate graph Â via an inner product, Â ≈ ZZᵀ
Graphite – Decoding Graphs using GNN
• For a fixed number of iterations:
  Step 1 (Low-rank matrix reconstruction): map Z to an intermediate graph Â via an inner product, Â ≈ ZZᵀ
  Step 2 (Progressive refinement): refine Z by message passing over Â using a GNN, Z = GNN_θ(Â)
Graphite – Decoding Graphs using GNN
• For a fixed number of iterations:
  Step 1 (Low-rank matrix reconstruction): map Z to an intermediate graph Â via an inner product, Â ≈ ZZᵀ
  Step 2 (Progressive refinement): refine Z by message passing over Â using a GNN, Z = GNN_θ(Â)
• Output step: set p_θ(A|Z) = Bernoulli(sigmoid(ZZᵀ)), as sketched below
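A minimal sketch of this decoding loop (the intermediate-graph normalization and the exact GNN layers used in the paper are simplified here to one weight matrix per iteration; see the code link on the final slide for the authors' implementation):

```python
import torch
import torch.nn as nn

class GraphiteDecoder(nn.Module):
    """p_theta(A|Z): alternate low-rank graph construction and GNN refinement of Z."""
    def __init__(self, latent_dim, num_iters=2):
        super().__init__()
        self.refine = nn.ModuleList([nn.Linear(latent_dim, latent_dim) for _ in range(num_iters)])

    def forward(self, Z):
        for layer in self.refine:
            # Step 1: low-rank reconstruction of an intermediate graph, A_hat ≈ Z Zᵀ
            # (the paper additionally normalizes A_hat; omitted here for brevity).
            A_hat = Z @ Z.t()
            # Step 2: progressive refinement by message passing over A_hat.
            Z = torch.relu(A_hat @ layer(Z))
        # Output step: p_theta(A_ij = 1 | Z) = sigmoid(z_i . z_j)
        return torch.sigmoid(Z @ Z.t())
```

Together with the encoder sketch above, this decoder plugs directly into the ELBO from the learning slide and can be trained with any stochastic gradient method.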
Graphite – Decoding Graphs using GNN
• Unlike the MLP, a GNN decoder with a single hidden layer of width d has O(dk) parameters (see the comparison below)
• Unlike the RNN, no arbitrary ordering of the input nodes is required
Decoding is also computationally efficient. See paper for details.
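A quick parameter-count comparison illustrating this point; the sizes n, k, d are arbitrary illustrative choices, the per-node MLP maps z_i to one row of A, and a single GNN weight matrix acts only on feature dimensions:

```python
import torch.nn as nn

n, k, d = 10_000, 32, 64     # nodes, latent dimension, hidden width (illustrative)

# Per-node MLP decoder: z_i (k-dim) -> hidden (d) -> one row of A (n-dim).
mlp_decoder = nn.Sequential(nn.Linear(k, d), nn.ReLU(), nn.Linear(d, n))

# One GNN message-passing weight matrix: size is independent of the number of nodes n.
gnn_layer = nn.Linear(k, d)

count = lambda m: sum(p.numel() for p in m.parameters())
print(count(mlp_decoder), count(gnn_layer))   # ~O(nd + dk) vs. O(dk) parameters
```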
Empirical Results – Density Estimation
Baseline: VGAE [Kipf et al., 2016], a GNN encoder with a non-learned inner-product decoder and no iterative refinement.
[Bar chart: negative log-likelihoods (lower is better) for VGAE vs. Graphite across six synthetic graph families (Erdos-Renyi, Ego, Regular, Geometric, Power Law, Barabasi-Albert).]
Empirical Results – Link Prediction
[Bar chart: AUC (higher is better) on Cora, Citeseer, and Pubmed for SpecCluster, DeepWalk, node2vec, VGAE, and Graphite.]
Empirical Results – Semi-supervised Node Classification
[Bar chart: percentage accuracy (higher is better) on Cora, Citeseer, and Pubmed for SemiEmb, DeepWalk, ICA, Planetoid, GCN, and Graphite.]
Summary
Graphite: a latent variable generative model for graphs where both the encoder q_φ(Z|A) and the decoder p_θ(A|Z) are parameterized by graph neural networks.
• Encoder performs message passing on the input graph
• Decoder iteratively refines inner-product graphs
For more details, please visit us at Poster #7.
Code: https://github.com/ermongroup/graphite