  1. Graph Neural Network Fang Yuanqiang, 2019/05/18

  2. Graph Neural Network  Why GNN?  Preliminary  Fixed graph  Vanilla Spectral Graph ConvNets  ChebyNet  GCN  CayleyNet  Multiple graphs  Variable graph  GraphSAGE  Graph Attention Network  Tasks

  3. Why GNN?  Euclidean domain & Non-Euclidean domain

  4. Why GNN?  ConvNets and Euclidean geometry  Data (images, video, sound) are compositional: they are formed by patterns that are  Local → convolution  Multi-scale (hierarchical) → downsampling/pooling  Stationary → global/local invariance

  5. Why GNN?  Extend ConvNets to graph-structured data  Assumption: non-Euclidean data are locally stationary and manifest hierarchical structures.  How to define compositionality on graphs? (convolution & pooling)  How to make them fast? (linear complexity)

  6. Preliminary  Theory  Graph theory  Convolution, spectral convolution  Fourier transform  Riemannian manifold  ……  Reference  http://geometricdeeplearning.com/slides/NIPS-GDL.pdf  http://helper.ipam.ucla.edu/publications/dlt2018/dlt2018_14506.pdf  https://www.zhihu.com/question/54504471?sort=created

  7. Preliminary  Graph

  8. Preliminary  Graph Laplacian
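The figure on this slide did not survive extraction. For an n-node graph with adjacency matrix A and degree matrix D (D_ii = Σ_j A_ij), the standard definitions used in the rest of the deck are:

    L = D − A                             (combinatorial Laplacian)
    L_norm = I − D^{−1/2} A D^{−1/2}      (normalized Laplacian)
    L = Φ Λ Φ^T                           (eigendecomposition)

The eigenvectors Φ act as the graph's Fourier basis and the eigenvalues in Λ as its frequencies; the spectral constructions on the following slides all build on this.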

  9. Preliminary  Convolution: continuous
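The formula itself was an image; the standard continuous definition is

    (f ∗ h)(t) = ∫ f(τ) h(t − τ) dτ

i.e., the filter h is flipped, shifted, and integrated against the signal f.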

  10. Preliminary  Convolution: discrete  Circular convolution  By the convolution theorem, circular convolution in the spatial (2-d) or temporal (1-d) domain equals the Hadamard product in the spectral domain: f ∗ h = Φ^{−1}(Φf ⊙ Φh), where Φ is the DFT (Fourier) matrix and Φ^{−1} the inverse Fourier transform
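A quick numerical check of this identity (a minimal numpy sketch; the length-8 random signals are arbitrary):

    import numpy as np

    rng = np.random.default_rng(0)
    f = rng.standard_normal(8)   # signal
    h = rng.standard_normal(8)   # filter

    # direct circular convolution
    direct = np.array([sum(f[j] * h[(i - j) % 8] for j in range(8)) for i in range(8)])

    # spectral route: Fourier transform -> Hadamard product -> inverse Fourier transform
    spectral = np.real(np.fft.ifft(np.fft.fft(f) * np.fft.fft(h)))

    assert np.allclose(direct, spectral)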

  11. Preliminary: aside  “Conv” in Deep Neural Networks.  http://cs231n.github.io/convolutional-networks/

  12. Preliminary: aside  “Conv” in Deep Neural Networks.  https://en.wikipedia.org/wiki/Cross-correlation

  13. Preliminary: aside  “Conv” in Deep Neural Networks is actually “cross-correlation”.  https://pytorch.org/docs/0.3.1/nn.html#convolution-layers
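This is easy to verify (a minimal PyTorch sketch; the signal and kernel values are arbitrary). torch's conv1d slides the kernel without flipping it; flipping by hand recovers true convolution:

    import torch
    import torch.nn.functional as F

    x = torch.tensor([[[1., 2., 3., 4., 5.]]])   # (batch, channels, length)
    k = torch.tensor([[[1., 0., -1.]]])          # asymmetric kernel makes the difference visible

    cross_corr = F.conv1d(x, k)                  # what deep-learning "conv" computes
    true_conv  = F.conv1d(x, k.flip(-1))         # kernel flipped: true convolution

    print(cross_corr)   # tensor([[[-2., -2., -2.]]])
    print(true_conv)    # tensor([[[ 2.,  2.,  2.]]])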

  14. Preliminary  Convolution: graph  h: filter  g: signal  h(Λ): diagonal matrix, a function of Λ
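Putting the pieces together: filtering a graph signal g with a spectral filter h means transforming into the Laplacian's eigenbasis, scaling each frequency, and transforming back, i.e. Φ h(Λ) Φ^T g. A minimal numpy sketch (the 4-cycle graph and heat-kernel filter are toy choices):

    import numpy as np

    A = np.array([[0, 1, 0, 1],
                  [1, 0, 1, 0],
                  [0, 1, 0, 1],
                  [1, 0, 1, 0]], dtype=float)   # adjacency of a 4-cycle
    L = np.diag(A.sum(axis=1)) - A              # combinatorial Laplacian

    lam, Phi = np.linalg.eigh(L)                # L = Phi diag(lam) Phi^T
    g = np.array([1., 0., 0., 0.])              # graph signal
    h = lambda lam: np.exp(-lam)                # example spectral filter (heat kernel)

    filtered = Phi @ np.diag(h(lam)) @ Phi.T @ g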

  15. Preliminary  Graph pooling  Produce a sequence of coarsened graphs  Max or average pooling of collapsed vertices  Binary tree arrangement of node indices
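The binary-tree arrangement is what makes pooling cheap: once matched vertices are stored at consecutive indices, graph pooling reduces to ordinary 1-d pooling (a minimal sketch with toy node features):

    import numpy as np

    x = np.array([3., 1., 4., 1., 5., 9., 2., 6.])   # features of 8 nodes, matched pairs adjacent
    pooled = x.reshape(-1, 2).max(axis=1)            # max-pool each collapsed vertex pair
    print(pooled)                                    # [3. 4. 9. 6.]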

  16. Fixed graph: Vanilla Spectral Graph ConvNets Spectral Networks and Deep Locally Connected Networks on Graphs, 2014, ICLR  Locally connected networks

  17. Fixed graph: Vanilla Spectral Graph ConvNets Spectral Networks and Deep Locally Connected Networks on Graphs, 2014, ICLR  Locally connected networks

  18. Fixed graph: Vanilla Spectral Graph ConvNets Spectral Networks and Deep Locally Connected Networks on Graphs, 2014, ICLR  Spectral convolution  W ∈ ℝ^{n×n}: diagonal matrix of learnable spectral filter coefficients at each layer
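A minimal PyTorch sketch of such a layer (Phi, the precomputed eigenvector matrix of the fixed n-node graph, is assumed given). Note the n learnable coefficients are tied to one specific graph and one specific eigenbasis:

    import torch

    class SpectralConv(torch.nn.Module):
        def __init__(self, Phi):
            super().__init__()
            self.Phi = Phi                                          # (n, n) Laplacian eigenvectors
            self.w = torch.nn.Parameter(torch.randn(Phi.shape[0]))  # diagonal spectral filter

        def forward(self, x):                                       # x: (n,) graph signal
            return self.Phi @ (self.w * (self.Phi.t() @ x))         # Phi diag(w) Phi^T x

Each forward pass costs O(n^2) and the filter is not spatially localized, which is exactly what ChebyNet addresses next.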

  19. Fixed graph: Vanilla Spectral Graph ConvNets Spectral Networks and Deep Locally Connected Networks on Graphs, 2014, ICLR  Analysis  Each sample is a graph!

  20. Fixed graph: Vanilla Spectral Graph ConvNets Spectral Networks and Deep Locally Connected Networks on Graphs, 2014, ICLR  Analysis

  21. Fixed graph: ChebyNet Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering, 2016, NIPS  Polynomial parametrization for localized filters  y = Φ h_θ(Λ) Φ^T x, with Φ^T Φ = I  Polynomial filter: h_θ(Λ) = Σ_{k=0}^{K−1} θ_k Λ^k, so that y = Φ (Σ_{k=0}^{K−1} θ_k Λ^k) Φ^T x = Σ_{k=0}^{K−1} θ_k L^k x  Chebyshev polynomial: h_θ(Λ) = Σ_{k=0}^{K−1} θ_k T_k(Λ̃), where Λ̃ = 2Λ/λ_max − I  Cost: O(K|E|), linear in the number of edges  Why localized? (L^k)_{ij} = 0 whenever nodes i and j are more than k hops apart, so a (K−1)-order polynomial filter mixes only (K−1)-hop neighborhoods
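A minimal numpy sketch of the resulting filtering: the Chebyshev recurrence T_k(x) = 2x T_{k−1}(x) − T_{k−2}(x) lets us apply the filter with K sparse matrix-vector products and no eigendecomposition. L_tilde = 2L/λ_max − I and the coefficients theta (K ≥ 2) are assumed given:

    def cheby_filter(L_tilde, x, theta):
        T_prev, T_curr = x, L_tilde @ x                # T_0(L)x = x, T_1(L)x = Lx
        out = theta[0] * T_prev + theta[1] * T_curr
        for k in range(2, len(theta)):
            T_next = 2 * (L_tilde @ T_curr) - T_prev   # T_k = 2 L T_{k-1} - T_{k-2}
            out += theta[k] * T_next
            T_prev, T_curr = T_curr, T_next
        return out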

  22. Fixed graph: ChebyNet Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering, 2016, NIPS  Experiments  MNIST: each digit is a graph  Text categorization: 10,000 keywords make up the graph

  23. Fixed graph: ChebyNet Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering, 2016, NIPS  Analysis

  24. Fixed graph: GCN Semi-Supervised Classification with Graph Convolutional Networks, 2017, ICLR  Simplification of ChebyNet with K = 1
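The derivation in the paper: truncate the Chebyshev expansion at first order and approximate λ_max ≈ 2, giving

    g ∗ x ≈ θ_0 x + θ_1 (L − I) x = θ_0 x − θ_1 D^{−1/2} A D^{−1/2} x

then tie the two parameters, θ = θ_0 = −θ_1:

    g ∗ x ≈ θ (I + D^{−1/2} A D^{−1/2}) x

and finally apply the renormalization trick I + D^{−1/2} A D^{−1/2} → D̃^{−1/2} Ã D̃^{−1/2}, with Ã = A + I and D̃_ii = Σ_j Ã_ij, to keep the operator's eigenvalues in a numerically stable range.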

  25. Fixed graph: GCN Semi-Supervised Classification with Graph Convolutional Networks, 2017, ICLR  Input-output  X ∈ ℝ^{N×C}: C-dimensional feature vectors for N nodes  Θ ∈ ℝ^{C×F}: matrix of filter parameters  Z ∈ ℝ^{N×F}: F-dimensional output vectors for N nodes  Two-layer network  Loss over labeled examples
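A minimal PyTorch sketch of the two-layer network and the semi-supervised loss (A_hat denotes the renormalized adjacency D̃^{−1/2} Ã D̃^{−1/2}; weight shapes follow the slide's notation):

    import torch
    import torch.nn.functional as F

    def gcn_forward(A_hat, X, W0, W1):
        H = torch.relu(A_hat @ X @ W0)    # first graph convolution, (N, C) -> (N, hidden)
        return A_hat @ H @ W1             # second layer: logits Z, (N, F)

    def semi_supervised_loss(logits, labels, labeled_mask):
        # cross-entropy over the labeled nodes only
        return F.cross_entropy(logits[labeled_mask], labels[labeled_mask])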

  26. Fixed graph: GCN Semi-Supervised Classification with Graph Convolutional Networks, 2017, ICLR  Datasets  Whole dataset as a graph: N = N_train + N_val + N_test

  27. Fixed graph: GCN Semi-Supervised Classification with Graph Convolutional Networks, 2017, ICLR  Visualization (one labeled point for each class)

  28. Fixed graph: CayleyNet CayleyNets: Graph Convolutional Neural Networks with Complex Rational Spectral Filters, 2017  Cayley transform: C(x) = (x − i)/(x + i)  Cayley polynomial  Cayley filter  Any spectral filter can be formulated as a Cayley filter.
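For reference, the Cayley filter in the paper has the form (r: filter order, h > 0: a learnable "spectral zoom" parameter):

    g_{c,h}(λ) = c_0 + 2 Re{ Σ_{j=1}^{r} c_j C(hλ)^j },   C(hλ) = (hλ − i)/(hλ + i)

with c_0 real and c_1, …, c_r complex learnable coefficients. The zoom h dilates the spectrum before the Cayley transform maps it onto the complex unit circle, letting the filter concentrate on narrow frequency bands.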

  29. Fixed graph: Multiple graphs Geometric matrix completion with recurrent multi-graph neural networks, 2017, NIPS  Matrix (ℝ^{m×n}) completion  G_r: row graph, m × m  G_c: column graph, n × n

  30. Fixed graph: Multiple graphs Geometric matrix completion with recurrent multi-graph neural networks, 2017, NIPS  Matrix (ℝ^{m×n}) completion  Problem: rank minimization is NP-hard; the convex surrogate replaces rank with ‖·‖_⋆ (nuclear norm: sum of singular values), with ‖·‖_F the Frobenius norm on the data-fitting term  Geometric matrix completion: ‖·‖_G denotes the graph Dirichlet norm, ‖X‖²_{G_r} = trace(X^T Δ_r X) and ‖X‖²_{G_c} = trace(X Δ_c X^T)  Factorized model  Low-rank factorization (for large matrices): X = W H^T, with W: m × r and H: n × r  Graph-based
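Putting these pieces together, the geometric matrix completion objective has (up to the exact weighting of the terms in the paper) the form

    min_X  ‖X‖²_{G_r} + ‖X‖²_{G_c} + (μ/2) ‖Ω ∘ (X − Y)‖²_F

where Y holds the known ratings, Ω masks the observed entries, and the two Dirichlet terms encourage the recovered matrix to vary smoothly along the row graph and the column graph.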

  31. Fixed graph: Multiple graphs Geometric matrix completion with recurrent multi-graph neural networks, 2017, NIPS  Multi-graph CNNs (MGCNN)  The 2-d Fourier transform of a matrix can be thought of as applying a 1-d Fourier transform to its rows and columns  Φ_r: eigenvectors w.r.t. G_r; Φ_c: eigenvectors w.r.t. G_c  Multi-graph spectral convolution  p-order Chebyshev polynomial filters  input dim.: m × n × q′  output dim.: m × n × q
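Concretely, following the paper's construction, the multi-graph Fourier transform and convolution read

    X̃ = Φ_r^T X Φ_c              (multi-graph Fourier transform)
    X ⋆ Y = Φ_r (X̃ ∘ Ỹ) Φ_c^T    (convolution = Hadamard product in the joint spectral domain)

and the Chebyshev parametrization replaces the explicit eigendecompositions with polynomials in the two Laplacians, exactly as in ChebyNet.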

  32. Fixed graph: Multiple graphs Geometric matrix completion with recurrent multi-graph neural networks, 2017, NIPS  Separable convolution (sMGCNN)  Complexity: O(m + n) < O(mn)

  33. Fixed graph: Multiple graphs Geometric matrix completion with recurrent multi-graph neural networks, 2017, NIPS  Architectures  RNN: diffuse the score values X^(t) progressively  MGCNN  sMGCNN

  34. Fixed graph: Multiple graphs Geometric matrix completion with recurrent multi-graph neural networks, 2017, NIPS  Loss  Θ, θ_r, θ_c: Chebyshev polynomial coefficients  σ: LSTM, T: number of iterations  MGCNN  sMGCNN

  35. Fixed graph: Multiple graphs Geometric matrix completion with recurrent multi-graph neural networks, 2017, NIPS  Algorithm

  36. Fixed graph: Multiple graphs Geometric matrix completion with recurrent multi-graph neural networks, 2017, NIPS  Results  MovieLens dataset:  100,000 ratings (1-5) from 943 users on 1682 movies (6.3% of entries observed)  Each user has rated at least 20 movies  User: user id | age | gender | occupation | zip code  Movie: movie id | movie title | release date | video release date | IMDb URL | unknown | Action | Adventure | Animation | Children's | Comedy | Crime | Documentary | Drama | Fantasy | ……

  37. Variable graph: GraphSAGE Inductive Representation Learning on Large Graphs, 2017, NIPS  Desiderata => models that generalize well  Invariant to node ordering  No graph isomorphism problem (https://en.wikipedia.org/wiki/Graph_isomorphism)  Locality  Operations depend on the neighbors of a given node  Number of model parameters should be independent of graph size  Model should be independent of graph structure, so that it can be transferred across graphs

  38. Variable graph: GraphSAGE Inductive Representation Learning on Large Graphs, 2017, NIPS  Learn to propagate information across the graph to compute node features.
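A minimal PyTorch sketch of one GraphSAGE propagation step with the mean aggregator (summing two linear maps is equivalent to the paper's concatenate-then-linear form; the adjacency list and the parameter names W_self / W_neigh are illustrative assumptions):

    import torch

    def sage_layer(H, neighbors, W_self, W_neigh):
        # H: (num_nodes, d_in) node features; neighbors[v]: index tensor of v's neighbors
        out = []
        for v in range(H.shape[0]):
            h_neigh = H[neighbors[v]].mean(dim=0)            # aggregate the neighborhood
            out.append(W_self @ H[v] + W_neigh @ h_neigh)    # combine self and neighborhood
        return torch.relu(torch.stack(out))                  # (num_nodes, d_out)

Because the parameters are shared across nodes and depend only on local neighborhoods, the same trained layer applies to nodes and graphs never seen during training, which is the inductive property the desiderata on the previous slide ask for.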
