  1. Slides by Nolan Dey

  2. Graph Notation
     [Graph diagram: four nodes A, B, C, D with edges A → B, B → C, B → D, C → D]

                 A  B  C  D                  A  B  C  D
            A [  0  1  0  0 ]           A [  1  0  0  0 ]
        A = B [  0  0  1  1 ]       D = B [  0  2  0  0 ]
            C [  0  0  0  1 ]           C [  0  0  1  0 ]
            D [  0  0  0  0 ]           D [  0  0  0  0 ]

     • A = adjacency matrix → defines graph edges
     • D = degree matrix → defines number of edges per node
     • Â = A + I
     • D̂ = D + I
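
A minimal NumPy sketch of these matrices for the four-node example; the node ordering (A, B, C, D) and float dtype are assumptions:

```python
import numpy as np

# Edges read off the adjacency matrix: A -> B, B -> C, B -> D, C -> D.
A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)

# Degree matrix: number of edges per node on the diagonal (row sums of A).
D = np.diag(A.sum(axis=1))

# Versions with self-loops, used on the later aggregation slides.
A_hat = A + np.eye(4)
D_hat = D + np.eye(4)
```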

  3. Network Notation
     • N = number of nodes
     • d_l = number of node features at layer l
     • F^l = hidden representation at layer l → (N × d_l)
     • F^0 = X
     • A → (N × N)
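
A tiny shape check to make the notation concrete; the sizes (N = 4 nodes, d_0 = 3 input features) are assumptions:

```python
import numpy as np

N, d_0 = 4, 3                  # assumed sizes
X = np.random.randn(N, d_0)    # F^0 = X, shape (N, d_0)
A = np.zeros((N, N))           # adjacency matrix, shape (N, N)

print(X.shape, A.shape)        # (4, 3) (4, 4)
```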

  4. GCN Layer
     • Fully connected layer: F^l = σ(F^{l-1} W^l + b)
     • GCN layer: F^l = σ(transform(aggregate(A, F^{l-1}), W^l))
     • aggregate purpose: take a weighted sum of features from adjacent nodes (analog of convolution)
     • transform purpose: transform aggregated features using a weight matrix
     • transform(M, W^l) = M W^l
     • GCN layer: F^l = σ(aggregate(A, F^{l-1}) W^l)
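
A sketch of this aggregate/transform decomposition in NumPy; the function names mirror the slide, while ReLU as the nonlinearity σ (it only appears explicitly on slide 9) and the plain-sum aggregate are assumptions at this point:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def aggregate(A, F):
    # Weighted sum of features from adjacent nodes (plain sum aggregation here;
    # the following slides refine this choice).
    return A @ F

def transform(M, W):
    # Transform the aggregated features with a weight matrix.
    return M @ W

def gcn_layer(A, F_prev, W):
    # F^l = sigma(transform(aggregate(A, F^{l-1}), W^l)), with sigma = ReLU.
    return relu(transform(aggregate(A, F_prev), W))
```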

  5. Sum Aggregation
     • aggregate(A, F^{l-1}) = A F^{l-1}
     • Pros: aggregated features are the sum of the features of neighbouring nodes
     • Cons: a node's own features do not get propagated
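
Reusing the example graph, a quick check of the "own features do not get propagated" point; the one-hot features are an assumption chosen for readability:

```python
import numpy as np

A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)

F0 = np.eye(4)      # one-hot node features make the mixing easy to read
print(A @ F0)       # row i sums only the features of i's neighbours,
                    # so no row contains node i's own feature
```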

  6. Sum Aggregation 2
     • aggregate(Â, F^{l-1}) = Â F^{l-1}
     • Â = A + I
     • Pros: aggregated features are the sum of a node's own features and the features of neighbouring nodes
     • Cons: nodes with more connections have features of higher magnitude
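
A small sketch of the magnitude issue on the example graph; identical all-ones features are an assumption chosen to make the effect obvious:

```python
import numpy as np

A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)
A_hat = A + np.eye(4)

F0 = np.ones((4, 2))     # identical features on every node
print(A_hat @ F0)        # node B (two neighbours plus itself) comes out 3x larger
                         # than node D (no neighbours): more connections, higher magnitude
```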

  7. Mean Aggregation
     • aggregate(Â, F^{l-1}) = D̂^{-1} Â F^{l-1}
     • D̂_ii = Σ_j Â_ij
     • Pros: aggregated features are the average of a node's own features and the features of neighbouring nodes
     • Cons: dynamics are "not interesting enough"
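
The same example with the degree normalisation applied; the all-ones features are again an assumption for readability:

```python
import numpy as np

A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)
A_hat = A + np.eye(4)
D_hat = np.diag(A_hat.sum(axis=1))         # D_hat_ii = sum_j A_hat_ij

F0 = np.ones((4, 2))
print(np.linalg.inv(D_hat) @ A_hat @ F0)   # every row is back to all ones:
                                           # each node averages itself and its neighbours
```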

  8. Spectral Aggregation
     • aggregate(Â, F^{l-1}) = D̂^{-1/2} Â D̂^{-1/2} F^{l-1}
     • First-order approximation of a spectral graph convolution
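
A sketch of the symmetric normalisation on the example graph; the random features and their size are assumptions:

```python
import numpy as np

A = np.array([[0, 1, 0, 0],
              [0, 0, 1, 1],
              [0, 0, 0, 1],
              [0, 0, 0, 0]], dtype=float)
A_hat = A + np.eye(4)
D_hat = np.diag(A_hat.sum(axis=1))

# Symmetric normalisation: D_hat^{-1/2} A_hat D_hat^{-1/2}
D_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(D_hat)))
A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt

F0 = np.random.randn(4, 3)
print(A_norm @ F0)    # spectral-style aggregation of the node features
```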

  9. What are GCNs?
     • GCN layer: F^l = σ(transform(aggregate(Â, F^{l-1}), W^l))
     • transform(M, W^l) = M W^l
     • aggregate(Â, F^{l-1}) = D̂^{-1/2} Â D̂^{-1/2} F^{l-1}
     • GCN layer output: F^l = ReLU(D̂^{-1/2} Â D̂^{-1/2} F^{l-1} W^l)
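
Putting the pieces together, a minimal sketch of the full layer from this slide; the example sizes and random weights are assumptions:

```python
import numpy as np

def gcn_layer(A_hat, D_hat, F_prev, W):
    """One GCN layer: F^l = ReLU(D_hat^{-1/2} A_hat D_hat^{-1/2} F^{l-1} W^l)."""
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.diag(D_hat)))
    return np.maximum(D_inv_sqrt @ A_hat @ D_inv_sqrt @ F_prev @ W, 0.0)

# Example with assumed sizes: 4 nodes, 3 input features, 8 hidden features.
A_hat = np.eye(4) + np.array([[0, 1, 0, 0],
                              [0, 0, 1, 1],
                              [0, 0, 0, 1],
                              [0, 0, 0, 0]], dtype=float)
D_hat = np.diag(A_hat.sum(axis=1))
F0 = np.random.randn(4, 3)
W1 = np.random.randn(3, 8)
print(gcn_layer(A_hat, D_hat, F0, W1).shape)   # (4, 8)
```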

  10. Sample Dataset: Blood Brain Barrier Penetration (BBBP)
     • Binary classification
     • 2050 molecules
     • 1567 penetrate the blood brain barrier
     • 483 do not penetrate the blood brain barrier
     • Applications in drug design

  11. Sample Architecture
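
The architecture diagram on this slide is an image that is not reproduced in the transcript. The sketch below is one plausible layout under stated assumptions (two GCN layers, mean pooling over nodes, and a logistic readout for the binary BBBP label), not the author's actual model:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def normalise(A_hat):
    # D_hat^{-1/2} A_hat D_hat^{-1/2}
    D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

def gcn_classifier(A, X, W1, W2, w_out):
    """Two GCN layers, mean pooling over nodes, then a logistic readout (assumed layout)."""
    A_norm = normalise(A + np.eye(A.shape[0]))   # add self-loops, then normalise
    F1 = relu(A_norm @ X @ W1)
    F2 = relu(A_norm @ F1 @ W2)
    graph_embedding = F2.mean(axis=0)            # pool per-node features into one vector
    return sigmoid(graph_embedding @ w_out)      # probability of BBB penetration
```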

  12. Applications
     • Image classification
     • Recommender systems
     • Path planning
     • 3D point cloud segmentation and classification
     • Molecular classification

  13. Thank you!
