Deep Learning for Network Biology

Deep Learning for Network Biology, by Marinka Zitnik and Jure Leskovec (slide presentation)



  1. Deep Learning for Network Biology Marinka Zitnik and Jure Leskovec Stanford University Deep Learning for Network Biology -- snap.stanford.edu/deepnetbio-ismb -- ISMB 2018 1

  2. This Tutorial snap.stanford.edu/deepnetbio-ismb ISMB 2018, July 6, 2018, 2:00 pm - 6:00 pm

  3. This Tutorial
     1) Node embeddings
        § Map nodes to low-dimensional embeddings
        § Applications: PPIs, Disease pathways
     2) Graph neural networks
        § Deep learning approaches for graphs
        § Applications: Gene functions
     3) Heterogeneous networks
        § Embedding heterogeneous networks
        § Applications: Human tissues, Drug side effects

  4. Part 2: Graph Neural Networks Some materials adapted from:
     • Hamilton et al. 2018. Representation Learning on Networks. WWW.

  5. Embedding Nodes [Figure: a disease similarity network mapped by f to 2-dimensional node embeddings] Intuition: Map nodes to d-dimensional embeddings such that similar nodes in the graph are embedded close together

  6. Embedding Nodes Goal: Map nodes so that similarity in the embedding space (e.g., dot product) approximates similarity in the network [Figure: input network → d-dimensional embedding space]

  7. Embedding Nodes Goal: similarity(u, v) ≈ z_v^T z_u, where the network-side similarity still needs to be defined! [Figure: input network → d-dimensional embedding space]

  8. Two Key Components
     § Encoder: maps a node in the input graph to a low-dimensional, d-dimensional embedding: enc(v) = z_v
     § Similarity function: defines how relationships in the input network map to relationships in the embedding space: similarity(u, v) ≈ z_v^T z_u, i.e., the similarity of u and v in the network is approximated by the dot product between their node embeddings
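A minimal sketch of these two components in numpy (the names `Z`, `enc`, and `similarity`, and the random embedding values, are illustrative assumptions, not the tutorial's code): the encoder is a lookup into an embedding matrix with one row per node, and similarity in the embedding space is a dot product.

```python
import numpy as np

# Toy setting: a 5-node graph with d = 2 dimensional embeddings.
# In practice Z would be learned; here it is random for illustration.
rng = np.random.default_rng(0)
num_nodes, d = 5, 2
Z = rng.normal(size=(num_nodes, d))  # one embedding row per node

def enc(v):
    """Encoder: enc(v) = z_v, a lookup of node v's d-dimensional row."""
    return Z[v]

def similarity(u, v):
    """Embedding-space similarity: the dot product z_v^T z_u."""
    return float(enc(v) @ enc(u))
```

Training would adjust `Z` so that `similarity(u, v)` tracks the chosen network-side similarity; note the dot product makes the learned similarity symmetric in u and v.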

  9. So Far: Shallow Encoders Shallow encoders:
     § One layer of data transformation
     § A single hidden layer maps node v to embedding z_v via function g, e.g., z_v = g(z_w, w ∈ N(v))

  10. Shallow Encoders
     § Limitations of shallow encoding:
       § O(|V|) parameters are needed:
         § No sharing of parameters between nodes
         § Every node has its own unique embedding
       § Inherently “transductive”:
         § Cannot generate embeddings for nodes that are not seen during training
       § Do not incorporate node features:
         § Many graphs have features that we can and should leverage
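A tiny sketch of the first two limitations (the sizes and the `enc` helper are my own illustration): a shallow encoder stores one embedding row per node, so the parameter count grows as O(|V|), and a node that never appeared during training simply has no row to look up.

```python
import numpy as np

num_nodes, d = 1000, 64
Z = np.zeros((num_nodes, d))  # one learned embedding row per node

# O(|V|) parameters: the table grows linearly with the number of nodes.
assert Z.size == num_nodes * d

def enc(v):
    """Transductive lookup: fails for nodes unseen during training."""
    if v >= num_nodes:
        raise KeyError(f"no embedding learned for unseen node {v}")
    return Z[v]
```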

  11. Deep Graph Encoders § Next: We will now discuss deep methods based on graph neural networks: enc(v) = multiple layers of non-linear transformation of graph structure § Note: All these deep encoders can be combined with the similarity functions from the previous section

  12. Deep Graph Encoders …

  13. Idea: Convolutional Networks CNN on an image: The goal is to generalize convolutions beyond simple lattices and leverage node features/attributes (e.g., text, images)

  14. From Images to Networks Single CNN layer with 3x3 filter (animation by Vincent Dumoulin): Transform information at the neighbors and combine it:
     § Transform “messages” h_i from neighbors: W_i h_i
     § Add them up: ∑_i W_i h_i
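The two steps above can be sketched directly (the sizes, the list of three neighbors, and per-neighbor weights are my own toy assumptions): transform each neighbor message h_i with its weight W_i, then add the transformed messages up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three neighbor "messages" h_i and one transform W_i per neighbor.
h = [rng.normal(size=2) for _ in range(3)]
W = [rng.normal(size=(2, 2)) for _ in range(3)]

# Transform each message, then add them up: sum_i W_i h_i
combined = sum(Wi @ hi for Wi, hi in zip(W, h))
```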

  15. Real-World Graphs But what if your graphs look like this? Or this? § Examples: Biological networks, Medical networks, Social networks, Information networks, Knowledge graphs, Communication networks, Web graph, …

  16. A Naïve Approach
     § Join adjacency matrix and features
     § Feed them into a deep neural net
     § Done?

           A B C D E | Feat
         A 0 1 1 1 0 | 1 0
         B 1 0 0 1 1 | 0 0
         C 1 0 0 1 0 | 0 1
         D 1 1 1 0 1 | 1 1
         E 0 1 0 1 0 | 1 0

     § Issues with this idea:
       § O(N) parameters
       § Not applicable to graphs of different sizes
       § Not invariant to node ordering
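A sketch of the naïve approach and why it fails (the single linear layer `W` stands in for a deep net; the weights are random, purely for illustration): join each node's adjacency row with its features and feed the result through a fixed-size layer. The layer's width is tied to the number of nodes, and relabeling the nodes changes the output.

```python
import numpy as np

rng = np.random.default_rng(0)

# The slide's 5-node graph: adjacency matrix plus two binary features.
A = np.array([[0, 1, 1, 1, 0],
              [1, 0, 0, 1, 1],
              [1, 0, 0, 1, 0],
              [1, 1, 1, 0, 1],
              [0, 1, 0, 1, 0]], dtype=float)
feat = np.array([[1, 0], [0, 0], [0, 1], [1, 1], [1, 0]], dtype=float)

X = np.hstack([A, feat])          # join adjacency rows and features
W = rng.normal(size=X.shape[1])   # one linear layer on the joined input

out = X @ W                       # one output score per node

# Issue: W's size is tied to this graph's node count (O(N) parameters),
# so a graph of a different size would not even fit the input shape.
# Issue: relabeling nodes permutes rows AND columns of A, so the same
# graph generally yields different outputs (not permutation invariant):
perm = np.array([1, 0, 2, 3, 4])  # swap nodes A and B
X_perm = np.hstack([A[perm][:, perm], feat[perm]])
out_perm = X_perm @ W
```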

  17. Outline of This Section 1. Basics of deep learning for graphs 2. Graph convolutional networks 3. Biomedical applications

  18. Basics of Deep Learning for Graphs Based on material from:
     • Hamilton et al. 2017. Representation Learning on Graphs: Methods and Applications. IEEE Data Engineering Bulletin on Graph Systems.
     • Scarselli et al. 2005. The Graph Neural Network Model. IEEE Transactions on Neural Networks.
     • Kipf et al. 2017. Semi-Supervised Classification with Graph Convolutional Networks. ICLR.

  19. Setup
     § Assume we have a graph G:
       § V is the vertex set
       § A is the adjacency matrix (assume binary)
       § X ∈ ℝ^(m×|V|) is a matrix of node features
     § Biologically meaningful node features:
       – E.g., immunological signatures, gene expression profiles, gene functional information
     § No features:
       – Indicator vectors (one-hot encoding of a node)
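A concrete instance of this setup (the 4-node graph and its edge list are my own toy example): build the binary adjacency matrix A, and since no biological features are available, fall back to indicator vectors, so each column of X one-hot encodes a node.

```python
import numpy as np

# Toy undirected graph G with |V| = 4 nodes.
num_nodes = 4
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]

# Binary adjacency matrix A (symmetric, since G is undirected).
A = np.zeros((num_nodes, num_nodes), dtype=int)
for u, v in edges:
    A[u, v] = A[v, u] = 1

# No features: use indicator vectors, i.e., X is the identity matrix,
# so column v one-hot encodes node v.
X = np.eye(num_nodes)
```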

  20. Examples Protein-protein interaction networks in different tissues, e.g., blood, substantia nigra [Figure: two tissue-specific PPI networks (example proteins: RPT6, WNT1); node features: associations of proteins with midbrain development, and associations of proteins with angiogenesis]

  21. Graph Convolutional Networks Graph Convolutional Networks: [Figure: pipeline of node sequence selection → neighborhood graph construction → graph normalization → convolutional architecture] Problem: For a given subgraph, how to come up with a canonical node ordering? Learning Convolutional Neural Networks for Graphs. M. Niepert, M. Ahmed, K. Kutzkov. ICML 2016.

  22. Our Approach Idea: A node’s neighborhood defines a computation graph: determine the node’s computation graph, then propagate and transform information. Learn how to propagate information across the graph to compute node features. Semi-Supervised Classification with Graph Convolutional Networks. T. N. Kipf, M. Welling, ICLR 2017.

  23. Idea: Aggregate Neighbors Key idea: Generate node embeddings based on local network neighborhoods [Figure: a target node in the input graph and the computation graph built from its neighborhood]

  24. Idea: Aggregate Neighbors Intuition: Nodes aggregate information from their neighbors using neural networks [Figure: the same computation graph, with a neural network applied at each aggregation step]

  25. Idea: Aggregate Neighbors Intuition: Network neighborhood defines a computation graph Every node defines a computation graph based on its neighborhood!

  26. Deep Model: Many Layers
     § Model can be of arbitrary depth:
       § Nodes have embeddings at each layer
       § Layer-0 embedding of node u is its input feature, i.e., x_u
     [Figure: the computation graph with layer-0 input features x_A, …, x_F feeding layer-1 and layer-2 embeddings]

  27. Aggregation Strategies § Neighborhood aggregation: Key distinctions are in how different approaches aggregate information across the layers [Figure: the computation graph with each aggregation box left unspecified: what’s in the box!?]

  28. Neighborhood Aggregation § Basic approach: Average information from neighbors and apply a neural network: 1) average messages from neighbors, 2) apply a neural network [Figure: the computation graph with the averaging and neural-network steps labeled]
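One round of this basic approach can be sketched in a few lines (the triangle graph, the weight matrices `W` and `B`, and the ReLU choice are my own toy assumptions): average the neighbors' current embeddings, apply a learned transform plus a separate self-transform, then a nonlinearity.

```python
import numpy as np

rng = np.random.default_rng(0)

A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)  # a triangle graph
H = np.eye(3)                 # layer-0 embeddings: one-hot node features
W = rng.normal(size=(3, 2))   # transform for the averaged neighbor messages
B = rng.normal(size=(3, 2))   # transform for the node's own embedding

# 1) average messages from neighbors; 2) apply the neural network layer.
deg = A.sum(axis=1, keepdims=True)                   # neighbor counts
H_next = np.maximum(0.0, (A @ H / deg) @ W + H @ B)  # ReLU(avg + self)
```

Stacking this update K times gives each node a K-hop receptive field, matching the multi-layer computation graphs above.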
