Disentangled Graph Convolutional Networks, by Jianxin Ma, Peng Cui, Kun Kuang, Xin Wang, Wenwu Zhu (PowerPoint PPT Presentation)



SLIDE 1

Disentangled GCNs (ICML'19)

Disentangled Graph Convolutional Networks

Jianxin Ma, Peng Cui, Kun Kuang, Xin Wang, Wenwu Zhu Tsinghua University

SLIDE 2

Motivation

  • The neighborhood of a node is formed due to many latent factors.
  • Existing GCNs convolve over the neighborhood as a whole.
  • They do not distinguish between the latent factors.
  • Their node representations are thus neither robust nor easily interpretable.

[Figure: a node v with eight neighbors w1–w8, shown first as one undifferentiated neighborhood and then segmented by latent factor. Latent factor Work: w1, w2, w3. Latent factor Family: w7, w8. Latent factor Hobby: w4, w5, w6.]

SLIDE 3

Disentangled GCNs

  • Disentangled representation learning aims to identify and separate the underlying explanatory factors behind the observed data (Bengio et al., 2013).
  • We identify the latent factors, and segment the neighborhood accordingly.
  • Each segment is associated with a single factor, and is convolved separately.

[Figure: layer input is node v together with its neighbors w1–w8. Neighborhood routing extracts features specific to each factor and splits the neighborhood into factor segments; each segment is convolved separately, the per-factor results are concatenated to form the layer output, and the output is fed back to improve the neighborhood routing.]

SLIDE 4

Neighborhood Routing

  • We propose neighborhood routing, to segment a neighborhood.
  • Dynamic & differentiable. Similar to capsule networks' dynamic routing.
  • Phase I:
  • To extract factor-specific features.
    § For each node $j \in \{v\} \cup \{w : (w, v) \in E\}$ and each factor $l \in \{1, 2, \ldots, L\}$:
      $$\mathbf{z}_{j,l} = \frac{\sigma(\mathbf{W}_{l}^{\top}\mathbf{x}_{j} + \mathbf{b}_{l})}{\lVert \sigma(\mathbf{W}_{l}^{\top}\mathbf{x}_{j} + \mathbf{b}_{l}) \rVert_{2}},$$
    § which describes node $j$'s aspect $l$.
  • Phase II:
  • To infer the factor that causes the link between node $v$ and a neighbor $w$.
    § Initialize $\mathbf{c}_{l} \leftarrow \mathbf{z}_{v,l}$ for each factor $l$.
    § Iterate for $T \approx 5$ times:
      $$q_{w,l} \leftarrow \frac{\exp(\mathbf{z}_{w,l}^{\top}\mathbf{c}_{l}/\tau)}{\sum_{l'} \exp(\mathbf{z}_{w,l'}^{\top}\mathbf{c}_{l'}/\tau)},$$
      $$\mathbf{c}_{l} \leftarrow \frac{\mathbf{z}_{v,l} + \sum_{w:(w,v)\in E} q_{w,l}\,\mathbf{z}_{w,l}}{\lVert \mathbf{z}_{v,l} + \sum_{w:(w,v)\in E} q_{w,l}\,\mathbf{z}_{w,l} \rVert_{2}}.$$
    § $\mathbf{c}_{l}$ describes the neighborhood's aspect $l$.
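The two phases above can be sketched in NumPy. This is a minimal illustration, not the authors' released implementation; the helper names, the choice of ReLU for the nonlinearity σ, and the default τ = 1 are assumptions of mine.

```python
import numpy as np

def l2_normalize(x, axis=-1, eps=1e-12):
    """Scale vectors to unit length along `axis`."""
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def project_factors(X, Ws, bs):
    """Phase I: z[j, l] = sigma(W_l^T x_j + b_l), l2-normalized per factor.

    X: (n_nodes, d_in); Ws: (L, d_in, d_out); bs: (L, d_out).
    Returns Z of shape (n_nodes, L, d_out).
    """
    Z = np.stack([np.maximum(X @ Ws[l] + bs[l], 0.0)  # ReLU as sigma (assumption)
                  for l in range(len(Ws))], axis=1)
    return l2_normalize(Z)

def neighborhood_routing(Z, v, neighbors, n_iters=5, tau=1.0):
    """Phase II: infer which factor caused each link (w, v), then build c_l."""
    c = Z[v].copy()  # init: c_l <- z_{v,l}; shape (L, d_out)
    for _ in range(n_iters):
        # q[w, l] proportional to exp(z_{w,l}^T c_l / tau), softmax over factors l
        logits = np.einsum('wld,ld->wl', Z[neighbors], c) / tau
        q = np.exp(logits - logits.max(axis=1, keepdims=True))
        q /= q.sum(axis=1, keepdims=True)
        # c_l <- normalize(z_{v,l} + sum_w q[w,l] * z_{w,l})
        c = l2_normalize(Z[v] + np.einsum('wl,wld->ld', q, Z[neighbors]))
    return c  # (L, d_out); concatenating the rows gives the layer output for v
```

For one layer, `np.concatenate(neighborhood_routing(Z, v, nbrs))` then yields node v's disentangled representation, one d_out-dimensional chunk per factor.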

SLIDE 5

Intuitions & Theories

  • The two intuitions behind neighborhood routing:
  • q(factor l is the one that causes the links between node v and a segment) ∝ how many nodes in the segment are similar to v w.r.t. aspect l.
  • q(factor l is the one that causes the link between node v and a neighbor) ∝ how similar node v and the neighbor are w.r.t. aspect l.
  • Neighborhood routing is equivalent to an EM algorithm that performs inference under a von Mises-Fisher subspace clustering model.
  • It finds one large cluster in each of the L subspaces.
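The stated equivalence can be sketched as follows. This is my paraphrase, not the paper's derivation, and taking the concentration parameter as κ = 1/τ is an assumption.

```latex
% vMF density on the unit sphere, mean direction \mu, concentration \kappa:
\[
  p(\mathbf{z} \mid \boldsymbol{\mu}, \kappa)
    = C_d(\kappa)\,\exp\!\bigl(\kappa\,\boldsymbol{\mu}^{\top}\mathbf{z}\bigr).
\]
% E-step: posterior responsibility of subspace l for neighbor w,
% which matches the routing update for q_{w,l} when \kappa = 1/\tau:
\[
  q_{w,l}
    = \frac{\exp\!\bigl(\mathbf{z}_{w,l}^{\top}\mathbf{c}_{l}/\tau\bigr)}
           {\sum_{l'} \exp\!\bigl(\mathbf{z}_{w,l'}^{\top}\mathbf{c}_{l'}/\tau\bigr)}.
\]
% M-step: a vMF component's mean direction is the normalized weighted sum
% of its members, which matches the routing update for c_l:
\[
  \mathbf{c}_{l}
    = \frac{\mathbf{z}_{v,l} + \sum_{w} q_{w,l}\,\mathbf{z}_{w,l}}
           {\bigl\lVert \mathbf{z}_{v,l} + \sum_{w} q_{w,l}\,\mathbf{z}_{w,l} \bigr\rVert_{2}}.
\]
```

Each routing iteration is thus one E-step plus one M-step, and keeping z_{v,l} in the numerator anchors one cluster per subspace around the ego node v.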
SLIDE 6

Results: Multi-label Node Classification

SLIDE 7

Results: Disentangled Node Representations

  • Correlations between the 64 dimensions, on a graph with eight factors.

[Figure: two 64×64 heatmaps of the correlations between the 64 representation dimensions, with a color scale from 0.1 to 1.]