GMNN: Graph Markov Neural Networks
Meng Qu 1,2, Yoshua Bengio 1,2,4, Jian Tang 1,3,4
1 Quebec AI Institute (Mila)  2 University of Montreal  3 HEC Montreal  4 Canadian Institute for Advanced Research (CIFAR)
Semi-supervised Node Classification
• Given a graph G = (V, E, x_V)
  • V = V_L ∪ V_U: nodes (V_L labeled, V_U unlabeled)
  • E: edges
  • x_V: node features
• Given some labeled nodes V_L, we want to infer the labels of the remaining nodes V_U
• Many other tasks on graphs can be formulated as node classification
  • E.g., link classification
Related Work: Statistical Relational Learning
• Model the joint distribution of the node labels given the node features, i.e., p(y_V | x_V), with conditional random fields:

  p(y_V | x_V) = (1 / Z(x_V)) ∏_{(i,j) ∈ E} ψ_{i,j}(y_i, y_j, x_V)

• Pros
  • Capable of modeling the dependency between node labels
• Cons
  • Relies on manually defined potential functions
  • Limited model capacity
  • Difficult inference due to the complicated graph structures
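The CRF factorization above can be made concrete on a tiny graph. A minimal sketch (hypothetical potentials that simply favor neighboring nodes sharing a label; the partition function Z is computed by brute-force enumeration, which is only feasible for toy graphs):

```python
import itertools
import numpy as np

# Tiny chain graph with 3 nodes and binary labels (hypothetical example).
edges = [(0, 1), (1, 2)]
n_nodes, n_labels = 3, 2

def psi(yi, yj):
    """Pairwise potential: favor neighboring nodes with the same label."""
    return 2.0 if yi == yj else 1.0

def unnormalised(y):
    """Product of pairwise potentials over all edges."""
    return np.prod([psi(y[i], y[j]) for i, j in edges])

# Partition function Z(x_V): sum over all 2^3 label assignments.
assignments = list(itertools.product(range(n_labels), repeat=n_nodes))
Z = sum(unnormalised(y) for y in assignments)

# Joint probability of one assignment under the CRF.
p = unnormalised((0, 0, 0)) / Z
print(Z, p)  # Z = 18, p(0,0,0) = 4/18
```

Exact enumeration of Z grows exponentially in the number of nodes, which is exactly the "difficult inference" drawback listed above.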
Related Work: Graph Neural Networks
• Learn effective node representations by non-linear feature propagation
  • Graph convolutional networks (Kipf et al., 2016)
  • Graph attention networks (Veličković et al., 2017)
  • Neural message passing (Gilmer et al., 2017)
• Pros
  • Learn effective node representations
  • High model capacity through multiple non-linear graph convolutional layers
• Cons
  • Ignore the dependency between node labels
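One step of the non-linear feature propagation mentioned above can be sketched as a single GCN layer (a minimal NumPy sketch of the symmetrically normalized propagation rule from Kipf et al.; the graph, features, and weights here are hypothetical toy values):

```python
import numpy as np

def gcn_layer(adj, h, w):
    """One graph convolution: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    a_hat = adj + np.eye(adj.shape[0])            # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1)) # inverse sqrt of degrees
    a_norm = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(a_norm @ h @ w, 0.0)        # propagate, transform, ReLU

# Toy 3-node path graph with 2-D node features (hypothetical).
adj = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
h = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
rng = np.random.default_rng(0)
w = rng.normal(size=(2, 2))

out = gcn_layer(adj, h, w)
print(out.shape)  # (3, 2): one representation per node
```

Stacking several such layers mixes features from multi-hop neighborhoods, but, as noted above, the labels themselves are never propagated.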
GMNN: Graph Markov Neural Networks
• Towards combining statistical relational learning and graph neural networks
  • Learning effective node representations
  • Modeling the label dependencies of nodes
• Model the joint distribution of node labels y_V conditioned on node features x_V, i.e., p_φ(y_V | x_V)
• Can be effectively optimized with the pseudolikelihood variational EM algorithm
Two Graph Neural Networks Co-train with Each Other
• Two GNNs:
  • p_φ: learning network, modeling the label dependency by non-linear label propagation
  • q_θ: inference network, learning the node representations by non-linear feature propagation
• q_θ infers the labels of unlabeled nodes and is trained with supervision from p_φ and the labeled nodes
• p_φ is trained on a fully labeled graph, where the unlabeled nodes are labeled by q_θ
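The alternating scheme above can be sketched with simple stand-ins for the two networks (a hypothetical toy: a nearest-centroid classifier plays the role of q_θ and neighbor-label averaging plays the role of p_φ; the actual GMNN uses two trained GNNs):

```python
import numpy as np

# Toy graph: 6 nodes in two clusters, dense within clusters plus one bridge.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
n, k = 6, 2
adj = np.zeros((n, n))
for i, j in edges:
    adj[i, j] = adj[j, i] = 1.0

# Cluster-correlated 2-D node features (hypothetical data).
x = np.array([[1.0, 0.1], [0.9, 0.0], [1.1, 0.2],
              [0.0, 1.0], [0.1, 0.9], [0.2, 1.1]])
labeled = {0: 0, 5: 1}  # two labeled nodes, one per class

def q_theta(x, centroids):
    """Inference-network stand-in: classify by feature proximity to centroids."""
    d = ((x[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)
    p = np.exp(-d)
    return p / p.sum(1, keepdims=True)

def p_phi(y_soft, adj):
    """Learning-network stand-in: propagate neighbor label beliefs."""
    agg = adj @ y_soft
    return agg / np.maximum(agg.sum(1, keepdims=True), 1e-9)

# Variational-EM-style loop: the two models label the graph for each other.
y = np.full((n, k), 1.0 / k)
for i, c in labeled.items():
    y[i] = np.eye(k)[c]
for _ in range(5):
    # M-step analogue: refit q_theta's "parameters" (class centroids) on the
    # graph pseudo-labeled by the current beliefs.
    centroids = (y.T @ x) / y.sum(0)[:, None]
    # E-step analogue: update beliefs, mixing q_theta's feature-based
    # predictions with p_phi's label propagation.
    y = 0.5 * q_theta(x, centroids) + 0.5 * p_phi(y, adj)
    for i, c in labeled.items():  # clamp the observed labels
        y[i] = np.eye(k)[c]

pred = y.argmax(1)
print(pred)  # nodes 0-2 assigned class 0, nodes 3-5 class 1
```

The key structural point survives the simplification: one model predicts from features, the other from neighboring labels, and each supplies training targets for the other on the unlabeled nodes.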
Experimental Results
• State-of-the-art performance in multiple tasks

Table: Semi-supervised Node Classification

| Category | Algorithm         | Cora | Citeseer | Pubmed |
| SSL      | LP                | 74.2 | 56.3     | 71.6   |
| SRL      | PRM               | 77.0 | 63.4     | 68.3   |
| SRL      | RMN               | 71.3 | 68.0     | 70.7   |
| SRL      | MLN               | 74.6 | 68.0     | 75.3   |
| GNN      | Planetoid*        | 75.7 | 64.7     | 77.2   |
| GNN      | GCN*              | 81.5 | 70.3     | 79.0   |
| GNN      | GAT*              | 83.0 | 72.5     | 79.0   |
| GMNN     | W/o Attr. in p_φ  | 83.4 | 73.1     | 81.4   |
| GMNN     | With Attr. in p_φ | 83.7 | 72.9     | 81.8   |

Table: Link Classification

| Category | Algorithm         | Bitcoin Alpha | Bitcoin OTC |
| SSL      | LP                | 59.68         | 65.58       |
| SRL      | PRM               | 58.59         | 64.37       |
| SRL      | RMN               | 59.56         | 65.59       |
| SRL      | MLN               | 60.87         | 65.62       |
| GNN      | DeepWalk          | 62.71         | 63.20       |
| GNN      | GCN               | 64.00         | 65.69       |
| GMNN     | W/o Attr. in p_φ  | 65.59         | 66.62       |
| GMNN     | With Attr. in p_φ | 65.86         | 66.83       |

Table: Unsupervised Node Representation Learning

| Category | Algorithm        | Cora | Citeseer | Pubmed |
| GNN      | DeepWalk*        | 67.2 | 43.2     | 65.3   |
| GNN      | DGI*             | 82.3 | 71.8     | 76.8   |
| GMNN     | With only q_θ    | 78.1 | 68.0     | 79.3   |
| GMNN     | With q_θ and p_φ | 82.8 | 71.5     | 81.6   |

Code available at: https://github.com/DeepGraphLearning/GMNN
Come to our poster at #7, Jun 11th, 06:30-09:00 PM @ Pacific Ballroom