This document summarizes a research paper that introduces Hyperbolic Graph Convolutional Networks (HGCNs) to address limitations of previous Euclidean graph neural networks. HGCNs map node features to hyperbolic spaces and use a novel attention-based aggregation scheme to capture hierarchical structure. The paper presents HGCNs, evaluates them on citation networks, disease propagation trees, protein networks and flight networks, and finds they outperform Euclidean baselines for link prediction and node classification by learning more interpretable hierarchical representations.
NS-CUK Seminar: J.H.Lee, Review on "Hyperbolic graph convolutional neural networks," Advances in Neural Information Processing Systems, 2019
1. Introduction
Limitations of previous studies
• Input node features are usually Euclidean, and it is not clear how to optimally use them as inputs to hyperbolic neural networks
• It is not clear how to perform set aggregation, a key step in message passing, in hyperbolic space
• One needs to choose hyperbolic spaces with the right curvature at every layer of the GCN
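The first two limitations hinge on moving between Euclidean inputs and hyperbolic representations. A minimal sketch of the exponential and logarithmic maps at the origin, which do exactly this lifting and projecting, is shown below for the Poincaré ball (an illustrative stand-in; the paper itself works in the hyperboloid model):

```python
import numpy as np

def expmap0(v, c=1.0, eps=1e-7):
    """Exponential map at the origin of the Poincare ball with curvature -c.

    Lifts a Euclidean (tangent-space) vector v onto the ball, which is one
    common way to feed Euclidean input features to a hyperbolic layer.
    """
    sqrt_c = np.sqrt(c)
    norm = max(np.linalg.norm(v), eps)            # avoid division by zero
    return np.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

def logmap0(x, c=1.0, eps=1e-7):
    """Logarithmic map at the origin: the inverse of expmap0."""
    sqrt_c = np.sqrt(c)
    norm = max(np.linalg.norm(x), eps)
    return np.arctanh(min(sqrt_c * norm, 1 - eps)) * x / (sqrt_c * norm)

# Lift a Euclidean feature onto the ball, then recover it.
x = expmap0(np.array([0.3, 0.4]))
v = logmap0(x)
```

The round trip `logmap0(expmap0(v))` recovers `v`, and the mapped point always stays inside the unit ball, which is what makes this a safe bridge between the two geometries.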
Contributions
• Improved performance on graph-based tasks
→ Hyperbolic space is better suited for modeling hierarchical structures that are common in many real-world graphs
• Interpretability
→ HGCNs can learn hierarchical representations of graph-structured data that are more interpretable than those learned by Euclidean GCNs
• Novelty
→ The paper introduces a hyperbolic attention-based aggregation scheme that captures the hierarchical structure of networks
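To make the aggregation idea concrete, here is a simplified sketch of one standard recipe: map neighbors to the tangent space at the origin, take an attention-weighted mean there, and map the result back. The attention scores below are just negative tangent-space distances, a hypothetical stand-in for the learned attention function in the paper, and the Poincaré-ball origin maps are used for illustration:

```python
import numpy as np

def expmap0(v, c=1.0, eps=1e-7):
    """Exponential map at the origin of the Poincare ball with curvature -c."""
    sqrt_c = np.sqrt(c)
    norm = max(np.linalg.norm(v), eps)
    return np.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

def logmap0(x, c=1.0, eps=1e-7):
    """Logarithmic map at the origin: inverse of expmap0."""
    sqrt_c = np.sqrt(c)
    norm = max(np.linalg.norm(x), eps)
    return np.arctanh(min(sqrt_c * norm, 1 - eps)) * x / (sqrt_c * norm)

def hyp_attention_aggregate(x_center, x_neighbors, c=1.0):
    """Simplified attention-based aggregation in hyperbolic space.

    Scores here are negative distances in the tangent space at the origin,
    standing in for the paper's learned attention; the weighted mean is taken
    in the tangent space and mapped back onto the ball.
    """
    v_c = logmap0(x_center, c)
    vs = np.stack([logmap0(p, c) for p in x_neighbors])
    scores = -np.linalg.norm(vs - v_c, axis=1)    # stand-in attention scores
    w = np.exp(scores - scores.max())
    w /= w.sum()                                  # softmax weights
    agg = (w[:, None] * vs).sum(axis=0)           # weighted mean in tangent space
    return expmap0(agg, c)                        # back to hyperbolic space

center = expmap0(np.array([0.1, 0.0]))
neighbors = [expmap0(np.array([0.2, 0.0])), expmap0(np.array([0.0, 0.3]))]
agg = hyp_attention_aggregate(center, neighbors)
```

Because the mean is taken in the tangent space and then mapped back, the aggregated point is guaranteed to stay inside the ball, unlike a naive Euclidean average of hyperbolic coordinates.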
3. Experiment
Visualization (DISEASE-M dataset)
• In HGCN, the center node pays more attention to its (grand)parent.
• In contrast to Euclidean GAT, aggregation with attention in hyperbolic space allows the model to pay more attention to nodes higher in the hierarchy
→ Such attention is crucial to good performance on the disease datasets, because only sick parents propagate the disease to their children
4. Conclusions
• HGCN is a novel architecture that learns hyperbolic embeddings using graph convolution networks.
• In HGCN, the Euclidean input features are successively mapped to embeddings in hyperbolic spaces with trainable curvatures at every layer
• HGCN achieves a new state of the art in learning embeddings for real-world hierarchical and scale-free graphs
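The per-layer trainable curvature above implies that embeddings must be carried between hyperbolic spaces of different curvature from one layer to the next. One standard way to do this is via the shared tangent space at the origin; a minimal sketch (Poincaré-ball maps used for illustration, whereas the paper works in the hyperboloid model):

```python
import numpy as np

def expmap0(v, c=1.0, eps=1e-7):
    """Exponential map at the origin of the Poincare ball with curvature -c."""
    sqrt_c = np.sqrt(c)
    norm = max(np.linalg.norm(v), eps)
    return np.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

def logmap0(x, c=1.0, eps=1e-7):
    """Logarithmic map at the origin: inverse of expmap0."""
    sqrt_c = np.sqrt(c)
    norm = max(np.linalg.norm(x), eps)
    return np.arctanh(min(sqrt_c * norm, 1 - eps)) * x / (sqrt_c * norm)

def change_curvature(x, c_in, c_out):
    """Move a point from the ball of curvature -c_in to the ball of
    curvature -c_out by passing through the tangent space at the origin."""
    return expmap0(logmap0(x, c_in), c_out)

x = expmap0(np.array([0.3, 0.4]), c=1.0)     # point in the layer-l space
y = change_curvature(x, c_in=1.0, c_out=0.5)  # same point in the layer-(l+1) space
```

Since the tangent space at the origin is shared, `change_curvature` is invertible: mapping back with the curvatures swapped recovers the original point.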