Poincaré Embeddings for Learning Hierarchical Representations

NeurIPS 2017  ·  Maximilian Nickel, Douwe Kiela

Representation learning has become an invaluable approach for learning from symbolic data such as text and graphs. However, while complex symbolic datasets often exhibit a latent hierarchical structure, state-of-the-art methods typically learn embeddings in Euclidean vector spaces, which do not account for this property. To address this, we introduce a new approach for learning hierarchical representations of symbolic data by embedding them into hyperbolic space, or more precisely into an n-dimensional Poincaré ball. Due to the underlying hyperbolic geometry, this allows us to learn parsimonious representations of symbolic data by simultaneously capturing hierarchy and similarity. We introduce an efficient algorithm to learn the embeddings based on Riemannian optimization and show experimentally that Poincaré embeddings significantly outperform Euclidean embeddings on data with latent hierarchies, both in terms of representation capacity and in terms of generalization ability.
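The two core ingredients of the method are the Poincaré distance between points inside the unit ball and a Riemannian stochastic gradient step that rescales the Euclidean gradient by the inverse metric tensor of the ball. A minimal NumPy sketch of both (function names and the learning rate are illustrative, not from the paper's code):

```python
import numpy as np

def poincare_distance(u, v):
    """Hyperbolic distance between two points u, v inside the open unit ball:
    d(u, v) = arcosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))."""
    sq_dist = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq_dist / denom)

def rsgd_step(theta, euclidean_grad, lr=0.01, eps=1e-5):
    """One Riemannian SGD step on the Poincare ball: scale the Euclidean
    gradient by (1 - ||theta||^2)^2 / 4 (the inverse metric tensor),
    then project the result back inside the ball if it escaped."""
    scale = (1.0 - np.sum(theta ** 2)) ** 2 / 4.0
    theta = theta - lr * scale * euclidean_grad
    norm = np.linalg.norm(theta)
    if norm >= 1.0:
        theta = theta / norm * (1.0 - eps)  # retract onto the open ball
    return theta
```

Note how the `(1 - ||theta||^2)^2` factor shrinks updates near the boundary of the ball, which is exactly where hyperbolic distances grow fastest; this is what lets leaf nodes of a hierarchy sit near the boundary while the root stays near the origin.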


Results from the Paper


Task                Dataset    Model                           Metric                 Value   Global Rank
Lexical Entailment  HyperLex   Poincaré Embeddings             Spearman Correlation   0.512   # 3
Link Prediction     WordNet    Poincaré Embeddings (dim=10)    Accuracy               68.3    # 5
Link Prediction     WordNet    Poincaré Embeddings (dim=20)    Accuracy               74.3    # 4
Link Prediction     WordNet    Poincaré Embeddings (dim=50)    Accuracy               77.0    # 3
Link Prediction     WordNet    Poincaré Embeddings (dim=100)   Accuracy               77.4    # 2
