Poster in Workshop: Mathematics of Modern Machine Learning (M3L)

Curvature-Dimension Tradeoff for Generalization in Hyperbolic Space

Nicolás Alvarado · Hans Lobel · Mircea Petrache


Abstract: The inclusion of task-relevant geometric embeddings in deep learning models is an important emerging direction of research, particularly for hierarchical data. For instance, negatively curved geometries such as hyperbolic spaces are known to allow low-distortion embeddings of tree-like hierarchical structures, which Euclidean spaces do not afford. Learning techniques for hyperbolic spaces, such as Hyperbolic Neural Networks (HNNs), have shown empirical accuracy improvements over classical Deep Neural Networks in tasks involving semantic or multi-scale information, such as recommender systems or molecular generation. However, no research has investigated generalization properties specific to such geometries. In this work, we introduce the first generalization bounds for learning tasks in hyperbolic spaces. We highlight a previously unnoticed and important difference from Euclidean embedding models: under embeddings into spaces of negative curvature $-\kappa<0$ and dimension $d$, only the product $\sqrt{\kappa}\,d$ influences the generalization bounds. Hence, varying the curvature of the space at fixed $d$ has the same effect on generalization as varying $d$.
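
To make the stated tradeoff concrete (a worked example under an assumption: the abstract asserts only that the bound depends on $\kappa$ and $d$ through the product $\sqrt{\kappa}\,d$, leaving the exact functional form open), two hyperbolic spaces with parameters $(\kappa_1, d_1)$ and $(\kappa_2, d_2)$ would then admit the same generalization bound whenever $\sqrt{\kappa_1}\,d_1 = \sqrt{\kappa_2}\,d_2$. For instance, quadrupling the curvature magnitude while halving the dimension leaves the product unchanged, since $\sqrt{4\kappa}\cdot(d/2) = \sqrt{\kappa}\,d$.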