When Curvature Beats Dimension: Euclidean Limits and Hyperbolic Design Rules for Trees
Sarwesh Rauniyar
Abstract
When embedding hierarchical graph data (e.g., trees), practitioners face a fundamental choice: increase Euclidean dimension or use low-dimensional hyperbolic spaces. We provide a deployable decision rule, backed by rigorous theory and designed to integrate into graph-learning pipelines, that determines which geometry to use based on tree structure and desired distortion tolerance. For balanced $b$-ary trees of height $h$ with heterogeneous edge weights, we prove that any embedding into fixed $d$-dimensional Euclidean space must incur distortion scaling as $(b^{h/2})^{1/d}$, with the dependence on weight heterogeneity being tight. Under random edge perturbations, we provide high-probability refinements that improve the constants while preserving the fundamental scaling. On the hyperbolic side, we present an explicit constant-distortion construction in the hyperbolic plane with concrete curvature and radius requirements, demonstrating how negative curvature can substitute for additional Euclidean dimensions. These results yield a simple decision rule: input basic tree statistics (branching factor, height, weight spread) and a target distortion, and receive either (i) the minimum Euclidean dimension needed, or (ii) feasible hyperbolic parameters achieving the target within budget. The rule provides theoretically grounded, noise-robust guidance for geometry selection in GNN and graph-embedding modules, clarifying the fundamental trade-off between Euclidean dimension and hyperbolic curvature for hierarchical data representation. While our theory focuses on balanced trees, experiments demonstrate practical applicability to moderately unbalanced structures.
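The dimension side of the decision rule stated in the abstract can be sketched directly from the lower bound: if distortion scales as $(b^{h/2})^{1/d}$, then meeting a target distortion $D$ requires $d \ge \frac{h}{2}\,\frac{\ln b}{\ln D}$. The helper below is a minimal, hypothetical sketch of that calculation; it tracks only the asymptotic scaling and omits the constants and weight-heterogeneity terms from the paper's theorem, which are not given in the abstract.

```python
import math

def min_euclidean_dimension(b: int, h: int, target_distortion: float) -> int:
    """Smallest d for which the scaling (b**(h/2))**(1/d) stays within
    the target distortion D, i.e. d >= (h/2) * ln(b) / ln(D).

    Hypothetical sketch based only on the abstract's scaling law;
    the paper's actual rule also accounts for edge-weight spread
    and multiplicative constants, which are omitted here.
    """
    if b < 2 or h < 1:
        raise ValueError("expected branching factor b >= 2 and height h >= 1")
    if target_distortion <= 1.0:
        raise ValueError("target distortion must exceed 1")
    # Solve (b^{h/2})^{1/d} <= D for the integer dimension d.
    return math.ceil((h / 2) * math.log(b) / math.log(target_distortion))
```

For example, a binary tree ($b=2$) of height $h=10$ with a distortion budget of $D=2$ would need dimension at least $(10/2)\cdot\ln 2/\ln 2 = 5$ under this scaling, illustrating how quickly the required Euclidean dimension grows with tree height when the distortion budget is small.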