Blog
Oct 28, 2025
Understanding Training Stability in Hyperbolic Neural Networks
This study addresses a key source of instability in hyperbolic deep learning: learning the curvature of the manifold itself. The authors point out a fundamental weakness of naive approaches: when the curvature parameter is updated independently of the model parameters, the parameters no longer lie on the updated manifold, which invalidates subsequent Riemannian gradients and projections and degrades performance. They address this with an ordered projection scheme: first update the curvature, then map the model parameters through a stable tangent space, and finally re-project them onto the new manifold.
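To make the idea concrete, here is a minimal sketch of such a re-projection for the Poincaré ball model. This is an illustration of the general tangent-space trick, not the paper's actual implementation: a point is lifted to the tangent space at the origin via the logarithmic map under the old curvature, then mapped back onto the ball via the exponential map under the new curvature. The function names (`log0`, `exp0`, `reproject`) are ours.

```python
import numpy as np

def log0(x, c):
    # Logarithmic map at the origin of the Poincare ball with curvature -c:
    # lifts a point on the ball into the (Euclidean) tangent space.
    norm = np.linalg.norm(x)
    if norm == 0:
        return x
    return np.arctanh(np.sqrt(c) * norm) * x / (np.sqrt(c) * norm)

def exp0(v, c):
    # Exponential map at the origin: maps a tangent vector back onto
    # the Poincare ball with curvature -c.
    norm = np.linalg.norm(v)
    if norm == 0:
        return v
    return np.tanh(np.sqrt(c) * norm) * v / (np.sqrt(c) * norm)

def reproject(x, c_old, c_new):
    # After the curvature update c_old -> c_new, lift the parameter
    # through the tangent space and place it on the new manifold,
    # so later Riemannian operations remain valid.
    return exp0(log0(x, c_old), c_new)

x = np.array([0.5, 0.0])          # a point inside the unit ball (c_old = 1)
x_new = reproject(x, 1.0, 2.0)    # now lies inside the radius-1/sqrt(2) ball
```

Note that `reproject(x, c, c)` is the identity, and the output always lies strictly inside the ball of radius `1/sqrt(c_new)`, which is exactly the invariant a naive curvature update breaks.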
Source: HackerNoon