[Submitted on 4 Nov 2025]
SignCurv: Combining Sign-Based Updates with Adaptive Curvature for Transformer Optimization
Abstract: We present SignCurv, a novel optimizer combining sign-based gradient updates with lightweight curvature adaptation for transformer language models. Our method addresses key limitations in existing optimizers by (1) using sign-based updates for stable optimization across different parameter scales, (2) incorporating adaptive curvature information through diagonal Hessian approximations, and (3) implementing architecture-aware learning rate scheduling. Experiments on the FineWeb dataset demonstrate that SignCurv achieves competitive performance (validation loss 4.018) while maintaining training stability. Compared to AdamW (loss 4.927), our method shows an 18.4\% relative improvement, though it does not surpass state-of-the-art methods like Muon (3.537).
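To make the combination of components (1) and (2) concrete, here is a minimal sketch of what a SignCurv-style step could look like: the update direction is the sign of the gradient, scaled by an exponential moving average of squared gradients standing in for the diagonal Hessian approximation. This is an illustrative assumption, not the paper's actual update rule; the function name `signcurv_step` and the hyperparameters `beta` and `eps` are hypothetical.

```python
import math

def signcurv_step(params, grads, curv, lr=1e-3, beta=0.999, eps=1e-8):
    """One hypothetical SignCurv-style step (sketch, not the paper's rule).

    params, grads, curv are parallel lists of floats. `curv` holds a
    running EMA of squared gradients, used here as a cheap proxy for
    the diagonal of the Hessian.
    """
    for i, g in enumerate(grads):
        # Update the running diagonal curvature estimate.
        curv[i] = beta * curv[i] + (1.0 - beta) * g * g
        # Sign-based direction (scale-invariant in the gradient magnitude),
        # damped where the curvature estimate is large.
        params[i] -= lr * math.copysign(1.0, g) / (math.sqrt(curv[i]) + eps)
    return params, curv
```

The sign operation makes the raw step size independent of gradient magnitude across parameter scales, while the curvature term shrinks steps along directions the running estimate marks as sharp, which is the stability/adaptivity trade-off the abstract describes.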
Submission history
[v1] Tue, 4 Nov 2025 11:49 UTC