[Submitted on 1 Nov 2025]
SpectralOrthoAdam: An Exploration of Orthogonal Updates in Transformer Optimization
Abstract: This paper investigates the potential of combining adaptive momentum optimization with spectral normalization for transformer language models. We present SpectralOrthoAdam, an optimizer that incorporates layer-specific processing, scheduled momentum, and orthogonal updates for attention weights. While theoretically motivated to improve training stability and performance, empirical results on the FineWeb dataset with a 134M parameter model show the method underperforms the AdamW baseline (validation loss 5.267 vs 4.927). We analyze the reasons for this underperformance and discuss implications for future work in geometric optimization for transformers.
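To make the abstract's notion of "orthogonal updates" concrete, the following is a minimal sketch, not the paper's implementation: it assumes the attention-weight update direction is orthogonalized with a cubic Newton-Schulz iteration after a standard Adam moment update. All names (newton_schulz_orthogonalize, adam_ortho_step) and hyperparameters here are hypothetical.

import numpy as np

def newton_schulz_orthogonalize(g, steps=5, eps=1e-7):
    # Approximate the orthogonal polar factor of a 2D update matrix g
    # via cubic Newton-Schulz: X <- 1.5*X - 0.5*X X^T X.
    # Frobenius normalization keeps the spectral norm below sqrt(3),
    # which the iteration needs to converge.
    x = g / (np.linalg.norm(g) + eps)
    for _ in range(steps):
        x = 1.5 * x - 0.5 * x @ x.T @ x
    return x

def adam_ortho_step(w, g, m, v, t, lr=3e-4, b1=0.9, b2=0.999, eps=1e-8):
    # One Adam step whose update direction is orthogonalized.
    m = b1 * m + (1 - b1) * g                 # first-moment estimate
    v = b2 * v + (1 - b2) * g * g             # second-moment estimate
    m_hat = m / (1 - b1 ** t)                 # bias correction
    v_hat = v / (1 - b2 ** t)
    update = m_hat / (np.sqrt(v_hat) + eps)   # standard Adam direction
    update = newton_schulz_orthogonalize(update)
    return w - lr * update, m, v

Per the abstract, such an orthogonalization step would apply only to attention weight matrices, with the remaining parameters presumably taking plain Adam(W) steps.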
Submission history
[v1] Sat, 1 Nov 2025 13:56 UTC