[Submitted on 3 Nov 2025]
Adaptive Second Moment Optimization: Memory-Efficient Training of Transformers
Abstract: We present Adaptive Second Moment Optimization (ASMO), a memory-efficient optimizer for transformer language models that maintains competitive performance while reducing memory overhead. ASMO combines compressed second moment storage with parameter-specific adaptation policies, achieving a 20% memory reduction compared to AdamW while maintaining comparable convergence. Our experiments on the FineWeb benchmark demonstrate the practical viability of this approach, with ASMO achieving a final validation loss of 3.923 compared to AdamW's 4.927. The method builds on established techniques while introducing novel adaptations for modern transformer architectures.
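The abstract does not specify how the second moment is compressed. As one plausible reading, the sketch below assumes an Adafactor-style rank-1 (row/column) factorization of the second moment for 2-D weight matrices, with a full second moment kept for 1-D parameters; the class name CompressedAdam and all hyperparameter names are illustrative, not the authors' API.

    # Minimal sketch of a compressed-second-moment Adam variant, in the
    # spirit of ASMO as described above. ASSUMPTION: second moments of 2-D
    # weights are approximated by a rank-1 outer product of per-row and
    # per-column statistics (Adafactor-style); this is not confirmed by
    # the abstract.
    import torch

    class CompressedAdam(torch.optim.Optimizer):
        def __init__(self, params, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
            super().__init__(params, dict(lr=lr, beta1=beta1, beta2=beta2, eps=eps))

        @torch.no_grad()
        def step(self):
            for group in self.param_groups:
                lr, b1, b2, eps = group["lr"], group["beta1"], group["beta2"], group["eps"]
                for p in group["params"]:
                    if p.grad is None:
                        continue
                    g = p.grad
                    state = self.state[p]
                    if not state:
                        state["step"] = 0
                        state["m"] = torch.zeros_like(p)  # first moment, kept in full
                        if p.dim() == 2:
                            # Factored second moment: one vector per row and one
                            # per column instead of a full matrix -- the source
                            # of the memory saving.
                            state["v_row"] = torch.zeros(p.shape[0], device=p.device)
                            state["v_col"] = torch.zeros(p.shape[1], device=p.device)
                        else:
                            state["v"] = torch.zeros_like(p)
                    state["step"] += 1
                    t = state["step"]
                    m = state["m"].mul_(b1).add_(g, alpha=1 - b1)
                    g2 = g * g
                    if p.dim() == 2:
                        v_row = state["v_row"].mul_(b2).add_(g2.mean(dim=1), alpha=1 - b2)
                        v_col = state["v_col"].mul_(b2).add_(g2.mean(dim=0), alpha=1 - b2)
                        # Reconstruct the second-moment estimate as a normalized
                        # rank-1 outer product of the row and column statistics.
                        v = torch.outer(v_row, v_col) / v_row.mean().clamp(min=eps)
                    else:
                        v = state["v"].mul_(b2).add_(g2, alpha=1 - b2)
                    # Bias-corrected Adam-style update.
                    m_hat = m / (1 - b1 ** t)
                    v_hat = v / (1 - b2 ** t)
                    p.addcdiv_(m_hat, v_hat.sqrt().add_(eps), value=-lr)

Under this factorization, an n-by-m weight matrix needs O(n + m) second-moment storage instead of O(nm); whether that accounts for the 20% total-optimizer-state saving reported above depends on details (e.g., how 1-D parameters and the first moment are handled) that the abstract does not give.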
Submission history
[v1] Mon, 3 Nov 2025 09:10 UTC