[Submitted on 29 Oct 2025]
Layer-Adaptive Momentum Optimization: A Comprehensive Analysis of Performance and Limitations
Abstract: We present a rigorous empirical study of Layer-Adaptive Momentum Optimization (LAMO) for transformer language models. LAMO reaches a validation loss of 5.862, underperforming AdamW's 4.927. Through detailed ablation studies and comparison against 10 optimization approaches from the recent literature, we identify key limitations of layer-wise momentum adaptation and provide actionable insights for future research on adaptive optimization.
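The abstract does not specify LAMO's update rule. As a minimal sketch of what "layer-wise momentum adaptation" could mean in its simplest form, the hypothetical snippet below gives each layer its own momentum coefficient in an otherwise standard SGD-with-momentum update; the function name, the fixed per-layer `betas`, and the toy data are illustrative assumptions, not the paper's method.

```python
import numpy as np

def layerwise_momentum_step(params, grads, bufs, betas, lr=0.01):
    """One update where each layer l uses its own momentum coefficient beta_l.

    Hypothetical illustration only: the paper's actual adaptation rule
    is not given in the abstract.
    """
    new_params, new_bufs = [], []
    for p, g, m, beta in zip(params, grads, bufs, betas):
        m = beta * m + g                  # per-layer momentum buffer
        new_params.append(p - lr * m)     # momentum-SGD update
        new_bufs.append(m)
    return new_params, new_bufs

# Toy example: two "layers" with different momentum coefficients.
params = [np.ones(3), np.ones(2)]
grads = [np.full(3, 0.5), np.full(2, 0.5)]
bufs = [np.zeros(3), np.zeros(2)]
params, bufs = layerwise_momentum_step(params, grads, bufs, betas=[0.9, 0.5])
```

A real optimizer of this kind would presumably adapt each `beta_l` during training (e.g., from per-layer gradient statistics) rather than fixing it, which is where the paper's reported limitations would arise.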
Submission history
[v1] Wed, 29 Oct 2025 17:31 UTC