AdaPlus: Integrating Momentum and Precise Stepsize Adjustment on AdamW Basis
- DOI: 10.60864/65w9-sx36
- Submitted by: Lei Guan
- Last updated: 3 April 2024 - 11:47pm
- Document Type: Presentation Slides
- Document Year: 2024
- Presenters: Lei Guan
- Paper Code: MLSP-L22.5
This paper proposes an efficient optimizer called AdaPlus, which integrates Nesterov momentum and precise stepsize adjustment on the basis of AdamW. AdaPlus combines the advantages of AdamW, Nadam, and AdaBelief and, in particular, introduces no extra hyper-parameters. We perform extensive experimental evaluations on three machine learning tasks to validate the effectiveness of AdaPlus.
The experimental results show that AdaPlus (i) performs the most comparably with (and even slightly better than) SGD with momentum among all the evaluated adaptive methods on image classification tasks, and (ii) outperforms other state-of-the-art optimizers on language modeling tasks while exhibiting high stability when training GANs. The experiment code of AdaPlus will be accessible at: https://github.com/guanleics/AdaPlus.
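For intuition, the sketch below illustrates how the three ingredients named in the abstract can be combined in a single update step. It is a minimal, hypothetical NumPy sketch, not the authors' implementation: it assumes a NAdam-style Nesterov look-ahead on the first moment, an AdaBelief-style second moment that tracks the deviation of the gradient from its momentum estimate, and AdamW-style decoupled weight decay, using only the standard AdamW hyper-parameters. The function name `adaplus_step` and all default values are illustrative; consult the paper and the repository above for the exact algorithm.

```python
import numpy as np

def adaplus_step(theta, grad, m, s, t, lr=1e-3, beta1=0.9, beta2=0.999,
                 eps=1e-8, weight_decay=1e-2):
    """One hypothetical AdaPlus-style update (a sketch, not the official code).

    Combines the three ingredients the abstract names:
      * Nesterov momentum (NAdam-style look-ahead on the first moment),
      * AdaBelief-style second moment, tracking (grad - m)^2 instead of grad^2,
      * AdamW-style decoupled weight decay.
    """
    # First moment (momentum estimate of the gradient).
    m = beta1 * m + (1 - beta1) * grad
    # AdaBelief-style second moment: variance of the gradient around its
    # momentum prediction, giving a more "precise" per-coordinate stepsize.
    s = beta2 * s + (1 - beta2) * (grad - m) ** 2
    # Standard bias corrections (t is the 1-based step counter).
    m_hat = m / (1 - beta1 ** t)
    s_hat = s / (1 - beta2 ** t)
    # NAdam-style Nesterov look-ahead on the bias-corrected first moment.
    m_nes = beta1 * m_hat + (1 - beta1) * grad / (1 - beta1 ** t)
    # Parameter update: adaptive step plus decoupled (AdamW) weight decay.
    theta = theta - lr * (m_nes / (np.sqrt(s_hat) + eps) + weight_decay * theta)
    return theta, m, s

# Toy usage: minimize f(theta) = ||theta||^2 / 2, whose gradient is theta.
theta = np.array([1.0, -2.0, 3.0])
m = np.zeros_like(theta)
s = np.zeros_like(theta)
for t in range(1, 201):
    grad = theta  # gradient of the toy quadratic objective
    theta, m, s = adaplus_step(theta, grad, m, s, t, lr=1e-1)
print(theta)  # should be close to the minimizer at the origin
```

Note that the sketch stays within the AdamW hyper-parameter set (lr, beta1, beta2, eps, weight decay), consistent with the abstract's claim that AdaPlus introduces no extra hyper-parameters.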