AdaPlus: Integrating Momentum and Precise Stepsize Adjustment on AdamW Basis
This paper proposes an efficient optimizer, AdaPlus, which integrates Nesterov momentum and precise stepsize adjustment on top of AdamW. AdaPlus combines the advantages of AdamW, Nadam, and AdaBelief and, in particular, introduces no extra hyper-parameters. We perform extensive experimental evaluations on three machine learning tasks to validate the effectiveness of AdaPlus.
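To make the combination concrete, the sketch below shows a minimal, illustrative NumPy update that layers the three ingredients named in the abstract: AdamW-style decoupled weight decay, Nadam-style Nesterov momentum, and an AdaBelief-style second moment computed on the deviation of the gradient from its running mean. The function name, coefficients, and exact ordering of the steps are assumptions made for illustration, not the paper's published algorithm.

```python
import numpy as np

def adaplus_like_step(param, grad, m, s, t,
                      lr=1e-3, beta1=0.9, beta2=0.999,
                      eps=1e-8, weight_decay=1e-2):
    """One update of an AdaPlus-style optimizer (illustrative sketch only).

    Combines:
      - decoupled weight decay, as in AdamW,
      - Nesterov momentum, as in Nadam,
      - an AdaBelief-style second moment of (grad - m).
    Exact coefficients and ordering may differ from the paper.
    """
    # First moment: running mean of gradients.
    m = beta1 * m + (1 - beta1) * grad
    # AdaBelief-style second moment: variance of the gradient around m.
    s = beta2 * s + (1 - beta2) * (grad - m) ** 2 + eps
    # Bias corrections.
    m_hat = m / (1 - beta1 ** t)
    s_hat = s / (1 - beta2 ** t)
    # Nesterov-style look-ahead: mix the bias-corrected mean with the
    # current gradient, in the spirit of Nadam.
    m_nesterov = beta1 * m_hat + (1 - beta1) * grad / (1 - beta1 ** t)
    # Decoupled weight decay (AdamW), then the adaptive step.
    param = param - lr * weight_decay * param
    param = param - lr * m_nesterov / (np.sqrt(s_hat) + eps)
    return param, m, s

# Toy usage: minimize f(w) = ||w||^2 with the sketch above.
w, m, s = np.ones(3), np.zeros(3), np.zeros(3)
for t in range(1, 101):
    grad = 2 * w  # gradient of ||w||^2
    w, m, s = adaplus_like_step(w, m, s, t)
```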