.. _optimizers:

==========
Optimizers
==========

The optimizer can be changed by adding an ``OPTIMIZER`` section to the config. The default optimizer is ``Adam``.

.. contents:: Catalog
    :depth: 1
    :local:

----
Adam
----

.. code-block:: yaml

    OPTIMIZER:
      NAME: "Adam"
      LR: 0.001
      WEIGHT_DECAY: 1.0E-5
      EPS: 1.0E-8

Reference
------------

`Adam: A Method for Stochastic Optimization <https://arxiv.org/abs/1412.6980>`_

-----
AdamW
-----

.. code-block:: yaml

    OPTIMIZER:
      NAME: "AdamW"
      LR: 0.001
      WEIGHT_DECAY: 1.0E-5
      EPS: 1.0E-8

Reference
------------

`Decoupled Weight Decay Regularization <https://arxiv.org/abs/1711.05101>`_

---
SAM
---

.. code-block:: yaml

    OPTIMIZER:
      NAME: "SAM"
      LR: 0.001
      RHO: 0.05
      BASE_OPTIMIZER_NAME: "Adam"
      WEIGHT_DECAY: 1.0E-5
      EPS: 1.0E-8

Reference
------------

`Sharpness-Aware Minimization for Efficiently Improving Generalization <https://arxiv.org/abs/2010.01412>`_

---
SGD
---

.. code-block:: yaml

    OPTIMIZER:
      NAME: "SGD"
      LR: 0.001
      MOMENTUM: 0.9
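The practical difference between the ``Adam`` and ``AdamW`` entries above is where ``WEIGHT_DECAY`` is applied: AdamW subtracts the decay from the weight directly rather than folding it into the gradient as L2 regularization. A minimal scalar sketch of one AdamW update, assuming the standard formulation from the referenced paper; ``adamw_step`` and its toy usage are illustrative, not this project's API:

```python
import math

def adamw_step(p, g, m, v, t, lr=0.001, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=1e-5):
    # Standard Adam first/second moment estimates with bias correction.
    m = beta1 * m + (1 - beta1) * g
    v = beta2 * v + (1 - beta2) * g * g
    m_hat = m / (1 - beta1 ** t)
    v_hat = v / (1 - beta2 ** t)
    # Decoupled weight decay: the decay term acts on the weight itself,
    # not on the gradient, so it is unaffected by the adaptive scaling.
    p = p - lr * (m_hat / (math.sqrt(v_hat) + eps) + weight_decay * p)
    return p, m, v

# Toy usage: minimize f(p) = p**2 (gradient 2p), starting from p = 1.0.
p, m, v = 1.0, 0.0, 0.0
for t in range(1, 501):
    p, m, v = adamw_step(p, 2.0 * p, m, v, t, lr=0.01)
```

With plain Adam plus L2, large second-moment estimates shrink the effective decay; the decoupled form keeps the decay rate uniform across weights.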
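The ``SAM`` entry above wraps a base optimizer (``BASE_OPTIMIZER_NAME``) in the two-step update from the referenced paper: first perturb the weights toward the local worst case within an L2 ball of radius ``RHO``, then step the base optimizer using the gradients at the perturbed point. A minimal sketch under the simplifying assumption that the base step is plain gradient descent; ``sam_step`` and ``grad_fn`` are illustrative names, not this project's API:

```python
import math

def sam_step(params, grad_fn, lr=0.001, rho=0.05):
    # Step 1: ascend to the approximate worst case inside an
    # L2 ball of radius rho around the current weights.
    grads = grad_fn(params)
    grad_norm = math.sqrt(sum(g * g for g in grads)) + 1e-12
    perturbed = [p + rho * g / grad_norm for p, g in zip(params, grads)]

    # Step 2: take the base-optimizer step (plain SGD here) using the
    # gradients evaluated at the perturbed weights.
    sharp_grads = grad_fn(perturbed)
    return [p - lr * g for p, g in zip(params, sharp_grads)]

# Toy usage: minimize f(x) = x**2, whose gradient is 2x.
params = [1.0]
for _ in range(1000):
    params = sam_step(params, lambda ps: [2.0 * p for p in ps], lr=0.1)
```

Each step therefore costs two gradient evaluations, which is why SAM trains roughly twice as slowly per epoch as its base optimizer.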