## 📗 Catalog
The optimizer can be changed by adding an `OPTIMIZER` section to the config; the default optimizer is Adam. Each entry below shows the config keys that optimizer accepts, and a sketch of how such a section might be consumed appears at the end of this catalog.
The following optimizers are available:

- Adam
- AdamW
- SAM
- SGD
OPTIMIZER: NAME: "Adam" LR: 0.001 WEIGHT_DECAY: 1.0E-5 EPS: 1.0E-8
Adam: A Method for Stochastic Optimization
OPTIMIZER: NAME: "AdamW" LR: 0.001 WEIGHT_DECAY: 1.0E-5 EPS: 1.0E-8
Decoupled Weight Decay Regularization
OPTIMIZER: NAME: "SAM" LR: 0.001 RHO: 0.05 BASE_OPTIMIZER_NAME: "Adam" WEIGHT_DECAY: 1.0E-5 EPS: 1.0E-8
Sharpness-Aware Minimization for Efficiently Improving Generalization
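Unlike the others, SAM is a wrapper rather than a standalone optimizer: it first perturbs the weights in the direction of steepest ascent (scaled by `RHO`), re-evaluates the gradient at the perturbed point, and only then lets the base optimizer selected by `BASE_OPTIMIZER_NAME` take the actual step. The following is a minimal PyTorch sketch of that two-step update; the class and method names (`SAM`, `first_step`, `second_step`) are illustrative assumptions, not necessarily this repository's API.

```python
import torch

class SAM(torch.optim.Optimizer):
    """Illustrative sketch of Sharpness-Aware Minimization (not this repo's code)."""

    def __init__(self, params, base_optimizer_cls, rho=0.05, **kwargs):
        super().__init__(params, dict(rho=rho, **kwargs))
        # The base optimizer (e.g. Adam) performs the actual parameter update.
        self.base_optimizer = base_optimizer_cls(self.param_groups, **kwargs)

    @torch.no_grad()
    def first_step(self):
        # Perturb weights toward the locally highest loss: w <- w + rho * g / ||g||.
        grads = [p.grad for g in self.param_groups for p in g["params"]
                 if p.grad is not None]
        grad_norm = torch.norm(torch.stack([g.norm(p=2) for g in grads]), p=2)
        for group in self.param_groups:
            scale = group["rho"] / (grad_norm + 1e-12)
            for p in group["params"]:
                if p.grad is None:
                    continue
                e_w = p.grad * scale
                p.add_(e_w)
                self.state[p]["e_w"] = e_w  # remember the perturbation

    @torch.no_grad()
    def second_step(self):
        # Undo the perturbation, then step with the gradient from the perturbed point.
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is not None:
                    p.sub_(self.state[p]["e_w"])
        self.base_optimizer.step()

# Toy usage: SAM needs two forward/backward passes per batch.
model = torch.nn.Linear(4, 1)
opt = SAM(model.parameters(), torch.optim.Adam,
          rho=0.05, lr=1e-3, weight_decay=1e-5, eps=1e-8)
x, y = torch.randn(8, 4), torch.randn(8, 1)

torch.nn.functional.mse_loss(model(x), y).backward()
opt.first_step()
opt.zero_grad()
torch.nn.functional.mse_loss(model(x), y).backward()  # gradient at perturbed weights
opt.second_step()
opt.zero_grad()
```

Note the doubled forward/backward cost per batch: that is inherent to SAM, not an artifact of this sketch.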
OPTIMIZER: NAME: "SGD" LR: 0.001 MOMENTUM: 0.9