[modules]
Optimizer gradient-alteration algorithms, implemented as plug-in modules.
API base classes
- AuxVar: Abstract base class for OptiModule auxiliary variables.
- OptiModule: Abstract base class for optimizer plug-in algorithms.
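For orientation, below is a minimal sketch of the plug-in pattern that OptiModule subclasses follow: a module receives gradients, returns transformed gradients, and may keep internal state between calls. The class name, the `run` method, and the use of NumPy arrays are illustrative assumptions here, not the actual API.

    import numpy as np

    class ConstantScalingModule:
        # Hypothetical plug-in that rescales gradients by a constant factor.

        def __init__(self, scale: float = 1.0) -> None:
            self.scale = scale  # assumed hyper-parameter

        def run(self, gradients: np.ndarray) -> np.ndarray:
            # Transform the input gradients and return the result.
            return self.scale * gradients

Modules designed this way can be chained, each one transforming the output of the previous.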
Adaptive learning-rate algorithms
- AdaGradModule: AdaGrad algorithm.
- AdamModule: Adam and AMSGrad algorithms.
- RMSPropModule: RMSProp algorithm.
- YogiModule: Yogi algorithm, with Adam or AMSGrad base.
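As a reference for what these modules compute, here is a NumPy sketch of the textbook AdaGrad and Adam update rules; function and variable names are illustrative, not the modules' attributes.

    import numpy as np

    def adagrad_scale(grads, state, eps=1e-7):
        # AdaGrad: divide gradients by the root of the accumulated squares.
        state = state + grads ** 2               # running sum of squared gradients
        return grads / (np.sqrt(state) + eps), state

    def adam_scale(grads, m, v, step, b1=0.9, b2=0.999, eps=1e-8):
        # Adam: rescale by bias-corrected first and second moment estimates.
        m = b1 * m + (1 - b1) * grads            # EWMA of gradients
        v = b2 * v + (1 - b2) * grads ** 2       # EWMA of squared gradients
        m_hat = m / (1 - b1 ** step)             # bias corrections (step >= 1)
        v_hat = v / (1 - b2 ** step)
        return m_hat / (np.sqrt(v_hat) + eps), m, v

AMSGrad replaces v_hat with a running maximum of its past values, and Yogi swaps Adam's second-moment EWMA for a sign-based update (see YogiMomentumModule below).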
Gradient clipping algorithms
- L2Clipping: Fixed-threshold, per-parameter-L2-norm gradient clipping module.
- L2GlobalClipping: Fixed-threshold, global-L2-norm gradient clipping module.
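The two strategies differ only in the norm they bound, as this sketch illustrates (textbook formulas; the actual modules operate on the framework's own gradient containers):

    import numpy as np

    def clip_per_parameter(grads_list, threshold):
        # Clip each parameter's gradient to at most `threshold` L2 norm.
        return [
            g * min(1.0, threshold / (np.linalg.norm(g) + 1e-12))
            for g in grads_list
        ]

    def clip_global(grads_list, threshold):
        # Rescale all gradients jointly so their global L2 norm is bounded.
        total = np.sqrt(sum(float(np.sum(g ** 2)) for g in grads_list))
        factor = min(1.0, threshold / (total + 1e-12))
        return [g * factor for g in grads_list]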
Momentum algorithms
- EWMAModule: Exponentially-Weighted Moving Average module.
- MomentumModule: Momentum (and Nesterov) acceleration module.
- YogiMomentumModule: Yogi-specific EWMA-like module.
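The following sketch spells out these three update rules (standard formulations; the momentum variant follows the common PyTorch-style convention):

    import numpy as np

    def ewma(grads, state, beta=0.9):
        # Exponentially-weighted moving average of gradients (cf. EWMAModule).
        state = beta * state + (1 - beta) * grads
        return state, state

    def momentum(grads, velocity, beta=0.9, nesterov=False):
        # (Nesterov) momentum acceleration (cf. MomentumModule).
        velocity = beta * velocity + grads
        output = grads + beta * velocity if nesterov else velocity
        return output, velocity

    def yogi_ewma(sq_grads, state, beta=0.999):
        # Yogi's sign-based variant of the EWMA over squared gradients
        # (cf. YogiMomentumModule).
        return state - (1 - beta) * np.sign(state - sq_grads) * sq_grads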
Noise-addition mechanisms
- NoiseModule: Abstract base class for noise-addition modules.
- GaussianNoiseModule: Gaussian noise-addition module.
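A minimal sketch of Gaussian noise addition, e.g. as a building block for differentially-private SGD when combined with gradient clipping (illustrative NumPy code, not the module's API):

    import numpy as np

    rng = np.random.default_rng(seed=0)  # assumed seeding, for reproducibility

    def add_gaussian_noise(grads, std=0.01):
        # Add i.i.d. zero-mean Gaussian noise to the gradients.
        return grads + rng.normal(loc=0.0, scale=std, size=grads.shape)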
SCAFFOLD algorithm
Scaffold is implemented as a pair of complementary modules:
- ScaffoldClientModule: Client-side Scaffold module.
- ScaffoldServerModule: Server-side Scaffold module.
- ScaffoldAuxVar: AuxVar subclass for Scaffold modules.
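At their core, the paired modules implement the drift correction from Karimireddy et al. (2020): each client corrects its gradients by the difference between the server's and its own control variate, while the AuxVar subclass carries the state updates exchanged between rounds. A minimal sketch of the client-side correction (notation from the paper; not the modules' actual interface):

    def scaffold_correct(grads, client_state, server_state):
        # SCAFFOLD client update direction: g - c_i + c, where c_i is the
        # client's control variate and c is the server's (shared) one.
        return grads - client_state + server_state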