declearn.optimizer.regularizers.Regularizer
Abstract class defining an API to implement loss-regularizers.
The Regularizer API is close to the OptiModule one, with the following differences:
- Regularizers are meant to be applied prior to Modules, as a way to complete the computation of "raw" gradients.
- Regularizers do not provide an API to share stateful variables between a server and its clients.
The aim of this abstraction (which itself operates on the Vector abstraction, so as to provide framework-agnostic algorithms) is to enable implementing loss-regularization terms, rewritten as gradients-correction bricks, that can easily and modularly be plugged into optimization algorithms.
The declearn.optimizer.Optimizer class defines the main tools and routines for computing and applying gradients-based updates. Regularizer instances are designed to be plugged into such an Optimizer instance, where they are applied to the raw gradients prior to any further processing (e.g. adaptive scaling algorithms).
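As a hedged usage sketch (assuming declearn's Optimizer exposes a `regularizers` argument accepting (name, config) plug-in specifications, and that a ridge regularizer is registered under the name "ridge"):

```python
from declearn.optimizer import Optimizer

# Hypothetical configuration: plug a ridge regularizer in front of an
# Adam module, so that the penalty's derivative is added to the raw
# gradients before any adaptive scaling takes place.
optim = Optimizer(
    lrate=0.001,
    regularizers=[("ridge", {"alpha": 0.005})],
    modules=["adam"],
)
```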
Abstract
- name: str class attribute
  Name identifier of the class (should be unique across existing Regularizer classes). Also used for automatic types-registration of the class (see the Inheritance section below).
- run(gradients: Vector, weights: Vector) -> Vector:
  Compute the regularization term's derivative from the weights, and add it to the input gradients. This is the main method for any Regularizer.
Overridable
- on_round_start() -> None:
  Perform any required operation (e.g. resetting a state variable) at the start of a training round. By default, this method has no effect and may thus be safely ignored when no behavior is needed. A minimal subclass sketch is shown right below.
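This sketch is illustrative only: the CustomRidge class and its penalty formula are not part of declearn; it also assumes that Vector can be imported from declearn.model.api, that the base __init__ stores the scaling coefficient as self.alpha, and that Vector instances support scalar multiplication and addition.

```python
from declearn.model.api import Vector
from declearn.optimizer.regularizers import Regularizer


class CustomRidge(Regularizer):
    """Illustrative L2 penalty: regularized_loss = loss + alpha * ||w||^2."""

    name = "custom-ridge"  # unique identifier, also used for type-registration

    def run(self, gradients: Vector, weights: Vector) -> Vector:
        # d/dw (alpha * ||w||^2) = 2 * alpha * w, added to the raw gradients.
        return gradients + weights * (2 * self.alpha)

    def on_round_start(self) -> None:
        # No state to reset between training rounds; overriding is optional.
        pass
```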
Inheritance
When a subclass inheriting from Regularizer is declared, it is automatically registered under the "Regularizer" group using its class-attribute name. This can be prevented by adding register=False to the inheritance specs (e.g. class MyCls(Regularizer, register=False)). See declearn.utils.register_type for details on types registration.
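A hedged sketch of opting out of this mechanism (the class below is illustrative):

```python
from declearn.model.api import Vector
from declearn.optimizer.regularizers import Regularizer


class LocalPenalty(Regularizer, register=False):
    """Subclass kept out of the "Regularizer" type registry."""

    name = "local-penalty"

    def run(self, gradients: Vector, weights: Vector) -> Vector:
        return gradients  # no-op correction, for illustration only
```

Since such a class is not registered, it presumably cannot be retrieved by name (e.g. through from_specs) and must be referenced directly.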
name: ClassVar[str] = NotImplemented
class-attribute
Name identifier of the class, unique across Regularizer classes.
__init__(alpha=0.01)
Instantiate the loss regularization term.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| alpha | float | Coefficient scaling the regularization term as part of the regularized loss function's formulation. | 0.01 |
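Continuing with the illustrative CustomRidge subclass sketched above, instantiation merely sets this scaling coefficient:

```python
reg = CustomRidge(alpha=0.05)  # regularized loss ~ base_loss + 0.05 * ||w||^2
```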
__init_subclass__(register=True, **kwargs)
Automatically type-register Regularizer subclasses.
from_config(config)
classmethod
Instantiate a Regularizer from its configuration dict.
from_specs(name, config)
staticmethod
Instantiate a Regularizer from its specifications.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| name | str | Name based on which the regularizer can be retrieved. Available as a class attribute. | required |
| config | Dict[str, Any] | Configuration dict of the regularizer, that is to be passed to its from_config class constructor. | required |
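A hedged usage sketch, assuming a ridge regularizer is registered under the name "ridge" and accepts an alpha coefficient in its configuration:

```python
from declearn.optimizer.regularizers import Regularizer

# Retrieve the registered subclass by name and instantiate it from a
# configuration dict (forwarded to its from_config constructor).
reg = Regularizer.from_specs("ridge", {"alpha": 0.01})
```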
get_config()
Return the regularizer's JSON-serializable dict configuration.
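Together with from_config (documented above), this enables a simple serialization round-trip; a hedged sketch, reusing the illustrative CustomRidge subclass:

```python
reg = CustomRidge(alpha=0.05)
config = reg.get_config()                # e.g. {"alpha": 0.05}, JSON-serializable
clone = CustomRidge.from_config(config)  # rebuild an equivalent instance
```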
on_round_start()
Perform any required action at the start of a training round.
run(gradients, weights)
abstractmethod
Compute and add the regularization term's derivative to gradients.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| gradients | Vector[T] | Input gradients to which the correction term is to be added. | required |
| weights | Vector[T] | Model weights with respect to which gradients were computed, and based on which the regularization term should be derived. | required |

Returns:

| Name | Type | Description |
|---|---|---|
| gradients | Vector | Modified input gradients. The output Vector should be fully compatible with the input one - only the values of the wrapped coefficients may have changed. |
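As a hedged, concrete illustration of these semantics, using the illustrative CustomRidge penalty and assuming a NumpyVector wrapping a dict of named numpy arrays is available from declearn.model.sklearn:

```python
import numpy as np

from declearn.model.sklearn import NumpyVector  # assumed import path

reg = CustomRidge(alpha=0.1)
weights = NumpyVector({"kernel": np.array([1.0, -2.0])})
gradients = NumpyVector({"kernel": np.array([0.5, 0.5])})

# Expected correction: 2 * alpha * weights = [0.2, -0.4], hence output
# coefficients [0.7, 0.1], wrapped in a Vector compatible with the input.
output = reg.run(gradients, weights)
```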