Bases: TomlConfig
Global container for Federated Learning "run" configurations.
This class aims at wrapping multiple, possibly optional, sets of
hyper-parameters that parameterize a Federated Learning process,
each of which is specified through a dedicated dataclass or as a
unit python type.
It is designed to be used by an orchestrator, e.g. the server in
the case of a centralized federated learning process.
This class is meant to be extendable through inheritance, so as
to refine the expected fields or add some that might be used by
children (or parallel) classes of `FederatedServer` that modify
the default, centralized, federated learning process.
Fields
- rounds: int
Maximum number of training and validation rounds to perform.
- register: RegisterConfig
Parameters for clients' registration (min and/or max number
of clients to expect, optional max duration of the process).
- training: TrainingConfig
Parameters for training rounds, including effort constraints
and data-batching instructions.
- evaluate: EvaluateConfig
Parameters for validation rounds, similar to training ones.
- privacy: PrivacyConfig or None
Optional parameters to set up local differential privacy,
by having clients use the DP-SGD algorithm for training.
- early_stop: EarlyStopConfig or None
Optional parameters to set up an EarlyStopping criterion, to
be leveraged so as to interrupt the federated learning process
based on the tracking of a minimized quantity (e.g. model loss).
Instantiation classmethods
- from_toml:
Instantiate by parsing a TOML configuration file.
- from_params:
Instantiate by parsing inputs dicts (or objects).
Notes
- `register` may be defined as a single integer (in `from_params` or in
  a TOML file), that will be used as the exact number of clients.
- If `evaluate` is not provided to `from_params` or in the parsed TOML
  file, default parameters will automatically be used and the training
  batch size will be used for evaluation as well.
- If `privacy` is provided and the 'poisson' parameter is unspecified
  for `training`, it will be set to True by default rather than False.
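For illustration, a minimal TOML file matching the fields above might look as follows. The section names come from the Fields listing; the inner keys shown (`batch_size`, the integer shorthand for `register`) are the ones mentioned in the notes, and any other parameters of the sub-dataclasses are omitted here:

```toml
# Hypothetical declearn run configuration (illustrative sketch).
rounds = 10

# An integer value is used as the exact number of clients to expect.
register = 5

[training]
batch_size = 32

# The [evaluate] section may be omitted entirely, in which case default
# parameters are used and the training batch size is reused.
```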
Source code in declearn/main/config/_run_config.py
@dataclasses.dataclass
class FLRunConfig(TomlConfig):
"""Global container for Federated Learning "run" configurations.
This class aims at wrapping multiple, possibly optional, sets of
hyper-parameters that parameterize a Federated Learning process,
each of which is specified through a dedicated dataclass or as a
unit python type.
It is designed to be used by an orchestrator, e.g. the server in
the case of a centralized federated learning process.
This class is meant to be extendable through inheritance, so as
to refine the expected fields or add some that might be used by
children (or parallel) classes of `FederatedServer` that modify
the default, centralized, federated learning process.
Fields
------
- rounds: int
Maximum number of training and validation rounds to perform.
- register: RegisterConfig
Parameters for clients' registration (min and/or max number
of clients to expect, optional max duration of the process).
- training: TrainingConfig
Parameters for training rounds, including effort constraints
and data-batching instructions.
- evaluate: EvaluateConfig
Parameters for validation rounds, similar to training ones.
- privacy: PrivacyConfig or None
Optional parameters to set up local differential privacy,
by having clients use the DP-SGD algorithm for training.
- early_stop: EarlyStopConfig or None
Optional parameters to set up an EarlyStopping criterion, to
be leveraged so as to interrupt the federated learning process
based on the tracking of a minimized quantity (e.g. model loss).
Instantiation classmethods
--------------------------
- from_toml:
Instantiate by parsing a TOML configuration file.
- from_params:
Instantiate by parsing inputs dicts (or objects).
Notes
-----
- `register` may be defined as a single integer (in `from_params` or in
a TOML file), that will be used as the exact number of clients.
- If `evaluate` is not provided to `from_params` or in the parsed TOML
file, default parameters will automatically be used and the training
batch size will be used for evaluation as well.
- If `privacy` is provided and the 'poisson' parameter is unspecified
for `training`, it will be set to True by default rather than False.
"""
rounds: int
register: RegisterConfig
training: TrainingConfig
evaluate: EvaluateConfig
privacy: Optional[PrivacyConfig] = None
early_stop: Optional[EarlyStopConfig] = None # type: ignore # is a type
@classmethod
def parse_register(
cls,
field: dataclasses.Field, # future: dataclasses.Field[RegisterConfig]
inputs: Any,
) -> RegisterConfig:
"""Field-specific parser to instantiate a RegisterConfig.
This method supports specifying `register`:
* as a single int, translated into {"min_clients": inputs}
* as None (or missing kwarg), using default RegisterConfig()
It otherwise routes inputs back to the `default_parser`.
"""
if inputs is None:
return RegisterConfig()
if isinstance(inputs, int):
return RegisterConfig(min_clients=inputs)
return cls.default_parser(field, inputs)
@classmethod
def from_params(
cls,
**kwargs: Any,
) -> Self:
# If evaluation batch size is not set, use the same as training.
# Note: if inputs have invalid formats, let the parent method fail.
evaluate = kwargs.setdefault("evaluate", {})
if isinstance(evaluate, dict):
training = kwargs.get("training")
if isinstance(training, dict):
evaluate.setdefault("batch_size", training.get("batch_size"))
elif isinstance(training, TrainingConfig):
evaluate.setdefault("batch_size", training.batch_size)
# If privacy is set and poisson sampling bool parameter is unspecified
# for the training dataset, make it True rather than False by default.
privacy = kwargs.get("privacy")
if isinstance(privacy, dict):
training = kwargs.get("training")
if isinstance(training, dict):
training.setdefault("poisson", True)
# Delegate the rest of the work to the parent method.
return super().from_params(**kwargs)
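The pre-processing performed by `from_params` above can be sketched in isolation on plain dicts. This is a simplified mimic for illustration, not the actual declearn code (which also handles the dataclass-typed inputs and delegates final parsing to the parent class); the `budget` key is a hypothetical placeholder for real `PrivacyConfig` parameters:

```python
from typing import Any, Dict


def apply_run_defaults(**kwargs: Any) -> Dict[str, Any]:
    """Mimic FLRunConfig.from_params' defaulting rules on plain dicts."""
    # If no evaluation batch size is set, reuse the training one.
    evaluate = kwargs.setdefault("evaluate", {})
    training = kwargs.get("training")
    if isinstance(evaluate, dict) and isinstance(training, dict):
        evaluate.setdefault("batch_size", training.get("batch_size"))
    # When privacy is configured, default training to Poisson sampling.
    if isinstance(kwargs.get("privacy"), dict) and isinstance(training, dict):
        training.setdefault("poisson", True)
    return kwargs


params = apply_run_defaults(
    rounds=10,
    training={"batch_size": 32},
    privacy={"budget": (1.0, 1e-5)},  # hypothetical key, for illustration
)
print(params["evaluate"]["batch_size"])  # 32
print(params["training"]["poisson"])     # True
```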
parse_register(field, inputs)
classmethod
Field-specific parser to instantiate a RegisterConfig.
This method supports specifying `register`:
- as a single int, translated into {"min_clients": inputs}
- as None (or missing kwarg), using default RegisterConfig()
It otherwise routes inputs back to the `default_parser`.
Source code in declearn/main/config/_run_config.py
@classmethod
def parse_register(
cls,
field: dataclasses.Field, # future: dataclasses.Field[RegisterConfig]
inputs: Any,
) -> RegisterConfig:
"""Field-specific parser to instantiate a RegisterConfig.
This method supports specifying `register`:
* as a single int, translated into {"min_clients": inputs}
* as None (or missing kwarg), using default RegisterConfig()
It otherwise routes inputs back to the `default_parser`.
"""
if inputs is None:
return RegisterConfig()
if isinstance(inputs, int):
return RegisterConfig(min_clients=inputs)
return cls.default_parser(field, inputs)
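The dispatch above can be exercised with a minimal stand-in for `RegisterConfig`. This is a sketch under assumed defaults (the stub's field names and default values are illustrative, not necessarily those of the real declearn class), and it substitutes a simple dict branch for the `default_parser` fallback:

```python
import dataclasses
from typing import Any, Optional


@dataclasses.dataclass
class StubRegisterConfig:
    """Minimal stand-in for declearn's RegisterConfig (assumed fields)."""
    min_clients: int = 1
    max_clients: Optional[int] = None


def parse_register(inputs: Any) -> StubRegisterConfig:
    """Mimic FLRunConfig.parse_register's input dispatch."""
    if inputs is None:
        return StubRegisterConfig()
    if isinstance(inputs, int):
        return StubRegisterConfig(min_clients=inputs)
    # The real method routes other inputs to `default_parser`; here we
    # only handle dict inputs, for illustration.
    if isinstance(inputs, dict):
        return StubRegisterConfig(**inputs)
    raise TypeError(f"Unsupported register input: {inputs!r}")


print(parse_register(5))     # StubRegisterConfig(min_clients=5, max_clients=None)
print(parse_register(None))  # StubRegisterConfig(min_clients=1, max_clients=None)
```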