dp
DPMixin
Bases: ABC
Mixin class to make a `Model` differentially private.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `target_epsilon` | `float` | The target epsilon (privacy budget) for the model during training | `3.0` |
| `target_delta` | `Optional[float]` | The target delta for the model during training | `None` |
| `max_grad_norm` | `float` | The maximum norm for per-sample gradients; larger gradients are clipped to this norm (see the sketch below this table) | `5.0` |
| `secure_mode` | `bool` | Whether to use the 'secure mode' of PyTorch's DP-SGD implementation via the `opacus` library | `False` |
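The `max_grad_norm` parameter follows the standard DP-SGD recipe: each per-sample gradient is clipped to a fixed L2 norm before noise is added, which bounds any single record's influence on a model update. The following is a minimal illustrative sketch of that clipping step only, not nhssynth code; the tensor and values are invented for the example:

```python
import torch

max_grad_norm = 5.0  # the documented default
per_sample_grad = torch.randn(100) * 10.0  # made-up gradient for one example

# Scale the gradient down only if its L2 norm exceeds the threshold
norm = per_sample_grad.norm(2)
if norm > max_grad_norm:
    per_sample_grad = per_sample_grad * (max_grad_norm / norm)

assert per_sample_grad.norm(2) <= max_grad_norm + 1e-6
```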
Attributes:

| Name | Type | Description |
|---|---|---|
| `target_epsilon` | `float` | The target epsilon for the model during training |
| `target_delta` | `float` | The target delta for the model during training |
| `max_grad_norm` | `float` | The maximum norm for per-sample gradients; larger gradients are clipped to this norm |
| `secure_mode` | `bool` | Whether to use the 'secure mode' of PyTorch's DP-SGD implementation via the `opacus` library |
Raises:

| Type | Description |
|---|---|
| `TypeError` | If the inheritor is not a `Model` |
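To illustrate how a mixin like this composes with a model class, here is a self-contained sketch of the cooperative-inheritance pattern. `Model`, `DPMixinSketch`, and `MyDPModel` below are simplified stand-ins written for this example, not nhssynth's actual classes (which additionally enforce the `TypeError` described above):

```python
from abc import ABC
from typing import Optional


class Model:
    """Stand-in for nhssynth's Model base class (assumption)."""

    def __init__(self, **kwargs) -> None:
        super().__init__(**kwargs)


class DPMixinSketch(ABC):
    """Simplified stand-in mirroring the documented constructor arguments."""

    def __init__(
        self,
        target_epsilon: float = 3.0,
        target_delta: Optional[float] = None,
        max_grad_norm: float = 5.0,
        secure_mode: bool = False,
        **kwargs,
    ) -> None:
        super().__init__(**kwargs)  # cooperative call continues down the MRO
        self.target_epsilon = target_epsilon
        self.target_delta = target_delta
        self.max_grad_norm = max_grad_norm
        self.secure_mode = secure_mode


class MyDPModel(DPMixinSketch, Model):
    """Mixin listed first so its __init__ runs before Model's."""


model = MyDPModel(target_epsilon=3.0)
print(model.target_epsilon)  # 3.0
```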
Source code in src/nhssynth/modules/model/common/dp.py
make_private(num_epochs, module=None)
Make the passed module (or the full model if no module is passed), along with its associated optimizer and data loader, private.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `num_epochs` | `int` | The number of epochs to train for, used to calculate the privacy budget. | *required* |
| `module` | `Optional[Module]` | The module to make private. | `None` |
Returns:

| Type | Description |
|---|---|
| `GradSampleModule` | The privatised module. |
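For context, `GradSampleModule` is the wrapper class from the `opacus` library, which indicates that `make_private` delegates to Opacus's privacy engine. Below is a hedged sketch of the equivalent raw Opacus call, not nhssynth's implementation; the toy model, data, and the `target_delta` value are invented for illustration:

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from opacus import PrivacyEngine

# Toy model and data, invented for this example
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
data_loader = DataLoader(
    TensorDataset(torch.randn(64, 10), torch.randn(64, 1)), batch_size=8
)

privacy_engine = PrivacyEngine(secure_mode=False)  # mirrors secure_mode
model, optimizer, data_loader = privacy_engine.make_private_with_epsilon(
    module=model,
    optimizer=optimizer,
    data_loader=data_loader,
    target_epsilon=3.0,   # documented default
    target_delta=1e-5,    # invented here; the mixin's default is None
    epochs=5,             # plays the role of num_epochs
    max_grad_norm=5.0,    # documented default clipping norm
)
# `model` is now wrapped in an opacus GradSampleModule, matching the
# documented return type of make_private.
```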