# mlp

## MLP

Bases: `Module`
A configurable multi-layer perceptron used as the generator and critic backbone in the GAN.
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| `n_units_in` | `int` | Input dimensionality. | *required* |
| `n_units_out` | `int` | Output dimensionality. | *required* |
| `n_layers_hidden` | `int` | Number of hidden layers. | `2` |
| `n_units_hidden` | `int` | Width of each hidden layer. | `256` |
| `activation` | `str` | Hidden-layer activation function name (key in `ACTIVATION_FUNCTIONS`). | `'leaky_relu'` |
| `batch_norm` | `bool` | Whether to apply BatchNorm after each hidden linear layer. | `False` |
| `dropout` | `float` | Dropout probability for hidden layers (`0` = disabled). | `0.0` |
| `residual` | `bool` | Whether to add residual skip connections within hidden layers of equal width. | `False` |
| `lr` | `float` | Learning rate for the Adam optimiser. | `0.0002` |
| `opt_betas` | `tuple` | `(beta1, beta2)` for the Adam optimiser. | `(0.9, 0.999)` |
| `activation_out` | `Optional[str]` | Optional activation on the output layer (key in `ACTIVATION_FUNCTIONS`). | `None` |
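To illustrate how these parameters shape the network, here is a minimal, hypothetical sketch (not the library's actual implementation) that builds the layer plan such an MLP would produce, showing where `batch_norm`, `dropout`, `activation`, and `activation_out` slot in:

```python
from typing import List, Optional

def mlp_layer_plan(
    n_units_in: int,
    n_units_out: int,
    n_layers_hidden: int = 2,
    n_units_hidden: int = 256,
    activation: str = "leaky_relu",
    batch_norm: bool = False,
    dropout: float = 0.0,
    activation_out: Optional[str] = None,
) -> List[str]:
    """Return a textual plan of the layers (hypothetical helper, for illustration)."""
    plan: List[str] = []
    width_in = n_units_in
    for _ in range(n_layers_hidden):
        # Each hidden block: Linear -> optional BatchNorm -> activation -> optional Dropout
        plan.append(f"Linear({width_in} -> {n_units_hidden})")
        if batch_norm:
            plan.append(f"BatchNorm1d({n_units_hidden})")
        plan.append(activation)
        if dropout > 0.0:
            plan.append(f"Dropout(p={dropout})")
        width_in = n_units_hidden
    # Output projection, with an optional output activation (e.g. 'sigmoid' for a critic)
    plan.append(f"Linear({width_in} -> {n_units_out})")
    if activation_out is not None:
        plan.append(activation_out)
    return plan

for layer in mlp_layer_plan(10, 1, batch_norm=True, dropout=0.1, activation_out="sigmoid"):
    print(layer)
```

With the defaults (`n_layers_hidden=2`, no BatchNorm, no dropout), this reduces to two `Linear`/`leaky_relu` blocks followed by a final `Linear` projection; `residual` (omitted here) would additionally add skip connections between equal-width hidden layers.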