@@ -352,12 +352,17 @@ class ParametersNetwork(ParametersBase):
 functions are added layer by layer.
 Note that no activation function is applied between input layer and
 first hidden layer!
-Currently supported activation functions are:
-
-- Sigmoid
-- ReLU
-- None (no activation used)
-- LeakyReLU (default)
+The items in the list can either be strings, which MALA
+will map directly to the correct activation functions,
+torch.nn.Module objects containing the activation functions directly,
+or None, in which case no activation function is used. A trailing
+None can be omitted, but None is useful when layers without
+activation functions are to be added in the middle of the network.
+Currently supported activation function strings are:
+
+- "Sigmoid"
+- "ReLU"
+- "LeakyReLU" (default)
 
 loss_function_type : string
     Loss function for the neural network
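The resolution logic described in the new docstring (string lookup, pass-through for module objects, identity for None) can be sketched as follows. This is a dependency-free illustration, not MALA's actual implementation: the `ACTIVATIONS` dict and `resolve` helper are hypothetical, and plain callables stand in for the torch.nn.Module objects MALA would use.

```python
import math

# Hypothetical mapping from activation-name strings to callables;
# MALA maps such strings to torch.nn modules instead.
ACTIVATIONS = {
    "Sigmoid": lambda x: 1.0 / (1.0 + math.exp(-x)),
    "ReLU": lambda x: max(0.0, x),
    "LeakyReLU": lambda x: x if x > 0 else 0.01 * x,
}


def resolve(item):
    """Resolve one list entry per the docstring's rules (sketch only)."""
    if item is None:
        return lambda x: x  # None: no activation for this layer
    if isinstance(item, str):
        return ACTIVATIONS[item]  # string: look up the named activation
    return item  # already a callable (module object in real MALA)


# Mixed list: a string, a bare None in the middle, and another string.
layer_activations = ["ReLU", None, "LeakyReLU"]
funcs = [resolve(a) for a in layer_activations]
print([f(-2.0) for f in funcs])  # [0.0, -2.0, -0.02]
```

The middle `None` shows why it cannot always be dropped: only a trailing `None` is redundant, since position in the list determines which layer gets no activation.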