Commit bbef691

Added more documentation
1 parent d615216 commit bbef691

1 file changed

Lines changed: 11 additions & 6 deletions

mala/common/parameters.py

@@ -352,12 +352,17 @@ class ParametersNetwork(ParametersBase):
         functions are added layer by layer.
         Note that no activation function is applied between input layer and
         first hidden layer!
-        Currently supported activation functions are:
-
-            - Sigmoid
-            - ReLU
-            - None (no activation used)
-            - LeakyReLU (default)
+        The items in the list can either be strings, which MALA will map
+        directly to the correct activation functions, or torch.nn.Module
+        objects containing the activation functions directly, or None, in
+        which case no activation function is used. The None can be omitted
+        at the end, but is useful when layers without activation functions
+        are to be added in the middle.
+        Currently supported activation function strings are:
+
+            - "Sigmoid"
+            - "ReLU"
+            - "LeakyReLU" (default)
 
     loss_function_type : string
         Loss function for the neural network
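
For context, a minimal sketch of how the documented behaviour might look in user code. The attribute name (layer_activations), the params.network access path, and the layer sizes are assumptions made for illustration; they are not shown in this hunk:

    import torch
    import mala

    params = mala.Parameters()
    # Assumed example architecture: input, two hidden layers, output.
    params.network.layer_sizes = [10, 100, 100, 1]

    # Mixing the three documented item types:
    #   - a string, which MALA maps to the matching activation function
    #   - None, meaning no activation is applied at that position
    #   - a torch.nn.Module instance, used directly
    # A trailing None could be omitted; it is only needed to leave a gap
    # in the middle of the list.
    params.network.layer_activations = ["ReLU", None, torch.nn.LeakyReLU()]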
