config_doc/DNN_Specification.md
DNN Specification
=================

*`hidden_layers` : (Mandatory) An array containing the size of each hidden RBM layer.
*`pretrained_layers` : Number of layers to be pre-trained. (Default Value = size of `hidden_layers`)
*`max_col_norm` : The max value of the norm of gradients; usually used with dropout and maxout. (Default Value = null)
*`l1_reg` : L1-norm regularization weight. (Default Value = 0)
*`l2_reg` : L2-norm regularization weight. (Default Value = 0)
*`adv_activation` : Advanced activation settings, required if maxout/pnorm is used. It contains:
> *`method` : Either 'maxout' or 'pnorm'. With `maxout`, neuron outputs are pooled in groups of `pool_size`; with `pnorm`, the output is normalized after pooling.
> *`pool_size` : The number of units in each max-pooling (or pnorm) group for maxout/pnorm. (Default Value = 1)
> *`pnorm_order` : The norm order for pnorm. (Default Value = 1)
*`activation` : Activation function used by the layers. (If `adv_activation` is used, it should be either 'linear', 'relu' or 'cappedrelu'.)
*`do_dropout` : Whether to use dropout. (Default Value = false)
*`dropout_factor` : The dropout factors for the DNN layers, one for each hidden layer. (Default Value = [0.0])
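Putting the fields above together, a specification using pnorm and dropout might look like the following. This is only an illustrative sketch: the exact file layout depends on the toolkit that consumes it, and all values here are made up.

```json
{
    "hidden_layers": [1024, 1024, 1024],
    "pretrained_layers": 3,
    "max_col_norm": null,
    "l1_reg": 0,
    "l2_reg": 0.0001,
    "adv_activation": {
        "method": "pnorm",
        "pool_size": 4,
        "pnorm_order": 2
    },
    "activation": "linear",
    "do_dropout": true,
    "dropout_factor": [0.2, 0.2, 0.2]
}
```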
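The pooling behaviour selected by `adv_activation.method` can be sketched in a few lines of plain Python. This is a minimal illustration of maxout versus p-norm pooling over groups of `pool_size` units, not the toolkit's actual implementation; the function names are hypothetical.

```python
def maxout(units, pool_size):
    # Maxout: each consecutive group of `pool_size` units is
    # reduced to its maximum, shrinking the layer by `pool_size`.
    return [max(units[i:i + pool_size])
            for i in range(0, len(units), pool_size)]

def pnorm(units, pool_size, order=2):
    # p-norm: each group of `pool_size` units is reduced to its
    # p-norm (`order` corresponds to `pnorm_order` in the spec).
    return [sum(abs(u) ** order for u in units[i:i + pool_size]) ** (1.0 / order)
            for i in range(0, len(units), pool_size)]

h = [1.0, -2.0, 3.0, 0.0]
print(maxout(h, pool_size=2))          # -> [1.0, 3.0]
print(pnorm(h, pool_size=2, order=2))  # -> [2.236..., 3.0]
```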