
Commit 632dd93

Update DNN_Specification.md

1 parent aaba12d

1 file changed: config_doc/DNN_Specification.md (7 additions & 5 deletions)
@@ -2,17 +2,19 @@ DNN Specification
 =================

 * `hidden_layers` : (Mandatory) An array containing the size of each hidden RBM layer.
-* `activation` : Activation function used by the layers.
 * `pretrained_layers` : Number of layers to be pre-trained. (Default Value = size of `hidden_layers`)

 * `max_col_norm` : The max value of the norm of gradients; usually used in dropout and maxout. (Default Value = null)
 * `l1_reg` : L1-norm regularization weight. (Default Value = 0)
 * `l2_reg` : L2-norm regularization weight. (Default Value = 0)

-* `do_maxout` : whether to use maxout or not. (Default Value = false)
-* `do_pnorm` : whether to use p-norm or not. (Default Value = false)
-* `pool_size` : The number of units in each max-pooling (or pnorm) group for maxout/pnorm. (Default Value = 1)
-* `pnorm_order` : The norm order for pnorm. (Default Value = 1)
+* `adv_activation` : set when maxout/pnorm is used. It contains:
+
+> * `method` : Either 'maxout' or 'pnorm'. In `maxout`, neuron outputs are pooled in groups of `pool_size`; in `pnorm`, the output is normalized after pooling.
+> * `pool_size` : The number of units in each max-pooling (or pnorm) group for maxout/pnorm. (Default Value = 1)
+> * `pnorm_order` : The norm order for pnorm. (Default Value = 1)
+
+* `activation` : Activation function used by the layers. (If `adv_activation` is used, it should be either 'linear', 'relu', or 'cappedrelu'.)


 * `do_dropout` : whether to use dropout or not. (Default Value = false)
 * `dropout_factor` : the dropout factors for the DNN layers, one for each hidden layer. (Default Value = [0.0])
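For reference, here is a minimal sketch of a specification exercising the fields above. The spec does not show the concrete file format, so this is written as a Python dict with illustrative values, not the toolkit's actual config syntax:

```python
# Hypothetical specification assembled as a Python dict; key names follow
# the fields documented above, all values are illustrative.
dnn_config = {
    "hidden_layers": [1024, 1024, 1024],  # mandatory: size of each hidden RBM layer
    "pretrained_layers": 3,               # defaults to len(hidden_layers)
    "max_col_norm": None,                 # default null
    "l1_reg": 0.0,
    "l2_reg": 0.0,
    "adv_activation": {                   # only present when maxout/pnorm is used
        "method": "maxout",              # 'maxout' or 'pnorm'
        "pool_size": 2,
        "pnorm_order": 1,
    },
    "activation": "relu",                 # with adv_activation: 'linear', 'relu' or 'cappedrelu'
    "do_dropout": False,
    "dropout_factor": [0.0, 0.0, 0.0],    # one factor per hidden layer
}
```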

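And a small NumPy sketch of what `method`, `pool_size`, and `pnorm_order` compute, under the common reading of maxout and p-norm pooling. The spec says only that the pnorm output is "normalized after pooling" without fixing the normalization, so the unit-RMS rescaling below is an assumption:

```python
import numpy as np

def maxout(x, pool_size):
    """Max over each non-overlapping group of `pool_size` units."""
    return x.reshape(-1, pool_size).max(axis=1)

def pnorm(x, pool_size, pnorm_order=1, eps=1e-8):
    """P-norm pooling: y_j = (sum_i |x_i|**p) ** (1/p) over each group.
    The subsequent rescaling to unit root-mean-square is one plausible
    reading of 'output is normalized after pooling'."""
    g = x.reshape(-1, pool_size)
    y = (np.abs(g) ** pnorm_order).sum(axis=1) ** (1.0 / pnorm_order)
    return y / (np.sqrt((y ** 2).mean()) + eps)

h = np.array([0.5, -1.0, 2.0, 0.1, -0.3, 0.7])  # 6 pre-activation units
print(maxout(h, pool_size=2))                   # 3 maxout units
print(pnorm(h, pool_size=2, pnorm_order=2))     # 3 normalized p-norm units
```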