Model Config
------------
* `nnetType` : (Mandatory) specify the type of network (CNN/RBM/SDA/DNN)
* `train_data` : (Mandatory) specify the training data
* `wdir` : (Mandatory) specify the working directory
* `data_spec` : (Mandatory) specify the path of the data specification, relative to `model_config.json`
* `nnet_spec` : (Mandatory) specify the path of the network configuration specification, relative to `model_config.json`
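
For illustration, a minimal `model_config.json` covering the mandatory keys might look like the following; all file names and paths are placeholders, not values prescribed by the toolkit:

```json
{
    "nnetType": "DNN",
    "train_data": "train.pickle.gz",
    "wdir": "exp/dnn",
    "data_spec": "data_spec.json",
    "nnet_spec": "nnet_spec.json"
}
```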

* `output_file` : (Mandatory) specify the path of the RBM network output file, relative to the working directory
* `input_file` : specify the path of the RBM network input file, relative to the working directory

* `batch_size` : specify the mini-batch size used while training; default 128

* `n_ins` : number of input dimensions (e.g. 784)
* `n_outs` : number of output classes (e.g. 10)

* `gbrbm_learning_rate` : learning rate for the Gaussian-Bernoulli RBM layer during pre-training
* `pretraining_learning_rate` : learning rate for pre-training
* `pretraining_epochs` : number of pre-training epochs

* `initial_pretrain_momentum` : initial momentum factor while pre-training; default 0.5
* `final_pretrain_momentum` : final momentum factor while pre-training; default 0.9
* `initial_pretrain_momentum_epoch` : number of epochs after which the momentum switches from the initial to the final value; default 5
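
The three momentum keys above suggest a simple two-stage schedule. As a sketch (the helper name and the exact switching rule are assumptions, not the toolkit's implementation):

```python
def pretrain_momentum(epoch, initial=0.5, final=0.9, switch_epoch=5):
    """Return the momentum factor for a given pre-training epoch.

    Hypothetical sketch of the initial/final_pretrain_momentum keys:
    use the initial momentum for the first `switch_epoch` epochs,
    then the final momentum for the rest of pre-training.
    """
    return initial if epoch < switch_epoch else final
```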

* `finetune_method` : two methods are supported: `C` (constant learning rate) and `E` (exponential decay)
* `finetune_rate` : learning-rate configuration; the applicable parameters depend on the chosen method (see table below)

> param | description | default value | learning method
> :-----|:------------|:-------------:|:---------------:
> `learning_rate` | constant learning rate | 0.08 | C
> `epoch_num` | number of fine-tuning epochs | 10 | C
> `start_rate` | initial learning rate | 0.08 | E
> `scale_by` | factor by which the rate is scaled at each decay step | 0.5 | E
> `min_derror_decay_start` | error-improvement threshold below which decay starts | 0.05 | E
> `min_derror_stop` | error-improvement threshold below which training stops | 0.05 | E
> `min_epoch_decay_start` | minimum epoch before decay may start | 15 | E
> `init_error` | initial error value | 100 | E

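The exponential-decay parameters above can be read as one schedule: hold `start_rate` until the per-epoch error improvement drops below `min_derror_decay_start` (and at least `min_epoch_decay_start` epochs have passed), then multiply the rate by `scale_by` each epoch, stopping once the improvement falls below `min_derror_stop`. A minimal sketch of that reading (this is an assumption about the semantics, not the toolkit's actual scheduler):

```python
def exponential_decay_schedule(errors,
                               start_rate=0.08,
                               scale_by=0.5,
                               min_derror_decay_start=0.05,
                               min_derror_stop=0.05,
                               min_epoch_decay_start=15,
                               init_error=100.0):
    """Return the learning rate used at each epoch, given the
    sequence of validation errors observed after each epoch."""
    rate = start_rate
    prev_error = init_error
    decaying = False
    rates = []
    for epoch, error in enumerate(errors, start=1):
        rates.append(rate)
        derror = prev_error - error  # improvement this epoch
        if decaying:
            if derror < min_derror_stop:
                break            # improvement too small: stop training
            rate *= scale_by     # keep shrinking the rate
        elif derror < min_derror_decay_start and epoch >= min_epoch_decay_start:
            decaying = True      # improvement stalled: start decaying
            rate *= scale_by
        prev_error = error
    return rates
```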
* `finetune_momentum` : specify the momentum factor used while fine-tuning

* `processes` : flags selecting which stages of the pipeline to run:
>> * `pretraining` : default: false
>> * `finetuning` : default: false
>> * `testing` : default: false
>> * `export_data` : default: false

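As a sketch, enabling every stage except feature export might look like this (assuming the flags nest under `processes` as a JSON object):

```json
{
    "processes": {
        "pretraining": true,
        "finetuning": true,
        "testing": true,
        "export_data": false
    }
}
```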
* `export_path` : path (relative to `wdir`) for writing (bottleneck) features