@@ -23,16 +23,15 @@ Model Config
 > 2. E: Exponential decay.
 * `finetune_rate`: Configuration of the learning method. Contains a json object with the following params:
 
-> param | description | default value | learning method
-> :-----|:------------|:-------------:|:--------------:
-> `learning_rate` | | 0.08 | C
-> `epoch_num` | | 10 | C
-> `start_rate` | | 0.08 | E
-> `scale_by` | | 0.5 | E
-> `min_derror_decay_start` | | 0.05 | E
-> `min_derror_stop` | | 0.05 | E
-> `min_epoch_decay_start` | | 15 | E
-> `init_error` | | 100 | E
+> param | description | default value | learning method
+> :------------------------|:------------|:-------------:|:--------------:
+> `learning_rate` | | 0.08 | C
+> `epoch_num` | | 10 | C
+> `start_rate` | | 0.08 | E
+> `scale_by` | | 0.5 | E
+> `min_derror_decay_start` | | 0.05 | E
+> `min_derror_stop` | | 0.05 | E
+> `min_epoch_decay_start` | | 15 | E
 
 * `finetune_momentum`: The momentum factor used while finetuning.
 * `export_path`: path (relative to `wdir`) for writing (bottleneck) features.
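
To make the table above concrete, a hypothetical `finetune_rate` entry for the exponential-decay ("E") method might look like the following sketch. The field names are taken from the table; the values are illustrative, not recommended defaults:

```json
{
  "finetune_rate": {
    "start_rate": 0.08,
    "scale_by": 0.5,
    "min_derror_decay_start": 0.05,
    "min_derror_stop": 0.05,
    "min_epoch_decay_start": 15,
    "init_error": 100
  },
  "finetune_momentum": 0.9
}
```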
@@ -74,4 +73,4 @@ Each one is a json object with following fields:
 * `reader_type`: (Mandatory) Type of reader: NP/T1/T2.
 * `dim_shuffle`: how to reshape the given flattened vector. Used only when `keep_flatten` is `false`.
 
-________________________________________________________________________________________________________________
+_____________________________________________________________________________________________________________
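
The exponential-decay parameters in the table suggest a "newbob"-style schedule. Below is a minimal sketch of one plausible reading of the parameter names: hold `start_rate` until the per-epoch error improvement drops below `min_derror_decay_start` (and at least `min_epoch_decay_start` epochs have run), then multiply the rate by `scale_by` each epoch, stopping once the improvement falls below `min_derror_stop`. These semantics are inferred from the names, not confirmed by this document:

```python
def exp_decay_rates(errors, start_rate=0.08, scale_by=0.5,
                    min_derror_decay_start=0.05, min_derror_stop=0.05,
                    min_epoch_decay_start=15, init_error=100):
    """Return one learning rate per completed epoch.

    `errors` is the validation error observed after each epoch.
    Semantics are a hypothetical reading of the parameter names:
    decay begins when improvement < min_derror_decay_start after
    min_epoch_decay_start epochs; training stops when improvement
    < min_derror_stop while decaying.
    """
    rate = start_rate
    prev_error = init_error  # init_error seeds the first improvement check
    decaying = False
    rates = []
    for epoch, error in enumerate(errors, start=1):
        rates.append(rate)
        derror = prev_error - error  # improvement over previous epoch
        prev_error = error
        if decaying:
            if derror < min_derror_stop:
                break  # improvement too small: stop finetuning
            rate *= scale_by
        elif derror < min_derror_decay_start and epoch >= min_epoch_decay_start:
            decaying = True
            rate *= scale_by
    return rates
```

For example, with `min_epoch_decay_start=2` the rate stays at `start_rate` while the error is falling quickly, then halves each epoch once improvements stall.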