- `nnetType`: (Mandatory) Type of network (`CNN`/`RBM`/`SDA`/`DNN`)
- `train_data`: (Mandatory) The working directory containing data configuration and output
- `wdir`: (Mandatory) Working directory
- `data_spec`: (Mandatory) The path of the data specification, relative to `model_config.json`
- `nnet_spec`: (Mandatory) The path of the network configuration specification, relative to `model_config.json`
- `output_file`: (Mandatory) The path of the RBM network output file, relative to `wdir`
- `input_file`: The path of the pre-trained/fine-tuned network input file, relative to `wdir` (Mandatory for DNN)
- `random_seed`: Random seed used for initialization of weights
- `logger_level`: Level of the logger. Valid values are `"INFO"`, `"DEBUG"` and `"ERROR"`
- `batch_size`: The mini-batch size used while training (default: 128)
- `n_ins`: Dimension of the input (Mandatory for all except CNN)
- `n_outs`: (Mandatory) Dimension of the output (number of classes)
- `input_shape`: The input shape of a given feature vector (Mandatory for CNN). Should be an array
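To make the top-level fields concrete, here is a minimal sketch of a `model_config.json` for a DNN. The paths, dimensions, and seed are placeholder values chosen for illustration, not defaults:

```json
{
  "nnetType": "DNN",
  "train_data": "train_data/",
  "wdir": "wdir/",
  "data_spec": "data_spec.json",
  "nnet_spec": "nnet_spec.json",
  "output_file": "dnn.finetuned",
  "input_file": "dnn.pretrained",
  "random_seed": 42,
  "logger_level": "INFO",
  "batch_size": 128,
  "n_ins": 784,
  "n_outs": 10
}
```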
- `finetune_params`: Configuration of the fine-tuning learning method. Contains a JSON object with the following parameters:
  - `momentum`: The momentum factor used while fine-tuning
  - `method`: Two methods are supported:
    - `C`: Constant learning rate (default): run `epoch_num` iterations with `learning_rate` unchanged
    - `E`: Exponential decay: training starts with a learning rate of `start_rate`; once the validation error reduction between two epochs is less than `min_derror_decay_start`, the learning rate is scaled by `scale_by` during each of the remaining epochs. Training terminates when the validation error reduction between two epochs falls below `min_derror_stop`. `min_epoch_decay_start` is the minimum epoch number after which scaling can be performed.
Default values of the other parameters:

| Param | Default value | Learning method |
| --- | --- | --- |
| `learning_rate` | 0.08 | C |
| `epoch_num` | 10 | C |
| `start_rate` | 0.08 | E |
| `scale_by` | 0.5 | E |
| `min_derror_decay_start` | 0.05 | E |
| `min_derror_stop` | 0.05 | E |
| `min_epoch_decay_start` | 15 | E |

The "Learning method" column indicates whether a parameter is used by the constant learning rate (C) or exponential decay (E) method.
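As an illustration, a `finetune_params` object selecting exponential decay could look like the fragment below. The decay values shown are simply the documented defaults; the `momentum` value is a placeholder:

```json
"finetune_params": {
  "momentum": 0.5,
  "method": "E",
  "start_rate": 0.08,
  "scale_by": 0.5,
  "min_derror_decay_start": 0.05,
  "min_derror_stop": 0.05,
  "min_epoch_decay_start": 15
}
```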
- `pretrain_params`: Configuration of the pretraining method. Contains a JSON object with the following parameters:

| Param | Default value | nnet type | Description |
| --- | --- | --- | --- |
| `gbrbm_learning_rate` | 0.005 | DBN | Pretraining learning rate for the gbrbm layer |
| `learning_rate` | 0.08 | SDA, DBN | Pretraining learning rate (DBN: for all layers except the gbrbm layer) |
| `epochs` | 15 | DBN | Number of pretraining epochs |
| `initial_momentum` | 0.5 | DBN | The initial momentum factor while pre-training |
| `final_momentum` | 0.9 | DBN | The final momentum factor while pre-training |
| `initial_momentum_epoch` | 5 | DBN | Number of epochs with the initial momentum factor before switching to the final momentum factor |
| `keep_layer_num` | 0 | SDA, DBN | The layer from which pretraining should start. If non-zero, the layer is initialized with weights from `input_file` |
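For reference, a `pretrain_params` object for a DBN that spells out the documented defaults might look like this (all values below are the defaults from the table above):

```json
"pretrain_params": {
  "gbrbm_learning_rate": 0.005,
  "learning_rate": 0.08,
  "epochs": 15,
  "initial_momentum": 0.5,
  "final_momentum": 0.9,
  "initial_momentum_epoch": 5,
  "keep_layer_num": 0
}
```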
- `export_path`: Path (relative to `wdir`) for writing (bottleneck) features
- `processes`: Processes that should be run by the program. Contains a JSON object with the following parameters:
  - `pretraining`: Whether pre-training is needed (invalid for DNN and CNN). Default: `false`
  - `finetuning`: Whether fine-tuning is needed. Default: `false`
  - `testing`: Whether testing is needed. Default: `false`
  - `export_data`: Whether extracted features should be written to a file. If `true`, `export_path` is required. Default: `false`
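Putting the flags together, a `processes` object that runs pre-training, fine-tuning, and feature export could be written as below. Since `export_data` is `true`, a top-level `export_path` is also required; its value here is a placeholder:

```json
"processes": {
  "pretraining": true,
  "finetuning": true,
  "testing": false,
  "export_data": true
},
"export_path": "features.out"
```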
Also See: