Experiment setup update for G2P

This commit is contained in:
kaisheny 2015-05-29 22:45:24 -07:00
Parent ebe575481e
Commit 17bfe22dc7
4 changed files: 17 additions and 6 deletions


@@ -1,7 +1,7 @@
 #WorkDir=//speechstore5/transient/kaishengy/data/lts/Data/CNTK
 WorkDir=d:/exp/lts
-#DataDir=d:/data/lts
-DataDir=d:/data/ltsdbg
+DataDir=d:/data/lts
+#DataDir=d:/data/ltsdbg
 NdlDir=c:/dev/cntk5/ExampleSetups/G2P/setups
 PredictionModelFeatureDir=\\speechstore5\transient\kaishengy\exp\lts\result\expbilstmce300n\s4
 ExpDir=\\speechstore5\transient\kaishengy\exp\lts\result\explstm
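For context (not part of the diff): CNTK substitutes names defined in globals.config into the per-experiment configs via `$Name$` syntax, which is how the path swap above (debug data directory back to the full `d:/data/lts` set) takes effect everywhere at once. A hypothetical fragment showing how such variables are typically consumed; everything here except the `$...$` variable names is illustrative:

```
# illustrative only: an experiment config referencing the
# variables defined in globals.config above
LSTM=[
    modelPath=$ExpDir$\models\g2p.model
    reader=[
        file=$DataDir$\train.txt
    ]
]
```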


@@ -3,7 +3,7 @@
 stderr=$LogDir$\ATIS\log
 command=LSTM
-type=double
+type=float
 LSTM=[
 # this is the maximum size for the minibatch, since sequence minibatches are really just a single sequence


@@ -16,7 +16,7 @@ Run the command line with both globals.config and the desired config, separated
 * note that full paths to config files need to be provided if you are not inside the config directory
 * for example
-* C:\dev\cntk5\CNTKSolution\x64\Release\cntk configFile=C:\dev\cntk5\ExampleSetups\SLU\globals.config+C:\dev\cntk5\ExampleSetups\SLU\rnnlu.config
+* C:\dev\cntk5\x64\release\CNTK.exe configFile=C:\dev\cntk5\ExampleSetups\SLU\globals.config+C:\dev\cntk5\ExampleSetups\SLU\rnnlu.config
 Scoring
 * ./score.sh
@@ -27,6 +27,15 @@ Path Definitions:
 =================
 * globals.config [defines paths to feature and label files and experiments]
+Check training loss
+==========================
+$ grep Finish log_LSTM_LSTMTest.log
+Finished Epoch[1]: [Training Set] Train Loss Per Sample = 0.62975813 EvalErr Per Sample = 0.62975813 Ave Learn Rate Per Sample = 0.1000000015 Epoch Time=5250.689
+Finished Epoch[1]: [Validation Set] Train Loss Per Sample = 0.2035009 EvalErr Per Sample = 0.2035009
+------ code changed and the following need to be verified ----
+------ May 29 2015
+--------------------------------------------------------------
 Network Training Examples:
 ==========================
 * rnnlu.config
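The grep check added to the README above can be narrowed to just the numbers of interest. A sketch, assuming exactly the `Finished Epoch[N]: [Training Set] ... Train Loss Per Sample = X` line format shown in the example; the helper name `extract_loss` is made up here:

```shell
# Sketch: print "epoch loss" pairs from CNTK training-log lines.
# Assumes the log format shown in the README example above.
# Reads the named files, or stdin if no file is given.
extract_loss() {
  sed -n 's/.*Finished Epoch\[\([0-9]*\)\]: \[Training Set\].*Train Loss Per Sample = \([0-9.]*\).*/\1 \2/p' "$@"
}

# Usage: extract_loss log_LSTM_LSTMTest.log
```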


@@ -110,7 +110,7 @@ LSTM=[
 randomize=None
 # number of utterances to be allocated for each minibatch
-nbruttsineachrecurrentiter=1
+nbruttsineachrecurrentiter=10
 # if writerType is set, we will cache to a binary file
 # if the binary file exists, we will use it instead of parsing this file
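For context (not part of the diff): `nbruttsineachrecurrentiter` controls how many utterances the reader packs into each recurrent minibatch, so the change above moves training from one sequence at a time to ten sequences processed in parallel. A hypothetical fragment of the reader section; only `randomize` and `nbruttsineachrecurrentiter` appear in the actual config above:

```
reader=[
    randomize=None
    # pack 10 utterances into each recurrent minibatch;
    # with 1, the LSTM trains on a single sequence at a time
    nbruttsineachrecurrentiter=10
]
```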
@@ -210,7 +210,9 @@ LSTM=[
 # if writerType is set, we will cache to a binary file
 # if the binary file exists, we will use it instead of parsing this file
 # writerType=BinaryReader
+nbruttsineachrecurrentiter=10
+#### write definition
 wfile=$ExpDir$\sequenceSentence.valid.bin
 #wsize - initial size of the file in MB