Initial demo sample structure and content

Addressed CR comments.
This commit is contained in:
Philipp Kranen 2015-11-23 15:31:10 +01:00
Parent 8bcc6b2530
Commit 7bc03002af
61 changed files: 58409 additions and 809 deletions

Demos/Image/Readme.md Normal file

@ -0,0 +1,12 @@
There are several examples for training different networks on image corpora
in the folder 'ExampleSetups/Image/' including:
* MNIST
* CIFAR-10
* ImageNet
For these examples the data has to be downloaded and converted
(Python scripts are provided in the corresponding folders).
These examples also show how to train convolutional neural networks (CNNs) with CNTK.
We recommend starting with the MNIST example.
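As a rough illustration of the conversion step mentioned above, the sketch below reads the raw MNIST IDX files and writes one text line per image (the digit label followed by 784 pixel values), i.e. the kind of flat text format the UCI-style readers in these examples consume. The file names and the exact output layout are assumptions; the reference scripts are the ones shipped in 'ExampleSetups/Image/MNIST'.

```python
# Hypothetical sketch of an MNIST conversion script: read the IDX files and
# write "label pixel_1 ... pixel_784" per line. Names and format are assumptions;
# use the Python scripts in ExampleSetups/Image/MNIST for the real conversion.
import gzip
import struct

import numpy as np

def load_idx_images(path):
    with gzip.open(path, "rb") as f:
        _, n, rows, cols = struct.unpack(">IIII", f.read(16))
        return np.frombuffer(f.read(), dtype=np.uint8).reshape(n, rows * cols)

def load_idx_labels(path):
    with gzip.open(path, "rb") as f:
        _, n = struct.unpack(">II", f.read(8))
        return np.frombuffer(f.read(), dtype=np.uint8)

def convert(image_gz, label_gz, out_txt):
    images, labels = load_idx_images(image_gz), load_idx_labels(label_gz)
    with open(out_txt, "w") as out:
        for label, pixels in zip(labels, images):
            out.write(str(label) + " " + " ".join(map(str, pixels)) + "\n")

if __name__ == "__main__":
    convert("train-images-idx3-ubyte.gz", "train-labels-idx1-ubyte.gz", "Train-28x28.txt")
```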

Demos/Readme.md Normal file

@ -0,0 +1,25 @@
# CNTK Demos and Example Setups
This folder contains a few self-contained demos to get started with CNTK.
The data for the demos is contained in the corresponding Data folders.
Each demo folder has a Readme file that explains how to run it on Windows and Linux.
How to run the demos on Philly (https://phillywebportal.azurewebsites.net/index.aspx) is
explained in the Philly portal wiki (Philly is an internal GPU cluster for Microsoft production runs).
The demos cover different domains such as Speech and Text classification
and show different types of networks including FF, CNN, RNN and LSTM.
Further examples are provided in the folder 'ExampleSetups'.
A popular example is the MNIST handwritten digits classification task.
You can find this example in 'ExampleSetups/Image/MNIST'.
The examples in 'ExampleSetups' might require downloading data which in some cases is not free of charge.
See individual folders for more information.
The four examples shown in the table below provide a good introduction to CNTK.
Additional more complex examples can be found in the 'ExampleSetups' folder.
|Folder                    | Domain                                            | Network types    |
|:-------------------------|:--------------------------------------------------|:-----------------|
|Demos/Simple2d            | Synthetic 2d data                                 | FF (CPU and GPU) |
|Demos/Speech              | Speech data (CMU AN4)                             | FF and LSTM      |
|Demos/Text                | Text data (Penn Treebank)                         | RNN              |
|ExampleSetups/Image/MNIST | Image data (MNIST handwritten digit recognition)  | CNN              |


@ -1,100 +0,0 @@
RootDir=..
command=Simple_Demo:Simple_Demo_Output
# deviceId=-1 for CPU, >=0 for GPU devices
DeviceNumber=-1
#stderr=Demo
precision=float
modelPath=models/simple.dnn
deviceId=$DeviceNumber$
outputNodeNames=ScaledLogLikelihood
traceLevel=1
#######################################
# TRAINING CONFIG (Simple, Fixed LR) #
#######################################
Simple_Demo=[
action=train
# Notation xxx:yyy*n:zzz is equivalent to xxx,
# then yyy repeated n times, then zzz
# example: 10:20*3:5 is equivalent to 10:20:20:20:5
SimpleNetworkBuilder=[
# 2 input, 2 50-element hidden, 2 output
layerSizes=2:50*2:2
trainingCriterion=CrossEntropyWithSoftmax
evalCriterion=ErrorPrediction
layerTypes=Sigmoid
initValueScale=1.0
applyMeanVarNorm=true
uniformInit=true
needPrior=true
]
SGD=[
# epochSize=0 means epochSize is the size of
# the training set. Must be evenly divisible
# into number of data frames.
epochSize=0
minibatchSize=25
learningRatesPerMB=0.5:0.2*20:0.1
momentumPerMB=0.9
dropoutRate=0.0
maxEpochs=50
]
# Parameter values for the reader
reader=[
# reader to use
readerType=UCIFastReader
file=$RootDir$/Demos/Simple/SimpleDataTrain.txt
miniBatchMode=Partial
randomize=Auto
verbosity=1
features=[
dim=2 # two-dimensional input data
start=0 # Start with first element on line
]
labels=[
start=2 # Skip two elements
dim=1 # One label dimension
labelDim=2 # Two labels possible
labelMappingFile=$RootDir$/Demos/Simple/SimpleMapping.txt
]
]
]
#######################################
# OUTPUT RESULTS (Simple) #
#######################################
Simple_Demo_Output=[
action=write
# Parameter values for the reader
reader=[
# reader to use
readerType=UCIFastReader
file=$RootDir$/Demos/Simple/SimpleDataTest.txt
features=[
dim=2
start=0
]
labels=[
start=2
dim=1
labelDim=2
labelMappingFile=$RootDir$/Demos/Simple/SimpleMapping.txt
]
]
outputPath=SimpleOutput # Dump output as text
]


@ -1,114 +0,0 @@
# This is a configuration for parallel training of a simple feed-forward neural network using data-parallel SGD.
# The configuration is identical to Simple.config except for a few additional parallelism-related options.
RootDir=..
command=Simple_multigpu_Demo:Simple_multigpu_Demo_Output
# deviceId=-1 for CPU, >=0 for GPU devices
DeviceNumber=Auto
stderr=Demo
precision=float
modelPath=models_multigpu/simple.dnn
deviceId=$DeviceNumber$
outputNodeNames=ScaledLogLikelihood
traceLevel=1
parallelTrain=true
#######################################
# TRAINING CONFIG (Simple, Fixed LR) #
#######################################
Simple_multigpu_Demo=[
action=train
# Notation xxx:yyy*n:zzz is equivalent to xxx,
# then yyy repeated n times, then zzz
# example: 10:20*3:5 is equivalent to 10:20:20:20:5
SimpleNetworkBuilder=[
# 2 input, 2 50-element hidden, 2 output
layerSizes=2:50*2:2
trainingCriterion=CrossEntropyWithSoftmax
evalCriterion=ErrorPrediction
layerTypes=Sigmoid
initValueScale=1.0
applyMeanVarNorm=true
uniformInit=true
needPrior=true
]
SGD=[
# epochSize=0 means epochSize is the size of
# the training set. Must be evenly divisible
# into number of data frames.
epochSize=0
minibatchSize=25
learningRatesPerMB=0.5:0.2*20:0.1
momentumPerMB=0.9
dropoutRate=0.0
maxEpochs=50
ParallelTrain=[
parallelizationMethod=DataParallelSGD
#distributedMBReading=true
parallelizationStartEpoch=2
DataParallelSGD=[
gradientBits=1
#useZeroThresholdFor1BitQuantization=false
]
]
]
# Parameter values for the reader
reader=[
# reader to use
readerType=UCIFastReader
file=$RootDir$/Demos/Simple/SimpleDataTrain.txt
miniBatchMode=Partial
randomize=Auto
verbosity=1
features=[
dim=2 # two-dimensional input data
start=0 # Start with first element on line
]
labels=[
start=2 # Skip two elements
dim=1 # One label dimension
labelDim=2 # Two labels possible
labelMappingFile=$RootDir$/Demos/Simple/SimpleMapping.txt
]
]
]
#######################################
# OUTPUT RESULTS (Simple) #
#######################################
Simple_multigpu_Demo_Output=[
action=write
# Parameter values for the reader
reader=[
# reader to use
readerType=UCIFastReader
file=$RootDir$/Demos/Simple/SimpleDataTest.txt
features=[
dim=2
start=0
]
labels=[
start=2
dim=1
labelDim=2
labelMappingFile=$RootDir$/Demos/Simple/SimpleMapping.txt
]
]
outputPath=SimpleOutput # Dump output as text
]


@ -1,6 +1,6 @@
%%
% Create training and test sets for the simple CNTK demo. Plot the results
% Create some 2-dimensional data for the testing the CNTK Toolkit
% Create 2-dimensional data for testing the CNTK Toolkit
N = 10000;
x = 2*(rand(N,1) - 0.5); % Uniform from -1 to 1

Image file: 177 KiB (before and after)
Image file: 11 KiB (before and after)
Image file: 22 KiB (before and after)


@ -0,0 +1,150 @@
# This is a configuration for parallel training of a simple feed-forward neural network using data-parallel SGD.
# The configuration is identical to Simple.config except for a few additional parallelism-related options.
# Parameters can be overwritten on the command line
# for example: cntk configFile=myConfigFile RootDir=../..
# For running from Visual Studio add
# currentDirectory=$(SolutionDir)/<path to corresponding data folder>
RootDir = ".."
ConfigDir = "$RootDir$/Config"
DataDir = "$RootDir$/Data"
OutputDir = "$RootDir$/Output"
ModelDir = "$OutputDir$/Models"
# deviceId = -1 for CPU, >= 0 for GPU devices, "auto" chooses the best GPU, or CPU if no usable GPU is available
deviceId = "auto"
command = Multigpu_Demo_Train:Multigpu_Demo_Test
precision = "float"
traceLevel = 1
modelPath = "$ModelDir$/multigpu.dnn"
outputNodeNames = ScaledLogLikelihood
stderr = "$OutputDir$/DemoOut"
parallelTrain = true
#######################################
# TRAINING CONFIG #
#######################################
Multigpu_Demo_Train=[
action = "train"
# Notation xxx:yyy*n:zzz is equivalent to xxx, then yyy repeated n times, then zzz
# Example: 10:20*3:5 is equivalent to 10:20:20:20:5
SimpleNetworkBuilder = [
# 2 input, 2 50-element hidden, 2 output
layerSizes = 2:50*2:2
trainingCriterion = "CrossEntropyWithSoftmax"
evalCriterion = "ErrorPrediction"
layerTypes = "Sigmoid"
initValueScale = 1.0
applyMeanVarNorm = true
uniformInit = true
needPrior = true
]
SGD = [
# epochSize = 0 means epochSize is the size of the training set
epochSize = 0
minibatchSize = 25
learningRatesPerMB = 0.5:0.2*20:0.1
momentumPerMB = 0.9
dropoutRate = 0.0
maxEpochs = 10
# Additional optional parameters are: distributedMBReading
parallelTrain = [
parallelizationMethod = "DataParallelSGD"
parallelizationStartEpoch = 2
# Additional optional parameters are: useZeroThresholdFor1BitQuantization
dataParallelSGD = [
gradientBits = 1
]
]
]
# Parameter values for the reader
reader = [
readerType = "UCIFastReader"
file = "$DataDir$/SimpleDataTrain.txt"
miniBatchMode = "partial"
randomize = "auto"
verbosity = 1
features = [
dim = 2 # two-dimensional input data
start = 0 # Start with first element on line
]
labels = [
start = 2 # Skip two elements
dim = 1 # One label dimension
labelDim = 2 # Two labels possible
labelMappingFile = "$DataDir$/SimpleMapping.txt"
]
]
]
########################################
# TEST RESULTS #
# (computes prediction error and #
# perplexity on a test set and #
# writes the output to the console.) #
########################################
Multigpu_Demo_Test=[
action = "test"
# Parameter values for the reader
reader = [
readerType = "UCIFastReader"
file = "$DataDir$/SimpleDataTest.txt"
randomize = "none"
features = [
dim = 2
start = 0
]
labels = [
start = 2
dim = 1
labelDim = 2
labelMappingFile = "$DataDir$/SimpleMapping.txt"
]
]
]
########################################
# OUTPUT RESULTS #
# (Computes the labels for a test set #
# and writes the results to a file.) #
########################################
Multigpu_Demo_Output=[
action = "write"
# Parameter values for the reader
reader = [
readerType = "UCIFastReader"
file = "$DataDir$/SimpleDataTest.txt"
randomize = "none"
features = [
dim = 2
start = 0
]
labels = [
start = 2
dim = 1
labelDim = 2
labelMappingFile = "$DataDir$/SimpleMapping.txt"
]
]
outputPath = "$OutputDir$/MultigpuOutput" # Dump output as text
]
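The `parallelTrain` block above turns on data-parallel SGD with 1-bit gradient quantization (`gradientBits = 1`), starting from epoch 2. Roughly, each worker exchanges its gradients using one bit per value and carries the resulting quantization error over to the next minibatch. The NumPy fragment below is only a schematic illustration of that idea, not CNTK's actual 1-bit SGD implementation.

```python
# Schematic 1-bit quantization with error feedback (illustration only; CNTK's
# data-parallel 1-bit SGD is implemented inside the toolkit itself).
import numpy as np

def one_bit_quantize(grad, residual):
    corrected = grad + residual              # add the error left over from last time
    scale = np.mean(np.abs(corrected))       # one shared magnitude for this tensor
    quantized = np.where(corrected >= 0, scale, -scale)
    return quantized, corrected - quantized  # value to exchange, error to carry over

residual = np.zeros(4)
for grad in (np.array([0.3, -0.1, 0.05, -0.4]), np.array([0.2, 0.0, -0.3, 0.1])):
    sent, residual = one_bit_quantize(grad, residual)
```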


@ -0,0 +1,134 @@
# Parameters can be overwritten on the command line
# for example: cntk configFile=myConfigFile RootDir=../..
# For running from Visual Studio add
# currentDirectory=$(SolutionDir)/<path to corresponding data folder>
RootDir = ".."
ConfigDir = "$RootDir$/Config"
DataDir = "$RootDir$/Data"
OutputDir = "$RootDir$/Output"
ModelDir = "$OutputDir$/Models"
# deviceId=-1 for CPU, >=0 for GPU devices, "auto" chooses the best GPU, or CPU if no usable GPU is available
deviceId = -1
command = Simple_Demo_Train:Simple_Demo_Test
precision = "float"
traceLevel = 1
modelPath = "$ModelDir$/simple.dnn"
outputNodeNames = ScaledLogLikelihood
#######################################
# TRAINING CONFIG #
#######################################
Simple_Demo_Train = [
action = "train"
# Notation xxx:yyy*n:zzz is equivalent to xxx, then yyy repeated n times, then zzz
# Example: 10:20*3:5 is equivalent to 10:20:20:20:5
SimpleNetworkBuilder = [
# 2 input, 2 50-element hidden, 2 output
layerSizes = 2:50*2:2
trainingCriterion = "CrossEntropyWithSoftmax"
evalCriterion = "ErrorPrediction"
layerTypes = "Sigmoid"
initValueScale = 1.0
applyMeanVarNorm = true
uniformInit = true
needPrior = true
]
SGD = [
# epochSize = 0 means epochSize is the size of the training set
epochSize = 0
minibatchSize = 25
learningRatesPerMB = 0.5:0.2*20:0.1
momentumPerMB = 0.9
dropoutRate = 0.0
maxEpochs = 10
]
# Parameter values for the reader
reader = [
readerType = "UCIFastReader"
file = "$DataDir$/SimpleDataTrain.txt"
miniBatchMode = "partial"
randomize = "auto"
verbosity = 1
features = [
dim = 2 # two-dimensional input data
start = 0 # Start with first element on line
]
labels = [
start = 2 # Skip two elements
dim = 1 # One label dimension
labelDim = 2 # Two labels possible
labelMappingFile = "$DataDir$/SimpleMapping.txt"
]
]
]
########################################
# TEST RESULTS #
# (computes prediction error and #
# perplexity on a test set and #
# writes the output to the console.) #
########################################
Simple_Demo_Test = [
action = "test"
# Parameter values for the reader
reader = [
readerType = "UCIFastReader"
file = "$DataDir$/SimpleDataTest.txt"
randomize = "none"
features = [
dim = 2
start = 0
]
labels = [
start = 2
dim = 1
labelDim = 2
labelMappingFile = "$DataDir$/SimpleMapping.txt"
]
]
]
########################################
# OUTPUT RESULTS #
# (Computes the labels for a test set #
# and writes the results to a file.) #
########################################
Simple_Demo_Output=[
action = "write"
# Parameter values for the reader
reader = [
readerType = "UCIFastReader"
file = "$DataDir$/SimpleDataTest.txt"
randomize = "none"
features = [
dim = 2
start = 0
]
labels = [
start = 2
dim = 1
labelDim = 2
labelMappingFile = "$DataDir$/SimpleMapping.txt"
]
]
outputPath = "$OutputDir$/SimpleOutput" # Dump output as text
]
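The `reader` blocks above tell `UCIFastReader` how to interpret each line of `SimpleDataTrain.txt`/`SimpleDataTest.txt`: the first two columns (`start = 0`, `dim = 2`) are the features, and the third column (`start = 2`, `dim = 1`, `labelDim = 2`) is the class label, mapped through `SimpleMapping.txt`. A minimal Python sketch of that per-line layout, for illustration only:

```python
# Illustration of the data layout described by the reader config above:
# each line is "feature_1 feature_2 label", with label in {0, 1}.
def read_simple2d(path):
    features, labels = [], []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) != 3:
                continue  # ignore blank or malformed lines
            features.append((float(parts[0]), float(parts[1])))
            labels.append(int(float(parts[2])))
    return features, labels

# e.g. xs, ys = read_simple2d("SimpleDataTrain.txt")
```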


@ -1,94 +1,94 @@
-1 -1 0
-1 -0.99 0
-1 -0.98 0
-1 -0.97 0
-1 -0.96 0
-1 -0.95 0
-1 -0.94 0
-1 -0.93 0
-1 -0.92 0
-1 -0.91 0
-1 -0.9 0
-1 -0.89 0
-1 -0.88 0
-1 -0.87 0
-1 -0.86 0
-1 -0.85 0
-1 -0.84 0
-1 -0.83 0
-1 -0.82 0
-1 -0.81 0
-1 -0.8 0
-1 -0.79 0
-1 -0.78 0
-1 -0.77 0
-1 -0.76 0
-1 -0.75 0
-1 -0.74 0
-1 -0.73 0
-1 -0.72 0
-1 -0.71 0
-1 -0.7 0
-1 -0.69 0
-1 -0.68 0
-1 -0.67 0
-1 -0.66 0
-1 -0.65 0
-1 -0.64 0
-1 -0.63 0
-1 -0.62 0
-1 -0.61 0
-1 -0.6 0
-1 -0.59 0
-1 -0.58 0
-1 -0.57 0
-1 -0.56 0
-1 -0.55 0
-1 -0.54 0
-1 -0.53 0
-1 -0.52 0
-1 -0.51 0
-1 -0.5 0
-1 -0.49 0
-1 -0.48 0
-1 -0.47 0
-1 -0.46 0
-1 -0.45 0
-1 -0.44 0
-1 -0.43 0
-1 -0.42 0
-1 -0.41 0
-1 -0.4 0
-1 -0.39 0
-1 -0.38 0
-1 -0.37 0
-1 -0.36 0
-1 -0.35 0
-1 -0.34 0
-1 -0.33 0
-1 -0.32 0
-1 -0.31 0
-1 -0.3 0
-1 -0.29 0
-1 -0.28 0
-1 -0.27 0
-1 -0.26 0
-1 -0.25 0
-1 -0.24 0
-1 -0.23 0
-1 -0.22 0
-1 -0.21 0
-1 -0.2 0
-1 -0.19 0
-1 -0.18 0
-1 -0.17 0
-1 -0.16 0
-1 -0.15 0
-1 -0.14 0
-1 -0.13 0
-1 -0.12 0
-1 -0.11 0
-1 -0.1 0
-1 -1 1
-1 -0.99 1
-1 -0.98 1
-1 -0.97 1
-1 -0.96 1
-1 -0.95 1
-1 -0.94 1
-1 -0.93 1
-1 -0.92 1
-1 -0.91 1
-1 -0.9 1
-1 -0.89 1
-1 -0.88 1
-1 -0.87 1
-1 -0.86 1
-1 -0.85 1
-1 -0.84 1
-1 -0.83 1
-1 -0.82 1
-1 -0.81 1
-1 -0.8 1
-1 -0.79 1
-1 -0.78 1
-1 -0.77 1
-1 -0.76 1
-1 -0.75 1
-1 -0.74 1
-1 -0.73 1
-1 -0.72 1
-1 -0.71 1
-1 -0.7 1
-1 -0.69 1
-1 -0.68 1
-1 -0.67 1
-1 -0.66 1
-1 -0.65 1
-1 -0.64 1
-1 -0.63 1
-1 -0.62 1
-1 -0.61 1
-1 -0.6 1
-1 -0.59 1
-1 -0.58 1
-1 -0.57 1
-1 -0.56 1
-1 -0.55 1
-1 -0.54 1
-1 -0.53 1
-1 -0.52 1
-1 -0.51 1
-1 -0.5 1
-1 -0.49 1
-1 -0.48 1
-1 -0.47 1
-1 -0.46 1
-1 -0.45 1
-1 -0.44 1
-1 -0.43 1
-1 -0.42 1
-1 -0.41 1
-1 -0.4 1
-1 -0.39 1
-1 -0.38 1
-1 -0.37 1
-1 -0.36 1
-1 -0.35 1
-1 -0.34 1
-1 -0.33 1
-1 -0.32 1
-1 -0.31 1
-1 -0.3 1
-1 -0.29 1
-1 -0.28 1
-1 -0.27 1
-1 -0.26 1
-1 -0.25 1
-1 -0.24 1
-1 -0.23 1
-1 -0.22 1
-1 -0.21 1
-1 -0.2 1
-1 -0.19 1
-1 -0.18 1
-1 -0.17 1
-1 -0.16 1
-1 -0.15 1
-1 -0.14 1
-1 -0.13 1
-1 -0.12 1
-1 -0.11 1
-1 -0.1 1
-1 -0.09 0
-1 -0.08 0
-1 -0.07 0
@ -198,98 +198,98 @@
-1 0.97 0
-1 0.98 0
-1 0.99 0
-1 1 0
0 -1 0
0 -0.99 0
0 -0.98 0
0 -0.97 0
0 -0.96 0
0 -0.95 0
0 -0.94 0
0 -0.93 0
0 -0.92 0
0 -0.91 0
0 -0.9 0
0 -0.89 0
0 -0.88 0
0 -0.87 0
0 -0.86 0
0 -0.85 0
0 -0.84 0
0 -0.83 0
0 -0.82 0
0 -0.81 0
0 -0.8 0
0 -0.79 0
0 -0.78 0
0 -0.77 0
0 -0.76 0
0 -0.75 0
0 -0.74 0
0 -0.73 0
0 -0.72 0
0 -0.71 0
0 -0.7 0
0 -0.69 0
0 -0.68 0
0 -0.67 0
0 -0.66 0
0 -0.65 0
0 -0.64 0
0 -0.63 0
0 -0.62 0
0 -0.61 0
0 -0.6 0
0 -0.59 0
0 -0.58 0
0 -0.57 0
0 -0.56 0
0 -0.55 0
0 -0.54 0
0 -0.53 0
0 -0.52 0
0 -0.51 0
0 -0.5 0
0 -0.49 0
0 -0.48 0
0 -0.47 0
0 -0.46 0
0 -0.45 0
0 -0.44 0
0 -0.43 0
0 -0.42 0
0 -0.41 0
0 -0.4 0
0 -0.39 0
0 -0.38 0
0 -0.37 0
0 -0.36 0
0 -0.35 0
0 -0.34 0
0 -0.33 0
0 -0.32 0
0 -0.31 0
0 -0.3 0
0 -0.29 0
0 -0.28 0
0 -0.27 0
0 -0.26 0
0 -0.25 0
0 -0.24 0
0 -0.23 0
0 -0.22 0
0 -0.21 0
0 -0.2 0
0 -0.19 0
0 -0.18 0
0 -0.17 0
0 -0.16 0
0 -0.15 0
0 -0.14 0
0 -0.13 0
0 -0.12 0
0 -0.11 0
0 -0.1 0
-1 0 0
0 -1 1
0 -0.99 1
0 -0.98 1
0 -0.97 1
0 -0.96 1
0 -0.95 1
0 -0.94 1
0 -0.93 1
0 -0.92 1
0 -0.91 1
0 -0.9 1
0 -0.89 1
0 -0.88 1
0 -0.87 1
0 -0.86 1
0 -0.85 1
0 -0.84 1
0 -0.83 1
0 -0.82 1
0 -0.81 1
0 -0.8 1
0 -0.79 1
0 -0.78 1
0 -0.77 1
0 -0.76 1
0 -0.75 1
0 -0.74 1
0 -0.73 1
0 -0.72 1
0 -0.71 1
0 -0.7 1
0 -0.69 1
0 -0.68 1
0 -0.67 1
0 -0.66 1
0 -0.65 1
0 -0.64 1
0 -0.63 1
0 -0.62 1
0 -0.61 1
0 -0.6 1
0 -0.59 1
0 -0.58 1
0 -0.57 1
0 -0.56 1
0 -0.55 1
0 -0.54 1
0 -0.53 1
0 -0.52 1
0 -0.51 1
0 -0.5 1
0 -0.49 1
0 -0.48 1
0 -0.47 1
0 -0.46 1
0 -0.45 1
0 -0.44 1
0 -0.43 1
0 -0.42 1
0 -0.41 1
0 -0.4 1
0 -0.39 1
0 -0.38 1
0 -0.37 1
0 -0.36 1
0 -0.35 1
0 -0.34 1
0 -0.33 1
0 -0.32 1
0 -0.31 1
0 -0.3 1
0 -0.29 1
0 -0.28 1
0 -0.27 1
0 -0.26 1
0 -0.25 1
0 -0.24 1
0 -0.23 1
0 -0.22 1
0 -0.21 1
0 -0.2 1
0 -0.19 1
0 -0.18 1
0 -0.17 1
0 -0.16 1
0 -0.15 1
0 -0.14 1
0 -0.13 1
0 -0.12 1
0 -0.11 1
0 -0.1 1
0 -0.09 0
0 -0.08 0
0 -0.07 0
@ -400,106 +400,106 @@
0 0.98 0
0 0.99 0
0 1 0
1 -1 0
1 -0.99 0
1 -0.98 0
1 -0.97 0
1 -0.96 0
1 -0.95 0
1 -0.94 0
1 -0.93 0
1 -0.92 0
1 -0.91 0
1 -0.9 0
1 -0.89 0
1 -0.88 0
1 -0.87 0
1 -0.86 0
1 -0.85 0
1 -0.84 0
1 -0.83 0
1 -0.82 0
1 -0.81 0
1 -0.8 0
1 -0.79 0
1 -0.78 0
1 -0.77 0
1 -0.76 0
1 -0.75 0
1 -0.74 0
1 -0.73 0
1 -0.72 0
1 -0.71 0
1 -0.7 0
1 -0.69 0
1 -0.68 0
1 -0.67 0
1 -0.66 0
1 -0.65 0
1 -0.64 0
1 -0.63 0
1 -0.62 0
1 -0.61 0
1 -0.6 0
1 -0.59 0
1 -0.58 0
1 -0.57 0
1 -0.56 0
1 -0.55 0
1 -0.54 0
1 -0.53 0
1 -0.52 0
1 -0.51 0
1 -0.5 0
1 -0.49 0
1 -0.48 0
1 -0.47 0
1 -0.46 0
1 -0.45 0
1 -0.44 0
1 -0.43 0
1 -0.42 0
1 -0.41 0
1 -0.4 0
1 -0.39 0
1 -0.38 0
1 -0.37 0
1 -0.36 0
1 -0.35 0
1 -0.34 0
1 -0.33 0
1 -0.32 0
1 -0.31 0
1 -0.3 0
1 -0.29 0
1 -0.28 0
1 -0.27 0
1 -0.26 0
1 -0.25 0
1 -0.24 0
1 -0.23 0
1 -0.22 0
1 -0.21 0
1 -0.2 0
1 -0.19 0
1 -0.18 0
1 -0.17 0
1 -0.16 0
1 -0.15 0
1 -0.14 0
1 -0.13 0
1 -0.12 0
1 -0.11 0
1 -0.1 0
1 -0.09 0
1 -0.08 0
1 -0.07 0
1 -0.06 0
1 -0.05 0
1 -0.04 0
1 -0.03 0
1 -0.02 0
1 -0.01 0
1 -1 1
1 -0.99 1
1 -0.98 1
1 -0.97 1
1 -0.96 1
1 -0.95 1
1 -0.94 1
1 -0.93 1
1 -0.92 1
1 -0.91 1
1 -0.9 1
1 -0.89 1
1 -0.88 1
1 -0.87 1
1 -0.86 1
1 -0.85 1
1 -0.84 1
1 -0.83 1
1 -0.82 1
1 -0.81 1
1 -0.8 1
1 -0.79 1
1 -0.78 1
1 -0.77 1
1 -0.76 1
1 -0.75 1
1 -0.74 1
1 -0.73 1
1 -0.72 1
1 -0.71 1
1 -0.7 1
1 -0.69 1
1 -0.68 1
1 -0.67 1
1 -0.66 1
1 -0.65 1
1 -0.64 1
1 -0.63 1
1 -0.62 1
1 -0.61 1
1 -0.6 1
1 -0.59 1
1 -0.58 1
1 -0.57 1
1 -0.56 1
1 -0.55 1
1 -0.54 1
1 -0.53 1
1 -0.52 1
1 -0.51 1
1 -0.5 1
1 -0.49 1
1 -0.48 1
1 -0.47 1
1 -0.46 1
1 -0.45 1
1 -0.44 1
1 -0.43 1
1 -0.42 1
1 -0.41 1
1 -0.4 1
1 -0.39 1
1 -0.38 1
1 -0.37 1
1 -0.36 1
1 -0.35 1
1 -0.34 1
1 -0.33 1
1 -0.32 1
1 -0.31 1
1 -0.3 1
1 -0.29 1
1 -0.28 1
1 -0.27 1
1 -0.26 1
1 -0.25 1
1 -0.24 1
1 -0.23 1
1 -0.22 1
1 -0.21 1
1 -0.2 1
1 -0.19 1
1 -0.18 1
1 -0.17 1
1 -0.16 1
1 -0.15 1
1 -0.14 1
1 -0.13 1
1 -0.12 1
1 -0.11 1
1 -0.1 1
1 -0.09 1
1 -0.08 1
1 -0.07 1
1 -0.06 1
1 -0.05 1
1 -0.04 1
1 -0.03 1
1 -0.02 1
1 -0.01 1
1 0 0
1 0.01 0
1 0.02 0


Demos/Simple2d/Readme.md Normal file

@ -0,0 +1,80 @@
# CNTK example: Simple2d
## Overview
|Data:     |Two dimensional synthetic data |
|:---------|:------------------------------|
|Purpose:  |Showcase how to train a simple CNTK network (CPU and GPU) and how to use it for scoring (decoding) |
|Network:  |SimpleNetworkBuilder, 2 hidden layers with 50 sigmoid nodes each, cross entropy with softmax |
|Training: |Stochastic gradient descent with momentum |
|Comments: |There are two config files: Simple.config uses a single CPU or GPU, Multigpu.config uses data-parallel SGD for training on multiple GPUs |
## Running the example
### Getting the data
The data for this example is already contained in the folder Demos/Simple2d/Data/.
### Setup
Compile the sources to generate the cntk executable (not required if you downloaded the binaries).
__Windows:__ Add the folder of the cntk executable to your path
(e.g. `set PATH=%PATH%;c:/src/cntk/x64/Debug/;`)
or prefix the call to the cntk executable with the corresponding folder.
__Linux:__ Add the folder of the cntk executable to your path
(e.g. `export PATH=$PATH:$HOME/src/cntk/build/debug/bin/`)
or prefix the call to the cntk executable with the corresponding folder.
### Run
Run the example from the Demos/Simple2d/Data folder using:
`cntk configFile=../Config/Simple.config`
or run from any folder and specify the Data folder as the `currentDirectory`,
e.g. running from the Demos/Simple2d folder using:
`cntk configFile=Config/Simple.config currentDirectory=Data`
The output folder will be created inside Demos/Simple2d/.
## Details
### Config files
The config files define a `RootDir` variable and several other variables for directories.
The `ConfigDir` and `ModelDir` variables define the folders for additional config files and for model files.
These variables will be overwritten when running on the Philly cluster.
__It is therefore recommended to generally use `ConfigDir` and `ModelDir` in all config files.__
To run on CPU set `deviceId = -1`; to run on GPU set `deviceId` to "auto" or a specific value >= 0.
Both config files are nearly identical.
Multigpu.config has some additional parameters for parallel training (see parallelTrain in the file).
Both files define the following three commands: train, test and output.
By default only train and test are executed:
`command=Simple_Demo_Train:Simple_Demo_Test`
The prediction error on the test data is written to stdout.
The trained models for each epoch are stored in the output models folder.
In the case of the Multigpu config the console output is written to a file `stderr = DemoOut`.
### Additional files
The 'AdditionalFiles' folder contains the Matlab script that generates the
training and test data as well as the plots that are provided in the folder.
The data is synthetic 2d data representing two classes that are separated by a sinusoidal boundary.
SimpleDemoDataReference.png shows a plot of the training data.
![training data plot](AdditionalFiles/SimpleDemoDataReference.png)
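For readers without Matlab, a rough Python equivalent of that generation step is sketched below (two classes separated by a sinusoidal boundary; the exact boundary function and constants are assumptions, and the Matlab script in 'AdditionalFiles' remains the reference generator):

```python
# Rough Python analogue of the Matlab data generator (constants are guesses;
# the script in AdditionalFiles/ is the reference implementation).
import numpy as np

def make_simple2d(n=10000, seed=0):
    rng = np.random.default_rng(seed)
    x = 2 * (rng.random(n) - 0.5)           # uniform in [-1, 1], as in the Matlab script
    y = 2 * (rng.random(n) - 0.5)
    boundary = 0.5 * np.sin(np.pi * x)      # assumed sinusoidal class boundary
    labels = (y > boundary).astype(int)     # class 1 above the boundary, class 0 below
    return np.column_stack([x, y, labels])

# np.savetxt("SimpleDataTrain.txt", make_simple2d(), fmt="%.2f %.2f %d")
```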
## Using a trained model
The Test (e.g. Simple_Demo_Test) and the Output (e.g. Simple_Demo_Output) commands
specified in the config files use the trained model to compute labels for data specified in the SimpleDataTest.txt file.
The Test command computes prediction error, cross entropy and perplexity for the test set and outputs them to the console.
The Output command writes for each test instance the likelihood per label to a file `outputPath = $OutputDir$/SimpleOutput`.
The model that is used to compute the labels in these commands is defined
by the `modelPath` variable at the beginning of the file (`modelPath=$ModelDir$/simple.dnn`).


@ -0,0 +1,53 @@
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.0 Transitional//EN">
<html>
<head>
<title>AN4 License Terms</title>
<meta http-equiv="content-type"
content="text/html; charset=ISO-8859-1">
</head>
<body>
<h2>AN4 License Terms</h2>
<p>This audio database is free for use for any purpose (commercial or otherwise)
subject to the restrictions detailed below.</p>
<pre>
/* ====================================================================
* Copyright (c) 1991-2005 Carnegie Mellon University. All rights
* reserved.
*
* Redistribution and use in source and binary forms, with or without
* modification, are permitted provided that the following conditions
* are met:
*
* 1. Redistributions of source code must retain the above copyright
* notice, this list of conditions and the following disclaimer.
*
* 2. Redistributions in binary form must reproduce the above copyright
* notice, this list of conditions and the following disclaimer in
* the documentation and/or other materials provided with the
* distribution.
*
* This work was supported in part by funding from the Defense Advanced
* Research Projects Agency and the National Science Foundation of the
* United States of America, and the CMU Sphinx Speech Consortium.
*
* THIS SOFTWARE IS PROVIDED BY CARNEGIE MELLON UNIVERSITY ``AS IS'' AND
* ANY EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
* THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
* PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL CARNEGIE MELLON UNIVERSITY
* NOR ITS EMPLOYEES BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
* SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
* LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
* DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
* THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
* (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
* OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
*
* ====================================================================
*/
</pre>
</body>
</html>


@ -0,0 +1,93 @@
# Parameters can be overwritten on the command line
# for example: cntk configFile=myConfigFile RootDir=../..
# For running from Visual Studio add
# currentDirectory=$(SolutionDir)/<path to corresponding data folder>
RootDir = ".."
ConfigDir = "$RootDir$/Config"
DataDir = "$RootDir$/Data"
OutputDir = "$RootDir$/Output"
ModelDir = "$OutputDir$/Models"
# deviceId=-1 for CPU, >=0 for GPU devices, "auto" chooses the best GPU, or CPU if no usable GPU is available
deviceId = -1
command = speechTrain
precision = "float"
traceLevel = "1"
modelPath = "$ModelDir$/cntkSpeechFF.dnn"
parallelTrain = true
#######################################
# TRAINING CONFIG #
#######################################
speechTrain = [
action = "train"
SimpleNetworkBuilder = [
layerSizes = 363:512:512:132
trainingCriterion = "CrossEntropyWithSoftmax"
evalCriterion = "ErrorPrediction"
layerTypes = "Sigmoid"
initValueScale = 1.0
applyMeanVarNorm = true
uniformInit = true
needPrior = true
]
SGD = [
epochSize = 20480
minibatchSize = 256:1024:2048
learningRatesPerMB = 1.0:0.5:0.1
numMBsToShowResult = 10
momentumPerMB = 0.9:0.656119
dropoutRate = 0.0
maxEpochs = 3
keepCheckPointFiles = true
# Additional optional parameters are: parallelizationStartEpoch
parallelTrain = [
parallelizationMethod = "DataParallelSGD"
distributedMBReading = true
# Additional optional parameters are: useZeroThresholdFor1BitQuantization
dataParallelSGD = [
gradientBits = 1
]
]
autoAdjust = [
reduceLearnRateIfImproveLessThan = 0
loadBestModel = true
increaseLearnRateIfImproveMoreThan = 1000000000
learnRateDecreaseFactor = 0.5
learnRateIncreaseFactor = 1.382
autoAdjustLR = "adjustAfterEpoch"
]
clippingThresholdPerSample = 1#INF
]
reader = [
readerType = "HTKMLFReader"
readMethod = "blockRandomize"
miniBatchMode = "partial"
randomize = "auto"
verbosity = 0
features = [
dim = 363
type = "real"
scpFile = "$DataDir$/glob_0000.scp"
]
labels = [
mlfFile = "$DataDir$/glob_0000.mlf"
labelMappingFile = "$DataDir$/state.list"
labelDim = 132
labelType = "category"
]
]
]
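The `reader` section above points `HTKMLFReader` at an scp file (the list of feature files) and an mlf file (the label transcriptions). In the `glob_0000.scp` file added further below, each line maps a logical utterance name to a slice of the archive `000000000.chunk`, e.g. `An4/71/71/cen5-fjam-b.mfc=000000000.chunk[0,367]`. The sketch below parses that format under the assumption that the two numbers are inclusive frame indices into the archive:

```python
# Sketch: parse HTK-style scp lines of the form "logicalName=archive[first,last]",
# assuming first/last are inclusive frame indices into the archive file.
import re

SCP_LINE = re.compile(r"^(?P<name>.+?)=(?P<archive>[^\[]+)\[(?P<first>\d+),(?P<last>\d+)\]$")

def parse_scp(path):
    entries = []
    with open(path) as f:
        for line in f:
            m = SCP_LINE.match(line.strip())
            if m:
                entries.append((m.group("name"), m.group("archive"),
                                int(m.group("first")), int(m.group("last"))))
    return entries

# parse_scp("glob_0000.scp")[0]
# -> ('An4/71/71/cen5-fjam-b.mfc', '000000000.chunk', 0, 367)
```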


@ -0,0 +1,67 @@
# Parameters can be overwritten on the command line
# for example: cntk configFile=myConfigFile RootDir=../..
# For running from Visual Studio add
# currentDirectory=$(SolutionDir)/<path to corresponding data folder>
RootDir = ".."
ConfigDir = "$RootDir$/Config"
DataDir = "$RootDir$/Data"
OutputDir = "$RootDir$/Output"
ModelDir = "$OutputDir$/Models"
# deviceId = -1 for CPU, >=0 for GPU devices, "auto" chooses the best GPU, or CPU if no usable GPU is available
deviceId = -1
command = speechTrain
precision = "float"
traceLevel = 1
modelPath = "$ModelDir$/cntkSpeechLSTM.dnn"
parallelTrain = true
#######################################
# TRAINING CONFIG #
#######################################
frameMode = false
truncated = true
speechTrain = [
action = "train"
NDLNetworkBuilder = [
networkDescription = "$ConfigDir$/lstmp-3layer-opt.ndl"
]
SGD = [
epochSize = 20480
minibatchSize = 2048
learningRatesPerMB = 0.5
numMBsToShowResult = 10
momentumPerMB = 0:0.9
maxEpochs = 4
keepCheckPointFiles = true
]
reader = [
readerType = "HTKMLFReader"
readMethod = "blockRandomize"
miniBatchMode = "partial"
nbruttsineachrecurrentiter = 32
randomize = "auto"
verbosity = 0
features = [
dim = 363
type = "real"
scpFile = "$DataDir$/glob_0000.scp"
]
labels = [
mlfFile = "$DataDir$/glob_0000.mlf"
labelMappingFile = "$DataDir$/state.list"
labelDim = 132
labelType = "category"
]
]
]


@ -0,0 +1,213 @@
# macros to include
load = ndlMacroDefine
# the actual NDL that defines the network
run = ndlCreateNetwork_LSTMP_c1024_p256_x3
ndlMacroDefine = [
# Macro definitions
MeanVarNorm(x)=[
xMean = Mean(x);
xStdDev = InvStdDev(x)
xNorm = PerDimMeanVarNormalization(x, xMean, xStdDev)
]
LogPrior(labels) = [
prior = Mean(labels)
logPrior = Log(Prior)
]
LSTMPComponent(inputDim, outputDim, cellDim, inputx, cellDimX2, cellDimX3, cellDimX4) = [
wx = Parameter(cellDimX4, inputDim, init="uniform", initValueScale=1);
b = Parameter(cellDimX4, 1, init="fixedValue", value=0.0);
Wh = Parameter(cellDimX4, outputDim, init="uniform", initValueScale=1);
Wci = Parameter(cellDim, init="uniform", initValueScale=1);
Wcf = Parameter(cellDim, init="uniform", initValueScale=1);
Wco = Parameter(cellDim, init="uniform", initValueScale=1);
dh = PastValue(outputDim, output, timeStep=1);
dc = PastValue(cellDim, ct, timeStep=1);
wxx = Times(wx, inputx);
wxxpb = Plus(wxx, b);
whh = Times(wh, dh);
wxxpbpwhh = Plus(wxxpb,whh)
G1 = RowSlice(0, cellDim, wxxpbpwhh)
G2 = RowSlice(cellDim, cellDim, wxxpbpwhh)
G3 = RowSlice(cellDimX2, cellDim, wxxpbpwhh);
G4 = RowSlice(cellDimX3, cellDim, wxxpbpwhh);
Wcidc = DiagTimes(Wci, dc);
it = Sigmoid (Plus ( G1, Wcidc));
bit = ElementTimes(it, Tanh( G2 ));
Wcfdc = DiagTimes(Wcf, dc);
ft = Sigmoid( Plus (G3, Wcfdc));
bft = ElementTimes(ft, dc);
ct = Plus(bft, bit);
Wcoct = DiagTimes(Wco, ct);
ot = Sigmoid( Plus( G4, Wcoct));
mt = ElementTimes(ot, Tanh(ct));
Wmr = Parameter(outputDim, cellDim, init="uniform", initValueScale=1);
output = Times(Wmr, mt);
]
LSTMPComponentBetter(inputDim, outputDim, cellDim, inputx, cellDimX2, cellDimX3, cellDimX4) = [
wx = Parameter(cellDimX4, inputDim, init="uniform", initValueScale=1);
b = Parameter(cellDimX4, 1, init="fixedValue", value=0.0);
Wh = Parameter(cellDimX4, outputDim, init="uniform", initValueScale=1);
Wci = Parameter(cellDim, init="uniform", initValueScale=1);
Wcf = Parameter(cellDim, init="uniform", initValueScale=1);
Wco = Parameter(cellDim, init="uniform", initValueScale=1);
dh = PastValue(outputDim, output, timeStep=1);
dc = PastValue(cellDim, ct, timeStep=1);
wxx = Times(wx, inputx);
wxxpb = Plus(wxx, b);
whh = Times(wh, dh);
Wxix = RowSlice(0, cellDim, wxx); #Times(Wxi, inputx);
Whidh = RowSlice(0, cellDim, whh); #Times(Whi, dh);
Wcidc = DiagTimes(Wci, dc);
it = Sigmoid (Plus ( Plus (Wxix, Whidh), Wcidc));
Wxcx = RowSlice(cellDim, cellDim, wxx); #Times(Wxc, inputx);
Whcdh = RowSlice(cellDim, cellDim, whh); #Times(Whc, dh);
bit = ElementTimes(it, Tanh( Plus(Wxcx, Whcdh)));
Wxfx = RowSlice(cellDimX2, cellDim, wxx); #Times(Wxf, inputx);
Whfdh = RowSlice(cellDimX2, cellDim, whh); #Times(Whf, dh);
Wcfdc = DiagTimes(Wcf, dc);
ft = Sigmoid( Plus (Plus (Wxfx, Whfdh), Wcfdc));
bft = ElementTimes(ft, dc);
ct = Plus(bft, bit);
Wxox = RowSlice(cellDimX3, cellDim, wxx); #Times(Wxo, inputx);
Whodh = RowSlice(cellDimX3, cellDim, whh); #Times(Who, dh);
Wcoct = DiagTimes(Wco, ct);
ot = Sigmoid( Plus( Plus( Wxox, Whodh), Wcoct));
mt = ElementTimes(ot, Tanh(ct));
Wmr = Parameter(outputDim, cellDim, init="uniform", initValueScale=1);
output = Times(Wmr, mt);
]
LSTMPComponentNaive(inputDim, outputDim, cellDim, inputx) = [
Wxo = Parameter(cellDim, inputDim, init="uniform", initValueScale=1);
Wxi = Parameter(cellDim, inputDim, init="uniform", initValueScale=1);
Wxf = Parameter(cellDim, inputDim, init="uniform", initValueScale=1);
Wxc = Parameter(cellDim, inputDim, init="uniform", initValueScale=1);
bo = Parameter(cellDim, init="fixedValue", value=0.0);
bc = Parameter(cellDim, init="fixedValue", value=0.0);
bi = Parameter(cellDim, init="fixedValue", value=0.0);
bf = Parameter(cellDim, init="fixedValue", value=0.0);
Whi = Parameter(cellDim, outputDim, init="uniform", initValueScale=1);
Wci = Parameter(cellDim, init="uniform", initValueScale=1);
Whf = Parameter(cellDim, outputDim, init="uniform", initValueScale=1);
Wcf = Parameter(cellDim, init="uniform", initValueScale=1);
Who = Parameter(cellDim, outputDim, init="uniform", initValueScale=1);
Wco = Parameter(cellDim, init="uniform", initValueScale=1);
Whc = Parameter(cellDim, outputDim, init="uniform", initValueScale=1);
dh = PastValue(outputDim, output, timeStep=1);
dc = PastValue(cellDim, ct, timeStep=1);
Wxix = Times(Wxi, inputx);
Whidh = Times(Whi, dh);
Wcidc = DiagTimes(Wci, dc);
it = Sigmoid (Plus ( Plus (Plus (Wxix, bi), Whidh), Wcidc));
Wxcx = Times(Wxc, inputx);
Whcdh = Times(Whc, dh);
bit = ElementTimes(it, Tanh( Plus(Wxcx, Plus(Whcdh, bc))));
Wxfx = Times(Wxf, inputx);
Whfdh = Times(Whf, dh);
Wcfdc = DiagTimes(Wcf, dc);
ft = Sigmoid( Plus (Plus (Plus(Wxfx, bf), Whfdh), Wcfdc));
bft = ElementTimes(ft, dc);
ct = Plus(bft, bit);
Wxox = Times(Wxo, inputx);
Whodh = Times(Who, dh);
Wcoct = DiagTimes(Wco, ct);
ot = Sigmoid( Plus( Plus( Plus(Wxox, bo), Whodh), Wcoct));
mt = ElementTimes(ot, Tanh(ct));
Wmr = Parameter(outputDim, cellDim, init="uniform", initValueScale=1);
output = Times(Wmr, mt);
]
]
ndlCreateNetwork_LSTMP_c1024_p256_x3 = [
# define basic i/o
basefeatDim = 363
rowSliceStart = 1200
featDim = 363
labelDim = 132
cellDim = 1024
cellDimX2 = 2048
cellDimX3 = 3072
cellDimX4 = 4096
hiddenDim = 512
features = Input(featDim)
labels = Input(labelDim)
featNorm = MeanVarNorm(features)
# layer 1
LSTMoutput1 = LSTMPComponent(basefeatDim, hiddenDim, cellDim, featNorm, cellDimX2, cellDimX3, cellDimX4);
# layer 2
LSTMoutput2 = LSTMPComponent(hiddenDim, hiddenDim, cellDim, LSTMoutput1, cellDimX2, cellDimX3, cellDimX4);
# layer 3
LSTMoutput3 = LSTMPComponent(hiddenDim, hiddenDim, cellDim, LSTMoutput2, cellDimX2, cellDimX3, cellDimX4);
W = Parameter(labelDim, hiddenDim, init="uniform", initValueScale=1);
b = Parameter(labelDim, 1, init="fixedValue", value=0);
LSTMoutputW = Plus(Times(W, LSTMoutput3), b);
ce = CrossEntropyWithSoftmax(labels, LSTMoutputW);
err = ErrorPrediction(labels, LSTMoutputW);
logPrior = LogPrior(labels)
scaledLogLikelihood = Minus(LSTMoutputW, logPrior)
# Special Nodes
FeatureNodes = (features)
LabelNodes = (labels)
CriterionNodes = (ce)
EvalNodes = (err)
OutputNodes = (scaledLogLikelihood)
]
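For reference, the LSTMP macros above (spelled out most explicitly in `LSTMPComponentNaive`) implement the usual peephole LSTM with a projected output. With $x_t$ the input, $h_{t-1}$ the previous projected output, $c_{t-1}$ the previous cell state and $\odot$ elementwise multiplication, the recurrence is:

$$
\begin{aligned}
i_t &= \sigma(W_{xi} x_t + W_{hi} h_{t-1} + W_{ci} \odot c_{t-1} + b_i) \\
f_t &= \sigma(W_{xf} x_t + W_{hf} h_{t-1} + W_{cf} \odot c_{t-1} + b_f) \\
c_t &= f_t \odot c_{t-1} + i_t \odot \tanh(W_{xc} x_t + W_{hc} h_{t-1} + b_c) \\
o_t &= \sigma(W_{xo} x_t + W_{ho} h_{t-1} + W_{co} \odot c_t + b_o) \\
m_t &= o_t \odot \tanh(c_t), \qquad h_t = W_{mr}\, m_t
\end{aligned}
$$

The optimized variants compute the four input-to-gate products with a single `Times` against the stacked parameters `wx` and `Wh` (of height `cellDimX4`) and then `RowSlice` out the per-gate pieces, which is what the `cellDimX2`/`cellDimX3`/`cellDimX4` arguments are for.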

Binary data
Demos/Speech/Data/000000000.chunk Normal file

Binary file not shown.

File diff suppressed because it is too large.


@ -0,0 +1,948 @@
An4/71/71/cen5-fjam-b.mfc=000000000.chunk[0,367]
An4/213/213/cen4-fsaf2-b.mfc=000000000.chunk[368,805]
An4/513/513/cen7-mgah-b.mfc=000000000.chunk[806,1173]
An4/614/614/cen7-mkdb-b.mfc=000000000.chunk[1174,1421]
An4/507/507/cen1-mgah-b.mfc=000000000.chunk[1422,1669]
An4/693/693/cen8-mmkw-b.mfc=000000000.chunk[1670,2027]
An4/918/918/cen4-mtos-b.mfc=000000000.chunk[2028,2335]
An4/477/477/an257-mewl-b.mfc=000000000.chunk[2336,2943]
An4/454/454/an70-meht-b.mfc=000000000.chunk[2944,3021]
An4/254/254/cen6-ftmj-b.mfc=000000000.chunk[3022,3249]
An4/946/946/cen6-mwhw-b.mfc=000000000.chunk[3250,3467]
An4/122/122/cen4-fkdo-b.mfc=000000000.chunk[3468,3735]
An4/181/181/an183-fnsv-b.mfc=000000000.chunk[3736,4093]
An4/93/93/cen1-fjmd-b.mfc=000000000.chunk[4094,4251]
An4/128/128/an62-flmm2-b.mfc=000000000.chunk[4252,4409]
An4/688/688/cen3-mmkw-b.mfc=000000000.chunk[4410,4617]
An4/872/872/an332-msrb-b.mfc=000000000.chunk[4618,4985]
An4/624/624/cen5-mkem-b.mfc=000000000.chunk[4986,5383]
An4/146/146/cen2-flrp-b.mfc=000000000.chunk[5384,5541]
An4/198/198/cen2-fplp-b.mfc=000000000.chunk[5542,5969]
An4/239/239/cen4-ftal-b.mfc=000000000.chunk[5970,6187]
An4/49/49/an291-ffmm-b.mfc=000000000.chunk[6188,6335]
An4/306/306/cen7-mbmg-b.mfc=000000000.chunk[6336,6733]
An4/252/252/cen4-ftmj-b.mfc=000000000.chunk[6734,7171]
An4/800/800/an359-mscg2-b.mfc=000000000.chunk[7172,7509]
An4/771/771/an236-mrjc2-b.mfc=000000000.chunk[7510,7597]
An4/880/880/cen5-msrb-b.mfc=000000000.chunk[7598,7955]
An4/795/795/cen7-mrmg-b.mfc=000000000.chunk[7956,8293]
An4/821/821/cen7-msct-b.mfc=000000000.chunk[8294,8611]
An4/255/255/cen7-ftmj-b.mfc=000000000.chunk[8612,8949]
An4/580/580/an58-mjhp-b.mfc=000000000.chunk[8950,9267]
An4/70/70/cen4-fjam-b.mfc=000000000.chunk[9268,9595]
An4/528/528/an171-mjda-b.mfc=000000000.chunk[9596,9963]
An4/901/901/an35-mtje-b.mfc=000000000.chunk[9964,10371]
An4/776/776/cen1-mrjc2-b.mfc=000000000.chunk[10372,10779]
An4/908/908/cen7-mtje-b.mfc=000000000.chunk[10780,11257]
An4/603/603/an316-mkdb-b.mfc=000000000.chunk[11258,11565]
An4/544/544/an20-mjdr-b.mfc=000000000.chunk[11566,11853]
An4/243/243/cen8-ftal-b.mfc=000000000.chunk[11854,12071]
An4/891/891/cen3-mtcv-b.mfc=000000000.chunk[12072,12269]
An4/245/245/an212-ftmj-b.mfc=000000000.chunk[12270,12647]
An4/156/156/an119-fmjc-b.mfc=000000000.chunk[12648,13055]
An4/446/446/cen5-meab-b.mfc=000000000.chunk[13056,13483]
An4/801/801/an360-mscg2-b.mfc=000000000.chunk[13484,13601]
An4/538/538/cen6-mjda-b.mfc=000000000.chunk[13602,13799]
An4/282/282/an1-mblw-b.mfc=000000000.chunk[13800,13947]
An4/589/589/cen7-mjhp-b.mfc=000000000.chunk[13948,14275]
An4/710/710/an389-mmtm-b.mfc=000000000.chunk[14276,14603]
An4/638/638/cen6-mmaf-b.mfc=000000000.chunk[14604,14811]
An4/874/874/an334-msrb-b.mfc=000000000.chunk[14812,15029]
An4/40/40/an40-fejs-b.mfc=000000000.chunk[15030,15337]
An4/176/176/cen6-fmjd-b.mfc=000000000.chunk[15338,15545]
An4/732/732/cen8-mnfe-b.mfc=000000000.chunk[15546,15773]
An4/575/575/cen6-mjgk-b.mfc=000000000.chunk[15774,16191]
An4/234/234/an329-ftal-b.mfc=000000000.chunk[16192,16429]
An4/497/497/cen4-mfaa-b.mfc=000000000.chunk[16430,16687]
An4/619/619/an189-mkem-b.mfc=000000000.chunk[16688,16785]
An4/303/303/cen4-mbmg-b.mfc=000000000.chunk[16786,17093]
An4/502/502/an196-mgah-b.mfc=000000000.chunk[17094,17291]
An4/436/436/cen8-mdxs-b.mfc=000000000.chunk[17292,17619]
An4/889/889/cen1-mtcv-b.mfc=000000000.chunk[17620,18227]
An4/697/697/an384-mmsh-b.mfc=000000000.chunk[18228,18475]
An4/413/413/an108-mdxn-b.mfc=000000000.chunk[18476,18643]
An4/165/165/cen8-fmjc-b.mfc=000000000.chunk[18644,18901]
An4/186/186/cen3-fnsv-b.mfc=000000000.chunk[18902,19149]
An4/274/274/cen1-mblb-b.mfc=000000000.chunk[19150,19417]
An4/309/309/an202-mcel-b.mfc=000000000.chunk[19418,19525]
An4/725/725/cen1-mnfe-b.mfc=000000000.chunk[19526,19783]
An4/699/699/cen1-mmsh-b.mfc=000000000.chunk[19784,20051]
An4/833/833/cen6-msjm-b.mfc=000000000.chunk[20052,20299]
An4/857/857/cen4-mskh-b.mfc=000000000.chunk[20300,20687]
An4/734/734/an82-mnjl-b.mfc=000000000.chunk[20688,21025]
An4/340/340/cen3-mcfl-b.mfc=000000000.chunk[21026,21263]
An4/36/36/an36-fejs-b.mfc=000000000.chunk[21264,21641]
An4/690/690/cen5-mmkw-b.mfc=000000000.chunk[21642,22069]
An4/545/545/cen1-mjdr-b.mfc=000000000.chunk[22070,22347]
An4/115/115/an132-fkdo-b.mfc=000000000.chunk[22348,22505]
An4/48/48/cen8-fejs-b.mfc=000000000.chunk[22506,22723]
An4/518/518/an249-mjbh-b.mfc=000000000.chunk[22724,22811]
An4/89/89/an6-fjmd-b.mfc=000000000.chunk[22812,22889]
An4/668/668/an337-mmdg-b.mfc=000000000.chunk[22890,23007]
An4/622/622/cen2-mkem-b.mfc=000000000.chunk[23008,23175]
An4/8/8/cen5-fash-b.mfc=000000000.chunk[23176,23623]
An4/601/601/cen7-mjjs2-b.mfc=000000000.chunk[23624,24051]
An4/480/480/an260-mewl-b.mfc=000000000.chunk[24052,24409]
An4/182/182/an184-fnsv-b.mfc=000000000.chunk[24410,24497]
An4/179/179/an181-fnsv-b.mfc=000000000.chunk[24498,24825]
An4/92/92/an9-fjmd-b.mfc=000000000.chunk[24826,25003]
An4/164/164/cen7-fmjc-b.mfc=000000000.chunk[25004,25251]
An4/16/16/cen2-fbbh-b.mfc=000000000.chunk[25252,25549]
An4/657/657/an49-mmap-b.mfc=000000000.chunk[25550,25867]
An4/723/723/an349-mnfe-b.mfc=000000000.chunk[25868,26325]
An4/700/700/cen2-mmsh-b.mfc=000000000.chunk[26326,26453]
An4/675/675/cen4-mmdg-b.mfc=000000000.chunk[26454,26861]
An4/386/386/an112-mdcs2-b.mfc=000000000.chunk[26862,27129]
An4/152/152/cen8-flrp-b.mfc=000000000.chunk[27130,27347]
An4/740/740/cen3-mnjl-b.mfc=000000000.chunk[27348,27465]
An4/370/370/cen7-mcsc-b.mfc=000000000.chunk[27466,27783]
An4/683/683/an364-mmkw-b.mfc=000000000.chunk[27784,27861]
An4/440/440/an139-meab-b.mfc=000000000.chunk[27862,28089]
An4/789/789/cen1-mrmg-b.mfc=000000000.chunk[28090,28427]
An4/611/611/cen4-mkdb-b.mfc=000000000.chunk[28428,28685]
An4/10/10/an86-fbbh-b.mfc=000000000.chunk[28686,29013]
An4/343/343/cen6-mcfl-b.mfc=000000000.chunk[29014,29251]
An4/438/438/an137-meab-b.mfc=000000000.chunk[29252,29669]
An4/456/456/cen2-meht-b.mfc=000000000.chunk[29670,29817]
An4/489/489/an161-mfaa-b.mfc=000000000.chunk[29818,30075]
An4/53/53/an295-ffmm-b.mfc=000000000.chunk[30076,30363]
An4/702/702/cen4-mmsh-b.mfc=000000000.chunk[30364,30681]
An4/777/777/cen2-mrjc2-b.mfc=000000000.chunk[30682,30999]
An4/873/873/an333-msrb-b.mfc=000000000.chunk[31000,31097]
An4/768/768/cen6-mrcb-b.mfc=000000000.chunk[31098,31275]
An4/552/552/cen8-mjdr-b.mfc=000000000.chunk[31276,31503]
An4/631/631/an54-mmaf-b.mfc=000000000.chunk[31504,31611]
An4/476/476/an256-mewl-b.mfc=000000000.chunk[31612,31689]
An4/151/151/cen7-flrp-b.mfc=000000000.chunk[31690,31937]
An4/920/920/cen6-mtos-b.mfc=000000000.chunk[31938,32145]
An4/358/358/cen8-mcrt-b.mfc=000000000.chunk[32146,32463]
An4/177/177/cen7-fmjd-b.mfc=000000000.chunk[32464,32761]
An4/635/635/cen3-mmaf-b.mfc=000000000.chunk[32762,32929]
An4/719/719/cen8-mmtm-b.mfc=000000000.chunk[32930,33207]
An4/750/750/cen1-mrab-b.mfc=000000000.chunk[33208,33395]
An4/755/755/cen6-mrab-b.mfc=000000000.chunk[33396,33573]
An4/721/721/an347-mnfe-b.mfc=000000000.chunk[33574,33661]
An4/380/380/cen4-mdcs-b.mfc=000000000.chunk[33662,33909]
An4/625/625/cen6-mkem-b.mfc=000000000.chunk[33910,34117]
An4/106/106/cen1-fkai-b.mfc=000000000.chunk[34118,34295]
An4/658/658/an50-mmap-b.mfc=000000000.chunk[34296,34513]
An4/402/402/an210-mdmc-b.mfc=000000000.chunk[34514,35021]
An4/192/192/an91-fplp-b.mfc=000000000.chunk[35022,35469]
An4/416/416/cen1-mdxn-b.mfc=000000000.chunk[35470,35757]
An4/161/161/cen4-fmjc-b.mfc=000000000.chunk[35758,35965]
An4/797/797/an356-mscg2-b.mfc=000000000.chunk[35966,36183]
An4/433/433/cen5-mdxs-b.mfc=000000000.chunk[36184,36691]
An4/57/57/cen4-ffmm-b.mfc=000000000.chunk[36692,37119]
An4/157/157/an120-fmjc-b.mfc=000000000.chunk[37120,37347]
An4/272/272/an374-mblb-b.mfc=000000000.chunk[37348,37575]
An4/549/549/cen5-mjdr-b.mfc=000000000.chunk[37576,37903]
An4/41/41/cen1-fejs-b.mfc=000000000.chunk[37904,38341]
An4/290/290/cen4-mblw-b.mfc=000000000.chunk[38342,38549]
An4/701/701/cen3-mmsh-b.mfc=000000000.chunk[38550,38677]
An4/398/398/an206-mdmc-b.mfc=000000000.chunk[38678,39005]
An4/640/640/cen8-mmaf-b.mfc=000000000.chunk[39006,39323]
An4/904/904/cen3-mtje-b.mfc=000000000.chunk[39324,39541]
An4/686/686/cen1-mmkw-b.mfc=000000000.chunk[39542,40039]
An4/97/97/cen5-fjmd-b.mfc=000000000.chunk[40040,40397]
An4/259/259/an223-fwxs-b.mfc=000000000.chunk[40398,40495]
An4/729/729/cen5-mnfe-b.mfc=000000000.chunk[40496,41033]
An4/709/709/an388-mmtm-b.mfc=000000000.chunk[41034,41131]
An4/692/692/cen7-mmkw-b.mfc=000000000.chunk[41132,41759]
An4/2/2/an253-fash-b.mfc=000000000.chunk[41760,41827]
An4/39/39/an39-fejs-b.mfc=000000000.chunk[41828,42095]
An4/488/488/cen8-mewl-b.mfc=000000000.chunk[42096,42423]
An4/411/411/an106-mdxn-b.mfc=000000000.chunk[42424,42601]
An4/905/905/cen4-mtje-b.mfc=000000000.chunk[42602,43069]
An4/783/783/cen8-mrjc2-b.mfc=000000000.chunk[43070,43417]
An4/205/205/an296-fsaf2-b.mfc=000000000.chunk[43418,43705]
An4/788/788/an285-mrmg-b.mfc=000000000.chunk[43706,44053]
An4/173/173/cen3-fmjd-b.mfc=000000000.chunk[44054,44251]
An4/389/389/an115-mdcs2-b.mfc=000000000.chunk[44252,44579]
An4/412/412/an107-mdxn-b.mfc=000000000.chunk[44580,44867]
An4/69/69/cen3-fjam-b.mfc=000000000.chunk[44868,45045]
An4/84/84/cen5-fjdn-b.mfc=000000000.chunk[45046,45273]
An4/826/826/an229-msjm-b.mfc=000000000.chunk[45274,45361]
An4/722/722/an348-mnfe-b.mfc=000000000.chunk[45362,45589]
An4/490/490/an162-mfaa-b.mfc=000000000.chunk[45590,45897]
An4/335/335/an263-mcfl-b.mfc=000000000.chunk[45898,46275]
An4/854/854/cen1-mskh-b.mfc=000000000.chunk[46276,46503]
An4/334/334/an262-mcfl-b.mfc=000000000.chunk[46504,46851]
An4/403/403/cen1-mdmc-b.mfc=000000000.chunk[46852,47079]
An4/46/46/cen6-fejs-b.mfc=000000000.chunk[47080,47277]
An4/154/154/an117-fmjc-b.mfc=000000000.chunk[47278,47595]
An4/565/565/cen8-mjes-b.mfc=000000000.chunk[47596,47843]
An4/251/251/cen3-ftmj-b.mfc=000000000.chunk[47844,48071]
An4/139/139/an21-flrp-b.mfc=000000000.chunk[48072,48479]
An4/6/6/cen2-fash-b.mfc=000000000.chunk[48480,48607]
An4/76/76/an122-fjdn-b.mfc=000000000.chunk[48608,48765]
An4/817/817/cen3-msct-b.mfc=000000000.chunk[48766,48913]
An4/328/328/cen4-mcen-b.mfc=000000000.chunk[48914,49161]
An4/293/293/cen7-mblw-b.mfc=000000000.chunk[49162,49409]
An4/214/214/cen5-fsaf2-b.mfc=000000000.chunk[49410,49797]
An4/91/91/an8-fjmd-b.mfc=000000000.chunk[49798,49975]
An4/820/820/cen6-msct-b.mfc=000000000.chunk[49976,50213]
An4/300/300/cen1-mbmg-b.mfc=000000000.chunk[50214,50491]
An4/18/18/cen4-fbbh-b.mfc=000000000.chunk[50492,50829]
An4/526/526/cen7-mjbh-b.mfc=000000000.chunk[50830,51067]
An4/408/408/cen6-mdmc-b.mfc=000000000.chunk[51068,51285]
An4/169/169/an194-fmjd-b.mfc=000000000.chunk[51286,51553]
An4/939/939/an154-mwhw-b.mfc=000000000.chunk[51554,51841]
An4/931/931/cen4-mtxj-b.mfc=000000000.chunk[51842,52299]
An4/758/758/an101-mrcb-b.mfc=000000000.chunk[52300,52647]
An4/781/781/cen6-mrjc2-b.mfc=000000000.chunk[52648,52875]
An4/321/321/an127-mcen-b.mfc=000000000.chunk[52876,52973]
An4/199/199/cen3-fplp-b.mfc=000000000.chunk[52974,53271]
An4/494/494/cen1-mfaa-b.mfc=000000000.chunk[53272,53469]
An4/560/560/cen3-mjes-b.mfc=000000000.chunk[53470,53547]
An4/713/713/cen2-mmtm-b.mfc=000000000.chunk[53548,53855]
An4/938/938/an153-mwhw-b.mfc=000000000.chunk[53856,54143]
An4/163/163/cen6-fmjc-b.mfc=000000000.chunk[54144,54321]
An4/338/338/cen1-mcfl-b.mfc=000000000.chunk[54322,54569]
An4/775/775/an240-mrjc2-b.mfc=000000000.chunk[54570,54777]
An4/264/264/cen3-fwxs-b.mfc=000000000.chunk[54778,54925]
An4/224/224/cen2-fsrb-b.mfc=000000000.chunk[54926,55233]
An4/166/166/an191-fmjd-b.mfc=000000000.chunk[55234,55321]
An4/80/80/cen1-fjdn-b.mfc=000000000.chunk[55322,55469]
An4/426/426/an28-mdxs-b.mfc=000000000.chunk[55470,55577]
An4/737/737/an85-mnjl-b.mfc=000000000.chunk[55578,55965]
An4/919/919/cen5-mtos-b.mfc=000000000.chunk[55966,56363]
An4/102/102/an312-fkai-b.mfc=000000000.chunk[56364,56751]
An4/743/743/cen7-mnjl-b.mfc=000000000.chunk[56752,57129]
An4/948/948/cen8-mwhw-b.mfc=000000000.chunk[57130,57347]
An4/17/17/cen3-fbbh-b.mfc=000000000.chunk[57348,57575]
An4/11/11/an87-fbbh-b.mfc=000000000.chunk[57576,57743]
An4/344/344/cen7-mcfl-b.mfc=000000000.chunk[57744,58111]
An4/359/359/an231-mcsc-b.mfc=000000000.chunk[58112,58329]
An4/203/203/cen7-fplp-b.mfc=000000000.chunk[58330,58877]
An4/704/704/cen6-mmsh-b.mfc=000000000.chunk[58878,59035]
An4/331/331/cen7-mcen-b.mfc=000000000.chunk[59036,59323]
An4/736/736/an84-mnjl-b.mfc=000000000.chunk[59324,59511]
An4/121/121/cen3-fkdo-b.mfc=000000000.chunk[59512,59769]
An4/574/574/cen5-mjgk-b.mfc=000000000.chunk[59770,59977]
An4/143/143/an24-flrp-b.mfc=000000000.chunk[59978,60065]
An4/209/209/an300-fsaf2-b.mfc=000000000.chunk[60066,60473]
An4/367/367/cen4-mcsc-b.mfc=000000000.chunk[60474,60731]
An4/38/38/an38-fejs-b.mfc=000000000.chunk[60732,60809]
An4/390/390/cen1-mdcs2-b.mfc=000000000.chunk[60810,61057]
An4/756/756/cen7-mrab-b.mfc=000000000.chunk[61058,61275]
An4/555/555/an158-mjes-b.mfc=000000000.chunk[61276,61613]
An4/680/680/an361-mmkw-b.mfc=000000000.chunk[61614,62041]
An4/578/578/an56-mjhp-b.mfc=000000000.chunk[62042,62419]
An4/655/655/an47-mmap-b.mfc=000000000.chunk[62420,62667]
An4/646/646/cen1-mmal-b.mfc=000000000.chunk[62668,63035]
An4/720/720/an346-mnfe-b.mfc=000000000.chunk[63036,63453]
An4/608/608/cen1-mkdb-b.mfc=000000000.chunk[63454,63721]
An4/441/441/an140-meab-b.mfc=000000000.chunk[63722,64299]
An4/356/356/cen6-mcrt-b.mfc=000000000.chunk[64300,64547]
An4/926/926/an379-mtxj-b.mfc=000000000.chunk[64548,64625]
An4/541/541/an16-mjdr-b.mfc=000000000.chunk[64626,64893]
An4/195/195/an94-fplp-b.mfc=000000000.chunk[64894,65441]
An4/591/591/an176-mjjs2-b.mfc=000000000.chunk[65442,65789]
An4/9/9/cen7-fash-b.mfc=000000000.chunk[65790,66037]
An4/484/484/cen4-mewl-b.mfc=000000000.chunk[66038,66525]
An4/537/537/cen5-mjda-b.mfc=000000000.chunk[66526,66933]
An4/242/242/cen7-ftal-b.mfc=000000000.chunk[66934,67171]
An4/848/848/cen8-msjr-b.mfc=000000000.chunk[67172,67409]
An4/220/220/an168-fsrb-b.mfc=000000000.chunk[67410,67757]
An4/906/906/cen5-mtje-b.mfc=000000000.chunk[67758,68185]
An4/444/444/cen3-meab-b.mfc=000000000.chunk[68186,68373]
An4/88/88/an10-fjmd-b.mfc=000000000.chunk[68374,68531]
An4/561/561/cen4-mjes-b.mfc=000000000.chunk[68532,68919]
An4/728/728/cen4-mnfe-b.mfc=000000000.chunk[68920,69347]
An4/784/784/an281-mrmg-b.mfc=000000000.chunk[69348,69485]
An4/55/55/cen2-ffmm-b.mfc=000000000.chunk[69486,69983]
An4/593/593/an178-mjjs2-b.mfc=000000000.chunk[69984,70061]
An4/327/327/cen3-mcen-b.mfc=000000000.chunk[70062,70309]
An4/4/4/an255-fash-b.mfc=000000000.chunk[70310,70567]
An4/922/922/cen8-mtos-b.mfc=000000000.chunk[70568,70775]
An4/229/229/cen7-fsrb-b.mfc=000000000.chunk[70776,71253]
An4/297/297/an268-mbmg-b.mfc=000000000.chunk[71254,71651]
An4/215/215/cen6-fsaf2-b.mfc=000000000.chunk[71652,71839]
An4/567/567/an217-mjgk-b.mfc=000000000.chunk[71840,71987]
An4/96/96/cen4-fjmd-b.mfc=000000000.chunk[71988,72335]
An4/846/846/cen6-msjr-b.mfc=000000000.chunk[72336,72543]
An4/850/850/an96-mskh-b.mfc=000000000.chunk[72544,72621]
An4/492/492/an164-mfaa-b.mfc=000000000.chunk[72622,72859]
An4/661/661/cen3-mmap-b.mfc=000000000.chunk[72860,72987]
An4/200/200/cen4-fplp-b.mfc=000000000.chunk[72988,73485]
An4/82/82/cen3-fjdn-b.mfc=000000000.chunk[73486,73583]
An4/936/936/an151-mwhw-b.mfc=000000000.chunk[73584,73891]
An4/60/60/cen7-ffmm-b.mfc=000000000.chunk[73892,74379]
An4/183/183/an185-fnsv-b.mfc=000000000.chunk[74380,74477]
An4/667/667/an336-mmdg-b.mfc=000000000.chunk[74478,74785]
An4/576/576/cen7-mjgk-b.mfc=000000000.chunk[74786,74993]
An4/212/212/cen3-fsaf2-b.mfc=000000000.chunk[74994,75101]
An4/779/779/cen4-mrjc2-b.mfc=000000000.chunk[75102,75449]
An4/418/418/cen3-mdxn-b.mfc=000000000.chunk[75450,75637]
An4/636/636/cen4-mmaf-b.mfc=000000000.chunk[75638,75935]
An4/257/257/an221-fwxs-b.mfc=000000000.chunk[75936,76253]
An4/59/59/cen6-ffmm-b.mfc=000000000.chunk[76254,76481]
An4/899/899/an33-mtje-b.mfc=000000000.chunk[76482,76879]
An4/886/886/an303-mtcv-b.mfc=000000000.chunk[76880,77307]
An4/932/932/cen5-mtxj-b.mfc=000000000.chunk[77308,77735]
An4/336/336/an264-mcfl-b.mfc=000000000.chunk[77736,77813]
An4/877/877/cen2-msrb-b.mfc=000000000.chunk[77814,78051]
An4/629/629/an52-mmaf-b.mfc=000000000.chunk[78052,78199]
An4/767/767/cen5-mrcb-b.mfc=000000000.chunk[78200,78547]
An4/374/374/an243-mdcs-b.mfc=000000000.chunk[78548,78635]
An4/437/437/an136-meab-b.mfc=000000000.chunk[78636,79063]
An4/202/202/cen6-fplp-b.mfc=000000000.chunk[79064,79451]
An4/29/29/cen2-fclc-b.mfc=000000000.chunk[79452,79699]
An4/669/669/an338-mmdg-b.mfc=000000000.chunk[79700,80017]
An4/216/216/cen7-fsaf2-b.mfc=000000000.chunk[80018,80395]
An4/227/227/cen5-fsrb-b.mfc=000000000.chunk[80396,80903]
An4/864/864/an278-msmn-b.mfc=000000000.chunk[80904,81311]
An4/794/794/cen6-mrmg-b.mfc=000000000.chunk[81312,81549]
An4/865/865/an279-msmn-b.mfc=000000000.chunk[81550,81837]
An4/111/111/cen6-fkai-b.mfc=000000000.chunk[81838,82015]
An4/774/774/an239-mrjc2-b.mfc=000000000.chunk[82016,82293]
An4/831/831/cen4-msjm-b.mfc=000000000.chunk[82294,82481]
An4/793/793/cen5-mrmg-b.mfc=000000000.chunk[82482,83049]
An4/301/301/cen2-mbmg-b.mfc=000000000.chunk[83050,83237]
An4/325/325/cen1-mcen-b.mfc=000000000.chunk[83238,83485]
An4/210/210/cen1-fsaf2-b.mfc=000000000.chunk[83486,83863]
An4/117/117/an134-fkdo-b.mfc=000000000.chunk[83864,83991]
An4/388/388/an114-mdcs2-b.mfc=000000000.chunk[83992,84289]
An4/718/718/cen7-mmtm-b.mfc=000000000.chunk[84290,84617]
An4/174/174/cen4-fmjd-b.mfc=000000000.chunk[84618,84955]
An4/652/652/cen7-mmal-b.mfc=000000000.chunk[84956,85233]
An4/228/228/cen6-fsrb-b.mfc=000000000.chunk[85234,85451]
An4/373/373/an242-mdcs-b.mfc=000000000.chunk[85452,85729]
An4/175/175/cen5-fmjd-b.mfc=000000000.chunk[85730,86147]
An4/184/184/cen1-fnsv-b.mfc=000000000.chunk[86148,86505]
An4/393/393/cen4-mdcs2-b.mfc=000000000.chunk[86506,86723]
An4/319/319/cen8-mcel-b.mfc=000000000.chunk[86724,86961]
An4/291/291/cen5-mblw-b.mfc=000000000.chunk[86962,87079]
An4/584/584/cen2-mjhp-b.mfc=000000000.chunk[87080,87287]
An4/827/827/an230-msjm-b.mfc=000000000.chunk[87288,87385]
An4/628/628/an51-mmaf-b.mfc=000000000.chunk[87386,87733]
An4/295/295/an266-mbmg-b.mfc=000000000.chunk[87734,87901]
An4/317/317/cen5-mcel-b.mfc=000000000.chunk[87902,88269]
An4/431/431/cen3-mdxs-b.mfc=000000000.chunk[88270,88517]
An4/52/52/an294-ffmm-b.mfc=000000000.chunk[88518,88785]
An4/491/491/an163-mfaa-b.mfc=000000000.chunk[88786,89023]
An4/844/844/cen4-msjr-b.mfc=000000000.chunk[89024,89261]
An4/116/116/an133-fkdo-b.mfc=000000000.chunk[89262,89369]
An4/61/61/cen8-ffmm-b.mfc=000000000.chunk[89370,89717]
An4/118/118/an135-fkdo-b.mfc=000000000.chunk[89718,89995]
An4/131/131/an65-flmm2-b.mfc=000000000.chunk[89996,90233]
An4/878/878/cen3-msrb-b.mfc=000000000.chunk[90234,90431]
An4/352/352/cen2-mcrt-b.mfc=000000000.chunk[90432,90739]
An4/132/132/cen1-flmm2-b.mfc=000000000.chunk[90740,91127]
An4/230/230/cen8-fsrb-b.mfc=000000000.chunk[91128,91425]
An4/933/933/cen6-mtxj-b.mfc=000000000.chunk[91426,91603]
An4/535/535/cen3-mjda-b.mfc=000000000.chunk[91604,91741]
An4/531/531/an174-mjda-b.mfc=000000000.chunk[91742,91859]
An4/525/525/cen6-mjbh-b.mfc=000000000.chunk[91860,92057]
An4/74/74/cen8-fjam-b.mfc=000000000.chunk[92058,92265]
An4/644/644/an324-mmal-b.mfc=000000000.chunk[92266,92363]
An4/240/240/cen5-ftal-b.mfc=000000000.chunk[92364,92601]
An4/726/726/cen2-mnfe-b.mfc=000000000.chunk[92602,92809]
An4/425/425/an27-mdxs-b.mfc=000000000.chunk[92810,93297]
An4/612/612/cen5-mkdb-b.mfc=000000000.chunk[93298,93655]
An4/698/698/an385-mmsh-b.mfc=000000000.chunk[93656,93733]
An4/787/787/an284-mrmg-b.mfc=000000000.chunk[93734,93881]
An4/666/666/cen8-mmap-b.mfc=000000000.chunk[93882,94099]
An4/31/31/cen4-fclc-b.mfc=000000000.chunk[94100,94467]
An4/470/470/cen3-mema-b.mfc=000000000.chunk[94468,94615]
An4/782/782/cen7-mrjc2-b.mfc=000000000.chunk[94616,94943]
An4/824/824/an227-msjm-b.mfc=000000000.chunk[94944,95211]
An4/287/287/cen1-mblw-b.mfc=000000000.chunk[95212,95379]
An4/748/748/an74-mrab-b.mfc=000000000.chunk[95380,95667]
An4/241/241/cen6-ftal-b.mfc=000000000.chunk[95668,95835]
An4/832/832/cen5-msjm-b.mfc=000000000.chunk[95836,96123]
An4/664/664/cen6-mmap-b.mfc=000000000.chunk[96124,96311]
An4/347/347/an142-mcrt-b.mfc=000000000.chunk[96312,96509]
An4/377/377/cen1-mdcs-b.mfc=000000000.chunk[96510,96757]
An4/124/124/cen6-fkdo-b.mfc=000000000.chunk[96758,96995]
An4/724/724/an350-mnfe-b.mfc=000000000.chunk[96996,97133]
An4/442/442/cen1-meab-b.mfc=000000000.chunk[97134,97601]
An4/742/742/cen6-mnjl-b.mfc=000000000.chunk[97602,97809]
An4/500/500/cen7-mfaa-b.mfc=000000000.chunk[97810,98047]
An4/909/909/cen8-mtje-b.mfc=000000000.chunk[98048,98245]
An4/626/626/cen7-mkem-b.mfc=000000000.chunk[98246,98413]
An4/627/627/cen8-mkem-b.mfc=000000000.chunk[98414,98671]
An4/401/401/an209-mdmc-b.mfc=000000000.chunk[98672,98979]
An4/838/838/an353-msjr-b.mfc=000000000.chunk[98980,99057]
An4/415/415/an110-mdxn-b.mfc=000000000.chunk[99058,99265]
An4/225/225/cen3-fsrb-b.mfc=000000000.chunk[99266,99473]
An4/595/595/an180-mjjs2-b.mfc=000000000.chunk[99474,99571]
An4/673/673/cen2-mmdg-b.mfc=000000000.chunk[99572,99859]
An4/162/162/cen5-fmjc-b.mfc=000000000.chunk[99860,100147]
An4/679/679/cen8-mmdg-b.mfc=000000000.chunk[100148,100445]
An4/590/590/cen8-mjhp-b.mfc=000000000.chunk[100446,100673]
An4/299/299/an270-mbmg-b.mfc=000000000.chunk[100674,100881]
An4/805/805/cen4-mscg2-b.mfc=000000000.chunk[100882,101249]
An4/197/197/cen1-fplp-b.mfc=000000000.chunk[101250,101807]
An4/267/267/cen6-fwxs-b.mfc=000000000.chunk[101808,102115]
An4/630/630/an53-mmaf-b.mfc=000000000.chunk[102116,102463]
An4/888/888/an305-mtcv-b.mfc=000000000.chunk[102464,102691]
An4/812/812/an343-msct-b.mfc=000000000.chunk[102692,102879]
An4/233/233/an328-ftal-b.mfc=000000000.chunk[102880,103207]
An4/529/529/an172-mjda-b.mfc=000000000.chunk[103208,103425]
An4/707/707/an386-mmtm-b.mfc=000000000.chunk[103426,103663]
An4/592/592/an177-mjjs2-b.mfc=000000000.chunk[103664,103791]
An4/130/130/an64-flmm2-b.mfc=000000000.chunk[103792,103919]
An4/310/310/an203-mcel-b.mfc=000000000.chunk[103920,104167]
An4/170/170/an195-fmjd-b.mfc=000000000.chunk[104168,104485]
An4/119/119/cen1-fkdo-b.mfc=000000000.chunk[104486,104753]
An4/345/345/cen8-mcfl-b.mfc=000000000.chunk[104754,105011]
An4/365/365/cen2-mcsc-b.mfc=000000000.chunk[105012,105319]
An4/735/735/an83-mnjl-b.mfc=000000000.chunk[105320,105467]
An4/633/633/cen1-mmaf-b.mfc=000000000.chunk[105468,105795]
An4/654/654/an46-mmap-b.mfc=000000000.chunk[105796,105893]
An4/149/149/cen5-flrp-b.mfc=000000000.chunk[105894,106341]
An4/751/751/cen2-mrab-b.mfc=000000000.chunk[106342,106519]
An4/238/238/cen3-ftal-b.mfc=000000000.chunk[106520,106657]
An4/360/360/an232-mcsc-b.mfc=000000000.chunk[106658,106745]
An4/881/881/cen6-msrb-b.mfc=000000000.chunk[106746,106973]
An4/757/757/cen8-mrab-b.mfc=000000000.chunk[106974,107171]
An4/400/400/an208-mdmc-b.mfc=000000000.chunk[107172,107549]
An4/168/168/an193-fmjd-b.mfc=000000000.chunk[107550,107977]
An4/897/897/an31-mtje-b.mfc=000000000.chunk[107978,108345]
An4/530/530/an173-mjda-b.mfc=000000000.chunk[108346,108653]
An4/566/566/an216-mjgk-b.mfc=000000000.chunk[108654,108911]
An4/95/95/cen3-fjmd-b.mfc=000000000.chunk[108912,109009]
An4/43/43/cen3-fejs-b.mfc=000000000.chunk[109010,109287]
An4/753/753/cen4-mrab-b.mfc=000000000.chunk[109288,109615]
An4/405/405/cen3-mdmc-b.mfc=000000000.chunk[109616,109743]
An4/66/66/an80-fjam-b.mfc=000000000.chunk[109744,109831]
An4/858/858/cen5-mskh-b.mfc=000000000.chunk[109832,110219]
An4/852/852/an98-mskh-b.mfc=000000000.chunk[110220,110307]
An4/237/237/cen2-ftal-b.mfc=000000000.chunk[110308,110485]
An4/602/602/cen8-mjjs2-b.mfc=000000000.chunk[110486,110903]
An4/842/842/cen2-msjr-b.mfc=000000000.chunk[110904,111131]
An4/13/13/an89-fbbh-b.mfc=000000000.chunk[111132,111769]
An4/283/283/an2-mblw-b.mfc=000000000.chunk[111770,111917]
An4/460/460/cen6-meht-b.mfc=000000000.chunk[111918,112105]
An4/20/20/cen6-fbbh-b.mfc=000000000.chunk[112106,112343]
An4/308/308/an201-mcel-b.mfc=000000000.chunk[112344,112451]
An4/471/471/cen4-mema-b.mfc=000000000.chunk[112452,112819]
An4/546/546/cen2-mjdr-b.mfc=000000000.chunk[112820,113017]
An4/468/468/cen1-mema-b.mfc=000000000.chunk[113018,113255]
An4/236/236/cen1-ftal-b.mfc=000000000.chunk[113256,113453]
An4/372/372/an241-mdcs-b.mfc=000000000.chunk[113454,113691]
An4/395/395/cen6-mdcs2-b.mfc=000000000.chunk[113692,113869]
An4/945/945/cen5-mwhw-b.mfc=000000000.chunk[113870,114177]
An4/754/754/cen5-mrab-b.mfc=000000000.chunk[114178,114445]
An4/509/509/cen3-mgah-b.mfc=000000000.chunk[114446,114643]
An4/556/556/an159-mjes-b.mfc=000000000.chunk[114644,114721]
An4/594/594/an179-mjjs2-b.mfc=000000000.chunk[114722,115229]
An4/487/487/cen7-mewl-b.mfc=000000000.chunk[115230,115487]
An4/684/684/an365-mmkw-b.mfc=000000000.chunk[115488,115915]
An4/855/855/cen2-mskh-b.mfc=000000000.chunk[115916,116113]
An4/439/439/an138-meab-b.mfc=000000000.chunk[116114,116401]
An4/354/354/cen4-mcrt-b.mfc=000000000.chunk[116402,116879]
An4/26/26/an149-fclc-b.mfc=000000000.chunk[116880,117107]
An4/588/588/cen6-mjhp-b.mfc=000000000.chunk[117108,117275]
An4/823/823/an226-msjm-b.mfc=000000000.chunk[117276,117363]
An4/68/68/cen2-fjam-b.mfc=000000000.chunk[117364,117511]
An4/101/101/an311-fkai-b.mfc=000000000.chunk[117512,117819]
An4/98/98/cen6-fjmd-b.mfc=000000000.chunk[117820,118017]
An4/450/450/an66-meht-b.mfc=000000000.chunk[118018,118285]
An4/662/662/cen4-mmap-b.mfc=000000000.chunk[118286,118753]
An4/452/452/an68-meht-b.mfc=000000000.chunk[118754,118961]
An4/687/687/cen2-mmkw-b.mfc=000000000.chunk[118962,119419]
An4/218/218/an166-fsrb-b.mfc=000000000.chunk[119420,119667]
An4/314/314/cen2-mcel-b.mfc=000000000.chunk[119668,119955]
An4/33/33/cen6-fclc-b.mfc=000000000.chunk[119956,120163]
An4/424/424/an26-mdxs-b.mfc=000000000.chunk[120164,120281]
An4/615/615/cen8-mkdb-b.mfc=000000000.chunk[120282,120529]
An4/298/298/an269-mbmg-b.mfc=000000000.chunk[120530,120927]
An4/527/527/cen8-mjbh-b.mfc=000000000.chunk[120928,121145]
An4/15/15/cen1-fbbh-b.mfc=000000000.chunk[121146,121383]
An4/910/910/an366-mtos-b.mfc=000000000.chunk[121384,121661]
An4/158/158/cen1-fmjc-b.mfc=000000000.chunk[121662,121949]
An4/246/246/an213-ftmj-b.mfc=000000000.chunk[121950,122347]
An4/849/849/an100-mskh-b.mfc=000000000.chunk[122348,122425]
An4/56/56/cen3-ffmm-b.mfc=000000000.chunk[122426,122623]
An4/404/404/cen2-mdmc-b.mfc=000000000.chunk[122624,122871]
An4/351/351/cen1-mcrt-b.mfc=000000000.chunk[122872,123289]
An4/863/863/an277-msmn-b.mfc=000000000.chunk[123290,123377]
An4/322/322/an128-mcen-b.mfc=000000000.chunk[123378,123775]
An4/419/419/cen4-mdxn-b.mfc=000000000.chunk[123776,124043]
An4/86/86/cen7-fjdn-b.mfc=000000000.chunk[124044,124301]
An4/311/311/an204-mcel-b.mfc=000000000.chunk[124302,124399]
An4/142/142/an23-flrp-b.mfc=000000000.chunk[124400,124807]
An4/25/25/an148-fclc-b.mfc=000000000.chunk[124808,125205]
An4/947/947/cen7-mwhw-b.mfc=000000000.chunk[125206,125493]
An4/250/250/cen2-ftmj-b.mfc=000000000.chunk[125494,125781]
An4/381/381/cen5-mdcs-b.mfc=000000000.chunk[125782,126059]
An4/927/927/an380-mtxj-b.mfc=000000000.chunk[126060,126247]
An4/941/941/cen1-mwhw-b.mfc=000000000.chunk[126248,126425]
An4/769/769/cen7-mrcb-b.mfc=000000000.chunk[126426,126863]
An4/685/685/an59-mmkw-b.mfc=000000000.chunk[126864,127411]
An4/72/72/cen6-fjam-b.mfc=000000000.chunk[127412,127699]
An4/420/420/cen5-mdxn-b.mfc=000000000.chunk[127700,127907]
An4/457/457/cen3-meht-b.mfc=000000000.chunk[127908,128095]
An4/279/279/cen6-mblb-b.mfc=000000000.chunk[128096,128263]
An4/656/656/an48-mmap-b.mfc=000000000.chunk[128264,128481]
An4/773/773/an238-mrjc2-b.mfc=000000000.chunk[128482,128559]
An4/562/562/cen5-mjes-b.mfc=000000000.chunk[128560,128797]
An4/811/811/an342-msct-b.mfc=000000000.chunk[128798,129205]
An4/23/23/an146-fclc-b.mfc=000000000.chunk[129206,129543]
An4/391/391/cen2-mdcs2-b.mfc=000000000.chunk[129544,129761]
An4/172/172/cen2-fmjd-b.mfc=000000000.chunk[129762,130099]
An4/185/185/cen2-fnsv-b.mfc=000000000.chunk[130100,130377]
An4/78/78/an124-fjdn-b.mfc=000000000.chunk[130378,130645]
An4/148/148/cen4-flrp-b.mfc=000000000.chunk[130646,130943]
An4/253/253/cen5-ftmj-b.mfc=000000000.chunk[130944,131361]
An4/445/445/cen4-meab-b.mfc=000000000.chunk[131362,131749]
An4/523/523/cen4-mjbh-b.mfc=000000000.chunk[131750,131907]
An4/524/524/cen5-mjbh-b.mfc=000000000.chunk[131908,132325]
An4/428/428/an30-mdxs-b.mfc=000000000.chunk[132326,132423]
An4/315/315/cen3-mcel-b.mfc=000000000.chunk[132424,132611]
An4/281/281/cen8-mblb-b.mfc=000000000.chunk[132612,132819]
An4/570/570/an220-mjgk-b.mfc=000000000.chunk[132820,132987]
An4/727/727/cen3-mnfe-b.mfc=000000000.chunk[132988,133155]
An4/231/231/an326-ftal-b.mfc=000000000.chunk[133156,133273]
An4/193/193/an92-fplp-b.mfc=000000000.chunk[133274,133871]
An4/892/892/cen4-mtcv-b.mfc=000000000.chunk[133872,134479]
An4/834/834/cen7-msjm-b.mfc=000000000.chunk[134480,134747]
An4/144/144/an25-flrp-b.mfc=000000000.chunk[134748,135185]
An4/828/828/cen1-msjm-b.mfc=000000000.chunk[135186,135413]
An4/934/934/cen7-mtxj-b.mfc=000000000.chunk[135414,135661]
An4/387/387/an113-mdcs2-b.mfc=000000000.chunk[135662,136139]
An4/434/434/cen6-mdxs-b.mfc=000000000.chunk[136140,136377]
An4/469/469/cen2-mema-b.mfc=000000000.chunk[136378,136515]
An4/232/232/an327-ftal-b.mfc=000000000.chunk[136516,136673]
An4/378/378/cen2-mdcs-b.mfc=000000000.chunk[136674,136831]
An4/275/275/cen2-mblb-b.mfc=000000000.chunk[136832,137079]
An4/837/837/an352-msjr-b.mfc=000000000.chunk[137080,137207]
An4/447/447/cen6-meab-b.mfc=000000000.chunk[137208,137425]
An4/521/521/cen2-mjbh-b.mfc=000000000.chunk[137426,137573]
An4/733/733/an81-mnjl-b.mfc=000000000.chunk[137574,137791]
An4/510/510/cen4-mgah-b.mfc=000000000.chunk[137792,138119]
An4/276/276/cen3-mblb-b.mfc=000000000.chunk[138120,138227]
An4/894/894/cen6-mtcv-b.mfc=000000000.chunk[138228,138465]
An4/741/741/cen5-mnjl-b.mfc=000000000.chunk[138466,138783]
An4/898/898/an32-mtje-b.mfc=000000000.chunk[138784,138881]
An4/532/532/an175-mjda-b.mfc=000000000.chunk[138882,138959]
An4/150/150/cen6-flrp-b.mfc=000000000.chunk[138960,139177]
An4/280/280/cen7-mblb-b.mfc=000000000.chunk[139178,139555]
An4/902/902/cen1-mtje-b.mfc=000000000.chunk[139556,139813]
An4/896/896/cen8-mtcv-b.mfc=000000000.chunk[139814,140101]
An4/900/900/an34-mtje-b.mfc=000000000.chunk[140102,140209]
An4/323/323/an129-mcen-b.mfc=000000000.chunk[140210,140357]
An4/579/579/an57-mjhp-b.mfc=000000000.chunk[140358,140595]
An4/451/451/an67-meht-b.mfc=000000000.chunk[140596,140673]
An4/830/830/cen3-msjm-b.mfc=000000000.chunk[140674,140851]
An4/75/75/an121-fjdn-b.mfc=000000000.chunk[140852,140919]
An4/194/194/an93-fplp-b.mfc=000000000.chunk[140920,141027]
An4/620/620/an190-mkem-b.mfc=000000000.chunk[141028,141295]
An4/266/266/cen5-fwxs-b.mfc=000000000.chunk[141296,141713]
An4/659/659/cen1-mmap-b.mfc=000000000.chunk[141714,141931]
An4/903/903/cen2-mtje-b.mfc=000000000.chunk[141932,142359]
An4/189/189/cen6-fnsv-b.mfc=000000000.chunk[142360,142617]
An4/271/271/an373-mblb-b.mfc=000000000.chunk[142618,143015]
An4/67/67/cen1-fjam-b.mfc=000000000.chunk[143016,143253]
An4/219/219/an167-fsrb-b.mfc=000000000.chunk[143254,143511]
An4/778/778/cen3-mrjc2-b.mfc=000000000.chunk[143512,143719]
An4/814/814/an345-msct-b.mfc=000000000.chunk[143720,144287]
An4/829/829/cen2-msjm-b.mfc=000000000.chunk[144288,144525]
An4/47/47/cen7-fejs-b.mfc=000000000.chunk[144526,144823]
An4/799/799/an358-mscg2-b.mfc=000000000.chunk[144824,145241]
An4/804/804/cen3-mscg2-b.mfc=000000000.chunk[145242,145469]
An4/329/329/cen5-mcen-b.mfc=000000000.chunk[145470,145887]
An4/600/600/cen6-mjjs2-b.mfc=000000000.chunk[145888,146225]
An4/876/876/cen1-msrb-b.mfc=000000000.chunk[146226,146493]
An4/708/708/an387-mmtm-b.mfc=000000000.chunk[146494,146781]
An4/24/24/an147-fclc-b.mfc=000000000.chunk[146782,147069]
An4/808/808/cen7-mscg2-b.mfc=000000000.chunk[147070,147457]
An4/313/313/cen1-mcel-b.mfc=000000000.chunk[147458,147765]
An4/482/482/cen2-mewl-b.mfc=000000000.chunk[147766,147923]
An4/51/51/an293-ffmm-b.mfc=000000000.chunk[147924,148261]
An4/935/935/cen8-mtxj-b.mfc=000000000.chunk[148262,148529]
An4/244/244/an211-ftmj-b.mfc=000000000.chunk[148530,148737]
An4/396/396/cen7-mdcs2-b.mfc=000000000.chunk[148738,148945]
An4/745/745/an71-mrab-b.mfc=000000000.chunk[148946,149253]
An4/569/569/an219-mjgk-b.mfc=000000000.chunk[149254,149331]
An4/277/277/cen4-mblb-b.mfc=000000000.chunk[149332,149599]
An4/371/371/cen8-mcsc-b.mfc=000000000.chunk[149600,149847]
An4/650/650/cen5-mmal-b.mfc=000000000.chunk[149848,150175]
An4/135/135/cen4-flmm2-b.mfc=000000000.chunk[150176,150793]
An4/206/206/an297-fsaf2-b.mfc=000000000.chunk[150794,151221]
An4/294/294/cen8-mblw-b.mfc=000000000.chunk[151222,151419]
An4/85/85/cen6-fjdn-b.mfc=000000000.chunk[151420,151567]
An4/785/785/an282-mrmg-b.mfc=000000000.chunk[151568,151695]
An4/406/406/cen4-mdmc-b.mfc=000000000.chunk[151696,152323]
An4/474/474/cen7-mema-b.mfc=000000000.chunk[152324,152561]
An4/790/790/cen2-mrmg-b.mfc=000000000.chunk[152562,152919]
An4/463/463/an286-mema-b.mfc=000000000.chunk[152920,153027]
An4/559/559/cen2-mjes-b.mfc=000000000.chunk[153028,153255]
An4/353/353/cen3-mcrt-b.mfc=000000000.chunk[153256,153523]
An4/435/435/cen7-mdxs-b.mfc=000000000.chunk[153524,153941]
An4/145/145/cen1-flrp-b.mfc=000000000.chunk[153942,154229]
An4/278/278/cen5-mblb-b.mfc=000000000.chunk[154230,154557]
An4/517/517/an248-mjbh-b.mfc=000000000.chunk[154558,154765]
An4/65/65/an79-fjam-b.mfc=000000000.chunk[154766,155073]
An4/341/341/cen4-mcfl-b.mfc=000000000.chunk[155074,155411]
An4/520/520/cen1-mjbh-b.mfc=000000000.chunk[155412,155679]
An4/137/137/cen7-flmm2-b.mfc=000000000.chunk[155680,155967]
An4/806/806/cen5-mscg2-b.mfc=000000000.chunk[155968,156415]
An4/429/429/cen1-mdxs-b.mfc=000000000.chunk[156416,156933]
An4/610/610/cen3-mkdb-b.mfc=000000000.chunk[156934,157151]
An4/869/869/cen7-msmn-b.mfc=000000000.chunk[157152,157539]
An4/141/141/an22-flrp-b.mfc=000000000.chunk[157540,157637]
An4/791/791/cen3-mrmg-b.mfc=000000000.chunk[157638,157805]
An4/289/289/cen3-mblw-b.mfc=000000000.chunk[157806,157963]
An4/711/711/an390-mmtm-b.mfc=000000000.chunk[157964,158061]
An4/432/432/cen4-mdxs-b.mfc=000000000.chunk[158062,158369]
An4/350/350/an145-mcrt-b.mfc=000000000.chunk[158370,158557]
An4/670/670/an339-mmdg-b.mfc=000000000.chunk[158558,158905]
An4/581/581/an59-mjhp-b.mfc=000000000.chunk[158906,159203]
An4/461/461/cen7-meht-b.mfc=000000000.chunk[159204,159531]
An4/103/103/an313-fkai-b.mfc=000000000.chunk[159532,160009]
An4/263/263/cen2-fwxs-b.mfc=000000000.chunk[160010,160317]
An4/362/362/an234-mcsc-b.mfc=000000000.chunk[160318,160415]
An4/478/478/an258-mewl-b.mfc=000000000.chunk[160416,160633]
An4/786/786/an283-mrmg-b.mfc=000000000.chunk[160634,161131]
An4/512/512/cen6-mgah-b.mfc=000000000.chunk[161132,161369]
An4/847/847/cen7-msjr-b.mfc=000000000.chunk[161370,161647]
An4/498/498/cen5-mfaa-b.mfc=000000000.chunk[161648,161975]
An4/916/916/cen2-mtos-b.mfc=000000000.chunk[161976,162233]
An4/410/410/cen8-mdmc-b.mfc=000000000.chunk[162234,162561]
An4/459/459/cen5-meht-b.mfc=000000000.chunk[162562,162859]
An4/223/223/cen1-fsrb-b.mfc=000000000.chunk[162860,163137]
An4/764/764/cen2-mrcb-b.mfc=000000000.chunk[163138,163325]
An4/564/564/cen7-mjes-b.mfc=000000000.chunk[163326,163663]
An4/5/5/cen1-fash-b.mfc=000000000.chunk[163664,164011]
An4/129/129/an63-flmm2-b.mfc=000000000.chunk[164012,164249]
An4/369/369/cen6-mcsc-b.mfc=000000000.chunk[164250,164457]
An4/87/87/cen8-fjdn-b.mfc=000000000.chunk[164458,164625]
An4/167/167/an192-fmjd-b.mfc=000000000.chunk[164626,165043]
An4/598/598/cen4-mjjs2-b.mfc=000000000.chunk[165044,165511]
An4/188/188/cen5-fnsv-b.mfc=000000000.chunk[165512,166029]
An4/749/749/an75-mrab-b.mfc=000000000.chunk[166030,166347]
An4/582/582/an60-mjhp-b.mfc=000000000.chunk[166348,166435]
An4/160/160/cen3-fmjc-b.mfc=000000000.chunk[166436,166633]
An4/180/180/an182-fnsv-b.mfc=000000000.chunk[166634,166761]
An4/682/682/an363-mmkw-b.mfc=000000000.chunk[166762,167379]
An4/339/339/cen2-mcfl-b.mfc=000000000.chunk[167380,167907]
An4/921/921/cen7-mtos-b.mfc=000000000.chunk[167908,168075]
An4/421/421/cen6-mdxn-b.mfc=000000000.chunk[168076,168253]
An4/247/247/an214-ftmj-b.mfc=000000000.chunk[168254,168391]
An4/815/815/cen1-msct-b.mfc=000000000.chunk[168392,168659]
An4/671/671/an340-mmdg-b.mfc=000000000.chunk[168660,168787]
An4/616/616/an186-mkem-b.mfc=000000000.chunk[168788,169105]
An4/196/196/an95-fplp-b.mfc=000000000.chunk[169106,169733]
An4/235/235/an330-ftal-b.mfc=000000000.chunk[169734,169901]
An4/268/268/cen7-fwxs-b.mfc=000000000.chunk[169902,170319]
An4/506/506/an200-mgah-b.mfc=000000000.chunk[170320,170417]
An4/647/647/cen2-mmal-b.mfc=000000000.chunk[170418,170615]
An4/127/127/an61-flmm2-b.mfc=000000000.chunk[170616,170873]
An4/803/803/cen2-mscg2-b.mfc=000000000.chunk[170874,171151]
An4/475/475/cen8-mema-b.mfc=000000000.chunk[171152,171499]
An4/472/472/cen5-mema-b.mfc=000000000.chunk[171500,171617]
An4/599/599/cen5-mjjs2-b.mfc=000000000.chunk[171618,172245]
An4/108/108/cen3-fkai-b.mfc=000000000.chunk[172246,172353]
An4/357/357/cen7-mcrt-b.mfc=000000000.chunk[172354,172711]
An4/342/342/cen5-mcfl-b.mfc=000000000.chunk[172712,173029]
An4/714/714/cen3-mmtm-b.mfc=000000000.chunk[173030,173177]
An4/747/747/an73-mrab-b.mfc=000000000.chunk[173178,173325]
An4/643/643/an323-mmal-b.mfc=000000000.chunk[173326,173523]
An4/99/99/cen7-fjmd-b.mfc=000000000.chunk[173524,173741]
An4/503/503/an197-mgah-b.mfc=000000000.chunk[173742,173949]
An4/533/533/cen1-mjda-b.mfc=000000000.chunk[173950,174167]
An4/691/691/cen6-mmkw-b.mfc=000000000.chunk[174168,174525]
An4/305/305/cen6-mbmg-b.mfc=000000000.chunk[174526,174763]
An4/191/191/cen8-fnsv-b.mfc=000000000.chunk[174764,175101]
An4/772/772/an237-mrjc2-b.mfc=000000000.chunk[175102,175239]
An4/273/273/an375-mblb-b.mfc=000000000.chunk[175240,175467]
An4/660/660/cen2-mmap-b.mfc=000000000.chunk[175468,175615]
An4/540/540/cen8-mjda-b.mfc=000000000.chunk[175616,175903]
An4/930/930/cen3-mtxj-b.mfc=000000000.chunk[175904,176001]
An4/346/346/an141-mcrt-b.mfc=000000000.chunk[176002,176089]
An4/125/125/cen7-fkdo-b.mfc=000000000.chunk[176090,176447]
An4/107/107/cen2-fkai-b.mfc=000000000.chunk[176448,176605]
An4/504/504/an198-mgah-b.mfc=000000000.chunk[176606,176843]
An4/316/316/cen4-mcel-b.mfc=000000000.chunk[176844,177161]
An4/840/840/an355-msjr-b.mfc=000000000.chunk[177162,177359]
An4/689/689/cen4-mmkw-b.mfc=000000000.chunk[177360,177877]
An4/113/113/cen8-fkai-b.mfc=000000000.chunk[177878,178095]
An4/548/548/cen4-mjdr-b.mfc=000000000.chunk[178096,178533]
An4/915/915/cen1-mtos-b.mfc=000000000.chunk[178534,178741]
An4/326/326/cen2-mcen-b.mfc=000000000.chunk[178742,178989]
An4/770/770/cen8-mrcb-b.mfc=000000000.chunk[178990,179267]
An4/618/618/an188-mkem-b.mfc=000000000.chunk[179268,179515]
An4/543/543/an19-mjdr-b.mfc=000000000.chunk[179516,179923]
An4/597/597/cen2-mjjs2-b.mfc=000000000.chunk[179924,180391]
An4/304/304/cen5-mbmg-b.mfc=000000000.chunk[180392,180779]
An4/613/613/cen6-mkdb-b.mfc=000000000.chunk[180780,180947]
An4/551/551/cen7-mjdr-b.mfc=000000000.chunk[180948,181235]
An4/260/260/an224-fwxs-b.mfc=000000000.chunk[181236,181553]
An4/810/810/an341-msct-b.mfc=000000000.chunk[181554,181651]
An4/211/211/cen2-fsaf2-b.mfc=000000000.chunk[181652,181959]
An4/3/3/an254-fash-b.mfc=000000000.chunk[181960,182047]
An4/285/285/an4-mblw-b.mfc=000000000.chunk[182048,182185]
An4/651/651/cen6-mmal-b.mfc=000000000.chunk[182186,182343]
An4/862/862/an276-msmn-b.mfc=000000000.chunk[182344,182631]
An4/296/296/an267-mbmg-b.mfc=000000000.chunk[182632,182969]
An4/261/261/an225-fwxs-b.mfc=000000000.chunk[182970,183067]
An4/705/705/cen7-mmsh-b.mfc=000000000.chunk[183068,183285]
An4/28/28/cen1-fclc-b.mfc=000000000.chunk[183286,183713]
An4/382/382/cen6-mdcs-b.mfc=000000000.chunk[183714,183891]
An4/765/765/cen3-mrcb-b.mfc=000000000.chunk[183892,184069]
An4/499/499/cen6-mfaa-b.mfc=000000000.chunk[184070,184247]
An4/7/7/cen4-fash-b.mfc=000000000.chunk[184248,184605]
An4/110/110/cen5-fkai-b.mfc=000000000.chunk[184606,184913]
An4/893/893/cen5-mtcv-b.mfc=000000000.chunk[184914,185201]
An4/677/677/cen6-mmdg-b.mfc=000000000.chunk[185202,185509]
An4/204/204/cen8-fplp-b.mfc=000000000.chunk[185510,185897]
An4/427/427/an29-mdxs-b.mfc=000000000.chunk[185898,186005]
An4/284/284/an3-mblw-b.mfc=000000000.chunk[186006,186133]
An4/348/348/an143-mcrt-b.mfc=000000000.chunk[186134,186241]
An4/449/449/cen8-meab-b.mfc=000000000.chunk[186242,186549]
An4/423/423/cen8-mdxn-b.mfc=000000000.chunk[186550,186817]
An4/466/466/an289-mema-b.mfc=000000000.chunk[186818,186985]
An4/508/508/cen2-mgah-b.mfc=000000000.chunk[186986,187243]
An4/1/1/an251-fash-b.mfc=000000000.chunk[187244,187341]
An4/399/399/an207-mdmc-b.mfc=000000000.chunk[187342,187799]
An4/21/21/cen7-fbbh-b.mfc=000000000.chunk[187800,188107]
An4/573/573/cen4-mjgk-b.mfc=000000000.chunk[188108,188525]
An4/706/706/cen8-mmsh-b.mfc=000000000.chunk[188526,188743]
An4/609/609/cen2-mkdb-b.mfc=000000000.chunk[188744,188911]
An4/839/839/an354-msjr-b.mfc=000000000.chunk[188912,189299]
An4/312/312/an205-mcel-b.mfc=000000000.chunk[189300,189657]
An4/63/63/an77-fjam-b.mfc=000000000.chunk[189658,189965]
An4/678/678/cen7-mmdg-b.mfc=000000000.chunk[189966,190503]
An4/761/761/an104-mrcb-b.mfc=000000000.chunk[190504,190601]
An4/642/642/an322-mmal-b.mfc=000000000.chunk[190602,190929]
An4/262/262/cen1-fwxs-b.mfc=000000000.chunk[190930,191347]
An4/171/171/cen1-fmjd-b.mfc=000000000.chunk[191348,191665]
An4/114/114/an131-fkdo-b.mfc=000000000.chunk[191666,192263]
An4/853/853/an99-mskh-b.mfc=000000000.chunk[192264,192521]
An4/333/333/an261-mcfl-b.mfc=000000000.chunk[192522,192619]
An4/112/112/cen7-fkai-b.mfc=000000000.chunk[192620,192777]
An4/265/265/cen4-fwxs-b.mfc=000000000.chunk[192778,193325]
An4/813/813/an344-msct-b.mfc=000000000.chunk[193326,193723]
An4/496/496/cen3-mfaa-b.mfc=000000000.chunk[193724,193881]
An4/178/178/cen8-fmjd-b.mfc=000000000.chunk[193882,194129]
An4/54/54/cen1-ffmm-b.mfc=000000000.chunk[194130,194457]
An4/596/596/cen1-mjjs2-b.mfc=000000000.chunk[194458,194895]
An4/875/875/an335-msrb-b.mfc=000000000.chunk[194896,195033]
An4/430/430/cen2-mdxs-b.mfc=000000000.chunk[195034,195281]
An4/928/928/cen1-mtxj-b.mfc=000000000.chunk[195282,195529]
An4/493/493/an165-mfaa-b.mfc=000000000.chunk[195530,195807]
An4/887/887/an304-mtcv-b.mfc=000000000.chunk[195808,195905]
An4/859/859/cen6-mskh-b.mfc=000000000.chunk[195906,196083]
An4/883/883/cen8-msrb-b.mfc=000000000.chunk[196084,196301]
An4/871/871/an331-msrb-b.mfc=000000000.chunk[196302,196479]
An4/836/836/an351-msjr-b.mfc=000000000.chunk[196480,196757]
An4/553/553/an156-mjes-b.mfc=000000000.chunk[196758,197075]
An4/861/861/cen8-mskh-b.mfc=000000000.chunk[197076,197333]
An4/248/248/an215-ftmj-b.mfc=000000000.chunk[197334,197441]
An4/77/77/an123-fjdn-b.mfc=000000000.chunk[197442,197569]
An4/417/417/cen2-mdxn-b.mfc=000000000.chunk[197570,197787]
An4/856/856/cen3-mskh-b.mfc=000000000.chunk[197788,197965]
An4/44/44/cen4-fejs-b.mfc=000000000.chunk[197966,198493]
An4/645/645/an325-mmal-b.mfc=000000000.chunk[198494,198581]
An4/453/453/an69-meht-b.mfc=000000000.chunk[198582,198669]
An4/639/639/cen7-mmaf-b.mfc=000000000.chunk[198670,198957]
An4/907/907/cen6-mtje-b.mfc=000000000.chunk[198958,199175]
An4/330/330/cen6-mcen-b.mfc=000000000.chunk[199176,199373]
An4/258/258/an222-fwxs-b.mfc=000000000.chunk[199374,199461]
An4/746/746/an72-mrab-b.mfc=000000000.chunk[199462,199549]
An4/752/752/cen3-mrab-b.mfc=000000000.chunk[199550,199677]
An4/663/663/cen5-mmap-b.mfc=000000000.chunk[199678,199945]
An4/843/843/cen3-msjr-b.mfc=000000000.chunk[199946,200083]
An4/674/674/cen3-mmdg-b.mfc=000000000.chunk[200084,200321]
An4/105/105/an315-fkai-b.mfc=000000000.chunk[200322,200519]
An4/604/604/an317-mkdb-b.mfc=000000000.chunk[200520,200657]
An4/557/557/an160-mjes-b.mfc=000000000.chunk[200658,201035]
An4/320/320/an126-mcen-b.mfc=000000000.chunk[201036,201123]
An4/605/605/an318-mkdb-b.mfc=000000000.chunk[201124,201251]
An4/731/731/cen7-mnfe-b.mfc=000000000.chunk[201252,201649]
An4/467/467/an290-mema-b.mfc=000000000.chunk[201650,201907]
An4/368/368/cen5-mcsc-b.mfc=000000000.chunk[201908,202045]
An4/672/672/cen1-mmdg-b.mfc=000000000.chunk[202046,202463]
An4/744/744/cen8-mnjl-b.mfc=000000000.chunk[202464,202721]
An4/37/37/an37-fejs-b.mfc=000000000.chunk[202722,203189]
An4/376/376/an245-mdcs-b.mfc=000000000.chunk[203190,203447]
An4/64/64/an78-fjam-b.mfc=000000000.chunk[203448,203645]
An4/27/27/an150-fclc-b.mfc=000000000.chunk[203646,204073]
An4/917/917/cen3-mtos-b.mfc=000000000.chunk[204074,204311]
An4/637/637/cen5-mmaf-b.mfc=000000000.chunk[204312,204659]
An4/464/464/an287-mema-b.mfc=000000000.chunk[204660,204907]
An4/187/187/cen4-fnsv-b.mfc=000000000.chunk[204908,205375]
An4/385/385/an111-mdcs2-b.mfc=000000000.chunk[205376,205573]
An4/885/885/an302-mtcv-b.mfc=000000000.chunk[205574,205981]
An4/914/914/an370-mtos-b.mfc=000000000.chunk[205982,206439]
An4/153/153/an116-fmjc-b.mfc=000000000.chunk[206440,206667]
An4/375/375/an244-mdcs-b.mfc=000000000.chunk[206668,206915]
An4/868/868/cen6-msmn-b.mfc=000000000.chunk[206916,207093]
An4/495/495/cen2-mfaa-b.mfc=000000000.chunk[207094,207221]
An4/712/712/cen1-mmtm-b.mfc=000000000.chunk[207222,207509]
An4/364/364/cen1-mcsc-b.mfc=000000000.chunk[207510,207987]
An4/739/739/cen2-mnjl-b.mfc=000000000.chunk[207988,208185]
An4/256/256/cen8-ftmj-b.mfc=000000000.chunk[208186,208473]
An4/222/222/an170-fsrb-b.mfc=000000000.chunk[208474,208721]
An4/332/332/cen8-mcen-b.mfc=000000000.chunk[208722,208979]
An4/759/759/an102-mrcb-b.mfc=000000000.chunk[208980,209057]
An4/571/571/cen1-mjgk-b.mfc=000000000.chunk[209058,209245]
An4/585/585/cen3-mjhp-b.mfc=000000000.chunk[209246,209413]
An4/422/422/cen7-mdxn-b.mfc=000000000.chunk[209414,209681]
An4/50/50/an292-ffmm-b.mfc=000000000.chunk[209682,209849]
An4/483/483/cen3-mewl-b.mfc=000000000.chunk[209850,210167]
An4/104/104/an314-fkai-b.mfc=000000000.chunk[210168,210535]
An4/641/641/an321-mmal-b.mfc=000000000.chunk[210536,210833]
An4/798/798/an357-mscg2-b.mfc=000000000.chunk[210834,210931]
An4/42/42/cen2-fejs-b.mfc=000000000.chunk[210932,211159]
An4/632/632/an55-mmaf-b.mfc=000000000.chunk[211160,211347]
An4/716/716/cen5-mmtm-b.mfc=000000000.chunk[211348,211585]
An4/19/19/cen5-fbbh-b.mfc=000000000.chunk[211586,211873]
An4/923/923/an376-mtxj-b.mfc=000000000.chunk[211874,212041]
An4/890/890/cen2-mtcv-b.mfc=000000000.chunk[212042,212179]
An4/825/825/an228-msjm-b.mfc=000000000.chunk[212180,212537]
An4/379/379/cen3-mdcs-b.mfc=000000000.chunk[212538,212745]
An4/870/870/cen8-msmn-b.mfc=000000000.chunk[212746,213333]
An4/623/623/cen4-mkem-b.mfc=000000000.chunk[213334,213461]
An4/703/703/cen5-mmsh-b.mfc=000000000.chunk[213462,213679]
An4/126/126/cen8-fkdo-b.mfc=000000000.chunk[213680,213937]
An4/79/79/an125-fjdn-b.mfc=000000000.chunk[213938,214045]
An4/384/384/cen8-mdcs-b.mfc=000000000.chunk[214046,214303]
An4/681/681/an362-mmkw-b.mfc=000000000.chunk[214304,214741]
An4/913/913/an369-mtos-b.mfc=000000000.chunk[214742,214839]
An4/392/392/cen3-mdcs2-b.mfc=000000000.chunk[214840,215007]
An4/217/217/cen8-fsaf2-b.mfc=000000000.chunk[215008,215205]
An4/409/409/cen7-mdmc-b.mfc=000000000.chunk[215206,215533]
An4/515/515/an246-mjbh-b.mfc=000000000.chunk[215534,215621]
An4/90/90/an7-fjmd-b.mfc=000000000.chunk[215622,215839]
An4/760/760/an103-mrcb-b.mfc=000000000.chunk[215840,216247]
An4/62/62/an76-fjam-b.mfc=000000000.chunk[216248,216335]
An4/822/822/cen8-msct-b.mfc=000000000.chunk[216336,216563]
An4/462/462/cen8-meht-b.mfc=000000000.chunk[216564,216761]
An4/292/292/cen6-mblw-b.mfc=000000000.chunk[216762,216929]
An4/676/676/cen5-mmdg-b.mfc=000000000.chunk[216930,217477]
An4/572/572/cen2-mjgk-b.mfc=000000000.chunk[217478,217695]
An4/363/363/an235-mcsc-b.mfc=000000000.chunk[217696,217773]
An4/522/522/cen3-mjbh-b.mfc=000000000.chunk[217774,217931]
An4/924/924/an377-mtxj-b.mfc=000000000.chunk[217932,218299]
An4/816/816/cen2-msct-b.mfc=000000000.chunk[218300,218547]
An4/485/485/cen5-mewl-b.mfc=000000000.chunk[218548,218915]
An4/621/621/cen1-mkem-b.mfc=000000000.chunk[218916,219183]
An4/577/577/cen8-mjgk-b.mfc=000000000.chunk[219184,219391]
An4/318/318/cen6-mcel-b.mfc=000000000.chunk[219392,219619]
An4/792/792/cen4-mrmg-b.mfc=000000000.chunk[219620,219977]
An4/841/841/cen1-msjr-b.mfc=000000000.chunk[219978,220345]
An4/763/763/cen1-mrcb-b.mfc=000000000.chunk[220346,220553]
An4/458/458/cen4-meht-b.mfc=000000000.chunk[220554,220751]
An4/715/715/cen4-mmtm-b.mfc=000000000.chunk[220752,221289]
An4/607/607/an320-mkdb-b.mfc=000000000.chunk[221290,221527]
An4/208/208/an299-fsaf2-b.mfc=000000000.chunk[221528,221925]
An4/134/134/cen3-flmm2-b.mfc=000000000.chunk[221926,222103]
An4/649/649/cen4-mmal-b.mfc=000000000.chunk[222104,222391]
An4/911/911/an367-mtos-b.mfc=000000000.chunk[222392,222799]
An4/730/730/cen6-mnfe-b.mfc=000000000.chunk[222800,223087]
An4/349/349/an144-mcrt-b.mfc=000000000.chunk[223088,223485]
An4/324/324/an130-mcen-b.mfc=000000000.chunk[223486,223553]
An4/501/501/cen8-mfaa-b.mfc=000000000.chunk[223554,223751]
An4/226/226/cen4-fsrb-b.mfc=000000000.chunk[223752,224189]
An4/547/547/cen3-mjdr-b.mfc=000000000.chunk[224190,224357]
An4/414/414/an109-mdxn-b.mfc=000000000.chunk[224358,224625]
An4/201/201/cen5-fplp-b.mfc=000000000.chunk[224626,225233]
An4/221/221/an169-fsrb-b.mfc=000000000.chunk[225234,225391]
An4/12/12/an88-fbbh-b.mfc=000000000.chunk[225392,225859]
An4/879/879/cen4-msrb-b.mfc=000000000.chunk[225860,226267]
An4/563/563/cen6-mjes-b.mfc=000000000.chunk[226268,226415]
An4/123/123/cen5-fkdo-b.mfc=000000000.chunk[226416,226813]
An4/207/207/an298-fsaf2-b.mfc=000000000.chunk[226814,226911]
An4/617/617/an187-mkem-b.mfc=000000000.chunk[226912,227079]
An4/866/866/cen4-msmn-b.mfc=000000000.chunk[227080,227517]
An4/943/943/cen3-mwhw-b.mfc=000000000.chunk[227518,227625]
An4/542/542/an18-mjdr-b.mfc=000000000.chunk[227626,227783]
An4/762/762/an105-mrcb-b.mfc=000000000.chunk[227784,227931]
An4/465/465/an288-mema-b.mfc=000000000.chunk[227932,228019]
An4/249/249/cen1-ftmj-b.mfc=000000000.chunk[228020,228257]
An4/307/307/cen8-mbmg-b.mfc=000000000.chunk[228258,228585]
An4/802/802/cen1-mscg2-b.mfc=000000000.chunk[228586,228823]
An4/73/73/cen7-fjam-b.mfc=000000000.chunk[228824,229061]
An4/554/554/an157-mjes-b.mfc=000000000.chunk[229062,229189]
An4/539/539/cen7-mjda-b.mfc=000000000.chunk[229190,229517]
An4/505/505/an199-mgah-b.mfc=000000000.chunk[229518,229615]
An4/780/780/cen5-mrjc2-b.mfc=000000000.chunk[229616,229993]
An4/100/100/cen8-fjmd-b.mfc=000000000.chunk[229994,230211]
An4/696/696/an383-mmsh-b.mfc=000000000.chunk[230212,230349]
An4/486/486/cen6-mewl-b.mfc=000000000.chunk[230350,230547]
An4/32/32/cen5-fclc-b.mfc=000000000.chunk[230548,230975]
An4/835/835/cen8-msjm-b.mfc=000000000.chunk[230976,231193]
An4/514/514/cen8-mgah-b.mfc=000000000.chunk[231194,231541]
An4/694/694/an381-mmsh-b.mfc=000000000.chunk[231542,231779]
An4/867/867/cen5-msmn-b.mfc=000000000.chunk[231780,232107]
An4/366/366/cen3-mcsc-b.mfc=000000000.chunk[232108,232335]
An4/912/912/an368-mtos-b.mfc=000000000.chunk[232336,232753]
An4/738/738/cen1-mnjl-b.mfc=000000000.chunk[232754,233161]
An4/270/270/an372-mblb-b.mfc=000000000.chunk[233162,233459]
An4/155/155/an118-fmjc-b.mfc=000000000.chunk[233460,233707]
An4/558/558/cen1-mjes-b.mfc=000000000.chunk[233708,233925]
An4/606/606/an319-mkdb-b.mfc=000000000.chunk[233926,234273]
An4/819/819/cen5-msct-b.mfc=000000000.chunk[234274,234671]
An4/288/288/cen2-mblw-b.mfc=000000000.chunk[234672,234829]
An4/120/120/cen2-fkdo-b.mfc=000000000.chunk[234830,235117]
An4/536/536/cen4-mjda-b.mfc=000000000.chunk[235118,235695]
An4/302/302/cen3-mbmg-b.mfc=000000000.chunk[235696,235843]
An4/860/860/cen7-mskh-b.mfc=000000000.chunk[235844,236141]
An4/269/269/an371-mblb-b.mfc=000000000.chunk[236142,236509]
An4/455/455/cen1-meht-b.mfc=000000000.chunk[236510,236717]
An4/286/286/an5-mblw-b.mfc=000000000.chunk[236718,236815]
An4/136/136/cen6-flmm2-b.mfc=000000000.chunk[236816,237043]
An4/481/481/cen1-mewl-b.mfc=000000000.chunk[237044,237471]
An4/58/58/cen5-ffmm-b.mfc=000000000.chunk[237472,237959]
An4/583/583/cen1-mjhp-b.mfc=000000000.chunk[237960,238337]
An4/534/534/cen2-mjda-b.mfc=000000000.chunk[238338,238555]
An4/940/940/an155-mwhw-b.mfc=000000000.chunk[238556,238693]
An4/882/882/cen7-msrb-b.mfc=000000000.chunk[238694,239061]
An4/473/473/cen6-mema-b.mfc=000000000.chunk[239062,239239]
An4/937/937/an152-mwhw-b.mfc=000000000.chunk[239240,239337]
An4/94/94/cen2-fjmd-b.mfc=000000000.chunk[239338,239615]
An4/83/83/cen4-fjdn-b.mfc=000000000.chunk[239616,239763]
An4/568/568/an218-mjgk-b.mfc=000000000.chunk[239764,239851]
An4/45/45/cen5-fejs-b.mfc=000000000.chunk[239852,240259]
An4/766/766/cen4-mrcb-b.mfc=000000000.chunk[240260,240537]
An4/929/929/cen2-mtxj-b.mfc=000000000.chunk[240538,240695]
An4/634/634/cen2-mmaf-b.mfc=000000000.chunk[240696,240953]
An4/337/337/an265-mcfl-b.mfc=000000000.chunk[240954,241051]
An4/884/884/an301-mtcv-b.mfc=000000000.chunk[241052,241429]
An4/516/516/an247-mjbh-b.mfc=000000000.chunk[241430,241507]
An4/796/796/cen8-mrmg-b.mfc=000000000.chunk[241508,241725]
An4/397/397/cen8-mdcs2-b.mfc=000000000.chunk[241726,241973]
An4/648/648/cen3-mmal-b.mfc=000000000.chunk[241974,242151]
An4/81/81/cen2-fjdn-b.mfc=000000000.chunk[242152,242329]
An4/807/807/cen6-mscg2-b.mfc=000000000.chunk[242330,242617]
An4/717/717/cen6-mmtm-b.mfc=000000000.chunk[242618,242845]
An4/394/394/cen5-mdcs2-b.mfc=000000000.chunk[242846,243113]
An4/895/895/cen7-mtcv-b.mfc=000000000.chunk[243114,243461]
An4/140/140/an2121-flrp-b.mfc=000000000.chunk[243462,243779]
An4/653/653/cen8-mmal-b.mfc=000000000.chunk[243780,243957]
An4/355/355/cen5-mcrt-b.mfc=000000000.chunk[243958,244555]
An4/159/159/cen2-fmjc-b.mfc=000000000.chunk[244556,244803]
An4/443/443/cen2-meab-b.mfc=000000000.chunk[244804,245111]
An4/942/942/cen2-mwhw-b.mfc=000000000.chunk[245112,245329]
An4/809/809/cen8-mscg2-b.mfc=000000000.chunk[245330,245637]
An4/519/519/an250-mjbh-b.mfc=000000000.chunk[245638,245725]
An4/944/944/cen4-mwhw-b.mfc=000000000.chunk[245726,246083]
An4/190/190/cen7-fnsv-b.mfc=000000000.chunk[246084,246471]
An4/925/925/an378-mtxj-b.mfc=000000000.chunk[246472,246619]
An4/665/665/cen7-mmap-b.mfc=000000000.chunk[246620,246907]
An4/448/448/cen7-meab-b.mfc=000000000.chunk[246908,247345]
An4/845/845/cen5-msjr-b.mfc=000000000.chunk[247346,247563]
An4/818/818/cen4-msct-b.mfc=000000000.chunk[247564,247821]
An4/695/695/an382-mmsh-b.mfc=000000000.chunk[247822,248089]
An4/511/511/cen5-mgah-b.mfc=000000000.chunk[248090,248567]
An4/479/479/an259-mewl-b.mfc=000000000.chunk[248568,248705]
An4/35/35/cen8-fclc-b.mfc=000000000.chunk[248706,248973]
An4/109/109/cen4-fkai-b.mfc=000000000.chunk[248974,249221]
An4/14/14/an90-fbbh-b.mfc=000000000.chunk[249222,249319]
An4/586/586/cen4-mjhp-b.mfc=000000000.chunk[249320,249647]
An4/133/133/cen2-flmm2-b.mfc=000000000.chunk[249648,249845]
An4/30/30/cen3-fclc-b.mfc=000000000.chunk[249846,250033]
An4/383/383/cen7-mdcs-b.mfc=000000000.chunk[250034,250381]
An4/34/34/cen7-fclc-b.mfc=000000000.chunk[250382,250679]
An4/851/851/an97-mskh-b.mfc=000000000.chunk[250680,250817]
An4/147/147/cen3-flrp-b.mfc=000000000.chunk[250818,250975]
An4/550/550/cen6-mjdr-b.mfc=000000000.chunk[250976,251143]
An4/407/407/cen5-mdmc-b.mfc=000000000.chunk[251144,251521]
An4/587/587/cen5-mjhp-b.mfc=000000000.chunk[251522,251799]
An4/22/22/cen8-fbbh-b.mfc=000000000.chunk[251800,252077]
An4/138/138/cen8-flmm2-b.mfc=000000000.chunk[252078,252655]
An4/361/361/an233-mcsc-b.mfc=000000000.chunk[252656,252733]


@@ -0,0 +1,132 @@
_ah_[2]
_ah_[3]
_ah_[4]
_hmm_[2]
_hmm_[3]
_hmm_[4]
_noise_[2]
_noise_[3]
_noise_[4]
aa_s2_1
aa_s3_1
aa_s4_1
ae_s2_1
ae_s3_1
ae_s4_1
ah_s2_1
ah_s3_1
ah_s4_1
ao_s2_1
ao_s3_1
ao_s4_1
aw_s2_1
aw_s3_1
aw_s4_1
ax_s2_1
ax_s3_1
ax_s4_1
ay_s2_1
ay_s3_1
ay_s4_1
b_s2_1
b_s3_1
b_s4_1
ch_s2_1
ch_s3_1
ch_s4_1
d_s2_1
d_s3_1
d_s4_1
dh_s2_1
dh_s3_1
dh_s4_1
eh_s2_1
eh_s3_1
eh_s4_1
er_s2_1
er_s3_1
er_s4_1
ey_s2_1
ey_s3_1
ey_s4_1
f_s2_1
f_s3_1
f_s4_1
g_s2_1
g_s3_1
g_s4_1
hh_s2_1
hh_s3_1
hh_s4_1
ih_s2_1
ih_s3_1
ih_s4_1
iy_s2_1
iy_s3_1
iy_s4_1
jh_s2_1
jh_s3_1
jh_s4_1
k_s2_1
k_s3_1
k_s4_1
l_s2_1
l_s3_1
l_s4_1
m_s2_1
m_s3_1
m_s4_1
n_s2_1
n_s3_1
n_s4_1
ng_s2_1
ng_s3_1
ng_s4_1
ow_s2_1
ow_s3_1
ow_s4_1
oy_s2_1
oy_s3_1
oy_s4_1
p_s2_1
p_s3_1
p_s4_1
r_s2_1
r_s3_1
r_s4_1
s_s2_1
s_s3_1
s_s4_1
sh_s2_1
sh_s3_1
sh_s4_1
sil[2]
sil[3]
sil[4]
t_s2_1
t_s3_1
t_s4_1
th_s2_1
th_s3_1
th_s4_1
uh_s2_1
uh_s3_1
uh_s4_1
uw_s2_1
uw_s3_1
uw_s4_1
v_s2_1
v_s3_1
v_s4_1
w_s2_1
w_s3_1
w_s4_1
y_s2_1
y_s3_1
y_s4_1
z_s2_1
z_s3_1
z_s4_1
zh_s2_1
zh_s3_1
zh_s4_1

72
Demos/Speech/Readme.md Normal file

@@ -0,0 +1,72 @@
# CNTK example: Speech
## License
The contents of this directory are a modified version of the AN4 dataset, pre-processed and optimized for CNTK end-to-end testing.
The data uses the format required by the HTKMLFReader. For details please refer to the documentation.
The [AN4 dataset](http://www.speech.cs.cmu.edu/databases/an4) is part of the CMU audio databases.
This modified version of the dataset is distributed under the terms of the AN4 license, which can be found in 'AdditionalFiles/AN4.LICENSE.html'.
## Overview
|          |   |
|:---------|:--|
|Data:     |Speech data from the CMU Audio Database aka AN4 (http://www.speech.cs.cmu.edu/databases/an4) |
|Purpose:  |Showcase how to train feed forward and LSTM networks for speech data |
|Network:  |SimpleNetworkBuilder for 2-layer FF, NdlNetworkBuilder for 3-layer LSTM network |
|Training: |Data-parallel 1-Bit SGD with adjusted learning rate |
|Comments: |There are two config files: FeedForward.config and LSTM-NDL.config for FF and LSTM training respectively |
## Running the example
### Getting the data
The data for this example is already contained in the folder Demos/Speech/Data/.
### Setup
Compile the sources to generate the cntk executable (not required if you downloaded the binaries).
__Windows:__ Add the folder of the cntk executable to your path
(e.g. `set PATH=%PATH%;c:/src/cntk/x64/Debug/;`)
or prefix the call to the cntk executable with the corresponding folder.
__Linux:__ Add the folder of the cntk executable to your path
(e.g. `export PATH=$PATH:$HOME/src/cntk/build/debug/bin/`)
or prefix the call to the cntk executable with the corresponding folder.
### Run
Run the example from the Demos/Speech/Data folder using:
`cntk configFile=../Config/FeedForward.config`
or run from any folder and specify the Data folder as the `currentDirectory`,
e.g. running from the Demos/Speech folder using:
`cntk configFile=Config/FeedForward.config currentDirectory=Data`
The output folder will be created inside Demos/Speech/.
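Config parameters can also be overridden on the command line (see the comment at the top of the rnn.config included in this commit). For example, to force CPU training one could run:
`cntk configFile=../Config/FeedForward.config deviceId=-1`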
## Details
### Config files
The config files define a `RootDir` variable and several other variables for directories.
The `ConfigDir` and `ModelDir` variables define the folders for additional config files and for model files.
These variables will be overwritten when running on the Philly cluster.
__It is therefore recommended to generally use `ConfigDir` and `ModelDir` in all config files.__
To run on CPU set `deviceId = -1`; to run on GPU set `deviceId` to "auto" or to a specific value >= 0.
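As a rough sketch (the exact lines below are taken from the rnn.config included in this commit; the speech configs follow the same pattern but may differ in detail), the top of a demo config defines the directory variables and the device as follows:

```
RootDir = ".."
ConfigDir = "$RootDir$/Config"
DataDir = "$RootDir$/Data"
OutputDir = "$RootDir$/Output"
ModelDir = "$OutputDir$/Models"
# deviceId = -1 for CPU, >= 0 for a specific GPU, "auto" picks the best available device
deviceId = "auto"
```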
The FeedForward.config file uses the SimpleNetworkBuilder to create a 2-layer
feed forward network with sigmoid nodes and a softmax layer.
The LSTM-NDL.config file uses the NdlNetworkBuilder and refers to the lstmp-3layer_WithSelfStab.ndl file.
In the NDL file, an LSTM component is defined and used to create a 3-layer LSTM network with a softmax layer.
Both configurations define and execute only a single training task:
`command=speechTrain`
The trained models for each epoch are stored in the output models folder.
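Multiple tasks can be chained in the `command` parameter by separating them with colons; the Text demo's rnn.config in this commit, for instance, uses `command = writeWordAndClassInfo:train:test`.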
### Additional files
The 'AdditionalFiles' folder contains the license terms for the AN4 audio database.

Binary data
Demos/Text/AdditionalFiles/perplexity.class50.lr0.1.txt Normal file

Binary file not shown.


@@ -0,0 +1,40 @@
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 5.4415899 Perplexity = 230.80885 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 5.4415899 Perplexity = 230.80885
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 5.2513086 Perplexity = 190.8158 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 5.2513086 Perplexity = 190.8158
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 5.1372039 Perplexity = 170.23909 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 5.1372039 Perplexity = 170.23909
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 5.0720036 Perplexity = 159.49358 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 5.0720036 Perplexity = 159.49358
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 5.0618825 Perplexity = 157.88746 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 5.0618825 Perplexity = 157.88746
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 5.0321352 Perplexity = 153.25991 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 5.0321352 Perplexity = 153.25991
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 5.0083887 Perplexity = 149.66339 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 5.0083887 Perplexity = 149.66339
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 5.0220441 Perplexity = 151.72111 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 5.0220441 Perplexity = 151.72111
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9666183 Perplexity = 143.54066 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9666183 Perplexity = 143.54066
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9719939 Perplexity = 144.31436 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9719939 Perplexity = 144.31436
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9397468 Perplexity = 139.73487 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9397468 Perplexity = 139.73487
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9442405 Perplexity = 140.36421 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9442405 Perplexity = 140.36421
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9287878 Perplexity = 138.21187 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9287878 Perplexity = 138.21187
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9292672 Perplexity = 138.27814 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9292672 Perplexity = 138.27814
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9187232 Perplexity = 136.82781 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9187232 Perplexity = 136.82781
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9224714 Perplexity = 137.34162 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9224714 Perplexity = 137.34162
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9124989 Perplexity = 135.97878 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9124989 Perplexity = 135.97878
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9146056 Perplexity = 136.26556 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9146056 Perplexity = 136.26556
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9083939 Perplexity = 135.42174 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9083939 Perplexity = 135.42174
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9042288 Perplexity = 134.85886 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9042288 Perplexity = 134.85886
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9007954 Perplexity = 134.39664 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.9007954 Perplexity = 134.39664
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8977358 Perplexity = 133.98607 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8977358 Perplexity = 133.98607
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8960381 Perplexity = 133.75879 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8960381 Perplexity = 133.75879
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8950807 Perplexity = 133.63079 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8950807 Perplexity = 133.63079
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8947338 Perplexity = 133.58444 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8947338 Perplexity = 133.58444
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8945776 Perplexity = 133.56358 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8945776 Perplexity = 133.56358
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8944772 Perplexity = 133.55017 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8944772 Perplexity = 133.55017
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8944128 Perplexity = 133.54157 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8944128 Perplexity = 133.54157
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8943936 Perplexity = 133.53901 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8943936 Perplexity = 133.53901
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.89438 Perplexity = 133.53718 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.89438 Perplexity = 133.53718
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8943731 Perplexity = 133.53627 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8943731 Perplexity = 133.53627
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8943699 Perplexity = 133.53584 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8943699 Perplexity = 133.53584
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.894368 Perplexity = 133.53558 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.894368 Perplexity = 133.53558
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.894367 Perplexity = 133.53545 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.894367 Perplexity = 133.53545
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8943667 Perplexity = 133.53541 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8943667 Perplexity = 133.53541
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8943665 Perplexity = 133.53538 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8943665 Perplexity = 133.53538
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8943664 Perplexity = 133.53537 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8943664 Perplexity = 133.53537
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8943664 Perplexity = 133.53537 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8943664 Perplexity = 133.53537
Final Results: Minibatch[1-8905]: Samples Seen = 73760 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8943663 Perplexity = 133.53537 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8943663 Perplexity = 133.53537
Final Results: Minibatch[1-82430]: Samples Seen = 82430 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8084526 Perplexity = 122.54185 TrainNodeNCEBasedCrossEntropy: NCEBasedCrossEntropyWithSoftmax/Sample = 4.8084526 Perplexity = 122.54185


@@ -0,0 +1,446 @@
# Parameters can be overwritten on the command line
# for example: cntk configFile=myConfigFile RootDir=../..
# For running from Visual Studio add
# currentDirectory=$(SolutionDir)/<path to corresponding data folder>
RootDir = ".."
ConfigDir = "$RootDir$/Config"
DataDir = "$RootDir$/Data"
OutputDir = "$RootDir$/Output"
ModelDir = "$OutputDir$/Models"
# deviceId=-1 for CPU, >=0 for GPU devices, "auto" chooses the best GPU, or CPU if no usable GPU is available
deviceId = "auto"
command = writeWordAndClassInfo:train:test
precision = "float"
traceLevel = 1
modelPath = "$ModelDir$/rnn.dnn"
# uncomment the following line to write logs to a file
#stderr=$OutputDir$/rnnOutput
type = double
numCPUThreads = 1
confVocabSize = 10000
confClassSize = 50
trainFile = "ptb.train.cntk.txt"
validFile = "ptb.valid.cntk.txt"
testFile = "ptb.test.cntk.txt"
writeWordAndClassInfo = [
action = "writeWordAndClass"
inputFile = "$DataDir$/$trainFile$"
outputVocabFile = "$ModelDir$/vocab.txt"
outputWord2Cls = "$ModelDir$/word2cls.txt"
outputCls2Index = "$ModelDir$/cls2idx.txt"
vocabSize = "$confVocabSize$"
nbrClass = "$confClassSize$"
cutoff = 0
printValues = true
]
#######################################
# TRAINING CONFIG #
#######################################
train = [
action = "train"
minibatchSize = 2048
traceLevel = 1
epochSize = 0
recurrentLayer = 1
defaultHiddenActivity = 0.1
useValidation = true
rnnType = "CLASSLSTM"
SimpleNetworkBuilder = [
trainingCriterion = "classCrossEntropyWithSoftmax"
evalCriterion = "classCrossEntropyWithSoftmax"
nodeType = "sigmoid"
initValueScale = 6.0
layerSizes = "$confVocabSize$:150:200:10000"
addPrior = false
addDropoutNodes = false
applyMeanVarNorm = false
uniformInit = true
lookupTableOrder = 1
# these are for the class information for class-based language modeling
vocabSize = "$confVocabSize$"
nbrClass = "$confClassSize$"
]
SGD = [
learningRatesPerSample = 0.1
momentumPerMB = 0
gradientClippingWithTruncation = true
clippingThresholdPerSample = 15.0
maxEpochs = 16
unroll = false
numMBsToShowResult = 100
gradUpdateType = "none"
loadBestModel = true
# settings for Auto Adjust Learning Rate
AutoAdjust = [
autoAdjustLR = "adjustAfterEpoch"
reduceLearnRateIfImproveLessThan = 0.001
continueReduce = false
increaseLearnRateIfImproveMoreThan = 1000000000
learnRateDecreaseFactor = 0.5
learnRateIncreaseFactor = 1.382
numMiniBatch4LRSearch = 100
numPrevLearnRates = 5
numBestSearchEpoch = 1
]
dropoutRate = 0.0
]
reader = [
readerType = "LMSequenceReader"
randomize = "none"
nbruttsineachrecurrentiter = 10
# word class info
wordclass = "$ModelDir$/vocab.txt"
# if writerType is set, we will cache to a binary file
# if the binary file exists, we will use it instead of parsing this file
# writerType=BinaryReader
# write definition
wfile = "$OutputDir$/sequenceSentence.bin"
# wsize - initial size of the file in MB
# if calculated size would be bigger, that is used instead
wsize = 256
# wrecords - number of records we should allocate space for in the file
# files cannot be expanded, so this should be large enough. If known, modify this element in the config before creating the file
wrecords = 1000
# windowSize - number of records we should include in BinaryWriter window
windowSize = "$confVocabSize$"
file = "$DataDir$/$trainFile$"
# additional features sections
# for now store as expanded category data (including label in)
features = [
# sentence has no features, so need to set dimension to zero
dim = 0
# write definition
sectionType = "data"
]
# sequence break table, list indexes into sequence records, so we know when a sequence starts/stops
sequence = [
dim = 1
wrecords = 2
# write definition
sectionType = "data"
]
#labels sections
labelIn = [
dim = 1
labelType = "Category"
beginSequence = "</s>"
endSequence = "</s>"
# vocabulary size
labelDim = "$confVocabSize$"
labelMappingFile = "$OutputDir$/sentenceLabels.txt"
# Write definition
# sizeof(unsigned) which is the label index type
elementSize = 4
sectionType = "labels"
mapping = [
# redefine number of records for this section, since we don't need to save it for each data record
wrecords = 11
# variable size so use an average string size
elementSize = 10
sectionType = "labelMapping"
]
category = [
dim = 11
# elementSize = sizeof(ElemType) is default
sectionType = "categoryLabels"
]
]
# labels sections
labels = [
dim = 1
labelType = "NextWord"
beginSequence = "O"
endSequence = "O"
# vocabulary size
labelDim = "$confVocabSize$"
labelMappingFile = "$OutputDir$/sentenceLabels.out.txt"
# Write definition
# sizeof(unsigned) which is the label index type
elementSize = 4
sectionType = "labels"
mapping = [
# redefine number of records for this section, since we don't need to save it for each data record
wrecords = 3
# variable size so use an average string size
elementSize = 10
sectionType = "labelMapping"
]
category = [
dim = 3
# elementSize = sizeof(ElemType) is default
sectionType = "categoryLabels"
]
]
]
cvReader = [
# reader to use
readerType = "LMSequenceReader"
randomize = "none"
# word class info
wordclass = "$ModelDir$/vocab.txt"
# if writerType is set, we will cache to a binary file
# if the binary file exists, we will use it instead of parsing this file
# writerType = "BinaryReader"
# write definition
wfile = "$OutputDir$/sequenceSentence.valid.bin"
# wsize - initial size of the file in MB
# if calculated size would be bigger, that is used instead
wsize = 256
# wrecords - number of records we should allocate space for in the file
# files cannot be expanded, so this should be large enough. If known, modify this element in the config before creating the file
wrecords = 1000
# windowSize - number of records we should include in BinaryWriter window
windowSize = "$confVocabSize$"
file = "$DataDir$/$validFile$"
# additional features sections
# for now store as expanded category data (including label in)
features = [
# sentence has no features, so need to set dimension to zero
dim = 0
# write definition
sectionType = "data"
]
# sequence break table, list indexes into sequence records, so we know when a sequence starts/stops
sequence = [
dim = 1
wrecords = 2
# write definition
sectionType = "data"
]
# labels sections
# it should be the same as that in the training set
labelIn = [
dim = 1
# vocabulary size
labelDim = "$confVocabSize$"
labelMappingFile = "$OutputDir$/sentenceLabels.out.txt"
labelType = "Category"
beginSequence = "</s>"
endSequence = "</s>"
# Write definition
# sizeof(unsigned) which is the label index type
elementSize = 4
sectionType = "labels"
mapping = [
# redefine number of records for this section, since we don't need to save it for each data record
wrecords = 11
# variable size so use an average string size
elementSize = 10
sectionType = "labelMapping"
]
category = [
dim = 11
# elementSize = sizeof(ElemType) is default
sectionType = "categoryLabels"
]
]
#labels sections
labels = [
dim = 1
labelType = "NextWord"
beginSequence = "O"
endSequence = "O"
# vocabulary size
labelDim = "$confVocabSize$"
labelMappingFile = "$OutputDir$/sentenceLabels.out.txt"
# Write definition
# sizeof(unsigned) which is the label index type
elementSize = 4
sectionType = "labels"
mapping = [
# redefine number of records for this section, since we don't need to save it for each data record
wrecords = 3
# variable size so use an average string size
elementSize = 10
sectionType = "labelMapping"
]
category = [
dim = 3
# elementSize = sizeof(ElemType) is default
sectionType = "categoryLabels"
]
]
]
]
#######################################
# TEST CONFIG #
#######################################
test = [
action = "eval"
# corresponds to the number of words/characters to train in a minibatch
minibatchSize = 1
# needs to be small since models are updated for each minibatch
traceLevel = 1
epochSize = 0
recurrentLayer = 1
defaultHiddenActivity = 0.1
useValidation = true
rnnType = "CLASSLSTM"
reader = [
# reader to use
readerType = "LMSequenceReader"
randomize = "none"
# word class info
wordclass = "$ModelDir$/vocab.txt"
# if writerType is set, we will cache to a binary file
# if the binary file exists, we will use it instead of parsing this file
# writerType = "BinaryReader"
# write definition
wfile = "$OutputDir$/sequenceSentence.bin"
# wsize - initial size of the file in MB
# if calculated size would be bigger, that is used instead
wsize = 256
# wrecords - number of records we should allocate space for in the file
# files cannot be expanded, so this should be large enough. If known, modify this element in the config before creating the file
wrecords = 1000
# windowSize - number of records we should include in BinaryWriter window
windowSize = "$confVocabSize$"
file = "$DataDir$/$testFile$"
# additional features sections
# for now stored as expanded category data (including the label)
features = [
# sentence has no features, so need to set dimension to zero
dim = 0
# write definition
sectionType = "data"
]
# sequence break table: lists indexes into the sequence records so we know when a sequence starts/stops
sequence = [
dim = 1
wrecords = 2
# write definition
sectionType = "data"
]
# labels section
labelIn = [
dim = 1
# vocabulary size
labelDim = "$confVocabSize$"
labelMappingFile = "$OutputDir$/sentenceLabels.txt"
labelType = "Category"
beginSequence = "</s>"
endSequence = "</s>"
# Write definition
# sizeof(unsigned) which is the label index type
elementSize = 4
sectionType = "labels"
mapping = [
# redefine number of records for this section, since we don't need to save it for each data record
wrecords = 11
# variable size so use an average string size
elementSize = 10
sectionType = "labelMapping"
]
category = [
dim = 11
# elementSize = sizeof(ElemType) is default
sectionType = "categoryLabels"
]
]
# labels section
labels = [
dim = 1
labelType = "NextWord"
beginSequence = "O"
endSequence = "O"
# vocabulary size
labelDim = "$confVocabSize$"
labelMappingFile = "$OutputDir$/sentenceLabels.out.txt"
# Write definition
# sizeof(unsigned) which is the label index type
elementSize = 4
sectionType = "labels"
mapping = [
# redefine number of records for this section, since we don't need to save it for each data record
wrecords = 3
# variable size so use an average string size
elementSize = 10
sectionType = "labelMapping"
]
category = [
dim = 3
# elementSize = sizeof(ElemType) is default
sectionType = "categoryLabels"
]
]
]
]

67
Demos/Text/Readme.md Normal file
Просмотреть файл

@ -0,0 +1,67 @@
# CNTK example: Text
## License
Note: The data is currently not checked into the repository since a license is required for the Penn Treebank data.
## Overview
|         |    |
|:--------|:---|
Data: |The Penn Treebank Project (https://www.cis.upenn.edu/~treebank/) annotates naturally occurring text for linguistic structure.
Purpose: |Showcase how to train a recurrent network for text data.
Network: |SimpleNetworkBuilder for recurrent network with two hidden layers.
Training: |Stochastic gradient descent with adjusted learning rate.
Comments: |The provided configuration file performs class-based RNN training.
## Running the example
### Getting the data
The data for this example is already contained in the folder Demos/Text/Data/.
### Setup
Compile the sources to generate the cntk executable (not required if you downloaded the binaries).
__Windows:__ Add the folder of the cntk executable to your path
(e.g. `set PATH=%PATH%;c:/src/cntk/x64/Debug/;`)
or prefix the call to the cntk executable with the corresponding folder.
__Linux:__ Add the folder of the cntk executable to your path
(e.g. `export PATH=$PATH:$HOME/src/cntk/build/debug/bin/`)
or prefix the call to the cntk executable with the corresponding folder.
### Run
Run the example from the Demos/Text/Data folder using:
`cntk configFile=../Config/rnn.config`
or run from any folder and specify the Data folder as the `currentDirectory`,
e.g. running from the Demos/Text folder using:
`cntk configFile=Config/rnn.config currentDirectory=Data`
The output folder will be created inside Demos/Text/.
## Details
### Config files
The config files define a `RootDir` variable and several other variables for directories.
The `ConfigDir` and `ModelDir` variables define the folders for additional config files and for model files.
These variables will be overwritten when running on the Philly cluster.
__It is therefore recommended to generally use `ConfigDir` and `ModelDir` in all config files.__
To run on CPU, set `deviceId = -1`; to run on GPU, set `deviceId` to `"auto"` or to a specific value >= 0.
The configuration contains three commands.
The first writes the word and class information as three separate files into the data directory.
The training command uses the SimpleNetworkBuilder to build a recurrent network
using `rnnType = CLASSLSTM` and the LMSequenceReader.
The test command evaluates the trained network against the specified `testFile`.
The trained models for each epoch are stored in the output models folder.
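The directory variables and `deviceId` described above can also be overwritten on the command line using the usual `name=value` syntax. A minimal sketch (the output path below is a placeholder, not part of this repository):
`cntk configFile=../Config/rnn.config deviceId=-1 OutputDir=/tmp/rnn_output`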
### Additional files
The 'AdditionalFiles' folder contains perplexity and expected results files for comparison.

[4 image files changed (16 KiB, 14 KiB, 15 KiB, 11 KiB); before/after previews omitted]
Просмотреть файл

@ -1,66 +0,0 @@
WorkDir=.
ModelDir=$WorkDir$/_out/$ConfigName$
stderr=$WorkDir$/_out/$ConfigName$
ndlMacros=$WorkDir$/Macros.ndl
precision=float
deviceId=Auto
command=Train:Test
Train=[
action=train
modelPath=$ModelDir$/01_OneHidden
NDLNetworkBuilder=[
networkDescription=$WorkDir$/01_OneHidden.ndl
]
SGD=[
epochSize=60000
minibatchSize=32
learningRatesPerMB=0.1
momentumPerMB=0
maxEpochs=30
]
reader=[
readerType=UCIFastReader
file=$WorkDir$/Train-28x28.txt
features=[
dim=784
start=1
]
labels=[
dim=1
start=0
labelDim=10
labelMappingFile=$WorkDir$/labelsmap.txt
]
]
]
Test=[
action=test
modelPath=$ModelDir$/01_OneHidden
NDLNetworkBuilder=[
networkDescription=$WorkDir$/01_OneHidden.ndl
]
reader=[
readerType=UCIFastReader
file=$WorkDir$/Test-28x28.txt
features=[
dim=784
start=1
]
labels=[
dim=1
start=0
labelDim=10
labelMappingFile=$WorkDir$/labelsmap.txt
]
]
]

Просмотреть файл

@ -1,25 +0,0 @@
load=ndlMnistMacros
run=DNN
ndlMnistMacros = [
FeatDim = 784
LabelDim = 10
features = Input(FeatDim, tag = feature)
featScale = Const(0.00390625)
featScaled = Scale(featScale, features)
labels = Input(LabelDim, tag = label)
]
DNN=[
hiddenDim = 200
# DNNSigmoidLayer and DNNLayer are defined in Macros.ndl
h1=DNNSigmoidLayer(FeatDim, hiddenDim, featScaled, 1)
ol=DNNLayer(hiddenDim, labelDim, h1, 1)
CE = CrossEntropyWithSoftmax(labels, ol, tag = Criteria)
Err = ErrorPrediction(labels, ol, tag = Eval)
OutputNodes = ol
]

Просмотреть файл

@ -1,66 +0,0 @@
WorkDir=.
ModelDir=$WorkDir$/_out/$ConfigName$
stderr=$WorkDir$/_out/$ConfigName$
ndlMacros=$WorkDir$/Macros.ndl
precision=float
deviceId=Auto
command=Train:Test
Train=[
action=train
modelPath=$ModelDir$/02_Convolution
NDLNetworkBuilder=[
networkDescription=$WorkDir$/02_Convolution.ndl
]
SGD=[
epochSize=60000
minibatchSize=32
learningRatesPerMB=0.5
momentumPerMB=0*10:0.7
maxEpochs=15
]
reader=[
readerType=UCIFastReader
file=$WorkDir$/Train-28x28.txt
features=[
dim=784
start=1
]
labels=[
dim=1
start=0
labelDim=10
labelMappingFile=$WorkDir$/labelsmap.txt
]
]
]
Test=[
action=test
modelPath=$ModelDir$/02_Convolution
NDLNetworkBuilder=[
networkDescription=$WorkDir$/02_Convolution.ndl
]
reader=[
readerType=UCIFastReader
file=$WorkDir$/Test-28x28.txt
features=[
dim=784
start=1
]
labels=[
dim=1
start=0
labelDim=10
labelMappingFile=$WorkDir$/labelsmap.txt
]
]
]

Просмотреть файл

@ -1,66 +0,0 @@
WorkDir=.
ModelDir=$WorkDir$/_out/$ConfigName$
stderr=$WorkDir$/_out/$ConfigName$
ndlMacros=$WorkDir$/Macros.ndl
precision=float
deviceId=Auto
command=Train:Test
Train=[
action=train
modelPath=$ModelDir$/03_ConvBatchNorm
NDLNetworkBuilder=[
networkDescription=$WorkDir$/03_ConvBatchNorm.ndl
]
SGD=[
epochSize=60000
minibatchSize=32
learningRatesPerMB=0.5
momentumPerMB=0*10:0.7
maxEpochs=8
]
reader=[
readerType=UCIFastReader
file=$WorkDir$/Train-28x28.txt
features=[
dim=784
start=1
]
labels=[
dim=1
start=0
labelDim=10
labelMappingFile=$WorkDir$/labelsmap.txt
]
]
]
Test=[
action=test
modelPath=$ModelDir$/03_ConvBatchNorm
NDLNetworkBuilder=[
networkDescription=$WorkDir$/03_ConvBatchNorm.ndl
]
reader=[
readerType=UCIFastReader
file=$WorkDir$/Test-28x28.txt
features=[
dim=784
start=1
]
labels=[
dim=1
start=0
labelDim=10
labelMappingFile=$WorkDir$/labelsmap.txt
]
]
]

Просмотреть файл

@ -56,11 +56,11 @@ if __name__ == "__main__":
trnLbl = loadLabels('http://yann.lecun.com/exdb/mnist/train-labels-idx1-ubyte.gz', 60000)
trn = np.hstack((trnLbl, trnData))
print 'Writing train text file...'
np.savetxt(r'./Train-28x28.txt', trn, fmt = '%u', delimiter='\t')
np.savetxt(r'./../Data/Train-28x28.txt', trn, fmt = '%u', delimiter='\t')
print 'Done.'
testData = loadData('http://yann.lecun.com/exdb/mnist/t10k-images-idx3-ubyte.gz', 10000)
testLbl = loadLabels('http://yann.lecun.com/exdb/mnist/t10k-labels-idx1-ubyte.gz', 10000)
test = np.hstack((testLbl, testData))
print 'Writing test text file...'
np.savetxt(r'./Test-28x28.txt', test, fmt = '%u', delimiter='\t')
np.savetxt(r'./../Data/Test-28x28.txt', test, fmt = '%u', delimiter='\t')
print 'Done.'

Просмотреть файл

@ -0,0 +1,87 @@
# Parameters can be overwritten on the command line
# for example: cntk configFile=myConfigFile RootDir=../..
# For running from Visual Studio add
# currentDirectory=$(SolutionDir)/<path to corresponding data folder>
RootDir = ".."
ConfigDir = "$RootDir$/Config"
DataDir = "$RootDir$/Data"
OutputDir = "$RootDir$/Output"
ModelDir = "$OutputDir$/Models"
deviceId = "auto"
command = train:test
precision = "float"
modelPath = "$ModelDir$/01_OneHidden"
ndlMacros = "$ConfigDir$/Macros.ndl"
# uncomment the following line to write logs to a file
# stderr = "$OutputDir$/01_OneHidden_out"
#######################################
# TRAINING CONFIG #
#######################################
train = [
action = "train"
NDLNetworkBuilder = [
networkDescription = "$ConfigDir$/01_OneHidden.ndl"
]
SGD = [
epochSize = 60000
minibatchSize = 32
learningRatesPerMB = 0.1
momentumPerMB = 0
maxEpochs = 15
]
reader = [
readerType = "UCIFastReader"
file = "$DataDir$/Train-28x28.txt"
features = [
dim = 784
start = 1
]
labels = [
dim = 1
start = 0
labelDim = 10
labelMappingFile = "$DataDir$/labelsmap.txt"
]
]
]
#######################################
# TEST CONFIG #
#######################################
test = [
action = "test"
NDLNetworkBuilder=[
networkDescription = "$ConfigDir$/01_OneHidden.ndl"
]
reader = [
readerType = "UCIFastReader"
file = "$DataDir$/Test-28x28.txt"
features = [
dim = 784
start = 1
]
labels = [
dim = 1
start = 0
labelDim = 10
labelMappingFile = "$DataDir$/labelsmap.txt"
]
]
]

Просмотреть файл

@ -0,0 +1,34 @@
# macros to include
load = ndlMnistMacros
# the actual NDL that defines the network
run = DNN
ndlMnistMacros = [
featDim = 784
labelDim = 10
features = Input(featDim)
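# 1/256: scales raw pixel values (0-255) into roughly [0, 1)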
featScale = Const(0.00390625)
featScaled = Scale(featScale, features)
labels = Input(labelDim)
]
DNN = [
hiddenDim = 200
# DNNSigmoidLayer and DNNLayer are defined in Macros.ndl
h1 = DNNSigmoidLayer(featDim, hiddenDim, featScaled, 1)
ol = DNNLayer(hiddenDim, labelDim, h1, 1)
ce = CrossEntropyWithSoftmax(labels, ol)
err = ErrorPrediction(labels, ol)
# Special Nodes
FeatureNodes = (features)
LabelNodes = (labels)
CriterionNodes = (ce)
EvalNodes = (err)
OutputNodes = (ol)
]

Просмотреть файл

@ -0,0 +1,87 @@
# Parameters can be overwritten on the command line
# for example: cntk configFile=myConfigFile RootDir=../..
# For running from Visual Studio add
# currentDirectory=$(SolutionDir)/<path to corresponding data folder>
RootDir = ".."
ConfigDir = "$RootDir$/Config"
DataDir = "$RootDir$/Data"
OutputDir = "$RootDir$/Output"
ModelDir = "$OutputDir$/Models"
deviceId = "auto"
command = train:test
precision = "float"
modelPath = "$ModelDir$/02_Convolution"
ndlMacros = "$ConfigDir$/Macros.ndl"
# uncomment the following line to write logs to a file
# stderr = "$OutputDir$/02_Convolution_out"
#######################################
# TRAINING CONFIG #
#######################################
train = [
action = "train"
NDLNetworkBuilder = [
networkDescription = "$ConfigDir$/02_Convolution.ndl"
]
SGD = [
epochSize = 60000
minibatchSize = 32
learningRatesPerMB = 0.5
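# momentum schedule in x*n:y notation: 0 for the first 10 epochs, then 0.7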
momentumPerMB = 0*10:0.7
maxEpochs = 15
]
reader = [
readerType = "UCIFastReader"
file = "$DataDir$/Train-28x28.txt"
features = [
dim = 784
start = 1
]
labels = [
dim = 1
start = 0
labelDim = 10
labelMappingFile = "$DataDir$/labelsmap.txt"
]
]
]
#######################################
# TEST CONFIG #
#######################################
test = [
action = "test"
NDLNetworkBuilder = [
networkDescription = "$ConfigDir$/02_Convolution.ndl"
]
reader = [
readerType = "UCIFastReader"
file = "$DataDir$/Test-28x28.txt"
features = [
dim = 784
start = 1
]
labels = [
dim = 1
start = 0
labelDim = 10
labelMappingFile = "$DataDir$/labelsmap.txt"
]
]
]

Просмотреть файл

@ -1,15 +1,18 @@
load=ndlMnistMacros
run=DNN
# macros to include
load = ndlMnistMacros
# the actual NDL that defines the network
run = DNN
ndlMnistMacros = [
ImageW = 28
ImageH = 28
LabelDim = 10
imageW = 28
imageH = 28
labelDim = 10
features = ImageInput(ImageW, ImageH, 1, tag = feature)
features = ImageInput(imageW, imageH, 1)
featScale = Const(0.00390625)
featScaled = Scale(featScale, features)
labels = Input(LabelDim, tag = label)
labels = Input(labelDim)
]
DNN=[
@ -52,8 +55,14 @@ DNN=[
h1 = DNNSigmoidLayer(512, h1Dim, pool2, 1)
ol = DNNLayer(h1Dim, labelDim, h1, 1)
CE = CrossEntropyWithSoftmax(labels, ol, tag = Criteria)
Err = ErrorPrediction(labels, ol, tag = Eval)
OutputNodes = ol
ce = CrossEntropyWithSoftmax(labels, ol)
err = ErrorPrediction(labels, ol)
# Special Nodes
FeatureNodes = (features)
LabelNodes = (labels)
CriterionNodes = (ce)
EvalNodes = (err)
OutputNodes = (ol)
]

Просмотреть файл

@ -0,0 +1,87 @@
# Parameters can be overwritten on the command line
# for example: cntk configFile=myConfigFile RootDir=../..
# For running from Visual Studio add
# currentDirectory=$(SolutionDir)/<path to corresponding data folder>
RootDir = ".."
ConfigDir = "$RootDir$/Config"
DataDir = "$RootDir$/Data"
OutputDir = "$RootDir$/Output"
ModelDir = "$OutputDir$/Models"
deviceId = "auto"
command = train:test
precision = "float"
modelPath = "$ModelDir$/03_ConvBatchNorm"
ndlMacros = "$ConfigDir$/Macros.ndl"
# uncomment the following line to write logs to a file
# stderr = "$OutputDir$/03_ConvBatchNorm_out"
#######################################
# TRAINING CONFIG #
#######################################
train = [
action = "train"
NDLNetworkBuilder = [
networkDescription = "$ConfigDir$/03_ConvBatchNorm.ndl"
]
SGD = [
epochSize = 60000
minibatchSize = 32
learningRatesPerMB = 0.5
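# momentum schedule in x*n:y notation: 0 for the first 10 epochs, then 0.7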
momentumPerMB = 0*10:0.7
maxEpochs = 8
]
reader = [
readerType = "UCIFastReader"
file = "$DataDir$/Train-28x28.txt"
features = [
dim = 784
start = 1
]
labels = [
dim = 1
start = 0
labelDim = 10
labelMappingFile = "$DataDir$/labelsmap.txt"
]
]
]
#######################################
# TEST CONFIG #
#######################################
test = [
action = "test"
NDLNetworkBuilder = [
networkDescription = "$ConfigDir$/03_ConvBatchNorm.ndl"
]
reader = [
readerType = "UCIFastReader"
file = "$DataDir$/Test-28x28.txt"
features = [
dim = 784
start = 1
]
labels = [
dim = 1
start = 0
labelDim = 10
labelMappingFile = "$DataDir$/labelsmap.txt"
]
]
]

Просмотреть файл

@ -1,18 +1,21 @@
load=ndlMnistMacros
run=DNN
# macros to include
load = ndlMnistMacros
# the actual NDL that defines the network
run = DNN
ndlMnistMacros = [
ImageW = 28
ImageH = 28
LabelDim = 10
imageW = 28
imageH = 28
labelDim = 10
features = ImageInput(ImageW, ImageH, 1, tag = feature)
features = ImageInput(imageW, imageH, 1)
featScale = Const(0.00390625)
featScaled = Scale(featScale, features)
labels = Input(LabelDim, tag = label)
labels = Input(labelDim)
]
DNN=[
DNN = [
# conv1
kW1 = 5
kH1 = 5
@ -55,8 +58,14 @@ DNN=[
ol = DNNLayer(h1Dim, labelDim, h1_act, 1)
CE = CrossEntropyWithSoftmax(labels, ol, tag = Criteria)
Err = ErrorPrediction(labels, ol, tag = Eval)
OutputNodes = ol
ce = CrossEntropyWithSoftmax(labels, ol)
err = ErrorPrediction(labels, ol)
# Special Nodes
FeatureNodes = (features)
LabelNodes = (labels)
CriterionNodes = (ce)
EvalNodes = (err)
OutputNodes = (ol)
]

Просмотреть файл

@ -0,0 +1,31 @@
DNNSigmoidLayer(inDim, outDim, x, parmScale) = [
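# fully connected layer with sigmoid activation: y = Sigmoid(W * x + b)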
W = Parameter(outDim, inDim, init="uniform", initValueScale=parmScale)
b = Parameter(outDim, 1, init="uniform", initValueScale=parmScale)
t = Times(W, x)
z = Plus(t, b)
y = Sigmoid(z)
]
DNNLayer(inDim, outDim, x, parmScale) = [
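# fully connected (affine) layer without activation: z = W * x + b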
W = Parameter(outDim, inDim, init="uniform", initValueScale=parmScale)
b = Parameter(outDim, 1, init="uniform", initValueScale=parmScale)
t = Times(W, x)
z = Plus(t, b)
]
ConvReLULayer(inp, outMap, inWCount, kW, kH, hStride, vStride, wScale, bValue) = [
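# convolution (no zero padding) followed by a bias add and a ReLU activation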
convW = Parameter(outMap, inWCount, init="uniform", initValueScale=wScale)
convB = Parameter(outMap, 1, init="fixedValue", value=bValue)
conv = Convolution(convW, inp, kW, kH, outMap, hStride, vStride, zeroPadding=false)
convPlusB = Plus(conv, convB);
act = RectifiedLinear(convPlusB);
]
BatchNorm(dim, x, scaleInit, biasInit) = [
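# normalizes x by its mean and inverse standard deviation, then applies a learned per-element scale (sc) and bias (b)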
m = Mean(x)
isd = InvStdDev(x)
norm = ColumnElementTimes(Minus(x, m), isd)
sc = Parameter(dim, 1, init="uniform", initValueScale=scaleInit)
b = Parameter(dim, 1, init="uniform", initValueScale=biasInit)
bn_norm = Plus(ColumnElementTimes(norm, sc), b)
]

Просмотреть файл

@ -8,5 +8,3 @@
7
8
9

Просмотреть файл

@ -1,35 +0,0 @@
DNNSigmoidLayer(inDim, outDim, x, parmScale)
{
W = Parameter(outDim, inDim, init = Uniform, initValueScale = parmScale)
b = Parameter(outDim, init = Uniform, initValueScale = parmScale)
t = Times(W, x)
z = Plus(t, b)
y = Sigmoid(z)
}
DNNLayer(inDim, outDim, x, parmScale)
{
W = Parameter(outDim, inDim, init = Uniform, initValueScale = parmScale)
b = Parameter(outDim, init = Uniform, initValueScale = parmScale)
t = Times(W, x)
z = Plus(t, b)
}
ConvReLULayer(inp, outMap, inWCount, kW, kH, hStride, vStride, wScale, bValue)
{
convW = Parameter(outMap, inWCount, init = Uniform, initValueScale = wScale)
conv = Convolution(convW, inp, kW, kH, outMap, hStride, vStride, zeroPadding = false)
convB = Parameter(outMap, 1, init = fixedValue, value = bValue)
convPlusB = Plus(conv, convB);
act = RectifiedLinear(convPlusB);
}
BatchNorm(dim, x, scaleInit, biasInit)
{
m = Mean(x)
isd = InvStdDev(x)
norm = ColumnElementTimes(Minus(x, m), isd)
sc = Parameter(dim, 1, init=Uniform, initValueScale=scaleInit)
b = Parameter(dim, 1, init=Uniform, initValueScale=biasInit)
bn_norm = Plus(ColumnElementTimes(norm, sc), b)
}

Просмотреть файл

@ -0,0 +1,82 @@
# CNTK example: MNIST
## Overview
|         |    |
|:--------|:---|
Data: |The MNIST database (http://yann.lecun.com/exdb/mnist/) of handwritten digits.
Purpose: |This example demonstrates usage of NDL to train neural networks on the MNIST dataset.
Network: |NDLNetworkBuilder, simple feed forward and convolutional networks, cross entropy with softmax.
Training: |Stochastic gradient descent both with and without momentum.
Comments: |There are two config files, details are provided below.
## Running the example
### Getting the data
The MNIST dataset is not included in the CNTK distribution but can be easily
downloaded and converted by running the following command from the 'AdditionalFiles' folder:
`python mnist_convert.py`
The script will download all required files and convert them to CNTK-supported format.
The resulting files (Train-28x28.txt and Test-28x28.txt) will be stored in the 'Data' folder.
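As the conversion script and the UCIFastReader settings in the config files indicate, each line of these text files stores one image as tab-separated integers: the label first, followed by the 784 pixel values (the configs read `labels` with `start = 0` and `features` with `start = 1`, `dim = 784`). A shortened, purely illustrative line (tabs shown as spaces):
`5 0 0 ... 252 255 0`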
In case you don't have Python installed, there are two options:
1. Download and install the latest version of Python 2.7 from: https://www.python.org/downloads/
Then install the numpy package by following the instructions at: http://www.scipy.org/install.html#individual-packages
2. Alternatively, install the Anaconda Python distribution, which contains most of the
popular Python packages including numpy: http://continuum.io/downloads
### Setup
Compile the sources to generate the cntk executable (not required if you downloaded the binaries).
__Windows:__ Add the folder of the cntk executable to your path
(e.g. `set PATH=%PATH%;c:/src/cntk/x64/Debug/;`)
or prefix the call to the cntk executable with the corresponding folder.
__Linux:__ Add the folder of the cntk executable to your path
(e.g. `export PATH=$PATH:$HOME/src/cntk/build/debug/bin/`)
or prefix the call to the cntk executable with the corresponding folder.
### Run
Run the example from the Image/MNIST/Data folder using:
`cntk configFile=../Config/01_OneHidden.config`
or run from any folder and specify the Data folder as the `currentDirectory`,
e.g. running from the Image/MNIST folder using:
`cntk configFile=Config/01_OneHidden.config currentDirectory=Data`
The output folder will be created inside Image/MNIST/.
## Details
### Config files
There are three config files and corresponding network description files in the 'Config' folder:
1. 01_OneHidden.ndl is a simple one-hidden-layer network that produces 2.3% error.
To run the sample, navigate to the Data folder and run the following command:
`cntk configFile=../Config/01_OneHidden.config`
2. 02_Convolution.ndl is a more interesting convolutional network with 2 convolutional and 2 max-pooling layers.
The network produces 0.87% error after training for about 2 minutes on a GPU.
To run the sample, navigate to the Data folder and run the following command:
`cntk configFile=../Config/02_Convolution.config`
3. 03_ConvBatchNorm.ndl is almost identical to 02_Convolution.ndl
except that it uses batch normalization for the fully connected layer h1.
Note that batch normalization is implemented using just NDL (see Macros.ndl for details).
As a result, it uses fewer epochs (8 vs. 15 in 02_Convolution) to achieve the same accuracy.
To run the sample, navigate to the Data folder and run the following command:
`cntk configFile=../Config/03_ConvBatchNorm.config`
For more details, refer to .ndl and corresponding .config files.
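As an illustration of the last point, the following NDL fragment is a minimal sketch, assuming layer names similar to those in 02_Convolution.ndl, of how the `BatchNorm` macro from Macros.ndl can be combined with the fully connected layer; it is not a copy of the actual 03_ConvBatchNorm.ndl:
```
# hypothetical wiring, for illustration only
h1     = DNNLayer(512, h1Dim, pool2, 1)        # affine layer from Macros.ndl (no activation)
h1_bn  = BatchNorm(h1Dim, h1, 1, 1)            # mean/inv-stddev normalization with learned scale and bias
h1_act = Sigmoid(h1_bn)                        # activation applied after normalization
ol     = DNNLayer(h1Dim, labelDim, h1_act, 1)  # output layer
```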
### Additional files
The 'AdditionalFiles' folder contains the python script to download and convert the data.

Просмотреть файл

@ -1,28 +0,0 @@
This example demonstrates usage of NDL to train 2 neural networks on MNIST dataset (http://yann.lecun.com/exdb/mnist/).
MNIST dataset is not included in CNTK distribution but can be easily downloaded and converted by running the following command from this folder:
python mnist_convert.py
The script will download all required files and convert them to CNTK-supported format.
In case you don't have a Python installed, there are 2 options:
1. Download and install latest version of Python 2.7 from: https://www.python.org/downloads/
Then install numpy package by following instruction from: http://www.scipy.org/install.html#individual-packages
2. Alternatively install Python Anaconda distribution which contains most of the popular Python packages including numpy:
http://continuum.io/downloads
Short description of the networks:
1. 01_OneHidden.ndl is a simple, one hidden layer network that produces 2.3% of error.
To run the sample, navigate to this folder and run the following command:
<path to CNTK executable> configFile=01_OneHidden.config configName=01_OneHidden
2. 02_Convolution.ndl is more interesting, convolutional network which has 2 convolutional and 2 max pooling layers. The network produces 0.87% of error after training for about 2 minutes on GPU.
To run the sample, navigate to this folder and run the following command:
<path to CNTK executable> configFile=02_Conv.config configName=02_Conv
3. 03_ConvBatchNorm.ndl is almost identical to 02_Convolution.ndl except that it uses batch normalization for the fully connected layer h1.
Note that batch normalization is implemented using just NDL (see Macros.ndl for details).
As a result, it uses less epochs (8 vs 15 in 02_Convolution) to achieve the same accuracy.
To run the sample, navigate to this folder and run the following command:
<path to CNTK executable> configFile=03_ConvBatchNorm.config configName=03_ConvBatchNorm
For more details, refer to .ndl and corresponding .config files.

14
ExampleSetups/Readme.md Normal file
Просмотреть файл

@ -0,0 +1,14 @@
# CNTK Demos and Example Setups
This folder contains examples that correspond to popular data sets and tasks.
These data sets often require a license and are therefore not included in the repository.
The 'Demos' folder contains a few self-contained and documented demos to get started with CNTK.
The four examples shown in the table below provide a good introduction to CNTK.
|Folder | Domain | Network types |
|:------------------------|:-------------------------------------------------|:----------------|
Demos/Simple2d | Synthetic 2d data | FF (CPU and GPU)
Demos/Speech | Speech data (CMU AN4) | FF and LSTM
Demos/Text | Text data (penn treebank) | RNN
ExampleSetups/Image/MNIST | Image data (MNIST handwritten digit recognition) | CNN