This commit is contained in:
Big Data Tech. Lab 이태훈 2017-07-06 09:12:52 +09:00
Parent d66929bda4
Commit d3edc8902d
9 changed files with 9 additions and 9 deletions

View file

@@ -33,7 +33,7 @@ for filename in os.listdir(filepath):
xlist = []
ylist = []
-# loop through to get name and and BBox values from object
+# loop through to get name and BBox values from object
for child in child:
if str(child.tag) == 'name':
label = child.text

View file

@@ -84,7 +84,7 @@ void ComputationNetwork::ClearNetwork()
// serialization
// -----------------------------------------------------------------------
-// after after editing--network is possibly not validated/compiled
+// after editing--network is possibly not validated/compiled
void ComputationNetwork::SaveEdited(const wstring& fileName, const FileOptions fileFormat)
{
if (!IsCompiled())

View file

@@ -318,7 +318,7 @@ void ComputationNetwork::DetermineSCCsR(ComputationNodeBasePtr cur,
}
}
if (bFound)
-fprintf(stderr, "\nDetermineSCCsR: %ls %ls operation was discovered multiple times as as loop participant", cur->NodeName().c_str(), cur->OperationName().c_str());
+fprintf(stderr, "\nDetermineSCCsR: %ls %ls operation was discovered multiple times as loop participant", cur->NodeName().c_str(), cur->OperationName().c_str());
// TODO: Once we forbid FormRecurrentLoops() from non-NULL, can we ever re-hit a loop here? If not, then turn bFound into a LogicError().
if (!bFound)
{

View file

@@ -632,7 +632,7 @@ public:
}
// Update K (number of url pairs that have smaller or equal gain), rk (rank of
-// score in descending order) and and m_pairwiseDifferences (save for gradient
+// score in descending order) and m_pairwiseDifferences (save for gradient
// computation).
size_t pairCount = 0;
for (auto &qu : m_queryUrls)

View file

@@ -1246,7 +1246,7 @@ void SectionStats::InitCompute(const ConfigArray& compute)
m_rms = 0.0; // root mean square
// second pass measures
-m_varSum = 0.0; // accumulated sum of difference between the mean and and the value squared
+m_varSum = 0.0; // accumulated sum of difference between the mean and the value squared
// compute after second pass
m_variance = 0.0;

View file

@@ -446,7 +446,7 @@ private:
double m_pvariance;
// second pass measures
-double m_varSum; // accumulated sum of difference between the mean and and the value squared
+double m_varSum; // accumulated sum of difference between the mean and the value squared
// compute after second pass
double m_variance;

View file

@@ -514,7 +514,7 @@
"training_data = data[\"2001-02-05\":\"2009-01-20\"] \n",
"\n",
"# We define our test data as: data[\"2008-01-02\":]\n",
-"# This example allows to to include data up to current date\n",
+"# This example allows to include data up to current date\n",
"\n",
"test_data= data[\"2009-01-20\":\"2016-12-29\"] \n",
"training_features = np.asarray(training_data[predictor_names], dtype = \"float32\")\n",

View file

@@ -593,7 +593,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
-"Above, we use the function `element_select` which will return one of two options given the condition `is_first_label`. Remember that we're working with sequences so when the decoder LSTM is run its input will be unrolled along with the network. The above allows us to to have a dynamic input that will return a specific element given what time step we're currently processing.\n",
+"Above, we use the function `element_select` which will return one of two options given the condition `is_first_label`. Remember that we're working with sequences so when the decoder LSTM is run its input will be unrolled along with the network. The above allows us to have a dynamic input that will return a specific element given what time step we're currently processing.\n",
"\n",
"Therefore, the `decoder_input` will be `label_sentence_start_scattered` (which is simply `<s>`) when we are at the first time step, and otherwise it will return the `past_value` (i.e. the previous element given what time step we're currently at) of `label_sequence`.\n",
"\n",
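The notebook cell in this hunk describes `element_select` as returning one of two inputs per time step depending on `is_first_label`. As a rough illustration of that selection rule (a plain-Python sketch over lists, not the actual CNTK tensor op, and `element_select` here is a hypothetical stand-in):

```python
def element_select(condition, value_if_true, value_if_false):
    # Per-step selection: take the element from value_if_true where the
    # condition holds, otherwise from value_if_false. Plain-Python sketch
    # of the behaviour the notebook describes, not the CNTK operator.
    return [t if c else f
            for c, t, f in zip(condition, value_if_true, value_if_false)]

# At the first time step the start token <s> is selected; at later steps
# the previous label (the past_value) is selected instead.
is_first_label = [True, False, False]
sentence_start = ["<s>", "<s>", "<s>"]
past_labels = ["<pad>", "a", "b"]
decoder_input = element_select(is_first_label, sentence_start, past_labels)
# decoder_input == ["<s>", "a", "b"]
```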

View file

@@ -128,7 +128,7 @@ class MinibatchSource(cntk_py.MinibatchSource):
**Important:**
Click :cntkwiki:`here <BrainScript-epochSize-and-Python-epoch_size-in-CNTK>`
for a description of input and label samples.
-max_sweeps (`int`, defaults to :const:`cntk.io.INFINITELY_REPEAT`): The maximum number of of sweeps over
+max_sweeps (`int`, defaults to :const:`cntk.io.INFINITELY_REPEAT`): The maximum number of sweeps over
the input dataset. After this number has been reached, the reader returns empty minibatches on
subsequent calls to :func:`next_minibatch`. `max_samples` and `max_sweeps` are mutually exclusive;
an exception will be raised if both have non-default values.
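The `max_sweeps` behaviour this docstring describes (repeat the dataset a fixed number of times, then return empty minibatches) can be sketched in plain Python; `sweep_minibatches` below is a hypothetical illustration, not part of the `cntk.io` API:

```python
def sweep_minibatches(dataset, batch_size, max_sweeps):
    # Repeat the dataset max_sweeps times, then slice into minibatches.
    items = [x for _ in range(max_sweeps) for x in dataset]
    for i in range(0, len(items), batch_size):
        yield items[i:i + batch_size]
    # Once max_sweeps has been reached, only empty minibatches come back,
    # mirroring the reader behaviour described in the docstring above.
    while True:
        yield []

reader = sweep_minibatches([1, 2, 3], batch_size=2, max_sweeps=2)
# yields [1, 2], [3, 1], [2, 3], then [] on every later call
```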