{
|
|
"cells": [
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 1,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"from IPython.display import Image"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {
|
|
"nbpresent": {
|
|
"id": "29b9bd1d-766f-4422-ad96-de0accc1ce58"
|
|
}
|
|
},
|
|
"source": [
|
|
"# CNTK 102: Feed Forward Network with Simulated Data\n",
|
|
"\n",
|
|
"The purpose of this tutorial is to familiarize you with quickly combining components from the CNTK python library to perform a **classification** task. You may skip *Introduction* section, if you have already completed the Logistic Regression tutorial or are familiar with machine learning. \n",
|
|
"\n",
|
|
"## Introduction\n",
|
|
"\n",
|
|
"**Problem** (recap from CNTK 101):\n",
|
|
"\n",
|
|
"A cancer hospital has provided data and wants us to determine if a patient has a fatal [malignant][] cancer vs. a benign growth. This is known as a classification problem. To help classify each patient, we are given their age and the size of the tumor. Intuitively, one can imagine that younger patients and/or patient with small tumor size are less likely to have malignant cancer. The data set simulates this application where the each observation is a patient represented as a dot where red color indicates malignant and blue indicates a benign disease. Note: This is a toy example for learning, in real life there are large number of features from different tests/examination sources and doctors' experience that play into the diagnosis/treatment decision for a patient.\n",
|
|
"[malignant]: https://en.wikipedia.org/wiki/Malignancy"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 2,
|
|
"metadata": {},
|
|
"outputs": [
|
|
{
|
|
"data": {
|
|
"text/html": [
|
|
"<img src=\"https://www.cntk.ai/jup/cancer_data_plot.jpg\" width=\"400\" height=\"400\"/>"
|
|
],
|
|
"text/plain": [
|
|
"<IPython.core.display.Image object>"
|
|
]
|
|
},
|
|
"execution_count": 2,
|
|
"metadata": {},
|
|
"output_type": "execute_result"
|
|
}
|
|
],
|
|
"source": [
|
|
"# Figure 1\n",
|
|
"Image(url=\"https://www.cntk.ai/jup/cancer_data_plot.jpg\", width=400, height=400)"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"**Goal**:\n",
|
|
"Our goal is to learn a classifier that classifies any patient into either benign or malignant category given two features (age, tumor size). \n",
|
|
"\n",
|
|
"In CNTK 101 tutorial, we learnt a linear classifier using Logistic Regression which misclassified some data points. Often in real world problems, linear classifiers cannot accurately model the data in situations where there is little to no knowledge of how to construct good features. This often results in accuracy limitations and requires models that have more complex decision boundaries. In this tutorial, we will combine multiple linear units (from the CNTK 101 tutorial - Logistic Regression) to a non-linear classifier. The other aspect of such classifiers where the feature encoders are automatically learnt from the data will be covered in later tutorials. \n",
|
|
"\n",
|
|
"**Approach**:\n",
|
|
"Any learning algorithm has typically five stages. These are Data reading, Data preprocessing, Creating a model, Learning the model parameters, and Evaluating (a.k.a. testing/prediction) the model. \n",
|
|
"\n",
|
|
"We keep everything same as CNTK 101 except for the third (Model creation) step where we use a feed forward network instead.\n",
|
|
" \n",
|
|
"\n",
|
|
"## Feed forward network model\n",
|
|
"\n",
|
|
"The data set used is similar to the one used in the Logistic Regression tutorial. The model combines multiple logistic classifiers to be able to classify data when the decision boundary needed to properly categorize the data is more complex than a simple linear model (like Logistic Regression). The figure below illustrates the general shape of the network."
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 3,
|
|
"metadata": {},
|
|
"outputs": [
|
|
{
|
|
"data": {
|
|
"text/html": [
|
|
"<img src=\"https://upload.wikimedia.org/wikipedia/en/5/54/Feed_forward_neural_net.gif\" width=\"200\" height=\"200\"/>"
|
|
],
|
|
"text/plain": [
|
|
"<IPython.core.display.Image object>"
|
|
]
|
|
},
|
|
"execution_count": 3,
|
|
"metadata": {},
|
|
"output_type": "execute_result"
|
|
}
|
|
],
|
|
"source": [
|
|
"# Figure 2\n",
|
|
"Image(url=\"https://upload.wikimedia.org/wikipedia/en/5/54/Feed_forward_neural_net.gif\", width=200, height=200)"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"A feedforward neural network is an artificial neural network where connections between the units **do not** form a cycle.\n",
|
|
"The feedforward neural network was the first and simplest type of artificial neural network devised. In this network, the information moves in only one direction, forward, from the input nodes, through the hidden nodes (if any) and to the output nodes. There are no cycles or loops in the network\n",
|
|
"\n",
|
|
"In this tutorial, we will go through the different steps needed to complete the five steps for training and testing a model on the toy data."
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 4,
|
|
"metadata": {
|
|
"collapsed": true,
|
|
"nbpresent": {
|
|
"id": "138d1a78-02e2-4bd6-a20e-07b83f303563"
|
|
}
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"# Import the relevant components\n",
|
|
"from __future__ import print_function # Use a function definition from future version (say 3.x from 2.7 interpreter)\n",
|
|
"import matplotlib.pyplot as plt\n",
|
|
"%matplotlib inline\n",
|
|
"\n",
|
|
"import numpy as np\n",
|
|
"import sys\n",
|
|
"import os\n",
|
|
"\n",
|
|
"import cntk as C\n",
|
|
"import cntk.tests.test_utils\n",
|
|
"cntk.tests.test_utils.set_device_from_pytest_env() # (only needed for our build system)\n",
|
|
"C.cntk_py.set_fixed_random_seed(1) # fix a random seed for CNTK components"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"## Data Generation\n",
|
|
"This section can be *skipped* (next section titled <a href='#Model Creation'>Model Creation</a>) if you have gone through CNTK 101. \n",
|
|
"\n",
|
|
"Let us generate some synthetic data emulating the cancer example using `numpy` library. We have two features (represented in two-dimensions) each either being to one of the two classes (benign:blue dot or malignant:red dot). \n",
|
|
"\n",
|
|
"In our example, each observation in the training data has a label (blue or red) corresponding to each observation (set of features - age and size). In this example, we have two classes represented by labels 0 or 1, thus a binary classification task."
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 6,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"# Ensure we always get the same amount of randomness\n",
|
|
"np.random.seed(0)\n",
|
|
"\n",
|
|
"# Define the data dimensions\n",
|
|
"input_dim = 2\n",
|
|
"num_output_classes = 2"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"### Input and Labels\n",
|
|
"\n",
|
|
"In this tutorial we are generating synthetic data using `numpy` library. In real world problems, one would use a reader, that would read feature values (`features`: *age* and *tumor size*) corresponding to each observation (patient). Note, each observation can reside in a higher dimension space (when more features are available) and will be represented as a tensor in CNTK. More advanced tutorials shall introduce the handling of high dimensional data."
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 7,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"# Helper function to generate a random data sample\n",
|
|
"def generate_random_data_sample(sample_size, feature_dim, num_classes):\n",
|
|
" # Create synthetic data using NumPy. \n",
|
|
" Y = np.random.randint(size=(sample_size, 1), low=0, high=num_classes)\n",
|
|
"\n",
|
|
" # Make sure that the data is separable\n",
|
|
" X = (np.random.randn(sample_size, feature_dim)+3) * (Y+1)\n",
|
|
" X = X.astype(np.float32) \n",
|
|
" # converting class 0 into the vector \"1 0 0\", \n",
|
|
" # class 1 into vector \"0 1 0\", ...\n",
|
|
" class_ind = [Y==class_number for class_number in range(num_classes)]\n",
|
|
" Y = np.asarray(np.hstack(class_ind), dtype=np.float32)\n",
|
|
" return X, Y "
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 8,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"# Create the input variables denoting the features and the label data. Note: the input \n",
|
|
"# does not need additional info on number of observations (Samples) since CNTK first create only \n",
|
|
"# the network tooplogy first \n",
|
|
"mysamplesize = 64\n",
|
|
"features, labels = generate_random_data_sample(mysamplesize, input_dim, num_output_classes)\n"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"Let us visualize the input data. \n",
|
|
"\n",
|
|
"**Caution**: If the import of `matplotlib.pyplot` fails, please run `conda install matplotlib` which will fix the `pyplot` version dependencies"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 9,
|
|
"metadata": {},
|
|
"outputs": [
|
|
{
|
|
"data": {
|
|
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYMAAAEPCAYAAACgFqixAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3Xd8FHX+x/HXJwkpG5II0kFQpClVqjQJCkjxQD1QsWP5\neXZBREEUUE8FQUVFUdScJ7YDATkUKUIoKgLSETlClQ6BQMiGtP38/tiFixyQtptJ+Twfj32QGWa+\n817KfnbmO/P9iqpijDGmdAtyOoAxxhjnWTEwxhhjxcAYY4wVA2OMMVgxMMYYgxUDY4wxBLgYiMhH\nInJARNZlWzdGRDaJyBoR+VpEogOZwRhjTM4CfWYQB1x7xrq5QENVbQZsAYYGOIMxxpgcBLQYqOpS\n4OgZ6+arqse3uAyoEcgMxhhjcuZ0n8E9wGyHMxhjTKnnWDEQkWeBDFX93KkMxhhjvEKcOKiI3A30\nBK7OYTsbOMkYY/JBVSUv2xfGmYH4Xt4Fke7AU0BvVU3LaWdVLbGvESNGOJ7B3p+9N3t/Je+VH4G+\ntfRz4CegnojsEpEBwNtAWWCeiKwSkXcDmcEYY0zOAnqZSFVvPcvquEAe0xhjTN45fTdRqRYbG+t0\nhIAqye+vJL83sPdXGkl+ry8VBhHRopzPGGOKIhFBi2AHsjHGmCLOioExxhgrBsYYY6wYGGOMwYqB\nMcYYrBgYY4zBioExxhisGBhjjMGKgTHGGKwYGGOMwYqBMcYYrBgYY4zBioExxhisGBhjjMGKgTHG\nGKwYGGOMwYqBMcYYrBgYY4zBioExxhisGBhjjMGKgTHGGKwYGGOMwYqBMcYYrBgYY4zBioExxhis\nGBhjjCHAxUBEPhKRAyKyLtu6ciIyV0Q2i8gcEYkJZAZjjDE5C/SZQRxw7RnrngHmq2p9YAEwNMAZ\njDHG5CCgxUBVlwJHz1jdB/jE9/MnwPWBzGCMMSZnTvQZVFLVAwCquh+o5ECGYi81NZWkpCSnYxhj\nSoii0IGsTgcoTlSVYU8+SfnoaGpUqkTXdu2sKBhjCizEgWMeEJHKqnpARKoAB8+38ciRI0//HBsb\nS2xsbGDTFXFffvkl306cyB+ZmZQDHvz1Vx6//34+mTLF6Wi5cvz4cSa88w4H9+yh87XX0rt3b6cj\nGQepKidPniQiIsLpKMVafHw88fHxBWpDVAP7xVxELgb+raqNfcujgSOqOlpEngbKqeoz59hXA52v\nuHn8b3+j1vvvM8i3vBH4a7Vq/L5nj5OxciUlJYV2TZvSePdumqWl8b7LxYMjRjBoyBCnoxkHzJ49\nm7tuvpmklBTq1azJlO++47LLLnM6VokgIqiq5GWfQN9a+jnwE1BPRHaJyADgVaCriGwGrvEtm1y6\n6NJL+TE8HI9veakINWrUcDRTbk2fPp1q+/fzaVoag4E5bjejRozAnwU/PT2dBQsWMGfOHJKTk/3W\nrvGvXbt2cWffvkxPTibN4+HRnTvp3aULHo8n551NQAT0MpGq3nqO3+oSyOOWZA89/DAzPv+ctlu3\nUkmEX4OCmPvRR07HyhW3201lVU59XakMnMzIQFURydOXmLNKTk6mS9u2ZO3ahUuEvZGRLPzlFy66\n6KICt238a/Xq1bQJCaG9b/kBVZ47coQDBw5QtWpVR7OVVk70GZgCcLlcLPjlFxYsWIDb7ebjjh2p\nWLGi07FypWvXrjwrwudAM+CF8HCu79KFoCD/nKCOfukl6ick8ElaGgKMTElhyMMP88XMmX5p3/hP\nlSpV+C0rixNAWWALkKpKuXLlHE5WelkxKIZCQ0Pp3r270zHy7JJLLmHWDz8w+IEHOHjwIJ27dmXc\nu+/6rf0dmzfT1VcIAK7JymJOQoLf2jf+07p1a67t25eWU6fSGpinyhuvv054eLjT0UqtgHcgF4R1\nIJu8eGPsWGaOGMEst5swYEBYGDG33cY7xeQyWmmjqixcuJCdO3fSvHlzmjZt6nSkEiM/HchWDEyJ\nkZmZyf23387X06cTIkKrli2ZOns2UVFRTkczplBZMTAGSExMJDMzk0qVKvmlY9qY4saKgTHGmKL3\nnIExxpjiwYqBMcYYKwbGGGOsGBhjjMGKgTHGGKwYGGOMwYqBMaYIyMzMZPPmzezatcvpKKWWFQNj\njKMOHjzIlY0bc22LFrSoX5+7brqJrKwsp2OVOlYMjDGOevy++4hNSGB7Sgo7T55kx7ffMumDD5yO\nVepYMTDGOGr9mjXckZmJAC6gr9vNuhUrnI5V6lgxMMY4qm79+vw7OBiADGB2RAR1GzVyNlQpZGMT\nGWMctWvXLrq0a0d0cjJJWVk0aNWKaXPmEBoa6nS0YssGqjOmiFNV3nvnHd5//XUAHnrqKR546CGH\nUznP7XazZs0aIiIiaNq0qd9mvyutrBgYU8R9EhfHK488wsduNwrc7XIxcuJEbrvjDqejmRLERi01\npoibGhfH391u2gHtgZfcbr7+5BOnYxljxcCYwuSKiuJAtuX9vnXGOM0uExlTiFauXEmP2FgeTEnB\nI8L7LhdzlyzhiiuucDpavmVkZLB27VoAmjZtSpkyZRxOZKzPwJhiYMOGDXwaF4cEBXHngAFcfvnl\nTkfKt2PHjtG9Y0eObd8OQMwll/D9kiXExMQ4nKx0s2JgjClUAx98kOS4OCalpQFwf1gYUQMG8MZ7\n7zmcrHQLaAeyiESLSH0RqZn3aMaYkug/69fTOy0NAQTonZbGf9avdzqWyYfzFgMRiRKRISKyBlgF\nfALMFJE/ROQLEelYKCmNMUVSo5Yt+TI8nCwgE/gyPJxGLVs6Hcvkw3kvE4nIfOAzYKaqJmZbHwS0\nAu4AVqnqxwEJZ5eJjCnSUlJS6NOlC5vXrQOgfpMmfDN/PpGRkQ4nK92sz8AUutTUVEYNG8bKpUup\nVacOf3/jDapUqeJ0LFOIPB4PCQkJANSpU8eeHi4CAloMRORy4GIg5NQ6VZ2Zl4Od0d5A4F7AA6wH\nBqhq+hnbWDEo4m649lqCFi/m/pMnWRgSwowqVVj1++/2zdAYBwWsGIjIJKAl8BveD28AVdU785zS\n2141YCnQQFXTReQr4FtV/ecZ21kxKMISExOpXa0ah9LTOTWkWMeoKIZPmcK1117raDZjSrNA3k3U\nAWiuqrep6h2+V74KQTbBQKSIhOAdxnxvAdszhUxEUODUnFSKtxPRLhN4qSrvjB/PFbVr06JOHeI+\n+sjpSMacU27/1/4C1PPXQVV1LzAO2AXsAZJUdb6/2jeFo3z58vylZ09uiIhgCvBwaCjuSpXo0KGD\n09GKhI8nTWLCsGG8s307r2/dyouPPcbUKVOcjmX87PDhwwx54gluv/563n/3XTweT847FUEhOW8C\nwEfALyKyB0jDe0uxqmrz/BxURC4A+gC1gGPAVBG5VVU/P3PbkSNHnv45NjaW2NjY/BzS+ImqkpmZ\neXrIgX/861+89vLLfLFkCbXq1WPhSy8
RERHhcMqiYUpcHKPdbtr7lke53UyJi6Nvv34FbvvYsWM8\n9fDDLP/xR2pecgnj3n+funXrFrjdkiAxMZGEhARq1aoV8JsZkpOT6diiBVfv20fXjAzenTePLZs2\nMfbttwN63DPFx8cTHx9fsEZUNccXsAW4EagLXHrqlZt9z9FeX2BStuU7gHfOsp2aomPa119rhago\nDRbR1g0b6o4dO5yOVKTd0KWLTgJV32ss6F39+hW4XY/Ho93at9cBYWG6EnRcUJDWuPBCPXLkiB9S\nF28zv/lGL3S5tEV0tJYLD9dJEycG9HhfffWVditb9vTf8WHQsJAQzcjICOhxc+L77Mzb53KuNoJl\neW04h/Za472DKBzvWcY/gIfPsl1A/qBM3v3+++9aISJCfwHNBP17UJC2vOwyp2MVaT/99JNWcLl0\nFOhw0AqRkbpmzZoCt3v48GGNDg3VjGyFplt0tM6cOdMPqYuvEydOaHmXS5f5/kwSQCtEROj27dsD\ndszJkyfrDdmKQQpoaHCwpqWlBeyYuZGfYpDbPoOVIvJPEeknIr1PvQpwNrIcmAqsBtb6CsIH+W3P\nBN6yZcvoFhxMa7w9/0M9Htb/5z+43W6noxVZbdu2Zd6PP3L8scdIGziQxStW0LRp0wK3GxoaSobH\nwwnfsgJHVQkLCytw28XZnj17uCAoiDa+5UuBxqGhbNmyJWDH7NatG7+UKcPYoCAWA/0jIujXu3ex\nnLIzt7eWfnqW1aoFv6Mop+NqbvKZwJszZw5P9e3LyhMnCMV7WtchPJwktxuRPN3B5jhVZenSpezc\nuZNmzZrRqBhOvv74Aw+wbPJk7na7WRQezs66dVm0cmWx/BDyl5SUFGpWqsQst5u2eK9tt4uIYOWm\nTdSqVStgx92yZQtDH32Ufbt30+Gaaxg1ejTh4eEBO15u2BPIJmA8Hg/9+/ThP/HxXOHx8B3w+vvv\nc+vttzsdLc8e/7//47vPP6elCAuyshgzYQJ3DRjgdKw8UVU++vBDli9eTM06dRg4eLA96Ad89+23\n3HnzzVwUHMzO9HTGvvUW99x/v9OxCl0gHzr7CHhSVZN8y+WAMaoa0D9lKwZFi8fjYfbs2ezdu5c2\nbdrQpEkTpyPl2YoVK7ipc2fWpaQQBWwGWoWFcTApyfFvc8Y/jh49ytatW7nooouoXLmy03EckZ9i\nkNtbS5ufKgQAqnpURFrkKZ0p9oKCgujVq5fTMQpk7969NAwO5tREk/WBcBGOHDlCtWrVnIxm/KRc\nuXK0tJFT8yy3HchBInJ66iLfmYHNbWeKnWbNmvFLZia/4O14jQOiLrjAsW+QK1asoHv79rRp0IAR\nQ4eSmZnpSA5jcntm8Cbws28MIYCbgTGBiWRM4NSqVYuPv/ySnv37czItjRpVqvDN7NkEBwcXepYt\nW7bQs3NnxqSkUA947q23GHz0KG9OnFjoWYzJy6ilTYCrfYsLVHVdwFL995jWZ2ACwuPxcOLECaKi\nogJyN1RGRga7d++mQoUKREVFnXWbcePGsX3oUN7JyABgN3BF2bIcSk72ex5TugR02ktVXaeqb/pe\nAS8ExgRSUFAQ0dHRASkEa9asoU61anRq3JjqFSvywTnmAw4NDeV4tjOSY0BoSG5P1o3xL7u11Bg/\nUlUurVqVlw4c4FZgK9A+IoK5y5b9z91Xhw4domXDhvQ9epR6mZm87nLx4KhRPDF4sCPZTckR0DMD\nY0zOjh8/zsEjR7jVt3wpEBsczLp1/3syXbFiRX5eswZ58EGW33wzL8XFWSEwjrEzA+O41NRU3p0w\ngV0JCbTq0IHbbrut2D3VfIrH46FSTAzfnDhBeyAJaB4ZyeS5c2nXrp3T8UwpEbDnDESkD/AqUA3v\nOEKnhrCOznNKY7LJyMige8eOlNu4katOnuT1yZNZu3w5r731ltPR8iUoKIjJU6Zwfd++NAsJYVNG\nBrfde68VAlPk5fYJ5ATgBlVdH/hIfzqunRmUcPPnz2fIDTew8sQJgoAjQI2QEA4ePUrZsmWdjpdv\ne/bsYf369VSrVq1YPqltirdAPoF8oLALgcm9/fv389zgwexKSKB1p04Mf+GFYjOCpdvtplJQ0OnO\nqxggLDiYkydPFutiUL16dapXr+50DGNyLbdnBm8CFYEZeGc6A0BVZwYump0Z5MaJEydocdll9Nm/\nn9jMTCZGRBB+9dX8a9Ysp6PlSmJiIk3q1mVYUhKxqkwIDeW3Jk1YuHx5se03MMZpgRyozoawLqK+\n/fZbXuvfn3jfg0ongQplyrD74EEuuOACZ8Pl0qZNm3j8nnvYuXMnra+8kvEffkj58uWdjpVnHo+H\nadOmsW3bNpo1a0a3bt0K7djTp01jSlwcrqgoBj77LA0bNiy0Y5uiJz/FwG+zlwXihc10lqNvv/1W\nO0ZF/WmmJVdIiCYlJTkd7U9WrlypbS6/XKtER2ufa67R/fv3Ox3Jrzwej97Zt6+2iIzUQSEhWicy\nUkc9+2yhHPsfH3+sF7tcGgc6WkQrREbq77//XijHNkUT+Zjp7LxnBiLypKqOE5HXz1FIBuWp8uSR\nnRnkzO1206phQ67Zs4dOGRlMioigQo8eTP76a6ejnXbgwAGa1qvH2OPHiQXGh4Sw9LLL+Gnt2hJz\nKejXX3+lb6dO/JaSQgRwEKgTGsqOffsCfpbTvE4d3ti6lU6+5WEieAYN4tWxYwN6XFN0BaIDeavv\n1435i2QCzeVyEb98OS8MG8Y/ExLo2KkTQ5591ulYf/Lzzz/TEjg1Dc6YzEzKbd5MYmIiFSpUcDKa\n3xw5coSLQ0KI8C1XAi4oU4akpKSAF4OsrCyy3y4QrkqyjX5q8ui8xUBVZ/h+/ahw4pj8qFixIm9P\nmuR0jHOKjo7mD4+HLLzzJx8A0lVL1MxczZs3Z5MqXwHdgQ+DgnCVL0/NmjUDfux7HnmE+55/njFu\nNweAt10u5t51V8CPa0qW8w5HISLvichl5/i9CBG5U0T6ByaaKSk6depEtSuuoLvLxSggNjKSoU8/\nTURERI77FhcXXnghs374gZdr16Z6aCgzmjThu/h4Qgph4LnHBg3isbFjeaNVK2Z07syMuXO54oor\nAn5cU7Lk1GfQAngWqId3DvRDQDhQF6gA/AOYoKonAxLO+gxKjIyMDOLi4ti1Ywetr7yS3r17Ox3J\nmBIrkLeWRgOtgapAKrBJVQPej2DFwBhj8i5gxcApVgyMMSbvAjkchTHF1rFjx1i8eDFlypQhNjaW\n8PBwpyMZU+TYmYEp0Xbu3Els69ZcevIkblVOVq3KwuXLiYmJcTqaMQET8MltRKR4jH5mjM8zjzzC\nPYmJzD9+nB+Tk7lixw5effFFp2MZU+TkqhiISGsRWQ9s8S03FZG3A5rMGD/YuW0bsVlZgHcSjk7p\n6ezassXZUMYUQbk9M3gLuA5IBFDVtUDnQIUyJc+qVat48803+eyzz0hPTy+047bp2JEJ4eFkAPHA
\n88HB7Dt8mA0bNhRaBmOKg9wWgyBV3XnGuqyCHFhEYkRkiohsEpGNItKmIO2ZoutfX31Fjw4dSHjm\nGSY98ADdO3YkIyPjvPt4PB7efuMNrr/6au6//XZ27jzzn1/uvDRuHO527YgJCeE64G9ZWXT6+Wc6\nX3kla9asyVebxpRIuRnNDvga73MGq/COKPAEMCWvo+Kd0eY/gAG+n0OA6LNsU5CB+0wRUSUmRpf7\nRlXNAu0YGalffPHFefd5+okntLXLpVNAnw8K0hoXXqgHDx7Md4YeHTponC+Dgo4Dvfumm/LdnjFF\nGfkYtTS3ZwYPAoOAmniHlrnSty5ffA+xdVTVON8nfqaqHs9ve6boUlUST5ygkW85CGiYlcXhw4fP\nu8+E995jhttNX2CUx0OH1FRmzJiR7xxZGRlkHy6uPHAyJSXf7RlT0uS2GASr6i2qWsH3uqWAx70E\nOCwicSKySkQ+EJGSM1CNOU1EuKZdO54pU4YU4GfgaxGuuuqq8+6nqgRnWw72rcuvW+6/n6dcLuKB\nucAIl4ub77sv3+0ZU9Lk9qGz3SLyOXCfqp6a9nIu0LwAx20OPKyqK33Taj4DjDhzw5EjR57+OTY2\nltjY2Hwe0jjl02nTuPPGG7nw55+pGB3NxA8+OO8k8SLC/ffey18/+YSn3W7WBwWxIDSUsQUYz+ju\ne+4hPS2NIePHExQUxCvDh3P99dfnuz1jipL4+Hji4+ML1EZuxyZaDcQBdwL9VHW7iKxW1XwNjSgi\nlYGfVbW2b7kD8LSq/uWM7bQg3wZN8ZWVlcXro0ezYNYsKlatyogxY7j00kudjmVMsRDIgepWqWpz\nEbkKmAg8CfxdVfN7ZoCILALuV9X/iMgIwKWqT5+xjRUDY4zJo0A+gSwAqroY6AoMB+rnLd7/eAz4\nTETWAE2BlwvYnjEmH9asWUPf7t3p2ro1b44di8fjcTqScUBu+wxOX75R1T0iEgt0LMiB1fvgWquC\ntGGMk1atWsXUr74iLDyce+67j4suusjpSHmWkJBAt44dGXHiBLWB5zdu5FhSEiNeesnpaKaQ5TS5\nTX9V/UJEHjvb76vqWwFLhl0mMkXXggULuOUvf+FBt5ujwcFMKVuWn9es4eKLLw7ocZOSkhj6xBNs\nXLWK+o0b88r48QWaR/qVl19m/4gRjPfNmbwZ6FKuHH8cOeKnxMYJgRjCupzv14r5i2RMyfTiU0/x\njtvNTQBZWbiSk3nn9dcZ+1bgvh9lZWXRKzaWRps2MSo9nRm//07XFSv4ZcMGQkND89VmcEgIGfLf\nz4x0IDg4+Nw7mBLrvMVAVd/1/fpc4cQxpnhIOXGC6tmWa3g8rE9KCugxt2zZwt6EBJakpxMExGZk\n0Hj/ftatW0fLli3z1Wb//v1p9fe/UzUri9oeD393uXh08GD/BjfFQm5HLX1FRKJFJERE5ojIARG5\nNdDhjCmII0eOsGDBAtatW1egB9bO5obbbuNJl4t1wGJgjMvFDbcG9r9EcHAwGaqnBwXzAGkeT4G+\nyV900UUs/fVXdvXvzzfdu/PMhAkMGjLEL3lN8ZLbW0vXqGozEbkeuB7v0BQLVbVpQMNZn4HJpxUr\nVvCXLl2oJ8KOjAx6/PWvTPzkE0TydBn1nDweDy8+9xyfx8URFhrKkBde4PY77/RL2+eiqvS+5hqC\nli3jptRUZoaHc6hJE+b99JNd2jF/EsjnDDaoaiMR+QCYoarfnSoQ+Q2bq3BWDEw+NaxVi5G7dtEP\nSAHaRUby4uef07sATzEXBWlpabz2yitsWLGC+k2b8vTw4bhcLqdjmSImkHMgzxaRDXiHrX5YRCoA\naTnsY4xjEvbsoZfv50i819cTEhKcjOQXYWFhDM82RIsx/pKrPgNVfQq4GmihqhnASeDGQAYzpiCa\n1a9PnO+S0EHg2zJlaNo0oFc1jSnWcnWZyCl2mcjkJD09nY0bNxIWFsZll112uk9g8+bN9OrcGU9y\nMokZGQx+6imes7mPTSkRsD4Dp1gxKFxHjx5l0aJFlClThmuuuYbw8HCnI53XwYMH6da+PWn79+P2\neGjSpg1ff//96XvuMzIy2LFjB+XKlSvQg1nGFDcBKQbi/apVRVX3FSRcflgxKDzbtm0jtk0bLk9L\nIxlIrVqVhcuXExMT43S0c7rjxhupMmsWYzIyyARuiIig0/PP89QzzzgdzRhHBWSgOt+n8bx8pzLF\nwpCHHuLhI0f4PjmZpcnJNN2xg9deLtpjB25av56+GRkIUAa4PjWV31atcjqWMcVSbkctXSMi+Zq7\noDTzeDyMGvUyNWpcTu3azfjnPyc7Hemc/tixg46+0SoF6JCezu6tW50NlYPLGjfmX2XKoHiHUZge\nEcHlzfM9qroxpVpuby29AlghIlvx3rYteE8a7H/eeYwZ8wZjxkzD7Z4MJPPgg3dw4YXl6NWrV477\nFrYrO3Xi7R07aJmWRirwkcvFHZ07Ox3rvMZNnMi1HTrQYN8+3B4Pzdu14/FBg5yOZUyxlNuHzs46\nxZSqBvSrY3HvM2jYsB2//fYyEOtb8x633LKSL774yMFUZ5eSksKt11/P/Ph4PMC9d93FWx98QFBQ\nbk8ec3b06FEG9OvH94sWUb5sWcZNmED/Ag7hkJGRwaZNmwgNDaV+/fp+e8LYmOIsYJPb+D70w/FO\nbNMVCA90ISgJypaNBP7b7y6yj+joyELNkJWVxeDBw7jggmpceGFNXnvtjbNuFxkZyTfz5rEvMZHE\nY8d458MP/VoIAO695RYqL1nC4cxMZiYlMej++1m+fHmB2ixTpgxNmjShQYMGVgiMKYDcnhk8AjwE\nzPCt6gNMODWqaaAU9zODhQsX0qvXTaSmPkJQ0HHKlv2MX39dSp06dQrl+LNmzWLUqDGsWbOTzMzp\nQCguV1/ef/95br+98McZjA4PZ0daGuV9y4NCQqjy978zxAZGM8avAjnt5f8BrVV1mKoOA9oAf8tr\nwNKmc+fOLFnyPU8+mcozz0SyZs3PhVYI3nprAjff/DgrV15PZmYX4BagBm73M0yd+l2hZDhThZgY\nNvh+VmBjaKjd/3+GtLQ0m3bSOCK3Zwbr8Q5Fke5bDgNWqmrjgIYr5mcGToqJqcLx4wuAy31r/gpc\nS1DQLgYMSOLDD98p9EwzZ87kvltuoZ/Hw+aQEFLr1OGHZcuK/MNtheHIkSP0792b+GXLKBMczMuv\nvMJj1hlu8imQo5YOAfoDX/tW3QB8oapj85wyD6wY5F94eDRpaduAU9+87yc4eD3R0X+wevVP1KpV\ny5Fc69atY+HChZQvX56bbrqJsLAwR3IUNf169qTi/Pm8lZHBH8DVLhcfzJhB165dnY5miqGADkch\nIq2BDr7FJaq6Io/58syKQf7deuu9zJhxiNTUl4DfKFPmbwwa9Dcee+wxqlWr5nQ8v9i8eTMHDhyg\nUaNGlC9f/vT67du388Izz3B43z669OnDowMH+r0z3N+
qxsSw4vhxaviWnxdBhg9n1AsvOJrLFE+B\nHMIa4HfgxKl9RKSJqq7Ly8FM4fn44wlERw/l22/7c+GFF/Luu9/Rrl07p2P5hary6KNP8fHHnxEa\nWhvVrXz//XTatm3LgQMH6NCiBX87dozeHg+jf/2VfXv28Orrrzsd+7yqVKp0uhh4gJUREfQuIUXb\nFA+5vUw0Am8n8na8fX/gfejsqgBmszMDc1Y//PADffo8RErKciAG+IaqVZ9k794E3n//fZYMHMjk\n1FQA9gKXhYdzzLdcVC1dupQbunenmwg7gJD69Zn74492Gc3kSyDPDG4FaquqTWhjHLdlyxY8nqvw\nFgKA69i//0aysrJQVbL/D/A9Kl/4IfOoQ4cOrNiwgUWLFhETE0PPnj1Pj75qTGHIbTHYCERhs5uZ\nIqBRo0aIvArsB6oAn1GzZgOCg4O5/vrreWHoUP6elkZjj4dXXS4euPdehxPnzsUXX8zFF1/sdAxT\nSuX2MlELvA+crSNbQVDVgM52ZpeJzs7j8TB9+nR27txJixYt6NSpk9ORCt2oUa/wyiujCQ2tTFjY\nSX744d80adIEgK1bt/LiM89wyNeB/PiTTxb5DmRj/CmQt5ZuAD4G1uPt3wJAVX/Ia8i8sGLwv1SV\nG2+8nXkmXMZWAAAYaklEQVTzNpOR0Z6QkG94/vnHePrp0ndP+sGDBzl06BCXXnqpPatgTDaBLAYr\nVLVVvpPlU0kvBseOHWPy5MkcP36cHj160KxZsxz3+fnnn+na9S5SUtbhHS5qN6GhDThy5ACRkYU7\n7pExpmgKZAfyYhF5EZjJny8TFejWUhEJAlYCu1W1d0HaKm6SkpJo2rQthw41Jj29Ji+91I2pUz+h\nR48e593v8OHDBAdfircQAFQnODiSY8eOWTEwxuRbbotBa9+vsdnWKVDQW0sfB34DogvYTrHz4Ycf\ncuBAC9LSvBPeuN1defjhIWzbdv5i0KpVKzyeX4F/A50JCnqXypUrUqVKlcCHNsaUWLkdwrrjWV4F\nKgQiUgPoCXxYkHaKorVr19K9e1/atOnGuHHjz3prY2JiEmlp2Qetq8vx40k5tl2lShVmz55G9eqD\nCQmpQOPGM1mw4N/WQWqMKZBcnRmIyLCzrVfVgkyS+wbwFP+9WbxESEhIoEOHrpw4MQK4hA0bnicp\n6Rgvvvj8n7br2fNa3nrrFtzu7kBNwsMH06vX+c8KTunQoQO7d2/2f3hjTKmV28tEWdl+Dgd64X32\nIF9EpBdwQFXXiEgscM6OjpEjR57+OTY2ltjY2PwetlB89dW/SE29FXgYALe7Nu++2+1/ikHHjh35\n4IPXePLJ20hJOU7v3n2YOPHsE88YY8z5xMfHEx8fX6A2cj1Q3Z92EgkHvlfV2HwdVORl4HYgE4jA\n+0DbNFW984ztit3dRKNHj+a553aRkTHBt2YdFSr04dCh7Y7mMsaUHoGc3OZMYXB6gMU8802SU1NV\na+OddWXBmYWguOrfvz9hYVOAUcDnQG+aNbs8h72MMcZZ5y0GInJqhNLVIrLK91oLbAEKf3aUYqBm\nzZpERUUDi/E+tD2IH3/cwI8//pjntpKTk5k3bx6LFy8mIyPD31GNMea0nPoMlgPNgb7Z1mUC+/01\naJ2qLgIW+aOtoiAzM5P9+7cDm4Fg39p1rF+/nvbt2+e6nZ07d9K27TWkpFTF4zlO3brRLF06B5fL\nFYjYxphSLqfLRAKgqluzvXba6KXnFhISQqVKtYBvfWuSCApaTL169fLUzt/+NpiDB+/m+PElnDix\nmk2bqjFmzDi/5zXGGMj5zKCiiJxz0BtVLdozhjhk+vTP6N79BkRGk56+jQEDbufqq6/OUxsJCdvJ\nyhrsWwri5MkubNr0s//DGmMMOReDYKAs57n10/yvtm3bsn37b2zYsIHKlStTv379PLfRpk1z/vjj\nQ9LSWgKpuFyTad++n//DGmMMOdxaKiKrVLV5IeY58/hF+tbSCRMm8sor48nKyuShh+5h+PBnEPFP\n3UxKSqJbtxtYv34DHk86f/1rXz799AOCg4Nz3jmbKVOmEhc3hagoF8OHD6Jx48Z+yWeMKbr8Pmqp\niKxW1SsKnCyfinIx+PLLr7j33uG43Z8BYbhcd/PSS/cwcOCjfjuGqrJv3z5CQ0OpUKFCnvf/6KM4\nHnvsRdzukYgcxOUazcqVS9iwYSNvvPERQUHC0KEP07NnT79lNsY4LxDFoLyqHilwsnwqysXgL3+5\nlVmzugOnHo/4nhYtXmPlyoBO8ZAndeo0Z+vWN4BTk9/cT4sWa9m4cRcnT04AMoiIGMiMGZ/QrVs3\nB5Ma419Llixh27ZtNGnShCuucOz7rGP8/tCZk4WgqLvggrKI/JFtzR/ExEQ5ludssrKygFPz6L4B\nzGDVqnKcPKnADuAWUlNf4q234pyKaIzfPfXoowzo0YN5jzxCrw4deO/tt52OVCzkaziKwlKUzww2\nb95Mq1ZX4XbfiscTjsv1IQsWfEvr1q1z3rmQvPnm2zz77Hu43UPwjha+Ee+D43uAxnhnMf03ffr8\nyIwZkx1Maox/rF+/nh5XXskGt5sLgO1A07Awdh88SHR06RkpP5CT25gz1K9fn7Vrl/HJJ/8kMzOL\n/v3jadiwodOx/uTxxx8hIiKCt9+eyO+/VyYr69QIItV9r7dwuf7B4MHT8tx2YmIiIkL58uUB+Hzy\nZKZ/+imRMTEMfv55GjVq5Lf3YUxu7du3j/plynCBb/kSoFxICIcPHy5VxSBfVLXIvrzxTEElJSVp\nVFQlhTkKqjBPg4OjtGfPm3Tp0qV5ais1NVV79PirhoZGa2holF533U369ptval2XSz8DfU1EK5Yt\nq//5z38C9G6MObd9+/ZphchIjQf1gH4KWrNCBU1PT3c6WqHyfXbm7fM2rzsU5suKgf/Ex8drTExl\nDQsrp9HRlXThwoX5amfIkOEaEdFH4aRCqkZE9NIKURfqMm+VUQV9KihIhw8d6t83YEwuzZ07VyvH\nxGhYcLDWrV5d16xZ43SkQpefYmCXiUqJTp06kZi4hyNHjlC+fPk8P69wypIlK0lNfRjvwLWQmnov\nErqc7K2VUcXj8RQ8tDH50LVrV/YdPYrb7bZ5wfPA5kosRYKDg6lYsWK+CwFA3bq1KFNmgW9JCQ1d\nSOOmDbjb5WImMBH4wOXi1jvu8EdkY/JFRKwQ5JHdTWTy5ODBg7RuHcuRI+UBDxUqJPPLLwuYPnWq\ntwM5OpqnX3yRVq1aOR3VmFLL7w+dOa20FoPZs2dz//0DSUo6TOfOXZg8+X1iYorOVNFut5slS5Yg\nInTs2JGIiIgc91m9ejUbNmygXr16tGnTphBSGlN6WTEIsNTUVIYOHcmiRb9wySUXMX78y1x00UV+\nPcbGjRtp1SqW1NTPgcaEhg4jNjaJOXPyfvtnUTF27HhGjBhDUFAnPJ6fePTRu3j11VFOxzKmxLJi\nEGA9evyVRY
sgNfUhgoMXU7HiZDZvXu3X+5ffeusthgzZRFrae741JwgJqUB6eqrfBsErTIcPH6ZG\njTqkpa0HLgIOExHRkLVrl1K3bl2n4xlTIhXmHMilzrFjx/jhhzm+b+zXkJU1ipSUS4iPj/frcS64\n4AJCQrYCp4pgAi7XBcWyEAAcOHCA0NAqeAsBQAVCQ+uwb98+J2MZY85gxSCXgoKCUPUAp+YiViCN\nkBD/3p3br18/atY8SlBQN+ApoDMnThzn+edfxJ9nSYcOHWLp0qXs3LnTb22eTe3atQkJSQZOXeb6\ngaysLVx++eUBPa4xJm+sGORSVFQU/fr1x+XqDXxOaOhDVKx4jM6dO/v1OBEREbRu3RI4BLiAODye\nSxg9ehIff/wPvxxj9uzZXHzxZVx33VM0aNCCMWPe8Eu7ZxMREcGcOTOoWHEgISGRxMTcxsyZX1Gh\nQgVSU1NZsGABCxcuJC3NZlI1xknWZ5AHmZmZjBs3nkWLlnPppRcxatSw02Pz+NMllzRlx4444NS8\nQu8A39CrVwVmzfqiQG2np6dTvnw1UlJmAu2APbhcLVm+fH5Ax1ZSVZKTk4mKikJEOHjwIJ3btCEq\nMZEsIKtqVX5Ytoxy5coFLIMxpYX1GQRYSEgITz/9JN999xVvvz02IIUAoGrVKsAK35ICvwJpVKly\nYYHbPnjwIB5PGbyFAKA6ISEtSEhIKHDb5yMiREdHn+77GP7kk3Tfs4dlycksT06m1Y4dvDh8eEAz\nGGPOzYpBEfTee2OIjBwO9AY6IfId5cptZcSIpwvcduXKlSlTxgPM963ZRkbGCho0aFDgtvNi++bN\ndM3w9r8I0DU9ne2//16oGYwx/2XFoAhq2rQpmzevYezYTtx1V23GjXuG339f7ZdnGsqUKcM333xF\nVNStREVdTlhYc8aNe5H69ev7IXnuNW/Xjo/Cw8kATgL/iIigefv2hZrBGPNf1mdQSp04cYLt27dT\ntWrVfM2vXFBut5ubr7uOn376CQ/QtUsXJk+bRmhoaI77GmPOzx46M8WKqnLgwAFEhMqVKzsdx5gS\nw4qBMcYYu5vIGGNM/jhSDESkhogsEJGNIrJeRB5zIocxxhgvp84MMoFBqtoQaAs8LCKFe2+jKVSZ\nmZk8//TTNKtdm05XXMGCBQty3skYU2iKRJ+BiMwA3lbVH85Yb30GJcTTTzzBL5Mm8ZrbzU7gQZeL\neT/+SLNmzZyOZkyJk58+A8fnQBaRi4FmwC/OJjEej4fp06ezcuVKWrduTZ8+fQgK8s/J45eTJzPP\n7aYe0Ar4NTWVGdOmWTEwpohwtBiISFlgKvC4qp5wMktpl5GRQcuW7Vm3bhPQFZhKp06TWLBgll8K\nQkRYGInZlhNDQrjU5qg1pshwrBiISAjeQvCpqn5zru1Gjhx5+ufY2FhiY2MDnq0o2rt3L48/PoyE\nhB106NCK0aNH4XK5/Nb+pEmTfIVgPtAGSGfJkmbMmjWL3r17F7j9oS+9xM2PPMIgt5udwcHMjo5m\nxV13FbhdYwzEx8cXeG4Vx/oMROSfwGFVHXSebazPAO/Twg0aNOfAgb5kZl5NePgHtG9/knnzvvHb\npDcPPfQE7733Dt7BIU59R7iNiROv4oEHHsh3u4cOHWLgwGfZtCmBqlXLU+3CSC6sVIlHnniC6tWr\n+yO6MeYMxeY5AxFpD9wGXC0iq0VklYh0dyJLcbB06VKSk6uQmfky0IWTJyezZMliEhMTc9w3t1q3\nboZIeeAVvCOlbiAo6DuuvPLKfLd58uRJ2rS5mn/9K5JVq4byww9hbEzYx8tjxlghMKaIceQykar+\nCAQ7ceziKDg4GNV0vB/SAmSimkVwsP/+CO+8805mzZrPtGljUX0BkWDGjx9H06ZN893m8uXLOXw4\nlIyM1wHh5MnOrF5dg127dlGrVi2/ZTfGFJzjdxOZnHXs2JEqVdJIS3uA9PTORER8TM+e1/t1Ipig\noCCmTp3Mvn37SExMpEGDBgWe0tNbrDKzrcnyexEzxvhHkXjO4Fysz+C/jh49yogRL7Nly046dmzJ\nkCGD/D7/sr+lp6fTvHlHEhIuJy2tBxERk7nqqhBmz/7ab30dOVm0aBHvvBOHiDBo0AMFuuxlTHFh\nA9UZv0lPT2f37t1UrFiRqKiofLdz/PhxnnvuJX77bStt2zbj2WeHEBYW5sek5zZ//nx6976N1NQR\nQBYRES8wf/43tGvXLsd9jSnOrBgYv1i9ejVdu/bm5MkgMjOPMn786zzwwH1Ox8qzzp37EB/fF7jD\nt+Zd+vT5kRkzPnMyljEBV2zuJjJFl6rSvfuNJCa+RkrKTtLSVjFw4HA2btzodLQ8y8zMAsKzrYkg\nIyPLqTjGFGlWDMyfJCUlkZSUCNziW1OHkJCrWLt2rZOx8uWxx+7G5RoMzACm4nIN55FH7EE3Y86m\naPdAmkIXExNDaGgo6ek/Ae2Ao2RlLefSS590Olqe9evXF4/Hw+uvv4uIMGzYe/To0cPpWMYUSdZn\nYP7H7Nmz6dfvTkJCmpKRsYkHHriL119/2elYxphcsg5k4zd79+5l/fr1VK9enUaNGjkdxxiTB1YM\njDHG2N1Exhhj8seKgTHGGCsGxhhjrBgYY4zBioExxhisGBhjjMGKgTHGGKwYGGOMwYqBMcYYrBgY\nY4zBioExxhisGBhjjMGKgTHGGKwYGGOMwYqBMcYYrBgYY4zBioExxhisGBhjjMHBYiAi3UXkdxH5\nj4g87VQOY4wxDhUDEQkC3gGuBRoC/UWkgRNZnBQfH+90hIAqye+vJL83sPdXGjl1ZtAa2KKqO1U1\nA/gS6ONQFseU9H+QJfn9leT3Bvb+SiOnikF14I9sy7t964wxxjjAOpCNMcYgqlr4BxW5Ehipqt19\ny88Aqqqjz9iu8MMZY0wJoKqSl+2dKgbBwGbgGmAfsBzor6qbCj2MMcYYQpw4qKpmicgjwFy8l6o+\nskJgjDHOceTMwBhjTNFSJDuQS/IDaSJSQ0QWiMhGEVkvIo85nSkQRCRIRFaJyEyns/ibiMSIyBQR\n2eT7e2zjdCZ/EpGBIrJBRNaJyGciEup0poIQkY9E5ICIrMu2rpyIzBWRzSIyR0RinMyYX+d4b2N8\n/zbXiMjXIhKdm7aKXDEoBQ+kZQKDVLUh0BZ4uIS9v1MeB35zOkSAjAe+U9XLgKZAibnEKSLVgEeB\n5qraBO+l5FucTVVgcXg/T7J7BpivqvWBBcDQQk/lH2d7b3OBhqraDNhCLt9bkSsGlPAH0lR1v6qu\n8f18Au8HSYl6xkJEagA9gQ+dzuJvvm9ZHVU1DkBVM1X1uMOx/C0YiBSREMAF7HU4T4Go6lLg6Bmr\n+wCf+H7+BLi+UEP5ydnem6rOV1WPb3EZUCM3bRXFYlBqHkgTkYuBZsAvzibxuzeAp4CS2CF1CXBY\nROJ8l8E+EJEIp0P5i6ruBcYBu4A9QJKqznc2VUBUUtUD4P2CBlRyOE+
g3APMzs2GRbEYlAoiUhaY\nCjzuO0MoEUSkF3DAd/YjvldJEgI0ByaoanPAjfeSQ4kgIhfg/dZcC6gGlBWRW51NVShK3BcXEXkW\nyFDVz3OzfVEsBnuAmtmWa/jWlRi+0++pwKeq+o3TefysPdBbRLYBXwCdReSfDmfyp93AH6q60rc8\nFW9xKCm6ANtU9YiqZgHTgHYOZwqEAyJSGUBEqgAHHc7jVyJyN95Ltbku5EWxGKwA6ohILd9dDLcA\nJe2OlI+B31R1vNNB/E1Vh6lqTVWtjffvboGq3ul0Ln/xXVr4Q0Tq+VZdQ8nqKN8FXCki4SIieN9f\nSeggP/MsdSZwt+/nu4Di/KXsT+9NRLrjvUzbW1XTctuIIw+dnU9JfyBNRNoDtwHrRWQ13tPTYar6\nvbPJTB48BnwmImWAbcAAh/P4jaouF5GpwGogw/frB86mKhgR+RyIBS4UkV3ACOBVYIqI3APsBG5y\nLmH+neO9DQNCgXnees4yVX0ox7bsoTNjjDFF8TKRMcaYQmbFwBhjjBUDY4wxVgyMMcZgxcAYYwxW\nDIwxxmDFwBQBIvKsb8jktb7xflrlo41aIrI+j/vEiciNeT1WQYjID76hSBCRpYV43F4iMqqwjmeK\nHysGxlG++bB7As1UtSne4RD+OP9e51SkH5oRkZ7AmlNjUalqhwAc46z/p1X1W+A6EQn39zFNyWDF\nwDitKnBYVTMBfGPi7AcQkVYi8qNvko5lIhLpOwNYLCIrfa8rz2zQN7HOGBH5xbfv/dl+7x3fxB9z\nOcdIlSJyn4gsF5HVvklswn3ra4vIz74zmBdFJDnbPoN9+6wRkRHneK+3kW3Yg1P7i0gnEVmYbcKc\nT8+SqbaI/Jptuc6pZRHZLiKvishKoK+IPCreSXfW+J5QPSUeuO4c2UwpZ8XAOG0uUFO8M9tNEJGr\nAHxDPXwJPOqbpKMLkAocALqoaku8Yx+9fZY278U79HIbvPNj/J+viNwA1PVNSnMX5x6A7WtVba2q\nVwC/+9oD76Q2b/jOYHbjOxMRka6+dlsDVwAtReRs3/rbA79mW85+JtMM7zAXlwOXisifsqnqNiBJ\nRJr4Vg0APsq2yWFVbamq/8I7imoz35/b37Jt8yvQ8Rzv2ZRyVgyMo1Q1Be+on/8HHAK+FJE7gfrA\nXlVd5dvuhG/CjlDgQ980f1OAy87SbDfgTt/YT78A5YG6wFV4R1JFVffhneHqbBr7zj7W4R31saFv\nfVu8o5QCZP/G3Q3oKiKrgFW+7HXP0m453/s9m+Wquk+948OsAS4+yzYfAQN8l4JuPiPDV9l+Xgt8\nLiK3AVnZ1h/EOyy1Mf+jyA1UZ0of3wfgYmCxrxP4TrwfqmebC2EgsF9Vm4hIMN6zhTMJ3jOKeX9a\n6Z1rITf+gXfExw0ichfQ6VTUM46R/edXVHVSDu1mnuf3so8umcXZ/29+jXcgsoXASlVNyvZ72YtM\nL7yFrzfwrIg08hXScM7+52WMnRkYZ4lIPRGpk21VM7yjSG4GqohIC992ZX0f/jHAPt+2d+KdovFM\nc4CHfPNGICJ1RcSFt+Dc7OtTqAp0PkesssB+36Wq27KtXwb09f2cfV7gOcA9IhLpO141Eal4lnY3\ni0jt7G//HMc/K99wxHOA9/DOffs/fMNO11TVRXgvF0X73g9APWBDXo5pSg87MzBOKwu8LSIxeL85\nJwD/p6oZInIz8I54p5V04+03eBf42ncp6Xv+/I34lA/xXmZZ5ftwPAhcr6rTReRqYCPecft/Okem\n54Dlvv1+AaJ86wcCk0VkGN4P5WMAqjpPRBoAP/uGDE4Gbsd72Su7b/EWoG2+5XPd/XS+u6I+wztf\n79xzbB/syxiNt9iMzzZHc2dK0Kxsxr9sCGtjcklEIlQ11ffzzcAtqnpDHvavAnyiqtcWIMOTQLSq\nnuuOpXPtVwn4TFW75vfYpmSzMwNjcq+FiLyD9xv3UbyTjeeaqu4XkUkiUjY/816LyDSgNnB1XvfF\nO5Xsk/nYz5QSdmZgjDHGOpCNMcZYMTDGGIMVA2OMMVgxMMYYgxUDY4wxWDEwxhgD/D+hOCwdBNkA\ncwAAAABJRU5ErkJggg==\n",
|
|
"text/plain": [
|
|
"<matplotlib.figure.Figure at 0x2b60b65af98>"
|
|
]
|
|
},
|
|
"metadata": {},
|
|
"output_type": "display_data"
|
|
}
|
|
],
|
|
"source": [
|
|
"# Plot the data \n",
|
|
"import matplotlib.pyplot as plt\n",
|
|
"%matplotlib inline\n",
|
|
"\n",
|
|
"# given this is a 2 class \n",
|
|
"colors = ['r' if l == 0 else 'b' for l in labels[:,0]]\n",
|
|
"\n",
|
|
"plt.scatter(features[:,0], features[:,1], c=colors)\n",
|
|
"plt.xlabel(\"Scaled age (in yrs)\")\n",
|
|
"plt.ylabel(\"Tumor size (in cm)\")\n",
|
|
"plt.show()"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"<a id='#Model Creation'></a>\n",
|
|
"## Model Creation\n",
|
|
"\n",
|
|
"Our feed forward network will be relatively simple with 2 hidden layers (`num_hidden_layers`) with each layer having 50 hidden nodes (`hidden_layers_dim`)."
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 10,
|
|
"metadata": {},
|
|
"outputs": [
|
|
{
|
|
"data": {
|
|
"text/html": [
|
|
"<img src=\"http://cntk.ai/jup/feedforward_network.jpg\" width=\"200\" height=\"200\"/>"
|
|
],
|
|
"text/plain": [
|
|
"<IPython.core.display.Image object>"
|
|
]
|
|
},
|
|
"execution_count": 10,
|
|
"metadata": {},
|
|
"output_type": "execute_result"
|
|
}
|
|
],
|
|
"source": [
|
|
"# Figure 3\n",
|
|
"Image(url=\"http://cntk.ai/jup/feedforward_network.jpg\", width=200, height=200)"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"The number of green nodes (refer to picture above) in each hidden layer is set to 50 in the example and the number of hidden layers (refer to the number of layers of green nodes) is 2. Fill in the following values:\n",
|
|
"- num_hidden_layers\n",
|
|
"- hidden_layers_dim\n",
|
|
"\n",
|
|
"Note: In this illustration, we have not shown the bias node (introduced in the logistic regression tutorial). Each hidden layer would have a bias node."
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 11,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"num_hidden_layers = 2\n",
|
|
"hidden_layers_dim = 50"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"Network input and output: \n",
|
|
"- **input** variable (a key CNTK concept): \n",
|
|
">An **input** variable is a container in which we fill different observations (data point or sample, equivalent to a blue/red dot in our example) during model learning (a.k.a.training) and model evaluation (a.k.a. testing). Thus, the shape of the `input` must match the shape of the data that will be provided. For example, when data are images each of height 10 pixels and width 5 pixels, the input feature dimension will be two (representing image height and width). Similarly, in our examples the dimensions are age and tumor size, thus `input_dim` = 2). More on data and their dimensions to appear in separate tutorials.\n",
|
|
"\n",
|
|
"\n",
|
|
"**Question** What is the input dimension of your chosen model? This is fundamental to our understanding of variables in a network or model representation in CNTK.\n"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 12,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"# The input variable (representing 1 observation, in our example of age and size) x, which \n",
|
|
"# in this case has a dimension of 2. \n",
|
|
"#\n",
|
|
"# The label variable has a dimensionality equal to the number of output classes in our case 2. \n",
|
|
"\n",
|
|
"input = C.input_variable(input_dim)\n",
|
|
"label = C.input_variable(num_output_classes)"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"## Feed forward network setup\n",
|
|
"Let us define the feedforward network one step at a time. The first layer takes an input feature vector ($\\bf{x}$) with dimensions `input_dim`, say $m$, and emits the output a.k.a. *evidence* (first hidden layer $\\bf{z_1}$ with dimension `hidden_layer_dim`, say $n$). Each feature in the input layer is connected with a node in the output layer by the weight which is represented by a matrix $\\bf{W}$ with dimensions ($m \\times n$). The first step is to compute the evidence for the entire feature set. Note: we use **bold** notations to denote matrix / vectors: \n",
|
|
"\n",
|
|
"$$\\bf{z_1} = \\bf{W} \\cdot \\bf{x} + \\bf{b}$$ \n",
|
|
"\n",
|
|
"where $\\bf{b}$ is a bias vector of dimension $n$. \n",
|
|
"\n",
|
|
"In the `linear_layer` function, we perform two operations:\n",
|
|
"0. multiply the weights ($\\bf{W}$) with the features ($\\bf{x}$) and add individual features' contribution,\n",
|
|
"1. add the bias term $\\bf{b}$."
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 13,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"def linear_layer(input_var, output_dim):\n",
|
|
" input_dim = input_var.shape[0]\n",
|
|
" \n",
|
|
" weight = C.parameter(shape=(input_dim, output_dim))\n",
|
|
" bias = C.parameter(shape=(output_dim))\n",
|
|
"\n",
|
|
" return bias + C.times(input_var, weight)"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"The next step is to convert the *evidence* (the output of the linear layer) through a non-linear function a.k.a. *activation functions* of your choice that would squash the evidence to activations using a choice of functions ([found here][]). **Sigmoid** or **Tanh** are historically popular. We will use **sigmoid** function in this tutorial. The output of the sigmoid function often is the input to the next layer or the output of the final layer. \n",
|
|
"[found here]: https://docs.microsoft.com/en-us/cognitive-toolkit/Brainscript-Activation-Functions \n",
|
|
"\n",
|
|
"**Question**: Try different activation functions by passing different them to `nonlinearity` value and get familiarized with using them."
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 14,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"def dense_layer(input_var, output_dim, nonlinearity):\n",
|
|
" l = linear_layer(input_var, output_dim)\n",
|
|
" \n",
|
|
" return nonlinearity(l)"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"Now that we have created one hidden layer, we need to iterate through the layers to create a fully connected classifier. Output of the first layer $\\bf{h_1}$ becomes the input to the next layer.\n",
|
|
"\n",
|
|
"In this example we have only 2 layers, hence one could conceivably write the code as:\n",
|
|
"\n",
|
|
" h1 = dense_layer(input_var, hidden_layer_dim, sigmoid)\n",
|
|
" h2 = dense_layer(h1, hidden_layer_dim, sigmoid)\n",
|
|
"\n",
|
|
"\n",
|
|
"To be more agile when experimenting with the number of layers, we prefer to write it as follows:\n",
|
|
"\n",
|
|
" h = dense_layer(input_var, hidden_layer_dim, sigmoid)\n",
|
|
" for i in range(1, num_hidden_layers):\n",
|
|
" h = dense_layer(h, hidden_layer_dim, sigmoid)"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 15,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"# Define a multilayer feedforward classification model\n",
|
|
"def fully_connected_classifier_net(input_var, num_output_classes, hidden_layer_dim, \n",
|
|
" num_hidden_layers, nonlinearity):\n",
|
|
" \n",
|
|
" h = dense_layer(input_var, hidden_layer_dim, nonlinearity)\n",
|
|
" for i in range(1, num_hidden_layers):\n",
|
|
" h = dense_layer(h, hidden_layer_dim, nonlinearity)\n",
|
|
" \n",
|
|
" return linear_layer(h, num_output_classes)"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"The network output `z` will be used to represent the output of a network across."
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 16,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"# Create the fully connected classfier\n",
|
|
"z = fully_connected_classifier_net(input, num_output_classes, hidden_layers_dim, \n",
|
|
" num_hidden_layers, C.sigmoid)"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"While the aforementioned network helps us better understand how to implement a network using CNTK primitives, it is much more convenient and faster to use the [layers library](https://www.cntk.ai/pythondocs/layerref.html). It provides predefined commonly used “layers” (lego like blocks), which simplifies the design of networks that consist of standard layers layered on top of each other. For instance, ``dense_layer`` is already easily accessible through the [`Dense`](https://www.cntk.ai/pythondocs/layerref.html#dense) layer function to compose our deep model. We can pass the input variable (`input`) to this model to get the network output. \n",
|
|
"\n",
|
|
"**Suggested task**: Please go through the model defined above and the output of the `create_model` function and convince yourself that the implementation below encapsulates the code above."
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 17,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"def create_model(features):\n",
|
|
" with C.layers.default_options(init=C.layers.glorot_uniform(), activation=C.sigmoid):\n",
|
|
" h = features\n",
|
|
" for _ in range(num_hidden_layers):\n",
|
|
" h = C.layers.Dense(hidden_layers_dim)(h)\n",
|
|
" last_layer = C.layers.Dense(num_output_classes, activation = None)\n",
|
|
" \n",
|
|
" return last_layer(h)\n",
|
|
" \n",
|
|
"z = create_model(input)"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"### Learning model parameters\n",
|
|
"\n",
|
|
"Now that the network is setup, we would like to learn the parameters $\\bf W$ and $\\bf b$ for each of the layers in our network. To do so we convert, the computed evidence ($\\bf z_{final~layer}$) into a set of predicted probabilities ($\\textbf p$) using a `softmax` function.\n",
|
|
"\n",
|
|
"$$ \\textbf{p} = \\mathrm{softmax}(\\bf{z_{final~layer}})$$ \n",
|
|
"\n",
|
|
"One can see the `softmax` function as an activation function that maps the accumulated evidences to a probability distribution over the classes (Details of the [softmax function][]). Other choices of activation function can be [found here][].\n",
|
|
"\n",
|
|
"[softmax function]: https://www.cntk.ai/pythondocs/cntk.ops.html#cntk.ops.softmax\n",
|
|
"\n",
|
|
"[found here]: https://docs.microsoft.com/en-us/cognitive-toolkit/Brainscript-Activation-Functions"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"## Training\n",
|
|
"\n",
|
|
"If you have already gone through CNTK101, please skip this section and jump to the section titled,\n",
|
|
"<a href='#Run the trainer'>Run the trainer'</a>.\n",
|
|
"\n",
|
|
"The output of the `softmax` is a probability of observations belonging to the respective classes. For training the classifier, we need to determine what behavior the model needs to mimic. In other words, we want the generated probabilities to be as close as possible to the observed labels. This function is called the *cost* or *loss* function and shows what is the difference between the learnt model vs. that generated by the training set.\n",
|
|
"\n",
|
|
"$$ H(p) = - \\sum_{j=1}^C y_j \\log (p_j) $$ \n",
|
|
"\n",
|
|
"where $p$ is our predicted probability from `softmax` function and $y$ represents the label. This label provided with the data for training is also called the ground-truth label. In the two-class example, the `label` variable has dimensions of two (equal to the `num_output_classes` or $C$). Generally speaking, if the task in hand requires classification into $C$ different classes, the label variable will have $C$ elements with 0 everywhere except for the class represented by the data point where it will be 1. Understanding the [details][] of this cross-entropy function is highly recommended.\n",
|
|
"\n",
|
|
"[`cross-entropy`]: http://cntk.ai/pythondocs/cntk.ops.html#cntk.ops.cross_entropy_with_softmax\n",
|
|
"[details]: http://colah.github.io/posts/2015-09-Visual-Information/"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 18,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"loss = C.cross_entropy_with_softmax(z, label)"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"#### Evaluation\n",
|
|
"\n",
|
|
"In order to evaluate the classification, one can compare the output of the network which for each observation emits a vector of evidences (can be converted into probabilities using `softmax` functions) with dimension equal to number of classes."
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 19,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"eval_error = C.classification_error(z, label)"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"### Configure training\n",
|
|
"\n",
|
|
"The trainer strives to reduce the `loss` function by different optimization approaches, [Stochastic Gradient Descent][] (`sgd`) being one of the most popular one. Typically, one would start with random initialization of the model parameters. The `sgd` optimizer would calculate the `loss` or error between the predicted label against the corresponding ground-truth label and using [gradient-decent][] generate a new set model parameters in a single iteration. \n",
|
|
"\n",
|
|
"The aforementioned model parameter update using a single observation at a time is attractive since it does not require the entire data set (all observation) to be loaded in memory and also requires gradient computation over fewer datapoints, thus allowing for training on large data sets. However, the updates generated using a single observation sample at a time can vary wildly between iterations. An intermediate ground is to load a small set of observations and use an average of the `loss` or error from that set to update the model parameters. This subset is called a *minibatch*.\n",
|
|
"\n",
|
|
"With minibatches we often sample observation from the larger training dataset. We repeat the process of model parameters update using different combination of training samples and over a period of time minimize the `loss` (and the error). When the incremental error rates are no longer changing significantly or after a preset number of maximum minibatches to train, we claim that our model is trained.\n",
|
|
"\n",
|
|
"One of the key parameter for optimization is called the `learning_rate`. For now, we can think of it as a scaling factor that modulates how much we change the parameters in any iteration. We will be covering more details in later tutorial. \n",
|
|
"With this information, we are ready to create our trainer.\n",
|
|
"\n",
|
|
"[optimization]: https://en.wikipedia.org/wiki/Category:Convex_optimization\n",
|
|
"[Stochastic Gradient Descent]: https://en.wikipedia.org/wiki/Stochastic_gradient_descent\n",
|
|
"[gradient-decent]: http://www.statisticsviews.com/details/feature/5722691/Getting-to-the-Bottom-of-Regression-with-Gradient-Descent.html"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 20,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"# Instantiate the trainer object to drive the model training\n",
|
|
"learning_rate = 0.5\n",
|
|
"lr_schedule = C.learning_rate_schedule(learning_rate, C.UnitType.minibatch) \n",
|
|
"learner = C.sgd(z.parameters, lr_schedule)\n",
|
|
"trainer = C.Trainer(z, (loss, eval_error), [learner])"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"First lets create some helper functions that will be needed to visualize different functions associated with training."
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 21,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"# Define a utility function to compute the moving average sum.\n",
|
|
"# A more efficient implementation is possible with np.cumsum() function\n",
|
|
"def moving_average(a, w=10): \n",
|
|
" if len(a) < w: \n",
|
|
" return a[:] # Need to send a copy of the array\n",
|
|
" return [val if idx < w else sum(a[(idx-w):idx])/w for idx, val in enumerate(a)]\n",
|
|
"\n",
|
|
"\n",
|
|
"# Defines a utility that prints the training progress\n",
|
|
"def print_training_progress(trainer, mb, frequency, verbose=1): \n",
|
|
" training_loss = \"NA\"\n",
|
|
" eval_error = \"NA\"\n",
|
|
"\n",
|
|
" if mb%frequency == 0:\n",
|
|
" training_loss = trainer.previous_minibatch_loss_average\n",
|
|
" eval_error = trainer.previous_minibatch_evaluation_average\n",
|
|
" if verbose: \n",
|
|
" print (\"Minibatch: {}, Train Loss: {}, Train Error: {}\".format(mb, training_loss, eval_error))\n",
|
|
" \n",
|
|
" return mb, training_loss, eval_error"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"<a id='#Run the trainer'></a>\n",
|
|
"### Run the trainer\n",
|
|
"\n",
|
|
"We are now ready to train our fully connected neural net. We want to decide what data we need to feed into the training engine.\n",
|
|
"\n",
|
|
"In this example, each iteration of the optimizer will work on 25 samples (25 dots w.r.t. the plot above) a.k.a. `minibatch_size`. We would like to train on say 20000 observations. Note: In real world case, we would be given a certain amount of labeled data (in the context of this example, observation (age, size) and what they mean (benign / malignant)). We would use a large number of observations for training say 70% and set aside the remainder for evaluation of the trained model.\n",
|
|
"\n",
|
|
"With these parameters we can proceed with training our simple feed forward network."
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 22,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"# Initialize the parameters for the trainer\n",
|
|
"minibatch_size = 25\n",
|
|
"num_samples = 20000\n",
|
|
"num_minibatches_to_train = num_samples / minibatch_size"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 23,
|
|
"metadata": {
|
|
"collapsed": true
|
|
},
|
|
"outputs": [],
|
|
"source": [
|
|
"# Run the trainer and perform model training\n",
|
|
"training_progress_output_freq = 20\n",
|
|
"\n",
|
|
"plotdata = {\"batchsize\":[], \"loss\":[], \"error\":[]}\n",
|
|
"\n",
|
|
"for i in range(0, int(num_minibatches_to_train)):\n",
|
|
" features, labels = generate_random_data_sample(minibatch_size, input_dim, num_output_classes)\n",
|
|
" \n",
|
|
" # Specify the input variables mapping in the model to actual minibatch data for training\n",
|
|
" trainer.train_minibatch({input : features, label : labels})\n",
|
|
" batchsize, loss, error = print_training_progress(trainer, i, \n",
|
|
" training_progress_output_freq, verbose=0)\n",
|
|
" \n",
|
|
" if not (loss == \"NA\" or error ==\"NA\"):\n",
|
|
" plotdata[\"batchsize\"].append(batchsize)\n",
|
|
" plotdata[\"loss\"].append(loss)\n",
|
|
" plotdata[\"error\"].append(error)"
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "markdown",
|
|
"metadata": {},
|
|
"source": [
|
|
"Let us plot the errors over the different training minibatches. Note that as we iterate the training loss decreases though we do see some intermediate bumps. The bumps indicate that during that iteration the model came across observations that it predicted incorrectly. This can happen with observations that are novel during model training.\n",
|
|
"\n",
|
|
"One way to smoothen the bumps is by increasing the minibatch size. One could conceptually use the entire data set in every iteration. This would ensure the loss keeps consistently decreasing over iterations. However, this approach requires the gradient computations over all data points in the dataset and repeat those after locally updating the model parameters for a large number of iterations. For this toy example it is not a big deal. However with real world example, making multiple passes over the entire data set for each iteration of parameter update becomes computationally prohibitive. \n",
|
|
"\n",
|
|
"Hence, we use smaller minibatches and using `sgd` enables us to have a great scalability while being performant for large data sets. There are advanced variants of the optimizer unique to CNTK that enable harnessing computational efficiency for real world data sets and will be introduced in advanced tutorials. "
|
|
]
|
|
},
|
|
{
|
|
"cell_type": "code",
|
|
"execution_count": 24,
|
|
"metadata": {},
|
|
"outputs": [
|
|
{
|
|
"data": {
|
|
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYkAAACfCAYAAAAMJSWPAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJzt3XecVPW5x/HPF0SUSFMIiICoiMYWYwSMkoAoBhPlWqJA\nLIglakw0eG2gV2NMVWO5ek0kdo0FMQKaKGLBLipFioIozYCgNEtAkeW5f/zOyDDM7M7szpk5s/u8\nX699MXPOb855ZnaZ55xflZnhnHPOZdOo3AE455xLLk8SzjnncvIk4ZxzLidPEs4553LyJOGccy4n\nTxLOOedy8iTRwEn6i6RLCy0rqbekD+KN7uvzzpfUtxTnqg+i382MYpetRRwvSjo5jmO70tmi3AG4\neEhaALQHOpjZyrTtU4FvA13MbJGZnZ3vMbOUrdUgG0k7AvOBLcxsQ22OUV9I6gU8QfgsGwHNgM8B\nRdv2MLN/F3JMM3se2LvYZV3D5HcS9ZcRvogHpzZI2gvYmlp+uRdR6gtQsZ9Iahz3OerCzF4ys+Zm\n1gLYk/C5tExty0wQipQlWNcgeZKo3+4FhqQ9HwLcnV5A0p2SfhM97i3pA0nnS1omabGkU7KV3bhJ\nwyV9LGmepJ+m7fiRpCmSPpG0UNIVaa97Pvp3taRPJfWMXnOGpLejbTMl7Zv2mu9IekvSKkkPSNoy\n2xuWNETSS5Kuk/QxcIWkKyTdm1ZmR0kbJDWKnj8n6TfR6z6V9KSkbXMc/21JP0p73ljSR5L2ldRU\n0r2SlkdxTpLUNttxarBJEoiqbX4j6RXCXUYnSaelfVZzJZ2WVv4QSfPTnn8gaZik6VFcf5fUpNCy\n0f7hkj6Myp0efY6da3xDweWSFkhaKukOSc2jfVtH50l9bq+lPv/ofc6P3ud7ko6vxefp6sCTRP32\nGtBc0m7RF+JA4D6qv4JvDzQHOgCnA/8nqWU1ZbeNyp4CjJS0a7Tvc+AkM2sJ/Bg4S9KAaN8Pon9b\nRFfLkyQdB1wOnBhdVQ8AVqSd6zjgMGAnQnXZKdW8h57Ae0A74HfRtsy7p8zngwlJtC3QFLggx7Hv\nB36a9rw/8LGZTYte3wLYgfC5nAWsrSbOQpxIeM8tgMXAUuDw6LM6A7gpulNMyXx/xwGHADsD+wMn\nFVpW0hHAOUBvoBvQN8trczmD8Ln9ANiF8PncEO0bSrjD7RBt/znwRZRE/gwcEr3Pg4DpeZ7PFYkn\nifovdTfRD3gHWFJD+XXAVWZWZWZPEL7sd8tR1oD/MbOvzOwF4J/A8QBm9oKZzYoezwQeJHy5pEtP\nVqcBV5vZlOg188wsvWH8RjNbZmargceA9LuMTIvN7BYz22BmX9bwflPuNLP3o/Kjqjn+A8AASVtF\nzwdH2wC+ArYDulkw1cw+z/P8NbnDzN6Nfi9VZvZPM1sIYGYTgWeA71fz+uvN7GMzWwU8TvWfX66y\nxwG3R3GsBa4sIP6fAtdG7WD/AUawMdl+BbRh4+c2xczWRPs2AHtLahr9/mcXcE5XBJ4k6r/7CP8Z\nTwHuyaP8iozG5DXANjnKrjKzL9KeLyRcDSKpp6Rno6qY1cCZhC+CXDoB71ezf1meMQHUptfV0nyO\nb2bvA28DR0ramnDHc3+0+15gPPCgpH9L+qOK1yayyXuSdERULbNC0irCRUB1n28hn1+ush0y4viA\n/NuVOhD+PlIWAk2j6ri7gKeBUVE11u8lNTKzzwhJ+BfAUknj0u5UXYl4kqjnzGwRoQH7cOAfRT58\n6+iLMqUzG+9U/g6MAXYws1bArWz8QslWRfEBoRqiGDKP/x9Cr6GU7et4/AcJife/gFlmNg/AzNab\n2VVmtidwIHAkUKwuoF+/p+gu5mFCVVpbM2sNTCD+jgAfAh3Tnncm/+qmJcCOac93BL6M7li+MrPf\nmNkeQC/gGOAEADMbb2b9CFWb7xP+jlwJeZJoGE4F+kZVBMUk4EpJTSR9n9D2MCratw3hTuMrST3Y\ntB7/Y0I1QnpSuA24QNJ+AJJ2kdSpSHFOA34gqVPUvnJJHY/3IKF95Gw23kUgqY+kvaL2n88J1SiF\ndvHN54u+KdAEWA5Y1FZwSIHnqY1RwGmSuklqBlxWwGsfAM6POg00B35L9NlJOljSnpJE2ucmqX10\nx7Q1sJ6Q7KuK+YZczTxJ1F9fX+GZ2fxUXX/mvkKOk8WHwCrCVeK9wJlmNjfa93PgKkmfEL5MHkqL\nZy3hKvhlSSsl9TCz0dG2+yV9CjxKaMQsNN7N34DZ09H5pwNvENo0NilS4PGWAq8CB5D2vghXu6OB\nT4BZwHOEzyU1EPGWfA5f0zYz+wQYRrhTW0G48s58TzUds+CyZvY48BfgBWAO8FK0K1e7T/qx/kb4\nrF4kdCr4BPhVtK8D4S73E2AG8BQhgTQGLiT8fX0MfI/QcO5KSHEvOiSpP6EXQyNCo9efMva3INSb\ndyb8UfzZzO6KNSjnXJ1Fvakmm1nTcsfi4hNrkohuu98l3AovIVzFDUrvoSBpOKEr5HBJbQhXKO3M\nbH1sgTnnakXSUYRebM0JHSH+Y2YDyxuVi1Pc1U09gLlmttDMviLU5f5XRhkj/MER/bvCE4RziXUO\noS3kXULPp1+UNxwXt7jnbtqBTbvM/ZuQONLdDIyTtITQ2OlXJc4lVNTTyDUgSZjg74fAVDPrK2kX\nYIKkfTIHIUkq93xDzjlXkcys1t2j465uWkxokE7pGG1LN5So/340UGk+sHu2g5lZ4n+uuOKKssfg\ncXqclRqjx1n8n7qKO0m8AXSN+kZvCQwCxmWUWQgcCiCpHWFOmHkxx+Wccy4PsVY3mVmVpF8Q+j2n\nusC+I+nMsNtGEgbV3CUpNXHXRZa2/oFzzrnyib1NwsyeJGOCODO7Ne3xh4R2iRpt2ACNEj78r0+f\nPuUOIS8eZ3FVQpyVECN4nEkT+2C6YpFkzz1nNJDfi3POFYUkLMEN10V1Tz5zmMZozhxYsKC8MTjn\nXClVVJJ49FFYs6bmcnG54w544IGayznnXH1RUUnigANgzJjynHv2bGjRAv5d0JL0zjlX2SoqSZxx\nBixbVnO5OJx+Oixd6knCOdewVFTDdbliraqCVq3gkUdgxAh4882yhOGccwVrUA3X5fLuu9CuHey1\nl99JOOcaFk8SeZg8GfbbLySKnj3DeA3nnGsIYk8SkvpLmi3pXUkXZ9l/gaSpkqZImiFpvaRWccdV\niMmT4bvfhcaNYezY5A/oc865Yon16y5adOhmwojqPYHBkjaZvM/MrjWz75jZfsBwYKKZrY4zrkK1\nbQu9e5c7CuecK70kLDqUbjBhwfRqffEFDBoE60u0NNGIEaH7rXPONTRxJ4lsiw7tkK2gpK2B/sAj\nNR10q63CyOenny5GiM4553JJwqJDKUcCL1VX1fTrX//668c9e/bhnnv60L9//IE551ylmDhxIhMn\nTiza8WIdJyHpAODXZtY/en4JYYrwP2U
p+w9glJk9mONYm4yTWL4cunaFRYvCSOhSWbo0jL72iQad\nc5Ug6eMk8ll0CEktgd7A2HwP3KYNHHxwGOBWSnPnwqWXlvaczjlXLrEmCTOrAlKLDs0CHkwtOiTp\nZ2lFjwLGm9naQo5/8snxzuW0YQPccMOm4yI6dvQBdc65hqOip+VYvx7MoEmTeM45dy7067fp9OBf\nfgnNm4ceVj5ewjmXdEmvborVFlvElyBg40jrdE2bhnmcPvoovvM651xSVHSSiFtqpHUmr3JyzjUU\nniSqkStJDBwYxmo451x9V9FtEnEyg9atQ7tE27YlO61zzhVVg26TSDGD224LjcnFsm4dXHmlJwjn\nXMNWb+4kjj02NDL7GAbnnNuorncS9SZJzJ8P++8Pb70VGpadc855ddPXdtoJzj4bLrqo3JE451z9\nUfZFh6IyfaKFh2ZKeq625xo+HF58EV56qfbx5mPDBvjrX0NbiHPO1WdxT/DXCHgXOARYQpjLaZCZ\nzU4r0xJ4BTjMzBZLamNmy7McK6/eTfffDxMnwsiRRXoTObRsGUZit24d73mcc64ukl7dlM+iQz8F\nHjGzxQDZEkQhBg+GW2+tyxHgmWfg5purL+MD6pxzDUESFh3qBmwr6TlJb0g6qS4nlMJPXTz9NKxY\nUX2ZHXbwJOGcq/+SsOjQFsB+QF/gG8Crkl41s/cyC6YvOtSnTx/6xLSow+TJcO651Zfp2BEWL47l\n9M45V2v1btGhqDF7KzO7Mnp+G/CEmT2ScaySjLg2C2tVzJwJ22+fu9zll4dZYNPylnPOJU7S2yTy\nWXRoLNBLUmNJzYCewDvFCmDlysJ6IS1aFGZ6rS5BAPTtC9/+dmGxLF8e76y1zjlXbGVfdCjq6TQe\nmA68Bow0s7eLFcMRR8C4zdbCyy3b9ODZ9OkDRx9dWCzTp4c1MD77rLDXOedcudSbEde5TJgAZ50F\ns2blN3PrihXw8cew++61CLIGd90FQ4fC1Kmw777FP75zzmVKenVT2fXrB3vvDdddl1/57baLJ0EA\nnHIKHHUUvLdZk7xzziVTvb+TAJg3D7p3T8a8Tu++GxLRdtuVNw7nXMPgE/zl6eqr4b77YNo0X5va\nOddweJIowLJl0K5dkQICRo0KbQvduhXvmM45V0wlaZOQtIukptHjPpLOldSqtictl2ImCICxY2HS\npOIe0znnkiTfipdHgCpJXYGRQCfg/tiiKiEz+PLL8Pi44+D55/N/bSFTc8yfD59/Xnh8zjlXTvkm\niQ1mth44GrjJzC4EahhuVhmefDKMi5g8GV54AXbeOf/XFjI1x1lnhdlpnXOukuSbJL6SNBgYAjwe\nbasXY4f794fLLoPDDw93FYX0fipkJtjp0zeO0D7vPHjiicJjdc65Uss3SQwFvgf8zszmS9oJuDef\nF9a06JCk3pJWS5oS/VyWf/h1J4XpxadNg7vvLmwG2Xyrmz76KFRppRJQo0bwdtHGlDvnXHzymgU2\nmibjXABJrYHm6ZP05RItOnQzaYsOSRqbvuhQ5AUzG1BQ5EXWoUP4KcSuu8KQITWXe+utcBeRSkBd\nu4YJBJ1zLuny7d00UVILSdsCU4C/ScpnDHM+iw4B1HEFiPLYdlv45S9rLpdKEildu/qoa+dcZci3\nuqmlmX0KHAPcY2Y9gUPzeF0+iw4BfE/SNEn/lLRHnjFVjC23hN69Nz73JOGcqxT5Ljq0haTtgeOB\nS4scw2Sgs5mtkXQ4MIawWt1mSrXoULFlLmDUuTMsWRLaKZo2LU9Mzrn6qSyLDkk6Dvgf4GUzO1vS\nzsA1ZnZsDa+rcdGhLK+ZD3zXzFZmbC/JokOl8uGH0L593Zdadc656iR6Wg5JjYE5hIbrD4HXgcFm\n9k5amXZmtix63AMYZWZdshyrXiWJcps5E/70pzB9eePG5Y7GOReXUk3L0VHSo5I+in4ekVTjiIJ8\nFh0CfiJppqSpwA3AwFq+l7J49VV4+OFyR1G43XYL3XcvLXbloXOuXsm3umkCYRqO1NiIE4ETzKxf\njLFlxpDIO4kHHoAxY+Chh8odSeGWL4cePeCqq+CEE8odjXMuDqVadKitmd1pZuujn7uAtrU9aX1S\n06jrMWNg1arSxVOTESPCFCQAbdqEpV2HDYPXXy9vXM65ZMo3SayQdKKkxtHPicCKOAOrFNXN32QG\np50GX3yRe38pmcHtt4fkkLLXXnDbbTBw4MaJDp1zLiXfJHEqofvrUkID9E+AU2KKqaJ06BB6Km3Y\nsPm+xYtDo3D79pvv++oraN06/Fsqc+eGLrc77rjp9gED4MUXvTuuc25zeSWJaMT0ADNra2bfNLOj\ngGq7vzYUTZtCy5ZhfqZMmdNxpGvSBFq1gkWL4o8x5cUX4fvfz76v3Mu6OueSqS4LeZ5ftCgq3HXX\nZb8Kz5yOI1OpR15XlySccy6buiQJHwYWOfHEUHWUqZKTxIYNpW8zcc4lT12ShH+F1KB3bzjooNz7\nS50kxo+Hb30rv7ITJoRZbocPhylTPGE411BVO05C0mdkTwYCtjazfOd+qrOkjpOoizFj4L77YPTo\nckeyObPQVXb06I2DBY87DoYODQPxnHOVIdHTckBYdIgwkroRcHuueZskdQdeAQaa2T+y7K93SaKq\nKixAlPT5m8xg6tSQLA48EI48stwROefylegkES069C5piw4BgzIXHYrKTQDWAnc0lCThnHNxK9WI\n69rKd9GhXwKjgSwdSZNv+XK48MJyR1E+n34K8+eXOwrnXBziThI1LjokqQNwlJn9hQrtMdW0Kdxy\nS3Ibd9esyT7Yr1gmTYLu3eH660MVmnOu/ihZw3M1bgAuTnueM1EkddGh5s1hiy1g9erQFdYMzj8f\nrrkmbC+3K68MA/5GjIjn+P36hdlwf/azMOHhbbfBPvvEcy7nXPXKsuhQrQ+ex6JDkualHgJtgP8A\nPzOzcRnHSnSbxJ57hplg99oLFiwIXV9zzemUbv16+Oyz7OMsiuXAA+G3v4W+feM7B4TkeMcdodvs\nGWeEcya9Ud65+i7pbRJvAF0l7ShpS2AQsMmXv5ntHP3sRGiX+HlmgqgE6bPB1jSILt2YMWESwLis\nXRviOeCA+M6RIoX38tZb4fPwBOFc5Yu1MsTMqiSlFh1KdYF9R9KZYbeNzHxJnPHEaYcdapckdtkl\n3gF1kybB3ntDs2bxnSPT9tvD2WeX7nzOufjEXmNuZk8Cu2VsuzVH2VPjjicuZ58d6v0hJImf/CS/\n1+2yC7z/fqiqiePK+4UXfL4m51ztxV3d1GB07w7duoXHhdxJtGgB22wTphuPw+rVcPDB8Ry7UHPm\nwMqV5Y7COVcITxIx+P3vNyaMfMQ5h9N118GPfhTPsQs1dizsvz9Mm1buSJxz+fIkEYPjjy+s6+t3\nv5usJU7jctFFIYH26wf33FPuaJxz+Yh97qZiSXoXWJe/WbPg6KNDl9yrrw5Vbs65eCS9C6xzm9lz
\nT3jjjbB069NPlzsa51x1/E6iiM49F/77vzdfQ9o558rF7yQSZPp0mDev5nKlsGQJPPNMuaNwzlU6\nTxJFtG5d/FNf5Ovxx+Huu8sdRe2MHBlGoif8xtG5BiH2JCGpv6TZkt6VdHGW/QMkvSVpqqTXJVWz\n4GeyfVSHic7nzy/uGIJC1rNOmm7dwmSERxwBr7ziycK5cir7okOSmpnZmujx3sAoM9tsJeZKaJN4\n6KFQzTNsWOGvPeEE6N8fTjqpOLF06QJPPgm7716c45XaunVh+vVbbw3Tj59ySpg40OeDcq4wdW2T\niHtajq8XHQKQlFp06OskkUoQkW2AGFc+iNfAgbV/bTEH1C1aFNaQqOS1qLfcEn71KzjvPHj99TAV\nuScI50qv7IsOAUg6StI7wGNAxc7fVBfFTBKpqqb68KUqQc+eIWFks2pV6ErrnItHApbEATMbA4yR\n1Av4LdAvW7mkLjpUDMWcDbZLFzjzzOIcK+luuSWsiPfjH8NRR8Fhh8E3vpHfa83C0rMtWoTVBZ2r\nD+rdokNZXvM+0N3MVmZsT3ybRF189BF861uwYkW5I6k8H3wA48aFHlGTJoUJDa+9FnbdddNyGzbA\nX/8aRnzPnBn+raoKSWXhQmjcuDzx19asWWF98e99r9yRuCRL+jiJGhcdkrRL2uP9gC0zE0RD0LZt\nmMNp7dpyR1J5OnWCc86BCRPCl/3xx8O2225erlGjcLfWrRtcfnlIFCtXwuzZ2RPE+vXxrg1eGxs2\nwL/+Fe6YDj00xJ7p7bdDz7C5c/M75sqVcPPNYQXDCy9M9t/g8uWha/dbb3mvt1KJfcS1pP7AjWxc\ndOiP6YsOSboIOBlYB6wFLjCzV7Mcp17fSbjkefTRkHx++MOwHO1BB4XOAI3yvLRauxamTAlL2qbW\nGqmtdevC2uE33hjufIYNC8kwWzXZV1/B//4v/OEPodpxxIjcVXB33RXaew4/PPSwu+8+mDo1xJ1v\ntV1dLFgAZ50VHpuFn3XroE0bGD168/ILF4b3M2lS6JzRv3+IvV8/aNUq/ngrUV3vJHxajnpg2bLw\nZeH/SYpv7twwv9TLL4efTz8Na3dnW3nvvfdCp4HXXw9fYnPmwB57wO23wz77bF5+2DBo0iQkng4d\nwlXysmXh2Jlf0FVVIWGdcAL06pVfp4QlS+CCC8JYk+uuC5MqZr5u6dLQkyz9zquQ9VDyNXNm6Jyx\n1Vabbv/sM3jppY1xSeEz2X77UP1anffegyeeCF29d9klJEa3OU8SDdycOeFK6sorizfGwuW2ZEn4\nwu7UafN9l10WpmXp2TP87Lvv5l+K6R57DGbMCL/DDz8MVY7t24fjtG5dvJgnTgxX3+PGhSv0Ulq/\nPsz0e8MNYRaAHj1Ke37nSaJBe/llOPbYUK0wdGi5o3H13bp14a4jX7Nnw5AhoffYHXdkT6xxGzIk\n3HkNGVJY7PVJ0huuXUweeSRUH9x9tycIF78FC0KD/223hTaLNWtylzULdw6pL+fx48uTIADOOCP8\nX+naNTTOf/FFeeKoZJ4kEqSqKlQJ1OTZZ8NI5PHjQ6Oqc3Hr0gXuvDO0z5x8Mmy3XWgH+N3vspf/\n9FN47TX4+c/zb+iPQ69eoc1i9Gh46inYeefQRuTy59VNCWIG22wTGhObN89drqoqNHB26FC62JxL\nt349vP9++FvcY49yR5O/adPCtDUDBpQ3jrlzQ8P74YfHfy5vk6hn9t4bjjwyJIo33wx3Fl26lDsq\n51wxvPoqXHNN6AU3fDicf/7mZR54INy17bYbdOwYujR/+WUYx1KbpOJtEvXMSSeF0dc9eoQ/lB02\nm+nKOVdsGzaEUfrLlxf/2GYwdmyo+jrhhLDmzIIF2RMEhP3nnReqxpYvD+NtmjQp39QxfifhnGvw\nPv88jCkZNQpOPTU8btcu9OjK9uX8l7+EOcM6dQrLFXfuHH4OPDD79Py//GWYdPOYY2CLEs+Yl/g7\niTwWHfpptOjQW5JeitaUqFjFnFgrTh5ncVVCnJUQI5Qnzm22CfN6TZ8eEsPuu4dtl1ySvfygQXDZ\nZRO5+OKQGKqq4PnnQ5tHNjfdFEbIlzpBFEOsSSJadOhm4IfAnsBgSZl5dh7wAzP7NmEG2L/FGVPc\n/D9icXmcxVMJMUJ54+zYMYzcXrgwtAtef332cq1bw7x5EznsMDj9dLjqqtAdfdCg0sZbCklYdOi1\ntPKvkWW9CeecK6W6zrVVnyRi0aE0pwNPxBqRc865vMW9nsSxwA/N7GfR8xOBHmZ2bpayBxOqpnqZ\n2aos+73V2jnnaiHJa1wvBjqnPe8YbduEpH2AkUD/bAkC6vYmnXPO1U4SFh3qDDwCnGRm78ccj3PO\nuQLEeidhZlWSfgE8xcZFh95JX3QI+B9gW+AWSQK+MjOfUNg55xKgYgbTOeecK72KmJajpgF5JY7l\ndknLJE1P29Za0lOS5kgaL6ll2r7hkuZKekfSYSWKsaOkZyXNkjRD0rkJjbOppEmSpkZxXpHEOKPz\nNpI0RdK4pMYYnXtBNDB1qqTXkxirpJaSHo7OOUtSzwTG2C36DKdE/34i6dykxRmdd5ikmZKmS/q7\npC2LGqeZJfqHkMjeA3YEmgDTgN3LGE8vYF9getq2PwEXRY8vBv4YPd4DmEqo1usSvQ+VIMb2wL7R\n422AOcDuSYszOnez6N/GhHEyPRIa5zDgPmBcEn/naXHOA1pnbEtUrMBdwNDo8RZAy6TFmBFvI2AJ\n0ClpcQIdot/5ltHzh4AhxYyzZB90HT6EA4An0p5fAlxc5ph2ZNMkMRtoFz1uD8zOFithDEjPMsQ7\nBjg0yXECzYA3ge5Ji5PQK28C0IeNSSJRMaadbz6wXca2xMQKtADez7I9MTFmie0w4MUkxklIEguB\n1tEX/7hi/1+vhOqmQgfklcM3zWwZgJktBb4Zbc+MfTEljl1SF8Kdz2uEP5pExRlV40wFlgITzOyN\nBMZ5PXAhkN6Al7QYUwyYIOkNSadH25IU607Ackl3RlU5IyU1S1iMmQYC90ePExWnmS0B/gwsis75\niZk9Xcw4KyFJVKJE9AaQtA0wGjjPzD5n87jKHqeZbTCz7xCu1ntI2pMExSnpx8AyM5sGVDdWp+yf\nZeQgM9sP+BFwjqTvk6DPk3C1ux/wf1Gc/yFc3SYpxq9JagIMAB6ONiUqTkmtCFMd7Ui4q/iGpBOy\nxFXrOCshSeQ1IK/MlklqByCpPfBRtH0xoR4zpWSxS9qCkCDuNbOxSY0zxcw+BSYC/UlWnAcBAyTN\nAx4A+kq6F1iaoBi/ZmYfRv9+TKhm7EGyPs9/Ax+Y2ZvR80cISSNJMaY7HJhsZqmVJpIW56HAPDNb\naWZVwKPAgcWMsxKSRI0D8spAbHpVOQ44JXo8BBibtn1Q1NtgJ6Ar8HqJYrwDeNvMbkxqnJLapHpd\nSNoa6Ae8k6Q4zWyEmXU2s50Jf3vPmtlJwGNJiTFFUrP
o7hFJ3yDUpc8gWZ/nMuADSd2iTYcAs5IU\nY4bBhIuDlKTFuQg4QNJWkkT4PN8uapylbACqQ+NMf0IPnbnAJWWO5X5CT4cvo1/QUEKj0dNRjE8B\nrdLKDyf0IHgHOKxEMR4EVBF6gk0FpkSf4bYJi3PvKLZpwHTg0mh7ouJMO3dvNjZcJy5GQn1/6nc+\nI/V/JWmxAt8mXPxNA/5B6N2UqBij8zYDPgaap21LYpxXROecDtxN6AVatDh9MJ1zzrmcKqG6yTnn\nXJl4knDOOZeTJwnnnHM5eZJwzjmXkycJ55xzOXmScM45l5MnCZc4kjZIuifteWNJH2vjNN1HSrqo\nhmNsL2lU9HiIpJsKjGF4HmXulHRMIcctJknPSdqvXOd3DYMnCZdE/wH2ktQ0et6PtEnJzOwxM7u6\nugOY2Ydmdnz6pgJjGFFg+YoiqXG5Y3CVwZOES6p/AT+OHm8yNUL6nUF0NX+jpJclvZe6so+mcZmR\ndrzO0ZX3HEmXpx3r0WjG1BmpWVMl/QHYOpql9N5o28nauJjP3WnH7Z157nRRHG9Hs53OlPRkKvml\n3wlI2k7S/LT392i0aMw8SecoLCwzRdIr0aRuKSdHMU2X1D16fTOFxbFekzRZ0pFpxx0r6RnCaFzn\nauRJwiWwSakPAAACR0lEQVSRAQ8Cg6Mv1H2ASVnKpLQ3s4OAIwmLrWQr0x04mjAlxHFp1TRDzax7\ntP88Sa3NbDiwxsz2M7OTJO1BuLPoY2HG2vPyOHe6rsBNZrYX8AlwbDXvO2VP4CjCBH2/Az63MGvq\na8DJaeW2jmI6hzBfF8ClwDNmdgDQF7g2mhsL4DvAMWZ2cI4YnNuEJwmXSGY2k7By1mDgn1Q/TfeY\n6DXvsHHe/EwTzGy1mX1BmC+oV7T9V5KmEb58OwK7RtvTz9cXeNjMVkXnWV3gueebWequZnL0vmry\nnJmtsTD76Grg8Wj7jIzXPxCd/0WguaQWhIn9LlFYp2MisCUbZ1KeYGaf5HF+54Awt7tzSTUOuIaw\nIlybasp9mfY4VzLZbH59Sb0JCaCnmX0p6TlgqwJjzOfc6WWq0s6xno0XapnnTX+NpT3fwKb/b7Ot\nGyDgWDObm75D0gGE9h7n8uZ3Ei6JUl+2dwBXmtmsWrw2Uz9JraJql6OAlwmzj66KEsTuhKVyU9al\nNe4+S6ii2hZAUusCz51r+wJg/+jxcTnK1GRgFFMvwqpknwHjgXO/Prm0by2P7ZwnCZdIBmBmi83s\n5nzKVvM85XVCNdM0QtXRFOBJoImkWcDvgVfTyo8EZki618zejvY/H1Xh/LnAc+fafi1wtqTJhKmd\nc6nuuF9ImgLcApwabb+K8L6mS5oJ/KaaYztXLZ8q3DnnXE5+J+Gccy4nTxLOOedy8iThnHMuJ08S\nzjnncvIk4ZxzLidPEs4553LyJOGccy6n/wel65OR/E4ZuAAAAABJRU5ErkJggg==\n",
|
|
"text/plain": [
|
|
"<matplotlib.figure.Figure at 0x2b60b6db358>"
|
|
]
|
|
},
|
|
"metadata": {},
|
|
"output_type": "display_data"
|
|
},
|
|
{
|
|
"data": {
|
|
"image/png": "iVBORw0KGgoAAAANSUhEUgAAAYkAAACfCAYAAAAMJSWPAAAABHNCSVQICAgIfAhkiAAAAAlwSFlz\nAAALEgAACxIB0t1+/AAAIABJREFUeJztnXnYVGX5xz9fCQVFwCU0ENFQNPdM0FIRTcwll/JHiuaC\nWaYWZrm2oVlWRqZpZhoaaopbKVQqZBKooQgiiCyyuSCuwCuIC/Levz+eZ3gPwyxn5p2Zd164P9d1\nrjnLc+7nnjNnzn2e7fvIzHAcx3GcXGzQ0g44juM49YsHCcdxHCcvHiQcx3GcvHiQcBzHcfLiQcJx\nHMfJiwcJx3EcJy8eJOoUSX+U9KNS00o6SNIr1fVudb7zJR1Si7xaG5IaJX261uc2l2TepdyDOews\nk7RdJX1zWgYPEjVG0gJJH0jaPGv/s/EPui2AmZ1tZr9IYzNH2rIGv0jqEX3w+4JmX4/mDEDKe66k\nsZLel/SupDcl3S9pq2bklTfvtPegpMcknbGGEbNNzWxBBf3K5LVA0or4/ZfFz99XOh+nCX8Y1B4D\n5gMDMzsk7Qa0p3kPlkqg6IOqnpHUptp5VIhyf5PmXMNC5xpwjpl1BHoBnYHf5TRSXnCr+m/fTAw4\nysw6xkDU0cwG50qY6x4r9b5rRfdp1fAg0TLcDpyW2D4NGJ5MIOlWST+L6wdJekXS9yW9IWmhpNNz\npW3apUslvSVpnqSTEgeOlDRZUoOklyQNSZz33/i5NL6h7RvP+aakF+K+5yXtlTjns5Kek7RE0l2S\nNsz1hSWdJulxSVdLegsYImmIpNsTadZ4c49vqD+L570r6eHsElji3BckHZnYbhPftPeStJGk2yW9\nHf18StInc9lJi6Tekp6M9hZKuk7SJ7KSHSVpbvTjqqzzz4g+vyPpoUwJMm32AGa2FLgf2C3avFXS\nDZL+KWkZ0E/ShpKGxt96UTy+UcKPCyW9JulVSYNIBMXs+0rSsbHE2yDpRUmHSfo5cCBwffKtXmtW\nW3WUdFu8DvOVqMKK98V4Sb+RtDher8PTfP+1dq55j71NuMdy7ZOkHyuUSl6X9BdJHaONzD14hqSX\ngEfT/ijrKh4kWoYJwKaSdooPxBOAOyj8Frc1sCnQFTgT+IOkTgXSbh7Tng7cJGnHeGw5cIqZdQKO\nAr4t6Zh4rG/87Bjf0J6SNAD4KfD1+PZ6DPBOIq8BwGHA9sCeMb987AvMAbYCMtUY2W/q2dsDCUH0\nk8BGwAV5bN8JnJTYPhx4y8ymxPM7At0I1+XbwPsF/EzDKuB70d7ngUOAc7LSHAfsHZdjFatkJB0L\nXBKPfxIYD9xVqgOStgSOByYndg8ErjCzTYEngF8DOwB7xM9uhN+T+DD+PvBFYEfg0AJ59SG8yPwg\n3jt9gQVm9uPo/3ey3uqTv+P1hHt3O6AfcGoMSBn6ADOALYDfAMNKuQ5ZZO6xLjTdY9n7BgGnAgcB\nn46+XZ9lpy+wM/ClZviyTuBBouXIlCb6E/4grxVJ/xHhz7/KzB4iPOx3ypPWgJ+Y2UozGwf8E/ga\ngJmNM7Ppcf15YAThz5IkGay+AVxlZpPjOfPMLNkwfq2ZvRHfakcByVJGNgvN7AYzazSzD4t83wy3\nmtncmP6eAvbvAo6R1C5uD6TpwbuS8ADqZYFnzWx5yvxzYmaTzezpaO9l4CbWvo6/MrMGM3sVuIam\nKsazgF+a2WwzawR+BewlqXvK7K+TtBh4lnDf/CBx7EEzmxB9/BD4JnB+9OO9mFfGjwGE6zvDzN4H\nLiuQ5xnAMDP7T7S9yMxmF0gvWF3ldQJwiZmtMLOXgN8CpyTSvmRmt1gQkhsObC2pSwHbD8RSx5L4\n+Y3EsVz3WPa+k4CrzewlM1sBXAqcqKbqOQOGmNn7Jdyn6yzZxWOndtwBjCO8gd+WIv078YGSYQXQ\nIU/aJWb2QWL7JUKpAoUqpF8Sqig2jMu9BfLtDswtcPyNLJ8+VSBtOb2uXs+yn/M7m9lcSS8AR0v6\nB6HE89N4+HZgG2BELH3dAfzIzFaV4Q8AsWR2NbAPoT3pE8CkrGSvJtZX/wZAD+BaSb/NmCM8mLqR\n7hp918xuyXNs9fmxSm1jYJK0Ou5vQNNLQFfgmSwf85VmuxNeNkplS8K1eTkrn26J7dW/sZm9r+Bs\nB+DNPDaPNbPH8hzLdf2y93WNPiT9+QShhJvhVRzASxItRnz7nA8cAfytwuY3k9Q+sb0tTSWVvwIP\nAN3MrDPwJ5oeDLkaaV8BelbIr2z77xEeYhkKBZg0jCC8JR4LTDezeQBm9rGZXWFmuwJfAI4mVDc0\nhz8SSoA943X8EWs/YJMlgx40/QavAGeZ2eZx2czMOmRKAM0keY3fJgTWXRN5dY7VRQCLcviYr6G+\n0H1QqHH/bUJJrkdWPgsLnFOMYg37xfa9lsOflaz5wtPSnUjqBg8SLcsZwCGxqF9JBFwuqa2kAwlt\nD/fEYx0IJY2VsZ45WY//FtDImg+DPwMXSNobQFLPEqpFijEF6Cupe3zDv6SZ9kYQ2kfOJrRRACCp\nn6TdYnXCcsIDoTG3ibUQ0C42fmcWEeqx3zWzFZJ2jnlmc6GkzvF6DY7+AdwI/FDSLtG/TpL+r/Sv\nW5hYfXMzcE0sVSCpm6TDYpJ7gNMlfUbSxjSVvHIxDBgk6eDY8NtVUqa68w1C3X4uHxpjPr+Q1EFS\nD+B8QumupbgLOF/SdpI6ENopRiRK6vXew6umeJCoPcl+6PMzdf3Zx0qxk4NFwBLCG9PthLfWF+Ox\nc4ArJDUAPwbuTvjzPuEP80Ss6+1jZvfFfXdKehf4O6GxtlR/1/4CZv+O+U8FJhLaNNZIUqK914H/\nAfuR+F6Ehvz7gAZgOvAY8SGlMGDshkJmgWWEN/L34+fBhHaAk+M1+RNNASB53oOEKqjJ8bvdEv18\ngNA2MELSUsL3Pzzr3EL+lHLsYkKj7YSY12hC11nM7GFCW8l/gNkU6MljZhMJDb7XEK7jWEIJFeBa\nYIBCT61rcvgymHDd5hGqWO8ws1tL/B5JRsWeVJnl/iLps7mF8PuPI1Slrog+ps1/vULVnnQo9qC4\nhhCQhpnZr7OOXwCcTPhh2gKfAbaMDaGO4zhOC1LVIBGL97MJXexeI7wtnmhmM/Ok/zLwPTPL2xXP\ncRzHqR3Vrm7qA7wYu5qtJBTJjy2QPtlt0XEcx2lhCgYJhVGr+bqapSG7S9+rrNn1LZlXe0K9bKn1\ni47jOE6VKDhOwsxWxSHqncysocq+HA08nq8tQpI3JjmO45SBmZXdYytNddNyYJqkYZJ+n1lS2l9I\nUw8ICAOa8vWPPpEiVU1mVvfLkCFDWtwH99P9bK0+up+VX5pLmhHXf6P8wV4TgR1i3+hFhEAwMDtR\n7CN/EKGXk+M4jlMnFA0SZjZcQdmzV9w1y0IjdFEsVFd9h9A3O9MFdoaks8JhuykmPQ54xCo/qMxx\nHMdpBkWDhKR+BNGtBYSRiN0lnWZBOK4
oFgbs7JS1709Z28PJkspurfTr16+lXUiF+1lZWoOfrcFH\ncD/rjaLjJCRNAk4ys1lxuxdwl5l9rgb+Jf0wa2wE+Yh5x3GctEjCqtxw3TYTIAAsyAO3LTfDZtFQ\n7Q5WjuM4TpI0DdfPSPozQV4ZQuPyMwXSV49XXoHOnVska8dxnPWRNCWJs4EXCAJYg+N6LsXL6vOq\nS7w7juPUkoJtEgqTgN9mZi3eNVWS2ZIlXpJwHMcpgaq2SViYuauH8kxuX3M8QDiO49SUNG0S8wjz\nC4wkzCQGgJldnSaDYlLhMU0/4HeEBvG3zOzgNLYdx3Gc6pImSMyNywaE2bhSE6XCrychFS7pQUtI\nhcfR1n8ADjOzhZK2LCUPx3Ecp3oUDBKxTWJTM7ugTPurpcKjvYxUeHI+iZOA+81sIYCZvV1mXo7j\nOE6FSdMmsX8z7KeRCu8FbC7pMUkTJZ3SjPwcx3GcCpKmumlKbI+4lzXbJMoV/cvlw97AIcAmwP8k\n/c/M5mQnvKx7dzjpJGjfnn79+q03w+Idx3HSMnbsWMaOHVsxe2lkOXJNWG5mdkZR49J+wGVmdnjc\nviSe++tEmouBdmZ2edz+M/CQmd2fZctsl11gxAjYffdiWTuO4zg0vwtsGhXYQeUaJ51U+IPAdbH9\nYyNgXyB3z6lttgkD6jxIOI7j1IS8bRKS7kms/zrr2Og0xmObRkYqfDowIiMVLulbMc1M4BFgKjAB\nuMnMXshpsHv3IM3hOI7j1IRCJYkdE+v9gYsT259Mm0FKqfChwNCixjIlCcdxHKcmFOrdVKixomXm\nm+7e3YOE4zhODSlUkthY0mcJgaR9XFdc2tfCubUYMACOOaZFsnYcx1kfydu7SdJjhU6stXSGJKvE\npN6O4zjrE83t3VS0C2y94EHCcRyndGoxM53jOI6znlL1ICHpcEkzJc2OA+eyjx8kaamkyXH5cbV9\nchzHcdKRRpajbNKowEbGmZm3SDuO49QZqYKEpG5Aj2R6MxuX4tQ0KrAQekyl4+KLYeedYVBzBoI7\njuM4aSgaJOJo6xMIc1uvirsNSBMkcqnA9smR7vOSpgALgQvzjrgGaN8e5s9PkbXjOI7TXNKUJI4D\ndjKzD6vkwyRgWzNbIekI4AGCfPhaXHbZZTB5MrzyCv0OOcRVYB3HcbJoCRXYh4ABZra8ZOMpVGBz\nnDMf+JyZLc7aH7rAPvIIDB0KY8aU6o7jOM56R9VVYIEVhDklHgVWlybMbHCKc4uqwErayszeiOt9\nCIFr8VqWMrh+k+M4Ts1IEyRGxqVkzGyVpIwK7AbAsIwKbDhsNwH/J+lsYCXwPqH9Iz8ZJVgzUNnB\n0XEcx0lBqhHXkjakqZ1glpmtrKpXuX1oGnG9dCl06uRBwnEcpwhVl+WQ1A8YDiwgdFXtDpyWsgts\nxXBZDsdxnNKpRZCYBJxkZrPidi/gLjP7XLmZloMHCcdxnNKphXZT20yAADCz2UDbcjN0HMdxWg9p\nGq6fkfRn4I64fTLwTPVcchzHceqFNNVNGwHnAgfEXeOBG6o4uC6fH2tWNzU2wgYuYus4jlOI9XM+\niUcfhauuCgPrHMdxnLxUrU1C0j3xc5qkqdlLCQ4WlApPpOstaaWkrxY1uvXW8PLLaV1wHMdxyqTQ\n9KWfMrNFcbT0WmSUXQsaD1Lhs0lIhQMnZkuFx3RjCIPpbjGzv+Ww1VSSaGiAbt1g2TIfK+E4jlOA\nqpUkzGxRXD3HzF5KLsA5Ke2vlgqPA/AyUuHZfBe4D3gzldVOnUJ7RENDSjccx3GcckjT8ts/x74j\nUtrPJRXeLZlAUlfgODP7I6XMK9ESGk6vvRZKL47jOOsJebvARj2lc4CeWW0QmwJPVtCHa4BkW0Xe\nQHHZZZetXu/XoQP9Fi2C3XaroCtF+O534atfhZNPrl2ejuM4JVAzqXBJnYDNgF8ClyQOLSuo0rqm\njaJS4ZLmZVaBLYH3gG+Z2cgsW2t2gV21Ctq0SeNG5bj2WpgxA268sbb5Oo7jlEktZDn2A6ab2bK4\n3RH4jJk9lcK5NsAsQsP1IuBpYKCZzciT/lZgVNGG65Zi8mT4+tfhhfwT5zmO49QTtZDl+COQnHBo\nedxXFDNbBWSkwqcDIzJS4ZK+leuUNHZbjD33hIUL4c107euO4zitnTQliSlmtlfWvqlmtkdVPVvb\nj5YvSQAceSSceWZom3Acx6lzalGSmCdpsKS2cTkPmFf0rHWJBQvCKG+AAQNCe4jjOM56QJqSRBfg\n98AhhOqgR4HvmVlN61xyliSWLYMOHao/oO7qq2HOHLjhhurm4ziOU2GqPsd1DAYnlptBVenaNUxl\n2rlzdfMZPx6+9rXq5uE4jlOHFBoncZGZXSXpOnI0KJvZ4Kp6lobu3cOAumoGicbGECSuu656eTiO\n49QphUoSmW6q9Tt3RGbUdTUH1M2cCR07hrwcx3HWM/IGCTMbFT+HNycDSYcTRlVvAAxLDqSLx48B\nrgAagZXA+Wb2RCrj3buH6qZqMm4c9O1bOXstMQjQcRynTApVN42iwLgFMzummPGo7no9CRVYSQ9m\nqcD+OzO6WtLuwD3AZ1J5Xwv9pl12gV13XXv/VVfB4MHQrl16Ww0NoWrso4+grc8A6zhO/VOoC+xQ\n4LfAfIKE981xWQ7MTWm/qAqsma1IbHYglCjSsf328N57qZOXRd++cOCBa++/7z54+unSbHXqBL17\nwxPpCkqO4zgtTaHqpv8CSPqtme2TODRKUtp2ilwqsH2yE0k6jqAR9UngqJS24fTTUyetOH37llcV\ndeSR8K9/Qb9+VXHLcRynkhTtAgtsIunTZjYPQNL2wCaVdMLMHgAekHQA8HNyy5OvqQLbrx/9WvJB\ne+CB5Y2bOOII+MY3QnWV4zhOhamZCuzqBKHh+SbCKGsBPYCzzKzoBNNpVGBznDMX6J2tNFs3shwZ\n3nknVHctXgyfSBNrI6tWhelXn3kGeuSc9M9xHKdi1GIw3cOSdgR2jrtmmtmHKe1PBHaIU6AuIgzK\nG5hMIKmnmc2N63sDG6aVIm9RttgCtt0WpkyBffYpnj5DmzYwcCC8+KIHCcdx6p6i2k2SNgYuBL5j\nZs8B20r6chrjKVVgj5f0vKTJwHVAfQxtfv11OOWUwmmuuSaM+i7G2LFw111N27//PRx6aLPccxzH\nqQVpqpvuBiYBp5rZbjFoPJmtDFtt8lY3Zap7OnasbIb33gu33QajRjXf1oAB0L8/fCuXOrrjOE71\nqIUKbE8zu4ow0C3TZbXKinolcMEFcPfdlbc7fnxlBtEtXQqjR4dA4TiO08pIEyQ+ktSeOLBOUk8g\nbZtE9cnoN1WaceNyj48olfvuC1VLm23WfFuO4zg1Jk2QGAI8DHSX9FeCVPhFVfWqFLbZpvLSHEuX\nwty5sPfezbd1++3F2zYcx3HqlIJBQpKAmcBXgdOBu4B9zGxs1T1LSzVKEk88AX36wIYbNs/Oq6/C
\n9OlhbEQuRoyovvaU4zhOMygYJGJL8b/M7B0z+6eZ/cPM3q6Rb+mohn7ToYfC8JS6hm+/DQccALka\n1bt1g6lTYaONcp87ejQ88ED5fpbLT34CV15Z+3wdx2l1pKlumiypd9U9KZfu3cPsdJVko43SS4Nv\nsQXMmwfz5699TCrcRTYj0VFrnnsObr0Vfp13TKPjOA6QLkjsC0yQNFfSVEnTJE1Nm4GkwyXNlDRb\n0sU5jp8k6bm4PB6VYNPTqVPpQnuVRAoN3OPHl35u//7w+OOwYkXxtJVk5Ej473/h5pvDmA3HcZw8\npNGT+FK5xlNKhc8D+ppZQ5QAuRnYr9w8W4SM2N9pp5V2XqdO8LnPwWOPwVHpdQ0rQteu8OijwfdP\nfxq+nGp8pOM46xl5SxKS2kn6HmG09eHAwij5/ZKZvZTSfhqp8Alm1hA3JxCUY1sX5ZYkIFQ5PfRQ\nZf1JS48ewe/+OfUUHcdxClY3DQf2AaYBRxDmliiVXFLhhYLAmUDln5grV8LQoaF6pZhI4Lvvlm5/\nt92C4N9bb4XtWbNgzpx05w4cCCeeWHqelWLbbfM3rDuOs95TqLppFzPbHUDSMKCqFf+SDgYGAQfk\nS1O2VPgxxwT11SVL4G9/g2HDcjcof/RRU2+pUmQ+NtgAXnqpqQH9iitg333hu98tfm737mFxHMep\nADWTCpc02cz2zredynhKqXBJewD3A4dnFGFz2MovFd7QAG++CTvumPv4iy/CDjvAxx/Dz38ON94I\nDz4I+2U1fUyYAN/+dlB2LZfly0OgmT0bunQp306lee65oEP14x+3tCeO49SQamo37Snp3bgsA/bI\nrEtKWyezWipc0oYEqfCRyQSStiUEiFPyBYii/O9/cO65+Y/vuGPohdS2LVx+eXhY7rLL2unGj2++\nFMff/x7GTdRTgIDQ5fWjj9KlPeGE8D3mzw8BuJ7m8XAcp6YUmr60TXONm9kqSRmp8A2AYRmp8HDY\nbgJ+AmwO3BBHeK80s7WmOC1IckCdWQgIheiTx/y4cXDqqSVlvRZ33AGDBjXPRqVZuTJIlaedW/us\ns+Cii8JAwcWLQxfdrl3h5Zdz277hhqBNtfnmYcmsb7VVZb+H4zg1p6hUeL1QtLqpWzc49tgwd/Q3\nv1l6Bo2NYWDcjBlh5rhyWLQolFAWLoSNNy7Phw3SDF0pkZEjw3Spjz9e3vkrV4YG/S22WPvYe+/B\npZeGYLJ4cWj3WRznjJo1q3yfHcepCFWfma5V0LEjbLJJeHs9+eTybCxZErqElhsgIMxrccst5QWI\nv/41VHfdeGP5+edj+PDSx3Akads2d4CAcN1LGZC3aBFMmxaq9dq3L98nx3FqwrpRkoBQ394cQT6z\nYKOluoPOmgVf/GIQ/CtWXVYK770XurnOmxcG77U0EyfC+ecHTasDDoCvfQ2+8pX68M1x1kFqMelQ\n66C5iq1Sy44X6NUL2rULb9mVZJNNYMGC+nkI9+4dqr1efjm0/zzwQAhif/lLS3vmOE4OCnWBXUac\naIimmegsrpuZVXi+0MIULUmsCwweHBqIL7mkpT2pLUuXhlJcvfUIc5x1gOaWJNad6qZ1gYcfDhLe\n48a1tCf1w7nnwqc+FXqk7bNPaHdyHCc1NaluknSApEFxfUtJ25fgYDEV2J0kPSnpA0nfT+/6OshB\nB4Vupx/Wz+ywLc7BB4fea1deCdttFwZFnnRS+jEfzvrJuv5CWUOKliQkDSFoOO1kZr0kdQXuNbP9\nixoPKrCzSajAAicmVWAlbQn0AI4DlpjZ1XlsrfslCacwq1aFBv6pU1tW72p9oKEh9D5rbltfpRg9\nGl5/fc1u1osXwx/+AJ07r5l25UrYa68gx3PWWeHlYj2mFiWJrwDHAO8BmNlrwKYp7adRgX3bzCYB\nH6f22inO1KlBZmRdok2bMA4lX4CYNAnOPDNIoK9aVdze/PlhDMmYMaH318et6BZsbIRnnoE//hHO\nOAN23z0oC5x6arq36HfegXvugX/+M/fxoUND1/Lttw8zNX772/Cb31Ru7Msee4RBl9lLrgGbEAaD\njh4dficI3/WII3IHsbZtg2LABx8EKf6jjw5VuY2NlfF9PSPNOImPzMwkGYCkTUqwn0sFtrTR1E55\nDB0a/iDZ+lTrMttsAzvvHEaLv/YaDBgQ5snYaacwBiabO+4IgfSDD4Jq7xtvBLHFG27ILZ8+Z05u\nleCePXP3Hkum33zzYLtNs4UMmjjvvPB9+/QJbTft2oU8c3WhXr4cnnwy9Cx75JEwaLRvX/jGN3Lb\nvuIK+OlPw0N7zhyYOzd8vvVWuJ7Z/OMfsGxZ6Hgxc2bo6vz00/DnP+dWOBg/PvdDO5+w5q235r8O\nuejVC373u6DVNmIE/PCHcPfdpdtxUlU3XQDsCPQHfgmcAdxpZtcVNS4dD3zJzL4Vt78O9DGzwTnS\nDgGWFapuGjJkyOrtklRgWzNmcPrpYTn44HTnLFsWHkj1JjJYS2bPDg+FMWOCGu+AAcXP+fDDULro\n0iV3A/l55+WeN+Saa8IDN196s/Dm/uabIVjddltQCc5m6tTw8E4+lOfOhbFj84tXpuX55+Gcc+Dz\nn4cvfQn237+yXb6HDQvBZ+HC8IDu3Tsse+5ZH1VWZuF/kSsILVoUuoqXovzc2BhsVjLoV4hsFdjL\nL7+8+r2bJPUHDoubo81sTCrjKVVg47GiQWK9bZN46KFQjXLCCaEBt127wun/8pcgiT5yZOF0Tm35\n4IMQhLp1y/1AGjQoPMh22CGUTnr2DOvbbFMduRYn8IMfwJ/+FNpgktf+61/PHZxPOAHuvz/8D/fa\nqykgHnnk2u0j2Xz8cSidzZsXBs/mKvUtWRLujwoFoJp0gZW0NaGayICJZvZ6SufaALMIDdeLCHNS\nDDSzGTnSDgGWm1nOyY3W6yAB4U307LPhhRfgssvCTZmrCmXatKBddeGFcPzxNXfTcVolZqG6MVOC\nmzcvlD53223ttMuWhQCxYkVoB5s4MSxXXhlKUdlceGEoyc2ZExQVunQJwWjUqFCCyfajS5emQLHp\npk2BZO7csgJH1YOEpDOBnwL/IQykOwj4mZndktLBw4FraVKB/VVSBVbSVsAzhMbwRmA5YcKj5Vl2\n1u8gAeEGuvPO0Ih3zjnhzSWb668PdcE33+wzzjlOPTB8eNA+22GH0NOqWE0AhI4XDQ1rtoH16FGW\nZE8tgsQs4Atm9k7c3gJ40sxytF5VDw8SjuM4pVOLLrDvAMsS28viPsdxHGcdJ28X2MTo5znAU5Ie\nJLRJHAtMrYFvjuM4TgtTaJxEZsDc3LhkeLB67jiO4zj1hAv8OY7jrMNUfWY6SZ8ELgJ2BVY3y5vZ\nIeVm6jiO47QO0jRc/xWYCWwPXA4sIAj1OY7jOOs4aYLEFmY2DFhpZv81szOA1KWIYlLhMc3vJb0o\naYqkvdLarkeSw+HrGfezsrQGP1uDj+B+1htpgsTK+Ll
I0lGSPgukmvklSoVfD3yJUF01UNLOWWmO\nAHqa2Y7AWcCNaZ2vR1rLjeN+VpbW4Gdr8BHcz3ojjQrszyV1An4AXAd0BL6X0v5qqXAASRmp8JmJ\nNMcCtwGY2VOSOknayszeSJmH4ziOUyWKliTM7B9m1mBmz5vZwWb2OaBnSvu5pMK7FUmzMEcax3Ec\npwUoqwuspJfNbNsU6YpKhUsaBfzSzJ6M2/8GLjKzyVm2vP+r4zhOGVS1C2we0ma4EEgGk23ivuw0\n3YukadaXdBzHccqjXJH6tG/1E4EdJPWQtCFwIpA9ycFI4FRYPf/EUm+PcBzHqQ8KaTctI3cwENA+\njXEzWyXpO8BomqTCZySlws3sX5KOlDSHMI/2oJK/heM4jlMVWo0sh+M4jlN7WsWciGkG5NXQl2GS\n3pA0NbFvM0mjJc2S9EjsMpw5dmkcKDhD0mG5rVbcx20k/UfSdEnTJA2uUz83kvSUpGejn0Pq0c+Y\n7waSJksaWa8+xrwXSHouXtOn69HX2M393pjndEn71qGPveI1nBw/GyQNrjc/Y77nS3pe0lRJf5W0\nYUX9NLO6XgiBbA7QA2gLTAF2bkF/DgD2AqYm9v2a0CML4GLgV3F9F+BZQrXedvF7qAY+bg3sFdc7\nEKaQ3bmYVb2MAAAGf0lEQVTe/Ix5bxw/2wATCGNr6tHP84E7gJH1+Jsn/JwHbJa1r658Bf4CDIrr\nnwA61ZuPWf5uALxG6GBTV34CXeNvvmHcvhs4rZJ+1uxCN+Mi7Ac8lNi+BLi4hX3qwZpBYiawVVzf\nGpiZy1fgIWDfFvD3AeDQevYT2JgwjW3vevOT0ONuDNCPpiBRVz4m8ptPkNJJ7qsbXwmDcefm2F83\nPubw7TBgfD36SQgSLwGbxQf/yEr/11tDdVOaAXktTReLPbLM7HWgS9zf4gMFJW1HKPlMINw0deVn\nrMZ5FngdGGNmE+vQz98BF7JmR4568zGDAWMkTVSYnx7qy9ftgbcl3Rqrcm6StHGd+ZjNCcCdcb2u\n/DSz14DfAi/HPBvM7N+V9LM1BInWSF30BpDUAbgPOM/MlrO2Xy3up5k1mtlnCW/rfSTtSh35Keko\n4A0zm0Lh8UEtfi0j+5vZ3sCRwLmSDqSOrifhbXdv4A/Rz/cIb7f15ONqJLUFjgHujbvqyk9JnQnS\nRj0IpYpNJJ2cw6+y/WwNQSLNgLyW5g1JWwFI2hp4M+5PNVCwGkj6BCFA3G5mmdkE687PDGb2LjAW\nOJz68nN/4BhJ84C7gEMk3Q68Xkc+rsbMFsXPtwjVjH2or+v5KvCKmT0Tt+8nBI168jHJEcAkM3s7\nbtebn4cC88xssZmtAv4OfKGSfraGIJFmQF6tEWu+VY4ETo/rp9E0xetI4MTY22B7YAfg6Rr5eAvw\ngpldW69+Stoy0+tCUnugPzCjnvw0sx+a2bZm9mnCvfcfMzsFGFUvPmaQtHEsPSJpE0Jd+jTq63q+\nAbwiqVfc9UVgej35mMVAwstBhnrz82VgP0ntJIlwPV+oqJ+1bABqRuPM4YQeOi8Cl7SwL3cSejp8\nGH+gQYRGo39HH0cDnRPpLyX0IJgBHFYjH/cHVhF6gj0LTI7XcPM683P36NsUYCrwo7i/rvxM5H0Q\nTQ3Xdecjob4/85tPy/xX6s1XYE/Cy98U4G+E3k115WPMd2PgLWDTxL569HNIzHMqMJzQC7Rifvpg\nOsdxHCcvraG6yXEcx2khPEg4juM4efEg4TiO4+TFg4TjOI6TFw8SjuM4Tl48SDiO4zh58SDh1B2S\nGiXdlthuI+ktNcl0Hy3poiI2PiXpnrh+mqTrSvTh0hRpbpX01VLsVhJJj0nau6Xyd9YPPEg49ch7\nwG6SNorb/UmIkpnZKDO7qpABM1tkZl9L7irRhx+WmL5VIalNS/vgtA48SDj1yr+Ao+L6GtIIyZJB\nfJu/VtITkuZk3uyjjMu0hL1t45v3LEk/Tdj6e1RMnZZRTZX0S6B9VCm9Pe47VU2T+QxP2D0oO+8k\n0Y8Xotrp85IezgS/ZElA0haS5ie+39/jpDHzJJ2rMLHMZElPRlG3DKdGn6ZK6h3P31hhcqwJkiZJ\nOjph90FJjxJG4zpOUTxIOPWIASOAgfGBugfwVI40GbY2s/2BowmTreRK0xv4CkESYkCimmaQmfWO\nx8+TtJmZXQqsMLO9zewUSbsQShb9LCjWnpci7yQ7ANeZ2W5AA3B8ge+dYVfgOIJA3y+A5RZUUycA\npybStY8+nUvQ6wL4EfCome0HHAIMjdpYAJ8FvmpmB+fxwXHWwIOEU5eY2fOEmbMGAv+ksEz3A/Gc\nGTTp5mczxsyWmtkHBL2gA+L+70maQnj4bgPsGPcn8zsEuNfMlsR8lpaY93wzy5RqJsXvVYzHzGyF\nBfXRpcA/4v5pWeffFfMfD2wqqSNB2O8ShXk6xgIb0qSkPMbMGlLk7zhA0HZ3nHplJPAbwoxwWxZI\n92FiPV8wWUtfX9JBhACwr5l9KOkxoF2JPqbJO5lmVSKPj2l6UcvON3mOJbYbWfN/m2veAAHHm9mL\nyQOS9iO09zhOarwk4dQjmYftLcDlZja9jHOz6S+pc6x2OQ54gqA+uiQGiJ0JU+Vm+CjRuPsfQhXV\n5gCSNisx73z7FwD7xPUBedIU44To0wGEWcmWAY8Ag1dnLu1Vpm3H8SDh1CUGYGYLzez6NGkLbGd4\nmlDNNIVQdTQZeBhoK2k6cCXwv0T6m4Bpkm43sxfi8f/GKpzflph3vv1DgbMlTSJIO+ejkN0PJE0G\nbgDOiPuvIHyvqZKeB35WwLbjFMSlwh3HcZy8eEnCcRzHyYsHCcdxHCcvHiQcx3GcvHiQcBzHcfLi\nQcJxHMfJiwcJx3EcJy8eJBzHcZy8/D9Pt3WyaiA8CAAAAABJRU5ErkJggg==\n",
|
|
"text/plain": [
|
|
"<matplotlib.figure.Figure at 0x2b60bbcf128>"
|
|
]
|
|
},
|
|
"metadata": {},
|
|
"output_type": "display_data"
|
|
}
|
|
],
   "source": [
    "# Compute the moving average loss to smooth out the noise in SGD\n",
    "plotdata[\"avgloss\"] = moving_average(plotdata[\"loss\"])\n",
    "plotdata[\"avgerror\"] = moving_average(plotdata[\"error\"])\n",
    "\n",
    "# Plot the training loss and the training error\n",
    "import matplotlib.pyplot as plt\n",
    "\n",
    "plt.figure(1)\n",
    "plt.subplot(211)\n",
    "plt.plot(plotdata[\"batchsize\"], plotdata[\"avgloss\"], 'b--')\n",
    "plt.xlabel('Minibatch number')\n",
    "plt.ylabel('Loss')\n",
    "plt.title('Minibatch run vs. Training loss')\n",
    "\n",
    "plt.show()\n",
    "\n",
    "plt.subplot(212)\n",
    "plt.plot(plotdata[\"batchsize\"], plotdata[\"avgerror\"], 'r--')\n",
    "plt.xlabel('Minibatch number')\n",
    "plt.ylabel('Label Prediction Error')\n",
    "plt.title('Minibatch run vs. Label Prediction Error')\n",
    "plt.show()"
   ]
  },
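  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A quick note on the smoothing used above: the `moving_average` helper was defined earlier in this notebook. Conceptually, it replaces each value with the mean of a small window of neighboring values, which makes the noisy per-minibatch SGD curves easier to read. A minimal sketch of such a helper (an illustration with an assumed window size, not necessarily the exact definition used earlier; it assumes `numpy` is imported as `np`):\n",
    "\n",
    "    def moving_average(a, w=10):\n",
    "        # Too few points to smooth: return the data unchanged\n",
    "        if len(a) < w:\n",
    "            return a[:]\n",
    "        # Mean over a sliding window of w consecutive values\n",
    "        return [np.mean(a[i:i + w]) for i in range(len(a) - w + 1)]\n"
   ]
  },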
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "## Evaluation / Testing\n",
    "\n",
    "Now that we have trained the network, let us evaluate it on data that has not been used for training. This is often called **testing**. Let us create a new data set and compute the average error on it. This is done using `trainer.test_minibatch`."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 25,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/plain": [
       "0.12"
      ]
     },
     "execution_count": 25,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# Generate new data\n",
    "test_minibatch_size = 25\n",
    "features, labels = generate_random_data_sample(test_minibatch_size, input_dim, num_output_classes)\n",
    "\n",
    "trainer.test_minibatch({input : features, label : labels})"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Note that this error is comparable to our training error, which indicates that the model has a small \"out of sample\" error, also known as generalization error. In other words, the model deals effectively with observations it did not see during training. Keeping the generalization error close to the training error is key to avoiding overfitting."
   ]
  },
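  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Because the test set above is a single random minibatch of 25 points, the reported error can fluctuate from run to run. A minimal sketch of one way to get a more stable estimate, reusing `generate_random_data_sample` and `trainer.test_minibatch` exactly as in the cell above (variable names as defined earlier in this tutorial, with `numpy` imported as `np`):\n",
    "\n",
    "    # Average the test error over several freshly sampled minibatches\n",
    "    test_errors = []\n",
    "    for _ in range(10):\n",
    "        f, l = generate_random_data_sample(test_minibatch_size, input_dim, num_output_classes)\n",
    "        test_errors.append(trainer.test_minibatch({input: f, label: l}))\n",
    "    print(\"Average test error: {0:.2f}\".format(np.mean(test_errors)))\n"
   ]
  },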
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "We have so far been dealing with aggregate measures of error. Let us now get the probabilities associated with individual data points. For each observation, the `eval` function returns the probability distribution across all the classes. With the default parameters in this tutorial, that is a vector of 2 elements per observation. First let us route the network output through a softmax function.\n",
    "\n",
    "#### Why do we need to route the network output `netout` via `softmax`?\n",
    "\n",
    "The way we have configured the network includes the output of all the activation nodes (e.g., the green layer in Figure 4). The output nodes (the orange layer in Figure 4) convert the activations into probabilities. A simple and effective way to do this is to route the activations via a softmax function."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 26,
   "metadata": {},
   "outputs": [
    {
     "data": {
      "text/html": [
       "<img src=\"http://cntk.ai/jup/feedforward_network.jpg\" width=\"200\" height=\"200\"/>"
      ],
      "text/plain": [
       "<IPython.core.display.Image object>"
      ]
     },
     "execution_count": 26,
     "metadata": {},
     "output_type": "execute_result"
    }
   ],
   "source": [
    "# Figure 4\n",
    "Image(url=\"http://cntk.ai/jup/feedforward_network.jpg\", width=200, height=200)"
   ]
  },
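  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "For reference, softmax maps the vector of output activations $z = (z_1, \\ldots, z_K)$ to a probability vector\n",
    "\n",
    "$$ p_i = \\frac{e^{z_i}}{\\sum_{j=1}^{K} e^{z_j}}, $$\n",
    "\n",
    "so each output lies between 0 and 1 and the outputs sum to 1, which is what lets us read them as class probabilities (here $K = 2$)."
   ]
  },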
  {
   "cell_type": "code",
   "execution_count": 27,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "out = C.softmax(z)"
   ]
  },
  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Let us test on previously unseen data."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 28,
   "metadata": {
    "collapsed": true
   },
   "outputs": [],
   "source": [
    "predicted_label_probs = out.eval({input : features})"
   ]
  },
  {
   "cell_type": "code",
   "execution_count": 29,
   "metadata": {},
   "outputs": [
    {
     "name": "stdout",
     "output_type": "stream",
     "text": [
      "Label : [1, 0, 0, 1, 1, 1, 0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 0, 1]\n",
      "Predicted: [1, 0, 0, 0, 0, 0, 0, 1, 1, 0, 1, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 0, 1]\n"
     ]
    }
   ],
   "source": [
    "print(\"Label :\", [np.argmax(label) for label in labels])\n",
    "print(\"Predicted:\", [np.argmax(row) for row in predicted_label_probs])"
   ]
  },
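  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "Comparing the two lists above, the prediction disagrees with the label on 3 of the 25 test points, which matches the average test error of 0.12 reported by `trainer.test_minibatch` earlier. A minimal sketch of computing that fraction directly from the arrays already in memory (names as used in the cells above):\n",
    "\n",
    "    # Fraction of test points whose predicted class differs from the true class\n",
    "    mismatch = np.mean([np.argmax(l) != np.argmax(p)\n",
    "                        for l, p in zip(labels, predicted_label_probs)])\n",
    "    print(\"Test error rate: {0:.2f}\".format(mismatch))\n"
   ]
  },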
  {
   "cell_type": "markdown",
   "metadata": {
    "collapsed": true
   },
   "source": [
    "**Exploration Suggestions**\n",
    "\n",
    "- Try exploring how the classifier behaves with different data distributions, e.g., by changing the `minibatch_size` parameter from 25 to, say, 64. What happens to the error rate? How does the error compare to that of the logistic regression classifier?\n",
    "- Try exploring different learners, such as `fsadagrad` (an Adam-style learner). \n",
    "    learner = fsadagrad(z.parameters(), 0.02, 0, targetAdagradAvDenom=1)\n",
    "- Can you change the network to reduce the training error rate? When do you see *overfitting* happening?"
   ]
  },
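  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "The one-line learner snippet above is only a hint, and the exact signature depends on your CNTK version, so consult the `cntk.learners` documentation. As a rough sketch of how such a learner is typically constructed in the CNTK 2.x Python API (the schedule values are illustrative assumptions, not tuned settings):\n",
    "\n",
    "    # Assumed sketch: construct an fsadagrad learner with explicit schedules\n",
    "    lr_schedule = C.learning_rate_schedule(0.02, C.UnitType.minibatch)\n",
    "    momentum_schedule = C.momentum_schedule(0.9)\n",
    "    learner = C.fsadagrad(z.parameters, lr=lr_schedule, momentum=momentum_schedule)\n"
   ]
  },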
  {
   "cell_type": "markdown",
   "metadata": {
    "collapsed": true
   },
   "source": [
    "#### Code link\n",
    "\n",
    "If you want to try running the tutorial from a Python command prompt, please run the [FeedForwardNet.py][] example.\n",
    "\n",
    "[FeedForwardNet.py]: https://github.com/Microsoft/CNTK/blob/v2.0/Tutorials/NumpyInterop/FeedForwardNet.py"
   ]
  }
 ],
 "metadata": {
  "anaconda-cloud": {},
  "kernelspec": {
   "display_name": "Python 3",
   "language": "python",
   "name": "python3"
  },
  "language_info": {
   "codemirror_mode": {
    "name": "ipython",
    "version": 3
   },
   "file_extension": ".py",
   "mimetype": "text/x-python",
   "name": "python",
   "nbconvert_exporter": "python",
   "pygments_lexer": "ipython3",
   "version": "3.5.2"
  }
 },
 "nbformat": 4,
 "nbformat_minor": 1
}