This repository contains code samples for building a continuous integration pipeline for an AI application using Azure.

These samples show how to build an AI application with DevOps in mind. An AI application always involves two streams of work: data scientists building machine learning models, and app developers building the application and exposing it to end users.

In this tutorial we demonstrate how to build a continuous integration pipeline for an AI application. The pipeline kicks off for each new commit and runs the test suite; if the tests pass, it takes the latest build and packages it in a Docker container. The container is then deployed using Azure Container Service (ACS), and the images are securely stored in Azure Container Registry (ACR). ACS runs Kubernetes to manage the container cluster, but you could choose Docker Swarm or Mesos instead.
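
As a rough sketch, the per-commit CI steps look something like the following (the image name, registry name, and test path are placeholders, not the exact values from this repo's Makefile):

    # Run the test suite; abort the pipeline if it fails
    python -m pytest tests/

    # Package the latest build as a Docker image tagged for the registry
    # (myregistry.azurecr.io and myapp are placeholder names)
    docker build -t myregistry.azurecr.io/myapp:latest .

    # Push the image to Azure Container Registry
    docker push myregistry.azurecr.io/myapp:latest

    # Roll the new image out to the Kubernetes cluster running in ACS
    kubectl apply -f deploy.yaml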

The application securely pulls the latest model from an Azure Storage account and packages it as part of the application. The deployed application has the app code and the ML model packaged in a single container.
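
A minimal sketch of that model download step, using the Azure CLI (the storage account, container, blob, and token names below are placeholders; the repo's downloadblob.sh may do this differently):

    # Download the latest trained model from Azure Blob Storage
    # (account, container, and blob names are placeholders)
    az storage blob download \
        --account-name mystorageaccount \
        --container-name models \
        --name model.pkl \
        --file flaskwebapp/model.pkl \
        --sas-token "$MODEL_SAS_TOKEN"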

This decouples the app developers from the data scientists while making sure that the production app is always running the latest code with the latest ML model.

A variation of this tutorial could be to consume the ML model as an endpoint instead of packaging it into the app.

The goal is to show how easy it is to do DevOps for an AI application.

Steps to set up the build and test environment

  • Create a VM on Azure

  • Install Docker (see the command sketch after this list)

  • Set up Docker so it can be run without sudo. If you want to avoid typing sudo whenever you run the docker command, add your username to the docker group:

    • sudo usermod -aG docker ${USER}

    To apply the new group membership, log out of the server and back in, or type the following:

    • su - ${USER}

    You will be prompted for your user's password. Afterwards, confirm that your user has been added to the docker group by typing:

    • id -nG

  • Log in to ACR so the credentials are stored (see the sketch after this list)

  • Install Anaconda: https://medium.com/@GalarnykMichael/install-python-on-ubuntu-anaconda-65623042cb5a
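
As a sketch of the Docker install and ACR login steps above (the Ubuntu package name is the standard one; the registry name is a placeholder):

    # Install Docker on the Ubuntu VM
    sudo apt-get update
    sudo apt-get install -y docker.io

    # Log in to Azure Container Registry so the credentials are cached locally
    # (myregistry is a placeholder registry name)
    az acr login --name myregistry
    # or, if admin credentials are enabled on the registry:
    docker login myregistry.azurecr.io -u <username> -p <password>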