This repository contains code samples for building a continuous integration pipeline for an AI application using Azure.

This repository contains samples showing how to build an AI application with DevOps in mind. For an AI application there are always two streams of work: data scientists building machine learning models, and app developers building the application and exposing it to end users.

In this tutorial we demonstrate how you can build a continuous integration pipeline for an AI application. The pipeline kicks off for each new commit, runs the test suite, and, if the tests pass, takes the latest build and packages it in a Docker container. The container is then deployed using Azure Container Service (ACS), and the images are securely stored in Azure Container Registry (ACR). ACS runs Kubernetes to manage the container cluster, but you can choose Docker Swarm or Mesos instead.
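
At a high level, the build and release stages of such a pipeline boil down to a few commands. The sketch below is illustrative only; the test command, registry name, and image name are assumed placeholders, not values taken from this repository:

```bash
# Run the test suite; the build stops here if any test fails
python -m unittest discover tests

# Package the latest build into a Docker image (placeholder names)
docker build -t myregistry.azurecr.io/ai-app:latest .

# Push the image to Azure Container Registry (ACR)
az acr login --name myregistry
docker push myregistry.azurecr.io/ai-app:latest

# Roll out the new image to the Kubernetes cluster on ACS
kubectl apply -f deploy.yaml
```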

The application securely pulls the latest model from an Azure Storage account and packages it as part of the application. The deployed application has the app code and ML model packaged as a single container.
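
A minimal sketch of that model download step, in the spirit of the downloadblob.sh script in this repository, using the Azure CLI; the storage account, container, blob, and token variable names are hypothetical placeholders:

```bash
# Pull the latest trained model out of Azure Blob Storage so it can be
# baked into the application image at build time.
# All names below are placeholders; a SAS token is read from the
# environment instead of hard-coding credentials.
az storage blob download \
  --account-name mystorageaccount \
  --container-name models \
  --name model.pkl \
  --file model.pkl \
  --sas-token "$MODEL_SAS_TOKEN"
```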

This decouples the work of app developers and data scientists while making sure that the production app is always running the latest code with the latest ML model.

A variation of this tutorial could consume the ML model as an endpoint instead of packaging it in the app. The goal is to show how easy it is to do DevOps for an AI application.
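
In that variation, the application would call a hosted scoring endpoint over HTTP instead of loading the model from its own container. The URL, route, and payload shape below are assumptions for illustration only:

```bash
# Score a sample input against a hypothetical model endpoint
curl -X POST "http://my-scoring-service.example.com/score" \
  -H "Content-Type: application/json" \
  -d '{"input": [1.0, 2.0, 3.0]}'
```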

For detailed instructions, please refer to the tutorial in [Tutorial.md](Tutorial.md).