Enables inference and deployment of InnerEye-DeepLearning (https://github.com/microsoft/InnerEye-deeplearning) models as an async REST API on Azure


Introduction

InnerEye Inference API

InnerEye-Inference is an Azure App Service web app, written in Python, that runs inference on medical imaging models trained with the InnerEye-DeepLearning toolkit.

You can also integrate it with DICOM using the InnerEye-EdgeGateway.

Getting Started

Installing Conda or Miniconda

Download a Conda or Miniconda installer for your platform and run it.

Creating a Conda environment

Note that in order to create the Conda environment you will need to have build tools installed on your machine. If you are running Windows, they should already be installed with the Conda distribution.

You can install build tools on Ubuntu (and Debian-based distributions) by running
sudo apt-get install build-essential
If you are running a CentOS/RHEL distribution, you can install the build tools by running
yum install gcc gcc-c++ kernel-devel make

Start the conda prompt for your platform. In that prompt, navigate to your repository root and run

  • conda env create --file environment.yml
  • conda activate inference
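After activation, you can verify that the right environment is in use before starting the app. A minimal sketch, assuming only that `conda activate` sets the standard CONDA_DEFAULT_ENV variable (the helper name `conda_env_is_active` is hypothetical, not part of the repository):

```python
import os

def conda_env_is_active(name: str, environ=os.environ) -> bool:
    """True if `conda activate <name>` has been run in the current shell.

    `conda activate` exports CONDA_DEFAULT_ENV with the active env's name.
    """
    return environ.get("CONDA_DEFAULT_ENV") == name

if __name__ == "__main__":
    if not conda_env_is_active("inference"):
        raise SystemExit("Activate the env first: conda activate inference")
```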

Configuration

Create a script named set_environment.sh to set your environment variables; it can be executed on Linux. The app will read this file if the environment variables are not already present.

#!/bin/bash
export CUSTOMCONNSTR_AZUREML_SERVICE_PRINCIPAL_SECRET=
export CUSTOMCONNSTR_API_AUTH_SECRET=
export CLUSTER=
export WORKSPACE_NAME=
export EXPERIMENT_NAME=
export RESOURCE_GROUP=
export SUBSCRIPTION_ID=
export APPLICATION_ID=
export TENANT_ID=

Load it into your shell with source set_environment.sh
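The fallback behaviour described above (reading set_environment.sh when the variables are missing) can be sketched roughly as follows. This is a simplified illustration, not the repository's actual implementation; the helper name `load_environment` is hypothetical, and only the variable names from the script above are assumed:

```python
import os

# Variables the service expects (from set_environment.sh above).
REQUIRED_VARS = [
    "CUSTOMCONNSTR_AZUREML_SERVICE_PRINCIPAL_SECRET",
    "CUSTOMCONNSTR_API_AUTH_SECRET",
    "CLUSTER",
    "WORKSPACE_NAME",
    "EXPERIMENT_NAME",
    "RESOURCE_GROUP",
    "SUBSCRIPTION_ID",
    "APPLICATION_ID",
    "TENANT_ID",
]

def load_environment(script_path: str = "set_environment.sh") -> None:
    """If any required variable is unset, parse `export NAME=value` lines
    from the script and copy them into os.environ (without overwriting)."""
    if all(name in os.environ for name in REQUIRED_VARS):
        return  # everything already set; nothing to do
    with open(script_path) as f:
        for line in f:
            line = line.strip()
            if line.startswith("export ") and "=" in line:
                name, _, value = line[len("export "):].partition("=")
                os.environ.setdefault(name.strip(), value.strip())
```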

Running flask app locally

  • flask run to test it locally

Running flask app in Azure

  • Install Azure CLI: curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
  • Login: az login --use-device-code
  • Deploy: az webapp up --sku S1 --name test-python12345 --subscription <your_subscription_name> -g InnerEyeInference --location <your region>
  • In the Azure portal go to Monitoring > Log Stream for debugging logs

Deployment build

If you would like to reproduce the automatic deployment of the service for testing purposes:

  • az ad sp create-for-rbac --name "<name>" --role contributor --scope /subscriptions/<subs>/resourceGroups/InnerEyeInference --sdk-auth
  • The previous command returns a JSON object; store its contents in the secrets.AZURE_CREDENTIALS secret referenced by .github/workflows/deploy.yml
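Before pasting that JSON into the secret, it can be sanity-checked for the fields a service-principal credential needs. A small sketch, assuming the standard field names emitted by az ad sp create-for-rbac with --sdk-auth (clientId, clientSecret, subscriptionId, tenantId); the helper name `check_azure_credentials` is hypothetical:

```python
import json

# Core keys in the JSON emitted by `az ad sp create-for-rbac ... --sdk-auth`.
REQUIRED_KEYS = {"clientId", "clientSecret", "subscriptionId", "tenantId"}

def check_azure_credentials(raw: str) -> list:
    """Return a sorted list of required keys missing from the credentials JSON."""
    data = json.loads(raw)
    return sorted(REQUIRED_KEYS - data.keys())
```

An empty return value means the secret contains all the core fields; anything else names what is missing.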

Help and Bug Reporting

  1. Guidelines for how to report bugs.

Licensing

MIT License

You are responsible for the performance of, and any necessary testing or regulatory clearances for, any models generated.

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
