# InnerEye Inference API

## Introduction

InnerEye-Inference is an AppService web app, written in Python, that runs inference on medical imaging models trained with the InnerEye-DeepLearning toolkit.
You can also integrate it with DICOM using the InnerEye-EdgeGateway.
## Getting Started

### Installing Conda or Miniconda
Download a Conda or Miniconda installer for your platform and run it.
### Creating a Conda environment

Note that in order to create the Conda environment you will need to have build tools installed on your machine. If you are running Windows, they should already be installed with the Conda distribution.
You can install build tools on Ubuntu (and Debian-based distributions) by running:

```shell
sudo apt-get install build-essential
```
If you are running CentOS/RHEL distributions, you can install the build tools by running:

```shell
yum install gcc gcc-c++ kernel-devel make
```
Start the Conda prompt for your platform. In that prompt, navigate to your repository root and run:

```shell
conda env create --file environment.yml
conda activate inference
```
## Configuration

Create a script named `set_environment.sh` to set your environment variables; it can be sourced on Linux. The code will read this file if the environment variables are not present.
```shell
#!/bin/bash
export CUSTOMCONNSTR_AZUREML_SERVICE_PRINCIPAL_SECRET=
export CUSTOMCONNSTR_API_AUTH_SECRET=
export CLUSTER=
export WORKSPACE_NAME=
export EXPERIMENT_NAME=
export RESOURCE_GROUP=
export SUBSCRIPTION_ID=
export APPLICATION_ID=
export TENANT_ID=
```

Run it with `source set_environment.sh`.
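The fallback behavior described above (reading `set_environment.sh` when the variables are absent) can be sketched as follows. This is a minimal illustration, not the repository's actual implementation; the function name and parsing rules are assumptions.

```python
import os


def load_env_from_script(path: str = "set_environment.sh") -> None:
    """Parse `export NAME=value` lines from a shell script and fill in
    any variables that are not already set in the process environment."""
    try:
        with open(path) as f:
            lines = f.readlines()
    except FileNotFoundError:
        return  # no script present; rely on the environment alone
    for line in lines:
        line = line.strip()
        if not line.startswith("export "):
            continue  # skip the shebang, comments, and blank lines
        name, _, value = line[len("export "):].partition("=")
        # Variables already set in the environment take precedence.
        os.environ.setdefault(name.strip(), value.strip())
```

Variables that are already exported in the shell are left untouched, so sourcing the script and this fallback never conflict.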
### Running the Flask app locally

To test the app locally, run:

```shell
flask run
```
### Running the Flask app in Azure

- Install the Azure CLI:

  ```shell
  curl -sL https://aka.ms/InstallAzureCLIDeb | sudo bash
  ```

- Log in:

  ```shell
  az login --use-device-code
  ```

- Deploy:

  ```shell
  az webapp up --sku S1 --name test-python12345 --subscription <your_subscription_name> -g InnerEyeInference --location <your_region>
  ```
- In the Azure portal, go to Monitoring > Log Stream for debugging logs.
## Deployment build

If you would like to reproduce the automatic deployment of the service for testing purposes:

```shell
az ad sp create-for-rbac --name "<name>" --role contributor --scope /subscriptions/<subs>/resourceGroups/InnerEyeInference --sdk-auth
```
- The previous command will return a JSON object; its content is the value for the `secrets.AZURE_CREDENTIALS` variable used in `.github/workflows/deploy.yml`.
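For reference, the `--sdk-auth` output has roughly this shape (values redacted here; the exact field set can vary with the Azure CLI version):

```json
{
  "clientId": "<GUID>",
  "clientSecret": "<secret>",
  "subscriptionId": "<GUID>",
  "tenantId": "<GUID>",
  "activeDirectoryEndpointUrl": "https://login.microsoftonline.com",
  "resourceManagerEndpointUrl": "https://management.azure.com/"
}
```

Store the whole object, unmodified, as the `AZURE_CREDENTIALS` repository secret.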
## Help and Bug Reporting
## Licensing

You are responsible for the performance of, and any necessary testing or regulatory clearances for, any models generated.
## Contributing
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.