Initial checkin of API deliverables for review

Jennifer Marsman 2019-03-12 17:04:55 -04:00
Parent: 5f5cc1f4a4
Commit: 4ffa4dcaf3
2 changed files: 297 additions and 0 deletions

APIDeliverables.md (new file, 52 additions)

@@ -0,0 +1,52 @@
# AI for Earth - API Deliverables
Select AI for Earth grant recipients are being funded to provide AI for Earth APIs. If you are providing an API, here are the specific deliverables to submit:
+ [Container with machine learning model](#container)
+ [Jupyter notebook (for demo suite)](#notebook)
+ [Documentation](#doc)
+ [Assets for website](#assets)
The container should be uploaded to a container registry. All other material (or links to material) should be emailed to aiforearthcommunity@microsoft.com.
## <a name="container">Container with machine learning model</a>
The actual delivery of your API can be done via a Docker container.
+ Please follow the directions [here](./Quickstart.md) to create the Docker container.
+ In step 8, when you build your Docker image, please tag it using the format `<grantee_moniker>/<image_name>`.
+ Replace steps 10-11 with publishing to our AI for Earth container registry, using the following commands:
```
docker login --username <username> --password <password> ai4egrantee.azurecr.io
docker push ai4egrantee.azurecr.io/your_custom_image_name:tag
```
+ Please send an email to aiforearthcommunity@microsoft.com with the subject line “Container push request”, including in the body the email address of the person who will push the container (so we can grant that address permission to push to our container registry).
+ In terms of testing, please ensure that your code meets the defined [acceptance criteria](./AcceptanceCriteria.md).
**Alternate option:** Grantees can either provide a container that meets the full acceptance criteria, or relax the acceptance criteria and provide a code drop with a semi-functional container.
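Putting the tagging and push steps above together, here is a minimal sketch; the grantee moniker, image name, and version are hypothetical placeholders, and the `docker` commands are shown as comments so the tag construction can be checked on its own:

```shell
# Hypothetical placeholder values; substitute your own.
GRANTEE_MONIKER="contoso"
IMAGE_NAME="species_classifier"
VERSION="1.0"

# Local tag per step 8, and the fully qualified tag for our registry.
LOCAL_TAG="${GRANTEE_MONIKER}/${IMAGE_NAME}"
REMOTE_TAG="ai4egrantee.azurecr.io/${IMAGE_NAME}:${VERSION}"

# Build, retag for the AI for Earth registry, log in, and push:
#   docker build -t "${LOCAL_TAG}" .
#   docker tag "${LOCAL_TAG}" "${REMOTE_TAG}"
#   docker login --username <username> --password <password> ai4egrantee.azurecr.io
#   docker push "${REMOTE_TAG}"
echo "${REMOTE_TAG}"
```

Remember that the push will only succeed after we have granted your email address push permissions (see the “Container push request” step above).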
## <a name="notebook">Jupyter Notebook</a>
We are compiling a suite of demos to showcase the work of our AI for Earth grant recipients. These demos are intended for an audience of developers and data scientists, so they can see how to call your API and the kind of results that your machine learning model returns. Please include sample data for calling your API that can be shown publicly.
+ Please follow the directions [here](./JupyterNotebook.md) to create a Jupyter notebook that can be used to demonstrate your amazing work.
+ We have also provided a [template notebook](./Notebooks/template-demo.ipynb) that you can start from.
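As a sketch of what the core API call in such a notebook might look like, the helper below assembles the URL, query parameters, and headers for the landcover classify endpoint from our example spec; the subscription key is a placeholder, and actually sending the request (e.g. with `requests.post`) is left to the notebook:

```python
def build_classify_call(subscription_key, output_type="tiff"):
    """Assemble the pieces of a POST to /landcover/classify.

    Host, path, and headers follow the example landcover API spec in
    this repo; the subscription key is supplied by the caller.
    """
    base = "https://aiforearth.azure-api.net/v0.1"
    return {
        "url": base + "/landcover/classify",
        # "type" selects the output image format: "tiff" (default) or "jpeg".
        "params": {"type": output_type},
        "headers": {
            "Content-Type": "image/tiff",  # only image/tiff input is accepted
            "Ocp-Apim-Subscription-Key": subscription_key,
        },
    }
```

A notebook cell would then send the bytes of a 4-band TIFF as the request body, e.g. `requests.post(call["url"], params=call["params"], headers=call["headers"], data=tiff_bytes)`.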
## <a name="doc">Documentation</a>
Of course, every good API needs documentation to show its usage. Please include any assumptions that your code makes (for example, all input images must be square tiles) and an example of how to call the API.
+ Please provide documentation of your API using the [OpenAPI specification](https://swagger.io/specification/), as a .json file.
+ We recommend that you build and validate it using the [Swagger Editor](https://editor.swagger.io/). You can start with the example that they provide or with [our landcover mapping documentation](./Documentation/landcover_api_spec_swagger.0.1.json) as an example.
+ For an example of the final product rendered on the website, and of the kind of information that is useful to include, see https://aka.ms/aieapisdoc (click on the version numbers).
Additional resources that may be useful:
+ This is the process that we will follow to import your API: https://docs.microsoft.com/en-us/azure/api-management/import-and-publish#a-namecreate-api-aimport-and-publish-a-backend-api
+ This link documents the API import restrictions and known issues for OpenAPI/Swagger: https://docs.microsoft.com/en-us/azure/api-management/api-management-api-import-restrictions
+ Important information and tips related to OpenAPI import: https://blogs.msdn.microsoft.com/apimanagement/2018/04/11/important-changes-to-openapi-import-and-export/
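If you are starting from scratch rather than from one of the examples above, a minimal Swagger 2.0 skeleton that validates in the Swagger Editor looks like the following (the title, host, and path are hypothetical placeholders):

```
{
  "swagger": "2.0",
  "info": {
    "title": "Example Grantee API",
    "version": "v0.1"
  },
  "host": "example.azure-api.net",
  "basePath": "/v0.1",
  "schemes": ["https"],
  "paths": {
    "/classify": {
      "post": {
        "summary": "/classify",
        "responses": {
          "200": {"description": "Successful classification result."}
        }
      }
    }
  }
}
```

From there, flesh out each operation's parameters, responses, and produces/consumes lists as in the landcover example.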
## <a name="assets">Assets for website</a>
These assets could potentially be used on the AI for Earth website to highlight your API. For an example, see https://aka.ms/AI4EAPI.
Please provide the following:
+ Image (high-resolution; we will crop to the right size)
+ Three-line summary of API (300 characters maximum)
+ Link (the destination of the “Learn about X” button)

Documentation/landcover_api_spec_swagger.0.1.json (new file, 245 additions)

@@ -0,0 +1,245 @@
{
"swagger": "2.0",
"info": {
"title": "AI for Earth Landcover API",
"version": "v0.1",
"description": "This specification represents the core [AI for Earth](https://www.microsoft.com/en-us/aiforearth) API offering. An access key is required for access."
},
"host": "aiforearth.azure-api.net",
"basePath": "/v0.1",
"schemes": [
"https"
],
"securityDefinitions": {
"apiKeyHeader": {
"type": "apiKey",
"name": "Ocp-Apim-Subscription-Key",
"in": "header"
},
"apiKeyQuery": {
"type": "apiKey",
"name": "subscription-key",
"in": "query"
}
},
"security": [
{
"apiKeyHeader": []
},
{
"apiKeyQuery": []
}
],
"paths": {
"/landcover/classify": {
"post": {
"description": "This operation classifies the landcover for a given region based on the provided satellite image. The provided image must be a TIFF file with 4 bands representing the red, green, blue, and near-infrared values of each pixel.\n\nA successful classification will return an image file corresponding to the landcover of the provided image. The following labels are possible, with the corresponding colors:\n- <b>No Data</b> - black (0, 0, 0)\n- <b>Water</b> - blue (0, 0, 255)\n- <b>Trees</b> - dark green (0, 128, 0)\n- <b>Herbaceous</b> - light green (128, 255, 128)\n- <b>Barren/Impervious</b> - brown (128, 96, 96)",
"operationId": "5ab5905bb8d61f0e48853404",
"summary": "/landcover/classify",
"parameters": [
{
"name": "type",
"in": "query",
"description": "File type of the returned image. Supported values are:\n- tiff (default)\n- jpeg",
"type": "string",
"default": "tiff",
"enum": [
"tiff",
"jpeg"
]
},
{
"name": "Content-Type",
"in": "header",
"description": "Media type of the request body. Currently only image/tiff is supported.",
"required": true,
"type": "string",
"enum": [
"image/tiff"
]
}
],
"responses": {
"200": {
"description": "The response body will contain an image file with the land cover labels. The image will be colored according to the following labels:\n- <b>No Data</b> - black (0, 0, 0)\n- <b>Water</b> - blue (0, 0, 255)\n- <b>Trees</b> - dark green (0, 128, 0)\n- <b>Herbaceous</b> - light green (128, 255, 128)\n- <b>Barren/Impervious</b> - brown (128, 96, 96)\n\nThe size of the output image will be the same as the input minus a 64 pixel border on each side. For example, if the input image is 256 pixels by 256 pixels, the output image will be 128 pixels by 128 pixels.",
"examples": {
"image/jpeg": "[binary image data]",
"image/tiff": "[binary image data]"
}
},
"400": {
"description": "Possible Errors: \n<ul>\n<li><b>InvalidImageFormat</b>\n<br/>Input data is not a valid image.</li>\n<li><b>InvalidImageSize</b>\n<br/>Input image is too large or too small.</li>\n</ul>"
},
"415": {
"description": "Unsupported media type in the request body. Currently only image/tiff is supported."
}
},
"produces": [
"image/jpeg",
"image/tiff",
"application/json"
]
}
},
"/landcover/details": {
"post": {
"description": "This operation classifies the landcover for a given region based on the provided satellite image. The response will contain an image file with the classification, along with details about the breakdown of each label in the image.\n\nThe provided image must be a TIFF file with 4 bands representing the red, green, blue, and near-infrared values of each pixel.\n\nA successful classification will return an image file corresponding to the landcover of the provided image. The following labels are possible, with the corresponding colors:\n- <b>No Data</b> - black (0, 0, 0)\n- <b>Water</b> - blue (0, 0, 255)\n- <b>Trees</b> - dark green (0, 128, 0)\n- <b>Herbaceous</b> - light green (128, 255, 128)\n- <b>Barren/Impervious</b> - brown (128, 96, 96)\n\nThe label breakdown section contains each label that appears in the image, along with the percentage of pixels classified with that label.",
"operationId": "5ada78aab225207e719fa59b",
"summary": "/landcover/details",
"parameters": [
{
"name": "type",
"in": "query",
"description": "File type of the returned image. Supported values are:\n- tiff (default)\n- jpeg\n",
"type": "string",
"default": "tiff",
"enum": [
"tiff",
"jpeg"
]
},
{
"name": "Content-Type",
"in": "header",
"description": "Media type of the request body. Currently only image/tiff is supported.\n",
"required": true,
"type": "string",
"enum": [
"image/tiff"
]
}
],
"responses": {
"200": {
"description": "The response body will contain an image file with the land cover labels, along with a dictionary mapping each label found in the image to the percentage of the image predicted to contain that label. The possible labels are listed below.\n- <b>No Data</b> - black (0, 0, 0)\n- <b>Water</b> - blue (0, 0, 255)\n- <b>Trees</b> - dark green (0, 128, 0)\n- <b>Herbaceous</b> - light green (128, 255, 128)\n- <b>Barren/Impervious</b> - brown (128, 96, 96)\n\nThe size of the output image will be the same as the input minus a 64 pixel border on each side. For example, if the input image is 256 pixels by 256 pixels, the output image will be 128 pixels by 128 pixels.",
"examples": {
"application/json": {
"image_data": "/9j/4AAQSkZJRgABAQAAAQABAAD/2wBDAAgGBgcGBQgHBwcJCQgKDBQNDAsLDBkSEw8UHRofHh0aHBwgJC4nICIsIxwcKDcpLDAxNDQ0Hyc5PTgyPC4zNDL/2wBDAQkJCQwLDBgNDRgyIRwhMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjIyMjL/wAARCAEAAQADASIAAhEBAxEB/8QAHwAAAQUBAQEBAQEAAAAAAAAAAAECAwQFBgcICQoL/8QAtRAAAgEDAwIEAwUFBAQAAAF9AQIDAAQRBRIhMUEGE1FhByJxFDKBkaEII0KxwRVS0fAkM2JyggkKFhcYGRolJicoKSo0NTY3ODk6Q0RFRkdISUpTVFVWV1hZWmNkZWZnaGlqc3R1dnd4eXqDhIWGh4iJipKTlJWWl5iZmqKjpKWmp6ipqrKztLW2t7i5usLDxMXGx8jJytLT1NXW19jZ2uHi4+Tl5ufo6erx8vP09fb3+Pn6/8QAHwEAAwEBAQEBAQEBAQAAAAAAAAECAwQFBgcICQoL/8QAtREAAgECBAQDBAcFBAQAAQJ3AAECAxEEBSExBhJBUQdhcRMiMoEIFEKRobHBCSMzUvAVYnLRChYkNOEl8RcYGRomJygpKjU2Nzg5OkNERUZHSElKU1RVVldYWVpjZGVmZ2hpanN0dXZ3eHl6goOEhYaHiImKkpOUlZaXmJmaoqOkpaanqKmqsrO0tba3uLm6wsPExcbHyMnK0tPU1dbX2Nna4uPk5ebn6Onq8vP09fb3+Pn6/9oADAMBAAIRAxEAPwDiqKKK+aPkAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKK
KACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKACiiigD//Z",
"label_breakdown": {
"No Data": 0,
"Trees": 1,
"Water": 0
}
}
}
},
"400": {
"description": "Possible Errors: \r\n<ul>\r\n<li><b>EmptyImage</b>\r\n<br/>An image was not supplied.\r\n</li>\r\n<li><b>InvalidImageFormat</b>\r\n<br/>Input data is not a valid image.</li>\r\n<li><b>InvalidImageSize</b>\r\n<br/>Input image is too large or too small.</li>\r\n</ul>"
},
"415": {
"description": "Unsupported media type in the request body. Currently only image/tiff is supported."
}
},
"produces": [
"application/json"
]
}
}
},
"definitions": {
"GeoTile": {
"type": "object",
"required": [
"lon",
"lat"
],
"properties": {
"lon": {
"type": "number"
},
"lat": {
"type": "number"
}
}
},
"GeoImage": {
"type": "object",
"required": [
"lon",
"lat",
"location",
"model",
"classification",
"options"
],
"properties": {
"lon": {
"type": "number"
},
"lat": {
"type": "number"
},
"location": {
"type": "string"
},
"model": {
"type": "string"
},
"classification": {
"type": "string"
},
"options": {
"type": "array",
"items": {
"type": "string"
}
}
}
},
"CacheRequest": {
"type": "object",
"required": [
"lon",
"lat",
"radius",
"tiles"
],
"properties": {
"lon": {
"type": "number"
},
"lat": {
"type": "number"
},
"radius": {
"type": "number"
},
"model": {
"type": "string"
},
"options": {
"type": "array",
"items": {
"type": "string"
}
},
"tiles": {
"type": "array",
"items": {
"$ref": "#/definitions/GeoTile"
}
}
}
},
"Cache": {
"type": "object",
"properties": {
"available": {
"type": "array",
"items": {
"$ref": "#/definitions/GeoImage"
}
},
"missing": {
"type": "array",
"items": {
"$ref": "#/definitions/GeoImage"
}
}
}
},
"Body": {
"example": "[Binary image data]"
}
}
}