
Standalone scripts to support pipeline template (#55)

* Standalone scripts & instructions to run pipeline template in Synapse gallery

Co-authored-by: karthick-rn <kanarend@microsoft.com>
Co-authored-by: Tai Yee <taiyee@microsoft.com>
Karthick Narendran 2022-06-27 13:30:19 +01:00 committed by GitHub
Parent 4eb6984687
Commit 20bce07164
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
13 changed files with 203 additions and 1 deletion

.gitignore vendored

@@ -129,4 +129,7 @@ dmypy.json
.pyre/
# Packaged zip
*.zip
*.zip
# Mac
.DS_Store

deploy/gallery/create_service.sh Executable file

@@ -0,0 +1,61 @@
#! /usr/bin/env bash
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT license.
# This script creates the required linked service & the spark
# job definition on the Synapse workspace and should be run
# prior to executing the pipeline template from Synapse gallery.
set -e
ENV_CODE=${1:-${ENV_CODE}}
if [[ -z ${ENV_CODE} ]]
then
echo "Missing Environment Code"
exit 1
fi
base_dir="$(cd "$(dirname "$0")/../../" && pwd)"
# Uncompress the custom vision model v2 package
if ! unzip "${base_dir}"/custom-vision-model-v2.zip -d "${base_dir}"/custom-vision-model-v2-"${ENV_CODE}";
then
echo "Unzip failed please check .."
exit 1
else
echo "Unzip completed successfully"
fi
pkg_dir=${base_dir}/custom-vision-model-v2-${ENV_CODE}
# The following services are created on the Synapse workspace to run
# the pipeline template from the Synapse gallery.
services=("linked-service" "spark-job-definition")
for item in "${services[@]}"
do
if [[ $item == "linked-service" ]]
then
folder_name="linkedService"
else
folder_name="sparkJobDefinition"
fi
for file in "${pkg_dir}/${folder_name}"/*.json
do
full_name="${file}"
file_name=$(basename -s .json "${file}")
# Edit the json file format to work with Synapse CLI commands
if [[ $folder_name == "sparkJobDefinition" ]]
then
contents="$(jq '.properties' "${full_name}")" && echo -E "${contents}" > "${full_name}"
sleep 5
fi
# Create the linked services and spark job definition on Synapse workspace
echo "Creating ${item} please wait.."
az synapse "${item}" create --workspace-name "${ENV_CODE}-pipeline-syn-ws" \
--name "${file_name}" --file @"${full_name}"
done
done
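For reference, a minimal invocation of the script above; `myenv` is a placeholder environment code, and the run assumes the Azure CLI is installed and logged in:

```bash
# Hypothetical invocation; "myenv" must match the environmentCode
# used when the infrastructure was deployed
az login
./deploy/gallery/create_service.sh myenv

# equivalently, using the ENV_CODE fallback built into the script
ENV_CODE=myenv ./deploy/gallery/create_service.sh
```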

Binary data (not displayed):
deploy/gallery/images/1.png Normal file (170 KiB)
deploy/gallery/images/2.png Normal file (50 KiB)
deploy/gallery/images/3.png Normal file (346 KiB)
deploy/gallery/images/4.png Normal file (293 KiB)
deploy/gallery/images/5.png Normal file (291 KiB)
deploy/gallery/images/6.png Normal file (295 KiB)
deploy/gallery/images/7.png Normal file (294 KiB)
deploy/gallery/images/8.png Normal file (215 KiB)
deploy/gallery/images/9.png Normal file (200 KiB)

deploy/gallery/README.md Normal file

@@ -0,0 +1,76 @@
If you've landed on this page from the Synapse gallery, congratulations -- you are in the right place.
The instructions on this page will guide you through configuring the pipeline template to run a custom vision model for object detection (for example, detecting swimming pools). You can also use the pipeline template to explore other object detection use cases. Currently, the template supports only custom vision model v2.
**Prerequisites**
* Follow the steps in the [readme](https://github.com/Azure/Azure-Orbital-Analytics-Samples/blob/main/deploy/README.md) to deploy the infrastructure. This includes creating the custom vision model v2 package.
* Run the following command to create the linked services and spark job definition on the Synapse workspace:
```bash
./deploy/gallery/create_service.sh <environmentCode>
```
**NOTE**: **environmentCode** should be the same as the one used in the deployment steps.
* Run the following command to copy the sample GeoTiff image and the required configuration files into the storage account for detecting swimming pools with the Object Detection CV model (a quick verification sketch follows this list):
```bash
./deploy/scripts/copy_geotiff.sh <environmentCode>
```
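Assuming the defaults above, you can verify that the three files landed in the container before switching back to the gallery; the container name and account lookup mirror what `copy_geotiff.sh` does, and `myenv` is a placeholder:

```bash
ENV_CODE=myenv   # your environmentCode
# assumes az login credentials can read the account, or export AZURE_STORAGE_KEY first
account=$(az storage account list --resource-group "${ENV_CODE}-data-rg" --query [0].name -o tsv)
az storage blob list --account-name "$account" \
    --container-name "${ENV_CODE}-test-container" --query "[].name" -o tsv
# expected: config/config.json, config/custom_vision_object_detection.json, raw/sample_4326.tif
```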
**Switch back to Synapse gallery**
1. The input page appears:
![](./images/1.png)
2. Select the values from the dropdown as shown below and click **Open pipeline**:
![](./images/2.png)
3. On the right side, there is a list of mandatory fields. Click each of them and select the respective names from the dropdown as shown below. This additional step is only an interim measure; we are working with the Synapse product group on a long-term fix.
- Transforms
![](./images/3.png)
- Pool Geolocation
![](./images/4.png)
- Copy Tiles
![](./images/5.png)
- Copy Config file
![](./images/6.png)
- Copy Xml from Convert Transform
![](./images/7.png)
4. When all the required fields are provided as shown on the right side of the image below, go ahead and publish.
![](./images/8.png)
5. When published successfully, you are just one step away from running the pipeline. Enable the integration runtime by switching on the **Data flow debug** toggle. While it warms up, enter the values for the parameters (a CLI lookup sketch follows this list).
![](./images/9.png)
|No |Parameter | Value | Comments |
|--| ---- | --- | ------- |
| 1|Prefix| \<environmentCode>-test-container | |
| 2|StorageAccountName| rawdata<6-character-random-string> | Get the storage account name from \<environmentCode>-data-rg |
| 3|AOI | -117.063550 32.749467 -116.999386 32.812946 | Sample bounding box |
| 4|BatchAccountName | | Get the batch account name from \<environmentCode>-orc-rg |
| 5|BatchJobName | \<environmentCode>-data-cpu-pool | Get the job name from the batch account|
| 6|BatchLocation | | Get the region from the batch account. Usually the deployment region|
| 7|SparkPoolName | pool<6-character-random-string>| Get the spark pool name from \<environmentCode>-pipeline-rg |
| 8|EnvCode | \<environmentCode> | Same as used in the deployment steps|
6. All set to start the pipeline now. Press **Debug**.
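Most of the parameter values in step 5 can be pulled with the Azure CLI instead of hunting through the portal; the resource-group suffixes follow the table's comments, and `myenv` is a placeholder:

```bash
ENV_CODE=myenv   # your environmentCode
# StorageAccountName
az storage account list --resource-group "${ENV_CODE}-data-rg" --query [0].name -o tsv
# BatchAccountName and BatchLocation
az batch account list --resource-group "${ENV_CODE}-orc-rg" --query "[0].[name,location]" -o tsv
# SparkPoolName
az synapse spark pool list --workspace-name "${ENV_CODE}-pipeline-syn-ws" \
    --resource-group "${ENV_CODE}-pipeline-rg" --query [0].name -o tsv
```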

deploy/scripts/copy_geotiff.sh Executable file

@@ -0,0 +1,62 @@
#!/usr/bin/env bash
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT license.
# This script automates the following activities & should be run prior to
# executing the pipeline template from Synapse gallery.
# * Create storage container under "<envcode>-data-rg" resource group
# * Copy the sample TIF into the newly created storage container
# * Copy the configurations required for Object Detection
set -e
ENV_CODE=${1:-${ENV_CODE}}
if [[ -z "$ENV_CODE" ]]
then
echo "Missing Environment Code"
exit 1
fi
# Default name for the container
container_name="${ENV_CODE}-test-container"
# Retrieve the respective values for the environment variables
AZURE_STORAGE_ACCOUNT=$(az storage account list --resource-group "${ENV_CODE}-data-rg" --query [0].name -o tsv)
AZURE_STORAGE_KEY=$(az storage account keys list --account-name "$AZURE_STORAGE_ACCOUNT" --query [0].value -o tsv)
AZURE_STORAGE_AUTH_MODE=key
export AZURE_STORAGE_ACCOUNT
export AZURE_STORAGE_KEY
export AZURE_STORAGE_AUTH_MODE
# Declare an associative array mapping destination blob paths to source URLs
declare -A array
key1='sample_4326.tif'
value1='https://aoigeospatial.blob.core.windows.net/public/samples/sample_4326.tif'
key2='custom_vision_object_detection.json'
value2='https://raw.githubusercontent.com/Azure/Azure-Orbital-Analytics-Samples/main/src/aimodels/custom_vision_object_detection_offline/specs/custom_vision_object_detection.json'
key3='config.json'
value3='https://raw.githubusercontent.com/Azure/Azure-Orbital-Analytics-Samples/main/src/aimodels/custom_vision_object_detection_offline/config/config.json'
array["raw/$key1"]="$value1"
array["config/$key2"]="$value2"
array["config/$key3"]="$value3"
echo "Creating storage container $container_name.."
if ! az storage container create --name "$container_name" --fail-on-exist;
then
echo "Failed during storage container creation"
exit 1
else
echo "Container $container_name created successfully"
fi
for key in "${!array[@]}"; do
echo "Copying $key from source"
az storage blob copy start --destination-blob "$key" --destination-container "$container_name" --source-uri "${array[$key]}"
done
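Note that `az storage blob copy start` returns as soon as the copies are scheduled; the sample files are small, so they normally finish within seconds, but you can confirm before running the pipeline. A sketch reusing the variables exported by the script:

```bash
# Poll the copy status of one scheduled blob (run in the same shell,
# so the AZURE_STORAGE_* variables exported above are still set)
az storage blob show --container-name "$container_name" \
    --name "raw/sample_4326.tif" \
    --query properties.copy.status -o tsv   # prints "success" when done
```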