
Moves the files into relevant directories in deploy directory (#94)

This commit is contained in:
Sushil Kumar 2022-10-25 12:42:02 -07:00 committed by GitHub
Parent 6bf53ecbdc
Commit 75cd960c8e
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
13 changed files: 30 additions and 26 deletions

View file

@@ -76,7 +76,7 @@ jobs:
# Setup the infrastructure for testing the pipeline
DEPLOY_PGSQL=${USE_POSTGRE_SQL} \
PRE_PROVISIONED_AI_MODEL_INFRA_NAME=${CI_BATCH_ACCOUNT_NAME} \
-$(System.DefaultWorkingDirectory)/deploy/setup.sh ${ENV_CODE} ${AZURE_REGION} ${ENV_CODE}-ci-environment
+$(System.DefaultWorkingDirectory)/deploy/scripts/setup.sh ${ENV_CODE} ${AZURE_REGION} ${ENV_CODE}-ci-environment
# Grant access on Synapse workspace to resource owner azure account
SYNAPSE_WORKSPACE_NAME=$(az synapse workspace list --query "[?tags.workspaceId && tags.workspaceId == 'default'].name" -o tsv -g "${ENV_CODE}-pipeline-rg")
@@ -106,4 +106,4 @@ jobs:
inlineScript: |
set -e
# Cleanup the resources
-$(System.DefaultWorkingDirectory)/deploy/cleanup.sh ${ENV_CODE} ${CI_BATCH_ACCOUNT_NAME}
+$(System.DefaultWorkingDirectory)/deploy/scripts/cleanup.sh ${ENV_CODE} ${CI_BATCH_ACCOUNT_NAME}
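
These two hunks point the CI job at the relocated scripts. For a rough local reproduction of the same flow, the calls look like this (a minimal sketch: the environment code, region and batch-account name below are placeholders, not values from the CI configuration):

```bash
# Placeholders standing in for the CI variables.
ENV_CODE=aoi
AZURE_REGION=eastus
CI_BATCH_ACCOUNT_NAME=mybatchaccount   # hypothetical pre-provisioned batch account

# Set up the test infrastructure, as the pipeline job does.
DEPLOY_PGSQL=true \
PRE_PROVISIONED_AI_MODEL_INFRA_NAME=${CI_BATCH_ACCOUNT_NAME} \
./deploy/scripts/setup.sh ${ENV_CODE} ${AZURE_REGION} ${ENV_CODE}-ci-environment

# Tear everything down again once testing is finished.
./deploy/scripts/cleanup.sh ${ENV_CODE} ${CI_BATCH_ACCOUNT_NAME}
```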

View file

@@ -45,20 +45,20 @@ az account set -s <subscription_id>
The following command will install, configure and generate the custom vision model package.
```bash
-./deploy/setup.sh <environmentCode> <location>
+./deploy/scripts/setup.sh <environmentCode> <location>
```
- `environmentCode` is used as a prefix for Azure resource names. It allows only alphanumeric characters (no special characters) and must be between 3 and 8 characters long.
- `location` is a valid Azure region.
For example:
```bash
-./deploy/setup.sh aoi eastus
+./deploy/scripts/setup.sh aoi eastus
```
-[setup.sh](./setup.sh) executes tasks in 3 steps
-- installs the infrastructure using the [install.sh](./install.sh) script.
-- configures the infrastructure for setting up the dependencies using the [configure.sh](./configure.sh) script.
-- packages the pipeline code to a zip file using the [package.sh](./package.sh) script.
+[setup.sh](./scripts/setup.sh) executes tasks in 3 steps
+- installs the infrastructure using the [install.sh](./scripts/install.sh) script.
+- configures the infrastructure for setting up the dependencies using the [configure.sh](./scripts/configure.sh) script.
+- packages the pipeline code to a zip file using the [package.sh](./scripts/package.sh) script.
After the script has run successfully, please check that the batch-account pool has been created successfully.
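
The three-step flow the list above describes can be sketched as follows; this is illustrative only and is not the actual contents of setup.sh (argument handling in the real script is richer than shown):

```bash
#!/bin/bash
# Illustrative sketch of the orchestration described above, not the real setup.sh.
ENV_CODE=$1    # e.g. aoi
LOCATION=$2    # e.g. eastus

./deploy/scripts/install.sh "$ENV_CODE" "$LOCATION"   # 1. install the infrastructure
./deploy/scripts/configure.sh "$ENV_CODE"             # 2. configure the dependencies
./deploy/scripts/package.sh "$ENV_CODE"               # 3. package the pipeline code into a zip
```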
@@ -91,12 +91,12 @@ As discussed above the `environmentCode` is used as prefix to generate resource
Execute the cleanup script as follows:
```bash
-./deploy/cleanup.sh <environmentCode>
+./deploy/scripts/cleanup.sh <environmentCode>
```
For example:
```bash
-./deploy/cleanup.sh aoi
+./deploy/scripts/cleanup.sh aoi
```
If you do not want to delete a specific resource group (and its resources), set the corresponding NO_DELETE_*_RESOURCE_GROUP environment variable to true:
@@ -107,9 +107,9 @@ NO_DELETE_MONITORING_RESOURCE_GROUP=true
NO_DELETE_NETWORKING_RESOURCE_GROUP=true
NO_DELETE_ORCHESTRATION_RESOURCE_GROUP=true
NO_DELETE_PIPELINE_RESOURCE_GROUP=true
-./deploy/cleanup.sh <environmentCode>
+./deploy/scripts/cleanup.sh <environmentCode>
```
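
A guard of this shape is how NO_DELETE_* switches are typically honoured; this is a hedged sketch, not the actual cleanup.sh logic, and the resource-group naming follows the `<environmentCode>-pipeline-rg` pattern seen in the CI hunk above:

```bash
# Sketch: skip deletion of the pipeline resource group when it is protected.
ENV_CODE=$1
if [[ "${NO_DELETE_PIPELINE_RESOURCE_GROUP}" != "true" ]]; then
    az group delete --name "${ENV_CODE}-pipeline-rg" --yes --no-wait
fi
```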
## AI Model
-Follow the [document](./using-ai-model-in-pipeline.md) to understand and use [Custom Vision Model](/src/aimodels) or bring different ai-model for processing with the pipeline.
+Follow the [document](./docs/using-ai-model-in-pipeline.md) to understand and use [Custom Vision Model](/src/aimodels) or bring different ai-model for processing with the pipeline.

View file

@@ -41,7 +41,7 @@ param dbPassword string
param dbPort int = 5432
-var commandString = 'apt update -y && apt install curl -y && curl -o root.crt https://cacerts.digicert.com/BaltimoreCyberTrustRoot.crt.pem && curl -o CVModelSQLScript.sql https://raw.githubusercontent.com/Azure/Azure-Orbital-Analytics-Samples/main/deploy/CVModelSQLScript.sql && psql --set=sslmode=require --set=sslrootcert=root.crt -h ${server} -p ${dbPort} -U ${username} -W -d ${db} -f CVModelSQLScript.sql'
+var commandString = 'apt update -y && apt install curl -y && curl -o root.crt https://cacerts.digicert.com/BaltimoreCyberTrustRoot.crt.pem && curl -o CVModelSQLScript.sql https://raw.githubusercontent.com/Azure/Azure-Orbital-Analytics-Samples/main/deploy/scripts/CVModelSQLScript.sql && psql --set=sslmode=require --set=sslrootcert=root.crt -h ${server} -p ${dbPort} -U ${username} -W -d ${db} -f CVModelSQLScript.sql'
resource containerGroup 'Microsoft.ContainerInstance/containerGroups@2021-09-01' = {
name: name
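
The updated commandString fetches CVModelSQLScript.sql from its new deploy/scripts location and applies it with psql from inside the container group. The same step can be reproduced by hand roughly as follows (server, user and database names are placeholders):

```bash
# Sketch: apply the relocated SQL script to the PostgreSQL server manually.
curl -o root.crt https://cacerts.digicert.com/BaltimoreCyberTrustRoot.crt.pem
curl -o CVModelSQLScript.sql \
  https://raw.githubusercontent.com/Azure/Azure-Orbital-Analytics-Samples/main/deploy/scripts/CVModelSQLScript.sql

# <server>, <user> and <db> are placeholders for your own PostgreSQL details.
psql --set=sslmode=require --set=sslrootcert=root.crt \
  -h <server> -p 5432 -U <user> -W -d <db> -f CVModelSQLScript.sql
```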

View file

View file

View file

@@ -3,7 +3,7 @@
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT license.
-PRJ_ROOT="$(cd `dirname "${BASH_SOURCE}"`/..; pwd)"
+PRJ_ROOT="$(cd `dirname "${BASH_SOURCE}"`/../..; pwd)"
ENV_CODE=${1:-${ENV_CODE}}
AI_MODEL_INFRA_TYPE=${2:-${AI_MODEL_INFRA_TYPE:-"batch-account"}} # Currently supported values are aks and batch-account
@@ -36,7 +36,7 @@ if [[ -z "$SYNAPSE_POOL" ]]; then
fi
if [[ -n $SYNAPSE_WORKSPACE ]] && [[ -n $SYNAPSE_WORKSPACE_RG ]] && [[ -n $SYNAPSE_POOL ]]; then
# upload synapse pool
-az synapse spark pool update --name ${SYNAPSE_POOL} --workspace-name ${SYNAPSE_WORKSPACE} --resource-group ${SYNAPSE_WORKSPACE_RG} --library-requirements "${PRJ_ROOT}/deploy/environment.yml"
+az synapse spark pool update --name ${SYNAPSE_POOL} --workspace-name ${SYNAPSE_WORKSPACE} --resource-group ${SYNAPSE_WORKSPACE_RG} --library-requirements "${PRJ_ROOT}/deploy/scripts/environment.yml"
fi
if [[ -z "$SYNAPSE_STORAGE_ACCOUNT" ]]; then

View file

View file

@@ -3,6 +3,8 @@
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT license.
+PRJ_ROOT="$(cd `dirname "${BASH_SOURCE}"`/../..; pwd)"
set -ex
if [[ -z "$1" ]]
@@ -32,7 +34,7 @@ if [[ "$AI_MODEL_INFRA_TYPE" != "batch-account" ]] && [[ "$AI_MODEL_INFRA_TYPE"
fi
DEPLOYMENT_SCRIPT="az deployment sub create -l $LOCATION -n $DEPLOYMENT_NAME \
--f ./deploy/infra/main.bicep \
+-f ${PRJ_ROOT}/deploy/infra/main.bicep \
-p \
location=$LOCATION \
environmentCode=$ENV_CODE \
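
Anchoring the template path on PRJ_ROOT makes the deployment independent of the caller's working directory now that install.sh sits in deploy/scripts/. The resolution works like this (a short illustration, not new script logic):

```bash
# From deploy/scripts/install.sh, two directory levels up is the repository root,
# so the bicep template resolves to <repo>/deploy/infra/main.bicep from any cwd.
PRJ_ROOT="$(cd `dirname "${BASH_SOURCE}"`/../..; pwd)"
echo "Deploying template: ${PRJ_ROOT}/deploy/infra/main.bicep"
```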

View file

@@ -60,7 +60,7 @@ def replace(tokens_map: dict, body: str):
def package(pipeline_name: str, tokens_map: dict, modes='batch-account'):
script_dirname = os.path.dirname(__file__)
-src_folder_path = os.path.join(script_dirname, '..', 'src', 'workflow', pipeline_name)
+src_folder_path = os.path.join(script_dirname, '../..', 'src', 'workflow', pipeline_name)
package_folder_path= os.path.join(os.getcwd(), pipeline_name)
# mode

View file

@@ -3,7 +3,7 @@
# Copyright (c) Microsoft Corporation.
# Licensed under the MIT license.
-PRJ_ROOT="$(cd `dirname "${BASH_SOURCE}"`/..; pwd)"
+PRJ_ROOT="$(cd `dirname "${BASH_SOURCE}"`/../..; pwd)"
ENV_CODE=${1:-$ENV_CODE}
PIPELINE_NAME=${2:-${PIPELINE_NAME:-"custom-vision-model"}}
@@ -86,7 +86,7 @@ if [[ "$AI_MODEL_INFRA_TYPE" == "batch-account" ]]; then
fi
echo 'Retrieved resource from Azure and ready to package'
-PACKAGING_SCRIPT="python3 ${PRJ_ROOT}/deploy/package.py \
+PACKAGING_SCRIPT="python3 ${PRJ_ROOT}/deploy/scripts/package.py \
--raw_storage_account_name $RAW_STORAGE_ACCOUNT_NAME \
--synapse_storage_account_name $SYNAPSE_STORAGE_ACCOUNT_NAME \
--modes $MODE \
@@ -132,7 +132,7 @@ elif [[ "$AI_MODEL_INFRA_TYPE" == "aks" ]]; then
counter=$((counter+1))
done
BASE64ENCODEDZIPCONTENT_FUNCTIONAPP_URL="https://${BASE64ENCODEDZIPCONTENT_FUNCTIONAPP_HOST}"
-PACKAGING_SCRIPT="python3 ${PRJ_ROOT}/deploy/package.py \
+PACKAGING_SCRIPT="python3 ${PRJ_ROOT}/deploy/scripts/package.py \
--raw_storage_account_name $RAW_STORAGE_ACCOUNT_NAME \
--synapse_storage_account_name $SYNAPSE_STORAGE_ACCOUNT_NAME \
--modes $MODE \
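
Only the arguments visible in this hunk are certain; with those, the relocated packaging script can also be invoked directly, for example (storage-account names and mode are placeholder values):

```bash
# Sketch: call the relocated package.py directly with the arguments shown above.
python3 ./deploy/scripts/package.py \
    --raw_storage_account_name rawstorageacct \
    --synapse_storage_account_name synapsestorageacct \
    --modes batch-account
```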

View file

@@ -11,6 +11,8 @@ AI_MODEL_INFRA_TYPE=${5:-${AI_MODEL_INFRA_TYPE:-"batch-account"}} # Currently su
PRE_PROVISIONED_AI_MODEL_INFRA_NAME=${6:-$PRE_PROVISIONED_AI_MODEL_INFRA_NAME}
DEPLOY_PGSQL=${7:-${DEPLOY_PGSQL:-"true"}}
+PRJ_ROOT="$(cd `dirname "${BASH_SOURCE}"`/../..; pwd)"
set -ex
if [[ -z "$ENV_CODE" ]]
@@ -43,24 +45,24 @@ if [[ -z "$ENV_TAG" ]]
DEPLOY_PGSQL=${DEPLOY_PGSQL} \
DEPLOY_AI_MODEL_INFRA=${DEPLOY_AI_MODEL_INFRA} \
AI_MODEL_INFRA_TYPE=${AI_MODEL_INFRA_TYPE} \
-./deploy/install.sh "$ENV_CODE" "$LOCATION"
+${PRJ_ROOT}/deploy/scripts/install.sh "$ENV_CODE" "$LOCATION"
else
DEPLOY_PGSQL=${DEPLOY_PGSQL} \
DEPLOY_AI_MODEL_INFRA=${DEPLOY_AI_MODEL_INFRA} \
AI_MODEL_INFRA_TYPE=${AI_MODEL_INFRA_TYPE} \
-./deploy/install.sh "$ENV_CODE" "$LOCATION" "$ENV_TAG"
+${PRJ_ROOT}/deploy/scripts/install.sh "$ENV_CODE" "$LOCATION" "$ENV_TAG"
fi
if [[ "${DEPLOY_AI_MODEL_INFRA}" == "false" ]] && [[ "${AI_MODEL_INFRA_TYPE}" == "batch-account" ]]; then
echo "Setting up the batch account!!!"
-./test/use-pre-provisioned-batch-account.sh \
+${PRJ_ROOT}/test/use-pre-provisioned-batch-account.sh \
"$ENV_CODE" \
"$PRE_PROVISIONED_AI_MODEL_INFRA_NAME" \
"$PIPELINE_NAME"
fi
echo "Performing configuration"
-./deploy/configure.sh \
+${PRJ_ROOT}/deploy/scripts/configure.sh \
"$ENV_CODE" \
"$AI_MODEL_INFRA_TYPE" \
"$PRE_PROVISIONED_AI_MODEL_INFRA_NAME"
@@ -71,7 +73,7 @@ if [[ -z "$PIPELINE_NAME" ]]
else
echo "Performing pipeline packaging"
DEPLOY_PGSQL=${DEPLOY_PGSQL} \
-./deploy/package.sh \
+${PRJ_ROOT}/deploy/scripts/package.sh \
"$ENV_CODE" \
"$PIPELINE_NAME" \
"$AI_MODEL_INFRA_TYPE" \

View file

@@ -23,7 +23,7 @@ No | Folder | Purpose
8 | sparkJobDefinition | Contains JSON definition files to create Spark Job definitions in Synapse Workspace
9 | publish_config.json | Contains configuration details such as the branch in your repository to use as publish branch
-**Note:** Individual files and folders in the [Workflow](../src/workflow) parent folder are not intended to be imported into your workspace individually. You will need to run the `package.sh` in the deploy folder to create the zip file and then use the contents of the zip files to check-in your code to a repository.
+**Note:** Individual files and folders in the [Workflow](../src/workflow) parent folder are not intended to be imported into your workspace individually. You will need to run [package.sh](../deploy/scripts/package.sh) to create the zip file and then use the contents of the zip file to check in your code to a repository.
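
A hedged usage sketch of the packaging step the note refers to (the pipeline name is the script's documented default; the name and location of the generated archive are not specified in this diff):

```bash
# Sketch: build the workflow package and inspect what was produced.
./deploy/scripts/package.sh <environmentCode> custom-vision-model
ls -l *.zip   # the archive name/location comes from package.py and may differ
```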
## Main Pipeline