removing torchserve-huggingface-textgen workflow and its references (#3382)

Jayesh Tanna 2024-09-12 07:27:03 +05:30 committed by GitHub
Parent 7ae3bd0cf0
Commit 0833654898
No key found matching this signature
GPG key ID: B5690EEEBB952194
3 changed files with 1 addition and 70 deletions

View file

@@ -1,67 +0,0 @@
# This code is autogenerated.
# Code is generated by running custom script: python3 readme.py
# Any manual changes to this file may cause incorrect behavior.
# Any manual changes will be overwritten if the code is regenerated.
name: cli-scripts-deploy-custom-container-torchserve-huggingface-textgen
on:
workflow_dispatch:
schedule:
- cron: "54 0/12 * * *"
pull_request:
branches:
- main
paths:
- cli/deploy-custom-container-torchserve-huggingface-textgen.sh
- infra/bootstrapping/**
- .github/workflows/cli-scripts-deploy-custom-container-torchserve-huggingface-textgen.yml
- cli/setup.sh
permissions:
id-token: write
concurrency:
group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}
cancel-in-progress: true
jobs:
build:
runs-on: ubuntu-latest
steps:
- name: check out repo
uses: actions/checkout@v2
- name: azure login
uses: azure/login@v1
with:
client-id: ${{ secrets.OIDC_AZURE_CLIENT_ID }}
tenant-id: ${{ secrets.OIDC_AZURE_TENANT_ID }}
subscription-id: ${{ secrets.OIDC_AZURE_SUBSCRIPTION_ID }}
- name: bootstrap resources
run: |
bash bootstrap.sh
working-directory: infra/bootstrapping
continue-on-error: false
- name: setup-cli
run: |
source "${{ github.workspace }}/infra/bootstrapping/sdk_helpers.sh";
source "${{ github.workspace }}/infra/bootstrapping/init_environment.sh";
bash setup.sh
working-directory: cli
continue-on-error: true
- name: Eagerly cache access tokens for required scopes
run: |
# Workaround for azure-cli's lack of support for ID token refresh
# Taken from: https://github.com/Azure/login/issues/372#issuecomment-2056289617
# Management
az account get-access-token --scope https://management.azure.com/.default --output none
# ML
az account get-access-token --scope https://ml.azure.com/.default --output none
- name: validate readme
run: |
python check-readme.py "${{ github.workspace }}/cli/"
working-directory: infra/bootstrapping
continue-on-error: false
- name: test script script
run: |
source "${{ github.workspace }}/infra/bootstrapping/sdk_helpers.sh";
source "${{ github.workspace }}/infra/bootstrapping/init_environment.sh";
set -e; bash -x deploy-custom-container-torchserve-huggingface-textgen.sh
working-directory: cli

View file

@@ -45,7 +45,6 @@ path|status|
[deploy-custom-container-tfserving-half-plus-two-integrated.sh](deploy-custom-container-tfserving-half-plus-two-integrated.sh)|[![deploy-custom-container-tfserving-half-plus-two-integrated](https://github.com/Azure/azureml-examples/workflows/cli-scripts-deploy-custom-container-tfserving-half-plus-two-integrated/badge.svg?branch=main)](https://github.com/Azure/azureml-examples/actions/workflows/cli-scripts-deploy-custom-container-tfserving-half-plus-two-integrated.yml)
[deploy-custom-container-tfserving-half-plus-two.sh](deploy-custom-container-tfserving-half-plus-two.sh)|[![deploy-custom-container-tfserving-half-plus-two](https://github.com/Azure/azureml-examples/workflows/cli-scripts-deploy-custom-container-tfserving-half-plus-two/badge.svg?branch=main)](https://github.com/Azure/azureml-examples/actions/workflows/cli-scripts-deploy-custom-container-tfserving-half-plus-two.yml)
[deploy-custom-container-torchserve-densenet.sh](deploy-custom-container-torchserve-densenet.sh)|[![deploy-custom-container-torchserve-densenet](https://github.com/Azure/azureml-examples/workflows/cli-scripts-deploy-custom-container-torchserve-densenet/badge.svg?branch=main)](https://github.com/Azure/azureml-examples/actions/workflows/cli-scripts-deploy-custom-container-torchserve-densenet.yml)
-[deploy-custom-container-torchserve-huggingface-textgen.sh](deploy-custom-container-torchserve-huggingface-textgen.sh)|[![deploy-custom-container-torchserve-huggingface-textgen](https://github.com/Azure/azureml-examples/workflows/cli-scripts-deploy-custom-container-torchserve-huggingface-textgen/badge.svg?branch=main)](https://github.com/Azure/azureml-examples/actions/workflows/cli-scripts-deploy-custom-container-torchserve-huggingface-textgen.yml)
[deploy-custom-container-triton-single-model.sh](deploy-custom-container-triton-single-model.sh)|[![deploy-custom-container-triton-single-model](https://github.com/Azure/azureml-examples/workflows/cli-scripts-deploy-custom-container-triton-single-model/badge.svg?branch=main)](https://github.com/Azure/azureml-examples/actions/workflows/cli-scripts-deploy-custom-container-triton-single-model.yml)
[deploy-local-endpoint.sh](deploy-local-endpoint.sh)|[![deploy-local-endpoint](https://github.com/Azure/azureml-examples/workflows/cli-scripts-deploy-local-endpoint/badge.svg?branch=main)](https://github.com/Azure/azureml-examples/actions/workflows/cli-scripts-deploy-local-endpoint.yml)
[deploy-managed-online-endpoint-access-resource-sai.sh](deploy-managed-online-endpoint-access-resource-sai.sh)|[![deploy-managed-online-endpoint-access-resource-sai](https://github.com/Azure/azureml-examples/workflows/cli-scripts-deploy-managed-online-endpoint-access-resource-sai/badge.svg?branch=main)](https://github.com/Azure/azureml-examples/actions/workflows/cli-scripts-deploy-managed-online-endpoint-access-resource-sai.yml)

View file

@@ -15,6 +15,5 @@ Each example consists of a script located in the [CLI](../../..) directory as we
|[r/multimodel-plumber](r/multimodel-plumber)|[deploy-custom-container-r-multimodel-plumber](../../../deploy-custom-container-r-multimodel-plumber.sh)|Deploy three regression models to one endpoint using the Plumber R package|
|[tfserving/half-plus-two](tfserving/half-plus-two)|[deploy-custom-container-tfserving-half-plus-two](../../../deploy-custom-container-tfserving-half-plus-two.sh)|Deploy a simple Half Plus Two model using a TFServing custom container using the standard model registration process.|
|[tfserving/half-plus-two-integrated](tfserving/half-plus-two-integrated)|[deploy-custom-container-tfserving-half-plus-two-integrated](../../../deploy-custom-container-tfserving-half-plus-two-integrated.sh)|Deploy a simple Half Plus Two model using a TFServing custom container with the model integrated into the image.|
-|[torchserve/densenet](torchserve/densenet)|[deploy-custom-container-torchserve-densenet](../../../deploy-custom-container-torchserve-densenet.sh)|Deploy a single model using a Torchserve custom container.|
-|[torchserve/huggingface-textgen](torchserve/huggingface-textgen)|[deploy-custom-container-torchserve-huggingface-textgen](../../../deploy-custom-container-torchserve-huggingface-textgen.sh)|Deploy Huggingface models to an online endpoint and follow along with the Huggingface Transformers Torchserve example.|
+|[torchserve/densenet](torchserve/densenet)|[deploy-custom-container-torchserve-densenet](../../../deploy-custom-container-torchserve-densenet.sh)|Deploy a single model using a Torchserve custom container.|
|[triton/single-model](triton/single-model)|[deploy-custom-container-triton-single-model](../../../deploy-custom-container-triton-single-model.sh)|Deploy a Triton model using a custom container|
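
For context, the change recorded above amounts to deleting one autogenerated workflow file and dropping the README table rows that pointed at it. Below is a minimal, hypothetical shell sketch of the same cleanup, assuming it runs from the repository root; the `readme.py` generator mentioned in the deleted file's header is not shown in this diff, so its exact location and usage are assumptions.

```bash
# Remove the autogenerated workflow for the retired example.
git rm .github/workflows/cli-scripts-deploy-custom-container-torchserve-huggingface-textgen.yml

# The README rows removed above come from autogenerated tables ("python3 readme.py"
# per the deleted file's header), so they would normally be refreshed by rerunning
# that generator rather than edited by hand.

# Confirm that no badge links or other references to the removed workflow remain.
if grep -rn "cli-scripts-deploy-custom-container-torchserve-huggingface-textgen" \
    --include="*.md" --include="*.yml" .; then
  echo "stale references remain"
else
  echo "workflow references fully removed"
fi
```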