
| page_type | languages | products | description |
| --- | --- | --- | --- |
| sample | azurecli | azure-machine-learning | Top-level directory for official Azure Machine Learning CLI sample code. |

# Azure Machine Learning CLI (v2) (preview) examples


Welcome to the Azure Machine Learning examples repository!

## Prerequisites

1. An Azure subscription. If you don't have an Azure subscription, create a free account before you begin.
2. A terminal. Install and set up the CLI (v2) before you begin.

## Getting started

1. Install and set up the CLI (v2):

   ```bash
   az extension remove --name ml
   az extension add --name ml --yes
   ```
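
2. Optionally, sign in and set Azure CLI defaults so the `az ml` commands used throughout these examples don't need `--resource-group` and `--workspace-name` on every call. This is a minimal sketch; the subscription, resource group, and workspace names are placeholders for your own:

   ```bash
   # Sign in and select the subscription to use.
   az login
   az account set --subscription "<your-subscription-id>"

   # Set defaults so subsequent `az ml ...` commands can omit
   # --resource-group and --workspace-name (placeholder names below).
   az configure --defaults group="<your-resource-group>" workspace="<your-workspace>"
   ```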

## Examples

### Scripts

| path | status |
| --- | --- |
| batch-score-rest.sh | batch-score-rest |
| batch-score.sh | batch-score |
| deploy-local-endpoint.sh | deploy-local-endpoint |
| deploy-managed-online-endpoint-access-resource-sai.sh | deploy-managed-online-endpoint-access-resource-sai |
| deploy-managed-online-endpoint-access-resource-uai.sh | deploy-managed-online-endpoint-access-resource-uai |
| deploy-managed-online-endpoint-mlflow.sh | deploy-managed-online-endpoint-mlflow |
| deploy-managed-online-endpoint.sh | deploy-managed-online-endpoint |
| deploy-mlcompute-create-with-user-identity.sh | deploy-mlcompute-create-with-user-identity |
| deploy-mlcompute-create_with-system-identity.sh | deploy-mlcompute-create_with-system-identity |
| deploy-mlcompute-update-to-system-identity.sh | deploy-mlcompute-update-to-system-identity |
| deploy-mlcompute-update-to-user-identity.sh | deploy-mlcompute-update-to-user-identity |
| deploy-moe-autoscale.sh | deploy-moe-autoscale |
| deploy-moe-vnet-mlflow.sh | deploy-moe-vnet-mlflow |
| deploy-moe-vnet.sh | deploy-moe-vnet |
| deploy-r.sh | deploy-r |
| deploy-rest.sh | deploy-rest |
| deploy-safe-rollout-kubernetes-online-endpoints.sh | deploy-safe-rollout-kubernetes-online-endpoints |
| deploy-safe-rollout-online-endpoints.sh | deploy-safe-rollout-online-endpoints |
| deploy-tfserving.sh | deploy-tfserving |
| deploy-torchserve.sh | deploy-torchserve |
| deploy-triton-managed-online-endpoint.sh | deploy-triton-managed-online-endpoint |
| misc.sh | misc |
| mlflow-uri.sh | mlflow-uri |
| train-rest.sh | train-rest |
| train.sh | train |
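
The scripts above are plain bash scripts that exercise the corresponding CLI (v2) workflows end to end. Assuming the extension is installed and workspace defaults are configured as shown in "Getting started", any of them can be run directly from this `cli` directory, for example:

```bash
# Run one of the sample scripts from the cli directory of the repository.
# Any script from the table above can be invoked the same way.
bash train.sh
```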

### Jobs (jobs)

| path | status | description |
| --- | --- | --- |
| jobs/basics/hello-automl/hello-automl-job-basic.yml | jobs/basics/hello-automl/hello-automl-job-basic | A Classification job using bank marketing |
| jobs/single-step/dask/nyctaxi/job.yml | jobs/single-step/dask/nyctaxi/job | This sample shows how to run a distributed Dask job on AzureML. The 24GB NYC Taxi dataset is read in CSV format by a 4 node Dask cluster, processed and then written as job output in parquet format. |
| jobs/single-step/gpu_perf/gpu_perf_job.yml | jobs/single-step/gpu_perf/gpu_perf_job | Runs NCCL-tests on GPU nodes. |
| jobs/single-step/julia/iris/job.yml | jobs/single-step/julia/iris/job | Train a Flux model on the Iris dataset using the Julia programming language. |
| jobs/single-step/lightgbm/iris/job-sweep.yml | jobs/single-step/lightgbm/iris/job-sweep | Run a hyperparameter sweep job for LightGBM on the Iris dataset. |
| jobs/single-step/lightgbm/iris/job.yml | jobs/single-step/lightgbm/iris/job | Train a LightGBM model on the Iris dataset. |
| jobs/single-step/pytorch/cifar-distributed/job.yml | jobs/single-step/pytorch/cifar-distributed/job | Train a basic convolutional neural network (CNN) with PyTorch on the CIFAR-10 dataset, distributed via PyTorch. |
| jobs/single-step/pytorch/iris/job.yml | jobs/single-step/pytorch/iris/job | Train a neural network with PyTorch on the Iris dataset. |
| jobs/single-step/pytorch/word-language-model/job.yml | jobs/single-step/pytorch/word-language-model/job | Train a multi-layer RNN (Elman, GRU, or LSTM) on a language modeling task with PyTorch. |
| jobs/single-step/r/accidents/job.yml | jobs/single-step/r/accidents/job | Train a GLM using R on the accidents dataset. |
| jobs/single-step/r/iris/job.yml | jobs/single-step/r/iris/job | Train an R model on the Iris dataset. |
| jobs/single-step/scikit-learn/diabetes/job.yml | jobs/single-step/scikit-learn/diabetes/job | Train a scikit-learn LinearRegression model on the Diabetes dataset. |
| jobs/single-step/scikit-learn/iris-notebook/job.yml | jobs/single-step/scikit-learn/iris-notebook/job | Train a scikit-learn SVM on the Iris dataset using a custom Docker container build with a notebook via papermill. |
| jobs/single-step/scikit-learn/iris/job-docker-context.yml | jobs/single-step/scikit-learn/iris/job-docker-context | Train a scikit-learn SVM on the Iris dataset using a custom Docker container build. |
| jobs/single-step/scikit-learn/iris/job-sweep.yml | jobs/single-step/scikit-learn/iris/job-sweep | Sweep hyperparameters for training a scikit-learn SVM on the Iris dataset. |
| jobs/single-step/scikit-learn/iris/job.yml | jobs/single-step/scikit-learn/iris/job | Train a scikit-learn SVM on the Iris dataset. |
| jobs/single-step/spark/nyctaxi/job.yml | jobs/single-step/spark/nyctaxi/job | This sample shows how to run a single node Spark job on Azure ML. The 47GB NYC Taxi dataset is read in parquet format by a 1 node Spark cluster, processed and then written as job output in parquet format. |
| jobs/single-step/tensorflow/mnist-distributed-horovod/job.yml | jobs/single-step/tensorflow/mnist-distributed-horovod/job | Train a basic neural network with TensorFlow on the MNIST dataset, distributed via Horovod. |
| jobs/single-step/tensorflow/mnist-distributed/job.yml | jobs/single-step/tensorflow/mnist-distributed/job | Train a basic neural network with TensorFlow on the MNIST dataset, distributed via TensorFlow. |
| jobs/single-step/tensorflow/mnist/job.yml | jobs/single-step/tensorflow/mnist/job | Train a basic neural network with TensorFlow on the MNIST dataset. |
| jobs/basics/hello-code.yml | jobs/basics/hello-code | no description |
| jobs/basics/hello-data-uri-folder.yml | jobs/basics/hello-data-uri-folder | no description |
| jobs/basics/hello-dataset.yml | jobs/basics/hello-dataset | no description |
| jobs/basics/hello-git.yml | jobs/basics/hello-git | no description |
| jobs/basics/hello-iris-datastore-file.yml | jobs/basics/hello-iris-datastore-file | no description |
| jobs/basics/hello-iris-datastore-folder.yml | jobs/basics/hello-iris-datastore-folder | no description |
| jobs/basics/hello-iris-file.yml | jobs/basics/hello-iris-file | no description |
| jobs/basics/hello-iris-folder.yml | jobs/basics/hello-iris-folder | no description |
| jobs/basics/hello-iris-literal.yml | jobs/basics/hello-iris-literal | no description |
| jobs/basics/hello-mlflow.yml | jobs/basics/hello-mlflow | no description |
| jobs/basics/hello-notebook.yml | jobs/basics/hello-notebook | no description |
| jobs/basics/hello-pipeline-abc.yml | jobs/basics/hello-pipeline-abc | no description |
| jobs/basics/hello-pipeline-customize-output-file.yml | jobs/basics/hello-pipeline-customize-output-file | no description |
| jobs/basics/hello-pipeline-customize-output-folder.yml | jobs/basics/hello-pipeline-customize-output-folder | no description |
| jobs/basics/hello-pipeline-default-artifacts.yml | jobs/basics/hello-pipeline-default-artifacts | no description |
| jobs/basics/hello-pipeline-io.yml | jobs/basics/hello-pipeline-io | no description |
| jobs/basics/hello-pipeline-settings.yml | jobs/basics/hello-pipeline-settings | no description |
| jobs/basics/hello-pipeline.yml | jobs/basics/hello-pipeline | no description |
| jobs/basics/hello-sweep.yml | jobs/basics/hello-sweep | Hello sweep job example. |
| jobs/basics/hello-world-env-var.yml | jobs/basics/hello-world-env-var | no description |
| jobs/basics/hello-world-input.yml | jobs/basics/hello-world-input | no description |
| jobs/basics/hello-world-org.yml | jobs/basics/hello-world-org | |
| jobs/basics/hello-world-output-data.yml | jobs/basics/hello-world-output-data | no description |
| jobs/basics/hello-world-output.yml | jobs/basics/hello-world-output | no description |
| jobs/basics/hello-world.yml | jobs/basics/hello-world | no description |
| jobs/pipelines/cifar-10/pipeline.yml | jobs/pipelines/cifar-10/pipeline | Pipeline using distributed job to train model based on cifar-10 dataset |
| jobs/pipelines/nyc-taxi/pipeline.yml | jobs/pipelines/nyc-taxi/pipeline | Train model with nyc taxi data |
| jobs/automl-standalone-jobs/cli-automl-classification-task-bankmarketing/cli-automl-classification-task-bankmarketing.yml | jobs/automl-standalone-jobs/cli-automl-classification-task-bankmarketing/cli-automl-classification-task-bankmarketing | A Classification job using bank marketing |
| jobs/automl-standalone-jobs/cli-automl-forecasting-task-energy-demand/cli-automl-forecasting-task-energy-demand.yml | jobs/automl-standalone-jobs/cli-automl-forecasting-task-energy-demand/cli-automl-forecasting-task-energy-demand | A Time-Series Forecasting job using energy demand dataset |
| jobs/automl-standalone-jobs/cli-automl-image-classification-multiclass-task-fridge-items/cli-automl-image-classification-multiclass-task-fridge-items.yml | jobs/automl-standalone-jobs/cli-automl-image-classification-multiclass-task-fridge-items/cli-automl-image-classification-multiclass-task-fridge-items | A multi-class Image classification job using fridge items dataset |
| jobs/automl-standalone-jobs/cli-automl-image-classification-multilablel-task-fridge-items/cli-automl-image-classification-multilabel-task-fridge-items.yml | jobs/automl-standalone-jobs/cli-automl-image-classification-multilablel-task-fridge-items/cli-automl-image-classification-multilabel-task-fridge-items | A multi-label Image classification job using fridge items dataset |
| jobs/automl-standalone-jobs/cli-automl-image-instance-segmentation-task-fridge-items/cli-automl-image-instance-segmentation-task-fridge-items.yml | jobs/automl-standalone-jobs/cli-automl-image-instance-segmentation-task-fridge-items/cli-automl-image-instance-segmentation-task-fridge-items | An Image Instance segmentation job using fridge items dataset |
| jobs/automl-standalone-jobs/cli-automl-image-object-detection-task-fridge-items/cli-automl-image-object-detection-task-fridge-items.yml | jobs/automl-standalone-jobs/cli-automl-image-object-detection-task-fridge-items/cli-automl-image-object-detection-task-fridge-items | An Image Object Detection job using fridge items dataset |
| jobs/automl-standalone-jobs/cli-automl-regression-task-hardware-perf/cli-automl-regression-task-hardware-perf.yml | jobs/automl-standalone-jobs/cli-automl-regression-task-hardware-perf/cli-automl-regression-task-hardware-perf | A regression job using hardware performance dataset |
| jobs/automl-standalone-jobs/cli-automl-text-classification-multilabel-paper-cat/cli-automl-text-classification-multilabel-paper-cat.yml | jobs/automl-standalone-jobs/cli-automl-text-classification-multilabel-paper-cat/cli-automl-text-classification-multilabel-paper-cat | A text classification multilabel job using paper categorization data |
| jobs/automl-standalone-jobs/cli-automl-text-classification-newsgroup/cli-automl-text-classification-newsgroup.yml | jobs/automl-standalone-jobs/cli-automl-text-classification-newsgroup/cli-automl-text-classification-newsgroup | A text classification job using newsgroup dataset |
| jobs/automl-standalone-jobs/cli-automl-text-ner-conll/cli-automl-text-ner-conll2003.yml | jobs/automl-standalone-jobs/cli-automl-text-ner-conll/cli-automl-text-ner-conll2003 | A text named entity recognition job using CoNLL 2003 data |
| jobs/pipelines-with-components/basics/1a_e2e_local_components/pipeline.yml | jobs/pipelines-with-components/basics/1a_e2e_local_components/pipeline | Dummy train-score-eval pipeline with local components |
| jobs/pipelines-with-components/basics/1b_e2e_registered_components/pipeline.yml | jobs/pipelines-with-components/basics/1b_e2e_registered_components/pipeline | E2E dummy train-score-eval pipeline with registered components |
| jobs/pipelines-with-components/basics/2a_basic_component/pipeline.yml | jobs/pipelines-with-components/basics/2a_basic_component/pipeline | Hello World component example |
| jobs/pipelines-with-components/basics/2b_component_with_input_output/pipeline.yml | jobs/pipelines-with-components/basics/2b_component_with_input_output/pipeline | Component with inputs and outputs |
| jobs/pipelines-with-components/basics/3a_basic_pipeline/pipeline.yml | jobs/pipelines-with-components/basics/3a_basic_pipeline/pipeline | Basic Pipeline Job with 3 Hello World components |
| jobs/pipelines-with-components/basics/3b_pipeline_with_data/pipeline.yml | jobs/pipelines-with-components/basics/3b_pipeline_with_data/pipeline | Pipeline with 3 component jobs with data dependencies |
| jobs/pipelines-with-components/basics/4a_local_data_input/pipeline.yml | jobs/pipelines-with-components/basics/4a_local_data_input/pipeline | Example of using data in a local folder as pipeline input |
| jobs/pipelines-with-components/basics/4b_datastore_datapath_uri/pipeline.yml | jobs/pipelines-with-components/basics/4b_datastore_datapath_uri/pipeline | Example of using a data folder from a Workspace Datastore as pipeline input |
| jobs/pipelines-with-components/basics/4c_web_url_input/pipeline.yml | jobs/pipelines-with-components/basics/4c_web_url_input/pipeline | Example of using a file hosted at a web URL as pipeline input |
| jobs/pipelines-with-components/basics/4d_data_input/pipeline.yml | jobs/pipelines-with-components/basics/4d_data_input/pipeline | Example of using data from a data asset as pipeline input |
| jobs/pipelines-with-components/basics/5a_env_public_docker_image/pipeline.yml | jobs/pipelines-with-components/basics/5a_env_public_docker_image/pipeline | Pipeline job with component using a public Docker image as environment |
| jobs/pipelines-with-components/basics/5b_env_registered/pipeline.yml | jobs/pipelines-with-components/basics/5b_env_registered/pipeline | Pipeline job with component using a registered AzureML environment |
| jobs/pipelines-with-components/basics/5c_env_conda_file/pipeline.yml | jobs/pipelines-with-components/basics/5c_env_conda_file/pipeline | Pipeline job with component using an environment defined by a conda file |
| jobs/pipelines-with-components/basics/6a_tf_hello_world/pipeline.yml | jobs/pipelines-with-components/basics/6a_tf_hello_world/pipeline | Prints the environment variable ($TF_CONFIG) useful for scripts running in a TensorFlow training environment |
| jobs/pipelines-with-components/basics/6b_pytorch_hello_world/pipeline.yml | jobs/pipelines-with-components/basics/6b_pytorch_hello_world/pipeline | Prints the environment variables useful for scripts running in a PyTorch training environment |
| jobs/pipelines-with-components/basics/6c_r_iris/pipeline.yml | jobs/pipelines-with-components/basics/6c_r_iris/pipeline | Train an R model on the Iris dataset. |
| jobs/pipelines-with-components/image_classification_with_densenet/pipeline.yml | jobs/pipelines-with-components/image_classification_with_densenet/pipeline | Train DenseNet for image classification |
| jobs/pipelines-with-components/nyc_taxi_data_regression/pipeline.yml | jobs/pipelines-with-components/nyc_taxi_data_regression/pipeline | Train regression model based on nyc taxi dataset |
| jobs/pipelines-with-components/pipeline_with_hyperparameter_sweep/pipeline.yml | jobs/pipelines-with-components/pipeline_with_hyperparameter_sweep/pipeline | Tune hyperparameters using TF component |
| jobs/pipelines-with-components/rai_pipeline_adult_analyse/pipeline.yml | jobs/pipelines-with-components/rai_pipeline_adult_analyse/pipeline | Sample RAI pipeline |
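
Each row above points at a job specification in YAML; jobs are submitted with `az ml job create`. For example, assuming workspace defaults are configured as in "Getting started":

```bash
# Submit a simple hello-world job from its YAML specification.
az ml job create --file jobs/basics/hello-world.yml

# Submit a training job and open it in the Azure ML studio UI while it runs.
az ml job create --file jobs/single-step/lightgbm/iris/job.yml --web
```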

### Endpoints (endpoints)

| path | status | description |
| --- | --- | --- |
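
The endpoint YAML files under `endpoints` are applied with the `az ml online-endpoint` and `az ml online-deployment` command groups (the deployment scripts in the Scripts section above show complete flows). A minimal sketch, using placeholder file names rather than specific files from this repository:

```bash
# Placeholder paths -- substitute an endpoint and deployment YAML from the endpoints directory.
az ml online-endpoint create --file <endpoint.yml>
az ml online-deployment create --file <deployment.yml> --all-traffic
```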

### Resources (resources)

| path | status | description |
| --- | --- | --- |
| resources/compute/cluster-basic.yml | resources/compute/cluster-basic | no description |
| resources/compute/cluster-location.yml | resources/compute/cluster-location | no description |
| resources/compute/cluster-low-priority.yml | resources/compute/cluster-low-priority | no description |
| resources/compute/cluster-minimal.yml | resources/compute/cluster-minimal | no description |
| resources/compute/cluster-ssh-password.yml | resources/compute/cluster-ssh-password | no description |
| resources/compute/cluster-system-identity.yml | resources/compute/cluster-system-identity | no description |
| resources/compute/cluster-user-identity.yml | resources/compute/cluster-user-identity | no description |
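
The compute specifications above are created with `az ml compute create`, for example:

```bash
# Create an Azure ML compute cluster from its YAML specification.
az ml compute create --file resources/compute/cluster-basic.yml
```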

### Assets (assets)

| path | status | description |
| --- | --- | --- |
| assets/component/train.yml | assets/component/train | An example train component |
| assets/data/cloud-file-https.yml | assets/data/cloud-file-https | Data asset created from a file in cloud using https URL. |
| assets/data/cloud-file-wasbs.yml | assets/data/cloud-file-wasbs | Data asset created from a file in cloud using wasbs URL. |
| assets/data/cloud-file.yml | assets/data/cloud-file | Data asset created from file in cloud. |
| assets/data/cloud-folder-https.yml | assets/data/cloud-folder-https | Data asset created from folder in cloud using https URL. |
| assets/data/cloud-folder-wasbs.yml | assets/data/cloud-folder-wasbs | Data asset created from folder in cloud using wasbs URL. |
| assets/data/cloud-folder.yml | assets/data/cloud-folder | Data asset created from folder in cloud. |
| assets/data/iris-csv-example.yml | assets/data/iris-csv-example | no description |
| assets/data/local-file.yml | assets/data/local-file | Data asset created from local file. |
| assets/data/local-folder.yml | assets/data/local-folder | Data asset created from local folder. |
| assets/data/public-file-https.yml | assets/data/public-file-https | Data asset created from a publicly available file using https URL. |
| assets/environment/docker-context.yml | assets/environment/docker-context | no description |
| assets/environment/docker-image-plus-conda.yml | assets/environment/docker-image-plus-conda | Environment created from a Docker image plus Conda environment. |
| assets/environment/docker-image.yml | assets/environment/docker-image | Environment created from a Docker image. |
| assets/model/local-file.yml | assets/model/local-file | Model created from local file. |
| assets/model/local-mlflow.yml | assets/model/local-mlflow | Model created from local MLflow model directory. |
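
Assets are registered with the matching `az ml <asset> create` command (assuming a recent version of the `ml` extension), using the files listed above, for example:

```bash
# Register a component, a data asset, an environment, and a model from their YAML files.
az ml component create --file assets/component/train.yml
az ml data create --file assets/data/local-file.yml
az ml environment create --file assets/environment/docker-image.yml
az ml model create --file assets/model/local-file.yml
```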

## Contents

| directory | description |
| --- | --- |
| assets | assets |
| endpoints | endpoints |
| jobs | jobs |
| resources | resources |

## Contributing

We welcome contributions and suggestions! Please see the contributing guidelines for details.

## Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. Please see the code of conduct for details.

## Reference