Example of using HyperDrive to tune a regular ML learner.

Author: Mario Bourgoin

Training of Python scikit-learn models on Azure

Overview

This scenario shows how to tune a Frequently Asked Questions (FAQ) matching model that can be deployed as a web service to provide predictions for user questions. For this scenario, "Input Data" in the architecture diagram refers to text strings containing the user questions to match with a list of FAQs. The scenario is designed for the Scikit-Learn machine learning library for Python but can be generalized to any scenario that uses Python models to make real-time predictions.
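As a minimal illustration of the matching task (with made-up FAQ strings; the repository's actual pipeline is more involved), a TF-IDF model can score a user question against a small FAQ list:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical FAQ list and user question, for illustration only.
faqs = [
    "How do I declare a variable in JavaScript?",
    "What is the difference between == and === in JavaScript?",
    "How do I loop over an array in JavaScript?",
]
question = "What does === mean compared to ==?"

# Vectorize the FAQs, then score the question against each of them.
vectorizer = TfidfVectorizer().fit(faqs)
scores = cosine_similarity(vectorizer.transform([question]),
                           vectorizer.transform(faqs))[0]
best = scores.argmax()  # index of the closest-matching FAQ
```

A real deployment would return `scores` (or calibrated probabilities) for all FAQs rather than a single best match.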

Design

The scenario uses a subset of Stack Overflow question data that includes original questions tagged as JavaScript, their duplicate questions, and their answers. It tunes a Scikit-Learn pipeline to predict the probability that a duplicate question matches each of the original questions. The application flow for this architecture is as follows:

  1. Create an Azure ML Service workspace.
  2. Create an Azure ML Compute cluster.
  3. Upload training, tuning, and testing data to Azure Storage.
  4. Configure a HyperDrive random hyperparameter search.
  5. Submit the search.
  6. Monitor until complete.
  7. Retrieve the best set of hyperparameters.
  8. Register the best model.
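The random search in steps 4–7 can be sketched locally with scikit-learn's own RandomizedSearchCV on synthetic data (illustrative parameter ranges only; HyperDrive runs the equivalent search in parallel on the Azure ML Compute cluster and tracks each run in the workspace):

```python
from scipy.stats import uniform
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RandomizedSearchCV

# Synthetic stand-in for the prepared training data.
X, y = make_classification(n_samples=200, random_state=0)

# Randomly sample hyperparameter settings and keep the best by CV score,
# analogous to a HyperDrive random parameter sampling run.
search = RandomizedSearchCV(
    LogisticRegression(max_iter=1000),
    param_distributions={"C": uniform(0.01, 10.0)},
    n_iter=10,
    cv=3,
    random_state=0,
)
search.fit(X, y)
best_params = search.best_params_  # analogous to retrieving the best run's hyperparameters
```

In the Azure version, the winning run's model is then registered in the workspace (step 8) instead of being kept in memory.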

Prerequisites

  1. Linux (Ubuntu).
  2. Anaconda Python installed.
  3. Azure account.

The tutorial was developed on an Azure Ubuntu DSVM, which satisfies the first two prerequisites. You can provision such a VM in the Azure portal by creating a "Data Science Virtual Machine for Linux (Ubuntu)" resource.

Setup

To set up your environment to run these notebooks, follow the steps below; they configure the notebooks to use Azure seamlessly.

  1. Create a Linux Ubuntu VM.
  2. Log in to your VM. We recommend that you use a graphical client such as X2Go to access your VM. The remaining steps are to be done on the VM.
  3. Open a terminal emulator.
  4. Clone, fork, or download the zip file for this repository:
    git clone https://github.com/Microsoft/MLHyperparameterTuning.git
    
  5. Enter the local repository:
    cd MLHyperparameterTuning
    
  6. Create the MLHyperparameterTuning conda environment from environment.yml:
    conda env create -f environment.yml
    
  7. Activate the virtual environment:
    source activate MLHyperparameterTuning
    
    The remaining steps should be done in this virtual environment.
  8. Log in to Azure:
    az login
    
    You can verify that you are logged in to your subscription by executing the command:
    az account show -o table
    
  9. If you have more than one Azure subscription, select the one you want to use:
    az account set --subscription <Your Azure Subscription>
    
  10. Start the Jupyter notebook server:
    jupyter notebook
    

Steps

After following the setup instructions above, run the Jupyter notebooks in order, starting with 00_Data_Prep.ipynb.

Cleaning up

The last Jupyter notebook describes how to delete the Azure resources created for this tutorial. Consult the conda documentation for how to remove the conda environment created during setup. If you created a VM, you can also delete it.

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.