diff --git a/devops/04-ReleaseGates.ipynb b/devops/04-ReleaseGates.ipynb new file mode 100644 index 0000000..0159bba --- /dev/null +++ b/devops/04-ReleaseGates.ipynb @@ -0,0 +1,187 @@ +{ + "cells": [ + { + "cell_type": "markdown", + "metadata": { + "collapsed": true, + "trusted": false + }, + "source": [ + "# Release Deployment Control using Gates\n", + "\n", + "Gates allow automatic collection of health signals from external services. These signals are used to promote the release when all of them are successful at the same time, or to stop the deployment on timeout.\n", + "\n", + "In this lab, we will focus on quality metrics from tests, such as pass rate, and deploy only if they are within the required thresholds. We will also use an HTTP-triggered function in an Azure Function app and parse its response." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": false + }, + "source": [ + "### New Release Pipeline\n", + "\n", + "1. Create a new release pipeline by selecting the `Releases` tab on the left followed by `New pipeline`.\n", + "\n", + "![newReleasePipeline](../images/newReleasePipeline.png)\n", + "\n", + "2. Select the `Empty Job` template so we can add our own steps:\n", + "\n", + "![newReleasePipeline](../images/emptyJobTemplate.png)\n", + "\n", + "3. Ensure there is one stage added under Stages.\n", + "\n", + "4. Under `Artifacts`, select _+ Add_. Set `Source type` to `Build`. Provide the `Project` name and the `Source (build pipeline)`, and add this to the artifacts.\n", + "\n", + "5. (Optional) You can also set a continuous deployment trigger by selecting the lightning bolt button on the artifact and then turning on the first toggle. A typical scenario is to create a release every time a new build is available."
+ ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": false + }, + "source": [ + "### Pre-Deployment and Post-Deployment Approvals ###\n", + "\n", + "Each stage in a release pipeline can be configured with pre-deployment and post-deployment conditions that can include waiting for users to manually approve or reject deployments, and checking with other automated systems until specific conditions are verified. The following diagram shows how these features are combined in a stage of a release pipeline:\n", + "\n", + "![approvalsGates](../images/approvalsGates.png)\n", + "\n", + "By using approvals, gates, and manual intervention, you can take full control of your releases to meet a wide range of deployment requirements.\n", + "\n", + "You can define approvals at the start of a stage (pre-deployment approvers), at the end of a stage (post-deployment approvers), or both. The following image illustrates how you can set pre- and post-deployment conditions:\n", + "\n", + "![prePostIcons](../images/prePostIcons.png)\n", + "\n", + "#### A. Pre-deployment approval ####\n", + "\n", + "For a pre-deployment approval, choose the icon at the entry point of the stage and enable pre-deployment approvers. We will not assign any approvers for pre-deployment in this scenario. However, a typical scenario for pre-deployment approvals is when some users must manually validate the change request and approve the deployment to a stage. With gates, you may want to ensure there are no issues in the work item before deploying a build to a stage.\n", + "\n", + "***Triggers***\n", + "\n", + "Triggers start the deployment; there are two options: _After release_ and _Manual only_. In this lab, we will perform manual deployments. To allow only manual deployments, select `Manual`:\n", + "\n", + "![prePostIcons](../images/manualTrigger.png)\n", + "\n", + "***Gates***\n", + "\n", + "We will set up pre-deployment gates to check for published test results and coverage. 
Follow the steps below to set up a pre-deployment gate:\n", + "\n", + "1. Turn on the _Enabled_ toggle next to Gates.\n", + "\n", + "2. Click _+ Add_ and select _Invoke Azure Function_.\n", + "\n", + "3. Configure the Azure Function parameters by populating the fields below:\n", + "\n", + "| Name | Description |\n", + "|----------|-------------|\n", + "| _Azure function URL_ | The URL of the Azure function to be invoked. Populate this field with the URL of the function we created for monitoring tests. |\n", + "| _Function key_ | The function key or host key for the function to be invoked. You can set this to `$(myFunctionKey)`. |\n", + "| _Method_ | The HTTP method with which the function will be invoked. Set this field to `GET`. |\n", + "| _Headers_ | The headers, in JSON format, to be attached to the request sent to the function. Leave this at its default. |\n", + "| _Query parameters_ | Query parameters to append to the function URL. We will not be passing any query parameters. |\n", + "| _Body_ | This is optional and is the request body for the Azure function call, in JSON format. |\n", + "| _Completion Event_ | This is mandatory and specifies how the task reports completion. We will use `ApiResponse`, which is the default. |\n", + "| _Success criteria_ | This is optional and specifies how to parse the response body for success. Our function returns `{\"status\": \"success\"}` if all the tests passed. Hence, we use `eq(root['status'], 'success')` as the success criteria. |\n", + "\n", + "\n", + "![azureFunctionConfig](../images/azureFunctionConfig.png)\n", + "\n", + "4. Open the `Evaluation options` tab and change `The time between re-evaluation of gates` to 5 minutes. This is the duration after which the gates are re-evaluated. It must be greater than the longest typical response time of the configured gates, so that all responses can be received in each evaluation.\n", + "\n", + "#### B. 
Post-deployment approval #### \n", + "\n", + "For a post-deployment approval, choose the icon at the exit point of the stage and enable post-deployment approvers. List your username in the Approvers textbox and reduce the timeout to a day. _Note:_ A maximum timeout of 365 days is allowed.\n", + "\n", + "A typical scenario for post-deployment approval is when users must manually sign off on the app after deployment before the release is promoted to other stages. With gates, you may want to ensure there are no incidents reported by monitoring after the app has been deployed, before promoting the release." + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": false, + "trusted": true + }, + "source": [ + "#### Create a Release\n", + "\n", + "1. Save the pipeline by clicking Save at the top right-hand corner and giving the new release pipeline a meaningful name. It should then show up in the releases tab:\n", + "\n", + "![newPipeline](../images/newPipeline.png)\n", + "\n", + "2. Select `Create a Release` to queue a release.\n", + "\n", + "3. In the `Stages` section, select `Deploy` to process the gates. You should see a `Processing gates` status in the `Stages` section:\n", + "\n", + "![pendingApproval](../images/processingStages.png)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": false + }, + "source": [ + "### Release Logs\n", + "\n", + "The release view also lets you inspect stage logs. Select `Logs` below the stage(s) to view the deployment process and gate logs.\n", + "\n", + "![releaseLogs](../images/releaseLogs.png)\n", + "\n", + "This should allow you to view the `Deployment process`, where you can see logs from the `Azure Function` used in the deployment gates, approvals, etc. 
as shown below:\n", + "\n", + "![deploymentProcess](../images/deploymentProcess.png)" + ] + }, + { + "cell_type": "markdown", + "metadata": { + "collapsed": false, + "trusted": true + }, + "source": [ + "### Approvals\n", + "\n", + "In the final step, the approver will have received an email requesting approval of the deployment to the stage, with a link to `View Approval`. Selecting `View Approval` takes the approver to the stage awaiting approval, as shown below:\n", + "\n", + "![pendingApproval](../images/pendingApproval.png)\n", + "\n", + "If you are the approver, select `Approve` in Stages, provide comments, and approve if there are no issues.\n", + "\n", + "Select the `Releases` tab to see if the release was a success. On success, you should see a green tick under stages as shown below:\n", + "\n", + "![successStages](../images/successStages.png)" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "Python 3 (Ubuntu Linux)", + "language": "python", + "metadata": { + "cocalc": { + "description": "Python 3 programming language", + "priority": 100, + "url": "https://www.python.org/" + } + }, + "name": "python3" + }, + "language_info": { + "codemirror_mode": { + "name": "ipython", + "version": 3 + }, + "file_extension": ".py", + "mimetype": "text/x-python", + "name": "python", + "nbconvert_exporter": "python", + "pygments_lexer": "ipython3", + "version": "3.6.7" + } + }, + "nbformat": 4, + "nbformat_minor": 0 +} \ No newline at end of file diff --git a/devops/05-Monitor.ipynb b/devops/05-Monitor.ipynb new file mode 100644 index 0000000..5ffadb9 --- /dev/null +++ b/devops/05-Monitor.ipynb @@ -0,0 +1,58 @@ +{ + "cells": [ + { + "metadata": {}, + "cell_type": "markdown", + "source": "# Azure Monitor Alerts and Gates\n\nGates also allow us to query Azure Monitor alerts. In this lab, we will begin by leveraging _Application Insights_ to create alerts. Azure Application Insights can alert you to changes in performance or usage metrics in your app. 
We will then monitor the alert rules in our deployment gates.\n\nA typical scenario is one where a scoring service is deployed and monitored for some time interval before being promoted to production." + }, + { + "metadata": {}, + "cell_type": "markdown", + "source": "### Application Insights\n\n1. In the portal, navigate to your _Azure Machine Learning Service Workspace_ page and select the `Application Insights` account.\n\n![applicationInsights](../images/applicationInsights.png)\n\n2. In the _Application Insights_ dashboard, select _Alerts_:\n\n![alerts](../images/alerts.png)\n\n3. Select _View classic alerts_ as shown below:\n\n![classicAlerts](../images/classicAlerts.png)\n\n4. Select _Add metric alert_:\n\n![addMetricAlert](../images/addMetricAlert.png)\n\n5. In the _Add Rule_ window, provide a name for the rule. You will find that the Criteria information is already populated. Select a metric to use in the rule. For example, you can use `Server Response Time`, `Availability`, or `Failed Requests`, along with a _Condition_, _Threshold value_, and _Period_. A sample rule is \"Server response time less than 3\".\n\n![addRule](../images/addRule.png)\n\n6. In the _Notify via_ section, you can list email addresses separated by semicolons in `Notification email recipients` to receive notifications.\n\n7. On success, you will be able to view the rules in the _Alerts (classic)_ dashboard:\n\n![rules](../images/rules.png)" + }, + { + "metadata": { + "trusted": true + }, + "cell_type": "markdown", + "source": "### Query Azure Monitor Alerts\n\nEdit the release pipeline created in the previous lab. The release pipeline should have pre-deployment gates enabled. By now, you should have an idea of how to add gates to release stages. Refer to the previous lab for guidance if needed.\n\n1. 
Select `Pre-deployment conditions` in the stages and, under `Gates`, select _+ Add_ followed by `Query Azure Monitor alerts`.\n\n![gatesAlerts](../images/gatesAlerts.png)\n\nFill out all the mandatory fields for _Query Azure Monitor Alerts_ and save:\n\n| Name | Description |\n|----------|-------------|\n| _Display Name_ | A mandatory name for gates related to Query Azure Monitor Alerts |\n| _Azure Subscription_ | Azure Resource Manager subscription to monitor |\n| _Resource group_ | Name of the resource group to monitor |\n| _Resource type_ | Type of the resource to monitor. In this lab, we will use Application Insights |\n| _Resource name_ | Name of the resource to monitor. In this lab, we will use the name of the Application Insights resource |\n| _Alert rules_ | List of Azure alert rules to monitor. Select the alert rule created in the previous section. |\n\n2. Navigate to Releases and open the saved release pipeline. Deploy (or redeploy):\n\n![deploy](../images/deploy.png)\n\n3. Select _Logs_ of _Pre-deployment gates_ to verify that all gates succeeded. \n\n![gatesSuccess](../images/gatesSuccess.png)" + }, + { + "metadata": { + "trusted": true + }, + "cell_type": "markdown", + "source": "### Exercise\n\n1. Can you add another alert rule related to _availability_ to the gate?\n\n2. (Optional) The _Query Azure Monitor alerts_ gate passed in this scenario because there wasn't any data for response time. However, you can create a scoring service using AKS, send more scoring data, and then redeploy to see if the additional data makes a difference. Run the below script (after replacing the `url` and `api_key` values) locally to obtain the predictions. 
You can also change `input_j` to obtain different predictions.\n\n```python\nimport requests\nimport json\n\ninput_j = [[1.92168882e+02, 5.82427351e+02, 2.09748253e+02, 4.32529303e+01, 1.52377597e+01, 5.37307613e+01, 1.15729573e+01, 4.27624778e+00, 1.68042813e+02, 4.61654301e+02, 1.03138200e+02, 4.08555785e+01, 1.80809993e+01, 4.85402042e+01, 1.09373285e+01, 4.18269355e+00, 0.00000000e+00, 0.00000000e+00, 0.00000000e+00, 0.00000000e+00, 3.07200000e+03, 5.64000000e+02, 2.22900000e+03, 9.84000000e+02, 0.00000000e+00, 0.00000000e+00, 0.00000000e+00, 0.00000000e+00, 3.03000000e+02, 6.63000000e+02, 3.18300000e+03, 3.03000000e+02, 5.34300000e+03, 4.26300000e+03, 6.88200000e+03, 1.02300000e+03, 1.80000000e+01]]\n\n# Serialize the sample in the format the scoring service expects\ndata = json.dumps({'data': input_j})\ntest_sample = bytes(data, encoding='utf8')\n\nurl = ''  # scoring URI of the deployed service\napi_key = ''  # authentication key of the service\nheaders = {'Content-Type': 'application/json', 'Authorization': 'Bearer ' + api_key}\n\nresp = requests.post(url, test_sample, headers=headers)\nprint(resp.text)\n```" + }, + { + "metadata": { + "trusted": true + }, + "cell_type": "code", + "source": "", + "execution_count": null, + "outputs": [] + } + ], + "metadata": { + "kernelspec": { + "name": "python3", + "display_name": "Python 3", + "language": "python" + }, + "language_info": { + "mimetype": "text/x-python", + "nbconvert_exporter": "python", + "name": "python", + "pygments_lexer": "ipython3", + "version": "3.5.4", + "file_extension": ".py", + "codemirror_mode": { + "version": 3, + "name": "ipython" + } + } + }, + "nbformat": 4, + "nbformat_minor": 1 +} \ No newline at end of file diff --git a/images/addMetricAlert.png b/images/addMetricAlert.png new file mode 100644 index 0000000..e8224d8 Binary files /dev/null and b/images/addMetricAlert.png differ diff --git a/images/addRule.png b/images/addRule.png new file mode 100644 index 0000000..ffcbde1 Binary files /dev/null and b/images/alerts.png differ diff --git a/images/alerts.png b/images/alerts.png new file mode 100644 index 
0000000..db89e9e Binary files /dev/null and b/images/alerts.png differ diff --git a/images/applicationInsights.png b/images/applicationInsights.png new file mode 100644 index 0000000..9451d67 Binary files /dev/null and b/images/applicationInsights.png differ diff --git a/images/azureFunctionConfig.png b/images/azureFunctionConfig.png index e3fcd7b..b9e05ed 100644 Binary files a/images/azureFunctionConfig.png and b/images/azureFunctionConfig.png differ diff --git a/images/classicAlerts.png b/images/classicAlerts.png new file mode 100644 index 0000000..b54e20b Binary files /dev/null and b/images/classicAlerts.png differ diff --git a/images/deploy.png b/images/deploy.png new file mode 100644 index 0000000..dfaabb9 Binary files /dev/null and b/images/deploy.png differ diff --git a/images/deploymentProcess.png b/images/deploymentProcess.png index 7e372a7..d82ed9b 100644 Binary files a/images/deploymentProcess.png and b/images/deploymentProcess.png differ diff --git a/images/gatesAlerts.png b/images/gatesAlerts.png new file mode 100644 index 0000000..918dd3b Binary files /dev/null and b/images/gatesAlerts.png differ diff --git a/images/gatesSuccess.png b/images/gatesSuccess.png new file mode 100644 index 0000000..b71a27e Binary files /dev/null and b/images/gatesSuccess.png differ diff --git a/images/processingStages.png b/images/processingStages.png index dfd91f0..049c1af 100644 Binary files a/images/processingStages.png and b/images/processingStages.png differ diff --git a/images/releaseLogs.png b/images/releaseLogs.png index a7589b0..6f144fe 100644 Binary files a/images/releaseLogs.png and b/images/releaseLogs.png differ diff --git a/images/rules.png b/images/rules.png new file mode 100644 index 0000000..f8aa9e0 Binary files /dev/null and b/images/rules.png differ diff --git a/images/successStages.png b/images/successStages.png index 8e19a99..ddf08ea 100644 Binary files a/images/successStages.png and b/images/successStages.png differ
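The pre-deployment gate in 04-ReleaseGates parses the monitoring function's JSON response with the success criteria `eq(root['status'], 'success')`. As a minimal sketch of the logic such a function could implement (the function name, field names, and pass-rate threshold here are assumptions for illustration, not the lab's actual function):

```python
import json

# Assumed threshold: require every test to pass before the gate opens
PASS_RATE_THRESHOLD = 100.0

def gate_response(passed: int, failed: int) -> str:
    """Build the JSON body an 'Invoke Azure Function' gate could parse.

    The gate's success criteria eq(root['status'], 'success') matches the
    top-level 'status' field of this response.
    """
    total = passed + failed
    pass_rate = 100.0 * passed / total if total else 0.0
    status = "success" if pass_rate >= PASS_RATE_THRESHOLD else "failure"
    return json.dumps({"status": status, "passRate": pass_rate})

print(gate_response(10, 0))  # {"status": "success", "passRate": 100.0}
print(gate_response(9, 1))   # {"status": "failure", "passRate": 90.0}
```

Deployed as an HTTP-triggered Azure Function, this body would be returned to the gate, which re-evaluates it every sampling interval until the criteria pass or the gate times out.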