Merge pull request #233 from microsoft/azuresqlworkshop

Discussed with Bob Ward, reviewed, and merging
This commit is contained in:
Anna Hoffman (Thomas) 2020-02-05 15:06:34 -08:00 committed by GitHub
Parent 8e709bd1b6 503292fc48
Commit 4b4ade0a57
No key matching this signature was found
GPG key ID: 4AEE18F83AFDEB23
145 changed files: 28676 additions and 13150 deletions

View File

@@ -111,10 +111,11 @@ This is a modular workshop, and in each section, you'll learn concepts, technolo
<tr><td style="background-color: AliceBlue; color: black;"><b>Module</b></td><td style="background-color: AliceBlue; color: black;"><b>Topics</b></td></tr>
<tr><td><a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/01-IntroToAzureSQL.md" target="_blank">01 - Introduction to Azure SQL</a></td><td>Starting with a brief history of why and how we built Azure SQL, you'll then learn about the various deployment options and service tiers, including what to use when. This includes Azure SQL Database and Azure SQL managed instance. Understanding what Platform as a Service (PaaS) encompasses and how it compares to the SQL Server “box” will help level-set what you get (and don't get) when you move to the cloud. We'll cover deployment, configuration, and other getting-started tasks for Azure SQL (hands-on). </td></tr>
<tr><td style="background-color: AliceBlue; color: black;"><a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/02-Security.md" target="_blank">02 - Security</a> </td><td style="background-color: AliceBlue; color: black;">Ensuring security and compliance of your data is always a top priority. You'll learn how to use Azure SQL to secure your data, how to configure logins and users, how to use tools and techniques for monitoring security, how to ensure your data meets industry and regulatory compliance standards, and how to leverage the extra benefits and intelligence that is only available in Azure. We'll also cover some of the networking considerations for securing SQL.</td></tr>
<tr><td><a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/03-Performance.md" target="_blank">03 - Performance</a></td><td>You've been responsible for getting your SQL fast, keeping it fast, and making it fast again when something is wrong. We'll show you how to leverage your existing performance skills, processes, and tools and apply them to Azure SQL, including taking advantage of the intelligence in Azure to keep your database tuned.</td></tr>
<tr><td style="background-color: AliceBlue; color: black;"><a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/04-Availability.md" target="_blank">04 - Availability</a> </td><td style="background-color: AliceBlue; color: black;">Depending on the SLA your business requires, Azure SQL has the options you need, including built-in capabilities. You will learn how to translate your knowledge of backup/restore, Always On failover cluster instances, and Always On availability groups to the options for business continuity in Azure SQL.</td></tr> <tr><td><a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/05-PuttingItTogether.md" target="_blank">05 - Putting it all together</a></td><td>In the final activity, we'll validate your Azure SQL expertise with a challenging problem-solution exercise. We'll then broaden your horizons to the many other opportunities and resources for personal and corporate growth that Azure offers.</td></tr>
<tr><td><a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/01-IntroToAzureSQL.md" target="_blank">01 - Introduction to Azure SQL</a></td><td>Starting with a brief history of why and how we built Azure SQL, you'll then learn about the various deployment options and service tiers, including what to use when. This includes Azure SQL Database and Azure SQL managed instance. Understanding what Platform as a Service (PaaS) encompasses and how it compares to the SQL Server “box” will help level-set what you get (and don't get) when you move to the cloud. </td></tr>
<tr><td><a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/02-DeployAndConfigure.md" target="_blank">02 - Deploy and Configure</a></td><td>We'll cover deployment, configuration, and other getting-started tasks for Azure SQL. You'll learn how you should plan for deployment, deploy, and verify your deployment, and how it compares to SQL Server. Then, you'll learn how deploying and configuring databases compares, plus some insights on loading data. </td></tr>
<tr><td style="background-color: AliceBlue; color: black;"><a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/03-Security.md" target="_blank">03 - Security</a> </td><td style="background-color: AliceBlue; color: black;">Ensuring security and compliance of your data is always a top priority. You'll learn how to use Azure SQL to secure your data, how to configure logins and users, how to use tools and techniques for monitoring security, how to ensure your data meets industry and regulatory compliance standards, and how to leverage the extra benefits and intelligence that is only available in Azure. We'll also cover some of the networking considerations for securing SQL.</td></tr>
<tr><td><a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/04-Performance.md" target="_blank">04 - Performance</a></td><td>You've been responsible for getting your SQL fast, keeping it fast, and making it fast again when something is wrong. We'll show you how to leverage your existing performance skills, processes, and tools and apply them to Azure SQL, including taking advantage of the intelligence in Azure to keep your database tuned.</td></tr>
<tr><td style="background-color: AliceBlue; color: black;"><a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/05-Availability.md" target="_blank">05 - Availability</a> </td><td style="background-color: AliceBlue; color: black;">Depending on the SLA your business requires, Azure SQL has the options you need, including built-in capabilities. You will learn how to translate your knowledge of backup/restore, Always On failover cluster instances, and Always On availability groups to the options for business continuity in Azure SQL.</td></tr> <tr><td><a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/06-PuttingItTogether.md" target="_blank">06 - Putting it all together</a></td><td>In the final activity, we'll validate your Azure SQL expertise with a challenging problem-solution exercise. We'll then broaden your horizons to the many other opportunities and resources for personal and corporate growth that Azure offers.</td></tr>
</table>

View File

@@ -55,7 +55,7 @@ Your workshop invitation may have instructed you that they will provide a Micros
In order to complete this workshop you need to install the following software:
1. Create a resource group for the workshop, naming it **azuresqlworkshopUID** where **ID** is some 4-6 digit identifier that you can easily remember (e.g. 0406 is my birthday so I might pick "azuresqlworkshop0406"). Use this same **ID** every time you are told to name something ending in **ID**. Select a region that is close to where you are, and use this region for all future resoureces.
1. Create a resource group for the workshop, naming it **azuresqlworkshopID** where **ID** is some 4-6 digit identifier that you can easily remember (e.g. 0406 is my birthday so I might pick "azuresqlworkshop0406"). Use this same **ID** every time you are told to name something ending in **ID**. Select a region that is close to where you are, and use this region for all future resources.
1. Deploy an [Azure virtual machine](https://ms.portal.azure.com/#create/Microsoft.VirtualMachine-ARM) (link goes to service in Azure portal). The recommended minimum size is a **D2s_v3**, and you should use a **Windows 10** image. Name the virtual machine **win-vmID** (e.g. "win-vm0406"). Accept other defaults, and refer to more information on deploying Azure virtual machines [here](https://docs.microsoft.com/en-us/azure/virtual-machines/windows/quick-create-portal#create-virtual-machine). A CLI sketch of these prerequisite steps follows this list.
1. Connect to the virtual machine, and perform the remaining steps in the virtual machine.
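
If you'd rather script the prerequisite resources, the following is a minimal Azure CLI sketch. It assumes the example ID 0406 and the East US region; the Windows 10 image URN, admin username, and password are placeholders you would replace with your own values.

```bash
# Resource group for all workshop resources (ID and region are examples).
az group create --name azuresqlworkshop0406 --location eastus

# Windows 10 workshop VM; check `az vm image list --publisher MicrosoftWindowsDesktop`
# for a current Windows 10 image URN before running this.
az vm create \
    --resource-group azuresqlworkshop0406 \
    --name win-vm0406 \
    --image "MicrosoftWindowsDesktop:Windows-10:19h2-pro:latest" \
    --size Standard_D2s_v3 \
    --admin-username azureuser \
    --admin-password "<strong-password-here>"
```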

View File

@@ -1,69 +1,133 @@
# Module 1 Activity - Introduction to Azure SQL
![](../graphics/microsoftlogo.png)
#### <i>The Azure SQL Workshop</i>
# The Azure SQL Workshop
#### <i>A Microsoft workshop from the SQL team</i>
<p style="border-bottom: 1px solid lightgrey;"></p>
<img style="float: left; margin: 0px 15px 15px 0px;" src="https://github.com/microsoft/sqlworkshops/blob/master/graphics/textbubble.png?raw=true"> <h2>Overview</h2>
<img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/textbubble.png"> <h2>01 - Introduction to Azure SQL</h2>
> You must complete the prerequisites before completing these activities. You can also choose to audit the materials if you cannot complete the prerequisites. If you were provided an environment to use for the workshop, then you **do not need** to complete the prerequisites.
In this module, you'll start with a brief history of why and how we built Azure SQL, and then you'll learn about the various deployment options and service tiers, including what to use when. This includes Azure SQL Database and Azure SQL managed instance. Understanding what Platform as a Service (PaaS) encompasses and how it compares to the SQL Server “box” will help level-set what you get (and don't get) when you move to the cloud. We'll cover deployment, configuration, and other getting-started tasks for Azure SQL (hands-on).
In this module's activities, you will deploy and configure Azure SQL, specifically Azure SQL Database. In addition to the Azure portal, you'll leverage SSMS, Azure Data Studio (including SQL and PowerShell Notebooks), and the Azure CLI.
In each module you'll get more references, which you should follow up on to learn more. Also watch for links within the text - click on each one to explore that topic.
Throughout the activities, it's important to also read the text that accompanies the steps, but know that you can always come back to this page to review what you did at a later time (after the workshop).
(<a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/00-Prerequisites.md" target="_blank">Make sure you check out the <b>Prerequisites</b> page before you start</a>. You'll need all of the items loaded there before you can proceed with the workshop.)
There are X activities in this module:
* TODO1 WITH LINK
* TODO2 WITH LINK
* TODO3 WITH LINK
In this module, you'll cover these topics:
[1.1](#1.1): History
[1.2](#1.2): Azure SQL Overview
[1.3](#1.3): Purchasing models, service tiers, and hardware choices
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="https://github.com/microsoft/sqlworkshops/blob/master/graphics/point1.png?raw=true"><b>Activity 1: Deploy Azure SQL Database using the Azure portal</b></p>
<p style="border-bottom: 1px solid lightgrey;"></p>
In this activity, you'll deploy Azure SQL Database using the Azure portal. Throughout this exercise, you'll also get to explore the various options that are available to you.
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="1.1">1.1 History</h2></a>
**Step 1 - Deployment options**
Before you learn about Azure SQL and where it's going, let's briefly consider where it started. In 2008, at the [Microsoft Professional Developers Conference](https://www.youtube.com/watch?v=otuf3goxLsg), Microsoft's Chief Software Architect (at the time) [Ray Ozzie announced](https://news.microsoft.com/2008/10/27/microsoft-unveils-windows-azure-at-professional-developers-conference/#IP8XlBTCMpvORgaV.97) the new cloud computing operating system, Windows Azure (or "Project Red Dog"), which was later changed to Microsoft Azure. One of the five key components of the Azure Services Platform launch was "Microsoft SQL Services." From the beginning, SQL has been a big part of Azure. SQL Azure (then renamed to Azure SQL Database and now expanded to Azure SQL) was created to provide a cloud-hosted version of SQL Server.
Navigate to https://portal.azure.com/ and log in with your account, if you are not already. In the top search bar, type **Azure SQL** and review what appears:
![](../graphics/search.png)
* **Services**: this allows you to see your existing resources grouped by what type of service they are
* **Resources**: this allows you to select specific resources
* **Marketplace**: this allows you to deploy new resources
* **Documentation**: this searches docs.microsoft.com
* **Resource groups**: this allows you to select a resource group
[An explanation](https://social.technet.microsoft.com/wiki/contents/articles/1308.select-an-edition-of-sql-server-for-application-development/revision/7.aspx) of when you would want to use the early Azure SQL Database (2010) is as follows: [Azure SQL Database] is a cloud database offering that Microsoft provides as part of the Azure cloud computing platform. Unlike other editions of SQL Server, you do not need to provision hardware for, install or patch [Azure SQL Database]; Microsoft maintains the platform for you. You also do not need to architect a database installation for scalability, high availability or disaster recovery as these features are provided automatically by the service. Any application that uses [Azure SQL Database] must have Internet access in order to connect to the database.
Next, select **Azure SQL** under "Marketplace." This will bring you to the Azure SQL create experience. Take a few seconds to click around and explore.
This explanation still remains valid today, though the capabilities around security, performance, availability, and scale have been enhanced greatly. There are now multiple deployment options with the flexibility to scale to your needs, and there have been over seven million deployments of some form of Azure SQL.
Since 2008, SQL Server has changed a lot and Azure SQL has changed a lot. It's no surprise then that the role of the SQL Server professional has also changed a lot. The goal of this course is to help SQL Server professionals translate their existing skills to become not only better SQL Server professionals, but also Azure SQL professionals.
![](../graphics/AzureSQLDeploymentOptions.gif)
<br>
Next, select **Single database** and click **Create**.
<p style="border-bottom: 1px solid lightgrey;"></p>
**Step 2 - Database name**
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="1.2">1.2 Azure SQL Deployment Options</h2></a>
Select the subscription and resource group you created in the prerequisites (or were provided to use), then enter a database name **AdventureWorksID** where ID is the unique identifier you used in the prerequisites, or the unique ID at the end of the resource group you were provided (e.g. TODO).
Within the umbrella of the "Azure SQL" term, there are many deployment options and choices to be made in order to tailor to various customers' needs. While there are a lot of options, this is not meant to confuse or complicate things, but rather to give customers the flexibility to get and pay for exactly what they need. This topic will cover some of the challenges and scenarios that lead to choosing various Azure SQL deployment options, as well as some of the technical specifications for each of those options. The deployment options discussed in this topic include Azure SQL virtual machines, Azure SQL managed instances, Azure SQL Databases, and Azure SQL "pools" (Azure SQL Instance Pools and Azure SQL Elastic Pools).
**Step 3 - Server**
![](../graphics/azuresql.png)
When you create an Azure SQL MI, supplying the server name is the same as in SQL Server. However, for databases and elastic pools, an Azure SQL Database server is required. This is a *logical* construct that acts as a central administrative point for multiple single or pooled databases, logins, firewall rules, auditing rules, threat detection policies, and failover groups (more on these topics later). But having this logical server does not expose any instance-level access or features. More on SQL Database servers [here](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-servers).
At the highest level, when you're considering your options, the first question you may ask is, "What level of scope do I want?" As you move from virtual machines to managed instances to databases, your management scope decreases. With virtual machines, you not only get access to but are also responsible for the OS and the SQL Server. With managed instance, the OS is abstracted from you and now you have access to only the SQL Server. And the highest abstraction is SQL database where you just get a database, and you don't have access to instance-level features or the OS.
Select **Create new** next to "Server" and provide the following information:
* *Server name*: **aw-serverID** where ID is the same identifier you used for the database and resource group.
* *Server admin login*: **cloudadmin**. This is the equivalent of the system admin in SQL Server. This account connects using SQL authentication (username and password), and only one of these accounts can exist.
* *Password*: A complex password that meets the requirements.
* *Location*: Use the same location as your resource group.
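
If you prefer to script this step, the fields above map to `az sql server create` parameters. A minimal sketch, assuming the workshop example names (the password is a placeholder):

```bash
# Create the Azure SQL Database logical server that will host the database
# (names and region are workshop examples).
az sql server create \
    --resource-group azuresqlworkshop0406 \
    --name aw-server0406 \
    --location eastus \
    --admin-user cloudadmin \
    --admin-password "<strong-password-here>"
```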
## Azure SQL virtual machine
![](../graphics/sqlvm.png)
*[Extended Security Updates](https://www.microsoft.com/en-us/cloud-platform/extended-security-updates), worth 75% of the license every year for the next three years after End of Service (July 9, 2019), apply to Azure Marketplace images; customers using custom SQL Server 2008/R2 images can download the Extended Security Updates for free and apply them manually.
**[GigaOm Performance Study](https://gigaom.com/report/sql-transaction-processing-price-performance-testing/)
![](../graphics/newserver.png)
An Azure SQL virtual machine is simply a version of SQL Server that you specify running in an Azure VM. It's just SQL Server, so all of your SQL Server skills should directly transfer, though we can help automate backups and security patches. Azure SQL virtual machines are referred to as [Infrastructure as a Service (IaaS)](https://azure.microsoft.com/en-us/overview/what-is-iaas/). You are responsible for updating and patching the OS and SQL Server (apart from critical SQL security patches), but you have access to the full capabilities of SQL Server.
Then, select **OK**.
The customer example for Azure SQL virtual machines is [Allscripts](https://customers.microsoft.com/en-us/story/allscripts-partner-professional-services-azure). Allscripts is a leading healthcare software manufacturer, serving physician practices, hospitals, health plans, and Big Pharma. To transform its applications frequently and host them securely and reliably, Allscripts wanted to move to Azure quickly. In just three weeks, the company lifted and shifted dozens of acquired applications running on ~1,000 virtual machines to Azure with [Azure Site Recovery](https://azure.microsoft.com/en-us/services/site-recovery/).
**Step 4 - Opt-in for elastic pools**
This isn't the focus of this workshop, but if you're considering Azure SQL VMs, you'll want to review the [guidance on images to choose from](https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sql/virtual-machines-windows-sql-server-iaas-overview), the [quick checklist](https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sql/virtual-machines-windows-sql-performance) to obtain optimal performance of Azure SQL VMs, and the guidance for [storage configuration](https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sql/virtual-machines-windows-sql-server-storage-configuration).
In Azure SQL DB, you then decide if you want this database to be a part of an Elastic Pool (new or existing). In Azure SQL MI, [creating an instance pool (public preview) currently requires a different flow](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-instance-pools-how-to#create-an-instance-pool) than the Azure SQL create experience in the Azure portal.
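
You won't use a pool in this workshop, but for reference, here is a hedged CLI sketch of creating an elastic pool and placing a database in it (pool name and sizing are illustrative only):

```bash
# Create a vCore-based elastic pool on the logical server...
az sql elastic-pool create \
    --resource-group azuresqlworkshop0406 \
    --server aw-server0406 \
    --name aw-pool0406 \
    --edition GeneralPurpose \
    --family Gen5 \
    --capacity 2

# ...then create (or move) a database inside the pool.
az sql db create \
    --resource-group azuresqlworkshop0406 \
    --server aw-server0406 \
    --name PooledDb01 \
    --elastic-pool aw-pool0406
```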
> Note: if you installed SQL Server on an Azure VM yourself (as opposed to leveraging a [pre-installed Azure Marketplace image](https://azuremarketplace.microsoft.com/en-us/marketplace/apps?search=sql%20server&page=1&filters=virtual-machine-images%3Bmicrosoft)), the [Resource Provider](http://www.aka.ms/sqlvm_rp_documentation) can bring the functionality of Azure Marketplace images to SQL Server instances self-installed on Azure VMs.
**Step 5 - Purchasing model**
> Note: If you're specifically looking at SQL Server on RHEL Azure VMs, there's a full operations guide available [here](https://azure.microsoft.com/en-us/resources/sql-server-on-rhel-azure-vms-operations-guide/).
Next to "Compute + storage" select **Configure Database**. The top bar, by default, shows the different service tiers available in the vCore purchasing model. You have two options for the purchasing model, [virtual core (vCore)-based](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tiers-vcore) (recommended) or [Database transaction unit (DTU)-based](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tiers-dtu).
## IaaS vs PaaS
Azure SQL virtual machines are considered IaaS. The other deployment options in the Azure SQL umbrella (Azure SQL managed instance and Azure SQL Database) are [Platform as a Service (PaaS)](https://azure.microsoft.com/en-us/overview/what-is-paas/) deployments. These PaaS Azure SQL deployment options use a fully managed database engine that automates most database management functions, such as upgrading, patching, backups, and monitoring. Throughout this course, you'll learn much more about the benefits and capabilities that the PaaS deployment options enable and how to optimally configure, manage, and troubleshoot them, but some highlights are listed below (with a short CLI sketch after the list):
* [Business continuity](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-business-continuity) enables your business to continue operating in the face of disruption, particularly to its computing infrastructure.
* [High availability](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-high-availability) of Azure SQL Database guarantees your databases are up and running 99.99% of the time, with no need to worry about maintenance or downtime.
* [Automated backups](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-automated-backups) are created and use Azure read-access geo-redundant storage (RA-GRS) to provide geo-redundancy.
* [Long term backup retention](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-long-term-retention) enables you to store specific full databases for up to 10 years.
* [Geo-replication](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-active-geo-replication) creates readable replicas of your database in the same or a different data center (region).
* [Scale](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-scale-resources) easily by adding more resources (CPU, memory, storage) without lengthy provisioning.
* Network Security
* [Azure SQL Database (single database and elastic pool)](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-security-overview#network-security) provides firewalls to prevent network access to the database server until access is explicitly granted based on IP address or Azure Virtual Network traffic origin.
* [Azure SQL Managed Instance](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance-connectivity-architecture) has an extra layer of security in providing native virtual network implementation and connectivity to your on-premises environment using [Azure ExpressRoute](https://docs.microsoft.com/en-us/azure/expressroute/) or [VPN Gateway](https://docs.microsoft.com/en-us/azure/vpn-gateway/vpn-gateway-about-vpngateways).
* [Advanced security](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-security-index) detects threats and vulnerabilities in your databases and enables you to secure your data.
* [Automatic tuning](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-automatic-tuning) analyzes your workload and provides recommendations that can optimize the performance of your applications by adding indexes, removing unused indexes, and automatically fixing query plan issues.
* [Built-in monitoring](https://docs.microsoft.com/en-us/azure/log-analytics/log-analytics-azure-sql) capabilities enable you to get insights into the performance of your databases and workload, and to troubleshoot performance issues.
* [Built-in intelligence](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-intelligent-insights) automatically identifies potential issues in your workload and provides recommendations that can [help you fix the problems](https://azure.microsoft.com/en-us/blog/ai-helped-troubleshoot-an-intermittent-sql-database-performance-issue-in-one-day/).
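
A few of the capabilities above surface directly in the Azure CLI. The sketch below is illustrative only, assuming the workshop example names and an arbitrary restore point:

```bash
# Point-in-time restore from the automatic backups into a new database.
az sql db restore \
    --resource-group azuresqlworkshop0406 \
    --server aw-server0406 \
    --name AdventureWorks0406 \
    --dest-name AdventureWorks0406-restored \
    --time "2020-02-05T18:00:00Z"

# Create a readable geo-replica on a second logical server (which must already exist).
az sql db replica create \
    --resource-group azuresqlworkshop0406 \
    --server aw-server0406 \
    --name AdventureWorks0406 \
    --partner-server aw-server0406-dr
```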
## Azure SQL managed instance
![](../graphics/sqlmi.png)
[Azure SQL managed instance](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance) is a PaaS deployment option of Azure SQL that basically gives you an evergreen instance of SQL Server. Most of the features available in the SQL Server box products are available in Azure SQL managed instance (Azure SQL MI). This option is ideal for customers who want to leverage instance-scoped features (features that are tied to an instance of SQL Server as opposed to features that are tied to a database in an instance of SQL Server) like SQL Server Agent, Service Broker, Common Language Runtime (CLR), etc. and want to move to Azure without rearchitecting their applications. While Azure SQL MI allows customers to access the instance-scoped features, customers do not have to worry about (nor do they have access to) the OS or the infrastructure underneath.
A good customer example comes from [Komatsu](https://customers.microsoft.com/en-us/story/komatsu-australia-manufacturing-azure). Komatsu is a manufacturing company that produces and sells heavy equipment for construction. They had multiple mainframe applications for different types of data, which they wanted to consolidate to get a holistic view. Additionally, they wanted a way to reduce overhead. Because Komatsu uses a large surface area of SQL Server features, they chose to move to **Azure SQL Managed Instance**. They were able to move about 1.5 terabytes of data smoothly, and [start enjoying benefits like automatic patching and version updates, automated backups, high availability, and reduced management overhead](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-technical-overview). After migrating, they reported ~49% cost reduction and ~25-30% performance gains.
## Azure SQL Database
![](../graphics/sqldb.png)
Azure SQL Database is a PaaS deployment option of Azure SQL that abstracts both the OS and the SQL Server instance away from the users. Azure SQL Database has the industry's highest availability [SLA](https://azure.microsoft.com/en-us/support/legal/sla/sql-database/v1_4/), along with other intelligent capabilities related to monitoring and performance, due in part to the fact that Microsoft is managing the instance. This deployment option allows you to just 'get a database' and start developing applications. Azure SQL Database (Azure SQL DB) is also the only deployment option that currently supports scenarios related to needing unlimited database storage ([Hyperscale](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tier-hyperscale)) and autoscaling for unpredictable workloads ([serverless](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-serverless)).
[AccuWeather](https://customers.microsoft.com/en-us/story/accuweather-partner-professional-services-azure) is a great example of using Azure SQL Database. AccuWeather has been analyzing and predicting the weather for more than 55 years. They wanted access to the rich and rapidly advancing platform of Azure, which includes big data, machine learning, and AI capabilities. They wanted to focus on building new models and applications, not managing databases. They selected **Azure SQL Database** to use with other services, like [Azure Data Factory](https://docs.microsoft.com/en-us/azure/data-factory/) and [Azure Machine Learning Services](https://docs.microsoft.com/en-us/azure/machine-learning/service/), to quickly and easily deploy new internal applications to make sales and customer predictions.
## Azure SQL "pools"
You've now learned about the three main deployment options within Azure SQL: virtual machines, managed instances, and databases. For the PaaS deployment options (Azure SQL MI and Azure SQL DB), there are additional options if you have multiple instances or databases, referred to as "pools". At a high level, pools help because they allow you to share resources between multiple instances or databases and optimize costs.
[Azure SQL Instance Pools](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-instance-pools) (currently in public preview) allow you to host multiple Azure SQL MIs and share resources. You can pre-provision the compute resources, which can reduce the overall deployment time and thus make migrations easier. Instance pools also support smaller Azure SQL MI sizes than are available for a standalone Azure SQL MI (more on this in future sections).
[Azure SQL Database Elastic Pools](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-elastic-pool) (Generally Available) allow you to host many databases that may be multi-tenanted. This is ideal for a [Software as a Service (SaaS)](https://azure.microsoft.com/en-us/overview/what-is-saas/) application or provider, because you can manage and monitor performance in a simplified way for many databases.
A good example for where a customer leveraged Azure SQL Database Elastic Pools is [Paychex](https://customers.microsoft.com/en-us/story/paychex-azure-sql-database-us). Paychex is a human capital management firm that serves more than 650,000 businesses across the US and Europe. They needed a way to separately manage the time and pay management for each of their customers, and cut costs. They opted for [**Azure SQL Database Elastic Pools**](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-elastic-pool), which allowed them to simplify the management and enable resource sharing between separate databases to lower costs.
## Azure SQL Deployment Options - Summary
In this section, you've learned about Azure SQL and the deployment options that are available to you. A brief visual that summarizes the deployment options is below. In the next section, we'll go through deploying and configuring Azure SQL and how it compares to deploying and configuring the box SQL Server.
![](../graphics/azuresql2.png)
If you want to dive deeper into the deployment options and how to choose, check out the following resources:
* [Blog announcement for Azure SQL](https://techcommunity.microsoft.com/t5/Azure-SQL-Database/Unified-Azure-SQL-experience/ba-p/815368) which explains and walks through Azure SQL and some of the resulting views and experiences available in the Azure portal.
* [Microsoft Customer Stories](https://customers.microsoft.com/en-us/home?sq=&ff=&p=0) for many more stories similar to the ones above. You can use this to explore various use cases, industries, and solutions.
* [Choose the right deployment option in Azure SQL](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-paas-vs-sql-server-iaas) is a page in the documentation regularly updated to help provide insight into making the decisions between the Azure SQL options.
* [Choosing your database migration path to Azure](https://azure.microsoft.com/mediahandler/files/resourcefiles/choosing-your-database-migration-path-to-azure/Choosing_your_database_migration_path_to_Azure.pdf) is a white paper that talks about tools for discovering, assessing, planning and migrating SQL databases to Azure. This workshop will refer to it several times, and it's a highly recommended read. Chapter 5 deeply discusses choosing the right deployment option.
* [Feature comparison between SQL database, SQL managed instance, and SQL Server](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-features)
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="1.3">1.3 Purchasing models, service tiers, and hardware choices</h2></a>
Once you have an idea of which deployment option best fits your requirements, the next step is to determine the purchasing model, service tier, and hardware. In this section, you'll get an overview of the options and what to use when.
**Purchasing model**
You have two options for the purchasing model, [virtual core (vCore)-based](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tiers-vcore) (recommended) or [Database transaction unit (DTU)-based](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tiers-dtu). The DTU model is not available in Azure SQL MI.
> The vCore-based model is recommended because it allows you to independently choose compute and storage resources, while the DTU-based model is a bundled measure of compute, storage, and I/O resources, which means you have less control over paying only for what you need. The vCore model also allows you to use [Azure Hybrid Benefit for SQL Server](https://azure.microsoft.com/pricing/hybrid-benefit/) to gain cost savings. In the [vCore model](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tiers-vcore), you pay for:
@@ -72,10 +136,10 @@ Next to "Compute + storage" select **Configure Database**. The top bar, by defa
> * The type and amount of data and log storage.
> * Backup storage ([read-access, geo-redundant storage (RA-GRS)](https://docs.microsoft.com/en-us/azure/storage/common/storage-designing-ha-apps-with-ragrs)).
For the purposes of this workshop, we'll focus on the vCore purchasing model (recommended), so there is no action in this step. You can optionally review the DTU model by selecting **Looking for basic, standard, premium?** and by [comparing vCores and DTUs in-depth here](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-purchase-models).
For the purposes of this workshop, we'll focus on the vCore purchasing model (recommended). You can optionally review the DTU model by [comparing vCores and DTUs in-depth here](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-purchase-models).
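
For reference, the purchasing model shows up in the CLI as the database edition and objective. A hedged sketch of the contrast (database names are illustrative):

```bash
# vCore purchasing model: you choose edition, hardware family, and number of vCores.
az sql db create \
    --resource-group azuresqlworkshop0406 \
    --server aw-server0406 \
    --name VCoreDemo0406 \
    --edition GeneralPurpose \
    --family Gen5 \
    --capacity 2

# DTU purchasing model: a bundled service objective such as S0 (not available for Azure SQL MI).
az sql db create \
    --resource-group azuresqlworkshop0406 \
    --server aw-server0406 \
    --name DtuDemo0406 \
    --edition Standard \
    --service-objective S0
```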
**Step 6 - Service tier**
**Service tier**
The next decision is choosing the service tier for performance and availability. We recommend you start with the General Purpose tier and adjust as needed. There are three tiers available in the vCore model:
* **[General purpose](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tier-general-purpose)**: Most business workloads. Offers budget-oriented, balanced, and scalable compute and storage options.
@@ -88,217 +152,29 @@ If you choose **General Purpose within Azure SQL DB** and the **vCore-based mode
For a deeper explanation between provisioned and serverless compute (including scenarios), you can refer to the detailed [comparison in the documentation](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-serverless#comparison-with-provisioned-compute-tier). For a deeper explanation between the three service tiers (including scenarios), you can refer to the [service-tier characteristics](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tiers-vcore#service-tier-characteristics) in the documentation.
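
If you want to see how the provisioned/serverless choice looks outside the portal, here's a hedged Azure CLI sketch of creating a serverless General Purpose database (values are illustrative):

```bash
# Serverless compute (General Purpose, Gen5 only): scales between min and max vCores
# and can auto-pause after a period of inactivity (minutes).
az sql db create \
    --resource-group azuresqlworkshop0406 \
    --server aw-server0406 \
    --name ServerlessDemo0406 \
    --edition GeneralPurpose \
    --family Gen5 \
    --capacity 2 \
    --compute-model Serverless \
    --min-capacity 0.5 \
    --auto-pause-delay 60
```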
**Step 7 - Hardware**
**Hardware**
The vCore model lets you choose the generation of hardware:
* **Gen4**: Up to 24 logical CPUs based on Intel E5-2673 v3 (Haswell) 2.4-GHz processors, vCore = 1 physical core, 7 GB per core, attached SSD
* **Gen5**: Up to 80 logical CPUs based on Intel E5-2673 v4 (Broadwell) 2.3-GHz processors, vCore = 1 hyper-thread, 5.1 GB per core, fast NVMe SSD
Basically, Gen4 hardware offers substantially more memory per vCore. However, Gen5 hardware allows you to scale up compute resources much higher. [New Gen4 databases are no longer supported in certain regions](https://azure.microsoft.com/en-us/updates/gen-4-hardware-on-azure-sql-database-approaching-end-of-life-in-2020/), where Gen5 is available in most regions worldwide. As technology advances, you can expect that the hardware will change as well. For example, Fsv2-series (compute optimized) and M-series (memory optmized) hardware options recently became available in public preview for Azure SQL DB. You can reivew the latest hardware generations and availability [here](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tiers-vcore#hardware-generations).
Basically, Gen4 hardware offers substantially more memory per vCore. However, Gen5 hardware allows you to scale up compute resources much higher. [New Gen4 databases are no longer supported in certain regions](https://azure.microsoft.com/en-us/updates/gen-4-hardware-on-azure-sql-database-approaching-end-of-life-in-2020/), where Gen5 is available in most regions worldwide. As technology advances, you can expect that the hardware will change as well. For example, Fsv2-series (compute optimized) and M-series (memory optimized) hardware options recently became available in public preview for Azure SQL DB. You can review the latest hardware generations and availability [here](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tiers-vcore#hardware-generations).
> Note: If you choose General Purpose within Azure SQL DB and want to use the serverless compute tier, Gen5 hardware is the only option and it currently can scale up to 16 vCores.
For the workshop, you can leave the default hardware selection of **Gen5** but you can select **Change configuration** to view the other options available (may vary by region).
In this module, you learned about Azure SQL, including the deployment options, purchasing models, service tiers, and hardware choices. Hopefully, you also have a better understanding of what to choose when. In the next module, you'll learn more about deploying and configuring Azure SQL.
**Step 8 - Sizing**
One of the final steps is to determine how many vCores and the Data max size. For the workshop, you can select **2 vCores** and **32 GB Data max size**.
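
The same sizing choice can be inspected and applied from the Azure CLI. A hedged sketch (region, names, and the `GP_Gen5_2` objective, i.e. General Purpose, Gen5, 2 vCores, are workshop examples):

```bash
# See which editions, objectives, and sizes are available in your region.
az sql db list-editions --location eastus --output table

# Resize an existing database to 2 vCores / 32 GB data max size.
az sql db update \
    --resource-group azuresqlworkshop0406 \
    --server aw-server0406 \
    --name AdventureWorks0406 \
    --service-objective GP_Gen5_2 \
    --max-size 32GB
```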
<p style="border-bottom: 1px solid lightgrey;"></p>
Generally, if you're migrating, you should use a size similar to what you use on-premises. You can also leverage tools, like the [Data Migration Assistant SKU Recommender](https://docs.microsoft.com/en-us/sql/dma/dma-sku-recommend-sql-db?view=sql-server-ver15), to estimate the vCore count and Data max size based on your current workload.
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/owl.png"><b>For Further Study</b></p>
<ul>
<li><a href="https://docs.microsoft.com/en-us/azure/sql-database/sql-database-paas-vs-sql-server-iaas" target="_blank">Choose the right deployment option in Azure SQL</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/sql-database/sql-database-purchase-models" target="_blank">Purchasing models</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tiers-general-purpose-business-critical" target="_blank">Service tiers</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tiers-vcore?tabs=azure-portal" target="_blank">vCore Model</a></li>
</ul>
You might also be wondering what "9.6 GB LOG SPACE ALLOCATED" in the bottom right corner means. TODO
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/geopin.png"><b >Next Steps</b></p>
Before you select **Apply**, confirm your selections look similar to those below:
![](../graphics/configuredb.png)
The "Basics" pane should now look similar to the image below:
![](../graphics/basicspane.png)
**Step 9 - Networking**
Select **Next : Networking**.
Networking choices for Azure SQL DB and Azure SQL MI are different. When you deploy an Azure SQL Database, the current default is "No access".
You can then choose to select Public endpoint or Private endpoint (preview). In this workshop we'll use the public endpoint and set the "Allow Azure services and resources to access this server" option to Yes, meaning that other Azure services (e.g. Azure Data Factory or an Azure VM) can access the database if you configure it. You should also select "Add current client IP address" so that you can connect from the IP address you're using to deploy Azure SQL Database. Make sure your settings match the image below:
![](../graphics/networkconnect.png)
With Azure SQL MI, you deploy it inside an Azure virtual network and a subnet that is dedicated to managed instances. This enables you to have a completely secure, private IP address. Azure SQL MI provides the ability to connect an on-prem network to a managed instance, connect a managed instance to a linked server or other on-prem data store, and connect a managed instance to other resources. You can additionally enable a public endpoint so you can connect to the managed instance from the Internet without a VPN. This access is disabled by default.
The principle of private endpoints through virtual network isolation is making its way to Azure SQL DB in something called 'private link' (currently in public preview), and you can learn more [here](https://docs.microsoft.com/en-us/azure/private-link/private-link-overview).
More information on connectivity for Azure SQL DB can be found [here](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-connectivity-architecture) and for Azure SQL MI [here](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance-connectivity-architecture). There will also be more on this topic in upcoming sections/modules.
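
For context, the two public-endpoint settings in this step correspond to server-level firewall rules that you can also manage from the Azure CLI. A hedged sketch (the client IP is a placeholder):

```bash
# "Allow Azure services and resources to access this server" is the special
# 0.0.0.0 - 0.0.0.0 firewall rule.
az sql server firewall-rule create \
    --resource-group azuresqlworkshop0406 \
    --server aw-server0406 \
    --name AllowAzureServices \
    --start-ip-address 0.0.0.0 \
    --end-ip-address 0.0.0.0

# "Add current client IP address" becomes a rule scoped to your public IP.
az sql server firewall-rule create \
    --resource-group azuresqlworkshop0406 \
    --server aw-server0406 \
    --name AllowMyClientIP \
    --start-ip-address <your-public-ip> \
    --end-ip-address <your-public-ip>
```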
For now, select **Next : Additional settings**
**Step 10 - Data source**
In Azure SQL DB, upon deployment you have the option to select the AdventureWorksLT database as the sample in the Azure portal. In Azure SQL MI, however, you deploy the instance first, and then databases inside of it, so there is not an option to have the sample database upon deployment (similar to in SQL Server).
For the workshop, select **Sample**.
**Step 11 - Database collations**
Since we're using the AdventureWorksLT sample, the **database collation is already set**. For a review of collations and how they apply in Azure SQL, continue reading; otherwise, **you can skip to Step 12**.
Collations in SQL Server and Azure SQL tell the Database Engine how to treat certain characters and languages. A collation provides the sorting rules, case, and accent sensitivity properties for your data. When you're creating a new Azure SQL DB or MI, it's important to first take into account the locale requirements of the data you're working with, because the collation set will affect the characteristics of many operations in the database. In the SQL Server box product, the default collation is typically determined by the OS locale. In Azure SQL MI, you can set the server collation upon creation of the instance, and it cannot be changed later. The server collation sets the default for all of the databases in that instance of Azure SQL MI, but you can modify the collations on a database and column level. In Azure SQL DB, you cannot set the server collation; it is set to the default (and most common) collation of `SQL_Latin1_General_CP1_CI_AS`, but you can set the database collation. If we break that into chunks:
* `SQL` means it is a SQL Server collation (as opposed to a Windows or Binary collation)
* `Latin1_General` specifies the alphabet/language to use when sorting
* `CP1` references the code page used by the collation
* `CI` means it will be case insensitive, where `CS` is case sensitive
* `AS` means it will be accent sensitive, where `AI` is accent insensitive
There are other options available related to widths, UTF-8, etc., and more details about what you can and can't do with Azure SQL [here](https://docs.microsoft.com/en-us/sql/relational-databases/collations/collation-and-unicode-support?view=sql-server-ver15).
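
If you ever do need a non-default database collation in Azure SQL DB, it can be set at creation time and inspected afterwards from the CLI. A hedged sketch (the database name is illustrative; the case-sensitive collation is just an example):

```bash
# Create a database with an explicit (case-sensitive) collation...
az sql db create \
    --resource-group azuresqlworkshop0406 \
    --server aw-server0406 \
    --name CollationDemo0406 \
    --collation SQL_Latin1_General_CP1_CS_AS

# ...and confirm what collation a database is using.
az sql db show \
    --resource-group azuresqlworkshop0406 \
    --server aw-server0406 \
    --name CollationDemo0406 \
    --query collation
```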
**Step 12 - Opt-in for Advanced Data Security**
When you deploy Azure SQL DB in the portal, you are prompted if you'd like to enable [Advanced Data Security (ADS)](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-advanced-data-security) on a free trial. Select **Start free trial**. After the free trial, it is billed according to the [Azure Security Center Standard Tier pricing](https://azure.microsoft.com/en-us/pricing/details/security-center/). If you choose to enable it, you get functionality related to data discovery and classification, identifying/mitigating potential database vulnerabilities, and threat detection. You'll learn more about these capabilities in the next module (<a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/02-Security.md" target="_blank"><i>02 - Security</i></a>). In Azure SQL MI, you can enable it on the instance after deployment.
Your "Additional settings" pane should now look similar to the image below.
![](../graphics/additionalsettings.png)
**Step 13 - Tags**
Select **Next : Tags**.
Tags can be used to logically organize Azure resources across a subscription. For example, you can apply the name "Environment" and the value "Development" to this SQL database and database server, but you might use the value "Production" for production resources. This can be helpful for organizing resources for billing or management. You can read more [here](https://docs.microsoft.com/en-us/azure/azure-resource-manager/management/tag-resources).
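
Tags can also be applied after deployment from the CLI. A hedged sketch that tags the workshop database (key and value are examples; note that `az resource tag` replaces any existing tags on the resource):

```bash
# Look up the database's resource ID, then apply a tag to it.
DB_ID=$(az sql db show \
    --resource-group azuresqlworkshop0406 \
    --server aw-server0406 \
    --name AdventureWorks0406 \
    --query id --output tsv)

az resource tag --ids "$DB_ID" --tags Environment=Development
```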
![](../graphics/tags.png)
**Step 14 - Review and create**
Finally, select **Next : Review + create**. Here you can review your deployment selections and the [Azure marketplace terms](https://go.microsoft.com/fwlink/?linkid=2045624).
> You also have the option to "Download a template for automation." We won't get into that here, but if you're interested, you can [learn more](https://docs.microsoft.com/en-us/azure/azure-resource-manager/).
When you're ready, select **Create** to deploy the service.
Soon after selecting Create, you will be redirected to a page that looks like this (below), and where you can monitor the status of your deployment.
![](../graphics/deploymentunderway.png)
And some time later ...
![](../graphics/deploymentunderway2.png)
And finally...
![](../graphics/deploymentunderway3.png)
If, for whatever reason, you navigate away from this page and the deployment has not completed, you can go to your resource group and select **Deployments**. This will give you the various deployments, their statuses, and more information.
![](../graphics/deploymentstatus.png)
Once your resource has deployed, review the "Overview" pane for the SQL database in the Azure portal and confirm that the Status is "Online."
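
You can run the same checks from the Azure CLI if you prefer. A hedged sketch (on older CLI versions the first command was `az group deployment list`):

```bash
# List the deployments in the resource group and their provisioning state.
az deployment group list \
    --resource-group azuresqlworkshop0406 \
    --query "[].{name:name, state:properties.provisioningState}" \
    --output table

# Confirm the database itself reports Online.
az sql db show \
    --resource-group azuresqlworkshop0406 \
    --server aw-server0406 \
    --name AdventureWorks0406 \
    --query status
```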
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="https://github.com/microsoft/sqlworkshops/blob/master/graphics/point1.png?raw=true"><b>Activity 2: Initial connect and comparison</b></p>
**Step 1 - Connect to SQL Server 2019**
Now that everything looks to be up and running in the Azure portal, let's switch to a familiar tool, SQL Server Management Studio (SSMS). Open SSMS and connect, using Windows Authentication, to the local instance of SQL Server 2019 that's running on your Azure VM (if you don't have this, please revisit the prerequisites).
![](../graphics/localconnect.png)
If you completed the prerequisites, expanding the databases and system databases folders should result in a view similar to the following.
![](../graphics/localserver.png)
**Step 2 - Connect to Azure SQL Database**
Next, let's connect to your Azure SQL Database logical server and compare. First, select **Connect > Database Engine**.
![](../graphics/dbengine.png)
For server name, input the name of your Azure SQL Database logical server. You may need to refer to the Azure portal to get this, e.g. *aw-server0406.database.windows.net*.
Change the authentication to **SQL Server Authentication**, and input the corresponding admin Login and Password.
Check the **Remember password** box and select **Connect**.
![](../graphics/connectazsql.png)
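
If you're unsure of the exact server name or connection format, the CLI can generate a connection-string template for several clients. A hedged sketch using the workshop example names:

```bash
# Generate a sqlcmd-style connection string for the workshop database
# (swap in your own server and database names).
az sql db show-connection-string \
    --client sqlcmd \
    --server aw-server0406 \
    --name AdventureWorks0406
```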
Expanding the databases and system databases should result in a view similar to the following.
![](../graphics/azureserver.png)
Spend a few minutes clicking around and exploring the differences, at first glance, between the Azure SQL Database logical server and Azure SQL Database. You won't deploy an Azure SQL Managed Instance as part of this workshop, but the image below shows how Azure SQL Managed Instance would appear in SSMS.
**TODO SCREENSHOT OF SSMS WITH ADVENTUREWORKS**
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="https://github.com/microsoft/sqlworkshops/blob/master/graphics/point1.png?raw=true"><b>Activity 3: Verify deployment queries</b></p>
Now that you've seen how Azure SQL appears in SSMS, let's explore a tool that may be new to you called Azure Data Studio (ADS). ADS is an open-source tool that provides a lightweight editor and other tools (including notebooks, which you'll see soon) for interacting with Azure Data Services (including SQL Server on-prem, Azure SQL, Azure Database for PostgreSQL, and more). Let's take a brief tour to get acquainted.
**Step 1 - Open Azure Data Studio and Connect**
Open Azure Data Studio (ADS). When opening for the first time, you'll first be prompted to make a connection.
![](../graphics/adsconnect.png)
Note that you can connect to your local instance of SQL Server 2019 here. Let's do that first. You can also supply a Server group and Name, if you want to group different connections together. For example, when you connect to SQL Server 2019, you might place it in a new Server group called **SQL Server 2019**. Fill in your information and connect to SQL Server 2019 by selecting **Connect**.
![](../graphics/adsconnectss.png)
You'll then go to a page that contains the "Server Dashboard". Select the **Connections** button (red square in below image) to view your Server groups and connections.
![](../graphics/serverdashboard.png)
Your results should be similar to what you saw in SSMS. Select the **New connection** button in the "Servers" bar.
![](../graphics/newconnection.png)
Now, connect to your Azure SQL Database logical server, just as you did in SSMS, but putting it in a new Server group called "Azure SQL Database", and selecting **Connect**.
![](../graphics/adsconnectdb.png)
In your "Connections" tab, under "Servers," you should now see both connections, and you should be able to expand the folders similar to SSMS.
![](../graphics/adsservers.png)
**Step 2 - Set up easy file access with ADS**
Now that you're connected, you might want an easy way to access scripts and Jupyter notebooks. A Jupyter notebook (often referred to just as a "notebook") is a way of integrating runnable code with text. If you aren't familiar with Jupyter notebooks, you will be soon, and you can check out more details later in the [documentation](https://docs.microsoft.com/en-us/sql/big-data-cluster/notebooks-guidance?view=sql-server-ver15).
First, in ADS, select **File > Open Folder**.
![](../graphics/openfolder.png)
Next, navigate to where the repository of all the workshop resources is located. If you followed the prerequisites, the path should be similar to `C:\Users\<vm-username>\sqlworkshops\AzureSQLWorkshop`. Once you're there, select **Select Folder**.
![](../graphics/selectfolder.png)
Next, select the **Explorer** icon from the left taskbar to navigate through the files in the workshop.
![](../graphics/explorer.png)
Throughout the workshop, you'll be instructed at various points to open a notebook (file ending in `.ipynb`) or a script (file ending in `.sql`), and you can access those through here directly.
**Step 3 - Verify deployment queries**
Once you've deployed an instance of SQL (be it Azure SQL or SQL Server), there are typically some queries you would run to verify your deployment. In Azure SQL, some of these queries vary from SQL Server. In this step, you'll see what changes from SQL Server, how it changes, and what is new.
For this step, you'll use the notebook **VerifyDeployment.ipynb** which is under `azuresqlworkshop\01-IntroToAzureSQL\verifydeployment\VerifyDeployment.ipynb`. Navigate to that file in ADS to complete this activity, and then return here.
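
If you'd like a quick command-line spot check alongside the notebook, a couple of server properties are enough to confirm what you're connected to. This is a hedged sketch using sqlcmd with placeholder credentials; `EngineEdition` returns 5 for Azure SQL Database and 8 for Azure SQL Managed Instance:

```bash
# Minimal verification query; the notebook covers these checks in much more depth.
sqlcmd -S aw-server0406.database.windows.net -d AdventureWorks0406 -U cloudadmin -P "<password>" \
    -Q "SELECT @@VERSION AS version, SERVERPROPERTY('EngineEdition') AS engine_edition, DATABASEPROPERTYEX(DB_NAME(), 'ServiceObjective') AS service_objective;"
```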
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="https://github.com/microsoft/sqlworkshops/blob/master/graphics/point1.png?raw=true"><b>Activity 4: Azure CLI</b></p>
So you've seen the Azure portal, SSMS, and SQL Notebooks in ADS, but there are other tools available to you to use to manage Azure SQL. Two of the most popular are the [Azure CLI](https://docs.microsoft.com/en-us/cli/azure/?view=azure-cli-latest) and [Azure PowerShell](https://docs.microsoft.com/en-us/powershell/azure/?view=azps-3.3.0). They are similar in their functionality, but for this workshop we will focus on the Azure CLI.
To complete this activity, you'll use a PowerShell notebook, which is the same concept as a SQL notebook, but the coding language is PowerShell. You can use PowerShell notebooks to leverage Azure CLI or Azure PowerShell, but we will focus on Azure CLI. For more information on the Azure PowerShell module, [see the documentation](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-powershell-samples?tabs=single-database). For both of these tools, you can also use the [Azure Cloud Shell](https://docs.microsoft.com/en-us/azure/cloud-shell/overview), which is an interactive shell environment that you can use through your browser in the Azure portal.
For this activity, you'll use the notebook called **AzureCli.ipynb** which is under `azuresqlworkshop\01-IntroToAzureSQL\cli\AzureCli.ipynb`. Navigate to that file in ADS to complete this activity, and then return here.
>In the `cli` folder, you'll also find a script if you want to try the activity with the Azure Cloud Shell.
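
To give you a feel for the activity before you open the notebook, here is a hedged sketch of the kind of Azure CLI commands it works with (the exact steps in the notebook may differ):

```bash
# Sign in (Cloud Shell is already authenticated) and pick the subscription.
az login
az account set --subscription "<your-subscription-name-or-id>"

# List the databases on the workshop server, then scale one up to 4 vCores.
az sql db list --resource-group azuresqlworkshop0406 --server aw-server0406 --output table
az sql db update \
    --resource-group azuresqlworkshop0406 \
    --server aw-server0406 \
    --name AdventureWorks0406 \
    --service-objective GP_Gen5_4
```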
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="https://github.com/microsoft/sqlworkshops/blob/master/graphics/geopin.png?raw=true"><b >Next Steps</b></p>
Next, continue to <a href="https://github.com/microsoft/sqlworkshops/blob/master/azuresqlworkshop/azuresqlworkshop/02-Security.md" target="_blank"><i> 02 - Security</i></a>.
Next, continue to <a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/02-DeployAndConfigure.md" target="_blank"><i> 02 - Deploy and Configure</i></a>.

View File

@@ -1,7 +0,0 @@
# Module 1 Activities - Introduction to Azure SQL
These represent demos and examples you can run that accompany Module 1. See [Module 1](../01-IntroToAzureSQL.md) for details on how to use the files in this module.
## verifydeployment
Run the verify deployment queries and review the results across Azure SQL Database, Azure SQL Managed Instance, and SQL Server 2019. Main notebook file [here](./verifydeployment/VerifyDeployment.ipynb).

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1,348 @@
# Module 2 - Deploy and Configure
#### <i>The Azure SQL Workshop</i>
<p style="border-bottom: 1px solid lightgrey;"></p>
<img style="float: left; margin: 0px 15px 15px 0px;" src="https://github.com/microsoft/sqlworkshops/blob/master/graphics/textbubble.png?raw=true"> <h2>Overview</h2>
> You must complete the [prerequisites](../azuresqlworkshop/00-Prerequisites.md) before completing these activities. You can also choose to audit the materials if you cannot complete the prerequisites. If you were provided an environment to use for the workshop, then you **do not need** to complete the prerequisites.
In this module's activities, you will deploy and configure Azure SQL, specifically Azure SQL Database. In addition to the Azure portal, you'll leverage SSMS, Azure Data Studio (including SQL and PowerShell Notebooks), and the Azure CLI.
The in-class version of this workshop involves a short presentation, which you can review [here](../slides/AzureSQLWorkshop.pptx).
Throughout the activities, it's important to also read the text that accompanies the steps, but know that you can always come back to this page later (after the workshop) to review what you did.
In this module, you'll cover these topics:
[2.1](#2.1): Pre-deployment planning
[2.2](#2.2): Deploy and verify
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 1](#1): Deploy Azure SQL Database using the Azure portal
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 2](#2): Initial connect and comparison
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 3](#3): Verify deployment queries
[2.3](#2.3): Configure
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;(Bonus) [Activity 4](#4): Configure with Azure CLI
[2.4](#2.4): Load data
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;(Bonus) [Activity 5](#5): Load data
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="2.1">2.1 Pre-deployment planning</h2></a>
Before you start deploying things in Azure, it's important to understand what your requirements are and how they match to offerings in Azure SQL. Using what you learned in Module 1, it's time to make a plan. You need to determine the following:
* Deployment method: GUI or unattended?
* Deployment option: VM, DB, Elastic Pool, MI, or Instance Pool?
* Purchasing model: DTU or vCore?
* Service tier (SLO): General purpose, business critical, or hyperscale?
* Hardware: Gen4, Gen5, or something new?
* Sizing: number of vCores and data max size?
> The Data Migration Assistant tool (DMA) has a [SKU Recommender](https://docs.microsoft.com/en-us/sql/dma/dma-sku-recommend-sql-db?view=sql-server-ver15) that can help you determine the number of vCores and size if you are migrating.
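If you already know the answers to these questions, the same choices can also be made in an unattended deployment. The following is only a rough sketch using the Azure CLI; every name is a placeholder, and the edition, family, and sizing values should be replaced with the outcome of your own planning:

```bash
# Sketch only: placeholder names; map your planning decisions to the parameters below
az sql db create \
    --resource-group <ResourceGroup> \
    --server <ServerName> \
    --name <DatabaseName> \
    --edition GeneralPurpose \
    --family Gen5 \
    --capacity 2 \
    --max-size 32GB
```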
<br>
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="2.2">2.2 Deploy and Verify</h2></a>
Once you've completed your pre-deployment planning, it's time to deploy and verify that deployment. In this stage, you'll deploy Azure SQL (using the Azure portal or command-line), determine network configuration and how to connect, and run some queries that verify your deployment configuration.
<br>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="1"><b>Activity 1</a>: Deploy Azure SQL Database using the Azure portal</b></p>
In this activity, you'll deploy Azure SQL Database using the Azure portal. Throughout this exercise, you'll also get to explore the various options that are available to you.
**Step 1 - Deployment options**
Navigate to https://portal.azure.com/ and log in with your account, if you are not already. In the top search bar, type **Azure SQL** and review what appears:
![](../graphics/search.png)
* **Services**: this allows you to see your existing resources grouped by what type of service they are
* **Resources**: this allows you to select specific resources
* **Marketplace**: this allows you to deploy new resources
* **Documentation**: this searches docs.microsoft.com
* **Resource groups**: this allows you to select a resource group
Next, select **Azure SQL** under "Marketplace." This will bring you to the Azure SQL create experience. Take a few seconds to click around and explore.
![](../graphics/AzureSQLDeploymentOptions.gif)
Next, select **Single database** and click **Create**.
**Step 2 - Database name**
Select the subscription and resource group you created in the prerequisites (or were provided to use), then enter a database name **AdventureWorksID** where ID is the unique identifier you used in the prerequisites, or the unique ID at the end of the resource group you were provided (e.g. TODO).
**Step 3 - Server**
When you create an Azure SQL MI, supplying the server name is the same as in SQL Server. However, for databases and elastic pools, an Azure SQL Database server is required. This is a *logical* construct that acts as a central administrative point for multiple single or pooled databases, logins, firewall rules, auditing rules, threat detection policies, and failover groups (more on these topics later). But having this logical server does not expose any instance-level access or features. More on SQL Database servers [here](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-servers).
Select **Create new** next to "Server" and provide the following information:
* *Server name*: **aw-serverID** where ID is the same identifier you used for the database and resource group.
* *Server admin login*: **cloudadmin**. This is the equivalent of the system admin in SQL Server. This account connects using SQL authentication (username and password), and only one of these accounts can exist.
* *Password*: A complex password that meets the requirements.
* *Location*: Use the same location as your resource group.
![](../graphics/newserver.png)
Then, select **OK**.
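If you'd rather script this step later, the logical server can also be created with the Azure CLI. A hedged sketch (names, location, and password are placeholders):

```bash
# Sketch: create the Azure SQL Database logical server (placeholder values)
az sql server create \
    --resource-group <ResourceGroup> \
    --name aw-server<ID> \
    --location <Location> \
    --admin-user cloudadmin \
    --admin-password '<ComplexPassword>'
```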
**Step 4 - Opt-in for elastic pools**
In Azure SQL DB, you then decide if you want this database to be a part of an Elastic Pool (new or existing). In Azure SQL MI, [creating an instance pool (public preview) currently requires a different flow](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-instance-pools-how-to#create-an-instance-pool) than the Azure SQL create experience in the Azure portal.
**Step 5 - Purchasing model**
>For more details on purchasing models and comparisons, refer to [Module 1](../azuresqlworkshop/01-IntroToAzureSQL.md).
Next to "Compute + storage" select **Configure Database**. The top bar, by default shows the different service tiers available in the vCore purchasing model.
For the purposes of this workshop, we'll focus on the vCore purchasing model (recommended), so there is no action in this step. You can optionally review the DTU model by selecting **Looking for basic, standard, premium?** and by [comparing vCores and DTUs in-depth here](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-purchase-models).
**Step 6 - Service tier**
>For more details on service tiers and comparisons, refer to [Module 1](../azuresqlworkshop/01-IntroToAzureSQL.md).
The next decision is choosing the service tier for performance and availability. We recommend starting with General Purpose and adjusting as needed.
**Step 7 - Hardware**
>For more details on available hardware and comparisons, refer to [Module 1](../azuresqlworkshop/01-IntroToAzureSQL.md).
For the workshop, you can leave the default hardware selection of **Gen5**, but you can select **Change configuration** to view the other options available (may vary by region).
**Step 8 - Sizing**
One of the final steps is to determine how many vCores and the Data max size. For the workshop, you can select **2 vCores** and **32 GB Data max size**.
Generally, if you're migrating, you should use a similar size as to what you use on-premises. You can also leverage tools, like the [Data Migration Assistant SKU Recommender](https://docs.microsoft.com/en-us/sql/dma/dma-sku-recommend-sql-db?view=sql-server-ver15) to estimate the vCore and Data max size based on your current workload.
The Data max size is not necessarily the database size of your data today. It is the maximum amount of data space that can be allocated for your database. For more information about the difference between data space used, data space allocated, and data max size, refer to this [explanation in the documentation](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-file-space-management#understanding-types-of-storage-space-for-a-database). This will also help you understand the log space allocated, which scales with your data max size.
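If you later want to compare data space used, data space allocated, and data max size without opening the portal, one option is the Azure CLI. A sketch with placeholder names:

```bash
# Sketch: report database size and allocated size against the configured max size
az sql db list-usages \
    --resource-group <ResourceGroup> \
    --server <ServerName> \
    --name <DatabaseName>
```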
Before you select **Apply**, confirm your selections look similar to those below:
![](../graphics/configuredb.png)
The "Basics" pane should now look similar to the image below:
![](../graphics/basicspane.png)
**Step 9 - Networking**
Select **Next : Networking**.
Choices for networking for Azure SQL DB and Azure SQL MI are different. When you deploy an Azure SQL Database, currently the default is "No access".
You can then choose between a Public endpoint and a Private endpoint (preview). In this workshop we'll use the public endpoint and set "Allow Azure services and resources to access this server" to **Yes**, meaning that other Azure services (e.g. Azure Data Factory or an Azure VM) can access the database if you configure it. Also select "Add current client IP address" so that you can connect from the IP address you're using to deploy Azure SQL Database. Make sure your settings match the image below:
![](../graphics/networkconnect.png)
With Azure SQL MI, you deploy it inside an Azure virtual network and a subnet that is dedicated to managed instances. This enables you to have a completely secure, private IP address. Azure SQL MI provides the ability to connect an on-prem network to a managed instance, connect a managed instance to a linked server or other on-prem data store, and connect a managed instance to other resources. You can additionally enable a public endpoint so you can connect to managed instance from the Internet without VPN. This access is disabled by default.
The principle of private endpoints through virtual network isolation is making its way to Azure SQL DB in something called 'private link' (currently in public preview), and you can learn more [here](https://docs.microsoft.com/en-us/azure/private-link/private-link-overview).
More information on connectivity for Azure SQL DB can be found [here](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-connectivity-architecture) and for Azure SQL MI [here](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance-connectivity-architecture). There will also be more on this topic in upcoming sections/modules.
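The same firewall choices can also be scripted. As a hedged sketch (placeholder names and IP address), a client IP rule is just a single-address range, and the special `0.0.0.0` rule is how "Allow Azure services and resources to access this server" is represented:

```bash
# Sketch: allow a specific client IP address (placeholder values)
az sql server firewall-rule create \
    --resource-group <ResourceGroup> \
    --server <ServerName> \
    --name AllowMyClientIP \
    --start-ip-address <YourPublicIP> \
    --end-ip-address <YourPublicIP>

# Sketch: the 0.0.0.0 rule corresponds to "Allow Azure services and resources to access this server"
az sql server firewall-rule create \
    --resource-group <ResourceGroup> \
    --server <ServerName> \
    --name AllowAllWindowsAzureIps \
    --start-ip-address 0.0.0.0 \
    --end-ip-address 0.0.0.0
```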
For now, select **Next : Additional settings**.
**Step 10 - Data source**
In Azure SQL DB, upon deployment you have the option to select the AdventureWorksLT database as the sample in the Azure portal. In Azure SQL MI, however, you deploy the instance first, and then databases inside of it, so there is not an option to have the sample database upon deployment (similar to in SQL Server).
For the workshop, select **Sample**.
**Step 11 - Database collations**
Since we're using the AdventureWorksLT sample, the **database collation is already set**. For a review of collations and how they apply in Azure SQL, continue reading, otherwise **you can skip to Step 12**.
Collations in SQL Server and Azure SQL tell the Database Engine how to treat certain characters and languages. A collation provides the sorting rules, case sensitivity, and accent sensitivity properties for your data. When you're creating a new Azure SQL DB or MI, it's important to first take into account the locale requirements of the data you're working with, because the collation you set will affect the characteristics of many operations in the database. In the SQL Server box product, the default collation is typically determined by the OS locale. In Azure SQL MI, you can set the server collation upon creation of the instance, and it cannot be changed later. The server collation sets the default for all of the databases in that instance of Azure SQL MI, but you can modify the collations at the database and column level. In Azure SQL DB, you cannot set the server collation; it is set to the default (and most common) collation of `SQL_Latin1_General_CP1_CI_AS`, but you can set the database collation. If we break that collation name into chunks:
* `SQL` means it is a SQL Server collation (as opposed to a Windows or Binary collation)
* `Latin1_General` specifies the alphabet/language to use when sorting
* `CP1` references the code page used by the collation
* `CI` means it will be case insensitive, where `CS` is case sensitive
* `AS` means it will be accent sensitive, where `AI` is accent insensitive
There are other options available related to widths, UTF-8, etc., and more details about what you can and can't do with Azure SQL [here](https://docs.microsoft.com/en-us/sql/relational-databases/collations/collation-and-unicode-support?view=sql-server-ver15).
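If you do need a non-default database collation in Azure SQL DB, you can specify it when the database is created. A hedged CLI sketch (placeholder names; the collation shown is only an example):

```bash
# Sketch: create a database with an explicit, non-default collation
az sql db create \
    --resource-group <ResourceGroup> \
    --server <ServerName> \
    --name <DatabaseName> \
    --collation Latin1_General_100_CI_AS
```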
**Step 12 - Opt-in for Advanced Data Security**
When you deploy Azure SQL DB in the portal, you are prompted if you'd like to enable [Advanced Data Security (ADS)](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-advanced-data-security) on a free trial. Select **Start free trial**. After the free trial, it is billed according to the [Azure Security Center Standard Tier pricing](https://azure.microsoft.com/en-us/pricing/details/security-center/). If you choose to enable it, you get functionality related to data discovery and classification, identifying/mitigating potential database vulnerabilities, and threat detection. You'll learn more about these capabilities in the next module (<a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/03-Security.md" target="_blank">03 - Security</a>). In Azure SQL MI, you can enable it on the instance after deployment.
Your "Additional settings" pane should now look similar to the image below.
![](../graphics/additionalsettings.png)
**Step 13 - Tags**
Select **Next : Tags**.
Tags can be used to logically organize Azure resources across a subscription. For example, you can apply the name "Environment" and the value "Development" to this SQL database and database server, but you might use the value "Production" for production resources. This can be helpful for organizing resources for billing or management. You can read more [here](https://docs.microsoft.com/en-us/azure/azure-resource-manager/management/tag-resources).
![](../graphics/tags.png)
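Tags can also be applied outside the portal. As a hedged sketch with the Azure CLI (placeholder names; note that `az resource tag` sets the tag collection on the resource):

```bash
# Sketch: look up the database's resource ID, then apply a tag to it
dbid=$(az sql db show \
    --resource-group <ResourceGroup> \
    --server <ServerName> \
    --name <DatabaseName> \
    --query id --output tsv)

az resource tag --ids "$dbid" --tags Environment=Development
```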
**Step 14 - Review and create**
Finally, select **Next : Review + create**. Here you can review your deployment selections and the [Azure marketplace terms](https://go.microsoft.com/fwlink/?linkid=2045624).
> You also have the option to "Download a template for automation." We won't go into that here, but if you're interested, you can [learn more](https://docs.microsoft.com/en-us/azure/azure-resource-manager/).
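If you do download the template, deploying it later is typically a couple of CLI commands. A sketch under the assumption that you saved the template and parameters files locally (file names are placeholders):

```bash
# Sketch: deploy a downloaded ARM template into an existing resource group
az group deployment create \
    --resource-group <ResourceGroup> \
    --template-file template.json \
    --parameters @parameters.json
```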
Finally, select **Create** to deploy the service.
Soon after selecting Create, you will be redirected to a page that looks like this (below), and where you can monitor the status of your deployment.
![](../graphics/deploymentunderway.png)
And some time later ...
![](../graphics/deploymentunderway2.png)
And finally...
![](../graphics/deploymentunderway3.png)
If, for whatever reason, you navigate away from this page before the deployment completes, you can go to your resource group and select **Deployments**. This will show you the various deployments, their statuses, and more information.
![](../graphics/deploymentstatus.png)
Once your resource has deployed, review the "Overview" pane for the SQL database in the Azure portal and confirm that the Status is "Online."
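You can also confirm the status from the command line. A sketch with the Azure CLI (placeholder names):

```bash
# Sketch: the database should report a status of "Online"
az sql db show \
    --resource-group <ResourceGroup> \
    --server <ServerName> \
    --name <DatabaseName> \
    --query status --output tsv
```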
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="2"><b>Activity 2</a>: Initial connect and comparison</b></p>
**Step 1 - Connect to SQL Server 2019**
Now that everything looks to be up and running in the Azure portal, let's switch to a familiar tool, SQL Server Management Studio (SSMS). Open SSMS and connect, using Windows Authentication, to the local instance of SQL Server 2019 that's running on your Azure VM (if you don't have this, please revisit the prerequisites).
![](../graphics/localconnect.png)
If you completed the prerequisites, expanding the databases and system databases folders should result in a view similar to the following.
![](../graphics/localserver.png)
**Step 2 - Connect to Azure SQL Database**
Next, let's connect to your Azure SQL Database logical server and compare. First, select **Connect > Database Engine**.
![](../graphics/dbengine.png)
For server name, input the name of your Azure SQL Database logical server. You may need to refer to the Azure portal to get this, e.g. *aw-server0406.database.windows.net*.
Change the authentication to **SQL Server Authentication**, and input the corresponding admin Login and Password.
Check the **Remember password** box and select **Connect**.
![](../graphics/connectazsql.png)
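If you'd rather not hunt through the portal for the server name, here is a hedged CLI sketch that returns the fully qualified name and a sqlcmd-style connection string (placeholder names):

```bash
# Sketch: look up the fully qualified server name
az sql server show \
    --resource-group <ResourceGroup> \
    --name aw-server<ID> \
    --query fullyQualifiedDomainName --output tsv

# Sketch: generate a sqlcmd connection string template for the database
az sql db show-connection-string \
    --server aw-server<ID> \
    --name AdventureWorks<ID> \
    --client sqlcmd
```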
Expanding the databases and system databases should result in a view similar to the following.
![](../graphics/azureserver.png)
Spend a few minutes clicking around and exploring the differences, at first glance, between the Azure SQL Database logical server and Azure SQL Database. You won't deploy an Azure SQL Managed Instance as part of this workshop, but the image below shows how Azure SQL Managed Instance would appear in SSMS.
![](../graphics/miserver.png)
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="3"><b>Activity 3</a>: Verify deployment queries</b></p>
Now that you've seen how Azure SQL appears in SSMS, let's explore a tool that may be new to you: Azure Data Studio (ADS). ADS is an open-source tool that provides a lightweight editor and other tools (including Notebooks, which you'll see soon) for interacting with Azure Data Services (including SQL Server on-prem, Azure SQL, Azure Database for PostgreSQL, and more). Let's take a brief tour to get acquainted.
**Step 1 - Open Azure Data Studio and Connect**
Open Azure Data Studio (ADS). When opening for the first time, you'll first be prompted to make a connection.
![](../graphics/adsconnect.png)
Note that you can connect to your local instance of SQL Server 2019 here. Let's do that first. You can also supply a Server group and Name, if you want to group different connections together. For example, when you connect to SQL Server 2019, you might place it in a new Server group called **SQL Server 2019**. Fill in your information and connect to SQL Server 2019 by selecting **Connect**.
![](../graphics/adsconnectss.png)
You'll then go to a page that contains the "Server Dashboard". Select the **Connections** button (red square in the image below) to view your Server groups and connections.
![](../graphics/serverdashboard.png)
Your results should be similar to what you saw in SSMS. Select the **New connection** button in the "Servers" bar.
![](../graphics/newconnection.png)
Now, connect to your Azure SQL Database logical server just as you did in SSMS, but put it in a new Server group called "Azure SQL Database" and select **Connect**.
![](../graphics/adsconnectdb.png)
In your "Connections" tab, under "Servers," you should now see both connections, and you should be able to expand the folders similar to SSMS.
![](../graphics/adsservers.png)
**Step 2 - Set up easy file access with ADS**
Now that you're connected, you might want an easy way to access scripts and Jupyter notebooks. A Jupyter notebook (often referred to simply as a "notebook") is a way of integrating runnable code with text. If you aren't familiar with Jupyter notebooks, you will be soon, and you can check out more details later in the [documentation](https://docs.microsoft.com/en-us/sql/big-data-cluster/notebooks-guidance?view=sql-server-ver15).
First, in ADS, select **File > Open Folder**.
![](../graphics/openfolder.png)
Next, navigate to where the repository with all the workshop resources is located. If you followed the prerequisites, the path should be similar to `C:\Users\<vm-username>\sqlworkshops\AzureSQLWorkshop`. Once you're there, select **Select Folder**.
![](../graphics/selectfolder.png)
Next, select the **Explorer** icon from the left taskbar to navigate through the files in the workshop.
![](../graphics/explorer.png)
Throughout the workshop, you'll be instructed at various points to open a notebook (file ending in `.ipynb`), and you can access those from here directly.
**Step 3 - Verify deployment queries**
Once you've deployed an instance of SQL (be it Azure SQL or SQL Server), there are typically some queries you would run to verify your deployment. In Azure SQL, some of these queries differ from SQL Server. In this step, you'll see what changes from SQL Server, how it changes, and what is new.
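The notebook walks through these queries in detail, but as a flavor of what "verify" can mean, here is a hedged sketch you could also run from a command line with sqlcmd (server, database, and credentials are placeholders; `SERVERPROPERTY('EngineEdition')` returns 5 for Azure SQL Database):

```bash
# Sketch: a couple of typical verification queries, run via sqlcmd (placeholder values)
sqlcmd -S aw-server<ID>.database.windows.net -d AdventureWorks<ID> -U cloudadmin -P '<Password>' \
    -Q "SELECT @@VERSION AS version, SERVERPROPERTY('Edition') AS edition, SERVERPROPERTY('EngineEdition') AS engine_edition;"
```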
For this step, you'll use the notebook **VerifyDeployment.ipynb** which is under `azuresqlworkshop\02-DeployAndConfigure\verifydeployment\VerifyDeployment.ipynb`. Navigate to that file in ADS to complete this activity, and then return here.
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="2.3">2.3 Configure</h2></a>
TODO: Put in text here that talks about the process to configure, and configure/deploy databases with Azure SQL comparing this to SQL Server
<br>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><b>(Bonus) <a name="4">Activity 4</a>: Configure with Azure CLI</b></p>
So you've seen the Azure portal, SSMS, and SQL Notebooks in ADS, but there are other tools available for managing Azure SQL. Two of the most popular are the [Azure CLI](https://docs.microsoft.com/en-us/cli/azure/?view=azure-cli-latest) and [Azure PowerShell](https://docs.microsoft.com/en-us/powershell/azure/?view=azps-3.3.0). They are similar in functionality, but for this workshop we will focus on the Azure CLI.
To complete this activity, you'll use a PowerShell notebook, which is the same concept as a SQL notebook, but the coding language is PowerShell. You can use PowerShell notebooks to leverage Azure CLI or Azure PowerShell, but we will focus on Azure CLI. For more information on the Azure PowerShell module, [see the documentation](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-powershell-samples?tabs=single-database). For both of these tools, you can also use the [Azure Cloud Shell](https://docs.microsoft.com/en-us/azure/cloud-shell/overview), which is an interactive shell environment that you can use through your browser in the Azure portal.
In the example that follows, you'll also explore the latency effects of using different connection policies in Azure SQL.
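The heart of that exercise is the connection policy command group. As a sketch (shown here with explicit, placeholder resource group and server parameters; the notebook may rely on configured defaults instead):

```bash
# Sketch: inspect the current connection policy, then switch it to Redirect
az sql server conn-policy show \
    --resource-group <ResourceGroup> \
    --server <ServerName>

az sql server conn-policy update \
    --resource-group <ResourceGroup> \
    --server <ServerName> \
    --connection-type Redirect
```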
For this activity, you'll use the notebook called **AzureCli.ipynb** which is under `azuresqlworkshop\02-DeployAndConfigure\cli\AzureCli.ipynb`. Navigate to that file in ADS to complete this activity, and then return here.
<br>
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="2.4">2.4 Load data</h2></a>
TODO: Put in text here that talks about the process to load data with Azure SQL comparing this to SQL Server
<br>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><b>(Bonus) <a name="5">Activity 5</a>: Load data into Azure SQL Database</b></p>
In this activity, you'll explore one scenario for bulk loading data from Azure Blob storage using T-SQL and Shared Access Signatures (SAS) into Azure SQL Database.
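The T-SQL in the notebook authenticates to Blob storage with a SAS token. One hedged way to produce such a token with the Azure CLI is sketched below (storage account, container, key, and expiry are placeholders; read and list permissions are enough for bulk loading):

```bash
# Sketch: generate a read/list SAS token for the container that holds the .dat files
az storage container generate-sas \
    --account-name <StorageAccountName> \
    --account-key <StorageAccountKey> \
    --name <ContainerName> \
    --permissions rl \
    --expiry <yyyy-MM-ddTHH:mmZ> \
    --output tsv
```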
For this activity, you'll use the notebook called **LoadData.ipynb** which is under `azuresqlworkshop\02-DeployAndConfigure\loaddata\LoadData.ipynb`. Navigate to that file in ADS to complete this activity, and then return here.
In this module and throughout the activities, you learned how to deploy and configure Azure SQL. In the next module, you'll dive in to security for Azure SQL.
<p style="border-bottom: 1px solid lightgrey;"></p>
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/owl.png"><b>For Further Study</b></p>
<ul>
<li><a href="https://docs.microsoft.com/en-us/sql/dma/dma-sku-recommend-sql-db?view=sql-server-ver15" target="_blank">Data Migration Assistant tool (DMA) SKU Recommender</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance-get-started" target="_blank">Quickstart: Create an Azure SQL Managed Instance</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance-migrate" target="_blank">How to migrate to Azure SQL Managed Instance</a></li>
</ul>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="https://github.com/microsoft/sqlworkshops/blob/master/graphics/geopin.png?raw=true"><b >Next Steps</b></p>
Next, continue to <a href="https://github.com/microsoft/sqlworkshops/blob/master/azuresqlworkshop/azuresqlworkshop/03-Security.md" target="_blank"><i> 03 - Security</i></a>.

View file

@ -19,7 +19,7 @@
"source": [
"# Activity 4: Azure CLI - Azure SQL Database\r\n",
"\r\n",
"#### <i>The Azure SQL Workshop - Module 1</i>\r\n",
"#### <i>The Azure SQL Workshop - Module 2</i>\r\n",
"\r\n",
"<p style=\"border-bottom: 1px solid lightgrey;\"></p>\r\n",
"\r\n",
@ -28,7 +28,7 @@
"\r\n",
"**Set up** \r\n",
"\r\n",
"0. You should have opened this file using Azure Data Studio. If you didn't, please refer to Module 1 Activity 3 in the readme.md file to get set up. \r\n",
"0. You should have opened this file using Azure Data Studio. If you didn't, please refer to Module 2 Activity 3 in the readme.md file to get set up. \r\n",
"1. In the bar at the top of this screen, confirm or change the \"Kernel\" to **PowerShell**. This determines what language the code blocks in the file are. In this case, that language is SQL. \r\n",
"2. You may be prompted to install Python, if you are select **New Python Installation**. This may take a few minutes, you'll see the output in a window that appears at the bottom of ADS.\r\n",
"TODO THIS TAKES ~5 minutes. \r\n",
@ -50,7 +50,8 @@
"az login"
],
"metadata": {
"azdata_cell_guid": "12d396b2-92c4-483b-a741-b9aa84807d29"
"azdata_cell_guid": "12d396b2-92c4-483b-a741-b9aa84807d29",
"tags": []
},
"outputs": [],
"execution_count": 1
@ -77,8 +78,14 @@
"azdata_cell_guid": "e91e1c9a-7175-45e0-879b-0f6ef150db3c",
"tags": []
},
"outputs": [],
"execution_count": 3
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": ""
}
],
"execution_count": 2
},
{
"cell_type": "markdown",
@ -99,7 +106,7 @@
"azdata_cell_guid": "828e608f-fadd-40a7-8ac2-1f5394a81412"
},
"outputs": [],
"execution_count": 4
"execution_count": 3
},
{
"cell_type": "markdown",
@ -120,12 +127,12 @@
"azdata_cell_guid": "85457b28-ac25-40a1-8dea-5cd1ca8a70f2"
},
"outputs": [],
"execution_count": 5
"execution_count": 4
},
{
"cell_type": "markdown",
"source": [
"You can also determine the usage."
"You can also determine the database suze and usage."
],
"metadata": {
"azdata_cell_guid": "9439704f-73d5-4c60-a422-233e2062dcff"
@ -141,12 +148,12 @@
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": "[\n {\n \"currentValue\": 29360128.0,\n \"displayName\": \"Database Size\",\n \"limit\": 34359738368.0,\n \"name\": \"database_size\",\n \"nextResetTime\": null,\n \"resourceName\": \"AdventureWorks0406\",\n \"unit\": \"Bytes\"\n },\n {\n \"currentValue\": 33554432.0,\n \"displayName\": \"Database Allocated Size\",\n \"limit\": 34359738368.0,\n \"name\": \"database_allocated_size\",\n \"nextResetTime\": null,\n \"resourceName\": \"AdventureWorks0406\",\n \"unit\": \"Bytes\"\n }\n]\n",
"output_type": "stream"
"text": "[\n {\n \"currentValue\": 556793856.0,\n \"displayName\": \"Database Size\",\n \"limit\": 34359738368.0,\n \"name\": \"database_size\",\n \"nextResetTime\": null,\n \"resourceName\": \"AdventureWorks0406\",\n \"unit\": \"Bytes\"\n },\n {\n \"currentValue\": 1610612736.0,\n \"displayName\": \"Database Allocated Size\",\n \"limit\": 34359738368.0,\n \"name\": \"database_allocated_size\",\n \"nextResetTime\": null,\n \"resourceName\": \"AdventureWorks0406\",\n \"unit\": \"Bytes\"\n }\n]\n"
}
],
"execution_count": 7
"execution_count": 5
},
{
"cell_type": "markdown",
@ -198,37 +205,65 @@
"azdata_cell_guid": "1af59f8b-afe3-43c5-863d-3549f06871ed"
},
"outputs": [],
"execution_count": 8
"execution_count": 6
},
{
"cell_type": "markdown",
"source": [
"So the results above tell us the connection type is \"Default\". What if we want to make everything `Redirect` so we can achieve reduced latency? It's as easy as running two commands. \r\n",
"\r\n",
"First, we need to update the firewall rules to allow the required ports."
"So the results above tell us the connection type is \"Default\". Let's set it to \"Proxy\" and determine the round trip time."
],
"metadata": {
"azdata_cell_guid": "7eb584ef-cc42-4dc8-920d-1b049f33fe0c"
"azdata_cell_guid": "57406599-f0e7-4862-ae96-feb2bf4b3d04"
}
},
{
"cell_type": "code",
"source": [
"az sql server firewall-rule create -n NewRule --start-ip-address 11000 --end-ip-address 11999 # ?????????????????????????????????????????????????????????????"
"# update policy\r\n",
"az sql server conn-policy update --connection-type Proxy\r\n",
"# confirm update\r\n",
"az sql server conn-policy show"
],
"metadata": {
"azdata_cell_guid": "af7d38e8-5cb8-4ef1-8718-b871403943ca"
"azdata_cell_guid": "5114ebd7-b37d-4cd8-abbc-4a00ba77f798"
},
"outputs": [],
"execution_count": 0
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": "{\n \"connectionType\": \"Proxy\",\n \"id\": \"/subscriptions/227e9423-1792-43b0-82e6-ac94397ed789/resourceGroups/azuresqlworkshop0406/providers/Microsoft.Sql/servers/aw-server0406/connectionPolicies/default\",\n \"kind\": null,\n \"location\": null,\n \"name\": \"default\",\n \"resourceGroup\": \"azuresqlworkshop0406\",\n \"type\": \"Microsoft.Sql/servers/connectionPolicies\"\n}\n{\n \"connectionType\": \"Proxy\",\n \"id\": \"/subscriptions/227e9423-1792-43b0-82e6-ac94397ed789/resourceGroups/azuresqlworkshop0406/providers/Microsoft.Sql/servers/aw-server0406/connectionPolicies/default\",\n \"kind\": null,\n \"location\": \"West US\",\n \"name\": \"default\",\n \"resourceGroup\": \"azuresqlworkshop0406\",\n \"type\": \"Microsoft.Sql/servers/connectionPolicies\"\n}\n"
}
],
"execution_count": 16
},
{
"cell_type": "markdown",
"source": [
"Then, we can update the connection policy"
"If you want to test round trip time, you can connect with SSMS, create a new query (below), and choose to \"Include Client Statistics\" in your results. In the results, the \"Wait time on server replies\" is the best indicator of network latency. You can run this a few times to get a good average. \r\n",
"\r\n",
"```sql\r\n",
"-- Proxy\r\n",
"SELECT * FROM SalesLT.Product\r\n",
"GO 20\r\n",
"```\r\n",
"\r\n",
"After 10 trials, I had an average wait time on server replies of `46.6000`"
],
"metadata": {
"azdata_cell_guid": "fe00029a-e099-47f3-932f-7e62c4e4e9ee"
"azdata_cell_guid": "77286271-b423-407a-9b8e-4c96d80b6d0e"
}
},
{
"cell_type": "markdown",
"source": [
"What if we want to make everything `Redirect` so we can attempt to achieve reduced latency?\r\n",
"\r\n",
"First, for anything that is **outside Azure**, you need to allow inbound and outbound communication on ports in the range of 11000 - 11999. This is required for the Redirect connection policy. Since you are connecting through an Azure VM, there is no action here. \r\n",
"\r\n",
"Update the connection policy and confirm that update with the following two commands. "
],
"metadata": {
"azdata_cell_guid": "7eb584ef-cc42-4dc8-920d-1b049f33fe0c"
}
},
{
@ -242,8 +277,68 @@
"metadata": {
"azdata_cell_guid": "c540779a-9c4d-472f-8ddc-6fc59b8437d4"
},
"outputs": [],
"execution_count": 0
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": "{\n \"connectionType\": \"Redirect\",\n \"id\": \"/subscriptions/227e9423-1792-43b0-82e6-ac94397ed789/resourceGroups/azuresqlworkshop0406/providers/Microsoft.Sql/servers/aw-server0406/connectionPolicies/default\",\n \"kind\": null,\n \"location\": null,\n \"name\": \"default\",\n \"resourceGroup\": \"azuresqlworkshop0406\",\n \"type\": \"Microsoft.Sql/servers/connectionPolicies\"\n}\n{\n \"connectionType\": \"Redirect\",\n \"id\": \"/subscriptions/227e9423-1792-43b0-82e6-ac94397ed789/resourceGroups/azuresqlworkshop0406/providers/Microsoft.Sql/servers/aw-server0406/connectionPolicies/default\",\n \"kind\": null,\n \"location\": \"West US\",\n \"name\": \"default\",\n \"resourceGroup\": \"azuresqlworkshop0406\",\n \"type\": \"Microsoft.Sql/servers/connectionPolicies\"\n}\n"
}
],
"execution_count": 17
},
{
"cell_type": "markdown",
"source": [
"Now, to test network latency from the `Redirect` policy, connect with SSMS, create a new query (below), and choose to \"Include Client Statistics\" in your results. Compare the \"Wait time on server replies\" with your query for `Proxy`. \r\n",
"\r\n",
"> Note: you'll need to create a **new connection to query** to evaluate (i.e. right-click on the Adventure Works database and select New Query, do not use the query window you used to test Proxy).\r\n",
"\r\n",
"```sql\r\n",
"-- Redirect\r\n",
"SELECT * FROM SalesLT.Product\r\n",
"GO 20\r\n",
"```\r\n",
"\r\n",
"After 10 trials, I have an average wait time on server replies of `25.8000`, which is almost half that of the Proxy connection policy. \r\n",
"\r\n",
"### To review\r\n",
"\r\n",
"Redirect is faster because after the intial connection, you can bypass the gateway and go straight to the database. This means less hops, which results in less latency, which ultimately helps in preventing bottlenecks (especially important for chatty applications).\r\n",
"\r\n",
""
],
"metadata": {
"azdata_cell_guid": "1b519a96-f1ea-4454-a081-2a7ffc28c38a"
}
},
{
"cell_type": "markdown",
"source": [
"To set it back to default (or change to Proxy), you can use the same set of commands."
],
"metadata": {
"azdata_cell_guid": "45a85215-ed6c-447f-8c06-7f59220e9915"
}
},
{
"cell_type": "code",
"source": [
"# update policy\r\n",
"az sql server conn-policy update --connection-type Default\r\n",
"# confirm update\r\n",
"az sql server conn-policy show"
],
"metadata": {
"azdata_cell_guid": "34f23b31-0252-436f-abdb-f0b7208f44b7"
},
"outputs": [
{
"output_type": "stream",
"name": "stdout",
"text": "{\n \"connectionType\": \"Default\",\n \"id\": \"/subscriptions/227e9423-1792-43b0-82e6-ac94397ed789/resourceGroups/azuresqlworkshop0406/providers/Microsoft.Sql/servers/aw-server0406/connectionPolicies/default\",\n \"kind\": null,\n \"location\": null,\n \"name\": \"default\",\n \"resourceGroup\": \"azuresqlworkshop0406\",\n \"type\": \"Microsoft.Sql/servers/connectionPolicies\"\n}\n{\n \"connectionType\": \"Default\",\n \"id\": \"/subscriptions/227e9423-1792-43b0-82e6-ac94397ed789/resourceGroups/azuresqlworkshop0406/providers/Microsoft.Sql/servers/aw-server0406/connectionPolicies/default\",\n \"kind\": null,\n \"location\": \"West US\",\n \"name\": \"default\",\n \"resourceGroup\": \"azuresqlworkshop0406\",\n \"type\": \"Microsoft.Sql/servers/connectionPolicies\"\n}\n"
}
],
"execution_count": 18
}
]
}

View file

@ -0,0 +1,407 @@
{
"metadata": {
"kernelspec": {
"name": "SQL",
"display_name": "SQL",
"language": "sql"
},
"language_info": {
"name": "sql",
"version": ""
}
},
"nbformat_minor": 2,
"nbformat": 4,
"cells": [
{
"cell_type": "markdown",
"source": [
"# Activity 5: Load data - Azure SQL Database\r\n",
"\r\n",
"#### <i>The Azure SQL Workshop - Module 2</i>\r\n",
"\r\n",
"<p style=\"border-bottom: 1px solid lightgrey;\"></p>\r\n",
"\r\n",
"In this activity, you'll get to see how you can bulk load data into Azure SQL Database. \r\n",
"\r\n",
"\r\n",
"**Set up - Attach the notebook to Azure SQL Database** \r\n",
"\r\n",
"0. You should have opened this file using Azure Data Studio. If you didn't, please refer to Module 2 Activity 3 in the main Module 2 file to get set up. \r\n",
"1. In the bar at the top of this screen, confirm or change the \"Kernel\" to **SQL**. This determines what language the code blocks in the file are. In this case, that language is SQL. \r\n",
"2. For \"Attach to\", use the drop-down to select **Change Connection**. From the Recent Connections pane, you should be able to select your Azure SQL Database logical server. \r\n",
"\r\n",
"Now that you're set up, you should read the text cells and \"Run\" the code cells by selecting the play button that appears in the left of a code cell when you hover over it. \r\n",
"> Some of the cells have been run before, this is just to show you the expected result from the testing of the labs. If you choose not to complete the labs/prerequisites, do not run any cells, just review the results. \r\n",
""
],
"metadata": {
"azdata_cell_guid": "2c06b521-aaf9-41a8-9824-f06a3fb12e2c"
}
},
{
"cell_type": "markdown",
"source": [
"When you're bulk loading data, it has to come from somewhere. In Azure, it's very common to store or dump data into an [Azure Blob Storage](https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blobs-introduction) because Blob storage is optimized for storing massive amounts of unstructured data at a relatively low cost. \r\n",
"\r\n",
"In this scenario, AdventureWorks is receiving store return data based on store identification number (e.g. 1, 2, etc.) This return data is being stored in `.dat` files which are then pushed into Azure Blob storage. \r\n",
"\r\n",
"Within blob storage, there exists three types of resources: \r\n",
"* Storage account: this provides a unique namespace for a storage account, and a way to connect or access it \r\n",
"* Containers: these are used to organize a set of blobs. A storage account can have an unlimited number of containers \r\n",
"* Blobs: there are several types of blobs but we will use Block blobs that can store text and binary data that can be managed individually. \r\n",
"\r\n",
"Now, once the data is in blob storage, Azure SQL needs a way to access it. You can do that by [creating an external data source](https://docs.microsoft.com/en-us/sql/t-sql/statements/create-external-data-source-transact-sql?view=azuresqldb-current) that has access to the Azure Storage account. \r\n",
"\r\n",
"You can [control access to Azure Storage accounts](https://docs.microsoft.com/en-us/azure/storage/common/storage-account-overview?toc=%2fazure%2fstorage%2fblobs%2ftoc.json#control-access-to-account-data) through Azure Active Directory, Shared Key authorization, or with a Shared access signature (SAS). The link points to more details, but we will use SAS for this exercise. \r\n",
"\r\n",
"If you want to read more about how SAS works with regards to Azure Storage, please [read here](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview) before continuing. \r\n",
"\r\n",
""
],
"metadata": {
"azdata_cell_guid": "8cdf43ef-a2ce-42b1-af4c-4cb2669440f3"
}
},
{
"cell_type": "markdown",
"source": [
"**Step 1 - Create a table and schema** \r\n",
"\r\n",
"First, we need to create a table and schema for our data to be loaded into. This is pretty straightforward, good old-fashioned T-SQL."
],
"metadata": {
"azdata_cell_guid": "234764b7-3174-401f-b105-66f22bee5ab3"
}
},
{
"cell_type": "code",
"source": [
"IF SCHEMA_ID('DataLoad') IS NULL \r\n",
"EXEC ('CREATE SCHEMA DataLoad')\r\n",
"\r\n",
"CREATE TABLE DataLoad.store_returns\r\n",
"(\r\n",
" sr_returned_date_sk bigint,\r\n",
" sr_return_time_sk bigint,\r\n",
" sr_item_sk bigint ,\r\n",
" sr_customer_sk bigint,\r\n",
" sr_cdemo_sk bigint,\r\n",
" sr_hdemo_sk bigint,\r\n",
" sr_addr_sk bigint,\r\n",
" sr_store_sk bigint,\r\n",
" sr_reason_sk bigint,\r\n",
" sr_ticket_number bigint ,\r\n",
" sr_return_quantity integer,\r\n",
" sr_return_amt float,\r\n",
" sr_return_tax float,\r\n",
" sr_return_amt_inc_tax float,\r\n",
" sr_fee float,\r\n",
" sr_return_ship_cost float,\r\n",
" sr_refunded_cash float,\r\n",
" sr_reversed_charge float,\r\n",
" sr_store_credit float,\r\n",
" sr_net_loss float\r\n",
"\r\n",
") "
],
"metadata": {
"azdata_cell_guid": "46df44da-7cb9-48c1-b071-fee30b67d4d1"
},
"outputs": [
{
"output_type": "display_data",
"data": {
"text/html": "Commands completed successfully."
},
"metadata": {}
},
{
"output_type": "display_data",
"data": {
"text/html": "Total execution time: 00:00:00.011"
},
"metadata": {}
}
],
"execution_count": 3
},
{
"cell_type": "markdown",
"source": [
"**Step 2 - Create a `MASTER KEY`** \r\n",
"\r\n",
"Leveraging [an example in the docs](https://docs.microsoft.com/en-us/sql/t-sql/statements/bulk-insert-transact-sql?view=sql-server-ver15#f-importing-data-from-a-file-in-azure-blob-storage) you learn that a `MASTER KEY` is required to create a `DATABASE SCOPED CREDENTIAL` since the blob storage is not configured to allow public (anonymous) access. \r\n",
"\r\n",
"So, let's first create a `MASTER KEY`"
],
"metadata": {
"azdata_cell_guid": "c66497ba-8fc8-4296-829e-562ccc6a942b"
}
},
{
"cell_type": "code",
"source": [
"CREATE MASTER KEY \r\n",
"ENCRYPTION BY PASSWORD='MyComplexPassword00!';"
],
"metadata": {
"azdata_cell_guid": "65b4c01a-9bec-417b-a471-4b99c4e28cea"
},
"outputs": [
{
"output_type": "display_data",
"data": {
"text/html": "Commands completed successfully."
},
"metadata": {}
},
{
"output_type": "display_data",
"data": {
"text/html": "Total execution time: 00:00:00.017"
},
"metadata": {}
}
],
"execution_count": 4
},
{
"cell_type": "markdown",
"source": [
"**Step 3 - Create a `DATABASE SCOPED CREDENTIAL`** \r\n",
"\r\n",
"A `MASTER KEY` is required to create a `DATABASE SCOPED CREDENTIAL`, which we can now create. The credential refers to the Azure blob storage account and the `data/` portion specifies the container where the store return data is located. \r\n",
"\r\n",
"We use `SHARED ACCESS SIGNATURE` as the identity which SQL knows how to interpret, and the secret provided is the SAS token that you can generate from the Azure blob storage account. \r\n",
"\r\n",
"> Note: the `?` at the beginning of the SAS token should be removed \r\n",
"\r\n",
"> Note: if you are completing this as part of an in-person workshop and were provided an environment to use, please refer to instructor guidance to obtain the SAS token. Otherwise, please refer to the **PREREQS TODO**.\r\n",
""
],
"metadata": {
"azdata_cell_guid": "68c55330-e433-4526-a62f-904660fb8adb"
}
},
{
"cell_type": "code",
"source": [
"CREATE DATABASE SCOPED CREDENTIAL [https://azuresqlworkshopsa.blob.core.windows.net/data/]\r\n",
"WITH IDENTITY = 'SHARED ACCESS SIGNATURE',\r\n",
"SECRET = 'redacted';"
],
"metadata": {
"azdata_cell_guid": "26c0a508-595d-4ead-a680-b8ea422a8d68",
"tags": []
},
"outputs": [
{
"output_type": "display_data",
"data": {
"text/html": "Commands completed successfully."
},
"metadata": {}
},
{
"output_type": "display_data",
"data": {
"text/html": "Total execution time: 00:00:00.014"
},
"metadata": {}
}
],
"execution_count": 5
},
{
"cell_type": "markdown",
"source": [
"**Step 4 - Create an external data source to the container** \r\n",
"\r\n",
"> Note: `LOCATION` doesn't have a trailing `/`, even through the `CREDENTIAL` does."
],
"metadata": {
"azdata_cell_guid": "3e65516c-6c05-4cc4-b8f4-310d9fe41da2"
}
},
{
"cell_type": "code",
"source": [
"CREATE EXTERNAL DATA SOURCE dataset\r\n",
"WITH \r\n",
"(\r\n",
" TYPE = BLOB_STORAGE,\r\n",
" LOCATION = 'https://azuresqlworkshopsa.blob.core.windows.net/data',\r\n",
" CREDENTIAL = [https://azuresqlworkshopsa.blob.core.windows.net/data/]\r\n",
");"
],
"metadata": {
"azdata_cell_guid": "e8e3ad86-2f58-41ef-a568-18ffc9128438"
},
"outputs": [
{
"output_type": "display_data",
"data": {
"text/html": "Commands completed successfully."
},
"metadata": {}
},
{
"output_type": "display_data",
"data": {
"text/html": "Total execution time: 00:00:01.471"
},
"metadata": {}
}
],
"execution_count": 6
},
{
"cell_type": "markdown",
"source": [
"**Step 5 - `BULK INSERT` a single file** \r\n",
"\r\n",
"You're finally ready to `BULK INSERT` one of the store return files. \r\n",
"\r\n",
"Review the comments before running the following cell."
],
"metadata": {
"azdata_cell_guid": "7d0ffa7d-660a-48c1-aa6b-2a295aff2e30"
}
},
{
"cell_type": "code",
"source": [
"SET NOCOUNT ON -- Reduce network traffic by stopping the message that shows the number of rows affected\r\n",
" BULK INSERT DataLoad.store_returns -- Table you created in Step 1\r\n",
" FROM 'dataset/store_returns/store_returns_1.dat' -- Within the container, the location of the file\r\n",
" WITH (\r\n",
"\t\t\tDATA_SOURCE = 'dataset' -- Using the External data source from Step 4\r\n",
"\t\t\t,DATAFILETYPE = 'char' \r\n",
"\t ,FIELDTERMINATOR = '\\|' \r\n",
"\t ,ROWTERMINATOR = '\\|\\n' \r\n",
" ,BATCHSIZE=100000 -- Reduce network traffic by inserting in batches\r\n",
" , TABLOCK -- Minimize number of log records for the insert operation\r\n",
" )"
],
"metadata": {
"azdata_cell_guid": "bcc16f9e-1fc5-4f51-8cd2-11d47da4b24d"
},
"outputs": [
{
"output_type": "display_data",
"data": {
"text/html": "Commands completed successfully."
},
"metadata": {}
},
{
"output_type": "display_data",
"data": {
"text/html": "Total execution time: 00:01:07.828"
},
"metadata": {}
}
],
"execution_count": 7
},
{
"cell_type": "markdown",
"source": [
"In the **Module 4: Performance**, there will be an opportunity to explore how you can improve your throughput and performance of bulk loading activities. \r\n",
"\r\n",
"For now, let's check how many rows were inserted into our table:"
],
"metadata": {
"azdata_cell_guid": "96f0dcbb-7bf6-456e-b5d8-53763eae3630"
}
},
{
"cell_type": "code",
"source": [
"select count(*) from DataLoad.store_returns"
],
"metadata": {
"azdata_cell_guid": "36c0c58f-b0a1-4854-88d1-e576e31b37d0",
"tags": []
},
"outputs": [
{
"output_type": "display_data",
"data": {
"text/html": "Commands completed successfully."
},
"metadata": {}
},
{
"output_type": "display_data",
"data": {
"text/html": "Total execution time: 00:00:01.460"
},
"metadata": {}
},
{
"output_type": "execute_result",
"metadata": {},
"execution_count": 8,
"data": {
"application/vnd.dataresource+json": {
"schema": {
"fields": [
{
"name": "(No column name)"
}
]
},
"data": [
{
"0": "2807797"
}
]
},
"text/html": "<table><tr><th>(No column name)</th></tr><tr><td>2807797</td></tr></table>"
}
}
],
"execution_count": 8
},
{
"cell_type": "markdown",
"source": [
"If you want to run throught the exercise again, run the following code to reset what you've done."
],
"metadata": {
"azdata_cell_guid": "3ab4cac4-9c02-4ee1-818d-c6cbaa54ca6e"
}
},
{
"cell_type": "code",
"source": [
"DROP EXTERNAL DATA SOURCE dataset\r\n",
"DROP DATABASE SCOPED CREDENTIAL [https://azuresqlworkshopsa.blob.core.windows.net/data/]\r\n",
"DROP TABLE DataLoad.store_returns\r\n",
"DROP MASTER KEY"
],
"metadata": {
"azdata_cell_guid": "297d59bb-08be-4680-94e4-028161ac0b4e"
},
"outputs": [
{
"output_type": "display_data",
"data": {
"text/html": "Commands completed successfully."
},
"metadata": {}
},
{
"output_type": "display_data",
"data": {
"text/html": "Total execution time: 00:00:00.042"
},
"metadata": {}
}
],
"execution_count": 2
}
]
}

View file

@ -0,0 +1,17 @@
# Module 2 Activities - Deploy and Configure
These represent demos and examples you can run that accompany Module 2. See [Module 2](../02-DeployAndConfigure.md) for details on how to use the files in this module.
## verifydeployment
Run the verify deployment queries and review the results across Azure SQL Database, Azure SQL Managed Instance, and SQL Server 2019. Main notebook file [here](./verifydeployment/VerifyDeployment.ipynb).
## cli
Get started managing your Azure SQL resources using the Azure CLI. In the example that follows, you'll also explore the latency effects of using different connection policies in Azure SQL. Main notebook file [here](./cli/AzureCLI.ipynb).
## loaddata
In this activity, you'll explore one scenario for loading data from Azure Blob storage using T-SQL and Shared Access Signatures (SAS). Main notebook file [here](./loaddata/LoadData.ipynb).

File differences are hidden because one or more lines are too long

View file

@ -1,74 +0,0 @@
![](../graphics/microsoftlogo.png)
# The Azure SQL Workshop
#### <i>A Microsoft workshop from the SQL team</i>
<p style="border-bottom: 1px solid lightgrey;"></p>
<img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/textbubble.png"> <h2>02 - Security</h2>
Ensuring security and compliance of your data is always a top priority. In this module, youll learn how to use Azure SQL to secure your data, how to configure logins and users, how to use tools and techniques for monitoring security, how to ensure your data meets industry and regulatory compliance standards, and how to leverage the extra benefits and intelligence that is only available in Azure. Well also cover some of the networking considerations for securing SQL.
In each module you'll get more references, which you should follow up on to learn more. Also watch for links within the text - click on each one to explore that topic.
(<a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/00-Prerequisites.md" target="_blank">Make sure you check out the <b>Prerequisites</b> page before you start</a>. You'll need all of the items loaded there before you can proceed with the workshop.)
In this module, you'll cover these topics:
[2.1](#2.1): TODO
[2.2](#2.2): TODO
[2.3](#2.3): TODO
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="2.1">2.1 TODO: Topic Name</h2></a>
TODO: Topic Description
<br>
<img style="height: 400; box-shadow: 0 4px 8px 0 rgba(0, 0, 0, 0.2), 0 6px 20px 0 rgba(0, 0, 0, 0.19);" src="linkToPictureEndingIn.png">
<br>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><b>Activity: TODO: Activity Name</b></p>
TODO: Activity Description and tasks
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Description</b></p>
TODO: Enter activity description with checkbox
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Steps</b></p>
TODO: Enter activity steps description with checkbox
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="2.2">2.2 TODO: Topic Name</h2></a>
TODO: Topic Description
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><b>Activity: TODO: Activity Name</b></p>
TODO: Activity Description and tasks
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Description</b></p>
TODO: Enter activity description with checkbox
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Steps</b></p>
TODO: Enter activity steps description with checkbox
<p style="border-bottom: 1px solid lightgrey;"></p>
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/owl.png"><b>For Further Study</b></p>
<ul>
<li><a href="url" target="_blank">TODO: Enter courses, books, posts, whatever the student needs to extend their study</a></li>
</ul>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/geopin.png"><b >Next Steps</b></p>
Next, Continue to <a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/03-Performance.md" target="_blank"><i> 03 - Performance</i></a>.

View file

@ -1,7 +0,0 @@
# Module 2 Activities - Security
These represent demos and examples you can run that accompany Module 2. See [Module 2](../02-Security.md) for details on how to use the files in this module.
## verifydeployment TODO
TODO Run the verify deployment queries and review the results across Azure SQL Database, Azure SQL Managed Instance, and SQL Server 2019. Main notebook file [here](./verifydeployment/VerifyDeployment.ipynb).

View file

@ -1,74 +0,0 @@
![](../graphics/microsoftlogo.png)
# The Azure SQL Workshop
#### <i>A Microsoft workshop from the SQL team</i>
<p style="border-bottom: 1px solid lightgrey;"></p>
<img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/textbubble.png"> <h2>03 - Performance</h2>
Youve been responsible for getting your SQL fast, keeping it fast, and making it fast again when something is wrong. In this module, well show you how to leverage your existing performance skills, processes, and tools and apply them to Azure SQL, including taking advantage of the intelligence in Azure to keep your database tuned.
In each module you'll get more references, which you should follow up on to learn more. Also watch for links within the text - click on each one to explore that topic.
(<a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/00-Prerequisites.md" target="_blank">Make sure you check out the <b>Prerequisites</b> page before you start</a>. You'll need all of the items loaded there before you can proceed with the workshop.)
In this module, you'll cover these topics:
[3.1](#3.1): TODO
[3.2](#3.2): TODO
[3.3](#3.3): TODO
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="3.1">3.1 TODO: Topic Name</h2></a>
TODO: Topic Description
<br>
<img style="height: 400; box-shadow: 0 4px 8px 0 rgba(0, 0, 0, 0.2), 0 6px 20px 0 rgba(0, 0, 0, 0.19);" src="linkToPictureEndingIn.png">
<br>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><b>Activity: TODO: Activity Name</b></p>
TODO: Activity Description and tasks
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Description</b></p>
TODO: Enter activity description with checkbox
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Steps</b></p>
TODO: Enter activity steps description with checkbox
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="3.2">3.2 TODO: Topic Name</h2></a>
TODO: Topic Description
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><b>Activity: TODO: Activity Name</b></p>
TODO: Activity Description and tasks
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Description</b></p>
TODO: Enter activity description with checkbox
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Steps</b></p>
TODO: Enter activity steps description with checkbox
<p style="border-bottom: 1px solid lightgrey;"></p>
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/owl.png"><b>For Further Study</b></p>
<ul>
<li><a href="url" target="_blank">TODO: Enter courses, books, posts, whatever the student needs to extend their study</a></li>
</ul>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/geopin.png"><b >Next Steps</b></p>
Next, Continue to <a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/04-Availability.md" target="_blank"><i> 04 - Availability</i></a>.

View file

@ -0,0 +1,625 @@
![](../graphics/microsoftlogo.png)
# The Azure SQL Workshop
#### <i>A Microsoft workshop from the SQL team</i>
<p style="border-bottom: 1px solid lightgrey;"></p>
<img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/textbubble.png"> <h2>03 - Security</h2>
> You must complete the [prerequisites](../azuresqlworkshop/00-Prerequisites.md) before completing these activities. You can also choose to audit the materials if you cannot complete the prerequisites. If you were provided an environment to use for the workshop, then you **do not need** to complete the prerequisites.
Ensuring security and compliance of your data is always a top priority. In this module, youll learn how to use Azure SQL to secure your data, how to configure logins and users, how to use tools and techniques for monitoring security, how to ensure your data meets industry and regulatory compliance standards, and how to leverage the extra benefits and intelligence that is only available in Azure. Well also cover some of the networking considerations for securing SQL.
In this module, you'll cover these topics:
[3.1](#3.1): Platform and network security
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 1](#1): Create and manage firewall rules for Azure SQL Database
[3.2](#3.2): Access management and Authorization
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 2](#2): Getting started with Azure AD authentication
[3.3](#3.3): Information protection and encryption
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 3](#3): Confirm TDE is enabled
[3.4](#3.4): Security management
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 4](#4): Auditing
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 5](#5): Advanced data security
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;(Bonus) [Activity 6](#6): Data classification, Dynamic data masking, and SQL Audit
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="3.1">3.1 Platform and network security</h2></a>
TODO: Put in text here that talks about the process for network security with Azure SQL comparing this to SQL Server
<br>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="1"><b>Activity 1</a>: Create and manage firewall rules for Azure SQL Database</b></p>
In this short activity, you'll see how to review and manage your firewall rules using the Azure portal.
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Description</b></p>
During deployment of Azure SQL Database, you set "Allow Azure services and resources to access this server" to **ON**. If you can, switching it to **OFF** is the most secure configuration, but it can be more complicated, since it means you'll have to specify IP address ranges for all of your connections. In this activity, you'll simply see how to view and edit your firewall rules. In practice, you'll want to partner with your networking team to ensure you have the most secure, functional network. A few handy resources include:
* [Azure SQL Database network access controls](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-networkaccess-overview)
* [Connecting your applications to Managed Instance](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance-connect-app)
* [IP firewall rules for Azure SQL Database](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-firewall-configure)
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Steps</b></p>
**Step 1 - create and manage firewall rules with the Azure portal**
In your Azure virtual machine, navigate to the Azure portal, specifically to your Azure SQL Database logical server. Select **Firewalls and virtual networks** from the left-hand menu.
![](../graphics/fwvn.png)
Switch "Allow Azure services and resources to access this server" to **OFF**. During deployment, you should have added your Client IP address already, but if one of the Rules do not match your Client IP displayed (see below), select **Add Client IP**.
![](../graphics/clientip.png)
Finally, select **Save**.
To confirm you still have access from your Azure VM, navigate to SSMS and refresh your connection to the Azure SQL Database logical server. If no errors occur, you have successfully configured access to your Azure SQL Database logical server for your IP address only.
![](../graphics/dbrefresh.png)
**Step 2 - Create and manage firewall rules with the Azure Cloud Shell**
You can also use the `az sql server firewall-rule` commands to create, delete, and view server-level firewall rules. You can use the Azure CLI from the command line of your Azure VM or through a PowerShell notebook. For this step, you'll experiment with the Azure Cloud Shell.
Return to the Azure portal in your Azure VM. In the top bar, select the Azure Cloud Shell button.
![](../graphics/cloudshell.png)
If this is your first time using the Azure Cloud Shell, you will be prompted to select a subscription to create a storage account and Microsoft Azure Files share. For this workshop, you can just use any of the storage accounts that are in your resource group already. More information about the Azure Cloud Shell can be found in the [documentation](https://docs.microsoft.com/en-us/azure/cloud-shell/overview).
Then, you can select Bash or PowerShell. Select **Bash**. You should see a view similar to below.
![](../graphics/acsbash.png)
Next, run `az account list` to find the name of the subscription you are using for the workshop.
Then, run `az account set --subscription 'my-subscription-name'` to set the default subscription for this Azure Cloud Shell session. You can confirm this worked by running `az account show`.
Now that you're set up, you can list your server's firewall settings with the following command:
```bash
az sql server firewall-rule list -g <ResourceGroup> -s <Server>
```
Your client IP address rule should match what you saw in the previous step using the Azure portal.
![](../graphics/fwlist.png)
There are other commands available for creating, deleting, and updating rules, which you can explore [here](https://docs.microsoft.com/en-us/cli/azure/sql/server/firewall-rule?view=azure-cli-latest).
Note that this method of setting the firewall rules (using the Azure portal or Azure Cloud Shell) grants your client IP address access to all of the databases that are in that logical server. After you've configured the server-level firewall rule, which you did above, you can optionally configure database-level firewall rules that apply to individual databases. This can only be done with T-SQL, using the command `EXECUTE sp_set_database_firewall_rule`. For more information, see the references in the **Description** of this activity.
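As a reference, here's a minimal sketch of a database-level rule in T-SQL; the rule name and IP address below are placeholders, not values from this workshop, and the statement must be run in the context of the individual database:
```sql
-- Sketch only: replace the rule name and IP range with your own values
-- Run in the context of the database the rule should apply to (not master)
EXECUTE sp_set_database_firewall_rule
    @name = N'ExampleDatabaseRule',
    @start_ip_address = '203.0.113.10',
    @end_ip_address = '203.0.113.10';
-- Review the database-level rules currently in place
SELECT * FROM sys.database_firewall_rules;
```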
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="3.2">3.2 Access management and Authorization</h2></a>
TODO: Put in text here that talks about the process to access management with Azure SQL comparing this to SQL Server
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="2"><b>Activity 2</a>: Getting started with Azure AD authentication</b></p>
In this activity, you'll learn how to configure an Azure AD administrator at the server level for Azure SQL Database. Next, you'll change your connection in SSMS from SQL authentication to Azure AD authentication, and you'll see how to grant other Azure AD users access to the database, much as you would for regular users in SQL Server.
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Steps</b></p>
**Step 1 - Create an Azure AD admin**
In the Azure portal, navigate to your Azure SQL Database logical server. In the left-hand task menu, select **Active Directory Admin** and **Set Admin**.
![](../graphics/aadadmin.png)
Search for your account. The easiest way is to type in your full email address. Click your user and then choose **Select**.
![](../graphics/aadselect.png)
You might think that's it, but you still have to select **Save** to confirm your actions.
![](../graphics/aadsave.png)
**Step 2 - Authenticate using Azure AD**
Now that you've configured access for yourself to your Azure SQL Database logical server, let's update the connection in SSMS and ADS.
First, in SSMS, right-click on your Azure SQL Database logical server and select **Connect**.
![](../graphics/dbconnect.png)
Notice that under *Authentication*, there are several different Azure Active Directory authentication methods, which will depend on how your organization is set up. For this workshop, select **Azure Active Directory - Password**.
![](../graphics/connecttoserver.png)
> Note: If you get the following error, this indicates your organization requires you to select **Azure Active Directory - Universal with MFA**. Connect accordingly.
>
> ![](../graphics/cannotconnect.png)
Next to the server name, you should now be able to see that you are authenticated using your Azure AD account and not the `cloudadmin` SQL user as before.
![](../graphics/aadc.png)
**Step 3 - Grant other users access (SQL)**
Now that you're authenticated using Azure AD, your next step might be to add other users. Just as in SQL Server, you can add new SQL users. In SSMS, using your Azure AD connection, right-click on your database and create a new query. Run the following.
> Note: You must right-click on the **database** within your Azure SQL Database logical server. In SQL Server and Azure SQL managed instance, you can query at the server level and use `USE DatabaseName`, but in Azure SQL Database, you must query the database directly; the `USE` statement is not supported.
```sql
-- Create a new SQL user and give them a password
CREATE USER ApplicationUser WITH PASSWORD = 'YourStrongPassword1';
-- Until you run the following two lines, ApplicationUser has no access to read or write data
ALTER ROLE db_datareader ADD MEMBER ApplicationUser;
ALTER ROLE db_datawriter ADD MEMBER ApplicationUser;
```
As you likely already know, the best practice is to create non-admin accounts at the database level, unless they need to be able to execute administrator tasks.
**Step 4 - Grant other users access (Azure AD)**
Azure AD authentication is a little different. From the documentation, "*Azure Active Directory authentication requires that database users are created as contained. A contained database user maps to an identity in the Azure AD directory associated with the database and has no login in the master database. The Azure AD identity can either be for an individual user or a group*."
Additionally, the Azure portal can only be used to create administrators, and Azure RBAC roles don't propagate to Azure SQL Database logical servers, Azure SQL Databases, or Azure SQL Managed Instances. Additional server/database permissions must be granted using T-SQL.
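As a sketch of what that looks like, the contained user below is created from a hypothetical Azure AD group (replace it with a user or group from your own directory) and then granted read access through a database role:
```sql
-- Hypothetical Azure AD group name; substitute a user or group from your directory
CREATE USER [SQL Workshop Readers] FROM EXTERNAL PROVIDER;
-- Grant read access through a database role
ALTER ROLE db_datareader ADD MEMBER [SQL Workshop Readers];
```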
How you complete this next step will depend on how you are consuming this workshop. If you were given an environment, find a neighbor to work with. If you are doing this self-paced, or in a group that is multi-organization, you will just review this step, observing the screenshots.
1. With your neighbor, first determine who will be *Person A* and who will be *Person B*.
2. Both *Person A* and *Person B* should note their Azure VM's **Public IP Address** (can locate this in the Azure portal)
3. *Person A* should run the following T-SQL to authorize *Person B* to their server:
```sql
-- Create the Azure AD user with access to the server
CREATE USER <Person B Azure AD account> FROM EXTERNAL PROVIDER;
-- Create firewall to allow Person B's Azure VM
EXECUTE sp_set_firewall_rule @name = N'AllowPersonB',
@start_ip_address = 'Person B VM Public IP',
@end_ip_address = 'Person B VM Public IP'
```
4. *Person B* should run the following T-SQL to authorize *Person A* to their server:
```sql
-- Create the Azure AD user with access to the server
CREATE USER <Person A Azure AD account> FROM EXTERNAL PROVIDER;
-- Create firewall to allow Person A's Azure VM
EXECUTE sp_set_firewall_rule @name = N'AllowPersonA',
@start_ip_address = 'Person A VM Public IP',
@end_ip_address = 'Person A VM Public IP'
```
5. **Person A** should now try to connect to **Person B**'s Azure SQL Database logical server.
6. **Person B** should now try to connect to **Person A**'s Azure SQL Database logical server.
7. Compare results.
TODO Add screenshots and test with Bob.
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="3.3">3.3 Information protection and encryption</h2></a>
TODO: Put in text here that talks about the process to protect information/encryption with Azure SQL comparing this to SQL Server
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="3"><b>Activity 3</a>: Confirm TDE is enabled</b></p>
This is a quick activity to show you how easily you can confirm that TDE is enabled, or you can enable it if it is not.
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Steps</b></p>
In the Azure portal, navigate to your Azure SQL Database, and in the left-hand menu, under Security, select **Transparent data encryption**. Confirm your database is set to **ON**.
![](../graphics/tdeon.png)
Next, navigate to your Azure SQL Database logical server, and in the left-hand menu, under Security, select **Transparent data encryption**. Notice that you have a different view:
![](../graphics/tdeoption.png)
The default is to let the Azure service manage your key. As it says, Azure will automatically generate a key to encrypt your databases, and manage the key rotations. You've seen how to do this with the Azure portal, but you can also use PowerShell, Azure CLI, T-SQL, or REST APIs. For more details, [refer here](https://docs.microsoft.com/en-us/azure/sql-database/transparent-data-encryption-azure-sql?tabs=azure-portal).
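If you'd rather confirm this with T-SQL, a quick sketch is to query **sys.dm_database_encryption_keys** from your database; an `encryption_state` of 3 indicates the database is encrypted:
```sql
-- Confirm TDE status for the current database
-- encryption_state = 3 means the database is encrypted
SELECT DB_NAME(database_id) AS database_name,
       encryption_state,
       key_algorithm,
       key_length
FROM sys.dm_database_encryption_keys;
```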
Alternatively, you can bring your own key (BYOK) leveraging Azure Key Vault. In this scenario, you (not Azure) are responsible for and in full control of key lifecycle management (key creation, rotation, deletion), key usage permissions, and auditing of operations on keys. For more information regarding Azure SQL TDE with BYOK, please [refer here](https://docs.microsoft.com/en-us/azure/sql-database/transparent-data-encryption-byok-azure-sql?view=sql-server-ver15).
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="3.4">3.4 Security management</h2></a>
TODO: Put in text here that talks about the process for security management with Azure SQL comparing this to SQL Server
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="4"><b>Activity 4</a>: Auditing</b></p>
The auditing feature tracks database events and writes them to an audit log in Azure storage, Azure Monitor logs, or Event Hubs. Auditing helps you maintain regulatory compliance, understand database activity, and gain insight into discrepancies and anomalies that could indicate potential security violations. In this activity, you'll set up auditing at the server level (it's also available at the database level).
> **Aside**: The main differences between auditing in Azure SQL and auditing in SQL Server are:
> * With Azure SQL Database, auditing can be configured at the server or database level, but with Azure SQL Managed Instance and SQL Server it is at the server level.
> * XEvent auditing supports Azure Blob storage targets
> * [SQL Server Auditing](https://docs.microsoft.com/en-us/sql/relational-databases/security/auditing/sql-server-audit-database-engine?view=sql-server-ver15) is only available (with some differences) in Azure SQL Managed Instance
> * For Azure SQL Managed Instance specifically:
> * With `CREATE SERVER AUDIT`, the new `TO URL` and `TO EXTERNAL MONITOR` syntax allows you to specify an Azure Blob storage container and enable Event Hub and Azure Monitor logs targets, respectively.
> * `TO FILE`, the shutdown option, and `QUEUE_DELAY = 0` are not supported in Azure SQL.
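For reference, a rough sketch of that Managed Instance syntax is shown below; the audit name, storage URL, and retention value are placeholders rather than values used in this workshop:
```sql
-- Managed Instance only: sketch of a server audit writing to Azure Blob storage
CREATE SERVER AUDIT [ExampleAudit]
TO URL ( PATH = 'https://<storage-account>.blob.core.windows.net/sqldbauditlogs', RETENTION_DAYS = 7 );
-- Turn the audit on
ALTER SERVER AUDIT [ExampleAudit] WITH (STATE = ON);
```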
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Steps</b></p>
**Step 1 - Enable auditing on the Azure SQL Database logical server**
Open the Azure portal and navigate to your Azure SQL Database. In the left-hand task menu, under Security, select **Auditing**. Review the options and then select **View server settings**. The Microsoft recommendation is to apply auditing at the server level, which then applies to all databases within the Azure SQL Database logical server.
![](../graphics/dbaudit.png)
Next, set **Auditing** to **ON**. Notice you have different options for your log destination, depending how you want to audit your data. In this lab, you'll configure Storage and Log Analytics. In a later activity in this module, you'll get to look at the logs in both. You can also explore the implementations by reviewing [the documentation](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-auditing).
Select **Log Analytics (Preview)** and the **Configure** button.
![](../graphics/serveraudit.png)
Next, select **+ Create New Workspace**.
![](../graphics/newws.png)
Fill in the information according to the subscription, resource group, and location that you are using to complete this workshop. We recommend naming your Log Analytics Workspace **azuresqlworkshopUID-la**, using your unique ID for your resources. Select **OK**.
![](../graphics/laws.png)
This may take a few moments to validate and create. You should now see your Log Analytics account.
Next, select **Storage**. This option allows you to collect XEvent log files in an Azure Blob storage account. In a later activity, you'll see more on how this differs from Log Analytics. Select **Configure**.
![](../graphics/configstorage.png)
Next, select the subscription you're using for this workshop as well as the storage account that was created to be used with Advanced data security (should be *sql* + *a random string of letters and numbers*). In this storage account, auditing logs will be saved as a collection of blob files within a container named **sqldbauditlogs**.
You also have options for the number of days you want to retain data. The default, **0**, means data is retained forever. You can change this if you want to cut back on the storage that may be generated and charged here. For this exercise, input **7**.
Finally, you can choose which storage access key to use. Note that you can use this to switch between keys when it's time to rotate them. Select **Primary**.
After you've configured your options, select **OK**.
![](../graphics/sasql.png)
Select **Save**.
![](../graphics/savela.png)
Once it saves, you can select the **X** button to close the server-level Auditing pane. Back in the Azure SQL Database Auditing pane, you may notice that the **Auditing** option says **OFF**. It's important to note that if auditing is enabled on the server, it always applies to the database.
![](../graphics/dbauditoff.png)
This is the end of this activity. In a later activity in this module, you'll see how to analyze the audit logs with information from Data Discovery & Classification in a Security dashboard as well as in SSMS.
TODO: Topic Description CONTD
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="5"><b>Activity 5</a>: Advanced Data Security</b></p>
Advanced data security (ADS) is a unified package for advanced SQL security capabilities, providing a single go-to location for enabling and managing three main capabilities:
* [Data discovery & classification](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-data-discovery-and-classification)
* [Vulnerability assessment](https://docs.microsoft.com/en-us/azure/sql-database/sql-vulnerability-assessment)
* [Advanced Threat Protection](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-threat-detection-overview)
In this activity, you'll enable ADS and explore some of the features within each of the capabilities mentioned above.
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Steps</b></p>
**Step 1 - Enable ADS**
In the Azure portal, navigate to your Azure SQL Database logical server. Then, in the left-hand menu, under Security, select **Advanced data security**. If you followed the deployment activity in Module 2, ADS should already be enabled. If it is not, select **ON** and select **Save**.
![](../graphics/adson.png)
**Step 2 - ADS server level settings**
In this step, you'll review the selections you've made for your Azure SQL Database logical server. In the same pane as step 1 (Azure SQL Database logical server > Security > Advanced data security), you will also see information regarding Vulnerability Assessments and Advanced Threat Protection.
At the highest level, SQL Vulnerability Assessment (VA) is a scanning service that provides visibility into your security state. It then provides actionable steps to address any potential concerns. When you configure periodic recurring scans, you're enabling the service to scan your databases every seven days and check for any vulnerabilities. You can then choose to send those reports to admins, subscription owners, or anyone else who needs to be notified of changes. For this service to operate, you have to specify a storage account where the results will be stored. This storage account was created when you deployed your Azure SQL Database, because you opted in to ADS. Review the options and add your email address if you want to receive the recurring scan reports.
![](../graphics/vasettings.png)
Lastly, you can configure your Advanced Threat Protection (ATP) settings. ATP enables you to detect and respond to potential threats as they occur by providing security alerts on anomalous activities. To check the ATP alert types available, select **All** under Advanced Threat Protection types.
![](../graphics/atptypes.png)
Just like you can configure who receives the VA scans, you can configure who receives the ATP alerts. Review the options and add your email address if you want to be alerted (recommended for future lab).
![](../graphics/atpsettings.png)
Once you've updated all your settings, don't forget to select **Save**.
![](../graphics/save.png)
Configuring these settings will enable you to complete some of the other steps in this activity, so you'll see more of VA and ATP soon.
**Step 3 - Data Discovery & Classification**
Navigate back to your Azure SQL Database (not the logical server!). In the left-hand menu, under Security, Select **Advanced data security**.
![](../graphics/adsdashboard.png)
First, you'll review Data Discovery & Classification (DD&C). Select the **Data Discovery & Classification** box. This wizard-style view is similar (but not identical) to the Data Discovery & Classification tool that exists in SQL Server today through SSMS. Using the SSMS wizard is **not supported** for Azure SQL, but you can achieve similar functionality here.
Select the information bar that says **We have found XX columns with classification recommendations**.
![](../graphics/recs.png)
DD&C tries to identify potentially sensitive data based on the column names in your tables. Review some of the suggested labels and then select **Select all** and **Accept selected recommendations**.
![](../graphics/ddcrecs.png)
Then, select **Save** near the top left corner.
![](../graphics/save.png)
Finally, select **Overview** to view the overview dashboard and review the classifications you've added.
![](../graphics/ddcoverview.png)
**Step 4 - Vulnerability Assessment**
Select the **X** in the top right corner of DD&C to bring you back to the ADS dashboard. Next, you'll review the Vulnerability Assessment (VA) capabilities. Start by selecting the **Vulnerability Assessment** box.
![](../graphics/adsdashboard2.png)
Next, select **Scan** to get the most current VA results. This will take a few moments, while VA scans all the databases in your Azure SQL Database logical server.
![](../graphics/vascan.png)
Your resulting view should be similar to below.
![](../graphics/vadashboard.png)
Every security risk has a risk level (high, medium, or low) and additional information. Select the security check **VA2065** to get a detailed view, similar to below. Review the status and other available information.
![](../graphics/va20651.png)
![](../graphics/va20652.png)
In this case, VA is suggesting that you configure a baseline of your server level firewall rules. Once you have a baseline, you can then monitor and assess any changes.
Depending on the security check, there will be alternate views and recommendations. For this security check, you can select the **Approve as Baseline** button at the top of the details page.
You can now re-scan your logical server to confirm that you are passing **VA2065**.
![](../graphics/vabaseline.png)
You can then optionally complete another scan and confirm that VA2065 is now showing up as a **Passed** security check.
![](../graphics/va20653.png)
**Step 5 - Advanced Threat Protection overview**
Select the **X** in the top right corner of VA to get back to the ADS dashboard. Select the **Advanced Threat Protection** (ATP) box to drill in and review the results.
![](../graphics/adsdashboard3.png)
Likely, you won't see any security alerts. In the next step, you will run a test that will trigger an alert, so you can review the results in ATP.
**Step 6 - Testing ATP capabilities**
TODO with help of Bob
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><b>(Bonus) <a name="6">Activity 6</a>: Data classification, Dynamic data masking, and SQL Audit</b></p>
In this activity, you will learn how to audit users trying to view columns that were marked for data classification. This activity will combine several of the things you've already learned about in the module, and take those learnings to the next level.
<p><img style="margin: 0px 15px 15px 0px;" src="https://github.com/microsoft/sqlworkshops/blob/master/graphics/checkmark.png?raw=true"><b>Steps</b></p>
**Step 1 - Add new data classification manually**
In the Azure portal, navigate to your Azure SQL Database (not logical server!). In the left-hand menu, under Security, select **Advanced data security** and then select the **Data Discovery & Classification** box.
![](../graphics/adsdashboard4.png)
Next, select the **Classification** tab and then select **+ Add classification**.
![](../graphics/addddc.png)
In a previous activity, you added all the recommended column classifications. In this step, you will *manually* add a potentially sensitive column to the list of classified columns.
In the SalesLT.Customer table, DD&C identified FirstName and LastName as columns to classify, but not MiddleName. Using the drop-downs, add it now. Then, select **Add classification**.
![](../graphics/addddc2.png)
Select **Save**.
![](../graphics/save.png)
You can confirm that this was successful by viewing the **Overview** tab and confirming that MiddleName is now present in the list of classified columns under the SalesLT schema.
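Classifications can also be applied with T-SQL. A minimal sketch for the same column is below; the label and information type are illustrative and may differ from the recommendations the portal uses:
```sql
-- Sketch: classify SalesLT.Customer.MiddleName with T-SQL
-- The label and information type here are illustrative examples
ADD SENSITIVITY CLASSIFICATION TO SalesLT.Customer.MiddleName
WITH (LABEL = 'Confidential', INFORMATION_TYPE = 'Name');
-- Review the classifications currently defined in the database
SELECT * FROM sys.sensitivity_classifications;
```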
**Step 2 - Apply Dynamic Data Masking over the Name columns**
Dynamic Data Masking (DDM) is available in Azure SQL as well as in SQL Server. It limits data exposure by masking sensitive data from non-privileged users. Azure SQL will recommend columns for you to mask, or you can add masks manually. You'll mask the FirstName, MiddleName, and LastName columns, which you reviewed in the previous step.
In the Azure portal, navigate to your Azure SQL Database. In the left-hand menu, under Security, select **Dynamic Data Masking** and then select **+ Add mask**.
![](../graphics/addmask.png)
First, select the **SalesLT** schema, **Customer** table, and **FirstName** column. Then, you can review the options for masking, but the default is good for this scenario. Select **Add** to add the masking rule.
![](../graphics/addmask2.png)
Repeat this for both **MiddleName** and **LastName** in that table.
Afterwards, you should have three masking rules, similar to below.
![](../graphics/addmask3.png)
Select **Save**.
![](../graphics/save.png)
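The same masks can also be defined with T-SQL. A sketch of the equivalent rule for one of the columns, using the `default()` masking function:
```sql
-- Sketch: equivalent T-SQL masking rule for one column
ALTER TABLE SalesLT.Customer
ALTER COLUMN FirstName ADD MASKED WITH (FUNCTION = 'default()');
-- Repeat for MiddleName and LastName, then review the masking rules
SELECT * FROM sys.masked_columns;
```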
**Step 3 - Query classified and masked columns**
Now, navigate to SSMS and create a new query in your AdventureWorks database.
![](../graphics/newquery.png)
> Note: You should be connected to this logical server using Azure AD authentication. If you are connected as `cloudadmin`, create a new connection and connect using Azure AD authentication.
Now, run the following query to return the classified (and in some cases masked) data.
```sql
SELECT TOP 10 FirstName, MiddleName, LastName
FROM SalesLT.Customer
```
You should get a result of the first ten names, with no masking applied. Why? Because you are the admin for this Azure SQL Database logical server.
![](../graphics/names.png)
Now, run the following query to create a new user and run the previous query as that user. You may notice that the first few commands are a repeat from Activity 2, Step 3.
```sql
-- Create a new SQL user and give them a password
CREATE USER Bob WITH PASSWORD = 'gocowboys1!';
-- Until you run the following two lines, Bob has no access to read or write data
ALTER ROLE db_datareader ADD MEMBER Bob;
ALTER ROLE db_datawriter ADD MEMBER Bob;
-- Execute as our new, low-privilege user, Bob
EXECUTE AS USER = 'Bob';
SELECT TOP 10 FirstName, MiddleName, LastName
FROM SalesLT.Customer;
REVERT;
```
Now, you should get a result of the first ten names, but with masking applied. Bob has not been granted access to the unmasked form of this data.
![](../graphics/names2.png)
**Step 4 - Add excluded users from masking**
What if, for some reason, Bob needs access to the names and gets permission to have it?
You can exclude users from masking in the Azure portal (in the Dynamic Data Masking pane under Security), but you can also do it using T-SQL. Use the query below to allow Bob to query the name columns without masking.
```sql
GRANT UNMASK TO Bob;
EXECUTE AS USER = 'Bob';
SELECT TOP 10 FirstName, MiddleName, LastName
FROM SalesLT.Customer;
REVERT;
```
Your results should include the names in full.
![](../graphics/names.png)
Finally, you can also take away a user's unmasking privileges, and confirm that with the following T-SQL.
```sql
-- Remove unmasking privilege
REVOKE UNMASK TO Bob;
-- Execute as Bob
EXECUTE AS USER = 'Bob';
SELECT TOP 10 FirstName, MiddleName, LastName
FROM SalesLT.Customer;
REVERT;
```
Your results should include the masked names.
![](../graphics/names2.png)
**Step 5 - Analyze audit logs from Azure Blob storage with SSMS**
As an admin, you may want to review and audit who is accessing the databases and specifically the classified data. Next, you'll take a look at the audit files that are being sent to Azure Blob storage. The first thing you have to do is merge the audit files, in case logs span multiple files. You can do this from SSMS. First, select **File** > **Open** > **Merge Audit Files**.
![](../graphics/fileaudit.png)
Next, select **Add**.
![](../graphics/fileauditadd.png)
Specify that you want to add them from Azure Blob storage and select **Connect**.
![](../graphics/fileauditconnect.png)
Now sign into Azure with the account you are using for this workshop.
![](../graphics/fileauditsignin.png)
Select the subscription, storage account, and blob container you configured Audit logs to go to (refer to your selection in the Azure portal under your Azure SQL Database logical server's Auditing blade). The container will be called `sqldbauditlogs`.
![](../graphics/fileauditselect.png)
Select your Azure SQL Database logical server and your AdventureWorks database. This will include all audit files from the beginning of the day up until the moment you select the database name. Select **OK**.
![](../graphics/fileauditok.png)
The confirmation window lets you know how many files are being downloaded and merged. Select **OK**.
![](../graphics/downloadconf.png)
Review the files and select **OK** one last time.
![](../graphics/mergeaudit.png)
You should now be able to see all the audit logs. Look for where you were testing masking with Bob. You can select a statement, and then use the detail pane below to review the information. For example, for one of the queries where Bob tries to view classified data, under the `data_sensitivity_information` field, you can see the data that is classified. For more information on the naming conventions in audit logs, [see here](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-audit-log-format).
This merged file can then be exported to an XEL or CSV file (or to a table) for additional analysis. You can also query the [Extended Events files using PowerShell](https://sqlscope.wordpress.com/reading-extended-event-files-using-client-side-tools-only/).
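If you'd rather stay in T-SQL, you can also read the blob audit files with **sys.fn_get_audit_file**. A sketch, where the storage account name (and, optionally, a narrower path under the `sqldbauditlogs` container) is a placeholder you fill in from your own environment:
```sql
-- Sketch: read audit records directly from Azure Blob storage
-- Replace <storage-account> (and optionally narrow the path) with your own values
SELECT event_time, server_principal_name, statement, data_sensitivity_information
FROM sys.fn_get_audit_file(
    'https://<storage-account>.blob.core.windows.net/sqldbauditlogs/',
    DEFAULT, DEFAULT)
ORDER BY event_time DESC;
```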
**Step 6 - Analyze audit logs with Log Analytics**
How you analyze your audit logs will depend on your preference, and the method in Step 5 may be more familiar. In this step, you'll get some exposure to querying security logs in the Azure portal with Log Analytics.
In the Azure portal, navigate to your Azure SQL Database. In the left-hand menu, under Security, select **Auditing**. Then select **View audit logs**.
![](../graphics/viewauditlogs.png)
You should now be able to see a query of your event records, options to run in Query Editor (run T-SQL queries through the portal), options for Log Analytics/View dashboard, and more.
![](../graphics/auditrecords.png)
Feel free to click around and understand what some of the options are.
Then, click on **Log Analytics**. This takes you to a query editor, but it is not T-SQL. This view allows you to query logs using the Kusto Query Language (KQL), which is meant to be easy for SQL professionals to learn and use. For the KQL documentation, [refer here](https://docs.microsoft.com/en-us/azure/kusto/query/).
The default query is querying the category `SQLSecurityAuditEvents`, so while you might use this category now to view security related incidents, this tool can also be used for querying other Azure logs and categories in [Azure Monitor](https://docs.microsoft.com/en-us/azure/azure-monitor/log-query/log-query-overview).
![](../graphics/laview.png)
This workshop won't go deep into KQL querying of logs, but there are many resources in the references above if you want more practice later.
**Step 7 - Analyze audit logs and monitor security with the Log Analytics SQL Security dashboard**
In this step, you'll see how SQL Security has built a dashboard based on Log Analytics for you to monitor and audit the logs and other SQL activity. To get back to Audit records, select the **X** in the top right corner of the Log Analytics query window.
Then, select **View dashboard**.
![](../graphics/viewdb.png)
You should now see an overview dashboard. Drill in to **Azure SQL - Access to Sensitive Data**.
![](../graphics/securitydb.png)
You can use this drill down to find out:
1. How many queries are accessing sensitive data
1. Which types and sensitivities of data are being accessed
1. Which principals are accessing sensitive data
1. Which IPs are accessing sensitive data
Review what's available here, and how you can audit usage with this tool.
When you're done, select the **X** in the top right corner of the **Azure SQL - Access to Sensitive Data** tab.
Back in the overview, select **Azure SQL - Security Insights**.
![](../graphics/securitydb.png)
This dashboard gives more auditing information to help you understand database activity, and gain insight into anomalies. Spend a few minutes reviewing the options here.
> Looking for another bonus security activity? Try this tutorial: [Always Encrypted: Protect sensitive data and store encryption keys in Azure Key Vault](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-always-encrypted-azure-key-vault?tabs=azure-powershell). You will need Visual Studio for this; you can download [Visual Studio Community for free here](https://visualstudio.microsoft.com/downloads/).
In this module and throughout the activities, you got hands-on with many of the security features that are available for Azure SQL. In the next module, you'll take a look at how performance is different in Azure, and you'll see how you can optimize it in Azure SQL.
<p style="border-bottom: 1px solid lightgrey;"></p>
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/owl.png"><b>For Further Study</b></p>
<ul>
<li><a href="https://docs.microsoft.com/en-us/azure/sql-database/sql-database-security-overview" target="_blank">Azure SQL Security Documentation</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/sql-database/sql-database-security-best-practice" target="_blank">Azure SQL Security Best Practices Playbook</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/sql-database/sql-database-managed-instance-aad-security-tutorial" target="_blank">Configure security for Azure SQL Managed Instance</a></li>
<li><a href="https://docs.microsoft.com/en-us/azure/sql-database/sql-database-security-tutorial" target="_blank">Configure security for Azure SQL Database</a></li>
</ul>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/geopin.png"><b >Next Steps</b></p>
Next, continue to <a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/04-Performance.md" target="_blank"><i> 04 - Performance</i></a>.

Просмотреть файл

@ -0,0 +1,5 @@
# Module 3 Activities - Security
These represent demos and examples you can run that accompany Module 3. See [Module 3](../03-Security.md) for details on how to use the files in this module.
> There are currently no files in this folder. All the instructions are contained in the main [Module 3](../03-Security.md) file.

Просмотреть файл

@ -1,74 +0,0 @@
![](../graphics/microsoftlogo.png)
# The Azure SQL Workshop
#### <i>A Microsoft workshop from the SQL team</i>
<p style="border-bottom: 1px solid lightgrey;"></p>
<img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/textbubble.png"> <h2>04 - Availability</h2>
Depending on the SLA your business requires, Azure SQL has the options you need including built-in capabilities. In this module, you will learn how to translate your knowledge of backup/restore, Always on failover cluster instances, and Always On availability groups with the options for business continuity in Azure SQL.
In each module you'll get more references, which you should follow up on to learn more. Also watch for links within the text - click on each one to explore that topic.
(<a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/00-Prerequisites.md" target="_blank">Make sure you check out the <b>Prerequisites</b> page before you start</a>. You'll need all of the items loaded there before you can proceed with the workshop.)
In this module, you'll cover these topics:
[4.1](#4.1): TODO
[4.2](#4.2): TODO
[4.3](#4.3): TODO
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="4.1">4.1 TODO: Topic Name</h2></a>
TODO: Topic Description
<br>
<img style="height: 400; box-shadow: 0 4px 8px 0 rgba(0, 0, 0, 0.2), 0 6px 20px 0 rgba(0, 0, 0, 0.19);" src="linkToPictureEndingIn.png">
<br>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><b>Activity: TODO: Activity Name</b></p>
TODO: Activity Description and tasks
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Description</b></p>
TODO: Enter activity description with checkbox
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Steps</b></p>
TODO: Enter activity steps description with checkbox
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="4.2">4.2 TODO: Topic Name</h2></a>
TODO: Topic Description
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><b>Activity: TODO: Activity Name</b></p>
TODO: Activity Description and tasks
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Description</b></p>
TODO: Enter activity description with checkbox
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Steps</b></p>
TODO: Enter activity steps description with checkbox
<p style="border-bottom: 1px solid lightgrey;"></p>
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/owl.png"><b>For Further Study</b></p>
<ul>
<li><a href="url" target="_blank">TODO: Enter courses, books, posts, whatever the student needs to extend their study</a></li>
</ul>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/geopin.png"><b >Next Steps</b></p>
Next, continue to <a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/05-PuttingItTogether.md" target="_blank"><i> 05 - Putting it all together</i></a>.

Просмотреть файл

@ -1,7 +0,0 @@
# Module 4 Activities - Availability
These represent demos and examples you can run that accompany Module 4. See [Module 4](../04-Availability.md) for details on how to use the files in this module.
## verifydeployment TODO
TODO Run the verify deployment queries and review the results across Azure SQL Database, Azure SQL Managed Instance, and SQL Server 2019. Main notebook file [here](./verifydeployment/VerifyDeployment.ipynb).

Просмотреть файл

@ -0,0 +1,528 @@
![](../graphics/microsoftlogo.png)
# Module 4 - Performance
#### <i>The Azure SQL Workshop</i>
<p style="border-bottom: 1px solid lightgrey;"></p>
<img style="float: left; margin: 0px 15px 15px 0px;" src="https://github.com/microsoft/sqlworkshops/blob/master/graphics/textbubble.png?raw=true"> <h2>Overview</h2>
> You must complete the [prerequisites](../azuresqlworkshop/00-Prerequisites.md) before completing these activities. You can also choose to audit the materials if you cannot complete the prerequisites. If you were provided an environment to use for the workshop, then you **do not need** to complete the prerequisites.
Youve been responsible for getting your SQL fast, keeping it fast, and making it fast again when something is wrong. In this module, well show you how to leverage your existing performance skills, processes, and tools and apply them to Azure SQL, including taking advantage of the intelligence in Azure to keep your database tuned.
In each module you'll get more references, which you should follow up on to learn more. Also watch for links within the text - click on each one to explore that topic.
(<a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/00-Prerequisites.md" target="_blank">Make sure you check out the <b>Prerequisites</b> page before you start</a>. You'll need all of the items loaded there before you can proceed with the workshop.)
In this module, you'll cover these topics:
[4.1](#4.1): Azure SQL performance capabilities and Tasks<br>
[4.2](#4.2): Monitoring performance in Azure SQL<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 1](#1): How to monitor performance in Azure SQL Database
[4.3](#4.3): Improving Performance in Azure SQL<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 2](#2): Scaling your workload performance in Azure SQL Database<br>
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 3 (BONUS)](#2): Optimizing performance for index maintenance.
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="4.1">4.1 Azure SQL performance capabilities and Tasks</h2></a>
In this section you will review the performance capabilities of Azure SQL and the common performance tasks, comparing and contrasting them with what you already know from SQL Server.
**Azure SQL Performance Capabilities**
**Monitoring and Troubleshooting Performance**
**Accelerating and Improving Performance**
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="4.2">4.2 Monitoring performance in Azure SQL</h2></a>
In this section you will learn how to monitor the performance of a SQL workload using tools and techniques familiar to the SQL Server professional, along with the differences that come with Azure SQL.
**Monitoring SQL queries**
- DMVs
- Extended Events
- Azure Portal
**Monitoring CPU usage**
- DMVs
- Azure Portal
- Query Store
**Monitoring Waits**
- DMVs
sys.dm_exec_requests can be used to see wait types, duration, and wait resources for any active request. This DMV also works across Azure SQL. There can be some wait types that are unique to Azure SQL which can be found at XXXXXX...
Some of the more common new wait type values new to Azure SQL are:
XXXX
XXXX
XXXX
SQL Server supports **sys.dm_os_wait_stats**. Azure SQL Database supports a database-specific DMV for this called **sys.dm_db_wait_stats**. Either **sys.dm_os_wait_stats** or **sys.dm_db_wait_stats** can be used with Azure SQL Database Managed Instance (see the sketch after this list).
- Query Store
- Azure Portal
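As a quick sketch of the DMV approach, the following query surfaces the top accumulated waits for the database; the wait types and counts you see will vary by workload:
```sql
-- Sketch: top accumulated waits for this database (Azure SQL Database)
SELECT TOP 10 wait_type, waiting_tasks_count, wait_time_ms, signal_wait_time_ms
FROM sys.dm_db_wait_stats
ORDER BY wait_time_ms DESC;
```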
**Monitoring Memory**
**Monitoring Transaction Log Usage**
**Monitoring I/O**
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="1"><b>Activity 1</a>: How to monitor performance in Azure SQL Database</b></p>
>**IMPORTANT**: This activity assumes you have completed all the activities in Module 2.
All scripts for this activity can be found in the **azuresqlworkshop\04-Performance\monitor_and_scale** folder.
>**NOTE:** This activity will work against an Azure SQL Database Managed Instance. However, you may need to make some changes to the scripts to increase the workload since the minimum number of vCores for Managed Instance General Purpose is 4 vCores.
In this activity, you will take a typical workload based on SQL queries and learn how to monitor performance for Azure SQL Database. You will learn how to identify a potential performance bottleneck using familiar tools and techniques to SQL Server. You will also learn differences with Azure SQL Database for performance monitoring.
Using the Azure SQL Database based on the AdventureWorksLT sample, you are given an example workload and need to observe its performance. You are told there appears to be a performance bottleneck. Your goal is to identify the possible bottleneck and identify solutions.
>**NOTE**: These scripts use the database name **AdventureWorks0406**. Anywhere this database name is used you should substitute in the name of the database you deployed in Module 2.
**Step 1: Setup to monitor Azure SQL Database**
>**TIP**: To open a script file in the context of a database in SSMS, click on the database in Object Explorer and then use the File/Open menu in SSMS.
- Launch SQL Server Management Studio (SSMS) and load a query *in the context of the database you deployed in Module 2* to monitor the Dynamic Management View (DMV) **sys.dm_exec_requests** from the script **sqlrequests.sql** which looks like the following:
```sql
SELECT er.session_id, er.status, er.command, er.wait_type, er.last_wait_type, er.wait_resource, er.wait_time
FROM sys.dm_exec_requests er
INNER JOIN sys.dm_exec_sessions es
ON er.session_id = es.session_id
AND es.is_user_process = 1
```
Unlike SQL Server, the familiar DMV dm_exec_requests shows active requests for a specific Azure SQL Database rather than the entire server. Azure SQL Database Managed Instance behaves just like SQL Server.
In another session for SSMS *in the context of the database you deployed in Module 2* load a query to monitor a Dynamic Management View (DMV) unique to Azure SQL Database called **sys.dm_db_resource_stats** from a script called **azuresqlresourcestats.sql**
```sql
SELECT * FROM sys.dm_db_resource_stats
```
This DMV will track overall resource usage of your workload against Azure SQL Database such as CPU, I/O, and memory.
**Step 2: Run the workload and observe performance**
- Examine the workload query from the script **topcustomersales.sql**.
This database is not large, so the query to retrieve customers and their associated sales information, ordered by the customers with the most sales, shouldn't generate a large result set. It is possible to tune this query by reducing the number of columns in the result set, but they are needed for the demonstration purposes of this activity.
```sql
SELECT c.*, soh.OrderDate, soh.DueDate, soh.ShipDate, soh.Status, soh.ShipToAddressID, soh.BillToAddressID, soh.ShipMethod, soh.TotalDue, soh.Comment, sod.*
FROM SalesLT.Customer c
INNER JOIN SalesLT.SalesOrderHeader soh
ON c.CustomerID = soh.CustomerID
INNER JOIN SalesLT.SalesOrderDetail sod
ON soh.SalesOrderID = sod.SalesOrderID
ORDER BY sod.LineTotal desc
GO
```
- Run the workload from the command line using ostress.
Edit the script that runs ostress, **sqlworkload.cmd**:<br><br>
Substitute your Azure Database Server created in Module 2 for the **-S parameter**<br>
Substitute the login name created for the Azure SQL Database Server created in Module 2 for the **-U parameter**
Substitute the database you deployed in Module 2 for the **-d parameter**<br>
Substitute the password for the login for the Azure SQL Database Server created in Module 2 for the **-P parameter**.
This script will use 10 concurrent users running the workload query 1500 times.
>**NOTE:** If you are not seeing CPU usage behavior with this workload for your environment you can adjust the **-n parameter** for number of users and **-r parameter** for iterations.
From a PowerShell command prompt, change to the directory for this module's activity:
[vmusername] is the name of the user in your Windows Virtual Machine. Substitute in the path for c:\users\[vmusername] where you have cloned the GitHub repo.
<pre>
cd c:\users\[vmusername]\AzureSQLWorkshop\azuresqlworkshop\04-Performance\monitor_and_scale
</pre>
Run the workload with the following command
```Powershell
.\sqlworkload.cmd
```
Your screen at the command prompt should look similar to the following
<pre>[datetime] [ostress PID] Max threads setting: 10000
[datetime] [ostress PID] Arguments:
[datetime] [ostress PID] -S[server].database.windows.net
[datetime] [ostress PID] -isqlquery.sql
[datetime] [ostress PID] -U[user]
[datetime] [ostress PID] -dAdventureWorks0406
[datetime] [ostress PID] -P********
[datetime] [ostress PID] -n10
[datetime] [ostress PID] -r1500
[datetime] [ostress PID] -q
[datetime] [ostress PID] Using language id (LCID): 1024 [English_United States.1252] for character formatting with NLS: 0x0006020F and Defined: 0x0006020F
[datetime] [ostress PID] Default driver: SQL Server Native Client 11.0
[datetime] [ostress PID] Attempting DOD5015 removal of [directory]\sqlquery.out]
[datetime] [ostress PID] Attempting DOD5015 removal of [directory]\sqlquery_1.out]
[datetime] [ostress PID] Attempting DOD5015 removal of [directory]\sqlquery_2.out]
[datetime] [ostress PID] Attempting DOD5015 removal of [directory]\sqlquery_3.out]
[datetime] [ostress PID] Attempting DOD5015 removal of [directory]\sqlquery_4.out]
[datetime] [ostress PID] Attempting DOD5015 removal of [directory]\sqlquery_5.out]
[datetime] [ostress PID] Attempting DOD5015 removal of [directory]\sqlquery_6.out]
[datetime] [ostress PID] Attempting DOD5015 removal of [directory]\sqlquery_7.out]
[datetime] [ostress PID] Attempting DOD5015 removal of [directory]\sqlquery_8.out]
[datetime] [ostress PID] Attempting DOD5015 removal of [directory]\sqlquery_9.out]
[datetime] [ostress PID] Starting query execution...
[datetime] [ostress PID] BETA: Custom CLR Expression support enabled.
[datetime] [ostress PID] Creating 10 thread(s) to process queries
[datetime] [ostress PID] Worker threads created, beginning execution...</pre>
- Use the query in SSMS to monitor dm_exec_requests (**sqlrequests.sql**) to observe active requests. Run this query 5 or 6 times and observe some of the results.
You should see that many of the requests have a status = RUNNABLE and last_wait_type = SOS_SCHEDULER_YIELD. Many RUNNABLE requests combined with frequent SOS_SCHEDULER_YIELD waits is an indicator of a possible lack of CPU resources for active queries.
>**NOTE:** You may see one or more active requests with a command = SELECT and a wait_type = XE_LIVE_TARGET_TVF. These are queries run by services managed by Microsoft to help power capabilities like Performance Insights using Extended Events. Microsoft does not publish the details of these Extended Event sessions.
The familiar SQL DMV dm_exec_requests can be used with Azure SQL Database but must be run in the context of a database unlike SQL Server (or Azure SQL Database Managed Instance) where dm_exec_requests shows all active requests across the server instance.
- Run the query in SSMS to monitor dm_db_resource_stats (**azuresqlresourcestats.sql**). Run the query to see the results of this DMV 3 or 4 times.
This DMV records a snapshot of resource usage for the database every 15 seconds (kept for 1 hour). You should see the column **avg_cpu_percent** close to 100% (at least in the high 90% range) for several of the snapshots. This is a symptom of a workload pushing the limits of CPU resources for the database. You can read more details about this DMV at https://docs.microsoft.com/en-us/sql/relational-databases/system-dynamic-management-views/sys-dm-db-resource-stats-azure-sql-database?view=azuresqldb-current. This DMV also works with Azure SQL Database Managed Instance.
For an on-premises SQL Server environment you would typically use a tool specific to the operating system, like Windows Performance Monitor, to track overall resource usage such as CPU. If you ran this example on an on-premises SQL Server or SQL Server in a virtual machine with 2 CPUs, you would see near 100% CPU utilization on the server.
>**NOTE**: Another DMV, **sys.resource_stats**, can be run in the context of the master database of the Azure SQL Database logical server to see resource usage for all databases associated with the server. This view is less granular and shows resource usage in 5-minute intervals (kept for 14 days).
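A sketch of that server-level view is below, assuming the workshop database name used in these scripts (**AdventureWorks0406**); run it in the context of the master database:
```sql
-- Sketch: run in the master database of the logical server
-- Substitute the name of the database you deployed in Module 2
SELECT start_time, end_time, avg_cpu_percent, avg_data_io_percent, avg_log_write_percent
FROM sys.resource_stats
WHERE database_name = 'AdventureWorks0406'
ORDER BY start_time DESC;
```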
- Let the workload complete and take note of its overall duration. When the workload completes you should see results like the following and a return to the command prompt
<pre>[datetime] [ostress PID] Total IO waits: 0, Total IO wait time: 0 (ms)
[datetime] [ostress PID] OSTRESS exiting normally, elapsed time: 00:01:22.637</pre>
Your duration time may vary but this typically takes at least 1 minute or more.
**Step 3: Use Query Store to do further performance analysis**
Query Store is a capability in SQL Server to track performance execution of queries. Performance data is stored in the user database. You can read more about Query Store at https://docs.microsoft.com/en-us/sql/relational-databases/performance/monitoring-performance-by-using-the-query-store?view=sql-server-ver15.
Query Store is not enabled by default for databases created in SQL Server but is on by default for Azure SQL Database (and Azure SQL Database Managed Instance). You can read more about Query Store and Azure SQL Database at https://docs.microsoft.com/en-us/azure/sql-database/sql-database-operate-query-store.
Query Store comes with a series of system catalog views to view performance data. SQL Server Management Studio (SSMS) provides reports using these system views.
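If you want to verify that Query Store is enabled for your database, a quick sketch using one of those catalog views:
```sql
-- Sketch: confirm Query Store is enabled for the current database
SELECT actual_state_desc, desired_state_desc, query_capture_mode_desc
FROM sys.database_query_store_options;
```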
- Look at queries consuming the most resource usage using SSMS.
Using the Object Explorer in SSMS, open the Query Store Folder to find the report for **Top Resource Consuming Queries**<br>
<img src="../graphics/SSMS_QDS_Find_Top_Queries.png" alt="SSMS_QDS_Find_Top_Queries"/>
Select the report to find out what queries have consumed the most avg resources and execution details of those queries. Based on the workload run to this point, your report should look something like the following:<br>
<img src="../graphics/SSMS_QDS_Top_Query_Report.png" alt="SSMS_QDS_Find_Top_Queries"/>
The query shown is the SQL query from the workshop for customer sales. This report has 3 components: queries with the highest total duration (you can change the metric), the associated query plan and runtime statistics, and the associated query plan in a visual map.
If you click on the bar chart for the query (the query_id may be different for your system), your results should look like the following:<br>
<img src="../graphics/SSMS_QDS_Query_ID.png" alt="SSMS_QDS_Query_ID"/>
You can see the total duration of the query and query text.
To the right of this bar chart is a chart of statistics for the query plan associated with the query. Hover over the dot associated with the plan. Your results should look like the following:<br>
<img src="../graphics/SSMS_Slow_Query_Stats.png" alt="SSMS_Slow_Query_Stats" width=350/>
Note the average duration of the query. Your times may vary but the key will be to compare this average duration to the average wait time for this query and eventually the average duration when we introduce a performance improvement.
The final component is the visual query plan. The query plan for this query looks like the following:<br>
<img src="../graphics/SSMS_Workload_Query_Plan.png" alt="SSMS_Workload_Query_Plan"/>
Given the small number of rows in the tables in this database, this query plan is not inefficient. There could be some tuning opportunities, but not much performance will be gained by tuning the query itself.
- Observe waits to see if they are affecting performance.
We know from earlier diagnostics that a high number of requests were constantly in a RUNNABLE status along with almost 100% CPU utilization. Query Store comes with reports to look at possible performance bottlenecks due to waits on resources.
Below the Top Resource Consuming Queries report in SSMS is a report called Query Wait Statistics. Click on this report and hover over the bar chart. Your results should look like the following:<br>
<img src="../graphics/SSMS_Top_Wait_Stats.png" alt="SSMS_Top_Wait_Stats"/>
You can see the top wait category is CPU and the average wait time. Furthermore, the top query waiting for CPU is the query from the workload we are using.
Click on the bar chart for CPU to see more about query wait details. Hover over the bar chart for the query. Your results should look like the following:<br>
<img src="../graphics/SSMS_Top_Wait_Stats_Query.png" alt="SSMS_Top_Wait_Stats_Query"/>
Notice that the average wait time for CPU for this query is a high % of the overall average duration for the query.
The DMV **sys.dm_db_wait_stats** will show a high number of SOS_SCHEDULER_YIELD waits with this scenario.
Given the evidence to this point, without any query tuning, our workload requires more CPU capacity than we have deployed for our Azure SQL Database.
**Step 5: Observe performance using the Azure Portal**
The Azure Portal provides performance information in the form of a graph. The standard default view is called **Compute Utilization** which you can see on the Overview blade for your database:<br><br>
<img src="../graphics/Azure_Portal_Compute_Slow_Query.png" alt="Azure_Portal_Compute_Slow_Query"/>
Notice in this example that the compute utilization is near 100% for a recent time range. This chart shows resource usage over the last hour and is refreshed continually. If you click on the chart, you can customize it (for example, as a bar chart) and look at other resource usage.
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="4.3">4.3 Improving Performance in Azure SQL</h2></a>
In this section, you will learn how to improve the performance of a SQL workload in Azure SQL using your knowledge of SQL Server and the knowledge you gained in Module 4.2.
**SQL Query Tuning**
**Azure SQL Database Auto Tuning**
**Scaling Performance**
Here are good articles to reference: https://docs.microsoft.com/en-us/azure/sql-database/sql-database-monitor-tune-overview#troubleshoot-performance-problems and https://docs.microsoft.com/en-us/azure/sql-database/sql-database-monitor-tune-overview#improve-database-performance-with-more-resources.
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="2"><b>Activity 2</a>: Scaling your workload performance in Azure SQL Database</b></p>
>**IMPORTANT**: This activity assumes you have completed all the steps in Activity 1 in Module 4.
In this activity you will take the results of your monitoring in Module 4.2 and learn how to scale your workload in Azure to see improved results.
All scripts for this activity can be found in the **azuresqlworkshop\04-Performance\monitor_and_scale** folder.
**Step 1: Decide options on how to scale performance**
Since the workload is CPU bound, one way to improve performance is to increase CPU capacity or speed. A SQL Server user would have to move to a different machine or reconfigure a VM to get more CPU capacity. In some cases, even a SQL Server administrator may not have permission to make these scaling changes, or the process could take time.
For Azure, we can use ALTER DATABASE, the Azure CLI (az), or the Azure Portal to increase CPU capacity.
Using the Azure Portal, we can see options for how you can scale for more CPU resources. From the Overview blade for the database, select the current **Pricing tier** deployment.<br>
<img src="../graphics/Azure_Portal_Change_Tier.png" alt="Azure_Portal_Change_Tier"/>
Here you can see options for changing or scaling compute resources. For General Purpose, you can easily scale up to something like 8 vCores.<br>
<img src="../graphics/Azure_Portal_Compute_Options.png" alt="Azure_Portal_Compute_Options"/>
Instead of using the portal, I'll show you a different method to scale your workload.
**Step 2: Increase capacity of your Azure SQL Database**
There are other methods to change the Pricing tier and one of them is with the T-SQL statement ALTER DATABASE.
>**NOTE**: For this demo, you must first flush the Query Store using the script **flushhquerystore.sql** or the following T-SQL statement:
```sql
EXEC sp_query_store_flush_db
```
- First, learn how to find out your current Pricing tier using T-SQL. The Pricing tier is also known as a *service objective*. Using SSMS, open the script **get_service_object.sql** or run the following T-SQL statements to find out this information:
```sql
SELECT database_name,slo_name,cpu_limit,max_db_memory, max_db_max_size_in_mb, primary_max_log_rate,primary_group_max_io, volume_local_iops,volume_pfs_iops
FROM sys.dm_user_db_resource_governance;
GO
SELECT DATABASEPROPERTYEX('AdventureWorks0406', 'ServiceObjective');
GO
```
For the current Azure SQL Database deployment, your results should look like the following:<br><br>
<img src="../graphics/service_objective_results.png" alt="service_objective_results"/>
Notice the term **slo_name** is also used for service objective. The term **slo** stands for *service level objective*.
The various slo_name values are not documented, but you can see from the string value that this database uses a General Purpose SKU with 2 vCores.
>**NOTE:** Testing shows that SQLDB_OP_... is the string used for Business Critical.
The documentation for ALTER DATABASE shows all the possible options for service objectives and how they match to the Azure portal: https://docs.microsoft.com/en-us/sql/t-sql/statements/alter-database-transact-sql?view=sql-server-ver15.
When you view the ALTER DATABASE documentation, notice the ability to click on your target SQL Server deployment to get the right syntax options. Click on SQL Database single database/elastic pool to see the options for Azure SQL Database. To match the compute scale you found in the portal, you need the service objective **'GP_Gen5_8'**.
Using SSMS, run the script **modify_service_objective.sql** or the following T-SQL command:
```sql
ALTER DATABASE AdventureWorks0406 MODIFY (SERVICE_OBJECTIVE = 'GP_Gen5_8');
```
This statement returns immediately, but the scaling of the compute resources takes place in the background. A scale operation this small should take less than a minute, and for a short period of time the database will be offline to make the change effective. You can monitor the progress of this scaling activity using the Azure Portal.<br>
<img src="../graphics/Azure_Portal_Update_In_Progress.png" alt="Azure_Portal_Update_In_Progress"/>
Another way to monitor the progress of a service objective change for Azure SQL Database is to use the DMV **sys.dm_operation_status**. This DMV exposes a history of service objective changes made to the database with ALTER DATABASE and shows the active progress of a change. Here is an example of this DMV's output after executing the above ALTER DATABASE statement:
<pre>
session_activity_id resource_type resource_type_desc major_resource_id minor_resource_id operation state state_desc percent_complete error_code error_desc error_severity error_state start_time last_modify_time
97F9474C-0334-4FC5-BFD5-337CDD1F9A21 0 Database AdventureWorks0406 ALTER DATABASE 1 IN_PROGRESS 0 0 0 0 [datetime] [datetime]</pre>
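If you prefer T-SQL to the portal for this check, a minimal sketch of such a query follows. Note that **sys.dm_operation_status** must be queried while connected to the **master** database of your logical server; the database name below is the one used in this workshop:
```sql
-- Run while connected to the master database of the logical server
SELECT operation, state_desc, percent_complete, start_time, last_modify_time
FROM sys.dm_operation_status
WHERE major_resource_id = 'AdventureWorks0406'
ORDER BY start_time DESC;
```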
During a service objective change, queries are allowed against the database until the final change is implemented, so the application loses its connection only for a very brief period of time. For Azure SQL Managed Instance, a change to the tier (or SKU) will allow queries and connections but prevents all database operations such as the creation of new databases (in these cases, such operations will fail with the error message "**The operation could not be completed because a service tier change is in progress for managed instance '[server]' Please wait for the operation in progress to complete and try again**".)
When the change is complete, use the queries listed above to verify that the new service objective or pricing tier of 8 vCores has taken effect.
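For example, re-running the DATABASEPROPERTYEX query from earlier should now reflect the new service objective:
```sql
SELECT DATABASEPROPERTYEX('AdventureWorks0406', 'ServiceObjective');
GO
```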
**Step 3: Run the workload again**
Now that the scaling is complete, we need to see whether the workload duration is faster and whether waits on CPU resources have decreased.
Run the workload again using the command **sqlworkload.cmd** that you executed in Section 4.2.
**Step 4: Observe new performance of the workload**
- Observe DMV results
Use the same queries from Section 4.2 Activity 1 to observe results from **dm_exec_requests** and **dm_db_resource_stats**.
You will see more queries with a status of RUNNING (and fewer in RUNNABLE status, although some will still appear), and the avg_cpu_percent should drop to 40-60%.
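For example, a trimmed-down look at **sys.dm_db_resource_stats** (a sketch; the full `SELECT *` used in the workshop scripts works just as well) makes the drop in CPU easy to spot:
```sql
SELECT TOP 10 end_time, avg_cpu_percent, avg_data_io_percent, avg_log_write_percent
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;
```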
- Observe the new workload duration.
The workload duration from **sqlworkload.cmd** should now be much lower, somewhere around 20 seconds.
- Observe Query Store reports
Using the same techniques as in Section 4.2 Activity 1, look at the **Top Resource Consuming Queries** report from SSMS:<br>
<img src="../graphics/SSMS_QDS_Top_Query_Faster.png" alt="Azure_Portal_Update_In_Progress"/>
You will now see two queries (query_id values). These are the same query, but they show up as different query_id values in Query Store because the scale operation required a restart, so the query had to be recompiled. You can see in the report that the overall and average duration was significantly lower.
Look also at the Query Wait Statistics report as you did in Section 4.2 Activity 1. You can see that the overall average wait time for the query is lower and is a smaller percentage of the overall duration. This is a good indication that CPU is not as much of a resource bottleneck as it was when the database had a lower number of vCores:<br>
<img src="../graphics/SSMS_Top_Wait_Stats_Query_Faster.png" alt="Azure_Portal_Update_In_Progress"/>
- Observe Azure Portal Compute Utilization
Look at the Overview blade again for the Compute Utilization. Notice the significant drop in overall CPU resource usage compared to the previous workload execution:<br>
<img src="../graphics/Azure_Portal_Compute_Query_Comparison.png" alt="Azure_Portal_Compute_Query_Comparison"/>
>**NOTE:** If you continue to increase vCores for this database, you can improve performance up to a threshold where all queries have plenty of CPU resources. This does not mean you must match the number of vCores to the number of concurrent users from your workload. In addition, you can change the Pricing tier to use the **Serverless** *compute tier* instead of **Provisioned** to achieve a more "auto-scaled" approach to a workload. For example, for this workload, if you chose a min vCore value of 2 and a max vCore value of 8, the workload would immediately scale to 8 vCores.
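If you want to experiment with the serverless option from T-SQL, here is a hedged sketch. It assumes the serverless service objective naming convention `GP_S_Gen5_<max vCores>`; the minimum vCore setting is configured separately (for example, in the portal):
```sql
-- Assumes the serverless naming convention GP_S_Gen5_<max vCores>
ALTER DATABASE AdventureWorks0406 MODIFY (SERVICE_OBJECTIVE = 'GP_S_Gen5_8');
```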
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="2"><b>Activity 3</a>: Optimizing application performance for Azure SQL Database</b></p>
>**IMPORTANT**: This activity assumes you have completed all Activities in Module 2.
A good article to read: https://azure.microsoft.com/en-us/blog/resource-governance-in-azure-sql-database/
In some cases, migrating an existing application and SQL query workload to Azure may uncover opportunities to optimize and tune queries.
Assume that, to support a new extension to the AdventureWorks orders website for a customer rating system, you need to add a new table that will handle a heavy set of concurrent INSERT activity for ratings. You have tested the SQL query workload on a development computer that has a local SSD drive for the database and transaction log.
When you move your test to Azure SQL Database using the General Purpose tier (8 vCores), the INSERT workload is slower. You need to discover whether you need to change the service objective or tier to support the new workload.
All scripts for this activity can be found in the **azuresqlworkshop\04-Performance\tuning_applications** folder.
**Step 1 - Create a new table**
Run the following statement (or use the script **order_rating_ddl.sql**) to create a table in the AdventureWorks database you have used in Activities 1 and 2:
```sql
DROP TABLE IF EXISTS SalesLT.OrderRating;
GO
CREATE TABLE SalesLT.OrderRating
(OrderRatingID int identity not null,
SalesOrderID int not null,
OrderRatingDT datetime not null,
OrderRating int not null,
OrderRatingComments char(500) not null);
GO
```
**Step 2 - Load up a query to monitor query execution**
- Use the following query or script **sqlrequests.sql** to look at active SQL queries *in the context of the AdventureWorks database*:
```sql
SELECT er.session_id, er.status, er.command, er.wait_type, er.last_wait_type, er.wait_resource, er.wait_time
FROM sys.dm_exec_requests er
INNER JOIN sys.dm_exec_sessions es
ON er.session_id = es.session_id
AND es.is_user_process = 1;
```
- Use the following query or script **top_waits.sql** to look at top wait types by count *in the context of the AdventureWorks database*:
```sql
SELECT * FROM sys.dm_os_wait_stats
ORDER BY waiting_tasks_count DESC;
```
- Use the following query or script **tlog_io.sql** to observe latency for transaction log writes:
```sql
SELECT io_stall_write_ms/num_of_writes as avg_tlog_io_write_ms, *
FROM sys.dm_io_virtual_file_stats
(db_id('AdventureWorks0406'), 2);
```
**Step 3 - Run the workload**
Run the test INSERT workload using the script **order_rating_insert_single.cmd**. This script uses ostress to run 25 concurrent users running the following T-SQL statement (in the script **order_rating_insert_single.sql**):
```sql
DECLARE @x int;
SET @x = 0;
WHILE (@x < 100)
BEGIN
SET @x = @x + 1;
INSERT INTO SalesLT.OrderRating
(SalesOrderID, OrderRatingDT, OrderRating, OrderRatingComments)
VALUES (@x, getdate(), 5, 'This was a great order');
END
```
You can see from this script that it is not exactly a real depiction of data coming from the website but it does simulate many order ratings being ingested into the database.
**Step 4 - Observe query requests and duration**
Using the queries in Step 2 you should observe the following:
- Many requests constantly have a wait_type of WRITELOG with a value > 0
- The WRITELOG wait type is the highest count
- The avg time to write to the transaction log is somewhere around 2ms.
The duration of this workload on a SQL Server 2019 instance with an SSD drive is somewhere around 15 seconds. The total duration of the same workload on Azure SQL Database using a General Purpose Gen5 8 vCore database is around 32+ seconds.
WRITELOG wait types are indicative of latency flushing to the transaction log. 2ms per write doesn't seem like much, but on a local SSD drive these waits may be < 1ms.
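To focus on just the log flush waits, a narrowed version of the earlier **top_waits.sql** query can be used:
```sql
SELECT wait_type, waiting_tasks_count, wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type = 'WRITELOG';
```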
TODO: WRITELOG waits sometimes don't show up in Query Store?
**Step 5 - Decide on a resolution**
The problem is not a high % of log write activity. The Azure Portal and **dm_db_resource_stats** don't show any numbers higher than 20-25%. The problem is not an IOPS limit either. The issue is that the application requires low latency for transaction log writes, but the General Purpose database configuration cannot deliver that latency. In fact, the documentation for resource limits lists a latency of 5-7ms (https://docs.microsoft.com/en-us/azure/sql-database/sql-database-vcore-resource-limits-single-databases).
If you examine the workload, you will see each INSERT is a single transaction commit which requires a transaction log flush.
One commit for each INSERT is not efficient, but the application was not affected on a local SSD because each commit was very fast. The Business Critical pricing tier (service objective or SKU) provides local SSD drives with lower latency, but perhaps there is an application optimization to try first.
The T-SQL batch can be changed for the workload to wrap a BEGIN TRAN/COMMIT TRAN around the INSERT iterations.
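For reference, the wrapped batch looks like the following (this is the T-SQL used by **order_rating_insert.sql** in the next step):
```sql
DECLARE @x int
SET @x = 0
BEGIN TRAN
WHILE (@x < 100)
BEGIN
SET @x = @x + 1
INSERT INTO SalesLT.OrderRating
(SalesOrderID, OrderRatingDT, OrderRating, OrderRatingComments)
VALUES (@x, getdate(), 5, 'This was a great order')
END
COMMIT TRAN
GO
```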
**Step 6 - Run the modified workload and observe**
The modified workload can be found in the script **order_rating_insert.sql**. Run the modified workload with ostress using the script **order_rating_insert.cmd**.
Now the workload runs in almost 5 seconds, compared to even the 18-19 seconds with a local SSD using singleton transactions. This is an example of tuning an application's SQL queries in a way that will help whether you run inside or outside of Azure.
The workload runs so fast it may be difficult to observe diagnostic data from the queries used previously in this activity. It is important to note that sys.dm_os_wait_stats cannot be cleared using DBCC SQLPERF as it can be with SQL Server.
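For reference, on a SQL Server instance (not Azure SQL Database) you could reset the cumulative wait statistics between test runs with:
```sql
-- SQL Server only; not supported in Azure SQL Database
DBCC SQLPERF ('sys.dm_os_wait_stats', CLEAR);
```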
TODO: What does this workload look like in MI?
The concept of "batching" can help most applications, including those using Azure SQL. Read more at https://docs.microsoft.com/en-us/azure/sql-database/sql-database-use-batching-to-improve-performance.
>**NOTE:** Very large transactions can be affected on Azure, and the symptom will be LOG_RATE_GOVERNOR waits. In this example, the char(500) NOT NULL column pads values with spaces and causes large transaction log records. Performance can be optimized even further by making that column a variable-length column. TODO: Add more to this paragraph.
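A sketch of that column change (assuming no application depends on the fixed-width padding) would be:
```sql
ALTER TABLE SalesLT.OrderRating
ALTER COLUMN OrderRatingComments varchar(500) NOT NULL;
```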
<p style="border-bottom: 1px solid lightgrey;"></p>
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/owl.png"><b>For Further Study</b></p>
<ul>
<li><a href="url" target="_blank">TODO: Enter courses, books, posts, whatever the student needs to extend their study</a></li>
</ul>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/geopin.png"><b >Next Steps</b></p>
Next, continue to <a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/05-Availability.md" target="_blank"><i> 05 - Availability</i></a>.

View file

@ -0,0 +1 @@
SELECT * FROM sys.dm_db_resource_stats

View file

@ -0,0 +1,2 @@
EXEC sp_query_store_flush_db
GO

View file

@ -0,0 +1,5 @@
SELECT database_name,slo_name,cpu_limit,max_db_memory, max_db_max_size_in_mb, primary_max_log_rate,primary_group_max_io, volume_local_iops,volume_pfs_iops
FROM sys.dm_user_db_resource_governance;
GO
SELECT DATABASEPROPERTYEX('AdventureWorks0406', 'ServiceObjective');
GO

View file

@ -0,0 +1,2 @@
ALTER DATABASE AdventureWorks0406 MODIFY (SERVICE_OBJECTIVE = 'GP_Gen5_8');
GO

View file

@ -0,0 +1,6 @@
SELECT er.session_id, er.status, er.command, er.wait_type, er.last_wait_type, er.wait_resource, er.wait_time
FROM sys.dm_exec_requests er
INNER JOIN sys.dm_exec_sessions es
ON er.session_id = es.session_id
AND es.is_user_process = 1
GO

View file

@ -0,0 +1 @@
ostress.exe -Saw-server<ID>.database.windows.net -itopcustomersales.sql -Ucloudadmin -dAdventureWorks<ID> -P<password> -n10 -r1500 -q

View file

@ -0,0 +1,8 @@
SELECT c.*, soh.OrderDate, soh.DueDate, soh.ShipDate, soh.Status, soh.ShipToAddressID, soh.BillToAddressID, soh.ShipMethod, soh.TotalDue, soh.Comment, sod.*
FROM SalesLT.Customer c
INNER JOIN SalesLT.SalesOrderHeader soh
ON c.CustomerID = soh.CustomerID
INNER JOIN SalesLT.SalesOrderDetail sod
ON soh.SalesOrderID = sod.SalesOrderID
ORDER BY sod.LineTotal desc
GO

View file

@ -1,6 +1,6 @@
# Module 3 Activities - Performance
# Module 4 Activities - Performance
These represent demos and examples you can run that accompany Module 3. See [Module 3](../03-Performance.md) for details on how to use the files in this module.
These represent demos and examples you can run that accompany Module 4. See [Module 4](../04-Performance.md) for details on how to use the files in this module.
## verifydeployment TODO

View file

@ -0,0 +1,9 @@
DROP TABLE IF EXISTS SalesLT.OrderRating;
GO
CREATE TABLE SalesLT.OrderRating
(OrderRatingID int identity not null,
SalesOrderID int not null,
OrderRatingDT datetime not null,
OrderRating int not null,
OrderRatingComments char(500) not null);
GO

View file

@ -0,0 +1 @@
ostress.exe -Sbobazuresqlserver.database.windows.net -iorder_rating_insert.sql -Uthewandog -dAdventureWorks0406 -P$cprsqlserver2019 -n25 -r100 -q

View file

@ -0,0 +1,12 @@
DECLARE @x int
SET @x = 0
BEGIN TRAN
WHILE (@x < 100)
BEGIN
SET @x = @x + 1
INSERT INTO SalesLT.OrderRating
(SalesOrderID, OrderRatingDT, OrderRating, OrderRatingComments)
VALUES (@x, getdate(), 5, 'This was a great order')
END
COMMIT TRAN
GO

View file

@ -0,0 +1 @@
ostress.exe -Sbobazuresqlserver.database.windows.net -iorder_rating_insert_single.sql -Uthewandog -dAdventureWorks0406 -P$cprsqlserver2019 -n25 -r100 -q

View file

@ -0,0 +1,10 @@
DECLARE @x int;
SET @x = 0;
WHILE (@x < 100)
BEGIN
SET @x = @x + 1;
INSERT INTO SalesLT.OrderRating
(SalesOrderID, OrderRatingDT, OrderRating, OrderRatingComments)
VALUES (@x, getdate(), 5, 'This was a great order');
END
GO

View file

@ -0,0 +1,6 @@
SELECT er.session_id, er.status, er.command, er.wait_type, er.last_wait_type, er.wait_resource, er.wait_time
FROM sys.dm_exec_requests er
INNER JOIN sys.dm_exec_sessions es
ON er.session_id = es.session_id
AND es.is_user_process = 1;
GO

View file

@ -0,0 +1,4 @@
SELECT io_stall_write_ms/num_of_writes as avg_tlog_io_write_ms, *
FROM sys.dm_io_virtual_file_stats
(db_id('AdventureWorks0406'), 2);
GO

View file

@ -0,0 +1,3 @@
SELECT * FROM sys.dm_os_wait_stats
ORDER BY waiting_tasks_count DESC;
GO

View file

@ -0,0 +1,136 @@
![](../graphics/microsoftlogo.png)
# The Azure SQL Workshop
#### <i>A Microsoft workshop from the SQL team</i>
<p style="border-bottom: 1px solid lightgrey;"></p>
<img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/textbubble.png"> <h2>05 - Availability</h2>
> You must complete the [prerequisites](../azuresqlworkshop/00-Prerequisites.md) before completing these activities. You can also choose to audit the materials if you cannot complete the prerequisites. If you were provided an environment to use for the workshop, then you **do not need** to complete the prerequisites.
Depending on the SLA your business requires, Azure SQL has the options you need including built-in capabilities. In this module, you will learn how to translate your knowledge of backup/restore, Always on failover cluster instances, and Always On availability groups to the options for business continuity in Azure SQL.
In this module, you'll cover these topics:
[5.1](#5.1): Azure SQL high availability basics
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 1](#1): TODO-Turn-key FCIs
[5.2](#5.2): Backup and restore
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity X](#X): Restore a deleted database
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 2](#2): Restore to a point in time
[5.3](#5.3): The highest availability
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 3](#3): TODO-Turn-key AGs in Business critical
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 4](#4): Geo-distributed auto-failover groups with read-scale in Business critical
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="5.1">5.1 Azure SQL high availability basics</h2></a>
TODO: Explain basic architecture of general purpose/business critical/hyperscale for availability
<br>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="1"><b>Activity 1</a>: TODO: TODO-Turn-key FCIs</b></p>
TODO: Activity Description and tasks
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Description</b></p>
TODO: Enter activity description with checkbox
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Steps</b></p>
TODO: Enter activity steps description with checkbox
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="5.2">5.2 TODO: Backup and restore</h2></a>
TODO: Explain how on prem you have to have a plan for DR and a BU/R strategy, but how it's built for you in Azure. Also talk about how it all works, ADR, LTR, etc.
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="X"><b>Activity X</a>: TODO: Restore a deleted database</b></p>
TODO: Activity Description and tasks
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Description</b></p>
TODO: Enter activity description with checkbox
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Steps</b></p>
TODO: Enter activity steps description with checkbox
![](../graphics/deletedb.png)
![](../graphics/deleteddb.png)
![](../graphics/deleteddb2.png)
![](../graphics/deleteddbview.png)
(this takes about 2 min to show up)
![](../graphics/restoredb.png)
![](../graphics/restoredb2.png)
![](../graphics/deploynotification.png)
![](../graphics/deployunderway.png)
NOTE: It took 11 MINUTES to restore the deleted database and 2-3 min for it to show up so I could delete it. With 14 minutes waiting time, I think we should drop this lab, I will mention it at the end of activity 2.
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="2"><b>Activity 2</a>: Undo errors to a point in time</b></p>
In all organizations, big or small, mistakes can happen. That's why you always have to have a plan for how you will restore to where you need to be. In SQL Server, ideally, you would choose to [restore to a point in time](https://docs.microsoft.com/en-us/sql/relational-databases/backup-restore/restore-a-sql-server-database-to-a-point-in-time-full-recovery-model?view=sql-server-ver15), but you can only do that if you are running in full recovery model. Under the bulk-logged recovery model, it's more likely that you'll have to recover the database to the end of the transaction log backup.
One of the benefits of Azure SQL is that Azure can take care of all of this for you. Since Azure SQL manages your backups and runs in full recovery model, it can restore you to any point in time (you can even [restore a deleted database](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-recovery-using-backups#deleted-database-restore)). In this activity, you'll see how a common error can be recovered using point in time restore (PITR). This is easy to do in the portal or programmatically, but in this activity you'll see how to do it with the Azure CLI.
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Steps</b></p>
For this activity, you'll use the notebook called **pitr.ipynb** which is under `azuresqlworkshop\05-Availability\pitr\pitr.ipynb`. Navigate to that file in ADS to complete this activity, and then return here.
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="5.3">5.3 The highest availability</h2></a>
TODO: We've shown you basics/how to get data back, now we'll show HA tech, what do you get in BC… Focus on BC here
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="3"><b>Activity 3</a>: TODO-Turn-key AGs in Business critical</b></p>
TODO: Activity Description and tasks
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Description</b></p>
TODO: Enter activity description with checkbox
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Steps</b></p>
TODO: Enter activity steps description with checkbox
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="4"><b>Activity 4</a>: Geo-distributed auto-failover groups with read-scale in Business critical</b></p>
TODO: Activity Description and tasks
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Description</b></p>
TODO: Enter activity description with checkbox
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/checkmark.png"><b>Steps</b></p>
TODO: Enter activity steps description with checkbox
<p style="border-bottom: 1px solid lightgrey;"></p>
<p><img style="margin: 0px 15px 15px 0px;" src="../graphics/owl.png"><b>For Further Study</b></p>
<ul>
<li><a href="url" target="_blank">TODO: Enter courses, books, posts, whatever the student needs to extend their study</a></li>
</ul>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/geopin.png"><b >Next Steps</b></p>
Next, continue to <a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/06-PuttingItTogether.md" target="_blank"><i> 06 - Putting it all together</i></a>.

View file

@ -0,0 +1,368 @@
{
"metadata": {
"kernelspec": {
"name": "powershell",
"display_name": "PowerShell"
},
"language_info": {
"name": "powershell",
"codemirror_mode": "shell",
"mimetype": "text/x-sh",
"file_extension": ".ps1"
}
},
"nbformat_minor": 2,
"nbformat": 4,
"cells": [
{
"cell_type": "markdown",
"source": [
"# Activity 2: Undo errors to a point in time\r\n",
"\r\n",
"#### <i>The Azure SQL Workshop - Module 5</i>\r\n",
"\r\n",
"<p style=\"border-bottom: 1px solid lightgrey;\"></p>\r\n",
"\r\n",
"In this activity, you'll see how a common error can be recovered using point in time restore (PITR). This is easy to do in the portal or programmatically, but in this activity you'll see how to do it with the Azure CLI through the Azure Cloud Shell. \r\n",
"\r\n",
"\r\n",
"**Set up** \r\n",
"\r\n",
"0. You should have opened this file using Azure Data Studio. If you didn't, please refer to Module 2 Activity 3 in the readme.md file to get set up. \r\n",
"1. In the bar at the top of this screen, confirm or change the \"Kernel\" to **PowerShell**. This determines what language the code blocks in the file are. In this case, that language is PowerShell. \r\n",
"2. Confirming the Kernel is **PowerShell**, for \"Attach to\", it should read **localhost**. \r\n",
"\r\n",
"Now that you're set up, you should read the text cells and \"Run\" the code cells by selecting the play button that appears in the left of a code cell when you hover over it. \r\n",
" \r\n",
""
],
"metadata": {
"azdata_cell_guid": "cfeaf504-99ef-4000-9481-88ac986f2e4b"
}
},
{
"cell_type": "markdown",
"source": [
"**Step 0 - Delete a database on *accident*** \r\n",
"\r\n",
"First, let's confirm that the table we'll *accidentally* delete does exist and have data in it. Let's take a look at some of the values in `SalesLT.OrderDetail`. \r\n",
"\r\n",
"**Using SSMS**, run the following query and review the results. \r\n",
"\r\n",
"```sql\r\n",
"SELECT TOP 10 * from SalesLT.SalesOrderDetail\r\n",
"```\r\n",
"\r\n",
"![](./graphics/salesdetailssms.png) \r\n",
"\r\n",
"\r\n",
"For whatever reason, let's create a scenario where someone accidentally deletes that table. Today, you will be that someone. \r\n",
"\r\n",
"**Using SSMS**, run the following query.\r\n",
"\r\n",
"```sql\r\n",
"DROP TABLE SalesLT.SalesOrderDetail\r\n",
"```\r\n",
"\r\n",
"Also, copy the `Completion time`. For example, in the below image, you would copy `2020-01-31T17:15:18` \r\n",
"\r\n",
"![](./graphics/completiontime.png) \r\n",
"\r\n",
"Then, paste the completion time **AND THEN SUBTRACT 2 MINUTES** in the PowerShell cell below and run it, so you can refer to it later. \r\n",
"\r\n",
""
],
"metadata": {
"azdata_cell_guid": "0f550d3c-e7c0-4df9-957b-cfaa950cbbe7"
}
},
{
"cell_type": "markdown",
"source": [
"**Step 1 - Determine the time you need to go back to** \r\n",
"Before you go any further, it's important to understand the recommended process for doing point in time restore (PITR): \r\n",
"\r\n",
"1. Determine the time that you need to go back to. This should be **before** the error or mistake took place. \r\n",
"1. Complete PITR via PowerShell or the Azure portal to go back to this time. This deploys a new database and restores a copy of your database, e.g. **AdventureWorks0406-copy**. \r\n",
"1. Confirm the new database (e.g. **AdventureWorks0406-copy**) is where you need to be. \r\n",
"1. Rename the original database, e.g. **AdventureWorks0406** to **AdventureWorks0406-old**. \r\n",
"1. Rename the new database to the original database name, e.g. **AdventureWorks0406-copy** to **AdventureWorks0406**. \r\n",
"1. Delete the original database, e.g. **AdventureWorks0406-old**. \r\n",
"\r\n",
"In order to complete step 1, you need to know when the last \"good\" transaction occurred, before the \"bad\" on, so you can restore to before the \"bad\" transaction but after the last \"good\" one. \r\n",
"\r\n",
"To do this, run the following query in **SSMS**, then note the completion time of the last good query run. In your case, it should be the `SELECT TOP 10 * FROM SalesLT.SalesOrderDetail`. \r\n",
"\r\n",
"```sql\r\n",
"SELECT dest.text, deqs.last_execution_time\r\n",
"FROM sys.dm_exec_query_stats AS deqs\r\n",
" CROSS APPLY sys.dm_exec_sql_text(deqs.sql_handle) AS dest\r\n",
"ORDER BY \r\n",
" deqs.last_execution_time DESC\r\n",
"```\r\n",
"\r\n",
"It should be similar to below, but with a different date/time. \r\n",
"\r\n",
"![](./graphics/lastgoodq.png) \r\n",
"\r\n",
"You'll notice, in this example, the date/time is `2020-01-31 21:11:42.993`. The required format is slightly different. Update it using this example as a reference and to the definition of `$before_error_time`. \r\n",
"* SSMS format: `2020-01-31 21:11:42.993`\r\n",
"* Required format: `2020-01-31T21:11:42.993` \r\n",
"\r\n",
"The last part of this step is filling in your Subscription ID and database name information so the rest goes smoothly. \r\n",
"\r\n",
""
],
"metadata": {
"azdata_cell_guid": "0a78a603-f251-4ac2-aadd-3200353db5d0"
}
},
{
"cell_type": "code",
"source": [
"$before_error_time = \"2020-01-31T21:11:42.993\"\r\n",
"\r\n",
"$subscription_id = \"<SubscriptionIdHere>\"\r\n",
"$unique_id = \"<WorkshopUserID>\"\r\n",
"$database_name = \"AdventureWorks$($unique_id)\"\r\n",
"$database_name_copy = \"$($database_name)-copy\"\r\n",
"$database_name_old = \"$($database_name)-old\"\r\n",
"$logical_server = \"aw-server$($unique_id)\"\r\n",
"$resource_group = \"azuresqlworkshop$($unique_id)\""
],
"metadata": {
"azdata_cell_guid": "e66b65ce-c896-4457-97c4-9554c94d2aef",
"tags": []
},
"outputs": [],
"execution_count": 19
},
{
"cell_type": "markdown",
"source": [
"**Step 2 - Complete PITR using the Azure CLI** \r\n",
"\r\n",
"In the next step you'll use `az cli db restore` to restore to before the table was deleted. \r\n",
"\r\n",
""
],
"metadata": {
"azdata_cell_guid": "99d7b0f5-9405-46e5-93ff-5129f79d6227"
}
},
{
"cell_type": "markdown",
"source": [
"First we want to make sure we're logged in and set up to use the Azure CLI locally. "
],
"metadata": {
"azdata_cell_guid": "9fbe71ed-ccf8-4d9a-934a-a5874a080ccf"
}
},
{
"cell_type": "code",
"source": [
"# Log in to the Azure portal with your workshop credentials\r\n",
"# You may get an error initially, run again, and you should get a pop-up that directs you through authenticating\r\n",
"az login"
],
"metadata": {
"azdata_cell_guid": "12d396b2-92c4-483b-a741-b9aa84807d29",
"tags": []
},
"outputs": [],
"execution_count": 14
},
{
"cell_type": "code",
"source": [
"# Specify your subscription for the workshop\r\n",
"az account set --subscription $subscription_id\r\n",
"\r\n",
"# Confirm you're connected to the correct subscription\r\n",
"az account show"
],
"metadata": {
"azdata_cell_guid": "da44d206-65f3-4749-b186-364ee211b766",
"tags": []
},
"outputs": [],
"execution_count": 21
},
{
"cell_type": "code",
"source": [
"# Specify your default subscription, resource group, and Azure SQL Database logical server\r\n",
"az configure --defaults group=$resource_group sql-server=$logical_server"
],
"metadata": {
"azdata_cell_guid": "e91e1c9a-7175-45e0-879b-0f6ef150db3c",
"tags": []
},
"outputs": [],
"execution_count": 3
},
{
"cell_type": "markdown",
"source": [
"This next command will take about 10 minutes. This is because, in the background, Azure is deploying a new Azure SQL Database in your Azure SQL Database logical server that has all the same configuration options as the original. After it's deployed, it will then restore the database into that new Azure SQL Database. \r\n",
"\r\n",
"After about 6-8 minutes, you may be able to refresh your view of databases in **SSMS** and see that the database has been deployed and the restore is now in progress. \r\n",
"\r\n",
"![](./graphics/dbrestoring.png) \r\n",
"\r\n",
"Once you see this, it should only be 1-2 minutes more. You will know it is done, because the \"stop\" like button in the code cell below will stop spinning and go back to the standard \"play\" like button. "
],
"metadata": {
"azdata_cell_guid": "fec1c2b9-9af9-4b41-81c8-365031d6021f"
}
},
{
"cell_type": "code",
"source": [
"# Restore the database to the time before the database was deleted\r\n",
"az sql db restore --dest-name $database_name_copy --name $database_name --time $before_error_time"
],
"metadata": {
"azdata_cell_guid": "828e608f-fadd-40a7-8ac2-1f5394a81412"
},
"outputs": [],
"execution_count": 23
},
{
"cell_type": "markdown",
"source": [
"TODO WHILE YOU WAIT"
],
"metadata": {
"azdata_cell_guid": "5770ca8c-856f-4fb0-b134-f583d6707381"
}
},
{
"cell_type": "markdown",
"source": [
"**Step 3 - Confirm the new database is where you need to be**\r\n",
"\r\n",
"In order to do this, refresh your connection to the Azure SQL Database logical server in SSMS (right-click on the logical server and select **Refresh**). \r\n",
"\r\n",
"Then, right-click on your new database, e.g. **AdventureWorks0406-copy** and select **New Query**. \r\n",
"\r\n",
"![](./graphics/newnewquery.png) \r\n",
"\r\n",
"Use the following query to confirm the table exists. \r\n",
"\r\n",
"```sql\r\n",
"SELECT TOP 10 * from SalesLT.SalesOrderDetail\r\n",
"```\r\n",
"You should get something similar to the following screenshot, which confirms your database has been restored to where you want it to be. \r\n",
"\r\n",
"![](./graphics/salesdetailssms.png) \r\n",
"\r\n",
"\r\n",
""
],
"metadata": {
"azdata_cell_guid": "b4afdd5e-130c-4dea-a246-8905b9055831"
}
},
{
"cell_type": "markdown",
"source": [
"**Step 4 - Rename the original database** \r\n",
"\r\n",
"This step involves renaming the original database to something similar to **AdventureWorks0406-old** so you can later rename the new database to the original database name. As long as your applications use retry logic, this will make it so no connection strings need to be changed. \r\n",
"\r\n",
"Now, you're very familiar with how to rename databases in SSMS, but here you will see how it can be easily done using the Azure CLI. "
],
"metadata": {
"azdata_cell_guid": "0dfb6493-4a9d-450c-8ae8-d9dfe11b744f"
}
},
{
"cell_type": "code",
"source": [
"az sql db rename --name $database_name --new-name $database_name_old"
],
"metadata": {
"azdata_cell_guid": "f2a96ef9-d39c-4f16-af2d-d41adb56403e"
},
"outputs": [],
"execution_count": 27
},
{
"cell_type": "markdown",
"source": [
"**Step 5 - Rename the new database to the original database name** \r\n",
"\r\n",
"Now that the original database name is no longer taken, you can rename the copy database to that of the original, again using the Azure CLI. \r\n",
""
],
"metadata": {
"azdata_cell_guid": "8c6edfb6-f3e5-4a62-9082-5d9991e89ad1"
}
},
{
"cell_type": "code",
"source": [
"az sql db rename --name $database_name_copy --new-name $database_name"
],
"metadata": {
"azdata_cell_guid": "dd42263b-5296-4ae8-9d69-a2c1cd2b2e1f"
},
"outputs": [],
"execution_count": 5
},
{
"cell_type": "markdown",
"source": [
"**Step 6 - Delete the original database** \r\n",
"\r\n",
"Finally, you have no need for the old database, so you can delete it with `az sql db delete`. \r\n",
""
],
"metadata": {
"azdata_cell_guid": "2c86884d-ba69-4b30-ae06-98c15d8a4e4c"
}
},
{
"cell_type": "code",
"source": [
"az sql db delete --name $database_name_old --yes"
],
"metadata": {
"azdata_cell_guid": "86e9abb5-8db7-460c-ac20-b730a0db7b40"
},
"outputs": [],
"execution_count": 9
},
{
"cell_type": "markdown",
"source": [
"And you can confirm it no longer exists with the following command."
],
"metadata": {
"azdata_cell_guid": "ff220914-8349-455d-9c7f-aa29612fa312"
}
},
{
"cell_type": "code",
"source": [
"az sql db list"
],
"metadata": {
"azdata_cell_guid": "18ef1258-ace5-481c-b3a7-b6962894d2cc"
},
"outputs": [],
"execution_count": 25
},
{
"cell_type": "markdown",
"source": [
"You've now seen how you can leverage point in time restore (PITR) in Azure SQL Database. PITR is also available in Azure SQL Managed Instance, **for databases not the whole instance**. You can use almost the same commands except with `az sql midb` as opposed to `az sql db`. For more information, see the [documentation](https://docs.microsoft.com/en-us/cli/azure/sql/midb?view=azure-cli-latest#az-sql-midb-restore)."
],
"metadata": {
"azdata_cell_guid": "057514b2-bcdd-4624-8b95-c8ecd829ae07"
}
}
]
}

View file

@ -0,0 +1,7 @@
# Module 5 Activities - Availability
These represent demos and examples you can run that accompany Module 5. See [Module 5](../05-Availability.md) for details on how to use the files in this module.
## pitr
In this lab, you'll see how you can restore a database to a point in time using the Azure CLI and SSMS. Main notebook file [here](./pitr/pitr.ipynb).

View file

@ -8,21 +8,24 @@
<img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/textbubble.png"> <h2>05 - Putting it all together</h2>
> You must complete the [prerequisites](../azuresqlworkshop/00-Prerequisites.md) before completing these activities. You can also choose to audit the materials if you cannot complete the prerequisites. If you were provided an environment to use for the workshop, then you **do not need** to complete the prerequisites.
In the final activity, well validate your Azure SQL expertise with a challenging problem-solution exercise. Well then broaden your horizons to the many other opportunities and resources for personal and corporate growth that Azure offer.
In each module you'll get more references, which you should follow up on to learn more. Also watch for links within the text - click on each one to explore that topic.
(<a href="https://github.com/microsoft/sqlworkshops/blob/master/AzureSQLWorkshop/azuresqlworkshop/00-Prerequisites.md" target="_blank">Make sure you check out the <b>Prerequisites</b> page before you start</a>. You'll need all of the items loaded there before you can proceed with the workshop.)
In this module, you'll cover these topics:
[5.1](#5.1): TODO
[5.2](#5.2): TODO
[5.3](#5.3): TODO
[6.1](#6.1): TODO
[6.2](#6.2): TODO
[6.3](#6.3): TODO
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 1](#1): TODO
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[Activity 2](#2): TODO
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="5.1">5.1 TODO: Topic Name</h2></a>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="6.1">6.1 TODO: Topic Name</h2></a>
TODO: Topic Description
@ -32,7 +35,7 @@ TODO: Topic Description
<br>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><b>Activity: TODO: Activity Name</b></p>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="1"><b>Activity 1</a>: TODO: Activity Name</b></p>
TODO: Activity Description and tasks
@ -46,11 +49,11 @@ TODO: Enter activity steps description with checkbox
<p style="border-bottom: 1px solid lightgrey;"></p>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="5.2">5.2 TODO: Topic Name</h2></a>
<h2><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/pencil2.png"><a name="6.2">6.2 TODO: Topic Name</h2></a>
TODO: Topic Description
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><b>Activity: TODO: Activity Name</b></p>
<p><img style="float: left; margin: 0px 15px 15px 0px;" src="../graphics/point1.png"><a name="2"><b>Activity 2</a>: TODO: Activity Name</b></p>
TODO: Activity Description and tasks

View file

@ -1,6 +1,6 @@
# Module 5 Activities - PuttingItTogether
# Module 6 Activities - PuttingItTogether
These represent demos and examples you can run that accompany Module 5. See [Module 5](../05-PuttingItTogether.md) for details on how to use the files in this module.
These represent demos and examples you can run that accompany Module 6. See [Module 6](../06-PuttingItTogether.md) for details on how to use the files in this module.
## verifydeployment TODO

Binary data — new and modified image files under AzureSQLWorkshop/graphics/ (Azure portal, SSMS Query Store, wait statistics, and security screenshots, including Azure_Portal_Change_Tier.png, Azure_Portal_Compute_Options.png, SSMS_QDS_Top_Query_Report.png, SSMS_Top_Wait_Stats.png, aadadmin.png, addmask.png, dbrestoring.png, deleteddbview.png, and others); the binary files themselves are not shown.
Some files were not shown because too many files changed in this diff.