[Build and Deploy Staging Site] Publish from microsoft/azuretipsandtricks-private:main/src/blog

This commit is contained in:
erjuntun75 2022-06-17 20:54:31 +00:00
Parent 8abbdd3a6b
Commit 76a87733d9
355 changed files with 32681 additions and 0 deletions

blog/blog/tip1.md
---
type: post
title: "Tip 1 - Use Keyboard Shortcuts in the Azure Portal"
excerpt: "Learn how to use developer keyboard shortcuts for use in the Azure Portal"
tags: [Management and Governance]
date: 2017-08-20 17:00:00
---
::: tip
:bulb: Learn more : [Azure Portal Documentation](https://docs.microsoft.com/azure/azure-portal?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [How to use keyboard shortcuts in the Azure portal](https://www.youtube.com/watch?v=A0uXwdLDzf4&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=1?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### Use Keyboard Shortcuts in the Azure Portal
#### Azure Portal Keyboard Shortcuts
Developers love keyboard shortcuts, and there are plenty of them in the Azure portal. You can see the list by going to **Help** and then **Keyboard Shortcuts** in the portal, as shown below.
<img :src="$withBase('/files/azuretip1.gif')">
You will see that the following keyboard shortcuts are available:

| Category | Shortcut | Action |
| --- | --- | --- |
| Actions | CTRL+/ | Search blade menu items |
| Actions | G+/ | Search resources (global) |
| Actions | G+N | Create a new resource |
| Actions | G+B | Open the 'More services' pane |
| Navigation | G+, | Move focus to command bar |
| Navigation | G+. | Toggle focus between top bar and side bar |
| Go to | G+D | Go to dashboard |
| Go to | G+A | Go to all resources |
| Go to | G+R | Go to resource groups |
| Go to | G+number | Open the item pinned to the favorites bar at this position |

blog/blog/tip10.md
---
type: post
title: "Tip 10 - Quickly Connect to a Linux VM with SSH"
excerpt: "Learn how to quickly connect to a Linux VM with SSH"
tags: [Virtual Machines]
date: 2017-09-04 17:00:00
---
::: tip
:bulb: Learn more : [Azure Virtual Machines](https://docs.microsoft.com/azure/virtual-machines/?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [How to quickly connect to a Linux VM with SSH](https://www.youtube.com/watch?v=7pmn6luCwQ4&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=8?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### Quickly Connect to a Linux VM with SSH
You can quickly connect to an existing Linux virtual machine by navigating to the "Virtual machines" blade in the Azure Portal. Once you are on the page, click the "Connect" button at the top. It provides a command that you can copy and paste into Bash or any terminal that supports SSH. After you paste and run the command, provide your username and password, and you are logged into your virtual machine. In the example below, I logged into my Ubuntu Linux VM.
<img :src="$withBase('/files/azuretip10.gif')">

blog/blog/tip100.md
---
type: post
title: "Tip 100 - How to create an email subscription with Azure Functions - sending emails"
excerpt: "Learn how to generate a weekly digest email for a blog using Azure Functions, SendGrid and Azure Storage"
tags: [Serverless]
date: 2018-03-04 17:00:00
---
::: tip
:bulb: Learn more : [Azure Functions Documentation](https://docs.microsoft.com/azure/azure-functions/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### How to create an email subscription with Azure Functions - Sending Emails
#### Overview
**Full Source Code** The source code for the app can be found on [GitHub](https://github.com/mbcrump/EmailSubscription?WT.mc_id=github-azuredevtips-azureappsdev)
This blog post is part of a series on how to generate a weekly digest email for a blog using Azure Functions, SendGrid and Azure Storage.
* [Part 1 - What we're going to build and how to build it](https://microsoft.github.io/AzureTipsAndTricks/blog/tip97.html)
* [Part 2 - Storing Emails using Azure Table Storage](https://microsoft.github.io/AzureTipsAndTricks/blog/tip98.html)
* [Part 3 - Writing the Frontend with HTML5 and jQuery](https://microsoft.github.io/AzureTipsAndTricks/blog/tip99.html)
* [Part 4 - Sending Emails with Sendgrid and Azure Functions (this post)](https://microsoft.github.io/AzureTipsAndTricks/blog/tip100.html)
We're building an email subscription similar to the following. If you want to catch up, read the previous posts first.
<img :src="$withBase('/files/emailsub1.png')">
#### Generating and Sending Emails
In our last post, we left off by creating a frontend that used HTML5, jQuery, and some light CSS work. When the user filled out the form and clicked **Submit**, it validated the email address and then used an AJAX call to POST the data to the Azure Function that we wrote in part 2. Today, we'll wrap things up by using SendGrid, C#, and Azure Functions to send emails every Sunday at 9:30 AM.
#### Use the Azure Functions Template inside of Visual Studio
Return to the project we created earlier, right-click the project, select **Add Item**, and select **Azure Functions**. Give it a name such as **SendEmail**, select **Timer Trigger**, and provide the following schedule: **0 30 9 * * SUN**.
<img :src="$withBase('/files/emailsub5.png')">
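The schedule is an NCRONTAB expression with six fields: `{second} {minute} {hour} {day} {month} {day-of-week}`, so **0 30 9 * * SUN** fires every Sunday at 9:30:00 AM. As a rough sketch (the exact signature depends on your Functions SDK version; this assumes the v1 SDK that ships with Visual Studio 2017), the template generates an entry point like this:

```csharp
// Sketch of the generated Timer Trigger entry point. "SendEmail" and the
// NCRONTAB schedule come from the Add Item dialog above.
[FunctionName("SendEmail")]
public static void Run([TimerTrigger("0 30 9 * * SUN")] TimerInfo myTimer, TraceWriter log)
{
    // The body we write in the rest of this post goes here.
    log.Info($"SendEmail timer fired at: {DateTime.Now}");
}
```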
We'll begin by declaring the feed URL and looping through the feed to collect the last 7 days' worth of blog posts, appending them to a string. Here we are using **SyndicationFeed** to make short work of the parsing. We could also build the HTML with a **StringBuilder**, but simple concatenation will do for now.
```csharp
string feedurl = "https://www.michaelcrump.net/feed.xml";
string last7days = "";

// Load the RSS feed and collect posts published in the last 7 days.
XmlReader reader = XmlReader.Create(feedurl);
SyndicationFeed feed = SyndicationFeed.Load(reader);
reader.Close();

last7days += "<b>New updates in the last 7 days:</b><br><br>";
foreach (SyndicationItem item in feed.Items)
{
    if ((DateTime.Now - item.PublishDate).TotalDays < 7)
    {
        last7days += "<a href=\"" + item.Links[0].Uri + "\">" + item.Title.Text + "</a><br>";
    }
}
```
We'll now grab the list of **EmailSubscribers** stored in our Azure Storage table.
```csharp
var serviceClient = new TableServiceClient(ConfigurationManager.AppSettings["TableStorageConnString"]);
TableClient table = serviceClient.GetTableClient("MCBlogSubscribers");
table.CreateIfNotExists();
```
We'll need to add this helper method outside of the **Run** method we are currently in. It will search for the **Partition Key** that matches what we sent in our POST request in post #2.
```csharp
public static List<string> GetAllEmailAddresses(TableClient table)
{
    var retList = new List<string>();
    var queryResults = table.Query<EmailEntity>(ent => ent.PartitionKey.Equals("SendEmailToReaders"));
    foreach (EmailEntity emailname in queryResults)
    {
        retList.Add(emailname.EmailAddress);
    }
    return retList;
}
```
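The **EmailEntity** type used by that query isn't shown here; a minimal sketch, assuming the Azure.Data.Tables SDK and a single **EmailAddress** column written by the POST function from part 2, might look like this:

```csharp
// Hypothetical sketch of the EmailEntity used by the query above.
// ITableEntity supplies the PartitionKey/RowKey/Timestamp/ETag contract
// required by Azure.Data.Tables; EmailAddress is the one custom column.
public class EmailEntity : Azure.Data.Tables.ITableEntity
{
    public string PartitionKey { get; set; }
    public string RowKey { get; set; }
    public DateTimeOffset? Timestamp { get; set; }
    public Azure.ETag ETag { get; set; }
    public string EmailAddress { get; set; }
}
```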
Now we need to send our emails. This is where it gets slightly tricky as we need to use the **X-SMTPAPI** header to hide the email address of all our users. Search **NuGet** and add **Sendgrid.SmtpApi** to your references. We'll need to get our SendGrid Username and Password and pass them in the credentials.
```csharp
var header = new Header();
SmtpClient client = new SmtpClient();
client.Port = 587;
client.Host = "smtp.sendgrid.net";
client.Timeout = 10000;
client.DeliveryMethod = SmtpDeliveryMethod.Network;
client.UseDefaultCredentials = false;
client.Credentials = new System.Net.NetworkCredential(ConfigurationManager.AppSettings["SendGridUserName"], ConfigurationManager.AppSettings["SendGridSecret"]);
```
Now we need to form our email message. We'll pull the list of email addresses from our Azure Storage table, force the HTML view to ensure our links are clickable, and tell those who don't have HTML enabled to enable it. :)
Finally, we'll send the email asynchronously and dispose of our client.
```csharp
MailMessage mail = new MailMessage();
List<string> recipientlist = GetAllEmailAddresses(table);
header.SetTo(recipientlist);
mail.From = new MailAddress("michael@michaelcrump.net", "Azure Tips and Tricks");
mail.To.Add("no-reply@michaelcrump.net");
mail.Subject = "Weekly Digest for MichaelCrump.net Blog";
mail.BodyEncoding = Encoding.UTF8;
mail.SubjectEncoding = Encoding.UTF8;
AlternateView htmlView = AlternateView.CreateAlternateViewFromString(last7days);
htmlView.ContentType = new System.Net.Mime.ContentType("text/html");
mail.AlternateViews.Add(htmlView);
mail.Body = "Please enable HTML in order to view the message";
mail.Headers.Add("X-SMTPAPI", header.JsonString());
await client.SendMailAsync(mail);
mail.Dispose();
```
Before we publish the updates to our Azure Function, we'll need to ensure that users can unsubscribe easily. I was originally going to handle this myself, but it is super simple with SendGrid.
Log into your [SendGrid](https://app.sendgrid.com/) account and go to **Settings** and then **Tracking** and you'll see **Subscription Tracking**. If you turn this on, then it will add the unsubscribe link for you and manage those users. NEAT!
<img :src="$withBase('/files/emailsubtracking.png')">
Now is a great time to go ahead and publish our Azure Function. Simply right click the project name and select Publish, then Publish again as shown below.
<img :src="$withBase('/files/emailsub8.png')">
Once it deploys, click on the **SendEmail** function and you can run it (note that you can also run it inside Visual Studio).
It should say that it completed successfully; now go check your email, and it should be working.
<img :src="$withBase('/files/emailcompletedsuccessfully.png')">
And we're done! If something isn't working then check the source code for the app on [GitHub](https://github.com/mbcrump/EmailSubscription?WT.mc_id=github-azuredevtips-azureappsdev) and if you have any questions then ping me on [twitter](http://twitter.com/mbcrump?WT.mc_id=twitter-azuredevtips-azureappsdev). You should follow me btw, I might have another tip and trick to share!

blog/blog/tip101.md
---
type: post
title: "Tip 101 - Part 1 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD"
excerpt: "A tutorial on creating a To-Do list app with .NET and using Azure App Service, API Apps, SQL, VSTS and CI/CD"
tags: [Web]
date: 2018-03-11 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Part 1 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD
#### A multi-part series showing an end-to-end possibility
[Crystal Tenn](https://www.linkedin.com/in/crystal-tenn-6a0b9b67/) and I teamed up to bring an E2E blog series that features an Azure App Service website that communicates with an API project, which communicates to an Azure SQL back-end. The app is a traditional To-Do application based on an existing sample that used ADO.NET, but adapted for Azure deploy and to Visual Studio 2017. The technology/tooling stack is Visual Studio, VSTS, C#, Angular, and SQL.
The process for the app is described below. In Visual Studio, you will start out with a working To Do list application. You will push the code to VSTS (Visual Studio Team Services). Then you will create a CI/CD (Continuous Integration/Continuous Delivery) process in order to deploy to Azure. In Azure you will create 3 resources: Azure Web App, Azure API App, and an Azure SQL Server through this exercise.
* [Local Setup - SQL Server](https://microsoft.github.io/AzureTipsAndTricks/blog/tip101.html) - Locally connect a front-end website to an API, and connect the API to a SQL Server.
* [Local Setup - Visual Studio and Swagger](https://microsoft.github.io/AzureTipsAndTricks/blog/tip102.html) - Continue Part 1 and use a local instance of Visual Studio and Swagger to communicate to our db.
* [Swagger - Learn how to use Swagger for API management](https://microsoft.github.io/AzureTipsAndTricks/blog/tip103.html)
* [Azure Deployment - Deploy the SQL database to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip104.html)
* [Azure Deployment - Deploy the front-end Web App and API App to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip105.html)
* [Adding the project to VSTS with Git](https://microsoft.github.io/AzureTipsAndTricks/blog/tip107.html)
* [VSTS Continuous Integration - Setup a CI Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip108.html)
* [VSTS Continuous Deployment - Setup a CD Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip109.html)
* [Cleanup - Cleanup and delete the Azure resources created in this tutorial](https://microsoft.github.io/AzureTipsAndTricks/blog/tip110.html)
Keep in mind : While we won't be going into the deep specifics of how to code, you should be able to use this guide to look at several parts of the Azure technology stack and how you can best implement them in your organization.
<img :src="$withBase('/files/todolist-diagram.png')">
#### Prerequisites
Please download the required software listed below. If you already have a different version of any of it, that is no problem at all, though it is best to use the latest version. If you have access to paid versions of the software, those are fine as well.
The tutorial can be completed for free, but it requires an Azure account. Note: the Azure account asks you for a credit card number, but it will not charge you or roll into a paid subscription; it simply expires when your trial month is up.
* [Azure](https://www.azure.com) - Get a free account - You get $200 USD of credit on a trial account; these are "free" credits and cost you nothing.
* Visual Studio 2017 - As a note, the instructions will be using Visual Studio 2017. You can get Visual Studio 2017 Community for free here: [https://www.visualstudio.com/vs/community/](https://www.visualstudio.com/vs/community/).
* SQL Server Express: [https://www.microsoft.com/sql-server/sql-server-editions-express](https://www.microsoft.com/sql-server/sql-server-editions-express)
* SQL Server Management Studio: [https://docs.microsoft.com/sql/ssms/download-sql-server-management-studio-ssms](https://docs.microsoft.com/sql/ssms/download-sql-server-management-studio-ssms)
* Basic understanding of coding & installation
#### Local Setup - SQL Server
The local setup will start with setting up your database. You will then open the solution in Visual Studio. You need to connect the API project to your SQL Server. Then connect your front end Angular project to the API project.
1.) We'll be working with an existing app, so download [it here](https://github.com/catenn/ToDoList/archive/master.zip?WT.mc_id=github-azuredevtips-azureappsdev) and extract it to a folder on your hard drive.
2.) Open SQL Server Management Studio (SSMS) and click the dropdown on Server Name and choose **Browse for more**.
<img :src="$withBase('/files/e2e-browseformore.jpg')">
3.) Choose the Server name of your instance. This name most likely will be in the format **ComputerName\ServerName**.
<img :src="$withBase('/files/e2e-servers.jpg')">
4.) Choose Windows Authentication. Save your **ComputerName\ServerName** in Notepad; we will need it again later. Hit Connect.
<img :src="$withBase('/files/e2e-sqllogin.jpg')">
5.) Open the folder that we downloaded earlier by double clicking **ToDoList.sln**. It should open in Visual Studio 2017.
6.) Right click on the ToDoListDb project and choose **Publish**.
<img :src="$withBase('/files/e2e-slnexplorerpublish.jpg')">
7.) On the modal, click Edit:
<img :src="$withBase('/files/e2e-editdbconnection.jpg')">
8.) For Server name, take the Notepad value you saved for **ComputerName\ServerName** and enter it here. Make sure the Database Name is ToDoListDb, but that should be filled in for you. Click OK.
<img :src="$withBase('/files/e2e-connection.jpg')">
9.) Don't edit any other values on this modal and just hit **Publish**. Note: Test Connection will not work.
<img :src="$withBase('/files/e2e-publishdb.jpg')">
10.) You will see the publish begin.
<img :src="$withBase('/files/e2e-publish1.jpg')">
11.) It is done when you see this:
<img :src="$withBase('/files/e2e-publish2.jpg')">
12.) Go back to **SQL Server Management Studio** and hit refresh:
<img :src="$withBase('/files/e2e-refresh.jpg')">
13.) Your SQL database should look something like this now.
<img :src="$withBase('/files/e2e-sqlverify.jpg')">
Congrats, we now have our SQL database setup locally. Come back and we'll continue setting up Visual Studio and Swagger.

blog/blog/tip102.md
---
type: post
title: "Tip 102 - Part 2 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD"
excerpt: "A tutorial on creating a To-Do list app with .NET and using Azure App Service, API Apps, SQL, VSTS and CI/CD"
tags: [Web]
date: 2018-03-12 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Part 2 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD
#### A multi-part series showing an end-to-end possibility
[Crystal Tenn](https://www.linkedin.com/in/crystal-tenn-6a0b9b67/) and I teamed up to bring an E2E blog series that features an Azure App Service website that communicates with an API project, which communicates to an Azure SQL back-end. The app is a traditional To-Do application based on an existing sample that used ADO.NET, but adapted for Azure deploy and to Visual Studio 2017. The technology/tooling stack is Visual Studio, VSTS, C#, Angular, and SQL.
The process for the app is described below. In Visual Studio, you will start out with a working To Do list application. You will push the code to VSTS (Visual Studio Team Services). Then you will create a CI/CD (Continuous Integration/Continuous Delivery) process in order to deploy to Azure. In Azure you will create 3 resources: Azure Web App, Azure API App, and an Azure SQL Server through this exercise.
* [Local Setup - SQL Server](https://microsoft.github.io/AzureTipsAndTricks/blog/tip101.html) - Locally connect a front-end website to an API, and connect the API to a SQL Server.
* [Local Setup - Visual Studio and Swagger](https://microsoft.github.io/AzureTipsAndTricks/blog/tip102.html) - Continue Part 1 and use a local instance of Visual Studio and Swagger to communicate to our db.
* [Swagger - Learn how to use Swagger for API management](https://microsoft.github.io/AzureTipsAndTricks/blog/tip103.html)
* [Azure Deployment - Deploy the SQL database to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip104.html)
* [Azure Deployment - Deploy the front-end Web App and API App to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip105.html)
* [Adding the project to VSTS with Git](https://microsoft.github.io/AzureTipsAndTricks/blog/tip107.html)
* [VSTS Continuous Integration - Setup a CI Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip108.html)
* [VSTS Continuous Deployment - Setup a CD Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip109.html)
* [Cleanup - Cleanup and delete the Azure resources created in this tutorial](https://microsoft.github.io/AzureTipsAndTricks/blog/tip110.html)
Keep in mind : While we won't be going into the deep specifics of how to code, you should be able to use this guide to look at several parts of the Azure technology stack and how you can best implement them in your organization.
<img :src="$withBase('/files/todolist-diagram.png')">
#### Local Setup - Visual Studio to talk to our SQL Database
1.) Open the project in Visual Studio by double clicking **ToDoList.sln**, if it is not already open from Part 1.
2.) Open the **Web.config** file of the **ToDoListDataAPI** project. Make sure you are in the right project.
<img :src="$withBase('/files/e2e-webconfig.jpg')">
3.) Edit the highlighted "**ComputerName\ServerName**" value and change it to the computer and SQL Server name that you saved in Notepad.
<img :src="$withBase('/files/e2e-webconfig2.jpg')">
Mine looks like:
```xml
<add name="todoItems" connectionString="Server=MICHAELCRUM0FD9\SQLEXPRESS;Initial Catalog=todolistdb;MultipleActiveResultSets=False;Integrated Security=True" providerName="System.Data.EntityClient" />
```
4.) Save the file and set **ToDoListDataAPI** as the startup project by right-clicking the project and choosing **Set as StartUp Project**.
<img :src="$withBase('/files/e2e-setstartup.jpg')">
5.) Hit F5 or Run to launch the project in a browser.
<img :src="$withBase('/files/e2e-run.jpg')">
Note: If you get **The Web server is configured to not list the contents of this directory.**, then just proceed to step 6.
6.) Add /swagger to the URL if it is not already there for you. The page should look like this if everything is working properly:
<img :src="$withBase('/files/e2e-swagger.jpg')">
7.) Click Show/Hide to get a full list of APIs available to the application.
<img :src="$withBase('/files/e2e-showhide.jpg')">
8.) Click on **Get** (the first one in the list) to expand it. Click **Try it out!**. If you get a 200 Response Code, it worked! Also take note of the URL port number in your browser.
<img :src="$withBase('/files/e2e-get.jpg')">
<img :src="$withBase('/files/e2e-get1.jpg')">
9.) Switch back over to Visual Studio and go to the **Web.config** in the **ToDoListAngular** project.
<img :src="$withBase('/files/e2e-angularprojwebconfig.jpg')">
10.) Make sure that the port number matches the port from the last step.
<img :src="$withBase('/files/e2e-angularwebconfig.jpg')">
11.) Set the **ToDoListAngular** project as Startup Project.
<img :src="$withBase('/files/e2e-angularstart.jpg')">
12.) Hit F5 or hit Run
13.) You should see the Angular app running in your web browser:
<img :src="$withBase('/files/e2e-todohome.jpg')">
14.) Click on the **Todo list** menu and add an entry. Try editing it and deleting it. You can put breakpoints in the code to learn more about how it performs the CRUD operations. You can also check the SQL database to see the entries.
<img :src="$withBase('/files/e2e-todolist.gif')">
<img :src="$withBase('/files/e2e-sql1.jpg')">

blog/blog/tip103.md
---
type: post
title: "Tip 103 - Part 3 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD"
excerpt: "A tutorial on creating a To-Do list app with .NET and using Azure App Service, API Apps, SQL, VSTS and CI/CD"
tags: [Web]
date: 2018-03-13 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Part 3 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD
#### A multi-part series showing an end-to-end possibility
[Crystal Tenn](https://www.linkedin.com/in/crystal-tenn-6a0b9b67/) and I teamed up to bring an E2E blog series that features an Azure App Service website that communicates with an API project, which communicates to an Azure SQL back-end. The app is a traditional To-Do application based on an existing sample that used ADO.NET, but adapted for Azure deploy and to Visual Studio 2017. The technology/tooling stack is Visual Studio, VSTS, C#, Angular, and SQL.
The process for the app is described below. In Visual Studio, you will start out with a working To Do list application. You will push the code to VSTS (Visual Studio Team Services). Then you will create a CI/CD (Continuous Integration/Continuous Delivery) process in order to deploy to Azure. In Azure you will create 3 resources: Azure Web App, Azure API App, and an Azure SQL Server through this exercise.
* [Local Setup - SQL Server](https://microsoft.github.io/AzureTipsAndTricks/blog/tip101.html) - Locally connect a front-end website to an API, and connect the API to a SQL Server.
* [Local Setup - Visual Studio and Swagger](https://microsoft.github.io/AzureTipsAndTricks/blog/tip102.html) - Continue Part 1 and use a local instance of Visual Studio and Swagger to communicate to our db.
* [Swagger - Learn how to use Swagger for API management](https://microsoft.github.io/AzureTipsAndTricks/blog/tip103.html)
* [Azure Deployment - Deploy the SQL database to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip104.html)
* [Azure Deployment - Deploy the front-end Web App and API App to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip105.html)
* [Adding the project to VSTS with Git](https://microsoft.github.io/AzureTipsAndTricks/blog/tip107.html)
* [VSTS Continuous Integration - Setup a CI Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip108.html)
* [VSTS Continuous Deployment - Setup a CD Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip109.html)
* [Cleanup - Cleanup and delete the Azure resources created in this tutorial](https://microsoft.github.io/AzureTipsAndTricks/blog/tip110.html)
Keep in mind : While we won't be going into the deep specifics of how to code, you should be able to use this guide to look at several parts of the Azure technology stack and how you can best implement them in your organization.
<img :src="$withBase('/files/todolist-diagram.png')">
#### Local Setup - Working with Swagger
As you may have noticed in the last post, we started working with Swagger.
**What is Swagger UI?** It is a collection of HTML, JavaScript, and CSS assets that dynamically generates documentation from a Swagger-compliant API.
The nice thing about Swagger is that you can create a **Web API** app using the VS templates and add **Swagger** via NuGet.
<img :src="$withBase('/files/e2e-swagger1.jpg')">
Then, if you spin up a project, you simply add **/swagger** to the URL to see the UI. In the example below, we've already added it and supplied the comments in the app that Swagger picks up. This makes testing APIs very simple, and it works in real time: if you run a **POST**, you can immediately check your database for the new record.
Learn more about Swagger [here](https://github.com/swagger-api/swagger-ui?WT.mc_id=github-azuredevtips-azureappsdev).
#### Continuing where we left off
1.) Open the project in Visual Studio by double clicking **ToDoList.sln**, if it is not already open from the previous parts. Navigate to the **ToDoListDataAPI** project.
2.) Set **ToDoListDataAPI** as the startup project by right-clicking the project and choosing that option, then run the application.
<img :src="$withBase('/files/e2e-setstartup.jpg')">
3.) Add "/swagger" to the end of your URL if it is not already there, you should see a page like this:
<img :src="$withBase('/files/e2e-swagger.jpg')">
Click on the **Show/Hide** button.
4.) Run a **GET**, which is the first API on the page (/api/ToDoList); you should see:
<img :src="$withBase('/files/e2e-02.png')">
5.) Run a **POST**: click where the screenshot shows, fill in an ID with a random number and any description you want, and then click **Try it out!**.
<img :src="$withBase('/files/e2e-03.png')">
6.) Run a **GET** again, you should see your added value:
<img :src="$withBase('/files/e2e-04.png')">
7.) Run a **PUT**: again, click to get the format from where it's shown in the screenshot, and modify an existing record's description.
<img :src="$withBase('/files/e2e-05.png')">
8.) Try to run a **GET** by ID, use 1 for example:
<img :src="$withBase('/files/e2e-06.png')">
9.) Switch back to SQL Server Management Studio (and log in if you need to) and choose **Select Top 1000 Rows** on the **ToDoListDb** db to see the data.
<img :src="$withBase('/files/e2e-sqlselect.jpg')">
10.) Your SQL Server Management Studio table should look like this now:
<img :src="$withBase('/files/e2e-sqlserver.jpg')">

blog/blog/tip104.md
---
type: post
title: "Tip 104 - Part 4 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD"
excerpt: "A tutorial on creating a To-Do list app with .NET and using Azure App Service, API Apps, SQL, VSTS and CI/CD"
tags: [Web]
date: 2018-03-18 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Part 4 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD
#### A multi-part series showing an end-to-end possibility
[Crystal Tenn](https://www.linkedin.com/in/crystal-tenn-6a0b9b67/) and I teamed up to bring an E2E blog series that features an Azure App Service website that communicates with an API project, which communicates to an Azure SQL back-end. The app is a traditional To-Do application based on an existing sample that used ADO.NET, but adapted for Azure deploy and to Visual Studio 2017. The technology/tooling stack is Visual Studio, VSTS, C#, Angular, and SQL.
The process for the app is described below. In Visual Studio, you will start out with a working To Do list application. You will push the code to VSTS (Visual Studio Team Services). Then you will create a CI/CD (Continuous Integration/Continuous Delivery) process in order to deploy to Azure. In Azure you will create 3 resources: Azure Web App, Azure API App, and an Azure SQL Server through this exercise.
* [Local Setup - SQL Server](https://microsoft.github.io/AzureTipsAndTricks/blog/tip101.html) - Locally connect a front-end website to an API, and connect the API to a SQL Server.
* [Local Setup - Visual Studio and Swagger](https://microsoft.github.io/AzureTipsAndTricks/blog/tip102.html) - Continue Part 1 and use a local instance of Visual Studio and Swagger to communicate to our db.
* [Swagger - Learn how to use Swagger for API management](https://microsoft.github.io/AzureTipsAndTricks/blog/tip103.html)
* [Azure Deployment - Deploy the SQL database to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip104.html)
* [Azure Deployment - Deploy the front-end Web App and API App to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip105.html)
* [Adding the project to VSTS with Git](https://microsoft.github.io/AzureTipsAndTricks/blog/tip107.html)
* [VSTS Continuous Integration - Setup a CI Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip108.html)
* [VSTS Continuous Deployment - Setup a CD Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip109.html)
* [Cleanup - Cleanup and delete the Azure resources created in this tutorial](https://microsoft.github.io/AzureTipsAndTricks/blog/tip110.html)
Keep in mind : While we won't be going into the deep specifics of how to code, you should be able to use this guide to look at several parts of the Azure technology stack and how you can best implement them in your organization.
<img :src="$withBase('/files/todolist-diagram.png')">
#### Deploy the SQL Database to Azure SQL
1.) Log into the Azure Portal at [portal.azure.com](https://portal.azure.com) if you aren't already logged in.
2.) Create a new SQL Database. Click **New**, select **Databases**, choose **SQL Database**, then lastly hit **Create**.
<img :src="$withBase('/files/e2e-01SelectSQLDBPortal.png')">
3.) Click on **Server** and **Pricing tier** to open the slideout options. In the **Server** slideout, make sure you create a username and password and keep them somewhere safe, as you will need them to log in with SQL Server Management Studio (SSMS). In the **Pricing tier** slideout, change to **Basic** so it only costs about $5 per month. Your screen will look approximately like this:
<img :src="$withBase('/files/e2e-02DBOptions.png')">
4.) Click on **All Resources** on the left menu. You should see your new SQL Server and SQL Database. Click on the **SQL Database**.
<img :src="$withBase('/files/e2e-03AllResources.png')">
5.) On the **Overview** tab, copy the server name to somewhere safe. Click on **Show connection strings** and copy the connection string somewhere safe as well.
<img :src="$withBase('/files/e2e-05DatabaseOverview.png')">
The **connection string** will look like this (save it in Notepad for the web.config in the solution later):
<img :src="$withBase('/files/e2e-06ConnectionString.png')">
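If the screenshot is hard to read, an Azure SQL ADO.NET connection string generally follows this shape. Here is a sketch with placeholder values (the connection name, server, database, and credentials below are all hypothetical; use the values you saved):

```xml
<!-- All values here are placeholders; substitute your own server, database, username, and password -->
<connectionStrings>
  <add name="toDoConnection"
       connectionString="Server=tcp:yourserver.database.windows.net,1433;Initial Catalog=ToDoDB;User ID=youruser;Password=yourpassword;Encrypt=True;Connection Timeout=30;"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```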
6.) Open **SSMS** and enter the **Server name, username, and password** as shown below.
<img :src="$withBase('/files/e2e-07SSMS.png')">
**Note:** If you cannot log in, go to the Portal and add your **IP address** by clicking on the **SQL Server** you created and then going to **Firewall**. You may also be able to add the firewall rule from the prompt that SSMS displays.
<img :src="$withBase('/files/e2e-10.PNG')">
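If you prefer the command line, the firewall rule can also be added with the Azure CLI. Here is a minimal sketch, assuming hypothetical resource names and a placeholder IP; the command is echoed as a dry run so nothing is executed against your subscription:

```shell
# Hypothetical resource group, server name, and client IP -- substitute your own
RG="my-resource-group"
SERVER="my-sql-server"
MY_IP="203.0.113.25"

# Dry run: the command is printed rather than executed; remove 'echo' to run it
echo az sql server firewall-rule create \
  --resource-group "$RG" \
  --server "$SERVER" \
  --name AllowMyIP \
  --start-ip-address "$MY_IP" \
  --end-ip-address "$MY_IP"
```

Remove the `echo` to actually create the rule.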
7.) Go back to [Day 1](tip101/) and repeat steps 6-13, except use the **Azure SQL Server name** that we created earlier instead of your local DB.
8.) Once the DB has been saved to Azure, go into the **connection strings** of your API project, which can be found in the **web.config** as shown below.
<img :src="$withBase('/files/e2e-webconfig.jpg')">
9.) In the **web.config**, change your **connection string** so that it points to your **Azure SQL Server connection string** (that you should have saved into Notepad earlier). Make sure you add your username and password for your Azure SQL Server into the connection string.
<img :src="$withBase('/files/e2e-webconfig3.jpg')">
blog/blog/tip105.md
---
type: post
title: "Tip 105 - Part 5 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD"
excerpt: "A tutorial on creating a To-Do list app with .NET and using Azure App Service, API Apps, SQL, VSTS and CI/CD"
tags: [Web]
date: 2018-03-19 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Part 5 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD
#### A multi-part series showing an end-to-end possibility
[Crystal Tenn](https://www.linkedin.com/in/crystal-tenn-6a0b9b67/) and I teamed up to bring an E2E blog series that features an Azure App Service website that communicates with an API project, which in turn talks to an Azure SQL back-end. The app is a traditional To-Do application based on an existing sample that used ADO.NET, adapted for Azure deployment and Visual Studio 2017. The technology/tooling stack is Visual Studio, VSTS, C#, Angular, and SQL.
The process for the app is described below. In Visual Studio, you will start out with a working To-Do list application. You will push the code to VSTS (Visual Studio Team Services) and then create a CI/CD (Continuous Integration/Continuous Delivery) process to deploy to Azure. Through this exercise you will create three resources in Azure: an Azure Web App, an Azure API App, and an Azure SQL Server.
* [Local Setup - SQL Server](https://microsoft.github.io/AzureTipsAndTricks/blog/tip101.html) - Locally connect a front-end website to an API, and connect the API to a SQL Server.
* [Local Setup - Visual Studio and Swagger](https://microsoft.github.io/AzureTipsAndTricks/blog/tip102.html) - Continue Part 1 and use a local instance of Visual Studio and Swagger to communicate to our db.
* [Swagger - Learn how to use Swagger for API management](https://microsoft.github.io/AzureTipsAndTricks/blog/tip103.html)
* [Azure Deployment - Deploy the SQL database to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip104.html)
* [Azure Deployment - Deploy the front-end Web App and API App to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip105.html)
* [Adding the project to VSTS with Git](https://microsoft.github.io/AzureTipsAndTricks/blog/tip107.html)
* [VSTS Continuous Integration - Setup a CI Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip108.html)
* [VSTS Continuous Deployment - Setup a CD Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip109.html)
* [Cleanup - Cleanup and delete the Azure resources created in this tutorial](https://microsoft.github.io/AzureTipsAndTricks/blog/tip110.html)
Keep in mind : While we won't be going into the deep specifics of how to code, you should be able to use this guide to look at several parts of the Azure technology stack and how you can best implement them in your organization.
<img :src="$withBase('/files/todolist-diagram.png')">
We will use Visual Studio to deploy to Azure in this tutorial. This can also be done by packaging up the files and uploading manually to Azure. Or, you could do it via an automated CI/CD (Build and Release) process which will be shown in upcoming posts.
#### Front-end Angular + Back-end API projects
Before we begin, I'm assuming you're using the same email address for VSTS that you are using for Azure.
1.) Open the solution file in Visual Studio, if it is not already open. Log in to Visual Studio with the same email address that you used to sign up for your Azure account.
2.) Right-click on the API project and choose Publish.
<img :src="$withBase('/files/e2e-08.png')">
3.) Choose an App Service.
<img :src="$withBase('/files/e2e-09.png')">
4.) Fill in all the settings: add a name, choose the subscription, and create a new resource group. For the App Service Plan, choose a name, the location closest to you, and Free. Then, on the main modal, click **Create**.
<img :src="$withBase('/files/e2e-10-1.png')">
If you are on the **ToDoListAPI** project, make sure you have API selected.
<img :src="$withBase('/files/e2e-18.png')">
If you are on the **ToDoListAngular** project, make sure you have Web App selected.
<img :src="$withBase('/files/e2e-19.png')">
<img :src="$withBase('/files/e2e-11.png')">
5.) Give it a few minutes to publish, then make sure the app shows up in the Azure Portal. Click on the API project to go to its overview (red arrow).
<img :src="$withBase('/files/e2e-12.png')">
6.) Copy the URL of the API App Service, as highlighted in the screenshot.
<img :src="$withBase('/files/e2e-13.png')">
7.) Let's connect the front end to the API project. Open the **ToDoListAngular** solution and go to the **web.config** file of the front-end **ToDoListAngular** project. Paste in the URL from the previous step.
<img :src="$withBase('/files/e2e-14.png')">
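For illustration, the changed setting will look something like this; the key name and URL below are hypothetical, so match whatever key the sample project actually defines:

```xml
<!-- Hypothetical key name and URL; use the key your project already defines -->
<appSettings>
  <add key="ToDoListAPIUrl" value="https://yourapiapp.azurewebsites.net/" />
</appSettings>
```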
8.) Now publish the front-end project to Azure the same way.
**Repeat steps 2-5, but on the front-end ToDoListAngular project. Make sure in step 4 you choose the "Web App" option for the Angular web project.**
9.) Once you are done publishing, verify that the app is in the Azure Portal. Click on the App Service (red arrow in screenshot).
<img :src="$withBase('/files/e2e-15.png')">
10.) On the **Overview** page, get the URL and copy it.
<img :src="$withBase('/files/e2e-16.png')">
11.) Paste the URL into your browser and click on the **Todo** tab to see the To-Do list. You should now have a working Azure App Service web front end talking to an Azure App Service API, which connects to Azure SQL.
<img :src="$withBase('/files/e2e-17.png')">
blog/blog/tip107.md
---
type: post
title: "Tip 107 - Part 6 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD"
excerpt: "A tutorial on creating a To-Do list app with .NET and using Azure App Service, API Apps, SQL, VSTS and CI/CD"
tags: [Web]
date: 2018-03-25 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Part 6 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD
#### A multi-part series showing an end-to-end possibility
[Crystal Tenn](https://www.linkedin.com/in/crystal-tenn-6a0b9b67/) and I teamed up to bring an E2E blog series that features an Azure App Service website that communicates with an API project, which in turn talks to an Azure SQL back-end. The app is a traditional To-Do application based on an existing sample that used ADO.NET, adapted for Azure deployment and Visual Studio 2017. The technology/tooling stack is Visual Studio, VSTS, C#, Angular, and SQL.
The process for the app is described below. In Visual Studio, you will start out with a working To-Do list application. You will push the code to VSTS (Visual Studio Team Services) and then create a CI/CD (Continuous Integration/Continuous Delivery) process to deploy to Azure. Through this exercise you will create three resources in Azure: an Azure Web App, an Azure API App, and an Azure SQL Server.
* [Local Setup - SQL Server](https://microsoft.github.io/AzureTipsAndTricks/blog/tip101.html) - Locally connect a front-end website to an API, and connect the API to a SQL Server.
* [Local Setup - Visual Studio and Swagger](https://microsoft.github.io/AzureTipsAndTricks/blog/tip102.html) - Continue Part 1 and use a local instance of Visual Studio and Swagger to communicate to our db.
* [Swagger - Learn how to use Swagger for API management](https://microsoft.github.io/AzureTipsAndTricks/blog/tip103.html)
* [Azure Deployment - Deploy the SQL database to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip104.html)
* [Azure Deployment - Deploy the front-end Web App and API App to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip105.html)
* [Adding the project to VSTS with Git](https://microsoft.github.io/AzureTipsAndTricks/blog/tip107.html)
* [VSTS Continuous Integration - Setup a CI Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip108.html)
* [VSTS Continuous Deployment - Setup a CD Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip109.html)
* [Cleanup - Cleanup and delete the Azure resources created in this tutorial](https://microsoft.github.io/AzureTipsAndTricks/blog/tip110.html)
Keep in mind : While we won't be going into the deep specifics of how to code, you should be able to use this guide to look at several parts of the Azure technology stack and how you can best implement them in your organization.
<img :src="$withBase('/files/todolist-diagram.png')">
**Pre-requisite:** Install [Git](https://git-scm.com/downloads)
#### Create the VSTS Account
1.) If you do not have a VSTS account, sign up by clicking the [Sign Up button](https://www.visualstudio.com/team-services/) on the homepage.
Make sure to use the same email address that you used for Azure.
2.) Create a new VSTS Account by hitting the button on the top right.
<img :src="$withBase('/files/blog5-00.png')">
3.) After your account is created, you'll see the following:
<img :src="$withBase('/files/blog5-mc1.png')">
4.) You can optionally rename the project by clicking on the **Gear** icon and clicking on the **Project Name**. Note that if you do this, it updates all of your version control paths, work items, queries, and other team project artifacts to reflect the new name.
<img :src="$withBase('/files/blog5-mc2.png')">
5.) Click on **push an existing repository from the command line** and copy the commands to Notepad.
<img :src="$withBase('/files/blog5-mc3.png')">
6.) If you've already installed [Git](https://git-scm.com/downloads), navigate to your Visual Studio solution directory and run the following commands in order. You'll need to change the fourth command to use the URL you copied earlier:
```
git init
git add .
git commit -m "Initial commit"
git remote add origin https://YOURPROJECT.visualstudio.com/_git/AzureWebApp
git push -u origin --all
```
<img :src="$withBase('/files/blog5-mc04.png')">
7.) Go to VSTS and click **Code**; you should see your code there:
<img :src="$withBase('/files/blog5-mc05.png')">
blog/blog/tip108.md
---
type: post
title: "Tip 108 - Part 7 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD"
excerpt: "A tutorial on creating a To-Do list app with .NET and using Azure App Service, API Apps, SQL, VSTS and CI/CD"
tags: [Web]
date: 2018-03-26 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Part 7 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD
#### A multi-part series showing an end-to-end possibility
[Crystal Tenn](https://www.linkedin.com/in/crystal-tenn-6a0b9b67/) and I teamed up to bring an E2E blog series that features an Azure App Service website that communicates with an API project, which in turn talks to an Azure SQL back-end. The app is a traditional To-Do application based on an existing sample that used ADO.NET, adapted for Azure deployment and Visual Studio 2017. The technology/tooling stack is Visual Studio, VSTS, C#, Angular, and SQL.
The process for the app is described below. In Visual Studio, you will start out with a working To-Do list application. You will push the code to VSTS (Visual Studio Team Services) and then create a CI/CD (Continuous Integration/Continuous Delivery) process to deploy to Azure. Through this exercise you will create three resources in Azure: an Azure Web App, an Azure API App, and an Azure SQL Server.
* [Local Setup - SQL Server](https://microsoft.github.io/AzureTipsAndTricks/blog/tip101.html) - Locally connect a front-end website to an API, and connect the API to a SQL Server.
* [Local Setup - Visual Studio and Swagger](https://microsoft.github.io/AzureTipsAndTricks/blog/tip102.html) - Continue Part 1 and use a local instance of Visual Studio and Swagger to communicate to our db.
* [Swagger - Learn how to use Swagger for API management](https://microsoft.github.io/AzureTipsAndTricks/blog/tip103.html)
* [Azure Deployment - Deploy the SQL database to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip104.html)
* [Azure Deployment - Deploy the front-end Web App and API App to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip105.html)
* [Adding the project to VSTS with Git](https://microsoft.github.io/AzureTipsAndTricks/blog/tip107.html)
* [VSTS Continuous Integration - Setup a CI Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip108.html)
* [VSTS Continuous Deployment - Setup a CD Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip109.html)
* [Cleanup - Cleanup and delete the Azure resources created in this tutorial](https://microsoft.github.io/AzureTipsAndTricks/blog/tip110.html)
Keep in mind : While we won't be going into the deep specifics of how to code, you should be able to use this guide to look at several parts of the Azure technology stack and how you can best implement them in your organization.
<img :src="$withBase('/files/todolist-diagram.png')">
#### Two ways to skin a cat
We have one Visual Studio solution and two web projects that need to be deployed to Azure. We can tackle this in different ways depending on how our team operates.
1. If we think that one project will be worked on by one team, and another by a different team, we could separate the code into two solutions and upload both to VSTS. We could also keep them in the same solution and have two Build (CI) definitions, one to build the Angular project and one to build the API project. Then we could follow it with two separate Release (CD) definitions so that each part can be separately deployed.
2. Some enterprises will choose to push all pieces of their solution through as a whole, and some only want to deploy one part at a time. It depends on the complexity of the code, the amount of CI/CD setup you are willing to do, how the solution(s)/project(s) are divided, and performance requirements (speed/size of the CI/CD process).
Note: For the simplicity of this little project, and to teach just the basics, I am choosing to group these as one solution that gets built as a whole; then I will have one Release that deploys both parts.
#### Getting Started
1.) Make sure that you've completed the following two steps before moving forward:
* [04a Azure Deployment - Deploy the SQL database to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip104.html)
* [04b Azure Deployment - Deploy the front-end Web App and API App to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip105.html)
The resources must exist before you can complete the CI/CD steps.
2.) Click on the **Build** tab then hit **New Build Definition**.
<img :src="$withBase('/files/blog6-mc01.jpg')">
3.) Leave all the **defaults** and hit **Continue**.
<img :src="$withBase('/files/blog6-mc02.jpg')">
4.) Select the **ASP.NET template**, mouseover it, and hit **Apply**.
<img :src="$withBase('/files/blog6-mc3.jpg')">
5.) You should see the following on the left, choose **Process** first.
<img :src="$withBase('/files/blog6-mc3b.jpg')">
6.) Under **Process**, a name will be populated for you.
<img :src="$withBase('/files/blog6-mc4.jpg')">
7.) Choose **Save & Queue**.
<img :src="$withBase('/files/blog6-mc5.jpg')">
8.) A modal will pop up; leave the **defaults** and hit **Save & Queue**.
<img :src="$withBase('/files/blog6-mc6.jpg')">
9.) You will see a notification show up with your **Build number/name**, click on it:
<img :src="$withBase('/files/blog6-mc7.jpg')">
10.) You will be brought to a **build screen**. Wait for it to complete; it will show "Build succeeded".
<img :src="$withBase('/files/blog6-mc8.jpg')">
blog/blog/tip109.md
---
type: post
title: "Tip 109 - Part 8 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD"
excerpt: "A tutorial on creating a To-Do list app with .NET and using Azure App Service, API Apps, SQL, VSTS and CI/CD"
tags: [Web]
date: 2018-03-27 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Part 8 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD
#### A multi-part series showing an end-to-end possibility
[Crystal Tenn](https://www.linkedin.com/in/crystal-tenn-6a0b9b67/) and I teamed up to bring an E2E blog series that features an Azure App Service website that communicates with an API project, which in turn talks to an Azure SQL back-end. The app is a traditional To-Do application based on an existing sample that used ADO.NET, adapted for Azure deployment and Visual Studio 2017. The technology/tooling stack is Visual Studio, VSTS, C#, Angular, and SQL.
The process for the app is described below. In Visual Studio, you will start out with a working To-Do list application. You will push the code to VSTS (Visual Studio Team Services) and then create a CI/CD (Continuous Integration/Continuous Delivery) process to deploy to Azure. Through this exercise you will create three resources in Azure: an Azure Web App, an Azure API App, and an Azure SQL Server.
* [Local Setup - SQL Server](https://microsoft.github.io/AzureTipsAndTricks/blog/tip101.html) - Locally connect a front-end website to an API, and connect the API to a SQL Server.
* [Local Setup - Visual Studio and Swagger](https://microsoft.github.io/AzureTipsAndTricks/blog/tip102.html) - Continue Part 1 and use a local instance of Visual Studio and Swagger to communicate to our db.
* [Swagger - Learn how to use Swagger for API management](https://microsoft.github.io/AzureTipsAndTricks/blog/tip103.html)
* [Azure Deployment - Deploy the SQL database to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip104.html)
* [Azure Deployment - Deploy the front-end Web App and API App to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip105.html)
* [Adding the project to VSTS with Git](https://microsoft.github.io/AzureTipsAndTricks/blog/tip107.html)
* [VSTS Continuous Integration - Setup a CI Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip108.html)
* [VSTS Continuous Deployment - Setup a CD Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip109.html)
* [Cleanup - Cleanup and delete the Azure resources created in this tutorial](https://microsoft.github.io/AzureTipsAndTricks/blog/tip110.html)
Keep in mind : While we won't be going into the deep specifics of how to code, you should be able to use this guide to look at several parts of the Azure technology stack and how you can best implement them in your organization.
<img :src="$withBase('/files/todolist-diagram.png')">
# VSTS Continuous Deployment
1.) On the top menu of **VSTS**, click on **Build and Release**, then choose **Releases** from the drop-down:
<img :src="$withBase('/files/blog7-mc9.jpg')">
2.) On the left, click the **"+" button** and choose **Create release definition**.
<img :src="$withBase('/files/blog7-mc10.jpg')">
3.) Choose the **Azure App Service Deployment template** and hit **Apply**.
<img :src="$withBase('/files/blog7-mc11a.jpg')">
4.) On the right side, name the environment **Production**, then click the **X** on the top left to close this.
<img :src="$withBase('/files/blog7-mc12a.jpg')">
5.) On the left, click **Add artifact**.
<img :src="$withBase('/files/blog7-mc12b.jpg')">
6.) For the **artifact**, choose **Build**, and choose the **FE-Angular-CI** (or whatever it is named) build, hit **Add**.
<img :src="$withBase('/files/blog7-mc12c.jpg')">
7.) Click **Tasks**.
<img :src="$withBase('/files/blog7-mc13.jpg')">
8.) Rename your **Release definition** by clicking by the **name**:
<img :src="$withBase('/files/blog7-mc14.jpg')">
9.) Choose your **Azure Subscription**. Choose the **Web App** type. Choose the **App Service name** you used for the Angular web app.
<img :src="$withBase('/files/blog7-mc15.jpg')">
10.) Scroll down more on the same page, click the 3 dots under **Package or folder**.
<img :src="$withBase('/files/blog7-mc16.jpg')">
11.) On the modal, choose **ToDoListAngular.zip**, then hit **OK**.
<img :src="$withBase('/files/blog7-mc17.jpg')">
12.) Click the **"+"** button to add an **additional task.** Choose the **Azure App Service Deploy task**.
<img :src="$withBase('/files/blog7-mc17a.jpg')">
13.) Select the **new task** on the left. Then, on the right add your **Azure subscription** again. Choose **API App**. Select the **API App Service** that you created in the Azure Portal.
<img :src="$withBase('/files/blog7-mc18a.jpg')">
14.) Scroll down on the same task, click the 3 dots under **Package or folder**.
<img :src="$withBase('/files/blog7-mc18b.jpg')">
15.) On the modal, choose **ToDoListDataAPI.zip**, then hit **OK**.
<img :src="$withBase('/files/blog7-mc18c.jpg')">
16.) Click **Save** at the top.
<img :src="$withBase('/files/blog7-mc19.jpg')">
17.) Click **Release**, then **Create Release**.
<img :src="$withBase('/files/blog7-mc20.jpg')">
18.) A notification with the name of the release will show up, click on this:
<img :src="$withBase('/files/blog7-mc21.jpg')">
19.) Click **Logs**, then it will bring you to the logs which will display live results.
<img :src="$withBase('/files/blog7-mc22.jpg')">
20.) Wait until it finishes and shows **Success**.
<img :src="$withBase('/files/blog7-mc23.jpg')">
21.) Go to your **Azure Portal**. Choose your resource from **All Resources**, click on the **name** of the resource.
<img :src="$withBase('/files/blog7-mc24.jpg')">
22.) View the **overview page** to get the URL:
<img :src="$withBase('/files/blog7-mc25.jpg')">
23.) Your completed page should look like this (hit the link with the red arrow to go to the To Do list!):
<img :src="$withBase('/files/blog7-mc26.jpg')">
blog/blog/tip11.md
---
type: post
title: "Tip 11 - Access Cloud Shell from within Microsoft Documentation"
excerpt: "Learn how to quickly access Azure Cloud Shell from within the Microsoft Docs"
tags: [Management and Governance]
date: 2017-09-05 17:00:00
---
::: tip
:bulb: Learn more : [Overview of Azure Cloud Shell](https://docs.microsoft.com/azure/cloud-shell/overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [How to access Cloud Shell from within Microsoft docs](https://www.youtube.com/watch?v=JSWji3bPDJc&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=9?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### Access Cloud Shell from within Microsoft Documentation
Almost everyone is aware that you can access Azure Cloud Shell from within the [Azure Portal](https://docs.microsoft.com/azure/cloud-shell/overview?WT.mc_id=docs-azuredevtips-azureappsdev) or from mobile apps such as those for iOS and Android. But a little-known fact is that many of the Azure documentation pages include an embedded Cloud Shell experience, which can be found via the "Try It" button as shown below.
<img :src="$withBase('/files/azuretip11.gif')">
blog/blog/tip110.md
---
type: post
title: "Tip 110 - Part 9 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD"
excerpt: "A tutorial on creating a To-Do list app with .NET and using Azure App Service, API Apps, SQL, VSTS and CI/CD"
tags: [Web]
date: 2018-04-01 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Part 9 - An end to end scenario with Azure App Service, API Apps, SQL, VSTS and CI/CD
#### A multi-part series showing an end-to-end possibility
[Crystal Tenn](https://www.linkedin.com/in/crystal-tenn-6a0b9b67/) and I teamed up to bring an E2E blog series that features an Azure App Service website that communicates with an API project, which in turn talks to an Azure SQL back-end. The app is a traditional To-Do application based on an existing sample that used ADO.NET, adapted for Azure deployment and Visual Studio 2017. The technology/tooling stack is Visual Studio, VSTS, C#, Angular, and SQL.
The process for the app is described below. In Visual Studio, you will start out with a working To-Do list application. You will push the code to VSTS (Visual Studio Team Services) and then create a CI/CD (Continuous Integration/Continuous Delivery) process to deploy to Azure. Through this exercise you will create three resources in Azure: an Azure Web App, an Azure API App, and an Azure SQL Server.
* [Local Setup - SQL Server](https://microsoft.github.io/AzureTipsAndTricks/blog/tip101.html) - Locally connect a front-end website to an API, and connect the API to a SQL Server.
* [Local Setup - Visual Studio and Swagger](https://microsoft.github.io/AzureTipsAndTricks/blog/tip102.html) - Continue Part 1 and use a local instance of Visual Studio and Swagger to communicate to our db.
* [Swagger - Learn how to use Swagger for API management](https://microsoft.github.io/AzureTipsAndTricks/blog/tip103.html)
* [Azure Deployment - Deploy the SQL database to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip104.html)
* [Azure Deployment - Deploy the front-end Web App and API App to Azure manually](https://microsoft.github.io/AzureTipsAndTricks/blog/tip105.html)
* [Adding the project to VSTS with Git](https://microsoft.github.io/AzureTipsAndTricks/blog/tip107.html)
* [VSTS Continuous Integration - Setup a CI Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip108.html)
* [VSTS Continuous Deployment - Setup a CD Process in VSTS](https://microsoft.github.io/AzureTipsAndTricks/blog/tip109.html)
* [Cleanup - Cleanup and delete the Azure resources created in this tutorial](https://microsoft.github.io/AzureTipsAndTricks/blog/tip110.html)
Keep in mind : While we won't be going into the deep specifics of how to code, you should be able to use this guide to look at several parts of the Azure technology stack and how you can best implement them in your organization.
<img :src="$withBase('/files/todolist-diagram.png')">
#### Clean-up Resources
We've finally made it to the end of the series, and I wanted to use this post to remind you to delete the resources that you created in Azure during this tutorial or during development.
If you want to remove your resources from Azure now, please see the below instructions:
1.) Go to the **Azure Portal**. Click on the **resource group name** (whatever you named it).
2.) Click **All Resources**.
<img :src="$withBase('/files/blog10-mc01.jpg')">
3.) Click **Delete resource group**.
<img :src="$withBase('/files/blog10-mc02.jpg')">
4.) Confirm the **name**, and click **Delete**.
<img :src="$withBase('/files/blog10-mc03.jpg')">
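The same cleanup can be scripted with the Azure CLI. Here is a sketch assuming a hypothetical resource group name (echoed as a dry run; remove the `echo` to actually delete):

```shell
# Hypothetical resource group name; substitute the one you created in this series
RG="my-todo-resource-group"

# Dry run: the command is printed rather than executed
echo az group delete --name "$RG" --yes --no-wait
```

`--yes` skips the confirmation prompt and `--no-wait` returns immediately while deletion continues in the background.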
blog/blog/tip111.md
---
type: post
title: "Tip 111 - Deployment Slots for Web Apps using the Azure CLI"
excerpt: "Learn how to work with deployment slots with this quick tutorial"
tags: [Management and Governance, Web]
date: 2018-04-02 17:00:00
---
::: tip
:bulb: Learn more : [Azure Command-Line Interface (CLI)](https://docs.microsoft.com/cli/azure?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
This post was brought to you by [Lohith (kashyapa)](https://www.twitter.com/kashyapa?WT.mc_id=twitter-azuredevtips-azureappsdev).
### Deployment Slots for Web Apps using the Azure CLI
#### What are Deployment Slots?
Deployment slots are a feature of Azure App Service. They are live apps with their own hostnames. You can create different slots for your application (e.g. Dev, Test, or Stage). The Production slot is the slot where your live app resides. With deployment slots, you can validate app changes in staging before swapping them into your production slot. You can read more about deployment slots [here](https://docs.microsoft.com/azure/app-service/web-sites-staged-publishing "Set up staging environments in Azure App Service").
#### Pre-Requisites
* Microsoft Azure subscription (sign up for [free](https://azure.microsoft.com/free?WT.mc_id=azure-azuredevtips-azureappsdev "Create your Azure free account today"))
* Microsoft Azure CLI (Install from [here](https://docs.microsoft.com/cli/azure/install-azure-cli?view=azure-cli-latest "Install Azure CLI 2.0"))
#### Log in to Azure
Before executing any Azure CLI commands, you will need to log in first.
* Open **Command Prompt** or a **Powershell** session
* Enter following command:
`az login`
The command will prompt you to log in with an authentication code via a website.
#### Listing Deployment Slots
To list **deployment slots** in an **Azure App Service**, execute the following command:
`az webapp deployment slot list -n "web app name" -g "resource group name"`
#### Creating a Deployment Slot
To create a **new deployment slot** in an Azure App Service, execute the following command:
`az webapp deployment slot create -n "web app name" -g "resource group name" -s "deployment slot name"`
#### Swapping a Deployment Slot
To **swap a deployment slot** in an Azure App Service, execute the following command:
`az webapp deployment slot swap -n "web app name" -g "resource group name" -s "source slot name" --target-slot "target slot"`
#### Deleting a Deployment Slot
To **delete a deployment slot** in an Azure App Service, execute the following command:
`az webapp deployment slot delete -n "web app name" -g "resource group name" -s "deployment slot name"`
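Putting the commands above together, a typical staging workflow looks like the sketch below (the web app, resource group, and slot names are hypothetical, and each command is echoed as a dry run so nothing runs against your subscription):

```shell
# Hypothetical names; substitute your own web app and resource group
APP="my-web-app"
RG="my-resource-group"
SLOT="staging"

# Dry run: remove the leading 'echo' on each line to actually execute
echo az webapp deployment slot create -n "$APP" -g "$RG" -s "$SLOT"
echo az webapp deployment slot list -n "$APP" -g "$RG"
echo az webapp deployment slot swap -n "$APP" -g "$RG" -s "$SLOT" --target-slot production
echo az webapp deployment slot delete -n "$APP" -g "$RG" -s "$SLOT"
```

Swapping into `production` promotes the validated staging content while keeping the old production content in the staging slot, so you can swap back if something goes wrong.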
#### Conclusion
blog/blog/tip112.md
---
type: post
title: "Tip 112 - Quick and Dirty User Authentication with Azure Web Apps and MVC5"
excerpt: "A tutorial on how to do quick and dirty user authentication with Azure Web Apps and MVC 5"
tags: [Web]
date: 2018-04-08 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Quick and Dirty User Authentication with Azure Web Apps and MVC5
#### What is Quick and Dirty User Authentication?
When I'm building out a website with MVC5 and Azure, it typically lands on *.azurewebsites.net, and generally I don't need any user authentication. When I do need it, I typically need one administrator account and zero users. So why didn't I just go to **Settings** -> **Authentication/Authorization** and turn on AAD, or set up a Gmail, Twitter, etc. login? It turns out I could have set something like that up (after spending time researching how), but I really just needed a layer of authentication for myself (the one and only administrator) that keeps anyone without the password out of my site. I didn't want to use any of the built-in authentication methods of ASP.NET either, as I didn't want or need a database to maintain.
#### My Requirements
* My requirements are a login page that has a username and password.
* I will store the actual username and password as a setting in Azure App Settings for my web app.
* I'll use Forms authentication.
* I want to do this with the FREE tier of Azure App Service.
#### How I roll Single User Authentication with Azure Web Apps and MVC5
Part 1:
* I create a new MVC5 App in Visual Studio.
* In my **web.config**, I add the following lines inside my **System.Web** tags to turn on forms-based authentication configuration.
```xml
<authentication mode="Forms">
  <forms loginUrl="~/Account/LogOn" timeout="30"/>
</authentication>
```
Note: Make sure you are using the root `web.config`.
<img :src="$withBase('/files/usersecret1.png')">
Part 2:
* Add the following filter to register the **AuthorizeAttribute** inside the `~\App_Start\FilterConfig.cs`.
```csharp
filters.Add(new AuthorizeAttribute());
```
Part 3:
* Create a ViewModel and give it the name of **LogOnViewModel**.
* I would put this in the following folder `Models\Account\LogOnViewModel.cs`.
* We'll use DataAnnotations and require a Username and Password.
```csharp
using System.ComponentModel.DataAnnotations;
namespace MVCMobileApp.Models.Account
{
public class LogOnViewModel
{
[Required]
[Display(Name = "User name")]
public string UserName { get; set; }
[Required]
[DataType(DataType.Password)]
[Display(Name = "Password")]
public string Password { get; set; }
}
}
```
Part 4:
* We need to add a controller named **AccountController** in our Controllers folder.
* We'll allow anonymous access to this controller and validate whether the user typed in the proper username and password as defined by our **App Settings stored in Azure**.
* If the credentials are valid, we redirect to our Home page; otherwise, we display an error.
* By using the CloudConfigurationManager, I can also commit this to GitHub without worrying that my secrets will be revealed.
```csharp
public class AccountController : Controller
{
[AllowAnonymous]
public ActionResult LogOn()
{
LogOnViewModel model = new LogOnViewModel();
return View(model);
}
[AllowAnonymous]
[HttpPost]
public ActionResult LogOn(LogOnViewModel model, string returnUrl)
{
if (ModelState.IsValid)
{
if (model.UserName == CloudConfigurationManager.GetSetting("UName") && model.Password == CloudConfigurationManager.GetSetting("UPw"))
{
FormsAuthentication.SetAuthCookie(model.UserName, false);
return RedirectToAction("Index", "Home");
}
else
{
ModelState.AddModelError("", "Incorrect username or password");
}
}
return View(model);
}
public ActionResult LogOff()
{
// FormsAuthentication.SignOut clears the auth cookie; removing it from
// Request.Cookies only affects the current request, so it isn't needed.
FormsAuthentication.SignOut();
return RedirectToAction("Index", "Home");
}
}
```
Part 5:
* You'll need to use **NuGet** to pull in a reference to **Microsoft.WindowsAzure.ConfigurationManager** inside of Visual Studio.
<img :src="$withBase('/files/storagethroughcsharp2.png')"/>
Part 6:
* Go into your Azure Web App -> Settings -> Application Settings and define two keys with whatever name you want along with the value you want for your username and password.
<img :src="$withBase('/files/usersecret2.png')">
Part 7:
* Finally, add your view, named **LogOn.cshtml**. I typically put it inside a folder like `Views\Account`.
* This will simply create our form with a username and password. After the user submits their credentials, the controller validates them against the values stored in our Azure App Service settings.
```csharp
<meta name="viewport" content="width=device-width, initial-scale=1.0">
@model MVCMobileApp.Models.Account.LogOnViewModel
@{
Layout = null;
ViewBag.Title = "Log On";
ViewBag.ReturnUrl = Request["ReturnUrl"];
}
<div class="login">
@using (Html.BeginForm(null, null, new { returnUrl = ViewBag.ReturnUrl }, FormMethod.Post))
{
@Html.AntiForgeryToken()
@Html.ValidationSummary(true)<br />
@Html.TextBoxFor(m => m.UserName, new { placeholder = Html.DisplayNameFor(m => m.UserName) })<br />
@Html.PasswordFor(m => m.Password, new { placeholder = Html.DisplayNameFor(m => m.Password) })<br />
<button type="submit" class="btn btn-primary btn-block btn-large">Log On</button>
}
</div>
```
Part 8:
To protect a page (for example, the Index page in Home), decorate its controller as follows:
```csharp
[Authorize]
public class HomeController : Controller
...
```
Part 9:
Add a Log Off action link inside the `_Layout.cshtml` in the Shared folder.
```csharp
<li>@Html.ActionLink("Log Off", "LogOff", "Account", null, new { @class = "actnclass" })</li>
```
Nice! Our single-user authentication is now in place. See the quick demo below, and keep in mind that this really is **quick and dirty user authentication**.
<img :src="$withBase('/files/usersecret3.gif')">

blog/blog/tip113.md
---
type: post
title: "Tip 113 - Prevent secrets from getting out with .NET Core"
excerpt: "A tutorial on how to quickly hide secrets with .NET Core"
tags: [Web, Languages & Frameworks]
date: 2018-04-09 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Prevent secrets from getting out with .NET Core
I was recently building out a .NET Core console app that had secrets such as my **Bit.ly API key** and **Azure Storage Table DB connection string** (which also contains my password). I got busy, forgot what I was doing, and **committed it to GitHub**. That really sucks, but it's avoidable with a little up-front setup.
#### I had options, but didn't take them
So why didn't I use [Azure Key Vault](https://azure.microsoft.com/services/key-vault?WT.mc_id=azure-azuredevtips-azureappsdev) or [Secret Manager](https://docs.microsoft.com/aspnet/core/security/app-secrets?tabs=visual-studio?WT.mc_id=docs-azuredevtips-azureappsdev)?
For Azure Key Vault, I felt there was some overhead (such as learning it) that I didn't want to pay. It is also a very cheap service, but I wanted FREE. Regarding Secret Manager, that information is always stored in the user profile directory, such as `%APPDATA%\microsoft\UserSecrets\<userSecretsId>\secrets.json` on Windows or `~/.microsoft/usersecrets/<userSecretsId>/secrets.json` on Mac/Linux. This means that if other folks want your key store, they can target those directories because the JSON file is unencrypted. Not that my version is encrypted; it just isn't stored in the user profile directory.
#### How I Prevent secrets from getting pushed to GitHub with .NET Core
Part 1:
* I create a new .NET Core App in Visual Studio. (For example: A console app)
* I add a file called `appSecrets.json` and define a couple of secrets that I don't want getting out. Such as my Bit.ly API key and Azure Table Storage Connection String.
```json
{
"ConnectionStrings": {
"BitlyAPI": "A_BITLY_API_KEY",
"StorageAccountAPI": "MY_STORAGE_ACCOUNT_KEY"
}
}
```
Part 2:
* Set the `appSecrets.json` file to **Copy if newer** inside of Visual Studio.
<img :src="$withBase('/files/azconsecret1.png')">
Part 3:
* I add the following **NuGet** packages, which allow us to easily read a local JSON file (such as our `appSecrets.json`) and extract key pieces of information:
* Microsoft.Extensions.Configuration
* Microsoft.Extensions.Configuration.FileExtensions
* Microsoft.Extensions.Configuration.Json
Part 4:
* I add the following code inside the **Main** method.
* This uses ConfigurationBuilder and searches for the file.
```csharp
// Requires: using Microsoft.Extensions.Configuration; and using System.IO;
var builder = new ConfigurationBuilder()
.SetBasePath(Directory.GetCurrentDirectory())
.AddJsonFile("appSecrets.json", optional: false, reloadOnChange: true);
IConfigurationRoot configuration = builder.Build();
```
You can now access the value of the string with the following:
```csharp
configuration.GetConnectionString("StorageAccountAPI")
```
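The .NET snippet above boils down to "load a JSON file at runtime and read a key from it." Purely as an illustration of that flow outside .NET, here is the same idea sketched in Python, with a stand-in file matching the shape of `appSecrets.json` above:

```python
import json
from pathlib import Path

# Illustrative stand-in for appSecrets.json (same shape as above).
Path("appSecrets.json").write_text(json.dumps({
    "ConnectionStrings": {
        "BitlyAPI": "A_BITLY_API_KEY",
        "StorageAccountAPI": "MY_STORAGE_ACCOUNT_KEY",
    }
}))

# The moral equivalent of ConfigurationBuilder + GetConnectionString:
config = json.loads(Path("appSecrets.json").read_text())
storage_key = config["ConnectionStrings"]["StorageAccountAPI"]
print(storage_key)
```

Because the file is read at runtime rather than compiled in, nothing secret has to live in the code you commit.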
Part 5: **VERY IMPORTANT**
* Update your `.gitignore` to ignore the `appSecrets.json` file that we added.
```
#### Ignore Visual Studio temporary files, build results, and
#### files generated by popular Visual Studio add-ons.
appSecrets.json
```
<img :src="$withBase('/files/azconsecret2.png')">
You can verify this file is ignored by looking for the **red circle** if using Visual Studio.
<img :src="$withBase('/files/azconsecret3.png')">

blog/blog/tip114.md
---
type: post
title: "Tip 114 - Easily Send JSON to IoT Hub with C#"
excerpt: "A tutorial on how to quickly send JSON to IoT Hub with C#"
tags: [Management and Governance, Internet of Things]
date: 2018-04-15 17:00:00
---
::: tip
:bulb: Learn more : [Azure IoT Hub Overview](https://docs.microsoft.com/azure/iot-hub/about-iot-hub?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Easily Send JSON to IoT Hub with C#
I recently needed to send JSON that an IoT Hub could receive and display on an AZ3166 device. Once the AZ3166 device receives the message, it can do a number of things with the data, such as open a door.
Part 1:
* Create an IoT Hub and provision the MX Chip (AZ3166) as a device. While we could go into the Azure Portal, create a new IoT Hub, and walk through the setup of our device, there is an easier way.
* [Download the tools](https://docs.microsoft.com/azure/iot-hub/iot-hub-arduino-iot-devkit-az3166-get-started#prepare-the-development-environment?WT.mc_id=docs-azuredevtips-azureappsdev)
* Open **VS Code**, look under **Arduino Examples**, open the **GetStarted** sample, and run `task cloud-provision` in the VS Code terminal.
<img :src="$withBase('/files/aziothub1.png')">
* If you switch over to the Azure Portal and look under your new IoT Hub, then **Devices**, you should see your new device.
<img :src="$withBase('/files/aziothub2.png')">
Part 2:
* I took the code from the “GetStarted” sample found in VS Code and tweaked the `Screen.print(0, "Unlock Door");` lines in the `GetStarted.ino` file in the `Setup` method.
* Now my device prints the “Unlock Door” message in fancy yellow, displays the IP address, and waits for a message to be sent from IoT Hub.
<img :src="$withBase('/files/aziothub3.png')">
Part 3:
* Open Visual Studio and create a console application.
* Add NuGet package : Microsoft.Azure.Devices (Service SDK for Azure IoT Hub)
* I hardcoded my connection string (found in IoT Hub) and mocked the JSON data.
```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Devices;
namespace SendMessageToIoTHub
{
class Program
{
static ServiceClient serviceClient;
static string connectionString = "mykey";
static void Main(string[] args)
{
serviceClient = ServiceClient.CreateFromConnectionString(connectionString);
SendCloudToDeviceMessageAsync().Wait();
Console.ReadLine();
}
private async static Task SendCloudToDeviceMessageAsync()
{
string mockedJsonData =
"{ \"Locked\":true}";
var commandMessage = new Message(Encoding.ASCII.GetBytes(mockedJsonData));
await serviceClient.SendAsync("AZ3166", commandMessage);
}
}
}
```
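Hand-escaping the JSON string works for a tiny payload, but it gets brittle as the message grows. The safer pattern is to serialize a data structure instead; in C# you could use a JSON library such as Newtonsoft.Json's `JsonConvert.SerializeObject`, and the same idea looks like this (sketched in Python just to show the shape, with the `Locked` field matching the mocked payload above):

```python
import json

# Build the cloud-to-device payload from a dict instead of a
# hand-escaped string; the serializer handles quoting for us.
payload = json.dumps({"Locked": True})
print(payload)
```

Either way, the device sees the same small JSON document.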
Part 4:
* The MX Board receives the message from IoT Hub
* It prints the JSON message from the serviceClient code above to the board display.
<img :src="$withBase('/files/aziothub4.png')">

blog/blog/tip115.md
---
type: post
title: "Tip 115 - Remove Azure Secrets committed to GitHub"
excerpt: "A tutorial on how to quickly remove Azure secrets committed to GitHub"
tags: [Security, DevOps]
date: 2018-04-16 17:00:00
---
::: tip
:bulb: Learn more : [Keys, secrets, and certificates](https://docs.microsoft.com/azure/key-vault/about-keys-secrets-and-certificates?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Remove Azure Secrets committed to GitHub
#### Remove passwords committed to GitHub on accident
Writing code day after day means secrets, connection strings and more get added to your code accidentally. And if you are like me, they get committed to your GitHub repo and then you have to live in shame. :) In this post, I'll walk you through removing secrets from a GitHub repo that you've already committed the secret to.
Part 1 - Initial setup:
Scenario: You have committed a password with the value of `qph@}uC,7cGLBdsX` to your GitHub repo. This password should be confidential and not stored in the code.
How do you fix it?
* Ensure you have the repo on your local disk or clone a fresh copy with HTTPS or SSH. I'll use SSH `git clone git@github.com:mbcrump/crumpbot.git` as a sample.
* Clone a copy of your repo that has the secret stored using the mirror option, like the following `git clone --mirror git@github.com:mbcrump/crumpbot.git`.
* You'll now have a BARE repo. CD into it with `cd crumpbot.git` and run `ls -l` to list out the contents on macOS or `dir` on Windows.
Below is an example of my repo.
```
[mbcrump@Michaels-MBP-3]:[~/Documents/code]$ cd crumpbot.git
[mbcrump@Michaels-MBP-3]:[~/Documents/code/crumpbot.git] (BARE:master)$ ls -l
total 32
-rw-r--r-- 1 mbcrump staff 23 Dec 1 19:47 HEAD
-rw-r--r-- 1 mbcrump staff 211 Dec 1 19:47 config
-rw-r--r-- 1 mbcrump staff 73 Dec 1 19:47 description
drwxr-xr-x 13 mbcrump staff 416 Dec 1 19:47 hooks
drwxr-xr-x 3 mbcrump staff 96 Dec 1 19:47 info
drwxr-xr-x 27 mbcrump staff 864 Dec 1 19:48 objects
-rw-r--r-- 1 mbcrump staff 105 Dec 1 19:47 packed-refs
drwxr-xr-x 4 mbcrump staff 128 Dec 1 19:47 refs
```
Part 2 - Create a file of passwords that you'd like to remove:
* Create a `passwords.txt` file and enter the passwords (one per line) that you'd like to remove from your GitHub repo.
I created mine on macOS with `touch passwords.txt` or `echo some-text > passwords.txt` on Windows and added the password that I accidentally committed:
```
qph@}uC,7cGLBdsX
```
* Save the file.
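Before running BFG, it can help to confirm which files in a normal (non-bare) clone still contain one of those passwords. This is a hypothetical pre-flight check; the demo files below stand in for a real working tree:

```python
import tempfile
from pathlib import Path

# Demo working tree standing in for a real clone.
tree = Path(tempfile.mkdtemp())
(tree / "passwords.txt").write_text("qph@}uC,7cGLBdsX\n")
(tree / "bot.js").write_text('var password = "qph@}uC,7cGLBdsX"\n')
(tree / "README.md").write_text("nothing secret here\n")

# Read the secrets, then list files that still contain any of them.
secrets = [s for s in (tree / "passwords.txt").read_text().splitlines() if s]
hits = sorted(
    p.name
    for p in tree.rglob("*")
    if p.is_file() and p.name != "passwords.txt"
    and any(s in p.read_text(errors="ignore") for s in secrets)
)
print(hits)
```

Anything the scan reports is a file whose history BFG will rewrite.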
Part 3 - Install BFG:
Enter [BFG](https://rtyley.github.io/bfg-repo-cleaner/). According to the author:
>BFG is a simpler, faster alternative to git-filter-branch for cleansing bad data out of your Git repository history:
>Removing Crazy Big Files
>Removing Passwords, Credentials & other Private data
* Install [BFG](https://rtyley.github.io/bfg-repo-cleaner/) with `brew install bfg` assuming you have Homebrew installed and using a Mac or [download](https://rtyley.github.io/bfg-repo-cleaner/) the JAR file if you are on Windows.
Part 4 - Clean up the passwords previously committed:
* Run `bfg --replace-text passwords.txt crumpbot.git` on Mac or `java -jar bfg.jar --replace-text passwords.txt crumpbot.git` if using the JAR file.
* Below is output from that command:
```bash
[mbcrump@Michaels-MBP-3]:[~/Documents/code]$ bfg --replace-text passwords.txt crumpbot.git
Using repo : /Users/mbcrump/Documents/code/crumpbot.git
Found 2489 objects to protect
Found 2 commit-pointing refs : HEAD, refs/heads/master
Protected commits
-----------------
These are your protected commits, and so their contents will NOT be altered:
* commit 58969937 (protected by 'HEAD')
Cleaning
--------
Found 11 commits
Cleaning commits: 100% (11/11)
Cleaning commits completed in 96 ms.
Updating 1 Ref
--------------
Ref Before After
---------------------------------------
refs/heads/master | 58969937 | 3f9041c9
Updating references: 100% (1/1)
...Ref update completed in 24 ms.
Commit Tree-Dirt History
------------------------
Earliest Latest
| |
D D D D DD D D m m m
D = dirty commits (file tree fixed)
m = modified commits (commit message or parents changed)
. = clean commits (no changes to file tree)
Before After
-------------------------------------------
First modified commit | 39e68d03 | 95e6f9f4
Last dirty commit | 2007b5c5 | 0f57a693
Changed files
-------------
Filename Before & After
--------------------------------------------------------
bot.js | 1b55a8d0 ⇒ 02758dd8, cba19782 ⇒ db95f8c2, ...
In total, 19 object ids were changed. Full details are logged here:
/Users/mbcrump/Documents/code/crumpbot.git.bfg-report/2019-12-01/19-48-22
BFG run is complete! When ready, run: git reflog expire --expire=now --all && git gc --prune=now --aggressive
```
Part 5 - Pushing to GitHub:
* Run `git reflog expire --expire=now --all && git gc --prune=now --aggressive` as indicated by the output.
* Run `git push` to push it to your repo.
Part 6 - Wrap-up and verify your repo was updated successfully:
If you go back to your GitHub repo and look at prior commits, you should see `***REMOVED***` like the following:
```javascript
var tmi = require("tmi.js")
var channel = "mbcrump"
var config = {
options: {
debug: true
},
connection: {
cluster: "aws",
reconnect: true
},
identity: {
username: "mbcrump",
password: "***REMOVED***"
},
channels: [channel]
}
```
I hope this helps someone out there and if you want to stay in touch then I can be found on [Twitch](http://twitch.tv/mbcrump), [Twitter](http://twitter.com/mbcrump) or [GitHub](http://github.com/mbcrump).

blog/blog/tip116.md
---
type: post
title: "Tip 116 - Easily Upload and download Azure dashboards"
excerpt: "A tutorial on how to easily Upload and download Azure dashboards"
tags: [Management and Governance]
date: 2018-04-22 17:00:00
---
::: tip
:bulb: Learn more : [Azure portal documentation](https://docs.microsoft.com/azure/azure-portal?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Easily Upload and download Azure dashboards
Azure has recently added the ability to download or upload an existing Azure dashboard with just a couple of clicks. Previously, you had to use a separate tool like **Azure Resource Explorer** to get this data. Now, you'll see new **Download** and **Upload** options on your Azure dashboard, as shown below:
<img :src="$withBase('/files/azportal1.png')">
Once you download the file, it will be in JSON format. Below is a short snippet that lists my **Linux VM** that is on my portal page.
```json
"1": {
"position": {
"x": 4,
"y": 0,
"colSpan": 4,
"rowSpan": 6
},
"metadata": {
"inputs": [
{
"name": "queryInputs",
"value": {
"metrics": [
{
"resourceId": "/subscriptions/d1ecc7ac-c1d8-40dc-97d6-2507597e7404/resourcegroups/crumplinux-rg/providers/microsoft.compute/virtualmachines/crumplinux",
"name": "Disk Read Bytes"
},
...
```
<img :src="$withBase('/files/azportal2.png')">
Note the **colSpan** and **rowSpan** values here.
You can make this a smaller tile by changing them to:
```json
"colSpan": 2,
"rowSpan": 2
```
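If your dashboard has many tiles, editing each one by hand gets tedious. A small script can resize every tile in the downloaded file; this sketch assumes only the tile layout shown in the snippet above (entries with a `position` object), not the full dashboard schema:

```python
import json

# Minimal stand-in for the tile entries in a downloaded dashboard JSON;
# the real file nests these under additional levels.
tiles = {
    "1": {"position": {"x": 4, "y": 0, "colSpan": 4, "rowSpan": 6}},
    "2": {"position": {"x": 8, "y": 0, "colSpan": 6, "rowSpan": 4}},
}

# Shrink every tile to a 2x2 footprint.
for tile in tiles.values():
    tile["position"]["colSpan"] = 2
    tile["position"]["rowSpan"] = 2

print(json.dumps(tiles, indent=2))
```

The same loop works for any bulk edit, such as moving tiles or swapping resource IDs.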
Now save the file, upload it, and you have your new tile:
<img :src="$withBase('/files/azportal3.png')">

blog/blog/tip117.md
---
type: post
title: "Tip 117 - Enable HTTP 2.0 support for Azure App Service"
excerpt: "A tutorial on how to enable HTTP/2.0 support for Azure App Service"
tags: [Web]
date: 2018-04-23 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Enable HTTP/2.0 support for Azure App Service
Azure has recently rolled out the ability to switch any App Service to HTTP/2.0. It really is as easy as toggling a field in Azure Resource Manager, but first, why should you care about HTTP/2.0?
HTTP/2 supports request multiplexing, header compression, prioritization, and more intelligent packet streaming management. All of this reduces latency and accelerates content download on the modern web pages you should be writing now. :) If you want more details, then [this source](https://daniel.haxx.se/http2/) is the one that I personally trust.
#### Getting Started
Before you go to the Azure Portal, take your *.azurewebsites.net URL and test it [here](https://tools.keycdn.com/http2-test). It will quickly tell you whether or not your site supports HTTP/2.0. The reason I start with this site is that HTTP/2.0 will eventually be enabled automatically on new *.azurewebsites.net URLs.
Switch over to the **Azure Portal** now and click on **App Service** and then your existing site. Now click on **Resource Explorer** as shown below.
<img :src="$withBase('/files/azhttp2-1.png')">
It should navigate you to Azure Resource Explorer (Preview) and you'll just need to click on **config** and then **web**.
<img :src="$withBase('/files/azhttp2-2.png')">
Click on Read/Write, then toggle `"http20Enabled": false` to `"http20Enabled": true` and click **Put**.
<img :src="$withBase('/files/azhttp2-3.gif')">
Now you can go back to the [HTTP/2.0 testing tool](https://tools.keycdn.com/http2-test) and input your *.azurewebsites.net URL. I tested mine and received the following:
<img :src="$withBase('/files/azhttp2-4.png')">

blog/blog/tip119.md
---
type: post
title: "Tip 119 - Determine the outbound IP addresses of your Azure App Service"
excerpt: "Learn how to determine the outbound IP addresses of your Azure App Service"
tags: [Web]
date: 2018-04-29 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Determine the outbound IP addresses of your Azure App Service
Because some networks are locked down and only allow whitelisted IP addresses, I hear these questions a lot.
* What is my Azure Web App's outbound IP address?
* What IP addresses do I need to whitelist?
##### Question 1
For an individual Azure Web App, you can simply go to the **Properties** of the application:
<img :src="$withBase('/files/azoutbound1.png')">
You can click the copy button to add them to your clipboard.
##### Question 2
If you need to whitelist a region, first look up which region the application is currently deployed in. You can find this information on the **Overview** page of the application.
<img :src="$withBase('/files/azoutbound2.png')">
Microsoft provides a list of the [Azure Data Center IP ranges in XML format](https://www.microsoft.com/download/details.aspx?id=41653?WT.mc_id=microsoft-azuredevtips-azureappsdev).
Download this file and search for your region, and you'll know which IP ranges to whitelist.
<img :src="$withBase('/files/azoutbound3.png')">
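You can also pull the ranges for your region out of the XML programmatically. This sketch assumes the file's `Region`/`IpRange` element layout, with an inline sample standing in for the real download:

```python
import xml.etree.ElementTree as ET

# Inline stand-in for the downloaded "Azure Data Center IP ranges" XML.
sample = """\
<AzurePublicIpAddresses>
  <Region Name="useast">
    <IpRange Subnet="23.96.0.0/17"/>
    <IpRange Subnet="23.98.45.0/24"/>
  </Region>
  <Region Name="europewest">
    <IpRange Subnet="13.69.0.0/17"/>
  </Region>
</AzurePublicIpAddresses>
"""

# Find one region by name and collect its subnets.
root = ET.fromstring(sample)
region = root.find(".//Region[@Name='useast']")
subnets = [ip.get("Subnet") for ip in region.findall("IpRange")]
print(subnets)
```

Point `ET.parse` at the downloaded file instead of the inline sample to get the real list for your region.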

blog/blog/tip12.md
---
type: post
title: "Tip 12 - Easily Start, Restart, Stop or Delete Multiple VMs"
excerpt: "Learn how to quickly start, restart, stop or delete Multiple VMs with just one click"
tags: [Virtual Machines]
date: 2017-09-06 17:00:00
---
::: tip
:bulb: Learn more : [Azure Virtual Machines](https://docs.microsoft.com/azure/virtual-machines/?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [How to start, restart, stop or delete multiple VMs](https://www.youtube.com/watch?v=cePvuKDdNv8&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=10?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### Easily Start, Restart, Stop or Delete Multiple VMs
You may be aware that you can restart, start, stop, or delete a VM, but did you know that you can select multiple VMs at the same time? Just open the Azure Portal, select the VMs you wish to control, and press the desired button. It really is that easy!
<img :src="$withBase('/files/azuretip12.gif')">

blog/blog/tip120.md
---
type: post
title: "Tip 120 - Run Azure PowerShell Cmdlets in Visual Studio 2017"
excerpt: "Learn how to run Azure PowerShell Cmdlets in Visual Studio 2017"
tags: [Languages & Frameworks, Visual Studio Family]
date: 2018-04-30 17:00:00
---
::: tip
:bulb: Learn more : [Overview of Azure PowerShell](https://docs.microsoft.com/powershell/azure/overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Run Azure PowerShell Cmdlets in Visual Studio 2017
Because some folks like to work with PowerShell and Azure AND want to stay inside of Visual Studio 2017, I hear the following questions from time to time.
* I use the Windows PowerShell ISE and would like to run cmdlets using Visual Studio 2017. How can I do this?
* Are there any NuGet packages available for referencing the library for Azure PowerShell commands?
#### Grab the extension or install via the VS 2017 installer
Since folks typically have VS already installed, the easiest way is to grab the extension. If VS is open, then go to **Tools** and **Extensions** and search the online marketplace for **PowerShell**.
<img :src="$withBase('/files/powershellext1.png')">
Download and install the extension and you'll have access to PowerShell in **Other Languages** and can create a **PowerShell project** and manage your existing .ps1 files.
<img :src="$withBase('/files/powershellext2.png')">
If you prefer an interactive window, then click **View** -> **Other Windows** -> **PowerShell Interactive Window** and run your cmdlet:
<img :src="$withBase('/files/powershellext3.png')">
If you don't have Visual Studio installed, then you can install the PowerShell tools through the Visual Studio installer with the Azure workload.
1. Click the **Individual components** tab after selecting Azure development.
2. Look under **Optional**.
3. Check **PowerShell tools**.
4. Click **Install**.
and you should be off and running! I hope this helps.

blog/blog/tip122.md
---
type: post
title: "Tip 122 - Creating an IoT Hub for the IoT Button"
excerpt: "Learn how to configure and explore working with the IoT Button"
tags: [Identity]
date: 2018-05-14 17:00:00
---
::: tip
:bulb: Learn more : [Azure IoT Hub Overview](https://docs.microsoft.com/azure/iot-hub/about-iot-hub?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Creating an IoT Hub for the IoT Button
#### The Series So Far
At Build 2018, we first saw the [IoT Button](http://aka.ms/button?WT.mc_id=akams-azuredevtips-azureappsdev). I started [exploring the device](https://www.youtube.com/watch?v=OdGHWwRBf_c?WT.mc_id=youtube-azuredevtips-azureappsdev) with the very first unboxing and decided to create a mini-series to walk you through using the device from start to finish. The series (so far) is below:
* [This post - Creating an IoT Hub for the IoT Button](https://microsoft.github.io/AzureTipsAndTricks/blog/tip122.html)
* [Configuring and Setting up the IoT Button](https://microsoft.github.io/AzureTipsAndTricks/blog/tip123.html)
* [Creating the Azure Logic App for our IoT Button](https://microsoft.github.io/AzureTipsAndTricks/blog/tip124.html)
* [Using Azure Function to call our Logic App with the IoT Button](https://microsoft.github.io/AzureTipsAndTricks/blog/tip125.html)
I recently recorded a fun video with my daughter unboxing the new [IoT Button](http://aka.ms/button?WT.mc_id=akams-azuredevtips-azureappsdev) that was handed out at Build 2018.
<iframe width="560" height="315" src="https://www.youtube.com/embed/OdGHWwRBf_c?rel=0" frameborder="0" allow="autoplay; encrypted-media" allowfullscreen></iframe>
#### We need an IoT Hub, Captain!
Before we can start enjoying the IoT Button, we first need to set up an IoT Hub.
Go inside of the Azure Portal and search for **IoT Hub** and begin to create one. Fill out the following information, but keep notepad open and save the **IoT Hub Name**.
<img :src="$withBase('/files/iotbutton1.png')">
Make sure that you select the **Free Tier** and you can leave the rest at defaults. If you already have an IoT Hub with a free tier, then you'll need to either use that one or delete it to create another free tier.
<img :src="$withBase('/files/iotbutton2.png')">
Once it's created, save your hostname, as you'll use it later.
<img :src="$withBase('/files/iotbutton3.png')">
You'll want to click on **Shared Access Policies**, then **iothubowner**, and copy and paste the **Connection String - Primary** for later use.
<img :src="$withBase('/files/iotbutton4.png')">
Now download [Device Explorer](https://github.com/Azure/azure-iot-sdks/releases?WT.mc_id=github-azuredevtips-azureappsdev) for Windows, or use the iothub-explorer tool for Mac. Paste in your IoT Hub connection string and press **Update**. You should see SAS populate.
<img :src="$withBase('/files/iotbutton5.png')">
Switch over to the **Management** tab, click **Create**, give the device a name, select **Auto-Generate Keys**, and then click **Create**.
<img :src="$withBase('/files/iotbutton6.png')">
The keys are now created; copy them someplace safe.
<img :src="$withBase('/files/iotbutton7.png')">
Right-click on the newly created device and select **Copy connection string for selected device**. Now that you have the keys, we'll configure the device tomorrow.

blog/blog/tip123.md
---
type: post
title: "Tip 123 - Configuring and Setting up the IoT Button"
excerpt: "Learn how to configure and explore working with the IoT Button"
tags: [Identity]
date: 2018-05-15 17:00:00
---
::: tip
:bulb: Learn more : [Azure IoT Hub Overview](https://docs.microsoft.com/azure/iot-hub/about-iot-hub?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Configuring and Setting up the IoT Button
#### The Series So Far
At Build 2018, we first saw the [IoT Button](http://aka.ms/button?WT.mc_id=akams-azuredevtips-azureappsdev). I started [exploring the device](https://www.youtube.com/watch?v=OdGHWwRBf_c?WT.mc_id=youtube-azuredevtips-azureappsdev) with the very first unboxing and decided to create a mini-series to walk you through using the device from start to finish. The series (so far) is below:
* [Creating an IoT Hub for the IoT Button](https://microsoft.github.io/AzureTipsAndTricks/blog/tip122.html)
* [This post - Configuring and Setting up the IoT Button](https://microsoft.github.io/AzureTipsAndTricks/blog/tip123.html)
* [Creating the Azure Logic App for our IoT Button](https://microsoft.github.io/AzureTipsAndTricks/blog/tip124.html)
* [Using Azure Function to call our Logic App with the IoT Button](https://microsoft.github.io/AzureTipsAndTricks/blog/tip125.html)
In the previous blog post, I walked you through creating the IoT Hub that we'll be using for the rest of this series. Now we're going to look at configuring the actual IoT Button to make use of our IoT Hub.
#### Configuring and Setting up the IoT Button
You first need to hold the button down until you see a yellow LED, then release it. This exposes an access point (AP) that you can connect to from your Windows or Mac machine. Once connected, browse to the device configuration page at http://192.168.4.1 and you'll see the following:
Click on the **IoT Hub Configuration** page and provide the Azure IoT Hub, IoT Device name and secret (that you got out of Device Explorer) and click Save.
<img :src="$withBase('/files/iotbutton9.png')">
If you click on **TimeServer**, then you can change the default time server. It should be fine as is.
<img :src="$withBase('/files/iotbutton10.png')">
Now you'll need to configure the Wi-Fi network that the device connects to when you click the button. You need a 2.4 GHz wireless network with WPA2-PSK; at least, that is what works at my home. Make sure you save the information after entering it.
<img :src="$withBase('/files/iotbutton11.png')">
Next up is **IP Configuration**. I am using DHCP, so I left this as **yes** and 0.0.0.0 for everything except **Netmask**, which needs to be 255.255.255.0.
<img :src="$withBase('/files/iotbutton12.png')">
The **User JSON** screen is a great place to paste some sample JSON that you wish to test with.
<img :src="$withBase('/files/iotbutton13.png')">
Finally, the **Shutdown** page lets you save all settings and exit.
<img :src="$withBase('/files/iotbutton14.png')">
Once this is complete, switch back over to **Device Explorer**, click on the **Data** tab, ensure the right Event Hub and Device ID are selected, and press **Monitor**. Press your button once and you should see the sample data coming through. Nice!
<img :src="$withBase('/files/iotbutton15.png')">

blog/blog/tip124.md
---
type: post
title: "Tip 124 - Creating the Azure Logic App for our IoT Button"
excerpt: "Learn how to configure and explore working with the IoT Button"
tags: [Internet of Things, Integration]
date: 2018-05-20 17:00:00
---
::: tip
:bulb: Learn more : [Azure IoT Hub Overview](https://docs.microsoft.com/azure/iot-hub/about-iot-hub?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Creating the Azure Logic App for our IoT Button
#### The Series so Far
At Build 2018, we first saw the [IoT Button](http://aka.ms/button?WT.mc_id=akams-azuredevtips-azureappsdev). I started [exploring the device](https://www.youtube.com/watch?v=OdGHWwRBf_c?WT.mc_id=youtube-azuredevtips-azureappsdev) with the very first unboxing and decided to create a mini-series to walk you through how to use the device from start to finish. The series (so far) is listed below:
* [Creating an IoT Hub for the IoT Button](https://microsoft.github.io/AzureTipsAndTricks/blog/tip122.html)
* [Configuring and Setting up the IoT Button](https://microsoft.github.io/AzureTipsAndTricks/blog/tip123.html)
* [Creating the Azure Logic App for our IoT Button](https://microsoft.github.io/AzureTipsAndTricks/blog/tip124.html)
* [Using Azure Function to call our Logic App with the IoT Button](https://microsoft.github.io/AzureTipsAndTricks/blog/tip125.html)
#### Creating the Azure Logic App for our IoT Button
Now that we know how to set up an IoT Hub and configure our device to get on the network and use said IoT Hub, we need to actually get to what we are going to build:
> An app that automatically adds a row to an Excel sheet that includes a time along with a status (such as Start or Stop).
Add an **Azure Logic App** and provide the details for Name, Description, Resource Group and more and click Add.
<img :src="$withBase('/files/iotbutton16.png')">
Click on **When a HTTP request is received** to use it as our template.
<img :src="$withBase('/files/iotbutton17.png')">
We're going to pass a field into this Logic App that indicates whether the status is **Start** or **Stop** in the Excel sheet. This means that we need to add a field to the **Request Body JSON Schema** as shown below:
```json
{
"properties": {
"status": {
"type": "string"
}
},
"type": "object"
}
```
We'll also want to make sure that this is a **POST** method.
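Once the Logic App is saved and you've copied its trigger URL from the designer, you can sanity-check the schema by posting a matching body yourself. Here's a sketch; the URL below is a placeholder, and the real one includes `api-version`, `sp`, `sv`, and `sig` query parameters:

```shell
# Placeholder URL; copy the real one from the Logic App designer after saving.
LOGIC_APP_URL="https://example.westus.logic.azure.com/workflows/xxx/triggers/manual/paths/invoke"

# A body that matches the Request Body JSON Schema above.
BODY='{"status": "Start"}'

# Validate the body locally before sending it.
echo "$BODY" | python3 -m json.tool

# Send it to the Logic App (requires the real trigger URL):
# curl -X POST "$LOGIC_APP_URL" -H "Content-Type: application/json" -d "$BODY"
```

If the schema and body match, the run history in the designer will show the `status` value parsed out as dynamic content.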
Before we proceed to add an action, we need to open a tab and log in to OneDrive to create an Excel workbook and a table we can use in the next connector. I created one at the following location in my OneDrive: `/Excel/Book1.xlsx`. Go ahead and create some data as shown below, with at least a **StartTime** and a **Text** column, in your OneDrive account.
<img :src="$withBase('/files/iotbutton21.png')">
You'll also need to create a table inside of it. Just open the Excel file, add some data, and then click the **Table** button.
<img :src="$withBase('/files/iotbutton20.png')">
Now we'll add an action and use the **Add a row into a table (OneDrive)** connector.
<img :src="$withBase('/files/iotbutton18.png')">
And we'll provide our **File** and **Table** as indicated above.
For the **StartTime**, we'll want to use an **Expression** with the code `utcNow('M/d/yyyy h:mm')`. This will give us a nicely formatted date that we can easily work with.
For the **Text**, just use the **Dynamic content** called **status** that we are passing in.
<img :src="$withBase('/files/iotbutton19.png')">

blog/blog/tip125.md Normal file
---
type: post
title: "Tip 125 - Using Azure Function to call our Logic App with the IoT Button"
excerpt: "Learn how to use Azure Function to call our Logic App with the IoT Button"
tags: [Serverless, Internet of Things, Integration]
date: 2018-05-21 17:00:00
---
::: tip
:bulb: Learn more : [Azure IoT Hub Overview](https://docs.microsoft.com/azure/iot-hub/about-iot-hub?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Using Azure Function to call our Logic App with the IoT Button
#### Special Thanks
Special Thanks to [Stefan Wick](http://twitter.com/StefanWickDev?WT.mc_id=twitter-azuredevtips-azureappsdev) on the Azure IoT Team for his review and edits on this post.
#### The Series so Far
At Build 2018, we first saw the [IoT Button](http://aka.ms/button?WT.mc_id=akams-azuredevtips-azureappsdev). I started [exploring the device](https://www.youtube.com/watch?v=OdGHWwRBf_c?WT.mc_id=youtube-azuredevtips-azureappsdev) with the very first unboxing and decided to create a mini-series to walk you through how to use the device from start to finish. The series (so far) is listed below:
* [Creating an IoT Hub for the IoT Button](https://microsoft.github.io/AzureTipsAndTricks/blog/tip122.html)
* [Configuring and Setting up the IoT Button](https://microsoft.github.io/AzureTipsAndTricks/blog/tip123.html)
* [Creating the Azure Logic App for our IoT Button](https://microsoft.github.io/AzureTipsAndTricks/blog/tip124.html)
* [Using Azure Function to call our Logic App with the IoT Button](https://microsoft.github.io/AzureTipsAndTricks/blog/tip125.html)
We know how to set up an IoT Hub and configure our device to get on the network, and we've worked with Logic Apps to automatically add a row to an Excel sheet that includes a time along with a status (such as Start or Stop). All that is left is to add an Azure Function that calls the Logic App and passes a parameter.
#### Using Azure Function to call our Logic App with the IoT Button
Open Visual Studio and click on **Cloud** and then **Azure Functions** and give it a name and click OK.
<img :src="$withBase('/files/iotbutton22.png')">
We'll begin with an **Empty** project type and **Azure Function v2**. You can leave the Storage Account as the emulator. We won't need it.
<img :src="$withBase('/files/iotbutton23.png')">
After your project is loaded, we need to add an item. Right-click the project, select **Add new item**, select **Azure Functions**, give it a name, and click Add.
<img :src="$withBase('/files/iotbutton24.png')">
Make sure you select **IoT Hub trigger**.
<img :src="$withBase('/files/iotbutton25.png')">
For the Connection string, give it the name **IoTHubConnectionString** and leave the **Path** as-is and click OK.
Once your project settles down, go into your `local.settings.json` and add the following placeholder for our connection string (which will be used shortly):
```json
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"AzureWebJobsDashboard": "UseDevelopmentStorage=true",
"IoTHubConnectionString": ""
}
}
```
Go into the IoT Hub that you created, select Built-in endpoints, and copy the **Event Hub-compatible endpoint** to notepad somewhere.
<img :src="$withBase('/files/iotbutton28.png')">
For example mine is `Endpoint=sb://xxxservicebus.windows.net/;SharedAccessKeyName=xxx=;EntityPath=myioteventhubcompatiblename`. Now copy that into the **IoTHubConnectionString** field in our `local.settings.json`.
If you run the application now, you'll see our function can accept messages. So go ahead and press your IoT Button and you should see something return.
<img :src="$withBase('/files/iotbutton26.png')">
Now in order for our app to use the Logic App, we need to write some code, but remember how we added a **status** field that the Logic App is looking for?
We'll first create a file that stores the value **Start** or **Stop** and save it on the local machine or the published App Service site. We'll create a quick helper method that toggles the value every time the function runs.
```csharp
var persistedStatus = LoadCurrentStatus("status.txt");
log.Info("Status is " + persistedStatus);

// Toggles between "Start" and "Stop" on each run, persisting the value to disk.
private static string LoadCurrentStatus(string fileName)
{
    // %HOME%\data survives restarts on a published App Service site.
    var folder = Environment.ExpandEnvironmentVariables(@"%HOME%\data\AzureFunctionAppData");
    var fullPath = Path.Combine(folder, fileName);
    Directory.CreateDirectory(folder);

    // Default to "Start" on the very first run.
    string persistedState = "Start";
    if (File.Exists(fullPath))
    {
        persistedState = File.ReadAllText(fullPath);
        if (persistedState == "Start")
        {
            persistedState = "Stop";
        }
        else if (persistedState == "Stop")
        {
            persistedState = "Start";
        }
    }
    File.WriteAllText(fullPath, persistedState);
    return persistedState;
}
```
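If you want to see the toggle behavior in isolation before wiring it into the function, here's a quick shell equivalent of the same idea (the file path is illustrative):

```shell
# Toggle a Start/Stop status persisted to a file, defaulting to Start on first run.
STATUS_FILE=/tmp/status.txt
if [ -f "$STATUS_FILE" ] && [ "$(cat "$STATUS_FILE")" = "Start" ]; then
  STATUS="Stop"
else
  STATUS="Start"
fi
printf '%s' "$STATUS" > "$STATUS_FILE"
echo "Status is $STATUS"
```

Run it a few times and the status alternates between Start and Stop, exactly what the helper method above does on each function invocation.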
Now we need to send the status to our Logic App. We'll use **Newtonsoft.Json** by pulling in the proper NuGet package.
<img :src="$withBase('/files/iotbutton27.png')">
And we'll add the following class with a **status** field to be serialized:
```csharp
public class MyStatus
{
public string status { get; set; }
}
```
We'll initialize the field with the current status as indicated on our file system, set the header, and then call our Azure Logic App URL, which can be copied from the Logic App Designer after you save it.
<img :src="$withBase('/files/iotbutton29.png')">
```csharp
// persistedStatus is the status value we read earlier from the file system;
// client is a shared HttpClient instance.
var myStatus = new MyStatus { status = persistedStatus };
var myContent = JsonConvert.SerializeObject(myStatus);
var buffer = System.Text.Encoding.UTF8.GetBytes(myContent);
var byteContent = new ByteArrayContent(buffer);
byteContent.Headers.ContentType = new MediaTypeHeaderValue("application/json");
await client.PostAsync("https://xxx.westus.logic.azure.com:443/workflows/xxx/triggers/manual/paths/invoke?api-version=2016-10-01&sp=%2Ftriggers%2Fmanual%2Frun&sv=1.0&sig=xxx", byteContent);
```
Now if we run the Azure Function (either locally or published) and press our IoT Button, then it will call the Logic App and finally insert a Start (or Stop) status along with the DateTime in our Excel workbook.
In the sample below, I'll use the [IoT Button Simulator](https://prodiotsimulator.blob.core.windows.net/site/index.html) to paste in my Connection String, which my Azure Function will process and it will call my Logic App which writes an entry to my table in MS Excel. Very cool!
<iframe width="560" height="315" src="https://www.youtube.com/embed/-EWIbX_DfF0?rel=0" frameborder="0" allow="autoplay; encrypted-media" allowfullscreen></iframe>

blog/blog/tip127.md Normal file
---
type: post
title: "Tip 127 - Mount a drive and upload files to Cloud Shell"
excerpt: "Learn how to mount a drive and upload files to Cloud Shell"
tags: [Management and Governance]
date: 2018-05-29 17:00:00
---
::: tip
:bulb: Learn more : [Overview of Azure Cloud Shell](https://docs.microsoft.com/azure/cloud-shell/overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Mount a drive and upload files to Cloud Shell
#### Overview
When working with [Azure Cloud Shell](http://shell.azure.com), you sometimes need the ability to upload files to work with later. I'm going to call out the two methods that I use to accomplish this task all the time.
#### Mount a drive and upload via the Azure Portal
In method one, we'll update the file share that's associated with Cloud Shell by using the `clouddrive mount` command. Note that you may already have a cloud drive that was created when you first started Cloud Shell. Go ahead and spin up Azure Cloud Shell and type `clouddrive -h` to see the commands to mount and unmount a drive.
```shell
michael@Azure:~$ clouddrive -h
Group
clouddrive :Manage storage settings for Azure Cloud Shell.
Commands
mount :Mount a file share to Cloud Shell.
unmount :Unmount a file share from Cloud Shell.
```
To mount a drive, we'll type `clouddrive mount -h` to see a help screen that is looking for the following parameters:
```shell
Arguments
-s | --subscription id [Required]:Subscription ID or name.
-g | --resource-group group [Required]:Resource group name.
-n | --storage-account name [Required]:Storage account name.
-f | --file-share name [Required]:File share name.
-d | --disk-size size :Disk size in GB. (default 5)
-F | --force :Skip warning prompts.
-? | -h | --help :Shows this usage text.
```
We'll now simply call `clouddrive mount -s subscription-id -g your-resource-group-name -n storage-account -f storage-file-name` to create our drive. Once it has completed, we'll navigate to the resource and hit the **Upload** button and upload a file. Again, you could have navigated to your existing resource group instead of creating a new one - but I want you to learn how to do this manually.
<img :src="$withBase('/files/cloudshellnew1.png')">
Now type `cd clouddrive` and `ls -l` and you should see the file you just uploaded:
```shell
michael@Azure:~/clouddrive$ ls -l
total 53
-rwxrwxrwx 1 root root 53385 May 29 23:55 cloudshellnew1.png
michael@Azure:~/clouddrive$
```
#### Upload via Cloud Shell button
The second method involves pressing the **Upload** button built right into Azure Cloud Shell.
<img :src="$withBase('/files/cloudshellnew2.png')">
After you press this button and provide the file, you'll see that it is uploaded into your `/home/username` folder.
<img :src="$withBase('/files/cloudshellnew3.png')">
Now you can simply type `cp filename clouddrive` to copy the file and have access to it via the cloud drive.

blog/blog/tip128.md Normal file
---
type: post
title: "Tip 128 - Download all Azure Documentation for offline viewing"
excerpt: "Learn how to quickly download all of the Azure documentation for offline viewing"
tags: [Productivity]
date: 2018-06-03 17:00:00
---
::: danger
This tip is marked obsolete. More info can be found [here](https://github.com/microsoft/AzureTipsAndTricks/issues/128).
:::
::: tip
:bulb: Learn more : [Azure Security Center](https://docs.microsoft.com/azure/security-center/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Download all Azure Documentation for offline viewing
There have been several times when I've wished to have all the Azure documentation on my local computer, whether for a flight or elsewhere. I've never found a way except finding the [pieces of the documentation](https://docs.microsoft.com/azure/security-center/) that I wanted and pressing the **Download PDF** button.
<img :src="$withBase('/files/documentation1.png')">
Until Now...
If you want to download **ALL** of the Azure documentation, then follow the instructions below:
1.) You'll need to first download [jq](https://stedolan.github.io/jq/download/), which is a JSON processor. If you have a Mac, then you can use `brew install jq`, or on Windows use Chocolatey: `choco install jq`. Sample output from my machine is below:
```shell
Michaels-MBP:Documents mbcrump$ brew install jq
==> Installing jq
==> Downloading https://homebrew.bintray.com/bottles/jq-1.5_3.high_sierra.bottle
################################################################################################################################################ 100.0%
==> Pouring jq-1.5_3.high_sierra.bottle.tar.gz /usr/local/Cellar/jq/1.5_3: 19 files, 946.6KB
```
2.) Next you'll need to run the following command which uses curl and jq to download every PDF contained in the [GitHub repo](https://api.github.com/repositories/72685026/contents/articles?WT.mc_id=github-azuredevtips-azureappsdev):
```shell
for article in $(curl -s https://api.github.com/repositories/72685026/contents/articles | jq -r '.[] | select(.type | contains("dir")) | .name'); do
wget "https://docs.microsoft.com/en-us/azure/opbuildpdf/$article/toc.pdf" -O$article.pdf;
done
```
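The trickiest part of that one-liner is the jq filter that keeps only directory entries. If you'd like to see what it selects without hitting the network (or installing jq), here's the same selection step sketched in Python against a stub of the API response:

```shell
# Stub of the GitHub contents API response: one directory entry, one file entry.
STUB='[{"name":"app-service","type":"dir"},{"name":"index.md","type":"file"}]'

# Keep only the names of "dir" entries, mirroring the jq filter above.
echo "$STUB" | python3 -c '
import json, sys
for item in json.load(sys.stdin):
    if item["type"] == "dir":
        print(item["name"])
'
```

This prints `app-service`, which is the kind of name the jq expression feeds into the `wget` loop.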
3.) Give it some time as it is about 2GB and check the folder where you ran that command.
<img :src="$withBase('/files/documentation2.png')">
4.) Success! You'll see all the PDF files, and you now have a current snapshot of Azure's documentation.

blog/blog/tip129.md Normal file
---
type: post
title: "Tip 129 - Using OCR to extract text from images from the Azure Portal"
excerpt: "Learn how to use OCR to extract text from images from the Azure Portal"
tags: [AI + Machine Learning]
date: 2018-06-04 17:00:00
---
::: tip
:bulb: Learn more : [Azure Cognitive Services](https://docs.microsoft.com/azure/cognitive-services?WT.mc_id=docs-azuredevtips-azureappsdev).
:bulb: Check out [Azure AI resources for developers](https://azure.microsoft.com/en-us/overview/ai-platform/dev-resources/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Using OCR to extract text from images from the Azure Portal
I recently needed the ability to extract text from an image. I was very cautious, as several free alternatives on the web said they may keep the image (and/or text). So I did what any developer would do and rolled my own. But instead of creating an application, I took it upon myself to use the power of the Azure Portal to accomplish this.
1.) Open the [Azure Portal](https://portal.azure.com) and select Cloud Shell from the top menu. Now change the scripting language from Bash to PowerShell.
<img :src="$withBase('/files/powershell1.png')">
2.) We now need to install the PowerShell Cognitive Services module. You can do so by typing `Install-Module PSCognitiveservice -Verbose -Force`. Sample output is below:
```powershell
PS Azure:\> Install-Module PSCognitiveservice -Verbose -Force
VERBOSE: Using the provider 'PowerShellGet' for searching packages.
VERBOSE: The -Repository parameter was not specified. PowerShellGet will use all of the registered repositories.
VERBOSE: Getting the provider object for the PackageManagement Provider 'NuGet'.
VERBOSE: The specified Location is 'https://www.powershellgallery.com/api/v2/' and PackageManagementProvider is 'NuGet'.
VERBOSE: Searching repository 'https://www.powershellgallery.com/api/v2/FindPackagesById()?id='PSCognitiveservice'' for ''.
...
VERBOSE: Module 'pscognitiveservice' was installed successfully to path 'C:\Users\ContainerAdministrator\Documents\WindowsPowerShell\Modules\pscognitiveservice\0.3.5'.
Azure:\
PS Azure:\>
```
3.) Import the module by typing `Import-Module PSCognitiveservice -Verbose`. Sample output is below:
```powershell
PS Azure:\> Import-Module PSCognitiveservice -Verbose
VERBOSE: Loading module from path 'C:\Users\ContainerAdministrator\Documents\WindowsPowerShell\Modules\PSCognitiveservice\0.3.5\PSCognitiveservice.psd1'.
VERBOSE: Populating RepositorySourceLocation property for module PSCognitiveservice.
VERBOSE: Loading module from path 'C:\Users\ContainerAdministrator\Documents\WindowsPowerShell\Modules\PSCognitiveservice\0.3.5\PSCognitiveService.psm1'.
VERBOSE: Importing function 'ConvertTo-Thumbnail'.
VERBOSE: Importing function 'Get-Face'.
VERBOSE: Importing function 'Get-ImageAnalysis'.
...
VERBOSE: Importing alias 'analyze'.
VERBOSE: Importing alias 'bing'.
...
VERBOSE: Importing alias 'ocr'.
VERBOSE: Importing alias 'sentiment'.
VERBOSE: Importing alias 'tag'.
VERBOSE: Importing alias 'thumbnail'.
Azure:\
```
4.) Now you'll need to create a Cognitive Services account. You can do so either in the portal (you may already have one) or by typing `New-CognitiveServiceAccount -AccountType ComputerVision -Verbose`. Just make sure you select **Free**.
<img :src="$withBase('/files/powershell2.png')">
5.) Load the configuration into your environment variables with `lcfg -fromAzure -Verbose`. Sample output is below:
```powershell
PS Azure:\> lcfg -fromAzure -Verbose
VERBOSE: Testing Azure login
VERBOSE: Logged in.
VERBOSE: Fetching AzureRM Cognitive Service accounts
VERBOSE: 1 Service found in AzureRM [ComputerVision]
VERBOSE: Setting $env:API_SubscriptionKey_ComputerVision for Cognitive Service: ComputerVision
VERBOSE: Setting $env:API_Location_ComputerVision for Cognitive Service: ComputerVision
Name Value
---- -----
ServiceName ComputerVision
Location westus
SubscriptionKey yourkey
EndPoint https://westus.api.cognitive.microsoft.com/vision/v1.0
```
6.) Type the following line, replacing the **URL** with your image's URL:
`ocr -URL https://s3-us-west-2.amazonaws.com/i.cdpn.io/10994.zekgx.4df25d8a-eb50-4007-a0df-6ae0aaf87974.png -Verbose | ForEach-Object {$_.regions.lines} | ForEach-Object { $_.words.text -join ' '}`
And you now have your text!

blog/blog/tip13.md Normal file
---
type: post
title: "Tip 13 - Demystifying storage in Cloud Shell"
excerpt: "Understand what the Azure Cloud Shell is using storage for."
tags: [Management and Governance]
date: 2017-09-10 17:00:00
---
::: tip
:bulb: Learn more : [Overview of Azure Cloud Shell](https://docs.microsoft.com/azure/cloud-shell/overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [How the Azure Cloud Shell uses storage](https://www.youtube.com/watch?v=JRvKnMqdBcY&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=11?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### Demystifying storage in Cloud Shell
#### What's under the hood of Azure Cloud Shell?
The [Azure Cloud Shell](https://azure.microsoft.com/features/cloud-shell?WT.mc_id=azure-azuredevtips-azureappsdev) is something that I've taken for granted since it launched at Build 2017. I always knew that I could use it to run [CLI 2.0](https://docs.microsoft.com/cli/azure/install-azure-cli?view=azure-cli-latest?WT.mc_id=docs-azuredevtips-azureappsdev) commands and didn't really stop to think about what is "under the hood"... until now.
When you first open the Cloud Shell, you will find that it requires you to create a Storage account. The reason for that Storage account is to persist the scripts, keys, etc. that you'll use over and over as you interact with your resources.
You can find it once you go to your Resource Group and look for `cloud-shell*` as shown below.
<img :src="$withBase('/files/cloudshell1.png')">
If you drill down into the Storage account, you'll land on two directories - `.cloudconsole` and `.pscloudshell`. More on that later.
<img :src="$withBase('/files/cloudshell2.png')">
Open the Azure Cloud Shell inside of the portal by clicking on the icon at the top (looks like `>_`)
Keep in mind that the Cloud Shell is based off an open-source implementation of [Xterm.js](https://github.com/sourcelair/xterm.js?WT.mc_id=github-azuredevtips-azureappsdev) that emulates the terminal in your browser. It is talking over a web socket to a full Linux BASH shell. Begin by typing:
```shell
michael@Azure:~$ ls -l
total 0
lrwxrwxrwx 1 root root 23 Sep 10 16:27 clouddrive -> /usr/michael/clouddrive
```
Great, we see a `clouddrive` directory that is mapped to `/usr/michael/clouddrive`.
Change into that directory and list it out.
```shell
michael@Azure:~$ cd clouddrive
michael@Azure:~/clouddrive$ ls -l
total 0
michael@Azure:~/clouddrive$
```
Nothing there? Or is there?
Remember the `.cloudconsole` and `.pscloudshell` directories from above?
```shell
michael@Azure:~/clouddrive$ cd .cloudconsole
michael@Azure:~/clouddrive/.cloudconsole$ ls
acc_michael.img
```
Nice! We just found an `acc_michael.img` file. This is a 5 GB image that persists your home directory. You could also have navigated through the portal to see what was inside this directory, but now you understand the CLI better! For those that want an extra challenge, go to the Azure Portal, download the image file, and explore it. Feel free to post comments on what you found below.
So what about the other directory called `.pscloudshell`?
Well, this is for the PowerShell scripting language in Cloud Shell!

blog/blog/tip130.md Normal file
---
type: post
title: "Tip 130 - Manage Application Settings for Azure Functions within Visual Studio"
excerpt: "Learn how to manage Application Settings for Azure Functions within Visual Studio"
tags: [Serverless, Visual Studio Family]
date: 2018-06-10 17:00:00
---
::: tip
:bulb: Learn more : [Azure Functions Documentation](https://docs.microsoft.com/azure/azure-functions/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Manage Application Settings for Azure Functions within Visual Studio
Generally, when I'm working with Azure Functions and Visual Studio and need to add an **Application Setting**, I'll head over to the Azure Portal, click on my Function, then Configuration, and add the Application Setting.
<img :src="$withBase('/files/vsappsetting1.png')">
Over the weekend I was about to do the same and noticed that you can actually apply Application Settings within the IDE. Just go to **Publish** and then click on **Application Setting**.
<img :src="$withBase('/files/vsappsetting2.png')">
You'll now see a list of **Application Settings** that you can add, edit or delete. Easy...
<img :src="$withBase('/files/vsappsetting3.png')">

blog/blog/tip131.md Normal file
---
type: post
title: "Tip 131 - Quickly display a list of all Azure Web Apps URL from Azure Cloud Shell"
excerpt: "Learn how to quickly display a list of all Azure Web Apps URL from Azure Cloud Shell"
tags: [Web, Management and Governance]
date: 2018-06-11 17:00:00
---
::: tip
:bulb: Learn more : [Overview of Azure Cloud Shell](https://docs.microsoft.com/azure/cloud-shell/overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Quickly display a list of all Azure Web Apps URL from Azure Cloud Shell
Often I need to quickly list the URLs for all Azure App Services in a given resource group. In the past, when it was just a small number, I'd do it manually, but it has recently grown to the point where I needed to find a better way.
Enter PowerShell and Azure Cloud Shell.
If you are logged in to Azure Cloud Shell and using PowerShell, you can quickly run this command:
`Get-AzureRmWebApp | foreach-object {$_} | select-object SiteName, DefaultHostName, ResourceGroup`
<img :src="$withBase('/files/powershellallwebsites.png')">

blog/blog/tip132.md Normal file
---
type: post
title: "Tip 132 - Increase the timeout of ASP.NET Core 2.0 API hosted in Azure App Service"
excerpt: "Learn how to quickly increase the timeout of ASP.NET Core 2.0 API hosted in Azure App Service"
tags: [Web, Languages & Frameworks]
date: 2018-06-17 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Increase the timeout of ASP.NET Core 2.0 API hosted in Azure App Service
There are reasons that you **might** have a request that takes 2-3 minutes to complete, and this post is for you. For most scenarios, though, you should probably look at decoupling these long-running requests.
If you're using an ASP.NET Core 2.0 API deployed to an Azure App Service, then you might run into an issue where a request takes longer than 2 minutes to complete. You'll typically get a `502 Bad Gateway` with the following info:
`"The specified CGI application encountered an error and the server terminated the process".`
If you check your diagnostic log file you might see:
```
2018-06-15 03:47:03.232 +00:00 [Error] Microsoft.AspNetCore.Diagnostics.DeveloperExceptionPageMiddleware: An unhandled exception has occurred while executing the request
System.Threading.Tasks.TaskCanceledException: A task was canceled.
```
You can fix this by going into your web.config in your sites/wwwroot folder and adding a `requestTimeout="00:20:00"` attribute to the file as shown below.
```xml
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<system.webServer>
<handlers>
<add name="aspNetCore" path="*" verb="*" modules="AspNetCoreModule" resourceType="Unspecified" />
</handlers>
<aspNetCore processPath="dotnet" arguments=".\WebApplication1.dll" stdoutLogEnabled="false" stdoutLogFile=".\logs\stdout" requestTimeout="00:20:00" />
</system.webServer>
</configuration>
<!--ProjectGuid: 3b93921c-f843-46c8-914e-xxx-->
```

blog/blog/tip133.md Normal file
---
type: post
title: "Tip 133 - Use the Azure Portal for Durable Functions Development"
excerpt: "Learn how to quickly use the Azure Portal for Durable Functions Development"
tags: [Serverless]
date: 2018-06-18 17:00:00
---
::: tip
:bulb: Learn more : [Azure Functions Documentation](https://docs.microsoft.com/azure/azure-functions/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Use the Azure Portal for Durable Functions Development
Durable Functions addresses the task of managing state for an application. They are intended to address a variety of patterns and scenarios that would quickly get complicated using triggers, timers, etc. especially when orchestrating a range of activities with a set of tasks that need to happen each time a particular event occurs.
Here is one example: I have one task that causes another task to occur, and so on, with some conditional statements and other business logic to fork the workflow, but I'm trying to go from point A to point B. An example of this is called **Function chaining**. This refers to the pattern of executing a sequence of functions in a particular order.
Head over to our [docs](https://docs.microsoft.com/azure/azure-functions/durable-functions-sequence?WT.mc_id=docs-azuredevtips-azureappsdev) for more info or follow along with this tutorial and it might make sense.
#### Getting Started
Log into the Azure Portal and create a new Azure Function project like the following:
<img :src="$withBase('/files/azdfunc1.png')">
Configure the function app to use the 2.0 runtime version in the **Function app** settings tab.
<img :src="$withBase('/files/azdfunc2.png')">
Create a new custom function.
<img :src="$withBase('/files/azdfunc3.png')">
Search for the **Durable Functions Http Starter - C#** template.
<img :src="$withBase('/files/azdfunc4.png')">
Install the extension when prompted.
<img :src="$withBase('/files/azdfunc5.png')">
Give the orchestration client function the name **HttpStart**; it is created by selecting the **Durable Functions Http Starter - C#** template.
<img :src="$withBase('/files/azdfunc6.png')">
Once this is complete, then copy the URL as we'll use it later on.
Create a new orchestration function named **HelloSequence** and select **Durable Functions Orchestrator** template.
Create another function named **Hello** and use the **Durable Functions Activity** template.
Install [Postman](https://www.getpostman.com/apps), create a POST request, and use the following URL (after supplying your Azure Function app name): `https://yourfunctionname.azurewebsites.net/api/orchestrators/HelloSequence`
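If you'd rather stay on the command line than install Postman, the same request can be sketched with curl; the host below is the placeholder from above, so substitute the name of the function app you created:

```shell
# Placeholder host; substitute the name of the function app you created.
FUNC_URL="https://yourfunctionname.azurewebsites.net/api/orchestrators/HelloSequence"
echo "POST $FUNC_URL"

# With a real app deployed, issue the request (the sample orchestrator needs no body):
# curl -X POST "$FUNC_URL" -d ""
```

The response is the same JSON payload Postman shows, including the `statusQueryGetUri` link described below.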
You should see the following:
<img :src="$withBase('/files/azdfunc7.png')">
Click on one of the **statusQueryGetUri** URLs and you'll see the status of the Durable Function:
<img :src="$withBase('/files/azdfunc8.png')">

blog/blog/tip134.md Normal file
---
type: post
title: "Tip 134 - Use Run-From-Zip to deploy a site to Azure Web Apps or Functions"
excerpt: "Learn how to use Run-From-Zip to deploy a site to Azure Web Apps or Functions"
tags: [Web, Serverless]
date: 2018-06-24 17:00:00
---
::: tip
:bulb: Learn more : [Azure Functions Documentation](https://docs.microsoft.com/azure/azure-functions/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Use Run-From-Zip to deploy a site to Azure Web Apps or Functions
Here is a neat feature that I just discovered despite it being added about 6 months or so ago. It is the ability to deploy a site to Azure Web Apps or Azure Functions from a zip file.
With **Run-From-Zip** there is no longer a deployment step (such as Git or FTP) that copies the files to wwwroot. Instead, the zip file that you point to in your App Settings gets mounted on wwwroot as read-only.
To get started:
Using **Azure Storage Explorer**, create a storage blob container and upload your zip file and select **Generate SAS Signature** as shown below:
<img :src="$withBase('/files/azblobfunction1.png')">
Hit **Create** and then **Copy**
<img :src="$withBase('/files/azblobfunction2.png')">
<img :src="$withBase('/files/azblobfunction3.png')">
Now head back over to the Azure Portal and add an Azure App Setting called `WEBSITE_RUN_FROM_ZIP`, and point it to your zip file.
Mine looks like : `WEBSITE_RUN_FROM_ZIP=https://REMOVED.blob.core.windows.net/michael-test/MichaelSampleApp.zip?st=2018-06-24T22%3A16%3A40Z&se=2018-06-25T22%3A16%3A40Z&sp=rl&sv=2017-07-29&sr=b&sig=01h%3D`
Now give your site a couple of seconds and you should see the site that was deployed via a zip file.

blog/blog/tip135.md Normal file
---
type: post
title: "Tip 135 - Use Run-From-Zip without Azure Storage to deploy a site to Azure Web Apps or Functions"
excerpt: "Learn how to use Run-From-Zip to deploy a site to Azure Web Apps or Functions with Azure Storage"
tags: [Web, Serverless]
date: 2018-06-25 17:00:00
---
::: tip
:bulb: Learn more : [Azure Functions Documentation](https://docs.microsoft.com/azure/azure-functions/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Use Run-From-Zip without Azure Storage to deploy a site to Azure Web Apps or Functions
[Yesterday](https://microsoft.github.io/AzureTipsAndTricks/blog/tip134.html) I discussed a feature that gives you the ability to deploy a site to Azure Web Apps or Azure Functions from a zip file. It is called **Run-From-Zip**: you simply point to the zip file's location in your App Settings and it automatically gets mounted on wwwroot as read-only.
The one requirement it had was an Azure Storage blob container. If you don't want to use one, an alternative approach is to host the files on Kudu.
Open Kudu and create a `home\data\SitePackages` folder, and drop your zip file in there.
<img :src="$withBase('/files/azkudu1.png')">
Create a file named `packagename.txt` and give it the name of your zip file.
Mine looks like the following: `mcsample.zip`
In **Azure App Settings**, set `WEBSITE_RUN_FROM_ZIP` to `1` instead of the full path that we used yesterday with Azure Storage Blob Container.
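If you'd rather script this than use the Kudu UI, the same files can be pushed with Kudu's VFS REST API. This is a sketch, assuming a hypothetical app name `my-app`, deployment credentials in `KUDU_USER`/`KUDU_PASS`, and a local `mcsample.zip`:

```shell
KUDU="https://my-app.scm.azurewebsites.net/api/vfs"

# Upload the zip into home\data\SitePackages (VFS paths are relative to d:\home).
# If-Match: * allows overwriting an existing file.
curl -u "$KUDU_USER:$KUDU_PASS" -X PUT -H "If-Match: *" \
  --data-binary @mcsample.zip \
  "$KUDU/data/SitePackages/mcsample.zip"

# packagename.txt tells the runtime which zip in the folder to mount.
echo "mcsample.zip" > packagename.txt
curl -u "$KUDU_USER:$KUDU_PASS" -X PUT -H "If-Match: *" \
  --data-binary @packagename.txt \
  "$KUDU/data/SitePackages/packagename.txt"
```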

blog/blog/tip136.md Normal file
---
type: post
title: "Tip 136 - Quickly Restore your Local Settings File for Azure Functions"
excerpt: "Learn how to quickly restore your local settings file for Azure Functions"
tags: [Serverless]
date: 2018-07-01 17:00:00
---
::: tip
:bulb: Learn more : [Azure Functions Documentation](https://docs.microsoft.com/azure/azure-functions/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Quickly Restore your Local Settings File for Azure Functions
If you've ever worked with Azure Functions then no doubt you've seen the `local.settings.json` file before. This file stores app settings, connection strings, etc. for local development.
It looks like the following to refresh your memory:
```json
{
"IsEncrypted": true,
"Values": {
"FUNCTIONS_EXTENSION_VERSION": "VALUE",
"WEBSITE_CONTENTAZUREFILECONNECTIONSTRING": "VALUE",
"WEBSITE_CONTENTSHARE": "VALUE",
"AzureWebJobsDashboard": "VALUE",
"AzureWebJobsStorage": "VALUE",
"ConsumerKey": "VALUE",
"ConsumerSecret": "VALUE",
"OAuthTokenSecret": "VALUE",
"WEBSITE_TIME_ZONE": "VALUE"
},
"ConnectionStrings": {}
}
```
This file is also by default **not** checked into source control. If you open your `.gitignore` file you'll see the following:
```
#### Ignore Visual Studio temporary files, build results, and
#### files generated by popular Visual Studio add-ons.
# Azure Functions localsettings file
local.settings.json
```
With this knowledge, you might have a need one day to restore this file (for example, when working with source code pulled down onto another machine), and you can easily do so.
Simply install the [Azure Functions Core Tools](https://docs.microsoft.com/azure/azure-functions/functions-run-local#install-the-azure-functions-core-tools?WT.mc_id=docs-azuredevtips-azureappsdev) with `npm install -g azure-functions-core-tools`.
Navigate to the folder containing your Azure Function's source code and run `func azure account list`. This will ask you to log in, and you should make sure you are in the proper subscription where your Azure Function exists. You'll see something like the following:
```
C:\Users\mbcrump\src\FunctionTest>func azure account list
Subscription Current
------------ -------
Visual Studio Enterprise (xxx) True
Michael's Internal Subscription (xxx) False
```
If you're not in the right subscription then type `func azure account set <subid>` where `subid` is the correct subscription.
Now run `func azure functionapp fetch-app-settings <functionname>` where `functionname` is your Azure Function and it will restore your `local.settings.json` file!
<img :src="$withBase('/files/functioncliappsettings.png')">

blog/blog/tip137.md Normal file
---
type: post
title: "Tip 137 - Export Azure Resources to CSV files with PowerShell"
excerpt: "Learn how to quickly export Azure resources to CSV files with PowerShell"
tags: [Languages & Frameworks, Management and Governance]
date: 2018-07-02 17:00:00
---
::: tip
:bulb: Learn more : [Overview of Azure PowerShell](https://docs.microsoft.com/powershell/azure/overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Export Azure Resources to CSV files with PowerShell
If you've ever had a need to create a CSV file of various Azure resources for reports, then this post is for you. I'm going to quickly show you how to generate a CSV file with PowerShell that lists the VMs in the active subscription along with a couple of additional details.
Begin by typing `Install-Module -Name AzureRM` and follow along with the prompts below. You may also install the newer version if you wish, but this should work fine in 5.x.
```powershell
Windows PowerShell
Copyright (C) Microsoft Corporation. All rights reserved.
PS C:\Windows\system32> Install-Module -Name AzureRM
NuGet provider is required to continue
PowerShellGet requires NuGet provider version '2.8.5.201' or newer to interact with NuGet-based repositories. The NuGet
provider must be available in 'C:\Program Files\PackageManagement\ProviderAssemblies' or
'C:\Users\mbcrump\AppData\Local\PackageManagement\ProviderAssemblies'. You can also install the NuGet provider by
running 'Install-PackageProvider -Name NuGet -MinimumVersion 2.8.5.201 -Force'. Do you want PowerShellGet to install
and import the NuGet provider now?
[Y] Yes [N] No [S] Suspend [?] Help (default is "Y"):
Untrusted repository
You are installing the modules from an untrusted repository. If you trust this repository, change its
InstallationPolicy value by running the Set-PSRepository cmdlet. Are you sure you want to install the modules from
'PSGallery'?
[Y] Yes [A] Yes to All [N] No [L] No to All [S] Suspend [?] Help (default is "N"): Y
WARNING: Version '5.7.0' of module 'AzureRM' is already installed at 'C:\Program
Files\WindowsPowerShell\Modules\AzureRM\5.7.0'. To install version '6.3.0', run Install-Module and add the -Force
parameter, this command will install version '6.3.0' in side-by-side with version '5.7.0'.
```
Run the `Import-Module AzureRM` command and then type `A` as shown below:
```powershell
PS C:\Windows\system32> Import-Module AzureRM
Do you want to run software from this untrusted publisher?
File C:\Program Files\WindowsPowerShell\Modules\AzureRM\5.7.0\AzureRM.psm1 is published by CN=Microsoft Corporation,
O=Microsoft Corporation, L=Redmond, S=Washington, C=US and is not trusted on your system. Only run scripts from trusted
publishers.
[V] Never run [D] Do not run [R] Run once [A] Always run [?] Help (default is "D"): A
```
In case you aren't in the working subscription where you have your resources then you might have to type `Set-AzureRmContext -SubscriptionId <SubId>` where `SubId` is the subscription id that you want to query.
Now add the following in your PowerShell window.
```powershell
$VMs = Get-AzureRmVM
$vmOutput = $VMs | ForEach-Object {
[PSCustomObject]@{
"VM Name" = $_.Name
"VM Type" = $_.StorageProfile.osDisk.osType
"VM Profile" = $_.HardwareProfile.VmSize
"VM OS Disk Size" = $_.StorageProfile.OsDisk.DiskSizeGB
"VM Data Disk Size" = ($_.StorageProfile.DataDisks.DiskSizeGB) -join ','
}
}
$vmOutput | export-csv C:\Users\mbcrump\data.csv -delimiter ";" -force -notypeinformation
```
You can now open this CSV file in Excel and use a `;` delimiter to format each item into columns.
<img :src="$withBase('/files/powershellexport.png')">
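If you'd rather not install the AzureRM module, a similar export can be produced with the cross-platform Azure CLI. A sketch (the column names here are my own choice):

```shell
# List VMs in the active subscription as tab-separated rows.
# --query uses JMESPath to project a few details per VM.
az vm list --query \
  "[].{Name:name, OsType:storageProfile.osDisk.osType, Size:hardwareProfile.vmSize, OsDiskGB:storageProfile.osDisk.diskSizeGb}" \
  --output tsv > data.csv
```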

blog/blog/tip138.md Normal file
---
type: post
title: "Tip 138 - Host a Static Website with Azure Storage"
excerpt: "Learn how to quickly host a static website with Azure Storage"
tags: [Storage, Web]
date: 2018-07-08 17:00:00
---
::: tip
:bulb: Learn more : [Azure storage account overview](https://docs.microsoft.com/azure/storage/common/storage-account-overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [How to host a static website with Azure Storage](https://www.youtube.com/watch?v=gYpNC_tdbQQ&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=51?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### Host a Static Website with Azure Storage
A feature that was recently announced is the ability to run a static website using Azure Storage. I decided to take it for a quick test spin and show you the experience.
Begin by creating a new **Azure Storage Account**: provide a name, and under **Account kind** make sure that you select **StorageV2**. Go ahead and configure the rest of the options and press **Create**.
<img :src="$withBase('/files/azurestoragestaticsite1.png')">
After the resource is created, go to **Settings** and select **Static website**. You'll see a couple of options after selecting **Enabled** for Static website.
Under the **Index Document Name** type `index.html` and under **Error document path** type `404.html`.
<img :src="$withBase('/files/azurestoragestaticsite2.png')">
Once you press **Save**, you'll see there is a `$web` folder that you can click on to upload your files. I simply dropped a single `index.html` file with some text to test. You'll also want to jot down the **Primary endpoint** location as you'll test your site with that URL.
<img :src="$withBase('/files/azurestoragestaticsite3.png')">
<img :src="$withBase('/files/azurestoragestaticsite5.png')">
Once you've uploaded your file to `$web` then go to your browser and paste in the URL provided in the previous step.
<img :src="$withBase('/files/azurestoragestaticsite4.png')">
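The same setup can be scripted end to end with the Azure CLI. A sketch, assuming a hypothetical account name `mystaticsite123` and a local `./site` folder containing `index.html` and `404.html`:

```shell
# Enable the static website feature and set the index/error documents.
az storage blob service-properties update \
  --account-name mystaticsite123 \
  --static-website \
  --index-document index.html \
  --404-document 404.html

# Upload the site content into the special $web container.
az storage blob upload-batch \
  --account-name mystaticsite123 \
  --source ./site \
  --destination '$web'

# Print the primary static website endpoint to test in a browser.
az storage account show \
  --name mystaticsite123 \
  --query "primaryEndpoints.web" --output tsv
```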

blog/blog/tip139.md Normal file
---
type: post
title: "Tip 139 - Prevent AzCopy Uploads from maxing out Internet Connection Speed"
excerpt: "Learn how to prevent AzCopy uploads from maxing out internet connection speed"
tags: [Storage]
date: 2018-07-09 17:00:00
---
::: tip
:bulb: Learn more : [AzCopy v10](https://docs.microsoft.com/azure/storage/common/storage-use-azcopy-v10?WT.mc_id=azure-azuredevtips-azureappsdev).
:::
### Prevent AzCopy Uploads from maxing out Internet Connection Speed
**What is AzCopy?** AzCopy is a command-line utility designed for copying data to/from Microsoft Azure Blob, File, and Table storage, using simple commands designed for optimal performance. You can copy data between a file system and a storage account, or between storage accounts. *(courtesy of docs)*
You can download either the latest version of AzCopy on [Windows](http://aka.ms/downloadazcopy?WT.mc_id=akams-azuredevtips-azureappsdev) or [Linux](https://docs.microsoft.com/azure/storage/common/storage-use-azcopy-linux?WT.mc_id=docs-azuredevtips-azureappsdev).
For this example, I'm going to use Windows. If you type AzCopy then you'll see there is a list of parameters available.
<img :src="$withBase('/files/azcopy1blog.png')">
At first glance, you may think this is all, but you can easily type `azcopy /?` to get a complete list of additional parameters - which includes the one that we'll talk about shortly.
If you are using **AzCopy** to send large amounts of data to Azure on either a residential or small business internet pipe, or if for whatever reason you want to limit the concurrent operations that the app uses, then this tip is for you.
Simply use the `/NC:"number-of-concurrent-operations"` where `number-of-concurrent-operations` specifies the number of concurrent operations. (for example: azcopy /NC:1)
```
AzCopy by default starts a certain number of concurrent operations to increase the data transfer throughput. Note that large number of concurrent operations in a low-bandwidth environment may overwhelm the network connection and prevent the operations from fully completing. Throttle concurrent operations based on actual available network bandwidth.
The upper limit for concurrent operations is 512.
Applicable to: Blobs, Files, Tables
```
For a complete list of parameters and additional information then visit [docs](https://docs.microsoft.com/azure/storage/common/storage-use-azcopy?toc=%2fazure%2fstorage%2ffiles%2ftoc.json#azcopy-parameters?WT.mc_id=docs-azuredevtips-azureappsdev)
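Putting it together, an upload throttled to two concurrent operations might look like this (a sketch with placeholder account, container, and key values; the `/NC` flag applies to the classic Windows AzCopy shown above, while AzCopy v10 controls concurrency with an environment variable instead):

```shell
# Classic AzCopy (v8.x on Windows): limit to 2 concurrent operations.
AzCopy /Source:C:\data /Dest:https://<account>.blob.core.windows.net/<container> \
  /DestKey:<storage-key> /S /NC:2

# AzCopy v10 equivalent: set AZCOPY_CONCURRENCY_VALUE for the run.
AZCOPY_CONCURRENCY_VALUE=2 azcopy copy "C:\data" \
  "https://<account>.blob.core.windows.net/<container>?<sas-token>" --recursive
```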

blog/blog/tip14.md Normal file
---
type: post
title: "Tip 14 - Generate SSH public key to log into Linux VM with Cloud Shell"
excerpt: "Learn how to generate SSH keys to log into a Linux VM with Cloud Shell and BASH on Windows 10"
tags: [Management and Governance, Virtual Machines]
date: 2017-09-11 17:00:00
---
::: tip
:bulb: Learn more : [Overview of Azure Cloud Shell](https://docs.microsoft.com/azure/cloud-shell/overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [How to generate SSH public key to log into Linux VM](https://www.youtube.com/watch?v=16bUZ43CGxs&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=12?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### Generate SSH keys to log into Linux VM with Cloud Shell
For these instructions, I'll assume you have a Linux VM already setup and connecting via Cloud Shell.
1.) Log into Azure Cloud Shell and type `ssh-keygen -t rsa -b 2048`. Accept all defaults by pressing Enter. It generates a public key that is stored in `/home/michael/.ssh/id_rsa.pub`, as shown below.
```
michael@Azure:~/clouddrive$ ssh-keygen -t rsa -b 2048
Generating public/private rsa key pair.
Enter file in which to save the key (/home/michael/.ssh/id_rsa):
Created directory '/home/michael/.ssh'.
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /home/michael/.ssh/id_rsa.
Your public key has been saved in /home/michael/.ssh/id_rsa.pub.
The key fingerprint is:
SHA256:FHZVjZfU0zZaXoEvbg37/YUW+02VMIXl6UtUIumpHs0 michael@cc-72f9-63c154d-32136390-qk3bs
The key's randomart image is:
+---[RSA 2048]----+
| o ..ooBB*|
| . o .++*X|
| . . +=*+|
| . o+=o.|
| S +. *+.|
| o E+.=o|
| . .. =.+|
| . . ++|
| =|
+----[SHA256]-----+
michael@Azure:~/clouddrive$
```
2.) Ensure the key was generated by typing `ls -a`.
```
michael@Azure:~$ ls -a
. .. .azure .bash_history .bash_logout .bashrc clouddrive .profile .ssh
```
3.) Looks good (we see `.ssh`), so we'll go ahead and copy it to our server with `ssh-copy-id user@ipaddy`:
```
michael@Azure:~$ ssh-copy-id user@ipaddy
mbcrump@52.161.31.243's password:
id_rsa.pub 100% 420 0.4KB/s 00:00
```
4.) SSH to the Linux server with `ssh user@ipaddy`.
5.) Edit the ssh server configuration file with `sudo nano /etc/ssh/sshd_config`.
5.1) These entries must be set to yes and they should already be that way by default:
RSAAuthentication yes
PubkeyAuthentication yes
6.) Reload the configuration with `sudo service ssh reload`.
7.) Disconnect and try to connect without the need to give the password to the ssh-client `ssh user@ipaddy`.
8.) If everything goes as planned, you should see:
```
michael@Azure:~$ ssh user@ipaddy
Welcome to Ubuntu 16.04.3 LTS (GNU/Linux 4.4.0-92-generic x86_64)
* Documentation: https://help.ubuntu.com
* Management: https://landscape.canonical.com
* Support: https://ubuntu.com/advantage
Get cloud support with Ubuntu Advantage Cloud Guest:
http://www.ubuntu.com/business/services/cloud
15 packages can be updated.
0 updates are security updates.
*** System restart required ***
Last login: Sun Sep 10 23:49:35 2017 from 40.83.147.69
```
<img :src="$withBase('/files/cloudshellpersistdata.gif')">
#### BONUS: If you want to disable the password you previously set on the Linux machine
If you want to disable the password on the Linux machine that you previously set:
1.) SSH back into the machine with `ssh user@ipaddy`.
2.) Disable password authentication with `sudo nano /etc/ssh/sshd_config`.
2.1) Ensure the following settings are set to no:
ChallengeResponseAuthentication no
PasswordAuthentication no
UsePAM no
2.2.) Reload the configuration with `sudo service ssh reload`
3.) You can see if the password authentication is disabled by logging out and then trying to connect with key file authentication disabled with `ssh user@ipaddress -o PubkeyAuthentication=no`. You should get "Permission denied".
#### BONUS #2: You can easily do the same with BASH on Windows 10
You can have the same goodness that you have with Azure Cloud Shell on your local machine. In my case, I'm using BASH on Windows and can just run steps 1-5 listed above. Boom!

blog/blog/tip140.md Normal file
---
type: post
title: "Tip 140 - Easily copy your SQL Azure database to your local development server"
excerpt: "Learn how to easily copy your SQL Azure database to your local development server"
tags: [Databases]
date: 2018-07-15 17:00:00
---
::: tip
:bulb: Learn more : [Azure SQL Database Documentation](https://docs.microsoft.com/azure/sql-database?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Easily copy your SQL Azure database to your local development server
I've run across folks at conferences who asked me, "How do I copy a SQL Azure database to my local development machine?" While chatting with them, I always found it difficult to understand why (as it is dirt cheap to have a development SQL Azure instance in the cloud), but nevertheless it is their data and there is an easy way to do this.
First off, [download SQL Server Management Studio (SSMS)](https://docs.microsoft.com/sql/ssms/download-sql-server-management-studio-ssms?view=sql-server-2017?WT.mc_id=docs-azuredevtips-azureappsdev) and connect to your SQL Azure database that you want to copy locally.
**Note by cbattlegear:** One important caveat to this process (as shown below): if any writes happen on the database while you do the export, the import will be broken. Best practice is to run `CREATE DATABASE AS COPY` to create a copy of the database and create an export of the copy.
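That `CREATE DATABASE AS COPY` step can be run with `sqlcmd` against the master database. A sketch with placeholder server, user, and database names:

```shell
# Create a transactionally consistent copy to export from,
# so in-flight writes can't break the .bacpac.
sqlcmd -S <server>.database.windows.net -d master \
  -U <admin-user> -P '<password>' \
  -Q "CREATE DATABASE [mydb_copy] AS COPY OF [mydb];"
```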
Right-click on the **Database** -> click **Tasks** > **Export data-tier application**
<img :src="$withBase('/files/sqlazure1.png')">
You'll see a wizard and will export your database to a local .bacpac file.
<img :src="$withBase('/files/sqlazure2.png')">
You'll see something similar to this screen once it has finished processing.
<img :src="$withBase('/files/sqlazure4.png')">
Now ensure you are connected to your local target SQL server instance (or SQL Azure instance) and right-click on **Databases** (the parent folder of your actual database) -> click **Tasks** > **Import data-tier application** and select the .bacpac file that you created earlier.
<img :src="$withBase('/files/sqlazure3.png')">

blog/blog/tip141.md Normal file
---
type: post
title: "Tip 141 - Generate a Zip file from Azure Blob Storage Files"
excerpt: "Learn how to easily generate a Zip file from Azure Blob Storage Files"
tags: [Storage]
date: 2018-07-16 17:00:00
---
::: tip
:bulb: Learn more : [Azure storage account overview](https://docs.microsoft.com/azure/storage/common/storage-account-overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Generate a Zip file from Azure Blob Storage Files
You might have a task that pops up where you need to generate a zip file from a number of files in your Azure blob storage account. For 1 or 2 files, this may not be a problem but for 20-2000, you might want to find a way to automate this.
One option is to zip the files directly to the output stream using the blob streams. If you do this it still means that you are routing the stream through the web server.
To begin, you will want an easy way to work with zip files. I used the latest version of [ICSharpZipLib](https://github.com/icsharpcode/SharpZipLib?WT.mc_id=github-azuredevtips-azureappsdev), and make sure you pull in the [Azure Storage library](https://www.nuget.org/packages/Azure.Storage.Blobs/).
We'll set up our storage service, which will read from the stream and get a reference to the blob we are currently looping through.
```csharp
public void ReadToStream(IFileIdentifier file, Stream stream, StorageType storageType = StorageType.Stored, ITenant overrideTenant = null)
{
var blob = GetBlockBlobClient(file, storageType, overrideTenant);
blob.DownloadTo(stream);
}
private BlockBlobClient GetBlockBlobClient(IFileIdentifier file, StorageType storageType = StorageType.Stored, ITenant overrideTenant = null)
{
var filepath = GetFilePath(file, storageType);
var container = GetTenantContainer(overrideTenant);
return container.GetBlockBlobClient(filepath);
}
```
Now we have one method that takes in a response stream and loops through all the files to generate our zip file.
```csharp
public void ZipFilesToResponse(HttpResponseBase response, IEnumerable<Asset> files, string zipFileName)
{
using (var zipOutputStream = new ZipOutputStream(response.OutputStream))
{
zipOutputStream.SetLevel(0); // 0 - store only to 9 - means best compression
response.BufferOutput = false;
response.AddHeader("Content-Disposition", "attachment; filename=" + zipFileName);
response.ContentType = "application/octet-stream";
foreach (var file in files)
{
var entry = new ZipEntry(file.FilenameSlug())
{
DateTime = DateTime.Now,
Size = file.Filesize
};
zipOutputStream.PutNextEntry(entry);
storageService.ReadToStream(file, zipOutputStream);
response.Flush();
if (!response.IsClientConnected)
{
break;
}
}
zipOutputStream.Finish();
zipOutputStream.Close();
}
response.End();
}
```
Another "quick and dirty" MVC sample can be found below. This sample specifies the file names and the zip file name.
```csharp
public ActionResult Download()
{
StorageSharedKeyCredential credential = new StorageSharedKeyCredential("<StorageAccountName>", "<StorageAccountKey>");
BlobServiceClient serviceClient = new BlobServiceClient(new Uri("<StorageAccountUri>"), credential);
BlobContainerClient container = serviceClient.GetBlobContainerClient("test");
var blobFileNames = new string[] { "file1.png", "file2.png", "file3.png", "file4.png" };
using (var zipOutputStream = new ZipOutputStream(Response.OutputStream))
{
foreach (var blobFileName in blobFileNames)
{
zipOutputStream.SetLevel(0);
var blob = container.GetBlockBlobClient(blobFileName);
var entry = new ZipEntry(blobFileName);
zipOutputStream.PutNextEntry(entry);
blob.DownloadTo(zipOutputStream);
}
zipOutputStream.Finish();
zipOutputStream.Close();
}
Response.BufferOutput = false;
Response.AddHeader("Content-Disposition", "attachment; filename=" + "zipFileName.zip");
Response.ContentType = "application/octet-stream";
Response.Flush();
Response.End();
return null;
}
```

blog/blog/tip142.md Normal file
---
type: post
title: "Tip 142 - Quickly edit files within Cloud Shell using Code"
excerpt: "Learn how to quickly edit files within Cloud Shell using Code"
tags: [Management and Governance]
date: 2018-07-22 17:00:00
---
::: tip
:bulb: Learn more : [Overview of Azure Cloud Shell](https://docs.microsoft.com/azure/cloud-shell/overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Quickly edit files within Cloud Shell using Code
Recently I noticed that I haven't mentioned how to use Code with Cloud Shell. This tip is to make that right!
I'm sure by now everyone has used the lovely [Code editor](https://code.visualstudio.com/) in some application before, but you may not be aware that you can use the editor within Cloud Shell without installing anything. To give this a spin, open up Cloud Shell and type ``code .`` and you'll see the following:
<img :src="$withBase('/files/azcodeinportal.gif')">
Notice that you can do things such as navigate directories, view files with the same syntax highlighting used in VS Code, and easily save files, close the editor, open a file outside the current working directory, and open the command palette.
If you open the command palette, you'll see a very familiar list of commands that you've probably used in the editor on your desktop.
<img :src="$withBase('/files/azcodeinportal1.gif')">

blog/blog/tip143.md Normal file
---
type: post
title: "Tip 143 - Keep your Azure Web App Hydrated and Responsive"
excerpt: "Learn how to easily keep your Azure Web App hydrated and responsive"
tags: [Web]
date: 2018-07-23 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Keep your Azure Web App Hydrated and Responsive
Have you ever noticed that after a publish or restart of your Azure web app it loads slowly the first time, and then when you refresh with F5 it is fine again?
If you've noticed this behavior, an option **might** be to go inside the **Application settings** for the web app in the portal and activate **Always On**, which keeps your web app hydrated and responsive.
<img :src="$withBase('/files/azurewebappalwayson1.png')">
According to the information icon:
> Indicates that your web app needs to be loaded at all times. By default, web apps are unloaded after they have been idle. It is recommended that you enable this option when you have continuous web jobs running on the web app.
Keep in mind that you will not be able to turn this feature on when using the **Free version** of Azure App Services. It is only available in **Basic** or **Standard** plans.
To recap, if your app runs continuous WebJobs or runs WebJobs triggered using a CRON expression, you should enable **Always On**, or the web jobs may not run reliably.
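Always On can also be flipped on from the Azure CLI rather than the portal. A sketch with hypothetical resource group and app names:

```shell
# Enable Always On so the app isn't unloaded after being idle.
az webapp config set --resource-group my-rg --name my-app --always-on true

# Confirm the setting took effect.
az webapp config show --resource-group my-rg --name my-app --query "alwaysOn"
```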

blog/blog/tip144.md Normal file
---
type: post
title: "Tip 144 - Swiftly understand what versions of .NET are supported on Azure App Service"
excerpt: "Learn how to swiftly understand what versions of .NET are supported on Azure App Services"
tags: [Web]
date: 2018-07-29 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Swiftly understand what versions of .NET are supported on Azure App Service
With the release of .NET Framework 4.7.2, I've been asked multiple times whether Azure App Service (Websites) supports it yet. While I can quickly answer this question, there will always be a vNext and this question will come up again. So how do you check which versions of the .NET Framework Azure App Service supports?
One of the easiest ways that I know of is to use an existing website that you created that is hosted on Azure App Service: go to **Development Tools** and **Advanced Tools** and open the Kudu portal.
<img :src="$withBase('/files/azureappkudu1.png')">
Then go to the **Debug console** and then **CMD**.
<img :src="$withBase('/files/azureappkudu2.png')">
Type `cd D:\Program Files (x86)\Reference Assemblies\Microsoft\Framework\.NETFramework`
Type `dir`
You'll see a list of the supported .NET Frameworks!
<img :src="$withBase('/files/azureappkudu3.png')">

blog/blog/tip145.md Normal file
---
type: post
title: "Tip 145 - Easily reset the Administrator password for an Azure SQL database"
excerpt: "Learn how to easily reset the password for Azure SQL database"
tags: [Databases]
date: 2018-07-30 17:00:00
---
::: tip
:bulb: Learn more : [Azure SQL Database Documentation](https://docs.microsoft.com/azure/sql-database?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Easily reset the Administrator password for an Azure SQL database
A common scenario that I have heard folks ask is "How do I reset the Admin password for an Azure SQL database that I've forgotten or lost?"
An easy solution to this is:
1. Go to the [Azure portal](https://portal.azure.com)
2. Select **SQL databases**
3. Select the name of the database whose Admin password you want to change.
4. Click on the **Server name url** for the selected database.
<img :src="$withBase('/files/azuresqlpw1.png')">
The **Reset password** option is at the top.
<img :src="$withBase('/files/azuresqlpw2.png')">
Please note that if you reset the SQL Database server password during a time when there are active connections to databases on the server, you may want to use the KILL statement to terminate user sessions. This will force client connections to refresh their sessions with the database and the host server.
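The same reset can be done from the Azure CLI if you prefer scripting it. A sketch with placeholder resource group and server names:

```shell
# Reset the server-level administrator password for an Azure SQL server.
az sql server update \
  --resource-group my-rg \
  --name my-sql-server \
  --admin-password '<new-strong-password>'
```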

blog/blog/tip146.md Normal file
---
type: post
title: "Tip 146 - Rename an Azure SQL database"
excerpt: "Learn how to easily rename an Azure SQL database"
tags: [Databases]
date: 2018-08-05 17:00:00
---
::: tip
:bulb: Learn more : [Azure SQL Database Documentation](https://docs.microsoft.com/azure/sql-database?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Rename an Azure SQL database
Last week, I did a SQL post on [Easily reset the Administrator password for an Azure SQL database](https://microsoft.github.io/AzureTipsAndTricks/blog/tip145.html) and it did rather well. So I'm back with another SQL post that addresses another common scenario that folks ask "How do I rename an Azure SQL database"?
#### Rename with command-line - TSQL
1. Connect with **SQL Server Management Studio** to your Azure database server
2. Right-click on the master database and select **New Query**
3. In the **New Query window** type `ALTER DATABASE [dbname] MODIFY NAME = [newdbname]`. (Make sure you include the square brackets around both database names.)
#### Rename with a GUI - SQL Server Management Studio
1. Connect with SQL Server Management Studio
2. Make sure **Object Explorer** pane is open.
3. Click on the database name *(as the rename option from the dropdown will be greyed out)* and type in the new name.
4. The Azure Portal should reflect the change almost immediately.

blog/blog/tip147.md Normal file
---
type: post
title: "Tip 147 - Run TSQL on an Azure SQL database with Azure Functions"
excerpt: "Learn how to run TSQL on an Azure SQL database with Azure Functions"
tags: [Serverless, Databases]
date: 2018-08-06 17:00:00
---
::: tip
:bulb: Learn more : [Azure Functions Documentation](https://docs.microsoft.com/azure/azure-functions/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Run TSQL on an Azure SQL database with Azure Functions
I've recently been adding Azure SQL tips such as [Easily reset the Administrator password for an Azure SQL database](https://microsoft.github.io/AzureTipsAndTricks/blog/tip145.html) and [Rename an Azure SQL database](https://microsoft.github.io/AzureTipsAndTricks/blog/tip146.html), and you all seem to like them. So I'm back with another SQL post that addresses another common scenario folks ask about: "How do I run TSQL on an Azure SQL database with Azure Functions?"
#### SQL Database
Before we begin you'll need to grab the connection string from the database you created earlier. Simply select **SQL Databases** and select your database on the SQL databases page.
Click **Show database connection strings** and copy the string to your clipboard.
<img :src="$withBase('/files/azconstring1.png')">
Go ahead and replace the `{your_username}` and `{your_password}` placeholders with real values and save the string somewhere easily accessible.
#### Azure Functions
Create a new Azure Function and select **Timer trigger**. You typically want to store this secret under **Platform features > Application settings** in the **Connection strings** section, so go ahead and do that as shown below:
<img :src="$withBase('/files/azconstring2.png')">
Now use the following code
```csharp
#r "System.Configuration"
#r "System.Data"
using System.Configuration;
using System.Data.SqlClient;
using System.Threading.Tasks;
public static async Task Run(TimerInfo myTimer, TraceWriter log)
{
var str = ConfigurationManager.ConnectionStrings["sqldb_connection"].ConnectionString;
using (SqlConnection conn = new SqlConnection(str))
{
conn.Open();
var text = "UPDATE MichaelTestDB.User " +
"SET [Item] = 5 WHERE DateAdded < GetDate();";
using (SqlCommand cmd = new SqlCommand(text, conn))
{
var rows = await cmd.ExecuteNonQueryAsync();
}
}
}
```
Note that the previous code may not work on Azure Functions runtime version 2 or above, because `ConfigurationManager` is no longer used there. This post explains how to access a function app's settings from C# instead:
https://www.koskila.net/how-to-access-azure-function-apps-settings-from-c/
Also, according to the Microsoft documentation, the `ConnectionStrings` collection is not recommended:
> "A collection. Don't use this collection for the connection strings used by your function bindings. This collection is used only by frameworks that typically get connection strings from the ConnectionStrings section of a configuration file, like Entity Framework. Connection strings in this object are added to the environment with the provider type of System.Data.SqlClient. Items in this collection aren't published to Azure with other app settings. You must explicitly add these values to the Connection strings collection of your function app settings. If you're creating a SqlConnection in your function code, you should store the connection string value with your other connections in Application Settings in the portal."
https://docs.microsoft.com/en-us/azure/azure-functions/functions-run-local#local-settings-file
In addition, Microsoft recommends using `Microsoft.Data.SqlClient` instead of `System.Data.SqlClient`:
https://devblogs.microsoft.com/dotnet/introducing-the-new-microsoftdatasqlclient/
Instead, use code like the following:
```csharp
#r "System.Data"
// (additional #r directives will be generated when using Visual Studio)
//using System.Data.SqlClient; // replaced with Microsoft.Data.SqlClient
using Microsoft.Data.SqlClient;
using System.Threading.Tasks;
using Microsoft.Extensions.Configuration; // needed for ConfigurationBuilder
using Microsoft.Extensions.Logging;       // ILogger replaces TraceWriter in v2

public static async Task Run(TimerInfo myTimer, ILogger log, ExecutionContext context)
{
    var config = new ConfigurationBuilder()
        .SetBasePath(context.FunctionAppDirectory)
        .AddJsonFile("local.settings.json", optional: true, reloadOnChange: true)
        .AddEnvironmentVariables()
        .Build();

    //var str = ConfigurationManager.ConnectionStrings["sqldb_connection"].ConnectionString;
    var str = config["sqldb_connection"];

    using (SqlConnection conn = new SqlConnection(str))
    {
        conn.Open();
        var text = "UPDATE MichaelTestDB.User " +
                   "SET [Item] = 5 WHERE DateAdded < GetDate();";

        using (SqlCommand cmd = new SqlCommand(text, conn))
        {
            var rows = await cmd.ExecuteNonQueryAsync();
        }
    }
}
```
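As a side note, the same lookup in a JavaScript (Node.js) function is simpler, because app settings surface as plain environment variables. Below is a minimal sketch, assuming a connection string named `sqldb_connection`; the `SQLAZURECONNSTR_` prefix shown is the one App Service adds for connection strings of the "SQL Azure" type (other types use different prefixes):

```javascript
// Sketch: reading an app setting from a Node.js Azure Function.
// App settings surface as environment variables; connection strings
// get a type prefix (e.g. SQLAZURECONNSTR_ for the "SQL Azure" type).
function getConnectionString(name) {
  const value =
    process.env[name] || process.env['SQLAZURECONNSTR_' + name];
  if (!value) {
    throw new Error(`App setting '${name}' is not configured`);
  }
  return value;
}

// Hypothetical local demo: simulate the variable App Service injects.
process.env['SQLAZURECONNSTR_sqldb_connection'] =
  'Server=tcp:example.database.windows.net;Database=mydb;';
console.log(getConnectionString('sqldb_connection'));
```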
blog/blog/tip148.md
---
type: post
title: "Tip 148 - Share Business Logic between Azure Functions"
excerpt: "Learn how to share business logic between Azure Functions"
tags: [Serverless]
date: 2018-08-12 17:00:00
---
::: tip
:bulb: Learn more : [Azure Functions Documentation](https://docs.microsoft.com/azure/azure-functions/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Share Business Logic between Azure Functions
A common scenario for sharing business logic between functions: you have an Azure Function that runs daily on a timer trigger, and you need the same logic to run when you request it manually. While the trigger used may change, you don't want to copy and paste the logic into another function. Instead, your function app architecture could look like this:
```
MyFunctionApp
| host.json
|____ shared
| |____ businessLogic.js
|____ function1
| |____ index.js
| |____ function.json
|____ function2
|____ index.js
|____ function.json
```
In the `shared` folder, you'd create a file named `businessLogic.js` and put your shared code.
A sample in JS could be the following:
```javascript
module.exports = function (context, req) {
context.res = {
body: "<b>Hello World, from Azure Tips and Tricks</b>",
status: 201,
headers: {
'content-type': "text/html"
}
};
context.done();
};
```
Then you'd create two separate functions called `function1` and `function2`.
In your `function1/index.js` and `function2/index.js`, you would use the following lines of code to reference the shared logic folder and file.
```javascript
var logic = require("../shared/businessLogic.js");
module.exports = logic;
```
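To sanity-check the pattern outside the Functions host, you can inline the shared handler and invoke it with a minimal mock `context` object. This is just a sketch; in the real app the handler lives in `shared/businessLogic.js` and the runtime supplies `context` and `req`:

```javascript
// Sketch: the shared handler from shared/businessLogic.js, inlined
// so this file can run with plain Node outside the Functions host.
const businessLogic = function (context, req) {
  context.res = {
    body: "<b>Hello World, from Azure Tips and Tricks</b>",
    status: 201,
    headers: { 'content-type': "text/html" }
  };
  context.done();
};

// Minimal mock of the context object the Functions runtime passes in.
const context = { done: () => { /* runtime would finalize here */ } };
businessLogic(context, { /* mock req */ });

console.log(context.res.status); // 201
```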
Notice that each function has its own `function.json` file, so you can give them different triggers. For example, `function1` could be an HTTP Trigger:
```json
{
"bindings": [
{
"authLevel": "anonymous",
"type": "httpTrigger",
"direction": "in",
"name": "req",
"methods": [
"get",
"post"
]
},
{
"type": "http",
"direction": "out",
"name": "res"
}
],
"disabled": false
}
```
and `function2` could be a Timer Trigger:
```json
{
"bindings": [
{
"name": "myTimer",
"type": "timerTrigger",
"direction": "in",
"schedule": "0 */5 * * * *"
}
],
"disabled": false
}
```
blog/blog/tip149.md
---
type: post
title: "Tip 149 - Use PowerShell to quickly see if your Deployment Slot Swapped Successfully"
excerpt: "Learn how to use PowerShell to quickly see if your deployment slot swapped successfully"
tags: [Languages & Frameworks]
date: 2018-08-13 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Use PowerShell to quickly see if your Deployment Slot Swapped Successfully
A common scenario after sending a swap action to Azure App Service is to check its progress. While you can easily use the Azure Portal, another alternative that I often use is PowerShell.
If you dig around in the PowerShell and Azure docs, you'll learn that log entries with the operation name `Microsoft.Web/sites/slots/slotsswap/action` and a status of "Succeeded" are what you're looking for (and that trimming the first 50 characters off the resource ID leaves a shorter, more readable identifier).
We can wrap this up in a bow with the following lines:
```powershell
$a = Get-AzLog -ResourceGroupName <ResourceGroupName> | Where-Object { $_.operationname.value -contains "Microsoft.Web/sites/slots/slotsswap/action" -and $_.Status.Value -eq 'Succeeded'}
$a | select { $_.eventtimestamp,$_.operationname.value,$_.status.value,$_.resourceid.substring(50) }
```
Now if you paste this in PowerShell, you should get the following:
<img :src="$withBase('/files/powershellslot1.png')">
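The same filter can be sketched in JavaScript over exported log records. The record shape below is a simplified, hypothetical stand-in for what `Get-AzLog` (or the Azure Monitor REST API) returns:

```javascript
// Sketch: filtering activity-log records for successful slot swaps.
// `records` stands in for exported Get-AzLog output.
const records = [
  { operationName: 'Microsoft.Web/sites/slots/slotsswap/action',
    status: 'Succeeded', eventTimestamp: '2018-08-13T17:00:00Z' },
  { operationName: 'Microsoft.Web/sites/write',
    status: 'Succeeded', eventTimestamp: '2018-08-13T16:00:00Z' },
];

const swaps = records.filter(r =>
  r.operationName === 'Microsoft.Web/sites/slots/slotsswap/action' &&
  r.status === 'Succeeded');

console.log(swaps.length); // 1
```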
As always, I hope this helps someone out there!
blog/blog/tip15.md
---
type: post
title: "Tip 15 - Underlying Software in Azure Cloud Shell"
excerpt: "Learn about some of the software found inside an Azure Cloud Shell instance"
tags: [Management and Governance]
date: 2017-09-12 17:00:00
---
::: tip
:bulb: Learn more : [Overview of Azure Cloud Shell](https://docs.microsoft.com/azure/cloud-shell/overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [Learn about the underlying Software in Azure Cloud Shell](https://www.youtube.com/watch?v=wODji8h6YYI&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=13?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### Underlying Software in Azure Cloud Shell
When you spin up an Azure Cloud Shell, it creates a container that includes things such as the OS and other runtimes. By default you get Linux, Node.js and more (covered later). The storage account set up the first time you use Cloud Shell persists data (like shell scripts, SSH keys, etc.) that you can use once you are connected to the container. It also automatically persists things such as your `.bash_history` and stores your Azure authentication token in `.azure/accessTokens.json`.
With that information, let's see what is under the hood. Spin up your Azure Cloud Shell now!
#### Host Operating System
The container that your Azure Cloud Shell instance is running in is Ubuntu Linux. You can gather additional information about the release with the following commands.
You can type `lsb_release -a` to see print [OS level distribution information](https://refspecs.linuxfoundation.org/LSB_3.0.0/LSB-PDA/LSB-PDA/lsbrelease.html) that is being used.
```
michael@Azure:~$ lsb_release -a
No LSB modules are available.
Distributor ID: Ubuntu
Description: Ubuntu 16.04.2 LTS
Release: 16.04
Codename: xenial
```
You can use `uname -a` to [print information](https://www.computerhope.com/unix/uuname.htm) about the current system.
```
michael@Azure:~$ uname -a
Linux cc-72f9-63c154d-1351310522-4x9jr 4.4.0-93-generic #116-Ubuntu SMP Fri Aug 11 21:17:51 UTC 2017 x86_64 x86_64 x86_64 GNU/Linux
```
Things like `arch` gives you architecture information
```
michael@Azure:~$ arch
x86_64
```
#### You have access to typical Linux apps
You can type `man` for access to the manual.
```
michael@Azure:~$ man
What manual page do you want?
```
You can pull up specific pages for help documentation such as `man ls`.
You have access to vim, nano and other editors.
<img :src="$withBase('/files/azuretip15.gif')">
#### Additional Software Installed in Cloud Shell
The container also contains things like Git, Python, Node.js, .NET Core. You can test this by the following commands:
```
michael@Azure:~$ git --version
git version 2.7.4
michael@Azure:~$ python --version
Python 3.5.2
michael@Azure:~$ nodejs -v
v6.9.4
michael@Azure:~$ dotnet --version
1.0.1
```
blog/blog/tip150.md
---
type: post
title: "Tip 150 - Use the Mac Touch Bar to launch the Azure Portal"
excerpt: "Learn how to create a shortcut to Azure using the Mac Touch Bar "
tags: [Management and Governance]
date: 2018-08-19 17:00:00
---
::: tip
:bulb: Learn more : [Azure portal documentation](https://docs.microsoft.com/azure/azure-portal/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Use the Mac Touch Bar to launch the Azure Portal
For those new to the Mac Touch Bar, it sits at the top of your keyboard and adapts to what you're doing and provides intuitive shortcuts and app controls when you need them. For example, the controls will change when you are in Chrome vs Outlook. Below is a screenshot of the default layout before you switch to an application:
<img :src="$withBase('/files/keyboardaz1.png')">
**Screenshot courtesy of Apple**
Kinda disappointing. I felt I could do better. After searching all over I found a tool called [BetterTouchTool](https://folivora.ai/) that allows you to customize the Mac Touch Bar. After spending an entire weekend playing with it, I landed on the following setup (in case you are curious):
<img :src="$withBase('/files/keyboardaz2.png')">
You'll see the following from left to right:
* Shortcut to VS Code
* Shortcut to Hyper.js
* Shortcut to Azure
* Shortcut to TextEdit
* Volume
* Current Temp where I'm at
* Clipboard Manager
* Create New File in Finder with Dialog
* Copy path to current folder in Finder
* BetterTouchTool
* Brightness
* Mute
* Lock Computer
* Spotlight
Since we're interested in adding a shortcut to the Azure Portal, all you have to do is download [BetterTouchTool](https://folivora.ai/), click on the **TouchBar** option and **Add a TouchBar Button**, and configure it to open a URL such as `https://portal.azure.com`.
<img :src="$withBase('/files/keyboardaz3.png')">
Easy enough and will definitely get you a couple of extra nerd points!
What is also interesting is the ability to call an API, retrieve data, and parse it to the Touch Bar like I did with the weather. Leave your ideas below on other ways to be more productive with Azure.
blog/blog/tip152.md
---
type: post
title: "Tip 152 - Get the Record Count in Cosmos DB"
excerpt: "Learn how to get the record count in Azure Cosmos DB"
tags: [Databases]
date: 2018-08-26 17:00:00
---
::: tip
:bulb: Learn more : [Azure Cosmos DB](https://docs.microsoft.com/azure/cosmos-db/introduction?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Get the Record Count in Cosmos DB
When working with Azure Cosmos DB, it is guaranteed that at some point you'll need to get the record count of your documents. There are a couple of quick ways to do this through the Azure Portal: navigate to the Cosmos DB resource you wish to query, select the **Data Explorer** tab, and use the following query: `SELECT VALUE COUNT(1) FROM c`
<img :src="$withBase('/files/dataexpcosmos1.png')">
> If you're wondering about the VALUE keyword: all queries return JSON fragments back. By using VALUE, you can get the scalar value of count, e.g. 100, instead of the JSON document `{"$1": 100}`
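A quick sketch of those two result shapes (with a hypothetical count of 100):

```javascript
// SELECT COUNT(1) FROM c        -> each row is a JSON document
const withoutValue = [{ "$1": 100 }];
// SELECT VALUE COUNT(1) FROM c  -> each row is the scalar itself
const withValue = [100];

console.log(withoutValue[0]["$1"]); // 100
console.log(withValue[0]);          // 100
```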
You may want to know the **Request Charge** for that query and you can see that by clicking on **Query Information**.
<img :src="$withBase('/files/dataexpcosmos2.png')">
If you compare that to some of the older methods folks used before `COUNT` was available, such as `SELECT c.id FROM c`, then you'd see that it might be time to update the queries.
<img :src="$withBase('/files/dataexpcosmos3.png')">
If I need to put this logic inside of an application instead of accessing it through the Azure Portal, here is how I do that in C#.
First, I create an `app.config` file and add the following `appSettings` section.
>You can get the **endpoint** and **authkey** from the **Keys** section in your Cosmos DB blade in the portal.
```
<appSettings>
<add key="endpoint" value="enter" />
<add key="authkey" value="enter" />
<add key="database" value="enter" />
<add key="collection" value="enter" />
</appSettings>
```
```csharp
using System;
using System.Configuration;
using System.Linq;
using Microsoft.Azure.Documents.Client;

public class Program
{
    private static readonly string DatabaseId = ConfigurationManager.AppSettings["database"];
    private static readonly string CollectionId = ConfigurationManager.AppSettings["collection"];
    private static readonly string EndPointId = ConfigurationManager.AppSettings["endpoint"];
    private static readonly string AuthKeyId = ConfigurationManager.AppSettings["authkey"];

    private static DocumentClient client;

    public static void Main()
    {
        client = new DocumentClient(new Uri(EndPointId), AuthKeyId);

        // Helper methods from the Cosmos DB getting-started sample (not shown here)
        CreateDatabaseIfNotExistsAsync().Wait();
        CreateCollectionIfNotExistsAsync().Wait();

        FeedOptions queryOptions = new FeedOptions { MaxItemCount = -1 };

        IQueryable<dynamic> familyQueryInSql = client.CreateDocumentQuery<dynamic>(
            UriFactory.CreateDocumentCollectionUri(DatabaseId, CollectionId),
            "SELECT VALUE COUNT(1) FROM c",
            queryOptions);

        Console.WriteLine("Running direct SQL query...");
        foreach (dynamic family in familyQueryInSql)
        {
            Console.WriteLine("\tRead {0}", family);
        }

        Console.Read();
    }
}
```
<img :src="$withBase('/files/dataexpcosmos4.png')">
Success!!
blog/blog/tip153.md
---
type: post
title: "Tip 153 - How to get the Azure Account Tenant Id?"
excerpt: "Learn how to quickly get the Azure Account Tenant ID"
tags: [Management and Governance]
date: 2018-08-27 17:00:00
---
::: tip
:bulb: Learn more : [Azure portal documentation](https://docs.microsoft.com/azure/azure-portal/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### How to get the Azure Account Tenant Id?
Your Office 365 tenant ID is a globally unique identifier (GUID) that is different from your tenant name or domain. On rare occasion, you might need this identifier, such as when configuring Windows group policy for OneDrive for Business.
Knowing this, you'll find that there are times when you will want to grab the Tenant Id and while you can do this through PowerShell, sometimes it is just as easy to grab this through the Azure Portal.
> The **DirectoryId** and **TenantId** both equate to the GUID representing the ActiveDirectory Tenant. Depending on context, either term may be used by Microsoft documentation and products, which can be confusing.
Open the Azure Portal and navigate to **Azure Active Directory**, then **Properties** and copy the **Directory ID**
<img :src="$withBase('/files/aadazure1.png')">
> In other words, the "Tenant ID" IS the "Directory ID".
You can also do this via the Azure CLI or Azure PowerShell:
```
az account show --subscription "MySubscriptionName" --query tenantId --output tsv
az account list --query [].tenantId --output tsv
```
```
Get-AzTenant | Select-Object -ExpandProperty Id
```
I hope it helps!
blog/blog/tip154.md
---
type: post
title: "Tip 154 - How to quickly check the EndPoint API of QnA Maker"
excerpt: "Learn how to quickly test the QnA Maker knowledge base with Fiddler"
tags: [AI + Machine Learning]
date: 2018-09-02 17:00:00
---
::: tip
:bulb: Learn more : [Azure Cognitive Services](https://docs.microsoft.com/azure/cognitive-services?WT.mc_id=docs-azuredevtips-azureappsdev).
:bulb: Check out [Azure AI resources for developers](https://azure.microsoft.com/en-us/overview/ai-platform/dev-resources/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### How to quickly check the EndPoint API of QnA Maker
If you haven't experimented with [QnA Maker](https://qnamaker.ai/) then it is time. It enables you to quickly create a question and answer service from content like FAQ documents, URLs, and product manuals. You can create a knowledge base with existing data sources that you already have. Once complete, you might want to consume the endpoint API through applications such as Fiddler or cURL. In this post, I'll show you quickly how you can do both.
For Fiddler:
You need to specify `TLS1.2`. Simply go to **Tools** > **Options** > **HTTPS** to make **tls1.2** allowable.
<img :src="$withBase('/files/fiddlerazure1.png')">
For cURL:
Simply replace the placeholder text and it should work:
```
curl \
--header "Content-type: application/json" \
--header "Authorization: EndpointKey placeholder-text-remove-me" \
--request POST \
--data '{"question":"Have you completed the Azure Tips and Tricks Survey yet?"}' \
https://myazureresourcename.azurewebsites.net/qnamaker/knowledgebases/placeholder-text-remove-me/generateAnswer
```
I hope this helped!
blog/blog/tip155.md
---
type: post
title: "Tip 155 - Archive the Azure Activity Log"
excerpt: "Learn how to archive the Azure Activity Log"
tags: [Management and Governance]
date: 2018-09-09 17:00:00
---
::: tip
:bulb: Learn more : [Azure Activity Log](https://docs.microsoft.com/azure/azure-monitor/platform/activity-logs-overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Archive the Azure Activity Log
The Azure Activity Log is a subscription log that provides insight into subscription-level events that have occurred in Azure. This includes a range of data, from Azure Resource Manager operational data to updates on Service Health events. You may want to archive the Azure Activity Log if you want to retain it for longer than 90 days (with full control over the retention policy) for audit, static analysis, or backup. In this post, I'll show you how to archive it with a couple of clicks.
In the portal, search for the **Activity Log** service. Now click on the **Export** button as shown below:
<img :src="$withBase('/files/azactivitylog1.png')">
Select a Subscription and Region, and place a checkmark in **Export to an Azure Storage Account**. Now use the slider to select the number of days (0 to 365) for which Activity Log events should be kept in your storage account. You may also select 0 to retain the events indefinitely. Now click Save.
<img :src="$withBase('/files/azactivitylog2.png')">
Now your logs are safe and sound for the time you specified.
blog/blog/tip156.md
---
type: post
title: "Tip 156 - Use Azure Logic Apps to Detect when a new SQL record is inserted"
excerpt: "Learn how to use Azure Logic Apps to detect when a new SQL record is inserted"
tags: [Integration, Databases]
date: 2018-09-10 17:00:00
---
::: tip
:bulb: Learn more : [Azure Logic Apps Documentation](https://docs.microsoft.com/azure/logic-apps/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Use Azure Logic Apps to Detect when a new SQL record is inserted
I recently needed the ability to detect when a new SQL record was added and send an email. Since the customer didn't want the existing logic in their app to be modified, I relied on Azure Logic Apps and all their powerful connectors.
In the portal, create a new **Azure Logic App** and then select **Start with a blank template**. Under the trigger, choose **New step > Add an action**.
In the search box, enter "sql" as your filter, and pick **When an item is created**.
<img :src="$withBase('/files/logicsql1.png')">
You'll be prompted for connection details, so do so now.
<img :src="$withBase('/files/logicsql2.png')">
Now you'll need to select the **Table Name** and how often you want to check for new items. We are going to go with every 5 seconds.
<img :src="$withBase('/files/logicsql3.png')">
Now choose, **New step > Choose an action**.
In the search box, enter "email" as your filter, and pick **Send an email**.
<img :src="$withBase('/files/logicsql4.png')">
Type the email address and select which fields to send. You can put custom text as shown below:
<img :src="$withBase('/files/logicsql5.png')">
Now insert a record into your database and it should fire (as long as you have the Logic App running).
<img :src="$withBase('/files/logicsql6.png')">
Easy enough!
blog/blog/tip157.md
---
type: post
title: "Tip 157 - Create Thumbnail Images with Azure Functions and Azure Storage - Part 1"
excerpt: "Learn how to create thumbnail images with Azure Functions and Azure Storage"
tags: [Serverless, Storage]
date: 2018-09-16 17:00:00
---
::: tip
:bulb: Learn more : [Azure Functions Documentation](https://docs.microsoft.com/azure/azure-functions/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Create Thumbnail Images with Azure Functions and Azure Storage - Part 1
In this mini-series, we're going to create an Azure Function that detects when a new image is added to Azure Storage and automatically creates a thumbnail image for us.
* [Azure Tips and Tricks Part 157 - Part 1 Create Thumbnail Images with Azure Functions and Azure Storage](tip157.html)
* [Azure Tips and Tricks Part 158 - Part 2 Create Thumbnail Images with Azure Functions and Azure Storage](tip158.html)
#### Part 1 (Setup) Azure Portal
Go ahead and open the **Azure Portal** and click **Create a Resource** and select **Azure Storage**. We'll keep it simple as shown below to get started.
<img :src="$withBase('/files/imageresizer1.png')">
Once complete, go into the resource and look under **Services**.
<img :src="$withBase('/files/storageacct2.png')">
Go ahead and click on **Blobs** and create a **Container** and give it the name **originals** and then create another one called **thumbs**.
**Remember this!** Think of a container in this sense as a folder. https://myblob/**container**/image1.jpg
<img :src="$withBase('/files/imageresizer2.png')">
We're going to need our Access Key shortly, so look under **Settings**, then **Access Keys** and copy the **connection string** to your clipboard.
**What is an Access Key?** This string will allow us to connect to the Storage Account.
<img :src="$withBase('/files/storagethroughcsharp1.png')">
#### Part 2 (Setup) Visual Studio
Create a C# Azure Function application by opening Visual Studio and selecting the template under the **Cloud** node as shown below:
<img :src="$withBase('/files/imageresizer3.png')">
Under Storage, change the default emulator to the **Azure Storage Account** that we created earlier:
<img :src="$withBase('/files/imageresizer4.png')">
We'll begin by using the **Timer Trigger** and **Azure Function v1** leaving everything as the defaults.
<img :src="$withBase('/files/imageresizer5.png')">
Once the project spins up, we'll use NuGet to pull in a reference to:
* ImageResizer `A helper class for Image Resizing`
Looking good so far and a good stopping point for today. Come back soon for the next post in the series where we'll begin putting this all together.
blog/blog/tip158.md
---
type: post
title: "Tip 158 - Create Thumbnail Images with Azure Functions and Azure Storage - Part 2"
excerpt: "Learn how to create thumbnail images with Azure Functions and Azure Storage"
tags: [Serverless, Storage]
date: 2018-09-17 17:00:00
---
### Create Thumbnail Images with Azure Functions and Azure Storage - Part 2
In this mini-series, we're going to create an Azure Function that detects when a new image is added to Azure Storage and automatically creates a thumbnail image for us.
* [Azure Tips and Tricks Part 157 - Part 1 Create Thumbnail Images with Azure Functions and Azure Storage](tip157.html)
* [Azure Tips and Tricks Part 158 - Part 2 Create Thumbnail Images with Azure Functions and Azure Storage](tip158.html)
#### Part 3 Time to Code
Make sure you read [Part 1 Create Thumbnail Images with Azure Functions and Azure Storage](tip157.html) before proceeding with this post.
Inside of your Azure Function app in Visual Studio, open your **local.settings.json** and add a **StorageConnection** value containing the connection string you copied in the previous part:
```
{
"IsEncrypted": false,
"Values": {
"AzureWebJobsStorage": "UseDevelopmentStorage=true",
"AzureWebJobsDashboard": "UseDevelopmentStorage=true",
"StorageConnection": "Enter Your Access Key From the Previous Part HERE"
}
}
```
Be sure to include the NuGet package called **ImageResizer**
Copy the following code into your `Function1.cs`:
```csharp
using ImageResizer;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using System.IO;
namespace ImageResizer1
{
public static class Function1
{
[FunctionName("Function1")]
public static void Run(
[BlobTrigger("originals/{name}", Connection = "StorageConnection")]Stream image,
[Blob("thumbs/s-{name}", FileAccess.Write, Connection = "StorageConnection")]Stream imageSmall,
TraceWriter log)
{
var instructions = new Instructions
{
Height = 320,
Width = 200,
Mode = FitMode.Carve,
Scale = ScaleMode.Both
};
ImageBuilder.Current.Build(new ImageJob(image, imageSmall, instructions));
log.Info($"C# Blob trigger function Processed blob\n Name:{image} \n Size: {image.Length} Bytes");
}
}
}
```
Now run the Application by pressing F5 and switch over to the Azure Portal and open your Storage account that you just created. Click on **Blobs** and the **originals** Container and you'll now want to click **Upload** and select a file on your physical disk.
<img :src="$withBase('/files/imageresizer6.png')">
Make note of the filename and check your running Azure Function.
<img :src="$withBase('/files/imageresizer7.png')">
If you switch over to the **thumbs** container, you should see a new file with the format that we specified.
<img :src="$withBase('/files/imageresizer8.png')">
If you click on the filename, you can see the Properties and even Download the file.
Success! We've accomplished this task in the time it would take to watch a 30-minute TV Show.
blog/blog/tip159.md
---
type: post
title: "Tip 159 - Use Azure Logic Apps and Cosmos DB to monitor and archive Twitter hashtags"
excerpt: "Learn how to use Azure Logic Apps and Cosmos DB to monitor and archive Twitter hashtags"
tags: [Databases, Integration]
date: 2018-09-23 17:00:00
---
::: tip
:bulb: Learn more : [Azure Logic Apps Documentation](https://docs.microsoft.com/azure/logic-apps/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Use Azure Logic Apps and Cosmos DB to monitor and archive Twitter hashtags
I love data and use it constantly to improve everything in my personal life as well as my professional life. As we are about to begin the Microsoft Ignite conference, I wanted to collect tweets that use the #MSIgnite hashtag and save them to a database. I also don't want to code as I'm working on 3 sessions right now. Here's how I did it.
#### Create a Cosmos DB instance
Inside of the Azure Portal, create a Cosmos DB instance.
For Cosmos DB :
* Use **SQL** for the API
* For Database ID use **cosmosdb-ignite**
* For Collection ID use **items**
* Throughput use **400**
<img :src="$withBase('/files/azlogiccosmos1.png')">
<img :src="$withBase('/files/azlogiccosmos3.png')">
#### Create a Logic App instance
Inside of the Azure Portal, create a Logic App instance per the screenshot below
<img :src="$withBase('/files/azlogiccosmos2.png')">
#### Logic App Designer
Open the Logic App that you just created and select **When a new tweet is posted** and log in with your Twitter credentials and select the interval and text you wish to search for. In my case I'm using #MSIgnite.
<img :src="$withBase('/files/azlogiccosmos4.png')">
Choose an action that is **Create or update document**
<img :src="$withBase('/files/azlogiccosmos5.png')">
Provide the Connection Name (anything you want) and the account you wish to use.
<img :src="$withBase('/files/azlogiccosmos6.png')">
Fill out the following fields:
* For Database ID use **cosmosdb-ignite**
* For Collection ID use **items**
* For Document use:
```
{
"created": @{triggerBody()?['CreatedAtIso']},
"id": @{triggerBody()?['TweetId']},
"text": @{triggerBody()?['TweetText']},
"user": "@{triggerBody()?['TweetedBy']}"
}
```
Please note that these are dynamic fields, so you might not be able to copy and paste that text.
<img :src="$withBase('/files/azlogiccosmos7.png')">
Click Save and then go into your Cosmos DB Instance and you can query the database to see the data coming in.
<img :src="$withBase('/files/azlogiccosmos8.png')">
blog/blog/tip16.md
---
type: post
title: "Tip 16 - Deploy Jekyll Site Hosted on GitHub Pages to Azure"
excerpt: "Learn how to quickly deploy a Jekyll based site hosted on GitHub Pages to Azure"
tags: [Web, DevOps]
date: 2017-09-13 17:00:00
---
### Deploy Jekyll Site Hosted on GitHub Pages to Azure
If you already have an existing [Jekyll](https://jekyllrb.com/) based site that is hosted on GitHub, you can easily deploy that site to [Azure App Services](https://azure.microsoft.com/services/app-service/web?WT.mc_id=azure-azuredevtips-azureappsdev).
But why? If [GitHub Pages](https://pages.github.com/) is free, then why pay?
* You might want to push your site to a private repo (instead of public)
* Setting up "real" SSL, compared to the [workarounds](https://css-tricks.com/switching-site-https-shoestring-budget/) (see comments)
* Taking advantage of [deployment slots](https://docs.microsoft.com/azure/app-service-web/web-sites-staged-publishing?WT.mc_id=docs-azuredevtips-azureappsdev).
I'm sure there are more, but those are top of mind for me.
#### Let's begin
I'm assuming you already have a GitHub Pages site that uses Jekyll hosted on GitHub. If that is true, then the first thing that you'll want to do is grab these four files.
* [deploy.cmd](https://github.com/mbcrump/mbcrump.github.io/blob/master/deploy.cmd?WT.mc_id=github-azuredevtips-azureappsdev) - is a [Kudu](https://github.com/projectkudu/kudu?WT.mc_id=github-azuredevtips-azureappsdev) Deployment script that handles setup and deployment of the web site and ensures Ruby is installed
* [getruby.cmd](https://github.com/mbcrump/mbcrump.github.io/blob/master/getruby.cmd?WT.mc_id=github-azuredevtips-azureappsdev) - is a script that ensures the latest version of Ruby is installed and that Jekyll has been built
* [.deployment](https://github.com/mbcrump/mbcrump.github.io/blob/master/.deployment?WT.mc_id=github-azuredevtips-azureappsdev) - is a configuration file that Kudu understands; it calls the `deploy.cmd` script
* [Gemfile](https://github.com/mbcrump/mbcrump.github.io/blob/master/Gemfile?WT.mc_id=github-azuredevtips-azureappsdev) - you probably already have this but ensure it is there and if not then just copy mine.
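For reference, the `.deployment` file itself is just a tiny configuration file pointing Kudu at the deployment script. A sketch of its typical contents:

```
[config]
command = deploy.cmd
```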
***Thanks goes to [Khalid Abuhakmeh](https://github.com/khalidabuhakmeh?WT.mc_id=github-azuredevtips-azureappsdev) for writing the scripts***
Once you have these files, ensure they are in the root of your public GitHub pages site (ex. something.github.io)
You'll want to go inside of your **Azure Portal** (or use the CLI tools) and create an **App Service** -> **Web App**. Once the site is deployed, then go to **Deployment Options** and select GitHub, your project and press OK.
<img :src="$withBase('/files/azuretip16.gif')">
You should see "Setting up Deployment Source..." in the notification window. You'll probably want to wait a good 15 to 20 minutes for Azure to set everything up. You can stay on the **Deployment Options** blade and you should see the status of the deployment.
<img :src="$withBase('/files/fetchanddeploy.png')">
After a while you'll see a check mark indicating that it completed successfully. Now you can navigate to the URL listed on the **Overview** blade.
blog/blog/tip161.md
---
type: post
title: "Tip 161 - Change the Azure Function runtime version after Deployment"
excerpt: "Learn how to use change the azure function runtime version after deployment"
tags: [Serverless]
date: 2018-10-01 17:00:00
---
::: tip
:bulb: Learn more : [Azure Functions Documentation](https://docs.microsoft.com/azure/azure-functions/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Change the Azure Function runtime version after Deployment
If you have used Azure Functions since the beginning, then chances are you've started with a 1.x runtime. Since 2.x is out, you may want to upgrade to it but will be greeted with the following message:
<img :src="$withBase('/files/changetheazure181001-1.png')">
This helps protect users from breaking their Azure Function, as the v1 and v2 runtimes are not meant to be interchanged.
If you still want to do this, then you can simply change the `FUNCTIONS_EXTENSION_VERSION` App Setting to `~1` or `~2` to target the runtime that you want.
<img :src="$withBase('/files/changetheazure181001-2.png')">

blog/blog/tip162.md
---
type: post
title: "Tip 162 - ARM Templates Demystified"
excerpt: "Learn how to get started with ARM Templates"
tags: [Management and Governance]
date: 2018-10-07 17:00:00
---
::: tip
:bulb: Learn more : [An introduction to Azure Resource Manager](https://docs.microsoft.com/azure/azure-resource-manager/resource-group-overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [How to use Azure Automation with PowerShell](https://www.youtube.com/watch?v=pQ9dQ13B2vM&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=50?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### ARM Templates Demystified
* [Part 1 - This Post](tip162.html)
* [Part 2](tip163.html)
* [Part 3](tip164.html)
#### Intro
I've been hearing that a lot of people are having trouble with ARM templates. Either they don't understand them and don't know how to use them, or they do use them but the templates are too hard to use. This feedback really surprised me and calls out for a quick demystification. In this post, I'm going to explain what ARM templates are and why you should care.
#### ARM Templates Demystified
First off, it occurred to me that the name might be confusing. **ARM** here doesn't have anything to do with CPU architecture such as what is used in phones and tablets, which would be an honest misunderstanding. ARM stands for [Azure Resource Manager](https://docs.microsoft.com/azure/azure-resource-manager/resource-group-overview?WT.mc_id=docs-azuredevtips-azureappsdev), which is how you organize all your stuff on Azure—virtual machines, databases, storage accounts, containers, web apps, bots, and more. In Azure, we call your stuff resources.
<img :src="$withBase('/files/resources_page.PNG')">
You will tend to create many related items together in Azure; for example, a web app and a database will be contained in a single resource group. But what if you had to do this frequently for many different clients? Wouldn't it be nice to have an easy way to keep track of all the repeatable groups of resources you are creating for people? Good news, this is exactly what ARM Templates do!
#### What are some real-world scenarios for using ARM Templates?
When you need an easy way to repeat infrastructure deployments on Azure, ARM templates are going to save you a lot of time. If you ever need to share your infrastructure with other people—like for open source projects or for blogs, tutorials, and documentation—ARM templates will be a lifesaver and will let your readers and users replicate what you did. Finally, if you just need a way to keep track of what you deployed on Azure, ARM templates are a great way to help you remember.
#### An ARM Template is just a JSON file
This is where ARM templates come in. ARM templates are **JSON files** that act like blueprints for the related resources you want to deploy together. You'll also hear this called “infrastructure as code,” which is geek speak for being able to upload your infrastructure notes to GitHub if you want to. It's a structured format for keeping track of your Azure infrastructure with some superpowers. The biggest ARM template superpower is that you can use templates to automate your infrastructure deployments because Azure knows how to read them.
In the example below, we are creating an ARM template that creates a Notification Hub. You'll see that it contains things that the deployment needs such as Name, Location, etc.
```json
{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json",
"contentVersion": "1.0.0.0",
"parameters": {
"namespaceName": {
"type": "string",
"metadata": {
"description": "The name of the Notification Hubs namespace."
}
},
"location": {
"type": "string",
"defaultValue": "[resourceGroup().location]",
"metadata": {
"description": "The location in which the Notification Hubs resources should be deployed."
}
}
},
"variables": {
"hubName": "MyHub"
},
"resources": [
{
"apiVersion": "2014-09-01",
"name": "[parameters('namespaceName')]",
"type": "Microsoft.NotificationHubs/namespaces",
"location": "[parameters('location')]",
"kind": "NotificationHub",
"resources": [
{
"name": "[concat(parameters('namespaceName'), '/', variables('hubName'))]",
"apiVersion": "2014-09-01",
"type": "Microsoft.NotificationHubs/namespaces/notificationHubs",
"location": "[parameters('location')]",
"dependsOn": [
"[parameters('namespaceName')]"
]
}
]
}
]
}
```
But more on that later!
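Since a template is just JSON, you can inspect it with whatever tooling you like. As a quick illustration (a minimal sketch using only the Python standard library; it is not part of the tip itself), here's how you might list a template's parameters and resource types:

```python
import json

# A trimmed-down version of the Notification Hub template shown above.
template = json.loads("""
{
  "contentVersion": "1.0.0.0",
  "parameters": {
    "namespaceName": {"type": "string"},
    "location": {"type": "string", "defaultValue": "[resourceGroup().location]"}
  },
  "resources": [
    {"type": "Microsoft.NotificationHubs/namespaces", "apiVersion": "2014-09-01"}
  ]
}
""")

# Enumerate what the template expects as input and what it will deploy.
params = sorted(template["parameters"])
types = [r["type"] for r in template["resources"]]

print(params)  # ['location', 'namespaceName']
print(types)   # ['Microsoft.NotificationHubs/namespaces']
```

Because it's plain JSON, this kind of quick inspection also works nicely in code review or CI.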
#### Getting Started
You can make an ARM template in Visual Studio, in Visual Studio Code, or in the Azure portal. The last way is probably the easiest since it walks you through the process. Start creating a resource through the portal the way you normally would, by clicking **Create a resource** on your Azure Portal dashboard. Now select what you'd like to create. I'm going to create a **Web App**. Look at the bottom of this page and you'll see **Automation options**.
<img :src="$withBase('/files/new_webapp.png')">
But what does that unassuming link labeled **Automation options** do? You've probably never noticed it before, or been afraid to mess with it, but if you click it, you will be on the road to creating an ARM template for your resource. But you'll have to come back tomorrow for Part 2!

blog/blog/tip163.md
---
type: post
title: "Tip 163 - Provide Static App Settings Values in an ARM Template"
excerpt: "Learn how to provide static values to ARM templates"
tags: [Management and Governance, Web]
date: 2018-10-08 18:00:00
---
::: tip
:bulb: Learn more : [An introduction to Azure Resource Manager](https://docs.microsoft.com/azure/azure-resource-manager/resource-group-overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [How to use Azure Automation with PowerShell](https://www.youtube.com/watch?v=pQ9dQ13B2vM&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=50?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### Provide Static App Settings Values in an ARM Template
* [Part 1](tip162.html)
* [Part 2 - This Post](tip163.html)
* [Part 3](tip164.html)
#### Intro
Building on my previous Azure Tips and Tricks post about what ARM templates are and why you should care, I wanted to give you a quick recipe for a common development task. You've already seen that you can automate deploying a web app (and many other resources), but can you also copy configuration information like app settings with your ARM template? Yes!
#### Getting Started
Go ahead and click **Create a resource** inside the Azure Portal and select **Web App**.
Enter a **Name** and a **Resource Group** for your web app and click **Automation options** at the bottom before you hit **Create** in order to start creating your ARM template.
<img :src="$withBase('/files/new_webapp.png')">
After you click **Automation options**, then this is what you will see:
<img :src="$withBase('/files/arm_template.png')">
The template to create a web app (or any other Azure resource) is simply a JSON file with multiple values describing how your web app is going to be deployed.
#### Create Static App Settings for your Azure App Service
To make things as easy as possible, let's assume for now that you want to add the exact same settings every time you deploy your web app template.
Go to **Deploy**, then **Edit Template**, and paste the following settings fragment, overwriting your template's resources section. (You could, of course, add as many keys as your web app needs.)
Make note that we are adding three names and three values for **MyFirstName**, **MyLastName**, and **MySSN**.
```json
"resources": [
{
"apiVersion": "2016-03-01",
"name": "[parameters('name')]",
"type": "Microsoft.Web/sites",
"properties": {
"name": "[parameters('name')]",
"siteConfig": {
"appSettings": [
{
"name": "MyFirstName",
"value": "Michael"
},
{
"name": "MyLastName",
"value": "Crump"
},
{
"name": "MySSN",
"value": "355-643-3356"
}
]
},
"serverFarmId": "[concat('/subscriptions/', parameters('subscriptionId'),'/resourcegroups/', parameters('serverFarmResourceGroup'), '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]",
"hostingEnvironment": "[parameters('hostingEnvironment')]"
},
"location": "[parameters('location')]"
}],
```
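If you'd rather script this edit than paste JSON by hand, the same injection can be sketched in a few lines of Python (a stdlib-only illustration; the helper name is hypothetical, not an Azure API):

```python
import json

# Hypothetical helper: inject static app settings into a web-app
# resource fragment like the one pasted into the portal above.
def add_app_settings(resource: dict, settings: dict) -> dict:
    site_config = resource.setdefault("properties", {}).setdefault("siteConfig", {})
    app_settings = site_config.setdefault("appSettings", [])
    for name, value in settings.items():
        app_settings.append({"name": name, "value": value})
    return resource

resource = {"type": "Microsoft.Web/sites", "properties": {}}
add_app_settings(resource, {"MyFirstName": "Michael", "MyLastName": "Crump"})
print(json.dumps(resource, indent=2))
```
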
Press **Save** and ensure the **Basic** and **Settings** sections are filled out. Agree to the terms and check the **Purchase** option.
<img :src="$withBase('/files/customdeployment.png')">
Note: If the deployment fails, give it another shot. I have had this happen, but it may only occur because I use the Preview.
Your Azure App Settings (for **MyFirstName**, **MyLastName**, and **MySSN**) will now be deployed.
After deployment, navigate to your **App Service** and go to **Application Settings** and you'll see your site deployed along with the settings (for **MyFirstName**, **MyLastName**, and **MySSN**) that we specified earlier.
<img :src="$withBase('/files/create_resource1.png')">
Come back tomorrow and we'll take a look at adding parameters!

blog/blog/tip164.md
---
type: post
title: "Tip 164 - Defining Parameters to be used with ARM Templates"
excerpt: "Learn how to provide parameters to ARM templates"
tags: [Management and Governance]
date: 2018-10-14 17:00:00
---
::: tip
:bulb: Learn more : [An introduction to Azure Resource Manager](https://docs.microsoft.com/azure/azure-resource-manager/resource-group-overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [How to use Azure Automation with PowerShell](https://www.youtube.com/watch?v=pQ9dQ13B2vM&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=50?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### Defining Parameters to be used with ARM Templates
* [Part 1](tip162.html)
* [Part 2](tip163.html)
* [Part 3 - This Post](tip164.html)
#### Intro
You've already seen that you can automate deploying static configuration information like app settings with your ARM template. But what about providing parameters that allow end users to input values **BEFORE** deployment? That is what we'll learn today!
#### Getting Started
Go ahead and search for **Templates** inside the Azure Portal and click **Add** to create a new one.
Enter a **name** and a **description** for the ARM template.
<img :src="$withBase('/files/customdeploy5.png')">
#### Fill-in-the-blank settings
We want dynamic settings that are customizable every time you deploy your web app instead of being the same each time. To get them, you just need to add parameters for the values you want to your ARM template.
Below is a sample of three values (`FirstNameValue`, `LastNameValue` and `SSNValue`) that we previously hard-coded:
```json
"FirstNameValue": {
"type": "string"
},
"LastNameValue": {
"type": "string"
},
"SSNValue": {
"type": "string"
},
```
We'll add the same parameters called `FirstNameValue`, `LastNameValue` and `SSNValue` to the parameters collection of the template. From now on, every time you deploy this template, you will be prompted to enter a value for each one.
```json
"siteConfig": {
"appSettings": [
{
"name": "MyFirstName",
"value": "[parameters('FirstNameValue')]"
},
{
"name": "MyLastName",
"value": "[parameters('LastNameValue')]"
},
{
"name": "MySSN",
"value": "[parameters('SSNValue')]"
}
]
}
```
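Under the hood, Azure Resource Manager substitutes each `[parameters('...')]` expression with the value you supply at deployment time. Here's a toy sketch of that substitution (illustrative only; the real template expression language is far richer than this one-pattern resolver):

```python
import re

# Toy resolver for ARM parameter expressions like "[parameters('FirstNameValue')]".
def resolve(value: str, parameters: dict) -> str:
    match = re.fullmatch(r"\[parameters\('(\w+)'\)\]", value)
    return parameters[match.group(1)] if match else value

app_settings = [
    {"name": "MyFirstName", "value": "[parameters('FirstNameValue')]"},
    {"name": "MyLastName",  "value": "[parameters('LastNameValue')]"},
]
supplied = {"FirstNameValue": "Michael", "LastNameValue": "Crump"}

resolved = [{s["name"]: resolve(s["value"], supplied)} for s in app_settings]
print(resolved)  # [{'MyFirstName': 'Michael'}, {'MyLastName': 'Crump'}]
```
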
#### Putting it all together
Our full template file looks like the following:
```json
{
"$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
"contentVersion": "1.0.0.0",
"parameters": {
"appServiceName": {
"type": "string",
"minLength": 1,
"maxLength": 10
},
"appServicePlanName": {
"type": "string",
"minLength": 1
},
"FirstNameValue": {
"type": "string"
},
"LastNameValue": {
"type": "string"
},
"SSNValue": {
"type": "string"
},
"appServicePlanSkuName": {
"type": "string",
"defaultValue": "S1",
"allowedValues": [
"F1",
"D1",
"B1",
"B2",
"B3",
"S1",
"S2",
"S3",
"P1",
"P2",
"P3",
"P4"
],
"metadata": {
"description": "Describes plan's pricing tier and capacity. Check details at https://azure.microsoft.com/pricing/details/app-service/"
}
}
},
"variables": {
"appHostingPlanNameVar": "[concat(parameters('appServicePlanName'),'-apps')]"
},
"resources": [
{
"name": "[variables('appHostingPlanNameVar')]",
"type": "Microsoft.Web/serverfarms",
"location": "[resourceGroup().location]",
"apiVersion": "2015-08-01",
"sku": {
"name": "[parameters('appServicePlanSkuName')]"
},
"dependsOn": [],
"tags": {
"displayName": "appServicePlan"
},
"properties": {
"name": "[variables('appHostingPlanNameVar')]",
"numberOfWorkers": 1
}
},
{
"name": "[parameters('appServiceName')]",
"type": "Microsoft.Web/sites",
"location": "[resourceGroup().location]",
"apiVersion": "2015-08-01",
"dependsOn": [
"[resourceId('Microsoft.Web/serverfarms', variables('appHostingPlanNameVar'))]"
],
"tags": {
"[concat('hidden-related:', resourceId('Microsoft.Web/serverfarms', variables('appHostingPlanNameVar')))]": "Resource",
"displayName": "webApp"
},
"properties": {
"name": "[parameters('appServiceName')]",
"serverFarmId": "[resourceId('Microsoft.Web/serverfarms', variables('appHostingPlanNameVar'))]",
"siteConfig": {
"appSettings": [
{
"name": "MyFirstName",
"value": "[parameters('FirstNameValue')]"
},
{
"name": "MyLastName",
"value": "[parameters('LastNameValue')]"
},
{
"name": "MySSN",
"value": "[parameters('SSNValue')]"
}
]
}
}
}
],
"outputs": {}
}
```
If you save the template, then the next time you deploy resources using this ARM template, you will be required to put in a new value for the **First Name**, **Last Name**, and **SSN** that will be used in your application settings.
<img :src="$withBase('/files/customdeploy3.png')">
And after deployment, go and check your **App Settings**.
<img :src="$withBase('/files/customdeploy4.png')">
I hope this three-part series helped!

blog/blog/tip166.md
---
type: post
title: "Tip 166 - Data Storage Options with Azure Storage and Cosmos DB"
excerpt: "Learn about different storage options in Azure"
tags: [Databases, Storage]
date: 2018-10-21 17:00:00
---
::: tip
:bulb: Learn more : [Azure Cosmos DB](https://docs.microsoft.com/azure/cosmos-db/introduction?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Data Storage Options with Azure Storage and Cosmos DB
#### Azure Table Storage and Azure Cosmos DB
Before you dive into this article, keep in mind that this is not a head-to-head comparison; use what you feel is right for your scenario.
#### Azure Table Storage in a nutshell
[Azure Table Storage](https://azure.microsoft.com/services/storage/tables?WT.mc_id=azure-azuredevtips-azureappsdev) offers a NoSQL key-value store for semi-structured data.
Unlike a traditional relational database, each entity (a row, in relational database terminology) can have a different structure, allowing your application to evolve without downtime to migrate between schemas.
#### Azure Cosmos DB in a nutshell
[Azure Cosmos DB](https://azure.microsoft.com/services/cosmos-db?WT.mc_id=azure-azuredevtips-azureappsdev) is a multi-model database service designed for global use in mission-critical systems. In addition to a Table API, it exposes SQL, Apache Cassandra, MongoDB, and Gremlin APIs. These allow you to easily swap out existing databases with a Cosmos DB implementation.
#### Performance
Azure Table Storage has no upper bound on latency. Cosmos DB guarantees single-digit-millisecond latency for reads and writes, with operations under 15 milliseconds at the 99th percentile worldwide. (That was a mouthful.) Throughput on Table Storage is limited to 20,000 operations per second. On Cosmos DB, there is no upper limit on throughput, and more than 10 million operations per second are supported. Unlike Table Storage, Cosmos DB automatically indexes all properties, and queries can take advantage of this to improve performance.
#### Global Distribution
Azure Table Storage supports a single region with an optional read-only secondary region for availability. Cosmos DB supports distribution from 1 to more than 30 regions with automatic failovers worldwide. You can easily manage this from the Azure portal and define the failover behavior. Five defined consistency levels allow you to select the right balance between availability, latency, throughput, and consistency.
#### Billing
Azure Table Storage uses storage volume to determine billing. It is priced per GB and the rates vary depending on the redundancy level selected. Pricing is tiered to get progressively cheaper per GB the more storage you use. Operations incur a charge measured per 10,000 transactions. All operation types are treated the same.
For Cosmos DB, throughput, measured in request units (RU), is also billed. The database is provisioned with a level of throughput in increments of 100 RU per second, billed hourly, and you can elastically scale to meet changes in workload. This is in addition to a storage rate at a higher price per GB. Therefore, if your requirements are more modest and you don't need the additional performance and redundancy options, Azure Table Storage could be an option. This may all seem very difficult to calculate, but there is an online tool that makes it easy to estimate costs.
<img :src="$withBase('/files/azure-cosmos-planner.png')">
A calculator for estimating request units and data storage can be found at https://www.documentdb.com/capacityplanner.
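To see how hourly throughput billing and storage combine, here's a back-of-envelope sketch. The rates below are made-up placeholders, NOT actual Azure prices; use the capacity planner for real estimates.

```python
# Placeholder rates -- assumptions for illustration only, not real pricing.
HOURS_PER_MONTH = 730
RATE_PER_100_RU_PER_HOUR = 0.008
STORAGE_RATE_PER_GB_MONTH = 0.25

def estimate_monthly_cost(ru_per_second: int, storage_gb: float) -> float:
    # Throughput is provisioned in increments of 100 RU/s and billed hourly.
    ru_blocks = -(-ru_per_second // 100)  # ceiling division
    throughput = ru_blocks * RATE_PER_100_RU_PER_HOUR * HOURS_PER_MONTH
    storage = storage_gb * STORAGE_RATE_PER_GB_MONTH
    return round(throughput + storage, 2)

# Four 100 RU/s blocks plus 10 GB of storage at the placeholder rates.
print(estimate_monthly_cost(400, 10))
```
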
#### Consistent API
Both Azure Table Storage and Azure Cosmos DB support the same [Table API](https://docs.microsoft.com/azure/cosmos-db/table-introduction?WT.mc_id=docs-azuredevtips-azureappsdev). SDKs are available for common programming environments along with a generic REST API. Because Cosmos DB exposes a superset of functionality, there are some overloaded API methods to specify additional options. The common API makes it easier to migrate a solution from Azure Table Storage to Cosmos DB as it grows. It also makes learning the API easier. You can target Azure Table Storage, Azure Cosmos DB, or the Azure storage emulator running locally by simply replacing the connection string.
#### A Code Sample
Sometimes I like to just look at the code vs. writing a full app and here is an example of accessing the Table API mentioned above.
In code, I'd create an instance of **CloudStorageAccount** passing your connection string. Then, from this object, you can call **CreateCloudTableClient()** to return the **CloudTableClient** object that can work with the Table Storage or Cosmos DB instance. To create a new table requires just two further lines of code: first, **GetTableReference(tablename)**; then, call **CreateIfNotExists** on the returned **CloudTable**.
To create entities to store in your table, they have to be classes derived from **TableEntity**. This base class handles the serialization of the entity into a format that can be stored in the database. It also has standard properties for **PartitionKey** and **RowKey** that provide a unique identity for the item.
An individual operation on the table is performed with either a **TableOperation** or a **TableBatchOperation** for multiple items. You create a new operation, set up the actions to perform, and then call Execute on the **CloudTable** passing the operation object.
```csharp
EmployeeEntity employee1 = new EmployeeEntity("Case", "Justin");
employee1.Email = "justin@contoso.com";
employee1.PhoneNumber = "425-555-0101";
// Create the TableOperation object that inserts the employee entity.
TableOperation insertOperation = TableOperation.Insert(employee1);
// Execute the insert operation.
table.Execute(insertOperation);
```
In addition to **Insert**, **Retrieve**, **Replace**, and **Delete** operations, there is an **InsertOrReplace** operation that will overwrite an entity matching the partition and row key.
#### Migration
If you have existing data in Azure Table Storage, you can migrate it to Cosmos DB using the [Data Migration tool](https://docs.microsoft.com/azure/cosmos-db/import-data?WT.mc_id=docs-azuredevtips-azureappsdev) or [AzCopy](https://docs.microsoft.com/azure/storage/common/storage-use-azcopy?WT.mc_id=docs-azuredevtips-azureappsdev). Cosmos also supports import from other databases such as MongoDB. You may choose to start development with Table Storage and then migrate to Cosmos as your requirements evolve. Because the API is the same there is no impact to the code you write.
#### Wrapping up
So far we've covered a lot of information; the handy table below should help make sense of it all.
| | Azure Table Storage | Azure Cosmos DB |
| ---------- | ------------------- | --------------- |
| Throughput | Up to 20K operations per second | No upper limit, supports >10M operations per second |
| Redundancy | One optional secondary read-only region | Multiple configurable worldwide regions |
| Latency | No upper bounds | Single-digit milliseconds |
| Consistency | Strong and Eventual consistency models | Five defined consistency models |
| Query | Optimized for query on primary key | Improved query performance because all fields are indexed |
| Failover | Can't initiate failover | Automatic and manual failovers |
| Billing | Storage-based | Throughput-based |
Cosmos DB is a superset of the Azure Table Storage functionality. You will choose Cosmos DB when you need multiple region redundancy, the highest throughput, minimal latency, or control of failover scenarios. You can learn more about the differences between Azure Table Storage and Cosmos DB, along with the Table API [here](https://docs.microsoft.com/azure/cosmos-db/table-introduction?WT.mc_id=docs-azuredevtips-azureappsdev).
Thanks for reading and until next time! MC signing off.

blog/blog/tip167.md
---
type: post
title: "Tip 167 - Migrating Data from Cosmos DB to Local JSON files"
excerpt: "Learn how to migrate data from Cosmos DB to local JSON files"
tags: [Databases]
date: 2018-10-22 17:00:00
---
::: tip
:bulb: Learn more : [Azure Cosmos DB](https://docs.microsoft.com/azure/cosmos-db/introduction?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Migrating Data from Cosmos DB to Local JSON files
#### Using the Data Migration Tool with Cosmos DB
One task that seems to come up over and over is migrating data from one database or format to another. I recently used Cosmos DB as my database to store every tweet that came out of Ignite. Once I had the data and no longer needed Cosmos DB for that exercise, I needed to dump the data out to a local file to preserve it and save money. Here is how I did it.
#### The Tools
Download and install the [Azure DocumentDB Data Migration Tool](https://aka.ms/csdmtool?WT.mc_id=microsoft-azuredevtips-azureappsdev)
#### Get to Work
Ensure you have a Cosmos DB database and collection created that you wish to migrate out.
Go to **Keys** (inside your Cosmos DB blade in the portal) to copy the **Primary Connection String**
<img :src="$withBase('/files/migrationcosmos2.png')">
You'll need to append the database name to the end of the string. For example, `Database=cosmosdb-ignite` is appended to the key copied earlier: `AccountEndpoint=https://mbcrump.documents.azure.com:443/;AccountKey=VxDEcJblah==;Database=cosmosdb-ignite`. Save this for later.
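The shape of the final string can be sketched as follows (with an obviously fake endpoint and key standing in for your real values):

```python
# Account connection string from the Keys blade (fake placeholder values here),
# plus the Database=<name> suffix the migration tool expects.
account = "AccountEndpoint=https://example.documents.azure.com:443/;AccountKey=FAKEKEY==;"
database = "cosmosdb-ignite"

connection_string = f"{account}Database={database}"
print(connection_string)
```
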
Open the **Data Migration Tool** and under **Source Information**, select **DocumentDB** as shown below.
You'll need to add the **ConnectionString** (that we just created) along with the **Collection** and in my case it is `items`. We'll take the defaults on the rest and press **Verify** and if successful, then press **Next**.
<img :src="$withBase('/files/migratingdatafromcosmosdb-1.png')">
In my case, I'll export to a local JSON file and select **Prettify JSON** and press **Next**.
<img :src="$withBase('/files/migratingdatafromcosmosdb-2.png')">
On the next screen, you'll see a **View Command** to see the command that will be used to migrate your data. This is helpful to just learn the syntax.
<img :src="$withBase('/files/migratingdatafromcosmosdb-3.png')">
<img :src="$withBase('/files/migratingdatafromcosmosdb-4.png')">
Finally, you'll see the import has completed, with over 100K items transferred in a little under 2 minutes.
<img :src="$withBase('/files/migratingdatafromcosmosdb-5.png')">
We now have our local JSON file and can use it however we want! Awesome!

blog/blog/tip168.md
---
type: post
title: "Tip 168 - A quick tour around Azure DevOps Projects using Node.js and AKS - Part 1"
excerpt: "Learn how to use Azure DevOps Projects"
tags: [DevOps, Kubernetes, Languages & Frameworks]
date: 2018-10-28 17:00:00
---
::: tip
:bulb: Learn more : [Azure DevOps Documentation](https://docs.microsoft.com/azure/devops/?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [Quick tour of Azure DevOps projects using Node.js and AKS: Part 1](https://www.youtube.com/watch?v=bwpW44aQ7lU&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=47?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### A quick tour around Azure DevOps Projects using Node.js and AKS - Part 1
In this post, I will walk you through creating a new [Azure Kubernetes Service](https://azure.microsoft.com/services/kubernetes-service?WT.mc_id=azure-azuredevtips-azureappsdev) (AKS) cluster using an [Azure DevOps Project](https://azure.microsoft.com/features/devops-projects?WT.mc_id=azure-azuredevtips-azureappsdev) and take a look under the hood to help you understand how to get started with AKS.
#### Hold up - What is Azure DevOps and AKS (in a nutshell)?
Azure DevOps Services is a cloud service for collaborating on code development such as:
* Source control
* CI/CD
* Agile tooling
* A variety of tools to test your apps, including manual/exploratory testing, load testing, and continuous testing
* Dashboards
* Built-in wiki
Azure Kubernetes Service (AKS) manages your hosted Kubernetes environment, making it quick and easy to deploy and manage containerized applications without container orchestration expertise. It also eliminates the burden of ongoing operations and maintenance by provisioning, upgrading, and scaling resources on demand, without taking your applications offline.
#### Create the DevOps Project
In the Azure portal, search for **DevOps** and choose **DevOps Project** from the results. Click the **Add** button, select the **Node.js** application, and then click the **Next** button. Select **Express.js** for the application framework and click **Next**. For deploying the application, select **Kubernetes Service** and click **Next**. Now give the DevOps project an **Organization name** and **Project name**. Provide a subscription and a **cluster name**, and click the **Done** button.
<img :src="$withBase('/files/devops-k8s1.gif')">
A lot of amazing work is happening in the background, so now is the time to drink a cup of coffee or read my [other tips](https://aka.ms/AzureTipsAndTricks) before clicking the **refresh** button on your DevOps Projects list.
<img :src="$withBase('/files/devops-k8s2.png')">
**Click** on your **DevOps project name** in the list to go to the DevOps project dashboard. This has everything you need to access the source code repository and the application build and release pipeline (which automates the steps needed to take your code, build it and deploy to a live Kubernetes environment). There are also links to your live deployed application, the Kubernetes cluster and [Application Insights](https://docs.microsoft.com/azure/application-insights/app-insights-overview?WT.mc_id=docs-azuredevtips-azureappsdev) (telemetry for your live site).
<img :src="$withBase('/files/devops-k8s3.png')">
You should also have received an email letting you know the project is ready. You can also start setting up your team members.
<img :src="$withBase('/files/devops-k8s20.png')">
#### Taking a peek at the code
In the CI/CD pipeline, **click** on the commit to see the code or you can optionally click on **Master** to take you to the full file list.
<img :src="$withBase('/files/devops-k8s4.png')">
This takes you to the commit for the repo we just deployed containing the deployed sample app.
<img :src="$withBase('/files/devops-k8s5.png')">
When you created the DevOps project, it cloned the source from the [devops-project-samples](https://github.com/Microsoft/devops-project-samples?WT.mc_id=github-azuredevtips-azureappsdev) GitHub project, added it to your DevOps project, and did a lot of the initial plumbing for you. How cool is that?
#### Taking a look at the Build
Back on the DevOps Project dashboard, click the **Build link** that has the successful build number.
<img :src="$withBase('/files/devops-k8s6.png')">
This takes you to the new Azure DevOps Pipelines build that was created for the project. Now click the **Edit** button at the top.
<img :src="$withBase('/files/devops-k8s7.png')">
You now see the steps created for building a container image and a [Helm package](https://helm.sh/). For those that need a refresher, **Helm** is used to deploy applications to Kubernetes, and it is used by default with DevOps projects that target Kubernetes.
<img :src="$withBase('/files/devops-k8s8.png')">
Clicking on either of the Docker tasks will show you the new Azure Container Registry (ACR) that the DevOps project created, along with the image name (plus the build number).
<img :src="$withBase('/files/devops-k8s9.png')">
The build creates an ACR, builds and pushes the image using Docker, checks to see if Helm is installed, and begins packaging and deploying the `charts/sampleapp` directory. It also creates the ARM templates and finally publishes the build artifact.
If you switch back over to **Repos**, then **Files**, then **Applications**, you can see the `charts/sampleapp` folder.
<img :src="$withBase('/files/devops-k8s10.png')">
I hope that helps and tomorrow we'll take a look at the **Dev** and **Azure Resources** portion.

blog/blog/tip169.md
---
type: post
title: "Tip 169 - A quick tour around Azure DevOps Projects using Node.js and AKS - Part 2"
excerpt: "Learn how to use Azure DevOps Projects"
tags: [DevOps, Kubernetes, Languages & Frameworks]
date: 2018-10-29 17:00:00
---
::: tip
:bulb: Learn more : [Azure DevOps Documentation](https://docs.microsoft.com/azure/devops/?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [Quick tour of Azure DevOps projects using Node.js and AKS: Part 2](https://www.youtube.com/watch?v=QxTqEajKOqo&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=48?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### A quick tour around Azure DevOps Projects using Node.js and AKS - Part 2
#### Intro
We discussed what Azure DevOps Projects are and began creating a project that uses Node.js and Azure Kubernetes Service (AKS). We walked through creating a project from scratch and began looking at the pipeline, which included **code** and **build**. Today we'll finish up the pipeline section by looking at **dev**. We'll also review the resources section. If you haven't read [Part 1](tip168.html), I'd suggest you do so now.
#### Finish up the pipelines section
Back on the DevOps Project dashboard, click the **Release** link with the number (please make sure it is green, indicating a successful build).
<img :src="$withBase('/files/devops-k8s11.png')">
Side note: If the build is **red**, then click on it to see what the error is. A common error that I've personally seen is: **The subscription is not registered to use namespace 'Microsoft.ContainerService'**. One way to quickly resolve that is to bring up **Cloud Shell**, select **PowerShell**, and run the following command:
```powershell
PS Azure:\> Register-AzureRmResourceProvider -ProviderNamespace "Microsoft.ContainerService"
ProviderNamespace : Microsoft.ContainerService
RegistrationState : Registering
ResourceTypes : {containerServices, managedClusters, locations, locations/operationresults...}
Locations : {Japan East, Central US, East US 2, Japan West...}
```
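If you prefer the cross-platform Azure CLI, the same registration can be kicked off with `az provider register --namespace Microsoft.ContainerService`. Registration is asynchronous (note the `Registering` state above), so a small polling loop is handy. Here is a minimal sketch, where `get_state` is a hypothetical stand-in for the real query, `az provider show --namespace "$1" --query registrationState -o tsv`:

```shell
# Minimal sketch: poll until a resource provider reports "Registered".
# get_state is a hypothetical stand-in for:
#   az provider show --namespace "$1" --query registrationState -o tsv
wait_registered() {
  ns="$1"
  attempts="${2:-30}"
  i=0
  while [ "$i" -lt "$attempts" ]; do
    if [ "$(get_state "$ns")" = "Registered" ]; then
      return 0
    fi
    sleep 2
    i=$((i + 1))
  done
  echo "provider $ns is still not registered" >&2
  return 1
}

# Usage against a real subscription (assumes the Azure CLI is installed):
#   az provider register --namespace Microsoft.ContainerService
#   wait_registered Microsoft.ContainerService
```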
Now you'll see the Azure DevOps Pipeline releases. Click the **Edit release** button at the top.
<img :src="$withBase('/files/devops-k8s12.png')">
Then click the **Edit tasks** link.
<img :src="$withBase('/files/devops-k8s13.png')">
You now see the list of tasks that need to run, such as creating the AKS cluster, running a PowerShell script, and packaging and deploying Helm charts.
<img :src="$withBase('/files/devops-k8s14.png')">
That wraps up the Pipelines section and gives you a taste of how powerful Azure DevOps is. Next, we'll look at the Azure resources.
#### Azure resources in a nutshell
Back in the DevOps Project dashboard, let's look at the Azure resources and [Application Insights](https://azure.microsoft.com/services/application-insights?WT.mc_id=azure-azuredevtips-azureappsdev). The resources are the URL to your live site running in Kubernetes and a link to the AKS cluster. The last link takes you to the live telemetry for your site, provided by Application Insights.
<img :src="$withBase('/files/devops-k8s15.png')">
**Click** on the **External Endpoint** link to be taken to the deployed application.
<img :src="$withBase('/files/devops-k8s16.png')">
The second link is to the Azure Kubernetes Service.
<img :src="$withBase('/files/devops-k8s17.png')">
The last link shows the [Application Insights](https://azure.microsoft.com/services/application-insights?WT.mc_id=azure-azuredevtips-azureappsdev) resource created for the service, which includes powerful analytics tools to help you diagnose issues and understand what users actually do with your app. It's designed to help you continuously improve performance and usability, and it works seamlessly with Azure DevOps.
<img :src="$withBase('/files/devops-k8s18.png')">
Now if you go to **Resource Groups** in the Azure portal and search for the name of the DevOps project, you'll see that three resource groups were created.
<img :src="$withBase('/files/devops-k8s19.png')">
I hope that was helpful and hope that you continue learning by going through the [step by step tutorial](https://docs.microsoft.com/azure/devops-project/azure-devops-project-aks?toc=%2F%2Fazure%2Fdevops-project%2Ftoc.json&bc=%2F%2Fazure%2Fbread%2Ftoc.json?WT.mc_id=docs-azuredevtips-azureappsdev) or read more about [Azure DevOps](https://azure.microsoft.com/blog/introducing-azure-devops?WT.mc_id=azure-azuredevtips-azureappsdev).

blog/blog/tip17.md Normal file
@@ -0,0 +1,65 @@
---
type: post
title: "Tip 17 - Use PowerShell with Azure Cloud Shell"
excerpt: "Learn how to take advantage of PowerShell within Azure Cloud Shell"
tags: [Languages & Frameworks]
date: 2017-09-18 17:00:00
---
### Use PowerShell within Azure Cloud Shell
<img :src="$withBase('/files/bashscreenshot.png')">
**PowerShell** is the other command language that the Azure Cloud Shell supports. I've recently [signed up for a preview](https://aka.ms/PSCloudSignup?WT.mc_id=akams-azuredevtips-azureappsdev) and thought I'd share.
You can switch to PowerShell by clicking the dropdown and selecting PowerShell.
<img :src="$withBase('/files/switchtops.png')">
This will prompt you to Restart Cloud Shell with PowerShell which will log you out of your current instance and stop any running processes.
<img :src="$withBase('/files/restartwithps.png')">
On first launch, it'll authenticate with Azure (just like Bash does) and build your Azure drive. You now have access to all your accounts. You can run the `dir` command to see your subscriptions, whereas in Bash it would list the files on the currently mounted drive.
```
PS Azure:\> dir

    Directory: Azure:

Mode  SubscriptionName          SubscriptionId   TenantId         State
----  ----------------          --------------   --------         -----
+     Demo - Azure Monitoring   xxx-xxx-xxx-xxx  xxx-xxx-xxx-xxx  Enabled
```
Now that I have access to my subscriptions, I can traverse the "Demo - Azure Monitoring" subscription by typing `cd '.\Demo - Azure Monitoring\'`.
If I run a `dir` again, then I'd see what is included in that account. Here I see Resource Groups, Storage Accounts, VMs and Web Apps.
```
    Directory: Azure:\Demo - Azure Monitoring

Mode  Name
----  ----
+     AllResources
+     ResourceGroups
+     StorageAccounts
+     VirtualMachines
+     WebApps
```
I can continue traversing this subscription by typing `cd VirtualMachines` then typing `dir`.
```
    Directory: Azure:\Demo - Azure Monitoring\VirtualMachines

Name   ResourceGroupName  Location  VmSize          OsType  NIC     ProvisioningState  PowerState
----   -----------------  --------  ------          ------  ---     -----------------  ----------
k8s-a  APPROVAL           eastus    Standard_D2_v2  Linux   -nic-0  Succeeded          running
```
As you can tell, I can now easily discover and navigate Azure resources by using PowerShell.
The other nice thing I've seen while playing with PowerShell is the ability to run modules such as the ones found in Microsoft.PowerShell. For instance, we can use `Get-Date` as described [here](https://docs.microsoft.com/powershell/module/microsoft.powershell.utility/get-date?view=powershell-5.1&WT.mc_id=docs-azuredevtips-azureappsdev) inside of Cloud Shell.
```
PS Azure:\> Get-Date

Monday, September 18, 2017 11:02:49 PM
```

blog/blog/tip170.md Normal file
@@ -0,0 +1,57 @@
---
type: post
title: "Tip 170 - SAP on Azure in Plain English Part 1 of 2"
excerpt: "Learn about SAP hosted on Azure"
tags: [SAP on Azure]
date: 2018-11-04 17:00:00
---
::: tip
:bulb: Learn more : [Use Azure to host and run SAP workload scenarios](https://docs.microsoft.com/azure/virtual-machines/workloads/sap/get-started?WT.mc_id=azure-azuredevtips-azureappsdev).
:::
### SAP on Azure in Plain English Part 1 of 2
In this series, I take a look at SAP coming from someone who hasn't used it before.
* [Part 1 - This post](tip170.html)
* [Part 2](tip171.html)
#### SAP - A brief history lesson from 1973 - 2018
SAP, the world's third-largest software company, produces business applications including customer relationship management (CRM) and enterprise resource planning (ERP) solutions. Since its first product was released in 1973, the company has made major changes to its software to accommodate industry trends from mainframes through to cloud computing.
In 2010, with a focus on cloud computing, SAP developed the new SAP HANA platform, which is built on a proprietary database engine and forms the foundation of all its latest offerings. In 2015, SAP launched S/4HANA, which is a Business Suite implementation on this HANA platform. Microsoft and SAP have partnered to provide a range of SAP solutions running in the Azure cloud.
#### Tell me more about SAP on Azure?
Traditionally, SAP systems were designed to be hosted on-premises and required a significant hardware investment. Azure supports both the traditional NetWeaver-based and the HANA-based solutions.
Running your SAP applications in Azure also gives you the broadest global footprint, the largest compliance portfolio, embedded security, enterprise-grade SLAs, and industry-leading support. Azure supports the largest SAP HANA workloads of any hyperscale cloud provider.
#### What offerings are available?
Azure has a number of preconfigured (and SAP certified) virtual machine images, published by SAP and third-party Linux vendors, so that you can spin up your infrastructure in minutes rather than the weeks it would take on-premises. You can select from a wide range of virtual machine SKUs and even add your own license. Just search for **Marketplace** and click on **Compute** and filter by **SAP**.
<img :src="$withBase('/files/azure-sap-vms.png')">
The **SAP HANA express edition (Server + Applications)** image contains a full development image and will deploy to a D12v2 virtual machine. Storage and networking options are preconfigured to ensure you can quickly create a system that will support SAP HANA.
SAP HANA, express edition includes the in-memory data engine with advanced analytical data processing engines for business, text, spatial, and graph data, supporting multiple data models on a single copy of the data. The software license allows both non-production and production use cases, enabling you to quickly prototype, demo, and deploy next-generation applications with SAP HANA, express edition.
This image contains only the advanced data processing engines, without XSA stack, web based IDE and administration tools.
Below, I've started creating a VM using this template. If you are doing the same (just for testing), you'll want to leave the **Size** as-is, as dropping down to a **Basic A0** won't work. Make sure that the **Image** is **SAP HANA, express edition (Server + Applications)**.
<img :src="$withBase('/files/azure-sap-create-vm.png')">
It's important to note at this stage that these VMs are not available in all regions. If you select a region where the required machines are not available, the entry next to **Size** will be blank. You can skip straight to **Review + Create** at this point or customize options for disks, networking, etc.
<img :src="$withBase('/files/azure-sap-review.png')">
After you hit **Create** on the review screen, you'll reach a status page showing the deployment in progress and finally when it has completed.
<img :src="$withBase('/files/azure-sap-creation.png')">
Come back tomorrow and we'll take a look at this VM and start setting everything up.

blog/blog/tip171.md Normal file
@@ -0,0 +1,411 @@
---
type: post
title: "Tip 171 - SAP on Azure in Plain English Part 2 of 2"
excerpt: "Learn about SAP hosted on Azure"
tags: [SAP on Azure]
date: 2018-11-05 17:00:00
---
::: tip
:bulb: Learn more : [Use Azure to host and run SAP workload scenarios](https://docs.microsoft.com/azure/virtual-machines/workloads/sap/get-started?WT.mc_id=azure-azuredevtips-azureappsdev).
:::
### SAP on Azure in Plain English Part 2 of 2
In this series, I take a look at SAP coming from someone who hasn't used it before.
* [Part 1](tip170.html)
* [Part 2 - This Post](tip171.html)
In the previous post, we took a look at SAP on Azure and learned a little about its history and the available offerings. We set up a VM using the **SAP HANA express edition (Server + Applications)** image and left off with a deployed SAP VM. Now we'll take a look at connecting to the instance and configuring it.
#### Connecting to our VM
Once Azure has created the VM and configured the storage and virtual network you can connect to the server via RDP or SSH.
<img :src="$withBase('/files/azure-sap-connect.png')">
In my case, I'm using SSH.
<img :src="$withBase('/files/azure-sap-connect-to-vm.png')">
Below is how I configured SAP HANA; you can follow along with what I did if you want.
**Pro Tip**: The default password is `HXEHana1` if you have any trouble logging in via `su - hxeadm`.
SSH into the Linux VM with `ssh hxehost@<the ip provided from azure portal>`
```
mbcrump@sapexpressed:~> ssh hxehost@<the ip provided from azure portal>
The authenticity of host 'x (x)' can't be established.
ECDSA key fingerprint is SHA256:XxQpKdpxiwzEo27E+dkFc7HPE4a4iP00sYNqprWzDmA.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'x' (ECDSA) to the list of known hosts.
Password:
SUSE Linux Enterprise Server 12 SP3 x86_64 (64-bit)
As "root" (sudo or sudo -i) use the:
- zypper command for package management
- yast command for configuration management
If you are using extensions consider to enable the auto-update feature
of the extension agent and restarting the service. As root execute:
- sed -i s/AutoUpdate.Enabled=n/AutoUpdate.Enabled=y/ /etc/waagent.conf
- rcwaagent restart
Management and Config: https://www.suse.com/suse-in-the-cloud-basics
Documentation: https://www.suse.com/documentation/sles-12/
Forum: https://forums.suse.com/forumdisplay.php?93-SUSE-Public-Cloud
Have a lot of fun...
```
Now run the `sudo -i` command to gain root access with the password you entered earlier.
```
xehost@SAPTestMC:~> sudo -i
We trust you have received the usual lecture from the local System
Administrator. It usually boils down to these three things:
#1) Respect the privacy of others.
#2) Think before you type.
#3) With great power comes great responsibility.
[sudo] password for hxehost:
```
Run `su - hxeadm` to log into the express edition and use `HXEHana1` for the password and configure as shown below.
```
SAPTestMC:~ # su - hxeadm
Password:
############################################################################################################################################################
# Welcome to SAP HANA, express edition 2.0. #
# #
# The system must be configured before use. #
############################################################################################################################################################
Password must be at least 8 characters in length. It must contain at least
1 uppercase letter, 1 lowercase letter, and 1 number. Special characters
are allowed, except \ (backslash), ' (single quote), " (double quotes),
` (backtick), and $ (dollar sign).
New HANA database master password:
Confirm "HANA database master" password:
Do you need to use proxy server to access the internet? (Y/N): n
XSA configuration may take a while. Do you wish to wait for XSA configuration to finish?
If you enter no, XSA will be configured in background after server completes.
Wait for XSA configuration to finish (Y/N) [Y] : y
```
You'll now see the Summary.
```
############################################################################################################################################################
# Summary before execution #
############################################################################################################################################################
HANA, express edition
Host name : SAPTestMC
Domain name : x.dx.internal.cloudapp.net
Master password : ********
Log file : /var/tmp/hdb_init_config_2018-10-19_04.00.52.log
Wait for XSA configuration to finish : Yes
Proxy host : N/A
Proxy port : N/A
Hosts with no proxy : N/A
Proceed with configuration? (Y/N) : y
```
Press `y` to proceed and you'll see a log similar to the following.
```
Please wait while HANA server starts. This may take a while...
StartService
OK
OK
Starting instance using: /usr/sap/HXE/SYS/exe/hdb/sapcontrol -prot NI_HTTP -nr 90 -function StartWait 2700 2
21.10.2018 00:56:08
Start
OK
21.10.2018 00:56:18
StartWait
OK
Change SYSTEM user password on SystemDB database...
Change SYSTEM user password on HXE database...
############################################################################################################################################################
# Security keys change summary #
############################################################################################################################################################
HANA system ID : HXE
HANA instance number : 90
system password : ********
root key backup password : ********
root key backup directory : /usr/sap/HXE/home/root_key.bck
#########################################################
# Changing SSFS Master keys #
#########################################################
Re-encrypt master key of the instance SSFS...
Record Statistics
=============================================
Encrypted and readable : 8
Encrypted and not readable : 0
Plaintext : 7
Removed by compacting : 0
Add new entry to global.ini file...
Re-encrypt the system PKI SSFS with new key...
Record Statistics
=============================================
Encrypted and readable : 3
Encrypted and not readable : 0
Plaintext : 0
Removed by compacting : 0
#################################################################################
# Change root key for SystemDB database #
#################################################################################
Root key backup password set for SYSTEMDB!
Root key generated for data volume of SYSTEMDB!
Root key generated for redo log of SYSTEMDB!
Root key generated for internal application of SYSTEMDB!
Root key for SYSTEMDB is backed up to /usr/sap/HXE/home/root_key.bck/SYSTEMDB.rkb!
Root key activated for data volume of SYSTEMDB!
Root key activated for redo log of SYSTEMDB!
Root key activated for internal application of SYSTEMDB!
#####################################################################################
# Change root key for tenant database HXE #
#####################################################################################
Root key backup password set for HXE!
Root key generated for data volume of HXE!
Root key generated for redo log of HXE!
Root key generated for internal application of HXE!
Root key for HXE is backed up to /usr/sap/HXE/home/root_key.bck/HXE.rkb!
Root key activated for data volume of HXE!
Root key activated for redo log of HXE!
Root key activated for internal application of HXE!
Collecting garbage...
Collect garbage on "hdbnameserver"...
Shrink resource container memory on "hdbnameserver"...
Collect garbage on "hdbindexserver"...
Shrink resource container memory on "hdbindexserver"...
Collect garbage on "hdbcompileserver"...
Shrink resource container memory on "hdbcompileserver"...
Collect garbage on "hdbdiserver"...
Shrink resource container memory on "hdbdiserver"...
Collect garbage on "hdbwebdispatcher"...
Shrink resource container memory on "hdbwebdispatcher"...
Total in use HANA processes heap memory (MB)
============================================
Before collection : 2726
After collection : 2457
Free and used memory in the system
==================================
Before collection
-------------------------------------------------------------------------
total used free shared buffers cached
Mem: 27G 18G 8.7G 68M 331M 7.7G
-/+ buffers/cache: 10G 16G
Swap: 4.0G 0B 4.0G
After collection
-------------------------------------------------------------------------
total used free shared buffers cached
Mem: 27G 16G 11G 68M 332M 7.7G
-/+ buffers/cache: 8.2G 19G
Swap: 4.0G 0B 4.0G
Please wait while XSA starts. This may take a while...OK
Change XSA_ADMIN user password on SystemDB database...
Change XSA_DEV user password on SystemDB database...
Collecting garbage...
Collect garbage on "hdbnameserver"...
Shrink resource container memory on "hdbnameserver"...
Collect garbage on "hdbindexserver"...
Shrink resource container memory on "hdbindexserver"...
Collect garbage on "hdbcompileserver"...
Shrink resource container memory on "hdbcompileserver"...
Collect garbage on "hdbdiserver"...
Shrink resource container memory on "hdbdiserver"...
Collect garbage on "hdbwebdispatcher"...
Shrink resource container memory on "hdbwebdispatcher"...
Total in use HANA processes heap memory (MB)
============================================
Before collection : 2482
After collection : 2494
Free and used memory in the system
==================================
Before collection
-------------------------------------------------------------------------
total used free shared buffers cached
Mem: 27G 16G 11G 68M 334M 7.8G
-/+ buffers/cache: 8.2G 19G
Swap: 4.0G 0B 4.0G
After collection
-------------------------------------------------------------------------
total used free shared buffers cached
Mem: 27G 16G 11G 68M 336M 7.9G
-/+ buffers/cache: 8.1G 19G
Swap: 4.0G 0B 4.0G
Change telemetry technical user (TEL_ADMIN) password on SystemDB database...
===============================================================================
Change telemetry technical user password on "SystemDB" database
===============================================================================
Password must be at least 8 characters in length. It must contain at least
1 uppercase letter, 1 lowercase letter, and 1 number. Special characters
are allowed, except \ (backslash), ' (single quote), " (double quotes),
` (backtick), and $ (dollar sign).
Login to XSA services...
Check/Wait for Cockpit app to start...
Waiting for apps: cockpit-persistence-svc, cockpit-hdb-svc, cockpit-xsa-svc, cockpit-collection-svc, cockpit-telemetry-svc, cockpit-adminui-svc, cockpit-admin-web-app
cockpit-collection-svc: ready
Waiting for apps: cockpit-persistence-svc, cockpit-hdb-svc, cockpit-xsa-svc, cockpit-telemetry-svc, cockpit-adminui-svc, cockpit-admin-web-app.......................
cockpit-xsa-svc: ready
Waiting for apps: cockpit-persistence-svc, cockpit-hdb-svc, cockpit-telemetry-svc, cockpit-adminui-svc, cockpit-admin-web-app.....................
cockpit-adminui-svc: ready
Waiting for apps: cockpit-persistence-svc, cockpit-hdb-svc, cockpit-telemetry-svc, cockpit-admin-web-app.................
cockpit-admin-web-app: ready
OK
Create role collections...
Role collections created.
Get authentication token from UAA...
"HXE" database is registered to Cockpit.
Collecting garbage...
Collect garbage on "hdbnameserver"...
Shrink resource container memory on "hdbnameserver"...
Collect garbage on "hdbindexserver"...
Shrink resource container memory on "hdbindexserver"...
Collect garbage on "hdbcompileserver"...
Shrink resource container memory on "hdbcompileserver"...
Collect garbage on "hdbdiserver"...
Shrink resource container memory on "hdbdiserver"...
Collect garbage on "hdbwebdispatcher"...
Shrink resource container memory on "hdbwebdispatcher"...
Total in use HANA processes heap memory (MB)
============================================
Before collection : 3151
After collection : 2468
Free and used memory in the system
==================================
Before collection
-------------------------------------------------------------------------
total used free shared buffers cached
Mem: 27G 26G 1.1G 68M 507M 13G
-/+ buffers/cache: 12G 14G
Swap: 4.0G 0B 4.0G
After collection
-------------------------------------------------------------------------
total used free shared buffers cached
Mem: 27G 25G 1.6G 68M 507M 13G
-/+ buffers/cache: 12G 15G
Swap: 4.0G 0B 4.0G
*** Congratulations! SAP HANA, express edition 2.0 is configured. ***
See https://www.sap.com/developer/tutorials/hxe-ua-getting-started-vm.html to get started.
hxeadm@saprg:/usr/sap/HXE/HDB90> HDB info
USER PID PPID %CPU VSZ RSS COMMAND
hxeadm 11316 11315 0.0 16616 6300 -bash
hxeadm 25032 11316 0.0 13656 3468 \_ /bin/sh /usr/sap/HXE/HDB90/HDB info
hxeadm 25063 25032 0.0 37296 2992 \_ ps fx -U hxeadm -o user:8,pid:8,ppid:8,pcp
hxeadm 6937 1 0.0 21732 2920 sapstart pf=/usr/sap/HXE/SYS/profile/HXE_HDB90_hxe
hxeadm 6975 6937 0.0 211916 57196 \_ /usr/sap/HXE/HDB90/hxehost/trace/hdb.sapHXE_HD
hxeadm 6994 6975 18.2 3760444 3179184 \_ hdbnameserver
hxeadm 7150 6975 0.9 1385044 363288 \_ hdbcompileserver
hxeadm 7171 6975 6.4 3429060 2352128 \_ hdbindexserver -port 39003
hxeadm 7380 6975 0.6 1356504 333676 \_ hdbdiserver
hxeadm 7382 6975 0.6 1593184 549284 \_ hdbwebdispatcher
hxeadm 7384 6975 4.1 788348 362572 \_ /hana/shared/HXE/xs/bin/../sapjvm_8/bin/ja
hxeadm 9619 7384 0.4 1174860 272520 | \_ /hana/shared/HXE/xs/router/webdispatch
hxeadm 7386 6975 15.7 781300 302356 \_ /hana/shared/HXE/xs/bin/../sapjvm_8/bin/ja
hxeadm 9900 7386 0.6 930264 197944 | \_ META-INF/.sap_java_buildpack/sapjvm/bi
hxeadm 10164 7386 1.5 703292 284292 | \_ META-INF/.sap_java_buildpack/sapjvm/bi
hxeadm 10165 7386 0.5 799672 143036 | \_ META-INF/.sap_java_buildpack/sapjvm/bi
hxeadm 10184 7386 2.0 732348 322612 | \_ META-INF/.sap_java_buildpack/sapjvm/bi
hxeadm 10339 7386 0.0 1299964 53368 | \_ node index.js
hxeadm 10707 7386 0.0 1306104 63532 | \_ node application.js
hxeadm 10978 7386 0.9 1422236 440296 | \_ META-INF/.sap_java_buildpack/sapjvm/bi
hxeadm 11250 7386 0.0 1312192 56980 | \_ node node_modules/@sap/approuter/appro
hxeadm 11407 7386 0.0 1311956 55900 | \_ node node_modules/approuter/approuter.
hxeadm 11756 7386 0.0 13400 3080 | \_ /bin/sh ./startup.sh
hxeadm 12803 11756 0.0 1326508 63864 | | \_ sinopia
hxeadm 12108 7386 0.0 1280508 39376 | \_ node start.js
hxeadm 12221 7386 1.8 1238612 381144 | \_ META-INF/.sap_java_buildpack/sapjvm/bi
hxeadm 12935 7386 0.0 1280532 46124 | \_ node start.js
hxeadm 13102 7386 0.0 1308644 60024 | \_ node node_modules/@sap/approuter/appro
hxeadm 14494 7386 0.1 1334952 72836 | \_ node node_modules/@sap/approuter/appro
hxeadm 14672 7386 1.3 1485488 442512 | \_ META-INF/.sap_java_buildpack/sapjvm/bi
hxeadm 14991 7386 0.1 1325884 71616 | \_ node application.js
hxeadm 15292 7386 0.0 1319708 55052 | \_ node node_modules/@sap/approuter/appro
hxeadm 15815 7386 0.5 787236 127164 | \_ META-INF/.sap_java_buildpack/sapjvm/bi
hxeadm 16079 7386 0.0 1280020 44124 | \_ node server.js
hxeadm 16836 7386 3.9 736540 304172 | \_ META-INF/.sap_java_buildpack/sapjvm/bi
hxeadm 16974 7386 3.0 1742528 637948 | \_ META-INF/.sap_java_buildpack/sapjvm/bi
hxeadm 17607 7386 0.1 1303440 56208 | \_ node application.js
hxeadm 17773 7386 0.1 1303368 55112 | \_ node application.js
hxeadm 17868 7386 0.2 1337404 73636 | \_ node --harmony server.js
hxeadm 18023 7386 0.1 1309320 62820 | \_ node --harmony application.js
hxeadm 18131 7386 0.0 1287276 52956 | \_ node start.js
hxeadm 18353 7386 5.4 1285856 498360 | \_ META-INF/.sap_java_buildpack/sapjvm/bi
hxeadm 18487 7386 1.8 1403840 410568 | \_ META-INF/.sap_java_buildpack/sapjvm/bi
hxeadm 18969 7386 0.1 1318772 53876 | \_ node node_modules/@sap/approuter/appro
hxeadm 19194 7386 2.9 975916 283896 | \_ META-INF/.sap_java_buildpack/sapjvmjdk
hxeadm 22984 7386 6.3 1469408 523404 | \_ META-INF/.sap_java_buildpack/sapjvm/bi
hxeadm 23663 7386 0.9 3849144 119500 | \_ node main.js
hxeadm 7424 6975 3.3 1843108 723612 \_ /hana/shared/HXE/xs/bin/../sapjvm_8/bin/ja
hxeadm 2049 1 0.0 495028 33628 /usr/sap/HXE/HDB90/exe/sapstartsrv pf=/usr/sap/HXE
hxeadm 1840 1 0.0 36760 4568 /usr/lib/systemd/systemd --user
hxeadm 1843 1840 0.0 85944 1552 \_ (sd-pam)
hxeadm@saprg:/usr/sap/HXE/HDB90>
```
Once complete, open your browser and go to the **Public IP Address** on port 8090. You should now see that XSEngine is up and running!
<img :src="$withBase('/files/azure-sap-browser.png')">
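If the page doesn't come up immediately, XSEngine may still be starting. A quick way to check from a local shell is to probe the port; this is a small sketch using bash's `/dev/tcp` redirection, with the host argument standing in for your VM's public IP:

```shell
# Probe host:port until it accepts a TCP connection, or give up.
# Requires bash (uses the /dev/tcp pseudo-device).
wait_for_port() {
  host="$1"; port="$2"; attempts="${3:-30}"
  i=0
  while [ "$i" -lt "$attempts" ]; do
    # Try to open (and immediately close) a TCP connection on fd 3.
    if (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; then
      return 0
    fi
    sleep 2
    i=$((i + 1))
  done
  return 1
}

# Usage: wait_for_port <your VM's public IP> 8090 && echo "XSEngine port is open"
```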
#### Before I go - A Managed Solution is available
For a completely managed solution, [SAP Cloud Platform (CP) on Azure](https://azure.microsoft.com/blog/agile-sap-development-with-sap-cloud-platform-on-azure?WT.mc_id=azure-azuredevtips-azureappsdev) is a platform-as-a-service implementation that is completely managed by SAP but hosted in the Azure cloud. This arrangement still offers Azure integration so that you can connect with Event Hubs, Azure SQL, Cosmos DB, etc. With this approach, applications are deployed via SAP CP Cockpit, which is a marketplace of apps and components.
#### Wrap-up
Whether you are looking at how to manage a SAP NetWeaver system in Azure, or you want to migrate to a more flexible, cloud-first, SAP HANA system, you'll find a lot more detailed information, along with case studies, on the [SAP for Azure website](https://azure.microsoft.com/solutions/sap?WT.mc_id=azure-azuredevtips-azureappsdev).

blog/blog/tip172.md Normal file
@@ -0,0 +1,84 @@
---
type: post
title: "Tip 172 - Getting Started with HDInsight"
excerpt: "Learn about Azure HDInsight clusters"
tags: [Analytics]
date: 2018-11-11 17:00:00
---
::: tip
:bulb: Learn more : [Azure HDInsight Documentation](https://docs.microsoft.com/azure/hdinsight/?WT.mc_id=azure-azuredevtips-azureappsdev).
:::
### Getting Started with HDInsight
#### What is Azure HDInsight?
[Azure HDInsight](https://azure.microsoft.com/services/hdinsight?WT.mc_id=azure-azuredevtips-azureappsdev) is a managed cloud service for analyzing large sets of data. This big data is often collected rapidly and may be relatively unstructured. By itself this data might not be very useful, but when cleaned, analyzed, and presented, it can provide actionable insights. You can use Azure HDInsight to power machine learning, IoT, or data warehousing projects.
The service is available in most Azure regions and has the security and compliance standards you would expect from an Azure managed service. You can use whichever language you prefer to develop with. Python, Java, and C# are just a few examples.
#### The elephant in the room?
When you read about [Azure HDInsight](https://azure.microsoft.com/services/hdinsight?WT.mc_id=azure-azuredevtips-azureappsdev) (or see it in the Azure portal), you'll probably notice a little elephant icon. This is the logo for Apache Hadoop, an open-source distributed data analysis solution (to which Microsoft contributes).
Hadoop manages the processing of large datasets across large clusters of computers and it detects and handles failures. There are related projects in the Hadoop stack such as Hive, Spark, and Kafka that HDInsight also contains.
#### Why Hadoop on Azure?
As we already know, Azure provides dynamic machines that are billed only when active. This enables elastic computing, where you can add machines for particular workloads or projects and then remove them when not needed. HDInsight can take advantage of this scalable platform. It can also capitalize on the security and management features of Azure, including integration with Azure Active Directory and Log Analytics.
HDInsight can also integrate with familiar business intelligence tools such as Excel, PowerPivot, and SQL Server Analysis Services and Reporting Services. This is facilitated with special ODBC drivers.
#### Hadoop for Devs
Hadoop uses the [MapReduce](https://docs.microsoft.com/azure/hdinsight/hadoop/hdinsight-use-mapreduce?WT.mc_id=docs-azuredevtips-azureappsdev) programming model. This allows you to map (filter and sort) data sources and reduce (summarize, count, etc.) to produce meaningful output from a large unstructured data source. For developing with HDInsight, you can use the Visual Studio, Visual Studio Code, Eclipse, and IntelliJ development environments. You can create user-defined functions (UDFs) in a number of languages for use with the Pig Latin language, or use HiveQL, a SQL dialect, to treat the data like a relational model. A third tool, [Sqoop](https://docs.microsoft.com/azure/hdinsight/hadoop/hdinsight-use-sqoop?WT.mc_id=docs-azuredevtips-azureappsdev), allows you to export to a conventional relational database.
<img :src="$withBase('/files/hadoop-ecosystem.png')">
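You can build intuition for the map/shuffle/reduce flow without a cluster. The classic word-count example maps each line to one word per line, lets `sort` play the shuffle step that groups identical keys, and reduces each group to a count with `uniq -c`. A toy single-machine sketch:

```shell
# Word count as a miniature MapReduce pipeline:
#   map:     emit one word per line
#   shuffle: sort brings identical keys together
#   reduce:  uniq -c collapses each run of identical keys into a count
printf 'the quick fox jumps over the lazy fox\n' \
  | tr -s ' ' '\n' \
  | sort \
  | uniq -c \
  | sort -rn
```

A real Hadoop job distributes exactly these stages across the cluster's nodes, with the framework handling the shuffle between mappers and reducers.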
#### Into the action - Setting up your cluster
Adding a new HDInsight cluster is a three-step process. First, search the Azure Portal for **HDInsight cluster** and create a new cluster. We'll stay on the **Quick create** for this sample.
You must specify a unique name and login credentials and fill in the rest of the fields as you normally would.
When you get to **cluster type**, select **Hadoop** and leave the rest at their default values.
Note: It also supports the following types:
* Hadoop: Terabyte-scale processing with Hadoop components like Hive (SQL in Hadoop), Pig and Oozie.
* HBase: Fast and scalable NoSQL database. Data Lake Storage connectivity is available in preview for HDI 3.6.
* Storm: Reliably process infinite streams of data in real-time.
* Spark: Fast data analytics and cluster computing using in-memory processing.
* Interactive Query: In-memory analytics using Hive and LLAP.
* R Server: Terabyte-scale, enterprise-grade R analytics with transparent parallelization on top of Spark and Hadoop.
* Kafka: Fast, scalable, durable, and fault-tolerant publish-subscribe messaging system.
<img :src="$withBase('/files/azure-hdinsight-basics.png')">
Next, you either select an existing Storage account or create a new one.
<img :src="$withBase('/files/azure-hdinsight-storage.png')">
Finally, you are presented with a summary of the options you have chosen and an estimated running cost, per hour, for this configuration.
<img :src="$withBase('/files/azure-hdinsight-summary.png')">
Once you click **Create**, the cluster nodes will be configured and you will be billed for the cluster until you remove it.
NOTE: It may take a while to spin this up. The information dialog says up to 20 minutes.
You can now open the cluster in the Azure Portal and do things such as scale up or scale out, manage policies (such as masking a column), or review audit logs.
<img :src="$withBase('/files/azure-hdinsight-summary3.png')">
Click **Cluster dashboards** to open your Apache Ambari dashboard, which simplifies Hadoop management with a GUI. You can use Ambari to manage and monitor Hadoop clusters. Developers can integrate these capabilities into their applications by using the Ambari REST APIs.
<img :src="$withBase('/files/azure-hdinsight-summary2.png')">
#### Wrap-up
You can read about the Hadoop project in more detail [here](http://hadoop.apache.org). There is also a range of third-party applications [to explore](https://azure.microsoft.com/services/hdinsight/partner-ecosystem/?WT.mc_id=azure-azuredevtips-azureappsdev).

---
type: post
title: "Tip 173 - Get the most out of Azure Advisor"
excerpt: "Learn how to use Azure Advisor"
tags: [Management and Governance]
date: 2018-11-12 17:00:00
---
::: tip
:bulb: Learn more : [Introduction to Azure Advisor](https://docs.microsoft.com/azure/advisor/advisor-overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Get the most out of Azure Advisor
[Azure Advisor](https://azure.microsoft.com/services/advisor?WT.mc_id=azure-azuredevtips-azureappsdev) is a simple dashboard that helps you implement best practices across your Azure resources. In this blog post, I'll walk you through the types of recommendations it provides and how easy it is to implement them.
#### Recommendation categories
Advisor looks at the Azure resources in your subscriptions and comes up with recommendations that fall into these categories:
* [High Availability](https://docs.microsoft.com/azure/advisor/advisor-high-availability-recommendations?WT.mc_id=docs-azuredevtips-azureappsdev): Suggestions that are important for business-critical and production-worthy applications.
* [Security](https://docs.microsoft.com/azure/advisor/advisor-security-recommendations?WT.mc_id=docs-azuredevtips-azureappsdev): Advice to help you prevent and detect threats or security vulnerabilities.
* [Performance](https://docs.microsoft.com/azure/advisor/advisor-performance-recommendations?WT.mc_id=docs-azuredevtips-azureappsdev): Recommendations that are tailored to the configurations of your resources and that compile together items from [SQL Database Advisor](https://docs.microsoft.com/azure/sql-database/sql-database-advisor?WT.mc_id=docs-azuredevtips-azureappsdev), [Redis Cache Advisor](https://docs.microsoft.com/azure/redis-cache/cache-configure#redis-cache-advisor?WT.mc_id=docs-azuredevtips-azureappsdev), and other best practices.
* [Cost](https://docs.microsoft.com/azure/advisor/advisor-cost-recommendations?WT.mc_id=docs-azuredevtips-azureappsdev): Cost-saving recommendations based on past usage of resources such as VMs, along with sizing and other configuration changes that affect cost.
#### Use Advisor to implement recommendations
Inside the Azure portal, search for **Advisor** and open the **Advisor recommendations** dashboard. At first glance, we can see Advisor has a few recommendations for me: one for performance, two for high availability, and three for security. At the bottom of the screen, you'll also notice that you can export the recommendations to PDF or CSV files, which is handy when you need to show managers why this work matters. It is very cool that Azure is watching my back "out of the box"!
<img :src="$withBase('/files/advisor1.png')">
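If you prefer code to the portal, recommendations can also be pulled programmatically. Below is a hedged Python sketch: the summarizing helper is self-contained, while the commented-out wiring assumes the `azure-identity` and `azure-mgmt-advisor` packages and a subscription ID you would supply.

```python
from collections import Counter

def count_by_category(recommendations):
    # Summarize recommendations the way the Advisor dashboard tiles do
    return Counter(rec["category"] for rec in recommendations)

# Hypothetical wiring (requires `pip install azure-identity azure-mgmt-advisor`
# and a real subscription ID):
#
#   from azure.identity import DefaultAzureCredential
#   from azure.mgmt.advisor import AdvisorManagementClient
#   client = AdvisorManagementClient(DefaultAzureCredential(), "<subscription-id>")
#   recs = [{"category": r.category} for r in client.recommendations.list()]
#   print(count_by_category(recs))

print(count_by_category([{"category": "Security"}, {"category": "Cost"},
                         {"category": "Security"}]))
```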
If you're following along in your own Azure subscription, chances are good you have some recommendations available on the **Advisor dashboard**.
Let's take a look at the High Availability recommendations I have.
**Click** on the **High Availability** box in the dashboard.
<img :src="$withBase('/files/advisor2.png')">
You now see more information about the recommendations, like the impact (high, medium, low), description, potential benefits, impacted resources, and the last time the recommendation was updated.
**Click** on an **item** in the recommendation list to learn the complete details.
<img :src="$withBase('/files/advisor3.png')">
The recommendation I selected was **“Enable Soft Delete to protect blob data.”** In my case, I do think it's a good idea to turn that on for one of the two storage accounts it lists. So, I clicked on the **“Enable Soft Delete to protect blob data”** link to get to where I can turn that feature on. Isn't that cool? Not only did it tell me it was a good idea, but it walks me through what I need to do to follow the recommendation!
<img :src="$withBase('/files/advisor4.png')">
Once I click **Enabled**, I set the Retention policies to **30 days** and click **Save**, and that's it! Now I can use the breadcrumbs at the top to go back to the Enable Soft Delete list.
Next, I want to select **Postpone** for the other storage account.
<img :src="$withBase('/files/advisor5.png')">
Since I know I don't want soft delete enabled on this account, I can postpone this recommendation to **Never** and click the **Postpone** button so it doesn't make the recommendation next time I take a look at Advisor.
If you click on the **Security** tab and drill into a recommendation, then you'll see an actual secure score. This allows you to quickly see which security recommendations pose the greatest threat.
<img :src="$withBase('/files/advisor6.png')">
Again, we see that not only did it tell me what posed the greatest threat, but it walks me through what I need to do to follow the recommendation.
#### Summary
To sum it up, the flow for all four categories of recommendations is:
1. Select the recommendation category in the dashboard.
2. Select the individual recommendation.
3. If you want to ignore a recommendation, you can postpone it (or, for security recommendations, dismiss it).
4. If you want to implement, select the recommendation item and Advisor will walk you through the steps you need to implement the recommendation.
Implementing best practices is that easy!

---
type: post
title: "Tip 174 - Machine Learning with ML.NET and Azure Functions - Part 1 of 2"
excerpt: "ML.NET is the machine learning framework that Microsoft Research made just for .NET developers so you can do everything inside Visual Studio. And when you are ready to deploy your ML.NET algorithm, you can use serverless architecture through Azure Functions--the “don't worry about it” option when you want to get an app up and running."
tags: [Serverless, AI + Machine Learning]
date: 2018-11-18 17:00:00
---
::: tip
:bulb: Learn more : [ML.NET Overview](https://dotnet.microsoft.com/apps/machinelearning-ai/ml-dotnet?WT.mc_id=docs-azuredevtips-azureappsdev).
:bulb: Check out the [Azure ML for data scientists page](https://azure.microsoft.com/en-us/overview/ai-platform/data-scientist-resources/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Machine Learning with ML.NET and Azure Functions - Part 1 of 2
#### Intro
Machine learning can be tricky. Fortunately, Azure is coming up with ways to make it easier for developers to jump into machine learning. [ML.NET](https://azure.microsoft.com/updates/ml-net?WT.mc_id=azure-azuredevtips-azureappsdev) is the machine learning framework that Microsoft Research made just for .NET developers so you can do everything inside Visual Studio. If you haven't already been playing with it, I think you're going to love it. And when you are ready to deploy your ML.NET algorithm, you can use serverless architecture through [Azure Functions](https://azure.microsoft.com/services/functions?WT.mc_id=azure-azuredevtips-azureappsdev)— the “don't worry about it” option when you want to get an app up and running but don't necessarily want to mess around with servers and containers.
#### Serverless Machine Learning
This is [part 1](tip174.html) of a [two-part](tip175.html) post on ML.NET inspired by Luis Quintanilla's [article](http://luisquintanilla.me/2018/08/21/serverless-machine-learning-mlnet-azure-functions/) about using ML.NET with Azure Functions, where he took these two great ideas and combined them. You will use ML.NET locally to train your machine learning model. Then you will create an Azure environment with a storage account and Azure Function to host your machine learning app. The final step, building an app that uses your model, will be covered in the next post.
#### Create your model
For the ML.NET portion of this quick project, let's build the iris categorization model from the [Getting started in 10 minutes](https://www.microsoft.com/net/learn/machine-learning-and-ai/get-started-with-ml-dotnet-tutorial?WT.mc_id=microsoft-azuredevtips-azureappsdev) ML.NET tutorial. As a prerequisite, you'll want to install [Azure CLI 2.0](https://docs.microsoft.com/cli/azure/install-azure-cli?view=azure-cli-latest?WT.mc_id=docs-azuredevtips-azureappsdev), [Azure Functions Core Tools](https://docs.microsoft.com/azure/azure-functions/functions-run-local?WT.mc_id=docs-azuredevtips-azureappsdev) and a recent version of [.NET Core](https://www.microsoft.com/net/download/dotnet-core/2.2?WT.mc_id=microsoft-azuredevtips-azureappsdev).
<img :src="$withBase('/files/iris-machinelearning.png')">
Open a command prompt and create a new folder for your ML.NET project.
```
> mkdir demo
> cd demo
```
Next, create a new solution as well as a new console project and install the ML.NET package.
```
> dotnet new solution
> dotnet new console -o model
> dotnet sln add model/model.csproj
> cd model
> dotnet add package Microsoft.ML --version 0.4.0
> dotnet restore
```
Create a data directory under model.
```
> mkdir data
```
Open the [UCI Machine Learning Repository: Iris Data Set](https://archive.ics.uci.edu/ml/machine-learning-databases/iris/iris.data), copy and paste the data into VS Code, TextEdit, or Notepad, and save it as **iris-data.txt** in the **data** directory.
Now it's time to write some code. Open up your project in Visual Studio Code and create a couple of data structure classes: **IrisData.cs** and **IrisPrediction.cs**.
```csharp
using Microsoft.ML.Runtime.Api;
public class IrisData
{
[Column("0")]
public float SepalLength;
[Column("1")]
public float SepalWidth;
[Column("2")]
public float PetalLength;
[Column("3")]
public float PetalWidth;
[Column("4")]
[ColumnName("Label")]
public string Label;
}
public class IrisPrediction
{
[ColumnName("PredictedLabel")]
public string PredictedLabels;
}
```
Add a **model** class to perform the machine learning training.
```csharp
using System.Threading.Tasks;
using Microsoft.ML;
using Microsoft.ML.Data;
using Microsoft.ML.Trainers;
using Microsoft.ML.Transforms;
class Model
{
public static async Task<PredictionModel<IrisData, IrisPrediction>> Train(LearningPipeline pipeline, string dataPath, string modelPath)
{
// Load Data
pipeline.Add(new TextLoader(dataPath).CreateFrom<IrisData>(separator: ','));
// Transform Data
// Assign numeric values to text in the "Label" column, because
// only numbers can be processed during model training
pipeline.Add(new Dictionarizer("Label"));
// Vectorize Features
pipeline.Add(new ColumnConcatenator("Features", "SepalLength", "SepalWidth", "PetalLength", "PetalWidth"));
// Add Learner
pipeline.Add(new StochasticDualCoordinateAscentClassifier());
// Convert Label back to text
pipeline.Add(new PredictedLabelColumnOriginalValueConverter() { PredictedLabelColumn = "PredictedLabel" });
// Train Model
var model = pipeline.Train<IrisData, IrisPrediction>();
// Persist Model
await model.WriteAsync(modelPath);
return model;
}
}
```
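The `Dictionarizer` step above maps each distinct label string to a numeric index before training. Conceptually it behaves like this Python sketch (illustrative only, not the ML.NET implementation):

```python
def dictionarize(labels):
    # Assign each distinct label the next numeric index, in order of appearance
    index = {}
    encoded = []
    for label in labels:
        if label not in index:
            index[label] = len(index)
        encoded.append(index[label])
    return encoded, index

encoded, index = dictionarize(["Iris-setosa", "Iris-virginica", "Iris-setosa"])
print(encoded, index)  # [0, 1, 0] {'Iris-setosa': 0, 'Iris-virginica': 1}
```

The `PredictedLabelColumnOriginalValueConverter` at the end of the pipeline simply applies the inverse of this mapping so predictions come back as text.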
Place your logic inside the Program.cs file to run through the process:
```csharp
using System;
using Microsoft.ML;

class Program
{
static void Main(string[] args)
{
string dataPath = "/Users/mbcrump/Documents/demo/model/data/iris-data.txt";
string modelPath = "/Users/mbcrump/Documents/demo/model/model.zip";
var model = Model.Train(new LearningPipeline(), dataPath, modelPath).Result;
// Test data for prediction
var prediction = model.Predict(new IrisData()
{
SepalLength = 3.3f,
SepalWidth = 1.6f,
PetalLength = 0.2f,
PetalWidth = 5.1f
});
Console.WriteLine($"Predicted flower type is: {prediction.PredictedLabels}");
}
}
```
Run the model project to create a new **model.zip** file in your root directory. Below are the results I got.
```
Michaels-MacBook-Pro:model mbcrump$ dotnet run
Automatically adding a MinMax normalization transform, use 'norm=Warn' or 'norm=No' to turn this behavior off.
Using 4 threads to train.
Automatically choosing a check frequency of 4.
Auto-tuning parameters: maxIterations = 9996.
Auto-tuning parameters: L2 = 2.668802E-05.
Auto-tuning parameters: L1Threshold (L1/L2) = 0.
Using best model from iteration 500.
Not training a calibrator because it is not needed.
Predicted flower type is: Iris-virginica
```
Congratulations! You've trained a machine learning model with ML.NET that categorizes irises.
#### Set up your Azure environment using Cloud Shell
We'll use Azure Cloud Shell, which runs the [Azure CLI](https://docs.microsoft.com/cli/azure/?view=azure-cli-latest?WT.mc_id=docs-azuredevtips-azureappsdev), to set up our Azure environment. The easiest way is to sign in to your Azure portal account and click on the **cloud shell icon** shown below to open a bash shell, or go to [shell.azure.com](http://shell.azure.com).
<img :src="$withBase('/files/cloudshell.PNG')">
Once logged in, create a new resource group for this project in the bash shell (replacing “mlnetdemo” and the location with values of your own).
`$ az group create --name mlnetdemo --location westus`
Add storage to this resource group.
NOTE: You'll have to change the name below to something unique
`$ az storage account create --name mlnetdemostorage1 --location westus --resource-group mlnetdemo --sku Standard_LRS`
Create your Azure Function and configure it to use the beta runtime which supports .NET Core.
NOTE: You'll have to change the name below to something unique
`$ az functionapp create --name mlnetdemoazfunction1 --storage-account mlnetdemostorage1 --consumption-plan-location westus --resource-group mlnetdemo`
`$ az functionapp config appsettings set --name mlnetdemoazfunction1 --resource-group mlnetdemo --settings FUNCTIONS_EXTENSION_VERSION=beta`
#### Deploy your machine learning model
To get your model up to the server, you will need to get the keys to your storage account. Use the following command in the bash window to get it.
`$ az storage account keys list --account-name mlnetdemostorage1 --resource-group mlnetdemo`
You'll see the following:
```
[
{
"keyName": "key1",
"permissions": "Full",
"value": "YOURKEY"
},
{
"keyName": "key2",
"permissions": "Full",
"value": "NONEYOBUSINESS"
}
]
```
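When scripting the deployment, you can pull the key straight out of that JSON instead of copying it by hand. A small Python sketch (the same thing can be done inline with the CLI's JMESPath option, `--query "[0].value" -o tsv`):

```python
import json

def account_key(keys_json, key_name="key1"):
    # Pick the named key's value out of `az storage account keys list` output
    return next(k["value"] for k in json.loads(keys_json)
                if k["keyName"] == key_name)

sample = '''[{"keyName": "key1", "permissions": "Full", "value": "YOURKEY"},
             {"keyName": "key2", "permissions": "Full", "value": "OTHERKEY"}]'''
print(account_key(sample))  # YOURKEY
```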
Use the following command to create a new blob container called `models` to hold your model, using your account key (this can also be found in the portal under **Settings | Access keys**).
`$ az storage container create --name models --account-key YOURKEY --account-name mlnetdemostorage1`
<img :src="$withBase('/files/blob_model.png')">
Since we are using Cloud Shell, it is easier to use the Azure portal for this step (you can also use the Azure CLI if you wish). Browse to your version of the **mlnetdemo** resource group and drill down to the storage account you created earlier. Drill into the blobs and you'll see the new `models` container. Upload the **model.zip** file from your hard drive here.
In part 2 (coming tomorrow), we'll look at building an app hosted by your Azure Function that will run images against your iris categorizer.

---
type: post
title: "Tip 175 - Machine Learning with ML.NET and Azure Functions - Part 2 of 2"
excerpt: "In part 1 of this post on ML.NET and Azure Functions, you created a machine learning model with ML.NET that categorizes irises. You also set up a serverless architecture environment with Azure Functions and uploaded your model to it. In this post, you're going to finish by building an app that uses your machine learning model."
tags: [Serverless, AI + Machine Learning]
date: 2018-11-19 17:00:00
---
::: tip
:bulb: Learn more : [ML.NET Overview](https://dotnet.microsoft.com/apps/machinelearning-ai/ml-dotnet?WT.mc_id=docs-azuredevtips-azureappsdev).
:bulb: Check out the [Azure ML for data scientists page](https://azure.microsoft.com/en-us/overview/ai-platform/data-scientist-resources/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Machine Learning with ML.NET and Azure Functions - Part 2 of 2
#### Intro
Machine learning can be tricky. Fortunately, Azure is coming up with ways to make it easier for developers to jump into machine learning. In part 1 of this post on ML.NET and Azure Functions, you created a machine learning model with [ML.NET](https://www.microsoft.com/net/apps/machinelearning-ai/ml-dotnet?WT.mc_id=microsoft-azuredevtips-azureappsdev) that categorizes irises. You also set up a serverless architecture environment with Azure Functions and uploaded your model to it. In this post, you're going to finish by building an app that uses your machine learning model.
#### Identify irises like a machine
This is [part 2](tip175.html) of a two-part post on ML.NET inspired by Luis Quintanilla's [article](http://luisquintanilla.me/2018/08/21/serverless-machine-learning-mlnet-azure-functions/) about using ML.NET with Azure Functions, where he took these two great ideas and combined them. Picking up [with Part 1](tip174.html), you are going to create a new Azure Function project using Visual Studio.
Note: Make sure you have the Azure Workload installed to see this template.
<img :src="$withBase('/files/azurefunction.png')">
Open up the **demo** solution from part 1 in Visual Studio and create a new project using the Azure Functions project template called **serverless_ai**.
<img :src="$withBase('/files/httptrigger.png')">
When prompted, select the **Http trigger** option and connect it to your Azure storage account for the project (**mlnetdemostorage1** for this post). Then complete the following steps:
* Use NuGet to add the **Microsoft.ML** package to your project.
* Copy the **IrisData.cs** and **IrisPrediction.cs** files from your model project to the _serverless_ai_ project. You'll need them both again.
Change the name of the Http trigger class _Function1_ to _Predict_ and copy in the following code:
```csharp
using System.IO;
using Microsoft.AspNetCore.Http;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Azure.WebJobs;
using Microsoft.Azure.WebJobs.Host;
using Newtonsoft.Json;
using Microsoft.ML;
namespace serverless_ai
{
public static class Predict
{
[FunctionName("Predict")]
public static IActionResult Run([HttpTrigger(AuthorizationLevel.Function, "get", "post", Route = null)]HttpRequest req,
[Blob("models/model.zip", FileAccess.Read, Connection = "AzureWebJobsStorage")] Stream serializedModel,
TraceWriter log)
{
if (typeof(Microsoft.ML.Runtime.Data.LoadTransform) == null ||
typeof(Microsoft.ML.Runtime.Learners.LinearClassificationTrainer) == null ||
typeof(Microsoft.ML.Runtime.Internal.CpuMath.SseUtils) == null ||
typeof(Microsoft.ML.Runtime.FastTree.FastTree) == null)
{
log.Error("Error loading ML.NET");
return new StatusCodeResult(500);
}
//Read incoming request body
string requestBody = new StreamReader(req.Body).ReadToEnd();
log.Info(requestBody);
//Bind request body to IrisData object
IrisData data = JsonConvert.DeserializeObject<IrisData>(requestBody);
//Load prediction model
var model = PredictionModel.ReadAsync<IrisData, IrisPrediction>(serializedModel).Result;
//Make prediction
IrisPrediction prediction = model.Predict(data);
//Return prediction
return (IActionResult)new OkObjectResult(prediction.PredictedLabels);
}
}
}
```
These lines use your model to evaluate new iris data to make a prediction. Your app is ready for testing.
#### Test locally before deploying
To test the Azure Function app on your local machine, check your **local.settings.json** file to make sure that **AzureWebJobsStorage** has a value associated with it. This is how your local app will find your uploaded model on your Azure storage account. If this has a value (and it should if you bound the project to your account when you created it), you can just _F5_ the _serverless_ai_ project in order to run it.
Now open up [Postman](https://www.getpostman.com/apps) (or a similar REST API tool) and send a POST call to http://localhost:7071/api/Predict with the following body:
```
{
"SepalLength": 3.3,
"SepalWidth": 1.6,
"PetalLength": 0.2,
"PetalWidth": 5.1
}
```
If all is well, the categorizer will return “Iris-virginica”.
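The same smoke test can be scripted instead of using Postman. A hedged Python sketch follows: the payload helper is self-contained, while the commented-out call assumes the `requests` package and the default local function host port.

```python
import json

def iris_payload(sepal_length, sepal_width, petal_length, petal_width):
    # Shape the request body exactly as the IrisData binding expects
    return json.dumps({
        "SepalLength": sepal_length,
        "SepalWidth": sepal_width,
        "PetalLength": petal_length,
        "PetalWidth": petal_width,
    })

body = iris_payload(3.3, 1.6, 0.2, 5.1)

# Hypothetical call against the locally running function host:
#   import requests
#   r = requests.post("http://localhost:7071/api/Predict", data=body)
#   print(r.text)
print(body)
```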
#### To deploy Skynet
… or whatever AI you are deploying from Visual Studio, go to your build settings in the toolbar.
<img :src="$withBase('/files/publish.png')">
Select “Publish serverless_ai” to deploy your Azure Function app.
<img :src="$withBase('/files/test_in_portal.png')">
To test the app deployment in the Azure Portal, select your Azure Function app under **mlnetdemo** (or whatever you named it) and then pick the **Predict** function under that. Use the **Test** panel to the right of the screen to see your deployed app in action.
#### Wrap-up
This will place your iris categorizer out on Azure for other people to try. Congratulations! You are now able to deploy artificial intelligences to the cloud.

---
type: post
title: "Tip 176 - Azure Lab Services Demystified"
excerpt: "Learn how to get started with Azure Lab Services"
tags: [Developer Tools]
date: 2018-11-25 17:00:00
---
::: tip
:bulb: Learn more : [Azure Lab Services and Azure DevTest Labs](https://docs.microsoft.com/azure/lab-services/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Azure Lab Services Demystified
#### I was working in the lab late one night
Historically, if you wanted to set up a lab for educational or research use, you'd have to fill a room with identical PCs and load them all with a custom disk image containing the OS and software required so that each user has exactly the same experience. Azure already supports hosting virtual machines with custom images, but this doesn't handle access management, user quotas, etc. With Azure Lab Services, you can build and manage your computer lab in the cloud.
#### Lab equipment
Azure Lab Services does everything you'd expect from a traditional computer lab and then some. Because you're running on virtual machines, there's no painful copy-and-install process. Instead, you can scale your lab up easily from a common template.
Use the Azure portal to create a new Lab Services account. This must have a unique name and can be in a new or existing resource group.
<img :src="$withBase('/files/azure-labs-newaccount.png')">
Unlike most other Azure services, this one has its own portal. You manage the labs through https://labs.azure.com. This is where you set up machines, users, and all other lab settings.
<img :src="$withBase('/files/lab-services-dashboard.png')">
Virtual machine images are created from a range of Windows or Linux OS options, and you can then customize the template image with the required software and settings. You start the template virtual machine, connect remotely, and configure it as required. Once you're done, you can click Publish, and then you're ready to deploy it to multiple virtual machines for your lab. Take care at this step, because you can't un-publish a template; you'll have to start over with a new lab if you need to change the template once it's published.
You can specify how many concurrent users can be active in the lab, from 1 to a maximum of 100. In addition, you can set a schedule for when the machines are available with auto-shutdown of virtual machines at the end of the session. This is important because if you leave machines running, you'll be charged for their usage.
Users register with the lab using a unique link that you can distribute to them however you like. This will allow them to connect to an available virtual machine. The admin or educator can monitor which users have registered and see which virtual machines are active and how long they have been active.
<img :src="$withBase('/files/azure-labs-vm-sizes.png')">
There are three tiers of virtual machine instance you can select from depending on the workload required. The service is billed for the number of minutes each machine is active. You're not charged for virtual machines that are shut down.
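Since billing is per active minute, a quick back-of-the-envelope estimate helps when sizing a class. A sketch with a hypothetical hourly rate (check the Lab Services pricing page for real numbers):

```python
def estimated_lab_cost(active_minutes_per_vm, vm_count, hourly_rate_usd):
    # Billed only while machines are running; shut-down VMs cost nothing
    return round(active_minutes_per_vm / 60 * vm_count * hourly_rate_usd, 2)

# e.g. one 90-minute session for a 30-seat lab at a hypothetical $0.10/hour
print(estimated_lab_cost(90, 30, 0.10))  # 4.5
```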
#### Securing your lab
Lab Services only allows users with the Lab Creator role to create and edit labs. This role is managed through the Azure portal. This means that regular lab users only have access to a virtual machine and can't make any changes to the lab setup. The virtual machine itself is protected by a default username and password, which the lab creator sets on creation, and these credentials have to be shared with valid users in order to access the lab content.
#### Why should I care?
Beyond just creating a virtual machine image, there is a great deal of admin involved in successfully running a lab. Azure Lab Services manages a lot of this work for you. You can use the service for facilitating classroom labs or providing a controlled environment for customers to trial your software. You can even go beyond the features available here by creating a customized environment with Azure DevTest Labs to deploy to another user's Azure subscription, respecting their own organization's restrictions and infrastructure. To deploy your first lab, visit https://labs.azure.com.

---
type: post
title: "Tip 177 - Getting Started with Azure Information Protection"
excerpt: "Learn how to get started with Azure Information Protection"
tags: [Security]
date: 2018-11-26 17:00:00
---
::: tip
:bulb: Learn more : [Azure Information Protection Documentation](https://docs.microsoft.com/azure/information-protection/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Getting Started with Azure Information Protection
[Azure Information Protection (AIP)](https://azure.microsoft.com/services/information-protection?WT.mc_id=azure-azuredevtips-azureappsdev) is a Microsoft Azure offering that works in conjunction with Microsoft Online services (Exchange Online, SharePoint Online, etc.) and Office 365 to categorize and protect documents and emails using labels and policies defined by an administrator. As a cloud-based solution, AIP is an [evolution](https://docs.microsoft.com/azure/information-protection/aka?WT.mc_id=docs-azuredevtips-azureappsdev) of a variety of technologies all focused on rights management, and it uses [Azure Rights Management](https://docs.microsoft.com/azure/information-protection/what-is-azure-rms?WT.mc_id=docs-azuredevtips-azureappsdev) as its protection technology.
Getting started requires an Azure subscription that includes AIP, and your organization would typically get this via Enterprise Mobility + Security, Microsoft 365 Enterprise, or volume licensing, or perhaps through a [Cloud Solution Provider](https://partner.microsoft.com/en-qa/cloud-solution-provider). To familiarize yourself with capabilities, though, you can sign up for a [free trial of Enterprise Mobility + Security E5](https://portal.office.com/signup/logout?OfferId=87dd2714-d452-48a0-a809-d2f58c4f68b7) and get your own tenant with AIP.
Within the Azure portal, an administrator would define labels (and sublabels) to classify documents as well as policies to govern what can be done with those documents. AIP provides a set of default classification labels, but you can define whatever categories you like.
<img :src="$withBase('/files/aip-labels.png')">
Each label, in turn, has a number of properties that indicate how documents with that label are marked (header, footer, watermark) and whether they should be protected from access by unauthorized users. Labels are included in one or more policies that govern conditions under which those labels are applied. For instance, you might want to automatically classify documents containing a Social Security number as personally identifiable information (PII) and enforce a policy that requires editors to explain any downgrading in the classification of such documents.
<img :src="$withBase('/files/aip-label.png')">
<img :src="$withBase('/files/aip-policy.png')">
To actually label and protect files with AIP, you'll need to install the [Azure Information Protection client](https://www.microsoft.com/download/details.aspx?id=53018?WT.mc_id=microsoft-azuredevtips-azureappsdev). Users of iOS, MacOS, and Android can still access protected documents via the [Azure Information Protection viewer app](https://portal.azurerms.com/#/download).
If you've installed the AIP client, your Office ribbons should include a Protect option, and as in the case of Microsoft Word below, you'll see a menu bar reflecting the classifications of the AIP policy that applies. That policy information is automatically downloaded and updated when you sign into the AIP-enabled tenant within your Office app.
<img :src="$withBase('/files/aip-word-1.png')">
Now, if you were to create and save a new document containing text that resembles a Social Security number, you'd be prompted to adjust the classification of the document accordingly. Had the condition been configured to automatically apply the label, that would have occurred without prompting. With the application of the PII label, the sensitivity level is updated and the (optional) watermark that was configured for that label appears.
<img :src="$withBase('/files/aip-word-2.png')">
Now let's say you want to share this document (or one that was created before you installed the AIP client) with an individual in another organization. Within File Explorer, you'll find a context menu option named Classify and Protect, which allows you to apply marking and protection to files (of supported types). Using custom permissions, you can single out the recipients and then attach the file to a work or school account email.
<img :src="$withBase('/files/aip-explorer.png')">
Authorized recipients will then be able to access the document with the assigned permissions, such as on an Android phone:
<img :src="$withBase('/files/aip-droid.jpg')">
Unauthorized users will be met with a message that they will need to get permission from the document owner to open the document.
Clearly, Azure Information Protection requires an organizational investment not only in licensing costs but also in terms of planning and document governance. A [quick-start tutorial](https://docs.microsoft.com/azure/information-protection/infoprotect-quick-start-tutorial?WT.mc_id=docs-azuredevtips-azureappsdev) goes into more detail in terms of implementation steps covered in this blog post. For those planning a rollout of a rights management solution, the [Azure Information Protection deployment roadmap](https://docs.microsoft.com/azure/information-protection/deployment-roadmap?WT.mc_id=docs-azuredevtips-azureappsdev) is a good guide to the steps for successful implementation.

---
type: post
title: "Tip 178 - A Lap Around Azure Media Player"
excerpt: "Learn how to use Azure Media Player"
tags: [Media Services]
date: 2019-01-13 17:00:00
---
::: tip
:bulb: Learn more : [Azure Media Services Documentation](https://docs.microsoft.com/azure/media-services?WT.mc_id=azure-azuredevtips-azureappsdev).
:::
### A Lap Around Azure Media Player
Check out [Part 2 - Using Media Analytics to search for specific terms in a Video](tip179.html)
More and more, video has become an integral part of immersive, modern applications, and with the Azure Media Player your applications can easily surface audio and video content—hosted in Azure Media Services—in the format best for the current viewing device.
A quick way to get started is to take a look at the [Azure Media Player demo](https://ampdemo.azureedge.net/azuremediaplayer.html), which you can use to experiment with tracks in your own Azure Media Services account or just play back one of the two dozen hosted samples.
<img :src="$withBase('/files/amp.png')">
Just by perusing the samples you get an idea of the variety of adaptive streaming formats and DRM technologies supported. If you look under the “Chosen Player Options” in the left sidebar, you can see what format and playback technology has been selected for your current device. Here DASH stands for Dynamic Adaptive Streaming over HTTP, an international standard for adaptive bitrate streaming.
However, when playing on an iPhone using Safari, for example, notice that the Azure Media Player opts for the HTTP Live Streaming (HLS) option using native HTML5. For older platforms, Adobe Flash or Microsoft Silverlight might even be selected. Consult the [compatibility matrix](http://amp.azure.net/libs/amp/latest/docs/index.html#compatibility-matrix) to get a better understanding of how the *default* rendering options differ among various browser and platform combinations.
<img :src="$withBase('/files/amp-ios.jpg')">
Media geeks out there will also be interested in the various diagnostics that can be captured as the video is playing. If you suspect your media will be playing in environments with limited bandwidth, you could simulate a throttled network with Chrome's Developer Tools or another utility and visualize the effect on bitrates and buffering.
<img :src="$withBase('/files/amp-diag.png')">
Embedding the player into your own web application is simple and can be done via an IFrame or, optionally, using the HTML5 video tag with some JavaScript to customize behavior. The Code tab of the Azure Media Player Demo app provides both approaches. For the IFrame option, you can paste the tag into the \<body\> of a bare-bones HTML document and quickly view the result. The HTML5/JavaScript option provides a bit of HTML markup as well as ancillary script (that you would generally save separately and refer to in a \<script\> tag). Below is the JavaScript snippet provided for our sample video:
```javascript
var myOptions = {
nativeControlsForTouch: false,
controls: true,
autoplay: true,
width: "640",
height: "400",
}
myPlayer = amp("azuremediaplayer", myOptions);
myPlayer.src([
{
"src": "https://amssamples.streaming.mediaservices.windows.net/830584f8-f0c8-4e41-968b-6538b9380aa5/TearsOfSteelTeaser.ism/manifest",
"type": "application/vnd.ms-sstr+xml",
"protectionInfo": [
{
"type": "AES",
"authenticationToken": "<redacted token>"
}
]
}
]);
```
Note that the `amp` object has a number of [options](https://amp.azure.net/libs/amp/latest/docs/index.html#options) that can be used to customize the capabilities of the player, including whether controls should be displayed, options for playback speed, [live captioning](https://amp.azure.net/libs/amp/latest/samples/dynamic_webvtt.html), and hot keys for controlling the volume, screen size, and play position. Programmatically, you can tap into [events](https://amp.azure.net/libs/amp/latest/docs/index.html#amp.eventname) just as you would expect for any JavaScript object. There is even a [plug-in model](http://amp.azure.net/libs/amp/latest/docs/PLUGINS.html) through which you and other developers can enhance player functionality.
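If you assemble the options object dynamically (say, toggling `autoplay` per page), it helps to keep shared defaults in one place. Here is a minimal sketch using only the options shown above; the `buildPlayerOptions` helper is my own, not part of the amp API:

```javascript
// Shared player defaults; these mirror the sample options above.
var defaultPlayerOptions = {
  nativeControlsForTouch: false,
  controls: true,
  autoplay: true,
  width: "640",
  height: "400"
};

// Merge page-specific overrides on top of the shared defaults,
// leaving the defaults object itself untouched.
function buildPlayerOptions(overrides) {
  return Object.assign({}, defaultPlayerOptions, overrides || {});
}

// Usage (hypothetical page where autoplay is unwanted):
// var myPlayer = amp("azuremediaplayer", buildPlayerOptions({ autoplay: false }));
```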
For more details on specific capabilities, as well as code for a variety of scenarios, be sure to check out the latest [docs](http://amp.azure.net/libs/amp/latest/docs/index.html) and particularly [these samples](http://amp.azure.net/libs/amp/latest/docs/samples.html), which show how to exercise more of the player options themselves like playback speed, localization of captions, and event handling.

blog/blog/tip179.md
---
type: post
title: "Tip 179 - Using Azure Media Analytics to search for specific terms in a Video"
excerpt: "Learn how to use azure media analytics to search for specific terms in a video"
tags: [Media Services]
date: 2019-01-14 17:00:00
---
::: tip
:bulb: Learn more : [Azure Media Services Documentation](https://docs.microsoft.com/azure/media-services?WT.mc_id=azure-azuredevtips-azureappsdev).
:::
### Using Media Analytics to search for specific terms in a Video
Check out [Part 1 - A Lap Around Azure Media Player](tip178.html) if you would like a quick intro before diving in.
Once you start taking advantage of Azure Media Services to deliver audio and video content to your users, you also gain access to an expanding suite of capabilities: [Azure Media Analytics](https://azure.microsoft.com/services/media-services/media-analytics/?v=18.18?WT.mc_id=azure-azuredevtips-azureappsdev). Using artificial intelligence and machine learning technologies, Azure Media Analytics enables a variety of insights and capabilities including indexing, facial and emotion detection, optical character recognition, and time lapsing.
The quickest way to use Azure Media Analytics is through a SaaS application called [Video Indexer](https://vi.microsoft.com/). This service is free for 10 hours, after which you can connect it to your Azure Media Services account and assume a [pricing model](https://azure.microsoft.com/pricing/details/cognitive-services/video-indexer?WT.mc_id=azure-azuredevtips-azureappsdev) dependent on storage and processing speed. Video Indexer includes a number of preprocessed videos, or you can upload your own.
In the sample videos, you can search for terms—like “football”—and get a feel for how the indexing could work for your application needs.
<img :src="$withBase('/files/seahawks-1.png')">
If you select [one of the videos](https://www.videoindexer.ai/accounts/00000000-0000-0000-0000-000000000000/videos/4452cf7e59/) from the original sample list, you'll see even more insights gleaned from that video, including keywords, sentiment analysis, and facial recognition.
<img :src="$withBase('/files/seahawks-2.png')">
While the output is compelling, you might be wondering how to make use of it in the context of your own applications. First, you can [connect Video Indexer with your own Media Services account in Azure](https://docs.microsoft.com/azure/cognitive-services/video-indexer/connect-to-azure#manual-configuration?WT.mc_id=docs-azuredevtips-azureappsdev). Within the Azure portal, you will then have access to the various assets produced by Video Indexer (and can access its capabilities there). Below you can clearly see the various outputs of the process, including caption files (with the .vtt, .smi, and .ttml extensions), an audio index file (.aib), and the XML-formatted keywords file.
<img :src="$withBase('/files/indexer.png')">
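If you haven't worked with these caption formats before, the .vtt output is a plain-text WebVTT track: a header line followed by timed cues. A minimal illustrative fragment (the cue text here is invented, not taken from the actual output):

```
WEBVTT

00:00:36.250 --> 00:00:39.480
It's his birthday today.
```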
For [this particular video from Vimeo](https://vimeo.com/255872218), here's a snippet of the keywords XML showing the extracted keywords, their locations in the clip, and confidence ratings.
```xml
<rss version="2.0" xmlns:mavis="http://www.microsoft.com/dtds/mavis/">
  <channel>
    <mavis:keywords>dude crazy running,specific complex handshaking,parallel universe,girls,birthday,kid,month,julian,familiarity,sense,handshake,grabbing,fun,friends,life,friendship,shot</mavis:keywords>
    <items>
      <mavis:keyword Content="dude crazy running" Count="1" AvgConfidence="0.90">
        <mavis:keyworddetail Confidence="0.90" Offset="24.86" />
      </mavis:keyword>
      <mavis:keyword Content="specific complex handshaking" Count="1" AvgConfidence="0.80">
        <mavis:keyworddetail Confidence="0.80" Offset="77.79" />
      </mavis:keyword>
      <mavis:keyword Content="parallel universe" Count="1" AvgConfidence="1.00">
        <mavis:keyworddetail Confidence="1.00" Offset="111.74" />
      </mavis:keyword>
      <mavis:keyword Content="girls" Count="1" AvgConfidence="0.85">
        <mavis:keyworddetail Confidence="0.85" Offset="35.46" />
      </mavis:keyword>
      <mavis:keyword Content="birthday" Count="2" AvgConfidence="0.80">
        <mavis:keyworddetail Confidence="0.61" Offset="36.25" />
        <mavis:keyworddetail Confidence="1.00" Offset="39.48" />
      </mavis:keyword>
    </items>
  </channel>
</rss>
```
In the snippet of XML above, you'll see that “birthday” appears twice, near the 36- and 39-second marks. You might use this information to present the user a list of links that take them directly to the portions of the video containing the keyword. If using Video Indexer to play back the video, the `t` query parameter can be used to specify the start time, in seconds:
- `https://www.videoindexer.ai/.../videos/79cf1a52d0/?t=36.25`
- `https://www.videoindexer.ai/.../videos/79cf1a52d0/?t=39.48`
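Turning keyword offsets into those links is straightforward in any language; here is a small JavaScript sketch (the `buildDeepLinks` helper is mine, not part of the Video Indexer API):

```javascript
// Build Video Indexer playback links that start at each keyword offset,
// by appending the ?t=<seconds> query parameter to the video's base URL.
function buildDeepLinks(baseUrl, offsets) {
  return offsets.map(function (offset) {
    return baseUrl + "?t=" + offset;
  });
}

// Offsets taken from the "birthday" keyword in the XML above.
var links = buildDeepLinks(
  "https://www.videoindexer.ai/.../videos/79cf1a52d0/",
  [36.25, 39.48]
);
```

You could render `links` as anchors next to the keyword so viewers jump straight to each occurrence.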
For even more programmatic control, an [underlying API](https://api-portal.videoindexer.ai/) drives the functionality exposed by the Azure portal and the Video Indexer. The key method, [Get Video Index](https://api-portal.videoindexer.ai/docs/services/operations/operations/Get-Video-Index?), returns a JSON document containing all the results of the indexing operations, and other APIs can be used to submit and process videos as well as list and search videos for specific content.
And lastly, from a user experience perspective, the insights widgets and player that you see within the Video Indexer app can be [embedded and customized as part of your own application](https://docs.microsoft.com/azure/cognitive-services/video-indexer/video-indexer-embed-widgets?WT.mc_id=docs-azuredevtips-azureappsdev).
Pretty cool stuff, go check it out!

blog/blog/tip18.md
---
type: post
title: "Tip 18 - Use Tags to quickly organize Azure Resources"
excerpt: "Learn how to take advantage of tags to organize your Azure resources"
tags: [Management and Governance]
date: 2017-09-19 17:00:00
---
::: tip
:bulb: Learn more : [Azure Resource Manager](https://docs.microsoft.com/azure/azure-resource-manager?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [How to use tags to quickly organize Azure Resources](https://www.youtube.com/watch?v=qFLvB5cxREg&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=14?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### Use Tags to quickly organize Azure Resources
Head over to the Azure Portal and select a service. In my example, I'm going to select a Web App that I want to tag as a production app. Select the **Tags** menu and provide a Name and Value as shown below.
<img :src="$withBase('/files/azuretag1.png')">
**Remember this!** Tags are user-defined key/value pairs which can be placed directly on a resource or a resource group.
I selected "Environment" and gave it the value of "Production". I then clicked "Save". I could also do this for other Production resources and even tag the appropriate ones with "Dev".
I can now take advantage of this by going to **More Services** and typing **Tags** and click on the Environment: Production as shown below.
<img :src="$withBase('/files/azuretag2.png')">
1. Results from searching "Tags"
2. Our Production Environment we just setup
3. List all the Web Apps with the Production Environment Tag
4. Pin the Blade to our Azure Portal Main Page
If you pin the blade (by pressing the pin in step 4), then you'll see the following on your Azure Portal dashboard:
<img :src="$withBase('/files/azuretag3.png')">
You can even interact with **Tags** using Azure CLI 2.0. For example, I can type `az tag list -o json` to list all the tags associated with an account.
``` shell
michael@Azure:~$ az tag list
[
{
"count": {
"type": "Total",
"value": 2
},
"id": "/subscriptions/c0e5fb0f-7461-4b04-9720-63fe407b1bdb/tagNames/Environment",
"tagName": "Environment",
"values": [
{
"count": {
"type": "Total",
"value": 1
},
"id": "/subscriptions/c0e5fb0f-7461-4b04-9720-63fe407b1bdb/tagNames/Environment/tagValues/Dev",
"tagValue": "Dev"
},
{
"count": {
"type": "Total",
"value": 1
},
"id": "/subscriptions/c0e5fb0f-7461-4b04-9720-63fe407b1bdb/tagNames/Environment/tagValues/Production",
"tagValue": "Production"
}
]
}
]
```
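Because the CLI emits plain JSON, you can post-process the output in any language. As an illustrative sketch (the `summarizeTags` helper is my own), here is how you might flatten the output above into a simple tag-name → value-count map in Node.js:

```javascript
// Flatten `az tag list` JSON output into { tagName: { tagValue: count } }.
function summarizeTags(tagList) {
  var summary = {};
  tagList.forEach(function (tag) {
    summary[tag.tagName] = {};
    (tag.values || []).forEach(function (v) {
      summary[tag.tagName][v.tagValue] = v.count.value;
    });
  });
  return summary;
}

// With the output shown above:
// summarizeTags(tags) → { Environment: { Dev: 1, Production: 1 } }
```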

blog/blog/tip180.md
---
type: post
title: "Tip 180 - Taking a peek at Azure Key Vault Part 1 of 2"
excerpt: "Learn how to use taking a peek at azure key vault part 1 of 2"
tags: [Security, Identity]
date: 2019-01-27 17:00:00
---
::: tip
:bulb: Learn more : [Key Vault Documentation](https://docs.microsoft.com/azure/key-vault?WT.mc_id=azure-azuredevtips-azureappsdev).
:::
### Taking a peek at Azure Key Vault Part 1 of 2
[Part 1 - this post](tip180.html)
[Part 2](tip181.html)
One of the more vexing problems for developers is securing access to other services used by their applications. Databases and other restricted resources need authentication, and your apps need to provide that, but how? Passwords within your code? (Un)encrypted configuration files? Certificate stores? Hardware? And who safeguards and manages these resources?
Addressing these concerns is the primary objective of [Azure Key Vault](https://azure.microsoft.com/services/key-vault?WT.mc_id=azure-azuredevtips-azureappsdev), a globally available service to store and manage three types of assets:
- Secrets - sensitive strings like passwords and database connection strings. You might store your application's database password as a secret, for instance.
- Encryption keys - RSA or Elliptic Curve keys that you would use for cryptographic operations such as encrypting application data for transit or storage.
- Certificates - X.509 certificates that you may provision through Azure Key Vault or via other providers like DigiCert.
In this post, you're going to see how to create and manage a secret, but keys work in much the same way. Certificates are a little more complex and in fact are themselves built on keys and secrets. Check out [Get started with Key Vault Certificates](https://docs.microsoft.com/azure/key-vault/certificate-scenarios?WT.mc_id=docs-azuredevtips-azureappsdev) for more information specifically on certificates.
#### Creating a Key Vault Account
Let's start by creating a new Key Vault service in the Azure portal. In the Create Key Vault blade (below), provide a unique name for your vault (which, as with most services, becomes an endpoint for invoking the service) and pick (or create) a resource group and a pricing tier. There are currently two tiers, Standard and Premium; the latter supports keys protected by a hardware security module (HSM). Use the standard option for this exercise.
<img :src="$withBase('/files/create-kv.png')">
Access to the Key Vault is managed via policies to which principals (like users and applications) can be assigned. By default, the user creating the vault is granted all permissions to keys, secrets, and certificates, but in practice you will grant more detailed policies to specific principals.
<img :src="$withBase('/files/create-kv-policy.png')">
Indeed, across the three entities (keys, secrets, and certificates), there are 40 permissions that can be individually granted, thus supporting the [principle of least privilege](https://docs.microsoft.com/windows-server/identity/ad-ds/plan/security-best-practices/implementing-least-privilege-administrative-models?WT.mc_id=docs-azuredevtips-azureappsdev). For instance, a web API that is accessing SQL Server might have GET permission on the secrets store, but only members of the security team would have SET permission to modify the database password. That's a simplistic example, so [here's another scenario](https://docs.microsoft.com/azure/key-vault/key-vault-secure-your-key-vault#example?WT.mc_id=docs-azuredevtips-azureappsdev) involving developers, the security team, and even auditors.
#### Adding a Secret
On the left sidebar menu of the Key Vault blade in the Azure portal, you can easily create or import a secret, key, or certificate. Since we're just interested in a secret now, we simply provide a name-value pair and options to manage the window of accessibility to that secret.
<img :src="$withBase('/files/create-secret.png')">
Once the secret is created, you'll notice that there is a bit more depth to this than a simple key-value pair. For each secret, a versioning history is automatically maintained, and through the Secret Identifier URI you can access any version of that secret. This provides a bit of an audit trail and can help implement policies to forbid reuse of actual or derived values of previous secrets.
<img :src="$withBase('/files/kv-history.png')">
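The Secret Identifier follows a predictable shape, `https://{vault-name}.vault.azure.net/secrets/{secret-name}/{version}`, which makes it easy to pull apart when logging or auditing. A small sketch (the parser is mine; treat it as illustrative rather than a complete validator):

```javascript
// Split a Key Vault secret identifier into vault, secret name, and version.
// The version segment is optional; without it the identifier refers to
// the latest version of the secret.
function parseSecretIdentifier(uri) {
  var match = uri.match(
    /^https:\/\/([^.]+)\.vault\.azure\.net\/secrets\/([^/]+)(?:\/([^/]+))?\/?$/
  );
  if (!match) {
    throw new Error("Not a Key Vault secret identifier: " + uri);
  }
  return { vault: match[1], name: match[2], version: match[3] || null };
}

// Example names ("fort-knox", "deep-thought") are placeholders.
var id = parseSecretIdentifier(
  "https://fort-knox.vault.azure.net/secrets/deep-thought/abc123"
);
// id.vault === "fort-knox", id.name === "deep-thought", id.version === "abc123"
```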
Although the Azure portal is a convenient visual approach to interact with Key Vault, for most scenarios you will want to have a repeatable and isolated process for managing Key Vault. Supporting that are the [Azure CLI](https://docs.microsoft.com/azure/key-vault/quick-create-cli?WT.mc_id=docs-azuredevtips-azureappsdev) and [PowerShell cmdlets](https://docs.microsoft.com/azure/key-vault/quick-create-powershell?WT.mc_id=docs-azuredevtips-azureappsdev) as well as [integration with Azure Resource Manager (ARM) templates](https://docs.microsoft.com/azure/azure-resource-manager/resource-manager-keyvault-parameter?WT.mc_id=docs-azuredevtips-azureappsdev).
From the perspective of consuming secrets (as well as keys and certificates) from Key Vault within your applications, SDKs and libraries are available in [.NET](https://docs.microsoft.com/dotnet/api/microsoft.azure.keyvault?view=azure-dotnet?WT.mc_id=docs-azuredevtips-azureappsdev), [Java](https://docs.microsoft.com/java/api/overview/azure/keyvault?view=azure-java-stable?WT.mc_id=docs-azuredevtips-azureappsdev), [Node.js](https://docs.microsoft.com/javascript/api/overview/azure/key-vault?view=azure-node-latest?WT.mc_id=docs-azuredevtips-azureappsdev), and [Python](https://docs.microsoft.com/python/api/overview/azure/key-vault?view=azure-python?WT.mc_id=docs-azuredevtips-azureappsdev), and, of course,
you can use the [REST API](https://docs.microsoft.com/rest/api/keyvault/) from any programming environment that supports HTTP. We'll look at a small sample using the .NET SDK in the [next installment](tip181.html).

blog/blog/tip181.md
---
type: post
title: "Tip 181 - Taking a peek at Azure Key Vault Part 2 of 2"
excerpt: "Learn how to use taking a peek at azure key vault part 2 of 2"
tags: [Security, Identity]
date: 2019-01-28 17:00:00
---
::: tip
:bulb: Learn more : [Key Vault Documentation](https://docs.microsoft.com/azure/key-vault?WT.mc_id=azure-azuredevtips-azureappsdev).
:::
### Taking a peek at Azure Key Vault Part 2 of 2
In the [previous post](tip180.html), you set up Key Vault and added a secret via the Azure portal. Now you'll see how to securely access that secret programmatically. Let's start by creating an ASP.NET Core API app in Visual Studio (or you can [grab the completed project here](https://github.com/mbcrump/azure-key-vault?WT.mc_id=github-azuredevtips-azureappsdev)):
<img :src="$withBase('/files/new-api-app.png')">
After the project is generated, add a new controller file called **SecretController.cs** with the following code.
```csharp
using Microsoft.AspNetCore.Mvc;
namespace KeyVaultApi.Controllers
{
[Route("secret")]
[ApiController]
public class SecretController : ControllerBase
{
// GET api/values
[HttpGet]
public IActionResult Get()
{
string secret = "TBD";
return Ok(new { Secret = secret });
}
}
}
```
As you might expect, the response from a browser request (or a tool like [Postman](https://www.getpostman.com/)) looks like:
<img :src="$withBase('/files/browser-1.png')">
To retrieve the secret from Key Vault, you'll want to pull in the [**Azure.Security.KeyVault.Secrets**](https://www.nuget.org/packages/Azure.Security.KeyVault.Secrets/) and [**Azure.Identity**](https://www.nuget.org/packages/Azure.Identity/) NuGet packages. The **SecretClient** class is the entry point for interacting with secrets in the Azure Key Vault, and the client code is actually pretty simple. Here the Key Vault service name is *fort-knox* and the secret's name is *deep-thought*; you should provide the appropriate values for your service.
```csharp
public async Task<IActionResult> GetAsync()
{
string secret = "TBD";
var secretClient = new SecretClient(new Uri("https://fort-knox.vault.azure.net"), new DefaultAzureCredential());
secret = (await secretClient.GetSecretAsync("deep-thought")).Value.Value;
return Ok(new { Secret = secret });
}
```
The **DefaultAzureCredential** is appropriate for most scenarios. See [here](https://github.com/Azure/azure-sdk-for-net/tree/master/sdk/identity/Azure.Identity#defaultazurecredential) for more information.
During development, if you are signed in via the [Azure CLI](https://docs.microsoft.com/cli/azure/?view=azure-cli-latest), the app will authenticate using AzureCliCredential. Of course, that identity needs requisite (Get) access to secrets in Key Vault, but if it's the same user that set up the secret, it already has more permissions than needed.
To set up the application to run within Azure, create a new API app in the Azure portal, and via the **Managed service identity** option under **Settings**, select **On** to register with Azure Active Directory. This creates a service principal that the API app can use to authenticate itself to other Azure services like Key Vault.
<img :src="$withBase('/files/msi.png')">
At this point, your Azure Key Vault does not have an access policy associated with this new identity, so any attempts to run the app in Azure would result in a Forbidden (403) error. To remedy this, create a new policy that refers to the API app you just created (or, more precisely, to the MSI associated with that API app). For the purposes of this exercise, the only permission needed is to get secrets from Key Vault.
<img :src="$withBase('/files/access-policy.png')">
Now, you can deploy the application and run it from Azure as well.
<img :src="$withBase('/files/browser-2.png')">
If you are building ASP.NET Core applications, you can add package [**Azure.Extensions.AspNetCore.Configuration.Secrets**](https://www.nuget.org/packages/Azure.Extensions.AspNetCore.Configuration.Secrets) to your project.
In the **Program.cs** of your ASP.NET Core application, add the following call to **ConfigureAppConfiguration** to first get the name of the Key Vault from standard app properties and then use the **AddAzureKeyVault** extension method to gather all the names of the Key Vault secrets into the **IConfiguration** reference. Note: because the extension enumerates the secrets in Key Vault, you will need to grant the MSI List as well as Get permissions on the Key Vault instance.
```csharp
public static IWebHostBuilder CreateWebHostBuilder(string[] args) =>
WebHost.CreateDefaultBuilder(args)
.ConfigureAppConfiguration((context, config) =>
{
var builtConfig = config.Build();
config.AddAzureKeyVault(
new Uri($"https://{builtConfig["Vault"]}.vault.azure.net/"),
new DefaultAzureCredential());
}).UseStartup<Startup>();
```
The complete body of the controller then becomes:
```csharp
[Route("secret")]
[ApiController]
public class SecretController : ControllerBase
{
private readonly IConfiguration _config;
public SecretController(IConfiguration config)
{
_config = config;
}
[HttpGet]
public IActionResult Get()
{
return Ok(new { Secret = _config.GetValue<string>("deep-thought") });
}
}
```
For more on this subject, [Use Key Vault from App Service with Managed Service Identity](https://github.com/Azure-Samples/app-service-msi-keyvault-dotnet?WT.mc_id=github-azuredevtips-azureappsdev) on GitHub has a larger sample of using MSI to authenticate to Key Vault. Additionally, you can find [examples](https://docs.microsoft.com/azure/key-vault/key-vault-use-from-web-application?WT.mc_id=docs-azuredevtips-azureappsdev) that cover manual application registration using client secrets and certificates; however, those techniques are no longer recommended whenever MSI can be used.
I hope this helps someone out there!

blog/blog/tip182.md
---
type: post
title: "Tip 182 - Use VNET peering to connect existing VNETs"
excerpt: "Learn how to use vnet peering to connect existing vnets"
tags: [Networking]
date: 2019-02-03 17:00:00
---
::: tip
:bulb: Learn more : [Azure Virtual Network](https://docs.microsoft.com/azure/virtual-network/virtual-networks-overview?WT.mc_id=azure-azuredevtips-azureappsdev).
:::
### Use VNET peering to connect existing VNETs
Recently a question came up about how to securely connect existing VNETs. This got me thinking about how I have VMs deployed in their own VNETs.
Often when I set up a VM for a demo, I'll use the default settings, which creates a VNET for each VM. In this post, I'll walk you through how to set up a [hub-spoke network topology](https://docs.microsoft.com/azure/architecture/reference-architectures/hybrid-networking/hub-spoke?toc=%2fazure%2fvirtual-network%2ftoc.json?WT.mc_id=docs-azuredevtips-azureappsdev) to connect existing VNETs.
#### My requirements
* Connect three VNETs, one of which has a Point-to-Site gateway configured
* No public IPs attached to VM network interfaces
* Ability to connect to all VMs
#### How I connect three existing VNETs
I am starting with these resources:
Subscription 1:
* hub-vnet – the VNET with the Point-to-Site VPN configured
* vnet-gw – the VNET gateway
* win2016svr-east – VM inside the VNET with no public IP ([Windows Server 2016 DataCenter](https://azuremarketplace.microsoft.com/marketplace/apps/Microsoft.WindowsServer?tab=Overview))
* spoke2-vnet – second VNET with no gateway
  * win10vm2-east – VM inside the VNET with no public IP ([Windows 10 image](https://azuremarketplace.microsoft.com/marketplace/apps/microsoftwindowsdesktop.windows-10?tab=Overview))
Subscription 2:
* spoke1-vnet – VNET in another subscription (but same Azure Active Directory)
  * win10vm-east – VM inside the VNET with no public IP ([Windows 10 image](https://azuremarketplace.microsoft.com/marketplace/apps/microsoftwindowsdesktop.windows-10?tab=Overview))
<img :src="$withBase('/files/peering1.png')">
**Part 1:**
* In the Azure portal, I go to the **hub-vnet virtual network**, select **Peerings**, and click the **Add** button.
<img :src="$withBase('/files/peering2.png')">
* I give the peering the name **hub-spoke2-peer** and select the **subscription** and **virtual network**. Then I check **Allow gateway transit** and click the **OK** button.
<img :src="$withBase('/files/peering3.png')">
Note: **Allow gateway transit** is needed to make the Point-to-Site connection. Otherwise I would need to use a jumpbox.
Once this side of the peering is ready, it will show the status as **Initiated**.
<img :src="$withBase('/files/peering4.png')">
* I repeat the above steps and add the peering for the **hub-spoke1-peer**.
That sets up the hub side of the peering. The next step is the spoke side.
**Part 2:**
* In the Azure portal, I select **Virtual Networks**, select the **spoke2-vnet**, then select **Peerings**.
<img :src="$withBase('/files/peering6.png')">
* Click the **Add** button.
<img :src="$withBase('/files/peering7.png')">
* I give the peering the reverse name **spoke2-hub-peer**, select the **subscription** and **virtual network** of hub-vnet, and check **Use remote gateways**. Then I click **OK** to create the peering.
<img :src="$withBase('/files/peering8.png')">
Note: **Use remote gateways** is the reverse setting of the one that allows the usage of the gateway in hub-vnet.
* Now I go back to my virtual networks list and repeat for **spoke1-vnet**.
<img :src="$withBase('/files/peering9.png')">
I now have the VNET peerings shown below:
<img :src="$withBase('/files/peering10.png')">
**Part 3:**
To test that I can connect to each of the VMs, I first need to download the VPN client again.
* In the Azure portal, I search for **vnet-gw** and select it from the results, and then select **Point-to-site configuration**.
<img :src="$withBase('/files/peering11.png')">
* Then I click **Download VPN client** and reinstall it once it downloads.
<img :src="$withBase('/files/peering12.png')">
Once I have it reinstalled, I **connect the VPN**.
* I open remote desktop (type **mstsc** at the command prompt or make sure you read [Quickly Connect to Windows VMs with RDP](tip9/)) and, one by one, I verify that I can connect to the private IPs for the VMs:
* Win2016svr-east: 10.0.0.36
* Win10vm-east: 10.1.0.68
* Win10vm2-east: 10.2.0.68
That's all it takes to connect all three VNETs using VNET peering!
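One requirement worth calling out: VNETs can only be peered when their address spaces do not overlap. Assuming the three VNETs above use 10.0.0.0/16, 10.1.0.0/16, and 10.2.0.0/16 (my inference from the private IPs listed, not something shown in the screenshots), a quick sanity check looks like this in JavaScript:

```javascript
// Convert a dotted IPv4 address to a 32-bit integer.
function ipToInt(ip) {
  return ip.split(".").reduce(function (acc, octet) {
    return acc * 256 + parseInt(octet, 10);
  }, 0);
}

// Return true if two CIDR ranges (e.g. "10.0.0.0/16") overlap.
function cidrsOverlap(a, b) {
  function range(cidr) {
    var parts = cidr.split("/");
    var size = Math.pow(2, 32 - parseInt(parts[1], 10));
    var start = Math.floor(ipToInt(parts[0]) / size) * size;
    return { start: start, end: start + size - 1 };
  }
  var ra = range(a), rb = range(b);
  return ra.start <= rb.end && rb.start <= ra.end;
}

// The assumed hub and spoke address spaces do not overlap,
// so all three VNETs can be peered:
// cidrsOverlap("10.0.0.0/16", "10.1.0.0/16") → false
```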
Read the full detail of how to [Implement a hub-spoke network topology in Azure](https://docs.microsoft.com/azure/architecture/reference-architectures/hybrid-networking/hub-spoke?toc=%2fazure%2fvirtual-network%2ftoc.json?WT.mc_id=docs-azuredevtips-azureappsdev) on the Azure Architecture site or watch [Virtual Network (vNet) Peering](https://channel9.msdn.com/Shows/Azure-Friday/Virtual-Network-vNet-Peering?term=vnet%20peering&lang-en=true?WT.mc_id=ch9-azuredevtips-azureappsdev) on Azure Friday.

blog/blog/tip183.md
---
type: post
title: "Tip 183 - Optimize what you spend on the cloud with Cloudyn"
excerpt: "Learn how to use optimize what you spend on the cloud with Cloudyn"
tags: [Management and Governance]
share: true
date: 2019-02-17 17:00:00
---
::: tip
:bulb: Learn more : [Cloudyn service overview](https://docs.microsoft.com/azure/cost-management/overview?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Optimize what you spend on the cloud with Cloudyn
With your personal finances, if you've ever wondered where all your money goes or exactly what you are spending it on, you may have purchased an application such as Quicken. Quicken connects with your bank and credit card accounts to show you exactly where your money is going with different charts, graphs, and detailed tables of data. Once you know where your money is going, you can start to optimize what you spend it on.
Cloudyn is like Quicken on steroids for your cloud costs on Azure.
Note: Cloudyn is related to, but separate from, native [Azure Cost Management](https://azure.microsoft.com/services/cost-management?WT.mc_id=azure-azuredevtips-azureappsdev). Cost Management has no onboarding, ~8-hour data latency, and is [integrated into the Azure portal](https://aka.ms/costmgmt?WT.mc_id=akams-azuredevtips-azureappsdev). Cost Management is recommended for individuals and organizations with Enterprise Agreement (EA), pay-as-you-go, dev/test, and free/trial subscriptions. Cloud Solution Provider (CSP) customers should start with Cloudyn.
##### How to set up Cloudyn
To use Cloudyn, first you need to register your subscription to get the billing information shared with the Cloudyn portal.
In the Azure Portal, select **Cost Management + Billing**, select [**Cost Management**](https://aka.ms/costmgmt?WT.mc_id=akams-azuredevtips-azureappsdev) (if available), then **Cloudyn** and finally click the **Go to Cloudyn** button.
<img :src="$withBase('/files/costmanagement1.png')">
You will now be on the Cloudyn portal registration page (Cloudyn is a Microsoft subsidiary). Fill out the registration steps, and then you'll need to wait a good 24 hours until the billing data has been pulled into the Cloudyn portal.
##### Cloudyn views that Ive found useful
Once you've waited a day or so for Cloudyn to connect with billing, you can start learning where you are spending your money.
###### Cost Controller
The Cost Controller view is available by selecting it in the tabs across the top of the dashboards. This dashboard shows the following:
* **Cost Over Time** – a line chart and the last day's cost.
* **Monthly Cost Trends** – shows a projected amortized cost and your actual spend for the month.
* **12 Month Planner** – shows the projected costs over the next 12 months.
* **Cost By Service** – shows the costs by service for the past 30 days in a pie chart. If you hover over the slices, you see the costs.
* **Cost By Account** – shows the costs by account for the past 30 days in a pie chart. Again, hover over the slices to see the costs.
* **Cost Trend By Day** – a bar chart of cost over the last 30 days.
* **Cost Trend By Month** – a bar chart of cost over the last 6 months (once you have 6 months of data).
<img :src="$withBase('/files/costmanagement2.png')">
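To get an intuition for the projected figures, note that a simple month-end projection is just the daily run rate extrapolated over the month. The sketch below is my own simplification, not Cloudyn's actual model:

```javascript
// Naive month-end projection: average daily spend so far × days in month.
function projectMonthlyCost(spendToDate, daysElapsed, daysInMonth) {
  if (daysElapsed <= 0) {
    throw new Error("daysElapsed must be positive");
  }
  var dailyRunRate = spendToDate / daysElapsed;
  return dailyRunRate * daysInMonth;
}

// $150 spent over the first 10 days of a 30-day month:
// projectMonthlyCost(150, 10, 30) → 450
```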
You can also create your own dashboard. Check out the documentation to find out more about other dashboards available.
###### Actual Cost Analysis
The Actual Cost Analysis view can be found by using the Costs menu at the top.
<img :src="$withBase('/files/costmanagement3.png')">
On the Cost Controller dashboard, I could see my Azure Web App cost was more than I thought it should be. With this view, I can drill down into my costs by right-clicking on the bar chart or adding filters on the left side. If you look at all **Services**, **Resource Types**, and **Sub Types**, you'll be able to see where your costs are coming from. Now I can clearly see that I must have left a demo web app or two running that I need to remove.
<img :src="$withBase('/files/costmanagement4.png')">
###### Azure Resource Explorer
The Azure Resource Explorer view can be found by using the Resources menu at the top.
<img :src="$withBase('/files/costmanagement5.png')">
On the last view, I could clearly see my Web Apps were costing me more than I thought. This view shows all your resources and their costs—just like a credit card bill.
<img :src="$withBase('/files/costmanagement6.png')">
By using these three views, I identified that my app services were costing me more than I thought they should, drilled into the detail to see that it was some web apps, and then found the exact resource names of those web apps so I could remove them. This saved me money and took just a few moments of my time!
Plus, I only scratched the surface of how Cloudyn can help you uncover other hidden costs.
Now you can know where all your cloud cost is coming from!
---
type: post
title: "Tip 184 - Quickly Set Up Azure Active Directory with Azure App Services"
excerpt: "Quickly Set Up Azure Active Directory with Azure App Services"
tags: [Identity, Web]
share: true
date: 2019-03-03 17:00:00
---
::: tip
:bulb: Learn more : [App Service Documentation](https://docs.microsoft.com/azure/app-service?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Quickly Set Up Azure Active Directory with Azure App Services
A while ago, I did a post on [Quick and Dirty User Authentication with Azure Web Apps and MVC5](https://microsoft.github.io/AzureTipsAndTricks/blog/tip112.html), where I created a simple web app that used forms authentication. Since then, I've been asked if I could address how to use the **Settings -> Authentication / Authorization** feature to turn on AAD for an existing web app. In this post, we'll take a look at setting up Azure Active Directory with Azure App Services.
#### My Requirements
* Any user on my AAD will be able to log in.
* I won't write or add any code to my web app.
* I want to do this with the FREE Tier of Azure App Service Web Apps.
#### How to Set Up Azure Active Directory with an App Service Web App
Go to the Azure portal, select your web app, and click on **Authentication / Authorization** under **Settings** to get started.
<img :src="$withBase('/files/aad1.png')">
Click the **On** button to see the Authentication Provider list and then click **Azure Active Directory** in the list of providers.
<img :src="$withBase('/files/aad2.png')">
Great. Now click on the **Express** management mode button and click **OK**.
<img :src="$withBase('/files/aad3.png')">
Now you'll need to do one last thing before saving the Authentication / Authorization settings, which is to set the **Action to take when a request is not authenticated**. You'll want to make sure that it is set to **Log in with Azure Active Directory**. This makes sure anyone visiting your site has been authenticated by AAD first. If you are following along and find that you want to use a different AAD tenant (not the Azure account you usually sign into), you can find those steps here: Manually configure Azure Active Directory with advanced settings.
<img :src="$withBase('/files/aad4.png')">
Now you can click the **Save** button to have AAD added as your Authentication Provider.
---
type: post
title: "Tip 185 - Performance Testing on Cosmos DB"
excerpt: "Learn how to implement performance testing on Cosmos DB"
tags: [Databases]
share: true
date: 2019-03-10 17:00:00
---
::: tip
:bulb: Learn more : [Azure Cosmos DB](https://docs.microsoft.com/azure/cosmos-db/introduction?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Performance Testing on Cosmos DB
Although [Cosmos DB](https://azure.microsoft.com/services/cosmos-db?WT.mc_id=azure-azuredevtips-azureappsdev) comes with global availability and guaranteed performance, it's still incumbent on the developer and architect to understand the implication of application and database design choices on performance. Central to the discussion of performance in Cosmos DB is the concept of a [request unit (RU)](https://docs.microsoft.com/azure/cosmos-db/request-units?WT.mc_id=docs-azuredevtips-azureappsdev), which is canonically defined as the processing capacity (CPU, memory, and IOPS) to perform a GET (retrieve) on a 1-KB document with 10 properties. Requests to delete, insert, or update require more capacity and so result in a higher RU cost. For instance, an insert of that same 1-KB document would incur a cost of 5 RUs.
RUs are also the currency of scale in Cosmos DB and, given that the RU cost of a single operation is deterministic, it is possible to estimate the cost of anticipated operations as well as to monitor the actual cost of completed operations. Armed with this information, you will be able to better assess the performance and scalability of your data architecture from planning to implementation to monitoring the production system.
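Because per-operation RU costs are deterministic, a back-of-the-envelope throughput estimate is just a weighted sum of your expected operation mix. A minimal sketch of that arithmetic follows; the per-operation costs are the illustrative figures from above (1 RU per read, 5 RUs per insert of the same 1-KB document), and the request rates are hypothetical. In practice you would measure real costs via **RequestCharge** and use the capacity planner:

```csharp
using System;

public static class RuEstimator
{
    // Illustrative per-operation costs for a 1-KB, 10-property document:
    // a read is ~1 RU, an insert of the same document ~5 RUs.
    public const double ReadRu = 1.0;
    public const double InsertRu = 5.0;

    // Weighted sum of the expected operation mix, in RU per second.
    public static double RequiredThroughput(double readsPerSec, double insertsPerSec) =>
        readsPerSec * ReadRu + insertsPerSec * InsertRu;

    public static void Main()
    {
        // Hypothetical workload: 1,000 reads/sec and 20 inserts/sec.
        Console.WriteLine($"{RequiredThroughput(1000, 20)} RU/s"); // 1100 RU/s
    }
}
```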
##### Capacity Planning
In the planning stages, you can make use of the [capacity planner](https://www.documentdb.com/capacityplanner) to provide a rough estimate of required RUs given your sample document profile and the expected number of operations per second.
<img :src="$withBase('/files/cosmos-planner.png')">
Here we can see that a query-heavy app storing 50,000 documents (of which **sample.json** was a representation) and expecting 20 new documents and 4 updates per second needs a database provisioned with just over 1,300 RUs.
##### Development Insight
While developing your data access strategies, take a look at [Performance and scale testing with Azure Cosmos DB](https://docs.microsoft.com/azure/cosmos-db/performance-testing?WT.mc_id=docs-azuredevtips-azureappsdev). It describes an [open-source benchmarking project](https://github.com/Azure/azure-cosmosdb-dotnet/tree/master/samples/documentdb-benchmark?WT.mc_id=github-azuredevtips-azureappsdev) that you can adapt to your own domain to get a more precise accounting of RUs and thus the expected performance of your application. The code uses the .NET SDK and specifically applies to inserts into a document database (versus tables or graphs), but the concepts in the code can be adapted to your specific data model and query profiles. A key part of the processing is accumulating the **RequestCharge** from each operation:
```csharp
ResourceResponse<Document> response = await client.CreateDocumentAsync(
UriFactory.CreateDocumentCollectionUri(DatabaseName, DataCollectionName),
newDictionary, new RequestOptions() { });
requestUnitsConsumed[taskId] += response.RequestCharge;
```
For the execution captured below, the exact cost of inserting 10,000 test documents is slightly less than 4,000 RU/s, or 4 percent of the provisioned throughput for this collection (as noted in the collection summary line at the top of the console output).
<img :src="$withBase('/files/benchmarkapp.png')">
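Turning the accumulated **RequestCharge** totals into the RU/s and percent-of-provisioned figures that the benchmark reports is straightforward arithmetic. A small sketch, where the total charge, elapsed time, and provisioned throughput are hypothetical stand-ins for the run above:

```csharp
using System;

public static class ThroughputReport
{
    // Convert the total request charge consumed during a test run into
    // RU/s, and express that as a percentage of provisioned throughput.
    public static (double RuPerSecond, double PercentOfProvisioned) Summarize(
        double totalRequestCharge, double elapsedSeconds, double provisionedRuPerSecond)
    {
        double ruPerSecond = totalRequestCharge / elapsedSeconds;
        return (ruPerSecond, ruPerSecond / provisionedRuPerSecond * 100);
    }

    public static void Main()
    {
        // Hypothetical: 10,000 inserts at ~5 RU each over 12.5 seconds,
        // against a collection provisioned at 100,000 RU/s.
        var (ru, pct) = Summarize(50_000, 12.5, 100_000);
        Console.WriteLine($"{ru} RU/s ({pct}% of provisioned throughput)");
    }
}
```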
It's important to note, too, that [partitioning](https://docs.microsoft.com/azure/cosmos-db/partition-data?WT.mc_id=docs-azuredevtips-azureappsdev), [consistency levels](https://docs.microsoft.com/azure/cosmos-db/consistency-levels?WT.mc_id=docs-azuredevtips-azureappsdev), and [indexing](https://docs.microsoft.com/azure/cosmos-db/indexing-policies?WT.mc_id=docs-azuredevtips-azureappsdev) will also have an impact on performance, so you may want to establish a baseline benchmark application and judiciously modify various configuration options and settings to determine their effect on performance.
Also consider using the [Azure Cosmos DB Emulator](https://docs.microsoft.com/azure/cosmos-db/local-emulator?WT.mc_id=docs-azuredevtips-azureappsdev) as the target of the performance testing application. It only supports document style databases and doesn't simulate different consistency levels, but it will provide insight into RU costs without incurring actual charges for running your performance tests against your Azure instance.
##### Production Monitoring
For an operational database, the Azure portal Monitoring -> Metrics blade provides in-depth statistics on throughput, storage, availability, and latency. The Storage tab is of particular interest in that it lends insight into the partitioning of the data. Be sure to drill down into a specific database and collection to see the partition-specific metrics.
In the snapshot captured here, partitions are relatively evenly distributed, which indicates a good choice for a partition key. Had one partition been exceedingly large (or 'hot'), it could well be a performance bottleneck, and the accompanying list of the predominant keys in that partition could provide some suggestions for tweaking the partitioning strategy.
<img :src="$withBase('/files/partitions.png')">
As you use the insight from these metrics to resolve potential bottlenecks, take a look at the performance tips offered by Microsoft in the following links:
- [Performance tips for .NET SDK](https://docs.microsoft.com/azure/cosmos-db/performance-tips?WT.mc_id=docs-azuredevtips-azureappsdev) (or [Java](https://docs.microsoft.com/azure/cosmos-db/performance-tips-async-java?WT.mc_id=docs-azuredevtips-azureappsdev))
- [Cost-effective reads and writes](https://docs.microsoft.com/azure/cosmos-db/key-value-store-cost?WT.mc_id=docs-azuredevtips-azureappsdev)
- [SQL data partitioning](https://docs.microsoft.com/azure/cosmos-db/sql-api-partition-data?WT.mc_id=docs-azuredevtips-azureappsdev)
---
type: post
title: "Tip 186 - Easily add real-time web functionality to applications with Azure SignalR Service"
excerpt: "Normally when we think of the web, we think of a mostly passive experience. Using SignalR, you can have a real-time, two-way conversation with someone over the web. And with Azure SignalR Service, you get a fully managed service that helps you build real-time experiences."
tags: [Web]
share: true
date: 2019-03-11 17:00:00
---
::: tip
:bulb: Learn more : [Azure SignalR Service](https://docs.microsoft.com/azure/azure-signalr/?WT.mc_id=azure-azuredevtips-azureappsdev).
:::
### Easily add real-time web functionality to applications with Azure SignalR Service
Hi, folks. Today I wanted to chat with you about real-time web functionality. Normally when we think of the web, we think of a mostly passive experience. When you bring up your mail web client and leave it for a while, your mail gets stale. You won't get your recent emails until you refresh the page, or, if you're lucky, your client has a timer that automatically refreshes the page for you. But it doesn't have to be this way. [SignalR](https://docs.microsoft.com/aspnet/core/signalr/introduction?view=aspnetcore-2.1?WT.mc_id=docs-azuredevtips-azureappsdev) is a technology that can push new emails to you as soon as they arrive. Using SignalR, you can even have a real-time, two-way conversation with someone over the web. And with [Azure SignalR Service](https://azure.microsoft.com/services/signalr-service?WT.mc_id=azure-azuredevtips-azureappsdev), you get a fully managed service that helps you build real-time experiences such as [chat](https://github.com/aspnet/SignalR-samples/tree/master/ChatSample?WT.mc_id=github-azuredevtips-azureappsdev), [stock tickers](https://github.com/aspnet/SignalR-samples/tree/master/StockTickR?WT.mc_id=github-azuredevtips-azureappsdev), live [whiteboard](https://github.com/aspnet/SignalR-samples/tree/master/WhiteBoard?WT.mc_id=github-azuredevtips-azureappsdev), and more.
##### Real-time web functionality with SignalR
SignalR is built on ASP.NET Core, and the secret sauce behind the SignalR architecture is something called [Hubs](https://docs.microsoft.com/aspnet/core/signalr/hubs?view=aspnetcore-2.1?WT.mc_id=docs-azuredevtips-azureappsdev). Hubs run on your server and route messages in and out to make sure they get to the intended web recipient in real time. When you develop a Hub in your middleware, there are two pieces of code that tie everything together.
```csharp
services.AddSignalR();
```
The first is the **AddSignalR** method, which you call in your web app's Startup.ConfigureServices method to add the SignalR services to your app.
```csharp
app.UseSignalR(route =>
{
route.MapHub<ChatHub>("/chathub");
});
```
The second call is to **UseSignalR**, which is placed in your web app's Startup.Configure method. This makes SignalR aware of your Hub.
I'm telling you about this because these two methods become really important later when you move your SignalR app to Azure SignalR Service, which is a lot more convenient than provisioning the infrastructure yourself.
By the way (in case you were curious), SignalR supports real-time two-way communication between clients by using [WebSockets](https://en.wikipedia.org/wiki/WebSocket) under the hood. But because not all browsers support WebSockets, it can also gracefully degrade to other technologies to support the same behavior. As a last resort, it just uses frequent polling of the server for changes.
##### Azure SignalR Service
Setting up an Azure SignalR Service instance is straightforward.
<img :src="$withBase('/files/create_resource.png')">
In your Azure portal, select **Create a resource** to get started. Search for and select the **“SignalR Service”** template in the Marketplace.
<img :src="$withBase('/files/signalr_template.png')">
Click the **Create** button at the bottom of the template panel and fill in the details of your resource, including the resource name, location, and pricing tier (free is just fine for development). Click the second **Create** button at the bottom of the SignalR panel to allocate the service.
<img :src="$withBase('/files/get_secret_key.png')">
Once your SignalR resource is created, go into the **Keys** setting and copy your secret key.
##### Moving your SignalR app to your SignalR Service
You should now develop your SignalR app in either VS Code or, as I do here, in Visual Studio using the ASP.NET Core Web Application project template.
<img :src="$withBase('/files/create_chat_app.png')">
There are lots of great [tutorials](https://docs.microsoft.com/aspnet/core/tutorials/signalr?view=aspnetcore-2.1&tabs=visual-studio?WT.mc_id=docs-azuredevtips-azureappsdev), and even [completed samples](https://github.com/aspnet/AzureSignalR-samples/tree/master/samples/ChatRoomLocal?WT.mc_id=github-azuredevtips-azureappsdev), that will show you how to do this, so I won't waste your time with it. However, I do want to remind you to do three things before deploying your SignalR app to your SignalR Service.
<img :src="$withBase('/files/manage_secret.png')">
1) The first thing you'll want to do is store your secret key using the [Secret Manager](https://docs.microsoft.com/aspnet/core/security/app-secrets?view=aspnetcore-2.1&tabs=windows?WT.mc_id=docs-azuredevtips-azureappsdev). Right-click your project in Solution Explorer and use the **Manage User Secrets** option.
This will store your secret outside of your actual project for added security during development. When you get ready to move to test or production, you will want to use [Azure Key Vault](https://docs.microsoft.com/aspnet/core/security/key-vault-configuration?view=aspnetcore-2.1?WT.mc_id=docs-azuredevtips-azureappsdev) instead.
2) Next, find your AddSignalR method and append AddAzureSignalR to it like this:
```csharp
services.AddSignalR().AddAzureSignalR();
```
3) Finally, find UseSignalR and replace it with:
```csharp
//app.UseSignalR(route =>
//{
// route.MapHub<ChatHub>("/chathub");
//});
app.UseAzureSignalR(route =>
{
route.MapHub<ChatHub>("/chathub");
});
```
These steps will switch your app from a standard SignalR application to one that uses Azure SignalR Services.
<img :src="$withBase('/files/publish_chat.png')">
Now, select Publish from the Build menu and deploy to your service with the click of (a few) buttons. Your infrastructure provisioning and traffic monitoring are all taken care of.
<img :src="$withBase('/files/lets_chat.png')">
---
type: post
title: "Tip 187 - Create a back end for your next native iOS application"
excerpt: "Learn how to create a back end for your next native iOS application"
tags: [Mobile]
share: true
date: 2019-03-17 17:00:00
---
::: tip
:bulb: Learn more : [Create an iOS app](https://docs.microsoft.com/azure/app-service-mobile/app-service-mobile-ios-get-started?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Create a back end for your next native iOS application
Azure defines a number of services for app developers covering data storage, notifications, authentication, etc. Wouldn't it be great if there was a simple platform-as-a-service (PaaS) offering that tied all these together to quickly build a back end for your app? Good news. That offering already exists: the [Mobile Apps](https://azure.microsoft.com/services/app-service/mobile?WT.mc_id=azure-azuredevtips-azureappsdev) feature in Azure App Service.
#### Start my app
From the Azure portal, select **Create a resource**, search for **Mobile**, and you'll find the **Mobile App** item. To create a new app, you'll need a unique name and you can select an app service plan. For development, there is a free tier that allows you to get started.
<img :src="$withBase('/files/azure-mobile-create.png')">
Once created you can view the settings for the app via **All resources** in the Azure portal. The unique URL that you specified on creation will open in a browser and show a placeholder webpage.
#### Off to a quick start
Looking at all the options available in the Azure portal might seem overwhelming, but luckily there is a quick way to get started. Click **Quickstart** to set up a simple database back end and generate a sample project.
<img :src="$withBase('/files/azure-mobile-quickstart1.png')">
For native iOS apps, you can choose either Objective-C or, as in this example, Swift to create a ready-made project configured to talk to your newly created back end.
<img :src="$withBase('/files/azure-mobile-quickstart2.png')">
You'll need to select an Azure SQL database connection or create a new one. Again, there is a free tier to get started with development.
<img :src="$withBase('/files/azure-mobile-quickstart3.png')">
For the second step, the Quickstart will generate a **TodoItem** table for the app. As we'll see later, you can manage this table through the **Easy tables** settings for the Mobile app.
The third step is to decide whether to generate a new generic app or to receive instructions to integrate with an existing app.
<img :src="$withBase('/files/azure-mobile-quickstart4.png')">
The downloaded Xcode project can be deployed and run on a device or simulator. The Quickstart code is for a generic To-do app and the code creates the **QSTodoDataModel** and related UI code.
<img :src="$withBase('/files/azure-mobile-xcode.png')">
In the **ToDoTableViewController** class, you'll see that your application URL is pre-populated in the **viewDidLoad** method. Before you can build and deploy, you'll need to pick a development team in the Code Signing properties for the project.
<img :src="$withBase('/files/azure-mobile-quickstart-ios.png')">
When you deploy and run the app you'll see a simple clear list interface with the ability to add a new item and pull-to-refresh on the list. You can view the contents of the database from the app entry in the Azure portal under **Easy tables**.
<img :src="$withBase('/files/azure-mobile-easytables.png')">
The sample app is very basic, but you can easily see how it works from the generated code. It's very easy to create tables and edit their schema to fit your app requirements.
A limitation of the Quickstart code is that it uses anonymous access. From the **Authentication / Authorization** tab you can turn on Authentication and link to ready-made configuration providers for Azure AD and a number of social and online networks.
<img :src="$withBase('/files/azure-mobile-authentication.png')">
#### Go your own way
For detailed instructions on building your own Azure Mobile App, you can follow this [Quickstart](https://docs.microsoft.com/azure/app-service-mobile/app-service-mobile-ios-get-started?WT.mc_id=docs-azuredevtips-azureappsdev). You can also find more details about [adding authentication and configuring your tables](https://docs.microsoft.com/azure/app-service-mobile/app-service-mobile-ios-get-started-users?WT.mc_id=docs-azuredevtips-azureappsdev). Support for authentication is already in the MicrosoftAzureMobile.framework, so it only requires a few additional lines of code. The Mobile App can easily be extended with push notification support using [Azure Notification Hubs](https://docs.microsoft.com/azure/notification-hubs?WT.mc_id=docs-azuredevtips-azureappsdev), but that warrants a separate blog post.
---
type: post
title: "Tip 188 - Work with Notification Hubs on your next Native iOS application"
excerpt: "Learn how to use Notification Hubs from a Native iOS app"
tags: [Mobile]
share: true
date: 2019-03-18 17:00:00
---
::: tip
:bulb: Learn more : [Create an iOS app](https://docs.microsoft.com/azure/app-service-mobile/app-service-mobile-ios-get-started?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Work with Notification Hubs on your next Native iOS application
[Azure Notification Hubs](https://docs.microsoft.com/azure/notification-hubs?WT.mc_id=docs-azuredevtips-azureappsdev) provides an integrated back end capable of pushing notifications to all the major mobile platforms through their different cloud services. iOS apps use the [Apple Push Notification Service](https://developer.apple.com/documentation/usernotifications) (APNS), and Notification Hubs can push messages through this service either directly or via an [Azure Mobile App](https://azure.microsoft.com/services/app-service/mobile?WT.mc_id=azure-azuredevtips-azureappsdev) back end. Notification Hubs allows you to push messages to millions of devices across platforms with a single API call. These can be to all users, or to particular segments of your customers using tags.
#### Get certified
Before you can start using APNS, you need to set up a certificate with your Apple developer account. Launch Keychain Access from a Mac, select **Certificate Assistant** from the Keychain Access menu and then **"Request a Certificate From a Certificate Authority"**. Fill in your email address and a name, select **Saved to disk**, and store the .CSR file somewhere convenient.
Next, you will need to set up your app ID with the Apple Developer Program. Sign in and then click **Identifiers and App IDs**. Click the **+** button to create a new entry. You'll need to provide a friendly name for the app and a unique Bundle Identifier (normally in reverse DNS notation). In the list of App Services checkboxes, make sure you check **Push Notifications**.
<img :src="$withBase('/files/apple-dev-appids.png')">
Once you've confirmed the registration, you'll see Push Notifications is highlighted in amber because it requires further configuration. From the App ID list, you can edit this new entry, and then you'll see buttons next to Push Notifications with options to create certificates for development and production use. Click **Create Certificate** against Development. Here, you'll be prompted to upload the .CSR file you created earlier. From this, a certificate will be created and you can download it and install it on your development Mac. For Azure Notification Hubs, you need to export this new certificate as a .p12 file.
#### Set up Notification Hubs
When you create a new **Notification Hub** in the **Azure portal**, you'll also create a new Notification Hubs namespace. Multiple pricing tiers are available, but you can start with the free tier and upgrade as you scale up your app.
<img :src="$withBase('/files/azure-notifications-apns.png')">
To send messages, you'll need to configure the hub with one or more services. For iOS, you use the Apple (APNS) service. Here, you select the Certificate authentication mode and upload the certificate .p12 file to provide a link to your App ID. You can toggle the setting between Production and Sandbox, and you should make sure this is set to Sandbox for development. Once that configuration is done, the Notification Hub is ready to use.
#### Hooking up the app
iOS apps can request a unique token for APNS, and this must be registered with the Notification Hub. You can download the [WindowsAzureMessaging.framework](http://go.microsoft.com/fwlink/?LinkID=799698?WT.mc_id=go-azuredevtips-azureappsdev) and add it to your app. This has the functionality to call Azure Notification Hubs; you just need to add the endpoint connection string and hub name. The hub name is the unique name you gave the Notification Hub when you created it. The connection string can be found in the **"Access Policies"** section of the Notification Hub settings. By default, there are two levels of access, and you need the **DefaultListenSharedAccessSignature** for a client app. The other, full-access connection string has additional permissions on the Notification Hub.
In order for the app to support alert notifications, the app's Info.plist will need editing to add the remote-notification entry to the **UIBackgroundModes** key. This allows the system to wake up your app in the background when a remote notification is received.
From iOS 10, all user notifications are handled through the UserNotifications framework. To respond to a request to display a notification while your app is running, you must implement the **UNUserNotificationCenterDelegate** protocol in your AppDelegate. You also need to request permission from the user to present notifications.
```swift
func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
// Override point for customization after application launch.
UNUserNotificationCenter.current().delegate = self
UNUserNotificationCenter.current().requestAuthorization(options: UNAuthorizationOptions.alert, completionHandler: { _,_ in })
application.registerForRemoteNotifications()
return true
}
```
The call to **registerForRemoteNotifications** requests a token for the app to hook up to APNS. You receive a callback when a token is assigned, and this must be sent to the Azure Notification Hub.
```swift
func application(_ application: UIApplication, didRegisterForRemoteNotificationsWithDeviceToken deviceToken: Data) {
hub = SBNotificationHub(connectionString: HubInfo.HUBLISTENACCESS, notificationHubPath: HubInfo.HUBNAME)
hub?.registerNative(withDeviceToken: deviceToken, tags: nil, completion: { _ in })
}
```
Every time a remote notification is received, **didReceiveRemoteNotification** is called, so this is overridden to display a notification.
```swift
func application(_ application: UIApplication, didReceiveRemoteNotification userInfo: [AnyHashable : Any], fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
completionHandler(UIBackgroundFetchResult.newData)
}
```
If a UserNotification is requested while the app is running, it won't by default show any alert. You can either request a default alert or implement your own UI to match your app style. For simplicity, this code requests a default alert here.
```swift
func userNotificationCenter(_ center: UNUserNotificationCenter, willPresent notification: UNNotification, withCompletionHandler completionHandler: @escaping (UNNotificationPresentationOptions) -> Void) {
completionHandler(UNNotificationPresentationOptions.alert)
}
```
##### Pushing back
Once configured, you can test that it's working from the Azure portal. The **"Test Send"** item is near the bottom of the service page. Select the Apple platform. Notice that all the services are listed here, even if you've not yet configured them. Attempting to send to a service that is not configured will fail with an error. You can directly edit the default payload to replace with whatever you require. If you want your message to wake your app in the background, you have to include the property **"content-available:1"**.
<img :src="$withBase('/files/azure-notifications-testsend.png')">
On success, you'll see a list of device registrations and status of the sent message. At this point, you can verify on your development device that the message is displayed as expected.
<img :src="$withBase('/files/azure-notifications-device-toast.png')">
#### Pushing ahead
You've seen that it's very easy to create and use Notification Hubs as long as you work through each of the configuration steps so Notification Hubs has the right certificate and your app is configured correctly. Once that is done, you have a permanent communication channel between your app and your back-end service to keep your app alive. To explore further, you can [view documentation](https://docs.microsoft.com/azure/notification-hubs/notification-hubs-ios-apple-push-notification-apns-get-started?WT.mc_id=docs-azuredevtips-azureappsdev) that will give you more details on targeting individuals and groups of users for a more personal experience.
---
type: post
title: "Tip 189 - Guided tour of Azure Machine Learning Studio"
excerpt: "If it makes the tool more approachable, you could think of Azure ML Studio as the low bar for machine learning that makes it easy for everyone to get into AI. I like to think of it more simply as a playroom where I do experiments with machine learning that no one else needs to see."
tags: [AI + Machine Learning]
share: true
date: 2019-03-24 17:00:00
---
::: tip
:bulb: Learn more : [Azure Machine Learning Overview](https://docs.microsoft.com/azure/machine-learning/service/overview-what-is-azure-ml?WT.mc_id=docs-azuredevtips-azureappsdev).
:bulb: Check out the [Azure ML for data scientists page](https://azure.microsoft.com/en-us/overview/ai-platform/data-scientist-resources/?WT.mc_id=docs-azuredevtips-azureappsdev).
:::
### Guided tour of Azure Machine Learning Studio
In a world where there are WYSIWYG editors for practically everything, have you ever wondered why there isn't a drag-and-drop web app for machine learning? Well, actually there is: [Azure Machine Learning Studio](https://studio.azureml.net). Today, I want to give you a personal tour of ML Studio and give you an idea of just how much you can do without writing a lick of code.
##### No programming required
If it makes the tool more approachable, you could think of Azure ML Studio as the low bar for machine learning that makes it easy for everyone to get into AI. I like to think of it more simply as a playroom where I do experiments with machine learning that no one else needs to see.
ML Studio has a completely [free tier](https://azure.microsoft.com/pricing/details/machine-learning-studio?WT.mc_id=azure-azuredevtips-azureappsdev) that gives you two hours of compute a month, so you aren't racking up a bill while you are trying things out. You'll want to take advantage of that.
<img :src="$withBase('/files/mlstudio_dash.png')">
The ML Studio home screen is called your workspace. This is where you'll collect goodies like datasets, predictive models, experiments, and notebooks, which you can organize under different projects.
* **Projects** are collections of experiments, datasets, notebooks, and other resources.
* **Experiments** are what you create with the drag-and-drop tool.
* **Web services** are deployed from your experiments.
* **Notebooks** are [Jupyter](https://jupyter.org) notebooks where you collect code snippets, equations, links, and figures. It's awesome.
* **Datasets** are really important in machine learning, since your predictions are only as good as the data you work with. Fortunately, ML Studio gives you access to lots of interesting datasets.
* **Trained models** are your machine learning output. You plug them into your apps.
* **Settings** hold your account configuration.
There are lots of great things in the ML Studio home screen, but I want to call out two in particular.
<img :src="$withBase('/files/experiment_dragdrop.gif')">
First, the **Experiments** tab is where you put together a machine learning project visually and with no actual code. This is one of the best ways to learn machine learning by playing around and seeing how things connect together. The tray to the left of your work area gives you access to algorithms, data, and workflows you can pull into your experiment. Let's take a closer look at some of the functionality packaged there for you.
<img :src="$withBase('/files/saved_datasets.png')">
To work with machine learning, your first step is always going to be to find training data to work with. The easiest thing to do is to expand the Saved Datasets node in the navigation window and drag one of the Sample datasets, like “Movie Ratings”, into your experiment.
<img :src="$withBase('/files/select_columns.png')">
The Data Transformation node in the navigation window gives you a lot of choices in how to select and filter your datasets. You can drag “Select Columns in Dataset” into your experiment and draw an arrow to it from Movie Ratings to add it into your workflow and pick some data columns to use.
<img :src="$withBase('/files/run_experiment.png')">
To complete a rudimentary experiment, add **Machine Learning | Train Model** as the next step in your workflow and select **Machine Learning | Initialize Model | Classification | Two-Class Neural Network** as the kind of learning algorithm you want to create. Finally, score and evaluate your model by adding those nodes and **Run** your experiment.
<img :src="$withBase('/files/visualize_evaluation.png')">
To see how good your model is, left click on the **Evaluate Model** module and select **Visualize**.
<img :src="$withBase('/files/eval_results.png')">
This will now show you how good your machine learning algorithm is (hint: not very good in this case). There are several things you can do next to improve your model, like splitting off some of your data for training and some of it for scoring, but I'll leave that to you to experiment with.
##### Write this down
The other thing I especially want to highlight is the **Notebooks** tab. It took me a long time to understand how cool Jupyter Notebook is, and I don't want you to miss out like I did.
<img :src="$withBase('/files/jupyter.png')">
As you can see above, it looks like a scientist's book of secret formulas. What makes it extra neat is that it's also an application that can run live code, create visualizations, clean data, and even make predictive models for you. Jupyter Notebook is a broadly used tool in the world of data science. Whether you plan to use ML Studio or not, you should still make a point to become familiar with it.
##### The gallery is your friend
Still intimidated? That's okay. The [Azure AI Gallery](https://gallery.azure.ai) has many prebuilt experiments written by other people that you can simply load into your workspace and modify to learn.
<img :src="$withBase('/files/gallery_ui.png')">
When you browse the AI Gallery and find an experiment you like, just click on the **Open in Studio** button to conveniently copy the whole thing directly into ML Studio. Give it a shot. Until we finally get to the point where AI is training our models for us, using drag-and-drop in Azure ML Studio may be the best thing going.

blog/blog/tip19.md
---
type: post
title: "Tip 19 - Deploy an Azure Web App using only the CLI"
excerpt: "Learn how to deploy an Azure Web App using only the CLI tools from scratch"
tags: [Web, Management and Governance]
date: 2017-09-20 17:00:00
---
::: tip
:bulb: Learn more : [Azure Command-Line Interface (CLI)](https://docs.microsoft.com/cli/azure?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [How to deploy an Azure Web App using only the CLI tool](https://www.youtube.com/watch?v=lO5Dvde07Tg&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=15?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### Deploy an Azure Web App using only the CLI
I love working with the Azure Portal and even Visual Studio, but it is sometimes nice to do everything from the command line. Although I'm a power Windows user, in this tutorial I'll be using a Linux VM and Bash to do everything.
Step 1) Ensure you have the following stack installed.
*This will give us a full web development stack we can work with in the future.*
``` shell
mbcrump@crumplinux:~$ git --version
git version 2.7.4
mbcrump@crumplinux:~$ nodejs --version
v4.2.6
mbcrump@crumplinux:~$ npm --version
3.5.2
mbcrump@crumplinux:~$ gulp --version
[20:05:28] CLI version 1.4.0
[20:05:28] Local version 3.9.1
mbcrump@crumplinux:~$ mongod --version
db version v2.6.10
2017-09-20T20:11:43.087+0000 git version: nogitversion
2017-09-20T20:11:43.095+0000 OpenSSL version: OpenSSL 1.0.2g 1 Mar 2016
```
I'm particularly interested in the [MEAN.JS](https://github.com/meanjs/mean?WT.mc_id=github-azuredevtips-azureappsdev) stack.
**What is MEAN.JS?** MEAN.JS is a full-stack JavaScript solution using MongoDB, Express, AngularJS, and Node.js.
Step 2) Create a folder such as `webapp` and then `cd webapp`.
Step 3) Run the following command `git clone https://github.com/crpietschmann/jsQuizEngine.git`. This is a JavaScript based quiz engine by [Chris Pietschmann](https://github.com/crpietschmann?WT.mc_id=github-azuredevtips-azureappsdev).
Step 4) Change your working directory to `jsQuizEngine/src`. Now we'll need to create a deployment user that can deploy the web app through Git.
`az webapp deployment user set --user-name mbcrump --password AREALLYLONGPW`
```shell
Name PublishingUserName
------ --------------------
web mbcrump
```
Step 5) We'll need a resource group. I'm going to put mine in the West US.
`az group create --name StaticResourceGroup --location "West US"`
```shell
Location Name
---------- -------------------
westus StaticResourceGroup
```
Step 6) We'll also need an Azure App Service Plan. I'll use the free one for this example.
`az appservice plan create --name StaticAppServicePlan --resource-group StaticResourceGroup --sku FREE`
```shell
AppServicePlanName GeoRegion Kind Location MaximumNumberOfWorkers Name ProvisioningState ResourceGroup Status Subscription
-------------------- ----------- ------ ---------- ------------------------ -------------------- ------------------- ------------------- -------- ------------------------------------
StaticAppServicePlan West US app West US 1 StaticAppServicePlan Succeeded StaticResourceGroup Ready d1ecc7ac-c1d8-40dc-97d6-2507597e7404
```
Step 7) We'll create an Azure Web App and deploy it using local Git.
`az webapp create --name MyQuizApplication --resource-group StaticResourceGroup --plan StaticAppServicePlan --deployment-local-git`
You should see in the output: `Local git is configured with url of 'https://mbcrump@myquizapplication.scm.azurewebsites.net/MyQuizApplication.git'`. Copy and paste this into your editor of choice.
Step 8) We'll need to add `azure` as a remote to our local Git repo.
`git remote add azure https://mbcrump@myquizapplication.scm.azurewebsites.net/MyQuizApplication.git`
Step 9) Push the changes.
`git push azure master`
Step 10) Nice! We can now browse to our [new site](http://myquizapplication.azurewebsites.net/#).
<img :src="$withBase('/files/azureappservicequiz.png')">
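For reference, the walkthrough above can be collected into a single script. This is only a recap sketch: the names (`mbcrump`, `StaticResourceGroup`, `MyQuizApplication`) come from this post, and you should substitute your own values before running it.

```shell
#!/usr/bin/env bash
# Recap of steps 4-9 above as one script. Substitute your own user name,
# password, and resource names before running.
set -euo pipefail

# Create a deployment user for local Git deployments
az webapp deployment user set --user-name mbcrump --password AREALLYLONGPW

# Create a resource group, a free App Service plan, and the web app
az group create --name StaticResourceGroup --location "West US"
az appservice plan create --name StaticAppServicePlan \
  --resource-group StaticResourceGroup --sku FREE
az webapp create --name MyQuizApplication --resource-group StaticResourceGroup \
  --plan StaticAppServicePlan --deployment-local-git

# Add the Git URL printed by the previous command as a remote, then push
git remote add azure https://mbcrump@myquizapplication.scm.azurewebsites.net/MyQuizApplication.git
git push azure master
```

Running it end to end assumes you are already logged in with `az login` and are inside the cloned `jsQuizEngine/src` directory.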

blog/blog/tip190.md
---
type: post
title: "Tip 190 - Multi-Factor Authentication on Azure in a Nutshell"
excerpt: "Multi-Factor Authentication on Azure in a Nutshell"
tags: [Identity]
share: true
date: 2019-03-25 17:00:00
---
::: tip
:bulb: Learn more : [Azure Multi-Factor Authentication](https://docs.microsoft.com/azure/active-directory/authentication/concept-mfa-howitworks/?WT.mc_id=azure-azuredevtips-azureappsdev).
:::
### Multi-Factor Authentication on Azure in a Nutshell
In another Tips and Tricks post, we added Azure Active Directory authentication to an existing App Service Web App. Today, we'll make sure Multi-Factor Authentication (MFA) is on for that user. There are various services in Azure when it comes to [Multi-Factor Authentication](https://azure.microsoft.com/services/multi-factor-authentication?WT.mc_id=azure-azuredevtips-azureappsdev), so let's first see what's available. Keep in mind, I want it to be FREE.
If you take a look at the documentation on how it works, the following MFA offerings are listed:
* **Azure Active Directory Premium** – Licenses for full-featured, on-premises, or cloud-hosted MFA services.
* **Multi-Factor Authentication for Office 365** – MFA features included with an Office 365 subscription.
* **Azure Active Directory Global Administrators** – MFA capabilities made available for free by Microsoft for protecting global administrator accounts.
Note I am using a Microsoft account that is a global administrator on my
pay-as-you-go Azure account.
#### There are several MFA offerings, but I didn't use them
So why didn't I use Azure Active Directory Premium, MFA for Office 365, or MFA for Azure Active Directory Global Administrators?
First, I didn't want to pay for Azure Active Directory Premium. Also, I didn't use MFA for Office 365 because it is for accounts connected to an Office 365 account, which I didn't have. Finally, Azure Active Directory Global Administrators MFA is a [two-step verification for Azure Active Directory users](https://docs.microsoft.com/azure/active-directory/authentication/howto-mfa-userstates?WT.mc_id=docs-azuredevtips-azureappsdev) and not for a Microsoft account. There are ways to turn on [two-step verification for Microsoft accounts](https://support.microsoft.com/help/12408/microsoft-account-about-two-step-verification?WT.mc_id=support-azuredevtips-azureappsdev) outside of Azure, which I didn't want to do.
While researching why I couldn't enable MFA for my Microsoft account user, I found a newer feature that also provides MFA, called baseline protection. The nice thing about using [Baseline Protection](https://docs.microsoft.com/azure/active-directory/conditional-access/baseline-protection?WT.mc_id=docs-azuredevtips-azureappsdev) is that it works well for both Microsoft accounts and Azure Active Directory accounts.
#### How I turned on Multi-Factor Authentication using Baseline Policy
Go to the Azure portal and navigate to **Azure Active Directory**, then click **Conditional access** under **Security**. Since I'm using my own pay-as-you-go subscription, this is the default directory.
<img :src="$withBase('/files/mfa1.png')">
Click on **Baseline policy: Require MFA for admins (Preview)** in the list of policies.
<img :src="$withBase('/files/mfa2.png')">
Select **Use policy immediately** and click the **Save** button.
<img :src="$withBase('/files/mfa3.png')">
Once you have saved, you'll see a checkmark in the **Enabled** column of the policy listing.
<img :src="$withBase('/files/mfa4.png')">
Excellent! Now all global administrators of my Azure account will have Multi-Factor Authentication turned on.
<img :src="$withBase('/files/mfa5-small.gif')">
<img :src="$withBase('/files/mfa6-small.gif')">

blog/blog/tip191.md
---
type: post
title: "Tip 191 - Serial console access with Azure VMs - Troubleshooting and diagnosing"
excerpt: "Learn how to use the Azure Virtual Machine Serial Console to troubleshoot your VM regardless of the state of your VM OS"
tags: [Virtual Machines]
share: true
date: 2019-03-31 17:00:00
---
::: tip
:bulb: Learn more: [Azure Virtual Machines documentation](https://docs.microsoft.com/azure/virtual-machines/?WT.mc_id=docs-azuredevtips-azureappsdev)
:tv: Watch the video : [How to use the Azure Virtual Machines Serial Console](https://www.youtube.com/watch?v=pQ9dQ13B2vM&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=45?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### Serial console with Azure VMs - Troubleshooting and diagnosing
Do you have Virtual Machines in Azure? If so, you probably need to resolve a problem with them from time to time. This is sometimes difficult with VMs running in Azure, as some things, like the boot menu, aren't visible to you. Luckily, there is a very handy tool that you can use to troubleshoot your VM in Azure. It's called the [Virtual Machine Serial Console](https://docs.microsoft.com/azure/virtual-machines/troubleshooting/serial-console-windows?WT.mc_id=docs-azuredevtips-azureappsdev).
Serial console lets you use a command line to operate your VM from the Azure Portal. The beauty of it is that it works independently of the state of the VM: it works while the VM is booting up, shutting down, or running. And it even works if the VM doesn't have an internet connection, as the serial console connects to the COM1 serial port. Let's explore how we can use the serial console with Azure VMs for troubleshooting and diagnostics.
##### Making it work
The serial console feature is available for [Windows](https://docs.microsoft.com/azure/virtual-machines/troubleshooting/serial-console-windows?WT.mc_id=docs-azuredevtips-azureappsdev) and [Linux](https://docs.microsoft.com/azure/virtual-machines/troubleshooting/serial-console-linux?WT.mc_id=docs-azuredevtips-azureappsdev) VM images. For most Windows images created in February 2018 or later, it will work out of the box. If it doesn't, make sure that you have all of the [prerequisites](https://docs.microsoft.com/azure/virtual-machines/troubleshooting/serial-console-windows#prerequisites?WT.mc_id=docs-azuredevtips-azureappsdev) in place, and follow [these steps](https://docs.microsoft.com/azure/virtual-machines/troubleshooting/serial-console-windows#enable-serial-console-in-custom-or-older-images?WT.mc_id=docs-azuredevtips-azureappsdev).
<img :src="$withBase('/files/Serial_Console_in_the_VM_blade_in_the_Azure_portal.png')">
(Serial console in the Azure portal)
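One of the documented prerequisites is that boot diagnostics is enabled on the VM. A minimal sketch of turning it on with the Azure CLI (the resource names and storage endpoint are placeholders, not values from this post):

```shell
# Serial console relies on boot diagnostics being enabled for the VM.
# "myResourceGroup", "myVM", and the storage endpoint are placeholders;
# substitute your own values.
az vm boot-diagnostics enable \
  --resource-group myResourceGroup \
  --name myVM \
  --storage https://mystorageaccount.blob.core.windows.net/
```

This assumes you are logged in with `az login`; after enabling boot diagnostics, the Serial console blade in the portal should become usable for that VM.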
##### Working with serial console
In my case, the serial console works out of the box. This is because I run a VM with an image of Windows 10 and Visual Studio's latest preview that was created after February 2018.
When you open the serial console, it shows you the SAC command prompt. From there, you can perform very helpful operations on your VM, like restarting it and killing processes.
<img :src="$withBase('/files/Serial_console_SAC_commands.png')">
(Serial console SAC commands)
Serial console works with so-called 'channels'. From the SAC command prompt, you can start a channel for cmd.exe by typing **CMD \<enter>**. This starts the CMD channel. And by typing **ch -? \<enter>**, you'll see all of the commands that you can use to manage the channel.
<img :src="$withBase('/files/Serial_console_creating_a_CDM_channel.png')">
(Serial console channel management commands)
Now, you can use the shortcut **\<esc>\<tab>** to switch to the CMD channel and back. Once you do, you switch to the CMD channel, which allows you to use the regular Windows command prompt to do everything that you need to. You will need to log in with your VM administrator account to access the command prompt. Because of this, one of the prerequisites is that you have an administrative account with a password for the VM.
Once logged in, you can, for instance, check the properties of the network card or manipulate the file system.
<img :src="$withBase('/files/Show_file_structure_in_CMD_channel.png')">
(Manipulate the file system in the CMD channel)
If you prefer to use PowerShell, you can even start a PowerShell prompt from the CMD channel. Just type **powershell \<enter>**. Now, you can use PowerShell to do all sorts of things, for instance, checking if RDP is enabled in your VM.
<img :src="$withBase('/files/Check_if_RDP_is_enabled_using_PowerShell.png')">
(Check if RDP is enabled using PowerShell)
##### Using serial console to get into the VM boot menu
One thing that I've always wanted to do with VMs in Azure that run some of my old images, is to access the boot menu. Sometimes, I just need the option to, for instance, start Windows in safe mode. Now, with serial console, I can. And it's pretty simple to set up.
1. Switch to a cmd channel in the serial console. You can do that by using **\<esc>\<tab>**. If you don't have a cmd channel yet, type **CMD \<enter>** from the SAC command prompt
2. Enter the following commands:
```
bcdedit /set {bootmgr} displaybootmenu yes
bcdedit /set {bootmgr} timeout 20
bcdedit /set {bootmgr} bootems yes
```
The timeout in the commands determines how long the boot manager menu is shown. That's it! Now, when you reboot the VM, you can access the boot menu from the serial console. You can test that by going to the SAC command prompt and typing **restart \<enter>**. This reboots the VM and shows you the standard Windows boot menu. When you now press **F8**, the Advanced Boot Options menu appears.
<img :src="$withBase('/files/Windows_boot_menu_through_serial_console.png')">
(Windows Advanced Boot Options menu in the serial console)
##### Conclusion
Serial console is a hidden gem that can help you a lot when you need to troubleshoot your Azure VM. Go and check it out.
* [Azure Virtual Machines documentation](https://docs.microsoft.com/azure/virtual-machines/?WT.mc_id=docs-azuredevtips-azureappsdev)
* [Virtual Machine Serial Console for Windows](https://docs.microsoft.com/azure/virtual-machines/troubleshooting/serial-console-windows?WT.mc_id=docs-azuredevtips-azureappsdev)
* [Virtual Machine Serial Console for Linux](https://docs.microsoft.com/azure/virtual-machines/troubleshooting/serial-console-linux?WT.mc_id=docs-azuredevtips-azureappsdev)

blog/blog/tip192.md
---
type: post
title: "Tip 192 - Getting Started with Azure Front Door"
excerpt: "Learn what Azure Front Door is and how to use it"
tags: [Networking]
share: true
date: 2019-04-07 17:00:00
---
::: tip
:bulb: Learn more: [Azure Front Door Service](https://azure.microsoft.com/services/frontdoor?WT.mc_id=docs-azuredevtips-azureappsdev)
:tv: Watch the video : [How to get started with Azure Front Door](https://www.youtube.com/watch?v=YV2nYfWfgAk&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=46?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### Getting Started with Azure Front Door
In this post we'll get started with the [Azure Front Door Service](https://azure.microsoft.com/services/frontdoor?WT.mc_id=docs-azuredevtips-azureappsdev). This is a new networking service that acts as a load balancer and an application firewall.
So what can you do with Azure Front Door? Amongst other things, you can use it to:
* Route users to the most performant application
* Ensure users get routed to a working application (so fail over to a working endpoint when one fails)
* Protect your application against DDoS attacks
* Route users based on URL (https://contoso.com/product goes to one website and https://contoso.com/jobs goes to another)
* Filter traffic to your application based on country
* Rewrite URLs
My first thought when Azure Front Door was released was "*How is this different from [Azure Traffic Manager](https://docs.microsoft.com/azure/traffic-manager/traffic-manager-overview?WT.mc_id=docs-azuredevtips-azureappsdev)?*". As you might know, Azure Traffic Manager is also a kind of load balancer service that provides similar capabilities. Azure Front Door is different from Azure Traffic Manager though. Here are the main reasons why:
1. Azure Front Door provides **TLS protocol termination (SSL offload)**, and Azure Traffic Manager does not
2. Azure Front Door provides **application layer processing**, and Azure Traffic Manager does not. This means that Azure Front Door can do things like URL rewriting and that it provides a Web Application Firewall (WAF) that protects you against common web attacks
#### Improving performance and availability with Azure Front Door
Let's get started. I'll show you a simple example of how to create a new Azure Front Door instance and how to use it to route to the most performant application and increase the availability of the solution.
#### 1. Create Web Apps in different regions
First, I've created two Azure App Service Web Apps and deployed a simple website to them. The Web Apps are in different geographical regions: one in West Europe and one in West US. The website shows a message that says "This app is in West US" in the US Web App and "This app is in West Europe" in the European Web App.
<img :src="$withBase('/files/WebAppsInDifferentRegions.png')">
(Two Web Apps in the Azure portal)
#### 2. Create a new Azure Front Door instance
Next, I've created a new Azure Front Door instance in the Azure portal. This is pretty straightforward. Just like with any service that you create, you click the **"Create a resource"** button (the green plus sign on the left of the portal), search for Azure Front Door, and click **Create**.
Now, you go through a designer experience that asks you to put in the frontend URL, that users will use.
The designer also asks you to create a **backend pool**. This represents the resources that handle the requests. In this case, these are the two Web Apps. But you have the choice to add any resource that can handle HTTP traffic, inside or outside of Azure.
<img :src="$withBase('/files/CreateAzureFrontDoorBackend.png')">
(Different options for adding resources to the backend pool)
The backend pool contains both Web Apps and looks like the image below:
<img :src="$withBase('/files/BackendPool.png')">
Finally, you can create routing rules that determine how traffic is routed to the backend pool. You could, for instance, have multiple backend pools and route traffic whose URL contains **/mobile** to a backend pool that contains an app specifically for mobile platforms. Once you've created one or more rules, you are done and the Azure Front Door can be deployed.
<img :src="$withBase('/files/CreateAzureFrontDoor.png')">
(Azure Front Door designer)
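The same setup can also be sketched from the command line. Note that this is my assumption of the CLI shape, not something from this walkthrough: the Front Door commands ship as a CLI extension, the parameter names may differ in your CLI version, and all resource names below are placeholders, so verify with `az network front-door --help` before relying on it.

```shell
# The Front Door commands live in a CLI extension (assumed name: front-door).
az extension add --name front-door

# Create a Front Door with one backend. "myResourceGroup" and the
# backend hostnames are placeholders for your own Web Apps.
az network front-door create \
  --resource-group myResourceGroup \
  --name azuretipsandtricks \
  --backend-address webapp-westus.azurewebsites.net

# Add the second Web App to the default backend pool.
az network front-door backend-pool backend add \
  --resource-group myResourceGroup \
  --front-door-name azuretipsandtricks \
  --pool-name DefaultBackendPool \
  --address webapp-westeurope.azurewebsites.net
```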
#### 3. Testing the route for performance
By default, Azure Front Door routes users to the most performant node in the backend. So when I navigate to the frontend URL, Azure Front Door routes me to the Web App in West US.
<img :src="$withBase('/files/WebAppWestUS.png')">
(The frontend URL routes to the West US Web App)
This makes sense, because I'm located in West US and the Web App in West US is the most performant to me, because it is the closest to me and has the lowest latency.
#### 4. Testing improved availability
Another cool thing that Azure Front Door can do is improve the availability of your application. Let's test that out.
By default, when I go to the Azure Front Door frontend URL (https://azuretipsandtricks.azurefd.net), I get routed to the West US Web App, because I am in the US.
Let's stop the West US Web App to see what happens. You can do that very easily from the Azure portal. In the dashboard, you can right-click the context menu of the Web App and perform all sorts of actions, including stopping it.
<img :src="$withBase('/files/StopWebApp.png')">
(Stop the West US Web App from the Azure portal dashboard)
Now, the backend pool only has one working node in it: the Web App in West Europe. So, when I navigate to https://azuretipsandtricks.azurefd.net/, I now get routed to the West Europe Web App.
<img :src="$withBase('/files/WebAppWestEurope.png')">
(The frontend routes to the West Europe Web App)
This didn't work instantly though. The first few times that I refreshed the browser after stopping the Web App, I was still routed to the West US Web App and got an error that said that it was stopped. I was only routed to the West Europe Web App after a while. This is because Azure Front Door only pings the endpoints in the backend pool periodically to see if they are working. You can speed this process up by adjusting the health probes for the backend pool.
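Adjusting the health probes can also be sketched with the CLI. Again, this is a hedged assumption on my part: the probe command, the default probe name, and the parameter names are guesses to illustrate the idea, so check `az network front-door probe update --help` for the real signature.

```shell
# Lower the health probe interval so a failed backend is detected sooner.
# "DefaultProbeSettings" and the other names are assumed placeholders.
az network front-door probe update \
  --resource-group myResourceGroup \
  --front-door-name azuretipsandtricks \
  --name DefaultProbeSettings \
  --interval 10
```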
#### Conclusion
As you can see, Azure Front Door is a very powerful service. You can easily configure it using the designer in the Azure portal and also by using PowerShell. And it can do a lot more on top of increasing performance and availability. Go and try it out!
* [Azure Front Door overview](https://docs.microsoft.com/azure/frontdoor/front-door-overview?WT.mc_id=docs-azuredevtips-azureappsdev)
* [Tutorial: Set up a geo-filtering policy for your Azure Front Door](https://docs.microsoft.com/azure/frontdoor/front-door-tutorial-geo-filtering?WT.mc_id=docs-azuredevtips-azureappsdev)
* [URL rewriting with Azure Front Door](https://docs.microsoft.com/azure/frontdoor/front-door-url-rewrite?WT.mc_id=docs-azuredevtips-azureappsdev)

blog/blog/tip193.md
---
type: post
title: "Tip 193 - Build and deploy your first app with the Azure SDK for Go on Azure"
excerpt: "Learn how to build and deploy your first app with the Azure SDK for Go"
tags: [Languages & Frameworks]
share: true
date: 2019-04-14 17:00:00
---
::: tip
:bulb: Learn more: [Azure resources for Go developers](https://docs.microsoft.com/go/azure?WT.mc_id=docs-azuredevtips-azureappsdev)
:::
### Build your first app with the Azure SDK for Go on Azure
[Go](https://golang.org) is a programming language created by Google. It is sometimes referred to as 'Golang' and it's completely open source. It is statically typed and compiled, and in that sense it is kind of like C# and C. Go is very popular and is used in big implementations, like in Docker and in parts of Netflix.
And now, just like with almost any programming language, you can use Go in Azure! You can do that with the Azure SDKs for Go. There are several SDKs:
* The [core Azure SDK](http://github.com/Azure/azure-sdk-for-go) for Go
* The [Blob Storage SDK](http://github.com/Azure/azure-storage-blob-go) for Go
* The [File Storage SDK](http://github.com/Azure/azure-storage-file-go) for Go
* The [Storage Queue SDK](http://github.com/Azure/azure-storage-queue-go) for Go
* The [Event Hub SDK](http://github.com/Azure/azure-event-hubs-go) for Go
* The [Service Bus SDK](http://github.com/Azure/azure-service-bus-go) for Go
* The [Application Insights SDK](http://github.com/Microsoft/ApplicationInsights-go) for Go
##### Manage Azure Blobs with Go
Let's build our first app in Go and use the Azure Blob Storage SDK. We'll build something simple that can interact with Azure Blob Storage.
To get started, we'll first do some initial setup of things.
##### Initial setup
First things first:
1. If you haven't already, [install Go 1.8 or later](https://golang.org/dl)
2. Create an Azure Storage account. We'll use this in the application. From the Azure portal, you can create a general Storage Account by leaving things to their default settings as they are in the image below:
<img :src="$withBase('/files/CreateStorageAccount.png')">
(Creating a general purpose Azure Storage Account in the Azure portal)
3. Go to the Azure Storage Account and click on the **Access keys** menu. Now copy the **Storage account name** and the **Key** (either Key1 or Key2)
<img :src="$withBase('/files/AzureStorageKey.png')">
(Azure Storage Access keys in the Azure portal)
4. In the Go application, we'll use the Storage account name and the Access key from environment variables. In Windows, you can set these variables from the command window, like in the image below. If you are running Linux, you can see the command [here](https://docs.microsoft.com/azure/storage/blobs/storage-quickstart-blobs-go?toc=%2Fgo%2Fazure%2Ftoc.json&tabs=Linux#configure-your-storage-connection-string?WT.mc_id=docs-azuredevtips-azureappsdev).
<img :src="$withBase('/files/SetEnvironmentVariables.png')">
(Setting Azure Storage credentials as environment variables)
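On Linux or macOS, the equivalent of the Windows commands above is setting the two variables in your shell before running the app. The values below are placeholders; paste your real account name and key from the portal.

```shell
# Placeholder values - replace with the account name and key copied
# from the Access keys blade in the Azure portal.
export AZURE_STORAGE_ACCOUNT="<your-account-name>"
export AZURE_STORAGE_ACCESS_KEY="<your-account-key>"

# Confirm the variables are visible to child processes like our Go app.
echo "Using storage account: $AZURE_STORAGE_ACCOUNT"
```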
##### Creating the Go application
Now that we have everything setup, we can create our Go application. If you want, you can use Go with VSCode. Just install [this VSCode extension](https://marketplace.visualstudio.com/items?itemName=ms-vscode.Go) to make it work and get features like IntelliSense and debugging.
1. Let's start at the beginning. First, we create a Go file that will contain our code. Just create a file that has the .go extension, like **AzureBlobTest.go**. VSCode even lets you save files as Go files.
2. Now, in the **AzureBlobTest.go** file, write:
```
package main
import (
"bufio"
"bytes"
"context"
"fmt"
"io/ioutil"
"log"
"math/rand"
"net/url"
"os"
"strconv"
"time"
"github.com/Azure/azure-storage-blob-go/azblob"
)
```
If you are using VSCode, you don't have to type everything in the import statement yourself, except for the Azure Blob Storage SDK line. VSCode will automatically detect which packages you need and put them in the import statement.
The **github.com/Azure/azure-storage-blob-go/azblob** is the Azure Storage Blobs SDK.
3. To make sure that the Azure Blob Storage SDK can be found and that the application always imports it, I will also create a **go.mod** file. In here, I write:
```
module main
require github.com/Azure/azure-storage-blob-go v0.0.0-20181022225951-5152f14ace1c
```
This makes sure that the application actually downloads the SDK onto the machine. There are other methods to do this as well, but this one works for me.
4. Next, we create a new function, like this:
```
func main(){
}
```
5. And in the main function, we'll start creating the code that talks to Blob Storage. I'll start by getting the Azure Storage **account** and **Key** from the environment variables:
```
// From the Azure portal, get your storage account name and key and set environment variables.
accountName, accountKey := os.Getenv("AZURE_STORAGE_ACCOUNT"), os.Getenv("AZURE_STORAGE_ACCESS_KEY")
```
6. Next, we use the credentials to create an Azure Storage pipeline and create a new Azure Blob Container. For the container name, we use a random string so that we minimize the chance that the container already exists. I don't show the code for getting the random string here. Don't worry, that is included in the finalized code sample; you can find a link to it in the conclusion of this post.
```
// Create a default request pipeline using your storage account name and account key.
credential, err := azblob.NewSharedKeyCredential(accountName, accountKey)
if err != nil {
log.Fatal("Invalid credentials with error: " + err.Error())
}
pipeline := azblob.NewPipeline(credential, azblob.PipelineOptions{})
// Create a random string for the quick start container
containerName := fmt.Sprintf("quickstart-%s", randomString())
// From the Azure portal, get your storage account blob service URL endpoint.
URL, _ := url.Parse(
fmt.Sprintf("https://%s.blob.core.windows.net/%s", accountName, containerName))
// Create a ContainerURL object that wraps the container URL and a request
// pipeline to make requests.
containerURL := azblob.NewContainerURL(*URL, pipeline)
// Create the container
fmt.Printf("Creating a container named %s\n", containerName)
ctx := context.Background() // This example uses a never-expiring context
_, err = containerURL.Create(ctx, azblob.Metadata{}, azblob.PublicAccessNone)
handleErrors(err)
```
7. Now that we have a container, we can put blobs in it. The code below creates a file and uploads it as a Blob to the container.
```
// Create a file to test the upload and download.
fmt.Printf("Creating a dummy file to test the upload and download\n")
data := []byte("hello world this is a blob\n")
fileName := randomString()
err = ioutil.WriteFile(fileName, data, 0700)
handleErrors(err)
// Here's how to upload a blob.
blobURL := containerURL.NewBlockBlobURL(fileName)
file, err := os.Open(fileName)
handleErrors(err)
// The high-level API UploadFileToBlockBlob function uploads blocks in parallel for optimal performance and can handle large files.
// It calls PutBlock/PutBlockList for files larger than 256 MB, and PutBlob for smaller files.
fmt.Printf("Uploading the file with blob name: %s\n", fileName)
_, err = azblob.UploadFileToBlockBlob(ctx, file, blobURL, azblob.UploadToBlockBlobOptions{
BlockSize: 4 * 1024 * 1024,
Parallelism: 16})
handleErrors(err)
```
8. Next, we loop through all of the blobs in the container and print their names to the screen:
```
// List the container that we have created above
fmt.Println("Listing the blobs in the container:")
for marker := (azblob.Marker{}); marker.NotDone(); {
// Get a result segment starting with the blob indicated by the current Marker.
listBlob, err := containerURL.ListBlobsFlatSegment(ctx, marker, azblob.ListBlobsSegmentOptions{})
handleErrors(err)
// ListBlobs returns the start of the next segment; you MUST use this to get
// the next segment (after processing the current result segment).
marker = listBlob.NextMarker
// Process the blobs returned in this result segment (if the segment is empty, the loop body won't execute)
for _, blobInfo := range listBlob.Segment.BlobItems {
fmt.Print(" Blob name: " + blobInfo.Name + "\n")
}
}
```
9. The final piece of code waits for the user to press ENTER and then cleans everything up:
```
// Cleaning up the quick start by deleting the container and the file created locally
fmt.Printf("Press enter key to delete the sample files, example container, and exit the application.\n")
bufio.NewReader(os.Stdin).ReadBytes('\n')
fmt.Printf("Cleaning up.\n")
containerURL.Delete(ctx, azblob.ContainerAccessConditions{})
file.Close()
os.Remove(fileName)
```
That's it. Now we can test it out. You can run the code from VS Code, but also from the command line. To run it, use:
```
go run AzureBlobTest.go
```
And the output looks like this:
<img :src="$withBase('/files/RunningGoResult.png')">
(Running the Go application)
Before you press ENTER, you can see the results of the application in Azure.
1. Go to the Azure portal and navigate to the Azure Storage Account
2. Click on **Blobs**
3. Now you'll see the container. Click on it and click on the blob within it. When you now click **Edit Blob**, you can see the content of the Blob.
<img :src="$withBase('/files/BlobInAzurePortal.png')">
(Azure Storage Blob in the Azure portal)
4. You can now go back to the command prompt and press **ENTER**. This will delete the Azure Storage container and the blob in it.
##### Conclusion and where to find the source code
That's it! As you can see, it is relatively easy to use Azure with Go. You can really use almost any language to work with Azure and to create apps for Azure, now including Go.
Besides working with Azure services, you can also create Go applications that run in Azure. For instance, you can compile a Go application by executing **go build filename.go** to get an executable file that contains everything it needs to run. You can then deploy that executable to run in a container, in App Service, in Azure Service Fabric, or wherever you like.
And you can find the complete source code of the application that we've built, in [this GitHub repository](https://github.com/Azure-Samples/storage-blobs-go-quickstart).

blog/blog/tip194.md Normal file
---
type: post
title: "Tip 194 - Azure Automation with Windows Machine with PowerShell"
excerpt: "Learn how to easily automate things in Azure using Azure Automation"
tags: [Management and Governance, Languages & Frameworks]
share: true
date: 2019-04-21 02:00:00
---
::: tip
:bulb: Learn more : [An introduction to Azure Automation](https://docs.microsoft.com/azure/automation/automation-intro?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [How to use Azure Automation with PowerShell](https://www.youtube.com/watch?v=pQ9dQ13B2vM&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=50?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### Azure Automation with Windows Machine with PowerShell
As a developer, when I do something more than once, I want to automate it. [Azure Automation](https://docs.microsoft.com/azure/automation/automation-intro?WT.mc_id=docs-azuredevtips-azureappsdev) makes this very easy for most common IT tasks, like scaling Azure SQL DBs up and down and starting and stopping VMs on a schedule.
Azure Automation is a very mature service in Azure that lets you do almost anything you can think of in Azure, and also on-premises in hybrid scenarios.
Basically, Azure Automation is serverless. You don't have to worry about how it runs and works; you just tell it what to do and use it. And you only [pay for it](https://azure.microsoft.com/pricing/details/automation?WT.mc_id=azure-azuredevtips-azureappsdev) when it does something.
Azure Automation provides lots of features out-of-the-box, ranging from tracking changes in the configuration of VMs, to managing VM operating system updates, to starting and stopping VMs on a schedule.
<img :src="$withBase('/files/AutomationFeatures.png')">
(Azure Automation features in the Azure portal menu)
You can also customize the job that you want Azure Automation to do for you. To do that, you can create a **Runbook**. You can create one from scratch or pick one from the [Runbook gallery](https://gallery.technet.microsoft.com/scriptcenter/site/search?f[0].Type=RootCategory&f[0].Value=WindowsAzure&f[1].Type=SubCategory&f[1].Value=WindowsAzure_automation&f[1].Text=Automation). Runbooks can contain PowerShell **Modules** that make up the functionality in the runbook. As with runbooks, you can create modules yourself or select them from the [PowerShell gallery](https://www.powershellgallery.com). Runbooks can also contain Python code and Python packages instead of PowerShell modules.
All of this runs in an Azure Automation account. So the hierarchy of components looks like this:
* Azure Automation account
* Runbook
* Module 1
* Module 2
* Module 3
##### Scaling Azure VMs using a custom Runbook
Let's create a custom runbook that allows us to scale the VMs in our Azure subscription.
We'll start by making sure that we have an Azure Automation Account.
1. In the Azure portal, find **Azure Automation**. You can do this by searching for it in the search bar at the top of the portal
2. If you don't have an Azure Automation Account yet, you need to create one by clicking on the button that's in the image below:
<img :src="$withBase('/files/CreateAutomationAccount.png')">
(Create an Azure Automation Account in the Azure portal)
3. Now, fill out the creation wizard and create the account.
4. In the menu of the Azure Automation account, we'll make sure that we are running the latest version of the PowerShell cmdlets. Click on the **Modules** menu item and then click on **Update Azure Modules**. Updating the modules can take a while, but it ensures that we are running the latest versions of the PowerShell commands.
<img :src="$withBase('/files/UpdatePowerShellModules.png')">
(Update PowerShell modules in the Azure portal)
5. Now we'll create the runbook that will contain the PowerShell script. Click on the **Runbooks** menu item and click on **Add a runbook**. This opens up the runbook creation blade
6. Pick a **Name** for the runbook, select **PowerShell** as the runbook type and click **Create**
<img :src="$withBase('/files/CreateARunbook.png')">
(Create a runbook in the Azure portal)
7. We will now write the PowerShell script that does the actual work. We'll start by declaring some parameters:
```powershell
param(
[parameter(Mandatory=$false)]
[bool]$scaleUp = $false,
[parameter(Mandatory=$false)]
[string]$ScaleUpSize = "Standard_D2_V2",
[parameter(Mandatory=$false)]
[string]$ScaleDownSize = "Standard_B1ms"
)
```
The first parameter determines whether we scale up or down.
The second and third parameters indicate which VM size to scale up or down to, for instance "Standard_D2_V2".
8. Next, we add the PowerShell code that logs in to Azure. We do that using the AzureRunAsConnection that Azure Automation created for us automatically:
```powershell
try
{
    "Logging in to Azure..."
    $runAsConnectionProfile = Get-AutomationConnection `
        -Name "AzureRunAsConnection"
    Add-AzureRmAccount -ServicePrincipal `
        -TenantId $runAsConnectionProfile.TenantId `
        -ApplicationId $runAsConnectionProfile.ApplicationId `
        -CertificateThumbprint $runAsConnectionProfile.CertificateThumbprint | Out-Null
    Write-Output "Authenticated with Automation Run As Account."
}
catch
{
    # A try block needs a matching catch (or finally); rethrow so the runbook fails visibly
    Write-Error -Message $_.Exception.Message
    throw $_.Exception
}
```
9. Once we are successfully logged in, we can set the variable that determines the size that we scale to. When **\$scaleUp** is true, we use the **\$ScaleUpSize** parameter, and when it is false, we use the **\$ScaleDownSize** parameter:
```powershell
if($scaleUp){
$ScaleSize= $ScaleUpSize
}
else{
$ScaleSize= $ScaleDownSize
}
```
10. Now, for the meat of the script. In the script below, the following happens
a. First, we get all of the resource groups and loop through them
b. In each resource group, we find all of the VMs and loop through them
c. Next, we get the current size of the VM (so the pricing tier) and compare that against the size that we want to scale to. If the VM is already the same size, we don't scale it
d. Next, we check if the VM is running and stop it if it is running
e. Finally, we update the VM with the new size
```powershell
Function Start-VMAutoScaling{
# a. First, we get all of the resource groups and loop through them
$RGs = Get-AzureRMResourceGroup
foreach($RG in $RGs){
$RGN = $RG.ResourceGroupName
$VMs = Get-AzureRmVM -ResourceGroupName $RGN
foreach ($VM in $VMs){
# b. In each resource group, we find all of the VMs and loop through them
$VMName = $VM.Name
$VMDetail = Get-AzureRmVM -ResourceGroupName $RGN -Name $VMName
$VMSize = $VMDetail.HardwareProfile.VmSize
if(($VMSize -ne $ScaleSize) -and ($ScaleSize)){
# c. Next, we get the current size of the VM (so the pricing tier) and compare that against the size that we want to scale to.
Write-Output "Resource Group: $RGN", ("VM Name: " + $VMName), "Current VM Size: $VMSize", "Target VM Size: $ScaleSize"
$VMStatus = Get-AzureRmVM -ResourceGroupName $RGN -Name $VMName -Status
# d. Next, we check if the VM is running and stop it if it is running
if($VMStatus.Statuses[1].DisplayStatus -eq "VM running"){
Write-Output "Stopping VM '$VMName'"
Stop-AzureRmVM -ResourceGroupName $RGN -Name $VMName -Force | Out-Null
}
# e. Finally, we update the VM with the new size
$VM.HardwareProfile.VmSize = $ScaleSize
Update-AzureRmVM -VM $VM -ResourceGroupName $RGN
Write-Output "Resized VM '$VMName'" `n
}
else{
Write-Output "VM '$VMName' is exempted from scaling (Current VM size matches scaling size)"
}
}
}
}
```
11. Finally, we call the **Start-VMAutoScaling** function:
```powershell
############################ Start autoscaling function ####################
Start-VMAutoScaling
Write-Output "VM Scaling Completed"
```
12. The script is now ready. Click **Save** and click on **Test pane** to test it.
13. In the test pane, leave all the parameters as they are and click **Start**. This will look for VMs in your subscription and scale them down, as that is the default setting. (This only works if there are VMs in your subscription.) After a while, the run succeeds and shows the result in the output of the test pane
<img :src="$withBase('/files/TestRunBook.png')">
(Test completed in the Azure portal)
14. Go back to the **runbook edit blade** (you can use the breadcrumb menu in the top to navigate back to it) and click **Publish**. This pushes the runbook to 'production', which means that you can now use it and schedule it.
15. You should now be in the blade of the runbook. If not, navigate to it. Here, you can click **Schedule** to set up a schedule to run this runbook on. You can, for instance, schedule the runbook to scale up every morning and scale down every evening. This saves you money if you don't need to run powerful machines during the night
<img :src="$withBase('/files/CreateSchedule.png')">
(Create a schedule for the runbook in the Azure portal)
##### Conclusion
That's it! Now we have automated an important task using an Azure Automation PowerShell runbook that runs on a schedule. Granted, this script could be more sophisticated: it could have more checks and balances, and it could be configurable to scale only certain VMs in your subscription. But that's okay, as I just wanted to show you how to go about creating functionality like this. And remember, there are already many pre-made Azure Automation solutions available in the [Runbook gallery](https://gallery.technet.microsoft.com/scriptcenter/site/search?f[0].Type=RootCategory&f[0].Value=WindowsAzure&f[1].Type=SubCategory&f[1].Value=WindowsAzure_automation&f[1].Text=Automation).

blog/blog/tip195.md Normal file
---
type: post
title: "Tip 195 - Use Azure Monitor to track custom events"
excerpt: "Learn how to track custom events with Azure Monitor"
tags: [Management and Governance]
share: true
date: 2019-04-28 02:00:00
---
::: tip
:bulb: Learn more : [Azure Monitor Capabilities](https://docs.microsoft.com/azure/azure-monitor/?WT.mc_id=docs-azuredevtips-azureappsdev).
:tv: Watch the video : [How to use Azure Monitor Insights to record custom events](https://www.youtube.com/watch?v=iTRILNstmFI&list=PLLasX02E8BPCNCK8Thcxu-Y-XcBUbhFWC&index=50&t=1s?WT.mc_id=youtube-azuredevtips-azureappsdev).
:::
### Use Azure Monitor to track custom events
When you enable [Application Insights](https://docs.microsoft.com/azure/application-insights/app-insights-overview?WT.mc_id=docs-azuredevtips-azureappsdev) in your application, you automatically start tracking application usage, performance, failures and more. This is great information that helps you to keep your application up and running and performing well.
But sometimes you need more information. For instance, when you've just released a new feature, you want to know if users are using it and if that works for them.
Application Insights can help you to track this information as well. You do that by talking to the Application Insights API, which you can access through the SDK.
> NOTE: The following examples demonstrate adding the Insights SDK for .NET, but versions are also available for Node.js and Java
If you don't have the Application Insights SDK in your application yet, you can add it easily from within Visual Studio. Just **right-click your project > Add > Application Insights Telemetry** and go through the wizard.
<img :src="$withBase('/files/AddAppInsightsSDK.png')">
(Add the Insights SDK)
If you do have the SDK already, make sure that you've updated it to the latest version. You can do this in the NuGet Package Manager.
<img :src="$withBase('/files/UpdateAppInsightsSDKNuget.png')">
(Update the Insights SDK)
Let's take a look at how we can track some custom events with Application Insights.
##### Log custom server-side events using the Insights C# SDK
Logging custom events is very simple when you use the SDK.
The first thing that you need to do is get a reference to the telemetry client, like this:
```csharp
private TelemetryClient telemetry = new TelemetryClient();
```
Now use the client to log events. There are several methods that you can use to log custom data. For things that happened, the **TrackEvent** call works best. [Other methods](https://docs.microsoft.com/azure/application-insights/app-insights-api-custom-events-metrics?WT.mc_id=docs-azuredevtips-azureappsdev) that you can use are **TrackException**, **TrackRequest** and **TrackDependency**, amongst others.
In my case, I want to track the search terms that people use in my application. I can do that like this:
```csharp
var dictionary = new Dictionary<string, string>();
dictionary.Add("term", searchterm);
telemetry.TrackEvent("Searched", dictionary);
```
You provide the TrackEvent method with a dictionary, which means that you can pass it a whole list of key/value pairs.
##### Log custom client-side events using the Insights JavaScript SDK
I also want to track events that happen on the client. In JavaScript, I execute a method when a user clicks on a certain div element. I want to know which div this is.
To track events on the client, you use the Insights JavaScript SDK. This is automatically injected for you on each page when you enable Application Insights.
In JavaScript, I just call **trackEvent** on the **appInsights** object to track the event.
```javascript
var dictionary = { "service": serviceName };
window.appInsights.trackEvent("ServiceClick", dictionary);
```
I can do this without first getting a reference to the Insights client, because the client is already injected for me.
Here too, I pass the **trackEvent** method a dictionary with values.
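To make this concrete, here is a small, self-contained sketch. The `appInsights` object is stubbed for illustration; in the browser it is provided by the injected SDK snippet, and the `trackClick` helper and its names are hypothetical:

```javascript
// Stub of the injected SDK client, for illustration only.
// In the browser, window.appInsights comes from the Application Insights snippet.
const appInsights = {
  trackEvent: (name, properties) => {
    console.log(`event: ${name}`, properties);
  }
};

// A hypothetical helper so every click handler logs events consistently.
function trackClick(elementName, extra) {
  // Merge the element name with any extra key/value pairs into one dictionary.
  const properties = Object.assign({ service: elementName }, extra);
  appInsights.trackEvent("ServiceClick", properties);
  return properties; // returned to make the helper easy to test
}

const logged = trackClick("storage-tile", { page: "home" });
```

A click handler could then simply call `trackClick("storage-tile", { page: "home" })` instead of building the dictionary inline each time.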
##### Analyzing custom events
Now that we are logging custom events, we can also see them in Insights in the Azure portal.
In the portal, in your Insights instance, you can see the events in the **Events** menu.
<img :src="$withBase('/files/CustomEventsInPortal.png')">
(Insights Events in the Azure portal)
In the events page, you can see an aggregate table of which custom events happened and how many times they happened.
From here, you can drill down into the events to take a closer look at their data. You can see each individual event and see which data is associated with the event.
<img :src="$withBase('/files/DataFromCustomEvent.png')">
(Custom event data in the Azure portal)
If you want to dive in deeper, you can query all of your data, including the custom events, using the Insights Analytics portal.
<img :src="$withBase('/files/AppInsightsAnalyticsButton.png')">
(The Analytics button in Application Insights)
In here, you can query the data. This is very powerful and allows you to display the results as data or even render them in a chart. You query the data using a SQL-like language that is pretty simple to use, and there's good documentation on the query language. The portal shows you all of the fields that you can query on, and you can use common constructs like "where" clauses in your queries.
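As an example, a query over the custom events logged earlier might look like this. This is a sketch in the Analytics query language; the event name and the `term` property match the TrackEvent call shown above:

```kusto
// Count "Searched" events per search term over the last 7 days
customEvents
| where timestamp > ago(7d)
| where name == "Searched"
| extend term = tostring(customDimensions.term)
| summarize occurrences = count() by term
| order by occurrences desc
```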
<img :src="$withBase('/files/AppInsightsAnalytics.png')">
(Query custom events in Insights Analytics)
##### Conclusion
Logging custom data with Insights is very easy. You can track things that happen and you can also track metrics, like the http queue length performance metric. Insights in the Azure portal allows you to easily analyze this custom data and even render charts from it.
As all of this is relatively easy, you owe it to yourself to track what your users are doing. That's the only way to know if your application is actually being used in the way you thought it would be. Go and check it out!
