Adding overview diagram and minor edits

This commit is contained in:
Xu Rui 2018-08-06 16:48:31 -07:00, committed by GitHub
Parent: 1ffbed500c
Commit: ecf5d05342
1 changed file: 11 additions and 7 deletions

# How Event Hubs Capture integrates with Event Grid and Azure Functions
One of the key scenarios for modern cloud-scale apps is seamless integration and notification among apps and services. In this blog post, we introduced [Azure EventGrid](https://azure.microsoft.com/blog/introducing-azure-event-grid-an-event-service-for-modern-applications/) (in public preview), a service designed for exactly that!
Today, we will go over a realistic scenario of capturing Azure Event Hub data into a SQL Data Warehouse and demonstrate the power and simplicity of using [Azure EventGrid](https://docs.microsoft.com/azure/event-grid/overview) to achieve this.
So, fasten your seat belts…
### What is covered in this tutorial:
![Visual Studio](./media/EventGridIntegrationOverview.PNG)
* First, we turn on Capture on an Azure Event Hub with Azure Blob storage as the destination. This generates Azure Storage blobs containing the Event Hub data in Avro format.
* Next, we create an Azure EventGrid subscription with the Azure Event Hub namespace as the source and an Azure Function as the destination.
* Whenever a new Avro blob is generated by Azure Event Hub Capture, Azure EventGrid notifies, or shoulder-taps, the Azure Function with information about the blob (its path, etc.). The Function then does the required processing to load the data into a SQL Data Warehouse.
That is it! There are no worker services involved in polling for these Avro blobs. This means no management overhead and significantly lower COGS, especially in a cloud-scale production environment!
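Under the hood, the shoulder-tap arrives as a standard Event Grid notification: a JSON array of events whose `eventType` for Capture is `Microsoft.EventHub.CaptureFileCreated`, with the Avro blob's URL in `data.fileUrl`. Here is a hedged sketch (this is not the sample's actual code, and the subscription, namespace, and storage names are hypothetical) of pulling the blob URL out of that payload:

```python
import json

# Hypothetical Event Grid notification body; field names follow the
# documented Event Grid event schema, values here are made up.
sample_notification = json.dumps([{
    "topic": "/subscriptions/.../namespaces/mynamespace",
    "subject": "eventhubs/myhub",
    "eventType": "Microsoft.EventHub.CaptureFileCreated",
    "eventTime": "2018-08-06T16:48:31Z",
    "data": {
        "fileUrl": "https://mystorage.blob.core.windows.net/capture/myhub/0/2018/08/06/16/48/31.avro",
        "fileType": "AzureBlockBlob",
        "partitionId": "0",
        "sizeInBytes": 8342,
        "eventCount": 50,
    },
}])

def capture_blob_urls(body):
    """Return the Avro blob URLs from an Event Grid notification body,
    keeping only Capture-file-created events."""
    return [
        e["data"]["fileUrl"]
        for e in json.loads(body)
        if e.get("eventType") == "Microsoft.EventHub.CaptureFileCreated"
    ]

print(capture_blob_urls(sample_notification))
```

The Function can then fetch and process the blob at each returned URL; everything else in the notification (partition, size, event count) is there for free if you need it.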
The sample code for this scenario consists of a solution with projects that do the following:
1. WindTurbineDataGenerator – a simple publisher that sends sample wind turbine data to your Event Hub, which has Capture enabled on it.
1. FunctionDWDumper – an Azure Functions project that receives the EventGrid notification for each Capture Avro blob created. It gets the blob's URI path, reads its contents, and pushes the data to a SQL Data Warehouse table.
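As a rough illustration of the first project's role, here is a minimal stdlib-only sketch of building one wind turbine telemetry event. The field names are assumptions, not necessarily those used by WindTurbineDataGenerator, and the real project would hand these payloads to an Event Hubs client for sending:

```python
import json
import random
import time

def generate_turbine_event(turbine_id):
    """Build one JSON telemetry event for a wind turbine.
    Field names here are illustrative, not the sample's actual schema."""
    return json.dumps({
        "deviceId": turbine_id,
        "timestamp": int(time.time()),
        "windSpeedMph": round(random.uniform(5.0, 45.0), 1),
        "powerOutputKw": round(random.uniform(100.0, 2500.0), 1),
    })

# One sample event, as it would be sent to the Event Hub.
print(generate_turbine_event("turbine-001"))
```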
# Prerequisites
* [Visual Studio 2017 Version 15.3.2 or greater](https://www.visualstudio.com/vs/)
![Visual Studio](./media/EventCaptureGridDemo1.png)
# Detailed steps
### Overview
1. Deploy the infrastructure for this solution
2. Create a table in SQL Data Warehouse
3. Publish code to the Functions App
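The Function deployed in step 3 ultimately has to read each Capture blob, whose Avro records carry the original event payload in a `Body` bytes field (alongside `SequenceNumber`, `Offset`, `EnqueuedTimeUtc`, and property maps). The sketch below assumes the records have already been deserialized into dicts by an Avro reader (e.g. fastavro), and uses the hypothetical payload fields from earlier, to show the per-record shaping before the warehouse insert:

```python
import json

# One fake, already-deserialized Capture record; a real Function would get
# these from an Avro reader over the blob's contents.
fake_records = [
    {"SequenceNumber": 1, "EnqueuedTimeUtc": "8/6/2018 4:48:31 PM",
     "Body": b'{"deviceId": "turbine-001", "windSpeedMph": 17.3}'},
]

def records_to_rows(records):
    """Turn deserialized Capture records into (deviceId, windSpeed) tuples
    ready for a bulk insert; the column mapping here is assumed."""
    rows = []
    for rec in records:
        payload = json.loads(rec["Body"].decode("utf-8"))
        rows.append((payload["deviceId"], payload["windSpeedMph"]))
    return rows

print(records_to_rows(fake_records))
```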
6. Build the solution, then run the WindTurbineGenerator.exe application.
## 6. Observe the Captured data that has been migrated to your SQL Data Warehouse table by the Azure Function
After a couple of minutes, query the table in your data warehouse. You will see that the data generated by the WindTurbineDataGenerator has been streamed to your Event Hub, Captured into an Azure Storage container, and then migrated into the SQL Data Warehouse table by the Azure Function.
## Next steps
You can use powerful data visualization tools with your data warehouse to derive actionable insights.