Cloud Deployment On Linux
The ARM deployment for DataX currently supports only Windows; multi-platform support (Linux and Mac) is planned for the next release. In the meantime, the steps below describe how to deploy from a Linux machine.
Prerequisites
- Install PowerShell Core: https://docs.microsoft.com/en-us/powershell/scripting/install/installing-powershell?view=powershell-6
- Install the Azure CLI: https://docs.microsoft.com/en-us/cli/azure/install-azure-cli?view=azure-cli-latest
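To confirm both prerequisites are on the PATH, you can check their versions from any shell, and sign in once so the deployment scripts can reach your subscription:

```bash
pwsh --version   # PowerShell Core
az --version     # Azure CLI
az login         # interactive Azure sign-in
```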
Deployment Steps
- Open common.parameters.txt under DeploymentCloud/Deployment.DataX and provide your TenantId and SubscriptionId (see the sketch after this list).
- Open a shell / command prompt.
- Run PowerShell Core (pwsh).
- Follow the steps below to set up a DataX environment.
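The exact layout of common.parameters.txt can differ between releases; as a sketch, assuming simple key=value entries, the two required values would look like this (the GUIDs are placeholders):

```
TenantId=00000000-0000-0000-0000-000000000000
SubscriptionId=00000000-0000-0000-0000-000000000000
```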
Resource Deployment
- Run deployResources.ps1.
- After the script finishes, complete the following steps:
  - Set up secrets in Key Vaults: automated, no manual action is needed.
  - Run script actions for HDInsight (a sketch of the upload follows these steps):
    1. Upload all files in DeploymentCloud/Deployment.Common/scripts to a storage account.
    2. Add the script actions: Azure Portal > Resource Group > select the DataX resource group > select the HDInsight cluster > Script Actions > Submit New > Custom > Bash Script URL > provide the URI of the script you uploaded in step 1.
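    A minimal sketch of step 1 with the Azure CLI; the container and account names here are placeholders, and the blob URI you hand to the portal must be reachable by the cluster (public or SAS-signed):

    ```bash
    az storage container create --name scriptactions --account-name <storageAccount>
    az storage blob upload-batch --account-name <storageAccount> \
        --destination scriptactions \
        --source DeploymentCloud/Deployment.Common/scripts
    ```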
  - Set up CosmosDB (a scripted sketch follows these steps):
    - Create a Production database with the following collections:
      - commons
      - configgenConfigs
      - flows
      - sparkClusters
      - sparkJobs
    - Upload the documents from DeploymentCloud/Deployment.Common/CosmosDB to the corresponding collections.
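    One way to script the database and collections on a newer Azure CLI, assuming the account uses the Core (SQL) API; if yours is a MongoDB API account, use the matching az cosmosdb mongodb commands instead. The account name, resource group, and partition key path are placeholders:

    ```bash
    az cosmosdb sql database create --account-name <cosmosAccount> \
        --resource-group <resourceGroup> --name Production
    for c in commons configgenConfigs flows sparkClusters sparkJobs; do
        az cosmosdb sql container create --account-name <cosmosAccount> \
            --resource-group <resourceGroup> --database-name Production \
            --name "$c" --partition-key-path "/id"  # placeholder path; match your documents
    done
    ```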
  - Set up Key Vault access (a CLI sketch follows this list):
    - In each Key Vault, make sure the following principals are added to the access policies:
      - the Web App
      - the service AAD app
      - SparkManagedIdentity
      - the Virtual Machine Scale Set
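    Granting a principal access can be scripted; a sketch with placeholder names, assuming you have each principal's object id and that secret get/list is the minimum needed:

    ```bash
    az keyvault set-policy --name <keyVaultName> \
        --object-id <principalObjectId> \
        --secret-permissions get list
    ```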
  - Set up Service Fabric:
    - Install the SSL certificate on the cluster nodes; see https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-tutorial-dotnet-app-enable-https-endpoint#install-certificate-on-cluster-nodes
    - Open port 443 in the Azure load balancer (a CLI sketch follows this list); see https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-tutorial-dotnet-app-enable-https-endpoint#open-port-443-in-the-azure-load-balancer
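    Opening port 443 can also be done from the CLI; the resource names here are placeholders, and the linked tutorial covers the probe and rule details:

    ```bash
    az network lb probe create --resource-group <resourceGroup> --lb-name <lbName> \
        --name HttpsProbe --protocol tcp --port 443
    az network lb rule create --resource-group <resourceGroup> --lb-name <lbName> \
        --name HttpsRule --protocol tcp --frontend-port 443 --backend-port 443 \
        --probe-name HttpsProbe
    ```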
App Deployment
- Make sure nuget.exe and mono are installed locally (a sketch follows this list). Please refer to https://docs.microsoft.com/en-us/nuget/tools/nuget-exe-cli-reference
- Run deployApps.ps1.
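On Linux, nuget.exe runs under mono; one way to get both in place (package names are for Debian/Ubuntu):

```bash
sudo apt-get install -y mono-complete
curl -L -o nuget.exe https://dist.nuget.org/win-x86-commandline/latest/nuget.exe
mono nuget.exe help   # verify nuget.exe runs under mono
```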
Sample Deployment
- As with App Deployment, make sure nuget.exe and mono are installed locally. Please refer to https://docs.microsoft.com/en-us/nuget/tools/nuget-exe-cli-reference
- Run deploySample.ps1 (a combined run sketch follows this list).
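Putting the three phases together, assuming the deploy*.ps1 scripts live next to common.parameters.txt (the path is an assumption; adjust to your checkout):

```bash
cd DeploymentCloud/Deployment.DataX
pwsh ./deployResources.ps1   # Resource Deployment, then the manual steps above
pwsh ./deployApps.ps1        # App Deployment
pwsh ./deploySample.ps1      # Sample Deployment
```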