Guide to Building Azure Sentinel Data Experiences

This guide provides an overview of the data connectivity options providers can enable in Azure Sentinel for their customers, with a specific focus on the build, validation, and publishing process. It also covers technical details on opportunities to enable richer Azure Sentinel experiences.

Bring your data in Azure Sentinel

  1. Choose the data connector type – The provider decides which connector type to build, depending on the planned customer experience and the ingestion mechanism(s) the provider's data source already supports
  2. Send data to Azure Sentinel – The provider follows the steps for the chosen data connector type to set up the pipeline as a proof of concept (POC), validate it, and see the data flow in Azure Sentinel
  3. Build the connector – The provider builds the connector using the templates and guidance, then validates and submits the data connector with sample queries and documentation
  4. Validate and sign off in production – Microsoft deploys the connector, after which the provider validates it end to end from a customer standpoint in production, under limited visibility, before customers can discover the connector
  5. Connector ships in public preview – After the provider signs off, Microsoft switches the connector to public preview; customers can discover the data connector in the gallery and try it

Evolve your data experience

Providers can build workbooks, analytic rule templates, hunting queries, investigation graph queries, Logic Apps connectors, playbooks, and more for an enhanced customer experience around the provider's data connector in Azure Sentinel. Refer to Evolve the data experience for details.

Bring your data in Azure Sentinel

Choose the data connector type

There are three types of data connectors providers can build to stream their data into Azure Sentinel.

The following overview lists these options at a high level to help providers decide.

CEF (Preferred)

  Customer experience:
  • Log information is automatically ingested into the standard CEF schema.
  • KQL queries use the strongly typed, well-known CEF schema.
  • Little or no additional parsing is required by your customers.
  • Your data will be meaningful to many queries.
  • Multi-step configuration – the customer needs to set up a machine / Azure VM to run an agent that pushes logs into Sentinel.

  Why choose? CEF results in the best query and analytics experience in Sentinel, because your data appears as columns of the well-known CEF (CommonSecurityLog) schema in the Sentinel Log tables.

REST API

  Customer experience:
  • Log information is automatically ingested into custom tables with your schema.
  • Custom queries are required to use your data.
  • The customer must learn your schema.
  • Simple configuration – the customer does not need to set up a machine / Azure VM to run an agent.

  Why choose? Create custom tables when you have data that does not conform to the CEF or raw Syslog formats, or when you want strict control over the schema mapping and column names in the Sentinel tables that present your data.

Syslog (Least preferred)

  Customer experience:
  • Raw Syslog information is automatically ingested into a simple log schema as a plain string.
  • Queries are more complex, as customers will need to parse the Syslog messages using KQL functions.
  • Multi-step configuration – the customer needs to set up a machine / Azure VM to run an agent that pushes logs into Sentinel.

  Why choose? Choose this only if your product can currently emit raw Syslog alone.

Send Data to Azure Sentinel

Once you have decided on the type of data connector you plan to support, set up the pipeline to send this data to Azure Sentinel as a POC before building the connector. The process is described below for each data connector type. Once you have a POC, send an email to AzureSentinelPartner@microsoft.com for the POC demo.

REST API Connectors

  1. Use the Azure Monitor Data Collector API to send data to Azure Log Analytics. This blog covers step-by-step instructions, with screenshots, for doing so. If your product runs on-premises, open port 443 (HTTPS/TLS) in your environment so it can talk to Azure Sentinel.
  2. Ensure the schema used to structure the data in Log Analytics is locked. Any change to the schema after the data connector is published has compatibility impact, and therefore requires a new name for the connector data type.
  3. Design a configuration mechanism in your product experience (via product settings or via your product website) where customers can enter the following information to send their logs into Log Analytics for Azure Sentinel:
    1. [Required] Azure Sentinel workspace ID
    2. [Required] Azure Sentinel primary key
    3. [Optional] Custom log name
    4. Any other specific dependency needed to successfully establish connectivity
  4. These logs appear in a custom Log Analytics table, Custom Logs -> <log name>_CL, where the log name is what the customer provided in the step above. Identify a default log name to handle the case where the customer does not enter a custom log name.
  5. Design and validate a few key queries that demonstrate the value of the data stream, using Kusto Query Language. Share these as sample queries in the data connector.

Example connectors to refer to: Symantec, Barracuda WAF
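As a sketch of step 1, the Data Collector API accepts a JSON payload authenticated with an HMAC-SHA256 "SharedKey" signature built from the workspace primary key. The workspace ID, key, and log type below are placeholders, and this is an illustration of the documented scheme rather than production code:

```python
# Sketch: send records to the Azure Monitor HTTP Data Collector API.
# Workspace ID/key and log type are placeholders - see the API docs for details.
import base64
import hashlib
import hmac
import json
import urllib.request
from datetime import datetime, timezone

def build_signature(workspace_id, shared_key, date, content_length,
                    method="POST", content_type="application/json",
                    resource="/api/logs"):
    """Build the SharedKey authorization header value."""
    string_to_sign = (f"{method}\n{content_length}\n{content_type}\n"
                      f"x-ms-date:{date}\n{resource}")
    decoded_key = base64.b64decode(shared_key)
    digest = hmac.new(decoded_key, string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"

def post_data(workspace_id, shared_key, log_type, records):
    """Prepare (and optionally send) a signed request with a list of records."""
    body = json.dumps(records).encode("utf-8")
    rfc1123_date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
    headers = {
        "Content-Type": "application/json",
        "Log-Type": log_type,  # Azure appends _CL, e.g. MyProduct -> MyProduct_CL
        "x-ms-date": rfc1123_date,
        "Authorization": build_signature(workspace_id, shared_key,
                                         rfc1123_date, len(body)),
    }
    url = (f"https://{workspace_id}.ods.opinsights.azure.com"
           f"/api/logs?api-version=2016-04-01")
    req = urllib.request.Request(url, data=body, headers=headers)
    # urllib.request.urlopen(req)  # uncomment to actually send
    return req
```

A customer-facing implementation would add retries and error handling; the point here is the signature string format and the required headers.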

Connector Validation Steps

  1. Test the actual customer experience and validate that data flows as expected and appears in the expected Azure Sentinel Log Analytics custom table.
  2. If your product runs on-premises, open port 443 (HTTPS/TLS) in your environment so it can talk to Azure Sentinel. Ensure this is documented in the connector documentation (steps in the following section) for your customers.
  3. From a data quality perspective:
    1. Ensure the data you send is complete and contains the same fields available in your product.
    2. Ensure the data is valid and easy to query using Log Analytics.

CEF Connector

To enable the CEF connector, deploy a dedicated proxy Linux machine (VM or on-premises) to support the communication between your security solution (the product that sends the CEF messages) and Azure Sentinel.

Enable the CEF connector as follows:

  1. Go to Azure Sentinel
  2. Open the Data Connectors page and choose the relevant connector and click Open connector page
  3. Follow the CEF instructions below (also in the CEF connector documentation).

1. Install and configure Linux Syslog agent

Install and configure the Linux agent to collect your Common Event Format (CEF) Syslog messages and forward them to Azure Sentinel.

1.1 Select a Linux machine

Select or create a Linux machine that Azure Sentinel will use as the proxy between your security solution and Azure Sentinel. This machine can be in your on-premises environment, in Azure, or in another cloud.

1.2 Install the CEF collector on the Linux machine

Install the Microsoft Monitoring Agent on your Linux machine and configure the machine to listen on the necessary port and forward messages to your Azure Sentinel workspace.

Note:

  1. Make sure that you have Python on your machine using the following command:

python --version
    
  2. You must have elevated permissions (sudo) on your machine

    Run the following command to install and apply the CEF collector:

sudo wget https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/DataConnectors/CEF/cef_installer.py && sudo python cef_installer.py [WorkspaceID] [Workspace Primary Key]
    

2. Forward Common Event Format (CEF) logs to Syslog agent

2.1 Set your security solution to send Syslog messages in CEF format to the proxy machine. This varies from product to product; follow the process for your product. There are a couple of deployment options to choose from:

  1. The agent can collect logs from multiple sources, but must be installed on a dedicated machine.
  2. Alternatively, you can deploy the agent manually on an existing Azure VM, on a VM in another cloud, or on an on-premises machine.

2.2 Make sure to send the logs to TCP port 514 on the machine's IP address.

2.3 Outline the specific steps for sending your product's logs, along with a link to your (partner) product documentation describing how customers should configure their agent to send CEF logs from the product into Azure Sentinel.

Example connectors to refer to: Zscaler

Connector Validation Steps

Follow the instructions to validate your connectivity:

  1. Open Log Analytics to check if the logs are received using the CommonSecurityLog schema. Note: It may take about 20 minutes until the connection streams data to your workspace.

  2. If the logs are not received, run the following connectivity validation script:

    1. Note:
      1. Make sure that you have Python on your machine using the following command:

        python --version

      2. You must have elevated permissions (sudo) on your machine
    2. Run:

      sudo wget https://raw.githubusercontent.com/Azure/Azure-Sentinel/master/DataConnectors/CEF/cef_troubleshoot.py && sudo python cef_troubleshoot.py [WorkspaceID]
  3. From a data quality perspective,

    1. Ensure the data you send is complete and contains the same fields available in your product.
    2. Ensure the data is valid and easy to query using Log Analytics.
  4. Design and validate a few key queries that demonstrate the value of the data stream, using Kusto Query Language. Share these as sample queries in the data connector.

To use TLS communication between the security solution and the Syslog machine, you will need to configure the Syslog daemon (rsyslog or syslog-ng) to communicate in TLS: Encrypting Syslog Traffic with TLS - rsyslog, Encrypting log messages with TLS – syslog-ng.
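For example, with rsyslog a TLS-only listener might look roughly like the following. This is a sketch only: the certificate paths are placeholders, and the rsyslog TLS guide referenced above is the authoritative source for these directives.

```
# /etc/rsyslog.conf (sketch) - receive syslog over TLS on port 6514
$DefaultNetstreamDriver gtls
$DefaultNetstreamDriverCAFile /etc/ssl/rsyslog/ca.pem
$DefaultNetstreamDriverCertFile /etc/ssl/rsyslog/cert.pem
$DefaultNetstreamDriverKeyFile /etc/ssl/rsyslog/key.pem

$ModLoad imtcp
$InputTCPServerStreamDriverMode 1        # run driver in TLS-only mode
$InputTCPServerStreamDriverAuthMode anon # or x509/name for mutual auth
$InputTCPServerRun 6514
```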

Syslog Connector

Note: If your product supports CEF, the connection provides a richer experience; you should choose CEF and follow the instructions in Connecting data from CEF and the data connector building steps detailed in the CEF connector section.

  1. Follow the steps outlined in Connecting data from Syslog to use the Azure Sentinel Syslog connector to connect your product.
  2. Set your security solution to send Syslog messages to the proxy machine. This varies from product to product; follow the process for your product.
  3. Outline the specific steps for sending your product's logs, along with a link to your (partner) product documentation describing how customers should configure their agent to send Syslog logs from the product into Azure Sentinel.
  4. Design and validate a few key queries that demonstrate the value of the data stream, using Kusto Query Language. Share these as sample queries in the data connector.
  5. Build a parser based on a Kusto function so that the query-building experience is easy for customers working with the data connector.
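As an illustration of the parser step, a Kusto function is typically a saved query over the Syslog table that extracts named columns from the raw message. The process name and the key=value fields below are hypothetical; adapt them to your product's log format:

```
// Hypothetical parser, saved as a Log Analytics function (e.g. MyApplianceEvents)
Syslog
| where ProcessName == "myappliance"   // assumption: your product's process name
| extend
    User   = extract(@"user=(\S+)", 1, SyslogMessage),
    SrcIp  = extract(@"src=(\S+)", 1, SyslogMessage),
    Action = extract(@"act=(\S+)", 1, SyslogMessage)
| project TimeGenerated, Computer, User, SrcIp, Action
```

Customers then query MyApplianceEvents instead of parsing SyslogMessage themselves.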

Example connectors to refer to: Barracuda CWF

Connector Validation Steps

Follow the instructions to validate your connectivity:

  1. Open Log Analytics to check if the logs are received using the Syslog schema. Note: It may take about 20 minutes until the connection streams data to your workspace.
  2. From a data quality perspective,
    1. Ensure the data you send is complete and contains the same fields available in your product.
    2. Ensure the data is valid and easy to query using Log Analytics.

Build the connector

Once you have a working POC, you are ready to build and validate the data connector user experience, and to submit your connector and the relevant documentation.

  1. Review the data connector template guidance – This helps you get familiar with the nomenclature used in the templates and makes it easier to fill out the json template.
  2. Use the template – Download the right template for your data connector type from the following, rename the json file to ProviderNameApplianceName.json (no spaces in the name), and fill out the template per the guidance mentioned above.
  3. Validate the connector UX – Follow these steps to render and validate the connector UX you just built:
    1. Access the test utility at this URL – https://portal.azure.com/?feature.BringYourOwnConnector=true
    2. Go to Azure Sentinel -> Data Connectors
    3. Click the “import” button and select the json file you created.
    4. The json file you just created is loaded. Validate the connector UX by ensuring all links resolve appropriately with no errors (including query links) in both the main and next-steps pages; check for content accuracy, grammar, typos and formatting. Update the json as needed and reload to revalidate.

Note: This json is loaded only in your session and is not shared. The logo won't show up, since it's not part of the json. The connector logo will be included when Microsoft builds and deploys the data connector.
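For orientation, the connector json generally contains fields like the following. This is a trimmed, hypothetical sketch with placeholder names (MyAppliance, Contoso); the template and guidance above are the authoritative reference for the full field list:

```json
{
  "id": "MyApplianceConnector",
  "title": "My Appliance",
  "publisher": "Contoso",
  "descriptionMarkdown": "Connect My Appliance logs to Azure Sentinel.",
  "graphQueries": [
    {
      "metricName": "Total data received",
      "legend": "MyAppliance_CL",
      "baseQuery": "MyAppliance_CL"
    }
  ],
  "sampleQueries": [
    {
      "description": "All My Appliance events",
      "query": "MyAppliance_CL | sort by TimeGenerated desc"
    }
  ],
  "dataTypes": [
    {
      "name": "MyAppliance_CL",
      "lastDataReceivedQuery": "MyAppliance_CL | summarize Time = max(TimeGenerated) | where isnotempty(Time)"
    }
  ],
  "connectivityCriterias": [
    {
      "type": "IsConnectedQuery",
      "value": [
        "MyAppliance_CL | summarize LastLogReceived = max(TimeGenerated) | project IsConnected = LastLogReceived > ago(30d)"
      ]
    }
  ],
  "instructionSteps": [
    {
      "title": "Configure the appliance",
      "description": "Point your appliance's log forwarder at the workspace.",
      "instructions": []
    }
  ]
}
```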

  1. Prepare sample data for validation and submission – Plan to submit some real-world, sanitized sample data for your connector, covering all types of logs, events, alerts, etc., depending on the data type. This is the test validation set that can be used to build other contribution types on top of this data connector. The file can be json or csv (json preferred), with the column / property names adhering to the data type property names. The data file name needs to be the same as the data type name. Submit the sample data file via a GitHub PR to the right subfolder of the 'Sample data' folder – CEF / Syslog / Custom – depending on the type of data connector.

  2. Submit your data connector - Follow the general contribution guidelines for Azure Sentinel to open a Pull Request (PR) to submit the data connector:

    1. The json file in the 'Connectors' folder
    2. The sample data file in the right subfolder of 'Sample data' folder
    3. The company logo, adhering to the following requirements, in the 'Logo' folder
      1. Logo needs to be in SVG format and under 5 Kb
      2. Ensure raw file of logo does not have any of the following:
        • cls and style formats
        • embedded png formats
        • xmlns:xlink
        • data-name
      3. Do not use xlink:href - use inline instead
      4. Do not use title tag
      5. If some properties in the logo have IDs (e.g. <g id="layer0"...), set these IDs as GUIDs so that they are uniquely identifiable across all Azure logo assets
      6. Ensure the logo scales well to fit in a 75 px square
      7. Ensure the SVG code file is formatted for readability
    4. For a Syslog data connector, the Kusto function parser in the right subfolder (PROVIDERNAME) of the 'Parsers' folder
    5. If you are bringing in detections or hunting queries, the requiredDataConnectors section of the YAML template must be populated. Details of what to reference in the YAML template from the connector JSON are in the Query Style Guide under requiredDataConnectors
  3. Prepare and submit your data connector documentation – Besides Azure Sentinel gallery discoverability, connectors can also be discovered out of product, in documentation.

    1. Download one of the following templates depending on the type of data connector, name it PROVIDER NAME APPLIANCE NAME.md, and fill out the template per the guidance in the template. Replace the guidance in the template with the relevant steps.
    2. Validate the md file for formatting and ensure all links resolve appropriately. You can use VS Code or any other editor that supports md file editing.
    3. Once validated, email the md file to AzureSentinelPartner@microsoft.com

Validate and sign off in production

Once the connector is deployed in production, we will share a link for you to privately access your data connector. Validate your connector:

  1. Ensure data flows as expected and appears in the expected format in the right Log Analytics table
  2. Ensure the sample queries shared with the connector execute as expected, as well as all the other queries that appear in the json file, such as graphQueries, dataTypes, etc.
  3. Validate the connector UX by ensuring all links resolve appropriately with no errors (including query links) in both the main and next-steps pages; check for content accuracy, grammar, typos, formatting and logo rendering
  4. If you include Kusto functions, or your sample queries and workbooks depend on certain Kusto functions, ensure those work as expected and that the dependency is called out in the connector UX (in the Configuration section at the beginning, and as a banner in the next-steps section of the connector)

    Once everything looks as expected, send an email with your sign-off to AzureSentinelPartner@microsoft.com to get your connector shipped in public preview.

Connector ships in public preview

Promote your connector to get installs and customer feedback, and support connector issues reported by customers. Issues may be generic data-flow problems, which you handle on the provider side; connector UX or query issues can be fixed by submitting a PR on the respective file and informing AzureSentinelPartner@microsoft.com for deployment.

Exit criteria for connector GA

Once the data connector is in public preview for at least a month, send an email with the following info to AzureSentinelPartner@microsoft.com to get the connector to GA.

  • The data connector has sample queries and workbooks to visualize and use the data effectively in Azure Sentinel.
  • The data connector has at least 10 unique customers.
  • No major unresolved customer-reported incidents with the data connector in the month after release.

Evolve the data experience

Workbooks

Follow the steps to build your workbook, then submit via a PR, as mentioned in the instructions: the workbook json file, two screenshots of the workbook view (one each in the white and black background theme settings), the logo, and an entry in the workbooksMetadata.json file.

Analytic Rule Templates

Follow the steps to build and submit your analytic rule template, or detection, pertaining to this data connector. Be sure to fill in the requiredDataConnectors parameter with the right data connector ID(s) to relate the analytic rule template to the data connector.
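In the detection's YAML template, the relation to the connector looks roughly like this sketch. The connectorId and data type names are hypothetical placeholders that must match your connector json; the Query Style Guide is the authoritative reference:

```yaml
# Sketch - use the id and dataTypes values from your connector json
requiredDataConnectors:
  - connectorId: MyApplianceConnector
    dataTypes:
      - MyAppliance_CL
```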

Logic Apps Connectors

Build logic apps connectors to enable automation capabilities for customers in the following areas:

  1. Incident management – e.g. assign a ticket to an analyst, keep ticket status in sync, …
  2. Enrichment and investigation – e.g. geo-lookup for an IP, sending investigation emails, …
  3. Remediation – e.g. block an IP address, block user access, isolate a machine, …
  4. Any other automation capabilities unique to your appliance.

Follow the steps in the Azure Logic Apps custom connectors documentation to create, certify and ship an Azure Logic Apps connector. It is not only discoverable by Azure Sentinel customers, but also visible in the Azure Logic Apps gallery for Azure Logic Apps and Microsoft Flow customers. Inform AzureSentinelPartner@microsoft.com if you are thinking of building a custom connector for your security appliance.

Other data experience options

Check out the Azure Sentinel GitHub repo for more information on these.