Merge pull request #950 from Azure/dicolanl-32

AWS Lambda Function
This commit is contained in:
dicolanl 2020-08-07 18:50:22 -07:00 committed by GitHub
Parents: be4a31ec12 ab4656fed4
Commit: 8f9d46c6b1
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
11 changed files with 306 additions and 0 deletions

Binary data
DataConnectors/S3-Lambda/Graphics/Picture1.png Normal file

Binary file not shown. Size: 78 KiB

Binary data
DataConnectors/S3-Lambda/Graphics/Picture2.png Normal file

Binary file not shown. Size: 50 KiB

Binary data
DataConnectors/S3-Lambda/Graphics/Picture3.png Normal file

Binary file not shown. Size: 78 KiB

Binary data
DataConnectors/S3-Lambda/Graphics/Picture4.png Normal file

Binary file not shown. Size: 57 KiB

Binary data
DataConnectors/S3-Lambda/Graphics/Picture5.png Normal file

Binary file not shown. Size: 73 KiB

Binary data
DataConnectors/S3-Lambda/Graphics/Picture6.png Normal file

Binary file not shown. Size: 109 KiB

Binary data
DataConnectors/S3-Lambda/Graphics/Picture7.png Normal file

Binary file not shown. Size: 148 KiB

Binary data
DataConnectors/S3-Lambda/Graphics/Picture8.png Normal file

Binary file not shown. Size: 116 KiB

View file

@@ -0,0 +1,55 @@
# AWS Lambda Function to import S3 Files to Azure Sentinel
This Lambda function is designed to read files from an S3 bucket on upload and send them to Azure Sentinel using the Log Analytics API.
## Deployment
### Machine Setup
To deploy this, you will need a machine prepared with the following:
- PowerShell Core – PowerShell 7 is recommended
- .NET Core 3.1 SDK
- AWSLambdaPSCore module – install it from the PowerShell Gallery using the following PowerShell Core command:
```powershell
Install-Module AWSLambdaPSCore -Scope CurrentUser
```
See the documentation at https://docs.aws.amazon.com/lambda/latest/dg/powershell-devenv.html, and review https://docs.aws.amazon.com/lambda/latest/dg/powershell-package.html for the cmdlets that are part of AWSLambdaPSCore.
### Create AWS Role
The Lambda function will need an execution role defined that grants access to the S3 bucket and CloudWatch logs. To create an execution role:
1. Open the [roles](https://console.aws.amazon.com/iam/home#/roles) page in the IAM console.
2. Choose Create role.
3. Create a role with the following properties.
- Trusted entity – AWS Lambda.
- Permissions – AWSLambdaExecute.
- Role name – lambda-s3-role.
The AWSLambdaExecute policy has the permissions that the function needs to manage objects in Amazon S3 and write logs to CloudWatch Logs. Copy the ARN of the new role; you will need it in the next step.
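The console steps above can also be scripted. Below is a minimal sketch using the AWS Tools for PowerShell, assuming the AWS.Tools.IdentityManagement module is installed and your AWS credentials are configured; the trust policy shown is the standard one for Lambda:
```powershell
Import-Module AWS.Tools.IdentityManagement

# Standard trust policy that lets the Lambda service assume the role
$trustPolicy = @'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
'@

# Create the role and attach the AWSLambdaExecute managed policy
$role = New-IAMRole -RoleName 'lambda-s3-role' -AssumeRolePolicyDocument $trustPolicy
Register-IAMRolePolicy -RoleName 'lambda-s3-role' -PolicyArn 'arn:aws:iam::aws:policy/AWSLambdaExecute'

# This is the ARN you will pass to Publish-AWSPowerShellLambda
$role.Arn
```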
### Create the Lambda Function
To deploy the PowerShell script, you can create a package (zip file) to upload to the AWS console, or you can use the Publish-AWSPowerShellLambda cmdlet:
```powershell
Publish-AWSPowerShellLambda -Name YourLambdaNameHere -ScriptPath <path>/S3Event.ps1 -Region <region> -IAMRoleArn <arn of role created earlier> -ProfileName <profile>
```
You might need -ProfileName if your .aws/credentials file doesn't contain a default profile. See the AWS documentation for information on setting up your AWS credentials.
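If you have not configured credentials yet, one option is the Set-AWSCredential cmdlet from AWS.Tools.Common; a quick sketch with placeholder keys:
```powershell
# Store a profile in the local AWS credential store (placeholder values)
Set-AWSCredential -AccessKey 'AKIA...' -SecretKey '<secret key>' -StoreAs 'default'

# List stored profiles to confirm
Get-AWSCredential -ListProfileDetail
```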
### Edit the Lambda Trigger
1. Once created, log in to the AWS console. In Find services, search for Lambda, then click Lambda.
![Picture1](./Graphics/Picture1.png)
2. Click the Lambda function name you used with the cmdlet, then click Add trigger.
![Picture2](./Graphics/Picture2.png)
3. Select S3. Select the bucket. Acknowledge the warning at the bottom. Click Add.
![Picture3](./Graphics/Picture3.png)
4. Your lambda function is ready to send data to Log Analytics.
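For reference, the function reads the bucket name and object key out of each S3 event record. An abridged, illustrative sample of the event shape (see the notification-content-structure documentation referenced in the script):
```powershell
# Abridged S3 "ObjectCreated" event as the function receives it in $LambdaInput
# (bucket and key values are illustrative)
$LambdaInput = @'
{
  "Records": [
    {
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": { "name": "my-log-bucket" },
        "object": { "key": "firewall.log", "size": 1024 }
      }
    }
  ]
}
'@ | ConvertFrom-Json

# The handler reads these two fields from each record
$LambdaInput.Records[0].s3.bucket.name   # my-log-bucket
$LambdaInput.Records[0].s3.object.key    # firewall.log
```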
### Test the function
1. To test your function, upload a supported file to the S3 bucket defined in the trigger.
![Picture4](./Graphics/Picture4.png)
2. To see the logs, go to the Lambda function, click the Monitoring tab, and click View logs in CloudWatch.
![Picture5](./Graphics/Picture5.png)
3. In CloudWatch, you will see a log stream for each run. Select the latest.
![Picture6](./Graphics/Picture6.png)
4. Here you can see any output the script writes with the Write-Host cmdlet.
![Picture7](./Graphics/Picture7.png)
5. Go to portal.azure.com and verify your data has arrived in the custom log.
![Picture8](./Graphics/Picture8.png)
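You can also verify ingestion from PowerShell. Below is a minimal sketch using the Az.OperationalInsights module; the table name CustomLog_CL assumes the script's default $CustomLogName of "CustomLog", since Log Analytics appends _CL to custom log names:
```powershell
# Requires the Az.Accounts and Az.OperationalInsights modules
Connect-AzAccount

# Query the custom log table created by the Lambda function
$result = Invoke-AzOperationalInsightsQuery -WorkspaceId '<workspace id>' `
    -Query 'CustomLog_CL | take 10'
$result.Results
```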

View file

@@ -0,0 +1,251 @@
# PowerShell script file to be executed as an AWS Lambda function.
#
# When executing in Lambda the following variables will be predefined.
# $LambdaInput - A PSObject that contains the Lambda function input data.
# $LambdaContext - An Amazon.Lambda.Core.ILambdaContext object that contains information about the currently running Lambda environment.
#
# The last item in the PowerShell pipeline will be returned as the result of the Lambda function.
#
# To include PowerShell modules with your Lambda function, like the AWS.Tools.S3 module, add a "#Requires" statement
# indicating the module and version. If using an AWS.Tools.* module the AWS.Tools.Common module is also required.
#
# The following link contains documentation describing the structure of the S3 event object.
# https://docs.aws.amazon.com/AmazonS3/latest/dev/notification-content-structure.html
#Requires -Modules @{ModuleName='AWS.Tools.Common';ModuleVersion='4.0.6.0'}
#Requires -Modules @{ModuleName='AWS.Tools.S3';ModuleVersion='4.0.6.0'}
# Uncomment to send the input event to CloudWatch Logs
# Write-Host (ConvertTo-Json -InputObject $LambdaInput -Compress -Depth 5)
foreach ($record in $LambdaInput.Records) {
#Credit to Travis Roberts for the function from https://github.com/tsrob50/LogAnalyticsAPIFunction/blob/master/Write-OMSLogfile.ps1
function Write-OMSLogfile {
<#
.SYNOPSIS
Inputs a hashtable, date and workspace type and writes it to a Log Analytics Workspace.
.DESCRIPTION
Given a hash table of key-value pairs, this function will write the data to an OMS Log Analytics workspace.
Certain variables, such as Customer ID and Shared Key, are specific to the OMS workspace the data is being written to.
This function will not write to multiple OMS workspaces. The Build-signature and Post-LogAnalyticsData functions are from the Microsoft documentation
at https://docs.microsoft.com/azure/log-analytics/log-analytics-data-collector-api
.PARAMETER DateTime
Date and time for the log, as a DateTime value.
.PARAMETER Type
Name of the logfile or Log Analytics "Type". Log Analytics appends _CL to custom log names. String value.
.PARAMETER LogData
A series of key-value pairs that will be written to the log. Log files are unstructured, but the keys should be consistent
within each source.
.INPUTS
The parameters of date and time, type, and logdata. Logdata is converted to JSON to submit to Log Analytics.
.OUTPUTS
The Function will return the HTTP status code from the Post method. Status code 200 indicates the request was received.
.NOTES
Version: 2.0
Author: Travis Roberts
Creation Date: 7/9/2018
Purpose/Change: Creating a stand-alone function.
.EXAMPLE
This Example will log data to the "LoggingTest" Log Analytics table
$type = 'LoggingTest'
$dateTime = Get-Date
$data = @{
ErrorText = 'This is a test message'
ErrorNumber = 1985
}
$returnCode = Write-OMSLogfile $dateTime $type $data -CustomerID $CustomerID -SharedKey $SharedKey -Verbose
write-output $returnCode
#>
[cmdletbinding()]
Param(
[Parameter(Mandatory = $true, Position = 0)]
[datetime]$dateTime,
[parameter(Mandatory = $true, Position = 1)]
[string]$type,
[Parameter(Mandatory = $true, Position = 2)]
[psobject]$logdata,
[Parameter(Mandatory = $true, Position = 3)]
[string]$CustomerID,
[Parameter(Mandatory = $true, Position = 4)]
[string]$SharedKey
)
Write-Verbose -Message "DateTime: $dateTime"
Write-Verbose -Message ('DateTimeKind:' + $dateTime.kind)
Write-Verbose -Message "Type: $type"
write-Verbose -Message "LogData: $logdata"
#region Workspace ID and Key
# Workspace ID for the workspace
#$CustomerID = 'ENTER WORKSPACE ID HERE'
# Shared key needs to be set for environment
# Below uses an encrypted variable from Azure Automation
# Uncomment the next two lines if using Azure Automation Variable and comment the last
# $automationVarName = 'Enter Variable Name Here'
# $sharedKey = Get-AutomationVariable -name $automationVarName
# Key Vault is another secure option for storing the value
# Less secure option is to put the key in the code
#$SharedKey = 'ENTER WORKSPACE KEY HERE'
#endregion
# Supporting Functions
# Function to create the auth signature
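# The Data Collector API signature is an HMAC-SHA256 over the string:
#   POST\n{content length}\napplication/json\nx-ms-date:{rfc1123 date}\n/api/logs
# keyed with the Base64-decoded workspace shared key, then Base64-encoded and
# sent as 'SharedKey {workspaceId}:{hash}'.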
function Build-signature ($CustomerID, $SharedKey, $Date, $ContentLength, $method, $ContentType, $resource) {
$xheaders = 'x-ms-date:' + $Date
$stringToHash = $method + "`n" + $contentLength + "`n" + $contentType + "`n" + $xHeaders + "`n" + $resource
$bytesToHash = [text.Encoding]::UTF8.GetBytes($stringToHash)
$keyBytes = [Convert]::FromBase64String($SharedKey)
$sha256 = New-Object System.Security.Cryptography.HMACSHA256
$sha256.key = $keyBytes
$calculateHash = $sha256.ComputeHash($bytesToHash)
$encodeHash = [convert]::ToBase64String($calculateHash)
$authorization = 'SharedKey {0}:{1}' -f $CustomerID, $encodeHash
return $authorization
}
# Function to create and post the request
Function Post-LogAnalyticsData ($CustomerID, $SharedKey, $Body, $Type) {
$method = "POST"
$ContentType = 'application/json'
$resource = '/api/logs'
$rfc1123date = ($dateTime).ToString('r')
$ContentLength = $Body.Length
$signature = Build-signature `
-customerId $CustomerID `
-sharedKey $SharedKey `
-date $rfc1123date `
-contentLength $ContentLength `
-method $method `
-contentType $ContentType `
-resource $resource
$uri = "https://" + $customerId + ".ods.opinsights.azure.com" + $resource + "?api-version=2016-04-01"
$headers = @{
"Authorization" = $signature;
"Log-Type" = $type;
"x-ms-date" = $rfc1123date
"time-generated-field" = $dateTime
}
$response = Invoke-WebRequest -Uri $uri -Method $method -ContentType $ContentType -Headers $headers -Body $body -UseBasicParsing
Write-Verbose -message ('Post Function Return Code ' + $response.statuscode)
return $response.statuscode
}
# Check if time is UTC, Convert to UTC if not.
# $dateTime = (Get-Date)
if ($dateTime.kind.tostring() -ne 'Utc') {
$dateTime = $dateTime.ToUniversalTime()
Write-Verbose -Message $dateTime
}
# Add DateTime to hashtable
#$logdata.add("DateTime", $dateTime)
$logdata | Add-Member -MemberType NoteProperty -Name "DateTime" -Value $dateTime
#Build the JSON file
$logMessage = ConvertTo-Json $logdata -Depth 20
Write-Verbose -Message $logMessage
#Submit the data
$returnCode = Post-LogAnalyticsData -CustomerID $CustomerID -SharedKey $SharedKey -Body ([System.Text.Encoding]::UTF8.GetBytes($logMessage)) -Type $type
Write-Verbose -Message "Post Statement Return Code $returnCode"
return $returnCode
}
$bucket = $record.s3.bucket.name
$key = $record.s3.object.key
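# Replace the placeholder values below with your Log Analytics workspace ID, workspace key, and target custom log name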
$workspaceId = "workspaceId"
$workspaceKey = "workspaceKey"
$CustomLogName = "CustomLog"
Write-Host "Processing event for: bucket = $bucket, key = $key"
# Retrieve the S3 object metadata for the uploaded file
$obj = Get-S3Object -BucketName $bucket -Key $key
Write-Host "Object $key is $($obj.Size) bytes"
#Download the file
Write-Host "Downloading $key to /tmp/$key"
Read-S3Object -BucketName $bucket -Key $key -File "/tmp/$key"
Write-Host "Downloaded $key to /tmp/$key"
#Determine the file type: CSV, JSON, or log (CEF)
$FileName = "/tmp/$key"
if ($fileName -like "*.csv") {
Write-Host "Handling CSV File"
$data = import-csv $filename
}
elseif ($filename -like "*.json") {
Write-Host "Handling JSON File"
$Data = Get-Content $filename | ConvertFrom-Json
}
elseif ($filename -like "*.log") {
Write-Host "Handling Log File"
#Assuming CEF formatted logs
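# A CEF line looks like this (illustrative example):
#   Sep 19 08:26:10 host CEF:0|Vendor|Product|1.0|100|Event Name|5|src=10.0.0.1 dst=10.0.0.2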
$cefdata = Get-Content $filename
$data = @()
$cefmsg = @{}
foreach ($line in $cefdata) {
if ($line -like "*CEF:*") {
#Write-Host "Handling CEF Data"
$CEFtimegenerated = ($line -split '(?<time>(?:\w+ +){2,3}(?:\d+:){2}\d+|\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}.[\w\-\:\+]{3,12})')[1]
#$CEFHost = (($line -split '(?<time>(?:\w+ +){2,3}(?:\d+:){2}\d+|\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}.[\w\-\:\+]{3,12})')[2] -split "CEF:")[0]
#$CEFVersion = $line.Split("CEF: ").Split("|")[1]
$CEFDeviceVendor = $line.split("|")[1]
$CEFDeviceProduct = $line.split("|")[2]
$CEFDeviceVersion = $line.split("|")[3]
$CEFDeviceEventClassId = $line.split("|")[4]
$CEFName = $line.split("|")[5]
$CEFSeverity = $line.split("|")[6]
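# Tokenize the CEF extension field into key=value pairs (values may contain escaped '=')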
$CEFExtension = $line.split("|")[7] -split '([^=\s]+=(?:[\\]=|[^=])+)(?:\s|$)'
foreach ($extension in $CEFExtension) {
if ($extension -like "*=*") { $cefmsg += @{$extension.Split("=")[0] = $extension.Split("=")[1] } }
}
$CEFmsg += @{TimeGenerated = $CEFtimegenerated }
$CEFmsg += @{DeviceVendor = $CEFDeviceVendor }
$CEFmsg += @{DeviceProduct = $CEFDeviceProduct }
$CEFmsg += @{DeviceVersion = $CEFDeviceVersion }
$CEFmsg += @{DeviceEventClassID = $CEFDeviceEventClassId }
$CEFmsg += @{Activity = $CEFName }
$CEFmsg += @{LogSeverity = $CEFSeverity }
$data += $CEFmsg
$cefmsg = @{}
}
}
Write-Host "Finished Handling Log file"
}
else { Write-Host "$filename is not supported yet" }
Write-Host "Number of data records: " ($Data.Count)
#Parsing Function to drop or add or edit fields
#FUTURE
#Test size: the Log Analytics API limit is 30 MB per post, so batches are split at 25 MB for headroom
$tempdata = @()
$tempDataSize = 0
Write-Host "Checking if upload is over 25MB"
if ((($Data | Convertto-json -depth 20).Length) -gt 25MB) {
Write-Host "Upload needs to be split"
foreach ($logRecord in $data) { # distinct loop variable so the outer $record (the S3 event record) is not shadowed
$tempdata += $logRecord
$tempDataSize += ($logRecord | ConvertTo-Json -Depth 20).Length
if ($tempDataSize -gt 25MB) {
Write-OMSLogfile -dateTime (Get-Date) -type $CustomLogName -logdata $tempdata -CustomerID $workspaceId -SharedKey $workspaceKey
write-Host "Sending data = $TempDataSize"
$tempdata = $null
$tempdata = @()
$tempDataSize = 0
}
}
Write-Host "Sending left over data = $Tempdatasize"
Write-OMSLogfile -dateTime (Get-Date) -type $CustomLogName -logdata $tempdata -CustomerID $workspaceId -SharedKey $workspaceKey
}
else {
#Send to Log A as is
Write-Host "Upload does not need to be split, sending to Log A"
Write-OMSLogfile -dateTime (Get-Date) -type $CustomLogName -logdata $Data -CustomerID $workspaceId -SharedKey $workspaceKey
}
Remove-Item $FileName -Force
}

Binary data
DataConnectors/S3-Lambda/S3toSentinel.zip Normal file

Binary file not shown.