Logstash Output Plugin for Azure Data Explorer (Kusto)


This is a plugin for Logstash.

It is fully free and open source. The license is Apache 2.0.

This Azure Data Explorer (ADX) Logstash plugin enables you to ingest events from Logstash into an Azure Data Explorer database for later analysis.

Requirements

Installation

To make the Azure Data Explorer plugin available in your Logstash environment, run the following command:

bin/logstash-plugin install logstash-output-kusto
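
If you need to pin a specific release of the plugin, the same tool accepts a version flag, for example: bin/logstash-plugin install --version 0.2.0 logstash-output-kusto (0.2.0 is the current release at the time of writing).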

Configuration

Configure the plugin before sending events from Logstash to Azure Data Explorer. The following example shows the minimum configuration you need to provide; it should be enough for most use cases:

output {
  kusto {
    path => "/tmp/kusto/%{+YYYY-MM-dd-HH-mm}.txt"
    ingest_url => "https://ingest-<cluster-name>.kusto.windows.net/"
    app_id => "<application id>"
    app_key => "<application key/secret>"
    app_tenant => "<tenant id>"
    database => "<database name>"
    table => "<target table>"
    mapping => "<mapping name>"
  }
}

More information about configuring Logstash can be found in the Logstash configuration guide.
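
For orientation, here is a minimal end-to-end pipeline sketch that reads JSON lines from a file and forwards them to ADX. The input path, cluster name, and database/table/mapping names are hypothetical placeholders used only for illustration; the kusto output options are the same ones shown above.

input {
  file {
    path => "/var/log/myapp/*.json"        # hypothetical source files
    codec => "json"                        # parse each line as a JSON event
  }
}

output {
  kusto {
    path => "/tmp/kusto/%{+YYYY-MM-dd-HH-mm}.txt"                  # temp files, rotated every minute
    ingest_url => "https://ingest-mycluster.kusto.windows.net/"    # hypothetical cluster name
    app_id => "${KUSTO_APP_ID}"            # credentials, e.g. read from environment variables
    app_key => "${KUSTO_APP_KEY}"
    app_tenant => "${KUSTO_APP_TENANT}"
    database => "mydatabase"
    table => "mytable"
    mapping => "mymapping"
  }
}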

Available Configuration Keys

| Parameter Name | Description | Notes |
| --- | --- | --- |
| path | The plugin writes events to temporary files before sending them to ADX. This parameter specifies the path where the files are written and a time expression for file rotation, which triggers an upload to the ADX service. The example above rotates the files every minute; see the Logstash docs for more information on time expressions. | Required |
| ingest_url | The Kusto endpoint for ingestion-related communication. You can find it on the Azure Portal. | Required |
| app_id, app_key, app_tenant | Credentials required to connect to the ADX service. Be sure to use an application with 'ingest' privileges. | Required |
| database | Database name into which events are placed. | Required |
| table | Target table name into which events are placed. | Required |
| mapping | Mapping is used to map an incoming event JSON string into the correct row format (which property goes into which column). | Required |
| recovery | If set to true (the default), the plugin will attempt to resend pre-existing temp files found in the path upon startup. | |
| delete_temp_files | Determines whether temp files are deleted after a successful upload (true is the default; set to false for debugging purposes only). | |
| flush_interval | The time (in seconds) for flushing writes to temporary files. The default is 2 seconds; 0 flushes on every event. Increase this value to reduce I/O calls, but keep in mind that events in the buffer will be lost in case of abrupt failure. | |
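
To illustrate the optional keys, here is a sketch of an output block that sets them explicitly; the values shown are examples only, not recommendations, and the cluster and credential placeholders are the same as in the earlier example.

output {
  kusto {
    path => "/tmp/kusto/%{+YYYY-MM-dd-HH-mm}.txt"
    ingest_url => "https://ingest-<cluster-name>.kusto.windows.net/"
    app_id => "<application id>"
    app_key => "<application key/secret>"
    app_tenant => "<tenant id>"
    database => "<database name>"
    table => "<target table>"
    mapping => "<mapping name>"
    recovery => true               # resend leftover temp files on startup (default)
    delete_temp_files => true      # remove temp files after a successful upload (default)
    flush_interval => 10           # flush writes every 10 seconds instead of the default 2
  }
}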

Contributing

All contributions are welcome: ideas, patches, documentation, bug reports, and complaints. Programming is not a required skill; it is more important to the community that you are able to contribute. For more information about contributing, see the CONTRIBUTING file.