Logstash Output Plugin for Azure Data Explorer (Kusto)

This is a plugin for Logstash.

It is fully free and open source. The license is Apache 2.0.

This Azure Data Explorer (ADX) Logstash output plugin lets you send events processed by Logstash to an Azure Data Explorer database for later analysis.

Requirements

Installation

To make the Azure Data Explorer plugin available in your Logstash environment, run the following command:

bin/logstash-plugin install logstash-output-kusto

Configuration

Configure the plugin before sending events from Logstash to Azure Data Explorer. The following example shows the minimum configuration you need to provide; it should be enough for most use cases:

output {
  kusto {
    path => "/tmp/kusto/%{+YYYY-MM-dd-HH-mm}.txt"
    ingest_url => "https://ingest-<cluster-name>.kusto.windows.net/"
    app_id => "<application id>"
    app_key => "<application key/secret>"
    app_tenant => "<tenant id>"
    database => "<database name>"
    table => "<target table>"
    json_mapping => "<mapping name>"
  }
}
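
For orientation, here is a sketch of how the kusto output might sit in a complete pipeline. The file input path, the json filter, and every placeholder value are illustrative assumptions rather than part of this plugin's documentation; substitute your own sources and credentials.

# Hypothetical end-to-end pipeline: tail an application log, parse each line as JSON,
# and ship the resulting events to Azure Data Explorer.
input {
  file {
    path => "/var/log/myapp/*.log"      # assumed input location
    start_position => "beginning"
  }
}

filter {
  json {
    source => "message"                 # assumes each log line is a JSON document
  }
}

output {
  kusto {
    path => "/tmp/kusto/%{+YYYY-MM-dd-HH-mm}.txt"
    ingest_url => "https://ingest-<cluster-name>.kusto.windows.net/"
    app_id => "<application id>"
    app_key => "<application key/secret>"
    app_tenant => "<tenant id>"
    database => "<database name>"
    table => "<target table>"
    json_mapping => "<mapping name>"
  }
}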

More information about configuring Logstash can be found in the Logstash configuration guide.

Available Configuration Keys

| Parameter Name | Description | Notes |
| --- | --- | --- |
| path | The plugin writes events to temporary files before sending them to ADX. This parameter specifies the path where the files are written; the time expression in the file name drives file rotation, and each rotation triggers an upload to the ADX service. The example above rotates files every minute; see the Logstash docs for more information on time expressions. | Required |
| ingest_url | The Kusto endpoint for ingestion-related communication. You can find it in the Azure Portal. | Required |
| app_id, app_key, app_tenant | Credentials required to connect to the ADX service. Be sure to use an application with 'ingest' privileges. | Required |
| database | Name of the database in which to place events. | Required |
| table | Name of the target table in which to place events. | Required |
| json_mapping | Maps each attribute of the incoming event's JSON string to the appropriate column in the table. Note that the mapping must be in JSON format, as this is the interface between Logstash and Kusto. | Required |
| recovery | If set to true (the default), the plugin attempts to resend pre-existing temporary files found in the path on startup. | Optional |
| delete_temp_files | Determines whether temporary files are deleted after a successful upload (the default is true; set to false for debugging purposes only). | Optional |
| flush_interval | The time, in seconds, between flushes of writes to the temporary files. The default is 2; a value of 0 flushes on every event. Increase this value to reduce I/O calls, but keep in mind that events still in the buffer are lost on an abrupt failure. | Optional |
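
As a point of reference, the sketch below spells out the optional tuning keys from the table with their documented defaults. The required keys are elided for brevity, and the values shown simply restate the defaults above rather than additional recommendations.

output {
  kusto {
    # ... required keys as in the example above ...
    recovery => true             # resend leftover temporary files found under `path` at startup
    delete_temp_files => true    # set to false only when debugging uploads
    flush_interval => 2          # seconds between flushes; 0 flushes on every event
  }
}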

Development Requirements

To fully build the gem, run:

bundle install
lock_jars
gem build

Contributing

All contributions are welcome: ideas, patches, documentation, bug reports, and complaints. Programming is not a required skill. It is more important to the community that you are able to contribute. For more information about contributing, see the CONTRIBUTING file.