Azure Event Hubs Connector for Apache Spark

This is the source code of the Azure Event Hubs Connector for Apache Spark.

Azure Event Hubs is a highly scalable publish-subscribe service that can ingest millions of events per second and stream them into multiple applications. Spark Streaming and Structured Streaming are scalable and fault-tolerant stream processing engines that allow users to process huge amounts of data using complex algorithms expressed with high-level functions like map, reduce, join, and window. This data can then be pushed to filesystems, databases, or even back to Event Hubs.
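As a quick illustration of that flow, the sketch below reads events from an Event Hub as a Structured Streaming DataFrame and echoes the decoded event bodies to the console. The connection string, Event Hub name, and starting position are placeholders; the full set of options is described in the Structured Streaming integration guide under docs.

```scala
// Minimal Structured Streaming sketch (placeholders for namespace, key, and hub name).
import org.apache.spark.eventhubs.{ConnectionStringBuilder, EventHubsConf, EventPosition}
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.col

object EventHubsConsoleEcho {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("eventhubs-console-echo").getOrCreate()

    // Build the Event Hubs connection string and configuration.
    val connectionString = ConnectionStringBuilder(
        "Endpoint=sb://<namespace>.servicebus.windows.net/;SharedAccessKeyName=<key-name>;SharedAccessKey=<key>")
      .setEventHubName("<event-hub-name>")
      .build

    val ehConf = EventHubsConf(connectionString)
      .setStartingPosition(EventPosition.fromEndOfStream)

    // Read events as a streaming DataFrame; the payload arrives in the binary `body` column.
    val events = spark.readStream
      .format("eventhubs")
      .options(ehConf.toMap)
      .load()

    // Decode the body and stream it to the console as a smoke test.
    val query = events
      .select(col("body").cast("string").as("body"))
      .writeStream
      .format("console")
      .start()

    query.awaitTermination()
  }
}
```

The console sink here is only a smoke test; in practice the processed stream would be written to files, a database, or back to Event Hubs as described above.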

By making Event Hubs and Spark easier to use together, we hope this connector makes building scalable, fault-tolerant applications easier for our users.

Latest Releases

Spark

| Spark Version | Package Name               | Package Version |
|---------------|----------------------------|-----------------|
| Spark 2.4     | azure-eventhubs-spark_2.11 | Maven Central   |
| Spark 2.3     | azure-eventhubs-spark_2.11 | Maven Central   |
| Spark 2.2     | azure-eventhubs-spark_2.11 | Maven Central   |
| Spark 2.1     | azure-eventhubs-spark_2.11 | Maven Central   |

Databricks

| Databricks Runtime Version | Artifact Id                | Package Version |
|----------------------------|----------------------------|-----------------|
| Databricks Runtime 5.X     | azure-eventhubs-spark_2.11 | Maven Central   |
| Databricks Runtime 4.X     | azure-eventhubs-spark_2.11 | Maven Central   |
| Databricks Runtime 3.5     | azure-eventhubs-spark_2.11 | Maven Central   |

Roadmap

There is an open issue for each planned feature/enhancement.

Usage

Linking

For Scala/Java applications using SBT/Maven project definitions, link your application with the artifact below. Note: See Latest Releases to find the correct artifact for your version of Apache Spark (or Databricks)!

groupId = com.microsoft.azure
artifactId = azure-eventhubs-spark_2.11
version = 2.3.13
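
For SBT, the same coordinates can be declared as in the sketch below; substitute the artifact and version that match your Spark (or Databricks) runtime from the Latest Releases tables.

```scala
// SBT dependency sketch. The Scala binary version is already part of the
// artifact id, so use a single % rather than %%.
libraryDependencies += "com.microsoft.azure" % "azure-eventhubs-spark_2.11" % "2.3.13"
```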

Documentation

Documentation for our connector can be found here. The integration guides there contain all the information you need to use this library.

If you're new to Apache Spark and/or Event Hubs, then we highly recommend reading their documentation first. You can read the Event Hubs documentation here, the documentation for Spark Streaming here, and, last but not least, Structured Streaming here.

FAQ

We maintain an FAQ - reach out to us via gitter if you think anything needs to be added or clarified!

Further Assistance

If you need additional assistance, please don't hesitate to ask! General questions and discussion should happen on our gitter chat, and please open an issue for bug reports and feature requests. Feedback of any kind is welcome!

Contributing

If you'd like to help contribute (we'd love to have your help!), then go to our Contributor's Guide for more information.

Build Prerequisites

In order to build and use the connector, you need to have a Java 8 SDK, Maven 3.x, and Scala 2.11 installed.

More details on building from source and running tests can be found in our Contributor's Guide.

Build Command

# Builds jar and runs all tests
mvn clean package

# Builds jar, runs all tests, and installs jar to your local Maven repository
mvn clean install