Prepare for 2.3.6 & 2.2.6 release (#407)

SJ 2018-11-08 09:06:05 -08:00 committed by GitHub
Parent e0faa35fbc
Commit 1c198ba39a
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
8 changed files with 18 additions and 18 deletions

.github/CONTRIBUTING.md (vendored)

@@ -19,7 +19,7 @@ run all unit/integration tests and build a JAR.
### SBT Dependency
// https://mvnrepository.com/artifact/com.microsoft.azure/azure-eventhubs-spark_2.11
-libraryDependencies += "com.microsoft.azure" %% "azure-eventhubs-spark" %% "2.3.5"
+libraryDependencies += "com.microsoft.azure" %% "azure-eventhubs-spark" % "2.3.6"
## Filing Issues

FAQ.md

@@ -23,7 +23,7 @@ If that's the case, simply send fresh events to the Event Hubs and continue testing.
**Why am I getting a `ReceiverDisconnectedException`?**
-In version 2.3.4 and above, the connector uses epoch receivers from the Event Hubs Java client.
+In version 2.3.2 and above, the connector uses epoch receivers from the Event Hubs Java client.
This only allows one receiver to be open per consumer group-partition combo. To be crystal clear,
let's say we have `receiverA` with an epoch of `0` which is open within consumer group `foo` on partition `0`.
Now, if we open a new receiver, `receiverB`, for the same consumer group and partition with an epoch of
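The epoch arbitration rule the FAQ describes can be sketched with a small self-contained Scala model. This is only an illustration of the semantics, not the real azure-eventhubs client API; `PartitionSlot` and `open` are hypothetical names:

```scala
// Toy model of per consumer-group/partition epoch arbitration:
// a new receiver with a strictly higher epoch disconnects the current
// one; a lower-or-equal epoch is rejected with an error.
case class Receiver(name: String, epoch: Long)

class PartitionSlot {
  private var current: Option[Receiver] = None

  /** Returns the previously connected receiver (now disconnected), if any. */
  def open(r: Receiver): Either[String, Option[Receiver]] = current match {
    case Some(c) if r.epoch <= c.epoch =>
      Left(s"ReceiverDisconnectedException: epoch ${r.epoch} is not higher than ${c.epoch}")
    case prev =>
      current = Some(r)
      Right(prev)
  }
}

val slot = new PartitionSlot
slot.open(Receiver("receiverA", 0)) // connects: Right(None)
slot.open(Receiver("receiverB", 1)) // higher epoch disconnects receiverA
slot.open(Receiver("receiverC", 1)) // equal epoch rejected: Left(...)
```

This is why two Spark jobs sharing a consumer group and partition will disconnect each other: the later receiver's higher epoch always wins the slot.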


@@ -30,15 +30,15 @@ By making Event Hubs and Spark easier to use together, we hope this connector ma
#### Spark
|Spark Version|Package Name|Package Version|
|-------------|------------|----------------|
-|Spark 2.3|azure-eventhubs-spark_2.11|[![Maven Central](https://img.shields.io/badge/maven%20central-2.3.5-brightgreen.svg)](https://search.maven.org/#artifactdetails%7Ccom.microsoft.azure%7Cazure-eventhubs-spark_2.11%7C2.3.5%7Cjar)|
-|Spark 2.2|azure-eventhubs-spark_2.11|[![Maven Central](https://img.shields.io/badge/maven%20central-2.2.5-blue.svg)](https://search.maven.org/#artifactdetails%7Ccom.microsoft.azure%7Cazure-eventhubs-spark_2.11%7C2.2.5%7Cjar)|
-|Spark 2.1|azure-eventhubs-spark_2.11|[![Maven Central](https://img.shields.io/badge/maven%20central-2.2.5-blue.svg)](https://search.maven.org/#artifactdetails%7Ccom.microsoft.azure%7Cazure-eventhubs-spark_2.11%7C2.2.5%7Cjar)|
+|Spark 2.3|azure-eventhubs-spark_2.11|[![Maven Central](https://img.shields.io/badge/maven%20central-2.3.6-brightgreen.svg)](https://search.maven.org/#artifactdetails%7Ccom.microsoft.azure%7Cazure-eventhubs-spark_2.11%7C2.3.6%7Cjar)|
+|Spark 2.2|azure-eventhubs-spark_2.11|[![Maven Central](https://img.shields.io/badge/maven%20central-2.2.6-blue.svg)](https://search.maven.org/#artifactdetails%7Ccom.microsoft.azure%7Cazure-eventhubs-spark_2.11%7C2.2.6%7Cjar)|
+|Spark 2.1|azure-eventhubs-spark_2.11|[![Maven Central](https://img.shields.io/badge/maven%20central-2.2.6-blue.svg)](https://search.maven.org/#artifactdetails%7Ccom.microsoft.azure%7Cazure-eventhubs-spark_2.11%7C2.2.6%7Cjar)|
#### Databricks
|Databricks Runtime Version|Artifact Id|Package Version|
|-------------|------------|----------------|
-|Databricks Runtime 4.X|azure-eventhubs-spark_2.11|[![Maven Central](https://img.shields.io/badge/maven%20central-2.3.5-brightgreen.svg)](https://search.maven.org/#artifactdetails%7Ccom.microsoft.azure%7Cazure-eventhubs-spark_2.11%7C2.3.5%7Cjar)|
-|Databricks Runtime 3.5|azure-eventhubs-spark_2.11|[![Maven Central](https://img.shields.io/badge/maven%20central-2.3.5-brightgreen.svg)](https://search.maven.org/#artifactdetails%7Ccom.microsoft.azure%7Cazure-eventhubs-spark_2.11%7C2.3.5%7Cjar)|
+|Databricks Runtime 4.X|azure-eventhubs-spark_2.11|[![Maven Central](https://img.shields.io/badge/maven%20central-2.3.6-brightgreen.svg)](https://search.maven.org/#artifactdetails%7Ccom.microsoft.azure%7Cazure-eventhubs-spark_2.11%7C2.3.6%7Cjar)|
+|Databricks Runtime 3.5|azure-eventhubs-spark_2.11|[![Maven Central](https://img.shields.io/badge/maven%20central-2.3.6-brightgreen.svg)](https://search.maven.org/#artifactdetails%7Ccom.microsoft.azure%7Cazure-eventhubs-spark_2.11%7C2.3.6%7Cjar)|
#### Roadmap
@@ -53,7 +53,7 @@ For Scala/Java applications using SBT/Maven project definitions, link your application
groupId = com.microsoft.azure
artifactId = azure-eventhubs-spark_2.11
-version = 2.3.5
+version = 2.3.6
### Documentation


@@ -23,7 +23,7 @@
<parent>
<groupId>com.microsoft.azure</groupId>
<artifactId>azure-eventhubs-spark-parent_2.11</artifactId>
-<version>2.3.5</version>
+<version>2.3.6</version>
<relativePath>../pom.xml</relativePath>
</parent>
<artifactId>azure-eventhubs-spark_2.11</artifactId>


@@ -23,7 +23,7 @@ Structured streaming integration for Azure Event Hubs is ultimately run on the JVM
```
groupId = com.microsoft.azure
artifactId = azure-eventhubs-spark_2.11
-version = 2.3.5
+version = 2.3.6
```
For Python applications, you need to add the above library and its dependencies when deploying your application.
@@ -387,11 +387,11 @@ AMQP types need to be handled explicitly by the connector. Below we list the AMQ
As with any Spark applications, `spark-submit` is used to launch your application. `azure-eventhubs-spark_2.11`
and its dependencies can be directly added to `spark-submit` using `--packages`, such as,
-./bin/spark-submit --packages com.microsoft.azure:azure-eventhubs-spark_2.11:2.3.5 ...
+./bin/spark-submit --packages com.microsoft.azure:azure-eventhubs-spark_2.11:2.3.6 ...
For experimenting on `spark-shell`, you can also use `--packages` to add `azure-eventhubs-spark_2.11` and its dependencies directly,
-./bin/spark-shell --packages com.microsoft.azure:azure-eventhubs-spark_2.11:2.3.5 ...
+./bin/spark-shell --packages com.microsoft.azure:azure-eventhubs-spark_2.11:2.3.6 ...
See [Application Submission Guide](https://spark.apache.org/docs/latest/submitting-applications.html) for more details about submitting
applications with external dependencies.


@@ -23,7 +23,7 @@ For Scala/Java applications using SBT/Maven project definitions, link your application
```
groupId = com.microsoft.azure
artifactId = azure-eventhubs-spark_2.11
-version = 2.3.5
+version = 2.3.6
```
For Python applications, you need to add the above library and its dependencies when deploying your application.


@@ -23,7 +23,7 @@ For Scala/Java applications using SBT/Maven project definitions, link your application
```
groupId = com.microsoft.azure
artifactId = azure-eventhubs-spark_2.11
-version = 2.3.5
+version = 2.3.6
```
For Python applications, you need to add the above library and its dependencies when deploying your application.
@@ -389,11 +389,11 @@ AMQP types need to be handled explicitly by the connector. Below we list the AMQ
As with any Spark applications, `spark-submit` is used to launch your application. `azure-eventhubs-spark_2.11`
and its dependencies can be directly added to `spark-submit` using `--packages`, such as,
-./bin/spark-submit --packages com.microsoft.azure:azure-eventhubs-spark_2.11:2.3.5 ...
+./bin/spark-submit --packages com.microsoft.azure:azure-eventhubs-spark_2.11:2.3.6 ...
For experimenting on `spark-shell`, you can also use `--packages` to add `azure-eventhubs-spark_2.11` and its dependencies directly,
-./bin/spark-shell --packages com.microsoft.azure:azure-eventhubs-spark_2.11:2.3.5 ...
+./bin/spark-shell --packages com.microsoft.azure:azure-eventhubs-spark_2.11:2.3.6 ...
See [Application Submission Guide](https://spark.apache.org/docs/latest/submitting-applications.html) for more details about submitting
applications with external dependencies.


@@ -25,7 +25,7 @@
<groupId>com.microsoft.azure</groupId>
<artifactId>azure-eventhubs-spark-parent_2.11</artifactId>
-<version>2.3.5</version>
+<version>2.3.6</version>
<packaging>pom</packaging>
<name>EventHubs+Spark Parent POM</name>
@@ -133,7 +133,7 @@
<dependency>
<groupId>com.microsoft.azure</groupId>
<artifactId>azure-eventhubs</artifactId>
-<version>1.2.0</version>
+<version>1.2.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>