Terry Kim 2019-06-03 11:54:29 -07:00 committed by GitHub
Parent aa3eb1381d
Commit eb26baa462
No key found matching this signature
GPG key ID: 4AEE18F83AFDEB23
6 changed files with 47 additions and 6 deletions

View file

@@ -12,6 +12,7 @@
 ## Table of Contents
+- [Supported Apache Spark](#supported-apache-spark)
 - [Get Started](#get-started)
 - [Build Status](#build-status)
 - [Building from Source](#building-from-source)
@@ -23,6 +24,37 @@
 - [Code of Conduct](#code-of-conduct)
 - [License](#license)
+## Supported Apache Spark
+<table>
+<thead>
+<tr>
+<th>Apache Spark</th>
+<th>.NET for Apache Spark</th>
+</tr>
+</thead>
+<tbody align="center">
+<tr>
+<td >2.3.*</td>
+<td rowspan=3><a href="https://github.com/dotnet/spark/releases/tag/v0.2.0">v0.2.0</a></td>
+</tr>
+<tr>
+<td>2.4.0</td>
+</tr>
+<tr>
+<td>2.4.1</td>
+</tr>
+<tr>
+<td>2.4.2</td>
+<td><a href="https://github.com/dotnet/spark/issues/60">Not supported</a></td>
+</tr>
+<tr>
+<td>2.4.3</td>
+<td>master branch</td>
+</tr>
+</tbody>
+</table>
 ## Get Started
 These instructions will show you how to run a .NET for Apache Spark app using .NET Core.
 - [Windows Instructions](docs/getting-started/windows-instructions.md)

View file

@@ -123,6 +123,17 @@ jobs:
       HADOOP_HOME: $(Build.BinariesDirectory)\hadoop
       DOTNET_WORKER_DIR: $(Build.ArtifactStagingDirectory)\Microsoft.Spark.Worker\netcoreapp2.1\win-x64
+  - task: DotNetCoreCLI@2
+    displayName: 'E2E tests for Spark 2.4.3'
+    inputs:
+      command: test
+      projects: '**/Microsoft.Spark.E2ETest/*.csproj'
+      arguments: '--configuration $(buildConfiguration) /p:CollectCoverage=true /p:CoverletOutputFormat=cobertura'
+    env:
+      SPARK_HOME: $(Build.BinariesDirectory)\spark-2.4.3-bin-hadoop2.7
+      HADOOP_HOME: $(Build.BinariesDirectory)\hadoop
+      DOTNET_WORKER_DIR: $(Build.ArtifactStagingDirectory)\Microsoft.Spark.Worker\netcoreapp2.1\win-x64
   - ${{ if and(ne(variables['System.TeamProject'], 'public'), notin(variables['Build.Reason'], 'PullRequest')) }}:
     - task: CopyFiles@2
       displayName: Stage .NET artifacts
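
The new E2E task reuses the same three environment variables the earlier Spark jobs set. As a rough sketch only, assuming nothing about the actual Microsoft.Spark.E2ETest code, resolving those variables on the test side could look like this:

```scala
// Hypothetical sketch: the environment variables the new E2E task sets
// (SPARK_HOME, HADOOP_HOME, DOTNET_WORKER_DIR), resolved at test startup.
// The variable names come from the pipeline above; the resolver is illustrative.
object E2eEnvCheck {
  private def requireEnv(name: String): String =
    sys.env.getOrElse(name, sys.error(s"$name must be set before running the E2E tests"))

  def main(args: Array[String]): Unit = {
    val sparkHome  = requireEnv("SPARK_HOME")        // e.g. ...\spark-2.4.3-bin-hadoop2.7
    val hadoopHome = requireEnv("HADOOP_HOME")       // hadoop/winutils directory on Windows agents
    val workerDir  = requireEnv("DOTNET_WORKER_DIR") // published Microsoft.Spark.Worker
    println(s"SPARK_HOME=$sparkHome HADOOP_HOME=$hadoopHome DOTNET_WORKER_DIR=$workerDir")
  }
}
```

In the task above, SPARK_HOME points at the extracted spark-2.4.3-bin-hadoop2.7 directory and DOTNET_WORKER_DIR at the published worker, so the 2.4.3 tests run against exactly the binaries staged earlier in the pipeline.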

View file

@@ -38,10 +38,7 @@ The following table outlines the supported Spark versions along with the microso
 </tr>
 <tr>
 <td>2.4.0</td>
-<td rowspan=4>microsoft-spark-2.4.x-0.2.0.jar</td>
-</tr>
-<tr>
-<td>2.4.0</td>
+<td rowspan=2>microsoft-spark-2.4.x-0.2.0.jar</td>
 </tr>
 <tr>
 <td>2.4.1</td>

View file

@@ -19,5 +19,6 @@ curl -k -L -o spark-2.3.2.tgz https://archive.apache.org/dist/spark/spark-2.3.2/
 curl -k -L -o spark-2.3.3.tgz https://archive.apache.org/dist/spark/spark-2.3.3/spark-2.3.3-bin-hadoop2.7.tgz && tar xzvf spark-2.3.3.tgz
 curl -k -L -o spark-2.4.0.tgz https://archive.apache.org/dist/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz && tar xzvf spark-2.4.0.tgz
 curl -k -L -o spark-2.4.1.tgz https://archive.apache.org/dist/spark/spark-2.4.1/spark-2.4.1-bin-hadoop2.7.tgz && tar xzvf spark-2.4.1.tgz
+curl -k -L -o spark-2.4.3.tgz https://archive.apache.org/dist/spark/spark-2.4.3/spark-2.4.3-bin-hadoop2.7.tgz && tar xzvf spark-2.4.3.tgz
 endlocal

View file

@@ -12,7 +12,7 @@
 <encoding>UTF-8</encoding>
 <scala.version>2.11.8</scala.version>
 <scala.binary.version>2.11</scala.binary.version>
-<spark.version>2.4.1</spark.version>
+<spark.version>2.4.3</spark.version>
 </properties>
 <dependencies>

View file

@@ -33,7 +33,7 @@ import scala.util.Try
  */
 object DotnetRunner extends Logging {
   private val DEBUG_PORT = 5567
-  private val supportedSparkVersions = Set[String]("2.4.0", "2.4.1")
+  private val supportedSparkVersions = Set[String]("2.4.0", "2.4.1", "2.4.3")
   val SPARK_VERSION = DotnetUtils.normalizeSparkVersion(spark.SPARK_VERSION)
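
The only functional change on the Scala side is adding "2.4.3" to supportedSparkVersions, which lets jobs submitted against Spark 2.4.3 pass the runner's version check. The standalone sketch below illustrates that kind of gate under stated assumptions: normalize() is a hypothetical stand-in, not the real DotnetUtils.normalizeSparkVersion implementation.

```scala
// Illustrative only: a version gate in the spirit of the check DotnetRunner performs.
// normalize() is a hypothetical stand-in for DotnetUtils.normalizeSparkVersion.
object SparkVersionGate {
  private val supportedSparkVersions = Set("2.4.0", "2.4.1", "2.4.3")

  // Reduce a full version string (e.g. "2.4.3-SNAPSHOT") to "major.minor.patch".
  private def normalize(version: String): String =
    version.split("[.-]").take(3).mkString(".")

  def main(args: Array[String]): Unit = {
    val version = normalize(args.headOption.getOrElse("2.4.3-SNAPSHOT"))
    require(
      supportedSparkVersions(version),
      s"Unsupported Spark version: $version. Supported: ${supportedSparkVersions.toSeq.sorted.mkString(", ")}")
    println(s"Spark $version passes the gate.")
  }
}
```

Passing "2.4.2" to this sketch would trip the require, mirroring the "Not supported" row for 2.4.2 in the README table above.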