C# and F# language binding and extensions to Apache Spark

SparkCLR logo

SparkCLR (pronounced Sparkler) adds C# language binding to Apache Spark, enabling the implementation of Spark driver code and data processing operations in C#.

For example, the word count sample in Apache Spark can be implemented in C# as follows:

var lines = sparkContext.TextFile(@"hdfs://path/to/input.txt");
var words = lines.FlatMap(s => s.Split(' '));
var wordCounts = words.Map(w => new KeyValuePair<string, int>(w.Trim(), 1))
                      .ReduceByKey((x, y) => x + y);
var wordCountCollection = wordCounts.Collect();
wordCounts.SaveAsTextFile(@"hdfs://path/to/wordcount.txt");
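The sample above assumes an already-initialized sparkContext. A minimal sketch of the driver setup, assuming the Microsoft.Spark.CSharp.Core namespace and SparkConf/SparkContext types matching the SparkCLR samples:

```
// Sketch of SparkCLR driver initialization. The namespace and the
// SparkConf/SparkContext API shape are assumptions based on the SparkCLR samples.
using Microsoft.Spark.CSharp.Core;

var sparkConf = new SparkConf();
sparkConf.SetAppName("WordCount");       // application name shown in the Spark UI
var sparkContext = new SparkContext(sparkConf);

// ... RDD operations as in the word count sample above ...

sparkContext.Stop();                     // shut down the driver connection
```

This code must be launched through the SparkCLR submit scripts against a running Spark installation; it is a sketch, not a standalone program.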

A simple DataFrame application using TempTable may look like the following:

var reqDataFrame = sqlContext.TextFile(@"hdfs://path/to/requests.csv");
var metricDataFrame = sqlContext.TextFile(@"hdfs://path/to/metrics.csv");
reqDataFrame.RegisterTempTable("requests");
metricDataFrame.RegisterTempTable("metrics");
// C0 - guid in requests DataFrame, C3 - guid in metrics DataFrame
var joinDataFrame = sqlContext.Sql(
    "SELECT joinedtable.datacenter" +
         ", MAX(joinedtable.latency) maxlatency" +
         ", AVG(joinedtable.latency) avglatency " +
    "FROM (" +
       "SELECT a.C1 as datacenter, b.C6 as latency " +
       "FROM requests a JOIN metrics b ON a.C0 = b.C3) joinedtable " +
    "GROUP BY datacenter");
joinDataFrame.ShowSchema();
joinDataFrame.Show();

A simple DataFrame application using DataFrame DSL may look like the following:

// C0 - guid, C1 - datacenter
var reqDataFrame = sqlContext.TextFile(@"hdfs://path/to/requests.csv")
                             .Select("C0", "C1");
// C3 - guid, C6 - latency
var metricDataFrame = sqlContext.TextFile(@"hdfs://path/to/metrics.csv", ",", false, true)
                                .Select("C3", "C6"); // override delimiter, hasHeader & inferSchema
var joinDataFrame = reqDataFrame.Join(metricDataFrame, reqDataFrame["C0"] == metricDataFrame["C3"])
                                .GroupBy("C1");
var maxLatencyByDcDataFrame = joinDataFrame.Agg(new Dictionary<string, string> { { "C6", "max" } });
maxLatencyByDcDataFrame.ShowSchema();
maxLatencyByDcDataFrame.Show();
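The DataFrame samples above assume a sqlContext created from the driver's SparkContext. A hedged sketch of that setup (the namespace and constructor shape are assumptions based on the SparkCLR samples):

```
// Assumes SqlContext wraps an existing SparkContext, as in the SparkCLR samples.
using Microsoft.Spark.CSharp.Core;
using Microsoft.Spark.CSharp.Sql;

var sqlContext = new SqlContext(sparkContext);
```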

Refer to the SparkCLR\csharp\Samples directory and sample usage for complete samples.

API Documentation

Refer to the SparkCLR C# API documentation for the list of Spark data processing operations supported in SparkCLR.

API Usage

SparkCLR API usage samples are available at:

  • Samples project, which uses a comprehensive set of SparkCLR APIs to implement samples that are also used for functional validation of the APIs

  • Examples folder, which contains standalone SparkCLR projects that can be used as templates to start developing SparkCLR applications

Documents

Refer to the docs folder for a design overview and other information on SparkCLR.

Build Status

Ubuntu 14.04.3 LTS: Build status
Windows: Build status

Building, Running and Debugging SparkCLR

(Note: Tested with Spark 1.5.2)

License

SparkCLR is licensed under the MIT license. See LICENSE file for full license information.

Contribution

We welcome contributions. To contribute, follow the instructions in CONTRIBUTING.md.