C# and F# language binding and extensions to Apache Spark

# Mobius: C# API for Spark

Mobius provides a C# language binding to Apache Spark, enabling the implementation of Spark driver programs and data processing operations in languages supported by the .NET framework, such as C# and F#.

For example, the word count sample in Apache Spark can be implemented in C# as follows:

```csharp
var lines = sparkContext.TextFile(@"hdfs://path/to/input.txt");
var words = lines.FlatMap(s => s.Split(' '));
var wordCounts = words.Map(w => new Tuple<string, int>(w.Trim(), 1))
                      .ReduceByKey((x, y) => x + y);
var wordCountCollection = wordCounts.Collect();
wordCounts.SaveAsTextFile(@"hdfs://path/to/wordcount.txt");
```
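
The samples in this README assume that a `SparkContext` (and, for the DataFrame samples, a `SqlContext`) has already been created. A minimal setup sketch is shown below; the namespace and constructor shapes follow the Mobius Adapter API as commonly used, but check the API documentation linked later in this README for the authoritative signatures.

```csharp
using Microsoft.Spark.CSharp.Core;
using Microsoft.Spark.CSharp.Sql;

// Create the Spark configuration and contexts used by the samples below.
// The application name is illustrative; cluster-specific settings (master,
// executor memory, etc.) would normally be set here or via spark-submit.
var sparkConf = new SparkConf().SetAppName("MobiusSample");
var sparkContext = new SparkContext(sparkConf);
var sqlContext = new SqlContext(sparkContext);
```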

A simple DataFrame application using TempTable may look like the following:

```csharp
var reqDataFrame = sqlContext.TextFile(@"hdfs://path/to/requests.csv");
var metricDataFrame = sqlContext.TextFile(@"hdfs://path/to/metrics.csv");
reqDataFrame.RegisterTempTable("requests");
metricDataFrame.RegisterTempTable("metrics");
// C0 - guid in requests DataFrame, C3 - guid in metrics DataFrame
var joinDataFrame = sqlContext.Sql(
    "SELECT joinedtable.datacenter" +
         ", MAX(joinedtable.latency) maxlatency" +
         ", AVG(joinedtable.latency) avglatency " +
    "FROM (" +
       "SELECT a.C1 as datacenter, b.C6 as latency " +
       "FROM requests a JOIN metrics b ON a.C0 = b.C3) joinedtable " +
    "GROUP BY datacenter");
joinDataFrame.ShowSchema();
joinDataFrame.Show();
```

A simple DataFrame application using DataFrame DSL may look like the following:

// C0 - guid, C1 - datacenter
var reqDataFrame = sqlContext.TextFile(@"hdfs://path/to/requests.csv")  
                             .Select("C0", "C1");    
// C3 - guid, C6 - latency   
var metricDataFrame = sqlContext.TextFile(@"hdfs://path/to/metrics.csv", ",", false, true)
                                .Select("C3", "C6"); //override delimiter, hasHeader & inferSchema
var joinDataFrame = reqDataFrame.Join(metricDataFrame, reqDataFrame["C0"] == metricDataFrame["C3"])
                                .GroupBy("C1");
var maxLatencyByDcDataFrame = joinDataFrame.Agg(new Dictionary<string, string> { { "C6", "max" } });
maxLatencyByDcDataFrame.ShowSchema();
maxLatencyByDcDataFrame.Show();

A simple Spark Streaming application that processes messages from Kafka using C# may be implemented using the following code:

```csharp
StreamingContext sparkStreamingContext = StreamingContext.GetOrCreate(checkpointPath, () =>
    {
      var ssc = new StreamingContext(sparkContext, slideDurationInMillis);
      ssc.Checkpoint(checkpointPath);
      var stream = KafkaUtils.CreateDirectStream(ssc, topicList, kafkaParams, perTopicPartitionKafkaOffsets);
      // message format: [timestamp],[loglevel],[logmessage]
      var countByLogLevelAndTime = stream
                                    .Map(kvp => Encoding.UTF8.GetString(kvp.Value))
                                    .Filter(line => line.Contains(","))
                                    .Map(line => line.Split(','))
                                    .Map(columns => new Tuple<string, int>(
                                                          string.Format("{0},{1}", columns[0], columns[1]), 1))
                                    .ReduceByKeyAndWindow((x, y) => x + y, (x, y) => x - y,
                                                          windowDurationInSecs, slideDurationInSecs, 3)
                                    .Map(logLevelCountPair => string.Format("{0},{1}",
                                                          logLevelCountPair.Item1, logLevelCountPair.Item2));
      countByLogLevelAndTime.ForeachRDD(countByLogLevel =>
      {
          foreach (var logCount in countByLogLevel.Collect())
              Console.WriteLine(logCount);
      });
      return ssc;
    });
sparkStreamingContext.Start();
sparkStreamingContext.AwaitTermination();
```
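
The streaming sample references several identifiers (checkpoint path, durations, Kafka parameters) that a real application defines elsewhere. The sketch below shows one plausible set of definitions; the topic name, broker addresses, and durations are purely illustrative, and the exact parameter types expected by `KafkaUtils.CreateDirectStream` may vary across Mobius versions.

```csharp
using System.Collections.Generic;

// Illustrative values only -- adjust to your cluster and Kafka setup.
var checkpointPath = @"hdfs://path/to/streaming/checkpoint"; // hypothetical checkpoint location
long slideDurationInMillis = 10000;                          // 10-second batch interval
int windowDurationInSecs = 60;                               // 60-second sliding window
int slideDurationInSecs = 10;                                // window slides every 10 seconds
var topicList = new List<string> { "applogs" };              // hypothetical Kafka topic
var kafkaParams = new Dictionary<string, string>
{
    { "metadata.broker.list", "broker1:9092,broker2:9092" }  // hypothetical broker list
};
// Start from default offsets; populate to resume from specific partitions/offsets.
var perTopicPartitionKafkaOffsets = new Dictionary<string, long>();
```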

For more code samples, refer to the Mobius\examples directory or the Mobius\csharp\Samples directory.

## API Documentation

Refer to the Mobius C# API documentation for the list of Spark data processing operations supported in Mobius.

## API Usage

Mobius API usage samples are available at:

* the Examples folder, which contains standalone C# and F# projects that can be used as templates to start developing Mobius applications

* the Samples project, which uses a comprehensive set of Mobius APIs to implement samples that are also used for functional validation of the APIs

* the Mobius performance test scenarios, implemented in C# and Scala for a side-by-side comparison of Spark driver code

## Documents

Refer to the docs folder for a design overview and other information on Mobius.

## Build Status

| Ubuntu 14.04.3 LTS | Windows | Unit test coverage |
| --- | --- | --- |
| Build status | Build status | codecov.io |

## Getting Started

| | Windows | Linux |
| --- | --- | --- |
| Build & run unit tests | Build in Windows | Build in Linux |
| Run samples (functional tests) in local mode | Samples in Windows | Samples in Linux |
| Run examples in local mode | Examples in Windows | Examples in Linux |
| Run Mobius app | | |
| Run Mobius Shell | | Not supported yet |

## Supported Spark Versions

Mobius is built and tested with Apache Spark 1.4.1, 1.5.2, 1.6.* and 2.0.

## Releases

Mobius releases are available at https://github.com/Microsoft/Mobius/releases. The references needed to build a C# Spark driver application using Mobius are also available on NuGet.


Refer to mobius-release-info.md for details on the versioning policy and the contents of each release.

## License


Mobius is licensed under the MIT license. See the LICENSE file for full license information.

## Community

Join the chat at https://gitter.im/Microsoft/Mobius or reach out on Twitter at @MobiusForSpark.

## Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information, see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.