Mirror of Apache Spark

README

Spark requires Scala 2.8. This version has been tested with 2.8.0RC3.

To build and run Spark, you will need to have Scala's bin directory on your
$PATH, or set the SCALA_HOME environment variable to point to where you've
installed Scala. Scala must be accessible through one of these methods on
Nexus slave nodes as well as on the master.
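For example, if Scala 2.8.0 were installed under /usr/local/scala-2.8.0 (the
path is only illustrative; use your own installation directory), you could add
the following to your shell profile:

    export SCALA_HOME=/usr/local/scala-2.8.0
    export PATH=$SCALA_HOME/bin:$PATH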

To build Spark and the example programs, run make.

To run one of the examples, use ./run <class> <params>. For example,
./run SparkLR will run the Logistic Regression example. Each of the
example programs prints usage help if no params are given.
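For instance, running the logistic regression example with no parameters
prints its usage help:

    ./run SparkLR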

Tip: If you are building Spark and the examples repeatedly, export USE_FSC=1
to have the Makefile use the fsc compiler daemon instead of scalac.
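For example, to build with the compiler daemon enabled:

    export USE_FSC=1
    make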