Reversed the order of tests used to find a scala executable (in the case when SPARK_LAUNCH_WITH_SCALA is defined): instead of checking the PATH first and only then (if not found) falling back to SCALA_HOME, we now check SCALA_HOME first and only then (if not defined) look in the PATH. The advantage is that a user who has a more recent (incompatible) version of scala in her PATH can set SCALA_HOME to point to the older (compatible) version for use with Spark.
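For example, a user could now override an incompatible scala on the PATH like this (hypothetical paths; spark.examples.SparkPi is just an illustrative entry point):

    export SPARK_LAUNCH_WITH_SCALA=1
    export SCALA_HOME=/opt/scala-2.9.3   # hypothetical location of the compatible version
    ./run spark.examples.SparkPi local   # run now prefers ${SCALA_HOME}/bin/scala over the PATH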

Suggested by Josh Rosen in this thread:

  https://groups.google.com/forum/?fromgroups=#!topic/spark-users/NC9JKvP8808
Mike 2013-04-11 20:52:06 -07:00
Parent c91ff8d493
Commit 6f68860891
1 changed file with 6 additions and 5 deletions

run (11 changes)

@@ -47,14 +47,15 @@ case "$1" in
 esac
 
 if [ "$SPARK_LAUNCH_WITH_SCALA" == "1" ]; then
-  if [ `command -v scala` ]; then
-    RUNNER="scala"
+  if [ "$SCALA_HOME" ]; then
+    RUNNER="${SCALA_HOME}/bin/scala"
   else
-    if [ -z "$SCALA_HOME" ]; then
-      echo "SCALA_HOME is not set" >&2
+    if [ `command -v scala` ]; then
+      RUNNER="scala"
+    else
+      echo "SCALA_HOME is not set and scala is not in PATH" >&2
       exit 1
     fi
-    RUNNER="${SCALA_HOME}/bin/scala"
   fi
 else
   if [ `command -v java` ]; then
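
For reference, a minimal standalone sketch of the new resolution order (an illustration assuming the same variable names, not the actual run script):

    #!/usr/bin/env bash
    # Prefer SCALA_HOME if set; fall back to scala on the PATH; otherwise fail.
    if [ -n "$SCALA_HOME" ]; then
      RUNNER="${SCALA_HOME}/bin/scala"
    elif command -v scala >/dev/null 2>&1; then
      RUNNER="scala"
    else
      echo "SCALA_HOME is not set and scala is not in PATH" >&2
      exit 1
    fi
    echo "would launch with: $RUNNER"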