Clarify that PySpark is not supported on Windows

This commit is contained in:
root 2013-07-01 06:25:17 +00:00
Parent ec31e68d5d
Commit 7cd490ef5b
1 changed file: 2 additions and 3 deletions


@@ -17,10 +17,9 @@ There are a few key differences between the Python and Scala APIs:
 * Python is dynamically typed, so RDDs can hold objects of different types.
 * PySpark does not currently support the following Spark features:
     - Special functions on RDDs of doubles, such as `mean` and `stdev`
-    - `lookup`
+    - `lookup`, `sample` and `sort`
     - `persist` at storage levels other than `MEMORY_ONLY`
-    - `sample`
-    - `sort`
+    - Execution on Windows -- this is slated for a future release
 
 In PySpark, RDDs support the same methods as their Scala counterparts but take Python functions and return Python collection types.
 Short functions can be passed to RDD methods using Python's [`lambda`](http://www.diveintopython.net/power_of_introspection/lambda_functions.html) syntax:
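The `lambda` syntax referenced in the changed docs is plain Python; as a minimal sketch (independent of Spark, using the built-in `map` rather than an RDD), the style looks like this:

```python
# Python's lambda defines a short anonymous function inline.
# PySpark RDD methods accept functions in the same style,
# e.g. rdd.map(lambda x: x * x); here we use the built-in map().
squares = list(map(lambda x: x * x, [1, 2, 3, 4]))
print(squares)  # [1, 4, 9, 16]
```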