- Jan 02, 2014: Prashant Sharma authored
- Dec 24, 2013: Andrew Ash authored
- Sep 24, 2013: Patrick Wendell authored
- Sep 07, 2013: Aaron Davidson authored
- Sep 07, 2013: Aaron Davidson authored
  The sc.StorageLevel -> StorageLevel pathway is a bit janky, but otherwise the shell would have to call a private method of SparkContext. Having StorageLevel available in sc also doesn't seem like the end of the world, though there may be a better solution. As for creating the StorageLevel object itself, this seems to be the best way in Python 2 to create singleton, enum-like objects: http://stackoverflow.com/questions/36932/how-can-i-represent-an-enum-in-python
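  The enum-like singleton pattern the commit points at can be sketched as below. The field names and levels here are illustrative of the pattern, not the exact PySpark definitions:

  ```python
  class StorageLevel(object):
      """Enum-like holder: each named level is a singleton instance created
      once at class-definition time, so levels can be compared by identity."""

      def __init__(self, use_disk, use_memory, deserialized, replication=1):
          self.use_disk = use_disk
          self.use_memory = use_memory
          self.deserialized = deserialized
          self.replication = replication

      def __repr__(self):
          return "StorageLevel(%s, %s, %s, %s)" % (
              self.use_disk, self.use_memory, self.deserialized, self.replication)


  # Named singletons attached to the class after its definition, following
  # the enum-in-Python-2 pattern from the linked Stack Overflow question.
  StorageLevel.DISK_ONLY = StorageLevel(True, False, False)
  StorageLevel.MEMORY_ONLY = StorageLevel(False, True, True)
  StorageLevel.MEMORY_AND_DISK = StorageLevel(True, True, True)
  ```

  Because the named levels are created exactly once, code can safely compare them with `is`, which is the property an enum substitute needs in Python 2.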
- Sep 06, 2013: Aaron Davidson authored
  It uses reflection... I am not proud of that fact, but it at least ensures compatibility (sans refactoring of the StorageLevel stuff).
- Sep 01, 2013: Matei Zaharia authored
- Aug 12, 2013: Andre Schumacher authored
  Now ADD_FILES uses a comma as file name separator.
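  The comma-separated handling this commit describes can be sketched as a minimal parser; `parse_add_files` is a hypothetical helper for illustration, not the actual shell code:

  ```python
  import os

  def parse_add_files(env=os.environ):
      """Split the ADD_FILES environment variable on commas (the separator
      this commit switched to); returns an empty list when unset or blank."""
      raw = env.get("ADD_FILES")
      return raw.split(",") if raw else []
  ```

  For example, `ADD_FILES=helpers.py,util.py` would yield the two file names, where a space- or colon-separated value would not survive file names containing those characters as cleanly.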
- Jul 16, 2013: Matei Zaharia authored
- Jan 30, 2013: Patrick Wendell authored
  Also adds a line in the docs explaining how to use it.
- Jan 20, 2013: Matei Zaharia authored
- Jan 01, 2013: Josh Rosen authored
  Expand the PySpark programming guide.
- Jan 01, 2013: Josh Rosen authored
- Dec 28, 2012: Josh Rosen authored
  - Bundle Py4J binaries, since it's hard to install.
  - Use Spark's `run` script to launch the Py4J gateway, inheriting the settings in spark-env.sh.
  With these changes, (hopefully) nothing more than running `sbt/sbt package` will be necessary to run PySpark.
- Dec 27, 2012: Josh Rosen authored
  Suggested by / based on code from @MLnick.
- Oct 19, 2012: Josh Rosen authored