- Jan 08, 2014
  - liguoqiang authored
- Jan 06, 2014
  - Holden Karau authored
  - Holden Karau authored
- Jan 03, 2014
  - Patrick Wendell authored
    Closes #316
- Jan 02, 2014
  - Prashant Sharma authored
  - Prashant Sharma authored
  - Prashant Sharma authored
- Oct 01, 2013
  - Du Li authored
- Sep 22, 2013
  - shane-huang authored
    Signed-off-by: shane-huang <shengsheng.huang@intel.com>
- Sep 09, 2013
- Aug 29, 2013
  - Matei Zaharia authored
  - Matei Zaharia authored
  - Matei Zaharia authored
    This commit makes Spark invocation saner by using an assembly JAR to find all of Spark's dependencies instead of adding all the JARs in lib_managed. It also packages the examples into an assembly and uses that as SPARK_EXAMPLES_JAR. Finally, it replaces the old "run" script with two better-named scripts: "run-examples" for the examples and "spark-class" for Spark internal classes (e.g. the REPL, master, etc.). This is also designed to minimize the confusion people have when trying to use "run" to run their own classes; it is not meant for that, but now at least, if they look at it, they can modify run-examples to do a decent job for them. As part of this, Bagel's examples are also now properly moved to the examples package instead of bagel.
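    For context, the invocation style this message describes would look roughly like the sketch below. Only the script names (run-examples, spark-class) come from the commit message itself; the class names are illustrative assumptions, not verified against the repository at that revision.

    ```sh
    # Run a bundled example through the examples assembly
    # (script name per the commit message; example class is an assumption)
    ./run-examples spark.examples.SparkPi local[2]

    # Launch a Spark internal class, e.g. the standalone master, via spark-class
    # (class name is an assumption for illustration)
    ./spark-class spark.deploy.master.Master
    ```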
- Aug 21, 2013
  - Jey Kottalam authored
    This reverts commit 66e7a38a.
  - Jey Kottalam authored
  - Matei Zaharia authored
- Aug 16, 2013
  - Jey Kottalam authored
  - Jey Kottalam authored
- Jul 30, 2013
  - Benjamin Hindman authored
    requiring Spark to be installed. Using 'make_distribution.sh' a user can put a Spark distribution at a URI supported by Mesos (e.g., 'hdfs://...') and then set that when launching their job. Also added SPARK_EXECUTOR_URI for the REPL.
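    A hedged sketch of the workflow this message describes: 'make_distribution.sh' and SPARK_EXECUTOR_URI come from the commit message, while the tarball name, HDFS path, Mesos master URL, and spark-shell invocation are assumptions for illustration.

    ```sh
    # Build a self-contained Spark distribution (script name per the commit message)
    ./make_distribution.sh

    # Put the distribution somewhere Mesos executors can fetch it (path is illustrative)
    hadoop fs -put spark-dist.tar.gz hdfs:///tmp/spark-dist.tar.gz

    # Point executors (and the REPL) at that archive when launching on Mesos
    export SPARK_EXECUTOR_URI=hdfs:///tmp/spark-dist.tar.gz
    MASTER=mesos://mesos-master:5050 ./spark-shell
    ```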
- Jul 16, 2013
  - Matei Zaharia authored
- Jul 15, 2013
  - seanm authored
- Jun 25, 2013
  - Evan Chan authored
- Jun 24, 2013
  - Evan Chan authored