Commit 0e375a3c authored by Patrick Wendell

Add assembly plugin links

parent 6371febe
@@ -294,12 +294,15 @@ There are a few additional considerations when running jobs on a
 
 ### Including Your Dependencies
 If your code depends on other projects, you will need to ensure they are also
-present on the slave nodes. The most common way to do this is to create an
-assembly jar (or "uber" jar) containing your code and its dependencies. You
-may then submit the assembly jar when creating a SparkContext object. If you
-do this, you should make Spark itself a `provided` dependency, since it will
-already be present on the slave nodes. It is also possible to submit your
-dependent jars one-by-one when creating a SparkContext.
+present on the slave nodes. A popular approach is to create an
+assembly jar (or "uber" jar) containing your code and its dependencies. Both
+[sbt](https://github.com/sbt/sbt-assembly) and
+[Maven](http://maven.apache.org/plugins/maven-assembly-plugin/)
+have assembly plugins. When creating assembly jars, list Spark
+itself as a `provided` dependency; it need not be bundled since it is
+already present on the slaves. Once you have an assembled jar,
+add it to the SparkContext as shown here. It is also possible to submit
+your dependent jars one-by-one when creating a SparkContext.
 
 ### Setting Configuration Options
 Spark includes several configuration options which influence the behavior
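The rewritten paragraph says to add the assembled jar to the SparkContext "as shown here", but the example it points to sits outside this hunk. A minimal sketch of the workflow it describes, using sbt-assembly, might look like the following; the project name, version numbers, and jar paths are illustrative assumptions matching the Spark releases of this era, not part of the commit:

```scala
// project/plugins.sbt -- wire in the sbt-assembly plugin
// (the plugin version is an assumption; use whichever matches your sbt)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.9.2")
```

```scala
// build.sbt -- per the sbt-assembly 0.9.x README, import its keys and
// settings, then mark Spark "provided" so it is left out of the assembly
// jar: the slaves already have Spark on their classpath.
import AssemblyKeys._

assemblySettings

name := "my-spark-job"   // hypothetical project name

version := "1.0"

scalaVersion := "2.9.3"  // Scala version matching Spark 0.8.x

libraryDependencies +=
  "org.apache.spark" %% "spark-core" % "0.8.0-incubating" % "provided"
```

Running `sbt assembly` then produces a single jar under `target/`, which can be handed to the SparkContext constructor so that Spark ships it to the slave nodes:

```scala
import org.apache.spark.SparkContext

// Master URL, job name, and jar path are placeholders; the fourth
// argument is the list of jars to distribute to the slaves.
val sc = new SparkContext(
  "mesos://HOST:5050",
  "My Job",
  System.getenv("SPARK_HOME"),
  Seq("target/scala-2.9.3/my-spark-job-assembly-1.0.jar"))
```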