Commit 153cad12 authored by Patrick Wendell's avatar Patrick Wendell

README incorrectly suggests build sources spark-env.sh

This is misleading because the build doesn't source that file. IMO
it's better to always require build environment variables to be
specified on the command line, as we do in every example.
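
A command-line invocation of the kind the examples use might look like the
following sketch (the variable names `SPARK_HADOOP_VERSION` and `SPARK_YARN`
follow the README conventions of this era; treat the exact names and values
as illustrative, not taken from this commit):

```shell
# Pass build settings for a single invocation on the command line,
# rather than expecting the build to source conf/spark-env.sh (it does not).
SPARK_HADOOP_VERSION=2.2.0 SPARK_YARN=true sbt/sbt assembly
```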
parent 5b74609d
@@ -69,9 +69,6 @@ When building for Hadoop 2.2.X and newer, you'll need to include the additional
     # Apache Hadoop 2.2.X and newer
     $ mvn -Dyarn.version=2.2.0 -Dhadoop.version=2.2.0 -Pnew-yarn
 
-For convenience, these variables may also be set through the `conf/spark-env.sh` file
-described below.
-
 When developing a Spark application, specify the Hadoop version by adding the
 "hadoop-client" artifact to your project's dependencies. For example, if you're
 using Hadoop 1.2.1 and build your application using SBT, add this entry to
...
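
The dependency entry the diff's context refers to would, for an SBT build,
look something like this sketch (the coordinates assume the standard
`hadoop-client` artifact and the Hadoop 1.2.1 version mentioned in the
surrounding text):

```scala
// build.sbt — declare the matching hadoop-client artifact so the
// application compiles against the same Hadoop version it runs on.
libraryDependencies += "org.apache.hadoop" % "hadoop-client" % "1.2.1"
```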