Commit f9cc1fbf authored by Jey Kottalam

Remove references to unsupported Hadoop versions

parent 4d737b6d
@@ -48,7 +48,7 @@ Hadoop, you must build Spark against the same version that your cluster runs.
 You can change the version by setting the `SPARK_HADOOP_VERSION` environment
 when building Spark.
-For Apache Hadoop versions 1.x, 0.20.x, Cloudera CDH MRv1, and other Hadoop
+For Apache Hadoop versions 1.x, Cloudera CDH MRv1, and other Hadoop
 versions without YARN, use:
     # Apache Hadoop 1.2.1
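
For reference, the `SPARK_HADOOP_VERSION` override mentioned in the doc text above would be used roughly like this with the sbt build (a sketch only; the `sbt/sbt` launcher script and the `assembly` target are assumptions, not shown in this diff):

    # Apache Hadoop 1.2.1, without YARN (assumed invocation)
    $ SPARK_HADOOP_VERSION=1.2.1 sbt/sbt assembly
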
@@ -12,7 +12,7 @@ Building Spark using Maven Requires Maven 3 (the build process is tested with Ma
 To enable support for HDFS and other Hadoop-supported storage systems, specify the exact Hadoop version by setting the "hadoop.version" property. If unset, Spark will build against Hadoop 1.0.4 by default.
-For Apache Hadoop versions 1.x, 0.20.x, Cloudera CDH MRv1, and other Hadoop versions without YARN, use:
+For Apache Hadoop versions 1.x, Cloudera CDH MRv1, and other Hadoop versions without YARN, use:
     # Apache Hadoop 1.2.1
     $ mvn -Dhadoop.version=1.2.1 clean install
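
The same `hadoop.version` property would also cover the Cloudera CDH MRv1 case named above; a plausible invocation, borrowing the CDH version string from the SparkBuild comment below (the exact string for a given cluster is an assumption):

    # Cloudera CDH 4.2.0 with MapReduce v1 (assumed version string)
    $ mvn -Dhadoop.version=2.0.0-mr1-cdh4.2.0 clean install
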
@@ -24,10 +24,9 @@ import AssemblyKeys._
 //import com.jsuereth.pgp.sbtplugin.PgpKeys._
 object SparkBuild extends Build {
-  // Hadoop version to build against. For example, "0.20.2", "0.20.205.0", or
-  // "1.0.4" for Apache releases, or "0.20.2-cdh3u5" for Cloudera Hadoop.
-  // Note that these variables can be set through the environment variables
-  // SPARK_HADOOP_VERSION and SPARK_WITH_YARN.
+  // Hadoop version to build against. For example, "1.0.4" for Apache releases, or
+  // "2.0.0-mr1-cdh4.2.0" for Cloudera Hadoop. Note that these variables can be set
+  // through the environment variables SPARK_HADOOP_VERSION and SPARK_WITH_YARN.
   val DEFAULT_HADOOP_VERSION = "1.0.4"
   val DEFAULT_WITH_YARN = false
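
When neither environment variable is set, the defaults above apply (Hadoop 1.0.4, no YARN). A rough sketch of overriding them from the shell, again assuming the `sbt/sbt` launcher and `assembly` target, with the Hadoop 2 version string chosen only for illustration:

    # Defaults: Hadoop 1.0.4, no YARN
    $ sbt/sbt assembly
    # YARN-enabled build against a Hadoop 2 release (assumed version string)
    $ SPARK_HADOOP_VERSION=2.0.5-alpha SPARK_WITH_YARN=true sbt/sbt assembly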