From 669e3f05895d9dfa37abf60f60aecebb03988e50 Mon Sep 17 00:00:00 2001
From: CrazyJvm <crazyjvm@gmail.com>
Date: Wed, 30 Jul 2014 23:37:25 -0700
Subject: [PATCH] automatically set master according to `spark.master` in
 `spark-defaults.conf`

automatically set master according to `spark.master` in `spark-defaults.conf`

Author: CrazyJvm <crazyjvm@gmail.com>

Closes #1644 from CrazyJvm/standalone-guide and squashes the following commits:

bb12b95 [CrazyJvm] automatically set master according to `spark.master` in `spark-defaults.conf`
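
For reference, a minimal `conf/spark-defaults.conf` entry that lets `bin/spark-shell` and `bin/spark-submit` pick up the master URL without an explicit `--master` flag (the host and port below are illustrative placeholders):

    # conf/spark-defaults.conf -- host and port are placeholders
    spark.master    spark://203.0.113.10:7077

With this set, running `./bin/spark-shell` without a `--master` option connects to the configured standalone master automatically.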
---
 docs/spark-standalone.md | 3 ---
 1 file changed, 3 deletions(-)

diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index ad8b6c0e51..2fb30765f3 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -242,9 +242,6 @@ To run an interactive Spark shell against the cluster, run the following command
 
     ./bin/spark-shell --master spark://IP:PORT
 
-Note that if you are running spark-shell from one of the spark cluster machines, the `bin/spark-shell` script will
-automatically set MASTER from the `SPARK_MASTER_IP` and `SPARK_MASTER_PORT` variables in `conf/spark-env.sh`.
-
 You can also pass an option `--cores <numCores>` to control the number of cores that spark-shell uses on the cluster.
 
 # Launching Compiled Spark Applications
-- 
GitLab