diff --git a/docs/spark-standalone.md b/docs/spark-standalone.md
index ad8b6c0e51a78f53a34af9a3ca288cc54ff2d079..2fb30765f35e8cafaeedd227ff67747fc82076e7 100644
--- a/docs/spark-standalone.md
+++ b/docs/spark-standalone.md
@@ -242,9 +242,10 @@ To run an interactive Spark shell against the cluster, run the following command
 
     ./bin/spark-shell --master spark://IP:PORT
 
-Note that if you are running spark-shell from one of the spark cluster machines, the `bin/spark-shell` script will
-automatically set MASTER from the `SPARK_MASTER_IP` and `SPARK_MASTER_PORT` variables in `conf/spark-env.sh`.
-
 You can also pass an option `--cores <numCores>` to control the number of cores that spark-shell uses on the cluster.
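+
+For example, you might request 4 cores for the shell with (the core count here is purely illustrative):
+
+    ./bin/spark-shell --master spark://IP:PORT --cores 4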
 
 # Launching Compiled Spark Applications