From 9a11396113d4bb0e76e0520df4fc58e7a8ec9f69 Mon Sep 17 00:00:00 2001
From: Cheolsoo Park <cheolsoop@netflix.com>
Date: Fri, 24 Jul 2015 11:56:55 -0700
Subject: [PATCH] [SPARK-9270] [PYSPARK] allow --name option in pyspark

This is a continuation of #7512, which added the `--name` option to spark-shell. This PR adds the same option to pyspark.

Note that `--conf spark.app.name` on the command line has no effect in spark-shell and pyspark; `--name` must be used instead. This is inconsistent with spark-sql, which does not accept the `--name` option but does accept `--conf spark.app.name`. I am not fixing that inconsistency in this PR. IMO, only one of `--name` and `--conf spark.app.name` is needed, not both, but since I cannot decide which to keep, I am not making any change here.
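
For illustration, a brief usage sketch (the name "MyPySparkApp" is hypothetical, not part of this patch). It assumes spark-submit lets a later `--name` override an earlier one, since "$@" is appended after the default added below:

    # App name defaults to "PySparkShell", set by bin/pyspark in this patch
    bin/pyspark

    # A user-supplied --name should override the default (assumes the last --name wins)
    bin/pyspark --name "MyPySparkApp"

    # Per the note above, this has no effect on the shell's app name
    bin/pyspark --conf spark.app.name=MyPySparkApp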

Author: Cheolsoo Park <cheolsoop@netflix.com>

Closes #7610 from piaozhexiu/SPARK-9270 and squashes the following commits:

763e86d [Cheolsoo Park] Update windows script
400b7f9 [Cheolsoo Park] Allow --name option to pyspark
---
 bin/pyspark             | 2 +-
 bin/pyspark2.cmd        | 2 +-
 python/pyspark/shell.py | 2 +-
 3 files changed, 3 insertions(+), 3 deletions(-)

diff --git a/bin/pyspark b/bin/pyspark
index f9dbddfa53..8f2a3b5a77 100755
--- a/bin/pyspark
+++ b/bin/pyspark
@@ -82,4 +82,4 @@ fi
 
 export PYSPARK_DRIVER_PYTHON
 export PYSPARK_DRIVER_PYTHON_OPTS
-exec "$SPARK_HOME"/bin/spark-submit pyspark-shell-main "$@"
+exec "$SPARK_HOME"/bin/spark-submit pyspark-shell-main --name "PySparkShell" "$@"
diff --git a/bin/pyspark2.cmd b/bin/pyspark2.cmd
index 45e9e3def5..3c6169983e 100644
--- a/bin/pyspark2.cmd
+++ b/bin/pyspark2.cmd
@@ -35,4 +35,4 @@ set PYTHONPATH=%SPARK_HOME%\python\lib\py4j-0.8.2.1-src.zip;%PYTHONPATH%
 set OLD_PYTHONSTARTUP=%PYTHONSTARTUP%
 set PYTHONSTARTUP=%SPARK_HOME%\python\pyspark\shell.py
 
-call %SPARK_HOME%\bin\spark-submit2.cmd pyspark-shell-main %*
+call %SPARK_HOME%\bin\spark-submit2.cmd pyspark-shell-main --name "PySparkShell" %*
diff --git a/python/pyspark/shell.py b/python/pyspark/shell.py
index 144cdf0b0c..99331297c1 100644
--- a/python/pyspark/shell.py
+++ b/python/pyspark/shell.py
@@ -40,7 +40,7 @@ if os.environ.get("ADD_FILES") is not None:
 if os.environ.get("SPARK_EXECUTOR_URI"):
     SparkContext.setSystemProperty("spark.executor.uri", os.environ["SPARK_EXECUTOR_URI"])
 
-sc = SparkContext(appName="PySparkShell", pyFiles=add_files)
+sc = SparkContext(pyFiles=add_files)
 atexit.register(lambda: sc.stop())
 
 try:
-- 
GitLab