[SPARK-15116] In REPL we should create SparkSession first and get SparkContext from it
## What changes were proposed in this pull request?

See https://github.com/apache/spark/pull/12873#discussion_r61993910. The problem is that if we create `SparkContext` first and then call `SparkSession.builder.enableHiveSupport().getOrCreate()`, we will reuse the existing `SparkContext` and the Hive flag won't be set.

## How was this patch tested?

Verified it locally.

Author: Wenchen Fan <wenchen@databricks.com>

Closes #12890 from cloud-fan/repl.
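To illustrate the failure mode described above, here is a minimal sketch (not code from the patch) of the problematic ordering. It assumes a `local[*]` master and that the Hive classes (spark-hive) are on the classpath, since `enableHiveSupport()` checks for them; on Spark versions affected by this bug, the catalog implementation stays at its default instead of switching to `hive`:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SparkSession

object HiveFlagDemo {
  def main(args: Array[String]): Unit = {
    // Problematic order (pre-fix REPL behavior): the SparkContext
    // is created first, with a plain SparkConf.
    val sc = new SparkContext(
      new SparkConf().setMaster("local[*]").setAppName("demo"))

    // getOrCreate() reuses the existing SparkContext, so the Hive
    // flag requested by enableHiveSupport() never reaches its conf.
    val spark = SparkSession.builder().enableHiveSupport().getOrCreate()

    // On affected versions this prints "in-memory" rather than "hive".
    println(spark.sparkContext.getConf
      .get("spark.sql.catalogImplementation", "in-memory"))

    spark.stop()
  }
}
```

With the ordering this patch adopts, the session is built first and the context is obtained from it (`val spark = SparkSession.builder.enableHiveSupport().getOrCreate(); val sc = spark.sparkContext`), so the Hive flag is applied before the `SparkContext` exists.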
Showing 4 changed files:
- repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoop.scala (8 additions, 12 deletions)
- repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoopInit.scala (3 additions, 8 deletions)
- repl/scala-2.11/src/main/scala/org/apache/spark/repl/Main.scala (12 additions, 15 deletions)
- repl/scala-2.11/src/main/scala/org/apache/spark/repl/SparkILoop.scala (3 additions, 8 deletions)