Merge pull request #124 from tgravescs/sparkHadoopUtilFix
Pull SparkHadoopUtil out of SparkEnv (jira SPARK-886)

Having the logic to initialize the correct SparkHadoopUtil inside SparkEnv prevents it from being used until after the SparkContext is initialized. This causes issues like https://spark-project.atlassian.net/browse/SPARK-886. It also makes SparkHadoopUtil hard to use from singleton objects; for instance, I want to use it in the security code.
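A minimal sketch of the singleton-accessor pattern the description argues for: a companion object that picks the right SparkHadoopUtil implementation on first use, independently of SparkEnv or SparkContext. The class body, the SPARK_YARN_MODE check, and the YarnSparkHadoopUtil class name are illustrative assumptions, not the exact code in this patch.

```scala
import org.apache.hadoop.conf.Configuration

// Illustrative stand-in for the real SparkHadoopUtil; only shows the shape
// of the API, not the actual contents of the patched class.
class SparkHadoopUtil {
  // Return a Hadoop Configuration, possibly augmented with cluster settings.
  def newConfiguration(): Configuration = new Configuration()
}

object SparkHadoopUtil {
  // Lazily choose a YARN-aware implementation when running under YARN,
  // otherwise fall back to the default. Because this lives in a companion
  // object rather than SparkEnv, other singletons (e.g. security code) can
  // call it before any SparkContext or SparkEnv exists.
  private lazy val instance: SparkHadoopUtil =
    if (sys.props.getOrElse("SPARK_YARN_MODE", "false").toBoolean) {
      // Reflection avoids a compile-time dependency on the YARN module
      // (class name assumed here for illustration).
      Class.forName("org.apache.spark.deploy.yarn.YarnSparkHadoopUtil")
        .newInstance()
        .asInstanceOf[SparkHadoopUtil]
    } else {
      new SparkHadoopUtil
    }

  def get: SparkHadoopUtil = instance
}
```

With an accessor like this, call sites replace `SparkEnv.get.hadoop` with `SparkHadoopUtil.get`, which is the shape of the changes to SparkContext, CheckpointRDD, HadoopRDD, and the other files listed below.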
Showing 8 changed files
- core/src/main/scala/org/apache/spark/SparkContext.scala (5 additions, 6 deletions)
- core/src/main/scala/org/apache/spark/SparkEnv.scala (4 additions, 13 deletions)
- core/src/main/scala/org/apache/spark/deploy/SparkHadoopUtil.scala (20 additions, 6 deletions)
- core/src/main/scala/org/apache/spark/rdd/CheckpointRDD.scala (4 additions, 3 deletions)
- core/src/main/scala/org/apache/spark/rdd/HadoopRDD.scala (4 additions, 3 deletions)
- core/src/main/scala/org/apache/spark/scheduler/InputFormatInfo.scala (3 additions, 4 deletions)
- core/src/main/scala/org/apache/spark/util/Utils.scala (1 addition, 2 deletions)
- examples/src/main/scala/org/apache/spark/examples/SparkHdfsLR.scala (2 additions, 1 deletion)