Commit 59f475c7 authored by Patrick Wendell

Merge pull request #442 from pwendell/standalone

Workers should use their working directory as the Spark home if one is not specified

If users don't set SPARK_HOME in their environment file when launching an application, the standalone cluster should default to the Spark home of the worker.
Parents: 2a05403a, 00a3f7ee
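
The patch below is a one-line fallback: Option(...) lifts a possibly-null String into an Option, and getOrElse supplies the worker's own Spark home when the application didn't specify one. As a rough, self-contained sketch of that pattern (the names resolveSparkHome, execSparkHome, and workerSparkHome are illustrative stand-ins, not the actual Worker fields):

import java.io.File

object SparkHomeFallback {
  // Illustrative stand-in for the worker's resolution logic: execSparkHome
  // may be null (it originates from user configuration), workerSparkHome is
  // the directory the worker itself was launched from.
  def resolveSparkHome(execSparkHome: String, workerSparkHome: File): String =
    // Option(x) is None when x is null, so getOrElse supplies the default.
    Option(execSparkHome).getOrElse(workerSparkHome.getAbsolutePath)

  def main(args: Array[String]): Unit = {
    val workerHome = new File("/opt/spark")
    println(resolveSparkHome(null, workerHome))            // falls back to /opt/spark
    println(resolveSparkHome("/custom/spark", workerHome)) // keeps /custom/spark
  }
}
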
@@ -209,8 +209,11 @@ private[spark] class Worker(
       logWarning("Invalid Master (" + masterUrl + ") attempted to launch executor.")
     } else {
       logInfo("Asked to launch executor %s/%d for %s".format(appId, execId, appDesc.name))
+      // TODO (pwendell): We should make sparkHome an Option[String] in
+      // ApplicationDescription to be more explicit about this.
+      val effectiveSparkHome = Option(execSparkHome_).getOrElse(sparkHome.getAbsolutePath)
       val manager = new ExecutorRunner(appId, execId, appDesc, cores_, memory_,
-        self, workerId, host, new File(execSparkHome_), workDir, akkaUrl, ExecutorState.RUNNING)
+        self, workerId, host, new File(effectiveSparkHome), workDir, akkaUrl, ExecutorState.RUNNING)
       executors(appId + "/" + execId) = manager
       manager.start()
       coresUsed += cores_
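
The TODO points at a cleaner follow-up: model sparkHome as an Option[String] in ApplicationDescription so the "not specified" case is explicit in the type rather than encoded as null. A minimal sketch of what that could look like, assuming a hypothetical, trimmed-down stand-in for ApplicationDescription (the real class carries more fields):

import java.io.File

// Hypothetical simplified ApplicationDescription with an explicit Option.
case class AppDescription(name: String, sparkHome: Option[String])

object HomeResolution {
  // With an Option, callers must handle the missing-home case explicitly;
  // no Option(null) conversion is needed at the use site.
  def effectiveSparkHome(desc: AppDescription, workerSparkHome: File): String =
    desc.sparkHome.getOrElse(workerSparkHome.getAbsolutePath)
}

The design benefit is that absence is visible in the type signature, so the compiler flags any caller that forgets the fallback, instead of that mistake surfacing as a NullPointerException at launch time.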