Commit bb98ecaf authored by Aaron Davidson, committed by Patrick Wendell

SPARK-1860: Do not cleanup application work/ directories by default

The old default (cleanup enabled) causes an unrecoverable error for applications
that run for longer than 7 days and have jars added to the SparkContext, as the
jars are cleaned up even though the application is still running.
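
To make the failure mode concrete, here is a minimal sketch of an affected application (the object name, jar path, and job loop are illustrative only, not from the commit):

  import org.apache.spark.{SparkConf, SparkContext}

  object LongRunningApp {
    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(new SparkConf().setAppName("long-running-app"))
      // Executors fetch this jar into their application work/ directory.
      sc.addJar("/opt/jobs/extra-udfs.jar")

      // Keeps running past spark.worker.cleanup.appDataTtl (7 days by default).
      // Once the worker's cleaner deletes the app's work/ directory, tasks that
      // need classes from the added jar fail, and the app cannot recover
      // without a restart.
      while (true) {
        sc.parallelize(1 to 1000).map(_ * 2).count()
        Thread.sleep(60 * 60 * 1000L)
      }
    }
  }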

Author: Aaron Davidson <aaron@databricks.com>

Closes #800 from aarondav/shitty-defaults and squashes the following commits:

a573fbb [Aaron Davidson] SPARK-1860: Do not cleanup application work/ directories by default
parent 94c51396
@@ -65,7 +65,7 @@ private[spark] class Worker(
   val REGISTRATION_TIMEOUT = 20.seconds
   val REGISTRATION_RETRIES = 3
-  val CLEANUP_ENABLED = conf.getBoolean("spark.worker.cleanup.enabled", true)
+  val CLEANUP_ENABLED = conf.getBoolean("spark.worker.cleanup.enabled", false)
   // How often worker will clean up old app folders
   val CLEANUP_INTERVAL_MILLIS = conf.getLong("spark.worker.cleanup.interval", 60 * 30) * 1000
   // TTL for app folders/data; after TTL expires it will be cleaned up
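
For context on the "7 days" in the description: the TTL referenced by the trailing comment in the hunk above is defined on the line just below the shown context. A sketch of the three worker-side cleanup settings together (the appDataTtl line is not part of this diff; its constant name and 7-day default are taken from the surrounding Worker.scala source of this era):

  // Whether to clean up old application directories at all (now off by default).
  val CLEANUP_ENABLED = conf.getBoolean("spark.worker.cleanup.enabled", false)
  // How often the worker scans for expired app folders (seconds, converted to millis).
  val CLEANUP_INTERVAL_MILLIS = conf.getLong("spark.worker.cleanup.interval", 60 * 30) * 1000
  // TTL for app folders/data, in seconds: 7 * 24 * 3600 = 604800, i.e. 7 days.
  val APP_DATA_RETENTION_SECS = conf.getLong("spark.worker.cleanup.appDataTtl", 7 * 24 * 3600)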
@@ -390,10 +390,11 @@ Apart from these, the following properties are also available, and may be useful
 </tr>
 <tr>
   <td>spark.worker.cleanup.enabled</td>
-  <td>true</td>
+  <td>false</td>
   <td>
     Enable periodic cleanup of worker / application directories. Note that this only affects standalone
-    mode, as YARN works differently.
+    mode, as YARN works differently. Applications directories are cleaned up regardless of whether
+    the application is still running.
   </td>
 </tr>
 <tr>
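
After this change, users who want periodic cleanup must opt in explicitly on each worker. A minimal example, assuming the usual standalone deployment where worker JVM properties are set through SPARK_WORKER_OPTS in conf/spark-env.sh (the interval and TTL values shown are simply the defaults, in seconds):

  export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true \
    -Dspark.worker.cleanup.interval=1800 \
    -Dspark.worker.cleanup.appDataTtl=604800"

These are worker-side properties: setting them in an application's SparkConf has no effect, since each worker reads them from its own configuration at startup.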