Commit cdb003e3 authored by Patrick Wendell

Removing docs on akka options

parent 792d9084
@@ -192,7 +192,15 @@ class SparkConf(loadDefaults: Boolean) extends Cloneable with Logging {
   }
 
   /** Get all akka conf variables set on this SparkConf */
-  def getAkkaConf: Seq[(String, String)] = getAll.filter {case (k, v) => k.startsWith("akka.")}
+  def getAkkaConf: Seq[(String, String)] =
+    /* This is currently undocumented. If we want to make this public we should consider
+     * nesting options under the spark namespace to avoid conflicts with user akka options.
+     * Otherwise users configuring their own akka code via system properties could mess up
+     * spark's akka options.
+     *
+     *   E.g. spark.akka.option.x.y.x = "value"
+     */
+    getAll.filter {case (k, v) => k.startsWith("akka.")}
 
   /** Does the configuration contain a given parameter? */
   def contains(key: String): Boolean = settings.contains(key)
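The new comment proposes nesting Akka options under the spark namespace (e.g. spark.akka.option.x.y.x) so that user-set akka.* system properties cannot collide with Spark's own Akka settings. A minimal sketch of that translation, assuming a hypothetical spark.akka.option. prefix and a helper name that is not part of this commit:

    import org.apache.spark.SparkConf

    // Hypothetical helper (not in this commit): read keys set under the proposed
    // "spark.akka.option." namespace and strip the prefix before handing them to Akka,
    // so user-configured "akka.*" system properties cannot clash with Spark's options.
    def getNamespacedAkkaConf(conf: SparkConf): Seq[(String, String)] = {
      val prefix = "spark.akka.option."
      conf.getAll.collect {
        case (k, v) if k.startsWith(prefix) => (k.stripPrefix(prefix), v)
      }
    }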
@@ -379,13 +379,6 @@ Apart from these, the following properties are also available, and may be useful
     Too large a value decreases parallelism during broadcast (makes it slower); however, if it is too small, <code>BlockManager</code> might take a performance hit.
   </td>
 </tr>
-<tr>
-  <td>akka.x.y....</td>
-  <td>value</td>
-  <td>
-    An arbitrary akka configuration can be set directly on spark conf and it is applied for all the ActorSystems created spark wide for that SparkContext and its assigned executors as well.
-  </td>
-</tr>
 <tr>
   <td>spark.shuffle.consolidateFiles</td>
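The removed table row documented that any akka.* key set on a SparkConf is forwarded to the ActorSystems Spark creates; the behavior itself remains, just undocumented. A usage sketch, with an illustrative Akka option name:

    import org.apache.spark.SparkConf

    // Any key starting with "akka." set on the conf is returned by getAkkaConf and
    // applied to the ActorSystems created for this SparkContext and its executors.
    // "akka.loglevel" is only an illustrative option here.
    val conf = new SparkConf()
      .setMaster("local")
      .setAppName("akka-conf-example")
      .set("akka.loglevel", "DEBUG")

    conf.getAkkaConf.foreach { case (k, v) => println(k + " = " + v) }
    // prints: akka.loglevel = DEBUG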