Commit 069bb942 authored by Andrew Ash

Clarify spark.default.parallelism

It's the task count across the cluster, not per worker, per machine, per core, or anything else.
parent f8544981
@@ -98,7 +98,7 @@ Apart from these, the following properties are also available, and may be useful
 <td>spark.default.parallelism</td>
 <td>8</td>
 <td>
-    Default number of tasks to use for distributed shuffle operations (<code>groupByKey</code>,
+    Default number of tasks to use across the cluster for distributed shuffle operations (<code>groupByKey</code>,
     <code>reduceByKey</code>, etc) when not set by user.
 </td>
 </tr>
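The distinction this commit makes can be seen directly in code: the property caps the total number of shuffle tasks cluster-wide, not per worker or per core. A minimal sketch, assuming the `SparkConf`-based API (introduced after this commit; earlier releases set the property via `System.setProperty`), with a hypothetical app name:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// spark.default.parallelism is the TOTAL number of reduce tasks across
// the whole cluster, not a per-worker or per-core count.
val conf = new SparkConf()
  .setAppName("parallelism-example") // hypothetical name
  .set("spark.default.parallelism", "16")

val sc = new SparkContext(conf)
val pairs = sc.parallelize(Seq(("a", 1), ("b", 2), ("a", 3)))

// When no numPartitions argument is passed, the shuffle falls back to
// spark.default.parallelism, so this produces 16 partitions in total,
// however many workers the cluster has.
val counts = pairs.reduceByKey(_ + _)
println(counts.partitions.length) // 16
```

Passing an explicit partition count (for example `reduceByKey(_ + _, 32)`) overrides the default for that one operation.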