Commit cfb25b27 authored by jinxing, committed by Wenchen Fan

[SPARK-21530] Update description of spark.shuffle.maxChunksBeingTransferred.

## What changes were proposed in this pull request?

Update the description of `spark.shuffle.maxChunksBeingTransferred` to note that new incoming connections are closed once the max is hit, and that the client should therefore have a retry mechanism.

Author: jinxing <jinxing6042@126.com>

Closes #18735 from jinxing64/SPARK-21530.
parent 60472dbf
@@ -258,7 +258,11 @@ public class TransportConf {
   }
 
   /**
-   * The max number of chunks allowed to being transferred at the same time on shuffle service.
+   * The max number of chunks allowed to be transferred at the same time on shuffle service.
+   * Note that new incoming connections will be closed when the max number is hit. The client will
+   * retry according to the shuffle retry configs (see `spark.shuffle.io.maxRetries` and
+   * `spark.shuffle.io.retryWait`), if those limits are reached the task will fail with fetch
+   * failure.
    */
   public long maxChunksBeingTransferred() {
     return conf.getLong("spark.shuffle.maxChunksBeingTransferred", Long.MAX_VALUE);
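To make the new comment concrete, below is a minimal Java sketch of the pattern it describes: a server-side guard that stops accepting new transfers once the in-flight count reaches `maxChunksBeingTransferred()`. The class and method names here are hypothetical illustrations, not Spark's actual shuffle-service code.

```java
import java.util.concurrent.atomic.AtomicLong;

// Hypothetical guard, for illustration only: tracks chunks in flight and
// refuses new transfers once the configured maximum is reached. In Spark,
// the limit itself would come from TransportConf.maxChunksBeingTransferred().
public class ChunkTransferLimiter {
    private final AtomicLong chunksBeingTransferred = new AtomicLong(0);
    private final long maxChunks;

    public ChunkTransferLimiter(long maxChunks) {
        this.maxChunks = maxChunks;
    }

    /**
     * Tries to start one chunk transfer. Returns false when the limit is hit,
     * in which case the caller closes the new incoming connection and relies
     * on the client's retry mechanism (spark.shuffle.io.maxRetries/retryWait).
     */
    public boolean tryAcquire() {
        while (true) {
            long current = chunksBeingTransferred.get();
            if (current >= maxChunks) {
                return false;
            }
            if (chunksBeingTransferred.compareAndSet(current, current + 1)) {
                return true;
            }
        }
    }

    /** Must be called when a transfer completes, successfully or not. */
    public void release() {
        chunksBeingTransferred.decrementAndGet();
    }
}
```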
@@ -635,7 +635,11 @@ Apart from these, the following properties are also available, and may be useful
   <td><code>spark.shuffle.maxChunksBeingTransferred</code></td>
   <td>Long.MAX_VALUE</td>
   <td>
-    The max number of chunks allowed to being transferred at the same time on shuffle service.
+    The max number of chunks allowed to be transferred at the same time on shuffle service.
+    Note that new incoming connections will be closed when the max number is hit. The client will
+    retry according to the shuffle retry configs (see <code>spark.shuffle.io.maxRetries</code> and
+    <code>spark.shuffle.io.retryWait</code>), if those limits are reached the task will fail with
+    fetch failure.
   </td>
 </tr>
 <tr>
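For users, the documented behavior involves three interacting properties. A small configuration sketch follows; the values below are arbitrary placeholders, not recommendations.

```java
import org.apache.spark.SparkConf;

public class ShuffleChunkLimitConfig {
    public static void main(String[] args) {
        SparkConf conf = new SparkConf()
            .setAppName("shuffle-chunk-limit-demo")
            // Server side: cap on chunks being transferred concurrently
            // by the shuffle service (default is Long.MAX_VALUE, i.e. no cap).
            .set("spark.shuffle.maxChunksBeingTransferred", "8192")
            // Client side: retry budget used when a connection is closed
            // because the cap was hit; once exhausted, the task fails
            // with a fetch failure.
            .set("spark.shuffle.io.maxRetries", "3")
            .set("spark.shuffle.io.retryWait", "5s");
        // Pass conf to SparkContext / SparkSession as usual.
    }
}
```

The cap throttles the shuffle service, while the two `spark.shuffle.io.*` settings bound how long a client keeps retrying before the task fails with a fetch failure.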