# [SPARK-16420] Ensure compression streams are closed.
## What changes were proposed in this pull request?

This uses the try/finally pattern to ensure streams are closed after use. `UnsafeShuffleWriter` wasn't closing compression streams, causing them to leak resources until garbage collected. This was causing a problem with codecs that use off-heap memory.

## How was this patch tested?

Current tests are sufficient. This should not change behavior.

Author: Ryan Blue <blue@apache.org>

Closes #14093 from rdblue/SPARK-16420-unsafe-shuffle-writer-leak.
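The fix described above can be sketched as follows. This is a minimal illustration of the try/finally close pattern, not the actual patch: the class name and the use of `GZIPOutputStream` from the JDK are placeholders for Spark's compression codec streams, chosen so the example is self-contained.

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.util.zip.GZIPOutputStream;

public class CompressionCloseExample {
    // Hypothetical helper: writes data through a compression stream and
    // closes it in a finally block, so the codec's resources (including any
    // off-heap buffers) are released even if write() throws, instead of
    // leaking until garbage collection.
    static byte[] compress(byte[] data) throws IOException {
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        OutputStream compressed = new GZIPOutputStream(sink);
        try {
            compressed.write(data);
        } finally {
            compressed.close(); // always runs: this is the try/finally pattern
        }
        return sink.toByteArray();
    }

    public static void main(String[] args) throws IOException {
        byte[] out = compress("shuffle block payload".getBytes("UTF-8"));
        // GZIP output starts with the magic bytes 0x1f 0x8b.
        System.out.println(out.length > 0 && (out[0] & 0xff) == 0x1f);
    }
}
```

On Java 7+, the same guarantee can be had with try-with-resources (`try (OutputStream compressed = ...) { ... }`), which generates the equivalent finally-based close.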
Showing 4 changed files with 57 additions and 11 deletions:
- common/network-common/src/main/java/org/apache/spark/network/util/LimitedInputStream.java (23 additions, 0 deletions)
- core/src/main/java/org/apache/spark/shuffle/sort/UnsafeShuffleWriter.java (12 additions, 5 deletions)
- core/src/main/scala/org/apache/spark/broadcast/TorrentBroadcast.scala (10 additions, 3 deletions)
- core/src/main/scala/org/apache/spark/serializer/GenericAvroSerializer.scala (12 additions, 3 deletions)