Commit ce204780 authored by Greg Owen, committed by gatorsmile

[SPARK-22120][SQL] TestHiveSparkSession.reset() should clean out Hive warehouse directory

## What changes were proposed in this pull request?
During TestHiveSparkSession.reset(), which is called after each TestHiveSingleton suite, we now delete and recreate the Hive warehouse directory.
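
The cleanup amounts to resolving the configured warehouse location, deleting it recursively, and recreating it empty. A minimal sketch of that step, factored into a standalone helper for readability (the helper and object names are illustrative; the actual change is inlined in `reset()`, as shown in the diff below):

```scala
package org.apache.spark.sql.hive.test

import java.io.File
import java.net.URI

// Utils is spark-private, so this sketch lives in an org.apache.spark package.
import org.apache.spark.util.Utils

// Illustrative helper (not part of the patch): delete and recreate the
// warehouse directory so the next suite starts from an empty warehouse.
object WarehouseCleanup {
  def cleanWarehouse(warehouseUriString: String): Unit = {
    // "spark.sql.warehouse.dir" holds a URI string; resolve it to a local path.
    val warehouseDir = new File(new URI(warehouseUriString).getPath)
    // Remove any table data left behind by the previous suite...
    Utils.deleteRecursively(warehouseDir)
    // ...and recreate the (now empty) directory for the next suite.
    warehouseDir.mkdir()
  }
}
```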

## How was this patch tested?
Ran the full test suite locally and verified that it passes.

Author: Greg Owen <greg@databricks.com>

Closes #19341 from GregOwen/SPARK-22120.
parent 038b1857
@@ -18,6 +18,7 @@
 package org.apache.spark.sql.hive.test
 
 import java.io.File
+import java.net.URI
 import java.util.{Set => JavaSet}
 
 import scala.collection.JavaConverters._
@@ -498,6 +499,11 @@ private[hive] class TestHiveSparkSession(
       }
     }
 
+    // Clean out the Hive warehouse between each suite
+    val warehouseDir = new File(new URI(sparkContext.conf.get("spark.sql.warehouse.dir")).getPath)
+    Utils.deleteRecursively(warehouseDir)
+    warehouseDir.mkdir()
+
     sharedState.cacheManager.clearCache()
     loadedTables.clear()
     sessionState.catalog.reset()
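
For context, this is roughly how the cleanup gets triggered: suites mix in `TestHiveSingleton`, whose `afterAll` hook calls `reset()` on the shared test session. A hedged sketch of such a suite (the suite name and table name are hypothetical, and the exact hooks may differ by Spark version):

```scala
package org.apache.spark.sql.hive.test

// Hypothetical suite: any managed table created here puts files under the
// shared warehouse directory. When the suite finishes, TestHiveSingleton's
// afterAll invokes reset(), which with this patch also wipes and recreates
// the warehouse directory, so these files cannot leak into later suites.
class ExampleWarehouseCleanupSuite extends TestHiveSingleton {
  test("managed table data lands in the shared warehouse directory") {
    spark.sql("CREATE TABLE example_tbl AS SELECT 1 AS id")
    assert(spark.table("example_tbl").count() == 1)
  }
}
```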