Commit 2ae7b88a authored by Yin Huai's avatar Yin Huai Committed by Reynold Xin

[SPARK-15705][SQL] Change the default value of spark.sql.hive.convertMetastoreOrc to false.

## What changes were proposed in this pull request?
In 2.0, we added new logic that converts a HiveTableScan on ORC tables to Spark's native code path. However, during this conversion, we drop the original metastore schema (https://issues.apache.org/jira/browse/SPARK-15705). Because of this regression, I am changing the default value of `spark.sql.hive.convertMetastoreOrc` to false.

Author: Yin Huai <yhuai@databricks.com>

Closes #14267 from yhuai/SPARK-15705-changeDefaultValue.
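With this change, users who still want Spark's native ORC read path have to opt in explicitly. A minimal sketch of how the flag could be flipped back on, assuming a Spark 2.0 session with Hive support (`spark-defaults.conf` placement and the session-level `SET` are illustrative, not part of this patch):

```
# spark-defaults.conf: opt back in to the native ORC conversion
spark.sql.hive.convertMetastoreOrc  true
```

or per session from SQL:

```
-- Re-enable the native path for this session only;
-- with the new default (false), Spark SQL uses the Hive SerDe
-- for metastore ORC tables, preserving the metastore schema.
SET spark.sql.hive.convertMetastoreOrc=true;
```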
parent 162d04a3
@@ -97,10 +97,11 @@ private[spark] object HiveUtils extends Logging {
     .createWithDefault(false)

   val CONVERT_METASTORE_ORC = SQLConfigBuilder("spark.sql.hive.convertMetastoreOrc")
     .internal()
     .doc("When set to false, Spark SQL will use the Hive SerDe for ORC tables instead of " +
       "the built in support.")
     .booleanConf
-    .createWithDefault(true)
+    .createWithDefault(false)

   val HIVE_METASTORE_SHARED_PREFIXES = SQLConfigBuilder("spark.sql.hive.metastore.sharedPrefixes")
     .doc("A comma separated list of class prefixes that should be loaded using the classloader " +
...