Commit f2f26c2a authored by Patrick Wendell

SPARK-3092 [SQL]: Always include the thriftserver when -Phive is enabled.

Currently we have a separate profile called hive-thriftserver. I originally suggested this in case users did not want to bundle the thriftserver, but it has ultimately led to a lot of confusion. Since the thriftserver is only a few classes, I don't see a really good reason to isolate it from the rest of Hive. So let's go ahead and just include it in the same profile to simplify things.

This has been suggested in the past by liancheng.

Author: Patrick Wendell <pwendell@gmail.com>

Closes #2006 from pwendell/hiveserver and squashes the following commits:

742ea40 [Patrick Wendell] Merge remote-tracking branch 'apache/master' into hiveserver
034ad47 [Patrick Wendell] SPARK-3092: Always include the thriftserver when -Phive is enabled.
parent 8c5a2226
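
The practical effect is one fewer flag on every build that wants the JDBC server. A minimal before/after sketch (sbt commands taken from the docs changed below; the Maven equivalents appear in the release-script hunks):

    # Before this commit: two profiles were needed to bundle the thriftserver
    sbt/sbt -Phive -Phive-thriftserver assembly

    # After this commit: -Phive alone includes the thriftserver classes
    sbt/sbt -Phive assembly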
@@ -118,11 +118,7 @@ If your project is built with Maven, add this to your POM file's `<dependencies>
 ## A Note About Thrift JDBC server and CLI for Spark SQL

 Spark SQL supports Thrift JDBC server and CLI.
-See sql-programming-guide.md for more information about those features.
-You can use those features by setting `-Phive-thriftserver` when building Spark as follows.
-
-    $ sbt/sbt -Phive-thriftserver assembly
+See sql-programming-guide.md for more information about using the JDBC server.

 ## Configuration
...
@@ -163,11 +163,6 @@
       <artifactId>spark-hive_${scala.binary.version}</artifactId>
       <version>${project.version}</version>
     </dependency>
-      </dependencies>
-    </profile>
-    <profile>
-      <id>hive-thriftserver</id>
-      <dependencies>
     <dependency>
       <groupId>org.apache.spark</groupId>
       <artifactId>spark-hive-thriftserver_${scala.binary.version}</artifactId>
...
@@ -60,14 +60,14 @@ if [[ ! "$@" =~ --package-only ]]; then
     -Dmaven.javadoc.skip=true \
     -Dhadoop.version=2.2.0 -Dyarn.version=2.2.0 \
     -Dtag=$GIT_TAG -DautoVersionSubmodules=true \
-    -Pyarn -Phive -Phive-thriftserver -Phadoop-2.2 -Pspark-ganglia-lgpl -Pkinesis-asl \
+    -Pyarn -Phive -Phadoop-2.2 -Pspark-ganglia-lgpl -Pkinesis-asl \
     --batch-mode release:prepare

   mvn -DskipTests \
     -Darguments="-DskipTests=true -Dmaven.javadoc.skip=true -Dhadoop.version=2.2.0 -Dyarn.version=2.2.0 -Dgpg.passphrase=${GPG_PASSPHRASE}" \
     -Dhadoop.version=2.2.0 -Dyarn.version=2.2.0 \
     -Dmaven.javadoc.skip=true \
-    -Pyarn -Phive -Phive-thriftserver -Phadoop-2.2 -Pspark-ganglia-lgpl -Pkinesis-asl \
+    -Pyarn -Phive -Phadoop-2.2 -Pspark-ganglia-lgpl -Pkinesis-asl \
     release:perform

   cd ..
@@ -117,10 +117,10 @@ make_binary_release() {
     spark-$RELEASE_VERSION-bin-$NAME.tgz.sha
 }

-make_binary_release "hadoop1" "-Phive -Phive-thriftserver -Dhadoop.version=1.0.4" &
-make_binary_release "cdh4" "-Phive -Phive-thriftserver -Dhadoop.version=2.0.0-mr1-cdh4.2.0" &
+make_binary_release "hadoop1" "-Phive -Dhadoop.version=1.0.4" &
+make_binary_release "cdh4" "-Phive -Dhadoop.version=2.0.0-mr1-cdh4.2.0" &
 make_binary_release "hadoop2" \
-  "-Phive -Phive-thriftserver -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -Pyarn.version=2.2.0" &
+  "-Phive -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -Pyarn.version=2.2.0" &
 make_binary_release "hadoop2-without-hive" \
   "-Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -Pyarn.version=2.2.0" &
 wait
...
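
As a side note for release verification: a quick way to confirm the thriftserver actually made it into an assembly built with just `-Phive` is to look for its entry-point class. A hedged sketch (the jar path is illustrative and varies by Spark and Hadoop version):

    # HiveThriftServer2 is the thriftserver's main class; the glob is a guess
    jar tf assembly/target/scala-2.10/spark-assembly-*.jar | grep HiveThriftServer2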
@@ -99,7 +99,7 @@ echo -e "q\n" | sbt/sbt $BUILD_MVN_PROFILE_ARGS clean package assembly/assembly
 # If the Spark SQL tests are enabled, run the tests with the Hive profiles enabled:
 if [ -n "$_RUN_SQL_TESTS" ]; then
-  SBT_MAVEN_PROFILES_ARGS="$SBT_MAVEN_PROFILES_ARGS -Phive -Phive-thriftserver"
+  SBT_MAVEN_PROFILES_ARGS="$SBT_MAVEN_PROFILES_ARGS -Phive"
 fi

 # echo "q" is needed because sbt on encountering a build file with failure
 # (either resolution or compilation) prompts the user for input either q, r,
...
@@ -17,7 +17,7 @@
 # limitations under the License.
 #

-echo -e "q\n" | sbt/sbt -Phive -Phive-thriftserver scalastyle > scalastyle.txt
+echo -e "q\n" | sbt/sbt -Phive scalastyle > scalastyle.txt
 # Check style with YARN alpha built too
 echo -e "q\n" | sbt/sbt -Pyarn -Phadoop-0.23 -Dhadoop.version=0.23.9 yarn-alpha/scalastyle \
   >> scalastyle.txt
...
@@ -98,12 +98,8 @@ mvn -Pyarn-alpha -Phadoop-2.3 -Dhadoop.version=2.3.0 -Dyarn.version=0.23.7 -Dski
 # Building Thrift JDBC server and CLI for Spark SQL

-Spark SQL supports Thrift JDBC server and CLI.
-See sql-programming-guide.md for more information about those features.
-You can use those features by setting `-Phive-thriftserver` when building Spark as follows.
-{% highlight bash %}
-mvn -Phive-thriftserver assembly
-{% endhighlight %}
+Spark SQL supports Thrift JDBC server and CLI. See sql-programming-guide.md for
+more information about the JDBC server.

 # Spark Tests in Maven
...
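
The doc drops its `mvn -Phive-thriftserver assembly` example without a replacement; as a hedged sketch, the post-change Maven equivalent relies on the merged profile alone (the extra flags mirror the release-script hunks above and are illustrative):

    mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -Phive -DskipTests clean package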
@@ -578,9 +578,7 @@ evaluated by the SQL execution engine. A full list of the functions supported c
 The Thrift JDBC server implemented here corresponds to the [`HiveServer2`]
 (https://cwiki.apache.org/confluence/display/Hive/Setting+Up+HiveServer2) in Hive 0.12. You can test
-the JDBC server with the beeline script comes with either Spark or Hive 0.12. In order to use Hive
-you must first run '`sbt/sbt -Phive-thriftserver assembly/assembly`' (or use `-Phive-thriftserver`
-for maven).
+the JDBC server with the beeline script comes with either Spark or Hive 0.12.

 To start the JDBC server, run the following in the Spark directory:
...
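
For completeness, once an assembly is built with `-Phive`, starting and smoke-testing the server looks roughly like this (a sketch; the script name and port follow standard Spark and HiveServer2 conventions rather than anything shown in this diff):

    # Start the Thrift JDBC server, then connect with beeline on the
    # default HiveServer2 port (10000)
    ./sbin/start-thriftserver.sh
    ./bin/beeline -u jdbc:hive2://localhost:10000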
@@ -1179,7 +1179,7 @@
     </profile>

     <profile>
-      <id>hive-thriftserver</id>
+      <id>hive</id>
       <activation>
         <activeByDefault>false</activeByDefault>
       </activation>
...