# [SPARK-16579][SPARKR] add install.spark function
## What changes were proposed in this pull request?

Add an `install.spark()` function to the SparkR package, so that a user can run `install.spark()` from within R to install Spark to a local directory.

Updates:

- `install.spark()`
  - check for the existence of the tar file in the cache folder, and download it only if it is not found
  - trial priority for the mirror_url look-up: user-provided -> preferred mirror site from the Apache website -> hardcoded backup option
  - use Spark 2.0.0
- `sparkR.session()`
  - can install Spark when it is not found in `SPARK_HOME`

## How was this patch tested?

Manual tests, running the check-cran.sh script added in #14173.

Author: Junyang Qian <junyangq@databricks.com>

Closes #14258 from junyangq/SPARK-16579.
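The cache-then-download behavior and the mirror look-up priority described above can be sketched as follows. The actual implementation is written in R (in `R/pkg/R/install.R`); this is a hypothetical Python sketch of the same strategy, and every function name, parameter, and URL in it is illustrative rather than taken from the patch.

```python
import os

def resolve_mirror(user_url=None, preferred_url=None,
                   backup_url="https://archive.apache.org/dist"):
    """Trial priority: user-provided -> preferred Apache mirror -> hardcoded backup."""
    for candidate in (user_url, preferred_url, backup_url):
        if candidate:
            return candidate
    raise ValueError("no mirror available")

def install_spark(version="2.0.0", hadoop="2.7",
                  cache_dir="~/.cache/spark",
                  mirror_url=None, preferred_url=None):
    """Return the path to the cached Spark tarball, downloading only on a cache miss."""
    tarball = f"spark-{version}-bin-hadoop{hadoop}.tgz"
    local_path = os.path.join(os.path.expanduser(cache_dir), tarball)
    if os.path.isfile(local_path):
        return local_path  # cache hit: skip the download entirely
    base = resolve_mirror(mirror_url, preferred_url)
    remote = f"{base}/spark/spark-{version}/{tarball}"
    # ... download `remote` to `local_path`, then extract it ...
    return local_path
```

The key design point mirrored here is that a user-supplied mirror always wins, and the hardcoded backup guarantees the look-up never fails outright even when the preferred-mirror query is unavailable.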
Showing 7 changed files:

- `R/check-cran.sh` (1 addition, 1 deletion)
- `R/pkg/DESCRIPTION` (2 additions, 1 deletion)
- `R/pkg/NAMESPACE` (2 additions, 0 deletions)
- `R/pkg/R/install.R` (235 additions, 0 deletions)
- `R/pkg/R/sparkR.R` (17 additions, 0 deletions)
- `R/pkg/R/utils.R` (8 additions, 0 deletions)
- `R/pkg/inst/tests/testthat/test_sparkSQL.R` (2 additions, 2 deletions)