Commit 2082a495 authored by Dongjoon Hyun, committed by Sean Owen

[MINOR][DOCS] Use `spark-submit` instead of `sparkR` to submit R scripts.

## What changes were proposed in this pull request?

Since Spark 2.0, `sparkR` is no longer used for submitting R scripts, so a user who follows the instructions in `R/README.md` sees the following error message. This PR updates `R/README.md` accordingly.
```bash
$ ./bin/sparkR examples/src/main/r/dataframe.R
Running R applications through 'sparkR' is not supported as of Spark 2.0.
Use ./bin/spark-submit <R file>
```
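With the updated instructions, the same example is submitted through `spark-submit` instead. A minimal sketch (the `dataframe.R` example path is the one used in the README below):
```bash
# Submit the SparkR example the supported way (Spark 2.0+)
$ ./bin/spark-submit examples/src/main/r/dataframe.R
```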

## How was this patch tested?

Manual.

Author: Dongjoon Hyun <dongjoon@apache.org>

Closes #11842 from dongjoon-hyun/update_r_readme.
parent 1970d911
@@ -40,7 +40,7 @@ To set other options like driver memory, executor memory etc. you can pass in th
If you wish to use SparkR from RStudio or other R frontends you will need to set some environment variables which point SparkR to your Spark installation. For example
```
# Set this to where Spark is installed
-Sys.setenv(SPARK_HOME="/Users/shivaram/spark")
+Sys.setenv(SPARK_HOME="/Users/username/spark")
# This line loads SparkR from the installed directory
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
library(SparkR)
@@ -51,7 +51,7 @@ sc <- sparkR.init(master="local")
The [instructions](https://cwiki.apache.org/confluence/display/SPARK/Contributing+to+Spark) for making contributions to Spark also apply to SparkR.
If you only make R file changes (i.e. no Scala changes) then you can just re-install the R package using `R/install-dev.sh` and test your changes.
-Once you have made your changes, please include unit tests for them and run existing unit tests using the `run-tests.sh` script as described below.
+Once you have made your changes, please include unit tests for them and run existing unit tests using the `R/run-tests.sh` script as described below.
#### Generating documentation
@@ -60,9 +60,9 @@ The SparkR documentation (Rd files and HTML files) are not a part of the source
### Examples, Unit tests
SparkR comes with several sample programs in the `examples/src/main/r` directory.
-To run one of them, use `./bin/sparkR <filename> <args>`. For example:
+To run one of them, use `./bin/spark-submit <filename> <args>`. For example:
-./bin/sparkR examples/src/main/r/dataframe.R
+./bin/spark-submit examples/src/main/r/dataframe.R
You can also run the unit-tests for SparkR by running (you need to install the [testthat](http://cran.r-project.org/web/packages/testthat/index.html) package first):
@@ -70,7 +70,7 @@ You can also run the unit-tests for SparkR by running (you need to install the [
./R/run-tests.sh
### Running on YARN
-The `./bin/spark-submit` and `./bin/sparkR` can also be used to submit jobs to YARN clusters. You will need to set YARN conf dir before doing so. For example on CDH you can run
+The `./bin/spark-submit` can also be used to submit jobs to YARN clusters. You will need to set YARN conf dir before doing so. For example on CDH you can run
```
export YARN_CONF_DIR=/etc/hadoop/conf
./bin/spark-submit --master yarn examples/src/main/r/dataframe.R