Commit 2a39d2ca authored by Reynold Xin

Merge pull request #810 from pwendell/dead_doc_code

Remove now dead code inside of docs
parents e5b9ed28 92445241
---
layout: global
title: Tutorial - Running a Simple Spark Application
---
1. Create a directory for the Spark demo:
~$ mkdir SparkTest
2. Copy the sbt files from the ~/spark/sbt directory into SparkTest:
~/SparkTest$ cp -r ../spark/sbt .
3. Edit the ~/SparkTest/sbt/sbt file to look like this:
#!/usr/bin/env bash
java -Xmx800M -XX:MaxPermSize=150m -jar "$(dirname "$0")"/sbt-launch-*.jar "$@"
4. To build a Spark application, you need Spark and its dependencies in a single Java archive (JAR) file. Create this JAR in Spark's main directory with sbt (a minimal build definition for the SparkTest project is sketched after step 7):
~/spark$ sbt/sbt assembly
5. Create a source file in the ~/SparkTest/src/main/scala directory (create the directory first with mkdir -p if it does not exist):
~/SparkTest/src/main/scala$ vi Test1.scala
6. Make the contents of the Test1.scala file look like this (a slightly larger variation is sketched after step 7):
import spark.SparkContext
import spark.SparkContext._

object Test1 {
  def main(args: Array[String]) {
    // Run Spark locally, naming the application "SparkTest"
    val sc = new SparkContext("local", "SparkTest")
    // Distribute the numbers 1 through 10 across an RDD and sum them
    println(sc.parallelize(1 to 10).reduce(_ + _))
    System.exit(0)
  }
}
7. Compile and run the application:
~/SparkTest$ sbt/sbt run
Among sbt's output you should see the line 55, the sum of the integers 1 through 10.
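A note on the build definition, which the tutorial does not show: sbt needs the Spark assembly JAR from step 4 on the SparkTest classpath. One way is to copy that JAR into ~/SparkTest/lib/, since sbt picks up JARs in lib/ as unmanaged dependencies. A minimal build.sbt sketch under that assumption (the Scala version shown is also an assumption; use the one your Spark build was compiled with):

// ~/SparkTest/build.sbt -- minimal sketch, not part of the original tutorial.
// Assumes the Spark assembly JAR from step 4 has been copied into
// ~/SparkTest/lib/, where sbt treats it as an unmanaged dependency.

name := "SparkTest"

version := "1.0"

// Assumption: match the Scala version your Spark build uses (2.9.x era).
scalaVersion := "2.9.3"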
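To experiment further with the same pre-1.0 spark package API, here is a hypothetical second program (Test2 is an invented name, not part of the tutorial) that filters and counts an RDD:

import spark.SparkContext
import spark.SparkContext._

// Hypothetical follow-on example, not part of the original tutorial.
object Test2 {
  def main(args: Array[String]) {
    val sc = new SparkContext("local", "SparkTest")
    // Keep only the even numbers in 1..100, then count them; prints 50
    println(sc.parallelize(1 to 100).filter(_ % 2 == 0).count())
    System.exit(0)
  }
}

With two main classes in the project, sbt/sbt run will prompt you to choose which one to launch.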