From 92445241463abf2cc764aab2f79d6f0b54b8b42e Mon Sep 17 00:00:00 2001
From: Patrick Wendell <pwendell@gmail.com>
Date: Sun, 11 Aug 2013 20:33:58 -0700
Subject: [PATCH] Removing dead docs

---
 docs/spark-simple-tutorial.md | 48 ----------------------------------------
 1 file changed, 48 deletions(-)
 delete mode 100644 docs/spark-simple-tutorial.md

diff --git a/docs/spark-simple-tutorial.md b/docs/spark-simple-tutorial.md
deleted file mode 100644
index fbdbc7d19d..0000000000
--- a/docs/spark-simple-tutorial.md
+++ /dev/null
@@ -1,48 +0,0 @@
----
-layout: global
-title: Tutorial - Running a Simple Spark Application
----
-
-1. Create a directory for the Spark demo:
-
-        ~$ mkdir SparkTest
-
-2. Copy the sbt files from the ~/spark/sbt directory:
-
-        ~/SparkTest$ cp -r ../spark/sbt .
-
-3. Edit the ~/SparkTest/sbt/sbt file to look like this:
-
-        #!/usr/bin/env bash
-        java -Xmx800M -XX:MaxPermSize=150m -jar $(dirname $0)/sbt-launch-*.jar "$@"
-
-4. To build a Spark application, you need Spark and its dependencies in a single Java archive (JAR) file. Create this JAR in Spark's main directory with sbt as follows (a note at the end of this tutorial sketches one way to put the resulting JAR on the SparkTest project's classpath):
-
-        ~/spark$ sbt/sbt assembly
-
-5. Create a source file in the ~/SparkTest/src/main/scala directory:
-
-        ~/SparkTest/src/main/scala$ vi Test1.scala
-
-6. Give the Test1.scala file the following contents:
-
-        import spark.SparkContext
-        import spark.SparkContext._
-        object Test1 {
-          def main(args: Array[String]) {
-            // Connect to a local, in-process Spark master; "SparkTest" is the application name
-            val sc = new SparkContext("local", "SparkTest")
-            // Build an RDD from the numbers 1 to 10 and sum them in parallel
-            println(sc.parallelize(1 to 10).reduce(_ + _))
-            System.exit(0)
-          }
-        }
-
-7. Run the application; it should print 55, the sum of the numbers 1 through 10:
-
-        ~/SparkTest$ sbt/sbt run
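-
-Note: before sbt/sbt run will work, the SparkTest project needs the Spark classes on its classpath. sbt automatically picks up any JAR placed in a project's lib/ directory, so one way to wire this up is to copy the assembly JAR built in step 4 into ~/SparkTest/lib. The exact name and location of the assembly JAR depend on the Spark version, so the wildcard path below is only a sketch:
-
-        ~/SparkTest$ mkdir lib
-        ~/SparkTest$ cp ../spark/core/target/spark-core-assembly-*.jar lib/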
-- 
GitLab