diff --git a/docs/java-programming-guide.md b/docs/java-programming-guide.md
index 37a906ea1c780687474849473db056ad9ff71ef8..ae8257b53938e0672efc1cb35d4f77893a7a1ec2 100644
--- a/docs/java-programming-guide.md
+++ b/docs/java-programming-guide.md
@@ -189,7 +189,7 @@ We hope to generate documentation with Java-style syntax in the future.
 # Where to Go from Here
 
 Spark includes several sample programs using the Java API in
-`examples/src/main/java`.  You can run them by passing the class name to the
+[`examples/src/main/java`](https://github.com/mesos/spark/tree/master/examples/src/main/java/spark/examples).  You can run them by passing the class name to the
 `run` script included in Spark -- for example, `./run
 spark.examples.JavaWordCount`.  Each example program prints usage help when run
 without any arguments.
diff --git a/docs/python-programming-guide.md b/docs/python-programming-guide.md
index 2012241a6a77bb1a0114b72242801c00a44aea14..3a7a8db4a6ee43fdfa7af612c39a09b953b6560a 100644
--- a/docs/python-programming-guide.md
+++ b/docs/python-programming-guide.md
@@ -109,9 +109,9 @@ Code dependencies can be added to an existing SparkContext using its `addPyFile(
 
 # Where to Go from Here
 
-PySpark includes several sample programs using the Python API in `python/examples`.
+PySpark includes several sample programs in the [`python/examples` folder](https://github.com/mesos/spark/tree/master/python/examples).
-You can run them by passing the files to the `pyspark` script -- for example `./pyspark python/examples/wordcount.py`.
+You can run them by passing their file names to the `pyspark` script -- for example, `./pyspark python/examples/wordcount.py`.
-Each example program prints usage help when run without any arguments.
+Each program prints usage help when run without arguments.
 
-We currently provide [API documentation](api/pyspark/index.html) for the Python API as Epydoc.
+We currently provide [API documentation](api/pyspark/index.html) for the Python API, generated with Epydoc.
 Many of the RDD method descriptions contain [doctests](http://docs.python.org/2/library/doctest.html) that provide additional usage examples.
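+
+As a concrete illustration, here is a minimal word-count sketch in the spirit of the programs in
+[`python/examples`](https://github.com/mesos/spark/tree/master/python/examples); it is a hedged example rather than the
+actual `wordcount.py`, and the `"local"` master and usage string are illustrative assumptions.
+
+```python
+# Minimal PySpark word-count sketch (illustrative; not the bundled wordcount.py).
+# Assumes a local master and a text file path passed on the command line.
+import sys
+from pyspark import SparkContext
+
+if __name__ == "__main__":
+    if len(sys.argv) < 2:
+        sys.exit("Usage: wordcount <file>")  # mirrors the usage-help behavior noted above
+    sc = SparkContext("local", "PythonWordCount")
+    counts = (sc.textFile(sys.argv[1])
+                .flatMap(lambda line: line.split(" "))
+                .map(lambda word: (word, 1))
+                .reduceByKey(lambda a, b: a + b))
+    for word, count in counts.collect():
+        print("%s: %i" % (word, count))
+```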