- Jan 23, 2014
  - Josh Rosen authored
  - Josh Rosen authored
  - Josh Rosen authored
- Jan 18, 2014
  - Patrick Wendell authored
    Remove Typesafe Config usage and conf files to fix nested property names.
    With Typesafe Config we had the subtle problem of no longer allowing nested property names, which are used for a few of our properties: http://apache-spark-developers-list.1001551.n3.nabble.com/Config-properties-broken-in-master-td208.html
    This PR is for branch 0.9 but should be added into master too.
    (cherry picked from commit 34e911ce)
    Signed-off-by: Patrick Wendell <pwendell@gmail.com>
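The nested-property problem comes from Typesafe Config parsing dotted keys into a tree, where a path cannot be both a leaf value and a parent of other keys; a flat Java-properties-style map has no such restriction. A minimal sketch of the conflict (plain Python, not Spark code; the property names are hypothetical examples):

```python
# Illustrative sketch: folding flat dotted keys into a nested tree, the way
# a Typesafe-Config-style parser does, rejects key sets a flat map allows.

def to_tree(flat):
    """Fold flat dotted keys into a nested dict; raise on a path conflict."""
    tree = {}
    for key, value in flat.items():
        node = tree
        parts = key.split(".")
        for part in parts[:-1]:
            node = node.setdefault(part, {})
            if not isinstance(node, dict):
                raise ValueError(f"{key!r}: {part!r} is already a leaf value")
        if isinstance(node.get(parts[-1]), dict):
            raise ValueError(f"{key!r} is already a parent of other keys")
        node[parts[-1]] = value
    return tree

# A flat properties map happily holds both of these keys...
flat = {"spark.speculation": "true",
        "spark.speculation.interval": "100"}

# ...but in a tree, "spark.speculation" would have to be both a string
# and an object, so the fold fails.
try:
    to_tree(flat)
except ValueError as e:
    print("conflict:", e)
```

Dropping the tree-shaped parser and going back to flat key/value settings makes both keys legal again.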
- Jan 14, 2014
  - Matei Zaharia authored
  - Matei Zaharia authored
- Jan 13, 2014
  - Matei Zaharia authored
- Jan 12, 2014
  - Matei Zaharia authored
    This helps in case the exception happened while serializing a record to be sent to Java, leaving the stream to Java in an inconsistent state where PythonRDD won't be able to read the error.
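The failure mode described above can be illustrated with a length-prefixed stream (a plain-Python sketch, not the actual PySpark wire protocol): once a record's length header has been written but the payload write fails, the framing is corrupted and the reader can no longer recover anything, including an error marker.

```python
# Illustrative sketch: why an exception thrown mid-record on a
# length-prefixed stream leaves the reader unable to read anything further.
import io
import struct

def write_record(stream, payload: bytes):
    stream.write(struct.pack(">i", len(payload)))   # 4-byte length header
    stream.write(payload)

def read_record(stream):
    header = stream.read(4)
    if len(header) < 4:
        raise EOFError("truncated header")
    (length,) = struct.unpack(">i", header)
    payload = stream.read(length)
    if len(payload) < length:
        raise EOFError("truncated payload")
    return payload

buf = io.BytesIO()
write_record(buf, b"record-1")
# Simulate a serializer crash after the header but before the payload:
# the stream now promises 100 bytes that never arrive.
buf.write(struct.pack(">i", 100))

buf.seek(0)
print(read_record(buf))     # the complete record reads back fine
try:
    read_record(buf)        # the half-written record breaks the framing
except EOFError as e:
    print("stream broken:", e)
```

Catching the exception before any bytes of the record are written keeps the stream at a record boundary, so the error can still be communicated.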
  - Matei Zaharia authored
    We've used camel case in other Spark methods, so it felt reasonable to keep using it here and make the code match Scala/Java as much as possible. Note that parameter names matter in Python because they allow passing optional parameters by name.
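The point about parameter names being part of Python's public API can be shown with a small stand-in (the signature below is hypothetical, loosely modeled on PySpark's style, and is not the real RDD method):

```python
# Plain-Python stand-in for an RDD method with camelCase parameters:
# because callers can pass optional parameters by name, renaming a
# parameter is an API break, so matching the Scala/Java names matters.

def sortByKey(pairs, ascending=True, numPartitions=None):
    """Sort (key, value) pairs by key; numPartitions is accepted only
    for API symmetry with the JVM side and ignored in this sketch."""
    return sorted(pairs, key=lambda kv: kv[0], reverse=not ascending)

pairs = [("b", 2), ("a", 1), ("c", 3)]
print(sortByKey(pairs, ascending=False))
# [('c', 3), ('b', 2), ('a', 1)]
```

A caller can skip `ascending` and write `sortByKey(pairs, numPartitions=4)`, naming only what they set; that convenience is exactly what couples the names to the API.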
  - Matei Zaharia authored
    - Added a Python wrapper for Naive Bayes
    - Updated the Scala Naive Bayes to match the style of our other algorithms better, and in particular to make it easier to call from Java (added builder pattern, removed default value in train method)
    - Updated Python MLlib functions to not require a SparkContext; we can get that from the RDD the user gives
    - Added a toString method in LabeledPoint
    - Made the Python MLlib tests run as part of run-tests as well (before, they could only be run individually through each file)
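The "no explicit SparkContext" change in the list above can be sketched with mock classes (these are not MLlib types; real RDD handles carry a reference to their context, which is the property the change exploits):

```python
# Sketch: instead of train(sc, data), recover the context from the RDD
# handle the user already passes in. MockContext/MockRDD are stand-ins.

class MockContext:
    def __init__(self, name):
        self.name = name

class MockRDD:
    def __init__(self, context, rows):
        self.context = context   # real RDDs hold their SparkContext too
        self.rows = rows

def train(data):
    sc = data.context            # derived from the RDD, not a parameter
    # ...the real wrapper would now call into the JVM through sc...
    return f"trained on {len(data.rows)} rows via context {sc.name}"

rdd = MockRDD(MockContext("local"), [(0.0, [1.0]), (1.0, [2.0])])
print(train(rdd))
```

Dropping the explicit context argument removes one parameter from every MLlib entry point without losing access to the JVM gateway.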
- Jan 06, 2014
  - Hossein Falaki authored
  - Hossein Falaki authored
- Jan 04, 2014
  - Hossein Falaki authored
- Jan 03, 2014
  - Patrick Wendell authored
    Closes #316
  - Prashant Sharma authored
  - Prashant Sharma authored
- Jan 02, 2014
  - Prashant Sharma authored
- Jan 01, 2014
  - Matei Zaharia authored
  - Matei Zaharia authored
    Also replaced SparkConf.getOrElse with just a "get" that takes a default value, and added getInt, getLong, etc., to make code that uses this simpler later on.
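A minimal sketch of the shape this change describes (this is not the real SparkConf, just an illustration in Python): a single `get(key, default)` replacing `getOrElse`, with typed convenience getters layered on top of it.

```python
# Minimal conf sketch: get(key, default) plus typed getters built on it.

class Conf:
    def __init__(self):
        self._settings = {}

    def set(self, key, value):
        self._settings[key] = str(value)
        return self                      # chainable, SparkConf-style

    def get(self, key, default=None):
        return self._settings.get(key, default)

    def getInt(self, key, default):
        return int(self.get(key, default))

    def getLong(self, key, default):
        return int(self.get(key, default))

    def getDouble(self, key, default):
        return float(self.get(key, default))

conf = Conf().set("spark.executor.memory", "4g").set("spark.cores", 8)
print(conf.get("spark.master", "local"))   # prints "local" (the default)
print(conf.getInt("spark.cores", 1))       # prints 8, parsed from "8"
```

Callers no longer write `getOrElse` plus a manual parse; the typed getters keep call sites to one short line.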
- Dec 30, 2013
  - Matei Zaharia authored
- Dec 29, 2013
  - Matei Zaharia authored
  - Matei Zaharia authored
    tests so we don't get the test spark.conf on the classpath.
  - Matei Zaharia authored
  - Matei Zaharia authored
    The test in context.py created two different instances of the SparkContext class by copying "globals", so that some tests can have a global "sc" object and others can try initializing their own contexts. This led to two JVM gateways being created, since SparkConf also looked at pyspark.context.SparkContext to get the JVM.
  - Matei Zaharia authored
- Dec 28, 2013
  - Matei Zaharia authored
  - Tor Myklebust authored
- Dec 25, 2013
  - Tor Myklebust authored
  - Tor Myklebust authored
- Dec 24, 2013
  - Tor Myklebust authored
  - Tor Myklebust authored
  - Andrew Ash authored
  - Tathagata Das authored
  - Tor Myklebust authored
  - Tor Myklebust authored
- Dec 22, 2013
  - Tor Myklebust authored
- Dec 21, 2013
  - Tor Myklebust authored
- Dec 20, 2013
  - Tor Myklebust authored
  - Tor Myklebust authored
  - Tor Myklebust authored