- Mar 02, 2014
Patrick Wendell authored
This lets us explicitly include Avro based on a profile for 0.23.X builds. It makes me sad how convoluted it is to express this logic in Maven. @tgraves and @sryza curious if this works for you. I'm also considering just reverting to how it was before. The only real problem was that Spark advertised a dependency on Avro even though it only really depends transitively on Avro through other deps.

Author: Patrick Wendell <pwendell@gmail.com>

Closes #49 from pwendell/avro-build-fix and squashes the following commits:
8d6ee92 [Patrick Wendell] SPARK-1121: Add avro to yarn-alpha profile
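For readers less familiar with profile-gated dependencies, the idea is roughly the one sketched below. This is a hypothetical SBT-style sketch, not the Maven profile the commit actually touches; the `yarnAlpha` property name and the Avro version are illustrative assumptions.

```scala
// Hypothetical build.sbt sketch: pull in Avro only when building against the
// older YARN alpha (Hadoop 0.23.x) APIs, e.g. `sbt -DyarnAlpha=true ...`.
// The `yarnAlpha` property name and the Avro version are illustrative.
val yarnAlpha = sys.props.get("yarnAlpha").exists(_.toBoolean)

libraryDependencies ++= (
  if (yarnAlpha) Seq("org.apache.avro" % "avro" % "1.7.4")
  else Seq.empty
)
```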
Patrick Wendell authored
This removes some loose ends not caught by the other (incubating -> tlp) patches. @markhamstra this updates the version as you mentioned earlier.

Author: Patrick Wendell <pwendell@gmail.com>

Closes #51 from pwendell/tlp and squashes the following commits:
d553b1b [Patrick Wendell] Remove remaining references to incubation
- Feb 27, 2014
Sean Owen authored
(Ported from https://github.com/apache/incubator-spark/pull/637)

Author: Sean Owen <sowen@cloudera.com>

Closes #31 from srowen/SPARK-1084.1 and squashes the following commits:
6c4a32c [Sean Owen] Suppress warnings about legitimate unchecked array creations, or change code to avoid it
f35b833 [Sean Owen] Fix two misc javadoc problems
254e8ef [Sean Owen] Fix one new style error introduced in scaladoc warning commit
5b2fce2 [Sean Owen] Fix scaladoc invocation warning, and enable javac warnings properly, with plugin config updates
007762b [Sean Owen] Remove dead scaladoc links
b8ff8cb [Sean Owen] Replace deprecated Ant <tasks> with <target>
- Feb 23, 2014
Sean Owen authored
Prompted by a recent thread on the mailing list, I tried and failed to see if Spark can be made independent of log4j. There are a few cases where control of the underlying logging is pretty useful, and to do that, you have to bind to a specific logger. Instead I propose some tidying that keeps Spark's use of log4j but gets rid of warnings and should still enable downstream users to switch.

The idea is to pipe everything (except log4j) through SLF4J, have Spark use SLF4J directly when logging, and, where Spark needs to output info (REPL and tests), bind from SLF4J to log4j. This leaves the same behavior in Spark. It means that downstream users who want to use something other than log4j should:
- Exclude the dependencies on log4j and slf4j-log4j12 from Spark
- Include a dependency on log4j-over-slf4j
- Include a dependency on another logger X, and on the corresponding slf4j-X binding
- Recreate any log config that Spark provides, where needed, in the other logger's config

That sounds about right. Here are the key changes:
- Include the jcl-over-slf4j shim everywhere by depending on it in core.
- Exclude dependencies on commons-logging from third-party libraries.
- Include the jul-to-slf4j shim everywhere by depending on it in core.
- Exclude slf4j-* dependencies from third-party libraries to prevent collisions or warnings.
- Add the missing slf4j-log4j12 binding to the GraphX and Bagel module tests.

And minor/incidental changes:
- Update to SLF4J 1.7.5, which happily matches Hadoop 2's version and is a recommended update over 1.7.2.
- (Remove a duplicate HBase dependency declaration in SparkBuild.scala.)
- (Remove a duplicate mockito dependency declaration that was causing warnings and bugging me.)

Author: Sean Owen <sowen@cloudera.com>

Closes #570 from srowen/SPARK-1071 and squashes the following commits:
52eac9f [Sean Owen] Add slf4j-over-log4j12 dependency to core (non-test) and remove it from things that depend on core.
77a7fa9 [Sean Owen] SPARK-1071: Tidy logging strategy and use of log4j
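As a concrete illustration of the downstream switch described above, here is a minimal SBT-style sketch, not taken from the patch itself. It assumes Logback as the replacement backend, and the Spark, SLF4J, and Logback versions shown are illustrative.

```scala
// Hypothetical build.sbt fragment for a downstream project that wants Logback
// instead of log4j behind Spark's SLF4J-based logging. Versions are illustrative.
libraryDependencies ++= Seq(
  // Depend on Spark, but keep log4j and Spark's slf4j-log4j12 binding out.
  ("org.apache.spark" %% "spark-core" % "1.0.0")
    .exclude("org.slf4j", "slf4j-log4j12")
    .exclude("log4j", "log4j"),
  // Route any code that still calls log4j APIs through SLF4J instead.
  "org.slf4j" % "log4j-over-slf4j" % "1.7.5",
  // Alternative backend; logback-classic ships its own SLF4J binding.
  "ch.qos.logback" % "logback-classic" % "1.1.2"
)
```

Any log configuration that Spark previously supplied through log4j.properties would then need an equivalent in the chosen backend's own config (e.g. a logback.xml).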
- Feb 08, 2014
Mark Hamstra authored
Version number to 1.0.0-SNAPSHOT

Since 0.9.0-incubating is done and out the door, we shouldn't be building 0.9.0-incubating-SNAPSHOT anymore. @pwendell

Author: Mark Hamstra <markhamstra@gmail.com>

== Merge branch commits ==
commit 1b00a8a7c1a7f251b4bb3774b84b9e64758eaa71
Author: Mark Hamstra <markhamstra@gmail.com>
Date: Wed Feb 5 09:30:32 2014 -0800
Version number to 1.0.0-SNAPSHOT
- Dec 15, 2013
Mark Hamstra authored
- Dec 10, 2013
Prashant Sharma authored
- Dec 07, 2013
Prashant Sharma authored
Incorporated Patrick's feedback comment on #211 and made the Maven build/dependency resolution at least a bit faster.
- Oct 12, 2013
Andrew xia authored
- Oct 06, 2013
Patrick Wendell authored
- Sep 26, 2013
Patrick Wendell authored
Prashant Sharma authored
- Sep 06, 2013
Jey Kottalam authored
- Sep 01, 2013
Matei Zaharia authored
Matei Zaharia authored
- Aug 21, 2013
Mark Hamstra authored
- Aug 18, 2013
Jey Kottalam authored
- Aug 16, 2013
Jey Kottalam authored
Jey Kottalam authored
Jey Kottalam authored
Jey Kottalam authored
- Aug 13, 2013
Shivaram Venkataraman authored
- Jul 16, 2013
Matei Zaharia authored
- Jun 14, 2013
Prashant Sharma authored
- May 01, 2013
Jey Kottalam authored
- Apr 07, 2013
Mridul Muralidharan authored
- Mar 28, 2013
Jey Kottalam authored
- Mar 15, 2013
Mikhail Bautin authored
Also rename parent project to spark-parent (otherwise it shows up as "parent" in IntelliJ, which is very confusing).
- Mar 01, 2013
Mark Hamstra authored
- Jan 31, 2013
Mikhail Bautin authored
See the discussion at https://github.com/mesos/spark/pull/355 for why default profile activation is a problem.
- Jan 22, 2013
Mikhail Bautin authored
- Jan 20, 2013
Matei Zaharia authored
As part of this, changed our Scala 2.9.2 Kafka library to be available through a local (in-project) Maven repository, following the example at http://blog.dub.podval.org/2010/01/maven-in-project-repository.html.
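For context, an in-project repository of this kind is typically wired into the build roughly as below. This is a hypothetical sketch rather than the commit's actual configuration; the repository path and the Kafka artifact coordinates/version are illustrative assumptions.

```scala
// Hypothetical build.sbt sketch: resolve a locally published Kafka artifact from
// a Maven-layout repository checked into the project tree.
// The "local-repo" path and the artifact coordinates/version are illustrative.
resolvers += (
  "In-Project Repository" at (baseDirectory.value / "local-repo").toURI.toString
)

libraryDependencies += "org.apache.kafka" % "kafka" % "0.7.2"
```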
- Jan 12, 2013
Shivaram Venkataraman authored
- Jan 11, 2013
Shivaram Venkataraman authored
- Jan 08, 2013
Shivaram Venkataraman authored
by using -Dhadoop -Phadoop2.
- Jan 07, 2013
Shivaram Venkataraman authored
- Dec 18, 2012
Thomas Dudziak authored
Fixed the repl Maven build to produce artifacts with the appropriate Hadoop classifier, and extracted the repl fat-jar and Debian packaging into a separate project to make Maven happy.
- Dec 10, 2012
Thomas Dudziak authored
Matei Zaharia authored
This will make it easier to make the "run" script work with a Maven build
Thomas Dudziak authored