Conversation
Current coverage is 90.62% (diff: 100%)

@@            master   #206   diff @@
=====================================
  Files            5      5
  Lines          322    320     -2
  Methods        270    271     +1
  Messages         0      0
  Branches        52     49     -3
=====================================
- Hits           294    290     -4
- Misses          28     30     +2
  Partials         0      0
In order for this to actually test the right thing, you're going to need to also update the Travis build matrix at https://github.com/databricks/spark-avro/blob/branch-3.1/.travis.yml#L13
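For context, a Travis build matrix entry along these lines is what the comment is asking for. This is a hypothetical sketch only: the actual keys, JDK/Scala versions, and environment variable name in branch-3.1's `.travis.yml` may differ, and `TEST_SPARK_VERSION` is an assumed name for illustration.

```yaml
# Hypothetical sketch of adding the new Spark version to the build matrix
# so CI actually exercises the upgraded code path. Versions and the
# TEST_SPARK_VERSION variable are illustrative assumptions.
matrix:
  include:
    - jdk: oraclejdk7
      scala: 2.10.6
      env: TEST_SPARK_VERSION="2.0.2"
    - jdk: oraclejdk7
      scala: 2.11.8
      env: TEST_SPARK_VERSION="2.1.0"   # new entry covering Spark 2.1.x
```

Without a matrix entry pinned to the new Spark version, the test suite would keep running against the old version and the upgrade would go untested.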
I'd very much appreciate seeing this uplift merged when acceptable; it's a prerequisite to forming a pull request for Avro encoders (see #169).
Any idea on when this is going to get merged / a new release will be made?
+1
+1
👍 Any ETA on when this will get in?
+1
I'll merge this today. It's going to be a little while before I'll be able to push a release to Maven Central because this process is currently blocked by an infrastructure issue on our end.
+1, waiting for this to get in.
It looks like it's possible to produce a single
Actually, I'll merge this now and layer the compatibility piece in as a separate patch.
This patch builds on #206 in order to restore support for Spark 2.0.x. This means that a single binary artifact can be used with both Spark 2.0.x and 2.1.x, simplifying the builds of downstream projects which are compatible with both Spark versions.

Author: Josh Rosen <joshrosen@databricks.com>

Closes #212 from JoshRosen/add-spark-2.1.
Closes #205
I am not entirely sure why I had to make changes in getDefaultWorkFile. Without these changes, one extra (and unnecessary) layer of subdirectories was created on write, which meant no Avro files were found on read.
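For readers hitting the same symptom, here is a minimal, hypothetical illustration of the failure mode described above: if a getDefaultWorkFile-style helper appends an extra path segment on top of the task attempt directory, output lands one level deeper than the reader's glob expects. This is not the actual spark-avro code (which is Scala and built on Hadoop's FileOutputFormat); the names and paths below are invented for illustration.

```java
// Hypothetical sketch of the extra-subdirectory issue; the real
// implementation lives in spark-avro's Scala output writer.
public class WorkFileDemo {
    // Buggy variant: appends an extra subdirectory that nothing on the
    // read side knows to look inside, so "*.avro" globs find no files.
    static String buggyWorkFile(String taskAttemptDir, String fileName) {
        return taskAttemptDir + "/extra-subdir/" + fileName;
    }

    // Fixed variant: the part file sits directly in the task attempt
    // directory, where the committer and reader expect it.
    static String fixedWorkFile(String taskAttemptDir, String fileName) {
        return taskAttemptDir + "/" + fileName;
    }

    public static void main(String[] args) {
        String dir = "/out/_temporary/0/task_0000";
        // Buggy path: /out/_temporary/0/task_0000/extra-subdir/part-r-00000.avro
        System.out.println(buggyWorkFile(dir, "part-r-00000.avro"));
        // Fixed path: /out/_temporary/0/task_0000/part-r-00000.avro
        System.out.println(fixedWorkFile(dir, "part-r-00000.avro"));
    }
}
```

The fix in the patch, as described, removes the redundant layering so that written files end up where the read path looks for them.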