
Upgrade to spark 2.1.0 #206

Closed

Conversation

koertkuipers
Contributor

Closes #205

I am not entirely sure why I had to make the changes in getDefaultWorkFile. Without them, one extra (and unnecessary) layer of subdirectories was created on write, which meant no Avro files were found on read.
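For reference, a minimal sketch of the shape such an override takes (the class and parameter names here are assumptions for illustration; the actual diff in this PR may differ). The idea is to hand back the file path Spark supplies verbatim, rather than deriving a work file under the committer's work directory, which is what produced the extra subdirectory layer:

```scala
import org.apache.avro.generic.GenericRecord
import org.apache.avro.mapreduce.AvroKeyOutputFormat
import org.apache.hadoop.fs.Path
import org.apache.hadoop.mapreduce.TaskAttemptContext

// Assumed setup: `path` is the complete file path that Spark 2.1 hands to
// the OutputWriter, filename included.
class DirectAvroOutputFormat(path: String) extends AvroKeyOutputFormat[GenericRecord] {
  // Return the supplied path as-is instead of building one under the
  // committer's work directory; the default behavior nests an extra
  // subdirectory below the intended output location.
  override def getDefaultWorkFile(context: TaskAttemptContext, extension: String): Path =
    new Path(path)
}
```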

@codecov-io

codecov-io commented Dec 31, 2016

Current coverage is 90.62% (diff: 100%)

Merging #206 into master will decrease coverage by 0.67%

@@             master       #206   diff @@
==========================================
  Files             5          5          
  Lines           322        320     -2   
  Methods         270        271     +1   
  Messages          0          0          
  Branches         52         49     -3   
==========================================
- Hits            294        290     -4   
- Misses           28         30     +2   
  Partials          0          0          

Powered by Codecov. Last update b01a034...b51adf2

@JoshRosen
Contributor

In order for this to actually test the right thing, you're also going to need to update the Travis build matrix at https://github.com/databricks/spark-avro/blob/branch-3.1/.travis.yml#L13
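For illustration, a hypothetical version of that matrix change (the `TEST_SPARK_VERSION` variable name and the exact version strings are assumptions; consult the real .travis.yml on branch-3.1):

```yaml
# Hypothetical sketch: add a 2.1.0 entry so CI actually exercises the
# upgraded Spark version rather than only 2.0.x.
env:
  matrix:
    - TEST_SPARK_VERSION="2.0.2"
    - TEST_SPARK_VERSION="2.1.0"
```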

@bdrillard

bdrillard commented Jan 10, 2017

I'd very much appreciate seeing this uplift merged when acceptable; it's a prerequisite to opening a pull request for Avro encoders (see #169).

@andrioni

Any idea when this is going to get merged, or when a new release will be made?

@elyast

elyast commented Jan 20, 2017

+1

@carlos-eduardo-gb

+1

@schwartzmx

schwartzmx commented Jan 23, 2017

👍

Any ETA on when this will get in?

@alexander-branevskiy

+1

@JoshRosen
Contributor

I'll merge this today. It's going to be a little while before I'll be able to push a release to Maven Central because this process is currently blocked by an infrastructure issue on our end.

@hardkap

hardkap commented Feb 7, 2017

+1, waiting for this to get in.

@JoshRosen
Contributor

It looks like it's possible to produce a single spark-avro binary artifact which is compatible with both Spark 2.0.x and 2.1.x. Thus I'm going to close this PR in favor of #212, which adds one additional commit to restore Spark 2.0.x compatibility. If my PR passes tests then I'll merge it and release version 3.2.0 of this library.
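As a rough illustration of what "one artifact for both versions" can look like, here is a sketch that branches on the runtime Spark version. This is an assumed approach shown for illustration only, not necessarily how #212 implements it, and the 2.0.x file-naming convention shown is likewise an assumption:

```scala
import org.apache.hadoop.fs.Path
import org.apache.hadoop.mapreduce.TaskAttemptContext

object SparkVersionCompat {
  // Assumption (not taken from #212): Spark 2.0.x hands the OutputWriter a
  // directory and expects it to choose a file name, while Spark 2.1.x hands
  // it a complete file path. Branching on the runtime version lets a single
  // binary artifact satisfy both contracts.
  def resolveWorkFile(path: String, context: TaskAttemptContext, extension: String): Path =
    if (org.apache.spark.SPARK_VERSION.startsWith("2.0")) {
      // 2.0.x: build a part file name inside the supplied directory.
      val taskId = context.getTaskAttemptID.getTaskID.getId
      new Path(path, f"part-r-$taskId%05d$extension")
    } else {
      // 2.1.x: the supplied path already names the file.
      new Path(path)
    }
}
```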

@JoshRosen JoshRosen closed this Feb 7, 2017
@JoshRosen JoshRosen reopened this Feb 7, 2017
@JoshRosen
Contributor

Actually, I'll merge this now and layer the compatibility piece in a separate patch.

@JoshRosen JoshRosen closed this in b988184 Feb 7, 2017
JoshRosen added a commit that referenced this pull request Feb 7, 2017
This patch builds on #206 in order to restore support for Spark 2.0.x. This means that a single binary artifact can be used with both Spark 2.0.x and 2.1.x, simplifying the builds of downstream projects which are compatible with both Spark versions.

Author: Josh Rosen <joshrosen@databricks.com>

Closes #212 from JoshRosen/add-spark-2.1.
@JoshRosen JoshRosen added this to the 3.2.0 milestone Feb 8, 2017