Support Databricks 3.0.1 with new build profiles #3356
Conversation
Signed-off-by: Thomas Graves <tgraves@nvidia.com>
For some reason the use of `${spark.version}` for all the non-Spark jars struck me this time; I'm surprised I didn't notice it before. This is done in a lot of places, so I didn't put a comment on each one. I'm not sure it absolutely has to have a comment, but it looked weird and wrong when I noticed it.
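For context, the pattern being discussed looks roughly like this in a POM (a minimal sketch; the groupId and artifactId here are illustrative placeholders, not the actual rapids dependencies):

```xml
<!-- Sketch of the pattern in question: a non-Spark dependency whose
     version is interpolated from the Maven property ${spark.version}.
     The artifact coordinates below are hypothetical examples. -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>some-non-spark-jar</artifactId>
  <version>${spark.version}</version>
</dependency>
```

Using a Spark-named property to version an unrelated jar works as long as the versions happen to match, but it reads as if the jar tracks Spark releases, which is what makes it look wrong at a glance.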
Relates to #3224.
Add support for Databricks 3.0.1 with new build profiles. To avoid having to copy all the dependencies for Databricks 3.1.1 into a separate profile for 3.0.1, I made another profile, dbdeps, that is triggered by the property -Ddatabricks.
If we want to remove this later we can, and we can just copy the dependencies instead.
I also ran into a bug with the scala-maven-plugin and had to go back to an older version.
This also cleans up some build warnings.
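A profile activated by a bare `-D` property can be sketched like this (a minimal example assuming the dbdeps profile id from the description; the real dependency declarations live in the actual POM and are not reproduced here):

```xml
<!-- Hedged sketch: a Maven profile that activates when the build is
     invoked with -Ddatabricks, so shared Databricks dependencies can
     be declared once instead of per-version. -->
<profile>
  <id>dbdeps</id>
  <activation>
    <property>
      <!-- Matches -Ddatabricks on the command line, any or no value -->
      <name>databricks</name>
    </property>
  </activation>
  <dependencies>
    <!-- Shared Databricks dependency declarations would go here -->
  </dependencies>
</profile>
```

Property-based activation keeps both the 3.0.1 and 3.1.1 Databricks builds on one shared dependency list, which is the stated goal of the dbdeps profile.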
The build command looks like:
mvn -Ddatabricks -Dbuildver=301db package