Add Spark extension #2162
Conversation
Codecov Report
Base: 72.08% // Head: 71.48% // Decreases project coverage by -0.61%.
Additional details and impacted files:
@@ Coverage Diff @@
## master #2162 +/- ##
============================================
- Coverage 72.08% 71.48% -0.61%
- Complexity 5126 6345 +1219
============================================
Files 473 629 +156
Lines 21970 28044 +6074
Branches 2351 3019 +668
============================================
+ Hits 15838 20046 +4208
- Misses 4925 6522 +1597
- Partials 1207 1476 +269
☔ View full report at Codecov.
import java.io.Serializable

/**
 * `SparkModel` is the implementation of [[ai.djl.Model]] for Spark support.
Does Scaladoc support Markdown?
Yes. This is the Scala format of Scaladoc.
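For reference, a brief sketch of the Scaladoc syntax in question; the class declaration below is illustrative only, not the actual SparkModel from this PR. Backticks render as monospace, [[...]] links to another documented entity, and triple braces delimit a code example.

/**
 * `SparkModel` is the implementation of [[ai.djl.Model]] for Spark support.
 *
 * {{{
 *   val model = new SparkModel(...)  // illustrative usage only
 * }}}
 */
class SparkModel // constructor and members elided in this sketch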
extensions/spark/build.sbt
Outdated
@@ -0,0 +1,16 @@
name := "spark"
Any reason we add build.sbt?
Removed.
import java.util

/** A [[ai.djl.translate.Translator]] for Spark Image Classification tasks. */
@SerialVersionUID(123456789L)
Suggested change:
- @SerialVersionUID(123456789L)
+ @SerialVersionUID(1L)
Changed.
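As a minimal sketch of the convention settled on above (the class name and field are hypothetical, not from this PR): serializable Scala classes get a fixed serialVersionUID, starting at 1L and bumped only on incompatible changes, rather than an arbitrary value like 123456789L.

import java.io.Serializable

// Hypothetical serializable class illustrating the agreed convention:
// start the UID at 1L and increment it only on incompatible changes.
@SerialVersionUID(1L)
class ImageClassificationConfig(val topK: Int) extends Serializable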
Please consolidate versioning in a doc. By the way, could you also create a simple Python wrapper to demo the capability in PySpark?
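To illustrate the capability such a wrapper would expose, here is a hedged sketch of running DJL inference inside a Spark job. It deliberately uses only DJL core APIs (Criteria, Predictor, ImageFactory) plus Spark SQL, not the new ai.djl.spark API from this PR; the object name, input URL, and model criteria are assumptions, and a DJL engine (e.g. PyTorch) is assumed to be on the executors' classpath. A PySpark wrapper would expose this same load-once-per-partition, predict-per-row flow to Python.

import ai.djl.Application
import ai.djl.modality.Classifications
import ai.djl.modality.cv.{Image, ImageFactory}
import ai.djl.repository.zoo.Criteria
import org.apache.spark.sql.SparkSession

object SparkInferenceSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("djl-spark-sketch").getOrCreate()
    import spark.implicits._

    // Hypothetical input: a Dataset of image URLs to classify.
    val urls = Seq("https://resources.djl.ai/images/kitten.jpg").toDS()

    val labels = urls.mapPartitions { iter =>
      // Load the model once per partition, not once per row.
      val criteria = Criteria.builder()
        .optApplication(Application.CV.IMAGE_CLASSIFICATION)
        .setTypes(classOf[Image], classOf[Classifications])
        .build()
      val model = criteria.loadModel()
      val predictor = model.newPredictor()
      // Materialize results before closing the predictor (iterators are lazy).
      val out = iter.map { url =>
        val image = ImageFactory.getInstance().fromUrl(url)
        predictor.predict(image).best[Classifications.Classification]().getClassName
      }.toList
      predictor.close()
      model.close()
      out.iterator
    }

    labels.show()
    spark.stop()
  }
}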
extensions/spark/build.sbt
Outdated
resolvers += Resolver.jcenterRepo

libraryDependencies += "org.apache.spark" %% "spark-core" % "3.0.1"
Any reason we are using 3.0.1? Do we also want to put this into gradle.properties?
OK, put it into gradle.properties.
Force-pushed from 4b63c09 to 4ddf8f9 (compare).
extensions/spark/README.md
Outdated
<dependency>
    <groupId>ai.djl.spark</groupId>
    <artifactId>spark</artifactId>
    <version>0.19.0</version>
This should be 0.20.0.
Changed.
Description

Add Spark extension.