Ktor server for TensorFlow Serving, supporting local inference (local model via the TensorFlow Java API) and GCP inference (model deployed to ML Engine)
Updated Sep 3, 2018 · Kotlin