
Facing issues when Tensorflow model involves a String as input/output to serving function #2053

Closed
chaitanya-basava opened this issue Sep 30, 2022 · 3 comments · Fixed by #2056
Labels
enhancement New feature or request

Comments

@chaitanya-basava

Hi,
I am testing out a TensorFlow model which takes a string as input and returns the spell-corrected version of that string along with its confidence score. The signature_def returned for the model using saved_model_cli is as follows:

signature_def['serving_default']:
  The given SavedModel SignatureDef contains the following input(s):
    inputs['query'] tensor_info:
        dtype: DT_STRING
        shape: (-1)
        name: serving_default_query:0
  The given SavedModel SignatureDef contains the following output(s):
    outputs['corrected_query'] tensor_info:
        dtype: DT_STRING
        shape: ()
        name: StatefulPartitionedCall_5:0
    outputs['score'] tensor_info:
        dtype: DT_DOUBLE
        shape: unknown_rank
        name: StatefulPartitionedCall_5:1
  Method name is: tensorflow/serving/predict

But when loading the model in DJL, the loaded model identifies only score as a model output. corrected_query is not returned by model.describeOutput().keys(), and only score is returned by the model prediction call.

Any support or documentation on solving this issue would be really helpful.

Thanks

@frankfliu
Contributor

@chaitanya-basava

This is a current limitation in DJL. We filter out String tensors in the output (https://github.com/deepjavalibrary/djl/blob/master/engines/tensorflow/tensorflow-engine/src/main/java/ai/djl/tensorflow/engine/TfSymbolBlock.java#L120). Since we have added String tensor support, we should be able to remove this limitation. Can you share your model, or point to a similar one, so we can add a test for this type of model?

@frankfliu frankfliu added the enhancement New feature or request label Sep 30, 2022
@chaitanya-basava
Author

chaitanya-basava commented Oct 1, 2022

@frankfliu thanks for the response. I won't be able to share the exact model file, so I have created this dummy model which has similar input and output signatures, except that it returns the query string itself as corrected_query and score=0.0.

model.zip

Hope this will be helpful.
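For reference, a dummy model with the signatures described above could be built with a sketch like the following (an assumption on my part, not the attached model.zip; it assumes TensorFlow 2.x and simply echoes the query back with a constant score):

```python
import tensorflow as tf


class DummySpellCorrector(tf.Module):
    """Dummy model matching the signature_def in this issue:
    string input "query" (shape [-1]), outputs "corrected_query"
    (DT_STRING, scalar) and "score" (DT_DOUBLE)."""

    @tf.function(
        input_signature=[tf.TensorSpec(shape=[None], dtype=tf.string, name="query")]
    )
    def serve(self, query):
        # Echo the first query string back and return a constant score.
        return {
            "corrected_query": query[0],
            "score": tf.constant(0.0, dtype=tf.float64),
        }


model = DummySpellCorrector()
tf.saved_model.save(
    model, "/tmp/dummy_model", signatures={"serving_default": model.serve}
)

# Reload and verify that both outputs are present when the SavedModel is
# called from TensorFlow itself (unlike the filtered DJL output).
reloaded = tf.saved_model.load("/tmp/dummy_model")
result = reloaded.signatures["serving_default"](query=tf.constant(["speling"]))
print(sorted(result.keys()))
```

Calling the serving signature directly in TensorFlow returns both corrected_query and score, which is the behaviour one would expect DJL to mirror.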

One more thing I wanted to clarify: would there be similar behaviour (string outputs getting filtered out) when using other model types, like ONNX and PyTorch, as well?

@frankfliu
Contributor

@chaitanya-basava

You can try the nightly snapshot build once this PR is merged.
The ONNX and PyTorch engines should not have this issue. This TensorFlow behavior is more of a bug (we forgot to remove this limit when we added String tensor support).

frankfliu added a commit that referenced this issue Oct 2, 2022
* [tensorflow] Remove String tensor limitation for model output

Fixes #2053

* Add initializer to targetOpHandles