
Error with tag-sets when serving model using tensorflow_model_server tool #3530

Closed

alfred11235 opened this issue Mar 6, 2018 · 9 comments

@alfred11235

  • What is the top-level directory of the model you are using:
  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow): no
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Linux Ubuntu 16.04
  • TensorFlow installed from (source or binary): yes
  • TensorFlow version (use command below): 1.5
  • Bazel version (if compiling from source): 0.11
  • CUDA/cuDNN version:
  • GPU model and memory:
  • Exact command to reproduce:

I followed the instructions (https://github.com/tensorflow/models/tree/master/research/slim) step by step to download the flowers dataset, fine-tune the inception-v3 model from the existing checkpoint, and freeze the exported graph, and everything works fine. I even used the eval_image_classifier.py script and it works perfectly. The problem is that when I try to serve the model with the tensorflow_model_server tool using this line:

tensorflow_model_server --port=9000 --model_name=saved_model --model_base_path=/home/alfred/Testing/OLD/Ejemplo/frozen/

I get an error. The structure of the frozen folder is the following:

/home/alfred/Testing/OLD/Ejemplo/frozen/
└── 1
    ├── saved_model.pb    // the frozen graph
    └── variables
        ├── saved_model.data-00000-of-00001
        └── saved_model.index

The error:

2018-03-06 15:17:58.758865: I tensorflow_serving/model_servers/main.cc:149] Building single TensorFlow model file config: model_name: saved_model model_base_path: /home/alfred/Testing/OLD/Ejemplo/frozen/
2018-03-06 15:17:58.759179: I tensorflow_serving/model_servers/server_core.cc:439] Adding/updating models.
2018-03-06 15:17:58.759209: I tensorflow_serving/model_servers/server_core.cc:490] (Re-)adding model: saved_model
2018-03-06 15:17:58.860154: I tensorflow_serving/core/basic_manager.cc:705] Successfully reserved resources to load servable {name: saved_model version: 1}
2018-03-06 15:17:58.860240: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: saved_model version: 1}
2018-03-06 15:17:58.860281: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: saved_model version: 1}
2018-03-06 15:17:58.860346: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:360] Attempting to load native SavedModelBundle in bundle-shim from: /home/alfred/Testing/OLD/Ejemplo/frozen/1
2018-03-06 15:17:58.860396: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:236] Loading SavedModel from: /home/alfred/Testing/OLD/Ejemplo/frozen/1
2018-03-06 15:17:58.992090: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:284] Loading SavedModel: fail. Took 131673 microseconds.
2018-03-06 15:17:58.992178: E tensorflow_serving/util/retrier.cc:38] Loading servable: {name: saved_model version: 1} failed: Not found: Could not find meta graph def matching supplied tags: { serve }. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI: saved_model_cli

I used saved_model_cli to inspect the available tag-sets of the frozen model and I get this:

The given SavedModel contains the following tag-sets:

Are there no tag-sets defined? If that's the case, is it possible to fix this, or is there another way of using the generated model for serving?

@DavidGOrtega

This is also happening to me with stylize_quantized.pb. It works on Android, and I can load the model in Python.

@MithunArunan

MithunArunan commented Jun 9, 2018

Hi @ant2n,

I encountered the same error while trying TF Serving. You have to export the model again with the serve tag set:

import tensorflow as tf

# sess is your active session; output_dir is the export path.
builder = tf.saved_model.builder.SavedModelBuilder(output_dir)
builder.add_meta_graph_and_variables(sess, [tf.saved_model.tag_constants.SERVING])
builder.save()

Now you can check the tag-sets in your model:

saved_model_cli show --dir output_dir

For more information
SavedModel - Tags
TF - SavedModel Reference
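
If all you have is a frozen GraphDef (as in the original post), a minimal sketch of wrapping it in a SavedModel tagged serve could look like the following. Note that frozen_graph.pb and export/1 are placeholder paths, not taken from this thread:

import tensorflow as tf

# Read the frozen GraphDef from disk (path is a placeholder).
graph_def = tf.GraphDef()
with tf.gfile.GFile("frozen_graph.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

# Import it into a fresh graph and export it with the SERVING tag.
with tf.Graph().as_default() as graph:
    tf.import_graph_def(graph_def, name="")
    with tf.Session(graph=graph) as sess:
        builder = tf.saved_model.builder.SavedModelBuilder("export/1")
        builder.add_meta_graph_and_variables(
            sess, [tf.saved_model.tag_constants.SERVING])
        builder.save()

Afterwards, saved_model_cli show --dir export/1 should list a serve tag-set.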

@francotheengineer

francotheengineer commented Jun 13, 2018

@MithunMJ Thanks for this. It works for the Inception V3 retraining example.

For future searches:
It is necessary to use the --saved_model_dir flag.
For example:
python retrain.py --image_dir ./images --saved_model_dir=/tmp/saved_models/$(date +%s)/
Then run:
tensorflow_model_server --port=9000 --model_name=my_image_classifier --model_base_path=/tmp/saved_models/
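
Once the server is up, a minimal gRPC client sketch might look like the following. The model name matches the command above, but the signature name, the input key "input", and the image file are assumptions; check your export's actual signature with saved_model_cli first:

import grpc
import tensorflow as tf
from tensorflow_serving.apis import predict_pb2
from tensorflow_serving.apis import prediction_service_pb2_grpc

# Connect to the model server started above.
channel = grpc.insecure_channel("localhost:9000")
stub = prediction_service_pb2_grpc.PredictionServiceStub(channel)

# Build a Predict request; key names and shapes depend on your signature.
request = predict_pb2.PredictRequest()
request.model_spec.name = "my_image_classifier"
request.model_spec.signature_name = "serving_default"
image_bytes = open("flower.jpg", "rb").read()  # placeholder image file
request.inputs["input"].CopyFrom(
    tf.make_tensor_proto([image_bytes], shape=[1]))

response = stub.Predict(request, 10.0)  # 10-second timeout
print(response)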

@wt-huang

wt-huang commented Nov 1, 2018

Closing as this is resolved.

@chikubee

chikubee commented Sep 5, 2019

@wt-huang I'm new to this and I can't understand what's wrong here.
[Three screenshots attached showing the save-and-serve steps]
How am I supposed to add the signature definition here?
Any help would be appreciated.

Thanks in advance

@francotheengineer

@chikubee Can you provide a minimal reproducible example? In the meantime, try to remove the training tag and just leave the serving one.
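
On the signature question: a SignatureDef is attached at export time through the signature_def_map argument. A minimal self-contained sketch follows; the toy identity graph is only a stand-in for a real model, and all names are placeholders:

import tensorflow as tf

# Toy graph standing in for the real model; swap in your own tensors.
graph = tf.Graph()
with graph.as_default():
    x = tf.placeholder(tf.float32, shape=[None, 3], name="input")
    y = tf.identity(x, name="output")

with tf.Session(graph=graph) as sess:
    # Describe the inputs/outputs the server should expose.
    signature = tf.saved_model.signature_def_utils.build_signature_def(
        inputs={"input": tf.saved_model.utils.build_tensor_info(x)},
        outputs={"output": tf.saved_model.utils.build_tensor_info(y)},
        method_name=tf.saved_model.signature_constants.PREDICT_METHOD_NAME)

    builder = tf.saved_model.builder.SavedModelBuilder("export_with_sig/1")
    builder.add_meta_graph_and_variables(
        sess, [tf.saved_model.tag_constants.SERVING],
        signature_def_map={
            tf.saved_model.signature_constants.DEFAULT_SERVING_SIGNATURE_DEF_KEY: signature})
    builder.save()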
Thanks :)

@Suro-One

Suro-One commented Sep 5, 2019

This might be unrelated, but I get the error when I try to deploy a model on the Google Cloud Platform.

Create Version failed. Model validation failed: SavedModel must contain exactly one metagraph with tag: serve For more information on how to export Tensorflow SavedModel, see https://www.tensorflow.org/api_docs/python/tf/saved_model.

Do you have any advice?
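
For what it's worth, that validator message means the export must contain exactly one meta graph tagged serve; you can confirm what yours contains with (path is a placeholder):

saved_model_cli show --dir /path/to/export/1

If it lists more than one tag-set (e.g. train and serve), re-exporting with only the SERVING tag, as shown earlier in this thread, should satisfy the check.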

@chikubee

chikubee commented Sep 5, 2019

> @chikubee Can you provide a minimal reproducible example? In the meantime, try to remove the training tag and just leave the serving one.
> Thanks :)

I tried removing train from the tag set as well; it didn't work.
I used https://github.com/google-research/bert to fine-tune the bert-base-uncased model for a sentiment analysis task.
I've followed their script completely to get the fine-tuned model (ckpt files).

Whatever further steps I took to save and serve the model are shown in the snippets.

@p4rk3r

p4rk3r commented Dec 6, 2019

I followed the tutorial described here -> https://medium.com/@yuu.ishikawa/serving-pre-modeled-and-custom-tensorflow-estimator-with-tensorflow-serving-12833b4be421

  • Created the Dockerfile and ran it... Got the same error as described in this thread...
  • Changed the .deb file in the Dockerfile from 1.12 to 1.15 and it works now:

RUN TEMP_DEB="$(mktemp)" \
 && wget -O "$TEMP_DEB" 'http://storage.googleapis.com/tensorflow-serving-apt/pool/tensorflow-model-server-1.15.0/t/tensorflow-model-server/tensorflow-model-server_1.15.0_all.deb' \
 && dpkg -i "$TEMP_DEB" \
 && rm -f "$TEMP_DEB"

ericnjogu added a commit to ericnjogu/video-object-detection that referenced this issue Jan 20, 2020

Example invocation:
python utils/add_model_tags.py ~/downloaded-tensorflow-models/ssd_mobilenet_v1_coco_2017_11_17/frozen_inference_graph.pb

credits:
- tensorflow/models#3530 (comment)
- https://github.com/tensorflow/tensorflow/tree/master/tensorflow/python/saved_model#tags