Error with tag-sets when serving model using tensorflow_model_server tool #3530
Comments
This is happening to me with stylize_quantized.pb as well. It works on Android, and I can also load the model in Python.
Hi @ant2n, I encountered the same error while trying TF Serving. Re-export the model with the `serve` tag:

```python
builder = tf.saved_model.builder.SavedModelBuilder(output_dir)
builder.add_meta_graph_and_variables(sess, [tf.saved_model.tag_constants.SERVING])
builder.save()
```

Now you can check the tag-sets in your model:

```
saved_model_cli show --dir output_dir
```
@MithunMJ Thanks for this. It works for the Inception V3 retraining example. For future searches:
Closing as this is resolved.
@wt-huang I'm new to this and can't understand what's wrong here. Thanks in advance.
@chikubee Can you provide a minimal reproducible example? In the meantime, try removing the training tag and leaving just the serving one.
This might be unrelated, but I get the same error when I try to deploy a model on Google Cloud Platform. Do you have any advice?
I tried removing train from the tag-set as well; it didn't work. The further steps I took to save and serve the model are shown in the snippets. I followed the tutorial described here: https://medium.com/@yuu.ishikawa/serving-pre-modeled-and-custom-tensorflow-estimator-with-tensorflow-serving-12833b4be421
Example invocation:

```
python utils/add_model_tags.py ~/downloaded-tensorflow-models/ssd_mobilenet_v1_coco_2017_11_17/frozen_inference_graph.pb
```

Credits:
- tensorflow/models#3530 (comment)
- https://github.com/tensorflow/tensorflow/tree/master/tensorflow/python/saved_model#tags
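The `add_model_tags.py` script itself isn't included in this thread; the following is only a hypothetical sketch of what such a script might do, based on the advice above: load a frozen GraphDef and re-export it as a SavedModel carrying the `serve` tag that tensorflow_model_server looks for. The function name and output path are assumptions. (Written against the TF 1.x API, with a `tf.compat.v1` fallback so it also runs on TF 2.x.)

```python
# Hypothetical sketch: wrap a frozen GraphDef in a SavedModel with the
# "serve" tag, which tensorflow_model_server requires.
import os
import sys

try:
    import tensorflow.compat.v1 as tf  # TF 2.x
    tf.disable_v2_behavior()
except ImportError:
    import tensorflow as tf            # TF 1.x


def add_serve_tag(frozen_graph_path, export_dir):
    """Re-export a frozen .pb as a tagged SavedModel under export_dir."""
    graph_def = tf.GraphDef()
    with tf.gfile.GFile(frozen_graph_path, "rb") as f:
        graph_def.ParseFromString(f.read())

    with tf.Graph().as_default() as graph:
        # Import the frozen graph into a fresh graph (no name prefix).
        tf.import_graph_def(graph_def, name="")
        with tf.Session(graph=graph) as sess:
            builder = tf.saved_model.builder.SavedModelBuilder(export_dir)
            # Attach the SERVING ("serve") tag so the model server can
            # find a matching MetaGraphDef.
            builder.add_meta_graph_and_variables(
                sess, [tf.saved_model.tag_constants.SERVING])
            builder.save()


if __name__ == "__main__" and len(sys.argv) > 1:
    frozen = sys.argv[1]
    add_serve_tag(frozen, os.path.join(os.path.dirname(frozen), "saved", "1"))
```

A frozen graph has constants in place of variables, so no checkpoint is needed; `SavedModelBuilder` handles the empty-variables case.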
I followed the instructions (https://github.com/tensorflow/models/tree/master/research/slim) step by step to download the flowers dataset, fine-tune the Inception V3 model from the existing checkpoint, and freeze the exported graph, and everything works fine. I even used the eval_image_classifier.py script and it works perfectly. The problem is that when I try to serve the model with tensorflow_model_server:

```
tensorflow_model_server --port=9000 --model_name=saved_model --model_base_path=/home/alfred/Testing/OLD/Ejemplo/frozen/
```

I get an error. The structure of the frozen folder is the following:
```
/home/alfred/Testing/OLD/Ejemplo/frozen/
└── 1
    ├── saved_model.pb          # the frozen graph
    └── variables
        ├── saved_model.data-00000-of-00001
        └── saved_model.index
```
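TensorFlow Serving expects exactly this layout: an integer-named version directory containing `saved_model.pb`, with variables (if any) in a `variables/` subdirectory next to it. As a small illustration (the helper name is hypothetical, not part of TensorFlow), a stdlib-only script can sanity-check a `model_base_path` before starting the server:

```python
# Hypothetical helper: sanity-check a TensorFlow Serving model_base_path.
# Serving expects  <base>/<numeric version>/saved_model.pb .
import os


def check_model_base_path(base):
    """Return the version directories that look servable, sorted."""
    ok = []
    for entry in sorted(os.listdir(base)):
        version_dir = os.path.join(base, entry)
        if not (entry.isdigit() and os.path.isdir(version_dir)):
            continue  # Serving only picks up integer-named subdirectories
        if os.path.isfile(os.path.join(version_dir, "saved_model.pb")):
            ok.append(entry)
    return ok
```

Note this only checks the layout; it cannot tell whether `saved_model.pb` is a real SavedModel or just a renamed frozen GraphDef, which turns out to be the actual problem in this issue.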
The error:

```
2018-03-06 15:17:58.758865: I tensorflow_serving/model_servers/main.cc:149] Building single TensorFlow model file config: model_name: saved_model model_base_path: /home/alfred/Testing/OLD/Ejemplo/frozen/
2018-03-06 15:17:58.759179: I tensorflow_serving/model_servers/server_core.cc:439] Adding/updating models.
2018-03-06 15:17:58.759209: I tensorflow_serving/model_servers/server_core.cc:490] (Re-)adding model: saved_model
2018-03-06 15:17:58.860154: I tensorflow_serving/core/basic_manager.cc:705] Successfully reserved resources to load servable {name: saved_model version: 1}
2018-03-06 15:17:58.860240: I tensorflow_serving/core/loader_harness.cc:66] Approving load for servable version {name: saved_model version: 1}
2018-03-06 15:17:58.860281: I tensorflow_serving/core/loader_harness.cc:74] Loading servable version {name: saved_model version: 1}
2018-03-06 15:17:58.860346: I external/org_tensorflow/tensorflow/contrib/session_bundle/bundle_shim.cc:360] Attempting to load native SavedModelBundle in bundle-shim from: /home/alfred/Testing/OLD/Ejemplo/frozen/1
2018-03-06 15:17:58.860396: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:236] Loading SavedModel from: /home/alfred/Testing/OLD/Ejemplo/frozen/1
2018-03-06 15:17:58.992090: I external/org_tensorflow/tensorflow/cc/saved_model/loader.cc:284] Loading SavedModel: fail. Took 131673 microseconds.
2018-03-06 15:17:58.992178: E tensorflow_serving/util/retrier.cc:38] Loading servable: {name: saved_model version: 1} failed: Not found: Could not find meta graph def matching supplied tags: { serve }. To inspect available tag-sets in the SavedModel, please use the SavedModel CLI:
saved_model_cli
```
I used saved_model_cli to inspect the available tag-sets of the frozen model and I get this:

```
The given SavedModel contains the following tag-sets:
```

Are there no tag-sets defined? If that's the case, is it possible to fix this, or is there another way to use the generated model for serving?
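This ties the two observations in the thread together: a file produced by freezing a graph is a plain serialized GraphDef, not a SavedModel, so it carries no tag-sets at all, yet it still loads fine in Python because `tf.import_graph_def` never looks at tags. A minimal sketch of that Python-side load (TF 1.x API, with a `tf.compat.v1` fallback for TF 2.x):

```python
# Sketch: loading a frozen GraphDef in Python needs no tag-set, which is
# why "I can load the model in python" succeeds while
# tensorflow_model_server (which needs a SavedModel with a "serve"
# tagged MetaGraphDef) fails with "Could not find meta graph def".
try:
    import tensorflow.compat.v1 as tf  # TF 2.x
except ImportError:
    import tensorflow as tf            # TF 1.x


def load_frozen_graph(path):
    """Parse a frozen .pb and import it into a fresh graph."""
    graph_def = tf.GraphDef()
    with tf.gfile.GFile(path, "rb") as f:
        graph_def.ParseFromString(f.read())
    graph = tf.Graph()
    with graph.as_default():
        tf.import_graph_def(graph_def, name="")  # no tags involved
    return graph
```

So the fix suggested earlier in the thread (re-exporting through `SavedModelBuilder` with `tag_constants.SERVING`) is the right direction: it turns the tag-less GraphDef into a SavedModel whose MetaGraphDef the server can match.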