
Model Format Conversion (WIP)


To use an already trained model with TensorFlow Lite, we need to convert the model to the .lite format. To convert a model generated by a framework, the user should have a GraphDef file in .pb or .pbtxt form. The graph also needs to be frozen before the model is converted (this is referred to as freezing the graph).
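As a rough illustration, the snippet below sketches the final conversion step from a frozen .pb file to a .lite FlatBuffer. It assumes a TensorFlow 1.x build that exposes the TOCO converter as `tf.contrib.lite.toco_convert`; the file names and the tensor names (`"input:0"`, `"output:0"`) are placeholders for your own graph, and the exact API differs between releases.

```python
# A minimal sketch of converting a frozen GraphDef to a .lite FlatBuffer,
# assuming a TensorFlow 1.x release that ships tf.contrib.lite.toco_convert.
# "frozen_model.pb", "input:0" and "output:0" are placeholder names.
import tensorflow as tf

graph_def = tf.GraphDef()
with tf.gfile.GFile("frozen_model.pb", "rb") as f:
    graph_def.ParseFromString(f.read())

with tf.Session() as sess:
    # Re-import the frozen graph so its tensors can be looked up by name.
    tf.import_graph_def(graph_def, name="")
    input_tensor = sess.graph.get_tensor_by_name("input:0")
    output_tensor = sess.graph.get_tensor_by_name("output:0")

    # Run the TOCO converter to produce the serialized FlatBuffer model.
    tflite_model = tf.contrib.lite.toco_convert(
        sess.graph_def, [input_tensor], [output_tensor])

with open("converted_model.lite", "wb") as f:
    f.write(tflite_model)
```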

The following concepts are good to know when working with TensorFlow Lite and model conversion:

  • GraphDef - This represents the TensorFlow training and computation graph. It is in ProtoBuf format and contains the different operators, tensors, and variables. The file format is .pb.

  • CheckPoint - This contains the serialized variables from a TensorFlow graph and can be used to restore the values of all the variables in the model's graph. The file format is .ckpt. It contains only the variables and no information about the structure of the graph.

  • FrozenGraphDef - A subclass of GraphDef (a specific type of GraphDef) that contains no variables. A GraphDef is converted to a FrozenGraphDef by taking a saved checkpoint file and a GraphDef and replacing every variable in the graph with a constant holding the corresponding value from the checkpoint (see the freezing sketch after this list).

  • SavedModel - A combination of a saved CheckPoint and the GraphDef, together with a signature that labels the model's inputs and outputs. The GraphDef and CheckPoint can be extracted from a SavedModel (see the SavedModel sketch after this list).

  • TensorFlow Lite model - A serialized model containing the TensorFlow Lite operators and tensors for the interpreter. It uses FlatBuffers instead of ProtoBuf and is conceptually very similar to a frozen GraphDef as described above. The file format is .tflite.
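To make the FrozenGraphDef step concrete, here is a minimal sketch of freezing a graph with the TensorFlow 1.x Python API. The checkpoint files (`model.ckpt`, `model.ckpt.meta`) and the output node name `"output"` are placeholder names for your own model.

```python
# A minimal sketch of producing a frozen GraphDef from a GraphDef plus a
# checkpoint, assuming TensorFlow 1.x. The checkpoint paths and the output
# node name below are placeholders.
import tensorflow as tf

with tf.Session() as sess:
    # Rebuild the graph and restore the variable values from the checkpoint.
    saver = tf.train.import_meta_graph("model.ckpt.meta")
    saver.restore(sess, "model.ckpt")

    # Replace every variable in the graph with a constant holding its
    # checkpointed value; only nodes needed for the listed outputs are kept.
    frozen_graph_def = tf.graph_util.convert_variables_to_constants(
        sess, sess.graph_def, ["output"])

    # Serialize the frozen graph as a binary .pb file.
    tf.train.write_graph(frozen_graph_def, ".", "frozen_model.pb",
                         as_text=False)
```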
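And here is a minimal sketch of extracting the GraphDef and the input/output signature from a SavedModel, again assuming the TensorFlow 1.x API. The export directory and the `"serving_default"` signature key are placeholders for whatever the model was exported with.

```python
# A minimal sketch of pulling the GraphDef and signature information out of
# a SavedModel, assuming TensorFlow 1.x and a model exported with the
# "serve" tag. The export directory and signature key are placeholders.
import tensorflow as tf

export_dir = "/tmp/saved_model"

with tf.Session(graph=tf.Graph()) as sess:
    # Loading the SavedModel restores both the graph and the checkpointed
    # variable values into the session.
    meta_graph = tf.saved_model.loader.load(
        sess, [tf.saved_model.tag_constants.SERVING], export_dir)

    # The GraphDef is available from the loaded graph ...
    graph_def = sess.graph_def

    # ... and the signature labels which tensors are the model's
    # inputs and outputs.
    signature = meta_graph.signature_def["serving_default"]
    print(signature.inputs, signature.outputs)
```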

TensorFlow Lite hosts a few supported models here: https://github.com/tensorflow/tensorflow/blob/master/tensorflow/contrib/lite/g3doc/models.md
