Directly insert sub-graphs to tflite model file


Simon King

Apr 12, 2021, 7:05:03 PM4/12/21
to TensorFlow Lite
Hi tflite team and friends,

I have a task that requires inserting a sub-graph into a converted tflite model file.

According to the tflite schema.fbs file (https://github.com/FrozenGene/tflite/blob/master/schema.fbs), a model file contains a list of sub-graphs, of which the first is typically the main model. So I'm thinking about how to insert sub-graphs directly into the tflite model without going through a TensorFlow model and the tflite converter.

To be more specific, my client would like the following steps:

1. Initially, I have a converted tflite model file, and I know its input/output details and some of its hidden layers.
2. I build a graph with tf.function which takes one hidden layer's output tensor as input and produces an output tensor that feeds the layer right after that hidden layer (a rough sketch is below).
3. I insert that graph into the inception_v3.tflite and repackage them together as a new tflite file.
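
For step 2, this is roughly the kind of sub-graph I mean (the hidden-layer shape here is only an illustration, not the real one from the client's model):

    import tensorflow as tf

    # Toy sub-graph: takes a hidden-layer tensor and applies an extra op.
    # The shape [1, 35, 35, 288] is just an example of an inception_v3
    # intermediate tensor; the real shape would come from the client's model.
    @tf.function(input_signature=[tf.TensorSpec([1, 35, 35, 288], tf.float32)])
    def extra_block(hidden):
        return tf.nn.relu(hidden)

    # On its own, this sub-graph converts to a separate tflite file easily:
    concrete_fn = extra_block.get_concrete_function()
    converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_fn])
    extra_block_tflite = converter.convert()
    # The open question is how to splice it into inception_v3.tflite
    # instead of keeping it as a second model.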

How can I insert a sub-graph into the converted tflite file?

I know it's doable if I do the sub-graph insertion with keras/tensorflow models. For example, it's easy to insert a keras/tensorflow graph/sub-graph into a keras/tensorflow model and then convert the combined model to a tflite file. However, my client only provides a tflite model file and doesn't want to share their keras/tensorflow model files with me, so I have to try to insert the sub-graph into the tflite file directly. By the way, I can get the architecture and the inputs/outputs with Netron (https://github.com/lutzroeder/netron).
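
In case it helps, this is how I double-check the input/output details programmatically (standard tf.lite.Interpreter calls):

    import tensorflow as tf

    interpreter = tf.lite.Interpreter(model_path="inception_v3.tflite")
    interpreter.allocate_tensors()
    print(interpreter.get_input_details())   # input names, shapes, dtypes
    print(interpreter.get_output_details())  # output names, shapes, dtypes
    # get_tensor_details() also lists the intermediate tensors of the graph.
    print(len(interpreter.get_tensor_details()))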

Thanks in advance for your reply,
Simon

Jaesung Chung

Apr 12, 2021, 7:15:50 PM4/12/21
to Simon King, TensorFlow Lite
Hi Simon King,

In short, sorry, we don't have any convenient tool for merging existing TFLite models into one.

In my opinion, wiring the inputs/outputs of the existing TFLite models together at runtime is more doable than combining them into one file: for example, executing the A tflite model, then executing the B tflite model with the outputs of the A model, and so on.
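
Something along these lines (a rough sketch; the file names are placeholders and the [0] indexing assumes single-input/single-output models):

    import tensorflow as tf

    # Wire two tflite models together at runtime instead of merging the files.
    interp_a = tf.lite.Interpreter(model_path="a.tflite")
    interp_a.allocate_tensors()
    interp_b = tf.lite.Interpreter(model_path="b.tflite")
    interp_b.allocate_tensors()

    def run_chain(input_value):
        # Run model A on the input.
        a_in = interp_a.get_input_details()[0]
        interp_a.set_tensor(a_in["index"], input_value)
        interp_a.invoke()
        a_out = interp_a.get_tensor(interp_a.get_output_details()[0]["index"])

        # Feed A's output into model B.
        b_in = interp_b.get_input_details()[0]
        interp_b.set_tensor(b_in["index"], a_out)
        interp_b.invoke()
        return interp_b.get_tensor(interp_b.get_output_details()[0]["index"])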

And to generate a single TFLite model file, I would suggest doing the merging at the TF level, as you said. Otherwise, the merge has to be done by low-level flatbuffer manipulation, which requires a thorough understanding of the under-the-hood concepts of the TFLite schema, including tensor and operator declarations, the mapping between tensors and operators, and so on. That does not seem like an easy job and would have to be done manually.

Best regards,
Jaesung


Simon King

Apr 12, 2021, 7:56:57 PM4/12/21
to TensorFlow Lite, Jaesung Chung, TensorFlow Lite, Simon King
Hi Jaesung,

Thank you so much for your super quick reply!

From my understanding, if there are no convenient tools, as you said, there could be two possible solutions:

1. Convert the tflite model back to a keras/tensorflow model, insert a graph into it, and finally convert the combined model back to a tflite model.
2. Understand the TFLite schema thoroughly, and then leverage the APIs generated by the flatbuffers compiler (https://google.github.io/flatbuffers/flatbuffers_guide_tutorial.html) to insert new bytes into the tflite model file.

Currently, it seems there are no convenient tools for either of them. Which one do you think is easier?

Best,
Simon

Jaesung Chung

Apr 12, 2021, 8:15:32 PM4/12/21
to Simon King, TensorFlow Lite
Comments inline.

On Tue, Apr 13, 2021 at 8:56 AM Simon King <xin.j...@gmail.com> wrote:
Hi Jaesung,

Thank you so much for your super quick reply!

From my understanding, if there are no convenient tools, as you said, there could be two possible solutions:

1. Convert the tflite model back to a keras/tensorflow model, insert a graph into it, and finally convert the combined model back to a tflite model.

There is no available tool for converting a tflite model back to a TensorFlow model.
 
2. Understand the TFLite schema thoroughly, and then leverage the APIs generated by the flatbuffers compiler (https://google.github.io/flatbuffers/flatbuffers_guide_tutorial.html) to insert new bytes into the tflite model file.

This is not a recommended path. However, if you really want to take a look at it, the 'tensorflow.lite.python.schema_py_generated' Python target might be of interest to you.
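
For instance, the object API in that target can unpack a tflite flatbuffer into mutable Python objects and pack it back, roughly like this (an untested sketch, just to show the shape of the code):

    import flatbuffers
    from tensorflow.lite.python import schema_py_generated as schema_fb

    # Unpack the flatbuffer into mutable Python objects.
    with open("inception_v3.tflite", "rb") as f:
        buf = bytearray(f.read())
    model = schema_fb.ModelT.InitFromObj(schema_fb.Model.GetRootAsModel(buf, 0))

    main_graph = model.subgraphs[0]
    print(len(main_graph.tensors), len(main_graph.operators))

    # After editing model.subgraphs / tensors / operators / buffers by hand,
    # pack everything back into a .tflite flatbuffer.
    builder = flatbuffers.Builder(1024)
    builder.Finish(model.Pack(builder), file_identifier=b"TFL3")
    with open("modified.tflite", "wb") as f:
        f.write(bytes(builder.Output()))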

Simon King

Apr 13, 2021, 9:59:28 PM4/13/21
to TensorFlow Lite, Jaesung Chung, TensorFlow Lite, Simon King
Thanks for your information!