Link flex delegates to TFLM


Arnaud Galles

Apr 10, 2022, 3:55:16 AM
to SIG Micro
Good afternoon,

I am trying to deploy a transformer network on a DE10-Nano board (2× Cortex-A9, Armv7-A) using TensorFlow Lite for Microcontrollers (TFLM).

I trained the network in Python and converted it to the .tflite format. When doing so, I get a warning:

"TFLite interpreter needs to link Flex delegate in order to run the model since it contains the following Select TFop(s): Flex ops: FlexEinsum"

And when I deploy the model on the board using an AllOpsResolver, I get the error:

Failed to get registration from op code CUSTOM

When I inspect the operations that my network uses, FlexEinsum is indeed in the list:

=== TFLite ModelAnalyzer === 
Subgraph#0 main(T#0, T#1) -> [T#79] 
 Op#0 CAST(T#1) -> [T#21] 
 Op#1 GATHER(T#9, T#21) -> [T#22] 
 Op#2 MUL(T#22, T#18) -> [T#23] 
 Op#3 FlexEinsum(T#23, T#5) -> [T#24] 
 Op#4 ADD(T#24, T#3) -> [T#25] 
 Op#5 FlexEinsum(T#23, T#4) -> [T#26] 
 Op#6 ADD(T#26, T#3) -> [T#27]
...
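(For reference, the listing above comes from the TFLite model analyzer; a minimal way to reproduce it, assuming tflite_model holds the converted flatbuffer from the code in the PPS below:)

import tensorflow as tf

# Print the op-by-op breakdown of a converted model
# (tf.lite.experimental.Analyzer is available from TF 2.7 onwards).
tf.lite.experimental.Analyzer.analyze(model_content=tflite_model)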

From my understanding, some operations are not yet supported by TFLM, so I would need to fall back to the Einsum implementation from TF directly. My question is: how do I do that, and is it even possible in TFLM? If not, is there a workaround? From the error TensorFlow gives when converting the model, I would need to 'link the Flex delegate', but I don't understand what that means...

To give more context, I am using the Altera bare-metal GCC toolchain in DS-5 to compile and deploy on the board. To include TFLM in my project, I generated the 'hello world' project and then used the generated 'tensorflow' and 'third_party' folders as a library in my project. This works very well until Flex ops show up... All of my versions are up to date (TF 2.8 and the latest version of the TFLM repo).

Does anybody have solutions or ideas about this problem?

Have a great day!

PS: this is an example with Einsum, but two other operations are in the same situation: FlexMatrixBandPart and FlexStridedSlice.

PPS: here is my conversion code. If I remove SELECT_TF_OPS, I get a conversion error:
converter = tf.lite.TFLiteConverter.from_saved_model(MODEL_DIR)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
    tf.lite.OpsSet.SELECT_TF_OPS,
]
tflite_model = converter.convert()
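In case it is useful for reproducing this: the Select TF ops should resolve on the desktop, since the full tensorflow pip package links the Flex delegate into tf.lite.Interpreter automatically; it is only on-device with TFLM that they fail. A quick sanity-check sketch (with dummy zero inputs, not my exact test code):

import numpy as np
import tensorflow as tf

# On the desktop, the full TF pip package bundles the Flex delegate,
# so Select TF ops like FlexEinsum resolve here (unlike in TFLM).
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()

# Feed dummy inputs matching the model's input tensors.
for detail in interpreter.get_input_details():
    dummy = np.zeros(detail['shape'], dtype=detail['dtype'])
    interpreter.set_tensor(detail['index'], dummy)

interpreter.invoke()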

Deqiang Chen

Apr 11, 2022, 12:56:55 PM
to Arnaud Galles, SIG Micro
Hello, Arnaud,

Did you try forcing the conversion to use TFLITE_BUILTINS only? I.e.:

converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,
]

I don't think you can use Flex ops in TFLite Micro.

Best regards!
Deqiang



Arnaud Galles

Apr 12, 2022, 3:16:27 AM
to SIG Micro, deqi...@google.com, Arnaud Galles
Hi, thanks for your answer!

I had already tried what you propose, but it gives me a conversion error instead of a warning (I suppose the operations cannot be decomposed into TFLite builtins...).
I guess my best option is therefore to avoid using the problematic operations!
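For example (a rough sketch, not my actual model code), a projection written with tf.einsum can usually be re-expressed with tf.matmul, which the converter lowers to builtin ops:

import tensorflow as tf

# Hypothetical shapes for a transformer projection.
x = tf.random.normal([2, 10, 64])   # (batch, seq, d_model)
w = tf.random.normal([64, 128])     # (d_model, d_out)

# This einsum is the kind that ends up as FlexEinsum after conversion...
y_einsum = tf.einsum('bsd,dh->bsh', x, w)

# ...but plain matmul computes the same thing (w broadcasts over the
# batch dimension) and converts to a TFLite builtin.
y_matmul = tf.matmul(x, w)

tf.debugging.assert_near(y_einsum, y_matmul)

Something similar would presumably be needed for the band-part and strided-slice cases (e.g. precomputing a causal mask as a constant rather than building it at inference time), but I have not tried that yet.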

Have a good day,

Arnaud
