I am trying to deploy a transformer network on a DE10-Nano board (2× Cortex-A9, ARMv7-A) using TensorFlow Lite for Microcontrollers (TFLM).
I trained the network in Python and converted it to the .tflite format. During conversion I get the warning: "TFLite interpreter needs to link Flex delegate in order to run the model since it contains the following Select TFop(s): Flex ops: FlexEinsum"
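For reference, here is a minimal sketch (on a toy model I made up, not my real network) of the conversion step that produces this warning: a function containing tf.einsum, converted with SELECT_TF_OPS enabled so that ops without a TFLite builtin fall back to Flex kernels.

```python
import tensorflow as tf

# Toy function standing in for a transformer layer; the einsum here is what
# ends up as a FlexEinsum op in the .tflite flatbuffer.
@tf.function(input_signature=[tf.TensorSpec([1, 4, 4], tf.float32)])
def attn(x):
    return tf.einsum("bij,bjk->bik", x, x)

converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [attn.get_concrete_function()]
)
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # regular TFLite builtin ops
    tf.lite.OpsSet.SELECT_TF_OPS,    # allow Flex fallback -> triggers the warning
]
tflite_model = converter.convert()  # returns the model as a bytes blob
```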
And when I deploy the model on the board using an AllOpsResolver, I get the error: "Failed to get registration from op code CUSTOM"
When I inspect the operations my network uses (via the "=== TFLite ModelAnalyzer ===" output), FlexEinsum is indeed part of the list.
From my understanding, some operations are not yet supported by TFLM, and I would need to fall back to the einsum implemented in full TF. My question is: how do I do that, and is it even possible in TFLM? If not, is there a workaround? According to the warning TensorFlow prints during conversion, I would need to 'link the Flex delegate', but I don't understand what that means...
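One workaround I have been considering (I'd appreciate confirmation that it makes sense): the einsums in a transformer are usually plain batched matmuls, and rewriting them with tf.matmul before conversion should avoid FlexEinsum entirely, since batch matmul has a TFLite builtin. A quick NumPy check of the algebraic equivalence for a typical attention-score einsum:

```python
import numpy as np

rng = np.random.default_rng(0)
q = rng.standard_normal((2, 5, 8))  # (batch, seq, dim)
k = rng.standard_normal((2, 5, 8))

# attention scores as typically written with einsum in transformer code
scores_einsum = np.einsum("bid,bjd->bij", q, k)

# equivalent batched matmul: contract the last axes via a transpose
scores_matmul = q @ k.transpose(0, 2, 1)

assert np.allclose(scores_einsum, scores_matmul)
```

The same rewrite would be done with tf.matmul(q, k, transpose_b=True) in the model code before exporting, so the converter never sees an einsum.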
To give more context: I am using the Altera bare-metal GCC toolchain in DS-5 to compile and deploy to the board. To include TFLM in my project, I generated the 'hello world' project and then used the generated 'tensorflow' and 'third_party' folders as a library in my project. This works very well until Flex ops show up... All of my versions are up to date (TF 2.8 and the latest version of the TFLM repo).
Does anybody have solutions or ideas about this problem?
Have a great day!
PS: Einsum is just one example; two other operations are in the same situation, FlexMatrixBandPart and FlexStridedSlice.
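For FlexMatrixBandPart, I suspect (this is an assumption about my own graph, not verified yet) it comes from building the causal attention mask with tf.linalg.band_part. If so, the mask could be precomputed and baked into the graph as a constant instead, since band_part(ones, -1, 0) is just a lower-triangular matrix:

```python
import numpy as np

seq = 6  # hypothetical sequence length

# Equivalent of tf.linalg.band_part(tf.ones((seq, seq)), num_lower=-1,
# num_upper=0): keep the diagonal and everything below it.
causal_mask = np.tril(np.ones((seq, seq), dtype=np.float32))
```

A similar trick might apply to FlexStridedSlice if it comes from a slicing pattern the builtin STRIDED_SLICE doesn't cover, by rewriting the slice with reshape/expand_dims, but I haven't pinned down which slice is responsible yet.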
You received this message because you are subscribed to the Google Groups "SIG Micro" group.
To view this discussion on the web visit https://groups.google.com/a/tensorflow.org/d/msgid/micro/019cad18-3532-44f4-9c36-c2f60277e619n%40tensorflow.org.