Is it possible to run multiple TF Lite Micro models on an MCU?


Visal Rajapakse

Aug 5, 2021, 3:03:10 AM
to TensorFlow Lite
Hey everyone, I'm really new to TFLM. I have a use case that would ideally need to execute two models, swapping from Model A to Model B based on a flag. Is this achievable if the necessary resources are available? If so, how do I switch the executing model (for example, using a flag in a condition), and will I have to initialise multiple MicroInterpreters? Thank you in advance.

Pete Warden

Aug 5, 2021, 5:59:39 PM
to Visal Rajapakse, TensorFlow Lite
Hi Visal,
The recommended approach is to use multiple MicroInterpreter objects, one for each model. If you are having trouble fitting everything into RAM, you can also look at sharing the temporary parts of the arena. This isn't documented, but you can look at this test to get an idea of how it works:
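[Editor's note: the multiple-interpreter approach Pete describes can be sketched roughly as below. This is an untested illustration, not code from the thread: the model data arrays, the op list, and the arena sizes are placeholders you would replace with your own, and the MicroInterpreter constructor signature varies between TFLM versions (older releases also take an ErrorReporter argument).]

```cpp
// Sketch: two models, one MicroInterpreter each, selected by a flag.
// g_model_a_data / g_model_b_data are placeholder names for flatbuffer
// arrays (e.g. produced with xxd from your .tflite files).
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "tensorflow/lite/micro/micro_mutable_op_resolver.h"
#include "tensorflow/lite/schema/schema_generated.h"

extern const unsigned char g_model_a_data[];
extern const unsigned char g_model_b_data[];

// One arena per interpreter; sizes must be tuned per model.
constexpr int kArenaSize = 16 * 1024;
alignas(16) static uint8_t arena_a[kArenaSize];
alignas(16) static uint8_t arena_b[kArenaSize];

void RunOnce(bool use_model_b) {
  // Register only the ops your models actually use (placeholder set).
  static tflite::MicroMutableOpResolver<4> resolver;
  resolver.AddFullyConnected();
  resolver.AddSoftmax();

  static tflite::MicroInterpreter interp_a(
      tflite::GetModel(g_model_a_data), resolver, arena_a, kArenaSize);
  static tflite::MicroInterpreter interp_b(
      tflite::GetModel(g_model_b_data), resolver, arena_b, kArenaSize);

  // Allocate each interpreter's tensors once, on first use.
  static bool allocated = false;
  if (!allocated) {
    interp_a.AllocateTensors();
    interp_b.AllocateTensors();
    allocated = true;
  }

  // The flag picks which interpreter executes.
  tflite::MicroInterpreter& interp = use_model_b ? interp_b : interp_a;
  // Fill interp.input(0)->data with your input here, then:
  interp.Invoke();
  // Read results from interp.output(0).
}
```

If RAM is tight, the separate arenas above are the part to revisit: as Pete notes, the temporary (non-persistent) region of the arena can be shared between models, though that technique is undocumented.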

Does this help?

Pete

