Various Issues Running TFLite In Visual Studio


Jonathan Torkelson

Feb 27, 2021, 10:14:17 AM
to SIG Micro
I'm currently trying to run TFLite in Visual Studio on Windows, running in a virtual microprocessor using Virtuoso. I am having several issues and was hoping I could get some help.

Background: Some time ago, I pulled in TFLite by following the Make guidelines provided in Pete Warden's book TinyML. This compiled and ran in Visual Studio, using TF version 2.0.0.

I then attempted to perform inference on a model that uses the following Keras layers:

Conv2D, MaxPooling2D, BatchNormalization, Flatten, Dense (all of the preceding wrapped in TimeDistributed), Bidirectional GRU, Reshape, Dense, Dropout.

My first issue may be due to a TFLite incompatibility with one of the above layers; I'm not sure. I am also performing quantization. When I run my code, I am able to build my model and interpreter and verify the correct input and output shapes. However, when I get the pointers to my input and output buffers, they are zero, so I am unable to perform inference. For a simple dense model they are not zero; they return valid pointers.
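Since quantization is involved, one thing worth double-checking while debugging the buffers is the affine int8 quantization arithmetic TFLite uses when you fill the input tensor by hand. A minimal sketch of that arithmetic is below; the scale and zero-point values are illustrative placeholders, not values from my model:

```python
# Sketch of TFLite's affine quantization scheme for int8 tensors:
#   real_value = scale * (quantized_value - zero_point)
# The scale/zero_point below are made-up example parameters.

def quantize(real, scale, zero_point, qmin=-128, qmax=127):
    """Convert a float value to its int8 quantized representation."""
    q = round(real / scale) + zero_point
    return max(qmin, min(qmax, q))  # clamp into the int8 range

def dequantize(q, scale, zero_point):
    """Recover the (approximate) float value from a quantized one."""
    return scale * (q - zero_point)

scale, zero_point = 0.05, -10       # example parameters only
q = quantize(0.5, scale, zero_point)
x = dequantize(q, scale, zero_point)
```

If values written with the wrong scale or zero point come back nonsensical, the quantization parameters are a likely suspect, separate from the null-pointer problem itself.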

I am doing training and model conversion with TF 2.4.1 from a separate Python pip install, so I then began working on pulling TFLite 2.4.1 into my Visual C++ virtual-microcontroller project. There were build issues, which I resolved, where the MSVC compiler was not happy with things that GCC is OK with; I hope to submit pull requests for an MSVC-compatible build once I get everything working. However, when I went back and tried to build the TFLite for Microcontrollers example with Make, to pull in all the dependencies and show something building with 2.4.1 on Windows, it failed due to a bad checksum when fetching FlatBuffers.
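A bad-checksum failure usually means the fetched archive differs from the hash pinned in the build scripts (a truncated download, an HTML redirect page saved as the archive, or an upstream re-tag of the release). As a quick way to inspect what was actually downloaded, here is a minimal sketch of hashing a file the way such a check does; the file name and contents are placeholders, and the Makefile may pin a different hash algorithm than the SHA-256 used here:

```python
# Sketch: verify a downloaded archive against an expected digest.
# File name, contents, and digest here are placeholders for illustration.
import hashlib

def sha256_of(path, chunk_size=65536):
    """Stream a file through SHA-256 and return the hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Stand-in for the downloaded flatbuffers archive.
with open("example.zip", "wb") as f:
    f.write(b"example archive contents")

expected = hashlib.sha256(b"example archive contents").hexdigest()
print(sha256_of("example.zip") == expected)
```

Comparing the digest of the file the Make download step left behind against the value pinned in the scripts shows whether the download itself is corrupt or whether the pinned hash is simply stale.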

I'm not sure whether the easiest path is to work through the Make build issues on Windows, but trying to cobble together a successful build of 2.4.1 has proven quite difficult.

If I could get help working through these issues I would greatly appreciate it, as I believe the ability to run TensorFlow Lite in a virtual-microcontroller low-code environment, with signals coming in and out like a physical embedded system but without the headaches of hardware setup, would be AWESOME.

Any suggestions? Thank you!
