Hi all,

I am trying to run a different model instead of the hello world project, so I changed the model.cc file accordingly. However, my new model doesn't run correctly with this change: it prints "aborted" on stdout and terminates. I figured out that the error arises during this line:

TfLiteStatus allocate_status = interpreter->AllocateTensors();

Here are my model architecture and model.cc file. Any help would be appreciated.

Thanks
Here is the tflite model: https://drive.google.com/file/d/1nq6PXrq4B2AjZaekfoQHRknzDkqFYFwP/view?usp=sharing
Hi there,

I took a quick look at the model, and we should support all of its layers in TFLM. It's likely you have not increased the tensor arena enough to run the model, since your model is quite a bit larger than the hello world example.

If you want to dig deeper, please build with BUILD_TYPE=debug and run gdb <binary>. The path to the binary should be printed during the last stages of the build.

Thanks,
Nat
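For reference, here is a rough sketch of the arena-sizing pattern, based on the hello world example. Names like g_model and the 100 KB figure are illustrative guesses, not values taken from your model — you'll need to tune the size for your own network (this fragment also depends on the TFLM headers, so it won't compile standalone):

```cpp
#include "tensorflow/lite/micro/all_ops_resolver.h"
#include "tensorflow/lite/micro/micro_error_reporter.h"
#include "tensorflow/lite/micro/micro_interpreter.h"
#include "model.h"  // provides g_model, as in the hello world example

// The hello world example uses an arena of only a couple of KB; a larger
// model needs much more. Bump this (e.g. to 100 KB, an arbitrary starting
// point) until AllocateTensors() succeeds or you hit your device's RAM.
constexpr int kTensorArenaSize = 100 * 1024;
static uint8_t tensor_arena[kTensorArenaSize];

TfLiteStatus SetupInterpreter() {
  static tflite::MicroErrorReporter error_reporter;
  const tflite::Model* model = tflite::GetModel(g_model);
  static tflite::AllOpsResolver resolver;
  static tflite::MicroInterpreter interpreter(
      model, resolver, tensor_arena, kTensorArenaSize, &error_reporter);

  // This is the call that fails when the arena is too small; check the
  // returned status instead of assuming it succeeds.
  TfLiteStatus allocate_status = interpreter.AllocateTensors();
  if (allocate_status != kTfLiteOk) {
    TF_LITE_REPORT_ERROR(&error_reporter, "AllocateTensors() failed");
    return allocate_status;
  }
  // Once allocation succeeds, interpreter.arena_used_bytes() tells you
  // how much was actually needed, so you can trim the arena back down.
  return kTfLiteOk;
}
```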
It looks like you're running into https://github.com/tensorflow/tflite-micro/issues/216, since you're trying to use uint8 quantization.

Please convert your model to use signed int8 quantization to avoid this issue. We switched over because TFLite and the TFLite converter both now default to int8 quantization.

Thanks,
Nat
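In case it helps, here is a sketch of re-converting a model to full signed int8 quantization with the TFLite converter. my_model, the input shape, and the calibration data are placeholders for your own model and samples, so this won't run as-is:

```python
import numpy as np
import tensorflow as tf

def representative_dataset():
    # Yield a few calibration batches shaped like your model's input
    # (the 1x28x28x1 shape here is just a placeholder).
    for _ in range(100):
        yield [np.random.rand(1, 28, 28, 1).astype(np.float32)]

converter = tf.lite.TFLiteConverter.from_keras_model(my_model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
# Force full-integer quantization with signed int8 I/O (not uint8).
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8

tflite_model = converter.convert()
with open("model_int8.tflite", "wb") as f:
    f.write(tflite_model)
```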