Hi,
Thanks for reaching out.
Without seeing the concrete code and reproducible steps (including the TFLite version, etc.), it's hard to identify what's going wrong.
Would you be able to provide more information?
In theory, unless the model is stateful or non-deterministic (I don't think BERT is), it should give you the same output for the same input.
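As a quick sanity check, you could run the same fixed input through your interpreter twice and compare the outputs. A minimal sketch of that pattern is below; `run_inference` is a hypothetical stand-in for your actual TFLite calls (the real calls are shown as comments), since I can't see your code:

```python
import numpy as np

def run_inference(x):
    # Hypothetical stand-in for the real TFLite calls, roughly:
    #   interpreter.set_tensor(input_details[0]["index"], x)
    #   interpreter.invoke()
    #   return interpreter.get_tensor(output_details[0]["index"]).copy()
    # Here we just use a deterministic computation so the sketch runs.
    return x.sum(axis=-1)

# Fix the input (and any preprocessing/tokenization) so the two runs
# are fed byte-identical tensors.
x = np.random.RandomState(0).rand(1, 128).astype(np.float32)

out1 = run_inference(x)
out2 = run_inference(x)

# A deterministic model should produce identical (or at least
# np.allclose) outputs for identical inputs.
print(np.array_equal(out1, out2))
```

If the outputs differ on-device but not on desktop, the usual suspects are non-deterministic preprocessing (e.g. random tokenization or input ordering) or delegate/thread settings, so it's worth pinning those down first.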
Thanks,
YC
On Monday, February 14, 2022 at 10:53:08 AM UTC-8
li...@tamu.edu wrote:
Hi,
My name is Liuyi Jin, from Texas A&M University, majoring in Computer Science. I am wondering whether it's expected that I get different inference results each time I run BERT on my mobile phone. I used the TFLite Model Maker provided on the official TensorFlow website to convert a TF2 model to TFLite.
If this is abnormal, what parameters or configurations should I pay attention to when running TFLite models on Android phones? Thanks.
Best,
Liuyi