Different inference results on the phone


Liuyi Jin

Feb 14, 2022, 1:53:08 PM
to TensorFlow Lite

My name is Liuyi Jin, from Texas A&M University, majoring in Computer Science. I am wondering whether it's expected that I get different inference results each time I run BERT on my mobile phone. I used the TFLite Model Maker provided on the official TensorFlow website to convert the TF2 model to TFLite.

If this is abnormal, what parameters or configurations should I pay attention to when running TFLite models on Android phones? Thanks


Yu-Cheng Ling

Feb 27, 2022, 1:54:16 PM
to TensorFlow Lite, li...@tamu.edu

Thanks for reaching out. 

Without seeing the concrete code with reproducible steps (including the TFLite version, etc.), it would be hard to identify what's going wrong.
Would you be able to provide more information?

In theory, unless the model is stateful or non-deterministic (I don't think BERT is either), it should give you the same output for the same input.
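A quick way to check the determinism point above is to run the same fixed input through the TFLite interpreter twice and compare the outputs. The sketch below is a minimal, self-contained illustration (assuming TensorFlow is installed): it converts a tiny stand-in Keras model so it can run on its own; to test your actual model, construct the Interpreter with model_path="your_model.tflite" instead of model_content=... .

```python
import numpy as np
import tensorflow as tf

# Stand-in model (hypothetical, just so the snippet is self-contained):
# a single dense layer mapping 3 inputs to 4 outputs.
inp = tf.keras.Input(shape=(3,))
out = tf.keras.layers.Dense(4)(inp)
tflite_bytes = tf.lite.TFLiteConverter.from_keras_model(
    tf.keras.Model(inp, out)).convert()

interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()

# Build one fixed input per input tensor (zeros of the right shape/dtype).
inputs = {d["index"]: np.zeros(d["shape"], dtype=d["dtype"])
          for d in interpreter.get_input_details()}

def run_once():
    # Feed the same fixed inputs, invoke, and copy out all outputs.
    for index, value in inputs.items():
        interpreter.set_tensor(index, value)
    interpreter.invoke()
    return [interpreter.get_tensor(d["index"]).copy()
            for d in interpreter.get_output_details()]

first, second = run_once(), run_once()
identical = all(np.array_equal(a, b) for a, b in zip(first, second))
print("deterministic:", identical)
```

If this prints False for your model with a fixed input, the non-determinism is in the model or interpreter setup rather than in your Android app code.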


Lu Wang

Feb 28, 2022, 12:23:48 PM
to Yu-Cheng Ling, Yuqi Li, TensorFlow Lite, li...@tamu.edu
+Yuqi Li 

Hi Liuyi,

You can check whether your model works well with this desktop demo tool for BertNLClassifier. If the results look good, we can help debug your inference code further. Otherwise, there may be an issue on the Model Maker side.


To view this discussion on the web visit https://groups.google.com/a/tensorflow.org/d/msgid/tflite/2d8c6e1e-a3a5-4196-99f7-50c802bf1474n%40tensorflow.org.