Conversion of mobilenet protobuf to .tflite with multiple outputs


s.mehltrett...@gmail.com

Jan 23, 2019, 6:44:16 AM1/23/19
to Discuss
Hello everyone,
I would like to use MobileNet (ssd_mobilenet_v1_android_export.pb) in an Android app to classify objects/persons and draw bounding boxes around them. I have already found a TensorFlow Mobile example app ( https://github.com/tensorflow/tensorflow/tree/master/tensorflow/examples/android ) that meets my requirements, but the detection and bounding-box calculation takes 200 ms. By 200 ms I mean the runtime of the native run method in TensorFlowInferenceInterface.class.
Apart from that, I found a TensorFlow Lite example app (https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/java/demo) that performs image classification (no bounding boxes) with the model mobilenet_v1_1.0_224.tflite within 60 ms. The 60 ms refer to the runtime of the native run() method in Interpreter.class, which is called from the runInference() method of ImageClassifierFloatMobilenet.java.

After reading the TensorFlow Lite documentation on model conversion (https://www.tensorflow.org/lite/convert/cmdline_examples), I wondered whether this huge performance gap comes from a model that is not converted correctly. So I tried to convert ssd_mobilenet_v1_android_export.pb to a .tflite file with multiple outputs. My questions are:
1) If I need both a classification and a bounding box from MobileNet, which output_arrays parameters do I have to pass when converting the model?
2) Are there any other reasons why TF Mobile is so much slower than TF Lite? Are the protobuf files the reason?

Here is the example from the TF Lite documentation on model conversion; I don't know which parameters to use to convert MobileNet for bounding-box detection and classification.

curl https://storage.googleapis.com/download.tensorflow.org/models/inception_v1_2016_08_28_frozen.pb.tar.gz \
  | tar xzv -C /tmp
tflite_convert \
  --graph_def_file=/tmp/inception_v1_2016_08_28_frozen.pb \
  --output_file=/tmp/foo.tflite \
  --input_arrays=input \
  --output_arrays=InceptionV1/InceptionV1/Mixed_3b/Branch_1/Conv2d_0a_1x1/Relu,InceptionV1/InceptionV1/Mixed_3b/Branch_2/Conv2d_0a_1x1/Relu
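For what it's worth, here is the kind of command I have been experimenting with for my model. The node names below are assumptions taken from other SSD MobileNet conversion threads, not from my graph: Preprocessor/sub as the input (after preprocessing), a 1x300x300x3 input shape, and concat / concat_1 as the raw box-encoding and class-score outputs. The actual names in ssd_mobilenet_v1_android_export.pb would need to be checked, e.g. with the summarize_graph tool.

```shell
# Assumptions (not verified against this .pb): input node Preprocessor/sub
# with shape 1x300x300x3; raw outputs concat (box encodings) and
# concat_1 (class scores). Inspect the graph before relying on these names.
tflite_convert \
  --graph_def_file=ssd_mobilenet_v1_android_export.pb \
  --output_file=ssd_mobilenet_v1.tflite \
  --input_arrays=Preprocessor/sub \
  --input_shapes=1,300,300,3 \
  --output_arrays=concat,concat_1
```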


The protobuf file can be found at https://www.dropbox.com/home/Modelfile?preview=ssd_mobilenet_v1_android_export.pb.

It would be great if somebody could help me! I have been trying to fix this performance issue for a week.

Thanks in advance!


bluegreen

Jan 23, 2019, 7:11:48 AM1/23/19
to Discuss
I would like to use runForMultipleOutputs as described here (https://github.com/tensorflow/tensorflow/issues/15633) by KaviSanth on 20 Sep 2018, but I don't know the correct parameters to convert my model.
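To make the question concrete, this is roughly how I understand runForMultipleOutputs would be called once the model converts with two output arrays. This is only a sketch: the class, the NUM_BOXES and NUM_CLASSES values, and the output order are my assumptions and depend entirely on how the model ends up being converted.

```java
import org.tensorflow.lite.Interpreter;

import java.io.File;
import java.nio.ByteBuffer;
import java.util.HashMap;
import java.util.Map;

public class SsdDetector {
    // Assumed dimensions; the real values depend on the converted model.
    private static final int NUM_BOXES = 1917;
    private static final int NUM_CLASSES = 91;

    public static void detect(File modelFile, ByteBuffer imgData) {
        Interpreter tflite = new Interpreter(modelFile);
        try {
            // Assumed output order: index 0 = raw box encodings,
            // index 1 = class scores. Check the converter log to confirm.
            float[][][] boxes = new float[1][NUM_BOXES][4];
            float[][][] scores = new float[1][NUM_BOXES][NUM_CLASSES];

            Object[] inputs = {imgData}; // preprocessed 1x300x300x3 image
            Map<Integer, Object> outputs = new HashMap<>();
            outputs.put(0, boxes);
            outputs.put(1, scores);

            // Runs inference and fills both output arrays in one call.
            tflite.runForMultipleOutputs(inputs, outputs);
        } finally {
            tflite.close();
        }
    }
}
```

Both arrays are filled in a single inference call, which is why I would like to get the multi-output conversion right instead of running the model twice.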