TensorFlow Lite enabled the new converter by default in nightly builds

Lawrence Chan

Feb 26, 2020, 7:58:05 PM
to tfl...@tensorflow.org
If you don’t use the TensorFlow Lite Converter, you can stop reading now.

Hi everyone,

What is happening?

The TensorFlow Lite converter is switching to a new version by default in nightly builds; this change was announced last quarter.
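
The way the converter is invoked does not change; on a recent nightly build the usual conversion path simply exercises the new converter. A minimal sketch, where "my_saved_model" is a placeholder for an existing SavedModel directory:

    import tensorflow as tf

    # "my_saved_model" is a placeholder path to an existing SavedModel directory.
    converter = tf.lite.TFLiteConverter.from_saved_model("my_saved_model")

    # On a recent nightly build, this call now goes through the new,
    # MLIR-based converter by default.
    tflite_model = converter.convert()

    with open("model.tflite", "wb") as f:
        f.write(tflite_model)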

Why is this happening?

  • Enables conversion of new classes of models, including Mask R-CNN, Mobile BERT, and many more

  • Adds support for functional control flow (enabled by default in TensorFlow 2.x)

  • Tracks the original TensorFlow node names and Python code, and surfaces them during conversion if errors occur

  • Leverages MLIR, Google's cutting-edge compiler technology for ML, which makes the converter easier to extend to accommodate feature requests

  • Adds basic support for models with input tensors containing unknown dimensions (this and functional control flow are exercised in the sketch after this list)

  • Supports all existing converter functionality
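
To illustrate two of the bullets above (functional control flow and unknown input dimensions), here is a toy sketch; the function, shapes, and values are made up purely for illustration:

    import tensorflow as tf

    # Toy function with an unknown (None) leading dimension and TF 2.x
    # functional control flow via tf.cond.
    @tf.function(input_signature=[tf.TensorSpec([None, 8], tf.float32)])
    def model(x):
        return tf.cond(tf.reduce_sum(x) > 0.0,
                       lambda: x * 2.0,
                       lambda: x - 1.0)

    converter = tf.lite.TFLiteConverter.from_concrete_functions(
        [model.get_concrete_function()])
    tflite_model = converter.convert()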


The switch to the new converter by default was submitted in git commit 06db91.

My use case started failing. What should I do?

We’ve extensively tested the correctness and runtime performance of a variety of models generated by the new converter. However, if there is something we missed and you observe an unexpected failure or regression:

  • Create a GitHub issue with the component label “TFLiteConverter” and include:

    • The command used to run the converter, or the code if you’re using the Python API

    • The output from the converter invocation

    • The input model to the converter

    • If the conversion succeeds but the generated model is wrong, state what is wrong:

      • Wrong results and/or a drop in accuracy

      • Correct results, but the model is slower than the one generated by the old converter

  • If you are using the allow_custom_ops feature, please read the Python API and Command Line Tool documentation

  • Switch back to the old converter by setting --experimental_new_converter=false (from the tflite_convert command line tool) or converter.experimental_new_converter=False (from the Python API); see the sketch below
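
A minimal sketch of the last two bullets from the Python API, assuming a SavedModel at a placeholder path; the attribute names are the ones quoted above:

    import tensorflow as tf

    converter = tf.lite.TFLiteConverter.from_saved_model("my_saved_model")

    # Only if your model needs ops that are not built into TFLite; see the
    # allow_custom_ops documentation referenced above.
    converter.allow_custom_ops = True

    # Temporary workaround: fall back to the old converter while the issue
    # you reported is being investigated.
    converter.experimental_new_converter = False

    tflite_model = converter.convert()

The equivalent tflite_convert invocation passes --experimental_new_converter=false on the command line, as noted above.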


Thanks,


Lawrence Chan on behalf of TFLite and MLIR teams
TPM: TensorFlow Lite, Micro, MOT
PM: Internal ML Insights, External TensorFlow Metrics

Cezary Zaboklicki

Feb 28, 2020, 5:41:29 AM
to TensorFlow Lite
Hi, that looks amazing; it worked great for my LSTM/GRU models.

I was wondering whether there is anything a student could contribute to TFLite as a GSoC project?

Thanks for any help!