TensorFlow Lite signature support enhancements

Jaesung Chung

Aug 20, 2021, 1:47:18 AM
to TensorFlow Lite

Hi everyone,

We want to share the recent enhancements of signature support and gather your early feedback. You can try out the feature in TensorFlow nightly build, or use the TensorFlow master branch code.

The input/output specifications of a model are called "signatures". Signatures specify the inputs and outputs of the converted TensorFlow Lite model by respecting the TensorFlow model's signatures, and they also allow a single TensorFlow Lite model to support multiple entry points.

Refer to our newly released "Signatures in TensorFlow Lite" Colab, which illustrates how to convert TensorFlow models with signatures to a TensorFlow Lite model and run inference using those signatures.

Converter API enhancements

The TensorFlow Lite converter APIs carry the signature information described above into the converted TensorFlow Lite model.

This conversion functionality is available on all the converter APIs starting from TensorFlow version 2.7.0.
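As a minimal sketch of what this looks like in Python (the module, signature names "add_one" and "double", and tensor shapes below are illustrative, not part of the TFLite API):

```python
import tempfile

import tensorflow as tf


# A toy module with two entry points; the method and signature names
# ("add_one", "double") are made up for this example.
class Calculator(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def add_one(self, x):
        return {"result": x + 1.0}

    @tf.function(input_signature=[tf.TensorSpec([None], tf.float32)])
    def double(self, x):
        return {"result": x * 2.0}


model = Calculator()
export_dir = tempfile.mkdtemp()
# Export with explicit signature keys; the converter preserves them.
tf.saved_model.save(
    model, export_dir,
    signatures={"add_one": model.add_one, "double": model.double})

converter = tf.lite.TFLiteConverter.from_saved_model(export_dir)
tflite_model = converter.convert()  # bytes of the .tflite flatbuffer

# Both signatures survive conversion into the TFLite model.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
print(sorted(interpreter.get_signature_list()))  # ['add_one', 'double']
```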

Inference API enhancements

TensorFlow Lite inference APIs have supported signature-based execution since TensorFlow 2.5. Starting with TensorFlow 2.7.0, the C++ language binding supports signature-runner-based execution. The C++, Java, and Python language bindings support multiple entry points.
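In Python, for example, an entry point can be looked up by its signature key and invoked with keyword arguments named after the signature's inputs. Here is a self-contained sketch; the model and the signature name "scale" are invented for illustration:

```python
import tempfile

import numpy as np
import tensorflow as tf


# Illustrative single-signature model; "scale" is a made-up key.
class Scaler(tf.Module):
    @tf.function(input_signature=[tf.TensorSpec([2], tf.float32)])
    def scale(self, x):
        return {"y": x * 3.0}


m = Scaler()
export_dir = tempfile.mkdtemp()
tf.saved_model.save(m, export_dir, signatures={"scale": m.scale})
tflite_model = tf.lite.TFLiteConverter.from_saved_model(export_dir).convert()

interpreter = tf.lite.Interpreter(model_content=tflite_model)
# Fetch the runner for the "scale" signature and call it with the
# input name ("x") defined by that signature.
runner = interpreter.get_signature_runner("scale")
output = runner(x=np.array([1.0, 2.0], dtype=np.float32))
print(output["y"])  # [3. 6.]
```

The signature runner takes care of tensor allocation and index bookkeeping, so no manual calls to `allocate_tensors` or tensor-index lookups are needed.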

Known issues/limitations

  • Because the TFLite interpreter does not guarantee thread safety, signature runners from the same interpreter must not be executed concurrently.
  • Support for C/iOS/Swift is not available yet.


For issues, please create a GitHub issue with the component label "comp:lite".

For feedback, please reply to all.


The TensorFlow Lite team
