Hi, everyone,
I'm sorry I missed the meeting the other day. I was trying to finish the
JavaCPP Presets for TensorFlow Lite so we could perhaps talk about them,
and ended up sleeping through the meeting instead.
Anyway, it's nothing super important, but the TF Lite team offers Python
binaries for Linux, Mac, and Windows:
https://google-coral.github.io/py-repo/tflite-runtime/
However, they do not offer any such builds for the Java API, which is
Android-only:
https://repo1.maven.org/maven2/org/tensorflow/tensorflow-lite/2.5.0/
The official Java API is limited and inefficient anyway; they admit as
much in the documentation:
https://www.tensorflow.org/lite/guide/inference#android_platform
So I've created builds that expose the C++ API with JavaCPP and that do
run on Linux, Mac, and Windows:
https://github.com/bytedeco/javacpp-presets/tree/master/tensorflow-lite
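For anyone curious what this looks like from Java, here is a rough sketch of
running inference through the C++ API as mapped by the presets. It follows the
shape of the TF Lite C++ "minimal" example; the class and method names are my
understanding of the preset's mapping, and "model.tflite" is a placeholder
path, so treat it as illustrative rather than copy-paste ready:

```java
import org.bytedeco.javacpp.*;
import org.bytedeco.tensorflowlite.*;
import static org.bytedeco.tensorflowlite.global.tensorflowlite.*;

public class MinimalTFLite {
    public static void main(String[] args) {
        // Load a .tflite flatbuffer from disk (placeholder path).
        FlatBufferModel model = FlatBufferModel.BuildFromFile("model.tflite");
        if (model == null || model.isNull()) {
            throw new RuntimeException("Failed to load model");
        }

        // Build an interpreter using the built-in op resolver,
        // mirroring the C++ InterpreterBuilder pattern.
        BuiltinOpResolver resolver = new BuiltinOpResolver();
        InterpreterBuilder builder = new InterpreterBuilder(model, resolver);
        Interpreter interpreter = new Interpreter((Pointer)null);
        builder.apply(interpreter);

        // Allocate tensor buffers, fill inputs, then run inference.
        if (interpreter.AllocateTensors() != kTfLiteOk) {
            throw new RuntimeException("Failed to allocate tensors");
        }
        // ... write input data into the interpreter's input tensors here ...
        if (interpreter.Invoke() != kTfLiteOk) {
            throw new RuntimeException("Inference failed");
        }
        // ... read results from the interpreter's output tensors here ...
    }
}
```

The nice part is that this is the same interpreter the C++ users get, so none
of the limitations of the official Java API apply.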
Given that most people use TF on Java only for inference, and that we
can convert most TF models to TF Lite, I think it makes sense to offer a
solution that is over an order of magnitude smaller (currently about 2
megs for TF Lite vs 50 megs for TF Core, per platform on average).
Latency is also potentially lower, which would be a bonus.
Now, I'm perfectly happy to leave those builds in the Bytedeco
organization, but if there are SIG-JVM users who would prefer to have
them at https://github.com/tensorflow/java and to develop a high-level,
idiomatic API on top, possibly using the ndarray package, please let me
know!
Samuel