Support for TensorFlow Lite


Samuel Audet

May 31, 2021, 6:01:25 AM
to SIG JVM
Hi, everyone,

I'm sorry I missed the meeting the other day. I was trying to finish the
JavaCPP Presets for TensorFlow Lite so that we could perhaps discuss them,
and ended up sleeping through the meeting instead.

Anyway, it's nothing super important, but the TF Lite team offers Python
binaries for Linux, Mac, and Windows:
https://google-coral.github.io/py-repo/tflite-runtime/

However, they do not offer any such builds for the Java API, which is
Android-only:
https://repo1.maven.org/maven2/org/tensorflow/tensorflow-lite/2.5.0/

The official Java API is limited and inefficient anyway; they admit as
much in the documentation:
https://www.tensorflow.org/lite/guide/inference#android_platform

So I've created builds that expose the C++ API with JavaCPP and that do
run on Linux, Mac, and Windows:
https://github.com/bytedeco/javacpp-presets/tree/master/tensorflow-lite

Given that most people use TF on Java only for inference, and that we
can convert most TF models to TF Lite, I think it makes sense to offer a
solution that is over an order of magnitude smaller (currently about 2
megs for TF Lite vs 50 megs for TF Core, per platform on average).
Latency is also potentially lower, which would be a bonus.
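For anyone curious what that TF-to-TF-Lite conversion looks like in practice, here is a minimal Python sketch. The tiny Keras model is purely illustrative, standing in for a real trained model (which you would more likely load with `tf.lite.TFLiteConverter.from_saved_model(path)`):

```python
import numpy as np
import tensorflow as tf

# Tiny illustrative model; a real use case would convert an existing
# trained SavedModel instead of building one on the fly.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(8,)),
    tf.keras.layers.Dense(4),
])

# Convert the TF model to a TF Lite flatbuffer (a byte string).
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_bytes = converter.convert()

# Run inference with the TF Lite interpreter on the converted model.
interpreter = tf.lite.Interpreter(model_content=tflite_bytes)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], np.zeros((1, 8), dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
print(result.shape)
```

The resulting flatbuffer is what the small TF Lite runtime consumes, which is where the size savings over TF Core come from.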

Now, I'm perfectly happy to leave those builds there in the Bytedeco
organization, but if there are SIG-JVM users who would prefer to have
those at https://github.com/tensorflow/java and develop a high-level
idiomatic API on top, possibly using the ndarray package, please let me
know!

Samuel

Adam Pocock

Jun 1, 2021, 8:55:54 AM
to SIG JVM, samuel...@gmail.com
I think we should talk to the TF Lite team before adding a separate Java API for their runtime. It's going to be quite confusing from a product placement standpoint, and Google are more heavily invested in TF on Android than they are in TF Java. They might not take kindly to us creating confusion by having a completely separate API which doesn't interoperate with theirs. It would need careful documentation to show when to use which version, and then we'd also need to explain which of TF Lite and TF Java is appropriate for our users.

Also we've already got far too much work to do supporting TF Java, I'm not sure creating another completely separate Java API is something we have the bandwidth for.

Adam

Karl Lessard

Jun 1, 2021, 12:11:56 PM
to Adam Pocock, SIG JVM, samuel...@gmail.com
I think it could be interesting to support TF Lite serving on TF Java, as many of the models available on the Hub have been trained with it, and I know some users would like to serve them as a service rather than on a device. I have no idea at this point how much work it would be to support TF Lite inference only.

As for training a new model on the lite stack, I agree with Adam’s points.

- Karl

--
You received this message because you are subscribed to the Google Groups "SIG JVM" group.
To unsubscribe from this group and stop receiving emails from it, send an email to jvm+uns...@tensorflow.org.
To view this discussion on the web visit https://groups.google.com/a/tensorflow.org/d/msgid/jvm/1654c526-fcfd-47e6-ab77-a5f19f3e72c9n%40tensorflow.org.

Samuel Audet

Jun 4, 2021, 7:05:48 PM
to Karl Lessard, Adam Pocock, SIG JVM
I'm afraid supporting Java on anything other than Android isn't on their roadmap.

I agree SIG-JVM already has too much work, but someone's going to need to invest in AI for Java at some point.
If not Google or Oracle, maybe Amazon, or even Microsoft with PyTorch? Who knows.

Anyway, I'm just throwing ideas at the wall here. Please don't shoot the messenger.

Samuel

Chris Nuernberger

Jun 4, 2021, 7:12:39 PM
to Samuel Audet, Karl Lessard, Adam Pocock, SIG JVM
As far as deep learning on Java goes in the general case, https://github.com/deepjavalibrary/djl appears to be furthest ahead and supports many deep learning frameworks, including TF Lite.

It is my understanding that Amazon is funding its development, at least partially.


Samuel Audet

Jun 4, 2021, 7:25:50 PM
to Chris Nuernberger, Karl Lessard, Adam Pocock, SIG JVM
Yes, it's interesting to see some investment coming online, but this being Amazon,
with all the politics there, they cannot prioritize anything other than MXNet, unfortunately.

For example, their support for TF Lite is based on the current limited and inefficient Java API
rather than the C++ API, so it's probably not useful for low-latency inference use cases.

Adam Gibson

Jun 4, 2021, 7:30:28 PM
to Chris Nuernberger, Samuel Audet, Karl Lessard, Adam Pocock, SIG JVM
Hi Chris,

We're supporting a similar approach in the Eclipse Deeplearning4j project as well. Sam recently added TF Lite support:
https://github.com/eclipse/deeplearning4j/pull/9335

We also support running TVM models. If anyone is interested, we are under a vendor-neutral open source foundation
and welcome contributions here as well.

Since JavaCPP is at the core of all of this, it plays nicely with all of the other frameworks it supports, providing a NumPy-like common data structure
that can use any backend without compromising on performance.

We're more focused on the interop and packaging aspect. More options in the ecosystem certainly don't hurt.

The broader DL4J project has been around for a long time now, but usage over the years has shifted more towards the ND4J library and Keras/TensorFlow import.


Best,


Adam
