New TF2 Keras layer for TPU-based embedding lookup

Bruce Fontaine

Nov 2, 2020, 1:12:23 PM
to SIG Addons, Tomer Kaftan
Hello SIG Addons,

I work on TensorFlow, specifically around TPUs. Recently we added the following API to TensorFlow: https://www.tensorflow.org/api_docs/python/tf/tpu/experimental/embedding/TPUEmbedding?version=nightly

This is a specialized embedding acceleration API that allows Cloud TPU users to use embedding tables that exceed the memory capacity of a single TPU core. It also allows embedding lookups on sparse and ragged tensors, which are normally not TPU compatible due to the non-static shapes of their component tensors.
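
For anyone not familiar with it, here is a rough sketch of how that public API is configured (the table and feature names below are illustrative only, not from a real model):

    import tensorflow as tf

    # Describe an embedding table. Because the table is sharded across
    # all of the TPU cores, it can exceed the memory of any single core.
    video_table = tf.tpu.experimental.embedding.TableConfig(
        vocabulary_size=1024 * 1024,
        dim=64,
        name='video')

    # Map each input feature to the table used to look it up. Multiple
    # features may share one table.
    feature_config = {
        'watched': tf.tpu.experimental.embedding.FeatureConfig(
            table=video_table, name='watched'),
    }

    # The mid-level API object; in real code this is created under a
    # TPUStrategy scope.
    embedding = tf.tpu.experimental.embedding.TPUEmbedding(
        feature_config=feature_config,
        optimizer=tf.tpu.experimental.embedding.SGD(learning_rate=0.1))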

We've developed a Keras layer built on top of this API that can be used both on TPU and on CPU/GPU (in single-host mode). We would like to share this layer publicly, but it currently does not fit anywhere in the main TensorFlow package: the API is a significant departure from the standard tf.keras.layers.Embedding (due to certain hardware restrictions), so tf.keras.layers is not an appropriate place for it. Yet because it uses Keras symbols, that is the only place under the tf namespace where it could reasonably go.
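
To make that concrete, here is a purely hypothetical sketch of the kind of interface the layer exposes; the name TPUEmbeddingLayer and the exact call signature are placeholders, not the final API:

    # Hypothetical layer name; it reuses the same TableConfig and
    # FeatureConfig objects as the mid-level API above.
    embedding_layer = TPUEmbeddingLayer(  # placeholder name
        feature_config=feature_config,
        optimizer=tf.tpu.experimental.embedding.Adagrad(learning_rate=0.1))

    # Inputs may be tf.SparseTensor or tf.RaggedTensor; the layer returns
    # ordinary dense activations that compose with standard Keras layers.
    watched_ids = tf.SparseTensor(
        indices=[[0, 0], [0, 1], [1, 0]],
        values=tf.constant([3, 1, 4], dtype=tf.int64),
        dense_shape=[2, 2])
    activations = embedding_layer({'watched': watched_ids})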

I would like to submit this layer to TF Addons. What are your thoughts on including more device-specific/performance-oriented layers in TF Addons?

Bruce

Sean Morgan

Nov 3, 2020, 12:37:37 AM
to Bruce Fontaine, SIG Addons, Tomer Kaftan
Hi Bruce,

My first thought is that the layer itself, and device-specific layers in general, are welcome in Addons. However, it would require a maintenance commitment, and a quick glance at the layer shows a lot of internal API usage. Addons requires that layers use only public TF APIs, for compatibility and maintenance purposes.

If you're willing to maintain and refactor the layer, then feel free to submit a PR.

Best,
Sean


Bruce Fontaine

Nov 3, 2020, 12:36:10 PM
to SIG Addons, Sean Morgan, Tomer Kaftan, Bruce Fontaine
Hi Sean, 

Thanks for taking a look at this!

I agree 100% on using only a public interface. The tf.tpu.experimental.embedding.* API linked in my previous mail is actually the public API that the layer is built on top of; sorry for any confusion! That public API will be maintained by us, and I can commit to maintaining the layer itself so that it doesn't break across version releases, etc.
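
As a pointer for reviewers: the layer only wraps the documented enqueue/dequeue training pattern of that mid-level API. Condensed, with placeholder helpers (the resolver setup and compute_loss are elided), that pattern looks roughly like this:

    strategy = tf.distribute.TPUStrategy(resolver)  # resolver created elsewhere

    with strategy.scope():
      embedding = tf.tpu.experimental.embedding.TPUEmbedding(
          feature_config=feature_config,
          optimizer=tf.tpu.experimental.embedding.SGD(learning_rate=0.1))

    @tf.function
    def training_step(iterator):
      # Each element pairs the sparse id features (for the embedding
      # engine) with the dense features for the rest of the model.
      embedding_features, dense_features = next(iterator)
      embedding.enqueue(embedding_features, training=True)

      def tpu_step(dense_features):
        with tf.GradientTape() as tape:
          activations = embedding.dequeue()
          tape.watch(activations)  # activations are not tf.Variables
          loss = compute_loss(activations, dense_features)  # placeholder
        embedding.apply_gradients(tape.gradient(loss, activations))

      strategy.run(tpu_step, args=(dense_features,))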

Bruce



Bruce Fontaine

Jan 12, 2022, 12:35:48 PM
to SIG Addons, Ziyin Huang, Sean Morgan, Tomer Kaftan
It's been a little while since I sent this mail, but we are just about ready to push this code into TF Addons. Given the time that has passed since we discussed this, I want to check again whether there are any issues.

The layer itself (our internal version) now depends only on public symbols in TF 2.8. Additionally, once this layer is in TF Addons, we will add unit tests on the Cloud TPU side to ensure that there is no breakage. We also guarantee continued support for this layer going forward.

I've tagged Ziyin on this chain; they will be sending the pull request at some point in the next few weeks.

Thanks,

Bruce