Attach arbitrary Java Object to an Operand/Op object?


Jim Clarke

Mar 25, 2021, 1:41:00 PM
to SIG JVM
Is there a way to attach an arbitrary Java Object to an Operand or Op object?

That way, the arbitrary Java Object could be queried later on in processing through the Operand or Op itself.

In looking at the Keras Model architecture, this appears to be needed. An object is attached to the Operand when the Model is created, and then later
queried during training, evaluating and prediction.
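
For concreteness, here is the most direct workaround I can picture today, absent such a hook: a side table keyed on asOutput(). This is only a sketch; LayerInfo stands in for whatever object gets attached, and it assumes Output can serve as a stable map key (if not, the op name plus output index would do).

import java.util.HashMap;
import java.util.Map;
import org.tensorflow.Operand;
import org.tensorflow.Output;

// Hypothetical helper, not part of tensorflow-java: associates arbitrary metadata with
// the tensor an Operand denotes. Keying on asOutput() means the same entry is found no
// matter which Operand view (generated op class or Output) the caller happens to hold.
final class OperandMetadata<V> {
  private final Map<Output<?>, V> entries = new HashMap<>();

  void put(Operand<?> operand, V value) {
    entries.put(operand.asOutput(), value);
  }

  V get(Operand<?> operand) {
    return entries.get(operand.asOutput());
  }
}

A model could then hold an OperandMetadata<LayerInfo>, fill it when the model is created, and query it during training, evaluation and prediction.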

jim

Ryan Nett

Mar 25, 2021, 2:59:06 PM
to SIG JVM, jimcla...@gmail.com
It should be possible, but it's a bit fraught, since `asOutput()` or the op's generated output accessor (`z()` or whatever it's called) would give you a different Operand object.
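
To make that caveat concrete, a small hypothetical snippet (assuming the usual tensorflow-java imports): the generated op wrapper and the Output it yields are distinct Java objects, so identity-based attachment falls apart.

try (Graph g = new Graph()) {
  Ops tf = Ops.create(g);
  Operand<TFloat32> a = tf.constant(1f);
  Operand<TFloat32> b = tf.constant(2f);
  Add<TFloat32> sum = tf.math.add(a, b);

  // Metadata keyed on the wrapper object is invisible to code that only holds the Output.
  Map<Operand<?>, String> byWrapper = new IdentityHashMap<>();
  byWrapper.put(sum, "attached-metadata");

  Operand<TFloat32> downstreamView = sum.z();                 // same tensor as sum.asOutput()
  System.out.println(byWrapper.containsKey(downstreamView));  // false: the lookup misses
}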

Can you share a link to the Keras code?  I'm wondering if we can work around it.

Jim Clarke

Mar 25, 2021, 7:02:56 PM
to Ryan Nett, SIG JVM
Specifically, 

/Users/jbclarke/Code/tensorflow/tensorflow/python/keras/engine/node.py (class KerasHistory),
which is stored in a "KerasTensor":
/Users/jbclarke/Code/tensorflow/tensorflow/python/keras/engine/keras_tensor.py (class KerasTensor)

I am not sure we will copy this pattern, but for now I am just investigating.
In keras.Model, they store the model Tensors for input and output and use the “KerasHistory” class 
to get to the originating Layer from the “KerasTensor” when needed. The other approach I am thinking
about is just mapping the Layers to the Operands produced by the input and output layers.
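
For reference, KerasHistory in the Python code is essentially a (layer, node_index, tensor_index) tuple. A rough Java mirror of that back-pointer might look like the following; all names here are hypothetical, since a Java Layer type doesn't exist yet:

// Hypothetical mirror of Keras' KerasHistory: a back-pointer from a produced tensor to
// the layer call that produced it. "Layer" stands in for whatever layer type we define.
final class KerasHistory {
  final Layer layer;       // the originating layer
  final int nodeIndex;     // which invocation of that layer produced the tensor
  final int tensorIndex;   // which output of that invocation the tensor is

  KerasHistory(Layer layer, int nodeIndex, int tensorIndex) {
    this.layer = layer;
    this.nodeIndex = nodeIndex;
    this.tensorIndex = tensorIndex;
  }
}

Under the second approach, the model would simply keep a Map<Output<?>, KerasHistory> (or each Layer would remember the Operands it produced) instead of hanging the history off the Operand itself.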

SequentialModel is a subclass of Model, and I haven’t gotten that far.

Here is what sample TF Python code looks like for using a Model directly.
In this example, "inputs", "x", and "outputs" are tensors.

inputs = tf.keras.Input(shape=(3,))
x = tf.keras.layers.Dense(4, activation=tf.nn.relu)(inputs)
outputs = tf.keras.layers.Dense(5, activation=tf.nn.softmax)(x)
model = tf.keras.Model(inputs=inputs, outputs=outputs)


jim

Karl Lessard

Mar 27, 2021, 10:32:24 AM
to Jim Clarke, Ryan Nett, SIG JVM
But how will that work when you are loading your model from disk instead of building it from scratch? You'll need to find a way to store your object's information in the graph itself so you can recreate and reattach it after loading, in which case your model will end up being very Java-specific. Is the Python code also doing something like that?
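
One graph-level option, sketched below with hypothetical names (and it is exactly the Java-specific trade-off described above): serialize the extra metadata to a string and embed it as a named constant, so it rides along in the GraphDef and can be read back after import. Exact tensorflow-java calls may differ between versions.

try (Graph g = new Graph()) {
  Ops tf = Ops.create(g);
  // ... build the model, then embed its metadata (e.g. JSON describing the layers):
  tf.withName("model_metadata").constant("{\"layers\": [\"dense_1\", \"dense_2\"]}");
  GraphDef def = g.toGraphDef();   // the constant is now part of the serialized graph

  // After loading, fetch the constant back by name and re-parse it.
  try (Graph g2 = new Graph()) {
    g2.importGraphDef(def);
    try (Session s = new Session(g2);
         TString json = (TString) s.runner().fetch("model_metadata").run().get(0)) {
      String metadata = json.getObject();   // reattach to the Java-side objects from here
    }
  }
}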

Jim Clarke

Mar 27, 2021, 6:07:07 PM
to Karl Lessard, Ryan Nett, SIG JVM
Good point.
I am not sure how TF Python is handling it. They currently use an object, "KerasTensor", that holds the tensor along with the "KerasHistory" object that holds the layer info.
Maybe all of that gets serialized when the model is saved; I will need to investigate more.

jim

Adam Pocock

Mar 27, 2021, 9:27:39 PM
to SIG JVM, jimcla...@gmail.com, jne...@gmail.com, karl.l...@gmail.com
Graph serialization is causing me some difficulty in Tribuo: when a graph is deserialized, its initializers aren't populated, so you have to create the init op before serialization. In general, the Graph that comes out of the protobuf is missing things compared to the richer Java object that we write out. Introducing more things that increase this divergence is only going to cause more trouble.
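
A sketch of that create-the-init-op-before-serialization pattern, for reference (this assumes the tf.init() grouping op and the variable-with-initial-value helper in current tensorflow-java, and that "init" is the default target name; details may vary by version):

try (Graph g = new Graph()) {
  Ops tf = Ops.create(g);
  // tf.variable(initialValue) registers an initializer; tf.init() groups all registered
  // initializers under a single named target so it survives the GraphDef round trip.
  Variable<TFloat32> w = tf.withName("w").variable(tf.constant(new float[] {1f, 2f, 3f}));
  tf.init();
  GraphDef def = g.toGraphDef();

  try (Graph g2 = new Graph(); Session s = new Session(g2)) {
    g2.importGraphDef(def);
    s.runner().addTarget("init").run();   // repopulate the variables after deserialization
  }
}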

Adam

Karl Lessard

Mar 27, 2021, 9:49:48 PM
to Adam Pocock, SIG JVM, jimcla...@gmail.com, jne...@gmail.com
This proto seems an interesting starting point to look at: tensorflow-core/tensorflow-core-api/src/gen/java/org/tensorflow/proto/framework/SavedObjectOrBuilder.java

(warning: might be completely wrong too, I’ve just searched very quickly...)

Ryan Nett

Mar 28, 2021, 2:54:13 AM
to SIG JVM, karl.l...@gmail.com, jimcla...@gmail.com, Ryan Nett, crai...@gmail.com
I don't think SavedModel actually saves the Python object. From their example notebook, all the loaded models come back as tensorflow.python.saved_model.load.Loader._recreate_base_user_object.<locals>._UserObject. The tf.Module SavedModel I created isn't a valid SavedModel proto, though, so I couldn't inspect it manually.

I know for the h5 format the model objects are JSON serialized into the h5 metadata.

I'm hoping to do something about initialization saving/loading when I do init scopes, although I need to look at what Python does when saving.  
