Custom objects and Lambda layers problems in Keras Model saving and loading


r poon

Jun 21, 2020, 10:17:36 AM
to Machine Intelligence and Data Science Group

Python's lambda expression is flexible and powerful in providing a short, quick "nameless" function definition; it is generally used as a function argument to another higher-order function.

 

Examples:   double = lambda x: x * 2
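A minimal runnable sketch of both uses (the names double and words are illustrative only):

```python
# A lambda bound to a name behaves like a small named function.
double = lambda x: x * 2
print(double(5))  # 10

# More typically, a lambda is passed as an argument to a
# higher-order function, e.g. as a sort key:
words = ["banana", "fig", "cherry"]
print(sorted(words, key=lambda w: len(w)))  # ['fig', 'banana', 'cherry']
```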

 

 

Lambda Layer

 

 

Keras also provides a Lambda layer, which takes a function as a parameter; the layer applies that function to its input.

 

Lambda class

 

tf.keras.layers.Lambda(

    function, output_shape=None, mask=None, arguments=None, **kwargs)



 

Example:

 

x = Lambda(lambda x: x[:, t, :])(XIn)

 

What this layer does:

 

i) the lambda function inside (small l): slices a 3D array at the value of t in the second dimension, where ":" denotes all elements of that dimension. So if t = 1, only [:, 1, :] is returned as the result.

 


ii) the Lambda layer (capital L): uses this anonymous lambda function as its argument to transform the input XIn.

 

So XIn, with its 3D shape, will be sliced according to the slicing lambda function and the value of t.
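To illustrate the slice itself (outside of Keras), here is a NumPy sketch of the same indexing on a small 3D array; the shapes and values are made up for demonstration:

```python
import numpy as np

# A batch of 2 sequences, each with 3 timesteps of 4 features:
# shape (batch, timesteps, features) = (2, 3, 4)
XIn = np.arange(24).reshape(2, 3, 4)

t = 1
sliced = XIn[:, t, :]  # all batches, timestep t only, all features

print(sliced.shape)  # (2, 4)
```

The timestep dimension is dropped, so a (2, 3, 4) input becomes a (2, 4) output.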

 

According to Keras.io <https://keras.io/api/layers/core_layers/lambda/>:

 

The Lambda layer exists so that arbitrary TensorFlow functions can be used when constructing Sequential and Functional API models. Lambda layers are best suited for simple operations or quick experimentation.


Indeed, the function used in a Lambda layer can be either a tf.keras function or a custom function.

However, such flexibility creates challenges in model development and saving.



Model Saving Problem for the lambda function:

Error: TypeError: can't pickle _thread.lock objects


The following lambda function (inside a Lambda layer) causes a problem during model saving (model.save):

x = Lambda(lambda x: x[:, t, :])(XIn)


Solution: wrap the lambda's logic in a named function (a closure over t):


def l_slice(Vec):
    V = Vec[:, t, :]
    return V

x = Lambda(l_slice)(XIn)
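The distinction can be sketched with plain pickle, a rough stand-in for the serialization step inside model.save (the exact Keras failure path differs, but the name-lookup issue is the same): a module-level def pickles by reference, while a lambda has no importable name.

```python
import pickle

t = 1

# A named, module-level function is pickled by reference (its name),
# so serialization succeeds.
def l_slice(vec):
    return vec[:, t, :]

blob = pickle.dumps(l_slice)
print(pickle.loads(blob) is l_slice)  # True -- restored by name lookup

# A lambda's qualified name is just "<lambda>", so the name lookup
# fails and pickling raises an error.
anon = lambda vec: vec[:, t, :]
try:
    pickle.dumps(anon)
except pickle.PicklingError as e:
    print("cannot pickle lambda:", e)
```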


The cause: a deep-copy step fails when trying to serialize the lambda function.



Model Loading Problem for the Lambda Layer / Custom Objects

Errors: NameError: name 'XYZ' is not defined, or TypeError: 'str' object is not callable

       


Model loading is a deserialization process from the TensorFlow graph/weights back into Python. Because of the nested nature of the function, and without an advance declaration, the reverse process simply loses track of the objects.


load_model function

tf.keras.models.load_model(filepath, custom_objects=None, compile=True)


Solution: provide a dictionary entry for each special function: {"xyz string": abc_function/class}


In our example the dictionary to pass is: custom_objects={"l_slice": Lambda},

which declares that l_slice belongs to the class Lambda.
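As a plain-Python sketch of the mechanism (saved_config and the lookup below are illustrative stand-ins, not actual Keras internals): the saved file stores only the function's string name, and custom_objects supplies the string-to-object mapping the loader needs.

```python
# What the saved model roughly stores: the *name* of the function,
# not the function object itself.
saved_config = {"layer": "Lambda", "function": "l_slice"}

def l_slice(vec):
    return vec[1]

# Loading must turn the string back into a callable. Without a
# name-to-object mapping, the lookup fails -- hence the NameError.
custom_objects = {"l_slice": l_slice}

fn = custom_objects[saved_config["function"]]
print(fn([10, 20, 30]))  # 20
```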


A similar problem might exist for tf.keras functions inside the Lambda layer, for the same name-recognition reason.


An extra import inside the special function helps establish the proper reference during the reverse deserialization process:


def XYZ():
    from keras import backend as K
    ... K.xyzfunction ...


Summary


The Lambda layer and custom functions introduce extra traps in namespace recognition. Using the custom_objects argument in model loading, or an extra reference inside the special function, helps rebuild the references.


Bear in mind that the underlying TF/Keras layer is different from the Python layer, and that (de)serialization presents namespace and timeline challenges when you fold/unfold your in-memory data structures into on-disk structures.








 








