Concatenate probabilistic layers

negar erfanian

May 25, 2022, 7:02:41 PM
to TensorFlow Probability
Hi everyone,

I urgently need help with one layer of my model that uses probabilistic layers.
The story is: I want to build a stack of probabilistic layers that all have the same structure. The reason is that I am feeding zero-padded sequences into this probabilistic layer, so I also need tfd.Masked inside the probabilistic layers to mask out the zero-padded entries in each batch. Since each sequence has a different mask, I want a stack of identical but differently masked probabilistic layers. At first I tried a list of layers (list.append, as shown below), but the problem is that all of the listed layers end up acting according to the mask of the very last layer in the list. I don't know how to deal with this issue...

Here is the code for the probabilistic layer that I will use inside my actual model:

import tensorflow as tf
import tensorflow_probability as tfp
from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense

tfd = tfp.distributions
tfpl = tfp.layers

def problayer_dec_loc(event_shape, input_shape, mask_list):
    layers = []
    for i in range(len(mask_list)):
        layers.append(Sequential([
            Dense(units=tfpl.IndependentNormal.params_size(event_shape), input_shape=input_shape),
            tfpl.DistributionLambda(
                lambda t: tfd.Independent(
                    tfd.Masked(tfd.Normal(loc=t[..., :event_shape], scale=tf.math.exp(t[..., event_shape:])), mask_list[i]),
                    reinterpreted_batch_ndims=1),
                convert_to_tensor_fn=lambda s: s.sample(100)),
        ]))
    return layers, mask_list
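
For context, this is roughly how I intend to use the returned list in my model (padded_inputs here is just a placeholder for my actual padded sequence batches):

dec_layers, masks = problayer_dec_loc(event_shape, input_shape, mask_list)

# each masked layer is meant to be applied to the sequence batch it was built for
out_i = dec_layers[i](padded_inputs[i])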


Any help or insight is very much appreciated.

Brian Patton 🚀

May 27, 2022, 8:46:56 AM
to negar erfanian, TensorFlow Probability
I think the issue is that i is being captured by reference in all of the lambdas. Try hoisting the lambda out into a make_dist_callable function which takes i (or the mask itself) as an argument and returns the lambda.
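
A minimal sketch of that suggestion, assuming the same problayer_dec_loc signature and imports from the original post (make_dist_callable is just the helper name suggested above):

def make_dist_callable(event_shape, mask):
    # event_shape and mask are bound to this call's arguments rather than to
    # the loop variable, so each layer keeps the mask it was built with.
    return lambda t: tfd.Independent(
        tfd.Masked(tfd.Normal(loc=t[..., :event_shape], scale=tf.math.exp(t[..., event_shape:])), mask),
        reinterpreted_batch_ndims=1)

def problayer_dec_loc(event_shape, input_shape, mask_list):
    layers = []
    for i in range(len(mask_list)):
        layers.append(Sequential([
            Dense(units=tfpl.IndependentNormal.params_size(event_shape), input_shape=input_shape),
            tfpl.DistributionLambda(make_dist_callable(event_shape, mask_list[i]),
                                    convert_to_tensor_fn=lambda s: s.sample(100)),
        ]))
    return layers, mask_list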
