Saving TensorFlow Probability Models


Omar

unread,
Sep 6, 2019, 10:34:47 AM9/6/19
to TensorFlow Probability
Hi all,

I am training a TFP model that takes quite a long time to learn. After training I would like to be able to save it and reload it when needed. My model includes some variational layers whose main properties are the posteriors of the weights.

I am relatively new to the TensorFlow world, so I looked on the internet but did not find much.

Do you know how I can save TF models that include TFP variational layers? Even some documentation or an explanation would be fine.

Thanks!

Brian Patton 🚀

unread,
Sep 6, 2019, 10:42:01 AM9/6/19
to Omar, TensorFlow Probability
If you're using a relatively recent build of TFP, then distributions and bijectors all extend from tf.Module, which makes export/import easy via https://www.tensorflow.org/versions/r2.0/api_docs/python/tf/train/Checkpoint?hl=en
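
Something like this minimal sketch should work (untested here; the names are just illustrative), assuming the distribution's parameters are tf.Variables:

import tensorflow as tf
import tensorflow_probability as tfp

# A distribution parameterized by variables is itself a tf.Module, so the
# Checkpoint tracks its variables directly.
loc = tf.Variable(0., name='loc')
scale = tfp.util.TransformedVariable(
    1., bijector=tfp.bijectors.Softplus(), name='scale')
dist = tfp.distributions.Normal(loc=loc, scale=scale)

ckpt = tf.train.Checkpoint(dist=dist)
save_path = ckpt.save('/tmp/tfp_dist_ckpt')

# Later (e.g. in a fresh process), rebuild the same objects and restore:
ckpt.restore(save_path).assert_consumed()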

Brian Patton | Software Engineer | b...@google.com




Omar

unread,
Sep 6, 2019, 12:03:30 PM9/6/19
to TensorFlow Probability, om...@cancergenomics.co.uk
Thanks for your quick response, Brian! I will try it and let you know.



Omar

unread,
Sep 6, 2019, 12:24:50 PM9/6/19
to TensorFlow Probability
It seems to save! Although the accuracy decreases... I am probably implementing it wrong. Will check again!

Brian Patton 🚀

unread,
Sep 6, 2019, 4:15:01 PM9/6/19
to Omar, TensorFlow Probability
It would probably make sense to audit the set of variables you see in the checkpoint to ensure it contains everything you expect.

While we have migrated most distributions & bijectors to hold on to variable references, there are a couple of tf.convert_to_tensor calls still left in __init__ methods, which would cause those variables not to get exported.
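
For example, something along these lines lists every variable name and shape stored in a checkpoint (the path is illustrative; use whatever Checkpoint.save or the CheckpointManager returned):

import tensorflow as tf

# Print the name and shape of every tensor stored in the checkpoint.
for name, shape in tf.train.list_variables('/tmp/tfp_dist_ckpt-1'):
    print(name, shape)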

Brian Patton | Software Engineer | b...@google.com



Josh Chang

unread,
Jan 30, 2020, 2:16:30 PM1/30/20
to TensorFlow Probability
Is there a canonical way to save checkpoints for TFP models defined using an unnormalized log probability and fit using fit_surrogate_posterior? I tried defining a checkpoint based on the optimizer object:

checkpoint = tf.train.Checkpoint(optimizer=opt)
manager = tf.train.CheckpointManager(
    checkpoint, './.tf_ckpts',
    checkpoint_name=checkpoint_name, max_to_keep=3)

But on inspecting the saved checkpoints I get:

print_tensors_in_checkpoint_file(latest_ckp, all_tensors=True, tensor_name='')
None

I see that build_trainable_location_scale_distribution makes calls to convert_to_tensor... is that why I'm not able to save checkpoints of the variational approximation, or am I doing something else wrong? Thanks




Josh Chang

unread,
Jan 30, 2020, 7:04:02 PM1/30/20
to TensorFlow Probability
Never mind, I figured it out: I have to pass the relevant tf.Variable objects in as kwargs.
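
Roughly what ended up working for me (a sketch; the surrogate build and names are illustrative, and I hand the variables over as a single list-valued kwarg):

import tensorflow as tf
import tensorflow_probability as tfp

# Stand-in for my real surrogate posterior.
surrogate_posterior = tfp.experimental.vi.build_factored_surrogate_posterior(
    event_shape=[2])
opt = tf.optimizers.Adam(learning_rate=0.1)

# Pass the surrogate's tf.Variables to the Checkpoint explicitly so they
# end up in the saved files alongside the optimizer state.
checkpoint = tf.train.Checkpoint(
    optimizer=opt,
    variables=list(surrogate_posterior.trainable_variables))
manager = tf.train.CheckpointManager(
    checkpoint, './.tf_ckpts', checkpoint_name='vi', max_to_keep=3)
manager.save()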

Krzysztof Rusek

unread,
Dec 15, 2020, 10:22:29 AM12/15/20
to TensorFlow Probability
Hello,

As Brian suggested, I tried to save a JDN using tf.train.Checkpoint and got an exception during saving:

Unable to save the object {-1: ListWrapper([<tfp.distributions.TransformedDistribution 'build_factored_surrogate_posterior_level_scale_posterior' batch_shape=[] event_shape=[] dtype=float32>, <tfp.distributions.TransformedDistribution 'build_factored_surrogate_posterior_slope_scale_posterior' batch_shape=[] event_shape=[] dtype=float32>])} (a dictionary wrapper constructed automatically on attribute assignment). The wrapped dictionary contains a non-string key which maps to a trackable object or mutable data structure. If you don't need this dictionary checkpointed, wrap it in a non-trackable object; it will be subsequently ignored.

The workaround is to create a dummy module and store the variables in one of its properties.
However, I am not sure whether the order of variables in the module is guaranteed to be the same between executions.
Here is a Colab showing my approach and the exception:
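
Roughly, the workaround looks like this (a sketch; the surrogate build is a stand-in for mine):

import tensorflow as tf
import tensorflow_probability as tfp

# Stand-in for the surrogate posterior that triggers the exception above.
surrogate_posterior = tfp.experimental.vi.build_factored_surrogate_posterior(
    event_shape=[2])

class SurrogateContainer(tf.Module):
    """Dummy module holding the surrogate's variables under string names."""
    def __init__(self, surrogate_posterior):
        super().__init__()
        # Attributes of a tf.Module are tracked, so tf.train.Checkpoint can
        # export these variables without hitting the non-string-key error.
        # They are keyed by their position in this list, hence my question
        # about the ordering being stable between executions.
        self.params = list(surrogate_posterior.trainable_variables)

container = SurrogateContainer(surrogate_posterior)
ckpt = tf.train.Checkpoint(container=container)
ckpt.save('./ckpts/surrogate')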