Dave Moore
Nov 7, 2021, 5:52:44 PM
to TensorFlow Probability, Dave Moore, Christopher Suter, TensorFlow Probability, tanmo...@gmail.com
You might also want to look at DeferredModule (https://www.tensorflow.org/probability/api_docs/python/tfp/experimental/util/DeferredModule), which lets you defer complex/multipart transformations by writing a function that determines all parameters of a distribution or bijector given one or more input variables. For example, if for some reason you wanted to parameterize the location and scale of a Normal distribution by splitting a single underlying variable, you could write:
```
import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp
tfd = tfp.distributions

# A single underlying variable holding both parameters.
x = tf.Variable(np.random.randn(2), name='x')

def loc_scale_fn(x):
  # Split the variable into the two raw parameters, then constrain the scale.
  loc, unconstrained_scale = tf.split(x, 2, axis=-1)
  return {'loc': loc, 'scale': tf.nn.softplus(unconstrained_scale)}

dist = tfp.experimental.util.DeferredModule(tfd.Normal, loc_scale_fn, x)
```
Here the resulting `dist` object behaves like a `tfd.Normal` instance that re-initializes itself via `loc_scale_fn` on every method call, so there is always a gradient path back to the underlying variable. Note that `loc_scale_fn` does not need to be a bijection, since you're explicitly providing the underlying variable `x`.
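To see why "re-initializes itself on every method call" keeps the object in sync with the variable, here is a toy pure-Python sketch of the same deferral pattern. The `Deferred` and `Normal` classes below are illustrative stand-ins I wrote for this example, not TFP's actual implementation; the real DeferredModule additionally integrates with `tf.Module` so variables are tracked and gradients flow.

```python
import math

class Deferred:
    """Toy sketch: rebuild the wrapped object from the current
    variable values every time an attribute is accessed."""
    def __init__(self, build_fn, params_fn, variables):
        self._build_fn = build_fn
        self._params_fn = params_fn
        self._variables = variables  # mutable list standing in for tf.Variable

    def __getattr__(self, name):
        # Reconstruct the underlying object on each access, so any change
        # to the variables is always reflected.
        obj = self._build_fn(**self._params_fn(self._variables))
        return getattr(obj, name)

class Normal:
    """Minimal stand-in for tfd.Normal."""
    def __init__(self, loc, scale):
        self.loc, self.scale = loc, scale
    def log_prob(self, v):
        z = (v - self.loc) / self.scale
        return -0.5 * z * z - math.log(self.scale) - 0.5 * math.log(2 * math.pi)

def loc_scale_fn(x):
    raw_loc, raw_scale = x
    # softplus constrains the scale to be positive
    return {'loc': raw_loc, 'scale': math.log1p(math.exp(raw_scale))}

x = [0.0, 0.0]
dist = Deferred(Normal, loc_scale_fn, x)
scale_before = dist.scale      # softplus(0) = log(2) ≈ 0.6931
x[0] = 1.0                     # mutate the underlying "variable"
loc_after = dist.loc           # 1.0 — the object was rebuilt on access
```

Because construction is deferred into every access, updating `x` (in TFP, via an optimizer step) immediately changes the distribution the proxy exposes; there is no stale copy of `loc` or `scale` cached anywhere.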
Dave