How to retrieve latent state over time for DynamicLinearRegression


Sigrid Keydana

Jun 13, 2019, 11:05:57 AM
to TensorFlow Probability
Hi,

I'm preparing a post on DLMs with TFP and would like to showcase an example of dynamic linear regression. However, I don't see how to access the latent state over time — I only seem to be able to draw samples for `drift_scale`, independent of time?

Can you please let me know what I'm missing?

Thanks!

Cheers,
Sigrid

Dave Moore

Jun 13, 2019, 11:31:26 AM
to Sigrid Keydana, TensorFlow Probability
Hi Sigrid,

You can get out the means and covariances over the model's latent state using the `posterior_marginals` method of a LinearGaussianStateSpaceModel instance:

```
ssm = model.make_state_space_model(num_timesteps, param_vals=q_samples_)
latent_means, latent_covs = ssm.posterior_marginals(observed_time_series)
```

where
`model = tfp.sts.Sum([..., tfp.sts.DynamicLinearRegression(...), ...], ...)`
is an STS model instance, and
`q_samples_ = {k: q.sample(50) for k, q in variational_distributions.items()}`
are posterior model parameters sampled from a fitted variational distribution (as in the example notebook).

If your model is a `tfp.sts.Sum` of multiple components, then `np.cumsum([c.latent_size for c in model.components])` gives the (exclusive) end index of each component's block in the latent state; prepending a 0 gives the matching start indices. That's useful if you want to slice out just the dynamic regression part.
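To make the slicing concrete, here's a small sketch using plain NumPy. The component latent sizes below are illustrative stand-ins (e.g. a trend, a seasonal, and a dynamic-regression component), not values from a real fitted model:

```python
import numpy as np

# Hypothetical latent sizes for a Sum of three components, e.g.
# LocalLinearTrend (2), Seasonal (3), DynamicLinearRegression with
# 2 features (2). Illustrative values only.
latent_sizes = [2, 3, 2]

# np.cumsum gives the (exclusive) end index of each component's block;
# prepend 0 to get the matching start indices.
ends = np.cumsum(latent_sizes)             # array([2, 5, 7])
starts = np.concatenate([[0], ends[:-1]])  # array([0, 2, 5])

# Stand-in for latent_means from the state space model, shaped
# [num_posterior_samples, num_timesteps, total_latent_size].
latent_means = np.zeros((50, 30, ends[-1]))

# Slice out the dynamic-regression block (component index 2 here).
i = 2
regression_means = latent_means[..., starts[i]:ends[i]]  # shape (50, 30, 2)
```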

Dave


Sigrid Keydana

Jun 13, 2019, 3:09:45 PM
to Dave Moore, TensorFlow Probability
Thanks Dave, awesome! This was exactly the missing link I needed ... :-)

Sigrid Keydana

Jun 14, 2019, 3:50:05 PM
to Dave Moore, TensorFlow Probability
Hi Dave,

Thanks again, this looks promising! One terminological question: the way this is implemented, is it more appropriate to call these filtered or smoothed estimates, in standard DLM lingo?

Thanks!

Dave Moore

Jun 14, 2019, 5:32:10 PM
to Sigrid Keydana, TensorFlow Probability
You're welcome! The distributions from `ssm.posterior_marginals(observed_time_series)` are smoothed, i.e., they're the marginals of the posterior conditioned on the entire time series (from the forward-backward algorithm). If you just need the forward pass, `ssm.forward_filter(observed_time_series)` will give you the filtered marginals.
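To illustrate the distinction outside of TFP, here's a minimal from-scratch sketch (not the TFP implementation) of a scalar Kalman filter plus Rauch-Tung-Striebel smoother for a local-level model: the filtered estimate at step t conditions only on observations up to t, while the smoothed estimate conditions on the entire series via a backward pass.

```python
import numpy as np

def filter_and_smooth(ys, q=0.1, r=0.5):
    """Kalman filter + RTS smoother for x_t = x_{t-1} + N(0, q),
    y_t = x_t + N(0, r). Returns (filtered_means, smoothed_means)."""
    n = len(ys)
    mf, pf = np.zeros(n), np.zeros(n)  # filtered means / variances
    mp, pp = np.zeros(n), np.zeros(n)  # one-step-ahead predictions
    m, p = 0.0, 1e6                    # near-diffuse prior
    for t, y in enumerate(ys):
        mp[t], pp[t] = m, p + q        # predict
        k = pp[t] / (pp[t] + r)        # Kalman gain
        m = mp[t] + k * (y - mp[t])    # update with observation y_t
        p = (1 - k) * pp[t]
        mf[t], pf[t] = m, p
    # Backward (RTS) pass: condition each step on the full series.
    ms = mf.copy()
    for t in range(n - 2, -1, -1):
        g = pf[t] / pp[t + 1]
        ms[t] = mf[t] + g * (ms[t + 1] - mp[t + 1])
    return mf, ms

ys = np.array([1.0, 1.2, 0.9, 1.4, 1.1])
filtered, smoothed = filter_and_smooth(ys)
```

At the final timestep the two coincide, since both condition on all observations; at earlier timesteps the smoothed means also incorporate later data.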

Dave

Sigrid Keydana

Jun 15, 2019, 1:41:33 AM
to Dave Moore, TensorFlow Probability
Awesome, thanks! (Actually I see now it's all there in linear_gaussian_ssm.py — I could just have looked there ... :-))
And forward_filter returns the intermediate calculations too, that's nice!