Mixing tfp.layers and Keras layers in the same model

Krzysztof Rusek

Jul 16, 2019, 12:49:33 PM
to TensorFlow Probability
Hello all,  

I have a question regarding the interpretation of a neural network constructed from both tfp.layers and standard Keras layers.
This is quite a common case, e.g. when doing transfer learning from a "frequentist" neural network, or when constructing a model using layers not available in tfp (RNNs).
When we train the whole network using the ELBO as the loss, what do we get?
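A minimal sketch of what I have in mind (the layer choices, shapes, and training-set size below are arbitrary):

import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

NUM_TRAIN_EXAMPLES = 1000  # assumed size of the training set

# Deterministic Keras LSTM feeding a variational dense layer; the Flipout
# layer adds its (scaled) KL term to model.losses automatically.
model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(None, 8)),
    tfp.layers.DenseFlipout(
        1,
        kernel_divergence_fn=lambda q, p, _:
            tfd.kl_divergence(q, p) / NUM_TRAIN_EXAMPLES),
    tfp.layers.DistributionLambda(lambda t: tfd.Normal(loc=t, scale=1.)),
])

# Negative log-likelihood; together with the KL term added by the Flipout
# layer this is the (approximate) negative ELBO.
negloglik = lambda y, rv_y: -rv_y.log_prob(y)
model.compile(optimizer='adam', loss=negloglik)

In such a model the LSTM weights enter the objective only through the log-likelihood term (plus whatever Keras regularizer is attached to them), which is what prompts the interpretation below.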

My interpretation is that the Keras layers act as Bayesian layers with a deterministic (point-mass) surrogate posterior and a normal or Laplace prior, depending on the regularization (L2 or L1).
Is this correct?
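For the L2 case, here is a quick numeric check of that correspondence (sigma is arbitrary, and lam = 1 / (2 * sigma^2)):

import numpy as np
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

sigma = 1.0                     # arbitrary prior scale
lam = 1.0 / (2.0 * sigma**2)    # matching L2 coefficient

w = tf.constant(np.random.randn(3, 4), dtype=tf.float32)

l2_penalty = lam * tf.reduce_sum(tf.square(w))
neg_log_prior = -tf.reduce_sum(tfd.Normal(0., sigma).log_prob(w))

# The two differ only by the weight-independent normalizing constant.
const = w.shape.num_elements() * 0.5 * np.log(2.0 * np.pi * sigma**2)
print(float(l2_penalty), float(neg_log_prior - const))  # essentially equal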

Regards
--
Krzysztof Rusek

Dan Chang

Jun 19, 2021, 4:10:02 PM
to TensorFlow Probability, kru...@gmail.com
I have the same question, and more: e.g., what types of distributions (beyond the normal) can be used as the prior and the surrogate posterior, respectively? It would be nice if there were a tutorial on tfp.layers that gives a technical (not just package-level) overview and discusses the constraints.
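For example, tfp.layers.DenseVariational takes user-supplied callables for both the surrogate posterior and the prior, so presumably something like the sketch below is possible, but it is not clear to me which pairings are actually supported (all names here are mine, and the 1/1000 KL weight stands in for one over the training-set size):

import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions

# Mean-field Normal surrogate posterior over kernel and bias.
def posterior_mean_field(kernel_size, bias_size=0, dtype=None):
  n = kernel_size + bias_size
  return tf.keras.Sequential([
      tfp.layers.VariableLayer(2 * n, dtype=dtype),
      tfp.layers.DistributionLambda(lambda t: tfd.Independent(
          tfd.Normal(loc=t[..., :n],
                     scale=1e-5 + tf.nn.softplus(t[..., n:])),
          reinterpreted_batch_ndims=1)),
  ])

# Fixed (non-trainable) Laplace prior instead of the usual Normal.
def laplace_prior(kernel_size, bias_size=0, dtype=None):
  n = kernel_size + bias_size
  dtype = dtype or tf.float32
  return lambda _: tfd.Independent(
      tfd.Laplace(loc=tf.zeros(n, dtype=dtype),
                  scale=tf.ones(n, dtype=dtype)),
      reinterpreted_batch_ndims=1)

layer = tfp.layers.DenseVariational(
    units=16,
    make_posterior_fn=posterior_mean_field,
    make_prior_fn=laplace_prior,
    kl_weight=1 / 1000,      # 1 / training-set size (assumed here)
    kl_use_exact=False)      # no closed-form KL for Normal vs. Laplace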