Hi,
Is it possible to pass the output of a neural network built with Keras into a joint distribution built with TensorFlow Probability?
More specifically, suppose the last layer of a Keras model is a "tfp.layers.DistributionLambda", so that calling the model returns a "tfp.distributions.Distribution" instance. How can this distribution be passed into a "JointDistributionCoroutineAutoBatched", so that we can "yield" it as part of the joint model? The goal is to then define a surrogate distribution and use variational inference to jointly estimate the neural network and the probabilistic model in which it is embedded.
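For concreteness, here is a minimal sketch of the kind of construction I have in mind. The layer sizes, the Normal output head, the HalfNormal prior, and the placeholder data are purely illustrative, and the "yield network(x)" step is exactly the part I am unsure about:

```python
import tensorflow as tf
import tensorflow_probability as tfp

tfd = tfp.distributions
tfpl = tfp.layers

# Illustrative network: the final DistributionLambda makes the model
# return a tfd.Distribution when called on inputs.
network = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(2),
    tfpl.DistributionLambda(
        lambda t: tfd.Normal(loc=t[..., :1],
                             scale=tf.nn.softplus(t[..., 1:]))),
])

x = tf.random.normal([32, 4])  # placeholder features

def joint_model():
    # An extra latent parameter, just to illustrate that the network sits
    # inside a larger probabilistic model.
    sigma = yield tfd.HalfNormal(scale=1., name="sigma")
    # The step in question: can the Distribution instance returned by the
    # Keras model be yielded here as a node of the joint distribution?
    y = yield network(x)

joint = tfd.JointDistributionCoroutineAutoBatched(joint_model)

# The plan would then be to build a surrogate posterior and use something
# like tfp.vi.fit_surrogate_posterior(joint.log_prob, surrogate, ...) to
# estimate the network weights and sigma jointly.
```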
This should be possible, considering that the TFP homepage states that "TensorFlow Probability ... makes it easy to combine probabilistic models and deep learning".
Appreciate any help on this!
Best,
Sascha