TensorFlow Probability Future


Elmar Janahmadov

Jan 12, 2020, 5:17:13 AM
to TensorFlow Probability
I am wondering what the future of TensorFlow Probability is. Is it being designed as a little brother to TensorFlow's machine learning and image processing, or is it both complementary to TensorFlow ML and an independent framework for solving problems in Bayesian inference, probability, and statistics? PyMC4 on TFP is intuitive and seems like a way forward, but how many people actually see it that way?

rif

Jan 12, 2020, 12:13:31 PM
to Elmar Janahmadov, TensorFlow Probability
Hi Elmar. I'm not totally sure I understand the question, but I'll try to answer.

We view probabilistic machine learning as one of the most important parts of ML. It is becoming increasingly important, and we believe this trend will continue. We develop TFP to be a set of useful, easy-to-use, composable building blocks to enable and support probabilistic machine learning. We view probabilistic machine learning widely, including many data science tasks, hierarchical models a la PyMC4, Bayesian Neural Nets and much more. One important measure of success for us is that our tools are useful and widely used. (We also use our tools to do our own research, and another measure of success is the impact and value of that research.)

Does this help? 

Best,

rif



Elmar Janahmadov

Jan 12, 2020, 4:07:44 PM
to rif, TensorFlow Probability
Actually, I think you guys are doing an incredible job. TF and TFP are an open-source gold mine, with immense potential and a fruitful future.

What I am aiming to understand is whether it is wise to work in only one direction (probabilistic machine learning).

When we work on probability problems, we start by testing our theory on a univariate model and then move to more complex ones. The descriptive model eventually becomes continuous. At each step, the analyst uses the so-called iterative model-building process.
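As a toy illustration of that loop (not TFP code, just a plain-Python conjugate-update sketch): start with the simplest univariate model, fit it in closed form, inspect, then refine.

```python
# One pass of the iterative model-building process: the simplest
# univariate model is a coin, and Beta-Binomial conjugacy makes the
# "fit" a two-line update. (Illustrative only -- names are made up.)

def beta_binomial_update(alpha, beta, heads, tails):
    """Return posterior Beta(alpha', beta') after observing flips."""
    return alpha + heads, beta + tails

# Flat Beta(1, 1) prior; observe 7 heads and 3 tails.
a, b = beta_binomial_update(1.0, 1.0, heads=7, tails=3)
posterior_mean = a / (a + b)  # 8 / 12, i.e. 2/3
```

From here the analyst would criticize the fit and move to a richer (hierarchical, then continuous) model, repeating the same loop.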

So what I am saying is that it may be beneficial to develop TFP in both directions instead of one: make it intuitive for someone who is trying to implement just a simple Bayesian model (PyMC-like), and at the same time resourceful for the person who works on the ML side.

It is true that TF and TFP are useful and widely used across industry (I believe that trend will grow exponentially), but they also need to be easy to learn and continuous in terms of moving from a simple model to a complex one.

rif

Jan 12, 2020, 4:40:28 PM
to Elmar Janahmadov, TensorFlow Probability
Thanks Elmar. We certainly want to make it easy for folks to implement simple Bayesian models on top of TensorFlow. 

Krzysztof Rusek

Jan 13, 2020, 7:07:52 AM
to TensorFlow Probability
BTW, I would like to ask about the future plans for JAX in TFP.



--
Krzysztof Rusek

Brian Patton 🚀

Jan 13, 2020, 10:34:23 AM
to Krzysztof Rusek, TensorFlow Probability
Have you played around with `tfp = tfp.experimental.substrates.jax` yet?
We have been putting a fair amount of focus on getting this up and running in the last couple months.

Most bijectors are passing tests under jax (~11 to go).
A little under half of distributions pass all tests, ~30/80; several more are passing most tests & we have some dtype mismatch etc to work through; some special ops are missing from jax/xla (e.g. lgamma, ibeta, betainc, digamma). Implicit reparameterization, e.g. for gamma, vonmises, and mixtures, also relies on some special Eigen kernels that have yet to be moved into XLA.

All the GP kernels and distributions are passing tests.
MCMC will be ready soon.

We figure bijectors/distributions are the most fundamental/crucial, so have mostly focused there thus far.
Any special requests?

Brian Patton | Software Engineer | b...@google.com



Krzysztof Rusek

Jan 13, 2020, 11:43:37 AM
to TensorFlow Probability
I haven't played with the JAX backend yet, but I noticed a lot of commits related to it; that's why I asked.
Good to hear it is so far along.
I would love to know more about the motivation for this backend, e.g. whether there are use cases where JAX is faster, or whether FunMCMC is designed as part of the JAX port.



--
Krzysztof Rusek

Brian Patton 🚀

Jan 13, 2020, 11:48:34 AM
to federico vaggi, TensorFlow Probability, Dave Moore

On Mon, Jan 13, 2020 at 11:19 AM federico vaggi <vaggi.f...@gmail.com> wrote:
I think you guys have done some awesome work - but I'd love to know what the vision is in terms of high-level APIs. Do you expect people to use PyMC4? Jax-Pyro? Edward2?

My point of view is this:

If what you want is a high-level API for 'canned' inference, I'd suggest trying out pymc4. AIUI they are not quite at parity with pymc3 in terms of MCMC adaptation, but I expect them to close that gap soon.
There is also a nascent stan2tfp project on GitHub (it uses Stan's new stanc3 compiler), if that's your modeling language of choice, but I don't expect the Stan language to express all of the same vectorized math you can with TF/TFP (and thus pymc4).

TFP's JointDistributionCoroutine (with some auto-batching augmentations Dave will be committing after vacation) has us pretty close to feature parity with the pymc4 model-building approach (which was inspired by JDCoroutine, then improved). But pymc4 will likely still have a more data-scientist-oriented focus in terms of whole-package canned inference, MCMC adaptation strategies, and model criticism.

Fundamentally there's a difference in focus where TFP is more oriented around a toolkit for research and experimentation (on top of which folks can build 'canned' approaches; we're willing to assemble pieces from scratch for Google-internal applications) and pymc4 is more data science practitioner oriented. I'd expect us to be upstreaming useful bits from pm4 into TFP insofar as they help with maintainability or complement TFP-proper. We've certainly discussed doing more data science-oriented work, as opposed to MCMC/PPL/GP/uncertainty research, so maybe you'll see this orientation shift a bit in the future.

I've been more engaged in pymc4 and stan2tfp so I have more visibility into their objectives / capabilities, as opposed to numpyro or edward2.
I expect numpyro has some pretty interesting VI tricks you can do with composable effect handlers, based on what I've seen in the past about pyro.
edward2 has been evolving independently of TFP recently, here: https://github.com/google/edward2
 

Brian Patton 🚀

Jan 13, 2020, 12:03:04 PM
to Krzysztof Rusek, TensorFlow Probability
Re: motivation: I'd doubt JAX is faster in most cases, since TF can use XLA just as well as JAX.
For some research cases folks find JAX quicker to iterate with,
or they prefer the numpy API,
or they like the pure functional API, the idea of composable function transformations.
API-wise, it's compatible enough that we were able to just build an internal tf2jax interface to use roughly the same TFP implementations for both TF and JAX, so we figured 'why not?'.

I have seen one interesting thing you can do with JAX's parallel RNG that you can't do with TF [at least, not as parallel]: the Brownian bridge from https://arxiv.org/pdf/2001.01328.pdf. But do note that JAX is very researcher oriented, to the point where productionization/mobile/etc are not really considerations, whereas TF is much more of an ecosystem (TF.js, TFlite, TF federated, TF GAN, TF agents, T2T).
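To illustrate the idea with a numpy sketch (this is not the paper's implementation; the function and seeds are made up): because each bridge interval can be given its own independent stream, midpoints can be sampled in any order, or in parallel, which is exactly what a splittable PRNG like JAX's provides natively.

```python
import numpy as np

def bridge_midpoints(left, right, dt, seeds):
    """Sample the midpoints of Brownian-bridge intervals.

    Each interval gets its own counter-style seed, so the draws are
    reproducible and order-independent -- they could all be computed
    in parallel without sharing RNG state. (Sketch only.)
    """
    mids = np.empty(len(left))
    for i, (l, r, s) in enumerate(zip(left, right, seeds)):
        rng = np.random.default_rng(s)
        # Bridge midpoint: mean of the endpoints, variance dt / 4.
        mids[i] = 0.5 * (l + r) + rng.normal(0.0, np.sqrt(dt / 4.0))
    return mids

# Refine a bridge pinned at W(0) = 0 and W(1) = 1 into two halves.
mid = bridge_midpoints([0.0], [1.0], dt=1.0, seeds=[7])
```

With a stateful RNG (as in classic TF) the draws depend on execution order, which is what makes this construction awkward to parallelize there.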

We have an interesting PPL-related experiment using JAX's function transformations which would have been hard to do in TF, but is enabled by getting into the lower levels of JAX's tracers.


Brian Patton | Software Engineer | b...@google.com


Paige Bailey

Jan 13, 2020, 1:46:07 PM
to Brian Patton 🚀, jax-core, Krzysztof Rusek, TensorFlow Probability
Adding +jax-core, RE: JAX / TFP discussions.



--

Paige Bailey   

Product Manager (TensorFlow)

@DynamicWebPaige

webp...@google.com


 
