Is TFP in active development?


John Stewart

Mar 11, 2022, 12:20:21 AM
to TensorFlow Probability
We are about to make a tough decision about frameworks and tools.  TFP is attractive for various reasons, but development seems to have slowed since 2019/20, at least from what one can glean from GitHub.

Any clues about possible futures of TFP are appreciated. 

Thanks.

rif

Mar 11, 2022, 1:43:24 AM
to John Stewart, TensorFlow Probability
TFP is actively maintained and used, but it's definitely true that development has slowed: we consider TFP fairly mature, and are more focused on using TFP than on continuing to build it out. If you'd like to say more about your situation, we can do our best to help you make a good decision.

Best,

rif


To view this discussion on the web visit https://groups.google.com/a/tensorflow.org/d/msgid/tfprobability/7322a5d8-45a3-44a0-acea-8be0bcdef6c3n%40tensorflow.org.

John Stewart

Mar 11, 2022, 2:13:06 AM
to TensorFlow Probability, Rif A. Saurous
Thank you Rif (love the name btw) -- I'll try to keep it super-brief, to not bore you.  We're a K-12 education company.  We receive loads of digitized student work, and the job of the R&D team is to write online scoring systems that make suggestions to teachers about what each student needs.  We already have transformer-based scorers for student writing running in production at scale, so we're comfortable with the operational aspects of all this stuff.  Other scoring software uses non-DL tools, like R libraries, custom code, even C.

So we'd like to unify onto a single platform and toolset.  The _idea_ of TFP seems perfect for this.  The issue is we'd have to retrain a bunch of "traditional" statisticians, so the tooling needs to be solid and, especially, well-documented.  The TFP project seemed to have a lot of momentum in 2019 (videos, conference talks, tutorials) but not much since then, and the state of the documentation is really not great, tbh.  In terms of the ecosystem, I have taken the Coursera course from Imperial, which also feels like it's aging and is sparsely attended.  The Manning book is superficial and riddled with errors.

I guess I'm trying to get a sense of whether TFP is basically in maintenance mode (albeit actively maintained), or whether there are Big Plans.  I realize, though, that Google may not want to commit one way or the other.

Thanks again,
jds

Krzysztof Rusek

Mar 11, 2022, 2:35:29 AM
to John Stewart, TensorFlow Probability, Rif A. Saurous
John, I've also noticed a broader slowdown in the entire TF ecosystem.
It looks like JAX is the new cool kid, and I am super happy that TFP works flawlessly on JAX. If you want tighter integration, there are also Distrax and Oryx.
Btw, I would ask the devs about Oryx's status, since its development is much slower than TFP's.
 

Message written by John Stewart <cane...@gmail.com> on Mar 11, 2022, at 08:13:

John Stewart

Mar 11, 2022, 2:44:56 AM
to TensorFlow Probability, kru...@gmail.com
Thanks Krzysztof -- I didn't know about distrax -- checking it out now.

jds

Brian Patton 🚀

Mar 25, 2022, 5:03:12 PM
to John Stewart, TensorFlow Probability, kru...@gmail.com
To get back to you here: if what you're looking for is a "single platform and toolset" that encompasses transformers and stats, you could certainly choose either TF+TFP or JAX+TFP.JAX. It's hard to say much more without understanding your requirements in some depth: data size, production environment, batch vs. minibatch training, MLE/MAP/MCMC/VI/SMC, etc. JAX may impose a smaller learning curve because of its NumPy-like API, but practically, folks still need to learn grad/vmap/pmap and embrace the pure FP style of JAX.

As to TFP development status, we've been adding occasional new pieces, often into tfp.experimental, but at this point we do think the toolkit is fairly mature for the use-cases we want internally, so we are more focused on deployments than on incremental development. If there are major packages or pieces of functionality you find missing, it's helpful to flag those by opening an issue on GitHub.

We are still maintaining the software (including continuous testing, quality fixes, and releases), but we could do better about putting out more public talks and docs. A big suite of changes we made last year was the ability to scale up inference in multi-GPU/multi-TPU settings. This is fairly advanced (it unlocks MCMC at scales not seen before), but we haven't yet found cycles to make the awesome astro-data demo I'd eventually like, to showcase this on GCP.


Brian Patton | Software Engineer | b...@google.com




Brian Patton 🚀

Mar 25, 2022, 5:09:31 PM
to John Stewart, TensorFlow Probability, kru...@gmail.com
Re Oryx: we think of this as more of a research codebase than a production one, so take a dependency on it at the risk of some churn.

Distrax is a smaller-footprint, pure-JAX, TFP-inspired library. If all you need is a subset of distributions and bijective transforms, it might serve you well. I don't know off-hand what kind of API-stability/production story they aim to offer, but we were in touch about a year ago to work on cross-compatibility.


Brian Patton | Software Engineer | b...@google.com


