Numerical experiments for cross-chain warmup


Charles Margossian

Oct 25, 2022, 11:06:40 AM
to TensorFlow Probability
Hi all,
I'm studying how increasing the number of chains affects cross-chain warmup algorithms such as ChEES-HMC. For instance, I might look at the Monte Carlo estimator produced by ChEES-HMC with 32 chains after N iterations / gradient evaluations.

To estimate metrics of interest, I run MCMC many times and average across runs. To get good estimates, I want the number of runs to be large (say, hundreds or thousands). If possible, I'd like to do all the runs in parallel on a GPU.
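For concreteness, here is the kind of across-run averaging I have in mind (a sketch only; the squared-error metric, the shapes, and the zero array standing in for real draws are all just illustrative):

import jax.numpy as jnp

nRuns, nSteps, nChains, ndim = 100, 1000, 32, 10
all_chains = jnp.zeros([nRuns, nSteps, nChains, ndim])  # stand-in for real draws

# Illustrative metric: squared error of each run's posterior-mean estimate,
# averaged over the nRuns replications.
def mean_squared_error(all_chains, true_mean):
  run_means = all_chains.mean(axis=(1, 2))            # [nRuns, ndim]
  return ((run_means - true_mean) ** 2).mean(axis=0)  # [ndim]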

I can pass nChains * nRuns chains to TFP and then split them into groups, but then ChEES-HMC uses all the chains for adaptation. Is there a way to restrict adaptation to subgroups of chains? Of course, the chains would then no longer all be synchronous (different numbers of leapfrog steps, etc.).
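To illustrate the problem (a sketch; the shapes and the zero array are placeholders for real draws):

import jax.numpy as jnp

nRuns, nChains, nSteps, ndim = 100, 32, 1000, 10
# Stand-in for draws from one ChEES-HMC call over all nRuns * nChains chains:
states = jnp.zeros([nSteps, nRuns * nChains, ndim])

# Splitting into groups afterwards is just a reshape...
grouped = states.reshape(nSteps, nRuns, nChains, ndim)
# ...but warmup already pooled adaptation statistics across all 3,200 chains,
# so the 32-chain groups are not independent replications of a 32-chain run.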

Maybe it makes more sense to parallelize a for loop that calls TFP multiple times (e.g. each time with 32 chains). 

Any tips welcome!

Charles


Pavel Sountsov

Oct 25, 2022, 1:56:22 PM
to Charles Margossian, TensorFlow Probability
You could use jax.vmap to handle the nRuns dimension, something like:

import jax

@jax.vmap
def one_run(seed):
  inits = sample_inits(nChains, seed)  # user-supplied: draw nChains initial states
  return run_hmc(inits, seed)          # user-supplied: run the sampler on one group

# all_chains has shape [nRuns, nSteps, nChains, ...]
all_chains = one_run(jax.random.split(seed, nRuns))
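One way to fill in run_hmc is with the ChEES adaptation kernel in the JAX substrate of TFP, tfp.experimental.mcmc.GradientBasedTrajectoryLengthAdaptation. A minimal sketch, assuming a toy target and placeholder step sizes / step counts (check the kernel arguments against your TFP version):

import jax
import jax.numpy as jnp
import tensorflow_probability.substrates.jax as tfp

tfd = tfp.distributions
target = tfd.MultivariateNormalDiag(loc=jnp.zeros(10))  # toy target

def sample_inits(num_chains, seed):
  # Overdispersed initial states; the scale of 2 is arbitrary.
  return 2.0 * jax.random.normal(seed, [num_chains, 10])

def run_hmc(inits, seed):
  kernel = tfp.mcmc.HamiltonianMonteCarlo(
      target_log_prob_fn=target.log_prob,
      step_size=0.1,
      num_leapfrog_steps=1)
  # ChEES-style trajectory-length adaptation: the cross-chain statistics
  # are computed over the chains in `inits` only, so under jax.vmap each
  # group of nChains chains adapts independently of the other runs.
  kernel = tfp.experimental.mcmc.GradientBasedTrajectoryLengthAdaptation(
      kernel, num_adaptation_steps=500)
  return tfp.mcmc.sample_chain(
      num_results=1000,
      current_state=inits,
      kernel=kernel,
      trace_fn=None,
      seed=seed)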


Charles Margossian

Oct 26, 2022, 8:33:46 AM
to TensorFlow Probability, si...@google.com, Charles Margossian

Excellent, this works perfectly. Thank you very much!

Mike Lawrence

Oct 26, 2022, 8:56:06 AM
to Charles Margossian, TensorFlow Probability, si...@google.com
Excuse the thread hijack, but seeing Charles posting here (Hey Charles! 👋) made me curious whether anyone knows of resources for Stan users to get up to speed with TFP, ideally including some of the pre-sampling data imports and post-sampling diagnostics/exploration. Any recommendations?


--
Mike Lawrence, PhD
Co-founder & Research Scientist
Axem Neurotechnology
axemneuro.com

~ Certainty is (usually) folly ~

Charles Margossian

Oct 26, 2022, 11:02:45 AM
to TensorFlow Probability, mike....@gmail.com, si...@google.com, Charles Margossian
Hi Mike! This is a great topic. I suggest creating a new thread, and I'm happy to share some thoughts there (though my TFP knowledge remains elementary).