Nonparametric Bayesian Methods in PyMC


M. Higgs

Mar 10, 2014, 8:48:18 AM
to py...@googlegroups.com
While there exists Christopher Fonnesbeck's IPython notebook on Dirichlet processes, I have found little else discussing Bayesian inference in PyMC for non-parametric models using, for example, Dirichlet or Indian Buffet process priors. I would like to start a group focusing on Bayesian nonparametric inference in PyMC.

More specifically, an initial question would be: Is it possible to define a Dirichlet process without resorting to truncations of the stick-breaking representation? Is it possible to use stochastic memorisation in PyMC? Or is this outside the scope of PyMC?
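For concreteness, the truncation in question looks roughly like this in plain NumPy (a minimal sketch of the stick-breaking construction, not PyMC code; `alpha` is the DP concentration parameter and `K` is the truncation level):

```python
import numpy as np

def stick_breaking_weights(alpha, K, seed=None):
    """Mixture weights from a DP(alpha, H) truncated at K sticks.

    beta_k ~ Beta(1, alpha); w_k = beta_k * prod_{j<k} (1 - beta_j).
    The leftover mass 1 - sum(w) is the truncated tail -- exactly
    what one would like to avoid needing.
    """
    rng = np.random.default_rng(seed)
    betas = rng.beta(1.0, alpha, size=K)
    # length of stick remaining before each break: [1, (1-b1), (1-b1)(1-b2), ...]
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - betas)[:-1]))
    return betas * remaining

w = stick_breaking_weights(alpha=2.0, K=20, seed=1)
# w is nonnegative and sums to slightly less than 1; the deficit is the tail
```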

Thomas Wiecki

Mar 13, 2014, 5:24:40 PM
to py...@googlegroups.com
While I don't have good answers, this is definitely of interest.


--
Thomas Wiecki
PhD candidate, Brown University
Quantitative Researcher, Quantopian Inc, Boston

Kai Londenberg

Mar 14, 2014, 7:13:08 AM
to py...@googlegroups.com
While I don't have good answers either: definitely interesting, yes. Example code would help a lot here. What do you mean by "stochastic memorisation", btw? (Googling "stochastic memorisation" turns up a single, funny result.)

I think there's a lot of theory in this area, but few understandable code examples. How about starting a collection of IPython notebooks on GitHub with examples and explanations? The one by Christopher Fonnesbeck might serve as a good starting point. Maybe I'll find time to contribute some bits here and there.

A good starting exercise might be to extend that Dirichlet process notebook with an example of the two-parameter Poisson–Dirichlet process prior (alias Pitman–Yor process). It's a generalization of the Dirichlet process with more control over tail behaviour via a second parameter. See http://en.wikipedia.org/wiki/Pitman%E2%80%93Yor_process for a short description. There's a paper (http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.75.5520&rep=rep1&type=pdf) which describes a "generalized Chinese restaurant process" as a constructive method.
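For illustration, that generalized Chinese restaurant process can be sketched in plain NumPy (an illustrative sketch, not PyMC code; `d` is the discount and `alpha` the concentration, and `d = 0` recovers the ordinary CRP, i.e. the Dirichlet process):

```python
import numpy as np

def pitman_yor_crp(n, d, alpha, seed=None):
    """Sequentially seat n customers via the two-parameter
    (generalized) Chinese restaurant process.

    Requires 0 <= d < 1 and alpha > -d. Customer i joins an
    existing table k with probability (counts[k] - d) / (i + alpha)
    and opens a new table with probability (alpha + d*K) / (i + alpha),
    where K is the current number of tables.
    """
    rng = np.random.default_rng(seed)
    counts = []        # customers per table
    assignments = []   # table index of each customer
    for i in range(n):
        K = len(counts)
        probs = np.array([c - d for c in counts] + [alpha + d * K])
        probs /= i + alpha  # the unnormalized weights sum to i + alpha
        k = rng.choice(K + 1, p=probs)
        if k == K:
            counts.append(1)   # new table
        else:
            counts[k] += 1
        assignments.append(k)
    return assignments, counts

# Larger d thickens the tail: more tables, each holding fewer customers.
assignments, counts = pitman_yor_crp(200, d=0.5, alpha=1.0, seed=42)
```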

Indian Buffet processes are very interesting as well, but I think examples for these would be a bit more involved.

best,

Kai Londenberg

Thomas Wiecki

Mar 25, 2014, 10:28:37 PM
to py...@googlegroups.com
The first example to build would probably be the DP Gaussian mixture model. Efficient sampling would require implementing a collapsed Gibbs sampler (which would also get around the truncation Chris used in his notebook). It isn't difficult, though, and would also show how to implement new step methods.
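As a rough illustration of the idea (not a PyMC step method, and with simplifying assumptions: 1-D data, known observation variance `sigma2`, and a conjugate Normal prior `(mu0, tau2)` on cluster means so the means can be integrated out), one collapsed Gibbs sweep could look like:

```python
import numpy as np

def dp_gmm_gibbs_sweep(x, z, alpha, sigma2, mu0, tau2, rng):
    """One collapsed Gibbs sweep for a 1-D DP mixture of Normals.

    x: data array; z: current integer cluster labels; sigma2: known
    observation variance; (mu0, tau2): Normal prior on cluster means.
    Cluster means are integrated out, so only labels are resampled --
    no stick-breaking truncation is needed.
    """
    for i in range(len(x)):
        xi, zi = np.delete(x, i), np.delete(z, i)
        labels, counts = np.unique(zi, return_counts=True)
        logp = []
        for lab, cnt in zip(labels, counts):
            xs = xi[zi == lab]
            # posterior predictive of x[i] given the cluster's members
            prec = 1.0 / tau2 + len(xs) / sigma2
            mu_n = (mu0 / tau2 + xs.sum() / sigma2) / prec
            var_n = 1.0 / prec + sigma2
            logp.append(np.log(cnt) - 0.5 * np.log(var_n)
                        - 0.5 * (x[i] - mu_n) ** 2 / var_n)
        # new cluster: prior predictive, weighted by alpha
        var_new = tau2 + sigma2
        logp.append(np.log(alpha) - 0.5 * np.log(var_new)
                    - 0.5 * (x[i] - mu0) ** 2 / var_new)
        logp = np.array(logp)
        p = np.exp(logp - logp.max())
        k = rng.choice(len(p), p=p / p.sum())
        z[i] = labels[k] if k < len(labels) else z.max() + 1
    return z

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-5, 1, 30), rng.normal(5, 1, 30)])
z = np.zeros(len(x), dtype=int)   # start with everything in one cluster
for _ in range(5):
    z = dp_gmm_gibbs_sweep(x, z, alpha=1.0, sigma2=1.0,
                           mu0=0.0, tau2=25.0, rng=rng)
# with well-separated data, the labels typically split into two clusters
```

A real PyMC step method would wrap this kind of update, and a non-conjugate version would need something like Neal's Algorithm 8 with auxiliary components.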