gradient information is now part of the pymc master branch


John Salvatier

Feb 1, 2011, 1:14:15 PM2/1/11
to py...@googlegroups.com
Hi All, 

The pymc master branch (https://github.com/pymc-devs/pymc) now supports log-probability gradients and more numpy-like syntax. The following demonstrates how it works:

import pymc as pm

s = (3,4)

x = pm.Normal('x', mu = 1.5, tau = 1.0, size = s)
y = pm.Normal('y', mu = x, tau = 1.0)

# pymc comes with many ways of creating deterministics, including many numpy-like
# factory functions (sin, exp, log, sum, etc.) as well as numpy-like indexing and operators
a = pm.exp(y)
b = pm.sum(a, axis = None)
c = a + b
d = a[:, 0]

z = pm.Normal('z', mu = b, tau = 1.0, value = 1.0, observed = True)

# to calculate the gradient of the log likelihood use
print pm.logp_gradient_of_set(set((x, y)))
# it returns a dictionary mapping each variable to the gradient of the whole
# system's log probability with respect to that variable

# you can also restrict the calculation to a subset of the variables
restricted_set = set((x, y))
print pm.logp_gradient_of_set(set((x, y)), restricted_set) # won't include the log likelihood of z in the calculation
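For intuition about what these gradients contain, here is a minimal plain-NumPy sketch (independent of pymc, so the names here are not part of its API): for a Normal with precision tau, the gradient of the log density is d logp/dx = -tau * (x - mu), which we can verify against a finite-difference estimate:

```python
import numpy as np

def normal_logp(x, mu, tau):
    # log density of a Normal, up to the x-independent constant,
    # in the precision parameterization (tau = 1/sigma**2)
    return -0.5 * tau * (x - mu) ** 2

mu, tau = 1.5, 1.0
x = np.array([0.3, 2.0, -1.1])

analytic = -tau * (x - mu)  # d logp / dx

# central finite-difference check of the analytic gradient
h = 1e-6
numeric = (normal_logp(x + h, mu, tau) - normal_logp(x - h, mu, tau)) / (2 * h)

print(np.allclose(analytic, numeric))  # → True
```

A gradient dictionary like the one above would hold one such array per variable, with the same shape as the variable's value.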

Let me know if you have any questions or encounter any issues. 

The multichain_mcmc package (https://github.com/jsalvatier/multichain_mcmc) takes advantage of gradient information to get better convergence (see the examples folder for usage).
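To illustrate why gradients help MCMC converge faster, here is a generic Metropolis-adjusted Langevin (MALA) sketch in plain NumPy. This is not how multichain_mcmc is implemented, just a standard gradient-based sampler on a toy target; the proposal drifts uphill along the log-probability gradient instead of wandering blindly:

```python
import numpy as np

def mala_step(x, logp, grad_logp, eps, rng):
    """One MALA step: Langevin proposal plus Metropolis-Hastings correction."""
    # gradient-informed proposal: drift uphill, then add noise
    x_new = x + 0.5 * eps**2 * grad_logp(x) + eps * rng.normal(size=np.shape(x))

    def log_q(a, b):
        # log density (up to a constant) of proposing a from b
        mu = b + 0.5 * eps**2 * grad_logp(b)
        return -np.sum((a - mu) ** 2) / (2 * eps**2)

    # accept/reject with the asymmetric-proposal correction
    log_accept = logp(x_new) - logp(x) + log_q(x, x_new) - log_q(x_new, x)
    return x_new if np.log(rng.uniform()) < log_accept else x

# toy target: standard normal, logp = -x**2/2, gradient = -x
logp = lambda x: -0.5 * np.sum(x**2)
grad_logp = lambda x: -x

rng = np.random.default_rng(0)
x = np.array(5.0)  # start far from the mode
samples = []
for _ in range(5000):
    x = mala_step(x, logp, grad_logp, eps=0.9, rng=rng)
    samples.append(float(x))

# after burn-in the chain should be centered near 0 with unit variance
print(np.mean(samples[1000:]), np.var(samples[1000:]))
```

The gradient term pulls proposals toward high-probability regions, which is the same basic reason gradient-aware samplers tend to mix faster than random-walk Metropolis.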

Best Regards,
John Salvatier

David Huard

Feb 1, 2011, 1:32:17 PM2/1/11
to py...@googlegroups.com
Congratulations John,

This is an impressive piece of work.

David

