gradient information is now part of the pymc master branch


John Salvatier

2011/02/01 13:14:15
To: py...@googlegroups.com
Hi All, 

The pymc master branch (https://github.com/pymc-devs/pymc) now supports log-probability gradients and a more numpy-like syntax. The following demonstrates how it works:

import pymc as pm

s = (3,4)

x = pm.Normal('x', mu = 1.5, tau = 1.0, size = s)
y = pm.Normal('y', mu = x, tau = 1.0)

# pymc comes with many ways of creating deterministics, including many numpy-like
# factory functions (sin, exp, log, sum, etc.) as well as numpy-like indexing and operators
a = pm.exp(y)
b = pm.sum(a, axis = None)
c = a + b
d = a[:, 0]

z = pm.Normal('z', mu = b, tau = 1.0, value = 1.0, observed = True)

# to calculate the gradient of the log-likelihood, use
print pm.logp_gradient_of_set(set((x,y)))
# it returns a dictionary with the gradients for each of the variables with respect to the whole system

# you can also restrict the calculation to a subset of variables
restricted_set = set((x, y))
print pm.logp_gradient_of_set(set((x, y)), restricted_set) # won't include the log-likelihood of z in the calculation
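For intuition about what those gradient dictionaries contain: the gradient with respect to a variable is just the analytic derivative of the model's log-density. A minimal numpy sketch for a single Normal variable, independent of pymc (the function names here are illustrative, not part of the pymc API), with a finite-difference check:

```python
import numpy as np

def normal_logp(value, mu, tau):
    # log-density of a Normal with mean mu and precision tau
    return 0.5 * np.log(tau / (2 * np.pi)) - 0.5 * tau * (value - mu) ** 2

def normal_logp_grad(value, mu, tau):
    # analytic derivative: d logp / d value = -tau * (value - mu)
    return -tau * (value - mu)

# compare the analytic gradient against a central finite difference
v, mu, tau, eps = 2.0, 1.5, 1.0, 1e-6
fd = (normal_logp(v + eps, mu, tau) - normal_logp(v - eps, mu, tau)) / (2 * eps)
print(abs(fd - normal_logp_grad(v, mu, tau)) < 1e-6)  # True
```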

Let me know if you have any questions or encounter any issues. 

The multichain_mcmc package (https://github.com/jsalvatier/multichain_mcmc) takes advantage of gradient information to get better convergence (see the examples folder for usage).
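To illustrate why gradient information helps convergence (this is a generic Metropolis-adjusted Langevin sketch on a standard normal target, not the algorithm multichain_mcmc actually implements), each proposal is drifted along the gradient of the log-density before the usual Metropolis correction:

```python
import numpy as np

def logp(x):
    # standard normal log-density, up to an additive constant
    return -0.5 * x ** 2

def grad_logp(x):
    return -x

def mala(n_samples, eps=0.5, seed=0):
    # Metropolis-adjusted Langevin: drift proposals along grad_logp
    rng = np.random.RandomState(seed)
    x = 0.0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        mean_fwd = x + 0.5 * eps ** 2 * grad_logp(x)
        prop = mean_fwd + eps * rng.randn()
        mean_rev = prop + 0.5 * eps ** 2 * grad_logp(prop)
        # acceptance ratio includes the asymmetric proposal densities
        log_alpha = (logp(prop) - logp(x)
                     - (x - mean_rev) ** 2 / (2 * eps ** 2)
                     + (prop - mean_fwd) ** 2 / (2 * eps ** 2))
        if np.log(rng.rand()) < log_alpha:
            x = prop
        samples[i] = x
    return samples

s = mala(20000)
print(s.mean(), s.std())  # should be near 0 and 1 for the standard normal target
```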

Best Regards,
John Salvatier

David Huard

2011/02/01 13:32:17
To: py...@googlegroups.com
Congratulations John,

This is an impressive piece of work.

David

