Using pymc.Potential and pymc.Deterministic objects


Daniel Rosenbloom

May 7, 2015, 6:51:54 AM
to hddm-...@googlegroups.com
Hi all,

I am starting to use kabuki and am trying to implement pymc Deterministic and Potential objects in the model. Is this possible?

Specifically, for each subject, I am simulating an observed value (tau) which is the sum of two Weibull-distributed variables (t1 + t2). These underlying t1 and t2 are unobserved. There are two separate population distributions, with shape and scale = (t1shape, t1scale) and (t2shape, t2scale). So the way I have set up the problem is this:
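As a sanity check on the setup, the data-generating process can be sketched with Python's standard library alone (the shape/scale values below are arbitrary placeholders, not my actual parameters):

```python
import random

# Generative model: each subject's observed tau is the sum of two
# unobserved Weibull-distributed durations, t1 and t2.
# Note: random.weibullvariate(alpha, beta) takes scale alpha first, shape beta second.
def simulate_tau(t1shape, t1scale, t2shape, t2scale):
    t1 = random.weibullvariate(t1scale, t1shape)  # draw from first Weibull
    t2 = random.weibullvariate(t2scale, t2shape)  # draw from second Weibull
    return t1 + t2

random.seed(0)
taus = [simulate_tau(1.5, 4.0, 2.0, 6.0) for _ in range(5)]
print(taus)  # five simulated observations, all strictly positive
```

Only tau is observed; the model's job is to recover the two (shape, scale) pairs from sums like these.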

import pymc as pm
import kabuki as kab


class KWeibSum(kab.Hierarchical):
    def create_knodes(self):
        
        # Population parameters:
        
        # Shape & scale for first Weibull distribution:
        t1shape = kab.Knode(pm.Uniform, 't1shape', lower=0, upper=20, depends=self.depends['foo'])
        t1scale = kab.Knode(pm.Uniform, 't1scale', lower=0, upper=20, depends=self.depends['foo'])
        
        # Shape & scale for second Weibull distribution:
        t2shape = kab.Knode(pm.Uniform, 't2shape', lower=0, upper=20, depends=self.depends['foo'])
        t2scale = kab.Knode(pm.Uniform, 't2scale', lower=0, upper=20, depends=self.depends['foo'])
        
        # Subject stochastics:
        tau_subj = kab.Knode(pm.Uninformative, 'tau_subj', col_name='data', observed=True, depends=('subj_idx','tumortype') ) #depends=self.depends['foo'] ?
        t1_subj = kab.Knode(pm.Uniform, 't1_subj', lower=0, upper=tau_subj, depends=('subj_idx',), subj=True)
        
        # Subject deterministics/potentials:
        t2_subj = kab.Knode(SubtractDeterministic, 't2_subj', a=tau_subj, b=t1_subj, depends=('subj_idx',), subj=False, hidden=True)
        t1dist = kab.Knode(WeibullPotential, 't1dist', value=t1_subj, alpha=t1shape, beta=t1scale, depends=('subj_idx',), subj=False, hidden=True)
        t2dist = kab.Knode(WeibullPotential, 't2dist', value=t2_subj, alpha=t2shape, beta=t2scale, depends=('subj_idx',), subj=False, hidden=True)
        
        return [t1shape, t1scale, t2shape, t2scale, tau_subj, t1_subj, t2_subj, t1dist, t2dist]



"WeibullPotential" and "SubtractDeterministic" are my classes for trying to feed in potential & deterministic objects into the kabuki framework:

class WeibullPotential(pm.Potential):
    def __init__(self, name, value, alpha, beta):
        pm.Potential.__init__(self,
                              logp=pm.weibull_like,
                              name=name,
                              parents={'x': value, 'alpha': alpha, 'beta': beta},
                              doc='Weibull Potential',
                              verbose=0,
                              cache_depth=2)
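For reference, pm.weibull_like is documented to use the density f(x | alpha, beta) = alpha * x**(alpha - 1) * exp(-(x / beta)**alpha) / beta**alpha (alpha = shape, beta = scale), so the log-probability the potential contributes can be checked in plain Python (a sketch for verifying values, not part of the model):

```python
from math import log

def weibull_logp(x, alpha, beta):
    """Weibull log-density with alpha = shape, beta = scale,
    matching the parameterization pm.weibull_like is documented to use."""
    return log(alpha) - alpha * log(beta) + (alpha - 1) * log(x) - (x / beta) ** alpha

# With alpha=1, beta=1 the Weibull reduces to Exponential(1),
# whose log-density at x=1 is log(exp(-1)) = -1.
print(weibull_logp(1.0, 1.0, 1.0))  # -1.0
```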

class SubtractDeterministic(pm.Deterministic):
    def __init__(self, name, a, b):
        pm.Deterministic.__init__(self,
                                  eval=lambda a=a, b=b: a - b,
                                  name=name,
                                  parents={'a': a, 'b': b},
                                  doc='Subtraction',
                                  trace=False,
                                  verbose=0,
                                  dtype=float,
                                  plot=False,
                                  cache_depth=2)


When I use model.approximate_map(), I get an error: "AttributeError: 'WeibullPotential' object has no attribute 'value'". Well, yes, that is true: a Potential object has no value. Is there a way for approximate_map() to skip this node?

When I run the model (without using the MAP to initialize), I get wonky results, including DIC, deviance, and pD all equal to zero.

Many thanks for any advice that you can offer!

Best wishes,
Daniel

Thomas Wiecki

May 8, 2015, 9:16:20 AM
to hddm-...@googlegroups.com
Hi Daniel,

Good to hear you're trying kabuki. Unfortunately, I have never considered potentials in HDDM, so it's quite possible that they are not supported in kabuki. I'm afraid you might have to look into hierarchical.py and find out how to skip that node. Without the full traceback I can't see where it's failing or advise further.

Thomas




--
Thomas Wiecki, PhD
Data Science Lead, Quantopian Inc, Boston

Daniel Rosenbloom

May 8, 2015, 1:49:14 PM
to hddm-...@googlegroups.com
Hi Thomas,

Thanks for the quick reply! I figured out two different errors that it was throwing:

First was in my choice to use a subclass of pm.Deterministic rather than pm.Deterministic directly. I see that in line 152 of hierarchical.py you are already catching and dealing with uses of Deterministic nodes. So I just changed my code to call Knode with pm.Deterministic as the first argument.

Second, it is a quick fix to skip over Potentials in the value calculation. After line 931, in values(), I added "if not isinstance(node.node, pm.Potential):". And now approximate_map() is running without error. I haven't delved into the code enough to understand how exactly the MAP optimization works, or if it is properly using the Potentials to give a correct answer.
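The guard just filters Potential-like nodes out of the value collection, since only they lack a .value attribute. As a toy illustration of the pattern (dummy stand-in classes here, not kabuki's actual node types):

```python
class Potential:
    """Stand-in for pm.Potential: contributes to the joint logp but has no .value."""
    pass

class Stochastic:
    """Stand-in for a node that does carry a value."""
    def __init__(self, value):
        self.value = value

nodes = [Stochastic(0.3), Potential(), Stochastic(1.7)]

# Skip Potentials when collecting node values, as in the patched values():
values = [n.value for n in nodes if not isinstance(n, Potential)]
print(values)  # [0.3, 1.7]
```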


Whether or not I run approximate_map(), I am still getting values of zero for deviance... inspecting mc.db.trace('deviance'), I see that deviance is in fact zero at every iteration. I am not sure what's going on here, but it may be an underlying pymc issue. I am still running my model for more iterations to see if it returns sensible estimates, though.

-Daniel

Thomas Wiecki

May 9, 2015, 4:35:17 AM
to hddm-...@googlegroups.com
Does the model converge otherwise?

I've seen the DIC problem before but not sure what causes it.

Daniel Rosenbloom

May 13, 2015, 5:25:06 PM
to hddm-...@googlegroups.com
I was having some convergence issues too... in particular, parameters are sometimes going outside the uniform prior I set for them. Maybe it would help if I used a less rigid prior, e.g., gamma(0.1, 0.1)?

This discussion suggests a solution to a similar DIC issue: https://groups.google.com/forum/#!topic/pymc/pUSrBYAG4cY. Does that seem relevant?

Thomas Wiecki

May 15, 2015, 3:35:02 AM
to hddm-...@googlegroups.com
Or a HalfNormal distribution as we're using for a group variance parameter.

I'm not sure if that thread contains the solution. kabuki should pass in all random variables to MCMC().