> Does pymc support dynamic bayesian net models?
I'd wait for one of the developers to give an authoritative answer,
but I believe the answer is a cautious "yes". I think to have success in
doing this you may have to dig a little bit under the covers to be sure
that everything is working as expected.
I think the best thing to do is configure your model, create a sampler, do
inference, and then repeat, creating a fresh sampler each time. The
reason I say this is that some things are configured at the time a sampler
is created. I recently ran into an issue where I did:
B = SomeStochastic()
M = pymc.MCMC([B])   # step methods are chosen here (also: don't shadow pymc.MCMC)
B.value = somevalue
B.observed = True    # too late: the sampler already picked a step method for B
This caused a problem because the sampler ignored my "observed" flag: it
saw the variable as unobserved at construction and so assigned a step
method to it. Setting the flag afterwards had no effect on the
already-built sampler.
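To make the gotcha concrete, here's a toy sketch of the mechanism -- this is not PyMC's actual code, just hypothetical classes illustrating how a sampler that snapshots state at construction ignores later mutation:

```python
# Toy illustration (not PyMC itself): the sampler decides which variables
# get step methods once, at construction time.

class Variable:
    def __init__(self, value=None, observed=False):
        self.value = value
        self.observed = observed

class Sampler:
    def __init__(self, variables):
        # Snapshot: only variables unobserved *right now* get step methods.
        self.step_targets = [v for v in variables if not v.observed]

B = Variable()
sampler = Sampler([B])   # B looks unobserved here, so it gets a step method
B.value = 3.0
B.observed = True        # too late: the sampler's snapshot is already taken

print(B in sampler.step_targets)  # True -- B will (wrongly) be resampled
```

Setting `observed` before constructing the sampler avoids the problem, which is why building a fresh sampler per run is the safe pattern.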
If the dependency structure of your model doesn't change between runs you
might be able to get away with reusing the sampler, but it really depends
and you might have to dig in. I'm working on a model now that has this
flavor, and I've encapsulated the observed data (which changes between
iterations) in a Potential.
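Roughly the idea, sketched without PyMC (hypothetical names, not PyMC's actual `Potential` class): keep the data-dependent log-probability in one term that reads whatever data is current, so you can swap observations between runs without rebuilding the model graph.

```python
import numpy as np

# Sketch only: a log-probability term that closes over a mutable data holder.
class DataPotential:
    def __init__(self, data):
        self.data = np.asarray(data, dtype=float)

    def set_data(self, data):
        # Swap in this iteration's observations; the model graph is untouched.
        self.data = np.asarray(data, dtype=float)

    def logp(self, mu):
        # Unit-variance normal log-likelihood, up to an additive constant.
        return -0.5 * float(np.sum((self.data - mu) ** 2))

pot = DataPotential([1.0, 2.0])
before = pot.logp(mu=1.5)
pot.set_data([10.0, 12.0])
after = pot.logp(mu=1.5)
print(before, after)  # the same term now scores the new data
```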
> I have a model where a deterministic variable depends on its value from
> the previous sequence. How do I model this?
When you say "deterministic variable" do you mean "observed Stochastic"
in PyMC parlance? In PyMC, a Deterministic is a variable which is just a
function of its parents. I suppose you could have such a function be
parameterized...
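In case the distinction helps, here's a toy sketch of what I mean by "a function of its parents" (hypothetical classes, not PyMC's API): the node holds no state of its own and just recomputes from its parents' current values.

```python
# Toy sketch: a Deterministic is just a function of its parents.
class Stochastic:
    def __init__(self, value):
        self.value = value

class Deterministic:
    def __init__(self, func, **parents):
        self.func = func
        self.parents = parents  # name -> object with a .value attribute

    @property
    def value(self):
        # Recompute on demand from the parents' current values.
        return self.func(**{k: p.value for k, p in self.parents.items()})

a = Stochastic(2.0)
b = Stochastic(3.0)
total = Deterministic(lambda a, b: a + b, a=a, b=b)
print(total.value)  # 5.0
a.value = 10.0
print(total.value)  # 13.0 -- it follows its parents
```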
Nick
On Tue, 17 Mar 2009, Siddhi wrote:
> Thanks for answering. I'm intrigued by your answer, it looks like you have
> in mind some cases under the 'dynamic Bayesian network' umbrella that would
> be iffy in PyMC?
Not exactly. I think I'm just more iffy on dynamic Bayesian networks; I
thought that they were networks whose dependency structure changed as a
function of the variables or something along those lines. Now I recall
that the structure is static but the variables are repeated.
I don't see any problems with those sorts of networks but, as you say,
you'd have to create a variable for each time step and inference might not
be as efficient as it could be if you somehow took the repeated structure
into account (presumably there are optimized inference techniques for
DBNs).
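To sketch the "one variable per time step" idea in plain NumPy (a hypothetical model, a unit-variance Gaussian random walk -- in PyMC you'd build one Stochastic per step instead): the same template x_t | x_{t-1} ~ Normal(x_{t-1}, 1) is repeated at every step, so the structure is static and the joint log-probability is just a sum over steps.

```python
import numpy as np

def logp_chain(x, x0=0.0):
    """Log joint of a unit-variance Gaussian random walk, up to constants.

    Each x[t] depends only on its predecessor -- the repeated DBN template,
    unrolled into one term per time step.
    """
    x = np.asarray(x, dtype=float)
    prev = np.concatenate(([x0], x[:-1]))  # the parent of each time step
    return -0.5 * float(np.sum((x - prev) ** 2))

print(logp_chain([0.5, 1.0, 0.2]))
```

An optimized DBN inference scheme would exploit this repeated structure (e.g. forward-backward passes); the unrolled model just treats it as a generic network.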
If the dependency structure does change over time, PyMC works too but
you have to either specify all possible parents up front (and ignore the
ones that don't matter when computing logps) or simply aggregate your
variables into one big Stochastic, probably with a custom step function.
This latter approach is what I'm doing with my Chinese restaurant process;
I use the standard PyMC sampler for process parameters but a custom Gibbs
step for altering the process partition.
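The "one big Stochastic plus custom Gibbs step" idea, sketched in plain NumPy on a hypothetical two-component model (much simpler than a CRP, and not PyMC's step-method API): the whole partition lives in one array, and the custom step resamples each assignment from its exact conditional given the rest rather than letting a generic Metropolis step loose on it.

```python
import numpy as np

rng = np.random.default_rng(0)
data = np.array([-1.2, -0.9, 1.1, 0.8])
means = np.array([-1.0, 1.0])          # two fixed components, equal weight
z = np.zeros(len(data), dtype=int)     # the aggregated partition variable

def gibbs_step(z, data, means, rng):
    """One sweep: resample each assignment from its conditional."""
    for i in range(len(data)):
        # p(z_i = k | data_i) is proportional to the likelihood of data_i
        # under a unit-variance normal at each component mean.
        logw = -0.5 * (data[i] - means) ** 2
        w = np.exp(logw - logw.max())
        z[i] = rng.choice(len(means), p=w / w.sum())
    return z

for _ in range(50):
    gibbs_step(z, data, means, rng)
print(z)  # each z[i] is a draw from its conditional; points near -1 mostly get 0
```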
I think at the end of the day, frameworks are optimized for certain kinds
of models and if you have a model that diverges from that, some labor will
be required to make it fit. The best a framework can strive for is to
make the easy things easy and the hard things possible.
Nick