> On Feb 26, 2015, at 2:03 PM,
andy...@cubist-asia.com wrote:
>
> ...
> The question is: what's the best/recommended way to do prediction with new values of x now I have a stanfit object?
>
> 1. I can add a generated quantities block to the Stan code, but that would require me to supply the new data at the same time as the fitting process - something not very clean from a design perspective (also, practically, my system may not have all the new_data values available during the fitting phase).
This is the cleanest way to do it.
>
> 2. Should I read off the mean and sd of the fitted parameters from the stanfit object and then specify them in the data section of a different Stan program that does the sampling?
This will not give you full Bayesian inference.
It will use the posterior means as point estimates
and thus underestimate the uncertainty in your predictions.
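A quick numerical illustration of that underestimation, sketched in Python/NumPy with made-up "posterior draws" (the numbers are purely illustrative, not from any real fit):

```python
import numpy as np

rng = np.random.default_rng(42)

# Pretend these are M posterior draws for mu and sigma.
M = 100_000
mu_draws = rng.normal(0.0, 1.0, M)    # posterior uncertainty in mu
sigma_draws = np.full(M, 2.0)         # sigma known exactly, for simplicity

# Option 2: plug in the posterior mean as a point estimate.
plug_in_sd = sigma_draws.mean()       # predictive sd if mu were known exactly

# Full posterior predictive: simulate one y per posterior draw.
y_pred = rng.normal(mu_draws, sigma_draws)

# Predictive sd is roughly sqrt(Var(mu) + E[sigma^2]) = sqrt(1 + 4),
# which is larger than the plug-in value of 2.0.
print(plug_in_sd, y_pred.std())
```

The plug-in approach drops the Var(mu) term entirely, which is exactly the missing parameter uncertainty.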
> 3. none of the above?
What you want to do is simulate one or more predictions
using an RNG for each draw in the posterior sample.
Then that gives you a sample for the predictions.
If you only care about the mean prediction, then you
can take the mean of this sample. In some cases (e.g.,
when the prediction is linear in the parameters), this
works out to the same value you get by plugging the
posterior means in as point estimates.
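That per-draw simulation can be sketched in Python/NumPy as follows; the draws here are faked so the example is self-contained, but in practice mu_draws and sigma_draws would come out of the stanfit object (e.g., via extract() in RStan):

```python
import numpy as np

rng = np.random.default_rng(0)

# Fake posterior draws standing in for what you'd extract from the fit.
mu_draws = rng.normal(1.0, 0.1, 4000)
sigma_draws = np.abs(rng.normal(0.5, 0.05, 4000))

# One simulated prediction per posterior draw.
y_rep = rng.normal(mu_draws, sigma_draws)

# Posterior predictive mean and a 90% interval.
print(y_rep.mean())
print(np.percentile(y_rep, [5, 95]))
```

The resulting y_rep is a sample from the posterior predictive distribution, so any summary of it (mean, sd, quantiles) propagates the full parameter uncertainty.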
> I guess the more generic version of the question is: what's the best way to "pass" estimated values from one Stan run (via the stanfit object) to another Stan run?
If you really need to do this, the only option is to create
a Stan program that reads in the whole sample object and then
iterates through them generating predictions for each draw in
the sample. Here's a super-duper simple case to illustrate:
data {
  int<lower=1> M;          // num posterior draws
  real mu[M];              // posterior draws for mu
  real<lower=0> sigma[M];  // posterior draws for sigma
}
model {
}
generated quantities {
  real y[M];
  for (m in 1:M)
    y[m] <- normal_rng(mu[m], sigma[m]);
}
You'll need to run it in the special fixed parameters mode
(e.g., algorithm="Fixed_param" in RStan) or it'll fail,
because there are no parameters left to sample.
And of course, if you have predictors or other parameters, those
all need to be passed in the same way.
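For concreteness, with a simple linear regression the same per-draw recipe extends to new predictor values like this (a Python/NumPy sketch; the names alpha_draws, beta_draws, and x_new are illustrative, and the draws are faked rather than extracted from a real fit):

```python
import numpy as np

rng = np.random.default_rng(1)

# Fake posterior draws for a regression y ~ alpha + beta * x.
M = 2000
alpha_draws = rng.normal(0.5, 0.1, M)
beta_draws = rng.normal(2.0, 0.2, M)
sigma_draws = np.abs(rng.normal(1.0, 0.1, M))

x_new = np.array([0.0, 1.0, 2.5])  # new predictor values, chosen arbitrarily

# mu has shape (M, len(x_new)): one linear predictor per draw and new x.
mu = alpha_draws[:, None] + beta_draws[:, None] * x_new[None, :]
y_new = rng.normal(mu, sigma_draws[:, None])

print(y_new.mean(axis=0))  # posterior predictive mean at each x_new
```

Each column of y_new is a posterior predictive sample for one new x, so intervals per prediction are just column-wise quantiles.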
- Bob