Thanks for sharing. We've been meaning to do something like
this ourselves for ages so would like to include a variant of
what you're doing in our doc or example models.
Were you reproducing a textbook example here (I saw the Iris
data)?
Did you fit it with optimization or MCMC?
How well did it work? We'd like to include an example with
some of the mods suggested below.
Where did iris come from? This is the first line of the R code:
iris1=iris[1:100,]
I'd suggest replacing
fitted_middle[q,n, j] <- 1/ (1+exp(-beta_middle[q, j] * fitted_middle[q-1,n]'));
with:
fitted_middle[q,n,j] <- inv_logit(beta_middle[q,j] * fitted_middle[q-1,n]');
dot_product() would be a bit faster still. Faster yet: compute the matrix
product of beta_middle and fitted_middle, assign it to a temporary, and then
assign the inverse logit of the temporary to fitted_middle.
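A hypothetical sketch of that matrix-product version (the layer loop,
num_hidden, num_units, N, and the matrix shapes are assumptions, not taken
from the full model; if inv_logit isn't vectorized over matrices in your
Stan version, apply it column by column instead):

```stan
// Sketch only: assumes each layer's coefficients are a matrix and each
// layer's activations are stored as a matrix (units x data items), so a
// single matrix product replaces the per-item loop.
for (q in 2:num_hidden) {
  matrix[num_units, N] eta;            // temporary for the linear predictor
  eta <- beta_middle[q] * fitted_middle[q - 1];
  fitted_middle[q] <- inv_logit(eta);  // elementwise inverse logit
}
```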
If you define beta_final and fitted_middle appropriately in terms
of shape, you should be able to replace this:
y_data[n] ~ bernoulli_logit(beta_final' * fitted_middle[num_hidden,n]');
with
y_data ~ bernoulli_logit(beta_final * fitted_middle);
without having to transpose anything (though transposes aren't that
expensive, because they don't add any derivatives).
Also, if you don't want to save all of fitted_middle, you could
declare and define it in the model block as a local variable.
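Putting the last two suggestions together, a hedged sketch (the declarations,
num_units, and N are assumptions; only beta_final, fitted_middle's role,
y_data, and bernoulli_logit come from the snippets above):

```stan
model {
  // Declared as a local variable here, so it isn't saved in the output.
  // Shape is an assumption: hidden units in rows, data items in columns.
  matrix[num_units, N] fitted_top;
  // ... fill fitted_top from the lower layers as above ...

  // With beta_final declared as row_vector[num_units], the product
  // beta_final * fitted_top is a row_vector of N logits, which the
  // vectorized bernoulli_logit accepts directly, with no transposes.
  y_data ~ bernoulli_logit(beta_final * fitted_top);
}
```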
- Bob