> OK, I probably don't understand the concept of empirical bayes priors well.
>
> What I want is to have more informative priors that give more weight to the data in certain circumstance for both the mean and variance. Is there any "default" way to do this?
Not in the way you’re describing.
Empirical Bayes uses the data twice: once to define the prior and once in the likelihood. In complicated models this double use leads to overfitting and poor estimation.
But Empirical Bayes is just an approximation to the full hierarchical model Andrew was referring to. You can
think of Empirical Bayes as building the full hierarchical model and then making point estimates of the
hyperparameters. Those point estimates lead to underestimated uncertainty… just as we would expect
from overfitting.
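To make that concrete, here is a small sketch of the Empirical Bayes recipe in a toy normal-normal setup (illustrative numbers only, not your model): each group estimate `y[i]` has a known standard error `s[i]`, the group means get a `N(mu, tau^2)` prior, and instead of putting hyperpriors on `mu` and `tau` we point-estimate them by maximum marginal likelihood.

```python
import numpy as np
from scipy.optimize import minimize

# Toy group estimates and their known standard errors (illustrative
# numbers, in the spirit of the classic eight-schools example).
y = np.array([28., 8., -3., 7., -1., 1., 18., 12.])
s = np.array([15., 10., 16., 11., 9., 11., 10., 9.])

# Model: y_i ~ N(theta_i, s_i^2), theta_i ~ N(mu, tau^2).
# Integrating out theta_i gives the marginal y_i ~ N(mu, s_i^2 + tau^2).
def neg_marginal_loglik(params):
    mu, log_tau = params  # optimize log(tau) so tau stays positive
    var = s**2 + np.exp(log_tau)**2
    return 0.5 * np.sum(np.log(2 * np.pi * var) + (y - mu)**2 / var)

# Empirical Bayes step: point-estimate the hyperparameters by maximum
# marginal likelihood instead of putting hyperpriors on them.
res = minimize(neg_marginal_loglik, x0=np.array([0.0, 0.0]))
mu_hat, tau_hat = res.x[0], np.exp(res.x[1])

# Plug-in posterior for each theta_i, treating (mu_hat, tau_hat) as
# exactly known. Ignoring their estimation uncertainty is precisely
# where the understated uncertainty comes from.
shrink = tau_hat**2 / (tau_hat**2 + s**2)
post_mean = shrink * y + (1 - shrink) * mu_hat
post_sd = np.sqrt(shrink * s**2)
```

The full hierarchical model would instead put priors on `mu` and `tau` and average over them, so the posterior intervals for each `theta_i` come out wider than the plug-in `post_sd` above.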
To summarize, Empirical Bayes is an approximation to a full hierarchical model. Any kind of “automated” or
“default” approach would require building the full model, with all of its hyperpriors, and then making point
estimates of the hyperparameters with something like maximum marginal likelihood, which has not yet
been implemented in Stan.