Yes, very useful. It's a known feature, and it boils down to numerical
instabilities for almost-singular matrices. This is controlled when
conditioning on data, but gets worse when adding a (large) prediction
stack that is not just 'in-fill'. (Pure) math says this is ok, but
reality is something else.
The 'experimental' mode is an internal rewrite (paper in progress) that
represents the linear predictor as deterministic given the 'model',
which thereby bypasses these numerical issues and will also provide
better scalability and run-time (not uniformly, but in most cases).
This new internal representation comes with some restrictions: for
example, we cannot get the skewness correction for the marginal
conditioned on each hyperparameter, but you do get it when doing the
integration. On the other hand, the newly developed VB correction for
the mean works just great(!!!) within this framework, and provides the
road-map for how to proceed with developments in the future. There
is/will soon be a new report on arXiv.
Not all options are available for the 'experimental' mode, as only the
'useful' features were brought forward during the rewrite.
Personally, I use the 'experimental' mode all the time: the numerical
stability is better, run-time and memory use are (much) better almost
uniformly, accuracy for the mean is better, etc. It is still in
development, and will become the default option at some point. You can
enable it as the default by doing
library(INLA)
inla.setOption(inla.mode="experimental")
Best
Håvard
--
Håvard Rue
he...@r-inla.org