[ADMB Users] some light reading


otter

Mar 1, 2015, 2:08 PM
to us...@admb-project.org
This is apropos of a question Maunder asked a while back about whether
random effects would ever be included in MULTIFAN-CL.

Every once in a while I read a fish model paper to convince myself that
nothing ever gets any better.

This is a link to a paper adding random effects to some structures in
the fish model Stock Synthesis.


http://icesjms.oxfordjournals.org/content/72/1/178.short

The authors eschew the use of ADMB's random effects package in favour of
an ad hoc approach employing ADMB together with its Hessian
calculations, suitably modified. They justify this approach with the
statement:

"Similarly, implementing a model in ADMB-RE requires a large overhead
of time and expertise due to the practical necessity of finding a
“separable” formulation of the population model (Bolker et al., 2013)
and still may take a considerable amount of time during optimization."

This is more or less completely false. First, it is not necessary to
have a separable formulation of the model. It is quite feasible to have
a model with thousands of random effects without invoking separability.
For a fish model where a random effect (say a catchability deviation)
affects the population for the rest of the time after it occurs,
separability is of no use anyway.
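To see why separability buys nothing in that situation, consider a toy
sketch (purely hypothetical, not any real assessment model) in which a
deviation u_t feeds forward into every later prediction through a
cumulative sum. The joint-likelihood Hessian in u is then completely
dense rather than banded, so there is no separable structure to exploit:

```python
import numpy as np

T, sd_obs, sd_u = 6, 0.1, 0.2
# pred = base + C @ u with C lower-triangular ones: u_t affects all times s >= t
C = np.tril(np.ones((T, T)))
# Gauss-Newton Hessian of 0.5*||(obs - pred)/sd_obs||^2 + 0.5*||u/sd_u||^2 in u
H = C.T @ C / sd_obs**2 + np.eye(T) / sd_u**2
print(np.count_nonzero(H), "of", T * T)  # prints: 36 of 36
```

Contrast this with a state-space model where each observation depends
only on the current state: there the joint Hessian is banded, which is
exactly the structure separability is designed to exploit.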

The statement that it "may take a considerable amount of time" is true
for general RE models, of course, but the authors are considering an
application where they integrate over all but a small number of
variance parameters (say up to 3 or 5). In ADMB-RE this is equivalent
to declaring almost all the parameters to be of type random effect.
This has been discussed before (although the concept seemed to baffle
Bates) in the context of generalizing restricted maximum likelihood
estimation.

The point is that ADMB-RE does the parameter estimation by first doing
an inner optimization over the random effects. This is essentially
equivalent to the authors' use of ADMB. The difference is that once the
inner optimization is done, ADMB-RE calculates the Hessian in a much
more efficient manner than ADMB. It also uses this Hessian to "polish"
the estimates so that they are much closer to the minimizing values, in
the sense that the maximum gradient magnitude is typically reduced from
something like 1.e-4 to 1.e-12. It then also computes the derivatives
of the function with respect to the variance parameters in a very
efficient manner. This enables one to use derivative-based minimization
rather than the incredibly inefficient Nelder-Mead (which may work for
a really well behaved problem with 3 parameters but will never extend
well to more parameters or to more badly conditioned problems).
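The workflow described above (inner optimization over the random
effects, the inner Hessian entering a Laplace correction, and
derivative-based outer minimization over the variance parameters only)
can be sketched on a toy one-way Gaussian model. The model, names, and
data here are illustrative, not ADMB-RE's internals; because the inner
problem is quadratic, a single Newton step lands exactly on the inner
optimum, which is what the "polishing" step achieves more generally:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
true_u = rng.normal(0.0, 0.7, size=40)                    # group random effects
y = true_u[:, None] + rng.normal(0.0, 0.3, size=(40, 8))  # 8 observations per group

def joint_nll(u, y, log_sd_u, log_sd_e):
    """Negative joint log-likelihood of data y and random effects u (constants dropped)."""
    return (0.5 * np.sum((y - u[:, None]) ** 2) * np.exp(-2 * log_sd_e)
            + y.size * log_sd_e
            + 0.5 * np.sum(u ** 2) * np.exp(-2 * log_sd_u)
            + u.size * log_sd_u)

def laplace_nll(theta, y):
    """Marginal nll via Laplace: inner optimum over u plus 0.5*log det of the inner Hessian."""
    log_sd_u, log_sd_e = theta
    a, b = np.exp(-2 * log_sd_e), np.exp(-2 * log_sd_u)  # observation and RE precisions
    n, m = y.shape
    prec = m * a + b                  # inner Hessian is diagonal with this entry
    # Quadratic inner problem: one Newton step gives the exact inner optimum
    # (the "polishing" step that drives the inner gradient toward zero):
    u_hat = m * y.mean(axis=1) * a / prec
    h = joint_nll(u_hat, y, log_sd_u, log_sd_e)
    return h + 0.5 * n * np.log(prec) - 0.5 * n * np.log(2.0 * np.pi)

# Derivative-based outer minimization over the two variance parameters only
res = minimize(laplace_nll, x0=[0.0, 0.0], args=(y,), method="BFGS")
sd_u_hat, sd_e_hat = np.exp(res.x)
print(sd_u_hat, sd_e_hat)  # estimates should land near the true values 0.7 and 0.3
```

For Gaussian models like this one the Laplace approximation is exact;
the efficiency questions in the paragraph above are about how the inner
Hessian and the outer derivatives are computed, not about this recipe.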

So ADMB-RE is already almost perfectly equipped to handle this model.
A small extension is needed to
permit it to have bounded random effects compatible with the bounded
parameters in ADMB.
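For such a bounded-random-effects extension, the usual trick is to map
an unconstrained value into the bounded interval with a smooth monotone
transform so the optimizer never sees the bounds. A scaled logistic map
is one standard choice; this is illustrative only, and ADMB's own
bounding transform may differ in detail:

```python
import numpy as np

def boundp(x, lo, hi):
    """Map an unbounded x into the open interval (lo, hi) via a logistic curve."""
    return lo + (hi - lo) / (1.0 + np.exp(-x))

def boundp_inv(p, lo, hi):
    """Inverse: recover the unbounded value from p in (lo, hi)."""
    return np.log((p - lo) / (hi - p))

# A random-effect deviation constrained to (-1, 1):
p = boundp(1.3, -1.0, 1.0)
print(p, boundp_inv(p, -1.0, 1.0))  # p lies strictly inside (-1, 1); the inverse recovers 1.3
```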


_______________________________________________
Users mailing list
Us...@admb-project.org
http://lists.admb-project.org/mailman/listinfo/users
