Hi,
Anderson acceleration (or mixing) is a technique for accelerating fixed-point
iteration methods for the solution of nonlinear equations. While the standard
fixed-point method uses only the last approximation to compute the new one,
mixing methods use the last several approximations. This matters for us because
some nonlinear equations in Hermes (such as the Richards equation) are
solved using fixed-point iteration, so Anderson mixing will become
part of the modules that use the fixed-point method.
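For context, here is a minimal sketch of the plain (unaccelerated) fixed-point iteration that Anderson mixing speeds up, applied to the test function mentioned below. The function name and tolerance are my own choices for illustration, not Hermes code:

```python
def fixed_point(g, x0, tol=1e-10, max_iter=1000):
    """Plain fixed-point iteration x_{k+1} = g(x_k).

    Stops when successive iterates differ by less than tol.
    Returns the approximate fixed point and the iteration count.
    """
    x = x0
    for k in range(1, max_iter + 1):
        x_new = g(x)
        if abs(x_new - x) < tol:
            return x_new, k
        x = x_new
    return x, max_iter

# g(x) = 1/(1 + x^2); its fixed point solves x^3 + x - 1 = 0.
root, its = fixed_point(lambda x: 1.0 / (1.0 + x * x), 5.0)
```

Each new iterate depends only on the single previous one; mixing methods instead combine the last several iterates, which is where the acceleration comes from.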
I implemented Anderson acceleration in the Online Lab
and tested it on a few simple scalar equations. For example, for
the function g(x) = 1/(1 + x*x) with initial guess 5.0, it reduced the number
of iterations from 42 to 22. That was with beta = 1.0 and unrestricted
"memory". With beta = 0.8, the number dropped to 11. I also looked at
the influence of the "memory" on the number of iterations: the method
usually performed best when only the last few (2-3) iterates were kept.
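For readers who want to try this, here is a rough sketch of a damped, limited-memory Anderson iteration. This is my own illustration (the function name, the difference-based least-squares formulation, and all parameter names are mine), not the actual Online Lab implementation, and the iteration counts it produces need not match the numbers above:

```python
import numpy as np

def anderson(g, x0, beta=1.0, m=3, tol=1e-10, max_iter=100):
    """Anderson-accelerated fixed-point iteration for x = g(x).

    Keeps the last m+1 iterates ("memory"); beta is the damping/mixing
    parameter. Works on scalars or 1-D numpy arrays.
    """
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    xs, fs = [], []                       # histories of iterates and residuals
    for k in range(1, max_iter + 1):
        gx = np.atleast_1d(np.asarray(g(x), dtype=float))
        f = gx - x                        # fixed-point residual g(x) - x
        xs.append(x)
        fs.append(f)
        if np.linalg.norm(f) < tol:
            return x, k
        if len(xs) > m + 1:               # enforce limited memory
            xs.pop(0)
            fs.pop(0)
        n = len(xs)
        if n == 1:
            # No history yet: plain damped fixed-point step.
            x = (1.0 - beta) * x + beta * gx
        else:
            # Mixing coefficients alpha_i (summing to 1) that minimize
            # |sum_i alpha_i f_i|, computed via residual differences.
            dF = np.column_stack([fs[i + 1] - fs[i] for i in range(n - 1)])
            gamma, *_ = np.linalg.lstsq(dF, fs[-1], rcond=None)
            alpha = np.zeros(n)
            alpha[0] = gamma[0]
            for i in range(1, n - 1):
                alpha[i] = gamma[i] - gamma[i - 1]
            alpha[-1] = 1.0 - gamma[-1]
            # Mix iterates and their images, damped by beta.
            x = sum(a * ((1.0 - beta) * xs[i] + beta * (xs[i] + fs[i]))
                    for i, a in enumerate(alpha))
    return x, max_iter

# Same test problem as above: g(x) = 1/(1 + x^2), initial guess 5.0.
root, its = anderson(lambda x: 1.0 / (1.0 + x * x), 5.0, beta=0.8, m=3)
```

In the scalar case with memory 2 this reduces to a secant-like update, which is consistent with the observation that a short memory (2-3 iterates) often works best.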
Pavel
--
Pavel Solin
University of Nevada, Reno
Home page:
http://hpfem.org/~pavel
Hermes:
http://hpfem.org/
FEMhub:
http://femhub.org/