I have just finished coding up a parallel GA and I am validating the
design using Rastrigin's function:
f(x) = n*A + sum(i=1,n) {x_i^2 - A*cos(2*pi*x_i)}
Currently I am using A = 10, and dimension n = 20, with a single
population.
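For reference, the function above can be written directly in NumPy (a minimal sketch; the names are mine, and `A = 10` matches the setting above):

```python
import numpy as np

def rastrigin(x, A=10.0):
    """Rastrigin's function: f(x) = A*n + sum_i (x_i^2 - A*cos(2*pi*x_i)).
    The global minimum is f(0, ..., 0) = 0."""
    x = np.asarray(x, dtype=float)
    return A * x.size + np.sum(x**2 - A * np.cos(2.0 * np.pi * x))
```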
I would like to know what the best results are for this function
(obtained with a serial or parallel GA) so I have something to compare
against. Any references or links to published results would be
gratefully received.
Thanks,
JV
Sent via Deja.com http://www.deja.com/
Before you buy.
jon_v...@my-deja.com wrote:
> [original post snipped]

Cheers,
Eric
--
---------------------------------
Eric "The Wingnut" Whitney
<er...@aero.usyd.edu.au>
Windows 95 (win-DOH-z), n. A thirty-two bit extension and graphical
shell to a sixteen bit patch to an eight bit operating system
originally coded for a four bit microprocessor which was used in a PC
built by a two bit company that couldn't stand one bit of competition.
Rainer

jon_v...@my-deja.com wrote:
> [original post snipped]
Hi Jon,
You would expect that your simple question would have a simple answer.
Unfortunately, the fact that Rastrigin's function is separable complicates
matters. A separable, d-dimensional function can be decomposed into d
one-dimensional functions, which can be solved individually and then
recombined to provide the solution to the full problem. This
divide-and-conquer strategy is simulated stochastically when you use a
mutation or crossover technique that changes just one or two parameters at
a time to create new solutions. This one-parameter-at-a-time technique is
very fast, but it requires that a function be separable to be effective.
Most interesting functions aren't separable and for them the
change-one-parameter strategy is doomed to fail miserably.
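To make that concrete, here is the one-parameter-at-a-time strategy as a bare-bones (1+1)-style hill climber (my own toy sketch, not a GA; the step size and evaluation budget are arbitrary choices):

```python
import numpy as np

def rastrigin(x, A=10.0):
    return A * x.size + np.sum(x**2 - A * np.cos(2.0 * np.pi * x))

def one_parameter_search(n=10, steps=20000, sigma=0.5, seed=1):
    """Hill climber that perturbs one randomly chosen coordinate per step.
    It works here only because Rastrigin's function is separable: each
    coordinate can be improved independently of the others."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.12, 5.12, n)      # standard Rastrigin domain
    fx = rastrigin(x)
    for _ in range(steps):
        i = rng.integers(n)              # change just one parameter
        trial = x.copy()
        trial[i] += rng.normal(0.0, sigma)
        f_trial = rastrigin(trial)
        if f_trial < fx:                 # greedy: keep only improvements
            x, fx = trial, f_trial
    return x, fx
```

On a rotated (non-separable) version of the function, the same loop typically makes far slower progress, which is the failure mode described above.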
Fortunately, Rastrigin's function can be transformed into a non-separable
function with totally dependent parameters by rotating each vector prior
to evaluating it. This requires generating a random rotation matrix and
involves some extra programming, but in the end it's a good way to
transform away the separable property.
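One common construction (an assumption on my part; published rotated benchmarks may generate the matrix differently) is to QR-factorize a matrix of Gaussian draws, which yields a random orthogonal matrix:

```python
import numpy as np

def random_rotation(n, seed=0):
    """Random rotation matrix via QR decomposition of a Gaussian matrix."""
    rng = np.random.default_rng(seed)
    Q, R = np.linalg.qr(rng.normal(size=(n, n)))
    Q = Q * np.sign(np.diag(R))          # flip column signs where R_ii < 0
    if np.linalg.det(Q) < 0:             # ensure a proper rotation (det = +1)
        Q[:, 0] = -Q[:, 0]
    return Q

def rastrigin(x, A=10.0):
    return A * x.size + np.sum(x**2 - A * np.cos(2.0 * np.pi * x))

def rotated_rastrigin(x, Q, A=10.0):
    """Rotate each vector before evaluating it: the rotation mixes every
    coordinate into every term, so the function is no longer separable."""
    return rastrigin(Q @ x, A)
```

Note that rotation leaves the optimum at the origin, since Q @ 0 = 0.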
A second special property of the Rastrigin function that can be exploited
but not transformed away is that the optimum typically lies near the
average position of the population. Thus, just by searching around the
population mean you can get very fast run times. Far initialization can
make this problem more difficult, although hardly anyone ever seems
interested in making a function harder! For Rastrigin's function, the
solution is at (0,0,0,0,...), so far initialization could be used to
confine the initial population to the domain bounded by (90, 100). At
least this way, the solution is not initially near the population mean.
Even so, any method that references the population mean (e.g., simplex
derivatives, intermediate crossover, ...) can take advantage of this
"special" property and should give a good result.
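A far-initialization sketch under the bounds suggested above (uniform sampling in roughly [90, 100] on every coordinate, so the population mean starts far from the optimum at the origin):

```python
import numpy as np

def far_initialize(pop_size, n, low=90.0, high=100.0, seed=0):
    """Sample the initial population uniformly inside [low, high]^n,
    far from Rastrigin's optimum at the origin, so that mean-centred
    operators get no free head start."""
    rng = np.random.default_rng(seed)
    return rng.uniform(low, high, size=(pop_size, n))
```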
The answer to your question is that the best run time for Rastrigin's
function depends on how many of its special properties you want to
exploit. Even so, here are some rough estimates of what to shoot for in
the 10-dimensional case:
< 10,000 evaluations, if not randomly rotated and no far initialization
< 100,000 if rotated (complete parameter dependence) + mean searching
< 1,000,000 if rotated and without mean searching (or other averaging)
These are only order of magnitude estimates, but they should give you some
sense of what seem to me to be "good" run times.
Best regards,
Ken