
Genetic Algorithms


Ioannis Androulakis

Jun 29, 1989, 2:10:09 PM

*********
The following article concerns Genetic Algorithms. I apologize
for bothering the list with this subject, but I believe it is
the only list available to me.
*********
In Genetic Algorithms theory the probabilities of application
of the genetic operators are considered to be independent of
each other. What do you think of a GA where the probabilities
would sum to 1? After a string has been selected, we would
apply to that string one operator chosen randomly according
to the probability that has been attributed to it. In other words,
each string has associated with it a set of operators and the
corresponding probabilities of application of these operators.
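A minimal sketch of the proposed scheme in Python (the operators, the
probabilities, and the mate string are illustrative assumptions, not part
of any established GA formulation):

```python
import random

def mutate(s):
    # Flip one randomly chosen bit of the string.
    i = random.randrange(len(s))
    return s[:i] + ('1' if s[i] == '0' else '0') + s[i+1:]

def crossover(s, mate="10101010"):
    # Single-point crossover with an assumed, fixed mate string.
    p = random.randrange(1, len(s))
    return s[:p] + mate[p:]

def apply_operator(string, operators, probs):
    # Select exactly one operator for this string, weighted by the
    # probability attributed to it (the probs are assumed to sum to 1).
    op = random.choices(operators, weights=probs, k=1)[0]
    return op(string)

child = apply_operator("11110000", [mutate, crossover], [0.7, 0.3])
```

Here each string could carry its own `operators`/`probs` pair, so different
members of the population would be transformed under different distributions.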

Any suggestions are welcome.
Thank you,
yannis
andr...@helium.ecn.purdue.edu

Philip Resnik

Jun 29, 1989, 4:58:32 PM
In article <10...@cb.ecn.purdue.edu> andr...@cb.ecn.purdue.edu
(Ioannis Androulakis) writes:
> In Genetic Algorithms theory the probabilities of application
> of the genetic operators are considered to be independent of
> each other. What do you think of a GA where the probabilities
> would sum to 1? After a string has been selected, we would
> apply to that string one operator chosen randomly according
> to the probability that has been attributed to it. In other words,
> each string has associated with it a set of operators and the
> corresponding probabilities of application of these operators.

It's pretty widely accepted that no single set of genetic algorithm
parameters (e.g. population size, mutation rate, crossover rate)
is optimal for all problems, and it sounds like you're trying
to work out an approach to that issue. One common previous
approach has been to vary the mutation rate from a higher value
at the beginning of a run (thus initially emphasizing exploration of
the search space) linearly down to a lower value at the end of
the run (where presumably crossover will be more successful at
producing good solutions).
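The linear schedule described above can be sketched as follows (the start
and end rates here are illustrative assumptions, not values taken from any
particular paper):

```python
def mutation_rate(generation, total_generations, start=0.10, end=0.001):
    # Linearly anneal the per-bit mutation rate from `start` (favoring
    # exploration early in the run) down to `end` (letting crossover
    # dominate late in the run).
    frac = generation / float(total_generations)
    return start + (end - start) * frac

mutation_rate(0, 100)    # start of run: high rate (0.10)
mutation_rate(100, 100)  # end of run: low rate (~0.001)
```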

Another recent, related piece of work to look at is the "adaptive
operator probabilities" work of Lawrence Davis: he takes a set
of genetic operators (e.g. mutation, single-point crossover,
two-point crossover, uniform crossover, hillclimbing operators)
and gives them each an initial probability at the beginning of
the run, and then modifies the operator's probability on the
basis of the performance of the children produced by this
operator (and their descendants). He's had quite good results
with this approach, and it also provides an interesting way of
testing out new operators to see if they improve performance or
hurt it. Note, however, that he maintains a single probability
for each operator across the entire population, as opposed to
having a different probability for each member.

Lawrence Davis, "Adapting Operator Probabilities in Genetic Algorithms,"
Proceedings of the 3rd International Conference on Genetic Algorithms,
Morgan Kaufmann, 1989.
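A loose sketch of the idea of population-wide adaptive operator
probabilities (this is not Davis's exact credit-assignment bookkeeping;
the learning rate, probability floor, and update rule are assumptions
for illustration):

```python
def adapt_probabilities(probs, credits, learning_rate=0.1, floor=0.01):
    # probs:   one probability per operator, shared by the whole population
    # credits: fitness improvement recently credited to each operator's
    #          children (and their descendants)
    total = sum(credits) or 1.0
    targets = [c / total for c in credits]
    # Nudge each operator's probability toward its share of the credit.
    new = [(1 - learning_rate) * p + learning_rate * t
           for p, t in zip(probs, targets)]
    # Keep every operator above a small floor so it can still be sampled.
    new = [max(p, floor) for p in new]
    s = sum(new)
    return [p / s for p in new]

# Four operators start out equally likely; the first earned most of the
# recent credit, so its probability rises.
probs = adapt_probabilities([0.25, 0.25, 0.25, 0.25], [4.0, 1.0, 0.0, 1.0])
```

Testing a new operator then amounts to giving it a small initial probability
and watching whether the adaptation drives its share up or down.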

Philip Resnik
pre...@bbn.com
