GA with big genomes?


Howard Landman

unread,
Jul 21, 1993, 4:17:37 PM7/21/93
to
Has anyone had any experience using GA with large genomes, say
in the range of 15K to 20K bits? What population size is
best, and how many generations are usually required for
success or reasonable optimality? Most of the examples I see
are much smaller than this.

Or is this hopeless and I'm falling prey to Patterson's Precept?

Howard A. Landman
lan...@hal.com

PATTERSON'S PRECEPT:
Inexperience coupled with ambition leads to very large designs.

LANDMAN'S LAW:
In any sufficiently large design, if there is a type of error
for which you have no automatic way of checking, then the final
design will contain at least one error of that type.

LANDMAN'S LEMMA:
All designs are now sufficiently large. See Patterson's Precept.

David A. Honig

unread,
Jul 21, 1993, 11:12:18 PM7/21/93
to
lan...@hal.COM (Howard Landman) writes:
>Has anyone had any experience using GA with large genomes, say
>in the range of 15K to 20K bits? What population size is
>best, and how many generations are usually required for
>success or reasonable optimality? Most of the examples I see
>are much smaller than this.

I did a few days' experimentation with a system that "bred pretty chips"
--- it worked on the placement problem in IC design. The genome consisted
of a placement: the assignment of each transistor (well, gate) to a
particular row and column. You could use 16b words for the row and column.

I found that a population of about 200 yielded the same fitness as a
population twice that size, but better than one half that size, for cases in
the 10's of gates. I would run for a few hundred generations, making 1 or 2
mutations per generation per individual. My reproduction function was
extremely crude: sort the current population by evaluation, pick the best N,
and duplicate them equally with mutations (carrying each parent into the next
generation too, i.e., not mutating it). Typically I only saved the best
2-3%. Pretty harsh by industrial human standards, but not by, e.g., botanical
ones. (Here's a research question: do you keep a constant number of
breeding parents, or do you vary it over time? Punctuated or constant
evolution? When do you toss comets at the buggers? It's not easy playing
god. :-)
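The scheme described above — sort by fitness, keep the top few percent, refill the population with mutated copies while the parents survive unmutated — is essentially truncation selection with elitism. A minimal sketch in Python; the bit-string genome, parameter values, and the toy one-max fitness are illustrative stand-ins, not the original system:

```python
import random

def truncation_ga(fitness, genome_bits=64, pop_size=200,
                  generations=200, keep_frac=0.03, mutations=2):
    """Truncation selection with elitism: keep the best few percent,
    refill the population with mutated copies of them (parents carry
    over into the next generation unmutated)."""
    pop = [[random.randint(0, 1) for _ in range(genome_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:max(1, int(keep_frac * pop_size))]
        next_pop = [p[:] for p in parents]        # elitism: parents survive
        while len(next_pop) < pop_size:
            child = random.choice(parents)[:]     # "budding", no crossover
            for _ in range(mutations):            # 1-2 bit flips per child
                i = random.randrange(genome_bits)
                child[i] ^= 1
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# toy fitness: count of 1 bits ("one-max"), just to exercise the loop
best = truncation_ga(sum)
```

With elitism the best fitness is monotonically non-decreasing, which is what makes such a harsh selection scheme workable at all.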

I found similar results with larger designs, those with 100s of gates.
That's still somewhat small wrt Howard's query. The larger designs
did better with more generations, though not an order of magnitude more.

But I found that the most important thing is getting your cost functions
right, and to do that *fast* (i.e., incrementally), you might use more
memory-intensive representations of individuals. That also takes an
understanding of the problem domain, though not *nearly* the cleverness it
takes to design a good placement algorithm from the beginning. Also, since
the placement fed into a one-pass constructive routing stage, you
end up designing cost functions that take into account the foibles of the
rest of the system.

I.e., if your router doesn't like some kind of placement, weight those
placements poorly.
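The incremental-evaluation idea can be sketched as follows: cache a per-net cost and, when one gate moves, recompute only the nets that touch it rather than the whole design. Half-perimeter wirelength stands in for the real cost function here, and the gates, nets, and coordinates are invented for illustration:

```python
def net_cost(net, coords):
    """Half-perimeter wirelength of one net (a stand-in cost)."""
    xs = [coords[g][0] for g in net]
    ys = [coords[g][1] for g in net]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

class Placement:
    def __init__(self, coords, nets):
        self.coords = dict(coords)      # gate -> (row, col)
        self.nets = nets                # list of gate tuples
        self.nets_of = {}               # gate -> indices of nets touching it
        for i, net in enumerate(nets):
            for g in net:
                self.nets_of.setdefault(g, []).append(i)
        self.net_costs = [net_cost(n, self.coords) for n in nets]
        self.total = sum(self.net_costs)

    def move(self, gate, new_pos):
        """Move one gate; cost update is O(nets touching the gate),
        not O(all nets) -- the memory spent on cached per-net costs
        buys the incremental speed."""
        self.coords[gate] = new_pos
        for i in self.nets_of.get(gate, []):
            self.total -= self.net_costs[i]
            self.net_costs[i] = net_cost(self.nets[i], self.coords)
            self.total += self.net_costs[i]
        return self.total

p = Placement({'a': (0, 0), 'b': (0, 3), 'c': (2, 1)},
              [('a', 'b'), ('b', 'c')])
```

Router-specific penalties would be folded into `net_cost` the same way: anything the downstream stage handles badly just gets a worse score.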

Since I was interested in the final product (i.e., chip acreage), I was
designing cost functions indirectly, in a way. Sort of like this:
the gene(s) that code for, say, tooth enamel don't actually care about
your ability to chew so much as your ability to have kids; so you can
end up with a population of individuals with teeth that aren't that great
for chewing but attract highly fertile mates. (Note that in my optimizer
the individuals 'budded'; they didn't have crossover or diploid sex.)


>Or is this hopeless and I'm falling prey to Patterson's Precept?
>
> Howard A. Landman
> lan...@hal.com
>
>PATTERSON'S PRECEPT:
> Inexperience coupled with ambition leads to very large designs.

This should be told to grad students upon starting out...

>
>LANDMAN'S LAW:
> In any sufficiently large design, if there is a type of error
> for which you have no automatic way of checking, then the final
> design will contain at least one error of that type.
>
>LANDMAN'S LEMMA:
> All designs are now sufficiently large. See Patterson's Precept.


--
David Honig

You know. Bits. Pixels. Nucleotides. Dollars. Signal-to-noise. Spikes.
You know, information.

R. D. Cook [Rick]

unread,
Jul 22, 1993, 11:08:55 AM7/22/93
to
In article <22k891...@bang.hal.COM>, lan...@hal.COM (Howard Landman) writes:
|> Has anyone had any experience using GA with large genomes, say
|> in the range of 15K to 20K bits? What population size is
|> best, and how many generations are usually required for
|> success or reasonable optimality? Most of the examples I see
|> are much smaller than this.
|>
|> Or is this hopeless and I'm falling prey to Patterson's Precept?

Howard,
I have not done this myself - at least not yet. I will say that your
population size should be at least double the 20K genome size -
say 50K - and you may have to let this puppy run for several
(100's or 1000's of) generations.

These are rules of thumb - empirical if you will. Good luck.
Rick
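Back-of-the-envelope, that rule of thumb implies a serious evaluation budget. A quick sketch of the arithmetic; the 1 ms per fitness evaluation is purely an assumed figure for illustration:

```python
pop_size = 50_000        # ~2.5x the 20K-bit genome, per the rule of thumb
generations = 1_000      # "100's or 1000's" -- take the middle
evaluations = pop_size * generations     # 50 million fitness calls

secs_per_eval = 1e-3     # assumed 1 ms per evaluation (illustrative)
hours = evaluations * secs_per_eval / 3600
# roughly 14 hours of pure fitness evaluation at this assumed speed,
# before counting sorting, mutation, or bookkeeping
```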

Charles Ofria

unread,
Jul 22, 1993, 8:20:26 PM7/22/93
to
In article <22k891...@bang.hal.COM> lan...@hal.COM (Howard Landman) writes:
>Has anyone had any experience using GA with large genomes, say
>in the range of 15K to 20K bits? What population size is
>best, and how many generations are usually required for
>success or reasonable optimality? Most of the examples I see
>are much smaller than this.

There are a lot more variables that you need to take into account to
get any kind of decent estimate of the number of generations required,
mutation rates and exactly what you want the program to do being chief
among these. In any event, the size of your creatures is on the large
side, and I'd have to recommend that if you want to experiment you
start with something smaller.

To elaborate on the point of what the program does: if there are
multiple paths (or at least many minor deviants) to get to an optimized
program, then it stands a much better chance than a totally fragile
program that will break from any change. Definitely play around with it
- I've actually been working on a project dealing with them for the summer
and it's certainly been a blast.
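The robust-versus-fragile distinction can be made concrete by counting how many single-bit mutants of an optimum remain viable under two toy fitness functions (both invented here for illustration):

```python
def smooth_fitness(g):      # partial credit: many paths upward
    return sum(g)

def fragile_fitness(g):     # all-or-nothing: any flip destroys it
    return 1 if all(g) else 0

optimum = [1] * 16

def surviving_neighbors(fitness):
    """How many single-bit mutants of the optimum keep fitness > 0?"""
    count = 0
    for i in range(len(optimum)):
        mutant = optimum[:]
        mutant[i] ^= 1      # flip one bit
        if fitness(mutant) > 0:
            count += 1
    return count

# smooth landscape: all 16 one-bit mutants stay viable, so selection
# has gradients to follow; fragile landscape: none do, and the GA can
# only find the optimum by blind luck
```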

>Or is this hopeless and I'm falling prey to Patterson's Precept?
>
> Howard A. Landman
> lan...@hal.com
>
>PATTERSON'S PRECEPT:
> Inexperience coupled with ambition leads to very large designs.
>
>LANDMAN'S LAW:
> In any sufficiently large design, if there is a type of error
> for which you have no automatic way of checking, then the final
> design will contain at least one error of that type.
>
>LANDMAN'S LEMMA:
> All designs are now sufficiently large. See Patterson's Precept.

Gee... this explains the downfall of too many of my ideas....

--- Charles

Bryn Williams

unread,
Jul 26, 1993, 4:42:09 PM7/26/93
to
In article 22k891...@bang.hal.COM, lan...@hal.COM (Howard Landman) writes:
>Has anyone had any experience using GA with large genomes, say
>in the range of 15K to 20K bits? What population size is
>best, and how many generations are usually required for
>success or reasonable optimality? Most of the examples I see
>are much smaller than this.


Check out Robert Collins' AntFarm work - he uses a 25000 bit chromosome. But
then he does have a Connection Machine to run it on. :-)

Collins, R. J. & Jefferson, D. R. AntFarm: Towards Simulated Evolution. In
Artificial Life II (eds. Farmer, J. D., Langton, C., Rasmussen, S. & Taylor, C.),
Addison-Wesley, 1991.

That's the only work that springs to mind that uses chromosomes of that order
of magnitude. Hope it's of some help.

Bryn

---

+-----------------------------------------------------------------------------+
| Bryn Williams |
| Department Computer Science and Applied Maths, <will...@cs.aston.ac.uk> |
| Aston University, <willi...@kirk.aston.ac.uk> |
| Aston Triangle, |
| Birmingham B4 7ET Tel. +44 (0)21 359 3611 x4671 |
| UK Fax. +44 (0)21 333 6215 |
+-----------------------------------------------------------------------------+
