Johnson-Laird, P.N. (2002). How Jazz Musicians Improvise. Music
Perception, 19(3), 415-442.
It has nothing to do with games, but, in the context of jazz, lays out
three types of "creative" algorithms:
-- neo-Darwinian
-- neo-Lamarckian
-- hybrid of the above
Neo-Darwinian describes what you are talking about, Julian:
generate-and-test. The allusion to genetic algorithms is apparent.
Neo-Lamarckian algorithms use the evaluation metrics as a model in the
generation of successors. The hybrid approach is a compromise between
the other two. Pros and cons of each are discussed.
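Purely as an illustration of the distinction (the toy musical criterion
and the function names below are invented for this sketch, not taken
from the paper), in Python it might look roughly like this:

import random

def critic(phrase):
    # Toy evaluation: prefer phrases that resolve to the tonic (pitch class 0).
    return 1.0 if phrase[-1] == 0 else 0.0

def neo_darwinian_step(n_candidates=50):
    # Blind generation followed by testing: the critic does all the work.
    candidates = [[random.randrange(12) for _ in range(8)]
                  for _ in range(n_candidates)]
    return max(candidates, key=critic)

def neo_lamarckian_step():
    # The evaluative knowledge is folded into the generator itself, so every
    # candidate it emits already satisfies the constraint the critic checks.
    phrase = [random.randrange(12) for _ in range(7)]
    phrase.append(0)  # always resolve to the tonic
    return phrase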
Mark Riedl
---
Assistant Professor
College of Computing
Georgia Institute of Technology
---
ri...@cc.gatech.edu
http://www.cc.gatech.edu/~riedl/
Roguelikes, which were already brought up, can be seen as hand-coded
versions of that; Harold Cohen's AARON, in a different domain, is
another hand-coded example. The decisions could also be learned from
examples of a designer's output, which might be particularly practical
in the game domain: it seems like a lot of the motivation for PCG in
industry is a hope that we can do things like, "here's 200 examples of
the sort of thing we want, give us a bunch more in the same style
please". An example of that in jazz, which reminded me of this approach,
is this system that learns a generative statistical model of how Charlie
Parker does solo-trading:
Thom, B. (2000). Unsupervised Learning and Interactive Jazz/Blues
Improvisation. Proceedings of AAAI 2000.
http://www.cs.cmu.edu/~bthom/PAPERS/aaai2k.pdf
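As a very rough sketch of the "learn from examples, then generate more
in the same style" idea (this is just a first-order Markov chain over
symbols, not the model Thom actually uses):

import random
from collections import defaultdict

def train_markov(examples):
    # Count first-order transitions over the example sequences.
    counts = defaultdict(lambda: defaultdict(int))
    for seq in examples:
        for a, b in zip(seq, seq[1:]):
            counts[a][b] += 1
    return counts

def generate(counts, start, length):
    # Sample a new sequence "in the style of" the examples.
    out = [start]
    for _ in range(length - 1):
        nxt = counts.get(out[-1])
        if not nxt:
            break
        symbols, weights = zip(*nxt.items())
        out.append(random.choices(symbols, weights=weights)[0])
    return out

# The symbols could be pitches, chord labels, tile rows, room types, ...
examples = [["C", "E", "G", "E", "C"], ["C", "D", "E", "G", "C"]]
print(generate(train_markov(examples), "C", 6))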
-Mark
---
PhD student, School of Interactive Computing, Georgia Tech
Researcher, Expressive Intelligence Studio, UC Santa Cruz
http://www.cc.gatech.edu/~mnelson/
The way I see it is that a generate-and-test system consists of a
generator that produces complete artifacts according to some
parameters, and a test system that assigns some
score/fitness/"goodness" to the artifact. The score does not need to
be a single scalar, but could be a vector, specifying how good the
artifact is according to several different criteria. The
parametrization for the next artifact(s) will then in some way depend
on the parametrization of the previous artifact(s); for example, the
next generation could be created from the previous generation. (In the
case of a score vector, you would use a multiobjective evolutionary
algorithm.)
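A minimal sketch of that loop, with a made-up generator and test (a
real system would plug in its own), might look like:

import random

def generate(params):
    # Made-up generator: each parameter is the probability that one cell of a
    # toy "level" is filled.
    return [random.random() < p for p in params]

def test(artifact):
    # Made-up test: a single scalar score here; it could just as well return
    # a vector of criteria, in which case a multiobjective EA replaces the
    # selection step below.
    return sum(artifact) / len(artifact)

def evolve(generations=100, pop_size=20, n_params=10):
    population = [[random.random() for _ in range(n_params)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=lambda p: test(generate(p)),
                        reverse=True)
        parents = ranked[:pop_size // 2]
        # The next generation's parametrizations depend on the previous one's.
        population = parents + [
            [min(1.0, max(0.0, x + random.gauss(0, 0.1)))
             for x in random.choice(parents)]
            for _ in range(pop_size - len(parents))
        ]
    return max(population, key=lambda p: test(generate(p)))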
So I would see what Adam calls generate-and-test as a degenerate case
of an optimization-based approach, where there is only a single
score/fitness dimension, this dimension is discrete and only has two
possible values (accept or reject), and where the parameters of the
new artifact that is generated might not depend in a meaningful way on
the parameters of the previous artifact that was generated.
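In code, that degenerate case collapses to something like this (again
only a sketch, with illustrative names):

def generate_and_test(generate, accept, max_tries=1000):
    # Single binary "score" (accept or reject), and each new attempt's
    # parameters are drawn without reference to the previous attempt.
    for _ in range(max_tries):
        artifact = generate()
        if accept(artifact):
            return artifact
    return None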
I tend to get into taxonomizing now and then, and especially for a
research field which doesn't have a textbook yet, I think there's a
case for doing so. Hmm... maybe it's unfair to call g&t a "degenerate
case". Maybe we should treat both optimization-based approaches and the
procedure Adam refers to as different forms of g&t?
Julian
2009/10/23 Adam Smith <rndm...@gmail.com>:
--
Julian Togelius
Assistant Professor
IT University of Copenhagen
Rued Langgaards Vej 7
2300 Copenhagen S
Denmark
jul...@togelius.com
http://julian.togelius.com
+46-705-192088
So I would see what Adam calls generate-and-test as a degenerate case
of an optimization-based approach
The "estimation of distribution" flavour of evolutionary algorithms
(e.g. CMA-ES) is a form of model-based optimization, as acknowledged
in the Zlochin et al. paper you refer to. There are also good arguments
that many other EAs (e.g. evolution strategies with self-adaptation,
or differential evolution) are implicitly model-based.
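For what it's worth, the core of an estimation-of-distribution
algorithm fits in a few lines. The following is a naive per-dimension
Gaussian EDA, not CMA-ES itself (which adapts a full covariance matrix
and step size); it only shows the "fit a model to the good samples,
then sample the model" loop:

import random
import statistics

def eda_minimize(f, dim, pop=50, elite=10, iters=100):
    # Fit a per-dimension Gaussian to the elite samples, then draw the next
    # population from that model; the model, not mutation of individuals,
    # drives the search.
    mu = [0.0] * dim
    sigma = [1.0] * dim
    for _ in range(iters):
        xs = [[random.gauss(m, s) for m, s in zip(mu, sigma)]
              for _ in range(pop)]
        best = sorted(xs, key=f)[:elite]
        mu = [statistics.mean(col) for col in zip(*best)]
        sigma = [statistics.stdev(col) + 1e-9 for col in zip(*best)]
    return mu

# e.g. eda_minimize(lambda x: sum(v * v for v in x), dim=5)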
Moral: every good idea has been discovered independently by many
people and comes with many labels (in different research communities).
Julian
I recently ran across a description of the galaxy-generation algorithm
for _Elite_ (1984), which has the amusing motivation of reducing RAM
requirements: content is procedurally generated (deterministically),
so it doesn't actually have to be stored.
From
<http://en.wikipedia.org/wiki/Elite_(computer_game)#Technical_innovations>:
"The Elite universe contains eight galaxies, each galaxy containing 256
planets to explore. Due to the limited capabilities of 8-bit computers,
these worlds are procedurally generated. A single seed number is run
through a fixed algorithm the appropriate number of times and creates a
sequence of numbers determining each planet's complete composition
(position in the galaxy, prices of commodities, and even name and local
details — text strings are chosen numerically from a lookup table and
assembled to produce unique descriptions for each planet). This means
that no extra memory is needed to store the characteristics of each
planet, yet each is unique and has fixed properties. Each galaxy is also
procedurally generated from the first.
"However, the use of procedural generation created a few problems. There
are a number of poorly located systems that can be reached only by
galactic hyperspace — these are more than 7 light years from their
nearest neighbour, thus trapping the traveller. Braben and Bell also
checked that none of the system names were profane."
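The underlying trick is easy to sketch. This is not Elite's actual
algorithm (which works on particular integer sequences); it just shows
the general seed-plus-index idea of regenerating content on demand
instead of storing it:

import random

def planet(galaxy_seed, index):
    # Everything about a planet is derived from (seed, index), so nothing has
    # to be stored: rerunning with the same inputs reproduces the same planet.
    rng = random.Random(galaxy_seed * 100003 + index)
    syllables = ["la", "ve", "ti", "or", "zu", "qu", "an", "es"]
    name = "".join(rng.choice(syllables) for _ in range(3)).capitalize()
    return {
        "name": name,
        "position": (rng.randrange(256), rng.randrange(256)),
        "fuel_price": round(rng.uniform(0.5, 5.0), 1),
    }

# The same call always yields the same planet.
assert planet(0x5A4A, 7) == planet(0x5A4A, 7)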
-Mark