Stanford AI Class


labw...@gmail.com

Aug 8, 2011, 2:47:35 PM
to clo...@googlegroups.com
As most of you probably already know, Peter Norvig and S. Thrun will offer a free online intro to AI class in the Fall. The problem is that it will probably require Python, since the third edition of the book is in Python. I am somewhat upset that this will make Python the de facto language of AI for a very large number of students. Frankly, I was hoping for Clojure, or at least some breathing room. Does anybody know anything about that?

daly

Aug 8, 2011, 3:22:48 PM
to clo...@googlegroups.com
I signed up for the course. Python is not the best choice but
I'm sure that Peter Norvig knows that. I suppose it was chosen
because it is popular.

AI involves learning which, by definition, involves permanent
changes of behavior. The best way to achieve that is to have the
program self-modify. This is trivial to do in lisp because
programs are data. I'm not sure it is possible in Python without
implementing a python reader/writer. You can fake the learning
by pushing it all into data structures, of course. But there is
a subtle joy in watching the machine write its own code. Maybe
they will have hardware with FPGAs that can be reprogrammed on
the fly.

I have been "doing AI programming" for years only to watch what
was once considered AI become just another program. Machine vision,
planning, game playing, computer algebra, knowledge representation,
search, robotics, natural language, speech recognition, and many
other technologies used to be AI. I'm curious to see what they
consider AI today. Will they address the frame problem? Will they
mention facets? Will self-modification even be mentioned?

Tim Daly



Mark Engelberg

Aug 8, 2011, 4:17:49 PM
to clo...@googlegroups.com
The course website (for the actual classroom-based class that
underlies the online class) doesn't list any programming language as a
prerequisite. The only prerequisites listed are a strong
understanding of probability and linear algebra.

Mark Engelberg

Aug 8, 2011, 4:22:56 PM
to clo...@googlegroups.com
BTW, Norvig's older AI book uses LISP. According to his website, he
switched to Python because students complained that the LISP code did
not look enough like the pseudocode outline of how a given algorithm
works, and had trouble making the connection between the two.

daly

Aug 8, 2011, 5:18:23 PM
to clo...@googlegroups.com
I don't see where they specified a programming language but
I know that Python is making huge inroads. Python is supposed
to be "easier to understand" so I can see why it might be a
factor in language choice.

Another likely factor is that Google (where Norvig works) supports
Python but not Lisp. I find the choice of Python over Lisp
surprising considering that Lisp compiles to machine code and
Python does not. Apparently a free, global, factor-of-10 speed
improvement isn't a worthy engineering goal :-)

It is trivial to make Lisp look like Python,
just put each paren on its own line and move them hard right.
Add a few macros (e.g. for) and you could probably parse it.

Clojure might have been a nice fit since it lives both in the
Lisp world and the Java world. Java is one of the approved Google
languages and Clojure is homoiconic so self modification is easy.
But, hey, it's an AI class so we really don't want to work outside
the comfort zone :-)

If you look at "traditional AI" such as games I think you'll
find that minimax algorithms are going to really strain to work
in games like Go without tail recursion or call/cc. Or if you
use Fourier Transforms to spread the learning from the 9x9
training board to the 19x19 board I think you'll miss the
elegance of Lisp for programming the network structures. Fourier
learning doesn't seem to be widely known so I doubt it will get
mentioned.

AI without Lisp is possible but not pretty.

But I'm signed up anyway and I'm willing to be convinced otherwise.
I have worked in Python commercially and it's a reasonable language.
They are hoping for 10,000 students. It would have been great to
have Clojure taught to 10k students in one class but I'm sure that
signing up for that language war would be a distraction. Maybe next
year.

Tim Daly

labw...@gmail.com

Aug 8, 2011, 5:19:18 PM
to clo...@googlegroups.com
But it could also mean that you are expected to learn Python while taking the class.

Mark Engelberg

Aug 8, 2011, 8:41:54 PM
to clo...@googlegroups.com
On Mon, Aug 8, 2011 at 2:18 PM, daly <da...@axiom-developer.org> wrote:
> It is trivial to make Lisp look like Python,
> just put each paren on its own line and move them hard right.
> Add a few macros (e.g. for) and you could probably parse it.

I agree with most of what you said, but not this.

There's a big difference between the readability of:
a[i+1] += 2
and
(vector-set! a (add1 i) (+ (vector-ref a (add1 i)) 2)) ; Scheme
or
(swap! a #(assoc % (inc i) (+ (% (inc i)) 2))) ; Clojure
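
(For reference, a sketch of the shortest idiomatic version I can think of, assuming `a` is an atom holding a vector; the readability gap versus the infix form still stands:)

```clojure
;; Sketch: the same in-place increment, assuming a is an atom holding a vector.
(def a (atom [5 10 15]))
(def i 0)

;; a[i+1] += 2 becomes:
(swap! a update-in [(inc i)] + 2)
;; @a is now [5 12 15]
```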

Python is widely regarded as the most readable language. My
understanding is that this AI class is a cross-discipline class at
Stanford, taught to engineers and others who have minimal background
in programming. My impression is that there will be few if any
programming assignments in the class; however, there are programs in
the book as a sort of "executable pseudocode" to describe the
algorithms. In other words, they are optimizing for readability, not
writability. If there does end up being programming in the class, my
guess from the syllabus is that it will be very heavy on linear
algebra; Python has good libraries for that.

On the other hand, Lisp's prefix notation, while natural for some
things, looks very unnatural in many contexts, especially math, where
people have a whole lifetime of seeing infix notation. Furthermore,
Clojure requires a fundamentally different way of thinking about state
and simulation, and nearly every classical algorithm (many of which
are stateful) would have to be reworked in a less stateful manner
and/or wrapped in layers of refs, dosyncs, alters and dereferencing,
all of which potentially distract from the clarity of the algorithm.
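
As a toy illustration of that ceremony (a sketch of my own, not from the course material), even a trivial stateful accumulator picks up refs, dosync, alter, and dereferencing:

```clojure
;; A classic mutable accumulator, wrapped in Clojure's STM machinery.
(def total (ref 0))

(defn add-sample! [x]
  (dosync
    (alter total + x)))   ; transactional update of the ref

(doseq [x [1 2 3 4]]
  (add-sample! x))
;; @total is now 10
```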

I personally prefer Clojure to Python, but in the context of this
class, I have no doubt that Python is the more appropriate choice.

Robert Levy

Aug 9, 2011, 10:42:50 AM
to clo...@googlegroups.com

Another likely factor is that Google (where Norvig works) supports
Python but not Lisp.
I find the choice of Python over Lisp
surprising considering that Lisp compiles to machine code and
Python does not. Apparently a free, global, factor-of-10 speed
improvement isn't a worthy engineering goal :-)



In my spare time I am working on completing a computer science MA program in natural language processing, which I am roughly 1/2 of the way through.  So far all of my courses have made Python the default, but every professor has allowed me to do my work in Clojure instead.  I'm sure we can do the same for the online course.  If you know Python, it's not a big deal to have to read their examples in an inferior language, but do your actual work in Clojure, IMHO. ;)

Rob

Zack Maril

Aug 9, 2011, 10:49:01 AM
to Clojure
If you look here, http://aima.cs.berkeley.edu/code.html, you can see
that the data/code is provided in formats for Java, Lisp, Python, and
just some plaintext as well. Here is his rationale, and other info,
about why he switched: http://norvig.com/python-lisp.html
Personally, I plan on giving it all a shot with Clojure. It makes the
most sense to me right now and it's the language I want to learn.
-Zack


daly

Aug 9, 2011, 4:33:23 PM
to clo...@googlegroups.com
On Tue, 2011-08-09 at 10:42 -0400, Robert Levy wrote:
> > Another likely factor is that Google (where Norvig works) supports
> > Python but not Lisp.
>
> With some exceptions I guess?
>
> http://www.itasoftware.com/
> http://en.wikipedia.org/wiki/Google_App_Inventor
> http://groups.google.com/group/clojure/browse_thread/thread/156da03edd630046?pli=1

I interviewed at Google. They stated that they only support 3
languages in production, Java, C++, and Python. I've worked commercially
in all three and one of my 3 open source projects is in C++ (Magnus).
That said, my "native language" is lisp as I've been writing lisp code
for 40 years in a dozen dialects. The interviewers did not know lisp.
I suspect this will get even worse as the top universities teach
Python rather than Scheme.

I believe that ita software has only recently been bought by Google.
I tried to apply to ita software, a lisp shop, but they have the
"hiring puzzles" barrier. I simply refuse to work at a place where
management thinks this is a good idea. I have a resume. I have open
source work, publicly available, where I modify a million line, lisp
based, computer algebra system. In my experience with interviews,
the places with "puzzles" and "MS/Google quizzes" are not really
bothering to evaluate the candidate (i.e. me). They are usually
unfamiliar with my resume (nobody at Google knew what was on it
even though they had a copy at the interview). The whole idea of
such approaches shows (a) a lack of respect for the individual and
(b) an arrogant attitude of "you should feel LUCKY that we even
CONSIDERED talking to you"...Google even gave me a t-shirt when they
rejected me :-). It smells like a hazing ritual for a frat club.

It should be interesting to see how the culture changes at ita.
I might interview just to get the t-shirt. I could end up never
buying shirts again :-)

Enough of the war stories... back to Clojure. Languages affect the
way that you think. For example, in my Java job everyone talked about
"design patterns" (e.g. "the singleton pattern", "the visitor pattern").
Even though Clojure is Java based I don't see the same design pattern
discussions. Singletons are just eqs. Visitors are mapping functions.
See Norvig's talk:
http://norvig.com/design-patterns/ppframe.html

This seems to be a subtle difference in thinking but it leads to a
profound difference in approach. Java "design patterns" try to shape
the solution to the patterns. Lisp-based solutions try to shape the
solution to the problem. I call this the "impedance mismatch".

Impedance mismatch is when you hook a firehose to a soda straw.
Or when you try to solve your problem by expressing it in bits.
Some languages are very close to your problem, like R for statistics.
They rely on a smart compiler to cross the chasm to the machine.
Some languages are very close to the machine, like assembler.
They rely on a smart programmer to carry the problem to the machine.
Most languages are "general purpose" and sit in the middle.

Lisp is the only language I know that allows you to completely
cover the chasm as you see fit. It is possible to write
(integrate (car x))
where (car x) is the 0th offset from x and integrate is a huge
hairy function. Other languages do this (e.g. a[0]) but I find there
is a strong language-based focus (design patterns) rather than a
problem-based focus. So Lisp seems to have the lowest impedance
mismatch of all of the languages I know. It is easier to match
the solution to the problem without the language being a barrier.

Watch the postings on this list and see how often people are tangled
in design pattern discussions. I don't remember seeing that thread.
Doesn't that seem odd, given the Java connection?

So I think that, on the whole, the universities have made a bad choice
not from a language perspective but from a "thinking" perspective. I
have taken the online SICP course from the authors and it had little to
do with Lisp and a lot to do with thinking. I don't know how to teach
the same way of thinking in a non-lisp course. How can you re-express a
data representation (cons) in lambda form in Python? You don't even have
the words to begin. See
http://lispy.wordpress.com/2007/10/13/how-studying-sicp-made-me-a-better-programmer

Clojure would be ideal for this kind of teaching as you can start with
Java or Clojure and show both ways of expressing things.

Rich has been very clever to use the benefits of OO programming ideas
without the downsides found in OO languages. I don't see the golf-ball
problem showing up in Clojure where I see it a lot in Java:
http://www.perlmonks.org/index.pl?node_id=645261

>
> In my spare time I am working on completing a computer science MA
> program in natural language processing, which I am roughly 1/2 of the
> way through. So far all of my courses have made Python the default,
> but every professor has allowed me to do my work in Clojure instead.
> I'm sure we can do the same for the online course. If you know
> Python, it's not a big deal to have to read their examples in an
> inferior language, but do your actual work in Clojure, IMHO. ;)

When I did natural language work I found that the chart-parser we
wanted was easier to express in Lisp functionally but since the rest
of the system was in C I wrote it in C. Python isn't a bad language,
it is just the MSBasic of its time. Program in Python and Lisp and
then listen to your own thought processes. You'll find that you think
in different ways. Pick the language that matches the way you think.
Some people really do think in Python and find Lisp hard. Language
wars are based on the misunderstanding that there is a right way to
think. Programming is thinking. Find your "native language".

Norvig is very well aware of this so I don't anticipate language
issues being a problem.

Tim Daly


Robert Levy

Aug 9, 2011, 6:59:21 PM
to clo...@googlegroups.com
 
unfamiliar with my resume (nobody at Google knew what was on it
even though they had a copy at the interview). The whole idea of
such approaches shows (a) a lack of respect for the individual and
(b) an arrogant attitude of "you should feel LUCKY that we even
CONSIDERED talking to you"...Google even have me a t-shirt when they
rejected me :-). It smells like a hazing ritual for a frat club.


Wow, I don't know anything about Google's hiring practices, but it sounds like they'd rather shape and mold the cheapest most gaga-eyed fresh-out-of-college youths instead of seeking out experienced people with relevant backgrounds in the projects they'd be working on.  At least with the ITA challenges (I never interviewed there but I have seen these) it can be a way of demonstrating actual skills (albeit on a toy problem), more so than ridiculous on-the-spot puzzles at an interview, but it does seem insulting to require that of people who have more established credentials.

So I think that, on the whole, the universities have made a bad choice
not from a language perspective but from a "thinking" perspective. I
have taken the online SICP course from the authors and it had little to
do with Lisp and a lot to do with thinking. I don't know how to teach
 
&
 
Some people really do think in Python and find Lisp hard. Language
wars are based on the misunderstanding that there is a right way to
think. Programming is thinking. Find your "native language".

I agree with the former, but I'm less of a relativist on the latter.  I mean, some ways of thinking are just more elegant and less muddled than others.  On one hand, it is true that having a diversity of different approaches is more likely to yield novel and unexpected solutions to problems.  But when we can clearly see the same pathologically clumsy behavior repeated and replicated from generation to generation (the need for design patterns being a perfect example), the languages that foster clearer thinking should be favored.

daly

Aug 12, 2011, 12:41:19 PM
to clo...@googlegroups.com, da...@axiom-developer.org
Here is an interesting question to ponder...

Learning involves permanent changes of behavior.

In AI this is often modeled as a self-modifying program.
The easiest way to see this would be a program that handles
a rubics cube problem. Initially it only knows some general
rules for manipulation, some measure of progress, and a goal
to achieve.

Each time it solves a cube it can create a new rule that
moves from the starting configuration to the solution in
one "procedure", rather than constantly invoking new rules.
This could be modeled in the data but we are assuming self
modification for the moment.

So we start with a lisp loop that does something like:

(loop
  look at the current cube state
  for each rule,
    if the condition of the rule matches the cube state
      then apply the rule to update the cube state.
)
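
That loop might be sketched in Clojure roughly as follows (hypothetical representation, my own: each rule is a map with a :match predicate and an :apply step function):

```clojure
;; Hypothetical sketch: each rule is {:match predicate, :apply step-fn}.
(defn run-rules [rules solved? state]
  (if (solved? state)
    state
    (if-let [rule (first (filter #((:match %) state) rules))]
      (recur rules solved? ((:apply rule) state))
      state)))   ; no rule matched; stop where we are
```

For example, with a toy "cube" that is just a counter, `(run-rules [{:match #(< % 3) :apply inc}] #(>= % 3) 0)` steps the state until the goal predicate holds.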

This could be written in some form of a huge switch
statement where each case matches a cube state, e.g.

switch (cubestate)

  (GBG RRB GBB ...): rotate upper cube slice 90 degrees clockwise
  (GBG RRR BBY ...): rotate lower cube slice 90 degrees clockwise
  .....
  (GGG GGG GGG ...): cube solved

When we solve the cube we remember the sequence of rotations
and add a new rule that matches the starting state followed by
a sequence of rotations:

switch (cubestate)

  (GGB RRW GYY ...): rotate upper cube slice 90 degrees clockwise
                     rotate left cube slice 90 degrees clockwise
                     ....
                     cube solved
  (GBG RRB GBB ...): rotate upper cube slice 90 degrees clockwise
  (GBG RRR BBY ...): rotate lower cube slice 90 degrees clockwise
  .....
  (GGG GGG GGG ...): cube solved; learn new rule; self-modify.

Ok. So now we have an architecture for a simple program that learns
by self-modification. We could get all kinds of clever by applying
special recognizers to combine rotations, find subsequences, etc.
but let's discuss the self-modification issue in Clojure.

Hmmm.
Clojure has immutable data structures.
Programs are data structures.
Therefore, programs are immutable.

So is it possible to create a Clojure program that modifies itself?

Tim Daly
da...@axiom-developer.org

Ken Wesson

Aug 12, 2011, 1:08:55 PM
to clo...@googlegroups.com
On Fri, Aug 12, 2011 at 12:41 PM, daly <da...@axiom-developer.org> wrote:
> Clojure has immutable data structures.
> Programs are data structures.
> Therefore, programs are immutable.
>
> So is it possible to create a Clojure program that modifies itself?

Yes, if it slaps forms together and then executes (eval `(def ~sym
~form)) or (eval `(defn ~sym ~argvec ~form)) or similarly, or perhaps
uses alter-var-root. (May require :dynamic true set for the involved
Vars in 1.3 for functions and such to start using the new values right
away -- binding definitely does. In 1.2, alter-var-root should "just
work". Changes occur as with atom's swap!, so the function passed to
alter-var-root may potentially execute more than once.)

Alternatively, you can create new functions on the fly by evaling fn
forms and use atoms or refs to hold stored procedures in Clojure's
other concurrency-safe mutability containers. These need to be called
by derefing them, e.g. (@some-atom arg1 arg2). The functions will
compile to bytecode and be eligible for JIT the same as ones not
created dynamically at runtime, though the reference lookups carry a
performance hit on invocation.
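
A minimal sketch of that second approach (illustrative names only):

```clojure
;; Store the current behavior in an atom; "learning" swaps in a new fn.
(def behavior (atom (fn [x] (* x 2))))

(@behavior 21)                       ; => 42

;; Replace the stored procedure at runtime; callers pick up the new
;; version on their next deref, no eval required.
(reset! behavior (fn [x] (* x x)))

(@behavior 21)                       ; => 441
```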

Clojure can probably be quite a good AI research and development platform.

--
Protege: What is this seething mass of parentheses?!
Master: Your father's Lisp REPL. This is the language of a true
hacker. Not as clumsy or random as C++; a language for a more
civilized age.

daly

Aug 12, 2011, 4:25:31 PM
to clo...@googlegroups.com
On Fri, 2011-08-12 at 13:08 -0400, Ken Wesson wrote:
> On Fri, Aug 12, 2011 at 12:41 PM, daly <da...@axiom-developer.org> wrote:
> > Clojure has immutable data structures.
> > Programs are data structures.
> > Therefore, programs are immutable.
> >
> > So is it possible to create a Clojure program that modifies itself?
>
> Yes, if it slaps forms together and then executes (eval `(def ~sym
> ~form)) or (eval `(defn ~sym ~argvec ~form)) or similarly, or perhaps
> uses alter-var-root. (May require :dynamic true set for the involved
> Vars in 1.3 for functions and such to start using the new values right
> away -- binding definitely does. In 1.2, alter-var-root should "just
> work". Changes occur as with atom's swap!, so the function passed to
> alter-var-root may potentially execute more than once.)

Consing up a new function and using eval is certainly possible but
then you are essentially just working with an interpreter on the data.

How does function invocation actually work in Clojure?
In Common Lisp you fetch the function slot of the symbol and execute it.
To modify a function you can (compile (modify-the-source fn)).
This will change the function slot of the symbol so it will execute the
new version of itself next time.

Does anyone know the equivalent in Clojure? Would you have to invoke
javac on a file and reload it? The Clojure compile function only seems
to know about files, not in-memory objects.


>
> Clojure can probably be quite a good AI research and development platform.

Well I'm starting to "prep" for the class by thinking about how to
write a self-modifying function that learns in Clojure.

The mixture of immutability and self-modification would imply that
all of the "old" versions of the function still exist, essentially
forming a set of more primitive versions that could be interesting
in its own right.

Suppose, for instance, that you did have all of the prior versions.
You could create a "branching learning application". The idea is that
you can have a linear path of learning and then reach back, modify an
old version, and create a second "branch" so now you have two
different paths of learning. This idea might be very useful. One of
the problems that happens in learning systems is that they "hill climb".
They constantly try to get better. But if you think about "better", it
might mean that you reached a local optimum (a low peak) that cannot
get better. But there might be a better optimum (a higher peak) on some
other path. Branched learning could look back to a lower point, and
choose a second path. If the second path ends up better than the first
then you can abandon the worse path.

Since Clojure has immutable data and data are programs it would seem
that Clojure has a way of doing this "branched learning".
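
One way to sketch that version history (hypothetical; it just keeps each learned function in a vector):

```clojure
;; Keep every learned version; a "branch" is just an older index.
(def versions (atom [(fn [x] x)]))    ; version 0: identity

(defn learn! [new-fn]
  (swap! versions conj new-fn))

(defn branch-from [n]
  (nth @versions n))                  ; reach back to version n

(learn! (fn [x] (inc x)))             ; version 1
(learn! (fn [x] (* 2 x)))             ; version 2

((branch-from 1) 5)                   ; => 6, resuming from the older rung
```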

Clojure could certainly bring a couple interesting ideas to the AI
class.

Tim Daly
da...@axiom-developer.org

Sergey Didenko

Aug 12, 2011, 6:16:15 PM
to clo...@googlegroups.com
BTW, is there a case where an AI self-modifying program is much more elegant than an AI just-data-modifying program?

I just can't figure out an example where it makes a lot of sense to go the self-modifying route.

pmbauer

Aug 12, 2011, 7:30:49 PM
to clo...@googlegroups.com
+1

Ken Wesson

Aug 12, 2011, 8:14:22 PM
to clo...@googlegroups.com
On Fri, Aug 12, 2011 at 4:25 PM, daly <da...@axiom-developer.org> wrote:
> Consing up a new function and using eval is certainly possible but
> then you are essentially just working with an interpreter on the data.
>
> How does function invocation actually work in Clojure?
> In Common Lisp you fetch the function slot of the symbol and execute it.
> To modify a function you can (compile (modify-the-source fn)).
> This will change the function slot of the symbol so it will execute the
> new version of itself next time.

As I indicated in the earlier post, consing up a new function and
eval'ing it actually invokes a compiler and generates a dynamic class
with bytecode. It's just as eligible for JIT as a class compiled AOT
and loaded the "usual Java way".

As for your later speculations, whether a history of earlier values is
kept is entirely* up to the programmer. They can keep older versions
around or not. If they stop referencing one, the GC should collect it
at some point (modulo some VM options needed to make unreferenced
classes collectible).

* The STM, as I understand it, may keep a history of the last few
values of a particular ref, if that ref keeps getting involved in
transaction retries. But this history isn't accessible to user code,
at least without doing implementation-dependent things that could
break in future Clojure versions. If you store a lot of big things in
refs, it could impact memory and GC performance though.

daly

Aug 12, 2011, 9:07:41 PM
to clo...@googlegroups.com

Clearly both are equivalent in a Turing sense.

If you are 6 foot tall but "modify that in data"
so you are 7 foot tall and always report yourself
as 7 foot tall then there is no way to distinguish
that from the records. Yet there is a philosophical
difference between "being" and "reporting". AI involves
a lot of debate about philosophy so expect that.

Learning, by one definition, involves a permanent change
in behavior. This has to be reported in some way. Since
programs are data in lisp this is something of a semantics
debate. Is the program "really changed" or just "reporting"?

Consider a more "advanced" kind of learning where we use
genetic programs to evolve behavior. Clearly you can do this
all using data but it is a bit more elegant if you can take
"genes" (i.e. slices of code), do crossovers (i.e. merge the
slices of code into other slices), and get a new mutated set
of "genes". These can be embedded in chromosomes which are
just larger pieces of code. Real cells don't use a "data
scratchpad", they self-modify.
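
Because Lisp code is plain nested lists, a toy crossover is only a few lines (a sketch of mine; real GP systems pick crossover points randomly and enforce type and depth limits):

```clojure
;; Toy crossover: swap the trailing "gene slices" of two code lists.
(defn crossover [parent-a parent-b]
  [(concat (take 2 parent-a) (drop 2 parent-b))
   (concat (take 2 parent-b) (drop 2 parent-a))])

(crossover '(+ 1 (* x x)) '(- 10 (/ x 2)))
;; => [(+ 1 (/ x 2)) (- 10 (* x x))]
```

The children are themselves executable code, which is the elegance being claimed here: no separate "data scratchpad" representation is needed.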

The resulting self-modified lisp code has execution semantics
defined by the language standard (well, CL has a standard).
The "data representation" does not have standard semantics.
Data has semantics relative to the "master smart program thing"
you wrote.

It turns out that self-modifying lisp code is both interesting
and elegant as the "data" is "itself", not some reported-thing.
There is a reason lisp dominated the AI world for so long.

Many centuries ago I authored a language called KROPS (see
[1] below). It was a program that allowed you to
represent knowledge using both subsumption (KREP-style) and
rules (OPS5-style). Adding a new piece of knowledge caused it
to self-modify in a way that allowed both execution semantics
and "data" semantics. KROPS is way too complex to explain here
but it was very elegant. IBM built a financial and marketing
expert system in KROPS on symbolics machines. The point of all
of this self-trumpet noise is that I don't believe I could have
had the insight to build it in a "data" model. I just let it
build itself to evolve a problem solution. There was no data,
only program.

This isn't intended to be a debate about WHY we might want a
self-modifying program. It is a question of whether Clojure, as
a Lisp, is sufficiently well-crafted to allow it to self-modify.
It has been done in other lisps but I don't yet know how to do
it in Clojure. If you only want a "master smart program thing
that reports data" that's perfectly fine. You could write that
in any language.


Tim Daly
da...@axiom-developer.org

[1] Daly, Kastner, Mays, "Integrating rules and inheritance networks
in a knowledge-based financial and marketing consultation system",
HICSS 1988, pp. 495-500.


Ken Wesson

Aug 12, 2011, 9:49:01 PM
to clo...@googlegroups.com
(defn f [x]
  (println "hello, " x))

(defn g []
  (eval '(defn f [x] (println "goodbye, " x))))

(defn -main []
  (#'user/f "world!")
  (g)
  (#'user/f "cruel world."))

Close enough? :)

daly

Aug 12, 2011, 10:23:49 PM
to clo...@googlegroups.com
On Fri, 2011-08-12 at 21:49 -0400, Ken Wesson wrote:
> (defn f [x]
>   (println "hello, " x))
>
> (defn g []
>   (eval '(defn f [x] (println "goodbye, " x))))
>
> (defn -main []
>   (#'user/f "world!")
>   (g)
>   (#'user/f "cruel world."))
>
> Close enough? :)

You get an A. --Tim

Lee Spector

Aug 13, 2011, 1:36:23 PM
to clo...@googlegroups.com

On Aug 12, 2011, at 9:07 PM, daly wrote:
> On Fri, 2011-08-12 at 16:30 -0700, pmbauer wrote:
>> +1
>>
>> On Friday, August 12, 2011 3:16:15 PM UTC-7, Sergey Didenko wrote:
>> BTW, Is there a case when AI self-modifying program is much
>> more elegant than AI just-data-modifying program?
>>
>> I just can't figure out any example when there is a lot of
>> sense to go the self-modifying route.
>
> Clearly both are equivalent in a Turing sense.
> ... [text omitted]

> Consider a more "advanced" kind of learning where we use
> genetic programs to evolve behavior. Clearly you can do this
> all using data but it is a bit more elegant if you can take
> "genes" (i.e. slices of code), do crossovers (i.e. merge the
> slices of code into other slices), and get a new mutated set
> of "genes". These can be embedded in chromosomes which are
> just larger pieces of code. Real cells don't use a "data
> scratchpad", they self-modify.
> ...[text omitted]

>
> This isn't intended to be a debate about WHY we might want a
> self-modifying program. It is a question of whether Clojure, as
> a Lisp, is sufficiently well-crafted to allow it to self-modify.
> It has been done in other lisps but I don't yet know how to do
> it in Clojure.

I think there's some blurring of levels of abstraction going on in this discussion.

As someone who works on code-modifying AI (genetic programming, much along the lines described above -- which, BTW, I would expect Thrun and Norvig to mention only briefly, if at all... but that's a debate for a different forum) I find that languages that make code manipulation simple and elegant do help one to experiment and develop these kinds of AI systems more easily.

But this is true whether the manipulated code is compiled and executed in the normal way or treated as a data structure and interpreted in some other way.

On the one hand most people who work in genetic programming these days write in non-Lisp languages but evolve Lisp-like programs that are interpreted via simple, specialized interpreters written in those other languages (C, Java, whatever).

On the other hand I prefer to work in Lisp (Common Lisp, Scheme, Clojure), but my main project these days involves evolving Push programs rather than Lisp programs, for a bunch of reasons related to evolvability -- see http://hampshire.edu/lspector/push.html if you really want to know. I prefer to work in Lisps because they make it simpler to write code that manipulates program-like structures (however they end up being "executed") and because I like Lisps better than most other languages for a slew of other reasons. But my evolved code is executed on a Push interpreter implemented in the host language and there are Push-based GP systems in many languages (C++, Java, Javascript, Python, several Lisps). The language choice really just affects software engineering and workflow issues, not the fundamental power of the system to evolve/learn.

So if the question is about whether Clojure programs can self-modify in the "modified code executes the same way as any other code" sense, then the answer is "yes," and Ken Wesson provided a simple demonstration of this on this thread. But if the question is about whether Clojure is a good language for writing AI systems that involve code self-modification, I would say that the answer is also "yes," but for completely different reasons.

-Lee

Ken Wesson

Aug 13, 2011, 5:14:59 PM
to clo...@googlegroups.com
On Sat, Aug 13, 2011 at 1:36 PM, Lee Spector <lspe...@hampshire.edu> wrote:
> As someone who works on code-modifying AI (genetic programming, much along the lines described above -- which, BTW, I would expect Thrun and Norvig to mention only briefly, if at all... but that's a debate for a different forum) I find that languages that make code manipulation simple and elegant do help one to experiment and develop these kinds of AI systems more easily.
>
> But this is true whether the manipulated code is compiled and executed in the normal way or treated as a data structure and interpreted in some other way.
>
> On the one hand most people who work in genetic programming these days write in non-Lisp languages but evolve Lisp-like programs that are interpreted via simple, specialized interpreters written in those other languages (C, Java, whatever).

The ultimate in Greenspunning. :)

Myriam Abramson

unread,
Aug 13, 2011, 9:21:20 PM8/13/11
to clo...@googlegroups.com
I'm just concerned that teaching AI to 25K+ students is going to have an impact, and it shouldn't be just because P. Norvig didn't publish the next edition of AIMA. I wish they would realize that and change their minds about the assignments for the class. Even Javascript is more suitable for AI than Python!

--

Paulo Pinto

unread,
Aug 14, 2011, 8:50:32 AM8/14/11
to Clojure
I guess that nowadays many AI systems are mainly programmed in some kind of specialized DSL.

Sure, Lisp-based languages are a perfect candidate for that, but the mere mention of Lisp brings up some issues that you cannot get rid of, like the parentheses.

To be honest, while I was at university I always preferred Prolog to Lisp, due to the close relationship some of our professors had with Edinburgh's university.

I think it is better that the students learn AI than a new programming language which forces them to think in a different way. Learning multiple concepts at the same time is not easy, and you might lose students along the way because of it.

Who knows, maybe some of those students will eventually find their way to Clojure/Lisp.


On 14 Aug., 03:21, Myriam Abramson <mabram...@gmail.com> wrote:
> I'm just concerned that teaching AI to 25K+ students is going to have an
> impact and it shouldn't be because P. Norvig didn't publish the next edition
> of AIMA. I wish they would realize that and change their mind on the
> assignments for the class. Even Javascript is more suitable for AI than
> Python!
>
>
>
>
>
>
>
> On Sat, Aug 13, 2011 at 5:14 PM, Ken Wesson <kwess...@gmail.com> wrote:
> > On Sat, Aug 13, 2011 at 1:36 PM, Lee Spector <lspec...@hampshire.edu>

Ken Wesson

unread,
Aug 14, 2011, 10:46:24 AM8/14/11
to clo...@googlegroups.com
On Sun, Aug 14, 2011 at 8:50 AM, Paulo Pinto <paulo....@gmail.com> wrote:
> I guess that nowadays many AI systems are mainly programmed in
> some kind of specialized DSL.
>
> Sure Lisp based languages are a perfect candidate for it, but the
> plain
> mention of Lisp brings up some issues that you cannot get rid of, like
> the parenthesis.

And what, exactly, is wrong with parentheses?

Put another way: the syntax tree of a program will have some sort of
particular structure. To the extent that parentheses DON'T determine
the sub-branches, something else WILL. What do other languages use for
that "something else"?

* Other delimiters, either characters as in C's { ... } blocks, whole
words as in "if ... fi" in some shell languages, or other things.

* Significant whitespace, most commonly in languages that use newlines
to delimit statements (BASIC) but sometimes going much further
(Python).

* Syntax rules where the positions of various things among various
keywords determines the structure.

* Precedence rules among math operators.

Drop the first one (Clojure also uses other delimiters) and what we're
left with is, mostly, icky. Significant whitespace means an editor
line-wrapping or, in some cases, even reindenting code might change
its semantics. Syntax rules and precedence rules burden the developer
with remembering them all, and beyond simple and fairly standardized
rules such as "* before +" precedence rules tend to be avoided by
defensive use of ... parentheses. The one language family with a small and regular syntax other than Lisp seems to be Smalltalk, and it avoids complex precedence rules by using strict left-to-right evaluation, with devastating results: 1 + 3 * 4 comes out as 16 instead of 13. Yikes!
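To make the contrast concrete, a small Clojure sketch: with fully parenthesized prefix notation you write the expression tree directly, so there is no precedence table to remember or misremember.

```clojure
;; The same three numbers, two explicit trees -- no precedence rules needed.
(+ 1 (* 3 4)) ;=> 13, the conventional "* before +" reading
(* (+ 1 3) 4) ;=> 16, the strict left-to-right (Smalltalk-style) reading
```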

> To be honest, while I was at the university I always preferred Prolog
> to Lisp, due to the close relationship some our professors had with
> Edinburgh's university.

This seems to be a total non sequitur, like saying you always
preferred beef to chicken due to the US/Canada border being wiggly
instead of straight east of North Dakota.

In other words: How, exactly, are your former university's professors
and Edinburgh's university causally related to a personal preference
for Prolog? There's no obvious reason why that would be. Does
Edinburgh's university have a strong focus on Prolog for some reason,
which "rubbed off" via their professors and your university's
professors to you?

> I think it is better that the students learn AI than a new programming
> language
> with forces them to think in a different way. Learning multiple
> concepts at
> the same time is not easy and you might loose students along the way
> because of it.
>
> Who knows, maybe some of those students will eventually find their way
> to
> Clojure/Lisp.

If they go into AI, it's very likely that will expose them to Lisp at
some point, yes.

I doubt, on the other hand, that Prolog is a very good language to
use, odd though that may seem. The problem is that an AI you try to
develop in Prolog will probably be strongly influenced by the language
choice towards being a theorem-prover type of system with bells on,
and natural intelligence Does Not Work That Way. Natural intelligence
guesses and intuits and invents whole new concepts to create
simplified models that predict its past sense data, and tests them
against future sense data. Self-aware intelligence adds a simplified
model to predict its own past feelings and behaviors (and has
autobiographical memory so these can be "past sense data"), and tests
them against future sense data (and, potentially, also against
simulations spawned using the world models -- think about that as you
go to sleep tonight). Formal logical reasoning, Prolog style, is
actually something we had to invent, rather than something we had
innately. I doubt we'd be anywhere near as prone to various mistakes
and fallacies in reasoning if we were based on Prolog! On the other
hand we'd probably not be creative.

In simpler language, logic-based AI has been "done to death" and
doesn't seem to lead to anything fundamentally new in terms of
software capabilities (i.e., closer to what we can do, able to
automate more that we currently have to do ourselves). Of course it
can be done in principle, since Prolog is Turing-complete, but it may
be easier in some other language.

Lisp, of course, isn't just "some other language" but a kind of
language-mother due to the ease with which it spawns DSLs, so the best
shot probably actually still lies with Lisp. The most advanced AI we
ever made, able to reason in the most humanlike ways and to invent new
concepts, was Eurisko, and that was programmed in Lisp. In fact,
Eurisko was so promising, and we haven't even come close to equaling
that achievement in the many years since, that I still wonder if the
government or aliens (:)) or time travelers (:)) or someone put the
kibosh on that line of research out of fear of Skynet (:)) or
something.

Paulo Pinto

unread,
Aug 14, 2011, 12:33:00 PM8/14/11
to Clojure
Wow! That was a very comprehensive answer.

I was not expecting it.

Please note that you don't need to argue for Lisp to me; I also enjoy the language quite a lot. Unfortunately, the closest I get to using Lisp-based languages on the job is Emacs and GIMP macros.

I cannot sneak in Clojure because, in my area of work, FP concepts are the domain of gods.

It seems you are not aware of Prolog's history. Some of the language's achievements were made at Edinburgh's University. Some of our AI professors did their PhDs/Masters at Edinburgh's University, and there was a connection between our universities. So Prolog was the natural choice for AI projects, the other option being ML, due to a similar connection to INRIA.

While at university I actually learned Lisp on my own, thanks to 'The Little Lisper' and the Common Lisp standard.

The problem with Lisp is the same one I have observed with any programming language/paradigm: to really understand it, you need to use it, but many people base their opinion on what others say, without even trying it.

Lisp, having a long history behind it, has many misconceptions spread around.

--
Paulo



On 14 Aug., 16:46, Ken Wesson <kwess...@gmail.com> wrote:

Jeff Heon

unread,
Aug 14, 2011, 6:37:57 PM8/14/11
to Clojure
I think we all agree that Lisp would be ideal for AI given medium- or long-term exposure, but for an introductory class covering varied branches of AI, we could do worse than Python: an easy-to-read language with various numerical and AI libraries (PyEvolve, for example: http://pyevolve.sourceforge.net/0_6rc1/). And it's dead simple to start using with IDLE or the REPL. After all, the class is not about teaching Python or Lisp; it's about teaching AI concepts. Even Java has been used in this gentle introduction to genetic programming: http://www.gp-field-guide.org.uk/. In fact, it might even be better to start with a no-brainer language like Python.

Also, if I may digress a bit, we have a tendency to forget that what works best for us might not be what works best for others. Lisp simply is not for everybody, maybe not even for most people, and that's OK.

On Aug 13, 1:36 pm, Lee Spector <lspec...@hampshire.edu> wrote:
> On the other hand I prefer to work in Lisp (Common Lisp, Scheme, Clojure), but my main project these days involves evolving Push programs rather than Lisp programs, for a bunch of reasons related to evolvability -- see http://hampshire.edu/lspector/push.html if you really want to know. I prefer to work in Lisps because they make it simpler to write code that manipulates program-like structures (however they end up being "executed") and because I like Lisps better than most other languages for a slew of other reasons. But my evolved code is executed on a Push interpreter implemented in the host language and there are Push-based GP systems in many languages (C++, Java, Javascript, Python, several Lisps). The language choice really just affects software engineering and workflow issues, not the fundamental power of the system to evolve/learn.

Robert Levy

unread,
Aug 15, 2011, 2:02:41 AM8/15/11
to clo...@googlegroups.com
Hmm, I don't know if people will ever see Lisp as an "AI Language" though (kidding).  My personal experience was that I learned Common Lisp in the context of an AI course in college.  I was pretty excited about learning the language, so I read The Little Lisper the summer before taking the course.  At least in my experience, it was not a distraction at all, but a big plus, and definitely life-changing.

ax2groin

unread,
Aug 16, 2011, 11:52:00 AM8/16/11
to Clojure
The NYTimes article on the class also mentions two other classes being
offered for free:
* Machine Learning, by Andrew Ng
* Introductory course on database software, by Jennifer Widom

I'm not sure of the official website for either of these, but the
Machine Learning class sounds promising and didn't have a required
textbook the way the AI class does.

daly

unread,
Aug 16, 2011, 12:21:58 PM8/16/11
to clo...@googlegroups.com
Ng's course on machine learning is online.
I've already taken it.
You need a background in probability.

Tim Daly

Mark Engelberg

unread,
Aug 16, 2011, 12:32:28 PM8/16/11
to clo...@googlegroups.com
The Machine Learning class from 2008 has been available at iTunes
University for a while, but I'm not aware of it having the same kinds
of support materials as the online AI class has said they will offer.

Be forewarned: the Machine Learning class is presented in a very math
intensive way (statistics, linear algebra, and multivariate calculus),
and a lot of the math refreshers were apparently offered in sessions
with TAs that were not (as far as I know) placed online.

I felt like I got a greater understanding of machine learning by
reading the book "Data Mining" by Witten.

Jeff Heon

unread,
Aug 16, 2011, 12:39:16 PM8/16/11
to Clojure
They will also be available to be taken as online classes with grading, like the AI introduction class.

Links are on the main introduction-to-AI page (http://www.ai-class.com/):
http://www.db-class.com/
http://www.ml-class.com/

See also Stanford Engineering Everywhere where past lectures and
material of several other courses are available for free:
http://see.stanford.edu/

Mark Engelberg

unread,
Aug 16, 2011, 1:46:38 PM8/16/11
to clo...@googlegroups.com
Nice. I'm glad these other classes are getting the full treatment.

It's really a shame they don't do a clearer job of defining the
prerequisites. For example, they should post some sort of pre-test
with specific examples of the kind of math that is needed to
understand the class. I find it difficult to know whether to
recommend the classes to the high school students I know.

Christopher Burke

unread,
Aug 16, 2011, 3:41:28 PM8/16/11
to Clojure
I was wondering about the prerequisites as well and found some further
information here:

http://www.stanford.edu/class/cs229/materials.html

In particular, the first 2 entries under Section Notes.

André Thieme

unread,
Aug 16, 2011, 6:31:20 PM8/16/11
to Clojure
On Aug 12, 6:41 pm, daly <d...@axiom-developer.org> wrote:

> In AI this is often modeled as a self-modifying program.
> The easiest way to see this would be a program that handles
> a Rubik's cube problem. Initially it only knows some general
> rules for manipulation, some measure of progress, and a goal
> to achieve.

The Rubik's cube is actually a not-so-simple problem, because the function that measures progress is very difficult to write if you don't know the algorithm for solving a cube. But if you want the program to find that out itself, then there would be no point in having a fitness function that already contains the solution.


> Hmmm.
> Clojure has immutable data structures.
> Programs are data structures.
> Therefore, programs are immutable.

CL programs are also immutable. At some point you will need to call eval or compile.


> So is it possible to create a Clojure program that modifies itself?

Yes, in the same sense as it is possible with CL.

André Thieme

unread,
Aug 16, 2011, 6:39:35 PM8/16/11
to Clojure
On Aug 12, 10:25 pm, daly <d...@axiom-developer.org> wrote:

> Consing up a new function and using eval is certainly possible but
> then you are essentially just working with an interpreter on the data.

This is the same in CL. When you have a GP system that constructs new program trees, then inside your fitness function you will eval them. Clojure does not come with an interpreter, so eval results in a compiled program that gets executed. You can of course use recombination and mutation of programs and/or subprograms and thus create a new generation that is now modified, which then gets its fitness measured.
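A minimal sketch of that mutate-then-eval step (the names here are illustrative, not from any particular GP library): programs are ordinary Clojure forms, mutation rewrites the form, and fitness evaluation simply evals the result.

```clojure
;; Programs are plain Clojure forms; this toy mutation swaps the operator
;; of a (op a b) form for a different randomly chosen one.
(def ops ['+ '- '*])

(defn mutate-op [[op a b]]
  (list (rand-nth (vec (remove #{op} ops))) a b))

(def parent '(+ 2 3))
(def child  (mutate-op parent))

(eval parent) ;=> 5
(eval child)  ;=> either -1 or 6, depending on which mutation was drawn
```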


> How does function invocation actually work in Clojure?

Clojure is a Lisp-1. This is much better suited to a (predominantly) functional programming style. Also see this paper by Kent Pitman:
http://www.nhplace.com/kent/Papers/Technical-Issues.html
(“Probably a programmer who frequently passes functions as arguments or who uses passed arguments functionally finds Lisp1 syntax easier to read than Lisp2 syntax;”)


> In Common Lisp you fetch the function slot of the symbol and execute it.
> To modify a function you can (compile (modify-the-source fn)).
> This will change the function slot of the symbol so it will execute the
> new version of itself next time.
>
> Does anyone know the equivalent in Clojure? Would you have to invoke
> javac on a file and reload it? The Clojure compile function only seems
> to know about files, not in-memory objects.

In Clojure it is: (eval (modify-the-source fn)).
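Spelled out as a minimal sketch (modify-the-source here is a hypothetical transformation, not a standard function): keep the function's source as a data structure, transform it, and eval the result into a new compiled function.

```clojure
(require '[clojure.walk :as walk])

;; Keep the source as data alongside the compiled function.
(def fn-source '(fn [x] (+ x 1)))

(defn modify-the-source
  "Hypothetical transformation: swap + for * throughout the source."
  [src]
  (walk/postwalk-replace {'+ '*} src))

(def original (eval fn-source))
(def modified (eval (modify-the-source fn-source)))

(original 5) ;=> 6
(modified 5) ;=> 5, since the body is now (* x 1)
```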

André Thieme

unread,
Aug 16, 2011, 6:41:08 PM8/16/11
to Clojure
They are all data-modifying programs. The thing is that the program is itself the data. It is just a list, nothing more, and you can apply functions that operate on trees. Those can be recombined and/or mutated, and then get executed in a fitness function.

André Thieme

unread,
Aug 16, 2011, 6:45:07 PM8/16/11
to Clojure


On Aug 13, 11:14 pm, Ken Wesson <kwess...@gmail.com> wrote:
> On Sat, Aug 13, 2011 at 1:36 PM, Lee Spector <lspec...@hampshire.edu> wrote:

> > On the one hand most people who work in genetic programming these days write in non-Lisp languages but evolve Lisp-like programs that are interpreted via simple, specialized interpreters written in those other languages (C, Java, whatever).
>
> The ultimate in Greenspunning. :)


Exactly!
All those people doing GP in C++ end up doing it in Lisp anyway. They write a GP engine that generates trees and manipulates them, and then they'll have to write an interpreter for that limited language, which is basically the idea of Lisp. OMG ;)

Ken Wesson

unread,
Aug 16, 2011, 8:25:23 PM8/16/11
to clo...@googlegroups.com

Whereas if you started out with Lisp, you can skip all that and just write:

a) a few functions/macros

b) something to make/evolve trees of forms whose operator-position
symbols name those functions and macros

c) (eval evolved-form)

and Bob's your uncle. The interpreter you get for free, in the form of
the macroexpander and eval. Actually using Lisp is like having library
support for your Greenspunning. :)
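That recipe fits in a screenful. A hedged miniature (pdiv, random-form, and run are illustrative names; "protected" division is a common GP convention, nothing built in):

```clojure
;; (a) a few functions for evolved code to call
(defn pdiv [a b] (if (zero? b) 1 (/ a b)))  ; "protected" division

;; (b) something to make random trees of forms over those functions
(def terminals [1 2 'x])
(def ops ['+ '* 'pdiv])

(defn random-form [depth]
  (if (zero? depth)
    (rand-nth terminals)
    (list (rand-nth ops)
          (random-form (dec depth))
          (random-form (dec depth)))))

;; (c) (eval evolved-form), wrapped in fn to bind the terminal x
(defn run [form x-val]
  ((eval (list 'fn '[x] form)) x-val))

(run '(+ x (* x x)) 3) ;=> 12
(number? (run (random-form 2) 7)) ;=> true
```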

myriam abramson

unread,
Sep 3, 2011, 8:20:55 PM9/3/11
to clo...@googlegroups.com
Good news! The FAQ mentions that any programming language will do.

On Mon, Aug 8, 2011 at 2:47 PM, <labw...@gmail.com> wrote:
As most of you probably already know, Peter Norvig and S. Thrun will offer a free online intro to AI class in the Fall. The problem is that it will probably require Python since the third edition of the book is in Python. I am somewhat upset that this will make Python the de facto language of AI for a very large number of students. I was hoping for Clojure frankly or have some breathing room. Anybody knows anything about that?
