Game development and Scheme

knubee

May 13, 2005, 11:24:05 AM
Can anyone point to good resources about Game development and Scheme?

In other words, many Scheme courses and texts use games to illustrate
aspects of computer science. I'm looking for the opposite: an emphasis
on good game-development concepts, mechanisms, techniques, coding
practices, structures, and architectures -- specifically with an eye
towards Scheme as the implementation language.

Contrast this with most game development texts. The ones geared towards
C++ tend to spend a lot of time on helping developers solve technical
problems that are non-problematic in more expressive languages -- or
tend to focus largely on those aspects we typically associate with the
language (low-level optimization, I/O, etc.) The ones geared towards
Java tend to formulate everything about game design and development in
terms of its particular OO model -- and how to find/use existing class
libraries to put together a game.

In my wildly unrealistic optimism, I'm looking for the SICP of game
development: "the book is not about Scheme, but about game development
-- we just happen to use Scheme because it is the most appropriate
language for exploring the topic." :-)

thanks, k

Johan Toki Persson

May 14, 2005, 10:51:49 AM
You might be interested in Mike Wiering's master's thesis on game
development in Clean. Though it isn't Scheme-specific, the paper
presents a pretty interesting way of writing games in a functional
style.
http://cleangl.sourceforge.net/thesis/

knubee

May 17, 2005, 2:49:03 AM
Thanks for the pointer; looks interesting. I've also been looking
around for other papers and descriptions. Unfortunately, most of the
references to game programming and Scheme/Lisp are in the form of "well,
company X uses it for scripting" or "it wouldn't be hard to use it for
parts of a game implementation."

Of course, it's possible to just look at existing texts on game
programming and use the appropriate techniques and mechanisms in
Scheme. But I suspect there are aspects of game design/implementation
that are simply not even considered in those texts because the games
are designed/implemented from the perspective of imperative and OO
languages. Of course, with Scheme we can do those things, too. But, it
would be nice to highlight aspects of game design (if there are any)
that are important, and perhaps only reasonable to consider, if one has
a language like Scheme.

Johan Toki Persson

May 17, 2005, 10:02:11 AM
You're welcome! ;-)

I do think there should be, as you mentioned in your first post,
something of a "SICP of game design", but I suspect that such a book or
paper would be too narrow for a general audience: people interested in
game algorithms tend to look for papers in mathematics and AI, people
interested in graphics tend to focus on very low-level details and use C
(or, god forbid, assembly), and people interested in game design use
whatever is generally accepted (i.e. C/C++).

Brandon J. Van Every

May 26, 2005, 11:43:22 PM
knubee wrote:

I'm a game designer and 3D graphics programmer. After losing my shirt
financially on my C++ based project, I started looking around at open
source and High Level Language solutions to these problems of game
development. I've been doing that for 2 years. I have broad survey
knowledge of all the open source HLLs out there, and all of their uses
relevant to games. Briefly, aside from scripting, there aren't any. 3D
game developers use C++ for reasons of performance, tools maturity,
habit, and familiarity. Scripting tends to be done in Lua or Python.

A teeny tiny number are using C#. Apparently Microsoft has beaten its
marketing drum often enough to get a few people to swallow this slow
language. There are, to my knowledge, 2 company Testimonials about the
use of C# in the game industry, to the effect that it saved them time
and money. This is up from zero 15 months ago. In another 15 months, I'll ask
again, and we'll see if the number has increased. See my post in the
microsoft.public.win32.programmer.directx.managed newsgroup, search for
"Testimonials" in the Subject line. Or try to click on this horrible URL:
http://www.microsoft.com/communities/newsgroups/en-us/default.aspx?dg=microsoft.public.win32.programmer.directx.managed&tid=721f4fa8-fe0d-4b0f-af6c-e5e7c8050d10&cat=en_US_c6ed0388-4263-41e4-a757-f92d4a911ca5&lang=en&cr=US&sloc=en-us&m=1&p=1

Part of the reason 3D game programmers aren't touching Functional
Programming is because the paradigm has nothing to say about things like
shoving vertices into buffers. I'll confess that I've only recently
come to understand continuations, and I still have only a half-assed
understanding of monads. The latter paradigm seems primarily concerned
with anal
correctness of an ordering, rather than making it fast. Anal
correctness is not a value add to a game developer. At least, not when
performance is required and the correctness comes at great expense to
performance, plus a horrible learning curve with new paradigms and new
toolsets.

Anyways, the point is imperative programming isn't bullshit. A lot of
programming problems are more easily expressed that way. I've gone
around in circles about how FP would be relevant to a 3D engine core,
and I find no relevance. In fact, nowadays I see my designs in terms of
potentially separating the FP and Imperative parts of a program, and
getting them to talk to each other. This is much like separating higher
level OO code from low level optimized C/ASM code on efficiency
grounds. At what point can/should you afford to be Functional? At what
point can/should you afford to be OO?

I don't have any faith in OO anymore, BTW. I think it's sort of a
joke. At least in my case, it led me to an overweening concern with
code reuse, a concern that wasn't justified given how rapidly my
designs were changing. OO at too fine a grain just doesn't
hold up; it's brittle in the face of design changes. Even at a coarser
level, you have all the ISA and HAS problems. OO pretends the world can
be viewed as compartmentalized objects, but really that's an arbitrary
decision. The world is really full of relationships. These
relationships destroy OO hierarchies, particularly single inheritance
hierarchies. The real world looks rather more like a generalized graph,
not even a DAG or "forest."

I see more possibility in the FP notion of functional recombination. I
am not certain, however, whether the compiler technologies really live
up to what one might hope for. I've asked questions about that around
here and have even gotten snotty answers. Some things probably exist
"in the lab," but I haven't seen anything that I would count on for
industrial deployment. That may be true of everything in the Scheme
universe for that matter, but I suspend judgement on that. I don't
pretend to be working on all industrial problems, just my own.

So, congratulations. By your interest in Scheme for 3D game
development, you have placed yourself on the bleeding edge of the game
technology curve. The books that you want are not written. You're
going to have to figure it out yourself. I can tell you right now that
if you want performance + usable tools + Windows, you're screwed.
Currently I've pitched my tent on the Bigloo mound, for performance
reasons. Windows support ain't ready for prime time, but it's coming
along, and most importantly others are working on it so I don't have
to. I'm trying to address the "usable tools" problem from a Windows
game developer's standpoint. As you can see from my other post about
OpenGL, I'm pretty much at square one with this.

--
Cheers, www.indiegamedesign.com
Brandon Van Every Seattle, WA

"The pioneer is the one with the arrows in his back."
- anonymous entrepreneur

Alex Shinn

May 27, 2005, 1:53:12 AM
>>>>> "BE" == Brandon J Van Every <mylastname...@mycompanyname.com> writes:

BE> So, congratulations. By your interest in Scheme for 3D game
BE> development, you have placed yourself on the bleeding edge of
BE> the game technology curve. The books that you want are not
BE> written. You're going to have to figure it out yourself.

That's what attracts many people to Scheme. We want to figure things
out for ourselves.

If you wish to do only what the masses have done before you, stick to
the languages the masses use.

--
Alex

Brandon J. Van Every

May 27, 2005, 3:06:40 AM
Alex Shinn wrote:

So where are the Scheme games? I know of a few open source ones, but I
can't point to any commercial ones. There's a big difference between
tinkering and getting things done. Currently, with the tools available
it is very hard to get things done.

BTW I may have errantly assumed that the OP cared about 3D and
performance. It's a typical game developer concern but not universally
held.

Ray Dillinger

May 27, 2005, 4:30:38 AM
Brandon J. Van Every wrote:
> Alex Shinn wrote:
>> BE> So, congratulations. By your interest in Scheme for 3D game
>> BE> development, you have placed yourself on the bleeding edge of
>> BE> the game technology curve. The books that you want are not
>> BE> written. You're going to have to figure it out yourself.
>>
>> That's what attracts many people to Scheme. We want to figure things
>> out for ourselves.
>>
>> If you wish to do only what the masses have done before you, stick to
>> the languages the masses use.
>
> So where are the Scheme games? I know of a few open source ones, but I
> can't point to any commercial ones.

Well, I think maybe you should look at Naughty Dog Software (publishers
of Jak & Daxter). I have heard that they are doing mostly scheme-based
game development. As with most commercial vendors though, they are
likely to be of very limited help to someone who intends to be their
competitor.

Bear


Matthias Buelow

May 27, 2005, 7:05:05 AM
Brandon J. Van Every wrote:

> So where are the Scheme games? I know of a few open source ones, but I
> can't point to any commercial ones. There's a big difference between

You assume that there is a significant intersection between Scheme (or
ML/Haskell etc.) programmers, and commercial game programmers. I don't
think there is. Game companies have their tools which work well for
them. Scheme programmers have their tools which are relevant for their
work, and are happy that others have developed games that one can play
in some free time, to let some steam off. While there probably is a
significant interest in the Scheme community to play games, I doubt that
there's a large number of people who'd actually want to write games
(like Quake, Doom3, etc.)

mkb.

Johan Toki Persson

May 27, 2005, 8:51:38 AM
What makes me curious about using functional languages in game
development is the completely new way of thinking. I think it's crazy to
replace C/C++/ASM outright with functional languages, since that won't
bring forth their power. The core, I believe, should be written as a set
of well-defined modules in a low-level language and then interfaced to a
more high-level alternative.

Mike Wiering's paper gives a pretty good picture of how to declare what
the game is, instead of defining what the game will do. I believe an
approach like this will reduce programming time and make game creation
simpler, and perhaps more creative.
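
For what it's worth, here is a toy sketch of that declarative flavour in
Scheme (every name here is invented for illustration, not taken from
Wiering's thesis): the game is declared as plain data, and a separate,
generic interpreter decides what to do with it.

```scheme
;; Declare what a level *is* -- plain data, no behaviour.
;; (All names are illustrative, not from any real library.)
(define level-1
  '(level (name "Meadow")
          (objects (player  (pos 1 1))
                   (coin    (pos 4 2))
                   (monster (pos 7 5)))))

;; A separate, generic walker interprets the declaration.
(define (objects-of level)
  (cdr (assq 'objects (cdr level))))

(display (map car (objects-of level-1)))  ; prints (player coin monster)
```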

Joe Marshall

May 27, 2005, 10:35:41 AM
"Brandon J. Van Every" <mylastname...@mycompanyname.com> writes:

> Anyways, the point is imperative programming isn't bullshit.

It isn't *always* bullshit, but it is very nearly always inferior.

> A lot of programming problems are more easily expressed that way.

*Some* are, but most are not. Imperative programming has an implicit
dependence upon time. It is straightforward to make that dependence
explicit and turn the imperative program into a functional one. The
resulting program can *usually* be simplified into something quite
easy to understand.

Time, however, appears to move in one direction only (barring the
occasional bouts of deja vu). Imperative programs take advantage of
this by re-using the `historical' data structures. When programming
imperatively, it is incumbent upon the programmer to know when re-use
is valid. This is one of the big burdens of imperative programming.
Functional programming frees you from having to `just know' when data
structures can be re-used.
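
The transformation described above can be shown in a few lines of Scheme
(a minimal sketch with invented names): the hidden `time' of a mutation
becomes an explicit succession of values.

```scheme
;; Imperative: the score's history is implicit; each set! destroys
;; the previous state.
(define score 0)
(define (award! points) (set! score (+ score points)))

;; Functional: the same update with the time dependence made explicit.
;; Every intermediate state remains a valid, re-usable value.
(define (award score points) (+ score points))

(define s0 0)
(define s1 (award s0 10))
(define s2 (award s1 25))
(display (list s0 s1 s2))  ; prints (0 10 35)
```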

Resources are finite, though, and re-use is necessary. The functional
paradigm makes it easy to automatically manage some of these resources
(through garbage collection), but it also *requires* some form of
automatic management because the programmer isn't supposed to have to
think about it. Functional programming is *generally* more resource
hungry than imperative programming.

Programs are becoming larger and more complex, and resources are
becoming cheaper. Functional programs scale nicely, but imperative
ones do not. Each new imperative construct adds more constraints for
determining whether a data structure can be re-used. At a certain
point, it becomes impossible to reason about whether re-use is
possible. To defend against premature re-use, the imperative
programmer is forced to copy data structures. This can be *far* more
resource intensive than the equivalent functional program.
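
A concrete illustration of why the functional version need not copy (a
toy sketch): a functional `update' of a list allocates one new cell and
shares the rest.

```scheme
;; An 'update' in functional style: one new pair is allocated,
;; and the entire old list is shared as the tail -- not copied.
(define inventory  (list 'sword 'shield 'potion))
(define inventory* (cons 'torch inventory))

(display (eq? (cdr inventory*) inventory))  ; prints #t -- shared, not copied
```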

> I've gone around in circles about how FP would be relevant to a 3D
> engine core, and I find no relevance.

It is unfortunate that the overwhelming majority of 3D programming and
research is done in a rigidly imperative style. It is difficult to
impossible to find *anything* about 3D rendering that has a functional
approach. The most popular 3D APIs (OpenGL and DirectX) are
completely imperative and full of implied state.

Yet a functional approach would be relevant. First, look at Sussman's
Digital Orrery and Supercomputer Toolkit (Berlin and Surati). By
starting with a *functional* high-level language and applying partial
evaluation, they were able to generate low-level *imperative* code
with significant performance.

How is this relevant? The bottleneck for Sussman et al. was pumping
data through a floating point processor. The high-level functional
description of the algorithm allowed the partial evaluator to schedule
floating-point operations at 95%+ utilization. The bottleneck for 3D
graphics is pumping polygons and textures through the graphics card.
Partial evaluation should help tremendously.

In fact, it already does. The entire `first-person shooter' industry
would not exist if it were not for the BSP tree. This is quite simply
a partially evaluated scene description graph. The well-known
limitation of no moving walls in the original Doom engine is a direct
result of this data structure being purely functional. (Not because
of any `functional is better' philosophy --- it is just extremely hard
to efficiently recompute the BSP tree when things move.)
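
The purely functional character of the BSP tree is easy to see in a toy
1-D version (a sketch, nothing like Doom's actual code): the tree is
immutable data built once, and only the traversal order depends on the
eye position.

```scheme
;; Toy 1-D "BSP": each node splits space at coordinate x.
;; The tree itself never changes; moving the eye only changes
;; the order in which we visit it.
(define tree
  '(node 5 (node 2 (wall a) (wall b))
           (node 8 (wall c) (wall d))))

(define (draw-order t eye)
  (if (eq? (car t) 'wall)
      (list (cadr t))
      (let ((x  (cadr t))
            (lo (caddr t))    ; subtree for coordinates < x
            (hi (cadddr t)))  ; subtree for coordinates >= x
        (if (< eye x)         ; visit the eye's side of the split first
            (append (draw-order lo eye) (draw-order hi eye))
            (append (draw-order hi eye) (draw-order lo eye))))))

(display (draw-order tree 3))  ; prints (b a c d)
```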

> I don't have any faith in OO anymore, BTW. I think it's sort of a
> joke. At least in my case, it led me to an overweening concern with
> code reuse. A consideration that wasn't justified considering how
> rapidly my designs were changing. OO at too fine a grain just doesn't
> hold up; it's brittle in the face of design changes. Even at a
> coarser level, you have all the ISA and HAS problems. OO pretends the
> world can be viewed as compartmentalized objects, but really that's an
> arbitrary decision. The world is really full of relationships. These
> relationships destroy OO hierarchies, particularly single inheritance
> hierarchies. The real world looks rather more like a generalized
> graph, not even a DAG or "forest."

The emperor has been without clothing for some time. It was amusing
at first, but isn't something that you really want to be exposed to on
a daily basis.

> So, congratulations. By your interest in Scheme for 3D game
> development, you have placed yourself on the bleeding edge of the game
> technology curve.

Beyond.

> The books that you want are not written. You're going to have to
> figure it out yourself.

True.

> I can tell you right now that if you want performance + usable tools
> + Windows, you're screwed.

If you want a popular, commercially supported graphics platform for
Windows, you will not find it in Lisp or Scheme. But then you
wouldn't have found it in C++ just a few years ago. The field is
still wide open.

Brandon J. Van Every

May 27, 2005, 1:14:22 PM
Ray Dillinger wrote:

> Brandon J. Van Every wrote:
>
>> Alex Shinn wrote:
>>> That's what attracts many people to Scheme. We want to figure things
>>> out for ourselves.
>>>
>>> If you wish to do only what the masses have done before you, stick to
>>> the languages the masses use.
>>
>> So where are the Scheme games? I know of a few open source ones, but
>> I can't point to any commercial ones.
>
> Well, I think maybe you should look at Naughty Dog Software (publishers
> of Jak & Daxter). I have heard that they are doing mostly scheme-based
> game development.

They did that project in something homebrewed called Game Object
Assembly Lisp (GOAL). Franz Inc. apparently helped them on it. I have
heard nothing about Naughty Dog dumping GOAL for Scheme. I think you
are confusing development that's "like Scheme" for development that "is
Scheme."

--
Cheers, www.indiegamedesign.com
Brandon Van Every Seattle, WA

When no one else sells courage, supply and demand take hold.

Brandon J. Van Every

May 27, 2005, 1:22:54 PM
Matthias Buelow wrote:

> Brandon J. Van Every wrote:
>
>> So where are the Scheme games? I know of a few open source ones, but
>> I can't point to any commercial ones. There's a big difference between
>
> You assume that there is a significant intersection between Scheme (or
> ML/Haskell etc.) programmers, and commercial game programmers.

*I* assume? I think you put words in my mouth for no reason. I'm very
familiar with commercial game developers' almost complete lack of
interest in anything FP.

> I don't think there is. Game companies have their tools which work
> well for them. Scheme programmers have their tools which are relevant
> for their work, and are happy that others have developed games that
> one can play in some free time, to let some steam off. While there
> probably is a significant interest in the Scheme community to play
> games, I doubt that there's a large number of people who'd actually
> want to write games (like Quake, Doom3, etc.)

There are always many CS students who want to write games. Then
industry gets ahold of them, whether game industry or some other
segment, and squashes any progressive language tendencies out of them.
To use tools that industry doesn't use is to be a pioneer. Probably
that means individual protagonists must get the initial gruntwork done,
and can't rely on communities for anything.

--
Cheers, www.indiegamedesign.com
Brandon Van Every Seattle, WA

Taking risk where others will not.

Brandon J. Van Every

May 27, 2005, 1:24:59 PM
Johan Toki Persson wrote:

> What makes me curious about using functional languages in game
> development is the completely new way of thinking. I think it's crazy
> to replace C/C++/ASM outright with functional languages, since that
> won't bring forth their power. The core, I believe, should be written
> as a set of well-defined modules in a low-level language and then
> interfaced to a more high-level alternative.

Why? The most relevant core primitive to my game AI code is probably
the list. It matters a helluva lot for 3D scene graphs too.

--
Cheers, www.indiegamedesign.com
Brandon Van Every Seattle, WA

When no one else sells courage, supply and demand take hold.

Brandon J. Van Every

May 27, 2005, 1:59:30 PM
Joe Marshall wrote:

> "Brandon J. Van Every" <mylastname...@mycompanyname.com> writes:
>
>> Anyways, the point is imperative programming isn't bullshit.
>
> It isn't *always* bullshit, but it is very nearly always inferior.

That says buckets about the problem domains you've chosen to take on.
From where I sit, worrying about filling vertex buffers, FP is the
inferior concept. I think most FPers spend the vast majority of time on
"language transformation" problems, where FP is an asset rather than a
liability. FP has nothing to say about hardware IO.

>> A lot of programming problems are more easily expressed that way.
>
> *Some* are, but most are not.

Again, there is nothing more at work here than the Law Of Selective
Observation. The overwhelming majority of code in the world is
imperative. Industry is stupid, but seen from 10,000 miles up, that much
imperative code cannot be completely stupid. The FP "language
transformation" guys are the minority fringe of programmatic activity,
not the other way around.

> Imperative programming has an implicit dependence upon time.

So does life. Time is money.

> It is straightforward to make that dependence explicit and turn the
> imperative program into a functional one.

No it isn't. Swallowing papers on monads, for instance, is damn
convoluted. That's why most programmers don't bother.

> Functional programming frees you from having to `just know' when data
> structures can be re-used.

At the enormous sacrifice of performance in some cases, like hardware
IO. The engineering-oriented programmers aren't interested in this
"performance vs. safety" trade.

> Programs are becoming larger and more complex, and resources are
> becoming cheaper.

When perfect human beings are being modeled in realtime 3D on virtual
machines, I'll agree that resources are 'cheap'. We're quite some ways
off from that being true though.

> Functional programs scale nicely, but imperative ones do not.

Nonsense. Functional programs like recursive list stuff and imperative
programs like flat arrays. Whether it scales depends on whether your
problem is better expressed as a list hierarchy or a flat array.

> At a certain point, it becomes impossible to reason about whether
> re-use is possible.

See, this is the overriding concern of the FP advocate. They want to
reason about their code. This is a strange proposition to the vast
majority of industry. They don't want to reason about it, they want it
to be easy to implement and run fast! FP advocates will make enormous
performance and ease-of-use sacrifices in order to get something they
can reason about. Even if there aren't sufficient resources to handle
the problem on real hardware.

> To defend against premature re-use, the imperative programmer is
> forced to copy data structures. This can be *far* more resource
> intensive than the equivalent functional program.

It is silly to talk about FP and pretend that data structures aren't
copied. The whole paradigm is, essentially, "Thou Shalt Copy."

>> I've gone around in circles about how FP would be relevant to a 3D
>> engine core, and I find no relevance.
>
> It is unfortunate that the overwhelming majority of 3D programming and
> research is done in a rigidly imperative style. It is difficult to
> impossible to find *anything* about 3D rendering that has a functional
> approach. The most popular 3D APIs (OpenGL and DirectX) are
> completely imperative and full of implied state.

The basic 3D graphics pipeline is a very well understood problem, with
enormous industrial commitments behind it that have brought us
'supercomputer' visualization power for a $50 card. Face it, FP is no
good for reasoning about Von Neumann computing architectures. Other HW
approaches have been tried in the marketplace, and they have lost. Now
in another 20 years who knows, maybe we'll see a rebirth, but they're
dead for now.

Tons of stuff in 3D software could be done in FP. In practice, however,
it often isn't, because the lower-level tool chains are so strongly
imperative. An FP OS wouldn't hurt either.

The reason this is all so obvious to me is I've always come at these
problems from the industrial standpoint, not the academic theoretical
I-want-to-reason-about-it standpoint. "Does it work? Will it save me
time and money?" For my problems, the answers have overwhelmingly been
no and no. FP has some great tools for the problems FPers
typically work on; they're abysmal for other things. I see the whole
programming universe in terms of tools nowadays. They torture me
daily. For instance, I've spent the entire week fighting Cygwin and
MinGW builds, to get something that both works and doesn't encumber my
own code with a GPL. There is no way I could possibly sell a mission
critical business on using these tools I'm trying out. If they didn't
laugh, and took me seriously, they'd become very angry with me once
they'd gone down my road.

Why bother at all? Because I do have more of a researcher and
aestheticist in me than a pragmatist. I believe there is something I
can do with this stuff, that will put me ahead of the competitive curve
for a long time. But it's really a leap of faith. I'm paying big dues
right now.

> Yet a functional approach would be relevant. First, look at Sussman's
> Digital Orrery and Supercomputer Toolkit (Berlin and Surati). By
> starting with a *functional* high-level language and applying partial
> evaluation, they were able to generate low-level *imperative* code
> with significant performance.
>
> How is this relevant? The bottleneck for Sussman et al. was pumping
> data through a floating point processor. The high-level functional
> description of the algorithm allowed the partial evaluator to schedule
> floating-point operations at 95%+ utilization. The bottleneck for 3D
> graphics is pumping polygons and textures through the graphics card.
> Partial evaluation should help tremendously.

Doing stuff "in parallel" is hardly the province of FP alone.
Specifically, it's not enough to say "oh it's FP, the parallel
performance is going to be great." You need an implementation that's
actually geared to do such a thing. Far from all FP language projects
have that as their raison d'être. So the fair comparison is not an FP
supercomputer guy vs. an imperative Java guy, but an FP supercomputer
guy vs. an imperative supercomputer guy.

> In fact, it already does. The entire `first-person shooter' industry
> would not exist if it were not for the BSP tree.

All commodity 3D HW has implemented a Z-buffer, not a BSP tree. We owe
the industry to the ex-SGI engineers of 3Dfx and NVIDIA, not to Doom.
If there had been no BSP tree, the FPS industry would simply have been
delayed until commodity HW could handle it. You're talking about a
delay of roughly 3 years. Big deal.

--
Cheers, www.indiegamedesign.com
Brandon Van Every Seattle, WA

"witch-hunt" - (noun) (Date: 1885)
1: a searching out for persecution of persons accused
of witchcraft
2: the searching out and deliberate harassment of
those (as political opponents) with unpopular views
- witch-hunter (noun)
- witch-hunting (noun or adjective)

Joe Marshall

May 27, 2005, 4:43:59 PM
I know I shouldn't.....

>> "Brandon J. Van Every" <mylastname...@mycompanyname.com> writes:
>>
>>> Anyways, the point is imperative programming isn't bullshit.

> Joe Marshall wrote:
>>
>> It isn't *always* bullshit, but it is very nearly always inferior.

"Brandon J. Van Every" <mylastname...@mycompanyname.com> writes:

> That says buckets about the problem domains you've chosen to take
> on. From where I sit, worrying about filling vertex buffers, FP is
> the inferior concept.

That's the problem: you are worrying about the minutiae of filling
vertex buffers when you should be worrying about your goal of
producing an image.

> I think most FPers spend the vast majority of time on "language
> transformation" problems...

That isn't my experience.

> FP has nothing to say about hardware IO.

No more nor less than imperative programming. It is a style of
programming, not an API. You can treat hardware imperatively ---
bring a0 high to load the register --- or functionally --- the image
on the screen is a function of the contents of the graphics buffer.

>>> A lot of programming problems are more easily expressed that way.
>>
>> *Some* are, but most are not.
> Again, there is nothing more at work here than the Law Of Selective
> Observation. The overwhelming majority of code in the world is
> imperative. Industry is stupid, but seen from 10,000 miles up, so
> much imperative code is not completely stupid. The FP "language
> transformation" guys are the minority fringe of programmatic activity,
> not the other way around.

Your point seems to be that since a lot of people do it that way, it
must be the easiest way.

>> It is straightforward to make that dependence explicit and turn the
>> imperative program into a functional one.
> No it isn't.

Is too!

At each point in time the next state of the machine is a function of
the immediately preceding state. It's a trivial observation.
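
That trivial observation is the whole basis of `functional' game loops
(a toy sketch with invented names): the game is a fold of a pure step
function over the input history.

```scheme
;; One step: the world at time t+1 is a pure function of the world
;; at time t and the input that arrived.  (Toy world: an x position.)
(define (step world input)
  (cond ((eq? input 'left)  (- world 1))
        ((eq? input 'right) (+ world 1))
        (else world)))

;; The whole game run is a fold of step over the input history.
(define (run world inputs)
  (if (null? inputs)
      world
      (run (step world (car inputs)) (cdr inputs))))

(display (run 0 '(right right left right)))  ; prints 2
```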

> Swallowing papers on monads, for instance, is damn convoluted.
> That's why most programmers don't bother.

True. But monads aren't about making time dependence explicit (which
is easy) but about exploiting the fact that time moves in but one
direction and selectively exposing the time dependence only where
necessary and hiding it elsewhere (which is *really* hard).

>> Functional programming frees you from having to `just know' when data
>> structures can be re-used.
>>
> At the enormous sacrifice of performance in some cases, like hardware
> IO. The engineering-oriented programmers aren't interested in this
> "performance vs. safety" trade.

This comes up again and again: Functional programming is dismissed
out of hand as `performing badly' and therefore `uninteresting'.
Truly engineering-oriented programmers would consider whether
functional programming offers a possible solution to the engineering
problem at hand rather than simply following the herd.

>>Programs are becoming larger and more complex, and resources are
>>becoming cheaper.
>>
> When perfect human beings are being modeled in realtime 3D on virtual
> machines, I'll agree that resources are 'cheap'. We're quite some
> ways off from that being true though.

I didn't say cheap, I said `cheaper'. What was completely infeasible
a decade ago is now commonplace. For example, MIT Scheme was
considered a horrendously bloated resource hog back in the 80s. At a
tad over 10MB for both the executable and a heap image, it is
positively svelte compared to the monstrosity that is Visual Studio.

>> Functional programs scale nicely, but imperative ones do not.
>>
> Nonsense. Functional programs like recursive list stuff and
> imperative programs like flat arrays. Whether it scales depends on
> whether your problem is better expressed as a list hierarchy or a flat
> array.

You are looking at too narrow a view. Consider a program with tens of
thousands of subroutines and thousands of data structures. If this
program is purely functional, then the kinds of interaction between
different parts of the program are very limited. A function can only
affect another if it is directly in the call chain. As new functions
are added to the system, the behavior of the old parts of the system
remains constant. You need to take into account how the new routines
interact with the old ones, but you need not be concerned with how the
old ones interact with each other.

If the program is imperative, however, a function can easily affect
another by modifying a shared data structure. A new subroutine can
cause existing routines to stop working. Extending the system
requires taking into account the details of how the existing routines
interact through potentially shared data.

It is this level of scaling that I am talking about, not the selection
of lists versus arrays.
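
A toy Scheme sketch of the difference (all names here are invented for
illustration):

```scheme
;; Imperative: a new routine can silently change an old one's behavior
;; through shared state.
(define inventory (list 'sword 'shield))

(define (has-sword?)               ; old routine
  (and (memq 'sword inventory) #t))

(define (drop-everything!)         ; new routine, added later
  (set! inventory '()))            ; HAS-SWORD? now answers differently

;; Functional: the old routine depends only on its arguments, so adding
;; new code cannot affect the result of any existing call.
(define (has-sword2? inv)
  (and (memq 'sword inv) #t))

(define (drop-everything inv)
  '())                             ; returns a new value; INV is untouched
```

In the second style, extending the system means checking how the new
routine uses the old ones, never how the old ones might now interact.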

>> At a certain
>>point, it becomes impossible to reason about whether re-use is
>>possible.
>>
> See, this is the overriding concern of the FP advocate. They want to
> reason about their code. This is a strange proposition to the vast
> majority of industry.

Yes, it does appear to be the case that applying thought to a problem
is a foreign concept.

> They don't want to reason about it, they want it to be easy to
> implement and run fast! FP advocates will make enormous performance
> and ease-of-use sacrifices in order to get something they can reason
> about. Even if there aren't sufficient resources to handle the
> problem on real hardware.

Again, you misunderstand. I'm not talking about automated reasoning,
I'm talking about simple human understanding. If you can't understand
the code, you can't make it run at all, let alone run fast. You need
to be able to plan a solution to your coding problem. If you cannot
perform simple reasoning, like determining whether a module computes
the answer you want, then you won't get very far.

>> To defend against premature re-use, the imperative
>>programmer is forced to copy data structures. This can be *far* more
>>resource intensive than the equivalent functional program.
>>
> It is silly to talk about FP and pretend that data structures aren't
> copied. The whole paradigm is, essentially, "Thou Shalt Copy."

You clearly have no idea what you are talking about. In a pure
functional style, where *nothing* is modified, there is not even a
rational concept of `copying'. That is the entire point! The
identity of an object is no longer a mysterious intrinsic property
somehow tied to the object's history but rather a simple compositional
property of the object's observables.
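
A small sketch of why `copying' isn't even a meaningful operation on
immutable data: persistent lists share structure instead.

```scheme
;; "Extending" an immutable list copies nothing; the old list simply
;; becomes the tail of the new one.
(define base '(b c d))
(define extended (cons 'a base))    ; (a b c d)

(eq? (cdr extended) base)           ; => #t: the very same object, not a copy
```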

>>>I've gone around in circles about how FP would be relevant to a 3D
>>>engine core, and I find no relevance.
>>>
>>
>>It is unfortunate that the overwhelming majority of 3D programming and
>>research is done in a rigidly imperative style. It is difficult to
>>impossible to find *anything* about 3D rendering that has a functional
>>approach. The most popular 3D APIs (OpenGL and DirectX) are
>>completely imperative and full of implied state.
>>
> The basic 3D graphics pipeline is a very well understood problem, with
enormous industrial commitments behind it that have brought us
> 'supercomputer' visualization power for a $50 card. Face it, FP is no
> good for reasoning about Von Neumann computing architectures. Other
> HW approaches have been tried in the marketplace, and they have lost.
> Now in another 20 years who knows, maybe we'll see a rebirth, but
> they're dead for now.

You've made quite a leap from a $50 card to Von Neumann architectures.
I don't see how this is relevant to functional programming.

> The reason this is all so obvious to me is I've always come at these
> problems from the industrial standpoint, not the academic theoretical
> I-want-to-reason-about-it standpoint. "Does it work? Will it save me
> time and money?"

I don't see how you can answer those questions without reasoning about
the program. Not automated reasoning --- real human being thinking.

> I see the whole programming universe in terms of tools nowadays.
> They torture me daily.

Is it because they torture you daily that you came to me?

Please go on.

> For instance, I've spent the entire week fighting Cygwin and
> MinGW builds, to get something that both works and doesn't encumber my
> own code with a GPL. There is no way I could possibly sell a mission
> critical business on using these tools I'm trying out. If they didn't
> laugh, and took me seriously, they'd become very angry with me once
> they'd gone down my road.

Earlier you said that you see the whole programming universe in terms
of tools nowadays.

> Why bother at all?

Indeed.

Brandon J. Van Every

unread,
May 27, 2005, 6:08:03 PM5/27/05
to
Joe Marshall wrote:

>>FP has nothing to say about hardware IO.
>>
>>
>
>No more nor less than imperative programming. It is a style of
>programming, not an API. You can treat hardware imperatively ---
>bring a0 high to load the register --- or functionally --- the image
>on the screen is a function of the contents of the graphics buffer.
>
>

One is practical and the other isn't.

>>Again, there is nothing more at work here than the Law Of Selective
>>Observation. The overwhelming majority of code in the world is
>>imperative. Industry is stupid, but seen from 10,000 miles up, so
>>much imperative code is not completely stupid. The FP "language
>>transformation" guys are the minority fringe of programmatic activity,
>>not the other way around.
>>
>>
>
>Your point seems to be that since a lot of people do it that way, it
>must be the easiest way.
>
>

Not just a lot, the overwhelming majority. For many classes of
problem, yes, absolutely it is. Thinking FP is a cure-all is foolish.

>>> It is straightforward to make that dependence
>>>explicit and turn the imperative program into a functional one.
>>>
>>>
>>>
>>No it isn't.
>>
>>
>
>Is too!
>
>At each point in time the next state of the machine is a function of
>the immediately preceding state. It's a trivial observation.
>
>

You're a performance pig.

>This comes up again and again: Functional programming is dismissed
>out of hand as `performing badly' and therefore `uninteresting'.
>Truly engineering-oriented programmers would consider whether
>functional programming offers a possible solution to the engineering
>problem at hand rather than simply following the herd.
>
>

I have, in great depth, and have found it wanting for 3D HW IO
problems. If you think I'm stupid, go read the relevant posts in
comp.lang.functional, where more expert people give their opinions.

>>>Programs are becoming larger and more complex, and resources are
>>>becoming cheaper.
>>>
>>>
>>>
>>When perfect human beings are being modeled in realtime 3D on virtual
>>machines, I'll agree that resources are 'cheap'. We're quite some
>>ways off from that being true though.
>>
>>
>
>I didn't say cheap, I said `cheaper'.
>

'Cheaper' is not interesting. I've got big 3D and AI problems to solve.

>You are looking at too narrow a view. Consider a program with tens of
>thousands of subroutines and thousands of data structures.
>

Why? These are not my problems. I'm not interested in corporate-sized
codebases. If I were, I'd damn well be sure to take OSes, APIs, and
installed base into account. Your concerns about big software are also
theoretical. Show me actual huge FP systems vs. actual huge imperative
systems, then we'll talk about relative performance (de)merits.

>>See, this is the overriding concern of the FP advocate. They want to
>>reason about their code. This is a strange proposition to the vast
>>majority of industry.
>>
>>
>
>Yes, it does appear to be the case that applying thought to a problem
>is a foreign concept.
>
>

Artillerymen knew how to hit targets with cannonballs 300 years before
anyone had a formal theory of how it worked. The idea that science and
engineering are or should be synonymous is very recent in human
history. Try "Science and Technology in World History: An Introduction"
for details. ISBN 0-8018-5869-0. "Cookbook" approaches are a perfectly
acceptable way of making technological progress, they're just boring to
people who like to think.

>>They don't want to reason about it, they want it to be easy to
>>implement and run fast! FP advocates will make enormous performance
>>and ease-of-use sacrifices in order to get something they can reason
>>about. Even if there aren't sufficient resources to handle the
>>problem on real hardware.
>>
>>
>
>Again, you misunderstand. I'm not talking about automated reasoning,
>I'm talking about simple human understanding.
>

So am I. And a number of the FP idioms are damn convoluted for certain
problems.

>>It is silly to talk about FP and pretend that data structures aren't
>>copied. The whole paradigm is, essentially, "Thou Shalt Copy."
>>
>>
>
>You clearly have no idea what you are talking about. In a pure
>functional style, where *nothing* is modified, there is not even a
>rational concept of `copying'.
>

Then implement it. We've got real hardware to worry about.

>>The basic 3D graphics pipeline is a very well understood problem, with
>>enormous industrial commitments behind it that have brought us
>>'supercomputer' visualization power for a $50 card. Face it, FP is no
>>good for reasoning about Von Neumann computing architectures. Other
>>HW approaches have been tried in the marketplace, and they have lost.
>>Now in another 20 years who knows, maybe we'll see a rebirth, but
>>they're dead for now.
>>
>>
>
>You've made quite a leap from a $50 card to Von Neumann architectures.
>I don't see how this is relevant to functional programming.
>
>

I imagine you don't.

>>I see the whole programming universe in terms of tools nowadays.
>>They torture me daily.
>>
>>
>
>Is it because they torture you daily that you came to me?
>
>Please go on.
>
>

I'm pretty much done with trying to get Cygwin to behave without GPL
restrictions, or MinGW to basically work. The GNU toolchain is deeply
stacked and it takes forever to build under MinGW, if it builds at all.
I'm going to let other people worry about MinGW packaging technology for
a few years before considering it again. Cygwin is well packaged, but I
seriously doubt half that stuff is going to work when I try to get
-fno-cygwin respected by the builds. My loss of time this week chasing
these issues has been abominable. Far from saving me time and making me
more productive, the supposed point of my open source investments, it
has kept me up all night several nights and made me mildly physically
ill. I'm supposed to be out turning a buck right now, and I simply don't
have the energy because MinGW took it all. Nor did Frankensteining
MinGW into a Cygwin environment work. Or vice versa. I've pretty much
tried what can be tried. There's no strategic reason to have any
confidence in this stuff at all. It sucks, and it is going to continue
to suck for quite a while.

--
Cheers, www.indiegamedesign.com
Brandon Van Every Seattle, WA

On Usenet, if you're not an open source hippie who
likes to download and play with programming toys
all day long, there's something wrong with you.

Matthias Buelow

unread,
May 28, 2005, 6:06:14 PM5/28/05
to

I'd think such initiative would only come from smaller, less well known
companies. The big players like Id and Valve are under huge pressure to
deliver new titles fast, and they already have a toolchain that works.
I'd think there isn't much room for experimentation here.

mkb.


Brandon J. Van Every

unread,
May 28, 2005, 7:57:06 PM5/28/05
to
Matthias Buelow wrote:

Plus they have lotsa money to buy their way out of problems. You don't
need to innovate that hard when you're an industry leader with lotsa
capital. Capital gets all sorts of things done. On the other hand,
it's also an albatross around the neck. As contemporary AAA game
budgets are multi-million dollar affairs, established companies take
fewer and fewer risks with them. Eventually the boredom is so painful
that someone invents something...

thelifter

unread,
May 29, 2005, 5:29:14 PM5/29/05
to
Brandon J. Van Every wrote:

> Part of the reason 3D game programmers aren't touching Functional
> Programming is because the paradigm has nothing to say about things like
> shoving vertices into buffers. I'll confess that only recently have I

Well, at least Lisp isn't a pure functional language, since it also
supports an imperative style, and I think the same applies to Scheme.

I think there are some possible advantages to Scheme/Lisp:

Simpler syntax (parentheses): makes it easier to refactor your code,
etc.

But the best thing: macros! Did you investigate the possibility of
simplifying a lot of the code using macros? I mean, you could generate
a lot of the low-level stuff.
Of course this could also be done with imperative languages to some
extent. I remember in Michael Abrash's black book there was an example
of an assembly program that was generated by another program.
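
As a rough sketch of the idea (the vertex layout and all names here are
invented), a define-macro-style macro can stamp out repetitive
low-level accessors instead of writing them by hand:

```scheme
;; A hypothetical macro that generates accessors for a vertex stored as
;; a flat vector: (x y z nx ny nz u v).  Uses the non-hygienic
;; DEFINE-MACRO found in many Scheme implementations.
(define-macro (define-vertex-fields . names)
  `(begin
     ,@(let loop ((ns names) (i 0) (acc '()))
         (if (null? ns)
             (reverse acc)
             (loop (cdr ns) (+ i 1)
                   (cons `(define (,(string->symbol
                                     (string-append
                                      "vertex-" (symbol->string (car ns))))
                                   v)
                            (vector-ref v ,i))
                         acc))))))

(define-vertex-fields x y z nx ny nz u v)
;; expands into VERTEX-X, VERTEX-Y, ..., VERTEX-V, each a VECTOR-REF
```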

Thanks,

thelifter

thelifter

unread,
May 29, 2005, 5:51:54 PM5/29/05
to
Sorry, I forgot to add some ideas:

About the macros again: while I mentioned the example from Michael
Abrash, of course no macro system has the power of Scheme's.

Other advantages of Scheme/Lisp:

-Change the running system without the need to recompile. On large
projects the compile time can get into the hours. Being able to quickly
change code and see the effect immediately can save you this time. When
you are happy with the result you can still compile the code.

-Ease of portability: design a hardware-dependent core of the system,
and do the rest in Scheme. You just need to port the hardware-dependent
parts of the system. Squeak (http://www.squeak.org/) is an example of
this kind of approach.

-You could also decide to just prototype the system in Scheme. When it
is running you can then convert it to C/C++ for greater speed. Ken
Silverman (Duke Nukem 3D) used to code fragments in Basic first, before
he converted them to C.

In some of these ideas I'm supposing you are writing your own specific
Scheme implementation, like the guys from "Naughty Dog" did.

thelifter

Brandon J. Van Every

unread,
May 29, 2005, 6:30:09 PM5/29/05
to
thelifter wrote:

>-Ease of portability: design a hardware-dependent core of the system,
>and do the rest in Scheme. You just need to port the hardware-dependent
>parts of the system. Squeak (http://www.squeak.org/) is an example of
>this kind of approach.
>
>

To a game developer who cares about performance, Scheme is not that
portable. None of the FFIs are standard across Scheme implementations.
Also, Scheme implementations typically build well on some platforms and
poorly or not at all on others. Where "on others" tends to be Windows,
because most Schemes are coming from a Unix universe. Bigloo is no
exception at this time. I'm relying on Cygwin for now, as the licensing
issues haven't actually bitten me yet, and hopefully I can find a way
around them. But supporting a MinGW build of Bigloo is beyond painful
right now. It cost me a week to try to get it to work, and it didn't
work, and that's way too much time. Plus it's not going to get any
better for years yet, so I'm done with MinGW. Meanwhile, the VC++ build
doesn't give you Bee IDE support, and the build isn't even in the main
source pool yet. These issues are going to slowly get solved, but you
really have to be careful about whether development environments and
code are actually portable in practice.

The Lisp universe is a little better in that it has the Universal FFI,
supported by some implementations. But apparently not all. There's
also a dearth of compiled, open source Lisps to try out on Windows.
There's GNU Common Lisp, that's all I'm aware of. I can't remember if
it has licensing problems. IIRC it's Lisp -> C so it'll at least have
the same problems as Bigloo for whatever the C backend is. On the
positive side, that also means it's amenable to the same solutions.
Meanwhile there's that guy who's porting CMUCL to Windows, but we'll
have to wait for that brave soul to tell us he's done the deed.

>-You could also decide to just prototype the system in Scheme. When it
>is running you can then convert it to C/C++ for greater speed.
>

C and C++ are not interchangeable paradigms here. You really must make
up your mind which way you are going. I say, C++ will make your life
miserable, don't do it. As for how much you should prototype vs.
actually get production code done, that's your judgement call. I don't
think solo indie developers have time to prototype all day and accept
some infinitely deferred future rewrite. For most of the code, it had
better be fast enough for your purposes the first time you chug through it.

thelifter

unread,
May 29, 2005, 7:13:07 PM5/29/05
to
And what do you think about the macros? See my first posting in this
thread (immediately above the second one).

thelifter

Brandon J. Van Every

unread,
May 29, 2005, 7:15:03 PM5/29/05
to
thelifter wrote:

>And what do you think about the macros? See my first posting in this
>thread (immediately above the second one).
>
>
>

I think I really can only think about this sort of thing ad hoc. May
be useful, may not be, in any given circumstance. I see them, and
metaprogramming in general, as prototyping tools.

--
Cheers, www.indiegamedesign.com
Brandon Van Every Seattle, WA

20% of the world is real.
80% is gobbledygook we make up inside our own heads.

Ulrich Hobelmann

unread,
May 30, 2005, 3:54:59 AM5/30/05
to
thelifter wrote:
> Sorry, I forgot to add some ideas:
>
> About the macros again: while I mentioned the example from Michael
> Abrash, of course no macro system has the power of Scheme's.

s/Scheme's/Lisp's/?

Scheme's macros are the reason why I left Scheme for Lisp. What's
the use of simple syntactic replacement, when you can have full
code generation in a compile-time system?

Lisp allows you something like yacc, or an html-compiler in a
macro. Scheme merely gives you convenient abbreviations.

Plus, so far I haven't seen any tutorial that leads me from
simple, useless macros to powerful macros. And no, I don't want
to do manual CPS and control structures in Scheme macros, like
some crazy people suggested.

If anyone can refute my ugly views about Scheme macros and maybe
provide some useful, but not too unreadable examples, I'd be
thankful. As it is, Common Lisp looks quite good.

--
Don't let school interfere with your education. -- Mark Twain

michele....@gmail.com

unread,
May 30, 2005, 6:23:20 AM5/30/05
to
Ulrich Hobelmann wrote:

> Scheme's macros are the reason why I left Scheme for Lisp. What's
> the use of simple syntactic replacement, when you can have full
> code generation in a compile-time system?

Well, I will not defend syntax-rules and syntax-case, but I will note
that any major Scheme implementation has define-macro, so you
have the full power of Lisp macros at your disposal. Plus, you have
the choice of less power (syntax-rules) or more power
(syntax-case).
You may complain that Scheme macrology is (maybe too) complex,
but not that it is poor.

Michele Simionato

Marcin 'Qrczak' Kowalczyk

unread,
May 30, 2005, 6:59:05 AM5/30/05
to
Ulrich Hobelmann <u.hob...@web.de> writes:

> Scheme's macros are the reason why I left Scheme for Lisp. What's
> the use of simple syntactic replacement, when you can have full code
> generation in a compile-time system?

Lisp macros are unhygienic. It's like having first-class functions
without lexical scoping.

I want hygienic macros which can execute arbitrary code.

Neither Common Lisp nor R5RS provides that. Syntax-case does, but I
don't like the particular way it looks - the notion of hygiene
is IMHO wrong (a new context should be introduced by each syntax
quotation rather than each macro expansion) and the syntax is ugly.
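
For comparison, a minimal example of the hygiene that even plain R5RS
syntax-rules does give you (OR2 is an invented name):

```scheme
;; syntax-rules is hygienic: the macro's X cannot capture the caller's X.
(define-syntax or2
  (syntax-rules ()
    ((_ a b) (let ((x a)) (if x x b)))))

(let ((x 5)) (or2 #f x))   ; => 5, not #f: the two Xs stay distinct
```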

--
__("< Marcin Kowalczyk
\__/ qrc...@knm.org.pl
^^ http://qrnik.knm.org.pl/~qrczak/

rsher...@gmail.com

unread,
May 30, 2005, 8:59:29 AM5/30/05
to
I'm not sure what qualifies as a powerful macro. Do you have an
example of something you can't easily do with syntax-case that you can
with defmacro? Anyway I think the examples linked from this page
qualify: http://people.csail.mit.edu/jhbrown/scheme/

If that doesn't cut it, you can define low-level macros in terms of
syntax-case (this is ripped out of Chicken's sources, but it works
portably with syntax-case).

(define-syntax (defmacro x)
  (syntax-case x ()
    ((_ (name . args) . body)
     #'(defmacro name (lambda args . body)))
    ((_ name transformer)
     #'(define-syntax (name y)
         (syntax-case y ()
           ((k . args)
            (datum->syntax-object
             #'k
             (apply transformer (syntax-object->datum #'args)))))))))


> (defmacro (aif pred consequent alternative)
    `(let ((it ,pred)) (if it ,consequent ,alternative)))

> (aif 2 (* it 3) 'no)
6

There are good reasons to use Common Lisp instead of Scheme, but this
isn't one of them.

Ulrich Hobelmann

unread,
May 31, 2005, 8:47:57 AM5/31/05
to
Marcin 'Qrczak' Kowalczyk wrote:
> Lisp macros are unhygienic. It's like having first-class functions
> without lexical scoping.

So you have to (gensym) all variables that the user doesn't
provide himself. Sure, errors are possible, but in practice this
isn't that bad.
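
The standard Common Lisp idiom being referred to, sketched with an
invented OR2 macro:

```lisp
;; The classic capture bug: BAD-OR2 breaks if the caller's code uses X.
(defmacro bad-or2 (a b)
  `(let ((x ,a)) (if x x ,b)))

(let ((x 5)) (bad-or2 nil x))   ; => NIL, not 5: the macro's X shadows ours

;; GENSYM avoids it by introducing an uncapturable variable name.
(defmacro or2 (a b)
  (let ((tmp (gensym)))
    `(let ((,tmp ,a)) (if ,tmp ,tmp ,b))))

(let ((x 5)) (or2 nil x))       ; => 5
```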

> I want hygienic macros which can execute arbitrary code.

Ok, that would be really cool.

> Neither Common Lisp nor R5RS provide that. Syntax-case does, but I
> don't like the particular way they look like - the notion of hygiene
> is IMHO wrong (a new context should be introduced by each syntax
> quotation rather than each macro expansion) and the synatax is ugly.

I agree with the ugliness, haven't looked at it in depth, though.
Anyway, I'm curious what other ideas are to come in that area...

Ulrich Hobelmann

unread,
May 31, 2005, 8:54:52 AM5/31/05
to
rsher...@gmail.com wrote:
> I'm not sure what qualifies as a powerful macro. Do you have an
> example of something you can't easily do with syntax-case that you can
> with defmacro? Anyway I think the examples linked from this page
> qualify: http://people.csail.mit.edu/jhbrown/scheme/

Hm, the usual code generators (yacc, lex) come to mind. I don't
have to pass weird "string" parameters as in syntax-rules
sometimes. Also, syntax-rules can't create new identifiers (as
might be interesting for OO systems, etc.). I've heard that this is
considered bad style, but not really why.

> If that doesn't cut it, you can define low-level macros in terms of
> syntax-case (this is ripped out of Chicken's sources, but it works
> portably with syntax-case).
>
> (define-syntax (defmacro x)
> (syntax-case x ()
> ((_ (name . args) . body)
> #'(defmacro name (lambda args . body)))
> ((_ name transformer)
> #'(define-syntax (name y)
> (syntax-case y ()
> ((k . args)
> (datum->syntax-object
> #'k
> (apply transformer (syntax-object->datum #'args)))))))) )

Well, for that you need an implementation with syntax-case or
defmacro. I like scheme48, and it has neither. Anyway, sometime
I'll look at the s48 sources. I'm sure there are some nice
syntax-rules examples in there...

> There are good reasons to use Common Lisp instead of Scheme, but this
> isn't one of them.

Yes, there are others, too.

Matthias Blume

unread,
May 31, 2005, 8:59:09 AM5/31/05
to
Ulrich Hobelmann <u.hob...@web.de> writes:

> Marcin 'Qrczak' Kowalczyk wrote:
>> Lisp macros are unhygienic. It's like having first-class functions
>> without lexical scoping.
>
> So you have to (gensym) all variables that the user doesn't provide
> himself. Sure, errors are possible, but in practice this isn't that
> bad.

No, gensym is the solution to only one half (the easy half) of the
hygiene problem.

rsher...@gmail.com

unread,
May 31, 2005, 11:25:18 AM5/31/05
to
Hmm, syntax-case is not as ubiquitous as I thought. A few minutes of
research showed that s48, Bigloo and Gauche do not have it (though
Gauche has low-level macros); Chicken, PLT, Gambit, SISC, and Chez do,
and then I got tired of looking. s48 and Bigloo have alternative,
obscure macro systems (in addition to syntax-rules) that I don't know,
but I too would rather use Common Lisp than be crippled with
syntax-rules (and no reasonably powerful alternative).

David Van Horn

unread,
May 31, 2005, 1:41:08 PM5/31/05
to
rsher...@gmail.com wrote:
> Hmm, syntax-case is not as ubiquitous as I thought.

You do realize there is a portable implementation of syntax-case, right?

David

Marcin 'Qrczak' Kowalczyk

unread,
May 31, 2005, 2:41:13 PM5/31/05
to
Ulrich Hobelmann <u.hob...@web.de> writes:

>> Lisp macros are unhygienic. It's like having first-class functions
>> without lexical scoping.
>
> So you have to (gensym) all variables that the user doesn't provide
> himself.

Not only that; I must also somehow insert references to globally named
things such that they will keep their meanings even if the user of the
macro has shadowed these names for his purposes.

Workaround for this in Common Lisp is possible but ugly: symbols for
such entities need to be interned in a private package, and if some
of them need to be exported too (which is usually the case) then the
exported symbols must be different symbols bound to the same meanings
(not just the same symbols interned in the public package), so that
shadowing them locally by the user of the library doesn't actually
change the meaning of the names inserted by the macros.

I think CL programmers don't bother with that and just create macros
which work only under the assumption that these names are not
shadowed, except that a different rule is used for the COMMON-LISP
package: shadowing its symbols is illegal, so that the blame is moved
from the macro author to the macro user.
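
A sketch of that problem with invented names: a macro's free references
can be re-bound by its user.

```lisp
;; Hypothetical library: a function and a macro whose expansion calls it.
(defun my-neg (x) (- x))
(defmacro negate (form) `(my-neg ,form))

(negate 3)                      ; => -3

;; A user who locally rebinds MY-NEG changes what the expansion calls:
(flet ((my-neg (x) (declare (ignore x)) 'oops))
  (negate 3))                   ; => OOPS, not -3
```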

rsher...@gmail.com

unread,
May 31, 2005, 3:52:20 PM5/31/05
to
I've heard of such a thing, but then I find something like this (Taylor
Campbell in the SRFI-46 archive):

>You might also be thinking 'um, why not just use SYNTAX-CASE?'
>There are several reasons: (1) Do you really _need_ full Scheme at
>macro-expand-time? Probably not, but you _might_ need some of the
>directives, such as UNHYGIENE. (2) Does the _implementor_ want to
>allow for arbitrary Scheme code at macro-expand-time? Maybe, maybe
>not: it introduces lots of problems regarding phase separation and
>syntactic environment towers. (3) SYNTAX-CASE isn't very formally
>defined anywhere that I know of. (4) SYNTAX-CASE isn't implemented
>everywhere, and you can't just tack psyntax onto any old Scheme system
>and expect it to work; hygiene and macro expansion plays a great role
>in constructing an AST from an S-expression, and so it's often very
>tied to the implementation. psyntax works in some implementations
>because they let psyntax do all that work for them (for instance, SISC
>does this), but those that have their _own_ mechanism for this will
>have problems using psyntax. For instance, Scheme48 doesn't support
>SYNTAX-CASE; it would be a _huge_ kludge to make psyntax work in
>Scheme48; and no one is particularly interested in doing it or writing
>a separate implementation of SYNTAX-CASE, as explicit renaming and
>plain SYNTAX-RULES work fine.

Frankly, I don't understand the issues involved, so I'm just going on
Taylor Campbell's authority, but it looks like the bottom line is that,
for practical purposes, syntax-case isn't portable for all values of
portable.

Jens Axel Søgaard

unread,
May 31, 2005, 5:41:35 PM5/31/05
to
rsher...@gmail.com wrote:

> Frankly, I don't understand the issues involved, so I'm just going on
> Taylor Campbell's authority, but it looks like the bottom line is that,
> for practical purposes, syntax-case isn't portable for all values of
> portable.

The portable syntax-case is portable in the sense that it doesn't
rely on any implementation-specific functions. It does require
energy and knowledge to get it to run in a given implementation, but
it shouldn't be impossible.

Whether anybody actually did port it to, say, s48 and bigloo is
another matter, but that doesn't make the syntax-case
implementation less portable.

As for (1) and (2): Yes. I want full Scheme at
macro-expansion-time, and I am willing to endure a
few inconveniences on that account.

--
Jens Axel Søgaard

Ulrich Hobelmann

unread,
Jun 1, 2005, 6:06:05 AM6/1/05
to
Marcin 'Qrczak' Kowalczyk wrote:
> Ulrich Hobelmann <u.hob...@web.de> writes:
>
>
>>>Lisp macros are unhygienic. It's like having first-class functions
>>>without lexical scoping.
>>
>>So you have to (gensym) all variables that the user doesn't provide
>>himself.
>
>
> Not only that; I must also somehow insert references to globally named
> things such that they will keep their meanings even if the user of the
> macro has shadowed these names for his purposes.
>
> Workaround for this in Common Lisp is possible but ugly: symbols for
> such entities need to be interned in a private package, and if some
> of them need to be exported too (which is usually the case) then the
> exported symbols must be different symbols bound to the same meanings
> (not just the same symbols interned in the public package), so that
> shadowing them locally by the user of the library doesn't actually
> change the meaning of the names inserted by the macros.

But packages are necessary anyway, IMHO, and AFAIK a macro
(defined in package X) uses all symbols from within X, so the user
can't just shadow a function from that package, imported or not.
If the user writes FOO, then that's USER:FOO, while the macro uses
M-PACKAGE:FOO, because the macro (and thus its expansion) is
defined in M-PACKAGE.

But in general (without packages), the problem exists. Thanks for
reminding me of that.

> I think CL programmers don't bother with that and just create macros
> which work only under the assumption that these names are not
> shadowed, except that a different rule is used for the COMMON-LISP
> package: shadowing its symbols is illegal, so that the blame is moved
> from the macro author to the macro user.

Yes, probably. And most user-defined macros likely use stuff from
packages, so the user doesn't interfere with them.

Marcin 'Qrczak' Kowalczyk

unread,
Jun 1, 2005, 7:32:35 AM6/1/05
to
Ulrich Hobelmann <u.hob...@web.de> writes:

> But packages are necessary anyway, IMHO, and AFAIK a macro (defined
> in package X) uses all symbols from within X, so the user can't just
> shadow a function from that package, imported or not.

If the user imports package X, symbols exported by X are interned in
the importer's package, so they can be used unqualified. In this case
redefining such symbol locally shadows the imported definition.

Joe Marshall

unread,
Jun 2, 2005, 11:54:14 AM6/2/05
to
"Brandon J. Van Every" <mylastname...@mycompanyname.com> writes:
>>>FP has nothing to say about hardware IO.

> Joe Marshall wrote:
>>
>>No more nor less than imperative programming. It is a style of
>>programming, not an API. You can treat hardware imperatively ---
>>bring a0 high to load the register --- or functionally --- the image
>>on the screen is a function of the contents of the graphics buffer.
>>

"Brandon J. Van Every" <mylastname...@mycompanyname.com> writes:
> One is practical and the other isn't.

It depends. PLAs are programmed by specifying the desired logic
equations, not by specifying the sequence of signals necessary to
configure them. Design of combinational logic starts with a
functional description of the components and the desired output, not
an imperative description of the signal levels.

>>>Again, there is nothing more at work here than the Law Of Selective
>>>Observation. The overwhelming majority of code in the world is
>>>imperative. Industry is stupid, but seen from 10,000 miles up, so
>>>much imperative code is not completely stupid. The FP "language
>>>transformation" guys are the minority fringe of programmatic activity,
>>>not the other way around.
>>>
>>
>>Your point seems to be that since a lot of people do it that way, it
>>must be the easiest way.
>>
> Not just a lot, the overwhelming majority. For many classes of
> problem, yes, absolutely it is. Thinking FP is a cure-all is foolish.

I never said it was. But since there is almost *nothing* in the
literature about doing low-level graphics functionally, there is
precious little evidence that it is a *poor* choice (or a good choice,
either).

>>>> It is straightforward to make that dependence
>>>>explicit and turn the imperative program into a functional one.
>>>>
>>>>
>>> No it isn't.
>>
>>Is too!
>>
>>At each point in time the next state of the machine is a function of
>> the immediately preceding state. It's a trivial observation.
>
> You're a performance pig.

Name calling?

>>This comes up again and again: Functional programming is dismissed
>>out of hand as `performing badly' and therefore `uninteresting'.
>>Truly engineering-oriented programmers would consider whether
>>functional programming offers a possible solution to the engineering
>>problem at hand rather than simply following the herd.
>>
> I have, in great depth, and have found it wanting for 3D HW IO
> problems. If you think I'm stupid, go read the relevant posts in
> comp.lang.functional, where more expert people give their opinions.

I'm less interested in the opinions of others than I am in forming my
own opinion or understanding of the problem.

>>>>Programs are becoming larger and more complex, and resources are
>>>>becoming cheaper.
>>>>
>>>>
>>>When perfect human beings are being modeled in realtime 3D on virtual
>>>machines, I'll agree that resources are 'cheap'. We're quite some
>>>ways off from that being true though.
>>>
>>
>>I didn't say cheap, I said `cheaper'.
>>
> 'Cheaper' is not interesting. I've got big 3D and AI problems to solve.

If `cheaper' is not interesting, the solution is obvious: throw money
at it.

>>You are looking at too narrow a view. Consider a program with tens of
>>thousands of subroutines and thousands of data structures.
>>
> Why? These are not my problems. I'm not interested in
> corporate-sized codebases. If I were, I'd damn well be sure to take
> OSes, APIs, and installed base into account.

You don't think games have tens of thousands of subroutines and
thousands of data structures?

> Your concerns about big software are also theoretical. Show me
> actual huge FP systems vs. actual huge imperative systems, then
> we'll talk about relative performance (de)merits.

Ok, how about Ericsson's AXD 301 ATM Switch?
Or Google's map/reduce architecture?
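(For readers who haven't met the latter: map/reduce is itself a functional-programming export — independent pure "map" steps whose partial results are merged by an associative "reduce". A toy word-count sketch in Python, nothing like Google's actual implementation:)

```python
from collections import Counter
from functools import reduce

def map_phase(doc: str) -> Counter:
    # map: each document is turned into partial counts independently,
    # so this step parallelizes trivially.
    return Counter(doc.split())

def reduce_phase(partials: list[Counter]) -> Counter:
    # reduce: merge partial counts with an associative operation.
    return reduce(lambda a, b: a + b, partials, Counter())

docs = ["the cat", "the dog", "the cat sat"]
total = reduce_phase([map_phase(d) for d in docs])
print(total["the"], total["cat"])
```

The associativity of the merge is what lets the real systems distribute and re-order the work freely.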

>>>See, this is the overriding concern of the FP advocate. They want to
>>>reason about their code. This is a strange proposition to the vast
>>> majority of industry.
>>
>>Yes, it does appear to be the case that applying thought to a problem
>>is a foreign concept.
>>
> Artillerymen knew how to hit targets with cannonballs 300 years before
> anyone had a formal theory of how it worked.

Did they aim lower when overshooting and higher when undershooting?
Or did they just aim at random? Did they guess that larger amounts of
powder propelled the ball further?

>>>The basic 3D graphics pipeline is a very well understood problem, with
>>>enormous industrial commitments behind it that have brought us
>>>'supercomputer' visualization power for a $50 card. Face it, FP is no
>>>good for reasoning about Von Neumann computing architectures. Other
>>>HW approaches have been tried in the marketplace, and they have lost.
>>>Now in another 20 years who knows, maybe we'll see a rebirth, but
>>>they're dead for now.
>>>
>>
>>You've made quite a leap from a $50 card to Von Neumann architectures.
>>I don't see how this is relevant to functional programming.
>>
> I imagine you don't.

Do you care to enlighten us?

Matthias Buelow

unread,
Jun 2, 2005, 10:36:44 PM6/2/05
to
Joe Marshall wrote:

>>'Cheaper' is not interesting. I've got big 3D and AI problems to solve.
>
> If `cheaper' is not interesting, the solution is obvious: throw money
> at it.

He's throwing "3D" and "AI" into the same category...

>>Why? These are not my problems. I'm not interested in
>>corporate-sized codebases. If I were, I'd damn well be sure to take
>>OSes, APIs, and installed base into account.

Surely these "big 3D and AI problems" are solved in a couple hundred
lines of the Holy Grailang, which Sir Brandonlot is seeking but hitherto
hath not uncovered.

mkb.

Duncan Patton

unread,
Jun 23, 2005, 8:01:23 PM6/23/05
to
On Fri, 27 May 2005 17:22:54 GMT
"Brandon J. Van Every" <mylastname...@mycompanyname.com> wrote:

> Matthias Buelow wrote:
>
> > Brandon J. Van Every wrote:
> >

> >> So where are the Scheme games? I know of a few open source ones, but
> >> I can't point to any commercial ones. There's a big difference between
> >
> >
> > You assume that there is a significant intersection between Scheme (or
> > ML/Haskell etc.) programmers, and commercial game programmers.
>
> *I* assume? I think you put words in my mouth for no reason. I'm very
> familiar with commercial game developers' almost complete lack of
> interest in anything FP.
>
> > I don't think there is. Game companies have their tools which work
> > well for them. Scheme programmers have their tools which are relevant
> > for their work, and are happy that others have developed games that
> > one can play in some free time, to let some steam off. While there
> > probably is a significant interest in the Scheme community to play
> > games, I doubt that there's a large number of people who'd actually
> > want to write games (like Quake, Doom3, etc.)
>
> There are always many CS students who want to write games. Then
> industry gets ahold of them, whether game industry or some other
> segment, and squashes any progressive language tendencies out of them.

> To use tools that industry doesn't use is to be a pioneer. Probably
> that means individual protagonists must get the initial gruntwork done,
> and can't rely on communities for anything.
>

This is an interesting thread. I've spent a long time doing functional
programming in the context of real-time systems, and I have long thought
that gaming and FP were on the path to genuine machine intelligence.

Dhu


> --
> Cheers, www.indiegamedesign.com
> Brandon Van Every Seattle, WA
>

> Taking risk where others will not.


--
---------------------------------------

All persons named herein are purely fictional victims
of the Canidian Bagle Breeder's Association.

Save the Bagle!

Sun Ðhu

---------------------------------------

