
Turing award for Moore?


Julian V. Noble

May 17, 2002, 1:54:30 PM
I read in the ACM Membernet News that the 2001 Turing
Award has been conferred:

------------------------------------------------------------------------------
The 2001 A. M. Turing Award, considered the "Nobel Prize of Computing," was
presented by ACM to Ole-Johan Dahl and Kristen Nygaard for their role in the
invention of object-oriented programming, the most widely used programming
model today. Their work has led to a fundamental change in how software
systems are designed and programmed, resulting in reusable, reliable,
scalable applications that have streamlined the process of writing software
code and facilitated software programming. Current object-oriented
programming languages include C++ and Java, both widely used in programming
a wide range of applications from large-scale distributed systems to small,
personal applications, including personal computers, home entertainment
devices, and standalone arcade applications.
------------------------------------------------------------------------------

Does anyone know whether Charles Moore was ever nominated for one for
inventing Forth? IMO it is long overdue if he has not been considered.

The URL for nominations is

http://www.acm.org/awards/award_nominations.html


--
Julian V. Noble
Professor of Physics
j...@virginia.edu

"Science knows only one commandment: contribute to science."
-- Bertolt Brecht, "Galileo".

Mark I Manning IV

May 17, 2002, 2:58:09 PM
Julian V. Noble wrote:
> I read in the ACM Membernet News that the 2001 Turing
> Award has been conferred:
>
> ------------------------------------------------------------------------------
> The 2001 A. M. Turing Award, considered the "Nobel Prize of Computing," was
> presented by ACM to Ole-Johan Dahl and Kristen Nygaard for their role in the
> invention of object-oriented programming, the most widely used programming
> model today. Their work has led to a fundamental change in how software
> systems are designed and programmed, resulting in reusable, reliable,
> scalable applications that have streamlined the process of writing software
> code and facilitated software programming. Current object-oriented
> programming languages include C++ and Java, both widely used in programming
> a wide range of applications from large-scale distributed systems to small,
> personal applications, including personal computers, home entertainment
> devices, and

Bullshit object obfuscation is responsible for 99% of all the
overbloated, dysfunctional bullshit code that's keeping ME from getting
a job. It's OOP and the C/C++ language I HATE most in this world :(

> standalone arcade applications.
> ------------------------------------------------------------------------------
>
> Does anyone know whether Charles Moore was ever nominated for one for
> inventing Forth? IMO it is long overdue if he has not been considered.
>
> The URL for nominations is
>

Now THAT I agree with.

> http://www.acm.org/awards/award_nominations.html
>
>


a...@redhat.invalid

May 17, 2002, 3:30:08 PM
Mark I Manning IV <I4...@mailcity.com> wrote:
> Julian V. Noble wrote:
>> I read in the ACM Membernet News that the 2001 Turing
>> Award has been conferred:
>>
>> ------------------------------------------------------------------------------
>> The 2001 A. M. Turing Award, considered the "Nobel Prize of
>> Computing," was presented by ACM to Ole-Johan Dahl and Kristen
>> Nygaard for their role in the invention of object-oriented
>> programming, the most widely used programming model today.

> Bullshit object obfuscation is responsible for 99% of all the
> overbloated, dysfunctional bullshit code that's keeping ME from
> getting a job.

I doubt it.

> It's OOP and the C/C++ language I HATE most in this world :(

To be fair to Dahl and Nygaard, it's wholly wrong to blame
object-oriented programming for C++: OOP when used correctly
simplifies the programming task rather than obfuscates it.

Beta, Nygaard's current programming language, is actually rather
pretty. Subroutine (method) calls and access to data are unified into
a single concept, simple concurrency is part of the language, and the
weirdness required by the C compatibility of C++ is avoided
altogether.

Andrew.

Elizabeth D. Rather

May 17, 2002, 3:54:10 PM
a...@redhat.invalid wrote:
> ...

> >> ------------------------------------------------------------------------------
> >> The 2001 A. M. Turing Award, considered the "Nobel Prize of
> >> Computing," was presented by ACM to Ole-Johan Dahl and Kristen
> >> Nygaard for their role in the invention of object-oriented
> >> programming, the most widely used programming model today.
> ...

> To be fair to Dahl and Nygaard, it's wholly wrong to blame
> object-oriented programming for C++: OOP when used correctly
> simplifies the programming task rather than obfuscates it.

And when the application has object-like features. Purely
procedural apps (such as many embedded systems) don't benefit
at all.

Cheers,
Elizabeth

--
==================================================
Elizabeth D. Rather (US & Canada) 800-55-FORTH
FORTH Inc. +1 310-491-3356
5155 W. Rosecrans Ave. #1018 Fax: +1 310-978-9454
Hawthorne, CA 90250
http://www.forth.com

"Forth-based products and Services for real-time
applications since 1973."
==================================================

Tim Daneliuk

May 17, 2002, 5:40:02 PM
"Elizabeth D. Rather" wrote:
>
> a...@redhat.invalid wrote:
> > ...
> > >> ------------------------------------------------------------------------------
> > >> The 2001 A. M. Turing Award, considered the "Nobel Prize of
> > >> Computing," was presented by ACM to Ole-Johan Dahl and Kristen
> > >> Nygaard for their role in the invention of object-oriented
> > >> programming, the most widely used programming model today.
> > ...

Given the profound shift in programming paradigm Moore introduced
with FORTH, it's hard to understand why he's not been so recognized.
Then again, the ACM tends to favor more "academic" shifts than
practical ones. (Thompson & Ritchie notwithstanding.) Perhaps if
Moore had cluttered up his original work with a lot of extraneous
and opaque mathematics, he'd be more seriously considered...

> > To be fair to Dahl and Nygaard, it's wholly wrong to blame
> > object-oriented programming for C++: OOP when used correctly
> > simplifies the programming task rather than obfuscates it.
>
> And when the application has object-like features. Purely
> procedural apps (such as many embedded systems) don't benefit
> at all.
>

<grumbling-mode>

Ain't that the truth. I recently pontificated indirectly about how
OO is not the answer to the world's problems in responding to
one of those "What's The Best Programming Language" questions that
show up every couple of nanoseconds on USENET:

http://www.tundraware.com/Technology/How-To-Pick-A-Programming-Language/

Having been in the profession for over 20 years, I never cease to be
amazed by the implicit assumption by so many of our fellow practitioners
that some "silver bullet" is just around the corner, waiting to be found,
which will solve our software productivity and quality problems for all
time. We may be the only profession that spends a considerable amount
of skull sweat trying to reduce our practice to a single tool or technique.
I cannot imagine an auto mechanic trying to become so elegant as to
use only a screwdriver for all repairs ;))

In my observation, OO's nominal virtues notwithstanding, it frequently
makes development efforts *more* complex, *longer* in duration, and the
final product *harder* to maintain. IMO, this is because OO never reduces
net complexity; it merely transforms *where* that complexity appears.

Moreover, trying to conquer a non-trivial problem with a single software
paradigm ends up creating a *new* set of problems having nothing to
do with your primary task. The designer/programmer spends 80% of their
time jamming that paradigm down the throat of the last 20% of the problem
because that last 20% does not map well to the paradigm chosen.

In fact, I think this is true of *all* design/programming approaches
and is by no means unique to OO at all.

I like to use the following matrix with the newly-minted youngsters
when explaining why everything they just learned in school may not
really be The Way Things Are:


Birth Of
Paradigm   Paradigm                        Source Of Complexity/
                                           Principal Intellectual Problem
--------   --------                        ------------------------------

1950s      Algorithm decomposition/       Comprehending/reducing O(n)
           design                         complexity

1960s      Heuristic algorithms for      Finding heuristics which map
           NP-Hard classes of problems   well to the problem space and
                                          then determining whether their
                                          worst-case behavior is
                                          acceptable.

1960s      AI                             Defining a rule-set which maps
                                          well to the problem space and
                                          then finding an ordering of
                                          those rules which scales for
                                          non-trivial problems.

1970s      DBMS Approaches                Ordering the taxonomy and
                                          semantics of data so as to
                                          create a complete and
                                          manageable schema.

1960s      Procedural Programming         Maintaining coherence and
                                          consistency among all the
                                          decomposed pieces of code.

1980s      Object Oriented                Defining a meaningful
                                          inheritance hierarchy which
                                          appropriately factors generics
                                          from derived properties.

1980s      Theorem Proving Techniques     Finding mathematical analogs
                                          for the real world which work
                                          for anything more than "toy"
                                          problem spaces.

1980s      Functional Programming         Transforming real world
                                          problems into functional
                                          equivalents upon which FP can
                                          operate.

It is my claim that, in each of these approaches, all that was really
done was to *transform* where the hard problems showed up - much like
a Laplace Transform turns a differential equation into algebra at the
(new + hard) cost of figuring out an inverse transform.
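To make the analogy concrete, here is the standard first-order textbook
example (not from the post, just the well-known identity):

```latex
% Transform: calculus becomes algebra
y' + a y = 0,\quad y(0) = 1
\;\xrightarrow{\;\mathcal{L}\;}\;
s Y(s) - y(0) + a Y(s) = 0
\;\Longrightarrow\;
Y(s) = \frac{1}{s + a}
% Inversion: the "new + hard" step
\;\xrightarrow{\;\mathcal{L}^{-1}\;}\;
y(t) = e^{-a t}
```

The differential equation is solved by mere algebra in the s-domain;
the difficulty migrates into computing the inverse transform, which for
anything less tidy than this example is where the real work lives.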

Trying to convince the True Believers that this is so is an uphill
battle until they actually have to build a significantly complex
system which includes UI, data store, transaction protection,
fault recovery, and so on. The first thing the OO crowd runs into
is that there is an immense "impedance mismatch" between how OO
inheritance sees the world and how a DBMS sees the world. Add to this
the need to transactionally protect key events, and the OO folks
walk around mumbling about a yet-to-be-delivered Sun extension to
Java or a new .NET feature.

The reality here is that real problems are almost always
event or condition driven. That is, the structures of time and
events *outside* the program are way more important than the
internal organization of the code itself or what programming
paradigm is chosen. In realtime it is hardware events
which dictate program structure and flow. In transaction systems
it is typically a business event which initiates the work.
In operating systems it is a combination of hardware, process,
and user events which dictate how things get done.

It is amazing, with all the advances we've seen in language theory,
design tools, and so on, that so little attention is paid to
supporting (a)synchrony, concurrent processing, locking, and
rendezvous *as first-class properties of the languages themselves*.
FORTH clearly led the way here, but newer languages like Erlang are
starting to think this way too.

I reserve the right to be wrong, but it seems to me that we ought to
be evolving to a world in which programmers of complex systems write
in meta-languages like Python or Ruby - which embrace OO, procedural,
DBMS, and Functional Programming notions - but whose runtimes would be
best delivered in something like FORTH or Erlang. Different software
paradigms could thus be used interchangeably because they are
first-class properties of the selected language, but the intermediate
code output would be realized in a highly efficient, easily optimized,
event-sensitive runtime environment. This is probably a pipe-dream
because trying to efficiently map the semantics of a late-bound
dynamic language like Python onto a sleek FORTH runtime is possibly
too costly in terms of runtime size and speed. Or is it?

Then again, I'm just Grumbling ...

</grumbling-mode>
------------------------------------------------------------------------------
Tim Daneliuk
tun...@tundraware.com

Mark I Manning IV

May 17, 2002, 9:10:43 PM
a...@redhat.invalid wrote:
> Mark I Manning IV <I4...@mailcity.com> wrote:
>
>>Julian V. Noble wrote:
>>
>>>I read in the ACM Membernet News that the 2001 Turing
>>>Award has been conferred:
>>>
>>>------------------------------------------------------------------------------
>>>The 2001 A. M. Turing Award, considered the "Nobel Prize of
>>>Computing," was presented by ACM to Ole-Johan Dahl and Kristen
>>>Nygaard for their role in the invention of object-oriented
>>>programming, the most widely used programming model today.
>>
>
>>Bullshit object obfuscation is responsible for 99% of all the
>>overbloated, dysfunctional bullshit code that's keeping ME from
>>getting a job.
>
>
> I doubt it.
>

lol - OK, I have a bad attitude, but only against bad C coders :)

Over the past 5 years of contracting I've been out of work for at most
a month at a time. Since my last contract ended (pre Sept 11) I have
had one phone interview and no jobs. My attitude and dislike for
(whatever) are very carefully hidden away when I'm on a job :)


>
> It's OOP and the C/C++ language I HATE most in this world :(
>
>
> To be fair to Dahl and Nygaard, it's wholly wrong to blame
> object-oriented programming for C++: OOP when used correctly
> simplifies the programming task rather than obfuscates it.
>
> Beta, Nygaard's current programming language, is actually rather
> pretty. Subroutine (method) calls and access to data are unified into
> a single concept, simple concurrency is part of the language, and the
> weirdness required by the C compatibility of C++ is avoided
> altogether.
>
> Andrew.

I have yet to see a single instance of OOP in action where it
simplified a single thing. I've seen it kill more than a few projects,
however, and they were things I could have done solo in assembler or
Forth; no need for 10 developers working with 500K of source files for
4 years when one developer can create a solution with 30 or 40K of
sources in a month.


Tim Daneliuk

May 17, 2002, 9:40:02 PM
<SNIP>

A slightly more polished version of this now lives at:

http://www.tundraware.com/Technology/Bullet/

--
------------------------------------------------------------------------------
Tim Daneliuk
tun...@tundraware.com

Mike Halloran

May 17, 2002, 10:15:37 PM
Well, I'll be dipped in, uh, stuff.

I feel the same way about C and its evil offspring.

And about Charles Moore.

-Mike-

Mike Halloran

May 17, 2002, 10:34:43 PM
That really didn't need a whole lot of polishing.

It did sort of gloss over the attraction of the Silver Bullet, which from a
business person's perspective, is:

IF you had a magic screwdriver, you wouldn't need a trained technician to
fix a car; some kid just out of high school could do it for minimum wage.

IF you had a magic computer language, you wouldn't need a high-paid, smart,
cranky computer person to whine about the cheap chair you provide and tell
you how stupid you are while continually delivering what you asked for
instead of what you want.

..... and so on. You get the point.

What I still don't understand, after decades of rumination while nominally
working for them, is why the people who flunked out of math and science are
running things.

Sometimes I wish I had the stomach to be a salesperson. The Silver Bullet
market appears to have no limits.

-Mike-

Tim Daneliuk

May 17, 2002, 11:10:06 PM
Mike Halloran wrote:
>
> That really didn't need a whole lot of polishing.

Thanks. The web article has more surrounding prose to sort of set up
the problem and ends with a discussion of *why* this is a really big
deal.

>
> It did sort of gloss over the attraction of the Silver Bullet, which from a
> business person's perspective, is:
>
> IF you had a magic screwdriver, you wouldn't need a trained technician to
> fix a car; some kid just out of high school could do it for minimum wage.
>
> IF you had a magic computer language, you wouldn't need a high-paid, smart,
> cranky computer person to whine about the cheap chair you provide and tell
> you how stupid you are while continually delivering what you asked for
> instead of what you want.
>
> ..... and so on. You get the point.
>
> What I still don't understand, after decades of rumination while nominally
> working for them, is why the people who flunked out of math and science are
> running things.

That's beginning to change. I'm a technology exec by profession (or was
until the last round of employment executions ;( ). The new generation
of business execs who are rising to leadership roles are a *lot* smarter
about the role of technology and its importance. They also, I think, have
a more realistic view of what software development entails.

If we criticize the leaders, it is also fair to criticize the technical
community, which is full of Prima Donnas who understand neither business
conditions nor the tradeoffs running a company requires. A good part of
my career has been spent bringing these two communities together to form
a better understanding of each other's roles and issues.

>
> Sometimes I wish I had the stomach to be a salesperson. The Silver Bullet
> market appears to have no limits.


Again, you'd be surprised just how "technical" good salespeople are. A
close friend of mine is a sales guy with whom I've worked on and off
for many years. He *easily* grasps the macro technical issues better
than a great many technical people do.

Modern business, if it is to be successful, needs to position itself
right where technology, business opportunity, and capital meet. That
means *everybody* has to have some understanding of business *and*
technology, and everybody, in some sense, has to be "in sales".

Tim Daneliuk

May 17, 2002, 11:38:10 PM
a...@redhat.invalid wrote:
<SNIP>


>
> To be fair to Dahl and Nygaard, it's wholly wrong to blame
> object-oriented programming for C++: OOP when used correctly
> simplifies the programming task rather than obfuscates it.

The single most useful application of OO I've ever seen is not as a
programming construct, but as a *specification* mechanism. If you
strip all the useless bloat out of the Booch/Rumbaugh/Jacobson
stuff (which AFAICT is designed to sell more books), there is a core
of useful stuff there that can be very handy when getting your
user to tell you what they *really* want. A guy by the name of Rosenberg
did just that and wrote "Use Case Driven Object Modeling With UML."
His approach works well insofar as it helps the specifier crisply define
the business processes that are desired. Once you have that in hand,
you can throw the rest of the OO religion away if you need to and
implement with whatever makes sense.

I share your view, BTW, that there is nothing inherently bad about the
OO paradigm; C++ was just a really bad first effort. Eiffel was better,
Java better even still. But for general applications development,
I continue to be amazed by Python. I hasten to add that Python in
no way can do what Forth does well - sleek embedded runtimes
that are reliable and small. But for all the reasons people love
Forth - intellectual cleanliness, an easily grasped paradigm,
consistency across platforms, and elegance - they will like Python as
well (for the class of problems well-suited to Python solutions).

a...@redhat.invalid

May 18, 2002, 3:33:58 AM
Tim Daneliuk <tun...@tundraware.com> wrote:

> I share your view, BTW, that there is nothing inherently bad about the
> OO paradigm, C++ was just a really bad first effort.

First effort? At least one object-oriented programming language was
in use before C was invented, never mind C++! :-)

Andrew.

David Golden

May 18, 2002, 8:36:03 AM

>
> I share your view, BTW, that there is nothing inherently bad about the
> OO paradigm, C++ was just a really bad first effort.

C++ was a latecomer, really.

C++ loses most of the dynamic features and reflectivity that most people
originally associated with OO programming. The first thing C++ GUI
toolkits seem to do is put dynamic stuff back in to make the language
more useful - e.g. Qt, MFC.

The C++/Java crowd have taken to calling OO minus the dynamic/reflective
features of classic OO languages "OO", and the dynamic/reflective
features "component-based programming" - e.g. JavaBeans.
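For a concrete picture of the dynamic/reflective features in question,
here is a sketch in Python (which has them natively); the `Widget` class
and its names are purely illustrative, not any toolkit's real API. This
is the kind of run-time introspection that Qt's meta-object system or
JavaBeans retrofit onto C++/Java:

```python
class Widget:
    """Hypothetical GUI object, used only for illustration."""
    def __init__(self, name):
        self.name = name

w = Widget("button1")

# Reflection: class and attributes are inspectable by name at run time.
print(type(w).__name__)        # Widget
print(getattr(w, "name"))      # button1

# Dynamism: behavior can be added to the class after the fact,
# and existing instances pick it up immediately.
def describe(self):
    return "<%s %s>" % (type(self).__name__, self.name)

Widget.describe = describe
print(w.describe())            # <Widget button1>
```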


> Eiffel was better,
> Java better even still. But for general applications development,
> I continue to be amazed by Python.

If you really want good OO, I suggest checking out CLOS in Common Lisp.

Try to look beyond the parentheses - they're no more unusual
than Forth's choice of postfix syntax.

Or check out Joy for a very elegant Lisp-like/Forth-like hybridisation...

--
Don't eat yellow snow.

Julian V. Noble

May 18, 2002, 1:57:20 PM
Folks, I asked the following question:

> Does anyone know whether Charles Moore was ever nominated for one for

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^


> inventing Forth? IMO it is long overdue if he has not been considered.
>
> The URL for nominations is
>
> http://www.acm.org/awards/award_nominations.html
>

There was a lot of ranting about the evils of the OO
paradigm, etc. But nobody has answered the question.

The reason I included the URL is that many in this group are more
appropriate than I to nominate Moore and provide supporting material.
I was hinting--perhaps too gently--that if he has not yet been
nominated for a Turing award he should be. Speaking as someone who

a) knew Richard P. Feynman;

b) does not have much respect for most of the
people he meets in the computer arena;

I would like to say that Charles Moore strikes me as one of the very
few authentic geniuses I have met in my life. That does not mean I
agree with everything he has said or done. But IMO his achievements
are unprecedented and should be recognized by the CS and programming
communities.

Jerry Avins

May 18, 2002, 2:31:11 PM

Heartily endorsed! How many here are members of ACM?

Jerry
--
Engineering is the art of making what you want from things you can get.
¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯

Len Zettel

May 18, 2002, 4:48:47 PM

"Jerry Avins" <j...@ieee.org> wrote in message
news:3CE69DEF...@ieee.org...

> "Julian V. Noble" wrote:
> >
> > Folks, I asked the following question:
> >
> > > Does anyone know whether Charles Moore was ever nominated for one for
> > ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> > > inventing Forth? IMO it is long overdue if he has not been considered.
> > >
> > > The URL for nominations is
> > >
> > > http://www.acm.org/awards/award_nominations.html
> > >
> >
> Heartily endorsed! How many here are members of ACM?

I have been a member of the ACM for about 30 years now. Even if some,
or a bunch, of people nominated him I would say he would have two
chances of getting it: slim, and none.
Besides all kinds of reasons having to do with the sociology of
academe, a couple of Wil Baden's posts hint at more substantive ones,
to add to the fact that Moore's impact on the main stream of computing
has been zilch.

Albert van der Horst

May 18, 2002, 6:55:16 AM
In article <3CE57855...@tundraware.com>,
Tim Daneliuk <tun...@tundraware.com> wrote:

<SNIP>
In general I agree with what you wrote. But it is a classic problem:
people tend to view object orientation as a solution, while it is a
tool.
I would maintain that a FORTRAN program I designed for the calculation
of the Brent oil fields in the North Sea was object oriented as hell.
You can recognise things as classes, messages, objects, etc. Probably
few of the so-called OO experts would be able to do that.

>The reality here is that real problems are almost always
>event or condition driven. That is, the structures of time and
>events *outside* the program are way more important than the
>internal organization of the code itself or what programming
>paradigm is chosen. In realtime it is hardware events
>which dictate program structure and flow. In transaction systems
>it is typically a business event which initiates the work.
>In operating systems it is a combination of hardware, process,
>and user events which dictate how things get done.

Interestingly, this maps rather well onto objects. Make every object
an independent process and map the events onto messages. I have been
experimenting with this in Forth. It is a shame (and probably a lack
of practical-mindedness among the OO dogmatists) that this is not done
more.
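A minimal sketch of this scheme in Python, with threads and queues
standing in for transputer processes and links; all names here are
illustrative, not taken from the paper cited below:

```python
import queue
import threading

class Actor:
    """One thread per object; events arrive as messages in a mailbox
    and are dispatched to the method of the same name."""
    def __init__(self):
        self.mailbox = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def send(self, msg, *args):
        self.mailbox.put((msg, args))

    def _run(self):
        while True:
            msg, args = self.mailbox.get()
            getattr(self, msg)(*args)   # message name -> method

class Totalizer(Actor):
    """Example object: accumulates numbers sent to it."""
    def __init__(self):
        super().__init__()
        self.total = 0
        self.done = threading.Event()

    def add(self, n):
        self.total += n

    def finish(self):
        self.done.set()

t = Totalizer()
for n in range(10):
    t.send("add", n)
t.send("finish")        # processed after all the adds (FIFO mailbox)
t.done.wait()
print(t.total)          # 45
```

Because each object owns its state and touches it only from its own
thread, no locking is needed inside the object; the mailbox serializes
everything, which is exactly the property that made the transputer
mapping attractive.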

See Massively Parallel Processing, ISBN 0 444 81784 0
(proceedings of Eurosim 1994, Elsevier, pg. 481-488),
"An experiment: using parallel objects for the solution
of demanding algorithmic problems"

(At the time the DFW was doing a lot with transputers and
its parallel features were well supported in Forth.)
--
Greetings Albert
--
Albert van der Horst,Oranjestr 8,3511 RA UTRECHT,THE NETHERLANDS
To suffer is the prerogative of the strong. The weak -- perish.
alb...@spenarnc.xs4all.nl http://home.hccnet.nl/a.w.m.van.der.horst

Albert van der Horst

May 18, 2002, 6:31:31 AM
In article <3CE56025...@forth.com>,

Elizabeth D. Rather <era...@forth.com> wrote:
>And when the application has object-like features. Purely
>procedural apps (such as many embedded systems) don't benefit
>at all.

Not so fast. It can be argued that CREATE DOES> is a lightweight
object system, and that TO +TO FROM can be viewed as messages sent to
objects. Why not be counted among the winners and claim we were there
before the crowd? (And it is even true.)

The concept of a wordset (not wordlists) a la Brodie is much related
to objects. Less dogmatic, and very practical for small systems.

So how is this for your next promotion:
"
Heavyweight multiple-inheritance systems are of course out of the
question for a small embedded system. Also, the emphasis is on
procedural code. Even so, Forth has had object-oriented features
from day one, like the CREATE DOES> construct. It has reaped
practical benefits from this, long before the term object
oriented was coined.
"

Feel free to borrow from this formulation.

>Cheers,
>Elizabeth

Greetings Albert.

Anton Ertl

May 18, 2002, 6:33:05 PM
alb...@spenarnc.xs4all.nl (Albert van der Horst) writes:
>In article <3CE56025...@forth.com>,
>Elizabeth D. Rather <era...@forth.com> wrote:
>>And when the application has object-like features. Purely
>>procedural apps (such as many embedded systems) don't benefit
>>at all.

It's not always clear whether an app is purely procedural, however.
Take parser generators, for example (see
<http://www.complang.tuwien.ac.at/papers/ertl99ef.ps.gz>):

Hans-Peter Moessenboeck says that he did not use the OO features of
Oberon-2 for his parser generator Coco/R, because there was no benefit
from OO in that application. OTOH, in my parser generator Gray (which
is similar to Coco/R in functionality) I found OO very valuable.
My conclusion: OO is like factoring. You have to see the common
pattern to make good use of it.

>Not so fast. It can be argued that CREATE DOES> is a lightweight
>object system. That TO +TO FROM can be viewed as messages sent to
>objects.

I would argue against that view. CREATE...DOES> as usually used is
just using compile-time binding, and TO etc. are doing that, too. For
OO, run-time binding is required. Apart from that, TO etc. cause a
lot of problems because of their parsing behaviour.
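The compile-time/run-time distinction can be illustrated outside Forth
with a short Python sketch (the shape classes are hypothetical, chosen
only to make the dispatch visible): the OO property Anton means is that
the code reached by a single call site is selected per object, at run
time.

```python
class Shape:
    def area(self):
        raise NotImplementedError

class Square(Shape):
    def __init__(self, side):
        self.side = side
    def area(self):                  # one selector...
        return self.side * self.side

class Circle(Shape):
    def __init__(self, radius):
        self.radius = radius
    def area(self):                  # ...different code per class
        return 3.14159 * self.radius * self.radius

def total_area(shapes):
    # Written once; which area() runs is decided at call time, per
    # object. Per the point above, that run-time binding is what
    # distinguishes OO from the binding fixed at definition time in
    # typical CREATE ... DOES> use.
    return sum(s.area() for s in shapes)

print(total_area([Square(2), Circle(1)]))
```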

- anton
--
M. Anton Ertl http://www.complang.tuwien.ac.at/anton/home.html
comp.lang.forth FAQs: http://www.complang.tuwien.ac.at/forth/faq/toc.html

Peter Lawrence

May 19, 2002, 5:51:09 AM

Did you hear the one about the four Boy Scouts who helped a little old lady
across the road? It took so many because she didn't want to go.

Wouldn't it be polite to ask MR. MOORE's views on the subject first? PML.

--
GST+NPT=JOBS

I.e., a Goods and Services Tax (or almost any other broad based production
tax), with a Negative Payroll Tax, promotes employment.

See http://users.netlink.com.au/~peterl/publicns.html#AFRLET2 and the other
items on that page for some reasons why.

Marcel Hendrix

May 19, 2002, 6:06:26 AM
(#26313) alb...@spenarnc.xs4all.nl (Albert van der Horst) writes Re: Silver Bullets (Long)

> See Massively Parallel Processing, ISBN 0 444 81784 0
> (proceedings of Eurosim 1994, Elsevier, pg. 481-488),
> "An experiment: using parallel objects for the solution
> of demanding algorithmic problems"
> (At the time the DFW was doing a lot with transputers and
> its parallel features were well supported in Forth.)

At this moment, iForth still supports (almost all of) these
constructs, using sockets instead of transputer links and OS threads
to emulate parallel processes. iForth 2.0 even runs benchpin on my
LAN.

-marcel

Julian V. Noble

May 19, 2002, 5:35:06 PM
Len Zettel wrote:
>
> "Jerry Avins" <j...@ieee.org> wrote in message
> news:3CE69DEF...@ieee.org...
> > "Julian V. Noble" wrote:
> > >
> > > Folks, I asked the following question:
> > >
> > > > Does anyone know whether Charles Moore was ever nominated for one for
> > > ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> > > > inventing Forth? IMO it is long overdue if he has not been considered.
> > > >
> > > > The URL for nominations is
> > > >
> > > > http://www.acm.org/awards/award_nominations.html
> > > >
> > >
> > Heartily endorsed! How many here are members of ACM?
>
> I have been a member of the ACM for about 30 years now. Even if some, or a
> bunch,
> of people nominated him I would say he would have two chances of getting it:
> slim, and none.
> Besides all kinds of reasons having to do with the sociology of academe, a
> couple of Wil
> Baden's posts hint at more substantive ones, to add to the fact that Moore's
> impact on the
> main stream of computing has been zilch.
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

> -LenZ-
> >
> > Jerry
> > --
> > Engineering is the art of making what you want from things you can get.
> > ¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯

I am not sure you are correct here. I don't know what "impact on the
main stream of computing" means, but lots of things have been done
that obviously derive from Forth. The developers of PostScript were,
IMNVHO, disingenuous in denying a connection: linguistic analysis
would suggest otherwise. And what of Java, Open Firmware, etc., etc.?
Maybe not so minimal after all.

Julian V. Noble

May 19, 2002, 5:43:07 PM
Peter Lawrence wrote:
>
> Jerry Avins wrote:
> >
> > Heartily endorsed!

>
Peter Lawrence wrote:
>
> Did you hear the one about the four Boy Scouts who helped a little old lady
> across the road? It took so many because she didn't want to go.
>
> Wouldn't it be polite to ask MR. MOORE's views on the subject first? PML.

Get real! What would you expect someone to say if you
asked "Should I nominate you for a prize that you richly
deserve but stand a small chance of receiving?" (Sort
of like me and the Nobel ;-)

One nominates people for prizes because one owes it to one's
profession. There is no disgrace in not getting the prize,
but great honor if one does get it. Therefore one should do
it, if he believes the potential recipient worthy. I definitely
do, in Moore's case. I just think it would come better from
someone with more impressive credentials than mine, or I
would do it myself.

Jerry Avins

May 19, 2002, 7:57:20 PM

I'm with you there. I've belonged to ACM probably longer than Len, but I
hardly have a name to be reckoned with. I'll add my two cents, though,
if it might help.

Jim Schneider

May 19, 2002, 8:08:41 PM

"Julian V. Noble" wrote:

> One nominates people for prizes because one owes it to one's
> profession. There is no disgrace in not getting the prize,
> but great honor if one does get it. Therefore one should do
> it, if he believes the potential recipient worthy. I definitely
> do, in Moore's case. I just think it would come better from
> someone with more impressive credentials than mine, or I
> would do it myself.
>
> --
> Julian V. Noble
> Professor of Physics
> j...@virginia.edu

Unless you're pulling our leg, or stretching the truth a bit, I don't think there
are many of us here with "more impressive credentials". Are you a full professor
(and of a hard science) at UVA, as your signature implies? What credentials would
be "more impressive" than that?

--
Feel like saving money today? http://www.jrcs.biz/

Len Zettel

May 19, 2002, 9:40:42 PM
During the sixties the leaders of the ACM worked mightily to create the new
academic discipline of computer science and give it academic respectability.
At the same time they transformed the Association for Computing Machinery
into the professional society for computer scientists. A couple of efforts
to change the name failed to muster the necessary supermajority, so they
took to using the initials almost always and the words never.

Scientists build things to learn things. The proper products of scientific
research are discoveries, and their tangible embodiment is the peer-reviewed
paper, with papers in leading journals respected much more than those in
obscurer venues. To be considered a high-ranking member of the community you
have to have a life-long publication record of papers highly regarded by
your peers. Those are the kind of people who get the Turing prize these days.

Engineers learn things to make things. The proper products of engineering
efforts are useful artifacts, tangible and intangible. Blind worshipers to
the contrary notwithstanding, I think a reasonable case could be made that
Chuck Moore didn't make any original discoveries in the scientific sense.
Even the two-stack abstract machine was also constructed by others, if I
read Wil Baden's posts correctly.

What Chuck *did* do was take what he had to make what he wanted, and the
product of his take, try, discard, rework on and on was indeed wonderful.
But it was engineering, not science. I'm not sure he ever published a
peer-reviewed paper in his life. The journals were certainly not
prestigious. Elizabeth has recounted the bitter opposition her paper on
the history of Forth aroused.

My conclusion: you can't get there from here. You can try if you want. You
can also
piss into the wind if you want.
-LenZ-

Wil Baden

May 19, 2002, 10:27:40 PM
In article <P6zF8.64819$fU2.6...@bin8.nnrp.aus1.giganews.com>, Len
Zettel <zet...@acm.org> wrote:

> Besides all kinds of reasons having to do with the sociology of academe, a
> couple of Wil
> Baden's posts hint at more substantive ones, to add to the fact that Moore's
> impact on the
> main stream of computing has been zilch.

In the early 1960s, programming by explicit stack manipulation was
discovered by many. Dave Ferguson of SDS (later XDS) may have been
the first to make commercial use of this in METASYMBOL. This was a
metacompiler for the SYMBOL assembler, FORTRAN, COBOL, and JOVIAL.

In the later 1960s, the communications division of Control Data used a
like technology for the Fortran IV compiler for the Federal Reserve
Board. (I was manager.)

Chuck Moore's contribution was making the development environment
interactive and integrated. If it's not interactive, it's not Forth.

With this viewpoint, Postscript is certainly not Forth.

--
Wil Baden Costa Mesa, California Per neil...@earthlink.net

Jerry Avins

May 19, 2002, 10:32:56 PM

Pissing into the wind can leave a bad taste in your mouth. Nominating
Moore is more like wetting yourself in a dark wool suit: you get a warm
feeling and no one notices. Let's do it! Your explanation of Moore's
work and worth, along with the inspiration I provide below <ahem!> might
not be so laughable after all.

Len Zettel

May 19, 2002, 11:14:01 PM

"Julian V. Noble" <j...@virginia.edu> wrote in message
news:3CE81A8A...@virginia.edu...

> ¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯
>
> I am not sure you are correct here. I don't know what
> "impact on the main stream of computing" means,
> but lots of things have been done that obviously derive
> from Forth. The developers of PostScript were, IMNVHO,
> disingenuous in denying a connection: linguistic analysis
> would suggest otherwise. And what of Java, Open Firmware,
> etc. etc.? Maybe not so minimal after all.
>
To the academic computer scientist, all of those things you mention are
engineering, not computer science, and therefore irrelevant. So even if
those claims could be made in a clear and uncontested manner, it wouldn't
help.
-LenZ-

Tim Daneliuk

May 20, 2002, 1:40:03 AM
Len Zettel wrote:
>
<SNIP>

>
> My conclusion: you can't get there from here. You can try if you want. You
> can also
> piss into the wind if you want.
> -LenZ-

"Science", as I understand the term, involves proposing a hypothesis and
then empirically verifying or refuting that hypothesis. Most of what
passes as Computer "Science" is really more a branch of Mathematics than
it is an empirical science. The 'empirical' parts of the discipline
almost invitably are about solving Engineering problems.

Innovation in Computer Science, almost by definition, has an element of
Engineering to it. Why else would the ACM give Ritchie and Thompson the
Turing Award? Their contributions, while clearly significant, are almost
entirely in the Engineering realm and not the "peer reviewed science"
you cite as the central aim of the ACM.

After nearly 20 years of ACM membership, I did not renew this year.
I declined to do so for a variety of reasons:

1) There is an implicit condescension towards "mere" Engineering in
much of what is written. "Fuzzy Studies" like the psychology
of user interface design seem to 'get more respect' than programming
techniques and implementation strategies.

2) The ACM regularly departs both from a Mathematical and Engineering focus
in its endless "soft" articles on academic enrollment statistics,
political analysis of the role of minorities/women/... in the discipline,
and various fluff pieces on who funds what and how much.

3) The bulk of the value I got from ACM was in their SIG publications.
Much of this information was out-of-date by the time of publication.
More current information could be had on the web in many cases.

I don't know if Moore deserves a Turing Award, but I suspect the kind of
haughty, snide tone that is implicit in today's ACM almost guarantees
he has no shot, no matter how deserving. I much preferred the ACM in the
days where there was room for both theory and practice, a balance struck
best, IMO, by Jon Bentley's columns.

Tim Daneliuk

May 20, 2002, 1:50:06 AM
"Julian V. Noble" wrote:
>
> Peter Lawrence wrote:
> >
> > Jerry Avins wrote:
> > >
> > > Heartily endorsed!
> >
> Peter Lawrence wrote:
> >
> > Did you hear the one about the four Boy Scouts who helped a little old lady
> > across the road? It took so many because she didn't want to go.
> >
> > Wouldn't it be polite to ask MR. MOORE's views on the subject first? PML.
>
> Get real! What would you expect someone to say if you
> asked "Should I nominate you for a prize that you richly
> deserve but stand a small chance of receiving?" (Sort
> of like me and the Nobel ;-)
>
> One nominates people for prizes because one owes it to one's
> profession. There is no disgrace in not getting the prize,
> but great honor if one does get it. Therefore one should do
> it, if he believes the potential recipient worthy. I definitely
> do, in Moore's case. I just think it would come better from
> someone with more impressive credentials than mine, or I
> would do it myself.


Well said. But, as Mr. Zettel points out, the focus of ACM appears
to be *academic* contribution, not application in the field of
computing. Sadly, I fear he is right and the nomination, no matter
how noble or well intentioned would merely precipitate a public cat fight.

Anton Ertl

May 19, 2002, 10:40:11 PM
Wil Baden <neil...@earthlink.net> writes:
>Chuck Moore's contribution was making the development environment
>interactive and integrated. If it's not interactive, it's not Forth.
>
>With this viewpoint, Postscript is certainly not Forth.

Postscript is interactive (what do you think operators like "pstack"
are for?). You can see this easily by invoking Ghostscript, and even
Postscript embedded in a printer can talk back, and can be used
interactively (if the host supports that). E.g., here's a small
session with Ghostscript:

[~:5147] gs
GNU Ghostscript 5.50 (2000-2-13)
Copyright (C) 1998 Aladdin Enterprises, Menlo Park, CA. All rights reserved.
This software comes with NO WARRANTY: see the file COPYING for details.
GS>1 2
GS<2>pstack
2
1
GS<2>add ==
3

However, Postscript is not Forth anyway, there are too many
differences, e.g., in typing and name binding.

Wil Baden

May 20, 2002, 9:17:00 AM
In article <2002May2...@a0.complang.tuwien.ac.at>, Anton Ertl
<an...@mips.complang.tuwien.ac.at> wrote:

> Postscript is interactive (what do you think operators like "pstack"
> are for?).

Oh my. My experience with PostScript has been as a language that was
written by other languages. Thanks.

--
Wil

Elizabeth D. Rather

May 20, 2002, 2:08:02 PM
Anton Ertl wrote:

>
> "Len Zettel" <zet...@acm.org> writes:
> >Scientists build things to learn things. The proper products of scientific
> >research
> >are discoveries, and their tangible embodiment is the peer-reviewed paper,
> >with
> >papers in leading journals respected much more than those in obscurer
> >venues.
> >To be considered a high-ranking member of the community you have to have a
> >life-long publication record of papers highly regarded by your peers.
>
> ...
>
> Language design work has been Turing-awarded several times (McCarthy,
> Backus, Iverson, Wirth, Milner, Dahl and Nygaard), and compiler work
> (e.g., Perlis, Cocke) and hardware work (e.g. Wilkes), too, so on that
> account Moore worked in award-prone areas. However, his achievements
> found little academic recognition, so I think his chances of actually
> receiving the award are extremely slim.

For all the reasons cited by Len & others, I think nominating Chuck for
a Turing award would be a wasted effort. The overwhelming majority of
ACM members have never heard of Chuck or Forth, and those few who have
probably remember the "hobbyist toy" image left over from the 80's.

A far more appropriate and productive exercise, IMO, would be to
nominate him for a MacArthur grant, because the money might actually
enable him to contribute usefully from his current work.

Cheers,
Elizabeth

--
==================================================
Elizabeth D. Rather (US & Canada) 800-55-FORTH
FORTH Inc. +1 310-491-3356
5155 W. Rosecrans Ave. #1018 Fax: +1 310-978-9454
Hawthorne, CA 90250
http://www.forth.com

"Forth-based products and Services for real-time
applications since 1973."
==================================================

Keith A. Lewis

May 20, 2002, 3:37:02 PM
In article <3CE93BC3...@forth.com>,

Elizabeth D. Rather <era...@forth.com> wrote:
>Anton Ertl wrote:
>>
>> "Len Zettel" <zet...@acm.org> writes:
>> >Scientists build things to learn things. The proper products of scientific
>> >research
>> >are discoveries, and their tangible embodiment is the peer-reviewed paper,
>> >with
>> >papers in leading journals respected much more than those in obscurer
>> >venues.
>> >To be considered a high-ranking member of the community you have to have a
>> >life-long publication record of papers highly regarded by your peers.
>>
>> ...
>>
>> Language design work has been Turing-awarded several times (McCarthy,
>> Backus, Iverson, Wirth, Milner, Dahl and Nygaard), and compiler work
>> (e.g., Perlis, Cocke) and hardware work (e.g. Wilkes), too, so on that
>> account Moore worked in award-prone areas. However, his achievements
>> found little academic recognition, so I think his chances of actually
>> receiving the award are extremely slim.
>
>For all the reasons cited by Len & others, I think nominating Chuck for
>a Turing award would be a wasted effort. The overwhelming majority of
>ACM members have never heard of Chuck or Forth, and those few who have
>probably remember the "hobbyist toy" image left over from the 80's.

How many ACM members know APL or Iverson? How many know that the best
tick database (widely used in a number of Wall Street firms and hedge
funds) uses k, APL's modern incarnation?

I hope this is not too far off topic for c.l.f.

Elizabeth D. Rather

May 20, 2002, 4:06:22 PM

30 years ago the answer re Iverson/APL would have been quite a few.
He'd have more trouble today.

Bernd Paysan

May 18, 2002, 4:39:35 PM
Mike Halloran wrote:
> IF you had a magic screwdriver, you wouldn't need a trained technician to
> fix a car; some kid just out of high school could do it for minimum wage.

Hm, if I look at Harry Potter, I think if you had a magic screwdriver, you
wouldn't need a trained mechanic to fix your car (after all, that's a
relatively low-wage job that doesn't require that much education: 9 years
of school and 3 years of apprenticeship are sufficient in Germany), but a
highly trained magician (7 years at Hogwarts after 4 years of elementary
school, and 4-5 years of university), and he will want to be paid in gold!

Magic screwdrivers fall under the "Abuse of Muggle Artefacts" convention,
anyway.

> IF you had a magic computer language, you wouldn't need a high- paid,
> smart, cranky computer person to whine about the cheap chair you provide
> and tell you how stupid you are while continually delivering what you
> asked for instead of what you want.

For me, programming is already quite close to magic. One of the important
aspects of Forth is that it requires *less* magic to be mastered. The ideas
behind Forth are easier to understand, so there are fewer "and now something
magic happens" boxes in the mind of the user. Typical OO languages have
quite a lot of magic built in. Does it help? I don't think so: the more
magic there is, the less productive the language is.

--
Bernd Paysan
"If you want it done right, you have to do it yourself"
http://www.jwdt.com/~paysan/

Albert van der Horst

May 20, 2002, 11:34:50 AM
In article <2002May1...@a0.complang.tuwien.ac.at>,

Anton Ertl <an...@mips.complang.tuwien.ac.at> wrote:
>I would argue against that view. CREATE...DOES> as usually used is
>just using compile-time binding, and TO etc. are doing that, too. For
>OO, run-time binding is required. Apart from that, TO etc. cause a
>lot of problems because of their parsing behaviour.

OO is about concepts. Early or late binding has nothing to do with it.
I referred to the article about parallel objects sending real messages
to each other in real time. This is late binding by definition!

TO need not cause any problems because of its parsing requirement.
ISO requires TO to parse but interestingly it is impossible
to write a standard program that detects that your TO doesn't parse!
This is because it is forbidden to postpone TO in a standard program.
(I would be very interested if somebody came up with a program to
disprove this!)

This implementation of VALUE clearly shows the relation with OO concepts.
And I claim it is ISO-compliant.

VARIABLE TO-MESSAGE
0 CONSTANT %from : FROM %from TO-MESSAGE ! ;
1 CONSTANT %to : TO %to TO-MESSAGE ! ;
2 CONSTANT %+to : +TO %+to TO-MESSAGE ! ;

: VALUE  CREATE ,  DOES>
   TO-MESSAGE @ %from = IF @  THEN
   TO-MESSAGE @ %to   = IF !  THEN
   TO-MESSAGE @ %+to  = IF +! THEN
   FROM ;

The ugly `` FROM '' at the end of the DOES> part is there because ISO
requires that if no message has been sent, the VALUE-object behave as
if a %from message had been sent. Purists would leave it out.

You can make this faster by sprinkling POSTPONE's and IMMEDIATE's
all over the place, getting the equivalent of late binding. I am not
willing to do that just to make all this a little bit faster.
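As a usage sketch (my own illustration, not part of the original post), these are the standard phrases this message-based VALUE is meant to support, assuming VALUE compiles its initial value (e.g. with CREATE , as above):

```forth
\ Illustrative usage of the message-passing VALUE above
\ (assumes VALUE stores its initial value via CREATE , )
42 VALUE answer    \ define answer with initial contents 42
answer .           \ default %from message fetches: prints 42
7 TO answer        \ %to message stores 7
answer .           \ prints 7
1 +TO answer       \ %+to message adds 1 in place
answer .           \ prints 8
```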

>- anton
>--
Greetings Albert

Bernd Paysan

May 20, 2002, 6:28:00 PM
Stephen J. Bevan wrote:
> What sort of linguistic analysis would suggest this?

dup, roll, the order by which sub and div work (a b sub is a-b), the use of
a parameter-passing stack. It could also have been influenced by HP's
calculators; the calculators from HP AFAIK were influenced by Forth. Chuck
Moore however got the stack idea from the Burroughs B5500, but Burroughs
had only one stack and so did not exploit the stack as far as Forth,
PostScript or HP's calculators do. Furthermore, PostScript is an
interactive language with incremental compiling, and the memory management
of the first version of PostScript was strictly allot-style (so grestore
also freed all memory allocated between the last gsave and grestore).

Linguistic anti-thesis: definitions (/string{list}def instead of : string
list ;), add/sub/mul/div instead of punctuation, use of more complex
datatypes instead of raw memory as in Forth, ifelse as a function, not
compiling jumps, etc.
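To make the comparison concrete, here is the same definition spelled both ways (an illustrative aside of mine, not from the thread):

```forth
\ Forth: colon definition, punctuation operator, immediate compilation
: square  ( n -- n*n )  DUP * ;
3 square .      \ prints 9

\ The PostScript spelling of the same word, for comparison:
\   /square { dup mul } def
\   3 square =     % prints 9
```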

Since PostScript is based on a raster language developed at Xerox PARC, the
original authors of that language might be one of the missing links.

>> And what of Java, Open Firmware, etc. etc.? Maybe not so minimal after
>> all.
>

> I agree with Open Firmware, and I wish that it was used more widely.
> It is less clear to me what impact Forth had on Java.

Java's inventor worked on a Display-PostScript compiler before, so that's
an important influence factor. Working for Sun, he *also* must have known
about Open Firmware, but not necessarily in detail. Java is a step back to
the origin, because it's nothing more than Burroughs with OOP but without
hardware.

Len Zettel

May 20, 2002, 7:48:41 PM

"Elizabeth D. Rather" <era...@forth.com> wrote in message
news:3CE93BC3...@forth.com...
(snip)

> A far more appropriate and productive exercise, IMO, would be to
> nominate him for a MacArthur grant, because the money might actually
> enable him to contribute usefully from his current work.
>
Now *there's* an idea!
-LenZ-

> Cheers,
> Elizabeth
>

Michael Parker

May 20, 2002, 8:08:29 PM
a...@redhat.invalid wrote:
> Tim Daneliuk <tun...@tundraware.com> wrote:
>
>
>>I share your view, BTW, that there is nothing inherently bad about the
>>OO paradigm, C++ was just a really bad first effort.
>
>
> First effort? At least one object-oriented programming language was
> in use before C was invented, never mind C++! :-)
>
> Andrew.

maybe he meant first effort for Stroustrup?

Tim Daneliuk

May 20, 2002, 8:40:03 PM

Actually, I meant "first effort with wide commercial support/appeal"...

Jeff Fox

May 19, 2002, 10:54:09 PM
Len Zettel wrote:
> I'm not sure he (Chuck Moore) ever published a

> peer-reviewed paper in his life. The journals were
> certainly not prestigious. Elizabeth has recounted
> the bitter opposition her paper on the history of
> Forth aroused.

Not to mention that in addition to contributing to the paper that
Elizabeth got published at HOPL II, Chuck submitted a paper at the same
time that they simply rejected. I think it helped discourage him from
bothering to write articles. I found it quite insightful into the nature
of early Forth.

[Moore, 1991] Moore, Charles H.,The Invention of Forth,
(submitted to HOPL II but rejected)
http://www.colorforth.com/HOPL.html

In the references it lists a number of other papers including:

[Rather, 1993] Rather, Elizabeth D., Colburn, Donald R. and
Moore, Charles H., The Evolution of Forth, in the History of
Programming Languages-II, Bergin T. J. and Gibson, R.G., Ed.,
New York NY: Addison-Wesley, 1996, ISBN 0-202-89502-1
http://www.forth.com/Content/History/History1.htm

[Moore, 1958] Moore, Charles H. and Lautman, Don A., Predictions
for photographic tracking stations - APO Ephemeris 4, in SAO Special
Report No. 11, Schilling G. F., Ed., Cambridge MA: Smithsonian
Astrophysical Observatory, 1958 March.

[Moore, 1970] Moore, Charles H. and Leach, Geoffrey C., FORTH -
A Language for Interactive Computing, Amerterdam NY: Mohasco
Industries, Inc. (internal publication) 1970.
http://www.ultratechnology.com/4th_1970.pdf
http://www.ultratechnology.com/4th_1970.html

[Moore, 1972] Moore, Charles H. and Rather, Elizabeth D., The
FORTHprogram for spectral line observation on NRAO's 36 ft.
telescope, Astronomy & Astrophysics Supplement Series, Vol. 15,
No. 3, 1974 June, Proceedings of the Symposium on the Collection and
Analysis of Astrophysical Data, Charlottesville VA, 1972 Nov. 13-15.

[Moore, 1980] Moore, Charles H., The evolution of FORTH, an
unusual language, Byte, 5:8, 1980 August.
(the largest selling issue in the history of the magazine)

[Veis, 1960] Veis, George and Moore, C. H., SAO differential
orbit improvement program, in Tracking Programs and Orbit
Determination Seminar Proceedings, Pasadena CA: JPL, 1960
February 23-26.

I also liked Chuck's statement that
"I'd like to applaud the invaluable work of Hart [Hart, 1978]
in tabulating function approximations with various accuracies.
They have provided freedom from the limitations of existing
libraries to those of us in the trenches."

This was the reference:
[Hart, 1968] Hart, John F., et al, Computer Approximations.
Malabar FL: Krieger, 1968; (Second Edition), 1978,
ISBN 0 88275 642 7.

best wishes,
Jeff Fox

Jeff Fox

May 20, 2002, 1:13:29 AM
In reviewing Chuck's 91 version of pre-history

[Moore, 1991] Moore, Charles H.,The Invention of Forth,
(submitted to HOPL II but rejected)
http://www.colorforth.com/HOPL.html

I thought it was interesting that Chuck listed 3D, Spacewar,
and Chess as some of his early applications around 1968
around the time he named what he was doing Forth.

I also noticed from the end of that section
"After giving notice, I wrote and angry poem and a book that
has never been published. It described how to develop Forth
software and encouraged simplicity and innovation. It also
described indirect-threaded code, ..."

I wonder how many people have seen them. I wonder if they
still exist and how hard I could twist Chuck's arm again.
I wonder if Elizabeth got to see them.

I was also fascinated by the line
"The word FIELD was used in the manner of Mohasco and Forth
Inc's data-base management."

It goes along with a comment that Chuck made in the IRC
#forth chatroom a couple of weeks ago. I'll have to ask
if it means what I think it means.

best wishes,
Jeff Fox

Elizabeth D. Rather

May 20, 2002, 9:30:11 PM
Bernd Paysan wrote:
>
> Stephen J. Bevan wrote:
> > What sort of linguistic analysis would suggest this?
>
> dup, roll,

No Forth had ROLL until Fig, well after Postscript was invented.

> the order by which sub and div work (a b sub is a-b).

That's pretty much an inevitable consequence of stack use (see
HP below).

> The use of
> a parameter passing stack. Could also be been influenced by HP's
> calculators. The calculators from HP AFAIK were influenced by Forth.

Definitely not. HP calculators were widely marketed before anyone
but Chuck's best friends had ever heard of Forth.

Joe English

May 20, 2002, 9:18:32 PM
Bernd Paysan wrote:
>Stephen J. Bevan wrote:
>> What sort of linguistic analysis would suggest this?
>
>dup, roll, the order by which sub and div work (a b sub is a-b).
>[...]

>Linguistic anti-thesis: definitions (/string{list}def instead of : string
>list ;), add/sub/mul/div instead of punctation, using of more complex
>datatypes instead of raw memory as in Forth, ifelse as function, not
>compiling jumps, etc.

Also: 'pop' and 'exch' instead of DROP and SWAP (which always
drives me nuts when trying to write PostScript.)


--Joe English

jeng...@flightlab.com

Elizabeth D. Rather

May 20, 2002, 9:52:15 PM
Jeff Fox wrote:
>
> I also noticed from the end of that section
> "After giving notice, I wrote and angry poem and a book that
> has never been published. It described how to develop Forth
> software and encouraged simplicity and innovation. It also
> described indirect-threaded code, ..."
>
> I wonder how many people have seen them. I wonder if they
> still exist and how hard I could twist Chuck's arm again.
> I wonder if Elizabeth got to see them.

I have them.



> I was also fascinated by the line
> "The word FIELD was used in the manner of Mohasco and Forth
> Inc's data-base management."
>
> It goes along with a comment that Chuck made in the IRC
> #forth chatroom a couple of weeks ago. I'll have to ask
> if it means what I think it means.

Ask me, at least I know about FORTH, Inc.'s DBM.

Stephen J. Bevan

May 20, 2002, 11:41:29 PM
Bernd Paysan <bernd....@gmx.de> writes:
> Stephen J. Bevan wrote:
> > What sort of linguistic analysis would suggest this?
>
> dup, roll, the order by which sub and div work (a b sub is a-b). The use of
> a parameter passing stack.

I agree there are commonalities, though the above isn't a lot, but
it isn't clear to me that this implies Forth was an (indirect)
ancestor of PostScript.

> ... Furthermore, PostScript is an

> interactive language with incremental compiling,

As is/was Lisp and Pop (the latter being more relevant but I doubt
that the PostScript developers had heard of it).

> and the memory management
> of the first version of PostScript was strictly allot-style

Indeed, though this was later changed to require GC, like Lisp & Pop.

> Since PostScript bases on a raster language developed at Xerox PARC, the
> original authors of this language might be one of the missing links.

Well according to Jim Bowery, he would appear to be the missing link
and also provide the link to Forth:
http://www.geocities.com/jim_bowery/psgenesis.html

> > I agree with Open Firmware, and I wish that it was used more widely.
> > It is less clear to me what impact Forth had on Java.
>
> Java's inventor worked on a Display-PostScript compiler before. So that's an
> important influence factor.

Doesn't that depend on the Forth->PostScript link which has yet to be
established?

> Working for Sun, he *also* must have known about Open Firmware, but
> not necessarily in details.

Sun were working on NeWS and Open Firmware in 1988. Gosling worked on
NeWS. So while Gosling might well have known about Open Firmware it
isn't clear to me that it had any direct influence on Java. Using a
(stack based) virtual machine was nothing new to Gosling having hacked
on Display PostScript and before that (Gosling) Emacs.

Anton Ertl

May 20, 2002, 6:03:42 PM
"Elizabeth D. Rather" <era...@forth.com> writes:
>Bernd Paysan wrote:
>>
>> Stephen J. Bevan wrote:
>> > What sort of linguistic analysis whould suggest this?
>>
>> dup, roll,
>
>No Forth had ROLL until Fig, well after Postscript was invented.

fig-Forth was developed in or soon after 1978
(<http://www.forth.com/Content/History/History3.htm#3.1>). Postscript
apparently only started existing after Adobe was founded in 1982, but
it seems to be based on a number of ancestors ("the Design system",
"JaM", "Interpress", see
<http://www.usc.edu/isd/publications/networker/95-96/Mar_Apr_96/innerview.warnock.html>),
and at least "the Design system" apparently existed before 1982; it's
unclear if any of these ancestors had ROLL.

Anyway, Postscript ROLL works differently from fig-Forth ROLL, so it's not
an indication of common ancestry.

>> the order by which sub and div work (a b sub is a-b).
>
>That's pretty much an inevitable consequence of stack use

No, they could also have designed sub such that "b a sub" gives "a-b".
I think there are some stack machines around that did so.

However, keeping the arguments in the conventional order helps in
programming, so using that order has more than a 50% chance, and does
not support a thesis of common ancestry.
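Concretely (my illustration, not Anton's): with the conventional order that both Forth and PostScript keep, the operand pushed first is the left-hand side of the expression:

```forth
\ Conventional operand order: 5 3 - computes 5-3
5 3 - .       \ prints 2
\ A hypothetical reversed convention would make the same stack
\ picture compute 3-5 instead; some stack machines chose that.
```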

Julian Fondren

May 21, 2002, 9:19:55 AM
"Elizabeth D. Rather" <era...@forth.com> wrote in message news:<3CE9A35D...@forth.com>...

> Definitely not. HP calculators were widely marketed before anyone
> but Chuck's best friends had ever heard of Forth.

A history showing Forth influence on some HP calcs (the HP-71 at least
had Forth in the ROM and later a card with a complete Forth system):

http://groups.google.com/groups?selm=17096MGCOEKTUHKTLOF%40kbbs.com&rnum=2

The most recent RPL calc, the HP-49G, has three languages easily
programmed in by the user: Saturn assembler (through the built-in
MASD), SysRPL (through third-party compilers, though some support
is built-in), and UserRPL -- which is a safe subset of SysRPL with
a friendlier interpreter.

The HP-49G's OS is built in a mixture of assembler and SysRPL, and
is "headerless". In order to use the two or three thousand supported
(which addresses the HP-49G team promises not to change) and unsupported
(which don't exist anymore, as the HP-49G ROM has died) SysRPL entry
points you need to get 'extable', which is basically a dictionary for the
ROM. The ROM has a dictionary for the (again, I don't know the exact
number; there are many of them) UserRPL parts of the ROM built-in, of
course, as these form the basic programming language of the calc.

SysRPL's DUP is exactly analogous to Forth's DUP (UserRPL's DUP is the
same, except that it checks the stack first). SysRPL also has ROLL
PICK OVER DROP and many combinations of these -- SWAPDROP 3DROP 2DUP
3PICK3PICK etc. Here is a short SysRPL definition:

:: 91 DUP SysITE ClrSysFlag SetSysFlag ;

which corresponds to

:noname 91 dup SysFlagSet? if ClrSysFlag else SetSysFlag then ;

except... not really =) :: ... ; (when compiled) creates an object which
contains SysRPL code, which is like a subroutine-threaded Forth definition.
SysITE actually manipulates what you might as well think of as the input
stream so that the next word is executed if the TOS System-flag is true, or
else the next-next word is executed. Only one of these is. It's not as
simple as this, since

:: 91 DUP SysITE :: ClrSysFlag "Cleared" ; :: SetSysFlag "Set" ; ;

also works how you'd expect. All control-flow works this way.

91 above, BTW, would be compiled to BINT91 (which is simply a word that
returns a 5-nibble number 91 -- SysRPL has BINT0 to I think BINT132, with
other special numbers). Here is code with a floating-point number, a
symbolic constant, a complex number, and an infinite-precision integer:

:: % 5.6
xPI
C% 0.,0.
ZINT 12345678901234567890
;

HP-49G SysRPL has more than this, since the 49G is basically a
symbolic math machine.

finally,

:: CK2 ZERO_DO :: DUP x* ; LOOP ;

which is a very stupid word with a stack-effect of
( any-type-of-number 5-nibble-number -- any-type-of-number^[5-nibble-number] )

Jenny Brien

May 21, 2002, 9:44:22 AM
On Mon, 20 May 2002 15:34:50 GMT, alb...@spenarnc.xs4all.nl (Albert
van der Horst) wrote:

>
>TO need not cause any problems because of its parsing requirement.
>ISO requires TO to parse but interestingly it is impossible
>to write a standard program that detects that your TO doesn't parse!
>This is because it is forbidden to postpone TO in a standard program.
>(I would be very interested if somebody came up with a program to
>disprove this!)

<quote>
A.6.2.2295 TO
Historically, some implementations of TO have not explicitly parsed.
Instead, they set a mode flag that is tested by the subsequent
execution of name. ANS Forth explicitly requires that TO must parse,
so that TO's effect will be predictable when it is used at the end of
the parse area.
</quote>

BUT
If you put a parsing TO at the end of the parse area it will get a
zero-length name AND "An ambiguous condition exists if name was not
defined by VALUE."

I think that's the only way you can distinguish between the two
versions, and it can't be used in a Standard program, since it has no
useful effect with a parsing TO.

A non-parsing TO will change the action of the next VALUE irrespective
of where it occurs. That will also happen if an error should occur
between the execution of TO and its recipient. Not good. The only way
to avoid it is to define REFILL and/or whatever words unnest input to
unset the TO-message.

Jenny Brien
http://www.fig-uk.org
Home of the Fig UK website

Peter Lawrence

May 21, 2002, 9:57:33 AM
Julian V. Noble wrote:
>
> Peter Lawrence wrote:
> >
> > Jerry Avins wrote:
> > >
> > > Heartily endorsed!
> >
> Peter Lawrence wrote:
> >
> > Did you hear the one about the four Boy Scouts who helped a little old lady
> > across the road? It took so many because she didn't want to go.
> >
> > Wouldn't it be polite to ask MR. MOORE's views on the subject first? PML.
>
> Get real! What would you expect someone to say if you
> asked "Should I nominate you for a prize that you richly
> deserve but stand a small chance of receiving?" (Sort
> of like me and the Nobel ;-)

YOU get real. As Shaw said, "Do not do unto others as you would have them do
unto you, your tastes may not be the same."

I know from my own personal experience that I have suffered far more from
people trying to help than from people who weren't actively interfering in my
life. So now I try to show consideration of this sort.

>
> One nominates people for prizes because one owes it to one's
> profession.

If I may say so, that is very selfish.

> There is no disgrace in not getting the prize,
> but great honor if one does get it. Therefore one should do
> it, if he believes the potential recipient worthy.

Whether or not the potential recipient wants it? All I suggested was that he
should at least be consulted. In my view, it is arrogant to impose in this
way without at least asking.

> I definitely
> do, in Moore's case. I just think it would come better from
> someone with more impressive credentials than mine, or I
> would do it myself.

In other words, hard luck, he deserves it so he's liable to have it thrust
upon him. It's all very reminiscent of people who claim that celebrities ask
for it. PML.

--
GST+NPT=JOBS

I.e., a Goods and Services Tax (or almost any other broad based production
tax), with a Negative Payroll Tax, promotes employment.

See http://users.netlink.com.au/~peterl/publicns.html#AFRLET2 and the other
items on that page for some reasons why.

Joe Armstrong

May 22, 2002, 5:31:45 AM
Tim Daneliuk <tun...@tundraware.com> writes:

... cut ..

> <grumbling-mode>
>
... wise words ... cut ...

>
> Birth Of Source Of Complexity/
> Paradigm Paradigm Principal Intellectual Problem
> -------- -------- ------------------------------
>
> 1950s Algorithm decomposition/design Comprehending/reducing
> O(n) complexity
>
...

>
> 1980s Functional Programming Transforming real world
> problems into functional
> equivalents upon which
> FP can operate.
... more wise words ...


>
> It is amazing with all the advances we've seen in language theory,
> design tools, and so on that so little attention is paid to
> supporting (a)synchrony, concurrent processing, locking,
> and rendezvous *as first-class properties of the languages themselves*.
> FORTH clearly led the way here but newer languages like Erlang
> are starting to think this way too.

Thank you - at last ...

I have been arguing myself blue in the face that that concurrency
should be a property of the language and NOT the OS.

Concurrency has got a thoroughly bad name because of the lousy
way threads are *implemented* in most languages.

Green threads in Java give concurrency a bad name - creating a
parallel process in Erlang (or Oz) is about 100 times faster than in
Java (or C#) - that's because concurrency is *designed into the
language*

The real world *is* concurrent - IMHO writing programs to interact
with the real world is simply a matter of identifying the
concurrency in the problem, identifying the message channels and
mapping these 1:1 onto the code - the program then writes itself.

Programming with threads is usually considered a "black art" (mainly
because every implementor starts by writing their own scheduler) -- I
guess if every Java programmer had to start by writing their own
garbage collector then automatic memory management would not be
popular; this is more or less analogous to writing a scheduler if you
want to write a concurrent application :-)

Once you put concurrency into the language a lot of things look
*very* different - (and IMHO are simpler to formulate) - my favorite
is error handling. In Erlang the golden rule is "let it crash" - we
divide applications into two concurrent processes - a doer (which does
the job) and a watcher (who observes the doing of the job). The doer
has *no* code for error handling - the watcher has.

This gives us total separation of concerns - you either concentrate
on "doing the job" or on "recovering from an error"

>
> I reserve the right to be wrong, but it seems to me that we ought to be
> evolving to a world in which programmers of complex systems write in
> meta-languages like Python or Ruby - which embrace OO, procedural, DBMS and
> Functional Programming notions - but whose runtimes would be best delivered
> in something like FORTH or Erlang. Different software paradigms could thus
> be used interchangeably because they are first-class properties of the
> selected language, but the intermediate code output would be realized in
> a highly efficient, easily optimized, event-sensitive runtime environment
> This is probably a pipe-dream because trying to efficiently map
> the semantics of a late-bound dynamic language like Python onto
> a sleek FORTH runtime is possibly too costly in terms of runtime
> size and speed. Or is it?

The Erlang run-time is highly optimised for Erlang. I'd like to see
the ideas in the Erlang run-time in the .NET run-time.

The first Erlang VM was a pretty forth-like machine. Creating a
thread and sending a message etc. was achieved with a single byte
op-code. Later versions use a threaded-word interpreter (again like
Forth) - Unlike Forth, Erlang functions are well disciplined in their
stack usage :-)

Unfortunately the .NET run time and IL seem best suited for
implementing sequential languages - hence the appalling thread
behaviour of C# - they made the *same* mistake as in Java - you get
the thread behaviour of the underlying OS - NOT (as it should be) a
defined property of the language.

/Joe Armstrong

>
> Then again, I'm just Grumbling ...
>

... grumble on, you're quite right ...


> </grumbling-mode>
> ------------------------------------------------------------------------------
> Tim Daneliuk
> tun...@tundraware.com

Anton Ertl

May 22, 2002, 3:36:09 AM
alb...@spenarnc.xs4all.nl (Albert van der Horst) writes:
>In article <2002May1...@a0.complang.tuwien.ac.at>,
>Anton Ertl <an...@mips.complang.tuwien.ac.at> wrote:
>>I would argue against that view. CREATE...DOES> as usually used is
>>just using compile-time binding, and TO etc. are doing that, too. For
>>OO, run-time binding is required. Apart from that, TO etc. cause a
>>lot of problems because of their parsing behaviour.
>
>OO is about concepts. Early or late binding has nothing to do with it.

It has everything to do with it. Take the textbook OOP example:
drawing a collection of various graphical objects. With early binding
you can just draw one kind of object, with late binding you can draw
each object in the right way.

>This implementation of VALUE clearly shows the relation with OO concepts.
>And I claim it is ISO-compliant.
>
>VARIABLE TO-MESSAGE
>0 CONSTANT %from : FROM %from TO-MESSAGE ! ;
>1 CONSTANT %to : TO %to TO-MESSAGE ! ;
>2 CONSTANT %+to : +TO %+to TO-MESSAGE ! ;
>
>: VALUE CREATE DOES>
> TO-MESSAGE @ %from = IF @ THEN
> TO-MESSAGE @ %to = IF ! THEN
> TO-MESSAGE @ %+to = IF +! THEN
> FROM
>;

So you can implement VALUEs in a (very cumbersome) OO way. That shows
that OOP is at least as powerful as VALUEs and their prepositions; it
does not show that VALUEs are OOP.

As for a standard program where your implementation fails; how about:

: double-to ['] to dup >r execute r> execute ;
value a
value b
1 2 double-to a b

It's not entirely clear that ticking TO is allowed, though: it has
interpretation semantics, but no execution semantics. Since ticking
TO and S" could make any STATE-smartness in the implementation of
these words visible, the TC will probably decide it's non-standard if
asked.

Anton Ertl

May 22, 2002, 3:55:49 AM
k...@panix.com (Keith A. Lewis) writes:
>How many ACM members know APL or Iverson?

Many, I think, probably about as many as for Forth and Moore. The
problem is not that they have not heard of it, but that they don't
think highly of it. I would have expected APL and Iverson to have a
similar problem, and therefore I am surprised that he received the
award. Maybe it has something to do with Backus' Turing award lecture
a few years earlier, where he proposed FP, a functional programming
variant of APL.

Anton Ertl

May 22, 2002, 4:00:08 AM
Bernd Paysan <bernd....@gmx.de> writes:
>Java's inventor worked on a Display-PostScript compiler before. So that's an
>important influence factor.

The JVM has SWAP, not EXCH, indicating additional influences. I would
be surprised if the JVM authors did not know a little bit about Forth.

>Java is a step back to
>the origin, because it's nothing more than Borroughs with OOP but without
>hardware.

Pascal P-code also comes to mind.

Anton Ertl

May 22, 2002, 4:06:58 AM
jeng...@flightlab.com (Joe English) writes:
>Also: 'pop' and 'exch' instead of DROP and SWAP (which always
>drives me nuts when trying to write PostScript.)

That's easy to solve:

/drop {pop} def
/swap {exch} def

Elizabeth D. Rather

May 22, 2002, 2:35:52 PM
Anton Ertl wrote:
>
> k...@panix.com (Keith A. Lewis) writes:
> >How many ACM members know APL or Iverson?
>
> Many, I think, probably about as many as for Forth and Moore. The
> problem is not that they have not heard of it, but that they don't
> think highly of it. I would have expected APL and Iverson to have a
> similar problem, and therefore I am surprised that he received the
> award. Maybe it has something to do with Backus' Turing award lecture
> a few years earlier, where he proposed FP, a functional programming
> variant of APL.

Iverson at least worked for IBM, which was at the time a very good
credibility factor. And IBM officially promulgated APL, as did
a number of companies ~1970. APL didn't really catch on, though,
probably due to its requirement for special characters and keyboards,
but was quite innovative and interesting in its way.

The Turing award is partly about what you did, but also to a very
great degree who you are and what credentials you present that are
acceptable to the Computer Science community. That's a problem
for Chuck, since he never played in that sand box.

Albert van der Horst

May 22, 2002, 5:57:36 AM
In article <3cea455...@usenet.plus.net>,

Jenny Brien <jen...@figuk.plus.com> wrote:
>On Mon, 20 May 2002 15:34:50 GMT, alb...@spenarnc.xs4all.nl (Albert
>van der Horst) wrote:
>
>>
>TO need not cause any problems because of its parsing requirement.
>>ISO requires TO to parse but interestingly it is impossible
>>to write a standard program that detects that your TO doesn't parse!
>>This is because it is forbidden to postpone TO in a standard program.
>>(I would be very interested if somebody came up with a program to
>>disprove this!)
>
><quote>
>A.6.2.2295 TO
>Historically, some implementations of TO have not explicitly parsed.
>Instead, they set a mode flag that is tested by the subsequent
>execution of name. ANS Forth explicitly requires that TO must parse,
>so that TO's effect will be predictable when it is used at the end of
>the parse area.
></quote>

I am aware of that quote.

>BUT
>If you put a parsing TO at the end of the parse area it will get a
>zero-length name AND "An ambiguous condition exists if name was not
>defined by VALUE."

Exactly. So you have to make a standard program that provokes
an ambiguous condition. But then it is no longer standard.

>I think that's the only way you can distiguish between the two
>versions, and it can't be used in a Standard program, since it has no
>useful effect with a parsing TO.

Ambiguous means that it may crash or do whatever it likes.
So if I detect that someone is using my non-parsing TO at the
end of the parse area, I just reformat the hard disk to
remove all traces of what happened. (Running on Windows, or as
superuser.)

>A non-parsing TO will change the action of the next VALUE irrespective
>of where it occurs. That will also happen if an error should occur

Actually, because this is an ambiguous condition nobody can forbid
me to do that and I am still standard.

>between the execution of TO and its recipient. Not good. The only way

^^^^^^^
Of course you are entitled to that opinion. But we are talking legal
here.

>to avoid it is to define REFILL and/or whatever words unnest input to
>unset the TO-message.

^^^^^^^^^^
Good! You are picking up the OO language.

>Jenny Brien

Groetjes Albert.

Jeff Fox

May 22, 2002, 8:11:11 PM
"Elizabeth D. Rather" wrote:
> > I was also fascinated by the line
> > "The word FIELD was used in the manner of Mohasco and Forth
> > Inc's data-base management."
> >
> > It goes along with a comment that Chuck made in the IRC
> > #forth chatroom a couple of weeks ago. I'll have to ask
> > if it means what I think it means.
>
> Ask me, at least I know about FORTH, Inc.'s DBM.

Thanks. Chuck was recently asked how he did C style
structures. He replied, in his typical fashion,
that he didn't (do them that way). He said something
about how the word FIELD selected the current record
in the current file in the Forth Inc. software.

This made me think that a database was built starting
with BLOCK and that records and fields were definable
in Forth and thus provided a mechanism similar to
programmer defined structures in C, but different.
It seems like having database and file system words as
parts or extensions of Forth would provide a mechanism
similar to structures in C.

Can you tell us more about FIELD and such in that software?

best wishes,
Jeff

Elizabeth D. Rather

May 22, 2002, 9:18:27 PM
Jeff Fox wrote:
> ...

> Chuck was recently asked how he did C style
> structures. He replied, in his typical fashion,
> that he didn't (do them that way). He said something
> about how the word FIELD selected the current record
> in the current file in the Forth Inc. software.
>
> This made me think that a database was built starting
> with BLOCK and that records and fields were definable
> in Forth and thus provided a mechanism similar to
> programmer defined structures in C, but different.
> It seems like having database and file system words as
> parts or extensions of Forth would provide a mechanism
> similar to structures in C.
>
> Can you tell us more about FIELD and such in that software?

You're on the right track. Here's a brief tutorial:

A FILE was a contiguous range of blocks. Its definition
included starting blk#, size, record size.

A RECORD was a fixed-length subset of a file, less than
a block in size. Records couldn't straddle block
boundaries, if the record size was such that they didn't
fit evenly in a block some bytes were left over at the
end of the block.

A FIELD was a fixed-length subset of a record. There
were several field types, such as NUMERIC (single-cell
integer), DOUBLE (2-cell integer), BYTES (string <n>
bytes long). There were different defining words for
each field type, and each associated the name of the
definition with an offset from the beginning of the
record.

At any instant there's a "current file" and "current
record" within that file. You select a file by
invoking its name; you select a record by <n> READ.

All real access was done by <field-name> <method>.
There were methods for each field type.

For example:

\ Blk# b/r Recs Blks name
1000 32 128 4 FILE PEOPLE

NUMERIC SEQ# DOUBLE PHONE# 26 BYTES NAME

: SHOW ( n -- ) \ Display record n
PEOPLE READ CR
SEQ# N@ . NAME B? PHONE# D@ .PHONE# CR ;

There's a little more to it, including index methods
etc., but in principle it's very simple. Some very
large and complex databases have been built on it.
One was the first electronic funds transfer program
written by Citibank in the 1970's. Another was a
municipal bond analysis system developed in 1978
and still in use... the customer told me a state
agency spent $5M trying to duplicate it using
"modern" database tools but failed (programs too
slow, databases too large; our version used a
2-dimensional bit matrix that they couldn't
replicate using standard DB tools).

FWIW, here's how to do C-style structs in
Forth:

\ Structs
0 constant struct
: field ( n1 size -- n2 ) create over , +
does> ( a1 -- a2 ) @ + ;

\ Example:
struct ( scsi registers)
0c field >cmd-adr \ Up to 12 command bytes
4 field >cmd-len \ Length of command block
4 field >data-adr \ Base address of DMA data area
4 field >data-len \ Length of data area
1 field >host-selectid \ Host's selection ID
1 field >target-selectid \ Target's selection ID
1 field >input? \ 1 for output, 0 for input
1 field >message-out \ Outgoing message byte
\ more fields...
constant /scsi-regs \ Size of register structure

Samuel A. Falvo II

May 23, 2002, 3:37:36 PM
"Elizabeth D. Rather" <era...@forth.com> wrote in message news:<3CEC439E...@forth.com>...

> For example:
>
> \ Blk# b/r Recs Blks name
> 1000 32 128 4 FILE PEOPLE
>
> NUMERIC SEQ# DOUBLE PHONE# 26 BYTES NAME
>
> : SHOW ( n -- ) \ Display record n
> PEOPLE READ CR
> SEQ# N@ . NAME B? PHONE# D@ .PHONE# CR ;

As a regular practitioner of the "C-style structures," I immediately
see the above as merely a specialization of the concept. It's highly
adapted for use with a block structure, where the base address is
kept in a variable or computed at run-time:

VARIABLE FieldOffset

: NUMERIC CREATE FieldOffset @ , 1 CELLS FieldOffset +!
DOES> @ RecordAddr + ;

People keep telling me that Chuck's way is somehow "faster". Faster
to type, maybe, but overall run-time performance will be about the
same for structures stored in blocks.

Hence, the only advantage to Chuck's method is that it's easier to type,
which for me is equivalent to a stylistic issue.

--
Samuel A. Falvo II

Elizabeth D. Rather

May 23, 2002, 6:23:19 PM
"Samuel A. Falvo II" wrote:
> ...

> As a regular practitioner of the "C-style structures," I immediately
> see the above as merely a specialization of the concept. It's highly
> adapted for use with a block structure, where the base address is
> kept in a variable or computed at run-time:
>
> VARIABLE FieldOffset
>
> : NUMERIC CREATE FieldOffset @ , 1 CELLS FieldOffset +!
> DOES> @ RecordAddr + ;
>
> People keep telling me that Chuck's way is somehow "faster". Faster
> to type, maybe, but overall run-time performance will be about the
> same for structures stored in blocks.

Indeed, that's roughly how it's done, except the offset is kept
on the stack at compile time, so the VARIABLE isn't required.

Albert van der Horst

May 23, 2002, 10:47:58 AM
In article <2002May2...@a0.complang.tuwien.ac.at>,

Anton Ertl <an...@mips.complang.tuwien.ac.at> wrote:
>alb...@spenarnc.xs4all.nl (Albert van der Horst) writes:
>>In article <2002May1...@a0.complang.tuwien.ac.at>,
>>Anton Ertl <an...@mips.complang.tuwien.ac.at> wrote:
>>>I would argue against that view. CREATE...DOES> as usually used is
>>>just using compile-time binding, and TO etc. are doing that, too. For
>>>OO, run-time binding is required. Apart from that, TO etc. cause a
>>>lot of problems because of their parsing behaviour.
>>
>>OO is about concepts. Early or late binding has nothing to do with it.
>
>It has everything to do with it. Take the textbook OOP example:
>drawing a collection of various graphical objects. With early binding
>you can just draw one kind of object, with late binding you can draw
>each object in the right way.

If you argue that late binding is the True Way of OO I am with you.

>>This implementation of VALUE clearly shows the relation with OO concepts.
>>And I claim it is ISO-compliant.

<CODE SNIPPED>


>
>So you can implement VALUEs in a (very cumbersome) OO way. That shows
>that OOP is at least as powerful as VALUEs and their prepositions; it
>does not show that VALUEs are OOP.

The first time we demonstrated the transputer Forth (tForth) the
number theoretical program (multiple polynomial quadratic sieve)
was organised in this, let us call it OO-reminiscent, way.
So we had a matrix visualised on the screen where you could send
partial solutions to (messages) etc. You had TheNumber where
you could send potential factors to and it would stop the
whole circus if it discovered it was fully factored etc.
So I maintain that organising your program in this way with
CREATE DOES> is a lightweight way of doing OO and in my view
the advantages of OO were realised in this program.
I can't agree that it is ``very cumbersome''. It is a bit slow,
but inspecting messages was no overhead compared to all the
calculation with big numbers that was going on.

>As for a standard program where your implementation fails; how about:
>
>: double-to ['] to dup >r execute r> execute ;
>value a
>value b
>1 2 double-to a b
>
>It's not entirely clear that ticking TO is allowed, though: it has
>interpretation semantics, but no execution semantics. Since ticking
>TO and S" could make any STATE-smartness in the implementation of
>these words visible, the TC will probably decide it's non-standard if
>asked.

Exactly my point. So this is not a counter example.
The paragraph of the Standard that was quoted is very peculiar,
because it impedes implementors without any need. This is unique
because in the whole standard the emphasis is on what a user
can rely on, not on what an implementor is disallowed from
doing to realise an implementation.

I guess there is a lot of politics working in the background
of that paragraph. (Anti-Europeanism, because the solution
disallowed is the typical classical European solution.)

>- anton
Groetjes Albert

jmdrake

May 24, 2002, 5:34:30 PM
an...@mips.complang.tuwien.ac.at (Anton Ertl) wrote in message news:<2002May2...@a0.complang.tuwien.ac.at>...

> alb...@spenarnc.xs4all.nl (Albert van der Horst) writes:
> >In article <2002May1...@a0.complang.tuwien.ac.at>,
> >Anton Ertl <an...@mips.complang.tuwien.ac.at> wrote:
> >>I would argue against that view. CREATE...DOES> as usually used is
> >>just using compile-time binding, and TO etc. are doing that, too. For
> >>OO, run-time binding is required. Apart from that, TO etc. cause a
> >>lot of problems because of their parsing behaviour.
> >
> >OO is about concepts. Early or late binding has nothing to do with it.
>
> It has everything to do with it. Take the textbook OOP example:
> drawing a collection of various graphical objects. With early binding
> you can just draw one kiind of object, with late binding you can draw
> each object in the right way.

OO is classically defined as having 3 properties: 1) Abstraction (information
hiding) 2) Polymorphism (late binding) 3) Inheritance.

I suppose that CREATE...DOES> can be thought of as a type of abstraction.
Though information isn't "hidden" it does let you look at certain problems
at a higher level. Standard Forth does support polymorphism through tick
and execute. Here's the textbook OOP example in standard Forth.

: stars 0 do 42 emit loop ;

: drawtriangle
1+ 1 do
2dup swap i - swap at-xy i 2* 1- stars 1+
loop 2drop ;

: drawsquare
dup 0 do
>r 2dup at-xy 1+ r> dup stars
loop drop 2drop ;

: geo.obj create swap rot , , , , does> dup 3 cells + ;

: @+ dup @ swap cell+ ;

: getparams @+ @+ @ ;

: draw @ >r getparams r> execute ;

' drawtriangle 10 5 8 geo.obj triangle1
' drawsquare 15 8 10 geo.obj square1
triangle1 draw square1 draw

Quick explanation of the code. The CREATE part of geo.obj stores the
XT for the specified drawing word as well as the parameters X, Y and
height. The DOES part puts the address for the start of the member
variables X,Y and H on the stack as well as the address for the
drawing word. DRAW retrieves the parameters X, Y and H, the function
pointer and then executes the function.

Now what's missing? Inheritance for one thing. This example breaks
down if I need an additional member variable. For example, say I
want to draw a rectangle? Then I need X, Y, Height and Width.
But polymorphism isn't a problem. Also note, this same example
could work in Pascal which isn't object oriented but has procedure
types. And likewise the Pascal example will break if you need
additional member variables or functions. Oberon, a descendant
of Pascal, has inheritance through record extension. It also has
abstraction by way of modules. It is considered object oriented.

Jeff Fox

May 25, 2002, 12:37:18 AM
"Samuel A. Falvo II" wrote:
> People keep telling me that Chuck's way is somehow "faster".
> Faster to type, maybe, but overall run-time performance will be about the
> same for structures stored in blocks.

The other day in the chat room I was not talking about
what Chuck was doing 35 years ago at Mohasco or later at Forth
Inc. but what he is doing today. It is nice that you can sort of
map what Chuck did thirty years ago to what you are doing today.
Blocks to memory mapped files, database record fields to
C structure fields etc. but I think you still missed the
`somehows' that account for the way faster.

I have explained the reasons for big numbers before. I did
last week again, well I tried. Chuck sat through all our
many discussions at iTV about the worst examples vs the best
examples of using his alternatives to structures compared to
using C style structures in Forth. Worst case being the code
that was pasted in from the FSL, best case being machineForth
done Chuck's way. Today, the machineForth code would be presented
in colorforth source format but the object code being specified
would be the same.

In my tutorials at iTV I showed that 99% of the time, by
runtime analysis of my code, the access was free. The
overhead was zero, so comparing it to what the people doing C
style structures in ANS Forth were getting was difficult because
division by zero is undefined. Perhaps your Forth compiler
has some advanced optimizations that I am not aware of.


> Hence, the only advantage to Chuck's method is it's easier to type,
> which for me, is equivalent to a stylistic issue.

I really don't know what you are talking about. Maybe you
can explain it to me. I consider anything bigger than 2 big.
2-3x here times 5-10x there times 100x over here etc. result in
really big numbers in this context. That's why when asked the
question, "How do you do C style structures, which are essential
to my work?" Chuck's answer was "I don't."

Perhaps you are comparing what Chuck had been
doing 35 years ago with what you are doing today.

best wishes,
Jeff Fox

Albert van der Horst

May 25, 2002, 5:27:47 AM
In article <e20a4a47.02052...@posting.google.com>,
jmdrake <jmdra...@yahoo.com> wrote:

>OO is classicly defined as having 3 properties: 1) Abstraction (information
>hiding) 2) Polymorphism (late binding) 3) Inheritence.

No, it is the modern version (and insofar as it is dogmatic, a perversion)
of the class concept of SIMULA (1967). It crystallized with Meyer's
book about Eiffel.

If one were to be as dogmatic as some OO advocates about structured
programming, Forth would not be counted among the structured programming
languages.
An argument like "Forth has no looping construct with a built in
condition" would be used to say that Forth is not structured.
There was an editorial in Dobb's some time ago where it was argued
that the polymorphism was in fact never essential, and that we just
had been repeating the same stories to each other over and over again.

Object orientation is useful, but you must get at the heart of it.
Polymorphism and inheritance are just limbs that you can cut off.
Abstraction is, of course, right at the heart.

Greetings Albert

Samuel A. Falvo II

May 26, 2002, 7:28:20 PM
Jeff Fox <f...@ultratechnology.com> wrote in message news:<3CEF16BC...@ultratechnology.com>...

> The other day in the chat room I was not talking about
> what Chuck was doing 35 years ago at Mohasco or later at Forth
> Inc. but what he is doing today. It is nice that you can sort of

What he was doing 35 years ago, and what he's doing today, to me, are
exactly the same.

> map what Chuck did thirty years ago to what you are doing today.
> Blocks to memory mapped files, database record fields to
> C structure fields etc. but I think you still missed the
> `somehows' that account for the way faster.

Those somehows were not explained. If they were, please point them
out to me, because I sure didn't see them.

You can't break the laws of physics, and as a physicist yourself, you
should know this already. To store a datum at a storage location, the
computer *somehow* has to do the math to figure out where to store it.
In the simplest possible case, the programmer says, "Store it at
location 56," and be done with it. That 56 is hardcoded into the
program, and as such, no further computation is necessary.

The disadvantage of this solution, as I've pointed out during those
same chat room discussions, is you're limited to one, and *only* one,
instance of that datum. Storage location 56 must have some special
significance to the program, or else it wouldn't be storing data
there.

If the computer needs to store one datum, it will likely need to store
several of them. What's worse, the computer will quite likely NOT
know how many data it'll need before hand; it will have to create
them, basically, "on the fly" at run time. Hence, the locations of
the data will not even be known until it actually needs them.

Thus, possessing the ability to define a data structure allows the
programmer to specify a base address for an object, and an offset.
The computer does the math, either implicitly in a primitive, or
explicitly by using "C-style structures" (never mind the fact that
they're not bound to being C-style by any stretch of the imagination).
It doesn't matter when or how it does it; the fact is, at some point
in time, a base address and an offset MUST be computed.

Now, if you're referring to Chuck's OKAD application, where most of
his work seems to have been concentrated recently, I'd like to point
out that he DOES know, ahead of time, how much data he'll need,
because chips don't dynamically change size while in operation (except
for thermal expansion and contraction, but that's negligible). That's
different: he can allocate N data structures, all contiguously
allocated in memory, and have free rein over how they're accessed.

Software which must run in, and even participate in, the dynamically
changing world around it, however, doesn't have that luxury.

> In my tutorials at iTV I showed that 99% of the time, by
> runtime analysis of my code, the access was free. The

Please explain yourself here. Run-time analysis of my code tells me
that structures are not only necessary, but even critical, to the
success of my project.

> overhead was zero so comparing it to what the people doing C
> style structures in ANS Forth was difficult because
> division by zero is undefined. Perhaps your Forth compiler
> has some advanced optimizations that I am not aware of.

More accurately, perhaps your development processes have not been
fully exposed or explained.

> > Hence, the only advantage to Chuck's method is it's easier to type,
> > which for me, is equivalent to a stylistic issue.
>
> I really don't know what you are talking about. Maybe you

It's very simple. It's easier to type because:

1. Chuck assigns fields with the same names to the same offsets
2. Chuck doesn't ever access data structures outside of a block

Hence, his word nomenclature need not include explicit type naming
(e.g., String.Address versus just Address) -- fewer characters to type,
and hence, faster to type.

But not faster to execute. Between his style and mine, it's clear
that, *TO ACHIEVE THE SAME RESULTS*, we both need access to blocks,
and we both need to update blocks, and we both need to flush them back
to backing storage, in the same relative order. The method in which
this is done is not important. While there will be some slight
differences in execution times, the I/O latency will more than dwarf
those differences. Hence, over the course of a program execution
cycle, our programs will run essentially at the same speed.

> can explain it to me. I consider anything bigger than 2 big.
> 2-3x here times 5-10x there times 100x over here etc. result in
> really big numbers in this context. That's why when asked the
> question, "How do you do C style structures, which are essential
> to my work?" Chuck's answer was "I don't."

Using C-style structures as defined by Elizabeth may incur some
runtime overhead, only because it's implemented quickly and naively.
It's far from the only way of implementing them. It's quite possible
to implement the words as compiler words which update a state
variable in the compiler for the next @ or ! instruction to be
compiled. The emitted fetch or store can then use the appropriate
CPU (reg+offset) addressing mode, without explicit additions. The
overhead becomes *zero*, which supports my observation that
Chuck's technique really isn't all that different from what other
people are doing.

Now, here's some food for thought. With careful analysis of how the
fields of a data structure are used, and in what order, you can create
an "optimal" structure layout such that you effectively "stream" data
structures onto and off of the data stack. I noticed that Chuck does
this extensively in his Machine Forth examples, employing the A
register for this. Hence, in only a couple of clocks, you blast an
entire data structure onto the data stack, where access to it is
essentially free thereafter. This, I think, is what you're most
likely talking about, where the cost of computing effective addresses
is amortized across the size of the structure.

However, this doesn't invalidate or even compete with the use of
C-styled structure access. In fact, it complements it quite nicely.
I utilize a similar technique for building my word headers in my
FS/Forth program that I'm writing. Although the word header is
defined (and hence documented) in the source as:

STRUCTURE
1 CELLS FIELD xt.Previous
SIZEOF_String FIELD xt.Name
1 CELLS FIELD xt.SourceBlock
END-STRUCT SIZEOF_xt

I actually build the word headers by streaming data into memory:

: (S,) ( caddr u -- )
THERE T>HOST SWAP CMOVE ;

: S, ( caddr u -- )
2DUP (S,) NIP TALLOT ;

: Name, ( -- caddr u )
THERE GetWord DUP >R S, R> ;

: Header, ( caddr u -- )
THERE tLastXT T!
tPrevXT T@ T, SWAP T, T, BLK @ T, ;

: :-Header,
Name, Header, ;

You'll also notice that I made xt.Previous the first field of the
structure. Since it's clear that that will be the most frequently
accessed field of the structure, it doesn't make sense to put it
anywhere else -- addition with zero is a no operation. Hence, I can
just @ to get the next execution token.

The name field is next in line, since that's actually the second most
heavily used field. Access to that field occurs with the use of $@
and $!:

( xt ) xt.Name $@ TYPE

will print the name of the word the xt points to. $@ and $!, as you
might expect, more or less stream data onto the stack. Since ANS
Forth doesn't have A! and @A+ and the like, this carries relatively
high overhead and takes some time to execute. Machine Forth doesn't
have that restriction, so there the overhead is near zero. If my
memory serves me correctly about Machine Forth, this is the
difference:

: $@ ( str -- caddr u )                 \ ANS Forth version
   DUP String.Address @ SWAP String.Length @ ;

: $@ ( str -- caddr u )                 \ Machine Forth version
   A! @A+ @A+ ;

You do the math. Of course, in Pygmy Forth, I could also do this:

CODE $@
0 [BX] AX MOV,
AX PUSH,
2 [BX] BX MOV,
NXT,
END-CODE

Can't get much cleaner than that. BUT, I didn't do that, because
frankly, the naive solution works just as well, gives no noticeable
degradation in performance compared to the uber-speedy version, and
is easier to port (I quite regularly exchange software
between Pygmy and GForth, for example). Hence, it runs counter to what I
like Forth for -- what good is a super fast program if I have to spend
hours hacking at the code to port it to specific environments?

Now, if I were to write a program that did, say, numerical
simulations, or super heavy string manipulation, or was so CPU bound
that I/O latencies are no longer the limiting factor of program
performance, **AND** I could prove that structure accesses were a
large, contributing factor to the lack of performance, then yes, I'd
optimize that.

To conclude, my experience shows that your 2-3x slowdown in "record
access" (to make it explicitly more general than structure access) by
using structures is simply unfounded in all the software I've written
to date. I therefore have to take your claims of 5x to 100x slowdown
as being the result of a poor algorithm, and NOT from the use of
structures.

Bernd Paysan

May 26, 2002, 3:57:03 PM
Albert van der Horst wrote:
> Object orientation is useful, but you must get at the heart of it.
> Polymorphism and inheritance are just limbs that you can cut off.

Albert, if you cut polymorphism and inheritance out of OO, it becomes just a
namespace+structures thing.

And by the way: Forth has about every structural concept that you can
imagine. If you want to implement Algol's for i to j by k while cond do
xyz;, you get

( j+1 i -- ) DO cond 0= IF LEAVE THEN xyz k +LOOP

or ?LEAVE instead of IF LEAVE THEN if available. Unlike the Algol case, k
can be different for each iteration. Forth allows to implement *all* cases
Knuth gives in his book "Structured programming with Gotos" without a
single goto.
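As a concrete (hypothetical) instance of that pattern: sum the odd numbers
from 1 up to j while the running sum stays below a limit. Here k happens to
be the constant 2, but any value computed inside the loop could be fed to
+LOOP instead:

```forth
VARIABLE sum
\ Sketch; assumes j >= 1.
: sum-odds ( limit j -- )
   1+ 1 DO                           \ for i := 1 to j by 2
      sum @ OVER < 0= IF LEAVE THEN  \ while sum < limit
      I sum +!
   2 +LOOP
   DROP ;
```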

--
Bernd Paysan
"If you want it done right, you have to do it yourself"
http://www.jwdt.com/~paysan/

Bruce McFarling

May 27, 2002, 6:44:25 PM
On Sun, 26 May 2002 21:57:03 +0200, Bernd Paysan <bernd....@gmx.de>
wrote:

>Albert van der Horst wrote:
>> Object orientation is useful, but you must get at the heart of it.
>> Polymorphism and inheritance are just limbs that you can cut off.

>Albert, if you cut polymorphism and inheritance out of OO, it becomes just a
>namespace+structures thing.

What if you cut polymorphism out and leave inheritance in?

(
----------
Virtually,

Bruce McFarling, Newcastle,
ec...@cc.newcastle.edu.au
)

Bernd Paysan

May 28, 2002, 4:19:38 AM
Bruce McFarling wrote:

> On Sun, 26 May 2002 21:57:03 +0200, Bernd Paysan <bernd....@gmx.de>
> wrote:
>
>>Albert van der Horst wrote:
>>> Object orientation is useful, but you must get at the heart of it.
>>> Polymorphism and inheritance are just limbs that you can cut off.
>
>>Albert, if you cut polymorphism and inheritance out of OO, it becomes just
>>a namespace+structures thing.
>
> What if you cut polymorphism out and leave inheritance in?

You get a crippled "early-binding-only" sort-of-OO system. IMHO the more
useful subset of OO is if you leave inheritance out, and just put
polymorphism in.

M Joonas Pihlaja

May 28, 2002, 3:07:29 PM
On Sun, 26 May 2002, Bernd Paysan wrote:

> And by the way: Forth has about every structural concept that you can
> imagine.

[snip]

How do you implement a (possibly labelled) break or continue
statement that can be used in nested control flow constructs in
Forth? I.e. something like:

loop1:
while (condition1) {
    loop2:
    while (condition2) {
        // ...more nesting
        if (some_condition) { continue loop1; }
    }
}

The crucial point here is that the user of the continue statement
shouldn't need to care about how deep they are in the control
flow. There doesn't seem to be a way for custom control flow
constructs to mark a point on the control flow stack for later
CS-PICKing, is there?


Regards,

Joonas

jmdrake

May 28, 2002, 6:20:02 PM
alb...@spenarnc.xs4all.nl (Albert van der Horst) wrote in message news:<GwnuAC.JDH...@spenarnc.xs4all.nl>...

> In article <e20a4a47.02052...@posting.google.com>,
> jmdrake <jmdra...@yahoo.com> wrote:
>
> >OO is classicly defined as having 3 properties: 1) Abstraction (information
> >hiding) 2) Polymorphism (late binding) 3) Inheritence.
>
> No, it is the modern version (and insofar dogmatic: perversion) of
> the class concept of SIMULA (1967). It has crystallized with Meyer's
> book about Eiffel.

Interesting argument. There's one slight problem with it though:
Simula 67 CLEARLY supports inheritance!

http://www.cis.um.edu.mt/~jskl/talk.html#Classes



> If one were to be as dogmatic as some OO advocates about structured
> programming, Forth would not be counted among the structured programming
> languages.
> An argument like "Forth has no looping construct with a built in
> condition" would be used to say that Forth is not structured.
> There was an editorial in Dobb's some time ago where it was argued
> that the polymorphism was in fact never essential, and that we just
> had been repeating the same stories to each other over and over again.
>
> Object orientation is useful, but you must get at the heart of it.
> Polymorphism and inheritance are just limbs that you can cut off.
> Abstraction is, of course, right at the heart.
>
> Greetings Albert

Well I don't consider myself "dogmatic" about OO, but if I were I
would say that Forth far better fits the "polymorphic" requirement
through the use of tick and execute than it does the "abstraction"
requirement through create does>. Create does> doesn't (pardon the
pun) fit the classical definition of abstraction (information hiding)
because nothing is really "hidden". I count create does> as a type
of abstraction though. Oh, and by your definition Modula-2 would
be considered object oriented.

Regards,

John M. Drake

Bernd Paysan

May 29, 2002, 5:14:51 AM
M Joonas Pihlaja wrote:

I use LEAVE to directly exit a deeply nested looping structure.
Unfortunately, the ANS standard doesn't have an easy way to resolve LEAVE
without a DO LOOP. In bigFORTH I use BEGIN .. DONE, but in ANS Forth, you
have to use 1 0 DO .. LOOP for that purpose. Looping back to the start is
easy, just put the REPEAT directly after the LOOP.

Other (and better) possibility: Factor the words. The nested parts you want
to leave from the middle become a separate word, and the outer loop just
calls that word. Then you can use EXIT to leave the nested control
structures.
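Joonas's nested "continue outer" example, factored the way Bernd suggests
(condition1, condition2, some-condition, and inner-work are placeholders
for whatever application words apply):

```forth
: inner ( -- )
   BEGIN  condition2  WHILE
      some-condition IF EXIT THEN   \ acts as a "continue" of the outer loop
      inner-work
   REPEAT ;

: outer ( -- )
   BEGIN  condition1  WHILE  inner  REPEAT ;
```

EXIT unwinds all of inner's nesting in one step, with no control-flow-stack
surgery needed.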

Bernd Beuster

May 29, 2002, 6:46:52 AM
You may see `A.3.2.3.2 Control-flow stack'
(http://www.forth.org/dpans/dpansa3.htm)


--
Bernd Beuster, Design Engineer
CREATIVE CHIPS GmbH
Am Ockenheimer Graben 54, 55411 Bingen/Rhein, Germany
Tel.: +49-6721-7999-17 Fax: +49-6721-7999-12

Mark I Manning IV

May 29, 2002, 12:47:24 PM

you would need to know...

how many items the user pushed onto the return stack at each level of
nested DO loops, and how many nested DO loops there are. Also you might
have for/next loops in there somewhere; these use the return stack too!

do loops are usually compiled as...

dd (do), LOOP-EXIT-ADDRESS
LOOP-BACK:
dd ...
dd ...
dd ...
dd (loop), LOOP-BACK
LOOP-EXIT-ADDRESS:
dd ...

(do) pushes the exit address onto the return stack, followed by the start
and end indices; the exit address is saved so you can LEAVE the loop early.

Any new control flow code to exit multiple levels of DO loops is going to
have a huge amount of return stack juggling to do. It could put some
magic number onto the return stack as a placemarker, but that would be
dangerous.

Maybe someone could think of a better way to do this, but the code to
implement it is going to be tricky (complex).

J E Thomas

May 29, 2002, 1:25:46 PM
Mark I Manning IV wrote:
> M Joonas Pihlaja wrote:

> > The crucial point here is that the user of the continue statement
> > shouldn't need to care about how deep they are in the control
> > flow. There doesn't seem to be a way for custom control flow
> > constructs to mark a point on the control flow stack for later
> > CS-PICKing, is there?

You can save a copy of the current item, for later. If it uses an IF
AHEAD etc you should just save it and not leave a copy on the
control-flow stack. That takes knowledge of your system; you'll have to
know how big it is, at the least. If it uses BEGIN you can leave a copy
behind to be used again. There's no easy standard way to do it.

Here's a sketch of a complicated way: Make your new statement increment
a counter and put its item on top. Every control-flow word checks the
counter. A word that leaves a control-flow item will do the equivalent
of CS-SWAP so the special one stays on top. WHILE would put both its
words under the special one. A word that consumes a control-flow-item
would swap the special one first and then consume the one it finds.
When you resolve your special structure you just use the one on top and
decrement the counter. It would be a little easier if you never use
more than one CONTINUE at a time.

> you would neecd to know...

> how many items the user pushed onto the return stack at each level of
> nested DO loops, and how many nested DO loops there are. Also you
> might have for/next loops in there somewhere, these use the return
> stack too!.

> do loops are usually compiled as...

> dd (do), LOOP-EXIT-ADDRESS
> LOOPT-BACK:
> dd ...
> dd ...
> dd ...
> dd (loop), LOOP-BACK
> LOOP-EXIT-ADDRESS:
> dd ...

> (do) pushes the exit address onto the return stack followed by the
> start and end indicies the exit-point is so you can 'leave' the loop
> early.

> any new control flow code to exit muliple levels of DO loops is going
> to have a huge ammount of return stack juggling to do. It could put
> some magic numbre onto the return stack as a placemarker but that
> would be dangerous

> maybe someone could think of a better way to do this but the code to
> implement it is going to be tricky (complex).

You could have one compile-time variable that keeps the depth of loop
nesting. When you exit you have to know how many levels of loop you're
exiting from. Do that many UNLOOPs and you're set. There's the
standards problem that do-sys isn't guaranteed to be the same size as
orig or dest.
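For the common case where the nesting depth is known at the point of exit,
this amounts to one UNLOOP per loop level, then EXIT (a sketch; FOUND? is a
placeholder for an application word):

```forth
: search-grid ( rows cols -- )
   SWAP 0 DO
      DUP 0 DO
         J I FOUND? IF  UNLOOP UNLOOP DROP EXIT  THEN
      LOOP
   LOOP DROP ;
```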

Wil Baden

May 29, 2002, 2:11:10 PM
In article <3CF50026...@mailcity.com>, Mark I Manning IV
<I4...@mailcity.com> wrote:

> M Joonas Pihlaja wrote:

> How do you implement a (possibly labelled) break or continue
> > statement that can be used in nested control flow constructs in
> > Forth?

An implementation of GOTO and LABEL in Standard (ANS) Forth is in
<http://home.earthlink.net/~neilbawd/>

LABEL     ( "name" -- )( C: -- dest OR orig_1 ... orig_n -- )
A destination. If name has no gotos to it, LABEL name becomes a
BEGIN, otherwise enough THENs are used to resolve the gotos. As
labels are resolved they are removed (from Label-Table).

GOTO                ( "name" -- )( C: -- orig OR dest -- )
The origin of an unconditional branch. If name has no LABEL, GOTO
name becomes FALSE IF (or AHEAD), otherwise the last LABEL name is
resolved with AGAIN and removed (from Label-Table).
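With those words, Joonas's labelled-continue example might look like this
(an untested sketch; condition1, condition2, some-condition, and inner-work
are placeholders):

```forth
: scan ( -- )
   LABEL loop1
   condition1 0= IF GOTO done THEN
   BEGIN  condition2  WHILE
      some-condition IF GOTO loop1 THEN   \ "continue loop1"
      inner-work
   REPEAT
   GOTO loop1
   LABEL done ;
```

Backward GOTOs resolve against the last LABEL with AGAIN; the forward GOTO
done is resolved by THENs when LABEL done is reached.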

Elizabeth D. Rather

May 29, 2002, 2:54:02 PM
Bernd Paysan wrote:

> M Joonas Pihlaja wrote:
>
> > On Sun, 26 May 2002, Bernd Paysan wrote:
> >
> >> And by the way: Forth has about every structural concept that you can
> >> imagine.
> > [snip]
> >
> > How do you implement a (possibly labelled) break or continue
> > statement that can be used in nested control flow constructs in
> > Forth? I.e. something like:
> >
> > loop1:
> > while (condition1) {
> >     loop2:
> >     while (condition2) {
> >         // ...more nesting
> >         if (some_condition) { continue loop1; }
> >     }
> > }
> >
> > The crucial point here is that the user of the continue statement
> > shouldn't need to care about how deep they are in the control
> > flow. There doesn't seem to be a way for custom control flow
> > constructs to mark a point on the control flow stack for later
> > CS-PICKing, is there?
>

> ...


>
> Other (and better) possibility: Factor the words. The nested parts you want
> to leave from the middle become a separate word, and the outer loop just
> calls that word. Then you can use EXIT to leave the nested control
> structures.

I feel strongly that this solution is the correct one. The entire design and
philosophy of Forth is optimized to support extensive factoring. It's very
hard to test and maintain complicated and deeply nested looping structures,
and _far_ easier when they're factored into simple components.

Cheers,
Elizabeth


M Joonas Pihlaja

May 31, 2002, 10:24:52 AM

On Wed, 29 May 2002, Bernd Paysan wrote:

[snip 0 1 DO ...LEAVE... LOOP]


> In bigFORTH I use BEGIN .. DONE, but in ANS Forth, you have
> to use 1 0 DO .. LOOP for that purpose. Looping back to the
> start is easy, just put the REPEAT directly after the LOOP.

Ugh, what a kludge. :-)

> Other (and better) possibility: Factor the words. The nested
> parts you want to leave from the middle become a separate
> word, and the outer loop just calls that word. Then you can
> use EXIT to leave the nested control structures.

In general I agree that this would be the right thing to do, but
it is dodging the issue somewhat, IMO. Namely, it introduces the
need to factor for the wrong reason: just so you can use EXIT. A
portable way to save and restore dests and origs without undue
hackery would be a better solution.

Regards,

Joonas

Michael Coughlin

Jun 9, 2002, 11:02:29 PM
Jeff Fox wrote:
>
> Len Zettel wrote:
> > I'm not sure he (Chuck Moore) ever published a
> > peer-reviewed paper in his life. The journals were
> > certainly not prestigious. Elizabeth has recounted
> > the bitter opposition her paper on the history of
> > Forth aroused.
>
> Not to mention that in addition to contributing to the paper
> that Elizabeth got published at HOPL II Chuck submitted a
> paper at the same time that they simply rej[ec]ted. I think
> it helped discourage him from bothering to write articles.

I know people who can't use a computer because they can't
cope with plugs that come loose, manuals that don't make any
sense and programs with bugs in them. Instead of making the
effort to learn unpleasant new things, they give up quickly.
Most people who earn their living as writers start off with a
lot of rejection slips. You get a lot of rejection from dumb
machines when you start programming, don't you? But you don't
give up. Why should you give up when the rejection comes from a
human being instead?

If it is no fun for Chuck to write articles for learned
prestigious computer science journals (or any other kind), then
I can understand. I don't write such articles myself. If Chuck
was writing journal articles as a professor of computer science,
then maybe he wouldn't have as many good ideas. I would like to
be able to write learned articles about his latest programs, but
I don't understand what he is doing or why he is doing it.

> [Moore, 1991] Moore, Charles H.,The Invention of Forth,
> (submitted to HOPL II but rejected)
> http://www.colorforth.com/HOPL.html
>
> I found it quite insightful into the nature of early Forths.

You are being too wishy-washy. This is one of the most
important papers I've read about Forth. I've been looking for
material like this forever. After reading it I understand much
more about Forth in the 1960's than I knew before. I'm unhappy
that I didn't see it when it was first written, but I'm really
annoyed that it is hiding where I wouldn't be able to find it
without being told about it. There is no reference to it on
Chuck's home page, http://www.colorforth.com/ and I check that
page to see if there is anything new that I missed every few
weeks.

There is a problem with the Forth information published on
the web. It is scattered all over creation, in little nooks and
crannies. If you know what you need, you can ask on
comp.lang.forth, and somebody might tell you where to find
exactly what you want. But if you are new to Forth you probably
don't know what questions to ask and won't even know that there
is important information available just out of reach. I knew
about and read Elizabeth Rather's paper on the history of Forth,
but did not know Chuck had also written about it at the same
time. Elizabeth's paper is referenced all the time, but Chuck's
is not.

When I checked on Google, I found the right keywords, and
found Chuck's article ( also at
http://www.mindspring.com/~chipchuck/HOPL.html ) as well as
other things I hadn't seen before. The article at
http://www.msmisp.com/futuretest/Forth's_Dilemma.htm was
interesting since it made reference to Chuck's work and Jeff
Fox's web page and mentioned uneconomical factors influencing
large corporations in the choice of software. But I don't agree
with it. Forth loses out to bloatware because it is not
documented as a simple logical way to do computation with that
information placed where people who are going to write useful
applications will find it before they learn the bad habits of
other programming languages.

--
Michael Coughlin m-cou...@attbi.com Cambridge, MA USA

Nicholas Geovanis

Jun 10, 2002, 5:38:23 PM
On Mon, 10 Jun 2002, Michael Coughlin wrote:

> Jeff Fox wrote:
> >
> > Len Zettel wrote:
> > > I'm not sure he (Chuck Moore) ever published a
> > > peer-reviewed paper in his life. The journals were
> > > certainly not prestigious. Elizabeth has recounted
> > > the bitter opposition her paper on the history of
> > > Forth aroused.

OK, I missed that. Would anyone care to give me a summary description
of the opposition?

> > Not to mention that in addition to contributing to the paper
> > that Elizabeth got published at HOPL II Chuck submitted a
> > paper at the same time that they simply rej[ec]ted. I think
> > it helped discourage him from bothering to write articles.

> > I found it quite insightful into the nature of early Forths.
>
> You are being too wishy-washy. This is one of the most
> important papers I've read about Forth.

Well, OK. It's poorly written, so I'm not too surprised that it
was rejected. Ms. Rather's history of FORTH is much more informative and
systematic.

> Michael Coughlin m-cou...@attbi.com Cambridge, MA USA

* Nick Geovanis The very term 'icon' has been appropriated and
| IT Computing Svcs changed radically in our computer age, signifying
| Northwestern Univ an ultimately unreal, 'virtual' world.
| n-geo...@nwu.edu - Metropolitan Iakovos, Hierarch of Chicago
+------------------->

ernobe

Jun 15, 2002, 4:48:17 PM

> ... When I checked on Google, I found the right keywords, and

> found Chuck's article ( also at
> http://www.mindspring.com/~chipchuck/HOPL.html ) as well as
> other things I hadn't seen before. The article at
> http://www.msmisp.com/futuretest/Forth's_Dilemma.htm was
> interesting since it made reference to Chuck's work and Jeff
> Fox's web page and mentioned uneconomical factors influencing
> large corporations in the choice of software. But I don't agree
> with it. Forth loses out to bloatware because it is not
> documented as a simple logical way to do computation with that
> information placed where people who are going to write useful
> applications will find it before they learn the bad habits of
> other programming languages.
>

If Forth could be documented so that people who are going to write useful
applications find that information before they learn the bad habits of
other programming languages, then those bad habits could be avoided
altogether. But does this really characterize Forth in its current
incarnations? Languages encourage bad habits if they do not adapt to
developments, and we all know that things have changed quite dramatically
since the late '60s.

