
Universal Computer Truths


Tim Trussell

Jul 6, 2010, 11:59:41 AM
Found this, and wanted to share it:


There are some universal truths about computer languages that you
learn after studying and learning them for so many years.

a) No language is self-evident or obvious
b) All languages are insufficient and have deficiencies
c) All languages have strengths and weaknesses
d) All languages require trade-offs and compromises
e) Most programmers are uncomfortable with (a) through (d), and many
try to *fix* those unfixable, intractable truths, or persuade others
that they aren't true.

MrC

Alex McDonald

Jul 6, 2010, 4:05:40 PM

f) Languages that avoid (a) thru (d) most often look like Lisp.
g) Most programmers avoid Lisp, on the grounds that fixing the
unfixable, not facing the truth and persuading others that there's a
problem is easier than learning Lisp.

jacko

Jul 6, 2010, 6:00:05 PM

pHone is going to be like a postfix Lisp/Logo.

h) Only languages in which some problems are intractable are worth
avoiding.

Cheers Jacko

jacko

Jul 6, 2010, 6:04:13 PM
So what list handling opcodes would you like? Consider the factorized
basic set CONS/CAR/CDR (JOIN/TOP DROP/TOP SWAP DROP) what others?
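For what it's worth, the trio jacko names can be modelled with nothing but closures. A minimal sketch, in Python rather than Forth or Lisp (the names cons/car/cdr are the classic ones; nothing here is from the pHone design):

```python
# cons/car/cdr modelled as closures: a pair is a function that,
# given a selector, returns one of its two halves.
def cons(a, d):
    return lambda pick_head: a if pick_head else d

def car(pair):
    return pair(True)   # head of the pair

def cdr(pair):
    return pair(False)  # tail of the pair

# Build the list (1 2 3) and walk it.
lst = cons(1, cons(2, cons(3, None)))
assert car(lst) == 1
assert car(cdr(lst)) == 2
assert car(cdr(cdr(lst))) == 3
```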

Rod Pemberton

Jul 6, 2010, 9:13:31 PM
"Tim Trussell" <tgtru...@hotmail.com> wrote in message
news:a91cf08a-bb30-4762...@n8g2000prh.googlegroups.com...

> Found this, and wanted to share it:
>

Oh, I missed that line. I thought this was *your* perspective... And, I'm
not sure what your post has to do with Forth...

> There are some universal truths about computer languages that you
> learn after studying and learning them for so many years.
>
> a) No language is self-evident or obvious

Is anything? What in life doesn't require learning or experience? This
claim is stupid. I needed to say that, and I feel better saying that
knowing that it's not *your* perspective, although your posting of it could
be considered a proxy for agreeing with that statement...

> b) All languages are insufficient and have deficiencies

Mathematics has been used to map, model, describe, just about everything
that is known to exist. Most languages support a large subset of
mathematics. So, if a programmer declares some language is "insufficient
and have deficiencies" which has the ability to do most mathematics, how is
that claim possibly true? (It's not.) Are they solving the most advanced
mathematics problems in existence? (No. They aren't.) So, how can such a
programmer claim that some language is "insufficient and has deficiencies"?
(...) Did he mathematically prove some language wasn't capable of solving
some class of general problems? (Unlikely.)

> c) All languages have strengths and weaknesses

That's very vague. It's way too vague for me in fact. You could have a
language that is exceptionally strong in all areas, when compared against
other languages, yet that language, while superior to the other languages in
all respects, will have areas of weakness when compared with itself. Should
the internal "weaknesses" of such a superior language be viewed as a general
"weakness"?

> d) All languages require trade-offs and compromises

Ok, I don't understand this claim. Well, actually, I don't understand half
of this claim. I can understand needing "trade-offs and compromises" in the
areas that a language has "weakness", but can you claim that "trade-offs and
compromises" are required where the language has "strengths"? That makes no
sense whatsoever. How is a strength a strength if it requires "trade-offs
and compromises"? It'd be a weakness. That claim clearly needs to be
qualified with "in the areas of weakness".

> e) Most programmers are uncomfortable with (a) through (d), and many
> try to *fix* those unfixable, intractable truths, or persuade others
> that they aren't true.

Obviously, that's from a novice... The real truth is some programmers are
blatantly incompetent and so perceive weaknesses in the language, instead of
with their own skillset or intellect, and so therefore embrace the
tangential task of "fixing" the perceived defects ("intractable truths") in
a language, instead of learning the language. Why? Because, fixing the
language is a "solvable" problem from their perspective, whereas programming
in a "defective" language is not... Does the majority ever adopt the
"solutions" found by such people? (No. They're always the idiots.) So,
wasn't the effort a fruitless pursuit? (Yes.)


Rod Pemberton


Krishna Myneni

Jul 6, 2010, 10:55:27 PM
On Jul 6, 8:13 pm, "Rod Pemberton" <do_not_h...@notreplytome.cmm>
wrote:
...

> Mathematics has been used to map, model, describe, just about everything
> that is known to exist.  Most languages support a large subset of
> mathematics.  So, if a programmer declares some language is "insufficient
> and have deficiencies" which has the ability to do most mathematics, how is
> that claim possibly true?  (It's not.)  Are they solving the most advanced
> mathematics problems in existence? (No.  They aren't.)  So, how can such a
> programmer claim that some language is "insufficient and has deficiencies"?
> (...)  Did he mathematically prove some language wasn't capable of solving
> some class of general problems?  (Unlikely.)
> ...

So, what are the fundamental logic operations? Apparently all logic
functions can be built from NAND. Does it follow then that all
languages which provide a NAND operation should, in principle, be
capable of describing the same algorithms?
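The NAND claim is easy to check mechanically. A small sketch (Python used purely for brevity; the gate definitions are the standard ones):

```python
def nand(a, b):
    return not (a and b)

# The usual gates, each built from NAND alone:
def not_(a):
    return nand(a, a)

def and_(a, b):
    return nand(nand(a, b), nand(a, b))

def or_(a, b):
    return nand(nand(a, a), nand(b, b))

# Exhaustively verify the identities over all inputs.
for a in (False, True):
    for b in (False, True):
        assert not_(a) == (not a)
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
```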

Perhaps off topic, and I don't know much about the subject, but in
mathematics there is something known as Gödel's Theorem, which
basically states that there exist truths in mathematics which are
inherently unprovable. The Wikipedia article states,

"Gödel's incompleteness theorems are two theorems of mathematical
logic that establish inherent limitations of all but the most trivial
axiomatic systems for mathematics. ...The first incompleteness theorem
states that no consistent system of axioms whose theorems can be
listed by an "effective procedure" (essentially, a computer program)
is capable of proving all facts about the natural numbers. For any
such system, there will always be statements about the natural numbers
that are true, but that are unprovable within the system."

Krishna


Andrew Haley

Jul 7, 2010, 5:09:12 AM
Rod Pemberton <do_no...@notreplytome.cmm> wrote:
> "Tim Trussell" <tgtru...@hotmail.com> wrote in message
>
>> b) All languages are insufficient and have deficiencies
>
> Mathematics has been used to map, model, describe, just about everything
> that is known to exist. Most languages support a large subset of
> mathematics. So, if a programmer declares some language is "insufficient
> and have deficiencies" which has the ability to do most mathematics, how is
> that claim possibly true?

Of course it is. You can do anything computable in a Turing Machine,
but you'd have to admit that it has deficiencies as a programming
language.

Andrew.

Rod Pemberton

Jul 7, 2010, 6:21:37 PM
"Krishna Myneni" <krishna...@ccreweb.org> wrote in message
news:457a65ce-6960-4c71...@z10g2000yqb.googlegroups.com...

> On Jul 6, 8:13 pm, "Rod Pemberton" <do_not_h...@notreplytome.cmm>
> wrote:
> ...
> > Mathematics has been used to map, model, describe, just about everything
> > that is known to exist. Most languages support a large subset of
> > mathematics. So, if a programmer declares some language is "insufficient
> > and have deficiencies" which has the ability to do most mathematics, how is
> > that claim possibly true? (It's not.) Are they solving the most advanced
> > mathematics problems in existence? (No. They aren't.) So, how can such a
> > programmer claim that some language is "insufficient and has deficiencies"?
> > (...) Did he mathematically prove some language wasn't capable of solving
> > some class of general problems? (Unlikely.)
> > ...
>
> So, what are the fundamental logic operations? Apparently all logic
> functions can be built from NAND. Does it follow then that all
> languages which provide a NAND operation should, in principle, be
> capable of describing the same algorithms?
>

That's entirely another topic. One I'm not interested in, at the moment...

Basically, you've switched the discussion around. The OP's quoted anonymous
individual says all languages are insufficient. So, you should be asking
what's *missing* that classifies a language as insufficient. But, now
you're asking what is *needed* in order to be sufficient...

> Perhaps off topic, and I don't know much about the subject, but in
> mathematics there is something known as Gödel's Theorem, which
> basically states that there exist truths in mathematics which are
> inherently unprovable.
>

1) Are you saying that some language, in order to not be deemed
insufficient per the OP's original quotes, needs to identify and implement
those truths? How do you intend to identify a truth that is unprovable? It
is through proofs that we identify truth and falsity. I.e., if your claim
is correct, it's unknowable as to whether an "inherently unprovable" truth
is a truth, or not. Yes? So, one cannot say it is a truth.

2) IMO, the posted Wikipedia quote doesn't state that truths exist in
mathematics which are unprovable, as you've stated. The posted Wikipedia
quote states that there are computational systems with limitations that
cannot prove some truths about natural numbers. That doesn't mean that the
truth is unprovable, only that it's unprovable *within* a certain
computational system due to that system's limitations. I.e., there can be a
different computational system with different limitations that proves the
unprovable truths of the first computational system.

3) I made a point of referring to the "posted Wikipedia quote" since I don't
know if Wikipedia accurately expresses those theorems in layman's terms.


Rod Pemberton


Rod Pemberton

Jul 7, 2010, 6:23:12 PM
"Andrew Haley" <andr...@littlepinkcloud.invalid> wrote in message
news:LvqdnXKY-o6l2anR...@supernews.com...

AFAIK, there is no working implementation of a Turing Machine. All
computable architectures place limitations on the Turing Machine model.
There's no such thing as infinite tape, infinite stacks, or infinite
register sizes. I.e., a programming language that implements the Turing
Machine model on a real computing platform will *not* be Turing complete.

But, let's use such a language for discussion anyway as if it was complete.
Unfortunately, my response to e) accurately describes the situation of a
"deficient" Turing Machine programming language... It's up to that
programmer to become an expert in programming that model, not declare the
language as "deficient".


RP


Paul Rubin

Jul 7, 2010, 7:08:01 PM
"Rod Pemberton" <do_no...@notreplytome.cmm> writes:
> 2) IMO, the posted Wikipedia quote doesn't state that truths exist in
> mathematics which are unprovable, as you've stated. The posted Wikipedia
> quote states that there are computational systems with limitations that
> cannot prove some truths about natural numbers.

IMO, the original post was just a humorous "deduction" that Lisp is the
One Perfect Programming Language. And also IMO, any programmer who has
never felt that way about Lisp has never really lived ;-). Some Lispers
have never gotten over it.

Goedel's first incompleteness theorem (per the Wikipedia article) states
that

no consistent system of axioms whose theorems can be listed by an
"effective procedure" (essentially, a computer program) is capable
of proving all facts about the natural numbers.

> That doesn't mean that the truth is unprovable, only that it's
> unprovable *within* a certain computational system due to that
> system's limitations. I.e., there can be a different computational
> system with different limitations that proves the unprovable truths of
> the first computational system.

Per the above, there is NO such system that proves ALL the truths of
arithmetic. You can add more axioms, but there's in general no way to
know if a proposed new axiom is actually true, so you may end up with an
inconsistent system.

Rod Pemberton

Jul 7, 2010, 7:13:05 PM
"Paul Rubin" <no.e...@nospam.invalid> wrote in message
news:7xmxu23...@ruckus.brouhaha.com...

> "Rod Pemberton" <do_no...@notreplytome.cmm> writes:
> > 2) IMO, the posted Wikipedia quote doesn't state that truths exist in
> > mathematics which are unprovable, as you've stated. The posted Wikipedia
> > quote states that there are computational systems with limitations that
> > cannot prove some truths about natural numbers.
>
> IMO, the original post was just a humorous "deduction" that Lisp is the
> One Perfect Programming Language. And also IMO, any programmer who has
> never felt that way about Lisp has never really lived ;-). Some Lispers
> have never gotten over it.
>

It was about Lisp? How do you come to that conclusion?


RP


Paul Rubin

Jul 7, 2010, 7:21:00 PM
"Rod Pemberton" <do_no...@notreplytome.cmm> writes:
> It was about Lisp? How do you come to that conclusion?

Oh wait, you're right, that was somebody's followup. In any case I
interpreted it as a wisecrack, although a lot of Lispers seem to believe
that it is really true.

jacko

Jul 7, 2010, 7:30:18 PM
> Per the above, there is NO such system that proves ALL the truths of
> arithmetic.  You can add more axioms, but there's in general no way to
> know if a proposed new axiom is actually true, so you may end up with an
> inconsistent system.

Inconsistency may be proved, by finding something which is both true
and false, or by finding something which is indeterminate (0).
Consistency is another matter: a system cannot prove its own
consistency, and this is theorem two.

QED => zero is not a number, photons never exist (spend any time
being), useful structural concepts are symbolic, but not materially
existential.

Cheers Jacko

Paul Rubin

Jul 7, 2010, 8:31:43 PM
jacko <jacko...@gmail.com> writes:
>> arithmetic.  You can add more axioms, but there's in general no way to
>> know if a proposed new axiom is actually true, so you may end up with an
>> inconsistent system.
>
> Inconsistency may be proved, by finding something which is both true
> and false, or by finding something which is indeterminate (0).

You also might add a statement S which is false but not provably false.
In that case you don't get an inconsistent system, but you get one that
proves a lot of false sentences. In the jargon this is called an
"unsound" system, and its models are called "nonstandard".

Phil Martel

Jul 7, 2010, 9:13:09 PM

"Paul Rubin" <no.e...@nospam.invalid> wrote in message

news:7xmxu2a...@ruckus.brouhaha.com...

I don't think this is correct. If a mathematical statement S with
parameters (a, b, c ...) is not true, there is some (a, b, c...) for which
you can show it is false. It doesn't matter how you find (a, b, c...).

Jerry Avins

Jul 7, 2010, 10:11:59 PM
On 7/7/2010 6:21 PM, Rod Pemberton wrote:

...

>> So, what are the fundamental logic operations? Apparently all logic
>> functions can be built from NAND. Does it follow then that all
>> languages which provide a NAND operation should, in principle, be
>> capable of describing the same algorithms?


For what it's worth, all logic operations can also be built from NORs.

> That's entirely another topic. One I'm not interested in, at the moment...
>
> Basically, you've switched the discussion around. The OP's quoted anonymous
> individual says all languages are insufficient. So, you should be asking
> what's *missing* that classifies a language as insufficient. But, now
> you're asking what is *needed* in order to be sufficient...

It's really very simple. To be sufficient, a language needs a DWIM command.

...

> 2) IMO, the posted Wikipedia quote doesn't state that truths exist in
> mathematics which are unprovable, as you've stated. The posted Wikipedia
> quote states that there are computational systems with limitations that
> cannot prove some truths about natural numbers. That doesn't mean that the
> truth is unprovable, only that it's unprovable *within* a certain
> computational system due to that system's limitations. I.e., there can be a
> different computational system with different limitations that proves the
> unprovable truths of the first computational system.
>
> 3) I made a point of referring to the "posted Wikipedia quote" since I don't
> know if Wikipedia accurately expresses those theorems in layman's terms.

Goedel's first incompleteness theorem states that except for trivial
weak systems, a set of axioms that generate theorems which can be listed
by an "effective procedure" -- Cantor's sieve is an example -- can
express more truths than can be proven. A corollary states that they can
also express falsehoods that they are insufficiently powerful to disprove.

The second incompleteness theorem states that except for trivial weak
systems, given a set of axioms A and rules by which you can deduce
(prove) theorems from the axioms, if you can deduce all the laws of good
old elementary arithmetic from the axioms A, then you can't prove that
the axioms are not self-contradictory.

Jerry
--
Engineering is the art of making what you want from things you can get.
¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯¯

Paul Rubin

Jul 7, 2010, 10:33:27 PM
"Phil Martel" <poma...@comcast.net> writes:
> I don't think this is correct. If a mathematical statement S with
> parameters (a, b, c ...) is not true, there is some (a, b, c...) for
> which you can show it is false. It doesn't matter how you find (a, b,
> c...) .

That's only the case for the so-called pi-0-1 sentences, like "every
even number is the sum of two primes" (if that is false, you can show a
counterexample).

A pi-0-2 sentence like "for every n, there is a pair (p, p+2) so that p
and p+2 are both prime" (i.e. "there are infinitely many twin primes")
can't be disproven that way. Even if it's false, you can't necessarily
show a provable counterexample.

Why are we going on about this?

Bernd Paysan

Jul 8, 2010, 4:37:25 AM
Paul Rubin wrote:
> You also might add a statement S which is false but not provably false.
> In that case you don't get an inconsistent system, but you get one that
> proves a lot of false sentences. In the jargon this is called an
> "unsound" system, and its models are called "nonstandard".

A lot of proofs in mathematics work that way. There are plenty of unproven
foundations which are necessary to prove other statements, and people write
proofs based on the assumption that these unproven foundations are true.
You could just assume the opposite, and then you have a nonstandard system
where most of these other lemmas are false as well. People slowly fill in
the gaps in proofs, and some of these foundations have recently been made
more solid (like Fermat's Last Theorem). Thus a nonstandard system would
have been one where there exist integers a, b, c and n > 2 with
a^n+b^n=c^n. Now with the proof of Fermat's Last Theorem, this nonstandard
system is just plain inconsistent, but before, it was merely nonstandard.

--
Bernd Paysan
"If you want it done right, you have to do it yourself!"
http://www.jwdt.com/~paysan/

Alex McDonald

Jul 8, 2010, 4:57:32 AM
On 8 July, 03:11, Jerry Avins <j...@ieee.org> wrote:

>
> It's really very simple. To be sufficient, a language needs a DWIM command.

I regularly use it. And since DWIM is very powerful, I'm finding that
I use that old standby WTF? less and less.


Paul Rubin

Jul 8, 2010, 5:02:13 AM
Bernd Paysan <bernd....@gmx.de> writes:
> Thus a nonstandard system would have
> been one where there exists an n > 2 with a^n+b^n=c^n for all a, b, c, and n
> integer. Now with the proof of Fermat's theorem, this nonstandard system is
> just plain inconsistent, but before, it was merely nonstandard.

No really, that system was inconsistent before FLT was proved. We just
didn't know for sure, though we had a strong suspicion ;-). Nonstandard
means something specific. It means there is a number x, such that
x > 1, x > 2, x > 3, ... no matter how high you count. The standard or
"true" natural numbers do not have such an element.

http://en.wikipedia.org/wiki/Non-standard_model_of_arithmetic

jacko

Jul 8, 2010, 5:09:34 AM
> It's really very simple. To be sufficient, a language needs a DWIM command.

So the machine can follow the whim of the programmer? And do it with
doubles too?

Cheers Jacko

Bernd Paysan

Jul 8, 2010, 10:53:46 AM
jacko wrote:
> So the machine can follow the whim of the programmer? And do it with
> doubles too?

Yes, but in a Douglas-Adams/Terry-Pratchett way of doing things, it fails
even more often than the WTF approach. WTF programs usually fail to do what
the programmer means because he couldn't say what he means in precise terms.
DWIM programs fail because the programmer changes his mind too often, and
is inconsistent about what he really means and wishes. For example, using a
DWIM programming language is not allowed for Apple's App Store, since DWIM
programs tend to display porn all of a sudden, especially at 4 am, when the
programmer has wet dreams.

Alex McDonald

Jul 8, 2010, 11:36:26 AM
On 8 July, 15:53, Bernd Paysan <bernd.pay...@gmx.de> wrote:
> jacko wrote:
> > So the machine can follow the whim of the programmer? And do it with
> > doubles too?
>
> Yes, but in a Douglas-Adams/Terry-Pratchett way of doing things, it fails
> even more often than the WFT approach.  WTF programs usually fail to do what
> the programmer means because he couldn't say what he means in precise terms.  
> DWIM programs fail, because the programmer changes his mind too often, and
> is inconsistent about what he really means and wishes.  For example, using a
> DWIM programming language is not allowed for Apple's App Store, since DWIM
> programs tend to display porn all the sudden, especially at 4am in the
> morning, when the programmer has wet dreams.

I thought that Apple's main objection was to programs that insisted on
flashing?

Albert van der Horst

Jul 8, 2010, 1:58:07 PM
In article <7xmxu23...@ruckus.brouhaha.com>,

Paul Rubin <no.e...@nospam.invalid> wrote:
>"Rod Pemberton" <do_no...@notreplytome.cmm> writes:
>> 2) IMO, the posted Wikipedia quote doesn't state that truths exist in
>> mathematics which are unprovable, as you've stated. The posted Wikipedia
>> quote states that there are computational systems with limitations that
>> cannot prove some truths about natural numbers.
>
>IMO, the original post was just a humorous "deduction" that Lisp is the
>One Perfect Programming Language. And also IMO, any programmer who has
>never felt that way about Lisp has never really lived ;-). Some Lispers
>have never gotten over it.

And so is Forth, because to the enlightened Forth is just
Reverse Polish Lisp.

Groetjes Albert

--
--
Albert van der Horst, UTRECHT,THE NETHERLANDS
Economic growth -- being exponential -- ultimately falters.
albert@spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst

John Passaniti

Jul 8, 2010, 3:01:34 PM
On Jul 8, 1:58 pm, Albert van der Horst <alb...@spenarnc.xs4all.nl>
wrote:

> And so is Forth, because to the enlighted Forth is just
> Reverse Polish Lisp.

That line gets repeated often. Would be nice if true. Too bad it
isn't:

For the unenlightened, show a Forth program that operates on itself--
not by manipulating Forth source, but by treating (as in Lisp) the
Forth program itself as data. For example, take this (dumb) Forth
code:

: add-ten 2 3 * 4 + + ;

Now, as you could easily in Lisp, write a program that does simple
constant folding. Iterate over a Forth word, identify all occurrences
of two numbers being operated on with a binary math operator
recursively, and replace that with the calculated value. The end
result should of course be:

: add-ten 10 + ;

Replace the original definition with this new one. Do this portably,
as you can in Lisp. Oh, and reclaim the memory that is saved with
this optimized version (this should be easy, just use Forth's garbage
collector). When you're done, perhaps we'll do a similar optimization
involving closures.

Forth *weakly* gives *some* of the power of Lisp. The only people who
believe that Forth is a "reverse Polish Lisp" are those who don't do
much Lisp.
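For contrast, here is roughly what the exercise looks like in a language where programs really are data. This is only an illustrative sketch, written in Python over its own `ast` module rather than in Lisp; it folds nested constant subexpressions the way John describes:

```python
import ast

class Folder(ast.NodeTransformer):
    """Replace any binary operation on two constants with its value."""
    def visit_BinOp(self, node):
        self.generic_visit(node)  # fold the children first (bottom-up)
        if isinstance(node.left, ast.Constant) and isinstance(node.right, ast.Constant):
            value = eval(compile(ast.Expression(node), "<fold>", "eval"))
            return ast.copy_location(ast.Constant(value), node)
        return node

# The Forth phrase "2 3 * 4 + +" applied to n is the expression 2*3 + 4 + n.
tree = ast.parse("2 * 3 + 4 + n", mode="eval")
folded = ast.fix_missing_locations(Folder().visit(tree))
print(ast.unparse(folded))  # the constant part collapses to 10
```

Replacing the original definition in place and reclaiming the saved memory are, of course, exactly the parts Forth makes hard; the sketch only shows the tree surgery.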

Alex McDonald

Jul 8, 2010, 3:32:42 PM

While much of that is true, even my dumb-as-a-box-of-rocks Forth
generates this for add-ten:

: add-ten 2 3 * 4 + + ; ok
see add-ten
: add-ten ( ? -- ? ) \ std call compiles
\ len=3 type=7 flag=00
\ in file (console)
( $42E994 ' add-ten ) add eax, # $A \ 83C00A
( $42E997 ' add-ten+$3 ) ret near \ C3
( end ) ok

Jerry Avins

Jul 8, 2010, 5:28:45 PM
On 7/8/2010 3:32 PM, Alex McDonald wrote:
> On 8 July, 20:01, John Passaniti<john.passan...@gmail.com> wrote:

...

>> Forth *weakly* gives *some* of the power of Lisp. The only people who
>> believe that Forth is a "reverse Polish Lisp" are those who don't do
>> much Lisp.
>
> While much of that is true, even my dumb-as-a-box-of-rocks Forth
> generates this for add-ten;
>
> : add-ten 2 3 * 4 + + ; ok
> see add-ten
> : add-ten ( ? -- ? ) \ std call compiles
> \ len=3 type=7 flag=00
> \ in file (console)
> ( $42E994 ' add-ten ) add eax, # $A \ 83C00A
> ( $42E997 ' add-ten+$3 ) ret near \ C3
> ( end ) ok

That wasn't John's point. Your compiler treats source as data. That's
not the same as a user treating his own code as data.

Alex McDonald

Jul 8, 2010, 6:23:49 PM

I know.

jacko

Jul 8, 2010, 6:48:30 PM
> ( end ) ok

Machine code is not very Lispy. The lisp folks have already released a
forth for lisp, although I wasn't that interested on the grounds of
the effective ROM or code segment size.

The other way round seems more useful, but Lisp is a little over the
top for list handling.

Cheers Jacko

Message has been deleted

Paul Rubin

Jul 9, 2010, 2:06:42 PM
John Passaniti <john.pa...@gmail.com> writes:
> Not the point. The point is that in Lisp, the programmer is free to
> manipulate the program as data, allowing arbitrarily-complex
> transformations of the code. Yes, I picked a particular
> transformation that some Forths happen to already do. I would hope
> people could abstract a bit and see the larger point.

I think programs-as-data is a feature found in many Lisps but isn't
essential to Lisp. The defining features of Lisp IMO are:

1. Convenient representation of lists (as S-expressions) at compile
time, including tree structures as nested lists.

2. Easy manipulation and creation of lists at runtime, including the
CONS primitive that creates new list nodes

3. Automatic storage management (garbage collection) to reclaim memory
occupied by no-longer-reachable data created by CONS.

4. Functions represented as expressions in lambda calculus, leading to
convenient implementation of higher-order functions like map and reduce,
delayed evaluation by wrapping an expression in a lambda, etc.

5. Data has runtime type tags ensuring type safety.

All these things are doable in Forth with enough contortions, but then
it's not really Forth any more.
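Point 4 of that list is the easiest to show in miniature. A sketch in Python standing in for Lisp (`map`, `reduce`, and thunks all falling out of lambda):

```python
from functools import reduce

# Higher-order functions: lambdas passed around as ordinary values.
nums = [1, 2, 3, 4]
squares = list(map(lambda x: x * x, nums))     # [1, 4, 9, 16]
total = reduce(lambda a, b: a + b, squares)    # 30

# Delayed evaluation: wrap an expression in a zero-argument lambda;
# the body does not run until the thunk is forced with ().
thunk = lambda: total * 1000
assert total == 30
assert thunk() == 30000
```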

Aleksej Saushev

Jul 9, 2010, 4:23:55 PM
John Passaniti <john.pa...@gmail.com> writes:

> On Jul 8, 3:32 pm, Alex McDonald <b...@rivadpm.com> wrote:
>> While much of that is true, even my dumb-as-a-box-of-rocks Forth
>> generates this for add-ten;
>

> Not the point. The point is that in Lisp, the programmer is free to
> manipulate the program as data, allowing arbitrarily-complex
> transformations of the code. Yes, I picked a particular
> transformation that some Forths happen to already do. I would hope
> people could abstract a bit and see the larger point.

That's not true; you can coerce a specially crafted list into a function,
but not vice versa. You can do the same in Forth with : POSTPONE and ;


--
HE CE3OH...

Phil Martel

Jul 9, 2010, 5:31:43 PM

"Jerry Avins" <j...@ieee.org> wrote in message
news:PtaZn.1990$Bh2....@newsfe04.iad...


> On 7/7/2010 6:21 PM, Rod Pemberton wrote:
>
> ...
>
>>> So, what are the fundamental logic operations? Apparently all logic
>>> functions can be built from NAND. Does it follow then that all
>>> languages which provide a NAND operation should, in principle, be
>>> capable of describing the same algorithms?
>
>
> For what it's worth, all logic operations can also be built from NORs.

Or ANDs, ORs and two NOTs.

Phil Martel

Jul 9, 2010, 5:41:54 PM

"Paul Rubin" <no.e...@nospam.invalid> wrote in message

news:7xk4p6a...@ruckus.brouhaha.com...

Because it's fun?

Rod Pemberton

Jul 9, 2010, 5:44:30 PM
"John Passaniti" <john.pa...@gmail.com> wrote in message
news:be853ee1-dd3a-4a45...@u26g2000yqu.googlegroups.com...

> On Jul 8, 3:32 pm, Alex McDonald <b...@rivadpm.com> wrote:
> > While much of that is true, even my dumb-as-a-box-of-rocks Forth
> > generates this for add-ten;
>
> Not the point. The point is that in Lisp, the programmer is free to
> manipulate the program as data, allowing arbitrarily-complex
> transformations of the code.
>

...

I think most who read your post said to themselves, "Why would I need that?
Or, who needs that? Or, how is this really relevant?"

I understand that interpreters can do things that would be part of the
compiler for some other language. Does that mean that the language should
provide the feature for the programmer to use? Why does Forth or C or some
other language need this functionality as a native language feature? If
that functionality is present in the optimization portion of a compiler, or
as a separate optimization program, what functionality - from the
programmer's perspective - is missing? (IMO, C proves that the "ability to
manipulate program code as data" is not needed as part of the language.)


Rod Pemberton


Alex McDonald

Jul 9, 2010, 6:21:18 PM
On 9 July, 22:31, "Phil Martel" <pomar...@comcast.net> wrote:
> "Jerry Avins" <j...@ieee.org> wrote in message
>
> news:PtaZn.1990$Bh2....@newsfe04.iad...
>
> > On 7/7/2010 6:21 PM, Rod Pemberton wrote:
>
> >   ...
>
> >>> So, what are the fundamental logic operations? Apparently all logic
> >>> functions can be built from NAND. Does it follow then that all
> >>> languages which provide a NAND operation should, in principle, be
> >>> capable of describing the same algorithms?
>
> > For what it's worth, all logic operations can also be built from NORs.
>
> Or ANDs, ORs and two NOTs.
>
>

Or a subtract and branch-when-negative.
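That last one is a real machine model: subleq ("SUBtract and branch if Less than or EQual to zero") is, by itself, sufficient for general computation. A toy interpreter with a made-up example program, sketched in Python:

```python
def subleq(mem):
    """Run a subleq program. Each instruction is three cells (a, b, c):
    mem[b] -= mem[a]; if the result is <= 0, jump to c, otherwise fall
    through to the next instruction. A negative jump target halts."""
    pc = 0
    while pc >= 0:
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3

# Example program: add cell A into cell B via scratch cell Z.
mem = [12, 14, 3,    # Z -= A  (Z becomes -A; result <= 0, so jump to 3)
       14, 13, 6,    # B -= Z  (B becomes B + A)
       14, 14, 9,    # Z -= Z  (clear the scratch cell)
       14, 14, -1,   # Z -= Z again; the -1 target halts the machine
       7, 5, 0]      # data: A = 7 (cell 12), B = 5 (cell 13), Z = 0 (cell 14)
subleq(mem)
assert mem[13] == 12  # B now holds 7 + 5
```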

Paul Rubin

Jul 9, 2010, 6:24:09 PM
"Rod Pemberton" <do_no...@notreplytome.cmm> writes:
> (IMO, C proves that the "ability to
> manipulate program code as data" is not needed as part of the language.)

C has a preprocessor that allows limited manipulation of the program as
data. It is nearly indispensable in real-world C programming,
indicating that C really can't get along so well without at least some
form of the capability. Java tried to get along without it, but ended
up growing a complicated reflection interface instead. Scheme tried to
get along without it but ended up with hygienic macros. Haskell tried
to get along without it but ended up with numerous workarounds including
Template Haskell, quasiquoting, Liskell, at least 3 different generics
preprocessors. Overall, the usefulness of such a thing empirically
recurs again and again in real-world programming systems. So it's
reasonable to infer that it's probably important.

Jerry Avins

unread,
Jul 9, 2010, 7:30:45 PM7/9/10
to

Did anyone claim that such manipulation is needed? The claim I read is
that it is possible.

Timothy Trussell

unread,
Jul 9, 2010, 9:56:41 PM7/9/10
to
Tim Trussell wrote:

> Found this, and wanted to share it:

After having read everything posted, all I can say is that everyone
involved has valided just about every entry in this list.

Tim-


--- news://freenews.netfront.net/ - complaints: ne...@netfront.net ---

Rod Pemberton

unread,
Jul 10, 2010, 12:26:35 AM7/10/10
to
"Timothy Trussell" <tgtru...@hotmail.com> wrote in message
news:i18k0o$2hg3$1...@adenine.netfront.net...

> Tim Trussell wrote:
>
> > Found this, and wanted to share it:
>
> After having read everything posted, all I can say is that everyone
> involved has valided just about every entry in this list.
>

... validated ... ?

Does that complete the list? Curious, was that a) or b)? Or sarcastically,
e)?


RP


John Passaniti

unread,
Jul 10, 2010, 1:14:24 AM7/10/10
to
On Jul 9, 5:44 pm, "Rod Pemberton" <do_not_h...@notreplytome.cmm>
wrote:

> I think most who read your post said to themselves, "Why would I need that?
> Or, who needs that?  Or, how is this really relevant?"

You have a skill at taking singular examples used to make larger
points, and misunderstanding them as the point itself.

The original statement was that Forth is a "reverse Polish Lisp".
It's a claim that gets offered every now and then in comp.lang.forth,
but when you look at the capabilities of Lisp verses Forth, you see
that Forth can't do what Lisp can. So *that* is the point-- an
example of how Forth *isn't* a "reverse Polish Lisp."

But since you asked, the Lisp ability to treat programs as data and
manipulate them is used for all sorts of solutions to problems. In
general this ability allows the Lisp programmer to take apart programs
and subject them to analysis. You see it used in real-world cases
like simplifying symbolic math expressions by application of rules.
You can also use it to inject code into a system (such as aspect or
role oriented programming).
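To make the "programs as data" idea concrete, here is a minimal sketch (Python standing in for Lisp's list representation; the tuple encoding and rule set are invented for illustration) of rule-based simplification over an expression tree:

```python
# Hedged sketch of "programs as data": expressions as nested tuples,
# rewritten by algebraic rules, loosely mirroring what Lisp does with lists.
def simplify(expr):
    if not isinstance(expr, tuple):
        return expr                                  # atom: number or symbol
    op, a, b = expr
    a, b = simplify(a), simplify(b)                  # simplify subtrees first
    if op == '+' and b == 0: return a                # x + 0 -> x
    if op == '+' and a == 0: return b                # 0 + x -> x
    if op == '*' and (a == 0 or b == 0): return 0    # x * 0 -> 0
    if op == '*' and b == 1: return a                # x * 1 -> x
    if op == '*' and a == 1: return b                # 1 * x -> x
    return (op, a, b)

print(simplify(('+', ('*', 'x', 1), 0)))             # -> x
```

In Lisp the same tree walk applies unchanged to function bodies, because a function body *is* such a nested list.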

> I understand that interpreters can do things that would be part of  the
> compiler for some other language.  Does that mean that the language should
> provide the feature for the programmer to use?  

If it's useful and helps the programmer solve problems, why not? In
Forth, the ability to create colon definitions is an interpreter-level
feature. Yet, there are applications where being able to create colon
definitions on the fly programmatically is useful. Are you suggesting
that this ability be removed from the Forth programmer's arsenal
because it's traditionally not offered in most programming languages?

In the case of Lisp, this is a non-question, since Lisp is somewhat
unique in that the external source representation of programs is
"homoiconic" with the internal representation. It's as natural to
manipulate programs in Lisp as any other data, because programs *are*
data.

> Why does Forth or C or some
> other language need this functionality as a native language feature?  

It doesn't. The claim is that Forth is a "reverse Polish Lisp." It
isn't. I'm not advocating for the feature in those languages. I'm
pointing out it doesn't exist.

But since you asked, if Forth had the ability to portably treat
programs as data in the Lisp sense, it would probably lead to all
sorts of useful facilities. Some Forths offer optimizations ranging
from simple constant folding on up to sophisticated dataflow
analysis. Unfortunately, these facilities are locked away in the
compiler. You can't add to or change those optimizations. But if Forth
did offer what Lisp does, you could. Those transformations would be
as accessible as anything else in Forth. You could also do
metacompilation to other targets in a radically simpler (and better)
way, by having the entire Forth as a program you could iterate over
and transform into other forms.
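A minimal sketch of the kind of constant folding described above, applied to a definition that is just data (Python here; the token format and helper names are invented, not any real Forth's internals):

```python
# Hedged sketch: a definition stored as a token list can be folded by
# any ordinary program, rather than only inside a compiler.
OPS = {'+': lambda a, b: a + b, '*': lambda a, b: a * b}

def fold(tokens):
    out = []
    for tok in tokens:
        out.append(tok)
        # whenever the output ends in <number> <number> <op>, fold it
        while (len(out) >= 3 and out[-1] in OPS
               and isinstance(out[-2], int) and isinstance(out[-3], int)):
            op = out.pop(); b = out.pop(); a = out.pop()
            out.append(OPS[op](a, b))
    return out

# an "add-ten"-like body, 3 7 + +, applied to an argument on the stack:
print(fold([3, 7, '+', '+']))   # -> [10, '+']
```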

> If
> that functionality is present in the optimization portion of a compiler, or
> as a separate optimization program, what functionality - from the
> programmer's perspective - is missing?  

If the languages you know don't offer such features, and if you've
never thought about how you would exploit such features, then why
would it be missing? The point of the wide variety of programming
languages out there is that they can offer some radically different
ways to view and solve problems, and having that in your toolbox means
that you aren't limited to solving the same problems in the same
ways. If all you know is Forth and C, then your toolbox is firmly
planted in the imperative procedural world. That's fine, and you can
solve a lot of problems that way. But there are other ways to solve
problems that can lead to other solutions that are simpler, easier,
and more elegant.

> (IMO, C proves that the "ability to
> manipulate program code as data" is not needed as part of the language.)

A Turing machine "proves" that a language far simpler than C or Forth
can solve any computable problem. So your point is?

It's been proven that a microprocessor with a single instructions-- a
subtract and branch-- can be built up to do anything. So your point
is?

Here's your homework assignment: Learn Lisp. Like Forth, it's not a
difficult language to learn. You can pick up the basics in just a few
days. Then, with that basic knowledge, research the kind of solutions
that Lisp programmers come up with and then you'll understand why your
questions are so... silly.


Rod Pemberton

unread,
Jul 10, 2010, 2:46:08 AM7/10/10
to
"John Passaniti" <john.pa...@gmail.com> wrote in message
news:c916dea3-0fc6-4170...@q12g2000yqj.googlegroups.com...

> The point of the wide variety of programming
> languages out there is that they can offer some radically different
> ways to view and solve problems, and having that in your toolbox
> means that you aren't limited to solving the same problems in the
> same ways. If all you know is Forth and C, then your toolbox is
> firmly planted in the imperative procedural world. That's fine, and
> you can solve a lot of problems that way. But there are other ways
> to solve problems that can lead to other solutions that are simpler,
> easier, and more elegant.
>

Ahem, wasn't this e) in the original list? ...

If all the problems I need to solve are solvable with a certain programming
language and/or paradigm, why would I bother with learning another language
just to find another solution - which may or may not be simpler, easier, or
more elegant - to a problem which is already solvable?

If I had a miniature 3HP chainsaw with a seven inch bar, I could cut down
the typical 4 inch branch twice as fast as the 3 minutes it takes with the
laser-cut tooth handsaw, and I wouldn't be so sweaty... It's only $229USD
plus $2.29 per gallon of gas and $0.99 for 2-stroke oil vs. $19.99 for the
handsaw, some effort, and my favorite sugar water: Mt. Dew. Makes sense to
select the first option, doesn't it? Let's ignore the two-stroke pollution.
We hate sweat. Saves a minute and a half... Etc.

Even if I learned a "new method" of solving programming problems, it doesn't
guarantee that the results I get are *more useful* to me. There is nothing
to prove they have some other benefit: time, money, skills, etc. All that's
known is that they *may* provide an alternate and equivalent solution.
I.e., no proven advantage. This is ignoring the time investment for
"playing" around with a "new method". I wouldn't be an "expert" in this
"new method" of solving problems for some time, and I'm likely to introduce
errors into the solution(s) or incorrectly implement solutions until I
become one. The languages you suggest are obscure, not widely available,
and have few to no commercial programmers available. (I.e., they're "dead".)

> > (IMO, C proves that the "ability to
> > manipulate program code as data" is not needed as part of the language.)
>

> So your point is?
>

C is a powerful language with no need to manipulate code as data. The C
preprocessor effects code transformations, but it's not a necessity.

> research the kind of solutions
> that Lisp programmers come up with and
> then you'll understand why your
> questions are so... silly.
>

JP's up on Cloud-9 again... Just because Lisp can be bootstrapped from a
small set of "primitives", like Forth, does not mean that Lisp is better
than Forth or some other language. Just because Lisp has the ability to
natively implement binary or other data trees does not mean that the
solutions Lisp programmers "come up with" are any more elegant than some
other language, once code for manipulating trees has been written for them.
Their solutions may be shorter, smaller, etc., but the operations performed
are the same. It just means you have to work a bit less. Ditto for other
"advantages" or "features" of Lisp. (Lisp is another language frequently
implemented in C.) I see this as no different than your RegEx's are the
best way to parse - because I have to do less coding - argument.


Rod Pemberton

Paul Rubin

unread,
Jul 10, 2010, 3:40:30 AM7/10/10
to
"Rod Pemberton" <do_no...@notreplytome.cmm> writes:
> If all the problems I need to solve are solvable with a certain programming
> language and/or paradigm, why would I bother

If those are the only problems you're capable of solving, you're in a
hard place once the world around you changes and presents a different
set of problems. Imagine someone making a post like yours, but changing
all the Forth references to Cobol references. Your post comes across
about the same way.

See the discussion of the "Blub paradox" in Paul Graham's famous
"Beating the Averages" essay:

http://www.paulgraham.com/avg.html

What Graham doesn't realize, of course, is that Lisp is also a Blub.

jacko

unread,
Jul 10, 2010, 6:56:23 AM7/10/10
to
So all said and done, macros are just clever use of xts, or just
freeform functional parameter types *fn().

The best thing to say to annoy a Lisper? "I learnt Scheme the other
day, and it's based on that outdated language you use."

Cheers Jacko

Andreas

unread,
Jul 10, 2010, 11:53:52 AM7/10/10
to
> Here's your homework assignment: Learn Lisp. Like Forth, it's not a
> difficult language to learn. You can pick up the basics in just a few
> days. Then, with that basic knowledge, research the kind of solutions
> that Lisp programmers come up with and then you'll understand why your
> questions are so... silly.


Well, I wouldn't put it that way, however
Learning Lisp (or Prolog or Haskell) can be a revelation when you are
used to imperative programming.

And while we're at it: ECLiPSe.


jacko

unread,
Jul 10, 2010, 3:59:25 PM7/10/10
to
> 1. Convenient representation of lists (as S-expressions) at compile
> time, including tree structures as nested lists.

Just like threaded code but with an extra next instruction pointer and
node type overhead.

> 2. Easy manipulation and creation of lists at runtime, including the
> CONS primitive that creates new list nodes

n NODES ( for expanding the free list )

> 3. Automatic storage management (garbage collection) to reclaim memory
> occupied by no-longer-reachable data created by CONS.

Requires a modification of ! when used with lists.

> 4. Functions represented as expressions in lambda calculus, leading to
> convenient implementation of higher-order functions like map and reduce,
> delayed evaluation by wrapping an expression in a lambda, etc.

Just a form of locals.

> 5. Data has runtime type tags ensuring type safety.

Sometimes implemented as (LIST (NODE a) (LIST (NODE b) (LIST (NODE
c)))) i.e. a hidden type cell before the value. While this method
looks bad in pointer chase terms it does mean nodefn can be an array
switch, instead of a nested if then. Usually this is optimized to a 4
node <leftval rightval count class>. Which is sometimes reduced to a 2
node with the class encoded as the alignment bit of the rightval which
is always made a pointer, and some horrid locking garbage collector
code.
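A minimal sketch of such a tagged node (Python; the field layout is invented for illustration and deliberately naive compared to the packed 2- and 4-cell layouts described above):

```python
# Hedged sketch of a tagged cons cell: each node carries a runtime type
# tag alongside its value, so list primitives can check what they hold.
def cons(head, tail):
    return {'tag': 'pair', 'car': head, 'cdr': tail}

def car(cell):
    assert cell['tag'] == 'pair'
    return cell['car']

def cdr(cell):
    assert cell['tag'] == 'pair'
    return cell['cdr']

lst = cons(1, cons(2, cons(3, None)))   # the list (1 2 3)
print(car(lst), car(cdr(lst)))          # -> 1 2
```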

It seems to me the tendency to pass by value for compiler efficiency
is what makes many languages of listy nature quite crap ;-) 'cos the
evil EVAL has to occur somewhere, or does it?

Cheers Jacko

Paul Rubin

unread,
Jul 10, 2010, 4:07:55 PM7/10/10
to
jacko <jacko...@gmail.com> writes:
> It seems to me the tendency to pass by value for compiler efficiency
> is what makes many languages of listy nature quite crap ;-) 'cos the
> evil EVAL has to occur somewhere, or does it?

Like Forth, Lisp is generally compiled these days. Eval is rarely used,
and in some Lisp dialects, it doesn't exist.

Paul Rubin

unread,
Jul 10, 2010, 4:09:52 PM7/10/10
to
jacko <jacko...@gmail.com> writes:
> The best thing to say to annoy a Lisper? "I learnt Scheme the other
> day, and it's based on that outdated language you use."

Heh, I think at this point they'd be delighted that someone new had
gotten interested in any Lisp dialect. The way to really tick off
Lispers involves discussing the virtues of static type systems.
I won't go into it since it's probably a touchy subject even here ;-).

jacko

unread,
Jul 10, 2010, 4:28:54 PM7/10/10
to
On 10 July, 21:07, Paul Rubin <no.em...@nospam.invalid> wrote:

But surely just using the parameter would provide an eval, unless of
course it was passed to another function.

x -> eval x

f x -> f 'x

Bernd Paysan

unread,
Jul 10, 2010, 4:36:14 PM7/10/10
to
Paul Rubin wrote:
> Like Forth, Lisp is generally compiled these days. Eval is rarely
> used, and in some Lisp dialects, it doesn't exist.

One should note that, like Lisp, the ancient way of implementing Forth
(by indirect or direct threaded code) allows the same depth of
manipulation of the program, because it *is* now data, too. It is not a
list, it's an array, but this shouldn't matter much. You can decompile
that stuff easily, and also modify it, e.g. to run a single step
debugger (Gforth's debugger requires Gforth to run in ITC mode).

Using a tokenized representation of a word (i.e. one where program=data)
is also deployed by advanced native code compilers like VFX Forth, where
the tokenized words are fed to the inliner (before, it was a source
inliner, but that was longer and less friendly to guru code). The
tokenized representation is actually not executed in VFX, but with some
diving into its internals, using that representation to the same effect
as in classical Lisp should not be difficult.

--
Bernd Paysan
"If you want it done right, you have to do it yourself"
http://www.jwdt.com/~paysan/

Paul Rubin

unread,
Jul 10, 2010, 4:56:56 PM7/10/10
to
jacko <jacko...@gmail.com> writes:
>> Like Forth, Lisp is generally compiled these days.  Eval is rarely used,
>> and in some Lisp dialects, it doesn't exist.
>
> But surely just using the parameter would provide an eval, unless of
> course it was passed to another function.

No it's just like saying f(g(x), h(x)) in a C-like language. The
compiler figures out what it means and generates traditional machine
code for it. Using something like eval at runtime is a crufty and
outdated style even when it's available.

Aleksej Saushev

unread,
Jul 10, 2010, 5:27:42 PM7/10/10
to
Paul Rubin <no.e...@nospam.invalid> writes:

Very strange impression; I don't quite understand how you got it.
SBCL does type inference when compiling, and ECL does too, AFAIK.


--
HE CE3OH...

Paul Rubin

unread,
Jul 10, 2010, 5:35:36 PM7/10/10
to
Aleksej Saushev <as...@inbox.ru> writes:
> Very strange impression, I don't quite understand how you got it.
> SBCL does type inference when compiling, and ECL does too, AFAIK.

It's one thing to say the compiler deduces some type info that it can
use for optimization when it's available. That's much different from
saying the compiler should reject all programs that it can't deduce are
completely free of type errors.

jacko

unread,
Jul 10, 2010, 6:04:56 PM7/10/10
to
On 10 July, 21:56, Paul Rubin <no.em...@nospam.invalid> wrote:

So then the complexity of the compiler rises, special forms enter, and
dynamic entry at the prompt is lost. 'Cos if that don't invole an
eval, then a miracle must have happened. Data as programs => eval,
programs as data => recall with no evaluate (yet).

Paul Rubin

unread,
Jul 10, 2010, 6:24:38 PM7/10/10
to
jacko <jacko...@gmail.com> writes:
> So then the complexity of the compiler rises, special forms enter, and
> dynamic entry at the prompt is lost. 'Cos if that don't invole an
> eval, then a miracle must have happened. Data as programs => eval,
> programs as data => recall with no evaluate (yet).

Most systems have a read-eval-print loop. Eval is an interpreter in
some and a compile-and-run operation in others. It's typical for a Lisp
to have both a compiler and an interpreter, where the interpreter is
much slower. Forth is the same way from what I understand. The
interpreter is good for quick interaction and testing stuff out. Then
you compile the program for compactness and speed.

Evalling stuff at an interactive prompt is common and useful. It's less
usual to call eval in the middle of a larger program.
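The loop Paul describes can be sketched in a few lines (Python; `repl` is an invented helper, and Python's `eval` stands in for the interpreter half, where a real Lisp might compile-and-run instead):

```python
# A toy read-eval-print loop in the shape described above.
def repl(lines):
    env = {}
    results = []
    for line in lines:               # "read"
        value = eval(line, env)      # "eval" (interpreted in this toy)
        print(value)                 # "print"
        results.append(value)
    return results

repl(["1 + 2", "max(3, 4)"])         # prints 3, then 4
```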

John Passaniti

unread,
Jul 10, 2010, 6:32:45 PM7/10/10
to
On Jul 10, 2:46 am, "Rod Pemberton" <do_not_h...@notreplytome.cmm>
wrote:

> If all the problems I need to solve are solvable with a certain
> programming language and/or paradigm, why would I bother with
> learning another language just to find another solution - which
> may or may not be simpler, easier, or more elegant - to a problem
> which is already solvable?

Let's all be thankful that the world isn't filled with people like
you-- that for most people there is a constant drive for something
better. Forth didn't solve any fundamentally new problems when it was
created, so by your argument, Charles Moore wasted his time in
adapting and synthesizing the existing ideas around him together into
Forth.

The only reason I can think someone would hold your attitude regarding
programming languages and paradigms is that you haven't ever seriously
stepped outside what you know. And when something more advanced-- or
even just different-- is presented to you, your mind races to a
bizarre kind of reductionism.

> Even if I learned a "new method" of solving programming problems,
> it doesn't guarantee that the results I get are *more useful* to me.  

Correct. But you'll never know unless you try. In my experience, my
love of learning programming languages and paradigms has led to ways
of thinking that are worse, merely different, and better. Having
these in my head means that I can see solutions to problems that
others can't. Where I now work, the designer of some DSP code has
some limitations in it that I found arbitrary. In our current
products, the user can create DSP signal chains to solve real-world
problems (like feedback suppression, time alignment of speakers, room
equalization, etc.). But the user can only apply one single algorithm
per chain. I looked at this, and based on my wider experience, found
this limitation to be completely unnecessary. I can apply the
techniques I've learned from other languages and paradigms to remove
the limitation and give a different way to solve the problem (and
write the code) that opens up the product in ways that the original
limited view would never allow. And I can do it with less code and
simpler techniques.

This is the real-world. I don't just study other languages and
paradigms because I'm bored. It gives me a competitive advantage over
people like-- well, like you-- because I don't have to work as hard to
get the same result and because I can come up with simpler and more
robust systems that are more flexible. That's something real. That
means that I can do more. In the same amount of time you could crank
through a canonical solution, I'm already on the next product, making
money for the company, and enriching myself with more experience.

> JP's up on Cloud-9 again...  Just because Lisp can be bootstrapped from a
> small set of "primitives", like Forth, does not mean that Lisp is better
> than Forth or some other language.  

I didn't claim it was better. I claimed it was different, and that
difference allows different solutions to problems. Better for some
tasks, equivalent for others, and sometimes worse. The value in Lisp
over Forth is that it allows a different way of thinking about
problems. In much the same way that Forth over C allows a different
way to think about problems. That's not "cloud nine." That's simple
pragmatism combined with humility that the world is bigger than
myself, and that others have come up with different ways to think
about the solutions to problems. You apparently haven't learned that
life lesson yet.

> Just because Lisp has the ability to
> natively implement binary or other data trees does not mean that the
> solutions Lisp programmers "come up with" are any more elegant than some
> other language, once code for manipulating trees has been written for them.
> Their solutions may be shorter, smaller, etc., but the operations performed
> are the same.  

No, they aren't. And that's the key part you seem to be missing.
They *aren't* the same. The abstractions that are possible in other
languages and paradigms don't always reduce down to the same basic
operations. And they certainly don't require the same amount of
effort to craft. I'll provide a *simple* example, which you'll
distort and completely miss the point:

Here's a problem: You have to create a parser for a language. One
way to do this would be use recursive-descent. You grab tokens and
transition through code, possibly back-tracking over recursive paths.
That is a code-centric way to deal with the problem of parsing.
Another way you could do this would be to use a tool like yacc, which
instead creates a structure that captures the possible transitions
that you can take as you consume tokens. That is a data-centric way
to deal with parsing. They aren't the same. They have different run-
time properties, different strengths and weaknesses. So say you only
knew about one method-- say, recursive descent. So now you come
across a parsing task that can't be easily solved-- such as a grammar
that has left-recursion in the definition. If you only knew
recursive descent, you would avoid such grammars,
introduce hacks in the code to get around them, or limit the language
you're writing the parser for. Someone else who knew there was
another way wouldn't need this-- they could consider solutions where
such compromises wouldn't be needed.
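For illustration, here is a tiny recursive-descent parser (Python; the grammar and names are invented) showing the standard workaround for the left-recursion problem mentioned above: the left-recursive rule `expr := expr '+' num` is rewritten as iteration so the parser doesn't recurse forever:

```python
# Hedged sketch of the code-centric approach: recursive descent for
# "+" expressions, with left recursion turned into a loop.
def parse_expr(tokens):
    value, rest = parse_num(tokens)
    while rest and rest[0] == '+':       # iteration instead of left recursion
        rhs, rest = parse_num(rest[1:])
        value = ('+', value, rhs)        # left-associative tree
    return value, rest

def parse_num(tokens):
    return int(tokens[0]), tokens[1:]

tree, leftover = parse_expr(['1', '+', '2', '+', '3'])
print(tree)      # -> ('+', ('+', 1, 2), 3)
```

A table-driven (yacc-style) parser handles the left-recursive grammar directly, which is exactly the trade-off the paragraph above describes.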

Here's another example. Let's say you have a GUI where you have three
views of the same data. Think of a spreadsheet-like application where
you have a column of numbers in one window, a bar graph in another,
and some statistics based on that column of numbers in a third
window. When the user changes values in the column of numbers, the
graph and statistics windows need to be updated. And let's say that
you also allow the user to drag bars up and down in the graph window
and update those same numbers. Now the question is how do you handle
this? If you didn't know any better, you might approach this problem
by having code that for each view explicitly updates the other ones.
But what happens if the user can open up another graph view-- say a
pie chart of the same data? Or what happens if this is an embedded
system and we also need to update a number of the front-panel of the
box it's in?

One solution to this is the model-view-controller (MVC) design
pattern. But if you didn't know what a design pattern was or how MVC
worked, you might try to solve the problem with a code-centric,
inflexible, and explicit solution to the problem. The end solution is
not the same-- in one, you expressly write code to update every view.
That's a centralized solution, and requires /a priori/ knowledge. It
also becomes a maintenance problem, because as the definitions change,
you need to update the code in each and every view. The MVC solution
is decentralized, data-centric, and flexible. It represents less
maintenance on the code when things change, and it scales.
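A minimal observer-style sketch of the MVC idea (Python; the class and method names are invented): the model knows nothing about its views, so adding a pie chart or a front-panel display requires no change to the model:

```python
# Hedged sketch of the decentralized MVC update described above.
class Model:
    def __init__(self):
        self.values = []
        self.observers = []

    def attach(self, observer):
        self.observers.append(observer)

    def set_values(self, values):
        self.values = values
        for observer in self.observers:   # notify every registered view
            observer(self.values)

model = Model()
model.attach(lambda v: print("graph redraw:", v))
model.attach(lambda v: print("stats: sum =", sum(v)))
model.set_values([1, 2, 3])
# -> graph redraw: [1, 2, 3]
#    stats: sum = 6
```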

> It just means you have to work a bit less.  Ditto for other
> "advantages" or "features" of Lisp.  (Lisp is another language
> frequently implemented in C.)  I see this as no different than
> your RegEx's are the best way to parse - because I have to do
> less coding - argument.

I guess you don't value your time. I do. I like to spend time with
family and friends. You might look into it.

Your brain has apparently been hardwired in such a way that you can't
accept the notion that there is more than one way to do something, and
that those differences can be quantitatively and qualitatively
better. And that prevents you from being able to see discussions like
this as anything more than a simple "X is better than Y" argument. I
never claimed that regular expressions are the best way to parse-- I
said that for some classes of problems, they are better. I never said
that Lisp is better than Forth-- I said that for some classes of
problems, it is better.

You also continue to assume that because one language can be
implemented in another-- such as a Forth or Lisp in C-- that somehow
makes it pointless to code in Forth or Lisp, since it all reduces down
to what you could do in C. But that's the thing-- when I'm writing
code in Forth or Lisp, I'm operating on a different level than I am in
C. I'm not thinking about the same things I am when I'm writing in
C. And that allows me to solve problems differently.

You have to understand this on some level, or you wouldn't be in this
newsgroup. You would be in comp.lang.c, writing that all problems
have been solved, and there is no reason to invent new languages or
paradigms. But weirdly, you're still here. Why is that?

Aleksej Saushev

unread,
Jul 10, 2010, 6:50:08 PM7/10/10
to
Paul Rubin <no.e...@nospam.invalid> writes:

This is highly controversial. In most cases you don't need formal proofs
for your code, and strong typing is a part of that. If you need a powerful
type system (and you understand that you need it quite fast), then you
lose type inference and have to write additional type annotations.
That's too high a price for a little gain.


--
HE CE3OH...

jacko

unread,
Jul 10, 2010, 6:59:04 PM7/10/10
to
My basic point is lambda expressions can be unfurled with the
arguments via list substitution without evaluation, making a super-
function that is the program. The pre-evaluation of function arguments
does not fit with lazy evaluation, especially if the program is only
returning a subsection of all that would get evaluated. I'm sure this
was the reason for not specifying an order of evaluation of the
arguments in Scheme.
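One way to make jacko's point concrete: wrap each argument in a thunk (a zero-argument lambda), so nothing is evaluated until the function itself forces it. A sketch (Python; `my_if` is an invented example):

```python
# Hedged sketch of deferred argument evaluation via thunks: the callee
# decides what to force, so the untaken branch is never evaluated.
def my_if(cond, then_thunk, else_thunk):
    return then_thunk() if cond else else_thunk()

def loop_forever():
    while True:
        pass

# With strict evaluation of arguments this call could never return;
# with thunks the non-terminating branch is simply never forced.
print(my_if(True, lambda: 42, lambda: loop_forever()))   # -> 42
```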

jacko

unread,
Jul 10, 2010, 7:30:48 PM7/10/10
to

The only reason macros are needed is because of the faux evaluation of
arguments to a function. Yes, low-level prims do cause an eval, and
that needs to be, so why have the higher-level functions do it before
necessity?

After all lambda expressions just replace the names in the list with
(CDR (CAR CDR) (CAR CAR CDR) ... )

Paul Rubin

unread,
Jul 10, 2010, 7:33:02 PM7/10/10
to
Aleksej Saushev <as...@inbox.ru> writes:
> This is highly controversial. in most cases you don't need formal proofs
> for your code, and strong typing is a part of it. If you need powerful
> typing system (and you understand that you need it quite fast), then you
> lose type inference and have to write more additional type annotations.
> It's too high price for a little gain.

Earlier I wrote:

> The way to really tick off Lispers involves discussing the virtues of
> static type systems.

If anyone cares, they are now seeing an example ;-)

Paul Rubin

unread,
Jul 10, 2010, 7:35:25 PM7/10/10
to
jacko <jacko...@gmail.com> writes:
> The pre-evaluation of function arguments
> does not fit with lazy evaluation, especially if the program is only
> returning a subsection of all that would get evaluated. I'm sure this
> was the reason for not specifying an order of evaluation of the
> arguments in Scheme.

I'm not sure what you're getting at. Scheme uses strict evaluation.
I'm not sure why it didn't specify the order of evaluation and maybe
that was an error. I think ML specifies the order.

jacko

unread,
Jul 10, 2010, 7:55:59 PM7/10/10
to
> I'm not sure what you're getting at.  Scheme uses strict evaluation.
> I'm not sure why it didn't specify the order of evaluation and maybe
> that was an error.  I think ML specifies the order.

I'm for not having an order if such a thing is justified. It's the
programming idea that something is evaluated before passage into a
function, while still expecting said function to be as flexible as an
operator or functor, that is, well, what's the word, silly?

Paul Rubin

unread,
Jul 10, 2010, 8:39:57 PM7/10/10
to
jacko <jacko...@gmail.com> writes:
> I'm for not having an order if such a thing is justified. It's the
> programming idea that something is evaluated before passage into a
> function, and then expect said function to be as flexible as an
> operator or functor, this is so, well whats the word, silly?

Haskell uses deferred evaluation for everything and it can get pretty
confusing. What you really want is the language's type system tracking
whether any term is lazy or strict. That seems to be the current trend
in research languages, but maybe not yet in real-world use.

jacko

unread,
Jul 10, 2010, 9:18:50 PM7/10/10
to
On 11 July, 01:39, Paul Rubin <no.em...@nospam.invalid> wrote:

xstrict = eval xlazy. Of course, only at the point the eval is
evaled ;-).

Lazy is not a type property. It's a method of achieving everything that
macros do without making them distinct from functions. Any function is
capable of forcing an evaluation if needed. Quite simply, fold would be
the place to put the evaluated folds of embedded lists, so as to be
mainly transparent to the user, such that when + etc. were used,
evaluation would happen.

I don't want strict or lazy tracking, I want no evaluation of function
arguments, unless the function requests it by doing the evaluation
explicitly, or relies on the result of a function which does an
explicit evaluation.

jacko

unread,
Jul 10, 2010, 9:34:38 PM7/10/10
to
> I don't want strict or lazy tracking, I want no evaluation of function
> arguments, unless the function requests it by doing the evaluation
> explicitly, or relies on the result of a function which does an
> explicit evaluation.

To put that in forth terms. I want my xt to remain an xt until I
EXECUTE it.

Paul Rubin

unread,
Jul 10, 2010, 9:37:49 PM7/10/10
to
jacko <jacko...@gmail.com> writes:
> Lazy is not a type property. ...

> I don't want strict or lazy tracking, I want no evaluation of function
> arguments, unless the function requests it by doing the evaluation
> explicitly,

What do you do with recursive or infinite structures?

Harper has discussion of type-tracked lazy evaluation in PFPL:

http://www.cs.cmu.edu/~rwh/plbook/book.pdf

See the "Polarization" chapter, ch. 42 in the current draft above.

jacko

unread,
Jul 10, 2010, 10:03:44 PM7/10/10
to
On 11 July, 02:37, Paul Rubin <no.em...@nospam.invalid> wrote:

Well, lets see... Recursive structures are programs, and so running a
function would provide the output, therefore the list would be
produced by the recursive algorithm as the + operator combined its
arguments.

Infinite structures? Well, I don't intend to use one.

Paul Rubin

unread,
Jul 10, 2010, 10:17:53 PM7/10/10
to
jacko <jacko...@gmail.com> writes:
> Well, lets see... Recursive structures are programs, and so running a
> function would provide the output,

In Haskell you can have recursive data, not just recursive programs.

> An infinite structures? Well I don't intend to use one.

[1..] is the infinite list [1,2,3,...] and it's quite convenient to be
able to use such values.
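For example, Haskell's [1..] can be approximated with a generator (Python sketch): the structure is conceptually infinite, and laziness lets a finite prefix be consumed safely:

```python
# Sketch of the infinite list [1, 2, 3, ...]; only the demanded prefix
# is ever produced, so the program still terminates.
import itertools

naturals = itertools.count(1)                       # 1, 2, 3, ...
first_five = list(itertools.islice(naturals, 5))
print(first_five)                                   # -> [1, 2, 3, 4, 5]
```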

John Passaniti

Jul 10, 2010, 10:34:50 PM
On Jul 10, 4:36 pm, Bernd Paysan <bernd.pay...@gmx.de> wrote:
> One should note, that like Lisp, the ancient way of implementing
> Forth (by indirect or direct threaded code) allows the same depth
> of manipulation of the program, because it *is* now data, too.  

Kinda. Yes, it was theoretically possible to treat programs as data
in Forth-- at least in a limited way. Words like "see" weakly count
toward this, but the fidelity of some decompilations (say, for defining
words that use does>) shows that the compiled form didn't always track
back exactly to the source. And "see" is just decompilation-- actual
modification of Forth code was largely limited to debugging
facilities. But more importantly, both "see" and debugging facilities
require carnal knowledge of the internals of the Forth. And that made
such efforts completely non-portable which means that it never became
part of the toolchest of Forth programmers. Even today with the ANS
Standard, there isn't enough standardization that someone could
reliably write code that could manipulate Forth programs as data. And
so except for marginal cases like "see" and debugging tools, it simply
isn't something Forth programmers use, because they can't count on it.

That was never a problem Lisp had. In Lisp, there is no special API
for manipulating programs as data. Programs are a list, just like any
other list. The programmer had to learn *nothing* about the system in
order to be able to reliably manipulate programs.

> It is not a
> list, it's an array, but this shouldn't matter much.  You can decompile
> that stuff easily, and also modify it, e.g. to run a single step
> debugger (Gforth's debugger requires Gforth to run in ITC mode).

Then please take up my initial challenge. Write a Forth program that
takes the add-ten word I defined, and does constant folding,
recursively, on the definition. Since you can't possibly do this
portably, pick your favorite Forth and do it in that. As you do this,
please note what information you had to look up (such as the structure
of definitions) and how you had to handle numeric literals as special
cases. When you're done with this, I'll show the Lisp solution to the
problem I came up with in about a minute.
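For readers without a Lisp to hand, the general flavor of the Lisp-side approach -- code held as nested lists and rewritten by an ordinary function -- can be sketched in Python. This is a hypothetical stand-in, not John's actual solution; nested lists like ["+", "x", ["+", 3, 7]] stand in for S-expressions:

```python
import operator

# Operators known to be pure, so all-constant calls can be folded.
OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def fold(expr):
    """Recursively constant-fold an expression held as nested lists."""
    if not isinstance(expr, list):
        return expr                               # atom: number or symbol
    op, *args = expr
    args = [fold(a) for a in args]                # fold sub-expressions first
    if op in OPS and all(isinstance(a, (int, float)) for a in args):
        return OPS[op](*args)                     # every argument is constant
    return [op, *args]

add_ten = ["+", "x", ["+", 3, 7]]                 # x + (3 + 7)
print(fold(add_ten))  # ['+', 'x', 10]
```

The point of the challenge survives the translation: because the program is ordinary data, the folder needs no knowledge of dictionary layout, literal handling, or threading.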

> Using a tokenized representation of a word (i.e. one where program=data)
> is also deployed by advanced native code compilers like VFX Forth, where
> the tokenized words are fed to the inliner (before, it was a source
> inliner, but that was longer and less friendly to guru code).  The
> tokenized representation is actually not executed in VFX, but with some
> diving into its internals, using that representation to the same effect
> as in classical Lisp should not be difficult.

Show me your code, and I'll show you mine. Keep track of the amount
of time you spent "diving into its internals", something the Lisp
programmer doesn't have to consider.

jacko

Jul 10, 2010, 10:51:52 PM
On 11 July, 03:17, Paul Rubin <no.em...@nospam.invalid> wrote:

> jacko <jackokr...@gmail.com> writes:
> > Well, lets see... Recursive structures are programs, and so running a
> > function would provide the output,
>
> In Haskell you can have recursive data, not just recursive programs

Recursive data always comes from a program.

> > An infinite structures? Well I don't intend to use one.
>
> [1..] is the infinite list [1,2,3,...] and it's quite convenient to be
> able to use such values.

Let's just say it's a recursive sequence, with no halting, which
exceeds lifespan. You're just hankering after a yield of many function
returns instead of a single construction. Or streaming producers.
There is no need for these to be explicit in a language.

Paul Rubin

Jul 10, 2010, 11:05:02 PM
jacko <jacko...@gmail.com> writes:
> Recursive data always comes from a program.
Here is a circular list in Haskell:

let x = 1:2:3:x

Where is the program?
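In a strict setting the same knot has to be tied with an explicit mutation step; a Python sketch of the analogue (cons cells faked as two-element lists; illustrative only):

```python
# Haskell's  let x = 1:2:3:x  built strictly, with one backpatch.
# Each cell is [head, tail]; None is a placeholder until the knot is tied.

x = [1, [2, [3, None]]]
x[1][1][1] = x            # the tail of the 3-cell points back at x

# Walking the structure cycles 1, 2, 3, 1, 2, 3, ... ; take six elements:
out, cell = [], x
for _ in range(6):
    out.append(cell[0])
    cell = cell[1]
print(out)  # [1, 2, 3, 1, 2, 3]
```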

> There is no need for these to be explicit in a language.

Shrug. It's useful. Okasaki's book "Purely functional data structures"
shows quite a few algorithms on recursive data, though implemented with
explicit thunks in ML instead of pervasive lazy evaluation like Haskell.

Rod Pemberton

Jul 11, 2010, 5:17:59 AM
"John Passaniti" <john.pa...@gmail.com> wrote in message
news:59979751-98eb-4ea3...@u7g2000yqm.googlegroups.com...

> The abstractions that are possible in other
> languages and paradigms don't always reduce
> down to the same basic
> operations.
>

Yes, they do. *ALL* high-level languages reduce to *ASSEMBLY*. All
microprocessors use standardized memory (8-bit byte sized, contiguous, due
to ASCII and EBCDIC circa '74). If they don't reduce to such basic
operations, then they are *uncomputable* on current hardware.

> And they certainly don't require the same amount of
> effort to craft.

True. Like I said, I think you're really only interested in how much work
you have to do, not whether or not the language can be used to effectively
implement a solution.

> [parsing]


> Someone else who knew there was
> another way wouldn't need this-- they could consider
> solutions where such compromises wouldn't be needed.
>

The techniques used for parsing are implementable in numerous languages.
The choice of solution has nothing to do with the language. The
implementation of the solution must do the same thing regardless of
language.  You'll find recursive-descent, yacc, LALR(1), LL(1), and others
implemented for many languages. This is not an example of a situation where
one language has "abstractions" which simplify the solution as you've
claimed. They may simplify the implementation... slightly.

> [ GUI, MVC blah]

Ditto. First, nothing to do with the language, again. Second, it's much
more likely someone will "happen upon" an event driven solution or a message
passing solution - even if they don't know what either is - than what you've
suggested... It's even more likely they will given modern OOP languages
with object-action paradigms.

> You also continue to confuse that because one language can be
> implemented in another-- such as a Forth or Lisp in C-- that
> somehow makes it pointless to code in Forth or Lisp, since it
> all reduces down to what you could do in C.

Isn't it?

C is everywhere.
C is effective.
C is simple.
C is powerful.
C is high-level.
C is low-level.
C is standardized.
C is portable. (The percentage is very debatable...)
C is compilable to assembly.
C is interpretable.
C is reduceable. (subsets of C, Java)
C is embeddable.
C is self-hosting.
C is extensible (C++, Cg, GLSL, Objective-C, OpenStep)
C works well with standardized microprocessor architectures.
C has lasted the test of time.

IMO, the only other language that comes close to that list is Forth.

The "Language Popularity Index" indicates C derivatives at 51.5% vs. 0.622%
for Lisp and/or Scheme...
http://lang-index.sourceforge.net/

The "TIOBE Programming Community Index" indicates C derivatives at 58.274%
vs. 0.622% for Lisp and/or Scheme and/or Clojure...
http://www.tiobe.com/index.php/content/paperinfo/tpci/index.html

If you include D and C shell etc., they'd be higher. If they decoupled Lisp
from Scheme and/or Clojure, it'd probably be below Forth for both...

But, even you must admit, that Lisp is slowly, logically, progressing
towards the ideology of C:

Lisp ->
Scheme (lexical scoping, tail recursion) ->
Clojure (functional) ->
???? (procedural) ...

Don't you understand there is a reason for this?

> But that's the thing-- when I'm writing
> code in Forth or Lisp, I'm operating on a
> different level than I am in C. I'm not thinking
> about the same things I am when I'm writing in
> C. And that allows me to solve problems differently.
>

So, you've confirmed it's a personal issue. Thank you. Finally.

> You have to understand this on some level, or you wouldn't
> be in this newsgroup.

OK, not sure how you leapt to that conclusion...

> You would be in comp.lang.c, writing that all problems
> have been solved, and there is no reason to invent new
> languages or paradigms.

Newbs and idiots... er, self-proclaimed C experts who aren't in reality.
There are a couple (by couple, I mean two...) self-proclaimed experts who
are knowledgeable, but they don't post very often, they still express some
misunderstandings, and they don't respond to replies ("post-n-go"). But,
there are a couple (by couple, I mean two...) really obnoxious and ignorant
self-proclaimed experts, who post incessantly. One has mellowed over the
past few years and become more knowledgeable, but the other has become
worse. The motivations of the latter two seem to be either 1) drive away
all C "newbs" from c.l.c. or 2) push their (incorrect) interpretations of C
specifications. I suggested they start another newsgroup so they didn't
have to "deal with" the newbs and could discuss "pure" C without the
unwanted posts. They rejected that idea... It's "our" group...

> But weirdly, you're still here. Why is that?
>

I reworked two early Forth interpreters from one assembler to another and
had intended to post them. But, I haven't gotten around to checking that
all branches were converted correctly, so I haven't posted them. And, I've
been writing a Forth interpreter in C for a while now. It derived out of
another project.  I'm also interested in the Forth concept of
"primitives" whereby a language or library is built up from a small subset
of functionality. I've posted previously about this topic, including a list
of counted operations needed to implement Forth, Lisp, C, C libraries,
Posix, etc. Why are you here? You never seem to discuss Forth... You
always discuss topics better suited to comp.lang.misc (language design,
features, parsing, tokenization, syntax, semantics, grammar, etc.). Why
aren't you there?


Rod Pemberton


ken...@cix.compulink.co.uk

Jul 11, 2010, 5:36:15 AM
In article
<be853ee1-dd3a-4a45...@u26g2000yqu.googlegroups.com>,
john.pa...@gmail.com (John Passaniti) wrote:

> Not the point. The point is that in Lisp, the programmer is free to
> manipulate the program as data, allowing arbitrarily-complex
> transformations of the code.

And this is a good thing? Self-modifying code has, IIRC, been described
as "devil spawn".

Besides, Lisp originated as a language for writing AI programs in, Forth
originated as a language for writing real-time hardware control
programs, and C originated as a language for writing Unix systems in. There
are very few languages that did not originate as application-specific
ones and are still best for that application, in spite of also being
useful as general programming languages. By the way, the C preprocessor
was included to handle hardware-specific functions, meaning that main
programs are easier to port.


Of the general programming languages, the only ones I remember catching
on in the same time period that Forth and Lisp originated were Basics
and Pascal. UCSD Pascal became quite popular because it was easy to port:
like Java, it was a byte-code language, and only the byte-code
interpreter needed porting. For RAD I believe that Pascal and Basic are
still the most popular, especially as the IDEs of Delphi and Visual Basic
make designing the user interface much easier.

Personally I do not think there was or ever will be a perfect computer
language, but there is a most suitable one for practical applications. Of
course, in some cases "most suitable" may be defined by the availability of
programmers, in which case C is likely to be top.

Ken Young

Albert van der Horst

Jul 11, 2010, 7:26:10 AM
In article <7xlj9kt...@ruckus.brouhaha.com>,
Paul Rubin <no.e...@nospam.invalid> wrote:

>John Passaniti <john.pa...@gmail.com> writes:
>> Not the point. The point is that in Lisp, the programmer is free to
>> manipulate the program as data, allowing arbitrarily-complex
>> transformations of the code. Yes, I picked a particular
>> transformation that some Forths happen to already do. I would hope
>> people could abstract a bit and see the larger point.
>
>I think programs-as-data is a feature found in many Lisps but isn't
>essential to Lisp.

This misconception has set back AI research for decades:
trying to do AI on a programming system where it is impossible.

> The defining features of Lisp IMO are:

>1. Convenient representation of lists (as S-expressions) at compile
>time, including tree structures as nested lists.
>
>2. Easy manipulation and creation of lists at runtime, including the
>CONS primitive that creates new list nodes
>
>3. Automatic storage management (garbage collection) to reclaim memory
>occupied by no-longer-reachable data created by CONS.
>
>4. Functions represented as expressions in lambda calculus, leading to
>convenient implementation of higher-order functions like map and reduce,
>delayed evaluation by wrapping an expression in a lambda, etc.
>
>5. Data has runtime type tags ensuring type safety.

All these things are done by Python, and better.
And no, I don't think LISP and Python are the same,
not by a long stretch.
A list of features can't express the essence of a language,
although John managed to pin down the most important one.

>
>All these things are doable in Forth with enough contortions, but then
>it's not really Forth any more.

You don't understand the essence of Forth. Suppose I set out to
write an editor in Forth. That is enough of an effort, that I
can amortize an automatic storage facility over it.
So I write a dedicated editor storage facility. I need
functions represented as data (a macro facility as they
call it in an editor) and write a dedicated user friendly one.
So the absence of these things as built-ins is a trait of the
language, and belongs to its essence.

In an embedded system, blindly using a function like malloc() makes you
lose control. This is the kind of mistake that might have caused the
runaway Toyotas.

Look at colorforth, and fig-Forth, and iso-Forth's.
What do they have in common?
It is not the presence of the DROP word that makes us want
to call all these languages Forth.

Groetjes Albert

--
--
Albert van der Horst, UTRECHT,THE NETHERLANDS
Economic growth -- being exponential -- ultimately falters.
albert@spe&ar&c.xs4all.nl &=n http://home.hccnet.nl/a.w.m.van.der.horst

Albert van der Horst

Jul 11, 2010, 7:05:24 AM
In article <be853ee1-dd3a-4a45...@u26g2000yqu.googlegroups.com>,
John Passaniti <john.pa...@gmail.com> wrote:
>On Jul 8, 3:32=A0pm, Alex McDonald <b...@rivadpm.com> wrote:
>> While much of that is true, even my dumb-as-a-box-of-rocks Forth
>> generates this for add-ten;

>
>Not the point. The point is that in Lisp, the programmer is free to
>manipulate the program as data, allowing arbitrarily-complex
>transformations of the code. Yes, I picked a particular
>transformation that some Forths happen to already do. I would hope
>people could abstract a bit and see the larger point.

"allowing arbitrarily-complex transformations of the code"

That is the definition of an AI language. I intend to go there
with ciforth (computer intelligence Forth) but I'm not there.
I understand your point that in LISP the entitlement
is more built-in and fundamental.

Aleksej Saushev

Jul 11, 2010, 7:01:23 AM
"Rod Pemberton" <do_no...@notreplytome.cmm> writes:

> "John Passaniti" <john.pa...@gmail.com> wrote in message
> news:59979751-98eb-4ea3...@u7g2000yqm.googlegroups.com...
>> The abstractions that are possible in other
>> languages and paradigms don't always reduce
>> down to the same basic
>> operations.
>>
>
> Yes, they do. *ALL* high-level languages reduce to *ASSEMBLY*. All
> microprocessors use standardized memory (8-bit byte sized, contiguous, due
> to ASCII and EBCDIC circa '74). If they don't reduce to such basic
> operations, then they are *uncomputable* on current hardware.

This is the problem with the modern Forth community: it is mostly made
of people with no imagination and no experience in the "outer" world.

Prolog reduces to WAM, Lisp reduces to SECD, which you can emulate on
your real hardware, but only in a limited sense. That it is limited is easy
to see: Lisp is Church-complete, but your assembly is not.

>> [parsing]
>> Someone else who knew there was
>> another way wouldn't need this-- they could consider
>> solutions where such compromises wouldn't be needed.
>
> The techniques used for parsing are implementable in numerous languages.
> The choice of solution has nothing to do with the language. The
> implementation of the solution must do the same thing regardless of
> language.  You'll find recursive-descent, yacc, LALR(1), LL(1), and others
> implemented for many languages. This is not an example of a situation where
> one language has "abstractions" which simplify the solution as you've
> claimed. They may simplify the implementation... slightly.

Prolog is "slightly simplifying" backtracking, Haskell is "slightly simplifying"
infinite structures, that's quite nice.

>> You also continue to confuse that because one language can be
>> implemented in another-- such as a Forth or Lisp in C-- that
>> somehow makes it pointless to code in Forth or Lisp, since it
>> all reduces down to what you could do in C.
>
> Isn't it?
>
> C is everywhere.

...


> C has lasted the test of time.
>
> IMO, the only other language that comes close to that list is Forth.

Forth failed to last the test of time.

> The "Language Popularity Index" indicates C derivatives at 51.5% vs. 0.622%
> for Lisp and/or Scheme...
> http://lang-index.sourceforge.net/
>
> The "TIOBE Programming Community Index" indicates C derivatives at 58.274%
> vs. 0.622% for Lisp and/or Scheme and/or Clojure...
> http://www.tiobe.com/index.php/content/paperinfo/tpci/index.html
>
> If you include D and C shell etc., they'd be higher. If they decoupled Lisp
> from Scheme and/or Clojure, it'd probably be below Forth for both...

Have you ever tried to go outside the Forth community and visit those of
Scheme and Lisp? You're talking about how Forth is superior; alright, it is.

Now may I look at your web server in Forth? Web framework? None?
Sorry, it isn't rocket science. I can bring you a handful of web
frameworks in Lisp and Scheme. Perhaps even if you separate them.

Alright, you may have to fix bugs there, but still you don't write from scratch.

> But, even you must admit, that Lisp is slowly, logically, progressing
> towards the ideology of C:
>
> Lisp ->
> Scheme (lexical scoping, tail recursion) ->
> Clojure (functional) ->
> ???? (procedural) ...
>
> Don't you understand there is a reason for this?

Perhaps it is a revelation to you, but Lisp has lexical scoping,
and it's been so for 25 years already; tail calls are optimized where
you ask for it, and sometimes that is the default optimization level.
And then there exists ACL2, operating on a functional subset,
and there exists a procedural subset ("tagbody" and so on).

It isn't hard to learn that, if you spend some time reading up on what
you're going to talk about.


--
HE CE3OH...

Aleksej Saushev

Jul 11, 2010, 7:10:12 AM
ken...@cix.compulink.co.uk writes:

> In article
> <be853ee1-dd3a-4a45...@u26g2000yqu.googlegroups.com>,
> john.pa...@gmail.com (John Passaniti) wrote:
>
>> Not the point. The point is that in Lisp, the programmer is free to
>> manipulate the program as data, allowing arbitrarily-complex
>> transformations of the code.
>
> And this is a good thing? Self modifying code has IIRC been described
> as "devil spawn".

Are you from that "rebuild the world with debug print" camp?

Self-modifying code isn't good in many places, still it has its uses.

> Besides Lisp originated as a language for writing AI programs in,

You should go back and reread the history of Lisp.


--
HE CE3OH...

Albert van der Horst

Jul 11, 2010, 7:32:18 AM
In article <7xiq4mn...@ruckus.brouhaha.com>,

Paul Rubin <no.e...@nospam.invalid> wrote:
>jacko <jacko...@gmail.com> writes:
>> The pre-evaluation of function arguments
>> does not fit with lazy evaluation, especially if the program is only
>> returning a subsection of all that would get evaluated. I'm sure this
>> was the reason for not specifying an order of evaluation of the
>> arguments in Scheme.
>
>I'm not sure what you're getting at. Scheme uses strict evaluation.
>I'm not sure why it didn't specify the order of evaluation and maybe
>that was an error. I think ML specifies the order.

For a new language that is a mistake, if ever there was one.
It precludes the parallel evaluation of arguments to functions
in traits. Algol68 (yes that was in 68) explicitly specifies
that order of argument evaluation is undefined, in order
not to hinder progress.

Albert van der Horst

Jul 11, 2010, 8:00:54 AM
In article <7xpqywm...@ruckus.brouhaha.com>,
Paul Rubin <no.e...@nospam.invalid> wrote:
>"Rod Pemberton" <do_no...@notreplytome.cmm> writes:
>> (IMO, C proves that the "ability to
>> manipulate program code as data" is not needed as part of the language.)
>
>C has a preprocessor that allows limited manipulation of the program as
>data. It is nearly indispensible in real-world C programming,
>indicating that C really can't get along so well without at least some
>form of the capability. Java tried to get along without it, but ended
>up growing a complicated reflection interface instead. Scheme tried to
>get along without it but ended up with hygienic macros. Haskell tried
>to get along without it but ended up with numerous workarounds including
>Template Haskell, quasiquoting, Liskell, at least 3 different generics
>preprocessors. Overall, the usefulness of such a thing empirically
>recurs again and again in real-world programming systems. So it's
>reasonable to infer that it's probably important.

If you go that way, then a compiler is treating code as data.

Heck, what is a loader? It finds some data in a disk file,
then puts it in memory, manipulates some MMU registers and
sets up page lists. Then it tells the computer to go on at that point:
it is actually code.

I'm pretty sure this isn't what John meant, but it is insightful
to stress how essential it is. It is the invention of the
stored-program computer, vis-à-vis the fixed programming of the
weaving machines.

Albert van der Horst

Jul 11, 2010, 7:51:22 AM
In article <bd2b58f3-43d4-4c51...@b35g2000yqi.googlegroups.com>,
John Passaniti <john.pa...@gmail.com> wrote:
<SNIP>

>
>Then please take up my initial challenge. Write a Forth program that
>takes the add-ten word I defined, and does constant folding,
>recursively, on the definition. Since you can't possibly do this
>portably, pick your favorite Forth and do it in that. As you do this,
>please note what information you had to look up (such as the structure
>of definitions) and how you had to handle numeric literals as special
>cases. When you're done with this, I'll show the Lisp solution to the
>problem I came up with in about a minute.

I have done that, but in ciforth. It can be argued that ciforth
is a dialect/implementation model, so there.
Note that this kind of Lisp-like entitlement was a design
criterion for the Forth.

It starts from knowledge about assembler Forth words, such as their
being side-effect free.
I'm interested to know how it compares to a simple Lisp solution.

>
>> Using a tokenized representation of a word (i.e. one where program=data)
>> is also deployed by advanced native code compilers like VFX Forth, where
>> the tokenized words are fed to the inliner (before, it was a source
>> inliner, but that was longer and less friendly to guru code).  The
>> tokenized representation is actually not executed in VFX, but with some
>> diving into its internals, using that representation to the same effect
>> as in classical Lisp should not be difficult.
>
>Show me your code, and I'll show you mine. Keep track of the amount
>of time you spent "diving into its internals", something the Lisp
>programmer doesn't have to consider.

forthlecture5.html (on my site in the .sig) is about this.
It has actually been implemented, but only experimentally.
It is more theory than Forth and I would be surprised if
Lisp could escape the general line of reasoning that is in
there.

The code is not published on the website, but I could send it to you.
The reason that release 5 of ciforth takes so long is that it
would entitle the optimiser (as an example of an AI feature)
by the ciforth programming model.

jacko

Jul 11, 2010, 8:32:47 AM
On 11 July, 13:00, Albert van der Horst <alb...@spenarnc.xs4all.nl>
wrote:
> In article <7xpqywmgza....@ruckus.brouhaha.com>,
> Paul Rubin  <no.em...@nospam.invalid> wrote:

>
> >"Rod Pemberton" <do_not_h...@notreplytome.cmm> writes:
> >> (IMO, C proves that the "ability to
> >> manipulate program code as data" is not needed as part of the language.)
>
> >C has a preprocessor that allows limited manipulation of the program as
> >data.  It is nearly indispensible in real-world C programming,
> >indicating that C really can't get along so well without at least some
> >form of the capability.  Java tried to get along without it, but ended
> >up growing a complicated reflection interface instead.  Scheme tried to
> >get along without it but ended up with hygienic macros.  Haskell tried
> >to get along without it but ended up with numerous workarounds including
> >Template Haskell, quasiquoting, Liskell, at least 3 different generics
> >preprocessors.  Overall, the usefulness of such a thing empirically
> >recurs again and again in real-world programming systems.  So it's
> >reasonable to infer that it's probably important.

Much of this is due to the early application or evaluation of the
feature shovel. This was often considered good in a competitive
language, but does have the gaping problem of doing premature
optimization. Which, as we all know, is the root of all eval, or lack
thereof. I must compile a monolith! I must compile in extra Call
instruction opcodes instead of thread! I must misunderstand code
density in massive cache-miss fetch rates! Have I forgotten anything
for my new experimental language Spangled? ? Note the extra query;
it's for making you question whether this is the tool for the job when
you use the language. Of course my current project is DTC with lazy
prefixes until the postfix operator gets applied, so Spangled? will
have to wait a while.

> If you go that way, then a compiler is treating code as data.
>
> Heck, what is a loader? It finds some data in a disk file,
> then puts it in memory, manipulates some MMU registers and
> sets up page lists. Then tell the computer go on at that point:
> it is actually code.
>
> I'm pretty sure this isn't what John meant, but it is insightful
> to stress how essential it is. It is the invention of the
> stored program computer, visavis the fixed programming of the
> weaving machines.

Yes, definitely: data is code is data. Although I often find code is
finalized data.

John Passaniti

Jul 11, 2010, 10:19:21 AM
On Jul 11, 5:17 am, "Rod Pemberton" <do_not_h...@notreplytome.cmm>
wrote:

> Yes, they do.  *ALL* high-level languages reduce to *ASSEMBLY*.  

Again, you probably mean machine language, not assembly. But this
absurd reductionism aside, you keep missing the most important point--
you're looking at the end, not the start. When I'm coding in a
language like Forth or C or Lisp or Ruby or Python or *whatever*, I'm
not programming in assembly. We can certainly talk about the stream
of machine code that is executed when I'm iterating over a collection
in C# or when I'm returning a closure in Lua. But when I'm writing
that code, I'm (usually) completely disconnected from that level. And
that is the point of all programming languages-- even your beloved C.
Say you write this function in C:

int square(int x) { return x*x; }

You're absolutely right that it ultimately gets translated to machine
code. But are you thinking of that machine code when you write that
function?  Are you thinking, "okay, I first have to create a stack frame
so that I can address the argument, fetch the argument, copy the
argument to two registers so the multiply instruction can operate on
it, get the value out and put it in a register to match the ABI's
specification for return values, and return to the caller." No, of
course not. When you write a function in C, you're (usually) not
thinking in terms of the underlying machine code. You're thinking in
terms of C and the abstractions and facilities it gives you. And the
same is true for *every* other language. When I create a set of rules
for Prolog to match, it does indeed ultimately reduce down to machine
code, but that's not what I'm thinking about. I'm thinking in terms
of the abstractions, facilities, and models that the language offers
me.

> > And they certainly don't require the same amount of
> > effort to craft.
>
> True.  Like I said, I think you're really only interested in how much work
> you have to do, not whether or not the language can be used to effectively
> implement a solution.

They are the same thing. The effectiveness of the solution is
directly related to how much work one has to do in order to solve it.
As an example, when I whip out a quick parser using a simple regular
expression, it isn't just that I have done something far faster,
easier, and more expressively than in a language where I don't have
such a facility. It directly goes to the effectiveness of the
solution. And that in turn has real, measurable benefits. As a
software engineer, I'm paid to solve problems. And while I could
certainly toil away and crank out assembly language solutions to
problems (and sometimes do), that would slow down development
dramatically. My first real job was maintaining and extending a huge
set of Z80 assembly language. Major new features would take weeks to
write, test, and debug. When we moved development to C, we reduced
that time to days. And later, when I moved some of the tools we used
to work with our systems to higher-level languages, I could create new
features in hours.
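The kind of quick regex parse mentioned above might look like this in Python (a hypothetical example, not anything from the post):

```python
import re

# A "quick parser": pull key=value pairs out of a config-style
# line with a single regular expression.
line = "baud=115200 parity=none stop=1"
settings = dict(re.findall(r"(\w+)=(\w+)", line))
print(settings)  # {'baud': '115200', 'parity': 'none', 'stop': '1'}
```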

Now obviously, all this ultimately reduced down to machine language.
But the fact we weren't coding in machine language meant that we could
create faster, better, and more effective solutions. And indeed, at
least twice, by not working at the assembly or machine language level,
it meant that when we decided using an 8051 derivative would be a
better solution, virtually none of the code needed to change.

This is pretty basic stuff, Rod. Your goofy reductionism apparently
prevents you from seeing this, but what's funny is that at the same
time you say "it all reduces down to assembly [sic]", you appear to be
advocating for a language-- C-- that isn't assembly. Weird. It's
like you understand that C insulates you from having to think at the
machine level, but if it all reduces down to machine code, then why
bother? Why let C get in the way?

I think if you honestly answer that question, you'll see the value of
other programming languages.

> The techniques used for parsing are implementable in numerous languages.
> The choice of solution has nothing to do with the language.  

As usual, that isn't the point. The point is that you appear to only
know (that is, are competent in) a small set of languages-- and those
languages are all of the imperative, procedural variety. Because of
that, you are limited in your ability to see different solutions to
problems-- it's like the old adage that if all you have is a hammer,
then everything is a nail. And in exactly the same way, in my example
of if someone only knew of recursive descent parsing, then there are
entire grammars that can't be implemented-- or that will require hacks
and compromises in order to implement. Imperative, procedural
languages are your hammer, and they necessarily limit you.

> Ditto.  First, nothing to do with the language, again.  Second, it's much
> more likely someone will "happen upon" an event driven solution or a message
> passing solution - even if they don't know what either is - than what you've
> suggested...  It's even more likely they will given modern OOP languages
> with object-action paradigms.

But gosh, why? If it all "reduces down to assembly [sic]" then this
fancy OOP is just useless window dressing right? It may help solve
some problems faster and easier and better, but it's just the
programmer being lazy, right?

> > You also continue to confuse that because one language can be
> > implemented in another-- such as a Forth or Lisp in C-- that
> > somehow makes it pointless to code in Forth or Lisp, since it
> > all reduces down to what you could do in C.
>
> Isn't it?

No, it isn't. Having a working knowledge of C is certainly useful,
but what you need to add to C in order to solve many problems are
features that other languages natively have. A fine example are
problems that require run-time evaluation of code. Say I want to
write a program that allows you to enter an equation and have it
rendered as a graph. C natively has no ability to take an equation at
run-time and evaluate it. You can certainly find libraries to do
that, but then you're no longer programming in C-- you're programming
in the library. And at that point, you have to ask yourself, "what
other capabilities does C not natively have that I need to extend with
libraries-- why not just use a language that is better suited for the
problem?"

> But, even you must admit, that Lisp is slowly, logically, progressing
> towards the ideology of C:
>
> Lisp ->
> Scheme (lexical scoping, tail recursion) ->
> Clojure (functional) ->
> ???? (procedural) ...
>
> Don't you understand there is a reason for this?

A reason for what? All I see is that you've listed three Lisp-family
languages and some features. I have no idea what "????" represents,
or what this supposedly has to do with Lisp progressing toward
the "ideology of C" (whatever that is). I guess this passes for
thinking in your mind.

> > But that's the thing-- when I'm writing
> > code in Forth or Lisp, I'm operating on a
> > different level than I am in C.  I'm not thinking
> > about the same things I am when I'm writing in
> > C.  And that allows me to solve problems differently.
>
> So, you've confirmed it's a personal issue.  Thank you.  Finally.

It's hardly personal. It's the reason *everyone* uses different
programming languages.

> > But weirdly, you're still here.  Why is that?
>
> I reworked two early Forth interpreters from one assembler to another and
> had intended to post them.  But, I haven't gotten around to checking that
> all branches were converted correctly, so I haven't posted them.  And, I've
> been writing a Forth interpreter in C for a while now.  It derived out of
another project.  Also, I'm interested in the Forth concept of
> "primitives" whereby a language or library is built up from a small subset
> of functionality.  I've posted previously about this topic, including a list
> of counted operations needed to implement Forth, Lisp, C, C libraries,
> Posix, etc.  Why are you here?  You never seem to discuss Forth...  You
> always discuss topics better suited to comp.lang.misc (language design,
> features, parsing, tokenization, syntax, semantics, grammar, etc.).  Why
> aren't you there?

Odd, I discuss Forth all the time. I compare and contrast it to other
languages, talk about both problems I see in Forth and strengths it
has, discuss my own experiences using Forth, talk both objectively and
subjectively about the quality of code posted here. That's why I'm
here-- because I see Forth as a solution to a class of problems,
whereas you see it as just "reducing to assembly [sic]".

John Passaniti

unread,
Jul 11, 2010, 10:28:03 AM7/11/10
to
On Jul 11, 5:36 am, ken...@cix.compulink.co.uk wrote:

> john.passan...@gmail.com (John Passaniti) wrote:
> > Not the point.  The point is that in Lisp, the programmer is free to
> > manipulate the program as data, allowing arbitrarily-complex
> > transformations of the code.
>
>  And this is a good thing? Self modifying code has IIRC been described
> as "devil spawn".

It isn't good or evil. It's a technique, and if it's used
effectively, it can radically simplify problems. If used stupidly, it
can convolute code. If you approach programming as some things are
off limits merely because they can be abused, then you're going to
find you are quite limited indeed.

>  Personally I do not think there was or ever will be a perfect computer
> language but there is a most suitable for practable applications. Of
> course in some cases most suitable may be defined by the availability of
> programmers in which case C is likely to be top.

For there to be a "perfect" computer language, it would have to
coherently encompass all the wildly different ways that people have to
solve problems. Such a language would probably be a complete mess and
nearly unusable. It would also be a language that would be constantly
evolving. It would be a forever-moving target, and few could ever learn
it.

Andrew Haley

unread,
Jul 11, 2010, 10:34:01 AM7/11/10
to
John Passaniti <john.pa...@gmail.com> wrote:
> On Jul 11, 5:17 am, "Rod Pemberton" <do_not_h...@notreplytome.cmm>

> wrote:
>> True. Like I said, I think you're really only interested in how
>> much work you have to do, not whether or not the language can be
>> used to effectively implement a solution.
>
> They are the same thing. The effectiveness of the solution is
> directly related to how much work one has to do in order to solve
> it.

I wondered about that too. If bang for the buck (over the entire
project lifecycle) isn't the measure of a programming language, what
else possibly could be?

Andrew.

Paul Rubin

unread,
Jul 11, 2010, 10:53:54 AM7/11/10
to
Albert van der Horst <alb...@spenarnc.xs4all.nl> writes:
>>I'm not sure what you're getting at. Scheme uses strict evaluation.
>>I'm not sure why it didn't specify the order of evaluation and maybe
>>that was an error. I think ML specifies the order.
>
> For a new language, that is a mistake if ever there was one.
> It precludes the parallel evaluation of arguments to functions
> in traits.

But that means that the semantics are undefined, since the arg
evaluations can have side effects. Evaluating them in parallel makes
things even worse because of the locks and stuff needed. It's
legitimate when the compiler knows that the evaluations are
side-effect-free, either through flow analysis or through a type system.

> Algol68 (yes that was in 68) explicitly specifies that order of
> argument evaluation is undefined, in order not to hinder progress.

That surprises me. I may ask one of the ML guys about it.

Paul Rubin

unread,
Jul 11, 2010, 11:04:10 AM7/11/10
to
ken...@cix.compulink.co.uk writes:
>> The point is that in Lisp, the programmer is free to
>> manipulate the program as data, allowing arbitrarily-complex
>> transformations of the code.
>
> And this is a good thing? Self modifying code has IIRC been described
> as "devil spawn".

Self-modifying code means something different than what John is talking
about. It means modifying the mutable store where the program lives.
Lisp program manipulations are more like a fancy preprocessor, or at
most like staged metaprogramming, something that has been done even
in typed functional languages like MetaOcaml.

jacko

unread,
Jul 11, 2010, 1:03:39 PM7/11/10
to

It has no real impact if the program is written with style. The use of
side-effect functions as arguments is just an attempt at premature
optimization; obviously it makes more sense to perform the side effect
sequentially, after the results for its display etc. have been
calculated. Then the strict-after-speculative-execution semantic
provides the required behaviour, whereas functional argument
evaluation orders cause excessive evaluations when some of them are
not needed, and short-cut operators need a special "fix".

And all because call by value is considered "safe", while call by
substitution of expression "may" mean double evaluation in rare cases
of bad programming, and looks complicated compared to call by
reference and call by value. This is what I mean by lazy evaluation.
Pity most languages don't do pointers to expressions/subroutines. I
see no problem in a function returning a list within which is stored a
value and another function to continue the sequence-- an explicit lazy
sequence, say.

What would be the difference between a subroutine pointer and a
datalist pointer? Probably the LLITERAL.

ok>.

Paul Rubin

unread,
Jul 11, 2010, 1:09:29 PM7/11/10
to
jacko <jacko...@gmail.com> writes:
> It has no real impact if the program is written with style. The use of
> side effect functions as arguments is just an attempt a premature
> optimization, obviously it makes more sense to perform the side effect
> sequentially after the results for its display etc,

Not possible. The value may depend on the side effects being run.

jacko

unread,
Jul 11, 2010, 1:55:43 PM7/11/10
to
On 11 July, 18:09, Paul Rubin <no.em...@nospam.invalid> wrote:

x = F(y) -- do the side effect call
z = G(x) -- then do the function

Paul Rubin

unread,
Jul 11, 2010, 2:00:46 PM7/11/10
to
jacko <jacko...@gmail.com> writes:
> x = F(y) -- do the side effect call
> z = G(x) -- then do the function

Say there's more than one arg with side effects. What order do you do
them in?

Hint: that road leads to Haskell.

jacko

unread,
Jul 11, 2010, 2:01:23 PM7/11/10
to
x F BLIP G

The word BLIP pushes the stack result of x F into the parse stream of
G which is a prefix function.

jacko

unread,
Jul 11, 2010, 2:04:22 PM7/11/10
to

Of course the sequence BLIP BLIP shall push 2 arguments, with the first
blip being the second argument!

jacko

unread,
Jul 11, 2010, 2:28:35 PM7/11/10
to
This only works if BLIP is a prefix function, 'cos as we all know
prefix functions never pass by value.

jacko

unread,
Jul 11, 2010, 2:38:13 PM7/11/10
to
On 11 July, 19:28, jacko <jackokr...@gmail.com> wrote:
> This only works if BLIP is a prefix function, 'cos as we all know
> prefix functions never pass by value.

It turns out BLIP needs an extra throw forward number of elements
argument, for ease of implementation.

jacko

unread,
Jul 11, 2010, 2:53:31 PM7/11/10
to
On 11 July, 19:00, Paul Rubin <no.em...@nospam.invalid> wrote:

Whichever you want, and BLIP them (actually looks like this must be a
postfix word) into the argument list, or as I would call it, the
following elements of the execution code list. Throwing them too far
leads to them being placed in the NIL collector ring. Which may prove
useful.

Harold

unread,
Jul 11, 2010, 8:28:13 PM7/11/10
to
On Jul 11, 2:36 am, ken...@cix.compulink.co.uk wrote:
> Self modifying code has IIRC been described
> as "devil spawn".

I wouldn't go that far. ANY software that creates machine code for
later execution is a self-modifying program. A subroutine-threaded or
direct-threaded FORTH compiler is one such beast.

Coos Haak

unread,
Jul 11, 2010, 9:00:53 PM7/11/10
to
Op Sun, 11 Jul 2010 17:28:13 -0700 (PDT) schreef Harold:

I would not call compilation in Forth self-modifying, just extending the
compiler and interpreter. The threading model does not matter here.

--
Coos

CHForth, 16 bit DOS applications
http://home.hccnet.nl/j.j.haak/forth.html

Jerry Avins

unread,
Jul 11, 2010, 11:02:09 PM7/11/10
to

The Z-80 can do I/O through a port whose address is in a register. For
an 8080 monitor to use a specific port, the monitor needs either a
prohibitively large collection of I/O routines or a way to write the
needed code on the fly. One way was to push the needed code onto the
stack and then execute the stack.

Jerry
--
Engineering is the art of making what you want from things you can get.

Rod Pemberton

unread,
Jul 12, 2010, 6:53:33 AM7/12/10
to
"John Passaniti" <john.pa...@gmail.com> wrote in message
news:54c19bee-f7a7-4843...@d16g2000yqb.googlegroups.com...
>
> ... you keep missing the most important point--

> you're looking at the end, not the start. When I'm coding in a
> language like Forth or C or Lisp or Ruby or Python or *whatever*, I'm
> not programming in assembly. We can certainly talk about the stream
> of machine code that is executed when I'm iterating over a collection
> in C# or when I'm returning a closure in Lua. But when I'm writing
> that code, I'm (usually) completely disconnected from that level. And
> that is the point of all programming languages-- even your beloved C.
>

No, it's not.  The point of programming languages is to be able to
_implement_ solutions, not to provide you with a nice mental abstraction to
help you solve some problem. That's what mathematics, languages, visual
diagrams, etc. do.

Also, in regards to "that level," it's up to the programmer to make sure
that what is being done in assembly (and/or machine code) is correct for the
high-level language. There are many instances where the programmer
*believes* the language is doing one thing, when the compiler is actually
doing something else. If you've talked to any "C99 spec. kiddies" from
comp.lang.c, you know what I mean... They don't have a clue as to how the
code is converted to assembly (and/or machine language) for *any* computing
platform, or whether or not their code is going to do what they intended.
"It's up to the compiler to implement The C99 spec correctly, dude... It's
a contract." IMO, it's your job, as the programmer, to make sure it's done
correctly, right down to the lowest level. Your code is not what is being
executed. The compiler's compiled representation of your code is what is
being executed. That means thinking about what assembly is generated, if
you know, and looking at the (dis)assembly to make sure it is, if you don't.

> Say you write this function in C:
>
> int square(int x) { return x*x; }
>
> You're absolutely right that it ultimately gets translated to machine
> code. But are you thinking of that machine code when you write that
> function?
>

Yes. I do think about what the compiled C code is going to do once
converted to assembly. I don't do so all the time because I've learned that
certain things need to be checked and other stuff doesn't.

> Are you thinking, "okay, I first have to create a stack frame
> so that I can address the argument, fetch the argument, copy the
> argument to two registers so the multiply instruction can operate on
> it, get the value out and put it in a register to match the ABI's
> specification for return values, and return to the caller." No, of
> course not.

Of course not.  But, I am considering any potential issues that could arise
from the way the code is compiled.

A simple example is how different compilers implement x86 instructions for
byte-sized operands. x86 code becomes faster and smaller if you can
generate x86 byte-sized instructions. For some C compilers, it's almost
impossible to get them to emit x86 byte-sized instructions. If there is
anything that hints at an integer promotion, they get promoted to larger
sized integer operations.  This is slower and can cause register shortages
on x86.  You'll have to go through many "hoops" with them: unsigned chars,
auto vars, the register keyword, etc.  Other C compilers don't promote to larger
sizes so quickly or hastily and will produce efficient x86 instruction
sequences using byte-sized operands.

Another simple example is C's for().  While it's well known how to rewrite a
for() in C as a while() and equivalent statements, it's not well known
that the equivalent code using a while(1) and if()-break typically optimizes
far better than the for() with most x86 compilers. If you've never studied
the output (in assembly, or disassembled from machine code) from multiple C
compilers, you might not ever consider how a for() is converted to assembly.
You might just assume that C optimizes a for() loop very efficiently. It's
also likely you'd assume that the C compiler would produce *identical* code
for both since the rewrite is trivial.  But, that is a faulty assumption.

> When you write a function in C, you're (usually) not
> thinking in terms of the underlying machine code. You're thinking in
> terms of C and the abstractions and facilities it gives you. And the
> same is true for *every* other language.
>

Yes, that's how most programmers do things... It's common to ignore the
important stuff. (The resulting machine code, in this case.) And, that's
why we get "bugs".

> When I create a set of rules
> for Prolog to match, it does indeed ultimately reduce down to machine
> code, but that's not what I'm thinking about. I'm thinking in terms
> of the abstractions, facilities, and models that the language offers
> me.
>

Are you saying you've *never* considered what the assembly (and/or machine
code) actually does? What about *after* you've coded something? How do you
*know* the compiler is doing the *correct* thing? I.e., what you intended,
without checking...

I've yet to see *any* C compiler that complies 100% with any of the C
"standards" (C '74, K&R, ANSI C89, ISO C99) etc. They all do some things
incorrectly, have some unimplemented features, and do a few things in an
unexpected manner. They all take shortcuts. They all simplify things.
They all do integer conversions with truncations and/or "implicit casts".
Many C89 and C99 compilers still do some things the old K&R way. IMO, C
compilers are usually better than other languages in this regard. But,
they're far from perfect.

> > > And they certainly don't require the same amount of
> > > effort to craft.
> >
> > True. Like I said, I think you're really only interested in how much
> > work you have to do, not whether or not the language can be
> > used to effectively implement a solution.
>
> They are the same thing. The effectiveness of the solution is
> directly related to how much work one has to do in order to solve it.
>

Not true, and unrelated to what I said... I didn't mention the
"effectiveness of the solution". I mentioned "effectively implement[ing] a
solution". My statement regards the effectiveness of the process necessary
to arrive at a solution, while yours regards how effective an implementation
of the solution is.

> ... but what's funny is that at the same


> time you say "it all reduces down to assembly [sic]", you appear to be
> advocating for a language-- C-- that isn't assembly. Weird.

Assembly isn't portable. It's microprocessor specific.

> It's like you understand that C insulates you from having to
> think at the machine level,

C captures the minimal essence of structured programming. It does so in a
simple, clean, compact, fairly portable, etc. way. It uses a model that
fits well on microprocessors since '74. Once you have that, what more do
you need?

> but if it all reduces down to machine code, then why
> bother?
>

If an assembler supported variables, as C implements them, I'd consider it.
But, they don't. In C, the compiler keeps track of where the variable is at
any point in time: memory, register, stack. It can relocate it as needed
for speed or size.  In assembly (and Forth...), one must do this
oneself.  There is no named variable.  In the end, I still wouldn't use
assembly.  Why?  Assembly is not portable...  Now, if one used a set of
assembly macros, instead of assembly, and those macros worked on a wide
range of assemblers, then you'd have something similar enough to a C subset
that it'd be functionally the same. For assembly to be used on another
platform, it has to be ported to another assembly or converted to another
language. If it's hardware specific code, it's not an issue - assembly is
fine. But, the vast majority of code is not hardware specific. The code is
for user applications. So, all you need is something which "captures the
minimal essence of structured programming". C does that very effectively.

> Why let C get in the way?
>

Why get "locked in" to hardware specific, awkward, limited, or "dead"
languages? Isn't a simple, generic, programming language a better choice
for most situations?

> > The techniques used for parsing are implementable in numerous languages.
> > The choice of solution has nothing to do with the language.
>
> As usual, that isn't the point. The point is that you appear to only
> know (that is, are competent in) a small set of languages-- and those
> languages are all of the imperative, procedural variety.
>

Fourteen or so is a small set of languages? ... Yes, many are of a
"procedural variety". But, so too are the vast majority of the languages
that have been used since the late '70's... Don't you think there is a
reason for this? E.g., fits well with the structured minds of people who
are adept at programming?

> Because of
> that, you are limited in your ability to see different solutions to
> problems-- it's like the old adage that if all you have is a hammer,
> then everything is a nail.

An illogical segue derived from a (repeated) faulty assumption...

> ... fancy OOP is just useless window dressing right?
>

Yes, it is, or at least some of it is. I haven't done enough OOP to
classify all of it as "window dressing".  I don't recall what was in
Stroustrup's book anymore, but I know that everything in it was.  You can
implement an object-action paradigm in C with code that looks
identical to C++. Does it do the same thing? Yes. Does it work the same
way? No. Can you implement all of C++'s OOP features in C? Yes, that's how
C++ started. Will it look like "typical" C? No. Is it C? Yes.

> Having a working knowledge of C is certainly useful,
> but what you need to add to C in order to solve many problems are
> features that other languages natively have.

This is like arguing that because it's difficult to solve some problem in
linear algebra, you should therefore learn calculus and then Laplace
transforms so that obtaining a solution is easier. You can already solve
the problem using linear algebra. Once it's solved, you know what the
solution is. Use it. It may be convenient to use the "faster" method using
Laplace transforms, if you're solving many different problems. But, it's
not a necessity, nor does it disadvantage you any to solve the problem the
"long way". You've still got an effective method leading to a solution, and
you've found the solution.

> ... because I see Forth as a solution to a class of problems, where


> as you see it as just "reducing to assembly [sic]".
>

Actually, for interpretative Forth (e.g., DTC or ITC), typically there is
very little assembly. Most (85-97%) of an interpretative Forth is a set of
addresses which captures the control-flow of the program. The interpreter,
"primitives", and "cfa-routines" (docon, dovar, etc.) are in assembly. For
a compiled Forth or Forth using STC, then, yes, it just "reduces to machine
code".


Rod Pemberton

jacko

unread,
Jul 12, 2010, 9:46:46 AM7/12/10
to
So is LISP left or right associative w.r.t. argument evaluation?

Is there a MicroScheme for those who can live without macros
(unevaluated argument passing in all functions and possibly with ((CDR
(CAR argv)) being the current function name), and can make their own
special forms by using eval when needed? And does it use a
reference-counting GC, to avoid dirty paging flush chains and larger list node
objects? I've been informed by master 'googly' internet that pre-
evaluation of arguments is for the purpose of constant reduction and
type subgroup limiting in function compilation, so not really that
relevant on the 8K ROM power efficient scene.

Note that unevaluated does not mean variables are not substituted for
value, 'cos there's always a pointer to the value, stored in the program
like (VAR value), i.e. always bound, even with changes. In this way
symbols and strings get combined as a single type.

Cheers Jacko
