He contributed to Scheme, Lisp, and Java, and did the first port of TeX.
--
Java has too few parentheses.
I've never understood this. How could anyone who knew Scheme and
Lisp and even worked on Scheme standardization produce one of the
WORST languages in the whole wide world?
How much did they pay him to forget (?) what he knew?
--
No man is good enough to govern another man without that other's
consent. -- Abraham Lincoln
Where can I read that interview on the net? I suppose Guy Steele is
not charging anyone for reading his words?
> israel wrote:
> > The current DDJ ( at least the latest that our antipodean
> > newsstands have ) has an interesting interview with Guy Steele.
> >
> > He contributed to Scheme, Lisp , Java and did the first port of Tex.
>
> I've never understood this. How could anyone who knew Scheme and
> Lisp and even worked on Scheme standardization produce one of the
> WORST languages in the whole wide world?
>
> How much did they pay him to forget (?) what he knew?
Guy also worked on Fortran when he was at Thinking Machines Corporation.
He's not a language bigot.
In the case of Java, it's always seemed like it started with some good
ideas, but as it grew it went off-track. The original intent seems to
have been to adopt some ideas from the Lisp world into a C-like
framework.
--
Barry Margolin, bar...@alum.mit.edu
Arlington, MA
*** PLEASE post questions in newsgroups, not directly to me ***
I may have to write programs in a language that has this description of the
int type:
"For INT simple variables, a one-byte initialization behaves differently
from a one-byte assignment as follows:
* If you _initialize_ an INT variable with a one-byte character string, the
compiler allocates the character in the _left_ byte of the word. The right
byte is undefined.
* If you _assign_ a one-byte character string to the variable, at run-time
the system places it in the _right_ byte and sets the left byte to zero. If
you want the character to be placed in the left byte, assign a two-byte
character string that consists of a character and a blank space."
No kidding.
I often wondered about this myself.
However, the statement above is not really strong enough. He did work on
Scheme standardization, but only after he and Sussman had invented the
language! He did work on the standardization of CL, though.
As for Java, as far as I understand it, he got into the game when Java was
already developed. His influence can probably be seen in generics and
things like that. (Anyway, I shouldn't say too much about Java, because I
actually don't know much -- I've read just one Java book.)
-- Hrvoje
I like the following part of this article the most:
****************************
A couple of things happened to Steele after the specification was complete.
One was that he got a phone call from a gruff-voiced man at a small firm
with half a dozen programmers.
"You the guy who wrote the Java spec?"
"I co-wrote it, yeah."
"Well, I just had to call to thank you. Since we switched to Java,
productivity at our firm has quadrupled, just because Java is catching
our mistakes."
****************************
:-)
André
--
He talks a little about his involvement in Java at the last panel at
http://www.ai.mit.edu/projects/dynlangs/wizards-panels.html
Pascal
--
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
>israel wrote:
>> The current DDJ ( at least the latest that our antipodean
>> newsstands have ) has an interesting interview with Guy Steele.
>>
>> He contributed to Scheme, Lisp , Java and did the first port of Tex.
>
>I've never understood this. How could anyone who knew Scheme and
>Lisp and even worked on Scheme standardization produce one of the
>WORST languages in the whole wide world?
What is wrong with Java?... Please be specific.
A.L.
> israel wrote:
>> The current DDJ ( at least the latest that our antipodean
>> newsstands have ) has an interesting interview with Guy Steele.
>> He contributed to Scheme, Lisp , Java and did the first port of Tex.
>
> I've never understood this. How could anyone who knew Scheme and Lisp
> and even worked on Scheme standardization produce one of the WORST
> languages in the whole wide world?
If you think that Java is the worst language, you haven't seen many
languages.
> How much did they pay him to forget (?) what he knew?
Your comment reflects extremely poorly on you, not on Guy.
Matthias
PS: No, and I am absolutely NO fan of Java!
It cleverly pointed out one difficulty with Scheme.
My now vague recollection from the keynote itself is that a
lispy/schemy syntax wouldn't fly, ergo we got {}, the distinction
between Integer and int was for speed (remember, systems were slower
in '95), and that Java had grown way more than anticipated.
cheers
bruce
--
Reality is 80m polygons - Alvy Ray Smith
Bruce O'Neel - edwin...@siggraph.org
http://edoneel.chaosnet.org
Everything that's not like in Smalltalk. ;)
> He gave a nice Keynote at OOPSLA '98 (it's easy to remember
> conferences when you don't go to many :-( ) where the whole keynote
> assumed that you knew the definition of all English words 4 letters or
> less and he could only use known words. New words could be defined
> only from existing words. It was quite cute. He declined to
> follow these rules for the question and answer session afterwards
> though. That said, some of the questions did follow this rule.
>
> It cleverly pointed out one difficulty with Scheme.
>
> My now vague recollection from the keynote itself is that a
> lispy/schemy syntax wouldn't fly, ergo we got {}, the distinction
> between Integer and int was for speed (remember, systems were slower
> in '95), and that Java had grown way more than anticipated.
Here's the PDF:
http://homepages.inf.ed.ac.uk/wadler/steele-oopsla98.pdf
Regards,
Chris
Well, all that only makes me want to ask my question even more strongly...
> http://www.sun.com/presents/minds/2005-0302/
Thanks, insightful article. Though the opening sentence is quite
funny: "are taking on an enormous challenge -- create a
programming language better than Java."
Woooo!
Anyway, the good thing is that you can choose what language to use
(unless your boss tells you otherwise).
Thanks for that link. I remember watching the first one last
year, but I must have lost the link...
I still don't get the int/Integer thing. What prevents the
compiler from using the 32 bits for the integer value itself instead
of for an Integer pointer (so it could be stored directly in Lists)?
What prevents it from making this optimization transparent, so that
every Integer.bla() method just uses the direct value, not the pointer?
In '95, machines didn't have much memory or speed, yet Java
wasted tons of resources on runtime checks (getting an element out
of a list -> typecast + check) and space on integers.
Does any functional language BOX 32bit integer values?
After having seen Smalltalk, OCaml and Python (even though I never
really learnt/used them), I thought, "this is what Java *should*
have been like", the irony being that Smalltalk is so much older
than Java.
What's wrong? The language sucks, with lots of arbitrary
restrictions that make my life hard (some of them the same as in C,
even though everybody says Java is so much better and
higher-level; guess what: when I code C, my problem usually
*never* lies with memory management).
Integer != int; Iterators are a PITA; the syntax is
braindead, not extensible, and far too verbose (public
static...). It blows that I'm forced to put *everything* in
classes, even static methods which should really be part of
*modules* instead.
The JVM is really bad (kind of like what I came up with after I
studied the basics of Forth and thought about what a nice VM could
be like; after a little while I had something resembling the JVM,
but of course I rejected it). It does dynamic checks all over the
place, but even that doesn't make it safe (there was something,
was it by Phil Wadler?, showing that there are ways to
circumvent the JVM security). Even the new versions of the JVM
that BREAK compatibility and introduce generics at the language
level don't bother to include generics in the VM, so that the
typecasts could be eliminated (from what I heard). The VM also
has no support for tail recursion (which I would like to have).
It's unnatural that any method like stringToInt should be part of
either the String or Integer class, or even both! Why not have a
module with *one* method?
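For what it's worth, the scattering being complained about is easy to demonstrate with real library calls; a minimal sketch (the class name ConvDemo is made up, the methods are the actual API):

```java
// The scattering complained about above: string<->int conversion is
// spread over several classes instead of living in one module.
// ConvDemo is a made-up name; the library calls are real.
public class ConvDemo {
    public static void main(String[] args) {
        int n = Integer.parseInt("42");    // String -> int, on Integer
        String s1 = Integer.toString(42);  // int -> String, on Integer
        String s2 = String.valueOf(42);    // int -> String, again, on String
        System.out.println(n + " " + s1 + " " + s2);  // prints 42 42 42
    }
}
```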
The lack of first class functions means I have to use bloody crap
like "strategy patterns", and wrap these methods in interfaces and
classes.
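A sketch of what that wrapping looks like in practice (all names here are made up; this is the pre-lambda, anonymous-inner-class idiom):

```java
// The "strategy pattern" workaround described above: with no first-class
// functions, passing behavior means defining a one-method interface and
// instantiating an (anonymous) class implementing it.
import java.util.Arrays;

public class StrategyDemo {
    interface IntFunction {          // hypothetical function-as-interface
        int apply(int x);
    }

    // Apply f to every element -- the function argument travels as an object.
    static int[] map(int[] xs, IntFunction f) {
        int[] out = new int[xs.length];
        for (int i = 0; i < xs.length; i++) {
            out[i] = f.apply(xs[i]);
        }
        return out;
    }

    public static void main(String[] args) {
        int[] doubled = map(new int[] {1, 2, 3}, new IntFunction() {
            public int apply(int x) { return 2 * x; }  // the "strategy"
        });
        System.out.println(Arrays.toString(doubled));  // prints [2, 4, 6]
    }
}
```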
Java isn't even portable. While Scheme, Smalltalk, Lisp, ML and
others run on tons of platforms, Java until recently didn't even
run on the BSDs. For exotic platforms (non x86,PPC,ARM,SPARC) you
probably have to ask a third party for a commercial VM.
Java is full of implicit null pointers (in contrast to the option
or maybe type in FPLs), so your program can fail any minute
without you knowing it. NP-exceptions are fun, right? No,
seriously, if you do static checking, please do it right!
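A minimal sketch of the point (names made up): null inhabits every reference type, so the type checker happily passes code that dies at run time.

```java
// null is a member of every reference type, so a statically well-typed
// program can still fail at run time with a NullPointerException.
public class NullDemo {
    static int length(String s) {
        return s.length();   // nothing in the type says s may be null
    }

    public static void main(String[] args) {
        String s = null;     // type-checks fine
        try {
            System.out.println(length(s));
        } catch (NullPointerException e) {
            System.out.println("NullPointerException at run time");
        }
    }
}
```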
Primitive types also add more fun to Java, because now every
PrintStream or stuff like that (btw why are there Streams and
Writers? What's the difference anyway?) has a print(float)
print(int) print(Object) print(char) ... method. Beautiful!
Even more beautiful is the combination of runtime and the
language's libraries. Exceptions are part of the runtime, but the
subclass "RuntimeException" doesn't have to be caught, unlike all
others subclasses of Exception. (yes, I know, this is enforced by
the compiler, not the runtime)
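A hedged sketch of that distinction (class and method names made up): Exception is checked, so the compiler forces callers to handle or declare it, while its subclass RuntimeException is not.

```java
// Checked vs. unchecked: Exception must be caught or declared;
// RuntimeException, although a subclass of Exception, needs neither.
public class ExceptionDemo {
    static void checkedThrower() throws Exception {
        throw new Exception("checked");          // must be caught or declared
    }

    static void uncheckedThrower() {
        throw new RuntimeException("unchecked"); // no catch/declare required
    }

    public static void main(String[] args) {
        try {
            checkedThrower();   // omitting this try/catch would not compile
        } catch (Exception e) {
            System.out.println("caught " + e.getMessage());
        }
        try {
            uncheckedThrower(); // compiles without any handler; fails at run time
        } catch (RuntimeException e) {
            System.out.println("caught " + e.getMessage());
        }
    }
}
```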
For threads you create an object (instead of just having a
startThread method that you pass a function). To have the thread
run its run() method you invoke start(), cool huh? And this
start() method probably does some sweet low-level meddling with
the VM. Which leads me to the InterruptedException. It has
something to do with sleep(), but I don't know what, and nobody can
tell me. Why can sleep be interrupted? Don't all threads run in
mutual isolation? The freaking CPU should just avoid scheduling
the thread for a while, no need to throw stupid interrupt
exceptions (and it never fired on me anyway).
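For concreteness, here is a minimal sketch of the API in question (nothing beyond java.lang): start() asks the VM to schedule the thread, which then invokes run(), and sleep() throws InterruptedException when another thread calls interrupt() on the sleeper.

```java
// start() schedules the thread and the VM calls run(); calling run()
// directly would just execute it in the current thread. interrupt()
// is what makes a blocked sleep() throw InterruptedException.
public class ThreadDemo {
    public static void main(String[] args) throws InterruptedException {
        Thread t = new Thread() {
            public void run() {
                try {
                    Thread.sleep(10000);        // ten seconds...
                    System.out.println("slept the full time");
                } catch (InterruptedException e) {
                    System.out.println("woken early by interrupt");
                }
            }
        };
        t.start();          // schedule it; t.run() would not start a thread
        Thread.sleep(100);  // give it time to reach its sleep()
        t.interrupt();      // this is what fires the InterruptedException
        t.join();
    }
}
```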
That's all I can think of right now, but I'm sure there's far more
to it.
Basically they botched up almost everything they could.
Hm, maybe Fortran, Cobol. Maybe stuff like Brainf*** and
Unlambda, which aren't serious. What other abominations are out
there?
I know of at least twenty languages that are FAR better than Java,
IMHO.
>>How much did they pay him to forget (?) what he knew?
>
>
> Your comment reflects extremely poorly on you, not on Guy.
I don't mean that personally, or as an attack. It just seems that
whatever he knows and did isn't reflected in Java at all. But in
another post someone mentioned that he probably only arrived at
Sun when it was too late. Oak had already been created. (I believe
Gosling is to "blame".)
> Matthias
>
> PS: No, and I am absolutely NO fan of Java!
If you want to defend Java, please do. Almost every single
language I know lacks the braindead horridness of Java, so I
think it could have been done differently, without any negative
consequences.
> What's wrong? The language sucks with lots of arbitrary restrictions
Forgot one of the more important ones that regularly annoy me: no
multiple return values.
I end up creating a Tuple or sometimes Triple class that wraps two
Objects, which later have to be extracted manually, type-cast on
extraction, and garbage-collected (instead of the values just
being returned in two registers), etc.
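A sketch of that workaround (class and method names are made up; pre-generics style, so it's Objects in and casts out):

```java
// A heap-allocated wrapper just to return two values, with manual
// extraction and casts on the way out.
public class Pair {
    final Object first, second;

    Pair(Object first, Object second) {
        this.first = first;
        this.second = second;
    }

    // Example: return quotient and remainder in one call.
    static Pair divMod(int a, int b) {
        return new Pair(new Integer(a / b), new Integer(a % b));
    }

    public static void main(String[] args) {
        Pair p = divMod(17, 5);
        int q = ((Integer) p.first).intValue();   // manual extraction + cast
        int r = ((Integer) p.second).intValue();
        System.out.println(q + " " + r);          // prints 3 2
    }
}
```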
You get the idea. Everything that wasn't deemed necessary for
coding by the Java popes simply isn't in there!
> The JVM is really bad (kindof like what I came up with after I studied
> the basics of Forth and thought about what a nice VM could be like;
> after a little while I had something resembling the JVM, but of course I
> rejected it). It does dynamic checks all over the place, but even that
> doesn't make it safe (there was something, was it by Phil Wadler?, that
> there are some possibilities to circumvent the JVM security).
All the modern JVMs eliminate those checks during JIT when they are
provably unnecessary. How about security on Lisps? Most provide the
same wide-open security model as any other language.
> Even the
> new versions of the JVM that BREAK compatibility and introduce generics
> at the language level don't bother to include generics into the VM, so
> that the typecasts could be eliminated (from what I heard). The VM also
> has no support for tail recursion (which I like).
You seem to have it backwards. The reason generics aren't integrated
into the VM is explicitly for backwards compatibility with older VMs.
The only compatibility that is broken is source compatibility, which is
obvious. I agree about tail recursion. Sun's lame excuse is that their
security model relies on maintaining the call trace.
>
> It's unnatural that any method like stringToInt should be part of either
> the String or Integer class, or even both! Why not have a module with
> *one* method?
So that you can add new types without having to modify that one
"module". If I added a Complex type, I would need to go and modify the
Conversions class or whatever you'd want to call it.
>
> The lack of first class functions means I have to use bloody crap like
> "strategy patterns", and wrap these methods in interfaces and classes.
>
Agreed.
> Java isn't even portable. While Scheme, Smalltalk, Lisp, ML and others
> run on tons of platforms, Java until recently didn't even run on the
> BSDs. For exotic platforms (non x86,PPC,ARM,SPARC) you probably have to
> ask a third party for a commercial VM.
Scheme, Smalltalk, Lisp and ML implementations collectively run on many
platforms, but few implementations are as portable as the JVM.
> Java is full of implicit null pointers (in contrast to the option or
> maybe type in FPLs), so your program can fail any minute without you
> knowing it. NP-exceptions are fun, right? No, seriously, if you do
> static checking, please do it right!
Agreed.
> Primitive types also add more fun to Java, because now every PrintStream
> or stuff like that (btw why are there Streams and Writers? What's the
> difference anyway?) has a print(float) print(int) print(Object)
> print(char) ... method. Beautiful!
Yes, this is an area where Smalltalk got things right. Again, Sun's
excuse was that they wanted to trade performance for uniformity, having
primitive types to make JIT compilation easier. Shortsighted in my opinion.
>
> Even more beautiful is the combination of runtime and the language's
> libraries. Exceptions are part of the runtime, but the subclass
> "RuntimeException" doesn't have to be caught, unlike all others
> subclasses of Exception. (yes, I know, this is enforced by the
> compiler, not the runtime)
>
That actually turns out to be very convenient. When to create checked
versus unchecked exceptions is delicate of course.
> For threads you create an object (instead of just having a startThread
> method that you pass a function). To have the thread run Thread.RUN()
> you invoke Thread.START(), cool huh? And this start() methods probably
> does some sweet low-level meddling with the VM. Which leads me to the
> InterruptedException.
You're arguing against object orientation in general here. It doesn't
really apply to Java specifically.
> It has something to do with sleep(), but I don't
> know what and nobody can tell me. Why can sleep be interrupted? Don't
> all threads run in mutual isolation? The freaking CPU should just avoid
> scheduling the thread for a while, no need to give stupid interrupt
> exceptions (and it never fired on me anyway).
You're obviously quite confused about this one. Sleep does exactly what
you think if no one interrupts. Interruption is a *very* important
feature that allows inter-thread communication without complex
pre-agreed-upon condition variables. Now, for most uses you should use
condition variables, and you'll be just fine.
>
> That's all I can think of right now, but I'm sure there's far more to it.
>
> Basically they botched up almost everything they could.
>
I think you're more emotionally than logically involved on this one.
Far be it from me to defend Java, but it's an order of magnitude
improvement over C++, the language it most intends to replace. Its
biggest strength is its ability to facilitate managed team software
development. That sounds like a buzzword, but when companies have to
hire several programmers with varying skill levels, it's a godsend in
productivity. Scheme and Lisp are better general-purpose languages by
far, but the fact of the world is that people are trained in the
imperative style, and it's much easier to hire or train a Java developer
for a large project than a Lisp programmer. Economics rule.
Scott
The 1.4 and 1.5 Runtimes aren't fully compatible with older ones.
Lots of programs stop working and have to be ported every time,
some due to API changes, some due to binary incompatibility.
>> It's unnatural that any method like stringToInt should be part of
>> either the String or Integer class, or even both! Why not have a
>> module with *one* method?
>
>
> So that you can add new types without having to modify that one
> "module". If I add a Complex type, I would need to go and modify the
> Conversions class or whatever you'd want to call it.
Why? Create a Complex module, put the type and functions in it.
Most languages work like that!
>> Java isn't even portable. While Scheme, Smalltalk, Lisp, ML and
>> others run on tons of platforms, Java until recently didn't even run
>> on the BSDs. For exotic platforms (non x86,PPC,ARM,SPARC) you
>> probably have to ask a third party for a commercial VM.
>
>
> Scheme, Smalltalk, Lisp and ML implementations collectively run on many
> platforms, but few implementations are as portable as the JVM.
What CPUs does the JVM even support? What about MIPS, HPPA? What
about slightly alternative systems?
Scheme48, SML/NJ, MLton, OCaml, PLT Scheme(?), CLISP, CMUCL(?),
and Squeak are all very portable. Most of them are even much faster
than Java on function calls. Sure, the workaround (as usual in
typical Java code) is to create huge methods covering several tens
of lines, often hundreds.
>> Primitive types also add more fun to Java, because now every
>> PrintStream or stuff like that (btw why are there Streams and
>> Writers? What's the difference anyway?) has a print(float) print(int)
>> print(Object) print(char) ... method. Beautiful!
>
>
> Yes, this is an area where Smalltalk got things right. Again, Sun's
> excuse was that they wanted to trade performance for uniformity, having
> primitive types to make JIT compilation easier. Shortsighted in my
> opinion.
They could just have the compiler transparently convert a print on
a Character into a conversion from Char->String, for instance.
And that's just off the top of my head.
>> For threads you create an object (instead of just having a startThread
>> method that you pass a function). To have the thread run Thread.RUN()
>> you invoke Thread.START(), cool huh? And this start() methods
>> probably does some sweet low-level meddling with the VM. Which leads
>> me to the InterruptedException.
>
> You're arguing against object orientation in general here. It doesn't
> really apply to Java specifically.
True, in lots of respects (fascistoidly enforced) OO sucks.
>> It has something to do with sleep(), but I don't
>> know what and nobody can tell me. Why can sleep be interrupted?
>> Don't all threads run in mutual isolation? The freaking CPU should
>> just avoid scheduling the thread for a while, no need to give stupid
>> interrupt exceptions (and it never fired on me anyway).
>
> You're obviously quite confused about this one. Sleep does exactly what
> you think if noone interrupts. Interruption is *very* important feature
> that allows inter-thread communication without complex pre-agreed upon
> condition variables. Now, for most uses you should use condition
> variables, and you'll be just fine.
Why? My thread, just like a forked Unix process, runs
independently of all other threads. It *can't* be interrupted,
to my knowledge. The only thing in Unix is that it can receive a
signal, or be blocked in a write() or a read(), which gets resolved
after the system call returns. In Java it's all shared memory;
there's no need to interfere with other threads at all, unless you
enter a monitor.
>>
>> That's all I can think of right now, but I'm sure there's far more to it.
>>
>> Basically they botched up almost everything they could.
>>
> I think you're more emmotionally than logically involved on this one.
> Far be it from me to defend Java, but its an order of magnitude
> improvement over C++, the language it most intends to replace. Its
I recently studied C++, and it seems like an improved C at first.
After a while you notice the bloat, the horridness of templates, etc.
Still, I much prefer coding (plain) C to Java, also because I have
better native library access (Unix libs, terminal handling if I
want it, which doesn't exist in Java) and much better startup time
(instant instead of several seconds on my 800MHz machine; lots
of that is disk IO, though).
> biggest strength its ability to faciliate managed team software
> development. That sounds like a buzzword, but when companies have to
> hire several programmers with varying skill levels, its a godsend in
> productivity. Scheme and Lisp are better general purpose languages by
> far, but the fact of the world is that people are trained on the
> imperative style, and its much easier to hire or train a Java developer
> for a large project than a Lisp programmer. Economics rule.
It's a buzzword, but you're right. But then, if Lisp or another
language had a powerful module system, again, Java would so lose ;)
(disclaimer, I don't know anything about asdf and the like yet)
>> I've never understood this. How could anyone who knew Scheme and Lisp
>> and even worked on Scheme standardization produce one of the WORST
>> languages in the whole wide world?
>
>If you think that Java is the worst language, you haven't seen many
>languages.
Good for him. I wish Java had been the worst language I've had to deal
with.
However, I really don't understand how on Earth Java managed to rise
with alternatives like Smalltalk or CL available.
> However, I really don't understand how on Earth Java managed to rise
> having alternatives as Smalltalk or CL.
Sun has good marketing, and the syntax and the very dynamic concepts of
Smalltalk and CL look too weird for someone who knows C or Basic and
doesn't want to learn something new.
--
Frank Buß, f...@frank-buss.de
http://www.frank-buss.de, http://www.it4-systems.de
>Ulrich Hobelmann <u.hob...@web.de> writes:
>
>> israel wrote:
>>> The current DDJ ( at least the latest that our antipodean
>>> newsstands have ) has an interesting interview with Guy Steele.
>>> He contributed to Scheme, Lisp , Java and did the first port of Tex.
>>
>> I've never understood this. How could anyone who knew Scheme and Lisp
>> and even worked on Scheme standardization produce one of the WORST
>> languages in the whole wide world?
>
>If you think that Java is the worst language, you haven't seen many
>languages.
Such as C++...
A.L.
>Matthias Blume wrote:
>> If you think that Java is the worst language, you haven't seen many
>> languages.
>
>Hm, maybe Fortran, Cobol. Maybe stuff like Brainf*** and
>Unlambda, which aren't serious. What other abominations are out
>there?
>
>I know of at least twenty languages that are FAR better than Java,
>IMHO.
With respect to what criteria?...
A.L.
> I still don't get the integer/Integer thing. What prevents the
> compiler to use the 32bits for an integer instead for in Int pointer
> (to store directly in Lists)?
Because the Integer pointer might be null, you cannot encode every
possible value in 32 bits.
Unlike the numbers and characters of CL, Java's Integer objects
have a reliable identity that can be compared: new Integer(0) !=
new Integer(0). So you'd still have to store the pointer along
with the value.
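That identity point is easy to check directly (BoxDemo is a made-up name; the behavior shown is standard Java):

```java
// Two boxed zeros are equal in value but distinct as objects, so an
// Integer can't simply be flattened into its 32 bits without losing
// its identity.
public class BoxDemo {
    public static void main(String[] args) {
        Integer a = new Integer(0);
        Integer b = new Integer(0);
        System.out.println(a == b);       // false: reference comparison
        System.out.println(a.equals(b));  // true: value comparison
    }
}
```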
> Matthias Blume wrote:
>> If you think that Java is the worst language, you haven't seen many
>> languages.
>
> Hm, maybe Fortran, Cobol. Maybe stuff like Brainf*** and Unlambda,
> which aren't serious. What other abominations are out there?
I once heard of languages called "C" and "C++", if that rings a bell...
> I know of at least twenty languages that are FAR better than Java,
> IMHO.
Well, you'd better be humble with your opinion. Most languages are
incomparable, as most languages have certain aspects that are handled
better than in Java, and most have certain aspects that are handled
worse.
>>>How much did they pay him to forget (?) what he knew?
>> Your comment reflects extremely poorly on you, not on Guy.
>
> I don't mean that personally, or as an attack. It just seems that
> whatever he knows and did isn't reflected in Java, at all.
Are you sure? How much do _you_ know? Or, in other words, who are
you to make this kind of judgment? I find your attitude extremely
preposterous.
> If you want to defend Java, please do. Almost every single language I
> know doesn't have the braindead horridness of Java to it,
Can you explain what you mean by "brain-dead horridness"?
Almost every language I know has at least one aspect that is at least
as brain-dead as, if not more so than, it is in Java. Of course, there
are several languages for which this is the exception, i.e., they are
almost uniformly better than Java in most regards. But your blanket
statement above sounds uninformed at best.
[about Guy Steele's involvement in Java]
> I don't mean that personally, or as an attack. It just seems that
> whatever he knows and did isn't reflected in Java, at all. But in
> another post someone mentioned that he probably only arrived at Sun,
> when it was too late. Oak had been created. (I believe Gosling is to
> "blame")
From what can be reconstructed from public statements by both Steele
and Gosling, the following happened:
- Sun was itself surprised by Java's very early success. [1] In order to
take advantage of its momentum, they started to rush things.
- Guy Steele was involved in writing down the specification of the Java
language because he has a track record of writing excellent
specifications. He has stated that he tried to fix as many important
conceptual problems as possible in a short amount of time. In other
words, things could have been worse.
- It's especially interesting to read the Oak specification at
http://today.java.net/jag/old/green/ (Oak was the original name for
Java). There are lots of comments in that spec about things that needed
to be changed, but they were not changed in Java and are still like that.
- Most of the things that are kept as is are kept because of
compatibility (whatever that means in the Java world ;).
- One interesting tidbit by Guy Steele is that he stated that they had a
full working implementation of inner classes that close not only over
the final variables of the surrounding environment, but over any
variables. This was voted down due to "efficiency" issues - variable
bindings had to be allocated on the heap instead of the stack. Closing
over final variables doesn't impose an implicit heap allocation. (Don't
make fun of this! Such compromises always happen when more than one
person is involved in specifying a language. The really bad thing about
Java is that there is no easy way to fix such deficiencies at the user
level.)
Pascal
[1] BTW, it was only then that they poured a lot of money into Java.
Java's success has to be measured against the fact that it seemed like
C++ had won and Smalltalk had lost. For many, Java was a compromise they
could live with, especially because it was not a too bad language back
then, and some of the restrictions weren't that obvious at first. Please
also recall that Java didn't even have inner classes at first, so it
looked like a good basis to start from.
> Scheme and Lisp are better general purpose languages by
> far, but the fact of the world is that people are trained on the
> imperative style
Imperative style is easy to do in Scheme, and I haven't seen anything in
CL that would make it hard there either.
> and its much easier to hire
You will find more resumes with "Java" on them, yes. The question is
how many of these resumes describe people who can contribute good code.
> or train a Java
> developer for a large project than a Lisp programmer.
I would like to see this claim backed up. I really don't think most
people are going to require a lot of time transitioning from f(a, b) to
(f a b).
> Economics rule.
Only in the long run. For short to medium time periods, inertia rules.
>>
>> So that you can add new types without having to modify that one
>> "module". If I add a Complex type, I would need to go and modify the
>> Conversions class or whatever you'd want to call it.
>
>
> Why? Create a Complex module, put the type and functions in it. Most
> languages work like that!
>
Including Java. I don't understand what you're arguing about.
> Scheme48, SML/NJ, MLton, OCaml, PLT Scheme(?), Clisp, CMUCL(?), Squeak
> are all very portable. Most of them are even much faster than Java on
> function calls. Sure, the solution (as usual in typical Java code) is
> to create huge methods covering several tens of lines, often hundreds.
What? Who cares which is faster. That's a property that is usually
irrelevant, and varies considerably from platform to platform in any
language.
>
> They could just have the compiler transparently convert a print on a
> Character into a conversion from Char->String, for instance. And that's
> just off the top of my head.
>
Realise that there are subtle issues here. Characters aren't just numbers.
> Why? My thread, just like a forked Unix process runs independently from
> all other threads. It *can't* be interrupted, to my knowledge. The
> only thing in Unix is that it can receive a signal or write(), which
> would get resolved after the system call, or when reading(),
> respectively. In Java it's all shared memory, no need to interfere with
> other threads at all, unless you enter a monitor.
>
Interrupt is just a communications technique, nothing more. You may not
like it, others do. Unix threads aren't the panacea of threading, btw.
>
> Still I much prefer to code (plain) C to Java, also because I have
> better native library access (Unix libs, terminal handling if I want,
> which doesn't exist in Java) and much better startup time (instantly
> instead of several seconds on my 800MHz machine, lots of that is disk
> IO, though).
Sure, but all of those libraries are platform specific, so you're no
longer portable. The startup time is an issue, but one that will be
addressed.
>
> It's a buzzword, but you're right. But then, if Lisp or another
> language had a powerful module system, again, Java would so lose ;)
>
> (disclaimer, I don't know anything about asdf and the like yet)
I disagree. Lisp isn't losing to Java because of its module system; it's
losing due to the huge momentum behind C-like languages and better marketing.
Scott
Sorry, I should have said C-like languages. Imperative is only part of it.
>
>>and its much easier to hire
>
>
> You will find more resumes with "Java" on them, yes. The question is
> how many of these resumes describe people who can contribute good code.
>
>
I've gone over 100 resumes for a position in our company recently. I
saw exactly one person who knew any functional languages. Out of those
100, every one had C or Java experience, and out of those, five
candidates were qualified. That's still five times as many Java
programmers as Lisp ones. It's a terrible vicious cycle, but it's reality.
>>or train a Java
>>developer for a large project than a Lisp programmer.
>
>
> I would like to see this claim backed up. I really don't think most
> people are going to require a lot of time transitioning from f(a, b) to
> (f a b).
Maybe it's not really true, but it's perceived as true by programming
managers. And that's enough.
>
>
>>Economics rule.
>
>
> Only in the long run. For short to medium time periods, inertia rules.
True.
Scott
I do think that "economically," Java has an enormous advantage. Who are
the decisionmakers, who decide which tool is used? I would say
management. Those decisionmakers would like to apply mass production
principles to software. How do you do that? Make sure your tools can't
cut everything and encourage specialization, so your organization can
scale with many commodity programmers working under "architects."
I think that's a reasonable, idealized model of what this
decisionmaking segment desires. If you were to convene a focus group, I
think managers would voice a desire for control and consistency; the
ability to add more manpower which would lead to at least linear
productivity improvements.
For our part, technocrats generally desire our tech to have the quality
of genius, and are annoyed when these Dilbertian capitalists obstruct
us. We are biased in favor of technological solutions, even when not
economical.
I used to believe one could find good programmers by sifting through
resumes. Then I thought one could at least find them by giving very
thorough interviews. Now I'm not sure what to believe.
Cheers,
Bill.
>> Hm, maybe Fortran, Cobol. Maybe stuff like Brainf*** and Unlambda,
>> which aren't serious. What other abominations are out there?
>
>I once heard of languages called "C" and "C++", if that rings a bell...
C is NOT an abomination. It's a good low-level language.
> Matthias Blume wrote:
>> If you think that Java is the worst language, you haven't seen many
>> languages.
>
> Hm, maybe Fortran, Cobol. Maybe stuff like Brainf*** and
> Unlambda, which aren't serious. What other abominations are out
> there?
>
(Visual) Basic!
--
Christopher
Brain-dead macros.
No tail recursion.
No closures.
Completely broken type system.
Pointless distinction between `statements' and `expressions'.
Bizarre syntax.
Vague semantics.
What is it good for?
I wonder why no one mentioned Perl yet...
That any function that has Complex as a source or result type
needs to be in the Complex class. If I want string->complex, and
this isn't in the complex class distribution, and it isn't in
String, obviously, then what do I do? In another language I can
write a stringToComplex function *anywhere*. In Java I have to
put it into a class, why? And it's static, a very descriptive
keyword for something that's just supposed to be a function, and
doesn't even *want* to be part of a class in the first place.
Doesn't make code very readable.
And for all Java basic types a single conversion module that's
part of the standard java.lang import would have been much nicer.
Now we have Integer.toString() and Integer.parseInt() or stuff
like that, where every function appears twice.
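The complaint above can be sketched in a few lines; all the names here (Complex, Conversions, stringToComplex, and the "re,im" format) are invented for illustration, not taken from any real library:

```java
// A hypothetical Complex type and a free-standing parser. The point being
// illustrated: in Java the function must live in *some* class ("Conversions"
// exists only for that reason) and be marked static, even though it never
// wants to belong to a class at all.
final class Complex {
    final double re, im;
    Complex(double re, double im) { this.re = re; this.im = im; }
}

public class Conversions {
    // Really just a plain function; the enclosing class is bureaucracy.
    static Complex stringToComplex(String s) {
        String[] parts = s.split(",");          // assumes a "re,im" format
        return new Complex(Double.parseDouble(parts[0]),
                           Double.parseDouble(parts[1]));
    }

    public static void main(String[] args) {
        Complex c = stringToComplex("3.0,4.0");
        System.out.println(c.re + " " + c.im);  // prints 3.0 4.0
    }
}
```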
>> Scheme48, SML/NJ, MLton, OCaml, PLT Scheme(?), Clisp, CMUCL(?), Squeak
>> are all very portable. Most of them are even much faster than Java on
>> function calls. Sure, the solution (as usual in typical Java code) is
>> to create huge methods covering several tens of lines, often hundreds.
>
>
> What? Who cares which is faster. Thats a property that is usually
> irrelevant, and varies considerably from platform to platform in any
> language.
But if a language is already portable, speed is nice, too. I
mentioned it, because Squeak (which is probably slower), CLisp and
Scheme48 are written in subsets of their own language, which is
nice, too, and much more maintainable than writing them in C.
When I say portable, Java zealots cry that Java is frigging fast.
To me even the simple Scheme48 seems a lot faster in many cases,
but that's probably because my code isn't monolithic inline code,
just lots of funcalls.
>> They could just have the compiler transparently convert a print on a
>> Character into a conversion from Char->String, for instance. And
>> that's just off the top of my head.
>>
> Realise that there are subtle issues here. Characters aren't just numbers.
But they fit into 32 bits just as well. And the compiler knows the
type, so it can insert the appropriate conversion. Anyway, I
suspect Smalltalk compilers/JITers do just that with chars and
numbers, no need to box everything everywhere.
>> Why? My thread, just like a forked Unix process runs independently
>> from all other threads. It *can't* be interrupted, to my knowledge.
>> The only thing in Unix is that it can receive a signal or write(),
>> which would get resolved after the system call, or when reading(),
>> respectively. In Java it's all shared memory, no need to interfere
>> with other threads at all, unless you enter a monitor.
>>
> Interrupt is just a communications technique, nothing more. You may not
> like it, others do. Unix threads aren't the panacea of threading, btw.
Processes or pthreads + shared mem, I think it doesn't make a
difference. I don't know what interrupts in threading systems (as
a communication thing) are and what they do. Do you have a
reference? Honestly this puzzles me, since I don't think I'm
entirely ignorant about OSes and threading, but I had never heard
of interrupts before.
>> Still I much prefer to code (plain) C to Java, also because I have
>> better native library access (Unix libs, terminal handling if I want,
>> which doesn't exist in Java) and much better startup time (instantly
>> instead of several seconds on my 800MHz machine, lots of that is disk
>> IO, though).
>
>
> Sure, but all of those libraries are platform specific, so you're no
> longer portable. The startup time is an issue, but one that will be
> addressed.
They seem to run on Linux, the BSDs, the Mac, probably Solaris
and HP-UX (I guess, since these libs *are* kind of standard). On
Windows probably only straight POSIX runs, but not the terminal
stuff, but I don't know much about NT's POSIX emulation. X Window
also runs quite portably if you need it to.
>> It's a buzzword, but you're right. But then, if Lisp or another
>> language had a powerful module system, again, Java would so lose ;)
>>
>> (disclaimer, I don't know anything about asdf and the like yet)
>
>
> I disagree. Lisp isn't losing to Java because of its module system; it's
> losing due to the huge momentum behind C-like languages and better marketing.
Well, that's like Mac sales having slower growth than PC
sales: in absolute terms they still increase their numbers. Of course
all the world jumps on standard technology immediately, since
there is a large market for work. That doesn't mean that Lisp is
losing. At least Kenny Tilton often posted in c.l.l that he
notices quite a "surge" in interest.
To me losing means that a language would just waste my time
compared to another. So it's more of a technical sense.
Hey, Perl is nearly as cool for one-liners as APL!
--
Christopher
All those subjective annoyances I mentioned further up in the
thread. All other people I know who know both other $LANGUAGE-X
and Java generally agree that Java sucks.
Your mileage may vary.
Despite all these I prefer it to Java by a good margin.
>> C is NOT an abomination. It's a good low-level language.
>
>Brain-dead macros.
I think this is unavoidable without a Lisp-like syntax.
>No tail recursion.
>No closures.
>Completely broken type system.
>Pointless distinction between `statements' and `expressions'.
>Bizarre syntax.
>Vague semantics.
>
>What is it good for?
Keeping you away from asm. ;-) It's a good language for recoding
bottlenecks.
Actually, I would have included 'lack of gc' at the top of the list,
but it's not a high-level language, and never pretended to be one. I
think this is like bashing a hammer for not being able to cut wood.
It's just a different tool (that has been misused, but that's another
story).
Not nearly as bad, at least plain C. But again, this is my
personal opinion. There is no objective better/worse scale.
>
>>I know of at least twenty languages that are FAR better than Java,
>>IMHO.
>
>
> Well, you'd better be humble with your opinion. Most languages are
> incomparable, as most languages have certain aspects that are handled
> better than in Java, and most have certain aspects that are handled
> worse.
Again, my opinion; and all the people I know who know $LANGUAGE-X
and Java agree that Java loses, for pretty much all languages.
>>>>How much did they pay him to forget (?) what he knew?
>>>
>>>Your comment reflects extremely poorly on you, not on Guy.
>>
>>I don't mean that personally, or as an attack. It just seems that
>>whatever he knows and did isn't reflected in Java, at all.
>
>
> Are you sure? How much do _you_ know? Or, in other words, who are
> you to make this kind of judgment? I find your attitude extremely
> preposterous.
My opinion. Comparing Scheme and Java I see so many annoyances in
Java, and on the Scheme side merely a lack of features (modules,
standard structs, maybe something an OO system more like CLOS).
Again, I'm not attacking anyone, I simply wonder. And in this
context someone else's post that Guy Steele joined the Java group
after most of the design had been done makes perfect sense to me.
>>If you want to defend Java, please do. Almost every single language I
>>know doesn't have the braindead horridness of Java to it,
>
>
> Can you explain what you mean by "brain-dead horridness"?
All those things that personally annoy me, further up in the thread.
> Almost every language I know has at least one aspect that is at least
> as brain-dead if not more so than it is in Java. Of course, there are
> several languages for which this is the exception, i.e., they are
> almost uniformly better than Java in most regards. But your blanket
> statement above sounds uninformed at best.
Wow.
Well, of course C has some disadvantages to Java, like being lower
level. OTOH it is less hassle for me to debug a C than a Java
program and get it running. Memory management issues usually
don't even pop up or are cleaned up in a matter of minutes. Most
features that Java has don't buy me *anything*, personally. The
lack of varargs, pointers and stuff like that further hinders me
from coding what I want done.
Most other languages that don't share C's low-level-ness and
structure are just much easier to write in. The huge blooming
scripting language community seems to agree here. Even the Java
world embraced that and developed some languages, like Groovy (was
it called?).
This bugs me too. I like C#'s solution: ref and out args. And unlike C++, Ada
and Delphi, you can *see* at the call site which are the out args, e.g.
int quotient; // no need to initialize -- first use is a write
int remainder;
divide(out quotient, out remainder);
--
Cheers, The Rhythm is around me,
The Rhythm has control.
Ray Blaak The Rhythm is inside me,
rAYb...@STRIPCAPStelus.net The Rhythm has my soul.
In practice, Java is actually the most portable language I've used. Given a VM
implementation, that is.
> Primitive types also add more fun to Java, because now every PrintStream or
> stuff like that (btw why are there Streams and Writers? What's the difference
> anyway?) has a print(float) print(int) print(Object) print(char) ... method.
PrintStreams vs PrintWriters suck. What's wrong with just a regular writer?
Something to avoid the io exceptions I suppose. General streams vs writers is
useful though: binary data vs char data.
> Even more beautiful is the combination of runtime and the language's
> libraries. Exceptions are part of the runtime, but the subclass
> "RuntimeException" doesn't have to be caught, unlike all others subclasses of
> Exception. (yes, I know, this is enforced by the compiler, not the runtime)
Now I actually like this. That is, I like being able to choose between checked
exceptions vs unchecked exceptions. Check exceptions help me in general, and
by default. But when I need the flexibility of unchecked exceptions I am glad
to have it.
RuntimeException being an exception is a pain, since one has to always
remember to handle them differently:
catch (RuntimeException passThru) {throw passThru;}
catch (Exception err) {log.error("action failed", err);}
I would have had Exception and UnCheckedException extending Throwable,
then Error and RuntimeException extending UnCheckedException
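The pass-through idiom quoted above can be made runnable; this is a minimal sketch (the strings and the demo class name are invented), showing why RuntimeException must be caught *before* the generic handler: it is a subclass of Exception, so a plain `catch (Exception)` would swallow it.

```java
public class PassThruDemo {
    static String handle(Exception toThrow) {
        try {
            throw toThrow;
        } catch (RuntimeException passThru) {
            // unchecked: in real code this would be rethrown unchanged
            return "passed through: " + passThru.getMessage();
        } catch (Exception err) {
            // checked: this is where a log.error(...) call would go
            return "logged: " + err.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(handle(new IllegalStateException("bug")));
        System.out.println(handle(new java.io.IOException("disk full")));
    }
}
```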
> For threads you create an object (instead of just having a startThread method
> that you pass a function). To have the thread run Thread.RUN() you invoke
> Thread.START(), cool huh? And this start() methods probably does some sweet
> low-level meddling with the VM.
Well it kind of has to. If you just invoked run() directly, you would be
running its code in the calling thread, not its own.
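A small sketch of that distinction (using modern lambda syntax for brevity, which postdates this thread): run() executes on whichever thread calls it, while start() asks the VM to spawn a new thread and execute run() there.

```java
public class StartVsRun {
    public static void main(String[] args) throws InterruptedException {
        Runnable body = () ->
            System.out.println("running on: " + Thread.currentThread().getName());

        body.run();                         // ordinary call: runs on "main"
        Thread worker = new Thread(body, "worker");
        worker.start();                     // VM spawns a thread; body runs there
        worker.join();                      // prints "running on: worker"
    }
}
```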
> > C is NOT an abomination. It's a good low-level language.
> What is it good for?
Portable assembler.
mkb.
> Again, my opinion; and all the people I know who know $LANGUAGE-X
> and Java agree that Java loses, for pretty much all languages.
OTOH, all people I know who know "$LANGUAGE-X"' runtime library and
Java's runtime library agree that Java wins, for pretty much all
languages (except perhaps Perl and Python, when scripting suffices).
> and get it running. Memory management issues usually don't even pop
> up or are cleaned up in a matter of minutes. Most features that Java
Then you're a genius. Everyone I know who's written C programs larger
than a couple thousand lines has spent hours with tools like DEC's
Third Degree (an Atom tool), Purify, or other essential lifesavers.
mkb.
> No kidding.
Is this related to an HP3000, perchance?
You forgot the worst design decision ever:
null-terminated strings. They are inefficient, insecure,
hard to program with, there is no reason at all any language should
have them (even a low level one).
> Kristof Bastiaensen <kri...@vleeuwen.org> writes:
>
>> You forgot the worst design decision ever:
>> null-terminated strings. They are inefficient, insecure,
>> hard to program with, there is no reason at all any language should
>> have them (even a low level one).
>
> Well, C does not have strings. I agree that it would have been nice
> if C actually had strings.
C has string literals, which translate to 0-terminated arrays of char.
That's what Kristof was talking about.
> You forgot the worst design decision ever:
> null-terminated strings. They are inefficient, insecure,
> hard to program with, there is no reason at all any language should
> have them (even a low level one).
Well, C does not have strings. I agree that it would have been nice
if C actually had strings.
mkb.
> Ulrich Hobelmann wrote:
>
> [about Guy Steele's involvement in Java]
>
>> I don't mean that personally, or as an attack. It just seems that
>> whatever he knows and did isn't reflected in Java, at all. But in
>> another post someone mentioned that he probably only arrived at Sun,
>> when it was too late. Oak had been created. (I believe Gosling is to
>> "blame")
>
> From what can be reconstructed from public statements by both Steele
> and Gosling, the following happened:
>
> - Sun was itself surprised by Java's very early success. [1] In order to
> take advantage of its momentum, they started to rush things.
>
> - Guy Steele was involved in writing down the specification of the Java
> language because he has a track record of writing excellent
> specifications. He has stated that he tried to fix as many important
> conceptual problems as possible in a short amount of time. In other
> words, things could have been worse.
>
There are lots of good reasons why Guy Steele Jr. is at Sun, but perhaps
it still stings a bit that such a talented person in the LISP community is
associated with such a mediocre language, especially since Java bashing is
such a venerable pastime around here.
Matt
--
"You do not really understand something unless you can
explain it to your grandmother." — Albert Einstein.
Sure it has to. I just think the way it's done isn't too elegant.
Maybe it also was the lack of good threading and OO design
examples I had initially. Right now my only criticism there is
that they made Thread.start() a method (that looks as though it
would run just like any method, but instead it spawns a thread),
instead of providing something like forkThread(Function r), where Function
encapsulates a method, so it could be Runnable; forkThread would
not be an OO library method, but a language element. I don't know
much about pthreads; it seems to work that way (except that the
methods are all library functions).
The one thing that instances of Runnable handle elegantly, is
thread-local storage: just put it in the Runnable object.
Fernando> On Thu, 14 Apr 2005 15:27:52 -0400, Joe Marshall <j...@ccs.neu.edu>
Fernando> wrote:
>>> C is NOT an abomination. It's a good low-level language.
>>
>> Brain-dead macros.
Fernando> I think this is unavoidable without a Lisp-like syntax.
>> No tail recursion.
>> No closures.
>> Completely broken type system.
>> Pointless distinction between `statements' and `expressions'.
>> Bizarre syntax.
>> Vague semantics.
>>
>> What is it good for?
Fernando> Keeping you away from asm. ;-) It's a good language for recoding
Fernando> bottlenecks.
No it's not. I still have to code in assembly for my bottlenecks
sometimes.
Ray
> - One interesting tidbit by Guy Steele is that he stated that they had
> a full working implementation of inner classes that close not only
> over the final variables of the surrounding environment, but over any
> variables. This was voted down due to "efficiency" issues - variable
> bindings had to be allocated on the heap instead of the stack. Closing
> over final variables doesn't impose an implicit heap allocation.
What about autoboxing introduced in Java 5? It does implicit heap
allocation!
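Autoboxing can be observed directly; a minimal sketch: assigning an int to Integer compiles to an implicit Integer.valueOf call, which may allocate on the heap. The small-value cache (guaranteed for -128..127) makes reference comparison behave inconsistently, which is one way to see the allocation happening.

```java
public class AutoboxDemo {
    public static void main(String[] args) {
        Integer a = 127, b = 127;        // valueOf returns cached objects
        Integer c = 128, d = 128;        // outside the cache: fresh heap boxes
        System.out.println(a == b);      // true  (same cached object)
        System.out.println(c == d);      // false on default JVMs (distinct boxes)
        System.out.println(c.equals(d)); // true  (value comparison)
    }
}
```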
--
__("< Marcin Kowalczyk
\__/ qrc...@knm.org.pl
^^ http://qrnik.knm.org.pl/~qrczak/
Exactly. If it were a decent portable assembler I never would
have learnt assembly language. My plans are to build my own
simple backend eventually, so I can have *efficient* code with
non-wasteful calling conventions. In effect, a C replacement.
The only good thing about C is that it's reasonably low-level and
runs *everywhere*.
> Matthias Blume wrote:
>>>>>C is NOT an abomination. It's a good low-level language.
>>>>
>>>>What is it good for?
>>>
>>>Portable assembler.
>> Actually, it is not even really that good for that.
>
> Exactly. If it were a decent portable assembler I never would have
> learnt assembly language. My plans are to build my own simple backend
> eventually, so I can have *efficient* code with non-wasteful calling
> conventions. In effect, a C replacement. The only good thing about C
> is that it's reasonably low-level and runs *everywhere*.
I guess what you are looking for is C--.
Sorry, yes I know we're off topic, but I wasn't meaning to suggest that
you hadn't interviewed them.
Rather, I was meaning to suggest that neither resumes nor interviews
actually seem to be good ways of determining how good people are at
programming computers.
Best wishes,
Bill.
Maybe it's because everything I've written isn't more than a
couple thousand lines. ;)
Still, I claim that good structuring pretty much removes MM bugs.
If you use malloc() and free() all over the place, that's *not*
good structuring.
You might think of explicit MM as dynamic typing: the errors don't
get caught by the machine; you have to plan and test carefully.
In practice, most dynamic language programmers don't suffer many
runtime type errors, I hear. I don't suffer dangling pointers and
stuff, which might in the right type system be considered a type
error.
As someone said: 'Aging is not that bad if you consider the
alternatives'.
What are the alternatives to C as a portable asm?
PS It's not a rhetorical question, I'm interested.
Last I looked it didn't seem very alive. Now actually the website
dates Nov 2004 :)
Still, for portability I'm thinking of just building a simple VM
instead. Speed was never my problem. Coding convenience is.
>>>> Hm, maybe Fortran, Cobol. Maybe stuff like Brainf*** and
>>>> Unlambda, which aren't serious. What other abominations are out
>>>> there?
>>>>
>>>
>>>(Visual) Basic!
>>
>> I wonder why no one mentioned Perl yet...
>
>Hey, Perl is nearly as cool for one-liners as APL!
And nearly as cryptic... ;-)
Actually, I do like APL and its baroque beauty.
> Well, C does not have strings. I agree that it would have been nice
> if C actually had strings.
OTOH, I'm not so sure about that anymore. Given the environment early
C was used in (slow, low-memory systems such as the PDP-11 and early
68K and 8086 machines), it was probably easier to deal with arrays
(i.e., blocks) of characters. Just read in a block from some file
with low-level i/o, put a '\0' at the end, and that was it. With a
string datatype, how would one go about? Let the OS write data into a
character array first, then convert it to a string (possibly invoking
copying, a performance and space killer)? Or not make strings an
opaque datatype but expose their implementation, i.e. for example, "a
string is an int followed by a sequence of chars"? Then how large is
the int? 2 bytes, as on 16-bit machines, 32, or something in between?
What about small-/middle-/large endian issues? The string would have
to be converted when serialized (written to disk or network). How
would one assign or define such a string? Would it create two
variables, one for the length, the other for the character array
itself? Just so that one still has the array address for the OS to
write into? A _lot_ of problems pop up if you assume a Unix-style (or
DOS) environment on 30 years old small, underpowered machines.
Zero-terminated character arrays can still be used with UTF-8 in 2005.
What about an opaque string datatype that has been designed in 1970?
Would it still be usable? The key to C's survival lies in its
simplicity and its usefulness for low-level programming: it does not
limit the programmer, foreclose design decisions, or assume opaqueness
in datatypes that would hinder low-level jobs. It is intended for
distinctly different purposes than higher level languages. It's not
C's fault that people write large applications in it. Of course the
syntax and semantics could be better but even C's inventors
acknowledge that.
mkb.
> You might think of explicit MM as dynamic typing: the errors don't get
These two things do not compare.
mkb.
> - One interesting tidbit by Guy Steele is that he stated that they had a
> full working implementation of inner classes that close not only over
> the final variables of the surrounding environment, but over any variables.
Could you please describe what the disadvantages of the current state of
Java's "closures" are? In what situations do you want to close over
non-final vars?
André
--
> As someone said: 'Aging is not that bad if you consider the
> alternatives'.
Okay, it is way offtopic, but what alternatives are you talking
about? Not aging? Well, that sounds like a *great* alternative and I
hope science will allow us this alternative in 2-3 decades.
André
>Fernando Rodriguez schrieb:
>
>> As someone said: 'Aging is not that bad if you consider the
>> alternatives'.
>
>Okay, it is way offtopic, but about what alternatives are you talking
>about?
Besides dying? ;-)
Time for my favourite birthday joke.
Damn! I'm turning 40.
Oh, so what would you rather be? Not turning 40? But that means you're dead.
Ah. You mean you want to be 40 with a 25 year old's body.
Me, I would like to get as old as possible. The plan is not to stop.
Now being old and *healthy* -- that's the trick.
Not aging means that time has stopped. I seem to recall a scifi story somewhere
where one wanted to stay 25 or so, so they literally repeated that exact year
again and again forever.
Yes exactly, I agree!
>
> Not aging means that time has stopped. I seem to recall a scifi story somewhere
> where one wanted to stay 25 or so, so they literally repeated that exact year
> again and again forever.
When I said "not aging" I meant staying in a body that is around 25
years old. We would need rejuvenation therapies for that.
http://news.bbc.co.uk/1/hi/uk/4003063.stm
André
--
> Pascal Costanza <p...@p-cos.net> writes:
>
>>- One interesting tidbit by Guy Steele is that he stated that they had
>>a full working implementation of inner classes that close not only
>>over the final variables of the surrounding environment, but over any
>>variables. This was voted down due to "efficiency" issues - variable
>>bindings had to be allocated on the heap instead of the stack. Closing
>>over final variables doesn't impose an implicit heap allocation.
>
> What about autoboxing introduced in Java 5? It does implicit heap
> allocation!
Why do you expect consistency in their decisions?
The story about implicit heap allocation of closed-over variables was
told by Guy Steele in some mailing list some time ago. (I think it was
ll-discuss.)
Pascal
--
2nd European Lisp and Scheme Workshop
July 26 - Glasgow, Scotland - co-located with ECOOP 2005
http://lisp-ecoop05.bknr.net/
That's not the right question.
Do you know what Java programmers simulate when they write something
like this:
final int[] i = new int[1]; i[0] = ...;
...
new SomeInterface() {
...
i[0] = ...;
...
}
? ;)
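A compilable version of that trick, with hypothetical names: since anonymous inner classes may only capture final variables, mutable state is smuggled through a final one-element array — the reference is final, but the cell it points to is not.

```java
public class ClosureWorkaround {
    interface Counter { void bump(); }

    public static void main(String[] args) {
        final int[] count = new int[1];      // final reference, mutable cell
        Counter c = new Counter() {
            public void bump() { count[0]++; }  // "assigns" to the outer variable
        };
        c.bump();
        c.bump();
        System.out.println(count[0]);        // prints 2
    }
}
```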
>
> Forgot one of the more important ones that regularly annoy me: no
> multiple return values.
This always struck me as redundant and unnecessarily complicating code in
other parts (cf function combination). You can just use tuples. For
example, in ML, if `divide` returns a tuple, and you only need the first
value, you can write:
val (i, _) = divide x y
Oh no. Not again.
> Only the beginning sentence is quite
> funny: "are taking on an enormous challenge -- create a programming
> language better than Java."
Only fifty other languages come to mind :-)
> Scott G. Miller wrote:
> >>> So that you can add new types without having to modify that one
> >>> "module". If I add a Complex type, I would need to go and modify the
> >>> Conversions class or whatever you'd want to call it.
> >>
> >>
> >>
> >> Why? Create a Complex module, put the type and functions in it. Most
> >> languages work like that!
> >>
> > Including Java. I don't understand what you're arguing about.
>
> That any function that has Complex as a source or result type
> needs to be in the Complex class. If I want string->complex, and
> this isn't in the complex class distribution, and it isn't in
> String, obviously, then what do I do? In another language I can
> write a stringToComplex function *anywhere*. In Java I have to
> put it into a class, why? And it's static, a very descriptive
> keyword for something that's just supposed to be a function, and
> doesn't even *want* to be part of a class in the first place.
> Doesn't make code very readable.
You're arguing against the whole class of languages that are "OO all the
way down". That's just the nature of this type of programming language:
everything is in a class. Smalltalk, the archetype of OO languages,
also requires everything to be in a class.
--
Barry Margolin, bar...@alum.mit.edu
Arlington, MA
*** PLEASE post questions in newsgroups, not directly to me ***
> I've gone over 100 resumes for a position in our company recently. I
> saw exactly one person who knew any functional languages. Out of those
> 100, every one of them had C or Java experience, and out of those, five
> candidates were qualified. That's still five times as many Java
> programmers as Lisp ones. It's a terrible vicious cycle, but it's reality.
But if you can get 5 times as much product from the Lispers, you still
win.
Most of the C/Java programmers probably learned at Joe Blow Technical
Institute. The functional programmers, on the other hand, are mostly
going to be full-fledged hackers.
You don't know enough C programmers. I remember programming a LOT of C
in the 80s and early 90s but can't remember memory allocation ever being
a problem. I was quite surprised, actually, to hear everyone else
thought it was.
>
> You forgot the worst design decision ever:
> null-terminated strings. They are inefficient, insecure,
> hard to program with, there is no reason at all any language should
> have them (even a low level one).
>
You don't have to use them. You can specify the length of
variable-length character arrays. Besides, null-terminated strings are
supported in some instruction sets and used inside BIOS. It doesn't get
much lower-level than that.
Because it is not an abomination. Advanced programming in perl is a
mind altering experience. Try creating a new class, at runtime, for
example, or a new named variable in some namespace/package; also, look
at perl's goto.
--
A. Kanawati
NO.anto...@comcast.net
> Where can I read that interview on the net? I suppose Guy Steele is
> not charging anyone for reading his words?
Not sure, I read it at the local Borders.
They have a relaxed policy about extended browsing.
> No man is good enough to govern another man without that other's
> consent. -- Abraham Lincoln
I wonder if the South consented to being governed by him...
> If you think that Java is the worst language, you haven't seen many
> languages.
Well, you must admit that it was a step backwards from
ML, Miranda, Smalltalk, Prolog, Simula, or any one of many far better
languages that existed at that time.
However, as a language for the "average programmer" it was an improvement
on C++, the then-dominant language.
> Almost every language I know has at least one aspect that is at least
> as brain-dead if not more so than it is in Java.
I am curious about that statement.
Care to list candidates for brain-deadness in Scheme, SML and Haskell?
Just asking out of curiosity, not challenging or doubting you.
I would have nominated monads for haskell, but I am reliably
informed that they replaced an even worse previous IO system.
> Matthias Blume <fi...@my.address.elsewhere> writes:
>
>> Almost every language I know has at least one aspect that is at least
>> as brain-dead if not more so than it is in Java.
>
> I am curious about that statement.
> Care to list candidates for brain-deadness in Scheme, SML and Haskell ?
Scheme: see the many flamewars I have been involved in
SML: see Appel's critique
Haskell: well, not sure here; it's probably closest to "perfect" for
some definition of "perfect"; but it is probably completely
unsuitable for the "average" programmer, as sad as that may be.
> Just asking because of curiosity, not challenging or doubting you
>
> I would have nominated monads for haskell, but I am reliably
> informed that they replaced an even worse previous IO system.
IMO, the worst part about monads is that they are called "monads",
which always draws puzzled looks -- which don't get any less puzzled
when the explanation "it's something from category theory" is put
forward.
But this is not a problem with the language but mainly a problem with
how it is presented. Monads aren't scary at all -- every "average"
programmer uses a (very restricted) version of Haskell's IO monad on a
daily basis. They call it "imperative programming", and it isn't
scary at all (to them).
...which term may be, in turn, based on Leibniz'
"Monadology" and "Discourse on Metaphysics". Most of us
remember Leibniz as the independent inventor of calculus,
but he also had some bizarre metaphysical ideas. The
monads were simple and unextended elements that had no
direct contact with each other, but seemed to have
because they were synchronized in his clockwork universe.
It was also Leibniz' metaphysical views that Voltaire
lambasted in his novel "Candide," in the person of the
character Dr. Pangloss, who kept reminding the
unfortunate Candide that he lived in the best of all
*possible* worlds.
--
mbstevens http://www.mbstevens.com
Not really. In imperative programming every function has some
kind of IO type (if I understand Haskell correctly there). The
problem is that in Haskell you don't want everything to be IO. As
soon as you need a side effect in one place, all functions
dependent on it also have to get an IO type, just like in Java all
callers have to declare an exception if you add one to the code.
This can get really inconvenient for code that changes a lot.
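The checked-exception analogy above can be sketched in Java (the class and method names here are made up for illustration): once one low-level helper declares a checked exception, every caller up the chain must either declare it too or handle it, much like an IO type spreading through Haskell signatures.

```java
import java.io.IOException;

public class ExceptionCreep {
    // Adding "throws IOException" here...
    static String readConfig() throws IOException {
        return "config";
    }

    // ...forces this caller to declare it as well...
    static String loadSettings() throws IOException {
        return readConfig();
    }

    // ...until some caller finally handles it and the
    // declaration stops propagating.
    static String settingsOrDefault() {
        try {
            return loadSettings();
        } catch (IOException e) {
            return "default";
        }
    }

    public static void main(String[] args) {
        System.out.println(settingsOrDefault());
    }
}
```

Retrofitting a `throws` clause onto `readConfig` after the fact means touching every transitive caller, which is the inconvenience described above.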
OTOH I never understood Haskell monads well enough to write
programs with them; I pretty much dumped it for SML ;)
So maybe in "real" programming monads aren't an issue at all, once
you understand them.
--
That turns out not to be the case. Dylan is "OO all the way down" but
methods do not belong in classes.
--
Bruce | 41.1670S | \ spoken | -+-
Hoult | 174.8263E | /\ here. | ----------O----------
I don't know that much about the Civil War, being European, but
apart from fighting over slavery, did he force the South to
remain part of the Union?
Certainly he was elected by a majority. But you're right --
democracy is usually the majority forcing things upon
minorities, and I can't say I always agree with it.
--
No, not again. Let me just say that Java has no tuples with which
to emulate multiple return values (MRVs). In ML I consider a tuple
to be multiple values, in fact :)
I guess so. I can't see any real reason to force the user into
100% OO, when any procedural language can integrate OO perfectly
well without forcing the user into anything.
> Despite all these I prefer it to Java by a good margin.
>>>Closing over final variables doesn't impose an implicit heap allocation.
>> What about autoboxing introduced in Java 5? It does implicit heap
>> allocation!
>
> Why do you expect consistency in their decisions?
>
> The story about implicit heap allocation of closed-over variables was
> told by Guy Steele in some mailing list some time ago. (I think it was
> ll-discuss.)
I've heard this explanation too.
Variable arity methods, also introduced in Java 5 to match C#'s
feature set, do implicit heap allocation too.
Since C# 2.0 introduced true closures, with the ability to rebind
outer variables, shouldn't Java start supporting this again? :-)
--
__("< Marcin Kowalczyk
\__/ qrc...@knm.org.pl
^^ http://qrnik.knm.org.pl/~qrczak/
>That turns out not to be the case. Dylan is "OO all the way down" but
>methods do not belong in classes.
BTW, does Dylan have macros a la Lisp? I always wondered if that
was possible with a traditional infix syntax...
Yep, for example check out Jonathan Bachrach's work on "Dylan Procedural
Macros" and even "The Java Syntactic Extender". See
http://people.csail.mit.edu/people/jrb/jrb.html
It's not as straightforward as in Lisp, though.
>> I wonder why no one mentioned Perl yet...
>
>Because it is not an abomination. Advanced programming in perl is a
>mind altering experience. Try creating a new class, at runtime, for
I had a very short exposure to Perl several years ago (immediately
after learning scheme) and the design seemed a mess to me. Maybe
things have changed. At least the new VM for Perl looks interesting.
Do you know of any quick intro to the new cool aspects of perl, for
experienced programmers?
>example, or a new named variable in some namespace/package; also, look
>at perl's goto.
goto? Hmm...
> >>>>> "Fernando" == Fernando Rodriguez <frr@THOU_SHALL_NOT_SPAMeasyjob.net> writes:
>
> Fernando> On Thu, 14 Apr 2005 15:27:52 -0400, Joe Marshall <j...@ccs.neu.edu>
> Fernando> wrote:
>
>
> >>> C is NOT an abomination. It's a good low-level language.
> >>
> >> Brain-dead macros.
>
> Fernando> I think this is unavoidable without a Lisp-like syntax.
>
> >> No tail recursion.
> >> No closures.
> >> Completely broken type system.
> >> Pointless distinction between `statements' and `expressions'.
> >> Bizarre syntax.
> >> Vague semantics.
> >>
> >> What is it good for?
>
> Fernando> Keeping you away from asm. ;-) It's a good language for recoding
> Fernando> bottlenecks.
>
> No it's not. I still have to code in assembly for my bottlenecks
> sometimes.
Me too. In fact, I'd replace the "sometimes" with a "usually", since
I work with a good native CL compiler. As for "keeping you away from
asm", why is that a good thing? I don't find x86 assembly
particularly worse than C, but I do find C much worse than SPARC and
PPC assembly.