Larry Wall's comments on Ruby

Phil Tomson

Sep 6, 2002, 12:53:34 PM
http://interviews.slashdot.org/article.pl?sid=02/09/06/1343222&mode=thread&tid=145

We may not agree with what Larry has to say about Ruby, but as usual he
says it well.

The relevant part:

"As for specifics, I must say that the example of Ruby is the main reason
I decided against implicit lexical scoping for Perl 6. We'll be sticking
with explicit my declarations. But I have to like the majority of Ruby
simply because that's the part that was borrowed straight out of Perl. :-)

I also liked Ruby's unary splat operator, so I borrowed it for Perl 6.

The main problem I see with Ruby is that the Principle of Least Surprise
can lead you astray, as it did with implicit lexical scoping. The question
is, whose surprise are you pessimizing? Experts are surprised by different
things than beginners. People who are trying to grow small programs into
large programs are surprised by different things than people who design
their programs large to begin with.

For instance, I think it's a violation of the Beginner's Principle of
Least Surprise to make everything an object. To a beginner, a number is
just a number. A string is a string. They may well be objects as far as
the computer is concerned, and it's even fine for experts to treat them as
objects. But premature OO is a speed bump in the novice's onramp. "

Personally, I like Ruby's scoping rules a lot better than Perl's.

Also, I think that everything being an object is actually helpful for
beginners; it's much easier to pick up OO programming ideas if you learn
them first, I think. It also tends to make things much more consistent.
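Phil's consistency point shows up in a few lines of irb; a minimal sketch (class names are per current Ruby, which reports Integer where the Ruby of 2002 said Fixnum):

```ruby
# Every value answers the same object protocol -- no primitives.
puts 1.class           # Integer (Fixnum in the Ruby of 2002)
puts "hello".length    # 5
puts [1, 2, 3].first   # 1
puts nil.nil?          # true
puts String.class      # Class -- even classes are objects
```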

Thoughts?

Phil

Andrew Hunt

Sep 6, 2002, 1:48:42 PM

Larry Wall is reputed to have said:

>For instance, I think it's a violation of the Beginner's Principle of
>Least Surprise to make everything an object. To a beginner, a number is
>just a number. A string is a string. They may well be objects as far as
>the computer is concerned, and it's even fine for experts to treat them as
>objects. But premature OO is a speed bump in the novice's onramp. "

I think Larry's off his rocker on this one. Consistency is far
more important than familiarity. IMHO, Larry is demonstrating a
widely-held bias that objects are somehow "different" and should
be segregated and not taught to beginners.

I think that's a load of crap; exposing native types as non-objects
in Java, for instance, is to me one of the largest failings of that language.

I think that's why so many people have trouble mastering OO concepts:
because it's taught as a sort of an add-on to already entrenched
procedural, linear thinking. It's been my experience that you get
better mileage if you start with objects right out of the gate -- which
is exactly what Dave and I did in the pickaxe book.

To the best of my knowledge, not one reviewer or fan letter has yet
to criticize that decision. Far from it -- every bit of feedback we've
gotten so far has been uniformly positive that we started right off
with objects, without excuses.

>They may well be objects as far as
>the computer is concerned

Actually, I think that's backwards -- I don't care *what* the
computer thinks about these things. By the time it hits the
CPU it sure as hell isn't an object anymore. *I* want
to think of it as an object, and I want the other programmers
on my team to think of it as an object as well.

Just my humble $0.02.

/\ndy

--
Andrew Hunt, The Pragmatic Programmers, LLC.
Innovative Object-Oriented Software Development and Mentoring for Agile Methods
web: http://www.pragmaticprogrammer.com email: an...@pragmaticprogrammer.com
--
Author of "The Pragmatic Programmer" * "Programming Ruby" * The Agile Manifesto
Columnist for IEEE Software Magazine * Board of Directors, Agile Alliance
Pragmatic T-shirts available at: www.pragmaticprogrammer.com/merchandise.html
--

Nigel Clarke

Sep 6, 2002, 1:51:42 PM
What do you expect him to say?

Ruby Rocks and Perl Sucks.

No. In a nutshell he had some good things to say about Ruby.

Come on, this is Larry Wall.

dbl...@candle.superlink.net

Sep 6, 2002, 1:55:36 PM
Hi --

On Sat, 7 Sep 2002, Phil Tomson wrote:

Larry Wall wrote:

> The main problem I see with Ruby is that the Principle of Least Surprise
> can lead you astray, as it did with implicit lexical scoping. The question
> is, whose surprise are you pessimizing?

No question here. We have the answer: Matz's.


David

--
David Alan Black | Register for RubyConf 2002!
home: dbl...@candle.superlink.net | November 1-3
work: blac...@shu.edu | Seattle, WA, USA
Web: http://pirate.shu.edu/~blackdav | http://www.rubyconf.com

JamesBritt

Sep 6, 2002, 2:04:07 PM
> The main problem I see with Ruby is that the Principle of Least Surprise
> can lead you astray, as it did with implicit lexical scoping. The
> question
> is, whose surprise are you pessimizing? Experts are surprised by
> different
> things than beginners. People who are trying to grow small programs into
> large programs are surprised by different things than people who design
> their programs large to begin with.

This may have to do with the granularity of surprise. For example, if I use
an object in a string context (e.g., puts my_obj), I would be surprised if
there wasn't an implicit call to to_s or to_str. And I believe that is how
Perl does it; the parser "knows" the context, be it scalar, or vector, or
whatever, and does The Right Thing.
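A sketch of that implicit call in Ruby (the Point class here is made up for illustration): puts and string interpolation both fall back to the object's to_s.

```ruby
class Point
  def initialize(x, y)
    @x, @y = x, y
  end

  # Called implicitly by puts and by "#{...}" interpolation
  def to_s
    "(#{@x}, #{@y})"
  end
end

puts Point.new(1, 2)                 # prints (1, 2)
puts "origin is #{Point.new(0, 0)}"  # prints origin is (0, 0)
```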

On the other hand, I tried testing some blogging code on a different box,
and was "surprised" it failed. (Turns out the code needs to know if it is
run under CGI, or mod_ruby, or command line.) But I'd be *really* surprised
if anyone thought this point was the responsibility of anyone but the
developer.

I'd be interested to know if anyone was surprised by something in Ruby when
they moved a small app to a large app.

The POLS is sort of an 80/20 thing. Certain things (like some_string[x]
returning a number, not a character) surprised me, but by and large the
least surprise grows from a consistency of design principles, not from
Matz's mood on any given day.
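That some_string[x] surprise was real in the Ruby of the day: String#[] with an integer index returned the byte value, and only Ruby 1.9 (years later) changed it to return a one-character string. A sketch runnable on a modern Ruby:

```ruby
s = "abc"
# Ruby 1.8 and earlier: s[0] evaluated to 97 (the byte).
# Ruby 1.9 and later:   s[0] evaluates to the string "a".
puts s[0]        # "a" on any modern Ruby
puts s.bytes[0]  # 97 -- the old answer is still a method call away
```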

>
> For instance, I think it's a violation of the Beginner's Principle of
> Least Surprise to make everything an object. To a beginner, a number is
> just a number. A string is a string. They may well be objects as far as
> the computer is concerned, and it's even fine for experts to
> treat them as
> objects. But premature OO is a speed bump in the novice's onramp. "
>
> Personally, I like Ruby's scoping rules a lot better than Perl's.
>
> Also, I think that everyting being an object is actually helpful for
> beginners - It's much easier to pick up OO programming ideas if you learn
> them first, I think. It also tends to make things much more consistent.
>
> Thoughts?

I, too, think it makes more sense to present objects first, then show
concrete examples later. Because, later on, if you have various exceptions
to Everything Is An Object (see Java), you have to mentally track more
stuff. So, rather than some possible brief discomfort when learning the
language, you have mild discomfort all the time (see Java).

I don't think Larry is giving OO newcomers enough credit.


James

>
> Phil
>

Alan Chen

Sep 6, 2002, 2:10:36 PM

On Sat, Sep 07, 2002 at 02:31:49AM +0900, Phil Tomson wrote:
> http://interviews.slashdot.org/article.pl?sid=02/09/06/1343222&mode=thread&tid=145


>
> "For instance, I think it's a violation of the Beginner's Principle of
> Least Surprise to make everything an object. To a beginner, a number is
> just a number. A string is a string. They may well be objects as far as
> the computer is concerned, and it's even fine for experts to treat them as
> objects. But premature OO is a speed bump in the novice's onramp. "
>
> Personally, I like Ruby's scoping rules a lot better than Perl's.
>
> Also, I think that everyting being an object is actually helpful for
> beginners - It's much easier to pick up OO programming ideas if you learn
> them first, I think. It also tends to make things much more consistent.

A total beginner, I think, would have the same amount of difficulty
learning OO vs. non-OO languages. For a beginner to ruby, but not to
programming, I would think that the learning curve would be strongly
background dependent.

Having recently dived into a perl project for a client after a short
vacation from perl, I'm finding that having everything OO in ruby
makes it much easier to look up functions and operators in code and
documentation. For rarely used perl functions, I find that I have to
go poking around in the docs about twice as long to locate the
reference info I want.


--
Alan Chen
Digikata LLC
http://digikata.com

W Kent Starr

Sep 6, 2002, 3:23:34 PM
On Fri, 2002-09-06 at 13:48, Andrew Hunt wrote:
>
> Larry Wall is reputed to have said:
>
> >For instance, I think it's a violation of the Beginner's Principle of
> >Least Surprise to make everything an object. To a beginner, a number is
> >just a number. A string is a string. They may well be objects as far as
> >the computer is concerned, and it's even fine for experts to treat them as
> >objects. But premature OO is a speed bump in the novice's onramp. "
>
> I think Larry's off his rocker on this one. Consistency is far
> more important than familiarity. IMHO, Larry is demonstrating a
> widely-held bias that objects are somehow "different" and should
> be segregated and not taught to beginners.

Which, given the time of his rise to ascendancy, would be expected. We
are all a product of "our times". Complexity theory has been
around since the '70s (some would say off and on since the time of the
Egyptians) but was not widely known outside of very tight academic
circles before the mid-'90s. Thus, Larry can be forgiven for thinking
the concept of "objects" too difficult for the beginner.


>
> I think that's a load of crap; exposing native types as non-objects
> in Java, for instance, is to me one the largest failings of that language.
>

Unfortunately, it is worse than just a "mere load of crap"; it's a time
bomb. I know through a contact that one large brand (not at liberty to say
who) had to replace their Java-based e-commerce solution in order to
efficiently scale up. (My contact solved their problem with perl/MySQL.)
I suspect, as time unfolds many more such issues will arise, especially
regarding maintainability.

> I think that's why so many people have trouble mastering OO concepts:
> because it's taught as a sort of an add-on to already entrenched
> procedural, linear thinking. It's been my experience that you get
> better mileage if you start with objects right out of the gate -- which
> is exactly what Dave and I did in the pickaxe book.

OO isn't taught well IMO. "Pickaxe" is a rare exception. The irony is
that Ruby is closely modelled in accordance with contemporary thinking
in modern theoretical physics where everything -is- an object (in
concept, not necessarily name). The physical world in which we live is
de facto Rubyesque. :-)

Regards,

Kent Starr


Denys Usynin

Sep 6, 2002, 3:24:33 PM
> The main problem I see with Ruby is that the Principle of Least Surprise
> can lead you astray, as it did with implicit lexical scoping. The question
> is, whose surprise are you pessimizing? Experts are surprised by different
> things than beginners. People who are trying to grow small programs into
> large programs are surprised by different things than people who design
> their programs large to begin with.

I think this whole Least Surprise Principle is a load of bullshit that
is invoked far too often for no good reason. It has a fancy name, but I
translate it to myself as "when matz made Ruby he made sure the way it
worked made sense to him". Excuse me, isn't that how all languages are (or
should be) made?

When you are a complete novice to computers, nothing in Ruby (or in any
other language) will be familiar to you and you will be 'surprised' by
everything. You just go ahead and learn how Ruby (or any other language)
works and you live with it and let Ruby be Ruby. Then you gradually
train your intuition to Ruby and everything is dandy.

When you learn Ruby after having a lot of experience with languages such
as C++/Java everything makes sense, and the Principle seemingly works.
If you only programmed in Fortran before (which is the case with a lot
of old-school physicists for example) then you will probably be even
more surprised than if you were a novice...

So face it, Ruby is just the language with its own structure, logic,
syntax and attitude. You just learn it, like every other language. It
is a good language simply because it was written by a good programmer;
and the story about the Principle of the Least Surprise is a good
anecdote for language historians but has nothing to do with any serious
discussion on advantages and disadvantages of Ruby.


Denys Usynin

Sep 6, 2002, 3:36:22 PM
Andrew Hunt wrote:
> Larry Wall is reputed to have said:
>
> >For instance, I think it's a violation of the Beginner's Principle of
> >Least Surprise to make everything an object. To a beginner, a number is
> >just a number. A string is a string. They may well be objects as far as
> >the computer is concerned, and it's even fine for experts to treat them as
> >objects. But premature OO is a speed bump in the novice's onramp. "
>
> I think Larry's off his rocker on this one. Consistency is far
> more important than familiarity. IMHO, Larry is demonstrating a
> widely-held bias that objects are somehow "different" and should
> be segregated and not taught to beginners.
>
> I think that's a load of crap; exposing native types as non-objects
> in Java, for instance, is to me one the largest failings of that language.
>
> I think that's why so many people have trouble mastering OO concepts:
> because it's taught as a sort of an add-on to already entrenched
> procedural, linear thinking. It's been my experience that you get
> better mileage if you start with objects right out of the gate -- which
> is exactly what Dave and I did in the pickaxe book.
>
I agree completely. I think Ruby might be the best language to learn as
your first computer language, with its relatively forgiving syntax and
consistently object oriented logic.

after all, isn't the whole purpose of OO programming to let the person
think as a human instead of trying to think as a machine?

the way programming is taught right now, first you wrestle your brain
into thinking in terms of computer logic so that you can code in C, and
then wrestle it back into using human logic again when you learn OOP in
say C++. Such an unnatural process.

William Djaja Tjokroaminata

Sep 6, 2002, 3:41:14 PM
He, he, I think Larry Wall really got himself into trouble this time. The
first time I learned Ruby, I didn't use any OO features at all; I just did
"straight" procedural programming in a script:

a = ...
b = ...
c = a * b
def func (x)
....

Then finally, after I learned all the OO stuff, I got a quite pleasant
surprise: I had actually been programming in OO from the beginning, as
the simple script above actually lives inside the class Object. What could be
better than that? Even in Java people have to be choked with endless
keywords and object stuff from the beginning. I would not hesitate to
say that Matz is much more of a genius than Larry!
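Bill's surprise can be verified directly; a minimal sketch (the values are arbitrary) showing that a "procedural" script is living inside an object the whole time:

```ruby
a = 6
b = 7
c = a * b        # plain procedural-looking code...

def func(x)      # ...and this top-level def becomes a private method of Object
  x + 1
end

puts self.class  # prints Object: the script's self was an object all along
puts func(c)     # prints 43
```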

Regarding the keyword "my", my philosophy is always that less typing is
better (unless we are paid by the hour :) ). To me, the use of "@" for
instance vars and nothing for local vars is one of the best, if not *the
best*, ways of designing a language. As I already wrote before, there is no
comparison between Perl and Ruby, well, ... except probably for CPAN...
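A sketch of the convention Bill is praising: the @ sigil alone distinguishes instance state, and bare names are local with no my-style declaration (the Counter class is invented for illustration):

```ruby
class Counter
  def initialize
    @count = 0     # @ marks instance state; no declaration keyword needed
  end

  def increment
    step = 1       # bare name: local to this method, nothing more to say
    @count += step
  end

  attr_reader :count
end

c = Counter.new
c.increment
c.increment
puts c.count   # prints 2
```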

Regards,

Bill
===========================================================================
Phil Tomson <pt...@shell1.aracnet.com> wrote:
> Larry Wall:

Andrew Hunt

Sep 6, 2002, 3:47:00 PM
Kent Starr says:

>While complexity theory has been
>around since the 70's (some would say off and on since the time of the
>Egyptians) but was not widely known outside of very tight academic
>circles before the mid-90's.

Hmm. As it turns out, I keep finding out about more and more neat
theories that aren't currently well known. Odds are I'll be embarrassed
by one of those twenty years from now :-)

>> I think that's a load of crap; exposing native types as non-objects
>> in Java, for instance, is to me one the largest failings of that language.
>>
>

>Unfortunately, it is worse than just a "mere load of crap"

Well, I was *trying* to be polite...

>had to replace their Java-based e-commerce solution in order to
>efficiently scale up. (My contact solved their problem with perl/MySQL.)

Interesting. What was the specific issue that killed them? In other
words, was it badly written Java, or just the fact that it was Java?

>I suspect, as time unfolds many more such issues will arise, especially
>regarding maintainability.

Unfortunately, I agree. Maintainability is one of those dark subjects
that no one wants to talk about. Dave and I were part of an OOPSLA
workshop last year on Software Archeology that talked around those
issues; this year Brian Marick has asked us back to talk about
Software For Life. It's an interesting and underappreciated subject,
but as you note, I suspect that will change in time as well.

>OO isn't taught well IMO. "Pickaxe" is a rare exception.

Thanks!

>The irony is
>that Ruby is closely modelled in accordance with contemporary thinking
>in modern theoretical physics where everything -is- an object (in
>concept, not necessarily name). The physical world in which we live is
>de facto Rubyesque. :-)

Ha ha! I love it! God is a Ruby programmer, after all :-)

/\ndy

Andrew Hunt

Sep 6, 2002, 3:52:45 PM
Denys Usynin observes:


>I think this whole Least Surprise Principle is a load of bullshit that
>is invoked far too often for no good reason. It has a fancy name, but I
>translate it to myself as "when matz made Ruby he made sure the way it
>worked made sense to him". Excuse me, isn't that how all languages are (or
>should be) made?

Probably, but they are not. I take the PLS to be a measure of
internal consistency. C++, for example, does not have this
level of consistency -- it is riddled with special exceptions to
rules and many "dark corners" where it isn't at all clear what
the expected behavior should be.

The published C++ FAQ book is about 4" thick, after all, so I
submit that that language is actually FULL of surprises :-)

/\ndy

Drew Mills

Sep 6, 2002, 3:54:43 PM
I don't think Larry's off his rocker. In fact Andy's words say otherwise:

"I think that's why so many people have
trouble mastering OO concepts:
because it's taught as a sort of an add-on
to already entrenched procedural, ..."

OO is still not taught as the norm. Thus beginners will be surprised by
everything's-an-object, regardless of what Ruby has done. The existence of
Ruby with 'OO everywhere' won't change this.

So Larry was correct. That's not to say that the world needs to accede to
this pedagogy, but it is a positive thing to recognize reality, even if you
decide not to change.

Once the world comes to an agreement that everything should be an object
(and I'm not convinced we will), then it will eventually be taught that way.
Till then, beginners will continue to be surprised by Ruby.

My first professional language in 1985 was an object-oriented LISP. I was
quite surprised by it. I don't think things have changed that much since
then. Except that OO is a more commonly taught additional technique.

Drew

> -----Original Message-----
> From: Andrew Hunt [mailto:an...@toolshed.com]
>

Andrew Hunt

Sep 6, 2002, 3:55:40 PM
Denys Usynin goes on to say:


> after all, isn't the whole purpose of OO programming to let the person
>think as a human instead of trying to think as a machine?


"the tools we are trying to use and the language or notation
we are using to express or record our thoughts are the major
factors determining what we can think or express at all!"

-- Edsger W. Dijkstra
ACM Turing Award Lecture, 1972

/\ndy

Andrew Hunt

Sep 6, 2002, 4:05:29 PM
Drew Mills points out:

>OO is still not taught as the norm.

>So Larry was correct. That's not to say that the world needs to accede to
>this pedagogy, but it is a positive thing to recognize reality, even if you
>decide not to change.

I think that's a fair summary of Perl, actually. It's a very
reasonable thing to conform to the norms -- all possible norms, in
Perl's case. All progress, of course, depends on the *unreasonable*
person :-)

But this is all beside the point -- Larry's off the mark
in suggesting that an inconsistency is preferable for any reason. I
contend that that attitude is just plain wrong. It's better to
be consistent -- even if that consistency is towards something
new and unfamiliar -- than to introduce exceptional cases that
the user (at any level) has to carry around.

/\ndy

Andrew Hunt

Sep 6, 2002, 4:09:08 PM
Bill glows:

>I would not hesitate to
>say that Matz is much more genius than Larry!

If we're voting, I'd second that :-)

But all things evolve. I remember being *ecstatic* that Perl
was around, and that I wouldn't have to suffer the problems
inherent in porting shell scripts with awk, nawk, sed,
cut, grep and the rest of the gang from one Unix box to another.
Perl made all of that ugliness go away.

But the world moves on. Ruby is far more appropriate for the
current world, and will hopefully evolve more gracefully for
tomorrow's world as well.

I'll go back to being quiet now...

/\ndy

Michael Campbell

Sep 6, 2002, 4:10:07 PM
> But this is all beside the point -- Larry's off the mark
> in suggesting that an inconsistency is preferable for any reason. I
> contend that that attitude is just plain wrong. It's better to
> be consistent -- even if that consistency is towards something
> new and unfamiliar -- than to introduce exceptional cases that
> the user (at any level) has to carry around.

I agree here too in this context, but I have to put in a quote I'm
rather fond of...

Consistency is the last refuge of the unimaginative. -- Oscar Wilde

=)


=====
--
Use your computer to help find a cure for cancer: http://members.ud.com/projects/cancer/

Yahoo IM: michael_s_campbell


Denys Usynin

Sep 6, 2002, 4:13:35 PM

yeah and as I said, depending on your background, Ruby is just as full
of surprises as C++ is.

My guess is Ruby won't surprise you too much only if you have a Perl/Python
background. That's it.

I am probably the only person in the world who never used perl/python
:) When I first needed a scripting language I just learned Ruby
instead. I love the language but I can by no means say it was natural
and not surprising to me. Quite the opposite, I had to adapt
to the way Ruby's logic works and get the feel for it.

The great Principle is an empty sound. Larry Wall should have never
mentioned it.

Matt Gushee

Sep 6, 2002, 4:20:34 PM
On Sat, Sep 07, 2002 at 02:48:18AM +0900, Andrew Hunt wrote:
>
> I think that's why so many people have trouble mastering OO concepts:
> because it's taught as a sort of an add-on to already entrenched
> procedural, linear thinking. It's been my experience that you get
> better mileage if you start with objects right out of the gate -- which
> is exactly what Dave and I did in the pickaxe book.

Hmm, this comment reminds me of when I took Sun's "Java Programming
Language" couse a couple of years ago. It was advertised as being for C
and C++ programmers, and on the first day the instructor spent a good
deal of time making sure everybody knew the prerequisites were no joke.
Unsurprisingly, the lessons made frequent references to C and C++
concepts, which were mostly over my head. But with my Python background,
I had very little trouble with the parts of the course that were
actually about Java (can't say the same for all my classmates, many of
whom did have the prerequisites).

--
Matt Gushee
Englewood, Colorado, USA
mgu...@havenrock.com
http://www.havenrock.com/

pa...@prescod.net

Sep 6, 2002, 4:36:25 PM
On Sat, 7 Sep 2002, Denys Usynin wrote:

> yeah and as I said, depending on your background, Ruby is just as full
> of surprises as C++ is.

The question is, once you've mastered Ruby's basic worldview, do you still
keep being surprised? Some languages are never surprising once you've
mastered their basics. Others continually surprise with their baroque
twists and turns and special cases. POLS suggests that the designer is
consciously focusing on a design where new concepts fall naturally out of
old ones. C++, by contrast, optimizes for performance. Perl optimizes for
"expressiveness" and succinctness.

> I am probably the only person in the world who never used perl/python
> :) When I first needed a scripting language I just learned Ruby
> instead. I love the language but I can by no means say it was natural
> and not surprising to me. Quite the opposite, I had to adapt
> to the way Ruby's logic works and get the feel for it.

Of course. But do you have to KEEP adapting to quirky behaviours that
don't fit properly into the logical model you've adapted to? In C++, most
of us do. "X implies Y except when Z which occurs if and only if Q." Like:
"use the delete operator. Except for deleting arrays. There is a special
operator for that. And don't use delete on objects that were malloc'd."

For more examples:

http://new-brunswick.net/workshop/c++/faq/strange-inheritance.html

Paul Prescod


Andrew Hunt

Sep 6, 2002, 4:46:12 PM
>yeah and as I said, depending on your background, Ruby is just as full
>of surprises as C++ is.

Disagree; the Ruby FAQ is quite a bit skinnier than 4" :-)

>My guess is Ruby won't surprise you too much only if you have Perl/Python
>background. That's it.

I've never written a line of code in Python, BTW.

But you've piqued my curiosity: tell me, what did you find
surprising about Ruby? Where "surprising" is defined as
"you thought it worked one way" and then discovered that it did not,
or that it worked that way some of the time and some other way
the rest of the time?

Just learning a new concept doesn't count as surprising,
but being led astray by inconsistent behavior does.

/\ndy

Paul Duncan

Sep 6, 2002, 5:03:09 PM
* Phil Tomson (pt...@shell1.aracnet.com) wrote:
> http://interviews.slashdot.org/article.pl?sid=02/09/06/1343222&mode=thread&tid=145
>
> We may not agree with what Larry has to say about Ruby, but as usual he
> says it well.
>
> The relevant part:
>
> "As for specifics, I must say that the example of Ruby is the main reason
> I decided against implicit lexical scoping for Perl 6. We'll be sticking
> with explicit my declarations. But I have to like the majority of Ruby
> simply because that's the part that was borrowed straight out of Perl. :-)

I'm suspicious of anything Larry, or any of the top Perl brass, have to
say regarding scoping rules. Variable declaration in Perl is a
nightmare: do you use "my", "our", "local", or a filehandle? Be
careful! The scoping and namespace rules change in subtle and nefarious
ways depending on the declaration. It's even worse if you plan on using
references (or, more accurately, one of the three different types of
references), since the scoping and namespace rules change in subtle and
nefarious ways depending on the declaration.

Examples of this madness? A "local" variable isn't really local; it's
valid in nested subroutines as well. A top-level "my" variable isn't
really global to the package; you have to use "our" in order to allow
that behavior (this distinction is important if you're using symbolic
references instead of scalar references).

Perl doesn't have the best track record regarding sane scoping rules.
Oh, I'm sure they have it all figured out _this_ time around, and they
promise to get it right in Perl 6. As Linus says, show me the code. If
I sound bitter, it's because I've been burned by this mess more than
once. That said, Ruby does have its scoping quirks (eg eval and closure
block ambiguity), most of which have been beaten to death on this list,
so I won't bother regurgitating them here.

> I also liked Ruby's unary splat operator, so I borrowed it for Perl 6.
>
> The main problem I see with Ruby is that the Principle of Least Surprise
> can lead you astray, as it did with implicit lexical scoping. The question
> is, whose surprise are you pessimizing? Experts are surprised by different
> things than beginners. People who are trying to grow small programs into
> large programs are surprised by different things than people who design
> their programs large to begin with.
>
> For instance, I think it's a violation of the Beginner's Principle of
> Least Surprise to make everything an object. To a beginner, a number is
> just a number. A string is a string. They may well be objects as far as

That's simply not true. If a number is a number, then it shares a set
of common characteristics with all other numbers. Is it zero? Is it an
integer? What integer does it round to? Even non-technically inclined
people think this way; sets of things are grouped together because they
share attributes. As Andy says in another response, there's nothing
counterintuitive or complicated about making this natural thought
process a fundamental aspect of a language. On the contrary, the
unnatural distinction between objects (thingies which possess attributes)
and primitives (thingies which are magically exempt or devoid of
attributes) is counterintuitive and confusing for beginners. Larry
should know better.
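The queries Paul lists are, in Ruby, plain method calls available on every number:

```ruby
n = 3.7
puts n.zero?      # false -- "is it zero?"
puts n.integer?   # false -- "is it an integer?"
puts n.round      # 4     -- "what integer does it round to?"
puts 0.zero?      # true
puts 42.integer?  # true
```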

> the computer is concerned, and it's even fine for experts to treat them as
> objects. But premature OO is a speed bump in the novice's onramp. "
>
> Personally, I like Ruby's scoping rules a lot better than Perl's.

Obviously I agree. :)

> Also, I think that everyting being an object is actually helpful for
> beginners - It's much easier to pick up OO programming ideas if you learn
> them first, I think. It also tends to make things much more consistent.

Again, I agree.

> Thoughts?
>
> Phil

--
Paul Duncan <pa...@pablotron.org> pabs in #gah (OPN IRC)
http://www.pablotron.org/ OpenPGP Key ID: 0x82C29562

Paul Duncan

Sep 6, 2002, 5:07:51 PM
* Michael Campbell (michael_s...@yahoo.com) wrote:
> > But this is all beside the point -- Larry's off the mark
> > in suggesting that an inconsistency is preferable for any reason. I
> > contend that that attitude is just plain wrong. It's better to
> > be consistent -- even if that consistency is towards something
> > new and unfamiliar -- than to introduce exceptional cases that
> > the user (at any level) has to carry around.
>
> I agree here too in this context, but I have to put in a quote I'm
> rather fond of...
>
> Consistency is the last refuge of the unimaginative. -- Oscar Wilde

Haha. I was waiting for that quote to rear its ugly head. :)


--

Andrew Hunt

Sep 6, 2002, 5:11:48 PM
>> Consistency is the last refuge of the unimaginative. -- Oscar Wilde

Well, since y'all started it, how about:

A foolish consistency is the hobgoblin of little minds.
-- Ralph Waldo Emerson

/\ndy

(really, I'm going to stop now...)

Rich Kilmer

Sep 6, 2002, 5:13:31 PM
I would like a quick shot at this.

I was not so much surprised as impressed by Ruby. I was doing HEAVY
Java development (a 300k-line framework), and the dynamic, pure-OO nature of
Ruby was VERY different from Java. I read the Pickaxe book in two
days...did not touch the language until a week later. The book just
kept me thinking...internalizing the differences between dynamic and static
behavior in objects and classes.

Then I started using it and was functional very quickly...rewriting some
of my Java utilities in a fraction of the code/time it took in
Java. I moved on to more complex things...then I hit what I will call
the "refactoring loop". Ruby lets me code, refactor, code,
refactor...it's really awesome (unit tests validating each
refactoring). I NEVER wanted to refactor my Java code (it's just not
fun). With Ruby the language does not surprise me (in its
syntax/architecture), but the code I write very much does.

Matz...thanks for a great language
Andy...thanks for a great book

-Rich

> -----Original Message-----
> From: Andrew Hunt [mailto:an...@toolshed.com]

> Sent: Friday, September 06, 2002 4:41 PM
> To: ruby-talk ML
> Subject: Re: Larry Wall's comments on Ruby
>
>

Reimer Behrends

Sep 6, 2002, 5:11:44 PM
Andrew Hunt (an...@toolshed.com) wrote:
[...]

> But you've piqued my curiosity: tell me, what did you find
> surprising about Ruby? Where "surprising" is defined as
> "you thought it worked one way" and then discovered that it did not,
> or that it worked that way some of the time and some other way
> the rest of the time?

There are a few things. One is operator precedence; finding out that

a = not(b)

does not work at all, and only gives you an unclear error message, is
a real puzzler. Another one is

statement while condition

Noting that the condition is tested before the statement is executed
seems to be contrary to how every other language with test-at-end
loops does it. To discover then that

begin statement end while condition

works the other way round is even more confusing.
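The loop-modifier behavior described here is easy to check directly; a minimal sketch (the `count` variable is purely illustrative):

```ruby
count = 0
# Plain modifier-while: the condition is tested first, so the body never runs.
count += 1 while false
raise "modifier while ran the body" unless count == 0

count = 0
# begin...end while: the body runs once before the condition is tested --
# the do-while behavior that differs from the plain modifier form.
begin
  count += 1
end while false
raise "begin/end while skipped the body" unless count == 1
```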

Reimer Behrends

Denys Usynin

Sep 6, 2002, 5:22:47 PM

> But you've piqued my curiosity: tell me, what did you find
> surprising about Ruby? Where "surprising" is defined as
> "you thought it worked one way" and then discovered that it did not,
> or that it worked that way some of the time and some other way
> the rest of the time?

Well, things that surprised me will probably be considered obvious by
you. As I said, it's only a matter of background and maybe of
thinking style.
But here are just a few examples I had to struggle with:

1. && and || are not the same as "and" and "or". The difference
is subtle, but why weren't these made completely synonymous in the first
place?

2. File.open( "myfile" ) { |file| puts "block..." } will automatically
close it. I keep trying to close it at the end of the block. I can see how
it seems very intuitive to some people, but I am used to C.

3. myString.each { } processes lines not bytes. Why?

4. myBlock = Proc.new { puts "myBlock..." }
then I intuitively expect to be able to do this: 5.times myBlock
but it won't work

5. Forgiving syntax leaves lots of room to fool yourself. Classic
example: x = y +z
It is VERY counter-intuitive to me that this tries to call method y with
parameter +z instead of adding the two numbers together.

I could go on. Anyway, my point is, as with every language, you can't
just blindly rely on your intuition in Ruby; you have to remember how to
do some (many?) things. And just as with every other language, once you
have a feel for it, you can do a lot of things just by pure intuition.
Ruby may have fewer things to remember than C++, but it doesn't change the
big picture.
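Two of the surprises above are easy to demonstrate; a minimal sketch (`my_block` is just an illustrative name):

```ruby
# 1. "and" binds more loosely than "=", so && and "and" are not synonyms:
a = true && false        # parsed as a = (true && false)
b = true and false       # parsed as (b = true) and false
raise unless a == false
raise unless b == true

# 4. A Proc object has to be turned back into a block with &:
my_block = Proc.new { |i| i }
# 5.times my_block       # ArgumentError: times wants a block, not a Proc
5.times(&my_block)       # works
```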

Dan Sugalski

Sep 6, 2002, 5:25:45 PM
At 6:00 AM +0900 9/7/02, Paul Duncan wrote:
>I'm suspicious of anything Larry, or any of the top Perl brass, have to
>say regarding scoping rules. Variable declaration in Perl is a
>nightmare: do you use "my", "our", "local", or a filehandle?

Erm... only my and our are declarations. local is not. And you'd use
a filehandle presumably when you wanted to read or write from a file
of some sort.

There are a few operations in perl that only act on global variables,
rather than lexical ones (notably local, symbolic refs, and formats).
Perl's appendix--vestigial remnants of an older time. Every language
has them, and they're generally marked as deprecated.

I'd hardly call anything in perl nefarious, though. Well, with
perhaps the exception of the source to the regex engine, but all
regex engine code is evil.
--
Dan

--------------------------------------"it's like this"-------------------
Dan Sugalski even samurai
d...@sidhe.org have teddy bears and even
teddy bears get drunk

Drew Mills

Sep 6, 2002, 5:49:03 PM
> -----Original Message-----
> From: Andrew Hunt [mailto:an...@toolshed.com]
> Just learning a new concept doesn't count as surprising,
> but being led astray by inconsistent behavior does.
>
Learning a new concept that does not fit your worldview does count as
surprising. And it will continue to be surprising until you change your
worldview. And beginners are still taught a procedural worldview.

You just can't say that Ruby isn't surprising. If someone comes to Ruby and
really likes the OO-everywhere aspect, they will often say, "Wow". And guess
what: they were just surprised. Otherwise, they would say, "Big deal, I've
been doing that in language X for the last 4 years at school. Of course
everything's an object." But they don't. Because that's not what they're
taught.

And if that same procedural person has to re-organize his worldview to do
everything in an OO-everywhere fashion, that person will be surprised over
and over until he successfully changes his worldview.

Let's be clear: I'm not looking for procedural everywhere, I'm OO all the
way. But Larry hit the nail on the head that beginners will be surprised.
It might be a nice surprise! But it will be a surprise. His choice to not
surprise them with OO can arguably be judged short-sighted. But his
statement about their surprise is spot-on.

Another way of saying it: "Since our beginners are taught inconsistencies,
they may well be surprised by consistency."

Drew

dbl...@candle.superlink.net

Sep 6, 2002, 5:59:09 PM
Hi --

And if you add a rescue:

begin statement; rescue; end while condition

it once again tests the condition first, which I've always found kind
of anomalous.... It means you can't casually throw in a rescue clause
for debugging purposes, or whatever, because the logic will change.


David

--
David Alan Black | Register for RubyConf 2002!
home: dbl...@candle.superlink.net | November 1-3
work: blac...@shu.edu | Seattle, WA, USA
Web: http://pirate.shu.edu/~blackdav | http://www.rubyconf.com

Mark Probert

Sep 6, 2002, 6:05:49 PM
At 06:48 AM 9/7/2002 +0900, Drew wrote:
>[snip]

>
>And if that same procedural person has to re-organize his worldview to do
>everything in an OO-everwhere fashion, that person will be surprised over
>and over until he successfully changes his worldview.

I think you are right that it is the change in world view that
is the surprising part. However, after some acquaintance with
more than one programming language, this is sort of expected.

Perhaps schools should be encouraged to think outside the Java/C/C++
idiom? I would think that the surprise factor would be much less
prevalent if people had exposure to a bunch of vastly different
programming languages/environments.

Might I suggest:

+ C/C++/Java
+ Smalltalk/Ruby
+ Haskell/OCaml
+ LISP
+ FORTH
+ FORTRAN
+ COBOL
+ SQL
+ Oberon/Delphi

All of these approach problems in a different way and are optimised
for different solutions (even if many are "general programming
languages").

I like ruby, it is fun, quick and very flexible. It is ideal for
what I am doing at work at the moment (talking to network elements)
but I would be -very- surprised if I could embed it onto an 8-bit
micro-controller with 16k of memory ...

-mark.


Paul Duncan

Sep 6, 2002, 6:22:51 PM
* Dan Sugalski (d...@sidhe.org) wrote:
> At 6:00 AM +0900 9/7/02, Paul Duncan wrote:
> >I'm suspicious of anything Larry, or any of the top Perl brass, have to
> >say regarding scoping rules. Variable declaration in Perl is a
> >nightmare: do you use "my", "our", "local", or a filehandle?
>
> Erm... only my and our are declarations. local is not. And you'd use
> a filehandle presumably when you wanted to read or write from a file
> of some sort.

True. I guess what I'm really trying to say is this: despite the
additional complexity and inconsistent behavior, Perl's scoping rules
are not really that much more powerful than Ruby's scoping rules.
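For comparison, Ruby's implicit lexical scoping can be sketched in a few lines (variable and method names are purely illustrative):

```ruby
x = 10
# Blocks close over the locals of the enclosing scope...
[1, 2, 3].each { |i| x += i }
raise unless x == 16

# ...but a method definition starts a fresh local scope, so x is not
# visible inside it.
def peek_at_x
  defined?(x) ? x : :not_visible
end
raise unless peek_at_x == :not_visible
```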

The reason I tossed in file handles? I was using them to illustrate how
various types in Perl have scoping nuances which cause unexpected
behavior and give unexpected results. Case in point:

use strict; # no more implicit variables
open HeyImStillImplicit, "filename.txt"; # okay!!???
open $foo, "filename.txt"; # nope sorry

# this is okay:
my $foo;
open $foo, "filename.txt";

# but this isn't:
my FILEHANDLE;
..

> There are a few operations in perl that only act on global variables,
> rather than lexical ones (notably local, symbolic refs, and formats).
> Perl's appendix--vestigial remnants of an older time. Every language
> has them, and they're generally marked as deprecated.

Neither symbolic references nor the local keyword are marked as
deprecated. To be fair, the docs for local say "you should probably be using my()".

> I'd hardly call anything in perl nefarious, though. Well, with
> perhaps the exception of the source to the regex engine, but all
> regex engine code is evil.
> --
> Dan
>
> --------------------------------------"it's like this"-------------------
> Dan Sugalski even samurai
> d...@sidhe.org have teddy bears and even
> teddy bears get drunk

--

Andrew Hunt

Sep 6, 2002, 6:37:26 PM
>Learning a new concept that does not fit with your worldview does count as
>surprising. And it will continue to be surprising until you change your
>worldview. And beginner's are still taught a procedural worldview.
>
>You just can't say that Ruby isn't surprising. If someone come's to Ruby,
>really likes the OO-everwhere aspect, they will often say, "Wow". And guess
>what, they were just surprised.

But that's not the original point -- the Principle of Least surprise
(to me, at least) is not about "original" surprise, it's about
"repeated" surprise (see an earlier post).

/\ndy

W Kent Starr

Sep 6, 2002, 6:39:12 PM
On Fri, 2002-09-06 at 15:45, Andrew Hunt wrote:

>
> >had to replace their Java-based e-commerce solution in order to
> >efficiently scale up. (My contact solved their problem with perl/MySQL.)
>
> Interesting. What was the specific issue that killed them? In other
> words, was it badly written Java, or just the fact that it was Java?
>

Well, the "incident" occurred a little over two years ago and involved a
major online/offline retailer with a highly seasonal business. Because
this was told to me in confidence I can't be more specific about those
details.

The application was a custom implemented e-commerce app, Java based.
Well into the season the avg daily hit count ranged into the very high five
figures and -- the whole bleedin' thing crashed! That's when my contact
was called in. During the day and a half the site was down (which cost
close to a half million dollars in lost revenue, I'm told) the contact
and two of his programmers worked non-stop to recast the whole thing in
perl and MySQL. The last I heard, the redone site was still pumping with
an avg daily hit count in the low six figures.

Was it badly written Java? I never saw the source code, of course. But I
do know from other experiences and contacts that badly written Java is
not too difficult to achieve, especially in a crunch (which nearly
everything e-commerce related tends to be). The only thing I -do- know is
that the Java app failed (with costly results) and the open source
solution saved the day.

> Unfortunately, I agree. Maintainability is one of those dark subjects
> that no one wants to talk about. Dave and I were part of an OOPSLA
> workshop last year on Software Archeology that talked around those
> issues; this year Brian Marick has asked us back to talk about
> Software For Life. It's an interesting and underappreciated subject,
> but as you note, I suspect that will change in time as well.
>

Bright on the buzzword radar these days is TCO -- total cost of
ownership. Over the lifetime of any given software project, as with
ships, the bulk of TCO is maintenance, not original construction. All
other things being equal, the easier (read: cheaper) a solution is to
maintain, the lower the TCO over time. With IT turnover still in the
double digits (despite the economic downturn), a lot of fresh hands will
be involved in that maintenance, so the issues of expressiveness and
clarity, as they relate to software projects, are critical ones.


> >The irony is
> >that Ruby is closely modelled in accordance with contemporary thinking
> >in modern theoretical physics where everything -is- an object (in
> >concept, not necessarily name). The physical world in which we live is
> >de facto Rubyesque. :-)
>
> Ha ha! I love it! God is a Ruby programmer, after all :-)
>

Well, Wolfram would likely cast God as a Mathematica programmer. :-)
Searls, Locke, et al. would say God created the bazaar but man built the
cathedrals, and thus gave rise to "original sin". :-)

Regards,

Kent Starr

Hal E. Fulton

Sep 6, 2002, 6:39:51 PM
----- Original Message -----
From: "Andrew Hunt" <an...@toolshed.com>
To: "ruby-talk ML" <ruby...@ruby-lang.org>
Sent: Friday, September 06, 2002 2:54 PM
Subject: Re: Larry Wall's comments on Ruby

"Language shapes the way we think and determines
what we can think about."
Benjamin Whorf

"Those who doubt the importance of a convenient notation
should try writing a LISP interpreter in COBOL or doing
long division with Roman numerals."
Hal Fulton


:)
Hal

Matt Gushee

Sep 6, 2002, 6:47:37 PM
On Sat, Sep 07, 2002 at 07:38:54AM +0900, Hal E. Fulton wrote:
>
> "Language shapes the way we think and determines
> what we can think about."
> Benjamin Whorf

Well, linguists these days don't think too much of *his* work. Much of
it relied on generalizations about indigenous cultures that he never
actually observed. Though his mentor, Sapir, whose conclusions were more
moderate, is still respected.

Hal E. Fulton

Sep 6, 2002, 6:57:21 PM
----- Original Message -----
From: "Matt Gushee" <mgu...@havenrock.com>
To: "ruby-talk ML" <ruby...@ruby-lang.org>
Sent: Friday, September 06, 2002 5:47 PM
Subject: Re: Larry Wall's comments on Ruby

> On Sat, Sep 07, 2002 at 07:38:54AM +0900, Hal E. Fulton wrote:
> >
> > "Language shapes the way we think and determines
> > what we can think about."
> > Benjamin Whorf
>
> Well, linguists these days don't think too much of *his* work. Much of
> it relied on generalizations about indigenous cultures that he never
> actually observed. Though his mentor, Sapir, whose conclusions were more
> moderate, is still respected.

Still a bright person with fascinating ideas. He just
went overboard.

I admire him because his real field was chemical engineering,
yet he managed to make a stir in the linguistic community.
Granted many of his ideas were wrong. He was just an amateur,
after all.

It's when I reflect on his status as a self-taught amateur
that I respect him again.

Hal


Paul J. Sanchez

Sep 6, 2002, 8:04:34 PM
>>>>> "DM" == Drew Mills <tam...@ups.com> writes:

>> -----Original Message----- From: Andrew Hunt
>> [mailto:an...@toolshed.com] Just learning a new concept doesn't
>> count as surprising, but being led astray by inconsistent
>> behavior does.
>>

DM> Learning a new concept that does not fit with your worldview
DM> does count as surprising. And it will continue to be
DM> surprising until you change your worldview. And beginner's
DM> are still taught a procedural worldview.

DM> You just can't say that Ruby isn't surprising. If someone
DM> come's to Ruby, really likes the OO-everwhere aspect, they
DM> will often say, "Wow". And guess what, they were just
DM> surprised. Otherwise, they would say, "Big Deal, I've been
DM> doing that in language X for the last 4 years at school. Of
DM> course everthing's an object." But they don't. Because
DM> that's not what they're taught.

DM> And if that same procedural person has to re-organize his
DM> worldview to do everything in an OO-everwhere fashion, that
DM> person will be surprised over and over until he successfully
DM> changes his worldview.

DM> Let's be clear: I'm not looking for procedural everywhere, I'm
DM> OO all the way. But Larry hit the nail on the head that
DM> beginners will be surprised. It might be a nice surprise!
DM> But it will be a surprise. His choice to not surprise them
DM> with OO can arguably be judged short-sighted. But his
DM> statement about their surprise is spot-on.

DM> Another way of saying it, "Since our beginners are taught
DM> inconsistencies, they may well be suprised by consistency."

Depends on who "our" beginners are. If they're beginners to Ruby, but
old hands at programming, you're probably correct. If they're
complete newbies, it's not at all clear. I've heard (anecdotally)
that programming neophytes find OO much more natural and intuitive
than procedural programming - it agrees with their world view. Once
we've put in all the effort of bending our brains to think
procedurally, it seems natural to us - we forget the pain and angst of
our early days, and wonder why it's so hard for newbies to see the
"obvious" solutions we've trained ourselves to recognize. But if all
the patterns were so obvious, we wouldn't need all of the pattern
books that are so popular these days.

I remember glancing through a C++ patterns book after being an
Objective-C programmer for a while, and snickering. Perhaps one
metric of a good language is how few patterns are needed to use it
productively.

--paul

James F.Hranicky

Sep 6, 2002, 8:09:03 PM
On Sat, 7 Sep 2002 05:20:22 +0900
"Denys Usynin" <usy...@hep.upenn.edu> wrote:

> > The published C++ FAQ book is about 4" thick, after all, so I
> > submit that that language is actually FULL of surprises :-)
> >
> > /\ndy
> >
>
> yeah, and as I said, depending on your background, Ruby is just as full
> of surprises as C++ is.

Wow, I couldn't disagree more. Thinking about C++ templates alone makes
my skin crawl.

> My guess is Ruby won't surprise you too much only if you have Perl/Python
> background. That's it.

Coming from a perl background, I guess I can't answer this :->

> I am probably the only person in the world who never used perl/python
> :) When I first needed a scripting language I just learned Ruby
> instead. I love the language but I can by no means say it was natural
> and not surprising to me. Quite the opposite, I had to adapt
> to the way Ruby's logic works and get the feel for it.
>
> The great Principle is an empty sound. Larry Wall should have never
> mentioned it.

I think it's a fantastic principle when taken in context: the principle
of least surprise as seen by Matz. I look at what his vision produced and
am stunned by its simplicity and power. As a sysadmin, the thing I
liked most about perl was the ability to get to the OS while being able
to easily use higher-level abstractions bafflingly (still) absent from
C, such as hashes, arrays, and powerful regexp capabilities.

Ruby has taken that concept that much further by providing an intuitive
OO framework that can achieve the same results, only in less code. The
fact that I can do this:

ruby -e 'p File.stat("foo").mtime.to_i'

without having to write it in C, or remember where mtime comes in the
array returned by perl's stat is awesome. And guess what? The whole language
works that way!
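The one-liner above is easy to try; here is a self-contained variant that uses a temp file in place of the hypothetical "foo":

```ruby
require "tempfile"

file = Tempfile.new("foo")      # stand-in for the "foo" in the one-liner
stat = File.stat(file.path)
# mtime is a full Time object, not a slot in a stat array as in Perl.
raise unless stat.mtime.is_a?(Time)
raise unless stat.mtime.to_i.is_a?(Integer)
file.close!
```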

After years of wishing C had libraries with higher level abstractions,
and fumbling around with perl's not-so-great OO implementation, ruby
is a breath of fresh air, and although I've been surprised a couple of
times, for the most part, when I'm not sure how Ruby works in one
area, and I make a guess, it turns out I'm right.

You may be right about the perl (python, C, C++, etc...) folks finding
ruby to hold true to POLS. It's possibly because it's what we've always
wanted but have never had, and we're just damn happy about it :->

----------------------------------------------------------------------
| Jim Hranicky, Senior SysAdmin UF/CISE Department |
| E314D CSE Building Phone (352) 392-1499 |
| j...@cise.ufl.edu http://www.cise.ufl.edu/~jfh |
----------------------------------------------------------------------

"Given a choice between a complex, difficult-to-understand, disconcerting
explanation and a simplistic, comforting one, many prefer simplistic
comfort if it's remotely plausible, especially if it involves blaming
someone else for their problems."
-- Bob Lewis, _Infoworld_

Bryan Murphy

Sep 6, 2002, 9:00:19 PM
Andrew Hunt wrote:

>But this is all beside the point -- Larry's off the mark
>in suggesting that an inconsistency is preferable for any reason. I
>contend that that attitude is just plain wrong. It's better to
>be consistent -- even if that consistency is towards something
>new and unfamiliar -- than to introduce exceptional cases that
>the user (at any level) has to carry around.
>
>

And you know what, in many ways consistency is exactly what Larry is
doing. If you've been following the Apocalypses, you should have noticed
by now how hard he is working toward cleaning up the syntax of Perl
operators and regexes. Making them "more consistent", or something to
that effect.

So, in some ways Larry just said you shouldn't necessarily strive to do
exactly what he is doing! You have to forgive him, though; he's a busy
man with a lot on his plate. Everybody slips up once in a while; some
just get more publicity when they do.

Whatever the case may be, Perl 6 is going to be an interesting beasty,
regardless of what Larry says about Ruby.

Bryan

Phlip

Sep 6, 2002, 9:16:14 PM
Andrew Hunt wrote:

> But that's not the original point -- the Principle of Least surprise
> (to me, at least) is not about "original" surprise, it's about
> "repeated" surprise (see an earlier post).

In the space provided I have listed the number of times Ruby surprised me:

--> [ ]

I hope I wrote legibly.

--
Phlip
http://www.c2.com/cgi/wiki?RatAndTuyen
-- It's a small Web, after all... --

Bob Calco

Sep 6, 2002, 10:37:41 PM
My $0.02...

%% On Sat, 7 Sep 2002 05:20:22 +0900
%% "Denys Usynin" <usy...@hep.upenn.edu> wrote:
%%
%% > > The published C++ FAQ book is about 4" thick, after all, so I
%% > > submit that that language is actually FULL of surprises :-)
%% > >
%% > > /\ndy
%% > >
%% >
%% > yeah and as I said, depending on your background , Ruby is
%% just as full
%% > of surprises as C++ is.
%%
%% Wow, I couldn't disagree more. Thinking about C++ templates alone makes
%% my skin crawl.

Well, I happen to like C++ and especially C++ templates for precisely the
same reason I love Ruby: they make metaprogramming easy, in an otherwise
hopelessly static language, no less. Templates in C++ are very powerful, and
C++ syntax really isn't scary once you understand pointers and know how to
use them and manage your own memory without blowing your foot off. Template
syntax looks dangerous and even cabalistic until you know how to use
templates for good design (Alexandrescu's "Modern C++ Design" is a work of
sheer genius, for instance). Once you _do_ get into them, you can't program
in C++ without them.

Ruby's dynamic OO model provides another, simpler way to define high level
abstractions in the aid of better program design, and is infinitely easier
to refactor - which is a huge plus over compiled C++. I also like Ruby
because, when I decide I need to, I can always extend Ruby in C++. So I get
the best of both worlds, and can jaunt back and forth between them as the
muse singeth.
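The kind of high-level abstraction Ruby makes easy can be sketched with runtime method generation -- Ruby's dynamic counterpart to compile-time C++ templates (the `Record` class and its fields are purely illustrative):

```ruby
class Record
  # Generate reader and writer methods at runtime for each field.
  [:name, :email].each do |field|
    define_method(field) { instance_variable_get("@#{field}") }
    define_method("#{field}=") { |value| instance_variable_set("@#{field}", value) }
  end
end

r = Record.new
r.name = "Matz"
raise unless r.name == "Matz"
raise if r.email                 # never assigned, so it is nil
```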

%%
%% > My guess is Ruby won't surprise you too much only if you have
%% Perl/Python
%% > background. That's it.
%%
%% Coming from a perl background, I guess I can't answer this :->

Hmmm. You come from a Perl background and C++ template syntax makes your
skin crawl... my $OK, $I_believe_you = shift( @_ ); ### ;)

WRT Larry Wall's comments, they make perfect sense when you understand that
he's got his own language to promote, and a new version soon, at that. My
only comment is that Perl's retrofit of OO syntax is probably as ugly as
anybody could have possibly imagined, and I think he did it that way because
he really wanted it to look funky so that the people who "got it" and could
use Perl OO productively would look superhuman smart, like him. Ruby's OO
syntax, by contrast, is the definition of elegance and readability in a
development environment (scripting) that really benefits from both (not that
they aren't beneficial in _all_ programming environments, just that
scripting is pointless if at the very least the syntax isn't much cleaner
and noise-free than is possible in statically typed, compiled languages,
like C++ - which continue to have their place in the world). Strictly
speaking, Ruby's OOP is much closer to true OOP (elusive concept that it
is!) than anything Perl has offered so far. And his whole notion that
objects are too hard for beginners to grasp is the exact opposite of
reality - it's concepts like "scalars" and "fundamental data types" that
throw off inexperienced programmers. Humans think in terms of objects or
"things", and in particular we visualize reality in terms of objects, not in
terms of the sizes of the bytes of data a computer can use.

As sort of an aside, I do think that exposure solely to object oriented
programming is a Bad Thing for the serious programmer, however: At some
point he or she must look squarely at the ghost in the machine, turn on the
light, and scare it away. There are other paradigms and they too can be
good. But the casual programmer, like the casual citizen, can live a
perfectly happy, productive life without direct knowledge of how the world
really works, under the surface of things... That's actually the whole point
of civilized society - to make primitive survivalism technically
unnecessary - and the point of high-level languages as well: one should be
able to get a lot of work-for-pay done in a high level language like Ruby
without _necessarily_ having to know all the gory details of the
implementation of dynamic typing, garbage collection, etc. Knowing these
things of course is good: but not necessary for life.

Sincerely,

Bob Calco


Chris Gehlker

Sep 6, 2002, 10:39:00 PM

On Friday, September 6, 2002, at 03:05 PM, Mark Probert wrote:

> Might I suggest:
>
> + C/C++/Java
> + Smalltalk/Ruby
> + Haskell/OCaml
> + LISP
> + FORTH
> + FORTRAN
> + COBOL

Absolutely Not! ;-)
> + SQL
> + Oberon/Delphi
>
Seriously, I have often thought it would be best to teach assembler
and Smalltalk/Ruby as first languages.
--
As an adolescent I aspired to lasting fame, I craved factual certainty,
and I thirsted for a meaningful vision of human life - so I became a
scientist. This is like becoming an archbishop so you can meet girls.
-Matt Cartmill, anthropology professor and author (1943- )

Tim Hammerquist

Sep 6, 2002, 11:01:06 PM
Larry Wall wrote:
[ snippage ]

> For instance, I think it's a violation of the Beginner's
> Principle of Least Surprise to make everything an object. To a
> beginner, a number is just a number. A string is a string. They
> may well be objects as far as the computer is concerned, and

> it's even fine for experts to treat them as objects. But
> premature OO is a speed bump in the novice's onramp.

Phil Tomson graced us by uttering:


> Personally, I like Ruby's scoping rules a lot better than
> Perl's.
>
> Also, I think that everything being an object is actually
> helpful for beginners - it's much easier to pick up OO
> programming ideas if you learn them first, I think. It also
> tends to make things much more consistent.
>
> Thoughts?

After nearly two decades of coding (yes, some of you have been
doing it longer), I no longer claim to be able to twist my mind
into the "Beginner" shape. In fact, I can't say I honestly give
a single, solitary thought to how a beginner might perceive a
given language, syntax, or construct. My primary concerns are:

- Does it do what I need?
- Does it do too much more than what I need?
- Will my peers be able to understand what I'm doing?

With these criteria, both Ruby and Perl fit my bill, so long as
the "peers" reviewing the Perl code know Perl, and the "peers"
reviewing Ruby code know Ruby.

Most non-coders have trouble distinguishing between a "word" (in
the linguistic sense) and "character-based representation of a
word", much less Functional vs. Object-Oriented programming.

So the question for Larry (IMHO) becomes: "As a language
designer, how far back in the evolution of a programmer should I
cater to?" But I'm not Larry, regardless of whether I agree with
his decision. :)

Today I heard a user attempt to explain a problem.

"I was typing my project and MS Word told me I'd performed an
illegal operation. I clicked OK and the window went away and
when I opened up Word again, it wouldn't let me edit the file
or delete it. I had to start all over."

So MS Word had (predictably) crashed, leaving a swapfile and
zombie filehandle in its wake. After (predictably) restarting
Windows, the forsaken victim of a file entry could safely be
removed. Of course these aren't the users Larry has in mind, but
where _does_ he draw the line, I wonder?

Do Larry's refinements on behalf of the beginner help me?
Seldom. Do they hinder me? Seldom. Am I glad there's a
clear-headed developer behind such a prolific and valuable
language? Always.

Perl's an exceptionally powerful functional language wearing an
ugly, dirty, but passably OO mask.

Ruby is a clean, powerful, and elegant OO language. It's not
perfect, and it's not the answer to everything... but we haven't
hit v2.x yet!

Tim Hammerquist
--
Although the Perl Slogan is There's More Than One Way to Do It, I hesitate
to make 10 ways to do something. :-)
-- Larry Wall in <96...@jpl-devvax.JPL.NASA.GOV>

Christoph

Sep 6, 2002, 11:28:35 PM
"W Kent Starr" wrote

...
> OO isn't taught well IMO. "Pickaxe" is a rare exception. The irony is


> that Ruby is closely modelled in accordance with contemporary thinking
> in modern theoretical physics where everything -is- an object (in

Actually, I am curious. Do you have something more specific in mind,
or was this just meant to be an admittedly very clever line?

> concept, not necessarily name). The physical world in which we live is
> de facto Rubyesque. :-)

I like Ruby a lot, but I find that Ruby's single-method-receiver
OO incarnation (unfortunately the OO norm) models at least the
mathematical world badly - the ingenious but ultimately hackish and
``non-OO'' coerce framework of the Numeric class hierarchy is an
unfortunate proof of my point...
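The coerce framework referred to here works roughly like this; a minimal sketch (the `Meters` class is hypothetical):

```ruby
class Meters
  attr_reader :n
  def initialize(n)
    @n = n
  end

  def +(other)
    Meters.new(n + other.n)
  end

  # Integer#+ can't handle a Meters, so it calls other.coerce(self) and
  # retries the addition on the pair that comes back -- dispatching on the
  # receiver alone isn't enough, which is exactly the complaint above.
  def coerce(numeric)
    [Meters.new(numeric), self]
  end
end

sum = 3 + Meters.new(4)   # Integer#+ falls back to Meters#coerce
raise unless sum.n == 7
```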

/Christoph


Phlip

Sep 7, 2002, 12:09:19 AM
Larry Wall wrote:

> For instance, I think it's a violation of the Beginner's Principle of
> Least Surprise to make everything an object.

Guys, Google Groups sez this thread does not yet contain the phrase
"sour grapes". Can anyone see a way to fit it in?

--
Phlip
http://www.greencheese.org/DontPlanDesigns
-- Proud victim of the dreaded boomerang effect --

JamesBritt

Sep 7, 2002, 12:57:04 AM
>
> Larry Wall wrote:
>
> > For instance, I think it's a violation of the Beginner's Principle of
> > Least Surprise to make everything an object.
>
> Guys, Google Groups sez this thread does not yet contain the phrase
> "sour grapes". Can anyone see a way to fit it in?


Done.


James

Patrick May

unread,
Sep 7, 2002, 1:04:57 AM9/7/02
to
Andrew Hunt <an...@toolshed.com> wrote in message news:<200209061748...@toolshed.com>...

> Larry Wall is reputed to have said:
>
> >For instance, I think it's a violation of the Beginner's Principle of
> >Least Surprise to make everything an object. To a beginner, a number is
> >just a number. A string is a string. They may well be objects as far as
> >the computer is concerned, and it's even fine for experts to treat them as
> >objects. But premature OO is a speed bump in the novice's onramp. "
>
> I think Larry's off his rocker on this one. Consistency is far
> more important than familiarity. IMHO, Larry is demonstrating a
> widely-held bias that objects are somehow "different" and should
> be segregated and not taught to beginners.

This part jumped out to me too. To a beginner (say my gf), 1 is a
number, and "This is a couple of things" is a quoted sentence.
Strings are things that our cats play with.

When he says Beginner's Principle of Least Surprise, he seems to mean
programmers who learned C first. That, to me, is a very strange
definition of a 'novice'.

~ Patrick

Reimer Behrends

unread,
Sep 7, 2002, 1:14:44 AM9/7/02
to
Patrick May (patri...@monmouth.com) wrote:
[...]

> When he says Beginner's Principle of Least Surprise, he seems to mean
> programmers who learned C first. That, to me, is a very strange
> definition of a 'novice'.

One might consider "teachability" as an alternate metric, since the task
of communicating programming language concepts to a given target
audience can at least be estimated. (And personally, I'd rank Python
above Ruby according to that metric; but both way above Perl. The
idea of designing a curriculum to teach Perl to novice programmers
scares me, to be honest.)

Reimer Behrends

David Leal

unread,
Sep 7, 2002, 5:29:51 AM9/7/02
to
Andrew Hunt wrote:

> But you've piqued my curiosity: tell me, what did you find
> surprising about Ruby? Where "surprising" is defined as
> "you thought it worked one way" and then discovered that it did not,
> or that it worked that way some of the time and some other way
> the rest of the time?

Here are two more:

"abcd"[0] # => 97 (!!! Shouldn't this be "a"?)

and

"abcd".sub( /ab/ ) { |match|
match[0] # => 97 (Huh? Shouldn't match be a MatchData?)
} # => "97cd" (Should be "abcd")
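For what it's worth, the first of these was later revisited: on Ruby 1.9 and up, String#[] with an integer index returns a one-character string rather than a byte value. The block given to String#sub still receives the matched String, not a MatchData; the MatchData is available via $~ inside the block. A sketch of the modern behavior:

```ruby
# String#[] now returns a one-character string, not the byte value:
p "abcd"[0]          # => "a"

# The sub block receives the matched String; $~ holds the MatchData:
result = "abcd".sub(/ab/) { |match|
  p match.class      # => String
  p $~.class         # => MatchData
  match.upcase
}
p result             # => "ABcd"
```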

Ruby is my favorite language, but as long as we are all humans
there will always be surprises...

Cheers,

David

Bil Kleb

unread,
Sep 7, 2002, 6:01:43 AM9/7/02
to
Andrew Hunt wrote:
>
> Consistency is far more important than familiarity.

But Ruby's consistency gave in to familiarity in certain
(what I consider) fairly significant areas.

For example, we only get "everything's an object" behavior
for some math intrinsics, and the interface for files is a bit
strange; e.g., I expected 5.cos or aFile.rename('toThisName'),
not Math.cos(5) and File.rename(aFile,'toThisName').
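Thanks to open classes, the expected interface can be layered on by hand. This is a sketch only - patching core classes globally is its own can of worms, and renaming a file you hold open is platform-dependent:

```ruby
# Give numbers the receiver-style math interface:
class Numeric
  def cos
    Math.cos(self)
  end
end

# Give File objects an instance-level rename, delegating to the
# existing class method (untested edge cases: open handles, Windows):
class File
  def rename(new_name)
    File.rename(path, new_name)
  end
end

puts 0.cos   # => 1.0
```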

--
Bil

ts

unread,
Sep 7, 2002, 6:37:22 AM9/7/02
to
>>>>> "P" == Paul Duncan <pa...@pablotron.org> writes:

P> Examples of this madness?

I agree with you; there is madness in "my" :-)

pigeon% perl
use strict;
{
    my $m;
    sub aa { eval '$m += 2' }
    sub bb { eval 'print "bb : $m\n"' }
    sub cc { $m += 3 }
    sub dd { print "dd : $m\n" }
}
for (0 .. 3) {
    aa(); bb();
    cc(); dd();
}
^D
bb : 2
dd : 3
bb : 4
dd : 6
bb : 6
dd : 9
bb : 8
dd : 12
pigeon%


Guy Decoux

ts

unread,
Sep 7, 2002, 6:57:50 AM9/7/02
to
>>>>> "d" == dblack <dbl...@candle.superlink.net> writes:

d> begin statement; rescue; end while condition

pigeon% ruby -ve 'begin puts 12; rescue; end while false'
ruby 1.7.3 (2002-08-30) [i686-linux]
12
pigeon%

pigeon% ruby -ve 'begin puts 12; end while false'
ruby 1.7.3 (2002-08-30) [i686-linux]
12
pigeon%

Tue Mar 26 01:56:33 2002 Yukihiro Matsumoto <ma...@ruby-lang.org>

* parse.y (primary): while/until statement modifiers to "begin"
statement now work as "do .. while" even when begin statement
has "rescue" or "ensure" [new].
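The difference the ChangeLog entry describes can be shown side by side; a minimal sketch:

```ruby
# A plain while modifier tests its condition first, so the body
# never runs when the condition is false:
plain_runs = 0
plain_runs += 1 while false
p plain_runs         # => 0

# A while modifier attached to a begin...end block acts as do-while:
# the body runs once before the condition is tested, even when the
# block carries a rescue clause (the 1.7.3 change quoted above):
dowhile_runs = 0
begin
  dowhile_runs += 1
rescue
end while false
p dowhile_runs       # => 1
```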


Guy Decoux

Christian Szegedy

unread,
Sep 7, 2002, 7:09:50 AM9/7/02
to
W Kent Starr wrote:

>>I think Larry's off his rocker on this one. Consistency is far
>>more important than familiarity. IMHO, Larry is demonstrating a
>>widely-held bias that objects are somehow "different" and should
>>be segregated and not taught to beginners.
>
>

> Which, given the time of his rise to ascendancy, would be expected. We
> are all a product of "our times". Complexity theory has been
> around since the 70's (some would say off and on since the time of the
> Egyptians) but was not widely known outside of very tight academic
> circles before the mid-90's. Thus, Larry can be forgiven for thinking
> the concept of "objects" too difficult for the beginner.

I wonder whether you know what complexity theory is about...

Regards, Christian

Christian Szegedy

unread,
Sep 7, 2002, 7:38:48 AM9/7/02
to
Denys Usynin wrote:

> I think this whole Least Surprise Principle is a load of bullshit that
> is invoked far too often for no good reason. It has a fancy name, but I
> translate it to myself as "when matz made Ruby he made sure the way it
> worked made sense to him". Excuse me, isn't it how all languages are(or
> should be) made?

I was able to use Ruby quite well 1 hour after downloading it.
And I used most features without reading any docs at all,
just by guessing them based on common-sense considerations. (In all
other languages I know, there were always surprising details I had
to learn before using a particular feature.) I had become quite
conscious of this attribute of Ruby before I had learned the phrase
"principle of least surprise". For me, it was not a buzzword but the
label of a concrete psychological experience I had. Of course, this
is fairly subjective and other people may not have had the same
experience, but that is no reason to tell us that it does not exist.

I could demonstrate it with my brother too: I taught him Ruby for half
an hour, then I asked him "how would you do this or that more
conveniently?" He always suggested whatever seemed best to him
(sometimes a feature he knew from Java, sometimes something he knew
from Maple, sometimes just the common-sense solution he thought most
appropriate). Most of them worked quite well in Ruby :))

> Excuse me, isn't it how all languages are (or should be) made?

Yes, they ***should*** be, but ***are*** they?

> It is a good language simply because it was written by a good programmer;

No, it is a good language because it was written by a
good psychologist and aesthete.

Programming skills are "only" a guarantee that it works, but do
not say anything about the quality of the language.

Regards, Christian

Christian Szegedy

unread,
Sep 7, 2002, 7:55:19 AM9/7/02
to
Alan Chen wrote:
>
>>Larry Wall said in
>>http://interviews.slashdot.org/article.pl?sid=02/09/06/1343222&mode=thread&tid=145

>>
>>"For instance, I think it's a violation of the Beginner's Principle of
>>Least Surprise to make everything an object. To a beginner, a number is
>>just a number. A string is a string. They may well be objects as far as
>>the computer is concerned, and it's even fine for experts to treat them as
>>objects. But premature OO is a speed bump in the novice's onramp. "
>>
>
> A total beginner, I think, would have the same amount of difficulty
> learning OO vs. non-OO languages. For a beginner to ruby, but not to
> programming, I would think that the learning curve would be strongly
> background dependent.

Actually I had a quite interesting experience last year.
My wife started to study computer science. She was a total
novice: she had never written a line of code in any programming
language. Interestingly, the base course in programming featured Haskell.
Sometimes she came to me to ask questions and I had real problems,
but she seemed to pick it up quite naturally.

The next course was about imperative programming in Pascal, and
I thought it would be really trivial for her after Haskell. It was
not the case: she always complained that Haskell was so logical,
but Pascal is a real mess. I had to admit that she was right.
So I think the preconception that imperative programming is simple
for novices, while functional or OO is harder, is simply not true. I
think it is based on the fact that most people who enter universities
or courses already have experience in some imperative language.

Regards, Christian


dbl...@candle.superlink.net

unread,
Sep 7, 2002, 9:09:10 AM9/7/02
to
Hi --

Thanks for pointing this out. As the original (I think) complainer
about this, I should have tracked it more closely. (Sorry, Matz :-)


David

--
David Alan Black | Register for RubyConf 2002!
home: dbl...@candle.superlink.net | November 1-3
work: blac...@shu.edu | Seattle, WA, USA
Web: http://pirate.shu.edu/~blackdav | http://www.rubyconf.com

Massimiliano Mirra

unread,
Sep 7, 2002, 9:16:58 AM9/7/02
to
On Sat, Sep 07, 2002 at 02:31:49AM +0900, Phil Tomson wrote:
> For instance, I think it's a violation of the Beginner's Principle of
> Least Surprise to make everything an object. To a beginner, a number is
> just a number. A string is a string. They may well be objects as far as
> the computer is concerned, and it's even fine for experts to treat them as
> objects. But premature OO is a speed bump in the novice's onramp. "

So learning to deal with numbers AND strings AND arrays AND hashes AND
objects is easier than learning to deal with objects? Cool.


Massimiliano

Bob Calco

unread,
Sep 7, 2002, 10:21:14 AM9/7/02
to
See below:

%% -----Original Message-----
%% From: Christian Szegedy [mailto:sze...@t-online.de]
%% Sent: Saturday, September 07, 2002 8:22 AM
%% To: ruby-talk ML
%% Subject: Re: Larry Wall's comments on Ruby

%% Actually I had a quite interesting experience last year.
%% My wife started to study computer science. She was a total
%% novice: she had never written a line of code in any programming
%% language. Interestingly, the base course in programming featured Haskell.
%% Sometimes she came to me to ask questions and I had real problems,
%% but she seemed to pick it up quite naturally.
%%
%% The next course was about imperative programming in Pascal, and
%% I thought it would be really trivial for her after Haskell. It was
%% not the case: she always complained that Haskell was so logical,
%% but Pascal is a real mess. I had to admit that she was right.
%% So I think the preconception that imperative programming is simple
%% for novices, while functional or OO is harder, is simply not true. I
%% think it is based on the fact that most people who enter universities
%% or courses already have experience in some imperative language.
%%
%% Regards, Christian

This makes total sense to me: Languages that are modeled after the way
people actually think are easier for people to grasp than languages that are
modeled after the way computers are designed to manipulate bits of data.
Pascal, C, C++, and especially assembler are, for this reason, quite
counter-intuitive to someone not familiar with the way computers process
information (and the nuances in the ways different chip architectures are
designed).

At bottom, high-level languages must be implemented in lower-level
languages, or the computer would have no idea how to interpret what the
human is asking it to do; but in terms of one's ability to "get" the syntax
of a computer language, the syntax must be highly abstracted from the
language of the machine.

The pluses:
+ easy to learn
+ easy to use
+ easier to focus on the problem domain rather than on the mechanics of
information processing, thus
+ easier to get work done quickly

The minuses:
- can severely limit one's view of how a computer actually works, thus
trapping one in a single paradigm or computational model, in much the same
way (as the saying goes) everything looks like a nail to someone who has
only a hammer in his/her toolbox. OO is in this respect sort of like the
ultimate Swiss Army Knife, but it is still just one approach of many that
might work and work well in a given problem domain.
- if one's favorite high level language were not available, and only a
lower level language were available, one would be SOL unless one knew how to
program in that lower level language. Everyone _should_ know at least one
low level language well enough to get by in a crunch.

Generally speaking, the pluses of high level languages like Ruby outweigh
the minuses for most applications, since availability of high level
languages is not a serious issue in light of the open source software
movement, the Internet, and the decreasing cost of hardware.

Sincerely,

Bob Calco

Yukihiro Matsumoto

unread,
Sep 7, 2002, 10:26:49 AM9/7/02
to
Hi,

In message "Re: Larry Wall's comments on Ruby"


on 02/09/07, Denys Usynin <usy...@hep.upenn.edu> writes:

|I think this whole Least Surprise Principle is a load of bullshit that
|is invoked far too often for no good reason. It has a fancy name, but I
|translate it to myself as "when matz made Ruby he made sure the way it
|worked made sense to him". Excuse me, isn't it how all languages are(or
|should be) made?

Although I admit the "Principle of Least Surprise" has had a bigger
impact than I expected, I still disagree that all languages follow
the principle.

Language designers are often too near-sighted in designing their
languages. They often focus too much on "what it can do". But in
reality we have to focus on "how it can solve problems" or "how it
makes you feel while programming". I think this is the difference.

I also admit that Ruby offers a bunch of surprises when you meet it
for the first time. But once you become familiar with it, you will
feel comfortable programming in it.

Believe me, not all languages are made equal. Some (or many)
languages just don't make you feel comfortable even after you've
mastered them. Their design just doesn't care how you feel while
you program in them.

If "POLS" is not a proper word for the concept, and you have a better
slogan for it, I'd be glad to switch.

matz.

Florian Frank

unread,
Sep 7, 2002, 11:02:42 AM9/7/02