
Joel Spolsky on languages for web programming


Dido Sevilla

Sep 1, 2006, 7:30:02 AM
http://www.joelonsoftware.com/items/2006/09/01.html

Actually, despite the fact that I love Ruby a lot, I'm inclined to
partially agree with him on this. Presently, our company does have
some Rails-based web applications deployed but they're predominantly
applications geared for use by only a few people (internal client use
only); we've not yet tried to deploy a real public-facing web
application based on Rails. For that, it works really well. We're
taking a wait and see attitude before we attempt to use Rails for any
high load applications; my own experiences attempting to optimize
plain Ruby code for performance have been simultaneously frustrating
and rewarding. I doubt I could do the same with a Rails app. So for
now we're gonna stick with PHP for our public facing web applications,
even if it is even worse for i18n/l10n/m17n applications than Ruby
is...

Yukihiro Matsumoto

Sep 1, 2006, 9:00:25 AM
Hi,

In message "Re: Joel Spolsky on languages for web programming"
on Fri, 1 Sep 2006 20:30:02 +0900, "Dido Sevilla" <dido.s...@gmail.com> writes:

|http://www.joelonsoftware.com/items/2006/09/01.html

I am very proud he mentioned Ruby in one of his essays. Actually, I
agree with his conclusion:

>that's not a safe choice for at least another year or six.

He is a businessman, not a geek, so he does not have to risk himself
using Ruby (and Rails). It doesn't matter. He will not pay me
anything even if he chooses Ruby.

But we disagree in the middle.

> (1) it displays a stunning antipathy towards Unicode and
> (2) it's known to be slow, so if you become The Next MySpace, you'll
> be buying 5 times as many boxes as the .NET guy down the hall.

(1) Although we took a different path to handle the m17n issue than
other Unicode-centric languages did, we don't have any "stunning
antipathy".
(2) Although Ruby runs slower than other languages in those
micro-benchmarks, the real bottleneck in most applications lies in
either the network connection or the database. So we don't have to buy 5
times as many boxes.

matz.

Richard Conroy

Sep 1, 2006, 9:14:33 AM
On 9/1/06, Dido Sevilla <dido.s...@gmail.com> wrote:
> http://www.joelonsoftware.com/items/2006/09/01.html
>
> Actually, despite the fact that I love Ruby a lot, I'm inclined to
> partially agree with him on this.

He is a good writer, and I really like how he cuts through the
rhetoric and is firmly grounded on practical programming matters.

His core point on technology choice is valid:
"How do you decide between C#, Java, PHP, and Python? The only real
difference is which one you know better. If you have a serious Java
guru on your team who has built several large systems successfully
with Java, you're going to be a hell of a lot more successful with
Java than with C#, not because Java is a better language (it's not,
but the differences are too minor to matter) but because he knows it
better. Etc."

His point is valid for Rails too: if you have a big Rails guru on your team,
someone who has built large successful Rails systems, then you pick a Rails
solution.

The performance/scalability of Rails as an enterprise web-fronted system is
continually questioned. People point to Basecamp etc. as examples that Rails can
do it.

But that's missing the point - those systems are successful because those
companies have very good, experienced Rails engineers. Rails n00bs would
probably make a lot of mistakes that clash with the framework and
compromise its scalability.

His wait-and-see attitude wrt Rails also has merit. Rails development
is moving quickly, and in a year's time the major issues, like unicode
support, deployment, the support ecosystem (tools, etc.) and
performance, could be solved problems.

Derek Chesterfield

Sep 1, 2006, 9:41:29 AM
On 1 Sep 2006, at 14:00, Yukihiro Matsumoto wrote:

> Actually, I agree with his conclusion:
>
>> that's not a safe choice for at least another year or six.
>
> He is a businessman, not a geek, so he does not have to risk himself
> using Ruby (and Rails). It doesn't matter. He will not pay me
> anything even if he chooses Ruby.

In the context of the enterprise, I agree too. But only in the sense
that 'safe' implies availability of skills, and perhaps tools to some
extent. IMO, Ruby and Rails are 'safe' in the sense that they have
longevity, which I think would be the next biggest concern of
enterprise users.

Rimantas Liubertas

Sep 1, 2006, 10:07:28 AM
> In the context of the enterprise, I agree too. But only in the sense
> that 'safe' implies availability of skills, and perhaps tools to some
> extent. IMO, Ruby and Rails are 'safe' in the sense that they have
> longevity, which I think would be the next biggest concern of
> enterprise users.

There is another take on what's risky and what's safe:
http://www.infoq.com/articles/From-Java-to-Ruby--Risk


Regards,
Rimantas
--
http://rimantas.com/

Rob Sanheim

Sep 1, 2006, 10:20:32 AM
On 9/1/06, Dido Sevilla <dido.s...@gmail.com> wrote:


I find it amusing that he says Rails is too risky and new, yadda
yadda, but then he goes on to talk about their in-house language,
"Wasabi":

"Wasabi, a very advanced, functional-programming dialect of Basic with
closures and lambdas and Rails-like active records that can be
compiled down to VBScript, JavaScript, PHP4 or PHP5."

So Rails is too risky, but inventing your own language isn't? Did
someone say "not invented here" ??

Also, I could see how looking at unicode in Rails could scare off
large enterprise shops, but the scaling and slowness thing is just FUD.

- Rob
--
http://www.robsanheim.com
http://www.seekingalpha.com
http://www.ajaxian.com

James Edward Gray II

Sep 1, 2006, 10:28:09 AM
On Sep 1, 2006, at 9:20 AM, Rob Sanheim wrote:

> I find it amusing that he says Rails is too risky and new, yadda
> yadda, but then he goes on to talk about their in-house language,
> "Wasabi":

I too found that beyond ironic.

James Edward Gray II

Yukihiro Matsumoto

Sep 1, 2006, 10:36:19 AM
Hi,

In message "Re: Joel Spolsky on languages for web programming"

That indicates that he trusts himself, and not me (Ruby). And I think
he's right.

matz.

kha...@enigo.com

Sep 1, 2006, 10:39:36 AM
> and rewarding. I doubt I could do the same with a Rails app. So for
> now we're gonna stick with PHP for our public facing web applications,
> even if it is even worse for i18n/l10n/m17n applications than Ruby
> is...

An interesting performance test is to take some task and implement it in
Rails or Nitro or IOWA or Camping or whatever, and then implement it in a
PHP framework with equivalent functionality.

I have done some of this using CakePHP, which is a reasonably good PHP web
development framework, and the results are interesting. While PHP will
benchmark faster than Ruby for isolated benchmark tasks, when one starts
looking at frameworks with equivalent capabilities, PHP loses that
performance advantage, at least in the limited testing that I have done so
far.


Kirk Haines


William Grosso

Sep 1, 2006, 10:46:48 AM

Well, to be fair, it's not quite that.

Start with http://www.joelonsoftware.com/articles/FogBugzIII.html --
they had a VBScript app and it needed to be on PHP. So they wrote
a VBScript to PHP compiler.

Good choice, given the market pressures.

Three years later, they've extended that compiler a little in the
direction of a FogBugz-specific DSL and given it a silly name.

Not particularly surprising, nor particularly relevant to language
debates.


Bill

James Edward Gray II

Sep 1, 2006, 10:49:44 AM

Really? I trust you a lot more than me, at least as far as designing
languages goes.

James Edward Gray II

P.S. I've used Spolsky's software and read his books and it just so
happens that I trust you more than him too. ;)


Rob Sanheim

Sep 1, 2006, 10:52:01 AM

On second thought, maybe Joel was just trolling a bit here? Maybe it's
a bit of an in-joke, knowing that the bloggers would go nuts over the
obvious contradiction?

It seems too nutty to be true...
- rob

James Britt

Sep 1, 2006, 10:53:24 AM
Rob Sanheim wrote:

>
> So Rails is too risky, but inventing your own language isn't? Did
> someone say "not invented here" ??


Where would we be but for Not Invented Here?


--
James Britt

http://www.ruby-doc.org - Ruby Help & Documentation
http://www.artima.com/rubycs/ - The Journal By & For Rubyists
http://www.rubystuff.com - The Ruby Store for Ruby Stuff
http://www.jamesbritt.com - Playing with Better Toys

Mike Berrow

Sep 1, 2006, 10:54:27 AM
It seems to me that Joel has accused us of having too much *fun* with
Ruby and with Rails. Way too much. And we all know that fun can't
possibly go hand in hand with productivity and *real* business.

-- Mike Berrow

--
Posted via http://www.ruby-forum.com/.

Yukihiro Matsumoto

Sep 1, 2006, 11:00:51 AM
Hi,

In message "Re: Joel Spolsky on languages for web programming"

on Fri, 1 Sep 2006 23:49:44 +0900, James Edward Gray II <ja...@grayproductions.net> writes:

|> That indicates that he trusts himself, and not me (Ruby). And I think
|> he's right.
|
|Really? I trust you a lot more than me, at least as far as designing
|languages goes.

To rephrase: he has the right to trust himself rather than me. I think
I am a better language designer than him. But at the same time, I think
he is a better programmer for creating enterprisy software.

|P.S. I've used Spolsky's software and read his books and it just so
|happens that I trust you more than him too. ;)

I've read his book, but not used his software, so I may not be
the right person to judge him.

matz.

Richard Conroy

Sep 1, 2006, 11:44:36 AM
On 9/1/06, Rob Sanheim <rsan...@gmail.com> wrote:
> I find it amusing that he says Rails is too risky and new, yadda
> yadda, but then he goes on to talk about their in-house language,
> "Wasabi":

> So Rails is too risky, but inventing your own language isn't? Did


> someone say "not invented here" ??

Yes, *Joel* did. He makes quite a strong case for it:
"In Defense of Not-Invented-Here Syndrome"
http://www.joelonsoftware.com/articles/fog0000000007.html

> Also, I could see how looking at unicode in Rails could scare off
> large enterprise shops, but the scaling and slowness thing is just FUD.

The unicode issue is scary to the enterprise mindset, where having
a valid unicode mechanism that is 'non-unicode centric' still qualifies
as 'unicode antipathy'. The enterprise mindset has grown used to
unicode support being one of those things it doesn't have to think
about in the languages/frameworks it favours.

The scaling/slowness concern, though, I can kind of understand. I get
the feeling that Rails performs properly when you stick to the promoted
model, but if you deviate from it (processing too much yourself rather
than using the stack properly) you can choke your performance. Other
languages are probably more forgiving in this respect.

Don't forget the original context of Joel's post: "Which web technology
would you bet your company on" (and by implication your house &
savings and whatever other assets you have used to get capital).

Stephen Kellett

Sep 1, 2006, 12:08:55 PM
In message <44F8494F...@gmail.com>, James Britt
<james...@gmail.com> writes

>Where would be but for Not Invented Here?

Paying homage to Doug Engelbart rather than Steve Jobs.

Stephen
--
Stephen Kellett
Object Media Limited http://www.objmedia.demon.co.uk/software.html
Computer Consultancy, Software Development
Windows C++, Java, Assembler, Performance Analysis, Troubleshooting

Ben Harper

Sep 1, 2006, 1:05:54 PM
I suspect the 'fun' thing is a bit of a stab at the frequency of the
word 'fun', as well as at the 'programmer happiness' theme, in Ruby
and/or Rails evangelism. I myself don't much like such terms when they
are used to describe what I do.
I love my work, and some of it now involves Ruby, but 'fun', in written
English, has connotations that perhaps do not suit the way I and
like-minded people describe how we feel about our work. I would use it
in conversation, but I would not use it in a banner advertising my
working environment.

M. Edward (Ed) Borasky

Sep 1, 2006, 2:00:58 PM
Yukihiro Matsumoto wrote:
> Hi,

> (2) Although Ruby runs slower than other languages in those
> micro-benchmarks, the real bottleneck in most applications lies in
> either the network connection or the database. So we don't have to buy 5
> times as many boxes.

Allow me to rant on what we performance engineers do for a living ... :)

Most web applications I've encountered are not engineered for
performance -- *ever*. There's this little slogan they teach
programmers, which I've heard repeated numerous times on this list:

Beware premature optimization.

As a performance engineer, I don't consider premature optimization a
sin, of course. But leaving that aside, what it translates to is "make
it work, then make it pretty, then deploy it." Once it's deployed, it's
faster to "throw hardware at it" than it is to do the performance
optimization aka performance *re-engineering*.

Incidentally, that slogan is usually credited to Donald Knuth (who in
turn attributed it to Tony Hoare), and Dijkstra for his part said that
programming should separate the concerns of correctness and efficiency.
Neither *ever* said to worry *only* about correctness. Part of the
discipline of programming is to pay attention to efficiency in equal
measure to correctness. That somehow seems to have been lost.

OK ... on with the rant: :)

1. Those of you who claim to be doing "test-driven development": If
performance testing isn't part of your test-driven development, you
aren't *really* doing test-driven development.

2. Micro-benchmarks are important. A few weeks ago, I ran and profiled a
micro-benchmark on "Matrix", which I posted to the YARV mailing list and
on my RubyForge project page. It's four times as fast in YARV as it is
in Ruby 1.8.5.

Whether it's YARV or some other low-level implementation techniques
really isn't important. What *is* important is that the Ruby "inner
interpreter", to borrow a term from Forth, *must* be as fast as it can
be given the other design constraints like compilers and portability.
The way to achieve this is with profiling on micro-benchmarks.
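A sketch of that kind of micro-benchmark using the standard Benchmark and Matrix libraries; the matrix size and iteration counts here are arbitrary stand-ins, not the figures from the YARV post:

```ruby
require 'benchmark'
require 'matrix'

# Build a random 50x50 matrix and time a couple of hot operations.
m = Matrix.build(50, 50) { rand }

Benchmark.bm(10) do |bm|
  bm.report('multiply') { 10.times { m * m } }
  bm.report('inverse')  { 10.times { m.inverse } }
end
```

Profiling the same operations (e.g. with `ruby -rprofile`) then points at the interpreter-level hot spots worth optimizing.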

3. The future of "throw hardware at it" is multi-core processors and
storage array networks (SANs). If you aren't designing your applications
to take advantage of multi-processors, or if you aren't paying attention
to the "lower-level" I/O characteristics of your applications, they're
going to throw hardware at someone else's code.

Speaking of throwing hardware at it, there are ways to know what *kind*
of hardware to throw at it. Learn them. :)

4. An application with a network bottleneck is very often poorly
designed. One of the goals of performance engineering is *minimizing*
the amount of traffic between the client and the server.

Of course, if you're streaming audio, displaying high-definition video,
etc., the network traffic *is* the application, but an on-line banking
application should *not* have to send back more than a couple of
kilobytes to confirm that I've made a payment.

5. Database "bottlenecks": Most of the "industrial strength" databases
-- Oracle, PostgreSQL, MySQL, SQL Server 2005, DB2, etc. -- are
co-optimized with the operating systems and the platform hardware.
They've done their homework; they've done their performance engineering,
profiling and micro-benchmarking.

As good as they are, a poorly engineered database schema can make the
RDBMS work much harder than it needs to, requiring more hardware than is
necessary.

M. Edward (Ed) Borasky

Sep 1, 2006, 2:07:17 PM
Yes ... I do what I do for a living (mostly in Perl and R, not yet Ruby
or Rails) for two reasons: I enjoy doing it and I get paid well for it.
Take away either of those and I'd go job hunting.


kha...@enigo.com

Sep 1, 2006, 5:13:09 PM
On Sat, 2 Sep 2006, M. Edward (Ed) Borasky wrote:

> 5. Database "bottlenecks": Most of the "industrial strength" databases
> -- Oracle, PostgreSQL, MySQL, SQL Server 2005, DB2, etc. -- are
> co-optimized with the operating systems and the platform hardware.
> They've done their homework; they've done their performance engineering,
> profiling and micro-benchmarking.
>
> As good as they are, a poorly engineered database schema can make the
> RDBMS work much harder than it needs to, requiring more hardware than is
> necessary.

It is very easy, though, for even simple queries on a simple, optimized
DB schema on a well tuned engine to be the single largest bottleneck for a
dynamic web site or web based application.

Take, for example, a site that I am working on today.

It is dynamically generated using data from a simple db, but it is ok for
there to be a short latency between changes to db data and changes
appearing on the site pages.

Querying everything from the db for every request was netting around 35
pages per second -- a little under 3 hundredths of a second per page
generation. Changing the code so that it queries no more than twice a
minute, operating off of cached data between queries, dropped the page
generation to around 5 or 6 thousandths of a second -- about 170ish a
second. For comparison, on my development server I get about 620 page
loads a second from a static version of the content.

The database presents a significant bottleneck that I can fortunately work
around through a little bit of caching. If, for whatever reason, I could
not employ this caching, that db bottleneck could be a much more
significant issue than any Ruby speed issues.
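The caching described above can be sketched in a few lines of Ruby: query the database at most once per TTL window and serve cached rows in between. The block passed to `fetch` stands in for the real query, which is not shown in the post.

```ruby
# A minimal time-based cache: re-runs the block only after the TTL expires.
class TimedCache
  def initialize(ttl_seconds)
    @ttl = ttl_seconds
    @value = nil
    @fetched_at = nil
  end

  # Returns the cached value, refreshing it via the block once stale.
  def fetch
    now = Time.now
    if @fetched_at.nil? || now - @fetched_at > @ttl
      @value = yield
      @fetched_at = now
    end
    @value
  end
end

# Hypothetical usage; `query_listings` is a stand-in, not a real method:
# listings = TimedCache.new(30)
# rows = listings.fetch { query_listings }
```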


Kirk Haines


BTW, just for comparison, a version of this content rendered and delivered
through CakePHP on PHP4, with no db data caching, renders about 17 pages
per second.

Squeamizh

Sep 1, 2006, 7:06:26 PM

James Edward Gray II wrote:

Do you know what ironic means?

James Edward Gray II

Sep 1, 2006, 7:25:54 PM

ironic |īˈränik| |aɪˌrɑnɪk| |ʌɪˌrɒnɪk|
adjective
using or characterized by irony : his mouth curved into an ironic smile.
• happening in the opposite way to what is expected, and typically
causing wry amusement because of this : [with clause ] it was ironic
that now that everybody had plenty of money for food, they couldn't
obtain it because everything was rationed.

-=-=-=-=-=-

My usage of the term seems OK to me.

Joel bashed Ruby for being unproven, then went on to say that they
were using a custom language they invented. By his own definition,
that is unproven. That's "happening in the opposite way to what is
expected" because of his earlier statements and it also causes me
"wry amusement."

Did this attack on my grammar serve some purpose?

James Edward Gray II


Joel VanderWerf

Sep 1, 2006, 7:32:24 PM

After gem install MerriamWebster, ri irony outputs:

3 a (1) : incongruity between the actual result of a sequence of events
and the normal or expected result (2) : an event or result marked by
such incongruity b : incongruity between a situation developed in a
drama and the accompanying words or actions that is understood by the
audience but not by the characters in the play -- called also dramatic
irony, tragic irony

In this case, "tragic irony" seems apt.

..Just another Joel

--
vjoel : Joel VanderWerf : path berkeley edu : 510 665 3407

Bil Kleb

Sep 1, 2006, 8:42:45 PM
Joel VanderWerf wrote:
>
> After gem install MerriamWebster, ri irony outputs:

Dang cool. I missed that one. However, I only get:

Attempting remote installation of 'MerriamWebster'
ERROR: While executing gem ... (NoMethodError)
undefined method `name' for -517611318:Fixnum

on ruby 1.8.2 (2004-12-25) [powerpc-darwin8.2.0]

Later,
--
Bil Kleb
http://fun3d.larc.nasa.gov

Chad Perrin

Sep 1, 2006, 9:02:13 PM
On Sat, Sep 02, 2006 at 08:32:24AM +0900, Joel VanderWerf wrote:
> Squeamizh wrote:
> >James Edward Gray II wrote:
> >>On Sep 1, 2006, at 9:20 AM, Rob Sanheim wrote:
> >>
> >>>I find it amusing that he says Rails is too risky and new, yadda
> >>>yadda, but then he goes on to talk about their in-house language,
> >>>"Wasabi":
> >>I too found that beyond ironic.
> >>
> >>James Edward Gray II
> >
> >Do you know what ironic means?
> >
>
> After gem install MerriamWebster, ri irony outputs:

Considering the responses by both you and James, I guess the real
question here is "Does Squeamizh know what 'ironic' means?"

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]
"The measure of a man's real character is what he would do
if he knew he would never be found out." - Thomas McCauley

Eero Saynatkari

Sep 1, 2006, 9:14:45 PM
Chad Perrin wrote:
> On Sat, Sep 02, 2006 at 08:32:24AM +0900, Joel VanderWerf wrote:
>> >
>> >Do you know what ironic means?
>> >
>>
>> After gem install MerriamWebster, ri irony outputs:
>
> [ Considering the responses by both you and James, I guess the real
> question here is "Does Squeamizh know what 'ironic' means?"

Someone needs more George Carlin. In any case, I
would not necessarily call this 'ironic', rather
'hypocritical' or 'oxymoronic'.

It would approach irony if Joel's big software
project failed because Wasabi was unstable
and broke the whole thing :)

Chad Perrin

Sep 1, 2006, 9:22:18 PM
On Sat, Sep 02, 2006 at 10:14:45AM +0900, Eero Saynatkari wrote:
> Chad Perrin wrote:
> > On Sat, Sep 02, 2006 at 08:32:24AM +0900, Joel VanderWerf wrote:
> >> >
> >> >Do you know what ironic means?
> >> >
> >>
> >> After gem install MerriamWebster, ri irony outputs:
> >
> > [ Considering the responses by both you and James, I guess the real
> > question here is "Does Squeamizh know what 'ironic' means?"
>
> Someone needs more George Carlin. In any case, I
> would not necessarily call this 'ironic', rather
> 'hypocritical' or 'oxymoronic'.

Joel was not being ironic -- rather, the circumstance of Joel's
commentary was itself ironic.


>
> It would approach irony if Joel's big software
> project would fail because Wasabi was unstable
> and broke the whole thing :)

That, too, would be ironic -- even more so.

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]

"A script is what you give the actors. A program
is what you give the audience." - Larry Wall

James Edward Gray II

Sep 2, 2006, 12:26:08 AM
On Sep 1, 2006, at 7:50 PM, Bil Kleb wrote:

> Joel VanderWerf wrote:
>> After gem install MerriamWebster, ri irony outputs:
>
> Dang cool. I missed that one. However, I only get:
>
> Attempting remote installation of 'MerriamWebster'
> ERROR: While executing gem ... (NoMethodError)
> undefined method `name' for -517611318:Fixnum
>
> on ruby 1.8.2 (2004-12-25) [powerpc-darwin8.2.0]

This is going to sound funny, but try it again. A couple of times if
you have to.

I've seen that error before and it seems to come and go. Certain
things seem to stress it more. I often saw it when building the
RDocs for gems, for example.

Interestingly, I have not seen it in some time now. I'm guessing my
switch to an Intel Mac or my upgrade to 1.8.4 resolved the issue.

Hope that helps.

James Edward Gray II


vi...@edgeio.com

Sep 2, 2006, 7:06:25 AM
Yukihiro Matsumoto wrote:
> In message "Re: Joel Spolsky on languages for web programming"
> on Fri, 1 Sep 2006 20:30:02 +0900, "Dido Sevilla" <dido.s...@gmail.com> writes:
> > (2) it's known to be slow, so if you become The Next MySpace, you'll
> > be buying 5 times as many boxes as the .NET guy down the hall.

> (2) Although Ruby runs slower than other languages in those
> micro-benchmarks, the real bottleneck in most applications lies in
> either the network connection or the database. So we don't have to buy 5
> times as many boxes.

Agreed. I manage engineering at Edgeio (http://www.edgeio.com/). We
don't use Rails, but we do use Ruby extensively on our _backend_. All
of our backend was initially written in C++, but I decided to start
migrating it to Ruby because:

1) It massively cut complexity. As an example I replaced a very
primitive, simple messaging middleware system with 700 lines of Ruby
that was significantly more advanced. Adding the same features to the
C++ version would easily have brought it to 3000-4000 lines of code.
More importantly, it took less time to write the Ruby version than to
write the much more primitive C++ code. And that system was my first
ever Ruby app, while I've been doing C/C++ for more than 10 years.

2) We hardly ever max out CPUs. We _can't_ buy boxes with CPUs slow
enough to make it become an issue for most of our servers. The messaging
middleware app handles millions of messages per day and rarely takes up
10% of a single CPU on the servers it runs on.

3) Most of the CPU time used by our Ruby apps is spent waiting for IO.
For the messaging middleware server, a conservative estimate is that
about 80% of the time is spent _in the kernel_ in read() or write()
syscalls. So even assuming that we could get a 10 times speedup of the
code we have written from moving the code back to C/C++, that would
only translate to a reduction in CPU use from 10% to 8.2% (reducing
time spent in userspace from 2% to 0.2%). Heck, even a 100 times
speedup of the userspace code would still only reduce the CPU use from
10% to 8.02%...

4) Servers are cheap. Engineers are not. A relatively senior engineer
easily costs USD 8k-12k a month (salary+social costs). A single
man-month of wasted effort could buy me a lot of extra processing
power.

5) When we scale, the limitations we run into are from most to least
important: disk io capacity, memory, network latency, cpu. Optimizing
algorithms to minimize disk writes, minimize memory usage (granted,
this is an area were we have to be careful with Ruby) and minimize
latency are all more important than reducing cpu usage. As a result,
because we'll generally have to buy/lease more servers based on io
capacity and memory, we have a lot of wasted CPU capacity. For _most_
(not all) of our backend code, spending time speeding up the code would
not save us anything.

6) IF/when we have to rewrite things in C/C++ to speed them up at any
time, it'll still be far more effective to translate small, isolated
parts of working, well-tested Ruby code directly into extensions than
to write the whole apps in C/C++.
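The arithmetic in point 3 is just an Amdahl's-law estimate; a quick sketch with the numbers from the post (10% of one CPU, 80% of that time in the kernel):

```ruby
# Estimate total CPU use after speeding up only the userspace portion.
# kernel time is untouched; only the user-code share shrinks.
def cpu_after_speedup(total_cpu_pct, user_fraction, speedup)
  kernel = total_cpu_pct * (1.0 - user_fraction)
  user   = total_cpu_pct * user_fraction
  kernel + user / speedup
end

puts cpu_after_speedup(10.0, 0.2, 10)   # ~8.2% after a 10x userspace speedup
puts cpu_after_speedup(10.0, 0.2, 100)  # ~8.02% after a 100x speedup
```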

Vidar

Alder Green

Sep 2, 2006, 9:53:59 AM
On 9/1/06, Dido Sevilla <dido.s...@gmail.com> wrote:
> http://www.joelonsoftware.com/items/2006/09/01.html
>
> Actually, despite the fact that I love Ruby a lot, I'm inclined to
> partially agree with him on this. Presently, our company does have
> some Rails-based web applications deployed but they're predominantly
> applications geared for use by only a few people (internal client use
> only); we've not yet tried to deploy a real public-facing web
> application based on Rails. For that, it works really well.

If anything, your positive experience so far should cast an
optimistic light on Rails' prospects in a wider-audience
application.

> high load applications; my own experiences attempting to optimize
> plain Ruby code for performance have been simultaneously frustrating

> and rewarding. I doubt I could do the same with a Rails app.

Rails performance optimization is very different from Ruby
optimization. There's only so much you can do to optimize Ruby for
simple operations. Web applications, however, are usually complex.
Solving, or even merely identifying, bottlenecks involves making and
testing broad changes rapidly. And that's where Ruby/Rails really
shines.

There are many significant optimizations I was able to discover and
implement in Rails that I simply wouldn't have been able to do in PHP
because of its less modular and malleable nature. In fact, I think
VHLLs like Ruby (in conjunction with appropriate frameworks like Rails)
have a solid advantage there over lower-level languages, and this edge
will only increase in the future.

> So for
> now we're gonna stick with PHP for our public facing web applications,
> even if it is even worse for i18n/l10n/m17n applications than Ruby
> is...

The first steps I took in Rails were rewriting existing PHP
applications with it, so I have a certain estimate of how Rails
compares to PHP for very similar applications. Keep in mind those
initial translations were completely naive; I didn't know Rails at the
time, and I didn't do anything like the optimizations mentioned above.

Generally, those naive applications weren't significantly slower than
their PHP equivalents (which themselves were old, tested, optimized
code). At most, the PHP was twice as fast (would handle twice the
requests over the same time period). And that's a maximum estimate -
generally, any difference was hardly noticeable.

In my experience a lot of production PHP applications can be optimized
to at least 2-3 times their existing performance (and that's
assuming there wasn't really sloppy programming going on, in which
case the gain might easily be many times more). So I don't see why
Rails is any less viable than PHP in that regard. In fact, quite the
contrary: Rails applications I worked on frequently ended up getting
some refactoring after going to production, becoming 2-3 times more
efficient. Refactoring a working PHP application, however, is commonly
such a PITA that I was happy to close the hood once the engine was
running and not open it again except in the unfortunate event of a
bug.

Regards,
-Alder

Jeffrey Schwab

Sep 2, 2006, 10:27:01 AM
James Edward Gray II wrote:
> On Sep 1, 2006, at 7:50 PM, Bil Kleb wrote:
>
>> Joel VanderWerf wrote:
>>> After gem install MerriamWebster, ri irony outputs:
>>
>> Dang cool. I missed that one. However, I only get:
>>
>> Attempting remote installation of 'MerriamWebster'
>> ERROR: While executing gem ... (NoMethodError)
>> undefined method `name' for -517611318:Fixnum
>>
>> on ruby 1.8.2 (2004-12-25) [powerpc-darwin8.2.0]

What am I doing wrong?

cmd>gem install MerriamWebster
Attempting local installation of 'MerriamWebster'
Local gem file not found: MerriamWebster*.gem


Attempting remote installation of 'MerriamWebster'

Updating Gem source index for: http://gems.rubyforge.org
ERROR: While executing gem ... (Gem::GemNotFoundException)
Could not find MerriamWebster (> 0) in the repository


Platform is Windows XP; don't worry, I already hate myself.

Alvin Ryder

Sep 2, 2006, 6:11:48 PM
Dido Sevilla wrote:
> http://www.joelonsoftware.com/items/2006/09/01.html
>
> Actually, despite the fact that I love Ruby a lot, I'm inclined to
> partially agree with him on this. Presently, our company does have
> some Rails-based web applications deployed but they're predominantly
> applications geared for use by only a few people (internal client use
> only); we've not yet tried to deploy a real public-facing web
> application based on Rails. For that, it works really well. We're
> taking a wait and see attitude before we attempt to use Rails for any

> high load applications; my own experiences attempting to optimize
> plain Ruby code for performance have been simultaneously frustrating
> and rewarding. I doubt I could do the same with a Rails app. So for

> now we're gonna stick with PHP for our public facing web applications,
> even if it is even worse for i18n/l10n/m17n applications than Ruby
> is...

Interesting post Dido ;-)

I have never seen a single web project fail due to slow language
execution, yet I have seen many fail because the developers got bogged
down in complexity, bloat and sluggish time to market!

After a couple of weeks' worth of work in Perl, people would say "wow,
that's totally awesome", but with Java they say "hmmm, is that all?"

Java and C# are no guarantee for success. Sure the production
environments are solid but you cannot ignore the development economics,
you seem to need 10 instead of 3 people and 5 times as long. These are
serious issues!

Ok, sure Java's OO may be nicer than Perl 5's but once you brew
HTML/Javascript/JSP/JSTL/EL/tags/JSF or Struts together the result
isn't exactly what I'd call pretty. Java is in no way a safe bet.

How about C#, well it runs in Windows and without serious and expensive
firewalls you just can't go anywhere near the Internet. Yes, .NET is
prettier than Java's web stack, but there is still way too much to
learn, and when the framework doesn't do what you need you're left in
the cold big time. Again, .NET is no guarantee of success either.

Ruby and Rails just get straight to the point. They make common things
easy and elegant. If execution speed really is a problem then I reckon
it'll get fixed.

As for developing major sites with Rails, most managers don't have the
balls. They'd rather pay millions to get a java solution, it isn't
their money on the budget so they gutlessly pour it down the java hole
and hope for the best. If the project fails they blame the team or
throw more money and bodies at the problem, of course it's not java's
fault or theirs.

Anyway, I don't hold any prejudice against Java or C#, but they are in no
way a safe bet.

Cheers.

Chad Perrin

Sep 2, 2006, 7:56:35 PM
On Sun, Sep 03, 2006 at 07:15:32AM +0900, Alvin Ryder wrote:
>
> As for developing major sites with Rails, most managers don't have the
> balls. They'd rather pay millions to get a java solution, it isn't
> their money on the budget so they gutlessly pour it down the java hole
> and hope for the best. If the project fails they blame the team or
> throw more money and bodies at the problem, of course it's not java's
> fault or theirs.
>
> Anyway, I don't hold any prejudice against Java or C#, but they are in no
> way a safe bet.

Sure it is. You'll (almost) never have to fear for your job based on a
decision to go with Java or a Microsoft "solution", even if it is
entirely the WRONG decision. You could cost the company millions, end
up getting dozens of people laid off, and tank the entire project, but
if the language by which you did so is Java or C# you may still have job
security (as long as you haven't made other high-profile bad decisions).
The problem with job security in that circumstance only really arises if
there was a bitter power struggle over whether to go with Java or .NET,
and your side "won", then the project tanked at a cost of millions. The
opposing "side" might just blame the language/framework decision.

On the flipside, even where from a technical standpoint it's almost
impossible to avoid thinking something like Ruby on Rails, or Perl's
Catalyst, or Python's Django, is the best option, you may well find
yourself losing a job even if you made the right decision and the
project was well on its way to being a howling success. All it takes is
a poorly-timed change in management structure, and they may junk all the
work that has already been done at a cost of millions to rewrite
everything in Java or C# (or, God forbid, VB.NET), and fire you and all
your buddies for doing great work very quickly in the "wrong" language.

Corporate politics. Whee.

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]

Ben Franklin: "As we enjoy great Advantages from the Inventions of
others we should be glad of an Opportunity to serve others by any
Invention of ours, and this we should do freely and generously."

William Grosso

Sep 2, 2006, 8:45:54 PM

At the risk of talking to a wall ... give me a break.

Ruby's a very nice language. Rails is an amazing framework.

But. They are neither dominant nor even close to universally
appropriate. They have things they do well, and they have things
they don't do well, and they have things they don't do at all.
And they have design foci which make them appropriate for some
tasks, and not appropriate for others.

All of which should be completely obvious.

Here's the sermon: Pretending that decisions you don't understand
were made entirely for political reasons, or because the people making
the decision are stupid, is a sure-fire way to prevent yourself from
ever learning anything. Instead of indulging in free-form bile, why
not ask "What would have to be true for that to be the right decision?"

You'd be surprised how much insight such a simple question can
generate.


William Grosso

Chad Perrin

Sep 2, 2006, 8:50:12 PM
On Sun, Sep 03, 2006 at 09:45:54AM +0900, William Grosso wrote:
>
> At the risk of talking to a wall ... give me a break.
>
> Ruby's a very nice language. Rails is an amazing framework.
>
> But. They are neither dominant nor even close to universally
> appropriate. They have things they do well, and they have things
> they don't do well, and they have things they don't do at all.
> And they have design foci which make them appropriate for some
> tasks, and not appropriate for others.

Who said otherwise? And why are you top-posting?


>
> All of which should be completely obvious.
>
> Here's the sermon: Pretending that decisions you don't understand
> were made entirely for political reasons, or because the people making
> the decision are stupid, is a sure-fire way to prevent yourself from
> ever learning anything. Instead of indulging in free-form bile, why
> not ask "What would have to be true for that to be the right decision?"

Perhaps you should read what I said a second, and maybe even third,
time. In paraphrase, it was (summarized):

Regardless of how good or bad a decision a given language is for a
given task, Ruby is more likely to get you fired than Java.


>
> You'd be surprised how much insight such a simple question can
> generate.

You might be surprised by how much of a difference actually reading and
trying to understand makes, as opposed to jumping to conclusions about
someone's malicious intent regarding a discussion of the corporate
politics of language and tool choice.

Despite the fact it got this far via top-posting, a no-no here at
ruby-talk/comp.lang.ruby, I'll leave the text you quoted at the bottom
so you can more easily peruse it again at your leisure.

--

CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]

"The first rule of magic is simple. Don't waste your time waving your
hands and hopping when a rock or a club will do." - McCloctnick the Lucid

Christian Neukirchen

Sep 3, 2006, 10:03:30 AM
Bil Kleb <Bil....@NASA.gov> writes:

> Joel VanderWerf wrote:
>> After gem install MerriamWebster, ri irony outputs:
>
> Dang cool. I missed that one. However, I only get:
>
> Attempting remote installation of 'MerriamWebster'
> ERROR: While executing gem ... (NoMethodError)
> undefined method `name' for -517611318:Fixnum
>
> on ruby 1.8.2 (2004-12-25) [powerpc-darwin8.2.0]

Sounds like the OS X symbol table overflow bug. Update your Ruby.

> Bil Kleb
--
Christian Neukirchen <chneuk...@gmail.com> http://chneukirchen.org

David Vallner

Sep 3, 2006, 11:18:30 AM
Gautam Dey wrote:

> On 9/1/06, James Edward Gray II <ja...@grayproductions.net> wrote:
>>
>> On Sep 1, 2006, at 9:20 AM, Rob Sanheim wrote:
>>
>> > I find it amusing that he says Rails is too risky and new, yadda
>> > yadda, but then he goes on to talk about their in-house language,
>> > "Wasabi":
>>
>> I too found that beyond ironic.
>>
>> James Edward Gray II
>>
>>
> I didn't. Depending on what the language needs to do, it would be easy
> to write a language. From what he wrote, it sounds like Wasabi was
> written for his specific domain, and so of course he would trust it
> more than some other general-purpose language written by someone else.
> Now, I'm not saying that others should trust Wasabi. I know I would
> not, but that's just me trusting what I know, more than just blindly
> trusting Joel.
>


And another point is that quite a few Ruby frameworks amount to
defining a domain-specific language in Ruby - cf. Og data definitions,
Puppet, rake. There's a (maybe not so fine) line between a very
specific framework and a DSL that just gets crossed, and I don't believe
Rubyists are innocent enough to throw the first stone.
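The internal-DSL point can be made concrete with a toy sketch. The `task` helper below is my own hypothetical illustration in the spirit of rake, not rake's actual implementation: plain Ruby methods and blocks read like a little declarative language, with no separate parser involved.

```ruby
# Toy internal DSL: `task` is a hypothetical helper defined right here,
# not rake's real API. It registers a named block in a lookup table.

TASKS = {}

def task(name, &body)
  TASKS[name] = body
end

# This reads like a declaration, but it is ordinary Ruby syntax:
# a method call with a symbol argument and a block.
task :greet do
  "hello from the DSL"
end

puts TASKS[:greet].call
```

The "language" here is just Ruby's own syntax bent to a purpose, which is exactly why such DSLs rest on a well-understood foundation.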

Then again, if Ruby is to be accused of being too much of an unsafe toy,
might as well go the full way :)

David Vallner

David Vallner

Sep 3, 2006, 11:29:26 AM
Chad Perrin wrote:
> On Sun, Sep 03, 2006 at 09:45:54AM +0900, William Grosso wrote:
>> Here's the sermon: Pretending that decisions you don't understand
>> were made entirely for political reasons, or because the people making
>> the decision are stupid, is a sure-fire way to prevent yourself from
>> ever learning anything. Instead of indulging in free-form bile, why
>> not ask "What would have to be true for that to be the right decision?"
>
> Perhaps you should read what I said a second, and maybe even third,
> time. In paraphrase, it was (summarized):
>
> Regardless of how good or bad a decision a given language is for a
> given task, Ruby is more likely to get you fired than Java.
>


To be fair, it's not just corporate politics. Statistically, it's more
likely a development house will have a strong base of Java developers or
C# developers (C#, while being very young and so far an abomination unto
Nuggan, is reasonably Java-compatible), and starting a Rails project
means you'll probably have to get people with no Ruby experience on the
team, or create a maintenance burden on the company if the original team
falls apart and leaves for other companies.

While the programming language decision might or might not have anything
to do with whether the project succeeds, choosing a *locally* unproven
language DOES make the project inherently higher-risk, and makes the
managers overall nervous - whence the likelihood of getting fired being
higher. It's not punishment for your failure, it's more for all the
other mess you could've caused even if the project succeeded, even if
the management might not be consciously aware of that.

How good a language or the frameworks for it are to initially develop
something in is not (maybe not by far) the most important factor when
making a decision.

David Vallner

David Vallner

Sep 3, 2006, 12:01:14 PM
Utter pants. I mean, you used the word "bloat", which should make people
lose any debate by default.

Alvin Ryder wrote:
> Java and C# are no guarantee for success.

Neither is Ruby / Rails. *No technology* is a guarantee for success, no
technology ever was, and I'll bet a gold bar against a plastic spoon no
technology ever will. Technology used is a very important decision to
make, but it never single-handedly moves you from doable to undoable or
vice versa.

> you seem to need 10 instead of 3 people and 5 times as long.

Pure, unadulterated shite. Give me numbers. Credible statistics and real
research, not random anecdotal success stories that are too pathetic to
sell Herbalife diet pills.

Also, initial development cost isn't a very important factor. Recall
your uni software lifecycle charts about how much of a project's life is
maintenance. For a successful project, the numbers are very much true.
With a successful product comes the responsibility of supporting it and
keeping it successful, and in some cases this responsibility creates
ongoing costs that dwarf the initial development horribly.

> Ok, sure Java's OO may be nicer than Perl 5's but once you brew
> HTML/Javascript/JSP/JSTL/EL/tags/JSF or Struts together the result
> isn't exactly what I'd call pretty. Java is in no way a safe bet.

No one cares about pretty. It's also a completely irrelevant issue when
deciding on implementation language if you're at least remotely responsible.

> How about C#, well it runs in Windows and without serious and expensive
> firewalls you just can't go anywhere near the Internet.

You need to tighten off Unix-based servers too. Heck, there are even
serious and expensive firewalls for Linux around too, because not
everyone has an in-house iptables guru.

> Ruby and Rails just get straight to the point. They make common things
> easy and elegant.

Sometimes things aren't so common. Ruby and Rails DO have faults. Just
google around; I'm not going to go name-calling, out of respect and out
of a sense of realism - every technology has flaws, and any mudslinging
would only lead to a pointless flamewar. Some of the complaints are
uneducated rants and/or whining, but some of them are valid.

And if you do NOT go out and learn about these flaws, what impact they
could have, and how severe that impact would be under the circumstances
of your project, and be fully aware of them when making the
implementation-technology decision, then your decision may cause a lot
of trouble.

> If execution speed really is a problem then I reckon
> it'll get fixed.

Speaking purely theoretically, Ruby cannot be made as performant as
Java or C# could be made if they had ideally performing implementations.
Latent typing makes it almost impossible to do certain optimizations
that static typing allows. That's pure fact. Of course, that's not to
say Ruby can't be fast enough - but there have been people with more
experience on the performance side of software development who have
talked about that much better.
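A minimal Ruby sketch of the latent-typing point (my own illustration, not from the thread): because classes stay open at runtime, a compiler cannot pin a call site to one method body the way a statically typed compiler can.

```ruby
# Why `c.area` can't be statically bound: the method behind it can
# change after the object already exists.

class Circle
  def initialize(r)
    @r = r
  end

  def area
    3.14159 * @r * @r
  end
end

c = Circle.new(2)
first = c.area # uses the original definition

# Reopening the class at runtime is legal Ruby; any optimization that
# had inlined the old `area` body would now be wrong.
class Circle
  def area
    42
  end
end

second = c.area # same object, same call site, new behavior
```

This is why Ruby implementations lean on runtime techniques (caching, deoptimization) rather than ahead-of-time method binding.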

> As for developing major sites with Rails, most managers don't have the
> balls.

I advise you to sit through freshman year at a management school. It's
the managers' job to "not have balls" and avoid risk when there's
apparently nothing to be had from taking it. If you want to be a Ruby
advocate, you need to be able to persuade them, not yourself, of the
advantages of using it.

> They'd rather pay millions to get a java solution, it isn't
> their money on the budget so they gutlessly pour it down the java hole
> and hope for the best. If the project fails they blame the team or
> throw more money and bodies at the problem, of course it's not java's
> fault or theirs.

That's because it's not. Projects never fail purely on a technology
choice - more often than not they fail from the results of bad project
planning (assigning novice programmers to large projects that will
probably go over their heads) or from mistaken business objectives, as
whoever contracted the software finds he isn't really interested in
what the tech demos had to show at all.

And the stereotype of lazy management that never gets punished is good
for making Dilbert strips - in real life, it probably doesn't hold
true outside of a select few huge moloch companies or, on the opposite
side of the spectrum, small short-lived hick-led shops where the boss's
kids and nephews get all sorts of crap assigned to earn a better
allowance. In a well-led company with working internal management
processes, when the shit hits the fan, everyone gets the stink.

David Vallner

Alex Young

Sep 3, 2006, 12:40:35 PM
That's not quite the same - those DSL's build upon a known and well
understood foundation, because they use Ruby's syntax to their own ends.
I'm inferring from the very little information that's out there that
Wasabi has its own parser, and that makes it a very, very different
beast to a DSL in the sense that I've come across the term in Ruby.

--
Alex

James Britt

Sep 3, 2006, 12:55:12 PM
Alex Young wrote:


> I'm inferring from the very little information that's out there that
> Wasabi has its own parser, and that makes it a very, very different
> beast to a DSL in the sense that I've come across the term in Ruby.
>

Some Wasabi info:

http://www.joelonsoftware.com/items/2006/09/01b.html
http://programming.reddit.com/info/g0fa/comments

--
James Britt

"Simplicity of the language is not what matters, but
simplicity of use."
- Richard A. O'Keefe in squeak-dev mailing list

Alex Young

Sep 3, 2006, 1:13:24 PM
David Vallner wrote:
> Utter pants. I mean, you used the word "bloat", which should make people
> lose any debate by default.
>
> Alvin Ryder wrote:
>> Java and C# are no guarantee for success.
>
> Neither is Ruby / Rails. *No technology* is a guarantee for success, no
> technology ever was, and I'll bet a gold bar against a plastic spoon no
> technology ever will. Technology used is a very important decision to
> make, but it never single-handedly moves you from doable to undoable or
> vice versa.
That's the wrong argument to pick. Try calculating the full dynamics of
a modern metropolitan water supply network with just pen and paper.
Technological advances *do* move us from undoable to doable, and it's
specific technologies that do it.


>> you seem to need 10 instead of 3 people and 5 times as long.
>
> Pure, unadulterated shite. Give me numbers. Credible statistics and real
> research, not random anecdotal success stories that are too pathetic to
> sell Herbalife diet pills.

I'm not going to address this - research on this level is heavily
funded, and heavily trend-driven. The answers you get depend too
heavily on what questions you ask.

> Also, initial development cost isn't a very important factor. Recall
> your uni software lifecycle charts about how much of a project's life is
> maintenance. For a successful project, the numbers are very much true.
> With a successful product comes the responsibility of supporting it and
> keeping it successful, and in some cases this responsibility creates
> ongoing costs that dwarf the initial development horribly.

No argument there whatsoever.

>> Ok, sure Java's OO may be nicer than Perl 5's but once you brew
>> HTML/Javascript/JSP/JSTL/EL/tags/JSF or Struts together the result
>> isn't exactly what I'd call pretty. Java is in no way a safe bet.
>
> No one cares about pretty. It's also a completely irrelevant issue when
> deciding on implementation language if you're at least remotely
> responsible.

Actually, pretty does matter. The comfort of a problem solver directly
impacts his/her approach to a problem. That's just human nature.

> Speaking purely theoretically, Ruby cannot be made as performant as
> Java or C# could be made if they had ideally performing implementations.
> Latent typing makes it almost impossible to do certain optimizations
> that static typing allows. That's pure fact.

I remain unconvinced by this - and it's mainly JIT optimisation that
keeps me on the fence. Dynamic optimisations can beat static - but not
in all cases. I believe this is what one calls an "open research" question.

--
Alex

Joseph

Sep 3, 2006, 2:39:09 PM
Although I respect Joel very much, I believe he makes a fundamental
mistake in his reasoning.

Basically what he is saying can be deconstructed this way:

* Do not risk developing in new cutting-edge technology, even if
successful proofs of concept are already out there (37signals et al.)
* Use what most people use: PHP / J2EE / .Net not what most experts
tell you to use. Communities and support are paramount.
* Corporations and the people in those organizations favor safety, if
your job is on the line go with the tried and true. Take no risks.

All three assumptions rely on a single assumption: FEAR.

* Fear the technology would eventually not deliver.
* Fear the support will not be sufficient.
* Fear regarding your job safety as a corporate developer or manager
who chooses Ruby or Ruby on Rails for some mission critical project.

All assumptions are wrong.

The only way significant progress is accomplished is precisely through a
combination of FAITH and COURAGE. That will make you stand out
anywhere.

The ideal place for those characteristics is inside a Startup or inside
of a bold, courageous corporation! It is not about the size of the
organization though, it is about the courage and boldness of the people
inside those companies.

People forget how the Internet, yes the OLD Internet was built. It was
done on new technology (www, http, mosaic, Perl), new development
models (open source, collaboration), new business objectives (community
first, users, and yes finally profits too.)

So, this is my take on this issue regarding Ruby and Ruby on Rails:

Do it, risk it, it's worth it.

And the biggest advantage of Joel's thinking, for you, would be that
neither he nor the corporations who think like he does (most of them) will
be your competition. So when they do have some serious issues to
tackle, like that huge Framework called [insert-your-safe-choice-here]
trying to bend over backwards to do what needs to be done fast... you will
have the last laugh.

Best Regards,

Jose L. Hurtado
Web Developer
Toronto, Canada

Yukihiro Matsumoto wrote:
> Hi,


>
> In message "Re: Joel Spolsky on languages for web programming"

> on Fri, 1 Sep 2006 23:49:44 +0900, James Edward Gray II <ja...@grayproductions.net> writes:
>
> |> That indicates that he trusts himself, and not me (Ruby). And I think
> |> he's right.
> |
> |Really? I trust you a lot more than me, at least as far as designing
> |languages goes.
>
> To rephrase, he has the right to trust himself more than me. I think I am a
> better language designer than he is. But at the same time, I think he
> is a better programmer for creating enterprisy software.
>
> |P.S. I've used Spolsky's software and read his books and it just so
> |happens that I trust you more than him too. ;)
>
> I've read his book, but not used his software, so that I may not be
> the right person to judge him.
>
> matz.

Phlip

Sep 3, 2006, 3:43:32 PM
Joseph wrote:

> Although I respect Joel very much, I believe he makes a fundamental
> mistake in his reasoning.

Joel is such a good writer that sometimes his jaw-dropping errors are
impossible to refute. (And don't encourage him; he loves it when you fight
back!)

> Basically what he is saying can be deconstructed this way:
>
> * Do not risk developing in new cutting edge technology. Even if
> successful proof of concepts are already out there (37 signals et. al)
> * Use what most people use: PHP / J2EE / .Net not what most experts
> tell you to use. Communities and support are paramount.

The open source tools that succeed must have higher technical quality than
the Daddy Warbucks tools. The latter can afford to buy their communities and
"support" networks. Because an open source initiative cannot buy its
community and marketing, only the strong survive, and their early adopters
will form this community spontaneously. They will provide the true
word-of-mouth advertising that marketing tends to simulate.

And I am sick and tired of seeing shops dragged down by some idiotic
language choice made between the marketeers and a computer-illiterate
executive.

> * Corporations and the people in those organizations favor safety, if
> your job is on the line go with the tried and true. Take no risks.

Ah, so looking like you are following best practices is more important than
doing everything you can to ensure success. Gotcha!

Yes, I have seen that up close, too!

> All three assumptions rely on a single assumption: FEAR.
>
> * Fear the technology would eventually not deliver.
> * Fear the support will not be sufficient.
> * Fear regarding your job safety as a corporate developer or manager
> who chooses Ruby or Ruby on Rails for some mission critical project.

Yup - that's the Fear, Uncertainty, and Doubt formula that Microsoft
(among others) uses all the time. They have tried, over and over again, to
FUD Linux. Their CEO will get up on stage and say incredibly stupid things,
like "if an open source platform fails you, there is nobody you can go to
for help!" He means there's nobody you can sue. As if you could go to MS
for help without paying through the nose...

Oh, Joel is pro-Linux, right? What's the difference??

> All assumptions are wrong.

Better, fear that your boss will experience misguided fear.

--
Phlip
http://c2.com/cgi/wiki?ZeekLand <-- NOT a blog!!!


Vidar Hokstad

Sep 3, 2006, 4:25:06 PM

Joseph wrote:
> Although I respect Joel very much, I believe he makes a fundamental
> mistake in his reasoning.
>
> Basically what he is saying can be deconstructed this way:
>
> * Do not risk developing in new cutting-edge technology, even if
> successful proofs of concept are already out there (37signals et al.)
> * Use what most people use: PHP / J2EE / .Net not what most experts
> tell you to use. Communities and support are paramount.
> * Corporations and the people in those organizations favor safety, if
> your job is on the line go with the tried and true. Take no risks.
>
> All three assumptions rely on a single assumption: FEAR.

No. They rely on sound risk management principles.

> * Fear the technology would eventually not deliver.

Replace "Fear" with "Risk" and the above is reasonable if your company
does not have people experienced in a particular technology. And the fact
is that today it is still far harder to find people skilled at Ruby than
at many other languages. More importantly, there is too little experience
with many Ruby technologies for a company with no Ruby experience to
_know_ whether Ruby will be appropriate for a specific project.

> * Fear the support will not be sufficient.

Replace "Fear" with "Risk" again. The company I work for, Edgeio, uses
PHP for our frontend code (but Ruby for our backend) because when we
started building it I had concerns about the availability of people
skilled with Ruby in general or Rails in particular.

When we started hiring those concerns were validated: It's proved
extremely hard to find people with Ruby experience. While it's
certainly getting easier rapidly, not everyone can afford to take the
risk. In our case I decided to start phasing Ruby in for small self
contained components in our backend, and gradually take it from there
as we get enough Ruby skills through training or hiring, which has
proven to work well and meant that in the event that we'd run into
unforeseen problems, the effort involved in switching back to another
language would have been limited.

> * Fear regarding your job safety as a corporate developer or manager
> who chooses Ruby or Ruby on Rails for some mission critical project.

Which is very valid if you make a choice detrimental to the company,
regardless which language it involves. As much as I love working with
Ruby, if someone working for me picked it for something inappropriate,
and the project tanked badly, they certainly would have to take the
responsibility for the language choice. If you don't have Ruby skills,
or your team doesn't have sufficient Ruby skills, or there aren't
enough skilled Ruby developers available in your location, picking it
for a high-risk project will certainly not speak in your favor with any
risk-conscious manager.

"Fear" as you say, or "risk", is an important decision factor for any
conscientious manager. Deciding what level of risk is appropriate for a
project vs. the potential payoffs is one of the most important skills a
manager must have to make good decisions.

The key is whether you/your team has or can easily acquire the skills
required to minimise the risks and maximise the payoff. For many teams
that will not be the case when dealing with any specific new tech.

As for "successful proofs of concept", they mean nothing unless a) you
have the same skills and resources as the company in question, and b)
your project is sufficiently similar. Which means most decisions about
technology tend to boil down to 1) what your team knows to a certain
degree, 2) which technologies are the most widely deployed. Ideally
you're looking for an intersection.

_Sometimes_ the payoff in trying a technology that your team is
inexperienced with or that isn't widely deployed is large enough to
outweigh the risks, or the risks are mitigated by your team's experience
(in the case of tech that isn't widely deployed) or by the available
pool of external experience (in the case where your team doesn't have
the skills), but that is not a decision to take lightly.

I am all for using Ruby, and I think a lot of companies that aren't
using Ruby could get great benefit from testing it. But on low impact,
low risk, simple projects first. Not because Ruby in itself is
inherently high risk, but because few companies have enough experience
with Ruby to jump right into using it on a large or high profile
project.

Vidar

Alvin Ryder

Sep 3, 2006, 6:29:23 PM
David Vallner wrote:
> Utter pants. I mean, you used the word "bloat", which should make people
> lose any debate by default.
>

I don't like bloated software; it is unnecessary.

> Alvin Ryder wrote:
> > Java and C# are no guarantee for success.
>
> Neither is Ruby / Rails. *No technology* is a guarantee for success, no
> technology ever was, and I'll bet a gold bar against a plastic spoon no
> technology ever will. Technology used is a very important decision to
> make, but it never single-handedly moves you from doable to undoable or
> vice versa.
>

There are many factors required for success, and I don't believe any one
factor guarantees it, but interestingly it can take as little as one
element gone wrong to ruin everything.

> > you seem to need 10 instead of 3 people and 5 times as long.
>
> Pure, unadulterated shite. Give me numbers. Credible statistics and real
> research, not random anecdotal success stories that are too pathetic to
> sell Herbalife diet pills.
>

The "10 to 3" ratio wasn't meant to be taken literally - surely you don't
think otherwise? And can you tell me where I can get such "credible
statistics and real research" from?

Are you saying all languages yield the same level of productivity? If
they aren't equally productive then how much more productive is Java
over C++ or VB over assembler? Do you need "credible statistics and
research" to answer the question?

> Also, initial development cost isn't a very important factor. Recall
> your uni software lifecycle charts about how much of a project's life is
> maintenance. For a successful project, the numbers are very much true.
> With a successful product comes the responsibility of supporting it and
> keeping it successful, and in some cases this responsibility creates
> ongoing costs that dwarf the initial development horribly.
>

I disagree; the initial cost is vital. Most projects get approved or
not approved based on that initial cost, and if that money is drained on
developers trying to tame an unwieldy platform instead of building the
actual system, then we have a problem, don't we?

> > Ok, sure Java's OO may be nicer than Perl 5's but once you brew
> > HTML/Javascript/JSP/JSTL/EL/tags/JSF or Struts together the result
> > isn't exactly what I'd call pretty. Java is in no way a safe bet.
>
> No one cares about pretty. It's also a completely irrelevant issue when
> deciding on implementation language if you're at least remotely responsible.
>

I care about pretty.

> > How about C#, well it runs in Windows and without serious and expensive
> > firewalls you just can't go anywhere near the Internet.
>
> You need to tighten off Unix-based servers too. Heck, there are even
> serious and expensive firewalls for Linux around too, because not
> everyone has an in-house iptables guru.
>

True, no platform is 100% impervious to attack but some are less secure
than others.

> > Ruby and Rails just get straight to the point. They make common things
> > easy and elegant.
>
> Sometimes things aren't so common. Ruby and Rails DO have faults. Just
> google around; I'm not going to go name-calling, out of respect and out
> of a sense of realism - every technology has flaws, and any mudslinging
> would only lead to a pointless flamewar. Some of the complaints are
> uneducated rants and/or whining, but some of them are valid.
>

Yes I know all platforms have faults and wish lists, I didn't think
otherwise.

> And if you do NOT go out and learn about these flaws, and what impact
> they could have, and be fully aware of them when making the
> implementation technology decision on a project to consider the severity
> of their impact under the circumstances of your project, then your
> decision may cause a lot of trouble.
>

Fair enough, I agree. I think software should be published with
specifications and limits, as they do in other industries: this is a 100
ohm resistor +/- 2%, capable of running at these temperatures, it
handles this much power... but in software it's just "blah" and you have
to discover the limits yourself (ouch).

> > If execution speed really is a problem then I reckon
> > it'll get fixed.
>
> Speaking purely theoretically, Ruby can not be made as performant as
> Java or C# could be made if they had ideally performing implementations.
> Latent typing makes it almost impossible to do certain optimizations as
> static typing does. That's pure fact. Of course, it's not saying Ruby
> can't be fast enough - but there have been people with more experience
> at the performance side of software development that talked much better
> about that
>

I'm not sure how fast or slow Ruby is but if it's as fast as Perl I'll
be happy enough. Yes I know C is faster but I need fast development
times too.

> > As for developing major sites with Rails, most managers don't have the
> > balls.
>
> I advise you go through freshman year at a management school. It's
> the managers' job to "not have balls" and not take a risk when there's
> apparently nothing to be had from taking it. If you want to be a Ruby
> advocate, you need to be able to persuade them, not yourself, of the
> advantages of using it.
>

I've worked with Harvard-level managers; they seemed to think it *was
their job to have balls*, which is the opposite of what you're saying. I
prefer to work with managers who have the knowledge, intelligence,
energy and conviction to back up their decisions.

Besides that, the choice of language is usually mine; that's why I
gravitate to the more productive ones. In my experience run-time
performance is rarely an issue, but development time is.

> > They'd rather pay millions to get a java solution, it isn't
> > their money on the budget so they gutlessly pour it down the java hole
> > and hope for the best. If the project fails they blame the team or
> > throw more money and bodies at the problem, of course it's not java's
> > fault or theirs.
>
> That's because it's not. Since projects never fail purely on a
> technology solution - they fail on results of bad project planning more
> often than not (assigning novice programmers to large projects that will
> probably go over their heads), mistaken business objectives as whoever
> contracted the software finds he isn't really interested in what the
> tech demos had to show at all.
>
> And the stereotype of lazy management that never gets punished is good
> to make Dilbert strips from - in real life, it probably doesn't hold

No, I've seen it hold in real life too many times; the "pointy-haired
boss with the corner office" still brings a chuckle out of me.

> true outside of a select few huge moloch companies, or on the opposite
> side of the spectrum small short-lived hick-led shops where the boss's
> kids and nephews get all sorts of crap assigned to get better allowance.
> In a well-led company with working internal management processes, when
> the shit hits the fan, everyone gets the stink.
>
> David Vallner

Cheers ;-)

M. Edward (Ed) Borasky

Sep 3, 2006, 10:26:18 PM9/3/06
to
David Vallner wrote:
>> How about C#, well it runs in Windows and without serious and expensive
>> firewalls you just can't go anywhere near the Internet.
>
> You need to tighten off Unix-based servers too. Heck, there are even
> serious and expensive firewalls for Linux around too, because not
> everyone has an in-house iptables guru.
But everybody *should* have a *certified* Cisco engineer if they use
Cisco routers, for example. It's one of the costs of doing business.

> Speaking purely theoretically, Ruby can not be made as performant as
> Java or C# could be made if they had ideally performing implementations.
> Latent typing makes it almost impossible to do certain optimizations as
> static typing does. That's pure fact.

I'm not sure I agree with you here. First of all, while latent typing
may prevent you from optimizing (and I'm writing in Perl, not Ruby)

$j = 0;
for ($k = 0; $k < 100000; $k++) {
    $j++;
}

to

$j = $k = 100000;

that kind of optimization is a trick used by compilers to get good
performance on trivial benchmarks, rather than something with a more
wide-ranging real-world payoff.
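To make that concrete in Ruby terms: the reason such folding is unsound under latent typing can be shown directly, because classes are open at runtime and an implementation can't assume `+` on integers is plain machine addition. A contrived sketch (deliberately patching a core operator, which no real code should do; on 1.8 you'd patch Fixnum rather than Integer):

```ruby
# Before any monkey-patching, the loop matches the "optimized" form.
j = 0
100_000.times { j += 1 }
puts j  # 100000

# Contrived runtime patch: every Integer#+ now over-counts by one.
class Integer
  alias_method :orig_plus, :+
  def +(other)
    orig_plus(other).orig_plus(1)
  end
end

j = 0
100_000.times { j += 1 }
puts j  # 200000 -- folding the loop to j = 100_000 would now be wrong
```

Whether the folding is legal depends on state only knowable at runtime, which is exactly the gap a statically typed language doesn't have to worry about.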

Second, "compiled languages" like Java, C#, C++ and even C have
extensive optimized run-time libraries to do all the lower-level things
that a "true optimizing compiler", if such a thing existed, would do
automatically. Over the years, compilers have improved to the point
where they generate optimal code for things like LINPACK and the
Livermore Kernels.

In short, I don't see why a Ruby interpreter *and* run time can't
compete with a Java, C# or C++ compiler *and* run time! As long as you
have to have the same number of bits around to keep track of the
program's data structures, objects, etc., "optimization" becomes a
matter of implementing the operations on the data structures efficiently.


Devin Mullins

Sep 3, 2006, 10:33:04 PM9/3/06
to
David Vallner wrote:
> No one cares about pretty. It's also a completely irrelevant issue when
> deciding on implementation language if you're at least remotely
> responsible.
*Everyone* cares about pretty. http://www.paulgraham.com/taste.html

Pretty means understandable, maintainable, clean (and what the heck does
clean mean? reduced duplication?). Pretty means fewer LOC, which is
about the only objective measure of maintainability we know. (Cyclomatic
complexity being another, I suppose.) Pretty means fun, which we all
know means productive.

> Speaking purely theoretically, Ruby can not be made as performant as
> Java or C# could be made if they had ideally performing implementations.
> Latent typing makes it almost impossible to do certain optimizations as
> static typing does. That's pure fact.

Irrelevant. In many cases, the fact that Ruby has latent typing is an
*implementation detail*. Ruby has *no type declarations*, but in many
cases static type inference can be applied to get the same optimizations
of which Java and C# implementations avail themselves. (Disclaimer:
that's about as much as I know about this subject.)

That's not to say that I expect the current CRuby maintainers to add
such optimizations. They seem not to care, and that's just fine by me.

Devin

Chad Perrin

Sep 3, 2006, 10:34:46 PM9/3/06
to
On Mon, Sep 04, 2006 at 12:18:30AM +0900, David Vallner wrote:
>
> And another point is that quite a few Ruby frameworks do come to
> defining a domain-specific language in Ruby - cf. Og data definition,
> Puppet, rake. There's a (maybe not quite fine) line between a very
> specific framework and a DSL that just gets crossed, and I don't believe
> rubyists are the innocents to throw the first stone.

There's a distinct difference between a subset of an already extant
language and an entirely separate language with its own idiomatic
syntax.

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]

"Real ugliness is not harsh-looking syntax, but having to
build programs out of the wrong concepts." - Paul Graham

Chad Perrin

Sep 3, 2006, 10:34:48 PM9/3/06
to
On Mon, Sep 04, 2006 at 02:13:24AM +0900, Alex Young wrote:
> David Vallner wrote:
> >Utter pants. I mean, you used the word "bloat", which should make people
> >lose any debate by default.
> >
> >Alvin Ryder wrote:
> >>Java and C# are no guarantee for success.
> >
> >Neither is Ruby / Rails. *No technology* is a guarantee for success, no
> >technology ever was, and I'll bet a gold bar against a plastic spoon no
> >technology ever will. Technology used is a very important decision to
> >make, but it never single-handedly moves you from doable to undoable or
> >vice versa.
> That's the wrong argument to pick. Try calculating the full dynamics of
> a modern metropolitan water supply network with just pen and paper.
> Technological advances *do* move us from undoable to doable, and it's
> specific technologies that do it.

. . . and in any case, I don't think anyone was saying Ruby was any kind
of guarantee of anything. The point is that Joel Spolsky's
characterization of ultraconservative technology choices as necessarily
"right" is chaff and nonsense. Despite Joel's usually intelligent and
well-reasoned commentary, he dropped the ball on this one, effectively
saying that Ruby is a guarantee of failure.

Bollocks, I say.


>
> >Also, initial development cost isn't a very important factor. Recall
> >your uni software lifecycle charts about how much of a project's life is
> >maintenance. For a successful project, the numbers are very much true.
> >With a successful product comes the responsibility of supporting it and
> >keeping it successful, and in some cases this responsibility creates
> >ongoing costs that dwarf the initial development horribly.
> No argument there whatsoever.

I have a caveat to add:

While it's true that initial development is often one of the cheaper
parts of a "successful" project, the cost of initial development is still
critically important. If your initial development is too costly, you never get to
maintenance. Additionally, if you think middle managers think ahead
enough to just ignore initial development costs (even when they can
afford to do so) in favor of long-term cost savings, you probably
haven't dealt with middle managers as much as I have. CxO-types are
even worse, because their job success metrics are more tied to quarterly
stock prices and market shares than anything more long-term (generally
speaking).


>
> >>Ok, sure Java's OO may be nicer than Perl 5's but once you brew
> >>HTML/Javascript/JSP/JSTL/EL/tags/JSF or Struts together the result
> >>isn't exactly what I'd call pretty. Java is in no way a safe bet.
> >
> >No one cares about pretty. It's also a completely irrelevant issue when
> >deciding on implementation language if you're at least remotely
> >responsible.
> Actually, pretty does matter. The comfort of a problem solver directly
> impacts his/her approach to a problem. That's just human nature.

. . . and how much more do you think it costs in the long run to
maintain code that is a nasty, overly complex, ugly mess? Pretty
matters.


>
> >Speaking purely theoretically, Ruby can not be made as performant as
> >Java or C# could be made if they had ideally performing implementations.
> >Latent typing makes it almost impossible to do certain optimizations as
> >static typing does. That's pure fact.
> I remain unconvinced by this - and it's mainly JIT optimisation that
> keeps me on the fence. Dynamic optimisations can beat static - but not
> in all cases. I believe this is what one calls an "open research" question.

Unfortunately, JIT implementations haven't been subjected to the same
long-term scrutiny and advancement as more traditional persistent binary
executable compiling implementations. As a result, I don't think the
state of the art is there yet -- leaving JIT implementations effectively
slower by nature until they get some more advancement over the years to
come. I really believe that gap will be closed rapidly in the near
future. Only time and experience will tell whether it can be made as
fast or faster, though I have no doubt that it can at least be made
close enough that most of us won't care.

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]

Chad Perrin

Sep 3, 2006, 10:35:11 PM9/3/06
to
On Mon, Sep 04, 2006 at 12:29:26AM +0900, David Vallner wrote:
> >time. In paraphrase, it was (summarized):
> >
> > Regardless of how good or bad a decision a given language is for a
> > given task, Ruby is more likely to get you fired than Java.
> >
>
>
> To be fair, it's not just corporate politics. Statistically, it's more
> likely a development house will have a strong base of Java developers or
> C# developers (C#, while being very young and so far an abomination unto
> Nuggan, is reasonably Java compatible), and that starting a Rails
> project means you'll probably have to get people with no Ruby experience
> on the team, or create a burden on the company in case the original team
> falls apart and quits to other companies regarding maintenance, or whatever.

Choosing a language despite the resources at your disposal, rather than
because of them, would probably make that a "bad decision". That in no
way invalidates the summarized point I already made:

"Regardless of how good or bad a decision a given language is for a
given task, Ruby is more likely to get you fired than Java."

--

CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]

"The ability to quote is a serviceable
substitute for wit." - W. Somerset Maugham

M. Edward (Ed) Borasky

Sep 3, 2006, 10:49:07 PM9/3/06
to
Alex Young wrote:
> David Vallner wrote:
>> And another point is that quite a few Ruby frameworks do come to
>> defining a domain-specific language in Ruby - cf. Og data definition,
>> Puppet, rake. There's a (maybe not quite fine) line between a very
>> specific framework and a DSL that just gets crossed, and I don't
>> believe rubyists are the innocents to throw the first stone.
> That's not quite the same - those DSL's build upon a known and well
> understood foundation, because they use Ruby's syntax to their own ends.
> I'm inferring from the very little information that's out there that
> Wasabi has its own parser, and that makes it a very, very different
> beast to a DSL in the sense that I've come across the term in Ruby.

To use Martin Fowler's terminology, there are external DSLs -- a
language created for the domain and implemented with a parser, etc., in
some general-purpose language. And there are *internal* DSLs, written as
extensions/subsets inside a language like Ruby.

Rails and rake are internal DSLs, and Ruby makes internal DSL creation
much easier than many other languages. I can't tell from this thread
whether Wasabi is external or internal.

I hardly think of an external DSL as anything special any more. They've
been around as long as I've been programming, which is -- well, let's
just say your toaster has more compute power than the machine I learned
on. :) Almost every major decades-old Fortran code, for example, is
really implementing an external DSL.
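To make the internal/external contrast concrete: an internal DSL needs no parser at all, because the host language's syntax does the work. A toy rake-flavoured sketch (the `task`/`run` vocabulary here is made up for illustration, not rake's actual implementation):

```ruby
# A tiny internal DSL: new vocabulary (task/run), but every line is
# parsed by Ruby itself -- no custom parser needed.
TASKS = {}

def task(name, &body)
  TASKS[name] = body  # register the block under the task's name
end

def run(name)
  TASKS.fetch(name).call
end

task :greet do
  puts "hello from an internal DSL"
end

run :greet  # prints: hello from an internal DSL
```

An external DSL like Wasabi (if it indeed has its own parser) would instead define its own grammar and tokenizer, which is where the real implementation cost lives.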


Chad Perrin

Sep 3, 2006, 10:54:54 PM9/3/06
to
On Mon, Sep 04, 2006 at 11:26:18AM +0900, M. Edward (Ed) Borasky wrote:
> David Vallner wrote:
> >> How about C#, well it runs in Windows and without serious and expensive
> >> firewalls you just can't go anywhere near the Internet.
> >
> > You need to tighten off Unix-based servers too. Heck, there are even
> > serious and expensive firewalls for Linux around too, because not
> > everyone has an in-house iptables guru.
> But everybody *should* have a *certified* Cisco engineer if they use
> Cisco routers, for example. It's one of the costs of doing business.

Frankly, iptables is easier to learn effectively than most proprietary
firewalls -- and then there's stuff like IPCop, which makes things even
easier.

Vidar Hokstad

Sep 4, 2006, 5:39:45 AM9/4/06
to

Devin Mullins wrote:

> David Vallner wrote:
> > Speaking purely theoretically, Ruby can not be made as performant as
> > Java or C# could be made if they had ideally performing implementations.
> > Latent typing makes it almost impossible to do certain optimizations as
> > static typing does. That's pure fact.
> Irrelevant. In many cases, the fact that Ruby has latent typing is an
> *implementation detail*. Ruby has *no type declarations*, but in many
> cases static type inference can be applied to get the same optimizations
> of which Java and C# implementations avail themselves. (Disclaimer:
> that's about as much as I know about this subject.)

You're absolutely right.

Look to Haskell for a good example of a _statically typed_ language
almost free of type annotations of any kind - type information is almost
exclusively inferred by the compiler (though you can add type
annotations).

While Ruby has features that make it impossible for an implementation
to use strict static typing everywhere, a lot of a typical Ruby
application could be statically typed by an implementation using type
inference fairly easily by doing some relatively simple flow analysis
combined with marking up the parse tree.

Doing it for a pure interpreter would be easy, but the advantages would
be relatively limited. Doing it for a JIT compiler would also be quite
straightforward, and it does have the potential of very significant
speedups.

For a full-fledged compiler it would be tricky without some
restrictions - the main problem is Ruby's introspective features and
various eval mechanisms, which mean the type inference valid at
compile time might not hold at runtime. Add a few restrictions on the
use of load/require etc. and of eval, and/or some way of adding basic
type annotations to guide the compiler for "extension points"
(classes/methods that will be affected by runtime changes), and
it would be doable without significant changes.
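For instance, the way runtime metaprogramming invalidates compile-time inference can be shown in a few lines (`Greeter` is a made-up example class, not a real library):

```ruby
# A hypothetical ahead-of-time pass could infer Greeter#greet : () -> String...
class Greeter
  def greet
    "hello"
  end
end

g = Greeter.new
before = g.greet.class  # => String

# ...but runtime metaprogramming (class_eval, eval, define_method)
# can invalidate that inference at any point in the program's life.
Greeter.class_eval do
  def greet
    :hello  # the "same" method now returns a Symbol
  end
end

after = g.greet.class  # => Symbol
puts before, after
```

Any compiled assumption about `greet` has to be guarded or discarded the moment the class is reopened, which is exactly why the "extension point" restrictions above would be needed.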

Vidar

Stephen Kellett

Sep 4, 2006, 6:55:06 AM9/4/06
to
In message <44FAFC3...@vallner.net>, David Vallner
<da...@vallner.net> writes

>research, not random anecdotal success stories that are too pathetic
>to sell Herbalife diet pills.

That's a great line.

Stephen
--
Stephen Kellett
Object Media Limited http://www.objmedia.demon.co.uk/software.html
Computer Consultancy, Software Development
Windows C++, Java, Assembler, Performance Analysis, Troubleshooting

Stephen Kellett

Sep 4, 2006, 7:08:41 AM9/4/06
to
In message <44FB5AD5...@comcast.net>, Devin Mullins
<twi...@comcast.net> writes

>does clean mean? reduced duplication?). Pretty means fewer LOC, which
>is about the only objective measure of maintainability we know.

I take it you've never had the pleasure of reading someone else's APL
code? It's about as dense as you can get in LOC.

It sure is not easy to maintain. Often described as a "write-only
language".

I think "pretty" is not the correct word; "elegant" would be
better.

Joseph

Sep 4, 2006, 1:06:07 PM9/4/06
to
Vidar,

Risk Management IS NOT equivalent to FEAR, in that you are right.

However, as I said earlier, no SIGNIFICANT progress can be expected
without some risk. Risk Management is about dealing with risk, not
eliminating it.

Ruby and Ruby on Rails are not the safest choice, but I believe they
are one of the very best choices for web development. There is a
slight risk in it, but not enough to stop any bold, courageous
corporation, startup or even lone developer from creating great software
with it.

Joel and people who share his views equate Risk with FEAR. That is
their main mistake. Ruby is ready now... not for everything, but it is
uniquely ready for web development; that is, I believe, a fact.

As another poster mentioned, there is an evolution in the adoption of
technology. Ruby is still with the early adopters, but that does not
mean it is not mature enough for critical applications.

What will prove me right, however, is not my rationale here or in my
previous post, but TIME. Time will prove that those who stuck with Ruby
and Ruby on Rails did so wisely, because TRUTH is tested in time... I
believe Ruby is ready now, many people disagree, but ultimately time
and people using Ruby for critical applications will be the deciding
factor.

I love this quote from the Linux Advocacy video Red Hat produced
recently, it is incredibly accurate to this issue we are discussing,
and I recommend everyone to watch it, I will quote it here:

"Despite Ignorance
Despite Ridicule
Despite Opposition
Despite it ALL
TRUTH HAPPENS"
Source: http://www.redhat.com/truthhappens/

Time will tell us indeed, but I am not waiting for the jury, I am
learning Ruby and RoR now, eager to apply it to create cool, amazing
web applications... isn't that the whole point? To push technology?
To make it fun again? To innovate?

Jose Hurtado
Web Developer
Toronto, Canada

Vidar Hokstad

Sep 4, 2006, 3:37:59 PM9/4/06
to

Joseph wrote:
> Risk Management IS NOT equivalent to FEAR, in that you are right.
>
> However, as I said earlier, no SIGNIFICANT progress can be expected
> without some risk. Risk Management is about dealing with risk, not
> eliminating it.

This we agree on.

> Ruby and Ruby on Rails are not the safest choice, but I believe they
> are one of the very best choices for web development. There is a
> slight risk in it, but not enough to stop any bold, courageous
> corporation, startup or even lone developer to create great software
> with it.

And we agree on this too, to some extent. My argument is mainly that
without a certain level of knowledge about Ruby, the level of risk is
unknown, in which case it is prudent to assume the likely risk is high
until you have investigated it closer.

For those of us who know Ruby, the best we can do to spread it is to
help people get to the stage where they know enough that they can
accurately assess the risk of using Ruby for their projects; until
people have that knowledge, the risk as they see it will be higher than
the real risk of using Ruby.

> Joel and people who share his views, equate Risk with FEAR. That is
> their main mistake. Ruby is ready now... not for everything, but is
> uniquely ready for web development, that is I believe a fact.

And it may very well be a fact, but again, there are still risks, and
those risks are greater for someone who doesn't know Ruby, or who
doesn't have the skills in-house, whereas the corresponding risks for a
Java shop of doing something in Java may be very low if their staff is
skilled enough in Java.

Personally I hate Java and love using Ruby, but if I had to manage a
team of Java gurus, I'd still consider Java a safer choice than
Ruby unless the project was long enough to take significant time
retraining staff and possibly hiring replacements for anyone who decides
to leave.

They then have to make a tradeoff: low risk in Java (or C# or PHP or
LISP or whatever language they have sufficient experience with to
make the risk a known, low factor) or a possibly higher risk in another
language vs. _possibly_ lower cost and shorter development time.
Developer happiness doesn't count unless it affects one of the previous
two, or increases employee retention.

However, that possible payoff depends on whether they make a
successful transition, the chances of which they won't know if they
have had little to no exposure to the language. It also depends on
whether Ruby is right for _their specific project_, which they won't
know if they have little experience with the language.

These factors are all reasons why - regardless of how good Ruby is -
picking Ruby just because you and I and other Rubyists say it's good,
without having a reasonable degree of knowledge about how appropriate
it would be for their own project, would be quite irresponsible.

> As another poster mentioned, there is an evolution in the adoption of
> technology. Ruby is still with the early adopters, but that does not
> mean is not mature enough for critical applications.

For some it certainly is. But despite having reasonable experience
with Ruby, I'd hesitate to make a blanket statement about it.
Performance _will_ be an issue for some apps (as I've noted elsewhere,
it won't be for _most_ web apps, but there certainly are web apps that
are CPU intensive too, and where C/C++ extensions would be vital if you
were to go with Ruby at the current stage), and lack of certain
libraries might be an issue for some.

Feature-poor XML integration IS an issue for my company (Edgeio.com) at
the moment. It's one we expect to solve, but at the cost of additional
work which we wouldn't incur in some other languages. Ruby is still
good for us, but it's not a panacea for all types of development. It
likely never will be, but as time goes on, the space of apps for which
Ruby is a good choice will of course increase significantly, and I do
believe it can supplant many currently more widely used languages.

We still use PHP for our web frontend, though. All our Ruby code is in
the backend for now. I did consider Rails, and maybe we'll migrate to
it at some point, but currently the potential savings are too small to
outweigh the cost/time to migrate, and our frontend is growing thinner
and thinner as we refactor our middleware and backend, so it pays to
just wait for now.

> What will prove me right however is not my rationale here or in my
> previous post, but TIME, time will prove those who sticked to Ruby and
> Ruby on Rails did so wisely, because TRUTH is tested in time.... I
> believe Ruby is ready now, many people disagree, but ultimately time
> and people using Ruby for critical applications will be the deciding
> factor.

Ruby is ready now for some apps _if you have the experience_ or your
potential cost savings are large enough to justify taking the time to
retrain your staff or hire new people.

I use Ruby because it's the best of an increasing pool of bad
alternatives. I still haven't found a language I don't see tons of
flaws in, Ruby included. Ruby's flaws are just less annoying than the
rest :) I don't believe in "truths" in language choices - people need
to pick what works for them, and while looking at what's popular is
often good, there are always exceptions.

> Time will tell us indeed, but I am not waiting for the jury, I am
> learning Ruby and RoR now, eager to apply it to create cool, amazing
> web applications... isn't that the whole point? To push technology?
> To make it fun again? To innovate?

That's one viewpoint. But the point for the companies considering
language choices is what technologies will bring them the greatest
profit at the lowest risk.

As much as it's tempting for me as a geek to pick technology based on
personal preference, ultimately I have a responsibility to the
shareholders that needs to take precedence.
(and since I'm one of them, and I work at a startup, there's also the
hope of an opportunity for early "retirement" :) )

Vidar

Bil Kleb

Sep 5, 2006, 10:29:32 PM9/5/06
to
Christian Neukirchen wrote:

> Bil Kleb <Bil....@NASA.gov> writes:
>> Attempting remote installation of 'MerriamWebster'
>> ERROR: While executing gem ... (NoMethodError)
>> undefined method `name' for -517611318:Fixnum
>>
>> on ruby 1.8.2 (2004-12-25) [powerpc-darwin8.2.0]
>
> Sounds like the OS X symbol table overflow bug. Update your Ruby.

Bingo.

Thanks,
--
Bil Kleb, http://fun3d.larc.nasa.gov

Chad Perrin

Sep 5, 2006, 5:06:17 PM9/5/06
to
On Wed, Sep 06, 2006 at 02:36:02AM +0900, Richard Conroy wrote:
>
> Libraries is life. And the greatest source of risk generally, developers
> like libraries that they can trust. If you can't trust them you have to
> limit yourself to problems that do without, or write your own equivalents
> (which will wipe out your rails productivity boost).

. . . once.

Imagine you have choice A and choice B. With choice B, you have all the
libraries already. With choice A, you're missing some. On the other
hand, with choice A you have a productivity boost that provides extra
time roughly equivalent to the time it takes to write the libraries that
are missing.

Now imagine you're doing the same thing again, a few years later. Would
you rather have chosen option B the first time, and be faced again by
the same trade-off between core task productivity and available
libraries, or have chosen option A the first time, have written the
libraries you needed, and now have only to choose between two options
with the same library availability for the task at hand with wildly
different productivity characteristics?

Productivity doesn't go away just because you're spending the same
amount of time completing the overall goal. It just gets used more.
Rather than producing only a web app, you are in the same time producing
a web app and the libraries necessary to support it.

Now, if producing those libraries the first time ends up taking three
times as long as the app itself with choice B would have, that's another
story -- but if I take your "wipe out your rails productivity boost"
comment at face value, I'm still choosing Rails.


>
> Well lots of people do, and the question still stands as to what happens
> to the boost if you have to do any significant Ruby code processing in
> Rails (non-SNMP). I have got some sloppy partial code around that
> builds up a DIV graph and there is a perceptible delay in the drawing
> of the page that uses it. Admittedly its under worst case scenarios, but
> it does strike me that you really don't want to be doing anything funky
> in your controllers at all.

It strikes me as a bad idea in general to pursue edge cases in
frameworks. Frameworks are for general-case development. Their benefit
is that they do the common things for you. If your application is 98%
uncommon things, you aren't going to get much use out of frameworks.

SNMP, I realize, is not an uncommon case, but it's uncommon enough for
web app development that expecting a web development framework to do the
heavy lifting for you is a somewhat odd demand, I think.


>
> I am not talking of the classic situation of where you own the server or
> can directly install on in customer equipment. I am talking of scenarios
> where you have to produce a windows/Mac OS/Linux installer/packager,
> that will extract out a working Rails Apps with dependencies and
> minimal interaction from the user.
>
> I would kill to read a step by step example of this - from source control
> to OS-specific-user-friendly installers. I have been looking a bit at
> Capistrano, I haven't delved deeply enough, but it doesn't seem to
> go the full distance that I am talking about. And if I am not mistaken,
> the author has stated that it's not a path he will pursue

I'm entirely with you on this: web application framework advocates, for
any framework in any language, seem to believe that the developer will
always be the one deploying and that deployment will be accomplished at
a server where the developer has complete access and control. There
isn't nearly enough attention on the problem of removing control
characteristics and direct access capabilities, or even passing on
deployment to someone else entirely who wasn't part of the original
picture at all.

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]

"There comes a time in the history of any project when it becomes necessary
to shoot the engineers and begin production." - MacUser, November 1990

M. Edward (Ed) Borasky

unread,
Sep 4, 2006, 2:00:13 PM9/4/06
to
Chad Perrin wrote:
> As someone with a combination of college, trade school, on-the-job, and
> purely autodidactic education, with several certifications of my own, my
> experience is that all certifications really prove is A) a certain
> minimum standard of test-taking competence that can be sized up within
> five minutes of meeting someone anyway and B) a certain amount of money
> invested in professional advancement.
They also prove that you can learn and carry out a learning task to
completion. They also provide HR and the hiring manager with an
objective way of ruling out unqualified candidates. If I post a network
engineer position and get 100 applicants, ten of whom have completed
their certification, that's 90 resumes I can throw in the trash.


Richard Conroy

Sep 5, 2006, 8:12:23 AM9/5/06
to
On 9/4/06, Chad Perrin <per...@apotheon.com> wrote:
> If you eliminate risk entirely, you end up guaranteeing failure -- for
> some definition of risk. Any definition of risk that does not result in
> that end is either meaningless or effectively impossible to eliminate.

Well, that's pedantic; you will notice that I said nothing about eliminating
it entirely. I have always thought your technology choice should
not add to your existing business risk wherever you can help it.

Currently, the way I see it is that if you choose a Java solution over an
RoR solution you are increasing business risk, because it
will take longer to get your v1.0 up. In the meantime the RoR solution
is getting very polished and mature, and is in a position to ship or secure
revenue or mind share early. Also, lengthy development gives the ADD
marketing crowd too much time to get antsy.

The main unanswered question about Rails is whether 'Rails is quicker'
holds true for a sufficiently large application space, or whether it is a
truism only for conventional, green-field web apps
(the Rails Sweet Spot (tm)).

If you have to do weird stuff like:
- interact with systems in some non web way (CORBA, SNMP, XML)
- use legacy databases
- run on single-server machines
- do CPU intensive work in requests (e.g. image manipulation)

do you still get to keep the productivity boost? If you don't, the RoR
advantage starts to disappear, and all those other conventional technologies
start to look more attractive. All those 'works as advertised' libraries, and
alternatives of technology choices if your main library/binary is inadequate.
Also a huge volume of proven, documented best practices.


The burden of proof is on Rails to
establish it is ready for prime time.

Austin Ziegler

unread,
Sep 4, 2006, 4:57:30 PM9/4/06
to
On 9/3/06, Joseph <jlhu...@gmail.com> wrote:
> Best Regards,
>
> Jose L. Hurtado
> Web Developer
> Toronto, Canada

So ...

we've never seen you at a TRUG meeting (we just had one yesterday).

Come on out and join us!

-austin
--
Austin Ziegler * halos...@gmail.com * http://www.halostatue.ca/
* aus...@halostatue.ca * http://www.halostatue.ca/feed/
* aus...@zieglers.ca

M. Edward (Ed) Borasky

unread,
Sep 4, 2006, 2:12:07 PM9/4/06
to
Chad Perrin wrote:
> This brings me to a thought I've been having a lot, lately: that the
> future of compiled code will probably start looking in some ways more
> and more like interpreted code. I don't see why we can't, relatively
> soon, have a compiler that produces a compiled executable of a dynamic
> language such as Ruby that does not require a VM or interpreter to be
> run (outside of the very loose definition of "interpreter" or "VM" that
> might include your whole OS in that definition). The dynamic aspects of
> the language would be handled by the executable binary itself, rather
> than by an interpreter external to the program.
>
> I'm not entirely sure how to explain what I'm thinking at this time so
> I'm sure I get my point across. Hopefully someone who reads this will
> get where I'm aiming, and may even be able to help me clarify it.
Perhaps you're thinking along the lines of Lisp or Forth, where an
application is layered on top of the entire compiler/interpreter/runtime
package and then saved as an executable. As far as I can tell, there's
absolutely no reason this couldn't be done for Ruby. IIRC that's also
the way the Squeak Smalltalk environment works and the way Self worked.

Incidentally, Forth contains two interpreters and a compiler. A typical
Common Lisp contains one compiler and one interpreter. Right now, Ruby
is simple enough that what you're describing seems feasible -- a couple
more years of co-evolution with its users and it might not be. :)


Chad Perrin

unread,
Sep 4, 2006, 2:18:40 PM9/4/06
to

I don't think I can really put much value in that "carry out a learning
task to completion" idea, in this case. The sort of "learning" it
measures is, generally speaking, more suited to learning to give the
answers people are expecting than coming up with correct answers.
Microsoft certs, in particular, are bad about this -- filled with
marketing euphemisms and salesworthy "this solution for that problem"
questions.

That's not to say certifications are useless, but they carry little
enough worth in (accurately) judging a candidate's value that ignoring
them entirely probably wouldn't hurt your hiring strategies.

You're right about certifications providing HR and hiring managers with
an "objective" metric for candidate qualifications, but that's pretty
self-referential (they're "qualified" if they meet the qualification
requirements, including a certification, which is required so that
you'll have some way to tell if they're qualified, et cetera), and
there's not really any indication that what it objectively measures is
useful for most purposes. About the only way it measures something
useful with regard to job performance is if someone can literally just
walk into the exam cold, with no studying, and answer all the questions
correctly . . . except for the questions that are misgraded on the exam
(I've yet to see a certification test that doesn't require technically
inaccurate answers to get everything "right").

Throwing out 90% of candidates for not having a certification in the IT
industry is about like throwing out 90% of candidates because their
ties aren't the right width. I mean, sure, having ties of the "right"
width indicates an attention to detail and ability to keep up with
changing trends, which is useful for technical matters, but there's no
guarantee the people you've excluded aren't just fashion-impaired
despite attention to detail and thoroughly current knowledge of
information technologies, nor that the people with the "right" ties
aren't more focused on fashion than professional skills, or even just
really lucky in their choice of ties today.

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]

"It's just incredible that a trillion-synapse computer could actually
spend Saturday afternoon watching a football game." - Marvin Minsky

M. Edward (Ed) Borasky

unread,
Sep 4, 2006, 1:48:06 PM9/4/06
to
Chad Perrin wrote:

> On Mon, Sep 04, 2006 at 01:01:53PM +0900, Alvin Ryder wrote:
>> I'm not sure how fast or slow Ruby is but if it's as fast as Perl I'll
>> be happy enough. Yes I know C is faster but I need fast development
>> times too.
>
> Based on what I've heard/seen/experienced, Ruby is generally somewhat
> slower than Python which is, in turn, somewhat slower than Perl.
> Generally. On average. For some definition of "average".

Would you be interested in the correct definition of average in
benchmarking? Of course you would! :)

http://portal.acm.org/citation.cfm?id=5673&dl=ACM&coll=&CFID=15151515&CFTOKEN=6184618

> One of the
> nice things about Perl's relative maturity is the significant work that
> has gone into performance concerns for the parser/runtime. I have no
> doubt that Ruby will eventually match Perl for general-case performance,
> but I also have no doubt that on the whole it has not matched it yet.

And this was one of the motivations of the Parrot team -- a common
virtual machine for Perl, Python and Ruby. The Ruby community seems to
have put a lot more effort into YARV than the Cardinal/Parrot approach.
Has the Python community similarly gone their own way, or do they plan
to use Parrot?


Devin Mullins

unread,
Sep 5, 2006, 10:40:38 AM9/5/06
to
Devin Mullins wrote:

> Richard Conroy wrote:
>> The burden of proof is on Rails to
>> establish it is ready for prime time.
>
> I beg to differ, but the burden of proof is on people who give a shit
> about "the burden of proof."

I'm sorry, that came out wrong. Please let me rephrase:

I beg to differ, but the burden of "proof" is on people who give a shit
about the burden of "proof."

That's better. Sorry about the mix-up.

> Devin
> (Excusez-vous mon anglais.)

Chad Perrin

unread,
Sep 4, 2006, 2:29:08 PM9/4/06
to
On Tue, Sep 05, 2006 at 03:12:07AM +0900, M. Edward (Ed) Borasky wrote:
> Chad Perrin wrote:
> > This brings me to a thought I've been having a lot, lately: that the
> > future of compiled code will probably start looking in some ways more
> > and more like interpreted code. I don't see why we can't, relatively
> > soon, have a compiler that produces a compiled executable of a dynamic
> > language such as Ruby that does not require a VM or interpreter to be
> > run (outside of the very loose definition of "interpreter" or "VM" that
> > might include your whole OS in that definition). The dynamic aspects of
> > the language would be handled by the executable binary itself, rather
> > than by an interpreter external to the program.
> >
> > I'm not entirely sure how to explain what I'm thinking at this time so
> > I'm sure I get my point across. Hopefully someone who reads this will
> > get where I'm aiming, and may even be able to help me clarify it.
> Perhaps you're thinking along the lines of Lisp or Forth, where an
> application is layered on top of the entire compiler/interpreter/runtime
> package and then saved as an executable. As far as I can tell, there's
> absolutely no reason this couldn't be done for Ruby. IIRC that's also
> the way the Squeak Smalltalk environment works and the way Self worked.

No . . . that's not quite it. Maybe a really bad diagram will help.

interpreter for a dynamic language:
|--------------------------------------------------|

interpreter capabilities exercised by a program in a dynamic language:
|++++++++++++|

compiled static binary for an equivalent program from a static language:
|++++++++++++|

combination static/dynamic compiled binary from a dynamic language:
|+++++++++++----|

. . . roughly.

There would likely be more binary size necessary, but considering that
even an interpreter is (generally) a compiled binary that just operates
on input, I don't see any reason to assume we cannot compile
dynamic language code into a persistent binary with accommodations made
for the parts of the program that require runtime dynamic behavior.
This strikes me as a superior approach to a JIT compiler/interpreter
approach like Perl's, a pure interpreter approach like Ruby's, or a
bytecode compilation plus runtime interpreter VM like Java's, for
performance. Add to that the potential for increased performance in some
parts of a program written in a more dynamic language, and something like
the following might actually run faster than the equivalent compiled
program I diagrammed above:

|+++++++--------|

. . . depending on how well those dynamic bits (represented by the
hyphens) optimize at runtime for a particular run of the program.

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]

James Edward Gray II

unread,
Sep 3, 2006, 11:37:59 PM9/3/06
to
On Sep 3, 2006, at 11:55 AM, James Britt wrote:

> Alex Young wrote:
>
>
>> I'm inferring from the very little information that's out there
>> that Wasabi has its own parser, and that makes it a very, very
>> different beast to a DSL in the sense that I've come across the
>> term in Ruby.
>

> Some Wasabi info:
>
> http://www.joelonsoftware.com/items/2006/09/01b.html
> http://programming.reddit.com/info/g0fa/comments

This was interesting reading.

I'm paraphrasing here but Spolsky's replies in the second link
basically indicate that he trusts his team and likes to take a few
risks. I would say that's the reason to choose Ruby on Rails as an
answer to the original article-provoking question.

James Edward Gray II

Chad Perrin

unread,
Sep 4, 2006, 6:13:13 AM9/4/06
to
On Mon, Sep 04, 2006 at 11:49:07AM +0900, M. Edward (Ed) Borasky wrote:
>
> Rails and rake are internal DSLs, and Ruby makes internal DSL creation
> much easier than many other languages. I can't tell from this thread
> whether Wasabi is external or internal.

I'm having a hard time imagining it being internal, considering the way
Joel describes it in an essay where he addresses Wasabi directly:
http://www.joelonsoftware.com/items/2006/09/01b.html

I am, frankly, having a tough time imagining how he could have designed
a language that does what he claims as an internal DSL of VBScript. In
fact, he describes it as "100% backwards-compatible with VBScript".

On the other hand, it seems utterly incomprehensible that someone would
recreate VBScript, but with more power, from scratch -- which is what
he'd have to do, considering I doubt Microsoft gave him the source for
it.


>
> I hardly think of an external DSL as anything special any more. They've
> been around as long as I've been programming, which is -- well, let's
> just say your toaster has more compute power than the machine I learned
> on. :) Almost every major decades-old Fortran code, for example, is
> really implementing an external DSL.

In a manner of speaking, one might say that all programming languages
are, in one way or another, DSLs of a sort.

Devin Mullins

unread,
Sep 5, 2006, 10:20:51 AM9/5/06
to
Richard Conroy wrote:
> The burden of proof is on Rails to
> establish it is ready for prime time.
I beg to differ, but the burden of proof is on people who give a shit
about "the burden of proof."

Devin
(Excusez-vous mon anglais.)

Devin Mullins

unread,
Sep 4, 2006, 2:31:38 AM9/4/06
to
James Britt wrote:
>> This leads to an interesting question: how many ruby programmers are
>> there, anyway?
>>
>> I ran across
>> http://sanjose.bizjournals.com/sanjose/stories/2006/08/28/daily1.html
>> today and boggled at the "2.5 million" number for PHP.
>>
>> Any ideas for Ruby?
>
> At least a dozen, maybe more.

+1 (in both senses)

Richard Conroy

unread,
Sep 6, 2006, 12:54:44 PM9/6/06
to
On 9/5/06, Chad Perrin <per...@apotheon.com> wrote:
> > or write your own equivalents (which will wipe
> > out your rails productivity boost).
>
> . . . once.

>
> Imagine you have choice A and choice B. With choice B, you have all the
> libraries already. With choice A, you're missing some. On the other
> hand, with choice A you have a productivity boost that provides extra
> time roughly equivalent to the time it takes to write the libraries that
> are missing.

Time saved is not the only thing that libraries provide. For instance, you
may simply not have the expertise or business mentality for library
development. Marketing, management and other project stakeholders
cannot determine whether a library is 'good' or 'done' or whatever -- a
library is a software product whose customers are developers. Libraries
generally require higher quality than the products that use them, too.

I know for instance that I would smack anyone at work who suggested
we write a(nother) security library. I have been in companies where
our library choice was something that was discussed in sales pitches.
If you use something sensitive like a credit card payment processing
engine, customers might be sensitive to what libraries you use to access
it (the vendors, an established third party etc.).

There are also many companies where library authoring is complete
anathema. They don't know how to fit it in with their business processes
(like testing), or how to handle the fact that a library's life expectancy
outlasts its first application and that it needs support of its own.
There may also be significant developer fear about having to author and
effectively support something like that.

Also consider this: the Ruby/Rails environment has probably more active
library development per capita (of developers) than established languages.

I know that in Java
the quality of the basic libraries is excellent, and it is worth checking out
virtually anything that the Jakarta crew works on, as they are especially
good. But ironically, this means that library development skills for your
average java programmer have severely atrophied through lack of use, and
if they then have to write equivalents in Ruby.... that ramps up your risk
completely. I know for a fact that when I am presented with an in-house
java library my first reaction is 'aw crap'. I have been pleasantly surprised
on occasion, and I work with good people now, but there is something
about Java that attracts the worst kind of well-intentioned designer - the
kind that starts developing meta-solutions instead of addressing the
given problem.

> It strikes me as a bad idea in general to pursue edge cases in
> frameworks.

Well, I wasn't distinguishing between Ruby and Rails here. I wasn't
requesting that Rails accommodate edge
conditions -- the correct approach to edge conditions in Rails is
to down-shift to plain old Ruby. This is pretty obvious with its
'helper' hook, is well stated in the good literature, and 'Recipes'
in general are implemented this way.
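For what it's worth, that down-shift is about as simple as it sounds. Here is a minimal sketch (the names `ReportHelper` and `summarize` are hypothetical, not from any app discussed in this thread): a Rails-style helper module is just plain Ruby mixed in wherever it's needed, so the heavy lifting lives outside the framework entirely.

```ruby
# Hypothetical helper module: plain Ruby doing the heavy lifting,
# with no dependency on Rails machinery at all.
module ReportHelper
  # Summarize a list of numeric readings as a short string.
  def summarize(readings)
    return "no data" if readings.empty?
    avg = readings.inject(0.0) { |sum, r| sum + r } / readings.size
    format("%d samples, mean %.2f", readings.size, avg)
  end
end

# In Rails this would be mixed into a view or controller context;
# here a plain class stands in for that.
class Report
  include ReportHelper
end

puts Report.new.summarize([3, 5, 10])  # => 3 samples, mean 6.00
```

Nothing about the module depends on Rails, which is exactly why the framework's edge cases can be escaped this way.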

However, my point was that using Ruby *IN* Rails in this manner
could kill your scaling. It's another one of those rock-and-hard-place
situations - the implication is that only 'sweet spot' Rails apps
can truly scale. Whether Rails scales up to MySpace levels is
another thing, and whether you need twice or ten times as many
machines is another. But *I* don't particularly care about those cases.

I do care about single machine performance though.

> SNMP, I realize, is not an uncommon case, but it's uncommon enough for
> web app development that expecting a web development framework to do the
> heavy lifting for you is a somewhat odd demand, I think.

I try to avoid the thing as much as I can, doctor's instructions. It's a fine
protocol for what it was intended for, but the only uses of it I ever see
are when people build insane application protocols over it.

I doubt I am the first person to think of using Rails' easy machine
parallelism to split up the workload involved when a couple of thousand
SNMP agents start screaming at you all at once.

This problem alone will dictate the shape of our new app. It is a
problem we have never solved properly before.

I would get a perverse satisfaction from using 'risky/slow rails' to solve
our most persistent scaling problem. An ActionSNMP plugin would be
quite cool indeed.
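To make the fan-out idea concrete, here is a rough sketch of the shape such a poller might take in plain Ruby. Everything here is hypothetical: `poll_agent` is a stub standing in for a real SNMP GET (the ruby-snmp gem would do the actual protocol work), and `poll_all` just drains a queue of agents with a fixed pool of worker threads.

```ruby
require 'thread'  # for Queue (stdlib)

# Stub: a real implementation would issue an SNMP request here,
# e.g. via the ruby-snmp gem's SNMP::Manager.
def poll_agent(host)
  { host: host, status: :ok }
end

# Poll all hosts with a bounded worker pool, so a couple of thousand
# screaming agents don't each get their own thread.
def poll_all(hosts, workers = 4)
  work    = Queue.new
  results = Queue.new
  hosts.each { |h| work << h }
  threads = (1..workers).map do
    Thread.new do
      loop do
        host = work.pop(true) rescue break  # non-blocking pop; stop when drained
        results << poll_agent(host)
      end
    end
  end
  threads.each(&:join)
  Array.new(results.size) { results.pop }
end

hosts = (1..100).map { |i| "10.0.0.#{i}" }
puts poll_all(hosts).size  # => 100
```

Whether this wins over spreading the load across Rails processes is exactly the open question; the sketch only shows that the fan-out itself is cheap to express.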

> I'm entirely with you on this: web application framework advocates, for
> any framework in any language, seem to believe that the developer will
> always be the one deploying and that deployment will be accomplished at
> a server where the developer has complete access and control. There
> isn't nearly enough attention on the problem of removing control
> characteristics and direct access capabilities, or even passing on
> deployment to someone else entirely who wasn't part of the original
> picture at all.

Well, Rails is new and a moving target. The audience for Rails solutions
is web developers, and its original core audience was probably the
Perl, PHP, ASP community (I am guessing) who had direct access to
production servers.

The success attracts other kinds of web developers with indirect server
access. I don't expect the Rails binary distribution problem to stay
unsolved for long. Instant Rails is an early step in that direction, and
I don't think the problem is particularly hard anyway - you just bundle
your gems and binary dependencies together. Once you are dealing
with say, using existing MySQL databases you are automatically
dealing with a knowledgeable customer, and you can invisibly install
something like sqlite for the technofearful.

Rails is also attracting attention from weirdos like me who are seeing
what else we can do with it, and who see the lack of convenient
source deployment methods as a problem. The lack of an established
build process is also a big step backward, as we have gotten very used to
100% non-manual build processes that do lots of non-build activities
too (like source analysis, unit tests, product watermarking). Not all
steps are applicable to Rails, and Rails adds some of its own (deploy to
test server & test), but the principle remains the same.

William Grosso

unread,
Sep 3, 2006, 11:50:34 PM9/3/06
to

William Grosso wrote:

> Vidar Hokstad wrote:
>>
>> When we started hiring those concerns were validated: It's proved
>> extremely hard to find people with Ruby experience. While it's
>> certainly getting easier rapidly, not everyone can afford to take the
>> risk. In our case I decided to start phasing Ruby in for small self
>> contained components in our backend, and gradually take it from there
>> as we get enough Ruby skills through training or hiring, which has
>> proven to work well and meant that in the event that we'd run into
>> unforeseen problems, the effort involved in switching back to another
>> language would have been limited.
>>
>
> This leads to an interesting question: how many ruby programmers are
> there, anyway?
>
> I ran across
> http://sanjose.bizjournals.com/sanjose/stories/2006/08/28/daily1.html
> today and boggled at the "2.5 million" number for PHP.
>

Minor correction: I ran across that, and then read the following in Mark
De Visser's profile on LinkedIn.

Zend Technologies creates PHP products, software for rapid development
and deployment of Web applications. PHP is being increasingly adopted
with an estimated 2.5 million developers currently using it and 22 million
deployed websites.


> Any ideas for Ruby?
>
>
> Bill
>
>
>


Peter Booth

unread,
Sep 5, 2006, 5:41:45 PM9/5/06
to
I think he makes an important point: choice of technology is rarely a
proximate cause of project failure. (EJB & CORBA being the two largest
risks I can recall.)

One point he doesn't make is the difference between OPM and MHC (Other
People's Money versus My Hard-earned Cash). When you are spending OPM,
costs can become unreal. So as an employee, a "buzzword-compliant/standard
approach" Java or .NET solution might make more sense than a leaner solution that uses
unfamiliar technology. I can still recall being excoriated by peers and a
manager for choosing to use Ruby to write a proof-of-concept test. The Ruby
spike took eleven minutes to write. A Java version would have taken me 90
minutes, and I am a Ruby novice.
I often ask myself "If this were my cash would I do this?"

I love Java but I wouldn't use it to write application (cf plumbing) code,
if I were paying the bills.

-----Original Message-----
From: Phlip [mailto:phli...@yahoo.com]
Sent: Sunday, September 03, 2006 3:46 PM
To: ruby...@ruby-lang.org
Subject: Re: Joel Spolsky on languages for web programming

Joseph wrote:

> Although I respect Joel very much, I believe he makes a fundamental
> mistake in his reasoning.

Joel is such a good writer that sometimes his jaw-dropping errors are
impossible to refute. (And don't encourage him; he loves it when you fight
back!)

> Basically what he is saying can be deconstructed this way:
>
> * Do not risk developing in new cutting edge technology. Even if
> successful proof of concepts are already out there (37 signals et. al)
> * Use what most people use: PHP / J2EE / .Net not what most experts
> tell you to use. Communities and support are paramount.

The open source tools that succeed must have higher technical quality than
the Daddy Warbucks tools. The latter can afford to buy their communities and
"support" networks. Because an open source initiative cannot buy its
community and marketing, only the strong survive, and their early adopters
will form this community spontaneously. They will provide the true
word-of-mouth advertising that marketing tends to simulate.

And I am sick and tired of seeing shops dragged down by some idiotic
language choice made between the marketeers and a computer-illiterate
executive.

> * Corporations and the people in those organizations favor safety, if
> your job is on the line go with the tried and true. Take no risks.

Ah, so looking like you are following best practices is more important than
doing everything you can to ensure success. Gotcha!

Yes, I have seen that up close, too!

> All three assumptions rely on a single assumption: FEAR.
>

> * Fear the technology would eventually not deliver.

> * Fear the support will not be sufficient.

> * Fear regarding your job safety as a corporate developer or manager
> who chooses Ruby or Ruby on Rails for some mission critical project.

Yup - that's the Fear Uncertainty and Doubt formula that Microsoft (among
others) use all the time. They have tried, over and over again, to FUD
Linux. Their CEO will get up on stage and say incredibly stupid things, like
"if an open source platform fails you, there is nobody you can go to for
help!" He means there's nobody you can sue. As if you could go to MS for
help, without paying through the nose...

Oh, Joel is pro-Linux, right? What's the difference??

> All assumptions are wrong.

Better, fear that your boss will experience misguided fear.

--
Phlip
http://c2.com/cgi/wiki?ZeekLand <-- NOT a blog!!!



Chad Perrin

unread,
Sep 4, 2006, 6:25:44 AM9/4/06
to
On Mon, Sep 04, 2006 at 01:23:16PM +0900, M. Edward (Ed) Borasky wrote:
>
> The gap has narrowed. It's rare that an assembly language coder can beat
> a compiler by more than a factor of 2 these days, and on some
> architectures it's a dead tie -- there's only one way to do something
> and the compiler always finds it. Interpreters are better now too,
> mostly because today's languages have such a large component that has to
> be dealt with at run time anyway that the "heavy lifting" is done by
> compiled code.

This brings me to a thought I've been having a lot, lately: that the
future of compiled code will probably start looking in some ways more
and more like interpreted code. I don't see why we can't, relatively
soon, have a compiler that produces a compiled executable of a dynamic
language such as Ruby that does not require a VM or interpreter to be
run (outside of the very loose definition of "interpreter" or "VM" that
might include your whole OS in that definition). The dynamic aspects of
the language would be handled by the executable binary itself, rather
than by an interpreter external to the program.

I'm not entirely sure how to explain what I'm thinking at this time so
I'm sure I get my point across. Hopefully someone who reads this will
get where I'm aiming, and may even be able to help me clarify it.


>
> I'm not sure JIT is "necessary" for efficient interpretation of Ruby
> anyway. But you're right ... if the economics is there, the gap will get
> closed, just like the compiler/assembler gap got closed.

There are things that Ruby allows that simply cannot be done without a
certain amount of runtime interpretation, with the possible exception of
the evolution of persistent compiled executable binaries described
above.

Chad Perrin

unread,
Sep 4, 2006, 6:32:07 AM9/4/06
to
On Mon, Sep 04, 2006 at 01:00:07PM +0900, M. Edward (Ed) Borasky wrote:
> Chad Perrin wrote:
> >
> > Frankly, iptables is easier to learn effectively than most proprietary
> > firewalls -- and then there's stuff like IPCop, which makes things even
> > easier.
> >
> When there are certified iptables engineers, I'll trust my business to
> them. Until then, I'm sticking with Cisco and certified Cisco engineers.
> When you post a job application for a sysadmin position, you're going to
> get at least ten times as many applicants as you need, so you can afford
> to *insist* that they be certified by Cisco, Microsoft or Red Hat as
> appropriate.

As someone with a combination of college, trade school, on-the-job, and
purely autodidactic education, with several certifications of my own, my
experience is that all certifications really prove is A) a certain
minimum standard of test-taking competence that can be sized up within
five minutes of meeting someone anyway and B) a certain amount of money
invested in professional advancement.

. . . with the exception that some certifications require certain resume
bullet-points before one is allowed to take the certification exam in
question (CISSP comes to mind). Considering one doesn't require a
certification to determine whether someone has such resume
bullet-points, however, that seems irrelevant.

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]

unix virus: If you're using a unixlike OS, please forward
this to 20 others and erase your system partition.

Devin Mullins

unread,
Sep 4, 2006, 5:55:15 PM9/4/06
to
Richard Conroy wrote:
> I would have thought that eliminating risk would be a job well done
> by someone responsible for Risk Management? No?
Not that I'm at all knowledgeable in risk management, but No. Breaking
up the sum total of possible outcomes into "profit" and "risk" is Bad.
There are all sorts of axes on which to measure success -- profit,
employee retention, positive reviews, server uptime, etc. -- all of
which can go in the plus or the minus, and none of which can be
predicted with certainty.

That is, I'd imagine a good Risk Manager has to balance Possibility of
Server Crash against Possibility of Profit, and *if* ____ can greatly
increase the Possibility of Profit at the cost of only slightly
increasing the Possibility of Server Crash, it might be worth pursuing.

> I am seeing an awful lot of chatter here along the lines that technology
> decision makers are insipid jobsworths who fall in line behind the big
> tech brands because they are afraid to stick their neck out.
Well, there's a mix, obviously. I happen to work for an organization
that doesn't do a great job filtering out the insipid ones, but there
are still some non-insipid ones around.

> The concerns are not that questions exist, but that the questions are
> not being really well answered. Some concerns that I have about RoR:
> - lack of good success & failure case studies with lessons learned
1. Have you read these two articles from Relevance, LLC?
http://tinyurl.com/j63la
http://tinyurl.com/h6f5g
2. What such case studies have you read about the other options you're
considering?

> - library (Ruby) and plugin (Rails) immaturity
> - library portability
> - what happens to productivity when you go outside the rails problem domain
> - how narrow is that problem domain (how easy is it to overstep)
> - how forgiving is the technology, if you make mistakes/bad assumptions,
> how easy is it to recover
> - immaturity of tools
Well, it's no wonder those questions aren't being answered. They're
ridiculously vague. Ruby/Rails libraries/plugins/tools are all over the
map -- some are mature; some are immature; some are mature but
incomplete; some are complete but immature; some have been tested on all
types of systems; some only work on a POSIX environment; some work on
all environments, but aren't really tested on Win32; some are fast; some
are slow; no two libraries are not on fire.

> - what happens (to productivity/performance) when your rails apps
> need to do wierd stuff like bolt-on SNMP processing ruby-code
Never had to do SNMP processing, sorry.

> - deployment of Rails apps/bundling rails apps
There are LOTS of case studies of this. Assuming you've read them, the
next step is to deploy something and find out.

> - international support
I hear tell of a number of multilingual Rails apps. I don't have to deal
with i18n, myself.

> I am happy enough with a lot of these issues to go with a Rails solution
> for something non-critical or prototyping. But I can't in good faith bet
> the project on it.
That's probably a good bet, especially if your employer's got no Ruby
talent on hand. I wouldn't wait a year; just measure the success of your
own non-critical app.

Devin

Richard Conroy

Sep 6, 2006, 5:26:48 PM
On 9/6/06, Devin Mullins <twi...@comcast.net> wrote:
> Long post! Ack!

Yeah, research and reqs gathering has pimped out my typing
skills. I have written more English than code this year. :sob

> So... the 50-100% extra time is okay, as long as it's known up front?

If it's known, and can be planned for, and the identified risk is either
low or has backup plans, then it's generally okay. Unless you
are in a situation where you are making software commodities
(like yet another social networking site) and productivity converted
into exclusive features is what drives your business, the productivity
hit is okay.

Or at least that's project-manager think: that a low-scalar productivity
hit is fine as long as there is no additional hit due to risk.

I don't tend to agree. Once a project schedule extends beyond a low
single-digit number of months, it turns into pure fantasy. Also, software
schedules are like flypaper - the longer they are, the more shit
sticks to them. 'Completed' software attracts better change requests
than incomplete software.

When the conception date and the ship date become very far apart,
project sponsors forget their original reasoning behind a feature request.
You can bitch about marketing all you want, but they are just human
too.

I have done enough agile work in Java to appreciate on a surface level
what Ruby/Rails can do for you. *I* believe the productivity boost is
extremely important, and the time saved can be applied to polishing
the final product or introducing more features, or as a hedge against
possible risky areas of Ruby/Rails (say no secure SNMPv3 support)
where we might have to wrap a java library or something. Under those
circumstances, risk assessment is about showstoppers like:
- you cannot accomplish something in Rails *at all* - you have overstepped
the current capabilities of the framework and Ruby has no good fallback, or
existing functionality is not 100% ready for primetime or your needs
(e.g. crypto stuff, i18n, enterprise libraries, install on a specific
platform etc.)
- the single-CPU performance of a properly optimised solution is 'not
good enough' versus a Java equivalent
- Rails app distribution to customers is hard to achieve well

These are *my* risk concerns. I am listing them, not stating them as
facts. But the other concerns I am familiar with too, as I know how my
peers and superiors think, and they are the kind of questions they will
ask. So I look out for them - even though I have personally confirmed
them as non-issues, I need to have a prepared defense against them.

> Why not just pad the extra 50-100% for the Ruby estimate, and just spend
> the last few weeks partying when you finish early? :P

lol! They would probably get suspicious as my tan started to improve all
the way up to the march delivery date ....

> >> 2. What such case studies have you read about the other options you're
> >> considering?

> My question had an agenda. I meant: leading up to the moment you picked
> [Java, I presume], what case studies had you read about its use? Just
> trying to scope out for any double-edged swords. Sorry; I was cranky
> yesterday.

Heh. I am thick-skinned - I use up all my irritability on marketing; heated
responses don't faze me.

And to subvert your agenda, it's probably because of the Java situation
that people are sensitive about technology choice. During Java's early
adoption phase there was really nothing like it at the time, and there
was a total upheaval in software development as this web thing started
to become a platform.

Java eventually matured into a successful language, after burning through
the hype bubble. Once the smoke cleared people realised that WORA
didn't mean Java Office, or Java OS, but in fact meant that java code
would run on whatever you ported your runtime to. Once it failed as a
consumer GUI, people moved on to what it was really good at, like
security, i18n, network software, development solutions and web apps.

But people really got hurt in the 1.1 era. It had enough language/api
features to be useful, but it hurt to develop in. As a result people are
anal about technology choice. Of course now we actually *have*
choices to be anal about.

And if they ever got burned on a development choice before, they can
go ultra-conservative. There is a perception that Rails may have certain
weaknesses. Many are not true, some are. Not all the false ones
are being properly dismissed as rubbish, and some of the true ones
are not being debated enough for them to be quantified.

And Joel is right: technology steering committees are a useless
waste of time.

> Well, at RailsConf, I talked to a guy who'd never programmed in his life
> before Rails, and he said that within 2 months of picking it up, he'd
> deployed an app to a customer. *shudder*

Nods. I was very fascinated by how it promotes best practices. It removes
unnecessary choices from you, like where your view code goes, where to
put your utility code (helpers) and wiring code (controllers), and how to
break up HTML generation usefully (partials & components).
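
A toy sketch of what I mean by "where code goes" - every name here is
invented for illustration, not taken from a real app:

```ruby
# Where Rails wants each kind of code to live (all names made up).

# app/helpers/agents_helper.rb -- utility code lives in a helper module:
module AgentsHelper
  def status_badge(agent)
    agent[:up] ? '<span class="up">UP</span>' : '<span class="down">DOWN</span>'
  end
end

# app/controllers/agents_controller.rb -- wiring only, no heavy logic:
#   class AgentsController < ApplicationController
#     def index
#       @agents = Agent.find(:all)   # Rails 1.x finder style
#     end
#   end
#
# app/views/agents/_agent.rhtml -- a partial, rendered once per agent:
#   <li><%= agent[:name] %>: <%= status_badge(agent) %></li>
```

The point is that none of these placements is a decision you get to agonise
over - the framework already made it.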

Hell, once I am up to speed I am thinking of teaching it to my Dad, who,
close to retirement age, is considering learning how to do some development.
In the interests of prolonging his existence on this planet, I think Ruby/Rails
is an excellent choice. It's got practical applications, it has a short feedback
cycle for learning, and principle-of-least-surprise is a real phenomenon and
not language-fanboy-speak.

Also - imagine if your customer isn't too fussy and is happy enough with
scaffold code ... it boggles the mind at how far you could take it and how
many people you could get to do it. The hardest thing about that kind
of software development is finding more customers like that.
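
For anyone who hasn't watched it happen: in Rails 1.x, that scaffold is one
generator command (the model name here is just an example):

```shell
# run from the application root; generates a controller, CRUD views and
# tests for an existing Product model/table
ruby script/generate scaffold Product
```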

> Hrm. The only thing I really recall breaking a whole bunch is Engines.
> That said, through experience developing some of these apps, I've become
> much more conservative of what plugins I use. I wrote my own tagging
> code; were I to do it again, I'd write my own user/password code; etc.
> Not so much because of Rails upgrades causing breakage, but because the
> plugin implementations turned out to be flawed/buggy (read: poorly tested).

Well, this is what I call a legitimate concern, but it is also something that
I think is hugely overblown by jittery decision makers. Considering the
speed of Rails development, by the time you would start to encounter
plugin issues, say 4-6 months into the project, those issues may have
been resolved, or replacement plugins may have arrived. Hell, even swapping
out the standard interpreter for JRuby might address it, as you could just
use an equivalent mature Java library instead.

The discussion I read here about new Rails versions breaking plugins was
more about the philosophy of Rails development - that maintaining
backwards code compatibility wasn't as primary a concern as it is in
other frameworks. That's the digest; I don't know how true it was, but
it has now added to my Perception(tm) of Rails. I have looked, and
identified that I need Globalize or an equivalent, and BackgroundRb.
And to write all my own code.

> > to. People with an eye for risk take that pretty seriously - they
> > expect security issues, but not rock-and-hard-place conflicts like
> > that.
> That's true, and that's one of the ways in which Rails needs somewhat
> guru coders -- ones who test their app thoroughly, and are able to patch
> the broken spots when they come up.

There's also a concern that if you have written something for a customer,
you need a way to update them or patch their install. That's the flip side
of my deployment concern: automatic updates to software.

>
> >> Ruby/Rails libraries/plugins/tools are all over the map

> > i.e. all over the place - that makes it risky
> Same could be said about any language, no?
>
> > I don't see enough libraries
> > that I am likely to depend on, at a mature enough version to warrant doing
> > anything mission critical with them. Theres a lot of libraries that require
> > binary installations that are unavailable on all platforms.
> Ah, yeah, can't help you there -- I haven't needed much in the way of
> libraries -- XML parser, HTML parser, etc.. I might be able to help you
> with the second part in a few days -- I'm finally getting around to
> profiling, and hence need to compile Shugo's or ZenProfiler for win32.

I guess wait one year. A lot of this is due to the 'enterprise-style' APIs
coming into Ruby fairly recently (so new that the docs aren't up to date,
e.g. the HTTP class).

> > write your own equivalents (which will wipe
> > out your rails productivity boost).

> Well, not according to the Relevance folks, but I admit, they seem
> pretty adept.

Being at their level is the 'goal' you might say - and my guess is that
a serious showstopper in a library or plugin could wipe out the productivity
boost for an organisation getting into Rails. Which is what I am primarily
concerned with.

> I'm confused... are you asking if the Ruby language is as quick to
> program in as the Rails framework, or if the framework is fast despite
> the language? Or are you talking about writing a Ruby extension?

I am not talking about productivity boost here. I am looking at a
legitimate scaling concern. If you need to do something strange
in Rails you simply write Ruby code in your helpers that does it,
or write a funky plugin.

However, I was raising a concern that perhaps Rails only scales well
if you don't do this, or are otherwise very careful about how
you introduce pure-processing code like this.
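
One mitigation I keep coming back to is making the pure-processing code pay
its cost once rather than on every request - plain-Ruby memoization, nothing
Rails-specific. A sketch (class and method names are mine):

```ruby
# Cache the output of an expensive pure-Ruby computation so repeated
# requests for the same input don't repeat the work.
class GraphRenderer
  def initialize
    @cache = {}
  end

  # Returns the cached HTML when the same id list was rendered before.
  def html_for(agent_ids)
    @cache[agent_ids] ||= expensive_html(agent_ids)
  end

  private

  # Stand-in for any heavy pure-processing code.
  def expensive_html(agent_ids)
    agent_ids.map { |id| "<div class=\"node\">agent #{id}</div>" }.join("\n")
  end
end
```

It doesn't make the funky code fast; it just keeps it off the per-request
path, which is where the scaling concern actually bites.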

> In any
> case, I don't know if you are going to get metrics much more specific
> than the Relevance posts. People seem pretty guarded about their own
> professional productivity. Probably a little productivity abritage going on.

Yeah they were good. And refreshingly honest. I have been through
several hype bubbles before and sycophants do more harm than
good.

> Interesting. Well, Rubyscript2Exe is supposed to do just that. Never
> used it, but seen an example package being run. A little slow to start up.

Deployment is way more involved than that, but it's a start. An interesting
way to look at what I mean by deployment:
"What would be involved to make your Rails app installable by your Mom?"
Basically going from nothing (and I mean nothing - you can't expect
Rails knowledge or domain knowledge on the part of the person installing it)
to a 100% working Rails app.

You also need to consider the update case as well, like Windows updates.

Once this is a solved problem, you can address all the intervening problems,
like advanced/customisation by someone who knows their stuff and wants
to use their existing web server/database.

And then you address the remote install problem, and look into whether you
need to do anything weird for virtualised systems.

> > I don't have a problem with working on Rails myself, I always figured
> > that if
> > I could get a functional prototype of what our main app is doing,
> > working in
> > Rails, it would be a good start. But its all these edge conditions that
> > would
> > kill Rails adoption - and I don't have the time to both address/investigate
> > the concerns
> Ah. Well, I might, were I working for you.
>
> But I'm not. :D

What bugs me is that in order to make a convincing pro-Rails argument
I would effectively have to write the app (software that manages other
software agents on the network) in its entirety. While Rails
has excellent productivity improvements, it's not *that* good - not for someone
like me who is still learning. While I am tantalisingly close to getting
certain aspects working in some rough code (even cheating: communicating
with the web-enabled software agents by scraping their UI with
WWW::Mechanize or even WATIR), the only way to determine if there are
no i18n issues is to Globalize it, and the only way to address how
installable Rails is, is to make a 1-click installer. And that's not fun at all,
though WiX might help.

More serious, though, is the investment in process-type stuff (tying in
with our automated build system/auto testing etc.).

Yeah, just one more body would make the difference. The scary thing is
that the steps required to rapidly prototype a java app usually mean
that the code is only fit for burning afterwards. But the difference between
a Rails prototype and a proper app is some refactoring and going back
to write the unit tests.

William Grosso

Sep 3, 2006, 11:47:30 PM
Vidar Hokstad wrote:
>
> When we started hiring those concerns were validated: It's proved
> extremely hard to find people with Ruby experience. While it's
> certainly getting easier rapidly, not everyone can afford to take the
> risk. In our case I decided to start phasing Ruby in for small self
> contained components in our backend, and gradually take it from there
> as we get enough Ruby skills through training or hiring, which has
> proven to work well and meant that in the event that we'd run into
> unforeseen problems, the effort involved in switching back to another
> language would have been limited.
>

This leads to an interesting question: how many Ruby programmers are
there, anyway?

I ran across http://sanjose.bizjournals.com/sanjose/stories/2006/08/28/daily1.html
today and boggled at the "2.5 million" number for PHP.

Any ideas for Ruby?


Bill


M. Edward (Ed) Borasky

Sep 4, 2006, 12:23:16 AM
Chad Perrin wrote:
> On Mon, Sep 04, 2006 at 02:13:24AM +0900, Alex Young wrote:
>>> Speaking purely theoretically, Ruby cannot be made as performant as
>>> Java or C# could be made if they had ideally performing implementations.
>>> Latent typing makes it almost impossible to do certain optimizations as
>>> static typing does. That's pure fact.
>> I remain unconvinced by this - and it's mainly JIT optimisation that
>> keeps me on the fence. Dynamic optimisations can beat static - but not
>> in all cases. I believe this is what one calls an "open research" question.
>
> Unfortunately, JIT implementations haven't been subjected to the same
> long-term scrutiny and advancement as more traditional persistent binary
> executable compiling implementations. As a result, I don't think the
> state of the art is there yet -- leaving JIT implementations effectively
> slower by nature until they get some more advancement over the years to
> come. I really believe that gap will be closed rapidly in the near
> future. Only time and experience will tell whether it can be made as
> fast or faster, though I have no doubt that it can at least be made
> close enough that most of us won't care.

In the "good old days", an assembly language programmer could turn out
code that was from 2 to 10 times as fast as that turned out by a
compiler, and a compiler could turn out code that was from 2 to 10 times
as fast as an interpreter.

The gap has narrowed. It's rare that an assembly language coder can beat
a compiler by more than a factor of 2 these days, and on some
architectures it's a dead tie -- there's only one way to do something
and the compiler always finds it. Interpreters are better now too,
mostly because today's languages have such a large component that has to
be dealt with at run time anyway that the "heavy lifting" is done by
compiled code.
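
That last point is easy to see with the stdlib Benchmark module: sort the
same array with a pure-Ruby insertion sort and with the C-implemented
Array#sort. Both give identical answers; the interpreted version just pays
for every comparison and swap individually. A sketch:

```ruby
require 'benchmark'

# Pure-Ruby insertion sort: every comparison and swap runs in the interpreter.
def insertion_sort(a)
  a = a.dup
  1.upto(a.length - 1) do |i|
    j = i
    while j > 0 && a[j - 1] > a[j]
      a[j - 1], a[j] = a[j], a[j - 1]
      j -= 1
    end
  end
  a
end

nums = Array.new(3_000) { rand(1_000_000) }

Benchmark.bm(12) do |b|
  b.report('pure Ruby')  { insertion_sort(nums) }
  b.report('built-in C') { nums.sort }
end
# Same result either way; the built-in sort is dramatically faster because
# its "heavy lifting" is compiled code.
```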

Chad Perrin

Sep 6, 2006, 2:59:35 PM
On Thu, Sep 07, 2006 at 01:54:44AM +0900, Richard Conroy wrote:
>
> Time saved is not the only thing that libraries provide. For instance, you
> may simply not have the expertise or business mentality for library
> development. e.g. Marketing, management and other project stakeholders
> cannot determine if a library is 'good' or 'done' or whatever - a library is
> a software product whose customers are developers. Libraries generally
> require higher quality than the products that use them too.

Mostly, they require higher quality APIs. Everything else can be
changed.


>
> I know for instance that I would smack anyone at work who suggested
> we write a(nother) security library. I have been in companies where
> our library choice was something that was discussed in sales pitches.
> If you use something sensitive like a credit card payment processing
> engine, customers might be sensitive to what libraries you use to access
> it (the vendors, an established third party etc.).

Good point(s).


>
> There are also many companies where library authoring may be a complete
> anathema. They don't know how to fit it in with their business processes
> (like testing) or how the life expectancy and support of a library outlasts
> its
> first application, and how it gets support. There may also be significant
> developer fear about having to author and effectively support something
> like that.

I'm not terribly sympathetic to unreasonable fears, I'm afraid. I am,
however, sympathetic to people who have to labor under conditions of
unreasonable fear engendered by the ignorance of supervisors, which is
almost the same thing.


>
> Also consider this: the Ruby/Rails environment has probably more active
> library development per capita (of developers) than established languages.
>
> I know in Java,
> the quality of the basic libraries is excellent, and it is worth checking
> out
> virtually anything that the jakarta crew work on, as they are especially
> good. But ironically, this means that library development skills for your
> average java programmer have severely atrophied through lack of use, and
> if they then have to write equivalents in Ruby.... that ramps up your risk
> completely. I know for a fact that when I am presented with an in-house
> java library my first reaction is 'aw crap'. I have been pleasantly
> surprised
> on occasion, and I work with good people now, but there is something
> about Java that attracts the worst kind of well-intentioned designer - the
> kind that starts developing meta-solutions instead of addressing the
> given problem.

I'll take your word for it. I'm not active in the Java community, and
would probably be stoned as a heretic if I was (nouns bore me, and an
entire language devoted to writing in passive voice never struck me as a
very good idea).


>
> >It strikes me as a bad idea in general to pursue edge cases in
> >frameworks.
>
> Well, I wasn't distinguishing between Ruby and Rails here. I wasn't
> requesting that Rails accommodate edge
> conditions - the correct approach to edge conditions in Rails is
> to downshift to Plain Old Ruby. This is pretty obvious with its
> 'helper' hook, is well stated in the good literature, and 'Recipes'
> in general are implemented this way.
>
> However, my point was that using Ruby *IN* Rails in this manner
> could kill your scaling. It's another one of those rock-and-hard-place
> situations - the implication is that only 'sweet spot' Rails apps
> can truly scale. Whether Rails scales up to MySpace levels is
> one thing, and whether you need twice or ten times as many
> machines is another. But *I* don't particularly care about those cases.

MySpace doesn't even scale to MySpace levels. It's written in CFML, for
crying out loud.


>
> I do care about single machine performance though.
>
> >SNMP, I realize, is not an uncommon case, but it's uncommon enough for
> >web app development that expecting a web development framework to do the
> >heavy lifting for you is a somewhat odd demand, I think.
>
> I try to avoid the thing as much as I can, doctor's instructions. It's a fine
> protocol for what it was intended for, but the only uses of it I ever see
> are when people build insane application protocols over it.

Maybe that's the problem. All I know for sure is that I've never run
across a situation where my first thought was "Y'know, I could solve
this by writing an app that leverages SNMP."


>
> I doubt if I am the first person to think about using Rails' easy machine
> parallelism to split up the workload involved when a couple of thousand
> SNMP agents start screaming at you all at once.
>
> This problem alone is what will dictate the shape of our new app. It is
> not a solution we have ever solved properly before.
>
> I would get a perverse satisfaction from using 'risky/slow rails' to solve
> our most persistent scaling problem. An ActionSNMP plugin would be
> quite cool indeed.

I'm curious about how that goes, whether you end up using Rails or not.


>
> >I'm entirely with you on this: web application framework advocates, for
> >any framework in any language, seem to believe that the developer will
> >always be the one deploying and that deployment will be accomplished at
> >a server where the developer has complete access and control. There
> >isn't nearly enough attention on the problem of removing control
> >characteristics and direct access capabilities, or even passing on
> >deployment to someone else entirely who wasn't part of the original
> >picture at all.
>
> Well Rails is new, and a moving target. The audience for Rails solutions
> is web developers, and its original core audience was probably the
> Perl, PHP, ASP community (I am guessing) who had direct access to
> production servers.

There is a startlingly high number of web developers out there who do
the sort of consulting work that does not guarantee direct access to
production servers, unfortunately.


>
> The success attracts other kinds of web developers with indirect server
> access. I don't expect the Rails binary distribution problem to stay
> unsolved for long. Instant Rails is an early step in that direction, and
> I don't think the problem is particularly hard anyway - you just bundle
> your gems and binary dependencies together. Once you are dealing
> with, say, existing MySQL databases, you are automatically
> dealing with a knowledgeable customer, and you can invisibly install
> something like SQLite for the technofearful.
>
> Rails is also attracting attention from weirdos like me who are seeing
> what else we can do with it, and who see the lack of convenient
> source deployment methods as a problem. The lack of an established
> build process is also a big step backward, as we have gotten very used to
> 100% non-manual build processes that do lots of non-build activities
> too (like source analysis, unit tests, product watermarking). Not all
> steps are applicable to Rails, and Rails adds some of its own (deploy to
> test server & test), but the principle remains the same.

When a Rails app deployment process consists of unceremoniously dumping
a bunch of files into a directory across an SFTP connection, then it
will be ready for prime time in the indirect-access market. Until then,
it's still pretty much all Perl and PHP. That's how it looks to me, at
any rate.
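
i.e. the bar is something no more ceremonious than this (host and paths
invented) - the whole "deployment process" for a typical PHP app on a
shared host:

```shell
# copy the app tree up to the shared host; nothing else to configure
scp -r myapp/ user@shared-host.example.com:public_html/
```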

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]

Brian K. Reid: "In computer science, we stand on each other's feet."

M. Edward (Ed) Borasky

Sep 4, 2006, 3:25:36 PM
Chad Perrin wrote:
>> Perhaps you're thinking along the lines of Lisp or Forth, where an
>> application is layered on top of the entire compiler/interpreter/runtime
>> package and then saved as an executable. As far as I can tell, there's
>> absolutely no reason this couldn't be done for Ruby. IIRC that's also
>> the way the Squeak Smalltalk environment works and the way Self worked.
>
> No . . . that's not quite it. Maybe a really bad diagram will help.
>
> interpreter for a dynamic language:
> |--------------------------------------------------|
>
> interpreter capabilities exercised by a program in a dynamic language:
> |++++++++++++|
>
> compiled static binary for an equivalent program from a static language:
> |++++++++++++|
>
> combination static/dynamic compiled binary from a dynamic language:
> |+++++++++++----|
>
> . . . roughly.

You can usually do something like this in Forth. As you're developing,
you save off the whole enchilada (the Forth interpreters and compiler
and assembler, along with your application code, all of which reside in
the dictionary) as an executable. When you're ready to release the
application, you take a special pass and strip out everything your
application doesn't use, getting a smaller executable that only contains
the pieces of the Forth environment needed to run the application.

I haven't spent any appreciable time inside either Common Lisp or
Scheme, or for that matter Ruby, so I don't know how this would work in
any language except Forth. Maybe what you want is as "simple" as
implementing Ruby on top of Forth. :)

> There would likely be more binary size necessary, but considering that
> even an interpreter is (generally) a compiled binary that just operates
> on input, I don't see any reason to assume we cannot cannot compile
> dynamic language code into a persistent binary with accomodations made
> for the parts of the program that require runtime dynamic behavior.

No reason it can't be done. The question is only "should it be done?" :)

> This strikes me as a superior approach to a JIT compiler/interpreter
> approach like Perl's, a pure interpreter approach like Ruby's, or a
> bytecode compilation plus runtime interpreter VM like Java's, for
> performance.

Java also has JIT, of course. Curiously enough, someone once told me
that if I looked at the JVM carefully, I'd see Forth. :)


> Add to that the potential increased performance for some
> parts of a program written in a more dynamic language, and something like the
> following might actually run faster than the equivalent compiled program
> I diagrammed above:
>
> |+++++++--------|
>

> . . . depending on how well those dynamic bits (represented by the
> hyphens) optimize at runtime for a particular run of the program.

Well ... maybe we should leave that to the chip? :)

Richard Conroy

Sep 5, 2006, 1:36:02 PM
On 9/4/06, Devin Mullins <twi...@comcast.net> wrote:
> Not that I'm at all knowledgeable in risk management, but No.

Risk management isn't really about that. Every time I propose a
design I am exercising risk management. One way of measuring
that design's quality is by how much it addresses, minimises or
identifies risk.

I have worked in bleeding edge technology for most of my career
so you get a 6th sense hammered into you for likely risk sources.

The fact that one solution may take 50-100% longer to implement
isn't necessarily a big risk. But when a library or technology that
is critical to your infrastructure exposes a project-breaking flaw,
you are in serious sh1t: unless there is a feature-equivalent,
non-broken alternative, or you can conveniently write your own, your
project schedule is out the window. Your initial schedule is nothing
compared to the time overruns these kinds of showstoppers
introduce.

> Well, there's a mix, obviously. I happen to work for an organization
> that doesn't do a great job filtering out the insipid ones, but there
> are still some non-insipid ones around.

I work for an organisation where there are a lot of smart decision makers,
and there are a lot of otherwise smart decision makers who have lapses
of judgement (often during committee-style design). The best stuff here always
happens when the number of designers is low and the scope is controlled.

We don't have a problem with technology experimentation,
but we are officially in an area that I would term 'hard software development'
- regardless of how much time and skill we have available, it is not always
possible to deliver something better than a 'merely acceptable' solution.
There are often too many elements in the picture, and too many cruddy
interfaces. So there is a strong tendency to default to conservative options.

> 1. Have you read these two articles from Relevance, LLC?
> http://tinyurl.com/j63la
> http://tinyurl.com/h6f5g

Just now. He confirms what I suspected about Rails productivity gains.
I knew the 10x thing was pure bull, but as far as I am concerned, shaving
30%-50% off an equivalent Java solution is actually a *great* result.

I look forward to his discussion on Rails outside the sweet spot.

Though there is a bit too much zealotry.

> 2. What such case studies have you read about the other options you're
> considering?

I have read this negative one:
http://rationalist-manifesto.blogspot.com/2006/07/web-applications-vs-web-sites-ruby-on.html
I think the article sucked, though. He raised a legitimate concern about
Rails outside the sweet spot, based on a very, very flawed idea of his. It's
clear he tried something stupidly outside the sweet spot and then concluded
that Rails outside the pale was not wise. He didn't back up his findings at
all. Frustrating. I would have liked to see the point made by someone
objective, and I would like to have seen actual findings.

Mostly what I am concerned about are war stories:
- case studies of Rails adoption, even from companies that were not strong
in web development
- library gotchas
- Java conversion stories
- application of Rails outside of classic web-site problem domains

<snip>


> Well, it's no wonder those questions aren't being answered. They're
> ridiculously vague.

I don't think so - these are questions that get commonly debated
in other platforms. Many are deal breakers or otherwise a massive
source of risk.

I have seen reports on this list that Rails churns its plugins, that
correct plugin operation is not guaranteed as it matures. This could
lead to issues where you are dependent on plugins for essential
behaviour like Background Ruby, or Globalize, and a 1.8.5-style
security fix comes along that you cannot safely or quickly upgrade
to. People with an eye for risk take that pretty seriously - they
expect security issues, but not rock-and-hard-place conflicts like
that.

> Ruby/Rails libraries/plugins/tools are all over the
> map -- some are mature; some are immature; some are mature but
> incomplete; some are complete but immature; some have been tested on all
> types of systems; some only work on a POSIX environment; some work on
> all environments, but aren't really tested on Win32; some are fast; some
> are slow; no two libraries are not on fire.

i.e. all over the place - that makes it risky. I don't see enough libraries
that I am likely to depend on, at a mature enough version, to warrant doing
anything mission-critical with them. There are a lot of libraries that require
binary installations that are unavailable on all platforms.

Libraries are life, and generally the greatest source of risk; developers like
libraries that they can trust. If you can't trust them, you have to limit
yourself to problems that do without, or write your own equivalents (which
will wipe out your Rails productivity boost).

> > - what happens (to productivity/performance) when your rails apps
> > need to do weird stuff like bolt-on SNMP processing ruby-code
> Never had to do SNMP processing, sorry.

Well, lots of people do, and the question still stands as to what happens
to the boost if you have to do any significant Ruby code processing in
Rails (non-SNMP). I have got some sloppy partial code around that
builds up a DIV graph, and there is a perceptible delay in the drawing
of the page that uses it. Admittedly it's under worst-case scenarios, but
it does strike me that you really don't want to be doing anything funky
in your controllers at all.

> > - deployment of Rails apps/bundling rails apps
> There are LOTS of case studies of this. Assuming you've read them, the
> next step is to deploy something and find out.

I am not talking about the classic situation where you own the server or
can directly install on customer equipment. I am talking about scenarios
where you have to produce a Windows/Mac OS/Linux installer/packager
that will extract a working Rails app with dependencies and
minimal interaction from the user.

I would kill to read a step-by-step example of this - from source control
to OS-specific, user-friendly installers. I have been looking a bit at
Capistrano; I haven't delved deeply enough, but it doesn't seem to
go the full distance that I am talking about. And if I am not mistaken,
the author has stated that it's not a path he will pursue.

I saw another article that was pretty cute too - a guy had a tutorial
that went the long way round of achieving a Rails app bundle. Didn't
go the full distance but it was pretty impressive. Sorry I can't recall the link
or the author's name, but I would say many people here know the
article I am talking about.

> > - international support
> I hear tell of a number of multilingual Rails apps. I don't have to deal
> with i18n, myself.

Not something we can ignore - and we get all the hard languages to
deal with: Chinese (simplified and traditional) and Japanese. And while most
European countries or European-language customers are OK with our enterprise
solutions in English, that disappears when you start dealing with governments.

Globalize seems sufficient to the task, and honest about what it delivers.

I have done some research and it seems to clearly delineate between
'for free' behaviour and stuff you have to mess with yourself (date i18n
controls, bi-directional text), and it has hooks for when you need
to redirect to locale-specific views & templates. And maybe locale-specific
partials too. Ideally you want to minimise the amount of view localisation,
but sometimes that's cleaner.

I am just wondering what its resource input formats are like, whether
it's strict about them, whether it supports insertion points, and whether you
can build resource formats by hand or dynamically (converting an existing
XML format into a Globalize format). Also I need to look at whether there is
a lookup cost (I think it's one-time-only). I guess I just have to poke
around with the system more and have a look at the formats.
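
I don't know Globalize's internals well enough to say, but the lookup itself is cheap to reason about. A toy, hand-rolled version of locale resolution with fallback (purely illustrative - not Globalize's resource format or API) looks like this:

```ruby
# Toy sketch of translation lookup with locale fallback - illustrative
# only, not Globalize's actual lookup mechanism.
TRANSLATIONS = {
  'en' => { 'greeting' => 'Hello', 'bye' => 'Goodbye' },
  'fr' => { 'greeting' => 'Bonjour' }   # 'bye' missing: falls back
}

# Resolve key in locale, falling back to the default locale, then to
# the key itself, so untranslated strings are visible rather than fatal.
def translate(key, locale, default_locale = 'en')
  TRANSLATIONS.fetch(locale, {})[key] ||
    TRANSLATIONS.fetch(default_locale, {})[key] ||
    key
end
```

If the plugin resolves translations through an in-memory hash like this, the per-lookup cost is constant and the real cost is the one-time load of the resource files.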

I got the impression that it was something of an opinionated plugin - there
was a right way to use it, and misuse could get you in trouble. That seems
par for the course with Rails though.

> > I am happy enough with a lot of these issues to go with a Rails solution
> > for something non-critical or prototyping. But I can't in good faith bet
> > the project on it.
> That's probably a good bet, especially if your employer's got no Ruby
> talent on hand.

I am working on the lack of Ruby talent by, well, educating myself at least.
Rails isn't hard, but it's got a big domain learning curve (which people confuse
with the Rails learning curve if they are fretting); Ruby is easy to pick up
once you have recognised its key differences from your background. Ironically
the volume of Ruby online info slows this process down - it dilutes access
to those hardcore reference guides and total n00b material.

In fact, if you are doing any kind of web development, you should be
checking out WATIR anyway; using it doubles as a Ruby primer and will
address your lack of Ruby talent pretty quickly.

> I wouldn't wait a year; just measure the success of your
> own non-critical app.

Not enough non-critical projects; we have longer project durations. I am trying
to fit in Rails prototyping between all the weird requirements gathering I am
doing now.

Chad Perrin

unread,
Sep 5, 2006, 6:42:30 PM9/5/06
to
On Wed, Sep 06, 2006 at 06:41:45AM +0900, Peter Booth wrote:
> I think he makes an important point that choice of technology is rarely a
> proximate cause of project failure. ( EJB & Corba being the two largest risk
> I can recall.)

If he makes that point, he does so by accident. His main point is that
technology (counting the ecosystem surrounding a technology as part of
it) is indeed a cause of failure, and that you should thus stick with the
technologies "everybody knows" are "safe", to avoid failure.

. . . assuming that by "he" you mean "Joel Spolsky".


>
> I love Java but I wouldn't use it to write application (cf plumbing) code,
> if I were paying the bills.

I wouldn't use Java for much of anything unless someone were paying me
to do it and wasn't open to alternatives. That's in large part a matter
of personal preference, though -- I don't like Java very much.

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]

Chad Perrin

unread,
Sep 4, 2006, 6:54:53 AM9/4/06
to
On Mon, Sep 04, 2006 at 12:37:03PM +0900, Vidar Hokstad wrote:
>
> Joseph wrote:
> > Although I respect Joel very much, I believe he makes a fundamental
> > mistake in his reasoning.
> >
> > Basically what he is saying can be deconstructed this way:
> >
> > * Do not risk developing in new cutting edge technology. Even if
> > successful proof of concepts are already out there (37 signals et. al)
> > * Use what most people use: PHP / J2EE / .Net not what most experts
> > tell you to use. Communities and support are paramount.
> > * Corporations and the people in those organizations favor safety, if
> > your job is on the line go with the tried and true. Take no risks.
> >
> > All three assumptions rely on a single assumption: FEAR.
>
> No. They rely on sound risk management principles.

One might say that's just euphemistic phrasing. I'm not prepared to
make such an assertion at this time (I'd like to think about this a bit
more before doing so), but it does occur to me as a possibility.


>
> > * Fear the technology would eventually not deliver.
>

> Replace "Fear" with "Risk" and the above is reasonable if your company
> does not have people experienced in a particular technology. And fact
> is today it is still far harder to find people skilled at Ruby than
> many other languages. More importantly, there is too little experience
> with many Ruby technologies for a company with no Ruby experience to
> _know_ whether Ruby will be appropriate for a specific project.

This brings us to the "real" problem:

Decision makers need to know something about the technologies to be able
to make the "right" decisions. One cannot effectively expect that any
particular decision is more or less likely to be a good one unless the
decision maker actually knows the options at hand. In other words, Joel
Spolsky's advice about choosing "proven" technologies is nonsense: the
real advice should be "Choose from among technologies you know. If you
are not an expert at all the options that sound good, learn enough to be
able to make an informed decision. Failing to do so does not guarantee
that you will make the wrong decision, but it does guarantee that you
will make your decision for the wrong reasons. Period."

In other words, every time a nontechnical manager is given the
responsibility of choosing a programming language and/or framework for a
project, someone has screwed up. How can (s)he possibly evaluate the
available technologies, or even the advice (s)he receives about them
(whether from employees, friends, consultants, or Joel On Software) to
be sure it's not a load of hooey without knowing the technologies
personally?


>
> > * Fear the support will not be sufficient.
>

> Replace "Fear" with "Risk" again. The company I work for, Edgeio, uses
> PHP for our frontend code (but Ruby for our backend) because when we
> started building it I had concerns about the availability of people
> skilled with Ruby in general or Rails in particular.

Sure, Java and PHP programmers are a dime a dozen -- as long as you're
willing to settle for a dime-a-dozen quality programmer. If you want
programmers that are worth their paychecks, however, you significantly
narrow the field no matter what language you're using. Considering
the learning ability and proclivities of excellent programmers, however,
I rather suspect that you'll find as many excellent programmers who know
"exciting new languages" as "boring old languages". Considering the
direction language design has been going lately, "exciting new
languages" are generally easier to learn, too. This means that if you
choose to hire for excellence over familiarity with a given language,
you're just as likely to find yourself constrained to choose an
excellent C programmer over a poor Java programmer as you are to choose
an excellent C programmer over a poor Ruby programmer -- but if you're
working with Ruby, your excellent C programmer will probably pick up the
language faster.

I guess what I'm saying is that you're probably better off choosing
excellent programmers and the language that works best, technically
speaking, for your project. Choosing a language for which programmers
are a dime a dozen regardless of technical merit is more likely to leave
you with crappy software development, lightning-fast employee turnover,
or (more likely) both.


>
> When we started hiring those concerns were validated: It's proved
> extremely hard to find people with Ruby experience. While it's
> certainly getting easier rapidly, not everyone can afford to take the
> risk. In our case I decided to start phasing Ruby in for small self
> contained components in our backend, and gradually take it from there
> as we get enough Ruby skills through training or hiring, which has
> proven to work well and meant that in the event that we'd run into
> unforeseen problems, the effort involved in switching back to another
> language would have been limited.

Define "experience". If by "experience" you mean "has spent ten years
developing enterprise applications in the language", darn right it would
be more difficult to find people with Ruby "experience" than many other
languages. If, on the other hand, you mean "has demonstrated aptitude,
Ruby skill, and programming wizardry likely to prove to be an
unequivocal asset to your team", you're probably looking in the wrong
places (since you're unlikely to find that in college internship
programs, where all they teach anyone is Java and .NET).


>
> > * Fear regarding your job safety as a corporate developer or manager
> > who chooses Ruby or Ruby on Rails for some mission critical project.
>

> Which is very valid if you make a choice detrimental to the company,
> regardless which language it involves. As much as I love working with
> Ruby, if someone working for me picked it for something inappropriate,
> and the project tanked badly, they certainly would have to take the
> responsibility for the language choice. If you don't have Ruby skills,
> or your team doesn't have sufficient Ruby skills, or there aren't
> enough skilled Ruby developers available in your location, picking it
> for a high risk project will certainly not speak to your favor with any
> risk

The problem is where people fear for job security based on choosing a
non-conservative technology, rather than for choosing an inappropriate
technology. Many people would never (under current conditions) choose
Ruby over Java, even if guaranteed that the project would be completed
with 110% requirements satisfaction within two months for Ruby or with a
10% chance of project failure, a 90% requirements satisfaction rate if
"successful", and an eighteen month development time for Java -- all
based on fear for job security. It's the "nobody ever got fired for
choosing IBM" syndrome. Even if choosing the conservative technology is
the Wrong Answer for the task at hand, it will be considered the Right
Answer for job security by a great many people.

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]

"Real ugliness is not harsh-looking syntax, but having to
build programs out of the wrong concepts." - Paul Graham

Chad Perrin

unread,
Sep 4, 2006, 7:03:13 AM9/4/06
to
On Mon, Sep 04, 2006 at 01:01:53PM +0900, Alvin Ryder wrote:
> David Vallner wrote:
> >
> > Speaking purely theorethically, Ruby can not be made as performant as
> > Java or C# could be made if they had ideally performing implementations.
> > Latent typing makes it almost impossible to do certain optimizations as
> > static typing does. That's pure fact. Of course, it's not saying Ruby
> > can't be fast enough - but there have been people with more experience
> > at the performance side of software development that talked much better
> > about that

>
> I'm not sure how fast or slow Ruby is but if it's as fast as Perl I'll
> be happy enough. Yes I know C is faster but I need fast development
> times too.

Based on what I've heard/seen/experienced, Ruby is generally somewhat
slower than Python which is, in turn, somewhat slower than Perl.

Generally. On average. For some definition of "average". One of the
nice things about Perl's relative maturity is the significant work that
has gone into performance concerns for the parser/runtime. I have no
doubt that Ruby will eventually match Perl for general-case performance,
but I also have no doubt that on the whole it has not matched it yet.

On the other hand, the difference is not so great that execution
performance is a factor I really take into account when choosing between
the two languages for a given project.
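
Claims like these are also cheap to check against your own workload; Ruby's standard Benchmark module is enough for a first-order comparison (the workload below is a stand-in - substitute your own hot path):

```ruby
require 'benchmark'

# A stand-in workload: hash writes and a reduction, the sort of thing a
# templating or ORM layer does constantly. Replace with your own hot path.
def workload(n)
  h = {}
  n.times { |i| h["key#{i % 100}"] = i }
  h.values.inject(0) { |sum, v| sum + v }
end

# Prints user/system/total/real times for this interpreter.
puts Benchmark.measure { workload(50_000) }
```

Run the equivalent loop under perl or python and compare wall-clock times; for anything subtler than "somewhat slower", profile rather than benchmark.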

M. Edward (Ed) Borasky

unread,
Sep 4, 2006, 12:00:07 AM9/4/06
to
Chad Perrin wrote:
> On Mon, Sep 04, 2006 at 11:26:18AM +0900, M. Edward (Ed) Borasky wrote:
>> David Vallner wrote:
>>>> How about C#, well it runs in Windows and without serious and expensive
>>>> firewalls you just can't go anywhere near the Internet.
>>> You need to tighten off Unix-based servers too. Heck, there are even
>>> serious and expensive firewalls for Linux around too, because not
>>> everyone has an in-house iptables guru.
>> But everybody *should* have a *certified* Cisco engineer if they use
>> Cisco routers, for example. It's one of the costs of doing business.

M. Edward (Ed) Borasky

unread,
Sep 4, 2006, 12:28:24 AM9/4/06
to
Alvin Ryder wrote:
> Are you saying all languages yield the same level of productivity? If
> they aren't equally productive then how much more productive is Java
> over C++ or VB over assembler? Do you need "credible statistics and
> research" to answer the question?

*He* may not be saying all languages yield the same level of
productivity. But I'll say something similar: the productivity of
programmers depends more on their knowledge of the application area and
their *familiarity* with the development environment than it does on the
environment and language.

There are tools that can drag down an otherwise productive team, but
they tend to get discarded fairly quickly.

Devin Mullins

unread,
Sep 5, 2006, 11:08:31 PM9/5/06
to
Long post! Ack!

> The fact that one solution may take 50-100% longer to implement
> isn't necessarily a big risk. But when a library or technology that
> is critical to your infrastructure exposes its project-breaking flaw
> you are in serious sh1t, as unless there is a feature equivalent
> non-broken alternative, or you can write your own conveniently, your
> project schedule is out the window. Your initial schedule is nothing
> compared to the time overruns these kinds of showstoppers
> introduce.

So... the 50-100% extra time is okay, as long as it's known up front?

Why not just pad the extra 50-100% for the Ruby estimate, and just spend
the last few weeks partying when you finish early? :P

>> 2. What such case studies have you read about the other options you're

My question had an agenda. I meant: leading up to the moment you picked
[Java, I presume], what case studies had you read about its use? Just
trying to scope out any double-edged swords. Sorry; I was cranky
yesterday.

> Mostly what I am concerned about are War stories
> - case studies of Rails adoption, even from companies that were not strong
> in web development

Well, at RailsConf, I talked to a guy who'd never programmed in his life
before Rails, and he said that within 2 months of picking it up, he'd
deployed an app to a customer. *shudder*

> I have seen reports on this list that Rails churns its plugins, that
> correct plugin operation is not guaranteed as it matures.

Hrm. The only thing I really recall breaking a whole bunch is Engines.
That said, through experience developing some of these apps, I've become
much more conservative about which plugins I use. I wrote my own tagging
code; were I to do it again, I'd write my own user/password code; etc.
Not so much because of Rails upgrades causing breakage, but because the
plugin implementations turned out to be flawed/buggy (read: poorly tested).
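
For what it's worth, the user/password case shows why rolling your own can be reasonable here - the core is tiny. A minimal sketch using only the standard library (illustrative; names are mine, and SHA-1 was merely the idiom of the day - a modern app should use bcrypt or similar):

```ruby
require 'digest/sha1'

# Minimal salted password hashing, in the spirit of replacing an
# untrusted plugin with a handful of lines you can test yourself.
def hash_password(password, salt)
  Digest::SHA1.hexdigest("#{salt}--#{password}")
end

def authenticate?(stored_hash, salt, attempt)
  hash_password(attempt, salt) == stored_hash
end

salt   = 'per-user-random-salt'   # in practice, random per user
stored = hash_password('s3cret', salt)
```

Twenty lines you have tested yourself can be less risky than a popular but poorly tested plugin.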

> to. People with an eye for risk take that pretty seriously - they
> expect security issues, but not rock-and-hard-place conflicts like
> that.

That's true, and that's one of the ways in which Rails needs somewhat
guru-level coders -- ones who test their app thoroughly, and are able to patch
the broken spots when they come up.

>> Ruby/Rails libraries/plugins/tools are all over the map
> i.e. all over the place - that makes it risky

Same could be said about any language, no?

> I don't see enough libraries
> that I am likely to depend on, at a mature enough version to warrant doing
> anything mission critical with them. There's a lot of libraries that require
> binary installations that are unavailable on all platforms.

Ah, yeah, can't help you there -- I haven't needed much in the way of
libraries -- XML parser, HTML parser, etc. I might be able to help you
with the second part in a few days -- I'm finally getting around to
profiling, and hence need to compile Shugo's or ZenProfiler for win32.

> write your own equivalents (which will wipe
> out your rails productivity boost).

Well, not according to the Relevance folks, but I admit, they seem
pretty adept.

>> Never had to do SNMP processing, sorry.
> Well lots of people do, and the question still stands as to what happens
> to the boost if you have to do any significant Ruby code processing in
> Rails (non-SNMP).

I'm confused... are you asking if the Ruby language is as quick to
program in as the Rails framework, or if the framework is fast despite
the language? Or are you talking about writing a Ruby extension? In any
case, I don't know if you are going to get metrics much more specific
than the Relevance posts. People seem pretty guarded about their own
professional productivity. Probably a little productivity arbitrage going on.

> I am not talking of the classic situation of where you own the server or
> can directly install on in customer equipment. I am talking of scenarios
> where you have to produce a windows/Mac OS/Linux installer/packager,
> that will extract out a working Rails Apps with dependencies and
> minimal interaction from the user.

Interesting. Well, Rubyscript2Exe is supposed to do just that. Never
used it, but I've seen an example package being run. A little slow to start up.

> I don't have a problem with working on Rails myself, I always figured
> that if I could get a functional prototype of what our main app is doing,
> working in Rails, it would be a good start. But it's all these edge
> conditions that would kill Rails adoption - and I don't have the time to
> both address/investigate the concerns

Ah. Well, I might, were I working for you.

But I'm not. :D

Devin
You made it! You win a prize! <insert MegaRace reference>

Chad Perrin

unread,
Sep 4, 2006, 3:48:14 PM9/4/06
to
On Tue, Sep 05, 2006 at 04:25:36AM +0900, M. Edward (Ed) Borasky wrote:
> Chad Perrin wrote:

[ snip a bunch of bad diagramming ]

>
> You can usually do something like this in Forth. As you're developing,
> you save off the whole enchilada (the Forth interpreters and compiler
> and assembler, along with your application code, all of which reside in
> the dictionary) as an executable. When you're ready to release the
> application, you take a special pass and strip out everything your
> application doesn't use, getting a smaller executable that only contains
> the pieces of the Forth environment needed to run the application.

That's at least darned close. I'd have to learn more about what exactly
it does to know how close.


>
> I haven't spent any appreciable time inside either Common Lisp or
> Scheme, or for that matter Ruby, so I don't know how this would work in
> any language except Forth. Maybe what you want is as "simple" as
> implementing Ruby on top of Forth. :)

Actually, now that I think about it, I wish something like that were
what they'd do for Perl 6, instead of wedding it to a VM, if they want
some kind of persistent compilation that doesn't preclude runtime
dynamism.


>
> > There would likely be more binary size necessary, but considering that
> > even an interpreter is (generally) a compiled binary that just operates
> > on input, I don't see any reason to assume we cannot compile
> > dynamic language code into a persistent binary with accommodations made
> > for the parts of the program that require runtime dynamic behavior.
> No reason it can't be done. The question is only "should it be done?" :)

I certainly think so, if only to provide an alternative to the "worst of
both worlds" bytecode-VM approach.


>
> > This strikes me as a superior approach to a JIT compiler/interpreter
> > approach like Perl's, a pure interpreter approach like Ruby's, or a
> > bytecode compilation plus runtime interpreter VM like Java's, for
> > performance.
>
> Java also has JIT, of course. Curiously enough, someone once told me
> that if I looked at the JVM carefully, I'd see Forth. :)

It's a quite different approach to JIT compilation than Perl's, of
course.


>
> > Add to that the potential increased performance for some
> > parts of a program written in a more dynamic language something like the
> > following might actually run faster than the equivalent compiled program
> > I diagrammed above:
> >
> > |+++++++--------|
> >
> > . . . depending on how well those dynamic bits (represented by the
> > hyphens) optimize at runtime for a particular run of the program.
>
> Well ... maybe we should leave that to the chip? :)

That's sorta the idea.

Richard Conroy

unread,
Sep 4, 2006, 3:49:19 PM9/4/06
to
On 9/4/06, Joseph <jlhu...@gmail.com> wrote:
> Vidar,
>
> Risk Management IS NOT equivalent to FEAR, in that you are right.
>
> However, as I said earlier, no SIGNIFICANT progress can be expected
> without some risk. Risk Management is about dealing with risk, not
> eliminating it.

I would have thought that eliminating risk would be a job well done
by someone responsible for Risk Management? No?

I am seeing an awful lot of chatter here along the lines that technology
decision makers are insipid jobsworths who fall in line behind the big
tech brands because they are afraid to stick their necks out - i.e. the only
reason they are not picking Rails is because they don't have the stones for it.

Has anyone ever considered the fact that many of these decision makers
are very serious, ethically minded people? They take their job seriously
and feel a strong responsibility to make a correct technology decision.

I am looking really strongly at Rails at the moment for an upcoming
solution. But we've got some funky requirements that may result in our use
of Rails being reserved purely for rapid prototyping and development/test
tools. While I love how quickly you can get a best-practice solution together,
and how elegant the solutions are, I am concerned that the time you
save early on you lose down the road dealing with edge problems.

The concerns are not that questions exist, but that the questions are
not being really well answered. Some concerns that I have about RoR:
- lack of good success & failure case studies with lessons learned
- library (Ruby) and plugin (Rails) immaturity
- library portability
- what happens to productivity when you go outside the rails problem domain
- how narrow is that problem domain (how easy is it to overstep)
- what happens (to productivity/performance) when your rails apps
need to do weird stuff like bolt-on SNMP processing ruby-code
- how forgiving is the technology, if you make mistakes/bad assumptions,
how easy is it to recover
- deployment of Rails apps/bundling rails apps
- immaturity of tools
- international support

I am happy enough with a lot of these issues to go with a Rails solution
for something non-critical or prototyping. But I can't in good faith bet the
project on it. I would be happy enough to wait a year, though, and see what
happens to my concerns, as it is moving really rapidly - in the meantime
levelling up my Rails skills.

Don't assume decision makers are stupid or spineless. Their responsibility
is to their employer; it is not their responsibility to promote a technology.
They read mailing lists and bloggers and case studies and do Google searches.
They see the extended debates on multinational Rails, performance/scalability,
plugin life expectancy and weak/unknown applicability outside of classic
web apps. Sure, the rapid development with implicit best practices
is great - that's why they are looking at it in the first place, that's the carrot.

I would like it to be ready for prime time now, because next year I probably
won't be in a position to put in place any Rails solution. And it sure is a
lot of fun to work with - I can code for fun at home, but if I get my
employer to adopt it I can get paid for RoR-ing too.

Chad Perrin

unread,
Sep 4, 2006, 4:22:55 PM9/4/06
to
On Tue, Sep 05, 2006 at 04:49:19AM +0900, Richard Conroy wrote:
> On 9/4/06, Joseph <jlhu...@gmail.com> wrote:
> >Vidar,
> >
> >Risk Management IS NOT equivalent to FEAR, in that you are right.
> >
> >However, as I said earlier, no SIGNIFICANT progress can be expected
> >without some risk. Risk Management is about dealing with risk, not
> >eliminating it.
>
> I would have thought that eliminating risk would be a job well done
> by someone responsible for Risk Management? No?

If you eliminate risk entirely, you end up guaranteeing failure -- for
some definition of risk. Any definition of risk that does not result in
that end is either meaningless or effectively impossible to eliminate.


>
> I am seeing an awful lot of chatter here along the lines that technology
> decision makers are insipid jobsworths who fall in line behind the big
> tech brands because they are afraid to stick their neck out. ie. the only
> reason
> they are not picking Rails is because they don't have the stones for it.

My take is that someone who chooses a technology based on popularity
rather than knowledge of the technology is an insipid jobsworth who
falls in line behind the big tech brands because (s)he is afraid to
stick his/her neck out. Someone who chooses a technology based on
knowledge of the technology, on the other hand, is smart and
should be making a lot of money, whether the ultimate decision is to go
with J2EE, Rails, Common Lisp, VB.NET, or Wasabi. Of course, I think
VB.NET is unlikely to be a good choice outside of extremely pathological
edge cases, but that's beside the point.


>
> Has anyone ever considered the fact that many of these decision makers
> are very serious, ethically minded people? They take their job seriously
> and feel a strong responsibility to make a correct technology decision.

. . . but if they end up making a decision based on the criteria Joel
Spolsky advocated in the essay that started all this discussion, they're
either malicious or incompetent.

--
CCD CopyWrite Chad Perrin [ http://ccd.apotheon.org ]

M. Edward (Ed) Borasky

unread,
Sep 4, 2006, 1:40:23 PM9/4/06
to
Stephen Kellett wrote:
> In message <44FB5AD5...@comcast.net>, Devin Mullins
> <twi...@comcast.net> writes
>> does clean mean? reduced duplication?). Pretty means fewer LOC, which
>> is about the only objective measure of maintainability we know.
>
> I take it you've never had the pleasure of reading someone else's APL
> code? Its about as dense as you can get in LOC.
>
> Sure is not easy to maintain. Often described as a "write only language".
>
> I think the word "pretty" is not the correct word, "elegant" would be
> better.
>
> Stephen

I've never written a line of APL code, but that hasn't ever stopped me
from being able to read APL code if the need or desire to do so arose.
There was a time when it was a dominant language in econometrics and
computational finance; indeed, the "A Plus" open source descendant of
APL originated at Morgan Stanley.

APL and its original implementation APL\360 were/are works of pure
genius. I was privileged to meet one of their creators (Falkoff) when I
was a very young programmer working at IBM. APL is one of the few truly
unique programming languages and possesses an elegant simplicity found,
in my opinion anyway, in only two other programming languages --
Lisp/Scheme and Forth.


Joseph

unread,
Sep 9, 2006, 1:00:03 PM9/9/06
to
Austin,

I'd love to join TRUG. I sent you an email privately - do answer back! ;)

Best Regards,

Jose L. Hurtado
Web Developer / IT Security
Toronto, Canada

Austin Ziegler wrote:
> On 9/3/06, Joseph <jlhu...@gmail.com> wrote:
> > Best Regards,
> >
> > Jose L. Hurtado
> > Web Developer
> > Toronto, Canada
>
> So ...
>
> we've never seen you at a TRUG meeting (we just had one yesterday).
>
> Come on out and join us!
>
> -austin
> --
> Austin Ziegler * halos...@gmail.com * http://www.halostatue.ca/
> * aus...@halostatue.ca * http://www.halostatue.ca/feed/
> * aus...@zieglers.ca

Joseph

unread,
Sep 9, 2006, 1:39:44 PM9/9/06
to
Folks,

Well this is a LONG discussion!

Some points I wanted to address that have been raised by different
posters; I have included a short version of their posts below:

VIDAR HOKSTAD SAID
Sometimes the payoff in trying a technology that your team is
inexperienced with or that isn't widely deployed ...or the risks are
mitigated by your team's experience (Vidar has an application that is
already written in another language)

PETER BOOTH SAID
One point he doesn't make is the difference between OPM and MHC (Other
People's Money versus My Hard-earned Cash). When you are spending OPM,
costs can become unreal. So as an employee a "buzzword-compliant/standard
approach" Java or .NET solution might make more sense...


RESPONSE
Vidar and Peter, I agree partially with your main points; however, I
would argue your points apply to existing applications - software
already written in a given language, where migration risks may truly
outweigh the benefits of embracing a new technology - and then perhaps
staying where you are is a safer, wiser choice.

I say perhaps because:
- Your code base would have to be significant, and your current
language at least good enough to achieve your business or startup
objectives in a reasonable amount of time.
- The productivity savings of the alternative (Ruby and RoR) would have
to be low or unimportant enough to ignore the other language.
- Other people's money truly dislikes risk, any risk, and of course if
you have convinced them that
[insert-your-safe-language-framework-combination-here] is the best, and
the coolest, safest tool to use... how would you go back to them,
asking for money and telling them you were wrong? Tough to do, yes,
but on occasion I would argue WISE. Because if these productivity
savings were coming to benefit YOUR HARD-EARNED CASH (MHC), then I
would bet Ruby would be the only obvious answer!

Also, a little off topic, you could reach a middle-of-the-road
compromise, extending your app in Ruby and leaving the "legacy" code
alone - a common practice in the mainframe world, where COBOL apps have
been largely left in the cold or with minimal maintenance while hooks
have been made into Java-based applications that extend them! Why not
do the same with .NET/Java apps with links to Ruby on Rails?

Finally, to end my reply, I would say that if you are starting a NEW
project, mission-critical or not, and there is no "showstopper" for
your application, by all means Ruby and Ruby on Rails are indeed the
best possible tools you could choose. THAT IS PRECISELY my point. The
productivity savings and the elegance and clarity of the framework are
just too good to ignore.

Joseph

unread,
Sep 9, 2006, 1:48:31 PM9/9/06
to
Phlip,

When I mentioned the error in reasoning Joel made by basing almost
all of his infamous post about Ruby on, basically, FEAR, some people did
not fully get my point.

Then you added the UNCERTAINTY and DOUBT that were there too, creating
the trifecta of a technology attack based on truly nothing significant
at all.

Well, guess who agreed with us, almost point by point? David
Heinemeier Hansson, creator and lead developer of Ruby on Rails. Check
out his post here:

http://www.loudthinking.com/arc/000596.html

Have a nice weekend everyone,

Jose L. Hurtado
Web Developer
Toronto, Canada


Phlip wrote:
> Joseph wrote:
>
> > Although I respect Joel very much, I believe he makes a fundamental
> > mistake in his reasoning.
>

> Joel is such a good writer that sometimes his jaw-dropping errors are
> impossible to refute. (And don't encourage him; he loves it when you fight
> back!)
>

> > Basically what he is saying can be deconstructed this way:
> >
> > * Do not risk developing in new cutting edge technology. Even if
> > successful proof of concepts are already out there (37 signals et. al)
> > * Use what most people use: PHP / J2EE / .Net not what most experts
> > tell you to use. Communities and support are paramount.
>

> The open source tools that succeed must have higher technical quality than
> the Daddy Warbucks tools. The latter can afford to buy their communities and
> "support" networks. Because an open source initiative cannot buy its
> community and marketing, only the strong survive, and their early adopters
> will form this community spontaneously. They will provide the true
> word-of-mouth advertising that marketing tends to simulate.
>
> And I am sick and tired of seeing shops dragged down by some idiotic
> language choice made between the marketeers and a computer-illiterate
> executive.
>

> > * Corporations and the people in those organizations favor safety, if
> > your job is on the line go with the tried and true. Take no risks.
>

> Ah, so looking like you are following best practices is more important than
> doing everything you can to ensure success. Gotcha!
>
> Yes, I have seen that up close, too!
>

> > All three arguments rely on a single assumption: FEAR.
> >

> > * Fear the technology would eventually not deliver.

> > * Fear the support will not be sufficient.

> > * Fear regarding your job safety as a corporate developer or manager
> > who chooses Ruby or Ruby on Rails for some mission critical project.
>

Phlip

unread,
Sep 9, 2006, 3:16:27 PM9/9/06
to
Joseph wrote:

> Well, guess who agreed with us, almost point by point? David
> Heinemeier

Well, he had an edge over me. He probably actually read Joel's article.

;-)

--
Phlip
http://www.greencheese.us/ZeekLand <-- NOT a blog!!!


Phlip

unread,
Sep 9, 2006, 3:19:14 PM9/9/06
to
Joel Spolsky wrote:

> [Ruby is slow] so if you become The Next MySpace,
> you'll be buying 5 times as many boxes as the .NET guy
> down the hall.

Last time I checked, 5 new boxes were cheaper than 5 new programmers...
