
market research--what would be "cheap enough"?


Eric Moss

Sep 8, 2001, 9:31:32 PM
Hi all,

Instead of asking what "should" be free, I would like to ask, "how much
is the source worth to you?", or alternately, "How cheap must a product
be before we don't care that it's not open-sourced?" I know some people
need the source, and others want the source for whatever reason. For the
rest of us, though, a little market research regarding each vendor....


First I will stipulate that:

[0] A company can charge anything they want, so "should" won't be used
with any moral implication--this is only a test of the free-market
waters.
[1] Lisp is so productive that it's worth a lot. Yes, but worth how
much, and to whom? That is the question I am asking here.
[2] I will assume that reported problems get fixed quickly and
correctly, which generally seems to be the case with the lisp vendors.
[3] I will ignore CORBA and other Enterprise-level packages that say "we
are serious about delivering big value to big customers."
[4] I will assume that, minus dynamic compilation, unlimited runtime
delivery is granted.


OK, batter up...

[1] How much would you pay for LWPro before you had to justify it (e.g.
as a business expense)? I would pay $199 without hesitation, $299 with a
pause. The current $700-$800 is totally fair, but I can't justify that
unless I know it will return more than edification.


[2] Now for Franz. Their pricing is totally fair (I'm tired of the
disclaimer already), but only a manic hobbyist or a company
determined to use lisp can justify it. Franz doesn't publicly display
their prices, negotiates them for "bulk" purchases, and has some hefty
runtime fees. In other words, few people can be casual about an Allegro
purchase.

Their products are my favorite (based on a job where I used them),
but >$5000 for environment plus CLIM plus extended libraries practically
guarantees that all I ever do is download their trial versions. For the
nth time, it's morally fine that I don't get to own every luxury item in
the world, but I would *like* to see a change. So...

What would *you* pay for Allegro--

-- with compiler but no extended libraries
-- with extended libs but no CLIM
-- extended libs and CLIM


I could see $200, $500 and $1000 (or thereabouts) since I like their
product. I might add $300-$400 for a one-time no-compilation-included
runtime fee (just to even the comparison between it and LW/Digitool).

[3] Digitool (Mac-only). I'm willing to pay the $600-$700 for a 3-issue
subscription to the environment+CLIM just because I know how much harder
it is to survive in the Mac market, and am willing to toss some extra
coins in their pail for the two levels of faith. ;) It's still not a
casual buy, however. What would others pay before they had to think hard
about it?

I won't pick on Corman or muLisp since they are already dirt cheap. :)

Thanks for the input.

Eric

natas

Sep 9, 2001, 5:22:16 PM

>
> OK, batter up...
>

Ok, since this is the internet, where everybody's opinion is equally
irrelevant, I will climb out of my hidey-hole and respond to this question.

I won't take it point by point, because in my mind it all boils down to
this: Common Lisp vendors could all learn a good lesson in pricing from
Functional Objects. A guy (or girl) could walk out of their store with a
bitchen programming environment (_including_ (I might add) the whiz-bang
CORBA stuff for "enterprise" development) for under $1K. The only thing
lacking is a GUI designer. So, to answer your question, I would pay ~$400
for an IDE+compiler+extended libs that let me do everything except CORBA.
I would still be drawing GUIs by hand with that tool. I would pay
~$600-$999 for the above with a GUI designer a la Delphi. That's a wide
range, but the price would depend on how much chrome was on the GUI layout
tool. I'd say something like Delphi is at or near the top of the stack in
that arena, and it costs $999 for the Pro version. I would pay at most
$400 extra for the CORBA add-ons. I would not expect to receive any source
code at these prices, but I would expect unlimited runtime distribution.
(FWIW: Again, looking at Fun-O's, I think their environment does include
some source code. And, of course, all versions of Delphi come with the full
VCL source code.)

Good luck with your research, I hope someone from Franz is watching this
thread.

Friedrich Dominicus

Sep 10, 2001, 2:07:46 AM
"natas" <cubic...@hell.com> writes:

> >
> > OK, batter up...
> >
>
Found it. LispWorks Professional fits your range.

Regards
Friedrich

Mark Watson

Sep 10, 2001, 10:37:27 AM
Hello Eric,

I occasionally get asked for recommendations for
a CL dev kit. I usually recommend using the free
LispWorks personal edition until one has a
deployable application or a compelling demo,
then buy the Pro version.

I think that Franz and Xanalys are excellent companies, and
do the right thing by providing either free personal
editions or trial copies.

To answer your question: I have no problems with
the pricing of either Franz or Xanalys.

For Linux users, CMU CL is very good, but lacks
the goodies of the commercial products.

I used to use a Xerox 1108 Lisp Machine, and
suffered terribly because of no cheap deployment
options.

Now, Lisp'ers have it good!

-Mark

--Mark Watson -- Java consulting, Open Source and Content:
www.markwatson.com

Nels Tomlinson

Sep 10, 2001, 11:18:09 AM

Eric Moss wrote:
> Hi all,
>
> Instead of asking what "should" be free, I would like to ask, "how much
> is the source worth to you?", or alternately, "How cheap must a product
> be before we don't care that it's not open-sourced?"

For me, a closed source product would have to have a large, negative
price ... say, a year's salary for each year I was to use it.

The problem is that any work you do with the proprietary product becomes
a hostage to the proprietor. Even if you purchase from a responsible,
reputable, charitable vendor, your work can still be ripped out from
under you, despite the proprietor's best intentions. I know a fellow
who runs his entire business on some custom applications he wrote for
his Osborne computer. I don't remember in what package he wrote them,
but it seems it was never made for DOS. When his last Osborne dies, all
his work is useless. He had three Osbornes left last time I saw him.

This kind of trouble doesn't exist with libre software. Odds are that
someone will port Linux and the BSDs to every new platform to come
along, for a long time to come. Given that, porting my favorite
application becomes thinkable.

There is another sort of problem with closed-source code: it is
ultimately a black box. Maple is a great example of this: most of Maple
is written in Maple, but the heart of it is written in ... something.
It's closed, and we don't know in what language it's written, let alone
what it is. This is not optimal for scientific research. Black box
methods just don't fit well with the scientific method. Of course, we
know what Waterloo CLAIMS the black boxes do, but no one can check their
claim.

A third sort of problem with closed-source software is the licensing
mess. There are special fees for distribution of programs built with
Franz Lisp (you said, I believe, down below), and in general, the price
for a package depends on the use you will make of it. It's cheap if
you're a student, but when you graduate, you have to pay up or violate
your license. Can students use the software for a paying project, or
just for classwork? Better ask your lawyer. To top it all off, we see
the sort of mess that the city of Virginia Beach got into; they lost
some of their receipts for some of their software, and so had to pay for
it again, and had to pay some large fines.

Overall, closed source software has a LOT of problems, and the fact that
the libre stuff is often better is the least reason to prefer it, in the
long run.

Nels

Tim Bradshaw

Sep 10, 2001, 11:42:49 AM
* Nels Tomlinson wrote:

> The problem is that any work you do with the proprietary product
> becomes a hostage to the proprietor. Even if you purchase from a
> responsible, reputable, charitable vendor, your work can still be
> ripped out from under you, despite the proprietor's best intentions.
> I know a fellow who runs his entire business on some custom
> applications he wrote for his osborn computer. I don't remember in
> what package he wrote them, but it seems it was never made for dos.
> When his last osborn dies, all his work is useless. He had three
> osborns left last time I saw him.


This is what standards are for.

Raymond Toy

Sep 10, 2001, 12:58:55 PM
>>>>> "Nels" == Nels Tomlinson <toml...@purdue.edu> writes:

Nels> The problem is that any work you do with the proprietary product
Nels> becomes a hostage to the proprietor. Even if you purchase from a

Don't make the proprietor your enemy. Make him your partner. He wins
if you win, and vice versa.

Nels> This kind of trouble doesn't exist with libre software.
Nels> Odds are that someone will port Linux and the BSDs to every
Nels> new platform to come along, for a long time to come. Given
Nels> that, porting my favorite application becomes thinkable.

Of course that trouble exists. Libre software dies too. Anyone
remember brik? flip? zoo? Where are they now? Even if you have the
source, you really ought to think about whether you should be responsible
for maintaining it. Would your time and money be better spent doing
something else instead of maintaining some libre software in a haphazard
way?

Nels> There is another sort of problem with closed-source code: it is
Nels> ultimately a black box. Maple is a great example of this: most of
Nels> Maple is written in Maple, but the heart of it is written in
Nels> ... something. It's closed, and we don't know in what language it's
Nels> written, let alone what it is. This is not optimal for scientific
Nels> research. Black box methods just don't fit well with the scientific
Nels> method. Of course, we know what Waterloo CLAIMS the black boxes do,
Nels> but no one can check their claim.

This doesn't matter to me because I wouldn't understand how the
internals are done anyway. That's probably true of most software. I
don't care how gcc or emacs or Solaris works; I just want them to
work. This is probably true for most people for some software.

Nels> project, or just for classwork? Better ask your lawyer. To top it
Nels> all off, we see the sort of mess that the city of Virginia beach got
Nels> into; they lost some of their receipts for some of their software,
Nels> and so had to pay for it again, and had to pay some large fines.

So what if Virginia Beach lost the source and binaries to their own
hacked version of some free software that is 5 versions behind the
current version? They have to spend a large amount of time and money
either to reconstruct the old code somehow or upgrade to the newer
releases. How is this really any different? Taxpayers have to cough
up a bunch of money.

Nels> Overall, closed source software has a LOT of problems, and the fact
Nels> that the libre stuff is often better is the least reason to prefer it,
Nels> in the long run.

Software, free and otherwise, should be carefully evaluated to
understand the true costs of it. Then make your decisions
accordingly.

Just because it's free doesn't make it the right choice. It can be,
but it isn't automatically.

Ray

Erik Naggum

Sep 10, 2001, 2:28:22 PM
* Nels Tomlinson <toml...@purdue.edu>

> This kind of trouble doesn't exist with libre software.

That is not true at all. Software rot is real: When the environment
changes, you still need to keep it running. The fellow in your Osborne
example would be just as much out of luck with source as he is without.

> Odds are that someone will port Linux and the BSDs to every new platform
> to come along, for a long time to come. Given that, porting my favorite
> application becomes thinkable.

I doubt that. Have you seen how much changes from release to release in
Linux? Anyone who wrote a real application would have to work hard to
stay on top of these changes. Duane Rettig of Franz Inc has written
several good articles here about Allegro CL on Linux and these problems.

> Black box methods just don't fit well with the scientific method.

I do not think that statement fits well with the scientific method.

> To top it all off, we see the sort of mess that the city of Virginia
> beach got into; they lost some of their receipts for some of their
> software, and so had to pay for it again, and had to pay some large fines.

From a single data point you can extrapolate in any direction. People
have lost the only sources to their software, too. It is important to
remember that accidents are just that.

> Overall, closed source software has a LOT of problems, and the fact that
> the libre stuff is often better is the least reason to prefer it, in the
> long run.

If the free Lisp stuff was better, it would probably kill the commercial
offerings, but the free offerings for Lisp are very, very poor compared
to the commercial software. This is why Franz Inc. can survive at all.

///

Erik Winkels

Sep 10, 2001, 4:09:01 PM
Nels Tomlinson <toml...@purdue.edu> writes:
> The problem is that any work you do with the proprietary product
> becomes a hostage to the proprietor.

The problem is in the way you approach your work and choose your
vendors then.


> Even if you purchase from a responsible, reputable, charitable
> vendor, your work can still be ripped out from under you, despite
> the proprietor's best intentions.

Pray tell how "your work can still be ripped out from under you"?


> There is another sort of problem with closed-source code: it is
> ultimately a black box.

That depends on the vendor and your relationship with him.


> A third sort of problem with closed-source software is the licensing
> mess.

As opposed to the really clear way in which things are handled on the
open-source front. I'd rather say that with a good vendor, licensing
is a done deal, while it remains to be seen how some of the more
baroque open-source licenses hold up in court.


> Overall, closed source software has a LOT of problems, and the fact
> that the libre stuff is often better is the least reason to prefer
> it, in the long run.

I really like the way you support your position with generalisations
and vagueness. "The libre stuff is often better" how?


Erik
--
"If I had wanted your website to make noise I would have licked
my finger and rubbed it across the monitor."
-- "istartedi" on slashdot.org

Kent M Pitman

Sep 10, 2001, 5:21:11 PM
Nels Tomlinson <toml...@purdue.edu> writes:

> Eric Moss wrote:
> > Hi all,
> >
> > Instead of asking what "should" be free, I would like to ask, "how much
> > is the source worth to you?", or alternately, "How cheap must a product
> > be before we don't care that it's not open-sourced?"
>
> For me, a closed source product would have to have a large, negative
> price ... say, a year's salary for each year I was to use it.
>
> The problem is that any work you do with the proprietary product becomes
> a hostage to the proprietor.

This is really a complete exaggeration if you are using an ANSI CL.
If you isolate your system dependencies to a small amount of code,
you can rewrite the code in a small amount of time if you have to
port to another vendor, and I bet that porting work is comparable in
time, IF NOT LESS, if you are moving to another, debugged commercial
vendor rather than to a no-guarantees-offered, caveat-emptor, open-source
implementation.
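
(For illustration only, a minimal sketch rather than anything from a
vendor's documentation: the package name MY-PORT and the two wrappers
are made up, and the implementation-specific symbols are quoted from
memory, so they may not match your versions. The point is just how
small the vendor-specific part of a program can be kept.)

;;; port.lisp -- hypothetical vendor-isolation layer (sketch).
(defpackage #:my-port
  (:use #:common-lisp)
  (:export #:getenv #:quit-lisp))
(in-package #:my-port)

(defun getenv (name)
  "Return the value of environment variable NAME, or NIL."
  #+allegro   (sys:getenv name)
  #+lispworks (lispworks:environment-variable name)
  #+cmu       (cdr (assoc (intern (string-upcase name) :keyword)
                          ext:*environment-list*))
  #+clisp     (ext:getenv name)
  #-(or allegro lispworks cmu clisp)
  (error "GETENV is not implemented for this Lisp."))

(defun quit-lisp (&optional (code 0))
  "Exit the running Lisp image with status CODE."
  (declare (ignorable code))
  #+allegro   (excl:exit code)
  #+lispworks (lispworks:quit :status code)
  #+cmu       (ext:quit)              ; exit status not passed here
  #+clisp     (ext:exit code)
  #-(or allegro lispworks cmu clisp)
  (error "QUIT-LISP is not implemented for this Lisp."))

Everything else in the program goes through MY-PORT:GETENV and
MY-PORT:QUIT-LISP, so moving to another implementation means editing
one small file, not the whole system.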

That's not to say that I don't think there can be good open source
implementations, but only to say that open-sourceness is no proof that
fixes are easy or even possible for the holder. I got Red Hat 6.0 a while
back and it didn't support a new ethernet card from a major manufacturer.
Sure, I had the sources and you can say that I could just "write my own
device driver", but personally, I'd rather not.

> There is another sort of problem with closed-source code: it is
> ultimately a black box.

I find this to be a great benefit. It limits my responsibility to know.
The whole notion of the "black box" was invented as a simplifying assumption.
If you really believe that the opening of all black boxes would make the
world more simple, I think you're neglecting the TIME it would take to
understand what is in them. The TIME it takes to learn yet another tool,
EVEN IF you are mentally, emotionally, intellectually, and financially
capable of taking on the task can be as impenetrable a barrier as lack of
source.

I find the "open source" argument the same kind of trivial nonsense as
a proof of "turing power". Oh, yes, of course you know that if you have
sources you can fix things. Right, just like you know if Fortran is Turing
Equivalent to CL then you'll be able to program anything in Fortran that
you can in CL... given enough time. I dunno about you, but time is BY FAR
my most precious commodity. And if you bound your timeline instead of
assuming an infinite one (the analog of bounding tape length instead of
assuming infinite tape on a Turing Machine), you get very different results.

> A third sort of problem with closed-source software is the licensing
> mess.

No, licensing is orthogonal.

> There are special fees for distribution of programs built with
> Franz Lisp (you said, I believe, down below)

But not with Xanalys LispWorks, and it's closed source.

I'm quite comfortable using LispWorks Professional for my own uses
because I can continue to use the same Lisp image year after year,
even distributing executables, without fee.

> and in general, the price
> for a package depends on the use you will make of it.

That's a vendor issue. Using a specific vendor's policy as a basis for
a claim against the paradigm of closed source is not legitimate.

Symbolics used to have open source stuff that you had to pay a fee to get
and that you were restricted from using in various ways.

Open source might make it easier for you to willfully disregard the law,
but the law is quite clear that these issues are orthogonal.

> Overall, closed source software has a LOT of problems, and the fact that
> the libre stuff is often better is the least reason to prefer it, in the
> long run.

And it has benefits. I've detailed them recently in other posts, so I won't
repeat them here.

Gavin E. Mendel-Gleason

Sep 11, 2001, 2:18:25 PM
Erik Naggum <er...@naggum.net> writes:

> > Black box methods just don't fit well with the scientific method.
>
> I do not think that statement fits well with the scientific method.
>

I feel that the original statement is accurate. Your rebuttal, however,
will require a more detailed description of your problems with the
original statement if I'm to decide that I disagree.

I have found numerous difficulties in the use of closed-source scientific
computation packages. Some of the problems that I faced were:

1. There was no way to tell the accuracy and/or error of many of the results
of floating point computation made by functions in the system.

2. There was no way to view the subtle assumptions that are made in the
implementation of complicated algorithms.

The first (1) can be attributed to a poor interface. The second (2), on
the other hand, is very slippery. The best specification of an algorithm
is its source.

--
Gavin E. Mendel-Gleason
(let ((e-mail-address "PGIU...@VGIHKRR.TKZ"))(loop with new-string = (make-string (length e-mail-address))
for count from 0 to (1- (length e-mail-address)) for char-code = (char-code (aref e-mail-address count))
for new-char-code = (if (and (> char-code 64)(< char-code 123))(+ (mod (+ 13 char-code) 52) 65) char-code)
do (setf (aref new-string count) (code-char new-char-code)) finally (return new-string)))

Kent M Pitman

Sep 12, 2001, 3:33:28 AM
jaco...@pacbell.net (Gavin E. Mendel-Gleason) writes:

> Erik Naggum <er...@naggum.net> writes:
>
> > > Black box methods just don't fit well with the scientific method.
> >
> > I do not think that statement fits well with the scientific method.
> >
>
> I feel that the orginal statement is accurate. Your rebutal however
> will require a more detailed description of your problems with the
> original statement if I'm to decide that I disagree.
>
> I have found numerous difficulties in the use of close sourced scientific
> computation packages. Some of the problems that I faced were:
>
> 1. There was no way to tell the accuracy and/or error of many of the results
> of floating point computation made by functions in the system.

Certainly there is. It can simply be documented.

Further, the essence of the scientific method is that independent trials
should yield the same result. So you are welcome to independently test
any part that you doubt.
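
(A throwaway sketch of what such an independent test can look like.
BLACK-BOX-SUM is a made-up stand-in for whatever closed routine you
doubt; here it is just naive single-float summation. CL's exact
rational arithmetic provides the reference to measure it against.)

(defun black-box-sum (vector)
  ;; Stand-in for the routine under suspicion: naive single-float summation.
  (let ((acc 0.0f0))
    (map nil (lambda (x) (setf acc (+ acc x))) vector)
    acc))

(defun relative-error (vector)
  "Compare BLACK-BOX-SUM against an exact rational reference sum."
  (let ((exact  (reduce #'+ (map 'list #'rational vector)))
        (approx (rational (black-box-sum vector))))
    (if (zerop exact)
        (abs approx)
        (abs (/ (- approx exact) exact)))))

;; Ill-conditioned input exposes the error quickly:
;;   (relative-error (vector 1.0f7 3.14159f0 -1.0f7))  => about 0.045,
;;   vastly worse than single-float-epsilon (roughly 6e-8).

If the black box disagrees with the reference by more than its
documentation claims, you have something concrete to report to the
vendor, with or without its sources.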

> 2. There was no way to view the subtle assumptions that are made in the
> implementation of complicated algorithms.

That's true anyway.

Once again, the scientific method has not traditionally required that every
aspect of one's notes and computations be exposed. I've read any number
of reports where only tables of computational results are exposed. What
makes these scientific is not that they do or don't make subtle assumptions,
but that they document what they think ARE the relevant factors so that YOU,
the reader, can try to duplicate things yourself under whatever alternate
assumptions or conditions you want, in order to assure yourself of the same.

"The scientific method" is NOT proof by static analysis. It is "gradual
consensus and acceptance by failure of a community to construct a
counterexample".

> The first (1.) can be attributed to a poor interface. The second (2.) on
> the other hand is very slippery. The best specification of an algorithm
> is source.

You seem to me to be confusing tools with tests. If you're saying that
a program is ITSELF a scientific experiment, that's a different claim.
But I think that's not the common case. There are plenty of tools that
are closed-source, just like there are probably plenty of voltmeters,
oscilloscopes, etc. that have seals on them that say "warranty void
if opened".

I think the claim that "closed source" is in any way at odds with science
or the scientific method is utter rubbish.

Alain Picard

Sep 12, 2001, 4:31:37 AM
jaco...@pacbell.net (Gavin E. Mendel-Gleason) writes:

> Erik Naggum <er...@naggum.net> writes:
>
> > > Black box methods just don't fit well with the scientific method.
> >
> > I do not think that statement fits well with the scientific method.

> I feel that the orginal statement is accurate.

So do I. Ob. anecdote follows.


Where I did my astronomy work, we used this package called FIGARO
to reduce our data. Now, FIGARO is, well, a mess (no offence, Keith! :-)

At most other institutions, people use something called IRAF, which is
_way_ more polished, and was largely written with federal grant money
by professional programmers at NOAO. It's in almost every measurable
way superior to FIGARO.

Therein lies its flaw.

The problem is that it does things so smoothly, the students who are
exposed to it tend to develop an attitude of "if IRAF can't do it, it
can't be done (or isn't worth doing)".

Students exposed to FIGARO develop the philosophy "If FIGARO can't
do this, I'll hack my routine" (FIGARO is very open to extension).


So it takes the latter category of students 5 times longer than
the first category to do anything, but they learn that anything
can be done. Later in life, they discover IRAF, switch to it,
and use it very astutely, as they _know_ that they can, in principle,
reduce their data in any way they choose. So they turn out to
be better scientists.


Now, the analogy to commercial black boxes is flawed, as both FIGARO
and IRAF have open sources, but the point remains: the package *perceived*
as a black box does, in fact, "fit less well with the scientific method."


As always, YMMV.
Alain Picard

--
It would be difficult to construe this as a feature.
        -- Larry Wall, in article <1995May29....@netlabs.com>

Tim Bradshaw

Sep 12, 2001, 5:01:23 AM
* Kent M Pitman wrote:

> I think the claim that "closed source" is in any way at odds with science
> or the scientific method is utter rubbish.

It's a bit like arguing that quantum mechanics is at odds with the
scientific method because it says that some parts of the world are
sort-of closed source, and puts extremely comprehensive restrictions
on reverse-engineering them, too.

--tim

Martin Cracauer

Sep 12, 2001, 11:11:03 AM
Erik Naggum <er...@naggum.net> writes:

>> Odds are that someone will port Linux and the BSDs to every new platform
>> to come along, for a long time to come. Given that, porting my favorite
>> application becomes thinkable.

> I doubt that. Have you see how much changes from release to release in
> Linux? Anyone who wrote a real application would have to work hard to
> stay on top of these changes. Duane Rettig of Franz Inc has written
> several good articles about Allegro CL on Linux here about these problems.

I once posted a list of Linux problems for CMUCL, same issue.

However, the personal behaviour of Torvalds, some Redhat people, the
idiot maintaining glibc and other Linux people should not be taken as
representative of the open source or free software world.

Use FreeBSD instead. Have one look at the ELF transition of FreeBSD,
at the way libc and signal handler changes are done. Heck, most of
the time you have fewer change problems running Linux (not FreeBSD)
binaries on different FreeBSD versions than on different Linux
versions.

> If the free Lisp stuff was better, it would probably kill the commercial
> offerings, but the free offerings for Lisp are very, very poor compared
> to the commercial software. This is why Franz Inc. can survive at all.

The word "better" is certainly a bit too simple for my taste to make
such a statement. But certainly the free Lisp systems give a damn
about some aspects of customer wishes and those are fed by commercial
vendors. That's the way it should be.

Martin
--
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Martin Cracauer <crac...@bik-gmbh.de> http://www.bik-gmbh.de/~cracauer/
FreeBSD - where you want to go. Today. http://www.freebsd.org/

Nils Goesche

Sep 12, 2001, 11:00:13 AM
Kent M Pitman <pit...@world.std.com> writes:

> I think the claim that "closed source" is in any way at odds with
> science or the scientific method is utter rubbish.

That's correct, of course. There is nothing wrong with regarding some
software package you're using as a `black box', indeed that's what
modular programming is all about, isn't it? However, problems arise
when this `black box' doesn't work as specified, no matter how good
and clear the specification is.

Until 1 1/2 years ago, I was working on an embedded, but very large,
system that used Windows NT as its OS (switching to NT was the
decision of some braindead manager. Originally, the thing ran under
Solaris). Over the time we spent /months/ stepping through the
Assembler code of several parts of NT, whenever something didn't work
as advertised. Some problems could be solved that way, others
couldn't (until now, I would really like to see the sources of the so
called ``Service Contol Manager'', which gave us particular trouble).
If we had had the sources of NT, not just the kernel symbols, our
lifes would have been much easier.

Somewhat smaller devices need sophisticated OSes, too. There are
several variants of embedded UNIXen to choose from, which often are
_very_ expensive (Franz Lisp costs _nothing_ compared to them). Same
story - the kind of bugs you can report to the vendor often sound like
``Well, I use this feature of yours as documented and after a while
the system crashes.'' When you can't find out what happens by
stepping through the Assembler code, all you can do is wait for a
fix by the vendor. Often, they don't even answer for _months_ when
they have no idea what causes the bug. This is not just one vendor I
am talking about, this is the general experience of mine and most of
the people I know in the embedded business.

Most people I know in the field came to the conclusion that it is
suicidal to use anything you don't get sources for. Either write what
you need yourself or insist on sources. Recently, using embedded
Linux or some BSD variant is becoming an option, which makes things
even easier because vast literature and knowledge about the internals
of these OSes exist.

Sure, newbies and morons can cause great damage by hacking stupid
``fixes'' into those packages, but beating them strongly over the head
usually causes them to improve over time (the newbies, at least. The
morons are finally transferred to the Visual C++ crowd :-).

Regards,
--
Nils Goesche
"Don't ask for whom the <CTRL-G> tolls."

PGP key ID 0x42B32FC9

Kaz Kylheku

Sep 12, 2001, 12:45:59 PM
In article <9nntu7$177b$1...@counter.bik-gmbh.de>, Martin Cracauer wrote:
>However, the personal behaviour of Torvalds, some Redhat people, the
>idiot maintaining glibc and other Linux people should not be taken as
>representative of the open source or free software world.
>
>Use FreeBSD instead. Have one look at the ELF transition of FreeBSD,
>at the way libc and signal handler changes are done. Heck, most of
>the time you have fewer change problems running Linux (not FreeBSD)
>binaries on different FreeBSD versions than on different Linux
>versions.

Linux refers to an entire collection of distributions of an operating
system. FreeBSD is effectively just one distribution of BSD.

The differences between two Linux distributions pale in comparison to the
differences between two variants of BSD.

Thomas F. Burdick

Sep 12, 2001, 2:24:22 PM
k...@ashi.footprints.net (Kaz Kylheku) writes:

> In article <9nntu7$177b$1...@counter.bik-gmbh.de>, Martin Cracauer wrote:
> >However, the personal behaviour of Torvalds, some Redhat people, the
> >idiot maintaining glibc and other Linux people should not be taken as
> >representative of the open source or free software world.
> >
> >Use FreeBSD instead. Have one look at the ELF transition of FreeBSD,
> >at the way libc and signal handler changes are done. Heck, most of
> >the time you have fewer change problems running Linux (not FreeBSD)
> >binaries on different FreeBSD versions than on different Linux
> >versions.
>
> Linux refers to an entire collection of distributions of an operating
> system. FreeBSD is effectively just one distribution of BSD.

In this context, I took "Linux" to mean linux (the kernel) + glibc,
because that's the relevant part of the system. The analysis holds
for any distro, though, even Debian, which puts a premium on being
stable, because the source of this problem is the kernel and C
library. So go ahead and replace "Linux" with "Debian" or "RedHat",
and re-read the above.

> The differences between two Linux distributions pale in comparison to the
> differences between two variants of BSD.

Well, naturally. That does not hold if you only count descendants of
4.4BSD, though.

Martin Cracauer

Sep 12, 2001, 5:50:09 PM
Tim Bradshaw <t...@cley.com> writes:

So here we have scientific proof that the real world is generally
crap, can't be fixed and is not worth supporting.

Cool. Where's my gaming machine? Need some better world for a break.

Martin Cracauer

Sep 12, 2001, 5:46:48 PM
k...@ashi.footprints.net (Kaz Kylheku) writes:

>In article <9nntu7$177b$1...@counter.bik-gmbh.de>, Martin Cracauer wrote:
>>However, the personal behaviour of Torvalds, some Redhat people, the
>>idiot maintaining glibc and other Linux people should not be taken as
>>representative of the open source or free software world.
>>
>>Use FreeBSD instead. Have one look at the ELF transition of FreeBSD,
>>at the way libc and signal handler changes are done. Heck, most of
>>the time you have fewer change problems running Linux (not FreeBSD)
>>binaries on different FreeBSD versions than on different Linux
>>versions.

>Linux refers to an entire collection of distributions of an operating
>system. FreeBSD is effectively just one distribution of BSD.

I was talking about *one* Linux distribution, but one Linux
distribution when going through updates. Updated kernel, updated
glibc, updated mountd, automounter, whatever. This breaks Linux for
many applications. Upgrades of FreeBSD base system almost never break
applications.

>The differences between two Linux distributions pale in comparison to the
>differences between two variants of BSD.

See above, I wasn't addressing the issue of different Linux
distributions, but the same distribution at different dates.

Also, the differences between BSDs may be bigger, but FreeBSD is so
dominant that most software that provides any BSD version provides
just one for FreeBSD. At least NetBSD has a very good FreeBSD ABI
support, so that isn't a problem for them. Don't know about OpenBSD.
As FreeBSD and NetBSD have very good Linux ABI support, BTW, as I
said, sometimes more reliable than Linux itself.

Friedrich Dominicus

Sep 13, 2001, 2:44:54 AM
crac...@counter.bik-gmbh.de (Martin Cracauer) writes:

>
> I was talking about *one* Linux distributions, but one Linux
> distribution when going through updates. Updated kernel, updated
> glibc, updated mountd, automounter, whatever. This breaks Linux for
> many applications.

Well, I use Debian, and I updated from libc5 to libc6 and all the
X-Window stuff from 3.3x to 4.0.x. I updated the kernel from, I guess,
2.0 over 2.2 to 2.4, but I can't remember having broken applications
(well, there was a problem with LispWorks after the update to XFree
4.0.x). Now, I do not know other distributions, but Debian plays in the
first league if one wants an updateable Linux distribution.

> Also, the differences between BSDs may be bigger, but FreeBSD is so
> dominant that most software that provides any BSD version provides
> just one for FreeBSD. At least NetBSD has a very good FreeBSD ABI
> support, so that isn't a problem for them. Don't know about OpenBSD.
> As FreeBSD and NetBSD have very good Linux ABI support, BTW, as I
> said, sometimes more reliable than Linux itself.

Well I do not doubt that the BSDs are fine too.

Regards
Friedrich

Erik Naggum

Sep 13, 2001, 7:29:58 AM
* Friedrich Dominicus <fr...@q-software-solutions.com>

> Now I do not know other distributions, but Debian plays in the first
> league if one wants an updateable Linux distribution.

My experience is also with Debian on my development system, but an
installed system "needed" something that only RedHat does, and it got an
early version of RedHat 7.0 installed, which I thought risky, but it
worked, so it was probably OK. Recently they upgraded to a new version
that apparently uses an otherwise unpublished version of GCC that does
weird things with C++ libraries and something really weird with calling
sequences or something -- I am not quite sure what it was, but it has
caused a serious problem in compatibility between Linux distributions.
This is RedHat's fault, not Linux's or the library/compiler developers'.
I think basing your decision to support Linux on RedHat is much too risky
to be worth it. Choose a particular kernel/library configuration and
depend on versions of each the way Debian does it. It has turned out to
be the only really _working_ way of surviving updates cleanly.

It is important for people who have RedHat scars to remember that they
were not scarred by Linux as such, nor by any other distribution than RedHat.

///

Will Deakin

Sep 13, 2001, 8:47:18 AM
Martin Cracauer wrote:
> So here we have scientific prove that the real world is generally
> crap, can't be fixed and is not worth supporting.
But only if you are exhibiting variations in position and momentum greater
than 6.626x10^-34 Js. I think there is a way to go yet.

;)w

Peter Wood

Sep 13, 2001, 10:06:38 AM
crac...@counter.bik-gmbh.de (Martin Cracauer) writes:

> I was talking about *one* Linux distributions, but one Linux
> distribution when going through updates. Updated kernel, updated
> glibc, updated mountd, automounter, whatever. This breaks Linux for
> many applications. Upgrades of FreeBSD base system almost never break
> applications.

I use a home built GNU/Linux system. Everything was/is compiled from
scratch. I have upgraded libc twice, binutils, fileutils, gcc, and a
number of other packages. The kernel has been upgraded more times
than I can remember.

I have never experienced anything breaking.

Ever.

Of course I don't use CMUCL, so maybe that explains it.

Regards,
Peter

Andy Freeman

Sep 13, 2001, 11:40:50 AM
Alain Picard <api...@optushome.com.au> wrote in message news:<86zo80q...@gondolin.local.net>...

> So it takes the latter category of students 5 times longer than
> the first category to do anything, but they learn that anything
> can be done. Later in life, they discover IRAF, switch to it,
> and use it very astutely, as they _know_ that they can, in principle,
> reduce their data in any way they choose. So they turn out to
> be better scientists.
>
> Now, the analogy to commercial black boxes is flawed, as both FIGARO
> and IRAF have open sources, but the point remains: the package *perceived*
> as a black box does, in fact, "fits less well with the scientific method."

Not at all. At most, the anecdote claims some pedagogical effects of
open source, namely "take longer to learn/use" and "better students".
We don't know what would happen if the IRAF folks used the time savings
to teach some of the lessons "required" by using FIGARO. (We also don't
know whether IRAF is seductively ill-designed.)

Moreover, even that point is flawed as both are open source, so the
effects are more appropriately blamed on "FIGARO is frustrating, so
people think that working around it is standard operating procedure".

One could argue that open source is inherently or usually more
frustrating, but ....

-andy

Eric Moss

Sep 13, 2001, 11:43:12 AM
Would everyone do me a favor and start a new thread if you are unwilling
to answer the original question? All these digressions make it difficult
to search through by thread name to find relevant information.

Thanks,

Eric

Kent M Pitman

Sep 13, 2001, 1:20:59 PM
Nils Goesche <car...@cartan.de> writes:

> Kent M Pitman <pit...@world.std.com> writes:
>
> > I think the claim that "closed source" is in any way at odds with
> > science or the scientific method is utter rubbish.
>
> That's correct, of course. There is nothing wrong with regarding some
> software package you're using as a `black box', indeed that's what
> modular programming is all about, isn't it?

Exactly.

> However, problems arise when this `black box' doesn't work as
> specified, no matter how good and clear the specification is.

First, this is an unrelated issue I had not intended to opine on.
However, even here, I'm not so sure this statement can be made in
the absolute.

Partly we are talking about convenience, not science now. What is
convenient is often very subjective.

Suppose you provide source code, but it's in C++ not Lisp. (Or if you
don't find C++ as impenetrable as I do, substitute your own favorite
unreadable language). The mere presence of source does not imply the
usability of source.

Suppose you provide source code, but it requires a compiler that I don't
have access to. Perhaps as a legal matter it was open source, but perhaps
you didn't even realize that it wouldn't compile with a standard compiler
and you deleted your copy thinking it would. The mere fact of the legal
status of something is not proof of ease of access.

Suppose you provide source code, but I have an alternate source. What if
I have a hand-held calculator that breaks? Would I (a) appeal to the
sources and burn my own new chip, or (b) just grab another (probably closed
source) device to replace it? Market availability can compensate for
difficulty of reproduction.

You are making assumptions in your remark that closed source implies single
supplier, and that's not a fair assumption. You are making assumptions
that open source implies the possibility, in time and money, of making the
replacement/repair in that way.

And, finally, most importantly to my remarks about closed source, the most
common bad assumption is that someone who produces closed source would, if
their only option were to produce open source, produce the product at all.
For example, suppose I am contemplating the writing of a compiler. As a
closed source product, I may be incentivized by the availability of an
income stream from it. Now if there were a world where that was not
available, the open source advocates want me to think that I could get
money other ways, but they never acknowledge that I might not WANT to make
money those other ways. They seem to assume the thing will get written
anyway, but maybe I decide I'd rather dig ditches. Frankly, this is very
close to the way I approach the world. I don't dig ditches as an alternative,
but I am a creative person and there are numerous things I can create.
I survey those and find the one that makes the most sense to me economically
and pursue that. The things I create, I like to think, are not just
commodities but things that are unique to me. Others can create interesting
things, too, of course. And maybe it's best if you just starve my gene pool
and you breed a different gene pool, but it's important NOT to get into
the notion that you are comparing "x with sources" to "x without sources".
You are really comparing "x with sources" to "y without sources". Apples
and oranges. The open source world incentivizes a certain set of people
and the closed source world a different set of people.

It's all well and good that Linux is an open-source version of Unix, but it
remains an unknown question whether Linux would have happened without Unix
happening first. Almost surely SOME open-source system might have arisen,
but whether it would have been better or worse than commercial ones is an
open question. Whether it would have been adoptable by commercial companies
if it wasn't like one of the proprietary companies is an open question. Even
if it would have eventually gained acceptance, whether that would have been
ten years ago is doubtful (since that's not the way it played out in fact),
and more likely it would have taken more time. Debating these hypotheticals
is pointless, but it is worth acknowledging that it's an apples-to-oranges
comparison and that any argument of the form "I'd rather have x with sources"
is ill-placed UNLESS you can show you are being offered that as a choice, or
unless you seriously think you can change the minds of people like me by
so arguing and so you are just trying to sway me (which thus far you are not).

> Until 1 1/2 years ago, I was working on an embedded, but very large,
> system that used Windows NT as its OS (switching to NT was the
> decision of some braindead manager. Originally, the thing ran under
> Solaris). Over the time we spent /months/ stepping through the
> Assembler code of several parts of NT, whenever something didn't work
> as advertised. Some problems could be solved that way, others

> couldn't [...]

NT was not offered to you open-source, so this argument is (by my argument
above) irrelevant. You always had the option of using an open-source system
and yet you chose not to. Perhaps you would make that choice differently
today but many of us continue not to.

I don't like Microsoft any more than you, and I switched to Linux in defiance
one day out of aggravation, but I also switched back out of aggravation at the
pain of maintaining my own stuff and at the irritation of hearing the mantra
"you just need to recompile the kernel" for the 10,000th time. I compiled
hundreds of kernels with myriad open source switches set and it did not
help me. In the end, I just had a lot of kernels taking up disk space, was
not even sure what I could safely delete, and was generally very unhappy.



> Most people I know in the field came to the conclusion that it is
> suicidal to use anything you don't get sources for.

Many people I know in the field are of the opinion it's suicidal to use
anything you can't second source. But that is very different than open
source.

Further, many people I know are also of the opinion that having sources
is proof of little or nothing. It can be extraordinarily expensive to
use those sources, so much so that if you have a basically working product,
it is far cheaper to pay the most expensive support than to try to fix
the thing yourself.

> Sure, newbies and morons can cause great damage by hacking stupid
> ``fixes'' into those packages,

So can I, but I like to amuse myself into thinking I'm not a newbie or
moron. Truth be told, I don't like the term moron and I'm not even very
hot on newbie. We are mostly all specialists in today's world, if we are
worth much at all, and that makes us all newbies or morons when we're
out of our element, at least some of the time. The term you should use
is "people not intimately familiar with the details of what they are hacking"
which is "most of us". And you should express the cost of becoming
intimately familiar with an arbitrary piece of code where the only
reason you have had to do so is that there was one little bug.

> but beating them strongly over the head
> usually causes them to improve over time (the newbies, at least. The
> morons are finally transferred to the Visual C++ crowd :-).

Even without the smiley, this is an overly simplistic rendering at
best. Adding the notion that one should take lightly the idea that
someone might reasonably WANT not to be an expert in everything only
compounds things.

Kent M Pitman

Sep 13, 2001, 1:36:54 PM
Alain Picard <api...@optushome.com.au> writes:

> Where I did my astronomy work, we used this package called FIGARO
> to reduce our data. Now, FIGARO is, well, a mess (no offence, Keith! :-)
>
> At most other institutions, people use something called IRAF, which is
> _way_ more polished, and was largely written with federal grant money
> by professional programmers at NOAO. It's in almost every measurable
> way superior to FIGARO.
>
> Therein lies its flaw.
>
> The problem is that it does things so smoothly, the students who are
> exposed to it tend to develop an attitude of "if IRAF can't do it, it
> can't be done (or isn't worth doing)".
>
> Students exposed to FIGARO develop the philosophy "If FIGARO can't
> do this, I'll hack my routine" (FIGARO is very open to extension).

This more shows that people get easily lazy and can make bad conclusions.
Science is not helped by reasoning in unfounded ways. You could turn the
opportunity into a help to science by teaching them why this is an unfounded
assumption and so improving their reasoning skills.

It's not that I don't understand that sometimes people follow the
Clarke-ian "any sufficiently advanced technology is indistinguishable from
magic" thing, where they can start to revere a large component they don't
understand, I just don't see source availability as a fix for that. Do
you honestly think that if Figaro's source was available, you could have
just pointed to line 7 of some piece of code and said "it says here that
although it can't do this, it is still known to be possible to do"?
The specific example you cite is one that requires remarkable sophistication
to prove even in the presence of sources, and probably only the person doing
the proof will truly know they are right. Others will probably just
accept the credentials of the proof-maker (whose brain is closed source,
incidentally) and accept that on faith. If he's wrong in his proof, they'll
likely accept it anyway.

> So it takes the latter category of students 5 times longer than
> the first category to do anything, but they learn that anything
> can be done.

Actually, as regards that, I have found often that the open source community
causes people to say "you shouldn't write that yourself, you should get
the standard component that's already available". This diminishes the
availability of multiple sources of components (even if both were open source)
and it also diminishes people's actual experience building standard stuff.
They don't always READ the source, often they just compile it and trust
it. By contrast, the availability of closed source can lead to multiple
gene pools and increased understanding because at least some people go back
and reimplement things that are done before, creating more knowledge of
how-to from the ground up and SOMETIMES creating new ways of implementing
or thinking about something that would not have been created in the
convergent line of "it's been done, don't redo it" that sometimes pops
up in open source. I'm not saying open source always causes this, but
I have definitely observed this to happen enough times that it DOES happen
sometimes--ESPECIALLY when an employer doesn't want you to spend money
reimplementing a package that is already freely available (even if inferior
to what you think you might write). If an employer felt there was cost in
both directions, they might let you experiment some, but the fact is that
the dollar cost of obtaining freeware biases whether they will let you do
something over in a way that may not match the theoretical advisability.

> Later in life, they discover IRAF, switch to it,
> and use it very astutely, as they _know_ that they can, in principle,
> reduce their data in any way they choose. So they turn out to
> be better scientists.
>
> Now, the analogy to commercial black boxes is flawed, as both FIGARO
> and IRAF have open sources, but the point remains: the package *perceived*
> as a black box does, in fact, "fits less well with the scientific method."
>
> As always, YMMV.

It did. ;-)

Tim Moore

Sep 13, 2001, 3:24:22 PM

Ah! Perhaps you are not familiar with the issues. CMUCL depends on
detailed knowledge of things below the level of ANSI C or POSIX source
code compatibility, in particular the layout of signal stack frames. If
this changes in an incompatible way, you're hosed.

FreeBSD doesn't get off the hook completely, and nor should it have to:
between 3.X and 4.X the layout of signal stack frames changed when support
for large numbers of signals was added. We had to make changes to CMUCL
to support this in order to continue building the C support code from
source (perhaps a magic #define would have kept things working too).
However, old binaries did continue to "just work."

Tim


Jochen Schmidt

Sep 13, 2001, 4:12:09 PM
Tim Moore wrote:

> Ah! Perhaps you are not familiar with the issues. CMUCL depends on
> detailed knowledge of details below the level of ANSI C or POSIX source
> code compatibility, in particular the layout of signal stack frames. If
> this changes in an incompatible way, you're hosed.
>
> FreeBSD doesn't get off the hook completely, and nor should it have to:
> between 3.X and 4.X the layout of signal stack frames changed when support
> for large numbers of signals was added. We had to make changes to CMUCL
> to support this in order to continue building the C support code from
> source (perhaps a magic #define would have kept things working too).
> However, old binaries did continue to "just work."

Is there decent OpenGL support for *BSD? (HW not SW)

ciao,
Jochen

--
http://www.dataheaven.de

Alain Picard

Sep 14, 2001, 8:41:16 AM
Kent M Pitman <pit...@world.std.com> writes:

> Alain Picard <api...@optushome.com.au> writes:
>
> >
> > Students exposed to FIGARO develop the philosophy "If FIGARO can't
> > do this, I'll hack my routine" (FIGARO is very open to extension).
>
> This more shows that people get easily lazy and can make bad conclusions.

Perhaps. Or perhaps not.

Another analogy: People used to tinker with cars more 30 years ago,
when cars were more "transparent". Now, with electronic fuel injection,
there's not a lot you can do to tune a car, so people tend not to do it.
As a result, people tend to know less about cars.

This is _notwithstanding_ the fact that modern cars are, in fact,
_superior_ to cars of old, and require far less maintenance. But the
price of this superior technology is that _people_ now have inferior
knowledge on cars *as systems*.

> Science is not helped by reasoning in unfounded ways.

Oh, I see. Thanks, I was confused there. :-)


> It's not that I don't understand that sometimes people follow the
> Clarke-ian "any sufficiently advanced technology is indistinguishable from
> magic" thing, where they can start to revere a large component they don't
> understand, I just don't see source availability as a fix for that.

Well, if you re-read my post, you'll note that I said that _both_
systems were, in fact, open source, but one had the property of being
sealed somewhat more hermetically, making it more "black box like"
than the other. And from this I drew a personal conclusion (shared by
many scientists I know, BTW).

I was _not_ starting a diatribe of closed vs. open sources, a discussion
which does not interest me. I was talking about "black boxes" and the
scientific method.

> > As always, YMMV.
>
> It did. ;-)

I expected it to. ;-)

Coby Beck

Sep 14, 2001, 9:08:46 AM

"Alain Picard" <api...@optushome.com.au> wrote in message
news:86lmji9...@gondolin.local.net...

> Kent M Pitman <pit...@world.std.com> writes:
>
> > Alain Picard <api...@optushome.com.au> writes:
> >
> > >
> > > Students exposed to FIGARO develop the philosophy "If FIGARO can't
> > > do this, I'll hack my routine" (FIGARO is very open to extension).
> >
> > This more shows that people get easily lazy and can make bad conclusions.
>
> Perhaps. Or perhaps not.
>
> Another analogy: People used to tinker with cars more 30 years ago,
> when cars were more "transparent". Now, with electronic fuel injection,
> there's not a lot you can do to tune a car, so people tend not to do it.
> As a result, people tend to know less about cars.
>
> This is _notwithstanding_ the fact that modern cars are, in fact,
> _superior_ to cars of old, and require far less maintenance. But the
> price of this superior technology is that _people_ now have inferior
> knowledge on cars *as systems*.
>

To me this debate is losing sight of the original statement about black boxes and
the scientific method. I think Alain's anecdotal arguments definitely support the
notion that black boxes discourage investigation - of those black boxes, at
least. But I took the term scientific method to mean experimentation,
hypothesis, prediction and all that stuff, not just being curious or having a
desire to learn.

Kent's points about other kinds of tools (physical machinery) being similarly
"black" are well made. The question is really how close to the heart of what
you are investigating is the software you are using.

In some senses I would think it *against* scientific methods to be able to open
up the hood of your software and tweak it because, say, the results were not
what you expected. To use the car analogy, suppose you are scientifically
investigating how far A is from B. Do you open the hood and adjust the
odometer settings because the reading surprised you? If you doubt the result,
get another car (black box) and do the experiment again.

This is not to say fixing a broken odometer is necessarily anti-scientific; I
just think open versus closed software tools is orthogonal to principles of
the scientific method.

Coby

Tim Moore

Sep 14, 2001, 3:29:56 PM
On Thu, 13 Sep 2001, Jochen Schmidt wrote:

> Tim Moore wrote:
>
>> [FreeBSD rocks]

> Is there decent OpenGL support for *BSD? (HW not SW)
>
> ciao,
> Jochen

My understanding is that it's not quite as good as Linux, but still pretty
good. Cards whose support is open source, like the Matrox line, are very
well supported. Linux X server binaries run fine. Otherwise the level of
support depends on whether or not binary kernel drivers are required, and
even then I believe that those modules can be made to work with
appropriate shims.

If you want to use the latest NVidia cards to their fullest, it's probably
best to stick with Linux. Otherwise, the decision is more complex.

Well off topic,
Tim


Kent M Pitman

Sep 14, 2001, 4:12:13 PM
Alain Picard <api...@optushome.com.au> writes:

> Kent M Pitman <pit...@world.std.com> writes:
>
> > Alain Picard <api...@optushome.com.au> writes:
> >
> > >
> > > Students exposed to FIGARO develop the philosophy "If FIGARO can't
> > > do this, I'll hack my routine" (FIGARO is very open to extension).
> >
> > This more shows that people get easily lazy and can make bad conclusions.
>
> Perhaps. Or perhaps not.
>
> Another analogy: People used to tinker with cars more 30 years ago,
> when cars were more "transparent". Now, with electronic fuel injection,
> there's not a lot you can do to tune a car, so people tend not to do it.
> As a result, people tend to know less about cars.

But that may not be a function of the closedness. It might be a function
of the complexity or the tools required. For example, people used to write
assembly code themselves but there exist assembly languages with sufficient
addressing modes and other strange features that humans don't tend to
tackle them manually. Consequently, they know less about them. That doesn't
mean they use them less nor use them less effectively. Directness of use
and cost of creation vs cost of disposal issues can play in, too. My
conclusion is that this thought experiment is inadequately controlled.



> This is _notwithstanding_ the fact that modern cars are, in fact,
> _superior_ to cars of old, and require far less maintenance. But the
> price of this superior technology is that _people_ now have inferior
> knowledge on cars *as systems*.

(Sorry about the sensitive topic, but) I'm told modern planes are easy
to fly because so much is automatic. You would think this made them
necessarily safer, but the result has not been uniformly "pilot spends
more time thinking about the flight". Rather, pilots now fall asleep
with boredom and flight attendants have a regular duty to wake them.

Accidentally back on the topic of Common Lisp, one of the dangers of living
with any technology that insulates you from the pain of lower level details
is that such superior technology (source or not) leaves you with inferior
knowledge by at least some metric. Most people hope that this doesn't
mean we stop using our brains, but that we apply our brains to higher and
higher things. I almost didn't graduate from MIT in 1980 because there was an
insistence that I study hardware and numerous claims by professors that one
could not master programming if one did not understand the hardware. I
refused to take the classes they insisted I needed, and had to change majors
from EE&CS to Philosophy to graduate with the courses I felt would properly
prepare me. I like to think myself not "stupider" for my lack of knowledge
of hardware, but rather "a specialist in a different area".

I was told mathematicians used to spend a lot of their lives memorizing
logarithms, and I'm sure now they just use calculators. I'm quite sure
lawyers of the past have spent too much time memorizing case law, whereas
new lawyers probably don't know how to do that because they can do searches
into well-indexed databases. All of math is well-documented (open source).
Direct access to legal data turns out to be problematic and is more like
closed-source, yet that doesn't impede lawyers.

People with cooks learn to be lazy, and it's not for lack of cookbooks.

None of this has to do with sources.

> > Science is not helped by reasoning in unfounded ways.
> Oh, I see. Thanks, I was confused there. :-)

Happy to be of help. ;-)



> > It's not that I don't understand that sometimes people follow the
> > Clarke-ian "any sufficiently advanced technology is indistinguishable from
> > magic" thing, where they can start to revere a large component they don't
> > understand, I just don't see source availability as a fix for that.
>
> Well, if you re-read my post, you'll note that I said that _both_
> systems were, in fact, open source, but one had the property of being
> sealed somewhat more hermetically, making it more "black box like"
> than the other. And from this I drew a personal conclusion (shared by
> many scientists I know, BTW).

Certainly you're entitled to any opinion you like, of course. But, as
I'm sure you know, scientists can hold non-scientific points of view,
so saying it's a view held by a scientist is not, in and of itself, a
proof of any kind.

> I was _not_ starting a diatribe of closed vs. open sources, a discussion
> which does not interest me. I was talking about "black boxes" and the
> scientific method.

That's ok. But even black boxes can be studied. (The Turing test is
one such attempt. Psychology, in general, is another, since people are
in many ways black boxes. Yes, people can have their anatomies probed,
but software black boxes are, as a matter of science if not a matter
of law, reverse engineerable, too.)

It's possible for something to be in plain view but still computationally
impenetrable to the same degree as if you had withheld it. Read
Rivest's work on chaffing, for example. Or consider the halting problem.
Even in the domain of black boxes instead of closed sources, a very similar
set of issues, if not a completely isomorphic set of issues, would seem
to apply.

Kent M Pitman

unread,
Sep 14, 2001, 4:28:01 PM9/14/01
to
"Coby Beck" <cb...@mercury.bc.ca> writes:

> To me this debate is losing sight of the original statement about
> black boxes and the scientific method. I think Alain's anecdotal
> arguments definitely support the notion that black boxes discourage
> investigation - of those black boxes, at least.

I claim what his argument supports is the notion that "convenient technology"
discourages investigation.

> But I took the term
> scientific method to mean experimentation, hypothesis, prediction
> and all that stuff, not just being curious or having a desire to
> learn.

Yes, that was my assumption, too.



> Kent's points about other kinds of tools (physical machinery) being similarly
> "black" are well made. The question is really how close to the heart of what
> you are investigating is the software you are using.
>
> In some senses I would think it *against* scientific methods to be
> able to open up the hood of your software and tweak it because, say,
> the results were not what you expected.

FWIW, I think it is *against* scientific method to say that something is
against scientific method. Scientific method is a process, a tool. A
superhard surface is not "anti-hammer". An unturnable pin is not
"anti-screwdriver". Not every tool applies in every case. It is not the
job of data to bend to the scientific method. The scientific method works
pretty well. It may even have limits (e.g., in quantum physics where
reproducibility is tough).

> To use the car analogy,
> suppose you are scientifically investigating how far A is from B.
> Do you open the hood and adjust the odometer settings because the
> reading surprised you? If you doubt the result, get another car
> (black box) and do the experiment again.
> This is not to say fixing a broken odometer is necessarily anti-scientific; I
> just think open versus closed software tools is orthogonal to the principles of
> scientific method.

Yep.

Kaz Kylheku

unread,
Sep 14, 2001, 4:46:33 PM9/14/01
to
In article <sfwy9nh...@world.std.com>, Kent M Pitman wrote:
>"Coby Beck" <cb...@mercury.bc.ca> writes:
>
>> To me this debate is losing sight of the original statement about
>> black boxes and the scientific method. I think Alain's anecdotal
>> arguments definitely support the notion that black boxes discourage
>> investigation - of those black boxes, at least.
>
>I claim what his argument supports is the notion that "convenient technology"
>discourages investigation.

Moreover, black boxes encourage *empirical* investigation; disciplined
empirical investigation is the ``scientific method''.

Natural systems are often black boxes; their rules are hidden from direct
observation, and must be derived by repeated hypothesis-forming and
confirmation via experiments. For instance, we want to understand language,
but the human brain is a black box which does not allow us direct
access to the unconscious rules of language. That doesn't mean that
investigation is discouraged, it is only made difficult.

When empirical methods are applied to something man-made, like a
programming language or a programming interface, that is a stupid waste of
time, justifiable only in the absence of correct, adequate documentation,
source code or other artifacts by which one can gain more or less direct
access to the rules.

Craig Brozefsky

unread,
Sep 14, 2001, 4:47:03 PM9/14/01
to
Kent M Pitman <pit...@world.std.com> writes:

> But that may not be a function of the closedness. It might be a
> function of the complexity or the tools required. For example,
> people used to write assembly code themselves but there exist
> assembly languages with sufficient addressing modes and other
> strange features that humans don't tend to tackle them manually.
> Consequently, they know less about them. That doesn't mean they use
> them less nor use them less effectively. Directness of use and cost
> of creation vs cost of disposal issues can play in, too. My
> conclusion is that this thought experiment is inadequately
> controlled.

A childhood friend of mine is an auto mechanic, as is his father.
While chilling in their office sipping burnt drip coffee and downing
some b0mb cheeze-steaks, we got to talking about the changes brought
about by the computerization of autos. They showed me several of the
diagnostic tools they used. They were quite simple, basically a blue
box with some leads, a couple of buttons, and a cartridge slot for a ROM.
They were little more than hand-held TTYs with small LED screens,
packaged in something that is a bit more appropriate for garages.

What they ended up paying for was the software to interface with the
different chipsets the manufacturer used. They would perhaps have to
put on an adaptor for the leads, but otherwise the same tool could be
used for all models in an auto company's line-up. The tool was
cheap. What they paid for was the ROMs.

These ROMs probably used the tool's serial interface and a simple API for
the LED, and the rest was a lookup table of error and result codes,
perhaps with a simple menu of commands. After going through a few of them, it
was obvious they were not all that complex, and certainly not worth the
price they had to pay in order to maintain their subscription.
However, they had no choice; you couldn't get them anywhere else.
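
To make that concrete, here is a toy sketch in Common Lisp of the kind of
code-to-description lookup such a ROM presumably contained; the codes and
descriptions below are invented purely for illustration.

(defparameter *diagnostic-codes*
  ;; invented code/description pairs; not real manufacturer data
  '((#x12 . "O2 sensor voltage low")
    (#x2F . "EGR flow insufficient")
    (#x43 . "Knock sensor circuit open")))

(defun describe-code (code)
  "Return a human-readable description for diagnostic CODE."
  (or (cdr (assoc code *diagnostic-codes*))
      "Unknown code"))

;; (describe-code #x2F) => "EGR flow insufficient"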

--
Craig Brozefsky <cr...@red-bean.com>
http://www.red-bean.com/~craig
The outer space which me wears it has sexual intercourse. - opus

Alain Picard

unread,
Sep 14, 2001, 6:20:29 PM9/14/01
to
Kent M Pitman <pit...@world.std.com> writes:

> Certainly you're entitled to any opinion you like, of course. But, as
> I'm sure you know, scientists can hold non-scientific points of view,

Oh, of course. Scientists are people, of course, and their viewpoints
need only be scientific when they do science. After hours, over a beer,
most of them turn into philosophers. Hell, after some years, some of
them even turn into computer programmers! ;-)


> so saying it's a view held by a scientist is not, in and of itself, a
> proof of any kind.

That's called "proof by authority". :-) Seriously, of course not.
But it _is_ interesting to know how scientists think about their work
and science in general. And the variety of views on that topic you'll
see is actually quite outstanding, a fact I found surprising in my
naïve youth, but which, in retrospect, should have been obvious.

That's what "shools of thought" in science (and other fields, I guess)
are about.

As always, thanks for the lively and interesting discussion, Kent!

Håkon Alstadheim

unread,
Sep 15, 2001, 12:21:11 PM9/15/01
to
Kent M Pitman <pit...@world.std.com> writes:
[.. lots of good stuff snipped...]

> That's ok. But even black boxes can be studied. (The Turing test is
> one such attempt. Psychology, in general, is another, since people are
> in many ways black boxes. Yes, people can have their anatomies probed,
> but software black boxes are, as a matter of science if not a matter
> of law, reverse engineerable, too.)

This is really beside the point if the black box is supposed to be a
tool that helps in your inquiries into something else. Usually you'd like
to get right to work and not have to do heavy research to find the
specifications of your tools.

>
> It's possible for something to be in plain view but still computationally
> impenetrable to the same degree as if you had withheld it. Read
> Rivest's work on chaffing, for example. Or consider the halting problem.
> Even in the domain of black boxes instead of closed sources, a very similar
> set of issues, if not a completely isomorphic set of issues, would seem
> to apply.

Research topics in their own right. Nobody uses a purely theoretical
Turing machine to run an astrophysics simulation. They use some
well-documented or open source system.

--
Håkon Alstadheim, Montreal, Quebec, Canada

Bijan Parsia

unread,
Sep 16, 2001, 2:12:22 PM9/16/01
to
On Fri, 14 Sep 2001, Kaz Kylheku wrote:

[snip]


> When empirical methods are applied to something man-made, like a
> programming language or a programming interface, that is a stupid waste of
> time, justifiable only in the absence of correct, adequate documentation,
> source code or other artifacts by which one can gain more or less direct
> access to the rules.

Just to add to the list of justifiable circumstances: various testing,
verification, and QA methodologies make use of empirical methods and
explicitly employ a black-box hypothesis. Direct access to the rules
sometimes hinders human beings in understanding the effects, correctness,
or complex interaction of those rules (i.e., does this mean what we
*think* it means).

I sometimes find this useful for honing or bootstrapping my understanding
of a bit of code or a system, i.e., give it some inputs, check the
outputs, *then* look at the code, try some more inputs, see the outputs,
ah *HAH*, now I "get" the algorithm. (Note, I'm not saying that this is a
verification procedure, but that sometimes I understand something better
with a body of concrete experiences backing me up.)

A conceptually separate, but connected, mode of exploration is "stepping
through" or tracing calls into the system. Seems a bit of a mixture of
finegrain black boxy stuff (i.e., check inputs/outputs at a variety of
levels) and "direct access to the rules".

Cheers,
Bijan Parsia.

cbbr...@acm.org

unread,
Sep 17, 2001, 10:36:02 PM9/17/01
to
Alain Picard <api...@optushome.com.au> writes:
> Kent M Pitman <pit...@world.std.com> writes:
>
> > Alain Picard <api...@optushome.com.au> writes:
> >
> > >
> > > Students exposed to FIGARO develop the philosophy "If FIGARO can't
> > > do this, I'll hack my routine" (FIGARO is very open to extension).
> >
> > This more shows that people get easily lazy and can make bad conclusions.
>
> Perhaps. Or perhaps not.

> Another analogy: People used to tinker with cars more 30 years ago,
> when cars were more "transparent". Now, with electronic fuel
> injection, there's not a lot you can do to tune a car, so people
> tend not to do it. As a result, people tend to know less about
> cars.

> This is _notwithstanding_ the fact that modern cars are, in fact,
> _superior_ to cars of old, and require far less maintenance. But
> the price of this superior technology is that _people_ now have
> inferior knowledge on cars *as systems*.

I'd argue for a somewhat different line of reasoning...

People don't twiddle with EFI because of the balance of cost and
benefit.

In the past, fiddling with carburetion involved sticking in a
screwdriver, and "fixing" a twiddle gone wrong pretty much involved
turning the screw back.

The "new classic" car repair scenario is to need to spend $300 to
replace some on-board computer. EFI may be "better," but appears
fairly fragile. And there's generally no "screw" to turn. Indeed,
the only "twiddling" generally available involves buying a new
after-market chip to plug in, and that's liable to cost more than $300
:-(.

This doesn't add up to a scenario that's at _all_ conducive to
experimentation. It happens to nicely play into the hands of the auto
makers; they'd be just as happy if the only people capable of fiddling
with their cars were their own "factory trained technicians," and
that's pretty much the result of things like EFI.

A perhaps comparable notion on the scientific research side is that of
high energy particle physics. Things have "progressed" to the point
where, in order to do new work, it takes an accelerator costing
multiple _BILLIONS_ of dollars.

This nicely connects back to ILISP as well as the notion of OS
research...

Rob Pike has commented that _novel_ OS research is pretty much dead in
his talk "Systems Software Research is Irrelevant;" the principles are
pretty similar to that with EFI, as the cost of building a usable
"system environment" that _isn't_ "essentially Unix" is so high as to
be daunting to just about anyone with less money than Bill Gates.

OSS "pundit", Eric Raymond, suggests that the EROS operating system
might be the successor to Linux; about the only way that is likely to
happen is if someone builds a Unix environment emulation sufficiently
precise that applications can't tell that they're _NOT_ running on
Linux. And if that's the case, then it sort of begs the question:
Isn't the environment _essentially_ the same as Unix then?

> > Science is not helped by reasoning in unfounded ways.
> Oh, I see. Thanks, I was confused there. :-)
>
>
> > It's not that I don't understand that sometimes people follow the
> > Clarke-ian "any sufficiently advanced technology is indistinguishable from
> > magic" thing, where they can start to revere a large component they don't
> > understand, I just don't see source availability as a fix for that.
>
> Well, if you re-read my post, you'll note that I said that _both_
> systems were, in fact, open source, but one had the property of being
> sealed somewhat more hermetically, making it more "black box like"
> than the other. And from this I drew a personal conclusion (shared by
> many scientists I know, BTW).
>
> I was _not_ starting a diatribe of closed vs. open sources, a discussion
> which does not interest me. I was talking about "black boxes" and the
> scientific method.

I'd suggest that even that's not quite orthogonal enough.

In order to find it worthwhile to try to apply scientific method,
there needs to be a reasonable expectation of finding something
interesting enough to justify spending the research resources on
it.

For the average person, the costs associated with fiddling with their
car have grown rather high, making it daunting and generally
unworthwhile.

The same is pretty much true for OS research; almost anything novel
enough to be academically interesting requires too much effort.

Actual _language_ efforts associated with CL are probably in the same
boat: the costs of getting standardization efforts back on track
probably outweigh the value of doing so, and there's not really anyone
prepared to undertake the effort. [You've got to have a reasonably
diverse set of folks willing to work on the committee, willing to pay
for things like travel costs, time not doing "more productive" work,
and such...]
--
(reverse (concatenate 'string "moc.enworbbc@" "enworbbc"))
http://www.ntlug.org/~cbbrowne/wp.html
"In computing, invariants are ephemeral." -- Alan Perlis

Erik Winkels

unread,
Sep 18, 2001, 8:37:49 AM9/18/01
to
cbbr...@acm.org writes:
>
> OSS "pundit", Eric Raymond, suggests that the EROS operating system
> might be the successor to Linux; about the only way that is likely
> to happen is if someone builds a Unix environment emulation
> sufficiently precise that applications can't tell that they're _NOT_
> running on Linux. And if that's the case, then it sort of begs the
> question: Isn't the environment _essentially_ the same as Unix then?

To the applications perhaps but not to the user. The user now has an
OS in which he can run legacy Unix applications and applications that
take advantage of the features of his new OS.


Erik.

rohan nicholls

unread,
Sep 18, 2001, 8:49:24 AM9/18/01
to
Eric Moss <eric...@alltel.net> wrote in message news:<3BA0D410...@alltel.net>...
Phew that was a long digression.

Okay from someone new to Lisp.

This is my background: I am a web application developer, and want to
deepen the scope of my development. So I thought I would learn some
new languages and start getting into how to make applications, and get
more into the server-side architecture to improve performance, etc. I
started with Java as a natural progression from the web, but found it
cumbersome, and I detest and refuse to use VB. I started to do some
research into languages and ended up at Lisp, which has been like coming
home. At the moment I am developing on Windows 2000.

This led to looking into different implementations, and these are my
impressions of both product and price.

CLISP - can't get a better price, but I am not very adept at the
command line, something I intend to learn, and I did not know enough to
integrate it with Emacs, though I might in the future.

Allegro CL - Nice product, obviously a heavy-duty tool, and good
documentation. I started by using their demo, and then wanted to get
a price, as I did not want to sink a lot of time into learning a system
without the possibility of using it in the future under my own steam
as a developer and, if I could sell my company on using it, then buying
a license. There are no prices on the website, so I emailed, and the
Allegro people are great, really helpful; they said they were not used to
receiving requests from people in my position, i.e. not a student, but
learning and possibly developing on my own. Then I saw their prices
and understood what they were talking about. Obviously Allegro is an
industrial tool with industrial clients who are already committed to
Lisp as a development platform. There is no way I would be able to
afford their license on my own ($4000 pro, $8000 ent + licensing
fees), and if I convinced an employer to adopt Lisp as a development
platform, hitting them right off with huge license fees and royalties
was not going to help my case. I understand that Allegro is an
industrial instrument and I am nowhere near needing all the power
they offer, but they are also completely out of my reach. I did,
however, appreciate the six-month trial, although when I realized that
using them professionally on my own was out of the question, I
stopped using them.

Xanalys LWW - I have not really looked at it, and I have the
personal edition. If I were to convince an employer to buy a license,
they would have no problem with anything under $1000 until it had
proved its worth, and then more after that; up to $2000, royalty-free,
you are in the clear, but otherwise I personally find that the whole
thing becomes expensive. The only problem with LWW is that they go from
the free personal edition to $700 in one jump. What might be nice is
something in the middle, with which I am allowed to produce and
distribute programs of limited functionality. I am willing to
pay $200-300 for something like that with an option to upgrade later
to the more heavy-duty version, and not have to worry about
distributing applications with it, or developing objects for a web
server that will be used in a professional setting to try them out,
and then upgrade as I come to be able to effectively use more and more
of the power offered in the more expensive versions.

Corman Lisp - which I am using at the moment and am in the process of
buying a license for - is very much a work in progress, but it is a
good-quality work in progress, and I am willing to fork out $200 for
how much I have been enjoying learning on it, and because it produces
fast compiled code and is a nice implementation. It is priced in a
range that I am willing to pay while in a learning stage, and I don't
have to worry about buying expensive licenses if I want to use it for
purposes that walk the blurry line between commercial and personal,
such as an object on my web server that has advertising, so I know I
can use it. At this early stage in what I hope will be a long Lisp
career, this product has gained my loyalty, and I intend, as I learn,
to contribute to the community and hopefully help fill in the holes
where needed.

I know that in the future I might end up using one of the other
products out there as I come to make use of the increased power;
however, knowing myself, it is just as likely that I would develop the
needed functionality myself and then release it to the community to
fill in one of those holes. And by the time I move on to Unix-style
operating systems, Corman will probably be ported over, as is intended.

So, personally, I know I would pay somewhere between $200-$300 US for
a product out of my own pocket and pretty much up front, such as now
while I am still learning. Then, as I have more need for and learn the
usefulness of more powerful tools and utilities, I would upgrade,
especially once I was being financially compensated, directly or
indirectly, for the work I have done with the tool; at that point it
becomes a business cost. But that initial customer loyalty should not
be underestimated, as it is a large effort to learn a new tool and its
particular libraries, utilities and idiosyncrasies, so, being creatures
of habit, we are more likely to buy an upgrade or develop the needed
functionality ourselves than to go out and learn, test and buy a whole
new tool.

I do appreciate that a lot of time and effort goes into these tools,
and I do not think they are necessarily overpriced for the power they
offer, but I would not call the asking prices approachable.

Yours-in-cheapness :)

Rohan

Frank Sonnemans

unread,
Sep 18, 2001, 11:27:38 AM9/18/01
to
Programming mostly for fun, I miss entry-level (price-wise) products for learning
CL. Having explored many languages, I have now turned to CL with the Franz trial
version. Not having to pay up front to learn a new language is important to
get going. I personally would not invest in a tool unless I have found it to
be of use.

Now I am at the point that I would like some applications for my personal
use. Here the trouble starts. I would not mind paying 200 USD for a product
which allows development of non-commercial (freely available) software.
Having to pay more when my apps turn commercial is ok, since it then becomes
a justifiable business expense.

The current price levels for the commercial implementations are just too high
for me. It seems they only want to address the high-end professional user
market segments. This is a shame because developing on the Windows platform
(using COM, GUI etc) the free alternatives don't seem to offer what is
needed. Consequently for Windows development other (more affordable) less
powerful languages are being used. This is not to the benefit of the
commercial vendors who would obviously benefit from a larger user base.


Kent M Pitman

unread,
Sep 18, 2001, 2:28:26 PM9/18/01
to
"Frank Sonnemans" <frank.s...@euronet.be> writes:

> Programming mostly for fun, I miss entry-level (price-wise) products for learning
> CL. Having explored many languages, I have now turned to CL with the Franz trial
> version. Not having to pay up front to learn a new language is important to
> get going. I personally would not invest in a tool unless I have found it to
> be of use.
>
> Now I am at the point that I would like some applications for my personal
> use. Here the trouble starts. I would not mind paying 200 USD for a product
> which allows development of non-commercial (freely available) software.
> Having to pay more when my apps turn commercial is ok, since it then becomes
> a justifiable business expense.

I think the Xanalys LispWorks Professional product is cheaper than the Franz
option. And it has royalty-free runtimes. Xanalys has a free Personal
edition that you can try out.

> The current price levels for the commercial implementations are just too high
> for me.

Corman CL is a commercial implementation well within the price range
you mention. So is Eclipse.

But, frankly, I think you have unreasonable expectations. Even commercial
C++, while it has cheap (e.g., $99) teaching editions, pretty much makes
you pay at least $500-$600 for a professional edition. LispWorks, at least,
is about $800 ($700 for academic purchasers), which is well in range for
a commercial professional language product.

> It seems they only want to address the high-end professional user
> market segments.

They? Like they're a big block all working together? Some go after the
low end. Check out Corman Common Lisp.

> This is a shame because developing on the Windows platform
> (using COM, GUI etc) the free alternatives don't seem to offer what is
> needed.

Well, you could always start your own Lisp company.

Come on, seriously, do you think they would be deliberately throwing away
business? They are each going after markets they think will keep
them in business. They are doing the best they can. If you can do better,
the burden is on you to do so. If not, then you should just decide if what
is there is worth it, and if not, you should use another language. Because
wishing just isn't going to make it so.

One reason free alternatives don't offer what is needed is that, secretly,
their authors don't develop for free. They need sources of income to pay for their
work. If you pay them for what you want, maybe they'll make more free stuff.
Or get together with your friends to raise the money. Otherwise, yes, you
have to wait on their time/generosity.

> Consequently for Windows development other (more affordable) less
> powerful languages are being used. This is not to the benefit of the
> commercial vendors who would obviously benefit from a larger user base.

How can you make this last statement? Do you think they haven't thought
of this? If they dropped their price, they might get a larger base but
they also might not meet their quarterly numbers and they might be out
of business.

Incidentally, I make similar statements all the time. I happen to believe
that dropping prices would help, too. I just try to
routinely prefix them with "I think" so it's clear I am not making a
statement of fact. I understand there is enormous commercial risk in
asserting such a thing as if it were a certainty, and though I may have
a strong belief on this, it doesn't make me right. These people are
gambling on their chosen strategies with their professional lives. It is fine
to assert what you personally think, but don't confuse that with
an assertion that you know yourself to be right. No one knows what
is right.

The test of what really is right is what a person or company who is
personally willing to assume the consequences of the risk is willing to
live with. Everything else is a fantasy.

They are not going to suddenly hit their foreheads and say "duh, we
should drop our prices. we never thought of that." or "oh my god,
there are people out there who would buy our product if we dropped the
price and we're losing them." That just isn't a correct description of
the problem.

Kent M Pitman

unread,
Sep 18, 2001, 3:03:41 PM9/18/01
to
rohan.n...@canoemail.com (rohan nicholls) writes:

> There is no way I would be able to

> afford [Franz] license on my own ($4000 pro, $8000 ent + licensing
> fees),

Did they phase out their personal edition? There is an irritating mismatch
between LispWorks and Franz in that the naming of corresponding editions used to be:

  Xanalys            Franz
  Personal (free)    Trial (free)
  Professional       Personal
  Enterprise         Professional
                     Enterprise

I don't know if this has been fixed at Franz, but you might ask if there
is a "personal" edition, and you might find it cheaper.

Personally, I inquired a couple of weeks ago about prices and no one sent
me anything. Had they done so, I might be able to speak from more
accurate knowledge. But I did try.

> Xanalys LWW - I have not really looked at it, and I have the
> personal edition. If I were to convince an employer to buy a license,
> they would have no problem with anything under $1000 until it had
> proved its worth, and then more after that; up to $2000, royalty-free,
> you are in the clear, but otherwise I personally find that the whole
> thing becomes expensive. The only problem with LWW is that they go from
> the free personal edition to $700 in one jump.

So? What kind of an employer thinks this is an out of line price? If
you can't afford this much for good tools, you have a very odd model.
If you're doing programming for less than $35K, you're being cheated.
The loaded cost (including taxes and office space and equipment) of
a developer is going to be at least 1.5 times the salary the programmer
sees, so that's over $50K, or about $1000/wk. $700 is then about three days' pay.
If you don't think having a good commercial tool will pay for itself
in short order by not making you waste three paid days in the way
unsupported freeware might, I think you're kidding or confused.
And most programmers I know are paid two to three times that, so
the payback time is only 1 or 2 days max. In some cases, the amount
will pay you back before you finish installation.
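
Spelled out at the REPL, using the figures above and a 52-week, five-day
working week (these are back-of-the-envelope numbers, nothing more):

(* 35000 1.5)          ; => 52500.0  loaded annual cost
(/ 52500.0 52)         ; => ~1010    loaded cost per week
(/ 52500.0 (* 52 5))   ; => ~202     loaded cost per working day
(/ 700 202.0)          ; => ~3.5     days needed to recoup a $700 tool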

> What might be nice is
> something in the middle, with which I am allowed to produce and
> distribute programs of limited functionality.

No one has ever come up with a serviceable model of limited functionality.
Also, this would injure the vendor who is already providing exceedingly
good quality as freeware to you when it's plain that every vendor could
desperately use that money. There is a limit to charity, and you are
asking for charity, nothing less.

I suspect the problem is that with a $200 version, people wouldn't use
it as a stepping stone but as an end to avoid ever upgrading. As such,
I can see a high-end vendor avoiding this. I think the reason Corman
can afford to do this is that he's making enough money to be able to
afford that risk; but larger companies might lose cash flow from a higher
end product if they offered one of these, and they cannot afford that risk.

But always, I come back to personal salary. If you can afford to implement
the thing yourself faster than what you get, you are paying too much. If
use of a program is not saving you an amount of time that is going to pay
for the cost, then you are paying too much. If you honestly think your
programming in C++ or whatever other language would not lose you productivity
more than 2-3 days over programming in Lisp, then you have proven you should
be programming in one of those languages because you can get a better price.
Otherwise, you're being asked a fair price.

If your company won't buy it, buy it yourself. I've done that kind of
thing many times in my career when I thought companies were stupid about
what they would and would not purchase. If it improves your productivity,
you'll get it back in salary increases. Plus it's probably tax-deductible
as a non-reimbursed business expense (check with your tax accountant).

> I am willing to
> pay $200-300 for something like that with an option to upgrade later
> to the more heavy-duty version, and not have to worry about
> distributing applications with it, or developing objects for a web
> server that will be used in a professional setting to try them out,
> and then upgrade as I come to be able to effectively use more and more
> of the power offered in the more expensive versions.

Xanalys Personal Edition is already useful enough to use for internal work
minus deployment. That should be adequate, IMO.

And it's a shame you haven't tried it. It's a total win of an interface
and a well-packaged system. How can you evaluate the price without having
seen what you get for the price? The suite of available browsers, the
unified interface, etc. make the productivity boost over Corman CL well
worth it if you can scrape up the dollars.

I understand some people just don't have the dollars, but I am busy starting
my own business right now myself and all I can say is that if you are
seriously committed to succeeding in business, you don't say "gee, maybe
I can get by without investing in what I really need", you say instead
"gee, maybe i'll have to go into hock and call it an investment because
i just can't do without these tools". Businesses have to invest to get
ahead. You don't get to break even money-wise from the start; that
comes later.

Erik Naggum

unread,
Sep 18, 2001, 3:52:53 PM9/18/01
to
* Kent M Pitman <pit...@world.std.com>

> They are not going to suddenly hit their foreheads and say "duh, we
> should drop our prices. we never thought of that." or "oh my god, there
> are people out there who would buy our product if we dropped the price
> and we're losing them." That just isn't a correct description of the
> problem.

It may not be the new customers they lose that are the biggest problem,
but the customers they already have, who pay maintenance fees; that revenue
may be much harder to recover than income from first sales followed by
probably much lower maintenance fees from every customer. It is also not a good
idea to piss off your old customers by suddenly dropping prices when they
have had to fork over sizeable amounts of money to get what others are
now getting at much lower prices. That is bad for repeat business and
customer relationships. To be able to get away with something like this,
one would essentially need a new brand name and a new market such that
one could keep the old market and price level while developing a new one,
which would be significantly more expensive than most people are willing
to contemplate, and I for one am not sure it could be worth it, even if
one could accurately estimate the number of lost customers and revenue in
the lower-priced market.

///

Peter Herth

unread,
Sep 18, 2001, 5:10:05 PM9/18/01
to
Kent M Pitman <pit...@world.std.com> writes:

> So? What kind of an employer thinks this is an out of line price? If
> you can't afford this much for good tools, you have a very odd model.
> If you're doing programming for less than $35K, you're being cheated.
> The loaded cost (including taxes and office space and equipment) of
> a developer is going to be at least 1.5 times the salary the programmer
> sees, so that's over $50K, or about $1000/wk. $700 is then about three days' pay.
> If you don't think having a good commercial tool will pay for itself
> in short order by not making you waste three paid days in the way
> unsupported freeware might, I think you're kidding or confused.
> And most programmers I know are paid two to three times that, so
> the payback time is only 1 or 2 days max. In some cases, the amount
> will pay you back before you finish installation.

That the price of LispWorks is very affordable in comparison to
other "professional" products, be it MS C++ or JBuilder, is undisputed.
But the problem is for all those who would like to use it in a non-
professional environment, where they are neither paid for it nor get
any other revenue from their software. For those (including me) it
would be great if there were a semi-professional version for a price
up to $200. That's about the maximum amount I would be willing to spend
for something that at the moment is a pure hobby.

> I suspect the problem is that with a $200 version, people wouldn't use
> it as a stepping stone but as an end to avoid ever upgrading. As such,
> I can see a high-end vendor avoiding this. I think the reason Corman
> can afford to do this is that he's making enough money to be able to
> afford that risk; but larger companies might lose cash flow from a higher
> end product if they offered one of these, and they cannot afford that risk.

Yes, it is probably a risk, but look for example at Troll Tech: their
Qt library is nowadays free for all non-commercial use, even on Windows.
And as far as I have heard, this license model seems to work, since based
on the free versions they have a huge user base. And all those hobby
programmers using the free version sooner or later become professional
programmers used to that product, so they are likely to use it in their
professional work too.

Corman Lisp has a very attractive price, but I am not willing to start using
Windows just because of Lisp :b

(Since Roger Corman reads this newsgroup: is there a chance for a Linux
version of Corman Lisp?)


--
Peter Herth
Dawn of the Ages
http://dawn.netcologne.de

cbbr...@acm.org

unread,
Sep 18, 2001, 7:44:08 PM9/18/01
to

The claimed merit of EROS is that it's designed to be really rather
secure.

But if you're creating a Unix-like environment on top of it, and have
all the execution paths set up to provide something sufficiently
Unix-like to support even the libraries and APIs that are a bit buggy
but still in use, then you pretty much have something as insecure as
Unix itself.

In effect, what happens here is about the same thing as creating a
Windows emulation atop Unix; if it's a good enough emulation to run
Outlook, then the emulated Windows environment is liable to be about
as vulnerable to macro viruses as the "real thing." It may not affect
native Unix stuff running in separate address spaces, but certainly
leaves its own holes open...

The more important point is that if all EROS is used for is to create
a _marginally_ more secure Unix variation, this doesn't particularly
justify its existence or the effort required to make that environment
work.

It's a lot of the reason why GNU Hurd hasn't progressed much:

- To the degree to which it's "just like Unix," the existing Linux
and BSD OSes are generally better (more robust/functional) than Hurd;

- There are things one might do with the multiserver capabilities of
Hurd that couldn't be done with traditional Unixes, but hardly anyone
does that because the install base of Hurd is so small, and it isn't
ready for "High Availability" requirements.

You could probably architect really cool DBMS, RPC, and probably
(obLisp) even distributed Lisp apps atop Hurd, but the OS isn't robust
enough yet to justify the effort. So interest languishes, to a great
extent...

> Erik.

Hmm... Another Erik...
--
(concatenate 'string "cbbrowne" "@cbbrowne.com")
http://www.ntlug.org/~cbbrowne/
Rules of the Evil Overlord #65. "If I must have computer systems with
publically available terminals, the maps they display of my complex
will have a room clearly marked as the Main Control Room. That room
will be the Execution Chamber. The actual main control room will be
marked as Sewage Overflow Containment." <http://www.eviloverlord.com/>

Kent M Pitman

unread,
Sep 18, 2001, 8:55:20 PM9/18/01
to
Peter Herth <he...@netcologne.de> writes:

> Kent M Pitman <pit...@world.std.com> writes:
>

> > So? What kind of an employer thinks this is an out of line price? ...


>
> That the price of LispWorks is very affordable in comparison to
> other "professional" products, be it MS C++ or JBuilder, is undisputed.
> But the problem is for all those who would like to use it in a non-
> professional environment,

But that wasn't what I was responding to.

The poster identified himself as having an employer. I'm just starting out
with my business and I'm sure the story of my tricky finances is typical of
new businesses. But I'd buy proper tools for me or anyone working for me if
it would improve throughput, even though it would mean going farther into
debt to do it. I can't really see how I can afford not to do this and be
doing competent business.

As to hobbyists, there are teaching editions and freeware. We are NOT short
of low-cost alternatives. Yes, some involve compromises, but what do you
expect from freeware?

Ola Rinta-Koski

unread,
Sep 19, 2001, 3:28:53 AM9/19/01
to
Peter Herth <he...@netcologne.de> writes:
> up to $200. That's about the maximum amount I would be willing to spend
> for something that at the moment is a pure hobby.
(...)

> Corman Lisp has a very attractive price, but I am not willing to start using
> Windows just because of Lisp :b

Nor should you. What is stopping you from using CMUCL or CLISP, both
of which are free? They are certainly up to "pure hobby" standards.
--
Ola Rinta-Koski o...@cyberell.com
Cyberell Oy +358 41 467 2502
Rauhankatu 8 C, FIN-00170 Helsinki, FINLAND www.cyberell.com

rohan nicholls

unread,
Sep 19, 2001, 5:46:19 AM9/19/01
to
Kent M Pitman <pit...@world.std.com> wrote in message news:<sfw7kuw...@world.std.com>...
> rohan.n...@canoemail.com (rohan nicholls) writes:
I would like to apologise for the fact that I was not clear about a
couple of things. I am not criticising the prices being asked by the
companies but responding to the question of what I would pay and what
I can afford, which at the moment is not much. I am learning the
language, which I think I made clear at the beginning, and have not
yet reached a level where Lisp is useful to me, so the money I am
putting out is on a personal-research basis.

> > Xanalys LWW - I have not really looked at it, and I have the
> > personal edition. If I were to convince an employer to buy a license,
> > they would have no problem with anything under $1000 until it had
> > proved its worth, and then more after that; up to $2000, royalty-free,
> > you are in the clear, but otherwise I personally find that the whole
> > thing becomes expensive. The only problem with LWW is that they go from
> > the free personal edition to $700 in one jump.
>
> So? What kind of an employer thinks this is an out of line price? If
> you can't afford this much for good tools, you have a very odd model.
My employer would have no problem buying a license at these prices;
the $4000-$8000 for Allegro might give them pause, but at the moment there
is no reason for them to do so.

> If you're doing programming for less than $35K, you're being cheated.
> The loaded cost (including taxes and office space and equipment) of
> a developer is going to be at least 1.5 times the salary the programmer
> sees, so that's over $50K, or about $1000/wk. $700 is then about three days' pay.
> If you don't think having a good commercial tool will pay for itself
> in short order by not making you waste three paid days in the way
> unsupported freeware might, I think you're kidding or confused.
> And most programmers I know are paid two to three times that, so
> the payback time is only 1 or 2 days max. In some cases, the amount
> will pay you back before you finish installation.
>
> > What might be nice is
> > something in the middle, with which I am allowed to produce and
> > distribute programs of limited functionality.
>
> No one has ever come up with a serviceable model of limited functionality.
> Also, this would injure the vendor who is already providing exceedingly
> good quality as freeware to you when it's plain that every vendor could
> desperately us that money.
> There is a limit to charity, and you are
> asking for charity, nothing less.
>
> I suspect the problem is that with a $200 version, people wouldn't use
> it as a stepping stone but as an end to avoid ever upgrading.
Maybe some would, but if I find I am getting good value and am using
the product and its features, I will upgrade.

>As such,
> I can see a high-end vendor avoiding this. I think the reason Corman
> can afford to do this is that he's making enough money to be able to
> afford that risk; but larger companies might lose cash flow from a higher
> end product if they offered one of these, and they cannot afford that risk.
>
> But always, I come back to personal salary. If you can afford to implement
> the thing yourself faster than what you get, you are paying too much. If
> use of a program is not saving you an amount of time that is going to pay
> for the cost, then you are paying too much. If you honestly think your
> programming in C++ or whatever other language would not lose you productivity
> more than 2-3 days over programming in Lisp, then you have proven you should
> be programming in one of those languages because you can get a better price.
> Otherwise, you're being asked a fair price.

Again I am not saying the price is not fair for what you receive, and
I will try out LW, and see what you mean, and if I get to the point
where I am using it professionally I will very happily pay the fee out
of my own pocket. Thanks for the criticism, it seems justified and I
never meant to get your back up about it.

Good luck with the new venture.

Kent M Pitman

unread,
Sep 19, 2001, 6:17:45 AM9/19/01
to
rohan.n...@canoemail.com (rohan nicholls) writes:

> Kent M Pitman <pit...@world.std.com> wrote

[...the text I wrote removed here...]


>
> I am not saying the price is not fair for what you receive, and
> I will try out LW, and see what you mean, and if I get to the point
> where I am using it professionally I will very happily pay the fee out
> of my own pocket. Thanks for the criticism, it seems justified and I
> never meant to get your back up about it.

It's nothing personal. I just sometimes see people here with no sense of
direction about how to think about the value of things, and I think it's
an important skill -- as important as any technical skill. I'm glad you
didn't take my remarks badly because I really wasn't trying to start a
fight--just to offer some things for you (and people in general) to think
about. Especially since the newsgroup seems superheated of late,
I should probably be more careful.

> > I understand some people just don't have the dollars, but I am
> > busy starting my own business right now myself and all I can say
> > is that if you are seriously committed to succeeding in business,
> > you don't say "gee, maybe I can get by without investing in what I
> > really need", you say instead "gee, maybe i'll have to go into
> > hock and call it an investment because i just can't do without
> > these tools". Businesses have to invest to get ahead. You don't
> > get to break even money-wise from the start; that comes later.
>
> Good luck with the new venture.

Thanks!

Tim Bradshaw

unread,
Sep 19, 2001, 5:51:32 AM9/19/01
to
* Kent M Pitman wrote:
> But, frankly, I think you have unreasonable expectations. Even commercial
> C++, while it has cheap (e.g., $99) teaching editions, pretty much makes
> you pay at least $500-$600 for a professional edition. LispWorks, at least,
> is about $800 ($700 for academic purchasers), which is well in range for
> a commercial professional language product.

Another thing to bear in mind is that it is not clear that all
`commercial' language products are actually money-making. Many
vendors of these things are also OS or platform vendors, and they have
an obvious interest in undercharging for the language system to get
people to use their OS. Of course, this is pretty questionable when
the OS vendor is a monopolist, but given that everyone's favourite
monopolist OS vendor doesn't seem to care too much about other
questionable practices I would not be surprised to discover that their
language products are not a huge profit centre.

If you restrict your attention to vendors who do make their money only
from language products I think you might find that costs are typically
in the thousands of dollars per license.

--tim

Peter Herth

unread,
Sep 19, 2001, 4:35:35 PM9/19/01
to
Ola Rinta-Koski <o...@cyberell.com> writes:

> Nor should you. What is stopping you from using CMUCL or CLISP, both
> of which are free? They are certainly up to "pure hobby" standards.

Well... I used CMUCL quite a bit. It's a really nice system (the compiler
seems to be one of the best around). But there are some things I started
to miss: standalone creation for easy deployment of apps created with it
(that's the least problem, I could just deploy the source), but more the
missing GUI capabilities. I know there is CLM but never found much
documentation for it, and I saw references to cl-tk, but the download
site was no longer existent (is there perhaps an unknown location where
to obtain it?). CMUCL's usability for me would already be greatly
extended if it had a readline facility to save typing...

Peter

Peter Herth

unread,
Sep 19, 2001, 4:49:29 PM9/19/01
to
Kent M Pitman <pit...@world.std.com> writes:

> But that wasn't what I was responding to.

Sorry if I mixed up the posts a bit.

> The poster identified himself as having an employer. I'm just starting out
> with my business and I'm sure the story of my tricky finances is typical of
> new businesses. But I'd buy proper tools for me or anyone working for me if
> it would improve throughput, even though it would mean going farther into
> debt to do it. I can't really see how I can afford not to do this and be
> doing competent business.

Yes, of course; for professional uses the prices are without doubt very fair,
and it would be very shortsighted to save some pennies on the tools you are
using; that mostly generates high long-term costs.

> As to hobbyists, there are teaching editions and freeware. We are NOT short
> of low-cost alternatives. Yes, some involve compromises, but what do you
> expect from freeware?

As much as I get from competing freeware providers :b. Of course it is
difficult for a Lisp company to compete with Sun, for example, in giving
away the language products they develop. Unlike Sun, they don't live from
selling servers but from software products. And of course I don't have any
right to expect someone to give away parts of their prime product.
Yet the software world has developed at a quick pace in the last few
years, and an incredible amount of good-quality software can be
obtained for free. At the moment I am toying around with the combination of
Qt Designer (free for non-commercial use, as is Qt itself) and Python (the
language, not the CMUCL compiler), a combination that seems very well
suited for quick GUI development. If the free/trial version of the
commercial Lisps should serve as a base to attract new Lisp users,
looking a bit at the competition outside Lisp couldn't hurt, could it?

Peter

Daniel Barlow

unread,
Sep 19, 2001, 4:52:43 PM9/19/01
to
Peter Herth <he...@netcologne.de> writes:

> to obtain it?). CMUCL's usability for me would already be greatly
> extended if it had a readline facility to save typing...

CMUCL becomes infinitely more usable if you use it with Emacs and
ILISP, so you can use the normal Emacs comint-mode keystrokes (C-up,
C-down, etc.) for command history and line editing.

Of course, this depends on your being able to tolerate Emacs. I
acknowledge that some people are allergic to it.


-dan

--

http://ww.telent.net/cliki/ - Link farm for free CL-on-Unix resources

Thomas F. Burdick

unread,
Sep 19, 2001, 6:36:20 PM9/19/01
to
Peter Herth <he...@netcologne.de> writes:

> Ola Rinta-Koski <o...@cyberell.com> writes:
>
> > Nor should you. What is stopping you from using CMUCL or CLISP, both
> > of which are free? They are certainly up to "pure hobby" standards.
>
> Well... I used CMUCL quite a bit. It's a really nice system (the compiler
> seems to be one of the best around). But there are some things I started
> to miss: standalone creation for easy deployment of apps created with it
> (that's the least problem, I could just deploy the source),

I'm talking out of my ass here, but I think this would probably be a
weekend's work. Automate the dumping and gzip'ing of an image; add
zlib to the C level so it can load a gzip'ed image.
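
For the dumping half, a rough sketch using CMUCL's EXT:SAVE-LISP; MAIN here
is a made-up entry function, and the exact keyword arguments are worth
checking against the CMUCL manual.

;; Dump the current image, arranging for MAIN to run on startup.
(ext:save-lisp "myapp.core" :init-function #'main)
;; ...then compress the result from the shell:
;;   gzip myapp.core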

> but more the missing GUI capabilities. I know there is CLM but never
> found much documentation for it, and I saw references to cl-tk, but
> the download site was no longer existent (is there perhaps an
> unknown location where to obtain it?).

You might try looking through
<http://ww.telent.net/cliki/Graphics%20Toolkit>. Personally I'm happy
using Garnet; it satisfies my needs and works on X11 and Mac, which
are the only platforms I'm likely to use. If I'm stuck on a Windows
box with no X server, I've got bigger problems :)

Kent M Pitman

unread,
Sep 19, 2001, 6:48:03 PM9/19/01
to
Peter Herth <he...@netcologne.de> writes:

> If the free/trial version of the
> commercial Lisps should serve as a base to attract new Lisp users,
> looking a bit at the competition outside Lisp couldn't hurt, could it?

I'm not sure how to read this statement. The free/trial versions
already do this as well as they can.

I thought there was a proposal on the table to drop the price for an entry
version. But as Erik Naggum (I think it was) points out, yes, it can hurt.
Existing customers can be annoyed that you suddenly offer a lower-priced
version of something they just paid more money for.

You could fail to win over new customers (the experiment might fail),
you could create a situation where people who would have paid $800 pay
only $200 and never upgrade, and you can create a situation where people
who already paid $800 are annoyed they didn't get in on the $200 deal.
There are a lot of ways to lose. So the strict answer to the "could"
question is: Yes, there are many ways any change of pricing policy can
hurt vendors.

The whole point of a free market is that any rational player is naturally
motivated to make decisions that will improve their market share. If you
see not one but many vendors not going after the same "apparent market
opportunity", there could be good reason.

The market does not assure that people won't make mistakes, of course. But
in general it creates an incentive for mistakes to get corrected. When they
consistently don't, you have to at least consider the possibility that perhaps
the "apparent mistake" is not an "actual mistake". That's not the only
possibility; but it must be considered.

If your only point was that the real competition with Lisp is not
between Lisp vendors but between any given Lisp vendor and vendors of
other languages, this is also not a new observation. It's true that
Lisp vendors used to view each other as the problem, but this shifted
majorly around 1988-1992; it was very apparent in the way that ANSI CL
design participants approached the language. They began in 1986 with
a great deal of suspicion of each other and ended in 1994 with a sense
that they must work together against the common enemy of other
languages. That's one reason that the employees of the various
vendors are so cordial in public--there's SOME degree of competition
among us, but all in all, we've long ago realized we're better off
helping than hindering one another.

Alexander Kjeldaas

unread,
Sep 19, 2001, 8:31:21 PM9/19/01
to
Kent M Pitman wrote:

> "Frank Sonnemans" <frank.s...@euronet.be> writes:
>> Consequently for Windows development other (more affordable) less
>> powerful languages are being used. This is not to the benefit of the
>> commercial vendors who would obviously benefit from a larger user base.
>
> How can you make this last statement? Do you think they haven't thought
> of this? If they dropped their price, they might get a larger base but
> they also might not meet their quarterly numbers and they might be out
> of business.
>
> Incidentally, I make similar statements all the time. I happen to believe
> that dropping prices would help, too. I just try to
> routinely prefix them with "I think" so it's clear I am not making a
> statement of fact. I understand there is enormous commercial risk in
> asserting such a thing as if it were a certainty, and though I may have
> a strong belief on this, it doesn't make me right. These people are
> gambling on their chosen strategies with their professional lives. It is
> fine to assert what you personally think, but don't confuse that with
> an assertion that you know yourself to be right. No one knows what
> is right.
>

As a fairly new Lisp user (I have used it on and off for a few years, but
never for long), I would say that ACL's pricing seems right to me, and
makes a lot of sense. I think that the Linux distributions with all their
free programming language implementations means a few things to niche
programming language companies:

First, it means that the $199 market will mostly disappear. A no-hassle
free implementation of the language exists. This is what users will prefer
to start with, not a $199 language implementation (and I think that more
and more programmers will move to Linux, so this trend will grow stronger).
The free implementations usually have the advantage that the user community
has a passion for them, which gives me as a new user a good feeling about
the implementation. A $199 implementation can't offer that.

Second, it means that lisp companies need to be aware and adapt to the fact
that a lot of their users will initially have experience with a free tool.
This means that compatibility with this tool can be important. As an
extreme example: if you try to sell a C compiler to Linux users,
you really want to work hard to implement most of the GNU extensions in
your compiler. Intel has done this.

I think ACL is doing the right thing. They don't need the $199 users.
While we are newbies, people like me will be able to experience the
language without having to pay anything, and the lisp companies don't have
to deal with all the newbie questions that we ask since there will be a lot
of free help from the user community. I'm not willing to spend $199 on every
language that I want to learn, and I don't want to go through the hassle
of getting a trial version (although I've done that with ACL - it was used
at my university) when I can just apt-get[1] it in a few seconds.
Actually, after I stopped using Windows, my first encounter with a language
has _never_ been a for-pay commercial implementation (exception: Visual
Basic for Applications - but it kind of comes with Office, you don't pay
anything extra for it, and I was forced ;-).

However, when working at my company developing software that generates $$,
getting the _best_ tools is important, and often the language
implementation needs of businesses are somewhat different from what makes a
free lisp-implementation hacker tick. My employer would probably shell out
$4000 for a tool that made me more productive (if not, I'm not earning
enough - my productivity is obviously not worth much).

Also, knowing that a company like ACL has survived with a business model
where they charge this much for the product also makes Lisp a safer choice
as an implementation language. People are not stupid (especially people who
program in niche languages :-). If they pay $8000 for ACL they want
something in return. Probably a top-notch implementation, probably very
good support. By knowing that people are willing to pay $8000 for ACL I
can reasonably assume that there is somewhere I can go if I have trouble.


astor

[1] apt-get is a Debian/Linux command for installing new packages.

--
Alexander Kjeldaas Mail: as...@fast.grukk.no ('emove grukk)

Kent M Pitman

unread,
Sep 19, 2001, 10:00:57 PM9/19/01
to
Alexander Kjeldaas <astor...@fast.no> writes:

I found this analysis of yours quite interesting and well-reasoned.
I wanted to re-quote it to give people a second chance to read it.
I've attached some related remarks at the end.

> [1] apt-get is a Debian/Linux command for installing new packages.
>

> However, when working at my company developing software that generates $$,
> getting the _best_ tools is important, and often the language
> implementation needs of businesses are somewhat different from what makes a
> free lisp-implementation hacker tick. My employer would probably shell out
> $4000 for a tool that made me more productive (if not, I'm not earning
> enough - my productivity is obviously not worth much).
>
> Also, knowing that a company like ACL has survived with a business model
> where they charge this much for the product also makes Lisp a safer choice
> as an implementation language. People are not stupid (especially people who
> program in niche languages :-). If they pay $8000 for ACL they want
> something in return. Probably a top-notch implementation, probably very
> good support. By knowing that people are willing to pay $8000 for ACL I
> can reasonably assume that there is somewhere I can go if I have trouble.


As long as we are talking price, I think we need to consider that the
dimensionality of price is not 1. There are multiple potential axes
to consider. It's not just the price per seat, it's also price per
runtime and sometimes even price per runtime/year, and the answer may
not be a fixed constant but rather some kind of "function" on some of
the axes.
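
Purely as an illustration of that multi-axis view (every name and number
below is hypothetical, not any vendor's actual price list), the shape of
such a "function" might be:

  ;; Illustrative sketch only: total cost of ownership across several
  ;; axes rather than a single per-seat price.  All defaults are made up.
  (defun total-cost (&key (seats 1) (seat-price 4000)
                          (runtimes 0) (runtime-royalty 500)
                          (years 1) (yearly-license-fee 0))
    (+ (* seats seat-price)
       (* runtimes runtime-royalty years)  ; per-runtime, possibly per year
       (* years yearly-license-fee)))

  ;; e.g. (total-cost :seats 2 :runtimes 100 :years 3) => 158000
  ;; -- the runtime and renewal terms, not the seat price, dominate,
  ;; and they are the hardest to bound in advance.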

There are three controversial issues in the ACL pricing scheme. One is
the cost of the development seat. I agree with your remarks here that
this is a non-issue. It definitely locks Franz out of a lot of sales, but
they seem content to make that decision, and to take the higher price from
the customers that remain. No problem there.

The second controversial point is the issue of royalty-free runtimes.
ACL doesn't have them. Xanalys charges for runtimes in its Enterprise
Edition (and maybe for the Professional Editions under non-Linux Unix, I'm
not sure), but not for its Professional Windows and Linux editions.

Naturally, every company evaluates things differently, and perhaps
some are willing to deal with the runtime royalties in exchange for
whatever level of quality is offered, but personally I find it nearly
impossible to do any kind of internal predictions about product costs
in the light of runtime royalties. At least Xanalys publishes a fixed
price and you know you'll be bargaining down from there, so that
bounds the risks. Franz does not publish the cost of runtimes. I
worked at a prior company where this revelation was an instant no-go
for further use of Franz at all, and for my own start-up company, I
have made the decision likewise. No matter how much quality Franz may
charge, I *must* be able to second-source the product, so it has to be
the case that the level of support I require of my product is as good
as what some other Lisp vendor offers. The situation is analogous to
using an operator that has speed O(1) in one implementation and is
only O(n) in another, but that is critical to the product's pricing.
I can't build my system on the O(1) and just assume I'll slap in the
O(n) operator if I have to switch implementations. I must be able to
predict that the "big-P" (my ad hoc price analog of the big-O speed
modeling operator) characteristics of the product are going to be
upheld. It's as important to build a product of predictable pricing
as it is to build a product of predictable speed.

When I was at Harlequin (before it sold out to Global Graphics and
became Xanalys), I saw then-Harlequin's own sales people tell
customer prospects they should buy the Professional Edition rather
than the Enterprise Edition almost in the same breath as they identified
the purchase opportunity in the first place ("We have an Enterprise
Edition but you probably don't want that because it will cost you for
runtimes."), I think because they were just plain tired of hearing people
react badly to the issue of runtime royalties, and they wanted to focus
the discussion on a sale they thought would really succeed.

When considering the Franz/Xanalys product options, I tend to steer people
toward the Xanalys Professional edition because it's the only one I can
comfortably recommend both from a technical and business standpoint without
further qualification about royalties. I'm sure there are places where
both the Xanalys Enterprise and the various Franz options are reasonable
in spite of the royalty, but I don't want to have steered someone down that
path without warning them of the issues and have them come back angry later
when they figure things out. So unless I have more time than I usually
do to chat about the issues with a person, that fact becomes the deciding
factor in which option I recommend.

There is a third and utterly baffling issue in the old Franz license I
have seen. (I apologize for speaking from two-year-old information
but I have recently asked Franz for an up-to-date contract so that I
can decide if things have since changed for the better, and possibly
even make a purchase, but after several weeks I have received no reply
to my request from their sales person.) The third point is
"license timeout". As far as I understand, the Xanalys product gives
you license to deliver indefinitely. The Franz product gives you
license to distribute only for a year, and you have to renew your
license at whatever price they have the following year. The thing I
find baffling about this is that they cannot (or did not at the time
my information was current) guarantee that the price will not go up.
Indeed, I inquired about this very point and the sales person told me
"we can't guarnatee that our costs won't go up". The only thing is,
we're not talking about the cost of them providing anything at
all--we're only talking about the price of my continued right to use
what they already provided me and what it will cost them nothing to
continue to let me use--they'll simply be getting income. So when the
talk about their prices going up, it seems irrelevant. This condition
simply baffles me.

I emphasize that I'm not saying anything underhanded is going on. I
like the folks at Xanalys and at Franz and have the very best wishes
for them to survive and thrive. I also like their products from a
technical standpoint. But since we're all sharing information about
what makes and breaks our purchases, I figured I'd share my concerns
in case it is useful either to these organizations in revising the
pricing policies I don't like or to other organizations in making sure
they avoid the pricing policies I don't like.

Further to this, on reflection, I want to express some personal
feeling of guilt about remarks I made upthread, criticizing people's
attitudes about pricing. It's not that I don't hold the feelings I
expressed, but there are appropriate forums for expressing feelings
and others where it's probably best not to. And in a thread where
someone is effectively soliciting personal opinions, I probably set a
bad example by jumping on someone for just offering data. In the
context of this thread and ones like it, we should probably try hard to
sit on our hands and listen to what people have to say so that more people
will feel comfortable coming out and expressing themselves and giving
us the good data that we very much need to properly understand
perceptions, whether we think those perceptions are justified or not.
So I'm sorry for any degree to which I've contributed to the sense that
people have to defend their opinions; I hope it won't stop others from
opining freely.

cbbr...@acm.org

unread,
Sep 19, 2001, 11:16:08 PM9/19/01
to
Kent M Pitman <pit...@world.std.com> writes:
> The whole point of a free market is that any rational player is
> naturally motivated to make decisions that will improve their market
> share. If you see not one but many vendors not going after the same
> "apparent market opportunity", there could be good reason.

Hmm... I would think that a rational player is naturally motivated to
make decisions that will improve their _profitability_.

Increasing the share of the market, or perhaps even the _size_ of the
market, represent interesting ways of improving profitability. But
they are not _guaranteed_ to do so.

It might, at times, prove more beneficial to _contract_ one's market
share, if that diminishes service costs more than it diminishes sales.

The classic theory that gets espoused is that if Franz (to make up a
name for the moment) were to cut license prices from $4000 to
[something a whole lot less], they would get greater market share.

It is not at all obvious that this would lead to _greater
profitability_.

I seriously doubt that cutting a price from $4000 to $1000 would
result in vast numbers of folks buying licenses; those feeling daunted
by $4000 would still be likely to find $1000 to be "too much." (I'm
in that boat, personally...)

The only way that ACL licenses would start flowing outwards like
hotcakes from an unfortunate manufacturing accident at McDonald's
breakfast foods division would be if the price fell to something
atrociously low, like $100.

But in order for Franz to maintain equivalent _sales_, they'd have to
be selling well over 40x as many copies, and that ignores the fact that
CDs and even manuals, while not _all_ that expensive, would start consuming
a considerable chunk of that $100. Throw in that Franz would need lots of
staff to respond to email and service calls, and would have bigger
shipping bills, and it is more likely that to maintain similar
profitability with a price change from $4000 to $100 would require
that sales go up by a factor of somewhere between 50x and 100x.

I just don't see that happening....
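
To make the back-of-the-envelope arithmetic concrete, here is a tiny
sketch; the $30 per-copy cost for media, manuals and support overhead is
a made-up assumption:

  ;; Sketch: how many times more copies must be sold after a price cut
  ;; to keep (margin per copy) * (volume) constant?
  (defun required-sales-multiplier (old-price new-price per-copy-cost)
    (/ (- old-price per-copy-cost)
       (- new-price per-copy-cost)))

  ;; (required-sales-multiplier 4000 100 30) => 397/7, i.e. roughly 57x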
--
(concatenate 'string "cbbrowne" "@acm.org")
http://www.cbbrowne.com/info/internet.html
"Consistency is the single most important aspect of *ideology.*
Reality is not nearly so consistent." - <cbbr...@hex.net>

Coby Beck

unread,
Sep 19, 2001, 11:30:05 PM9/19/01
to

"Kent M Pitman" <pit...@world.std.com> wrote in message
news:sfw66ae...@world.std.com...

> and others where it's probably best not to. And in a thread where
> someone is effectively soliciting personal opinions, I probably set a
> bad example by jumping on someomone for just offering data. In the
> context of this thread and ones like it, we should probably try hard to
> sit on our hands and listen to what people have to say so that more people
> will feel comfortable coming out and expressing themselves and giving
> us the good data that we very much need to properly understand
> perceptions, whether we think those perceptions are justified or not.
> So I'm sorry for any degree to which I've contributed to the sense that
> people have to defend their opinions; I hope it won't stop others from
> opining freely.
>

In the spirit of the above (ie expressing my perceptions, unjustified or not
: ) I would like to contribute my view.

I was very recently responsible for recommending which lisp to purchase for my
employer to build a specialized piece of a larger product. The product is
"middleware" designed to integrate distributed business processes across
networks pieced together by mergers, etc. - i.e. things *not* designed to work
together.

From my pov the choice was right away between Allegro and Xanalys. (My
apologies to any other fine implementations I may have unfairly dismissed.)
I have developed on both in the past. I am a Windows child by birth, if not by
choice. I have used Emacs enough to realize its value, enough to get work done
in it, not enough to be good at it. My own perception was that Allegro is a
better, slicker tool (admittedly this is because of bells and whistles, not any
intimacy with compiler efficiency/correctness). Lispworks' poor documentation
is the single biggest negative for me.

I said to my boss, I would like Allegro, but have heard it is prohibitively
expensive. The product we will be marketing licenses for *very* big bucks
(for something more in the shrink-wrap market than consulting), so the cost of
developer seats is not an issue. I believed Allegro's pricing for runtimes was
based on a percentage of what we would sell our product for. That raised a lot
of red flags right away. My boss wants us to have the best tools, so we
inquired at both places. Allegro needed a lot of details on what we would use
their product for, so much so that they had to send over a signed
non-disclosure agreement. This did not sit well with anyone either. We still
did not rule them out; maybe the percentage would be very low. It wasn't; it
was out of the question, plus there were yearly fees involved. I would not
have recommended buying it myself. Xanalys was well within reason.

I was left very disappointed, not because Allegro was ruled out, but because of
the way they view their customer relationship. I think of a development
environment as a tool, so right away I have a lot of trouble accepting the idea
that the price I pay will depend on what I do with it. I have no problem
paying a lot more for high quality, but it is not relevant how much I will make
using it. But I could get over that part. Worse for me however is Franz's
explicitly stated view that their customers are "value added resellers" of
Franz technology. This struck me personally as arrogant and only justified if
Common Lisp were their own invention. I develop lisp code using a development
environment, I am not modifying and enhancing someone else's software. I know
lisp is different from other languages, I use eval and compile in my code, so I
could actually accept this view if Lisp did belong to them. But it does not
and I do not like their attitude.

We use Lispworks; the documentation is very poor but the customer support is
excellent. The debugger is not as flashy but all of the tools are there; I can
only blame myself for not being very proficient yet. They have runtime fees
for us, but it is a flat rate and somewhere between 30 and 50 times cheaper
than Franz. I have no regrets about our choice.

I would never have agreed to Franz's terms. Maybe as part of a one-off
consulting project, but not on a percentage per sale + large fee per year
basis.

Coby

--
(remove #\space "coby . beck @ opentechgroup . com")


Kent M Pitman

unread,
Sep 20, 2001, 1:00:28 AM9/20/01
to
"Coby Beck" <cb...@mercury.bc.ca> writes:

> We use Lispworks [...] The debugger is not as flashy [...]

Are you sure you're using the right debugger? There are two. When you end
up in a teletype debugger, go into the Debug menu, pull down to Listener,
and track across to Start GUI Debugger. I've pleaded with them to make this
more prominent. It's too easy to get into the unflashy one and too hard to
get to the flashy one, which I find almost as useful as the Lisp Machine
debugger and find it hard to believe you would find it so inadequate as
to require comment of any sort. By the way, there are two inspectors, too.
The INSPECT function calls the cruddy one, but the Values->Inspect pulldown
gets you the flashy one, as does double-clicking on items in the flashy
debugger...

There are also a bunch of other browsers in the Works->Tools->xxx menu.

Roger Corman

unread,
Sep 20, 2001, 2:32:55 AM9/20/01
to
On 18 Sep 2001 23:10:05 +0200, Peter Herth <he...@netcologne.de> wrote:

>(Since Roger Corman reads this newsgroup: is there a chance for a Linux
>version of Corman Lisp ?)
>
>

Currently we do not have the resources to look at other platforms. However I
would very much like to support at least Intel Linux versions at some time if we
can afford to expand our development efforts. The port should not be too
expensive.

Roger

Roger Corman

unread,
Sep 20, 2001, 3:23:45 AM9/20/01
to
On Tue, 18 Sep 2001 19:03:41 GMT, Kent M Pitman <pit...@world.std.com> wrote:

>I suspect the problem is that with a $200 version, people wouldn't use
>it as a stepping stone but as an end to avoid ever upgrading. As such,
>I can see a high-end vendor avoiding this. I think the reason Corman
>can afford to do this is that he's making enough money to be able to
>afford that risk; but larger companies might lose cash flow from a higher
>end product if they offered one of these, and they cannot afford that risk.

I believe a product in the low-end price range is important, to attract people
who cannot afford the big bucks, possibly because they are using lisp as a hobby
or only for occasional purposes. Most computer language implementations do
not require run-time licenses, and I believe most potential customers will be
put off by them.

I chose to price Corman Lisp in the $200 range because I believe that is an easy
amount to budget (at least if you live in US or Europe) and does not require a
lot of justification. The goal is to attract a lot of people to use Common Lisp.
Personally, I feel more comfortable using a language for development when I have
full source code--at the very least source for the library functions. Sun Java
SDKs provide this, many C and C++ compilers provide this. I think this currently
sets Corman Lisp apart from Franz and Xanalys. It may sound weird, but a lot of
C and C++ programmers don't trust a system like a lisp system. They don't
understand how it works. I used to suffer from this. By having the source code,
they can, if they choose, understand the system to a deep level. And this comes
back as an advantage when sophisticated users make improvements to the system
and contribute them back to the product code stream. Many of the Corman Lisp
enhancements in the last year have been contributed by users.

I believe Corman Lisp could have been given away free, or could be sold at
several times the current price. I have chosen a combination of free (for
personal use) and a low price (for other use) because by at least charging a
small amount some income is generated which is invested back into development
expenses. I consider it a very long term investment. It would be nice if
computer languages were big money makers and could easily generate lots of
revenue, but they don't. The cost of developers learning a new language and
development environment are large, even for a free product. Developers
understand this, and would often pay a significant amount for a better product
if they see a clear need. However, lisp has to wean programmers from other
languages in order to grow its market. I don't want to see a stagnant lisp
market--I think is should grow at least with the overall market for development
tools. The fact is, most other general purpose languages have free, or very
inexpensive, implementations. For lisp to gain market share, it must be
competitive in this way. Corman Lisp will not become a profitable enterprise
until lisp gains market share. I believe it will, in the coming years. I think
we will see a rise in the use of dynamic languages. Among these languages, there
is not a clear leader. I think lisp is very strong here and should grow with
this trend.

I understand that a company like Franz can probably not risk dramatically
lowering their prices in an effort to gain market share. Their conservative
approach to the market has kept them in business all this time, which is much
better than going out of business. We all benefit from their apparent stability.

I understand Rohan's logic completely, because it pretty much is how I would
feel in that same situation. It's difficult to sell a company on a different
language like lisp, and much more difficult if the pricing and licensing model
appears prohibitive. Many of us in this group take it for granted that lisp
provides much greater productivity, and justifies a high price. However, I don't
take that for granted. I know *I* am more productive, and prefer it. But I
am certain that many programmers would be somewhat resistant to it, and it may
not mesh with other languages and environments as easily as we would hope. We
need to remove any barriers to lisp use that we can, and a high price is
definitely a barrier.

Franz used to offer their ACL for Windows (the older version 3) for under $1000.
I am curious why they chose to increase the price dramatically. I understand the
new product is completely different, and more powerful (and fully compatible
with the Unix versions) but it seems to me that PC pricing has moved toward
lower prices in the intervening years, not higher. This is pushed by lowered
hardware costs and increasing market size, among other things. And, I presume,
the proliferation of free software of all types.

I am happy to see that both Franz and Xanalys have been increasing the
capabilities of their trial versions, and this certainly helps the lisp market.
For a potential lisp user to be able to download several powerful
implementations of the same language, even just as trials, sets common lisp
apart from quite a few other languages.

Roger

Kent M Pitman

unread,
Sep 20, 2001, 4:37:50 AM9/20/01
to
ro...@corman.net (Roger Corman) writes:

> On Tue, 18 Sep 2001 19:03:41 GMT, Kent M Pitman <pit...@world.std.com> wrote:
>
> >I suspect the problem is that with a $200 version, people wouldn't use
> >it as a stepping stone but as an end to avoid ever upgrading.
>

> I believe a product in the low-end price range is important,
> to attract people who cannot afford the big bucks,

Well, I absolutely think it's nice to have and nice to have you there.
I don't think the big vendors can afford to experiment in this
area because it could hurt their other products. As a lower-overhead
organization, you can bound your risk in a way that makes this possible.

> I chose to price Corman Lisp in the $200 range because I believe
> that is an easy amount to budget (at least if you live in US or
> Europe) and does not require a lot of justification.

I didn't mean to say anything in conflict with this. I only meant to imply
that it's understandable that the big guys can't afford to do this.

> The goal is to attract a lot of people to use Common Lisp.

:-)

> Personally, I feel more comfortable using a language for development
> when I have full source code--at the very least source for the
> library functions. Sun Java SDKs provide this, many C and C++
> compilers provide this. I think this currently sets Corman Lisp
> apart from Franz and Xanalys.

Yep. I certainly think there are reasons to use your Lisp other than
just price, and these are among those other reasons.

> It may sound weird, but a lot of C and
> C++ programmers don't trust a system like a lisp system. They don't
> understand how it works. I used to suffer from this. By having the
> source code, they can, if they choose, understand the system to a
> deep level. And this comes back as an advantage when sophisticated
> users make improvements to the system and contribute them back to
> the product code stream. Many of the Corman Lisp enhancements in the
> last year have been contributed by users.

Cool.



> I believe Corman Lisp could have been given away free, or could be sold at
> several times the current price. I have chosen a combination of free (for
> personal use) and a low price (for other use) because by at least charging a
> small amount some income is generated which is invested back into development
> expenses.

I definitely feel you fill a very important niche that was going unserved.
I hope I didn't imply otherwise.

> I am happy to see that both Franz and Xanalys have been increasing the
> capabilities of their trial versions,

Well, I suppose this is always nice, but I'd rather they beef up their
for-sale versions. For example, it still confuses me why they offered
AllegroServe for free. They need to keep adding functionality to their product
to justify a continued high price--adding that module would be about what I'd
expect new every year in order to justify maintaining a certain price line.
Yet they gave that away free and still want the same price for the other
stuff they have. Very odd. They can do as they please, of course, I just
honestly wish I understood the economic model here in case I'm missing
an important lesson.

> and this certainly helps the lisp market.
> For a potential lisp user to be able to download several powerful
> implementations of the same language, even just as trials, sets common lisp
> apart from quite a few other languages.

Yeah, most other languages are incompatible from vendor to vendor but use
a compatible interface. CL uses a completely different interface vendor to
vendor but is code-compatible. :-)

Alain Picard

unread,
Sep 20, 2001, 6:01:53 AM9/20/01
to
"Coby Beck" <cb...@mercury.bc.ca> writes:

> In the spirit of the above (ie expressing my perceptions, unjustified or not
> : ) I would like to contribute my view.
>

In the same spirit, I have to report an experience exactly like Coby's.
I could easily have convinced my company to spend any
large X for developer seats, but when it turned out that we would have
to pay indefinite and unspecified royalties, that became a total show
stopper. This killed the sale.

I've been totally puzzled by Franz's pricing model, and I speculate
that they've got some huge, locked-in customers who represent
their entire revenue stream, and that they'll sacrifice any number
of new sales to keep this existing revenue stream. Whether this will
be good for Franz (and lisp) in the long term is something we'll have
to wait to find out.

--
It would be difficult to construe Larry Wall, in article
this as a feature. <1995May29....@netlabs.com>

Peter Herth

unread,
Sep 20, 2001, 3:52:58 PM9/20/01
to
ro...@corman.net (Roger Corman) writes:

> Currently we do not have the resources to look at other platforms. However I
> would very much like to support at least Intel Linux versions at some time if we
> can afford to expand our development efforts. The port should not be too
> expensive.

An Intel Linux version would be very great :)

Lieven Marchand

unread,
Sep 20, 2001, 12:36:45 PM9/20/01
to
Kent M Pitman <pit...@world.std.com> writes:

> "Coby Beck" <cb...@mercury.bc.ca> writes:
>
> > We use Lispworks [...] The debugger is not as flashy [...]
>
> Are you sure you're using the right debugger? There are two. When you end
> up in a teletype debugger, go into the Debug menu, pull down to Listener,
> and track across to Start GUI Debugger. I've pleaded with them to make this
> more prominent.

It's mentioned in the tutorial so I've found it easily enough. But I'm
still searching for a way to just make it the default. If you have the
GUI running, there isn't much sense in using the text debugger.

--
Lieven Marchand <m...@wyrd.be>
She says, "Honey, you're a Bastard of great proportion."
He says, "Darling, I plead guilty to that sin."
Cowboy Junkies -- A few simple words

cbbr...@acm.org

unread,
Sep 20, 2001, 5:24:38 PM9/20/01
to
Peter Herth <he...@netcologne.de> writes:
> ro...@corman.net (Roger Corman) writes:
>
> > Currently we do not have the resources to look at other platforms. However I
> > would very much like to support at least Intel Linux versions at some time if we
> > can afford to expand our development efforts. The port should not be too
> > expensive.
>
> An Intel Linux version would be very great :)

I doubt it would lead instantly to fame, fortune, gratuitous IPO
riches, and such, but I'll bet it would sell _reasonably_ well...

A most _fascinating_ option would be to deploy the GUI part using
WINE; that would likely ease the process, and possibly even turn the
result into a somewhat portable deployment environment where even much
of the Win32-oriented stuff would run on Linux...
--
(concatenate 'string "cbbrowne" "@ntlug.org")
http://www.ntlug.org/~cbbrowne/lisp.html
Where do you *not* want to go today? "Confutatis maledictis, flammis
acribus addictis" (<http://www.hex.net/~cbbrowne/msprobs.html>

Coby Beck

unread,
Sep 20, 2001, 10:25:32 PM9/20/01
to

"Kent M Pitman" <pit...@world.std.com> wrote in message
news:sfwsndi...@world.std.com...

Thanks for those pointers. I had seen the GUI debugger, which pops up depending
on what buffer you are evaluating an expression in, but I was not aware I could
choose it when in the listener. Plus I now see many other interesting browsers
(like the function browser, that's the cross-referencing stuff I was
missing--aside: is there something similar for special variables? i.e. find out
who sets and who references them)

All the functionality is definitely there, I just have to be more disciplined
about learning the tools! Thanks.

Kent M Pitman

unread,
Sep 20, 2001, 11:17:51 PM9/20/01
to
"Coby Beck" <cb...@mercury.bc.ca> writes:

> All the functionality is definitely there, I just have to be more disciplined
> about learning the tools! Thanks.

And Xanalys needs to take note that this is at least one data point showing I
was right that some people are probably not finding the nice tools!
Can't put all the blame on the customer.

I've specifically asked Xanalys to promote the Start GUI Debugger to the
main menu.

I agree with you that an option saying to just start it by default
would be useful. But in that case, maybe some people prefer to debug
in a context where they can search backward in the same buffer to grab
old inputs (even old debugger session expressions) for re-use. Still,
it could be under the control of a special variable....

Friedrich Dominicus

unread,
Sep 21, 2001, 12:40:59 AM9/21/01
to
cbbr...@acm.org writes:

> Peter Herth <he...@netcologne.de> writes:
> > ro...@corman.net (Roger Corman) writes:
> >
> > > Currently we do not have the resources to look at other platforms. However I
> > > would very much like to support at least Intel Linux versions at some time if we
> > > can afford to expand our development efforts. The port should not be too
> > > expensive.
> >
> > An Intel Linux version would be very great :)
>
> I doubt it would lead instantly to fame, fortune, gratuitous IPO
> riches, and such, but I'll bet it would sell _reasonably_ well...

I'll bet against it. I don't say it won't sell a copy, but it will
sell an order of magnitude less than on Windows. Why do I say
that? Well, we ported an IDE to Linux and have sold 4 copies in 4
months! And that for "just" EUR 25. Other examples which show the
problem are e.g. the Eiffel compilers, but that's a story for
another group.

Regards
Friedrich

Aleksandr Skobelev

unread,
Sep 21, 2001, 3:08:08 AM9/21/01
to
cbbr...@acm.org wrote:
> Peter Herth <he...@netcologne.de> writes:
>> ro...@corman.net (Roger Corman) writes:
>>
>> > Currently we do not have the resources to look at other platforms. However I
>> > would very much like to support at least Intel Linux versions at some time if we
>> > can afford to expand our development efforts. The port should not be too
>> > expensive.
>>
>> An Intel Linux version would be very great :)
>
> I doubt it would lead instantly to fame, fortune, gratuitous IPO
> riches, and such, but I'll bet it would sell _reasonably_ well...
>
> A most _fascinating_ option would be to deploy the GUI part using
> WINE; that would likely ease the process, and possibly even turn the
> result into a somewhat portable deployment environment where even much
> of the Win32-oriented stuff would run on Linux...

Oh, no! Only not WINE! I think it would just be huge. Why would anybody need a
ported Linux version if it depends on WINE? They might just as well try to run
the Windows version under WINE, I think.


Peter Herth

unread,
Sep 21, 2001, 3:48:00 PM9/21/01
to
cbbr...@acm.org writes:

> WINE; that would likely ease the process, and possibly even turn the
> result into a somewhat portable deployment environment where even much
> of the Win32-oriented stuff would run on Linux...

If you're talking about the IDE - I would be glad with just a terminal version.
(Emacs is my IDE :) Concerning a GUI toolkit, it might be a good idea to
create a binding to a platform-neutral toolkit like Qt, so the Windows
and Linux versions could be implemented from one codebase.

Peter

Martin Cracauer

unread,
Sep 21, 2001, 11:32:34 PM9/21/01
to
Erik Naggum <er...@naggum.net> writes:

> It is important for people who have RedHat scars to realize that they were
> scarred not by Linux as such, nor by any distribution other than RedHat.

I was attacking Linux, not Redhat, for:

- implementing non-memory-overcommit mode (good) and making it the
default (bad) without announcing it clearly (worse)
- various, in fact almost all, newer glibc changes. This is not the
kernel, but is used in all distributions I am aware of (I think the
distribution with FreeBSD's libc went nowhere)
- I could dig up my CMUCL breakage list; all of them were generic
Linux issues, no Redhat stuff IIRC
- I won't even start on cfs (via NFS), vmware on newest kernel etc.

Redhat's gcc "2.96" also produces noticeably more wrong code in -O3
mode than other gcc's, BTW. I am not so concerned about that because
I never use gcc with more than -O without having a solid regression
test suite for the application in question. Some customers who use
Linux (and people were happy about that) require stock Redhat with the
stock compiler. Redhat's approach of having new gcc and glibc versions
tested early and by many people, by including them in the official
distribution, might in many people's opinion be a good thing for open
source. However, it is unfortunate that this specific vendor is also the
vendor that "blind" customers choose, leading to serious lossage.

Martin
--
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
Martin Cracauer <crac...@bik-gmbh.de> http://www.bik-gmbh.de/~cracauer/
FreeBSD - where you want to go. Today. http://www.freebsd.org/

Rolf Mach

unread,
Sep 24, 2001, 9:23:51 AM9/24/01
to
Alexander Kjeldaas wrote:

> By knowing that people are willing to pay $8000 for ACL I
> can reasonably assume that there is somewhere I can go if I have trouble.

Alexander,

so all it takes to make you confident is an $8k price tag?

Let's see what we can do for you in upcoming releases ;-)

Regards

Rolf Mach

>
>
> astor
>
> [1] apt-get is a Debian/Linux command for installing new packages.
>
> --
> Alexander Kjeldaas Mail: as...@fast.grukk.no ('emove grukk)

--


__________________________________________________________________________________________

XANALYS - www.xanalys.com

Data Analysis and Lisp Development Tools
Software zur Datenanalyse und Lisp Entwicklungsumgebungen
__________________________________________________________________________________________

Rolf Mach
Business Development & Sales Manager, Europe

An der Schaafhansenwiese 6
D-65428 Ruesselsheim, Germany
Phone ++49 +6142 938197
Fax ++49 +6142 938199
rm...@xanalys.com

__________________________________________________________________________________________

NEW: LispWorks 4.2 at LinuxWorld Expo, Frankfurt, 30-Oct/1-Nov 2001, Hall 6.0
Booth E08
__________________________________________________________________________________________

Watson - PowerCase - Quenza - LispWorks


Kent M Pitman

unread,
Sep 24, 2001, 10:10:22 AM9/24/01
to
Rolf Mach <rm...@xanalys.com> writes:

> Alexander Kjeldaas wrote:
>
> > By knowing that people are willing to pay $8000 for ACL I
> > can reasonably assume that there is somewhere I can go if I have trouble.
>

> so all it takes to make you confident is a 8k price tag?
> Let´s see what we can do for you in upcoming releases ;-)

Heh. This is funny because when I was at Symbolics, Paul Robertson
once observed that people expect more of low-priced packages. If you
price something at $50, he said, everyone will buy it and you'd better
be ready for billions of customer calls complaining about all kinds
of stupid little things. But if you instead price it at $10,000, he
said, people will assume it's not ready for prime-time and will be
not nearly so critical about major gaps in functionality. I have found
this "wisdom" to be astonishingly true in a number of circumstances.
Never price a product too low unless you're really sure it works.

These remarks are not meant to make a reverse implication about Franz
quality on my part. I was just observing that some people look at a
high price tag and conclude "confidence", while some people look at a
low one and conclude that. (Sorry, Rolf. I know you had your heart
set on raising prices over there .... :-)

I think often the high price says "we're expecting to have to do a lot
of hands-on work with you and we have to pay for it somehow". That
doesn't necessarily mean bug fixes, though it might; it might just
mean handholding during custom configuration, and might be partly an
apology for not having enough wizard tools and/or documentation. Or
it might just be an acknowledgement that customer applications are so
varied that it's cheaper just to offer direct help than documentation
in some cases.

Then again, maybe the Franz price tag just pays for the salary of all
those people that ask all those invasive questions when all you're trying
to do is get a simple quote on some software... that money's got to come
from somewhere.

Rolf Mach

unread,
Sep 24, 2001, 10:58:06 AM9/24/01
to
Kent M Pitman wrote:

> If you
> price something at $50, he said, everyone will buy it and you'd better
> be ready for billions of cusotmer calls complaining about all kinds
> of stupid little things. But if you instead price it at $10,000, he
> said, people will assume it's not ready for prime-time and will be
> not nearly so critical aout major gaps in functionality. I have found
> this "wisdom" to be astonishingly true in a number of circumstances.

Now that you say this, I remember encountering exactly such a situation
many years ago, in a project where the group I worked with had to decide
between MCL and Procyon Common Lisp for the Mac. We all opted for Procyon
"because" of its 10 times higher price, fat manuals, professional features,
etc. - a few months later we found out that we had 10 times more issues
than we would ever have had with MCL.

What do we learn from this? Nothing ;-)

The question of high price/low numbers vs. low price/high sales is about
as easy to solve as the famous Greek riddle of squaring the circle...

I would be interested, however, in some sort of rational explanation of why
the high price of commercial Lisp environments is partly/fully
responsible for people using C++ or Java instead of Lisp.

Personally, I don't think that there is a relation at all.

At best I remember some discussions where applications with very low
end-user pricing could not be done in Lisp because of runtime fees.

Rolf

Dave Fox

unread,
Sep 24, 2001, 11:40:56 AM9/24/01
to
Kent M Pitman <pit...@world.std.com> writes:

>
> "Coby Beck" <cb...@mercury.bc.ca> writes:
>
> > All the functionality is definitely there, I just have to be more
> > disciplined about learning the tools! Thanks.
>
> And Xanalys needs to take note that this is at least one datapoint that I
> was right that some people are probably not finding the nice tools!
> Can't put all the blame on the customer.

Noted! In LispWorks 4.2, there are toolbar buttons providing easy
access to the tools, including the Debugger.

Also, the Start GUI Debugger menu command is more prominent than in
4.1.

Dave


--
Dave Fox Email: da...@xanalys.com
Xanalys Inc, Barrington Hall, Tel: +44 1223 873879
Barrington, Cambridge CB2 5RG, England. Fax: +44 1223 873873
These opinions are not necessarily those of Xanalys.

Kent M Pitman

unread,
Sep 24, 2001, 5:17:05 PM9/24/01
to
Rolf Mach <rm...@xanalys.com> writes:

> Kent M Pitman wrote:
>
> > If you
> > price something at $50, he said, everyone will buy it and you'd better
> > be ready for billions of cusotmer calls complaining about all kinds
> > of stupid little things. But if you instead price it at $10,000, he
> > said, people will assume it's not ready for prime-time and will be
> > not nearly so critical aout major gaps in functionality. I have found
> > this "wisdom" to be astonishingly true in a number of circumstances.
>
> Now that you say this, I remember encountering exactly such a situation
> many years ago, in a project where the group I worked with had to decide
> between MCL and Procyon Common Lisp for the Mac. We all opted for Procyon
> "because" of its 10 times higher price, fat manuals, professional features,
> etc. - a few months later we found out that we had 10 times more issues
> than we would ever have had with MCL.

If you re-read what I wrote, you'll find you're agreeing with me.



> What do we learn from this? Nothing ;-)

I'm not so sure.



> The question of high price/low numbers vs. low price/high sales is
> about as easy to solve as the famous Greek riddle how to get a
> circle out of squares...

It's not THAT hard.

It may be that people don't INFER from a low price that it's better
quality, but there's every reason to believe that they should. The
closer you bring the price of the product to the marginal cost of
producing it, the more you are saying either that you have no bugs (because
you have no budget left to handle them) or that you have a very huge market
(in order to be able to afford to deal with bugs on so tiny a margin).
[Of course, you might just be foolish about how to price your products
and be operating with negative cash flow, or you might be giving out a
charity or loss leader.] But assuming you're not a charity and you're
competent at pricing, "confidence in product quality" and "low price"
would seem to be pretty obviously inversely correlated, whether or not
the customer realizes it.

> I would be interested however in some sort of rational explanation why
> the high price of commercial Lisp environments is partly/fully
> responsible for people using C++ or Java instead of Lisp.
>
> Personally, I don´t think that there is a relation at all.

I'm not sure that in a corporate setting it's a lot more than a
scapegoat papering over deep-seated fears of doing anything
non-standard and/or vague paranoia about Lisp held over from AI
Winter. Even so, a couple of legitimate issues that I've seen come up
are (as you mention) that royalties can exceed the target
product price (or, at least, drive it up too much) and that royalties
can't (as in the case of the Franz license) be predicted in precise form.

People also sometimes take Lisp's inability to meet commodity pricing that
most other languages have established as a sign that there is something
less "established" about it. Whether or not this is a legit concern depends
on context; certainly it's no proof that Lisp will not perform technically.
But it is true that Lisp doesn't have the market share that other languages
do, and in some corporate contexts, that's defined (for better or worse)
to be an essential characteristic of an acceptable solution.

> At best I remember some discussions where applications with a very low
> end user pricing could not be done in Lisp because of runtime fees.

Yes, this is an issue. It's one reason that the free offerings and
lower-cost options like Corman CL are good to have around to give Lisp a
more complete coverage.
