
Some exciting new trends in concurrency and software design


jonathan

Jun 20, 2011, 6:49:02 AM

I've pasted below a few interesting passages from an
interview with Joe Duffy. Joe Duffy is lead architect on an
OS incubation project at Microsoft, where he is responsible
for its developer platform, programming language, and
concurrency programming model.

Despite the snarky title of this thread, I was seriously
pleased to see someone in that position of influence speak
well of a concurrency model I like (and speak with
some honesty about its history).

http://www.infoq.com/articles/Interview-Joe-Duffy#view_69430

The interviewer begins with:

The reason we wanted to talk to you
is that research into concurrency and
parallelism appears to have taken a turn.

Joe Duffy:

...according to the past several decades
of research, no, this isn't a major
shift to how parallel programs are architected.
The Actor model, introduced nearly 40
years ago, was truly revolutionary,
and made 1st class the notion that agents
would concurrently orchestrate some
larger interesting task through message
passing. This general idea has surfaced
again and again over the years, accumulating
useful extensions and variants over
time. Hoare CSP's, Liskov's Argus, Occam,
Ada, MPI, Erlang, and more recently,
languages like Scala, Clojure, and Go.

On the other hand:

... yes, this is a major shift for most
of the development community, because
mainstream programming environments
are now adopting the ideas that we've
seen in research and experimental languages
over the years.

In 5 years:

I fully expect that C#, C++, and Java,
for example, will have Actor-based offerings
within the next 5 years. In the meantime,
developers on such platforms will experiment
with new languages, and continue to
increasingly build architectures in
this style without first class support
in their primary environment....
In fact, a very specialized class of
programmers on this planet -- technical
computing developers in HPC, science, and
on Wall Street -- are already accustomed to
doing this with specialized toolkits, like MPI.

Which reminds me - one of the reasons I was more than
happy to see MPI become entrenched in these communities
is that it gives people like Mr. Duffy something to point to
in defence of this programming style.
When I use MPI, it feels just like Ada '83 tasking,
partly because (like all the other MPI'ers I've talked
to) by default I use blocking message passing (rather than
asynchronous non-blocking options, which tend to scale badly):

https://computing.llnl.gov/tutorials/mpi_performance/#Rendezvous

Like I keep saying, it's just a remote rendezvous as far as I
am concerned, and I have much good to say about it. I find it
promotes good program structure, and makes reasoning about
concurrency straightforward.
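
For what it's worth, the rendezvous style I'm describing can be
sketched in Go (a hypothetical illustration of mine, not MPI or Ada
code): an unbuffered channel makes the sender block until the
receiver is ready, much like an Ada entry call or a blocking MPI
send/receive pair.

```go
package main

import "fmt"

// rendezvousDouble sends n to a "server" goroutine over an unbuffered
// channel and waits for the reply. With unbuffered channels the send
// blocks until the receiver is ready: a rendezvous, analogous to an
// Ada entry call or a blocking MPI_Send/MPI_Recv pair.
func rendezvousDouble(n int) int {
	requests := make(chan int) // unbuffered: send and receive synchronize
	replies := make(chan int)

	go func() {
		v := <-requests // blocks until a sender arrives (the rendezvous)
		replies <- v * 2
	}()

	requests <- n // blocks until the server goroutine is ready to receive
	return <-replies
}

func main() {
	fmt.Println(rendezvousDouble(21))
}
```

Because both sides wait for each other at the communication point,
reasoning about who is doing what when stays simple, which is exactly
the structural benefit I find in blocking message passing.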

Regarding more advanced methods for eliminating race-conditions,
etc:

We write C# and Java code without needing
to worry about type-system holes. An
entire class of program errors is eliminated
by-construction; it is beautiful. The
same needs to happen for concurrency-safety.
But it sure won't be easy retrofitting
these concepts into existing languages
like C#, C++, and Java.

The defense of strong static typing is much appreciated.
(But still ... it's the 21st century. I wish universities
had made this advocacy unnecessary.)

OOP developers are accustomed to partitioning
their program into classes and objects;
now they will need to become accustomed to
partitioning their program into asynchronous
Actors that can run concurrently.

Mr. Duffy likes the asynchronous style, but I note that after
years of development, the Go concurrency model seems to
have become less asynchronous and more Ada-like (if I understand correctly):

http://groups.google.com/group/golang-nuts/browse_thread/thread/b877e34723b543a7

As of February this year,

Non-blocking channel operations have been
removed from the language.

The equivalent
operations have always been possible using a
select statement with a default clause. If a
default clause is present in a select, that
clause will execute (only) if no other is ready,
which allows one to avoid blocking on a communication.
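
The select-with-default idiom that announcement describes can be
sketched like this (my own illustration, not code from the linked
thread):

```go
package main

import "fmt"

// tryRecv attempts a non-blocking receive using a select statement
// with a default clause: the default branch runs only when no other
// communication is ready, so the call never blocks.
func tryRecv(ch chan string) (string, bool) {
	select {
	case msg := <-ch:
		return msg, true
	default: // nothing ready: take this branch instead of blocking
		return "", false
	}
}

func main() {
	ch := make(chan string, 1)

	if _, ok := tryRecv(ch); !ok {
		fmt.Println("nothing to receive")
	}

	ch <- "hello"
	if msg, ok := tryRecv(ch); ok {
		fmt.Println("got:", msg)
	}
}
```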


In a similar vein, here's another very interesting story -
Carnegie Mellon University (CMU) is removing OO from
its introductory courses. What's interesting is
the reason they're doing it.

http://existentialtype.wordpress.com/2011/03/15/teaching-fp-to-freshmen/

Robert Harper (Prof. of Computer Science at CMU):

Object-oriented programming is eliminated
entirely from the introductory curriculum,
because it is both anti-modular and anti-
parallel by its very nature, and hence
unsuitable for a modern CS curriculum...

Some more detail on this revolutionary idea:

https://existentialtype.wordpress.com/2011/04/16/modules-matter-most/

Robert Harper:

Modules Matter Most

When it comes to controlling the complexity
of developing and, more importantly,
maintaining a large system, the only
game in town is modularity. And as
even the strongest proponents of unityped
languages have come to accept, modularity
is all about types (static types, of
course, there being no other kind).
...
We have for decades struggled with using
object-oriented languages, such as Java
or C++, to explain these simple ideas,
and have consistently failed. And I
can tell those of you who are not plugged
into academics at the moment, many of
my colleagues world-wide are in the
same situation, and are desperate to
find a way out.

Professor Harper favors functional languages, SML in particular,
but at least he's on the right track;-)

J.

Georg Bauhaus

Jun 20, 2011, 10:40:36 AM
On 20.06.11 12:49, jonathan wrote:

> We have for decades struggled with using
> object-oriented languages, such as Java
> or C++, to explain these simple ideas,
> and have consistently failed. And I
> can tell those of you who are not plugged
> into academics at the moment, many of
> my colleagues world-wide are in the
> same situation, and are desperate to
> find a way out.
>
> Professor Harper favors functional languages, SML in particular,
> but at least he's on the right track;-)

Not that it matters most, or is most important to the subject
of understanding modularity and parallelism, but I must mention
that there remain a few major blind spots in FP circles, IMHO.
It won't help as is.

"There is nothing more dreary than the corporate
bureaucracy of OOP, and few things more lovely
than the mathematical elegance of FP."

Yes, I'd agree that O-O bureaucracy and spaghetti code can be
correlated. Why, however, do we not seem to see bureaucracy in ML
source text?

0. There is less ML text. Different problem domains, too.
Perhaps suggesting a different kind of program right from the start.

1. Actually, they use imperative style or suitable contortions
in functional programming when efficiency is required. I remember
looking at the chapter on fusion in Bird's Haskell FP
book. The chapter's subject is messy, and the approach produces
text similar to the lengthier noodles in Wirth's algorithms book,
conflating things for efficiency reasons. Where is the mathematical
elegance in that?

2. FP *hides* the effective work's complexity behind a mathematical
facade: you write down a nice formula after days of thinking,
shouting, explaining, and getting it right. Then, you do *not*
document your "thought process" in source text. Instead, you just
"sign" days of work with a few nice lines of code. That's quite
typical of FP, I think, but just less feasible in "imperative"
languages. We can stare at the short FP algorithm in awe of its
"mathematical elegance" and start wondering what the heck it is doing
behind the scenes. Yes, doing, since doing is what matters, even
when we just want to understand an algorithm. At least if Knuth is
to be trusted when he suggests that we should follow the steps that
algorithms take when we want to understand them.


A possible opponent to dropping O-O from the early curriculum,
Bertrand Meyer, insists that in his approach to O-O (teaching)
an object is a module, period. Unlike C++, say. He does show
a way out, in a big new book, the Inverted Curriculum IIRC.
Eiffel is much "cleaner" than the hybrid languages C++, OCaml,
or Java. Why not try a clean O-O approach (or just forget
implementation inheritance and so on) first, before banning
O-O programming altogether because, with the hybrid languages,
the author's "we" failed at explaining simple ideas?


Incidentally, regarding mathematical elegance, why and when do
students still run away from CS? As soon as they are overwhelmed with
formality, including inscrutable mathematical elegance. Is it because
they aren't bright, or is it rather because they cannot get good
explanations? Because their teachers simply presume "obvious" math
skills, and cannot and do not know what it is like not to have grown
up with all the prerequisite mathematics, all day.

If mathematicians say, for example, that abstract algebra is
more difficult than computing with whole numbers, then they
have not done their homework before arriving at this verdict.
(That algebra was developed from number theory, TTBOMK,
should hint at possible reasons.)

When a fan of ML (Harper), who, I think, has written an exceptionally good
introduction to ML, says that ML works better than O-O when teaching;
when, OTOH, a fan of O-O (Meyer), who has written an extraordinary
book about O-O (in Eiffel), says that the O-O curriculum works really
well at ETHZ, what is a taxpayer to make of this?

Wouldn't a distant observer immediately want the teachers to swap
programming languages and to try the opponent's language in a controlled
experiment?

Georg Bauhaus

Jun 20, 2011, 10:48:49 AM
On 20.06.11 16:40, Georg Bauhaus wrote:

> an object is a module, period. Unlike C++, say. He does show

s/an object/a class/

Sorry.

jonathan

Jun 20, 2011, 7:56:34 PM
On Jun 20, 3:40 pm, Georg Bauhaus <rm.dash-bauh...@futureapps.de>
wrote:

> FP *hides* the effective work's complexity behind a mathematical
> facade: you write down a nice formula after days of thinking,
> shouting, explaining, and getting it right. Then, you do *not*
> document your "thought process" in source text.  Instead, you just
> "sign" days of work with a few nice lines of code.  That's quite
> typical of FP, I think, but just less feasible in "imperative"
> languages. We can stare at the short FP algorithm in awe of its
> "mathematical elegance" and start wondering what the heck it is doing
> behind the scenes. Yes, doing, since doing is what matters, even
> when we just want to understand an algorithm.
>

Functional programming doesn't appeal to me, for similar
reasons, but mostly the problem is I'll always need the best
efficiency that languages like Fortran/C/Ada can give me,
and I never want to battle a garbage collector.
What I do find appealing is Harper's defense of static typing
+ modularity. 20 yrs ago someone said to me in passing that
there was this really good new language out called C++. My
exact words were: I've come to the conclusion that languages
should be strongly typed and modular. The conversation
came to a full stop. Her eyes widened, and she didn't say
another word to me. Well, after 20 years of OO chauvinism, I did get
a kick out of Harper's confident claims and the observation that


"We have for decades struggled with using object-oriented
languages, such as Java or C++, to explain these simple ideas,

and have consistently failed." But he muddies the water by
deprecating OO. I'm pretty sure you can do all these things
fine in Ada, and without garbage collectors.

J.

steveh44

Jun 21, 2011, 5:36:31 AM
On Jun 20, 4:56 pm, jonathan <johns...@googlemail.com> wrote:

>
> Functional programming doesn't appeal to me,

I also tried FP, and I did not like it either.

I could write some cool-looking code which does
something amazing in a few lines, but when I come
back a few days later and look at it, I find myself
struggling to understand it.

FP seems to be good for short programs. But try to
build a large software system with it, and things
start falling apart quickly.

Steve

Dmitry A. Kazakov

Jun 21, 2011, 8:19:16 AM
On Mon, 20 Jun 2011 16:56:34 -0700 (PDT), jonathan wrote:

> "We have for decades struggled with using object-oriented
> languages, such as Java or C++, to explain these simple ideas,
> and have consistently failed." But he muddies the water by
> deprecating OO. I'm pretty sure you can do all these things
> fine in Ada, and without garbage collectors.

There is the OO religion, and there is the much more practical theory of
abstract types. It is not clear which is meant where. Anyway, there are
more choices than just typed vs. untyped: e.g. procedural vs. object
decomposition, imperative vs. declarative, manifest vs. inferred (typing),
scoped vs. global. People arguing FL vs. OOPL usually mix everything
together, making any reasonable discussion impossible.

--
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de

Phil Clayton

Jun 21, 2011, 8:14:03 AM
On Jun 20, 3:40 pm, Georg Bauhaus <rm.dash-bauh...@futureapps.de>
wrote:
>
> 2. FP *hides* the effective work's complexity behind a mathematical
> facade: you write down a nice formula after days of thinking,
> shouting, explaining, and getting it right. Then, you do *not*
> document your "thought process" in source text.  Instead, you just
> "sign" days of work with a few nice lines of code.  That's quite
> typical of FP, I think, but just less feasible in "imperative"
> languages.  We can stare at the short FP algorithm in awe of its
> "mathematical elegance" and start wondering what the heck it is doing
> behind the scenes.

I don't believe that it is less feasible in imperative languages or
any other paradigm: put in enough effort to make insightful
observations and you can come up with succinct efficient algorithms
that are hard to understand. Consider the following imperative
example from Carroll Morgan's book Programming from Specifications
(see last page):
http://www.cs.ox.ac.uk/publications/books/PfS/PfS-21.ps.gz
(Well.. it didn't exactly leap out at me!)

I would suggest that elegant and insightful programs are due to the
more academic nature of the text/author. I do agree that such elegant
programs are more typical with FP but only because a much greater
proportion of FP text has academic roots.

There is plenty of industrial FP out there, where the emphasis is on
code being well documented and understandable, maintainable etc. by
normal people. I suppose that is industrial elegance!

Phil

Phil Clayton

Jun 21, 2011, 9:04:17 AM
On Jun 21, 10:36 am, steveh44 <steve_...@yahoo.com> wrote:
>
> FP seems to be good for short programs.

I find most languages are good for small programs. (Maybe that was
your point! :)

The fundamental property of pure functional programming - no
destructive assignment, i.e. all value-based - is good for large
programs. You know that the value of data is determined only by
certain points in your program. You could achieve a similar
effect in Ada by "programming with expressions" though Ada wasn't
designed for that and would be more limited. This is particularly
suitable for data transformation algorithms. I readily admit that
this makes FP fundamentally less suitable for certain classes of
algorithm. However, if you do put the effort in to keep programs
purely functional, the above property means that it would be much
easier to introduce concurrency.
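
As a sketch of that point (a hypothetical Go example of mine, rather
than Ada or an FP language): because the function being applied below
is pure, each element can be processed by its own goroutine with no
locks and no risk of interference.

```go
package main

import (
	"fmt"
	"sync"
)

// square is pure: its result depends only on its argument, so
// concurrent calls cannot interfere with one another.
func square(n int) int { return n * n }

// concurrentMap applies a pure function to every element in parallel.
// Each goroutine writes only its own slot of the result slice, so no
// synchronization is needed beyond the WaitGroup.
func concurrentMap(f func(int) int, xs []int) []int {
	out := make([]int, len(xs))
	var wg sync.WaitGroup
	for i, x := range xs {
		wg.Add(1)
		go func(i, x int) {
			defer wg.Done()
			out[i] = f(x)
		}(i, x)
	}
	wg.Wait()
	return out
}

func main() {
	fmt.Println(concurrentMap(square, []int{1, 2, 3, 4}))
}
```

Had square mutated shared state, this transformation from sequential
to concurrent would not have been safe without locking.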

A big issue with FP is the complexity of the compilers/interpreters.
Predicting what happens at run-time is difficult. Even with Ada (or C
etc.), imagine writing a program as an enormous expression by using
only functions - how deep does the stack go?


> But try to
> build a large software system with it, and things
> start falling apart quickly.

I'm not sure what you mean by "falling apart". FP languages usually
have module systems that support large-scale software. In my
experience there is one area where FP can quickly fall apart for large
systems: efficiency. In practice, there are a handful of things to
know and most problems will be avoided. Even so, timing regression
tests are important.

Phil

Shark8

Jun 21, 2011, 8:37:53 PM
On Jun 21, 4:36 am, steveh44 <steve_...@yahoo.com> wrote:
>
> I could write some cool looking code, which does
> something amazing in few lines, but when I come
> back few days later and look at it, I find myself
> struggling to understand it.
>
> Steve

You know, that's exactly why most languages allow comments.
And a particular reason that I like Ada: the comments themselves tend
to be written better (this is purely subjective and highly dependent
on my own experiences/exposures) and more informative than in any of the
"curly-brace" languages. {Javadoc being perhaps an exception; its
verbosity & rigid structure is quite the double-edged sword, and it is
easy for the documentation and code to get out of sync, esp. by someone
using some quick "cut-n-paste programming".}

Oliver Kleinke

Jun 22, 2011, 4:39:16 AM
Am Mon, 20 Jun 2011 16:40:36 +0200
schrieb Georg Bauhaus <rm.dash...@futureapps.de>:

> Incidentally, regarding mathematical elegance, why and when do
> students still run away from CS? As soon as they are overwhelmed with
> formality, including inscrutable mathematical elegance. Is it because
> they aren't bright, or is it rather because they cannot get good
> explanations? Because their teachers simply presume "obvious" math
> skills and cannot and do not know what it is like to not have grown
> up with all prerequisite mathematics. All day.

Talking to my classmates at university (mostly first- to fourth-term
students, most of whom had no programming experience), I rather get
the impression that OOP involves abstract constructs that are not
'compact' for a beginner. FP wouldn't do any better. IMO the
best approach would be to first begin with simple structured
programming and then go on to programming by extension,
modularization, etc. From my limited experience, I'd say Java, C/C++
and the like are not a good choice for an introductory programming
language and are bound to fail. But all that is nothing really new..
(There were some papers on that topic ~15 years ago, I think.)

an...@att.net

Jun 22, 2011, 5:45:25 AM
Taking a weekend break may clear your head, which may help solve the
problem, but it can also cause you to forget how you were solving that
problem.

Plus, back in the day, some of the profs would say "It might be the best
code in the world, but without comments it's a grade of zero." Because
even the prof may not remember which problem you're working on.

And it only takes a few comments.

In <b66eac0c-c65b-4a9c...@o10g2000prn.googlegroups.com>, steveh44 <stev...@yahoo.com> writes:

Nasser M. Abbasi

Jun 22, 2011, 10:48:13 PM
On 6/22/2011 1:39 AM, Oliver Kleinke wrote:

>
> Talking to my class mates at university (mostly first to fourth term
> students, most of them had had no programming experience) I rather get
> the impression that OOP involves abstract constructs that are not
> 'compact' for a beginner. FP wouldn't do any better. IMO the
> best approach would be to first begin with simple structured
> programming and then go on with programming by extension,
> modularization, etc. From my limited experience I tell Java, C/C++ and
> the like are not a good choice for an introductory programming
> language and are meant to fail. But all that is nothing really new..
> (There were some papers on that topic ~15 years ago I think.)

I am happy that my first language at school was Pascal, and my
programming bibles for years were

"Algorithms + Data Structures = Programs" by Wirth
"Data Structures and Algorithms" by Aho and Ullman
"Fundamentals of Data Structures in Pascal" by Horowitz and Sahni

Those were the good old days :)

--Nasser

Yannick Duchêne (Hibou57)

Jun 23, 2011, 5:23:57 AM
Le Mon, 20 Jun 2011 16:40:36 +0200, Georg Bauhaus
<rm.dash...@futureapps.de> a écrit:

> Not that it matters most, or is most important to the subject
> of understanding modularity and parallelism, but I must mention
> that there remain a few major blind spots in FP circles, IMHO.
> It won't help as is.
>
> "There is nothing more dreary than the corporate
> bureaucracy of OOP, and few things more lovely
> than the mathematical elegance of FP."
>
> Yes, I'd agree that O-O bureaucracy and spaghetti code can be
> correlated. Why, however, do we not seem to see bureaucracy in ML
> source text?
>
> 0. There is less ML text. Different problem domains, too.
> Perhaps suggesting a different kind of program right from the start.
>
> 1. Actually, they use imperative style or suitable contortions
> in functional programming when efficiency is required. I remember
> looking at the chapter on Fusions in Bird's FP in Haskell
> book. The chapter's subject is messy, the approach produces
> text similar to the lengthier noodles in Wirth's algorithms book,
> conflating things for efficiency reasons. Where is the mathematical
> elegance in that?
My small talk: this simply breaks the simple rule, “use
the best language for what it is best suited to”. Trouble comes when you
want only one language for everything.

--
“Syntactic sugar causes cancer of the semi-colons.” [Epigrams on
Programming, Alan J. Perlis, Yale University]
“Structured Programming supports the law of the excluded muddle.” [Idem]
Java: Write once, Never revisit

Yannick Duchêne (Hibou57)

Jun 23, 2011, 5:59:51 AM
Le Tue, 21 Jun 2011 11:36:31 +0200, steveh44 <stev...@yahoo.com> a écrit:

> On Jun 20, 4:56 pm, jonathan <johns...@googlemail.com> wrote:
>
>>
>> Functional programming doesn't appeal to me,
>
> I also tried FP, and I did not like it either.

If you had just (and not more) used it as a clean base from which to
derive other implementations in, say, Ada, you might have had another
feeling. Sure, FP lacks things like the expression of concrete hardware
(real-life machine) limitations: typically, with FP, you will not bother
defining a range for an integer type; instead, it may rely on big
numbers, which are obviously less efficient and limited only by
available memory and CPU speed, which is obviously not acceptable at all
in many cases.

Keep in mind: FP does not focus on the same things a language like Ada
does; however, having multiple views is often a good thing. I like to
think that in some areas (not all, don't forget), FP is nice for a first
model, from which you will derive a concrete implementation. You will
then not throw it away, because it may help you to compare the results
of both the FP and Ada implementations, if ever some suspicious results
occur with the release implementation, or if you lose track due to some
side-effect mess. Think of an FP model as documentation for an Ada
implementation, as a way to explain it (providing the FP text is
documented enough, of course).

Also, as Phil outlined, being expression-based (no side effects), FP
also leaves you a chance to turn a sequential implementation into a
concurrent implementation, which is not the least of it. Whatever the
Ada implementation is, try to turn a sequential design into a concurrent
design, and tell us about it after you have.

As a rule of thumb, although it's not true for everything, keep in mind
FP is not good for the final release implementation.

> I could write some cool looking code, which does
> something amazing in few lines, but when I come
> back few days later and look at it, I find myself
> struggling to understand it.

As Phil and Shark said: use “(*” comments “*)” ;) (aren't we talking on
an Ada Usenet group here? ;) )

> FP seems to be good for short programs. But try to
> build a large software system with it, and things
> start falling apart quickly.

That is true of any bad design, whatever the language is. Also note that
SML comes with Signature, Structure, and Functor, which all may help here.

Nasser M. Abbasi

Jun 23, 2011, 6:03:41 AM
On 6/23/2011 2:23 AM, Yannick Duchêne (Hibou57) wrote:

> My small talk: this simply breaks the simple rule, “use
> the best language for what it is best suited to”. Trouble comes when you
> want only one language for everything.
>

This is true in principle, but sometimes in practice there
are other considerations.

One language, even though it might not be as "good" as
another for use on the same problem, might happen
to have a much larger library, or much better supporting
tools (debugger, GUI builder, cross compiler, etc...) or
much better support, books, help, and such; all these things
add up to making development in it more "convenient" than in the
other (better) language.

I mean, when deciding which is the "best" language
for a problem, one must look beyond the language itself.

If it was just the language which decides, then everyone
should be using Ada, because we all know Ada is the best
language there is :)


--Nasser


Dmitry A. Kazakov

Jun 23, 2011, 6:25:00 AM
On Thu, 23 Jun 2011 11:59:51 +0200, Yannick Duchêne (Hibou57) wrote:

[...]


> with FP, you will not bother about defining a
> range for an integer type, instead, it may rely on big numbers, which are
> obviously less efficient and only limited by available memory and CPU
> speed, which is obviously not acceptable at all in many case.

This is a weak argument. The strong one is that the mathematical numbers
used in engineering are simply incomputable. The question is not efficiency
("how"), it is "if": you cannot represent numbers involved, e.g. real
numbers. Engineering computations are on the model numbers, where range is
just one constraint among others.

All programming is actually about constraints, which makes things so hard,
mathematically, algorithmically and also in terms of types (e.g. LSP
violation). Ignoring this does not help.

Yannick Duchêne (Hibou57)

Jun 23, 2011, 6:57:49 AM
Le Thu, 23 Jun 2011 12:25:00 +0200, Dmitry A. Kazakov
<mai...@dmitry-kazakov.de> a écrit:

> This is a weak argument. The strong one is that the mathematical numbers
> used in engineering are simply incomputable. The question is not
> efficiency
> ("how"), it is "if": you cannot represent numbers involved, e.g. real
> numbers. Engineering computations are on the model numbers, where range
> is
> just one constraint among others.
That is indeed what I wanted to say (yes, even SML is not able to really
express the ideal nature of numbers, and nothing is).

> All programming is actually about constraints, which makes things so
> hard,
> mathematically, algorithmically and also in terms of types (e.g. LSP
> violation). Ignoring this does not help.

Ignoring things can help.

There is always some stage where you ignore some things and not others.
This is an unavoidable part of a design or understanding process (whether
with math or computation, this is true of both). The question is: how do
you forget less and less as time goes by, and how do you ensure you do not
forget what is relevant for this and that (“this” and “that” don't occur
at the same time)? That's why I underlined the matter of focus: Ada and
SML have different focuses.

FP is less executable than Ada, so there are things you will bother
about less in FP that you will be required to bother about in Ada. There
may be techniques to make FP more executable (I do not know enough about
OCaml compilers), but I believe FP is primarily a seed to derive from.
If you don't expect the FP text to be a final release, you can drop some
physical matters at this stage. Let's say, if you can execute it, that's
for testing and proof by execution, which makes it more handy than maths.

Do you see what I mean?

By the way, if you want to prove an Ada program, how do you do it, if not
by turning it into a view where it is all made of expressions (with
possibly multiple expressions at some point, for coverage)? That's what
SPARK does so far. Yes, SPARK includes Ada's type constraints in the
proof requirements, so it is not strictly comparable, but this is a
matter of stage and focus. Let's say the functional view you get when
proving an Ada program with SPARK is more frozen and narrowed than what
an expression of the same program would be in SML. And “frozen” and
“narrowed” should not occur too early (and can't occur early anyway);
indeed, you may want to keep a hand or an eye on the non-frozen /
non-narrowed view at any time.

Do you get the picture?

Yannick Duchêne (Hibou57)

Jun 23, 2011, 7:07:55 AM
Le Thu, 23 Jun 2011 12:03:41 +0200, Nasser M. Abbasi <n...@12000.org> a
écrit:

> This is true in principle, but sometimes in practice there
> are other considerations.
>
> One language, even though it might not be as "good" as
> another for use on the same problem, might happen
> to have a much larger library, or much better supporting
> tools (debugger, GUI builder, cross compiler, etc...) or
> much better support, books, help, and such, and all these things
> add up in making developing in it more "convenient" than the
> other (better) language.
Yes, but the intent was mainly to say that the danger is in using only
one for everything. I know the topic of choosing which language is
better for what is less easily solved than suggested.

> If it was just the language which decides, then everyone
> should be using Ada, because we all know Ada is the best
> language there is :)

Sorry, I don't even believe that :p (not joking).
I would prefer to say: it is on many occasions the best choice (and
better than the typical choice, at least), but not always, for
everything.

If you search a bit, you will come up with examples yourself.

Dmitry A. Kazakov

Jun 23, 2011, 8:20:44 AM
On Thu, 23 Jun 2011 12:57:49 +0200, Yannick Duchêne (Hibou57) wrote:

> Ignoring things can help.
>
> There is always some stage where you ignore some one of some other things.

Yes, but you should not forget that you deal with a model. Many
approaches claiming to be mathematically puristic suffer from this error.

> FP is less executable than Ada, so there are things you will less bother
> with FP, you will be required to bother about with Ada.

Do you mean that FP programs are never intended to be executed? (:-))

> Do you see what I mean ?

No, I suppose you mean that FPLs are more declarative than Ada. That is
not necessarily true, because Ada, as a language supporting OO
decomposition, has much declarative stuff to support it, which is
missing in languages oriented toward procedural decomposition.

> By the way, if you want to prove an Ada program, how do you do it, if not
> by turning it into a view where it is all made of expressions (with
> possibly multiple expressions at some point, for coverage) ?

Again, if you mean that any formal proof has to be strictly declarative
(assuming its relation to the run time): yes, it evidently has to be. But
the proof is not the program itself. They are written in different
languages of different power. IMO, it is fundamentally impossible to melt
them into one.

Also consider what would happen if you wanted to prove the proof. You would
need a third language for which the second one would be "executable," and
the inference it does would be just a program.

Georg Bauhaus

Jun 23, 2011, 6:17:23 PM
On 6/23/11 12:57 PM, Yannick Duchêne (Hibou57) wrote:
> Le Thu, 23 Jun 2011 12:25:00 +0200, Dmitry A. Kazakov <mai...@dmitry-kazakov.de> a écrit:
>> This is a weak argument. The strong one is that the mathematical numbers
>> used in engineering are simply incomputable. The question is not efficiency
>> ("how"), it is "if": you cannot represent numbers involved, e.g. real
>> numbers. Engineering computations are on the model numbers, where range is
>> just one constraint among others.
> That was indeed what I meant (yes, even SML cannot really express the ideal nature of numbers, and nothing can).
>
>> All programming is actually about constraints, which makes things so hard,
>> mathematically, algorithmically and also in terms of types (e.g. LSP
>> violation). Ignoring this does not help.
> Ignoring things can help.
>
> There is always some stage where you ignore one thing or another.

Ignoring things is a pattern in programming whose consequences,
while known to turn out expensive, are consistently ignored, too.
A pattern, incidentally, that seems to have been strikingly absent
when the HOLWG collected a catalog of language requirements more
than 30 years ago.

Ignoring some things can set the stage for successful software
business, since ignoring things allows concentrating on a model that
is simple, and clearly defined. A simple to use framework is an
example. Or a cute sequential scripting language. Or some easy to
program mapping of objects to relational database tables. (Or the aspect
of a FPL that interests you most.)

These simple offerings have a few characteristics in common:

- They are all simple.

- They can therefore be understood and handled by many.

- They are therefore attractive.

So they are seen as having potential.


Alas, the potential turns out to be highly energetic.

When the potential of simplified attractive offerings is activated,
its first effect is that "the market" quickly establishes businesses
around the simple offerings. It also establishes a workforce that
accumulates knowledge about these things; some will know the
framework, some will have learned the scripting language, some will
have started mapping objects to RDBMS tables.

(Ada, I guess, has had its new market, too, but for different reasons,
not because things had been ignored. "The mandate" would help create
Ada business in those years.)

Second, the offerings' potential, once it moves the market, starts
to generate an ecosystem of work. Work that isn't easy, frequently not
fun, and not very productive: work that is about tackling the things that
have been ignored. Such as:

- when the framework turns out to be little more than boilerplate
text---but otherwise requires detailed knowledge of a vast configuration
sublanguage.

- when the scripting language is used to process international text
in a concurrent environment and fails, since char issues and concurrency
have been ignored.

- when the program's data model turns out to be relational, but
the object-as-a-table magic does not quite work.
The workforce then finds themselves trying to reinvent a relational
database management system, investing man months ...

The reason that in the context of FPLs I'm mentioning this pattern (of
deliberately ignoring some things being a cause and expensive work
being an effect) is that one of these languages, the ATS language, is
really highly efficient by design and also includes a proof system
right in the language. It has safe pointers, too. So it would seem
promising. Even as a layman, I am sure there are many interesting
advanced features in ATS.

But as usual, some things have been ignored.

As far as I can tell, at least these:

The numeric types are, by default, basically int of ISO/IEC 9899 (C).
A factorial function of a large enough argument may therefore
return a number < 0. Likewise, watch the program

implement main() = begin
print_int(4000000000) ; print_newline()
end

$ ./a.out
-294967296

O'Caml appears to be operating at a similar level, if my copy
isn't broken:

# let i = 4000000000;;
val i : int = 4000000000
# i * i;;
- : int = -2446744073709551616
#
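As an aside, both wrapped values above can be reproduced exactly by reducing the mathematical result into a two's-complement range. Here is a small Python sketch (Python only because its integers are unbounded); the widths used are assumptions: 32 bits for C's int on the platform shown, and 63 bits for OCaml's native int on a 64-bit system:

```python
def wrap(x, bits):
    # Reduce an unbounded integer to a signed two's-complement
    # value of the given width, as fixed-size machine ints do.
    half = 1 << (bits - 1)
    return ((x + half) % (1 << bits)) - half

# ATS printing 4000000000 through a C-style 32-bit int:
print(wrap(4000000000, 32))                 # -294967296

# OCaml's i * i with i = 4000000000, assuming a 63-bit native int:
print(wrap(4000000000 * 4000000000, 63))    # -2446744073709551616
```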

The puzzling issue is why functional programmers don't just copy
the solution from a Lisp, such as CMUCL, or from Python. Are they
ignoring even "their own relatives"?

* (let ((i 4000000000))
(* i i))

16000000000000000000
*

>>> i = 4000000000
>>> i * i
16000000000000000000L

Wouldn't it be a good idea to first collect requirements for a system
of fundamental types, such as whole numbers, for use in new languages?
Then implement them for new functional languages, so as *not* to
build new ideas atop ISO/IEC 9899 and spoil them with the consequences
of importing int. Doing so seems like breaking future software by
design.


Continuing things ignored in ATS:
Syntax. Programmers will need initiation into another interpretation
of ASCII punctuation. ATS does use ASCII symbols, which may be a lot
better than words insofar as symbols are as international as "+".
The symbols have precisely specified meanings. But they have these
meanings in just this language. How many programmers use just
one language? The situation is worsened by overloading ASCII
symbols, which according to Australian studies does not seem
to help newcomers understand source text. (Which, incidentally,
is an argument against Ada's X(42) meaning both array component
and call.)

Concurrency. ATS does address concurrency---with POSIX threads
and such. Well, that's at least something. Or maybe not?

ATS appears to be a research work, too. I understand that Haskell is
the outcome of a collaborative effort to create a FPL that would have
desired properties. Maybe ATS work will at some stage be integrated
into some FPL. Hopefully, though, only the ideas will be adopted into
contemporary programs, not the underlying fundamental types.
How can a safe ATS pointer be safe in the presence of int?

In the long run, I don't think it is a good idea to ignore things
as often as it is actually done, and supported. The pattern,
whether employed in language design or in programming, will generate
insecure software and entail costly repair.

Phil Clayton

unread,
Jun 23, 2011, 9:26:57 PM6/23/11
to
On Jun 23, 11:17 pm, Georg Bauhaus <rm.dash-bauh...@futureapps.de>
wrote:

> The reason that in the context of FPLs I'm mentioning this pattern (of
> deliberately ignoring some things being a cause and expensive work
> being an effect) is that one of these languages, the ATS language, is
> really highly efficient by design and also includes a proof system
> right in the language. It has safe pointers, too. So it would seem
> promising.  Even as a layman, I am sure there are many interesting
> advanced features in ATS.
>
> But as usual, some things have been ignored.

I don't know about ATS but can usefully add some points...


> As far as I can tell, at least these:
>
> The numeric types are, by default, basically int of ISO/IEC 9899 (C).

Arbitrary magnitude integers are supported in at least SML, OCaml and
Haskell.


> O'Caml appears to be operating at a similar level, if my copy
> isn't broken:
>
> # let i = 4000000000;;
> val i : int = 4000000000
> # i * i;;
> - : int = -2446744073709551616
> #
>
> The puzzling issue is why don't functional programmers just copy
> the solution from Lisp, such as CMUCL? Or from Python? Are they
> ignoring even "their own relatives"?

OCaml:
#load "nums.cma";;
open Big_int;;
let i = big_int_of_int 4000000000;;
string_of_big_int (mult_big_int i i);;

SML:
open IntInf; (* not required if structure Int is IntInf *)
val i = 4000000000;
i * i;


> Syntax. Programmers will need initiation into another interpretation
> of ASCII punctuation.  ATS does use ASCII symbols, which may be a lot
> better than words insofar as symbols are as international as "+".
> The symbols have precisely specified meanings. But they have these
> meanings in just this language.  How many programmers use just
> one language?  The situation is worsened by overloading ASCII
> symbols, which according to Australian studies does not seem
> to help newcomers understand source text.

My guess is that in the FP world, there is more variation in syntax
because there was no dominant language for the flock to follow.

Overloading of arithmetic operators, as in SML above, seems preferable
to having a different name for e.g. "+" for each type.


> Concurrency.  ATS does address concurrency---with POSIX threads
> and such. Well, that's at least something. Or maybe not?

There is Concurrent Haskell, which I know nothing about. For SML,
Poly/ML provides an interface to the pthreads library that has been
used to provide a higher-level concurrency interface:
http://www4.in.tum.de/~wenzelm/papers/parallel-ml.pdf
I'm not sure which others offer true concurrency.

I generally agree that many FPLs lack essential features.

Phil

Yannick Duchêne (Hibou57)

unread,
Jun 23, 2011, 9:27:25 PM6/23/11
to
Le Fri, 24 Jun 2011 00:17:23 +0200, Georg Bauhaus
<rm.dash...@futureapps.de> a écrit:

> Ignoring things is a pattern in programming
Funny wording :)

> whose consequences,
> while known to turn out expensive, are consistently ignored, too.
> A pattern, incidentally, that seems to have been strikingly absent
> when the HOLWG collected a catalog of language requirements more
> than 30 years ago.

Please, tell me more.

> Ignoring some things can set the stage for successful software
> business, since ignoring things allows concentrating on a model that
> is simple, and clearly defined.

Yes, that's the purpose of modeling. But this does not include deliberately
erroneous design, as you erroneously suggest later (it seems you've ignored
some possible views on the topic :-P ).

> A simple to use framework is an
> example. Or a cute sequential scripting language. Or some easy to
> program mapping of objects to relational database tables. (Or the aspect
> of a FPL that interests you most.)

I was not talking about FPLs at large, as I know only one, which is SML
(the others are too different). And this one was not designed to be cute,
but to be sound. It was specified prior to its first implementation (as you
know, I guess), just like Ada was, and this specification was not based on
The Great Cute Levels Catalog, but made of coherence proofs instead.
Unfortunately, you made this error so early that a large part of
what you wrote from here on is unclear due to that assumption. But you are
still OK, as it is my sole fault: I was too imprecise about the meaning
of “forget” (to be clarified a bit later).

> These simple offerings have a few characteristics in common:
>
> - They are all simple.
>
> -They can therefore be understood and handled by many.
>
> - They are therefore attractive,

None of the three applies here.

> So they are seen as having potential.

Not really, as people complain that the crowd has not adopted it (true for
Haskell too).


> Alas, the potential turns out to be highly energetic.
>
> When the potential of simplified attractive offerings is activated,
> its first effect is that "the market" quickly establishes businesses
> around the simple offerings. It also establishes a workforce that
> accumulates knowledge about these things; some will know the
> framework, some will have learned the scripting language, some will
> have started mapping objects to RDBMS tables.
>
> (Ada, I guess, has had its new market, too, but for different reasons,
> not because things had been ignored. "The mandate" would help create
> Ada business in those years.)
>
> Second, the offerings' potential, when moving the market, is then
> starting
> to generate an ecosystem of work. Work that isn't easy, frequently not
> fun, and not very productive: work that is about tackling the things that
> have been ignored. Such as:

Hey, I was not talking about Python!

> - when the framework turns out to be little more than boilerplate
> text---but otherwise requires detailed knowledge of a vast configuration
> sublanguage.
>
> - when the scripting language is used to process international text
> in a concurrent environment and fails, since char issues and concurrency
> have been ignored.

Not a language failure, but an application design error.

> - when the program's data model turns out to be relational, but
> the object-as-a-table magic does not quite work.

Wrong analysis. No language can avoid this, as it often happens before
enough text exists in any language, except human languages.

> The workforce then finds themselves trying to reinvent a relational
> database management system, investing man months ...

You end up reinventing paradigms when you use the wrong language for
the wrong thing, which is what I was arguing against in another message
to someone (nicely) stating that Ada is the best language ever for
everything. I would not promote SML as the best language for everything,
any more than I would do so for Ada. You have to feel that SML is well
suited for a task before using it; otherwise, don't use it. In short, I
would say SML is well suited when 1) you want to express some kind of
algorithm with the safety of avoiding side effects (which Ada cannot
prevent by nature); 2) you want to express an algorithm with some
terseness, which I know Ada lawyers will argue against, but which is
anyway better than abstract comments in an Ada program, which would be
controlled in no way and would be too interleaved with the Ada text to be
of real readability help; 3) you want expression-based text, which may
help proof by any means (by hand, or why not automatically). To be honest,
Ada provides tools for the latter, especially since Ada 2012 and its
conditional expressions. But it still lacks terseness and easy avoidance
of side effects. Never mind, it should not provide these, as this is not
its purpose.

When I said “forget things”, I meant, for example: forget about number
limitations; forget about explicit type annotations (which are anyway
statically known to the SML compiler, thanks to type inference), which
helps terseness and gives a better view of the whole with less cognitive
load; and forget about other concrete machine matters, like available
memory, including memory management.

When you start forgetting such things, you do not build anything
erroneous; you just create something which will need to be refined and
specialized, which is different from something which will need to be
totally redesigned.

You may think of it as HTML+CSS, where HTML is the core of the meaning and
CSS the layout layer: the core meaning of an algorithm may be the SML
text, and the meaning+presentation would be the concrete implementation to
be presented to some machine or hardware, which is better expressed
with Ada, due to many of the requirements at this stage.

If HTML forgets about presentation, that's for a good reason, and it does
not lead to erroneous pages. It leads instead to pages with potentially
multiple implementations, thanks to the clean separation of the two, just
like an SML text may have multiple Ada implementations, depending on
requirements: memory limitations, concurrency available or not, preference
for a sequential implementation or not, use of a garbage collector or not
(which I personally dislike), available or recommended width for numbers,
etc. It is of no help at all to introduce all of these too early; it
would have the same effect as obfuscation.

> The reason that in the context of FPLs I'm mentioning this pattern (of
> deliberately ignoring some things being a cause and expensive work
> being an effect) is that one of these languages, the ATS language, is
> really highly efficient by design and also includes a proof system
> right in the language. It has safe pointers, too. So it would seem
> promising. Even as a layman, I am sure there are many interesting
> advanced features in ATS.
>
> But as usual, some things have been ignored.

Yes. Among others, modularity (I still have not tried this one, sorry;
I have just read about it).

> As far as I can tell, at least these:
>
> The numeric types are, by default, basically int of ISO/IEC 9899 (C).
> A factorial function of a large enough argument may therefore
> return a number < 0. Likewise, watch the program
>
> implement main() = begin
> print_int(4000000000) ; print_newline()
> end
>
> $ ./a.out
> -294967296

That is not what I was suggesting to forget about. Otherwise I
would have suggested simply leaving Ada ;-) and I didn't.

> O'Caml appears to be operating at a similar level, if my copy
> isn't broken:
>
> # let i = 4000000000;;
> val i : int = 4000000000
> # i * i;;
> - : int = -2446744073709551616
> #

Do you know what SML proponents say about OCaml? They complain that the
fundamental principles of SML were sacrificed for efficiency; it is
suspected to be less sound, and less safe, due to heavy use of side
effects.

Let's try it with SML:

Moscow ML version 2.01 (January 2004)
Enter `quit();' to quit.
- val i = 4000000000;
! Toplevel input:
! val i = 4000000000;
! ^^^^^^^^^^
! Lexical error: integer constant is too large.

Oops, integer too large, so let's drop a zero (I did not check Moscow ML's
specific limitations).

- val i = 400000000;
> val i = 400000000 : int
- val j = i * i;
! Uncaught exception:
! Overflow

Either a compiler error or a runtime exception. And that's not Moscow ML
specific; that's by the SML specification.

Just to underline: a modeling language which helps you forget about some
things is not to be confused with a badly designed language.

> The puzzling issue is why don't functional programmers just copy
> the solution from Lisp, such as CMUCL? Or from Python? Are they
> ignoring even "their own relatives"?
>
> * (let ((i 4000000000))
> (* i i))
>
> 16000000000000000000
> *
>
>>>> i = 4000000000
>>>> i * i
> 16000000000000000000L

I could not understand that point.

> Won't it be a good idea to first collect requirements for a system
> of fundamental types such as whole numbers for use in new languages?
> Then implement them for new functional languages. So as to *not*
> build new ideas atop ISO/IEC 9899 and spoil it with the consequences
> of importing int. Doing so seems like breaking future software by
> design.

I don't remember if I already told you (I feel I remember I did, but I may
be wrong), but there is a plan to refine the SML specification.
Unfortunately, as it is not famous enough, too few people contribute
ideas. I believe it is still the best base, due to the amount of work
already done for that language (proven soundness and some experienced
users with worthy comments).

> Continuing things ignored in ATS:
> Syntax. Programmers will need initiation into another interpretation
> of ASCII punctuation. ATS does use ASCII symbols, which may be a lot
> better than words insofar as symbols are as international as "+".
> The symbols have precisely specified meanings. But they have these
> meanings in just this language. How many programmers use just
> one language?

Well, I guess they “use only one for everything” :D

> The situation is worsened by overloading ASCII

Hum, Unicode phobia again.

> symbols, which according to Australian studies does not seem
> to help newcomers understand source text.

At least not me, so I can believe it.

> (Which, incidentally,
> is an argument against Ada's X(42) meaning both array component
> and call.)

No, that's not the same, because array access is a kind of function :-P.
That said, it would be easy enough to wrap array access in a
function when one wants the source to be independent of whether Foo is
defined as an array or a function. This is even promoted, from time to
time, as a good feature of Ada (array access using the same syntax as
function invocation).

> Concurrency. ATS does address concurrency---with POSIX threads
> and such. Well, that's at least something. Or maybe not?

I am tempted to say “not as clean as Ada tasks, probably”; that's a good
point to notice about ATS, but it is not really related to the ability of
the language to hide feature X when it is not relevant at stage Y.

> ATS appears to be a research work, too. I understand that Haskell is
> the outcome of a collaborative effort to create a FPL that would have
> desired properties. Maybe ATS work will at some stage be integrated
> into some FPL. Hopefully, though, only the ideas will be adopted into
> contemporary programs, not the underlying fundamental types.
> How can a safe ATS pointer be safe in the presence of int ?

What is the question, precisely? I am not sure I've understood it.

> In the long run, I don't think it is a good idea to ignore things
> as often as it is actually done, and supported.

When you say “as it is actually done, and supported”, do you suggest there
could be a good way to do it?

> The pattern,
> whether employed in language design or in programming, will generate
> insecure software and entail costly repair.

What about the clarifying examples I provided above in this message?

Yannick Duchêne (Hibou57)

unread,
Jun 23, 2011, 9:34:53 PM6/23/11
to
Le Fri, 24 Jun 2011 03:26:57 +0200, Phil Clayton
<phil.c...@lineone.net> a écrit:

> On Jun 23, 11:17 pm, Georg Bauhaus <rm.dash-bauh...@futureapps.de>
> wrote:
>> The reason that in the context of FPLs I'm mentioning this pattern (of
>> deliberately ignoring some things being a cause and expensive work
>> being an effect) is that one of these languages, the ATS language, is
>> really highly efficient by design and also includes a proof system
>> right in the language. It has safe pointers, too. So it would seem
>> promising. Even as a layman, I am sure there are many interesting
>> advanced features in ATS.
>>
>> But as usual, some things have been ignored.
>
> I don't know about ATS but can usefully add some points...

If you want to learn about ATS, have a look here: http://www.ats-lang.org/

Georg Bauhaus

unread,
Jun 24, 2011, 6:32:50 AM6/24/11
to
On 6/24/11 3:27 AM, Yannick Duchêne (Hibou57) wrote:
> Le Fri, 24 Jun 2011 00:17:23 +0200, Georg Bauhaus <rm.dash...@futureapps.de> a écrit:
>> Ignoring things is a pattern in programming
> Funny wording :)
>
>> whose consequences,
>> while known to turn out expensive, are consistently ignored, too.
>> A pattern, incidentally, that seems to have been strikingly absent
>> when the HOLWG collected a catalog of language requirements more
>> than 30 years ago.
> Please, tell more

That's Ada history. Find it looking for "Requirements for
High Order Computer Programming Languages".


> I was not talking about FPLs at large, as I know only one, which is SML (the others are too different). And this one was not designed to be cute, but to be sound. It was specified prior to its first implementation (as you know, I guess), just like Ada was, and this specification was not based on The Great Cute Levels Catalog, but made of coherence proofs instead. Unfortunately, you made this error so early that a large part of what you wrote from here on is unclear due to that assumption. But you are still OK, as it is my sole fault: I was too imprecise about the meaning of “forget” (to be clarified a bit later).

My topic is the pattern of deliberately ignoring things, and
how it is related to forgetting, simplifying, or refining.

ML has one feature whose characteristics and effects were basically
ignored. It may be that, when designing the feature, something
essential about writing programs had just been forgotten, but
certainly the feature was not to be refined, later. On the contrary.

If you forget things, they tend to be added later, or at
least you will concede that there is a bug. This can be
fixed.

Refining is possible only when there had been an assumption
of something to be refined. Ignoring is different from refining.

The pattern of ignoring things is present in ML syntax. Andrew
Appel expounds in his Critique of ML. In one paragraph, he mentions
that ML syntax does not get him into trouble any more. But in later
paragraphs, he explains at some length where and how ML syntax was
*not* given the same attention as the semantics of Standard ML, and
why this is not the best of things.

I believe that ML syntax wasn't forgotten. Just ignored. It wasn't to
be refined later, either. It wasn't simplified for modeling
reasons. It was just ignored. With SML out the door, we, the
programmers, can get our load of condescension when we complain
about inscrutable error messages when some expression gulps the
rest of the compilation unit: "Why didn't you put the
optional semicolon there, STUPID!? If your expression awaits
another function, then the compiler is quite right to consider
the next fun definition in the file to be that function!" Or similar.

(Do you recognize the familiar argument: Joe Bloggs, the programmer,
writes something obviously correct, but it isn't correct; a language
"feature" has caught Joe; then someone says that Joe is stupid (and
shouldn't be programming) because Joe doesn't know the language
"feature".)

Next they address us in managerial style to talk about the Practical
Programmer who would just accept things the way they are. Just don't
have the makers of ML admit they have made a serious mistake
when ignoring the very manifestation of human-computer interaction:
the syntax. And once the example is out, others will copy it.

Syntax is the means by which humans structure what they wish
to express, a program in our case. How can this be forgotten?
When does industry accept the time loss spent in training
awareness of idiosyncrasies and in tackling the effects
of things ignored?


>> So they are seen as having potential.
> Not really, as people complain that the crowd has not adopted it (true for Haskell too).

F#, Scala, R, ...


>> - when the scripting language is used to process international text
>> in a concurrent environment and fails, since char issues and concurrency
>> have been ignored.
> Not a language failure, but an application design error.

When a language is basically not thread safe, I'll call it
a language issue.


>> - when the program's data model turns out to be relational, but
>> the object-as-a-table magic does not quite work.
> Wrong analysis. No language can avoid this, as it often happens before enough text exists in any language, except human languages.

Let a problem P have a relational solution R. Demand that R be
implemented using a non-relational, but acclaimed language O.
Ignore programmers who mention that the admittedly fashionable
choice O might turn out wrong, and costly in the long run.

Another instance of the pattern of ignoring things at work. The
motive here is the relative weight of feeling the immediate advantages
of crowd decision on the one hand and concerns about long-term
technical consequences on the other. If the crowd is the current
source of profit, the choice seems clear.

>> The puzzling issue is why don't functional programmers just copy
>> the solution from Lisp, such as CMUCL? Or from Python? Are they
>> ignoring even "their own relatives"?
>>
>> * (let ((i 4000000000))
>> (* i i))
>>
>> 16000000000000000000
>> *
>>
>>>>> i = 4000000000
>>>>> i * i
>> 16000000000000000000L
> I could not understand that point.

This is the crucial bit. The goal is a language based on ML, like
ATS. But, following the pattern, the designers of ATS seem to ignore both
ML's int (which raises Overflow instead of erroneous execution)
and alternative "functional ints". Instead,
for "practical reasons" I guess, they chose C int behavior.
Assume that the new language is out the door, and is being used
on commercial projects. Will it continue to offer only C int and
associated semantics or will it move towards better defined int?
Will sacred backwards compatibility be in the way of rectifying
int towards ML's, for example?


>> How can a safe ATS pointer be safe in the presence of int ?
> What is the question precisely ? I am not sure I've understood the question.

Assume, for the sake of generalization, that a pointer points
safely to a sum of addresses computed from int offsets.
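To make that concern concrete: a proof about the mathematical sum of offsets says nothing about the wrapped value that C int arithmetic actually produces. A hypothetical Python sketch (wrap32 simulates a 32-bit C int; none of this is ATS code):

```python
def wrap32(x):
    # Two's-complement reduction to 32 bits, as C int addition does.
    return ((x + 2 ** 31) % 2 ** 32) - 2 ** 31

a, b = 2 ** 31 - 1, 1      # two offsets, each provably non-negative
assert a >= 0 and b >= 0   # the property a checker might establish
offset = wrap32(a + b)     # what the hardware actually computes
print(offset)              # -2147483648: "pointer + offset" is no longer
                           # covered by the non-negativity of a and b
```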

Georg Bauhaus

unread,
Jun 24, 2011, 6:41:55 AM6/24/11
to
On 6/24/11 3:26 AM, Phil Clayton wrote:

> Arbitrary magnitude integers are supported in at least SML, OCaml and
> Haskell.

They are supported, yes; my point was, however, that special
effort is required: if a program uses "plain" ATS/O'Caml integers,
it will easily run into erroneous execution. I am assuming that
this was neither the programmer's intent, nor was it the intent
of language designers who add a proof system to their language.
It seems, then, that the designers have ignored this aspect
of C int, which they have used.

It doesn't seem right to let C int be the default for such
a language.

Yannick Duchêne (Hibou57)

unread,
Jun 24, 2011, 9:45:08 AM6/24/11
to
Le Fri, 24 Jun 2011 12:32:50 +0200, Georg Bauhaus
<rm.dash...@futureapps.de> a écrit:

> That's Ada history. Find it looking for "Requirements for
> High Order Computer Programming Languages".
Will have a look, thanks for the reference.

> The pattern of ignoring things is present in ML syntax.

If I understand you correctly, this is not a pattern provided by the
language to design things with the language, but a pattern which was
applied when designing the language. This is not the same.

> Andrew
> Appel expounds in his Critique of ML. In one paragraph, he mentions
> that ML syntax does not get him into trouble any more. But in later
> paragraphs, he explains at some length where and how ML syntax was
> *not* given the same attention as the semantics of Standard ML, and
> why this is not the best of things.
>
> I believe that ML syntax wasn't forgotten. Just ignored. It wasn't to
> be refined later, either. It wasn't simplified for modeling
> reasons. It was just ignored. With SML out the door, we, the
> programmers, can get our load of condescension when we complain
> about inscrutable error messages when some expression gulps the
> rest of the compilation unit: "Why didn't you put the
> optional semicolon there, STUPID!? If your expression awaits
> another function, then the compiler is quite right to consider
> the next fun definition in the file to be that function!" Or similar.

You got the point. I indeed agree with that: SML syntax is really an
issue, and if I registered on the wiki dedicated to discussing the future
evolution of SML, it was to suggest reworking the syntax. I planned
to argue it this way: leave the abstract semantic constructs as-is, which
is safe, and work on the syntax, which would be valuable and still safe. I
feel this would add big value at rather low cost.

One example is semicolon omission, as you said (the same error JavaScript
made); there are also ambiguous constructs which cannot be caught by the
compiler and end up in a program which was not what the writer intended. An
example comes with nested “case” conditional expressions (I had a very bad
surprise with this one, and since then I always wrap these in parentheses).
Another example is not an ambiguity, but a weird syntactical
choice: “fun” vs “fn” (a different meaning depending on the presence of a
letter is simply silly, as both refer to the same word, which should
therefore have a unique meaning: “function”). There are also some known
issues among SML compiler developers, involving structure instantiation,
with some syntactical constructs (luckily rare enough) which are so
ambiguous, short of complex handling to resolve the ambiguity, that no SML
compiler supports them.

It even happened that I thought a kind of SML-Lint would be welcome (the
same kind of lint you have with C lint or JSLint; not glorious for
SML, I agree).

OK. But that is still not about forgetting more or less low-level aspects
of a design; that is all about failures in the design of a language. Not
what I introduced.

Side note: obviously Ada is an unbeatable winner in the area of syntax
(maybe modulo some comments worth discussing, like the one you made about
array components).

> (Do you recognize the familiar argument: Joe Bloggs, the programmer,
> writes something obviously correct, but it isn't correct; a language
> "feature" has caught Joe; then someone says that Joe is stupid (and
> shouldn't be programming) because Joe doesn't know the language
> "feature".)

An effect of bad language syntax, or of a barely understandable semantic
definition (as with C/C++), I know. The doctor should prescribe Ada as a
good medicine here :D

> Next they address us in managerial style to talk

Clever wording

> about the Practical
> Programmer who would just accept things the way they are. Just don't
> have the makers of ML admit they have made a serious mistake
> when ignoring the very manifestation of human-computer interaction:
> the syntax. And once the example is out, others will copy it.

Right, except that, as I said above, I am not a proponent of “forget about
language design”.

Do you agree the targets are different?

The silly choice of semicolon-omission behavior has nothing to do with the
target of helping the author write at the model level: it simply does not
serve this target and was not introduced for that purpose (this is what is
called “syntactic sugar” instead… see my actual signature, which is an old
enough quote).

> Syntax is the means by which humans structure what they wish
> to express, a program in our case. How can this be forgotten?

At least, what I am pleased with here is the time you spent pointing out
how much syntax matters (so I can still be pleased even if this is not the
matter I introduced :) ).

> When does industry accept the time loss spent in training
> awareness of idiosyncrasies and in tackling the effects
> of things ignored?

I do not know enough about industry (I am mostly jobless, you know), but I
guess they often make irrational choices (people who sell generally do not
understand the matters that people who do the work have to deal with).

>> Not a language failure, but an application design error.
>
> When a language is basically not thread safe, I'll call it
> a language issue.

Finally, on that, I agree with you.

> Let a problem P have a relational solution R. Demand that R be
> implemented using a non-relational, but acclaimed language O.
> Ignore programmers who mention that the admittedly fashionable
> choice O might turn out wrong, and costly in the long run.

That turns out to be similar to the comment you made above about industry
losing time over silly design errors.

> Another instance of the pattern of ignoring things at work. The
> motive here is the relative weight of feeling the immediate advantages
> of crowd decision on the one hand and concerns about long-term
> technical consequences on the other. If the crowd is the current
> source of profit, the choice seems clear.

I have never invoked the crowd when promoting SML as a possible modeling
language for some Ada applications or design parts.

> This is the crucial bit. The goal is a language based on ML, like
> ATS. But by the pattern, the designers of ATS seem to ignore both
> ML's int (which raises Overflow in place of erroneous execution)
> and they also ignore alternative "functional ints". Instead,
> for "practical reasons" I guess, they choose C int behavior.
> Assume that the new language is out the door, and is being used
> on commercial projects. Will it continue to offer only C int and
> associated semantics or will it move towards better defined int?
> Will sacred backwards compatibility be in the way of rectifying
> int towards ML's, for example?

I believe there is room to rectify the definition of integers, as ATS does
not seem too widely used for the time being.
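For illustration, here is a small sketch of the contrast with C that the quoted paragraph draws (assuming a fixed-precision default `int`, as in SML/NJ): the SML Basis specifies that fixed-precision integer overflow raises the `Overflow` exception, whereas overflow of C's signed `int` is undefined behavior.

```sml
(* Int.maxInt : int option is SOME m when int is fixed precision. *)
val big : int = valOf Int.maxInt

(* Overflow is detected and raised, not silently wrapped: *)
val msg = Int.toString (big + 1)
          handle Overflow => "Overflow raised"
```

On an implementation whose default `int` is arbitrary precision, `Int.maxInt` is `NONE` and no overflow can occur, so the sketch above really does assume fixed-precision integers.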

> Assume, for the sake of generalization, that a pointer points
> safely to a sum of addresses computed from int offsets.

OK, so this was in reference to the prior comment about errors in integer
operations.

Robert A Duff

unread,
Jun 29, 2011, 5:39:05 PM6/29/11
to
an...@att.net writes:

> Taking a weekend break may clear your head, which may help solve the
> problem, but it can also cause you to forget how you were solving that
> problem.
>
> Plus, back in the day, some of the profs would say "It might be the best
> code in the world, but without comments it's a grade of zero." Because
> even the prof may not remember which problem you're working on.

Yes, indeed! I find it extremely annoying when I have to read an entire
screenful of code in order to puzzle out what this code is trying to
accomplish, when a little comment would explain it.

> And it only take a few comments.

Under-commenting is far worse than over-commenting. You shouldn't
comment the obvious. But the person writing the code is necessarily
a poor judge of what's obvious -- that person just got done thinking
carefully about it, so it all seems obvious, but it's mysterious to
somebody else (or to the same person a week later). So I say: err
on the side of too many comments.
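A tiny sketch of Bob's distinction, in SML for continuity with the earlier examples (both functions are invented for illustration): the first comment merely restates the code, while the second records an intent the reader could not recover from the code alone.

```sml
(* Over-obvious comment -- it only restates the code: *)
fun nextIndex i = i + 1   (* add one to i *)

(* Intent comment -- it explains the "why": *)
fun firstEntry i = i + 1  (* slot 0 is a sentinel; real entries start at 1 *)
```

The point is not the arithmetic, which is identical, but that only the second comment would save a reader from puzzling over a screenful of surrounding code.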

- Bob

an...@att.net

unread,
Jun 30, 2011, 12:52:29 PM6/30/11
to
Actually, choosing the right name for a variable is in some cases 50% of
the comments needed. That is one point that a programmer can correctly
choose.

I'm just tired of seeing a page of comments, mostly a dup of the previous
page, for a single statement wrapped in a conditional statement. One of
many reasons that I dislike Ada and GNAT using conditional statements.

Shark8

unread,
Jul 1, 2011, 2:31:03 PM7/1/11
to
On Jun 30, 11:52 am, a...@att.net wrote:
> Actually, choosing the right name for a variable is in some cases 50% of
> the comments needed. That is one point that a programmer can correctly
> choose.
>
> I'm just tired of seeing a page of comments, mostly a dup of the previous
> page, for a single statement wrapped in a conditional statement. One of
> many reasons that I dislike Ada and GNAT using conditional statements.
>
>

IF and CASE?
