
Ada Successor Language


Shark8

Jun 2, 2018, 12:43:52 AM
It occurs to me that in order to design a successor to Ada, there's not merely one language that ought to be defined, but five. The reason is that Ada is several languages all at once: there's a language for generics, a language for proofs [SPARK], a language for low-level hardware, and a language for tasking, in addition to the Ada that maps to "normal" programming languages.
One of the frustrations about Ada as-it-is is that there is a lot that seems like it could be "folded together": things like (e.g.) all the Ada.[Wide_[Wide_]]Strings packages, or some sort of mechanism for [explicitly] showing the relationships between types.
In order to do that we would need some sort of meta-language, wherein all the rest of the languages (ideally both syntactic and semantic) could be defined.

(1) The Meta language
(2) The Generic Language
(3) The Concurrent/Parallelism language
(4) The Proving language [SPARK]
(5) The HW/Representation language

----------
Your thoughts?

Luke A. Guest

Jun 2, 2018, 2:52:13 AM
Shark8 <> wrote:

>
> (1) The Meta language
> (2) The Generic Language
> (3) The Concurrent/Parallelism language
> (4) The Proving language [SPARK]
> (5) The HW/Representation language
>
> ----------
> Your thoughts?
>

There's a guy, eLucian, wanting to implement his own language because Ada is too complex; his is Level, for which he has defined 5 compilers, all written in different languages with different capabilities. I don't agree with that, it's way too complex! Plus, he wants others to do the work while he makes the money on it; no thanks.

You could just use SML and implement those five things with that. But I disagree there too.

Any successor needs to be much simpler in design but not in scope. We should retain multi-paradigm programming (imperative and OOP), but extend that with FP. I'm coming around to FP more, but it's not the be-all and end-all.

The grammar needs to be easier to implement so that tools can be developed much more quickly. Simplify the grammar; maybe look at Python and Ruby for hints on syntax structure.

Any successor needs to retain what Ada can do but also do things better
where Ada struggles and add features for modern programming. This new
language needs to retain the ability to develop anything from small
embedded to server, distributed and bigger applications.

One thing which makes Ada complicated from the compiler perspective is the ability to take multiple compilation units at a time, and the fact that it's defined as being able to take them in one source file. I'd simplify that and separate out the idea of a program. I do like not having to define a "main", though.

I've said it before and it needs to be said again: a new language needs to be Unicode from the start, using UTF-8; text manipulation in Ada is painful at this current stage. For embedded we can also define 8-bit character sets. It needs real strings and a fully capable library.

We should adopt the common C shorthand operators, -=, +=, etc., which means picking a different /= operator; we may as well use != here too.

Being able to override 'Image and 'Value would be really useful for custom types, e.g. outputting a record in JSON format. Imagine a custom image for an enumeration!
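A sketch of what such an override might look like; the aspect name here is purely illustrative and not existing Ada 2012:

```ada
--  Hypothetical: user-defined 'Image for an enumeration
--  (the "Image" aspect below is an assumption, not real Ada)
type Status is (Ok, Failed, Unknown)
   with Image => Status_Image;

function Status_Image (S : Status) return String is
  (case S is
      when Ok      => "ok",
      when Failed  => "failed",
      when Unknown => "???");
--  Status'Image (Failed) would then yield "failed" rather than "FAILED"
```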

Parallel blocks: yes, I know it's coming, but it's something that should still exist in any new language. In fact, as much tasking capability as possible, given how many cores we have now.

Keep the concept of restrictions and profiles/subsets.

Keep packages.

Support endianness from the start, even on primitive types, i.e. not just records.

The runtime needs to be permissively licensed to attract users.

My initial thoughts. I have other frustrations I can’t think of right now.

Dmitry A. Kazakov

Jun 2, 2018, 4:12:25 AM
You forgot:

(6) The language of type declarations

As for #1, that is possible to do, but then you will have just one more language on top of the others. You cannot fuse meta- and object languages.

What one can do is to completely throw away #2 and merge reduced #5 with #6.

P.S. I don't understand this push for a new language. There are only a few mistakes made in Ada which cannot be worked around. And there are problems unresolved at the theoretical level, like handling MD.

A new language solves nothing unless it is conceptually new. Languages created in the last 30 years perfectly illustrate this thesis.

--
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de

Mehdi Saada

Jun 2, 2018, 8:14:20 AM
> I don't understand this push for a new language.
Haven't you already complained about how such and such an area is unlikely even to be fixed because of backward compatibility?
> We should adopt the common C shorthand operators, -=, +=, etc.
What's the problem with @ ? It's more elegant and versatile.

Dmitry A. Kazakov

Jun 2, 2018, 8:43:41 AM
On 2018-06-02 14:14, Mehdi Saada wrote:
>> I don't understand this push for a new language.
> Haven't you complained already how such or such area is unlikely to even be fixed because of backward compatibility ?

In most cases it is merely an excuse to reject a change. Most changes can be made while keeping everything compatible.

E.g. for renaming one can use an alternative syntax (which is needed
anyway) and let the old renaming slowly die.

>> We should adopt the common C shorthand operators, -=, +=, etc.
> What's the problem with @ ? It's more elegant and versatile.

You mean @ for idem as in

X := Idem + 1;

It is semantically problematic because it limits the implementation, or forces a wrong implementation, in the case of temporary, volatile, or shared objects. A special operator would allow implementation via a proper procedure.

It is analogous to automatic dereferencing and indexing. There will be no solution until these operations can be done via user-defined procedures. No number of kludges and obscure helper types will ever work. Assignment is no different.
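A sketch of the hazard being described, assuming X is a volatile or shared counter (names hypothetical):

```ada
--  If  X := @ + 1;  is mere sugar for  X := X + 1;  the compiler
--  must emit a separate read and a separate write of X:
--
--     Tmp := X;         --  read; another task may update X here
--     X   := Tmp + 1;   --  write; that other update is silently lost
--
--  A distinct, user-definable operation, e.g.
--
--     procedure Increment (X : in out Counter; By : Positive := 1);
--
--  could instead be implemented as a single protected/atomic action,
--  which the sugar form cannot express.
```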

Luke A. Guest

Jun 2, 2018, 8:57:28 AM
Mehdi Saada <0012...@gmail.com> wrote:

>> We should adopt the common C shorthand operators, -=, +=, etc.
> What's the problem with @ ? It's more elegant and versatile.
>

The proposal is to use +:=, -:=, etc., which is OK but a little weird. The other proposal is to use @ as the LHS shorthand.
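For concreteness, a sketch of the spellings being weighed (hypothetical syntax; neither shorthand is legal Ada 2012):

```ada
Total := Total + Step;   --  status quo
Total +:= Step;          --  proposed compound assignment
Total := @ + Step;       --  proposed @ as shorthand for the LHS
```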


Dan'l Miller

Jun 2, 2018, 9:48:05 AM
On Saturday, June 2, 2018 at 3:12:25 AM UTC-5, Dmitry A. Kazakov wrote:
> (6) The language of type declarations

The successor to Ada should be hypoAda:
a) that allows writing today's Ada as a library/personality, and
b) that allows writing a somewhat different Ada as an alternate personality.
That way all the backwards-compatibility issues can be merely labeled as historical legacy in today's Ada, not brought over to hyperAda, but expressible in hypoAda so as to support today's Ada in the same compiler.

Note that something like hyperAda would be a better working name than succAda (successor to Ada) or even Ada++ (unless the next-gen Ada was to import much of C/C++'s symbol syntax: ++, --, &, &&, {}, and so forth). Btw, today's Ada needs a nickname too—perhaps Ada8652 from the ISO standard number.

Seed7 (a somewhat-Ada derivative) and Alphard (Carnegie-Mellon's 1970s research language) have already explored some of these hypoAda topics. Alphard even had a sublanguage way of writing the syntax of various languages' looping constructs; Seed7 approaches the topic differently, but accomplishes much the same result.

There should be some sort of ground-rules for hyperAda so that it doesn't go far afield with, say, C-family syntax as merely being a cousin of Rust.

Byron seeks to be modular in the backend to permit LLVM or libFirm or GIMPLE-RTL. Byron could also be modular in the ‘lower-front-end’ so to speak to permit alternative Ada-esque personalities of front-end ••in the same compiler••, choosable on the compiler's command-line, as if the Ada-esque personality ‘upper-frontend’ were a pluggable DLL to the compiler:
A) today's Ada8652
B) Ada-merged-with-SPARK
C) Pascal-family-syntaxed hyperAda, leaving backwards compatibility behind in choices A and/or B
D) even Ada(-or-hyperAda)-semantics-expressed-C-crypto-syntax instead of Pascal syntax: Ada with braces instead of begin/end, and a hundred other syntactic transliterations
E) and so forth.

G.B.

Jun 3, 2018, 3:39:31 AM
On 02.06.18 14:43, Dmitry A. Kazakov wrote:

> It is semantically problematic because it limits the implementation, or forces a wrong implementation, in the case of temporary, volatile, or shared objects. A special operator would allow implementation via a proper procedure.
>
> It is analogous to automatic dereferencing and indexing. There will be no solution until these operations can be done via user-defined procedures. No number of kludges and obscure helper types will ever work. Assignment is no different.
>

What set of guarantees can user-defined Ada offer, in comparison,
when users will provide those operations to the compiler that are
now guaranteed by the language? (Where to "draw the line"?)

Example:

- typed, indexed RAM (array),

as opposed to

- typed, indexed data structure (user defined).

Dmitry A. Kazakov

Jun 3, 2018, 3:58:09 AM
It is not who provides implementation but what the interface of an array
is. The difference is between:

Update (Get_Reference (A, I), V)

and

Update (A, I, V)

There is no way #1 can handle all important cases of containers and
there is no way to shape #1 in a way acceptable from the software design
POV.

ric....@gmail.com

Jun 3, 2018, 6:01:31 AM
Ada is an incredible language as it is. It doesn't need a successor - it needs more adoption.

Dan'l Miller

Jun 3, 2018, 9:04:36 AM
On Sunday, June 3, 2018 at 5:01:31 AM UTC-5, ric....@gmail.com wrote:
> Ada is an incredible language as it is. It doesn't need a successor - it needs more adoption.

I actually agree with this for the most part. Ada needs another compiler (or at least one of its few compiler vendors to market themselves differently) more than it needs a successor language.

The only way that Ada needs a successor is a minor variant that explores language changes that the ARG doesn't pursue, mostly in the category of great ideas that would break backwards compatibility on arcane topics. But the chance that a successor to Ada would restrain itself to strictly being a minor variant is low, which is almost too much risk.

Conversely, what has cemented the view that C-family syntax is more readable than Pascal-family syntax is that there is a plethora of C-syntaxed cousin and sibling languages to C. Ada looks like a dead-end evolutionary branch. Evolutionarily-dead-end Ada could have been named: Duckbilled Platypus, Dodo bird, Coelacanth. Among these, Coelacanth is best: thought by many to be extinct, but still occasionally found in the wild.

What would be healthy for Ada is a new compiler whose front end is implemented as a honeypot to attract researchers to experiment with new variants of Ada, as they vaunt their researchwares as The Next Big Thing with much energy and fanfare. Imagine someone even experimenting with •exactly• the same Ada that we have today but with strictly one topic changed, either:

1) an entirely different memory model than Ada has today in its storage pools (or should I say, entirely different than the cattle chute that compilers & standard-library sends us down today);

or
2) an entirely different C-symbol cryptosyntax;

or
3) multimethods built-in;

or
4) some logic programming other than SPARK;

or
5) multistage programming source-code generation (instead of a preprocessor);

or
6) overt choice of different Ichbiah/Ada83 or Taft/Ada95 or Randy/Ada2020 or Rust-esque correctness enforcement;

or
7) a hundred other experiments.

Hence the idea earlier in this thread for an Alphard-esque-on-steroids or Seed7-esque-on-steroids hypoAda, on which today's Ada8652* can be implemented, but from which syntactic or semantic variants can also be easily derived. LLVM is an analogous honeypot for research on the backend. Likewise, hypoAda could be another honeypot, intentionally designed to attract researchers to fan the flames of energy among the universities (again) and among the tech billionaires (for the first time) regarding Ada.

* Ada8652 is all of the ISO 8652 eras: Ada83, Ada95, Ada2005, Ada2012, Ada2020, …

Shark8

Jun 3, 2018, 9:09:01 AM
On Sunday, June 3, 2018 at 4:01:31 AM UTC-6, ric....@gmail.com wrote:
> Ada is an incredible language as it is. It doesn't need a successor - it needs more adoption.

I actually agree; I'd like to see more adoption -- but there are some well-founded gripes, even among the hardcore Ada supporters -- and there have been several major design mistakes such as anonymous access types and access parameters. (The ARG so highly values backwards compatibility that, honestly, there's more chance of a successor language than the correction of most of these mistakes; just ask Randy.)

But a discussion like this allows us to discuss these problems, possibly even getting an 'ah-ha!' moment for solving them within current Ada. And if it does result in a successor language, we'll have a better idea of exactly what the bad parts are, and how to correctly address what they were intended to do.

Lucretia

Jun 3, 2018, 11:09:33 AM
On Sunday, 3 June 2018 11:01:31 UTC+1, ric....@gmail.com wrote:
> Ada is an incredible language as it is. It doesn't need a successor - it needs more adoption.

This is half of the problem. People have an irrational hatred of the language:

1) starting from the inception of the language.
2) from being told it's bad, usually by people who've never touched it, because they were told it's bad.

It's seen as old fashioned because it uses a Wirthian-esque syntax, yet C syntax is just as old.

People are stupid, see quotes (kind of) like:

1) "I can't read it, it's too verbose," some C/C++ programmer, who can't read words but apparently can read the crap that people write in those languages.
2) "I can't read it, it needs braces," yeah, because braces are so easy to read, same as 1.
3) "It achieves safety by tedium. It's about as fun as COBOL" << actual quote

Then on top of that we have:

1) Not enough tools.
2) Not enough libs.
3) Try finding *GOOD* auto-completion for Ada in an editor.
4) There's no one place to get Ada sources from; people want an equivalent of Go's go get and Rust's cargo.
5) There's not enough people working together.
6) The commercial users of Ada, aerospace/military, don't do open source, they use it though.
7) The speed of language updates really is glacial and based mainly on what the commercial users of the language want.
8) Not enough use outside military and aerospace.

For 3 above: I hate Vi-like editors; Emacs is dog slow, and its auto-complete is tag-based and pretty crap, imo. GPS crashes all over the place, and auto-complete doesn't work there either. I refuse to use web-browser-based editors, as I don't consider them stable, good, or fast enough, and I consider them experimental in nature. Everything else isn't good either.

So the choice is, imo, a new toolchain sticking with all the Ada baggage, or a new language with which we can start from a blank slate and attempt to make a better Ada for this and the next century.


ric....@gmail.com

Jun 3, 2018, 11:14:13 AM
I'm currently training a team of 1st and 2nd year students in Ada. They're loving it. I run a company that specializes in commercial and enterprise Ada applications.

I think training and building, like I am actually doing, is a lot less work, and a lot more effective, than trying to fix something that's not broken.

Lucretia

Jun 3, 2018, 11:14:34 AM
On Sunday, 3 June 2018 14:09:01 UTC+1, Shark8 wrote:

> there's more chance of a successor language than the correction of most of these mistakes; just ask Randy.

^ This

Lucretia

Jun 3, 2018, 11:16:01 AM
On Sunday, 3 June 2018 16:14:13 UTC+1, ric....@gmail.com wrote:
> I'm currently training a team of 1st and 2nd year students in Ada. They're loving it. I run a company that specializes in commercial and enterprise Ada applications.
>
> I think training and building, like I am actually doing, is a lot less work, and a lot less effective than trying to fix something that's not broken.

Ask them what they like about it and how it compares to what they're used to, and explicitly ask what they're used to, which language, etc.

ric....@gmail.com

Jun 3, 2018, 11:22:34 AM
No. I'd rather focus on getting them to work. We'll be open-sourcing a large majority of our work too.

If you want to create a new language, go for it. But there exists no other language that is as carefully engineered and considered as Ada. You're talking about making another Rust.

In my mind, I'm on a mission to bring Ada to more general use, and to promote good engineering discipline that Ada already encourages.

I can't understand why you're here. If you think you've got such a great idea, go build it. I have no more to contribute to this discussion - I've got to get to work.

Lucretia

Jun 3, 2018, 11:31:14 AM
On Sunday, 3 June 2018 16:22:34 UTC+1, ric....@gmail.com wrote:

> I can't understand why you're here. If you think you've got such a great idea, go build it. I have no more to contribute to this discussion - I've got to get to work.

Because I like the language, but there are issues with it too; to say there aren't is to bury your head in the sand.

ric....@gmail.com

Jun 3, 2018, 11:41:27 AM
*Sigh* Alright, you baited me just enough with that one.

I've overseen the development, deployment, and maintenance of very large, distributed, high-availability cloud-native enterprise systems, exclusively in Ada 2012.

We've used just about every single feature of the language in this. We've got hundreds of packages, and prevalent re-use. We have had not one single bug in production, and no unplanned downtime. Testing took about 10% of the allotted time because of the lack of bugs.

Nothing in this world is perfect. But I have not encountered a single serious deficiency in Ada. Having trained quite a few people in Ada, and supervised teams, as well as personally contributed a large portion of the original codebase, I can tell you from direct experience that every so-called deficiency in Ada has actually been a reflection of a detractor's own personal shortcomings. I know that's harsh to say, but this is frankly a huge part of the problem here.

Ada is very different from other languages because it is probably the only one out there that was not written by programmers for programmers. This is why it gets hate. Making it more programmer friendly is not fixing it.

Yes, we need more compilers, and more than AdaCore. I'm working on that. I'm giving it my all.

But nothing gets my blood boiling more than talk of reinventing the wheel. There are more pressing problems in computing and software. There will never be a "perfect" language. We are making too many damn languages. This is not the real problem.

Dmitry A. Kazakov

Jun 3, 2018, 11:54:04 AM
On 2018-06-03 17:09, Lucretia wrote:

> GPS crashes all over the place and auto complete doesn't work there either. I refuse to use web browser based editors as I don't consider them stable or good enough or fast enough and I consider them to be experimental in nature. Everything else isn't good either.

Maybe the Linux version does. The Windows version of GPS is quite stable, and automatic completion works just fine. The only problem is that I don't know how to turn it off, for I hate that annoying stuff.

> So, the choice is, imo, a new toolchain sticking to all the Ada baggage, or a new language which we can start from a blank slate and attempting to make a better Ada for this and the next century.

And why should it turn out any easier? The reasons why Ada is not popular and why there is no investment in Ada tools and libraries will not go away. These reasons lie outside technicalities; the technology is subject to them and is shaped so that only the worst languages gain support. You cannot work against negative selection, it is a force of economic nature. A new language could only be worse in order to survive. I stay with Ada.

Jeffrey R. Carter

Jun 3, 2018, 12:43:20 PM
On 06/03/2018 05:09 PM, Lucretia wrote:
>
> GPS crashes all over the place and auto complete doesn't work there either.
I haven't had any problems with GPS crashing nor with its auto-completion features.

GPS 6.1.2016, which comes from the Xubuntu 18.04 repositories, has problems with highlighting, so I use GPS 2017, which doesn't have that problem.

--
Jeff Carter
"It's symbolic of his struggle against reality."
Monty Python's Life of Brian
78

Björn Lundin

Jun 3, 2018, 2:44:32 PM
On 2018-06-03 17:09, Lucretia wrote:
> GPS crashes all over the place and auto complete doesn't work there either.

Hmm, I use it on Debian almost every day. I can't say that I consider it crash-prone; I can't remember when it last happened.

--
Björn

Paul Rubin

Jun 3, 2018, 3:37:38 PM
Lucretia <lague...@googlemail.com> writes:
> It's seen as old fashioned because it uses a Wirthian-esque syntax,
> yet C syntax is just as old.

It has Algol-esque syntax, and Algol's best-known dialect was from 1960. C is nowhere near that old. More cogently, C's syntax displaced Algol/Pascal syntax because people liked its conciseness.

> 1) "I can't read it, it's too verbose," some C/C++ programmer, who
> can't read words but apparently can read the crap that people write in
> those languages.

"Verbose" to me refers to the total amount of code you have to write to get something done, rather than localized begin-vs-curly differences. It does seem to me that you need more code in Ada than in other languages; looking at examples on Rosetta Code bears this out.

I looked at the docs for Ada's priority queues with the idea of fixing
the Rosetta example about Hamming numbers, but it started to look like a
huge project, compared with the Python heapq-based example that I wrote
in about 5 minutes. The Corporate Bullshit Sentence Generator that I
looked at a few weeks ago (while a fun program) also seemed very
verbose, with its 500 different random number generation functions and
supporting datatypes.
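For reference, the Python side of that comparison really is short. A minimal heapq-based Hamming-number generator (my own sketch, not the Rosetta Code entry being discussed) looks roughly like:

```python
import heapq

def hamming(n):
    """Return the first n Hamming numbers (numbers whose only
    prime factors are 2, 3, and 5), using a min-heap."""
    heap, seen, out = [1], {1}, []
    while len(out) < n:
        h = heapq.heappop(heap)       # smallest not yet emitted
        out.append(h)
        for m in (2, 3, 5):           # push successors, deduplicated
            if h * m not in seen:
                seen.add(h * m)
                heapq.heappush(heap, h * m)
    return out

print(hamming(10))   # -> [1, 2, 3, 4, 5, 6, 8, 9, 10, 12]
```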

> 6) The commercial users of Ada, aerospace/military, don't do open
> source, they use it though.

I thought Ada was losing ground even in those areas.

Dan'l Miller

Jun 3, 2018, 7:56:05 PM
On Sunday, June 3, 2018 at 2:37:38 PM UTC-5, Paul Rubin wrote:
> Lucretia writes:
> > It's seen as old fashioned because it uses a Wirthian-esque syntax,
> > yet C syntax is just as old.
>
> It has Algol-esque syntax and Algol's best known dialect was from 1960.
> C is nowhere near that old.

C's syntax derives directly from B. (B descended from BCPL at Multics but BCPL had a more Algol-esque/PL/I-lite syntax.) B originated in 1969.

> > 6) The commercial users of Ada, aerospace/military, don't do open
> > source, they use it though.
>
> I thought Ada was losing ground even in those areas.

Job postings are a moderately good leading indicator. Job postings from defense contractors in the Dallas & Fort Worth, Texas, metropolitan area fall into 3 categories:
1) Java, Java, everywhere, including non-UI/UX embedded systems.
2) C/C++ for embedded hard-realtime.
3) Ada-to-C++ conversion to decommission Ada systems.

In recent months, I haven't seen •any• Ada-only or you're-going-to-author-Ada-source-code job postings at DoD/aerospace companies locally here. And that includes the local facilities for Raytheon, Boeing, Rockwell Collins, L3, General Dynamics, Northrop Grumman, and Textron's Bell helicopters.

Paul Rubin

Jun 3, 2018, 8:24:29 PM
"Dan'l Miller" <opt...@verizon.net> writes:
> C's syntax derives directly from B. (B descended from BCPL at Multics
> but BCPL had a more Algol-esque/PL/I-lite syntax.) B originated in
> 1969.

Yes, Algol (starting 1958) is considerably older in computing history.
I do see that BCPL used curly braces though.

> Job postings are a moderately good leading indicator....
> 1) Java, Java, everywhere, including non-UI/UX embedded systems.
> 2) C/C++ for embedded hard-realtime.
> 3) Ada-to-C++ conversion to decommission Ada systems.

Oh well, 2) and 3) are unfortunate I guess. 1) is meh: I can see
advantages to using Java (or these days Scala) in non-realtime
programming.

Rust's ascendance shows people really do see the issues of C and C++,
enough to be willing to switch languages. That they chose to develop a
new language instead of using Ada supports the idea that it's time for
an Ada successor. Is Rust the successor? Should Mozilla's Firefox
rewrite have just used Ada instead? I dunno.

Ben Bacarisse

Jun 3, 2018, 8:41:56 PM
Paul Rubin <no.e...@nospam.invalid> writes:

> "Dan'l Miller" <opt...@verizon.net> writes:
>> C's syntax derives directly from B. (B descended from BCPL at Multics
>> but BCPL had a more Algol-esque/PL/I-lite syntax.) B originated in
>> 1969.
>
> Yes, Algol (starting 1958) is considerably older in computing history.
> I do see that BCPL used curly braces though.

That's relatively recent. The 1967 description is neutral on the exact characters; it uses a section sign and an underlined section sign. The implementation I am familiar with used $( and $).

<snip>
--
Ben.

ric....@gmail.com

Jun 3, 2018, 10:01:13 PM
On Sunday, June 3, 2018 at 7:56:05 PM UTC-4, Dan'l Miller wrote:

> Job postings are a moderately good leading indicator. Job postings from defense contractors in the Dallas & Fort Worth, Texas, metropolitan area fall into 3 categories:
> 1) Java, Java, everywhere, including non-UI/UX embedded systems.
> 2) C/C++ for embedded hard-realtime.
> 3) Ada-to-C++ conversion to decommission Ada systems.
>
> In recent months, I haven't seen •any• Ada-only or you're-going-to-author-Ada-source-code job postings at DoD/aerospace companies locally here. And that includes the local facilities for Raytheon, Boeing, Rockwell Collins, L3, General Dynamics, Northrup Grumman, and Textron Vaught Bell helicopters.

There are a few studies out there showing that training new hires in Ada is very easy, and much easier than trying to find programmers already proficient in Ada. Of course the money-driven defence industry would rather hire a bunch of cheap young people than older, experienced people. I also do Ada training, and I concur that it is quite easy to do. We also don't explicitly require Ada in our job postings, since most universities don't teach Ada (unfortunately), and we're not working in aerospace.

So I personally don't think that job postings not listing Ada as a requirement really mean, definitively, that these companies are not using Ada anymore. Ada is technically superior in many cases, and this has to be a consideration for some of the more competent managers, however rare they may be.

Dan'l Miller

Jun 3, 2018, 10:27:13 PM
On Saturday, June 2, 2018 at 1:52:13 AM UTC-5, Luke A. Guest wrote:
> We should adopt the common C shorthand operators, -=, +=, etc. Which means
> picking a different /= operator, may as well use != here too.

If hardwiring C-symbol syntax into next-generation replaces hardwiring Pascal/Wirth-esque syntax into Ada8652, then next-generation Ada had better have a hundred or a thousand better reasons to exist than such cosmetic battles. The only way that C-symbol syntax should be allowed into next-generation Ada is as per-site policy.

Ada already is the A#1 language on the planet* where per-site policies harshly dictate which language features are forbidden (e.g., don't use any Ada95-or-later features; don't use tagged-record-based features; don't use any post-Ada95 features whatsoever; don't use anything that begins with Unchecked_). And this is in addition to Ada's much-vaunted profiles adding focused areas of functionality and/or assurance.

* and that is an extraordinarily difficult competition to win versus C++, whose coding standards read like a long list of all the dozens to low hundreds of undefined-behaviors/gotchas/code-smells to not do.

Perhaps Ada should take the hint that its main claim to future fame is to be the ultimate language for per-site policy-set custom-tailorization. In this view, Ada could have various correctness checks/philosophies turned on or off.

To Luke's C-symbol point, a per-site (or even per-developer) policy could be: at check-out, give me the C-symbol alternate syntax, or give me the old pragma-based syntax, or give me the new aspect-based syntax, or any other syntactic variant for the same underlying semantic meaning.

John Smith

Jun 3, 2018, 11:19:44 PM
On Saturday, June 2, 2018 at 12:43:52 AM UTC-4, Shark8 wrote:
> It occurs to me that in order to design a successor to Ada, there’s not merely one language that ought to be defined — but five — and the reason is that Ada is several languages all at once: there’s a language for generics, a language for proofs [SPARK], low-level hardware, and a language for tasking in addition to the Ada that maps to “normal” programming languages.
> One of the frustrations about Ada as-it-is is that there is a lot that seems like it could be “folded together”, things like (eg) all the Ada.[Wide_[Wide_]]Strings packages. Or, some sort of mechanism for [explicitly] showing the relationships between types.
> In order to do that we would need some sort of meta-language, wherein all the rest of the languages (ideally both syntactic and semantic) could be defined.
>
> (1) The Meta language
> (2) The Generic Language
> (3) The Concurrent/Parallelism language
> (4) The Proving language [SPARK]
> (5) The HW/Representation language
>
> ----------
> Your thoughts?

It seems like Ada has some legacy code (like the "Wide" packages) left over. Splitting it up into 5 different languages doesn't make any sense. It's like deciding, because your Tetris game has some legacy features, to make 5 different games: one game just rotates a block, one game just drops blocks, and so on. This doesn't make any sense.

Jacob Sparre Andersen

Jun 4, 2018, 1:01:31 AM
Jeffrey R. Carter wrote:

> I haven't had any problems with GPS crashing nor with its
> auto-completion features.

The version packaged with Debian isn't exactly stable, but if you only
have one main per project, it is usable.

The binary distribution provided directly by AdaCore works fine, but
I've only tried it with compilers distributed by AdaCore.

Greetings,

Jacob
--
"It's not a question of whose habitat it is,
it's a question of how fast you hit it."

Simon Wright

Jun 4, 2018, 3:19:23 AM
Jacob Sparre Andersen <ja...@jacob-sparre.dk> writes:

> Jeffrey R. Carter wrote:
>
>> I haven't had any problems with GPS crashing nor with its
>> auto-completion features.
>
> The version packaged with Debian isn't exactly stable, but if you on
> have one main per project, it is usable.
>
> The binary distribution provided directly by AdaCore works fine, but
> I've only tried it with compilers distributed by AdaCore.

From a Mac perspective, I have to use the AdaCore binary (far too many
dependencies for me to manage a build myself!), and always with an FSF
compiler.

Managing more than one FSF compiler is a little tedious, though (have to
edit a little script deep inside the application bundle).

Dmitry A. Kazakov

Jun 4, 2018, 3:44:57 AM
Out of curiosity.

Does OS X have a free repository of binary packages to put Ada there?
Homebrew looks like source only.

Why did nobody port rpm-dnf or dpkg-apt to OS X?

Lucretia

unread,
Jun 4, 2018, 9:25:33 AM6/4/18
to
On Monday, 4 June 2018 08:19:23 UTC+1, Simon Wright wrote:

> From a Mac perspective, I have to use the AdaCore binary (far too many
> dependencies for me to manage a build myself!), and always with an FSF
> compiler.
>
> Managing more than one FSF compiler is a little tedious, though (have to
> edit a little script deep inside the application bundle).

This is what I have atm, and it's unstable.

Björn Lundin

unread,
Jun 4, 2018, 9:53:14 AM6/4/18
to
On 2018-06-04 09:44, Dmitry A. Kazakov wrote:
> Out of curiosity.
>
> Does OS X have a free repository of binary packages to put Ada there?
> Homebrew looks like source only.

Mac had DarwinPorts, which after some fuss became MacPorts.
This was - at the beginning - source only.
But later releases include some binary ports too,
like the big and/or popular ones.
And I think the number of binary ports keeps increasing.


--
--
Björn

Dan'l Miller

unread,
Jun 4, 2018, 10:08:35 AM6/4/18
to
On Monday, June 4, 2018 at 2:19:23 AM UTC-5, Simon Wright wrote:
> Jacob Sparre Andersen writes:
>
> > Jeffrey R. Carter wrote:
> >
> >> I haven't had any problems with GPS crashing nor with its
> >> auto-completion features.
> >
> > The version packaged with Debian isn't exactly stable, but if you only
> > have one main per project, it is usable.
> >
> > The binary distribution provided directly by AdaCore works fine, but
> > I've only tried it with compilers distributed by AdaCore.
>
> From a Mac perspective, I have to use the AdaCore binary (far too many
> dependencies for me to manage a build myself!)

Luke, the your-mileage-may-vary different outcomes for (in)stability on the ‘same’ Linux distribution might be due to your machine having a ••different mix of versions of .so shared-library-based dependencies•• than the people who are reporting that it is perfectly stable.

Lucretia

unread,
Jun 4, 2018, 10:55:24 AM6/4/18
to
On Monday, 4 June 2018 15:08:35 UTC+1, Dan'l Miller wrote:

> Luke, the your-mileage-may-vary different outcomes for (in)stability on the ‘same’ Linux distribution might be due to your machine having a ••different mix of versions of .so shared-library-based dependencies•• than the people who are reporting that it is perfectly stable.

I know. I just haven't got around to packaging it for free-ada yet, the billion dependencies and getting the right versions is a nightmare.

G. B.

unread,
Jun 4, 2018, 12:54:47 PM6/4/18
to
Dmitry A. Kazakov <mai...@dmitry-kazakov.de> wrote:

> Why did nobody port rpm-dnf or dpkg-apt to OS X?

MacOS users can expect a working DMG, at least if the vendor is not
[Oracle].
Also, MacOS offers a pretty stable BSD Unix, I‘d think. MacPorts etc. have
been one backfiring maintenance hazard needing attention all the time.

I don‘t know a good reason for free software to depend on layers of Unix
“ported” to Unix!
Isn’t it a better idea to clean out dated non-portable non-POSIX
dependences?

Even performance-wise, and addressing architecture, if GNAT can benefit
from features of the Darwin OS, why put a hindrance of several layers of
C libraries in between?

Dmitry A. Kazakov

unread,
Jun 4, 2018, 3:37:53 PM6/4/18
to
On 2018-06-04 18:54, G. B. wrote:

> I don‘t know a good reason for free software to depend on layers of Unix
> “ported” to Unix! Isn’t it a better idea to clean out dated non-portable non-POSIX
> dependences?

[ Well, having no POSIX is a great advantage ]

It looks like Mac OS X is in no position to have a say in anything. The
situation is similar to the one for Ada. We have to use horrid Linux C
libraries however much we hate it, because there is no substitute
written in Ada.

> Even performance wise, and addressing architecture, if GNAT can benefit
> from features of the Darwin OS, why put a hindrance of several layers of C
> libraries in between?

Because there is nothing else?

Mehdi Saada

unread,
Jun 4, 2018, 4:56:53 PM6/4/18
to
What kind of structural change/additional features would allow for more compile-time bugs discovery/safety/speed/you-name-it ?

...Damn, reading Dan Miller made me speak like Marvel's Celestials!

Dan'l Miller

unread,
Jun 4, 2018, 5:06:03 PM6/4/18
to
On Saturday, June 2, 2018 at 1:52:13 AM UTC-5, Luke A. Guest wrote:
> Shark8 <> wrote:
>
> >
> > (1) The Meta language
> > (2) The Generic Language
> > (3) The Concurrent/Parallelism language
> > (4) The Proving language [SPARK]
> > (5) The HW/Representation language
> >
> > ----------
> > Your thoughts?
> >
>
> There’s a guy, eLucian, wanting to implement his own language because Ada
> is too complex, his is called Level, where he has defined 5 compilers all written
> in different languages with different capabilities.

5 compilers, written in 5 different languages is a mess. The Level logo is an eye-catching design though.

Conversely, Shark8's Meta-language is analogous to what I call hypoAda: the rudimentary Ada-esque language constructs in which Ada8652 (or variants thereof) could be written as a library of sorts (or perhaps a plug-in DLL to the compiler). These rudimentary language constructs would be somewhat like what Seed7 does and very much as William Wulf's & Mary Shaw's Alphard partially explored during the 1970s.

Imagine Seed7 not being a subset of Ada's type-declaration language. Then Seed7 would be a start on hypoAda or what, I think, Shark8 calls The Meta-language.

Paul Rubin

unread,
Jun 4, 2018, 5:14:57 PM6/4/18
to
Shark8 <onewing...@gmail.com> writes:
> (1) The Meta language
> (2) The Generic Language
> (3) The Concurrent/Parallelism language
> (4) The Proving language [SPARK]
> (5) The HW/Representation language

By "Meta language" I first thought you meant replacing the ARM with a
machine checkable specification, written in something like Twelf[1].
That seems like a great idea. But instead it seems like you want some
general syntactic umbrella in which the other stuff is embedded as
DSL's. I.e. you are reinventing Lisp ;).

Also it's unclear what you mean by the Generic language: is that
supposed to be a special sub-language for type-level programming,
something like ML's module language? I dunno, maybe that should be
closely connected with the proving language.

Concurrency/Parallelism doesn't need a special language: it's mostly
runtime stuff with a bit of compiler backend support and maybe an even
smaller bit of syntactic support.

The proving language might look quite a bit different from SPARK because
of how much that field has changed since SPARK was new. I wonder if
again a Lisp-inspired approach would be helpful: the language definition
would include a formal spec for AST's that could be handed off to
external proof systems, along with any embedded assertions and contracts
that could be connected up with proofs packaged separately from the user
program.

I wonder if we're in for some big advances in automated proof search any
time soon. I haven't been hearing anything about it but it just seems
like an obvious thing for people to be working on.

Randy Brukardt

unread,
Jun 4, 2018, 5:17:15 PM6/4/18
to
"Luke A. Guest" <lag...@archeia.com> wrote in message
news:217213441.549636947.114...@nntp.aioe.org...
> Mehdi Saada <0012...@gmail.com> wrote:
>
>>> We should adopt the common C shorthand operators, -=, +=, etc.
>> What's the problem with @ ? It's more elegant and versatile.
>>
>
> The proposal is to use +:=, -:=, etc which is ok but a little weird.

Which has been extensively discussed by the ARG but rejected. There's both
technical issues (given the possibility of overloading, "A +:= B" cannot
always be equivalent to "A := A + B") and the ugliness factor.

> The other proposal is to use @ as the lhs shorthand.

This is in Ada 2020. I believe GNAT already supports it (in extensions
mode). It's a lot more flexible since it can be used in any function or
attribute -- no restriction to operators.
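
As a quick sketch of the difference (assuming a compiler with Ada 2020
support, e.g. GNAT in extensions mode; the type and names below are
invented for illustration):

```ada
--  Sketch of the Ada 2020 target name '@' (names invented for illustration).
procedure Target_Name_Demo is
   type Counter is range 0 .. 1_000;
   C : Counter := 5;
begin
   C := @ + 1;                 --  reads as C := C + 1
   C := Counter'Max (@, 10);   --  '@' also works inside any call or attribute
end Target_Name_Demo;
```

The second assignment is the kind of thing "+:=" could never express.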

Randy.



G.B.

unread,
Jun 4, 2018, 6:13:00 PM6/4/18
to
On 04.06.18 21:37, Dmitry A. Kazakov wrote:
> On 2018-06-04 18:54, G. B. wrote:
>
>> I don‘t know a good reason for free software to depend on layers of Unix
>> “ported” to Unix! Isn’t it a better idea to clean out dated non-portable non-POSIX
>> dependences?
>
> [ Well, having no POSIX is a great advantage ]
>
> It looks like Mac OS X is in no position to have a say in anything.

Actually, Apple's devices have inspired AdaCore customers to want
a way of translating Ada to whatever runs on iOS. It's done via C,
currently, IIUC. Since much of iOS inherits the C-tradition, and since
AdaCore had already put efforts into that kind of translation,
that's a reasonable choice.


>> Even performance wise, and addressing architecture, if GNAT can benefit
>> from features of the Darwin OS, why put a hindrance of several layers of C
>> libraries in between?
>
> Because there is nothing else?

Why layers of Unix on Unix?

If Unix needs to be between Ada and the OS for GNAT to work,
then why not just use the BSD Unix that Macs have got?

It seems as though programmers like putting much effort into making
their programs work with autoconf and xyz-ports.
This takes away much time that could be spent on making programs just
work on the most wide spread flavors of Unix. Just in case, that
includes Apple BSD and Linux for Android.

Shark8

unread,
Jun 4, 2018, 8:17:26 PM6/4/18
to
On Monday, June 4, 2018 at 3:14:57 PM UTC-6, Paul Rubin wrote:
> Shark8 writes:
> > (1) The Meta language
> > (2) The Generic Language
> > (3) The Concurrent/Parallelism language
> > (4) The Proving language [SPARK]
> > (5) The HW/Representation language
>
> By "Meta language" I first thought you meant replacing the ARM with a
> machine checkable specification, written in something like Twelf[1].
> That seems like a great idea. But instead it seems like you want some
> general syntactic umbrella in which the other stuff is embedded as
> DSL's. I.e. you are reinventing Lisp ;).

A little of column-A, a little of column-B.
(I would love to get my hands on a copy of the source for Symbolics Ada; it was awesome to see how integrated it was with the LISP environment -- but I have no idea where to start for that.)

Having the LRM define a nice, relatively-simple, machine-checkable language which could be used to define everything else would be really, really nice for compiler developers.

Having [some] things formally and explicitly defined in terms of sets would also be helpful -- see: https://en.wikibooks.org/wiki/Ada_Programming/Type_System#The_Type_Hierarchy -- having (eg) a real definition of Universal_Integer would allow some nice definition clean-up.
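
For instance, named numbers are already of type universal_integer; a tiny sketch (the package and constants below are invented for illustration):

```ada
--  Named numbers are of type universal_integer: exact, unbounded,
--  and converted only at the point of use (names invented).
package Universal_Demo is
   Big   : constant := 2 ** 200;                 --  exact at compile time
   Small : constant Integer := Big / 2 ** 195;   --  static, fits Integer (32)
end Universal_Demo;
```

A formal, set-based definition would make it clear exactly where such conversions are legal.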

(ALSO: I personally think that a user definable "abstract type interface" [kinda/sorta type classes], as well as user-definable attributes, would be great ways to handle the above while expanding the capabilities of Ada'Succ.)

>
> Also it's unclear what you mean by the Generic language: is that
> supposed to be a special sub-language for type-level programming,
> something like ML's module language? I dunno, maybe that should be
> closely connected with the proving language.

Ada already /has/ a generic language; the whole section on generic formal parameters defines it -- https://en.wikibooks.org/wiki/Ada_Programming/Generics#Generic_formal_types -- and these are going to be expanded in Ada 2020, IIUC, to certain provability properties.
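
A minimal sketch of that formal-parameter sub-language (the generic and its formals below are invented for illustration):

```ada
--  The formal part is effectively its own small type-level language.
generic
   type Element is private;                                  --  any definite type
   with function "<" (L, R : Element) return Boolean is <>;  --  defaulted formal
function Generic_Max (A, B : Element) return Element;

function Generic_Max (A, B : Element) return Element is
begin
   return (if A < B then B else A);   --  Ada 2012 conditional expression
end Generic_Max;
```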

>
> Concurrency/Parallelism doesn't need a special language: it's mostly
> runtime stuff with a bit of compiler backend support and maybe an even
> smaller bit of syntactic support.

Except it already is its own language: TASK, SELECT, and arguably DELAY don't really appear in the rest of the language.

This is likely going to become more apparent with parallel-blocks and the proposed method for mapping.
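
A tiny sketch of that sub-language (the task and entry below are invented for illustration):

```ada
--  TASK, SELECT and DELAY form a vocabulary that appears nowhere else
--  in the language (names invented for illustration).
procedure Tasking_Demo is
   task Worker is
      entry Ping;
   end Worker;

   task body Worker is
   begin
      select
         accept Ping;
      or
         delay 1.0;   --  give up the rendezvous after one second
      end select;
   end Worker;
begin
   Worker.Ping;
end Tasking_Demo;
```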

>
> The proving language might look quite a bit different from SPARK because
> of how much that field has changed since SPARK was new. I wonder if
> again a Lisp-inspired approach would be helpful: the language definition
> would include a formal spec for AST's that could be handed off to
> external proof systems, along with any embedded assertions and contracts
> that could be connected up with proofs packaged separately from the user
> program.

There's some interesting stuff that could be done in this area; for compilers we could commandeer the ideas here -- http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.324.4139&rep=rep1&type=pdf and https://pdfs.semanticscholar.org/d3a1/4b2301487946614ca89ec23726abf27d9f65.pdf and [IIRC these two papers] https://www.cc.gatech.edu/~xzhang36/papers/pldi14b-long.pdf and http://www.rug.nl/research/portal/files/33882898/Kurs14a_ParsingContext.pdf -- for some really interesting capabilities.

>
> I wonder if we're in for some big advances in automated proof search any
> time soon. I haven't been hearing anything about it but it just seems
> like an obvious thing for people to be working on.

There were a few papers I read early last year that were interesting, though I don't remember them off the top of my head.

Dmitry A. Kazakov

unread,
Jun 5, 2018, 3:18:09 AM6/5/18
to
On 2018-06-05 00:12, G.B. wrote:
> On 04.06.18 21:37, Dmitry A. Kazakov wrote:
>> On 2018-06-04 18:54, G. B. wrote:
>>> Even performance wise, and addressing architecture, if GNAT can benefit
>>> from features of the Darwin OS, why put a hindrance of several layers
>>> of C libraries in between?
>>
>> Because there is nothing else?
>
> Why layers of Unix on Unix?

It is not about technical merits. UNIX is an awful OS. It is just how
things are. Most public-domain libraries are developed for Linux, ported
to Windows, and nobody really cares about the rest.

> If Unix needs to be between Ada and the OS for GNAT to work,
> then why not just use the BSD Unix that Macs have got?

Because Mac users don't want to use BSD?

> It seems as though programmers like putting much effort into making
> their programs work with autoconf and xyz-ports.

Luckily we have gpr projects [*]. But the question was about packaging
and handling dependencies.

--------
* The irony is that AdaCore doesn't trust it and continues to use the
configure-mess.

Lucretia

unread,
Jun 5, 2018, 8:31:10 AM6/5/18
to
On Monday, 4 June 2018 22:06:03 UTC+1, Dan'l Miller wrote:

> > There’s a guy, eLucian, wanting to implement his own language because Ada
> > is too complex, his is level, where he has defined 5 compilers all written
> > in different languages with different capabilities.
>
> 5 compilers, written in 5 different languages is a mess.

Yes, I told him as much. His reasoning was that it was to learn the languages; I don't see that as an excuse for overcomplicating things.

> The Level logo is an eye-catching design though.

Nah, looks like Jevel.

Dan'l Miller

unread,
Jun 5, 2018, 12:01:23 PM6/5/18
to
On Monday, June 4, 2018 at 7:17:26 PM UTC-5, Shark8 wrote:
> On Monday, June 4, 2018 at 3:14:57 PM UTC-6, Paul Rubin wrote:
> > Shark8 writes:
> > > (1) The Meta language
> > > (2) The Generic Language
> > > (3) The Concurrent/Parallelism language
> > > (4) The Proving language [SPARK]
> > > (5) The HW/Representation language
> >
> > By "Meta language" I first thought you meant replacing the ARM with a
> > machine checkable specification, written in something like Twelf[1].
> > That seems like a great idea. But instead it seems like you want some
> > general syntactic umbrella in which the other stuff is embedded as
> > DSL's. I.e. you are reinventing Lisp ;).
>
> A little of column-A, a little of column-B.
> (I would love to get my hands on a copy of the source for Symbolics Ada; it was awesome to see how integrated it was with the LISP environment -- But have no Idea where to start for that.)

http://www.symbolics-dks.com
The intellectual property of the old Symbolics is for sale if you can raise enough money to license pieces of it.

Conversely, in the modern era, Thomas Mertes' PhD thesis is somewhat interesting:
It is not Lisp, but Seed7's definable language constructs for a language such as Ada8652 could be at least an inspiration for what you call The Meta Language or what I call hypoAda.

definable language constructs:
http://seed7.sourceforge.net/examples/declstat.htm

definable operators:
http://seed7.sourceforge.net/examples/operator.htm

Alejandro R. Mosteo

unread,
Jun 5, 2018, 12:46:22 PM6/5/18
to
On 03/06/2018 17:09, Lucretia wrote:

> 4) There's no one place to get Ada sources from, people want an equivalent of go get and rust's cargo.

There are two projects working in this direction that I'm aware of:

http://ravenports.ironwolf.systems/

And from yours truly:

https://github.com/alire-project/alr

Niklas Holsti

unread,
Jun 6, 2018, 5:13:25 AM6/6/18
to
On 18-06-05 00:14 , Paul Rubin wrote:
> Shark8 <onewing...@gmail.com> writes:
>> (1) The Meta language
>> (2) The Generic Language
>> (3) The Concurrent/Parallelism language
>> (4) The Proving language [SPARK]
>> (5) The HW/Representation language
>
> By "Meta language" I first thought you meant replacing the ARM with a
> machine checkable specification, written in something like Twelf[1].
> That seems like a great idea.

I agree strongly. Extending the ARM with a formal specification would, I
believe, make it much easier to evolve Ada further, assuming that the
formalization would permit automatic analysis and proof of properties of
the language, such as the absence of Beaujolais effects and the absence
of contradictory specifications in different parts of the ARM.

This would make it easier to understand the impact of any proposed
change to Ada, both for backward compatibility and for the properties of
the changed language. It might make it easier to permit language changes
that break backward compatibility: by showing precisely where and how
that break affects the language, it might help compilers optionally
support both the old and the new forms of the language.

The formalization should allow the automatic generation of an Ada parser
and semantic analyzer, to the same level as ASIS now provides. Such a
tool, even if not fast enough for a production compiler, should mitigate
the obstacles that the rich (some would say "complex") Ada syntax and
semantics now pose for experimental language extensions and language
"augmentation" tools (source-to-source tools).

This formalization will not be easy or cheap, but should be feasible
today, at least for the syntax and static semantics.

I'm not sure if the dynamic semantics can be formalized in a way that
permits proof of its properties (as opposed to mere simulation of the
execution). The increasing powers of program-proving tools make me
hopeful. However, current tools focus on proving properties of a
particular (single) program, but proving properties of the Ada dynamic
semantics would require considering all Ada programs that obey those
semantics.

Note that I do not insist on a formalization that provides fully
automatic proofs, just one that allows formal (automatically checked)
proofs, even if manually guided or programmed.

To comment on the general "Ada successor" discussion, I think the
language should be kept integrated, for example with the "Proving
language" being an integrated part of the whole, as in Ada 2012
contracts, rather than being an add-on, as with the original SPARK
embedded in Ada comments.

--
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi
. @ .

gautier...@hotmail.com

unread,
Jun 6, 2018, 5:02:15 PM6/6/18
to
Cool! For feeding catalogs, or for getting Ada sources "manually", the following two places already have a good bunch of them:

https://sourceforge.net/directory/language:ada/

https://github.com/search?utf8=%E2%9C%93&q=language%3AAda&type=Repositories&ref=advsearch

Alejandro R. Mosteo

unread,
Jun 8, 2018, 6:03:59 AM6/8/18
to
Thanks! I was already tracking github
and http://www.adaic.org/ada-resources/tools-libraries/

Right now I'm prioritizing projects announced here, followed by my
perceived popularity of projects in the public repositories above.


Dan'l Miller

unread,
Jun 8, 2018, 12:28:04 PM6/8/18
to
> This formalization will not be easy or cheap,
> but should be feasible today, at least for the
> syntax and static semantics.

Who would be willing to fund such an effort?
1) fizzle: The big tech companies on the West Coast of the USA seem highly disinterested in safety.
2) fizzle: DARPA seems to have moved on to AI & robotics only, not all the goals that motivated Ada.
3) fizzle: INRIA seems more interested in OCaml than Ada.
4) potential: The most likely funding would come from national governments in Europe, especially France or Germany, who are at times pursuing safe software systems.

What allied hot topic would attract funding to translating the _LRM_ into some sort of formal-specifications language? Ada as required by the insurance industry? Ada for robotics? AI for Ada? Ada for blockchain currency?

Mehdi Saada

unread,
Jun 8, 2018, 12:56:52 PM6/8/18
to
> Ada for robotics? AI for Ada ?
Are you saying that it is not fit for those kinds of applications ?

Shark8

unread,
Jun 8, 2018, 1:33:25 PM6/8/18
to
On Friday, June 8, 2018 at 10:28:04 AM UTC-6, Dan'l Miller wrote:
> > This formalization will not be easy or cheap,
> > but should be feasible today, at least for the
> > syntax and static semantics.
>
> Who would be willing to fund such an effort?
Perhaps this is the wrong question. The big problem is that so few with resources have bought into Ada -- and those that have (eg Boeing) seem to either be transitioning away, or satisfied with the state of things and uninterested in improvement.

I have a few contacts locally who might be interested, given the proper presentation; I'm trying to write up a proposal about it. / I could probably use some help doing so, but I plan to present it first and foremost as a full-IDE (and fully verified) for HW and SW. (pop me an e-mail if you're interested in helping.)

> 1) fizzle: The big tech companies on the West Coast of the USA seem highly disinterested in safety.
While I agree based on my experiences, and attitudes in lower management corporately; do you have any citable references on this? It would help boost my proposal, I think.

> 2) fizzle: DARPA seems to have moved on to AI & robotics only, not all the goals that motivated Ada.
Sad, and stupid. -- "AI" as it is now is nothing more than pattern-matching, which, while necessary for AI, is not sufficient. (For true AI, we're about at the level of "epileptic chicken".)

> 3) fizzle: INRIA seems more interested in OCaml than Ada.
Well, given the hype-wave of "functional programming" this is at least understandable. Though, IIRC, there were some fans/proponents of Ada there.

> 3) potential: The most likely funding would come from national governments in Europe, especially France or Germany, who are at times pursuing safe software systems.
Interesting.

> What allied hot topic would attract funding to translating the _LRM_ into some sort of formal-specifications language? Ada as required by the insurance industry? Ada for robotics? AI for Ada? Ada for blockchain currency?
Honestly, I think Ada at the systems-level would work well. Microsoft Research did a fully type-safe OS (Verve) a while back and the researchers were blown away by the lack of need for a debugger: https://www.microsoft.com/en-us/research/wp-content/uploads/2016/02/pldi117-yang.pdf -- You can almost hear the amazement off those paragraphs.

Mehdi Saada

unread,
Jun 8, 2018, 4:38:44 PM6/8/18
to
Do you see only an IDE, or also new higher level constructs in the language itself ?

Dan'l Miller

unread,
Jun 11, 2018, 9:51:40 PM6/11/18
to

> Shark8 wrote:
> > Dan’l Miller wrote:
> > 3) potential: The most likely funding would
> > come from national governments in Europe,
> > especially France or Germany, who are at
> > times pursuing safe software systems.
> Interesting.

The European science-research apparatus commenced on this once before sometime around 1987. Here is an AJPO paper from the USA side of the Atlantic:

https://www.researchgate.net/publication/235202926_The_European_Formal_Definition_of_Ada_A_US_Perspective

https://link.springer.com/chapter/10.1007/BFb0022125

It was presented at the ESEC 1987 conference:
https://link.springer.com/book/10.1007/BFb0022092

The big question is why it stopped at a draft version of a formal description of (a subset?) of Ada83:
https://cordis.europa.eu/project/rcn/17317_fr.html

Luke A. Guest

unread,
Jun 12, 2018, 11:23:08 AM6/12/18
to
Dan'l Miller <> wrote:


> The big question is why it stopped at a draft version of a formal
> description of (a subset?) of Ada83:
> https://cordis.europa.eu/project/rcn/17317_fr.html
>

It’s posts like these that make me wonder why so many Ada people keep
living in the past. Don’t you think it’s time to learn from the past and
drag your arses into the 21st century?


Dan'l Miller

unread,
Jun 12, 2018, 11:44:54 AM6/12/18
to
The only way to drag my arse into the 21st century is for me & likeminded fellows to figure out what the heck went so wrong on this path. If you think the current path is the best one, then take Dr Strangelove‘s advice to stop worrying and love the “bomb”, so to speak.

Luke A. Guest

unread,
Jun 12, 2018, 1:59:52 PM6/12/18
to
Dan'l Miller <t> wrote:

> The only way to drag my arse into the 21st century is for me & likeminded
> fellows to figure out what the heck went so wrong on this path. If you
> think the current path is the best one, then take Dr Strangelove‘s advice to
> stop worrying and love the “bomb”, so to speak.
>

It was a good path back then although mightily complicated, still
complicated tbh. Time to modernise! 😝😝😝


Shark8

unread,
Jun 13, 2018, 1:46:15 AM6/13/18
to
On Friday, June 8, 2018 at 2:38:44 PM UTC-6, Mehdi Saada wrote:
> Do you see only an IDE, or also new higher level constructs in the language itself ?

The IDE idea is rather separate from an Ada successor language, at the moment. Given things like the Meltdown and Spectre vulnerabilities it's becoming obvious that HW is reaching the point where it needs provable verification just like SW does -- at least for SW we have SPARK -- a fully integrated Ada and VHDL IDE would allow some form of crossover, allowing the SPARK provers to be used on HW descriptions and prove properties there, just like we do in SW.

This gets even more exciting when you have HW and SW together: you can verify the whole system! Imagine, for a moment, the possibilities of a chip, its OS, and its compiler all formally verified.

Shark8

unread,
Jun 13, 2018, 1:53:03 AM6/13/18
to
Well, some of the 21st century "solutions" are utterly inferior -- take "Continuous Integration" and "Source Control" as an example -- integrating both together solves a lot of problems: http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.26.2533&rep=rep1&type=pdf

+ Less redundant processing; you don't need to parse everything to check its correctness.
+ Better and more accurate source-control (because meaningless things like whitespace don't matter)
+ Formatting/style wars don't matter (because meaningless things like whitespace don't matter)
+ File-system environment doesn't matter, because the program source is in an actual DB instead of in text files in the ad-hoc DB of the file-system.

Simon Wright

unread,
Jun 13, 2018, 2:57:34 AM6/13/18
to
Shark8 <onewing...@gmail.com> writes:

> Well, some of the 21st century "solutions" are utterly inferior --
> take "Continuous Integration" and "Source Control" as an example --
> integrating both together solves a lot of problems:
> http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.26.2533&rep=rep1&type=pdf
>
> + Less redundant processing; you don't need to parse everything to
> check its correctness.
> + Better and more accurate source-control (because meaningless things
> like whitespace don't matter)
> + Formatting/style wars don't matter (because meaningless things like
> whitespace don't matter)
> + File-system environment doesn't matter, because the program source
> is in an actual DB instead of in text files in the ad-hoc DB of the
> file-system.

Sounds like the Rational Environment.

Dan'l Miller

unread,
Jun 13, 2018, 12:03:23 PM6/13/18
to
> > Dan'l Miller <t> wrote:
> > The only way to drag my arse into the 21st
> > century is for me & likeminded fellows to
> > figure out what the heck went so wrong on
> > this path. If you think the current Ada8652
> > path of currently-extant compilers is the best
> > one, then Dr Strangelove‘s advice to
> > stop worrying and love the “bomb”, so to
> > speak.
>
> It was a good path back then although mightily
> complicated, still complicated tbh. Time to
> modernise! 😝😝😝

(I corrected my mistakes in the quotation above.)

Modernization likely means an Ada-esque drastic subset.

My idea of a next-gen Ada goes the opposite direction: nearly everything in Ada8652, except breaking backwards compatibility in some Ichbiah oddities that are standing in the way of new modern era features, then plus one heck of a lot more. The problem with Ada8652 is that odd baggage is holding Ada back from being a much much bigger language.

The answer to the question of why the release of this work stopped at a draft is that its VDM spec turned commercial, becoming the DDC-I Ada compiler still extant today.
https://cordis.europa.eu/project/rcn/17317_fr.html

More-detailed history:
https://en.m.wikipedia.org/wiki/Dansk_Datamatik_Center

Luke A. Guest

unread,
Jun 13, 2018, 12:20:16 PM6/13/18
to
Dan'l Miller <net> wrote:

> Modernization likely means an Ada-esque drastic subset.
>
> My idea of a next-gen Ada goes the opposite direction: nearly everything
> in Ada8652, except breaking backwards compatibility in some Ichbiah
> oddities that are standing in the way of new modern era features, then
> plus one heck of a lot more. The problem with Ada8652 is that odd
> baggage is holding Ada back from being a much much bigger language.
>

An issue is that not all types can be restricted and extended. These seem
to be features of Ada, but they aren't available for all types as they should be.

G. B.

unread,
Jun 13, 2018, 2:04:46 PM6/13/18
to
Dan'l Miller <opt...@verizon.net> wrote:
>> Luke Guest wrote:
>>> Dan'l Miller <> wrote:
>>> The big question is why it stopped at a draft version of a formal
>>> description of (a subset?) of Ada83:
>>> https://cordis.europa.eu/project/rcn/17317_fr.html
>>
>> It’s posts like these that make me wonder why so many
>> Ada people keep
>> living in the past. Don’t you think it’s time to learn from
>> the past and drag your arses into the 21st century.
>
> The only way to drag my arse into the 21st century is for me & likeminded
> fellows to figure out what the heck went so wrong on this path.

A good candidate:
“The tax payers’ gov. wants it” was seen as a huge economic opportunity, an
opportunity that grows even larger if one can minimize investment.
Unfortunately, only tool makers and consultants see it that way. “Too
expensive”, I’ve heard a few times when asking why it went wrong.

> If you think the current path is the best one, then take Dr Strangelove‘s
> advice to stop worrying and love the “bomb”, so to speak.
>

The advice captures cloud computing trends pretty well, IMO. The mainframes
are back! The service models are just too promising.


Shark8

unread,
Jun 13, 2018, 2:59:00 PM6/13/18
to
On Wednesday, June 13, 2018 at 12:57:34 AM UTC-6, Simon Wright wrote:
>
> Sounds like the Rational Environment.

I've only ever read about it myself; but the people that I've talked with that have personally used it seem to be pretty impressed with it (considering its age and the contemporary environments).

IMO, the rise to prominence of Unix and C [and arguably C++] is the worst thing that happened in the field of CS. The amount of money, time, energy, and effort sucked up by these is so staggering to contemplate that it would be difficult to overestimate the costs, both real [e.g. all the buffer overflow vulnerabilities] and opportunities lost [e.g. the low-level of today's "advanced" continuous integration, or the cost of using the FS as a DB, or the costs of moronically considering unstructured text as THE appropriate/native format for storing programs].

Dmitry A. Kazakov

unread,
Jun 13, 2018, 3:19:30 PM6/13/18
to
On 2018-06-13 20:58, Shark8 wrote:
> On Wednesday, June 13, 2018 at 12:57:34 AM UTC-6, Simon Wright wrote:
>>
>> Sounds like the Rational Environment.
>
> I've only ever read about it myself; but the people that I've talked with that have personally used it seem to be pretty impressed with it (considering its age and the contemporary environments).

I used Rational in one project and can confirm, I was impressed.

> IMO, the rise to prominence of Unix and C [and arguably C++] is the worst thing that happened in the field of CS. The amount of money, time, energy, and effort sucked up by these is so staggering to contemplate that it would be difficult to overestimate the costs, both real [e.g. all the buffer overflow vulnerabilities] and opportunities lost [e.g. the low-level of today's "advanced" continuous integration, or the cost of using the FS as a DB, or the costs of moronically considering unstructured text as THE appropriate/native format for storing programs].

Unix, Windows and C could not have been so efficient in burning everything down
had economic conditions not allowed price dumping, monopolization and, in the
end, the killing of the SW market.

Paul Rubin

unread,
Jun 13, 2018, 5:15:21 PM6/13/18
to
Shark8 <onewing...@gmail.com> writes:
> Unix and C [and arguably C++] is the worst thing ... opportunities
> lost [e.g. the low-level of today's "advanced" continuous integration,
> or the cost of using the FS as a DB, or the costs of moronically
> considering unstructured text as THE appropriate/native format for
> storing programs].

I don't think those lost opportunities have much to do with Unix or C.
Fancy CI that I've seen has usually been in conjunction with Ruby or
Python projects, etc. The FS is similarly a perfectly good DB depending
on what you want for it to do, syntax can be recovered from free text by
parsing, etc. This mostly comes down to human factors rather than
artifacts of buffer overflows or anything like that. Ada to zeroth
order is C++ without the buffer overflows. That's a good thing but it
doesn't affect the other stuff you mentioned.

Lucretia

unread,
Jun 13, 2018, 11:19:04 PM6/13/18
to
On Wednesday, 13 June 2018 20:19:30 UTC+1, Dmitry A. Kazakov wrote:

> Unix, Windows, C could not be so efficient in burning everything down

Windows was originally written in Pascal, not C, C and C++ came later.

Lucretia

unread,
Jun 13, 2018, 11:20:16 PM6/13/18
to
On Wednesday, 13 June 2018 22:15:21 UTC+1, Paul Rubin wrote:

> Python projects, etc. The FS is similarly a perfectly good DB depending

You want a FS that's a DB, look at BeOS' FS.

Shark8

unread,
Jun 13, 2018, 11:27:05 PM6/13/18
to
On Wednesday, June 13, 2018 at 3:15:21 PM UTC-6, Paul Rubin wrote:
> Shark8 writes:
> > Unix and C [and arguably C++] is the worst thing ... opportunities
> > lost [e.g. the low-level of today's "advanced" continuous integration,
> > or the cost of using the FS as a DB, or the costs of moronically
> > considering unstructured text as THE appropriate/native format for
> > storing programs].
>
> I don't think those lost opportunities have much to do with Unix or C.

Oh, but it does!

> Fancy CI that I've seen has usually been in conjunction with Ruby or
> Python projects, etc.

Relatively new; when they should have been industry standard two decades ago? Sounds like a lost opportunity that we-as-an-industry are just now rediscovering. (Also, TTBOMK all of these solutions are text-based.)

> The FS is similarly a perfectly good DB depending
> on what you want for it to do, syntax can be recovered from free text by
> parsing, etc.

The FS is NOT a good substitute for a DB; furthermore, it introduces perfectly avoidable environmental dependence which would not otherwise exist (e.g. case sensitivity, the path, the path separator, timestamp resolution, etc.)
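To make the environmental-dependence point concrete, a tiny sketch (Python standing in, and the path is made up): the very same string means different things under different filesystem conventions, which is exactly the kind of dependence a real DB would not impose.

```python
import ntpath
import posixpath

# A hypothetical "key" a program might use to look up a stored item.
key = "Data\\Config.TXT"

# On POSIX the backslash is an ordinary character, not a separator:
posix_name = posixpath.basename(key)   # the whole string is one name

# On Windows it IS a separator, and names are case-folded:
win_name = ntpath.basename(key)        # "Config.TXT"
win_key = ntpath.normcase(key)         # "data\\config.txt"
```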

> This mostly comes down to human factors rather than
> artifacts of buffer overflows or anything like that. Ada to zeroth
> order is C++ without the buffer overflows. That's a good thing but it
> doesn't affect the other stuff you mentioned.

Except you're ignoring things like the Rational 1000 which was mentioned earlier in this thread -- which is thirty years old now, IIRC -- the entire environment was geared for SW development; instead of getting environments like this we-as-an-industry wasted time and energy on C-isms and Unix-isms.

The whole text-based craze which causes you to parse everything again and again and again is straight from Unix. But more than this, its false sense of "mostly works" and lack of foresight are littered throughout.

Let's take the whole "one program that does one thing well" concept, ok that's all well and good. So you build a system piping inputs and outputs...it seems fine, at first glance, until you look at the fact that the medium of transport [pipes] is text. -- Now what you have is a system that serializes and deserializes constantly, but with no actual standard, and usually with each component using an ad hoc method to read and write. -- So what you've done by standardizing on text is forced continual parsing and re-parsing for every element in the pipeline.
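A minimal sketch of that cost (hypothetical "tools", with Python functions standing in for a shell pipeline): each stage must re-parse the text its predecessor just finished serializing, because the pipe throws the structure away at every hop.

```python
# Three hypothetical pipeline stages; the "pipe" between them is plain text.
records = [("alice", 42), ("bob", 7)]

def produce(rows):
    # stage 1: serialize structured data to a text stream
    return "\n".join(f"{name} {score}" for name, score in rows)

def filter_high(text):
    # stage 2: ad hoc parse (split on whitespace), filter, re-serialize
    keep = [ln for ln in text.splitlines() if int(ln.split()[1]) > 10]
    return "\n".join(keep)

def count(text):
    # stage 3: parse the same data yet again, just to count lines
    return len(text.splitlines())

result = count(filter_high(produce(records)))
```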

This is the whole reason that JSON is popular; because it "solves" the problem of text-stream consistency... sadly, the industry fails to realize that these serialize/deserialize problems were already solved via ASN.1.

I highly recommend you watch this video: http://www.youtube.com/watch?v=8pTEmbeENF4

Paul Rubin

unread,
Jun 14, 2018, 1:17:39 AM6/14/18
to
Shark8 <onewing...@gmail.com> writes:
>> Fancy CI that I've seen
> Relatively new; when they should have been industry standard two
> decades ago?

No I don't think the associated technology or the machine resources were
available that easily 20 years ago.

> The FS is NOT a good substitute for a DB,

Depends on what you are doing.

> (e.g. case sensitivity, the path, the path separator, timestamp
> resolution, etc.)

Might not matter, depending on what you are doing.

> Except you're ignoring things like the Rational 1000 which was

Whatever that is, almost nobody uses it now, probably for a reason.

> This is the whole reason that JSON is popular; because it "solves" the
> problem of text-stream consistency...

JSON is just restricted, curly-braced S-expressions.

> sadly, the industry fails to realized that these serialize/deserialize
> problems were already solved via ASN.1.

That's just another thing to parse and unparse. Here's my favorite
ASN.1 related rant:

https://www.cs.auckland.ac.nz/~pgut001/pubs/x509guide.txt

That said, reprocessing ad hoc text formats was more tractable in the
20th century Unix era, when there weren't so many programs around. This
stuff exists in the real world after all.

> I highly recommend you watch this video:
> http://www.youtube.com/watch?v=8pTEmbeENF4

On my list.

Dmitry A. Kazakov

unread,
Jun 14, 2018, 3:26:35 AM6/14/18
to
Windows 3.1 is 1992. C++ is a decade older, around 1980. C is another
decade older.

(It took a considerable time for C to poison everything. Same is true
for Windows.)

Marius Amado-Alves

unread,
Jun 14, 2018, 4:51:31 AM6/14/18
to
The Rational brand of IBM started as an Ada product. Didn't know that. My mind is blown.

Recently worked with RTC Rational Team Concert in a small team (10 people). Did the job, but way too many features getting in the way. In the past, worked with Rational Rose, similar feelings. And saw no Ada in either.

They surely managed to frack up whatever good was in the Ada foundation.

Lucretia

unread,
Jun 14, 2018, 7:25:52 AM6/14/18
to
On Thursday, 14 June 2018 08:26:35 UTC+1, Dmitry A. Kazakov wrote:
> On 2018-06-14 05:19, Lucretia wrote:
> > On Wednesday, 13 June 2018 20:19:30 UTC+1, Dmitry A. Kazakov wrote:
> >
> >> Unix, Windows, C could not be so efficient in burning everything down
> >
> > Windows was originally written in Pascal, not C, C and C++ came later.
>
> Windows 3.1 is 1992. C++ is a decade older, around 1980. C is another
> decade older.

I didn't mean the languages came later, the Windows implementations in those languages did.

Dmitry A. Kazakov

unread,
Jun 14, 2018, 8:22:59 AM6/14/18
to
My guess is that Windows is still in QuickBASIC ...

Randy Brukardt

unread,
Jun 14, 2018, 5:00:24 PM6/14/18
to

"Shark8" <onewing...@gmail.com> wrote in message
news:1b03e4ff-daf1-4c13...@googlegroups.com...
>IMO, the rise to prominence of Unix and C [and arguably C++] is the worst
>thing that happened
>in the field of CS. The amount of money, time, energy, and effort sucked up
>by these is so
>staggering to contemplate that it would be difficult to overestimate the
>costs, both real [e.g.
>all the buffer overflow vulnerabilities] and opportunities lost [e.g. the
>low-level of today's
>"advanced" continuous integration, or the cost of using the FS as a DB, or
>the costs of
>moronically considering unstructured text as THE appropriate/native format
>for storing
>programs].

While I agree with your basic premise, I'm dubious that there is a better
format than unstructured text for programs. There were many such
alternatives explored in the 1980's, and they all had the property of making
editing more complex. (I tried one for a while when a vendor wanted us to
consider bundling it; it was a very nice job for the time, but the attempt
to keep code parsable at all times made it much harder to do program
editing - one needed multiple steps to do what was easy to do in one step
and a bit of text editing. For instance, consider cutting out an elsif
branch and turning it into a stand-alone if statement -- a restructuring I
tend to do fairly frequently.)

More generally, non-plain-text editors put barriers into program
construction, and those barriers make one stop and worry about how to make
the editor happy rather than continuing to worry about the actual problem
you are solving. While some level of friction can be tolerated if it leads
to big savings later (that's why we can tolerate Ada's strong typing!),
syntax errors just aren't a significant enough problem (they get detected
soon enough in any scheme) to give much benefit.

This area is a problem I've been thinking about for literally decades, and
I'm pretty convinced that most of the "solutions" are worse than the problem
they'd be fixing. For a new scheme to be better, it would have to have
little friction.

I also worry about the fragility of databases. If some "unstructured text"
gets corrupted, you might lose a subprogram or two. (That used to happen a
lot in the old days, and it still does once in a while.) And you can usually
get that from a backup. But a corrupted database is pretty much useless - if
the recovery tools fail, the only choice is to reconstruct from scratch. (I
just had to reinstall one of the subsystems on our Windows Server because of
a problem like this - had to uninstall the entire subsystem, delete all of
the files, and then reinstall. Luckily I mainly cared about getting rid of
the error messages, not so much the loss of data.)

With databases, everything is harder than it is with simple files: backups,
searching, etc. all need specialized tools. So the benefits would have to be
massive in order to make up for the variety of tiny pains that would ensue.

Randy.


Lucretia

unread,
Jun 14, 2018, 11:35:10 PM6/14/18
to
On Thursday, 14 June 2018 13:22:59 UTC+1, Dmitry A. Kazakov wrote:

> My guess is that Windows is still in QuickBASIC ...

If it was any BASIC it would be Visual. But the sources have been leaked a number of times, I'm fairly sure it's C++ now, not that I've seen it. Neither do I care to.

Lucretia

unread,
Jun 14, 2018, 11:41:39 PM6/14/18
to
On Thursday, 14 June 2018 22:00:24 UTC+1, Randy Brukardt wrote:
> "Shark8" <> wrote in message
> news:1b03e4ff-daf1-4c13...@googlegroups.com...
> >IMO, the rise to prominence of Unix and C [and arguably C++] is the worst
> >thing that happened

TBH, standardising on an OS API is not a bad idea; the fact they did it based on C and its issues was a bad idea. The fact that Unix *was* originally open and available to pretty much all was a good thing, but when AT&T (I think) revoked that, it was not a good move.


> >"advanced" continuous integration, or the cost of using the FS as a DB, or
> >the costs of
> >moronically considering unstructured text as THE appropriate/native format
> >for storing
> >programs].
>
> While I agree with your basic premise, I'm dubious that there is a better
> format than unstructured text for programs. There were many such
> alternatives explored in the 1980's, and they all had the property of making
> editing more complex. (I tried one for a while when a vendor wanted us to

I agree that the whole idea that the source should be a DB is a bad idea from the outset. The idea of the APSE was hated from the outset, pretty much every idea that Shark8 has about bringing Ada "back" is from the original bad ideas that everyone hated and there are numerous papers on what was bad at the time. Time to move on and not repeat the mistakes of the past, especially where Ada is concerned.

Luke.

Dmitry A. Kazakov

unread,
Jun 15, 2018, 3:08:20 AM6/15/18
to
On 2018-06-15 05:41, Lucretia wrote:
> On Thursday, 14 June 2018 22:00:24 UTC+1, Randy Brukardt wrote:

>> While I agree with your basic premise, I'm dubious that there is a better
>> format than unstructured text for programs. There were many such
>> alternatives explored in the 1980's, and they all had the property of making
>> editing more complex. (I tried one for a while when a vendor wanted us to
>
> I agree that the whole idea that the source should be a DB is a bad idea from the outset.

It is two different ideas, actually.

1. The source code is not a stream of characters but a persistent object
of a certain structure, e.g. a text buffer.

2. The source code objects have dependencies and are grouped into larger
structures.

Both ideas are valid, good and long overdue. The only obstacle is two
horrific families of OS - Unix and Windows.

Simon Wright

unread,
Jun 15, 2018, 3:15:58 AM6/15/18
to
"Randy Brukardt" <ra...@rrsoftware.com> writes:

> While I agree with your basic premise, I'm dubious that there is a
> better format than unstructured text for programs. There were many
> such alternatives explored in the 1980's, and they all had the
> property of making editing more complex. (I tried one for a while when
> a vendor wanted us to consider bundling it; it was a very nice job for
> the time, but the attempt to keep code parsable at all times made it
> much harder to do program editing - one needed multiple steps to do
> what was easy to do in one step and a bit of text editing. For
> instance, consider cutting out an elsif branch and turning it into a
> stand-alone if statement -- a restucturing I tend to do fairly
> frequently.)

The R1000 had three phases for a unit: the first, whose name I forget,
was essentially free text; the second was 'semanticised', i.e. parsed
and checked for legality; the third, again I forget the name, was
code-generated. Once a unit was semanticised, the free text form was
forgotten and would be regenerated if you needed to edit the unit.

Annoying if you really didn't like the code format enforced by the
system, but it didn't take long to let go of that particular form of
artistic hubris!

Dmitry A. Kazakov

unread,
Jun 15, 2018, 3:20:27 AM6/15/18
to
On 2018-06-15 05:35, Lucretia wrote:
> On Thursday, 14 June 2018 13:22:59 UTC+1, Dmitry A. Kazakov wrote:
>
>> My guess is that Windows is still in QuickBASIC ...
>
> If it was any BASIC it would be Visual.

No way! Visual Basic is not backward compatible with itself. It must be
QuickBASIC, as the legend tells that B.G. personally wrote a few lines
of ...

> But the sources have been leaked a number of times, I'm fairly sure it's C++ now, not that I've seen it. Neither do I care to.

Same here. (:-))

jm.ta...@gmail.com

unread,
Jun 15, 2018, 7:38:20 AM6/15/18
to
>
> I used Rational in one project and can confirm, I was impressed.
>
Just curious.
How much does a seat of such an impressive environment cost?

Dmitry A. Kazakov

unread,
Jun 15, 2018, 8:06:25 AM6/15/18
to
I cannot say, we were a subcontractor, all expenses were paid by another
party.

J-P. Rosen

unread,
Jun 15, 2018, 11:42:19 AM6/15/18
to
Le 15/06/2018 à 14:06, Dmitry A. Kazakov a écrit :
> On 2018-06-15 13:38, jm.ta...@gmail.com wrote:
>>>
>>> I used Rational in one project and can confirm, I was impressed.
>>>
>> Just curious.
>> How much cost a seat of such impressing environment?
>
> I cannot say, we were a subcontractor, all expenses were paid by another
> party.
>
I don't know the exact price, but all users I met said that
1) it was very expensive
2) the price was quickly covered by the gains in productivity

--
J-P. Rosen
Adalog
2 rue du Docteur Lombard, 92441 Issy-les-Moulineaux CEDEX
Tel: +33 1 45 29 21 52, Fax: +33 1 45 29 25 00
http://www.adalog.fr

Lucretia

unread,
Jun 15, 2018, 12:04:00 PM6/15/18
to
On Friday, 15 June 2018 08:20:27 UTC+1, Dmitry A. Kazakov wrote:
> On 2018-06-15 05:35, Lucretia wrote:
> > On Thursday, 14 June 2018 13:22:59 UTC+1, Dmitry A. Kazakov wrote:
> >
> >> My guess is that Windows is still in QuickBASIC ...
> >
> > If it was any BASIC it would be Visual.
>
> No way! Visual Basic is not backward compatible with itself. It must be
> QuickBASIC, as the legend tells that B.G. personally wrote a few lines
> of ...

If BG had anything to do with it, then QuickBASIC would've been too unstable to do anything with. I used AmigaBASIC back in the day and it crashed a lot; that was written by BG, apparently.

;)

Simon Wright

unread,
Jun 15, 2018, 1:30:37 PM6/15/18
to
"J-P. Rosen" <ro...@adalog.fr> writes:

> Le 15/06/2018 à 14:06, Dmitry A. Kazakov a écrit :
>> On 2018-06-15 13:38, jm.ta...@gmail.com wrote:
>>>>
>>>> I used Rational in one project and can confirm, I was impressed.
>>>>
>>> Just curious.
>>> How much cost a seat of such impressing environment?
>>
>> I cannot say, we were a subcontractor, all expenses were paid by
>> another party.
>>
> I don't know the exact price, but all users I met said that
> 1) it was very expensive
> 2) the price was quickly covered by the gains in productivity

I worked for Ferranti at the time; we had a site in Cwmbran which was
the software engineering & research part (from our point of view,
what we'd now call toolchain and driver developers).

The Cwmbran team had I think 7 R1000s. This was (according to a rumour)
because they weren't happy with Rational's CMVC (Configuration
Management & Version Control) and insisted on implementing their own on
top of it, so needed more machines than Rational would have thought
necessary. Good for Rational's bottom line, of course.

jm.ta...@gmail.com

unread,
Jun 15, 2018, 1:55:18 PM6/15/18
to
> >
> I don't know the exact price, but all users I met said that
> 1) it was very expensive
> 2) the price was quickly covered by the gains in productivity

How much is "very expensive"?
Are similar products for other languages as expensive?

Shark8

unread,
Jun 15, 2018, 3:58:11 PM6/15/18
to
AFAIK, there's no other "similar product" to that degree. What is really interesting though is the statement "the price was quickly covered by the gains in productivity" -- this means the productivity gains alone were enough for it to quickly pay for itself.

The reason is that the entire thing was an Integrated Development Environment in the true sense of the phrase, not the "hyped up editor" (a slight exaggeration) that modern IDEs get. / One of the issues is that the tools in such environments *AREN'T* integrated. So you get idiotic text-based source control where

int main() { while (true){}; }
and
int main()
{ while (true){}; }

are actually *different* -- this means that stupid things like 'style', "tabs vs. spaces" and other formatting elements are flagged as 'changes' completely apart from any semantic alteration. (This means that, within such source-control, there can be many "false positives" when you're looking at a change-log... and even if you have the proper revision loaded, finding the actual alteration that was supposed to be logged might be difficult because so many non-semantic changes interfere with getting a picture of the actual semantic changes. [ex: looking at a diff when someone's editor helpfully changed tabs/spaces and they committed some buggy code.])
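The two fragments above can make the point concrete; a crude sketch (whitespace split standing in for a real lexer) shows they differ byte-for-byte yet carry an identical token stream:

```python
# The two C fragments from above: textually different, semantically identical.
a = "int main() { while (true){}; }"
b = "int main()\n{ while (true){}; }"

def tokens(src):
    # crude "lexer": split on whitespace; a real tool would tokenize properly
    return src.split()

textually_equal = (a == b)                     # the diff flags a "change"
semantically_equal = (tokens(a) == tokens(b))  # nothing really changed
```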

The lack of integration can be seen in other areas too; take CPAN [perl's package manager] for instance -- it's essentially a giant archive of zip-files and an index, synchronized across a network -- so far, so good & everything works well... until the text-based index is corrupted or there are clashes between zip-files or such. // An alternative would be to store the code as a DB-amenable IR, which is stored in a DB... things could be done here like *automatically* tracking dependencies, and compatibilities within these dependencies; such an effort would be incredibly hard to set up [and maintain] in the "text-file + zip-files" approach.
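A toy sketch of that DB idea (SQLite in-memory, invented package names): store dependencies as rows and let the DB itself compute the transitive closure, instead of re-parsing a flat text index.

```python
import sqlite3

# Hypothetical packages stored as dependency edges in a relational DB,
# rather than a text index sitting next to a pile of zip-files.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE dep (pkg TEXT, needs TEXT)")
db.executemany("INSERT INTO dep VALUES (?, ?)",
               [("app", "web"), ("web", "http"), ("http", "sockets")])

# Transitive dependencies of "app", computed by the DB with a recursive query:
rows = db.execute("""
    WITH RECURSIVE closure(p) AS (
        SELECT needs FROM dep WHERE pkg = 'app'
        UNION
        SELECT dep.needs FROM dep JOIN closure ON dep.pkg = closure.p
    )
    SELECT p FROM closure ORDER BY p
""").fetchall()
deps = [r[0] for r in rows]
```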

This sort of integration is something that isn't all that common in the places I've worked, which usually go with various individual tools.

----------
This Technical Report [1988; 92 pg] describes the whole thing:
http://resources.sei.cmu.edu/asset_files/TechnicalReport/1988_005_001_15650.pdf

There's an Army Report [1995; 72 pg] here, which I haven't read yet:
http://www.dtic.mil/get-tr-doc/pdf?AD=ADA301551

Simon Wright

unread,
Jun 16, 2018, 3:04:26 AM6/16/18
to
Shark8 <onewing...@gmail.com> writes:

> (This means that, within such source-control, there can be many
> "false positives" when you're looking at a change-log... and even if
> you have the proper revision loaded, finding the actual alteration
> that was supposed to be logged might be difficult because so many
> non-semantic changes interfere with getting a picture of the actual
> semantic changes. [ex: looking at a diff when someone's editor
> helpfully changed tabs/spaces and they committed some buggy code.])

I had to refuse an offer to collaborate on the Ada 95 Booch Components
because the person insisted on editing using an editor (Grasp?) which
didn't respect the original layout.

Mind, someone who insisted on using an editor that did that might not
have been a very good collaborator anyway.

jm.ta...@gmail.com

unread,
Jun 16, 2018, 5:14:22 AM6/16/18
to

> AFAIK, there's no other "similar product" to that degree.

For me, Visual Studio is more stable and advanced than GPS or any Ada toolchain. I don't know what Rational has, but I find Visual Studio impressive.

>What is really interesting though is the statement "the price was quickly covered by the gains in productivity" -- this means the utility in ONLY productivity was enough to quickly pay for itself.

It looks like nobody can answer how much a seat of Rational costs (and that is scary). I've heard about $15,000, but I can't confirm or deny it. If I'm doing projects of 10 million dollars, $15,000 may be acceptable. If I'm doing $50,000-$100,000 projects, it is not.

So, maybe that is one of the answers to why Ada is not more popular. Only deep pockets and really big projects can use Ada at its full power.

> are actually *different* -- this means that stupid things like 'style', "tabs vs. spaces" and other formatting elements are flagged as 'changes' completely apart from any semantic alteration. (This means that, within such source-control, there can be many "false positives" when you're looking at a change-log...

Formatting and styles *are changes*. In fact I make commits to correct style. Maybe editors should store source in a compact format and reflow it on the fly when you load the file, so, if you loaded an Ada, C, C++, etc. source with a classic editor, you would only see a very long line. But until such editors become mainstream, changes in formatting and style are not false positives; they are changes that must be committed.

>
> This sort of integration is something that isn't all that common in the places I've worked, which usually go with various individual tools.

Most IDEs integrate an editor (with completion, refactoring, easy jumping from one definition to another, stubs, help and a few other things), some kind of project files, and a debugger.

There are other tools (version control, code analyzers, etc.). Does integrating such tools boost productivity by 20%? I don't think so. So, whether the environment is 5% or 20% of the price of the project, the level of integration may be secondary.

Simon Wright

unread,
Jun 16, 2018, 6:22:22 AM6/16/18
to
jm.ta...@gmail.com writes:

> It looks that nobody can answer how much is a seat of Rational (and
> that is scaring). I've heard about 15.000$, but I can't confirm or
> deny it. If I'm doing projects of 10 millions dollars, 15.000 may be
> acceptable. If I'm doing 50.000 100.000$ projects, it is not.

We've been talking about the Rational R1000 (Delta 4, I think)
Environment, which ran on custom hardware (the R1000). Rational then
developed Apex (or ported Delta 4) to provide the same/similar
facilities on then-common hardware. Sold to IBM ... Atego ... PTC.

I've never used Apex.

https://en.wikipedia.org/wiki/Rational_R1000
http://www.adaic.org/ada-resources/pro-tools-services/
https://www.ptc.com/en/products/developer-tools/apexada

Jeffrey R. Carter

unread,
Jun 16, 2018, 6:50:49 AM6/16/18
to
On 06/16/2018 12:22 PM, Simon Wright wrote:
>
> We've been talking about the Rational R1000 (Delta 4, I think)
> Environment, which ran on custom hardware (the R1000). Rational then
> developed Apex (or ported Delta 4) to provide the same/similar
> facilities on then-common hardware. Sold to IBM ... Atego ... PTC.
>
> I've never used Apex.

I never used the R1000, but I saw it demonstrated once.

I used Apex on 2 projects. The thing about it is that it stores the code in an
internal format (DIANA, I think), not as text. When you view a unit as text, the
code is formatted according to your preferences. That means everyone on a
project may use his personal formatting preferences without impacting anyone
else. While entering or editing text, the editor formats on the fly pretty
aggressively, and tends to point out syntax errors.
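A rough modern analogue of that idea (Python's ast module standing in for DIANA; a sketch of the principle, not of how Apex actually worked): parse once, keep the tree, and render text per reader, so layout differences never even exist in the stored form.

```python
import ast

# Two formatting styles for the same function; the stored form would be
# the tree, and each of these would be one possible rendering of it.
v1 = "def f(x):\n    return x+1\n"
v2 = "def f( x ):\n\treturn (x + 1)\n"

# Parsing discards layout; unparsing renders one canonical layout back.
canon1 = ast.unparse(ast.parse(v1))
canon2 = ast.unparse(ast.parse(v2))
same = (canon1 == canon2)  # layout differences vanish in the tree
```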

There are default formatting preferences, which are pretty poor IMO, and a way
to modify them, though finding the place to do that just from the UI seemed to
be impossible. One project decided everyone should use the default. On the
other, they had instructions on how to change your preferences, so everyone
customized theirs. The only drawback to this was when you looked at code on
another's display, and the formatting was not what you were used to.

--
Jeff Carter
"That was the most fun I've ever had without laughing."
Annie Hall
43

Dmitry A. Kazakov

unread,
Jun 16, 2018, 7:32:46 AM6/16/18
to
On 2018-06-16 11:14, jm.ta...@gmail.com wrote:

> For me, Visual Studio is more stable and advanced that GPS or any Ada toolchain.

Actually GPS is more stable than VS now. It is much faster and has better
source code navigation. Another huge problem with VS is that each new
version brings new project formats, new incompatibilities, and a new look
and feel. I did a lot of VS migration: 2003 to 2005 to 2008 to 2010, and
now the incoming 2017 horror. The only bad thing about GPS is the debugger.

Dan'l Miller

unread,
Jun 17, 2018, 11:31:11 PM6/17/18
to
On Monday, June 4, 2018 at 7:17:26 PM UTC-5, Shark8 wrote:
> (ALSO: I personally think that a user definable "abstract type interface" [kinda/sorta type classes], as well
> as user-definable attributes, would be great ways to handle the above while expanding the capabilities of
> Ada'Succ.)

Ada'Succ is an awesome name at some level (e.g., "Ada'Succ" would be unique on WWW searches). Instead of a play on Ada Lovelace Byron's name et cetera, it is the 2nd insect regarding Ada when read phonetically: Ada tick suck. The mascot could literally be a tick insect.

(Ada95 would be either Ada'Class or Ada'Tag under this system of monikers.)

With some creativity, somehow the sucking blood metaphor of “ada tick suck” could be retailored to be a positive meaning. E.g.,

1) the Boost project that is influencing modern C++ in the 21st century was originally jokingly named Booze as a beverage even better than Java/coffee/joe. Booze got morphed into Boost.

2) Boilermaker was an insult applied by another college-town's newspaper to Purdue University's athletes back in the 19th century who were working part-time at a local steam-equipment factory to pay their way through Purdue University. Instead of being an insult, those athletes wore the moniker as their official badge of honor as their official team name.

Where there is a will, there is a way.

Björn Lundin

unread,
Jun 18, 2018, 2:58:13 AM6/18/18
to
On 2018-06-18 05:31, Dan'l Miller wrote:
> Ada'Succ is an awesome name at some level (e.g., "Ada'Succ" would be unique on WWW searches).
> Instead of a play on Ada Lovelace Byron's name et cetera, it is the
> 2nd insect regarding Ada when read phonetically:
> Ada tick suck. The mascot could literally be a tick insect.

Another pun in the spirit of c++ is tcl's OO framework
called 'incr tcl'.

i++ in tcl is written as incr i
No-one but tcl:ers knows this. It is a bad name.
How many, except ada:ers, know the meaning of T'Succ?

Besides, a tick is a nasty spider-related animal, an arachnid, not an
insect. And it is the bearer of at least two serious diseases (TBE and
borreliosis). Really want that as a mascot? I don't.


And I can really imagine problems on wiki-pages and other printed
material when you say
<source lang='Ada'Succ'>

The parsers/string handlers will not like it.


--
--
Björn

Dan'l Miller

unread,
Jun 18, 2018, 8:33:53 AM6/18/18
to
On Monday, June 18, 2018 at 1:58:13 AM UTC-5, björn lundin wrote:
> On 2018-06-18 05:31, Dan'l Miller wrote:
> > Ada'Succ is an awesome name at some level (e.g., "Ada'Succ" would be unique on WWW searches).
> > Instead of a play on Ada Lovelace Byron's name et cetera, it is the
> > 2nd insect regarding Ada when read phonetically:
> > Ada tick suck. The mascot could literally be a tick insect.
>
> Another pun in the spirit of c++ is tcl's OO framework
> called 'incr tcl'.
>
> i++ in tcl is written as incr i
> No-one but tcl:ers know this. It is a bad name.
> How many, execpt ada:ers, know the meaning of T'Succ?
>
> Besides, a tick is a nasty spider related animal, arachnid. Not an
> insect.

8-bit bites from a tick would be more appropriate for a computer-programming language than a 6-bit bite from an insect anyway.

Niklas Holsti

unread,
Jun 18, 2018, 3:16:31 PM6/18/18
to
On 18-06-18 06:31 , Dan'l Miller wrote:
> On Monday, June 4, 2018 at 7:17:26 PM UTC-5, Shark8 wrote:
>> (ALSO: I personally think that a user definable "abstract type
>> interface" [kinda/sorta type classes], as well as user-definable
>> attributes, would be great ways to handle the above while expanding
>> the capabilities of Ada'Succ.)
>
> Ada'Succ is an awesome name at some level (e.g., "Ada'Succ" would be
> unique on WWW searches).

Nah... invites "Ada'Succ sucks".

My entry into the pool of names: Ada Nouveau.

The first DuckDuckGo hit (for me) is the "Beaujolais effect" from Wikipedia.

--
Niklas Holsti
Tidorum Ltd
niklas holsti tidorum fi
. @ .

Dan'l Miller

unread,
Jun 18, 2018, 3:28:51 PM6/18/18
to
On Monday, June 18, 2018 at 2:16:31 PM UTC-5, Niklas Holsti wrote:
> My entry into the pool of names: Ada Nouveau.

Ada'Tock: Time marches onward.

Jeffrey R. Carter

unread,
Jun 18, 2018, 4:22:13 PM6/18/18
to
On 06/18/2018 09:16 PM, Niklas Holsti wrote:
>
> Nah... invites "Ada'Succ sucks".

Also, in Ada, Ada'Succ is not the successor of Ada.

--
Jeff Carter
"He nevere yet no vileynye ne sayde
In al his lyf unto no maner wight."
Canterbury Tales
156