
Generators/coroutines in future Ada?


Victor Porton

Jul 9, 2017, 4:19:51 PM
Is it possible to implement generators and coroutines (as in Python) in
a future version of Ada?

I am not an expert on how generators are implemented. Is it possible in a
language like Ada?

Well, I think it could add reliability, because programming the processing
of different "streams" becomes easier and thus less error prone.

--
Victor Porton - http://portonvictor.org

Victor Porton

Jul 9, 2017, 7:56:54 PM
Dennis Lee Bieber wrote:

> On Sun, 09 Jul 2017 23:19:23 +0300, Victor Porton <por...@narod.ru>
> declaimed the following:
>
>>Is it possible to implement generators and coroutines (like as in Python)
>>in a future version of Ada?
>>
>>I am not an expert on how generators are implemented. Is it possible in a
>>language like Ada?
>>
>
> Generators -- not as you think of them (ad hoc, on-demand, creation).
> cf:
> http://www.adaic.org/resources/add_content/standards/05rm/html/RM-A-5-2.html
> (which is focused on random number generation, but basically anything
> could follow the schema). You'd have to define a (generic) package with a
> generator type definition, and methods to manipulate state of the
> generator. The instances of the generator type are what holds the state
> between invocations.
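
For concreteness, the schema Dennis describes might look like this (a minimal
sketch; the names are mine, and Ada 2012 is assumed for the "in out" function
parameter). The Generator object is what holds the state between invocations:

   package Counters is
      type Generator is limited private;
      --  Returns the current value and advances the generator's state.
      function Next (Gen : in out Generator) return Natural;
      procedure Reset (Gen : in out Generator; Start : Natural := 0);
   private
      type Generator is limited record
         Value : Natural := 0;
      end record;
   end Counters;

   package body Counters is
      function Next (Gen : in out Generator) return Natural is
         Result : constant Natural := Gen.Value;
      begin
         Gen.Value := Gen.Value + 1;
         return Result;
      end Next;

      procedure Reset (Gen : in out Generator; Start : Natural := 0) is
      begin
         Gen.Value := Start;
      end Reset;
   end Counters;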

What you described is a workaround: a way to code the equivalent of generators
in the absence of true language support for them.

It is just like doing object-oriented programming in C.

>>Well, I think it could add additional reliability, because programming of
>>processing of different "streams" becomes easier and thus less error
>>prone.
>
> I wouldn't hold out much hope for coroutines. Ada was created from the
> start with the concept of independent tasking (all the way back before
> 1980!). Formal co-routines (a la Modula-2) do not fit the philosophy as
> they are a cooperative "task" switch (in classic co-routines, each routine
> determines when it will transfer control to another, and possibly where in
> that other transfer goes to -- it was even doable in FORTRAN-IV when
> statement labels could be passed in/out of a call). I've never studied the
> Python "co-routine" but it does not map to what I consider a co-routine.

Dmitry A. Kazakov

Jul 10, 2017, 3:33:38 AM
On 09/07/2017 23:28, Dennis Lee Bieber wrote:
> I wouldn't hold out much hope for coroutines. Ada was created from the
> start with the concept of independent tasking (all the way back before
> 1980!). Formal co-routines (a la Modula-2) do not fit the philosophy as
> they are a cooperative "task" switch (in classic co-routines, each routine
> determines when it will transfer control to another, and possibly where in
> that other transfer goes to -- it was even doable in FORTRAN-IV when
> statement labels could be passed in/out of a call). I've never studied the
> Python "co-routine" but it does not map to what I consider a co-routine.

Instead of co-routines we could talk about non-scheduled tasks or
user-scheduled tasks. [What is actually needed is a local stack without
scheduling overhead.]

--
Regards,
Dmitry A. Kazakov
http://www.dmitry-kazakov.de

Victor Porton

Jul 10, 2017, 9:38:53 AM
Victor Porton wrote:

> Is it possible to implement generators and coroutines (like as in Python)
> in a future version of Ada?
>
> I am not an expert on how generators are implemented. Is it possible in a
> language like Ada?
>
> Well, I think it could add additional reliability, because programming of
> processing of different "streams" becomes easier and thus less error
> prone.

Don't generators require closures?

But AFAIK Ada does not support closures.

Are closures really needed to implement generators?

It is nearly impossible to add closures to Ada, right?

Paul Rubin

Jul 10, 2017, 1:01:14 PM
Victor Porton <por...@narod.ru> writes:
> Are closures really needed to implement generators?

You can do it with OOP, like with the iterator protocol in C++.
It's ugly.

Victor Porton

Jul 10, 2017, 5:24:30 PM
I know that every algorithm can be done with OOP (or even with plain
procedural programming). Every algorithm can be done with procedures without
functions; this does not imply that functions are not necessary.

So the question is not whether an equivalent algorithm can be coded, but
whether it can be done naturally and explicitly.

Is it possible to add explicit generator functions to Ada?

J-P. Rosen

Jul 11, 2017, 5:42:59 AM
On 10/07/2017 at 15:38, Victor Porton wrote:
> But AFAIK Ada does not support closures.
>
> Are closures really needed to implement generators?
>
> It is nearly impossible to add closures to Ada, right?

Taking the example from Wikipedia:

function startAt(x)
    function incrementBy(y)
        return x + y
    return incrementBy

variable closure1 = startAt(1)
variable closure2 = startAt(5)

This could be written as:
   generic
      Start_Value : Integer;
   function Increment_By (X : Integer) return Integer;

   function Closure1 is new Increment_By (1);
   function Closure2 is new Increment_By (5);
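
Filled out into compilable units, this would look something like the following
(the generic body and the little test are a sketch of the intent):

   --  The generic function and its body (two library units):
   generic
      Start_Value : Integer;
   function Increment_By (X : Integer) return Integer;

   function Increment_By (X : Integer) return Integer is
   begin
      return X + Start_Value;
   end Increment_By;

   --  A client, mirroring closure1(3) = 4 and closure2(3) = 8:
   with Increment_By;
   procedure Demo is
      function Closure1 is new Increment_By (Start_Value => 1);
      function Closure2 is new Increment_By (Start_Value => 5);
   begin
      pragma Assert (Closure1 (3) = 4);
      pragma Assert (Closure2 (3) = 8);
   end Demo;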

----
I can hear you saying that "generics are a poor man's workaround for the
lack of closure", to which I respond that closures are a limited
workaround for the lack of the more general concept of generic...

TBH, I don't think there is any value in the kind of discussion that
starts with "Language X is lacking feature A from language Y".

Express a problem in language Y, let people provide their solution in
language X, and then we can discuss the merits of each solution, given
some quality criteria. Note that sometimes criteria can be at odds;
e.g., people nowadays value ease of writing and flexibility very much,
but these are quite opposed to safety, readability, and long-term
maintenance, which form the basic requirements of Ada.

--
J-P. Rosen
Adalog
2 rue du Docteur Lombard, 92441 Issy-les-Moulineaux CEDEX
Tel: +33 1 45 29 21 52, Fax: +33 1 45 29 25 00
http://www.adalog.fr

Victor Porton

Jul 11, 2017, 8:53:02 AM
J-P. Rosen wrote:

> On 10/07/2017 at 15:38, Victor Porton wrote:
>> But AFAIK Ada does not support closures.
>>
>> Are closures really needed to implement generators?
>>
>> It is nearly impossible to add closures to Ada, right?
>
> Taking the example from Wikipedia:
>
> function startAt(x)
>     function incrementBy(y)
>         return x + y
>     return incrementBy
>
> variable closure1 = startAt(1)
> variable closure2 = startAt(5)
>
> This could be written as:
>    generic
>       Start_Value : Integer;
>    function Increment_By (X : Integer) return Integer;
>
>    function Closure1 is new Increment_By (1);
>    function Closure2 is new Increment_By (5);

Please describe your solution with generics in more detail; I do not quite
understand it. I want to see how to imitate closures with generics.

Victor Porton

Jul 11, 2017, 9:01:54 AM
I've realized how to imitate closures with generics.

But is there a way to add generators to Ada?

J-P. Rosen

Jul 11, 2017, 9:26:54 AM
On 11/07/2017 at 15:01, Victor Porton wrote:
> I've realized how to imitate closures with generics.
>
> But is there a way to add generators to Ada?

AFAIU, generators are functions with states, where the returned value at
one call depends on the state that was left behind by the previous call.
This can easily be achieved with a function within a package, where the
package serves to protect and hide the function's state.

Alternatively, you can mimic the behaviour with a task that has several
accepts for the same entry, where you could describe the hidden state as
the execution position of the task.

But once again, please provide an example where you find generators
convenient, and we can discuss the best solution in Ada (I know other
people in this group who would be interested too ;-) )

Dmitry A. Kazakov

Jul 11, 2017, 12:59:05 PM
On 2017-07-11 15:26, J-P. Rosen wrote:
> On 11/07/2017 at 15:01, Victor Porton wrote:
>> I've realized how to imitate closures with generics.
>>
>> But is there a way to add generators to Ada?
>
> AFAIU, generators are functions with states, where the returned value at
> one call depends on the state that was left behind by the previous call.

Which is an incredibly bad idea for sure.

> But once again, please provide an example where you find generators
> convenient, and we can discuss the best solution in Ada (I know other
> people in this group who would be interested too ;-) )

(:-)) I remember a few cases where this resulted in disaster. Even
innocent cases like saving the last found item of a search do. Or when
Ada.Text_IO.Put_Line with no file is used for anything except debugging:
how big are the chances of discovering much later that you need to change
the output file?

There must be no hidden state, almost never. The state must be packed
into an object which must be an explicit parameter.

Victor Porton

Jul 11, 2017, 2:36:36 PM
J-P. Rosen wrote:

> On 11/07/2017 at 15:01, Victor Porton wrote:
>> I've realized how to imitate closures with generics.
>>
>> But is there a way to add generators to Ada?
>
> AFAIU, generators are functions with states, where the returned value at
> one call depends on the state that was left behind by the previous call.

This state also includes the point of execution (like "we are in the second
loop, at its second statement").

> This can easily be achieved with a function within a package, where the
> package serves to protect and hide the function's state.

Ada currently has no support for saving such (point-of-execution) states.

J-P. Rosen

Jul 11, 2017, 3:23:03 PM
On 11/07/2017 at 20:36, Victor Porton wrote:
> This state also includes the point of execution (like "We are in the second
> loop in its second operator.")
>
>> This can easily be achieved with a function within a package, where the
>> package serves to protect and hide the function's state.
> Ada currently has no support to save such (point of execution) states.
It has. Make your generator a task, and have an accept when the data is
ready, which can be nested as deep as you want in any construct.

A task is a perfect abstraction for a thread of control that maintains
its own execution state!
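
For illustration, a minimal self-contained sketch of that shape: a generator
of 1 .. N, where the saved state is simply the task's execution position:

   with Ada.Text_IO; use Ada.Text_IO;

   procedure Task_Generator_Demo is

      task type First_N (N : Natural) is
         entry Next (Value : out Natural; Done : out Boolean);
      end First_N;

      task body First_N is
      begin
         for I in 1 .. N loop
            select
               accept Next (Value : out Natural; Done : out Boolean) do
                  Value := I;        --  this accept is the "yield"
                  Done  := False;
               end Next;
            or
               terminate;            --  go away quietly if nobody asks for more
            end select;
         end loop;
         select
            accept Next (Value : out Natural; Done : out Boolean) do
               Value := 0;
               Done  := True;        --  exhausted
            end Next;
         or
            terminate;
         end select;
      end First_N;

      G : First_N (5);
      V : Natural;
      D : Boolean;

   begin
      loop
         G.Next (V, D);
         exit when D;
         Put_Line (Natural'Image (V));   --  prints 1 .. 5
      end loop;
   end Task_Generator_Demo;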

Simon Wright

Jul 11, 2017, 3:27:19 PM
Victor Porton <por...@narod.ru> writes:

> But is there a way to add generators to Ada?

Could one build something based on generalized iterators? Brad Moore had
some examples of unusual iterators, e.g. for prime numbers - see the
thread starting at [1], and the link to ACATS code at [2].

[1]
https://groups.google.com/d/msg/comp.lang.ada/jZS9uHYF0KM/yrKywKuUBwAJ
[2]
https://groups.google.com/d/msg/comp.lang.ada/jZS9uHYF0KM/nQH17Pe-BwAJ

Pascal Obry

Jul 11, 2017, 3:52:34 PM
On Tuesday, 11 July 2017 at 18:59 +0200, Dmitry A. Kazakov wrote:
> There must be no hidden state, almost never. The state must be
> packed into an object which must be an explicit parameter.

This sounds plain wrong to me. We have ASMs (abstract state machines) and
ADTs (abstract data types); why do you want to forbid the ASM, which is a
simple and nice design pattern?

I agree with Jean-Pierre: a generator is a package with state in the body.
Ada has been designed with building blocks in mind; for example, a class is
a package + tagged type.

--
  Pascal Obry /  Magny Les Hameaux (78)

  The best way to travel is by means of imagination

  http://www.obry.net

  gpg --keyserver keys.gnupg.net --recv-key F949BD3B

Dmitry A. Kazakov

Jul 11, 2017, 4:18:16 PM
On 2017-07-11 21:52, Pascal Obry wrote:
> On Tuesday, 11 July 2017 at 18:59 +0200, Dmitry A. Kazakov wrote:
>> There must be no hidden state, almost never. The state must be
>> packed into an object which must be an explicit parameter.
>
> This sounds plain wrong to me. We have ASM and ADT,

What has any of them to do with hidden states?

> why do you want to
> forbid ASM which is a simple and nice design pattern?

There is nothing nice in ASM.

> I agree with Jean-Pierre, generator is a a package with state in the
> body. Ada has been designed with building blocks in mind, for example a
> class is a package+tagged-type.

No, a class is a set of types sharing some common properties. E.g.

- a class of tagged types rooted in some parent type
- a class of integer types
- a class of types declared in the instances of a generic package
...

Dmitry A. Kazakov

Jul 11, 2017, 4:25:22 PM
On 2017-07-11 21:22, J-P. Rosen wrote:
> On 11/07/2017 at 20:36, Victor Porton wrote:
>> This state also includes the point of execution (like "We are in the second
>> loop in its second operator.")
>>
>>> This can easily be achieved with a function within a package, where the
>>> package serves to protect and hide the function's state.
>> Ada currently has no support to save such (point of execution) states.
> It has. Make your generator a task, and have an accept when the data is
> ready, which can be nested as deep as you want in any construct.
>
> A task is a perfect abstraction for a thread of control that maintains
> its own execution state!

Except that no control is required.

I don't understand what is wrong with:

type Stateful is limited private;
function Generate (X : not null access Stateful) return Things;

Victor Porton

Jul 11, 2017, 7:19:50 PM
The thing "wrong" with this, is that it is sometimes hard to implement
Generate(). In Python this can be done in an easier way by using generators
(Python's "yield" keywords).

J-P. Rosen

Jul 12, 2017, 12:55:04 AM
On 12/07/2017 at 01:19, Victor Porton wrote:
> Dmitry A. Kazakov wrote:
>> I don't understand what is wrong with:
>>
>> type Stateful is limited private;
>> function Generate (X : not null access Stateful) return Things;
>
> The thing "wrong" with this, is that it is sometimes hard to implement
> Generate(). In Python this can be done in an easier way by using generators
> (Python's "yield" keywords).
>
Please provide a concrete example, so we can compare solutions!

Randy Brukardt

Jul 12, 2017, 1:35:19 AM
"Victor Porton" <por...@narod.ru> wrote in message
news:ok3meg$fg6$1...@gioia.aioe.org...
> Dmitry A. Kazakov wrote:
>
....
>>> A task is a perfect abstraction for a thread of control that maintains
>>> its own execution state!
>>
>> Except that no control is required.
>>
>> I don't understand what is wrong with:
>>
>> type Stateful is limited private;
>> function Generate (X : not null access Stateful) return Things;
>
> The thing "wrong" with this, is that it is sometimes hard to implement
> Generate(). In Python this can be done in an easier way by using
> generators
> (Python's "yield" keywords).

The thing "wrong" with this is that it forces sequential execution. A Python
generator is by its nature a serial bottleneck (indeed, when this was
discussed on the ARG list, that was suggested as an advantage!!! of the
Python approach).

The coming problem with "classic" programming languages like Ada is that
they don't map well to architectures with a lot of parallelism. The more
constructs that require sequential execution, the worse that Ada programs
will perform. With that in mind, it is hard to imagine why anyone would want
to invest the hundreds or even thousands of hours required for a new
language version on still more sequential constructs. Especially constructs
that by their nature require sequential execution.

Randy.


darkestkhan

Jul 12, 2017, 1:42:12 AM
Other ways in which this can be done (besides a generic package with state):
- a task
- a protected object
- an additional argument of a [tagged] limited type

Likely some other ways are possible. I just don't see the point of a special construct for generators when we have at least four ways to do it.

G.B.

Jul 12, 2017, 2:35:05 AM
Arguably, first, a "function" lacks query/command separation:

type Stateful is limited interface;
function Current (x : Stateful) return Thing; -- (*)
procedure Next (x : in out Stateful);

If, then, so-called "elegance" of side-effect generating functions
(or of some equivalent operator) is required for reasons of
socially acceptable expression, to make generators visible,
the above pattern should be integrated with some special iteration
scheme like Ada now has in "for ... of ... loop".

Not surprisingly, I think, Icon's generators are available to
use in while-loops that look perfectly ordinary. If Ada requires
generators to be more explicit:

G : Stateful := Strings.Find
(Subject => "...",
Pattern => "___");

while X of G with condition (X) loop -- not Ada
...
end loop;

"while ... of ... with" then requires G to be a Stateful.
Presence of special syntax is, as previously, HOL.

__
(*) Only if the type is special this can make any sense.
E.g., calling Current twice should raise an exception.
That's unlikely to happen if there is special Ada syntax tied
to Stateful, hence "while ... of ... with".

Dmitry A. Kazakov

Jul 12, 2017, 3:19:59 AM
That would be a co-routine rather than merely a stateful execution.
These are not the same. Regarding co-routines, J-P already answered it: a
task is exactly what a co-routine is, except for external scheduling. To
have co-routines in Ada, all one needs is user-defined explicit
scheduling. 'Yield' belongs there.

Dmitry A. Kazakov

Jul 12, 2017, 3:27:31 AM
On 12/07/2017 07:35, Randy Brukardt wrote:
> "Victor Porton" <por...@narod.ru> wrote in message
> news:ok3meg$fg6$1...@gioia.aioe.org...
>> Dmitry A. Kazakov wrote:
>>
> ....
>>>> A task is a perfect abstraction for a thread of control that maintains
>>>> its own execution state!
>>>
>>> Except that no control is required.
>>>
>>> I don't understand what is wrong with:
>>>
>>> type Stateful is limited private;
>>> function Generate (X : not null access Stateful) return Things;
>>
>> The thing "wrong" with this, is that it is sometimes hard to implement
>> Generate(). In Python this can be done in an easier way by using
>> generators
>> (Python's "yield" keywords).
>
> The thing "wrong" with this is that it forces sequential execution.

Some algorithms, protocols in particular, are much easier and
clearer to implement in a sequential manner: fetch data, wait, react,
repeat. Ada tasks are an ideal abstraction for this, except for being too
heavyweight for certain applications.

> The coming problem with "classic" programming languages like Ada is that
> they don't map well to architectures with a lot of parallelism.

Rather the opposite. Parallel architectures do not map well to any
reasonable programming language. The thing is hard no matter how you
approach it.

Dmitry A. Kazakov

Jul 12, 2017, 3:34:05 AM
On 12/07/2017 08:35, G.B. wrote:
> On 11.07.17 22:25, Dmitry A. Kazakov wrote:
>> On 2017-07-11 21:22, J-P. Rosen wrote:
>>> On 11/07/2017 at 20:36, Victor Porton wrote:
>>>> This state also includes the point of execution (like "We are in the
>>>> second
>>>> loop in its second operator.")
>>>>
>>>>> This can easily be achieved with a function within a package, where
>>>>> the
>>>>> package serves to protect and hide the function's state.
>>>> Ada currently has no support to save such (point of execution) states.
>>> It has. Make your generator a task, and have an accept when the data is
>>> ready, which can be nested as deep as you want in any construct.
>>>
>>> A task is a perfect abstraction for a thread of control that maintains
>>> its own execution state!
>>
>> Except that no control is required.
>>
>> I don't understand what is wrong with:
>>
>> type Stateful is limited private;
>> function Generate (X : not null access Stateful) return Things;
>
> Arguably, first, a "function" lacks query/command separation:
>
> type Stateful is limited interface;
> function Current (x : Stateful) return Thing; -- (*)
> procedure Next (x : in out Stateful);

I don't understand this example. Why can't Next be a part of Current?
Example:

X := Stack.Pop;

> Not surprisingly, I think, Icon's generators are available to
> use in while-loops that look perfectly ordinary. If Ada requires
> generators to be more explicit:
>
> G : Stateful := Strings.Find
> (Subject => "...",
> Pattern => "___");
>
> while X of G with condition (X) loop -- not Ada
> ...
> end loop;
>
> "while ... of ... with" then requires G to be a Stateful.
> Presence of special syntax is, as previously, HOL.

I don't understand this example. If you mean multiple return values
(tuples), yes, Ada should have them, but that is unrelated to statefulness.

Maciej Sobczak

Jul 12, 2017, 4:58:00 AM
> Taking the example from Wikipedia:
>
> function startAt(x)
>     function incrementBy(y)
>         return x + y
>     return incrementBy
>
> variable closure1 = startAt(1)
> variable closure2 = startAt(5)

Note that this example does not exploit (pun intended) the fact that such variables can be created dynamically, where their exact number is not known until the program is run. Imagine an array (of length determined at run-time) of such generators. This shows that the static approach of Ada is not always convenient and is not applicable to every problem.

Dmitry is right that generators can be replaced with functions that operate on explicit state, but generators are useful, because they have the same interface as stateless functions[*] and can be used as arguments in higher-level generics that accept "callable" entities without caring about whether they are stateful or not[**]. Functional languages (or at least those that claim to have some features from functional programming) are convenient for gluing stuff together this way, with obvious compromises in other areas.

[*] That is, there is some commonality between stateless and stateful functions and it might be useful to capture this commonality in some programming concept that is not present in Ada.

[**] C++ relies on such constructs in the STL library (and many others), but calls them "functors" instead. That is, a generator can be implemented as a stateful functor (and they can be created dynamically). Whether they are used wisely is another story.
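
In Ada, that commonality can be approximated with a generic formal subprogram:
the higher-level operation takes any callable Next and neither knows nor cares
whether there is state behind it. A sketch (Ada 2012 assumed for the
expression function):

   with Ada.Text_IO; use Ada.Text_IO;

   procedure Callable_Demo is

      --  A higher-level operation that only needs something callable.
      generic
         with function Next return Integer;
      procedure Print_First (Count : Positive);

      procedure Print_First (Count : Positive) is
      begin
         for I in 1 .. Count loop
            Put_Line (Integer'Image (Next));
         end loop;
      end Print_First;

      --  A stateless "callable".
      function Always_Seven return Integer is (7);

      --  A stateful "callable": the state hides in the enclosing scope.
      Counter : Integer := 0;
      function Counting return Integer is
      begin
         Counter := Counter + 1;
         return Counter;
      end Counting;

      procedure Print_Sevens   is new Print_First (Next => Always_Seven);
      procedure Print_Counting is new Print_First (Next => Counting);

   begin
      Print_Sevens (3);     --  7, 7, 7
      Print_Counting (3);   --  1, 2, 3
   end Callable_Demo;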

--
Maciej Sobczak * http://www.inspirel.com

Victor Porton

Jul 12, 2017, 9:07:19 AM
J-P. Rosen wrote:

> On 12/07/2017 at 01:19, Victor Porton wrote:
>> Dmitry A. Kazakov wrote:
>>> I don't understand what is wrong with:
>>>
>>> type Stateful is limited private;
>>> function Generate (X : not null access Stateful) return Things;
>>
>> The thing "wrong" with this, is that it is sometimes hard to implement
>> Generate(). In Python this can be done in an easier way by using
>> generators (Python's "yield" keywords).
>>
> Please provide a concrete example, so we can compare solutions!

I don't have an Ada example at hand. So instead I will give two examples in
Python, with and without explicit generators. Which of the two is simpler?

# Using an explicit generator function
def firstn(n):
    num, nums = 0, []
    while num < n:
        yield num
        num += 1

and

# Using the generator pattern (an iterable)
class firstn(object):
    def __init__(self, n):
        self.n = n
        self.num, self.nums = 0, []

    def __iter__(self):
        return self

    # Python 3 compatibility
    def __next__(self):
        return self.next()

    def next(self):
        if self.num < self.n:
            cur, self.num = self.num, self.num + 1
            return cur
        else:
            raise StopIteration()

Modified from
https://wiki.python.org/moin/Generators

Dmitry A. Kazakov

Jul 12, 2017, 9:38:56 AM
On 12/07/2017 15:07, Victor Porton wrote:
> J-P. Rosen wrote:
>
>> On 12/07/2017 at 01:19, Victor Porton wrote:
>>> Dmitry A. Kazakov wrote:
>>>> I don't understand what is wrong with:
>>>>
>>>> type Stateful is limited private;
>>>> function Generate (X : not null access Stateful) return Things;
>>>
>>> The thing "wrong" with this, is that it is sometimes hard to implement
>>> Generate(). In Python this can be done in an easier way by using
>>> generators (Python's "yield" keywords).
>>>
>> Please provide a concrete example, so we can compare solutions!
>
> I don't have at hand an Ada example. So instead I will give two examples in
> Python with and without explicit generators. Which of the two is simpler?
>
> # Using an explicit generator function
> def firstn(n):
>     num, nums = 0, []
>     while num < n:
>         yield num
>         num += 1


type First (N : Positive) is tagged limited record
   Self    : not null access First'Class := First'Unchecked_Access;
   Current : Positive := 1;
end record;
function Value (Generator : First) return Integer;

function Value (Generator : First) return Integer is
begin
   if Generator.Current > Generator.N then
      raise End_Error;
   else
      return Result : Integer := Generator.Current do
         Generator.Self.Current := Generator.Current + 1;
      end return;
   end if;
end Value;

and then

   Dozen : First (12);
begin
   loop
      Put_Line (Integer'Image (Dozen.Value));
   end loop;
exception
   when End_Error =>
      null;

This should print
1
2
...
12

Paul Rubin

Jul 12, 2017, 1:09:28 PM
Victor Porton <por...@narod.ru> writes:
> every algorithm can be done with OOP...
> So the question is not whether an equivalent algorithm can be coded, but
> whether it can be done naturally and explicitly.

There is a fairly natural bijection between OOP and closures, if that's
what you're asking.

> Is it possible to add explicit generator functions to Ada?

I don't know if Ada has syntactic support for traversing a container by
calling some method on it. C++ and Python both have that and I doubt it
would be a big deal to add it to Ada if it's not there already.

Paul Rubin

Jul 12, 2017, 1:34:33 PM
"J-P. Rosen" <ro...@adalog.fr> writes:
> AFAIU, generators are functions with states, where the returned value at
> one call depends on the state that was left behind by the previous call.
> This can easily be achieved with a function within a package, where the
> package serves to protect and hide the function's state.

It's not always so easy though. Generators can yield from other
generators, so doing them as functions with states can require the
states to be stacks or collections of stacks. Doing it with tasks is
more natural, but Ada tasks can be heavyweight, e.g. mapping to Posix
threads.

In Haskell, generators aren't even explicit: every list is automatically
a generator. Here's an example which would be quite messy in Ada:
a 5-smooth number a/k/a Hamming number is a number with no prime factors
greater than 5. The first 20 of them are:

1,2,3,4,5,6,8,9,10,12,15,16,18,20,24,25,27,30,32,36

Problem: find the 1500th Hamming number (answer: 859963392). This is a
classic Haskell problem solved by merging corecursively generated lists:

hamming = 1 : m 2 `merge` m 3 `merge` m 5 where
  m k = map (k *) hamming
  xxs@(x:xs) `merge` yys@(y:ys) = case x `compare` y of
    LT -> x : (xs `merge` yys)
    EQ -> x : (xs `merge` ys)
    GT -> y : (xxs `merge` ys)

main = print (hamming !! 1499)

This prints the answer instantly (0.004 seconds elapsed, mostly in the
operating system). I remember doing it in C++ and it was a page or so
of ugly code.

Victor Porton

Jul 12, 2017, 1:39:09 PM
Paul Rubin wrote:

> Victor Porton <por...@narod.ru> writes:
>> every algorithm can be done with OOP...
>> So the question is not whether an equivalent algorithm can be coded, but
>> whether it can be done naturally and explicitly.
>
> There is a fairly natural bijection between OOP and closures, if that's
> what you're asking.

I am asking for the easier (shorter-code) side of the bijection. The fact
that generator functions with "yield" (in the Python sense) can be replaced
by the other side of the bijection does not mean that we must use the
longer side, when we have shorter and probably easier to understand (and
thus probably more reliable) code on the other side.

>> Is it possible to add explicit generator functions to Ada?
>
> I don't know if Ada has syntactic support for traversing a container by
> calling some method on it. C++ and Python both have that and I doubt it
> would be a big deal to add it to Ada if it's not there already.

I am not asking about syntactic support for traversing a container by calling
some method on it (certainly Ada, Python, and C++ all support that).
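
For example, Ada 2012's "for ... of" iteration over a container:

   with Ada.Containers.Vectors;
   with Ada.Text_IO; use Ada.Text_IO;

   procedure Iterate_Demo is
      package Int_Vectors is new Ada.Containers.Vectors
        (Index_Type => Positive, Element_Type => Integer);
      V : Int_Vectors.Vector;
   begin
      V.Append (1);
      V.Append (2);
      V.Append (3);
      for E of V loop                  --  generalized container iteration
         Put_Line (Integer'Image (E));
      end loop;
   end Iterate_Demo;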

I am asking about iterators coded as a function with a "yield" construct (in
the Python sense), not about iterators coded with plain OOP without "yield"
or a similar construct.

G.B.

Jul 12, 2017, 4:49:23 PM
On 12.07.17 09:34, Dmitry A. Kazakov wrote:

>>> I don't understand what is wrong with:
>>>
>>> type Stateful is limited private;
>>> function Generate (X : not null access Stateful) return Things;
>>
>> Arguably, first, a "function" lacks query/command separation:
>>
>> type Stateful is limited interface;
>> function Current (x : Stateful) return Thing; -- (*)
>> procedure Next (x : in out Stateful);
>
> I don't understand this example. Why Next cannot be a part of Current? Example:
>
> X := Stack.Pop;

It didn't say it can't be. (Without query/command separation,
one needs access parameters in most versions of Ada, though,
because a function (query) is not really allowed to modify
_its_ parameters, unlike a procedure (command) that has an in out
parameter.)

Assume that the language has special generator syntax, e.g.:

while X of Stack with 0 < X and X < 10 loop

This is to mean that generating a value happens once per iteration,
binding the result to the X before *of* and then evaluating the
condition. Stack being a Stateful, it has the prim ops needed.

If X, still a generator, were instead to be repeatedly evaluated,
as in

while 0 < X and X < 10 loop

then it isn't clear how many values will be generated if,
under the hood, the expression entails evaluating

0 < Stack.Pop and Stack.Pop < 10.

What about optimizing? Nothing new.

At first sight, this pattern can be expressed by making
a single assignment which has been working even before Ada 83:

X : Value_Type := Stack.Pop;
while 0 < X and X < 10 loop

But this pattern can only lead to emulating any number of
particular generator-like objects, not to generators:
no special types, no syntax, hence no support from the compiler.

Note that this argument emphasizes the use of a generator,
not the construct that furnishes it.

Shark8

Jul 12, 2017, 6:47:19 PM
On Tuesday, July 11, 2017 at 11:35:19 PM UTC-6, Randy Brukardt wrote:
>
> The coming problem with "classic" programming languages like Ada is that
> they don't map well to architectures with a lot of parallelism. The more
> constructs that require sequential execution, the worse that Ada programs
> will perform.

I agree, but on the flip side, NVidia's CUDA and OpenMP are weird/horrific amalgamations of [arguably] high-level language and what is essentially "parallel assembly" via pragma, which does nothing at all for the reputation of parallelism in general except MAKE it both difficult and error-prone.

I do wonder how much it would be different if, instead of the "parallel pragma enabled C" compiler they released for CUDA, they had gone with an Ada compiler with a simple/single pragma "parallel" that could be applied to TASK or [perhaps] protected-object constructs.

Yes, I know that "automatic parallelization" of high-level language is considered fairly hard right now, but less than 50 years ago an optimizing compiler was considered hard... and we gained a LOT from allowing optimizing compilers + bootstrapping.

That's also why I'm a bit dubious about the proposed PARALLEL / AND / END blocks: it seems to me that the mistake here is to delve too far into the minutiae (the above "parallel assembly" idea) and thereby make it difficult or impossible to automatically optimize parallel code, because of the more low-level view imposed by the language... much like C's notion of arrays is fundamentally broken and undermines the benefits of C++.

Now, I admit I could very well be wrong here, but there's a gut feeling that this is not the route we want to go down.

Paul Rubin

Jul 13, 2017, 2:35:42 AM
Victor Porton <por...@narod.ru> writes:
> I am about iterators coded as a function with "yield" construct (in Python
> sense).

Oh ok, yeah, that's harder. You could do it as a switch statement with
branches at the yield points.
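
For concreteness, here is the firstn generator from earlier in the thread
hand-translated that way into Ada (a sketch of the technique, not anyone's
production code): the resume point becomes an explicit record component and
Next is one case statement over it.

   with Ada.Text_IO; use Ada.Text_IO;

   procedure Yield_As_State_Machine is

      --  Hand-translated counterpart of
      --     def firstn(n):
      --         num = 0
      --         while num < n:
      --             yield num
      --             num += 1
      --  The "point of execution" becomes an explicit resume point.

      type Resume_Point is (Before_Loop, After_Yield, Finished);

      type Firstn_State (N : Natural) is record
         Point : Resume_Point := Before_Loop;
         Num   : Natural      := 0;
      end record;

      procedure Next
        (S : in out Firstn_State; Value : out Natural; Done : out Boolean) is
      begin
         case S.Point is
            when Before_Loop | After_Yield =>
               --  The "num += 1" that follows the yield in Python is folded
               --  in just before returning, so the saved state is the same.
               if S.Num < S.N then
                  Value   := S.Num;          --  what "yield num" produces
                  S.Num   := S.Num + 1;
                  S.Point := After_Yield;
                  Done    := False;
               else
                  S.Point := Finished;
                  Value   := 0;
                  Done    := True;
               end if;
            when Finished =>
               Value := 0;
               Done  := True;
         end case;
      end Next;

      S : Firstn_State (5);
      V : Natural;
      D : Boolean;

   begin
      loop
         Next (S, V, D);
         exit when D;
         Put_Line (Natural'Image (V));   --  prints 0 .. 4
      end loop;
   end Yield_As_State_Machine;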

Dmitry A. Kazakov

Jul 13, 2017, 4:18:22 AM
On 12/07/2017 22:49, G.B. wrote:

> Assume that the language has special generator syntax, e.g.:
>
> while X of Stack with 0 < X and X < 10 loop
>
> This is to mean that generating a value happens once per iteration,
> binding the result to the X before *of* and then evaluating the
> condition. Stack being a Stateful, it has the prim ops needed.

OK, but this is not generator syntax. It is the problem of using some
computed value twice. Yes we need a syntax for that. Far more important
than the loop-generator case are cases like:

"Generator-new":

   X := new S'Class;
   declare
      Object : S'Class renames S'Class (X.all);
   begin
      ...
   end;

"Generator-return":

like above, create a more specific instance returning it as a more
generic class-wide.

"generator-test":

   if X in T'Class then
      declare
         Object : T'Class renames T'Class (X);
      begin
         ...
      end;

> If X, still a generator, were instead to be repeatedly evaluated,
> as in
>
> while 0 < X and X < 10 loop
>
> then it isn't clear how many values will be generated if,
> under the hood, the expression entails evaluating
>
> 0 <Stack.Pop and Stack.Pop < 10.

"Generator-test" above.

> Note that this argument emphasizes the use of a generator,
> not the construct that furnishes it.

I see why co-routines are useful when dressed as tasks. I see great use
in syntax constructs which might reduce the above cases to one-liners.
I see no use in stateful functions, only damage.

Robert Eachus

Jul 16, 2017, 9:11:46 AM
On Wednesday, July 12, 2017 at 6:47:19 PM UTC-4, Shark8 wrote:
> On Tuesday, July 11, 2017 at 11:35:19 PM UTC-6, Randy Brukardt wrote:
> >
> > The coming problem with "classic" programming languages like Ada is that
> > they don't map well to architectures with a lot of parallelism. The more
> > constructs that require sequential execution, the worse that Ada programs
> > will perform.
>

> That's also why I'm also a bit dubious about the proposed PARALLEL / AND / END blocks: it seems to me that the mistake here is to delve too far into the minutia (the above "parallel assembly" idea) so as to make it difficult or impossible to automatically optimize parallel code because of the more low-level view imposed by the language... much like C's notion of arrays is fundamentally broken and undermines the benefits of C++.
>
> Now, I admit I could very well be wrong here, but there's a gut-feeling that this is not the route we want to go down.

In the process of going down this rabbit hole from the back entrance...

Consider a very lightweight task construct like:

for parallel I in X'Range loop...end loop;

Obviously anything declared within the loop needs no protection (unless there are nested tasks). Similarly, reads of external variables need no protection as long as there are no updates during the existence of the loop. Can these tasks call (heavyweight) tasks? Sure, as long as the reverse is not permitted. Note that you don't want to do this unless it is a single call with the correct answer. (This needs some thinking about. Maybe: for I in X'Range loop until Solved ... with the semantics that no new threads will be spawned once Solved is set to True.)

What about protected objects? Fine, as long as you realize that calls to functions of a protected object are fine, but calls to procedures and entries of a protected object from within a parallel loop are likely to bring you down to single-threaded speeds. Atomic objects can be updated, but it seems best to have intrinsic operations which map to RMW cycles that have hardware support. For example, a Max intrinsic: procedure Max (X, Y) assigns Y to X if Y is greater than X, and otherwise does nothing. Or a function Add (X, Y) that returns X + Y. Or even an Add_One (X) that returns True if X becomes zero, if that is all you have.
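
For the semantics only (not the hardware support this asks for), the Max idea
can be sketched today with a protected object; the serialization it implies is
exactly what an RMW intrinsic would avoid:

   protected type Atomic_Max is
      procedure Update (Y : Integer);   --  X := Integer'Max (X, Y)
      function Value return Integer;
   private
      X : Integer := Integer'First;
   end Atomic_Max;

   protected body Atomic_Max is
      procedure Update (Y : Integer) is
      begin
         if Y > X then
            X := Y;
         end if;
      end Update;

      function Value return Integer is
      begin
         return X;
      end Value;
   end Atomic_Max;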

Again, the direction I am coming at this from is how to create very lightweight threads that can use all of those CPU cores or GPU shaders out there. When computing an array, you depend on cache coherency and write pipes to ensure that writes to a result array don't produce garbage. If this means that threads must be merged (by hand or by the compiler) so that a complete cache line is written by one lightweight thread, so be it. For example, in matrix multiplication you might have to compute 4 or 8 entries in the result array from one thread. Not a problem if it allows me to compute the result using all the CPU cores or shaders available without spending 90% of the cycles in synchronization.

Would doing all this "right" require a lot of compiler support? Sure. But if that is what it takes, there will be (expensive) compilers for large machines, and compilers which recognize the syntax and translate it into standard Ada tasking.

Randy Brukardt

Jul 17, 2017, 7:54:56 PM
"Shark8" <onewing...@gmail.com> wrote in message
news:b6e36917-63f0-4c13...@googlegroups.com...
On Tuesday, July 11, 2017 at 11:35:19 PM UTC-6, Randy Brukardt wrote:
>
>> The coming problem with "classic" programming languages like Ada is that
>> they don't map well to architectures with a lot of parallelism. The more
>> constructs that require sequential execution, the worse that Ada programs
>> will perform.
...
>Yes, I know that "automatic parallelization" of high-level language is
>considered
> fairly hard right now, but less than 50 years ago an optimizing compiler
> was
> considered hard... and we gained a LOT from allowing optimizing compilers
> + bootstrapping.

No, I think there is very little chance of it becoming possible. When I was
a young buck at the University of Wisconsin, there was a lot of optimism and
research into automatic parallelization of programs (particularly Fortran
programs). I recall reading a number of papers in this area for classes.

Fast forward (gulp!) about 38 years, and most of the optimism is gone, and
there has not been a lot of progress in the area of automatic
parallelization. Certainly, compilers can recognize certain patterns and
parallelize them (generally in cases where all of the "tasklets" run the
same code), but general-purpose parallelization has defied solution (at
least for conventional languages -- there might be more luck with a
purpose-built language like Parasail -- but this is an Ada forum and such
languages are irrelevant for Ada).

It should be clear that fine-grained parallelism (the kind that a compiler
could conceivably do) is likely to be a failure, as scheduling takes a
significant amount of overhead, on top of which a lot of conventional
sequential code doesn't have much parallelism. (Remember that things like
exceptions, assignment semantics, and the like add sequential requirements
that probably aren't necessary to solve the actual problem.)

I would never want to say never, but I think it is far more likely that
programming will cease to be a human activity at all (perhaps because
everything is replaced by neural nets) than automatic parallelization of
"classic" languages become possible.

>That's also why I'm also a bit dubious about the proposed PARALLEL / AND /
>END
> blocks: it seems to me that the mistake here is to delve too far into the
> minutia (the
> above "parallel assembly" idea) so as to make it difficult or impossible
> to automatically
> optimize parallel code because of the more low-level view imposed by the
> language...
> much like C's notion of arrays is fundamentally broken and undermines the
> benefits
> of C++.

If you believe, like I do, that automatic parallelization is likely to be
impossible in the reasonable term, the next best thing is to give the
compiler hints. Moreover, nothing about "parallel" blocks and loops actually
requires them to be run in parallel; it just suggests the possibility. It
also includes checking that the "parallel" code is actually possible to
execute in parallel, so you are actually giving an optimizer more help by
ensuring that the code can really be run that way.

My goal for these constructs is that they are essentially semantically
identical to the non-parallel versions, just with checking that parallel
execution is possible. (I have no idea how close or far from that goal we
will end up, as these constructs change a lot from meeting to meeting.)

> Now, I admit I could very well be wrong here, but there's a gut-feeling
> that
> this is not the route we want to go down.

You're wrong here. ;-)

I think it is highly unlikely that the majority of Ada code could ever be
parallelized automatically. Assignments (needed for results like
reductions), and the handling of exceptions would prevent that. There is
also the need (at least in the foreseeable future) to make your "chunks" of
execution large enough so that scheduling overhead doesn't kill any
parallelism gains. That's best handled with explicit code of some kind;
Brad's libraries are a step in the right direction, but one would want
checking for conflicts (one of the things that is extremely hard to do with
existing Ada tasks) and more integration with the syntax.

Randy.


Dmitry A. Kazakov

Jul 18, 2017, 3:38:14 AM
On 18/07/2017 01:54, Randy Brukardt wrote:

> No, I think there is very little chance of it becoming possible. When I was
> a young buck at the University of Wisconsin, there was a lot of optimism and
> research into automatic parallelization of programs (particularly Fortran
> programs). I recall reading a number of papers in this area for classes.
>
> Fast forward (gulp!) about 38 years, and most of the optimism is gone, and
> there has not been a lot of progress in the area of automatic
> parallelization. Certainly, compilers can recognize certain patterns and
> parallelize them (generally in cases where all of the "tasklets" run the
> same code), but general-purpose parallelization has defied solution (at
> least for conventional languages -- there might be more luck with a
> purpose-built language like Parasail -- but this is an Ada forum and such
> languages are irrelevant for Ada).
>
> It should be clear that fine-grained parallelism (the kind that a compiler
> could conceivably do) is likely to be a failure, as scheduling takes a
> significant amount of overhead, on top of which a lot of conventional
> sequential code doesn't have much parallelism. (Remember that things like
> exceptions, assignment semantics, and the like add sequential requirements
> that probably aren't necessary to solve the actual problem.)
>
> I would never want to say never, but I think it is far more likely that
> programming will cease to be a human activity at all (perhaps because
> everything is replaced by neural nets) than automatic parallelization of
> "classic" languages become possible.

Huh, when I was young everybody believed AI was round the corner. However
already then, long before that actually, it was well known that
neural networks were a dead end.

The algorithm (the Perceptron) was invented in the 50's, and it was shown
already then that it can separate only linear classes, that is, classes
divided by a flat surface in an n-D space. Imagine how it would work with
two concentric circles. Even a hen can do that, and probably an insect too.

Fast forward: people forgot the 50's, the 80's and incidentally the very
skill of researching (reading?) scientific publications ... and then the
Perceptron was reborn!

> If you believe, like I do, that automatic parallelization is likely to be
> impossible in the reasonable term, the next best thing is to give the
> compiler hints. Moreover, nothing about "parallel" blocks and loops actually
> requires them to be run in parallel; it just suggests the possibility. It
> also includes checking that the "parallel" code is actually possible to
> execute in parallel, so you are actually giving an optimizer more help by
> ensuring that the code can really be run that way.

I fully agree.

> My goal for these constructs is that they are essentially semantically
> identical to the non-parallel versions, just with checking that parallel
> execution is possible. (I have no idea how close or far from that goal we
> will end up, as these constructs change a lot from meeting to meeting.)

IMO, it is a pure waste of time while so many problems stay unresolved.

Modern multi-core shared-memory architectures are clearly a dead end.
When new architectures emerge we will see what language support, or even
what paradigms, will be needed to handle them, not before.