Hi,
Using array2d is indeed the only (and idiomatic) way of initialising a 2d array from a comprehension. However, you can take advantage of the fact that nested generators are executed in the same order in which multi-dimensional arrays are laid out in memory, so you can simply write
array[A,B] of int: matrix = array2d(A, B, [ a*b | a in A, b in B ]);
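For instance, with concrete index sets (the ranges here are chosen purely for illustration), this gives the row-major layout you'd expect:

```minizinc
set of int: A = 1..2;
set of int: B = 1..3;
% Generators nest left to right, matching row-major array layout:
array[A,B] of int: matrix = array2d(A, B, [ a*b | a in A, b in B ]);
% matrix = [| 1, 2, 3
%           | 2, 4, 6 |]
```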
We've been thinking about adding syntax for 2d and 3d comprehensions, so this may come in a future release.
Note that [ [a*b | a in A] | b in B ] would be a different type, array[B] of array[A] of int, since comprehensions of that form can produce irregularly shaped arrays. E.g., [ [a*b | a in 1..b] | b in 1..10 ] can't be expressed as a single multi-dimensional array.
We could introduce syntax such as [ a*b | a in A | b in B ], where the two generator parts would be independent, i.e., [ a*b | a in 1..b | b in B ] would not be allowed since b isn't visible in the inner generator. This could of course be extended to three or more dimensions as well.
One problem that this syntax doesn't solve is that comprehensions generate 1-based integer index sets, so you wouldn't get any type safety when using enums. E.g., consider the following code:
enum X = {A,B,C};
enum Y = {D,E,F};
array[X,Y] of int: x = [ 1 | i in Y | j in X ];
This is incorrect, because we're iterating over Y and X in the wrong order. Currently, the compiler wouldn't catch this, and even worse, if you add an array2d call, it will happily coerce the index sets (because they have the same size). So it would be good to have a comprehension construct that automatically determines the resulting index sets as well, but I haven't been able to come up with a concise and clear syntax for that yet.
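A minimal sketch of that pitfall in current MiniZinc syntax (reusing the enums above): the iteration order is wrong, but because the comprehension produces a 1-based array of the right length, array2d coerces it without complaint.

```minizinc
enum X = {A,B,C};
enum Y = {D,E,F};
% The generators iterate Y-then-X, i.e. column-major with respect to
% the declared index sets [X,Y] -- yet this compiles, because the
% comprehension yields a 1-based array of |X|*|Y| elements and
% array2d coerces it to the requested index sets:
array[X,Y] of int: x = array2d(X, Y, [ i + j | i in Y, j in X ]);
```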
Cheers,
Guido