constexpr all_of etc.


David Krauss

Sep 2, 2014, 10:19:41 AM
to std-pr...@isocpp.org
The initializer_list overloads of std::min and std::max are constexpr. Why not add similar overloads to std::all_of, any_of, and none_of?

The use that comes to mind is pack inequality.

template< typename ... pack_a, typename ... pack_b >
std::enable_if_t< ! std::all_of({ std::is_same_v< pack_a, pack_b > ... }) >

Although, come to think of it, this can be done with tuples.

std::enable_if_t< ! std::is_same_v< std::tuple< pack_a ... >, std::tuple< pack_b ... > > >

I’d still prefer the <algorithm> solution.
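For concreteness, the proposed overload is a one-liner under C++14 relaxed constexpr rules. A minimal sketch (the free function below is hypothetical, not existing standard wording):

```cpp
#include <initializer_list>

// Hypothetical overload in the spirit of the std::min/std::max
// initializer_list overloads: true iff every element is true.
constexpr bool all_of(std::initializer_list<bool> values) {
    for (bool b : values)   // range-for is allowed in C++14 constexpr
        if (!b)
            return false;
    return true;
}

// Usable directly in constant expressions:
static_assert(all_of({true, true, true}), "all elements true");
static_assert(!all_of({true, false, true}), "one element false");
```

any_of and none_of would follow the same pattern.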

Ville Voutilainen

Sep 2, 2014, 11:14:41 AM
to std-pr...@isocpp.org
On 2 September 2014 12:59, David Krauss <pot...@gmail.com> wrote:
> The initializer_list overloads of std::min and std::max are constexpr. Why
> not add similar overloads to std::all_of, any_of, and none_of?

Sounds like a decent idea. Any particular reason why we wouldn't make
the existing overloads constexpr?

> The use that comes to mind is pack inequality.
>
> template< typename ... pack_a, typename ... pack_b >
> std::enable_if_t< ! std::all_of({ std::is_same_v< pack_a, pack_b > ... }) >
>
> Although, come to think of it, this can be done with tuples.
>
> std::enable_if_t< ! std::is_same_v< std::tuple< pack_a ... >, std::tuple<
> pack_b ... > > >
>
> I’d still prefer the <algorithm> solution.


Sure, the tuple solution works for this particular case, but I suppose the algorithm solution has more potential uses.

George Makrydakis

Sep 2, 2014, 12:50:21 PM
to std-pr...@isocpp.org, Ville Voutilainen

The tuple solution works because you are using std::tuple as a boost::mpl::vector substitute and implementing catamorphisms over packs through "tuple processing", naming them differently from what they substantially are.

Are you certain that making constexpr overloads of algorithms like all_of is better than addressing the real issue, namely the lack of first-class support for the fundamentals allowing such constructs? Shouldn't these things eventually be considered by expanding <type_traits> into boost::mpl territory?

Seems more like another popular substitute by hasty inspiration than a solution by design.

rhalb...@gmail.com

Sep 2, 2014, 1:35:07 PM
to std-pr...@isocpp.org
On Tuesday, September 2, 2014 5:14:41 PM UTC+2, Ville Voutilainen wrote:
On 2 September 2014 12:59, David Krauss <pot...@gmail.com> wrote:
> The initializer_list overloads of std::min and std::max are constexpr. Why
> not add similar overloads to std::all_of, any_of, and none_of?

Sounds like a decent idea. Any particular reason why we wouldn't make
the existing overloads constexpr?

Funnily enough, the relaxed constexpr paper N3597 has std::bitset::any() as a motivating example, so it would fit right into that philosophy.

And since std::move is also constexpr as of C++14, why not make std::swap constexpr as well?  

And why not lambdas and all of std::array, std::bitset, std::tuple, std::complex as well? 

Or ultimately: full compile-time function evaluation as is possible in D (anything not involving I/O, so including virtuals and dynamic allocation).
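On the swap point: a constexpr swap is already expressible under C++14 relaxed constexpr. A sketch with a hypothetical name (not a change to std::swap itself), assuming a move-assignable T:

```cpp
#include <utility>

// Hypothetical constexpr counterpart of std::swap; C++14 allows
// void-returning constexpr functions that mutate their arguments.
template <typename T>
constexpr void constexpr_swap(T& a, T& b) {
    T tmp = std::move(a);
    a = std::move(b);
    b = std::move(tmp);
}

constexpr int demo() {
    int x = 1, y = 2;
    constexpr_swap(x, y);
    return x * 10 + y;  // 21 if the swap happened
}
static_assert(demo() == 21, "swapped during constant evaluation");
```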

Ville Voutilainen

Sep 2, 2014, 1:42:04 PM
to std-pr...@isocpp.org
On 2 September 2014 20:35, <rhalb...@gmail.com> wrote:
> And since std::move is also constexpr as of C++14, why not make std::swap
> constexpr as well?
>
> And why not lambdas and all of std::array, std::bitset, std::tuple,
> std::complex as well?

Chances are that most of the things in that set of examples are good candidates for additional constexpr.

> Or ultimately: full compile-time function evaluation as is possible in D
> (anything not involving I/O, so including virtuals and dynamic allocation).


It's likely, however, that at some point there's going to be increasing resistance towards imposing such compile-time evaluation of everything on every implementation.

George Makrydakis

Sep 2, 2014, 1:47:39 PM
to std-pr...@isocpp.org, rhalb...@gmail.com

On September 2, 2014 8:35:07 PM EEST, rhalb...@gmail.com wrote:

>And why not lambdas and all of std::array, std::bitset, std::tuple,
>std::complex as well?
>
>Or ultimately: full compile-time function evaluation as is possible in
>D
>(anything not involving I/O, so including virtuals and dynamic
>allocation).

A simple explanation for this is political pride over technical substance. You will get many non-reasons supporting this stance. It is more convenient to proceed this way because hasty inspirations are more popular with expert beginners.

Rein Halbersma

Sep 2, 2014, 1:50:19 PM
to std-pr...@isocpp.org
On Tue, Sep 2, 2014 at 7:42 PM, Ville Voutilainen <ville.vo...@gmail.com> wrote:
On 2 September 2014 20:35,  <rhalb...@gmail.com> wrote:
> And since std::move is also constexpr as of C++14, why not make std::swap
> constexpr as well?
>
> And why not lambdas and all of std::array, std::bitset, std::tuple,
> std::complex as well?

Chances are that most of the things in that set of examples are good candidates for additional constexpr.

Great! Are there efforts underway that you know of?
 
> Or ultimately: full compile-time function evaluation as is possible in D
> (anything not involving I/O, so including virtuals and dynamic allocation).


It's likely, however, that at some point there's going to be increasing resistance towards imposing such compile-time evaluation of everything on every implementation.

Just a naive and honest question: what are the main technical obstacles to compile-time evaluation of all non-I/O expressions? (preferably without the constexpr keyword; implicit would do nicely) It works for D, so it seems it can be done, but maybe D's compilation model is too different from C++'s?

George Makrydakis

Sep 2, 2014, 1:55:41 PM
to std-pr...@isocpp.org, Rein Halbersma
Compile time evaluation implies implementation of constructs that have been naively considered as "abominations" for the language. See work done by Alexandrescu on static if. Some of it could get resurrected by Ville. Still not enough.

Rein Halbersma

Sep 2, 2014, 2:18:14 PM
to std-pr...@isocpp.org
On Tue, Sep 2, 2014 at 7:55 PM, George Makrydakis <irreq...@gmail.com> wrote:
Compile time evaluation implies implementation of constructs that have been naively considered as "abominations" for the language. See work done by Alexandrescu on static if. Some of it could get resurrected by Ville. Still not enough.

By "abomination" you mean, of course, "posing obstacles to Concepts".

I don't see how interference with the Concepts holy grail (which basically torpedoed static_if) is an issue for full compile-time function evaluation. The restriction on constexpr lambdas is especially frustrating, as it can be worked around by using a handwritten struct with a constexpr constructor and operator(). The same Concepts-interference card was played against allowing polymorphic lambdas (which could also be worked around by using a templated operator()), until that was shown to be resolvable.
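The workaround in question can be sketched as follows: a handwritten function object whose (implicitly constexpr) default constructor and constexpr operator() do what a C++14 lambda cannot. IsEven is an illustrative name, not anything standard:

```cpp
// Stand-in for the disallowed constexpr lambda: an empty aggregate with
// a constexpr call operator is usable in constant expressions.
struct IsEven {
    constexpr bool operator()(int n) const { return n % 2 == 0; }
};

// A C++14 lambda's operator() is never constexpr, so the equivalent
// [](int n) { return n % 2 == 0; } could not be used here:
static_assert(IsEven{}(4), "even");
static_assert(!IsEven{}(3), "odd");
```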

George Makrydakis

Sep 2, 2014, 2:26:56 PM
to std-pr...@isocpp.org

You essentially caught the gist of it. Concepts are nothing more than glorified shorthands over SFINAE hacks. Their political justification is the overwhelming non-reason for not going through with full-blown compile-time evaluation in a friendly way. Their actual intent would have been met if they behaved more like typeclasses instead of SFINAE shorthands.

In another thread, Sutton says that he would consider concepts a failed experiment if they required any kind of template metaprogramming in order to work. The problem is that template metaprogramming is the quintessential tool for compile-time evaluation and, properly done, annihilates concepts as they are right now. The more constexpr metaprogramming advances to become as broad as template metaprogramming, the less need there is for concepts. That is one big issue for constexpr.

--

---
You received this message because you are subscribed to the Google Groups "ISO C++ Standard - Future Proposals" group.
To unsubscribe from this group and stop receiving emails from it, send an email to std-proposal...@isocpp.org.
To post to this group, send email to std-pr...@isocpp.org.
Visit this group at http://groups.google.com/a/isocpp.org/group/std-proposals/.

Ville Voutilainen

Sep 2, 2014, 2:27:09 PM
to std-pr...@isocpp.org
On 2 September 2014 21:18, Rein Halbersma <rhalb...@gmail.com> wrote:
> I don't see how interference with the Concepts holy grail (that basically
> torpedoed static_if) is an issue with full compile-time function evaluation.


It seems there are enough other reasons besides Concepts that
torpedoed static_if.

rhalb...@gmail.com

Sep 2, 2014, 2:28:44 PM
to std-pr...@isocpp.org
Which is why I asked: "what are the main technical obstacles for compile-time evaluation of all non-I/O expressions?" 

George Makrydakis

Sep 2, 2014, 2:29:56 PM
to std-pr...@isocpp.org, Ville Voutilainen

The issue is not static if alone. The committee does not have a real argument against compile time evaluation. Just politics.

Ville Voutilainen

Sep 2, 2014, 3:15:30 PM
to George Makrydakis, std-pr...@isocpp.org
On 2 September 2014 21:29, George Makrydakis <irreq...@gmail.com> wrote:
> The issue is not static if alone. The committee does not have a real
> argument against compile time evaluation. Just politics.

Well, as I see it, the argument is very real, and it's not politics in the usual sense of the word, but concern about implementation complexity, which in turn leads to portability concerns when different implementations ship more complex constexpr support(*) on vastly different schedules. If that's politics to you, that's fine by me, but that argument is _very_ real.

(*) Feel free to count the number of implementations that support C++14 constexpr. The result of that count is one. The expectation is that there should eventually be half a dozen more.

George Makrydakis

Sep 2, 2014, 3:15:51 PM
to std-pr...@isocpp.org, rhalb...@gmail.com
I seriously doubt we are ever going to get a convincing argument against having such levels of compile-time evaluation. People who adamantly oppose it argue that it would be too hard for non-expert users (nonsense), that their compilers would be strained (nonsense), or they just admit their terror even in their own proposals (nonsense++). People using libraries like boost::mpl show that there is a wide area of coverage for such kinds of evaluation.

Mostly it is about waiting to gather everything into a bizarro universe where they can be shown to be correct in their approach and pontificate a "solution" that supports their views. Or about waiting until they become proficient enough to understand the benefits of Turing completeness within the compiler. But why aren't they simply accepting that such evaluation should be a priority?

Politics used to justify technical decisions are inexplicable to all but expert beginners. The continuous relaxation of constexpr requirements is slowly leading C++ towards compile-time evaluation semantics (albeit very, very incompletely). I no longer think that most of these people actually know what they are doing, especially when they have only politics supporting them.



Rein Halbersma

Sep 2, 2014, 3:38:47 PM
to std-pr...@isocpp.org
On Tue, Sep 2, 2014 at 9:15 PM, Ville Voutilainen <ville.vo...@gmail.com> wrote:

(*) Feel free to count the number of implementations that support C++14 constexpr. The result of that count is one. The expectation is that there should eventually be half a dozen more.

And what is holding the non-Clang compilers back, exactly? IIRC, Clang moved from C++11-style to C++14-style constexpr in a matter of a few months last year. It's sad that the slowest adopters determine the pace of progress. Surely the committee is not in the business of protecting half a dozen foot-dragging compilers from competitive pressures?

It would be interesting to see what would happen if someone hacked Clang to introduce full D-style CTFE as an extension (or TS, or whatever the politically accepted phrase is).

George Makrydakis

Sep 2, 2014, 3:40:48 PM
to Ville Voutilainen, std-pr...@isocpp.org

With all due respect, people incapable of fixing their compilers are, in their overwhelming majority, either lazy or ignorant. They have the burden of proof of why they can't do it if there are no conflicts with what is already in the language. "I said so" does not count. I see people working on clang++ able to keep up with language-feature implementations as fast as they come about. I see experimental gcc branches working their way up through features.

Just because some sponsoring companies may have problems putting together a decent compiler team does not mean that C++ has to remain in the dark ages, especially when it comes to compile time evaluation.

Though, as you know, I have genuine respect for your position as secretary, lack of *eventual* compiler support is not a valid reason for not solving a problem the right way. Your argument expresses that of certain people, but I think you are misguided. By analogy, one should not refuse to make an omelette because some eggs would have to be broken. As such, the argument of compiler complexity either safeguards incompetence or is used as a deflector shield against "external" contributions, piggybacking on political correctness in order to avoid liability for irresponsible decisions.

Properly solved problems anticipate eventual compiler "inconsistencies" later on by forcing well-defined semantics. I do not believe that such "difficulty" arguments have any real merit in the realm of compile-time evaluation.

It is all basically politics as I have said before.

George Makrydakis

Sep 2, 2014, 3:59:26 PM
to std-pr...@isocpp.org, Rein Halbersma

Having a C++ language-feature extension that is not in the standard is comparable to a language fork, when it comes to compile-time evaluation. If successful, it could be repeated to the point of people wondering why the responsible committee subgroup was against it, placing them in an awkward position. Given the vested interests intertwined in a player as established in the industry as C++, that is engineered not to happen. Cause and effect.

Respectfully, the committee members involved in crucial decisions like this have an immense burden of proof on their shoulders. I respect them for that, but I do not have to respect unprovable non-arguments even if they come from critically acclaimed people. Even monkeys fall from trees.

You have no valid technical reason against compile time evaluation.

Ville Voutilainen

Sep 2, 2014, 3:59:51 PM
to std-pr...@isocpp.org
On 2 September 2014 22:38, Rein Halbersma <rhalb...@gmail.com> wrote:
> And what is holding the non-Clang compilers back exactly? IIRC, Clang moved

Ask their implementers. As far as I understand, they have other priorities
and responsibilities in addition to maximum speed of implementing new features.

> from C++11-style to C++14-style constexpr in the matter of a few months last
> year. It's sad that the slowest adaptors determine the pace of progress.

I never said the slowest adopters determine the pace of progress. But if we don't pay heed to cases where the majority of vendors are all in the camp of "slowest adopters", madness ensues. We shouldn't let the fastest adopters determine the pace of progress any more than we let the slowest ones.

> Surely the committee is not in the business of protecting half a dozen
> feet-dragging compilers from competitive pressures?

A portion of the committee most certainly is. Other people may at times agree with that portion for other reasons. Fast progress without portability is, for example, more or less useless to me and my users.

> It would be interesting to see what would happen if someone hacked Clang to
> introduce full D-style CTFE as an extension, (or TS or whatever the
> politically accepted phrase is)

It certainly would be interesting, yes.

Rein Halbersma

Sep 2, 2014, 4:03:10 PM
to std-pr...@isocpp.org
OK, Ville, thanks for your insights! I didn't mean to imply that one should hold you responsible for the current state of affairs :-)




George Makrydakis

Sep 2, 2014, 4:09:11 PM
to std-pr...@isocpp.org, Ville Voutilainen

Implementers should do as fine a job as the committee does, since some of them are members of the committee. Their refuge behind unprovable complexity claims is unfounded on technical terms when it harms the language by forcing programmers into complex library constructs that strain the compilers more than language features would.

Again, no valid reason against compile-time evaluation is given, especially since far too many people have uses for it, to the point of implementing such behaviour through template metaprogramming. Just as David did with std::tuple, for example, in this thread.

Ville Voutilainen

Sep 2, 2014, 4:09:59 PM
to std-pr...@isocpp.org
On 2 September 2014 23:03, Rein Halbersma <rhalb...@gmail.com> wrote:
> OK, Ville, thanks for your insights! I didn't mean to imply that one should
> hold you responsible for the current state of affairs :-)

No harm done. In general, for full compile-time evaluation, we would need
1) a proposal...
1.1) ...that has convincing motivation
2) to understand that shouting on this forum about how C++ "chooses to remain in the dark ages" doesn't cover either of the above (and I'm not specifically addressing you with this, Rein :) )
3) to understand that repeating the statement "you have no technical argument" doesn't help much, either. The committee doesn't need a technical argument, like it or not. The burden of proof for language extensions lies first and foremost on the proposal author. The committee's _default_ answer to _anything_ that doesn't reach consensus is "no".

George Makrydakis

Sep 2, 2014, 4:23:20 PM
to std-pr...@isocpp.org, Ville Voutilainen

Addressing others by proxy does not validate indefensible claims. The committee is proven wrong on all your points when decisions based on social non-arguments are put forward by sponsored lobbying. Especially when it handles budding drafts that are not proposals yet and haven't been put to any vote, yet delegates their essence to its sponsored members using arguments as bogus as those given for the lack of compile-time evaluation. Worse if it is misleading people.

So don't ask people for proposals if, in the end, you are the only ones who want to do them. And I am not addressing you directly as Ville, but as a responsible secretary. If you think that people should have respect for such an argument, the answer is "No".

Ville Voutilainen

Sep 2, 2014, 4:37:48 PM
to George Makrydakis, std-pr...@isocpp.org
On 2 September 2014 23:23, George Makrydakis <irreq...@gmail.com> wrote:
> Addressing others by proxy does not validate indefensible claims. The
> committee is proven wrong in all your points when decisions based on social
> non-arguments are put forward by sponsored lobbying. Especially when it has
> to handle budding drafts that are not proposals yet, that haven't been put
> to any vote, yet delegating their essence to their sponsored members using
> arguments as bogus as the ones for the lack of compile time evaluation.
> Worse if it is misguiding people.

I think what's misleading is all these unsubstantiated claims about "social non-arguments", "sponsored lobbying" and "bogus arguments".

> So don't ask people for proposals if in the end you are the only ones who
> want to do them. And I am not addressing you directly as Ville, but as a

We actually have evidence of not being the "only ones who want to do proposals".
Most of the successful proposals originating outside the committee actually
followed the proposal guidelines and considered the feedback given by the
subgroups of the committee.

> responsible secretary. If you think that people should have respect for such
> an argument, the answer is "No".

You seem to have a very bad misunderstanding of what the INCITS/WG21 secretary is and is not responsible for.

Richard Smith

Sep 2, 2014, 6:33:52 PM
to std-pr...@isocpp.org
On Tue, Sep 2, 2014 at 12:38 PM, Rein Halbersma <rhalb...@gmail.com> wrote:
On Tue, Sep 2, 2014 at 9:15 PM, Ville Voutilainen <ville.vo...@gmail.com> wrote:

(*) Feel free to count the number of implementations that support C++14 constexpr. The result of that count is one. The expectation is that there should eventually be half a dozen more.

And what is holding the non-Clang compilers back exactly? IIRC, Clang moved from C++11-style to C++14-style constexpr in the matter of a few months last year.

The implementation of C++14 constexpr in Clang took, in total, about a week. But the starting point wasn't the same as it is for most compilers. There are basically two different ways that people have historically implemented constant expression evaluation in C and C++ compilers:

1) "fold": the AST is rewritten as constant expressions are simplified. In some implementations this happens as you parse, in others it happens as a separate step. So when you build a + operation whose operands are 1 and 1, you end up with an expression "2" and no evidence that you ever had a '+'. This also means you can use essentially the same code to perform various kinds of optimization.
2) Real evaluation: the AST represents the code as written, and a separate process walks it and produces a symbolic value from it.

Most implementations seem to do (1) in some way or another. This is fairly well-suited for C++11 constexpr (where you can in-place-rewrite calls to constexpr functions to the corresponding returned expression, substitute in values for function parameters, and just keep on folding), but not well-suited to C++14 constexpr. You can, in principle, use a similar technique in C++14, but mutability and non-trivial control flow means you need to keep a lot more things symbolic, retain an evaluation environment, and perform rewrites in the appropriate sequencing order. [Even in C++11, this approach is somewhat tricky, because (for instance) you need to track whether the fold invoked undefined behavior (which would render an expression non-constant) or otherwise did something that's not allowed in a constant expression.]

Clang has always done (2). This made the C++11 constexpr implementation more complex (because Clang couldn't rely on the existing code generation codepaths to perform constant expression evaluation; all sorts of new forms of evaluation needed to be implemented) but made implementing the C++14 constexpr much more straightforward, since most of the necessary infrastructure was built as part of the C++11 constexpr implementation.
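A toy illustration (in no way Clang's actual code) of the difference between the two strategies: folding while building combines 1 + 1 into the literal 2 and loses the '+', while faithful construction plus a separate evaluation walk keeps the AST as written. All names below are invented for the sketch:

```cpp
#include <memory>
#include <utility>

struct Node {
    enum Kind { Lit, Add } kind;
    int value;                       // meaningful when kind == Lit
    std::unique_ptr<Node> lhs, rhs;  // meaningful when kind == Add
};

std::unique_ptr<Node> lit(int v) {
    auto n = std::make_unique<Node>();
    n->kind = Node::Lit;
    n->value = v;
    return n;
}

// Strategy (1), "fold": constant operands are combined immediately,
// so 1 + 1 becomes the literal 2 and the '+' node is never created.
std::unique_ptr<Node> makeAddFolding(std::unique_ptr<Node> l,
                                     std::unique_ptr<Node> r) {
    if (l->kind == Node::Lit && r->kind == Node::Lit)
        return lit(l->value + r->value);
    auto n = std::make_unique<Node>();
    n->kind = Node::Add;
    n->lhs = std::move(l);
    n->rhs = std::move(r);
    return n;
}

// Strategy (2), real evaluation: build the AST exactly as written...
std::unique_ptr<Node> makeAdd(std::unique_ptr<Node> l,
                              std::unique_ptr<Node> r) {
    auto n = std::make_unique<Node>();
    n->kind = Node::Add;
    n->lhs = std::move(l);
    n->rhs = std::move(r);
    return n;
}

// ...and let a separate walk produce the value on demand.
int evaluate(const Node& n) {
    if (n.kind == Node::Lit)
        return n.value;
    return evaluate(*n.lhs) + evaluate(*n.rhs);
}
```

With strategy (2) the evaluator can be extended with an environment, mutation, and control flow without disturbing the AST, which is why it extends more naturally to C++14 constexpr.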

It's sad that the slowest adaptors determine the pace of progress. Surely the committee is not in the business of protecting half a dozen feet-dragging compilers from competitive pressures? 

It would be interesting to see what would happen if someone hacked Clang to introduce full D-style CTFE as an extension, (or TS or whatever the politically accepted phrase is)

I think the remaining holes in C++14 constexpr are:

 * lambdas
 * destructors
 * polymorphism (virtual functions, dynamic_cast, typeid, exceptions [for polymorphic catch])
 * dynamic memory allocation

The other restrictions in 5.19 exist to prevent "bad things" (reading the value representation of an object, undefined behavior, uninitialized stuff, ...) that would be non-portable, unimplementable (inline asm, depending on values that don't exist until runtime, ...), or -- occasionally -- politically undesirable (goto considered harmful).

I'm confident we'll have a good answer for lambdas + constexpr in C++17.

Constexpr destructors add some implementation cost (implementations would need to track whether and when to run destructors).

The committee had some fears about implementation issues with polymorphism (requiring the compiler to track the dynamic type of an object), but we already actually require such tracking, to support things like checking the validity of a base-to-derived cast. I don't think there are pressing technical issues here.

Dynamic memory allocation is a challenge, due to three factors:
 1) Values will be represented differently during translation and during execution, so it's not reasonable to expose the value representation of objects during constexpr evaluation. This is mostly a problem for placement allocation functions, since these allow a constexpr evaluation to get a view of the same storage as both a T* and as (say) an array of unsigned char.
 2) Even if restricted to just the normal allocation functions, there is still the issue that the normal allocation functions are replaceable. We have (in C++14) adopted a rule that the compiler is not actually required to call these functions if it can satisfy the allocation in some other way, so this problem is mitigated; in constexpr evaluations, we would (out of necessity) simply not call the allocation function at all.
 3) If dynamic storage were able to "leak" from compile time evaluation to runtime evaluation, we would need a mechanism to set up an initial heap state of some form. This is both a desirable feature and a very technically challenging one.

The simplest case for dynamic allocation -- support non-placement new and delete only, do not allow the end result of a constant expression evaluation to refer to dynamically-allocated memory, and do not call a replacement global allocation/deallocation function -- is completely straightforward in Clang's implementation. I can't speak for the complexity that would be required in other implementations.

George Makrydakis

Sep 2, 2014, 7:17:21 PM
to Ville Voutilainen, std-pr...@isocpp.org


On September 2, 2014 11:37:47 PM EEST, Ville Voutilainen <ville.vo...@gmail.com> wrote:

>
>I think what's misguiding is all these unsubstantiated claims about
>"social non-arguments",
>"sponsored lobbying" and "bogus arguments".

C++ needs volunteers, but it seems that certain people will have problems with this. I am just informing a potential new author of what to expect, given that you tried to motivate another person into going through with a proposal. I will refer you to certain events that occurred in Rapperswil, where you motivated me to come and discuss some preliminary work, and to what happened afterwards - though I am very happy to have been there. It was awesome! However...

No votes, no technical arguments; given a library guideline for something addressing EWG#30 as discussed, then the following day seeing the gist of it graciously delegated to an internal party as a language feature. I cared little about the library guideline "result", as can be documented in another email. I no longer think that most of you people can actually claim to understand what template metaprogramming is for. "Your" problem started the next day, because of EWG#30 and what some brilliant but hasty people did. That's when "you" lost face, to put it bluntly.

Sponsored lobbying occurs when people who work for your sponsors in a big company with a vested interest in C++ vote for their work colleagues while not even bothering to consider third-party work in order to prove or disprove it - despite "wanting" to hear what it is about. The claims are pretty much substantiated, and your defensive position against them devalues you. Don't discuss a case that did not involve you directly, for obvious reasons.

>
>We actually have evidence of not being the "only ones who want to do
>proposals".
>Most of the successful proposals originating outside the committee
>actually
>followed the proposal guidelines and considered the feedback given by
>the
>subgroups of the committee.

Misleading in its generalization by induction, and inaccurate in its content by ad-hominem intention. You yourself forwarded two N-less drafts that were not part of the pre-mailing of the last meeting to the EWG, and they were not given equal treatment in the case I am referring to.

Herb Sutter mentioned in public that such ways of leading people into discussing their preliminary work do not follow procedure. The results of such non sequiturs disprove your position as to the infallible modus operandi of certain committee practices.

Giving a library guideline one day, while delegating the gist of the same work to a specific person as a language feature the next, is very, very problematic for "your" image. I still think it was an honest error, but I see no real amendment to it by those involved. I let the readers assume what they wish. I am entitled to continue my work based on that preliminary draft (not proposal!) and let their overconfident brilliance deal with the obvious aftermath.

Therefore, not only can you not cite procedure in your defense, but neither can the EWG subgroup justify its conflicting deliberations other than on the basis of C++ political-social necessities and non-arguments, given that the EWG is not incompetent. I find that particularly amusing under first-order logic.

>
>> responsible secretary. If you think that people should have respect
>for such
>> an argument, the answer is "No".
>
>You seem to have a very bad misunderstanding what the INCITS/WG21
>secretary
>is and is not responsible for.

Again, motivating people to come to standard C++ meetings and work voluntarily on material that is then "encouraged" towards misleading guidelines (and subsequent redelegation to internal parties by "blunder") is awkward for you to defend. More so when you are a committee member, a national-body authority in WG21, and EWG secretary of the C++ committee. It is very problematic for the prestige of such meetings, even though you are not guilty as charged. If "I" cannot trust somebody who probably understands this language better than "I" do, and who by title is aware of how these things are done procedurally, whom am "I" to trust? Naked kings?

Therefore, I am suggesting that you should not motivate newcomers on this list, and in this manner, to follow a procedure for language features that "you guys" apparently have varying respect for. I still consider that documentable case an unfortunate error that inconveniently burdens "you". It is a consequence of "your" actions that people with first-hand experience of such errors simply point out the inconsistencies of your paper trail.

Lastly, you seem to have a very big misunderstanding of what I am actually going to prove, despite your witty attempts at subverting the argument. This is not an argument "you guys" can disprove or ignore through facts of "your" own creation. I am working on my own public draft, based on my preliminary work, like any other "motivated" author. "Your" error deserves a particular kind of treatment, by technical resolution.

Have a pleasant evening, and remember to keep it technical, as is my substantiated critique of "your" procedures. You will have to discuss this in your little inner circle, regardless of what your forced stance on this list may be.

I still think *you* are an honest man; feel free to prove it and continue to do great work.

George

George Makrydakis

Sep 2, 2014, 7:26:04 PM
to std-pr...@isocpp.org

On 09/03/2014 01:33 AM, Richard Smith wrote:
On Tue, Sep 2, 2014 at 12:38 PM, Rein Halbersma <rhalb...@gmail.com> wrote:
On Tue, Sep 2, 2014 at 9:15 PM, Ville Voutilainen <ville.vo...@gmail.com> wrote:

(*) Feel free to count the number of implementations that support
C++14 constexpr.
The result of that count is one. The expectation is that there should
eventually be half
a dozen more.

And what is holding the non-Clang compilers back exactly? IIRC, Clang moved from C++11-style to C++14-style constexpr in a matter of months last year.

The implementation of C++14 constexpr in Clang took, in total, about a week. But the starting point wasn't the same as it is for most compilers. There are basically two different ways that people have historically implemented constant expression evaluation in C and C++ compilers:

I would argue that doing anything related to extending templates and constexpr in clang++ is probably the best way to go about studying new features. It is a better-structured compiler, despite some long-standing bugs of interest to a niche public. This makes it easy to implement new features in it, and for others, like yours truly, to implement their own.


1) "fold": the AST is rewritten as constant expressions are simplified. In some implementations this happens as you parse, in others it happens as a separate step. So when you build a + operation whose operands are 1 and 1, you end up with an expression "2" and no evidence that you ever had a '+'. This also means you can use essentially the same code to perform various kinds of optimization.
2) Real evaluation: the AST represents the code as written, and a separate process walks it and produces a symbolic value from it.

Most implementations seem to do (1) in some way or another. This is fairly well-suited for C++11 constexpr (where you can in-place-rewrite calls to constexpr functions to the corresponding returned expression, substitute in values for function parameters, and just keep on folding), but not well-suited to C++14 constexpr. You can, in principle, use a similar technique in C++14, but mutability and non-trivial control flow means you need to keep a lot more things symbolic, retain an evaluation environment, and perform rewrites in the appropriate sequencing order. [Even in C++11, this approach is somewhat tricky, because (for instance) you need to track whether the fold invoked undefined behavior (which would render an expression non-constant) or otherwise did something that's not allowed in a constant expression.]

Clang has always done (2). This made the C++11 constexpr implementation more complex (because Clang couldn't rely on the existing code generation codepaths to perform constant expression evaluation; all sorts of new forms of evaluation needed to be implemented) but made implementing the C++14 constexpr much more straightforward, since most of the necessary infrastructure was built as part of the C++11 constexpr implementation.

It's sad that the slowest adopters determine the pace of progress. Surely the committee is not in the business of protecting half a dozen feet-dragging compilers from competitive pressures?

It would be interesting to see what would happen if someone hacked Clang to introduce full D-style CTFE as an extension (or TS, or whatever the politically accepted phrase is)

I think the remaining holes in C++14 constexpr are:

 * lambdas
 * destructors
 * polymorphism (virtual functions, dynamic_cast, typeid, exceptions [for polymorphic catch])
 * dynamic memory allocation

The other restrictions in 5.19 exist to prevent "bad things" (reading the value representation of an object, undefined behavior, uninitialized stuff, ...) that would be non-portable, unimplementable (inline asm, depending on values that don't exist until runtime, ...), or -- occasionally -- politically undesirable (goto considered harmful).

I'm confident we'll have a good answer for lambdas + constexpr in C++17.

Constexpr destructors add some implementation cost (implementations would need to track whether and when to run destructors).

The committee had some fears about implementation issues with polymorphism (requiring the compiler to track the dynamic type of an object), but we already actually require such tracking, to support things like checking the validity of a base-to-derived cast. I don't think there are pressing technical issues here.

Dynamic memory allocation is a challenge, due to three factors:
 1) Values will be represented differently during translation and during execution, so it's not reasonable to expose the value representation of objects during constexpr evaluation. This is mostly a problem for placement allocation functions, since these allow a constexpr evaluation to get a view of the same storage as both a T* and as (say) an array of unsigned char.
 2) Even if restricted to just the normal allocation functions, there is still the issue that the normal allocation functions are replaceable. We have (in C++14) adopted a rule that the compiler is not actually required to call these functions if it can satisfy the allocation in some other way, so this problem is mitigated; in constexpr evaluations, we would (out of necessity) simply not call the allocation function at all.
 3) If dynamic storage were able to "leak" from compile time evaluation to runtime evaluation, we would need a mechanism to set up an initial heap state of some form. This is both a desirable feature and a very technically challenging one.

The simplest case for dynamic allocation -- support non-placement new and delete only, do not allow the end result of a constant expression evaluation to refer to dynamically-allocated memory, and do not call a replacement global allocation/deallocation function -- is completely straightforward in Clang's implementation. I can't speak for the complexity that would be required in other implementations.
--


Now, this is a true technical reply and the reason I keep following this list for the time being. Thanks.


David Krauss

Sep 2, 2014, 10:36:19 PM
to std-pr...@isocpp.org
Yikes, last night I slept through my own thread. Here’s a combo reply.

On 2014–09–02, at 11:14 PM, Ville Voutilainen <ville.vo...@gmail.com> wrote:

Sounds like a decent idea. Any particular reason why we wouldn't make
the existing overloads constexpr?

I didn’t mean to specifically exclude the existing ones, but the only current overloads take an iterator range and a predicate, which is typically a lambda. Since we don’t have constexpr containers (unless you count array, and initializer_list which I covered) nor constexpr lambdas, adding constexpr there could be premature.

On the other hand, the future trend will be toward more constexpr. I wish constexpr generic algorithms were already optional to the implementation. Constexpr qualification is designed to limit compilation complexity, but if guarantees of non-qualification lead to programs that really depend on non-qualification, users will have been misled when those programs break.


On 2014–09–03, at 12:50 AM, George Makrydakis <irreq...@gmail.com> wrote:

The tuple solution works because you are using std::tuple as a boost::mpl::vector substitute and implementing catamorphisms over packs through "tuple processing" naming them differently from what they substantially are.

Packs are neither tuples (fixed size, dissimilar elements) nor vectors (fixed size, similar elements); they are lists (variable size, similar elements), or perhaps queues, since insertion and removal occur at either end.

Anyway, I don’t think taxonomy is a good guiding force.

Template-ids work perfectly well as metadata elements. Naming a class template specialization such as tuple<void> does not cause an instantiation, so there are no semantic requirements on the types in the list. Type-list operations can invariably be implemented without instantiating the type-list class.

Metacomputation often generates runtime types, and using tuple as the basis of computation saves a rebind operation. For a program which generates many types, that may halve the metadata the compiler needs to deal with.

Are you certain that making constexpr overloads of such algorithms like all_of etc is better than addressing the real issue, meaning the lack of first class support for the fundamentals allowing such constructs? Shouldn't these things be considered eventually by expanding <type_traits> to boost::mpl territory?

We already have tuple_element and tuple_size.

 Seems more like another popular substitute by hasty inspiration than a solution by design.

bool all_of( std::initializer_list< bool > ) is simple and obvious, does that make it “hasty”? I happen to do a lot of metaprogramming, but the function isn’t specific to metaprogramming at all.

Adding a new core language construct for type lists when we already have all the right machinery, and plenty of legacy code already works without, sounds like over-specific optimization for niche uses at the expense of current practice. Impeding progress on unrelated issues for the sake of increasing pressure to adopt a specific direction on one issue is dirty politics.


On 2014–09–03, at 1:42 AM, Ville Voutilainen <ville.vo...@gmail.com> wrote:

It's, however, likely that at some point there's going to be
increasing resistance
towards imposing such compile-time evaluation of everything on every
implementation.

That happened already at the beginning, right? I’d predict that the state of the art will progress in an equilibrium. It would be unwise to require full runtime emulation from everyone in 2014, but 2030 should be enough time :) . I’m all for progress, but nobody benefits from a standard until it’s implemented, and divergence due to implementations prioritizing different feature subsets is still divergence in practical terms.


On 2014–09–03, at 1:50 AM, Rein Halbersma <rhalb...@gmail.com> wrote:

Just a naive and honest question: what are the main technical obstacles for compile-time evaluation of all non-I/O expressions?

Correctness and portability. C++ is trickier than D.

(and preferably without the constexpr keyword, implicit would do nicely) Works for D, so it seems it can be done, but maybe D's compilation model is too different from C++'s?

http://stackoverflow.com/a/19830673/153285 (and follow the link back to this list).


On 2014–09–03, at 2:29 AM, George Makrydakis <irreq...@gmail.com> wrote:

The issue is not static if alone. The committee does not have a real argument against compile time evaluation. Just politics.

In practice the approaches to metaprogramming are more divergent than to usual programming, for many reasons:

1. It feels different and unfamiliar, so beginners are tempted to throw knowledge out the window and “just hack it.”
2. Metaprogramming is more tractable without side effects, so imperative programming tends to get excluded. This was true even in macro preprocessor days. This is the real reason static_if will never happen.
3. Implementations have various quirks, leading to numerous and divergent superstitious workarounds. I still see new code using typedef char yes[2]; sizeof sfinae_test( blah ) == sizeof (yes). This is just an illustrative example, but distraction and circumlocution have hindered the evolutionary development of C++ best practices.

“Politics” are inevitable. Good politics involves not only consensus, but the modesty to limit eternal commitments to judgments of absolute certainty. High-level metaprogramming in general, not only C++, is too new to expect consensus or certainty about what is necessary or so harmful as to be useless.


On 2014–09–03, at 3:15 AM, Ville Voutilainen <ville.vo...@gmail.com> wrote:

The result of that count is one.

Clang.

The expectation is that there should
eventually be half
a dozen more.

GCC, EDG, MSVC, XLC, Sun, make five.

Borland gave up and adopted Clang. Digital Mars and Green Hills have been awful quiet lately, it’s not clear that they have the economic motivation to finish C++11. What am I missing?

ville.vo...@gmail.com

Sep 3, 2014, 3:04:17 AM
to std-pr...@isocpp.org


On Wed Sep 03 2014 05:36:00 GMT+0300 (EEST), David Krauss wrote:
> Yikes, last night I slept through my own thread. Here's a combo reply.

> On 2014-09-03, at 3:15 AM, Ville Voutilainen <ville.vo...@gmail.com> wrote:
>
> > The result of that count is one.
>
> Clang.
>
> > The expectation is that there should
> > eventually be half
> > a dozen more.
>
>
> GCC, EDG, MSVC, XLC, Sun, make five.
>
> Borland gave up and adopted Clang. Digital Mars and Green Hills have been awful quiet lately, it's not clear that they have the economic motivation to finish C++11. What am I missing?

Intel?

Rein Halbersma

Sep 3, 2014, 3:21:09 AM
to std-pr...@isocpp.org
On Wed, Sep 3, 2014 at 12:33 AM, Richard Smith <ric...@metafoo.co.uk> wrote:
Dynamic memory allocation is a challenge, due to three factors:
 1) Values will be represented differently during translation and during execution, so it's not reasonable to expose the value representation of objects during constexpr evaluation. This is mostly a problem for placement allocation functions, since these allow a constexpr evaluation to get a view of the same storage as both a T* and as (say) an array of unsigned char.
 2) Even if restricted to just the normal allocation functions, there is still the issue that the normal allocation functions are replaceable. We have (in C++14) adopted a rule that the compiler is not actually required to call these functions if it can satisfy the allocation in some other way, so this problem is mitigated; in constexpr evaluations, we would (out of necessity) simply not call the allocation function at all.
 3) If dynamic storage were able to "leak" from compile time evaluation to runtime evaluation, we would need a mechanism to set up an initial heap state of some form. This is both a desirable feature and a very technically challenging one.

The simplest case for dynamic allocation -- support non-placement new and delete only, do not allow the end result of a constant expression evaluation to refer to dynamically-allocated memory, and do not call a replacement global allocation/deallocation function -- is completely straightforward in Clang's implementation. I can't speak for the complexity that would be required in other implementations.

Could you give an example of what type of code would be possible / hard? E.g. what kind of functionality would be feasible for a constexpr std::string? The full std::string interface, or only strings that fit within a small string buffer? It would be extremely nice to abandon all the variadic template/lambda hacks that do some crippled compile-time string manipulation.

George Makrydakis

Sep 3, 2014, 3:27:42 AM
to std-pr...@isocpp.org
About Intel, remember:

1) https://isocpp.org/blog/2013/10/edg48-vc2013
2) https://www.edg.com/index.php?location=faq_q3_who
3) http://www.boost.org/doc/libs/1_56_0/boost/config/compiler/intel.hpp

It seems that the compiler implementation difficulty argument is
problematic as well, since the compilers actually pursuing a valid
implementation are the usual suspects.



George Makrydakis

Sep 3, 2014, 4:36:34 AM
to std-pr...@isocpp.org

On 09/03/2014 05:36 AM, David Krauss wrote:
On 2014–09–03, at 12:50 AM, George Makrydakis <irreq...@gmail.com> wrote:

The tuple solution works because you are using std::tuple as a boost::mpl::vector substitute and implementing catamorphisms over packs through "tuple processing" naming them differently from what they substantially are.

Packs are neither tuples (fixed size, dissimilar elements) nor vectors (fixed size, similar elements); they are lists (variable size, similar elements), or perhaps queues, since insertion and removal occur at either end.

Incorrect; parameter packs behave like product types, and their head-to-tail style of processing is what reveals their "tuple" nature. Are they not an n-ordered sequence of parameter types? Are tuples in their mathematical sense modelled as nested ordered pairs? See an analysis of the behavior of parameter packs at http://ecn.channel9.msdn.com/events/GoingNative12/GN12VariadicTemplatesAreFunadic.pdf. For the rest, google is your friend. Take note that "tuple" without a std:: in front of it does not refer to std::tuple, but to the notion std::tuple is actually implementing (as are others like it).

A std::tuple becomes fixed-size once you instantiate the template. But per se, you use it in a context where it is fed a parameter pack, whose size is determined at instantiation. In serious C++11 implementations, its template parameter list as a class template itself uses a pack, which is the fundamental concept behind what we are referring to type-wise. And it is no secret that template metaprogrammers (ab)use the std::tuple class template, and std::get<N> over it, when they are trying to do some pretty trivial stuff instead of depending upon boost::mpl::vector. Also, remember that boost::mpl::deque, as per http://www.boost.org/doc/libs/1_56_0/boost/mpl/deque.hpp, is in the end based on boost::mpl::vector, which itself in any serious C++11 implementation depends upon the concept of a pack.

What you are describing as "insertion and removal occur at either end" was already happening with boost::mpl::vector, which is easier to implement now that packs are available, because packs are the fundamental construct here. Even boost::mpl::deque actually uses boost::mpl::vector, so yes, terminology is severely impaired among the vast majority of people citing such sources.

Parameter packs, under the condition that their use in a template parameter list is made within a deducible context (remember multiple parameter packs?), can allow both catamorphisms and anamorphisms (left and right folds and unfolds, referring to your "ends" processing) and structure-preserving transformations (for example, mapping). See http://en.cppreference.com/w/cpp/language/parameter_pack#Pack_expansion currently for some creative uses of pack expansion.



Anyway, I don’t think taxonomy is a good guiding force.

In the sense that C++ has the term "functor" for something that has nothing to do with the actual way functor is used in category theory, I would agree. C++ terminology is working against itself because it has to cater for people who willfully ignore established terminology preceding C++ itself. As are other things. So, yes, taxonomy is not helping such arguments because it is vaporizing their substance.



Template-ids work perfectly well as metadata elements. Naming a class template specialization such as tuple<void> does not cause an instantiation, so there are no semantic requirements on the types in the list. Type-list operations can invariably be implemented without instantiating the type-list class.

Metacomputation often generates runtime types, and using tuple as the basis of computation saves a rebind operation. For a program which generates many types, that may halve the metadata the compiler needs to deal with.

First, you are overusing the term meta- to the demise of the argument. That people use std::tuple for some of its processing aspects for


Are you certain that making constexpr overloads of such algorithms like all_of etc is better than addressing the real issue, meaning the lack of first class support for the fundamentals allowing such constructs? Shouldn't these things be considered eventually by expanding <type_traits> to boost::mpl territory?

We already have tuple_element and tuple_size.

According to the wishes of the committee in currently active EWG#30, you cannot defend such an argument. You also validated my own argument in your previous paragraph.


 Seems more like another popular substitute by hasty inspiration than a solution by design.

bool all_of( std::initializer_list< bool > ) is simple and obvious, does that make it “hasty”? I happen to do a lot of metaprogramming, but the function isn’t specific to metaprogramming at all.

Adding a new core language construct for type lists when we already have all the right machinery, and plenty of legacy code already works without, sounds like over-specific optimization for niche uses at the expense of current practice. Impeding progress on unrelated issues for the sake of increasing pressure to adopt a specific direction on one issue is dirty politics.

Nobody cares about the adoption of any feature; correctness is not about adoption but about proving that "you" don't follow "your own" rules. It is about pointing out that certain practices are not based on technical grounds and that everybody plays with different cards while saying they are in the same game. That is at least hypocritical, and adulating the committee over such happenstances is the worst service one can do to C++ as a language.





On 2014–09–03, at 2:29 AM, George Makrydakis <irreq...@gmail.com> wrote:

The issue is not static if alone. The committee does not have a real argument against compile time evaluation. Just politics.

In practice the approaches to metaprogramming are more divergent than to usual programming, for many reasons:

1. It feels different and unfamiliar, so beginners are tempted to throw knowledge out the window and “just hack it.”
2. Metaprogramming is more tractable without side effects, so imperative programming tends to get excluded. This was true even in macro preprocessor days. This is the real reason static_if will never happen.
3. Implementations have various quirks, leading to numerous and divergent superstitious workarounds. I still see new code using typedef char yes[2]; sizeof sfinae_test( blah ) == sizeof (yes). This is just an illustrative example, but distraction and circumlocution have hindered the evolutionary development of C++ best practices.

1) The Stroustrupian beginner argument is inexplicable and inapplicable since such uses of C++ are not for beginners, but are requested by all serious users of C++ to the point of having active EWG issues. Beginners want to "write stuff up fast" and that is why they use languages other than C++ for the most part, with wider library and language feature support. C++ is for people who are supposed to be aware of what they are doing.

2) Template metaprogramming can be functionally characterized as "a pure, non-strict, untyped functional language with pattern matching", see http://bannalia.blogspot.gr/2008/05/functional-characterization-of-c.html for example. In that sense, if you are making an imperative vs functional kind of argument, that is lost to begin with by decades of computer science research you can find in any good library.

3) I agree with the essence of this point very much. The problem is that instead of fixing template metaprogramming to make it easier to work with, we got constexpr metaprogramming (which is about playing with values and not types, unless you clutter the declarative parameter list with decltype enclosures) and - arguably - concepts, which are nothing more than sfinae shorthands instead of the glorious typeclass (see Haskell) character they could have expressed. We would not have a need for distraction and circumlocution anymore.

The funny thing is that the semantics of concepts with variadics depend upon pack expansion rules, so to the dismay of people who do not want this, template metaprogramming techniques can and will be applied either to fix concept lacunae or to get rid of them altogether in meaningful code. Concepts don't have generative features themselves; they are just shorthands used for twisting pattern matching through sfinae hacks for "error reporting". They are the direct consequence of C++'s design adoption of latent typing and its lack of proper bounded polymorphism. You can search the bibliography on how monads in C++ template metaprogramming can be used to produce more meaningful errors during compilation than concepts will ever be able to in their current design (unless in the end they are turned into endofunctors themselves...). But since concepts are the poster child of the EWG, and by birthright are used as the reflex reaction to anything that shows how incomplete they are, we have to put up with them. I am not saying that they are not a good thing; I am saying that they are used as a utopian panacea by certain enthusiastic experts. They are not.

I think I have made my point about where this is going.



“Politics” are inevitable. Good politics involves not only consensus, but the modesty to limit eternal commitments to judgments of absolute certainty. High-level metaprogramming in general, not only C++, is too new to expect consensus or certainty about what is necessary or so harmful as to be useless.


"Consensus" of people not following their own procedural rules is another terminological abuse, especially when no votes are cast and the gist of specific work is redelegated to an internal party while the original one is given a misguiding library implementation. It is that conflict that says "error" if honest, worse if not. The problem is the attitude, not the result. One cares little about any result from people who prove themselves ill-equipped to live up to the standard they claim to be of. Proofs by construction are superior to inexplicable mockeries of consensus procedures.

Inconvenient truths are politically incorrect, but that is the problem of the people who are involved in them. You are implying some sort of indistinction between ethics and politics that has been proven dangerously fallacious since the times of Machiavelli. And then there is the fairytale of the naked king.

Good discussion as always David.

George



David Krauss

Sep 3, 2014, 5:25:12 AM
to std-pr...@isocpp.org

On 2014–09–03, at 4:37 PM, George Makrydakis <irreq...@gmail.com> wrote:

According to the wishes of the committee in currently active EWG#30, you cannot defend such an argument. You also validated my own argument in your previous paragraph.

What argument? I see a lot of discussion about taxonomy. You left a paragraph incomplete.

EWG30 says:

 There are lots of very basic manipulations that are either really hard or impossible to do with argument packs unless you use something that causes a big recursive template instantiation, which is expensive at compile-time and can cause bad error messages. I want to be able to index argument packs with integral constant expressions, "take" or "drop" the first N elements of the pack, etc. 

This does not express a prejudice against using std::tuple as a type-list nor commit to a new core language facility. The immediately linked proposals don’t describe any new core facilities either.

My main argument is simply that std::tuple is a better choice for EWG30-style operations than a dedicated std::packer class, as mentioned in N4115, or something heavier like mpl::vector. This subject perhaps deserves a paper.

Also, I believe that the gain of avoiding wrapping pack expansions as tuple<T...> isn’t worth the cost of a new core language feature. However, this isn’t as concrete and I suspect such a feature will eventually happen anyway. It could be well and good, as long as there are other concomitant gains.

However, this really has no bearing on the proposal in this thread, since as mentioned it’s not really a metaprogramming facility but just a generic facility with as much application in metaprogramming as anywhere else a braced-init-list of Booleans might arise.

Tony V E

Sep 3, 2014, 11:16:09 AM
to std-pr...@isocpp.org
On Tue, Sep 2, 2014 at 7:18 PM, George Makrydakis <irreq...@gmail.com> wrote:

 
On September 2, 2014 11:37:47 PM EEST, Ville Voutilainen <ville.vo...@gmail.com> wrote:


I think what's misguiding is all these unsubstantiated claims about
"social non-arguments",
"sponsored lobbying" and "bogus arguments".

C++ needs volunteers, but it seems that certain people will have problems with this. I am just informing a new potential author on what to expect, given that you tried to motivate another person

<snip>

George


I may be wrong, but I get the impression that your experience with the committee is dominated by one large, unfortunate, experience or "data point".  Before extrapolating that into conclusions about the committee, I would suggest many additional data points are required.

Tony


George Makrydakis

Sep 3, 2014, 1:47:18 PM
to std-pr...@isocpp.org

On 09/03/2014 06:16 PM, Tony V E wrote:
I may be wrong, but I get the impression that your experience with the committee is dominated by one large, unfortunate, experience or "data point".  Before extrapolating that into conclusions about the committee, I would suggest many additional data points are required.

Tony

I will give you the whole story, with all respect to David, since I would not like to hijack the thread into sheer off-topic territory, but you asked me about it here. Here goes.

Initially, after encouragement from Ville, I put together an incomplete draft for accessing parameters within a parameter pack, either individually by ordinal index or over an interval of indexes / sizes describing a pack; it was also possible to have some notation accompanying that pack that would let us detect whether the pack was made out of a pattern of types.

These requirements led to a few interesting effects, boosting template constructs so as to require fewer class template partial specializations / function template overloads (the "vertical pattern" boilerplate) and shorter parameter lists where type sequence patterns / complex constructs avoiding sfinae were possible. The important thing to note is that there is a series of active EWG issues, most of all EWG#30, that could be addressed in a unified manner if the EWG did conclude on a language feature implementation for them.

Ville forwarded to the EWG two N-less drafts (one of them mine) that were not in the pre-meeting mailing list, on two things fundamentally different in their premise but both related to parameter packs. He did invite me without any guarantees, and I thought that it would have been a nice way to get to know some of the brilliant minds behind what makes C++ tick. I did not go there to present a proposal. I did not go there expecting to have to win any argument. I went there with the intention of discussing some preliminary work with people who, by historical right, should know this language better than I.

After arriving early morning Wednesday and while trying to understand what was going on, with the intervention of Ville (whom I thank for the invitation), I am told that at some "unknown point in time after we finish, we can discuss this!". I worked on my presentation slides the same night.

Thursday comes. Another N-less paper related to packs was discussed in a specific time slot, and despite ambiguities that even the EWG missed, it got "encouraged" by a specific, inter- and intra-related block of people. Meanwhile, I had to remind the people involved in this "loose" program three times that if they indeed wanted to discuss some preliminary work, I was there just for that. In the end, I was given a slot a few minutes before the evening break.

Starting the presentation, I specified that this was preliminary work, but that it addressed several issues that currently merit attention. After about 9 minutes of presentation time, with several rows of people sleeping and munching away before I even started discussing my preliminary work, three arguments were raised: "template metaprogramming scares people away", "it would strain the compiler", "let's do it with concepts". All of these are not only non-arguments; by pinballing concepts in, we get away from the picture entirely: they lack generative features, and you cannot use them for accessing sub-packs or individual parameters within a pack.

In their attempt to demonstrate authority while remaining politically correct, they tried to escape a difficult situation by pulling out the card of a "library recommendation", since this is something that would interest "template metaprogrammers" only. Understanding their tiredness (sessions started at 08:00-08:30) and unpreparedness (the drafts were not in the pre-meeting mailing), I announced on the mailing list that I would go forward with a library in order to play a little more with this. I already knew this was undoable without heavy preprocessor and template metaprogramming, but that was not the issue.

Take note that no votes were cast; this is an N-less, incomplete draft and not a proposal, though it could become one at one of the following meetings.

Friday. They start discussing the active issues list and EWG#30 comes up, exactly one of the issues I rapidly covered during the presentation. It is then decided that, after almost 2 years of this issue pending, they should do something about it. And they start looking among themselves for what to do about it. One of the people who said "this would strain the compiler" about my preliminary work, which directly addresses this issue, "expresses interest in drafting" for an issue where my work offers a fit.

Unsurprisingly, that other N-less draft, which got its early slot and was obviously supported by a specific block of people despite its problems, gets "encouragement" to work on this and "merge" towards the goal of addressing EWG#30. When I talked to the people involved, it was visible that they were particularly strained to answer why my very related work was ignored, since they then went about disproving the non-arguments they had made the previous day.

Saturday. I go to the EWG and ask for a hearing once the issues list becomes active. Ville is presiding and, because he has understood what happened, he concedes. I get the same non-arguments again, though delivered with even more difficulty on their part because of EWG#30, first and foremost. One person tries to become arbiter of the situation and asks "what about presenting the complete paper?" as a resolution of the issue. Yet none of this is written in the transcriptions of the early Saturday meeting. Unsurprising.

The people who were easily dismissive of my preliminary work, without real technical arguments, are the ones now trying to implement it. Draw your own conclusions. I am simply continuing this work on the premises I started with, at https://github.com/irrequietus/atpp, should anybody care. As for Ville, I think he is an honest man. In fact, he will "eventually amend" EWG#30 to reflect these things, as he said in this mailing list. I will be given my N-number once I ask for it (upon completion of my work).

My disgust towards this C++ EWG practice is justified, all the more so given that people like Herb Sutter voiced their concern that the irregularity of the procedure with which such preliminary work was discussed was problematic. The aftermath of my experience, I hope, is that from now on groups like the EWG will be more respectful of their own procedures, safeguarding newcoming authors from the impression that, since they are not backed by coworkers at great, great companies, "these things happen". I still think that any serious C++ programmer should visit one of these meetings at least once; disillusionment about certain people is important for the community at large.

It is not that they are unaware that this is a problem they created; it is that they don't know how to deal with it other than by waiting for the little kid to stop saying that the king is naked.

George



George Makrydakis

unread,
Sep 3, 2014, 2:02:28 PM9/3/14
to std-pr...@isocpp.org

On 09/03/2014 12:24 PM, David Krauss wrote:
What argument? I see a lot of discussion about taxonomy. You left a paragraph incomplete.


Ignoring research done over the last 4-5 decades of computer science by claiming it is "taxonomy" is the practical argument certain C++ users deploy when they are unaware of the meaning of said research. I know you are better than this, David, I seriously do. Read the entire post again; I will not repeat what I already said.


EWG30 says:

 There are lots of very basic manipulations that are either really hard or impossible to do with argument packs unless you use something that causes a big recursive template instantiation, which is expensive at compile-time and can cause bad error messages. I want to be able to index argument packs with integral constant expressions, "take" or "drop" the first N elements of the pack, etc. 

This does not express a prejudice against using std::tuple as a type-list nor commit to a new core language facility. The immediately linked proposals don’t describe any new core facilities either.

My main argument is simply that std::tuple is a better choice for EWG30-style operations than a dedicated std::packer class, as mentioned in N4115, or something heavier like mpl::vector. This subject perhaps deserves a paper.

That is what people are already using in order to avoid including boost::mpl::vector, up to a point, and the result is an ugly hack. Any library-based solution like this would only increase the number of recursive template instantiations and function template overloads needed for simple parameter access. This issue is about removing the need for such solutions, advancing pack processing to a whole new level.
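For concreteness, the recursive workaround being criticized here (one template instantiation peeled off per parameter just to index a pack) can be sketched as follows; this is a generic illustration, not any particular library's code, and `nth` is a hypothetical name:

```cpp
#include <cstddef>
#include <type_traits>

// Indexing a pack today: peel one type per instantiation, so reaching
// element I costs I recursive instantiations (the EWG#30 complaint).
template<std::size_t I, typename T, typename... Ts>
struct nth { using type = typename nth<I - 1, Ts...>::type; };

// Base case: index 0 selects the head of the pack.
template<typename T, typename... Ts>
struct nth<0, T, Ts...> { using type = T; };

static_assert(std::is_same<nth<2, char, int, double>::type, double>::value,
              "indexing the pack reaches the third element");
```

Every access to a deeper element instantiates a fresh chain of `nth` specializations, which is exactly the compile-time cost EWG#30 wants a core facility to avoid.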



Also, I believe that the gain of avoiding wrapping pack expansions as tuple<T...> isn’t worth the cost of a new core language feature. However, this isn’t as concrete and I suspect such a feature will eventually happen anyway. It could be well and good, as long as there are other concomitant gains.

However, this really has no bearing on the proposal in this thread, since as mentioned it’s not really a metaprogramming facility but just a generic facility with as much application in metaprogramming as anywhere else a braced-init-list of Booleans might arise.
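A sketch of what such a constexpr overload might look like, assuming C++14 relaxed constexpr; the name mirrors std::all_of, but this is not the standard signature (which takes iterators and a predicate):

```cpp
#include <initializer_list>

// Hypothetical constexpr overload taking a braced-init-list of booleans,
// in the spirit of the initializer_list overloads of std::min/std::max.
constexpr bool all_of(std::initializer_list<bool> values) {
    for (bool v : values)
        if (!v)
            return false;   // one false element falsifies the conjunction
    return true;            // vacuously true for an empty list
}

static_assert(all_of({true, true, true}), "all elements true");
static_assert(!all_of({true, false}), "a false element is detected");
```

Since std::initializer_list gained constexpr begin()/end() in C++14, such an overload is usable directly in constant expressions, including with a pack expansion like `all_of({ std::is_same<pack_a, pack_b>::value ... })`.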


Resolving EWG#30 as a language feature makes the template parameter list (including any packs declared in a deducible context within it) a fundamental computational construct that has no need of the constexpr overloads, since non-type parameter packs can expand into a list of values of the same type. Because of the rules of pack expansion, implementing catamorphisms, anamorphisms and structure-preserving transformations becomes quite easy.

At that point it is more a question of style, since type and template packs have the advantage of expanding to lists of type and template template parameters, not just lists of values.

Variadics are in functional territory; stop thinking imperatively, because that does not apply there.
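A minimal sketch of such a functional-style reduction over a pack, assuming only C++11 constexpr (recursive, no loops, every value immutable; `all_of_pack` is a hypothetical name):

```cpp
// Base case: an empty pack is vacuously all-true.
constexpr bool all_of_pack() { return true; }

// Recursive case: conjunction of the head with the fold of the tail.
// This is a catamorphism over the pack, expressed through expansion.
template<typename... Rest>
constexpr bool all_of_pack(bool head, Rest... tail) {
    return head && all_of_pack(tail...);
}

static_assert(all_of_pack(), "empty pack");
static_assert(all_of_pack(true, true, true), "all true");
static_assert(!all_of_pack(true, false, true), "one false");
```

No mutable state appears anywhere: each recursive call consumes immutable values and produces a new one, which is the declarative style being argued for.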

Other than that, I think that constexpr metaprogramming is not a bad idea for certain things.



Gabriel Dos Reis

unread,
Sep 4, 2014, 6:02:18 PM9/4/14
to std-pr...@isocpp.org
Richard Smith <ric...@metafoo.co.uk> writes:

[...]

| 1) "fold": the AST is rewritten as constant expressions are
| simplified. In some implementations this happens as you parse, in
| others it happens as a separate step. So when you build a + operation
| whose operands are 1 and 1, you end up with an expression "2" and no
| evidence that you ever had a '+'. This also means you can use
| essentially the same code to perform various kinds of optimization.
| 2) Real evaluation: the AST represents the code as written, and a
| separate process walks it and produces a symbolic value from it.

you are making the implicit assumption that all C++ compilers start
doing constexpr from ASTs :-)

Agreed, there is more variation in C++ implementations than meets the eye.

-- Gaby

Gabriel Dos Reis

unread,
Sep 4, 2014, 6:10:41 PM9/4/14
to std-pr...@isocpp.org
Rein Halbersma <rhalb...@gmail.com> writes:

[...]

| The simplest case for dynamic allocation -- support non-placement
| new and delete only, do not allow the end result of a constant
| expression evaluation to refer to dynamically-allocated memory,
| and do not call a replacement global allocation/deallocation
| function -- is completely straightforward in Clang's
| implementation. I can't speak for the complexity that would be
| required in other implementations.
|
|
| Could you give an example of what type of code would be possible /
| hard?

C++ -- following the lead of C -- has the notion of phase distinction
where the elaboration phase of the compiler happens before the linker
phase which happens before the loading phase, which happens before
runtime. You can't assume the actors from the previous phase are
present in the next phase -- in particular the elaborator is long gone
by the time the loader is doing its thing. Consequently, all the
"constants" from the elaboration phase through load-time are carefully
described symbolically -- this is even more glaring when you look at the
sort of (restricted) constant expressions you can use as template
arguments (e.g. making up an address constant out of an integer constant).
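The template-argument restriction mentioned above can be illustrated briefly: an address constant names an entity the linker can describe symbolically across phases, while a pointer fabricated from an integer names nothing that survives them (a sketch; the names are illustrative):

```cpp
// A non-type template parameter of pointer type must be bound to an
// address constant: something describable symbolically across phases.
template<int* P> struct at_address {};

int global_object;  // has a linker-visible address

at_address<&global_object> ok;  // OK: &global_object is an address constant

// Ill-formed: an integer reinterpreted as a pointer is not an address
// constant, so it cannot cross the elaboration/link/load phase boundary.
// at_address<reinterpret_cast<int*>(4)> bad;
```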

If you add dynamic allocation, you need to describe what that means
(e.g. the form of what a value is) through that chain. As far as I know,
nobody has presented a thoughtful, coherent analysis of the issue and a
solution.

-- Gaby

George Makrydakis

unread,
Sep 4, 2014, 8:18:12 PM9/4/14
to std-pr...@isocpp.org, Gabriel Dos Reis

On September 5, 2014 1:10:40 AM EEST, Gabriel Dos Reis <g...@axiomatics.org> wrote:
>Rein Halbersma <rhalb...@gmail.com> writes:
>
>[...]
>
>|     The simplest case for dynamic allocation -- support non-placement
>|     new and delete only, do not allow the end result of a constant
>|     expression evaluation to refer to dynamically-allocated memory,
>|     and do not call a replacement global allocation/deallocation
>|     function -- is completely straightforward in Clang's
>|     implementation. I can't speak for the complexity that would be
>|     required in other implementations.
>|
>|
>| Could you give an example of what type of code would be possible /
>| hard?
>

...

>If you add dynamic allocation, you need to describe what that means
>(e.g. the form of what is a value) through that chain.  As far I know,
>nobody has presented a thoughtful coherent analysis of the issue and a
>solution.
>

That "value" in that case has to be immutable, precisely because, as you said, the actors of the previous phases cannot be assumed present in the next one.
The main problem with the constexpr/runtime barrier is that people reason in terms of variables and "their values", instead of (recursive) constexpr expressions dealing with immutable values consumed during evaluation.

Thinking about this declarative behavior of constexpr in terms of imperative runtime semantics such as "dynamic allocation" is just wrong and is a source of confusion. It is not about form, but about immutability.
