You say that definition checking would require effort through to "next decade", so much work that "we'll never get any sort of concepts". Well, the only reason we would "never" get the feature is if everyone working on it decided it wasn't worth the effort. After all, constraints are apparently very much worth a great deal of effort, considering that we're on our second attempt. But if adding checking on top requires so much more effort that even getting the rest of constraints doesn't make it worth doing, then... what's the chance that people will ever see checking alone as worth the effort?
Matt is very concerned that, if the Concepts TS goes into C++17, then in 2018 SG8 will dissolve and most of the committee will be decidedly uninterested in doing anything more with it. And some of the responses in this thread seem to bear this out. Andrew Sutton, one of the architects of the Concepts TS, has already made it clear in his post that checking is optional in his mind, not something he'll miss. That doesn't exactly sound like he's champing at the bit to start working on it.
I do not find that Matt's fear is unreasonable.
--
You received this message because you are subscribed to the Google Groups "SG8 - Concepts" group.
To unsubscribe from this group and stop receiving emails from it, send an email to concepts+u...@isocpp.org.
To post to this group, send email to conc...@isocpp.org.
Visit this group at https://groups.google.com/a/isocpp.org/group/concepts/.
I'm not Andrew, but there is a path forward. Gaby has done definition
checking based on predicates in his model language Liz. It works.
There is a snag, though, as several people have pointed out: if you do, you
can't do data collection, tracing, debugging using debug statements,
telemetry, etc. without interface statements.
It is important, but not essential, and not without its own problems. Still,
it would be a huge benefit in its own right.
Matt's paper makes a lot of assumptions about the knowledge and
opinions of the people who want concepts now, based on the current model.
Most seem flat wrong. He seems to have missed the last 5 years of progress
in the standards committee, or maybe 10.
Also, a lot of generic code is correct despite this concern, but will
become incorrect if you suddenly enable definition checking on it.
I find it non-obvious and debatable whether concepts should prevent
all errors during instantiation. And, remember,
1) The Concepts TS allows me to write constrained templates that are not
definition-checked, and we might get definition checking later
2) C++0x concepts don't allow me to write constrained templates that are
not definition-checked. There was never even any plan to allow that.
Look - if you expect that every constrained template that is currently
constrained with traits, and/or later constrained with Concepts TS concepts,
will become definition-checked by "turning a switch", you should stop
expecting that. That will not happen. It's completely infeasible.
I do expect that definition checking must be explicitly asked for in a
constrained template.
You may consider that a drastic change.
> You can use something like late_check or a similar kind of block-level
> mechanism. I do not personally see that as a serious burden. Instead of just
> writing a debug statement, you wrap it in the equivalent of a late_check-ed
> call/block.
That means I would need to make drastic changes. Those changes are not
backwards-compatible, whereas switching from traits to not-definition-checked
constraints can be, because I don't need to touch the definitions. I can do
traits/concepts conditionally in interfaces.
A few comments:
First, please always remember the principle that it's important to prioritize callers over callees, because (a) callers outnumber callees, usually by orders of magnitude, and (b) callees are much more scoped and contained, including that they can much more readily be unit-tested using current techniques.
In this context, that means it is far more important to check callers (template uses) than callees (template definitions). Of course, if we can get both we should do that; but if we can only get one, or get one first, then we absolutely must prioritize callers over callees. This is a fundamental principle.
Wait, no: C++0x concepts failed to solve any problem, because C++0x concepts Did Not Work in the real world.
In particular, C++0x concepts were never shown to be implementable with sufficient performance to be usable in a commercial compiler and user project. The prototype ConceptGCC implementation never overcame a multi-factor (IIRC, 6x) slowdown in recompiling existing code against a concept-ized STL, which is way beyond the "unadoptable" line; that would need to come closer to 20% to even be considered again on performance grounds alone.
And there were other serious problems that were never addressed, such as that C++0x concepts bifurcated the template system into constrained and unconstrained templates, which was a major contributor to the fact that after several years we still could never figure out how to actually use them correctly.
Note that the part of C++0x concepts I think Matt most wants is separate checking, which relied on concept maps, and that is exactly the part of the design that caused the excessive overheads and was never shown to be realistically implementable.
On 20 February 2016 at 04:13, Matt Calabrese
<metaprogram...@gmail.com> wrote:
> On Friday, February 19, 2016 at 5:01:11 PM UTC-8, Ville Voutilainen wrote:
>>
>> Also, a lot of generic code is correct despite this concern, but will
>> become incorrect if you suddenly enable definition checking on it.
>
>
> We seem to have very different definitions of correct. If you enable
> definition checking and it then fails, then the code was certainly *not*
> correct. Either the developer specified incorrect constraints, or the
Such code exists today, it compiles and runs and works as required.
I'm willing to let you call that code incorrect by your definition, but I have
no trouble disagreeing with that view.
Or the developer very deliberately used more functionality than the
"properly-specified constraints", for a debatable definition of "properly".
>> Look - if you expect that every constrained template that is currently
>> constrained
>> with traits, and/or later constrained with Concepts TS -concepts will
>> become definition-checked
>> by "turning a switch", you should stop expecting that. That will not
>> happen. It's
>> completely infeasible.
> I agree that it is not feasible for SFINAE-constrained templates, which is
> one of the reasons why we need language-level concepts. I disagree that it
That's not the point. The point is about whether the definition of a constrained
template needs to be changed when the constraint mechanism is updated
from legacy mechanisms to concepts.
> is the case that we should not expect a language-level constraint facility
> to be able to properly check the definition of a constrained template,
Oh, you can certainly expect it to be able to "properly" check the definition,
and you can also expect it to either turn well-formed correct code into
ill-formed code, or require changing the definition.
>> I do expect that definition checking must be explicitly asked for in a
>> constrained template.
>> You may consider that a drastic change.
> Even if you explicitly ask for it, it is unlikely that the current
> specification of constraints would be sufficient to provide proper checking.
Meaning what? The current specification of constraints, or the current
specification of a constrained function, which is lacking any lookup changes?
> I'm not sure I follow -- how would writing "late_check" be a drastic change?
> You're already re-specifying the constraints -- is a single additional
> keyword really a problem? That seems like an incredible stretch. Or am I
> misunderstanding your point here?
It requires changing the definition. That *IS* a drastic change. Re-specifying
the constraints with the Concepts TS concepts doesn't require touching
the definition.
The concern here is that the current design may very well prevent us from making such incremental progress with language-level concepts. I'd rather not do that, and given that we don't even have a conceptified standard library, it's just one more reason on top of the others for why we should hold off.
There is nothing wrong with Concepts remaining a TS for now and not being a part of C++17. It's not as though it would be dead if it doesn't immediately make it into the next standard. Please, let's just not risk seriously messing this up because we want it directly in the language a couple of years early.
On 20 February 2016 at 05:30, Matt Calabrese
<metaprogram...@gmail.com> wrote:
>> >> Also, a lot of generic code is correct despite of this concern, but
>> >> will
>> >> become incorrect if you suddenly enable definition checking on it.
>> > We seem to have very different definitions of correct. If you enable
>> > definition checking and it then fails, then the code was certainly *not*
>> > correct. Either the developer specified incorrect constraints, or the
>> Such code exists today, it compiles and runs and works as required.
>> I'm willing to let you call that code incorrect by your definition, but I
>> have
>> no trouble disagreeing with that view.
> So if the standard specified constraints that didn't match what was actually
> necessary for implementation, you don't think this would be considered a
> defect in the standard? Or similarly, if a standard library implementation
I wasn't talking about standard library code, for what it's worth.
> depended on functionality that wasn't a part of the specified constraints,
> you wouldn't consider this an invalid implementation? I understand that many
That very much depends on whether the code can produce a hard
error(*). In addition
to that, there are cases where the standard allows implementation freedom for
whether an implementation diagnoses certain requirements violations. So
no, I wouldn't necessarily consider that an invalid implementation.
(*) A violation of "proper constraints" doesn't necessarily produce a
hard error.
> users may not encounter the issues and that the issues can be very subtle,
> but it is precisely that subtlety that makes the checking so important, in
> my opinion. The subtlest issues are the ones that require the most aid at
> identifying.
On the other hand, bugs that never manifest do not need to be fixed.
Except if definition checking is in effect, in which case they have to be.
Which is why definition checking cannot be mandatory.
>> Or the developer very deliberately used more functionality than the
>> "properly-specified
>> constraints", for debatable definition of "properly".
> ... I don't understand how you can even state that this is debatable. If the
> developer depends on functionality that is not a part of the constraints,
> then those constraints are under-specified. If you are referring to things
I can easily find cases where that's not true. A call chain from a constrained
template to an unconstrained template to a constrained template is such
a case, if the unconstrained template doesn't violate the constraints on
the end points, and those constraints are compatible.
It's not a "choice to acknowledge". I can prove that the code I have
is correct as written
but incorrect under definition checking.
>> Meaning what? The current specification of constraints, or the current
>> specification of a constrained function, which is lacking any lookup
>> changes?
> Both would likely need to be changed. On the constraint side, you'd probably
"Likely"? You seem to use that word a lot.
> need the constraints to effectively be isomorphic with pseudo-signatures for
> them to be useful, including the semantics of how things like return value
I do not know what that means. You continue to use very vague descriptions.
Sorry. In a simplified example, consider a pseudo-signature that "returns" T, keeping in mind that the function it matches doesn't necessarily return exactly T, but rather something convertible to T. Note that this is a common kind of constraint in practice. Now, in your constrained template, you make a call to some function "foo" that takes a T (maybe a constrained function or not, depending on whether T is dependent -- it doesn't really matter).
In the current world, one can naively, though not obviously so, invoke "foo" directly with the result of the call to the associated function... only this is not strictly correct. This can call a different function, be ambiguous, or cause other kinds of issues. As a byproduct of definition checking (or more precisely, in the C++0x world at least, by going through a map), this code definitively calls the foo that takes T, and the writer of the generic code does not have to be a language lawyer who understands the subtleties in order to do it properly. It works the way a reasonable programmer would expect it to.
On Saturday, 20 February 2016 at 07:34:50 UTC+1, Matt Calabrese wrote:
On the other hand, allowing the code to pick another overload of foo(), one that takes exactly what was returned from your function rather than precisely T, is the thing that allows certain desirable optimizations, like the use of expression templates in numeric libraries.
Other Matt here... My interpretation of what he says isn't quite that we should delay concepts until we have definition-side checking, but that we should delay concepts until we are confident that there is a workable upgrade path to definition side checking.
If you want the behavior as in your example then the concept you are using wouldn't be a concept operating on T with operators specified to return T. It would be a concept operating on T that has operators that return associated types, where those associated types are convertible to T. Those types model a less-refined version of the concept (assuming expression templates, where the associated types likely wouldn't provide most of the mutating operations that T provides). Note that a concept such as this would also match a type where the operators directly return T such as if T were a double or int.
What happens in the above code where T is something like "int"? decltype(d) would be int. bar is then called with that int. Everything is fine. Now, similarly, what happens if T is a type whose operators yield expression templates? Since the Numeric concept is the simpler one, this code still works too, even if the result of something like (a+b)+c might normally produce an object that holds on to a temporary operand by reference. This is because the concept specifies that the associated operator+ yields T. Sure, it's not optimal, because the concept used was not a close match; however, because decltype(d) ends up being T, the code is safe and works. This is a good thing: if this did not happen and d ended up as something like an expression template that held on to operands by reference, we'd have dangling references and a subtle run-time bug that might not even be caught at compile time.
In practice, if you write generic code that you expect to work both with expression-templates and with basic types, that generic code has to be aware of this, otherwise the code just might not work or may work purely out of coincidence or may appear to work but has bugs that won't show up until run-time if they are even diagnosed at all. This is not an uncommon kind of problem when working with expression template libraries today -- people pass instances of their types and expressions along to templates that were written without expression templates in mind and the mistake is not always obvious. If, in a world with checking, the developer of the generic code used the simple kind of Numeric concept here, the code would be correct.
Hi Matt,
As I recall there were other problems with the old proposal:
A. ADL-based customization points
B. Type checking generic lambda bodies
It seems to me that if we need to return to C++0x concepts and complete that design,
then it could take a decade before anything happens, while risking that we can't complete the design satisfactorily or implement it efficiently enough.
Can we afford that?
It was quite an eye opener when I compared the specification of the old concepts to the new. The old specification was huge. That level of complexity is just scary.
There is another route to getting definition checking, though. Instead of wiring definition checking into the compiler, it could be a separate tool, perhaps built using Clang. That way the definition checking could be done once in a while, or during nightly builds, without affecting compilation or debug performance for the developer.
Of course, it still requires that definition checking is possible in some form (we can probably live with some obscure cases not being perfectly diagnosed).
But how do we know these are really obscure cases without more experience and perhaps a conceptified standard library?
On 22 February 2016 at 14:48, Paul Fultz II <pful...@gmail.com> wrote:
> But how do we know these are really obscure cases without more experience and perhaps a conceptified standard library?
I must have missed that paper in the current mailing. Where can I find this conceptified standard library proposal?
On Monday, February 22, 2016 at 3:01:55 PM UTC-6, Nevin Liber wrote:
>> But how do we know these are really obscure cases without more experience and perhaps a conceptified standard library?
> I must have missed that paper in the current mailing. Where can I find this conceptified standard library proposal?
Well here:
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2015/n4382.pdf
But I don't know if that's the latest.
On Monday, February 22, 2016 at 1:37:27 PM UTC-6, Thorsten Ottosen wrote:
> Hi Matt,
> As I recall there were other problems with the old proposal:
> A. ADL-based customization points
> B. Type checking generic lambda bodies
I believe Larisse Voufo has done research with "weak" binding that helps solve those issues.
> It seems to me that if we need to return to C++0x concepts and complete that design,
The desire to have semantic-based concepts doesn't mean we have to return to C++0x concepts, nor does a design for semantic-based concepts need to be complete for concepts to be accepted, just that there is a possible path forward (ideally without deprecating `concept bool`).
> then it could take a decade before anything happens, while risking that we can't complete the design satisfactorily or implement it efficiently enough.
> Can we afford that?
> It was quite an eye opener when I compared the specification of the old concepts to the new. The old specification was huge. That level of complexity is just scary.
However, the scope of C++0x concepts was much larger than that of the current Concepts TS. Furthermore, Doug Gregor had proposed a simplified form of semantic-based concepts for C++ that simplified archetype instantiations and removed concept maps, refinements, axioms, and explicit concepts. It was actually much simpler than the current Concepts TS:
https://docs.google.com/viewer?a=v&pid=forums&srcid=MDIyMDc3NjUwMTczOTM0Mjk3NjABMDY0MzIwNzAxMTY2ODcxNjg5ODABTXN4MjR4YjZ5Q01KATAuMQFpc29jcHAub3JnAXYy
> There is another route to getting definition checking, though. Instead of wiring definition checking into the compiler, it could be a separate tool, perhaps built using Clang. That way the definition checking could be done once in a while, or during nightly builds, without affecting compilation or debug performance for the developer.
> Of course, it still requires that definition checking is possible in some form (we can probably live with some obscure cases not being perfectly diagnosed).
But how do we know these are really obscure cases without more experience and perhaps a conceptified standard library? A little more time with a conceptified STL would help iron out these issues and show that definition checking is good enough.
Just my two cents
Kind regards
Thorsten
> There is another route to getting definition checking, though. Instead of wiring definition checking into the compiler, it could be a separate tool, perhaps built using Clang. That way the definition checking could be done once in a while, or during nightly builds, without affecting compilation or debug performance for the developer.
> Of course, it still requires that definition checking is possible in some form (we can probably live with some obscure cases not being perfectly diagnosed).
> But how do we know these are really obscure cases without more experience and perhaps a conceptified standard library? A little more time with a conceptified STL would help iron out these issues and show that definition checking is good enough.
I agree it would be good to have such an STL, if we don't have one already.
I remember a meeting more than 10 years ago. The debate was fierce, and circled around whether the two proposals at that time had certain theoretical properties, specifically about definition checking. I wouldn't say it ended in a fist fight, but it was close ;) I believe there is a difference, but I can't imagine that the difference can be very large. That just seems implausible... If the difference is large, it must be easy to show code examples of common code that can't be definition-checked.
Please note that - as we have said repeatedly - we know how to do definition checking based on predicates. Gaby Dos Reis did it in his C++-style model language Liz several years ago: Gabriel Dos Reis, "A System for Axiomatic Programming", in Proceedings of the 2012 Conferences on Intelligent Computer Mathematics, Bremen, Germany, July 2012, Springer. Liz is still in academic/research use. This again harks back to the ideas documented in our POPL paper: Gabriel Dos Reis and Bjarne Stroustrup, "Specifying C++ Concepts", in International Symposium on Principles of Programming Languages (POPL 2006), Charleston (South Carolina), USA, January 2006. It's not all that hard or expensive in compile time.
For some people the question is not whether we can check definitions against concepts specified in the declaration, it's whether we want to.
There is a lot that happens in the transition from `g(x)` to `f`. There is copying, moving, passing by reference, implicit and explicit conversions. And how deeply should the compiler check through conversions and assignments before declaring the definition correct?
Then if we want to virtualize the concepts, it becomes even more complicated, because a function table needs to be generated for all
For some people the question is not whether we can check definitions against concepts specified in the declaration, it's whether we want to.
If we want user-friendly error messages, then we want to check definitions. Not checking weakens the type safety of the usage.
From: Vicente J. Botet Escriba | Sent: Wednesday, February 24, 2016 6:50 PM | Subject: Re: [concepts] Is there really no path forward for definition checking of templates?
Just my 2cts
Vicente
On 2/24/2016 6:50 PM, Vicente J. Botet Escriba wrote:
On 24/02/2016 23:02, Thorsten Ottosen wrote:
For some people the question is not whether we can check definitions against concepts specified in the declaration, it's whether we want to.
If we want user-friendly error messages, then we want to check definitions. Not checking weakens the type safety of the usage.
As Bjarne said, the cost is high for that. We can't insert debug statements. The world of templates will be divided into two separate worlds.
That's why I suggested that definition checking should be a tool, not wired into the compiler.
If we consider that definition checking is not good for some cases, maybe the language proposal should account for that, and let the user add their traces and disable definition checking in some way: by block, statement, expression....
Nevertheless, I believe that we would want definition checking most of the time, but I could change my point of view if checking slows compilation down a lot.
Don't change your point of view quite yet. We know we can do checking. I actually think that it wouldn't be that expensive, but we need more experience. Some people think in terms of opt-in, some in terms of opt-out, and some in terms of defining specific loopholes. It's next on some to-do lists. The most important point is that we provide concepts as a general, flexible, precise, easy-to-use specification tool. Please note that what I say is nothing new.
On 25/02/2016 01:28, Bjarne Stroustrup wrote:
My point was that I want definition-checking defined by the language, not by an external tool as Thorsten was suggesting.
I said *most of the time*, as I could have said "in general we want definition checking". If definition checking is expensive, we will want it disabled by default instead of enabled by default. If it is not expensive, we will need to opt out in some rare and specific cases.
On Feb 27, 2016, at 2:30 PM, Bjarne Stroustrup <bja...@stroustrup.com> wrote:
I agree. *If* I want definition checking, I want the compiler to do it.
I said *most of the time*, as I could have said "in general we want definition checking". If definition checking is expensive, we will want it disabled by default instead of enabled by default. If it is not expensive, we will need to opt out in some rare and specific cases.
I don't think that we yet know exactly what "we" want. That's a discussion for next year or so, and a topic for experimentation.