Explicit and implicit constraints


Nikolay Ivchenkov

Mar 11, 2013, 2:20:14 PM
to SG8 - Concepts
What is the difference between "constrained template" and
"unconstrained template"?

The complete set of actual constraints for a particular function
template declaration implicitly includes the requirement that the
template parameter list and the function type shall be well-formed.

// #1
template <class Range>
auto begin(Range &range) ->
decltype(range.begin())
{ return range.begin(); }

Is this function template "constrained"? It doesn't have any
requires-clause, but it has the implicit constraint (imposed by
decltype(range.begin())): range.begin() shall be well-formed (when it
is ill-formed, the template specialization will be excluded from the
set of candidate functions).

namespace impl
{
    void begin();

    template <class Range>
        auto adl_begin(Range &range) ->
            decltype(begin(range))
            { return begin(range); }

    template <class T>
        constexpr bool has_adl_begin()
        {
            // whether the expression (adl_begin)(std::declval<T &>())
            // is well-formed
            return __is_valid_expr((adl_begin)(std::declval<T &>()));
        }
}

// #2
template <class Range>
auto begin(Range &range) ->
decltype(impl::adl_begin(range))
requires(impl::has_adl_begin<Range>())
{ return impl::adl_begin(range); }

Here the underlying implicit constraint for #2 is "begin(range) shall
be well-formed, where 'begin' is looked up with ADL" (it is imposed by
decltype(impl::adl_begin(range))). In addition, this template has the
explicit constraint "has_adl_begin<Range>() shall be true", which
essentially represents the same requirement as the implicit
constraint.

If we compare the complete sets of constraints for #1 and #2, neither
of these templates can be considered more constrained than the other,
because neither set is a subset of the other.

Alternatively, the meaning of "constrained template" can be "template
with a requires-clause". Then #1 is an unconstrained template, while
#2 is a constrained template; according to the mental model

"An unconstrained template is obviously less constrained than a
constrained template and is only selected when no other candidates
are viable" (see §2.1.2),

#1 is less constrained than #2 and, therefore, #2 will be preferred
(over #1) whenever it is viable. IMO, such an outcome doesn't seem
reasonable.

Andrew Sutton

Mar 11, 2013, 3:42:15 PM
to conc...@isocpp.org
> The complete set of actual constraints for a particular function
> template declaration implicitly includes the requirement that the
> template parameter list and the function type shall be well-formed.
>
> // #1
> template <class Range>
> auto begin(Range &range) ->
> decltype(range.begin())
> { return range.begin(); }
>
> Is this function template "constrained"? It doesn't have any
> requires-clause, but it has the implicit constraint (imposed by
> decltype(range.begin())): range.begin() shall be well-formed (when it
> is ill-formed, the template specialization will be excluded from the
> set of candidate functions).

I do not consider this template to be constrained. A constraint is a
predicate on template arguments that determines whether a declaration
can be instantiated. This is just a declaration with a dependent
result type.

I actually considered going down this road... using dependent types in
the declaration as implicit constraints. I decided against it because
it leads to the idea that constraints should be automatically deduced,
and I think that leads to greater comfort with ad hoc and
underspecified templates. I see that as a major barrier to concepts as
we move forward.

Constraints are part of the interface of the declaration. They should
be written explicitly and concisely, and not inferred from the syntax
of an implementation.

Andrew

Bjarne Stroustrup

Mar 11, 2013, 3:54:59 PM
to conc...@isocpp.org
On 3/11/2013 2:42 PM, Andrew Sutton wrote:

Stronger: some of us consider it a very bad idea to deduce the
interface of a set of functions from their implementation. Doing so
means that the interface changes when the implementation does. That
seriously constrains implementers and leaves users with irregular,
implementation-oriented "concepts".

Nikolay Ivchenkov

Mar 12, 2013, 11:54:50 AM
to conc...@isocpp.org
On Monday, March 11, 2013 11:42:15 PM UTC+4, Andrew Sutton wrote:
> I do not consider this template to be constrained. A constraint is a
> predicate on template arguments that determines whether a
> declaration can be instantiated. This is just a declaration with a
> dependent result type.

In other words, a constraint is a formal entity that, in general, is
not supposed to reflect the actual set of requirements.


> I actually considered going down this road... using dependent types
> in the declaration as implicit constraints. I decided against it
> because it leads to the idea that constraints should be
> automatically deduced, and I think that leads to greater comfort
> with ad hoc and underspecified templates.

I saw similar doubts about return types and noexcept-specifications.
What is the outcome? Currently if I want to write a simple adaptor, I
have to either mention the same expression 3 times:

template <class F>
    class function_adaptor
{
public:
    function_adaptor(F const &f) : m_f(f) {}

    template <class... Params>
        auto operator ()(Params &&... params) const
            noexcept(noexcept(
                std::declval<F &>().call(FORWARD(params)...)))
            -> decltype(std::declval<F &>().call(FORWARD(params)...))
            { return m_f.call(FORWARD(params)...); }
private:
    F m_f;
};

or actively use the preprocessor in order to reduce redundancy. With
concepts the same expression has to be mentioned... 4 times?
(return statement + return type + noexcept-specification + concept
definition). And in the absence of a portable is_well_formed operator
(which is not proposed) we will have to use SFINAE tricks in order to
define a concept, such as "expression

    std::declval<F &>().call(std::declval<Params>()...)

shall be well-formed"?


> Constraints are part of the interface of the declaration. They
> should be written explicitly and concisely, and not inferred from
> the syntax of an implementation.

There are function templates whose semantics are defined in the form
"the effect is equivalent to calling ...", and for which there is
only one reasonable implementation. There is no point in treating
such an implementation as something that must be hidden.

Andrew Sutton

Mar 13, 2013, 11:31:10 AM
to conc...@isocpp.org
>> I do not consider this template to be constrained. A constraint is a
>> predicate on template arguments that determines whether a
>> declaration can be instantiated. This is just a declaration with a
>> dependent result type.
>
> In other words, a constraint is a formal entity that, in general,
> is not supposed to reflect the actual set of requirements.

I'm not quite sure what you mean by this.

A constraint specifies an actual set of requirements that are actually
checked prior to instantiation. If you write a declaration or
definition that uses syntax outside of that specification, then you
invite the possibilities for late-caught type errors.

SFINAE tricks may work in some cases, but relying on them as a general
mechanism has proven difficult.


> I saw similar doubts about return types and noexcept-specifications.
> What is the outcome? Currently if I want to write a simple adaptor, I
> have to either mention the same expression 3 times:
>
> template <class F>
>     class function_adaptor
> {
> public:
>     function_adaptor(F const &f) : m_f(f) {}
>
>     template <class... Params>
>         auto operator ()(Params &&... params) const
>             noexcept(noexcept(
>                 std::declval<F &>().call(FORWARD(params)...)))
>             -> decltype(std::declval<F &>().call(FORWARD(params)...))
>             { return m_f.call(FORWARD(params)...); }
> private:
>     F m_f;
> };
>
> or actively use the preprocessor in order to reduce redundancy. With
> concepts the same expression has to be mentioned... 4 times?
> (return statement + return type + noexcept-specification + concept
> definition). And in the absence of a portable is_well_formed operator
> (which is not proposed) we will have to use SFINAE tricks in order to
> define a concept, such as "expression
>
> std::declval<F &>().call(std::declval<Params>()...)
>
> shall be well-formed"?

I believe that C++14 will include a fix for result type deduction, and
I agree that noexcept can lead to inelegant interfaces. I suspect that
a more reasonable definition of constraints on this operator might be:

template<typename... Params>
  requires Noexcept_function<F, Params...>()
auto operator()(Params&&... params) const
{ return m_f.call(forward<Params>(params)...); }

Or something similar.


> There are function templates whose semantics are defined in the
> form "the effect is equivalent to calling ...", and for which there
> is only one reasonable implementation. There is no point in
> treating such an implementation as something that must be hidden.

This is not a convincing argument for conflating an interface with
its implementation.

Andrew

Nikolay Ivchenkov

Mar 13, 2013, 5:44:15 PM
to conc...@isocpp.org
On Wednesday, March 13, 2013 7:31:10 PM UTC+4, Andrew Sutton wrote:
>> In other words, a constraint is a formal entity that, in general,
>> is not supposed to reflect the actual set of requirements.
>
> I'm not quite sure what you mean by this.
>
> A constraint specifies an actual set of requirements that are actually
> checked prior to instantiation.

In general, it's a subset of the actual (complete) set of
requirements that can be checked during overload resolution (and
contributes to determining the outcome of the overload resolution
process). The presence of formal constraints doesn't imply that
ordinary SFINAE won't work; thus, sometimes we will have to deal with
implicit (non-formal) constraints that should be taken into account.

When people read the following thoughts


    "An unconstrained template is obviously less constrained than a
    constrained template"

they probably first imagine something like this:

    // unconstrained template
    template <class T>
        void f(T);

    // constrained template
    template <class T>
        requires SomePredicate<T>()
            void f(T);

where there is no noticeable difference between the formal set of
constraints and the full visible set, and the suggested model gives
reasonable and intuitively expected results. The unattractive truth
is that the given model may imply unreasonable and surprising
results:

    // formally unconstrained template
    template <class T>
        typename std::enable_if<std::is_integral<T>{}>::type f(T);

    // formally constrained template
    template <class T>
        requires std::is_arithmetic<T>{}
            void f(T);

Here the second template will always be a better match than the
first, although the set of integral types is a subset of the
arithmetic types. A possible impression that formally constrained and
formally unconstrained templates should always mix well is just an
illusion.


> If you write a declaration or definition that uses syntax outside of
> that specification, then you invite the possibilities for
> late-caught type errors.

I don't worry about late-caught errors (which are hard errors that
instantly expose existing bugs); I'm concerned about silent calls to
functions that possibly were not intended to be chosen.


> I suspect that a more reasonable definition of constraints on this
> operator might be:
>
> template<typename... Params>
>  requires Noexcept_function<F, Params...>()
> auto operator()(Params&&... params) const
> { return m_f.call(forward<Params>(params)...); }
>
> Or something similar.

How exactly would you define the concept? That's the most interesting
part. Also it would be interesting to see a concept-based alternative
to the following definition of min:

    #define FORWARD(x) static_cast<decltype(x) &&>(x)

    #define RETURN(...) \
        noexcept(noexcept(__VA_ARGS__)) \
        -> decltype((__VA_ARGS__)) \
        { return __VA_ARGS__; }

    template <class T1, class T2>
        auto min(T1 &&t1, T2 &&t2)
            RETURN(t1 < t2 ? FORWARD(t1) : FORWARD(t2))

BTW, within the model described in N2914 (the last C++0x working
draft with concepts), it's extremely hard (or impossible?) to specify
correct constraints for such a min.


>> There are function templates whose semantics are defined in the
>> form "the effect is equivalent to calling ...", and for which
>> there is only one reasonable implementation. There is no point in
>> treating such an implementation as something that must be hidden.
>
> This is not a convincing argument for conflating an interface with
> its implementation.

What is the difference between the interface and the implementation
of a call wrapper that simply forwards arguments to, and results
from, some underlying function? What is the point of defining a
special concept for such a wrapper?

Andrew Sutton

Mar 13, 2013, 6:36:47 PM
to conc...@isocpp.org
> In general, it's a subset of an actual (complete) set of requirements
> that can be checked during overload resolution (and contributes to
> determining the outcome of an overload resolution process).

In general, that should *not* be. Remember that concepts-lite is an
incremental step towards full concepts; it's not simply a
replacement for enable_if.

Concepts will definitely include requirements for checking template
declarations separately from their instantiation, which means that
names and expressions that would result in substitution failures now
may result in compile errors in the future.

SFINAE hacks have enabled programmers to think about only what's
necessary to achieve a particular overloading outcome, not the
complete set of requirements for a template. I'm as guilty of that
as anybody, but I also understand that what we can do using SFINAE
is not the equivalent of a proper specification.


> A possible impression that formally constrained and formally
> unconstrained templates should always mix well is just an illusion.

Are you suggesting that future libraries should contain a mix of
SFINAE-enabled overloads and constrained overloads? Can you provide
some rationale as to why that would be the case?



> How exactly would you define the concept? That's the most interesting
> part. Also it would be interesting to see a concept-based alternative
> to the following definition of min:
>
> #define FORWARD(x) static_cast<decltype(x) &&>(x)
>
> #define RETURN(...) \
> noexcept(noexcept(__VA_ARGS__)) \
> -> decltype((__VA_ARGS__)) \
> { return __VA_ARGS__; }
>
> template <class T1, class T2>
> auto min(T1 &&t1, T2 &&t2)
> RETURN(t1 < t2 ? FORWARD(t1) : FORWARD(t2))
>
> BTW, within the model described in N2914 (the last C++0x working draft
> with concepts) it's extremely hard (or impossible?) to specify correct
> constraints for such min.

Did you read the entire proposal? The implementation section described
the mechanism I used to implement syntactic requirements, and if you
dig through the implementation of the traits header, you'll see how
the noexcept type traits are implemented using constraints. It's in a
file named include/bits/cons_type_traits.h (in libstdc++v3).


template<typename T, typename U>
  requires Totally_ordered<T, U>()
    Common_type<T, U>
    min(T&& t, U&& u) { return t < u ? t : u; }

The implementation of Concepts Lite does not yet include Common_type
--- I left it out by accident, but there's a good definition in the
Palo Alto TR (N3351). It's the result type of the ?: operator.

Add forwarding where appropriate, but I don't think it should be necessary.


>> This is not a convincing argument for conflating an interface
>> with its implementation.
>
> What is the difference between interface and implementation of a call
> wrapper that simply forwards arguments to and results from some
> underlying function? What is the point to define a special concept for
> such wrapper?

If you find the need to define a special concept for such a wrapper,
then I suspect that particular function does not need to be
constrained, or could be adequately defined using some SFINAE
mechanism.

Andrew

Bjarne Stroustrup

Mar 14, 2013, 11:24:23 AM
to conc...@isocpp.org
On 3/13/2013 5:36 PM, Andrew Sutton wrote:

Let me take this opportunity to remind people that

"being able to do something is not sufficient reason for doing it" and
"being able to do every trick is not a feature but a bug"

For the latter, remember Dijkstra's famous "Goto considered harmful"
paper. The point was not that the "new features" (loop constructs) could
do every goto trick better/simpler, but that some of those tricks should
be avoided to simplify good programming.

Concepts and concepts lite are meant to make good generic programming
simpler. They are not meant to be a drop-in substitute for every
metaprogramming and macroprogramming trick. If you are an expert, and
if in your expert opinion you and your users really need those
tricks, you can still use them, but we need to make many (most) uses
of templates easier to get right, so that they can become more
mainstream. That's where concepts and concepts lite fit in.

Some of you may find this hard to believe, but "back then" there was
quite serious opposition to function declarations because "they
restricted the way functions could be used and the way separate
compilation could be used" and also serious opposition to virtual
functions "because pointers to functions are so much more flexible." I
see concepts lite (and concepts) in the same light as goto/for,
unchecked-function-arguments/function-declarations,
pointers-to-functions/abstract-classes.

Tony V E

Mar 15, 2013, 11:19:43 PM
to conc...@isocpp.org
On Wed, Mar 13, 2013 at 6:36 PM, Andrew Sutton
<andrew....@gmail.com> wrote:
>> In general, it's a subset of the actual (complete) set of
>> requirements that can be checked during overload resolution (and
>> contributes to determining the outcome of the overload resolution
>> process).
>
> In general, that should *not* be. Remember that concepts-lite is
> an incremental step towards full concepts; it's not simply a
> replacement for enable_if.
>
> Concepts will definitely include requirements for checking
> template declarations separately from their instantiation, which
> means that names and expressions that would result in substitution
> failures now may result in compile errors in the future.
>
> SFINAE hacks have enabled programmers to think about only what's
> necessary to achieve a particular overloading outcome, not the
> complete set of requirements for a template. I'm as guilty of that
> as anybody, but I also understand that what we can do using SFINAE
> is not the equivalent of a proper specification.
>
>
>> A possible impression that formally constrained and formally
>> unconstrained templates should always mix well is just an illusion.
>
> Are you suggesting that future libraries should contain a mix of
> SFINAE-enabled overloads and constrained overloads? Can you
> provide some rationale as to why that would be the case?
>
>

I don't know about libraries, but I suspect mixes of libraries, and
mixes of old and new code, will leave someone with SFINAE mixed with
constraints.

Tony

Andrew Sutton

Mar 16, 2013, 9:58:19 AM
to conc...@isocpp.org
> I don't know about libraries, but I suspect mixes of libraries and
> mixes of old and new code, will leave someone with SFINAE mixed with
> constraints.

I'm not especially worried. From what I've seen, SFINAE-constrained
overloads of the same function aren't frequently spread across a lot
of modules. They tend to be localized to a single library, so the
likelihood of mixed code should be quite small.

Are there any good examples of systems where this is not the case?

Tony V E

Mar 16, 2013, 12:55:53 PM
to conc...@isocpp.org
I suspect it will only happen via accidental naming collisions, like
when your function is just called "update". It's just too common a
name.

So it will be rare, but it will bite someone. But that problem
already exists in various forms.

Sent from my portable Analytical Engine


From: "Andrew Sutton" <andrew....@gmail.com>
To: "conc...@isocpp.org" <conc...@isocpp.org>
Sent: 16 March, 2013 9:58 AM
Subject: Re: [concepts] Explicit and implicit constraints

> I don't know about libraries, but I suspect mixes of libraries and
> mixes of old and new code, will leave someone with SFINAE mixed with
> constraints.

I'm not especially worried. From what I've seen, SFINAE-constrained
overloads of the same function aren't frequently spread across a lot
of modules. They tend to be localized to a single library, so the
likelihood of mixed code should be quite small.

Are there any good examples of systems where this is not the case?



Nikolay Ivchenkov

Mar 29, 2013, 5:12:32 AM
to conc...@isocpp.org
On Thursday, March 14, 2013 2:36:47 AM UTC+4, Andrew Sutton wrote:
>>> A constraint specifies an actual set of requirements that are
>>> actually checked prior to instantiation.
>>
>> In general, it's a subset of the actual (complete) set of
>> requirements that can be checked during overload resolution (and
>> contributes to determining the outcome of the overload resolution
>> process).
>
> In general, that should *not* be. Remember that concepts-lite is
> an incremental step towards full concepts; it's not simply a
> replacement for enable_if.

The actual set of requirements may differ from the formal set for:

1) formally unconstrained templates (as shown previously);
2) formally constrained templates with accidentally omitted
requirements (see below);
3) formally constrained templates with intentionally omitted
requirements.


> SFINAE hacks have enabled programmers to think about only what's
> necessary to achieve a particular overloading outcome, but not the
> complete set of requirements for a template.

In the absence of a convenient way to specify the desired priority of
a function template within an overload set, lightweight concepts will
probably be used as enable_if v2.0 for more subtle hacks.


>> A possible impression that formally constrained and formally
>> unconstrained templates should always mix well is just an illusion.
>
> Are you suggesting that future libraries should contain a mix of
> SFINAE-enabled overloads and constrained overloads?

I thought that you were going to encourage people to use such mixes.
From my point of view, existing templates should be treated as
"indeterminately constrained" (such that all other templates are
neither more nor less constrained) rather than "unconstrained", and
there should be a way to distinguish templates with unknown
requirements from templates with no requirements (assuming, of
course, that concepts should be introduced at all).


>  template<typename T, typename U>
>    requires Totally_ordered<T, U>()
>      Common_type<T, U>
>      min(T&& t, U&& u) { return t < u ? t : u; }

Such a definition appears to be underconstrained, because not all
Totally_ordered pairs have a common type.

    struct B { int n; };

    bool operator ==(B const &, B const &);
    bool operator !=(B const &, B const &);
    bool operator <(B const &, B const &);
    bool operator >(B const &, B const &);
    bool operator <=(B const &, B const &);
    bool operator >=(B const &, B const &);

    struct D1 : B {};
    struct D2 : B {};

    D1 d1;
    D2 d2;
    using common_type = decltype(d1 < d2 ? d1 : d2);

Here the pair <D1, D2> is totally ordered, but it doesn't have a
common type, so the application of ?: will be ill-formed. Therefore,
we need something like this:

    template <class T, class U>
        requires
            Less_than_comparable<T, U>() &&
            Has_common_type<T, U>()
        Common_type<T, U> min(T &&t, U &&u)
            { return t < u ? FORWARD(t) : FORWARD(u); }

On the other hand, according to N3485 - 14.8.2 p. 6, 7:

    At certain points in the template argument deduction process it
    is necessary to take a function type that makes use of template
    parameters and replace those template parameters with the
    corresponding template arguments. This is done at the beginning of
    template argument deduction when any explicitly specified template
    arguments are substituted into the function type, and again at the
    end of template argument deduction when any template arguments
    that were deduced or obtained from default arguments are
    substituted.

    The substitution occurs in all types and expressions that are used
    in the function type and in template parameter declarations.

Thus, substitution of template arguments in Common_type<T, U> can be
done as a part of the template argument deduction process, which (as
far as I can see in N3580) is supposed to be performed before
substitution in the requires-clause; then a substitution failure in
Common_type<T, U> may occur before checking Has_common_type<T, U>().

I've tested gcc-clite-0.2 (2013-02-22).
The program from http://liveworkspace.org/code/CTns4$0 prints 0. If we
replace the call

   min(std::declval<T>(), std::declval<U>())

with

   min<T, U>(std::declval<T>(), std::declval<U>())

then the compiler issues the following diagnostics:

----------------------------------------------------------------------
Compiling: source.cpp
In file included from /include/c++/4.8.0/type_traits:42:0,
                 from /include/c++/4.8.0/bits/move.h:57,
                 from /include/c++/4.8.0/bits/stl_pair.h:61,
                 from /include/c++/4.8.0/bits/stl_algobase.h:65,
                 from /include/c++/4.8.0/bits/char_traits.h:41,
                 from /include/c++/4.8.0/ios:41,
                 from /include/c++/4.8.0/ostream:40,
                 from /include/c++/4.8.0/iostream:40,
                 from source.cpp:1:
/include/c++/4.8.0/bits/cons_type_traits.h: In instantiation of
                    ‘struct std::common_type<D1&, D2&>’:
source.cpp:28:13:   required by substitution of ‘template<class T,
                        class U> typename std::common_type<T, U>::type
                        min(T&&, U&&)
                        [with T = <missing>; U = <missing>]’
source.cpp:33:80:   required from ‘constexpr bool has_min()
                        [with T = D1&; U = D2&]’
source.cpp:37:38:   required from here
/include/c++/4.8.0/bits/cons_type_traits.h:1462:29: error: no match
                        for ternary ‘operator?:’ (operand types are
                        ‘bool’, ‘D1’, and ‘D2’)
     { typedef decltype(true ? declval<_Tp>() : declval<_Up>()) type; };
                             ^
Process terminated with status 1 (0 minutes, 1 seconds)
4 errors, 0 warnings
----------------------------------------------------------------------

Nikolay Ivchenkov

Mar 29, 2013, 5:43:43 AM
to conc...@isocpp.org, b...@cs.tamu.edu
On Thursday, March 14, 2013 7:24:23 PM UTC+4, Bjarne Stroustrup wrote:
> Concepts and concepts lite are meant to make good generic
> programming simpler.

I doubt that such expectations about simplicity are reasonable.
Specifying correct constraints is more difficult than it may seem at
first glance. While templates with _properly_ specified constraints
may become easier to use, it's likely that only language lawyers will
be able to write templates correctly in accordance with the new
fashion, and we will have lots and lots of poorly constrained
templates (which are even worse than formally unconstrained
templates). I can suggest a simple test: ask people how they would
specify constraints for the following templates:

1)

    template <class T, class U>
        typename std::common_type<T, U>::type
            min(T &&t, U &&u)
                { return t < u ? FORWARD(t) : FORWARD(u); }

2)

    template <class T, class... Params>
        typename std::enable_if
        <
            !std::is_array<T>{},
            std::unique_ptr<T>
        >::type make_unique(Params &&... params)
            { return std::unique_ptr<T>(new T(FORWARD(params)...)); }

3)

    template <class F, class... Params>
        typename std::result_of<F &&(Params &&...)>::type
            invoke(F &&f, Params &&... params)
                { return FORWARD(f)(FORWARD(params)...); }

where FORWARD is defined as


    #define FORWARD(x) static_cast<decltype(x) &&>(x)

I think the vast majority of _advanced_ C++ programmers will be
unable to provide proper formal constraints even for such simple
cases.

How about something more difficult, e.g., variadic min?
http://liveworkspace.org/code/2GxNZd$0

It's not obvious how to reasonably define and use such primitive
constraints as Less_than_comparable, Weakly_ordered, and
Totally_ordered. The header cons_type_traits.h provides the following
definitions:

  template<typename _Tp>
    constexpr bool Weakly_ordered()
    {
      __declval _Tp __a, __b;
      return __is_valid_expr(bool = {__a < __b})
          && __is_valid_expr(bool = {__a > __b})
          && __is_valid_expr(bool = {__a <= __b})
          && __is_valid_expr(bool = {__a >= __b});
    }

  template<typename _Tp, typename _Up>
    constexpr bool Weakly_ordered()
    {
      __declval _Tp __t;
      __declval _Up __u;
      return Weakly_ordered<_Tp>() && Weakly_ordered<_Up>()
          && __is_valid_expr(bool = {__t < __u})
          && __is_valid_expr(bool = {__u < __t})
          && __is_valid_expr(bool = {__t > __u})
          && __is_valid_expr(bool = {__u > __t})
          && __is_valid_expr(bool = {__t <= __u})
          && __is_valid_expr(bool = {__u <= __t})
          && __is_valid_expr(bool = {__t >= __u})
          && __is_valid_expr(bool = {__u >= __t});   
    }

  template<typename _Tp, typename _Up>
    constexpr bool Totally_ordered()
    {
      return Equality_comparable<_Tp, _Up>() &&
        Weakly_ordered<_Tp, _Up>();
    }

If I constrain my min with such Totally_ordered as was suggested by
Andrew, I won't be able to write something like this

    class type_index
    {
    public:
        type_index(std::type_info const &x) noexcept :
            m_typeid(&x) {}
        size_t hash_code() const noexcept
            { return m_typeid->hash_code(); }
        const char *name() const noexcept
            { return m_typeid->name(); }
        std::type_info const &get() const noexcept
            { return *m_typeid; }
    private:
        std::type_info const *m_typeid;
    };

    inline bool operator ==(
        type_index const &x, type_index const &y) noexcept
            { return x.get() == y.get(); }
    inline bool operator !=(
        type_index const &x, type_index const &y) noexcept
            { return !(x == y); }
    inline bool operator <(
        type_index const &x, type_index const &y) noexcept
            { return x.get().before(y.get()); }
    inline bool operator >(
        type_index const &x, type_index const &y) noexcept
            { return y < x; }
    inline bool operator <=(
        type_index const &x, type_index const &y) noexcept
            { return !(y < x); }
    inline bool operator >=(
        type_index const &x, type_index const &y) noexcept
            { return !(x < y); }

    type_index type_idx = get_type_index();
    type_index min_type_idx = min(type_idx, typeid(T)); // ill-formed

because 'std::type_info const &' isn't Weakly_ordered (there is no
suitable operator < for operands of such a type), in spite of the
fact that there are no problems with heterogeneous comparison
between 'type_index' and 'std::type_info const &'. The presence of

   Weakly_ordered<_Tp>() && Weakly_ordered<_Up>()

doesn't look reasonable. What can we say about the symmetry "T is
LessThanComparable/Weakly_ordered/Totally_ordered with U iff U is
LessThanComparable/Weakly_ordered/Totally_ordered with T"?

Sometimes classes define operator < as a member function. Then
implicit conversions can be applied only to the right operand. I
consider such classes poorly designed, but people frequently do such
things. In particular, std::type_index is defined this way. We can
compare an expression of type std::type_index (taken as the left
operand) with an expression of type const std::type_info (taken as
the right operand), but the symmetric comparison between const
std::type_info and std::type_index is not valid. Do we really need
symmetry in such templates as heterogeneous min? I don't think so.

The need for ==, !=, >, <=, and >= in min is also controversial.

On the other hand, Weakly_ordered doesn't care about constness and
value category of operands. For example,


    template <class T, class U>
        requires
            std::Totally_ordered<T, U>() &&
            Has_common_type<T, U>()

        typename std::common_type<T, U>::type
            min(T &&t, U &&u)
    {
        return t < u ? FORWARD(t) : FORWARD(u);
    }

can be changed to


    template <class T, class U>
        requires
            std::Totally_ordered<T, U>() &&
            Has_common_type<T, U>()

        typename std::common_type<T, U>::type
            min(T &&t, U &&u)
    {
        // '<' is replaced with '>'
        return t > u ? FORWARD(u) : FORWARD(t);
    }

without introducing any problems, but if we change this to


    template <class T, class U>
        requires
            std::Totally_ordered<T, U>() &&
            Has_common_type<T, U>()

        typename std::common_type<T, U>::type
            min(T &&t, U &&u)
    {
        auto const &x = t;
        auto const &y = u;
        return x < y ? FORWARD(t) : FORWARD(u);
    }

the constraint check std::Totally_ordered<T, U>() becomes
insufficient: the ability to compare non-const lvalues doesn't imply
the ability to compare const lvalues. C++11 Table 18 describes the
LessThanComparable requirement in terms of expressions of arbitrary
value category and optionally added constness. A similar comprehensive
heterogeneous version of LessThanComparable could be formulated as
follows:

    A type T is LessThanComparable with a type U iff for every
    well-formed lvalue or rvalue t of possibly const type
    std::remove_reference<T>::type and for every well-formed lvalue or
    rvalue u of possibly const type std::remove_reference<U>::type the
    following declarations are well-formed:

        bool implicit_conversion_result = t < u;
        bool explicit_conversion_result(t < u);

[Note that theoretically a type may be implicitly convertible to bool,
but not contextually convertible to bool. For example, the code


    template <class T, class U>
        requires
            std::Totally_ordered<T, U>() &&
            Has_common_type<T, U>()

        typename std::common_type<T, U>::type
            min(T &&t, U &&u)
    {
        return t < u ? FORWARD(t) : FORWARD(u);
    }

    struct B
    {
        operator int() const;
        explicit operator bool() const = delete;
    };

    struct X
    {
        friend B operator <(X const &, X const &);
        friend B operator >(X const &, X const &);
        friend B operator <=(X const &, X const &);
        friend B operator >=(X const &, X const &);
        friend B operator ==(X const &, X const &);
        friend B operator !=(X const &, X const &);
    };

    auto &&res = min(X(), X());

is ill-formed, because for the given application of min the result of
t < u is not contextually convertible to bool, but for the deduced T
and U the constraint check std::Totally_ordered<T, U>() will be
successful, because t < u is implicitly convertible to bool.]

A comprehensive concept check for the aforementioned definition of
heterogeneous LessThanComparable has to include tests for 32
declarations (left const/non-const lvalue/rvalue operand *
right const/non-const lvalue/rvalue operand * implicit/explicit
conversion to bool):

    #define CONVERTIBLE(source_expr, destination_type) \
        destination_type = { source_expr }; \
        destination_type(source_expr)


    template <class T, class U>
        constexpr bool Less_than_comparable_impl_lvalueness() noexcept
    {
        return requires
            {
                CONVERTIBLE(declval<T &>() < declval<U &>(), bool);
                CONVERTIBLE(declval<T>() < declval<U &>(), bool);
                CONVERTIBLE(declval<T &>() < declval<U>(), bool);
                CONVERTIBLE(declval<T>() < declval<U>(), bool);
            };
    }

    template <class T, class U>
        constexpr bool Less_than_comparable_impl_constness() noexcept
    {
        return
            Less_than_comparable_impl_lvalueness<T, U>() &&
            Less_than_comparable_impl_lvalueness<T const, U>() &&
            Less_than_comparable_impl_lvalueness<T, U const>() &&
            Less_than_comparable_impl_lvalueness<T const, U const>();
    }

    template <class T, class U>
        constexpr bool Less_than_comparable() noexcept
    {
        using ET = typename std::remove_reference<T>::type;
        using EU = typename std::remove_reference<U>::type;
        return Less_than_comparable_impl_constness<ET, EU>();
    }   

If we add checks for >, <=, >=, we will have 128 checks. If we also
add the symmetry check (x @ y <-> y @ x), we will have 256 checks.
Such a theoretically perfect concept check leads to complicated code
and potentially slow compilation. You can say: "OK, let's ignore rare
errors and handle only the most common ones. Then we can significantly
reduce the number of checks, although sometimes we will get long error
messages as before." But where do we draw the line between "rare" and
"common"? Is the following definition good enough?


    template <class T, class U>
        constexpr bool Less_than_comparable() noexcept
    {
        using AT = typename std::remove_reference<T>::type;
        using AU = typename std::remove_reference<U>::type;
        return requires
        {
            bool = { declval<AT const &>() < declval<AU const &>() };
            bool = { declval<AT>() < declval<AU>() };
        };
    }

How much skill and how much time do we need in order to find an
optimal solution?

Consider another example:

  // Value type
  template<typename _Tp>
    struct __value_type
    { using type = __subst_fail; };

  template<typename _Tp>
    requires __is_valid_type(typename _Tp::value_type)
      struct __value_type<_Tp>
      { using type = typename _Tp::value_type; };

  // Any type claiming that their value type is void lies.
  template<typename _Tp>
    requires __is_valid_type(typename _Tp::value_type)
          && __is_same(typename _Tp::value_type, void)
    struct __value_type<_Tp>
    { using type = __subst_fail; };

  template<typename _Tp>
    struct __value_type<_Tp*>
    { using type = _Tp; };

  template<typename _Tp>
    struct __value_type<const _Tp*>
    { using type = _Tp; };

  // Value_type
  template<typename _Tp>
    using Value_type = typename __value_type<_Tp>::type;

  template<typename _Iter, typename _Tp>
    constexpr bool Writable()
    {
      __declval _Iter __i;
      __declval _Tp __x;
      return __is_valid_expr(*__i = std::forward<_Tp>(__x));
    }

  template<typename _Iter, typename _Tp>
    constexpr bool Output_iterator()
    {
      return Writable<_Iter, _Tp>() && Advanceable<_Iter>();
    }

  template<typename _In, typename _Out>
    constexpr bool Indirectly_movable()
    {
      return Input_iterator<_In>()
          && Output_iterator<_Out, Value_type<_In>&&>();
    }

  // True when values can be copied from In to Out.
  template<typename _In, typename _Out>
    constexpr bool Indirectly_copyable()
    {
      return Indirectly_movable<_In, _Out>()
          && Output_iterator<_Out, Value_type<_In>>();
    }

  template<Input_iterator _In, Advanceable _Out>
    requires Indirectly_copyable<_In, _Out>()
      inline _Out
      copy(_In __first, _In __last, _Out __result)
      {
        __glibcxx_requires_valid_range(__first, __last);

        return (std::__copy_move_a2<__is_move_iterator<_In>::__value>
          (std::__miter_base(__first), std::__miter_base(__last),
           __result));
      }

Do you see any problems in Indirectly_movable and Indirectly_copyable
(which are part of the presented GCC implementation)? If not, I'll
give you a hint: try to figure out what will happen if _In is
'const std::unique_ptr<int> *' and _Out is 'std::unique_ptr<int> *'.
In spite of the fact that std::unique_ptr<int> is not assignable from
an lvalue or rvalue of type const std::unique_ptr<int>,

    Indirectly_movable
    <
        const std::unique_ptr<int> *,
        std::unique_ptr<int> *
    >()

and

    Indirectly_copyable
    <
        const std::unique_ptr<int> *,
        std::unique_ptr<int> *
    >()

will return true, because in

    __is_valid_expr(*__i = std::forward<_Tp>(__x))

the expression std::forward<_Tp>(__x) becomes an rvalue of type
std::unique_ptr<int> in all cases.

If you try to compile

    std::unique_ptr<int> const p1;
    std::unique_ptr<int> p2;
    std::copy(&p1, &p1 + 1, &p2);

you will see that the constraint check doesn't work properly: the
error is diagnosed inside the definition of specialization for
auxiliary class template std::__copy_move.

On the other hand, the above std::copy is overconstrained (again, the
root of the problem is inside the definition of Indirectly_copyable).
Under the C++11 rules we can write:

    std::vector<int> v1 = { 11, 22, 33 };
    std::vector<std::reference_wrapper<int>> v2;
    std::copy(v1.begin(), v1.end(), std::back_inserter(v2));
   
For the given example the "modernized" version of std::copy eventually
requires (without any visible reason) that an lvalue of type
std::back_insert_iterator<std::vector<std::reference_wrapper<int>>>
shall be assignable from an rvalue of type int, but this condition
is not satisfied, because an rvalue of type int cannot be implicitly
converted to std::reference_wrapper<int>, so the call

    std::copy(v1.begin(), v1.end(), std::back_inserter(v2))

will be ill-formed. From my point of view, such application of
"concepts lite" is worse than useless.

When we write a constraint definition, we should keep in mind all
potentially useful applications of the templates for which the
constraint is designed, and not focus solely on purely abstract
considerations.
In particular, the relation "CopyAssignable subsumes MoveAssignable"
is reasonable, but the relation "Indirectly_copyable subsumes
Indirectly_movable" is not reasonable (we should not rely on vague and
contrived analogies here).

The similarity between std::iterator_traits<Iter>::value_type and
decltype(*std::declval<Iter>()) is delusive: the former and the latter
aren't interchangeable (and the former is useless in most cases).

Let's go further.
N3580 "Concepts Lite: Constraining Templates with Predicates" - 5.1
contains the following text:

    The _is_convertible_to trait determines whether a user-defined
    conversion sequence can be found from one type T to another U.
    Currently, this is implemented as a library feature with a complex
    implementation.
   
    These traits, together with the __is_base_of intrinsic, have
    special logical rules in our implementation of the subsumes
    algorithm. In particular, the following statements are always
    valid:
   
        __is_same(T, U) |- __is_convertible_to(T, U)
        __is_same(T, U) |- __is_base_of(U, T)
        __is_base_of(U, T) |- __is_convertible_to(T, U)

    That is, whenever the left side of |- is true, the right side is
    also true. This has the nice property of ordering these
    relationships by their “strength”. These properties are used in
    some STL implementations to select optimized algorithms when
    comparing values of the same type vs. those that are simply
    convertible.

If __is_same, __is_base_of, and __is_convertible_to are supposed to
have the same semantics as std::is_same, std::is_base_of, and
std::is_convertible respectively, then none of these three assumptions
is correct in general. For example, if we have

    class A { A(A const &); };

then the following conditions shall hold:
   
    std::is_same<A, A>::value == true
    std::is_base_of<A, A>::value == true
    std::is_convertible<A, A>::value == false

    std::is_same<int, int>::value == true
    std::is_base_of<int, int>::value == false

I can imagine that there may be some useful applications of such
"subsumes" relations (at least because they actually hold for some
subsets of type pairs), but when we begin to operate with the
categories "sometimes works" or "almost works", it looks like we are
dealing with hacks.

Conclusion:
When we don't consider details, concepts may seem to be simple, but
their actual complexity can be well understood only when we consider
them in depth.


> Some of you may find this hard to believe, but "back then" there was
> quite serious opposition to function declarations because "they
> restricted the way functions could be used and the way separate
> compilation could be used" and also serious opposition to virtual
> functions "because pointers to functions are so much more flexible."
> I see concepts lite (and concepts) in the same light as goto/for,
> unchecked-function-arguments/function-declarations,
> pointers-to-functions/abstract-classes.

Before we start to speculate on vague analogies, could you remind
why we don't have compile-time checked exception-specifications?

Bjarne Stroustrup

unread,
Mar 29, 2013, 10:27:31 AM3/29/13
to Nikolay Ivchenkov, conc...@isocpp.org
On 3/29/2013 4:43 AM, Nikolay Ivchenkov wrote:
> On Thursday, March 14, 2013 7:24:23 PM UTC+4, Bjarne Stroustrup wrote:
> > Concepts and concepts lite are meant to make good generic
> > programming simpler.
>
> I doubt that such expectations about simplicity are reasonable.

Do you often call people unreasonable?

> Specifying correct constraints is more difficult than it may seem at
> first glance.

Really? I wonder why I haven't noticed that over the last many years.

> While templates with _properly_ specified constraints
> may become easier to use, it's likely that only language lawyers will
> be able to write templates correctly in accordance with the new
> fashion and we will have lots and lots of poorly constrained templates
> (that are even worse than formally unconstrained templates).

I think you completely misunderstand what concepts are meant to do. They
are *not* intended to mirror the absolutely minimal requirements of
every function that a fertile programmer's brain can come up with and
every language technical problem anyone can think of. They are meant to
express fundamental concepts (hence the name) of an application domain.
I recommend:

http://www.stepanovpapers.com/#eop
http://www.open-std.org/jtc1/sc22/wg21/docs/papers/2012/n3351.pdf
http://www.stroustrup.com/sle2011-concepts.pdf

The "concepts" we came up with for C++0x can serve as a warning. If
what we come up with looks a lot like those, we have failed.

Nikolay Ivchenkov

unread,
Mar 29, 2013, 3:58:01 PM3/29/13
to conc...@isocpp.org, Nikolay Ivchenkov, b...@cs.tamu.edu
On Friday, March 29, 2013 6:27:31 PM UTC+4, Bjarne Stroustrup wrote:
>>> Concepts and concepts lite are meant to make good generic
>>> programming simpler.
>>
>> I doubt that such expectations about simplicity are reasonable.
>
> Do you often call people unreasonable?

I prefer to discuss ideas, technical details, and objective issues,
not personalities.


> I think you completely misunderstand what concepts are meant to do.
> They are *not* intended to mirror the absolutely minimal
> requirements of every function [...]

I agree that specifying _absolutely_ minimal requirements may be
non-optimal, but all additional requirements (over absolutely
minimal) should have sufficient justification. Otherwise, we will get
overconstrained templates, which are not applicable in (potentially)
many cases that people would like to see supported, and some people
will effectively be forced to reimplement such templates in order to
have similar functionality without annoying limitations, or they will
consider migration to C++14/C++17 unacceptable if the committee decides
to widely break compatibility with C++11.


> They are meant to express fundamental concepts (hence the name) of
> an application domain.

A concept includes syntactic requirements as an essential part. I
described some technical issues regarding syntactic requirements as
argumentation for my point of view about the "simplicity" of generic
programming with concepts. Do you have any substantive remarks or
objections related to the technical part?

I've read N3351 and sle2011-concepts from cover to cover. Some
definitions in N3351 and in the GCC implementation have similar
technical issues (from my point of view), which I can describe either
privately or on this reflector (but in another thread).