macro times(unsigned n)
{
    inject "for(unsigned i = 0; i != %; ++i)", n;
}

$times(10)
    std::cout << "Hello, World!\n";

template <typename T>
macro defineEq()
{
    inject "bool operator == (%0 lhs, %0 rhs) { return true", std::meta::full_name_of<T>();
    for(auto mem : std::meta::members_of<T>())
        inject " && lhs.%0 == rhs.%0", mem.name;
    inject "; }";
}
Is this a feature people would like to have?
On 10/29/13 5:12 PM, stackm...@hotmail.com wrote:
> This feature becomes especially powerful with C++14's mutable compile-time
> state and upcoming compile-time reflection features. One could easily imagine
> iterating over the members of a class to generate comparison operators or
> serialization routines using this feature:
> The standard library could provide some predefined macros for this.
>
> Is this a feature people would like to have?
>
Thanks for the idea... got me thinking about modules...
transform(first, last, first, [](x) x*x );
// or
times(10) * [](x) cout << x; // using operator*
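For comparison, the `times(10) * lambda` idea can already be approximated in C++11 with an ordinary lambda and a small helper; `times` and the `operator*` overload below are illustrative names, not part of any proposal:

```cpp
// Hypothetical helper: times{n} * f calls f(i) for each i in [0, n).
struct times {
    unsigned n;
};

template<class F>
void operator*(times t, F f)
{
    for (unsigned i = 0; i != t.n; ++i)
        f(i); // run the "loop body" once per iteration
}
```

Usage would look like `times{10} * [](unsigned i) { std::cout << i; };`.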
#include <boost/functional/hash.hpp>      // boost::hash_combine
#include <boost/fusion/include/fold.hpp>  // boost::fusion::fold

struct Hasher
{
    template<typename T>
    size_t operator()(size_t hash, const T &x) const
    {
        boost::hash_combine(hash, x);
        return hash;
    }
};

// Struct must be a Fusion sequence, e.g. adapted with BOOST_FUSION_ADAPT_STRUCT.
template<typename Struct>
size_t hash_value(const Struct &x)
{
    return boost::fusion::fold(x, size_t(0), Hasher{});
}
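The same fold-over-members idea can be sketched without Boost by folding over a std::tuple (C++14); the `hash_combine` below is a hand-rolled stand-in for `boost::hash_combine` using the same mixing formula:

```cpp
#include <cstddef>
#include <functional>
#include <initializer_list>
#include <tuple>
#include <utility>

// Hand-rolled stand-in for boost::hash_combine (same mixing constant).
inline void hash_combine(std::size_t &seed, std::size_t v)
{
    seed ^= v + 0x9e3779b9 + (seed << 6) + (seed >> 2);
}

// Fold a hash over every tuple element in order, mirroring fusion::fold.
template<class Tuple, std::size_t... Is>
std::size_t hash_tuple_impl(const Tuple &t, std::index_sequence<Is...>)
{
    std::size_t seed = 0;
    (void)std::initializer_list<int>{
        (hash_combine(seed,
             std::hash<typename std::tuple_element<Is, Tuple>::type>{}(std::get<Is>(t))),
         0)...};
    return seed;
}

template<class... Ts>
std::size_t hash_tuple(const std::tuple<Ts...> &t)
{
    return hash_tuple_impl(t, std::index_sequence_for<Ts...>{});
}
```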
Several thoughts:
1. The D language has mixins, which have very similar use cases and a minimalistic design: http://dlang.org/mixin.html - a mixin takes a compile-time string as a parameter and emits code in place.
I don't think anybody has proposed mixin-like functionality for C++, but I figured I'd do my best to nip it in the bud before it goes anywhere.
You do not appear to be talking about what I'm talking about here. :)
I am specifically referring to the D-like mixin system, not a
high-level macro system. Clearly the latter is a good thing to have
compared to what we have today. D's approach is simply not the right
way to do it. Rust is a better role model here at a semantics level
(not necessarily the syntax level):
https://github.com/mozilla/rust/blob/master/doc/tutorial-macros.md
mixin( constexpr_function_returning_string(arg1, arg2, arg3) ) // injects code in-place
// or
mixin( some_struct<arg1, arg2, arg3>::code_string ) // injects code in-place
> You do not appear to be talking about what I'm talking about here. :)

I am talking about a macro system embedded into the language, not an external one like the current preprocessor.

> I am specifically referring to the D-like mixin system, not a
> high-level macro system. Clearly the latter is a good thing to have
> compared to what we have today. D's approach is simply not the right
> way to do it. Rust is a better role model here at a semantics level
> (not necessarily the syntax level):
> https://github.com/mozilla/rust/blob/master/doc/tutorial-macros.md

I think D's mixin is the bare minimum that would be a feature enabler. Of course a higher-level macro system would be better, but I guess it would take much more time to standardize. I am talking about something like:

mixin( constexpr_function_returning_string(arg1, arg2, arg3) ) // injects code in-place
// or
mixin( some_struct<arg1, arg2, arg3>::code_string ) // injects code in-place
template <typename T>
macro generateEq()
{
    std::meta::function_builder<bool(T, T)> fb("operator ==");
    std::meta::expression ret = true;
    for(auto mem : std::meta::members_of<T>())
        ret = ret && fb[0].mem == fb[1].mem; // overloaded operators build the expression tree
    fb.append(std::meta::return_statement(ret));
    inject fb.str();
}
#include <delegate>

class C {
public:
    int i;
    void f();
    vector<int> g(double);
private:
    void h(int);
};

template<class F, class B>
class delegate_before_call<C, F, B> : public B
{
    using this_type = delegate_before_call<C, F, B>;
    friend F;
public:
    delegate_before_call(C &c, F f) : wrapped(c), fct(f) {}
    void f() {
        fct();        // (1.a)
        wrapped.f();
    }
    vector<int> g(double d) {
        fct();        // (2.a)
        return wrapped.g(d);
    }
};
template<class T>
frame delegate_members_before_call[
    meta::member_function ...Member = meta::public_function_members_of<T>
] -> DeclarationList
[[[
    [[[
        Member::signature
        {
            fct();
            return wrapped.Member::forward_call();
        }
    ]]]...
]]]
template<class T, class F, class B>
class delegate_before_call : public B
{
    using this_type = delegate_before_call<T, F, B>;
    friend F;
public:
    delegate_before_call(T &c, F f) : wrapped(c), fct(f) {}
    [[[ delegate_members_before_call<T>[] ]]]
};
Please clarify whether "frame" and your proposed "operator [[[ ]]]" work as cues for expanding code constructs that respect type safety / the language grammar, and when such operations should take place.
It seems that you are just adroitly attempting to move function-like macro definition and invocation into a form of expansion within the definition scope of class / function templates, while giving the operation some constant-expression evaluation characteristics.
Should that hold, you would just be forcing tightly coupled evaluations to occur prior to template instantiation and even definition. Have you considered where such an approach would clash with templates / constexpr?
I am just saying that, in wanting to remove the need for preprocessor metaprogramming with the current facilities, the route in the end is to augment constexpr / template metaprogramming, not to introduce another preprocessor.
I would thus like to see where this unambiguously extends templates and constexpr, and how it deals with the simple degenerate case of plain text substitution and concatenation, even when the strings in question are totally random (just like what happens with the preprocessor now).
As for people who rely on external tools for code generation, I think they are simply trying to avoid current preprocessor metaprogramming techniques because of some subjective disadvantage. Examples like Boost's own use demonstrate otherwise.
On Wed, Nov 6, 2013 at 1:56 PM, Vicente J. Botet Escriba <vicent...@wanadoo.fr> wrote:
> On 03/11/13 01:45, George Makrydakis wrote:
>
> > Please clarify whether "frame" and your proposed "operator [[[ ]]]" work as cues for expanding code constructs that respect type safety / the language grammar, and when such operations should take place.
>
> Yes, a frame has nodes of the language grammar as parameters and returns a specific grammar node.
>
> Good question. I do not have a response now, and I have no responses to some of your other good questions. I have not worked the approach enough to make a concrete proposal.
First of all, you will have to excuse my delayed reply; it was due to my own limited availability. Make a note of this reply of yours here, because I will be using it later on.
> > It seems that you are just adroitly attempting to move function-like macro definition and invocation into a form of expansion within the definition scope of class / function templates, while giving the operation some constant-expression evaluation characteristics.
>
> Right.
>
> > Should that hold, you would just be forcing tightly coupled evaluations to occur prior to template instantiation and even definition.
>
> Why do you think this?
Imagine wanting to repetitively create a series of constructs within a given function / class template definition, where each item is derived from a number in the [0, N] range, N being any positive integer. Notice that in order for such a template definition to occur, all the syntactically valid constructs within the definition scope have to be present. Since we are talking about systematically generating these at the level of a "preprocessor substitute" (as in your concept), we would have to have a way to control such expansion: some sort of flow control. Such control means having expression evaluation implemented in some form. A prime example of hijacking concatenation to create such "expression evaluation" mechanics is the internals of the Boost libraries, where Boost.Preprocessor is used. Take note that it is very well defined where and when everything takes place: this is "text" generation in the end, prior to any action on behalf of the compiler.
I think that the need for such a solution could be done away with if the templates / constexpr system were extended properly. I do not have a complete solution for this myself, but like everyone on this mailing list I have a few ideas about the "don'ts" and some pretty concrete directions for the "do's". It would take me quite a while to explain the reasons behind them, and eventually I will. One aspect is actually modifying parameter pack expansion semantics. More on that later on.
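As a sketch of the template/constexpr direction hinted at here, C++14 can already generate one construct per index in [0, N) without any preprocessor, via std::index_sequence (`make_table` is an illustrative name, not a standard facility):

```cpp
#include <array>
#include <cstddef>
#include <utility>

// Build an array whose element i is f(i) for i in [0, N): the kind of
// indexed repetition that would otherwise be done with Boost.Preprocessor.
template<class F, std::size_t... Is>
auto make_table_impl(F f, std::index_sequence<Is...>)
    -> std::array<decltype(f(std::size_t{0})), sizeof...(Is)>
{
    return {{ f(Is)... }};
}

template<std::size_t N, class F>
auto make_table(F f)
    -> decltype(make_table_impl(f, std::make_index_sequence<N>{}))
{
    return make_table_impl(f, std::make_index_sequence<N>{});
}
```

For example, `make_table<5>([](std::size_t i) { return i * i; })` yields the array {0, 1, 4, 9, 16}.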
> > Have you considered where such a way would clash with templates / constexpr?
>
> No, not yet. Which one do you see?
The one I described above. It is introducing another system for what templates / constexpr should be allowed to do (with a few expansions to their semantics). In a way, through parameter pack unpacking, templates have already acquired a way to expand type-dependent (and safe) constructs for any given number of type / non-type / template template parameters. This is the consequence of introducing variadicity in template parameters.
In the C++11 standard, 14.5.3 has several examples for your enjoyment. You can also watch Andrei Alexandrescu's excellent talk about variadic templates being "funadic" at http://channel9.msdn.com/Events/GoingNative/GoingNative-2012/Variadic-Templates-are-Funadic. There are several ways to do different things with parameter pack expansion, even more than several when it comes to multiple parameter packs. I am just citing one of Alexandrescu's examples:
template <class... Ts> void fun( Ts... vs ) {
    gun( A<Ts...>::hun(vs)... );
    gun( A<Ts...>::hun(vs...) );
    gun( A<Ts>::hun(vs)... );
}
This is not the most complicated thing you will see in parameter pack expansion, but it kind of foreshadows how one can deploy extensions to unpacking patterns to cover a few more scenarios where more complicated constructs are required (no, you can't do it with unpacking as it is in C++11 / C++14). I reserve the right to explain this further in a future post dedicated to just that (hopefully). It is not an easy sight, and it would require some additional syntax to make things unambiguous, because (...) is not enough.
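To make the three expansion patterns in the quoted snippet concrete, here is a runnable rendering with toy bodies for A, hun, and gun (the names come from the example above; the bodies and the call log are illustrative):

```cpp
#include <string>
#include <vector>

// Records each gun(...) invocation so the expansions can be observed.
std::vector<std::string> calls;

template<class... Ts>
struct A {
    template<class T>
    static T hun(T v) { return v; }                       // hun(vs): one element
    static double hun(int a, double b) { return a + b; }  // hun(vs...): whole pack
};

void gun(double x) { calls.push_back("gun(" + std::to_string(x) + ")"); }
void gun(double x, double y) {
    calls.push_back("gun(" + std::to_string(x) + ", " + std::to_string(y) + ")");
}

template <class... Ts> void fun(Ts... vs) {
    gun(A<Ts...>::hun(vs)...);  // expand the calls:     gun(hun(v1), hun(v2))
    gun(A<Ts...>::hun(vs...));  // expand the arguments: gun(hun(v1, v2))
    gun(A<Ts>::hun(vs)...);     // expand A's parameter: gun(A<T1>::hun(v1), A<T2>::hun(v2))
}
```

Calling `fun(1, 2.5)` logs a two-argument call, then a single-argument call with the sum 3.5, then another two-argument call.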
> > I am just saying that by wanting to remove the need for preprocessor metaprogramming with the current facilities, in the end the route is to augment constexpr / template metaprogramming, not introduce another preprocessor.
>
> I don't see a frame as a preprocessor but as a term rewriting system. I want to be able to:
>
> * build grammar node instances by reflection,
> * transform grammar node instances into grammar node instances, and
> * expand the code represented by a grammar node instance by reification.
> The operator [[[ ]]], taking some code as a parameter, results in a grammar node.
> Frames transform grammar nodes into grammar nodes.
> The operator [[[ ]]], taking a grammar node as a parameter, results in the associated code.
> Which syntax is most appropriate to support these requirements needs refinement.
> A grammar node instance is represented as a type in the std::meta namespace.
> The compiler should provide some intrinsic meta-functions to obtain grammar node instances from C++ types.
>
> My approach doesn't take the augment-constexpr / template-metaprogramming route, and maybe the route I'm taking is a dead end. As I see them, frames are orthogonal to templates and constexpr. The examples I gave are template frames, but we could have frames that are not templates. A frame would implicitly be constexpr: if the parameters are known at compile time, the result is obtained at compile time. If we were able to build grammar nodes at run time, frames could be evaluated at run time.
>
> While at first glance it could seem that frames are nothing more than macros, their parameters are language grammar nodes. This makes a big difference and, IMHO, it opens the door to more elaborate transformation techniques.
I would not say that it is a dead end, but I think it needs more work when it comes to clarifying the semantics and where operations should take place in relation to, e.g., template definition, declaration and instantiation.
But you are not in the same problem domain as the preprocessor right now, so you can't substitute for it as you wish. Frames (as you are proposing them, at least) are not exactly analogous to AST term-rewriting macros either (see Nimrod's http://nimrod-code.org/).
Because of the (for the time being!) ambiguous semantics you propose, they clash with established systems such as templates and constexpr. Think again about why I asked when and where frame operations should take place. This is more important than it seems.
> > As for people who rely on external tools for code generation, I think they are simply trying to avoid current preprocessor metaprogramming techniques because of some subjective disadvantage. Examples like Boost's own use demonstrate otherwise.
>
> I have nothing against external tools, but I think this is off topic on this ML as far as the generation is external.
I did not understand your reply here.
As I said, the preprocessor is fairly adequate if used properly; however, it does require some inner knowledge of the techniques involved when doing metaprogramming with it, and awareness of the necessary compromises in designing solutions with it. People not particularly eager to gain that knowledge turn to external tools. Would I like to have something better? Of course I would, but if the preprocessor is all I am given within C++, then I must be ready to make the best use of it. And I have no problems with that either.
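As a small, concrete instance of the preprocessor metaprogramming meant here, the classic X-macro idiom generates several parallel constructs (an enum and its name table) from a single list; `COLOR_LIST` and the item names are purely illustrative:

```cpp
#include <cstring>

// One list of items...
#define COLOR_LIST(X) X(Red) X(Green) X(Blue)

// ...expanded twice to generate two parallel constructs.
#define MAKE_ENUM(name) name,
enum Color { COLOR_LIST(MAKE_ENUM) ColorCount };
#undef MAKE_ENUM

#define MAKE_STRING(name) #name,
const char *color_names[] = { COLOR_LIST(MAKE_STRING) };
#undef MAKE_STRING
```

Adding an item to COLOR_LIST updates both the enum and the string table, which is exactly the "write the list once" discipline Boost.Preprocessor scales up.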
> I have used preprocessor metaprogramming techniques in TBoost.Enums and TBoost.Opaque, and while this allows one to experiment, I would like to see something more appropriate that introduces reflection in the language.
> Ville has suggested in [1] an approach that is more in line with the current language: operator member_name, when member_name is a compile-time string, would define a new function, and meta::invoke_member allows calling a function identified by a compile-time string.
I am aware of Ville Voutilainen's approaches to problems like this, and I am quite a fan of his solutions when it comes to them. But we are moving to reflection now, which would derail the entire thread into something different. Systematic generation of valid C++ constructs to reduce hand-written constructs is not closely related to reflection; it is rather a "workaround" for the lack of said feature at the language level. It is obvious that, in the end, you are actually trying to solve a compile-time / run-time barrier problem, hence your interest in reflection. Try thinking of the "frames" you are proposing as something that should not exist just to serve reflection purposes.