
Preprocessor evolution


Balog Pal

Apr 19, 2003, 5:48:43 AM
"Francis Glassborow" <francis.g...@ntlworld.com> wrote in message news:NJjmmaB4...@robinton.demon.co.uk...

> Not heretical these days. We spent time earlier this week in the
> evolution group of WG21 & J16 considering ways to take control of the
> pre-processor to make it work well in C++. Our current thinking is to
> add something like:
>
> #< // no preprocessor macros can permeate this barrier
> // unless explicitly asked for
>
> #import XYZ // allow that macro through
> #define ABC 123
> #define FALSE 0
> ...
> #export FALSE
> #> // only explicitly exported macros escape
>
> IOWs create a mechanism to control the scope of pre-processor macros.

Hm, that's something interesting.

#<
#include "windowsx.h"
#export SendMessage
#> // get rid of all the rest of the harmful macros defined there.

I like that one.

Those markers do nest, don't they? And does export propagate to the outside?

Or do I misunderstand the meaning of #< ? Intuitively I'd expect it to work the way the environment does on Unix: #< would launch a new shell, inheriting the current state, which is exited on #>,
and anything not exported vanishes. But your words can be interpreted as if #< would erase everything. That sounds bad, but that could probably be yet another directive like #tabularasa ;-)

If you start adding features to the preprocessor -- which I think is a good thing -- don't be shy. I'm sure people could think up at least a dozen directives that are easy to implement and useful.
Or tweak even the existing ones, like adding vararg support to #define.

Paul
---
[ comp.std.c++ is moderated. To submit articles, try just posting with ]
[ your news-reader. If that fails, use mailto:std...@ncar.ucar.edu ]
[ --- Please see the FAQ before posting. --- ]
[ FAQ: http://www.jamesd.demon.co.uk/csc/faq.html ]

Francis Glassborow

Apr 20, 2003, 2:49:24 AM
In message <3e9f...@andromeda.datanet.hu>, Balog Pal <pa...@lib.hu>
writes

>> #import XYZ // allow that macro through
>> #define ABC 123
>> #define FALSE 0
>> ...
>> #export FALSE
>> #> // only explicitly exported macros escape
>>
>> IOWs create a mechanism to control the scope of pre-processor macros.
>
>Hm, that's something interesting.
>
>#<
>#include "windowsx.h"
>#export SendMessage
>#> // get rid of all the rest of the harmful macros defined there.
>
>I like that one.

Yes, and we will probably support that (I cannot think of a reason not
to)


>
>Those markers do nest, don't they? And does export propagate to the outside?

I cannot remember if we explicitly discussed that, but we should do, and
my instinctive response would be yes.

>
>Or do I misunderstand the meaning of #< ? Intuitively I'd expect it to work the way the environment does on Unix: #< would launch a new shell, inheriting
>the current state, which is exited on #>,
>and anything not exported vanishes.

I think that is the way we mean to go.

>But your words can be interpreted as if #< would erase everything. That sounds bad, but that could probably be yet another directive like #tabularasa ;-)

No, I do not think that would be our intention.

>
>If you start adding features to the preprocessor -- which I think is a good thing -- don't be shy. I'm sure people could think up at least a dozen
>directives that are easy to implement and useful.

Hm... I suspect that we do not want to do very much to the preprocessor,
sort of minimalist action to control the scope of directives.

>Or tweak even the existing ones, like adding vararg support to #define.

Possible if we decide to incorporate it from C99. However we are much
more likely to do serious work on providing good tools for
meta-programming.


--
ACCU Spring Conference 2003 April 2-5
The Conference you should not have missed
ACCU Spring Conference 2004 Late April
Francis Glassborow ACCU


[ Send an empty e-mail to c++-...@netlab.cs.rpi.edu for info ]
[ about comp.lang.c++.moderated. First time posters: do this! ]

Niklas Matthies

Apr 20, 2003, 4:06:42 PM
On 2003-04-20 06:49, Francis Glassborow <francis.g...@ntlworld.com> wrote:
> In message <3e9f...@andromeda.datanet.hu>, Balog Pal <pa...@lib.hu>
> writes
>>> #import XYZ // allow that macro through
>>> #define ABC 123
>>> #define FALSE 0
>>> ...
>>> #export FALSE
>>> #> // only explicitly exported macros escape
>>>
>>> IOWs create a mechanism to control the scope of pre-processor macros.
>>
>>Hm, that's something interesting.
>>
>>#<
>>#include "windowsx.h"
>>#export SendMessage
>>#> // get rid of all the rest of the harmful macros defined there.
>>
>>I like that one.
>
> Yes, and we will probably support that (I cannot think of a reason not
> to)

Hmm... Does that mean that for every exported macro, the preprocessor
remembers its "closure", and expands the macro within that closure?
Like in:

#<
#define X 5
#define FIVE X
#export FIVE
#>

#<
#define X 6
#define SIX X
#export SIX
#>

#define X 7

int five = FIVE; // initialized to 5
int six = SIX; // initialized to 6

If it doesn't, then, in Balog Pal's example, one would need to know
how the SendMessage macro is implemented, and possibly export further
macros for it to be usable, which works against encapsulation.

It's my impression that some namespace concept would be more generally
useful, with each macro being expanded within the namespace it was
defined in. One could then write:

#begin WIN32
# include "windows.h" // everything defined here now lives
# // in preprocessor namespace WIN32
#end WIN32

#define SendMessage WIN32(SendMessage) // use function macro syntax
// for accessing namespace

Or, to adapt the other example above:

#define X 7

#begin A
# define X 5
#end A
#define FIVE A(X)

#begin , and makes for easy nesting of namespaces.
# define X 6
#end B
#define SIX B(X)

int five = FIVE; // initialized to 5
int six = SIX; // initialized to 6

The function macro syntax is used to avoid introducing a new scope
resolution operator, which would either break old code or require
extending the basic character set. Note that it is consistent with the
current function macro expansion mechanism that the outer definition
of X as 7 doesn't affect the expansion of FIVE and SIX.

Actually, the above is close to equivalent to:

#define A_unique_prefix_X 5
#define A(token) A_unique_prefix_##token
#define FIVE A(X)

#define B_unique_prefix_X 6
#define B(token) B_unique_prefix_##token
#define SIX B(X)

The only difference is that A_unique_prefix_X and B_unique_prefix_X
are not accessible other than through A(X) and B(X), and that the
definitions of X need not be aware that they live in a namespace
(i.e. you can retroactively place them in a namespace just by
inclusion). This also makes for easy nesting of namespaces.

-- Niklas Matthies


Balog Pal

Apr 20, 2003, 10:10:42 PM
"Francis Glassborow" <francis.g...@ntlworld.com> wrote in message
news:RHqB1kAq...@robinton.demon.co.uk...

> >But your words can be interpreted as if #< would erase everything.
> >That sounds bad, but probably yet another directive like #tabularasa
> >;-)
>
> No, I do not think that would be our intention.

A directive doing that can be a great help too.


#<
#undefall
// most of the .cpp code is here -- that is guaranteed to be macro-free
// and behave exactly as it reads
#import NULL // let this in
#>

It can be especially helpful during development, when all you see is
strange behavior -- just placing that macro-killer can save plenty of
work.

Also, your original example had #import; I can hardly imagine using
this without it. BTW, please find another word than import: that one is
currently used in MSVC to include COM type libraries.

> Possible if we decide to incorporate it from C99. However we are much
> more likely to do serious work on providing good tools for
> meta-programming.

I welcome that. I believe most of the stuff Alexandrescu could (almost)
solve even using the current language should gain direct compiler
support: int2type, alternative branches in templates depending on type
traits, built-in type traits, static constraint checks on template
arguments.

But I don't think development in different areas should be mutually
exclusive. At least when we're talking about 'cheap' stuff.

Paul

David Abrahams

Apr 21, 2003, 7:48:37 AM
comp.std.c+...@nmhq.net (Niklas Matthies) writes:

> Hmm... Does that mean that for every exported macro, the preprocessor
> remembers its "closure", and expands the macro within that closure?
> Like in:
>
> #<
> #define X 5
> #define FIVE X
> #export FIVE
> #>
>
> #<
> #define X 6
> #define SIX X
> #export SIX
> #>
>
> #define X 7
>
> int five = FIVE; // initialized to 5
> int six = SIX; // initialized to 6

Yes, that was the conclusion that the evolution working group came to
in Oxford.

--
Dave Abrahams
Boost Consulting
www.boost-consulting.com

Daveed Vandevoorde

Apr 21, 2003, 1:31:23 PM
comp.std.c+...@nmhq.net (Niklas Matthies) wrote:
[...]

> Hmm... Does that mean that for every exported macro, the preprocessor
> remembers its "closure", and expands the macro within that closure?

No. We're looking for a mechanism that will not
severely affect currently conforming preprocessor
implementations.

> Like in:
>
> #<
> #define X 5
> #define FIVE X
> #export FIVE
> #>
>
> #<
> #define X 6
> #define SIX X
> #export SIX
> #>
>
> #define X 7
>
> int five = FIVE; // initialized to 5
> int six = SIX; // initialized to 6

Both would be initialized to 7.
Without the "#define X 7" both would be
initialized to 6.

I should also note that the syntax discussed here
is unlikely to be the final one. In particular,
"#import" is already in use as an extension on some
platforms. The current working directives for letting
definitions in and out are "#in" and "#out".
The choice of syntax for the block delimiters is
even less settled.

Daveed

Daveed Vandevoorde

Apr 21, 2003, 1:31:32 PM
da...@boost-consulting.com (David Abrahams) wrote:
> comp.std.c+...@nmhq.net (Niklas Matthies) writes:
>
> > Hmm... Does that mean that for every exported macro, the preprocessor
> > remembers its "closure", and expands the macro within that closure?
> > Like in:
> >
> > #<
> > #define X 5
> > #define FIVE X
> > #export FIVE
> > #>
> >
> > #<
> > #define X 6
> > #define SIX X
> > #export SIX
> > #>
> >
> > #define X 7
> >
> > int five = FIVE; // initialized to 5
> > int six = SIX; // initialized to 6
>
> Yes, that was the conclusion that the evolution working group came to
> in Oxford.

I'm afraid not. The only similar conclusion was
that while expanding FIVE (or SIX), the #in filter
would not play. However, there is no change in
the macro lookup rules.

Daveed

Gabriel Dos Reis

Apr 21, 2003, 2:25:34 PM
da...@boost-consulting.com (David Abrahams) writes:

| comp.std.c+...@nmhq.net (Niklas Matthies) writes:
|
| > Hmm... Does that mean that for every exported macro, the preprocessor
| > remembers its "closure", and expands the macro within that closure?
| > Like in:
| >
| > #<
| > #define X 5
| > #define FIVE X
| > #export FIVE
| > #>
| >
| > #<
| > #define X 6
| > #define SIX X
| > #export SIX
| > #>
| >
| > #define X 7
| >
| > int five = FIVE; // initialized to 5
| > int six = SIX; // initialized to 6
|
| Yes, that was the conclusion that the evolution working group came to
| in Oxford.

Really? I do not remember that we came to that conclusion; at least, it
is missing from my notes.


--
Gabriel Dos Reis, g...@integrable-solutions.net

Allan W

Apr 21, 2003, 2:39:35 PM
comp.std.c+...@nmhq.net (Niklas Matthies) wrote

> Hmm... Does that mean that for every exported macro, the preprocessor
> remembers its "closure", and expands the macro within that closure?

> If it doesn't, then, in Balog Pal's example, one would need to know
> how the SendMessage macro is implemented, and possibly export further
> macros for it to be usable, which works against encapsulation.

I like (well, don't hate) the ideas you posted. However, let's bear
in mind that the current "problem" with macros is exactly that --
encapsulation. The problems you pointed out would not make that
any worse -- not by a long shot.

> It's my impression that some namespace concept would be more generally
> useful, and having each macro be expanded within the namespace it was
> defined in. One could then write:
>
> #begin WIN32
> # include "windows.h" // everything defined here now lives
> # // in preprocessor namespace WIN32
> #end WIN32
>
> #define SendMessage WIN32(SendMessage) // use function macro syntax
> // for accessing namespace
>
> Or, to adapt the other example above:
>
> #define X 7
>
> #begin A
> # define X 5
> #end A
> #define FIVE A(X)

This next line seems garbled in transmission -- you meant "#begin B"?


> #begin , and makes for easy nesting of namespaces.
> # define X 6
> #end B
> #define SIX B(X)
>
> int five = FIVE; // initialized to 5
> int six = SIX; // initialized to 6
>
> The function macro syntax is used to avoid introducing a new scope
> resolution operator, which would either break old code or require
> extending the basic character set. Note that it is consistent with the
> current function macro expansion mechanism that the outer definition
> of X as 7 doesn't affect the expansion of FIVE and SIX.
>
> Actually, the above is close to equivalent to:
>
> #define A_unique_prefix_X 5
> #define A(token) A_unique_prefix_##token
> #define FIVE A(X)
>
> #define B_unique_prefix_X 6
> #define B(token) B_unique_prefix_##token
> #define SIX B(X)
>
> The only difference is that A_unique_prefix_X and B_unique_prefix_X
> are not accessible other than through A(X) and B(X), and that the
> definitions of X need not be aware that they live in a namespace
> (i.e. you can retroactively place them in a namespace just by
> inclusion). This also makes for easy nesting of namespaces.

I see the point of this -- you can enable "WIN32" macros where you
need them, then switch to "WINSOCK" macros, and then turn them both off.
But maybe this is more complicated than we really need?

I'd like some way to separate macros into three (or maybe four) groups:

* Macros that are part of the C++ standard -- let's call these STD
* Additional macros defined by the implementation -- let's call these VENDOR
* Possibly a group specifically for third-party libraries? -- call them LIB
* All others (third-party libraries and/or users) -- let's call these USR

(These would be namespaces in the language sense, but let's please
avoid calling them namespaces because of confusion with the C++
concept of a namespace. "Macro group" would work better, IMHO.)

And I'd like to be able to turn expansion on or off for each of these
groups independently, and later restore the settings to what they
previously were. This can be done with a "save" directive, which saves
the current settings and optionally sets new ones, and a "restore"
directive.

#save STD=ON, VENDOR=ON, USR=OFF // Turns on "standard" macros only

// Notice that this class uses "plain" member names. It
// doesn't have to "worry" that name, ssn, public, and so
// on are actually user-defined macros!
class employee {
char name[80+1];
long ssn;
public:
// and so on
};

// #ifdef continues to work normally -- only expansion is suppressed
#ifndef NEWEMPLOYEE

// Note that "user" macros can be defined,
// even if they're not currently being expanded
#define NEWEMPLOYEE new employee;
#endif

#restore // Returns to previous levels (all three settings restored)

Exactly how macros are marked as "standard" or "vendor" is of course
up to the individual vendors. I would NOT expect this to be documented,
because we wouldn't want users or third-party libraries to use this in
their own headers.

(If we have a "LIB" group, we would of course have to specify how LIB
macros are marked. Better I think to discourage LIB macros in the
first place.)

Francis Glassborow

Apr 23, 2003, 5:37:05 AM
In article <3ea2...@andromeda.datanet.hu>, Balog Pal <pa...@lib.hu>
writes

>But I don't think development in different areas should be mutually exclusive.
>At least when we're talking about 'cheap' stuff.

There is almost nothing that is cheap. In the very least it takes dozens
of hours of committee time which is a precious resource.

>

--
ACCU Spring Conference 2003 April 2-5
The Conference you should not have missed
ACCU Spring Conference 2004 Late April
Francis Glassborow ACCU

Niklas Matthies

Apr 23, 2003, 9:52:39 AM
On 2003-04-21 18:39, Allan W <all...@my-dejanews.com> wrote:
> comp.std.c+...@nmhq.net (Niklas Matthies) wrote
[...]
>> It's my impression that some namespace concept would be more generally
>> useful, and having each macro be expanded within the namespace it was
>> defined in. One could then write:
>>
>> #begin WIN32
>> # include "windows.h" // everything defined here now lives
>> # // in preprocessor namespace WIN32
>> #end WIN32
>>
>> #define SendMessage WIN32(SendMessage) // use function macro syntax
>> // for accessing namespace
>>
>> Or, to adapt the other example above:
>>
>> #define X 7
>>
>> #begin A
>> # define X 5
>> #end A
>> #define FIVE A(X)
>
> This next line seems garbled in transmission -- you meant "#begin B"?

Right. I don't know what happened.

I think it is the user of the macros who should decide which macros
are placed into his namespace, not the macro provider. Which means
there should be an import/using mechanism rather than an export
mechanism. Also, the expansions of imported macros need to be able to
access non-imported macros from the namespace they were imported from.

-- Niklas Matthies

Francis Glassborow

Apr 23, 2003, 12:38:17 PM
In article <slrnbabv5o.r9g.comp...@nmhq.net>, Niklas
Matthies <comp.std.c+...@nmhq.net> writes

>I think it is the user of the macros who should decide which macros
>are placed into his namespace, and not the macro provider. Which means
>there should be an import/using mechanism and not an export mechanism.
>Also, the expansions of imported macros need to be able to access
>non-imported macros from the namespace they were imported from.

Not really. As long as we have a way to provide fences, suppliers and
users can co-operate.

--
ACCU Spring Conference 2003 April 2-5
The Conference you should not have missed
ACCU Spring Conference 2004 Late April
Francis Glassborow ACCU


LLeweLLyn

Apr 23, 2003, 3:12:55 PM
Francis Glassborow <francis.g...@ntlworld.com> writes:

> In message <3e9f...@andromeda.datanet.hu>, Balog Pal <pa...@lib.hu>
> writes

> >If you start adding features to the preprocessor -- which I think is a good thing -- don't be shy. I'm sure people could think up at least a dozen
> >directives that are easy to implement and useful.
>
> Hm... I suspect that we do not want to do very much to the preprocessor,
> sort of minimalist action to control the scope of directives.
>
> >Or tweak even the existing ones, like adding vararg support to
> >#define.

I would like to see this for the sake of improved C99 => C++
compatibility.

Also, since the preprocessor is not going away, I would like to see an
alternative to function-like macros that does not require such
things as the do{}while(0) idiom and parentheses around all
arguments. I can't think of a good way to express this (yet), but
that's what I want.

[snip]


> Possible if we decide to incorporate it from C99. However we are much
> more likely to do serious work on providing good tools for
> meta-programming.

[snip]

Despite what I said above, if preprocessor improvements fall by the
wayside in order to give more time to good tools for
meta-programming, I'll be pleased overall. Of course cpp is a
valid tool for some meta-programming, but I feel the C++ community
deserves better.


Ken Hagan

Apr 28, 2003, 3:06:55 AM
LLeweLLyn wrote:
>
> Also, since the preprocessor is not going away, I would like to see an
> alternative to function-like macros that does not require such
> things as the do{}while(0) idiom and parentheses around all
> arguments. I can't think of a good way to express this (yet), but
> that's what I want.

MSVC has an intrinsic called __noop which doesn't evaluate its
arguments. If you want something to go away in release builds,
you just write #define TRACE __noop. (Disclaimer: I've only just
noticed it.)

It also has a predefined macro called __COUNTER__ that expands to
a unique (per compilation unit) integer. You can use this to
create unique names for things whose actual names you don't much
care about.


Allan W

Apr 28, 2003, 3:07:04 AM
LLeweLLyn <llewe...@xmission.dot.com> wrote

> Also, since the preprocessor is not going away, I would like to see an
> alternative to function-like macros that does not require such
> things as the do{}while(0) idiom and parentheses around all
> arguments. I can't think of a good way to express this (yet), but
> that's what I want.

How about making them functions -- but still expand inline?
We could call these "inline functions."

:-)


Daveed Vandevoorde

May 2, 2003, 9:21:04 AM
all...@my-dejanews.com (Allan W) wrote in message news:<7f2735a5.03042...@posting.google.com>...

> LLeweLLyn <llewe...@xmission.dot.com> wrote
> > Also, since the preprocessor is not going away, I would like to see an
> > alternative to function-like macros that does not require such
> > things as the do{}while(0) idiom and parentheses around all
> > arguments. I can't think of a good way to express this (yet), but
> > that's what I want.
>
> How about making them functions -- but still expand inline?
> We could call these "inline functions."

I presented a set of language extensions I'm working
on at the recent ACCU conference (which, BTW, turned
out to be a very cool event). Somewhat unexpectedly,
it was also presented before a part of WG21/J16.

I put up some notes at:
http://vandevoorde.com/Daveed/News/Archives/000015.html

Daveed

Mike Conley

May 2, 2003, 11:50:55 AM
LLeweLLyn <llewe...@xmission.dot.com> wrote in
news:m1u1cr4...@localhost.localdomain:

> In message <3e9f...@andromeda.datanet.hu>, Balog Pal <pa...@lib.hu>
> writes
> >If you start adding features to the preprocessor -- which I think is a
> >good thing -- don't be shy. I'm sure people could think up at least a
> >dozen directives that are easy to implement and useful.
>

> >Or tweak even the existing ones, like adding vararg support to
> >#define.
>
> I would like to see this for the sake of improved C99 => C++
> compatibility.
>

It would also be nice if we actually made them a useful construct (I do
not consider supplying arguments to printf useful :). We could do that
by adding support for recursive macros to the preprocessor. E.g.,

#defrec SUM(ARG1,...) ARG1 + SUM(__VA_ARGS__)

A call to SUM would expand to the sum of its arguments.

The current preprocessor's inability (by design) to recursively expand
function-like macros is, I think, the primary obstacle to useful
preprocessor metaprogramming (not that this should necessarily apply only
to function-like macros...).


--
Mike Conley



Pavel Vozenilek

May 3, 2003, 10:16:50 AM
goo...@vandevoorde.com (Daveed Vandevoorde) wrote in message news:<52f2f9cd.0305...@posting.google.com>...
[snip]

> I presented a set of language extensions I'm working
> on at the recent ACCU conference (which, BTW, turned
> out to be a very cool event). Somewhat unexpectedly,
> it was also presented before a part of WG21/J16.
>
> I put up some notes at:
> http://vandevoorde.com/Daveed/News/Archives/000015.html
>
Will it be possible to read configuration files from metacode?

/Pavel


Allan W

May 3, 2003, 10:16:40 AM
goo...@vandevoorde.com (Daveed Vandevoorde) wrote

> I presented a set of language extensions I'm working
> on at the recent ACCU conference (which, BTW, turned
> out to be a very cool event). Somewhat unexpectedly,
> it was also presented before a part of WG21/J16.
>
> I put up some notes at:
> http://vandevoorde.com/Daveed/News/Archives/000015.html

I found this hard to read -- not your writing style, but the HTML
page. Your style sheet explicitly forces the body of the text to
show up at 11-point font, overriding my browser's setting. On my
screen, 11-point font is very difficult to read. I had to copy it
into a word processor just to read it.

The metacode concept itself seems interesting. I'm not sure
that I would consider it a replacement for the preprocessor.
We'll still need #include, #ifdef, and so on.

Also, I wonder how many compilers are going to be able to
implement all of this. It's quite extensive -- more so than
some high-level languages. Would it be possible to implement
metacode as a separate pre-compile step? If it has to run at
the same time as the main translator, there could be
difficulties (or at least performance issues) on platforms
where the compiler is already considered a big program.

I had some trouble understanding the second example in "Code
Injection Primitives" -- the one where you inject field names.
Is that supposed to generate
float fieldfloat;
int fieldint;
char fieldchar;
and so on for all known data types? Would that include pointers,
pointers to pointers, pointers to pointers to pointers, ad
nauseam? (If so, how would you end the infinite recursion?)
Would it include user-defined classes? (If so, and you used it
within the definition of a class -- would it include the class
still being defined?) Will the code require knowledge of the
internals of the current compiler -- how the symbol table is
laid out, and so on?

Will there be such a thing as "undefined behavior -- no diagnostic
required" for metacode functions? Since they operate at compile
time, it could easily cause the compiler itself to abort -- this
would make debugging them somewhat difficult.

I must say, I like this whole idea a lot better than some of the
contortions being used to do the same things with templates.
Especially things like is_lvalue and is_constant. I'm guessing
that typeof() would be a predefined metacode function?


LLeweLLyn

May 3, 2003, 8:44:09 PM
all...@my-dejanews.com (Allan W) writes:

> LLeweLLyn <llewe...@xmission.dot.com> wrote
> > Also, since the preprocessor is not going away, I would like to see an
> > alternative to function-like macros that does not require such
> > things as the do{}while(0) idiom and parentheses around all
> > arguments. I can't think of a good way to express this (yet), but
> > that's what I want.
>
> How about making them functions -- but still expand inline?
> We could call these "inline functions."
>
> :-)

Rewrite this macro as an inline function:

#define TRACE() do {                                             \
    if (debug_this_module) {                                     \
        logfile << __FILE__ << ":" << __LINE__ << ":" << endl;   \
        logfile << dump_stack_trace();                           \
    }                                                            \
} while (0)

:-)

Paul Mensonides

May 3, 2003, 8:44:39 PM
Mike Conley wrote:
> LLeweLLyn <llewe...@xmission.dot.com> wrote in
> news:m1u1cr4...@localhost.localdomain:
>
> > In message <3e9f...@andromeda.datanet.hu>, Balog Pal <pa...@lib.hu>
> > writes
> > >If you start adding features to the preprocessor -- which I think is a
> > >good thing -- don't be shy. I'm sure people could think up at least a
> > >dozen directives that are easy to implement and useful.
> >
> > >Or tweak even the existing ones, like adding vararg support to
> > >#define.
> >
> > I would like to see this for the sake of improved C99 => C++
> > compatibility.
> >
>
> It would also be nice if we actually made them a useful construct (I
> do not consider supplying arguments to printf useful :) We could do
> that by adding support for recursive macros to the preprocessor. Eg,
>
> #defrec SUM(ARG1,...) ARG1 + SUM(__VA_ARGS__)
>
> A call to SUM would expand to the sum of its arguments.
>
> The current preprocessor's inability (by design) to recursively expand
> function-like macros is, I think, the primary obstacle to useful
> preprocessor metaprogramming (not that this should necessarily apply
> only to function-like macros...).

Useful preprocessor metaprogramming already exists. However, the primary
obstacle is not the lack of recursion -- that can be abstracted -- rather,
the primary obstacle is preprocessor conformance. Dealing with variadic
parameters is actually quite easy, including doing what is intended above.
The only difficult part of dealing with variadics is detecting when there
are no parameters left -- which cannot be done in a "general" sense,
because token-pasting of unrelated tokens is undefined behavior.

Regards,
Paul Mensonides

Mike Conley

May 3, 2003, 8:45:03 PM
Mike Conley <conle...@osu.edu> wrote in
news:Xns9367DD6DE352...@206.127.4.10:

> #defrec SUM(ARG1,...) ARG1 + SUM(__VA_ARGS__)

I probably should have said something about terminating the recursion...
Here are a few options. I'm sure people who know more about the
preprocessor than I do have better ideas.


1) Allow macros to be overloaded on the number of arguments, eg,
#define SUM() 0
#define SUM(arg1) arg1
#defrec SUM(arg1,...) arg1 + SUM(__VA_ARGS__)
....

would create 3 SUM macros, with the last being recursively expanded.
This could be taken a step further to allow a function-like and an
object-like macro to share a name, though I doubt that's a good idea.


2) Add an operator (called "empty" below) to test whether or not
__VA_ARGS__ contains additional arguments. It would expand to 1 whenever
its argument would expand to whitespace only. For example (let's assume
that #defrec introduces two-stage expansion of macros -- directives are
evaluated within the expanded macro body for each expansion):

#defrec SUM(ARG1,...) \
#if empty(__VA_ARGS__) ARG1 \
#else ARG1 + SUM(__VA_ARGS__) \
#endif

This doesn't allow a call to SUM with no arguments to expand to 0,
because SUM requires at least one argument. Not a big loss here, but
certainly less general than (1).

3) Add some general comma-separated list manipulation operators. For
example, if we had a preprocessor operator split defined such that

split(index, list) returns the first index elements of list and
split(-index, list) returns the last L - index elements, where L is the
length of the list

then we could define SUM:

#defrec SUM(...) \
#if empty(__VA_ARGS__) 0 \
#else split(1, __VA_ARGS__) + SUM(split(-1, __VA_ARGS__)) \
#endif


--
Mike Conley

Daveed Vandevoorde
May 4, 2003, 6:09:52 AM
pavel_v...@yahoo.co.uk (Pavel Vozenilek) wrote:

> goo...@vandevoorde.com (Daveed Vandevoorde) wrote:
> [snip]
> > I presented a set of language extensions I'm working
> > on at the recent ACCU conference (which, BTW, turned
> > out to be a very cool event). Somewhat unexpectedly,
> > it was also presented before a part of WG21/J16.
> >
> > I put up some notes at:
> > http://vandevoorde.com/Daveed/News/Archives/000015.html
> >
> Will it be possible to read configuration files from metacode?

Good question. Some sort of I/O would be nice,
wouldn't it (I'm thinking of a very simple model)?
We'll see. It's on my list of options, but not at
the top right now.

Daveed

Daveed Vandevoorde
May 4, 2003, 6:10:53 AM
all...@my-dejanews.com (Allan W) wrote:
> goo...@vandevoorde.com (Daveed Vandevoorde) wrote
> > I presented a set of language extensions I'm working
> > on at the recent ACCU conference (which, BTW, turned
> > out to be a very cool event). Somewhat unexpectedly,
> > it was also presented before a part of WG21/J16.
> >
> > I put up some notes at:
> > http://vandevoorde.com/Daveed/News/Archives/000015.html
>
> I found this hard to read -- not your writing style, but the HTML
> page. Your style sheet explicitly forces the body of the text to
> show up at 11-point font, overriding my browser's setting. On my
> screen, 11-point font is very difficult to read. I had to copy it
> into a word processor just to read it.

I'll have to look into that. Someone e-mailed me
a similar comment, but the reply address bounced.

> The metacode concept itself seems interesting. I'm not sure
> that I would consider it a replacement for the preprocessor.
> We'll still need #include, #ifdef, and so on.

It's not meant to replace the preprocessor per se,
but to make metaprogramming a "first class" paradigm
in C++.

> Also, I wonder how many compilers are going to be able to
> implement all of this.

I'm careful to keep it implementable. The part I've
implemented so far turned out to be remarkably smooth:
C++ compilers already have to do many of the subtasks
for other reasons (e.g., constant-expression evaluation).

> It's quite extensive -- more so than
> some high-level languages. Would it be possible to implement
> metacode as a separate pre-compile step?

I think so (though it has its disadvantages from a
user-interface perspective).

> If it has to run at
> the same time as the main translator, there could be
> difficulties (or at least performance issues) on platforms
> where the compiler is already considered a big program.

So far it hasn't been a problem.

> I had some trouble understanding the second example in "Code
> Injection Primitives" -- the one where you inject field names.
> Is that supposed to generate
> float fieldfloat;
> int fieldint;
> char fieldchar;
> and so on for all known data types?

You're talking about the "define_fields" example, right?
The input to this metacode routine is an array of types.
If the array of types is { int, int, X, char(*)[7] }, it
essentially injects:

int field1;
int field2;
X field3;
char (*field4)[7];


[...]


> Will there be such a thing as "undefined behavior -- no diagnostic
> required" for metacode functions? Since they operate at compile
> time, it could easily cause the compiler itself to abort -- this
> would make debugging them somewhat difficult.

In my implementation, metacode routines run in a "protected
environment": They are interpreted. So the compiler normally
doesn't abort: It gives you a meaningful error message.

There is also a limit (that can be modified) on how many
branches (including function calls and returns) metacode can
execute (to deal with infinite loops).

> I must say, I like this whole idea a lot better than some of the
> contortions being used to do the same things with templates.

Yes, that's really the motivation. Template metaprogramming
has proven (to me at least) that metaprogramming is something
C++ programmers want to do (perhaps especially library writers).
Now let's work on providing a good tool for that purpose.

> Especially things like is_lvalue and is_constant. I'm guessing
> that typeof() would be a predefined metacode function?

Typeof is already part of my implementation, and I'm about to
add the new "decltype" operator that came out of Oxford.

Daveed

Paul Mensonides
May 4, 2003, 11:58:40 AM
Mike Conley wrote:
> Mike Conley <conle...@osu.edu> wrote in
> news:Xns9367DD6DE352...@206.127.4.10:
>
>> #defrec SUM(ARG1,...) ARG1 + SUM(__VA_ARGS__)
>
> I probably should have said something about terminating the
> recursion.... Here are a few options. I'm sure people who know more
> about the preprocessor than I have better ideas.
>
>
> 1) Allow macros to be overloaded on the number of arguments, eg,
> #define SUM() 0
> #define SUM(arg1) arg1
> #defrec SUM(arg1,...) arg1 + SUM(__VA_ARGS__)
> ....

Given variadic macros, it is already possible to count the arguments. (The only
border case is that you can't, in a completely general fashion, tell the
difference between zero arguments and one argument.) Given that this is
possible, it is also possible to make a generalized overloader for macros based
on the number of arguments.

> 2) Add an operator (called "empty" below) to test whether or not
> __VA_ARGS__ contains additional arguments. It would expand to 1
> whenever its argument would expand to whitespace only. For example
> (let's assume that #defrec introduces two-stage expansion of macros
> -- directives are evaluated within the expanded macro body for each
> expansion):
>
> #defrec SUM(ARG1,...) \
> #if empty(__VA_ARGS__) ARG1 \
> #else ARG1 + SUM(__VA_ARGS__) \
> #endif

This one is also unnecessary *if* token-pasting were well-defined for situations
like this (this being the primary reason that you cannot detect nullary vs. unary
invocations):

+ ## == // undefined behavior

If it were defined, as either retokenization or a no-op, then you could detect
emptiness like this:

#define CAT(a, b) PRIMITIVE_CAT(a, b)
#define PRIMITIVE_CAT(a, b) a ## b

#define SPLIT(i, ...) \
PRIMITIVE_CAT(SPLIT_, i)(__VA_ARGS__) \
/**/
#define SPLIT_0(a, ...) a
#define SPLIT_1(a, ...) __VA_ARGS__

#define IS_NULLARY(...) \
SPLIT(0, CAT(IS_NULLARY_R_, IS_NULLARY_C __VA_ARGS__)) \
/**/
#define IS_NULLARY_C() 1
#define IS_NULLARY_R_1 1,
#define IS_NULLARY_R_IS_NULLARY_C 0,

#define IS_EMPTY(...) IS_EMPTY_I(__VA_ARGS__)
#define IS_EMPTY_I(...) \
IS_NULLARY( \
IS_EMPTY_ ## __VA_ARGS__ ## IS_EMPTY \
) \
/**/
#define IS_EMPTY_IS_EMPTY ()

IS_EMPTY( ) // 1
IS_EMPTY(a, b) // 0

However, this breaks down in situations like this:

IS_EMPTY( ++ ) // undefined behavior

There are other methods of detecting "emptiness," but they all have some kind of
input that is invalid--which makes them non-general.

> This doesn't allow a call to SUM with no arguments to expand to 0,
> because SUM requires at least one argument. Not a big loss here, but
> certainly less general than (1).

If token-pasting had well-defined semantics for unrelated tokens--rather than
undefined behavior--it would be possible to count the number of arguments and
fake overloading on number of parameters.

> 3) Add some general comma-separated list manipulation operators. For
> example, if we had a preprocessor operator split defined st
>
> split(index, list) returns the first index elements of list and
> split(-index, list) returns the last L - index elements, where L is
> the length of the list
>
> then we could define SUM:
>
> #defrec SUM(...) \
> #if empty(__VA_ARGS__) 0 \
> #else split(1, __VA_ARGS__) + SUM(split(-1, __VA_ARGS__)) \
> #endif

We can already do all of this stuff with variadic macros as is--except general
purpose discrimination between empty/non-empty. All that we need for that is
well-defined token-pasting semantics with any two tokens as operands.

Regards,
Paul Mensonides

Mike Conley
May 5, 2003, 3:39:26 PM
"Paul Mensonides" <leav...@attbi.com> wrote in
news:8Lzsa.180856$Si4.1...@rwcrnsc51.ops.asp.att.net:

> Useful preprocessor metaprogramming already exists.

Yes, but the implementation seems to consist in large part of the type of
repetition that recursive macros would make unnecessary. And a good
portion of the library is dedicated to making up for the preprocessor's
lack thereof. True recursive macros would be more general and (probably)
less complex than anything that can be implemented in the current
preprocessor.

--
Mike Conley


Mike Conley
May 5, 2003, 4:58:03 PM
"Paul Mensonides" <leav...@attbi.com> wrote in
news:ZP_sa.724059$F1.93803@sccrnsc04:

> Given variadic macros, it is already possible to count the arguments.

But you can only count up to some predetermined number of arguments.
Practically speaking, the limit would be larger if this were implemented in
the preprocessor itself.

> Given that this is possible, it is also possible to make a generalized
> overloader for macros based on the number of arguments.

That would be neat. The syntax would be a bit unpleasant, though, compared
to native preprocessor support for overloading. Native overloaded macros
would also provide a workaround for the token pasting problem you've
mentioned:

#define COUNT_ARGS() 0
#defrec COUNT_ARGS(A, ...) 1 + COUNT_ARGS(__VA_ARGS__)

Argument counting, defined this way, doesn't need to rely on token pasting.
And you get a safe emptiness detector and a list splitter for free :)

--
Mike Conley

Pavel Vozenilek
May 8, 2003, 1:58:16 PM
goo...@vandevoorde.com (Daveed Vandevoorde) wrote in message news:<52f2f9cd.03050...@posting.google.com>...
..

> > > I presented a set of language extensions I'm working
> > > on at the recent ACCU conference (which, BTW, turned
> > > out to be a very cool event). Somewhat unexpectedly,
> > > it was also presented before a part of WG21/J16.
> > >
> > > I put up some notes at:
> > > http://vandevoorde.com/Daveed/News/Archives/000015.html
> > >
> > Will it be possible to read configuration files from metacode?
>
> Good question. Some sort of I/O would be nice, wouldn't it
..

Yes, it would. Contrived example: metacode which produces programmer
documentation extracted from source code.

/Pavel


Paul Mensonides
May 8, 2003, 1:58:09 PM
Mike Conley wrote:
> "Paul Mensonides" <leav...@attbi.com> wrote in
> news:ZP_sa.724059$F1.93803@sccrnsc04:
>
>> Given variadic macros, it is already possible to count the arguments.
>
> But you can only count up to some predetermined number of arguments.
> Practically speaking, the limit would be larger if this were
> implemented in the preprocessor itself.

I can count up to 2^512 arguments with my current "strict" implementation of the
Boost pp-lib. Granted, it isn't the limit that matters. Rather, it's that you
have to count them at all. :)

>> Given that this is possible, it is also possible to make a
>> generalized overloader for macros based on the number of arguments.
>
> That would be neat. The syntax would be a bit unpleasant, though,
> compared to native preprocessor support for overloading. Native
> overloaded macros would also provide a workaround for the token
> pasting problem you've mentioned:
>
> #define COUNT_ARGS() 0
> #defrec COUNT_ARGS(A, ...) 1 + COUNT_ARGS(__VA_ARGS__)

No, they wouldn't. The token pasting problem that I mentioned only exists because
there is no difference between ( placemarker ) and ( argument ). Therefore,
this would be ambiguous:

#define MACRO()
#define MACRO(a)

MACRO() // argument == placemarker?
// or, no argument at all?

Furthermore, the problems compound if you separate the "recursive" style macros
from the overloading idea. Specifically, if an identifier token found during the
rescan of a macro's replacement list refers to a macro that is currently
"disabled," the identifier token itself is permanently disabled--even without an
attempted invocation, such as:

#define MACRO(x) x

MACRO( MACRO )( 1 ) // MACRO( 1 )

No invocation is needed in order to disable specific identifiers. So, either
the entire overload set must be disabled, or the overloading can only work when
macros are recursive.

#define A(a, b) ...
#define A(x) x

A(A)(1, 2) // ???

> Argument counting, defined this way, doesn't need to rely on token
> pasting. And you get a safe emptiness detector and a list splitter
> for free :)

I like the overloading on number of arguments idea. I don't like the variadic
splitter because that is completely unnecessary:

#define first(...) first_i(__VA_ARGS__,)
#define first_i(x, ...) x

#define rest(...) rest_i(__VA_ARGS__)
#define rest_i(x, ...) __VA_ARGS__

Regards,
Paul Mensonides



Paul Mensonides
May 9, 2003, 5:52:11 AM
Mike Conley wrote:
> "Paul Mensonides" <leav...@attbi.com> wrote in
> news:8Lzsa.180856$Si4.1...@rwcrnsc51.ops.asp.att.net:
>
> > Useful preprocessor metaprogramming already exists.
>
> Yes, but the implmentation seems to consist in large part of the type
> of repetition that recursive macros would make unnecessary. And a
> good portion of the library is dedicated to making up for the
> preprocessor's lack thereof. True recursive macros would be more
> general and (probably) less complex than anything that can be
> implemented in the current preprocessor.

That repetition is only necessary because most preprocessors are not very
compliant. Given a strict preprocessor, I can abstract recursion to a single
set of macros. With that set, recursion can be exponential, meaning that with
100 macros, you can get 2^100 recursions--i.e., more than you'd ever want--and
the same set can be used for everything, including some "user-defined" recursive
macro such as "SUM."

BTW, I'm not disagreeing with your ideas, I'm only saying that they can already
be accomplished--though slightly less "cleanly".

Regards,
Paul Mensonides

Allan W
May 9, 2003, 10:29:26 AM
> all...@my-dejanews.com (Allan W) wrote:
> > Will there be such a thing as "undefined behavior -- no diagnostic
> > required" for metacode functions? Since they operate at compile
> > time, it could easily cause the compiler itself to abort -- this
> > would make debugging them somewhat difficult.

goo...@vandevoorde.com (Daveed Vandevoorde) wrote


> In my implementation, metacode routines run in a "protected
> environment": They are interpreted. So the compiler normally
> doesn't abort: It gives you a meaningful error message.
>
> There is also a limit (that can be modified) on how many
> branches (including function calls and returns) metacode can
> execute (to deal with infinite loops).

If this were part of the standard, it would have to run efficiently
on platforms that can't have protected environments -- such as
processors that don't provide such a facility.


Allan W
May 9, 2003, 10:30:04 AM
> > LLeweLLyn <llewe...@xmission.dot.com> wrote
> > > Also, since the preprocessor is not going away, I would like to see an
> > > alternative to function like macros that do not require such
> > > things as the do{}while(0) idiom and parentheses around all
> > > arguments. I can't think of a good way to express this (yet) but
> > > that's what I want.

> all...@my-dejanews.com (Allan W) writes:
> > How about making them functions -- but still expand inline?
> > We could call these "inline functions."
> >
> > :-)

LLeweLLyn <llewe...@xmission.dot.com> wrote


> Rewrite this macro as an inline function:
>
> #define TRACE() do{\
> if(debug_this_module) {\
> logfile << __FILE__ << ":" << __LINE__ << ":" << endl; logfile\
> << dump_stack_trace();\
> }\
> }while(0)
>
> :-)

// USAGE: TRACE(__LINE__);
void TRACE(int line=0) {
if (debug_this_module)
logfile << __FILE__ << ": " << line << ":\n" << dump_stack_trace();
}

If dump_stack_trace can be modified to accept a parameter int ignore=0,
then TRACE() could be changed to pass in 1, meaning "ignore the very
first stack frame".

:-)


Allan W
May 9, 2003, 10:30:18 AM
Mike Conley <conle...@osu.edu> wrote

> if we had a preprocessor operator split defined such that
>
> split(index, list) returns the first index elements of list and
> split(-index, list) returns the last L - index elements, where L is the
> length of the list
>
> then we could define SUM:
>
> #defrec SUM(...) \
> #if empty(__VA_ARGS__) 0 \
> #else split(1, __VA_ARGS__) + SUM(split(-1, __VA_ARGS__)) \
> #endif

How about
#include <lisp>


Mike Conley
May 9, 2003, 11:52:44 AM
Paul Mensonides <leav...@attbi.com> wrote in
news:WuAta.745871$L1.211469@sccrnsc02:

> I can count up to 2^512 arguments with my current "strict"
> implementation of the Boost pp-lib. Granted, it isn't the limit that
> matters. Rather, it that you have to count them at all. :)

I thought the limit was much smaller. Guess I'll have to take a closer
look at it. But, as you say, it's the need to count, not the limit, that's
bothersome.



> > Native
> > overloaded macros would also provide a workaround for the token
> > pasting problem you've mentioned:

> No the wouldn't. The token pasting problem that I mentioned only
> exists because there is no difference between ( placemarker ) and (
> argument ). Therefore, this would be ambiguous:
>
> #define MACRO()
> #define MACRO(a)
>
> MACRO() // argument == placemarker?
> // or, no argument at all?

Ahh... I hadn't thought of that (obviously :) This should probably be a
call to the nullary MACRO. Users can explicitly request the unary MACRO
with an empty argument easily enough:

#define nothing
MACRO(nothing)


> Furthermore, the problems compound if you separate the "recursive"
> style macros from the overloading idea. Specifically, if an
> identifier token that is found during the rescan of a macro's
> replacement list that refers to a macro that is currently "disabled,"
> the identifier token itself is permanently disabled--this is even
> without an attempted invocation, such as:
>
> #define MACRO(x) x
>
> MACRO( MACRO )( 1 ) // MACRO( 1 )

Right. If you want this to expand to 1, you'd need to use a recursive
macro. Recursive macros would never be disabled.


> No invocation is needed in order to disable specific identifiers. So,
> either the entire overload set must be disabled, or the overloading
> can only work when macros are recursive.

Probably the thing to do is disable only the nonrecursive overloads.
For example:

#defrec A(arg1) arg1
#define A(arg1, arg2) arg1(arg2)

A(1) // 1
A(A,A)(1) // A(A)(1) -> A(1) -> 1 (because unary A is recursive)
A(A,A)(1,1) // A(A)(1,1) -> A(1,1) (stops here -- binary A is disabled)

More specifically, let's suppose we're talking about a macro M. M_N
represents a definition of M taking N arguments (let's not worry about
variadics -- similar rules would apply to them).

Within the body of M_N, M_N is disabled iff M_N is not recursive. M_K is
available (whether M_N is recursive or not) for all K != N.

If the name M is used in a context where an ordinary macro would be
disabled without an invocation, then, for all N, M_N is disabled iff M_N is
not recursive.

> I like the overloading on number of arguments idea. I don't like the
> variadic splitter because that is completely unnecessary:

The point was that, with recursive macros and overloading, you could define
a fully general list splitter (and other useful list manipulation macros,
too :).

--
Mike Conley

Daveed Vandevoorde
May 9, 2003, 4:13:19 PM
(I removed comp.lang.c++.moderated from the list of
target groups.)

all...@my-dejanews.com (Allan W) wrote:
> > all...@my-dejanews.com (Allan W) wrote:
> > > Will there be such a thing as "undefined behavior -- no diagnostic
> > > required" for metacode functions? Since they operate at compile
> > > time, it could easily cause the compiler itself to abort -- this
> > > would make debugging them somewhat difficult.
>
> goo...@vandevoorde.com (Daveed Vandevoorde) wrote
> > In my implementation, metacode routines run in a "protected
> > environment": They are interpreted. So the compiler normally
> > doesn't abort: It gives you a meaningful error message.
> >
> > There is also a limit (that can be modified) on how many
> > branches (including function calls and returns) metacode can
> > execute (to deal with infinite loops).
>
> If this was part of the standard, it would have to run efficiently
> on platforms that can't have protected environments -- such as
> processors that don't provide such a function.

I'm sorry: I didn't mean to imply that the "protected
environment" is a platform feature. I just wrote an
interpreter for the internal representation of the
compiler, and that interpreter protects (or _should_
protect ;-) the compiler against dereferencing null
pointers and the like. (I.e., by "protected environment"
I meant "embedded interpreter".)

I don't think my interpreter is particularly efficient
at this point, but it beats template metaprogramming
hands down. (Not that I think the standard can mandate
absolute performance numbers.)

Daveed


Mike Conley
May 10, 2003, 2:38:42 PM
"Paul Mensonides" <leav...@attbi.com> wrote in
news:SjAta.745818$L1.210663@sccrnsc02:

> Mike Conley wrote:

> That repetition is only necessary because most preprocessors are not
> very compliant. Given a strict preprocessor, I can abstract recursion
> to a single set of macros. With that set, recursion can be
> exponential, meaning that with 100 macros, you can get 2^100
> recursions--e.g. more than you'd ever want and the same set can be
> used for everything--including some "user-defined" recursive macro
> such as "SUM."

Cool. I'm going to have to take a closer look -- I didn't realize it
was so capable :)

> BTW, I'm not disagreeing with your ideas, I'm only saying that they
> can already be accomplished--though slightly less "cleanly".

And I'm not trying to be critical of your library (honest!). Rather, I
wanted to point to it as an example -- the fact that it exists
demonstrates that users find preprocessor metaprogramming useful (and
recursion/repetition in particular).

--
Mike Conley


Paul Mensonides
May 10, 2003, 2:39:03 PM
Mike Conley wrote:
> Paul Mensonides <leav...@attbi.com> wrote in
> news:WuAta.745871$L1.211469@sccrnsc02:
>
>> I can count up to 2^512 arguments with my current "strict"
>> implementation of the Boost pp-lib. Granted, it isn't the limit that
>> matters. Rather, it that you have to count them at all. :)
>
> I thought the limit was much smaller. Guess I'll have to take a
> closer look at it. But, as you say, it's the need to count, not the
> limit, that's bothersome.

The limit is much smaller in the Boost pp-lib. That is because I cannot get away
with a great many things because of buggy preprocessors. The "strict" version
of the library is an entirely separate implementation that I haven't released
yet (though it should be soon). This version of the library abstracts recursion
into a single set of macros. This makes macro writing algorithmic, so writing
the entire BOOST_PP_REPEAT or BOOST_PP_WHILE construct only takes about 3-4
macros each. Likewise, it allows you to create exponential loops, etc., that
have a massively higher theoretical limit--such as 2^512. I should amend the
above: I *can* "count" to 2^512, but I can't produce a numeric token in that
range (currently). The strict sources include high-precision arithmetic which
will only go up to 9,999,999,999--which is *way* less than 2^512. However, it is
high enough (obviously).

>>> Native
>>> overloaded macros would also provide a workaround for the token
>>> pasting problem you've mentioned:
>
>> No the wouldn't. The token pasting problem that I mentioned only
>> exists because there is no difference between ( placemarker ) and (
>> argument ). Therefore, this would be ambiguous:
>>
>> #define MACRO()
>> #define MACRO(a)
>>
>> MACRO() // argument == placemarker?
>> // or, no argument at all?
>
> Ahh... I hadn't thought of that (obviously :) This should probably
> be a call to the nullary MACRO. Users can explicitly request the
> unary MACRO with an empty argument easily enough:
>
> #define nothing
> MACRO(nothing)

I agree that, in the scheme above, if there are no arguments and a nullary macro
exists, the nullary macro should be chosen. The logic is pretty simple. ;)
However, the above won't always work:

#define NIL

#define A() B()
#define A(x) B(x)

#define B() 1
#define B(x) 2

A(NIL) // ?

The problem here is that the argument is thoroughly expanded and rescanned prior
to insertion into the replacement list of A. Rescanning therefore sees this:

B( )

...and would call the nullary B macro.

Granted, this is not a big problem. I'm just pointing out why simulation of
MACRO( <placemarker> ) as opposed to MACRO() is not generally possible. (Though
I don't think it really matters!) :)

>> Furthermore, the problems compound if you separate the "recursive"
>> style macros from the overloading idea. Specifically, if an
>> identifier token that is found during the rescan of a macro's
>> replacement list that refers to a macro that is currently "disabled,"
>> the identifier token itself is permanently disabled--this is even
>> without an attempted invocation, such as:
>>
>> #define MACRO(x) x
>>
>> MACRO( MACRO )( 1 ) // MACRO( 1 )
>
> Right. If you want this to expand to 1, you'd need to use a recursive
> macro. Recursive macros would never be disabled.
>
>> No invocation is needed in order to disable specific identifiers.
>> So, either the entire overload set must be disabled, or the
>> overloading
>> can only work when macros are recursive.
>
> Probably the thing to do is disable only the nonrecursive overloads.

The problem is that there is no way to tell which overload is which when all you
have is the identifier itself. The preprocessor disables the specific
identifier token itself when it encounters it if the corresponding macro is
disabled. No invocation (or attempt at invocation) is necessary--which means
any specific overload could not be chosen. Ultimately, what this means is 1)
the entire overload set must be disabled, or 2) a specific identifier token
would have to be only "tentatively" disabled--pending selection from an overload
set. (The easy solution would be to only allow overloading of macros that are
recursive.)

> The point was that, with recursive macros and overloading, you could
> define a fully general list splitter (and other useful list
> manipulation macros, too :).

Yes, I know, and I would definitely like such a mechanism. However, much of
this I can do already given a conformant preprocessor. List manipulation
algorithms in my strict sources only require about 3-4 macros each. That said,
the mechanism has to jump through some hoops and use some advanced trickery to
achieve this.

Regards,
Paul Mensonides



LLeweLLyn
May 11, 2003, 5:59:20 AM
all...@my-dejanews.com (Allan W) writes:

> > all...@my-dejanews.com (Allan W) wrote:
> > > Will there be such a thing as "undefined behavior -- no diagnostic
> > > required" for metacode functions? Since they operate at compile
> > > time, it could easily cause the compiler itself to abort -- this
> > > would make debugging them somewhat difficult.
>
> goo...@vandevoorde.com (Daveed Vandevoorde) wrote
> > In my implementation, metacode routines run in a "protected
> > environment": They are interpreted. So the compiler normally
> > doesn't abort: It gives you a meaningful error message.
> >
> > There is also a limit (that can be modified) on how many
> > branches (including function calls and returns) metacode can
> > execute (to deal with infinite loops).
>
> If this was part of the standard, it would have to run efficiently
> on platforms that can't have protected environments -- such as
> processors that don't provide such a function.

'don't provide such a function'? What function are you thinking of?
Note, an interpreter which needs to supply a protected
environment need not rely on e.g., virtual memory.

Independent of that, I think efficiency at compile time is less
important than runtime efficiency; I think most development for
platforms of limited resources is done with a cross-compiler,
which runs on a fast pc or workstation, but the generated code
runs on a different kind of machine.



LLeweLLyn
May 11, 2003, 6:00:00 AM
all...@my-dejanews.com (Allan W) writes:

> > > LLeweLLyn <llewe...@xmission.dot.com> wrote
> > > > Also, since the preprocessor is not going away, I would like to see an
> > > > alternative to function like macros that do not require such
> > > > things as the do{}while(0) idiom and parentheses around all
> > > > arguments. I can't think of a good way to express this (yet) but
> > > > that's what I want.
>
> > all...@my-dejanews.com (Allan W) writes:
> > > How about making them functions -- but still expand inline?
> > > We could call these "inline functions."
> > >
> > > :-)
>
> LLeweLLyn <llewe...@xmission.dot.com> wrote
> > Rewrite this macro as an inline function:
> >
> > #define TRACE() do{\
> > if(debug_this_module) {\
> > logfile << __FILE__ << ":" << __LINE__ << ":" << endl; logfile\
> > << dump_stack_trace();\
> > }\
> > }while(0)
> >
> > :-)
>
> // USAGE: TRACE(__LINE__);
> void TRACE(int line=0) {
> if (debug_this_module)
> logfile << __FILE__ << ": " << line << ":\n" << dump_stack_trace();
> }

I think you miss half the point. Requiring the user to specify
__LINE__ and __FILE__ is only an opportunity for error. (In
practice, I always use __func__ or __PRETTY_FUNCTION__ where
available, as well as __FILE__ and __LINE__.) Further, __FILE__
expands to a string literal containing the file name. If TRACE is
used in a file different from where it is defined, your definition
will print the wrong file name.

> If dump_stack_trace can be modified to accept a parameter int ignore=0,
> then TRACE() could be changed to pass in 1, meaning "ignore the very
> first stack frame".
>
> :-)

As a student, I consistently used functions for this sort of thing -
but after I started working professionally, I switched to macros -
so much easier, despite their problems.
---

Mike Conley

May 12, 2003, 9:51:14 AM
"Paul Mensonides" <leav...@attbi.com> wrote in
news:vaVua.534065$OV.502037@rwcrnsc54:

> The
> "strict" version of the library is an entirely separate implementation
> that I haven't released yet (though it should be soon). This version
> of the library abstracts recursion into a single set of macros.

Sounds cool.


> Granted, this is not a big problem. I'm just pointing out why
> simulation of MACRO( <placemarker> ) as opposed to MACRO() is not
> generally possible.

Another possibility would be to use #undef, suitably extended to undefine
specific overloads, e.g., #undef MACRO(A,A) -- the names of the parameters
would be irrelevant. In unambiguous cases the names could be omitted.
Omitting a parameter list entirely would undefine all overloads of the
macro.

#undef MACRO() // undef nullary version

MACRO() // pp now only sees unary version

This would at least give users a workaround. And it seems like a natural
extension to support overloaded macros.


> (Though I don't think it really matters!) :)

You're probably right :)


>> Probably the thing to do is disable only the nonrecursive overloads.
>
> The problem is that there is no way to tell what overload is what when
> all you have is the identifier itself.

I have a rather simple scheme in mind. The preprocessor sees MACRO in a
context in which it needs to disable MACRO. It searches its internal
data structures for non-recursive versions of MACRO and disables them.
Which overload in particular ends up being called (assuming MACRO is ever
invoked) is irrelevant to this process. The preprocessor would limit
overload resolution to macros that aren't disabled.

The only case in which not all nonrecursive overloads of MACRO would be
disabled would be in the definition of an overload of MACRO. Within that
body, all other previously defined overloads of MACRO would be available.
This requires only that the preprocessor delay disabling MACRO until it
has seen the argument list (so it knows which overload is being
#defined).

In other words, each overload is treated as a separate macro by the
preprocessor when defining another overload. But, when the preprocessor
can't tell which overload is being referred to (because the macro hasn't
been invoked), it disables all the nonrecursive overloads.

This seems like a consistent approach to me, but there are plenty of
options.

> However,
> much of this I can do already given a conformant preprocessor.

Granted. 2^512 iterations really says it all :)

--
Mike Conley
---

Luis Pedro Coelho

May 12, 2003, 10:06:55 AM
On Friday, 9 May 2003 16:30, Allan W wrote:
> How about
> #include <lisp>

ROTL :)

I think one can say that, in programming languages, those who do not
understand LISP are condemned to reinvent it, poorly.

Seriously, how much of the magic going on in boost with templates and
the preprocessor would be unnecessary with a bit more lisp in C++?

Regards,
luis pedro
---

LLeweLLyn

May 13, 2003, 2:18:46 AM
Luis Pedro Coelho <luis_...@netcabo.pt> writes:

> On Friday, 9 May 2003 16:30, Allan W wrote:
> > How about
> > #include <lisp>
>
> ROTL :)
>
> I think one can say that,
>
> in programming languages those who do not understand LISP are condemned to
> reinvent it, poorly.

Ironically, much of the C++ code which tries to reinvent lisp is written
by people who know lisp very well. :-)

> Seriously, how much of the magic going on in boost with templates and
> preprocessor would be unnecessary with a bit more of lisp in c++?

[snip]

Potentially a lot. However, I think adding lisp-like features to C++
will be difficult. Good things usually are.

I think there are many threads on this in the archives.


Paul Mensonides

May 14, 2003, 4:23:47 AM
Mike Conley wrote:
>> The problem is that there is no way to tell what overload is what
>> when all you have is the identifier itself.
>
> I have a rather simple scheme in mind. The preprocessor sees MACRO
> in a context in which it needs to disable MACRO. It searches its
> internal data structures for non-recursive versions of MACRO and
> disables them. Which overload in particular ends up being called
> (assuming MACRO is ever invoked) is irrelevent to this process. The
> preprocessor would limit overload resolution to macros that aren't
> disabled.

A specific identifier token would have to retain a "potentially disabled"
status--meaning that whether it is enabled or disabled remains pending until an
invocation attempt and the result of "overload resolution."

The problem is that there are two forms of "disabling" which we've been
discussing so far as only one. The first type is what you mention above: while
a macro's replacement list is being rescanned, the macro that generated it is
disabled--i.e. the macro itself, not any specific identifier tokens. However, the macro
is only disabled on the initial, prerequisite rescan of any given macro
expansion (this rescanning may, in turn, invoke other macros but the original
macro remains disabled). This "disabled on first rescan only" requires the
preprocessor to mark any specific identifier token that refers to a "disabled"
macro as disabled itself--permanently. This is the second form of disabling--it
is not disabling of a macro. Rather, it is the disabling of a specific
identifier token that was found *while* a macro itself was disabled. The reason
this must be done is that passing a name as a macro argument that then appears
in the replacement list, such as:

#define ID(x) x

ID( MACRO )

...effectively causes "MACRO" to be rescanned twice--once during the expansion
of the parameter and once during the rescanning of the replacement list of "ID."
Therefore something like this is valid:

#define SCAN(x) x

#define EMPTY()
#define DEFER(macro) macro EMPTY()

#define A() DEFER(B)()
#define B() DEFER(A)()

A() // B ()
SCAN( A() ) // A ()
SCAN(SCAN( A() )) // B ()
SCAN(SCAN(SCAN( A() ))) // A ()

...and so on and so forth--indefinitely. What this means is that when an identifier
is found that refers to an overload, it would have to be marked as "potentially
disabled" pending invocation attempt and resolution in a possible later context
outside of the initial rescan.

> The only case in which not all nonrecursive overloads of MACRO would
> be disabled would be in the definition of an overload of MACRO.
> Within that body, all other previously defined overloads of MACRO
> would be available. This requires only that the preprocessor delay
> disabling MACRO until it has seen the argument list (so it knows
> which overload is being #defined).

It can't do that directly because a macro itself is only disabled during the
initial rescan of the replacement list. If an identifier is found that refers
to a disabled macro--that specific identifier *must* be marked somehow at that
point, so it doesn't become enabled again in contexts other than the initial
rescan. The best you can do here is a tentative disabling that "remembers" what
overload initially produced the specific token.

I think this is a recipe for trouble. :( Name disabling is the single most
complicated task the preprocessor has to perform (because of the interplay
between the rescan of a macro's replacement list and the (re)scan of the tokens
that follow the macro invocation). It would be better to leave that alone and
allow overloading only of recursive macros--simple and straightforward since no
name disabling is involved (and in isolation, much easier to implement than the
normal macro expansion mechanics).

Regards,
Paul Mensonides



Luis Pedro Coelho

May 14, 2003, 5:06:25 PM
On Tuesday, 13 May 2003 08:18, LLeweLLyn wrote:

>> Seriously, how much of the magic going on in boost with templates and
>> preprocessor would be unnecessary with a bit more of lisp in c++?
> [snip]
>
> Potentally a lot. However I think adding lisp-like features to C++
> will be difficult. Good things usually are.

I have often hoped that, in time, things like boost::lambda will serve as
"as-if" models for core-language changes to C++. Of course, these core
changes would provide more, but in an "as if you had written all this
boost::lambda code by hand" fashion; the library provides a useful model to
start with.

Regards,
luis
---



Mike Conley

May 19, 2003, 3:46:08 PM
"Paul Mensonides" <leav...@attbi.com> wrote in
news:Hm0wa.112523$pa5.1...@rwcrnsc52.ops.asp.att.net:

> It would be
> better to leave that alone and allow overloading only of recursive
> macros--simple and straightforward since no name disabling is involved
> (and in isolation, much easier to implement than the normal macro
> expansion mechanics).


That seems reasonable enough. It also has the advantage of making all of
this an extension to the preprocessor, rather than a modification of
existing behavior.

--
Mike Conley
---
