
Unpopular C++ ideas


John Nagle

Mar 21, 2004, 5:07:52 PM
Here are a few ideas for the next major revision of C++
that people will hate, but which contain some sense.

1. Move unsafe library functions to new header files.

The headers for the classically unsafe string
functions ("sprintf", "strcat", etc.) would be moved
to "<unsafe-string.h>", or something similar.

This breaks existing programs, but they're probably
broken anyway. After 25 years of buffer overflows,
it's time to dump these badly designed functions.

2. Make "assert" part of the core language.

Compilers should know more about "assert", so they
can optimize and pull asserts out of loops. Early
assertion failure detection, where programs can report an
assertion failure as soon as it becomes inevitable,
should be encouraged.

Example:

const int tabsize = 100;
int tab[tabsize];
for (int i=0; i<=tabsize ; i++)
{ assert(i<tabsize); tab[i] = 0; }

can be optimized into

assert(false);


3. "swap" and "move" as primitives.

STL collections should all support "swap" and "move",
and collections should be able to handle objects for which
"swap" and "move" are defined, but "operator=" is not.
This moves towards collections of auto_ptr.
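A sketch of the kind of type this would enable (the Handle class is hypothetical, and `= delete` is later C++11 syntax; in 2004 one would declare the operators private and leave them undefined):

```cpp
#include <algorithm>

// Hypothetical resource handle: cheap to swap, deliberately not copyable,
// since copying would duplicate ownership of the underlying resource.
class Handle {
    int fd_;
public:
    explicit Handle(int fd) : fd_(fd) {}
    Handle(const Handle &) = delete;
    Handle &operator=(const Handle &) = delete;
    void swap(Handle &other) { std::swap(fd_, other.fd_); }
    int fd() const { return fd_; }
};
```

This is essentially the direction C++11 later standardized as move semantics (rvalue references and std::move), which did make containers of move-only types work.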

4. "let".

"let" declares and initializes a variable with the type
of the right hand side.

Example:

let x = 1.0; // x is a double

And, of course

for (let p = tab.begin(); p != tab.end(); p++) { ... }

which is shorter and more generic than the current form

for (vector<sometype>::iterator p = tab.begin(); p != tab.end(); p++) { ... }

5. Automatic class locking

Like "synchronized" classes in Java. Only one thread can be
"inside" an object at a time. Lock upon entry to a public
function, unlock at exit.

This requires some clear thinking about what it means for
control to be "inside" an object. I'd suggest that if this is done,
it should be possible to temporarily "leave" the object by
writing, in a class member function:

void classname::fn()
{ ... // locked
public {
... // unlocked
}; // relocking
}

If an object needs to block a thread and let other threads
in during a block, that syntax allows it in a straightforward
manner.
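The proposed semantics can be approximated today with explicit locking; a sketch using std::mutex, which postdates this thread (the Counter class is hypothetical):

```cpp
#include <mutex>

// Each public member function locks on entry and unlocks at exit --
// the code an "automatic class locking" feature would generate.
class Counter {
    mutable std::mutex m_;
    int n_ = 0;
public:
    void increment() {
        std::lock_guard<std::mutex> lock(m_);  // "enter" the object
        ++n_;
    }                                          // implicit unlock at exit
    int value() const {
        std::lock_guard<std::mutex> lock(m_);
        return n_;
    }
};
```

The proposed `public { ... }` region would correspond to unlocking and relocking the mutex around the enclosed statements.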

John Nagle
Animats

---
[ comp.std.c++ is moderated. To submit articles, try just posting with ]
[ your news-reader. If that fails, use mailto:std...@ncar.ucar.edu ]
[ --- Please see the FAQ before posting. --- ]
[ FAQ: http://www.jamesd.demon.co.uk/csc/faq.html ]

Edward Diener

Mar 21, 2004, 10:01:40 PM
John Nagle wrote:
> Here are a few ideas for the next major revision of C++
> that people will hate, but which contain some sense.
>
> 1. Move unsafe library functions to new header files.
>
> The headers for the classically unsafe string
> functions ("sprintf", "strcat", etc.) would be moved
> to "<unsafe-string.h>", or something similar.
>
> This breaks existing programs, but they're probably
> broken anyway. After 25 years of buffer overflows,
> it's time to dump these badly designed functions.

What purpose does this serve other than to make a judgment call on what is
"safe" or "unsafe"? Since these functions are in their own C header file,
I see no purpose in moving them somewhere else.

>
> 2. Make "assert" part of the core language.
>
> Compilers should know more about "assert", so they
> can optimize and pull asserts out of loops. Early
> assertion failure detection, where programs can report an
> assertion failure as soon as it becomes inevitable,
> should be encouraged.
>
> Example:
>
> const int tabsize = 100;
> int tab[tabsize];
> for (int i=0; i<=tabsize ; i++)
> { assert(i<tabsize); tab[i] = 0; }
>
> can be optimized into
>
> assert(false);

What happens when an assert is false if you make it part of the language
itself? Currently the result is up to the macro and implementation. Making
it part of the language, which I do not necessarily deem a bad idea, requires
defining exactly what it does when an assert is false. Whatever is decided may
not be everyone's choice.

>
>
> 3. "swap" and "move" as primitives.
>
> STL collections should all support "swap" and "move",
> and collections should be able to handle objects for which
> "swap" and "move" are defined, but "operator=" is not.
> This moves towards collections of auto_ptr.

One can have collections of boost::shared_ptr so not having collections of
std::auto_ptr is not a big deal as I see it. Furthermore boost::shared_ptr
has been accepted in TR1.

By swap and move, I believe you are arguing for move semantics that do not
invoke a copy constructor or assignment operator. I will let others argue that
one out, as I have never seen the practical advantage of it.

>
> 4. "let".
>
> "let" declares and initializes a variable with the type
> of the right hand side.
>
> Example:
>
> let x = 1.0; // x is a double
>
> And, of course
>
> for (let p = tab.begin(); p != tab.end(); p++) { ... }
>
> which is shorter and more generic than the current form
>
> for (vector<sometype>::iterator p = tab.begin(); p != tab.end(); p++)
> { ... }

This I do like, especially for the purposes of template programming, where
the programmer needs to create a variable of a type which will hold the type
of the rvalue expression. The compiler does know the type of the rvalue
expression, so having it automatically create a variable of the same type is
a no-brainer. I believe there are a number of proposals which move the next
version of C++ in the direction of supporting the "type_of" idea, and would
be surprised if the next version of C++ did not support some form of
"type_of".

>
> 5. Automatic class locking
>
> Like "synchronized" classes in Java. Only one thread can be
> "inside" an object at a time. Lock upon entry to a public
> function, unlock at exit.

It needs to be better than Java's "synchronized" in order to allow a single
member function to be locked rather than an entire object, as Java crudely
does. Again, I believe that the next version of C++ is considering the
threading issue, but members of the C++ standards committee will know more
about that, and hopefully will comment.

>
> This requires some clear thinking about what it means for
> control to be "inside" an object. I'd suggest that if this is done,
> it should be possible to temporarily "leave" the object by
> writing, in a class member function:
>
> void classname::fn()
> { ... // locked
> public {
> ... // unlocked
> }; // relocking
> }
>
> If an object needs to block a thread and let other threads
> in during a block, that syntax allows it in a straightforward
> manner.

Except for the first idea, which is a prejudice, the rest are all
interesting.

Andrei Alexandrescu

Mar 22, 2004, 10:19:27 PM
"John Nagle" <na...@animats.com> wrote in message
news:c627c.40806$qC6....@newssvr25.news.prodigy.com...

> 2. Make "assert" part of the core language.
>
> Compilers should know more about "assert", so they
> can optimize and pull asserts out of loops.

They already know pretty much all there is to know. The general-purpose
optimizer can easily detect the condition and figure out whether it's always
true or always false.

Andrei

John Nagle

Mar 23, 2004, 5:31:31 PM
You can go much further than that. Compilers should be able
to hoist and strength-reduce asserts, not just evaluate
them at compile time. But the compiler needs to know that
it's OK to fail an assert "early". That is,

int tab[100];
int n;
...
for (int i=0; i<n; i++)
{ assert(i<100);
tab[i] = 0;
}

should be compiled as

int tab[100];
int n;
...
assert(n < 100); // hoisted assert
for (int i=0; i<n; i++)
{ // assert(i<100); // implied by assert above
tab[i] = 0;
}

even though the assertion will fail before the loop is even entered.
The compiler needs to know that early assertion failure is permitted,
so it can hoist asserts through loop entries as shown above.

With this, it becomes feasible to put asserts in collection classes
for subscript checking. Some British work on Pascal in the 1980s showed
that well over 95% of subscript checks can be optimized out using
techniques like this.

Niklas Matthies

Mar 23, 2004, 5:31:34 PM
On 2004-03-23 03:19, "Andrei Alexandrescu" wrote:
> "John Nagle" <na...@animats.com> wrote in message
> news:c627c.40806$qC6....@newssvr25.news.prodigy.com...
>> 2. Make "assert" part of the core language.
>>
>> Compilers should know more about "assert", so they
>> can optimize and pull asserts out of loops.
>
> They already know pretty much all that's to be known. The
> general-purpose optimizer can easily detect the condition and figure
> out whether it's always true or always false.

The standard could allow implementations to assume that the condition
of an assert is always true when NDEBUG is defined, and perform
optimizations based on this assumption. In other words, let any assert
with a condition that would evaluate to false be formally undefined
behavior when NDEBUG is defined.

You can already get the effect with something like

(condition || <expression that always invokes undefined behavior>)

if the compiler is smart enough, but it would be nice to have this
more explicitly via assert (and hence more likely to be actually
exploited by compilers).
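This is, in effect, what later compiler extensions provide. A sketch of such a macro using GCC/Clang's __builtin_unreachable() (a call the optimizer treats as impossible to reach, i.e. undefined behavior if executed); the macro and function names are illustrative:

```cpp
#include <cassert>

// In release builds, tell the optimizer the condition may be assumed true;
// reaching the __builtin_unreachable() branch would be undefined behavior.
#ifdef NDEBUG
  #define ASSUME(cond) ((cond) ? (void)0 : __builtin_unreachable())
#else
  #define ASSUME(cond) assert(cond)
#endif

int clamp_index(int i) {
    ASSUME(i >= 0 && i < 100);  // optimizer may drop range checks that follow
    return i;
}
```

C++23 eventually standardized this idea directly as the [[assume(expr)]] attribute.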

-- Niklas Matthies

Helium

Mar 24, 2004, 4:15:02 AM
> > 4. "let".
> >
> > "let" declares and initializes a variable with the type
> > of the right hand side.
> >
> > Example:
> >
> > let x = 1.0; // x is a double
> >
> > And, of course
> >
> > for (let p = tab.begin(); p != tab.end(); p++) { ... }
> >
> > which is shorter and more generic than the current form
> >
> > for (vector<sometype>::iterator p = tab.begin(); p != tab.end(); p++)
> > { ... }
>
> This I do like, especially for the purposes of template programming, where
> the programmer needs to create a variable of a type which will hold the type
> of the rvalue expression. The compiler does know the type of the rvalue
> expression, so having it automatically create a variable of the same type is
> a no-brainer. I believe there are a number of proposals which move the next
> version of C++ in the direction of supporting the "type_of" idea, and would
> be surprised if the next version of C++ diod not support some form of
> "type_of".
>
Isn't let very similar to the proposed auto:
http://anubis.dkuug.dk/jtc1/sc22/wg21/docs/papers/2004/n1607.pdf

Hans Aberg

Mar 24, 2004, 4:16:21 AM
In article <c627c.40806$qC6....@newssvr25.news.prodigy.com>,
na...@animats.com (John Nagle) wrote:

>1. Move unsafe library functions to new header files.
>
> The headers for the classically unsafe string
>functions ("sprintf", "strcat", etc.) would be moved
>to "<unsafe-string.h>", or something similar.

One should probably instead have a keyword like "pure" or something, that
indicates a function is safe (re-entrant). A function is pure if made up
by pure components, and a function indicated pure cannot in its
implementation call non-pure components unless explicitly overridden say
with the same keyword "pure".

> This breaks existing programs, but they're probably
>broken anyway. After 25 years of buffer overflows,
>it's time to dump these badly designed functions.

Then one does not need to break existing programs.

Hans Aberg

Walter

Mar 24, 2004, 6:42:42 PM

"John Nagle" <na...@animats.com> wrote in message
news:c627c.40806$qC6....@newssvr25.news.prodigy.com...
> Here are a few ideas for the next major revision of C++
> that people will hate, but which contain some sense.
>
> 1. Move unsafe library functions to new header files.
>
> The headers for the classically unsafe string
> functions ("sprintf", "strcat", etc.) would be moved
> to "<unsafe-string.h>", or something similar.
>
> This breaks existing programs, but they're probably
> broken anyway. After 25 years of buffer overflows,
> it's time to dump these badly designed functions.

D has an unusual approach to this kind of problem. There's the keyword
'deprecated' which can be applied to declarations that have been superseded,
but are still necessary to support legacy code. Then, a compiler switch is
used to allow or disallow use of deprecated declarations.

This makes it easy for the maintenance programmer to find and purge any
dependencies on obsolete declarations, and easy for the library vendor to
provide a clear upgrade path.

One could do this in C++ now using an #ifdef and the appropriate convention.
The problem with conventions, of course, is getting them consistently
adopted. One could also simply comment out the declarations for sprintf in
stdio.h, but my experience with such methods is:

1) programmers are very, very reluctant to modify system or vendor .h files.
2) when they do, it breaks some unrelated project.
3) updating the compiler means that all one's tweaks get undone.
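For comparison, C++14 later added a close analogue of the D mechanism described above: the [[deprecated]] attribute. Compilers warn at each use site, and the warning can be promoted to an error (e.g. -Werror=deprecated-declarations on GCC/Clang), which gives roughly the same allow/disallow switch. A sketch with a hypothetical wrapper function:

```cpp
#include <cstdarg>
#include <cstdio>

// Hypothetical legacy function carrying the C++14 attribute.
[[deprecated("unbounded; use snprintf instead")]]
int legacy_format(char *buf, const char *fmt, ...);

int legacy_format(char *buf, const char *fmt, ...) {
    va_list ap;
    va_start(ap, fmt);
    int n = std::vsprintf(buf, fmt, ap);  // the unbounded call that earned the tag
    va_end(ap);
    return n;
}
```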

-Walter
www.digitalmars.com free C/C++/D compilers

Bob Bell

Mar 25, 2004, 4:55:41 PM
na...@animats.com (John Nagle) wrote in message news:<IQ%7c.13797$MY4....@newssvr27.news.prodigy.com>...

> You can go much further than that. Compilers should be able
> to hoist and strength-reduce asserts, not just evaluate
> them at compile time. But the compiler needs to know that
> it's OK to fail an assert "early". That is,
>
> int tab[100];
> int n;
> ...
> for (int i=0; i<n; i++)
> { assert(i<100);
> tab[i] = 0;
> }
>
> should be compiled as
>
> int tab[100];
> int n;
> ...
> assert(n < 100); // hoisted assert
> for (int i=0; i<n; i++)
> { // assert(i<100); // implied by assert above
> tab[i] = 0;
> }
>
> even though the assertion will fail before the loop is even entered.
> The compiler needs to know that early assertion failure is permitted,
> so it can hoist asserts through loop entries as shown above.

Except that this changes the meaning of the code. Before hoisting,
"tab" is filled before the assert fires; after hoisting, "tab" is not
filled at all. This doesn't seem like such a good change. Where an
assertion is placed by the programmer has everything to do with the
state of the program if/when the assertion fails. If the code is
written like the first version above but is compiled like the second
version, then when I examine the resulting core dump (or whatever)
I'll be plenty surprised to find that "tab" still has garbage, not 100
zeroes.

Bob

Steven T. Hatton

Mar 25, 2004, 4:56:41 PM
Walter wrote:

> D has an unusual approach to this kind of problem. There's the keyword
> 'deprecated' which can be applied to declarations that have been
> superseded, but are still necessary to support legacy code. Then, a
> compiler switch is used to allow or disallow use of deprecated
> declarations.

Have you considered using the same approach to modify the way 'deprecated'
elements are compiled? For example, if a new version of the C++ standard
were created, and it 'broke' backward compatability, there might be a way
to persuade the compiler to process libraries from the older version
differently from code conforming to the latest standard.

Yes, the thought makes me nervous. It might turn out to be far too
complicated to actually implement; nonetheless, it seems worth considering.

--
STH

John Nagle

Mar 26, 2004, 10:44:52 PM
Hans Aberg wrote:

> In article <c627c.40806$qC6....@newssvr25.news.prodigy.com>,
> na...@animats.com (John Nagle) wrote:
>
>
>>1. Move unsafe library functions to new header files.
>>
>> The headers for the classically unsafe string
>>functions ("sprintf", "strcat", etc.) would be moved
>>to "<unsafe-string.h>", or something similar.
>
>
> One should probably instead have a keyword like "pure" or something, that
> indicates a function is safe (re-entrant). A function is pure if made up
> by pure components, and a function indicated pure cannot in its
> implementation call non-pure components unless explicitly overridden say
> with the same keyword "pure".

That's a different issue.

By "unsafe", I meant "do not check buffer size before storing".
It's generally recognized that some of the original C standard
library functions were very badly designed, and I'm arguing that
the price we pay for their flaws is higher than the price of
fixing the code that uses them.

John Nagle
Animats

Andrei Alexandrescu

Mar 26, 2004, 10:51:07 PM
"John Nagle" <na...@animats.com> wrote in message
news:IQ%7c.13797$MY4....@newssvr27.news.prodigy.com...
[about putting assert inside the language]

> You can go much further than that. Compilers should be able
> to hoist and strength-reduce asserts, not just evaluate
> them at compile time. But the compiler needs to know that
> it's OK to fail an assert "early". That is,
>
[snip example]

>
> even though the assertion will fail before the loop is even entered.
> The compiler needs to know that early assertion failure is permitted,
> so it can hoist asserts through loop entries as shown above.
>
> With this, it becomes feasible to put asserts in collection classes
> for subscript checking. Some British work on Pascal in the 1980s showed
> that well over 95% of subscript checks can be optimized out using
> techniques like this.

But this all can be done if we inline assert as "if (!cond) throw
AssertFailed;" or something similar.

What I am saying is that it is not building assert into the language that is
the issue here. It is building flow analysis into the language. Then,
whether you have assert, an if followed by abort() or by throwing an
exception, or whatever, the flow analysis will take care of it
appropriately.

There's nothing special about assert. It is just that in certain builds it is
an if statement that terminates the program execution on one branch.
Standard flow analysis can take care of that whether it is in the form of an
assert or a hand-coded if.
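The equivalence being argued here can be put in code: a sketch of a hand-rolled check whose failing branch never returns, which a flow-analysis-capable optimizer can hoist exactly as it could a built-in assert (function names are illustrative):

```cpp
#include <cstdlib>

inline void check(bool cond) {
    if (!cond) std::abort();  // one branch terminates the program
}

int zero_fill(int *tab, int n) {
    for (int i = 0; i < n; ++i) {
        check(i < 100);       // flow analysis may hoist this to a single n <= 100 test
        tab[i] = 0;
    }
    return n;
}
```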

Andrei Alexandrescu

Mar 26, 2004, 10:51:32 PM
"Niklas Matthies" <usenet...@nmhq.net> wrote in message
news:slrnc614dq.2r7...@nmhq.net...

> On 2004-03-23 03:19, "Andrei Alexandrescu" wrote:
> > "John Nagle" <na...@animats.com> wrote in message
> > news:c627c.40806$qC6....@newssvr25.news.prodigy.com...
> >> 2. Make "assert" part of the core language.
> >>
> >> Compilers should know more about "assert", so they
> >> can optimize and pull asserts out of loops.
> >
> > They already know pretty much all that's to be known. The
> > general-purpose optimizer can easily detect the condition and figure
> > out whether it's always true or always false.
>
> The standard could allow implementations to assume that the condition
> of an assert is always true when NDEBUG is defined, and perform
> optimizations based on this assumption. In other words, let any assert
> with a condition that would evaluate to false be formally undefined
> behavior when NDEBUG is defined.
>
> You can already get the effect with something like
>
> (condition || <expression that always invokes undefined behavior>)
>
> if the compiler is smart enough, but it would be nice to have this
> more explicitly via assert (and hence more likely to be actually
> exploited by compilers).

What's wrong with abort() instead of the "expression that always invokes
undefined behavior"? Again, there is nothing special about assert. If you
handcode something similar to an assertion, you should be able to achieve
similar effects.

Building assert in the language would be the wrong, limited, horizonless way
to go about it.


Andrei

Walter

Mar 29, 2004, 1:02:28 AM

"Steven T. Hatton" <hat...@globalsymmetry.com> wrote in message
news:66qdnaOhp-d...@speakeasy.net...

> Walter wrote:
>
> > D has an unusual approach to this kind of problem. There's the keyword
> > 'deprecated' which can be applied to declarations that have been
> > superseded, but are still necessary to support legacy code. Then, a
> > compiler switch is used to allow or disallow use of deprecated
> > declarations.
>
> Have you considered using the same approach to modify the way 'deprecated'
> elements are compiled? For example, if a new version of the C++ standard
> were created, and it 'broke' backward compatability, there might be a way
> to persuade the compiler to process libraries from the older version
> differently from code conforming to the latest standard.
>
> Yes, the thought makes me nervous. It might turn out to be far too
> complicated to actually implement; nonetheless, it seems worth
> considering.

Most C++ compilers (and C compilers) have command line switches to change
the semantics of the language to be compiled. After all, there are 3 primary
versions of C, and there's a lot of C++ code out there written to older C++
semantics that still need to be supported.

The downside of all that is, let's say, the compiler has switches to alter n
different language semantics. Then the testing has to test n! (n factorial)
permutations of the compiler. This rapidly approaches a mathematical
impracticality.

John Nagle

Mar 29, 2004, 1:03:50 AM
It's very similar to the "auto" proposal, but there
are some issues with using the "auto" keyword.

Although it's rarely done, you can declare variables
"auto". You can write

auto int x = 1;

just as you can write

static int x = 1;

So there's syntactic trouble with "auto" in that placement.
The compiler would have to distinguish

auto x = 1;

from all of the above.

Incidentally,

auto const int x = 1;
and
const auto int x = 1;
and even
const auto volatile int x = 1;
are all legal. So it's possible to keep the parse ambiguous for
quite a while.

It may be possible to get a parser to buy another overload
of "auto", but it's not a comfortable syntax.

Admittedly "let" adds a keyword, but everybody will understand
what it means. Overloading "auto" is rather obscure. Bear in
mind that this is a feature for general C++ programmers, not
l33t template gods.

John Nagle
Animats

John Nagle

Mar 29, 2004, 1:05:32 AM
The committee is considering a special
built-in compile time assertion form to make the template
people happy. Maybe that could be extended to cover
the general case.

The compile-time form causes the compile to fail,
even if the code involved is never executed. So that's
an "early failure", much as I'm discussing here.

The point I'm making about "assert" is that if the
compiler knows more about it, it can optimize it
much more aggressively. See the (snipped) example.

What I'm trying to get to is low-cost optimized subscript
checking, such as a few advanced Pascal compilers
had twenty years ago. See

http://doi.acm.org/10.1145/201059.201063

John Nagle
Animats

Andrei Alexandrescu wrote:


Niklas Matthies

Mar 29, 2004, 1:05:47 AM
On 2004-03-27 03:51, "Andrei Alexandrescu" wrote:
> "Niklas Matthies" <usenet...@nmhq.net> wrote in message:
:

>> You can already get the effect with something like
>>
>> (condition || <expression that always invokes undefined behavior>)
>>
>> if the compiler is smart enough, but it would be nice to have this
>> more explicitly via assert (and hence more likely to be actually
>> exploited by compilers).
>
> What's wrong with abort() instead of the "expression that always
> invokes undefined behavior"?

The fact that abort() doesn't invoke undefined behavior. :)

Consider:

// A
try { f(); }
catch (some_exception const &) { /* ignore */ }
catch (...) { abort(); }

vs.

// B
try { f(); }
catch (some_exception const &) { /* ignore */ }
catch (...) { assert(false); }

In B, the assert would tell the compiler that it may assume that the
only exceptions that the invocation of f() can throw are of type
some_exception, so the generated exception-catching code could omit
checking whether the exception coming from f() actually is a
some_exception or not. It could effectively compile to

try { f(); }
catch (...) { /* ignore */ }

in non-debug mode.

The compiler is not allowed to do this in A, because the behavior would
be different when f throws a non-some_exception.

Furthermore, when the compiler is able to statically detect that the
invocation of f() actually _does_ throw a non-some_exception, it would
be allowed to refuse compilation of B. With A, even a warning would be
considered inappropriate by many users. Or do you want each and every
occurrence of abort() in non-dead code to trigger a warning?

:


> Building assert in the language would be the wrong, limited,
> horizonless way to go about it.

Actually, what I'd like to see built into the language is
something like __assume_true(cond) (which would be defined to be a
no-op when cond would evaluate to true if it were evaluated at that
point, and to be undefined behavior when it would evaluate to false),
which then could be used in the definition of the assert() macro,
along the lines of:

#ifdef NDEBUG
#define assert(cond) __assume_true(cond)
#else
#define assert(cond) ((void) ((cond) || __assertion_failure(#cond)))
#endif

extern "C" { void __noreturn__ __assertion_failure(char const *); }

-- Niklas Matthies

Hans Aberg

Mar 29, 2004, 1:05:59 AM
In article <AOR8c.42971$%05....@newssvr25.news.prodigy.com>,
na...@animats.com (John Nagle) wrote:

> By "unsafe", I meant "do not check buffer size before storing".
>It's generally recognized that some of the original C standard
>library functions were very badly designed, and I'm arguing that
>the price we pay for their flaws is higher than the price of
>fixing the code that uses them.

I believe that the suggested C++ usage is to not use those old C functions
at all, but the corresponding C++ string facilities. If those C functions
should be changed, that is probably a C-language issue, which C++ would
then follow.

On another level, one can think of admitting language constructs that allow
the compiler to do static checking for correctness instead of dynamic
checks, for safe yet efficient implementation. If somebody figures out how
to do that, the technique could just as well be applied to the old C functions.

Hans Aberg

Colin Hirsch

Mar 29, 2004, 1:08:14 PM
Niklas Matthies wrote:
> [...]

> The standard could allow implementations to assume that the condition
> of an assert is always true when NDEBUG is defined, and perform
> optimizations based on this assumption. In other words, let any assert
> with a condition that would evaluate to false be formally undefined
> behavior when NDEBUG is defined.

Hi,

I use assert(), or rather a more powerful handcrafted version ASSERT(),
to assert internal invariants of my program. In any non-trivial program of
more than 1000 lines of code, I do not believe that I can possibly have
enough black-box or unit tests to cover _all_ possible combinations of
flow through the code with all kinds of inputs. Hence the _last_ thing
that I want is to disable asserts in production and have the software
continue to run with inconsistent data, instead of a controlled crash
with a stack trace and core dump. In other words, for my paranoia,
NDEBUG is an absolute no-no for 99% of all code/projects...

Apart from that, I would like to second Andrei's reasoning that enhancing
flow analysis in the compiler is better than optimising for one special
macro (which is probably not used in its "raw" form all that often
anyhow), disregarding the question of how much compilers can already
eliminate with current optimisations when NDEBUG is defined.

Regards, Colin

t...@cs.ucr.edu

Mar 29, 2004, 1:08:35 PM
Niklas Matthies <usenet...@nmhq.net> wrote:
[...]
+ The standard could allow implementations to assume that the condition
+ of an assert is always true when NDEBUG is defined, and perform
+ optimizations based on this assumption. In other words, let any assert
+ with a condition that would evaluate to false be formally undefined
+ behavior when NDEBUG is defined.
+
+ You can already get the effect with something like
+
+ (condition || <expression that always invokes undefined behavior>)
+
+ if the compiler is smart enough, but it would be nice to have this
+ more explicitly via assert (and hence more likely to be actually
+ exploited by compilers).

Wow. I like your proposal *very* much.

Suppose that a given C++ implementation were modified so that:
(1) When NDEBUG is defined, "assert(<exp>)" expands to say
"(<exp>||*0)"
(2) Since all's fair when "<exp>" is false, the expression
"(<exp>||*0)" generates no code, and
(3) the implementation simply assumes that following the
evaluation of "(<exp>||*0)" the expression "<exp>" is true.

It would seem that:
- By the as-if rule, the modified implementation would continue to
conform as much as the original did.
- No existing unbroken code would get broken.
- No existing code would slow down.
- Some existing code would actually speed up if the implementation
took advantage of (3).

Am I missing something?

Tom Payne

Thorsten Ottosen

Mar 29, 2004, 1:09:35 PM
"John Nagle" <na...@animats.com> wrote in message
news:9g99c.43288$Ku5....@newssvr25.news.prodigy.com...

> The committee is considering a special
> built-in compile time assertion form to make the template
> people happy. Maybe that could be extended to cover
> the general case.

That would be nice. FYI, I'm now the person responsible for
writing that proposal. My first
paper will be available on the Sydney post-mailing.

> The point I'm making about "assert" is that if the
> compiler knows more about it, it can optimize it
> much more aggresively. See the (snipped) example.
>
> What I'm trying to get to is low-cost optimized subscript
> checking, such as a few advanced Pascal compilers
> had twenty years ago.

Although I don't discuss it much in my paper, the second version of my
proposal will.
Let us assume that vector::operator[] was declared (not defined) like this:

vector::operator[]( size_type n )
precondition { n < size() : throw range_error(); };

That might make it easy for compilers to detect when range checking
is not necessary in loops.

best regards

Thorsten

Pete Forman

Mar 29, 2004, 1:09:44 PM
na...@animats.com (John Nagle) writes:

> 1. Move unsafe library functions to new header files.

It is often far from clear which header files are being included, owing
to indirect inclusion. It might be better to add a deprecated tag to
individual functions. (I am not offering a syntax for this.)
--
Pete Forman -./\.- Disclaimer: This post is originated
WesternGeco -./\.- by myself and does not represent
pete....@westerngeco.com -./\.- opinion of Schlumberger, Baker
http://petef.port5.com -./\.- Hughes or their divisions.

Andrei Alexandrescu

Mar 29, 2004, 8:53:17 PM
"John Nagle" <na...@animats.com> wrote in message
news:9g99c.43288$Ku5....@newssvr25.news.prodigy.com...

> The committee is considering a special
> built-in compile time assertion form to make the template
> people happy. Maybe that could be extended to cover
> the general case.

Kewl.

> The point I'm making about "assert" is that if the
> compiler knows more about it, it can optimize it
> much more aggressively. See the (snipped) example.
>
> What I'm trying to get to is low-cost optimized subscript
> checking, such as a few advanced Pascal compilers
> had twenty years ago. See
>
> http://doi.acm.org/10.1145/201059.201063

I understand your point; mine seems to not be as well understood. Again, my
point is, it's not assert that is special. The flow analysis of a test (if
statement) that terminates the current function (or the whole program)
abruptly on one of its branches - that's what's important.

There is no code sample you can show me that would benefit more from a built-in
assert than from a compiler performing standard flow analysis. If we manage
to get that point across, then the next step is to convince people that flow
analysis is more general and has wider applicability than a good assert.

Oh, and by the way, forgive this little joke:

A: Top posting.
Q: What's the worst thing on the Usenet?

:o)

Andrei

Andrei Alexandrescu

unread,
Mar 29, 2004, 8:53:34 PM3/29/04
to
"Niklas Matthies" <usenet...@nmhq.net> wrote in message
news:slrnc6avc2.372...@nmhq.net...

> On 2004-03-27 03:51, "Andrei Alexandrescu" wrote:
> > "Niklas Matthies" <usenet...@nmhq.net> wrote in message:
> :
> >> You can already get the effect with something like
> >>
> >> (condition || <expression that always invokes undefined behavior>)
> >>
> >> if the compiler is smart enough, but it would be nice to have this
> >> more explicitly via assert (and hence more likely to be actually
> >> exploited by compilers).
> >
> > What's wrong with abort() instead the "expression that always
> > invokes undefined behavior"?
>
> The fact that abort() doesn't invoke undefined behavior. :)
>
> Consider:
>
> // A
> try { f(); }
> catch (some_exception const &) { /* ignore */ }
> catch (...) { abort(); }
>
> vs.
>
> // B
> try { f(); }
> catch (some_exception const &) { /* ignore */ }
> catch (...) { assert(false); }

The two examples should be equivalent.

Andrei

t...@cs.ucr.edu

unread,
Mar 30, 2004, 11:09:50 AM3/30/04
to
t...@cs.ucr.edu wrote:
+ Niklas Matthies <usenet...@nmhq.net> wrote:
+ [...]
+ + The standard could allow implementations to assume that the condition
+ + of an assert is always true when NDEBUG is defined, and perform
+ + optimizations based on this assumption. In other words, let any assert
+ + with a condition that would evaluate to false be formally undefined
+ + behavior when NDEBUG is defined.

+ +
+ + You can already get the effect with something like
+ +
+ + (condition || <expression that always invokes undefined behavior>)
+ +
+ + if the compiler is smart enough, but it would be nice to have this
+ + more explicitly via assert (and hence more likely to be actually
+ + exploited by compilers).
[...]
+ Suppose that a given C++ implementation were modified so that:
+ (1) When NDEBUG is defined, "assert(<exp>)" expands to say
+ "(<exp>||*0)"
+ (2) Since all's fair when "<exp>" is false, the expression
+ "(<exp>||*0)" generates no code, and
+ (3) the implementation simply assumes that following the
+ evaluation of "(<exp>||*0)" the expression "<exp>" is true.
+
+ It would seem that:
+ - By the as-if rule, the modified implementation would continue to
+ conform as much as the original did.
+ - No existing unbroken code would get broken.
+ - No existing code would slow down.
+ - Some existing code would actually speed up if the implementation
+ took advantage of (3).
+
+ Am I missing something?

Oops! Of course, a program that violates its assertions can conform,
even when NDEBUG is defined.

But, if the standards were simply modified so that the result of
evaluating "assert(<exp>)" is undefined whenever (1) "<exp>" is false
and (2) NDEBUG is defined, then:

* Conforming implementations would continue to conform, i.e., an
occurrence of "assert(<exp>)" would not need to generate
any behavior when NDEBUG is defined.

* Programs that don't violate any assertions would retain their
former behavior and performance, even when NDEBUG is defined.

* Following an occurrence of "assert(<exp>)" a conforming
implementation could behave as though "<exp>" were true,
even when NDEBUG is defined.

* Therefore, under aggressively optimized implementations, some
programs that don't violate their assertions would actually run
faster than before.

Presumably, aggressively optimized implementations already attach no
behaviour to occurrences of "(<exp>||*0)" and simply assume that
"<exp>" is true afterward. Such implementations could simply expand
"assert(<exp>)" as "(<exp>||*0)" whenever NDEBUG is defined, and
thereby take advantage of optimizations already in place.
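
To make this concrete, here is a minimal sketch of such a macro (ASSUME is a
made-up name; the null-pointer store stands in for "any expression with
undefined behavior"):

```cpp
#include <cassert>

// Sketch only: with NDEBUG defined, ASSUME expands to the
// (<exp>||*0)-style hack, so an optimizer that treats the null-pointer
// store as unreachable may assume <exp> holds afterward; without NDEBUG
// it falls back to the ordinary assert.
#ifdef NDEBUG
#define ASSUME(exp) ((void)((exp) || (*(volatile int *)0 = 0)))
#else
#define ASSUME(exp) assert(exp)
#endif

int clamp_index(int i)
{
    ASSUME(i >= 0);              // promise: callers never pass a negative i
    return i < 100 ? i : 99;
}
```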

Risto Lankinen

unread,
Mar 30, 2004, 11:30:21 AM3/30/04
to

""Andrei Alexandrescu"" <SeeWebsit...@moderncppdesign.com> wrote in
message news:c4aeq2$2h1ia4$1...@ID-14036.news.uni-berlin.de...

>
> I understand your point; mine seems to not be as well understood. Again, my
> point is, it's not assert that is special. The flow analysis of a test (if
> statement) that terminates the current function (or the whole program)
> abruptly on one of its branches - that's what's important.

Could you please elaborate. This didn't help me understand your
point any better.

FWIW, I can't see how flow analysis could [easily] improve on an
assert-aware compiler. Since flow analysis cannot always take all
invariants into consideration, generic functions must handle all
possible cases. If actual use is limited to a subset [e.g. adds
invariants] of the function's state space, how does the compiler
find this out using flow analysis, if one of the functions resides
in, say, a library?

- Risto -

Niklas Matthies

unread,
Mar 30, 2004, 1:28:08 PM3/30/04
to
On 2004-03-30 01:53, "Andrei Alexandrescu" wrote:
> "Niklas Matthies" <usenet...@nmhq.net> wrote in message
> news:slrnc6avc2.372...@nmhq.net...
>> On 2004-03-27 03:51, "Andrei Alexandrescu" wrote:
:

>> > What's wrong with abort() instead the "expression that always
>> > invokes undefined behavior"?
>>
>> The fact that abort() doesn't invoke undefined behavior. :)
>>
>> Consider:
>>
>> // A
>> try { f(); }
>> catch (some_exception const &) { /* ignore */ }
>> catch (...) { abort(); }
>>
>> vs.
>>
>> // B
>> try { f(); }
>> catch (some_exception const &) { /* ignore */ }
>> catch (...) { assert(false); }
>
> The two examples should be equivalent.

I don't know what you mean by "they should", but they certainly
aren't equivalent (with the proposed semantics of assert() in NDEBUG
mode), as I thought I demonstrated in the previous posting.

-- Niklas Matthies

Niklas Matthies

unread,
Mar 30, 2004, 1:28:34 PM3/30/04
to
On 2004-03-29 18:08, t...@cs.ucr.edu wrote:
> Niklas Matthies <usenet...@nmhq.net> wrote:
> [...]
>+ The standard could allow implementations to assume that the condition
>+ of an assert is always true when NDEBUG is defined, and perform
>+ optimizations based on this assumption. In other words, let any assert
>+ with a condition that would evaluate to false be formally undefined
>+ behavior when NDEBUG is defined.
>+
>+ You can already get the effect with something like
>+
>+ (condition || <expression that always invokes undefined behavior>)
>+
>+ if the compiler is smart enough, but it would be nice to have this
>+ more explicitly via assert (and hence more likely to be actually
>+ exploited by compilers).
>
> Wow. I like your proposal *very* much.
>
> Suppose that a given C++ implementation were modified so that:
> (1) When NDEBUG is defined, "assert(<exp>)" expands to say
> "(<exp>||*0)"
> (2) Since all's fair when "<exp>" is false, the expression
> "(<exp>||*0)" generates no code, and

It does generate code when <exp> produces side effects. I would prefer
<exp> to be never evaluated at runtime in NDEBUG mode, just as with
the current assert(). And this requires language support.

> (3) the implementation simply assumes that following the
> evaluation of "(<exp>||*0)" the expression "<exp>" is true.
>
> It would seem that:
> - By the as-if rule, the modified implementation would continue to
> conform as much as the original did.
> - No existing unbroken code would get broken.
> - No existing code would slow down.
> - Some existing code would actually speed up if the implementation
> took advantage of (3).
>
> Am I missing something?

Apart from the above, I don't think so.

As I wrote in another post, I think it would be better to have a
new primitive that takes an expression that is convertible to bool
and tells the compiler "you may assume that if this expression would
be evaluated, the result converted to bool would be true".
While in many situations '<exp> || <undefined behavior>' is formally
equivalent, it qualifies as a rather obscure hack in my opinion.

-- Niklas Matthies

Risto Lankinen

unread,
Mar 31, 2004, 1:15:27 AM3/31/04
to

<t...@cs.ucr.edu> wrote in message news:c4b52f$1cd$1...@glue.ucr.edu...

> t...@cs.ucr.edu wrote:
> Presumably, aggressively optimized implementations already attach no
> behaviour to occurrences of "(<exp>||*0)" and simply assume that
> "<exp>" is true afterward.

What entitles the compiler to assume that <exp> is always true
in (<exp>||*0)?

- Risto -

Risto Lankinen

unread,
Mar 31, 2004, 1:15:43 AM3/31/04
to

"Niklas Matthies" <usenet...@nmhq.net> wrote in message
news:slrnc6j5re.31c...@nmhq.net...

>
> It does generate code when <exp> produces side effects. I would prefer
> <exp> to be never evaluated at runtime in NDEBUG mode, just as with
> the current assert(). And this requires language support.

This is good for backward compatibility with the current state,
but in my experience it has been a source of subtle bugs [that
the expression in assert() is not always evaluated]. If assert()
became a compiler feature, I would like the expression to be
evaluated (just to retain the side effects) even if NDEBUG is
defined.

- Risto -

James Kuyper

unread,
Mar 31, 2004, 6:35:35 PM3/31/04
to
rlan...@hotmail.com ("Risto Lankinen") wrote in message news:<vPsac.12785$k4.2...@news1.nokia.com>...

> <t...@cs.ucr.edu> wrote in message news:c4b52f$1cd$1...@glue.ucr.edu...
> > t...@cs.ucr.edu wrote:
> > Presumably, aggressively optimized implementations already attach no
> > behaviour to occurrences of "(<exp>||*0)" and simply assume that
> > "<exp>" is true afterward.
>
> What entitles the compiler to assume that <exp> is always true
> in (<exp>||*0)?

If exp is false, then *0 is evaluated, which means that the program
has undefined behavior. Therefore, anything is allowed, including code
optimizations based upon an incorrect assumption that exp is true.

Let's take a simple case:

int func(int i)
{
i>0 || *0;

if(i>0)
return 1;
return 0;
}


This can be optimized into the equivalent of the following:

int func(int i) { return 1; }

Risto Lankinen

unread,
Apr 1, 2004, 1:15:10 PM4/1/04
to

"James Kuyper" <kuy...@wizard.net> wrote in message
news:8b42afac.0403...@posting.google.com...

> rlan...@hotmail.com ("Risto Lankinen") wrote in message
news:<vPsac.12785$k4.2...@news1.nokia.com>...
> >
> > What entitles the compiler to assume that <exp> is always true
> > in (<exp>||*0)?
>
> If exp is false, then *0 is evaluated, which means that the program
> has undefined behavior. Therefore, anything is allowed, including code
> optimizations based upon an incorrect assumption that exp is true.

Wow! This is an insanely smart utilization of the "reductio
ad absurdum"-type proof of the theorem (<exp>==true).
What makes it interesting is that a thinking *machine* can
be made to understand it!

A lot of weird stuff has been done with templates, but of all
C++ idioms that I've seen that are both cool and wacky at
the same time, this one gets the prom queen.

Thanks a lot for explaining!

- Risto -

Andrei Alexandrescu (See Website for Email)

unread,
Apr 1, 2004, 1:16:41 PM4/1/04
to
"Niklas Matthies" <usenet...@nmhq.net> wrote in message
> As I wrote in another post, I think it would be better to have a
> new primitive that takes an expression that is convertible to bool
> and tells the compiler "you may assume that if this expression would
> be evaluated, the result converted to bool would be true".

Let me try to explain that there is no need for a new primitive. Consider
this function:

bool foo(int n) {
if (!(n >= 0)) abort();
return n >= 0;
}

I claim that the compiler only needs to know that abort() never returns in
order to optimize this function just as well as:

bool bar(int n) {
super_duper_assert(n >= 0);
return n >= 0;
}

This is because, if the compiler performs standard flow analysis on foo, it
will collect on the second line of the function the flow analysis fact that
n >= 0.

> While in many situations '<exp> || <undefined behavior>' is formally
> equivalent, it qualifies as a rather obscure hack in my opinion.

I definitely agree with that!

To answer all the posts that replied to mine, I guess all I'm saying is that
instead of thinking of building assert as a language mechanism, we should
think of what's needed for programmers to define their own assert-like
facilities. The answer is guarantees on standard flow analysis. They are
already in place for most compilers (most compilers warn, for example, when
a function never returns from a branch).


Andrei

t...@cs.ucr.edu

unread,
Apr 1, 2004, 1:52:11 PM4/1/04
to
Niklas Matthies <usenet...@nmhq.net> wrote:
+ On 2004-03-29 18:08, t...@cs.ucr.edu wrote:
[...]
+> Suppose that a given C++ implementation were modified so that:
+> (1) When NDEBUG is defined, "assert(<exp>)" expands to say
+> "(<exp>||*0)"
+> (2) Since all's fair when "<exp>" is false, the expression
+> "(<exp>||*0)" generates no code, and
+
+ It does generate code when <exp> produces side effects. I would prefer
+ <exp> to be never evaluated at runtime in NDEBUG mode, just as with
+ the current assert(). And this requires language support.

IIRC, it's always considered bad programming for the argument to
assert to have side effects.
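
A small self-contained illustration of why (check_positive and the counter
are made-up names):

```cpp
#include <cassert>

int call_count = 0;   // counts evaluations of the condition

// The condition has an observable side effect, which is exactly what one
// should not put inside assert(): NDEBUG builds skip the evaluation, so
// the side effect silently disappears in release mode.
bool check_positive(int n)
{
    ++call_count;
    return n > 0;
}

int risky(int n)
{
    assert(check_positive(n));   // evaluated only when NDEBUG is undefined
    return n;
}
```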

+ As I wrote in another post, I think it would be better to have a
+ new primitive that takes an expression that is convertible to bool
+ and tells the compiler "you may assume that if this expression would
+ be evaluated, the result converted to bool would be true".

It's my impression that the Standards committee is very reluctant to
introduce new keywords. Consider, for example, how many times the
keyword "static" has been reused.

+ While in many situations '<exp> || <undefined behavior>' is formally
+ equivalent, it qualifies as a rather obscure hack in my opinion.

Agreed. But it's an "obscure hack" that isn't exposed to public view.
In fact, the assert macro already contains a hack to make it a real
expression equivalent (in all but side effects) to "0", when "NDEBUG"
is undefined.

Tom Payne

Niklas Matthies

unread,
Apr 1, 2004, 8:07:36 PM4/1/04
to
On 2004-04-01 18:52, t...@cs.ucr.edu wrote:
> Niklas Matthies <usenet...@nmhq.net> wrote:
>+ On 2004-03-29 18:08, t...@cs.ucr.edu wrote:
> [...]
>+> Suppose that a given C++ implementation were modified so that:
>+> (1) When NDEBUG is defined, "assert(<exp>)" expands to say
>+> "(<exp>||*0)"
>+> (2) Since all's fair when "<exp>" is false, the expression
>+> "(<exp>||*0)" generates no code, and
>+
>+ It does generate code when <exp> produces side effects. I would prefer
>+ <exp> to be never evaluated at runtime in NDEBUG mode, just as with
>+ the current assert(). And this requires language support.
>
> IIRC, it's always considered bad programming for the argument to
> assert to have side effects.

Yes. But consider (in NDEBUG mode):

int n = ...
assert(is_prime(n));
... // n not modified here
bool b = is_prime(n);

If the compiler can see the implementation of is_prime() and can
deduce that it is a pure function (no side effects, and the result
only depends on the argument), then it can optimize away both calls
and translate the last line as:

bool b = true;

But if it can't deduce that, then *both* calls have to be performed.
In particular, the first one has to be performed *even though* the
information provided by the assert() has become useless in that case.

So, the question is not whether the expression actually has side
effects or not, but whether the compiler can see that it has no side
effects. Therefore I would rather let the compiler not have the
expression unnecessarily evaluated just-in-case. Otherwise it becomes
a pessimization rather than an optimization.
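
For concreteness, a sketch of an is_prime() of the kind meant above (trial
division; any pure implementation would do):

```cpp
// A pure function in the sense discussed above: no side effects, and the
// result depends only on the argument. Whether the compiler can prove
// that depends on whether it can see this definition.
bool is_prime(int n)
{
    if (n < 2)
        return false;
    for (int d = 2; d * d <= n; ++d)
        if (n % d == 0)
            return false;
    return true;
}
```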

>+ As I wrote in another post, I think it would be better to have a
>+ new primitive that takes an expression that is convertible to bool
>+ and tells the compiler "you may assume that if this expression would
>+ be evaluated, the result converted to bool would be true".
>
> It's my impression that the Standards committee is very reluctant to
> introduce new keywords. Consider, for example, how many times the
> keyword "static" has been reused.

It doesn't need to be a keyword. Like with assert(), a macro would be
fine.

>+ While in many situations '<exp> || <undefined behavior>' is formally
>+ equivalent, it qualifies as a rather obscure hack in my opinion.
>
> Agreed. But it's an "obscure hack" that isn't exposed to public view.

IMHO the facility _should_ be exposed separately, i.e. not just as the
behavior of assert() in NDEBUG-mode. Assert() doesn't allow very
fine-grained control, so many will want the behavior available for
building their own assert-like facilities. Currently it's quite
compiler-specific which UB expressions, if any at all, have the
desired effect on compilation. There would likely be great
improvements with a dedicated facility.

And, of course, the UB hack doesn't work when we don't want the
expression to be evaluated.

-- Niklas Matthies

John Nagle

unread,
Apr 1, 2004, 8:07:50 PM4/1/04
to
Here's the problem:

Example:

char tab[100];

void printstars(int fd, int n)
{ for (int i=0; i<n; i++)
{ assert(i < 100); // in practice, the assert is in some collection class
tab[i] = '*';
}
write(fd,tab,n);
}

Now clearly this will fail at the assert if n>100,
just before it would subscript out of range. But
the assert is executed on every iteration of the loop,
which is safe but inefficient.

One would like this optimized to

char tab[100];

void printstars(int fd, int n)
{ assert(n<=100); // hoisted by compiler
for (int i=0; i<n; i++)
{
tab[i] = '*';
}
write(fd,tab,n);
}

which is far more efficient. But that code will fail "early",
before any of the loop iterations are executed. The final
value in "tab" will be different than if the program were
allowed to run up to the failure point. So the
compiler can't perform that optimization unless it knows
more about "assert".

If we get this right, it will be possible to put many more
asserts in collection classes without impacting inner loop
performance.

John Nagle

Niklas Matthies

unread,
Apr 1, 2004, 11:17:02 PM4/1/04
to
On 2004-04-02 01:07, John Nagle wrote:
> Here's the problem:
<snip>

Just to avoid confusion: What John Nagle is proposing is different
from what I am proposing. Both suggestions are quite orthogonal to
each other--John Nagle's applies to non-NDEBUG mode, mine to NDEBUG
mode.

-- Niklas Matthies

Ian McCulloch

unread,
Apr 2, 2004, 10:38:38 AM4/2/04
to
John Nagle wrote:

But it is possible for the compiler to hoist the comparison out of the loop
anyway, even if it knows nothing about assert() :

void printstars(int fd, int n)
{

int LoopEnd = std::min(n, 100);
for (int i=0; i<LoopEnd; i++)
{
assert(true);
tab[i] = '*';
}
for (int i=LoopEnd; i<n; i++)
{
assert(false);
tab[i] = '*';
}
write(fd,tab,n);
}

To my non-compiler-implementer's eye, this doesn't look any harder to
recognize than the case where the compiler knows that assert() is special;
as a bonus it doesn't (potentially) change the observable behaviour of the
program, or require any changes in the standard.

Cheers,
Ian McCulloch

t...@cs.ucr.edu

unread,
Apr 2, 2004, 10:39:44 AM4/2/04
to
Niklas Matthies <usenet...@nmhq.net> wrote:
+ On 2004-04-01 18:52, t...@cs.ucr.edu wrote:
[...]
+> IIRC, it's always considered bad programming for the argument to
+> assert to have side effects.
+
+ Yes. But consider (in NDEBUG mode):
+
+ int n = ...
+ assert(is_prime(n));
+ ... // n not modified here
+ bool b = is_prime(n);
+
+ If the compiler can see the implementation of is_prime() and can
+ deduce that it is a pure function (no side effects, and the result
+ only depends on the argument), then it can optimize away both calls
+ and translate the last line as:
+
+ bool b = true;
+
+ But if it can't deduce that, then *both* calls have to be performed.
+ In particular, the first one has to be performed *even though* the
+ information provided by the assert() has become useless in that case.
+
+ So, the question is not whether the expression actually has side
+ effects or not, but whether the compiler can see that it has no side
+ effects. Therefore I would rather let the compiler not have the
+ expression unnecessarily evaluated just-in-case. Otherwise it becomes
+ a pessimization rather than an optimization.
+
[...]
+ IMHO the facility _should_ be exposed separately, i.e. not just as the
+ behavior of assert() in NDEBUG-mode. Assert() doesn't allow very
+ fine-grained control, so many will want the behavior available for
+ building their own assert-like facilities. Currently it's quite
+ compiler-specific which UB expressions, if any at all, have the
+ desired effect on compilation. There would likely be great
+ improvements with a dedicated facility.
+
+ And, of course, the UB hack doesn't work when we don't want the
+ expression to be evaluated.

How about the idiom of defining a simple macro, assume(exp), that
expands to "(((exp)||*0),0)" when NDEBUG and SMARTCOMPILER are defined
and to "assert(exp)" otherwise.

As you point out, in NDEBUG-and-SMARTCOMPILER mode the evaluation of exp
will be suppressed only if the compiler can determine that exp has no
side effects. So, for expressions that call external functions, the
programmer can use assert.

Tom Payne

Niklas Matthies

unread,
Apr 2, 2004, 4:24:14 PM4/2/04
to
On 2004-04-01 18:16, "Andrei Alexandrescu (See Website for Email)" wrote:
> "Niklas Matthies" <usenet...@nmhq.net> wrote in message
>> As I wrote in another post, I think it would be better to have a
>> new primitive that takes an expression that is convertible to bool
>> and tells the compiler "you may assume that if this expression would
>> be evaluated, the result converted to bool would be true".
>
> Let me try to explain that there is no need for a new primitive.
> Consider this function:
>
> bool foo(int n) {
> if (!(n >= 0)) abort();
> return n >= 0;
> }
>
> I claim that the compiler only needs to know that abort() never
> returns in order to optimize this function just as well as:
>
> bool bar(int n) {
> super_duper_assert(n >= 0);
> return n >= 0;
> }
>
> This is because, if the compiler performs standard flow analysis on
> foo, it will collect on the second line of the function the flow
> analysis fact that n >= 0.

The difference is that the compiler can't optimize away the test and
the abort() call in the first version (unless it happens to know from
global flow analysis that n >= 0 always holds for any actual call to
foo()). It _can_ optimize foo() to:

bool foo(int n) {
if (!(n >= 0)) abort();

return true;
}

No debate about this. But the "super_duper_assert" version is intended
to let the compiler *always* optimize the function to

bool bar(int n) {
return true;
}

in NDEBUG mode, because the programmer explicitly tells the compiler
"look, I promise that n >= 0 will always hold for calls to bar()".
See the difference?


More generally: For any given boolean expression (at a specific
position in source code), flow analysis results into one of the
following:

T: the expression will always be true

F: the expression will always be false

TX: there are control paths for which the expression will be
true, but for the other control paths nothing can be said

FX: there are control paths for which the expression will be
false, but for the other control paths nothing can be said

TFX: there are control paths for which the expression will be
true, and some others for which the expression will be false,
but for the remaining control paths nothing can be said

X: nothing can be said for any of the control paths

Let's assume the programmer believes that the expression will always
be true. In general, one chooses between the following two policies:

1) Trust the programmer, unless the compiler can prove at
translation time that the programmer is wrong (i.e. when
F or FX or TFX).

2) Don't trust the programmer, so if it can't be proven at
translation time that the programmer is correct, then perform
a check at runtime and complain loudly if it fails (i.e. when
not T).

Assert() in non-NDEBUG mode caters to policy 2. Assert() in NDEBUG
mode sort-of caters to policy 1, only that the compiler is not
actually allowed to trust the programmer (i.e. to assume that the
assert condition is true) nor to complain (= fail compilation) if it
can prove that the programmer is wrong.

Consider the simple (if bogus) program:

#define NDEBUG
#include <cassert>
#include <cstdlib>

int main(int argc, char * argv[])
{
assert(argc == 1);
return argc == 1 ? EXIT_SUCCESS : EXIT_FAILURE;
}

The compiler is _not_ allowed to optimize main() to

int main(int argc, char * argv[])
{
return EXIT_SUCCESS;
}

Neither is it allowed to do so with your abort() solution:

int main(int argc, char * argv[])
{
if (argc != 1) abort();
return argc == 1 ? EXIT_SUCCESS : EXIT_FAILURE;
}

It can only optimize it to

int main(int argc, char * argv[])
{
if (argc != 1) abort();
return EXIT_SUCCESS;
}

And now consider the UB solution:

int main(int argc, char * argv[])
{
(argc == 1) || (* (int *) 0 = 5);
return argc == 1 ? EXIT_SUCCESS : EXIT_FAILURE;
}

This - finally - lets the compiler optimize main to:

int main(int argc, char * argv[])
{
return EXIT_SUCCESS;
}

And I believe that "hello compiler, you're allowed to trust me that
<expr> is true" is a common enough thing for a programmer to say to
warrant a generic, articulate language construct for this purpose:

int main(int argc, char * argv[])
{
__assume(argc == 1);
return argc == 1 ? EXIT_SUCCESS : EXIT_FAILURE;
}
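
At least one compiler already ships an intrinsic in this spirit (MSVC's
__assume); a sketch of a wrapper, with a deliberately do-nothing fallback
elsewhere (ASSUME_TRUE is a made-up name):

```cpp
// Hypothetical wrapper name; only the MSVC branch uses a real intrinsic,
// everywhere else the macro deliberately expands to nothing.
#if defined(_MSC_VER)
#define ASSUME_TRUE(e) __assume(e)
#else
#define ASSUME_TRUE(e) ((void)0)
#endif

int classify(int n)
{
    ASSUME_TRUE(n >= 0);         // a promise to the optimizer, not a check
    return n >= 0 ? 1 : 0;
}
```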

-- Niklas Matthies

Andrei Alexandrescu (See Website for Email)

unread,
Apr 2, 2004, 4:24:20 PM4/2/04
to
"John Nagle" <na...@animats.com> wrote in message
news:%Z%ac.17227$bL3....@newssvr27.news.prodigy.com...
> Here's the problem:
[snip]

> So the
> compiler can't perform that optimization unless it knows
> more about "assert".

Sorry, this is not the case.

All the compiler needs to know in your examples is that when passing more
than 100 as an argument, the program will terminate abruptly. That's _all_
it needs to know. Knowledge about abort() terminating the program and a
simple if statement is everything that's needed.

void printstars(int fd, int n)
{ for (int i=0; i<n; i++)

{ if (!(i < 100)) abort();
tab[i] = '*';
}
write(fd,tab,n);
}

In the example above, if the compiler knows abort() actually aborts the
program (reasonable since it's a standard library function), it is able to
hoist the test and perform it only once.
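
Spelled out by hand, the hoisted form could look like this (a sketch, not
compiler output; the fd parameter and the final write() are dropped to keep
it self-contained):

```cpp
#include <algorithm>
#include <cstdlib>

char tab[100];

// Hand-written sketch of the hoisted code: the (i < 100) test runs once,
// before the loop, and abort() fires only after the same 100 stores the
// original per-iteration check would have allowed.
void printstars_hoisted(int n)
{
    int limit = std::min(n, 100);
    for (int i = 0; i < limit; i++)
        tab[i] = '*';
    if (n > 100)
        std::abort();
}
```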

Niklas Matthies

unread,
Apr 2, 2004, 6:16:26 PM4/2/04
to
On 2004-04-02 15:39, t...@cs.ucr.edu wrote:
> Niklas Matthies <usenet...@nmhq.net> wrote:
>+ On 2004-04-01 18:52, t...@cs.ucr.edu wrote:
:

> How about the idiom of defining a simple macro, assume(exp), that
> expands to "(((exp)||*0),0)" when NDEBUG and SMARTCOMPILER are defined
> and to "assert(exp)" otherwise.
>
> As you point out, in NDEBUG-and-SMARTCOMPILER mode the evaluation exp
> will be suppressed only if the compiler can determine that exp has no
> side effects. So, for expression that call external functions, the
> programmer can use assert.

When writing assume()/assert(), I don't want to think about whether
the expression involves such calls or not, especially since whether a
function is inline or not can change over the code's lifetime, and for
template code can also depend on the compilation unit where it's
instantiated. I simply want to say "hello compiler, you're free to
assume that the following condition holds (and free to abort
compilation if you can prove that it doesn't hold)".

I'd also like the facility to not depend on any compilation mode or
macro being defined. For example, one might want to select the mode
depending on a template parameter (think policy traits).

-- Niklas Matthies

John Nagle

unread,
Apr 2, 2004, 6:17:28 PM4/2/04
to
Is NDEBUG in the standard?

John Nagle
Animats

John Nagle

unread,
Apr 2, 2004, 7:10:45 PM4/2/04
to
The question is what happens BEFORE the program aborts.
How much work can the program omit before aborting, once it's
clear that the program is doomed to abort.
Presumably you couldn't hoist an assert through a function
call that did output.

It's not clear why making "abort" special is better
than making "assert" special. Right now, compilers
don't know much about either.

John Nagle

Andrei Alexandrescu (See Website for Email) wrote:

Dave Harris

unread,
Apr 4, 2004, 1:25:03 PM4/4/04
to
na...@animats.com (John Nagle) wrote (abridged):

> 1. Move unsafe library functions to new header files.

Rather than break existing code, perhaps you could add a new header
which contains declarations which make the unsafe functions
illegal to use. Eg:

// DisableUnsafeFunctions.h
enum unsafe_ { unsafe };
char *strcat( char *a, const char *b, unsafe_ = unsafe );

// Calls like strcat( dest, src ) are now ambiguous.

For the next standard, it is probably better to have the functions
marked as deprecated. The advantage of DisableUnsafeFunctions.h is that
it does not require language changes, so you can start using it
straight away.

-- Dave Harris, Nottingham, UK

Nicola Musatti

unread,
Apr 5, 2004, 11:30:50 AM4/5/04
to
SeeWebsit...@moderncppdesign.com ("Andrei Alexandrescu (See Website for Email)") wrote in message news:<c4hhq8$2h92ci$1...@ID-14036.news.uni-berlin.de>...
[...]

> To answer all the posts that replied to mine, I guess all I'm saying is that
> instead of thinking of building assert as a language mechanism, we should
> think of what's needed for programmers to define their own assert-like
> facilities. The answer is guarantees on standard flow analysis. They are
> already in place for most compilers (most compilers warn, for example, when
> a function never returns from a branch).

I agree with you. In fact I believe this is an area where the C
heritage has probably damaged C++. In the traditional C way of
thinking the programmer always does flow analysis better than the
compiler. Thus we have libraries that always leave checks to the
programmer, instead of the IMO superior approach of always checking in
the library and providing ways to disable checks locally (did anybody
> mention twenty-year-old Pascal compilers?)

With a correct combination of heuristics (the compiler "knows" the
standard library), inlining and flow analysis all a programmer would
be left to do is to explicitly disable checks that are too expensive
to perform and that couldn't be removed/hoisted by the compiler.

Any assert facility would benefit from the same approach too.

Cheers,
Nicola Musatti

t...@cs.ucr.edu

unread,
Apr 5, 2004, 2:31:21 PM4/5/04
to
"Andrei Alexandrescu (See Website for Email)" <SeeWebsit...@moderncppdesign.com> wrote:
+ "Niklas Matthies" <usenet...@nmhq.net> wrote in message
+> As I wrote in another post, I think it would be better to have a
+> new primitive that takes an expression that is convertible to bool
+> and tells the compiler "you may assume that if this expression would
+> be evaluated, the result converted to bool would be true".
+
+ Let me try to explain that there is no need for a new primitive. Consider
+ this function:
+
+ bool foo(int n) {
+ if (!(n >= 0)) abort();
+ return n >= 0;
+ }
+
+ I claim that the compiler only needs to know that abort() never returns in
+ order to optimize this function just as well as:
+
+ bool bar(int n) {
+ super_duper_assert(n >= 0);
+ return n >= 0;
+ }
+
+ This is because, if the compiler performs standard flow analysis on foo, it
+ will collect on the second line of the function the flow analysis fact that
+ n >= 0.

I'm obviously missing something here. IIUC, the point is to suppress
the run-time evaluation of the condition, in this case "n>=0".
Knowledge that abort never returns is insufficient to suppress that
evaluation in the first case, since abort() is a real function with
real and distinct side effects (e.g., a special return to the OS that
causes a possible core dump). So, there needs to be a run-time
decision whether or not to call it.

+> While in many situations '<exp> || <undefined behavior>' is formally
+> equivalent, it qualifies as a rather obscure hack in my opinion.
+
+ I definitely agree with that!
+
+ To answer all the posts that replied to mine, I guess all I'm saying is that
+ instead of thinking of building assert as a language mechanism, we should
+ think of what's needed for programmers to define their own assert-like
+ facilities. The answer is guarantees on standard flow analysis. They are
+ already in place for most compilers (most compilers warn, for example, when
+ a function never returns from a branch).

Niklas made a very important point in his followup to my response.
To suppress the evaluation of the condition via a facility that is not
part of the core language, the compiler must be able to determine that
the condition has no observable side effects. And it's impossible for
a compiler to rule out side effects for independently compiled
functions.

Tom Payne

Niklas Matthies

unread,
Apr 5, 2004, 2:32:03 PM4/5/04
to
On 2004-04-02 23:17, John Nagle wrote:
> Is NDEBUG in the standard?

Yes, via reference to the C standard.
It's also mentioned explicitly in 17.4.2.1p2.

-- Niklas Matthies
--
A: Top posters.
Q: What is the most annoying thing on Usenet?

John Nagle

unread,
Apr 5, 2004, 5:16:00 PM4/5/04
to
Nicola Musatti wrote:
> I agree with you. In fact I believe this is an area where the C
> heritage has probably damaged C++. In the traditional C way of
> thinking the programmer always does flow analysis better than the
> compiler. Thus we have libraries that always leave checks to the
> programmer, instead of the IMO superior approach of always checking in
> the library and providing ways to disable checks locally (did anybody
> mention twenty year old Pascal compilers)?

True. Today, any compiler that doesn't do flow analysis
for the entire compilation unit is a toy. Compilers that do
flow analysis for the entire program exist. We can expect
C++ compilers to take a global look at, as a minimum, the
entire compilation unit.

John Nagle
Animats

t...@cs.ucr.edu

unread,
Apr 6, 2004, 12:01:47 AM4/6/04
to
"Andrei Alexandrescu" <SeeWebsit...@moderncppdesign.com> wrote:
+ "John Nagle" <na...@animats.com> wrote in message
+ news:9g99c.43288$Ku5....@newssvr25.news.prodigy.com...
+> The committee is considering a special
+> built-in compile time assertion form to make the template
+> people happy. Maybe that could be extended to cover
+> the general case.
+
+ Kewl.
+
+> The point I'm making about "assert" is that if the
+> compiler knows more about it, it can optimize it
+> much more aggressively. See the (snipped) example.
+>
+> What I'm trying to get to is low-cost optimized subscript
+> checking, such as a few advanced Pascal compilers
+> had twenty years ago. See
+>
+> http://doi.acm.org/10.1145/201059.201063
+
+ I understand your point; mine seems to not be as well understood. Again, my
+ point is, it's not assert that is special. The flow analysis of a test (if
+ statement) that terminates the current function (or the whole program)
+ abruptly on one of its branches - that's what's important.
+
+ There is no code sample you can show me that could benefit more from a
+ built-in assert than from a compiler performing standard flow analysis.


Suppose that exp visibly has no side effects.


Case #1:

int main() {
    if ( ! (exp) ) abort();  // "exp" must be evaluated.
    ...                      // Now exp can be assumed true.
}


Case #2:

int main() {
    if ( ! (exp) ) *(int*)0;  // "exp" needn't be evaluated.
    ...                       // Now exp can be assumed true.
}


Case #3:

int main() {
    assert(exp);  // Under Niklas' built-in assert, in NDEBUG mode,
                  // "exp" needn't be evaluated.
    ...           // Now exp can be assumed true.
}


Tom Payne

t...@cs.ucr.edu

unread,
Apr 6, 2004, 1:10:43 PM4/6/04
to
"Andrei Alexandrescu" <SeeWebsit...@moderncppdesign.com> wrote:
+ "Niklas Matthies" <usenet...@nmhq.net> wrote in message
+ news:slrnc6avc2.372...@nmhq.net...

+> On 2004-03-27 03:51, "Andrei Alexandrescu" wrote:
+> > "Niklas Matthies" <usenet...@nmhq.net> wrote in message:
+> :

+> >> You can already get the effect with something like
+> >>
+> >> (condition || <expression that always invokes undefined behavior>)
+> >>
+> >> if the compiler is smart enough, but it would be nice to have this
+> >> more explicitly via assert (and hence more likely to be actually
+> >> exploited by compilers).
+> >
+> > What's wrong with abort() instead the "expression that always
+> > invokes undefined behavior"?
+>
+> The fact that abort() doesn't invoke undefined behavior. :)
+>
+> Consider:
+>
+> // A
+> try { f(); }
+> catch (some_exception const &) { /* ignore */ }
+> catch (...) { abort(); }
+>
+> vs.
+>
+> // B
+> try { f(); }
+> catch (some_exception const &) { /* ignore */ }
+> catch (...) { assert(false); }
+
+ The two examples should be equivalent.

Under Niklas Matthies' proposal, the evaluation of "assert(false)"
invokes undefined behavior, which under a conforming implementation
could include the invocation of nasal demons, i.e., non-conforming
behavior for "abort()".

Michiel Salters

unread,
Apr 7, 2004, 8:10:05 PM4/7/04
to
kuy...@wizard.net (James Kuyper) wrote in message news:<8b42afac.0403...@posting.google.com>...

> rlan...@hotmail.com ("Risto Lankinen") wrote in message news:<vPsac.12785$k4.2...@news1.nokia.com>...
> > <t...@cs.ucr.edu> wrote in message news:c4b52f$1cd$1...@glue.ucr.edu...
> > > t...@cs.ucr.edu wrote:
> > > Presumably, aggressively optimized implementations already attach no
> > > behaviour to occurrences of "(<exp>||*0)" and simply assume that
> > > "<exp>" is true afterward.
> >
> > What entitles the compiler to assume that <exp> is always true
> > in (<exp>||*0)?
>
> If exp is false, then *0 is evaluated, which means that the program
> has undefined behavior. Therefore, anything is allowed, including code
> optimizations based upon an incorrect assumption that exp is true.

The only exception is of course an exception; evaluating <exp> could
throw. In that case, the compiler must still evaluate <exp>.

Eg.

bool foo();
try {
    assert( foo() );
    if ( foo() )
        bar();
    else
        baz();
}
catch (...) { std::cerr << "neither bar nor baz"; }

foo() must be evaluated, but with assert in the core language the
call to baz could be eliminated (given suitable wording).

Regards,
Michiel Salters

t...@cs.ucr.edu

unread,
Apr 8, 2004, 1:16:50 AM4/8/04
to
Michiel Salters <Michiel...@cmg.nl> wrote:
+ kuy...@wizard.net (James Kuyper) wrote in message news:<8b42afac.0403...@posting.google.com>...
+> rlan...@hotmail.com ("Risto Lankinen") wrote in message news:<vPsac.12785$k4.2...@news1.nokia.com>...
+> > <t...@cs.ucr.edu> wrote in message news:c4b52f$1cd$1...@glue.ucr.edu...
+> > > t...@cs.ucr.edu wrote:
+> > > Presumably, aggressively optimized implementations already attach no
+> > > behaviour to occurrences of "(<exp>||*0)" and simply assume that
+> > > "<exp>" is true afterward.
+> >
+> > What entitles the compiler to assume that <exp> is always true
+> > in (<exp>||*0)?
+>
+> If exp is false, then *0 is evaluated, which means that the program
+> has undefined behavior. Therefore, anything is allowed, including code
+> optimizations based upon an incorrect assumption that exp is true.
+
+ The only exception is of course an exception; evaluating <exp> could
+ throw. In that case, the compiler must still evaluate <exp>.

So, if I understand correctly, the proper statement would be:

If the evaluation of an occurrence of a boolean expression <exp>:

(1) has no side effects when evaluating <exp> yields "true"

(2) inevitably leads to undefined behavior, when evaluating <exp>
does not yield the value "true" (e.g., when it throws an
exception or yields "false")

then that occurrence of <exp> need not be evaluated and may simply
be assumed to yield "true".

Right?

Tom Payne

James Kuyper

unread,
Apr 8, 2004, 2:54:22 PM4/8/04
to
t...@cs.ucr.edu wrote in message news:<c52mot$c60$1...@glue.ucr.edu>...

Correct. That assumption may be used to optimize any subsequent (or
previous) code, so long as there are no intervening statements that
are capable of changing the value that 'exp' would have. Which
statements have that capability depends upon the exact nature of
'exp'.

Niklas Matthies

unread,
Apr 9, 2004, 10:45:03 AM4/9/04
to
On 2004-04-08 05:16, t...@cs.ucr.edu wrote:
> Michiel Salters <Michiel...@cmg.nl> wrote:
:
>+ The only exception is of course an exception; evaluating <exp> could
>+ throw. In that case, the compiler must still evaluate <exp>.
>
> So, if I understand correctly, the proper statement would be:
>
> If the evaluation of an occurrence of a boolean expression <exp>:
>
> (1) has no side effects when evaluating <exp> yields "true"
>
> (2) inevitably leads to undefined behavior, when evaluating <exp>
> does not yield the value "true" (e.g., when it throws an
> exception or yields "false")
>
> then that occurrence of <exp> need not be evaluated and may simply
> be assumed to yield "true".

I think (2) is slightly wrong. As a reminder, we are talking about:

<exp> || <UB>

If <exp> throws, then, besides not evaluating to true, <UB> is _not_
reached. In other words, throwing has the same effect as "true" as far
as the undefined behavior is concerned.

All this only reinforces my impression that <exp> || <UB> is just
a troublesome hack. When a programmer writes assert(<exp>), the
expectation is that (a) <exp> has conceptually no side effects
(it might have actual ones, such as caching through mutable members),
(b) does not throw and (c) yields true. The implementation should be
allowed to assume all three, and consequently never be required to
evaluate <exp>.

-- Niklas Matthies

Michiel Salters

unread,
Apr 13, 2004, 3:33:24 PM4/13/04
to
Michiel...@cmg.nl (Michiel Salters) wrote in message news:<cefd6cde.0404...@posting.google.com>...

> kuy...@wizard.net (James Kuyper) wrote in message news:<8b42afac.0403...@posting.google.com>...
> > rlan...@hotmail.com ("Risto Lankinen") wrote in message news:<vPsac.12785$k4.2...@news1.nokia.com>...
> > > <t...@cs.ucr.edu> wrote in message news:c4b52f$1cd$1...@glue.ucr.edu...
> > > > t...@cs.ucr.edu wrote:
> > > > Presumably, aggressively optimized implementations already attach no
> > > > behaviour to occurrences of "(<exp>||*0)" and simply assume that
> > > > "<exp>" is true afterward.
> > >
> > > What entitles the compiler to assume that <exp> is always true
> > > in (<exp>||*0)?
> >
> > If exp is false, then *0 is evaluated, which means that the program
> > has undefined behavior. Therefore, anything is allowed, including code
> > optimizations based upon an incorrect assumption that exp is true.
>
> The only exception is of course an exception; evaluating <exp> could
> throw. In that case, the compiler must still evaluate <exp>.

And as an extension, in the sequence of statements { stmt(); *0; },
the compiler may assume that stmt() doesn't return normally; it must
either terminate or throw.

Regards,
Michiel Salters

t...@cs.ucr.edu

unread,
Apr 13, 2004, 11:13:16 PM4/13/04
to
Niklas Matthies <usenet...@nmhq.net> wrote:
: On 2004-04-08 05:16, t...@cs.ucr.edu wrote:
[...]
:> So, if I understand correctly, the proper statement would be:
:>
:> If the evaluation of an occurrence of a boolean expression <exp>:
:>
:> (1) has no side effects when evaluating <exp> yields "true"
:>
:> (2) inevitably leads to undefined behavior, when evaluating <exp>
:> does not yield the value "true" (e.g., when it throws an
:> exception or yields "false")
:>
:> then that occurrence of <exp> need not be evaluated and may simply
:> be assumed to yield "true".
:
: I think (2) is slightly wrong. As a reminder, we are talking about:
:
: <exp> || <UB>
:
: If <exp> throws, then, besides not evaluating to true, <UB> is _not_
: reached. In other words, throwing has the same effect as "true" as far
: as the undefined behavior is concerned.

I disagree. Part (2) reads:
:> (2) inevitably leads to undefined behavior, when evaluating <exp>
:>     does not yield the value "true" (e.g., when it throws an
:>     exception or yields "false")

and the key word is "inevitably", which includes cases where <exp> throws.

: All this only reinforces my impression that <exp> || <UB> is just
: a troublesome hack.

In principle, that "hack" can be used to give useful information to
the compiler without change to the Standard.

: When a programmer writes assert(<exp>), the
: expectation is that (a) <exp> has conceptually no side effects
: (it might have actual ones, such as caching through mutable members),
: (b) does not throw and (c) yields true. The implementation should be
: allowed to assume all three, and consequently never be required to
: evaluate <exp>.

I agree, but if the programmer makes (a) and (b) visibly so, then the
"hack" allows the implementation to assume (c).

Tom Payne

Niklas Matthies

unread,
Apr 14, 2004, 12:49:33 PM4/14/04
to
On 2004-04-14 03:13, t...@cs.ucr.edu wrote:
> Niklas Matthies <usenet...@nmhq.net> wrote:
>: On 2004-04-08 05:16, t...@cs.ucr.edu wrote:
:

>: I think (2) is slightly wrong. As a reminder, we are talking about:
>:
>: <exp> || <UB>
>:
>: If <exp> throws, then, besides not evaluating to true, <UB> is _not_
>: reached. In other words, throwing has the same effect as "true" as far
>: as the undefined behavior is concerned.
>
> I disagree. Part (2) reads:
>:> (2) inevitably leads to undefined behavior, when evaluating <exp>
>:> does not yield the value "true" (e.g., when it throws an
>:> exception or yields "false")
> and the key word is "inevitably", which includes cases where <exp>
> throws.

How does, in current C++, throwing inevitably lead to undefined
behavior? It doesn't.

>: When a programmer writes assert(<exp>), the
>: expectation is that (a) <exp> has conceptually no side effects
>: (it might have actual ones, such as caching through mutable members),
>: (b) does not throw and (c) yields true. The implementation should be
>: allowed to assume all three, and consequently never be required to
>: evaluate <exp>.
>
> I agree, but if the programmer makes (a) and (b) visibly so, then the
> "hack" allows the implementation to assume (c).

Yes; my point is (1) that the construct becomes less useful if the
programmer has to make (a) and (b) "visible" to make it work, and
(2) that if the construct is not taken advantage of because either
the compiler has not enough information (this can depend, among other
things, on implementation details of the expression, which might
change later on) or just isn't smart enough, then the expression is
needlessly executed, which constitutes a pessimization instead of the
expected optimization, thereby thwarting the purpose of NDEBUG mode.

It may be better than nothing, but it's only a poor substitute for
what it could be.

-- Niklas Matthies

James Kuyper

unread,
Apr 14, 2004, 8:39:00 PM4/14/04
to
t...@cs.ucr.edu wrote in message news:<c5i5oh$ssm$2...@glue.ucr.edu>...
> Niklas Matthies <usenet...@nmhq.net> wrote:
..

> : All this only reinforces my impression that <exp> || <UB> is just
> : a troublesome hack.
>
> In principle, that "hack" can be used to give useful information to
> the compiler without change to the Standard.

Agreed. However, I doubt that there's any real compiler that attempts
to recognize that fact, at least not in the most general case. And
would you really contend that this method of providing the information
is the method that should be standardized?

t...@cs.ucr.edu

unread,
Apr 15, 2004, 1:47:24 PM4/15/04
to
James Kuyper <kuy...@wizard.net> wrote:
: t...@cs.ucr.edu wrote in message news:<c5i5oh$ssm$2...@glue.ucr.edu>...

:> Niklas Matthies <usenet...@nmhq.net> wrote:
: ..
:> : All this only reinforces my impression that <exp> || <UB> is just
:> : a troublesome hack.
:>
:> In principle, that "hack" can be used to give useful information to
:> the compiler without change to the Standard.
:
: Agreed. However, I doubt that there's any real compiler that attempts
: to recognize that fact, at least not in the most general case.

For my purposes, all I need is that it would be easy for a compiler to
do so.

: And would you really contend that this method of providing the
: information is the method that should be standardized?

The point is that this "hack" need not be standardized -- it already
exists. Standardization should specify that an occurrence of
"assume(<exp>)" generates no code and implies that <exp> has nonzero
value, until some parameter of <exp> might have changed value.

Tom Payne

t...@cs.ucr.edu

unread,
Apr 15, 2004, 1:47:33 PM4/15/04
to
Niklas Matthies <usenet...@nmhq.net> wrote:

: On 2004-04-14 03:13, t...@cs.ucr.edu wrote:
:> Niklas Matthies <usenet...@nmhq.net> wrote:
:>: On 2004-04-08 05:16, t...@cs.ucr.edu wrote:
: :
:>: I think (2) is slightly wrong. As a reminder, we are talking about:
:>:
:>: <exp> || <UB>
:>:
:>: If <exp> throws, then, besides not evaluating to true, <UB> is _not_
:>: reached. In other words, throwing has the same effect as "true" as far
:>: as the undefined behavior is concerned.
:>
:> I disagree. Part (2) reads:
:>:> (2) inevitably leads to undefined behavior, when evaluating <exp>
:>:> does not yield the value "true" (e.g., when it throws an
:>:> exception or yields "false")
:> and the key word is "inevitably", which includes cases where <exp>
:> throws.
:
: How does, in current C++, throwing inevitably lead to undefined
: behavior? It doesn't.

Please note that part (2) is a hypothesis, not an assertion:

If the evaluation of an occurrence of a boolean expression <exp>:

(1) has no side effects when evaluating <exp> yields "true"

(2) inevitably leads to undefined behavior, when evaluating <exp>
    does not yield the value "true" (e.g., when it throws an
    exception or yields "false")

then that occurrence of <exp> need not be evaluated and may simply
be assumed to yield "true".

:>: When a programmer writes assert(<exp>), the
:>: expectation is that (a) <exp> has conceptually no side effects
:>: (it might have actual ones, such as caching through mutable members),
:>: (b) does not throw and (c) yields true. The implementation should be
:>: allowed to assume all three, and consequently never be required to
:>: evaluate <exp>.
:>
:> I agree, but if the programmer makes (a) and (b) visibly so, then the
:> "hack" allows the implementation to assume (c).
:
: Yes; my point is (1) that the construct becomes less useful if the
: programmer has to make (a) and (b) "visible" to make it work, and
: (2) that if the construct is not taken advantage of because either
: the compiler has not enough information (this can depend, among other
: things, on implementation details of the expression, which might
: change later on) or just isn't smart enough, then the expression is
: needlessly executed, which constitutes a pessimization instead of the
: expected optimization, thereby thwarting the purpose of NDEBUG mode.

NDEBUG has no bearing on this construct.

: It may be better than nothing, but it's only a poor substitute for
: what it could be.

Right. It's better than nothing, and it's available here and now.

And, once the committee gets around to approving "assume(<exp>)", C++
will no longer need a built-in function that permits conforming
implementations to assume without checking that <exp> has a non-zero
value -- that's still better.

Tom Payne

Niklas Matthies

unread,
Apr 15, 2004, 6:08:44 PM4/15/04
to
On 2004-04-15 17:47, t...@cs.ucr.edu wrote:
> Niklas Matthies <usenet...@nmhq.net> wrote:
:
>: How does, in current C++, throwing inevitably lead to undefined
>: behavior? It doesn't.
>
> Please note that part (2) is a hypothesis, not an assertion:
>
> If the evaluation of an occurrence of a boolean expression <exp>:
>
> (1) has no side effects when evaluating <exp> yields "true"
>
> (2) inevitably leads to undefined behavior, when evaluating <exp>
> does not yield the value "true" (e.g., when it throws an
> exception or yields "false")
>
> then that occurrence of <exp> need not be evaluated and may simply
> be assumed to yield "true".

Ok, I finally get it. I thought that the whole statement was meant to
apply (only) to the '<exp> || <UB>' construct, and that "inevitably
leads to undefined behavior" was meant to refer to the '<UB>' in
'<exp> || <UB>', and nothing else.

But what you meant was that in any code equivalent to

try
{
   ...
   bool const b = <exp>;
   ... // non-forking, finite-time code path
   if (b)
   {
      ... // non-forking, finite-time code path
      <UB>
      ...
   }
   ...
}
catch (<any exception <exp> might throw>)
{
   ... // non-forking, finite-time code path
   <UB>
   ...
}

<exp> need not be evaluated and may simply be assumed to yield "true".

So, in other words, you propose to define

#define assume(exp) \
do { try { ((exp) || <UB>); } catch (...) { <UB>; } } while (0)

instead of

#define assume(exp) ((exp) || <UB>)

?

>:>: When a programmer writes assert(<exp>), the
>:>: expectation is that (a) <exp> has conceptually no side effects
>:>: (it might have actual ones, such as caching through mutable members),
>:>: (b) does not throw and (c) yields true. The implementation should be
>:>: allowed to assume all three, and consequently never be required to
>:>: evaluate <exp>.
>:>
>:> I agree, but if the programmer makes (a) and (b) visibly so, then the
>:> "hack" allows the implementation to assume (c).
>:
>: Yes; my point is (1) that the construct becomes less useful if the
>: programmer has to make (a) and (b) "visible" to make it work, and
>: (2) that if the construct is not taken advantage of because either
>: the compiler has not enough information (this can depend, among other
>: things, on implementation details of the expression, which might
>: change later on) or just isn't smart enough, then the expression is
>: needlessly executed, which constitutes a pessimization instead of the
>: expected optimization, thereby thwarting the purpose of NDEBUG mode.
>
> NDEBUG has no bearing on this construct.

Well, not technically, but conceptually. As I outlined in a previous
posting in this thread, assert() in non-NDEBUG mode means (ignoring
observable side-effects for the moment, which we both agree should not
occur in assert() expressions)

don't trust the programmer, therefore evaluate the expression at
runtime unless it can be statically proven that it won't evaluate
to false (and won't have side-effects)

whereas assert() in NDEBUG mode means

trust the programmer, hence no need to evaluate the expression

The whole purpose of the "assume" proposal I brought up was to extend
the latter with

and since it is trusted that the expression, if it would be
evaluated (which it isn't), would yield true, it is okay to use
this assumption in code generation (typically for optimization)
or to refuse translation if it can be statically proven that it
will not evaluate to true

But if we take your proposal of how assume() could be defined in
current C++, then, while getting some of the desired extension, at the
same time we cut back on some of the "trust the programmer" that is
currently there with assert()-in-NDEBUG-mode, since "no need to
evaluate the expression" becomes restricted to specific circumstances
(the ones you specified above) and is being revoked for the general
case.

So while we get a step forward in one direction, we simultaneously
take a step back in a different direction. So it's rather a shift than
an improvement.

>: It may be better than nothing, but it's only a poor substitute for
>: what it could be.
>
> Right. It's better than nothing, and it's available here and now.

It's available (for any practical purposes) once you get enough
compiler writers to have their compilers actually recognize and take
advantage of such constructs. Somehow I don't see this as being a
sufficiently realistic prospect.

-- Niklas Matthies

t...@cs.ucr.edu

unread,
Apr 16, 2004, 12:17:46 AM4/16/04
to
Niklas Matthies <usenet...@nmhq.net> wrote:

: On 2004-04-15 17:47, t...@cs.ucr.edu wrote:
:> Niklas Matthies <usenet...@nmhq.net> wrote:
: :
:>: How does, in current C++, throwing inevitably lead to undefined
:>: behavior? It doesn't.
:>
:> Plase note that part (2) is a hypothesis, not an assertion:
:>
:> If the evaluation of an occurrence of a boolean expression <exp>:
:>
:> (1) has no side effects when evaluating <exp> yields "true"
:>
:> (2) inevitably leads to undefined behavior, when evaluating <exp>
:> does not yield the value "true" (e.g., when it throws an
:> exception or yields "false")
:>
:> then that occurrence of <exp> need not be evaluated and may simply
:> be assumed to yield "true".
:
: Ok, I finally get it. I thought that the whole statement was meant to
: apply (only) to the '<exp> || <UB>' construct, and that "inevitably
: leads to undefined behavior" was meant to refer to the '<UB>' in
: '<exp> || <UB>', and nothing else.

I apologize for the misleading presentation.

: But what you meant was that in any code equivalent to
:
: try
: {
:    ...
:    bool const b = <exp>;
:    ... // non-forking, finite-time code path
:    if (b)
:    {
:       ... // non-forking, finite-time code path
:       <UB>
:       ...
:    }
:    ...
: }
: catch (<any exception <exp> might throw>)
: {
:    ... // non-forking, finite-time code path
:    <UB>
:    ...
: }
:
: <exp> need not be evaluated and may simply be assumed to yield "true".

Yup.

: So, in other words, you propose to define
:
: #define assume(exp) \
: do { try { ((exp) || <UB>); } catch (...) { <UB>; } } while (0)
:
: instead of
:
: #define assume(exp) ((exp) || <UB>)
:
: ?

Right! In fact, that's much better than what I had in mind.

[...]
:> NDEBUG has no bearing on this construct.
:
: Well, not technically, but conceptually. As I outlined in a previous
: posting in this thread, assert() in non-NDEBUG mode means (ignoring
: observable side-effects for the moment, which we both agree should not
: occur in assert() expressions)
:
: don't trust the programmer, therefore evaluate the expression at
: runtime unless it can be statically proven that it won't evaluate
: to false (and won't have side-effects)
:
: whereas assert() in NDEBUG mode means
:
: trust the programmer, hence no need to evaluate the expression
:
: The whole purpose of the "assume" proposal I brought up was to extend
: the latter with
:
: and since it is trusted that the expression, if it would be
: evaluated (which it isn't), would yield true, it is okay to use
: this assumption in code generation (typically for optimization)
: or to refuse translation if it can be statically proven that it
: will not evaluate to true

Agreed!

: But if we take your proposal of how assume() could be defined in
: current C++, then, while getting some of the desired extension, at the
: same time we cut back on some of the "trust the programmer" that is
: currently there with assert()-in-NDEBUG-mode, since "no need to
: evaluate the expression" becomes restricted to specific circumstances
: (the ones you specified above) and is being revoked for the general
: case.

Yup. (Unfortunately.)

: So while we get a step forward in one direction, we simultaneously
: take a step back in a different direction. So it's rather a shift than
: an improvement.

It's a recognition of what we already have. One for which I'm
grateful. Thanks, Niklas. ;-)

:>: It may be better than nothing, but it's only a poor substitute for
:>: what it could be.
:>
:> Right. It's better than nothing, and it's available here and now.
:
: It's available (for any practical purposes) once you get enough
: compiler writers to have their compilers actually recognize and take
: advantage of such constructs. Somehow I don't see this as being a
: sufficiently realistic prospect.

Moore's law and its corollaries reduce most compiler optimization to
irrelevance. Rather, I'm interested in removing the rationale for
such nonsense as C89 6.3 and its remnants in C99 and the C++
standard.

Regards, Tom

Niklas Matthies

unread,
Apr 16, 2004, 11:33:10 AM4/16/04
to
On 2004-04-15 22:08, Niklas Matthies wrote:
:

> But what you meant was that in any code equivalent to
>
> try
> {
>    ...
>    bool const b = <exp>;
>    ... // non-forking, finite-time code path
>    if (b)

This should read

   if (!b)

of course.

>    {
>       ... // non-forking, finite-time code path
>       <UB>
>       ...
>    }
>    ...
> }
> catch (<any exception <exp> might throw>)
> {
>    ... // non-forking, finite-time code path
>    <UB>
>    ...
> }
>
> <exp> need not be evaluated and may simply be assumed to yield "true".

-- Niklas Matthies

Nicolas Pavlidis

unread,
Sep 16, 2004, 6:19:31 PM9/16/04
to
na...@animats.com (John Nagle) writes:

> The committee is considering a special
> built-in compile time assertion form to make the template
> people happy. Maybe that could be extended to cover
> the general case.

I'm not an expert, and maybe I haven't read the thread closely enough,
but what about boost's STATIC_ASSERT (I'm not sure that's its real
name)? I think that is all anyone needs to make assertions at
compile time. For some cases it is quite helpful.

Kind regards,
Nicolas
--
| Nicolas Pavlidis | Elvis Presly: |\ |__ |
| Student of SE & KM | "Into the goto" | \|__| |
| pav...@sbox.tugraz.at | ICQ #320057056 | |
|-------------------University of Technology, Graz----------------|
