
Ecma International Moves to Standardize C++ Binding for CLI


Javier Estrada

Oct 8, 2003, 4:05:44 PM
Can anyone clarify the impact of this press release in the future of
C++?

http://www.ecma-international.org/news/ecma-TG5-PR.htm

From the press release and the figures involved (Herb, EDG,
Dinkumware), it would seem that good things may come from it, but what I'm
interested in is the general--and objective--opinion of the C++
community. In particular, what are the proposed extensions, and how
seamlessly would my code today integrate with the extensions tomorrow?


Regards,

Javier

[ See http://www.gotw.ca/resources/clcm.htm for info about ]
[ comp.lang.c++.moderated. First time posters: Do this! ]

"Martin v. Löwis"

Oct 11, 2003, 11:18:51 AM
Javier Estrada wrote:

> From the press release and the figures involved (Herb, EDG,
> Dinkumware), it would seem that good things may come from it, but what I'm
> interested in is the general--and objective--opinion of the C++
> community. In particular, what are the proposed extensions, and how
> seamlessly would my code today integrate with the extensions tomorrow?

I would expect the outcome of this effort to look very much like managed
C++ today. I.e. you get __gc, __struct, and __pin keywords, attribute
syntax, etc.

I personally dislike these extensions - if .NET is not capable of
executing C++ as-is without extensions, it may well be that .NET is best
used from C# only. If so, extending C++ for it is pointless - any
development for .NET will use C#, anyway.

Regards,
Martin

Javier Estrada

Oct 14, 2003, 4:40:16 AM
> I would expect the outcome of this effort to look very much like managed
> C++ today. I.e. you get __gc, __struct, and __pin keywords, attribute
> syntax, etc.
>
> I personally dislike these extensions - if .NET is not capable of
> executing C++ as-is without extensions, it may well be that .NET is best
> used from C# only. If so, extending C++ for it is pointless - any
> development for .NET will use C#, anyway.

Someone described before that there is a difference between managed
types and managed code. .NET is capable of executing C++ as is--it
compiles it to MSIL--and runs it as managed code.

The MC++ keywords are already there, so I wouldn't expect more
keywords. On the contrary, I'd expect fewer if they're going to the
effort of producing a "binding". I'd rather expect more integration
and compliance from the CLI to the C++ language than
vice versa--obvious, but I can only guess a few things--e.g.,
deterministic destruction of objects, instead of GC. I dislike the
current idioms for "destructors" in the .NET languages when you get them
for free in C++. Other things I would expect would be a seamless
mapping of the generics that .NET will start to offer with
Whidbey onto C++ templates, although this is a tough nut to crack,
given the differences that have been described in some postings.

Could Herb or Dave possibly clarify? C'mon, guys, you're moderators
;-) I think that this is of interest to several of us, not so much for
the .NET side of it as for the C++ aspect.

Regards,

Javier

Herb Sutter

Oct 14, 2003, 5:36:01 PM
I can offer a few comments on this.

As an overall comment, here's my take on the meaning of this work. I think
the most important thing I can say is that I believe this work
demonstrates the continued relevance of C++ on modern platforms.
Specifically:

- Myth #1: Critics of C++ frequently claim that it is a "dying tongue."
The reality is that C++ is still dominant in the industry and its use is
actually still growing moderately (see for example Plum's article in the
October issue of DDJ), and I think this work is further evidence that C++
is alive and well in the industry in general.

- Myth #2: People almost as frequently claim that C++ is no longer
important to Microsoft. The reality is that C++ is the subject of heavy
and continued investment (and internal use) at Microsoft. Other Microsoft
products, including C#, get attention because they're new, which is
understandable. C++ has been a core product for over 10 years and
continues to be widely used inside Microsoft to build its products and
platforms, including the CLR. The C++/CLI work demonstrates Microsoft's
commitment that C++ be a key and first-class language on .NET moving
forward, without compromising ISO conformance (which we just spent most of
a release cycle on with VC++ 2003, aka 7.1, and will keep improving).

- Elegance and stability are both important. The first version of "Managed
C++" had problems that needed fixing, so a revision is necessary; but
stability is also important because people need to know that this is real
and won't just be changed again in another two years. Collaborating with
experts outside Microsoft on the design, and committing to an open
standards process for the extensions, hopefully demonstrates that
Microsoft is committing to this as a long-term design for C++ on .NET that
is suitable for code investment and that will have multiple interoperable
implementations (i.e., not just from Microsoft). After all, the CLI exists
on non-Microsoft platforms today too.


On 11 Oct 2003 11:18:51 -0400, "Martin v. Löwis" <mar...@v.loewis.de>
wrote:


> > I would expect the outcome of this effort to look very much like managed
> > C++ today. I.e. you get __gc, __struct, and __pin keywords, attribute
> > syntax, etc.

Not quite. The existing "Managed C++" design had several problems,
including:

- The syntax and features were uneven, and sometimes just weird. For
example, T* meant three different and incompatible things depending on the
type of T, so you had to remember the differences all the time, and there
was little hope of successfully writing agnostic templates that acted on
T*'s.

- The extensions looked ugly (e.g., __keywords). Although the reason those
names were chosen was that it's the easiest way to respect ISO
conformance, the overwhelming response from users was: "Yuck!" So in our
revised design we are still doing pure conforming extensions (avoiding
taking reserved words), but using different approaches that require more
compiler work but are more usable while preserving conformance (e.g.,
contextual keywords instead of "__" ones).

- The extensions lacked first-class support for some fundamental CLR
features (e.g., __property decorations on individual member functions that
the compiler would cobble together for you, instead of a property
abstraction).

- MC++ followed what one might call a "two worlds" (or perhaps
"schizoid"?) hypothesis -- it did expose the important CLR features, but
only for CLR types, not native types; and it did not expose native C++
features also for CLR types. The two worlds were grafted together, not
integrated. It turns out that "some things work over here, other things
work over there, and it's your job to keep score at home" is just not a
good answer. This has been a constant source of surprises to programmers
who always have to remember the list of things that don't work, or work
differently, when using CLR types.

We ought to do better. We are fixing these problems. For the past year or
so I've been the lead architect for the language design team, and together
with bright people both inside and outside of Microsoft we have come up
with a design for the .NET extensions that meets all of the following
goals:

- provides an elegant (and uniform!) syntax and semantics that give
a natural feel for C++ programmers

- provides first-class support for all current and upcoming .NET
features (e.g., properties, events, garbage collection, generics)
for all types including existing C++ classes

- provides first-class support for C++ features (e.g.,
deterministic destruction, templates) for all types
including .NET classes

- preserves ISO C++ standards conformance (by adding pure
extensions, but also in a way that we don't need to use "__"
keywords)


> > I personally dislike these extensions - if .NET is not capable of
> > executing C++ as-is without extensions, it may well be that .NET is best
> > used from C# only. If so, extending C++ for it is pointless - any
> > development for .NET will use C#, anyway.

I hear you, but let me put it this way: A key driver here is that
developers are increasingly writing programs targeted to modern
environments characterized by a virtual machine with garbage collection.
Leading examples include the JVM and the .NET CLR (ISO CLI, or Common
Language Infrastructure, is the standardized subset of the .NET runtime
environment and frameworks class library). True, Standard C++ has no
notion of now-fundamental concepts like garbage collection. Does that mean
C++ is simply obsolescent? No; it can be cleanly extended to directly
support these modern environments, and there's nothing wrong with that.

What's more, C++ actually offers compelling advantages that can make it
the language of choice for your application, even on the "home turf" of
newer languages specifically designed for these new environments. That's
the kind of thing I get excited about. Really excited about. For a
specific example, see the deterministic destruction notes below.


On 14 Oct 2003 04:40:16 -0400, ljes...@hotmail.com (Javier Estrada)
wrote:


>Someone described before that there is a difference between managed
>types and managed code. .NET is capable of executing C++ as is--it
>compiles it to MSIL--and runs it as managed code.
>
>The MC++ keywords are already there, so I wouldn't expect more
>keywords. On the contrary, I'd expect fewer if they're going to the
>effort of producing a "binding".

Right. This doesn't preclude keyword cleanup. :-)

>I'd rather expect more integration
>and compliance from the CLI to the C++ language than
>vice versa--obvious, but I can only guess a few things--e.g.,
>deterministic destruction of objects, instead of GC.

In C++/CLI, you get "both" deterministic destruction and GC -- both are
important and valued, and I've heard as many C# programmers complain about
the lack of deterministic destruction in C# as I've heard C++ programmers
complain about the lack of GC in C++. C++/CLI delivers both, on a
per-object basis for objects of any type. For example, in C++/CLI you can
take objects from the .NET Frameworks and C# libraries and naturally apply
deterministic destruction to them (by calling delete, or by putting them
on the stack where the dtors/Dispose gets run automatically as per C++
usual).

>I dislike the
>current idioms for "destructors" in the .NET languages when you get them
>for free in C++.

Amen. Consider that the normal mode of progress in computing is that we
discover idioms and then, in later languages/versions, add direct language
support for them. Frankly, I consider it a regression to take an important
feature that already has language support (deterministic destruction) and
demote it to a coding idiom, which is to say fragile and error-prone.

>Other things I would expect would be a seamless
>mapping of the generics that .NET will start to offer with
>Whidbey onto C++ templates, although this is a tough nut to crack,
>given the differences that have been described in some postings.

Done. Generics and templates are overlapping but distinct features, and
each does things well that the other can't touch. Both will be supported
and can be used together. For example, generics will be able to match
template template parameters (but PLEASE don't tell Andrei, you know what
he did with templates -- he's just bound to make this do brain-wrenching
things and discover new and completely unforeseen uses for them).

For those who are interested in seeing the shape of the project, we will
be making the proposed standard's base document (the "starting point"
document) publicly available on November 15. Then, starting in December
2003, the new ECMA committee will take over the base document and craft it
into a consensus standard, which Microsoft will track (i.e., we will
incorporate changes the committee makes into our product). The current
thinking is that the ECMA standard will take about one year to complete
and approve, so we're targeting the end of 2004.

Cheers,

Herb

P.S.: Okay, so now you know why the C++ Coding Standards book is late.
But Andrei and I really are working on it now.

---
Herb Sutter (www.gotw.ca)

Convener, ISO WG21 (C++ standards committee) (www.gotw.ca/iso)
Contributing editor, C/C++ Users Journal (www.gotw.ca/cuj)
Visual C++ architect, Microsoft (www.gotw.ca/microsoft)

Attila Feher

Oct 15, 2003, 8:47:04 AM
Martin v. Löwis wrote:
> Javier Estrada wrote:
>
> > From the press release and the figures involved (Herb, EDG,
> > Dinkumware), it would seem that good things may come from it, but what
> > I'm interested in is the general--and objective--opinion of the C++
> > community. In particular, what are the proposed extensions, and how
> > seamlessly would my code today integrate with the extensions tomorrow?
>
> I would expect the outcome of this effort to look very much like
> managed C++ today. I.e. you get __gc, __struct, and __pin keywords,
> attribute syntax, etc.
>
> I personally dislike these extensions - if .NET is not capable of
> executing C++ as-is without extensions, it may well be that .NET is
> best used from C# only. If so, extending C++ for it is pointless - any
> development for .NET will use C#, anyway.

IMO the extensions are there to provide reuse of C++ code. Basically you
are able to compile your (original) C++ for .NET and provide interfaces
to the C# (or whatever else is used, Python ;-) ) native .NET code. Sort of
"porting C++ libraries to .NET". I am not exactly convinced that Microsoft
wants .NET to be programmed in managed C++. IMO the "ugliness" is one
indicator for it. C# being created is another. However, if Microsoft wants
applications to be ported to .NET (and they do), they have to provide a
relatively easy path for porting - hence managed C++ was created.

--
Attila aka WW

Nemanja Trifunovic

Oct 15, 2003, 3:49:21 PM
> In C++/CLI, you get "both" deterministic destruction and GC -- both are
> important and valued, and I've heard as many C# programmers complain about
> the lack of deterministic destruction in C# as I've heard C++ programmers
> complain about the lack of GC in C++. C++/CLI delivers both, on a
> per-object basis for objects of any type. For example, in C++/CLI you can
> take objects from the .NET Frameworks and C# libraries and naturally apply
> deterministic destruction to them (by calling delete, or by putting them
> on the stack where the dtors/Dispose gets run automatically as per C++
> usual).

Wow!!! We really need that. See what kind of perversions I needed to come
up with to overcome the lack of deterministic finalization with
managed types:

http://www.codeproject.com/managedcpp/managedraii.asp

WW

Oct 16, 2003, 4:57:36 PM
Attila Feher wrote:
[SNIP]

> IMO the extensions are there to provide reuse of C++ code. Basically
> you are able to compile your (original) C++ for .NET and provide
> interfaces to the C# (or whatever else is used, Python ;-) ) native
> .NET code. Sort of "porting C++ libraries to .NET". I am not
> exactly convinced that Microsoft wants .NET to be programmed in
> managed C++. IMO the "ugliness" is one indicator for it. C# being
> created is another. However, if Microsoft wants applications to be
> ported to .NET (and they do), they have to provide a relatively easy
> path for porting - hence managed C++ was created.

Since writing this I have got the mail from Herb's mailing list... and I am
forced to change my position. C++ is there, and it is to stay there, on top
of the CLI - as I understand it now. And as for the current managed C++: it
was a first and somewhat failed attempt. The evolution, as I see it, is
going on at full power.

--
WW aka Attila

Richard Howard

Oct 28, 2003, 10:54:28 AM

smitty_one_each

Oct 29, 2003, 11:43:42 AM
Two questions, Herb:

> For those who are interested in seeing the shape of the project, we will
> be making the proposed standard's base document (the "starting point"
> document) publicly available on November 15. Then, starting in December
> 2003, the new ECMA committee will take over the base document and craft it
> into a consensus standard, which Microsoft will track (i.e., we will
> incorporate changes the committee makes into our product). The current
> thinking is that the ECMA standard will take about one year to complete
> and approve, so we're targeting the end of 2004.

1. Could you elaborate on the relationship between this work and the
C++0x work? (Pronouncing C++0x as 'cooks' hints at some cliche about
broth...)

>
> Cheers,
>
> Herb
>
> P.S.: Okay, so now you know why the C++ Coding Standards book is late.
> But Andrei and I really are working on it now.

2. Can you hint when this bit of reading might be out? Sounds
Spring-ish, at best. Alluding to the 'C++ as a dying tongue' thread,
my eyeball survey has C++ losing the battle of the bookshelf pretty
handily. To answer the 'where is _your_ book?' question in advance, I
just figured out the environment variable subtleties so that I could
compile boost under VC7.1 with bjam.

Domo,
Chris

Chris Wundram

Oct 30, 2003, 9:33:12 AM
> - MC++ followed what one might call a "two worlds" (or perhaps
> "schizoid"?) hypothesis -- it did expose the important CLR features, but
> only for CLR types, not native types; and it did not expose native C++
> features also for CLR types. The two worlds were grafted together, not
> integrated. It turns out that "some things work over here, other things
> work over there, and it's your job to keep score at home" is just not a
> good answer. This has been a constant source of surprises to programmers
> who always have to remember the list of things that don't work, or work
> differently, when using CLR types.

I found this to be one of the most useful features of the current
managed C++ implementation. When I want to develop for the CLR, I
use C#, which has a good mapping of its features to the CLR's
object model. However, when I am writing libraries for the CLR (that
bridge the CLR to older legacy libraries written for C++), I use the
current managed C++ because it allows me to use both object models in
the same code. So I can, for example, write code for CLR objects,
which live in the managed heap and are garbage collected, that uses
older C++ objects, which live in the Win32 heap and are not garbage
collected.

Sure, the current syntax of managed C++ is not pretty, but given that
its primary purpose (in my opinion) is to bridge the old Win32 world
with the new managed world, I think a syntax that maintains a
separation of the two object models is good.

Herb Sutter

Nov 10, 2003, 2:59:41 PM
On 29 Oct 2003 11:43:42 -0500, smitty_...@hotmail.com
(smitty_one_each) wrote:
> 1. Could you elaborate on the relationship between this work and the
>C++0x work.

The two are separate. C++0x is the next version of the ISO C++ standard,
currently under development -- the evolution of the C++ language.

C++/CLI is a binding between that ISO C++ standard and the ISO CLI
standard, a set of pure extensions to ISO C++ with near-zero impact on
existing ISO C++ programs. The goal is to make it easy and natural to use
ISO C++ when writing programs for CLI platforms (e.g., .NET, Rotor, Mono)
-- as easy as, and in some cases easier than, C#.

> 2. Can you hint when this bit of reading might be out? Sounds
>Spring-ish, at best.

That's our current guesstimate too.

Herb

---
Herb Sutter (www.gotw.ca)

Convener, ISO WG21 (C++ standards committee) (www.gotw.ca/iso)
Contributing editor, C/C++ Users Journal (www.gotw.ca/cuj)
Visual C++ architect, Microsoft (www.gotw.ca/microsoft)


Francis Glassborow

Nov 11, 2003, 10:46:12 AM
In article <aejvqv84b6rh9gg3u...@4ax.com>, Herb Sutter
<hsu...@gotw.ca> writes

>On 29 Oct 2003 11:43:42 -0500, smitty_...@hotmail.com
>(smitty_one_each) wrote:
>> 1. Could you elaborate on the relationship between this work and the
>>C++0x work.
>
>The two are separate. C++0x is the next version of the ISO C++ standard,
>currently under development -- the evolution of the C++ language.
>
>C++/CLI is a binding between that ISO C++ standard and the ISO CLI
>standard, a set of pure extensions to ISO C++ with near-zero impact on
>existing ISO C++ programs. The goal is to make it easy and natural to use
>ISO C++ when writing programs for CLI platforms (e.g., .NET, Rotor, Mono)
>-- as easy as, and in some cases easier than, C#.

However, what concerns a number of us is that these 'pure extensions'
will have a negative impact on the future development of C++. I, for
one, do not think that introducing new keywords and/or semantics is the
task of a bindings standard.

Note that I am not claiming that this will happen, but I am saying that
should it, I will need very strong persuasion before encouraging my
National Body to endorse the resulting ECMA Standard as an ISO one.

I know that I am not alone in considering this attempted use of ECMA as
an 'end run', and I think that Microsoft should be very careful before it
continues along this line.

I also think that individuals involved in both processes (ECMA and ISO)
should carefully consider potential conflicts of interest.


--
Francis Glassborow ACCU
If you are not using up-to-date virus protection you should not be reading
this. Viruses do not just hurt the infected but the whole community.

Glen Low

Nov 11, 2003, 12:00:10 PM
> - The extensions lacked first-class support for some fundamental CLR
> features (e.g., __property decorations on individual member functions that
> the compiler would cobble together for you, instead of a property
> abstraction).

I wonder whether you're considering (or have considered) mapping some
C#/.NET features into their more natural (in C++) syntax:

Properties -> proxy classes with operator= and conversion operator
overloads
Indexers -> operator[] overloads.

Essentially this would mean whenever a developer creates a proxy class
with just operator= and a conversion operator (possibly marking it
somehow in the source?), the operator= and conversion are associated
with the equivalent .NET property, so that non-C++ .NET clients could
use them transparently as properties.

I'd imagine what would need to happen under the hood is the get_xxx
and set_xxx methods on the declaring object call through to the
equivalent methods on the proxy directly. Since the proxy would have
to be a public data member (!) to get the same syntax in C# and C++,
it should have the same lifetime as the declaring object.

Indexers could work through temporary proxies.

I don't know whether it's worth the trouble reconciling these two
similar but not identical systems in C++ and .NET. But it would be
cool...

Cheers,
Glen Low, Pixelglow Software
www.pixelglow.com

Troll_King

Nov 11, 2003, 12:02:50 PM
ljes...@hotmail.com (Javier Estrada) wrote in message news:<d522680b.03101...@posting.google.com>...

> > I would expect the outcome of this effort to look very much like managed
> > C++ today. I.e. you get __gc, __struct, and __pin keywords, attribute
> > syntax, etc.
> >
> > I personally dislike these extensions - if .NET is not capable of
> > executing C++ as-is without extensions, it may well be that .NET is best
> > used from C# only. If so, extending C++ for it is pointless - any
> > development for .NET will use C#, anyway.
>
> Could possibly Herb or Dave clarify? C'mon, guys, you're moderators
> ;-) I think that this is of interest to several of us, not for the
> .NET side of it, but from the C++ aspect.
>
> Regards,
>
> Javier

I think that C# is a better language for .Net than Standard C++, and
here are my reasons.

The .Net framework is a product, and its target customers are business
and undergraduates (eventually mostly for Europeans and people in the
developing countries). Solution developers are specializing the
vendor classes that are already implemented, and a simple language
like C# is more than sophisticated enough for solution implementation.
Anything that Microsoft deals in is turned into a product; they
purposely hold back some features of the product so that they can roll
them out later on and charge for upgrades. Microsoft could have used
the Standard C++ language definition as the language of .Net from the
very beginning, but they purposely did not. If anything,
Microsoft will offer a Standard C++ mapping of the intermediate
language in order to associate Standard C++ with solution
implementation as opposed to system implementation (where the focus is
on generalization). They will remove the ability to implement systems
at the operating system layer, and in its place they will let you
specialize with Standard C++ and accomplish the same tasks as C#
currently handles, so that you will migrate to the .Net framework and
their new product line and use a software layer decoupled from the
operating system. Microsoft wants, above all, a more flexible and mobile
architecture that they can make into a product, with more control
over the factors of production (R&D). Now they will do anything to
attract people to the new product line; any language goes--they would
add 'Seal Language' if it would attract enough followers. I think that
Standard C++ for .Net is a bad thing, but I don't know what it all
means, or whether it makes a difference, because I really don't understand
why people invested time into Standard C++ on the Microsoft product in
the first place. And it was all done terribly; your marketing failed
because all of the good examples were closed and unavailable to the
masses. The "truth" out there is that Herbert Schildt invented C++--
especially MSVC++. That is not the reality, but it is the perception;
just look around anywhere outside this circle. If you can't use Standard
C++ and learn--if you are unwilling to learn, to think about the
context--then you will have Standard C++ on the .Net framework and be
able to migrate to the new product line, but you never even had
Standard C++ to begin with; you would be closer to the truth if you used
Seal Language instead. It would be more fitting.

WW

Nov 11, 2003, 5:51:16 PM
Francis Glassborow wrote:
[SNIP]

> However what concerns a number of us is that these 'pure extensions'
> will have a negative impact on the future development of C++. I, for
> one, do not think that introducing new keywords and/or semantics is
> the task of a bindings standard.
>
> Note that I am not claiming that this will happen, but I am saying that
> should it, I will need very strong persuasion before encouraging my
> National Body to endorse the resulting ECMA Standard as an ISO one.
[SNIP]

IMHO it is already showing, if you follow the signs carefully. As I
understood the votes of the evolution group, people want a new, more
class-like, safe enum--and they do not want to touch the existing one. Now
it would be a very logical step to use the "enum class" series of keywords
to define such a thing. But wait! That is already used by the C++/CLI
binding... And I think it is not too far-fetched that people will start
talking about Microsoft hijacking the C++ language via ECMA. I do *not*
say it happens (the hijacking), but considering the paranoia around MS,
and the fact that the proposed syntax does have an effect on the possible
routes the C++ evolution can take, I believe that people will say this.

IMO MS would do better introducing some (one) keyword to put before those
constructs. I think of this as a courtesy to ISO and WG21. Otherwise
C++/CLI might define constructs in a binding which otherwise would have
been taken by the C++ language itself. That would not be a problem in
itself, but it will most probably happen with incompatible semantics.

I want to stress again that I am not in any way against the C++/CLI
binding--on the contrary. I only think that this binding (ahem, the
community making it) should take every possible precaution not to use
constructs which ISO C++ might/could use in its evolution.

With some it is quite simple, because "enum class" could actually be called
"cli enum", "interface class" could be called "cli interface", and so
forth--possibly with another "trigger" word. It might even be possible with
that trigger word remaining free to use as an identifier--if max munch
allows. Although I am really a nobody, and I do not imagine that I can even
hope to have any effect in this quest, I humbly but honestly and strongly
suggest that the ECMA committee consider such a construct (triggerword
enum, triggerword interface, etc.).

The biggest problem is with operator symbols. There is an ongoing
discussion about creating "Java style references" in C++ (comp.std.c++). I
do not really follow it and I have no idea how much real support it has, but
such a thing would/could evidently use the hat (^) symbol for its type
declaration. But as of today CLI has taken that symbol away. The big
question is: will the semantics of the C++/CLI ECMA standard be compatible
with the semantics decided for the evolving C++?

There was real concern voiced by some at the WG21 meeting about the fact
that WG21 (ISO) has no control over the ECMA standardization process. It
was worded rather as cooperation/coordination, but basically it boils down
to control: the ECMA standardization will be finished before WG21/J16
members (working on C++ evolution but having no money to spend on ECMA)
could have a word on its work. My biggest fear is that the unfortunate
happens and C++/CLI (ECMA) will manage to use up C++ constructs that
WG21 later wants to use for the CLI-independent C++. And those things go
into the ECMA standard. And that goes into the ISO fast-track process. And
then some countries will just not accept it. And when those features are
proposed to ISO WG21, Microsoft will be forced to vote OMDB, since they
would kill its efforts. But anyhow, if such a thing happens - even
without national bodies' rejection and OMDB from MS - it will be perceived
as a hijacking of C++ by MS. And that could very badly affect the
future of the language, since it will polarize the ANSI/ISO committee. I
really hope that the honorable effort of making the C++/CLI binding will
not result in something like that. But avoiding this disaster scenario
won't happen by accident, only by deliberate actions taken by the ECMA
team.

IMO the people participating in the ECMA work hold a much bigger
responsibility than first meets the eye. They do not only create a
binding; their work will have various effects on the language itself. I
do not remember this being explicitly mentioned at the WG21 meeting, but I
want to stress that I believe the ECMA group has a task *beyond* making
sure that the CLI-related extensions will not break existing code. They
have to make sure that whatever extensions they add to the language
will not have any limiting effect on the future evolution of the ISO C++
language.

I have already suggested a way of avoiding "overloading" of existing C++
keywords. For places where "collision" is unavoidable, the task is even
more delicate/difficult. The ECMA team has to make sure that they come up
with semantics which can be generalized up to the semantics the main
language may wish to use the construct for. Not an easy task.

We have to be honest: Microsoft is big enough to make sure it will have a
C++/CLI binding. Its interests call for a fast and effective (at surfacing
the CLI features) standardization process. Thanks to Herb it is done very
politely. But make no mistake ((C) G.W.B), it is done in a very determined
way. Microsoft's main interest is in making the CLI available and very easy
to use in C++. But for the larger C++ community (which is already partially
represented in the ECMA committee) it is also important to make sure that
the repercussions for the C++ language and its evolution are minimal, or
even nonexistent if possible. It is no easy task.

As a side note, let me express my thoughts about what I see as unusual about
the CLI/C++ binding. The language bindings I have seen were either pure
library ones, or ones which extended the language with definitely
unambiguous syntactical constructs - ones 99.999% unlikely to collide
with the language itself. As an example I can remember the ESQL/C I
used 12 years ago: each SQL command started with $SQL, ensuring that
C was not affected. Let me recap. A language binding (usually) should:

- add pure library functionality (by "pure" I mean no language changes, such
as leaving out headers and still getting magic library functions)

- add new syntax which "sticks out" of the language, without the slightest
chance of collision

- add further restrictions to the original definitions, without changing them
(for example saying: int is always 32 bits)

The "problem" with the C++/CLI binding is that it is more than a traditional
language binding. And it has to be; it is just a fact. In fact I feel that
Microsoft is actually pushing the C++ language a little bit, not being
satisfied with the pace at which it has embraced new ideas (such as virtual
machines, portable language environments, GC and so forth). I do not mean
pushing as a bad word here.

So C++/CLI is actually *extending* the language. It does not "only"
bind. It will be the first standard introduction of GC into the language...
and many more things. The problem is not with these facts; the problem is
that C++ itself has a (possibly last) language evolution step left, and that
full-scale coordination is nearly impossible because of the very different
pace of the ISO and the ECMA work.

As I have said earlier, I think it would be much "nicer" of ECMA/MS to
introduce a cli (or some other) keyword/magic word (depending on what the
syntax allows) into the language and name its added constructs using that.
I would much rather see "cli enum" than "enum class" as a CLI-dependent
construct. I can see I use a CLI-dependent feature there, and it will
surely not collide with C++ itself. And IMO it won't break code
either - I believe it has less chance of breaking code than enum class has.
And I also believe that it does not take a bit away from the "naturalness"
of the CLI features in the code.

But about the hat (^) symbol I am worried. It seems to be a demanding (if
not impossible) task to define it in a way that its semantics will allow the
use of the same symbol for a possible new language feature of C++. I think
it is indeed a good candidate for some sort of new reference-like type, with
possibly automagically generated factory functions... but that leads far.

I honestly would be very happy and willing to participate in the work of
this ECMA committee. But do not worry: my employer will not sponsor this
and I will never have that kinda money, so you are all safe. :-) But I do
think there will be revolutionary work done there, and I do think that the
committee will have an enormous responsibility in regard to the future of
the C++ language. A nice and challenging task.

So let me ask all of you involved in ECMA to please, please consider what I
wrote, and not only concentrate on not breaking existing code, but also on
not breaking/coercing/hijacking the future direction/evolution the ISO C++
language could take.

--
WW aka Attila

WW

Nov 12, 2003, 4:04:57 AM11/12/03
to
Troll_King wrote:
> I think that C# is a better language for .Net than Standard C++ and
> here are my reasons.
>
> The .Net framework is a product and it's target customer is business
> and undergraduates (eventually mostly for Europeans and people in the
> developing countries). The solutions developers are specializing the
> vendor classes that are already implemented, and a simple language
[SNIP]

> look around anywhere outside this circle. If you can't use Standard
> C++ and learn, if you are unwilling to learn, to think about the
> context, than you will have Standard C++ on the .Net framework and be
> able to migrate to the new product line, but you never even had
> Standard C++ to begin with, you would closer to the truth if you used
> Seal Language instead, it would be more fitting..

I am sorry, my English is bad. Could you please rephrase your reasons? I
have read it several times but I could not find any reasons. I have only
seen FUD. It must be my bad English, but to me the text looked rather like
a heated outburst. And since I trust our moderators not to let through
heated outbursts and flame-bait accusations against any entity (be it an
individual or a corporation), I must conclude that I have failed to
understand your point. Would you care to rephrase it? If I may ask, please
do it in short sentences, each sentence on a separate line and each
separate topic in its own paragraph? That would help me a lot to be able to
get your point. Thanks in advance, Mr. Troll King.

{Moderation rules might be summarised as:
reject flames against individuals, if in doubt reject
allow criticism of corporations
otherwise when in doubt accept.
It is not up to moderators to decide whether something that might be
incomprehensible nonsense actually is. If it were we would be spending
far too many hours adjudicating on marginal nonsense. -mod}


--
WW aka Attila

Dave Boyle

Nov 12, 2003, 3:31:42 PM11/12/03
to
<snip>

> I want to stress again that I am not in any way against the C++/CLI binding,
> on the contrary. I only think that this binding (ehem, the community making
> it) should take every possible precaution not to use constructs which the
> ISO C++ might/could use in its evolution.

<snip>

This is an interesting issue, I think. Is there a precedent for this?

For example, has there ever been a situation when those responsible
for standardising C requested the C++ group not use keywords etc.
that might be useful to C at some later stage of its evolution?

Cheers,

Dave

smitty_one_each

Nov 12, 2003, 3:36:01 PM11/12/03
to
[snip]

The "problem" with the C++/CLI binding is that it is more than a
traditional language binding. And it has to be, it is just a fact. In
fact I feel that Microsoft is actually pushing the C++ language a little
bit, not being satisfied by the pace it embraced new ideas (such as
virtual machines, or portable language environments and GC and so
forth). I do not mean pushing as a bad word here.
[snip]

Here is a paranoid question:
Is there a possible future step where compiling C++ on a Microsoft
platform becomes impossible _without_ using the CLI binding?
C++ is beautiful because of its austerity. I appreciate that many
applications benefit from all the 'goodies', but many do not.
R,
Chris

Francis Glassborow

Nov 13, 2003, 6:52:34 AM11/13/03
to
In article <30e6e0a8.03111...@posting.google.com>, Dave Boyle
<david...@ed.tadpole.com> writes

><snip>
>
>> I want to stress again that I am not in any way against the C++/CLI binding,
>> on the contrary. I only think that this binding (ehem, the community making
>> it) should take every possible precaution not to use constructs which the
>> ISO C++ might/could use in its evolution.
>
><snip>
>
>This is an interesting issue, I think. Is there a precedent for this?
>
>For example, has there ever been a situation when those responsible
>for standardising C requested the C++ group not use keywords etc.
>that might be useful to C at some later stage of its evolution?

Unlikely, as C had a policy of 'no new keywords'. Of course it broke that
with restrict but...

--
Francis Glassborow ACCU
If you are not using up-to-date virus protection you should not be reading
this. Viruses do not just hurt the infected but the whole community.

WW

Nov 13, 2003, 6:58:56 AM11/13/03
to
Dave Boyle wrote:
> <snip>
>
>> I want to stress again that I am not in any way against the C++/CLI
>> binding, on the contrary. I only think that this binding (ehem, the
>> community making it) should take every possible precaution not to
>> use constructs which the ISO C++ might/could use in its evolution.
>
> <snip>
>
> This is an interesting issue, I think. Is there a precedent for this?
>
> For example, has there ever been a situation when those responsible
> for standardising C requested the C++ group not use keywords etc.
> that might be useful to C at some later stage of its evolution?

AFAIK there was a close working relationship between the C and C++ people
at the beginning. Then, as I understand it, the two committees had some
sort of "breakup", meaning that the cooperation is not as high as one would
expect from such closely related languages. So IMO there were no such
requests, because the communication is not really there. And IIRC there was
no question from the C++ committee about the keywords C99 introduced either
(whether they would break C++ code or not). And both are sad.

But we have to be careful not to chase a red herring here. C++ is *not* a
language binding for C! They are two distinct languages - closely related,
but distinct. OTOH C++/CLI is *not* a new language.

To be honest I was hoping for somewhat more constructive comments - if
any. I made my comments not to discredit the CLI binding effort (and
IMO whoever read my article fully saw that), but to point out the possible
hardships, and even to try to give solutions for them. If I did seem to
criticize (which I did not mean to), it was *constructive* criticism.

--
WW aka Attila

WW

Nov 13, 2003, 4:02:44 PM11/13/03
to
smitty_one_each wrote:
> [snip]
> The "problem" with the C++/CLI binding is that it is more than a
> traditional language binding. And it has to be, it is just a fact. In
> fact I feel that Microsoft is actually pushing the C++ language a
> little bit, not being satisfied by the pace it embraced new ideas
> (such as virtual machines, or portable language environments and GC
> and so forth). I do not mean pushing as a bad word here.
> [snip]
>
> Here is a paranoid question:
> Is there a possible future step where compiling C++ on a Microsoft
> platform becomes impossible _without_ using the CLI binding?
> C++ is beautiful because of its austerity. I appreciate that many
> applications benefit from all the 'goodies', but many do not.

Possibly Herb could answer this better... but here is my try. I cannot tell
the future, but IMO it will not happen. If it does happen for certain kinds
of code - such as the GUI stuff - that will anyway be the part which is
already pretty much system dependent. It is also worth mentioning that - as
of now - several groups/companies are making ISO CLI environments, so I
believe that some subset of the CLI will not be MS operating system specific
anyway.

So all in all I am not afraid of that part. If MS manages to make the
reincarnated managed C++ such that people are forced to use too many CLI
features in their code... well, people will detect that. And as I understood
what Herb said at the WG21 meeting, they want pure C++ code to run on the
CLI. And as I imagine it, one of the reasons why MS wants C++ on the CLI is
code reuse. And so do the people who plan to use that binding, and they will
not settle for having to keep two code bases of general code - in that case
C++ on the CLI would not succeed.

--
WW aka Attila

Nicola Musatti

Nov 13, 2003, 5:02:04 PM11/13/03
to
Let me first say that I consider your worries more than reasonable.

"WW" <wo...@freemail.hu> wrote in message news:<bor8u2$gu4$1...@phys-news1.kolumbus.fi>...
[...]


> With some it is quite simple, because "enum class" could actually be called
> "cli enum", "interface class" could be called "cli interface" and so forth.
> Possibly with another "trigger" word. It might even be possible with that
> trigger word remaining free to use as an identifier - if max munch allows.
> Although I am really a nobody and I do not imagine that I can even hope to
> have any effect in this quest, I humbly but honestly and strongly suggest
> that the ECMA committee consider such a construct (triggerword enum,
> triggerword interface etc.).

The standard already provides a way to avoid conflicts when
introducing new keywords: prepend a double underscore.

> The biggest problem is with operator symbols. There is an ongoing
> discussion about creating "Java style references" in C++ (comp.std.c++). I
> do not really follow it and I have no idea how much real support it has, but
> such a thing would/could evidently use the hat (^) symbol for its type
> declaration. But as of today CLI has taken that symbol away. The big
> question is: will the semantics of the C++/CLI ECMA standard be compatible
> with the semantics decided for the evolving C++?

Introducing new operators is not an option if the spirit of the C++
standard is to be respected; when all's said and done the ECMA
standardization committee have to decide whether they want to create
an "orthodox" extension or C+++ .

Quite frankly, most possible behaviours can be expressed by means of
classes, functions and overloading of existing operators; whether
these elements exist in libraries written in standard C++ or are
handled in some magical way is just an implementation detail.

In this respect I think that Borland did a reasonable job with its
VCL.

[...]


> As a sidenote let me express my thoughts about what I see as unusual about
> the CLI/C++ binding. Language bindings I have seen were either pure library
> ones, or ones which extended the language with definitely unambiguous
> syntactical constructs. Such ones, which are 99.999% unlikely to collide
> with the language itself. As an example I can remember the ESQL/C I have
> used 12 years ago. Each SQL command started with $SQL, making it sure that
> C is not effected.

Note that Embedded SQL is a very different beast in that it is usually
handled by a preprocessor which translates SQL statements into valid
statements of the host language. I don't think this approach would be
viable for CLI/C++.

[...]


> The "problem" with the C++/CLI binding is that it is more than a traditional
> language binding. And it has to be, it is just a fact. In fact I feel that
> Microsoft is actually pushing the C++ language a little bit, not being
> satisfied by the pace it embraced new ideas (such as virtual machines, or
> portable language environments and GC and so forth). I do not mean pushing
> as a bad word here.

If Microsoft (or anybody else) wanted C++ to "support" virtual
machines, they just had to design their VMs to be capable of supporting
all the features of C++ and not just a subset.

As for GC, pure implementations exist.

Cheers,
Nicola Musatti

Attila Feher

Nov 14, 2003, 8:13:34 AM11/14/03
to
Nicola Musatti wrote:
> The standard already provides a way to avoid conflicts when
> introducing new keywords: prepend a double underscore.

And would you use an environment which makes code metrics look like this:

1003240230 lines of code
357394 empty
95864569 comment
3498759084960879076904265 underscores

> Introducing new operators is not an option if the spirit of the C++
> standard is to be respected; when all's said and done the ECMA
> standardization committee have to decide whether they want to create
> an "orthodox" extension or C+++ .
>
> Quite frankly most possible behaviours can be expressed by means of
> classes, functions and overloading of exisiting operators; whether
> these elements exist in libraries written in standard C++ or are
> handled in some magical way is just an implementation detail.

Please read up on the hat symbol and what it means. Then you will
understand that there seems to be no reasonable way to bring this up to the
level of the C++ language and make it possible to use conveniently -
unless an operator is introduced.

> In this respect I think that Borland did a reasonable job with its
> VCL.

IIRC that is not a binding to a VM.

> If Microsoft (or anybody else) wanted C++ to "support" virtual
> machines, they just had to design their VM's to be capable to support
> all the features of C++ and not just a subset.

IIUC you can use all the features of C++ running on the CLI, at least with
the new binding. You cannot use all the C++ features with all the CLI-native
things. For example, you cannot have C++-style multiple inheritance on
native CLI objects. IIRC.

> As for GC, pure implementations exist.

CLI (if I understood correctly) is much more than a GC.

Microsoft did not design the CLI based on (or for) C++. OTOH Herb and his
group at ECMA seem to be making every effort possible to make C++ and the
CLI very useful together. I want to stress again that my worries are about
possible conflict because of the different pace of the two committees. I do
not have worries about the intentions, nor do I want to discourage the
C++/CLI binding work. I had to stress that again, because I felt that your
reply implied that it is not a good thing, and that you thought I think
that. But - on the contrary - I think that the C++/CLI binding *is* a
good thing, *if* done well. And all I wanted to point out were some -
possibly - hidden dangers.

--
Attila aka WW

Andrew Browne

Nov 15, 2003, 5:17:04 AM11/15/03
to
"Francis Glassborow" <fra...@robinton.demon.co.uk> wrote in message
news:3LPM05VrxCs$Ew...@robinton.demon.co.uk...

>
> However what concerns a number of us is that these 'pure extensions'
> will have a negative impact on the future development of C++. I, for
> one, do not think that introducing new keywords and/or semantics is the
> task of a bindings standard.

The goals of the C++/CLI proposal are good ones, I think, but I wonder
if it would be possible to achieve them without (most of) the new
keywords and semantics?

For example instead of:

ref class R {/*...*/}; // CLR reference type
value class V {/*...*/}; // CLR value type
interface class I {/*...*/}; // CLR interface type
generic <typename T>
ref class G {/*...*/}; // CLR generic
// etc etc

couldn't we have

class R : public System::Object {/*...*/}; // CLR reference type
class V : public System::ValueType {/*...*/}; // CLR value type
class I : public System::Object
{/* pure virtuals only here*/ }; // CLR interface type
template <typename T>
class G : public System::Object {/*...*/}; // CLR generic
// etc etc?

I'm not clear why we need gcnew and the new handle and tracking
references either. Would it be possible to just use normal pointer and
reference syntax and for the compiler to do the right thing in terms
of pinning etc when needed?

Andrew Browne

Herb Sutter

Nov 17, 2003, 3:44:36 PM11/17/03
to
On 12 Nov 2003 15:36:01 -0500, smitty_...@hotmail.com
(smitty_one_each) wrote:
>Is there a possible future step where compiling C++ on a Microsoft
>platform becomes impossible _without_ using the CLI binding?

No. I've just started a blog at http://blogs.gotdotnet.com/hsutter, so
I'll answer this question there.

For convenience, though, here's a copy of that answer:


No. Doing that would mean throwing away all the ISO conformance work that
Visual C++ just spent nearly the whole last release cycle adding to the
product. VC++ is now 98%-ish conformant to C++03 (the 1998 ISO C++
standard + its first technical corrigendum) and VC++ will continue to work
on the remaining 2%, plus track the coming C++0x additions as they are
created by the ISO and ANSI committees.

Of course, the CLI extensions will be needed where programs specifically
take advantage of CLI (i.e., .NET) data types and features, such as the
types in the .NET Frameworks libraries, and garbage collection and
reflection.
But programs that don't need those can ignore the extensions and compile
just fine to either native binaries or to .NET IL. Note that last bit,
because it seems to be not widely known: C++ code can still be compiled to
IL and run in the .NET virtual machine (Common Language Runtime, or CLR)
without using any extensions; the extensions are needed only for
additionally using CLI data types and features like garbage collection.

So there are three major scenarios:

- Pure native: Compile existing programs to native binaries just like
we've all been doing for years. No CLI features, no CLI extensions.

- Normal C++ programs that happen to be compiled to IL instead of to x86:
The code runs on the VM and is JITted and everything, but the program is
still using all native data and not using any CLI data types, so no CLI
extensions are needed here either.

- C++ programs that explicitly start using some CLI data types or
features: At those points in the code where those data types or features
are used, and only at those points, the extensions will apply, and most of
the time the only new syntax will be to write gcnew and ^ (instead of new
and *).

Unless you're actually authoring your own new CLI types, you're unlikely
to directly use much more than gcnew and ^, plus maybe an occasional
sprinkling of nullptr or %.

Best wishes,

Herb

---
Herb Sutter (www.gotw.ca)

Convener, ISO WG21 (C++ standards committee) (www.gotw.ca/iso)
Contributing editor, C/C++ Users Journal (www.gotw.ca/cuj)
Visual C++ architect, Microsoft (www.gotw.ca/microsoft)

[ See http://www.gotw.ca/resources/clcm.htm for info about ]

Herb Sutter

Nov 18, 2003, 6:26:13 AM11/18/03
to

Nicola, these are great points and I'll get to all of them. For now, I've
answered the following in my blog (http://blogs.gotdotnet.com/hsutter).
Here's a text version:


On 13 Nov 2003 17:02:04 -0500, Nicola....@ObjectWay.it (Nicola
Musatti) wrote:
>"WW" <wo...@freemail.hu> wrote in message news:<bor8u2$gu4$1...@phys-news1.kolumbus.fi>...

>> The "problem" with the C++/CLI binding is that it is more than a traditional
>> language binding. And it has to be, it is just a fact.

[...]


>As for GC, pure implementations exist.

[that add no new extensions to ISO C++]

Not for a pure definition of "pure," they don't. :-)

To explain why C++ pointers are insufficient (unless their semantics were
to be changed at least a little, which would mean breaking existing code),
consider two counterexamples:

1. Not for a compacting GC. Certainly a bald pointer can't point directly
to an object that moves around in memory, because C++ pointers are
required to be stable, to always have the same value while pointing to the
same object. Changing the semantics of a pointer to make it track will
break lots of code, starting with set<T*>, because such tracking pointers
cannot be ordered (their values will after all be changed arbitrarily at
unpredictable times by the GC). There are also other restrictions, but
that's one of the most noticeable. [Aside: Such a tracking pointerlike
abstraction is needed, and is provided in C++/CLI. It just can't be
spelled * without fundamentally scuttling ISO C++ conformance, is all.]

2. Not for a non-compacting GC, either. This case comes a lot closer,
but even Great Circle / Boehm-style collectors impose restrictions that
break some conforming C++ programs. In particular, they restrict, if only
slightly, the operations that Standard C++ allows on pointers. Consider
the following well-formed ISO C++ program with well-defined semantics:

int* pi = new int(42);                                  // line 1
pi = reinterpret_cast<int*>(
         reinterpret_cast<uintptr_t>(pi) ^ 0xaaaaaaaa); // disguise the pointer

// ... do other work ...

pi = reinterpret_cast<int*>(
         reinterpret_cast<uintptr_t>(pi) ^ 0xaaaaaaaa); // undisguise it
cout << *pi; // perfectly ok, prints "42", won't crash
delete pi;   // ok

Add-on GCs can't see such disguised pointers, and are liable to reclaim
the memory allocated in line 1 before its later use, resulting in an
attempt to access freed memory. Boom.

This isn't perverse or theoretical, by the way. Consider "two-way
pointers" as one example of a well-known implementation technique where
two pointers are XOR'd together like this for a perfectly reasonable and
legal use. In particular, a motivation behind two-way pointers is that you
can have a more space-efficient doubly linked list if you store only one
(not two) pointer's worth of storage in each node. But how can the list
still be traversable in both directions? The idea is that each node
stores, not a pointer to one other node, but a pointer to the previous
node XOR'd with a pointer to the next node. To traverse the list in either
direction, at each node you get a pointer to the next node by simply
XORing the current node's two-way pointer value with the address of the
last node you visited, which yields the address of the next node you want
to visit. For more details, see:

"Running Circles Round You, Logically"
by Steve Dewhurst
C/C++ Users Journal (20, 6), June 2002

I don't think the article is available online, alas, but you can find some
related source code demonstrating the technique at:

http://www.semantics.org/tyr/tyr0_5/list.h

This perfectly standards-conforming and useful technique won't work
correctly with any GC implementation I know of that does not extend the
language so that pointers can retain their full standard meaning.

Steve's technique works perfectly fine and unbroken, however, under
C++/CLI. It works because C++/CLI preserves exactly the full semantics of
* pointers without any limitations. To do so, C++/CLI needed to add a new
abstraction for GC semantics instead of pretending that raw pointers are
by themselves a complete solution for safe use in a GC environment (they
aren't, only because they were never designed to be).

For more about the design motivations behind the ^ declarator (aka a
"handle"), see also Brandon Bray's excellent blog entry posted earlier
today at:

http://blogs.gotdotnet.com/branbray/permalink.aspx/c57f8683-5973-4ecc-837c-95e37102e86d

Best wishes,

Herb

---
Herb Sutter (www.gotw.ca) (http://blogs.gotdotnet.com/hsutter)

Convener, ISO WG21 (C++ standards committee) (www.gotw.ca/iso)
Contributing editor, C/C++ Users Journal (www.gotw.ca/cuj)
Visual C++ architect, Microsoft (www.gotw.ca/microsoft)

[ See http://www.gotw.ca/resources/clcm.htm for info about ]

Herb Sutter

Nov 18, 2003, 6:40:15 AM11/18/03
to
On 15 Nov 2003 05:17:04 -0500, "Andrew Browne"
<clcppm...@this.is.invalid> wrote:
>I'm not clear why we need gcnew and the new handle and tracking
>references either. Would it be possible to just use normal pointer and
>reference syntax and for the compiler to do the right thing in terms
>of pinning etc when needed?

I'll answer your other points soon, but as for this part please see my
blog at http://blogs.gotdotnet.com/hsutter (and response to a similar
question elsewhere in this thread). In particular, the entry is:

FAQ: Why aren't C++ pointers enough to "handle" GC?

http://blogs.gotdotnet.com/hsutter/PermaLink.aspx/ec4f6e29-b512-4071-9dd7-340246bb5c51

and it includes a link to Brandon Bray's blog which is a more detailed
answer to the latter part of your question ("why not just get the compiler
to do the right thing"):

Behind the Design: Handles
by Brandon Bray

http://blogs.gotdotnet.com/branbray/permalink.aspx/c57f8683-5973-4ecc-837c-95e37102e86d

More to come...

Herb

---
Herb Sutter (www.gotw.ca) (http://blogs.gotdotnet.com/hsutter)

Convener, ISO WG21 (C++ standards committee) (www.gotw.ca/iso)
Contributing editor, C/C++ Users Journal (www.gotw.ca/cuj)
Visual C++ architect, Microsoft (www.gotw.ca/microsoft)

[ See http://www.gotw.ca/resources/clcm.htm for info about ]

Herb Sutter

Nov 18, 2003, 6:55:04 AM11/18/03
to
On 13 Nov 2003 17:02:04 -0500, Nicola....@ObjectWay.it (Nicola
Musatti) wrote:
>The standard already provides a way to avoid conflicts when
>introducing new keywords: prepend a double underscore.

Right, and that's what Managed C++ used, for just that reason: to respect
compatibility. Unfortunately, there was a lot of resistance and it is
considered a failure.

For one thing, programmers have complained loudly that all the underscores
are not only ugly, but a real pain because they're much more common
throughout the code than other extensions such as __declspec have been. In
particular, __gc gets littered throughout the programmer's code.

At least as importantly, the __keywords littered throughout the code can
make the language feel second-class, particularly when people look at
equivalent C++ and C# or VB source code side-by-side. This comparative
ugliness has been a contributing, if not essential, factor why some
programmers have left C++ for other languages.

Consider:

//-------------------------------------------------------
// C# code
//
class R {
    private int len;
    public int Length {
        get { return len; }
        set { len = value; }
    }
}

R r = new R();
r.Length = 42;

//-------------------------------------------------------
// Managed C++ equivalent
//
__gc class R {
int len;
public:
__property int get_Length() { return len; }
__property void set_Length( int i ) { len = i; }
};

R __gc * r = new R;
r->set_Length( 42 );

Oddly, numerous programmers find the former more attractive. Particularly
after the 2,000th time they typed __gc.

But now we can do better:

//-------------------------------------------------------
// C++/CLI equivalent
//
ref class R {
int len;
public:
property int Length {
int get() { return len; }
void set( int i ) { len = i; }
}
};

R^ r = gcnew R;
r->Length = 42;

I should note there's actually also a shorter form for this common case,
to have the compiler automatically generate the property's getter, setter,
and backing store. While I'm at it, I'll also put the R instance on the
stack which is also a new feature of the revised syntax:

//-------------------------------------------------------
// C++/CLI alternatives
//
ref class R {
public:
property int Length;
};

R r;
r.Length = 42;

C# is adding something similar as a property shorthand. But C# doesn't
have stack-based semantics for reference types and is unlikely to ever
have them, though "using" is a partial automation of the stack-based
lifetime control that C++ programmers take for granted. I'll have more to
say about "using" another time.

I've added this discussion to my blog, too.

Herb

---
Herb Sutter (http://blogs.gotdotnet.com/hsutter) (www.gotw.ca)

Convener, ISO WG21 (C++ standards committee) (www.gotw.ca/iso)
Contributing editor, C/C++ Users Journal (www.gotw.ca/cuj)
Visual C++ architect, Microsoft (www.gotw.ca/microsoft)

[ See http://www.gotw.ca/resources/clcm.htm for info about ]

Nicola Musatti

Nov 18, 2003, 2:25:18 PM11/18/03
to
"Attila Feher" <attila...@lmf.ericsson.se> wrote in message news:<bp2ivd$r5$1...@newstree.wise.edt.ericsson.se>...

> Nicola Musatti wrote:
> > The standard already provides a way to avoid conflicts when
> > introducing new keywords: prepend a double underscore.
>
> And would you use an environment which makes code metrics look like this:
>
> 1003240230 lines of code
> 357394 empty
> 95864569 comment
> 3498759084960879076904265 underscores

If these were realistic figures they would indicate that the ECMA
effort was useless. On the other hand, I use Borland's C++Builder daily;
its binding to the VCL library, written in Object Pascal, makes judicious
use of extension keywords.

[...]


> > Quite frankly most possible behaviours can be expressed by means of
> > classes, functions and overloading of exisiting operators; whether
> > these elements exist in libraries written in standard C++ or are
> > handled in some magical way is just an implementation detail.
>
> Please read up on the hat symbol and what it means. Then you will
> understand that there seems to be no reasonable way to bring this up to the
> level of the C++ language and make it possible to use conveniently -
> unless an operator is introduced.

I tried googling for it but didn't find much. From the little I did
find, I can't see the need for a different operator from plain
dereference.

> > In this respect I think that Borland did a reasonable job with its
> > VCL.
>
> IIRC that is not a binding to a VM.

That's true, but it is still a binding to a library written in a
different language which uses a different object model.



> > If Microsoft (or anybody else) wanted C++ to "support" virtual
> > machines, they just had to design their VM's to be capable to support
> > all the features of C++ and not just a subset.
>
> IIUC you can use all the features of C++ running on the CLI. At least with
> the new binding. You cannot use all the C++ features with all the CLI
> native things. Like you cannot have C++ like multiple inheritance on native
> CLI objects. IIRC.
>
> > As for GC, pure implementations exist.
>
> CLI (if I understood correctly) is much more than a GC.
>
> Microsoft did not design the CLI based on (or for) C++.

I'm not sure I'll ever forgive them for that ;-)

> OTOH Herb and his
> group at ECMA seems to make every effort possible to make C++ and CLI very
> useful together. I want to stress again that my worries are about possible
> conflict because of the different pace of the two committees. I do not have
> worries about the intention, neither do I want to discourage the C++/CLI
> binding work. I had to stress that again, because I have felt that your
> reply implied that it is not a good thing. And I felt that you thought I do
> think that. But - on the contrary - I think that the C++/CLI binding *is* a
> good thing, *if* done well. And all I wanted to point out is some -
> possibly - hidden dangers.

The effort that Microsoft is investing in making .NET accessible to
C++ programmers in a natural way is certainly a positive thing and the
team involved (not only Herb, but also Dinkumware and EDG) is
impressive and certainly worthy of our trust. I'm curious to see what
they'll come out with.

Cheers,
Nicola Musatti

Francis Glassborow

Nov 18, 2003, 2:47:52 PM
In article <5qbirv89n0terf7f8...@4ax.com>, Herb Sutter
<hsu...@gotw.ca> writes

>On 13 Nov 2003 17:02:04 -0500, Nicola....@ObjectWay.it (Nicola
>Musatti) wrote:
> >The standard already provides a way to avoid conflicts when
> >introducing new keywords: prepend a double underscore.
>
>Right, and that's what Managed C++ used, for just that reason: to respect
>compatibility. Unfortunately, there was a lot of resistance and it is
>considered a failure.
>
>For one thing, programmers have complained loudly that all the underscores
>are not only ugly, but a real pain because they're much more common
>throughout the code than other extensions such as __declspec have been. In
>particular, __gc gets littered throughout the programmer's code.
>
>At least as importantly, the __keywords littered throughout the code can
>make the language feel second-class, particularly when people look at
>equivalent C++ and C# or VB source code side-by-side. This comparative
>ugliness has been a contributing, if not essential, factor why some
>programmers have left C++ for other languages.

However what is needed is something like a greatly improved 'Managed
C++'.

Just because the 'real' extended keywords are provided by prepending a
double underscore does not mean that this 'implementation' detail needs
to be visible to the average working programmer. MS is, among other
things, a tool vendor and so should be more than capable of providing a
tool that quietly converts (translates) user-written terms to the
'correct' extended C++ ones and vice versa.

--
Francis Glassborow ACCU
If you are not using up-to-date virus protection you should not be reading
this. Viruses do not just hurt the infected but the whole community.

Peter Dimov

Nov 18, 2003, 2:57:59 PM
Herb Sutter <hsu...@gotw.ca> wrote in message news:<5qbirv89n0terf7f8...@4ax.com>...

> On 13 Nov 2003 17:02:04 -0500, Nicola....@ObjectWay.it (Nicola
> Musatti) wrote:
> >The standard already provides a way to avoid conflicts when
> >introducing new keywords: prepend a double underscore.
>
> Right, and that's what Managed C++ used, for just that reason: to respect
> compatibility. Unfortunately, there was a lot of resistance and it is
> considered a failure.
>
> For one thing, programmers have complained loudly that all the underscores
> are not only ugly, but a real pain because they're much more common
> throughout the code than other extensions such as __declspec have been. In
> particular, __gc gets littered throughout the programmer's code.
>
> At least as importantly, the __keywords littered throughout the code can
> make the language feel second-class, particularly when people look at
> equivalent C++ and C# or VB source code side-by-side. This comparative
> ugliness has been a contributing, if not essential, factor why some
> programmers have left C++ for other languages.

This is all understandable, but the fact that choosing 'ref' as one of
the keywords breaks three TR1 libraries (and who knows how much other
code) might deserve mentioning.

Nicola Musatti

Nov 18, 2003, 2:58:55 PM
Herb Sutter <hsu...@gotw.ca> wrote in message news:<5qbirv89n0terf7f8...@4ax.com>...
[...]

> I've added this discussion to my blog, too.

...where I answered:

I still don't like the use of plain keywords and new operators in a
C++ extension; I believe that these should be reserved for the
evolution of the language itself.

I share the point of view that the original Managed Extensions led to
very ugly code, but the example you posted looks like what I call
C+++. Is it reasonable/useful to have a new, transitional language
between C++ and C#?

A better alternative would be the combination of different forms of
syntax: some standard-compliant declarations, a few extended keywords,
one or two additions to the C++ standard. Here are a few examples:
Properties are a topic of general interest; shouldn't a better effort
be made to have them included in the next version of the standard?
What happened to the Borland proposal?

The hat symbol and gcnew could be replaced with a template like
syntax, e.g.

cli::handle<R> r = cli::gcnew<R>();
r->Length = 42;

When you're done:

cli::gcdelete(r);

__gc classes could be declared by making their constructors private
and making gcnew their friend, or by deriving them from a conventional
ancestor; otherwise the __gc keyword could be retained.

System:: classes' semantics would be defined by the fact that they
belong to a conventional namespace; they would behave rather like a
typedef'ed smart pointer: you could declare their instances without a
pointer-like syntax, e.g.:

System::String s;

but access their members with the arrow operator.

I'm aware that these are very rough ideas and that there are very many
aspects I didn't consider. I have neither the knowledge nor the time to
turn these into a serious proposal. Yet I believe that the general
ideas behind them are more respectful of the spirit of the C++
standard than the glimpses I caught of the current proposal, and they
might even be more acceptable to standard C++ programmers.

Cheers,
Nicola Musatti

Nicola Musatti

Nov 18, 2003, 3:01:17 PM
Herb Sutter <hsu...@gotw.ca> wrote in message news:<nt1irv8m26arklo21...@4ax.com>...

> Nicola, these are great points and I'll get to all of them. For now, I've
> answered the following in my blog (http://blogs.gotdotnet.com/hsutter).

To which I answered, also in your blog:

To point 1 I have no comments to make. Handles a la Macintosh are
something very different from pointers and their introduction into the
language would require a new syntax.

As to point 2, I'm aware that GCs could benefit from modifications to
the standard, but for add-on libraries such as the Boehm collector I
believe there's nothing wrong in imposing conditions on the use of the
data they handle.
I don't find it any different from making sure you don't use iterators
after performing invalidating operations.

Cheers,
Nicola Musatti

Richard Smith

Nov 18, 2003, 3:02:40 PM
Herb Sutter wrote:

> For those who are interested in seeing the shape of the project, we will
> be making the proposed standard's base document (the "starting point"
> document) publicly available on November 15.

Is this document now available on the web anywhere? I've
tried googling for it and looking on the Microsoft and ECMA
web sites but have been unable to find it.

--
Richard Smith

__DILIP__

Nov 18, 2003, 3:19:55 PM

"Nicola Musatti" <Nicola....@ObjectWay.it> wrote in message
news:a327cf48.03111...@posting.google.com...

> I tried googl'ing for it but didn't find much. From the little I did
> find I can't see a need for a different operator from plain operator
> dereference.

There is an exhaustive write-up about this at Brandon Bray's weblog at:
http://blogs.gotdotnet.com/branbray/PermaLink.aspx/c57f8683-5973-4ecc-837c-95e37102e86d

--Dilip

Note to moderator: I was under the impression that I can bring up any
advancements in a given C++ compiler irrespective of who its vendor (MSFT
in this case) might be. Is my understanding wrong? In any case, this is
another post with a .NET link, because the OP indicated he couldn't find any
relevant information by googling.

Herb Sutter

Nov 18, 2003, 6:13:42 PM
(I've also included this answer in my blog.)

On 15 Nov 2003 05:17:04 -0500, "Andrew Browne"
<clcppm...@this.is.invalid> wrote:

>The goals of the C++/CLI proposal are good ones, I think, but I wonder
>if it would be possible to achieve them without (most of) the new
>keywords and semantics?
>
>For example instead of:
>
>ref class R {/*...*/}; // CLR reference type
>value class V {/*...*/}; // CLR value type
>interface class I {/*...*/}; // CLR interface type
>generic <typename T>
>ref class G {/*...*/}; // CLR generic
>// etc etc
>
>couldn't we have
>
>class R : public System::Object {/*...*/}; // CLR reference type
>class V : public System::ValueType {/*...*/}; // CLR value type
>class I : public System::Object
>{/* pure virtuals only here*/ }; // CLR interface type
>template <typename T>
>class G : public System::Object {/*...*/}; // CLR generic
>// etc etc?

That's one of the alternatives I attempted, and I wasn't the first. I
think almost everyone starts here, and I held on for a while before I
became convinced I had to let go because it wasn't leading to the right
places. Let me share some of the problems and objections that crop up when
you work your way down this path:


1. (Minor) Verbose

The above alternative is a lot of typing compared to any of the
alternatives (Managed C++ syntax, proposed C++/CLI syntax, and other CLI
languages).

There's a pretty easy solution for this one, using keyword shortcuts:

class R : ref {/*...*/}; // CLR reference type
class V : value {/*...*/}; // CLR value type
class I : interface
{/* pure virtuals only here*/ }; // CLR interface type

An inconvenience with this is that there could already be a class named
ref, and so the syntax would have to be embroidered somehow to account for
disambiguating this; this is unfortunate but surmountable. But, more
importantly, this shorthand still doesn't address the other drawbacks,
below, of this general approach.


2. Forward declarations

Consider:

class X;

Is this a ref class, value class, interface class, or native class? There
are a few cases where this needs to be known from the forward declaration.


3. Indirect: The header hunt

Consider:

class X : public Y { };

Is this a ref class, value class, interface class, or native class? Under
the alternative, the only way to know would be to inspect Y and all base
classes until you can determine whether any of them directly or indirectly
inherit from Object or ValueType (or not). There are shortcuts (e.g., it's
simpler for value types because they're always sealed and so the
inheritance has to be direct), but the hunt remains.

That may not seem like a huge issue, except that the types really are
behaviorally different in small but important ways; for example, in one
case a virtual call in a ctor or dtor will be deep, in the other it will
be shallow. What metadata will eventually be emitted, if any?


4. Closes doors

Speaking specifically to the last part of the example:

>template <typename T>
>class G : public System::Object {/*...*/}; // CLR generic

Unfortunately, this conflates the ideas of the type category
(ref/value/native) with the form of genericity (generic/template). It says
that CLI types can only be genericized, and native types can only be
templated, leaving no way to express the other two useful concepts:

- a templated CLI type (C++/CLI syntax: template<class T> ref class R {};)
- a generic native type (C++/CLI syntax: generic<class T> class N {};)

Templated CLI types in particular are very useful and are supported in
C++/CLI, which lets the template/generic choice and the class category
choice vary independently.


5. Other closed doors: Distinguishing mixed types (Future)

In the future, C++/CLI is intended to eventually allow for full mixing and
cross-inheritance of arbitrary types. Using the alternative
inheritance-based syntax alone does not allow the programmer to
distinguish between the following two distinct things that the proposed
C++/CLI design lets the programmer express as follows:

ref class Ref : public ANative { int x; };

class Native : public ARef { int x; };

This distinction can't be expressed using the proposed alternative above.
Both types have System::Object as a base class, but one is a reference
class that other CLI languages could use directly and where virtual calls
during construction are deep, and one is a native class that other CLI
languages can only use via a handle or reference to the ARef base class
and where virtual calls during construction are shallow.


Cheers,

Herb

---
Herb Sutter (http://blogs.gotdotnet.com/hsutter) (www.gotw.ca)



Andrew Browne

Nov 18, 2003, 6:15:07 PM
"Herb Sutter" <hsu...@gotw.ca> wrote in message
news:gp5irvsf0s0eakq4v...@4ax.com...
<snip><snip>

As I understand it, the crucial argument here is that the compacting
GC can move objects around and that standard C++ pointers do not allow
for this. Ordering is mentioned, specifically with regard to set<T*>.
However I believe standard C++ only specifies pointer comparison
between arbitrary pointers for void pointers (and I think this was
actually a late language addition, specifically to support such things
as set<T*>.) Would it be possible for a conversion to void pointer to
perform an implicit pin so that the ordering requirement would be
satisfied?

If that were possible, then presumably it would also be possible to
perform an implicit pin when a pointer is cast to an integer type, so
that it could be cast back to yield the same (still valid) pointer.

Of course in these scenarios the pinned memory would not be garbage
collected, and would need something like an explicit call to delete to
release the pin and signal to the collector that the memory is now
available for collection.

Andrew Browne

Herb Sutter

Nov 19, 2003, 9:33:19 AM
On 18 Nov 2003 14:57:59 -0500, pdi...@mmltd.net (Peter Dimov) wrote:
>This is all understandable, but the fact that choosing 'ref' as one of
>the keywords breaks three TR1 libraries (and who knows how much other
>code) might deserve mentioning.

Fortunately, it doesn't. :-) C++/CLI does not take ref as a reserved word,
only as a contextual keyword. Thus no uses of ref as an identifier are
affected. (This is by design.)

I should probably at some point write another followup to this outlining
the approach we've taken to use contextual keywords and in some cases a
lex hack... maybe later this week.

Herb



Herb Sutter

Nov 19, 2003, 9:36:04 AM
On 18 Nov 2003 15:02:40 -0500, Richard Smith <ric...@ex-parrot.com>
wrote:

>Herb Sutter wrote:
>
>> For those who are interested in seeing the shape of the project, we will
>> be making the proposed standard's base document (the "starting point"
>> document) publicly available on November 15.
>
>Is this document now available on the web anywhere? I've
>tried googling for it and looking on the Microsoft and ECMA
>web sites but have been unable to find it.

We're, um, a few days late. It'll be there by the end of the week. I'll
post the URL here and to my mailing list and to www.gotw.ca/microsoft (my
memory being what it is, I ought to add the qualification: if I don't
forget about it until I suddenly wake up in a panic in the middle of the
night two weeks later and realize I didn't do it yet).

Herb



Andrew Browne

Nov 19, 2003, 9:40:06 AM
"Herb Sutter" <hsu...@gotw.ca> wrote in message
news:m22lrvomeelnf0mm9...@4ax.com...

> (I've also included this answer in my blog.)
>
> That's one of the alternatives I attempted, and I wasn't the first. I
> think almost everyone starts here, and I held on for a while before I
> became convinced I had to let go because it wasn't leading to the right
> places. Let me share some of the problems and objections that crop up
> when you work your way down this path:

A partial answer to these objections - specifically to discriminating
between CLI, native and mixed types might be a linkage specification.

extern "CLI" {
// anything declared in here is a pure CLI type
}

Any pure CLI type would have CLI "linkage" and any native or mixed
type wouldn't.

Andrew Browne

Attila Feher

Nov 19, 2003, 1:39:04 PM
WW wrote:
> Francis Glassborow wrote:
> [SNIP]

>> However what concerns a number of us is that these 'pure extensions'
>> will have a negative impact on the future development of C++. I, for
>> one, do not think that introducing new keywords and/or semantics is
>> the task of a bindings standard.
>>
> > Note that I am not claiming that this will happen but I am saying
>> that should it, I will need very strong persuasion before
>> encouraging my National Body to endorse the resulting ECMA Standard
>> as an ISO one. [SNIP]
>
> IMHO it is already showing if you follow the signs carefully. I mean
> as I understood the votes of the evolution group, people want a new,
> more class like, safe enum....
[SNIP]

Does the complete silence (no answers) to my post mean that the ECMA group
refuses to take my comments into account? I have to ask because everyone
else in this subthread has received answers but me. And I would like to
see some reaction, because the things I have listed (possible bad side
effects on the mainstream C++ language) are important enough not to be
ignored.

In general, is Microsoft (and the ECMA group created by it) ready to look
past Microsoft's urgent needs of today (being able to compile C++ code
better) and take into account the *future* of standard C++? Right now, the
answers given to others and the non-answer given to me worry me, because
they say: nope.

I am pretty sure that all of the design decisions taken by the group can be
discussed here or in a blog one by one, but the question remains: on what
basis will/should C++/CLI grab "enum class", when ISO C++ has just decided
to pursue further the notion of a more class-like enum? And that is only
one of the many questions.

And I believe that it would really be good to address the general matter
these questions bring up: how to make the C++/CLI binding the least
intrusive to the language. IMHO it is not as good as it could be. Many
constructs are too C++-ish to be taken by a language binding just like this.
So we can concentrate on proving that each feature represented by those
unfortunate syntactical constructs should be there *or* we can think about
how to change them into something unlikely to collide (like cli enum, clinew
instead of gcnew etc.) or how to generalize them enough (for example
operators or the ^ type and even gcnew if possible) so that they can (to our
best knowledge) fit into the future C++ language.

If this does not happen there might be just way too much turbulence
afterwards, e.g. the ECMA standard failing to become an ISO one. This
delays what MS wants to do, which is not good for anyone. Even if this does
not happen (and even if it does) MS will be forced to vote down *any*
proposals in C++ evolution (or library) which would break their new CLI
binding. For example it might be the ISO C++ committee who has to come up
with a weird name for the new enum, because "enum class" is taken and
"class enum" is not distinct enough (or vice versa). If this ever happens it
will be a Very Bad Thing. IMO it is better to talk about these issues now
and try to solve them now, than to face the tension a year from now.

--
Attila aka WW

Peter Dimov

Nov 19, 2003, 4:01:39 PM
Nicola....@ObjectWay.it (Nicola Musatti) wrote in message news:<a327cf48.03111...@posting.google.com>...

[...]

> The hat symbol and gcnew could be replaced with a template like
> syntax, e.g.
>
> cli::handle<R> r = cli::gcnew<R>();
> R->Length = 42;
>
> When you're done:
>
> cli::gcdelete(r);

I find the ^ innovation acceptable, but the 'gcnew' thing... Why
should all "modern" reference-based languages copy the C++ new
operator? What is wrong with

R^ r; // R __gc * r = __gcnew R();
R^ r(1, 2); // R __gc * r = __gcnew R(1, 2);

R^ r = nullptr; // nil reference, explicit
R^ r2 = r; // usual reference copy

Given that the ^ things are halfway between pointers and values, as
evidenced by the desire to overload operator+ with "value" semantics
on T^, the above seems reasonable.

On the other hand, I'm probably missing something, and I need to add
that I'm fascinated by watching a new language evolve. :-)

Francis Glassborow

Nov 19, 2003, 6:36:59 PM
In article <bpevvv$g8t$1...@newstree.wise.edt.ericsson.se>, Attila Feher
<attila...@lmf.ericsson.se> writes

>If this does not happen there might be just way too much turbulence
>afterwards. Eg: the ECMA standard failing to become an ISO one. This
>delays what MS wants to do, not good for anyone. Even if this does not
>happen (and even if it does) MS will be forced to vote down *any* proposals
>in C++ evolution (or library), which would break their new CLI binding.

First I think this is not the best newsgroup for the issues you are
raising which might explain lack of response.

However MS has precisely one vote at J16 level and no vote at WG21 level,
so there is no way that MS can 'vote down' a proposal. OTOH, regardless of
whether something is or is not part of another standard, WG21 would always
take careful note of 'existing practice' and would not normally act in a
way that gratuitously breaks any existing practice.


--
Francis Glassborow ACCU

Attila Feher

Nov 20, 2003, 6:39:50 AM
Francis Glassborow wrote:
> In article <bpevvv$g8t$1...@newstree.wise.edt.ericsson.se>, Attila Feher
> <attila...@lmf.ericsson.se> writes
>> If this does not happen there might be just way too much turbulence
>> afterwards. Eg: the ECMA standard failing to become an ISO one.
>> This delays what MS wants to do, not good for anyone. Even if this
>> does not happen (and even if it does) MS will be forced to vote down
>> *any* proposals in C++ evolution (or library), which would break
>> their new CLI binding.
>
> First I think this is not the best newsgroup for the issues you are
> raising which might explain lack of response.
>
> However MS has precisely one vote at J16 level and no vote at WG21
> level so there is no way that MS can 'vote down' a proposal.

You mean if they vote "over my dead body" it does not matter? That would be
sad.

> OTOH
> regardless as to whether something is or is not part of another
> standard WG21 would always take careful note of 'existing practice'
> and would not normally act in a way that gratuitously breaks any
> existing practice.

Yes. And should it act when a practice is building up which might break the
future practice of the C++ language? My point was not that MS will vote
things down or anything like that. My point was still what it was before: a
*binding* should preferably use constructs that reflect this *binding*
nature, and ones that fit a natural evolution route of the language.

Let me again repeat very shortly what I have said already, an example. As
far as I understood - the evolution working group voting - people do not
want to touch the old enum, but want to create a new enum-like type, which
is more class like and safe etc. But the *name* of it is /right/ /now/
taken by a language *binding*.

IMO the solution here is for the language binding to bond CLI and C++ (or
vice versa) and not to bind C++. Right now, with the proposed language
changes, it actually binds C++ and the routes it can take, in addition to
creating a bond between the CLI and C++. All I wanted was to point this
out, to point out possible (future) conflicts of interest, and to point out
that in some of the cases there is a very simple solution. Use "cli enum"
instead of "enum class". It communicates better and does not bind C++. And
IMHO a similar solution exists for most, if not all, of the max munch
cases. The case of the hat is more difficult, but my honest feeling is that
if any sort of useful GC is ever introduced it will need something very,
very similar.

My point was not to accuse anyone of anything, but to bring the (IMO
existing) problem to light. Before it brings itself.

--
Attila aka WW

Alexander Terekhov

Nov 20, 2003, 10:27:38 AM

Peter Dimov wrote:
[...]

> On the other hand, I'm probably missing something, and I need to add
> that I'm fascinated by watching a new language evolve. :-)

Except that MS should rather "evolve" C#, not C++ (calling it a
"binding"... what a joke). I mean that it would probably be much more
productive if MS rolled carefully selected C++ stuff into C#.

regards,
alexander.

Peter N. Lundblad

Nov 20, 2003, 10:30:44 AM
pdi...@mmltd.net (Peter Dimov) wrote in message news:<7dc3b1ea.03111...@posting.google.com>...

> Nicola....@ObjectWay.it (Nicola Musatti) wrote in message news:<a327cf48.03111...@posting.google.com>...
>
> [...]
>
>
> I find the ^ innovation acceptable, but the 'gcnew' thing... Why
> should all "modern" reference-based languages copy the C++ new
> operator? What is wrong with
>
> R^ r; // R __gc * r = __gcnew R();
> R^ r(1, 2); // R __gc * r = __gcnew R(1, 2);
>
> R^ r = nullptr; // nil reference, explicit
> R^ r2 = r; // usual reference copy
>
> Given that the ^ things are halfway between pointers and values, as
> evidenced by the desire to overload operator+ with "value" semantics
> on T^, the above seems reasonable.

IMO, this is counterintuitive, but that depends on whether you view the CLI
handle as a value type or as a new kind of pointer (with limitations).
Maybe you could view it as a kind of boost::optional<T> and use the same
semantics for default construction.

I also don't see the need for gcnew. Why not use placement new, i.e.:
T^ h = new (CLI::gc) T(14);
This would require an extension that allows placement new overloads to
return something other than void*, plus a standard way to get at the raw
storage in which to construct the object. That extension, however, could be
useful for other things as well, e.g. a new that returns a smart pointer,
so you don't have to give the smart pointer a public constructor taking a
raw pointer, which is dangerous.

I have to add that I see this discussion as a bit premature, since we
don't have the spec draft yet, but hopefully we will in a few days...

regards,
//Peter

Francis Glassborow

Nov 20, 2003, 11:44:40 AM
In article <h74lrv0d7uu8uafaf...@4ax.com>, Herb Sutter
<hsu...@gotw.ca> writes

>Fortunately, it doesn't. :-) C++/CLI does not take ref as a reserved word,
>only as a contextual keyword. Thus no uses of ref as an identifier are
>affected. (This is by design.)


IMO there is a fundamental flaw in the concept of 'contextual keywords':
they only work as expected in correctly written code. We already have more
than enough problems with unhelpful diagnostics. The introduction of
keywords that are only keywords in certain contexts makes good diagnostics
even less likely.

Contextual keywords also add extra requirements to tokenisers and
lexers. I think there are other avenues that I would wish to explore.
This is the problem with the very fast time-table set for TG5, it
largely assumes that the currently proposed 'solutions' are good enough.
I remain unconvinced. The problems need to be considered by a wider
range of language experts, rather than just tinker round the edges of a
proposed solution.


--
Francis Glassborow ACCU

Hyman Rosen

Nov 21, 2003, 4:44:58 AM
Francis Glassborow wrote:
> IMO there is a fundamental flaw in the concept of 'contextual keywords'
> in that they only work as expected in correctly written code.

But a standard can't be expected to specify the behavior
of things that don't conform to it.

> Contextual keywords also add extra requirements to tokenisers and
> lexers.

I assume that Microsoft already has an implementation,
so it's clearly not impossible to do.

> I think there are other avenues that I would wish to explore.
> This is the problem with the very fast time-table set for TG5, it
> largely assumes that the currently proposed 'solutions' are good enough.
> I remain unconvinced. The problems need to be considered by a wider
> range of language experts, rather than just tinker round the edges of a
> proposed solution.

No one is preventing you from exploring other avenues.
But unless there's some concrete problem with the proposal,
vague worrying is not a good reason for delay.

news user

Nov 21, 2003, 4:47:09 AM
> IMO there is a fundamental flaw in the concept of 'contextual keywords'
> in that they only work as expected in correctly written code. We already
> have more than enough problem with unhelpful diagnostics. The
> introduction of keywords that are only keywords in certain contexts
> makes good diagnostics even less likely.

Borland's Object Pascal used contextual keywords for many years.

They certainly did not impair good diagnostics, and dramatically improved
readability.

I'm not saying that your fears are unfounded in the context of C++.
However, previous usage shows that the idea is not intrinsically
flawed across languages, and thus should be given due consideration
before rejecting it for C++.

--- Raoul

Larry Evans

Nov 21, 2003, 9:28:32 AM
On 11/20/2003 09:30 AM, Peter N. Lundblad wrote:
[snip]

> return something other than void* and a standard way to get to the raw
> storage where to construct the object. That extension, however, could
> be useful for other things as well, i.e. a new that returns a smart
> pointer so you don't have to have a public constructor on the smart
> pointer taking a raw pointer which is dangerous.
Or an auto_ptr which, instead of taking T* as argument, creates such
a T* in its CTOR. This could then be used to initialize another
type of smart_ptr which does gc. This has also been proposed at:

http://aspn.activestate.com/ASPN/Mail/Message/1521219

An implementation (with a preprocessor work-around for up to 3 CTOR
args) is in auto_new.hpp in:

http://groups.yahoo.com/group/boost/files/shared_cyclic_ptr/cyclic_smart_ptr.zip

Herb Sutter
Nov 21, 2003, 11:27:05 AM
On 18 Nov 2003 15:01:17 -0500, Nicola....@ObjectWay.it (Nicola Musatti) wrote:
>As to point 2 I'm aware that gc's could benefit from modifications to
>the standard, but for add on libraries such as the Boehm collector I
>believe there's nothing wrong in imposing conditions on the use of the
>data it handles.
>I don't find it any different than making sure you don't use iterators
>after performing invalidating operations.

I see we agree, and those are good points. It's reasonable for users to
opt into using third-party GC products if they know and agree that it
could restrict them to programming in a subset of standard C++ (possibly
only a very slight subset).

C++/CLI, which falls under point 1 in any case, had to meet a stricter
compatibility constraint, namely total compatibility with all currently
standards-conforming or popular pointer programming styles.

One interesting consequence of this is that existing programs that make
exotic uses of the native heap and of pointers continue to work unchanged
-- including programs that use the Boehm collector or other collectors to
manage the native heap.

Herb

Convener, ISO WG21 (C++ standards committee) (www.gotw.ca/iso)


Contributing editor, C/C++ Users Journal (www.gotw.ca/cuj)
Visual C++ architect, Microsoft (www.gotw.ca/microsoft)

[ See http://www.gotw.ca/resources/clcm.htm for info about ]

Francis Glassborow
Nov 21, 2003, 9:38:43 PM
In article <10693573...@master.nyc.kbcfp.com>, Hyman Rosen
<hyr...@mail.com> writes

>Francis Glassborow wrote:
> > IMO there is a fundamental flaw in the concept of 'contextual keywords'
> > in that they only work as expected in correctly written code.
>
>But a standard can't be expected to specify the behavior
>of things that don't conform to it.

Perfectly true, but that does not mean we should not be concerned with
code that will compile even though it does not do what the programmer
expects. For example we have the problem of function declarations
hijacking the programmer's intent to define a variable.

Yes, I know the reason that it is that way, but it is one of the problems
with the fragile nature of C++'s declarative syntax (inherited from C
and then extended).

Quite a few of the proposed contextual keywords concern declarations and
that is what concerns me.

>
> > Contextual keywords also add extra requirements to tokenisers and
> > lexers.
>
>I assume that Microsoft already has an implementation,
>so it's clearly not impossible to do.

Without wanting to be unpleasant I have to point out that Microsoft's
track record on issues of correct parsing of Standard C++ is far from
impeccable so I would, on the evidence, not wish to accept, sight
unseen, an implementation of an extended syntax. We already have far too
many awkward corner cases and anyone who believes it is easy to get
right should only study the history of 'argument dependent lookup.'

Notice that you said 'assume' and then deduced 'clearly not impossible'.
I cannot get to your conclusion from your starting point.

>
> > I think there are other avenues that I would wish to explore.
> > This is the problem with the very fast time-table set for TG5, it
> > largely assumes that the currently proposed 'solutions' are good enough.
> > I remain unconvinced. The problems need to be considered by a wider
> > range of language experts, rather than just tinker round the edges of a
> > proposed solution.
>
>No one is preventing you from exploring other avenues.
>But unless there's some concrete problem with the proposal,
>vague worrying is not a good reason for delay.

The worrying is far from vague, it is based on over ten years of
experience in standardising language design (and several times being
ignored when subsequently proved correct -- e.g. the UK objected to
exception specifications unless they could be applied at compile time
and I was one of the main instigators of that objection)

--
Francis Glassborow ACCU
If you are not using up-to-date virus protection you should not be reading
this. Viruses do not just hurt the infected but the whole community.

Francis Glassborow
Nov 21, 2003, 9:40:46 PM
In article <MPG.1a273bc2f...@news.easynet.be>, news user
<ne...@sisyphus.news.be.easynet.net> writes

> > IMO there is a fundamental flaw in the concept of 'contextual keywords'
> > in that they only work as expected in correctly written code. We already
> > have more than enough problem with unhelpful diagnostics. The
> > introduction of keywords that are only keywords in certain contexts
> > makes good diagnostics even less likely.
>
>Borland's Object Pascal used contextual keywords for many years.

The fundamental difference is that Pascal's declarative syntax is far
more robust than C++'s.

>
>They certainly did not impair good diagnostics, and dramatically
>improved readability.

As I would expect, but only for a language with a well-designed
declarative syntax (even Bjarne Stroustrup has been known to comment
that C's declaration syntax was an interesting experiment that failed).

>
>I'm not saying that your fears are unfounded in the context of C++.
>However, previous usage shows that the idea is not intrinsically
>flawed across languages, and thus should be given due consideration
>before rejecting it for C++.

I suggest that that is the wrong way round: they should be demonstrated
to work effectively within the context of C++'s syntax. The rush to an
ECMA Standard is good in so far as it locks down MS and inhibits them
from creative tinkering once the job is done. But it is bad in so far as
it will permanently lock everyone into what will almost certainly prove
to be a far from ideal solution.

My reason for my last statement is based on history. Once we have
something Standardised it is very close to impossible to remove it or
alter it in any way that breaks existing code. One minor example is the
'evil' vector<bool> that permanently blights templates that use a vector
of one of the template type parameters.


--
Francis Glassborow ACCU
If you are not using up-to-date virus protection you should not be reading
this. Viruses do not just hurt the infected but the whole community.

Herb Sutter
Nov 22, 2003, 5:44:07 AM
On 18 Nov 2003 14:58:55 -0500, Nicola....@ObjectWay.it (Nicola Musatti) wrote:
>Properties are a topic of general interest; a better effort should be
>made to have them included in the next version of the standard? What
>happened to the Borland proposal?

Meta-comment: In standards, nothing happens except when someone decides to
invest the expertise, time, and energy to write and champion a proposal. I
think the ISO C++ committee would probably be favorable to looking at
properties -- and threads, for that matter. But that can only happen if
someone steps up to do the work, and no one currently involved in the
committee has so far demonstrated the bandwidth and drive to work on these
areas. Both properties and threads have been presented at committee
meetings, but then their authors for whatever reason did not follow
through and pursue them.

I think it's a shame that properties and threads aren't being actively
worked on in WG21/J16, but like every other proposal someone has to be
willing to invest the hard work to write and promote it and to convince
busy committee members why it should be in the standard. I agree with
Bjarne that properties should be in the language, not because they are a
core language feature (they are not and they can be simulated) but because
they are pervasive, and language support does give better ease of use than
simulating the feature in a library (possibly with compiler magic-assisted
types, where the compiler knows about special types and uses them as keys
for code generation).

FWIW, my understanding is that Borland intends to participate in C++/CLI
and we sure welcome their input on properties in particular. Note that
their approach has some C++ syntax extensions too.


>The hat symbol and gcnew could be replaced with a template like
>syntax, e.g.
>
>cli::handle<R> r = cli::gcnew<R>();

I agree that those are alternatives. Everyone, including me, first pushes
hard for a library-only (or at least library-like) solution when they
first start out on this problem. I think an argument can be made for it,
and at one time I did so too.

To me, a killer argument in favor of a new declarator with usage "R^"
instead of a library-like "cli::handle<R>" is its pervasiveness: It will
be by far the most widely used part of all these extensions, as it's the
common use case the vast majority of the time for CLI types (as objects,
as parameters, etc.). This extremely wide use amplifies two particular
negative consequences we'd like to avoid: First, the long spelling (here
"handle") could in practice effectively become a reserved word just
because people are liable to widely apply "using" to avoid being forced to
write the qualification every time (this is worse if the name chosen is a
common name likely to be used for other identifiers or even macros, and
"handle" is a very common name). Second, and worse, the long spelling
would also make the language several times more verbose in a very common
case than even the Managed Extensions syntax was, and that in turn was
already verbose compared to other CLI languages.

Compare five alternatives side by side:

cli::handle<R> r = cli::gcnew<R>();  // 1: alternative suggested above
handle<R> r = gcnew<R>();            // 2: ditto, with "using"s
R __gc* r = new R;                   // 3: original MC++ syntax
R^ r = gcnew R;                      // 4: C++/CLI syntax
R r = new R();                       // 5: C#/Java syntax

I think you could make a case for any one of these, depending on your
tradeoffs. But I think a tradeoff that favors usability will favor the
last few options.

There are also other issues where having ^ and % declarators/operators
that roughly correspond to * and & enables a more elegant type calculus. I
(or someone on the team) will have to write those up someday, but consider
at some future time when we have full mixed types too: When we can have a
type that inherits from both native and CLR base classes/interfaces, we
will want to be able to pass a pointer to such an object to existing ISO
C++ APIs that take a Base1* and a handle to the same object to existing
CLI APIs that take an Base2^. Both will be common operations and therefore
both should be distinctly expressible with a terse syntax:

class NativeBase { };

// a mixed type
ref class R
    : public NativeBase
    , public System::Windows::Forms::Form
{ };

void NativeFunc( NativeBase* );
void CLIFunc( Object^ );

R r;              // object on the stack
NativeFunc( &r ); // "give me a *" is spelled "&" as usual
CLIFunc( %r );    // "give me a ^" is spelled "%"

In this way, % is to ^ pretty much just as & is to *. If R^ were instead
spelled using a template-like syntax, what would be the corresponding code
to get at it?

Finally, consider the agnostic template case:

template<typename T>
void f( T t ) {
    SomeBase* b = &t;       // I have to have a way of saying "I want a *"
                            // without knowing the type of T
    SomeInterface^ i = %t;  // I have to have a way of saying "I want a ^"
                            // without knowing the type of T
}

I'll write more about the pointer system in the future. For other design
considerations about handles I'll point to Brandon's blog entry again:
http://blogs.gotdotnet.com/branbray/permalink.aspx/c57f8683-5973-4ecc-837c-95e37102e86d

Herb

Convener, ISO WG21 (C++ standards committee) (www.gotw.ca/iso)
Contributing editor, C/C++ Users Journal (www.gotw.ca/cuj)
Visual C++ architect, Microsoft (www.gotw.ca/microsoft)


Herb Sutter
Nov 22, 2003, 5:54:47 AM
On 19 Nov 2003 09:36:04 -0500, Herb Sutter <hsu...@gotw.ca> wrote:
>On 18 Nov 2003 15:02:40 -0500, Richard Smith <ric...@ex-parrot.com>
>wrote:
>>> For those who are interested in seeing the shape of the project, we will
>>> be making the proposed standard's base document (the "starting point"
>>> document) publicly available on November 15.
>>
>>Is this document now available on the web anywhere? I've
>>tried googling for it and looking on the Microsoft and ECMA
>>web sites but have been unable to find it.
>
>We're, um, a few days late. It'll be there by the end of the week. I'll
>post the URL here and to my mailing list and to www.gotw.ca/microsoft (my
>memory being what it is, I ought to add the qualification: if I don't
>forget about it until I suddenly wake up in a panic in the middle of the
>night two weeks later and realize I didn't do it yet).

Oddly, I am remembering to do this. (Despite being quite sleep-deprived,
though that part's quite okay with me because it's due to an adorable new
puppy who just became part of our lives on Wednesday.)

Here's the link:


http://download.microsoft.com/download/9/9/c/99c65bcd-ac66-482e-8dc1-0e14cd1670cd/C++%20CLI%20Candidate%20Base%20Draft.pdf

This is the spec that Microsoft is contributing to the newly-formed ECMA
TC39/TG5 standards committee for consideration for the C++/CLI standards
process. It covers all the main proposed features, and it gives a pretty
thorough look at the scope and shape of what's being contemplated. There
are still places that need to be filled in, though, as well as some
technical decisions that TG5 will need to decide (in addition to any
existing decisions that they may decide to review or change).

Note that this is the last version of the document that will bear a
Microsoft copyright, so we've taken this opportunity to make it publicly
available while we still own it. If ECMA TC39/TG5 adopts this as their
base document, it will henceforth be an ECMA document maintained by that
ECMA group. That means it will be up to TG5 to decide what changes to make
and when to make future drafts publicly available. (From my informal
conversations, I wouldn't be surprised if interim drafts were published
every three months or so, but that's just my personal best guess right
now. We'll have to wait and see when the whole group feels the spec is in
shape for TG5 to feel ready to distribute its own first updated snapshot.)

Whew! It's been a long year, a long month, and a long week. Enjoy! And
please let us know what you think of this. Comments are welcome, and those
of us on the team who are blogging (see my Links) will be answering as
many as we can get to while we spend our days continuing to work on the
Whidbey product.

I'll probably be on the newsgroups only lightly over the next two weeks.
Next week is a short week of course, with the U.S. Thanksgiving holidays
closing most offices on Thursday and Friday. The following week, on Dec
4-5, is the first ECMA TC39/TG5 meeting already, down in College Station,
Texas -- it sure has come up fast.

Best wishes,

Herb Sutter
Nov 22, 2003, 5:57:25 AM
On 19 Nov 2003 13:39:04 -0500, "Attila Feher" <attila...@lmf.ericsson.se> wrote:
>everyone else in this subthread has got answers but me.

Sorry if I missed some people in responding. I haven't been able to keep
up with the thread. Please don't take it personally. :-) We just shipped
the candidate base document an hour ago, it's been a busy week.

>And I would like to
>see some reaction, because the things I have listed (possible bad side
>effects on the mainstream C++ language) are important enough not to be
>ignored.
>
>In general, is Microsoft (and the ECMA group created by it) ready to look
>past Microsofts urgent needs of today (being able to compile C++ code
>better) and take into account the *future* of standard C++? Right now, from
>the answers given to others and the no-answer given to me worries me,
>because it says: nope.

Well, of course the answer is yes; see my other responses and my blog
entries this week, some of which were motivated by your and others'
questions even if I didn't quote everyone directly. We've bent over
backwards to not break standards conformance (current and future), we just
spent a full release cycle mainly on conformance that we sure don't want
to throw away, and we still participate actively in WG21/J16 and actively
track the language and library evolution that's being done by and in
WG21/J16.

For example, I've already had questions about "why are you using typeid<>?
why not just use the word typeof?" We're not using typeof because there's
a chance that WG21 might want to use that word, even though the current
discussion seems to be leaning away from it (you never know when it might
lean back, after all).

>I am pretty sure that all of the design decisions taken by the group can be
>discussed here or in a blog one by one, but the question remains: on what
>bases will/should C++/CLI grab the enum class, when ISO C++ has just decided
>to pursue further the notion of a more class like enum. And that is only
>one of the many questions.

See below.

>And I believe that it would really be good to address the general matter
>those questions bring up: how to make the C++/CLI binding the least
>intrusive to the language.

Absolutely.

>IMHO it is not as good as it could be. Many
>constructs are too C++-ish to be taken by a language binding just like this.

Well, now that the draft spec has just been posted it may answer some
questions (and it may also raise more :-) ). Please let us know what you
think. Specific examples are always best, and we'd be glad of the
feedback.

>happen (and even if it does) MS will be forced to vote down *any* proposals
>in C++ evolution (or library), which would break their new CLI binding.

I don't understand what you mean. Microsoft certainly has no more power
than any other company (including several one-person companies who are J16
members) to vote down any proposal, and has no direct vote at all at the
ISO level. Besides, why would we do that? Our position as a company is
that as WG21/J16 adds features that overlap with C++/CLI, we will do our
best to follow the WG21/J16 design.

In particular, I think you were in the room a few weeks ago when Bjarne
and I presented the nullptr proposal, which was originally being done in
C++/CLI, during which design Bjarne suggested the name nullptr and that we
propose it in WG21/J16. During the Kona meeting, WG21/J16's sentiment was
that they preferred some substantive changes to the design (e.g., that
nullptr have an explicit type that would be deducible and nameable). For
our part, speaking for Microsoft only since I obviously can't speak for
all TG5 members, we support having C++/CLI track what WG21/J16 is
doing in that and other areas of overlap. Another example is delegating
constructors.

In those areas of C++/CLI where WG21/J16 isn't actively doing anything
that overlaps (which is the case for most of the work) we're trying to use
our best judgment for how best to stay out of the way, and comments on how
to do that better are always welcome.

>example it might be the ISO C++ committee, who has to come up with a weird
>name for the new enum, because the enum class is taken, and class enum is
>not distinct enough (or vice versa). If this ever happen it will be a Very
>Bad Thing. IMO it is better to talk about them now and try to solve them
>now, then face the tension a year from now.

Right, and that's what's being done. David Miller is revising his enum
proposal based on EWG direction, C++/CLI's overlapping feature was
explicitly mentioned, and I will be one of the people working with him on
that. I think we're in violent agreement on this topic. :-)

Herb

Convener, ISO WG21 (C++ standards committee) (www.gotw.ca/iso)
Contributing editor, C/C++ Users Journal (www.gotw.ca/cuj)
Visual C++ architect, Microsoft (www.gotw.ca/microsoft)


Gabriel Dos Reis
Nov 22, 2003, 6:02:06 AM
"WW" <wo...@freemail.hu> writes:

| IMHO it is already showing if you follow the signs carefully. I mean as I
| understood the votes of the evolution group, people want a new, more class
| like, safe enum.

That was not clear-cut. I was one of the people who voted against
touching the existing enums and I also voted for improved enums. That
does not mean I want "enum class". What I wanted to express is that I
really want good support for enums. I did not find the suggestion of
touching existing enums in isolation to be a good thing to do. The
reason is that it is a functionality that sits at the intersection of
C and C++ and I'm concerned about compatibility and ABI changes. I do
believe I expressed those concerns when the vote was taken. That is
also the reason I voted against the "new blown" enums, because, in the
end, one would have to address compatibility issues and my guess is
that that will end up with something worse than the existing "broken"
enums.

| And not to touch the existing one. Now it would be a very
| logical step to use the "enum class" series of keywords to defines such a
| thing.

There is a fundamental assumption behind the notion of contextual
keywords that I think does not scale very well: the assumption that what
the programmed parser sees is what Joe Programmer actually intended to
say; moreover, that the programmed parser should try hard to make (its)
sense of a series of tokens. Well, it works up to a point; but then C++
has a long story to tell in that department.

[...]

| There was real concern voiced by some at the WG21 meeting about the fact
| that WG21 (ISO) has no control over the ECMA standardization process. It
| was worded rather as cooperation/coordination, but basically it trims down
| to control: since the ECMA standardization will be finished before WG21/J16
| members (working on C++ evolution but having no money to spend on ECMA)
| could have a word on its work. My biggest fear is that the unfortunate
| happens and C++/CLI (ECMA) will manage to use up C++ constructs that
| WG21 later wants to use for the CLI-independent C++. And those things go

Here I'm expressing my opinions as an individual, not as any national
body representative. I'm much more concerned about openness than
"control". Actually, I don't care about "control". I know it has been
repeated that the process is open to any body who wanted to work the
ECMA CLI/C++ thingy. However, as a matter of fact, I do have the
impression that it is not a process as open as the WG21 effort. The
schedule is set so that individuals like me have no *practical* chance
to follow and contribute to that standardization. Such people would
need to work fulltime and have huge budget allocated for that job.
That impression might not be right but it is what I got after following all
the technical sessions and presentations. I also have the unpleasant
feeling that it is a way of getting an ISO binding standard to C++,
without playing an open game as WG21. I would call it a "take over",
but, hey, I don't. I would love to be proven wrong.

--
Gabriel Dos Reis
g...@integrable-solutions.net

Gabriel Dos Reis
Nov 22, 2003, 6:02:46 AM
Nicola....@ObjectWay.it (Nicola Musatti) writes:

| The standard already provides a way to avoid conflicts when
| introducing new keywords: prepend a double underscore.

That is *a* way of solving that problem. Some like it. C did something
similar. Personally, I don't like it.

We have namespaces, i.e. facilities to manage identifiers. I would
prefer we take advantage of them.

Gabriel Dos Reis
Nov 22, 2003, 11:42:52 AM
Herb Sutter <hsu...@gotw.ca> writes:

| On 18 Nov 2003 14:57:59 -0500, pdi...@mmltd.net (Peter Dimov) wrote:
| >This is all understandable, but the fact that choosing 'ref' as one of
| >the keywords breaks three TR1 libraries (and who knows how much other
| >code) might deserve mentioning.
|
| Fortunately, it doesn't. :-) C++/CLI does not take ref as a reserved word,
| only as a contextual keyword. Thus no uses of ref as an identifier are
| affected. (This is by design.)

I understand that. But I find the philosophy behind the notion of
"contextual keywords" fundamentally broken, as far as (human)
programming is concerned.
Contextual keywords work on the basis that whatever sense the
programmed parser comes up with for a series of tokens actually matches Joe
Programmer's intent. As C++ programmers and designers, we do have
experience in that department and I would not say it is really that
conclusive. It certainly is a fun and exciting (academic?) exercise
in parsing, but I'm NOT convinced that I would push for it. I
certainly do know that I would vote against it, if it were proposed
for C++.

--
Gabriel Dos Reis
g...@integrable-solutions.net


Gabriel Dos Reis
Nov 22, 2003, 11:44:18 AM
Francis Glassborow <fra...@robinton.demon.co.uk> writes:

| In article <h74lrv0d7uu8uafaf...@4ax.com>, Herb Sutter
| <hsu...@gotw.ca> writes
| >Fortunately, it doesn't. :-) C++/CLI does not take ref as a reserved word,
| >only as a contextual keyword. Thus no uses of ref as an identifier are
| >affected. (This is by design.)
|
|
| IMO there is a fundamental flaw in the concept of 'contextual keywords'
| in that they only work as expected in correctly written code.

I completely agree with Francis here.
It is something that works, for instance, in situations of generated
code that has been (mechanically) proven to be correct.
I believe C++ is better NOT to take that route.

[...]

| This is the problem with the very fast time-table set for TG5, it
| largely assumes that the currently proposed 'solutions' are good enough.

That matches my feelings (in a previous message) of non-openness.

--
Gabriel Dos Reis
g...@integrable-solutions.net


Gabriel Dos Reis
Nov 22, 2003, 11:45:24 AM
Hyman Rosen <hyr...@mail.com> writes:

| Francis Glassborow wrote:
| > IMO there is a fundamental flaw in the concept of 'contextual keywords'
| > in that they only work as expected in correctly written code.
|
| But a standard can't be expected to specify the behavior
| of things that don't conform to it.

Yes. But you have to consider the target. If the standard is
targeting machines that program and that do not make errors, then
that may be fine. Otherwise, one can't ignore possible mistakes and
means to catch those errors. Contextual keywords work on the basis
that the program is correct. That might be perfect for machines.
I doubt it scales for Joe Programmer.

--
Gabriel Dos Reis
g...@integrable-solutions.net


Gabriel Dos Reis
Nov 22, 2003, 11:46:08 AM
news user <ne...@sisyphus.news.be.easynet.net> writes:

| > IMO there is a fundamental flaw in the concept of 'contextual keywords'
| > in that they only work as expected in correctly written code. We already
| > have more than enough problem with unhelpful diagnostics. The
| > introduction of keywords that are only keywords in certain contexts
| > makes good diagnostics even less likely.
|
| Borland's Object Pascal used contextual keywords for many years.

A key question is: does it have the same level of complexity as C++?

I believe we have enough gotchas to cope with in current C++; I'm not
convinced that we should multiply them by any factor greater than 1.

--
Gabriel Dos Reis
g...@integrable-solutions.net


Gabriel Dos Reis
Nov 22, 2003, 6:29:34 PM
Herb Sutter <hsu...@gotw.ca> writes:

[...]

| >happen (and even if it does) MS will be forced to vote down *any* proposals
| >in C++ evolution (or library), which would break their new CLI binding.
|
| I don't understand what you mean. Microsoft certainly has no more power
| than any other company (including several one-person companies who are J16
| members) to vote down any proposal, and has no direct vote at all at the
| ISO level.

Absolutely true. But I don't think the corresponding ISO standard
could be made incompatible in any way with the ECMA standard, which
is a really worrying point. The point isn't that people would intend to
make incompatible changes to the ISO standards. The point is that
once the ECMA document is accepted as a standard, there is very little
(if any) chance that the ISO document will be different, even if it
turns out that there are problems with the ECMA document and its relation
to the ISO C++ WP or functionalities. The end result is really that
of having an ISO standard that was not really openly conducted. I
know you said that anyone can join the ECMA process. But I think that
is an abstract statement. One really has to face this issue from a
practical point of view. Given the schedule and the fees, there is
little hope for individuals (like me) to have resources (time and
money) to invest in that process. I believe that is a real concern
for many individuals involved in the ISO C++ effort.
That is a point that, I believe, should not be overlooked.

--
Gabriel Dos Reis
g...@integrable-solutions.net


White Wolf
Nov 22, 2003, 7:27:12 PM
Gabriel Dos Reis wrote:
> "WW" <wo...@freemail.hu> writes:
>
>> IMHO it is already showing if you follow the signs carefully. I
>> mean as I understood the votes of the evolution group, people want a
>> new, more class like, safe enum.
>
> That was not clearly cut. I was one of the people who voted against
> touching the existing enums and I also voted for improved enums. That
> does not mean I want "enum class". What I wanted to express is that I
> really want good support for enums. I did not find the suggestion of
> touching existing enums in isolation as a good thing to do. The
> reason is that, it is a functionality that sits at the intersection of
> C and C++ and I'm concerned about compatibility and ABI changes. I do
> believe I expressed those concerns when the vote was taken. That is
> also the reason I voted against the "new blown" enums, because, at the
> end, one would have to address compatibility issues and my guess is
> that that will end up with something worse than the existing "broken"
> enums.

I think I understand you. What is it, then, that you really want?

As for my point: I did say "as I understood the vote". :-) Of
course that is just what I see. And more to the point: what I feel is that
there is a chance to have an enum class in C++, and it would be sad if it
ended up with some weird name just because the obvious one is taken. And it
was a suggestion about a broader issue in reusing existing C++ keywords.

>> And not to touch the existing one. Now it would be a very
>> logical step to use the "enum class" series of keywords to defines
>> such a thing.
>
> There is a fundamental assumption behind the notion of contextual
> keywords that I think does not scale very: The assumption that what
> the programmed parser sees is what Joe Programmer actually intended to
> say; moreover the programmed parser should try hard to make (its)
> sense of series of tokens. Well, it works to a point; but then C++
> has a long story to tell in that department.

I guess you and Francis know a lot more about this than I do. If you care
to share what kind of problems you are talking about, it would help (at
least me) to see the point.

[SNIP]


> Here I'm expressing my opinions as an individual, not as any national
> body representative. I'm much more concerned about openness than
> "control". Actually, I don't care about "control". I know it has
> been repeated that the process is open to any body who wanted to work
> the ECMA CLI/C++ thingy. However, as a matter of fact, I do have the
> impression that it is not a process as open as the WG21 effort. The
> schedule is set so that individuals like me have no *practical* chance
> to follow and contribute to that standardization. Such people would
> need to work full-time and have a huge budget allocated for that job.

Well, at least it feels like it. You need a really dedicated organization
behind you to be able to attend all those meetings and really be part of the
work.

> That impression might not be right but it is what I got after
> following all the technical sessions and presentations. I also have
> the unpleasant feeling that it is a way of getting an ISO binding
> standard to C++, without playing an open game as WG21. I would call
> it a "take over", but, hey, I don't. I would love to be proven wrong.

Well... Herb's presentation to WG21 suggests otherwise, but of course I
am not good at politics... OTOH I cannot imagine Herb playing politics. I
just cannot. There is a chance of hijacking the future of C++ here, but I
do not think that will happen with Herb on both committees. I know this is
not a proof, but I would have to rebuild my whole world view if it
happened...

About the costs. As Herb said, they are pushing ECMA to provide
affordable pricing for smaller companies. And that is - at least -
promising, and it suggests - to me - that they really do want everyone to
be involved. OTOH this is the very case where nothing is fully black or
white (unless you are in a Hollywood movie). Microsoft failed to bring
C++ onto .NET with the previous attempt. Now they need to do it *fast*.
Whether you can get on the train is secondary to that, because they do not
want to miss the train - and they are many, many years behind, compared to
some language named after a coffee bean. I also feel (but as I am not
good at business or politics I might be very wrong) that it is important
for C++ inside Microsoft that it is brought onto the CLI, and done
effectively. And if the previous is true, it is inevitably important for
the whole C++ community. Why? Because we do not want Microsoft to give up
on C++ for apparent reasons.

The rest might look wildly off-topic, but I mention it because I sincerely
believe that whatever happens around this CLI/C++ binding will reflect
back on the language. In my mind the ECMA group has an enormous task and
many objections to answer. One I hear more and more often is that C++
equals Microsoft. If they want to keep that assumption from taking hold
in people's minds, they have to prove that there will be no taking over or
hijacking, not even the slightest valid accusation of it. In addition
they have to make a huge effort to open up the ECMA process to all those
interested - I believe even as far as sponsoring some smaller companies so
they can delegate their experts. It might be a far-fetched idea, but if
they want to ensure they get all the best minds working on this, with no
shadow of an accusation of hijacking, it is probably a possible route to
take. The other thing to avoid is people thinking that C++ will be bound
to the CLI, or that C++ is only used to bring people to the CLI and then
force them to move to a Microsoft proprietary language. And IIRC I have
already seen these last two accusations in writing on some newsgroup.

I know I only wrote about the political aspects above, but I think I have
already said all my technical "ideas" before.

--
WW aka Attila

White Wolf

Nov 22, 2003, 7:28:18 PM
Herb Sutter wrote:
> On 19 Nov 2003 13:39:04 -0500, "Attila Feher"
> <attila...@lmf.ericsson.se> wrote:
> >everyone else in this subthread has got answers but me.
>
> Sorry if I missed some people in responding. I haven't been able to
> keep up with the thread. Please don't take it personally. :-)
> We just shipped the candidate base document an hour ago,
> it's been a busy week.

That is what I thought. I just have this ugly urge to push buttons when I
do not get an answer. :-) Or is it that I am way too honest even for my
own taste...? I pointed out the missing answer because at the time it
looked bad. Now, reading your article, I have to say that my paranoia
about the issue not being looked at has proven unfounded. And that is
good.

And of course I do not take anything personally, especially because this
issue is much bigger than me. If I do not like the future C++ I can
always switch to being a Java programmer. ;-)

> >And I would like to
> >see some reaction, because the things I have listed (possible bad
> side >effects on the mainstream C++ language) are important enough
> not to be >ignored.
> >
> >In general, is Microsoft (and the ECMA group created by it) ready
> to look >past Microsofts urgent needs of today (being able to
> compile C++ code >better) and take into account the *future* of
> standard C++? Right now, from >the answers given to others and the
> no-answer given to me worries me, >because it says: nope.
>
> Well, of course the answer is yes; see my other responses and my blog
> entries this week, some of which were motivated by your and others'
> questions even if I didn't quote everyone directly. We've bent over
> backwards to not break standards conformance (current and future), we
> just spent a full release cycle mainly on conformance that we sure
> don't want
> to throw away, and we still participate actively in WG21/J16 and
> actively track the language and library evolution that's being done
> by and in WG21/J16.

My point was that the "speed" difference between the two committees seems
to be the issue. Even with the best intentions it is pretty hard to
predict what WG21/J16 will come up with in the next n+1 years - for the
"next C++". It is what it is; this is what we have got. That is why I
proposed that the CLI binding look for some sort of CLI-related keywords,
thereby lessening the chance of a collision.

> For example, I've already had questions about "why are you using
> typeid<>? why not just use the word typeof?" We're not using typeof
> because there's
> a chance that WG21 might want to use that word, even though the
> current discussion seems to be leaning away from it (you never know
> when it might lean back, after all).

Yes, that is what I mean. It *is* a huge challenge and I *am* jealous of
all of those who can be involved. :-)

[SNIP]


> >And I believe that it would really be good to address the general
> matter >those questions bring up: how to make the C++/CLI binding
> the least >intrusive to the language.
>
> Absolutely.

Phew. :-)

> >IMHO it is not as good as it could be. Many
> >construct are too C++-ish to be taken by a language binding just
> like this.
>
> Well, now that the draft spec has just been posted it may answer some
> questions (and it may also raise more :-) ). Please let us know what
> you think. Specific examples are always best, and we'd be glad of the
> feedback.

I am going to read it, thank you for the opportunity. Where do you take
reactions to it? I am not saying I will have anything clever enough that
I will actually want to say - but if I or anyone else has a comment, what
is the place? One of the MS newsgroups? This NG might not be it for most
of the possible comments...

> >happen (and even if it does) MS will be forced to vote down *any*
> proposals >in C++ evolution (or library), which would break their
> new CLI binding.
>
> I don't understand what you mean.

Clearly. I was not good - again - in saying what I mean.

> Microsoft certainly has no more
> power
> than any other company (including several one-person companies who
> are J16 members) to vote down any proposal, and has no direct vote at
> all at the
> ISO level. Besides, why would we do that? Our position as a company is
> that as WG21/J16 adds features that overlap with C++/CLI, we will do
> our best to follow the WG21/J16 design.

I did not mean that MS has any more power than others. All I meant was
that if MS has an ECMA standard voted in, has invested man-centuries into
an implementation, and its customers have invested man-millennia into code
using that standard... well, there would be a clear problem. If WG21 ends
up breaking ECMA/TG5 code: that is bad. If it has to come up with an
awkward syntax for the future C++ so as not to break CLI/C++ code: that is
bad too.

> In particular, I think you were in the room a few weeks ago when

Yes, I was!

> Bjarne and I presented the nullptr proposal, which was originally being
done
> in C++/CLI, during which design Bjarne suggested the name nullptr and

[...]


> In those areas of C++/CLI where WG21/J16 isn't actively doing anything
> that overlaps (which is the case for most of the work) we're trying
> to use our best judgment for how best to stay out of the way, and
> comments on how to do that better are always welcome.

Well, the only idea I had is to have some CLI-specific word trigger most
CLI-specific constructs, and to try to make the hat idea work in a way
that it can be reused later... not much, I know.

> >example it might be the ISO C++ committee, who has to come up with
> a weird >name for the new enum, because the enum class is taken, and
> class enum is >not distinct enough (or vice versa). If this ever
> happen it will be a Very >Bad Thing. IMO it is better to talk about
> them now and try to solve them >now, then face the tension a year
> from now.
>
> Right, and that's what's being done. David Miller is revising his enum
> proposal based on EWG direction, C++/CLI's overlapping feature was
> explicitly mentioned, and I will be one of the people working with
> him on that. I think we're in violent agreement on this topic. :-)

Great great great! I don't know about it. I am supposed to be on the
standardization mailing list (I mean I was told I will be, so I mean it that
way not that it is a right or something) but I am not - so unfortunately I
do not really see what is happening. :-(

Thank you for your kind words - you have reassured me again. :-)

--
WW aka Attila

Herb Sutter

Nov 23, 2003, 9:54:25 AM
On 18 Nov 2003 18:15:07 -0500, "Andrew Browne"
<clcppm...@this.is.invalid> wrote:
>As I understand it, the crucial argument here is that the compacting
>GC can move objects around and that standard C++ pointers do not allow
>for this. Ordering is mentioned, specifically with regard to set<T*>.

Yes, that and pointer hiding techniques.

>However I believe standard C++ only specifies pointer comparison
>between arbitrary pointers for void pointers (and I think this was
>actually a late language addition, specifically to support such things
>as set<T*>.)

Actually it's for all pointer types. The paragraph you're thinking of is
20.3.3/8:

For templates greater, less, greater_equal, and less_equal, the
specializations for any pointer type yield a total order, even
if the built-in operators <, >, <=, >= do not.

>Would it be possible for a conversion to void pointer to
>perform an implicit pin so that the ordering requirement would be
>satisfied?
>
>If that were possible, then presumably it would also be possible to
>perform an implicit pin when a pointer is cast to an integer type, so
>that it could be cast back to yield the same (still valid) pointer.

The phrase "implicit pin" sends up red flags in my head. :-) It turns out
that pins are expensive and dangerous. They therefore should be explicit,
short-lived, and very rare.

Consider: A pin creates a sandbar, an impediment to garbage collection.
The most fundamental design assumption in a compacting GC is that it can
move blocks around freely, like an unimpeded flow of water. But a pinned
object can't move, and things can and do jam up behind it until it gets
loose (unpinned) again. A favorite example of Jeff Peil's is that, if you
happen to be unlucky enough to pin the very last object in Generation 0,
you may not be able to allocate memory until the pin is released: The
allocator would make an initial attempt to get memory, and when that
failed it would automatically run a GC cycle to try to reclaim memory (but
this would not change anything because the GC can't get past the sandbar),
and then the allocator would try one more time, and when that again failed
the allocator would give up and report out of memory. Granted, the results
will rarely be this bad, but this should illustrate why pins are no
panacea and certainly oughtn't to be created implicitly or frequently.

Because pins should be short-lived, the CLI allows pinning pointers to
exist only on the stack. (Yes, you could still put one in main(), but
please don't.) Because pins should be explicit, the current C++/CLI draft
specifies them as a deliberately-uglier-to-spell "pin_ptr<T>" and never
creates them implicitly.

Herb

Convener, ISO WG21 (C++ standards committee) (www.gotw.ca/iso)
Contributing editor, C/C++ Users Journal (www.gotw.ca/cuj)
Visual C++ architect, Microsoft (www.gotw.ca/microsoft)

[ See http://www.gotw.ca/resources/clcm.htm for info about ]

Gabriel Dos Reis

Nov 23, 2003, 10:01:12 AM
"White Wolf" <wo...@freemail.hu> writes:

| > That impression might not be right but it is what I got after
| > following all the technical sessions and presentations. I also have
| > the unpleasant feeling that it is a way of getting an ISO binding
| > standard to C++, without playing an open game as WG21. I would call
| > it a "take over", but, hey, I don't. I would love to be proven wrong.
|
| Well... Herb's presentation on the WG21 suggests otherwise, but of course I
| am not good in politics... OTOH I cannot imagine Herb playing politics. I
| just cannot.

I was not suggesting Herb is playing politics. If I recall correctly,
Herb is one of the *two* technical "advisors" (I don't recall the exact
title off the top of my head) on that board. I have no clue how all that
works, but I did not have the impression that all the cards are in Herb's
hands. So it would be, IMHO, totally misleading and insulting to
characterize him as playing politics.

But, that does not remove the impression I got from the talks.

As far as language design is concerned, not all decisions are purely
technically based. Some facts may happen without support from the
technical experts. That has happened often.

| There is a chance of hijacking the future C++ here, but I do
| not think that will happen with Herb in both committees. I know this is not
| a proof, but I would have to rebuild my whole world if it would happen...

Again, I refuse to focus on individuals/persons. Bjarne Stroustrup
is also listed as being on the technical board, and I do know he cares
about C++ as much as Herb. However, I do not see that as a sufficient
guarantee that the ECMA CLI/C++ thingy would not be a "take over".
As I said, I'm NOT expecting a "take over" from those people. They are
not known to play politics. But, with no intent of being insulting,
their presence on the technical board is not a sufficient guarantee --
IMO -- that I should not worry about what is happening.

| About the costs. As Herb said, they are pushing ECMA to provide
| affordable pricing for smaller companies. And that is - at least - promising and it
| suggests - to me - that they really do want everyone to be involved. OTOH

Intent is one thing, what happens is another. As things are currently
set, I see no practical evidence that the process is as open as WG21
(I'm not saying WG21 process is perfect and should be a model, but at
least, it is much more affordable).

| this is that very case of nothing is fully black or white (unless you are in
| a Hollywood movie). Microsoft has failed to bring C++ on .NET with the
| previous attempt. Now they need to do it *fast*. Whether you can get on
| the train is secondary to that, because they do not want to miss the train - and
| they are many many years behind, compared to some language named after a
| coffee bean. I also feel (but as I am not good in business or politics I
| might be very wrong) that for C++ inside Microsoft it is important that it
| is brought onto the CLI and it is done effectively. And if the previous is
| true, it is inevitable that it is important for the whole C++ community.
| Why? Because we do not want Microsoft to give up on C++ for apparent
| reasons.

I have no problem with a given commercial corporation growing its own
standard for its products. However, I'm concerned about an ISO
standardization process that, in many ways, affects/relates to a given
ISO-standardized programming language that is not the property of any
corporation. If the ECMA CLI/C++ thingy is believed to be important to
the C++ community and that community should care about it, then I
think that process should be at least as open as the process that
is in charge of standardizing and evolving C++.

[...]

| to all those interested. I believe that even as far as sponsoring some
| smaller companies to be able to delegate their experts.

They might also be suspected of conflicts of interest. I really value a
standard developed by independent people over a document sponsored
mostly by a given corporation. C++ gets part of its credibility from
the fact that it is not "funded" in any sense by a given company.

[...]

| I know I only wrote about the political aspects above, but I think I have
| said already all the technical "ideas" before.

(Un?)fortunately, language design and standardization are affected by
non-purely technical issues.

--
Gabriel Dos Reis
g...@integrable-solutions.net


Herb Sutter

Nov 23, 2003, 10:04:31 AM
On 20 Nov 2003 10:30:44 -0500, lund...@linux.nu (Peter N. Lundblad)
wrote:

>pdi...@mmltd.net (Peter Dimov) wrote in message news:<7dc3b1ea.03111...@posting.google.com>...
>> Given that the ^ things are halfway between pointers and values, as
>> evidenced by the desire to overload operator+ with "value" semantics
>> on T^, the above seems reasonable.
>
>IMO, this is counterintuitive, but that depends on whether you view the
>CLI handle as a value type or as a new kind of pointer (with limitations).

^ really does have pointerlike semantics, rather than valuelike semantics.
And it is important to distinguish between R and R^ (an object of ref
type, and a handle to an object).

I think that the presence of CLI operators on ^'s actually argues more for
pointerlike semantics than valuelike semantics, because unlike the usual
C++ operators they nearly always result in binding to a newly created
object. For example, for T^'s a and b, a += b results in adding the value
in *a to the value in *b and putting the result in a new object, and a now
refers to that. (Yes, whereas for the usual C++ operators we can implement
+ in terms of +=, it's the other way around for the CLI operators, which
takes getting used to.)

>I also don't see the need for gcnew. Why not use placement new, i.e.:
>T^ h = new (CLI::gc) T(14);
>This would require an extension allows placement new overloads to
>return something other than void* and a standard way to get to the raw
>storage where to construct the object. That extension, however, could
>be useful for other things as well, i.e. a new that returns a smart
>pointer so you don't have to have a public constructor on the smart
>pointer taking a raw pointer which is dangerous.

There are several reasons why we didn't use a form of placement new.

One reason is that we wanted to leave a door open in case in the future we
wanted to allow placement and class-specific forms of gcnew. Having a
parallel gcnew expression and operator best serves leaving that door open.

Another reason is that existing libraries, including GC libraries, already
use placement forms of new, and so many of the possible placement names
are taken. In particular, "new (gc) X" is already taken by the Boehm
collector. Yes, I know you suggested CLI::gc instead of plain gc, but in
practice I'm still concerned that enough people are liable to frequently
write "using namespace CLI;" (actually stdcli) to make this problematic.

Still another is one you cite: It's easier to teach that "the type of a
new-expression (and operator new) is a *" today, and that "the type of a
gcnew-expression is a ^".

Finally, a minor reason is that "gcnew" is slightly less typing than "new
(gc)" or "new (cli)", and moderately less typing than "new (stdcli::gc)".

Herb

Convener, ISO WG21 (C++ standards committee) (www.gotw.ca/iso)
Contributing editor, C/C++ Users Journal (www.gotw.ca/cuj)
Visual C++ architect, Microsoft (www.gotw.ca/microsoft)


Hyman Rosen

Nov 23, 2003, 10:06:29 AM
Francis Glassborow wrote:
> Without wanting to be unpleasant I have to point out that Microsoft's
> track record on issues of correct parsing of Standard C++ is far from
> impeccable so I would, on the evidence, not wish to accept, sight
> unseen, an implementation of an extended syntax.

ECMA is designed for fast-tracking standards. Microsoft has come up
with its CLI-to-C++ binding, and wants to submit it as such a standard.
Unless there is something clearly wrong with the submitted proposal,
it should be accepted.

In any case, <http://www.ecma-international.org/news/ecma-TG5-PR.htm>
states that this binding was developed by Microsoft, Edison Design Group,
and Dinkumware. If that much C++ brainpower can't come up with something
good, I don't see how anyone else will do better.

Herb Sutter

Nov 23, 2003, 10:19:16 AM
On 22 Nov 2003 11:45:24 -0500, Gabriel Dos Reis
<g...@integrable-solutions.net> wrote:

>Hyman Rosen <hyr...@mail.com> writes:
>
>| Francis Glassborow wrote:
>| > IMO there is a fundamental flaw in the concept of 'contextual keywords'
>| > in that they only work as expected in correctly written code.

[...]


>one can't ignore possible mistakes and
>means to catch those errors. Contextual keywords work on the basis
>that the program is correct. That might be perfect for machines.
>I doubt it scales for Joe Programmer.

This subthread, starting with Francis' comment and including Gaby's above,
makes an excellent point that should not be underestimated. Quality of
diagnostics is indeed a key issue: If the programmer makes an error, what
will the diagnostic say? Will it be appropriate to what the programmer was
probably trying to do? Or will it be misleading?

Certainly there are contextual keyword designs that are fraught with
peril. We considered some and have already rejected many of them, several
for exactly this reason.

Let me write a larger response about C++/CLI keywords and how they're
currently designed, and then I'll get to this question in category #4
below (the only place where it applies).

C++/CLI specifies several keywords as extensions to ISO C++. The way they
are handled falls into five major categories, where only the first impacts
the meaning of existing ISO C++ programs.


1. Outright reserved words

As of this writing (November 22, 2003, the day after we released the
candidate base document), C++/CLI is down to only three reserved words:

gcnew generic nullptr

An existing program that uses these words as identifiers and wants to use
C++/CLI would have to rename the identifiers. I'll return to these three
again at the end.

All the other keywords, below, are contextual keywords that do not
conflict with identifiers. Any legal ISO C++ program that already uses the
names below as identifiers will continue to work as before; these keywords
are not reserved words.


2. Spaced keywords

One implementation technique we are using is to specify some keywords that
include embedded whitespace. These are safe: They can't possibly conflict
with any user identifiers because no C++ program can create an identifier
that contains whitespace characters. [I'll omit the obligatory reference
to Bjarne's classic April Fool's joke article on the whitespace operator.
:-) But what I'm saying here is true, not a joke.]

Currently these are:

for each
enum class/struct
interface class/struct
ref class/struct
value class/struct

For example, "ref class" is a single token in the lexer, and programs that
have a type or variable or namespace named "ref" are entirely unaffected.
(Somewhat amazingly, even most *macros* named "ref" are unaffected and
don't affect C++/CLI, unless coincidentally the next token in the macro's
definition line happens to be "class" or "struct"; more on this near the
end.)


3. Contextual keywords that are never ambiguous

Another technique we used was to define some keywords that can only appear
in positions in the language grammar where today nothing may appear. These
too are safe: They can't conflict with any user identifiers because no
identifiers could appear where the keyword appears, and vice versa.
Currently these are:

abstract finally in override sealed where

For example, "abstract" as a C++/CLI keyword can only appear in a class
definition after the class name and before the base class list, where
nothing can appear today:

ref class X abstract : B1, B2 { // ok, can only be the keyword
int abstract; // ok, just another identifier
};

class abstract { }; // ok, just another identifier

namespace abstract { /*...*/ } // ok, just another identifier


4. Contextual keywords that can be ambiguous with identifiers

Some keywords can appear in a grammar position where an identifier could
also appear, and this is the case that needs some extra attention. There
are currently five keywords in this category:

delegate event initonly literal property

In such grammar positions, when the compiler encounters a token that is
spelled the same as one of these keywords, the compiler can't know whether
the token means the keyword or whether it means an identifier until it
first does some further lookahead to consider later tokens. For example,
consider the following inside a class scope:

property int x; // ok, here property is the contextual keyword
property x; // ok, if property is the name of a type

There's no real ambiguity for "property" because as far as I can tell you
can't write legal C++/CLI code where "property" could be legally
interpreted both ways, both as the keyword and as an identifier.

Now imagine you're a compiler: What do you do when you hit the token
"property" as the first token of the next class member declaration?
There's not enough information to decide for sure whether it's an
identifier or a keyword without looking further ahead, and C++/CLI has to
specify the decision procedure -- the rules for deciding whether it's a
keyword or an identifier. As long as the user doesn't make a mistake
(i.e., as long as it's a legal program with or without C++/CLI) the answer
is clear, because there's no ambiguity.

But the question which Francis and Gaby raised comes up precisely in this
case: What if the user makes a mistake? For example:

property x; // error, if no type "property" exists

Let's say that we set up a disambiguation rule with the following general
structure (I'll get specific in just a moment):

1. Assume one case and try to parse what comes next that way.
2. If that fails, then assume the other case and try again.
3. If that fails, then issue a diagnostic.

In the case of "property x;" when there's no type in scope named
"property", both #1 and #2 will fail and the question is: When we get to
the diagnostic in case #3, what error message is the user likely to see?
The answer almost certainly is, a message that applies to the second
"other" case. Why? Because the compiler already tried the first case,
failed, backed up and tried the second "other" case -- and it's still in
that latter mode with all that context when it finally realizes that
didn't work either and now it has to issue the diagnostic. So by default,
absent some (often prodigious) amount of extra work inside the compiler,
the diagnostic that you'll get is the one that's easiest to give, namely
the one for the case the compiler was most recently pursuing, namely the
"other" case mentioned in #2 -- because the compiler already gave up on
the first case, and went down the other path instead.

So let's get specific. Let's say that the rule we picked was:

1. Assume that it's an identifier and try to parse it that way
(i.e., by default assume no use of the keyword extension).
2. If that fails, then assume that it's the keyword and try again.
3. If that fails, then issue a diagnostic.

Under that rule, what's the diagnostic the user gets on an illegal
declaration of "property x;"? One that's in the context of #2 (keyword),
something like "illegal property declaration," perhaps with a "the type
'x' was not defined" or a "you forgot to specify the type for property
'x'" in there somewhere.

On the other hand, let's say that the rule we picked was:

1. Assume that it's the keyword and try to parse it that way.
2. If that fails, then assume that it's an identifier and try again.
3. If that fails, then issue a diagnostic.

Under this rule, the diagnostic that's easy to give is something like "the
type 'property' was not defined."

Which is better?

This illustrates why it's very important to consider common mistakes and
whether the diagnostic the user will get really applies to what he was
probably trying to do. In this case, it's probably better to emit
something like "no type named 'property' exists" than "you forgot to
specify a type for your property named 'x'" -- the former is more likely
to address what the user was trying to do, and it also happens to preserve
the diagnostics for ISO C++ programs.

More broadly, of course, there are other rules you can use than the two
"try one way then try the other" variants shown above. But I hope this
helps to give the flavor for the 'quality of diagnostics' problem, and why
it's only an issue for five keywords.

I feel compelled to add that the collaboration and input over the past
year-plus from Bjarne Stroustrup and the folks at EDG (Steve Adamczyk,
John Spicer, and Daveed Vandevoorde, www.edg.com) has been wonderful and
invaluable in this regard specifically. It has really helped to have input
from other experienced compiler writers, including in Bjarne's case the
creator of the first C++ compiler and in EDG's case the folks who have one
of the world's strongest current C++ compilers. On several occasions all
of their input has helped get rid of inadvertent assumptions about "what's
implementable" and "what's diagnosable" based on just VC++'s own compiler
implementation and its source base. What's easy for one compiler
implementation is not necessarily so for another, and it's been extremely
useful to draw on the experience of comparing notes from two current
popular ones to make sure that features can be implemented readily on
various compiler architectures and source bases (not just VC++'s) and with
quality user diagnostics.


5. Not keywords, but in a namespace scope

Finally, there are a few "namespaced" keywords. These make the most sense
for pseudo-library features (ones that look and feel like library
types/functions but really are special names known to the compiler because
the compiler does special things when handling them). They appear in the
stdcli namespace and are:

array interior_ptr pin_ptr safe_cast

That's it.


Now, for a moment let's go back to case #1, reserved words. Right now
we're down to three reserved words. What would it take to get down to
zero? Consider the cases:

- nullptr: This has been proposed in WG21/J16 for C++0x, and at the last
meeting three weeks ago the evolution working group (EWG) was favorable to
it but wanted a few changes. That proposal paper written by me and Bjarne
is here: http://std.dkuug.dk/jtc1/sc22/wg21/docs/papers/2003/n1488.pdf .
We will revise the paper for the next meeting to reflect the EWG
direction. If C++0x does adopt the proposal and chooses to take the
keyword "nullptr" then the list of C++/CLI reserved words goes down to two
and C++/CLI would just directly follow the C++0x design for nullptr,
including any changes C++0x makes to it.

- gcnew: One obvious way to avoid taking this as a reserved word would be
to put it into bucket #2 as a spaced keyword, "gc new".

- generic: Similarly, a spaced keyword (possibly "generic template") would
avoid taking this reserved word. Unfortunately, spelling it "<anything>
template" is not only ugly, but seriously misleading because a generic
really is not at all a template.

Is it worth it to push all the way down to zero reserved words in C++/CLI?
There are pros and cons to doing so, but I've certainly always been
sympathetic to the goal of zero reserved words; Brandon and others will
surely tell you of my stubborn campaigning to kill off reserved words (I
think I've killed off over a half dozen already since I took the reins of
this effort in January, but I haven't kept an exact body count).

I think the right time to decide whether to push for zero reserved words
is probably near the end of the C++/CLI standards process (summer-ish
2004). At that point, when all other changes and refinements have been
made and everything else is in its final form, we will have a complete
(and I hope still very short) list of places where C++/CLI could change
the meaning of an existing C++ program, and that will be the best time to
consider them as a package and to make a decision whether to eliminate
some or all of them in a drive-it-to-zero cleanup push. I am looking
forward to seeing what the other participants in all C++ standards arenas,
and the broader community, think is the right thing to do as we get there.


Putting it all together, what's the impact on a legal ISO C++ program?
Only:

- The (zero to three) reserved words, which we may get down to zero.

- Macros with the same name as a contextual keyword, which ought to be
rare because macros with all-lowercase names, never mind names that
are common words, are already considered bad form and liable to
break way more code than just C++/CLI. (For example, if a macro
named "event" existed it would already be breaking most attempts
to use Standard C++ iostreams, because the iostreams library has
an enum named "event".)

Let me illustrate the macro cases with two main examples that affect the
spaced keywords:

// Example 1: this has a different meaning in ISO C++ and C++/CLI
#define interface struct

In ISO C++, this means change every instance of "interface" to "struct".
In C++/CLI, because "interface struct" is a single reserved word, the
macro means instead to change every instance of "interface struct" to
nothing.

Here's the simplest workaround:

// Workaround 1: this has the same meaning in both
#define interface interface__
#define interface__ struct

Here's another example of a macro that can change the meaning of a program
in ISO C++ and C++/CLI:

// Example 2: this has a different meaning in ISO C++ and C++/CLI
#define ref const
ref class C { } c;

In ISO C++, "ref" goes to "const" and the last line defines a class C and
simultaneously declares a const object of that type named c. This is legal
code, albeit uncommon. In C++/CLI, the macro has no effect on the class
declaration because "ref class" is a single token (whereas the macro is
looking for the token "ref" alone, not "ref class") and so the last line
defines a ref class C and simultaneously declares a (non-const) object of
that type named c.

Here's the simplest workaround:

// Workaround 2: this has the same meaning in both
#define REF const
REF class C { } c;

But hey, macro names are supposed to be uppercase anyway. :-)

I hope these cases are somewhere between obscure and pathological. At any
rate, macros with short and common names are generally unusual in the wild
because they just break so much stuff. I would rate example 1 above as
fairly obscure (although windows.h has exactly that line in it, alas) and
example 2 as probably outright pathological (as I would rate all macros
with short and common names).

Whew. That's all for tonight.

Herb

Convener, ISO WG21 (C++ standards committee) (www.gotw.ca/iso)
Contributing editor, C/C++ Users Journal (www.gotw.ca/cuj)
Visual C++ architect, Microsoft (www.gotw.ca/microsoft)

[ See http://www.gotw.ca/resources/clcm.htm for info about ]

Herb Sutter

Nov 23, 2003, 10:28:36 AM
On 22 Nov 2003 18:29:34 -0500, Gabriel Dos Reis

<g...@integrable-solutions.net> wrote:
>The end result is really that
>of having an ISO standard that was not really openly conducted. I
>know you said that anyone can join the ECMA process. But I think that
>is an abstract statement. One really has to face this issue from a
>practical point of view. Given the schedule and the fees, there is
>little hope for individuals (like me) to have resources (time and
>money) to invest in that process. I believe that is a real concern
>for many individuals involved in the ISO C++ effort.
>That is a point that, I believe, should not be overlooked.

Fair enough. BTW, I think that now that you're at Texas A&M, you can join
for free (and the first meeting a week and a half from now is at your
location).

A fundamental point about the TG5 schedule: It is designed to put us where
we can influence the CLI standard, which is progressing on its fast
schedule with or without us. Let me elaborate on that a little now...


And on 22 Nov 2003 06:02:06 -0500, Gabriel Dos Reis


<g...@integrable-solutions.net> wrote:
>schedule is set so that individuals like me have no *practical* chance
>to follow and contribute to that standardization. Such people would
>need to work fulltime and have huge budget allocated for that job.

>That impression might not be right but it is what I got after following all
>the technical sessions and presentations. I also have the unpleasant
>feeling that it is a way of getting an ISO binding standard to C++,
>without playing an open game as WG21.

I should probably repeat here my own personal chain of reasoning that I
expressed to someone else in email a week ago. To me, the questions boil
down to these three:


Q1. Should the language redesign be done at all?

A1. (My answer) Clearly yes. The current MC++ design has known flaws, is
ugly and difficult to use, is known to have already impeded the use of C++
and to have been a contributing factor in why some programmers have chosen
other languages, and thus it marginalizes C++ on an important platform. For
a recent example, here's one from two weeks ago on InfoWorld:

"...Visual C++, once Microsoft’s crown jewel, was stuffed into the
trunk when .Net took the wheel. And that is at the heart of it.

"I never bought into the Java argument, later the C# argument,
that C++ is so arcane and dangerous that it has to be replaced.
If Visual Studio .Net were your first brush with C++, you would
leap into Sun’s arms and say, “By golly, you’re right -- C++ is
a horrible language!” ..."

http://www.infoworld.com/article/03/11/07/44OPcurve_1.html

He was talking also about the tools, but language is key to this too. I
had to come to realize that there was nothing I could do at Microsoft to
promote the cause of C++ more than to pour everything I could into the
effort that was already underway to fix the programming experience in C++
on .NET, and so that's what I'm doing.


Q2. If so, then should the design be an open standard or yet another set
of proprietary extensions exclusively controlled by one company?

A2. (My answer) Clearly open standards are better. Clearly interoperable
implementations are better.

To rehash (er, sorry, but pun intended) an incredibly tiny example that
had far more effect than it was worth, I refer again to the problems ISO
C++ has encountered with multiple and nonstandardized hash_map/hash_set
implementations -- exactly because they were incompatible with each other,
the standard couldn't use those names because any semantics we assigned
would break someone. In the end we felt forced to adopt uglier and less
intuitive names. At least if those container extensions had been written
consistently across vendors (as they would have been had there been some
standard for them), then we would have had the choice of adopting the name
with the standardized semantics, or of choosing a different name with
different semantics. As it was, we had only the choice of using a
different name.


Q3. If so, then where should the standard be done, in ISO or in ECMA?

A3. (My answer) ECMA TC39, where CLI is currently being revised. We don't
need to influence the ISO C++ standard -- it's fixed, and we are following
it slavishly. We do, however, need to track and influence the CLI
standard, otherwise it will certainly have small but painful gratuitous
incompatibilities that needlessly hurt C++ on that platform. (This has
already been demonstrated, and fixed, in several small areas.) And CLI is
being changed in ECMA TC39 on a fast schedule that we are matching (and
not initiating ourselves).

If we had tried to do the work in ISO (whether in WG21 or elsewhere), not
only would we not be close to TG3 and unable to influence CLI favorably to
C++ (aside: much more distant than C++ was from C and yet look at the
small but painful things in C99 that might have been tweaked ever so
lightly to be more compatible with C++), but we'd have missed the boat
completely: At ISO's fastest possible speed we would get approval to begin
the process at about the same time that the CLI standard will be completed
and frozen. Sigh.

Note that the question of where and at what speed we might choose to do
the work in a perfect world doesn't really even enter into the above: The
a priori fact is that the CLI work is already being done in ECMA on a fast
schedule targeting technical completeness in Sep 04 and for ECMA General
Assembly vote in Dec 04. None of us created or chose that situation;
that's where it is, and that can't be changed, at least not by any of us
in WG21. We can either go there to influence CLI, or not and just hope for
the best.


Those are the three fundamental decision points that are driving my
personal thinking, anyway. What do you think about this? It would help me
to understand your feelings on these questions, and which ones you would
answer differently and why.

Best wishes,

Herb

Convener, ISO WG21 (C++ standards committee) (www.gotw.ca/iso)
Contributing editor, C/C++ Users Journal (www.gotw.ca/cuj)
Visual C++ architect, Microsoft (www.gotw.ca/microsoft)


news user

Nov 23, 2003, 10:30:45 AM
In article <m38ym9t...@uniton.integrable-solutions.net>,
g...@integrable-solutions.net says...

>
> I believe we have enough gotchas to cope with current C++; I'm not
> convinced that we should multiply them by any factor greater than 1.
>

I believe current C++ is cryptic enough; I'm not convinced that
its obfuscation level should be multiplied by any factor greater
than 1 :-)

What are the options ?

1. Add still more obscure syntax in order to be standard
compliant. This is the worst choice from the user point
of view, as the price in readability and useless thinking
must be paid day in day out by everybody.

2. Add readable syntax, and break existing code. Highly
bothersome, but less than (1), as conversion is essentially
a one-time cost.

3. Add contextual syntax. Bothersome, as it makes the
compiler even more complex. However, the user could not
care less, until compiler slowness, bugs and language
gotchas increase to a dramatic level.

Therefore, (3) is a priori the preferred solution.

To reject it, it must be unambiguously shown that
its hidden costs for the user are higher than the very
obvious costs of (1) and (2).

--- Raoul

Hyman Rosen

Nov 23, 2003, 6:20:38 PM
Gabriel Dos Reis wrote:
> However, I do not see that as a sufficient
> guarantee that the ECMA CLI/C++ thingy would not be a "take over".

Standards processes are not unlike open source projects. If it is
perceived that the current standards effort is too slow or useless,
and there are people who are willing to do the work, then the
standard can be "forked" and taken over. This is as it should be.

Asking people who are doing good productive work to slow down because
others can't keep up is just a non-starter.

Francis Glassborow

Nov 23, 2003, 6:37:56 PM
In article <u7Wvb.5059$7%4....@nwrdny03.gnilink.net>, Hyman Rosen
<hyr...@mail.com> writes

>ECMA is designed for fast-tracking standards. Microsoft has come up
>with its CLI-to-C++ binding, and wants to submit it as such a standard.
>Unless there is something clearly wrong with the submitted proposal,
>it should be accepted.

I think that is a very dangerous position. Once something becomes a
Standard we are generally left with the consequences on an almost
permanent basis. There are quite a few warts in the C++ Standard that we
basically expect to live with into the foreseeable future because fixing
them would break existing code.

I want to feel that the final document is positively good for C++ rather
than just that there is nothing clearly wrong with it. I am sure that
Herb, Bjarne, PJP et al. feel the same way which gives me a little more
comfort.

>
>In any case, <http://www.ecma-international.org/news/ecma-TG5-PR.htm>
>states that this binding was developed by Microsoft, Edison Design Group,
>and Dinkumware. If that much C++ brainpower can't come up with something
>good, I don't see how anyone else will do better.

Not being able to come up with something good is not the same as
actually doing so. :-) The brainpower that went into C99 was very
considerable but there are many who feel that the result was flawed.
EDG's involvement in the base document is a good sign but that does not
lead me to pre-judge the end result because when all is said and done it
will be TG5 that produces the final document. Until I am convinced of
the quality of that end product I will be recommending that my NB vote
against fast-tracking it to an ISO Standard.

The best way to convince me and people like me is to ensure that we have
continued oversight of the work even though we may not have the time and
financial resources to attend TG5 meetings even as non-voting observers
(assuming that ECMA is sufficiently versatile to understand the
desirability of granting access to outside 'experts' as observers)


--
Francis Glassborow ACCU
If you are not using up-to-date virus protection you should not be reading
this. Viruses do not just hurt the infected but the whole community.

Francis Glassborow

Nov 23, 2003, 6:42:43 PM
In article <pv40svo541o3fcop3...@4ax.com>, Herb Sutter
<hsu...@gotw.ca> writes

>One implementation technique we are using is to specify some keywords that
>include embedded whitespace. These are safe: They can't possibly conflict
>with any user identifiers because no C++ program can create an identifier
>that contains whitespace characters. [I'll omit the obligatory reference
>to Bjarne's classic April Fool's joke article on the whitespace operator.
>:-) But what I'm saying here is true, not a joke.]
>
>Currently these are:
>
> for each
> enum class/struct
> interface class/struct
> ref class/struct
> value class/struct

I understand the idea but nonetheless I worry. Each of those things is
vulnerable to the preprocessor because it now has to understand
significant whitespace. Note that pre-processing happens prior to
tokenisation but here the preprocessor must recognise tokens that
include whitespace (or that is the burden of your example later on).
And note that this is white space and not just a single space.

Actually we could have done with this single token type rule to deal
with multi-token types. I personally think that all types should have a
unique spelling. E.g. long int should not be allowed to be spelt as int
long.

I would also be happier to see class/struct reduced to one or other of
the keywords without the other being an option.

--
Francis Glassborow ACCU
If you are not using up-to-date virus protection you should not be reading
this. Viruses do not just hurt the infected but the whole community.

Francis Glassborow

Nov 23, 2003, 6:43:13 PM
In article <ka00sv8gumlviqsom...@4ax.com>, Herb Sutter
<hsu...@gotw.ca> writes

>I think that the presence of CLI operators on ^'s actually argues more for
>pointerlike semantics than valuelike semantics, because unlike the usual
>C++ operators they nearly always result in binding to a newly created
>object. For example, for T^'s a and b, a += b results in adding the value
>in *a to the value in *b and putting the result in a new object, and a now
>refers to that. (Yes, whereas for the usual C++ operators we can implement
>+ in terms of +=, it's the other way around for the CLI operators, which
>takes getting used to.)

As a teacher that makes me very uncomfortable; such subtleties make a
language much harder for the novice to understand.


--
Francis Glassborow ACCU
If you are not using up-to-date virus protection you should not be reading
this. Viruses do not just hurt the infected but the whole community.

White Wolf

Nov 23, 2003, 7:26:17 PM
Hyman Rosen wrote:
> Francis Glassborow wrote:
> > Without wanting to be unpleasant I have to point out that Microsoft's
> > track record on issues of correct parsing of Standard C++ is far from
> > impeccable so I would, on the evidence, not wish to accept, sight
> > unseen, an implementation of an extended syntax.
>
> ECMA is designed for fast-tracking standards. Microsoft has come up
> with its CLI-to-C++ binding, and wants to submit it as such a
> standard. Unless there is something clearly wrong with the submitted
> proposal, it should be accepted.

I believe that it might not be obvious at first sight what is wrong with
it - that might be the point a few people are trying to make.

> In any case, <http://www.ecma-international.org/news/ecma-TG5-PR.htm>
> states that this binding was developed by Microsoft, Edison Design
> Group, and Dinkumware. If that much C++ brainpower can't come up with
> something good, I don't see how anyone else will do better.

While I believe that the listed companies and the individuals behind those
names are indeed impressive, I also believe that it is a rather strong
statement that no one else could do better or add anything. It is as if
you were ready to propose that no one but them attend the WG21
meetings. I doubt this is the case.

Attila

White Wolf

Nov 23, 2003, 7:27:01 PM
Gabriel Dos Reis wrote:
[SNIP]

> Programmer's intent. As C++ programmers and designers, we do have
> experience in that department and I would not say it is really that
> conclusive. It certainly is a fun and exciting (academic?) exercise
> in parsing, but I'm NOT convinced that I would push for it. I
> certainly do know that I would vote against it, if it were proposed
> for C++.

That is all nice but *why*? Could you give some real-life examples? You
and Francis are both very much against this, but I have not yet seen an
argument I could understand. It may all be my fault, but I would
appreciate seeing a "bad case", something simple showing the dangers.

--
WW aka Attila

White Wolf

Nov 23, 2003, 7:33:30 PM
Gabriel Dos Reis wrote:
[SNIP]
>> Well... Herb's presentation on the WG21 suggests otherwise, but of
>> course I am not good in politics... OTOH I cannot imagine Herb
>> playing politics. I just cannot.
>
> I was not suggesting Herb is playing politics. If I recall correctly,
> Herb is one of the *two* technical "advisors" (I don't recall the
> exact qualification from the top of my head) on that board. I have no
> clue of how all that works but I did not have the impression that all
> the cards are in Herb's hand. So, it would be, IMHO, totally
> misleading and insulting to characterize him as playing politics.

Hm, I did not mean to imply that you said so. I just have a hard time
imagining Herb doing anything like that - and an equally hard time imagining
he could be caught up in such a process without his knowledge.

> But, that does not remove the impression I got from the talks.

What exactly was your impression? Can you put it in simple words that I
can understand?

> As far as language design is concerned, not all decisions are purely
> technically based. Some facts may happen without support from the
> technical experts. That has happened often.

Oh. Do you mean around C++?

>> There is a chance of hijacking the future C++ here, but I do
>> not think that will happen with Herb in both committees. I know
>> this is not a proof, but I would have to rebuild my whole world if
>> it would happen...
>
> Again, I refuse to focus on individuals/persons.

I did not ask it. :-) I was just saying what it seems you do not agree
with - I had the feeling that Herb (and Bjarne for that matter) being there
ensures that things go as well as possible. I sense from your post that you
fear it might not be good enough...

> Bjarne Stroustrup
> is also listed as being on the technical board, and I do know he cares
> about C++ as much as Herb. However, I do not see that as a sufficient
> guarantee that the ECMA CLI/C++ thingy would not be a "take over".
> As I said, I'm NOT expecting "take over" from those people. They are
> not known to play politics. But, without intent of being insulting,
> their presence on the technical board is no sufficient guarantee --
> IMO -- that I should not worry about what is happening.

Here we agree; I am also worried. I read Herb saying that the
possible collision created by enum class was removed, just to read the next
day the same enum class in his posts as part of the current proposal. And I
am puzzled.

>> About the costs. As Herb told they push ECMA to provide an
>> affordable pricing for smaller companies. And that is - at least -
>> promising and it suggests - to me - that they really do want
>> everyone to be involved. OTOH
>
> Intent is one thing, what happens is another. As things are currently
> set, I see no practical evidence that the process is as open as WG21
> (I'm not saying WG21 process is perfect and should be a model, but at
> least, it is much more affordable).

I think the key issue here is that they have to be fast and they have to be
close to the CLI standardization. And I have got the idea (not from any
facts, it is just my paranoia) that it was not really a choice; MS did want
this in ECMA... And of course, since they have lost years with the "bad"
managed C++, it is understandable that they want to go fast. I guess Herb is
working 26-hour days nowadays to keep up with the process, and he might not
be alone.

This whole issue brings an old joke to my mind. Q: A 500-kilogram gorilla
walks into a bar. Where does he sit? A: Wherever he wants to. And to
everyone's surprise I do not mean Microsoft to be the gorilla; it is rather
necessity. MS is not big because of itself; it is big because there is an
industry driven by it - and it is driven by that industry as well.
Wherever there is an industry of a non-trivial size there is politics. And
whenever this industry starts to flow in a direction you either grab a
piece of driftwood and swim with the flood or you will be washed away.

IMO what is happening is Herb saving C++ in that flood, with an effort I
would call heroic. No brownnosing. The fact that Microsoft has "eliminated"
Java on the MS operating systems has created a vacuum. If MS does not plug
it, someone else will. So .NET came and it is big. Bigger than huge. As a
movement, I am not talking about technical quality. As far as I can see it
shows some real dedication that MS has brought the CLI and C# into an open
standardization process. I believe that it shows a very strong dedication
to this platform, however open this process is in reality. And as for
C++ in all of this... It has taken some very bad hits already. So it is
essential for the future of C++ (and obviously the C++ team within Microsoft)
to come up with a solution and come up with it fast. We can dislike the
process, but I think we owe it to ourselves to admit that it is the
situation driving the necessary process rather than the other way around. And
I also believe that if C++ weakens at Microsoft it will weaken C++ in general.

So as you have said, Gabriel, there are decisions in our business which are
not driven by technical facts, and the way C++/CLI is being standardized is
one of them. We can complain about it, but I think it will not change a
thing. There are forces bigger than you or me or Herb working here. We can
either work along the lines of those forces and bend them more toward the
ways we feel they should go - and do it with diplomacy - or sit and watch as
they bend our universe in ways we dislike. I strongly suggest that you, and
all of you here who are concerned about this process, get involved, because I
sincerely believe that there is no other way to influence it. Otherwise it
will be like watching your hat fly away in a strong wind and being run
over by a car...

[SNIP]


> I have no problem with a given commercial coorporate growing up its
> standard for its products. However, I'm concerned about an ISO
> standard process that, in many ways, affects/relates to a given
> ISO-standardized programming language that is not the property of any
> coorporate. If the ECMA CLI/C++ thingy is believed to be important to
> the C++ community and that community should care about it, then I
> think that process should be at least as open as the process that
> is in charge of standardizing and evolving C++.

I completely agree with you that in a perfect world it would go as you wish.
But there are facts of life here which do not allow it to go in that
direction. I have seen Herb addressing you in another post, suggesting that
you attend the next meeting. I humbly suggest you give it a chance. I
sincerely believe that your knowledge and insights are valued by all those
working in TG5. Give it a chance, go there and talk to them. It might be
that you and Herb can work out a way for you to participate. I believe it
is in Microsoft's best interest to sponsor any good minds who can be involved
in the process. And if you have alternative suggestions, why wouldn't they
listen to them? I think they all know the risk they are taking with this
fast standardization, and I do not think they like it any more than you do,
but, as with all of us, they have no choice.

>> to all those interested. I believe that even as far as sponsoring
>> some smaller companies to be able to delegate their experts.
>
> They might also be suspected of conflicts of interest. I really
> value a standard developed by independent people over a document
> sponsored mostly by a given corporation. C++ gets part of its
> credibility from the fact that it is not "funded" in any sense by a
> given company.

As CLI is a Microsoft creation, it is rather impossible for me to imagine it
otherwise.

>> I know I only wrote sort of political aspects above, but I think I
>> have said already all the techincal "ideas" before.
>
> (Un?)fortunately, language design and standardization are affected by
> non-purely technical issues.

Yes, I completely agree with you on that. But I believe that this is
something we cannot change. And the sooner we admit it, the sooner we will be
able to find the degrees of freedom where there is a chance to affect the
process.

--
WW aka Attila

Gabriel Dos Reis

Nov 24, 2003, 6:07:26 AM
Hyman Rosen <hyr...@mail.com> writes:

| Gabriel Dos Reis wrote:
| > However, I do not see that as a sufficient
| > guarantee that the ECMA CLI/C++ thingy would not be a "take over".
|
| Standards processes are not unlike open source projects.

But then useful programming language standards are not like software
implemented and marketed in two months.

[...]

| Asking people who are doing good productive work to slow down because
| others can't keep up is just a non-starter.

I would suggest you re-read my message.

I'm not asking anyone "doing good productive work to slow down because
of xxx".

--
Gabriel Dos Reis
g...@integrable-solutions.net


Gabriel Dos Reis

Nov 24, 2003, 6:09:25 AM
"White Wolf" <wo...@freemail.hu> writes:

| Gabriel Dos Reis wrote:
| [SNIP]
| > Programmer's intent. As C++ programmers and designers, we do have
| > experience in that department and I would not say it is really that
| > conclusive. It certainly is a fun and exciting (academic?) exercise
| > in parsing, but I'm NOT convinced that I would push for it. I
| > certainly do know that I would vote against it, if it were proposed
| > for C++.
|
| That is all nice but *why*? Could you give some real-life examples? You
| and Francis are both very much against this, but I have not yet seen an
| argument I could understand. It may all be my fault, but I would
| appreciate seeing a "bad case", something simple showing the dangers.

Of course, C++ does not have contextual keywords. But it has experience
with the idea that whatever the parser comes up with matches
the programmer's intent. I'll throw in two basic examples.

The first was mentioned by Francis in this thread. Consider

#include <vector>
#include <iostream>
#include <iterator>

using namespace std;

int main()
{
    vector<int> v(istream_iterator<int>(cin),
                  istream_iterator<int>());
    // Gotcha: this does not define a vector at all. It declares a
    // function named v taking two parameters (the "most vexing parse").
}

It is fun to make a puzzle about it. It is less fun for Joe
Programmer who has been bitten by that.

The second concerns the explicit keyword. Why did we need it?
(Certainly I'm not against implicit conversions; they are very useful.
But I can't just say that whatever the compiler comes up with always
matches my intent.)

--
Gabriel Dos Reis
g...@integrable-solutions.net


Hyman Rosen

Nov 24, 2003, 6:10:42 AM
Francis Glassborow wrote:
> I want to feel that the final document is positively good for C++ rather
> than just that there is nothing clearly wrong with it.

But that is not its first priority. The first priority is getting CLI and
.NET integrated well into C++. It's likely that a lot of the "is it good
for C++" contingent would wind up just being obstructionist.

> Until I am convinced of the quality of that end product I will be
> recommending that my NB vote against fast-tracking it to an ISO Standard.

Which fortunately will be irrelevant. This is basically a one-horse show.
It's Microsoft's CLI, and Microsoft's desire to integrate it with C++.
Having these things as ECMA standards means that other people can follow
along, and perhaps contribute ideas, but in the final analysis, it's not
really going to matter whether various national bodies go along.

> The best way to convince me and people like me is to ensure that we have
> continued oversight of the work even though we may not have the time and
> financial resources to attend TG5 meeting even as non-voting observers
> (assuming that ECMA is sufficiently versatile to understand the
> desirability of granting access to outside 'experts' as observers)

That Microsoft is laboring at all to convince you is just niceness on
their part, plus trying to get in a not-so-subtle jab at Sun and Java.
But in the long run, they don't care whether you go along.

You only need to look at the Fortran standardization process to see what
things look like when they run amok.

Attila Feher

Nov 24, 2003, 6:20:06 AM
Francis Glassborow wrote:
> In article <ka00sv8gumlviqsom...@4ax.com>, Herb Sutter
> <hsu...@gotw.ca> writes
>> I think that the presence of CLI operators on ^'s actually argues
>> more for pointerlike semantics than valuelike semantics, because
>> unlike the usual C++ operators they nearly always result in binding
>> to a newly created object. For example, for T^'s a and b, a += b
>> results in adding the value in *a to the value in *b and putting the
>> result in a new object, and a now refers to that. (Yes, whereas for
>> the usual C++ operators we can implement + in terms of +=, it's the
>> other way around for the CLI operators, which takes getting used to.)
>
> As a teacher that makes me very uncomfortable, such subtleties make a
> language much harder for the novice to understand.

Do you really need to teach this? I probably misunderstand but with those ^
things I feel that the identity (ehem: memory address, this pointer value)
of the actual object "pointed to" by it is rather unimportant. Should we
actually know this to be able to successfully use it as a beginner?

--
Attila aka WW

Gabriel Dos Reis

unread,
Nov 24, 2003, 7:36:27 AM11/24/03
to
Herb Sutter <hsu...@gotw.ca> writes:

| On 22 Nov 2003 18:29:34 -0500, Gabriel Dos Reis
| <g...@integrable-solutions.net> wrote:
| >The end result is really that
| >of having an ISO standard that was not really openly conducted. I

| >know you said that anyone can join the ECMA process. But I think that
| >is an abstract statement. One really has to face this issue from a
| >practical point of view. Given the schedule and the fees, there is
| >little hope for individuals (like me) to have resources (time and
| >money) to invest in that process. I believe that is a real concern
| >for many individuals involved in the ISO C++ effort.
| >That is a point that, I believe, should not be overlooked.
|

| Fair enough. BTW, I think that now that you're at Texas A&M, you can join
| for free (and the first meeting a week and a half from now is at your
| location).

The "like me" in my previous comment was meant to be *an* example of
"individuals". Certainly, it is a real great opportunity to get a
position at Texas A&M; but "Integrable Solutions" is still in France,
and its budget has not expanded. I'll continue to cover all fees
generated by working on C++ standardization with my own money.
Moreover, my position at Texas A&M does not allow for the fast
schedule set by the CLI/C++ binding. The Computer Science department
did not hire me to work on CLI/C++ standardization. So, my being at
Texas A&M does not address any of the points I made. Finally, I'm not
sure that other people who have shared the same concerns with me will
get similar opportunities.

Certainly, I'll appreciate having more in-depth discussions when the
meeting is held in College Station, but the points I made remain:
(1) the schedule as set is too compressed;
(2) fees are still too high for many individuals;
(3) the process is not as open as WG21's effort.

| A fundamental point about the TG5 schedule: It is designed to put us where
| we can influence the CLI standard, which is progressing on its fast
| schedule with or without us. Let me elaborate on that a little now...
|
|
| And on 22 Nov 2003 06:02:06 -0500, Gabriel Dos Reis
| <g...@integrable-solutions.net> wrote:

| >schedule is set so that individuals like me have no *practical* chance
| >to follow and contribute to that standardization. Such people would
| >need to work fulltime and have huge budget allocated for that job.

| >That impression might not be right but it is what I got after following all
| >the technical sessions and presentations. I also have the unpleasant
| >feeling that it is a way of getting an ISO binding standard to C++,
| >without playing an open game as WG21.
|
| I should probably repeat here my own personal chain of reasoning that I

| expressed to someone else in email a week ago. To me, the questions boil
| down to these three:
|
|
| Q1. Should the language redesign be done at all?
|

| A1. (My answer) Clearly yes. The current MC++ design has known flaws, is
| ugly and difficult to use, is known to have already impeded the use of C++
| and to have been a contributing factor why some programmers have chosen other
| languages, and thus it marginalizes C++ on an important platform. For a
| recent example, here's one from two weeks ago on Infoworld:
|

| "...Visual C++, once Microsoft's crown jewel, was stuffed into the
| trunk when .Net took the wheel. And that is at the heart of it.
|
| "I never bought into the Java argument, later the C# argument,
| that C++ is so arcane and dangerous that it has to be replaced.
| If Visual Studio .Net were your first brush with C++, you would
| leap into Sun's arms and say, 'By golly, you're right -- C++ is
| a horrible language!' ..."


|
| http://www.infoworld.com/article/03/11/07/44OPcurve_1.html
|
| He was talking also about the tools, but language is key to this too. I

| had to come to realize that there was nothing I could do at Microsoft to
| promote the cause of C++ more than to pour everything I could into the
| effort that was already underway to fix the programming experience in C++
| on .NET, and so that's what I'm doing.

I understand that. As I said in another message, I have no problem
with a corporation growing its own standards for its own products.
What worries me is the way the to-be-proposed ISO standard is
being conducted. I believe that, if it were more open, at least as
open as the WG21 effort, then:
(1) it will clear out most of the concerns being discussed in this
    thread, and free time to focus on other important technical aspects;

(2) it'll have higher potential to allow more people who care about
C++ and CLI to come and work on the standard; it would not be
perceived as "just another Microsoft-imposed standard". It
would be something that would get more inputs from more
    independent sources.

| Q2. If so, then should the design be an open standard or yet another set
| of proprietary extensions exclusively controlled by one company?
|

| A2. (My answer) Clearly open standards are better. Clearly interoperable
| implementations are better.

I completely agree that an open standard is preferable -- that is what
I have been saying.

| To rehash (er, sorry, but pun intended) an incredibly tiny example that

| had far more effect than it was worth, I refer again to the problems ISO
| C++ has encountered with multiple and nonstandardized hash_map/hash_set
| implementations -- exactly because they were incompatible with each other,
| the standard couldn't use those names because any semantics we assigned
| would break someone.

I do not agree with that report.

The main issue, as I understand it, the one that has blocked progress and
generated endless discussions for years, is that a given implementation
put its hash_map/hash_set extensions in the standard namespace
(a mistake many implementations made a long time ago, but have corrected
since then) but is under an obligation not to move that code out,
even now that it knows it was a mistake. Several proposals have been
made, among them hashmap/hashset, or just keeping the obvious names, ...

| In the end we felt forced to adopt uglier and less
| intuitive names.

Well, we have named a red-black tree "map" [argument heard at the
Oxford meeting :-)], we have an algorithm named remove that does not
remove anything, etc...

| At least if those container extensions had been written
| consistently across vendors (as they would have been had there been some
| standard for them), then we would have had the choice of adopting the name
| with the standardized semantics, or of choosing a different name with
| different semantics. As it was, we had only the choice of using a
| different name.

Well, C++98 has been continuously criticized as standardizing innovation
instead of standardizing existing practice. I've even read those
arguments from people who put hash_map in std:: -- and at the time,
everybody was doing their own extension -- in this newsgroup.
And now, you're using that same unfortunate existing practice to argue
that the schedule has to be fast -- which I understand as a nice
euphemism for "we need to standardize innovation".

I guess something is odd somewhere.

If CLI is supposed to be a framework that supports C++, it should not
be developed in such a closed way. If it is that way, then it says
something.

| Q3. If so, then where should the standard be done, in ISO or in ECMA?
|

| A3. (My answer) ECMA TC39, where CLI is currently being revised. We don't
| need to influence the ISO C++ standard -- it's fixed, and we are following
| it slavishly. We do, however, need to track and influence the CLI
| standard, otherwise it will certainly have small but painful gratuitous
| incompatibilities that needlessly hurt C++ on that platform. (This has
| already been demonstrated, and fixed, in several small areas.) And CLI is
| being changed in ECMA TC39 on a fast schedule that we are matching (and
| not initiating ourselves).
|

| If we had tried to do the work in ISO (whether in WG21 or elsewhere), not
| only would we not be close to TG3 and unable to influence CLI favorably to
| C++ (aside: much more distant than C++ was from C and yet look at the
| small but painful things in C99 that might have been tweaked ever so
| lightly to be more compatible with C++), but we'd have missed the boat
| completely: At ISO's fastest possible speed we would get approval to begin
| the process at about the same time that the CLI standard will be completed
| and frozen. Sigh.

I think at least two questions ought to be clearly answered:

(1) Is CLI just a Microsoft framework to grow its own products?
    If no, in what aspect and to what extent do ISO National Body
    members have to care about it?

(2) Is CLI intended to support at least C++ (among the zoo of
    programming languages out there)?

In particular, if the answer to question (2) is "yes", then I would
say that setting the CLI/C++ binding process fast, expensive to track
and less open is odd and reflects a deep problem with the CLI
process. That in turn would lead me into thinking that the answer to
the first part of question (1) is "yes". That answer in itself might
be fine. But, it needs to be clearly stated.

| Note that the question of where and at what speed we might choose to do
| the work in a perfect world doesn't really even enter into the above: The

The concerns I expressed are NOT stated with the assumption that we're
going to conduct any action in a perfect world. In a perfect world, we
would not be talking about CLI/C++ :-)

| a priori fact is that the CLI work is already being done in ECMA on a fast
| schedule targeting technical completeness in Sep 04 and for ECMA General
| Assembly vote in Dec 04. None of us created or chose that situation;
| that's where it is, and that can't be changed, at least not by any of us
| in WG21. We can either go there to influence CLI, or not and just hope for
| the best.
|
|
| Those are the three fundamental decision points that are driving my

| personal thinking, anyway. What do you think about this? It would help me
| to understand your feelings on these questions, and which ones you would
| answer differently and why.

Surely, I was exposed only to a limited set of determining factors,
wasn't I? :-) In particular, since I'm not in your position at
Microsoft there are probably other key factors I do not have knowledge
of ;-p. Anyway, above you see what I feel about it. You may or may
not have to take it into account.
I believe I had to express those feelings I got and those shared by many
other individuals. The unfavorable vote (at least 4 countries
against, J16 majority against) on the motion of having a liaison with
the CLI/C++ thingy conveys something, I think.

--
Gabriel Dos Reis
g...@integrable-solutions.net

[ See http://www.gotw.ca/resources/clcm.htm for info about ]

Attila Feher

unread,
Nov 24, 2003, 7:49:12 AM11/24/03
to
Alexander Terekhov wrote:
> Peter Dimov wrote:
> [...]
>> On the other hand, I'm probably missing something, and I need to add
>> that I'm fascinated by watching a new language evolve. :-)
>
> Except that MS should better "evolve" C#, not C++ (calling it
> "binding"... what a joke).

Call it a joke or not, it is a serious one. I mean it has to be taken
seriously.

> I mean that it will probably be much
> more productive if MS would roll carefully selected C++ stuff
> into C#.

Please let me ask: for whom would this be good? For MS: certainly, it is
much easier for them to change/control that language. For the C# people:
yep, it would be good because they would have a stronger language. But in
what way would it be good for "us"?

--
Attila aka WW

Francis Glassborow

unread,
Nov 24, 2003, 2:13:27 PM11/24/03
to
In article <bpsbuk$nns$1...@newstree.wise.edt.ericsson.se>, Attila Feher
<attila...@lmf.ericsson.se> writes

>Francis Glassborow wrote:
> > In article <ka00sv8gumlviqsom...@4ax.com>, Herb Sutter

> > <hsu...@gotw.ca> writes
>> > I think that the presence of CLI operators on ^'s actually argues
>> > more for pointerlike semantics than valuelike semantics, because
>> > unlike the usual C++ operators they nearly always result in binding

>> > to a newly created object. For example, for T^'s a and b, a += b
>> > results in adding the value in *a to the value in *b and putting
>> > the result in a new object, and a now refers to that. (Yes, whereas

>> > for the usual C++ operators we can implement + in terms of +=, it's

>> > the other way around for the CLI operators, which takes getting
>> > used to.)
> >
> > As a teacher that makes me very uncomfortable, such subtleties make
> > a language much harder for the novice to understand.
>
>Do you really need to teach this? I probably misunderstand but with
>those ^ things I feel that the identity (ehem: memory address, this
>pointer value) of the actual object "pointed" by it is rather
>unimportant. Should we actually know this to be able to successfully
>use it as a beginner?

As you know, I have just written a book introducing programming to
newcomers and used C++ as the introductory language without once
explicitly using a pointer. However when I write my book introducing C++
to the just past novice programmer (i.e. someone who can do some
programming in some language) I will certainly need to cover pointers
and at some stage operator overloading. Having to explain that += is a
more primitive operation than + for value based semantics but that it is
the other way round when dealing with T^'s is not something I look
forward to. Note that as it is very likely that new users will want to
use C++ on an MS platform I will have to give very careful thought to
whether I write about pure C++ or C++ in a CLI environment. Actually as
an author I should feel happy with that because it provides an
opportunity for two books instead of one. However as a course presenter
I am much less happy because most customers will not want a five day
course extended to seven in order to cover CLI issues yet those using MS
platforms will not want CLI issues to be ignored.


--
Francis Glassborow ACCU
If you are not using up-to-date virus protection you should not be
reading this. Viruses do not just hurt the infected but the whole
community.

Nicola Musatti

unread,
Nov 24, 2003, 2:14:49 PM11/24/03
to
Herb Sutter <hsu...@gotw.ca> wrote in message
news:<entsrvom7gsvc0jar...@4ax.com>...
> On 18 Nov 2003 14:58:55 -0500, Nicola....@ObjectWay.it (Nicola
> Musatti) wrote:
> >Properties are a topic of general interest; a better effort should
> >be made to have them included in the next version of the standard?
> >What happened to the Borland proposal?
>
> Meta-comment: In standards, nothing happens except when someone
> decides to invest the expertise, time, and energy to write and
> champion a proposal. I think the ISO C++ committee would probably be
> favorable to looking at properties -- and threads, for that matter.
> But that can only happen if someone steps up to do the work, and no
> one currently involved in the committee has so far demonstrated the
> bandwidth and drive to work on these areas. Both properties and
> threads have been presented at committee meetings, but then their
> authors for whatever reason did not follow through and pursue them.

I'm well aware of that. Actually, I really dislike the idea of
introducing a new syntax for properties, so you could consider my not
submitting a proposal an act of resistance :-) Still, they might be the
lesser of many evils, and I'm aware of the interest many have in their
availability. Even the metaphor they represent is sound: how many tools
do we use every day whose behaviour is influenced by setting a value on
some input device? Just think of your microwave oven's knobs...

Threads I care more about, but I really am an "in-expert" on the
subject.

> I think it's a shame that properties and threads aren't being actively

> worked on in WG21/J16, but like every other proposal someone has to be

> willing to invest the hard work to write and promote it and to
> convince busy committee members why it should be in the standard. I
> agree with Bjarne that properties should be in the language, not
> because they are a core language feature (they are not and they can be

> simulated) but because they are pervasive, and language support does
> give better ease of use than simulating the feature in a library
> (possibly with compiler magic-assisted types, where the compiler knows

> about special types and uses them as keys for code generation).

Even in this case there are many approaches that might be pursued, some
of which feel very natural (to me, at least). To give an example, I see
no reason why bound member function pointers (closures in Borland
parlance) are not part of the language: given

int f();

struct S {
    int i;
    int g();
};

S s;
int S::*pi = &S::i;
int (S::*pg)() = &S::g;

if s.*pi and f are valid expressions, why is s.*pg not valid? Once bound
pointers to member functions are available, it's just a question of
devising a way to pass "this", implicitly or explicitly, to a property
declaration.

> FWIW, my understanding is that Borland intends to participate in
> C++/CLI and we sure welcome their input on properties in particular.
> Note that their approach has some C++ syntax extensions too.

Their current approach, i.e. their C++ binding to the VCL library, does
have a few extensions, all duly doubly-underscored.

Cheers,
Nicola Musatti

Nicola Musatti

unread,
Nov 24, 2003, 2:16:38 PM11/24/03
to
Herb Sutter <hsu...@gotw.ca> wrote in message
news:<entsrvom7gsvc0jar...@4ax.com>...
> On 18 Nov 2003 14:58:55 -0500, Nicola....@ObjectWay.it (Nicola
> Musatti) wrote:
[...]
> >The hat symbol and gcnew could be replaced with a template like
> >syntax, e.g.
> >
> >cli::handle<R> r = cli::gcnew<R>();
>
> I agree that those are alternatives. Everyone, including me, first
> pushes hard for a library-only (or at least library-like) solution
> when they first start out on this problem. I think an argument can be
> made for it, and at one time I did so too.
>
> To me, a killer argument in favor of a new declarator with usage "R^"
> instead of a library-like "cli::handle<R>" is its pervasiveness: It
> will be by far the most widely used part of all these extensions, as
> it's the common use case the vast majority of the time for CLI types
> (as objects, as parameters, etc.). This extremely wide use amplifies
> two particular negative consequences we'd like to avoid: First, the
> long spelling (here
> "handle") could in practice effectively become a reserved word just
> because people are liable to widely apply "using" to avoid being
> forced to write the qualification every time (this is worse if the
> name chosen is a common name likely to be used for other identifiers
> or even macros, and "handle" is a very common name). Second, and
> worse, the long spelling would also make the language several times
> more verbose in a very common case than even the Managed Extensions
> syntax was, and that in turn was already verbose compared to other CLI
languages.
>
> Compare five alternatives side by side:
>
> cli::handle<R> r = cli::gcnew<R>(); // 1: alternative suggested above
> handle<R> r = gcnew<R>(); // 2: ditto, with "using"s
> R __gc* r = new R; // 3: original MC++ syntax
> R^ r = gcnew R; // 4: C++/CLI syntax
> R r = new R(); // 5: C#/Java syntax
>
> I think you could make a case for any one of these, depending on your
> tradeoffs. But I think a tradeoff that favors usability will favor the

> last few options.

Your reasoning is correct, but it would apply also to, say, boost's
shared_ptr, once it's accepted into the standard: now which is more
pervasive, the default smart pointer from the standard library or a
binding to a specific virtual machine?

Consider also tools that parse or generate code: for those, a library-like
solution that did not introduce new keywords would be far superior.

> There are also other issues where having ^ and % declarators/operators

> that roughly correspond to * and & enables a more elegant type
> calculus. I (or someone on the team) will have to write those up
> someday, but consider at some future time when we have full mixed
> types too: When we can have a type that inherits from both native and
> CLR base classes/interfaces, we will want to be able to pass a pointer

> to such an object to existing ISO C++ APIs that take a Base1* and a
> handle to the same object to existing CLI APIs that take a Base2^.

This argument is reasonable, but it would be far more compelling if you
were writing about technology agnostic language issues. However, we're
discussing about accessing a specific technology. When, in ten years or
so, Microsoft replaces .NET with .FISHING_ROD, we'll have to look for a
new, special syntax to deal with .FISHING_ROD's "hooks", as the hat
symbol will be already taken by .NET's handles.

Nicola Musatti

unread,
Nov 24, 2003, 2:17:00 PM11/24/03
to
Gabriel Dos Reis <g...@integrable-solutions.net> wrote in message
news:<m3vfpdt...@uniton.integrable-solutions.net>...
> Nicola....@ObjectWay.it (Nicola Musatti) writes:
>
> | The standard already provides a way to avoid conflicts when
> | introducing new keywords: prepend a double underscore.
>
> That is *a* way of solving that problem. Some like it. C did
> something similar. Personally, I don't like it.
>
> We have namespaces, i.e. facilities to manage identifiers. I would
> prefer we take advantage of them.

Are you talking about 1) introducing namespace specific keywords, or
2) not introducing new keywords at all? If 1) I sort of agree with you,
if 2) I agree with you a lot :-)

Gabriel Dos Reis

unread,
Nov 25, 2003, 5:58:39 AM11/25/03
to
news user <ne...@sisyphus.news.be.easynet.net> writes:

| In article <m38ym9t...@uniton.integrable-solutions.net>,
| g...@integrable-solutions.net says...
| >
| > I believe we have enough gotchas to cope with current C++; I'm not
| > convinced that we should multiply them by any factor greater than 1.
| >
|
| I believe current C++ is cryptic enough; I'm not convinced that
| its obfuscation level should be multiplied by any factor greater
| than 1 :-)

Sigh. :-)

[...]

| To reject it, it must be unambiguously shown that
| its hidden costs for the user are higher than the very
| obvious costs of (1) and (2).

First, we don't get only three choices.

Secondy, programming language design is different from software
development in many aspects. In particular, a coherent programming
language is rarely an accumulation of features nobody could object to
in a given short timeframe.

--
Gabriel Dos Reis
g...@integrable-solutions.net

[ See http://www.gotw.ca/resources/clcm.htm for info about ]

Matt Austern

unread,
Nov 25, 2003, 8:32:19 AM11/25/03
to
Gabriel Dos Reis <g...@integrable-solutions.net> writes:

> Surely, I was exposed only to a limited set of determining factors,
> wasn't I? :-) In particular, since I'm not in your position at
> Microsoft there are probably other key factors I do not have knowledge
> of ;-p. Anyway, above you see what I feel about it. You may or may
> not have to take it into account.
> I believe I had to express those feelings I got and those shared by many
> other individuals. The unfavorable vote (at least 4 countries
> against, J16 majority against) on the motion of having a liaison with
> the CLI/C++ thingy conveys something, I think.

It conveys something, but perhaps not as much as you might be
assuming.

I voted against having the liaison between WG21 and the CLI/C++
effort. This does not mean that Apple opposes the CLI/C++ effort, or
that I oppose it personally. Much simpler than that: my vote was
strictly for practical reasons.

I believe that such a liaison would be pointless regardless of one's
opinions about the merits of the ECMA process. Given the timing of
the ECMA meeting, I don't believe there could have been any useful way
in which WG21 could have given direction to the proposed liaison
group; for all practical purposes it would have been on its own. A
liaison group on its own, that doesn't represent anyone in any real
sense, didn't strike me as any more useful than no liaison group. I'd
rather not create new administrative structures that will do nothing
but make things more complicated.

If all we're concerned with is that WG21 be kept informed at the
Sydney or Redmond meeting, then that's guaranteed anyway because of
the overlap between the two groups.

Gabriel Dos Reis

unread,
Nov 25, 2003, 8:34:15 AM11/25/03
to
Nicola....@ObjectWay.it (Nicola Musatti) writes:

| Gabriel Dos Reis <g...@integrable-solutions.net> wrote in message
| news:<m3vfpdt...@uniton.integrable-solutions.net>...
| > Nicola....@ObjectWay.it (Nicola Musatti) writes:
| >
| > | The standard already provides a way to avoid conflicts when
| > | introducing new keywords: prepend a double underscore.
| >
| > That is *a* way of solving that problem. Some like it. C did
| > something similar. Personally, I don't like it.
| >
| > We have namespaces, i.e. facilities to manage identifiers. I would
| > prefer we take advantage of them.
|
| Are you talking about 1) introducing namespace specific keywords, or
| 2) not introducing new keywords at all? If 1) I sort of agree with you,
| if 2) I agree with you a lot :-)

:-)

Well, I'm suggesting to introduce a namespace that contains (past,
present and future) keywords. E.g.,

using std::lambda;
transform(v.begin(), v.end(), lambda(int x) { x * x; });

vs.

transform(v.begin(), v.end(), std::lambda(int x) { x * x; });

--
Gabriel Dos Reis
g...@integrable-solutions.net

[ See http://www.gotw.ca/resources/clcm.htm for info about ]

Gabriel Dos Reis

unread,
Nov 25, 2003, 8:37:00 AM11/25/03
to
"White Wolf" <wo...@freemail.hu> writes:

[...]

| > I have no problem with a given commercial coorporate growing up its
| > standard for its products. However, I'm concerned about an ISO
| > standard process that, in many ways, affects/relates to a given
| > ISO-standardized programming language that is not the property of any
| > coorporate. If the ECMA CLI/C++ thingy is believed to be important to
| > the C++ community and that community should care about it, then I
| > think that process should be at least as open as the process that
| > is in charge of standardizing and evolving C++.
|
| I completely agree with you that in a perfect world it would go as you wish.
| But there are facts of life here which do not allow it to go in that
| direction. I have seen Herb addressing you in another post, suggesting that
| you attend the next meeting. I humbly suggest you give it a chance.

Sure, but on what basis? Herb's comment was implicitly recalling that
I'm employed at Texas A&M University to do research work and that
the TG5 committee does not require fees from students and researchers.
However, Texas A&M did not recruit me to work on the CLI/C++ binding.
So, that option is no option. If it were an ISO (or WG21-like)
process, I could -- as a member of the French national body -- ask for
days off and go there to represent myself or AFNOR, at low cost. I
don't think I'm alone in such a configuration.

Next, it is not just a matter of going there. A key issue is what weight is
given to non-Microsoft people's input. For example, are
Bjarne Stroustrup's technical inputs just advice that Microsoft could
safely ignore, or what? If non-Microsoft input is merely
"advisory", then I see no reason why I would encourage my home country's
national body to approve that standard. The ISO process might not be
perfect, but it makes the standardization effort much more open.

Again, a key question is: Why does Microsoft need an ISO standard for
CLI/C++? Why should ISO national body members care?
I think clear answers to those questions may be helpful.

--
Gabriel Dos Reis
g...@integrable-solutions.net

[ See http://www.gotw.ca/resources/clcm.htm for info about ]

Alexander Terekhov

unread,
Nov 25, 2003, 8:37:37 AM11/25/03
to

Attila Feher wrote:
[...]

> > I mean that it will probably be much
> > more productive if MS would roll carefully selected C++ stuff
> > into C#.
>
> Please let me ask it: to whom would it be good? To MS: certainly, it is
> much easier for them to change/control that language. For the C# people:
> yep, it would be good because they would have a stronger language. But in
> what way would it be good for "us"?

It wouldn't affect "us" (to the extent that the ECMA/ISO "binding"
certainly will affect the C++ "future"). And that is good. I'd
have no problems with an MS-sponsored proposal to add {optional}
garbage collection to standard C++, or properties, or threads
(that one will be fun... threads from MS ;-) ). You got the idea.

regards,
alexander.

Herb Sutter

unread,
Nov 25, 2003, 8:41:00 AM11/25/03
to
On 23 Nov 2003 18:42:43 -0500, Francis Glassborow

<fra...@robinton.demon.co.uk> wrote:
>I would also be happier to see class/struct reduced to one or other of
>the keywords without the other being an option.

I have no strong preference either way. I thought that retaining
class/struct as synonyms (except for default accessibility) was staying
closest to ISO C++ and what C++ programmers are likely to expect.

But in your other posts you said we should try hard to minimize the
departures from ISO C++ (and I agree). Doesn't allowing both "class" and
"struct" do that? Feedback appreciated.

Herb

Convener, ISO WG21 (C++ standards committee) (www.gotw.ca/iso)
Contributing editor, C/C++ Users Journal (www.gotw.ca/cuj)
Visual C++ architect, Microsoft (www.gotw.ca/microsoft)

[ See http://www.gotw.ca/resources/clcm.htm for info about ]

Herb Sutter

unread,
Nov 25, 2003, 8:42:20 AM11/25/03
to
On 23 Nov 2003 18:43:13 -0500, Francis Glassborow
<fra...@robinton.demon.co.uk> wrote:

>In article <ka00sv8gumlviqsom...@4ax.com>, Herb Sutter
><hsu...@gotw.ca> writes
>>I think that the presence of CLI operators on ^'s actually argues more for
>>pointerlike semantics than valuelike semantics, because unlike the usual
>>C++ operators they nearly always result in binding to a newly created
>>object. For example, for T^'s a and b, a += b results in adding the value
>>in *a to the value in *b and putting the result in a new object, and a now
>>refers to that. (Yes, whereas for the usual C++ operators we can implement
>>+ in terms of +=, it's the other way around for the CLI operators, which
>>takes getting used to.)
>
>As a teacher that makes me very uncomfortable, such subtleties make a
>language much harder for the novice to understand.

Right, and this demonstrates the cost of not having had a C++ group in a
position where they can influence CLI to get rid of needless
incompatibilities with C++.

I agree with your comment. Yet those are the semantics of the CLI
operators, which we can't change. Given that CLI operators do behave this
way, what choice is there? (Short of just not supporting CLI operators at
all, which is what MC++ partly did, and one of the known serious
weaknesses that needed to be addressed.)

The broader issue: This is a perfect example of why it's important for a
C++ standards group to finally be near the CLI group (co-located, with
some shared membership), in a position to influence them to avoid small
differences that needlessly make life difficult for C++ programmers and
implementers. Not that every decision, or even this particular one, would
necessarily have been different, but I know some might have been, and we
already have a little list of things we'd like CLI to do differently
Right Now in their current effort.

Herb

Convener, ISO WG21 (C++ standards committee) (www.gotw.ca/iso)
Contributing editor, C/C++ Users Journal (www.gotw.ca/cuj)
Visual C++ architect, Microsoft (www.gotw.ca/microsoft)


Herb Sutter

unread,
Nov 25, 2003, 8:45:43 AM11/25/03
to
On 24 Nov 2003 07:36:27 -0500, Gabriel Dos Reis
<g...@integrable-solutions.net> wrote:

[in the context of hash_set/hash_map]


>The main issue, as I understand it, that has blocked progress and
>generated endless discussions for years is that a given implementation
>has put its extensions of hash_map/hash_set in the standard namespace
>(a mistake many implementations made a long time ago, but corrected
>since then) but is under obligation of not moving out those codes,
>even now that it knows it was a mistake. Several proposals have been
>made, among which, hashmap/hashset, just keep the obvious names, ...

[...]


>Well, C++98 has been continuously criticized as being innovation
>standardization instead of standardization of existing practice. I've
>even read those arguments from people who put hash_map in std:: -- and
>at the time, everybody was doing their own extension -- in this newsgroup.
>And now, you're using that same existing unfortunate practice to argue
>that the schedule has to be fast -- which I understand as a nice euphemism
>for "we need to standardize innovation".

I don't understand the last three lines.

First, C++/CLI is indeed standardizing existing practice, because we are
including only features that two commercial compilers will have actually
implemented (already or in the coming months before the binding standard
is finalized). We have already removed several features (e.g., mixed
types) for the specific reason that the designs look nice but they have
not yet been implemented, and won't be by the time the binding standard is
completed.

Second, the C++/CLI effort is going fast primarily because of the reason I
said in the lines you quoted a few lines later:

>| We do, however, need to track and influence the CLI
>| standard, otherwise it will certainly have small but painful gratuitous
>| incompatibilities that needlessly hurt C++ on that platform. (This has
>| already been demonstrated, and fixed, in several small areas.) And CLI is
>| being changed in ECMA TC39 on a fast schedule that we are matching (and
>| not initiating ourselves).

Gaby, I know you don't like the fast schedule, and I understand why. But
do you see that it's there because that's CLI's schedule? And is it worth
giving up the ability to influence CLI (which lack of influence I repeat
has already done damage to C++)?

I am sad that there is no answer here that will please everyone.


> (1) Is CLI just a Microsoft framework to grow its own products?
> If no, in what aspect and to what extent do ISO National Body
> members have to care about it?
>
> (2) Is CLI intended to support at least C++ (among the zoo of
> programming languages out there)?

I can't speak directly for CLI, but I can say that it was designed to be a
"Common" Language Infrastructure (hence the name) that popular languages,
including C++, could target well, and that is being implemented by other
companies. For the most part I've observed that it has consciously tended
to avoid including features that were language-specific and not native to
all languages (such as multiple inheritance), because if it did so then it
would be handicapping languages that did not already have those features.
But some arguably language-specific features did creep in, even though
many were kept out.

Having said that, I really do not speak for CLI nor do I have any direct
influence on it personally. TG5 will be able to have influence on it, and
I think that's a good thing for C++.


>| a priori fact is that the CLI work is already being done in ECMA on a fast
>| schedule targeting technical completeness in Sep 04 and for ECMA General
>| Assembly vote in Dec 04. None of us created or chose that situation;
>| that's where it is, and that can't be changed, at least not by any of us
>| in WG21. We can either go there to influence CLI, or not and just hope for
>| the best.

Herb

Convener, ISO WG21 (C++ standards committee) (www.gotw.ca/iso)
Contributing editor, C/C++ Users Journal (www.gotw.ca/cuj)
Visual C++ architect, Microsoft (www.gotw.ca/microsoft)


Herb Sutter

unread,
Nov 25, 2003, 8:47:40 AM11/25/03
to
Just to follow up to my own article, here's a correction about category 4,
which includes the keywords "delegate", "event", "initonly", "literal",
and "property":

There's usually no ambiguity in the case of property (or the other
keywords in this category); the only case I know of where you could write
legal C++/CLI code where one of these five keywords could be legally
interpreted both ways, both as the keyword and as an identifier, is when
the type has a global qualification. Here's an example courtesy of Mark
Hall:

initonly :: T t;

Is this a declaration of an initonly member t of type ::T (i.e.,
initonly ::T t;), or a declaration of a member t of type initonly::T
(i.e., initonly::T t;, which is legal ISO C++ if initonly is the name of
a namespace or class)? Our current thinking is to adopt the rule "if it
can be an identifier, it is," and so this case would mean the latter,
either always (even if there's no such type) or perhaps only if there is
such a type.

Attila Feher

unread,
Nov 25, 2003, 1:34:17 PM11/25/03
to
Herb Sutter wrote:
> I think that the presence of CLI operators on ^'s actually argues
> more for pointerlike semantics than valuelike semantics, because
> unlike the usual C++ operators they nearly always result in binding
> to a newly created object. For example, for T^'s a and b, a += b
> results in adding the value in *a to the value in *b and putting
> the result in a new object, and a now refers to that.
> (Yes, whereas for the usual C++ operators we can
> implement + in terms of +=, it's the other way around for the CLI
> operators, which takes getting used to.)
[SNIP]

Is this "anomaly" forced on us by the way the CLI works? Is there a
rationale for this?

You say that the result is a new object bound to the handle represented by
a. Will this result in slicing if a actually referred to a derived class
of the handle's base type?

--
Attila aka WW

Francis Glassborow

unread,
Nov 25, 2003, 1:51:57 PM11/25/03
to
In article <k5f4sv8a6e63cbf70...@4ax.com>, Herb Sutter
<hsu...@gotw.ca> writes

>On 23 Nov 2003 18:42:43 -0500, Francis Glassborow
><fra...@robinton.demon.co.uk> wrote:
> >I would also be happier to see class/struct reduced to one or other of
> >the keywords without the other being an option.
>
>I have no strong preference either way. I thought that retaining
>class/struct as synonyms (except for default accessibility) was staying
>closest to ISO C++ and what C++ programmers are likely to expect.

I wish they were not almost synonyms:-) However that is no reason to
perpetuate the situation. Given the current programming style (putting
public first even in a class definition) I personally would elect to use
struct as the second part of your context keyword. I agree that this is
personal but it would mean that wherever I see class in code it will be
a C++ class and wherever I see struct I will have to check why that
keyword was used. Apart from anything else it will assist in grepping
because I will not have to grep for the alternative.

>
>But in your other posts you said we should try hard to minimize the
>departures from ISO C++ (and I agree). Doesn't allowing both "class" and
>"struct" do that? Feedback appreciated.

Yes and no. You are proposing an extension to C++ and for me extensions
should be kept to the minimum and should be made maximally detectable in
any piece of code.


--
Francis Glassborow ACCU
If you are not using up-to-date virus protection you should not be reading
this. Viruses do not just hurt the infected but the whole community.
