http://channel9.msdn.com/posts/VisualStudio/Visual-CPP-10-10-is-the-new-6/
may be interesting...
Giovanni
It seems to work.
Why and how often does everyone else move to newer versions?
I am currently working through C# Step by Step.
At least half the changes seem like the automakers putting new fenders on
cars with the same old technology.
Except that all cars use the right pedal for the accelerator.
But every new language / compiler moves and renames the controls.
My 2 cents
"Giovanni Dicanio" <giovanniD...@REMOVEMEgmail.com> wrote in message
news:OWXbYMxR...@TK2MSFTNGP03.phx.gbl...
You are way ahead of the curve. Most haven't upgraded to it yet :-)
--
Ajay
I don't doubt that.
VC6 has a robust, stable IDE, is snappy, has the top-quality MFC ClassWizard,
and the help system does work.
> Why and how often does everyone else move to newer versions?
When you move to the newer versions of VC, you lose something but also gain
something else.
For example: I like the visualizers for debugging purposes (like STL
visualizers, shared_ptr visualizers, etc.). They are available in VS2005 and
2008, but I think it is not trivial to implement them in VC6 (probably you
would have to do some low-level hooking into the VC6 process and some black
magic to hook your code into the IDE, like WndTabs does...).
Moreover, both the compiler and the libraries improved a lot after VC6.
Since VC7.1 (a.k.a. VS.NET 2003) you can compile cross-platform C++ code
without problems, including the Boost libraries. And since VC7.1 you have a
good, working STL ready "out of the box" (whereas with VC6 you had to use
STLport, or apply the Dinkumware patches). And after VC6 you can use CString
also in non-MFC apps, because CString was refactored to be part of ATL.
And, the new (i.e. post-VC6) CString being a template class, you can have
both CStringA and CStringW in the same source files.
And there are other improvements, too.
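Just to show what I mean about the template CString, a quick sketch (it
assumes nothing beyond <atlstr.h>, so it also works outside MFC):

#include <atlstr.h>   // CStringA / CStringW without MFC (VS2003 and later)

int main()
{
    CStringA ansi("hello");       // char-based string
    CStringW wide(L"world");      // wchar_t-based string

    ansi += " there";
    wide.MakeUpper();

    // Both specializations of the same template coexist in one translation unit.
    return ansi.GetLength() + wide.GetLength();
}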
Giovanni
If not every new version, then what intelligent criteria do you use?
It takes a while to get used to a new way of doing things, especially when you
are working on code from an earlier Visual Studio.
"Giovanni Dicanio" <giovanniD...@REMOVEMEgmail.com> wrote in message
news:OFvMeeQ...@TK2MSFTNGP02.phx.gbl...
It's OK to skip versions because the benefit/cost ratio is too low... but
when you talk about using a 10-year-old IDE, which is no longer supported or
even available on MSDN, and have skipped VS 2002/2003/2005/2008, then...
it's time to upgrade, my friend.
I think VS 2005 is the minimum acceptable to be called a current Windows C++
developer.
-- David
> No doubt there are some improvements but how many people switch to every
> new version of the Visual Studio?
I have no idea about these numbers.
> If not every new version than what intelligent criteria do you use?
>
> It takes a while to get used to a new way of doing things, especially when
> are working on code from an earlier Visual Studio.
You may consider downloading the free Express edition of Visual C++ 2008
and trying it:
http://www.microsoft.com/express/vc/
With the Express Edition you don't have MFC available, and you can't use
quality add-ins like Visual Assist X, but you can get a general "look and
feel" of the new IDE.
And the C++ compiler in the *free* VS2008 Express Edition is better than the
C++ compiler in VC6.
Moreover, the upgrade is not always a steep curve. The steep curve is between
VC6 and VS2005, but the upgrade from VS2005 to VS2008 is very gentle.
Note that if you upgrade your code from VC6 to VS2005 (or 2008), the
compiler may give you some errors and will help identify some problems in
C++ code that VC6 tended to hide (we had a recent example of that about some
code using CStringArray::GetAt() converted from VC6 to VS200x, and probably
there was a bug in the VC6 version).
Giovanni
Tom
"Giovanni Dicanio" <giovanniD...@REMOVEMEgmail.com> wrote in message
news:OFvMeeQ...@TK2MSFTNGP02.phx.gbl...
The big advantage of using a newer version of Visual Studio than VC6 is
that the C++ compiler is much newer and much more compliant with the
ISO C++ language. That means that any third-party library code that you
may want to use in your projects is much more likely to compile.
C++ libraries are increasingly making use of advanced C++ features like
templates, and these were handled really quite badly by the old VC6
compiler.
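A one-line illustration of the kind of thing I mean: partial template
specialization, which VC6 rejects outright but VC7.1 and later accept (the
names below are invented for the example):

template <typename T, typename U>
struct IsSame { enum { value = 0 }; };

// Partial specialization: standard C++, but not supported by the VC6 compiler.
template <typename T>
struct IsSame<T, T> { enum { value = 1 }; };

int main()
{
    return IsSame<int, int>::value;   // 1 with a conformant compiler
}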
OTOH some 3rd-party libraries aren't yet fully supported under VS2008.
The "sweet spot" at present is probably VS2005.
The big disadvantage of switching to the newer tools is that the IDE --
and especially its wizard support for MFC -- will be unfamiliar. Many
of us doing native code C++ development feel that the newer IDEs are
inherently less productive than the VC6 IDE, the wizards are less
accessible/less functional, and that more keystrokes/mouse-clicks are
needed to perform common tasks. The "10 is the new 6" sound bite from
Microsoft is supposed to signal that they have addressed these concerns
in the new IDE (let's hope so!).
Cheers,
Daniel.
I think that alone will justify the upgrade for most. I personally
don't think MFC enhancements have amounted to much after VC6.
--
Ajay
I can't live without CStringA/W anymore. I got tired of manually creating a
manifest just to use Common Controls 6. I don't have to delete the .ncb
file nearly as much as with VC6. I like my XML color coded. When I edit
Javascript, I like that color coded also. I like the Server Explorer to
manage my databases (was that in VC6?). The missing Find/Replace in Entire
solution is irksome. Visual Assist X integrates much better than in VS6. I
got used to the Consolas font. I got used to editing true color icons (and
Icon Workshop Lite only works with VS2008).
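(On the manifest point, for anyone still on VC6: the usual alternative to a
hand-written manifest file is the linker pragma documented under "Enabling
Visual Styles"; quoted here from memory, so double-check it against MSDN.)

#pragma comment(linker, "\"/manifestdependency:type='win32' \
name='Microsoft.Windows.Common-Controls' version='6.0.0.0' \
processorArchitecture='*' publicKeyToken='6595b64144ccf1df' language='*'\"")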
-- David
Tom
"David Ching" <d...@remove-this.dcsoft.com> wrote in message
news:763177E9-94F6-410E...@microsoft.com...
Tom: like when you static link MFC with VS2008 SP1/Feature Pack and you get
a fat 4MB exe for a simple dialog-box? :)
I don't call that an improvement compared to VS2008 (without SP1) and
earlier.
One of the big advantages of "classic" MFC was the small overhead, which
seems to me to have gone away with "MFC 9".
I hope they will optimize and fix that.
G
> "Ajay Kalra" <ajay...@yahoo.com> wrote in message
...
>>I think that alone will justify the upgrade for most. I personally
>>dont think MFC enhancements have amounted to much after VC6.
> I can't live without CStringA/W anymore. I got tired of manually creating
> a manifest just to use Common Controls 6. I don't have to delete the .ncb
> file nearly as much as with VC6. I like my XML color coded. When I edit
> Javascript, I like that color coded also. I like the Server Explorer to
> manage my databases (was that in VC6?). The missing Find/Replace in
> Entire solution is irksome. Visual Assist X integrates much better than
> in VS6. I got used to the Consolas font. I got used to editing true
> color icons (and Icon Workshop Lite only works with VS2008).
I agree with all these good points by David; however, they are improvements
in the editor and general development experience, not improvements specific
to MFC (as Ajay wrote).
The only important improvements in MFC seem to me to be the refactoring of
useful classes like CString into templates on the character type, and the
better integration with ATL (in fact, the CString template is now part of
ATL).
Giovanni
> Tom: like when you static link MFC with VS2008 SP1/Feature Pack and you
> get a fat 4MB exe for a simple dialog-box? :)
BTW: I need to correct the number: it is 1.5 MB (not 4 MB); however, it is
still too big compared to the 300 KB that was the default with previous MFC.
However, it seems that Visual Studio 2010 will solve that (using "classic"
CWinApp instead of CWinAppEx as default behaviour):
https://connect.microsoft.com/VisualStudio/feedback/ViewFeedback.aspx?FeedbackID=369643
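To be clear about what "classic CWinApp" means here, a rough sketch (the
class and resource names are invented, and this is only the general shape of
a plain CWinApp dialog app that references none of the Feature Pack classes;
it is not the exact Connect workaround):

#include <afxwin.h>
#include "resource.h"            // assumed to define IDD_ABOUTBOX

class CTinyApp : public CWinApp  // note: CWinApp, not CWinAppEx
{
public:
    virtual BOOL InitInstance()
    {
        CWinApp::InitInstance();
        CDialog dlg(IDD_ABOUTBOX);   // any plain CDialog-based dialog
        m_pMainWnd = &dlg;
        dlg.DoModal();
        return FALSE;                // nothing else to run; exit the app
    }
};

CTinyApp theApp;                     // the one global application object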
Giovanni
Is it really that bad even if you enable function-level linking?
Also, I saw somewhere on the VS10 blogs that they've decided not to ship the
DLLs as SxS assemblies, or something to that effect. Is that really true?
Anthony Wieser
Wieser Software Ltd
Yes, but in order to get those general development improvements, you have no
choice but to switch VS versions, and take the (also improved, although only
slightly) MFC that goes with them. MS has not seen fit to let new VS versions
target downlevel MFC versions. Since developers choose the most productive
solution as a whole, you can't just say "MFC was not improved, so I am
sticking with VC6". Many developers, even here on this newsgroup, were so
sickened by the downfall of the wizards that they did not look further and
find the other improvements I mentioned, and so took longer than optimal to
switch.... at least that's how I see it. Perhaps it is a wrong view.
-- David
> Since developers choose the most productive solution as a whole, you can't
> just say "MFC was not improved, so I am sticking with VC6".
I do agree.
G
I don't know about "function-level linking". I tend not to use command-line
switches; I build within the IDE and use the default linker settings.
I think that a smart linker should link into the .exe only what is actually
required.
However, to fix that you can follow the "workaround" instruction in the
Connect post, or just don't install VS2008 SP1, or wait for VS2010.
The point is that the small-size behaviour should be the default one, just
working out of the box (as with VS2008-pre-SP1 and previous versions),
without requiring workarounds.
Giovanni
I never realized how much of this (and more) I now take for granted since I
moved to C#/.NET.
--
Ajay
That's called software evolution, and it is a feature.
--
Ajay
Correct. Core MFC hasn't changed and is not expected to change either.
I would call all these minor appendages to MFC, nothing drastic.
--
Ajay
> Thats called software evolution and is a feature.
It's a feature if you use it; i.e. if you use the new MFC UI (which is based
on BCGSoft components), then I agree that it is fine to "pay" in terms of a
bigger .EXE.
But if I just use "classic" MFC, I don't want unused overhead to be linked
with the .exe. In other words, I don't like paying for what I don't use,
IMHO.
Anyway, they wrote on Connect that this bug will be fixed in VS2010.
Giovanni
Tom
"Giovanni Dicanio" <giovanniD...@REMOVEMEgmail.com> wrote in message
news:%23nYNSZw...@TK2MSFTNGP04.phx.gbl...
That goes without saying. I was kidding.
> Anyway, they wrote on Connect that this bug will be fixed in VS2010.
I guess at least they see it as a bug.
--
Ajay
Tom
"David Ching" <d...@remove-this.dcsoft.com> wrote in message
news:3A60FEC8-C9EC-465F...@microsoft.com...
"Giovanni Dicanio" <giovanniD...@REMOVEMEgmail.com> wrote in message
news:eIWV6fyS...@TK2MSFTNGP05.phx.gbl...
Tom
"Giovanni Dicanio" <giovanniD...@REMOVEMEgmail.com> wrote in message
news:%23%23i3QwwS...@TK2MSFTNGP06.phx.gbl...
Overall I find CString and .NET's String pretty much on par (only the
difference between using == and .Equals() still confuses me).
Tom
"Ajay Kalra" <ajay...@yahoo.com> wrote in message
news:f1be5af8-f50f-4265...@3g2000yqs.googlegroups.com...
Tom
"Ajay Kalra" <ajay...@yahoo.com> wrote in message
news:19f01fa9-6662-4e32...@x38g2000yqj.googlegroups.com...
> How large is the same program statically linked for .NET :o)
I can't answer this question, because I think that .NET just does not have
static linking options :)
G
?? :)
I have never found the need to extend String in C#. Along with
StringBuilder, it does what one expects it to. In C++, I used STL's
basic_string etc. along with CString. That whole mess, along with the
ANSI/UNICODE stuff, is something you don't need to deal with in .NET/C#.
--
Ajay
Actually with a 3rd party linker like XenoCode PostBuild or RemoteSoft, you
can create 1 .exe with the relevant parts of .NET embedded inside it, so
that your .NET app runs without any installation at all. You can really
have XCopy deployment for .NET apps. My WinForms .NET 2.0 apps are about 25 MB.
-- David
And thats a good thing....
--
Ajay
I also like how .NET works with .RESX files for resources. This is
especially nice with ASP.NET when doing web pages.
I guess we should use the right tool for the right job.
Tom
"Ajay Kalra" <ajay...@yahoo.com> wrote in message
news:7053002f-dd00-452f...@3g2000yqs.googlegroups.com...
Tom
"David Ching" <d...@remove-this.dcsoft.com> wrote in message
news:8BAB1B3E-9B33-45EE...@microsoft.com...
> Actually with a 3rd party linker like XenoCode PostBuild or RemoteSoft,
> you can create 1 .exe with the relevant parts of .NET embedded inside it,
> so that your .NET app runs without any installation at all. You can
> really have XCopy for .NET apps.
Thanks for the info, David.
(I was completely unaware of these 3rd party tools...).
Giovanni
Why? Installing .NET is a royal pain and takes longer and fails more often
than anyone wants to admit. Once it's installed, it may need to be
re-installed after installing another .NET app somehow muddles up the
existing one. MS is making progress on .NET installs, but they are not
perfect. In fact, not too long ago, installing (I think it was) .NET 3.0 broke
app-local deployment of the VC runtime, and there is still no fix for that.
Unless your customers all have IT depts. to deal with this crap, if we can
spare our customers from installing .NET while still retaining the benefits
of producing .NET apps, it's a Good Thing.
-- David
Installing .NET is a one-time effort. Newer OSes come with it. It's the
legacy machines that need it. It will improve over time. Having said
that, I do recall MS doing something about splitting the runtime into
multiple modules.
> Installing .NET is a royal pain and takes longer and fails more often
> than anyone wants to admit.
Don't know. I have never heard this one. The most common complaint is that
it requires admin privileges.
> Once it's installed, it may need to be
> re-installed after installing another .NET app somehow muddles up the
> existing one.
How can that happen? We haven't seen this issue at all, and our users
are not that technically bright.
> MS is making progress on .NET installs, but they are not
> perfect. In fact, not too long ago, installing (I think it was) .NET 3.0 broke
> app-local deployment of the VC runtime, and there is still no fix for that.
The fix is that you should never use apps which require vc
runtime ;-)
--
Ajay
Yes, although it's not an Apples to Apples comparison. The 25 MB doesn't
include 3rd party ribbon libraries, so those would increase the size. OTOH,
the .NET base class library is so much richer than MFC, to get any real work
done with XML, Internet, multimedia, database, etc., your 1.5 MB would
probably increase also. Still the .NET one is much larger, no doubt.
-- David
Ahem... when .NET 4.0 comes out, everything becomes legacy and your users
are going to have to install it, regardless of what other .NET versions they
have.
> > Installing .NET is a royal pain and takes longer and fails more often
> > than anyone wants to admit.
>
> Dont know. I have never heard this one. Most common complaint is that
> it requires admin privileges.
There are horror stories that .NET 2.0 installed in 8 minutes on a clean VM,
but their customers' fragmented drives and registries caused it to take 23
minutes and more. Not to mention first having to update Windows Installer
and reboot in some cases.
> > Once it's installed, it may need to be
> > re-installed after installing another .NET app somehow muddles up the
> > existing one.
>
> How can that happen? We havent seen this issue at all and our users
> are not that technically bright.
My office mate is a CPA who spent 3 hours on the phone with Intuit because
QuickBooks (which uses .NET) got screwed up somehow. After uninstalling and
reinstalling various versions of .NET, finally the answer was to install
.NET 3.0. This of course was a huge download and install, but it did fix
the problem. And she is not alone.
> The fix is that you should never use apps which require vc
> runtime ;-)
The conspiracy side of me wonders if MS purposely does not fix that
incompatibility to send a message to developers to stop developing with VC!
;)
-- David
I wish it were true. They are just not that smart ;-0
--
Ajay
> I think it's fair so long as you can "not" have it if you want and just
> do a plain MFC (normal looking dialogs, etc.) application. I have an
> application that I use MFC in that is 413K and has everything statically
> linked, including 6 languages built into the EXE
In fact, that is the point. You have to pay for things that you don't use -
that was my criticism.
On the Connect site it was recognized as a bug and will be fixed in the next
VS. OK.
Giovanni
Actually, the size of the produced .exe and .dll is a major reason to stick
with VC6. Its minimum statically linked MFC size is about 150 KB instead of
400 KB.
That may be significant.
-- David
Tom
"David Ching" <d...@remove-this.dcsoft.com> wrote in message
news:29A4AFBD-7545-4901...@microsoft.com...
Interesting. 20 minutes is a *long* time to install an app that could
otherwise be XCopy-deployed. .NET 3.5 SP1 speeds things up quite a bit. I
typically get 6 minute installs on that. Unfortunately, the thing is a 200
MB download, and I don't feel like asking my clients to put up with that.
So I've stuck with .NET 2.0 even though it takes longer to install, and
startup time of my app is much longer also than for .NET 3.5 SP1.
-- David
> .NET 3.5 SP1 speeds up things quite a bit. I typically get 6 minute
> installs on that. Unfortunately, the thing is a 200 MB download, and I
> don't feel like asking my clients to put up with that. So I've stuck with
> .NET 2.0 even though it takes longer to install, and startup time of my
> app is much longer also than for .NET 3.5 SP1.
David: you might find interesting reading the following blog post:
"SmallestDotNet: On the Size of the .NET Framework"
http://www.hanselman.com/blog/SmallestDotNetOnTheSizeOfTheNETFramework.aspx
<quote>
[...]
There's been some confusion about the size of the .NET Framework. The .NET
Framework is not really a 200+ meg download.
</quote>
G
Thanks G. Well, I get that the bootstrapper is the preferred way to install
it, so it only installs what it needs, but that has the flaw that an Internet
connection must be available. Not a bad assumption, but one which is not
always valid; for example, one of my (other) clients has 20 PCs in a lab
environment with no Internet. So I tell them to download the 200 MB redist
(since that is the only one MS makes available that doesn't require an
Internet connection). What a joke.
BTW, if I am running an Athlon 64 quad-core processor and Windows Vista x64
Home Premium, do I need to install the x64 .NET or is the x86 one good
enough?
Further, one of the comments on that site said it took 54 minutes to install
.NET. Given stories like this, I don't see how any ISV can with a straight
face insist their customer use .NET unless their product is so compelling
people will put up with the pain of installing it. I don't know about
everyone else, but I don't have that good a product! ;)
I'm really seriously thinking about getting XenoCode.
-- David
Them's fightin' words! :-)
Seriously, my VC9 static link exe and dll are SMALLER than the dynamic link
VC6 exe. For my app (exe and dll plugin) (bytes, as reported by 'dir'):
                          exe          dll
VC6 (mfc-dll)       4,222,976    2,142,208
VC7.1 (dll)         3,809,280    2,035,712
VC8 (static)        3,776,512    1,884,160
VC9 (static)        4,014,592    1,907,712
VC9 (64bit/static)  6,271,488    2,833,408
Dave Connet
My "very small program" grew about 2K when I went to VS 2008, but that may
be because of some minor changes I made as well. When you are dealing with
< 500K programs the disk size doesn't always tell you much :o)
Tom
"David Ching" <d...@remove-this.dcsoft.com> wrote in message
news:%236eZCF2...@TK2MSFTNGP05.phx.gbl...
Maybe the promise of VC10 supporting other toolsets (older ones and
non-MS ones) will allow us to use the VC6 compiler and linker while also
taking advantage of its (to-be) improved IDE.
Dave
Then AMD would like to know where you got it ... they only manufacture
Athlon 64 CPUs with one and two cores (their 3- and 4-core desktop CPUs
are called "Phenom", and their server CPUs (which are available with 1,
2 or 4 cores) are called "Opteron").
.. but perhaps I'm bisecting leporids ...
> ... do I need to install the x64 .NET or is the x86 one good
> enough?
The x64 one will generally be faster ...
Cheers,
Daniel.
It's the Phenom one, but I thought I saw a label on the HP computer with
"Athlon 64" emblazoned on it. Oh well.
>> ... do I need to install the x64 .NET or is the x86 one good
>> enough?
>
> The x64 one will generally be faster ...
>
Thanks, I'll install that one.
-- David
Wow, I don't know what to say. I've imported many VC6 projects into VS2003
(then VS2005), and the .exe size increased dramatically with every VS
version.
-- David
I think it depends on which new optimizations were being used. My programs
have stayed around the same size, but I've often tweaked properties. I also
went to Unicode about 2003 which caused my programs to get a little larger,
but not so much as to make it not worth it. I don't know if EXE size is
always a good indicator of program size. I tend to compile for "speed"
rather than size most of the time.
Tom
"David Ching" <d...@remove-this.dcsoft.com> wrote in message
news:25606E09-0DA6-4CFF...@microsoft.com...
Well, one of the benefits of static linking is to reduce the download size,
so if that is important, then I still say VC6 is important to use. I worked
for a couple companies that absolutely wouldn't ship anything over 400 KB
and balked when the size approached 300 KB. (Dial up users are finicky.)
Believe me, we monitored the .exe size with every build and if it ever shot
up too much, we took a hard look at what changed and altered it if needed.
-- David
The reasons we upgrade are myriad, but the reasons we DON'T upgrade are simple: "Why?"
Microsoft simply does NOT understand the concept that stability is one of the most
important parameters. There is never a need to change the interface unless the change has
a DIRECT benefit to the user, and in the case of VS > 6, there has not been a single
noticeable improvement; rather, everything that used to work either fails, or works so
poorly as to be nearly unusable. But this is now considered "improvement".
This philosophy is now rampant with Microsoft; witness the "new improved" Office suite,
which was clearly designed by someone who never actually created documents for a living
(for example Word, once I select a certain menu bar, will REVERT to some other menu bar
after I do something, violating the most fundamental principles of good GUI design, which
is to obey what the user wants to do and not spontaneously change context on the user).
WinDbg, which no longer has "tile horizontally" but has new GUI features that make it
nearly unusable, is considered an "improvement".
I am beginning to wonder why I continue to use a lot of these products because they are
going out of their way to make gratuitous changes for the sake of making gratuitous
changes, with no attempt to actually IMPROVE the interfaces!
joe
On Mon, 17 Nov 2008 14:41:45 -0500, "Chris H" <humme...@royalmaster.com> wrote:
>I am still using the Old 6.
>
>It seems to work.
>
>Why and how often does everyone else move to newer versions?
>
>I am currently working through C# Step by Step.
>
>At least half the changes seem like the automakers putting new fenders on
>cars with the same old technology.
>
>Except that all cars use the right peddle for the accelerator.
>
>But every new language / compiler moves and renames the controls.
>
>
>
>My 2 cents
>
>
>
>"Giovanni Dicanio" <giovanniD...@REMOVEMEgmail.com> wrote in message
>news:OWXbYMxR...@TK2MSFTNGP03.phx.gbl...
>>I found this video on Channel 9:
>>
>> http://channel9.msdn.com/posts/VisualStudio/Visual-CPP-10-10-is-the-new-6/
>>
>> may be interesting...
>>
>> Giovanni
>>
>>
>
Joseph M. Newcomer [MVP]
email: newc...@flounder.com
Web: http://www.flounder.com
MVP Tips: http://www.flounder.com/mvp_tips.htm
>No doubt there are some improvements but how many people switch to every new
>version of the Visual Studio?
****
These days, as few as can possibly manage to be forced to
****
>
>
>
>If not every new version than what intelligent criteria do you use?
****
It has to present me an actual IMPROVEMENT in the experience. In my case, it was
CStringA/CStringW and STL that drove the change.
****
>
>
>
>It takes a while to get used to a new way of doing things, especially when
>are working on code from an earlier Visual Studio.
****
And the fundamental question is "Why?" The answer is "because some nitwit who never wrote
a line of code in his entire career had a "vision" of how YOU should program, and managed
to avoid any intelligent design review, and then systematically ignored user feedback for
years". You should not have to learn new ways of doing things when the old ways were more
than adequate and met all the needs.
joe
****
>
>
>
>
>
>"Giovanni Dicanio" <giovanniD...@REMOVEMEgmail.com> wrote in message
>news:OFvMeeQ...@TK2MSFTNGP02.phx.gbl...
>>
>> "Chris H" <humme...@royalmaster.com> ha scritto nel messaggio
>> news:4921c90a$0$4919$607e...@cv.net...
>>>I am still using the Old 6.
>>>
>>> It seems to work.
>>
>> I don't doubt that.
>> VC6 has a robust stable IDE, is snappy, and has top-quality MFC
>> ClassWizard, and the help system does work.
>>
>>> Why and how often does everyone else move to newer versions?
>>
>> When you move to the newer versions of VC, you lost something but also
>> gain something else.
>>
>> For example: I like the visualizers for debugging purposes (like STL
>> visualizers, shared_ptr visualizers, etc.). They are available in VS2005
>> and 2008, but I think it is not trivial to implement them in VC6 (probably
>> you should do some low-level hook in the VC6 process and do some
>> black-magic to hook your code in the IDE, like WndTabs does...).
>>
>> Moreover, both the compiler and the libraries improved a lot after VC6.
>> Since VC7.1 (a.k.a. VS.NET2003) you can compile multiplatform C++ code
>> without problems, including Boost libraries. And since VC7.1 you have
>> good-working STL ready "out-of-the-box" (instead, with VC6 you should use
>> STLport, or apply Dinkumware patches). And after VC6 you can use CString
>> also in non-MFC apps, because CString was refactored to be part of ATL.
>> And, being the new (i.e. post-VC6) CString a template class, you can have
>> both CStringA and CStringW in the same source files.
>>
>> And there are others improvements, too.
> Well, one of the benefits of static linking is to reduce the download
> size, so if that is important, then I still say VC6 is important to use.
> I worked for a couple companies that absolutely wouldn't ship anything
> over 400 KB and balked when the size approached 300 KB. (Dial up users
> are finicky.) Believe me, we monitored the .exe size with every build and
> if it ever shot up too much, we took a hard look at what changed and
> altered it if needed.
For the purpose of reducing code size (in the cases where it matters), I
think that disabling C++ exception handling and RTTI can help.
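If someone wants to experiment, these are the relevant compiler switches (the
file name is invented): /O1 optimizes for size, /GR- turns off RTTI, and
simply leaving out /EHsc means no C++ exception-unwinding code is generated.

  cl /nologo /O1 /GR- smalltool.cpp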
Giovanni
In the past few months, for the first time in my 21 years as a professional
using MS products, I have also begun questioning whether MS has jumped the
shark. While lots of good things still flow from there, they are
significantly outnumbered by the incredibly stupid things coming from there.
I'm immersing myself in the cross-platform Qt library these days, and it
reminds me of the good old days. For example, their textbox (QLineEdit) has
both "textEdited" and "textChanged" events. The first thing you might wonder
is what the difference is and which one to handle. In the first paragraph of
the documentation for both events, they say textEdited fires when the user
types new text, and textChanged fires in addition when the text changes under
programmatic control. How about that? The doc was written by people who
actually have a clue about what readers are trying to accomplish, and tell
them how to do it! And the Qt framework is loaded with utility classes that
make life easy, again a testament that someone over there actually considers
what we are trying to accomplish and makes it easy(ier).
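A tiny sketch of what that looks like in practice (the widget layout is made
up, but the two signals are the real QLineEdit ones; this uses the Qt 4
string-based connect syntax):

#include <QApplication>
#include <QLineEdit>
#include <QLabel>
#include <QVBoxLayout>
#include <QWidget>

int main(int argc, char *argv[])
{
    QApplication app(argc, argv);

    QWidget window;
    QLineEdit *edit = new QLineEdit;
    QLabel *mirror = new QLabel("(empty)");

    // textChanged fires for user edits *and* programmatic changes;
    // textEdited would fire only when the user types.
    QObject::connect(edit, SIGNAL(textChanged(QString)),
                     mirror, SLOT(setText(QString)));

    edit->setText("set from code");   // fires textChanged, not textEdited

    QVBoxLayout *layout = new QVBoxLayout(&window);
    layout->addWidget(edit);
    layout->addWidget(mirror);
    window.show();

    return app.exec();
}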
-- David
I'm not sure if that is safe to do in an MFC app, as the MFC core uses these
things?
Several times I have resorted to not linking with the RTL, or linking with
an RTL which was rebuilt to remove unnecessary parts (as per Matt Pietrek's
articles). But this was decades ago; I don't think it is necessary to go to
that level these days. But if simply building with another version of VS
can significantly reduce the file size, and that is important, well....
-- David
It only proves that software *has to* evolve, because there is no evidence of "intelligent
design" at work.
Our sincere hope is that 10 will reverse this trend and show that someone actually
understands the software development process.
joe
I really hope so too, but am not encouraged by the CTP of VS10, which
focuses on large code bases and new C++ standards. There is no sign of any
IDE improvements or any change in philosophy that I can see.
-- David
Good one :)
>Our sincere hope is that 10 will reverse this trend and show that someone actually
>understands the software development process.
Here's hoping there's an awful lot more to come that they've not
talked about so far - because among the few things we've seen so far in
"the new 6", one is a different editor with slightly fuzzy fonts :(
Dave
Tom
"David Ching" <d...@remove-this.dcsoft.com> wrote in message
news:0857CCD8-8D24-4A04...@microsoft.com...
Seriously, I should take a look at Qt. Sounds interesting.
Tom
"David Ching" <d...@remove-this.dcsoft.com> wrote in message
news:4C749D2E-6A18-403C...@microsoft.com...
While there is a lot to like about Office 2007 (especially how it shows you
inline things like font changes) I agree with your ribbon bar complaint.
My biggest complaint with VS 2008 is that some things just don't work
"sometimes". For example, two things in particular bug me:
1. If I want to swap the IDs of two controls, I can't have two named the
same thing (even temporarily) or it complains; but if I change ID_C1 to
ID_SOMETHING_ELSE and then try to change ID_C2 to ID_C1, it still complains
that there is already a control ID_C1, even though there isn't any longer
since I renamed it to get rid of the other annoying "make sure I don't screw
up" message. I could see complaining about something like this at compile
time, but doing these checks while editing drives me nuts.
2. If I try to add a new dialog class to an already existing file, it will
ask if I want to merge the class into that file, then take about 40 seconds,
then say it has to close down the IDE, losing all my changes since the last
time I saved or compiled. I went back to adding things by hand.
These are both listed as bugs, but I'm hoping the next version at least will
focus on making what is there work reliably. I can work around the design,
but I can't work around wondering if I'll lose my changes if I don't save
after every few keystrokes.
Tom
"Joseph M. Newcomer" <newc...@flounder.com> wrote in message
news:kg4ei4h0u292h2fk4...@4ax.com...
Hi Tom,
You can get a smaller EXE with the newer (VS2008) compiler, even with the
Feature Pack or SP1 installed, *but* you have to do "extra work" as described
in the Connect link I posted above and repeated here:
https://connect.microsoft.com/VisualStudio/feedback/ViewFeedback.aspx?FeedbackID=369643
They said that this bug will be fixed in VS2010, which is OK.
G
Ha, first it was MFC, now it is the whole of MS! ;)
Qt has opened my eyes to what is possible with a C++ framework. After
dealing with its elegance, I am now seeing that the IDE problems with the MFC
tools are only the tip of the iceberg when it comes to how crude a framework
MFC is. It really stopped evolving 10 years ago. By all means, check
Qt out!
-- David
> I'm not sure if that is safe to do in an MFC app, as the MFC core uses
> these things?
Hi David,
I just don't know, I've never tried that for an MFC app.
For ATL, they introduced a preprocessor flag: _ATL_NO_EXCEPTIONS; if it is
defined, instead of throwing an exception there will be an assertion
failure.
I don't know if this flag works for MFC, too.
Moreover, if there is really a reason to build very small Win32 EXEs, I think
that ATL (and possibly WTL, which is built on top of ATL) can be a good
option.
I did a simple test program in ATL showing a simple dialog box with a button,
and it is just a small 12 KB .EXE (built with VS2008).
Probably, thanks to the use of C++ templates, only the code that is really
needed is put in the EXE.
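For reference, the test program was roughly like the following sketch (the
dialog and control IDs are invented and would live in the project's
.rc/resource.h):

#include <tchar.h>
#include <atlbase.h>
#include <atlwin.h>
#include "resource.h"                 // assumed to define IDD_SIMPLE, IDC_GREET

class CSmallModule : public CAtlExeModuleT<CSmallModule> {};
CSmallModule _AtlModule;              // initializes ATL for the process

class CSimpleDlg : public CDialogImpl<CSimpleDlg>
{
public:
    enum { IDD = IDD_SIMPLE };

    BEGIN_MSG_MAP(CSimpleDlg)
        COMMAND_ID_HANDLER(IDC_GREET, OnGreet)
        COMMAND_ID_HANDLER(IDCANCEL, OnCancel)
    END_MSG_MAP()

    LRESULT OnGreet(WORD, WORD, HWND, BOOL&)
    {
        MessageBox(_T("Hello from a small ATL exe"), _T("ATL"), MB_OK);
        return 0;
    }

    LRESULT OnCancel(WORD, WORD wID, HWND, BOOL&)
    {
        EndDialog(wID);
        return 0;
    }
};

int WINAPI _tWinMain(HINSTANCE, HINSTANCE, LPTSTR, int)
{
    CSimpleDlg dlg;
    dlg.DoModal();
    return 0;
}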
G
OK, I'm gonna' do it!!
For me, the pain of ATL/WTL is not worth it, and MFC was really the only
viable choice for me, so I was interested in keeping it as small as
possible.
Thanks,
David
Tom
"Giovanni Dicanio" <giovanniD...@REMOVEMEgmail.com> wrote in message
news:OHHUDmCT...@TK2MSFTNGP02.phx.gbl...
Good for you! :-)
"C++ GUI Programming with Qt 4, Second Edition" by Jasmin Blanchette and
Mark Summerfield (Prentice Hall) is what first inspired me.
http://www.amazon.com/Programming-Prentice-Source-Software-Development/dp/0132354160/ref=sr_1_1?ie=UTF8&s=books&qid=1227308249&sr=8-1
-- David
As "improvement" in the IDE, they are doing the WPF-based code editor.
I'm not sure if it is an improvement or not...
I had bad impression of WPF for some characters rendering, as we previously
discussed.
And source code is made by characters :)
(So, I think that quality-rendered characters must be one of top priorities
in a code editor.)
So, what is the purpose of having a WPF-based editor instead of a GDI-based
one? Slowing down startup time or runtime thanks to injection of more
C#/managed code in the IDE?
Or does anyone want to embed some movie or multimedia elements in the
source code? (Yes, composability of multimedia elements is one of the good
features of WPF :)
I'm open to innovations, but I think that they should offer *real*
improvements (like a snappy robust IDE, quality rendered characters, etc.).
Giovanni
> Qt has opened my eyes to what is possible with a C++ framework. After
> dealing with its elegance, I am now seeing that the IDE problems with MFC
> tools is only the tip of the iceberg when it comes to how crude a
> framework MFC is. It really has stopped evolving 10 years ago. By all
> means, check Qt out!
You are whetting my appetite for Qt :)
BTW: IIRC, Qt makes a lot of use of virtual methods instead of the message
maps typical of MFC.
Is my recollection correct?
In that case, do you experience slowdowns in the rendering of the UI, or are
Qt UIs just as snappy as native MFC ones?
I recall reading that when they invented MFC they wanted to avoid "bloated"
v-tables with lots of virtual functions, to speed things up; but probably
with dual-core CPUs clocked at 2-3 GHz becoming mainstream this is not a
problem anymore.
Undoubtedly, I think that virtual methods are more elegant than message
maps.
Thanks,
G
It does seem like the cart is pulling the horse....
-- David
>
>"David Ching" <d...@remove-this.dcsoft.com> ha scritto nel messaggio
>news:57038FB0-DD64-4333...@microsoft.com...
>> "Joseph M. Newcomer" <newc...@flounder.com> wrote in message
>> news:kg4ei4h0u292h2fk4...@4ax.com...
>>> Our sincere hope is that 10 will reverse this trend and show that someone
>>> actually
>>> understands the software development process.
>>
>> I really hope so too, but am not encouraged from the CTP of VS10 that
>> focuses on large code bases and new C++ standards. There is no sign of
>> any IDE improvements or any change in philosophy that I can see.
>
>As "improvement" in the IDE, they are doing the WPF-based code editor.
>I'm not sure if it is an improvement or not...
>I had bad impression of WPF for some characters rendering, as we previously
>discussed.
>And source code is made by characters :)
>(So, I think that quality-rendered characters must be one of top priorities
>in a code editor.)
>
>So, what is the purpose of having a WPF-based editor instead of a GDI-based
>one? Slowing down startup time or runtime thanks to injection of more
>C#/managed code in the IDE?
****
It is cool.
Unless we get the complete source code to the editor, I fail to see what value a rewrite
adds, other than keeping programmers busy.
I've not used WPF, but if it makes the text less readable, is this really an
"improvement"?
Note that editors like Epsilon implement the entire editor in user-visible (and
modifiable) code; only the editor primitives are opaque. This is the only sane approach
to building an editor.
****
>
>Or does anyone want to embedd some movie or multimedia elements in the
>source code? (Yes, composibility of multimedia elements is one of the good
>feature of WPF :)
****
I want to be able to embed Visio drawings as comments in my code (embed, not link!) so I
can do decent comments without having to use +=| characters to draw my data structures.
Isn't it about time the IDE grew up and became real? If the editor cannot embed Visio,
PowerPoint, Word, and/or Excel (just to start! It should be able to embed anything!),
bitmaps, sound clips, etc., what good is it? We have made no progress since punched
cards, as far as I can tell, in terms of the fundamental model. We just add cute features
like query-replace and cut+paste, but the compilers are still thinking "punched cards".
****
>
>I'm open to innovations, but I think that they should offer *real*
>improvements (like a snappy robust IDE, quality rendered characters, etc.).
****
See, that's your problem! You actually want IMPROVEMENT!
Silly us. Who cares if it's better, as long as (a) it is cool and (b) it uses .NET?
joe
****
>
>Giovanni
>Hi Joe,
>
>While there is a lot to like about Office 2007 (especially how it shows you
>inline things like font changes) I agree with your ribbon bar complaint.
>
>My biggest complaint with VS 2008 is that some things just don't work
>"sometimes". For example, two things in particular bug me:
>
>1. If I want to switch the ID for two controls I can't have two named the
>same thing (even temporarily) or it complains, but if I change ID_C1 to
>ID_SOMETHING else then try to change ID_C2 to ID_C1 it still complains that
>there is already a control ID_C1 even though there isn't any longer since I
>renamed it to get rid of the other annoying "make sure I don't screw up"
>message. I could see complaining about something like this at compile time,
>but doing these checks while editing drives me nuts.
****
Yes, it seems like this is a feature that should exist. Also, I should be able to declare
that all controls in all dialogs in my project get UNIQUE numbers, not
unique-within-dialog numbers (try copy-and-paste a group of controls from dialog1 to
dialog2!) and I should at ANY TIME be able to change, FROM THE IDE, the ID number of any
control I feel like changing.
Like you, I end up editing resource.h by hand! This is stupid, but the IDE is obviously
protecting me against SOMETHING (although I can't imagine what!)
****
>
>2. If I try to add a new dialog class to an already existing file it will
>ask if I want to merge the class into that file, then take about 40 seconds
>then say it had to close down the IDE and lose all my changes since the last
>time I saved or compiled. I went back to adding things by hand.
****
Ditto. I never even try this any longer. In fact, like you, I've had to revert to adding
DDX_Control commands by hand, because I had to add some 90 controls for a very complex
dialog I was building, and didn't want to spend 2 hours doing it, which is what the
wizards take! I timed myself, and added them all in about 20 minutes by hand. What's
Wrong With This Picture?
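For anyone wondering, "by hand" is literally one DDX_Control line per control
in DoDataExchange; a rough sketch with invented names:

// resource.h is assumed to define IDD_BIG_DIALOG, IDC_NAME and IDC_ITEMS.
#include <afxwin.h>
#include <afxcmn.h>          // CListCtrl
#include "resource.h"

class CBigDlg : public CDialog
{
public:
    CBigDlg(CWnd* pParent = NULL) : CDialog(IDD_BIG_DIALOG, pParent) {}

protected:
    CEdit     m_editName;
    CListCtrl m_listItems;

    virtual void DoDataExchange(CDataExchange* pDX)
    {
        CDialog::DoDataExchange(pDX);
        // One line per control; trivial to duplicate for 90 controls.
        DDX_Control(pDX, IDC_NAME,  m_editName);
        DDX_Control(pDX, IDC_ITEMS, m_listItems);
    }
};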
I can also add:
A ClassView that says "n files to read" for some value of n, but isn't reading them, can't
be made to read them, and will not tell me why it isn't reading them; and Add Variable and
Add Event Handler that tell me that my files are read-only (they are not!) [deleting the
.ncb file doesn't change anything]
joe
****
> It does seem like the cart is pulling the horse....
:)
G
>>I'm open to innovations, but I think that they should offer *real*
>>improvements (like a snappy robust IDE, quality rendered characters,
>>etc.).
> ****
> See, that's your problem! You actually want IMPROVEMENT!
>
> Silly us. Who cares if its better as long as (a) it is cool and (b) it
> uses .NET?
:)
Giovanni
Whilst that is, on the face of it, an attractive notion ... I foresee
problems when working with Version Control systems. Most VCSs do a
decent job of displaying differences between versions of text files, but
can't handle differences between binary files ("files are different" is
all you get). I imagine that a source file with embedded Visio drawing
would be binary (though maybe the Visio diagram could stored as XML or
something?)
I also worry about the prospect of trying to port the code to another
(non-Windows) platform and finding that I can export the code but that
all the comments get left behind.
The nice thing about plain text is that it is fairly universal.
Cheers,
Daniel.
The generally accepted wisdom is that using C++ exception handling to
manage program errors produces SMALLER executables than relying on error
codes ... of course, that's only true if you actually remember to write
the code to check the error codes; it's easy to forget that, and then
you'll get a smaller program with incomplete error checking.
RTTI is part of the C++ language, and in general you omit it at your
peril (things like dynamic_cast won't work without it). MFC was
originally written for 16-bit Visual C++, though, and that didn't
support standard C++ RTTI so MFC provides its own runtime type system
which can be used instead of C++ RTTI with MFC classes (dynamic_cast
still won't work if you disable RTTI, but you can use
CObject::GetRuntimeClass and CObject::IsKindOf).
Note: if you use C++ RTTI and dynamic_cast you MUST enable exceptions,
because dynamic_cast can throw an exception in the event of type
mismatch.
In short: you can save space by dropping language features if you want,
but it's generally not a good idea and you might be better off USING
those features and reducing the amount of your own code.
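A small side-by-side sketch of the two mechanisms (names invented; MFC's own
type information keeps working with /GR- while dynamic_cast does not):

#include <afxwin.h>

void Inspect(CObject* pObj)
{
    // MFC runtime class info: works for CObject-derived classes declared with
    // DECLARE_DYNAMIC / IMPLEMENT_DYNAMIC (or DYNCREATE/SERIAL), even with RTTI off.
    if (pObj->IsKindOf(RUNTIME_CLASS(CWnd)))
    {
        CWnd* pWnd = static_cast<CWnd*>(pObj);   // safe after the IsKindOf check
        TRACE(_T("HWND = 0x%p\n"), pWnd->GetSafeHwnd());
    }

    // Standard C++ RTTI: the pointer form returns NULL on a mismatch;
    // the reference form throws std::bad_cast, so it needs exceptions enabled.
    CFrameWnd* pFrame = dynamic_cast<CFrameWnd*>(pObj);
    if (pFrame == NULL)
        TRACE(_T("not a frame window\n"));
}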
Cheers,
Daniel.
I'm with you there. I'm sure glad I waited to port to Unicode until after I
switched to VS2005.
Anthony Wieser
Wieser Software Ltd
Joe:
Unfortunately, as we have discussed before, the way they added the warning to
the documentation is itself mega-confusing, and on the face of it violates the
rules of the C++ language. You most certainly can pass modified parameters to
the immediate base class, and they will be used if the immediate base class has
implemented the message map entry in question. It is only if the call passes all
the way down to CWnd that the recreation of the original parameters occurs.
--
David Wilkinson
Visual C++ MVP
Thanks Daniel. Sounds like if I don't use exceptions or dynamic_cast in my
MFC app, I can safely turn them off. I agree, it's better to turn them on,
though.
I didn't know dynamic_cast could throw an exception. I just thought it
returned NULL if the cast was invalid.
Thanks,
David
David:
Can enabling RTTI and exceptions really make a significant difference to an
app's size?
dynamic_cast throws an exception when used with a reference and the cast is
invalid. For pointers it returns NULL.
> "Joseph M. Newcomer" <newc...@flounder.com> wrote in message
> news:kg4ei4h0u292h2fk4...@4ax.com...
>> Our sincere hope is that 10 will reverse this trend and show that
>> someone actually
>> understands the software development process.
>
> I really hope so too, but am not encouraged from the CTP of VS10 that
> focuses on large code bases and new C++ standards. There is no sign
> of any IDE improvements or any change in philosophy that I can see.
I took a quick look. (Far shorter than the time it took to download -
ouch!) Clicked a couple buttons, etc. My first thought was "what's
different?". My second was "well, there's still alot of time before it
actually releases..."
Dave Connet
****
Yes, it ABSOLUTELY violates how C++ works, and the worst part is that there was NEVER a
need for this; it was an amateur hack.
joe
****
Like MFC, there are some virtual methods, but there is also the concept of
signals and slots. Slots are similar to .NET delegates, and I believe they
are implemented with macros like MFC message maps. In practice, I don't
find much difference between virtual functions and message maps, except I
once wasted a couple of hours because I used the __super C++ extension to
refer to the base class, which did not work due to using it in a message map
(it is only valid in a virtual function!).
Qt UI is generally snappy (you won't mistake it for a .NET app!), but there
are slight hesitations compared to MFC. Part of it is due to localizations
being applied on the fly, and the concept of child controls resizing
dynamically to accommodate their content. Ensuring text does not get cut off
is something I gladly give up a little speed for.
-- David
>In article news:<ve3fi411di2lggcc1...@4ax.com>, Joseph M.
>Newcomer wrote:
>> I want to be able to embed Visio drawings as comments in my code
>> (embed, not link!) so I can do decent comments without having to use
>> +=| characters to draw my data structures.
>
>Whilst that is, on the face of it, an attractive notion ... I foresee
>problems when working with Version Control systems. Most VCSs do a
>decent job of displaying differences between versions of text files, but
>can't handle differences between binary files ("files are different" is
>all you get). I imagine that a source file with embedded Visio drawing
>would be binary (though maybe the Visio diagram could stored as XML or
>something?)
****
Does this not say something about how bad our version control systems are? If it takes
smarts, then you should expect that Visio, PowerPoint, etc. would get enhancements that
allow doing a diff of pictures, but because we have this notion that version control
systems are intended for text, we have not pushed the idea that things OTHER than text
should be handled as well.
Ultimately, we are still trapped in the paradigm of punched cards, that programs are text
and text only, that compilers can read only raw text files, etc. I pointed this out back
in 1977, when we were first experimenting with structured storage. The whole notion of
#include is wrong, the notion that we have .cpp and .obj files is wrong. I've been saying
this for 30 years, and we still produce punched-card compilers that philosophically do not
differ from what we were using in the 1950s. Really, the only thing that has changed is
the cards can have more than 80 columns (the ISO standard requires a minimum of a
509-column card, according to Harbison & Steele 4th edition)
#include is wrong because it embodies the wrong notion of how to make definitions
available (text-only) and requires such obsolete, error-prone concepts such as search
paths. We were doing this much better in the early 1980s, but the ideas never escaped the
research domain because they weren't C. .obj files should be unnecessary; think of the
idea of streams in files. There's a source stream (which would NOT necessarily be raw
text) and an object stream, for example. But our file systems are just ways of putting
punched cards into named boxes and putting the boxes on shelves that have names.
joe
>
>I also worry about the prospect of trying to port the code to another
>(non-Windows) platform and finding that I can export the code but that
>all the comments get left behind.
>
>The nice thing about plain text is that it is fairly universal.
>
>Cheers,
> Daniel.
>
> Qt UI is generally snappy (you won't mistake it for a .NET app!), but
> there are slight hesitations compared to MFC. Part of it is due to
> localizations being applied on the fly, and the concept of child controls
> resizing dynamically to accomodate their content. Ensuring text does not
> get cut off is something I gladly give up a little speed for.
Thanks David.
G
> Can enabling RTTI and exceptions really make a significant difference to
> an app's size?
David:
I read on the microsoft.public.vc.language newsgroup that it really can:
From: "Alexander Nickolov" <agnicko...@mvps.org>
Subject: Re: win32 design patterns
Date: Thu, 28 Feb 2008 09:31:56 -0800
--[quote]--
In reality, you can save up to 20% binary code size (coming from
actual code experiments) by disabling C++ exceptions. Where
I work C++ exceptions are strictly prohibited since we care
about downloadable code size. Not to mention the hidden
complexity in properly handling C++ exceptions.
=====================================
Alexander Nickolov
Microsoft MVP [VC], MCSD
email: agnicko...@mvps.org
MVP VC FAQ: http://vcfaq.mvps.org
=====================================
--[end quote]--
Original message can be found here:
http://groups.google.ca/group/microsoft.public.vc.language/msg/455c2a64a0e8d401?dmode=source
I did not try compiling C++ code without exceptions, but in cases like those
cited in the above post, I think it could be a good option for reducing code
size.
Giovanni
Well, yes ... but one big reason that VCSs are so primitive is that
nobody has yet worked out how to present information to the user in a
more sophisticated way.
I'm all for better tools ... but until we have them we should work with
the limitations of what we've got.
> Ultimately, we are still trapped in the paradigm of punched cards,
> that programs are text and text only, that compilers can read only
> raw text files, etc.
Oh, unfair! ... it's more like punched paper tape ...
Seriously, though, if we are still "trapped" in that paradigm it's
because we (collectively) have not yet found an alternative that is as
easy to understand as text, has the simple appeal of text, and that
works!
I agree that people have been calling for richer representations of
program source for a very long time, but nobody has yet devised anything
that's caught on.
> #include is wrong because ...
It's not wrong ... it's just suboptimal. It does a job, and does it
fairly well, but what it does isn't the best approach to the problem.
Textual inclusion is simple to implement, though, and so offered a
pragmatic solution to the problem of shared definitions at the time that
C was being devised. If we're still stuck with it it's partly to
continue support for old code that still requires it and partly because
people haven't managed to agree on a more functional replacement.
C++ still uses #include because of an early (and very important, for the
success and widespread uptake of C++) decision to maintain sourcecode
compatibility with C.
C++ will soon (not in 2009, but maybe in only a few more years) gain
support for separate compilation of modules. We're getting there,
slowly.
Cheers,
Daniel.
<cynical>
Yes, you can probably save about 20% by removing all error checking,
</cynical>
Anecdotes are all very well, but I've never observed anything like so
big a difference. Exceptions are a very useful technique, and I'd
probably use them in preference to other, less reliable techniques even if
they doubled the code size (which they don't).
> Not to mention the hidden complexity in properly handling C++
> exceptions.
Hiding complexity is usually seen as an advantage ...
Cheers,
Daniel.
It will do for casts of references (because you can't have a NULL
reference).
See 5.2.7/9:
The value of a failed cast to pointer type is the NULL pointer value
of the required result type. A failed cast to reference type throws
bad_cast (18.5.2).
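A tiny self-contained illustration of that paragraph (the types are invented):

#include <typeinfo>   // std::bad_cast

struct Base    { virtual ~Base() {} };
struct Derived : Base {};
struct Other   : Base {};

bool CheckBothForms()
{
    Other obj;
    Base* pBase = &obj;

    // Pointer form: a failed cast yields a null pointer.
    Derived* pDerived = dynamic_cast<Derived*>(pBase);
    const bool pointerFailedQuietly = (pDerived == 0);

    // Reference form: a failed cast throws std::bad_cast.
    bool referenceThrew = false;
    try
    {
        Derived& rDerived = dynamic_cast<Derived&>(*pBase);
        (void)rDerived;
    }
    catch (const std::bad_cast&)
    {
        referenceThrew = true;
    }

    return pointerFailedQuietly && referenceThrew;   // true on a conformant compiler
}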
Cheers,
Daniel.
That makes sense, thanks. I've never used dynamic_cast on a reference, just
on pointers.
-- David
Use the right tool for the job, I say. Many C++ codebases simply don't use
exceptions. If those codebases happen to benefit by a reduction in
executable size, disable the exceptions.
-- David