
Valgrind finds memory leak on ImVector<char[4096]>::resize(). Why?


Flix

Dec 18, 2015, 4:20:13 PM
Hi everybody. I'm using a library called
https://github.com/ocornut/imgui, that includes a very simple template
class called ImVector.

This class is commented in this way:
// Lightweight std::vector<> like class to avoid dragging dependencies
(also: windows implementation of STL with debug enabled is absurdly
slow, so let's bypass it so our code runs fast in debug).
// Our implementation does NOT call c++ constructors because we don't
use them in ImGui. Don't use this class as a straight std::vector
replacement in your code!

I thought that char arrays had no C++ constructor/destructor.
Why does Valgrind complain when resizing the array?

Louis Krupp

Dec 18, 2015, 4:56:53 PM
On Fri, 18 Dec 2015 22:19:35 +0100, Flix <wri...@newsgroup.com>
wrote:
Can you post a small program that reproduces the problem?

Louis

Jorgen Grahn

Dec 18, 2015, 8:14:23 PM
On Fri, 2015-12-18, Flix wrote:
> Hi everybody. I'm using a library called
> https://github.com/ocornut/imgui, that includes a very simple template
> class called ImVector.
>
> This class is commented in this way:
> // Lightweight std::vector<> like class to avoid dragging dependencies

A weird thing to say about a well-standardized part of C++ which has
been around for a few decades, and is used by pretty much everyone ...

> (also: windows implementation of STL with debug enabled is absurdly
> slow, so let's bypass it so our code runs fast in debug).

Even more bizarre, unless the author is sarcastic.

> // Our implementation does NOT call c++ constructors because we don't
> use them in ImGui.

Also bizarre; you can't write a C++ class (a template in this case)
without dealing with C++ constructors. It's like living without using
DNA. But probably the author means that ImVector<T> doesn't construct
any T objects.

> Don't use this class as a straight std::vector
> replacement in your code!

Sound advice ...

> I thought that char arrays had no C++ constructor/destructor.
> Why does Valgrind complain when resizing the array?

Carefully read what valgrind says, and read the code. It shouldn't be
too hard to figure out -- valgrind is pretty good at saying what went
wrong. You don't tell /us/ what valgrind says, so we can't help.
(I would have expected use of uninitialized data rather than a memory
leak.)

In general, when valgrind complains, it has good reasons.

/Jorgen

--
// Jorgen Grahn <grahn@ Oo o. . .
\X/ snipabacken.se> O o .

mark

Dec 19, 2015, 3:08:07 AM
On 2015-12-19 02:14, Jorgen Grahn wrote:
> On Fri, 2015-12-18, Flix wrote:
>> Hi everybody. I'm using a library called
>> https://github.com/ocornut/imgui, that includes a very simple template
>> class called ImVector.
>>
>> This class is commented in this way:
>> // Lightweight std::vector<> like class to avoid dragging dependencies
>
> A weird thing to say about a well-standardized part of C++ which has
> been around for a few decades, and is used by pretty much everyone ...

There are people who are using C++ for some useful language features and
don't want to drag in C++ libraries (including the standard library).

The story C++ has on binary compatibility is just pathetic.

>> (also: windows implementation of STL with debug enabled is absurdly
>> slow, so let's bypass it so our code runs fast in debug).
>
> Even more bizarre, unless the author is sarcastic.

Huh? They are exactly right.

std::vector push_back is just one giant clusterfuck. There are >100
calls being made in there. The amount of useless overhead is mind-boggling.

Even for VC++ std::vector operator[]:

call [...]::operator[]
--> call [...]::_Myfirst
--> call [...]::_Get_data
--> call [...]::_Get_second

for ImVector:
call [...]::operator[]

With debug builds without inlining, there is a huge difference.

>> // Our implementation does NOT call c++ constructors because we don't
>> use them in ImGui.
>
> Also bizarre; you can't write a C++ class (a template in this case)
> without dealing with C++ constructors. It's like living without using
> DNA.

No, it's living without unnecessary overhead. They are only supporting
what C++ now calls trivially copyable types and are using memcpy.

I have written my own vector replacement (only supporting trivially
copyable types) and for many use cases it's significantly faster than
std::vector (not just in debug builds).

Jorgen Grahn

Dec 19, 2015, 5:35:15 AM
On Sat, 2015-12-19, mark wrote:
> On 2015-12-19 02:14, Jorgen Grahn wrote:
> > On Fri, 2015-12-18, Flix wrote:
...
> >> (also: windows implementation of STL with debug enabled is absurdly
> >> slow, so let's bypass it so our code runs fast in debug).
> >
> > Even more bizarre, unless the author is sarcastic.
>
> Huh? They are exactly right.
>
> std::vector push_back is just one giant clusterfuck. There are >100
> calls being made in there. The amount of useless overhead is mind-boggling.

You're talking about non-optimized builds with a certain Microsoft
implementation of the standard library I guess. I honestly don't
care: I have a good implementation of std::vector here which I use.
I expect others to have a good C++ implementation too; they have been
freely available for at least a decade.

If you have a problem with std::vector, the solution is /not/ to
create your own incompatible replacement. Instead, don't use debug
builds, or get a better standard library. Or accept that debug builds
are ... debug builds.

Öö Tiib

Dec 19, 2015, 11:07:16 AM
On Saturday, 19 December 2015 10:08:07 UTC+2, mark wrote:
> On 2015-12-19 02:14, Jorgen Grahn wrote:
>
> >> (also: windows implementation of STL with debug enabled is absurdly
> >> slow, so let's bypass it so our code runs fast in debug).
> >
> > Even more bizarre, unless the author is sarcastic.
>
> Huh? They are exactly right.

Yes but in a way that makes good trolls indistinguishable
from genuine kooks. (Morgan's maxim)

What is the point of using a debug build and then writing hand-optimized
code in it? The whole scenario is either eccentric (kook) or good sarcasm
(troll), but it is impossible to tell which.

omarc...@gmail.com

Dec 19, 2015, 1:50:55 PM

> You're talking about non-optimized builds with a certain
> Microsoft implementation of the standard library I guess.
> I honestly don't care:

Well, you don't, but people using said implementation in non-optimized builds do care; it is as simple as that. And there are lots of those people, believe me.

> If you have a problem with std::vector, the solution is /not/ to
> create your own incompatible replacement. Instead, don't use
> debug builds, or get a better standard library.

Ridiculous and lazy.

You'd rather not use debug builds, and suggest to your hundreds or thousands of users that they not use debug builds and change their standard library, rather than write 30 lines of code?

Aside from the Microsoft debug implementation, any standard-compliant std::vector is likely to be slower than a finely tuned non-standard version written for your own needs. std:: classes are great, but claiming that they fit all use cases is just lazy.

> > >> (also: windows implementation of STL with debug enabled is absurdly
> > >> slow, so let's bypass it so our code runs fast in debug).
> > >
> > > Even more bizarre, unless the author is sarcastic.
> >
> > Huh? They are exactly right.
>
> Yes but in a way that makes good trolls indistinguishable
> from genuine kooks. (Morgan's maxim)
>
> What is the point of using debug build and then write hand-optimized code
> in it? The whole scenario is either eccentric (kook) or good sarcasm (troll)
> but it is impossible to tell what.

It is neither.

When you work on a game meant to run at a real-time / interactive frame rate, even more so a complex game (imagine your typical console title), having a debug build that runs at 5 fps is absolutely prohibitive. You want to maximize your debugging capabilities every minute of the day while minimizing your performance cost.

That library quoted above (which I wrote, and I wrote that comment) is meant to be always-on and always-available, and it would be a huge flaw if using the library in a debug build took 4 ms of your time every frame. Visual Studio in particular has really harsh (slow) settings by default nowadays, and this extra cost is unacceptable for a library that may process thousands of UI objects every frame.

People who are finding std::vector satisfactory probably haven't been faced with writing a highly efficient and complex video game, in a highly competitive environment. Consider that GTA5 runs on the PS3; have a look at the specs of the PS3 on Wikipedia, and I can guarantee you they aren't using std::vector. When you get into the small details, the implementations of the STL classes are as varied as there are implementations, and this variation can actually be an issue with porting. Considering how simple it is to rewrite that sort of container, it makes sense for many more uses than you can imagine. I agree it seems weird to many users of C++, but it is also a reality for many developers, and it is a lack of imagination and understanding of different ecosystems and requirements to consider that rewriting your own set of containers is so unusual or eccentric.

Back to the original post: if Valgrind reports something and there is a case of a legit memory leak, obviously this is a bug in the library and it should be reported on the GitHub page for said library.

omarc...@gmail.com

Dec 19, 2015, 2:37:51 PM
> I thought that char arrays had no C++ constructor/destructor.
> Why does Valgrind complain when resizing the array?

char doesn't have a constructor/destructor, but the memory itself still needs to be freed, of course. It's hard to tell what your issue is given the lack of a Valgrind report and/or code sample. I'd imagine the issue is probably that you have an ImVector whose destructor wasn't called (perhaps you have an ImVector on the heap somewhere).

There's no ImVector<char[4096]> in the ImGui codebase, so it doesn't seem to be code in the default library?

Mr Flibble

Dec 19, 2015, 3:08:34 PM
There is nothing wrong with using std::vector (or the STL in general)
for games programming if the implementation is of decent quality and, in
the case of std::vector, you provide a custom allocator that doesn't
perform value initialization if that is your use-case; there is no need
to write your own container to do a similar job.
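For reference, a rough sketch of the allocator approach being suggested here (the adaptor name is made up; the pattern is the usual C++11 allocator-rebinding trick, so resize() leaves trivial elements uninitialized instead of zero-filling them):

    #include <memory>
    #include <utility>
    #include <vector>

    // Allocator adaptor: construct() without arguments performs default-
    // initialization (leaves trivial types uninitialized) instead of the
    // value-initialization std::allocator would do.
    template <typename T, typename A = std::allocator<T>>
    struct default_init_allocator : public A {
        using traits = std::allocator_traits<A>;
        template <typename U> struct rebind {
            using other =
                default_init_allocator<U, typename traits::template rebind_alloc<U>>;
        };
        using A::A;

        template <typename U>
        void construct(U* p) {
            ::new (static_cast<void*>(p)) U;     // default-init, not U()
        }
        template <typename U, typename... Args>
        void construct(U* p, Args&&... args) {
            traits::construct(static_cast<A&>(*this), p, std::forward<Args>(args)...);
        }
    };

    // Usage: resize() no longer zero-fills the chars.
    using raw_bytes = std::vector<char, default_init_allocator<char>>;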

Optimizing for debug builds is just plain craziness sausages.

/Flibble

Öö Tiib

Dec 19, 2015, 3:16:37 PM
On Saturday, 19 December 2015 20:50:55 UTC+2, omarc...@gmail.com wrote:

Please do not erase attributions; this is Usenet and not some web forum, so
people won't know whom you are quoting or what you are replying to if you
erase those. Restoring those ...

> I wrote:
> > On Saturday, 19 December 2015 10:08:07 UTC+2, mark wrote:
> > > On 2015-12-19 02:14, Jorgen Grahn wrote:
> > > > Comment from your code (like you say) was:
>
> > > >> (also: windows implementation of STL with debug enabled is absurdly
> > > >> slow, so let's bypass it so our code runs fast in debug).
> > > >
> > > > Even more bizarre, unless the author is sarcastic.
> > >
> > > Huh? They are exactly right.
> >
> > Yes but in a way that makes good trolls indistinguishable
> > from genuine kooks. (Morgan's maxim)
> >
> > What is the point of using debug build and then write hand-optimized code
> > in it? The whole scenario is either eccentric (kook) or good sarcasm (troll)
> > but it is impossible to tell what.
>
> It is neither.
>
> When you work on a game meant to run at a real-time / interactive
> frame rate, even more so a complex game (imagine your typical console
> title), having a debug build that runs at 5 fps is absolutely prohibitive.

Do not use debug build then.

> You want to maximize your debugging capabilities every minute of the
> day while minimizing your performance cost.

The debuggers work excellently with optimized builds. The only
thing that I avoid defining is NDEBUG (because that erases asserts).

>
> That library quoted above (which I wrote, and I wrote that comment) is
> meant to be always-on and always-available, and it would be a huge flaw
> if using the library in a debug build took 4 ms of your time every frame.

I use debug versions only in unit-tests and automated tests. It does
not matter for me how long those run on test farm.

> Visual Studio in particular has really harsh (slow) settings by default
> nowadays, and this extra cost is unacceptable for a library that may
> process thousands of UI objects every frame.

The slow run-time-checked debug versions of standard containers and
iterators are to track down errors like where someone did 'push_back'
element into a 'vector' and forgot that it *may* invalidate iterators.
Such logic errors are all caught by unit tests quickly so I do not need
those checks in versions that are debugged manually.
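For example, the kind of logic error those checked iterators are there to catch (illustrative snippet, not from the thread):

    #include <vector>

    int main() {
        std::vector<int> v{1, 2, 3};
        auto it = v.begin();
        v.push_back(4);        // may reallocate: 'it' is now invalid
        return *it;            // undefined behaviour; a checked STL asserts here
    }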

>
> People who are finding std::vector satisfactory probably haven't
> been faced with writing a highly efficient and complex video game, in
> a highly competitive environment. Consider that GTA5 runs on the
> PS3; have a look at the specs of the PS3 on Wikipedia, and I can
> guarantee you they aren't using std::vector.

What? Most people who use C++ for anything use it because they need
its excellent performance. Optimized (by decent compiler) code of
'std::vector' is about as good as same thing written in assembler.

> When you get into the small details, the implementations of the STL
> classes are as varied as there are implementations, and this variation
> can actually be an issue with porting.

Fear, Uncertainty and Doubt. There was maybe case or two 20 years ago and
old women of bus-station gossip of it. World has changed.

> Considering how simple it is to rewrite that sort of container, it
> makes sense for many more uses than you can imagine.

Good containers aren't simple to write correctly. I have seen enough
of both kinds. It is tricky to gain authority among guys like you.
Typically I argue with them then let them to waste a week on it and
then demonstrate them how well used 'std' and/or 'boost' containers
are easier to debug, but still beat their "container". Often 5:1.
There are lot of excellent containers written by best and tested by
many. *Choosing* correct one is the trick to learn, not how to
invent yet another square wheel.

> I agree it seems weird to many users of C++, but it is also a reality
> for many developers, and it is a lack of imagination and understanding
> of different ecosystems and requirements to consider that rewriting your
> own set of containers is so unusual or eccentric.

I do not know what you imagine C++ is commonly used for by these
"many users". Good C++ developers are scarce so most are used only to
write most critical parts of anything.

>
> Back to the original post: if Valgrind reports something and there is a
> case of a legit memory leak, obviously this is a bug in the library and
> it should be reported on the GitHub page for said library.

Of course. I have seen also standard libraries leaking memory and
crashing in situations but it happens very rarely and the defects are
repaired rather fast by vendors because lot more people evaluate code of
those, test those and use those.

omarc...@gmail.com

Dec 20, 2015, 3:31:45 AM
On Saturday, 19 December 2015 21:08:34 UTC+1, Mr Flibble wrote:
> There is nothing wrong with using std::vector (or the STL in general)
> for games programming if the implementation is of decent quality and, in
> the case of std::vector, you provide a custom allocator that doesn't
> perform value initialization if that is your use-case; there is no need
> to write your own container to do a similar job.

There is nothing wrong for you. It is crazy that you can't even have the IMAGINATION to consider that someone somewhere may not want a vector where both size and capacity are stored as an 8-byte size_t. What if I want them to be 2 bytes or 4 bytes? What if I don't want to allow cases such as the reference parameter to push_back pointing mid-vector, etc.? Suggesting that every C++ programmer needs to use the exact same code in every situation is crazy.

You do realise that writing those few dozen lines doing exactly what I need in that simple context is easier than writing and maintaining an allocator, passing it around, and dealing with horrible, horrible error messages?

omarc...@gmail.com

Dec 20, 2015, 3:57:52 AM
I genuinely wonder if you are trolling me or it is just lack of experience and imagination. You are both essentially saying "everybody should use the same code". Do I really need to point out how wrong that statement is?

On Saturday, 19 December 2015 21:16:37 UTC+1, Öö Tiib wrote:
> > When you work on a game meant to run at a real-time / interactive
> > frame rate, even more so a complex game (imagine your typical console
> > title), having a debug build that runs at 5 fps is absolutely prohibitive.
>
> Do not use debug build then.

So you are suggesting we don't use debuggers? You are suggesting we don't aim to improve our working condition by making a debugging session run faster to increase the odds of being able to work under those conditions?

> > You want to maximize your debugging capabilities every minute of the
> > day while minimizing your performance cost.
>
> The debuggers work excellently with optimized builds. The only
> thing that I avoid defining is NDEBUG (because that erases asserts).

They sometimes do an ok job, they can't possibly do everything. Also I see you probably haven't worked with debuggers shipped for various console platforms.

> > That library quoted above (which I wrote, and I wrote that comment) is
> > meant to be always-on and always-available, and it would be a huge flaw
> > if using the library in a debug build took 4 ms of your time every frame.
> I use debug versions only in unit-tests and automated tests. It does
> not matter for me how long those run on test farm.

Great for you. I want users to be able to use debug versions on a daily basis when possible. e.g. working on feature X one is going to set up their data in a way that allows maximizing debugging capabilities if possible.

> > Visual Studio in particular has really harsh (slow) settings by default
> > nowadays, and this extra cost is unacceptable for a library that may
> > process thousands of UI objects every frame.
>
> The slow run-time-checked debug versions of standard containers and
> iterators are to track down errors like where someone did 'push_back'
> element into a 'vector' and forgot that it *may* invalidate iterators.
> Such logic errors are all caught by unit tests quickly so I do not need
> those checks in versions that are debugged manually.

Great that your code is all unit tested. I guess that you must write very little code and work with simple systems. It's hard and unusual to automatically test all parts of a game. You can test many parts, but you can't test all of it, certainly not with unit tests. Indie game developers in particular generally don't have the infrastructure to do so. It is not my job as a library developer to push more burden on every indie developer when I want to help them.

> > People who are finding std::vector satisfactory probably haven't
> > been faced with writing a highly efficient and complex video game, in
> > a highly competitive environment. Consider that GTA5 runs on the
> > PS3; have a look at the specs of the PS3 on Wikipedia, and I can
> > guarantee you they aren't using std::vector.
>
> What? Most people who use C++ for anything use it because they need
> its excellent performance. Optimized (by decent compiler) code of
> 'std::vector' is about as good as same thing written in assembler.

I agree, most people. Not most high-end game developers.

First, we're talking about debug builds here. Secondly, there are hundreds of possible algorithm and implementation variants while the STL provides a few dozen, and they vary by implementation. It's *unacceptable* for a game of very high calibre to rely on implementation-dependent automatic growth of vector capacity. Personally I don't want my vectors to have 24 bytes of overhead when they could be 16 bytes. Shitty error messages when you start dealing with more complex stuff are also unacceptable. Slower compilation all across the board is also not something I fancy.

I do use a lot of STL. I am merely saying there are cases where it is OK not to use it, and for that library it makes perfect sense and it is the best thing to do.
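To make the 24-vs-16-byte point above concrete: on a typical 64-bit ABI, std::vector's header is three pointers, while a handle with 32-bit size/capacity fits in 16 bytes. An illustrative sketch (exact numbers vary by implementation):

    #include <cstdint>
    #include <cstdio>
    #include <vector>

    // Typical std::vector header: three pointers (begin/end/cap) = 24 bytes on x64.
    // A custom header with 32-bit size/capacity fits in 16 bytes.
    struct SmallVecHeader {
        void*        data;
        std::int32_t size;
        std::int32_t capacity;
    };

    int main() {
        std::printf("std::vector<int>: %zu bytes\n", sizeof(std::vector<int>)); // usually 24
        std::printf("SmallVecHeader:   %zu bytes\n", sizeof(SmallVecHeader));   // 16
    }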

> > When you get into the small details, the implementations of the STL
> > classes are as varied as there are implementations, and this variation
> > can actually be an issue with porting.
>
> Fear, Uncertainty and Doubt. There was maybe case or two 20 years ago and
> old women of bus-station gossip of it. World has changed.

Knowledge, Curiosity, Imagination. (Lack of in your case)

Essentially, many of the answers from people in this thread are "we don't like that YOU write a dozen lines of your own code, we'd rather make the access bar for ALL YOUR USERS higher and harder because it is the right way to do things".

TL;DR:
- I am saying there are different ways of doing things for different scenarios. I am not saying your way is wrong.
- You are saying there is a single way of doing things. You are saying that my way of doing things, within a context that's not yours and that you poorly understand, is wrong.

Gareth Owen

Dec 20, 2015, 5:01:18 AM
omarc...@gmail.com writes:

> I genuinely wonder if you are trolling me or it is just lack of
> experience and imagination.

Not experience, but definitely imagination. He knows the single right
answer to every question and no alternative view, regardless how well
argued, will be entertained. You're best just ignoring him.

> You are both essentially saying "everybody should use the same
> code". Do I really need to point out how wrong that statement is?

You could try. He'll just ignore/denigrate you.

Nobody

Dec 20, 2015, 6:26:01 AM
On Sat, 19 Dec 2015 08:06:59 -0800, Öö Tiib wrote:

> What is the point of using a debug build and then writing hand-optimized
> code in it?

Real-time code often adapts its behaviour to performance. E.g. games often
update the simulation using numerical integration where "dt" is the time
interval between previous frames, and/or adjust the level of graphical
detail in order to obtain a reasonable frame rate.

In this situation, it's often impossible to actually debug issues using
a debug build because the debug build runs so much slower than a release
build that it substantially changes the behaviour of the program,
resulting in Heisenbugs.

Flix

Dec 20, 2015, 7:25:39 AM
Actually I just ran Valgrind on a test case that was using my (old)
"imguifilesystem" control (it was meant to replace std::string).
I'll make further tests on it, and if I find something, I'll use the
ImGui Issue Forum (I thought that C arrays could have some kind of
ctor/dtor "overhead").

However it was not my intention to trigger such a discussion.
I'm sorry.

Maybe a library that doesn't use the STL is more portable (fewer dependencies
= more portability)
and less dependent on the particular performance of a specific STL library.

But I'm sure people would have something to say about it...

Flix

Dec 20, 2015, 8:44:11 AM
On 20/12/2015 13:25, Flix wrote:
> On 19/12/2015 20:37, omarc...@gmail.com wrote:
> Actually I just ran Valgrind on a test case that was using my (old)
> "imguifilesystem" control (it was meant to replace std::string).

Oops, I meant it to replace a vector of char[PATH_MAX].


Mr Flibble

Dec 20, 2015, 1:15:54 PM
On 20/12/2015 08:31, omarc...@gmail.com wrote:
> On Saturday, 19 December 2015 21:08:34 UTC+1, Mr Flibble wrote:
>> There is nothing wrong with using std::vector (or the STL in general)
>> for games programming if the implementation is of decent quality and, in
>> the case of std::vector, you provide a custom allocator that doesn't
>> perform value initialization if that is your use-case; there is no need
>> to write your own container to do a similar job.
>
> There is nothing wrong for you. It is crazy that you can't even have the IMAGINATION to consider that someone somewhere may not want a vector where both size and capacity are stored as an 8-byte size_t. What if I want them to be 2 bytes or 4 bytes? What if I don't want to allow cases such as the reference parameter to push_back pointing mid-vector, etc.? Suggesting that every C++ programmer needs to use the exact same code in every situation is crazy.

It is nothing to do with imagination and everything to do with
psychosis. If you aren't using std::vector because
sizeof(std::vector<T>) is 24 bytes instead of 12 bytes then you are
seriously in need of medication as psychosis can be the only explanation
for someone doing something totally wrong. YOU ARE DOING IT WRONG! (tm)

Oh and BTW most std::vector implementations will be storing three
pointers rather than std::size_t for size and capacity.

I am not suggesting that every C++ programmer use the same code for
different situations, but you have yet to provide a sane reason for using
your container over std::vector (I have no idea what you mean by pushing
back reference parameters mid-vector).

>
> You do realise that writing those few dozen lines doing exactly what I need in that simple context is easier than writing and maintaining an allocator, passing it around, and dealing with horrible, horrible error messages?
>

It is very rare that one needs an uninitialised buffer and when you do
new[] usually suffices so you still have yet to provide sufficient
rationale for using your container over std::vector sausages.

/Flibble

Flix

Dec 20, 2015, 2:05:50 PM
I fixed it!
And it was my fault (even if the Valgrind output was not very useful).

Basically I had a class allocated on the heap that contained an
ImVector<char[4096]>.
When I deallocated it, I simply released the memory without calling the
destructor on my class: thus ~ImVector<char[4096]>() was never called,
even though my class's heap space was correctly deallocated.

Since, going down the chain, ImVector::resize() -> ImVector::reserve()
are the only places where allocations happen inside ImVector, Valgrind
reported these methods.

Basically Valgrind can't know where I should have freed the memory:
that's why leaks are very difficult to fix.
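A sketch of the pattern described above (hypothetical type and member names, assuming the real ImVector from imgui.h is available):

    #include "imgui.h"   // assumes ImGui's ImVector is on the include path

    struct FileList {
        ImVector<char[4096]> paths;   // owns a heap allocation internally
    };

    int main() {
        FileList* fl = new FileList();
        fl->paths.resize(16);         // ImVector::resize() -> reserve() allocates here

        // Bug: frees the FileList's own storage but never runs ~FileList(),
        // so ~ImVector() never runs and the internal buffer leaks. Valgrind
        // then reports the allocation site, i.e. resize()/reserve().
        operator delete(fl);

        // Fix: destroy the object properly instead:
        //     delete fl;
    }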

omarc...@gmail.com

Dec 20, 2015, 3:08:49 PM
On Sunday, 20 December 2015 20:05:50 UTC+1, Flix wrote:
> > I thought that char arrays had no C++ constructor/destructor.
> > Why does Valgrind complain when resizing the array?
> >
>
> I fixed it!
> And it was my fault (even if the Valgrind output was not very useful).

Glad that you found your problem Flix. I'd say Valgrind was reasonably useful there.

On Sunday, 20 December 2015 19:15:54 UTC+1, Mr Flibble wrote:
> > (I have no idea what you mean by pushing back reference parameters mid-vector

push_back() takes a const T& and a typical implementation needs to cater for the case where that reference points within the vector data itself at the time of calling the function, i.e. take a temporary copy if reallocating.
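In other words, the case a conforming push_back has to survive looks like this (illustrative snippet):

    #include <vector>

    int main() {
        std::vector<int> v{1, 2, 3};
        v.push_back(v[0]);   // the argument refers into v's own storage;
                             // if push_back reallocates before copying, a naive
                             // implementation reads from freed memory.
                             // std::vector is required to handle this correctly.
    }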

I have already provided enough reasons, which you decide to ignore; and I am probably missing some (many STL types playing terribly with edit&continue/live recompilation techniques, ease of portability to old or non-conformant architectures, duplicated code bloat polluting the instruction cache, more painful visibility and stepping in debuggers, implementation-dependent behavior, horrible error messages, increased compilation times - consider various situations where pre-compiled headers are unavailable, constant harassment of 32/64-bit warnings because of a size_t that I don't care about, ease of extending a library you wrote yourself to create more specialised containers (e.g. holding a local buffer to avoid heap allocations), etc.).

I am going to opt out of the conversation and go get my medication for psychosis, because some people here haven't written software where memory or performance or portability (hint: portability doesn't mean unix+mac+windows) matters, or dealt with dozens/hundreds of thousands of things going on. With sloppy programming practices being so common, no wonder my Windows keyboard driver nowadays takes 55 MB of RAM and software is generally slower and less snappy to the end user than it was ten years ago. Step back a bit and compare to my prime example above: GTA5 runs with 256 MB of RAM on the PS3, i.e. about five times the amount of RAM that my keyboard driver, written by some arguably incompetent programmers, uses. I don't need those 12 bytes for that library and I still recommend using the STL in the majority of cases, but I have needed those 12 bytes on several occasions in the past.

omarc...@gmail.com

Dec 20, 2015, 3:17:36 PM
On Sunday, 20 December 2015 21:08:49 UTC+1, omarc...@gmail.com wrote:
> On Sunday, 20 December 2015 20:05:50 UTC+1, Flix wrote:
> > > I thought that char arrays had no C++ constructor/destructor.
> > > Why does Valgrind complain when resizing the array?
> > >
> >
> > I fixed it!
> > And it was my fault (even if the Valgrind output was not very useful).
>
> Glad that you found your problem Flix. I'd say Valgrind was reasonably useful there.
>
> On Sunday, 20 December 2015 19:15:54 UTC+1, Mr Flibble wrote:
> > (I have no idea what you mean by pushing back reference parameters mid-vector
>
> push_back() takes a const T& and a typical implementation needs to cater for the case where that reference points within the vector data itself at the time of calling the function, i.e. take a temporary copy if reallocating.
>
> I have already provided enough reasons, which you decide to ignore; and I am probably missing some (many STL types playing terribly with edit&continue/live recompilation techniques, ease of portability to old or non-conformant architectures, duplicated code bloat polluting the instruction cache, more painful visibility and stepping in debuggers, implementation-dependent behavior, horrible error messages, increased compilation times - consider various situations where pre-compiled headers are unavailable, constant harassment of 32/64-bit warnings because of a size_t that I don't care about, ease of extending a library you wrote yourself to create more specialised containers (e.g. holding a local buffer to avoid heap allocations), etc.).

And I have to add that those reasons make even more sense when shipping a library. When I'm writing my own code I have fewer issues with using std::map etc. for convenience, because I know it works and it'll cover 90% of my needs. When I'm writing a library that I expect game programmers to use, it's a very different thing to use those. Even if you argue and debate my points, my library would simply lose half of its intended audience if it dragged in <vector> and <map>.

Öö Tiib

Dec 20, 2015, 3:56:01 PM
On Sunday, 20 December 2015 10:57:52 UTC+2, omarc...@gmail.com wrote:
> I genuinely wonder if you are trolling me or it is just lack of
> experience and imagination. You are both essentially saying "everybody
> should use the same code". Do I really need to point out how wrong
> that statement is?

Looked at your code ... bah. It won't pass review by any decent C++
developer. Go read some book about C++.

For a single example: every class that has a destructor (including 'ImVector')
violates the rule of three. So there will be mundane memory management issues
with objects of your library classes.

The last time a leak in my code reached a repository was 12 years ago. I am
trolling? Why? It is exactly as bad as I suspected; there is no point
even in testing the thing.
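For readers unfamiliar with the objection, the rule-of-three problem in miniature (an illustrative sketch, not ImGui's actual code): a class with a destructor but compiler-generated copies ends up double-freeing.

    // Rule-of-three violation: destructor defined, copy operations left implicit.
    struct NaiveVector {
        char* data = nullptr;
        ~NaiveVector() { delete[] data; }
        // no copy constructor / copy assignment declared
    };

    int main() {
        NaiveVector a;
        a.data = new char[64];
        NaiveVector b = a;   // compiler-generated copy: both now own the same pointer
        return 0;            // ~b and ~a each delete[] the same buffer -> double free
    }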

Mr Flibble

Dec 20, 2015, 4:27:12 PM
On 20/12/2015 20:08, omarc...@gmail.com wrote:
> On Sunday, 20 December 2015 19:15:54 UTC+1, Mr Flibble wrote:
>> (I have no idea what you mean by pushing back reference parameters mid-vector
>
> push_back() takes a const T& and a typical implementation needs to cater for the case where that reference points within the vector data itself at the time of calling the function, i.e. take a temporary copy if reallocating.

A decent STL implementation will already have a check for reallocation,
so I don't see a need for an extra check if the element being inserted is a
reference to an existing element in the same container, assuming the previous
memory is deallocated AFTER the elements are inserted into the newly allocated
memory, which should be the case for a decent implementation. I suspect
the less-than-perfect Microsoft VC++ STL implementation is informing
your erroneous views? Am I right? :D

>
> I have already provided enough reasons, which you decide to ignore; and I am probably missing some (many STL types playing terribly with edit&continue/live recompilation techniques, ease of portability to old or non-conformant architectures, duplicated code bloat polluting the instruction cache, more painful visibility and stepping in debuggers, implementation-dependent behavior, horrible error messages, increased compilation times - consider various situations where pre-compiled headers are unavailable, constant harassment of 32/64-bit warnings because of a size_t that I don't care about, ease of extending a library you wrote yourself to create more specialised containers (e.g. holding a local buffer to avoid heap allocations), etc.).
>
> I am going to opt out of the conversation and go get my medication for psychosis, because some people here haven't written software where memory or performance or portability (hint: portability doesn't mean unix+mac+windows) matters, or dealt with dozens/hundreds of thousands of things going on. With sloppy programming practices being so common, no wonder my Windows keyboard driver nowadays takes 55 MB of RAM and software is generally slower and less snappy to the end user than it was ten years ago. Step back a bit and compare to my prime example above: GTA5 runs with 256 MB of RAM on the PS3, i.e. about five times the amount of RAM that my keyboard driver, written by some arguably incompetent programmers, uses. I don't need those 12 bytes for that library and I still recommend using the STL in the majority of cases, but I have needed those 12 bytes on several occasions in the past.

Sorry but you still haven't provided any sane rationale for your noddy
std::vector rip off sausages.

/Flibble

Öö Tiib

Dec 21, 2015, 2:21:49 AM
On Sunday, 20 December 2015 13:26:01 UTC+2, Nobody wrote:
> On Sat, 19 Dec 2015 08:06:59 -0800, Öö Tiib wrote:
>
> > What is the point of using a debug build and then writing hand-optimized
> > code in it?
>
> Real-time code often adapts its behaviour to performance. E.g. games often
> update the simulation using numerical integration where "dt" is the time
> interval between previous frames, and/or adjust the level of graphical
> detail in order to obtain a reasonable frame rate.

Indeed, the time-critical parts cannot be stepped through in a debugger anyway,
since all sorts of timeouts may start to fire.

>
> In this situation, it's often impossible to actually debug issues using
> a debug build because the debug build runs so much slower than a release
> build that it substantially changes the behaviour of the program,
> resulting in Heisenbugs.

OK. So instead of realizing that whatever they attempt is futile they
start to write hand-optimized code to improve performance of debug build.
Thanks. That makes sense now.

Jorgen Grahn

Dec 21, 2015, 12:05:26 PM
On Mon, 2015-12-21, Öö Tiib wrote:
> On Sunday, 20 December 2015 13:26:01 UTC+2, Nobody wrote:
>> On Sat, 19 Dec 2015 08:06:59 -0800, Öö Tiib wrote:
...
>> In this situation, it's often impossible to actually debug issues using
>> a debug build because the debug build runs so much slower than a release
>> build that it substantially changes the behaviour of the program,
>> resulting in Heisenbugs.
>
> OK. So instead of realizing that whatever they attempt is futile they
> start to write hand-optimized code to improve performance of debug build.
> Thanks. That makes sense now.

Perhaps the right thing to do for the OP is to make debug builds, but
modify them so normal versions of the standard library are used. Much
of the templated code will be built without optimization, but at least
it won't have extra range checks on iterators, operator[] and so on.

Sounds like that should be possible -- although I have no experience
with his toolchain. It's certainly possible with the GCC tools, since
controlling the optimization level, the generation of debug symbols,
and the libstdc++ settings, are separate settings.

Ian Collins

Dec 21, 2015, 5:06:45 PM
Jorgen Grahn wrote:
> On Mon, 2015-12-21, Öö Tiib wrote:
>> On Sunday, 20 December 2015 13:26:01 UTC+2, Nobody wrote:
>>> On Sat, 19 Dec 2015 08:06:59 -0800, Öö Tiib wrote:
> ....
>>> In this situation, it's often impossible to actually debug issues using
>>> a debug build because the debug build runs so much slower than a release
>>> build that it substantially changes the behaviour of the program,
>>> resulting in Heisenbugs.
>>
>> OK. So instead of realizing that whatever they attempt is futile they
>> start to write hand-optimized code to improve performance of debug build.
>> Thanks. That makes sense now.
>
> Perhaps the right thing to do for the OP is to make debug builds, but
> modify them so normal versions of the standard library are used. Much
> of the templated code will be built without optimization, but at least
> it won't have extra range checks on iterators, operator[] and so on.
>
> Sounds like that should be possible -- although I have no experience
> with his toolchain. It's certainly possible with the GCC tools, since
> controlling the optimization level, the generation of debug symbols,
> and the libstdc++ settings, are separate settings.

I agree, both the tool chains I use (gcc and Oracle) allow fine control
over both optimisation and debug settings. A good compromise is often
"-g -O1" which give reasonable performance code with debug symbols.

If your tools can't give you basic optimisation with debugging support,
it's probably time to look elsewhere.

--
Ian Collins

mark

Dec 22, 2015, 7:52:41 AM
On 2015-12-21 18:05, Jorgen Grahn wrote:
> On Mon, 2015-12-21, Öö Tiib wrote:
>> On Sunday, 20 December 2015 13:26:01 UTC+2, Nobody wrote:
>>> On Sat, 19 Dec 2015 08:06:59 -0800, Öö Tiib wrote:
> ...
>>> In this situation, it's often impossible to actually debug issues using
>>> a debug build because the debug build runs so much slower than a release
>>> build that it substantially changes the behaviour of the program,
>>> resulting in Heisenbugs.
>>
>> OK. So instead of realizing that whatever they attempt is futile they
>> start to write hand-optimized code to improve performance of debug build.
>> Thanks. That makes sense now.
>
> Perhaps the right thing to do for the OP is to make debug builds, but
> modify them so normal versions of the standard library are used. Much
> of the templated code will be built without optimization, but at least
> it won't have extra range checks on iterators, operator[] and so on.

At least with VC++ it doesn't work. Debug/release builds are not binary
compatible.

I definitely want range checking in debug builds. The performance
problem isn't range checking in the first place. The problem is insanely
long call chains that heavily rely on inlining.

Öö Tiib

Dec 22, 2015, 9:03:45 AM
On Tuesday, 22 December 2015 14:52:41 UTC+2, mark wrote:
> On 2015-12-21 18:05, Jorgen Grahn wrote:
> > On Mon, 2015-12-21, Öö Tiib wrote:
> >> On Sunday, 20 December 2015 13:26:01 UTC+2, Nobody wrote:
> >>> On Sat, 19 Dec 2015 08:06:59 -0800, Öö Tiib wrote:
> > ...
> >>> In this situation, it's often impossible to actually debug issues using
> >>> a debug build because the debug build runs so much slower than a release
> >>> build that it substantially changes the behaviour of the program,
> >>> resulting in Heisenbugs.
> >>
> >> OK. So instead of realizing that whatever they attempt is futile they
> >> start to write hand-optimized code to improve performance of debug build.
> >> Thanks. That makes sense now.
> >
> > Perhaps the right thing to do for the OP is to make debug builds, but
> > modify them so normal versions of the standard library are used. Much
> > of the templated code will be built without optimization, but at least
> > it won't have extra range checks on iterators, operator[] and so on.
>
> At least with VC++ it doesn't work. Debug/release builds are not binary
> compatible.

The C++ library is mostly header-only; what do you mean by binary compatibility?
In VC++, different versions of the library code are chosen depending on how you
define _SECURE_SCL and _ITERATOR_DEBUG_LEVEL (also _SCL_SECURE_NO_WARNINGS).
Indeed, you have to define those the same for the whole application; you can't
tinker with the choice of library versions on a per-file basis.

>
> I definitely want range checking in debug builds. The performance
> problem isn't range checking in the first place. The problem is insanely
> long call chains that heavily rely on inlining.

Those "debug" and "release" are just names for common configurations.
If you need inlining in "debug" builds then enable optimizations in
"debug" builds. If you need debug information in "release" builds then
keep generating that for "release" builds.
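To make that concrete, a sketch of how the MSVC knob mentioned above is typically used. The macro must have the same value in every translation unit and linked library (mismatches are caught at link time), so in practice it is set in the project/build settings rather than per-file; values: 0 = off, 1 = basic checks, 2 = full debug checks.

    // Turn MSVC debug-build iterator checking off for the whole project,
    // usually via /D_ITERATOR_DEBUG_LEVEL=0 or a forced-include header.
    #define _ITERATOR_DEBUG_LEVEL 0

    #include <vector>

    void hot_path(std::vector<int>& v) {
        for (int i = 0, n = static_cast<int>(v.size()); i < n; ++i)
            v[i] += 1;   // no checked-iterator overhead even in a debug build
    }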