On 29.04.16 03.02, Lynn McGuire wrote:
> Is there anything comparing the efficiency of a std::vector with a
> std::map ? I am using a std::vector in many circumstances with up to
> 26,000 entries. Unfortunately, I have very little way of predicting the
> ultimate size of the vector so I am using push_back () to add more
> entries instead of resize ().
You are fine. The standard requires push_back() to run in amortized
constant time, so filling a vector with n push_back() calls costs O(n)
in total; implementations achieve this by growing the buffer
geometrically rather than by single increments.
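You can watch this yourself. A minimal sketch (the exact growth factor
is implementation-defined, commonly 1.5 or 2):

    #include <cstddef>
    #include <cstdio>
    #include <vector>

    int main()
    {
        std::vector<int> v;
        std::size_t last = 0;
        for (int i = 0; i < 26000; ++i) {
            v.push_back(i);
            if (v.capacity() != last) {   // a reallocation happened
                last = v.capacity();
                std::printf("size %zu -> capacity %zu\n", v.size(), last);
            }
        }
    }

Only a few dozen reallocations happen for 26,000 elements, not 26,000.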
> I am having some performance issues and am suspecting the std::vector
> growing so much by a single increments is really slowing my code down.
Use a profiler to track down your performance issues. They are probably
caused by something else.
If you can approximately predict an upper bound for the size of an
individual vector, reserve() is fine. Otherwise it can be
counterproductive: allocating far too much memory degrades cache
efficiency, besides wasting resources, of course.
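Something like this, where the estimate is a hypothetical bound known
from the context of your data:

    #include <cstddef>
    #include <vector>

    // n_estimate: hypothetical rough upper bound known up front
    std::vector<int> load(std::size_t n_estimate)
    {
        std::vector<int> v;
        v.reserve(n_estimate);    // one allocation instead of many regrowths
        for (std::size_t i = 0; i < n_estimate; ++i)
            v.push_back(static_cast<int>(i));  // stand-in for real input
        return v;
    }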
Note that the situation changes significantly if you use insert()
somewhere in the middle instead. Each insert has to shift every element
behind the insertion point, which is O(n) per call and therefore O(n²)
for n calls in general, regardless of reserve().
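The typical case is keeping the vector sorted, e.g. (sketch):

    #include <algorithm>
    #include <vector>

    // Keep v sorted while inserting: every insert() shifts the tail,
    // so n such calls cost O(n^2) no matter how much was reserve()d.
    void insert_sorted(std::vector<int>& v, int x)
    {
        v.insert(std::lower_bound(v.begin(), v.end(), x), x);
    }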
This is the point where std::map comes into play. But do not expect too
much. std::map is typically implemented as a red-black tree, which is
quite inefficient for many small objects because of the per-node
allocation overhead. That overhead is the price for keeping iterators
valid across insert and erase operations.
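That guarantee looks like this; each insertion is O(log n) and leaves
existing iterators intact (a vector would invalidate them on
reallocation):

    #include <cassert>
    #include <map>
    #include <string>

    int main()
    {
        std::map<int, std::string> m;
        auto it = m.emplace(42, "kept").first;  // O(log n) insertion
        m.emplace(7, "other");                  // does not invalidate 'it'
        m.erase(7);                             // neither does erasing others
        assert(it->second == "kept");
    }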
If that iterator stability is not required, B-trees with reasonable node
sizes (63 or 127) outperform std::map by far. Unfortunately there is no
B-tree in the std:: namespace.
Marcel