
efficiency of std::vector versus std::map


Lynn McGuire

Apr 28, 2016, 9:02:58 PM
Is there anything comparing the efficiency of a std::vector with a std::map? I am using a std::vector in many circumstances with up
to 26,000 entries. Unfortunately, I have very little way of predicting the ultimate size of the vector, so I am using push_back() to
add more entries instead of resize().

I am having some performance issues and suspect that the std::vector growing by single increments is really slowing my
code down.

Or, should I be using reserve() to increase the capacity of my std::vector in increments of 10,000 instead of increments of 1?
http://en.cppreference.com/w/cpp/container/vector/reserve
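
Roughly what I have in mind, as a sketch (the double element type and
the 10,000 chunk size are just placeholders):

#include <vector>

// current approach: let push_back() manage the capacity on its own
void fill_plain(std::vector<double>& v, int n) {
    for (int i = 0; i < n; ++i)
        v.push_back(i);
}

// alternative I am considering: grow the capacity in chunks of 10,000
void fill_chunked(std::vector<double>& v, int n) {
    for (int i = 0; i < n; ++i) {
        if (v.size() == v.capacity())
            v.reserve(v.capacity() + 10000);
        v.push_back(i);
    }
}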

Thanks,
Lynn

Daniel

Apr 28, 2016, 9:49:55 PM
On Thursday, April 28, 2016 at 9:02:58 PM UTC-4, Lynn McGuire wrote:
> Is there anything comparing the efficiency of a std::vector with a
> std::map ? I am using a std::vector in many circumstances with up
> to 26,000 entries. Unfortunately, I have very little way of predicting
> the ultimate size of the vector so I am using push_back () to
> add more entries instead of resize ().
>

> Or, should I be using reserve () to increase the size of my std::vector in
> increments of 10,000 instead of increments of 1 ?

You could always reserve 26,000 entries and shrink_to_fit if that turns out
to be excessive.
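
Something along these lines, as a sketch (Entry stands in for whatever
you are storing, and 26,000 is just the upper estimate you mentioned):

#include <cstddef>
#include <vector>

struct Entry { double value; };

void fill(std::vector<Entry>& v, std::size_t n) {
    v.reserve(26000);   // one allocation up front, no reallocations while filling
    for (std::size_t i = 0; i < n; ++i)
        v.push_back(Entry{static_cast<double>(i)});
    v.shrink_to_fit();  // non-binding request to drop the unused capacity
}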

Also, make sure that the things that you're adding to the vector support
move semantics.

Perhaps try std::deque, but 26,000 doesn't sound to me like a lot of
entries. Make sure that the time to reallocate isn't spent doing copies
rather than moves.
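
For example, a sketch with Entry as a stand-in for your element type;
this checks at compile time that reallocation will move rather than copy:

#include <string>
#include <type_traits>

struct Entry {
    std::string name;
    double value;
    // the implicitly generated move constructor is noexcept here, so
    // std::vector will move these elements when it reallocates
};

static_assert(std::is_nothrow_move_constructible<Entry>::value,
              "vector reallocation would fall back to copying");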

Daniel

Paavo Helde

Apr 29, 2016, 1:32:06 AM
On 29.04.2016 4:02, Lynn McGuire wrote:
> Is there anything comparing the efficiency of a std::vector with a
> std::map ? I am using a std::vector in many circumstances with up to
> 26,000 entries. Unfortunately, I have very little way of predicting the
> ultimate size of the vector so I am using push_back () to add more
> entries instead of resize ().
>
> I am having some performance issues and am suspecting the std::vector
> growing so much by a single increments is really slowing my code down.

std::vector::push_back() does not grow the storage in single increments,
otherwise it could not have the amortized O(1) complexity required by the standard.
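
You can see this by printing capacity() whenever it changes, e.g. with a
quick sketch like:

#include <cstddef>
#include <cstdio>
#include <vector>

int main() {
    std::vector<int> v;
    std::size_t cap = v.capacity();
    for (int i = 0; i < 26000; ++i) {
        v.push_back(i);
        if (v.capacity() != cap) {   // a reallocation happened
            cap = v.capacity();
            std::printf("size %zu -> capacity %zu\n", v.size(), cap);
        }
    }
}

This typically prints only a couple of dozen lines, because the capacity
grows by a constant factor (implementation-defined, commonly 1.5x or 2x),
not by one element.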

OTOH, insertion and lookup in a map are O(log N), which is much worse.

26,000 is a very small number.

IOW, do not try to optimize prematurely. If your program is too slow,
profile it first to find out bottlenecks.

HTH
Paavo




Marcel Mueller

Apr 29, 2016, 2:06:32 AM
On 29.04.16 03.02, Lynn McGuire wrote:
> Is there anything comparing the efficiency of a std::vector with a
> std::map ? I am using a std::vector in many circumstances with up to
> 26,000 entries. Unfortunately, I have very little way of predicting the
> ultimate size of the vector so I am using push_back () to add more
> entries instead of resize ().

You are fine. Using push_back() in a loop with n iterations costs O(n) in
total, because push_back() is amortized O(1).

> I am having some performance issues and am suspecting the std::vector
> growing so much by a single increments is really slowing my code down.

Use a profiler to track down your performance issues. They are probably
caused by something else.

> Or, should I be using reserve () to increase the size of my std::vector
> in increments of 10,000 instead of increments of 1 ?
> http://en.cppreference.com/w/cpp/container/vector/reserve

If you can approximately predict the upper bound of the size for an
individual vector, reserve() is fine. Otherwise it might be
counterproductive. Allocating too much memory can significantly degrade
cache efficiency, besides the obvious waste of resources.

Note that the situation changes significantly if you use insert()
instead. That leads to O(n²) in general, regardless of reserve() (see the
sketch below). This is the point where std::map comes into play. But do
not expect too much. std::map uses a red-black tree, which is quite
inefficient for many small objects because of the per-node allocation
overhead. That is the price of keeping iterators valid across insert and
erase operations.
If the latter is not required, B-trees with reasonable node sizes (63 or
127) outperform std::map by far. Unfortunately there is no B-tree in the
std:: namespace.
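
To illustrate the insert() point, a sketch of the pattern that degrades
to O(n²) next to the push_back() pattern that stays O(n):

#include <algorithm>
#include <vector>

// O(n^2): every insert at the front shifts all existing elements, and
// reserve() does not help because the shifting, not the reallocation,
// is what dominates.
void build_front(std::vector<int>& v, int n) {
    v.reserve(n);
    for (int i = 0; i < n; ++i)
        v.insert(v.begin(), i);
}

// O(n) amortized: append, then fix up the order afterwards if it matters.
void build_back(std::vector<int>& v, int n) {
    v.reserve(n);
    for (int i = 0; i < n; ++i)
        v.push_back(i);
    std::reverse(v.begin(), v.end());
}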


Marcel

Lynn McGuire

Apr 30, 2016, 11:18:17 PM
Thanks,
Lynn

