Find the highest bit set in your integer, divide its bit length by 7,
and round up (i.e. (bits + 6) / 7) -- that is the number of bytes it
takes to encode the varint. On x86 there is a bsr instruction which
computes ilog2; with gcc you can get the bit length directly as
32 - __builtin_clz(var), which is ilog2(var) + 1.
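
For example, a minimal sketch in C (assuming a 32-bit unsigned value
and the usual 7-bits-per-byte encoding; varint_size is just a made-up
name):

    #include <stdint.h>

    static unsigned varint_size(uint32_t x)
    {
        if (x == 0)
            return 1;  /* __builtin_clz(0) is undefined; zero still
                          takes one byte on the wire */
        unsigned bits = 32 - __builtin_clz(x);  /* bit length = ilog2(x) + 1 */
        return (bits + 6) / 7;  /* round up to whole 7-bit groups */
    }

So e.g. varint_size(127) == 1 and varint_size(128) == 2, since 128
needs 8 bits and therefore two 7-bit groups.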
-ilia