Added Dain's pure-java Snappy codec, should run tests again


tsaloranta

Oct 26, 2011, 6:51:35 PM
to jvm-compressor-benchmark
I added Dain's nice pure-java Snappy codec (see https://github.com/dain/snappy)
a while ago.
It seems to perform very nicely, close to its JNI-using buddy. So it
would be nice to run that test.
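
For anyone who wants to try the two side by side outside the benchmark,
here is a minimal round-trip sketch. It is only an illustration, and the
method names are my best recollection of the published org.iq80.snappy
and org.xerial.snappy APIs, so treat them as assumptions:

    import java.util.Arrays;

    // Minimal sketch: round-trip the same input through Dain's pure-Java codec
    // (org.iq80.snappy) and the JNI-backed snappy-java (org.xerial.snappy).
    // Method names are assumed from the libraries' published APIs.
    public class SnappyRoundTrip {
        public static void main(String[] args) throws Exception {
            byte[] input = ("some reasonably compressible sample text, "
                    + "repeated a few times, repeated a few times.").getBytes("UTF-8");

            // Dain's pure-Java codec
            byte[] pure = org.iq80.snappy.Snappy.compress(input);
            byte[] back1 = org.iq80.snappy.Snappy.uncompress(pure, 0, pure.length);

            // JNI-backed snappy-java
            byte[] jni = org.xerial.snappy.Snappy.compress(input);
            byte[] back2 = org.xerial.snappy.Snappy.uncompress(jni);

            System.out.println("pure-java: " + pure.length
                    + " bytes, JNI: " + jni.length + " bytes");
            System.out.println("round-trips ok: "
                    + (Arrays.equals(input, back1) && Arrays.equals(input, back2)));
        }
    }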

But I was wondering if anyone has good ideas on how to automate
running these tests. They take quite a while, and I don't have
spare boxes at home. I oftentimes let some of the tests run overnight,
but that makes the end-to-end turnaround rather long.

-+ Tatu +-

ps. Ning-lzf 0.9 is coming soon too, and it'll challenge Dain's snappy
java again... nice performance improvements all around!

Taro L. Saito

Oct 26, 2011, 9:16:32 PM
to jvm-compress...@googlegroups.com
Hi,

Thanks for the information. I'm impressed with the performance of the
pure-java version.
It looks like my snappy-java has some overhead when calling JNI methods, and
its stream-based compression needs some improvement.

Test automation can be done if you set up a Jenkins server and write
some scripts to run the benchmark and upload the test results.
But I guess the real problem is where to run the tests.

I think the default settings of the jvm-compressor-benchmark (e.g.,
the number of iterations, warm-up time, etc.)
are a little bit heavier than necessary.

My question is: would the results be much different if you reduced
these numbers?
If the tests could be run in a shorter time, updating the results would
be easier for you.
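
For reference, here is a rough sketch of the kind of warm-up/measurement
split I mean. This is not the benchmark's actual harness, the round
counts are made up, and the Snappy call is assumed from the
org.iq80.snappy API; the point is only that if warm-up rounds are cut
too aggressively, the timed rounds may still be measuring interpreted or
partially JIT-compiled code:

    public class WarmupSketch {
        // Hypothetical, reduced counts; the benchmark's real defaults are larger.
        static final int WARMUP_ROUNDS = 20;
        static final int MEASURED_ROUNDS = 50;

        public static void main(String[] args) {
            byte[] input = new byte[1 << 20];
            java.util.Arrays.fill(input, (byte) 'a');  // trivially compressible payload

            // Warm-up passes: give the JIT a chance to compile the hot path first.
            for (int i = 0; i < WARMUP_ROUNDS; i++) {
                org.iq80.snappy.Snappy.compress(input);
            }

            // Measured passes: only these are timed.
            long start = System.nanoTime();
            for (int i = 0; i < MEASURED_ROUNDS; i++) {
                org.iq80.snappy.Snappy.compress(input);
            }
            long elapsed = System.nanoTime() - start;
            System.out.printf("avg %.2f ms per compress%n",
                    elapsed / 1e6 / MEASURED_ROUNDS);
        }
    }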

Regards,
--
Taro L. Saito
<l...@xerial.org>
University of Tokyo
http://www.xerial.org/leo
Tel. +81-47-136-4065 (64065)

Tatu Saloranta

Oct 27, 2011, 12:23:25 AM
to jvm-compress...@googlegroups.com
On Wed, Oct 26, 2011 at 6:16 PM, Taro L. Saito <l...@cb.k.u-tokyo.ac.jp> wrote:
> Hi,
>
> Thanks for the information. I'm impressed with the performance of the
> pure-java version.
> It looks like my snappy-java has some overhead when calling JNI methods, and
> its stream-based compression needs some improvement.

Ok, good, a little bit of competition can help drive improvements. :)

> Automation of testing can be done if you set up some Jenkins server, and write
> some scripts to run the bench and upload the test results.
> But I guess the real problem is where to run the tests.

Right. I actually have some ideas on that, but it may take a couple of
weeks to work out the details. If that pans out, a Jenkins setup would
be useful.

> I think the default settings of the jvm-compressor-benchmark (e.g.,
> the number of iterations, warm-up time, etc.)
> are a little bit heavier than necessary.
>
> My question is: would the results be much different if you reduced
> these numbers?
> If the tests could be run in a shorter time, updating the results would
> be easier for you.

They could probably be reduced, but I suspect there is a limit to
this, meaning that even with reductions the total runtime for the
existing test sets probably can't be squeezed below a couple of hours.
I know that many other performance benchmarks (such as jvm-serializers)
need hours to run as well.

-+ Tatu +-

Tatu Saloranta

Oct 31, 2011, 1:38:22 PM
to jvm-compress...@googlegroups.com
Quick note: I re-ran the tests with the latest versions I could find,
and the wiki page is now updated.
Let me know if and when new versions that (may) have significant
performance differences are released.

I will probably run a new set of tests in the relatively near future,
since I am planning to release lzf 0.9; but since that's quite an
effort, it is likely to happen in a couple of weeks. So I might as well
try to incorporate other upgrades, if there are any.

-+ Tatu +-
