Time measurement precision


Joao Américo

Aug 7, 2013, 2:26:03 PM
to junit-be...@googlegroups.com
Hello,

Is it possible to configure JUnitBenchmarks to have more precise time measurement?

Thanks in advance

Dawid Weiss

Aug 7, 2013, 3:48:02 PM
to junit-be...@googlegroups.com
Nope. If you need nanosecond precision, I'd say try JMH or Google's Caliper.

http://openjdk.java.net/projects/code-tools/jmh/
https://code.google.com/p/caliper/

Dawid
> --
> You received this message because you are subscribed to the Google Groups
> "JUnitBenchmarks: Performance Benchmarking for JUnit4" group.

Chun Yin Vincent Lau

Aug 10, 2013, 1:09:14 AM
to junit-be...@googlegroups.com
Thanks, I was having the same question.

So is this option not related?
com.carrotsearch.junitbenchmarks.Clock.NANO_TIME

What is the difference when switching this clock option? I'm not sure from the documentation.

Thanks,
Vincent

Dawid Weiss

Aug 10, 2013, 8:43:01 AM
to junit-be...@googlegroups.com
The difference is how the time is measured -- System.nanoTime vs.
System.currentTimeMillis -- not how it is reported (always millisecond
granularity).
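The distinction can be seen with a small stdlib-only sketch (class and variable names are illustrative): it measures the smallest observable increment of each clock source.

```java
public class ClockGranularity {
    public static void main(String[] args) {
        // Smallest observable increment of System.nanoTime().
        long t0 = System.nanoTime();
        long t1;
        do { t1 = System.nanoTime(); } while (t1 == t0);
        long nanoStepNs = t1 - t0;

        // Smallest observable increment of System.currentTimeMillis(), in nanoseconds.
        long m0 = System.currentTimeMillis();
        long m1;
        do { m1 = System.currentTimeMillis(); } while (m1 == m0);
        long millisStepNs = (m1 - m0) * 1_000_000L;

        System.out.println("nanoTime step (ns): " + nanoStepNs);
        System.out.println("currentTimeMillis step (ns): " + millisStepNs);
        System.out.println("nanoTime finer than 1 ms: " + (nanoStepNs < 1_000_000L));
    }
}
```

On typical JVMs nanoTime advances in steps of tens to hundreds of nanoseconds, while currentTimeMillis can only ever advance in whole milliseconds -- which is why a sub-millisecond benchmark reported at millisecond granularity rounds to zero.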

Try JMH, it probably has what you need.
http://openjdk.java.net/projects/code-tools/jmh/

Dawid
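For reference, the clock is selected per class or per method with @BenchmarkOptions. A sketch, assuming the junit-benchmarks 0.7.x API (not runnable without that dependency on the classpath; the benchmark body and class name are illustrative):

```java
import com.carrotsearch.junitbenchmarks.AbstractBenchmark;
import com.carrotsearch.junitbenchmarks.BenchmarkOptions;
import com.carrotsearch.junitbenchmarks.Clock;
import org.junit.Test;

// Selects System.nanoTime() as the measurement source; as noted above,
// results are still reported with millisecond granularity.
@BenchmarkOptions(clock = Clock.NANO_TIME, warmupRounds = 5, benchmarkRounds = 20)
public class NanoClockBenchmark extends AbstractBenchmark {
    @Test
    public void timeParsing() {
        Integer.parseInt("12345");
    }
}
```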

Brice Yorfno

Nov 18, 2013, 10:57:23 AM
to junit-be...@googlegroups.com
Hello,
I am working with JUnitBenchmarks, using it to test small pieces of code that take about half a millisecond. I was frustrated when I saw the charts: all times were equal to 0. After some investigation, I found that the query against the DB (H2) calls ROUND(x, 2), so reported values are expressed in seconds with 10 ms precision. Thus, if you want to display values with millisecond precision, just change the second parameter of ROUND to 6 [ROUND(x, 6)] in the file method-chart-results.sql.

Question to the JUnitBenchmarks team: I think it would be very useful to let the user override the reported value precision via a @BenchmarkMethodChart(precision = 6) annotation. Would it be possible to get this feature into the next release?

Regards, Brice.
Attachment: method-chart-results.sql

Brice Yorfno

Nov 18, 2013, 11:05:12 AM
to junit-be...@googlegroups.com
ERRATUM

Hello,
I am working with JUnitBenchmarks, using it to test small pieces of code that take about half a millisecond. I was frustrated when I saw the charts: all times were equal to 0. After some investigation, I found that the query against the DB (H2) calls ROUND(x, 2), so reported values are expressed in seconds with 10 ms precision. Thus, if you want to display values with millisecond precision, just change the second parameter of ROUND to 3 [ROUND(x, 3)] in the file method-chart-results.sql, or to 6 if you want microsecond precision.
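The arithmetic behind the fix can be checked with a quick stdlib sketch (HALF_UP rounding is an assumption about H2's ROUND; names are illustrative): a 0.5 ms timing, stored in seconds, survives ROUND(x, 3) but collapses to zero under ROUND(x, 2).

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class ReportRounding {
    // Mirrors SQL ROUND(x, scale), assuming HALF_UP semantics.
    static double round(double seconds, int scale) {
        return new BigDecimal(Double.toString(seconds))
                .setScale(scale, RoundingMode.HALF_UP)
                .doubleValue();
    }

    public static void main(String[] args) {
        double halfMilli = 0.0005; // a 0.5 ms timing, expressed in seconds
        System.out.println("ROUND(x, 2): " + round(halfMilli, 2)); // 0.0   -> chart shows zero
        System.out.println("ROUND(x, 3): " + round(halfMilli, 3)); // 0.001 -> millisecond precision
        System.out.println("ROUND(x, 6): " + round(halfMilli, 6)); // 5.0E-4 -> microsecond precision
    }
}
```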
 
Question to the JUnitBenchmarks team: I think it would be very useful to let the user override the reported value precision via a @BenchmarkMethodChart(precision = 3) annotation. Would it be possible to get this feature into the next release?

Dawid Weiss

Nov 18, 2013, 1:57:45 PM
to junit-benchmarks
Hi Brice.

The problem is that JUnitBenchmarks development has been pretty
stagnant, so I don't know when the next release is going to be...
Besides, it's really not super-well suited for measuring
sub-millisecond execution times -- that's part of the reason why
certain reporting tools don't have any lower scale -- I just don't
think such short-running benchmarks can be measured accurately with
the JUnit infrastructure (and God knows what else in the same VM).

Perhaps you could take a look at JMH or Google Caliper -- these seem
to be better tailored to running isolated benchmarks under "clean" JVM
conditions.

If you'd still like this to be fixed, I promise I will check out your
pull requests on github :)

Dawid

Brice Yorfno

Nov 19, 2013, 7:57:11 AM
to junit-be...@googlegroups.com
Hi Dawid,

Thanks for your answer. Do you mean that JUnitBenchmarks does not run under such clean JVM conditions? Concerning my improvement request, I will implement it and submit a pull request to you.

Performance studies and tests are a big part of my job. Usually I write my own test framework because of the specifics of my tests. Today the tests I have to run are quite simple, so I was looking for a good benchmarking framework, and I think yours is a very good one (easy to use, nice reports, smart approach, maintainable, ...). Caliper doesn't seem to be well maintained, and JMH is very abstract to me.

Regarding time precision, I run my tests on a machine on which I control all activity (to minimize measurement noise) and run each test 10,000 times in order to cover most situations (hardware warm-up, ...).

Regards,

Dawid Weiss

Nov 19, 2013, 8:11:34 AM
to junit-benchmarks
> Thanks for your answer. Did you mean that JUnitBenchmark is not so clean
> regarding JVM conditions ?

Not everything can be done from the JUnit level -- for example, when
you run benchmarking tests and there is more than one test within the
class, the result may be "primed" depending on the order of executed
methods (which in general can be random on newer JDKs, where
reflection returns them in unpredictable order).

> I will implement it and submit my pull requests to you.

Sure, you're welcome to do so. Fork the project on github and hack to
your liking -- that's why it's there.

> reports, smart approach, maintainability, ...). Caliper doesn't seem to be
> well maintained and JMH is very abstract to me.

Fine-grained tests are very, very tricky. There are a lot of things
going on under JVM's hood. I think JMH is the best fit for these at
the moment, although I've used Caliper in the past too.

> Regarding the time precision, I run my tests on a machine on which I control
> the entire activity (minimize measurements noise) and run each tests 10000
> times in order to cover most of the situations (hardware warm up, ...).

I'm not even going into environment noise; I was only talking about
the "noise" introduced by various JVM settings, code execution
ordering resulting from how JUnit schedules tests, etc. For benchmarks
that last at least a few millis I don't think these will play a key
role, but at nano scale they very well may.

Dawid