Timing difference with linux command time


Riccardo Nigrelli

Nov 29, 2021, 4:20:41 PM
to benchmark-discuss
I wrote a benchmark with 10 iterations and 20 repetitions.
The execution is wrapped with the Unix command `time` to capture the peak memory usage.
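For reference, the setup is roughly like the sketch below; the benchmark body is only a placeholder for my real workload, and the exact flags I pass may differ.
```
#include <benchmark/benchmark.h>

// Placeholder for my real workload.
static void BM_Workload(benchmark::State& state) {
  for (auto _ : state) {
    // ... actual work goes here ...
    benchmark::DoNotOptimize(state.iterations());
  }
}

// 10 iterations per repetition, 20 repetitions, results in milliseconds.
BENCHMARK(BM_Workload)
    ->Iterations(10)
    ->Repetitions(20)
    ->Unit(benchmark::kMillisecond);

BENCHMARK_MAIN();

// Invoked roughly as:
//   /usr/bin/time -v ./my_benchmark --benchmark_format=json
```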

When the execution ends, I notice a substantial difference between the two reports.

This is the gbenchmark result:
```
"real_time": 1.1915376993454993e+03,
"cpu_time": 1.1907871268750000e+03,
"time_unit": "ms"
```

and this is the `time` result:
```
User time (seconds): 237.99
System time (seconds): 15.24
```

As you can see, there is a big difference. Why?
Could the difference be because `time` includes all 20 repetitions? If so, how can I compute the time for a single repetition? Just divide by 20?
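For what it's worth, if I divide the `time` totals by the 20 repetitions I get roughly (237.99 + 15.24) / 20 ≈ 12.7 s per repetition, and dividing again by the 10 iterations gives ≈ 1.27 s, which is at least in the same ballpark as the ~1.19 s that gbenchmark reports. Is that the right way to read the numbers?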