Should Pupil use CLOCK_MONOTONIC_RAW on Linux?

Rafael Picanço

Aug 19, 2015, 3:03:23 PM
to pupil-discuss
Hi Pupil Community,

Right now, Pupil uses two different methods to generate timestamps: one based on the camera's hardware clock and the other based on OS-provided functions (clock_gettime on Linux, for example). As it stands, only the OS-based clocks are available for all supported cameras.
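
For reference, this is roughly what an OS-based timestamp read looks like on Linux (a minimal sketch of my own, not Pupil's actual code):

    #include <time.h>

    /* Sketch: turn a clock_gettime() reading into a floating-point
       timestamp in seconds, the kind of per-frame value discussed here. */
    double os_timestamp(void) {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec + ts.tv_nsec * 1e-9;
    }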

From the clock_gettime docs:

       CLOCK_MONOTONIC
              Clock that cannot be set and represents monotonic time
              since some unspecified starting point.  This clock is
              not affected by discontinuous jumps in the system time
              (e.g., if the system administrator manually changes the
              clock), but is affected by the incremental adjustments
              performed by adjtime(3) and NTP.

       CLOCK_MONOTONIC_RAW (since Linux 2.6.28; Linux-specific)
              Similar to CLOCK_MONOTONIC, but provides access to a
              raw hardware-based time that is not subject to NTP
              adjustments or the incremental adjustments performed by
              adjtime(3).
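
In code the switch itself would be trivial, since both clocks are read through the same clock_gettime() interface and only the clock id differs (sketch):

    #include <time.h>

    /* Same interface as CLOCK_MONOTONIC; only the clock id changes. */
    double raw_timestamp(void) {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC_RAW, &ts);
        return ts.tv_sec + ts.tv_nsec * 1e-9;
    }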

The NTP docs, for instance, say:

5.1.3.2. How frequently will the System Clock be updated?

As time should be a continuous and steady stream, ntpd updates the clock in small quantities. However, to keep up with clock errors, such corrections have to be applied frequently. If adjtime() is used, ntpd will update the system clock every second. If ntp_adjtime() is available, the operating system can compensate clock errors automatically, requiring only infrequent updates. See also Section 5.2 and Q: 5.1.6.1..
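
As an aside, the correction that adjtime() is still slewing in can be read back on Linux without changing it, by passing a NULL delta. A small sketch (mine, assuming glibc and its _DEFAULT_SOURCE feature macro):

    #define _DEFAULT_SOURCE
    #include <stdio.h>
    #include <sys/time.h>

    int main(void) {
        struct timeval remaining;
        /* With delta == NULL, adjtime() only reports the adjustment
           that is still pending, without applying a new one. */
        if (adjtime(NULL, &remaining) == 0)
            printf("outstanding adjustment: %f s\n",
                   remaining.tv_sec + remaining.tv_usec / 1e6);
        return 0;
    }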

I am not sure how critical those adjustments are, i.e., how much they affect Pupil's timestamp accuracy.
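
One way to get a feel for the magnitude would be to sample both clocks across an interval and compare the elapsed time each reports; any divergence is exactly the correction that CLOCK_MONOTONIC_RAW avoids. A rough sketch (mine, not Pupil code):

    #include <stdio.h>
    #include <time.h>
    #include <unistd.h>

    static double to_sec(struct timespec ts) {
        return ts.tv_sec + ts.tv_nsec * 1e-9;
    }

    int main(void) {
        struct timespec m0, r0, m1, r1;
        clock_gettime(CLOCK_MONOTONIC, &m0);
        clock_gettime(CLOCK_MONOTONIC_RAW, &r0);

        sleep(60);  /* wait while ntpd may be adjusting the clock */

        clock_gettime(CLOCK_MONOTONIC, &m1);
        clock_gettime(CLOCK_MONOTONIC_RAW, &r1);

        /* The difference between the two elapsed times is the amount of
           NTP/adjtime correction applied during the interval. */
        double mono = to_sec(m1) - to_sec(m0);
        double raw  = to_sec(r1) - to_sec(r0);
        printf("MONOTONIC:     %.9f s\n", mono);
        printf("MONOTONIC_RAW: %.9f s\n", raw);
        printf("divergence:    %.9f s\n", mono - raw);
        return 0;
    }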

So, should Pupil use CLOCK_MONOTONIC_RAW instead?

ps.

For a summary of the differences between time-measurement concepts (precision, granularity, and accuracy), take a look at this answer:

  • Precision is the amount of information, i.e. the number of significant digits you report. (E.g. I am 2m, 1.8m, 1.83m, 1.8322m tall. All those measurements are accurate, but increasingly precise.)

  • Accuracy is the relation between the reported information and the truth. (E.g. "I'm 1.70m tall" is more precise than "1.8m", but not actually accurate.)

  • Granularity or resolution is about the smallest time interval that the timer can measure. For example, if you have 1ms granularity, there's little point reporting the result with nanosecond precision, since it cannot possibly be accurate to that level of precision. (The sketch after this list shows how to query it.)
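
On that note, the resolution of both clocks can be queried directly with clock_getres(); a quick sketch (mine, not Pupil code):

    #include <stdio.h>
    #include <time.h>

    int main(void) {
        struct timespec res;
        /* clock_getres() reports the resolution of the given clock;
           note the reported value can be finer than the real-world
           accuracy of the underlying hardware. */
        clock_getres(CLOCK_MONOTONIC, &res);
        printf("CLOCK_MONOTONIC resolution:     %ld ns\n", res.tv_nsec);
        clock_getres(CLOCK_MONOTONIC_RAW, &res);
        printf("CLOCK_MONOTONIC_RAW resolution: %ld ns\n", res.tv_nsec);
        return 0;
    }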


Moritz Kassner

Aug 25, 2015, 8:09:02 AM
to pupil-discuss
Hi,

I looked at this in detail when I implemented the timestamp sources. We are using CLOCK_MONOTONIC: it represents time most accurately of the available options, it is strictly monotonically increasing, and it does not jump, because its corrections are 'slewed' in.


The real problem is that there is no single good time source available on all three OSes; instead, we are forced to use different ones. This is why we have not made a hard convention for the actual time source.
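
For illustration, a per-OS shim ends up looking something like this (a rough sketch, not our actual implementation):

    #include <stdint.h>

    #if defined(_WIN32)
    #include <windows.h>
    double monotonic_seconds(void) {
        LARGE_INTEGER freq, now;
        QueryPerformanceFrequency(&freq);   /* ticks per second */
        QueryPerformanceCounter(&now);      /* current tick count */
        return (double)now.QuadPart / (double)freq.QuadPart;
    }
    #elif defined(__APPLE__)
    #include <mach/mach_time.h>
    double monotonic_seconds(void) {
        static mach_timebase_info_data_t tb;
        if (tb.denom == 0)
            mach_timebase_info(&tb);        /* ticks -> nanoseconds */
        return (double)mach_absolute_time() * tb.numer / tb.denom / 1e9;
    }
    #else  /* Linux and other POSIX */
    #include <time.h>
    double monotonic_seconds(void) {
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return ts.tv_sec + ts.tv_nsec * 1e-9;
    }
    #endif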