Accurate Timer

Shawana Kallhoff

Aug 3, 2024, 1:59:36 PM
to camppesttradquad

Because you are using setTimeout() or setInterval(), which cannot be trusted: there are no accuracy guarantees for them. They are allowed to lag arbitrarily, and they do not keep a constant pace but tend to drift (as you have observed).

However, sometimes you really need a steady interval executing your callbacks without drifting. This requires a bit more advanced strategy (and code), though it pays off well (and registers fewer timeouts). These are known as self-adjusting timers. Here, the exact delay for each of the repeated timeouts is adapted to the actually elapsed time, compared to the expected interval:
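The answer's original snippet isn't reproduced in this thread, but a minimal sketch of such a self-adjusting timer looks like this (the 1000 ms period and the name step are illustrative):

    // A self-adjusting timer: each timeout's delay is corrected by the
    // drift measured against the expected tick time, so errors don't accumulate.
    const interval = 1000; // desired tick period in ms
    let expected = Date.now() + interval;

    function step() {
      const drift = Date.now() - expected; // how late this tick actually fired
      // ... do what needs to be done on each tick ...
      expected += interval;
      setTimeout(step, Math.max(0, interval - drift)); // shorten the next delay by the drift
    }

    setTimeout(step, interval);

The key point is that expected advances by exactly one interval per tick, so each measured drift is paid back immediately instead of accumulating.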

I'm just going to build on Bergi's answer (specifically the second part) a little bit, because I really liked the way it was done, but I want the option to stop the timer once it starts (like clearInterval(), almost). Sooo... I've wrapped it up into a constructor function so we can do 'objecty' things with it.
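The wrapped-up version isn't included above either; a sketch of that kind of constructor, with illustrative names, could look like:

    // Wraps the self-adjusting loop in a constructor so it can be started
    // and stopped, much like setInterval()/clearInterval().
    function AdjustingInterval(workFunc, interval) {
      let expected;
      let timeout;

      this.start = () => {
        expected = Date.now() + interval;
        timeout = setTimeout(step, interval);
      };

      this.stop = () => clearTimeout(timeout);

      function step() {
        const drift = Date.now() - expected;
        workFunc();
        expected += interval;
        timeout = setTimeout(step, Math.max(0, interval - drift));
      }
    }

Usage: const ticker = new AdjustingInterval(() => console.log('tick'), 1000); ticker.start(); and later ticker.stop();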

The snippet also includes a solution for your problem. So instead of incrementing a seconds variable every 1000 ms, we just start the timer and then, every 100 ms, read the elapsed time from the timer and update the view accordingly.
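In sketch form (the 'display' element is a stand-in for whatever view you're updating; it's not from the original post):

    // Poll frequently and derive the displayed seconds from elapsed time,
    // instead of counting intervals.
    const start = Date.now();

    setInterval(() => {
      const elapsedSeconds = Math.floor((Date.now() - start) / 1000);
      document.getElementById('display').textContent = elapsedSeconds;
    }, 100); // the 100 ms poll rate only affects display latency, not accuracy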

Most of the timers in the answers here will lag behind the expected time because they set the "expected" value to the ideal and only account for the delay that the browser introduced before that point. This is fine if you just need accurate intervals, but if you are timing relative to other events, you will (nearly) always have this delay.

To correct it, you can keep track of the drift history and use it to predict future drift. By adding a secondary adjustment with this preemptive correction, the variance in the drift centers around the target time. For example, if you're always getting a drift of 20 to 40ms, this adjustment would shift it to -10 to +10ms around the target time.
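A sketch of that idea, assuming a ten-sample history window (the window size is an arbitrary choice):

    const interval = 1000;
    const lagHistory = [];     // browser-introduced lag observed on recent ticks
    let appliedCorrection = 0;
    let expected = Date.now() + interval;

    function step() {
      const drift = Date.now() - expected;        // offset from the ideal tick time
      lagHistory.push(drift + appliedCorrection); // recover the raw lag the browser added
      if (lagHistory.length > 10) lagHistory.shift();
      appliedCorrection = lagHistory.reduce((a, b) => a + b, 0) / lagHistory.length;
      // ... do what needs to be done on each tick ...
      expected += interval;
      // subtract both the measured drift and the predicted future lag
      setTimeout(step, Math.max(0, interval - drift - appliedCorrection));
    }

    setTimeout(step, interval);

Because each tick is deliberately scheduled appliedCorrection early, the history stores drift plus the correction that was applied, which is exactly the raw lag the browser added that tick.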

I agree with Bergi on using Date, but his solution was a bit of overkill for my use. I simply wanted my animated clock (digital and analog SVGs) to update on the second without overrunning or underrunning, which creates obvious jumps in the clock updates. Here is the snippet of code I put in my clock update functions:
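A sketch of the core of that approach:

    // Schedule each update for the top of the next second, so the clock
    // never visibly over- or under-runs.
    function updateClock() {
      // ... redraw the digital and analog SVG clocks here ...
      setTimeout(updateClock, 1000 - (Date.now() % 1000));
    }
    updateClock();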

Self-corrects the setTimeout, can run it X number of times (-1 for infinite), can start running instantly, and has a counter if you ever need to see how many times func() has been run. Comes in handy.
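A sketch of such a repeater (all names here are illustrative, not the original code):

    function AccurateRepeater(func, delay, repeat /* -1 for infinite */, runNow) {
      let count = 0;
      let expected;
      let handle;

      this.getCount = () => count;

      this.start = () => {
        if (runNow) { func(); count++; }
        expected = Date.now() + delay;
        if (repeat === -1 || count < repeat) handle = setTimeout(step, delay);
      };

      this.stop = () => clearTimeout(handle);

      function step() {
        const drift = Date.now() - expected;
        func();
        count++;
        if (repeat === -1 || count < repeat) {
          expected += delay;
          handle = setTimeout(step, Math.max(0, delay - drift));
        }
      }
    }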

Edit: Note that this doesn't do any input checking (like whether delay and repeat are the correct type). And you'd probably want to add some kind of getter/setter if you want to read the count or change the repeat value.

Many of the answers here are great, but their code examples are typically pages and pages of code (the good ones even include instructions on the best way to copy/paste it all). I just wanted to understand this problem with a very simple example.
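Here is about the simplest version I can think of: aim every tick at an absolute target time derived from a fixed start, instead of scheduling relative to "now".

    const start = Date.now();
    let tick = 0;

    function loop() {
      tick++;
      console.log(`tick ${tick}, off by ${Date.now() - (start + tick * 1000)} ms`);
      // Aim the next timeout at an absolute target time, so errors never accumulate.
      setTimeout(loop, start + (tick + 1) * 1000 - Date.now());
    }

    setTimeout(loop, 1000);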

Based on it, I was able to create a fully self-correcting React interval hook, which also handles the case where a timer needs to catch up after a long period of inactivity, so I'm leaving a link in case someone finds it useful.

I have also written a class which is accurate to 1 ms. I took Hans Passant's code from the forum thread
-US/6cd5d9e3-e01a-49c4-9976-6c6a2f16ad57/1-millisecond-timer
and wrapped it in a class for ease of use in your Form. You can easily set up multiple timers if you want. In the example code below I have used two timers. I have tested it and it works OK.
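The class itself isn't reproduced here. Hans Passant's snippet is built on the Windows multimedia timer API in winmm.dll; a stripped-down sketch of a wrapper over that API (the class and member names are mine, not from the original post) might look like:

    using System;
    using System.Runtime.InteropServices;

    public sealed class MultimediaTimer : IDisposable
    {
        private delegate void TimeProc(uint id, uint msg, IntPtr user, IntPtr dw1, IntPtr dw2);

        [DllImport("winmm.dll")]
        private static extern uint timeSetEvent(uint delayMs, uint resolutionMs,
                                                TimeProc callback, IntPtr user, uint eventType);

        [DllImport("winmm.dll")]
        private static extern uint timeKillEvent(uint timerId);

        private const uint TIME_PERIODIC = 1;
        private readonly TimeProc callback; // held in a field so the GC can't collect the delegate
        private uint timerId;

        public event Action Tick;

        public MultimediaTimer()
        {
            callback = (id, msg, user, dw1, dw2) => Tick?.Invoke();
        }

        public void Start(uint intervalMs)
        {
            // Note: the callback fires on a worker thread; marshal back to the
            // UI thread (e.g. Control.BeginInvoke) before touching Form controls.
            timerId = timeSetEvent(intervalMs, 0, callback, IntPtr.Zero, TIME_PERIODIC);
        }

        public void Dispose()
        {
            if (timerId != 0) { timeKillEvent(timerId); timerId = 0; }
        }
    }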

I think the other answers are failing to address why there's 14 ms of slew in each iteration of the OP's code. It's not because of an imprecise system clock (and DateTime.Now is not inaccurate, unless you've turned off NTP services or have the wrong time zone set or something silly! It's only imprecise).

Even with an imprecise system clock (making use of DateTime.Now, or having a solar cell hooked up to an ADC to tell how high the sun is in the sky, or dividing the time between peak tides, or ...), code following this pattern will have an average of zero slew (it will be perfectly accurate with exactly one second between ticks on average):
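A sketch of that pattern (the nextTick name matches the discussion below):

    using System;
    using System.Threading;

    class ZeroSlewTimer
    {
        static void Main()
        {
            var interval = TimeSpan.FromSeconds(1);
            var nextTick = DateTime.Now + interval;

            while (true)
            {
                // Sleep until the absolute target time, re-checking because
                // Sleep may return a little early or late.
                while (DateTime.Now < nextTick)
                {
                    var remaining = nextTick - DateTime.Now;
                    if (remaining > TimeSpan.Zero) Thread.Sleep(remaining);
                }

                // ... tick code goes here ...

                // Advance from the previous target, never from "now": any
                // lateness on this tick is absorbed by a shorter next wait.
                nextTick += interval;
            }
        }
    }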

(If you're copying-and-pasting this, watch out for cases where your tick code takes longer than interval to execute. I'll leave it as an exercise for the reader to find the easy ways to make this skip as many beats as it takes for nextTick to land in the future)

I'm guessing that Microsoft's implementation of System.Threading.Timer follows this kind of pattern instead. This pattern will always have slew even with a perfectly precise and perfectly accurate system timer (because it takes time to execute even just the add operation):
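For contrast, a sketch of the slew-prone version differs by a single line:

    using System;
    using System.Threading;

    class SlewedTimer
    {
        static void Main()
        {
            var interval = TimeSpan.FromSeconds(1);
            var nextTick = DateTime.Now + interval;

            while (true)
            {
                while (DateTime.Now < nextTick)
                {
                    var remaining = nextTick - DateTime.Now;
                    if (remaining > TimeSpan.Zero) Thread.Sleep(remaining);
                }

                // ... tick code goes here ...

                // Recomputing from "now" bakes this iteration's lateness (and
                // even the cost of this add) into every subsequent tick.
                nextTick = DateTime.Now + interval;
            }
        }
    }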

As other posters have said, the Stopwatch class gives great precision for time measurement, but doesn't help at all with accuracy if the wrong pattern is followed. But, as @Shahar said, it's not like you're ever going to get a perfectly precise timer to begin with, so you need to rethink things if perfect precision is what you're after.

Note that Microsoft doesn't say much about the internals of the System.Threading.Timer class, so I'm making an educated guess here, but if it quacks like a duck then it's probably a duck. Also, I realize this is several years old, but it's still a relevant (and I think unanswered) question.

Some years later, but here is what I came up with. It aligns itself and is usually accurate to well under 1 ms. In a nutshell, it starts with a low-CPU-cost Task.Delay and then moves up to a spin-wait; it is usually accurate to about 50 µs (0.05 ms).
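The full class isn't shown here, but the coarse-then-fine idea can be sketched like this (the 16 ms margin is my assumption, roughly one Windows scheduler quantum):

    using System;
    using System.Diagnostics;
    using System.Threading;
    using System.Threading.Tasks;

    static class PreciseWait
    {
        // Waits out most of the interval cheaply with Task.Delay, then
        // spin-waits the final stretch for sub-millisecond precision.
        public static async Task Delay(TimeSpan interval)
        {
            var sw = Stopwatch.StartNew();
            var coarse = interval - TimeSpan.FromMilliseconds(16);
            if (coarse > TimeSpan.Zero)
                await Task.Delay(coarse); // low CPU, but only ~15 ms granularity

            while (sw.Elapsed < interval)
                Thread.SpinWait(50);      // burns CPU, but only briefly
        }
    }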

Desktop operating systems (such as Windows) are not real-time operating systems, which means you can't expect full accuracy and you can't force the scheduler to trigger your code at the exact millisecond you want. This is especially true in a .NET application, which is non-deterministic: the GC can start collecting at any time, a JIT compilation might run a bit slower or faster, and so on.

Edit: Environment.TickCount is also based on the system timer and may have the same accuracy issues as DateTime.Now. I'd advise choosing Stopwatch instead, as many other answerers have mentioned.

I have found many libraries, but I really don't know which would be the best solution for my case. I'd probably need a non-blocking timer: is it possible to do something else while the timer is counting?

What I am trying to do is blink LED 13 on the two Yuns, one board after the other. After the offset, which is used to synchronize the two boards, I want them to blink the LED every 20 ms, one right after the other.

Now, for the sake of simplicity, I am trying to blink LED 13 on two Arduino Yuns at the same instant. They read an offset value which tells them how long they must wait before turning the LED high.
I checked the offset values and they seem reasonable, but I am still getting a precision of about 50 ms, sometimes even worse. I need to be more accurate: no more than 5 ms.

mridolfi:
Now, for the sake of simplicity, I am trying to blink LED 13 on two Arduino Yuns at the same instant. They read an offset value which tells them how long they must wait before turning the LED high.
I checked the offset values and they seem reasonable, but I am still getting a precision of about 50 ms, sometimes even worse. I need to be more accurate: no more than 5 ms.

It sounds like you have two issues here - the accuracy of the blinking rate, and the accuracy of the synchronization between the two Yuns. While they both involve timing, they are very different situations, with different solutions, and they need to be handled differently.

The easy part is the accuracy of the initial delay timing and the rate of blinking. You should be able to get reasonably accurate timing using the micros() function, which has a native resolution of about four microseconds. When it comes to the loop that actually flashes the LED, you will get better stability by computing the new threshold value by adding your delay to the previous target value, rather than simply resetting it to the current micros() value, as in the sketch below.
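A minimal non-drifting blink sketch along those lines (pin 13 and the 20 ms period come from the question; the signed-subtraction comparison is the usual rollover-safe idiom):

    const unsigned long INTERVAL_US = 20000UL; // 20 ms, in microseconds
    unsigned long nextToggle;
    int ledState = LOW;

    void setup() {
      pinMode(13, OUTPUT);
      nextToggle = micros() + INTERVAL_US;
    }

    void loop() {
      // Signed subtraction keeps the comparison valid across micros() rollover.
      if ((long)(micros() - nextToggle) >= 0) {
        ledState = (ledState == LOW) ? HIGH : LOW;
        digitalWrite(13, ledState);
        nextToggle += INTERVAL_US; // add to the previous target; don't reset to micros()
      }
    }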

There will always be little timing errors caused by not getting to that piece of code in time (perhaps you were busy doing something else, or an interrupt came in and delayed the code a few microseconds.) In your original code, those errors will accumulate and the flashes will tend to drift over time. In the modified code, any timing error will correct itself: there may be some jitter in the pulses, but they will average out over time to be the correct timing (within the accuracy of the crystal oscillator.)

If you were running a single Yun, using micros() should give you enough resolution, and the modified code should give you enough accuracy. But now you are trying to synchronize the operation of two Yuns so that they trigger at the same time. (You've mentioned an accuracy of 5 ms; I'm guessing that is the desired maximum time difference between the two of them?) There are two issues here: the synchronization of the initial timing, and the relative flash speeds of the two Yuns. The second issue is helped by using the non-drifting code, but they will still drift relative to each other because of the crystal clock tolerances.

The real trick is the initial synchronization between the two: this is a difficult one to solve when all you have is a network connection between the two Yuns. You want them to be within 5 ms of each other, but any network traffic will have timing variations in transmission time that will vary by more than that. So even if you kept a stable timing reference on the master unit, and sent that time to a slave unit, the time it takes to send that message will vary by more than your target 5 ms. If you had a hard-wired signal between the two units, it would be much easier to synchronize them.

I've had a similar situation in the past where I needed to synchronize the clocks between two units that had to operate some distance apart from each other. We ended up using a temperature-controlled rubidium-standard oscillator (thousands of dollars) in each unit, because a crystal oscillator just wasn't stable enough. That kept them from drifting apart from each other, but it didn't help the synchronization. We tried using a radio signal between them to synchronize them, but in the end we just brought them next to each other, used a wired connection between them to synchronize them, and then did our testing within the next hour or two before they drifted too far apart. (One unit was on the ground, another in a fighter jet, and we ignored the relativistic time differences caused by them moving at different speeds.)
