RE: [fltk.general] Accuracy of repeat_timeout on Windows vs Linux - [General Use]


MacArthur, Ian (Leonardo, UK)

unread,
Mar 26, 2018, 5:03:10 AM3/26/18
to fltkg...@googlegroups.com

> Ok, thanks, that's a bit what i expected.
> You're probably talking about the Win32 timeGetDevCaps/timeBeginPeriod functions.
> https://msdn.microsoft.com/en-us/library/windows/desktop/dd743626(v=vs.85).aspx
> I'm currently trying to use these, but with no effect. :(
> (Setting it once to 1 ms before entering fltk main loop.)


Right: I knew I'd used this before, so went looking for it - here's some stuff cut out of a bigger project that does this.

This code runs on Win7 (usually) and this timer init is run *very soon* after the app starts, and before I create any fltk elements.

With this done, I can get *fairly* close to 1ms ticks pretty consistently, across a range of machines. It certainly does much better than 15.5ms resolution! 20ms should be easily obtainable with fair repeatability.

Doing it this way means that the fltk "built in" timers get the improved resolution.

If you want "even better" timing accuracy on WIN32 you need to use the various "high resolution" or "multimedia" timers they provide, but doing so is a pain, and does not make the fltk timers any better...
I usually use timeSetEvent() for that, though MS appear to have deprecated that in favour of CreateTimerQueueTimer(), but frankly in my limited experience timeSetEvent() works better and is simpler to use!



Here's my "known working" code:

///////////////////////////////////////////////////////////////////////


static unsigned int TIMER_RES;

/********************************************************************************
*
* Name : TIMER_INIT
* Description : Initializes the Win32 timer system
* Parameters : None
* Returns : Success or failure flag
*
********************************************************************************/
ERR_CODE TIMER_INIT(void)
{
    ERR_CODE exit_status = SUCCESS; /* return code */
    TIMECAPS timecaps;              /* Holds the system timer capability structure */
    MMRESULT result;                /* Error code returned by time function */

    /* Initialise the timer resolution to the desired 1 millisecond */
    if (timeGetDevCaps(&timecaps, sizeof(TIMECAPS)) != TIMERR_NOERROR)
    {
        exit_status = TIMER_FAILURE;
        printf("TIMER_INIT : Unable to query timer capabilities FAIL!\n");
    }
    else
    {
        TIMER_RES = 1; /* 1 millisecond */

        /* Clamp the requested resolution to what the system supports */
        if (TIMER_RES < timecaps.wPeriodMin)
        {
            TIMER_RES = timecaps.wPeriodMin;
        }
        else if (TIMER_RES > timecaps.wPeriodMax)
        {
            TIMER_RES = timecaps.wPeriodMax;
        }

        result = timeBeginPeriod(TIMER_RES);
        if (result != TIMERR_NOERROR)
        {
            exit_status = TIMER_FAILURE;
            printf("TIMER_INIT : Unable to set %ums timer resolution FAIL!\n", TIMER_RES);
        }
    }
    return exit_status;
} /* TIMER_INIT */

/********************************************************************************
*
* Name : TIMER_QUIT
* Description : Deinitializes the Win32 timer system
* Parameters : None
* Returns : Success or failure flag
*
********************************************************************************/
ERR_CODE TIMER_QUIT(void)
{
    ERR_CODE exit_status = SUCCESS; /* return code */
    MMRESULT result;                /* Error code returned by function */

    result = timeEndPeriod(TIMER_RES);
    if (result != TIMERR_NOERROR)
    {
        exit_status = TIMER_FAILURE;
        printf("TIMER_QUIT : Unable to stop timer FAIL!\n");
    }
    return exit_status;
} /* TIMER_QUIT */


///////////////////////////////////////////




********************************************************************
This email and any attachments are confidential to the intended
recipient and may also be privileged. If you are not the intended
recipient please delete it from your system and notify the sender.
You should not copy it or use it for any purpose nor disclose or
distribute its contents to any other person.
********************************************************************

Albrecht Schlosser

unread,
Mar 26, 2018, 1:17:34 PM3/26/18
to fltkg...@googlegroups.com

On 26.03.2018 11:03 MacArthur, Ian (Leonardo, UK) wrote:

>> Ok, thanks, that's a bit what i expected.
>> You're probably talking about the Win32 timeGetDevCaps/timeBeginPeriod functions.
>> https://msdn.microsoft.com/en-us/library/windows/desktop/dd743626(v=vs.85).aspx
>> I'm currently trying to use these, but with no effect.
>> (Setting it once to 1 ms before entering fltk main loop.)
>
> Right: I knew I'd used this before, so went looking for it - here's some stuff cut out of a bigger project that does this.
>
> This code runs on Win7 (usually) and this timer init is run *very soon* after the app starts, and before I create any fltk elements.
>
> With this done, I can get *fairly* close to 1ms ticks pretty consistently, across a range of machines. It certainly does much better than 15.5ms resolution! 20ms should be easily obtainable with fair repeatability.

Thanks for this code snippet. I was curious and tried it with the timer
debugging code in test/blocks.cxx.

I can't get it working though under Windows 10. I attach a diff file for
those wanting to test and maybe find a bug?

Use patch -p1 to apply the patch. Then compile and run test/blocks,
start the game, then type '+' several times. Every time you press '+'
the program switches to the next level and outputs a line with average
data. To suppress unnecessary lines I used a filter like this:

$ bin/examples/blocks | grep 'average\|win_timer\|^time'
--- win_timer_init() ---
timeGetDevCaps() : wPeriodMin/Max = 1/1000000
timeBeginPeriod(): Timer period set to 1 ms.
*** level = 1, interval 0.100000, average delta time = 0.008750, n = 12
*** level = 2, interval 0.075000, average delta time = 0.011286, n = 7
*** level = 3, interval 0.056250, average delta time = 0.007972, n = 9
*** level = 4, interval 0.042188, average delta time = 0.005822, n = 13
*** level = 5, interval 0.031641, average delta time = 0.003389, n = 18
*** level = 6, interval 0.023730, average delta time = 0.006885, n = 17
*** level = 7, interval 0.017798, average delta time = 0.013369, n = 18
*** level = 8, interval 0.013348, average delta time = 0.003753, n = 32
*** level = 9, interval 0.010011, average delta time = 0.005254, n = 19
*** level = 10, interval 0.007508, average delta time = 0.008025, n = 15
*** level = 11, interval 0.005631, average delta time = 0.010369, n = 15
*** level = 12, interval 0.004224, average delta time = 0.011176, n = 20
*** level = 13, interval 0.003168, average delta time = 0.012311, n = 23
*** level = 14, interval 0.002376, average delta time = 0.013291, n = 21
*** level = 15, interval 0.001782, average delta time = 0.014218, n = 13
*** level = 16, interval 0.001336, average delta time = 0.013807, n = 14
*** level = 17, interval 0.001002, average delta time = 0.014783, n = 14
*** level = 18, interval 0.001000, average delta time = 0.014846, n = 13
*** level = 19, interval 0.001000, average delta time = 0.014308, n = 13
*** level = 20, interval 0.001000, average delta time = 0.014929, n = 14
*** level = 21, interval 0.001000, average delta time = 0.014500, n = 28
*** level = 22, interval 0.001000, average delta time = 0.014622, n = 45
*** level = 23, interval 0.001000, average delta time = 0.014674, n = 46
*** level = 24, interval 0.001000, average delta time = 0.014562, n = 64
*** level = 25, interval 0.001000, average delta time = 0.014611, n = 54
*** level = 26, interval 0.001000, average delta time = 0.015017, n = 60
--- win_timer_quit() ---

I don't see that any function call fails. It doesn't matter if I run it
with administrator privileges or not.

Each game level reduces the "interval" from 0.100 (100 ms) to 0.001 (1
ms). The average delta time is the time difference between the triggered
timer and the expected value. You can see that interval + average delta
time is about 0.016 starting with level 8. Average delta time is nearly
constant starting with level 18 (1 ms). The last value 'n' is the number
of measurements in one average value (depends on the time between two
'+' key presses). win_timer_quit() is called before the program terminates.

I have no idea why timeBeginPeriod() does not work as expected. As I
wrote above, this is Windows 10. I can't test on earlier Windows
versions (except maybe Windows XP in a VM).

Can anybody confirm my findings on other Windows versions? Ian or
anybody else, can you try if it works for you under Windows 7? Anybody
on Windows 10?

TIA for testing.
win_timer_res.diff

chris

unread,
Mar 26, 2018, 3:02:58 PM3/26/18
to fltkg...@googlegroups.com
On 26.03.2018 at 19:17, Albrecht Schlosser wrote:

> Each game level reduces the "interval" from 0.100 (100 ms) to 0.001 (1
> ms). The average delta time is the time difference between the triggered
> timer and the expected value. You can see that interval + average delta
> time is about 0.016 starting with level 8. Average delta time is nearly
> constant starting with level 18 (1 ms). The last value 'n' is the number
> of measurements in one average value (depends on the time between two
> '+' key presses). win_timer_quit() is called before the program terminates.
>
> I have no idea why timeBeginPeriod() does not work as expected. As I
> wrote above, this is Windows 10. I can't test on earlier Windows
> versions (except maybe WIndows XP in a VM).

The shortest delay that can be set with SetTimer() - the function FLTK
is using - is 10ms:


https://msdn.microsoft.com/en-us/library/windows/desktop/ms644906(v=vs.85).aspx


Quote from there:

If uElapse is less than USER_TIMER_MINIMUM (0x0000000A), the timeout is
set to USER_TIMER_MINIMUM

Lars Ruoff

unread,
Mar 26, 2018, 3:09:45 PM3/26/18
to fltkg...@googlegroups.com
timeBeginPeriod doesn't have an effect for me on Win7, 64-bit.
I moved the call to right after the start of main(), but still no effect.
timeGetDevCaps reports a wPeriodMin of 1, which is what I then set. But I still see the 15ms barrier.



--
You received this message because you are subscribed to the Google Groups "fltk.general" group.
To unsubscribe from this group and stop receiving emails from it, send an email to fltkgeneral+unsubscribe@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

MacArthur, Ian (Leonardo, UK)

unread,
Mar 27, 2018, 4:50:51 AM3/27/18
to fltkg...@googlegroups.com
> The timer accuracy seems to be set correctly, but
> doesn't have any effect on the Fl::repeat_timeout accuracy. :(



Fl::repeat_timeout() is a bit of a "problem child" - on the "posix-like" platforms, it attempts to take account of the time elapsed *during* the callback, so that calling repeat_timeout repeatedly will give a "uniform" callback interval...

But, for various technical reasons, the Win32 repeat_timeout() doesn't do that; instead it is basically just a synonym for add_timeout().

The net result is that on Windows the callback interval drifts by the amount of time spent inside the callback on each iteration.

Add that to the somewhat duff behaviour of the Win32 timers anyway, and it all gets a bit... poor...

I'm not sure there is a credible "fltk" fix for this issue, TBH.

So what I do, if I need "accurate" timer repetition on Win32 is one or other of:



Option 1:

// Create the timer callback code
void CALLBACK timerA_callback(UINT uTimerID, UINT uMsg, DWORD_PTR param, DWORD_PTR dw1, DWORD_PTR dw2)
{
    // My timer callback logic
}


// Trigger a repeating timer event
// ...
MMRESULT timerA = timeSetEvent(period, 0, timerA_callback, 0, TIME_PERIODIC);
// ...


Or Option 2:

// Create the timer callback code
void CALLBACK timerB_callback(void* param, BOOLEAN bb)
{
    // timer callback logic
}


// Trigger a repeating timer event
// ...
HANDLE timerB = NULL;
CreateTimerQueueTimer(&timerB, NULL, timerB_callback, 0, 0, period, 0);
// ...



Both of these mechanisms I have used (on Win7, can't comment on Win10!) and got reasonably consistent timer repetition.

Though, you still need to get the timeBeginPeriod(1); stuff to work, since most of these timers still depend on that "feature" to work.

Lars Ruoff

unread,
Mar 27, 2018, 5:13:37 AM3/27/18
to fltkg...@googlegroups.com
On Tue, Mar 27, 2018 at 10:50 AM, MacArthur, Ian (Leonardo, UK) <ian.ma...@leonardocompany.com> wrote:

Thanks, will try both.

Now maybe the problem I have is that my "timer callback logic" consists of a refresh request of an FLTK OpenGL window, i.e. I call Fl_Gl_Window::redraw().
As I understand it now, this will only mark the widget for a future draw, asynchronously.
So even if the timer callback is executed accurately, the real update will be done from another message loop, which itself is probably running with the lower accuracy.

FYI, the whole current code is visible here:
https://sourceforge.net/p/podball/code/ci/default/tree/src/podball/viewer.cpp#l184
starting with function cb_update_tick, which calls MyAppWindow::RunUpdate(), which calls MyAppWindow::Update(), which calls Fl_Gl_Window::redraw().

How should I handle this?
Should I call draw() myself instead of the async redraw?

MacArthur, Ian (Leonardo, UK)

unread,
Mar 27, 2018, 5:14:47 AM3/27/18
to fltkg...@googlegroups.com


> > I have no idea why timeBeginPeriod() does not work as expected. As I
> > wrote above, this is Windows 10. I can't test on earlier Windows
> > versions (except maybe WIndows XP in a VM).
>
> The shortest delay that can be set with SetTimer() - the function FLTK
> is using - is 10ms:
>
>
> https://msdn.microsoft.com/en-
> us/library/windows/desktop/ms644906(v=vs.85).aspx
>
>
> Quote from there:
>
> If uElapse is less than USER_TIMER_MINIMUM (0x0000000A), the timeout is
> set to USER_TIMER_MINIMUM


Yes - it is not clear that timeBeginPeriod() affects SetTimer() events - indeed, at least under Win10 it really would seem that it does not.

Though it clearly does affect "other" Win32 timer behaviours, at least on Win7... I knew I'd been here before, so went digging and found this test code (attached) that I used way back when to verify how CreateTimerQueueTimer() operates, and from this (at least on Win7) you can see that it does "work" as you'd expect.

If you run this code, from the command line, with no parameters, it *does not* assert timeBeginPeriod() and you get roughly 15.6ms "best" periods.

Period: 1ms
CreateTimerQueueTimer: cb_time = 15.652, main_time = 15.671, wanted time = 1


If you run this code, from the command line, with one or more parameters, it *does* assert timeBeginPeriod() and you get roughly the period you ask for, down to about 1ms.

Period: 1ms
CreateTimerQueueTimer: cb_time = 0.979, main_time = 0.994, wanted time = 1


So, for a repeating period of less than 15.6ms, SetTimer() is not going to be the way to go, but CreateTimerQueueTimer() or timeSetEvent() do seem to give adequate behaviour.
timer_queue_test.c

MacArthur, Ian (Leonardo, UK)

unread,
Mar 27, 2018, 5:30:55 AM3/27/18
to fltkg...@googlegroups.com
> Thanks, will try both.

I just posted an old "worked example" I did of using CreateTimerQueueTimer().

TBH, I think that timeSetEvent() is maybe easier to use, and might even be more accurate (less inaccurate) in my tests on Win7. But it is deprecated in favour of CreateTimerQueueTimer() by MS these days so...



> Now maybe the problem I have is that my "timer callback logic" consists
> of a refresh request of an FLTK OpenGL window, i.e. I call
> Fl_Gl_Window::redraw().
> As I understand it now, this will only mark the widget for a future draw,
> asynchronously.
> So even if the timer callback is executed accurately, the real
> update will be done from another message loop, which itself is probably
> running with the lower accuracy.

The fltk event handling loop is not really dependent on how the timers run - in principle it should "wake up" as soon as it gets a new event to process.

How soon that "wake up" will actually occur is another matter, and in that case calling timeBeginPeriod(1); might actually help, because it changes how often the Win kernel awakes and so can have a knock-on effect on how non-timer events are scheduled...





> FYI, the whole current code is visible here:
> https://sourceforge.net/p/podball/code/ci/default/tree/src/podball/viewer.cpp#l184
> starting with function cb_update_tick, which calls
> MyAppWindow::RunUpdate(), which calls MyAppWindow::Update(), which calls
> Fl_Gl_Window::redraw().

> How should I handle this?
> Should I call draw() myself instead of the async redraw?

Calling draw() yourself is rarely a good idea in fltk-space, and the response of the asynch redraw() is generally "good enough" for most purposes.

However: Fl_Gl_Window is set up so that you can (with care) set the context and draw directly if you want to, so you could try that.

I generally would not, however, and it has never turned out to be necessary in the end.

How many FPS are you trying to get? For most purposes, anything more than about 30 FPS is just wasting CPU/GPU time: anything more than the refresh rate of your monitor is definitely just wasting CPU/GPU time, since no one will ever see those scenes...

Matthias Melcher

unread,
Mar 27, 2018, 9:13:54 AM3/27/18
to fltkg...@googlegroups.com

Aaaah. What you are looking for is not a timer, but you want to know when to render the next OpenGL frame. That can be achieved by requesting OpenGL to synchronize buffer swapping with the vertical blank of the display (search for OpenGL VSYNC for hundreds of hits).

In FLTK, you can use the OpenGL extensions for buffer swapping and a call to glFinish() at the right point in time. No timers at all needed. Note that modern graphics cards don't move any data around at all and simply swap the start address of a frame buffer.


Just make sure that you give FLTK a chance to handle the message pipe before glFinish() blocks the CPU for the rest of the frame. FLTK offers multithreading to avoid blocking, with awake() to trigger the main thread and its message handling. Also, FLTK drawing should be handled in the main thread, but OpenGL drawing is usually possible from another thread.

Lars Ruoff

unread,
Mar 27, 2018, 9:31:05 AM3/27/18
to fltkg...@googlegroups.com
No no, I think you got me wrong. My rendering is trivial and takes much less time than the requested timeouts.
My basic need is to throttle everything *down* to a frame rate of 50 fps.
Also, the state of my application changes during the call; it is not only about rendering but also about moving the game forward.

On Tue, Mar 27, 2018 at 3:13 PM, 'Matthias Melcher' via fltk.general <fltkg...@googlegroups.com> wrote:

Aaaah. What you are looking for is not a timer, but you want to know when to render the next OpenGL frame. That can be achieved by requesting OpenGL to synchronize buffer swapping with the vertical blank of the display (search for OpenGL VSYNC for hundreds of hits).

In FLTK, you can use the OpenGL extensions for buffer swapping and a call to glFinish() at the right point in time. No timers at all needed. Note that modern graphics cards don't move any data around at all and simply swap the start address of a frame buffer.


Matthias Melcher

unread,
Mar 27, 2018, 3:28:35 PM3/27/18
to fltkg...@googlegroups.com


On Tuesday, March 27, 2018 at 15:31:05 UTC+2, Lars Ruoff wrote:
No no, I think you got me wrong. My rendering is trivial and takes much less time than the requested timeouts.
My basic need is to throttle everything *down* to a frame rate of 50 fps.
Also, the state of my application changes during the call; it is not only about rendering but also about moving the game forward.

Yes, I understood that. Using glFinish() with WGL_EXT_swap_control will give you a very precise time base for your animation. You can wait 10 or so frames to figure out what the actual frame rate is, and after that, you use that time to create perfectly fluid animation while advancing the app state. glFinish() will wait until the next frame starts. If you are done sooner, the CPU will be available for other tasks on your machine.