>> (Setting it once to 1 ms before entering fltk main loop.)
>
> Right: I knew I'd used this before, so went looking for it - here's
> some stuff cut out of a bigger project that does this.
>
> This code runs on Win7 (usually) and this timer init is run *very
> soon* after the app starts, and before I create any fltk elements.
>
> With this done, I can get *fairly* close to 1ms ticks pretty
> consistently, across a range of machines. It certainly does much better
> than 15.5ms resolution! 20ms should be easily obtainable with fair
> repeatability.
Thanks for this code snippet. I was curious and tried it together with
the timer debugging code in test/blocks.cxx.
I can't get it to work under Windows 10, though. I'm attaching a diff
for anyone who wants to test it and maybe find the bug.
Use patch -p1 to apply the patch, then compile and run test/blocks.
Start the game, then press '+' several times. Each time you press '+'
the program switches to the next level and prints a line with averaged
data. To suppress unnecessary lines I used a filter like this:
$ bin/examples/blocks | grep 'average\|win_timer\|^time'
--- win_timer_init() ---
timeGetDevCaps() : wPeriodMin/Max = 1/1000000
timeBeginPeriod(): Timer period set to 1 ms.
*** level = 1, interval 0.100000, average delta time = 0.008750, n = 12
*** level = 2, interval 0.075000, average delta time = 0.011286, n = 7
*** level = 3, interval 0.056250, average delta time = 0.007972, n = 9
*** level = 4, interval 0.042188, average delta time = 0.005822, n = 13
*** level = 5, interval 0.031641, average delta time = 0.003389, n = 18
*** level = 6, interval 0.023730, average delta time = 0.006885, n = 17
*** level = 7, interval 0.017798, average delta time = 0.013369, n = 18
*** level = 8, interval 0.013348, average delta time = 0.003753, n = 32
*** level = 9, interval 0.010011, average delta time = 0.005254, n = 19
*** level = 10, interval 0.007508, average delta time = 0.008025, n = 15
*** level = 11, interval 0.005631, average delta time = 0.010369, n = 15
*** level = 12, interval 0.004224, average delta time = 0.011176, n = 20
*** level = 13, interval 0.003168, average delta time = 0.012311, n = 23
*** level = 14, interval 0.002376, average delta time = 0.013291, n = 21
*** level = 15, interval 0.001782, average delta time = 0.014218, n = 13
*** level = 16, interval 0.001336, average delta time = 0.013807, n = 14
*** level = 17, interval 0.001002, average delta time = 0.014783, n = 14
*** level = 18, interval 0.001000, average delta time = 0.014846, n = 13
*** level = 19, interval 0.001000, average delta time = 0.014308, n = 13
*** level = 20, interval 0.001000, average delta time = 0.014929, n = 14
*** level = 21, interval 0.001000, average delta time = 0.014500, n = 28
*** level = 22, interval 0.001000, average delta time = 0.014622, n = 45
*** level = 23, interval 0.001000, average delta time = 0.014674, n = 46
*** level = 24, interval 0.001000, average delta time = 0.014562, n = 64
*** level = 25, interval 0.001000, average delta time = 0.014611, n = 54
*** level = 26, interval 0.001000, average delta time = 0.015017, n = 60
--- win_timer_quit() ---
As far as I can see, none of the function calls fail, and running with
administrator privileges makes no difference.
Each game level reduces the "interval" from 0.100 (100 ms) down to
0.001 (1 ms). The average delta time is the difference between when the
timer actually fired and when it was expected to fire. You can see that
interval + average delta time is roughly 0.016 from level 8 onwards,
and that the average delta time is nearly constant from level 18 (1 ms)
onwards. The last value, 'n', is the number of measurements that went
into each average (it depends on the time between two '+' key presses).
win_timer_quit() is called before the program terminates.
I have no idea why timeBeginPeriod() does not work as expected. As I
wrote above, this is Windows 10. I can't test on earlier Windows
versions (except maybe Windows XP in a VM).
Can anybody confirm my findings on other Windows versions? Ian, or
anybody else, can you check whether it works for you under Windows 7?
Anybody on Windows 10?
TIA for testing.