Effect of timer resolution on character repeat rate

Started by xmetal, February 18, 2008, 06:23:12 PM


xmetal

I have always been frustrated with my Dell USB keyboard's character repeat rate, even with its value set to the maximum. Some time ago, I noticed that a particular application, HelpNDoc, boosted the repeat rate not only in its own editor, but across the whole operating system. After a lot of tracing, stepping and breakpointing through its code, I found that a Windows multimedia function, timeBeginPeriod, was responsible for this wonderful effect.

I created a small application to test it. On my system, any period value in the range of 1-9 milliseconds gives a noticeably faster repeat rate. For values >= 10 milliseconds, there doesn't seem to be any observable difference.

[attachment deleted by admin]
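In case anyone wants to reproduce it, here is a minimal MASM32 sketch of such a test (the 1 ms period and the one-minute hold are my own choices, not necessarily what the deleted attachment did):

Code:

    include \masm32\include\masm32rt.inc
    include \masm32\include\winmm.inc
    includelib \masm32\lib\winmm.lib

    .code
    start:
        invoke timeBeginPeriod, 1    ; request a 1 ms global timer resolution
        invoke Sleep, 60000          ; hold it for a minute; type somewhere and watch the repeat rate
        invoke timeEndPeriod, 1      ; MSDN requires a matching timeEndPeriod call
        invoke ExitProcess, 0
    end start

While this program sleeps, the raised resolution applies system-wide, which is why the repeat rate changes in every application.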

Rockoon

Indeed.

The tick period seems to start at 10ms on my XP-64 rig. This could be the default value, or a service I have installed may have set it that low. On DOS/Win3.1 systems, the tick period was ~55ms (1/18.2 seconds, to be more precise), derived from 65536 counts of the Programmable Interval Timer (PIT) running at ~1.19MHz: 65536 / 1193182 Hz ≈ 54.9 ms.

Note that the tick rate set with timeBeginPeriod() directly affects the resolution of timeGetTime(), but does NOT have an effect on GetTickCount().
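A quick way to see that difference (a sketch, assuming the standard MASM32 runtime includes and a console build): spin on each function until its return value changes and print the step size.

Code:

    include \masm32\include\masm32rt.inc
    include \masm32\include\winmm.inc
    includelib \masm32\lib\winmm.lib

    .code
    start:
        invoke timeBeginPeriod, 1      ; ask for 1 ms resolution

        ; step size of timeGetTime: spin until the returned value changes
        invoke timeGetTime
        mov ebx, eax
    @@: invoke timeGetTime
        cmp eax, ebx
        je @B
        sub eax, ebx                   ; difference = one timer step, in ms
        print str$(eax)
        print chr$(" ms step for timeGetTime", 13, 10)

        ; same measurement for GetTickCount
        invoke GetTickCount
        mov ebx, eax
    @@: invoke GetTickCount
        cmp eax, ebx
        je @B
        sub eax, ebx
        print str$(eax)
        print chr$(" ms step for GetTickCount", 13, 10)

        invoke timeEndPeriod, 1
        invoke ExitProcess, 0
    end start

A single measurement can be thrown off by preemption between the two reads, so averaging a few steps gives a steadier number.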

A decade ago there was some research suggesting that system performance was adversely affected if you set the period to 1ms; the measured cost came out to something like 5% to 10%. I doubt that on today's dual- and quad-core 2GHz+ rigs you would even be able to measure a performance difference.
When C++ compilers can be coerced to emit rcl and rcr, I *might* consider using one.

u

Nice finding :). I've been scared of losing my current Turbo-Plus keyboard, as it gives 100+ chars/second out of the box and there's no replacement; all keyboards currently sold are criminally slow.
Btw, timeBeginPeriod() does not improve the resolution of timeGetTime() here (win2k sp5). It's always at 1ms resolution, and timeGetTime() takes around 1300 cycles, IIRC. The hardware timer used for thread switching is 60Hz here; it's what drives GetTickCount(), since calls to that function simply return the value cached at the last timer interrupt.
Please use a smaller graphic in your signature.

MichaelW

On my Windows 2000 SP4 system, timeBeginPeriod sets the resolution of timeGetTime just as you would expect, but with my PS/2-connected IBM keyboard there is no effect on the repeat rate.

[attachment deleted by admin]
eschew obfuscation

Rockoon

Quote from: Ultrano on February 22, 2008, 03:04:43 PM
Btw, timeBeginPeriod() does not improve the resolution of timeGetTime() here (win2k sp5). It's always at 1ms resolution...

One of your running services/processes must set it that low.

From MSDN:

Quote

timeBeginPeriod()

..

This function affects a global Windows setting. Windows uses the lowest value (that is, highest resolution) requested by any process. Setting a higher resolution can improve the accuracy of time-out intervals in wait functions. However, it can also reduce overall system performance, because the thread scheduler switches tasks more often. High resolutions can also prevent the CPU power management system from entering power-saving modes. Setting a higher resolution does not improve the accuracy of the high-resolution performance counter.


Quote
timeGetTime()

..

Windows NT/2000: The default precision of the timeGetTime function can be five milliseconds or more, depending on the machine. You can use the timeBeginPeriod and timeEndPeriod functions to increase the precision of timeGetTime. If you do so, the minimum difference between successive values returned by timeGetTime can be as large as the minimum period value set using timeBeginPeriod and timeEndPeriod.

..

Windows 95: The default precision of the timeGetTime function is 1 millisecond. In other words, the timeGetTime function can return successive values that differ by just 1 millisecond. This is true no matter what calls have been made to the timeBeginPeriod and timeEndPeriod functions.


Quote
and timeGetTime() takes 1300 cycles iirc.

Not bad at all, considering that it's probably the least problematic time source available to us that works from Windows 95 onward.
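For anyone who wants to check that figure on their own machine, a rough sketch (the .586 directive enables rdtsc; the 1000-call average is arbitrary, and the result includes loop and rdtsc overhead, so treat it as a ballpark):

Code:

    include \masm32\include\masm32rt.inc
    include \masm32\include\winmm.inc
    includelib \masm32\lib\winmm.lib

    .586                               ; rdtsc needs at least a Pentium directive

    .code
    start:
        rdtsc
        mov esi, eax                   ; low dword of the starting timestamp
        mov edi, 1000                  ; number of calls to average over
    @@: invoke timeGetTime
        dec edi
        jnz @B
        rdtsc
        sub eax, esi                   ; elapsed cycles (assumes no low-dword wraparound)
        xor edx, edx
        mov ecx, 1000
        div ecx                        ; eax = average cycles per call
        print str$(eax)
        print chr$(" cycles per timeGetTime call, roughly", 13, 10)
        invoke ExitProcess, 0
    end start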


The OP's keyboard driver must use a multimedia timer callback, as these are affected by timeBeginPeriod().
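That dependence is easy to see with a periodic multimedia timer. A sketch (the 1 ms delay and the one-second run are arbitrary; the callback signature and TIME_PERIODIC are from the timeSetEvent documentation):

Code:

    include \masm32\include\masm32rt.inc
    include \masm32\include\winmm.inc
    includelib \masm32\lib\winmm.lib

    .data
      nTicks dd 0

    .code

    ; callback signature required by timeSetEvent; runs on a winmm worker thread
    TimeProc proc uTimerID:DWORD, uMsg:DWORD, dwUser:DWORD, dw1:DWORD, dw2:DWORD
        inc nTicks                     ; one timer = one writer, so a plain inc is fine here
        ret
    TimeProc endp

    start:
        invoke timeBeginPeriod, 1      ; comment this line out and compare the counts
        invoke timeSetEvent, 1, 0, ADDR TimeProc, 0, TIME_PERIODIC
        mov ebx, eax                   ; save the timer id (0 would mean failure)
        invoke Sleep, 1000             ; let the 1 ms periodic timer run for a second
        invoke timeKillEvent, ebx      ; stop the timer before reading the count
        print str$(nTicks)
        print chr$(" callbacks in one second", 13, 10)
        invoke timeEndPeriod, 1
        invoke ExitProcess, 0
    end start

With the period set to 1 ms the count should approach 1000; without it, the callbacks can only fire at the coarser default tick.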
When C++ compilers can be coerced to emit rcl and rcr, I *might* consider using one.

xmetal

Since I need this "feature" so badly, I created another, smaller application whose shortcut I placed in the startup programs folder. It sets the timer resolution to the minimum value, creates an invisible window, and waits for the WM_DESTROY message, which I assume is sent to all windows when the system is shutting down. I first thought of calling Sleep(INFINITE) instead, but such a program would have to be forcefully terminated by Windows on shutdown. Using GetMessage seems like a more graceful method. Am I right?
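Something along those lines, perhaps (a minimal sketch, with a made-up class name and no error checking; the 1 here again means "highest resolution"):

Code:

    include \masm32\include\masm32rt.inc
    include \masm32\include\winmm.inc
    includelib \masm32\lib\winmm.lib

    .data
      szClass db "TimerResHolder", 0   ; made-up class name

    .code

    WndProc proc hWnd:DWORD, uMsg:DWORD, wParam:DWORD, lParam:DWORD
        .if uMsg == WM_DESTROY
            invoke PostQuitMessage, 0  ; makes GetMessage return 0 below
            xor eax, eax
            ret
        .endif
        invoke DefWindowProc, hWnd, uMsg, wParam, lParam
        ret
    WndProc endp

    main proc
        LOCAL wc:WNDCLASSEX
        LOCAL msg:MSG

        invoke timeBeginPeriod, 1      ; minimum period = highest resolution

        ; register a do-nothing window class
        mov wc.cbSize, SIZEOF WNDCLASSEX
        mov wc.style, 0
        mov wc.lpfnWndProc, OFFSET WndProc
        mov wc.cbClsExtra, 0
        mov wc.cbWndExtra, 0
        invoke GetModuleHandle, NULL
        mov wc.hInstance, eax
        mov wc.hIcon, NULL
        mov wc.hCursor, NULL
        mov wc.hbrBackground, NULL
        mov wc.lpszMenuName, NULL
        mov wc.lpszClassName, OFFSET szClass
        mov wc.hIconSm, NULL
        invoke RegisterClassEx, ADDR wc

        ; no WS_VISIBLE style, so nothing ever appears on screen
        invoke CreateWindowEx, 0, ADDR szClass, NULL, WS_OVERLAPPED, 0, 0, 0, 0, NULL, NULL, wc.hInstance, NULL

        ; block in GetMessage until WM_QUIT arrives
    @@: invoke GetMessage, ADDR msg, NULL, 0, 0
        test eax, eax
        jz @F
        invoke DispatchMessage, ADDR msg
        jmp @B
    @@:
        invoke timeEndPeriod, 1        ; be polite on the way out
        ret
    main endp

    start:
        call main
        invoke ExitProcess, 0
    end start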

I also found that if the system is put into standby or hibernate mode and then woken up, the effect disappears. It's a minor annoyance, since one just needs to start another instance of this application (a possible automatic fix is sketched below).

[attachment deleted by admin]
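For the standby/hibernate annoyance, one possible refinement instead of starting a new instance by hand (my own assumption, not something from the deleted attachment): handle WM_POWERBROADCAST in the WndProc of a program like the one above and re-apply the period when the machine wakes.

Code:

    ; extension to the WndProc sketched earlier, not a standalone program
    .elseif uMsg == WM_POWERBROADCAST
        .if wParam == PBT_APMRESUMEAUTOMATIC || wParam == PBT_APMRESUMESUSPEND
            invoke timeEndPeriod, 1    ; release the old request, if Windows still counts it
            invoke timeBeginPeriod, 1  ; re-apply the 1 ms period after resume
        .endif
        mov eax, TRUE
        ret

PBT_APMRESUMEAUTOMATIC arrives after any wake-up and PBT_APMRESUMESUSPEND after a user-triggered one, so handling both should cover standby as well as hibernate.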