The MASM Forum Archive 2004 to 2012

General Forums => The Campus => Topic started by: G`HOST on December 01, 2005, 07:49:48 PM

Title: milliseconds
Post by: G`HOST on December 01, 2005, 07:49:48 PM
Hi,
Does 1000 milliseconds correspond to the same amount of actual time on different machines?
My question is: if I implement a timer with, say, a 1000-millisecond timeout, will it behave the same on every machine, or will the timeout (in actual clock time) vary from system to system? If it does vary, what is a good way to work around it so I get the same value on every machine?
Thanx.
Title: Re: milliseconds
Post by: MichaelW on December 01, 2005, 09:32:11 PM
A millisecond is a millisecond, but you will see significant variations in accuracy, dependent not so much on the machine as on the length of the timed interval, the timing function and/or method used, and how heavily loaded (task-wise) the system is. You can minimize the inaccuracies by increasing the length of the timed interval and/or using a higher-resolution timer and/or boosting your process's priority class.

MSDN: Timers (http://msdn.microsoft.com/library/default.asp?url=/library/en-us/winui/winui/windowsuserinterface/windowing/timers.asp)

MSDN: SetPriorityClass (http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dllproc/base/setpriorityclass.asp)