• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 426

Accurate Timing in Windows

Is there a way to guarantee accurate time (milliseconds) from a C++ program running in a Windows environment?
Asked by: zilch
1 Solution
 
Tommy Hui (Engineer) commented:
You can always call GetTickCount() at the start of the program and call it again at the end to determine the number of milliseconds that have elapsed.

This works because GetTickCount() returns the number of milliseconds that have elapsed since Windows started. However, the figure you get for your program may be inaccurate, because the OS can preempt your program to run other tasks, so wall-clock time is not the same as the time your program actually spent running.
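
A minimal sketch of this approach (the Sleep call is just a stand-in for whatever work is being timed):

    #include <windows.h>
    #include <cstdio>

    int main()
    {
        DWORD start = GetTickCount();   // milliseconds since system start

        Sleep(250);                     // stand-in for the work being timed

        // Unsigned subtraction stays correct even if the tick count wraps.
        DWORD elapsed = GetTickCount() - start;
        printf("Elapsed: %lu ms\n", elapsed);
        return 0;
    }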
 
jkr commented:
To retrieve the time your program consumed, 'GetProcessTimes()' would do the job better...
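
A minimal sketch of that approach; GetProcessTimes reports CPU time actually charged to the process (in 100-nanosecond FILETIME units) rather than wall-clock time:

    #include <windows.h>
    #include <cstdio>

    int main()
    {
        FILETIME creationTime, exitTime, kernelTime, userTime;
        if (GetProcessTimes(GetCurrentProcess(),
                            &creationTime, &exitTime, &kernelTime, &userTime))
        {
            // Kernel/user times are in 100-nanosecond units; convert to ms.
            ULARGE_INTEGER u;
            u.LowPart  = userTime.dwLowDateTime;
            u.HighPart = userTime.dwHighDateTime;
            printf("User-mode CPU time: %llu ms\n",
                   (unsigned long long)(u.QuadPart / 10000));
        }
        return 0;
    }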
 
chensu commented:
The GetTickCount function is limited to the resolution of the system timer. If you need a higher-resolution timer, use a multimedia timer or a high-resolution timer.

The following functions are used with multimedia timers.

timeBeginPeriod  
timeEndPeriod  
timeGetDevCaps  
timeGetSystemTime  
timeGetTime  
timeKillEvent  
TimeProc  
timeSetEvent

The following functions are used with high-resolution timers.

QueryPerformanceFrequency
QueryPerformanceCounter
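
A minimal sketch of interval measurement with the high-resolution counter (the Sleep call stands in for the interval being measured):

    #include <windows.h>
    #include <cstdio>

    int main()
    {
        LARGE_INTEGER freq, start, stop;
        QueryPerformanceFrequency(&freq);   // counter ticks per second
        QueryPerformanceCounter(&start);

        Sleep(100);                         // stand-in for the measured interval

        QueryPerformanceCounter(&stop);
        double ms = (stop.QuadPart - start.QuadPart) * 1000.0 / freq.QuadPart;
        printf("Elapsed: %.3f ms\n", ms);
        return 0;
    }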


You may use timeGetTime instead of GetTickCount.

"Windows NT: The default precision of the timeGetTime function can be five milliseconds or more, depending on the machine. You can use the timeBeginPeriod and timeEndPeriod functions to increase the precision of timeGetTime. If you do so, the minimum difference between successive values returned by timeGetTime can be as large as the minimum period value set using timeBeginPeriod and timeEndPeriod. Use the QueryPerformanceCounter and QueryPerformanceFrequency functions to measure short time intervals at a high resolution,

Windows 95: The default precision of the timeGetTime function is 1 millisecond. In other words, the timeGetTime function can return successive values that differ by just 1 millisecond. This is true no matter what calls have been made to the timeBeginPeriod and timeEndPeriod functions."
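
For example, a minimal sketch that raises the timer resolution before measuring (remember to link with winmm.lib, as noted later in this thread):

    #include <windows.h>
    #include <mmsystem.h>   // link with winmm.lib
    #include <cstdio>

    int main()
    {
        timeBeginPeriod(1);             // request 1 ms timer resolution

        DWORD start = timeGetTime();
        Sleep(50);                      // stand-in for the timed interval
        DWORD elapsed = timeGetTime() - start;
        printf("Elapsed: %lu ms\n", elapsed);

        timeEndPeriod(1);               // must match the timeBeginPeriod call
        return 0;
    }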

 
zilch (Author) commented:
I should have stated this more thoroughly. I need a way to generate accurate timed events within a Windows environment, not just to measure total running time! This is for controlling turn-on/turn-off events for testing, as well as measuring the total duration of those events, regardless of mouse events or any other interrupts Windows might service.
 
chensu commented:
So, use timeSetEvent to set a multimedia timer.
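
A minimal sketch of a periodic multimedia timer; the callback body is a placeholder for whatever switching your test needs, and it should be kept short since it runs on a separate timer thread (the 10 ms period is just an example value):

    #include <windows.h>
    #include <mmsystem.h>   // link with winmm.lib
    #include <cstdio>

    // Called by the multimedia timer on its own thread every period.
    void CALLBACK TimerProc(UINT uTimerID, UINT uMsg, DWORD_PTR dwUser,
                            DWORD_PTR dw1, DWORD_PTR dw2)
    {
        // Placeholder: toggle the device under test here.
        printf("tick at %lu ms\n", timeGetTime());
    }

    int main()
    {
        timeBeginPeriod(1);   // request 1 ms timer resolution

        // Fire TimerProc every 10 ms until the timer is killed.
        MMRESULT id = timeSetEvent(10, 1, TimerProc, 0, TIME_PERIODIC);

        Sleep(100);           // let it run for roughly ten ticks
        timeKillEvent(id);

        timeEndPeriod(1);
        return 0;
    }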

By the way, you should have rejected the answer since it doesn't solve your problem.
 
CameronP commented:
chensu;

I have a problem with timeGetTime. I tried it with mmsystem.h included and I still got a linker error. Is there a special setting I have to use for this function?

By the way, zilch, why did you accept this answer if it wasn't correct? I gave up 20 points to get to it and then found nothing concrete.

Cheers
Cameron
 
chensu commented:
You need to link with winmm.lib.
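
In Visual C++ you can either add winmm.lib to the project's linker settings or pull it in directly from source:

    #pragma comment(lib, "winmm.lib")   // MSVC-specific; equivalent to a linker setting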
 
CameronP commented:
Thanks.

I linked it and it's fine now, but is the timeGetTime value actually accurate to 1 ms? I am trying to switch some relays using multiplexing, which relies on accurate switching times.

Cheers
Cameron
 
chensu commented:
Windows NT: The default precision of the timeGetTime function can be five milliseconds or more, depending on the machine. You can use the timeBeginPeriod and timeEndPeriod functions to increase the precision of timeGetTime. If you do so, the minimum difference between successive values returned by timeGetTime can be as large as the minimum period value set using timeBeginPeriod and timeEndPeriod. Use the QueryPerformanceCounter and QueryPerformanceFrequency functions to measure short time intervals at a high resolution.

Windows 95: The default precision of the timeGetTime function is 1 millisecond. In other words, the timeGetTime function can return successive values that differ by just 1 millisecond. This is true no matter what calls have been made to the timeBeginPeriod and timeEndPeriod functions.
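
If you want to verify that on your own machine, here is a minimal sketch that spins on timeGetTime and reports the smallest step between successive readings, i.e. the effective resolution:

    #include <windows.h>
    #include <mmsystem.h>   // link with winmm.lib
    #include <cstdio>

    int main()
    {
        timeBeginPeriod(1);               // request 1 ms resolution first

        DWORD prev = timeGetTime();
        DWORD minStep = 0xFFFFFFFF;
        int changes = 0;
        while (changes < 10)              // sample ten timer ticks
        {
            DWORD now = timeGetTime();
            if (now != prev)
            {
                if (now - prev < minStep)
                    minStep = now - prev;
                prev = now;
                ++changes;
            }
        }
        printf("Effective resolution: %lu ms\n", minStep);

        timeEndPeriod(1);
        return 0;
    }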
 
CameronP commented:
Thank you very much.

Cheers
Cameron