
C++ Timer

I am currently writing a program using Visual C++ under Windows 95.
I would like to time a peripheral device as accurately as possible, ideally down to the microsecond. Are there any functions or techniques that can give me timing this accurate?

This program has to be accurate on various machines with different clock speeds.
The only solution I can find is the performance counter, but it is not always supported.
1 Solution
Tommy HuiEngineerCommented:
You'll need to do a couple of things:

Use the multimedia timer: timeSetEvent().

However, timeSetEvent is not all that accurate by itself, so what you can do is use timeSetEvent for a one-shot event scheduled 1 millisecond before the actual time. Then, when the timer fires, use QueryPerformanceCounter() to spin until the actual time is reached.
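A minimal sketch of that hybrid approach, assuming the machine has a performance counter; the names OneShot, FireAt, and g_target are hypothetical, and you must link with winmm.lib:

#include <windows.h>
#include <mmsystem.h>   // timeSetEvent(); link with winmm.lib

volatile LONGLONG g_target;   // deadline, in performance-counter ticks

// Fires roughly 1 ms early, then spins on the performance counter
// until the exact deadline is reached.
void CALLBACK OneShot(UINT, UINT, DWORD_PTR, DWORD_PTR, DWORD_PTR)
{
    LARGE_INTEGER now;
    do {
        QueryPerformanceCounter(&now);
    } while (now.QuadPart < g_target);
    // ... the precise moment has arrived; act here ...
}

void FireAt(DWORD msFromNow)
{
    LARGE_INTEGER freq, now;
    QueryPerformanceFrequency(&freq);
    QueryPerformanceCounter(&now);
    g_target = now.QuadPart + freq.QuadPart * msFromNow / 1000;

    // One-shot multimedia timer, scheduled 1 ms before the deadline.
    timeSetEvent(msFromNow - 1, 0, OneShot, 0, TIME_ONESHOT);
}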
vallyAuthor Commented:
When I said time it, I meant measuring how long the device takes to finish its task, e.g. transferring a block of data.

Besides, QueryPerformanceCounter is out, as it is not always supported.
QueryPerformanceCounter() is supported on Pentiums and above, Pentium clones, Alphas, etc.  What more do you need?
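You can also check for support at run time: QueryPerformanceFrequency() returns FALSE when no high-resolution counter exists, so you can fall back to timeGetTime() in that case. A minimal sketch of detecting the counter and timing an operation with it:

#include <windows.h>
#include <stdio.h>

int main()
{
    LARGE_INTEGER freq, start, end;

    // Returns FALSE when the hardware has no high-resolution counter.
    if (!QueryPerformanceFrequency(&freq)) {
        printf("No performance counter; fall back to timeGetTime().\n");
        return 1;
    }

    QueryPerformanceCounter(&start);
    // ... transfer the block of data being timed ...
    QueryPerformanceCounter(&end);

    double usec = (end.QuadPart - start.QuadPart) * 1000000.0 / freq.QuadPart;
    printf("Elapsed: %.1f microseconds\n", usec);
    return 0;
}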

Thui, your answer sounds strangely familiar :-)
Use the timeGetTime function. It retrieves the system time in milliseconds, where the system time is the time elapsed since Windows was started.

#include <windows.h>
#include <mmsystem.h>   // timeGetTime(); link with winmm.lib

DWORD dwStartTime = ::timeGetTime();
// ... perform the operation being timed ...
DWORD dwEndTime = ::timeGetTime();

DWORD dwHowLong = dwEndTime - dwStartTime;   // elapsed milliseconds

Windows NT: The default precision of the timeGetTime function can be five milliseconds or more, depending on the machine. You can use the timeBeginPeriod and timeEndPeriod functions to increase the precision of timeGetTime. If you do so, the minimum difference between successive values returned by timeGetTime can be as large as the minimum period value set using timeBeginPeriod and timeEndPeriod. Use the QueryPerformanceCounter and QueryPerformanceFrequency functions to measure short time intervals at a high resolution.
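A sketch of raising the resolution just for the duration of a measurement; the function name TimeWithOneMsResolution is hypothetical, and you must link with winmm.lib:

#include <windows.h>
#include <mmsystem.h>   // timeBeginPeriod/timeEndPeriod; link with winmm.lib

DWORD TimeWithOneMsResolution()
{
    // Ask for 1 ms timer resolution; every timeBeginPeriod()
    // call must be matched by a timeEndPeriod() with the same value.
    timeBeginPeriod(1);

    DWORD dwStart = ::timeGetTime();
    // ... the work being timed ...
    DWORD dwElapsed = ::timeGetTime() - dwStart;

    timeEndPeriod(1);
    return dwElapsed;
}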

Windows 95: The default precision of the timeGetTime function is 1 millisecond. In other words, the timeGetTime function can return successive values that differ by just 1 millisecond. This is true no matter what calls have been made to the timeBeginPeriod and timeEndPeriod functions.
