• C

Millisecond timing of a loop.

Hi, I've got a device hooked up to the parallel port, and I want to see how long it outputs a "low" signal.  This is my program so far:

#include <stdio.h>
#include <conio.h>  /* for inp(); djgpp users should use inportb() from <pc.h> instead */

int main(void) {
  int in;
  in = inp(889);                    /* 889 = 0x379, the parallel port status register */
  for (; in == 120; in = inp(889))
    ;                               /* spin until the value changes from 120 */
  /* Start of timer. */
  for (; in != 120; in = inp(889))
    ;                               /* spin until the value returns to 120 */
  /* End of timer. */
  return 0;
}

So, quite simply, all I'm trying to do is measure the length of time (in milliseconds) that the second for loop lasts.  Trouble is, I don't know how to measure time in milliseconds.  Can anybody help me out with some code to time this loop in milliseconds?

Thanks in advance,
Cide

zebada Commented:
Like pjknibbs said, you should use QueryPerformanceCounter().
This is C++ code but it simple to convert to C code.

#include <windows.h>
#include <stdio.h>

// ULONG (unsigned long) is already provided by <windows.h>

class hrt {
private:
  LARGE_INTEGER frequency;

  LARGE_INTEGER startCount;
  SYSTEMTIME    startTime;
  LARGE_INTEGER startMs;

  LARGE_INTEGER currentCount;
  SYSTEMTIME    currentTime;
  LARGE_INTEGER currentMs;

public:
  hrt(void);
  ~hrt(void){};

  SYSTEMTIME *getTime();
};

hrt::hrt()
{
  QueryPerformanceFrequency(&frequency);
  QueryPerformanceCounter(&startCount);
  GetSystemTime(&startTime);
  startMs.QuadPart = startTime.wHour*3600000+startTime.wMinute*60000+startTime.wSecond*1000+startTime.wMilliseconds;
}

SYSTEMTIME *
hrt::getTime()
{
  ULONG ms;
  ULONG hour,min,sec;

  QueryPerformanceCounter(&currentCount);
  currentMs.QuadPart = ((currentCount.QuadPart-startCount.QuadPart)*1000)/frequency.QuadPart;
  ms = static_cast<ULONG>(startMs.QuadPart+currentMs.QuadPart);

  hour = ms/3600000;
  ms %= 3600000;
  min = ms/60000;
  ms %= 60000;
  sec = ms/1000;
  ms %= 1000;

  currentTime = startTime;
  currentTime.wHour = static_cast<USHORT>(hour);
  currentTime.wMinute = static_cast<USHORT>(min);
  currentTime.wSecond = static_cast<USHORT>(sec);
  currentTime.wMilliseconds = static_cast<USHORT>(ms);
  return &currentTime;
}

int
main(int argc, char *argv[])
{
  hrt         t;
  SYSTEMTIME  *now;

  for ( int i=0 ; i<100 ; i++ ) {
    now = t.getTime();
    printf("%02d:%02d:%02d.%03d\n",now->wHour,now->wMinute,now->wSecond,now->wMilliseconds);
  }
  return 0;
}
 
Cide (Author) Commented:
Sorry, forgot to say I'm using djgpp and Windows 98.
 
pjknibbs Commented:
If you're writing this as a Win32 console application you can use the function timeGetTime() (found in WINMM.LIB), which returns a millisecond-accurate timer value. (Well, it SHOULD be millisecond-accurate--I think under Win32 this function actually gets its information from a separate timer thread, so it's possible for a lot of high-priority system threads to kick the value out of sync.)

If you want this to be a pure DOS application then I haven't a clue how you'd get millisecond accurate timing.
 
pjknibbs Commented:
Oh, if you want the highest accuracy, you might also want to look at the QueryPerformanceCounter() and QueryPerformanceFrequency() functions--you'll need to use 64-bit arithmetic to handle the return values in this case, though.
 
BeyondWu Commented:
You can also simply use GetTickCount().
 
zebada Commented:
No, you can't use GetTickCount(), as its resolution is not high enough:

Windows NT 3.5 and later: the system timer runs at approximately 10 ms.
Windows NT 3.1: the system timer runs at approximately 16 ms.
Windows 95 and later: the system timer runs at approximately 55 ms.
