I have a routine called gettimeofday defined like this:
    void gettimeofday(struct timeval *tp, struct timezone *tzp)
    {
        SYSTEMTIME nt_time;

        GetSystemTime(&nt_time);
        tp->tv_sec  = nt_time.wSecond;      /* note: wSecond is only the seconds-of-minute field */
        tp->tv_usec = liCounter.LowPart;    /* liCounter is a LARGE_INTEGER filled in elsewhere */
    }
I use this routine to compute a difference between two times. For example, say I want to time how long it takes me to "eat." I call gettimeofday() before I start eating, I eat, and then I call gettimeofday() again when I'm done. To get the "time to eat," I subtract the start time from the finish time.
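To make the subtraction concrete, here is a minimal sketch of the elapsed-time computation I'm doing. The struct definition is a stand-in so the snippet compiles anywhere (normally it comes from <winsock2.h> on Windows or <sys/time.h> on POSIX), and the sample values are made up for illustration:

    #include <stdio.h>

    /* Minimal stand-in for struct timeval. On Windows it comes from
       <winsock2.h>; on POSIX systems from <sys/time.h>. */
    struct timeval {
        long tv_sec;   /* seconds */
        long tv_usec;  /* microseconds */
    };

    /* Elapsed time in milliseconds between two gettimeofday() samples. */
    static long elapsed_ms(const struct timeval *start, const struct timeval *end)
    {
        return (end->tv_sec - start->tv_sec) * 1000L
             + (end->tv_usec - start->tv_usec) / 1000L;
    }

    int main(void)
    {
        struct timeval before = { 10, 250000 };  /* fabricated sample values */
        struct timeval after  = { 12, 750000 };
        printf("%ld\n", elapsed_ms(&before, &after));  /* prints 2500 */
        return 0;
    }
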
My question is this: on some NT 4.0 systems the diff comes out as 2 milliseconds, while on other NT 4.0 systems it comes out as 98000 milliseconds.
I don't understand why the same gettimeofday() routine produces such wildly different results on different machines.
What's going on here?