ronald_pangestu

asked on

How to calculate elapsed time in C on Windows

dear expert,

I am trying to calculate how long my quicksort function takes to sort an array. I want to measure it in microseconds, because milliseconds and seconds obviously still show 0 elapsed time. I am using Microsoft VS .NET 2003. Thank you.
PaulCaswell

You may be looking for 'ftime'. It's in sys\timeb.h.

Take a look here for just one of the many discussions on this subject.

https://www.experts-exchange.com/questions/20455818/Checking-for-1-sec-time-difference.html

Paul
ASKER CERTIFIED SOLUTION
PaulCaswell
ronald_pangestu

ASKER

I have tried using ftime(), but it can only measure milliseconds, which is not enough. Is there any way to measure microseconds or nanoseconds on Windows?
A little research suggests QueryPerformanceCounter may be of use. Other methods require assembler.

https://www.experts-exchange.com/questions/20514943/C-C-How-to-write-microsecond's-timer.html

Seriously though, if you are testing a sort routine, it would be better to call it 1000 times.

Run one loop that randomises an array of data and then sorts it, 1000 times. Then run another loop that only randomises the data. Subtract one time from the other. You'll get a much better idea of the sort's performance across a range of data.

Paul
Actually, I only want to run it once. Can you take a timestamp (in microseconds) at the beginning of the sort and another at the end, and then compute the difference between the two?
SOLUTION
jkr
Hi,
Paul is right. Just call your routine 1000 (or why not 1,000,000) times, and simply divide the result by 1000 (or 1,000,000) at the end. You will get much better precision for your routine this way.
I agree with Paul. When the hardware clock is not accurate enough for the measurement, this is the only way.