ronald_pangestu
asked on
How to calculate elapsed time in C for Windows
Dear expert,
I am trying to measure how long my quicksort function takes to sort an array. I want the result in microseconds, because measuring in milliseconds or seconds still gives an elapsed time of 0. I am using Microsoft Visual Studio .NET 2003. Thank you.
ASKER
I have tried using ftime(), but it can only measure in milliseconds, which is not precise enough. Is there any way to measure microseconds or nanoseconds on Windows?
A little research suggests QueryPerformanceCounter may be of use. Other methods require assembler.
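A minimal sketch of the QueryPerformanceCounter approach might look like the following. It is Windows-only (it needs windows.h), and the quicksort call in the middle is a placeholder for the asker's own routine:

```c
#include <stdio.h>
#include <windows.h>

int main(void)
{
    LARGE_INTEGER freq, start, end;
    double elapsed_us;

    /* Counts per second of the high-resolution performance counter. */
    QueryPerformanceFrequency(&freq);

    QueryPerformanceCounter(&start);
    /* ... code to be timed, e.g. quicksort(arr, 0, n - 1); ... */
    QueryPerformanceCounter(&end);

    /* Convert counter ticks to microseconds. */
    elapsed_us = (double)(end.QuadPart - start.QuadPart)
                 * 1000000.0 / (double)freq.QuadPart;
    printf("Elapsed: %.3f microseconds\n", elapsed_us);
    return 0;
}
```

Note that the counter's resolution depends on the hardware; QueryPerformanceFrequency reports how many ticks occur per second, so on most machines this gives well under a microsecond of granularity.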
https://www.experts-exchange.com/questions/20514943/C-C-How-to-write-microsecond's-timer.html
Seriously though, if you are testing a sort routine it would be better to call it 1000 times.
Run through one loop, randomising an array of data and then sorting it 1000 times. Then perform another loop just randomising the data. Subtract one from the other. You'll get a much better idea of its performance across a range of data.
Paul
ASKER
Actually, I only want to run it once. Can I take a timestamp (in microseconds) at the beginning of the sort and another at the end of the sort or any processing, and then compute the difference between the two?
Hi,
Paul is right. Just call your routine 1000 (or even 1,000,000) times, then divide the total elapsed time by 1000 (or 1,000,000) at the end. You will get much better precision for your routine this way.
I agree with Paul. When the hardware clock is not accurate enough for the measurement, this is the only way.
Take a look here for just one of the many discussions on this subject.
https://www.experts-exchange.com/questions/20455818/Checking-for-1-sec-time-difference.html
Paul