• C

How to calculate elapsed time in C for Windows

Dear expert,

I am trying to measure how long my quick sort function takes to sort an array. I want to measure it in microseconds, because milliseconds and seconds both still come out as 0 elapsed time. I am using Microsoft Visual Studio .NET 2003. Thank you.


You may be looking for 'ftime'. It's declared in <sys\timeb.h>.

Take a look here for just one of the many discussions on this subject.


Alternatively, call your sort routine 1000 times.


ronald_pangestu (Author) commented:
I have tried using ftime(), but it is only able to measure milliseconds, which is not precise enough. Is there any way to measure microseconds or nanoseconds on Windows?

A little research suggests QueryPerformanceCounter may be of use. Other methods require assembler.


Seriously though, if you are testing a sort routine it would be better to call it 1000 times.

Run through one loop, randomising an array of data and then sorting it 1000 times. Then perform another loop just randomising the data. Subtract one from the other. You'll get a much better idea of its performance across a range of data.

ronald_pangestu (Author) commented:
Actually, I only want to run it once. Can you take a timestamp (in microseconds) at the beginning of the sort and another at the end, and then compute the difference between the two?
See http://www.sysinternals.com/Information/HighResolutionTimers.html ("Inside Windows NT High Resolution Timers") about what you can expect.
Paul is right. Just call your routine 1000, or why not 1000000, times, and simply divide the result by 1000 (or 1000000) at the end. You will get much better precision for your routine this way.
Arty (system administrator) commented:
I agree with Paul. When the hardware clock is not accurate enough for the measurement, this is the only way.