Troubleshooting Question

In SQL Server, what would cause a query's duration to increase when I/O and CPU time are cut by 50%?

AHTDEV asked on Microsoft SQL Server
I have three queries that I've worked to optimize, and my efforts have reduced CPU usage and I/O counts by half.  (I verified these values via Profiler and statistics in Query Analyzer.)  When I execute the "optimized" queries after executing "DBCC DROPCLEANBUFFERS", the duration is roughly double the original duration.  When I execute the queries without dropping the buffers, the duration is roughly half the original value.
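For reference, the cold-cache measurement described above can be sketched as follows. The procedure name is a placeholder; the `CHECKPOINT` is an addition I'd suggest, since `DBCC DROPCLEANBUFFERS` only evicts clean pages, so flushing dirty pages first makes the cold-cache run repeatable:

```sql
-- Cold-cache timing harness (sketch; dbo.MyOptimizedQuery is a placeholder).
CHECKPOINT;                -- flush dirty pages so DROPCLEANBUFFERS can evict everything
DBCC DROPCLEANBUFFERS;     -- empty the buffer pool so all reads are physical

SET STATISTICS IO ON;      -- report logical/physical reads per statement
SET STATISTICS TIME ON;    -- report CPU time and elapsed time per statement

EXEC dbo.MyOptimizedQuery; -- the query under test

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
```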

Here are some values returned by Profiler after dropping the buffers:
CPU:      6969 (old), 3250 (new)
Reads:    537075 (old), 191632 (new)
Duration: 13235 (old), 36990 (new)

When looking at physical I/O via "SET STATISTICS IO ON", the number of physical I/O operations drops significantly.  So how can the duration go up when CPU usage, logical I/O, and physical I/O all go down?  What can I do to prove/fix the problem?
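When elapsed time exceeds what CPU and I/O account for, the remainder is usually time spent waiting (for example on latches, locks, or parallelism exchanges). Assuming SQL Server 2000 (the Query Analyzer/Profiler era), one way to see where that time goes is the server-wide wait-statistics counters; this is a sketch, not a full diagnosis, and the query name is a placeholder:

```sql
-- Snapshot server-wide wait statistics around the test run (SQL Server 2000).
-- Run on a quiet server: the counters are instance-wide, so resetting them
-- first makes the reported deltas attributable to this query.
DBCC SQLPERF(WAITSTATS, CLEAR);  -- reset accumulated wait counters

DBCC DROPCLEANBUFFERS;
EXEC dbo.MyOptimizedQuery;       -- placeholder for the query under test

DBCC SQLPERF(WAITSTATS);         -- wait type, requests, wait time, signal wait time
```

On SQL Server 2005 and later, the equivalent view is `sys.dm_os_wait_stats`.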
