• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 238

Predict Reads Prior to SQL Query

Is there a way to predict ahead of time the number of reads a query needs to perform?

Example: I just ran a query that took roughly 15 minutes and performed 600,000 reads. Prior to running the query, I had no idea how long it was going to run, but if there were a way to estimate ahead of time the number of reads the query would need, then I could estimate the run time by watching it for the first few minutes to see how many reads per minute it was doing.

I'm not looking for something as slick as a progress bar, but if there is any reliable method to estimate the runtime of a query, that would be fantastic.
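(To illustrate the kind of watching I mean, here is a rough sketch that polls a running query's reads and CPU from a second connection using sys.dm_exec_requests; session_id 57 is just a placeholder for whichever session is running the big query.)

    SELECT r.session_id,
           r.status,
           r.command,
           r.cpu_time,            -- CPU milliseconds used so far
           r.total_elapsed_time,  -- elapsed milliseconds so far
           r.logical_reads,       -- logical reads so far
           r.reads,               -- physical reads so far
           r.writes
    FROM sys.dm_exec_requests AS r
    WHERE r.session_id = 57;      -- placeholder: the session running the query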

1 Solution
lcohan (Database Analyst) commented:
I suggest you use SSMS to assess the query's performance, missing indexes, execution plan, reads, etc.
I would use the "Estimated Execution Plan", or just run the query manually and look at the actual execution plan; then you should be able to see why it takes 15 minutes to run - most likely a missing index, or a function in the WHERE clause that invalidates an index.

If the query is quick in SSMS... then you get the idea.
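For example, to get the estimated plan without actually executing the statement, you can wrap it in SET SHOWPLAN_XML (a minimal sketch - dbo.BigTable and the predicate are placeholders):

    SET SHOWPLAN_XML ON;
    GO
    -- The statement below is NOT executed; SQL Server returns its estimated plan XML instead.
    SELECT *
    FROM dbo.BigTable          -- placeholder table name
    WHERE SomeColumn = 42;     -- placeholder predicate
    GO
    SET SHOWPLAN_XML OFF;
    GO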

Also, you can run SET STATISTICS IO ON and SET STATISTICS TIME ON prior to a SELECT to see the reads, CPU time, etc.
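A minimal sketch of how those session options are typically used (the SELECT and dbo.BigTable are placeholders for your actual query):

    SET STATISTICS IO ON;    -- per-table logical/physical reads in the Messages tab
    SET STATISTICS TIME ON;  -- parse/compile and execution CPU + elapsed time

    SELECT COUNT(*)
    FROM dbo.BigTable;       -- placeholder for your actual query

    SET STATISTICS IO OFF;
    SET STATISTICS TIME OFF;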
dthansen (Author) commented:
Thank you for your response, lcohan.

I'm not looking to improve performance of a query in this scenario. I'm looking to estimate runtime.

lcohan (Database Analyst) commented:
No, you can't, in my opinion, and that's the reason Microsoft did not put anything like what you mentioned (such as a progress bar) in SSMS: it's practically impossible, since it depends on so many factors.
The Estimated Execution Plan, however, should give you a pretty good idea of the impact (including reads - just hover over the tables involved), even though "I'm not looking to improve performance of a query" - in my opinion you must, if a SELECT (reads, as you said) takes 15 minutes to complete. HTH.
Jim Horn (Microsoft SQL Server Developer, Architect, and Author) commented:
lcohan is correct in that what you are asking (estimate query stats without actually running it) is not possible in SQL Server.

I have a few buddies who have been in similar situations, and they handled it by putting the entire process into a loop that executed 'blocks' of 50,000 records each.  Right before the loop moved on to the next 50,000 rows, there was an INSERT INTO an auditing table that had columns such as page reads, execution time, size of tempdb, etc.  This way, developers can monitor the big honkin' query's execution performance stats in 'blocks' of 50,000 rows, instead of having to wait for the entire query to execute.
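A rough sketch of that pattern (the table and column names - dbo.SourceTable, dbo.TargetTable, dbo.BatchAudit, Id, Payload - are placeholders, not the exact setup they used):

    CREATE TABLE dbo.BatchAudit
    (
        BatchNo    int       NOT NULL,
        RowsMoved  int       NOT NULL,
        StartedAt  datetime2 NOT NULL,
        FinishedAt datetime2 NOT NULL
    );

    DECLARE @batch int = 0,
            @rows  int = 1,
            @start datetime2;

    WHILE @rows > 0
    BEGIN
        SET @batch += 1;
        SET @start  = SYSDATETIME();

        -- Move the next block of 50,000 rows
        INSERT INTO dbo.TargetTable (Id, Payload)
        SELECT TOP (50000) s.Id, s.Payload
        FROM dbo.SourceTable AS s
        WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetTable AS t WHERE t.Id = s.Id)
        ORDER BY s.Id;

        SET @rows = @@ROWCOUNT;

        -- Log this block; you could also capture logical reads, tempdb size, etc.
        INSERT INTO dbo.BatchAudit (BatchNo, RowsMoved, StartedAt, FinishedAt)
        VALUES (@batch, @rows, @start, SYSDATETIME());
    END;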

Good luck.
Scott Pletcher (Senior DBA) commented:
You can use SQL's "Query Governor" to block queries that SQL "thinks" will run longer than the maximum time limit you specify.
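For example (a sketch; 900 is an arbitrary limit, and the value is based on the optimizer's estimated cost rather than actual wall-clock time):

    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    -- Reject any query whose ESTIMATED cost exceeds 900
    EXEC sp_configure 'query governor cost limit', 900;
    RECONFIGURE;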

Otherwise, you would have to rely on the estimated query plan.  You could capture the XML version of the plan and attempt to do analysis on it to compute a rough idea of the total I/O required, but it's not going to be 100% accurate, of course.
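As a rough sketch, assuming you have already captured the estimated plan into an XML variable (for example, saved from SSMS or pulled from sys.dm_exec_query_plan), you could pull the per-operator estimates out of it like this:

    DECLARE @plan xml;
    -- Assumption: @plan holds the estimated plan XML captured earlier.
    -- SET @plan = N'<ShowPlanXML ...>...</ShowPlanXML>';

    ;WITH XMLNAMESPACES (DEFAULT 'http://schemas.microsoft.com/sqlserver/2004/07/showplan')
    SELECT op.value('@PhysicalOp', 'varchar(128)')         AS physical_op,
           op.value('@EstimateRows', 'float')              AS estimated_rows,
           op.value('@EstimateIO', 'float')                AS estimated_io,
           op.value('@EstimatedTotalSubtreeCost', 'float') AS est_subtree_cost
    FROM @plan.nodes('//RelOp') AS x(op);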