dthansen

asked on

Predict Reads Prior to SQL Query

Is there a way to predict ahead of time the number of reads a query needs to perform?

Example: I just ran a query that took roughly 15 minutes and performed 600,000 reads. Prior to running it, I had no idea how long it would take, but if there were a way to estimate the number of reads in advance, I could estimate the runtime by watching how many reads per minute it performed over the first few minutes.

I'm not looking for something as slick as a progress bar, but if there is any reliable method to estimate the runtime of a query, that would be fantastic.

Thanks,
Dean
lcohan

I suggest you use SSMS to assess the query's performance: missing indexes, execution plan, reads, etc.
I would look at the "Estimated Execution Plan", or just run the query manually and examine the actual execution plan; then you should be able to see why it takes 15 minutes to run. Most likely there is a missing index, or you may have a function in the WHERE clause that invalidates an index.

If the query is quick in SSMS, then you get the idea.

Also you can run

SET STATISTICS IO ON
SET STATISTICS TIME ON

prior to a SELECT to see the reads, CPU, etc.
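For instance, a minimal sketch (dbo.Orders and its columns are placeholder names; substitute your own query, and note the numbers in the output comments are illustrative, not real measurements):

```sql
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- Placeholder query; replace with the real 15-minute query
SELECT OrderID, OrderDate
FROM dbo.Orders
WHERE OrderDate >= '20120101';

-- The Messages tab then reports lines such as:
--   Table 'Orders'. Scan count 1, logical reads ..., physical reads ..., ...
--   SQL Server Execution Times: CPU time = ... ms, elapsed time = ... ms.

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```

The logical/physical read counts in the Messages tab are the same figures you would otherwise dig out of a trace, so this is a cheap way to measure a query's I/O on a smaller test run.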
dthansen

ASKER

Thank you for your response lcohan.

I'm not looking to improve performance of a query in this scenario. I'm looking to estimate runtime.

Thanks,
Dean
ASKER CERTIFIED SOLUTION
lcohan

lcohan is correct in that what you are asking (estimate query stats without actually running it) is not possible in SQL Server.

I have a few buddies who have been in similar situations, and they handled it by wrapping the entire process in a loop that executed 'blocks' of 50,000 records each.  Right before the loop moved on to the next 50,000 rows, there was an INSERT INTO an auditing table with columns such as page reads, execution time, size of tempdb, etc.  This way, developers can monitor the big honkin' query's performance stats in 50,000-row blocks, instead of having to wait for the entire query to execute.
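A rough sketch of that pattern, assuming an integer key on the source table; every object name here (dbo.BigTable, dbo.ResultTable, dbo.QueryAudit) is a hypothetical placeholder:

```sql
-- Process rows in blocks of 50,000, logging progress after each block.
DECLARE @BatchSize int = 50000,
        @LastID    int = 0,
        @Rows      int = 1;
DECLARE @Batch TABLE (ID int);

WHILE @Rows > 0
BEGIN
    -- Copy the next block, capturing the keys we just processed
    INSERT INTO dbo.ResultTable (ID, Val)
    OUTPUT inserted.ID INTO @Batch (ID)
    SELECT TOP (@BatchSize) ID, Val
    FROM dbo.BigTable
    WHERE ID > @LastID
    ORDER BY ID;

    SET @Rows = @@ROWCOUNT;

    IF @Rows > 0
    BEGIN
        SELECT @LastID = MAX(ID) FROM @Batch;
        DELETE FROM @Batch;

        -- Audit row: timestamp, how far we got, and rows in this block;
        -- add page reads, tempdb size, etc. as extra columns if needed
        INSERT INTO dbo.QueryAudit (LoggedAt, LastID, RowsInBlock)
        VALUES (SYSDATETIME(), @LastID, @Rows);
    END
END
```

Polling dbo.QueryAudit while the loop runs then gives you a rows-per-minute rate, from which total runtime can be extrapolated.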

Good luck.
You can use SQL Server's "query governor" to prevent queries that SQL "thinks" will run longer than the maximum time limit you specify from running at all.

Otherwise, you would have to rely on the estimated query plan.  You could capture the XML version of the plan and analyze it to compute a rough idea of the total I/O required, but it's not going to be 100% accurate, of course.
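As a sketch of both options (the cost limit value of 300 is arbitrary, and the SELECT is just a stand-in for the real query):

```sql
-- Per-session query governor: SQL Server refuses to start any query whose
-- estimated cost exceeds this limit (cost correlates with estimated seconds).
SET QUERY_GOVERNOR_COST_LIMIT 300;

-- Capture the estimated plan as XML without executing the query:
SET SHOWPLAN_XML ON;
GO
SELECT name FROM sys.objects;  -- stand-in for the real query
GO
SET SHOWPLAN_XML OFF;
GO
```

With SHOWPLAN_XML on, the statement is compiled but not executed, and the result set contains the plan XML, whose EstimateRows and cost attributes you could parse for a rough I/O estimate.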