I have a web application that has a dashboard-like main menu. When the user first logs in, we do a database pull from SQL Server 2005 and then store the resulting DataTable in a Cache object. The page then displays the dashboard results from the cached data. As long as the cache is there, we do not pull the data from the database again. Certain events that would cause a refresh of the dashboard simply clear the cache, and that way the data is up to date, but with minimal database interaction.
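For context, the load-or-cache logic is roughly the sketch below. Names like GetDashboardTable and the "DashboardData" key are illustrative, not our actual code, and a Dictionary stands in for the ASP.NET Cache object so the sketch is self-contained:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

// Sketch of the cache-aside pattern described above. A Dictionary stands in
// for the ASP.NET Cache object; key and method names are illustrative.
public static class DashboardCache
{
    private static readonly Dictionary<string, DataTable> Cache =
        new Dictionary<string, DataTable>();
    private const string CacheKey = "DashboardData";

    // Stand-in for the real stored-procedure call (SqlDataAdapter.Fill).
    public static Func<DataTable> FillFromDatabase = () => new DataTable("Dashboard");

    public static int DatabaseHits;  // counts how often we actually query

    public static DataTable GetDashboardTable()
    {
        DataTable table;
        if (!Cache.TryGetValue(CacheKey, out table))
        {
            DatabaseHits++;
            table = FillFromDatabase();   // expensive path (~8 s on production)
            Cache[CacheKey] = table;
        }
        return table;                     // cached path (< 1 s)
    }

    // Events that change dashboard data just drop the cache entry,
    // forcing the next request to re-query.
    public static void InvalidateDashboard()
    {
        Cache.Remove(CacheKey);
    }
}
```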
Many times the query returns only 10 rows, but in cases where it returns 500 rows, the page on our production server can take almost 8 seconds to load if we have to do the database pull. That 8 seconds comes from the time-taken field in our IIS logs, so it accounts for the latency of the data transmission.
However, if we load that same page after the data is cached, it takes less than a second. Again, this number comes from the server logs, which show a 7-second performance hit when we do the database call. The obvious answer would be that the stored procedure is slow, but if we run that exact query on the production server in SQL Server Management Studio, it returns results in less than 500 milliseconds.
So I have a page that runs 7 seconds slower when there is a database call, but that call itself only takes 0.5 seconds. Where are the other 6.5 seconds coming from?
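One way we could try to split that time is to wrap each stage of the page's data pull in a Stopwatch and log the splits. This is only a sketch: TimeSection is a hypothetical helper, and the commented calls at the bottom are placeholders for our real connection-open, Fill, and databind steps.

```csharp
using System;
using System.Diagnostics;

// Hypothetical helper for timing one stage of the page's data access.
public static class StageTimer
{
    public static long TimeSection(string label, Action stage)
    {
        Stopwatch sw = Stopwatch.StartNew();
        stage();
        sw.Stop();
        Console.WriteLine("{0}: {1} ms", label, sw.ElapsedMilliseconds);
        return sw.ElapsedMilliseconds;
    }
}

// Usage inside the page's data-access code (placeholders, not real objects):
//   StageTimer.TimeSection("open connection", () => conn.Open());
//   StageTimer.TimeSection("adapter.Fill",    () => adapter.Fill(table));
//   StageTimer.TimeSection("databind",        () => grid.DataBind());
```

If the splits were logged this way, they would show whether the missing seconds are spent opening the connection, reading the result set into the DataTable, or rendering.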
Our theory is that because we are putting the data into a DataTable, we are taking a hit from .NET loading the query results into the table. Here's what we found as we tried to test that:
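To get a rough upper bound on the DataTable cost in isolation, we can time building a 500-row table purely in memory. The columns below are made up, and this deliberately skips the network and ADO.NET entirely, so it measures only row construction:

```csharp
using System;
using System.Data;
using System.Diagnostics;

// Micro-test of the theory above: how long does it take .NET to put
// 500 rows into a DataTable, with no database involved at all?
public static class DataTableFillTest
{
    public static long TimeFill(int rowCount)
    {
        DataTable table = new DataTable("Dashboard");
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));
        table.Columns.Add("Updated", typeof(DateTime));

        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < rowCount; i++)
        {
            table.Rows.Add(i, "row " + i, DateTime.Now);
        }
        sw.Stop();
        return sw.ElapsedMilliseconds;
    }
}
```

If 500 in-memory rows take only milliseconds (as we would expect), the DataTable construction alone probably cannot account for multiple seconds, which would point the finger elsewhere: connection setup, the adapter reading the result set over the network, or databinding.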
If I watch Performance Monitor on the production server while we simply run the query from SQL Server Management Studio, we see a slight, momentary increase in CPU utilization. But if we load the web page that makes the database call, CPU utilization maxes out for about twice as long. That would imply the problem is in the .NET processing, as suspected.
However, if I run that same test watching per-process CPU utilization on the Processes tab of Task Manager on the production server, then when we load the web page that makes the database call, CPU utilization for the SQL Server process increases substantially, while CPU utilization for the .NET worker process does not noticeably increase. I would have thought that if the problem were .NET straining to create the DataTable, the .NET worker process would jump in CPU utilization, not SQL Server's process.
Any ideas on what is happening or how to narrow down the problem would be greatly appreciated.