donaldwyatt

asked on

Why is SQL Profiler performance different for same procedure called from Query Analyzer vs. ASP.net?

I have a stored procedure that, when I run it from SQL Server Management Studio and monitor it with SQL Profiler, gives the following results:

CPU: 343
Reads: 126133
Duration: 514

However, if I call that same procedure from an ASP.NET application and monitor it, I get these results:

CPU: 6719
Reads: 1783186
Duration: 6720

The "TextData" field is EXACTLY the same between the two. The only differences I see are:

1. The application name ("Microsoft SQL Server Management Studio - Query" vs. ".Net SqlClient Data Provider")
2. The "Event Class" ("SQL:BatchCompleted" vs. "RPC:Completed")

No matter how many times or in what order I test this, the dramatic difference in results is the same.

What could account for this drastic difference in CPU time, reads, and duration for the exact same procedure call?
Cedric_D

It could be caused by the execution plan being recalculated for the RPC call.

Please issue
dbcc dropcleanbuffers
dbcc freeproccache

in Studio and measure the first execution.

Also, does your proc return many rows?
donaldwyatt

ASKER

Thanks for the reply. I ran those two statements, followed by the procedure call in Studio. Here are the results on those same three metrics:
 
CPU: 593
Reads: 126239
Duration: 4987
 
So, after that change, while the duration increased, the CPU and reads are still about the same as the other runs from Studio -- and still way off from the .NET call.
 
Oh, sorry I left out the row count. It returns 570 rows.
From http://weblogs.sqlteam.com/dang/archive/2008/02/18.aspx I understand that the RPC method is used instead of a batch when you pass parameters.


As an experiment, could you modify your ASP.NET code so it does not execute your procedure the "correct" way (MyTableID = @MyTableID, selectCommand.CommandType = CommandType.StoredProcedure),

but instead builds a single SQL statement: "exec sp_Proc " + Param1.ToString() + ",  " + ...

Cedric,

Sorry for the delayed response over the weekend.
 
I created the statement as you asked, and it does show up as a SQL batch in Profiler, as expected. Unfortunately, I have a complicating factor -- when I ran the original procedure call from the web (the one that showed the problems in the past), it ran almost perfectly, with CPU, duration, and read values almost the same as when run from Studio.
 
I go through periods where everything seems to work just fine, and then it slows back down and the discrepancy between the calls from Studio and ASP.NET reappears. I'll continue to monitor it, and once things slow back down, I'll run the properly parameterized version against the concatenated-string version to compare the results.
 
In the meantime, if you have any ideas what could cause this unexplained variability, please let me know. There have been no server or other service reboots that we have initiated, so I can't figure out what causes the change.
 
Thanks!
 
 
ASKER CERTIFIED SOLUTION
Cedric_D
The procedure takes 2 parameters. Both are ints. For the purposes of this testing, I have been using the same parameters each time, so the results should be on equal footing.
Cedric_D,

Oh, after doing a bit more research, I think I see what you mean -- that depending on the parameters, the procedure is sometimes optimized for a different set of values, which explains the differing reads and performance?
 
Working under that idea, I ran some rough tests: issuing the dbcc dropcleanbuffers and dbcc freeproccache statements and then re-testing. Doing that sometimes makes the slow call from ASP.NET run faster, but it's always fast from Studio.
 
Are execution plans ever dropped on their own? Would SQL Server use a different plan if the call is coming from ASP.NET vs. Studio? I'm just struggling to understand the reason for the difference between the two.
 
Thanks for your help!
OK, so this explains how I could have differing performance because of different execution plans when called from different sources: http://www.mssqltips.com/tip.asp?tip=1304

For now, I have added an OPTIMIZE FOR hint for the particular account that has the greatest slowdown with this procedure. It causes the smaller accounts to execute this query a little slower, but the difference for them is only very slight, so it's worth the tradeoff.

Do you have any ideas about a better way of handling this as a long-term solution? I guess I could create two procedures -- one for large accounts and one for small, each with the appropriate optimizations -- and then have ASP.NET call the appropriate one. That doesn't seem terribly clean, though. Any thoughts?

Thanks!
I'd prefer not to create different procedures, but instead modify the existing one.

Sometimes, after investigating the execution plans, I make an IF-fork inside the SP based on the input parameters and write a slightly different statement for different inputs; this can help greatly. A sketch follows below.