DigitechComputer asked:
Stored Procedure Much Slower in .NET SqlCommand than in SSMS

I have a stored procedure executing an INSERT statement that we are seeing significant delays from.  When running it from our C# .NET application to insert 30 records in a row, it takes roughly 4 seconds total to complete (counting only the time it takes to run the SqlCommand.ExecuteNonQuery() method).  However, calling the same exact stored procedure from within SQL Server Management Studio the same number of times takes only about 0.4 seconds.  I can't figure out what's different between the two setups that would make a 10x difference in speed.

I have tried all of the following with no noticeable change in speed:
1.  Creating the stored procedure "WITH RECOMPILE"
2.  Checking all of the "SET" values configured within SSMS and C#.  The only difference was SET ARITHABORT, which was ON in SSMS and OFF when called from the .NET application.  Adding "SET ARITHABORT ON" to the start of the stored procedure made no difference, though.
3.  Removing all default values from the sproc parameters
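One way to verify which SET options are actually in effect on each connection (rather than what the proc sets after compilation) is to query the sessions DMV while both SSMS and the app are connected; `program_name` identifies which session is which. A sketch against the SQL 2005+ DMV:

```sql
-- Run while both SSMS and the .NET app are connected; each row is a session
-- and the bit columns show the SET options in effect for that session.
SELECT session_id, program_name, arithabort, ansi_nulls, quoted_identifier
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;
```

Note that a difference in ARITHABORT means the two connections get separate cached plans, which is a classic cause of "fast in SSMS, slow from the app" when parameter sniffing produces a bad plan for one of them.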

The code used to call the stored procedure from the .NET application is:

using (SqlConnection newConn = new SqlConnection(connectionString))
{
   using (SqlCommand uCmd = new SqlCommand("sproc_name", newConn))
   {
      uCmd.CommandType = CommandType.StoredProcedure;
      uCmd.Connection.Open();
      
      //About 15 parameters added using:
      uCmd.Parameters.AddWithValue("@ParamName", value);
      ...

      //One output parameter
      SqlParameter paramOUT = new SqlParameter("@OutPutKey", SqlDbType.UniqueIdentifier);
      paramOUT.Direction = ParameterDirection.Output;
      uCmd.Parameters.Add(paramOUT);

      uCmd.ExecuteNonQuery();
      uCmd.Connection.Close();
   }
}
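One thing worth checking in the call above: `AddWithValue` infers the parameter type from the .NET value (strings become `nvarchar`, for example), and a mismatch with the column type can cause implicit conversions on the server. A hedged sketch using an explicitly typed parameter instead (the `SqlDbType` and size here are illustrative assumptions, not your actual schema):

```csharp
// Explicitly typed parameter; the type and size are illustrative assumptions.
// This avoids AddWithValue's type inference and any server-side conversion.
uCmd.Parameters.Add("@ParamName", SqlDbType.VarChar, 50).Value = value;
```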

The stored procedure itself is just a few SET commands (SET ANSI_NULLS ON, SET QUOTED_IDENTIFIER ON, SET ARITHABORT ON), a list of non-defaulted parameters, and the assignment of the output variable that becomes the new uniqueidentifier primary key, followed by the INSERT statement itself.

The application is built on .NET 4 and the database server is MS SQL Server 2005.
SOLUTION
Lowfatspread (United Kingdom of Great Britain and Northern Ireland):

Hi,

You should trace the execution of the proc in Profiler and view the execution plan that the .NET app connection receives, then compare it with the execution plan you get when you execute the proc directly from SSMS.

Aaron.
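Along the same lines, on SQL 2005 the cached plan(s) for the proc can be pulled straight from the plan cache and compared, without setting up a trace (a sketch; `dbo.sproc_name` stands in for the real proc name):

```sql
-- Each row is a cached plan for the proc; more than one row can mean the
-- two connections compiled separate plans (e.g. due to differing SET options).
SELECT qs.execution_count, qs.total_elapsed_time, qp.query_plan
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_query_plan(qs.plan_handle) AS qp
WHERE qp.objectid = OBJECT_ID('dbo.sproc_name');
```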
SOLUTION
DigitechComputer (ASKER):

It turns out there was an issue in my initial testing environment, and the execution times of the insert stored procedure from SSMS and .NET are actually about the same.  Both are still slower than I'd like them to be, though.

I looked into adding them all at once as you suggested.  The application is typically saving 1-30 (usually fewer) business objects at a time, calling a different insert stored procedure for each.  I don't think I would benefit from any sort of bulk inserting, but maybe you have something else in mind.

I did try having the calls to these sprocs all share a single SqlConnection, opened once at the start and closed once when all of the inserts are done.  I also changed like business objects to share the same SqlCommand object, just updating the parameter values instead of redeclaring them.  These changes gave me about a 25% boost in performance.  Do you have any other ideas for how this setup can be improved?
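For reference, the reuse pattern described above looks roughly like this (a sketch only; the collection, property names, and parameter types are assumptions, not the actual schema):

```csharp
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand("sproc_name", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    // Declare parameters once; the types here are illustrative.
    cmd.Parameters.Add("@Field1", SqlDbType.UniqueIdentifier);
    cmd.Parameters.Add("@Field2", SqlDbType.DateTime);

    conn.Open();
    foreach (var item in itemsToSave)   // hypothetical collection
    {
        // Only the values change between calls.
        cmd.Parameters["@Field1"].Value = item.Field1;
        cmd.Parameters["@Field2"].Value = item.Field2;
        cmd.ExecuteNonQuery();
    }
}
```

Wrapping the whole batch in a single SqlTransaction may also help, since committing once instead of per-statement reduces log flushes.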

Here is an example of the insert stored procedure I use:

alter procedure InsertStuff
	@Field1 uniqueidentifier,
	@Field2 datetime,
	...
	@CreateDate datetime,
	@PrimaryKEY uniqueidentifier OUTPUT
AS

declare @newCreateDate datetime 
set @newCreateDate=getDate()

set @PrimaryKEY = NEWID()

	INSERT INTO [dbo].[Table]
	(
		Field1,
		Field2,
		...
		CreateDate,
		PrimaryKEY
	)
	VALUES
	(
		@Field1,
		@Field2,
		...
		@newCreateDate,
		@PrimaryKEY
	) 

SOLUTION
ASKER CERTIFIED SOLUTION
I will call getdate() directly in the VALUES list.  I had thought that calling a function in an INSERT wasn't allowed, as is the case with sproc parameters.  I should also be able to eliminate the output parameter by creating the primary key value on the .NET side.
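The two changes described can be sketched like this on the .NET side (the parameter name matches the sproc above; treat the snippet as illustrative):

```csharp
// Create the primary key client-side instead of reading it back through
// an OUTPUT parameter.
Guid primaryKey = Guid.NewGuid();
uCmd.Parameters.Add("@PrimaryKEY", SqlDbType.UniqueIdentifier).Value = primaryKey;
// ...and in the sproc, getdate() can then be used directly in the VALUES
// list, eliminating the @newCreateDate variable and the OUTPUT parameter.
```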

The table in question doesn't use triggers, but I was able to eliminate 3 mostly-unneeded indexes.  

It's very hard to tell, since the more I run this query the more SQL Server optimizes it, but unfortunately these changes don't seem to have made much of a difference.  It even seems to be taking slightly longer to save now, though that's almost certainly because SQL Server hasn't fully optimized the new structures yet.  A single call to the sproc with empty parameter values currently takes about 25ms, which still seems higher than it should be.  Is there anything else I can look at to improve performance, other than hardware and network setup?

So it turns out that having a clustered index on a non-sequential uniqueidentifier field is a bad idea.  Changing the clustered index to a sequential field like CreateDate made a very noticeable improvement in save times.  Combined with your other suggestions, INSERTs are performing a lot better than they used to.  Thanks to all for your suggestions!
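For anyone finding this later, the index change described amounts to something like the following (the constraint and index names are assumptions about the schema):

```sql
-- Move the clustered index off the random GUID key onto a sequential column,
-- so new rows append at the end instead of causing page splits.
ALTER TABLE dbo.[Table] DROP CONSTRAINT PK_Table;
ALTER TABLE dbo.[Table] ADD CONSTRAINT PK_Table
    PRIMARY KEY NONCLUSTERED (PrimaryKEY);
CREATE CLUSTERED INDEX CIX_Table_CreateDate ON dbo.[Table] (CreateDate);
```

An alternative on SQL 2005 is NEWSEQUENTIALID() as the column default, which generates mostly-sequential GUIDs; note it can only be used as a column default, not assigned to a variable inside the proc.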