Solved

Why is my stored procedure so slow?

Posted on 2002-03-19
514 Views
Last Modified: 2008-02-01
I am new to stored procedures for adding/updating data with ADO from VB.  I have a very basic stored procedure running on MS-SQL 2000 on a fast machine with 1 GB of RAM.  From Query Analyzer it takes less than a second to execute, but from VB it takes about 7-9 seconds.

Stored Procedure
------------------
ALTER      PROCEDURE [dbo].[sp_ADD_REP]
  @REPID char(3),
  @REPNAME VARCHAR(50),
  @REPCONTACT varchar(30) = NULL,
  @REPADDR1 varchar(40) = NULL,
  @REPADDR2 varchar(40) = NULL,
  @REPCITY varchar(25) = NULL,
  @REPSTATE char(2) = NULL,
  @REPZIP varchar(10) = NULL,
  @REPCOUNTRY varchar(20) = NULL,
  @REPPHONE varchar(20) = NULL,
  @REPFAX varchar(20) = NULL,
  @REPEMAIL1 varchar(50) = NULL,
  @REPEMAIL2 varchar(50) = NULL,
  @REPWEBSITE varchar(60) = NULL,
  @REPVENDID varchar(12) = NULL,
  @REPCOMMISSION decimal (8,2) = NULL
AS
set nocount on
declare @IDCount int
set @IDCount = (SELECT COUNT(REPID) FROM REPS WHERE REPID = @REPID)
if @IDCount = 0
begin
  INSERT INTO REPS (REPID, REPNAME) VALUES (@REPID, @REPNAME)
end

IF @REPID IS NOT NULL
BEGIN
  UPDATE REPS SET REPNAME = @REPNAME, REPCONTACT = @REPCONTACT,
                  REPADDR1 = @REPADDR1, REPADDR2 = @REPADDR2,
                  REPCITY = @REPCITY, REPSTATE = @REPSTATE,
                  REPZIP = @REPZIP, REPCOUNTRY = @REPCOUNTRY,
                  REPPHONE = @REPPHONE, REPFAX = @REPFAX,
                  REPEMAIL1 = @REPEMAIL1, REPEMAIL2 = @REPEMAIL2,
                  REPWEBSITE = @REPWEBSITE, REPVENDID = @REPVENDID,
                  REPCOMMISSION = @REPCOMMISSION
  WHERE REPS.REPID = @REPID
END


VB Code
-----------

Public Sub Import_Rep_IDs()
  Dim rsFSReps As ADODB.Recordset
  Dim rsReps As New ADODB.Recordset
  Dim cmFSReps As ADODB.Command
  Dim cmReps As ADODB.Command
  Dim prm As ADODB.Parameter
 
  Stat "Setting up for Sales Rep Import"
  Set cmFSReps = New ADODB.Command
  cmFSReps.CommandText = "FS_ComCode"
  cmFSReps.CommandType = adCmdStoredProc
  cmFSReps.ActiveConnection = cnRpt
  cmFSReps.Parameters.Refresh
 
  Set cmReps = New ADODB.Command
  cmReps.CommandText = "sp_ADD_REP"
  cmReps.CommandType = adCmdStoredProc
  cmReps.ActiveConnection = cnRpt
  Set rsFSReps = cmFSReps.Execute
  Do While Not rsFSReps.EOF
    Stat rsFSReps!REPID & " - " & rsFSReps!REPNAME
    Set prm = cmReps.CreateParameter("@REPID", adChar, adParamInput, 3, Trim(rsFSReps!REPID))
    cmReps.Parameters.Append prm
    Set prm = cmReps.CreateParameter("@REPNAME", adVarChar, adParamInput, 50, Trim(rsFSReps!REPNAME))
    cmReps.Parameters.Append prm
    cmReps.Execute
    rsFSReps.MoveNext
  Loop
  Stat ""
  On Error Resume Next
  rsFSReps.Close
  rsReps.Close
  Set cmFSReps = Nothing
  Set rsReps = Nothing
  Set rsFSReps = Nothing
End Sub

Why is this not executing as swiftly as I anticipated?  All the execution time is on the VB statement "cmReps.Execute".  I am running it across a WAN link (T1 on the DB server and 768x256 DSL on the workstation), but using an SP is about 10 times slower than using the Recordset AddNew/Update process.

Question by DRRYAN3

19 Comments
 
Expert Comment by barx (LVL 1)
I am not sure what you are trying to do but I can offer some input.

Possibly you need to move your cmReps.Execute out of the loop, i.e.

Set rsFSReps = cmFSReps.Execute
 Do While Not rsFSReps.EOF
   Stat rsFSReps!REPID & " - " & rsFSReps!REPNAME
   Set prm = cmReps.CreateParameter("@REPID", adChar, adParamInput, 3, Trim(rsFSReps!REPID))
   cmReps.Parameters.Append prm
   Set prm = cmReps.CreateParameter("@REPNAME", adVarChar, adParamInput, 50, Trim(rsFSReps!REPNAME))
   cmReps.Parameters.Append prm
   rsFSReps.MoveNext
 Loop
   cmReps.Execute
 Stat ""

How many records does rsFSReps contain?  I think this is the key to your problem.  You appear to be executing sp_ADD_REP several times, which would explain why it runs slower in VB than in Query Analyzer.


Expert Comment by DanRollins (LVL 49)
A quick sanity check: set the SP to simply return, then time the VB function.  That is your "control", since it times just the network activity and other overhead.
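For instance (a sketch; the parameter list is trimmed for brevity, and you would put the real body back afterwards):

ALTER PROCEDURE [dbo].[sp_ADD_REP]
  @REPID char(3),
  @REPNAME varchar(50)
AS
  -- Do nothing: any time the VB loop still takes with this in place
  -- is pure network/ADO overhead, not the procedure's own work.
  SET NOCOUNT ON
  RETURN

On the VB side, a t = Timer before the loop and a Debug.Print Timer - t after it gives you the baseline number to compare against.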

But I agree with barx that you are making a lot of round trips that clever programming could avoid.

-- Dan

Author Comment by DRRYAN3 (LVL 12)
I am still in the process of converting from Access to SQL and I am not happy with overall performance on updates.  In this situation, I have an SP which pulls an ID and name from my MRP system (FS_COM_CODE) and I need to insert that data into another database on the same server where additional information related to that ID is stored.  This is a small set of data (30-40 records), but I need to do the same thing with inventory items (3000 items) and customers (tens of thousands).

barx - your code would have me appending parameter after parameter to the cmReps command without ever sending them to the server.  Read the code.

DanRollins - If I comment out one line in the VB program (cmReps.Execute), all is well, so the problem doesn't seem to be in retrieving the recordset from FS_COM_CODE.

I put that line back and modified the SP so that everything below "declare @IDCount int" is commented out, and things are STILL very slow.  This can't all be network - I've monitored activity on that segment and there's a little burst as I execute the SP and then a long (for computers) pause of no activity.

Suggestions?

Author Comment by DRRYAN3 (LVL 12)
I've also found that using a second connection to the database has tripled the speed.

Accepted Solution by Göran Andersson (LVL 29), earned 50 total points
I'm not sure how this will affect it, but you could simply try this:

Public Sub Import_Rep_IDs()
   Dim rsFSReps As ADODB.Recordset
 
   Stat "Setting up for Sales Rep Import"
   Set rsFSReps = cnRpt.Execute("FS_ComCode")
   Do While Not rsFSReps.EOF
      Stat rsFSReps!REPID & " - " & rsFSReps!REPNAME
      cnRpt.Execute "sp_ADD_REP '" & Trim(rsFSReps!REPID) & "','" & Replace(Trim(rsFSReps!REPNAME),"'","''") & "'"
      rsFSReps.MoveNext
   Loop
   Stat ""
   rsFSReps.Close
   Set rsFSReps = Nothing
End Sub

You could also rewrite the sp like this:

ALTER PROCEDURE sp_ADD_REP
   ....
AS
IF @REPID IS NOT NULL
BEGIN
   UPDATE ...
   IF @@ROWCOUNT=0
   BEGIN
      INSERT ...
   END
END
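
Filled in with the parameters and columns from the procedure in the question, that pattern would look roughly like this (an untested sketch):

ALTER PROCEDURE [dbo].[sp_ADD_REP]
  @REPID char(3),
  @REPNAME varchar(50),
  @REPCONTACT varchar(30) = NULL,
  @REPADDR1 varchar(40) = NULL,
  @REPADDR2 varchar(40) = NULL,
  @REPCITY varchar(25) = NULL,
  @REPSTATE char(2) = NULL,
  @REPZIP varchar(10) = NULL,
  @REPCOUNTRY varchar(20) = NULL,
  @REPPHONE varchar(20) = NULL,
  @REPFAX varchar(20) = NULL,
  @REPEMAIL1 varchar(50) = NULL,
  @REPEMAIL2 varchar(50) = NULL,
  @REPWEBSITE varchar(60) = NULL,
  @REPVENDID varchar(12) = NULL,
  @REPCOMMISSION decimal(8,2) = NULL
AS
SET NOCOUNT ON
IF @REPID IS NOT NULL
BEGIN
  -- Try the update first; insert only when no existing row matched
  UPDATE REPS SET REPNAME = @REPNAME, REPCONTACT = @REPCONTACT,
                  REPADDR1 = @REPADDR1, REPADDR2 = @REPADDR2,
                  REPCITY = @REPCITY, REPSTATE = @REPSTATE,
                  REPZIP = @REPZIP, REPCOUNTRY = @REPCOUNTRY,
                  REPPHONE = @REPPHONE, REPFAX = @REPFAX,
                  REPEMAIL1 = @REPEMAIL1, REPEMAIL2 = @REPEMAIL2,
                  REPWEBSITE = @REPWEBSITE, REPVENDID = @REPVENDID,
                  REPCOMMISSION = @REPCOMMISSION
  WHERE REPID = @REPID
  IF @@ROWCOUNT = 0
  BEGIN
    INSERT INTO REPS (REPID, REPNAME, REPCONTACT, REPADDR1, REPADDR2,
                      REPCITY, REPSTATE, REPZIP, REPCOUNTRY, REPPHONE,
                      REPFAX, REPEMAIL1, REPEMAIL2, REPWEBSITE,
                      REPVENDID, REPCOMMISSION)
    VALUES (@REPID, @REPNAME, @REPCONTACT, @REPADDR1, @REPADDR2,
            @REPCITY, @REPSTATE, @REPZIP, @REPCOUNTRY, @REPPHONE,
            @REPFAX, @REPEMAIL1, @REPEMAIL2, @REPWEBSITE,
            @REPVENDID, @REPCOMMISSION)
  END
END

This drops the separate COUNT query, so each call touches the table at most twice instead of three times.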

Author Comment by DRRYAN3 (LVL 12)
GreenGhost

So far, you have had the biggest impact on performance with your first suggestion, executing the SP from the connection.  Why is this?  

Regarding suggestion 2 - I am trying to use the same SP for adding new records as well as updating existing records.  I see where your process skips the count step and only inserts if the update fails to match any rows.  I'll play with it.

Is there a limit to the number of parameters I can pass this way?

DRRYAN3

Expert Comment by Göran Andersson (LVL 29)
Sometimes creating objects is expensive. Up to a whole millisecond... For computers, that's equal to three or four eternities... ;)

I have run really huge queries (like a select with 20-30 joins) without running into any limit. I guess the ever-present 8000 character limit applies here too...

Something that could really speed things up would be to join the two stored procedures into one. You would lose the running status display, though...
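
For example, if FS_ComCode is ultimately just selecting REPID and REPNAME from a table in the MRP database (the name MRPDB.dbo.COM_CODE below is purely a placeholder), a single set-based procedure could do the whole import in one round trip, something like:

CREATE PROCEDURE dbo.sp_IMPORT_REPS
AS
SET NOCOUNT ON

-- Refresh the names of reps that already exist
UPDATE R
   SET R.REPNAME = C.REPNAME
  FROM REPS R
  JOIN MRPDB.dbo.COM_CODE C ON C.REPID = R.REPID

-- Add the reps that are missing
INSERT INTO REPS (REPID, REPNAME)
SELECT C.REPID, C.REPNAME
  FROM MRPDB.dbo.COM_CODE C
 WHERE NOT EXISTS (SELECT 1 FROM REPS R2 WHERE R2.REPID = C.REPID)

The trade-off is that the per-record status display on the client goes away.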

Expert Comment by allian11
Creating a Parameter object is a fairly expensive operation, so if you move the parameter creation statements out of the loop and only assign the values inside the loop, as below, it may help.

Set prm = cmReps.CreateParameter("@REPID", adChar, adParamInput, 3)
cmReps.Parameters.Append prm
Set prm = cmReps.CreateParameter("@REPNAME", adVarChar, adParamInput, 50)
cmReps.Parameters.Append prm

Do While Not rsFSReps.EOF

   .Parameters("@REPID")= Trim(rsFSReps!REPID)
   .Parameters("@REPNAME")= Trim(rsFSReps!REPNAME)

   cmReps.Execute
   rsFSReps.MoveNext
   
Loop

Expert Comment by allian11
Oops! You should put cmReps (the Command object) in front of the Parameters collection.
And maybe you can close the Command object after executing it.

Do While Not rsFSReps.EOF

  cmReps.Parameters("@REPID")= Trim(rsFSReps!REPID)
  cmReps.Parameters("@REPNAME")= Trim(rsFSReps!REPNAME)

  cmReps.Execute
  rsFSReps.MoveNext
  cmReps.close
 
Loop
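
Put together, the reusable-parameters version might look like the sketch below (one caveat: the ADO Command object has no Close method, so nothing is closed inside the loop; the adExecuteNoRecords flag simply avoids building a recordset that is never used):

  Dim cmReps As ADODB.Command

  Set cmReps = New ADODB.Command
  cmReps.CommandText = "sp_ADD_REP"
  cmReps.CommandType = adCmdStoredProc
  cmReps.ActiveConnection = cnRpt

  ' Create the two parameters once, before the loop
  cmReps.Parameters.Append cmReps.CreateParameter("@REPID", adChar, adParamInput, 3)
  cmReps.Parameters.Append cmReps.CreateParameter("@REPNAME", adVarChar, adParamInput, 50)

  Do While Not rsFSReps.EOF
    ' Only the parameter values change on each pass
    cmReps.Parameters("@REPID").Value = Trim(rsFSReps!REPID)
    cmReps.Parameters("@REPNAME").Value = Trim(rsFSReps!REPNAME)
    cmReps.Execute , , adExecuteNoRecords
    rsFSReps.MoveNext
  Loop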
 
Expert Comment by DanRollins (LVL 49)
GreenGhost also simplified the logic by avoiding all of the CreateParameter stuff.  SQL is a simple text-based language and I often wonder why so many people go through all of the Parameter-setting logic when a simple string of text will do the trick.

>>... modified the SP...and things are STILL very slow

An 'empty' SP call should be almost instantaneous.

Thus your test indicates that *most* of the problem is the network transport.  So that is where to work.  Check the client settings via the ODBC control panel.  Check your security settings.  Perhaps the db server needs to jump through some security hoops -- and that can take time.

-- Dan

Expert Comment by VincentWong (LVL 2)
Mmm. I think I would minimize the round trips to just one, then split the string on the database side. You may not like this method, since you have to join the parameters into a single string.

Here is the code:

=======================================================

CREATE PROCEDURE dbo.addMe(
    -- NVARCHAR (4000)
    -- VARCHAR  (8000)
    @ItemNameList       NVARCHAR(4000)  = NULL  ,
) AS
BEGIN
    CREATE TABLE #ItemNameList (
        Recnum      DECIMAL(18,0) IDENTITY (1,1),
        ItemName    NVARCHAR(16)
    )
    INSERT INTO #ItemNameList (
        ItemName
    )
        EXECUTE getSplit @ItemNameList, '#'

    /*
        Then do all the Loop/IF EXISTS/Update/Insert logic here.
        Remark: The Recnum would be useful when you have more than one parameter.
    */

END
GO

CREATE PROCEDURE dbo.getSplit (
    @as_exp  NVARCHAR(4000),
    @as_delimeter NVARCHAR(1)
)
AS
BEGIN
    DECLARE @_counter   INTEGER
    DECLARE @_aChar     NVARCHAR(1)
    DECLARE @_aString   NVARCHAR(4000)

    CREATE TABLE #SplitStrings (
        splitted_string  NVARCHAR(4000)
    )
    SELECT @_counter = 1
    SELECT @_aString = ''

    WHILE (@_counter <= LEN(@as_exp))
        BEGIN
            SELECT @_aChar = SUBSTRING(@as_exp, @_counter, 1)

            IF @_aChar = @as_delimeter OR @_counter = LEN(@as_exp)
                BEGIN
                    IF @_counter = LEN(@as_exp)
                        BEGIN
                            SELECT @_aString = @_aString + @_aChar
                        END
                    INSERT INTO #SplitStrings (splitted_string) VALUES (@_aString)
                    SELECT @_aString = ''
                END
            ELSE
                BEGIN
                    SELECT @_aString = @_aString + @_aChar
                END

            SELECT @_counter = @_counter + 1
        END
    SELECT splitted_string FROM #SplitStrings
END
GO

Expert Comment by VincentWong (LVL 2)
oops.

CREATE PROCEDURE dbo.addMe(
   -- NVARCHAR (4000)
   -- VARCHAR  (8000)
   @ItemNameList       NVARCHAR(4000)  = NULL  **** , ****
) AS


A typo there. Please delete the **** , ****
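
On the VB side, the whole batch would then go over in one call, along these lines (a sketch; it assumes the '#' delimiter never appears in the data):

  Dim sList As String
  Dim cmAdd As ADODB.Command

  ' Build one '#'-delimited string from the source recordset
  Do While Not rsFSReps.EOF
    If Len(sList) > 0 Then sList = sList & "#"
    sList = sList & Trim(rsFSReps!REPID)
    rsFSReps.MoveNext
  Loop

  ' Send the whole list to the server in a single stored procedure call
  Set cmAdd = New ADODB.Command
  cmAdd.CommandText = "addMe"
  cmAdd.CommandType = adCmdStoredProc
  cmAdd.ActiveConnection = cnRpt
  cmAdd.Parameters.Append cmAdd.CreateParameter("@ItemNameList", adVarWChar, adParamInput, 4000, sList)
  cmAdd.Execute , , adExecuteNoRecords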

Author Comment by DRRYAN3 (LVL 12)
OK - I think I have solved today's problem and will split the points between DanRollins and GreenGhost.

Switching to passing a text string on the connection is so much faster than all that parameter crud in the ADO SDK documentation that I don't know why it's there at all.

DanRollins' comments led me to a bad static route on the W2K server hosting my remote user VPNs.  Performance picked up after fixing it.

Thanks
DRRYAN3

Expert Comment by Netminder (LVL 5)
Points reduced for split, per request.

Netminder
CS Moderator

Expert Comment by DanRollins (LVL 49)
DRRYAN3,
Please choose 'Accept Comment as Answer' on one of GreenGhost's comments and then post a 50-pt Q titled "Points For DanRollins" in the SQL Topic area.  In the question, please paste the following text:

For help in: http://www.experts-exchange.com/mssql/Q.20278642.html

Thanks!

-- Dan

Expert Comment by Göran Andersson (LVL 29)
Well, the Command object is for using output parameters and catching return values. However, I have yet to find a real need for this...

Also, by setting the Prepared property, a query plan is created for the command. This is supposed to be used when reusing the command. When the query is just a call to an sp, I doubt that you would gain anything, though.

If you have 16 parameters, but you only want to change two of them from call to call, setting the Prepared property, combined with allian11's method, might be faster. But hardly if you want to change all the parameters.
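
As a sketch, that combination is just the parameter-reuse loop from allian11's comment with one extra property set before the first Execute:

  Dim cmReps As ADODB.Command

  Set cmReps = New ADODB.Command
  cmReps.CommandText = "sp_ADD_REP"
  cmReps.CommandType = adCmdStoredProc
  cmReps.Prepared = True          ' prepare the call once and reuse the plan on later Executes
  cmReps.ActiveConnection = cnRpt
  ' ...append the parameters once, then inside the loop assign only their .Value
  '    and call cmReps.Execute for each record, as shown earlier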

Author Comment by DRRYAN3 (LVL 12)
Thanks
DRRYAN3

Expert Comment by DanRollins (LVL 49)
>>I don't know why it's there at all.

Documentation is so often like that.  Rarely-used functionality gets 'First Nation' status along with functions that are used 1000 times more often.

That said... formal Parameters *do* come in handy.  Imagine that you want to store a 300KB JPG file or even a 5K block of text... Using a Parameter lets you avoid having to escape single quotes and the like in the data.
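
For instance (a sketch; the PHOTOS table and its columns are made up for illustration), a long-binary parameter carries the raw bytes with no quoting or escaping at all:

  Dim cmPhoto As ADODB.Command
  Dim bytJpg() As Byte   ' assume this already holds the JPG file's bytes

  Set cmPhoto = New ADODB.Command
  cmPhoto.CommandText = "INSERT INTO PHOTOS (REPID, PHOTO) VALUES (?, ?)"
  cmPhoto.CommandType = adCmdText
  cmPhoto.ActiveConnection = cnRpt
  cmPhoto.Parameters.Append cmPhoto.CreateParameter("@REPID", adChar, adParamInput, 3, "001")
  cmPhoto.Parameters.Append cmPhoto.CreateParameter("@PHOTO", adLongVarBinary, adParamInput, UBound(bytJpg) + 1, bytJpg)
  cmPhoto.Execute , , adExecuteNoRecords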

-- Dan

Author Comment by DRRYAN3 (LVL 12)
Good point.