Solved

Poor performance when executing stored procedure multiple times

Posted on 2009-04-15
195 Views
Last Modified: 2012-06-27
We have an application that attempts to save a collection of objects to a SQL 2005 database using a 'save' stored procedure via the SqlCommand class. We have a performance issue: some calls of the stored procedure are slow (15-20ms) whereas others execute very quickly (<1ms). This is a problem, as we will typically save a collection of 100,000 objects.

Using SQL Server Profiler, I can see that the actual stored procedure runs very quickly every time (always <1ms), however on odd occasions, the difference between Debug.Print points 2 and 3 can be 15-20ms. Does anyone have any pointers as to why this may happen, are we doing something wrong?

We would prefer to avoid a method of bulk saving the objects (by serializing to XML or similar).
Public Sub UpdateDataBase()

    Using cn As New SqlConnection("Data Source=localhost;Initial Catalog=TestDBEmpty;User ID=******;Password=******;Network Library=dbmssocn")

        cn.Open()

        For Each item As OBJ_Bookings In Me

            Try
                'Debug.Print("Start - " & Now.ToString & " - " & Now.Millisecond)  'Debug.Print Point 1

                OBJ_Bookings.SaveToDatabase(item, cn)

                'Debug.Print("End - " & Now.ToString & " - " & Now.Millisecond)  'Debug.Print Point 4

            Catch ex As Exception
                MsgBox(ex.Message)
            End Try

        Next

    End Using

End Sub
 

Public Sub SaveToDatabase(ByVal ItemToUpdate As OBJ_Bookings, ByVal cn As SqlConnection)

    Using cmd As SqlCommand = cn.CreateCommand

        With cmd
            .CommandType = CommandType.StoredProcedure
            .CommandText = "isCRUD_BookingsSave"

            .Parameters.AddWithValue("@Id", ItemToUpdate.Id)
            .Parameters.AddWithValue("@StartTime", ItemToUpdate.StartTime.DBValue)
            .Parameters.AddWithValue("@EndTime", ItemToUpdate.EndTime.DBValue)
            .Parameters.AddWithValue("@Date", ItemToUpdate._Date_.DBValue)

            Debug.Print("Procedure Start - " & Now.ToString & " - " & Now.Millisecond) 'Debug.Print Point 2

            .ExecuteNonQuery()

            Debug.Print("Procedure End - " & Now.ToString & " - " & Now.Millisecond)  'Debug.Print Point 3
        End With

    End Using

End Sub
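One variant worth trying (a sketch only, reusing the procedure name and parameters from the code above; the SqlDbType choices are assumptions, since the column types aren't shown): create the SqlCommand and its parameter collection once, and only update the parameter values inside the loop, so the per-call setup cost is paid once rather than 100,000 times.

```vb
' Sketch: build the command once, reuse it for every item.
' Assumes the same isCRUD_BookingsSave procedure as above;
' the parameter types are guesses, adjust to the real schema.
Using cmd As SqlCommand = cn.CreateCommand
    cmd.CommandType = CommandType.StoredProcedure
    cmd.CommandText = "isCRUD_BookingsSave"
    cmd.Parameters.Add("@Id", SqlDbType.Int)
    cmd.Parameters.Add("@StartTime", SqlDbType.DateTime)
    cmd.Parameters.Add("@EndTime", SqlDbType.DateTime)
    cmd.Parameters.Add("@Date", SqlDbType.DateTime)

    For Each item As OBJ_Bookings In Me
        cmd.Parameters("@Id").Value = item.Id
        cmd.Parameters("@StartTime").Value = item.StartTime.DBValue
        cmd.Parameters("@EndTime").Value = item.EndTime.DBValue
        cmd.Parameters("@Date").Value = item._Date_.DBValue
        cmd.ExecuteNonQuery()
    Next
End Using
```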


Question by:stueyp34
5 Comments
 

Expert Comment

by:Cakinci
ID: 24147353
There is one big practical problem with the architecture of the code: according to that logic, you are committing an ExecuteNonQuery for each OBJ_Bookings line. Unfortunately, in SQL Server a bulk insert and a row-by-row insert perform very differently.

If you aggregate all the queries, separated by newlines, into one huge string (with a StringBuilder, of course) and then issue only one ExecuteNonQuery for all the objects, that will surely give a much greater performance increase (bulk insert).

But if you commit an ExecuteNonQuery statement for each line, it will take much longer.

Hope this info gives you an idea about the problem.
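The batching idea above could be sketched like this (a rough illustration only; the parameter values are inlined purely for brevity, which is unsafe with untrusted input, and the procedure name is taken from the question):

```vb
' Sketch: concatenate many EXEC statements into one batch and send
' them in a single round trip. Inlined values are for illustration
' only; real code should parameterize or escape them.
Dim sb As New StringBuilder()
For Each item As OBJ_Bookings In bookings
    sb.AppendLine(String.Format( _
        "EXEC isCRUD_BookingsSave @Id={0}, @StartTime='{1}', @EndTime='{2}', @Date='{3}'", _
        item.Id, item.StartTime.DBValue, item.EndTime.DBValue, item._Date_.DBValue))
Next

Using cmd As SqlCommand = cn.CreateCommand
    cmd.CommandType = CommandType.Text
    cmd.CommandText = sb.ToString()
    cmd.ExecuteNonQuery()  ' one round trip for the whole batch
End Using
```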

 

Author Comment

by:stueyp34
ID: 24147582
Thanks for the comment.

I agree that performing an ExecuteNonQuery for each individual object is inefficient compared to a bulk-insert-style update. However, 80% of the time the individual save approach is acceptable within our time/efficiency requirements: the ExecuteNonQuery completes in <1ms. The other 20% of the time it takes approx 15-20ms. This is where my confusion lies: the calls are doing the same task (as far as I can see), yet sometimes they complete in a timely fashion and sometimes not.
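One thing worth ruling out before chasing SQL Server itself: DateTime.Now (and Now.Millisecond) typically ticks at the Windows system timer resolution, around 15.6 ms, and a thread scheduling quantum is in the same range. So a measured delta of ~15 ms could be an artifact of the clock or of a context switch rather than a genuinely slow call. System.Diagnostics.Stopwatch uses the high-resolution performance counter and can tell these apart. A minimal sketch, timing one call:

```vb
' Sketch: time the call with Stopwatch instead of DateTime.Now.
' Stopwatch resolves well below 1 ms, so it can distinguish a real
' 15 ms delay from clock-granularity noise in the measurement.
Dim sw As Stopwatch = Stopwatch.StartNew()
cmd.ExecuteNonQuery()
sw.Stop()
Debug.Print("ExecuteNonQuery took " & sw.Elapsed.TotalMilliseconds & " ms")
```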
 

Expert Comment

by:Cakinci
ID: 24151902
From my knowledge, I can offer some ideas about why this is happening.

Every ExecuteNonQuery generates a thread, or even more, to complete its job. But the CPU already has other work (some jobs waiting in the queue to be executed, some cycling through each other according to their priority). My assumption is that, in your For loop, the machine can keep up with the threads SQL Server generates only up to a point, depending on CPU load; beyond that it becomes a bottleneck and requests start queuing (this is where your delay starts).

There is one easy way to investigate whether this is what is happening in your situation:
1. Give SQL Server the highest priority from Task Manager.
2. Close all other programs and services on the computer you are testing on.
3. Start with, for example, 1000 rows to insert. While inserting, monitor CPU usage with the Microsoft thread monitoring tool (http://technet.microsoft.com/en-us/sysinternals/bb896645.aspx).
4. Observe memory usage as well.
5. Then increase the amount of data being inserted step by step, and observe until you reach the bottleneck. In my opinion this will be the point where either memory is full (and the computer starts using the page file much more) or CPU load is at maximum (CPU cache, too).

Hope this information will help you.
Take care
 

Author Comment

by:stueyp34
ID: 24156663
Perhaps I can best demonstrate this problem by simplifying the code snippet. I have added a button to a testbench form, which simply loops from 0-1000, executing a stored procedure. The procedure does nothing other than "Print 'Hello'".

If I run the procedure below, I will most often find that the difference between the start and end Debug.Print points is less than 1ms; however, on odd occasions the difference is approx 15ms. I only need to process the first 10 loops for the first delay to happen, and it is not consistent (e.g. every 5th loop).

Has anyone else come across this behaviour before?
    Private Sub Button2_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button2.Click

        Using cn As New SqlConnection("Data Source=localhost;Initial Catalog=TestDBEmpty;User ID=******;Password=******;Network Library=dbmssocn")

            cn.Open()

            For x As Integer = 0 To 1000

                Using cmd As SqlCommand = cn.CreateCommand

                    With cmd
                        .CommandType = CommandType.StoredProcedure
                        .CommandText = "BulkTest"

                        Debug.Print("Start - " & Now & " - " & Now.Millisecond)

                        .ExecuteNonQuery()

                        Debug.Print("End - " & Now & " - " & Now.Millisecond)
                    End With

                End Using

            Next

        End Using

    End Sub
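One variable worth eliminating in a testbench like this is the network library. The connection string above forces TCP/IP (Network Library=dbmssocn) even though the server is localhost; for a local SQL 2005 instance, Shared Memory bypasses the TCP stack entirely, which removes any network-layer delays from the comparison. A sketch (credential placeholders kept as in the question):

```vb
' Sketch: force Shared Memory (dbmslpcn) for a local instance, or
' simply drop the Network Library clause and let the client choose.
Dim cs As String = "Data Source=localhost;Initial Catalog=TestDBEmpty;" & _
                   "User ID=******;Password=******;Network Library=dbmslpcn"
Using cn As New SqlConnection(cs)
    cn.Open()
    ' ... run the same loop and compare the timings ...
End Using
```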


 

Accepted Solution

by:
stueyp34 earned 0 total points
ID: 24260372
I've not found a solution to this problem, so we have approached it differently by doing a bulk update. Although we lose object-level control, the performance is much better, so this is the accepted approach.
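For anyone landing here later: one way to do a bulk update without serializing to XML is SqlBulkCopy, which streams all rows to the server in a single operation. A sketch only; the staging table and column names here are hypothetical, not from the original question, and a server-side MERGE/UPDATE from the staging table would then apply the save logic in one set-based statement.

```vb
' Sketch: bulk-load the bookings into a staging table via SqlBulkCopy.
' Table and column names are illustrative placeholders.
Dim table As New DataTable()
table.Columns.Add("Id", GetType(Integer))
table.Columns.Add("StartTime", GetType(DateTime))
table.Columns.Add("EndTime", GetType(DateTime))
table.Columns.Add("BookingDate", GetType(DateTime))

For Each item As OBJ_Bookings In bookings
    table.Rows.Add(item.Id, item.StartTime.DBValue, item.EndTime.DBValue, item._Date_.DBValue)
Next

Using bulk As New SqlBulkCopy(cn)
    bulk.DestinationTableName = "dbo.BookingsStaging"
    bulk.WriteToServer(table)  ' one streamed operation for all rows
End Using
```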
