Solved

Poor performance when executing stored procedure multiple times

Posted on 2009-04-15
198 Views
Last Modified: 2012-06-27
We have an application that attempts to save a collection of objects to a SQL 2005 database using a 'save' stored procedure via the SqlCommand class. We have a performance issue in that some calls of the stored procedure are slow (15-20ms) whereas other calls execute very quickly (<1ms). This is a problem, as we will typically save a collection of 100,000 objects.

Using SQL Server Profiler, I can see that the actual stored procedure runs very quickly every time (always <1ms); however, on odd occasions the difference between Debug.Print points 2 and 3 can be 15-20ms. Does anyone have any pointers as to why this may happen? Are we doing something wrong?

We would prefer to avoid a method of bulk saving the objects (by serializing to XML or similar).
Public Sub UpdateDataBase()
        Using cn As New SqlConnection("Data Source=localhost;Initial Catalog=TestDBEmpty;User ID=******;Password=******;Network Library=dbmssocn")
            cn.Open()
            For Each item As OBJ_Bookings In Me
                Try
                    'Debug.Print("Start - " & Now.ToString & " - " & Now.Millisecond)  'Debug.Print Point 1
                    OBJ_Bookings.SaveToDatabase(item, cn)
                    'Debug.Print("End - " & Now.ToString & " - " & Now.Millisecond)  'Debug.Print Point 4
                Catch ex As Exception
                    MsgBox(ex.Message)
                End Try
            Next
        End Using
    End Sub
 
    Public Shared Sub SaveToDatabase(ByVal ItemToUpdate As OBJ_Bookings, ByVal cn As SqlConnection)
        Using cmd As SqlCommand = cn.CreateCommand
            With cmd
                .CommandType = CommandType.StoredProcedure
                .CommandText = "isCRUD_BookingsSave"
                .Parameters.AddWithValue("@Id", ItemToUpdate.Id)
                .Parameters.AddWithValue("@StartTime", ItemToUpdate.StartTime.DBValue)
                .Parameters.AddWithValue("@EndTime", ItemToUpdate.EndTime.DBValue)
                .Parameters.AddWithValue("@Date", ItemToUpdate._Date_.DBValue)
                Debug.Print("Procedure Start - " & Now.ToString & " - " & Now.Millisecond) 'Debug.Print Point 2
                .ExecuteNonQuery()
                Debug.Print("Procedure End - " & Now.ToString & " - " & Now.Millisecond)  'Debug.Print Point 3
            End With
        End Using
    End Sub


Question by:stueyp34
5 Comments
 
LVL 4

Expert Comment

by:Cakinci
ID: 24147353
There is one big practical problem with the architecture of the code: according to the logic, for each OBJ_Bookings item you are committing a separate ExecuteNonQuery. Unfortunately, in SQL Server, bulk insert and row-by-row insert perform very differently.

If you aggregate all the queries, separated by newlines, into one huge string (with a StringBuilder, of course) and then do only one ExecuteNonQuery for all the objects, that will surely give a much greater performance increase (a bulk insert).

But if you commit an ExecuteNonQuery statement for each row, it will take much more time.

Hope this info will give you an idea about the problem.
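
A minimal sketch of the batching idea above, assuming the stored procedure and object properties shown in the question (values are formatted as literals only to keep the sketch short; parameterizing each row would be safer):

Imports System.Text
Imports System.Data
Imports System.Data.SqlClient
Imports System.Collections.Generic

' Sketch only: concatenates one EXEC per object and sends the whole batch
' in a single round trip instead of one call per row.
Public Sub SaveAllAsBatch(ByVal items As List(Of OBJ_Bookings), ByVal cn As SqlConnection)
    Dim batch As New StringBuilder()
    For Each item As OBJ_Bookings In items
        batch.AppendFormat("EXEC isCRUD_BookingsSave @Id = {0}, @StartTime = '{1}', @EndTime = '{2}', @Date = '{3}';", _
                           item.Id, item.StartTime.DBValue, item.EndTime.DBValue, item._Date_.DBValue)
        batch.AppendLine()
    Next
    Using cmd As SqlCommand = cn.CreateCommand()
        cmd.CommandType = CommandType.Text   ' one text batch instead of many separate calls
        cmd.CommandText = batch.ToString()
        cmd.ExecuteNonQuery()                ' a single round trip for the whole collection
    End Using
End Sub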

 

Author Comment

by:stueyp34
ID: 24147582
Thanks for the comment.

I agree that performing an ExecuteNonQuery for each individual object is inefficient compared to a bulk-insert style update. However, the individual save approach is acceptable within our time/efficiency requirements, and 80% of the time the ExecuteNonQuery completes in <1ms; the other 20% of the time it takes approx 15-20ms. This is where my confusion lies: the calls are doing the same task (as far as I can see), yet sometimes they complete in a timely fashion and sometimes not.
 
LVL 4

Expert Comment

by:Cakinci
ID: 24151902
From my knowledge I can give some ideas about why it is happening like this:

Every ExecuteNonQuery generates a thread (or even more than one) to complete its job. But in addition to those jobs, the CPU probably already has other work: some of it waiting in the queue to be executed, some of it cycling through according to priority. My assumption is that in your For loop, up to a point, the computer can handle the threads that SQL Server is generating, but eventually this becomes a bottleneck and work starts being dumped into the queue (this is where your delay starts).

Actually, there is one easy way to understand why it is happening in your situation:
1- Give SQL Server the highest priority from Task Manager.
2- Close all other programs and services on the computer you are testing on.
3- Start with, for example, 1000 rows to insert. While inserting, monitor CPU usage with the Microsoft thread-monitoring tool (http://technet.microsoft.com/en-us/sysinternals/bb896645.aspx); a programmatic alternative is sketched after this list.
4- Observe memory usage as well.
5- Then increase the amount of data being inserted step by step, and observe again until it comes to the bottleneck. In my opinion this will be the point where either your memory is full (and the computer starts using much more page file) or the CPU load is at maximum (CPU cache as well).

Hope this information will help you.
Take care
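
For step 3, a minimal sketch of a programmatic alternative to watching Task Manager, assuming the standard Windows performance counters ("Processor" and "Memory") are available:

Imports System.Diagnostics
Imports System.Threading

' Sketch only: samples overall CPU usage and available memory; call it
' between batches while scaling up the insert volume.
Public Sub SampleSystemLoad()
    Dim cpu As New PerformanceCounter("Processor", "% Processor Time", "_Total")
    Dim mem As New PerformanceCounter("Memory", "Available MBytes")
    cpu.NextValue()        ' the first CPU reading is always 0; prime the counter
    Thread.Sleep(1000)     ' sample over a one-second interval
    Debug.Print("CPU: " & cpu.NextValue().ToString("F1") & " %  Free RAM: " & mem.NextValue() & " MB")
    cpu.Close()
    mem.Close()
End Sub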
 

Author Comment

by:stueyp34
ID: 24156663
Perhaps I can best demonstrate this problem by simplifying the code snippet. I have added a button to a testbench form, which simply loops from 0-1000, executing a stored procedure. The procedure does nothing other than "Print 'Hello'".

If I run the below procedure, I will most often find that the difference between the start and end Debug.Print points is less than 1ms; however, on odd occasions the difference is approx 15ms. I only need to run the first 10 or so loops for the first delay to occur, and it does not happen at a consistent interval (e.g. every 5th loop).

Has anyone else come across this behaviour before?
    Private Sub Button2_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button2.Click
        Using cn As New SqlConnection("Data Source=localhost;Initial Catalog=TestDBEmpty;User ID=******;Password=******;Network Library=dbmssocn")
            cn.Open()
            For x As Integer = 0 To 1000
                Using cmd As SqlCommand = cn.CreateCommand
                    With cmd
                        .CommandType = CommandType.StoredProcedure
                        .CommandText = "BulkTest"
                        Debug.Print("Start - " & Now & " - " & Now.Millisecond)
                        .ExecuteNonQuery()
                        Debug.Print("End - " & Now & " - " & Now.Millisecond)
                    End With
                End Using
            Next
        End Using
    End Sub
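
One caveat when timing at this granularity: DateTime.Now (and Now.Millisecond) typically advances in roughly 15-16ms steps on Windows, which is the same order as the spikes reported here, so System.Diagnostics.Stopwatch is a safer way to measure sub-millisecond intervals. A minimal sketch of the same loop timed with Stopwatch:

Imports System.Data
Imports System.Data.SqlClient
Imports System.Diagnostics

' Sketch only: times each ExecuteNonQuery with Stopwatch rather than
' DateTime.Now, whose coarse tick can itself look like a ~15ms delay.
Private Sub TimeBulkTest(ByVal cn As SqlConnection)
    Dim sw As New Stopwatch()
    For x As Integer = 0 To 1000
        Using cmd As SqlCommand = cn.CreateCommand()
            cmd.CommandType = CommandType.StoredProcedure
            cmd.CommandText = "BulkTest"
            sw.Reset()
            sw.Start()
            cmd.ExecuteNonQuery()
            sw.Stop()
            Debug.Print("Iteration " & x & ": " & sw.Elapsed.TotalMilliseconds & " ms")
        End Using
    Next
End Sub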


 

Accepted Solution

by:
stueyp34 earned 0 total points
ID: 24260372
I've not found a solution to this problem, so we have approached it in a different way by doing a bulk update. Although we lose object-level control, the performance is much better, so this is the accepted approach.
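
The thread doesn't show the final bulk implementation, but for anyone landing here, a minimal sketch of one common approach is SqlBulkCopy; the destination table and column names below are assumptions:

Imports System.Data
Imports System.Data.SqlClient
Imports System.Collections.Generic

' Sketch only: loads the collection into a DataTable and writes it to the
' server in one bulk operation. "dbo.Bookings" and its columns are assumed.
Public Sub BulkSave(ByVal items As List(Of OBJ_Bookings), ByVal cn As SqlConnection)
    Dim table As New DataTable("Bookings")
    table.Columns.Add("Id", GetType(Integer))
    table.Columns.Add("StartTime", GetType(DateTime))
    table.Columns.Add("EndTime", GetType(DateTime))
    table.Columns.Add("Date", GetType(DateTime))

    For Each item As OBJ_Bookings In items
        table.Rows.Add(item.Id, item.StartTime.DBValue, item.EndTime.DBValue, item._Date_.DBValue)
    Next

    Using bulk As New SqlBulkCopy(cn)
        bulk.DestinationTableName = "dbo.Bookings"
        bulk.BatchSize = 10000        ' send rows to the server in chunks
        bulk.WriteToServer(table)     ' one bulk operation instead of 100,000 round trips
    End Using
End Sub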
