Solved

Poor performance when executing stored procedure multiple times

Posted on 2009-04-15
192 Views
Last Modified: 2012-06-27
We have an application that attempts to save a collection of objects to a SQL 2005 database using a 'save' stored procedure via the SQLCommand class. We have a performance issue in that some calls of the stored procedure are slow (15-20ms) whereas other calls execute very quickly (<1ms). This is a problem as we will typically save a collection of 100,000 objects.

Using SQL Server Profiler, I can see that the actual stored procedure runs very quickly every time (always <1ms), however on odd occasions, the difference between Debug.Print points 2 and 3 can be 15-20ms. Does anyone have any pointers as to why this may happen? Are we doing something wrong?

We would prefer to avoid a method of bulk saving the objects (by serializing to XML or similar).
Public Sub UpdateDataBase()

    Using cn As New SqlConnection("Data Source=localhost;Initial Catalog=TestDBEmpty;User ID=******;Password=******;Network Library=dbmssocn")

        cn.Open()

        For Each item As OBJ_Bookings In Me

            Try
                'Debug.Print("Start - " & Now.ToString & " - " & Now.Millisecond)  'Debug.Print Point 1
                OBJ_Bookings.SaveToDatabase(item, cn)
                'Debug.Print("End - " & Now.ToString & " - " & Now.Millisecond)  'Debug.Print Point 4
            Catch ex As Exception
                MsgBox(ex.Message)
            End Try

        Next

    End Using

End Sub

Public Shared Sub SaveToDatabase(ByVal ItemToUpdate As OBJ_Bookings, ByVal cn As SqlConnection)

    Using cmd As SqlCommand = cn.CreateCommand

        With cmd
            .CommandType = CommandType.StoredProcedure
            .CommandText = "isCRUD_BookingsSave"
            .Parameters.AddWithValue("@Id", ItemToUpdate.Id)
            .Parameters.AddWithValue("@StartTime", ItemToUpdate.StartTime.DBValue)
            .Parameters.AddWithValue("@EndTime", ItemToUpdate.EndTime.DBValue)
            .Parameters.AddWithValue("@Date", ItemToUpdate._Date_.DBValue)

            Debug.Print("Procedure Start - " & Now.ToString & " - " & Now.Millisecond) 'Debug.Print Point 2
            .ExecuteNonQuery()
            Debug.Print("Procedure End - " & Now.ToString & " - " & Now.Millisecond)  'Debug.Print Point 3
        End With

    End Using

End Sub


Question by:stueyp34
5 Comments
 

Expert Comment

by:Cakinci
There is one big practical problem with the architecture of the code:
According to the logic, you are committing an ExecuteNonQuery for each OBJ_Bookings item.
Unfortunately, in SQL Server, doing a bulk insert and doing row-by-row inserts differ greatly in performance.

If you aggregate all of the queries, separated by newlines, into one large string (with a StringBuilder, of course) and then issue only one ExecuteNonQuery for all the objects, that will surely yield a much greater performance increase (a bulk insert).

But if you commit an ExecuteNonQuery statement for each row, it will take much more time.

Hope this info gives you an idea about the problem.
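A minimal sketch of the batching approach described above, i.e. one round trip for the whole collection instead of one per object. The procedure and property names are taken from the question; the `List(Of OBJ_Bookings)` parameter is an assumption about how the collection is held. Note that building literal SQL like this is vulnerable to injection; it is shown only to illustrate the single-round-trip idea.

```vbnet
Imports System.Text
Imports System.Data
Imports System.Data.SqlClient

Public Sub SaveAllInOneBatch(ByVal items As List(Of OBJ_Bookings), ByVal cn As SqlConnection)
    Dim sql As New StringBuilder()

    'Concatenate one EXEC statement per object into a single batch.
    For Each item As OBJ_Bookings In items
        sql.AppendFormat("EXEC isCRUD_BookingsSave @Id={0}, @StartTime='{1}', @EndTime='{2}', @Date='{3}';", _
                         item.Id, item.StartTime.DBValue, item.EndTime.DBValue, item._Date_.DBValue)
        sql.AppendLine()
    Next

    Using cmd As SqlCommand = cn.CreateCommand()
        cmd.CommandType = CommandType.Text
        cmd.CommandText = sql.ToString()
        cmd.ExecuteNonQuery()   'one round trip for the whole batch
    End Using
End Sub
```

For very large collections, the batch would need to be flushed every few thousand statements to keep the command text a manageable size.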

 

Author Comment

by:stueyp34
Thanks for the comment.

I agree that performing an ExecuteNonQuery for each individual object is inefficient compared to a bulk-insert-style update. However, 80% of the time, the individual save approach is acceptable within our time/efficiency requirements: 80% of the time, the ExecuteNonQuery completes in <1ms, but the other 20% of the time, it takes approx 15-20ms. This is where my confusion lies: they are doing the same task (as far as I can see), yet sometimes it completes in a timely fashion, and sometimes not.
 

Expert Comment

by:Cakinci
From my knowledge, I can give some ideas about why it is happening like this:

Every ExecuteNonQuery generates a thread (or even more) to complete its job. But in addition to those jobs, the CPU may already have other work (some of it waiting in the queue to be executed, some of it cycling according to priority).
My assumption is that in your For loop, up to a point, the computer can handle the threads that SQL Server is generating, depending on the CPU load. But eventually it becomes a bottleneck and starts dumping them into a queue (this is where your delay starts).

Actually, there is one easy way to understand why it is happening in your situation:
1 - Give SQL Server the highest priority from Task Manager.
2 - Close all other programs and services on the computer you are testing with.
3 - Start with, for example, 1,000 rows to insert. While inserting, monitor CPU usage with Microsoft's Process Monitor tool (http://technet.microsoft.com/en-us/sysinternals/bb896645.aspx).
4 - Observe memory usage as well.
5 - Then increase the amount of data being inserted step by step, and observe until it reaches a bottleneck. In my opinion, this will be the point where either your memory fills up (and the computer starts using much more page file) or the CPU load (and CPU cache) is at maximum.

Hope this information will help you.
Take care
 

Author Comment

by:stueyp34
Perhaps I can best demonstrate this problem by simplifying the code snippet. I have added a button to a testbench form, which simply loops from 0-1000, executing a stored procedure. The procedure does nothing other than "Print 'Hello'".

If I run the procedure below, I most often find that the difference between the start and end Debug.Print points is less than 1ms, however on odd occasions the difference is approx 15ms. The first delay typically appears within the first 10 iterations, and it is not consistent (e.g. every 5th loop).

Has anyone else come across this behaviour before?
    Private Sub Button2_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) Handles Button2.Click

        Using cn As New SqlConnection("Data Source=localhost;Initial Catalog=TestDBEmpty;User ID=******;Password=******;Network Library=dbmssocn")

            cn.Open()

            For x As Integer = 0 To 1000

                Using cmd As SqlCommand = cn.CreateCommand

                    With cmd
                        .CommandType = CommandType.StoredProcedure
                        .CommandText = "BulkTest"

                        Debug.Print("Start - " & Now & " - " & Now.Millisecond)
                        .ExecuteNonQuery()
                        Debug.Print("End - " & Now & " - " & Now.Millisecond)
                    End With

                End Using

            Next

        End Using

    End Sub
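One thing worth checking with the measurement itself: on Windows, DateTime.Now only advances in increments of roughly 15-16 ms (the system timer tick), so an interval measured by subtracting two Now readings will tend to show up as either 0 ms or ~15 ms regardless of the real duration. A sketch of the same loop timed with the high-resolution System.Diagnostics.Stopwatch instead would rule that out (connection string and procedure name as in the question):

```vbnet
Imports System.Diagnostics
Imports System.Data
Imports System.Data.SqlClient

'Same test loop, but timed with Stopwatch rather than DateTime.Now,
'so genuine server delays can be distinguished from timer granularity.
Using cn As New SqlConnection("Data Source=localhost;Initial Catalog=TestDBEmpty;User ID=******;Password=******;Network Library=dbmssocn")
    cn.Open()
    Dim sw As New Stopwatch()
    For x As Integer = 0 To 1000
        Using cmd As SqlCommand = cn.CreateCommand()
            cmd.CommandType = CommandType.StoredProcedure
            cmd.CommandText = "BulkTest"
            sw.Reset()
            sw.Start()
            cmd.ExecuteNonQuery()
            sw.Stop()
            Debug.Print("Elapsed: " & sw.Elapsed.TotalMilliseconds & " ms")
        End Using
    Next
End Using
```

If the Stopwatch timings are consistently sub-millisecond, the apparent 15 ms spikes were an artifact of DateTime.Now's resolution rather than of the database calls.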


 

Accepted Solution

by:
stueyp34 earned 0 total points
Comment Utility
I've not found a solution to this problem, so we have approached it differently by doing a bulk update. Although we lose object-level control, the performance is much better, so this is the accepted approach.
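For reference, one common way to implement this kind of bulk save in ADO.NET (available since .NET 2.0) is SqlBulkCopy, which streams all the rows to the server in a single operation. The destination table name and column layout below are illustrative assumptions, not taken from the question:

```vbnet
Imports System.Data
Imports System.Data.SqlClient

Public Sub BulkSave(ByVal items As List(Of OBJ_Bookings), ByVal cn As SqlConnection)
    'Stage the objects in a DataTable whose columns match the target table.
    Dim table As New DataTable()
    table.Columns.Add("Id", GetType(Integer))
    table.Columns.Add("StartTime", GetType(DateTime))
    table.Columns.Add("EndTime", GetType(DateTime))
    table.Columns.Add("Date", GetType(DateTime))

    For Each item As OBJ_Bookings In items
        table.Rows.Add(item.Id, item.StartTime.DBValue, item.EndTime.DBValue, item._Date_.DBValue)
    Next

    'Stream the whole table to the server in one operation.
    Using bulk As New SqlBulkCopy(cn)
        bulk.DestinationTableName = "Bookings"   'hypothetical table name
        bulk.BatchSize = 10000
        bulk.WriteToServer(table)
    End Using
End Sub
```

Note that SqlBulkCopy inserts directly into a table, so it trades away the per-object logic of the 'save' stored procedure, which matches the object-level control given up in the accepted approach above.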
