Solved

Queries Slow Down in a Loop after 20 Iterations

Posted on 2004-10-04
1,202 Views
Last Modified: 2008-02-26
Hi,

Thank you in advance for any help.  I notice this problem in much of my code in my SPs when using loops.  When I have a cursor, or any looping mechanism for that matter, that runs code like my example below, the first 30 iterations take seconds, but the next, say, 300 iterations slow down to minutes or even hours.  I am just going to give an example since this seems to be happening with all my code.  Please do not focus on the syntax; I am more interested in whether calling the dynamic queries, transactions, locking, etc. could be an issue.  I also tried wrapping all the SQL statements and EXEC (@SQL) in BEGIN TRAN / COMMIT TRAN, but it still slows down.  Also, the reason I am running it in this manner is to cut down the number of rows the update handles in any one transaction.  In the example below, the query updates about 5000 records on every loop.


Any ideas?


DELETE FROM mytable WHERE ID = @ID




DECLARE rsDD CURSOR FAST_FORWARD
FOR
      SELECT DataDate
      FROM #tblDD WITH (NOLOCK)
      ORDER BY DataDate
OPEN rsDD
FETCH rsDD INTO @dd
WHILE (@@FETCH_status = 0 )
     BEGIN

          SET @SQL ='INSERT INTO mytable(ID, DD, value1, value2) '
          SET @SQL = @SQL + 'SELECT ID, Value1, Value2 FROM tblData WITH (NOLOCK) WHERE ID= ' + @ID + ' AND DD = ' + @dd
          Exec (@SQL)

          SET @SQL ='UPDATE mytable  '
          SET @SQL = @SQL + 'SET mytable.value3 = ''' + convert(varchar (20),@yedd) + ''' '
          SET @SQL = @SQL + 'WHERE mytable.ID = ''' + convert(varchar (20),@dd) + ''''
          exec (@SQL)

          SET @SQL ='UPDATE mytable  '
          SET @SQL = @SQL + 'SET mytable.value4 = ''' + convert(varchar (20),@yedd) + ''' '
          SET @SQL = @SQL + 'WHERE mytable.ID = ''' + convert(varchar (20),@dd) + ''''
          exec (@SQL)

          FETCH rsDD INTO @dd
     END
CLOSE rsDD
DEALLOCATE rsDD
            
            
Question by:karlkawano
23 Comments
 
LVL 34

Expert Comment

by:arbert
ID: 12223136
Where is the rest of the proc where you declare @yedd and @dd?

and why are you performing inserts and turning right around and updating the same record--what are you trying to do???
 
LVL 34

Expert Comment

by:arbert
ID: 12223142
"  I am going to just give an example since this seems to be happening with all my code.  Please do not focus on the syntax, I am just giving an example of the structure"

Sorry, just saw this statement--it's tough for us to give much help unless you give us actual code.  One pointer, though: cursors will almost always be slow--don't use them unless you have to....
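For example, the cursor in your posted structure could probably be collapsed into one set-based insert plus one update.  This is only a sketch against the names in your example (the column lists there don't quite line up, so I'm assuming tblData has a DD column and the updates target the rows just inserted):

```sql
-- Set-based sketch: every date in #tblDD handled in a single statement
INSERT INTO mytable (ID, DD, value1, value2)
SELECT d.ID, d.DD, d.Value1, d.Value2
FROM tblData d
JOIN #tblDD dd ON d.DD = dd.DataDate
WHERE d.ID = @ID

-- Then one update instead of one per date
UPDATE mytable
SET value3 = @yedd,
    value4 = @yedd
WHERE ID = @ID
```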
 
LVL 1

Expert Comment

by:AntonG
ID: 12224037
Here is the basic structure I tend to use for my cursors; please note that I have two checks for @@FETCH_STATUS within the cursor.

In addition, what is the necessity of using FAST_FORWARD?  You are not gaining anything from it, and it could be contributing to your speed degradation.

I would also recommend that you don't use the string-and-EXEC method.  It does not give you any benefit in this situation, and besides, it opens you up to SQL injection....

I see you mentioned that you also tried using transactions.... just note one thing: I would put the transaction around the loop, so it only commits after all 5000 inserts, etc. have been made.  This will reduce the speed degradation, since the whole batch of statements commits once instead of on every loop.

DECLARE ID_Cursor SCROLL CURSOR FOR
     SELECT
          blah
     FROM
          dbo.TableBlah


OPEN ID_Cursor
FETCH NEXT FROM ID_Cursor INTO @ID_Cursor

WHILE (@@FETCH_STATUS <> -1)
BEGIN
      IF (@@FETCH_STATUS <> -2)
      BEGIN
            -- per-row work goes here
      END
      FETCH NEXT FROM ID_Cursor INTO @ID_Cursor
END

CLOSE ID_Cursor
DEALLOCATE ID_Cursor
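With the transaction around the loop it would look like this (a sketch only; note that one big transaction also holds its locks for the whole run, so there is a trade-off):

```sql
BEGIN TRAN   -- one commit for the whole batch instead of one per loop

OPEN ID_Cursor
FETCH NEXT FROM ID_Cursor INTO @ID_Cursor
WHILE (@@FETCH_STATUS <> -1)
BEGIN
      IF (@@FETCH_STATUS <> -2)
      BEGIN
            -- inserts/updates for this row go here
      END
      FETCH NEXT FROM ID_Cursor INTO @ID_Cursor
END
CLOSE ID_Cursor
DEALLOCATE ID_Cursor

COMMIT TRAN
```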
 
LVL 34

Expert Comment

by:arbert
ID: 12224190
"what is the neccessity to use FAST_FORWARD,"

What is the neccessity for using a cursor.
 
LVL 18

Expert Comment

by:ShogunWade
ID: 12225571
Once again, I concur with Arbert.    SQL Server is designed and optimised for set operations, not iterations.  Iterations should only be used when set operations are not possible.   And in addition, you really shouldn't use DSQL unless it is absolutely necessary.
 

Author Comment

by:karlkawano
ID: 12226156
Hi,

Thanks for the comments.  There is a lot of logic in the code, but in the end it just creates insert and update statements as shown above.  I also have another query that just does updates in the same fashion, and it starts to slow down too.  I have tried pointers, but they run about as fast as a cursor since there is not a lot of data there.  The bottleneck is the loop: it slows down after a while.  If I run the code and comment out the SQL statements, it runs right through without stopping.  If I run the SQL statements in Query Analyzer, they run fast at first but then slow down.  To do this I did a PRINT of the @SQL variable, so I had all the insert statements.  I tried running a batch of just the insert statements separated by GO statements; the first 20 ran fast, but then they too started running slow.  After a while, if I ran one or two of the insert statements, they ran slow.  Could it be the way I have the log file set up, or could I be filling the log so that it needs to write to the database?
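One thing I can check is whether the log is actually filling up while the loop runs, something like this (the database name and backup path below are just placeholders):

```sql
-- Show how full each database's transaction log is
DBCC SQLPERF(LOGSPACE)

-- If the database is in FULL recovery and the log keeps growing,
-- backing the log up lets the space be reused
BACKUP LOG mydb TO DISK = 'C:\backup\mydb_log.bak'
```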

Thanks,

Karl
 
LVL 34

Expert Comment

by:arbert
ID: 12226199
"The bottleneck is when it starts to loop through it slows down after a while.  If I run the code and comment out the SQL statements it runs right through without stopping"

Well ya--that's the guts of the cursor...

"After a while, if I ran one or two of the insert statements they ran slow.  Could it be some way I have the log file set up or could I be filling the log and it needs to write to the database"


It's because you're using cursors....Why do you have to use the cursors?
 

Author Comment

by:karlkawano
ID: 12226442
Hi guys,

Thanks again for the responses, I started my last post last night and finished it up this morning before I saw the other remarks.  

"Fast_Forward"  my understanding is this would make the cursor read-only and use some optimization.  True?

"Cursors" - I did have some code a while back set up using pointers but I still ran into this problem.  Just seemed easier to use but I can switch back.

I'll try to use the whole thing wrapped in a transaction.  One question about this - On http://www.sql-server-performance.com/ they mention to break up transactions in smaller batches so as to keep the locks low.  Thought they said this would help performance?  Just to give you an idea, the insert query is inserting into a table of about 3.5 Million records.  The insert query inserts anywhere from 5 to 5000 records (say 100000 on average) on each iteration and about 100 iterations for a total of 100,000 records added.  When the query is rerun, the records for that group are deleted and then they are rerun so the number of records in the backtests are stabilized around 3.5 Mill.  It will slowly move up but we are going to have another solution in place in the future.

Finally, what are the problems associated with DSQL?  I tried using SQL but ran into problems because it didn't allow me the flexability I needed.  I couldn't use a variable name for a table.  Seemed like I was only able to use variables for where clauses.  Is this correct?  Again, I am trying to keep some of the code out so I don't convolute the problem.  

Looks like this may be tougher than I thought and several of you are contributing which I appreciate.  I am going to boost the points to 500 so I can spread the wealth.

Thanks again,

Karl
 

Author Comment

by:karlkawano
ID: 12226461
Hi Arbert,

Let me try rewriting without the cursor and see if it helps.  Will let you know.


Thanks,

Karl
 
LVL 34

Expert Comment

by:arbert
ID: 12226976
We can't comment because you're not answering our questions--post the exact code you're using in one of the situations and tell us why you need to do it the way you're doing it.

You can use variables for table names, but then you have to resort to DSQL.

 

Author Comment

by:karlkawano
ID: 12227038
Hi,

Ok, sorry.  Let me create some code to simulate the problem.  I can't put the code that I am currently working on because I could get into a mess as far as proprietary calculations, etc.

I'll have it up by tomorrow or Thursday.  

Thanks,

Karl
 
LVL 18

Expert Comment

by:ShogunWade
ID: 12227051
good luck
 
LVL 6

Expert Comment

by:izblank
ID: 12232927
One of the reasons it slows down could be table scans during updates.  It runs fast while the table is small, and then updates take longer and longer with each iteration.

Here's what you can try:

1. Do not do updates at all.  Instead, collect all the data to insert in local variables, then insert everything in one shot.
2. If you think you absolutely need updates, then:
a) create indexes to speed queries up
b) load several thousand rows, stop the process, update statistics, restart the process from where it left.
c) use optimizer hints

a) is a must, you can combine b) and c)
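For (a) and (b), something along these lines (the table and column names are just the ones from the question's example; adjust to your real schema):

```sql
-- (a) an index matching the updates' WHERE clause, so each update
--     seeks instead of scanning 3.5 million rows
CREATE INDEX IX_mytable_ID ON mytable (ID)

-- (b) after loading a few thousand rows, refresh statistics so the
--     optimizer knows how big the table has become
UPDATE STATISTICS mytable
```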

Another thing worth mentioning is the use of dynamic SQL.  You'll be better off not using dynamic SQL at all, but if you must, then at least use it as dynamic SQL with bind variables (parameters).  I do not have the exact syntax handy, but you can look it up in BOL.
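Roughly like this with sp_executesql (a sketch; the statement string and parameter types are just illustrative):

```sql
DECLARE @SQL nvarchar(4000)

-- Parameters stay out of the string, so one cached plan is reused
-- and nothing can be injected through the values
SET @SQL = N'UPDATE mytable SET value3 = @yedd WHERE ID = @dd'
EXEC sp_executesql @SQL,
     N'@yedd varchar(20), @dd varchar(20)',
     @yedd = @yedd,
     @dd   = @dd
```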
 
LVL 34

Expert Comment

by:arbert
ID: 12233048
"2. If you think you absolutely need updates, then:
a) create indexes to speed queries up"

Of course, it can have quite the opposite impact as well....

 
LVL 6

Expert Comment

by:izblank
ID: 12233103
"2. If you think you absolutely need updates, then:
a) create indexes to speed queries up"

"Of course, it can have quite the opposite impact as well...."

True.  But compare the overhead of index maintenance to the overhead of a table scan on a 3.5-million-row table.

Like I've said, it's better to avoid updates altogether; then you will not need any indexes - at least for the duration of the population process.
 
LVL 34

Expert Comment

by:arbert
ID: 12233142
"But compare the overhead of index maintenace to the overhead of a table scan on a 3.5 million rows table."

Ya, a little better when you expand upon it... you didn't say anything about 3.5 million rows above.
 

Author Comment

by:karlkawano
ID: 12237754
Thanks for all the comments, guys.  I really appreciate the input.  Arbert, I will get some code posted as soon as I have time.

I do have the table indexed.  There are about 3.5 million records in this table.  The number of records added and updated is probably insignificant next to the number of records already in the table.

I'll look into bind variables (parameters) and the statistics.  

Thanks Again,

Karl
 
LVL 10

Accepted Solution

by:
Jay Toops earned 250 total points
ID: 12249390
Maybe you need to leave more free space in your indexes (i.e., a lower fill factor)?
It sounds like the index pages have used up their free space and the inserts are now forcing page splits.
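Something like this would do it on SQL Server 2000 (the table name and the 80 are just examples; a fill factor of 80 leaves each page 20% empty):

```sql
-- Rebuild all indexes on the table with 20% free space per page
DBCC DBREINDEX ('mytable', '', 80)
```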

Jay
 

Author Comment

by:karlkawano
ID: 12249548
Thanks Jay,

I will look into that too...

Karl
 
LVL 18

Assisted Solution

by:ShogunWade
ShogunWade earned 250 total points
ID: 12249562
you should monitor page splits before adjusting your fill factor.  leaving more free space (a lower fill factor) will encumber reads, since the same rows are spread over more pages.
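for example (table name is just illustrative), DBCC SHOWCONTIG shows page density, and the Page Splits/sec counter under the SQLServer:Access Methods object in Performance Monitor shows splits as they happen:

```sql
-- Scan density and average page fullness for the table's indexes
DBCC SHOWCONTIG ('mytable')
```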
