karlkawano asked:
Queries Slow Down in a Loop after 20 Iterations

Hi,

Thank you in advance for any help.  I notice this problem in much of my SP code when using loops.  When I have a cursor, or any looping mechanism for that matter, running code like the example below, the first 30 iterations take seconds, but the next, say, 300 iterations slow down to minutes or even hours.  I am just going to give an example, since this seems to happen with all of my code.  Please do not focus on the syntax; I am only showing the structure.  I am more interested in whether the dynamic queries, transactions, locking, etc. could be the issue.  I also tried wrapping all of the SQL statements and EXEC (@SQL) calls in BEGIN TRAN / COMMIT TRAN, but it still slows down.  Also, the reason I am running it in this manner is to cut down the number of rows each update touches at a time.  In the example below, the query updates about 5,000 records on every loop.


Any ideas?


DELETE FROM mytable WHERE ID = @ID

DECLARE rsDD CURSOR FAST_FORWARD
FOR
    SELECT DataDate
    FROM #tblDD WITH (NOLOCK)
    ORDER BY DataDate
OPEN rsDD
FETCH rsDD INTO @dd
WHILE (@@FETCH_STATUS = 0)
BEGIN
    -- insert the rows for this DataDate
    SET @SQL = 'INSERT INTO mytable (ID, DD, value1, value2) '
    SET @SQL = @SQL + 'SELECT ID, ''' + CONVERT(varchar(20), @dd) + ''', Value1, Value2 '
    SET @SQL = @SQL + 'FROM tblData WITH (NOLOCK) WHERE ID = ' + @ID + ' AND DD = ''' + CONVERT(varchar(20), @dd) + ''''
    EXEC (@SQL)

    SET @SQL = 'UPDATE mytable '
    SET @SQL = @SQL + 'SET mytable.value3 = ''' + CONVERT(varchar(20), @yedd) + ''' '
    SET @SQL = @SQL + 'WHERE mytable.ID = ''' + CONVERT(varchar(20), @dd) + ''''
    EXEC (@SQL)

    SET @SQL = 'UPDATE mytable '
    SET @SQL = @SQL + 'SET mytable.value4 = ''' + CONVERT(varchar(20), @yedd) + ''' '
    SET @SQL = @SQL + 'WHERE mytable.ID = ''' + CONVERT(varchar(20), @dd) + ''''
    EXEC (@SQL)

    FETCH rsDD INTO @dd
END
CLOSE rsDD
DEALLOCATE rsDD
            
            
arbert replied:
Where is the rest of the proc where you declare @yedd and @dd?

and why are you performing inserts and turning right around and updating the same record--what are you trying to do???
"  I am going to just give an example since this seems to be happening with all my code.  Please do not focus on the syntax, I am just giving an example of the structure"

Sorry, just saw this statement--it's tough for us to give much help unless you give us actual code.  Pointer though, cursors will most always be slow--don't use them unless you have to....
here is the basic structure I tend to you for my cursors:
please note that I have two checks within the cursor for @@fetch_status

In addition, what is the necessity of using FAST_FORWARD?  You are not gaining anything from it, and it could be contributing to your speed degradation.

I would also recommend that you not use the string-and-EXEC method.  It gives you no benefit in this situation, and besides that, it opens you up to SQL injection....

I see you mentioned that you also tried using transactions....  Just note one thing: I would put the transaction around the loop, so it only commits after all 5,000 inserts, etc. have been performed.  This reduces the degradation, because the whole batch of statements is committed once, not on every loop (see the sketch after the cursor structure below).

DECLARE ID_Cursor SCROLL CURSOR FOR
    SELECT
        blah
    FROM
        dbo.TableBlah

OPEN ID_Cursor
FETCH NEXT FROM ID_Cursor INTO @ID_Cursor

WHILE (@@FETCH_STATUS <> -1)
BEGIN
    IF (@@FETCH_STATUS <> -2)
    BEGIN
        -- process the current row here
    END
    FETCH NEXT FROM ID_Cursor INTO @ID_Cursor
END

CLOSE ID_Cursor
DEALLOCATE ID_Cursor
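Roughly, the transaction placement would look like this against the cursor from the original post (just a sketch, not tested code):

BEGIN TRAN

OPEN rsDD
FETCH rsDD INTO @dd
WHILE (@@FETCH_STATUS = 0)
BEGIN
    -- all of the per-iteration INSERTs/UPDATEs go here,
    -- but nothing is committed until the loop finishes
    FETCH rsDD INTO @dd
END
CLOSE rsDD
DEALLOCATE rsDD

COMMIT TRAN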
"what is the neccessity to use FAST_FORWARD,"

What is the neccessity for using a cursor.
Once again, I concur with Arbert.  SQL Server is designed and optimised for set operations, not iterations.  Iterations should only be used when set operations are not possible.  In addition, you really shouldn't use DSQL (dynamic SQL) unless it is absolutely necessary.
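To illustrate the set-based point: the whole cursor in the original post could probably collapse into one statement along these lines (a sketch only; the column names are guesses from the posted snippet):

-- one INSERT covers every DataDate in #tblDD at once,
-- so there is no loop and no dynamic SQL
INSERT INTO mytable (ID, DD, value1, value2)
SELECT d.ID, t.DataDate, d.Value1, d.Value2
FROM tblData AS d
INNER JOIN #tblDD AS t ON d.DD = t.DataDate
WHERE d.ID = @ID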
karlkawano (asker) replied:
Hi,

Thanks for the comments.  There is a lot of logic in the code, but in the end it just builds INSERT and UPDATE statements as shown above.  I also have another query that does only updates in the same fashion, and it starts to slow down too.  I have tried pointers, but they run about as fast as a cursor, since there is not a lot of data there.  The bottleneck is that once it starts to loop through, it slows down after a while.  If I run the code with the SQL statements commented out, it runs right through without stopping.  If I run the SQL statements in Query Analyzer, they run fast at first but then slow down.  To do this I PRINTed the @SQL variable, so I had all of the insert statements.  I tried running a batch of just the insert statements separated by GO statements: the first 20 ran fast, but then they too started running slowly.  After a while, even one or two of the insert statements on their own ran slowly.  Could it be something about the way I have the log file set up, or could I be filling the log so that it needs to flush to the database?

Thanks,

Karl
"The bottleneck is when it starts to loop through it slows down after a while.  If I run the code and comment out the SQL statements it runs right through without stopping"

Well ya--that's the guts of the cursor...

"After a while, if I ran one or two of the insert statements they ran slow.  Could it be some way I have the log file set up or could I be filling the log and it needs to write to the database"


It's because you're using cursors....Why do you have to use the cursors?
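If you want to rule the log out anyway, it is a quick check (DBCC SQLPERF is standard; everything after it only matters if the log really is full):

-- report the size and percent-used of every database's transaction log
DBCC SQLPERF(LOGSPACE)

If the log is nearly full and the database is in full recovery mode, backing up the log (BACKUP LOG ...) lets the space be reused.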
Hi guys,

Thanks again for the responses.  I started my last post last night and finished it this morning, before I saw the other remarks.

"Fast_Forward"  my understanding is this would make the cursor read-only and use some optimization.  True?

"Cursors" - I did have some code a while back set up using pointers but I still ran into this problem.  Just seemed easier to use but I can switch back.

I'll try wrapping the whole thing in a transaction.  One question about this: on http://www.sql-server-performance.com/ they suggest breaking transactions into smaller batches to keep locking low.  I thought they said this would help performance?  Just to give you an idea, the insert query inserts into a table of about 3.5 million records.  It inserts anywhere from 5 to 5,000 records (say 1,000 on average) on each iteration, over about 100 iterations, for a total of roughly 100,000 records added.  When the query is rerun, the records for that group are deleted and then reloaded, so the number of records in the backtests stays stable at around 3.5 million.  It will slowly grow, but we are going to have another solution in place in the future.

Finally, what are the problems associated with DSQL?  I tried using static SQL but ran into problems because it didn't give me the flexibility I needed: I couldn't use a variable for a table name.  It seemed like I could only use variables in WHERE clauses.  Is this correct?  Again, I am trying to keep some of the code out so I don't overcomplicate the problem.

Looks like this may be tougher than I thought, and several of you are contributing, which I appreciate.  I am going to boost the points to 500 so I can spread the wealth.

Thanks again,

Karl
Hi Arbert,

Let me try rewriting without the cursor and see if it helps.  Will let you know.


Thanks,

Karl
We can't comment because you're not answering our questions--post the exact code you're using in one of the situations and tell us why you need to do it the way you're doing it.

You can use variables for table names, but then you have to resort to DSQL... (a sketch follows below).
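To be clear, a table name cannot be a bind variable, so a variable table name forces dynamic SQL.  A minimal sketch (@TableName and its value are placeholders; QUOTENAME keeps the embedded name safe):

DECLARE @TableName sysname
DECLARE @SQL nvarchar(4000)

SET @TableName = 'mytable'   -- placeholder value

-- the table name has to be concatenated into the string,
-- so wrap it in QUOTENAME to guard against injection
SET @SQL = N'SELECT COUNT(*) FROM ' + QUOTENAME(@TableName)
EXEC (@SQL)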
Hi,

Ok, sorry.  Let me create some code to simulate the problem.  I can't post the code I am currently working on, because I could get into a mess over proprietary calculations, etc.

I'll have it up by tomorrow or Thursday.  

Thanks,

Karl
good luck
One of the reasons it slows down could be table scans during updates.  It runs fast while the table is small, and then updates take longer and longer with each iteration.

Here's what you can try:

1. Do not do updates at all.  Instead, collect all the data to insert in local variables, then insert everything in one shot.
2. If you think you absolutely need updates, then:
a) create indexes to speed queries up
b) load several thousand rows, stop the process, update statistics, and restart the process from where it left off.
c) use optimizer hints

a) is a must; you can combine b) and c).  A sketch of all three follows below.
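Roughly (the index name and columns here are guesses from the posted snippet, not actual recommendations):

-- a) an index supporting the WHERE clause of the updates
CREATE INDEX IX_mytable_ID ON mytable (ID)

-- b) refresh the optimizer's statistics after a large load
UPDATE STATISTICS mytable

-- c) an optimizer hint forcing use of that index (use sparingly)
UPDATE mytable WITH (INDEX(IX_mytable_ID))
SET value3 = @yedd
WHERE ID = @dd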

Another thing worth mentioning is the use of dynamic SQL.  You'll be better off not using dynamic SQL at all, but if you must, then at least use dynamic SQL with bind variables (parameters).  I do not have the exact syntax on hand, but you can look it up in BOL.
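Something along these lines should be close (sp_executesql is the documented mechanism; the statement and parameter names below are only illustrative):

DECLARE @SQL nvarchar(4000)

-- the statement text stays constant across iterations, so its
-- plan can be reused; @p_val and @p_id are bind variables,
-- not concatenated literals
SET @SQL = N'UPDATE mytable SET value3 = @p_val WHERE ID = @p_id'

EXEC sp_executesql @SQL,
     N'@p_val varchar(20), @p_id varchar(20)',
     @p_val = @yedd,
     @p_id  = @dd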
"2. If you think you absolutely need updates, then:
a) create indexes to speed queries up"

Of course, it can have quite the opposite impact as well....

"2. If you think you absolutely need updates, then:
a) create indexes to speed queries up"

"Of course, it can have quite the opposite impact as well...."

True.  But compare the overhead of index maintenance to the overhead of a table scan on a 3.5-million-row table.

Like I've said, it's better to avoid updates altogether; then you will not need any indexes, at least for the duration of the population process.
"But compare the overhead of index maintenace to the overhead of a table scan on a 3.5 million rows table."

Ya, a little better when you expand upon it...you didn't say anything about 3.5 million rows above.....
Thanks for all the comments, guys.  I really appreciate the input.  Arbert, I will get some code posted as soon as I have time.

I do have the table indexed.  There are about 3.5 million records in this table.  The number of records added and updated is probably insignificant next to the number of records in the table.  

I'll look into bind variables (parameters) and the statistics.  

Thanks Again,

Karl
ASKER CERTIFIED SOLUTION by Jay Toops, United States (solution text available to Experts Exchange members only)
Thanks Jay,

I will look into that too...

Karl
SOLUTION (text available to Experts Exchange members only)