Solved

SQL delete script without creating huge transaction logs

Posted on 2009-03-30
1,548 Views
Last Modified: 2012-05-06
Hi All

I have the following SQL query I would like to run on a very large table to remove any rows more than 2 months old:

delete from tablename
WHERE logTime < dateadd(mm,-2,getdate())

Where tablename is the name of the table and logTime is the date the row was created.

Problem is when I run the query the transaction log grows to be gigantic - is there a way I can run the query without this happening?

Thanks
Question by:Elthamtech
16 Comments
 
LVL 7

Expert Comment

by:luani
You can put the database in simple mode (so it will not log anything), delete the records, and then reset it to FULL. This works if you have to do this once, not all the time...
 
LVL 60

Expert Comment

by:chapmandew
this should help:

while exists (select 1 from tablename WHERE logTime < dateadd(mm,-2,getdate()))
begin
    delete top (100) from tablename
    WHERE logTime < dateadd(mm,-2,getdate())

    waitfor delay '0:0:02'
end
 
LVL 13

Expert Comment

by:St3veMax
Sadly not. All delete commands are logged. You can't use TRUNCATE TABLE as this clears the lot.

How many records are there in the table, and how many are you looking to get rid of?

You could consider inserting all recs less than 2 months old (the ones you want to keep) into a new table, dropping the old table and then renaming the new table.

If you do the above, please try it in a development environment first!

HTH
 
LVL 37

Accepted Solution

by:
momi_sabag earned 300 total points
changing the log mode won't help
you need to break it into smaller transactions, for example try to perform this:

declare @rowcount int
set @rowcount = 1

while @rowcount > 0
begin
    ;with a as (
        select top 5000 * from tablename
        WHERE logTime < dateadd(mm,-2,getdate())
    )
    delete from a

    select @rowcount = @@rowcount
end
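For reference, the same batching idea can be written without a CTE, using DELETE TOP directly (a sketch, assuming SQL Server 2005 or later and the table/column names from the question; the batch size of 5000 is arbitrary and worth tuning):

```sql
-- Delete in small batches so each transaction, and therefore
-- the log space it needs, stays bounded.
declare @rowcount int
set @rowcount = 1

while @rowcount > 0
begin
    delete top (5000) from tablename
    where logTime < dateadd(mm, -2, getdate())

    set @rowcount = @@rowcount
end
```

In simple recovery mode, the log space used by each completed batch becomes reusable at the next checkpoint, so the log file should stay small instead of growing to cover the whole delete as one giant transaction.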
 
LVL 7

Expert Comment

by:luani
>>changing the log mode won't help
Have a try anyway; it's the simplest and fastest solution to your problem...
 
LVL 60

Expert Comment

by:chapmandew
Simple and fast doesn't mean it is the best way. It will mean you cannot recover to a point in time for your other transactions.
 
LVL 7

Expert Comment

by:luani
>>it will cause you to not be able to recover to a point in time for your other transactions.
Totally agree; he can always do a backup before everything... but let him choose what's best for him ;)
 
LVL 60

Expert Comment

by:chapmandew
can also do this:

while exists (select 1 from tablename WHERE logTime < dateadd(mm,-2,getdate()))
begin
    delete top (100) from tablename
    WHERE logTime < dateadd(mm,-2,getdate())

    checkpoint
end
 

Author Comment

by:Elthamtech
I didn't expect so many responses! Very impressive, Thanks! In answer to a couple of your questions:

1. My database is already in simple mode
2. The table in question currently has 67112479 rows
3. The database is backed up using CommVault every night

I'm going to try a couple of your suggestions shortly, I'll let you know how I go
 
LVL 75

Expert Comment

by:Anthony Perkins
>>My database is already in simple mode<<
At least that settles the "why don't you put the database in simple mode" suggestion. :)
 
LVL 75

Expert Comment

by:Anthony Perkins
>>The table in question currently has 67112479 rows<<
And more importantly how many would remain?
 
LVL 7

Expert Comment

by:luani
>>Problem is when I run the query the transaction log grows to be gigantic
In simple mode it should not be growing!?
Anyway, put the records you want to keep in a temp table:

select * into temp1 from tablename where logTime >= dateadd(mm,-2,getdate())

-- truncate the original table
truncate table tablename

-- and then re-insert the temp table records
insert into tablename select * from temp1
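One caveat with the truncate-and-reload approach: if the table has an identity column, the reload needs IDENTITY_INSERT enabled to preserve the original key values. A sketch, using a hypothetical table definition tablename(id, logTime, msg) where id is the identity column (the real column list must be spelled out explicitly):

```sql
-- IDENTITY_INSERT requires an explicit column list on the insert.
set identity_insert tablename on

insert into tablename (id, logTime, msg)
select id, logTime, msg from temp1

set identity_insert tablename off
```

Without this, SQL Server would either reject the insert or generate fresh identity values, breaking any foreign keys that reference the old ones.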
 

Author Comment

by:Elthamtech
acperkins: I'm guessing about 1/3 of the transactions will remain

luani: The database is definitely in simple recovery mode, and when I tried to execute my query the log file grew to be over 100gb

My concern with both those suggestions is that because I'll be moving about a third of the transactions, the log will still get too big. Is there any other way?

Thanks
 
LVL 60

Assisted Solution

by:chapmandew
chapmandew earned 200 total points
Yes...the other way is to delete the records in increments like momi and I showed you.
 
LVL 75

Expert Comment

by:Anthony Perkins
>>I'm guessing about 1/3 of the transactions will remain<<
Copy them to a temporary table T_YourTableName with the same structure, without any indexes or constraints.
Drop the original table YourTableName.
Rename the temporary table T_YourTableName to YourTableName.
Add the indexes and constraints.

>>The database is definitely in simple recovery mode, and when I tried to execute my query the log file grew to be over 100gb<<
You know what they say: you can lead a horse to water, but you cannot make him drink. Or would "beating a dead horse" be a better metaphor? :)
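The copy/drop/rename steps above can be sketched as follows (object names are placeholders taken from the comment; SELECT INTO creates the new table without indexes or constraints, which must be recreated at the end):

```sql
-- 1. Copy only the rows to keep into a new, unindexed table
select * into T_YourTableName
from YourTableName
where logTime >= dateadd(mm, -2, getdate())

-- 2. Drop the original table
drop table YourTableName

-- 3. Rename the copy into place
exec sp_rename 'T_YourTableName', 'YourTableName'

-- 4. Recreate indexes and constraints here
```

Note that SELECT INTO is minimally logged in simple recovery mode, which is why this approach generates far less log than a large DELETE.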
 

Author Comment

by:Elthamtech
Thanks heaps guys

I ran momi_sabag's query on my table last night and the log only grew to 150mb (not 100gb!)

Thanks to chapmandew for helping me realise what I had to do.

I appreciate everyone's comments, but because I was only looking to delete 1/3 of the rows, I felt that copying to a temporary table wouldn't help that much.

Cheers
