SQL delete script without creating huge transaction logs

Hi All

I have the following SQL query I would like to run on a very large table to remove any rows more than 2 months old:

DELETE FROM tablename
WHERE logTime < DATEADD(mm, -2, GETDATE())

Where tablename is the name of the table and logTime is the date the row was created.

The problem is that when I run the query, the transaction log grows to be gigantic. Is there a way I can run it without this happening?

Thanks
Asked by Elthamtech
2 Solutions
 
Luan Jubica (Project Manager) Commented:
You can put the database in SIMPLE recovery mode (so the log space is reused at each checkpoint instead of accumulating), delete the records, and then set it back to FULL. This works if you have to do this one time, not all the time....
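
A rough sketch of that switch, assuming a hypothetical database name YourDb and an example backup path (neither is from this thread); take a full backup before trying it, since changing the recovery model breaks the log backup chain:

-- Hypothetical database name, for illustration only.
ALTER DATABASE YourDb SET RECOVERY SIMPLE;

DELETE FROM tablename
WHERE logTime < DATEADD(mm, -2, GETDATE());

ALTER DATABASE YourDb SET RECOVERY FULL;

-- Take a full (or differential) backup afterwards so log backups can resume.
BACKUP DATABASE YourDb TO DISK = 'C:\Backups\YourDb.bak';  -- example path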
 
chapmandew Commented:
This should help:

-- Delete in small batches so each transaction stays small; the WAITFOR
-- gives the log (and other workloads) a breather between batches.
WHILE EXISTS (SELECT 1 FROM tablename WHERE logTime < DATEADD(mm, -2, GETDATE()))
BEGIN
    DELETE TOP (100) FROM tablename
    WHERE logTime < DATEADD(mm, -2, GETDATE());

    WAITFOR DELAY '0:0:02';
END
 
St3veMax Commented:
Sadly not. All DELETE statements are logged. You can't use TRUNCATE TABLE as this clears the lot.

How many records are there in the table and how many are you looking to get rid of?

You could consider inserting all records from the last 2 months (the ones you want to keep) into a new table, then dropping the old table and renaming the new one?

If you do the above, please try it in a development environment first!

HTH
 
momi_sabag Commented:
Changing the log mode won't help. You need to break it into smaller transactions; for example, try this:

DECLARE @rowCount int;
SET @rowCount = 1;

WHILE @rowCount > 0
BEGIN
    -- Delete the qualifying rows 5000 at a time so each transaction
    -- (and therefore its log usage) stays small.
    ;WITH a AS (
        SELECT TOP (5000) *
        FROM tablename
        WHERE logTime < DATEADD(mm, -2, GETDATE())
    )
    DELETE FROM a;

    -- @@ROWCOUNT drops to 0 once no qualifying rows are left, ending the loop.
    SELECT @rowCount = @@ROWCOUNT;
END
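
If you want to watch the log while a loop like this runs, this standard SQL Server command (not mentioned in the thread, but worth knowing) reports log size and percent used for every database:

-- Shows transaction log size and space used per database.
DBCC SQLPERF(LOGSPACE);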
 
Luan Jubica (Project Manager) Commented:
>>Changing the log mode won't help
Try it anyway; it's the simplest and fastest solution to your problem....

 
chapmandew Commented:
Simple and fast doesn't mean it is the best way. It will leave you unable to recover to a point in time for your other transactions.
 
Luan Jubica (Project Manager) Commented:
>>It will leave you unable to recover to a point in time for your other transactions.
Totally agree; he can always do a backup before everything... but let him choose what's best for him. ;)
 
chapmandew Commented:
You can also do this:

-- Same batching idea, but issue a CHECKPOINT after each batch; in the
-- SIMPLE recovery model this allows the log space to be reused.
WHILE EXISTS (SELECT 1 FROM tablename WHERE logTime < DATEADD(mm, -2, GETDATE()))
BEGIN
    DELETE TOP (100) FROM tablename
    WHERE logTime < DATEADD(mm, -2, GETDATE());

    CHECKPOINT;
END
 
Elthamtech (Author) Commented:
I didn't expect so many responses! Very impressive, thanks! In answer to a couple of your questions:

1. My database is already in simple mode
2. The table in question currently has 67,112,479 rows
3. The database is backed up using CommVault every night

I'm going to try a couple of your suggestions shortly; I'll let you know how I go.
 
Anthony Perkins Commented:
>>My database is already in simple mode<<
At least that settles the "why don't you put the database in simple mode for grins" solution. :)
 
Anthony Perkins Commented:
>>The table in question currently has 67,112,479 rows<<
And, more importantly, how many would remain?
 
Luan Jubica (Project Manager) Commented:
>>The problem is that when I run the query, the transaction log grows to be gigantic
In simple mode it should not be growing!?
Anyway, put the records you want to keep in a temp table:

SELECT * INTO temp1 FROM tablename WHERE logTime >= DATEADD(mm, -2, GETDATE())

-- truncate the original table (TRUNCATE TABLE is minimally logged)
TRUNCATE TABLE tablename

and then re-insert the temp table records back:

INSERT INTO tablename SELECT * FROM temp1
 
Elthamtech (Author) Commented:
acperkins: I'm guessing about 1/3 of the transactions will remain

luani: The database is definitely in simple recovery mode, and when I tried to execute my query the log file grew to over 100 GB.


My concern with both those suggestions is that, because I'll be moving about a third of the transactions, the log will still get too big. Is there any other way?

Thanks
 
chapmandew Commented:
Yes... the other way is to delete the records in increments, as momi and I showed you.
 
Anthony Perkins Commented:
>>I'm guessing about 1/3 of the transactions will remain<<
Copy them to a temporary table T_YourTableName with the same structure, without any indexes or constraints.
Drop the original table YourTableName.
Rename the temporary table T_YourTableName to YourTableName.
Add the indexes and constraints. (A sketch of these steps follows.)
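
A minimal sketch of those four steps, assuming the hypothetical names YourTableName and T_YourTableName and the same logTime cut-off as above; try it in development first:

-- 1. Copy the rows to keep into a new table (SELECT INTO creates a heap
--    with no indexes or constraints, and is minimally logged under SIMPLE).
SELECT *
INTO T_YourTableName
FROM YourTableName
WHERE logTime >= DATEADD(mm, -2, GETDATE());

-- 2. Drop the original table.
DROP TABLE YourTableName;

-- 3. Rename the new table into place.
EXEC sp_rename 'T_YourTableName', 'YourTableName';

-- 4. Re-add the indexes and constraints (this index is a made-up example).
CREATE INDEX IX_YourTableName_logTime ON YourTableName (logTime);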

>>The database is definitely in simple recovery mode, and when I tried to execute my query the log file grew to over 100 GB<<
You know what they say: you can lead a horse to water, but you cannot make him drink. Or would "beating a dead horse" be a better metaphor? :)
 
Elthamtech (Author) Commented:
Thanks heaps, guys.

I ran momi_sabag's query on my table last night and the log only grew to 150 MB (not 100 GB!).

Thanks to chapmandew for helping me realise what I had to do.

I appreciate everyone's comments, but because I was only looking to delete 1/3 of the rows, I felt that copying to a temporary table wouldn't help that much.

Cheers
Question has a verified solution.
