Solved

The transaction log for database 'tempdb' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases

Posted on 2014-04-01
3,281 Views
Last Modified: 2014-04-04
Trying to delete 300 million records from a table that has 500 million records.
After 2 hours I get the following:

The transaction log for database 'tempdb' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases

Any ideas? Thanks.
Question by: JElster
4 Comments
 
LVL 52

Assisted Solution

by: Carl Tawn
Carl Tawn earned 250 total points
ID: 39969304
I imagine you have simply run out of space in the tempdb log file. You'll have to check how big the log file is and either allow it more space (if disk capacity allows), or switch to deleting in batches, rather than attempting to delete the whole 300 million rows in one go.

Also, if you run the following statement, it should confirm what the issue is:
SELECT log_reuse_wait_desc FROM sys.databases WHERE database_id = DB_ID('tempdb');
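
To check how big the tempdb log actually is and how much of it is in use, something like this should help (standard commands; note that size in sys.database_files is reported in 8 KB pages):

-- Log size (MB) and percent used for every database, including tempdb
DBCC SQLPERF(LOGSPACE);

-- Or inspect the tempdb log file(s) directly
SELECT name, size * 8 / 1024 AS size_mb, max_size, growth
FROM tempdb.sys.database_files
WHERE type_desc = 'LOG';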

 
LVL 69

Accepted Solution

by: Scott Pletcher
Scott Pletcher earned 250 total points
ID: 39969353
>> Trying to delete 300 million records from a table that has 500 million records. <<

Is that table in tempdb or another db?

If it's in another db, I suspect that snapshot isolation of some type is on for that database, causing SQL to have to keep versions of the deleted rows in tempdb.
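
You can confirm whether either form of row versioning is enabled with a quick check (a sketch; 'YourDb' is a placeholder for the database that holds the table):

SELECT name, snapshot_isolation_state_desc, is_read_committed_snapshot_on
FROM sys.databases
WHERE name = 'YourDb';  -- placeholder; use your actual database name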


Deleting 300M rows all in one shot is not normally a good idea anyway -- way too large for a single transaction, esp. if heaven-forbid that sucker needs to roll back.

If at all possible, try deleting in batches, say 100K at a time.  Add a 1/3 or 1/2 second delay (WAITFOR DELAY) between batches if you can afford the time.
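
A minimal sketch of that pattern, assuming a hypothetical table dbo.BigTable and a filter you'd replace with your real delete criteria:

-- Each DELETE TOP batch commits on its own (under autocommit),
-- so log space can be reused between batches.
DECLARE @rows int = 1;
WHILE @rows > 0
BEGIN
    DELETE TOP (100000)
    FROM dbo.BigTable                      -- hypothetical table name
    WHERE SomeDateCol < '20130101';        -- hypothetical filter

    SET @rows = @@ROWCOUNT;

    WAITFOR DELAY '00:00:00.500';          -- half-second pause between batches
END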
 
LVL 1

Author Comment

by: JElster
ID: 39969376
Another db
 
LVL 75

Expert Comment

by: Anthony Perkins
ID: 39977986
In addition to the comments about doing this in batches (I would recommend no larger than 500K rows), I would also make sure that you are doing frequent transaction log backups if the Recovery Model for this database is Full.
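
For example, something along these lines run periodically during the purge (a sketch; the database name and backup path are placeholders):

-- Backing up the user database's log lets SQL Server reuse the log space
BACKUP LOG YourDb                          -- placeholder database name
TO DISK = N'D:\Backups\YourDb_log.trn';    -- placeholder path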
