Solved

The transaction log for database 'tempdb' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases

Posted on 2014-04-01
3,202 Views
Last Modified: 2014-04-04
Trying to delete 300 million records from a table that has 500 million records. After 2 hours I get the following:

The transaction log for database 'tempdb' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases

Any ideas? Thanks.
Question by: JElster
4 Comments
 
LVL 52

Assisted Solution

by: Carl Tawn
Carl Tawn earned 250 total points
ID: 39969304
I imagine you have simply run out of space in the tempdb log file. You'll have to check how big the log file is and either allow it more space (if disk capacity allows), or switch to deleting in batches, rather than attempting to delete the whole 300 million rows in one go.

Also, if you run the following statement, it should confirm what the issue is:
SELECT log_reuse_wait_desc
FROM sys.databases
WHERE database_id = DB_ID('tempdb');

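If it does turn out to be simple lack of space, here is a rough sketch of how to check the tempdb log's size and grow it. Note the assumptions: 'templog' is only the default logical name of the tempdb log file, and 8192MB is an arbitrary example size, so verify both against the query output before running the ALTER.

-- Percent of each database's log currently in use:
DBCC SQLPERF(LOGSPACE);

-- Size (MB) and growth settings of the tempdb log file:
SELECT name, size * 8 / 1024 AS size_mb, growth, max_size
FROM tempdb.sys.database_files
WHERE type_desc = 'LOG';

-- Grow it if disk capacity allows:
ALTER DATABASE tempdb MODIFY FILE (NAME = templog, SIZE = 8192MB);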

 
LVL 69

Accepted Solution

by: Scott Pletcher
Scott Pletcher earned 250 total points
ID: 39969353
>> Trying to delete 300 million records from a table that has 500 million records. <<

Is that table in tempdb or another db?

If it's in another db, I suspect that snapshot isolation of some type is on for that database, causing SQL to have to keep versions of the deleted rows in tempdb.
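A quick way to check (a sketch; 'YourDb' is a placeholder for the actual database name):

-- Either option being ON means SQL Server keeps row versions in tempdb:
SELECT name,
       snapshot_isolation_state_desc,
       is_read_committed_snapshot_on
FROM sys.databases
WHERE name = 'YourDb';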


Deleting 300M rows all in one shot is not normally a good idea anyway -- way too large for a single transaction, especially if, heaven forbid, that sucker needs to roll back.

If at all possible, try deleting in batches, say 100K at a time.  Add a 1/3 or 1/2 second delay (WAITFOR DELAY) between batches if you can afford the time.
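Something along these lines (a sketch only -- the table name, WHERE clause, batch size, and delay are all placeholders to adapt):

DECLARE @rows int = 1;

WHILE @rows > 0
BEGIN
    -- Delete one batch of rows matching the criteria:
    DELETE TOP (100000)
    FROM dbo.BigTable
    WHERE SomeDate < '20130101';

    SET @rows = @@ROWCOUNT;

    -- Pause between batches to let other work through:
    WAITFOR DELAY '00:00:00.500';
END;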
 
LVL 1

Author Comment

by: JElster
ID: 39969376
Another db
0
 
LVL 75

Expert Comment

by: Anthony Perkins
ID: 39977986
In addition to the comments about doing this in batches (I would recommend no larger than 500K rows), I would also make sure that you are doing frequent transaction log backups if the recovery model for this database is Full.
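For example (a sketch; 'YourDb' and the backup path are placeholders):

-- Confirm the recovery model first:
SELECT name, recovery_model_desc
FROM sys.databases
WHERE name = 'YourDb';

-- Under FULL recovery, back up the log between batches so the
-- space can be reused:
BACKUP LOG YourDb TO DISK = N'D:\Backups\YourDb_log.trn';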