
Solved

The transaction log for database 'tempdb' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases

Posted on 2014-04-01
Medium Priority
3,541 Views
Last Modified: 2014-04-04
Trying to delete 300 million records from a table that has 500 million records. After 2 hours I get the following error:

The transaction log for database 'tempdb' is full. To find out why space in the log cannot be reused, see the log_reuse_wait_desc column in sys.databases

Any ideas? Thanks.
Question by: JElster
4 Comments
 
LVL 52

Assisted Solution

by: Carl Tawn
Carl Tawn earned 1000 total points
ID: 39969304
I imagine you have simply run out of space in the tempdb log file. You'll have to check how big the log file is and either allow it more space (if disk capacity allows), or switch to deleting in batches, rather than attempting to delete the whole 300 million rows in one go.

Also, if you run the following statement, it should confirm what the issue is:
SELECT log_reuse_wait_desc FROM sys.databases WHERE database_id = DB_ID('tempdb');

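For reference, two standard ways to see how big the tempdb log is and how full it is (these are stock SQL Server commands, nothing specific to your setup is assumed):

-- Percentage of each database's log currently in use
DBCC SQLPERF(LOGSPACE);

-- Size and autogrowth settings of the tempdb log file (size is in 8 KB pages)
SELECT name, size * 8 / 1024 AS size_mb, max_size, growth, is_percent_growth
FROM tempdb.sys.database_files
WHERE type_desc = 'LOG';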

 
LVL 70

Accepted Solution

by: Scott Pletcher
Scott Pletcher earned 1000 total points
ID: 39969353
>> Trying to delete 300 million records from a table that has 500 million records. <<

Is that table in tempdb or another db?

If it's in another db, I suspect that snapshot isolation of some type is on for that table, causing SQL to have to keep versions of the deleted rows in tempdb.
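A quick way to check is to look at the two row-versioning settings in sys.databases (here 'YourDb' is just a placeholder for the database that actually holds the table):

-- Check both flavors of row versioning for the source database
-- ('YourDb' is a placeholder for your actual database name)
SELECT name,
       snapshot_isolation_state_desc,
       is_read_committed_snapshot_on
FROM sys.databases
WHERE name = 'YourDb';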


Deleting 300M rows all in one shot is not normally a good idea anyway -- way too large for a single transaction, especially if, heaven forbid, it needs to roll back.

If at all possible, try deleting in batches, say 100K at a time.  Add a 1/3 or 1/2 second delay (WAITFOR DELAY) between batches if you can afford the time.
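A minimal sketch of that pattern, assuming a table called dbo.BigTable and a delete condition on a column called SomeDate (both are placeholders for your actual schema):

-- Delete 100K rows at a time; under autocommit each DELETE is its own
-- transaction, so log space can be reused between batches.
-- dbo.BigTable and the WHERE clause are placeholders.
DECLARE @rows int = 1;

WHILE @rows > 0
BEGIN
    DELETE TOP (100000)
    FROM dbo.BigTable
    WHERE SomeDate < '20130101';

    SET @rows = @@ROWCOUNT;

    WAITFOR DELAY '00:00:00.500';  -- the 1/2 second breather mentioned above
END;

Because each batch commits on its own, a failure only rolls back the current 100K rows rather than the whole 300 million.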
 
LVL 1

Author Comment

by: JElster
ID: 39969376
Another db
 
LVL 75

Expert Comment

by: Anthony Perkins
ID: 39977986
In addition to the comments about doing this in batches (I would recommend no larger than 500K rows), I would also make sure that you are doing frequent Transaction Log backups if the Recovery Model for this database is Full.
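For example, something along these lines run periodically between batches (the database name and backup path are placeholders):

-- Back up the log so committed log space can be reused
-- ('YourDb' and the DISK path are placeholders)
BACKUP LOG [YourDb] TO DISK = N'D:\Backups\YourDb_log.trn';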
