Solved

SQL Server log file too large to manage - what's the fastest way to compact it?

Posted on 2007-12-05
3,160 Views
Last Modified: 2010-04-21
I have an MS SQL Server 2000 log file that is 99,146,176 KB in size and takes up all of the C:\ drive.  Obviously I have tried shrinking it, but I keep getting an error telling me to back up the log first.  I seriously do not need a single log of the transactions that have occurred.  How can I just delete everything in there to reduce the file size?

Currently I am running a BACKUP LOG TO DISK - not realizing how huge the resulting file would be. How can I stop what is taking place, without disrupting anything, so I can follow your comments below?

I will never need more than a log of what happened in the last week. I'm sorry, I don't know what size that works out to, but it is definitely not going to be that huge. How can I prevent this from happening in the future?
Question by:stephenlecomptejr
11 Comments
 
LVL 21

Assisted Solution

by:mastoo
mastoo earned 160 total points
ID: 20412075
BACKUP LOG dbname WITH TRUNCATE_ONLY

After doing so, you probably want to run a full database backup, and then either put the database in the simple recovery model or start doing regular log backups to avoid a repeat of the problem.
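
To put that together - a rough sketch of the full cleanup sequence on SQL Server 2000, assuming the database is named MyDb and its log file's logical name is MyDb_Log (check with sp_helpfile); the backup path is a placeholder:

-- Discard the inactive log records without backing them up
-- (TRUNCATE_ONLY is SQL Server 2000/2005 syntax; removed in later versions)
BACKUP LOG MyDb WITH TRUNCATE_ONLY

-- Truncation alone does not release disk space; shrink the file too.
-- 'MyDb_Log' is the logical file name; 1024 is the target size in MB.
DBCC SHRINKFILE ('MyDb_Log', 1024)

-- The log chain is now broken, so take a full backup right away
BACKUP DATABASE MyDb TO DISK = 'D:\Backups\MyDb_full.bak'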
 
LVL 5

Assisted Solution

by:MrNetic
MrNetic earned 160 total points
ID: 20412092
stephenlecomptejr,

It seems that your database is in the full recovery model.

If the database you are talking about is a production DB, you should consider implementing daily backups of your transaction log; otherwise, put the database in the SIMPLE recovery model.

Hope it helps....
 
LVL 8

Expert Comment

by:i2mental
ID: 20412116
What recovery mode is this database in? If it's FULL, your log file has grown because you aren't making regular log backups. Every insert/update/delete is being logged into this file, and it will grow over time. To keep it from growing too large, you need to set up a maintenance plan to back up the log at a regular interval - perhaps every 30 minutes or every hour, depending on usage.

The proper way to free up that space now, if you are using FULL recovery, is to let the log backup you're doing now finish. This will create a .trn file that is there for when you need to do a restore. It is no longer required for the database to run, so you can delete it once the backup is complete. Note that this will mean you can only restore to the last time you made a full backup, so I would advise you do a full backup once this log backup is complete so that you don't have potentially unrecoverable data.
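
A scheduled log backup is just a one-line statement that a maintenance plan or SQL Server Agent job runs on an interval; a sketch, with MyDb and the path as placeholders:

-- Back up (and thereby truncate) the transaction log; schedule this
-- every 30-60 minutes via a maintenance plan or SQL Server Agent job
BACKUP LOG MyDb TO DISK = 'D:\Backups\MyDb_log.trn'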
 
LVL 1

Author Comment

by:stephenlecomptejr
ID: 20412124
I tried what you stated and now my log is back down to 1024 KB.
Thank you,

My database file is 3,199,168 KB in size after being shrunk.
I do have it scheduled to back up every Sunday - what else am I doing wrong?

I'm not sure how to put it in simple recovery, or why doing log backups would prevent a repeat of the same problem.
 
LVL 1

Author Comment

by:stephenlecomptejr
ID: 20412153
Oops, got my answer from the comments above.

Thanks guys - just freakin' out over here in a panic.
I notice that every time I run a DTS package (I've just run one so far), my log file size goes back up - to 517,184 KB.

I really don't need that much log information every time a DTS package is run.
Any clues on how to fix this? My log will balloon too large, and even a log backup every time would be useless for this.
 
LVL 8

Accepted Solution

by:i2mental
i2mental earned 180 total points
ID: 20412156
Right-click on the database and select Properties. Go into the "Options" section and set the Recovery Model to Simple. The simple recovery model only allows you to restore to the last time you made a FULL backup, instead of allowing you to restore to a point in time up until your last log backup.
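
The same change can be made from T-SQL on SQL Server 2000; a one-line sketch, assuming the database is named MyDb:

-- Switch the database to the simple recovery model
ALTER DATABASE MyDb SET RECOVERY SIMPLE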
 
LVL 5

Expert Comment

by:MrNetic
ID: 20412169
Read my previous post ( 20412092 ).
 
LVL 1

Author Comment

by:stephenlecomptejr
ID: 20412175
i2mental,
Should I also enable Auto Shrink?
 
LVL 8

Expert Comment

by:i2mental
ID: 20412202
Depending on what the DTS package is doing, this file size may be necessary/optimal. When the database is set to simple, all transactions are still logged until a checkpoint is reached - roughly, the point when the data is finally written to disk at the end of an insert/update/delete statement. If that statement is particularly large, the log file holds that information until it can be committed. Once it is written, the log is effectively cleaned out, but the physical file stays the same size. There is a cost to growing the log file to the size that is needed, so SQL Server keeps it at that size so that the next time the process runs, the file doesn't have to grow again. I would recommend you leave it at the size it is now, since you're set to simple.

You can set Auto Shrink if you like, but like I said, if this same DTS package is going to run each time, the log is just going to keep growing back out, and that's overhead you don't necessarily need.
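
If you want to confirm that the big file is mostly empty space being held for reuse, SQL Server 2000 can report log usage directly:

-- Report each database's log size in MB and the percentage in use;
-- a large log that is only a few percent full is just reserved space
DBCC SQLPERF(LOGSPACE)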
 
LVL 21

Expert Comment

by:mastoo
ID: 20412220
Be sure you understand the difference between full and simple recovery modes.  In simple mode, you can only recover to the last full backup.  In full mode, you need to run full backups and periodic log backups and then you can recover to the last log backup.
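
To make that concrete - a point-in-time restore under the full model replays log backups on top of the last full backup. A sketch, with names, paths, and the timestamp as placeholders:

-- Restore the last full backup but leave the database restoring
RESTORE DATABASE MyDb FROM DISK = 'D:\Backups\MyDb_full.bak'
    WITH NORECOVERY

-- Roll the log forward to a specific moment, then bring it online
RESTORE LOG MyDb FROM DISK = 'D:\Backups\MyDb_log.trn'
    WITH STOPAT = '2007-12-05 14:30:00', RECOVERY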
 
LVL 1

Author Closing Comment

by:stephenlecomptejr
ID: 31412867
The combination of everyone's comments helped me solve my problem.  I hope no one is offended by my splitting the points.
