Solved

SQL Server log file too large to manage - what's the fastest way to compact it?

Posted on 2007-12-05
11
3,133 Views
Last Modified: 2010-04-21
I have a MS SQL Server 2000 log file that is 99,146,176 KB and takes up almost all of the C:\ drive.  Obviously I tried shrinking it, but I keep getting an error telling me to back up the log first.  I seriously do not need a record of every transaction that has occurred.  How can I just delete everything in there to reduce the file size?

I just ran a BACKUP LOG ... TO DISK, not realizing how huge the file was - how can I stop what is currently running, without disrupting anything, so I can follow the comments below?

I will never need more than a log of what happened in the last week - I'm sorry, I don't know what size that works out to, but it is definitely not going to be that huge.  How can I prevent this from happening in the future?
Question by:stephenlecomptejr
11 Comments
 
LVL 21

Assisted Solution

by:mastoo
mastoo earned 160 total points
ID: 20412075
BACKUP LOG dbname WITH TRUNCATE_ONLY

After doing so, you probably want to run a full db backup and either put the database in simple recovery mode or start doing log backups to avoid a repeat of the problem.
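
For example, a rough sketch of that sequence on SQL Server 2000 - assuming the database is named MyDb and its log file is logically named MyDb_Log (check sysfiles for the actual logical name), with a made-up backup path:

-- Discard the log records without backing them up
BACKUP LOG MyDb WITH TRUNCATE_ONLY
GO
-- Shrink the physical log file down to roughly 100 MB
DBCC SHRINKFILE (MyDb_Log, 100)
GO
-- Then take a fresh full backup so you have a valid restore point
BACKUP DATABASE MyDb TO DISK = 'D:\Backups\MyDb_full.bak'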
 
LVL 5

Assisted Solution

by:MrNetic
MrNetic earned 160 total points
ID: 20412092
stephenlecomptejr,

It seems that your database is in the FULL recovery model.

If the database you are talking about is a production DB, you should consider implementing daily backups of your transaction log; otherwise, put the database in the SIMPLE recovery model.
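
A minimal sketch of both options, with MyDb as a placeholder database name:

-- Option 1: switch to the SIMPLE recovery model (the log truncates on checkpoint)
ALTER DATABASE MyDb SET RECOVERY SIMPLE
-- Option 2: stay in FULL recovery and rely on scheduled log backups instead
ALTER DATABASE MyDb SET RECOVERY FULL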

Hope it helps....
 
LVL 8

Expert Comment

by:i2mental
ID: 20412116
What recovery model is this database in? If it's FULL, your log file has grown because you aren't making regular log backups. Every insert/update/delete is logged to this file, so it grows over time. To keep it from growing too large, you need to set up a maintenance plan that backs up the log at a regular interval - perhaps every 30 minutes or every hour, depending on usage.

The proper way to free up that space now, if you are using FULL recovery, is to let the log backup you're running finish. This creates a .trn file that is there for when you need to do a restore. It is not required anymore for the database to run, so you can delete it once the backup is complete. Note that this means you can only restore to the last time you made a full backup, so I would advise you to run a full backup once this log backup completes, so that you don't have potentially unrecoverable data.
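
A sketch of what those two backups might look like, with MyDb and the disk paths as placeholder names:

-- Run on a schedule (e.g. every 30-60 minutes) to keep the log trimmed
BACKUP LOG MyDb TO DISK = 'D:\Backups\MyDb_log.trn'
-- Run once the log backup finishes, so you have a current restore point
BACKUP DATABASE MyDb TO DISK = 'D:\Backups\MyDb_full.bak'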
 
LVL 1

Author Comment

by:stephenlecomptejr
ID: 20412124
I tried what you stated and now my log is back down to 1024 KB.
Thank you,

My database file is 3,199,168 KB after being shrunk.
I do have it on a schedule to back up every Sunday - what else am I doing wrong?

I'm not sure how to put it in simple recovery, or why doing log backups would prevent a repeat of the same problem.
 
LVL 1

Author Comment

by:stephenlecomptejr
ID: 20412153
Oops - got my answer from the comments above.

Thanks guys - just freakin' out over here in a panic.
I notice that every time I run a DTS package (I've only run one so far), my log file size goes back up to 517,184 KB.

I really don't need that much log information every time a DTS package runs.
Any clues on helping me fix this? My log size will balloon too large, and even a simple log backup every time would be useless for this.
 
LVL 8

Accepted Solution

by:i2mental
i2mental earned 180 total points
ID: 20412156
Right-click on the database and select Properties. Go into the "Options" section and set the Recovery Model to Simple. The simple recovery model only lets you restore to the last time you made a FULL backup, instead of letting you restore to a point in time up to your last log backup.
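
If you'd rather not use the GUI, a sketch of the T-SQL equivalent on SQL Server 2000, with MyDb as a placeholder name:

-- Legacy database option that corresponds to the SIMPLE recovery model
EXEC sp_dboption 'MyDb', 'trunc. log on chkpt.', 'true'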
 
LVL 5

Expert Comment

by:MrNetic
ID: 20412169
Read my previous post (ID: 20412092).
 
LVL 1

Author Comment

by:stephenlecomptejr
ID: 20412175
i2mental,
Should I also select Auto Shrink?
 
LVL 8

Expert Comment

by:i2mental
ID: 20412202
Depending on what the DTS package is doing, this file size may be necessary/optimal. When the database is set to simple, all transactions are still logged until a checkpoint is reached. A checkpoint is roughly the point when the data is finally written to disk at the end of an insert/update/delete statement.  If that statement is particularly large, the log file will hold that information until it can be committed.  Once it is written, the log file is effectively cleaned out internally, but the physical file stays the same size. There is a cost to growing the log file to the size that is needed, so SQL Server keeps it at that size so that the next time the process runs, the file doesn't have to grow again. I would recommend you leave it at the size it is now if you're set to simple.

You can set auto shrink if you like, but like I said, if this same DTS package is going to run each time, the file is just going to keep growing itself back out, and that's overhead you don't necessarily need.
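
If you want to see how full the log actually gets while the DTS package runs, one option is:

-- Reports log file size and percent used for every database on the server
DBCC SQLPERF (LOGSPACE)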
 
LVL 21

Expert Comment

by:mastoo
ID: 20412220
Be sure you understand the difference between the full and simple recovery models.  In the simple model, you can only recover to the last full backup.  In the full model, you need to run full backups plus periodic log backups, and then you can recover to the last log backup.
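
A quick way to check which model a database is currently in, again with MyDb as a placeholder:

-- Returns FULL, SIMPLE, or BULK_LOGGED
SELECT DATABASEPROPERTYEX('MyDb', 'Recovery')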
 
LVL 1

Author Closing Comment

by:stephenlecomptejr
ID: 31412867
The combination of everyone's comments helped me solve my problems.  I hope no one takes offense at my splitting the points.
