
Solved

SQL 2005 transaction log

Posted on 2011-03-11
9
Medium Priority
267 Views
Last Modified: 2012-05-11
Hi,

I have a SQL 2005 cluster server running 32-bit. I have a database roughly 520 GB in size, and the transaction log is currently sitting at 72 GB. I have a full backup that runs daily, which I thought was supposed to truncate the log file, but the transaction log is not shrinking.

What can I do to solve this? It is affecting performance.

Regards
Question by:monarchit
9 Comments
 
LVL 30

Expert Comment

by:Rich Weissler
ID: 35108110
Backing up and truncating the logs doesn't cause them to shrink; it allows the space to be reused.

The best practice recommendation is also to NOT shrink the transaction logs, unless you know something very unusual has occurred to cause them to grow much larger than they need to be.
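Before deciding whether to shrink anything, it helps to see how full the log actually is and what is preventing the space from being reused. As a quick sketch (the database name is a placeholder for your own):

```sql
-- Show log size and percent used for every database on the instance
DBCC SQLPERF(LOGSPACE);

-- Show the recovery model and the reason, if any, that log space
-- cannot currently be reused (e.g. LOG_BACKUP, ACTIVE_TRANSACTION)
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'YourDatabase';
```

If `log_reuse_wait_desc` reports `LOG_BACKUP`, the database is in the full recovery model and the log will keep growing until transaction log backups are taken.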
 
LVL 7

Expert Comment

by:Gene_Cyp
ID: 35108126
One key way to stop it from growing too much is to change the recovery model to "Simple" in the database settings.

Just make sure that the recovery model you select meets your recovery needs.
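The switch can also be made in T-SQL. A minimal sketch (database name is a placeholder; note that point-in-time recovery via log backups is lost once you do this):

```sql
-- Switch the database to the simple recovery model
ALTER DATABASE YourDatabase SET RECOVERY SIMPLE;
GO
```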

 
LVL 6

Expert Comment

by:graf0
ID: 35108240
If you don't need to take transaction log backups, just switch the database to the Simple Recovery Model and shrink the log file manually, using Tasks > Shrink > Files on the database.
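The same one-time shrink can be done in T-SQL instead of the GUI. A sketch, assuming a placeholder database and logical file name; pick a target size that fits your workload rather than shrinking to nothing:

```sql
USE YourDatabase;
GO

-- Find the logical name of the transaction log file
SELECT name FROM sys.database_files WHERE type_desc = 'LOG';

-- Shrink the log file to a target size in MB (4096 MB here is an example)
DBCC SHRINKFILE (YourDatabase_log, 4096);
```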
 
LVL 30

Expert Comment

by:Rich Weissler
ID: 35108296
If you don't need the point in time recovery... I concur with Gene_Cyp and graf0 -- Simple Recovery Model might be a good fit for you.  

I feel I should mention, however, one of the mistakes I made when starting with SQL: I assumed the transaction log in the Simple Recovery Model was either minimally used or not used. Be aware that the transaction log still needs to be sized sufficiently to hold entire transactions while they are being committed to the database. It won't be 'nothing' and probably won't be just the size of a single transaction... but it will probably be smaller than your transaction log will want to be under the Full Recovery Model.

One of the worst things you can do for performance would be to go through repeated cycles of shrinking and growing your transaction log files...
 
LVL 21

Expert Comment

by:Alpesh Patel
ID: 35108321
Hi,

First take a backup of the log file, then truncate it.
 
LVL 2

Expert Comment

by:Umesh_Madap
ID: 35115890
Most of the guys have already answered your question.

I would like to know how frequently you are taking the t-log backups. For example, if you are taking a t-log backup every hour, change it to every 30 minutes and see. If you don't need point-in-time recovery, set the database to the simple recovery model.
 
LVL 1

Accepted Solution

by:
rcharlton earned 2000 total points
ID: 35116075
I've worked in high availability VLDBs (Very Large Database Environments) with SQL Server 2005. The best way that I've found is to perform the following:

1. Set the recovery interval to 5 minutes. With too long a recovery interval, SQL Server is doing too many "background things" while your DB users are waiting for it. Too short and you have the same scenario. If a restore becomes necessary, you stand to lose at most the 5 minutes of data prior to the failure.
2. Back up the transaction log every 5 minutes; this keeps it small and compact and allows you to shrink the transaction log. (Yikes! Yours is at 72 GB? Do you need that much for the transaction log? Probably not.)
3. You only have to worry about space where the transaction log backups are being placed.

As a caveat, I would implement the following backup strategy:

1. Perform a full backup once a week, say on Sunday.
2. Perform a differential backup Monday through Saturday.
3. Perform a log backup every 5 minutes.
4. Create stored procedures for the above, including ones that will do the restore.
5. Optionally, your database can be architected to use filegroups and partitioning. Infrequently accessed data, or data that never changes, can live on some filegroups; data that changes frequently can live on others. You only need to back up data on filegroups that are changing, which SIGNIFICANTLY reduces the time to perform the backup. Although somewhat more complex to implement, once you get the hang of it, it's a breeze, and your backups / restores happen very quickly. In my scenario, for example, I had imports which happened on a certain day of the week. Those tables were targeted for certain filegroups, with partitioning, and a backup of those filegroups was performed on those days. The same was true for other filegroups which accepted data on different days; those filegroups were backed up on those certain days as well. It sounds complicated, but it reduces the amount of backup space required, frees up the server for the production users, and will increase performance.
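The core of steps 1-3 of this strategy can be sketched in T-SQL. The database name, backup paths, and target sizes below are placeholders; the recurring backups would be scheduled as SQL Server Agent jobs:

```sql
-- Set the recovery interval to 5 minutes (server-wide, advanced option)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'recovery interval (min)', 5;
RECONFIGURE;

-- Weekly full backup (e.g. Sunday)
BACKUP DATABASE YourDatabase
TO DISK = 'X:\Backups\YourDatabase_full.bak';

-- Daily differential backup (Monday through Saturday)
BACKUP DATABASE YourDatabase
TO DISK = 'X:\Backups\YourDatabase_diff.bak' WITH DIFFERENTIAL;

-- Transaction log backup, scheduled every 5 minutes
BACKUP LOG YourDatabase
TO DISK = 'X:\Backups\YourDatabase_log.trn';
```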
 

Author Closing Comment

by:monarchit
ID: 36032707
thanks
