Solved

Shrink tlog while mirroring a database

Posted on 2009-04-14
Last Modified: 2012-05-06
Hello all,

I am trying to find information on the best way to shrink a transaction log while the database is being mirrored. Typically I set the recovery model to Simple and then run dbcc shrinkfile to bring the tlog down. With mirroring this is not possible, as a mirrored database must use the Full recovery model. My question is twofold:
1.) Can a transaction log be shrunk while mirroring? Since the db is in Full recovery mode, when I run dbcc shrinkfile I usually get an error message unless I set recovery to Simple.
2.) If I do set recovery to Simple while mirroring, what happens to the mirror, and if it breaks, is there a quick method for re-establishing it?
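For reference, the routine described above is roughly the following (a sketch only; "MyDB" and "MyDB_log" are placeholder names, and the target size is arbitrary):

-- Switch to Simple, shrink the log file, then switch back to Full.
USE [master];
ALTER DATABASE [MyDB] SET RECOVERY SIMPLE;

USE [MyDB];
DBCC SHRINKFILE (N'MyDB_log', 1024);   -- target size in MB

USE [master];
ALTER DATABASE [MyDB] SET RECOVERY FULL;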

I appreciate anyone's help.

Keith
Question by:kwoznica
9 Comments
 
LVL 60

Expert Comment

by:chapmandew
ID: 24141592
1.  sure
2.  No

Remember, truncating the log is different from shrinking the file. If you're doing regular log backups (hopefully you are), then you can just use dbcc shrinkfile to shrink the log file.
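For example, a rough sketch of that approach ("MyDB", "MyDB_log" and the backup path are placeholders, not anything specific to this thread):

-- Back up the log so the inactive portion can be reused, then shrink the file.
BACKUP LOG [MyDB] TO DISK = N'X:\Backups\MyDB_log.trn';

USE [MyDB];
DBCC SHRINKFILE (N'MyDB_log', 1024);   -- target size in MB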
 
LVL 15

Expert Comment

by:rob_farley
ID: 24144374
The only way to clear up space inside a transaction log (also known as truncating) if the database is in the Full Recovery Model (as is required by mirroring) is to back it up.

So back it up regularly. Like... every few minutes. You don't need to keep the file if you're not interested in it (and clearly you're not if you're used to setting the database to Simple). If you do this, the transaction log shouldn't grow much, and you shouldn't need to shrink the log file at all (which ultimately just makes it have to grow more often - and growing log files is a painful operation).
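As a minimal sketch, a log backup of that kind might look like this (the database name and path are placeholders):

-- Run something like this from a SQL Agent job every few minutes.
-- WITH INIT overwrites the previous backup file, which is fine only if you
-- genuinely don't need the log backups for point-in-time recovery.
BACKUP LOG [MyDB]
   TO DISK = N'X:\Backups\MyDB_log.trn'
   WITH INIT;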

Honestly - you should be trying to avoid having to shrink log files. I think it's only very exceptional circumstances that should make you want to shrink a log file.

Rob
 

Author Comment

by:kwoznica
ID: 24147505
Rob,

I understand your point about backups. During the day we do run log backups. The issue I have is that on weekends we also run a reorganize on the database to keep it efficient and as compact as possible. It is during this maintenance task that the log file will fill up to 120GB. I also run tlog backups during the maintenance reorganize, but drive space has become an issue. This is the reason I will run dbcc shrinkfile.

In a circumstance such as this what would you recommend?

 
LVL 60

Expert Comment

by:chapmandew
ID: 24147546
Since you can't switch to Bulk Logged (because you're doing mirroring), you're better off doing what you're thinking about: a shrink after a log backup.
 
LVL 15

Expert Comment

by:rob_farley
ID: 24153667
You should remember that if you're mirroring, then that 120GB of log will have to be sent to the mirror as well.

Depending on how large your database is, you may prefer to stop mirroring, switch to Simple, do your reorg, then switch back to Full and re-establish your mirroring.
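Very roughly, that sequence might look like the following sketch (database name, paths, endpoint addresses and the index maintenance itself are all placeholders; the restores and the first SET PARTNER run on the mirror server):

-- On the principal: break the mirror and switch to Simple for the reorg.
ALTER DATABASE [MyDB] SET PARTNER OFF;
ALTER DATABASE [MyDB] SET RECOVERY SIMPLE;

-- ... run the index reorg / maintenance here ...

-- Switch back to Full and take the backups needed to re-seed the mirror.
ALTER DATABASE [MyDB] SET RECOVERY FULL;
BACKUP DATABASE [MyDB] TO DISK = N'X:\Backups\MyDB_full.bak' WITH INIT;
BACKUP LOG [MyDB] TO DISK = N'X:\Backups\MyDB_log.trn' WITH INIT;

-- On the mirror server: restore both backups WITH NORECOVERY.
RESTORE DATABASE [MyDB] FROM DISK = N'X:\Backups\MyDB_full.bak' WITH NORECOVERY;
RESTORE LOG [MyDB] FROM DISK = N'X:\Backups\MyDB_log.trn' WITH NORECOVERY;

-- Re-establish mirroring: run SET PARTNER on the mirror first, then on the principal.
ALTER DATABASE [MyDB] SET PARTNER = N'TCP://principal.example.com:5022';  -- on the mirror
ALTER DATABASE [MyDB] SET PARTNER = N'TCP://mirror.example.com:5022';     -- on the principal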

Or else ask yourself whether you really need to do a reorg. Are you checking for index fragmentation before rebuilding the indexes, for example?

Rob
 

Author Comment

by:kwoznica
ID: 24396550
Rob,

I appreciate your suggestions. I am going to look into scripting out a job that breaks the mirror, switches to Simple mode, runs the reorg, switches back to Full, and re-establishes the mirror.

I have to ask you one more question based on your last question. Should I check for index fragmentation and if so based on what statistical value should I kick off a reorg?

Thanks. The points are yours.

Keith
 
LVL 15

Accepted Solution

by:rob_farley (earned 500 total points)
ID: 24408866
Keith,

It's all going to depend on how you feel your performance is being hit. If you're largely doing Seeks on your index, then you will barely notice fragmentation, because each Seek will still take about 3 reads.

If you're scanning, then you'll be following quite a few more links than you would ordinarily need, and this is where the pain will come in.

Unfortunately, it's like asking how long a piece of string is. Some people feel fragmentation pain early; some people can have massive amounts of fragmentation and really not care.

Definitely check the level of fragmentation before reorganising an index. It's not worth the pain if you don't need it. But also make sure you've baselined your performance, so that you can work out what pain you're experiencing (before and after). See what happens (preferably on a test machine) if you ignore the fragmentation a bit longer, etc... but definitely monitor what the fragmentation is like, to try to get a feel for what you should do.
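For example, one common way to check is sys.dm_db_index_physical_stats (the 5% filter and 1000-page cutoff below are just the usual rules of thumb, not values from this thread):

-- List fragmented indexes in the current database, worst first.
SELECT
    OBJECT_NAME(ips.object_id)        AS table_name,
    i.name                            AS index_name,
    ips.avg_fragmentation_in_percent,
    ips.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
    ON i.object_id = ips.object_id
   AND i.index_id  = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 5
  AND ips.page_count > 1000            -- small indexes rarely matter
ORDER BY ips.avg_fragmentation_in_percent DESC;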

Also bear in mind that tables will have different uses. Some will bear much higher levels of fragmentation, while others will hurt you quite quickly. The better you understand these different levels, the better time you'll have managing it.

Rob
 

Author Closing Comment

by:kwoznica
ID: 31570110
Rob, I appreciate your time in making your response. It is people like you who help a young DBA like me make progress. Thanks.
 
LVL 15

Expert Comment

by:rob_farley
ID: 24478555
No problem - any time.

Rob
