Solved

SQL Server 2008 R2 Maintenance - Transaction Log Backups Failing

Posted on 2013-01-08
11
2,298 Views
Last Modified: 2013-01-23
I have a large DB file (146 GB) and an even larger transaction log file (219 GB) that is constantly growing.

I have scheduled transaction log backups to run every 2 hours via a DB maintenance plan in an effort to shrink the log file. However, it's not working, and the jobs fail with the following error:

Log Name:      Application
Source:        SQLSERVERAGENT
Date:          1/3/2013 6:00:07 AM
Event ID:      208
Task Category: Job Engine
Level:         Warning
Keywords:      Classic
User:          N/A
Computer:      SQL.internal.corp
Description:
SQL Server Scheduled Job 'DBMaintenance.Transaction Log' (0xA16724D22E27E042BD579A5E41365A12) - Status: Failed - Invoked on: 2013-01-03 06:00:00 - Message: The job failed.  The Job was invoked by Schedule 14 (DBMaintenance.Transaction Log).  The last step to run was step 1 (Transaction Log).


This error is generated every 2 hours. The job is set to back up the transaction logs of specific databases, which include the one in question. In the database selection I see a warning advising that databases with the simple recovery model will be excluded; I have verified that the recovery model of the database in question is set to Full.

I have 2 issues I need assistance with (I'm not very fluent in SQL, so bear with me):
-Need to determine why the transaction log backup is not completing
-Need to shrink the log/database files without any downtime

Thank you.
Question by:fireguy1125
11 Comments
 
LVL 3

Expert Comment

by:IKtech
Can you try running the transaction log backup on one database at a time, to see if you can narrow it down to a specific database/log file? Also, are you attempting to do anything else in the same maintenance plan that you use to back up the logs?
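A single-database test along those lines could look like the sketch below; the database name and file path are placeholders, not names from this environment.

```sql
-- Hypothetical one-off test: back up one database's log in isolation
-- to see whether the failure follows a specific database.
-- Replace [YourDatabase] and the path with your own values.
BACKUP LOG [YourDatabase]
TO DISK = N'G:\SQLBACKUP\YourDatabase_log_test.trn'
WITH INIT, COMPRESSION, STATS = 10;
```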
 
LVL 22

Expert Comment

by:Steve Wales
Under SQL Server Agent, expand the Jobs tree and then find the job, right click and View History.

Each job should have a plus sign to the left of it to expand the job.

Click on one of the steps and you should have more information on the error - or look in the LOG directory of your instance installation and have a look at the contents of SQLAGENT.OUT for more details of the error.
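If you prefer a query to the GUI, the same history is stored in msdb; a sketch (using the job name from the original post, adjust as needed):

```sql
-- Most recent history rows for the failing job, including the
-- full step message that View History shows in the GUI.
SELECT TOP (10)
       j.name,
       h.step_id,
       h.step_name,
       h.run_date,
       h.run_time,
       h.run_status,   -- 0 = failed, 1 = succeeded
       h.message
FROM msdb.dbo.sysjobhistory AS h
JOIN msdb.dbo.sysjobs AS j
  ON j.job_id = h.job_id
WHERE j.name = N'DBMaintenance.Transaction Log'
ORDER BY h.instance_id DESC;
```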

Once you get your backups working, have a read of a couple of Kimberly Tripp's excellent blog entries on just this topic, covering shrinking and then regrowing your log files to an appropriate size:

http://www.sqlskills.com/blogs/kimberly/8-steps-to-better-transaction-log-throughput/
http://www.sqlskills.com/blogs/kimberly/transaction-log-vlfs-too-many-or-too-few/
 
LVL 1

Author Comment

by:fireguy1125
As part of my maintenance plan I have:

My transaction log backup is set to run every 2 hours (about 12 times per day), including times when queries and real-time database access are occurring.

-Consistency checks every night at 10pm for all databases, including indexes
-Back up databases every night at 10pm: full backup for all databases, verify backup integrity, compress backups
-Update statistics on all databases, all existing statistics, full scan, weekly on Saturdays at 1am
-Rebuild indexes on all user databases the first Sunday of every month; reorganize pages with the default amount of free space
-Log and history cleanup: delete backup files, based on txt extension, older than 3 weeks, every Sunday at 6pm

**The last log-shrinking job that appears to have run was almost 2 years ago; the T-SQL statement in that job is:

USE [SQLDatabase]
GO
DBCC SHRINKFILE (N'Test DB Creation_log' , 0, TRUNCATEONLY)
GO

Is this something that I can/should run again? Can it be done at any time, or only after hours with limited database access? Also, how long would it take on a 219 GB transaction log file?

----------------------------------------------------------------------
Result of the Log File Job History:

Date            1/8/2013 4:00:01 PM
Log            Job History (DBMaintenance.Transaction Log)

Step ID            1
Server            SQLSERVER
Job Name            DBMaintenance.Transaction Log
Step Name            Transaction Log
Duration            00:00:13
Sql Severity            0
Sql Message ID            0
Operator Emailed            
Operator Net sent            
Operator Paged            
Retries Attempted            0

Message
Executed as user: INTERNAL.CORP\svc_sql. Microsoft (R) SQL Server Execute Package Utility  Version 10.50.1600.1 for 64-bit  Copyright (C) Microsoft Corporation 2010. All rights reserved.    Started:  4:00:01 PM  Progress: 2013-01-08 16:00:03.44     Source: {02BF0A94-27C5-4D0A-8383-4E6F525B27B7}      Executing query "DECLARE @Guid UNIQUEIDENTIFIER      EXECUTE msdb..sp...".: 100% complete  End Progress  Progress: 2013-01-08 16:00:09.78     Source: Back Up Database Task      Executing query "BACKUP LOG [BackupExec] TO  DISK = N'G:\SQLBACKUP\...".: 100% complete  End Progress  Progress: 2013-01-08 16:00:10.03     Source: Back Up Database Task      Executing query "BACKUP LOG [BEDB] TO  DISK = N'G:\SQLBACKUP\Transa...".: 100% complete  End Progress  Progress: 2013-01-08 16:00:10.31     Source: Back Up Database Task      Executing query "BACKUP LOG [DB2] TO  DISK = N'G:\SQLBACKUP\...".: 100% complete  End Progress  Progress: 2013-01-08 16:00:11.12     Source: Back Up Database Task      Executing query "BACKUP LOG [SQLDATABASE] TO  DISK = N'G:\SQLBACKUP\Tr...".: 100% complete  End Progress  Progress: 2013-01-08 16:00:11.49     Source: Back Up Database Task      Executing query "BACKUP LOG [ReportServer] TO  DISK = N'G:\SQLBACKU...".: 100% complete  End Progress  Error: 2013-01-08 16:00:12.40     Code: 0xC002F210     Source: Maintenance Cleanup Task Execute SQL Task     Description: Executing the query "EXECUTE master.dbo.xp_delete_file 0,N'E:\Program F..." failed with the following error: "xp_delete_file() returned error 2, 'The system cannot find the file specified.'". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly.  End Error  DTExec: The package execution returned DTSER_FAILURE (1).  Started:  4:00:01 PM  Finished: 4:00:12 PM  Elapsed:  11.482 seconds.  The package execution failed.  The step failed.
 
LVL 22

Expert Comment

by:Steve Wales
Backups seem to be working, looks like it's the maintenance clean up task that's failing:

Executing the query "EXECUTE master.dbo.xp_delete_file 0,N'E:\Program F..." failed with the following error: "xp_delete_file() returned error 2, 'The system cannot find the file specified.'"

Check the Maintenance Cleanup Task - are you telling it to look in the right location?  Does that location exist?

You're backing everything up to g:\SQLBACKUP but trying to delete files from e:\program ... ?
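For reference, the call the cleanup task generates looks like the sketch below. Note that xp_delete_file is undocumented, and the folder, extension, and cutoff date here are illustrative values based on this thread, not the exact ones from the failing task.

```sql
-- Undocumented xp_delete_file, as generated by a Maintenance Cleanup Task.
-- Arguments: file type (0 = backup file), folder, extension,
-- cutoff date, and subfolder flag (1 = include subfolders).
-- The folder must be the location the backups are actually written to
-- (G:\SQLBACKUP\ in this thread, not the old E:\ path).
EXECUTE master.dbo.xp_delete_file
    0, N'G:\SQLBACKUP\', N'trn', N'2012-12-18T00:00:00', 1;
```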
 
LVL 1

Author Comment

by:fireguy1125
OK, great, that did it. I saw a maintenance cleanup task below the transaction log backup job that still contained the old path from before the backups were moved from E: to G:.

Now onto the other issue... how is the log file so big if the transaction log backups are successful?

This is the TSQL for those transaction log backups:

BACKUP LOG [INTERNAL] TO  DISK = N'G:\SQLBACKUP\Transaction Log backups\INTERNAL_backup_2013_01_08_164427_2599474.trn' WITH  RETAINDAYS = 7, NOFORMAT, NOINIT,  NAME = N'INTERNAL_backup_2013_01_08_164427_2599474', SKIP, REWIND, NOUNLOAD, COMPRESSION,  STATS = 10
GO
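Since the log backups succeed but the file stays large, a quick diagnostic (a sketch using standard catalog views, applicable to SQL Server 2005 and later) is to ask the server what, if anything, is still pinning the log:

```sql
-- Why can't the log be truncated? LOG_BACKUP means a log backup is
-- needed; NOTHING means the space is reclaimable.
SELECT name, recovery_model_desc, log_reuse_wait_desc
FROM sys.databases;

-- How much of each log file is actually in use, as a percentage.
DBCC SQLPERF (LOGSPACE);
```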

 
LVL 22

Expert Comment

by:Steve Wales
OK see these two links as mentioned above:

http://www.sqlskills.com/blogs/kimberly/8-steps-to-better-transaction-log-throughput/
http://www.sqlskills.com/blogs/kimberly/transaction-log-vlfs-too-many-or-too-few/

The transaction log will grow and grow forever unless a transaction log backup is done.

At some stage in the past you've either had a really large transaction or had a while without transaction log backups.

Now that you're doing log backups, there's space in the logfile that can be reclaimed.

With that in mind, have a read of the two blog posts and let us know if you need anything else.
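To see the internal state those posts discuss, you can list the virtual log files (VLFs) directly; a sketch, using the database name quoted earlier in the thread:

```sql
USE [SQLDatabase];  -- database name from earlier in the thread
GO
-- One row per virtual log file (VLF). Status = 2 means the VLF is
-- still active; Status = 0 means it is free and reclaimable by a
-- shrink. A very high row count indicates too many VLFs.
DBCC LOGINFO;
GO
```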
 
LVL 1

Author Comment

by:fireguy1125
So essentially, after the transaction log backup completes, I'll run a

DBCC SHRINKFILE(transactionloglogicalfilename, TRUNCATEONLY)

command and that should shrink the size. How long does this process normally take? I intend to perform it tonight after our full database backup completes. Is there a timeframe estimate you can provide: minutes, hours?

Thank you again for all your assistance!
 
LVL 22

Accepted Solution

by:
Steve Wales earned 500 total points
Less than a minute, probably seconds - I've always seen a fast turnaround when I've had to do it.

You will want to perform a log backup after the full backup, it is the log backup that marks the space as reusable.

Then you will want to grow the transaction log back to a size that fits your workload so it doesn't auto-grow (see the part of the blog entry talking about VLFs).

Do this in a period of low activity.
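Put together, the sequence could look like the sketch below. The logical file name and database name are taken from the old job quoted earlier in the thread; the 8000 MB target size is purely illustrative and should be replaced with a size that fits your workload, per the linked posts.

```sql
USE [SQLDatabase];  -- database name from the thread
GO
-- 1. After the full backup and a log backup, shrink the log file.
--    The target size (0) is ignored when TRUNCATEONLY is specified.
DBCC SHRINKFILE (N'Test DB Creation_log', 0, TRUNCATEONLY);
GO
-- 2. Grow the log back in one explicit step so it isn't rebuilt by
--    many small auto-growths (which creates too many VLFs).
--    8000 MB is an example figure, not a recommendation.
ALTER DATABASE [SQLDatabase]
MODIFY FILE (NAME = N'Test DB Creation_log', SIZE = 8000MB);
GO
```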
 
LVL 75

Expert Comment

by:Anthony Perkins
So essentially, after the transaction log backup completes, i'll run a

DBCC SHRINKFILE(transactionloglogicalfilename, TRUNCATEONLY)

If you value performance and data integrity, please do not do this; it is a very bad idea.
 
LVL 22

Expert Comment

by:Steve Wales
acperkins:  After a full backup and a log backup (which marks the transaction log as truncated), why wouldn't you be able to do this?

I always take another full backup after doing this, if I'm ever faced with a runaway transaction log.

This is the advice given in the blog posts I mentioned (and several others) as well: the steps involved in shrinking a transaction log file and then growing it again afterwards, both to make the log file smaller and to deal with too many VLFs.
 
LVL 75

Expert Comment

by:Anthony Perkins
After a full backup and a log backup (which marks the transaction log as truncated), why wouldn't you be able to do this?
Actually, I am sorry; I may have taken the comment out of context. I thought the author was suggesting adding this to their scheduled maintenance job, rather than as a one-off, emergency-only measure.

This is the advice given in the blog posts I mentioned as well
Exactly, and the part the author did not quote, most likely overlooked (even though you did mention it), and in my opinion the most important part, is that the database has to be sized appropriately after shrinking it (Kimberly's step 3).
