Solved

Shrinking transaction log

Posted on 2004-10-09
771 Views
Last Modified: 2008-03-17
I am trying to create a job that makes a complete backup of a database and then shrinks the transaction log file.  Here are the T-SQL statements:


DECLARE @LogicalName AS VARCHAR(100),  
      @PhysicalName VARCHAR(100),
      @Name VARCHAR(100),
      @Description VARCHAR(100)

SET @LogicalName = 'TAG_Backup_Week_of_' + DATENAME(MM, GETDATE()) + '_' + CAST(DATEPART(DD, GetDate()) AS VARCHAR(2)) + '_' + CAST(DatePart(YYYY, GetDate()) AS VARCHAR(4))
SET @PhysicalName = 'E:\Backups\TAG\TAG_Backup_Week_of_' + DATENAME(MM, GETDATE()) + '_' + CAST(DATEPART(DD, GetDate()) AS VARCHAR(2)) + '_' + CAST(DatePart(YYYY, GetDate()) AS VARCHAR(4)) + '.bak'
SET @Name = 'TAG Complete'
SET @Description = 'Complete weekly backup of TAG database'

EXEC sp_addumpdevice 'disk', @LogicalName, @PhysicalName

BACKUP DATABASE TAG
   TO @LogicalName
   WITH
      NAME = @Name,
      DESCRIPTION = @Description,
      NOINIT, --Do not overwrite previous backup sets
      RETAINDAYS = 30 --Number days must elapse before backup can be overwritten

DBCC SHRINKFILE (TAG_Log, 25)  -- Shrink the transaction log file to 25 MB if possible

GO

A complete backup of the database will automatically truncate the log file, right? Or do I need to do that manually with:

BACKUP LOG TAG
   WITH TRUNCATE_ONLY


Currently the t-log size is 50 MB.
So why is the transaction log not releasing the extra space back to the OS?
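Before shrinking, it can help to confirm how much of the 50 MB log is actually in use. A minimal check, using the standard SQL Server 2000 commands (the database name TAG is taken from the script above):

```sql
-- Show, for every database, the log size and the percentage currently in use.
DBCC SQLPERF(LOGSPACE)
GO

-- List the data and log files for the TAG database, including the
-- logical file name needed by DBCC SHRINKFILE (here expected to be TAG_Log).
USE TAG
GO
EXEC sp_helpfile
GO
```

If LOGSPACE shows a high "Log Space Used (%)", the active portion of the log is at the end of the file and SHRINKFILE cannot release the space yet.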

Question by:km1039
5 Comments
 
LVL 6

Expert Comment

by:boblah
ID: 12267569
Hi km1039,

Shrinking the transaction log is complicated. Here's a script that normally does the job for me. (This does none of the other stuff you're doing; you'll need to add that in.)

   SET NOCOUNT ON
   DECLARE @LogicalFileName sysname,
           @MaxMinutes INT,
           @NewSize INT

   -- *** MAKE SURE TO CHANGE THE NEXT 4 LINES WITH YOUR CRITERIA. ***
   USE     [goran_V2_CCDB]              -- This is the name of the database
                                  -- for which the log will be shrunk.
   SELECT  @LogicalFileName = 'goran_aroma_V2_xfm_Log',  -- Use sp_helpfile to
      -- identify the logical file
      -- name that you want to shrink.
           @MaxMinutes = 10,      -- Limit on time allowed to wrap log.
           @NewSize    = 30       -- in MB

   -- Setup / initialize
   DECLARE @OriginalSize int
   SELECT @OriginalSize = size -- in 8K pages
     FROM sysfiles
     WHERE name = @LogicalFileName
   SELECT 'Original Size of ' + db_name() + ' LOG is ' +
           CONVERT(VARCHAR(30),@OriginalSize) + ' 8K pages or ' +
           CONVERT(VARCHAR(30),(@OriginalSize*8/1024)) + 'MB'
     FROM sysfiles
     WHERE name = @LogicalFileName

   CREATE TABLE DummyTrans
     (DummyColumn char (8000) not null)

   -- Wrap log and truncate it.
   DECLARE @Counter   INT,
           @StartTime DATETIME,
           @TruncLog  VARCHAR(255)
   SELECT  @StartTime = GETDATE(),
           @TruncLog = 'BACKUP LOG ['+ db_name() + '] WITH TRUNCATE_ONLY'
   -- Try an initial shrink.
   DBCC SHRINKFILE (@LogicalFileName, @NewSize)

   EXEC (@TruncLog)

   -- Wrap the log if necessary.
   WHILE     @MaxMinutes > DATEDIFF (mi, @StartTime, GETDATE()) -- time has not expired
         AND @OriginalSize = (SELECT size FROM sysfiles WHERE name = @LogicalFileName)  -- the log has not shrunk    
         AND (@OriginalSize * 8 /1024) > @NewSize  -- The value passed in for new size is smaller than the current size.
     BEGIN -- Outer loop.
       SELECT @Counter = 0
       WHILE  ((@Counter < @OriginalSize / 16) AND (@Counter < 50000))
         BEGIN -- update
           INSERT DummyTrans VALUES ('Fill Log')  -- Because it is a char field it inserts 8000 bytes.
           DELETE DummyTrans
           SELECT @Counter = @Counter + 1
         END   -- update
       EXEC (@TruncLog)  -- See if a trunc of the log shrinks it.
     END   -- outer loop
   SELECT 'Final Size of ' + db_name() + ' LOG is ' +
           CONVERT(VARCHAR(30),size) + ' 8K pages or ' +
           CONVERT(VARCHAR(30),(size*8/1024)) + 'MB'
     FROM sysfiles
     WHERE name = @LogicalFileName
   DROP TABLE DummyTrans
   PRINT '*** Perform a full database backup ***'
   SET NOCOUNT OFF

Cheers!

Author Comment

by:km1039
ID: 12267638
Thanks.  I'll give it a try.
LVL 34

Expert Comment

by:arbert
ID: 12268870
"Shrinking the transaction log is complicated."

Not really. A lot of it depends on your recovery model, but you can simply truncate the log and then shrink it:

backup log yourdatabasename with truncate_only
go
dbcc shrinkfile(yourdatabasename_log, 0)
go

Author Comment

by:km1039
ID: 12269129
So executing a complete backup will NOT remove inactive entries from the transaction log, and I need to do it manually?

By the way, this is a production database and the recovery model is Full.

thanks arbert
LVL 34

Accepted Solution

by:
arbert earned 250 total points
ID: 12269138
No, the entries are marked as inactive, but you still have to shrink the file (DBCC SHRINKFILE) to return the "free" space to the OS...
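Putting the pieces together, a hedged sketch of how the truncate-and-shrink steps could be folded into the original weekly job (this assumes the TAG database and TAG_Log logical file name from the question, and SQL Server 2000 syntax; it is not the poster's final script):

```sql
-- Sketch only: full backup, then truncate and shrink the log.
-- Assumes database TAG with logical log file TAG_Log (from the question)
-- and a dump device already registered via sp_addumpdevice.

BACKUP DATABASE TAG
   TO TAG_Backup_Device          -- hypothetical device name for illustration
   WITH NOINIT, RETAINDAYS = 30
GO

-- Mark the inactive portion of the log as reusable.
-- NOTE: TRUNCATE_ONLY discards log records and breaks the log backup chain,
-- so on a Full-recovery production database take a fresh full backup afterwards.
BACKUP LOG TAG WITH TRUNCATE_ONLY
GO

-- Release the freed space back to the OS, targeting 25 MB.
DBCC SHRINKFILE (TAG_Log, 25)
GO
```

The ordering matters: the log must be truncated before DBCC SHRINKFILE can move the end-of-file boundary past the formerly active log records.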
