Solved

SQL 2005/2008: Find how .mdf and .ldf files have grown/shrunk

Posted on 2010-11-21
8
761 Views
Last Modified: 2012-05-10
Hi,

Is there a way to find out how much the files of a database, both log and data, have been growing? (i.e. a table that shows the dates/times when a log or data file had to expand, and, if an autoshrink was run, how much space that cleared up.)

We have very little space on our log file volume and it gets dangerously near full on a regular basis. When we are down to 5-20 MB, our database manager runs a shrink on all log files and we get a good 5-6 GB of space back.

From what I understand it's best to try and keep the files (.ldf and .mdf) at a static size to improve performance, so my theory was that if I can find the .ldf that is growing a lot, I can (a) resize it to a larger size (and know what size to extend it to) and (b) if needed, move it to a different location while we wait for more storage.
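For reference, the kind of resize/move I have in mind would look something like this (MyDB and MyDB_Log are placeholder names; use sp_helpfile to find the real logical file name):

```sql
-- Pre-size the log to a fixed size with a sensible growth increment,
-- so it stops autogrowing in small steps (placeholder names).
ALTER DATABASE MyDB
MODIFY FILE (NAME = MyDB_Log, SIZE = 8192MB, FILEGROWTH = 512MB);

-- Move the file to another volume: point SQL Server at the new path,
-- take the database offline, copy the physical file, bring it back online.
ALTER DATABASE MyDB
MODIFY FILE (NAME = MyDB_Log, FILENAME = N'E:\SQLLogs\MyDB_log.ldf');
ALTER DATABASE MyDB SET OFFLINE;
-- copy the .ldf to E:\SQLLogs\ here
ALTER DATABASE MyDB SET ONLINE;
```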

Hope this makes sense and it is possible.

Any help or suggestions gratefully received.

Question by:stebennettsjb
8 Comments
 
LVL 5

Expert Comment

by:SinghAmandeep
ID: 34186527
Try this

--INF: How to Shrink the SQL Server 7.0 Transaction Log
-- SQL7  http://support.microsoft.com/support/kb/articles/q256/6/50.asp?id=256650&SD
-- SQL7  http://www.support.microsoft.com/kb/256650  
-- SQL2000 http://support.microsoft.com/kb/272318/en-us
-- SQL2005 http://support.microsoft.com/kb/907511/en-us
-- select db_name()
-- select * from sysfiles

-- THIS SCRIPT IS NOT INTENDED FOR SCHEDULED USAGE !! READ BOL and the urls !!
-- Note: BACKUP LOG ... WITH TRUNCATE_ONLY was removed in SQL Server 2008;
-- this script only works as-is on SQL Server 2000/2005.

SET NOCOUNT ON
DECLARE @LogicalFileName sysname,
        @MaxMinutes INT,
        @NewSize INT


-- *** MAKE SURE TO CHANGE THE NEXT 3 LINES WITH YOUR CRITERIA. ***

SELECT  @LogicalFileName = 'Airtel_Log',  -- Use sp_helpfile to identify the logical file name that you want to shrink.
        @MaxMinutes = 10,               -- Limit on time allowed to wrap log.
        @NewSize = 305                  -- in MB

-- Setup / initialize
DECLARE @OriginalSize int
SELECT @OriginalSize = size -- in 8K pages
  FROM sysfiles
  WHERE name = @LogicalFileName
SELECT 'Original Size of ' + db_name() + ' LOG is ' +
        CONVERT(VARCHAR(30),@OriginalSize) + ' 8K pages or ' +
        CONVERT(VARCHAR(30),(@OriginalSize*8/1024)) + 'MB'
  FROM sysfiles
  WHERE name = @LogicalFileName
CREATE TABLE DummyTrans
  (DummyColumn char (8000) not null)


-- Wrap log and truncate it.
DECLARE @Counter   INT,
        @StartTime DATETIME,
        @TruncLog  VARCHAR(255)
SELECT  @StartTime = GETDATE(),
        @TruncLog = 'BACKUP LOG ' + db_name() + ' WITH TRUNCATE_ONLY'
-- Try an initial shrink.
DBCC SHRINKFILE (@LogicalFileName, @NewSize)
EXEC (@TruncLog)
-- Wrap the log if necessary.
WHILE     @MaxMinutes > DATEDIFF (mi, @StartTime, GETDATE()) -- time has not expired
      AND @OriginalSize = (SELECT size FROM sysfiles WHERE name = @LogicalFileName)  -- the log has not shrunk    
      AND (@OriginalSize * 8 /1024) > @NewSize  -- The value passed in for new size is smaller than the current size.
  BEGIN -- Outer loop.
    SELECT @Counter = 0
    WHILE  ((@Counter < @OriginalSize / 16) AND (@Counter < 50000))
      BEGIN -- update
        INSERT DummyTrans VALUES ('Fill Log')  -- Because it is a char field it inserts 8000 bytes.
        DELETE DummyTrans
        SELECT @Counter = @Counter + 1
      END   -- update
    EXEC (@TruncLog)  -- See if a trunc of the log shrinks it.
  END   -- outer loop
SELECT 'Final Size of ' + db_name() + ' LOG is ' +
        CONVERT(VARCHAR(30),size) + ' 8K pages or ' +
        CONVERT(VARCHAR(30),(size*8/1024)) + 'MB'
  FROM sysfiles
  WHERE name = @LogicalFileName
DROP TABLE DummyTrans
PRINT '*** Perform a full database backup ***'
SET NOCOUNT OFF
 

Author Comment

by:stebennettsjb
ID: 34186552
Hi guys, thanks for the replies.

I'm trying to find out how much the files are growing, i.e. run a script that shows a table with all the .ldf files, what size they are, and the dates on which they autogrew.

It's basically monitoring I'm looking to do, not actually shrinking the files; I'm comfortable with how to do that.

Thanks, and hope that makes sense

s
 
LVL 6

Accepted Solution

by:
subhashpunia earned 500 total points
ID: 34187174
In SSMS, right-click the database > Reports > Standard Reports > Disk Usage; there you will see the data and log file growth status.
 

Author Comment

by:stebennettsjb
ID: 34187455
Hi,

This sounds good, but when I go to Reports I get Custom Reports as the only option, no Standard.

Also, would this give me access to the T-SQL behind the report? I was hoping to hook the data up to SSRS.

Thanks for all the help

S
 
LVL 9

Expert Comment

by:sarabhai
ID: 34187456
Please check the sys.master_files view.
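A minimal sketch of querying it (sizes for every database in one go; note this shows current sizes only, not growth history):

```sql
-- Current size of every data and log file across all databases.
SELECT DB_NAME(database_id)  AS DatabaseName,
       name                  AS LogicalName,
       physical_name,
       type_desc,                          -- ROWS (.mdf/.ndf) or LOG (.ldf)
       size * 8 / 1024       AS SizeMB     -- size is in 8 KB pages
FROM sys.master_files
ORDER BY DatabaseName, type_desc;
```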
 

Author Comment

by:stebennettsjb
ID: 34187499
Sorry, me being simple...

I right-clicked on the database folder instead of the server. I can now see Standard Reports, but there is no Disk Usage report.

Thanks for the help

s
 

Author Comment

by:stebennettsjb
ID: 34204279
Does no one know how to get this report via T-SQL, and for all the databases instead of doing them one by one?

Thanks for the help so far :)

S
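For anyone finding this later: the Disk Usage report pulls its autogrow/autoshrink history from the default trace, so a query along these lines (a sketch, assuming the default trace is enabled and the events have not yet rolled out of its files) should return the same data for all databases via T-SQL:

```sql
-- Autogrow/autoshrink events from the default trace (the same source the
-- SSMS Disk Usage report uses). EventClass 92/93 = data/log file auto grow,
-- 94/95 = data/log file auto shrink.
DECLARE @TracePath NVARCHAR(260);
SELECT @TracePath = path FROM sys.traces WHERE is_default = 1;

SELECT te.name                    AS EventName,
       tg.DatabaseName,
       tg.FileName,
       tg.StartTime,
       tg.IntegerData * 8 / 1024  AS SizeChangeMB  -- IntegerData is in 8 KB pages
FROM sys.fn_trace_gettable(@TracePath, DEFAULT) AS tg
JOIN sys.trace_events AS te ON tg.EventClass = te.trace_event_id
WHERE tg.EventClass IN (92, 93, 94, 95)
ORDER BY tg.StartTime DESC;
```

The default trace only keeps a small rolling window of files, so older growth events may no longer be available.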
