
Solved

Batch job to shrink a data file until it reaches a certain size

Posted on 2014-04-23
Medium Priority
66 Views
Last Modified: 2015-07-15
I have a server with a large database that we'd like to shrink to get some of the space back. I know I can use 'DBCC SHRINKFILE (DBNAME, SIZE)', but because of the size of the file, and because it's a production database, we'd like to do it in chunks. Is there a way to build it into a batch job that shrinks in 10 GB chunks until the file reaches the size we'd like?
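[A note in passing: DBCC SHRINKFILE actually takes the logical file name (or file id), not the database name, and the target size is expressed in MB. A minimal sketch, where the database and file names are placeholders, not anything from this thread:

```sql
USE YourDatabase;  -- placeholder database name
GO
-- List logical file names and current sizes (size is in 8 KB pages)
SELECT name, size * 8 / 1024 AS size_mb
FROM sys.database_files;

-- Shrink the data file down to 512000 MB (500 GB)
DBCC SHRINKFILE (N'YourDataFile', 512000);
```
]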
Question by:richa1960
7 Comments
 
LVL 30

Accepted Solution

by:
Rich Weissler earned 1200 total points
ID: 40019800
I assume you already know all the reasons why you shouldn't shrink database files.  Many folks, far smarter than I, have invested a huge number of bits explaining why it's a very bad idea.

I also assume you're really working with the datafile, not the transaction log.

I don't believe cutting off chunks of your data file at a time will work quite the way you hope.  If you HAVE to perform the shrink operation, do it just once during a maintenance window, and be sure to leave enough time to fix the index fragmentation that shrinking causes.  (It also means you'll need the NEW size of the database to be large enough to complete the reindex/reorganization operations.)
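[The post-shrink fragmentation check mentioned above can be sketched like this; the index name in the commented rebuild is a placeholder, and the 30% threshold is a common rule of thumb, not a fixed rule:

```sql
-- Find heavily fragmented indexes in the current database after a shrink
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name                     AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id
 AND i.index_id  = ips.index_id
WHERE ips.avg_fragmentation_in_percent > 30   -- common rebuild threshold
ORDER BY ips.avg_fragmentation_in_percent DESC;

-- Then, per affected index (placeholder names):
-- ALTER INDEX [IX_Example] ON [dbo].[ExampleTable] REBUILD;     -- heavy fragmentation
-- ALTER INDEX [IX_Example] ON [dbo].[ExampleTable] REORGANIZE;  -- lighter fragmentation
```
]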

Author Comment

by:richa1960
ID: 40020423
Yes, we're only dealing with the data file. The problem with doing it in a maintenance window is that the file is so large the shrink wouldn't complete in time for customers to continue accessing their data, so we'd like to do it in batches over a longer period to minimize the impact. We've accounted for the re-index; it's the primary process we're dealing with.
LVL 35

Expert Comment

by:David Todd
ID: 40021292
Hi,

This should do it.

Be sure to set the file id correctly, set the shrink limit, and select the correct database.

HTH
  David

declare @fileid int
set @fileid = 1

select size * 8 * 1024
from dbo.sysfiles
where
	fileid = @fileid
;

declare @size bigint

while 1 = 1 begin
	-- size is 8k pages
	-- @size is bytes
	select @size = size * 8 * 1024
	from dbo.sysfiles
	where
		fileid = @fileid
	;

	-- 100 is the limit (in GB) to shrink to
	if @size / 1024 / 1024 / 1024.0 < 100
		break
	;

	-- @targetsize is size less 10GB expressed in MB
	declare @targetsize int
	set @targetsize = ( @size / 1024.0 / 1024.0 / 1024.0 - 10 ) * 1024

	dbcc shrinkfile( @fileid, @targetsize )

end


Author Comment

by:richa1960
ID: 40054792
I'll be testing the script this weekend on a dev system we have setup.

Author Comment

by:richa1960
ID: 40067360
Receiving the following error:

Msg 8115, Level 16, State 2, Line 5
Arithmetic overflow error converting expression to data type int.
Msg 8115, Level 16, State 2, Line 16
Arithmetic overflow error converting expression to data type int.
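[For context on the error: in dbo.sysfiles, size is an int counting 8 KB pages, so size * 8 * 1024 is evaluated entirely in int arithmetic and overflows for any file larger than 2 GB (the int maximum is 2,147,483,647). Promoting one operand to bigint makes the whole expression bigint:

```sql
-- int arithmetic: overflows once the byte count passes 2 GB
SELECT 262145 * 8 * 1024;                  -- Msg 8115: arithmetic overflow (int)

-- bigint arithmetic: one bigint operand promotes the whole expression
SELECT 262145 * 8 * CAST(1024 AS bigint);  -- 2147491840
```
]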
LVL 35

Expert Comment

by:David Todd
ID: 40068689
Hi,

I've added some explicit casts to bigint that should take care of the overflows.

I've added checks to make sure that each subsequent iteration through the loop isn't taking too long.

HTH
  David

declare @fileid int
set @fileid = 1

select size * 8 * cast( 1024 as bigint )
from dbo.sysfiles
where
	fileid = @fileid
;

declare @size bigint
declare @LoopStart datetime
declare @CurrentDuration  int
declare @PreviousDuration int

while 1 = 1 begin
	select @LoopStart = getdate()

	-- if each iteration is getting longer, then back off
	if @CurrentDuration > 2 * @PreviousDuration begin
		print 'Starting to take too long. Breaking ...'
		break
	end
	;

	-- size is 8k pages
	-- @size is bytes
	select @size = size * 8 * cast( 1024 as bigint )
	from dbo.sysfiles
	where
		fileid = @fileid
	;

	-- 100 is the limit (in GB) to shrink to
	if @size / 1024 / 1024 / 1024.0 < 100 begin
		print 'At desired limit. Breaking ...'
		break
	end
	;

	-- @targetsize is size less 10GB expressed in MB
	declare @targetsize int
	set @targetsize = ( @size / 1024.0 / 1024.0 / 1024.0 - 10 ) * 1024

	print 'shrinking to ' + convert( varchar( max ), @targetsize )
	dbcc shrinkfile( @fileid, @targetsize )

	select @PreviousDuration = @CurrentDuration
	select @CurrentDuration = datediff( ms, @LoopStart, getdate())
end
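[A side note on the script above: dbo.sysfiles is a deprecated compatibility view. On SQL Server 2005 and later the same per-database file information is available from sys.database_files, where the column is file_id rather than fileid. A sketch using the same file id 1 as the script:

```sql
-- Modern equivalent of the dbo.sysfiles lookup; size is still in 8 KB pages
SELECT file_id, name, type_desc,
       CAST(size AS bigint) * 8 * 1024 AS size_bytes
FROM sys.database_files
WHERE file_id = 1;
```
]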


Author Comment

by:richa1960
ID: 40117578
Sorry, I've been out dealing with a personal matter. The script will be run in production this weekend. Thanks.
Question has a verified solution.
