Solved

Database Archive

Posted on 2003-11-04
10
431 Views
Last Modified: 2006-11-17
I want to find out the best possible option for database archiving.

Archiving means appending records to an archive database and then deleting the archived records from the main database. I have the main database and the archive database on the same SQL Server. The problem is the database size: the main database could be 40 GB or more. I have implemented this using INSERT INTO with a SELECT from the main database, and then deleting the selected records from the main database. This works, but the transaction log grows substantially, and the insert and delete need to be part of one transaction.

What could be a better approach to handle this?
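For reference, the single-transaction approach described above can be sketched like this (the database, table, and column names are hypothetical stand-ins):

```sql
-- Single-transaction archive: simple and atomic, but every moved row is
-- logged before the commit, so the transaction log can grow to roughly
-- the size of the data being moved.
BEGIN TRAN
    INSERT INTO ArchiveDB.dbo.Orders
    SELECT * FROM MainDB.dbo.Orders
    WHERE OrderDate < '2003-01-01'

    DELETE FROM MainDB.dbo.Orders
    WHERE OrderDate < '2003-01-01'
COMMIT TRAN
```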
 
Question by:Milind00
10 Comments
 
LVL 5

Expert Comment

by:russellshome
ID: 9685339
Change the 5 below to whatever batch size keeps the transaction log from growing too large while still being efficient.

declare @rows int
select @rows = count(au_id) from pubs..authors

while @rows > 0
begin
      begin tran
            insert into tempdb..authors
            select top 5 * from pubs..authors
            order by au_id

            delete
            from pubs..authors
            where au_id in (
                  select top 5 au_id
                  from tempdb..authors
                  order by au_id
            )

      commit tran
      select @rows = count(au_id) from pubs..authors
end
 
LVL 5

Expert Comment

by:russellshome
ID: 9685346
Slight change: the DELETE's subquery should select from pubs..authors, not tempdb..authors:

declare @rows int
select @rows = count(au_id) from pubs..authors

while @rows > 0
begin
     begin tran
          insert into tempdb..authors
          select top 5 * from pubs..authors
          order by au_id

          delete
          from pubs..authors
          where au_id in (
               select top 5 au_id
               from pubs..authors
               order by au_id
          )

     commit tran
     select @rows = count(au_id) from pubs..authors
end
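On later SQL Server versions (2005 and up, so not available at the time of this thread), the insert/delete pair can be collapsed into a single atomic statement with the OUTPUT clause; a sketch using the same pubs..authors example:

```sql
-- Moves up to 5 rows per iteration. DELETE ... OUTPUT is a single
-- atomic statement, so no explicit transaction around the pair is
-- needed, and only each small batch is logged at a time.
WHILE 1 = 1
BEGIN
    DELETE TOP (5) FROM pubs..authors
    OUTPUT deleted.* INTO tempdb..authors

    IF @@ROWCOUNT = 0 BREAK
END
```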
 
LVL 4

Author Comment

by:Milind00
ID: 9685574
russellshome ... with your query, do I need to use DBCC SHRINKFILE to shrink the database file and the LDF file?
 
LVL 4

Author Comment

by:Milind00
ID: 9685593
russellshome...

Also, how about using BULK INSERT for the above task? Which would be more efficient: BULK INSERT or the way you suggested?
 
LVL 1

Expert Comment

by:RadimHampel
ID: 9686998
When you remove all records in one huge transaction, the log will contain all the data from your table (if that is the only table in your database, up to 40 GB). Even if you are using the simple recovery model, the log file will grow quickly. The solution presented by russellshome is good because it isn't done in one transaction. Afterwards you should shrink the log file; I don't advise shrinking the data file, because (as I expect) it will grow again.
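A minimal sketch of the log maintenance described above (the database and logical file names are hypothetical; under the full recovery model the log must be backed up first so its inactive portion can be reused):

```sql
-- Full recovery model: back up the log so the space becomes reusable.
BACKUP LOG MainDB TO DISK = 'D:\Backup\MainDB_log.bak'

-- Then shrink the log file back down to a target size in MB.
-- The logical file name can be found with: EXEC sp_helpfile
DBCC SHRINKFILE (MainDB_log, 500)
```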
 
LVL 22

Expert Comment

by:CJ_S
ID: 9687094
A TRUNCATE TABLE statement is only minimally logged; might that help?
 
LVL 2

Expert Comment

by:sedmans
ID: 9687155
I am not a DTS expert, but this sounds like something suited to it, and you might be able to do bulk inserts that way.
 
LVL 34

Expert Comment

by:arbert
ID: 9689063
Agree with sedmans; look at using the DTS bulk insert task. You can also specify the batch size to keep the number of open transactions to a minimum...

Brett
 
LVL 5

Accepted Solution

by:
russellshome earned 500 total points
ID: 9699997
While RadimHampel says "it isn't done in one transaction", the point is that my method wraps each insert/delete pair in its own small transaction.

> do I need to use DBCC shrinkfile to shrink the database file and LDF file?

It depends on the recovery model:

Simple: no

Full: yes

>Also how about using bulk insert for the above task. What could be efficent way BULK Insert or the way you suggested?

BULK insert will be more efficient but less robust.

There are only two ways to ensure that a row is deleted for each row inserted: your first attempt, which puts all insert/delete pairs in one transaction (and needs lots of log file space), or my suggestion, which puts a smaller number of insert/delete pairs into each transaction.
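For the BULK INSERT route discussed above, one common pattern is to export the rows with the bcp utility and re-load them into the archive; a sketch with hypothetical server, path, and table names:

```sql
-- Export the rows to be archived (run from a command prompt):
-- bcp "SELECT * FROM MainDB.dbo.Orders WHERE OrderDate < '2003-01-01'"
--     queryout D:\archive\orders.dat -n -S myserver -T

-- Load them into the archive database. This is minimally logged under
-- the simple or bulk-logged recovery model when TABLOCK is used.
BULK INSERT ArchiveDB.dbo.Orders
FROM 'D:\archive\orders.dat'
WITH (DATAFILETYPE = 'native', TABLOCK)

-- Then delete from the main table in batches, after verifying the
-- archived row counts match: the delete is not tied to the load in a
-- transaction, which is why this is faster but less robust.
```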
 
LVL 4

Author Comment

by:Milind00
ID: 9706192
Thanks a lot, experts. I think I got the answer.
