Solved

Database Archive

Posted on 2003-11-04
424 Views
Last Modified: 2006-11-17
I want to find out the best possible option for database archiving.

Archiving means appending the records to the archive database and then deleting the archived records from the main database. I have the main database and the archive database on the same SQL Server. The problem is the database size: the main database could be 40 GB or more. I have implemented this using INSERT INTO ... SELECT from the main database and then deleting the selected records from the main database. This works, but the transaction log grows substantially, and the insert and delete need to be part of one transaction.

What could be a better approach to handle this?
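For context, the single-transaction approach described above looks roughly like this (the Orders table, database names, and cutoff variable are hypothetical, as a sketch only):

```sql
-- Hypothetical sketch of the current single-transaction archive.
-- The log must retain every inserted and deleted row until COMMIT,
-- which is what makes it grow with a 40 GB table.
DECLARE @CutoffDate datetime
SET @CutoffDate = '20030101'

BEGIN TRAN
    INSERT INTO ArchiveDB..Orders
    SELECT * FROM MainDB..Orders
    WHERE OrderDate < @CutoffDate

    DELETE FROM MainDB..Orders
    WHERE OrderDate < @CutoffDate
COMMIT TRAN
```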
 
Question by:Milind00
10 Comments
 
LVL 5

Expert Comment

by:russellshome
ID: 9685339
Change the 5 below to whatever batch size keeps the transaction log small enough while staying efficient.

declare @rows int
select @rows = count(au_id) from pubs..authors

while @rows > 0
begin
      begin tran
            insert into tempdb..authors
            select top 5 * from pubs..authors
            order by au_id

            delete
            from pubs..authors
            where au_id in (
                  select top 5 au_id
                  from tempdb..authors
                  order by au_id
            )

      commit tran
      select @rows = count(au_id) from pubs..authors
end
 
LVL 5

Expert Comment

by:russellshome
ID: 9685346
Slight change (the DELETE subquery should select from pubs..authors, the source, rather than from tempdb..authors):

declare @rows int
select @rows = count(au_id) from pubs..authors

while @rows > 0
begin
     begin tran
          insert into tempdb..authors
          select top 5 * from pubs..authors
          order by au_id

          delete
          from pubs..authors
          where au_id in (
               select top 5 au_id
               from pubs..authors
               order by au_id
          )

     commit tran
     select @rows = count(au_id) from pubs..authors
end
 
LVL 4

Author Comment

by:Milind00
ID: 9685574
russellshome... with your query, do I need to use DBCC SHRINKFILE to shrink the database file and the LDF file?
 
LVL 4

Author Comment

by:Milind00
ID: 9685593
russellshome...

Also, how about using BULK INSERT for the above task? Which would be more efficient, BULK INSERT or the way you suggested?
 
LVL 1

Expert Comment

by:RadimHampel
ID: 9686998
When you remove all records in one huge transaction, the log will contain all of the data from your table (if that table is most of your database, up to 40 GB). Even if you are using the simple recovery model, the log file will grow quickly. The solution presented by russellshome is good, but it isn't done in one transaction. Afterwards you should shrink the log file; I don't advise you to shrink the data file, because (as I expect) it will grow again.
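To shrink only the log file afterwards, a minimal sketch (the database and logical log file names are hypothetical; sp_helpfile lists the real ones):

```sql
USE MainDB
-- List the logical file names for this database
EXEC sp_helpfile

-- Shrink the log file back to roughly 100 MB
DBCC SHRINKFILE (MainDB_log, 100)
```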
 
LVL 22

Expert Comment

by:CJ_S
ID: 9687094
A TRUNCATE TABLE statement is only minimally logged (it logs page deallocations, not individual rows), might that help?
 
LVL 2

Expert Comment

by:sedmans
ID: 9687155
I am not a DTS expert, but this sounds like something suited to it, and you might be able to do bulk inserts this way.
 
LVL 34

Expert Comment

by:arbert
ID: 9689063
Agree with sedmans: look at using the DTS Bulk Insert task.  You can also specify the transaction size to keep the number of open transactions to a minimum...

Brett
 
LVL 5

Accepted Solution

by:
russellshome earned 500 total points
ID: 9699997
While RadimHampel says "it isn't done in one transaction", the point is that my method does each insert/delete pair in its own transaction.

> do I need to use DBCC shrinkfile to shrink the database file and LDF file?

Depends on the recovery model:

Simple: no.
Full: yes (a log backup is needed before the log can be shrunk).
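Under the full recovery model the inactive part of the log is only freed by a log backup, so the sequence is roughly (database name, logical log file name, and backup path are all hypothetical):

```sql
-- Back up the log so its inactive portion can be reused
BACKUP LOG MainDB TO DISK = 'D:\Backup\MainDB_log.bak'

-- Then shrink the physical log file if required
DBCC SHRINKFILE (MainDB_log, 100)
```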

>Also, how about using BULK INSERT for the above task? Which would be more efficient, BULK INSERT or the way you suggested?

BULK insert will be more efficient but less robust.
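For the BULK INSERT route, one sketch: export the rows with bcp in native format first, then load them with minimal logging (table name, file path, and batch size are hypothetical):

```sql
-- Load a previously bcp-exported native-format file into the
-- archive table; BATCHSIZE commits every 10000 rows
BULK INSERT ArchiveDB..Orders
FROM 'D:\archive\orders.dat'
WITH (DATAFILETYPE = 'native', BATCHSIZE = 10000)
```

The matching delete from the main table is still a separate, fully logged step, which is the robustness trade-off mentioned above.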

You can only ensure that a row is deleted for each row inserted in two ways: your first attempt, which puts all insert/delete pairs in one transaction (and needs lots of log file space), or my suggestion, which puts a smaller number of insert/delete pairs into each transaction.
 
LVL 4

Author Comment

by:Milind00
ID: 9706192
Thanks a lot, experts. I think I got the answer.
