SQL Server 2008 - Scheduled archiving of data in tables


I have a database with 100+ tables. I want to create an ARCHIVE DATABASE and periodically move data from the main database into it.

Is there anything available, or does anyone have code, that would help me kick-start this?

The objective is to:
-> create an ARCHIVE DATABASE (which should have all the tables from the MAIN DATABASE)
-> set an interval on which the archive process runs (meaning any data more than one month old gets archived)


This is something that I think you'll have to do with code (either via a program or via a stored procedure). A typical database doesn't have a universal way to determine the age of a record in a table, so you'll need your own technique for determining the age (most likely an existing "date stamp" column).
--Lots of methods; below is one. A driver table lists each table to archive,
--its date column, and the age (in days) after which rows are archived.
create table tables_to_archive (
   tablename varchar(500) not null,
   datefield varchar(100) not null,
   archiveage int)

alter table tables_to_archive
   add constraint pk_tables_to_archive primary key (tablename, datefield)

insert into tables_to_archive (tablename, datefield, archiveage)
   select 'orders', 'orderdate', 31
insert into tables_to_archive (tablename, datefield, archiveage)
   select 'invoices', 'invoicedate', 31
go

--Execute this procedure at the desired frequency (e.g. from a SQL Agent job).
create procedure p_archive_tables
as
begin
   declare @tablename   varchar(500),
           @datefield   varchar(100),
           @archiveage  int,
           @sqlcmd      varchar(2000),
           @archivedate datetime

   declare tbl_cur cursor for
      select tablename, datefield, archiveage
      from tables_to_archive
      order by tablename

   open tbl_cur
   fetch next from tbl_cur
      into @tablename, @datefield, @archiveage

   while @@FETCH_STATUS = 0   -- 0 = fetch succeeded
   begin
      -- cut-off date: rows on or before this date get archived
      select @archivedate = dateadd(dd, -1 * abs(@archiveage), getdate())

      -- copy qualifying rows to the archive database
      select @sqlcmd = ' insert into archivedb..' + @tablename +
                       ' select * from ' + @tablename +
                       ' where ' + @datefield + ' <= ''' +
                             convert(varchar, @archivedate, 101) + ''''
      exec (@sqlcmd)

      -- then remove them from the live database
      select @sqlcmd = ' delete from ' + @tablename +
                       ' where ' + @datefield + ' <= ''' +
                             convert(varchar, @archivedate, 101) + ''''
      exec (@sqlcmd)

      fetch next from tbl_cur
         into @tablename, @datefield, @archiveage
   end

   close tbl_cur
   deallocate tbl_cur
end


Mark Wills (Topic Advisor) commented:
Just did one a month or so ago...

1) Created the archive database - just a script of the current "live" database using the wizard
2) Inspected table relationships and dependencies
3) Worked out the hierarchy / sequence to populate the archive database
4) Worked out the hierarchy / sequence to remove live data
5) Worked out "transaction groups" for the above hierarchies
6) Worked out logical "events" for archiving (e.g. end-of-month financial reporting)
7) Created individual scripts / procedures for each transaction group
    basically amounts to:
          insert into archive select from live where not exists in archive
          delete from live where exists in archive
          a few strategies for the above - can write generic code using the information_schema views into a temp area, then use that as a transition / staging step for speed
          log success / progress for dependency checking
8) Created scheduled jobs from the above scripts
9) Created a "checker" to see if data structures had altered
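A minimal sketch of one "transaction group" from step 7, assuming a live table `dbo.orders` keyed by `orderid` with an identically structured table in a database named `archivedb` (all names are illustrative, not from the original post):

```sql
-- Copy rows past the cut-off that are not yet in the archive,
-- then delete from live only rows confirmed to exist in the archive.
-- Wrapping both in one transaction keeps the pair atomic.
begin transaction

insert into archivedb.dbo.orders
select o.*
from dbo.orders o
where o.orderdate <= dateadd(dd, -31, getdate())
  and not exists (select 1 from archivedb.dbo.orders a
                  where a.orderid = o.orderid)

delete o
from dbo.orders o
where o.orderdate <= dateadd(dd, -31, getdate())
  and exists (select 1 from archivedb.dbo.orders a
              where a.orderid = o.orderid)

commit transaction
```

The `not exists` / `exists` pairing makes the script safe to re-run: a failed delete can be retried without duplicating rows in the archive.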

Altered the recovery model and backup maintenance plan for the archive database to SIMPLE, corresponding to the archive frequency of the "events".
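The recovery-model change is a one-liner, assuming the archive database is named `ArchiveDB` (a hypothetical name):

```sql
-- The archive is repopulated from live on a schedule, so point-in-time
-- recovery of the archive is unnecessary; SIMPLE keeps the log small.
alter database ArchiveDB set recovery simple
```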

Altered the couple of functions/procedures (and views) where access to archived data was required - because the archive has the same structure, this was a "no-brainer" with a union-type approach (and the major reason why the database structure was duplicated).
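A sketch of that union-type approach, assuming identical `orders` tables in the live database and a hypothetical `archivedb`:

```sql
-- Because live and archive share a schema, one view can span both,
-- so reporting code needn't know where a row currently lives.
create view dbo.orders_all
as
select * from dbo.orders            -- live rows
union all
select * from archivedb.dbo.orders  -- archived rows
```

`union all` is used rather than `union` because the archive process guarantees a row exists in only one database, so the duplicate-elimination cost of `union` is wasted.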

The next step was to automate data-structure migration; as it is currently a fairly small and static site, that was deemed unnecessary beyond point 9 above - and the use of dynamic SQL did cater for various changes anyway.
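One way to implement the structure "checker" from point 9 is to compare column definitions across the two databases via the INFORMATION_SCHEMA views (the database names `LiveDB` and `ArchiveDB` are assumptions):

```sql
-- Columns present in live but missing, or differently typed, in the
-- archive. Any rows returned mean the archive schema needs updating.
select l.TABLE_NAME, l.COLUMN_NAME, l.DATA_TYPE
from LiveDB.INFORMATION_SCHEMA.COLUMNS l
left join ArchiveDB.INFORMATION_SCHEMA.COLUMNS a
  on  a.TABLE_NAME  = l.TABLE_NAME
  and a.COLUMN_NAME = l.COLUMN_NAME
  and a.DATA_TYPE   = l.DATA_TYPE
where a.COLUMN_NAME is null
```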

Mark Wills (Topic Advisor) commented:
Oh, and step 3 also included the time scales to hold data in live - that did take quite a bit of discussion with the business...