
Solved

SQL 2008 - Timely archiving of data in Tables

Posted on 2008-11-12
Medium Priority
337 Views
Last Modified: 2012-05-05
Hi,

I have a database with 100+ tables. I want to create an ARCHIVE DATABASE and periodically move data from one database to the other.

Is there anything available, or does somebody have any code, that would help me kick-start this?

The objective is to:
->create an ARCHIVE DATABASE (which should have all the tables from the MAIN DATABASE)
->set an interval on which the archive process runs (meaning any data more than 1 month old would be archived)

Regards
Question by:RadheRadhe
4 Comments
 
LVL 41

Expert Comment

by:graye
ID: 22940854
This is something that I think you'll have to do with code (either via a program or via a stored procedure)...
A typical database doesn't have a universal way to determine the age of a record in a table. So that means you'll need your own technique for determining the age (most likely an existing "date stamp" field).
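If a table doesn't already have such a field, one approach (a sketch only; the table and column names here are hypothetical examples) is to add a datestamp column with a default, so every new row records when it was created:

```sql
-- Hypothetical example: add an audit timestamp to an existing "orders" table
-- so an archive job can later filter on record age.
alter table orders
   add datestamp datetime not null
       constraint df_orders_datestamp default (getdate())
```

Existing rows would get the current date when the column is added this way, so the archive clock effectively starts from the day of the change for old data.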
 
LVL 7

Expert Comment

by:lundnak
ID: 22946363
--Lots of methods.  See below for one method.
--Please excuse any syntax errors.
create table tables_to_archive (
   tablename varchar(500) not null,
   datefield varchar(100) not null,
   archiveage int)
 
alter table tables_to_archive
   add constraint pk_tables_to_archive primary key (tablename, datefield)
 
insert into tables_to_archive
   select 'orders', 'orderdate', 31
insert into tables_to_archive
   select 'invoices', 'invoicedate', 31
 
--execute this procedure at the desired frequency.
create procedure p_archive_tables
as
   declare @tablename varchar(500),
           @datefield varchar(100),
           @archiveage int,
           @sqlcmd varchar(2000),
           @archivedate datetime
 
   declare tbl_cur cursor for
      select tablename, datefield, archiveage
      from tables_to_archive
      order by tablename
 
   open tbl_cur
 
   fetch next from tbl_cur
      into @tablename, @datefield, @archiveage
 
   while @@FETCH_STATUS = 0
   begin
      select @archivedate = dateadd(dd, -1 * abs(@archiveage), getdate())
 
      -- copy qualifying rows into the archive database
      select @sqlcmd = ' insert into archivedb..' + @tablename +
                       ' select * from ' + @tablename +
                       ' where ' + @datefield + ' <= ''' +
                             convert(varchar, @archivedate, 101) + ''''
      exec (@sqlcmd)
 
      -- then remove them from the live database
      select @sqlcmd = ' delete from ' + @tablename + 
                       ' where ' + @datefield + ' <= ''' +
                             convert(varchar, @archivedate, 101) + ''''
      exec (@sqlcmd)
 
      fetch next from tbl_cur
         into @tablename, @datefield, @archiveage
   end
 
   close tbl_cur
   deallocate tbl_cur
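To make the dynamic SQL concrete: for the 'orders' row registered above, a run of the procedure would build and execute statements along these lines (the cut-off date shown is just an illustration of the style-101, mm/dd/yyyy, format the procedure uses):

```sql
insert into archivedb..orders
   select * from orders where orderdate <= '10/12/2008'

delete from orders where orderdate <= '10/12/2008'
```

In practice the procedure would be invoked from a SQL Server Agent job scheduled at the desired frequency.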

 
LVL 51

Accepted Solution

by: Mark Wills (earned 2000 total points)
ID: 22946469
Just did one a month or so ago...

1) Created the database - just a script of the current "live" database using the wizard
2) Inspected table relationships and dependencies
3) worked out the hierarchy / sequence to populate archive database
4) worked out hierarchy / sequence to remove live data
5) worked out "transaction groups" for the above hierarchies
6) worked out logical "events" for archiving (e.g. end of month financial reporting)
7) created individual scripts / procedures for each transaction group
    basically amounts to:
          insert into archive select from live where not exists in archive
          delete from live where exists in archive
          few strategies for the above - can write generic code and use the information_schema views into a temp area, then use that as a transition/staging step for speed
          log success/progress for dependency checking
8) created scheduled jobs of the above scripts
9) created a "checker" to see if data structures had altered
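The insert/delete pair from step 7 can be sketched for one hypothetical table (the database, table, and key names here are illustrative; it assumes the archive database shares the live schema and the table has a single-column key):

```sql
-- Copy rows past the cut-off that are not yet archived,
-- then remove the live copies - one transaction group.
begin transaction

insert into archivedb..orders
   select l.*
   from livedb..orders l
   where l.orderdate <= dateadd(mm, -1, getdate())
     and not exists (select 1 from archivedb..orders a where a.id = l.id)

delete l
from livedb..orders l
where l.orderdate <= dateadd(mm, -1, getdate())
  and exists (select 1 from archivedb..orders a where a.id = l.id)

commit transaction
```

The "not exists" guard makes the copy re-runnable: if a job fails between the insert and the delete, the next run skips rows already archived.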

altered recovery model and backup maintenance plan for Archive to be simple, and correspond to archive frequency of "events"

altered the couple of functions/procedures (and views) where access to archive data was required - because it has the same structure, it was a "no brainer" with a union-type approach (and the major reason why the database was duplicated).

next step was to automate data structure migration; at the moment it is a fairly small and static site, so that was deemed unnecessary other than point 9 above - and the use of dynamic SQL did cater for various changes anyway.
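The structure "checker" from point 9 can be sketched by comparing column metadata between the two databases (database names are illustrative; any rows returned flag live columns that are missing or different in the archive):

```sql
-- Live columns with no exact match in the archive schema.
select table_name, column_name, data_type
from livedb.information_schema.columns
except
select table_name, column_name, data_type
from archivedb.information_schema.columns
```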
 
LVL 51

Expert Comment

by:Mark Wills
ID: 22946479
Oh, and step 3 also included the time scales to hold data in live - that did take quite a bit of discussion with the business...
