Solved

Automatically back up data

Posted on 2013-01-26
317 Views
Last Modified: 2013-01-28
Hi all,

I need to find the best solution - backing up data on a per-table basis every hour (or every day), automatically, to a backup table in the same instance. It cannot be done with triggers; that would affect performance too much.

Any help is appreciated
Question by:darrgyas
5 Comments
 
LVL 45

Expert Comment

by:Kdo
ID: 38823188
Hi darrgyas,

Depending on your needs and the amount of data involved, you might consider replication.  That's a process where you have another database, usually on another server, and when updates are applied to one database, the changes are automatically applied to the other.

If you really need to copy data from a key table (or tables) on a regular basis, there are a number of considerations.

1.  How much data will you copy (number of rows and total size of the data)?
2.  Do you need to copy the entire table?
3.  Can you copy only the new rows and can you identify them?
4.  Are you going to delete data behind the copy?
5.  Will the new table be completely reloaded or is the data added to existing data?

There are more questions, but that should be a great start.


Kent
 
LVL 27

Expert Comment

by:tliotta
ID: 38825334
And what platform is DB2 running on? That bears directly on the question of triggers, which might still be your best choice if performance is the concern.
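For illustration, on DB2 for Linux/UNIX/Windows a row-copying trigger is only a few lines - the table and column names below are made up, not from your schema:

-- Hypothetical sketch: copy each newly inserted row straight to a backup
-- table.  Table and column names are illustrative only.
CREATE TRIGGER copy_new_rows
  AFTER INSERT ON source_table
  REFERENCING NEW AS n
  FOR EACH ROW MODE DB2SQL
  INSERT INTO backup_table (id, payload, ts)
  VALUES (n.id, n.payload, n.ts);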

Tom
 

Author Comment

by:darrgyas
ID: 38826441
The platform is Windows, DB2 Express-C 9.7.4

Each table is being written to approximately 200 times a second; the average record is approximately 200 bytes
I don't need the entire table - just the new rows
I can identify the new rows by the timestamp
The data from the original table will not be deleted
The new data will be added to the backup table
 
LVL 45

Accepted Solution

by:Kdo (earned 500 total points)
ID: 38826916
Hi darrgyas,

If you want to do this with SQL, I suggest a two-step approach.  Step 1 pulls the desired rows into a staging table.  Step 2 writes the rows to the target table.

One of the issues that you'll be contending with is logging.  The INSERTs on the second table will use log space.  200 rows/second over an hour is 720,000 rows.  You don't really want to have 3/4 of a million uncommitted rows logged.  And the easiest way to avoid that is to write to a staging table without logging.  You can then move the data from the staging table to the target table in smaller chunks.

You can also move the data directly from the source to the target in smaller chunks, though that will affect buffering for a longer period of time.  Which works best for you is something that you might have to learn by trial and error.
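For what it's worth, the direct approach might look roughly like this - the "ts" column name, the high-water-mark predicate, and the chunk size are assumptions for the sketch, not something from your tables:

-- Hypothetical sketch: copy rows newer than the target's high-water mark,
-- one chunk per transaction.  "ts" is an assumed timestamp column; repeat
-- (e.g. from a scheduled script) until no more rows are inserted.
-- ORDER BY / FETCH FIRST inside a nested table expression needs DB2 9.7+.
INSERT INTO target
SELECT *
FROM (SELECT s.*
      FROM source_table s
      WHERE s.ts > (SELECT COALESCE(MAX(t.ts),
                                    TIMESTAMP('0001-01-01-00.00.00.000000'))
                    FROM target t)
      ORDER BY s.ts
      FETCH FIRST 10000 ROWS ONLY) AS chunk;
COMMIT;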

Anyway, moving the data via a staging table is easy.

-- One time: create the staging table with the source's definition.  The
-- NOT LOGGED INITIALLY attribute must be set at creation time, or the
-- ALTER statements below cannot activate it.
CREATE TABLE stage1 AS (SELECT * FROM source_table)
    DEFINITION ONLY NOT LOGGED INITIALLY;

-- Empty the staging table
ALTER TABLE stage1 ACTIVATE NOT LOGGED INITIALLY WITH EMPTY TABLE;
COMMIT;

-- Insert the new rows (keep the ALTER and INSERT in the same unit of
-- work - autocommit off - so the INSERT is actually unlogged)
ALTER TABLE stage1 ACTIVATE NOT LOGGED INITIALLY;
INSERT INTO stage1
SELECT * 
FROM source_table 
WHERE {condition};
COMMIT;

-- Copy the rows to the target table
INSERT INTO target
SELECT * FROM stage1
WHERE {condition};
COMMIT;



That's about it.  

If the insert into the staging table fails, the table will be marked inaccessible - the standard penalty for a failed NOT LOGGED operation.  That's not a big deal, as you'll simply drop the table and recreate it.

The last INSERT should move the data from the staging table to the target table a part at a time.  Split the data, probably by the rows' timestamp or identity value.
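For example, the split might look something like this - the "ts" column and the literal window bounds are placeholders for illustration, not from your schema:

-- Hypothetical sketch: copy one time-slice of the staged rows per
-- transaction.  A driving script would advance the window until the
-- staging table is covered, then empty it as shown above.
INSERT INTO target
SELECT *
FROM stage1
WHERE ts >= TIMESTAMP('2013-01-26-10.00.00.000000')
  AND ts <  TIMESTAMP('2013-01-26-10.15.00.000000');
COMMIT;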


Good Luck,
Kent
 

Author Closing Comment

by:darrgyas
ID: 38829248
Thank you