Solved

Automatically back up data

Posted on 2013-01-26
322 Views
Last Modified: 2013-01-28
Hi all,

I need to find the best solution -
back up data on a per-table basis, every hour (or every day), automatically, to a backup table in the same instance. It can't be done with triggers; that would affect performance too much.

Any help is appreciated
Question by:darrgyas
5 Comments
 
LVL 45

Expert Comment

by:Kent Olsen
ID: 38823188
Hi darrgyas ,

Depending on your needs and the amount of data involved, you might consider replication.  That's a process where you have another database, usually on another server, and when updates are applied to one database, the changes are automatically applied to the other.

If you really need to copy data from a key table (or tables) on a regular basis, there are a number of considerations.

1.  How much data will you copy (number of rows and total size of the data)?
2.  Do you need to copy the entire table?
3.  Can you copy only the new rows and can you identify them?
4.  Are you going to delete the source data after it has been copied?
5.  Will the new table be completely reloaded or is the data added to existing data?

There are more questions, but that should be a great start.


Kent
 
LVL 27

Expert Comment

by:tliotta
ID: 38825334
And what platform is DB2 on? That bears directly on the question of triggers, which might still be your best choice if performance is a concern.
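
For illustration, a mirroring trigger would look roughly like this (a sketch only; source_table, backup_table, and the column names are made up):

-- Hypothetical: copy each new row into the backup table as it arrives
CREATE TRIGGER backup_new_rows
  AFTER INSERT ON source_table
  REFERENCING NEW AS n
  FOR EACH ROW MODE DB2SQL
  INSERT INTO backup_table VALUES (n.id, n.payload, n.ts);

The per-row cost of a simple insert trigger varies by platform, which is why the platform matters here.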

Tom
 

Author Comment

by:darrgyas
ID: 38826441
The platform is Windows, DB2 Express-C 9.7.4

Each table is written to approximately 200 times per second, and the average record is approximately 200 bytes
I don't need the entire table - just the new rows
I can identify the new rows by the timestamp
The data from the original table will not be deleted
The new data will be added to the backup table
 
LVL 45

Accepted Solution

by:
Kent Olsen earned 500 total points
ID: 38826916
Hi darrgyas,

If you want to do this with SQL, I suggest a two-step approach.  Step 1 pulls the desired rows into a staging table.  Step 2 writes the rows to the target table.

One of the issues that you'll be contending with is logging.  The INSERTs into the second table will use log space.  200 rows/second over an hour is 720,000 rows (at roughly 200 bytes per row, that's about 140 MB of row data).  You don't really want to have 3/4 million uncommitted rows logged.  And the easiest way to avoid that is to write to a staging table without logging.  You can then move the data from the staging table to the target table in smaller chunks.

You can also move the data directly from the source to the target in smaller chunks, though that ties up buffers for a longer period of time.  Which approach works best for you is something you may have to learn by trial and error.

Anyway, moving the data via a staging table is easy.

-- One time: create the staging table with the source's definition.
-- Creating it NOT LOGGED INITIALLY is what permits the later
-- ACTIVATE NOT LOGGED INITIALLY on this table.
CREATE TABLE stage1 AS (SELECT * FROM source_table)
  DEFINITION ONLY NOT LOGGED INITIALLY;

-- Empty the staging table (fast, unlogged)
ALTER TABLE stage1 ACTIVATE NOT LOGGED INITIALLY WITH EMPTY TABLE;
COMMIT;

-- Insert the new rows without logging them; the ALTER and the
-- INSERT must run in the same unit of work.
ALTER TABLE stage1 ACTIVATE NOT LOGGED INITIALLY;
INSERT INTO stage1
SELECT * 
FROM source_table 
WHERE {condition};
COMMIT;

-- Copy the rows to the target table (normally logged)
INSERT INTO target
SELECT * FROM stage1
WHERE {condition};
COMMIT;

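Since you said you can identify new rows by their timestamp, the {condition} could be driven by a high-water mark kept in a one-row control table. A sketch, assuming a timestamp column named ts (the control table and column names are hypothetical):

-- One time: control table holding the timestamp of the last copied row
CREATE TABLE copy_control (last_copied TIMESTAMP NOT NULL);
INSERT INTO copy_control VALUES (CURRENT TIMESTAMP);

-- Each cycle: stage only rows newer than the mark, then advance it
INSERT INTO stage1
SELECT *
FROM source_table
WHERE ts > (SELECT last_copied FROM copy_control);

UPDATE copy_control
   SET last_copied = (SELECT MAX(ts) FROM stage1);
COMMIT;

Fold that staged insert into the NOT LOGGED INITIALLY unit of work shown above.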


That's about it.  
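
To make the copy run every hour without manual intervention, one option is the DB2 administrative task scheduler that ships with 9.7. This assumes you enable the scheduler (the DB2_ATS_ENABLE registry variable) and wrap the statements above in a stored procedure; the schema and procedure names below are hypothetical:

-- Schedule the wrapping procedure at the top of every hour
CALL SYSPROC.ADMIN_TASK_ADD(
  'HOURLY_TABLE_COPY',    -- task name
  NULL, NULL, NULL,       -- begin, end, max invocations (use defaults)
  '0 * * * *',            -- UNIX cron format: every hour, on the hour
  'MYSCHEMA',             -- schema of the wrapping procedure (hypothetical)
  'COPY_NEW_ROWS',        -- procedure containing the SQL above (hypothetical)
  NULL, NULL, NULL);      -- input parameters, options, remarks

A Windows Task Scheduler job running a db2 CLP script would work just as well.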

If the insert into the staging table fails, the table will be marked as corrupt.  That's not a big deal as you'll simply drop the table and recreate it.

The last INSERT should move part of the data from the staging table to the target table.  Split the data, probably by the rows' timestamp or identity value.
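
For example, copying the staged rows in fixed time slices keeps each commit small. A sketch, again assuming a ts timestamp column (the window bounds are purely illustrative):

INSERT INTO target
SELECT * FROM stage1
WHERE ts >= TIMESTAMP('2013-01-26-10.00.00')
  AND ts <  TIMESTAMP('2013-01-26-10.15.00');
COMMIT;
-- Repeat with the next window until the staging table is drained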


Good Luck,
Kent
 

Author Closing Comment

by:darrgyas
ID: 38829248
Thank you
