Solved

DB2 Table Loading

Posted on 2012-03-11
727 Views
Last Modified: 2013-12-25
Hi Everyone,
I have a question on loading data from an IXF file into DB2 tables.
The IXF file is a backup from one database and contains several years' worth of data from a source. I want to load only the "new" records into the target, i.e. rows whose primary key does not already exist there. The key includes month and year. This will be a repeatable monthly process.
Example: I get a monthly dump of data that contains every month's data for the current year. Each month I want to append only the new records to the target database/tables, not replace any previously existing records.
What would be the most efficient design/process to do this? APPEND?
Thanks in advance
Mark
Question by:Mark1M
9 Comments
 

Expert Comment

by:Kent Olsen
ID: 37708071
Hi Mark,

I'm not aware of anything in DB2 that will allow you to filter the rows like you would with a query (that is, there is no WHERE clause on the load statement).

But you can declare an exception table with the same column layout as the table that you're trying to load.  DB2 will load the "new" rows into the base table and write the "duplicate" rows into the exception table.

  CREATE TABLE my_exceptions AS (SELECT * FROM my_table) DEFINITION ONLY;

Then use the FOR EXCEPTION clause on the load

  LOAD ... INSERT INTO my_table FOR EXCEPTION my_exceptions NOUNIQUEEXC

The LOAD process will write the exceptions to the new table.  By including the NOUNIQUEEXC clause, DB2 will not insert exception rows just because they violate the duplicate key condition.  So anything written to that table is an exception for a different reason.
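
Spelled out end to end, it might look something like this (the table and file names are just placeholders; the two extra columns on the exception table are optional, and DB2 uses them to record when and why each row was rejected):

  CREATE TABLE my_exceptions AS (SELECT * FROM my_table) DEFINITION ONLY;
  ALTER TABLE my_exceptions ADD COLUMN except_ts TIMESTAMP;
  ALTER TABLE my_exceptions ADD COLUMN except_msg CLOB(32K);

  -- Duplicates are dropped from the load because of NOUNIQUEEXC; anything
  -- else that gets rejected lands in my_exceptions.
  LOAD FROM /tmp/monthly_dump.ixf OF IXF
    INSERT INTO my_table
    FOR EXCEPTION my_exceptions NOUNIQUEEXC;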


That should get you very, very close.  :)


Good Luck,
Kent

Expert Comment

by:sathyaram_s
ID: 37708100
As Kent says, there is no way to filter the records in an IXF file.


Assuming you want to do only an incremental load, I would do the following:
a) LOAD the data into a temporary table [CREATE TABLE temptable LIKE targettable]
b) Select the current month's records from this table and insert them into the target table. Depending on your requirements, this can be a plain INSERT or a LOAD from cursor.

Kent's approach will generate a lot of log records, as DELETEs are logged during LOAD (Kent, correct me if I am wrong), whereas the above approach will do only minimal logging.

HTH

Sathyaram

Expert Comment

by:Kent Olsen
ID: 37708117
Hi Sathyaram,

I'll actually have to try it and find out.  In my environment, I turn off logging for the LOAD operation.  

And depending on the row count, that may be a good thing for the poster.  The worst that happens is that he rebuilds the entire table from the dump file that he already has.

Your approach loads the entire dump into another table.  Depending on the number of existing rows, number of new rows, indexing, etc., it might be fastest to load that "temp" table, drop the original table, rename the "temp" table, and rebuild the indexes.
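
Very roughly, that swap might look something like this (all of the names here are placeholders, and anything else tied to the original table, such as views, grants, or foreign keys, would have to be recreated too):

  CREATE TABLE my_table_new AS (SELECT * FROM my_table) DEFINITION ONLY;
  LOAD FROM /tmp/monthly_dump.ixf OF IXF INSERT INTO my_table_new NONRECOVERABLE;
  DROP TABLE my_table;
  RENAME TABLE my_table_new TO my_table;
  -- rebuild the primary key / indexes; these column names are made up
  ALTER TABLE my_table ADD PRIMARY KEY (source_id, rpt_year, rpt_month);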

Options.  :)  Always options.  :)


Kent
 

Author Comment

by:Mark1M
ID: 37708129
Hi Kent,

That sounds very very close! Let's see if I have it right:

First: CREATE the base table:

CREATE TABLE base_table AS (SELECT * FROM initial_table) DEFINITION ONLY;

This will load all the initial data from the "initial_table"?

Next, each time I want to update the table from the IXF, I assume I need to load the IXF into a temporary table "update_table" with the same structure.

Then:

LOAD update_table INTO TABLE base_table FOR EXCEPTION base_table NOUNIQUEEXC

What happens to the rejected records? Logging? Also, what key will it use?  The primary key?

Definitely looking for an efficient, repeatable process.

Thanks!

Expert Comment

by:sathyaram_s
ID: 37708145
Kent, According to this page

http://publib.boulder.ibm.com/infocenter/db2luw/v9r7/topic/com.ibm.db2.luw.admin.dm.doc/doc/c0004599.html

"
 During the delete phase, each delete operation on a table record, or an index key, is logged.
"

Expert Comment

by:sathyaram_s
ID: 37708152
Mark,

You have mixed things up a bit.

Kent's approach:

a) Build an exception table

 CREATE TABLE my_exceptions AS (SELECT * FROM my_table) DEFINITION ONLY;

b) Then use the FOR EXCEPTION clause on the load

  LOAD ... INSERT INTO my_table FOR EXCEPTION my_exceptions NOUNIQUEEXC

My option:

a) Build a shadow table


 CREATE TABLE my_table_shadow AS (SELECT * FROM my_table) DEFINITION ONLY;

b) Load the data from the IXF file into the shadow table

LOAD FROM myfile.ixf OF IXF INSERT INTO my_table_shadow NONRECOVERABLE;

c) Select the current month's rows from the shadow table and load them into the target table

DECLARE c1 CURSOR FOR SELECT * FROM my_table_shadow WHERE date BETWEEN '2011-03-01' AND '2011-03-31';
LOAD FROM c1 OF CURSOR INSERT INTO my_table ALLOW READ ACCESS;
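
If you want the date range to follow the current month automatically instead of hard-coding it each time, the cursor could be written something like this (assuming the column really is a DATE named "date"):

DECLARE c1 CURSOR FOR
  SELECT * FROM my_table_shadow
  WHERE date >= CURRENT DATE - (DAY(CURRENT DATE) - 1) DAYS
    AND date < CURRENT DATE - (DAY(CURRENT DATE) - 1) DAYS + 1 MONTH;
LOAD FROM c1 OF CURSOR INSERT INTO my_table ALLOW READ ACCESS;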


HTH

Expert Comment

by:Kent Olsen
ID: 37708158
Hi Mark,

Too many tables.  :)

  LOAD FROM filename OF IXF INSERT INTO tablename FOR EXCEPTION exceptiontable NOUNIQUEEXC


Plus any other options that you require.


Kent

Author Comment

by:Mark1M
ID: 37708190
Hi Kent & Sathyaram,

Thanks for the clarification.

LOAD FROM "IXF_file" INTO "base_table" FOR EXCEPTION "expection_table" NOUNIQUEEXC

Will my DBA go nuts with error messages? Or will the DBA like this approach?  Just a subjective question to prepare myself for Monday morning... :-)

Thank U very much!
Mark

Accepted Solution

by: Kent Olsen (earned 500 total points)
ID: 37708278
Hi Mark,

He should like that.  In fact, he should already be familiar with it, and since you're literally giving him the command that you need run, you're making his life pretty easy.  :)


Kent