Using DTS to import / append data from Oracle to SQL2000 without causing primary key constraint errors

I'm using DTS in SQL2000 to import data from Oracle. The DTS job uses a straightforward default append statement in ActiveX. However, each run imports old data as well as updated information from Oracle, which predictably causes PK violations. To get around the problem I'm deleting all data from the destination table in SQL prior to each import. This is obviously very inefficient and not scalable.

Is there a way of skipping these errors so that the DTS import job ignores the PK constraints and continues to import new data?
robcpodAsked:
BillAn1Commented:
Just remembered - if you are using SQL Server as the destination, the default setting is to use a bulk insert. This bypasses the trigger, so the trigger is not fired.
In the DTS package, select Options and de-select "Use fast load".
Note that this will slow down the load!
A faster approach may be to do a 'fast load' into another, staging table in SQL Server which has no constraints but is otherwise identical to the target table.

Then do a second step, which is a SQL Task, of the form:
insert into Outward_Tkt_fact
select * from staging_Outward_Tkt_fact
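As written, though, the insert from the staging table would still hit the same PK violations for rows already in the target. A sketch of the second step that filters them out first (assuming unique_id is the primary key column, as used later in the thread):

insert into Outward_Tkt_fact
select * from staging_Outward_Tkt_fact s
where not exists (select 1 from Outward_Tkt_fact o where o.unique_id = s.unique_id)

With this form the staging load can keep 'fast load' enabled, and no trigger is needed at all.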
 
OtanaCommented:
When you select which tables to copy in your DTS, go to "Transform", and select "Drop and recreate destination table". This is faster than deleting all records.

 
BillAn1Commented:
Dropping and recreating the table can be a mess if you have indexes, constraints, etc.
Probably the cleanest thing is to create a trigger on the table you are inserting into, along the lines of:

create trigger mytrig on mytable
instead of insert
as
begin
     insert into mytable
     select * from inserted
     where not exists (select 1 from mytable where mytable.PK = inserted.PK)
end
 
robcpodAuthor Commented:
Hi Otana,

That's good to know, however, I'm trying to avoid dropping the table if at all possible...

Hi BillAn1

I'm a novice with databases so forgive my ignorance!

How can I apply the script within DTS? I understand how I can apply the trigger if both source and destination tables are on the same RDBMS, but how does it work when the source is on Oracle and the destination is on SQL2000?
 
BillAn1Commented:
The trigger is applied purely to the destination table in SQL Server; it is not applied to the Oracle DB.
It is not something that will be part of the DTS. Once you have applied the trigger, it will be in place 'forever'.
To apply the trigger, you can use Enterprise Manager:
select the table you want, right-click -> All Tasks -> Manage Triggers
and insert the text you want, along the lines I have shown, but obviously customised for your particular table etc.
Once it's done, you can run your DTS; any rows that are already there the trigger will simply ignore, so the DTS will not get any errors.
 
arbertCommented:
Depending on the number of rows, a trigger can kill your performance. If you really want to reload the table every time, add an Execute SQL task to your DTS package and issue a TRUNCATE TABLE yourtablename. That doesn't drop the table, and doesn't delete each row individually.
 
BillAn1Commented:
Even TRUNCATE will have problems if there are any FKs referencing the table - then you are back to where you started, deleting from the table.
 
arbertCommented:
yep, just saw the heading on the question about primary key constraint errors :)
 
robcpodAuthor Commented:
Hi BillAn1 & Brett,

What I'm struggling to understand is how I can apply a trigger on the destination table in MSSQL when the source is from multiple tables in an Oracle DB...

Taking BillAn1's example code:

create trigger mytrig on mytable
instead of insert
as
begin
     insert into mytable
     select * from inserted
     where not exists (select 1 from mytable where mytable.PK = inserted.PK)
end

The table referenced as 'inserted' is not on the SQL DB but is on the Oracle DB (and it's 9 tables). Can the trigger reference 'foreign' tables from another RDBMS?
 
BillAn1Commented:
OK, here's a brief explanation of how triggers work.
A trigger is a piece of code that is attached to a table and is fired whenever you try to insert / update rows in the table.
You can have different types of trigger - FOR INSERT means it will fire immediately after an insert happens; FOR INSERT, UPDATE means it will fire after an INSERT or an UPDATE has happened; etc. In this case, I have suggested INSTEAD OF INSERT - this means the data does not get inserted into the table directly. Instead, it is placed in a special table called 'inserted' (this happens for all triggers, including the after-insert ones) and then the trigger is fired. It is then up to the trigger to actually insert the data. Hence, the code here can check whether the rows already exist in the table. If they do, it will ignore them; otherwise it will go ahead and actually insert them.

Externally, you can 'insert' any rows you like into the table - e.g. your DTS can select all the rows in the Oracle table and insert them into the SQL DB table. However, the trigger will automatically kick out any rows that are already there.

The trigger is attached to the destination table only, and it automatically rejects any rows that are already in the table, without generating an error.
 
robcpodAuthor Commented:
OK, that's very clear and I now understand!

I've applied the following code to the trigger:

CREATE TRIGGER [OWT_Trig] ON [dbo].[Outward_Tkt_fact]
INSTEAD OF INSERT
AS
begin
     insert into Outward_Tkt_fact
     select * from inserted i
     where not exists (select * from Outward_Tkt_fact as o where o.unique_id = i.unique_id)
end

BUT I'm still getting a PK violation error when I run the DTS package... It should work...?!

Is there any way to test the trigger to ensure that it is doing what you expect it to?
 
BillAn1Commented:
You can test the trigger easily enough by just doing an insert of a row of data:
insert into Outward_Tkt_fact values (1, 2, 3, 4, ... whatever)
and then running it again. You should get no error the second time, but the data should not be inserted twice.
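For example, a minimal test along those lines (assuming unique_id is the PK and the remaining columns accept NULLs or defaults - adjust the column list to the real table):

insert into Outward_Tkt_fact (unique_id) values (1)
insert into Outward_Tkt_fact (unique_id) values (1)
-- second insert: no error is raised, but the trigger ignores the duplicate
select count(*) from Outward_Tkt_fact where unique_id = 1
-- should return 1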

As for the duplicate error - could there be duplicates in the source data? If so, you may want to change the trigger to do a SELECT DISTINCT. If there is more than one row with the same unique_id but different values in the other columns, you are still in trouble - you need to decide a 'rule' for what to do with these: should you accept the first one? the last one? the "maximum" one? etc.
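The SELECT DISTINCT change is a one-line tweak to the trigger already in place - a sketch, applicable only when the duplicate source rows are exact copies (ALTER TRIGGER replaces the body of the existing trigger):

ALTER TRIGGER [OWT_Trig] ON [dbo].[Outward_Tkt_fact]
INSTEAD OF INSERT
AS
begin
     insert into Outward_Tkt_fact
     select distinct * from inserted i
     where not exists (select * from Outward_Tkt_fact as o where o.unique_id = i.unique_id)
end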
 
robcpodAuthor Commented:
The fast load was the problem, the trigger is now working. Thank you!

I have considered a staging table, so I have another good reason to adopt one!

Many thanks.
Question has a verified solution.
