Solved

DTS Duplicate Key Error

Posted on 2004-08-02
758 Views
Last Modified: 2013-11-30
I have a DTS package that updates an Item Master table from Oracle to SQL 2000. I'm looking to capture the updates to the Inventory Master in SQL 2000. My DTS package is just copying the tables. I'm getting the following error: Error at destination for row number 35465. Errors encountered so far in this task: 1. Duplicate key was ignored.

Thanks
Question by:mtnbuzz
3 Comments
 
LVL 18

Expert Comment

by:SjoerdVerweij
ID: 11697192
Change the package to load a table that is identical to the inventory master, with one change: do not put a primary key on it (put the field in, but don't make it the primary key). Execute the package to put the data from the original table in it, then run the following:

Select Nameofpkfield
from InventoryMasterTable
Group By Nameofpkfield
Having Count(*) > 1

That will show you the duplicates.
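
For reference, here is a fuller sketch of that staging approach. The table name InventoryMasterStage and the key column ItemNo are hypothetical; substitute your own names:

-- Build a keyless staging copy of the destination table
-- (SELECT INTO copies the columns but not the primary key)
SELECT *
INTO   InventoryMasterStage
FROM   InventoryMaster
WHERE  1 = 0    -- structure only, no rows

-- After the DTS package has loaded the staging table,
-- list the complete rows behind each duplicated key
SELECT s.*
FROM   InventoryMasterStage s
JOIN   (SELECT ItemNo
        FROM   InventoryMasterStage
        GROUP BY ItemNo
        HAVING COUNT(*) > 1) d
  ON   s.ItemNo = d.ItemNo
ORDER BY s.ItemNo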
 
LVL 17

Accepted Solution

by:BillAn1 (earned 500 total points)
ID: 11720925
Before you run the DTS, are you truncating/deleting the SQL Server table?
If not, you will get duplicate errors the second time you run the DTS. If this is the case, then you need to either dump the contents before inserting, or only insert records that are not already there.
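
A hedged sketch of both options; InventoryMaster, InventoryMasterStage, ItemNo and Descr are hypothetical names:

-- Option A: empty the destination before each load
TRUNCATE TABLE InventoryMaster

-- Option B: load into a staging table first, then copy across
-- only the keys that are not already in the destination
INSERT INTO InventoryMaster (ItemNo, Descr)
SELECT s.ItemNo, s.Descr
FROM   InventoryMasterStage s
WHERE  NOT EXISTS (SELECT 1
                   FROM   InventoryMaster m
                   WHERE  m.ItemNo = s.ItemNo)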

Several things to do here if there are duplicates in the source:
1. Modify the DTS to select only one record per key:
change the SQL to a SELECT DISTINCT (with maybe MAX() on the other columns to ensure only one record per PK; it depends on how you want to react to two records with the same 'PK' but different values in the other columns). See the first sketch after this list.
2. Modify the DTS to increase the number of errors allowed. Then have it log errors to an error file, which you can then analyse to determine the problem.
3. A trigger on the destination table could be used to ignore duplicates, or records which are already in the target. This would handle either case: where the source has duplicates, or where you have already inserted the records. You can also have the trigger update any records which are already there. See the second sketch after this list.
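
Two hedged sketches of options 1 and 3. All object names (ITEM_MASTER, InventoryMaster, ItemNo, Descr, UnitPrice) are hypothetical stand-ins for your own tables and columns.

Option 1, as a source query for the transform (GROUP BY with MAX() gives the same one-row-per-key effect as SELECT DISTINCT plus MAX()):

SELECT ItemNo,
       MAX(Descr)     AS Descr,
       MAX(UnitPrice) AS UnitPrice
FROM   ITEM_MASTER
GROUP BY ItemNo

Option 3, as an INSTEAD OF INSERT trigger on the destination that silently skips keys which already exist (note that a fast-load/bulk insert can bypass triggers, so turn fast load off if you rely on this):

CREATE TRIGGER trg_InventoryMaster_Ins
ON InventoryMaster
INSTEAD OF INSERT
AS
BEGIN
    -- insert only the rows whose key is not already present
    INSERT INTO InventoryMaster (ItemNo, Descr)
    SELECT i.ItemNo, i.Descr
    FROM   inserted i
    WHERE  NOT EXISTS (SELECT 1
                       FROM   InventoryMaster m
                       WHERE  m.ItemNo = i.ItemNo)
    -- an UPDATE of the matching rows could be added here for the
    -- "update records already there" variant
END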


 
LVL 1

Expert Comment

by:tanakmot
ID: 11723792
Depending on the type of task you're using: if you use the Transform Data task, you can set an error file that lists the records causing the problem.
I find this option quite handy when you start getting integrity violations or data type conversion errors. You can then take appropriate action, as mentioned earlier, to fix the problem.

If you're not using a transform task, try setting an error log on the package itself; it might give a somewhat more accurate message.

MT-

