Solved

DTS Duplicate Key Error

Posted on 2004-08-02
751 Views
Last Modified: 2013-11-30
I have a DTS package that updates an Item Master table from Oracle to SQL 2000. I'm looking to capture the updates to the Inventory Master in SQL 2000. My DTS package is just copying the tables. I'm getting the following error: Error at destination for row number 35465. Errors encountered so far in this task: 1. Duplicate key was ignored.

Thanks
Question by:mtnbuzz
3 Comments
 
LVL 18

Expert Comment

by:SjoerdVerweij
ID: 11697192
Change the package to load a table that is identical to the inventory master, with one change: do not put a primary key on it (include the key field, but don't make it the primary key). Execute the package so the data from the original source ends up in that table, then run the following:

SELECT Nameofpkfield
FROM InventoryMasterTable
GROUP BY Nameofpkfield
HAVING COUNT(*) > 1

That will show you the duplicates.
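
If you want to see the whole rows behind those keys rather than just the key values, a join back to the same PK-less copy works. A minimal sketch, still using the placeholder names from the query above:

-- Show every column of the rows whose key value occurs more than once.
SELECT t.*
FROM   InventoryMasterTable t
       JOIN (SELECT Nameofpkfield
             FROM   InventoryMasterTable
             GROUP  BY Nameofpkfield
             HAVING COUNT(*) > 1) d
         ON d.Nameofpkfield = t.Nameofpkfield
ORDER  BY t.Nameofpkfield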
 
LVL 17

Accepted Solution

by: BillAn1 (earned 500 total points)
ID: 11720925
Before you run the DTS, are you truncating/deleting the SQL Server table?
If not, you will get duplicate errors the second time you run the DTS. If that is the case, you need to either dump the contents before inserting, or only insert records that are not already there (see the sketch below).
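
A minimal T-SQL sketch of both approaches; InventoryMaster, InventoryMaster_Staging, ItemNo, Description and QtyOnHand are placeholder names, and the second approach assumes the package lands the Oracle rows in a staging table first:

-- Approach 1: empty the destination before each DTS run
-- (for example from an Execute SQL Task that runs before the copy step).
TRUNCATE TABLE InventoryMaster

-- Approach 2: copy across only the keys that are not already present.
INSERT INTO InventoryMaster (ItemNo, Description, QtyOnHand)
SELECT s.ItemNo, s.Description, s.QtyOnHand
FROM   InventoryMaster_Staging s
WHERE  NOT EXISTS (SELECT 1
                   FROM   InventoryMaster m
                   WHERE  m.ItemNo = s.ItemNo)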

Several things to do here if there are duplicates in the source:
1. Modify the DTS to select only one record per key:
change the SQL to a SELECT DISTINCT (possibly with MAX() on the other columns to ensure only one record per PK; it depends on how you want to handle two records with the same 'PK' but different values in the other columns). See the first sketch after this list.
2. Modify the DTS to increase the number of errors allowed, and have it log errors to an error file, which you can then analyse to determine the problem.
3. A trigger on the destination table could be used to ignore duplicates, or records that are already in the target. This handles both the case where the source has duplicates and the case where you have already inserted the records. You can also have the trigger update any records that are already there. See the second sketch after this list.
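
The following are minimal sketches of options 1 and 3, not a definitive implementation; ItemMaster, InventoryMaster, ItemNo, Description and QtyOnHand are placeholder names standing in for the real source/destination tables and columns.

-- Option 1: de-duplicate in the DTS source query so only one row per key
-- is copied across (this would go in as the transform's source SQL).
SELECT ItemNo,
       MAX(Description) AS Description,
       MAX(QtyOnHand)   AS QtyOnHand
FROM   ItemMaster
GROUP  BY ItemNo

For option 3, a trigger sketch, assuming the trigger belongs on the SQL Server destination table (SQL 2000 supports INSTEAD OF triggers):

-- An INSTEAD OF INSERT trigger on the destination that silently
-- skips any incoming row whose key is already present.
CREATE TRIGGER trg_InventoryMaster_SkipDupes
ON InventoryMaster
INSTEAD OF INSERT
AS
    INSERT INTO InventoryMaster (ItemNo, Description, QtyOnHand)
    SELECT i.ItemNo, i.Description, i.QtyOnHand
    FROM   inserted i
    WHERE  NOT EXISTS (SELECT 1
                       FROM   InventoryMaster m
                       WHERE  m.ItemNo = i.ItemNo)

One caveat with the trigger route: if the transform uses the 'Use fast load' option, the rows typically go in via a bulk load that does not fire triggers, so fast load would likely need to be switched off.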


 
LVL 1

Expert Comment

by:tanakmot
ID: 11723792
Depending on the type of task you're using, and if you use the Transform Data task, you can set an error file that lists the records causing the problem.
I find this option quite handy once you start getting integrity violations or data type conversion errors. You can then take appropriate action, as mentioned earlier, to fix the problem.

If you are not using the Transform Data task, try setting an error log on the package itself; it might give you a slightly more accurate message.

MT-
