
Solved

DTS Duplicate Key Error

Posted on 2004-08-02
Medium Priority
763 Views
Last Modified: 2013-11-30
I have a DTS package that updates an Item Master table from Oracle to SQL 2000. I'm looking to capture the updates to the Inventory Master in SQL 2000. My DTS package is just copying the tables. I'm getting the following error: Error at destination at row number 35465. Errors encountered so far in this task: 1. Duplicate key was ignored.

Thanks
Question by:mtnbuzz
3 Comments
 
LVL 18

Expert Comment

by:SjoerdVerweij
ID: 11697192
Change the package to load a table that is identical to the Inventory Master, with one change: do not put a primary key on it (include the field, but don't make it the primary key). Put the data from the original table in it, then execute the package. Then run the following:

SELECT Nameofpkfield
FROM InventoryMasterTable
GROUP BY Nameofpkfield
HAVING COUNT(*) > 1

That will show you the duplicates.
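To pull the full rows behind those keys, not just the key values, you can join the table back to that same query. A minimal sketch, reusing the placeholder names above:

-- List every row that shares a duplicated key value
SELECT t.*
FROM InventoryMasterTable t
JOIN (SELECT Nameofpkfield
      FROM InventoryMasterTable
      GROUP BY Nameofpkfield
      HAVING COUNT(*) > 1) dupes
  ON t.Nameofpkfield = dupes.Nameofpkfield
ORDER BY t.Nameofpkfield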
 
LVL 17

Accepted Solution

by: BillAn1 (earned 2000 total points)
ID: 11720925
Before you run the DTS, are you truncating/deleting the SQL Server table?
If not, you will get duplicate-key errors the second time you run the DTS. If this is the case, then you need to either dump the contents before inserting, or only insert records that are not already there - for example:
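A minimal sketch of the "only insert records not already there" approach, assuming a target table InventoryMaster keyed on ItemID and a keyless staging copy InventoryMasterStaging (all of these names, including the Description column, are hypothetical; substitute your own):

-- Insert only rows whose key is not already in the target.
-- Assumes the staging table itself has no duplicate keys;
-- if it does, dedupe it first (see option 1 below).
INSERT INTO InventoryMaster (ItemID, Description)
SELECT s.ItemID, s.Description
FROM InventoryMasterStaging s
WHERE NOT EXISTS
      (SELECT 1
       FROM InventoryMaster m
       WHERE m.ItemID = s.ItemID)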

Several things to do here if there are duplicates in the source:
1. Modify the DTS to select only one record per key -
modify the SQL to be a SELECT DISTINCT (with maybe MAX() on the other columns to ensure only one record per PK - it depends on how you want to react to two records with the same 'PK' but different other columns; see the first sketch after this list).
2. Modify the DTS to increase the number of errors allowed. Then have it log errors to an error file, which you can then analyse to determine the problem.
3. A trigger on the target table could be used to ignore duplicates, or records which are already in the target (second sketch after this list). This would handle either case: where the source has duplicates, or where you have already inserted the records. You can also have the trigger update any records which are already there.
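For option 1, a sketch of the GROUP BY / MAX() variant as the DTS source query; ITEM_MASTER, ITEM_ID, DESCRIPTION and QTY_ON_HAND are hypothetical Oracle-side names:

-- One row per key: group on the 'PK' and take MAX() of every other column.
SELECT ITEM_ID,
       MAX(DESCRIPTION) AS DESCRIPTION,
       MAX(QTY_ON_HAND) AS QTY_ON_HAND
FROM ITEM_MASTER
GROUP BY ITEM_ID

And for option 3, a sketch of an INSTEAD OF INSERT trigger on the SQL Server target that silently drops rows whose key is already present (again, hypothetical names):

CREATE TRIGGER trg_InventoryMaster_IgnoreDupes
ON InventoryMaster
INSTEAD OF INSERT
AS
    -- Re-issue the insert, keeping only rows whose key is not already in the table.
    INSERT INTO InventoryMaster (ItemID, Description)
    SELECT i.ItemID, i.Description
    FROM inserted i
    WHERE NOT EXISTS
          (SELECT 1
           FROM InventoryMaster m
           WHERE m.ItemID = i.ItemID)

One caveat: if the Transform Data task uses the "fast load" (bulk copy) option, the inserts bypass triggers, so fast load would have to be turned off for the trigger to fire.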


 
LVL 1

Expert Comment

by:tanakmot
ID: 11723792
Depending on the type of task you're using - if you use the Transform Data task - you can set an error file that lists which records are causing the problem.
I find this option quite handy when you start getting integrity violations or data type conversion errors. You can then take the appropriate actions, as mentioned earlier, to fix the problem.

If you're not using the Transform Data task, try setting an error log on the package itself; it might give a slightly more accurate message.

MT-
