Goosie1 (United States)

asked on

DTS Error Logging

Recently one of our scheduled DTS packages failed because of bad data. The error logging is extremely generic, and it took me quite a while to find the specific row containing the bad data. By rerunning the step I could only determine that the problem occurred after the 39,000th record and before the 40,000th record. I removed that record, and everything ran fine.
My question is: is there a way to make the error logging more specific?
Do you know of any better way to identify the record causing the failure? If that's not possible, can the record-number display interval be set to something less than 1000? An interval of 100 would make it easier to narrow down an offending record; 50 or 25 would be even better.

Thanks in advance.
ASKER CERTIFIED SOLUTION
lcohan (Canada)
Goosie1 (ASKER)
First let me say that I am very new to DTS. I am using SQL 2000 DTS packages and I have logging enabled. My problem is that the logging does not say which data caused the error. Your suggestion to run a data validation step first is a good one, but I am unsure how to do this. Can you advise?

lcohan:
You do have a query that you can run against the source table(s) to find the bad data, right? You deleted the bad records from the source table and your import then worked, so take that same query and put it in an IF EXISTS statement. If bad data does exist, i.e. the IF EXISTS evaluates to true, you can generate an error right there so the package fails. I did this (cheap, I know) but it works perfectly in SQL 2000: I create a temp table with a primary key and force duplicate values, which generates an error. Since this is tied to the IF EXISTS step, I can also use RAISERROR to change the message from a PK violation to whatever you want, and that message is logged as well. Something like the code below:

declare @msg varchar(400)
create table #test (col1 int primary key)

-- Test whether bad data exists (replace the condition with your own query)
if exists (select * from source_data where bla = bla)
begin
    -- Force a PK violation so the step fails
    insert into #test values (1)
    insert into #test values (1)

    -- Raise a descriptive error that will appear in the log
    set @msg = 'Operation Failed!!!' + char(13) + 'Please check Data Source table'
    RAISERROR (@msg, 16, 1) with log
end
drop table #test
go
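
As a complement, here is a hedged sketch of a variant (source_data and the bla = bla condition are placeholders carried over from the snippet above): rather than hunting for the bad record by record number, the same validation query can return the offending rows directly, and their count can be embedded in the RAISERROR message.

```sql
-- Hypothetical variant: table and column names are placeholders.
-- Run as an Execute SQL task before the transform step.
declare @bad int, @msg varchar(400)

select @bad = count(*) from source_data where bla = bla

if @bad > 0
begin
    -- List the offending rows themselves so they appear in the step output
    select * from source_data where bla = bla

    -- Severity 16 fails the step; WITH LOG writes to the SQL Server error log
    set @msg = 'Validation failed: ' + cast(@bad as varchar(10))
             + ' bad row(s) in source_data'
    RAISERROR (@msg, 16, 1) with log
end
```

This avoids the temp-table PK trick entirely, since a RAISERROR at severity 16 is enough on its own to fail the step.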


Goosie1 (ASKER)
I found that the logging in the data pump task is exactly what I was looking for. It prints out the data that causes the step to fail. Thanks very much for leading me to this!