RLLewis

asked on

DTS phases filter - informational question

I've got a DTS job that functions just fine.  I'd like, however, to improve the error handling -- how many rows were inserted, which row failed if there was a failure, etc.  I've enabled the 'Show multi-phase pump in DTS designer' property, and now, on the Transformations tab within the package itself, I notice several new phase filters in addition to the 'Row Transform' I already had.  The new ones, of course, have no source/destination mappings -- I assume I'd need to create those.  All I'm wondering is whether doing that will alter or impact the existing Row Transform in any way.  Can anybody give me a little more information on the right way to improve the error handling as described above?
ASKER CERTIFIED SOLUTION
ispaleny

And for error handling of rejected data and rowcount, you can use built-in logging.
See Transform Data Task - Options - Exception file.  But note it doesn't work for fast load (bulk insert).
RLLewis

ASKER

I do the exception file on everything by default.  Right now I am reading the SQLServerCentral piece -- very good thus far.  My worry is still the same, though: if I do this, am I going to break my presently-functioning package?
RLLewis

ASKER

And let me elaborate -- the package that is 'working fine' loads tons of files on a scheduled basis.  Yesterday, for example, it walked through 877 files fine; fifteen of them failed, though, logging that it can't find Col079.  That is the only information I have.  I'm just trying to get far better detail on why it fails every so often on certain files.
Just a personal opinion here: unless you start writing quite a bit of ActiveX script in DTS, it's really tough to get a "nice picture" of the errors.  I've often found it easier to stage the flat files first and then load them table to table, with DTS or simply with SQL statements.  The table-to-table option is nice because you can flag errors on your input table.

Brett
RLLewis,
in the log you have an error row number; it is written to the Log.Source file as well.  If it fails without any logging, then the file isn't in the expected format.
RLLewis

ASKER

I'm not sure what you mean by that, arbert -- staging them?  I am loading them into a working table first, and then into the 'real' table.  It's the load into the working table that fails.  Can you explain what you mean?
RLLewis

ASKER

Here's my failure in the log -- where is this error row number?

The execution of the following DTS Package failed:

Error Source: Microsoft Data Transformation Services (DTS) Package
Error Description:Package failed because Step 'DTSStep_DTSDataPumpTask_1' failed.
Error code: 80040428
Error Help File:sqldts80.hlp
Error Help Context ID:700

Package Name: CAX Load Historical
Package Description: Looping, Importing and Archiving
Package ID: {8461F39E-197F-41F7-9A20-31F3724F1162}
Package Version: {3636872D-1B59-48FD-9CA6-EE47285E4CFD}
Package Execution Lineage: {F020A3B3-7CC0-414A-BD3D-146F9DC8D42F}
Executed On: 20070RLEWIS
Executed By: rlewis
Execution Started: 4/29/2005 8:10:30 AM
Execution Completed: 4/29/2005 8:10:30 AM
Total Execution Time: 0.422 seconds

Package Steps execution information:

Step 'DTSStep_DTSExecuteSQLTask_1' succeeded
Step Execution Started: 4/29/2005 8:10:30 AM
Step Execution Completed: 4/29/2005 8:10:30 AM
Total Step Execution Time: 0.016 seconds
Progress count in Step: 0

Step 'DTSStep_DTSActiveScriptTask_1' succeeded
Step Execution Started: 4/29/2005 8:10:30 AM
Step Execution Completed: 4/29/2005 8:10:30 AM
Total Step Execution Time: 0.016 seconds
Progress count in Step: 0

Step 'DTSStep_DTSActiveScriptTask_2' was not executed

Step 'DTSStep_DTSActiveScriptTask_3' succeeded
Step Execution Started: 4/29/2005 8:10:30 AM
Step Execution Completed: 4/29/2005 8:10:30 AM
Total Step Execution Time: 0.015 seconds
Progress count in Step: 0

Step 'DTSStep_DTSDataPumpTask_1' failed

Step Error Source: Microsoft Data Transformation Services (DTS) Data Pump
Step Error Description:Column name 'Col079' was not found.
Step Error code: 80042008
Step Error Help File:sqldts80.hlp
Step Error Help Context ID:0

Step Execution Started: 4/29/2005 8:10:30 AM
Step Execution Completed: 4/29/2005 8:10:30 AM
Total Step Execution Time: 0.359 seconds
Progress count in Step: 0

Step 'DTSStep_DTSActiveScriptTask_4' was not executed
Step 'DTSStep_DTSActiveScriptTask_5' was not executed
Oh, OK -- from the above, I figured you were going from a file directly into your table.  I'm just saying there are probably easier ways to go about error checking than with DTS.  You could run cursory checks of the data before you attempt to load it (make sure numbers are numbers, lengths are appropriate, etc.) and set error flags before you actually do your load.
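To make the "cursory checks before the load" idea concrete, here is a minimal sketch in Python (not the package's ActiveX/VBScript; the delimiter, column count, and function name are assumptions for illustration, not part of the original package):

```python
import csv

def precheck(path, expected_cols=79, delimiter=","):
    """Flag rows that would break the DTS load before the package
    ever sees the file -- here, a wrong column count per row.
    Returns a list of (line_number, message) tuples."""
    errors = []
    with open(path, newline="") as f:
        for lineno, row in enumerate(csv.reader(f, delimiter=delimiter), start=1):
            if len(row) != expected_cols:
                errors.append((lineno, f"expected {expected_cols} columns, got {len(row)}"))
    return errors
```

Running this over each incoming file and quarantining anything with a non-empty error list would give exactly the per-row detail the package log lacks.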

The current DTS handles logging well, but it doesn't really give you many options for what to do with the records when something happens (all of this is better in 2005).
But this log is not the detailed Transform Data Task - Options - Exception file log; it is the overall DTS package log.
RLLewis

ASKER

I think I figured it out -- the files that fail on Col079 actually have only 78 columns.  Huge problem there: I have no idea how to load the data into the same table when column 79 isn't found.  Do you guys know?
You can create a new On failure branch with a different DTS transform task.
RLLewis

ASKER

I'm not sure how to do that.
In DTS, there are 4 dependency types:
On success
On failure
On completion
ActiveX

You create this DTS schema:
Text file connection
|
|Normal transformation
|
Database connection
|
|On failure dependency
|->
Text file connection
|
|Normal transformation with one column less
|
Database connection
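The same routing could also be decided up front, before DTS runs, by sniffing the file's column count and dispatching it to the matching branch. A minimal sketch in Python (illustrative only -- the delimiter, the branch labels, and the function name are assumptions, not anything in the package):

```python
def pick_branch(path, delimiter=","):
    """Peek at the first line and route the file: 79 columns goes
    to the normal transformation, 78 to the fallback with one
    column less, and anything else is rejected for manual review."""
    with open(path) as f:
        first = f.readline().rstrip("\r\n")
    ncols = len(first.split(delimiter)) if first else 0
    if ncols == 79:
        return "normal"
    if ncols == 78:
        return "one column less"
    return "reject"
```

An ActiveX step in the package (or an external scheduler script) could call logic like this to move each file into the folder watched by the appropriate transform, instead of waiting for the first branch to fail.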
RLLewis

ASKER

Oh hell, I knew that.  Excuse my blonde moment.  And yes, you're right -- that is the right way to work around this.