Solved

SSIS .dtsx package execution errors

Posted on 2014-03-28
464 Views
Last Modified: 2016-02-10
I have created an SSIS .dtsx package, and everything seemed fine until I executed the package. Below is an excerpt from the "Execution Results" tab. Could somebody tell me why this is blowing up and what I can do about it?
Thanks in advance.

[SSIS.Pipeline] Information: Execute phase is beginning.

error msg - [Source - DXX_ENCOMPASS_OND_MAIL_140327_txt [1]] Error: Data conversion failed. The data conversion for column "Column 94" returned status value 4 and status text "Text was truncated or one or more characters had no match in the target code page.".

error msg - [Source - DXX_ENCOMPASS_OND_MAIL_140327_txt [1]] Error: The "output column "Column 94" (386)" failed because truncation occurred, and the truncation row disposition on "output column "Column 94" (386)" specifies failure on truncation. A truncation error occurred on the specified object of the specified component.

error msg - [Source - DXX_ENCOMPASS_OND_MAIL_140327_txt [1]] Error: An error occurred while processing file "C:\Documents and Settings\sritsick\My Documents\jsr\Erie_Master\test\Scott\DXX_ENCOMPASS_OND_MAIL_140327.txt" on data row 1.

error msg - [SSIS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on component "Source - DXX_ENCOMPASS_OND_MAIL_140327_txt" (1) returned error code 0xC0202092.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.

[SSIS.Pipeline] Information: Post Execute phase is beginning.
Question by:tesla764
6 Comments
 
LVL 29

Assisted Solution

by:becraig
becraig earned 300 total points
ID: 39962553
This seems to be a data flow issue between the source and destination.

What type of file are you importing from?

Please do a quick spot check of your mappings between source and destination (if the destination is a SQL Server table, the query sketched below can help with that).

Also try adjusting the source data types in the Advanced tab of the source editor: update the data type of any problem columns (e.g. to DT_TEXT).
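
A minimal sketch of that spot check on the destination side, assuming the destination is a SQL Server table; the table name DXX_ENCOMPASS_OND_MAIL is a placeholder for your actual target. It lists each destination column's type and length so you can compare them against the OutputColumnWidth values in the Flat File Connection Manager (flat file columns default to DT_STR with a width of 50):

    -- Placeholder table name; substitute your real destination table.
    -- A CHARACTER_MAXIMUM_LENGTH of -1 means (max).
    SELECT COLUMN_NAME,
           DATA_TYPE,
           CHARACTER_MAXIMUM_LENGTH
    FROM   INFORMATION_SCHEMA.COLUMNS
    WHERE  TABLE_NAME = 'DXX_ENCOMPASS_OND_MAIL'
    ORDER  BY ORDINAL_POSITION;

Any destination column shorter than the corresponding source column width is a candidate for the truncation error on "Column 94".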
 

Author Comment

by:tesla764
ID: 39962600
I am importing from a .txt file.
I will try your suggestions in a minute.
Thanks.
 
LVL 9

Accepted Solution

by:edtechdba
edtechdba earned 200 total points
ID: 39962601
This error appears to point to a column truncation issue: the package is trying to push a data value longer than what the receiving column can accept. Do you know which column "Column 94" is? That may point you in the right direction.

In this circumstance, I would check the mapping in the destination component of the data flow task: hover over each source column and destination column and verify that both field lengths match.

If there is a mismatch, you can right-click the source component, open the Advanced Editor, and update the output column length to match the destination length. Or, vice versa, increase your destination column length to match the source column (as sketched below).

Another option, if you are able to pinpoint the problem column, is to add a transformation between the source and destination (a Data Conversion or Derived Column transformation) and truncate the column's value before it is pushed into the destination.
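
A minimal sketch of the "increase the destination column" route, assuming a SQL Server destination; dbo.DXX_ENCOMPASS_OND_MAIL and Column94 are placeholder names for the real table and for whichever field "Column 94" maps to, and 255 is just an example length:

    -- Placeholder names and length; widen the destination column so it can
    -- hold the longest value coming from the flat file.
    ALTER TABLE dbo.DXX_ENCOMPASS_OND_MAIL
        ALTER COLUMN Column94 varchar(255) NULL;  -- keep the column's existing NULL/NOT NULL setting

If you go the truncation route instead, a Derived Column transformation with an expression such as SUBSTRING([Column 94],1,255) would cut the value down before it reaches the destination.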

I hope that helps!

 

Author Comment

by:tesla764
ID: 39962610
The input data types are DT_STR.
 

Author Comment

by:tesla764
ID: 39962613
edtechdba, I will try your suggestion in a few minutes. Thanks.
 
LVL 29

Assisted Solution

by:becraig
becraig earned 300 total points
ID: 39962620
Change the input type to DT_TEXT and try again.

For the problematic column, it might be safe to set the destination to nvarchar.
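
If you do widen the destination that way, a rough sketch of the change (again with placeholder table/column names):

    -- Placeholder names; nvarchar(max) removes the length cap on the SQL side.
    -- Note that DT_TEXT pairs with the non-Unicode varchar(max)/text types,
    -- while DT_NTEXT is the Unicode counterpart of nvarchar(max).
    ALTER TABLE dbo.DXX_ENCOMPASS_OND_MAIL
        ALTER COLUMN Column94 nvarchar(max) NULL;  -- keep the existing NULL/NOT NULL setting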
