
Solved

SSIS Conversion Error

Posted on 2013-01-18
Medium Priority
2,396 Views
Last Modified: 2016-02-10
Hello

This is in reference to this previous question
http://www.experts-exchange.com/Microsoft/Development/MS-SQL-Server/Q_28000926.html



Due to some conversion issue, only 12 rows are reaching the destination instead of 43. Why?

This is the output I am getting:
SSIS package "Package.dtsx" starting.
Information: 0x4004300A at Data Flow Task, SSIS.Pipeline: Validation phase is beginning.
Warning: 0x800470C8 at Data Flow Task, OLE DB Destination [198]: The external columns for component "OLE DB Destination" (198) are out of synchronization with the data source columns. The external column "OrderDate" needs to be updated.
Information: 0x4004300A at Data Flow Task, SSIS.Pipeline: Validation phase is beginning.
Warning: 0x800470C8 at Data Flow Task, OLE DB Destination [198]: The external columns for component "OLE DB Destination" (198) are out of synchronization with the data source columns. The external column "OrderDate" needs to be updated.
Warning: 0x80049304 at Data Flow Task, SSIS.Pipeline: Warning: Could not open global shared memory to communicate with performance DLL; data flow performance counters are not available.  To resolve, run this package as an administrator, or on the system's console.
Information: 0x40043006 at Data Flow Task, SSIS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at Data Flow Task, SSIS.Pipeline: Pre-Execute phase is beginning.
Information: 0x402090DC at Data Flow Task, Flat File Source [161]: The processing of file "C:\incomingCSVFolder\testIncoming.csv" has started.
Information: 0x4004300C at Data Flow Task, SSIS.Pipeline: Execute phase is beginning.
Information: 0x402090DE at Data Flow Task, Flat File Source [161]: The total number of data rows processed for file "C:\incomingCSVFolder\testIncoming.csv" is 44.
Error: 0xC0202009 at Data Flow Task, OLE DB Destination [198]: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80004005.
An OLE DB record is available.  Source: "Microsoft SQL Server Native Client 10.0"  Hresult: 0x80004005  Description: "Invalid character value for cast specification".
Error: 0xC020901C at Data Flow Task, OLE DB Destination [198]: There was an error with input column "Copy of Total" (778) on input "OLE DB Destination Input" (211). The column status returned was: "The value could not be converted because of a potential loss of data.".
Error: 0xC0209029 at Data Flow Task, OLE DB Destination [198]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "input "OLE DB Destination Input" (211)" failed because error code 0xC0209077 occurred, and the error row disposition on "input "OLE DB Destination Input" (211)" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.
Error: 0xC0047022 at Data Flow Task, SSIS.Pipeline: SSIS Error Code DTS_E_PROCESSINPUTFAILED.  The ProcessInput method on component "OLE DB Destination" (198) failed with error code 0xC0209029 while processing input "OLE DB Destination Input" (211). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.  There may be error messages posted before this with more information about the failure.
Information: 0x40043008 at Data Flow Task, SSIS.Pipeline: Post Execute phase is beginning.
Information: 0x402090DD at Data Flow Task, Flat File Source [161]: The processing of file "C:\incomingCSVFolder\testIncoming.csv" has ended.
Information: 0x402090DF at Data Flow Task, OLE DB Destination [198]: The final commit for the data insertion in "component "OLE DB Destination" (198)" has started.
Information: 0x402090E0 at Data Flow Task, OLE DB Destination [198]: The final commit for the data insertion  in "component "OLE DB Destination" (198)" has ended.
Information: 0x4004300B at Data Flow Task, SSIS.Pipeline: "component "OLE DB Destination" (198)" wrote 12 rows.
Information: 0x40043009 at Data Flow Task, SSIS.Pipeline: Cleanup phase is beginning.
Task failed: Data Flow Task
Warning: 0x80019002 at Package: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED.  The Execution method succeeded, but the number of errors raised (4) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
SSIS package "Package.dtsx" finished: Failure.
Question by: Rayne
 

Author Comment

by:Rayne
ID: 38795232
The table is created like this:

CREATE TABLE [dbo].[TestCSVUpload](

      [OrderDate] [datetime] NULL,
      [Region] [nvarchar](255) NULL,
      [Rep] [nvarchar](255) NULL,
      [Item] [nvarchar](255) NULL,
      [Units] [numeric](18, 0) NULL,
      [Cost] [numeric](18, 0) NULL,
      [Total] [numeric](18, 0) NULL

) ON [PRIMARY]

GO
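
As a quick sanity check on what the package is validating against, the destination column types can be listed like this (a minimal sketch; it simply queries the standard INFORMATION_SCHEMA views for the table defined above):

-- List name, type, precision and scale of every column in the destination table
SELECT COLUMN_NAME, DATA_TYPE, NUMERIC_PRECISION, NUMERIC_SCALE
FROM INFORMATION_SCHEMA.COLUMNS
WHERE TABLE_SCHEMA = 'dbo' AND TABLE_NAME = 'TestCSVUpload'
ORDER BY ORDINAL_POSITION;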
 

Author Comment

by:Rayne
ID: 38795702
Any Experts are welcome to answer this question :)
 

Accepted Solution

by: Rainer Jeschor (earned 2000 total points)
ID: 38796397
Hi,
after having a look at the CSV as well as at your output and your table definition, I would say you are getting the errors because your Total, Cost and Units columns are defined as numeric(18,0) while the values in your import file have decimal places ;-)
Hence the error.
I also think the "invalid character" errors come from a flat file source that is not configured properly.

If you could attach the package, the errors could be tracked down more easily.

HTH
Rainer
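
If the file really does carry decimal values, a revised destination definition along these lines should let all rows through (a sketch only; the precision and scale of 18,2 are an assumption to be adjusted to the actual data):

-- Recreate (or ALTER) the amount columns with a non-zero scale so decimal
-- values from the CSV are not flagged as a potential loss of data
CREATE TABLE [dbo].[TestCSVUpload](
      [OrderDate] [datetime] NULL,
      [Region] [nvarchar](255) NULL,
      [Rep] [nvarchar](255) NULL,
      [Item] [nvarchar](255) NULL,
      [Units] [numeric](18, 0) NULL,   -- keep zero scale only if Units is truly a whole-number count
      [Cost] [numeric](18, 2) NULL,    -- scale of 2 is an assumption
      [Total] [numeric](18, 2) NULL
) ON [PRIMARY]
GO

It would also be worth checking the flat file source: on the Advanced page of the Flat File Connection Manager, give OrderDate a date data type and the amount columns a decimal type, then reopen the OLE DB Destination mappings so the "external columns are out of synchronization" warning is cleared.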
 

Author Comment

by:Rayne
ID: 38796907
Perfect Rainer :)

You were spot on!! Thank you for the pointer. Greatly appreciated.

R
