Solved

Problem with simple SSIS package

Posted on 2013-07-01
337 Views
Last Modified: 2016-02-11
I am trying to copy rows from a SQL query into an existing Excel file.
The errors I am getting are:

[Excel Destination [1381]] Error: SSIS Error Code DTS_E_OLEDBERROR.  An OLE DB error has occurred. Error code: 0x80040E21.

[Excel Destination [1381]] Error: Cannot create an OLE DB accessor. Verify that the column metadata is valid.

The T-SQL query returns several columns - most are actual table columns and some are empty strings aliased so that there is a like-for-like match to the spreadsheet.
When I look at the column mappings everything looks perfect, but I just can't get it to work.
I must be missing something simple. Looking at the metadata for the data flow (shown in the image), I wonder if there is a mismatch in data types, but I can't see a way to change them.

metadata.png
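
To illustrate the shape of the query (with placeholder table and column names, not the real ones):

-- Placeholder names only, not the actual query.
-- Real columns are selected as-is; the filler columns are empty-string
-- literals aliased to give a like-for-like match with the spreadsheet headers.
SELECT  OrderID,
        CustomerName,
        Notes,
        '' AS SpareColumn1,   -- a bare '' literal is typed as varchar(1)
        '' AS SpareColumn2
FROM    dbo.SourceTable;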
Question by:QPR
5 Comments
 
LVL 9

Expert Comment

by:edtechdba
ID: 39292003
You may want to check the mapping properties on your Excel destination in the data flow task; hover over the input and output fields and make sure the data types match.

I find that I often need to use a Data Conversion task to change the data types so they match.
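
If it helps to see the SQL side in one place, a catalog query along these lines (the table name is a placeholder) lists each source column's type and length, which you can compare against the Excel destination's external columns in the Advanced Editor. It only covers the base table, not any literals added in the query itself:

SELECT  c.name        AS column_name,
        t.name        AS sql_type,
        c.max_length,
        c.is_nullable
FROM    sys.columns AS c
        JOIN sys.types AS t ON t.user_type_id = c.user_type_id
WHERE   c.object_id = OBJECT_ID(N'dbo.SourceTable')   -- placeholder table name
ORDER BY c.column_id;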
 
LVL 29

Author Comment

by:QPR
ID: 39292037
They do not match:
int > string
ntext > string

Is this what would cause those cryptic errors?

I think you may be right about the conversion task. In fact, the more I look at what I need to achieve, the more I see that it might not be a simple data pump task.
For instance, I need to recreate the spreadsheet each time, so I may use a template with the headers already in place.
 
LVL 9

Accepted Solution

by:edtechdba (earned 500 total points)
ID: 39292049
If you use a Data Conversion task in between the data source and the Excel destination, you should be able to get the data types to match; then map the converted column values to the destination columns.

I like to use a File System Task to copy my "template" file (a blank Excel file with the proper headers) into the destination file location, then run the data import process in the data flow task. You can set the copy to overwrite the destination file so that it acts as a prep step prior to every run. That way you know you have a fully refreshed file every run.
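
As an alternative (or a complement) to the Data Conversion task, the conversions can also be done in the source query itself so the columns already arrive as Unicode strings (DT_WSTR), which is what the Excel destination's text columns expect. A rough sketch with made-up column names, based on the int and ntext mismatches mentioned above:

-- Made-up table and column names; the point is casting the int and ntext
-- columns to nvarchar so SSIS sees them as DT_WSTR at the source.
SELECT  CAST(OrderID AS nvarchar(10))   AS OrderID,       -- int   -> string
        CAST(Notes   AS nvarchar(255))  AS Notes,         -- ntext -> string
        CustomerName,                                     -- already a string
        CAST(N''     AS nvarchar(255))  AS SpareColumn1   -- typed filler column
FROM    dbo.SourceTable;

Either way, the goal is the same: the SSIS type and length of each incoming column needs to line up with the corresponding external column on the Excel destination.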
 
LVL 29

Author Closing Comment

by:QPR
ID: 39292061
Great, thanks - no doubt I'll be back with more questions.
 
LVL 9

Expert Comment

by:edtechdba
ID: 39292065
Good luck!
