conardb

asked on

How can I code a VB script 2012 component in SSIS 2012 to parse a flat file connection into fields to insert into SQL Server?

Jim Horn

Explain why you need to do this, as opposed to a straight data pump with flat file source and SQL Server/ADO connection.
conardb (ASKER)

I'm using a fixed-length file; when I've parsed it in the connection/flat file source, special characters cause the fields to misalign. The only way I've been able to resolve it is to read the fixed-length record as a single column sourced as UTF-8, convert it to ANSI, then parse it into a separate-column table using a Derived Column transform. Rather than use derived columns, I'm looking to use a script to compare performance and reduce steps.
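The UTF-8-to-ANSI conversion step described above can also be done before the data flow ever reads the file, for instance in a Script Task. A minimal VB.NET sketch, assuming the file paths are package variables and that "ANSI" means Windows code page 1252 (both are assumptions, not from the thread):

```vb
' Sketch only: re-encode a UTF-8 source file as ANSI (code page 1252)
' so the flat file source sees single-byte characters.
' Paths and the 1252 code page are assumptions for illustration.
Imports System.IO
Imports System.Text

Public Module ReEncode
    Public Sub ConvertUtf8ToAnsi(ByVal sourcePath As String, ByVal targetPath As String)
        ' Read the whole file, decoding as UTF-8...
        Dim text As String = File.ReadAllText(sourcePath, Encoding.UTF8)
        ' ...and write it back using Windows-1252. Characters with no
        ' 1252 mapping are replaced with "?" by the default encoder fallback.
        File.WriteAllText(targetPath, text, Encoding.GetEncoding(1252))
    End Sub
End Module
```

Re-encoding up front keeps the byte offsets of a fixed-width record stable, which is what the misalignment usually comes from: multi-byte UTF-8 characters shifting every field after them.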
conardb (ASKER)

I specified the synchronous input ID equal to the input column [Input], but it writes all blank records.
What do you mean by "flat file connection"? Some example of data in the file would help.
conardb (ASKER)

A connection to a text file is a flat file connection in SSIS.
The connection is to a fixed-length record text file.
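For the script-component route being asked about, the usual pattern is a synchronous transformation whose `Input0_ProcessInputRow` slices the single wide input column with `Substring` and assigns every output column on each row; leaving output columns unassigned is one common cause of the all-blank output mentioned earlier. A minimal sketch, where the input column name, output column names, and field offsets are all hypothetical:

```vb
' SSIS Script Component (Transformation) with a synchronous output.
' Assumes an input column "LineText" holding the whole fixed-length record,
' and output columns CustomerName / OrderDate added on the synchronous output.
' Names, positions, and widths are assumptions for illustration.
Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
    If Not Row.LineText_IsNull Then
        ' Pad so short records don't throw on Substring.
        Dim line As String = Row.LineText.PadRight(30)
        ' Fixed-width slices: chars 0-19 and 20-29 in this example.
        Row.CustomerName = line.Substring(0, 20).Trim()
        Row.OrderDate = line.Substring(20, 10).Trim()
    End If
End Sub
```

Because the output is synchronous, no `AddRow` call is needed; the assignments write into the same buffer row that flows on to the destination.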
ASKER CERTIFIED SOLUTION
Jim Horn

conardb (ASKER)

Thanks, I have inserted into a "unirecord" staging table. The issue seemed to be related to ANSI characters/encoding: the source text file is UTF-8, and once in the UTF-8 "unirecord" staging table it continued to parse incorrectly using the Derived Column transform. Converting to a second, ANSI "unirecord" staging table seemed to enable correct parsing and keep the special characters that had been problematic (superscript 1/2").

I used the Derived Column transform to parse and cast data types, but have read this is actually not best practice (https://blogs.msdn.microsoft.com/sqlcat/2013/09/16/top-10-sql-server-integration-services-best-practices/), especially when seeking to use parallelism for in-memory/high-throughput processing. A best practice indicated there is, as I think you suggest, to use stored procedures to parse and convert data types.

I started to look into doing that, but using the transforms seemed faster and more intuitive initially.

I'll put in a separate question regarding SSIS processing of: staging unirecord - stored procedure - parsed column destination. I'm particularly interested in running SQL queries generating some basic stats on the data loaded (e.g. # of missing values, counts by column, etc.), emailing after the load, and truncating tables from within the SSIS package/script.
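Two of those follow-up items (post-load stats and truncation) can be driven from a Script Task with plain ADO.NET; emailing is typically a Send Mail task or Database Mail instead. A rough VB.NET sketch, where the connection string, table names, and the `CustomerName` column are all assumptions:

```vb
' Sketch of a Script Task body: count rows and missing values after a load,
' then truncate the staging table. Connection string and all table/column
' names (dbo.StagingParsed, dbo.StagingUnirecord, CustomerName) are assumed.
Imports System.Data.SqlClient

Public Module LoadStats
    Public Sub ReportAndTruncate(ByVal connString As String)
        Using conn As New SqlConnection(connString)
            conn.Open()
            ' Basic stats: total rows plus NULL count for one example column.
            Dim statsSql As String =
                "SELECT COUNT(*) AS TotalRows, " &
                "SUM(CASE WHEN CustomerName IS NULL THEN 1 ELSE 0 END) AS MissingNames " &
                "FROM dbo.StagingParsed;"
            Using cmd As New SqlCommand(statsSql, conn)
                Using rdr As SqlDataReader = cmd.ExecuteReader()
                    If rdr.Read() Then
                        Console.WriteLine("Rows: {0}, missing names: {1}", rdr(0), rdr(1))
                    End If
                End Using
            End Using
            ' Clear the staging table once the stats have been captured.
            Using cmd As New SqlCommand("TRUNCATE TABLE dbo.StagingUnirecord;", conn)
                cmd.ExecuteNonQuery()
            End Using
        End Using
    End Sub
End Module
```

The same counts could equally live in the stored procedure mentioned above; the Script Task version is only useful if you want the numbers available as package variables (e.g. for the email body).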