• Status: Solved
  • Priority: Medium
  • Security: Public

MS SQL Integration Services import from flat file

Hi, I am new to SSIS and need some guidelines from the experts to achieve the following:
-      Read a flat file from an FTP account, line by line.
-      Read certain values from each line; the values are at fixed positions, e.g.:
-      Value 1: start position 2 to 10,
-      Value 2: start position 15 to 20,
-      Value 3: start position 34 to 40, and so on.
Then pass the values to execute a stored procedure with parameters.
Repeat the execution of the SP for each line, each time with the new parameter values read from the file, until the end of the file.
The task should run every day automatically.
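The position ranges above map directly to T-SQL `SUBSTRING` calls. As a sketch only (the table and column names here are placeholders, and note that `SUBSTRING` takes a start position and a *length*, so positions 2 to 10 become start 2, length 9):

```sql
-- Sketch: extract the fixed-position values from a raw line held in a
-- hypothetical single-column staging table dbo.RawImport (RawLine varchar(255)).
SELECT
    SUBSTRING(RawLine, 2, 9)  AS Value1,  -- positions 2  to 10 (length 9)
    SUBSTRING(RawLine, 15, 6) AS Value2,  -- positions 15 to 20 (length 6)
    SUBSTRING(RawLine, 34, 7) AS Value3   -- positions 34 to 40 (length 7)
FROM dbo.RawImport;
```

In SSIS itself, the same extraction is usually done either by defining fixed-width columns on the Flat File connection or with equivalent `SUBSTRING` expressions in a Derived Column transformation.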
Can anyone give me some guidelines so I can do this easily and quickly?
Thank you for your help.

2 Solutions
I would parse the flat file into a table. You can use SSIS to import a fixed-length field file to do this.

After that, run a query through a cursor, calling the SP for each row while looping through the table created in step 1.
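The cursor loop described above might look like this. This is a sketch under assumptions: the staging table `dbo.ImportStaging` (loaded by the SSIS fixed-width import) and the stored procedure `dbo.usp_ProcessLine` are hypothetical names, and the column widths match the positions given in the question.

```sql
-- Sketch: call the stored procedure once per imported row.
-- dbo.ImportStaging and dbo.usp_ProcessLine are assumed/hypothetical names.
DECLARE @v1 varchar(9),   -- positions 2  to 10
        @v2 varchar(6),   -- positions 15 to 20
        @v3 varchar(7);   -- positions 34 to 40

DECLARE line_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT Value1, Value2, Value3
    FROM dbo.ImportStaging;

OPEN line_cursor;
FETCH NEXT FROM line_cursor INTO @v1, @v2, @v3;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- One SP execution per line, with that line's values as parameters.
    EXEC dbo.usp_ProcessLine @p1 = @v1, @p2 = @v2, @p3 = @v3;
    FETCH NEXT FROM line_cursor INTO @v1, @v2, @v3;
END

CLOSE line_cursor;
DEALLOCATE line_cursor;
```

As the second answer below-the-fold notes, row-by-row processing like this can be slow for large files; a set-based alternative is usually preferable when the SP logic allows it.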
Firing a stored procedure which updates rows line by line can take a long time, depending on complexity.

I've done the following:

1. Create an SSIS package which has a Script Task. There are examples online showing how to do this. Have the Script Task use variables at the package level; this allows you to control the file names and such via an XML file. The only thing this FTP task should do is get the file from the FTP server and drop it into a directory for step 2. Set this up as a job for SQL Server Agent. Use a package configuration for the variables (file name, user name, password, etc.); this will allow you to change them without having to redeploy the package.

2. Create an SSIS package which picks up the file(s). A Data Flow task should import the file, do all the cleanup, transform your data into fields, and insert the values into a table. The Derived Column editor will allow you to extract the substrings of the line(s) into each destination field. Remember to convert your fields as well (String becomes BigInt, etc.) prior to inserting into your table(s). Use a SQL Server Destination to insert the values into your table(s).

If you have multiple tables that the data is bound for, split the data off for those table destinations and use a Union to exclude the fields you don't want (that way you don't get warnings about unused fields), then insert into your table(s).

Unless you need some sort of special logic in your stored procedure, and all you're doing is dumping the data into the table(s), you won't need a stored procedure at all. If you STILL need to perform some special logic and you think SSIS can't handle it, put that logic into an INSERT trigger on the table -- that way each time SSIS commits your rows, you're committing 10,000 rows at a time (much faster), rather than processing one row at a time (much slower).
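The INSERT-trigger approach mentioned above works because a trigger fires once per batch, and the `inserted` pseudo-table then holds every row SSIS committed in that batch. A minimal sketch, in which every table, column, and the derived-value logic are assumptions for illustration:

```sql
-- Sketch: set-based AFTER INSERT trigger on the (hypothetical) staging table.
-- Each SSIS batch commit is processed as one set, not row by row.
CREATE TRIGGER trg_ImportStaging_Insert
ON dbo.ImportStaging
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;

    -- "inserted" contains all rows from this batch; apply the special
    -- logic to the whole set in a single statement.
    INSERT INTO dbo.TargetTable (KeyColumn, Amount)
    SELECT i.Value1,
           CAST(i.Value2 AS bigint) + CAST(i.Value3 AS bigint)  -- placeholder logic
    FROM inserted AS i;
END
```

The key design point is that the body operates on `inserted` as a set; a trigger written with a cursor inside would throw away the batching advantage.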

SSIS can handle virtually any type of task you can put to it -- even data rules and / or business rules. Consider doing that for your imports -- I've done it, and it works nicely.
rkanabus (Author) commented:
Hi, Thank you for explanations, it is really helpful.

As suggested, I have started importing the file into a SQL Server Destination, but each time I test and execute the package I get an error:

It stops on the SQL Server Destination.

Can anyone tell me why?

Question has a verified solution.
