Solved

SSIS Package to Read a Directory and Import Many CSV Files into a Table

Posted on 2011-03-10
10
642 Views
Last Modified: 2013-11-10
Hi Guys

We have an application for which we need to import a batch of small CSV files (all with the same structure) into a single existing table, then delete each small CSV file, on an iterative basis.

We can do this on a file-by-file basis, but I need some help to make it iterative. The files are named EPOS_date_time.csv, so each filename is unique.

Cheers C

0
Comment
Question by:BoltonWanderer
10 Comments
 
LVL 40

Assisted Solution

by:lcohan
lcohan earned 166 total points
ID: 35100102
Here's what I would do - it takes a fair amount of dynamic SQL, but it's possible:

-- /B lists bare file names only, so the LIKE filter below will match
exec xp_cmdshell 'DIR /B C:\Folder_name\EPOS*.csv > C:\Folder_name\EPOS_list.txt'

--read that file into a SQL table like:

create table #tempfile (line varchar(8000))
bulk insert #tempfile from 'C:\Folder_name\EPOS_list.txt'
select * from #tempfile where line like 'EPOS%'
--drop table #tempfile

--Use this list to build a dynamic BULK INSERT for each CSV and import it into a permanent table the same way as above.
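To make that last step concrete: the idea is to walk the file list and generate one BULK INSERT (plus a delete) per file. A minimal sketch of that statement-building loop in Python, where the folder path, target table name and comma delimiter are assumptions rather than anything stated in the thread:

```python
# Build one BULK INSERT + delete command pair per EPOS CSV file name.
# Folder, table name and field terminator are illustrative assumptions.
def build_import_commands(file_names, folder=r"C:\Folder_name", table="dbo.EPOS_Import"):
    commands = []
    for name in file_names:
        # skip anything in the listing that is not an EPOS csv
        if not (name.startswith("EPOS") and name.lower().endswith(".csv")):
            continue
        path = folder + "\\" + name
        commands.append(
            f"BULK INSERT {table} FROM '{path}' "
            f"WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n');"
        )
        commands.append(f"EXEC xp_cmdshell 'del {path}';")
    return commands

# Example: one matching file yields an import statement and a delete.
cmds = build_import_commands(["EPOS_20110310_0900.csv", "readme.txt"])
```

In the all-T-SQL version you would do the same thing with a cursor over #tempfile, EXEC'ing each built string.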
0
 
LVL 22

Assisted Solution

by:8080_Diver
8080_Diver earned 166 total points
ID: 35101137
There are native SSIS components that will facilitate this processing.  For instance, you can set up the following:
A File connection that uses a variable to supply the fully pathed file name;
Then set up a Foreach Loop container (with a Foreach File enumerator) that drives that File Connection's variable;
Within that container, 1) set up a Data Flow Task to import the data into a staging table;
2) set up a File System Task to delete the source file;
Finally, outside of the Foreach Loop container, set up another Data Flow Task to move the staging table's data into the ultimate target table.

Handling the different filenames is not really a problem, because the Foreach Loop container can be set to use an Expression for the Directory path and for the File Spec, and those Expressions are driven by User Variables that you control.  In a Script Task, you set those variables using VB.Net (although, if the folder path is constant, that one can be set once when the variable is defined).  I generally set the File Spec to the path plus the left part of the filename plus "*", but in your case I would set it to the path plus "EPOS_*.csv".
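The Foreach-driven flow above (pick up each EPOS_*.csv, load it, then delete it) can be sketched outside SSIS to show the control flow. A minimal Python equivalent, assuming comma-delimited files; the function and parameter names here are illustrative, not part of any SSIS API:

```python
import csv
import glob
import os

def import_epos_files(folder, load_row):
    """Load every EPOS_*.csv in folder via load_row(row), then delete the file."""
    count = 0
    for path in sorted(glob.glob(os.path.join(folder, "EPOS_*.csv"))):
        with open(path, newline="") as f:
            for row in csv.reader(f):
                load_row(row)   # stand-in for the Data Flow Task insert
        os.remove(path)         # stand-in for the File System Task delete
        count += 1
    return count
```

In the real package, the per-row insert is the Data Flow Task, the delete is the File System Task, and the loop itself is the Foreach Loop container with its File Spec set to "EPOS_*.csv".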

0
 
LVL 21

Accepted Solution

by:
Jason Yousef, MS earned 168 total points
ID: 35102098
Hello,
You can use the Foreach Loop container, as 8080_Diver said.
Here's some help on it:
http://www.sqlis.com/post/Looping-over-files-with-the-Foreach-Loop.aspx

I've also attached some screenshots of what the package would look like.
0
 
LVL 21

Expert Comment

by:Alpesh Patel
ID: 35239545
Yes, huslayer is right.

Use a Foreach Loop pointed at the directory location to get each file, and insert it into the table using a Data Flow Task.
0
 
LVL 21

Expert Comment

by:Jason Yousef, MS
ID: 35286001
@BoltonWanderer,

Any luck?
Do you need more help?


Jason
0
 
LVL 21

Expert Comment

by:Jason Yousef, MS
ID: 35930274
Hey Pedro, thanks for the cleanup job, great effort, man.

I'm fine with splitting the points between @lcohan, @8080_Diver and @Huslayer; I think all three of us supplied a valid answer.

Regards,
Jason
0
 
LVL 143

Expert Comment

by:Guy Hengel [angelIII / a3]
ID: 36202668
This question has been classified as abandoned and is closed as part of the Cleanup Program. See the recommendation for more details.
0


Question has a verified solution.

