Solved

What task or combination of tasks can I use in SSIS 2015 to load files with different names into different tables in SQL Server?

Posted on 2016-09-28
Last Modified: 2016-10-17
I have a For Each Loop container using folder and filename variables.  The initial data flow task in the For Each Loop container does a conditional split on the file name (2 different file names) and inserts into a separate initial staging table for each file.  The initial data flow is followed by 2 precedence constraints, set to Expression and Constraint on the file name, which route processing to separate subsequent data flows for each file.  The data flow for file1 is reprocessing 1 extra file.  Screen print attached.
packageflow.docx
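For reference, the two precedence constraint expressions are along these lines (the variable name and the literals are placeholders here, not the exact values in the package):

    To the file1 data flow:  FINDSTRING(@[User::FileName], "File1", 1) > 0
    To the file2 data flow:  FINDSTRING(@[User::FileName], "File2", 1) > 0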
Question by: conardb
7 Comments
 

Expert Comment

by: Jim Horn
What exactly is your question here?
 

Author Comment

by: conardb
Is using a data flow task with a conditional split on the file name to process different files with different layouts a proper way to use SSIS tasks? What is your expert opinion?
 

Expert Comment

by: Megan Brooks
If it works and is fast enough, use it. If you think you might need to support additional file names in the future, consider now what design would make adding them easy.

You mention both data flow tasks (shown in the attachment) and conditional splits, which are transformations within a data flow task. What is the purpose of testing the filename at both levels?

 

Accepted Solution

by: Jim Horn (earned 250 total points)
Not exactly.  Data flows are meant for a single source definition and a single destination definition, so if your 'different layouts' implies multiple definitions, then this needs to be handled with conditional split logic and multiple data flow tasks (aka data pumps), one for each unique source/destination definition.
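In outline, that pattern looks something like this (the task and table names are illustrative only):

    For Each Loop (enumerate files in the folder)
        Precedence constraint: file name matches layout 1
            Data Flow 1: Flat File Source (layout 1) -> Staging Table 1
        Precedence constraint: file name matches layout 2
            Data Flow 2: Flat File Source (layout 2) -> Staging Table 2

One data flow per unique layout keeps each source and destination definition fixed at design time.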
 

Author Comment

by: conardb
OK, thanks... Presently, I have 1 data flow that uses conditional split logic to select, by file name, each of 2 different files/formats from the same directory and write to a different staging table for each.  Then separate data flows for each of those types parse the fixed-length, non-delimited rows into separate SQL Server tables.  It's working, but I suspected it was not best practice, as the filter by name is used redundantly in the initial data flow and then again afterward as a precedence constraint to direct processing into the separate tables.  Jim, could you provide an example or a link to an example of your approach?  Thanks again.
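(For context, the parsing step in each downstream data flow is a Derived Column transformation with expressions like the following; the column names and offsets here are made up:)

    FieldA:  SUBSTRING([RawRow], 1, 10)
    FieldB:  SUBSTRING([RawRow], 11, 8)
    FieldC:  TRIM(SUBSTRING([RawRow], 19, 25))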
 

Assisted Solution

by: Megan Brooks (earned 250 total points)
Looking back at your original attachment, it looks like you have a control flow that uses precedence constraints with expressions to select one set of sequential data flows or the other (a conditional split is a data flow transformation component, not something you would use in the control flow).

Now that I understand the terminology substitutions, the logic looks OK. I only see one set of precedence constraint expressions in the diagram, and there shouldn't be any redundancy in that. Both expressions will be evaluated, but if they are complementary then only one path will continue on for each iteration. Because you are working with constraints, not branching logic, that's what you need to do to use this design.

It might be clearer, if not easier, to run two separate For Each Loops, eliminating the constraint expressions and adjusting the file patterns accordingly; see the sketch below.
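For example (the patterns are placeholders; use whatever reliably distinguishes your two files):

    For Each Loop 1: Files = File1*.txt  -> data flows for layout 1
    For Each Loop 2: Files = File2*.txt  -> data flows for layout 2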

Is the reprocessing that you mentioned initially intended or unintended? Is there a bug to be addressed?
 

Author Closing Comment

by: conardb
Sorry I took so long. Thanks again.