Solved

SSIS Data Flow Task:  Is there a way to arrow multiple processes back to one?

Posted on 2015-01-19
159 Views
Last Modified: 2016-02-15
Hi All

I have a Data Flow task with a Lookup on two columns (id, SystemModstamp) that has three branches:
Match Not Found → Insert. Worked fine when it was just this.
... and if a match is found, I've just added a Conditional Split that tests the variable overwrite_matches:
If overwrite_matches = 'Y', delete the matching rows (TF Delete Batch in the image below), then insert.
If overwrite_matches = 'N', get a row count called 'no action' and do nothing.

Question: How can I pull off the blue-green arrow in the image below, i.e. set something like a precedence constraint (I know, they don't exist in data flow tasks) between the #2 TF Delete Batch transform and the Derived Column, so that the Derived Column does not execute until both have completed?

All that's coming to mind is a Union All / Merge / Merge Join, but I don't want the rows in the #2 TF Delete stream to feed the downstream INSERT, as those rows are already in the #1 stream.
[Image: data-flow-task-completion-question.jpg]

Thanks in advance.
Jim
Question by:Jim Horn
9 Comments
 
LVL 18

Expert Comment

by:Simon
ID: 40559197
I don't think you can. What's the error message when you try to create the dataflow link?

I'll try to mock up a similar flow in my environment later today.

Other experts may provide you a better answer in the meantime...

Or, this MSDN social blog post might help you.
 
LVL 37

Assisted Solution

by:ValentinoV
ValentinoV earned 75 total points
ID: 40559634
Hey Jim,

I'm not familiar with that "TF Delete Batch transform" you're using (the image doesn't look familiar), so I'm not 100% sure what you're asking.  Assuming your goal is to perform a delete/insert operation on the records going down that path, maybe you can use a temporary table to pull it off?  Just use an additional OLE DB Destination to write the "overwrite matches" record set to the temp table.  Then, in the control flow, add an Execute SQL Task following the data flow, containing a DELETE and an INSERT statement.  You could wrap both statements in an explicit transaction if preferred.
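A sketch of what that Execute SQL Task might contain; the temp table, target table, and column list here are illustrative, not from the actual package:

```sql
-- #MatchedRows is the temp table the extra OLE DB Destination wrote to.
-- Table and column names are assumptions for illustration.
BEGIN TRANSACTION;

DELETE t
FROM dbo.TargetTable AS t
INNER JOIN #MatchedRows AS m
    ON t.id = m.id
   AND t.SystemModstamp = m.SystemModstamp;

INSERT INTO dbo.TargetTable (id, SystemModstamp /* , other columns */)
SELECT id, SystemModstamp /* , other columns */
FROM #MatchedRows;

COMMIT TRANSACTION;
```

Wrapping both statements in one transaction means a failed INSERT rolls back the DELETE, so the target is never left missing rows.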
 
LVL 65

Author Comment

by:Jim Horn
ID: 40559642
@Simon - No error message, just can't connect the two.

@VV- Pragmatic Works Task Factory Delete Batch Transform, essentially a delete action.  Forgot to mention that.

In my design version 1.0 I had staging tables for all of these data flows, but since the client found no scenario where I would pump the data from source and never pump it to target, I deleted them.  Some of these tables had 800+ columns (not my decision), so deleting staging was a big time savings.

This issue would be a case to re-add them, as I could do an Execute SQL to 'DELETE all from target where in source', then just let the data flow do the insert.

Thinking, thinking..

 
LVL 18

Assisted Solution

by:Simon
Simon earned 50 total points
ID: 40559659
>I could do an Execute SQL to 'DELETE all from target where in source', then just let the data flow do the insert.
That sounds like a better process design!
 
LVL 15

Accepted Solution

by:
JimFive earned 375 total points
ID: 40559663
Presumably, the Delete Batch transform doesn't have any output rows because you deleted them, so what would the arrow accomplish?  Apart from that, you can use the Merge transform or the Union All transform to combine data sets.
 
LVL 65

Author Comment

by:Jim Horn
ID: 40559670
Thinking, thinking ...

Might not be a bad idea to resurrect the source-to-staging pumps, but ONLY for the two PK columns.
Then I can easily...
- Do the above Execute SQL to delete rows if I want, i.e. instead of an UPDATE testing up to 800+ columns this would be the DELETE half of a DELETE-INSERT.
- For a full load, handle the 'in target but not in source' scenario, which I'm currently not handling as this puppy is so big I'm only doing incremental loads, but occasionally something gets upgefucht requiring a full load.
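Both steps above can be sketched in T-SQL; the staging schema and table name are assumptions for illustration:

```sql
-- stg.SourceKeys holds only the two PK columns pumped from source.
-- Names here are illustrative, not from the actual package.

-- Incremental load: delete matching target rows, then let the data flow re-insert them.
DELETE t
FROM dbo.TargetTable AS t
WHERE EXISTS (
    SELECT 1
    FROM stg.SourceKeys AS s
    WHERE s.id = t.id
      AND s.SystemModstamp = t.SystemModstamp
);

-- Full load only: remove rows that are in target but no longer in source.
DELETE t
FROM dbo.TargetTable AS t
WHERE NOT EXISTS (
    SELECT 1
    FROM stg.SourceKeys AS s
    WHERE s.id = t.id
);
```

Staging only the two key columns keeps the pump cheap even on the 800+ column tables, since the wide rows never hit staging.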
 
LVL 65

Author Comment

by:Jim Horn
ID: 40559706
Just realized something based on JimFive's comment:  I don't have to join the two, as one path is the matches and the other path is the no-matches, so the rows are different.

My original thinking was that the deletes had to happen before the final inserts, but with different rows that's not an issue.

So, the final answer is...
No link (blue-green arrow in the original question) is required.
Change the Delete in the original image to a TF Update.  (Could have also been an OLE DB Command.)

Final answer
 
LVL 37

Expert Comment

by:ValentinoV
ID: 40559739
I sure hope the performance of the TF update is better than the OLE DB Command :)  Besides that: your plan sounds good!
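For context on that performance remark: the OLE DB Command transform executes its statement once per input row, while a batch-style update stages the rows and issues a single set-based statement. A sketch of the difference, with illustrative names:

```sql
-- What OLE DB Command effectively does: one parameterized call per input row.
-- UPDATE dbo.TargetTable SET Col1 = ? WHERE id = ? AND SystemModstamp = ?;

-- What a batch update amounts to: one set-based statement against a staged set.
-- #StagedUpdates and the column names are assumptions for illustration.
UPDATE t
SET t.Col1 = s.Col1
FROM dbo.TargetTable AS t
INNER JOIN #StagedUpdates AS s
    ON t.id = s.id
   AND t.SystemModstamp = s.SystemModstamp;
```

One round trip and one query plan versus one per row is why the set-based form usually wins on large loads.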
 
LVL 65

Author Closing Comment

by:Jim Horn
ID: 40559753
Thanks.


