I have a situation where we need to read data from multiple source files and merge them into a single destination file. Since any of those flat files may be missing at run time, we want several Data Flow Tasks: one per file to load it into memory, and one to do the actual merging. That way we can raise an error identifying which specific file wasn't available. However, we can't find a good way to "transfer" data from one Data Flow Task to another.
Our first attempt was to load each source file into a Recordset destination, assign it to a variable, and consume that variable in the final Data Flow Task, but we discovered there is no corresponding Recordset source. Further research suggested loading each source file into a Raw File destination instead, but that actually writes a file to disk, so I'm not sure we'd gain much from that approach.
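For reference, the workaround we've seen mentioned for the missing Recordset source is a Script Component configured as a source, which shreds the Object variable back into data flow rows. A rough sketch of what that looks like, assuming a read-only Object variable named User::rsSource and made-up output columns Id and Name (this only compiles inside the SSIS Script Component host, which generates the Output0Buffer class):

```csharp
// Inside an SSIS Script Component (Source), with User::rsSource listed
// as a ReadOnly variable. Column names Id/Name are hypothetical.
public override void CreateNewOutputRows()
{
    // OleDbDataAdapter.Fill has an overload that accepts an ADO
    // recordset object, which is what the Recordset destination stores.
    var adapter = new System.Data.OleDb.OleDbDataAdapter();
    var table = new System.Data.DataTable();
    adapter.Fill(table, Variables.rsSource);

    // Emit one data flow row per DataTable row.
    foreach (System.Data.DataRow row in table.Rows)
    {
        Output0Buffer.AddRow();
        Output0Buffer.Id = (int)row["Id"];
        Output0Buffer.Name = (string)row["Name"];
    }
}
```

This gets the data back into a data flow, but it means hand-maintaining the column mapping in script, which is part of why I'm asking whether there's a cleaner pattern.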
This doesn't seem like an out-of-the-ordinary requirement. How do other people reuse the same data across multiple Data Flow Tasks?