Flat File Data Flow Process Runs Very Slowly Within a For Each (file) Loop Container

I have developed an SSIS package that loads data into an OLEDB destination table from 26 very small fixed-format flat files. You would think it would run very quickly, but it doesn't (we have a standalone .exe that does the same work in seconds).

It seems to pause around 30 seconds between each file iteration (regardless of whether there were any detail records to load in the file) in order to "commit the insert," or to perform the "Post Execute" and/or "Cleanup" phases, or all of those.  I have tried tweaking every property setting for the data flow task itself and the items within the data flow (buffer sizes, max records, etc.) with absolutely no luck.  I have even tried flowing the data to a flat file destination (rather than an OLEDB database table), and it still pauses between each file.

I'm almost at the point of abandoning the idea of using all the neat tools that SSIS provides and just using a big VBScript task (almost like what we use now) to do all the work.

Any suggestions? ...

Thank you,
Ted Landres

tlandreAsked:
 
PedroCGDCommented:
tlandre,
try replacing the data reader destination with an OLEDB destination,
and also send me a correct text file for the refund connection.
Regards,
Pedro
 
PedroCGDCommented:
Could you post a flat file example here so I can simulate the scenario in an example package?
regards,
Pedro
 
tlandreAuthor Commented:
If you need the package itself, I would have to make extensive modifications due to the sensitive connection string information to our corporate databases and tables ... Do you need the PACKAGE itself?

The data flow is pretty straightforward and consists of 5 items in the following order:

1.  The Flat File Source
2.  A Conditional Split Transformation
3.  A Derived Columns Transformation
4.  A Data Conversion Transformation
5.  And the final OLEDB Destination (table)

When the data flow executes from the control task, it appears that all data flow items execute VERY QUICKLY because every item turns GREEN at the same time.  But afterward, everything STALLS and just sits there for 30 or more seconds before the NEXT file processes (as if it is doing some housekeeping or something under the hood that I can't control directly).
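For anyone reading along, the per-file work those five items do could be sketched in plain Python as below. The column positions, the "D" record-type code, and all names here are hypothetical placeholders; the real layout lives in the package's flat file connection manager:

```python
import csv

def process_file(path, writer):
    """Rough stand-in for the five data flow items, in order."""
    with open(path) as f:                      # 1. Flat File Source
        for line in f:
            rec_type = line[0:1]
            amount_raw = line[1:10].strip()
            if rec_type != "D":                # 2. Conditional Split: detail rows only
                continue
            source_file = path                 # 3. Derived Column: add the file name
            amount = float(amount_raw or 0)    # 4. Data Conversion: string -> number
            writer.writerow([rec_type, amount, source_file])  # 5. Destination
```

This is only meant to show the shape of the pipeline; per-row work this trivial should indeed finish in well under a second per file, which is why the 30-second stall points at something outside the transformations themselves.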

Experts-Exchange-Example-901-DM-.txt
 
PedroCGDCommented:
If you attach the package or send it to me via email, it would be easier to give you the solution.
regards,
Pedro
 
tlandreAuthor Commented:
Since you don't have access to a database of your own with which to test, I guess it doesn't matter if I send you a dtsx package with BOGUS connection information.  I'm not sure what good that will do you, though, because your connection manager to the DB and table within isn't going to work, and the MAPPING of derived and converted columns from the flat file ... to the table columns ... will be lost.

Is that still going to be okay for your purposes?  I honestly don't see how sending you a broken package is going to help, but I will send one to you if you think it will.
 
PedroCGDCommented:
I know that, but I can check offline how you configured each component, only that.
If you want my help, I can do that with pleasure and for free.
Regards,
Pedro

www.pedrocgd.blogspot.com
 
tlandreAuthor Commented:
Pedro,

I removed all but the essential items from this package so that it doesn't need an OLEDB connection and simply directed the data flow to a dummy flat file destination.  It still exhibits the same behavior by stalling around 30 seconds between each file, so you can observe what it's doing for yourself.  You only have one flat file, but you can create multiple copies (you might need to vary each name slightly) and put them in a directory/folder of your choosing.  I have created two variables (FileName and FileFolder) that you can use to specify whatever folder that might be.

PS - FYI:
I think the For Each Loop container is looking for files using the following filter ... "ACE_REFUND_ORDER_901_*.txt"
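As a quick sanity check, the set of files that mask should pick up can be listed with an ordinary glob (the folder path below is just a placeholder for the FileFolder variable's value):

```python
import glob
import os

folder = "."  # placeholder for the FileFolder variable's value
pattern = os.path.join(folder, "ACE_REFUND_ORDER_901_*.txt")
matches = sorted(glob.glob(pattern))
for path in matches:
    print(path)
```

If this lists the same 26 files the For Each Loop iterates over, the enumerator filter is configured as expected.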

I would also like to thank you in advance for your willingness to assist me in this issue ... So Thanks!
DTSX-Experts-Exchange-Example-bu.txt
 
PedroCGDCommented:
I will take a look right now.
Just a moment
 
tlandreAuthor Commented:
Pedro,

You want me to send you a "correct" text file for the refund connection?  I'm not sure I understand what you mean.  I already sent you a refund text file previously ... and it was correct.  You can't be referring to that file, can you?  If not, please explain.  You should be able to run the package I sent you, with instructions ... as is.

I also am confused about your suggesting replacing the OLEDB destination with a data reader.  In the package I sent you ... there is NO OLEDB destination to replace ... I removed it.
 
PedroCGDCommented:
The text file you gave me generates errors while setting the connection to it.

"I also am confused about your suggesting replacing the OLEDB destination with a data reader.  In the package I sent you ... there is NO OLEDB destination to replace ... I removed it."

I suggested the opposite, if you read the post correctly: "try replacing the data reader destination with an OLEDB destination"
 
tlandreAuthor Commented:
You're right, Pedro, I misread the sentence about the OLEDB ... sorry.

Now, I didn't replace the data reader as you suggested ... I eliminated it completely and guess what? ...  that solved the problem!  It was the superfluous data reader that was eating my lunch the whole time.

So, muchas gracias mi amigo ... you rock!
 
PedroCGDCommented:
:-)
Very good! Congratulations.
Visit my blog and leave a comment so I remember your profile.
Cheers!
Pedro
Question has a verified solution.
