DTS/File Transfer Protocol task - inconsistent in transferring files

From SQL Server/DTS, using the File Transfer Protocol task, I'm transferring .txt data files from a unix box to the local box.  There are about 40 jobs that transfer 40 files and execute between 10 PM and 5 AM every night.
It is becoming rare that all these jobs transfer the files successfully.  It is becoming inconsistent: 1 or 2 jobs fail each and every day.  I'm not able to figure out the problem.
If I check with the unix box admin, he says that he can't see any error on the other server.  He is asking me for more details on the error.
The only error I can see in the log file is:
Step Error Source: File Transfer Protocol Task
Step Error Description:Failed to connect to the internet site for file transfer.
Step Error code: 800403F1
Step Error Help File:
Step Error Help Context ID:0

Step Error Source: Microsoft Data Transformation Services Flat File Rowset Provider
Step Error Description:Error opening datafile: The system cannot find the file specified.

Could someone please help me?  How do I figure out why it fails on FTP?

Appreciate your help.

Thanks in advance.



Jay Toops Commented:
OK, this is the quick fix.

Go to your job scheduler:
1) EM -> Management -> SQL Server Agent -> Jobs
2) pick the job that's failing
3) right-click and choose Properties
4) click the Steps tab
5) click your download step (there is probably only one)
6) click Edit
7) click Advanced
8) enter Retry attempts (5 or something; use your judgement)
9) enter a retry interval of 4-5 minutes


Jay Toops Commented:
the only thing that comes to mind would be too many simultaneous connections to the
FTP server.

best just to retry the step until it's successful.

Also, if you can, try manually FTPing the files down during that time.

create a job that just tests an FTP download every 10 minutes and
logs the activity to a db ..

see what happens.
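That probe job could be sketched like this in Python (a hypothetical stand-alone script; the host, credentials, and file name are placeholders, and for brevity the log goes to a local SQLite file rather than a SQL Server table):

```python
import sqlite3
import time
from ftplib import FTP, all_errors

def open_log(path):
    """Create (if needed) and open the probe-log database."""
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS ftp_probe ("
        " probed_at TEXT, ok INTEGER, detail TEXT)"
    )
    return conn

def log_attempt(conn, ok, detail):
    """Record one probe result with a timestamp."""
    conn.execute(
        "INSERT INTO ftp_probe VALUES (?, ?, ?)",
        (time.strftime("%Y-%m-%d %H:%M:%S"), int(ok), detail),
    )
    conn.commit()

def probe(host, user, password, test_file, conn):
    """Try one small download and record success or the exact error."""
    try:
        with FTP(host, timeout=30) as ftp:
            ftp.login(user, password)
            chunks = []
            ftp.retrbinary("RETR " + test_file, chunks.append)
        log_attempt(conn, True, "got %d bytes" % sum(len(c) for c in chunks))
    except all_errors as exc:
        log_attempt(conn, False, repr(exc))

# Example usage (placeholder host/credentials -- substitute the real ones):
#   conn = open_log("ftp_probe.db")
#   probe("unixbox.example.com", "ftpuser", "secret", "test.txt", conn)
```

Scheduled every 10 minutes through the overnight window, the `detail` column would tell you whether the failures are timeouts, refused connections, or login rejections.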

agree with jl, but out of interest, why are there "40 jobs that transfer 40 files" as opposed to 1 job transferring 40 files?

A good idea might be to change the DTS-FTP part of the job into a batch script. You can then call this from the DTS package and record stdin/stdout/stderr to your own log files.
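A sketch of that wrapper idea, here in Python rather than a batch file (the ftp command line, script file, and log path are all placeholder assumptions):

```python
import subprocess
import sys
import time

def run_and_log(cmd, log_path):
    """Run a command, append its stdout/stderr to a log file with a
    timestamp, and return its exit code."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    with open(log_path, "a") as log:
        log.write("=== %s exit=%d ===\n"
                  % (time.strftime("%Y-%m-%d %H:%M:%S"), result.returncode))
        log.write("stdout:\n" + result.stdout)
        log.write("stderr:\n" + result.stderr)
    return result.returncode

# Example usage (hypothetical: drive the Windows ftp client with a
# prepared command-script file):
#   code = run_and_log(["ftp", "-s:get_files.ftp", "unixbox.example.com"],
#                      "ftp_step.log")
#   sys.exit(code)
```

One caveat: the stock Windows ftp.exe exits 0 even when a transfer fails, so in practice you would scan the captured output for error text rather than trust the exit code alone.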
Jay Toops Commented:
Also, if it's trying to FTP the 40 files at the SAME time .. your poor FTP server would
get very upset.

Yes, that's what I was thinking.  And I would think that you'd get the files quicker if you do them in series rather than in parallel.
divyashruthi (Author) Commented:
So, is the problem solely on the local server that's getting the files, and not on the unix server where the files reside?

I need to get the files in each DTS package (40 packages running at various times) because I do the data scrubbing.

Is it possible to fix the problem while keeping my current setup, or is it compulsory that I write a batch script, or a package that FTPs all the files in series?

I was hoping to do this quickly.

Please suggest.

Thank you.
a quick batch script would only help identify where the problem is, not necessarily solve it
Do the FTP logs on the UNIX box have any indication of your failed attempts?

FTP is a pretty simple protocol, and there is not normally much that goes wrong with it aside from the obvious like connection failures, permissions, etc.

However, as jl mentions, there are options in FTP to restrict the number of concurrent sessions, etc.  These are often set to prevent excessive strain on the server.  Is your application the only thing that accesses this FTP site?  Are you sure the files are available when you try to get them?  These are the sorts of questions you need to address, I think, because I suspect the problem lies in that area.  Of course, serializing the FTP means that you will only have 1 connection, which should eliminate the possibility of exceeding the max connections, but from a technical point of view there is nothing wrong with your strategy.
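A minimal sketch of what serializing the transfers would mean, in Python (hypothetical host and file names; the connection and save steps are injected so the sequential loop itself is easy to test):

```python
from ftplib import FTP

def download_all(connect, files, save):
    """Open ONE connection and pull the files one after another.
    connect() returns a logged-in FTP-like object; save(name, data)
    stores a file.  Returns the names successfully fetched, in order."""
    done = []
    ftp = connect()
    try:
        for name in files:
            chunks = []
            ftp.retrbinary("RETR " + name, chunks.append)
            save(name, b"".join(chunks))
            done.append(name)
    finally:
        ftp.quit()
    return done

# Example usage (placeholder host/credentials/file list):
#   def connect():
#       ftp = FTP("unixbox.example.com", timeout=60)
#       ftp.login("ftpuser", "secret")
#       return ftp
#   files = ["file%02d.txt" % i for i in range(1, 41)]
#   download_all(connect, files,
#                lambda name, data: open(name, "wb").write(data))
```

With this shape the server ever sees only one connection, which rules out any max-connection limit as the cause.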
divyashruthi (Author) Commented:
The FTP logs on the UNIX box have no indication of the failed attempts.  No, my application is not the only thing that accesses this FTP site.  The files are available when I try to get them.

To change the retry attempts in the SQL Server Agent job - on the Steps tab I have only 1 step, which executes all the steps in the package.  By design I have about 8 tasks in the package; task 2 is where I FTP the files.
If I set the retry property to 5 or more, will that cause the package to execute completely (i.e. run all the steps/tasks in the package)?  And if it fails at some point, say at "task 2", will it continue with tasks 3, 4, 5, etc. and then try executing up to 4 more times until successful?  I'm kind of lost here.  Please help.
Jay Toops Commented:
Job RETRY only comes into play if the entire package fails;
the scheduler's job steps do not relate to the package's task steps.

So if the package fails for ANY reason, the retry will
execute it again after the wait interval, up to the maximum number of retries.

so if your package runs successfully the first time,
retry will not come into play at all - your job step is done.

So with a setting like a retry interval of 4 minutes:
if your package fails on the first attempt, in 4 minutes the scheduler will
launch your package again.

if it is then successful.. no further processing will take place...
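The behaviour described above amounts to an outer retry loop around the whole package. A minimal Python sketch of the scheduler's logic (the names are illustrative, not an actual Agent API):

```python
import time

def run_with_retries(run_package, retry_attempts, retry_interval_minutes,
                     sleep=time.sleep):
    """Mimic SQL Server Agent step retry: re-run the WHOLE package on
    failure.  Returns True as soon as one run succeeds; False if the
    first run and every retry fail."""
    for attempt in range(1 + retry_attempts):     # first run + retries
        if run_package():
            return True                            # success: no more retries
        if attempt < retry_attempts:
            sleep(retry_interval_minutes * 60)     # wait, then launch again
    return False
```

Note that `run_package` stands for the entire package run: a failure partway through any task restarts the package from the beginning, not from the task that failed.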


Question has a verified solution.
