Solved

DTS/File Transfer Protocol task - inconsistent in transferring files

Posted on 2004-09-13
606 Views
Last Modified: 2013-11-30
From SQL Server/DTS, using the File Transfer Protocol task, I'm transferring .txt data files from a unix box to the local box.  There are about 40 jobs that transfer 40 files, executing between 10 PM and 5 AM every night.
It has become rare for all of these jobs to transfer their files successfully; it is inconsistent, and 1 or 2 jobs fail each and every day.  I'm not able to figure out the problem.
When I check with the unix box admin, he says he can't see any error on his server, and he is asking me for more details on the error.
The only error I could see in the log file is:
*************************************************************
Step Error Source: File Transfer Protocol Task
Step Error Description:Failed to connect to the internet site for file transfer.
Step Error code: 800403F1
Step Error Help File:
Step Error Help Context ID:0

Step Error Source: Microsoft Data Transformation Services Flat File Rowset Provider
Step Error Description:Error opening datafile: The system cannot find the file specified.
**************************************************************

Could someone please help me.  How do I figure out why it fails on FTP?

Appreciate your help.

Thanks in advance.
 



 
Question by:divyashruthi
12 Comments
 
LVL 10

Expert Comment

by:Jay Toops
ID: 12044148
the only thing that comes to mind would be too many simultaneous connections to the
ftp server.

best just to retry the step until it's successful.


Also, if you can, try manually FTPing the files down during that time.

create a job that just tests an ftp download every 10 minutes and
log the activity to a db ..

see what happens.

Jay
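
A minimal sketch of that probe job, assuming xp_cmdshell is available and a hypothetical command file C:\probe\get_test.ftp that fetches one small test file (all names here are placeholders):

-- one-time setup: a table to record each probe attempt
CREATE TABLE dbo.FtpProbeLog (
    ProbeTime  datetime     NOT NULL DEFAULT (GETDATE()),
    ReturnCode int          NULL,   -- 0 means xp_cmdshell reported success
    Note       varchar(255) NULL
)
GO

-- job step body, scheduled every 10 minutes:
-- run a scripted ftp session and log the outcome
DECLARE @rc int
EXEC @rc = master.dbo.xp_cmdshell 'ftp -n -i -s:C:\probe\get_test.ftp unixbox', no_output
INSERT INTO dbo.FtpProbeLog (ReturnCode, Note)
VALUES (@rc, CASE WHEN @rc = 0 THEN 'ok' ELSE 'xp_cmdshell reported failure' END)

One caveat: ftp.exe often exits 0 even when a transfer fails, so checking that the test file actually arrived is a more reliable success test.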
 
LVL 18

Expert Comment

by:ShogunWade
ID: 12044591
agree with jl, but out of interest, why are there "40 jobs that transfer 40 files" as opposed to 1 job transferring 40 files?
 
LVL 1

Expert Comment

by:ratb3rt
ID: 12044799
A good idea might be to change the DTS-FTP part of the job into a batch script. You can then call this from the DTS package and record stdin/stdout/stderr to your own log files.
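
One way to wire that up without leaving SQL Server Agent (a sketch; the job name, script path, and log path are hypothetical) is a CmdExec job step whose output is captured to your own log file:

-- add a CmdExec step that runs the wrapper script and captures its output
EXEC msdb.dbo.sp_add_jobstep
    @job_name         = N'NightlyFtp01',             -- hypothetical job name
    @step_name        = N'Get files via ftp script',
    @subsystem        = N'CmdExec',
    @command          = N'C:\scripts\getfiles.cmd',  -- wrapper around ftp -s:<script>
    @output_file_name = N'C:\logs\NightlyFtp01.log'  -- the script's output lands here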
 
LVL 10

Expert Comment

by:Jay Toops
ID: 12045326
Also, if it's trying to FTP the 40 files at the SAME time .. your poor ftp server would
get very upset.

Jay
 
LVL 18

Expert Comment

by:ShogunWade
ID: 12045351
yes, that's what i was thinking.  and i would think that you'd get the files quicker if you do them in series rather than in parallel.
 

Author Comment

by:divyashruthi
ID: 12045853
So, is the problem solely on the local server that's getting the files, and not on the unix server where the files reside?

I need to get the files in each DTS package (40 packages running at various times) because I do the data scrubbing there.

Is it possible to fix the problem while keeping my current setup, or is it compulsory that I write a batch script or build a package that ftp's all the files in series?

I was hoping to do this quickly.

Please suggest.

Thank you.
 
LVL 1

Expert Comment

by:ratb3rt
ID: 12045946
a quick batch script would only help identify where the problem was, not necessarily solve it
 
LVL 18

Expert Comment

by:ShogunWade
ID: 12045953
Do the FTP logs on the UNIX box have any indication of your failed attempts?

FTP is a pretty simple protocol, and there is not normally much that goes wrong with it aside from the obvious, like connection failures and permissions, etc.

However, as jl mentions, there are options in FTP servers to restrict the number of concurrent sessions, etc.  These are often set to prevent excessive strain on the server.   Is your application the one thing that accesses this ftp site?   Are you sure the files are available when you try to get them?   These are the sorts of questions you need to address, i think, because i suspect the problem lies in that area.   Of course, serializing the FTP means that you will only have 1 connection, which should eliminate the possibility of exceeding the max connections, but from a technical point of view there is nothing wrong with your strategy.
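
On the SQL Server side, if you enable "Log package execution to SQL Server" in each package's properties, the step errors are written to msdb and can be queried afterwards. A sketch against SQL Server 2000's DTS log tables (verify the table and column names in your own msdb):

-- most recent DTS step failures recorded by package logging
SELECT TOP 20 p.name, s.stepname, s.starttime, s.errorcode, s.errordescription
FROM msdb.dbo.sysdtssteplog s
JOIN msdb.dbo.sysdtspackagelog p ON p.lineagefull = s.lineagefull
WHERE s.errorcode <> 0
ORDER BY s.starttime DESC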
 
LVL 18

Expert Comment

by:ShogunWade
ID: 12045964
oops: "Is your application the one thing that accesses this ftp site?"  i meant "Is your application the only thing that accesses this ftp site?"
 
LVL 10

Accepted Solution

by:
Jay Toops earned 500 total points
ID: 12046699
OK, this is the quick fix.

Go to your job scheduler:
1) EM -> Management -> SQL SERVER AGENT -> JOBS
2) pick the job that's failing
3) right-click to Properties
4) click the Steps tab
5) click on your download step (there is probably only one)
6) click Edit
7) click Advanced
8) enter RETRY ATTEMPTS (like 5 or something) - use your judgement
9) enter a retry interval of 4-5 minutes

done..

Jay
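
If clicking through 40 jobs is tedious, the same settings can be applied in T-SQL (a sketch; substitute your own job names and step ids):

-- set retry on the step that runs the DTS package
EXEC msdb.dbo.sp_update_jobstep
    @job_name       = N'NightlyFtp01',  -- hypothetical job name
    @step_id        = 1,                -- the package-execution step
    @retry_attempts = 5,                -- re-run up to 5 times on failure
    @retry_interval = 4                 -- minutes to wait between attempts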
 

Author Comment

by:divyashruthi
ID: 12056013
The FTP logs on the UNIX box have no indication of the failed attempts.  And nope, my application is not the only thing that accesses this ftp site.  The files are available when I try to get them.


To change the retry attempts in the SQL Server Agent job: in the Steps tab I have only 1 step, which executes all the steps in the package.  By design I have about 8 tasks in the package; task 2 is where I FTP the files.
If I set the retry property to 5 or more, will that cause the package to execute completely (i.e. run all the steps/tasks in the package)?  And if it fails at some point, say at task 2, will it continue with steps 3, 4, 5, etc. and then try executing up to 4 more times until successful?  I'm kind of lost here.  Please help.
 
LVL 10

Expert Comment

by:Jay Toops
ID: 12056112
Job RETRY only comes into play if the entire package fails;
the scheduler's job steps do not relate to the package's task steps.

So if the package fails for ANY reason, the retry will
execute it again after the wait interval, up to the maximum number of retries.

So if your package runs successfully the first time,
retry will not come into play at all .. your job step will be done.

So for settings like
8) enter RETRY ATTEMPTS 5
9) enter retry interval of 4 minutes
if your package fails on the first attempt, then in 4 minutes the scheduler will
launch your package again.

If it is then successful .. no further processing will take place.

ok?

Jay
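
You can confirm this behaviour afterwards from the job history, which records how many retries each run needed (a sketch; filter on your own job name):

-- recent job step outcomes: run_status 1 = succeeded, 0 = failed
SELECT j.name, h.step_id, h.run_date, h.run_time,
       h.run_status, h.retries_attempted
FROM msdb.dbo.sysjobhistory h
JOIN msdb.dbo.sysjobs j ON j.job_id = h.job_id
WHERE j.name = N'NightlyFtp01'          -- hypothetical job name
ORDER BY h.instance_id DESC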
