Solved

DTS/File transfer protocol task - inconsistent in transferring files

Posted on 2004-09-13
605 Views
Last Modified: 2013-11-30
From SQL Server/DTS, using the File Transfer Protocol task, I'm transferring .txt data files from a unix box to the local box.  There are about 40 jobs that transfer 40 files, and they execute between 10 PM and 5 AM every night.
It has become rare that all these jobs transfer the files successfully; it's inconsistent, and 1 or 2 jobs fail each and every day.  I'm not able to figure out the problem.
If I check with the unix box admin, he says that he can't see any error on the other server.  He is asking me to give more details on the error.
The only error I could see in the log file is:
*************************************************************
Step Error Source: File Transfer Protocol Task
Step Error Description:Failed to connect to the internet site for file transfer.
Step Error code: 800403F1
Step Error Help File:
Step Error Help Context ID:0

Step Error Source: Microsoft Data Transformation Services Flat File Rowset Provider
Step Error Description:Error opening datafile: The system cannot find the file specified.
**************************************************************

Could someone please help me?  How do I figure out why it fails on FTP?

Appreciate your help.

Thanks in advance.
 



 
Question by:divyashruthi
12 Comments
 
LVL 10

Expert Comment

by:Jay Toops
the only thing that comes to mind would be too many simultaneous connections to the ftp server.

best just to retry the step until it's successful.


Also, if you can, try manually FTPing the files down during that time.

create a job that just tests an ftp download every 10 minutes and
logs the activity to a db ..

see what happens.
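
something quick like this would do for the test - the host name, login, paths, and the FtpTestLog table are all placeholders, so substitute your own:

@echo off
rem -- test_ftp.cmd: try one download and record the outcome in a table
rem -- host, login, and paths below are examples only
echo open unixhost> C:\ftptest\ftp.scr
echo user ftpuser ftppass>> C:\ftptest\ftp.scr
echo binary>> C:\ftptest\ftp.scr
echo get /data/out/test.txt C:\ftptest\test.txt>> C:\ftptest\ftp.scr
echo bye>> C:\ftptest\ftp.scr

rem -n = no auto-login, -i = no interactive prompts, -s = run the script file
ftp -n -i -s:C:\ftptest\ftp.scr > C:\ftptest\ftp_last.log 2>&1

rem ftp.exe does not return a useful errorlevel, so check whether the file landed
if exist C:\ftptest\test.txt (set RESULT=OK) else (set RESULT=FAIL)
rem log the result (FtpTestLog table / Scratch db are whatever you create for this)
osql -E -d Scratch -Q "INSERT INTO FtpTestLog (RunTime, Result) VALUES (GETDATE(), '%RESULT%')"
del C:\ftptest\test.txt 2>nul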

Jay
 
LVL 18

Expert Comment

by:ShogunWade
agree with jl, but out of interest, why are there "40 jobs that transfer 40 files" as opposed to 1 job transferring 40 files?
 
LVL 1

Expert Comment

by:ratb3rt
A good idea might be to change the DTS-FTP part of the job into a batch script. You can then call this from the DTS package (e.g. via an Execute Process task) and record stdout/stderr to your own log files - something like the sketch below.
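
a rough sketch of such a wrapper (the host name, login, and all paths are examples, not your real setup):

@echo off
rem -- get_files.cmd: replacement for the DTS FTP task; the package
rem -- calls this via an Execute Process task (all names are examples)
echo open unixhost> C:\etl\ftp.scr
echo user ftpuser ftppass>> C:\etl\ftp.scr
echo ascii>> C:\etl\ftp.scr
echo get /data/out/daily.txt C:\etl\in\daily.txt>> C:\etl\ftp.scr
echo bye>> C:\etl\ftp.scr

rem keep our own log of everything the ftp client says
echo ---- run at %DATE% %TIME% ---->> C:\etl\logs\ftp.log
ftp -n -i -s:C:\etl\ftp.scr >> C:\etl\logs\ftp.log 2>&1

rem hand DTS a non-zero exit code if the file didn't arrive
if not exist C:\etl\in\daily.txt exit 1
exit 0

the Execute Process task can then fail the step (and trigger whatever retry/alert logic you want) whenever the exit code isn't 0.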
 
LVL 10

Expert Comment

by:Jay Toops
Also, if it's trying to FTP the 40 files at the SAME time .. your poor ftp server would get very upset.

Jay
 
LVL 18

Expert Comment

by:ShogunWade
yes that's what i was thinking, and i would think that you'd get the files quicker if you do them in series rather than in parallel.
 

Author Comment

by:divyashruthi
so, is the problem solely on the local server that's getting the files, and not on the unix server where the files reside?

I need to get the files in each DTS package (40 packages running at various times) because I do the data scrubbing there.

Is it possible to fix the problem while keeping my setup the way it is, or is it compulsory that I write a batch script or build one package that ftp's all the files in series?

I was hoping to do this quickly.

Please suggest.

Thank you.
 
LVL 1

Expert Comment

by:ratb3rt
a quick batch script would only help identify where the problem is, not necessarily solve it.
 
LVL 18

Expert Comment

by:ShogunWade
Do the FTP logs on the UNIX box have any indication of your failed attempts?

FTP is a pretty simple protocol, and there is not normally much that goes wrong with it aside from the obvious like connection failures and permissions, etc.

However, as jl mentions, there are options in FTP servers to restrict the number of concurrent sessions, etc.  These are often set to prevent excessive strain on the server.  Is your application the one thing that accesses this ftp site?  are you sure the files are available when you try to get them?  these are the sorts of questions you need to address i think, because i suspect the problem lies in that area.  Of course, serializing the FTP means that you will only have 1 connection, which should eliminate the possibility of exceeding the max connections, but from a technical point of view there is nothing wrong with your strategy.
 
LVL 18

Expert Comment

by:ShogunWade
oops " Is your application the one thing that accesses this ftp site? "  i mean " Is your application the only thing that accesses this ftp site? "
 
LVL 10

Accepted Solution

by:Jay Toops earned 500 total points
OK, this is the quick fix.

Go to your job scheduler:
1) EM -> Management -> SQL SERVER AGENT -> JOBS
2) pick your job that's failing
3) right-click and choose Properties
4) click the Steps tab
5) click on your download step (there is probably only one)
6) click Edit
7) click Advanced
8) enter RETRY ATTEMPTS (like 5 or something - use your judgement)
9) enter a retry interval of 4-5 minutes

done..
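
if you'd rather script it than click through EM, the same settings can be applied from a command line with sp_update_jobstep - the job name here is just an example (and note @retry_interval is in minutes):

rem apply retry settings to step 1 of the job without opening EM
osql -E -Q "EXEC msdb.dbo.sp_update_jobstep @job_name = N'Nightly file load 01', @step_id = 1, @retry_attempts = 5, @retry_interval = 5"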

Jay
 

Author Comment

by:divyashruthi
The FTP logs on the UNIX box have no indication of the failed attempts.  And nope, my application is not the only thing that accesses this ftp site.  Files are available when I try to get them.


To change the retry attempts in the SQL Server Agent job - in the Steps tab I have only 1 step, and it executes all the steps in the package.  By design I have about 8 tasks in the package; step 2 is where I FTP the files.
If I set the retry property to 5 or more - will that cause the package to execute completely (i.e. run all the steps/tasks in the package)?  And if it fails at any point, say at "task 2", will it continue with steps 3, 4, 5, etc. and then try executing again up to 4 more times until successful?  I'm kind of lost here.  Please help.
 
LVL 10

Expert Comment

by:Jay Toops
Job RETRY only comes into play if the entire package fails.
The scheduler's job steps do not relate to the package's task steps.

so if the package fails for ANY reason, the retry will execute it again after the wait interval, up to the maximum number of retries.

so if your package runs successfully the first time, retry will not come into play at all .. your job step will be done.

so with settings like
8) RETRY ATTEMPTS 5
9) retry interval of 4 minutes
if your package fails on the first attempt, in 4 minutes the scheduler will launch your package again.

if it is then successful .. no further processing will take place ...
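
one way to double-check what's happening is to look at the job history after a night's run - each retry shows up as its own entry (the job name is just an example):

rem show the job's run history, including retried attempts
osql -E -Q "EXEC msdb.dbo.sp_help_jobhistory @job_name = N'Nightly file load 01'"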

ok?

Jay
