Transfer multiple files simultaneously via FTP in Linux

I have a CentOS 4 box. I need to transfer thousands of files over a WAN connection to another box, so I am looking for a text-based FTP client that can upload several files in parallel to maximize my throughput. I tried lftp, which seems to have this capability, but I had no success with it. Any help is appreciated.
fifthelement80 (Author) commented:
I found a simple solution. Create a file and copy the following commands into it:

mput 0* 1* 2* 3* 4* 5* 6* 7* 8* 9* &
mput a* b* c* d* e* f* &
mput g* h* i* j* k* l* m* &
mput n* o* p* q* r* s* &
mput t* u* v* w* x* y* z* &
mput A* B* C* D* E* F* &
mput G* H* I* J* K* L* M* &
mput N* O* P* Q* R* S* &
mput T* U* V* W* X* Y* Z* &

Save the file and run it with: lftp -f filename
It uploads the files over nine concurrent connections (the trailing & makes lftp run each mput as a background job).
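For reference, newer lftp releases (likely newer than what CentOS 4 ships) can do this in one built-in command: mirror -R uploads a local tree, and --parallel=N transfers N files at once. A sketch of such a command file, with the host, credentials, and paths as placeholders:

```
# parallel-upload.lftp -- run with: lftp -f parallel-upload.lftp
# Placeholders: replace user, password, host, and paths with real values.
open -u user,password ftp.example.com
# mirror -R uploads the local tree; --parallel=5 transfers 5 files at once
mirror -R --parallel=5 /local/files /remote/dir
quit
```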


Can you not use the 'mget'/'mput' commands? I would think the overall time of sequential transfers done that way would be similar to that achieved with parallel transfers.

Alternatively, try tarring the files into a compressed archive first: tar cvfj archive.tar.bz2 path/to/files/*

Then uncompress at the other end using tar xvfj archive.tar.bz2
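The archive round trip above can be sketched end to end. This uses throwaway demo files and placeholder paths; the actual transfer step between the two boxes is whatever FTP command you already use:

```shell
# Create a couple of demo files standing in for the thousands to transfer.
mkdir -p src/files dest
printf 'hello\n' > src/files/a.txt
printf 'world\n' > src/files/b.txt

# On the source box: bundle everything into one bzip2-compressed archive
# (c=create, j=bzip2, f=archive name; -C changes directory first).
tar cjf archive.tar.bz2 -C src files

# ...transfer archive.tar.bz2 to the other box by FTP...

# On the destination box: extract it (x=extract).
tar xjf archive.tar.bz2 -C dest
cat dest/files/a.txt
```

A single large archive also avoids the per-file connection overhead that makes transferring thousands of small files slow.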



Don't forget to turn off the per-file prompt if you opt for the 'mget'/'mput' commands. Issue the 'prompt' command before the mget/mput command:


ftp> prompt
ftp> mput files.*


fifthelement80 (Author) commented:
mput transfers files one by one, so it is slow. I need to transfer them in parallel to maximize my network throughput.

Really? I would be surprised if it makes that much difference, especially compared to a compressed archive transfer. If your heart is set on parallel transfers, you could script something up using wget to do fetches from the remote end, or look at rsync. But I still reckon you'd be finished with a sequential transfer by the time you got those solutions working. My understanding of TCP/IP is incomplete, but is a single connection so inefficient at using bandwidth that further connections can transmit data without impeding the bandwidth used by the first? I find that hard to believe.
fifthelement80 (Author) commented:
That's the same reason download managers are faster than the regular Windows download: the same principle applies to uploading over FTP.
Hmm, I'm more inclined to believe download managers work because hosting servers throttle rates per connection, so opening more connections gets you more bandwidth; that probably doesn't apply on your local LAN. But that doesn't help you with your problem.

What have you tried in lftp?  Are you trying to script it from bash or using its own -f command line option?  

fifthelement80 (Author) commented:
I am not on a LAN, I am on a WAN.
Neither; I just thought it might work like my CuteFTP Pro client, which I can set to use 10 concurrent sessions to upload files.
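For what it's worth, CuteFTP-style N-session behaviour can be approximated from the shell by splitting the file list into chunks and feeding each chunk to its own background session. A rough sketch: the lftp invocation is left as a comment because it needs real credentials, and note that `split -n` requires a newer GNU coreutils than CentOS 4 ships:

```shell
# Make a demo directory with six files standing in for the real upload set.
mkdir -p files
touch files/a.txt files/b.txt files/c.txt files/d.txt files/e.txt files/f.txt
ls files > filelist.txt

# Round-robin the list into 3 chunks: chunk.aa, chunk.ab, chunk.ac
split -n r/3 filelist.txt chunk.

for c in chunk.a?; do
  # Each chunk would drive one background session, e.g.:
  # ( echo "open -u user,password ftp.example.com"; \
  #   sed 's#^#put files/#' "$c" ) | lftp &
  echo "$c: $(wc -l < "$c") files"
done
# wait   # uncomment to block until all background uploads finish
```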