gvector1

asked on

FTP

I am trying to ftp all files in a folder on a Unix server over to my Windows server. The problem is that an FTP client like WS_FTP freezes up just trying to display all the files in the directory. The directory contains thousands of files totaling about 1GB. I also tried ftp from a command line, but ftp does not support *.* with the get command. How can I copy thousands of files totaling over 1GB from a Unix server to a Windows server?
sirbounty

"ftp does not support *.* with the get command"

Try with:

     mget *.*
gvector1,

Sirbounty's post may be all you need, but just in case you still have a problem...

Are you trying to transfer them directly from the Unix server to the Windows server? If so, make sure each server supports FXP (server-to-server file transfer). Even though they may be set up for FTP, they may not support FXP. Some versions of WS_FTP support FXP and will try to use it for a server-to-server transfer. If a server doesn't support it, WS_FTP can appear to "freeze". The program most likely isn't frozen; it just takes a while to return results and an error.

Let me know if you have any questions or need more information.

b0lsc0tt
Are all of the files in the same directory? If so, sirbounty's answer will work, BUT before you issue mget *.*, issue the prompt command. That toggles prompting off; by default prompt is on, and you will be prompted for EACH file to confirm that you really want to transfer it. (A sample session follows below.)

If the files are NOT in the same directory, you may have a problem: FTP does not traverse directories. In that case you may need to get a better ftp client, or use scp, which can recursively transfer an entire directory tree.
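For example, a command-line session from the Windows box might look like this (a sketch; unixhost and both paths are placeholders):

     ftp unixhost
     ftp> prompt                  <- toggle interactive prompting off
     ftp> binary                  <- transfer byte-for-byte, no CR/LF translation
     ftp> cd /path/to/files
     ftp> lcd C:\destination
     ftp> mget *
     ftp> bye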
WGet is a great tool for this
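For instance, a Windows build of wget can pull down an entire FTP directory recursively (a sketch; the user, password, host, and path are placeholders):

     wget -r "ftp://user:password@unixhost/path/to/files/"

Note that this needs a precompiled wget binary for Windows; the source tarball has to be built before it can run.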

gvector1

ASKER

I tried mget *.* and got the following error:

     Arguments too long
     200 Type set to A
     Cannot find list of remote files

Yes the files are all in the same directory.
Also, I looked at WGet and cannot figure out how to get it to work. I downloaded and extracted it, but there are a bunch of unrecognized files, such as install-sh.
It looks like there are just too many files for ftp to handle in one shot. Is there a way you could compress all of the files using gzip on the Unix box, so that you only have one file to retrieve? That would also reduce the amount of data you have to transfer over the wire.
How could I use gzip on Unix? I am not too familiar with Unix.
You may want to see if your Unix includes the zip command.  From a shell just enter "zip --help" and see if it comes back with help.

If it does you can then enter the command:  

     zip allfiles.zip *

and it will zip all the files in the current directory and put them into the file called allfiles.zip.  You can then ftp this file in BINary mode and then unzip on the Windows box.

If you do not have zip, you will need to use tar in combination with gzip: tar gathers all of the files into a single archive, and gzip compresses it.
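A sketch of the tar-plus-gzip approach (assumes gzip is on the PATH and that /tmp has room for the compressed archive; the file name is arbitrary):

     cd /path/to/files
     tar cvf - . | gzip -c > /tmp/allfiles.tar.gz    # stream the archive through gzip

The resulting allfiles.tar.gz is then transferred in BINary mode, just like the zip file above.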
Another issue is that I probably don't have enough hard drive space to compress the files. I don't want to remove the originals until I am positive that my data has copied over; lack of hard drive space is the reason I am trying to move this data in the first place. Any suggestions on how I can get around this problem?
I don't have zip or gzip on the Unix box.
Is there a command I can issue to determine exactly how many files are in this directory I am dealing with?
You don't have gzip on a Unix box?  Which Unix is this and what is the version?

Not 100% accurate, but it should be close:

      ls -l | grep -c r  

Are you dealing with sub-directories in this directory also?

What you MAY want to try is:

    mget a*
    mget b*
    mget c*

Not really clean, but it may work.
SCO OpenServer Release 5.

We tried running the ls -l command on the Unix box and have been waiting for about 10 minutes with no response yet.

No subdirectories, all under one.  

The problem is that 95% of the files are named wp[account number]. When I tried mget wp1*.*, the "arguments too long" message appeared again. I would be here for hours and hours typing all possible naming combinations. What is the file count limit for ftp's mget command?
Um, I think you may be missing a directory from your PATH. This is a fairly current release of Unix, and gzip has been around in the Unix world for as long as I have worked with Unix, which is at least 15 years.

Do you know if you have Samba setup and installed on the Unix box?

Do you have an FTP server on your Windows server?

I typed gzip from a shell on the Unix box and I get a "gzip: not found" message.

No Samba setup.
I don't believe we have an FTP server set up on Windows.

We never got any results from ls -l; we had to kill the process. Is there any other way to find out how many files are in that directory?
I am waiting to see if grep will return any results.
The only way is to do an ls and "count": the grep command was processing the output of ls and counting the number of lines that contained an "r", on the assumption that every file is readable by somebody.
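A lighter-weight alternative (a sketch): plain ls skips the per-file permission lookups that make ls -l so slow here, so piping it to wc may return where the long listing hung:

     ls | wc -l     # count directory entries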

If you had an FTP server set up on the Windows box, you might be able to ftp from the Unix box. A shell script could be written to ftp each file individually. It may take a while, but it seems that you have so many files that you can't "batch" them together.
How would I go about writing a shell script to do this?
ASKER CERTIFIED SOLUTION
giltjr

This solution is only available to Experts Exchange members.
O.K., just tested. You need to issue the command:

     find * -prune -exec ./fput.sh {} \;

from the directory where all the files are.

Also, within the script you need a space between the $1 and $2 on the put command:

     put $1 $2
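Pieced together from the fragments quoted in this thread, fput.sh would look something like the sketch below (winhost, ftpuser, and ftppassword are placeholders; the accepted solution may differ in its details):

     #!/bin/sh
     # fput.sh - transfer one file to the Windows FTP server
     # $1 = local file name, $2 = (optional) name to store it under remotely
     ftp -n winhost <<EOF
     user ftpuser ftppassword
     binary
     put $1 $2
     quit
     EOF

The -n flag suppresses auto-login so the user command inside the here-document can supply the credentials. One ftp session is opened per file, which is slow, but it keeps any single command's argument list small.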
I have not run this yet, but have a question: shouldn't the put $1 $2 line have just one parameter, since the find command will only pass one parameter into the script?
Okay, I got the script working, but now when I execute the find command with the script:

     find * -prune -exec ./fput.sh {} \;

I get the following error:

     find: exec for -exec failed: No such file or directory (error 2)
Found my problem: I did not have a space between fput.sh and the {}.
Glad to see you found your error. I find it is best to cut and paste from web pages instead of attempting to type it; proportional fonts are H___ on the eyes.

As for the $1 and $2: I copied and left as much as I could intact from the original post. Right now you only need the $1, but the second parameter lets you specify a different name on the remote host. You could have had something like:

     find * -prune -exec ./fput.sh {} backupdir/{} \;

and that would have put each file into a different directory on the FTP server. Or, with a fancier script command, you could append a date and/or time stamp to the end of the file name so you knew when you copied it.
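For example, the put line inside fput.sh could append a date stamp (a sketch; the format string is arbitrary, and backticks are used because the old Bourne shell lacks $()):

     put $1 $2.`date +%Y%m%d`    # appends today's date in YYYYMMDD form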
Also, is there some way to script the find command? Basically, I want a script containing multiple variations of the find command, so that fput.sh is executed on each result set. E.g.:

     find a* -prune -exec fput.sh {} \;
     find A* -prune -exec fput.sh {} \;
     find b* -prune -exec fput.sh {} \;
     find B* -prune -exec fput.sh {} \;

I would have all of that within a script instead of having to execute the commands one at a time at the shell.
"find *" does not work?

If you need to do indvidual file groups, the only way would be to manually create a sh script file with the indvidual commands. The problem with trying to get a script to do it, is that you have so many files it seems that the system has problem just listing them.  If the system can't list them, then script is not going to work.
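A sketch of such a wrapper script (assumes fput.sh sits in the current directory; extend the prefix list to cover every leading character your file names use):

     #!/bin/sh
     # push the files to the FTP server in prefix-sized batches
     for p in a A b B c C d D w W
     do
         # a prefix with no matching files just produces a harmless "not found" message
         find ${p}* -prune -exec ./fput.sh {} \;
     done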
Using find *, there are too many files; it says the argument list is too long. So I have to use commands like:

     find a* -prune -exec fput.sh {} \;
     find A* -prune -exec fput.sh {} \;
     find b* -prune -exec fput.sh {} \;
     find B* -prune -exec fput.sh {} \;

I wanted to put all possible combinations in a script file and then execute it, but I cannot get a script that contains a find command to run. It's like it does nothing: it goes directly to another prompt without showing any output.
I tried again and it started working. The script is actually running, so I will see what the results are. In the meantime, giltjr, you have well earned the points. Thanks for all the assistance.