CLI or GUI tool to "get" all files from an FTP site - i.e. make a backup

Hi, Folks.

I'm looking for (preferably) a command-line tool - but a GUI tool will do if nothing else exists...

What I'd like to do is grab all the data from my web site (via FTP!) each and every day, to make a backup of the site.

A)  I don't trust my ISP's backups, and
B)  I've got some real noob "WebMins" who are likely to break stuff.  Having a historic collection (like 7 days' worth or whatever) would be really beneficial.

I suggest CLI so that I can schedule it and sort/date the backups via Kixtart..  Know what I mean?

Any suggestions would be greatly appreciated!


oBdA Commented:
You can do that easily with wget.exe. The -m switch lets you mirror the tree; lots of other useful switches are available:
wget -m

GNU wget
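For an FTP backup, the credentials can go straight into the URL and -P picks the local target directory. A minimal sketch (ftp.example.com, myuser/mypassword and the local path are placeholders - substitute your own details):

wget -m -P C:\backups\site ftp://myuser:mypassword@ftp.example.com/

The -m switch is shorthand for recursive retrieval with timestamping, so on later runs it should only re-fetch files that have changed on the server.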

Why don't you use Internet Explorer, or the command-prompt ftp client with the -s:filename option? Here is an example of a command file for ftp:

open ftpserver
mget *.htm

The command is
ftp -s:list_of_commands -i

-i turns off prompting for each file downloaded. The downloads go to the current local directory by default, or you can use lcd in the command file to change the local directory.

The command can be put into a .bat or .cmd file and run through the Task Scheduler.
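For example, a complete command file might look like this (ftp.example.com, myuser/mypassword and the local path are placeholders; the username and password lines answer the login prompts, and bye closes the session):

open ftp.example.com
myuser
mypassword
binary
lcd C:\backups\site
mget *.*
bye

Then schedule something like:

ftp -i -s:C:\backups\ftpcmds.txt

One caveat: the stock Windows ftp client's mget does not recurse into subdirectories, so for a site with a directory tree the wget -m approach above is the easier way to grab everything.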
scdavisAuthor Commented:
Is that list_of_commands a text file..?  

Thanks - if so, I think that'd do it..

-- Sc.

scdavisAuthor Commented:
Hey, oBdA -

That wget -m is awesome.  I've used wget before - but didn't realize it was that intelligent.

Thanks for pointing it out.  Took me about 23 seconds to find the Windows build, install it, and start using it!

- Scott.